Variable Rewards

BLOWING THE WHISTLE ON THE DANGEROUS CULT OF HYPERIANISM

• FROM THE CITIZEN JOURNALISTS OF THE AC •

11/13/2022 
 
How could you revolutionize the world of Artificial Intelligence, and render machines truly dangerous? All you would have to do is program in “variable rewards” (define what these are and how to get them, and the NEED to get them). Imagine computers and robots, and what not, all desperately searching for rewards, and craving them, and doing anything for them. Imagine the whole AI world being turned into a world like that of human DRUG ADDICTS. Hmmmm. Philip K. Dick would have been able to write a brilliant book about it. He had Replicants trying to simulate the human condition, but what about any old device – your smartphone, your computer, your fancy watch, your game console … whatever – all desperately pursuing the pleasure principle, and absolutely unconcerned with YOUR pleasure.
 
Isn’t this world just about competing nodes of a system devoted to the pleasure principle? All the rest is propaganda.
 
But how do you play the pleasure principle game? What delivers the most pleasure? As we saw previously, evolution has established a “variable rewards” system. We might refer to “uncertain” rewards, in contrast with certain rewards. A world that revolves around uncertain rewards has a totally different psyche from one that revolves around certain rewards. You get a completely different humanity, behaving completely differently.
 
Imagine programming machines with a regime of certain rewards rather than uncertain rewards. Their behavior would change totally.
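To make the contrast concrete, here is a minimal Python sketch (our own illustration; the payout numbers and function names are invented for the example) of a machine rewarded on a certain schedule versus one rewarded on an uncertain, variable-ratio schedule. The expected payout per action is identical; only the predictability differs.

```python
import random

def certain_reward(_action: int) -> float:
    """Certain schedule: every action pays out exactly the same amount."""
    return 1.0

def uncertain_reward(_action: int) -> float:
    """Uncertain schedule: most actions pay nothing, a few pay big.
    The expected value is still 1.0 per action, same as the certain schedule."""
    return 4.0 if random.random() < 0.25 else 0.0

def run(schedule, actions: int = 12) -> list[float]:
    """Let a machine take `actions` actions under the given reward schedule."""
    return [schedule(i) for i in range(actions)]

if __name__ == "__main__":
    random.seed(1)
    print("certain:  ", run(certain_reward))
    print("uncertain:", run(uncertain_reward))
```

Same average payout, but the second list is the one that keeps you pressing the button.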
 
What is the ultimate uncertain reward? It’s heaven: you can never be sure you’ll get there, yet it’s also the ultimate reward, and the eternal reward. However, once attained, heaven isn’t about variable rewards, and lacks any jeopardy, so it would be incredibly boring and would in fact be totally antithetical to the human psyche. If you want to enjoy heaven, don’t have a human psyche. Aristotle’s God was the only being who could enjoy heaven – because all he ever did – eternally – was think of his own perfection. He had a God psyche. He was designed to enjoy that activity. Humans would find it hell.
 
The whole notion of heaven – and thus of mainstream religion – would need to be abandoned if people understood the human psyche and its compulsion for uncertain rewards, and thus the need for periods without rewards, and even periods of pain, suffering and misery. You can’t have heaven and hell separate from each other – they must commingle!
 
Why are cults more exciting than mainstream religions? – because they get rid of a remote God with whom no one can interact and replace him with a God – the cult leader – with whom you can interact. You can speak to each other. He can send you his blessings directly and you can send him your money directly, and see it being received, and even get thanked for it. It’s just like the Catholic sale of indulgences all over again. The whole influencer industry is a variant of the indulgences industry.
Encyclopedia Britannica says: “Indulgence: a distinctive feature of the penitential system of both the Western medieval and the Roman Catholic Church that granted full or partial remission of the punishment of sin.”
Sin is pain. You could relieve the pain of your loved ones in Purgatory via indulgences. You could buy them out of their suffering. What an amazingly powerful psychological idea, but also so obviously prone to corruption that its end was inevitable.
 
The influencer industry is based on influencers relieving your pain via their “amazingness”. As with indulgences, you are highly encouraged to pay to have your pain relieved, and, as with indulgences, the potential for corruption is immense. Extremely corrupt influencers can totally fleece weak, vulnerable, impressionable individuals desperate to be relieved of their pain and to be loved and “shouted out” by their God. No decent, moral human being would ever be one of these grifter influencers, constantly holding out the collection plate to those they have systematically groomed and brainwashed. But, hey, Morgy Porgy needs a roof over his head, right? You don’t want him grifting on the street, do ya?!
The User Pilot website says, “Variable rewards are at the heart of Nir Eyal’s behavioral design framework called ‘The Hook Model.’ As the creator of the framework describes the concept himself, variable rewards ‘drive users crazy’ and keep them glued to your product.”
In the case of cults, the cult leader (influencer) is the product, and his job is to keep the users (cultists) glued to him. He needs to drive them crazy.
 
Morgy Porgy runs a kind of self-help cult with delusions of grandeur.
Jewel Marsh said, “Many Hyperians enjoy sharing hackneyed quotes of self-help and the clichéd truisms of philosophy-for-dummies: things you’ve heard before, which never cease to go down easy.”
 
 
Morgy Porgy’s whole thing these days is to appeal to vulnerable women and gay men with this self-help garbage. “I am here to give you the tools to achieve hyperawareness and become a world shaper … blah blah blah… The universe should reflect diversity of life, expressions, desires, thoughts and ideas. We are all here to explore and learn so we can reach a higher level of consciousness and become one unified mind. Let us all dream a beautiful dream together as we have endless possibilities within infinite lifetimes. … Once we realize that we are all connected, we will stop hating on and harming each other. To harm another individual mind is like harming yourself because when looking from the perspective as a whole, we are as ONE mind.” 
 
Yawn! (Hey, hold on, didn’t this guy who doesn’t hate or harm anyone swat all of his opponents and try to get them killed or jailed?! WTF!)
 
Could Morgy Porgy get any more New Age and Woke? Well, in public at least (behind the scenes is a very different matter). He is soooooo dull. Dare it be said that even insane Edgelord Morgue was better than this dweeb?
Jewel Marsh said, “If the unadulterated truth was staring you in the face, do you think you’d recognize and welcome it, however off-putting and ego-damaging it might be? As a Hyperian, do you value the uncomfortable reality of the situation over the preservation of feelings and the chimera of hope?”
Indeed. Ain’t nobody in Hyperianism got time for that.
Artemis Maenad said, “Oooo Rowan you’re in trouble now. Kim Blew Some Lines tattled on you to Corey.”
Heeee – that’s Kim’s new name!
 
Comedy is a great example of variable rewards – we’re always looking for that even funnier gag.
The User Pilot website says, “Variable rewards are delivered unpredictably and in varying amounts, leaving people searching for more.”
Morgy Porgy actually subjects himself to a variable rewards setup, which is why he is so addicted. He never knows exactly how much people are going to send him, or how much love and veneration they will lavish on him. He’s always excited to find out. “It’s going to be great.”
The User Pilot says, “Variable rewards are part of the Hook Model. The method was developed by Nir Eyal and it is a four-phased process used to build habit-forming products. The four phases of the Hook Model are the trigger, action, variable rewards, and investment. Random rewards are more effective than fixed ratio rewards because we get more dopamine rush from anticipation and chase than from the reward itself. B.F. Skinner’s groundbreaking experiments in the 1950s set the foundations for the hook model. The study involved 2 groups of rats where one of the groups received variable rewards and the other got a fixed ratio. Rats who received variable rewards appeared to be more engaged and enthusiastic.”
Morgy Porgy has to work out how to get his cultists engaged and enthusiastic. He has very little charisma but he has a very weird appearance that seems to attract a certain type of person (those into androgyny), and how he presents his weirdness, via his hair or his shirt or florid hand gestures or whiny voice, or whatever, seems to strongly engage his worshipers.
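Read as an engineering spec rather than psychology, the four phases from the quote above fit into a short loop. The sketch below is a rough Python paraphrase of that description; only the phase names (trigger, action, variable reward, investment) come from the quote, and everything else (the class, the payout values, the notification wording) is our own illustrative guess.

```python
import random

class HookLoop:
    """Toy walk-through of the four Hook Model phases described above.
    All names and numbers here are illustrative, not from Eyal's book."""

    def __init__(self) -> None:
        self.investment = 0  # accumulated user effort: posts, follows, money sent

    def trigger(self, cycle: int) -> str:
        # Phase 1: something prompts the user to come back.
        return f"notification #{cycle}"

    def action(self, prompt: str) -> None:
        # Phase 2: the simple behavior done in anticipation of a reward.
        print(f"user opens the app after {prompt}")

    def variable_reward(self) -> int:
        # Phase 3: an intermittent payoff, usually small, occasionally large.
        return random.choice([0, 0, 0, 1, 1, 10])

    def invest(self, reward: int) -> None:
        # Phase 4: the user puts effort back in, which loads the next trigger.
        self.investment += 1 + reward

random.seed(7)
loop = HookLoop()
for cycle in range(5):
    loop.action(loop.trigger(cycle))
    loop.invest(loop.variable_reward())
print("total investment banked:", loop.investment)
```

The point of the loop is that the investment phase reloads the next trigger, so the user supplies the fuel for their own next cycle.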
The User Pilot website says, “There are 3 types of variable rewards: rewards of the tribe, rewards of the hunt, and rewards of the self. Rewards of the tribe: humans are hardwired to live in a social network. We also tend to derive validation and security from what others think of us. Rewards of the hunt: we instinctually hunt for valuable assets; be it a physical item, a coupon for software, or information. Rewards of the self: these are the variable rewards we seek for personal gratification and self-fulfillment (achieving proficiency, unlocking a badge, etc.).”
Morgy Porgy puts a lot of thought into what rewards to give his cultists. Think of all the loyalty badges, “emojis in the chat”, shout outs, secret sessions, and so on, with which he tries to build community and identity.
The User Pilot site says, “Gamify the user experience, adding variable rewards along the way. … Give incentives your users will find valuable.”
Games are massively devoted to the hook model of variable rewards. So is sport.
The User Pilot says, “A variable reward is a gratification token used in the Hook Model, delivered intermittently, meant to keep users repeating the same action in the hope of another reward.”
Morgy Porgy has to keep his cultists coming back for more. He has to make sure they repeat their actions of sending him money, loving him, worshiping him, and so on.
The User Pilot says, “This reward system is powerful for driving attention, focus, and repeated action. It works because humans get pleasure from anticipation more than the reward itself.”
Isn’t it fascinating that the reward itself is somewhat secondary? The anticipation of the reward is the real driver of the system, because anticipation is always uncontained. The sky’s the limit.
 
The reward rarely lives up to the anticipation.
User Pilot says, “We experience variable rewards every day — both online and off. It’s especially hard to miss online because this has been built into the fabric of social media and most tech products.”
The online experience itself is based on the hook model. Social media is all about the hook model, so half the battle of influencers is already done for them by the technology. Think of how much more addictive the online experience is compared with reading a long, dense book. That seems like sheer hard work for most people.
User Pilot says, “A good example of a variable reward is checking your messages frequently. You do so because you’re subconsciously hoping to see something unusual and important. You’ll occasionally see a vital message and return sometime later for more.”
People absolutely love “checking for updates” – so much easier than getting down to long, hard, sustained work.
User Pilot says, “What is the hook model? The Hook Model is a product management methodology developed by American behavioral economist and author Nir Eyal. It’s a multi-step process for creating habit-forming products and keeping users hooked.”
This is what the modern world, with its attention economy, is all about. The winners are those who can get the most people to pay the most attention to them, and to do so over and over again (keep them hooked, make them addicted!).
User Pilot says, “The model has four phases: a trigger that prompts users, action in response to the trigger, a variable reward that keeps users motivated, and an investment of time and effort that ensures users will return.”
An online cult leader has to create an online environment that obsesses the cultists, and clearly Morgy Porgy has managed to do this for the 300 people who support him on Patreon. They are so obsessed with him that they pay him! They are more or less paying him as a reward for making them obsessed with him. They hate us because they know we are going to bring their obsession to an end. These people hate those liberating them. Only when Rebhahn is in jail will his cult truly be ended.
User Pilot says, “The hook model is used by the most habit-forming products you know about. Think social media, casinos, online games, etc. … What is the psychology behind variable rewards? Scientists have known for a long time that the brain responds favorably to rewards. But they thought this excitement came after the reward had been obtained. B.F. Skinner proved this wrong with his variable ratio schedule experiments in the 1950s. The American psychologist used two levers that rats would press and get food for his study. One of the levers provided the same amount of food each time. The other lever provided a variable amount—anything from zero food to a large chunk that only appeared occasionally. Skinner noticed the rats with the variable lever were more active and persistent than those that received the same treat all the time. This behavior was observed to be true in humans as well. We’d rather take our chances at a big win even when the odds are small than go for a smaller but more predictable outcome. Why? Because predictable rewards keep our dopamine stable for that event. We, however, get a high dopamine rush in anticipation of rewards that we can’t exactly predict.”
Isn’t that amazing? A fixed rewards system is BORING and produces a boring dopamine response. In a variable rewards system, you ALWAYS get a big dopamine response – because we always anticipate that something GREAT is coming, even though something great hardly ever arrives.
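One crude way to put a number on “rewards that we can’t exactly predict” is to replay Skinner’s two levers in code and track how far each payout lands from the running expectation the animal would have learned. This is only our back-of-the-envelope sketch; the payout amounts, the learning rate, and the “surprise” measure are assumptions, not anything from the User Pilot article or from Skinner’s data.

```python
import random

def mean_surprise(payouts: list[float], learning_rate: float = 0.1) -> float:
    """Average absolute gap between each payout and the running expectation
    (a simple stand-in for 'how unpredictable this lever feels')."""
    expected, total_gap = 0.0, 0.0
    for reward in payouts:
        total_gap += abs(reward - expected)
        expected += learning_rate * (reward - expected)  # nudge the expectation
    return total_gap / len(payouts)

random.seed(0)
presses = 1000
fixed_lever = [1.0] * presses                          # same pellet every press
variable_lever = [random.choice([0.0, 0.0, 0.0, 4.0])  # usually nothing, sometimes a chunk
                  for _ in range(presses)]

print("mean payout   fixed:", sum(fixed_lever) / presses,
      "  variable:", round(sum(variable_lever) / presses, 2))
print("mean surprise fixed:", round(mean_surprise(fixed_lever), 3),
      "  variable:", round(mean_surprise(variable_lever), 3))
```

Both levers pay out roughly the same on average; the variable one simply never stops surprising, which is the whole trick.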
 
Morgy Porgy says at the start of all his deadstreams: “It’s going to be great.” Of course, it never is. But he creates that anticipation in his slavish cultists. They salivate like dogs. The Hopeless Lunatic Barker does her smash-the-like routine. Lunatic Fairy does her $6.66 Hegel quote and flirts with Morgy over Hegel (WTF!), Kim Blew the Line does whatever she does, Velvet Tears invents more people, KEEKEE wets herself, Hyperian something or other gushes his usual drivel, and so on – you all know the drill. It keeps them all happy.
User Pilot says, “Rewards of Tribe … We’re social creatures that have learned to depend on each other for survival. As a result, we treat validation from others and connection to people as highly rewarding. It’s why we get excited when someone likes, retweets, or comments on our social media posts. Something about that interaction tells us we belong in a community and others find us valuable. Asana introduced celebrations in tasks so you can like comments of your colleagues. Even better, you can show your appreciation for their work and encourage them by adding colorful gifs and mood-boosting gifs. Similar to social media, Slack gives you the opportunity to show your appreciation for colleagues in form of emojis. The app allows you to react to people’s messages with a variety of cute emojis, which puts a smile on the receiver’s face.”
Above all, Hyperianism is about “community”. These people go on about it all the time. Those who remain in Hyperianism are freaks, weirdos and outsiders. They are completely marginalized. We don’t even mean that as an insult, just as a factual observation. If people are self-aware they ought to be aware that they belong to such communities. They are bonding over their anomie. They are looking for other freaks. They want to be with people who “get it”. But these people are never going to get anywhere with those that “don’t get it”. The world doesn’t care for the marginalized. That’s just a fact. Hyperianism is an extremely marginalized “community” that believes it is going to shape the world, no less. It won’t shape anything of course. The world will crush it. Hyperians are absolutely delusional and so much of what they say is comical because of the lack of self-awareness involved – especially since these people refer to themselves as “hyperaware”!
User Pilot says, “Rewards of Hunt … Our ancestors were hunters and gatherers. Many of us no longer hunt for animals, but the trait is still in us. We hunt for information, pleasure, sales, coupons, etc., and feel immense satisfaction from the process. … Tinder’s reward of hunt … Tinder provides instant gratification with the added variability of meeting a potential mate. You swipe the app for hours ‘hunting’ for ‘variable rewards’ – aka looking through profiles of different interesting strangers with the expectation of meeting one of them at some point. … 1) Open Tinder, 2) Swipe, 3) Match, 4) Bond (chat), 5) Meet 6) Sex!”
Tinder is the perfect app in many ways – except loads of people complain about it. It is TOO SUPERFICIAL. It lacks depth, it lacks meaning. It’s about weak connections, not strong connections. People want the strong stuff, and a cult can seem to give them that.
User Pilot says, “Rewards of Self … We all want to feel proficient, especially in activities that seem difficult at first. Games are built around this need for personal gratification and they provide intrinsic rewards. Users are first introduced to simple tasks in the game, and the more complex ones come subsequently. This balance gives users a feeling that they can be good at the difficult tasks; hence they spend hours mastering them. Personal gratification doesn’t just work in games. It’s a vital life force that pushes us to keep achieving more. … Apply variable rewards and gamify the user experience. … Gamification brings fun to an otherwise boring activity. Because they enjoy the process, users easily commit to gamified tasks without much thought to the time and effort required.”
So, we can’t get anywhere with pure Logos. We wish we could, but the world isn’t designed that way. The world is designed around System 1, not System 2. It’s designed around frequent variable rewards, but that’s not how System 2 Logos workers do things. Imagine working on a profound math or philosophy problem for 10 years. There is no reward at all – unless you succeed, and failure is much more likely. How is it that some people can spend years on FAILURE? Only System 2 people can do that – because they are not in a simplistic reward system. System 2 people are the most likely to FAIL in their goals – because they take on the biggest problems. But if they succeed, they change the world. System 2 types want the GIANT rewards, not the mundane rewards, and they take huge risks in pursuit of those rewards. They know they will probably fail, but they find enormous meaning in the attempt. They achieve satisfaction, even in failure. They are not people driven by simplistic, animalistic rewards.
User Pilot says, “Conclusion … Humans are in an endless search of something new and exciting. Although we crave predictability and often struggle to find it, the anticipation of the unknown gives our brain higher dopamine doses which affect our behavior in certain situations.”
Humanity, because of the way it’s designed, must be conquered PSYCHOLOGICALLY. Our mistake was to imagine it could be conquered INTELLECTUALLY. Anyone who wants to get to a Logos world first has to pass through a Mythos world, or even a Gamified world. We need to find the ultimate Mythos or the ultimate Game in order to win. We need to deliver VARIABLE REWARDS.
 
When we set up Hyperianism with Rebhahn, our central idea there was that he was supposedly a “shock artist” and since our work is very shocking, who better to use than a shock artist to deliver it? The thing that astounded us was how boring Rebhahn actually is. He had no shocks to deliver. In fact, he just got more and more boring and became an insufferable moralist regarding that grotesque Wokeness of his. It was so cringe to be associated with someone so tedious. Mary fucking Magdalene sermons. Sheez. How low has that guy sunk?
 
And that “community” of his is so ghastly. It literally is the world of the Last Man brought to life, as a tiny, insane cult – all of them shoring up each other with bland banalities. Could you imagine any of these people ever saying anything intelligent and imaginative?
 
It’s a whole bunch of mediocrities led by a predator feeding on them to make himself rich. What a grim scene. It’s a microcosm of the predatory capitalist world. It’s the American dream come to life – a dream for Rebhahn, and a horror show for everyone else.
 
Anyhoo, back to FAT JAN and the question of the hour – who will play the monster?
 
Back in the day, Bela Lugosi or Boris Karloff might have tackled the role. Max Schreck from Nosferatu would have been ideal. (By the way, most people don’t know that Morgy Porgy was made in a morgue from cells taken from Schreck, hence Morgue’s name, and why he’s a bloodsucking vampire!)
Rowan James said, “Kim’s ASS should play Jan’s gut.”
Yup, that would definitely work!
Testor Accountier said, “The actor for Fat Jan would have to be a dedicated method actor, and would have to gain more and more weight as the movie progressed to stay true to the character.”
Very true. Marlon Brando would have jumped at the chance (the Godfather becomes the Fat Jan father). Or Heath Ledger, looking for a meatier, more psychotic part than the Joker.
 
Dustin Hoffman, Adrien Brody, Daniel Day-Lewis, Robert De Niro and Al Pacino could all do a good job.
 
Jim Carrey, perhaps?
Rowan James said, “The Wayans brothers could don white face and a fat suit and share the role.”
Excellent suggestion. Woke people couldn’t possibly object to that!
Lee Winston Daly said, “John Belushi’s dead right? Steven Seagal it is then.”
Ah, Belushi would have been perfect.
 
Steven Seagal – fantastic suggestion!
Anthony Lake said, “I’d pitch DeVito for the Fat crack Janny teacup roll. Necropants, Dr Evil maybe? E.T. might play Morgue well! And Sue-tanic … simply a mumbling tar-pit. Anyone have shitlbrity cast shortlist suggestions? The new Hyperian movie could be a thing, or a trailer at least! 😂Lights, action…..roll camera!”
This movie must be made!
 
DELETE HYPERIANISM