In nature, there are neither rewards nor punishments. There are only consequences. — Robert Green Ingersoll
When I was getting my psychology education, I had to read about B.F. Skinner and the “behaviorists”. Skinner was interested in how sentient beings process environmental cues and consequences, and how they learn to adapt to changing incentives. He generated a huge amount of data studying rats and pigeons under controlled conditions, and built a theory of behavior that he applied to humans. Needless to say, he got some things right and a LOT wrong.
What he got right was that behavior is affected by the things that happen in the environment. In the mid-20th century, Freudian psychoanalytic theory was the dominant paradigm of human behavior, and it focused almost completely on what was happening INSIDE of people. Behaviorists like Skinner and Watson and Thorndike believed that what happened in the OUTSIDE world was equally important in understanding why sentient beings (human and otherwise) do what they do. And they were of course right. Unfortunately they went to the other extreme of denying that anything of importance was going on inside our heads!
I found the animal studies pretty boring at the time (I wanted to be a therapist), but there are some important takeaways that shed light on human behavior and how to change it. These insights have been expanded by the fields of behavioral economics and game theory that attempt to develop mathematical models for predicting individual and group behavior.
Here are some of the main findings of B.F. Skinner’s research on what he called “operant conditioning”:
If you offer a reward (anything we experience as valuable/pleasurable) to someone AFTER they do something, they will tend to do MORE of that behavior (Thorndike’s “law of effect”).
If you reward (reinforce) that behavior EVERY time they do something, and then stop rewarding them, they will keep doing that behavior for a short while, and even increase it (an “extinction burst”); then they will notice the reward has stopped, and stop the behavior (extinction).
[Here’s where it gets interesting]
If you reward a behavior every X times they do it (a “ratio reward schedule”, which is a form of algorithm), they will do more of it. When you withdraw the reward, they will keep doing it a LOT longer than when you were rewarding them EVERY time they did it. In fact, the larger the number X (reward every 5, 10, 25 … times they do something), the longer they will keep doing it when you stop the reward (because the uncertainty about the reward conditions is greater).
If you give a reward every 5 times they do something, you can slowly increase the X in the algorithm (every 10 times, every 30 times, every 50 times etc.) and they will keep on doing that thing even though they are getting a much smaller return on investment for each action. In fact, operant studies in the Skinner-box tradition showed that if you deliver a food pellet (or heroin or cocaine!) after a rat or pigeon presses a lever, you can keep increasing the X of the reward algorithm endlessly and the animal will keep on pressing the lever until they drop dead of starvation or exhaustion. Pretty horrifying.
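To make that arithmetic concrete, here is a minimal Python sketch of the ratio schedule described above. The X values come from the example, while the number of lever presses per session is an assumption for illustration; none of this is data from an actual study.

```python
def ratio_schedule(x, presses):
    """Fixed-ratio schedule: deliver one reward per x presses."""
    return presses // x

# Hypothetical "schedule thinning": X keeps growing while the animal
# (or user) keeps pressing, so the return per press shrinks.
ratios = [5, 10, 30, 50, 100]      # the X values from the example above
presses_per_session = 500          # assumed constant effort per session

for x in ratios:
    rewards = ratio_schedule(x, presses_per_session)
    print(f"X={x:>3}: {presses_per_session} presses -> {rewards} rewards "
          f"({rewards / presses_per_session:.3f} rewards per press)")
```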
The Psycho-Security-Industrial Complex
Psychological research is supposed to be dedicated to the alleviation of suffering and improvement of well-being. It’s in the ethical guidelines for behavioral science professionals. But since psychologists are people too (really!), they are open to having their professional morals corrupted by an algorithm that delivers large rewards (read: paycheck) for unethical behavior.
Psychologists offer their services to a range of companies and military/security agencies whose missions are not, shall we say, “benign”. Military and national security agencies hire psych consultants to develop better techniques for interrogation, “brain-washing” and terrorizing/disabling of the enemy. The CIA’s “enhanced interrogation” (torture) program after the attacks of 9/11 was led by two psychologists. When the membership of the American Psychological Association advocated for the censure and expulsion of these members due to ethics violations, the leadership of the APA refused, thereby revealing their own corruption and ethical degradation.
Corporations hire behavioral scientists to help increase their profits by manipulating their customers to buy more product. Here are some examples:
Big Tobacco
For many decades, women smoked fewer cigarettes than men. Edward Bernays, sometimes called the “father of public relations” (and a nephew of Sigmund Freud!), advised his tobacco company clients in the late 1920s to link cigarette smoking with women’s drive for more freedom and autonomy (the “Torches of Freedom” campaign). Tobacco ads showing attractive, strong, independent women smoking cigarettes generated rapid growth in women’s smoking rates, nearly erasing the previous gender disparity within a few decades (other factors were operating as well, such as increased workforce participation/stress etc.).
Manufacturing
When Henry Ford reconfigured the building of cars from its historic artisan workshop structure (where people built whole things) into the atomized assembly line model, he was able to drive greater speed and efficiency in the manufacturing process. Frederick Taylor’s time studies and the Gilbreths’ motion studies demonstrated that you can crank up a productivity algorithm and drive people to do more piecework in a shift.
Using Skinner’s ratio reward schedule, you just keep nudging up the X (# of widgets to earn a dollar) and people will jump and speed up. When you see your Amazon and FedEx delivery person dashing to and from your front door, they are dancing to the tune of the algorithm (now supercharged by wireless digital tech and AI). It’s a process driving human resource (HR) management in many businesses, and a root cause of the widespread burnout epidemic.
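Here is the same ratio logic translated into a hypothetical piece-rate sketch, where X is the number of widgets required to earn a dollar; the daily pay target and the X values are invented for illustration.

```python
def widgets_needed(target_pay_dollars, x_widgets_per_dollar):
    """How many widgets a worker must finish to take home a given pay."""
    return int(target_pay_dollars * x_widgets_per_dollar)

target_daily_pay = 120.0           # assumed daily take-home target
for x in (3, 4, 5, 6):             # the employer quietly nudges X upward
    print(f"X={x}: {widgets_needed(target_daily_pay, x)} widgets per day "
          f"to take home ${target_daily_pay:.0f}")
```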
Supermarkets
As corporate grocery chains drove small independent butchers and fishmongers and greengrocers out of business (as CVS later did to independent pharmacists), they aimed to motivate higher per-customer consumption. The Sperry & Hutchinson company sold its “Green Stamps” program to supermarkets, which gave shoppers 1 stamp per X dollars spent. These stamps could be pasted into books (building on the then-popular hobby of stamp collecting) and redeemed for “prizes” (small appliances, clothes, toys etc.).
When you see the words “1 stamp per X dollars spent”, I hope you remember the discussion of the Skinner box and the ratio reward schedule/algorithm above. Once you get shoppers “hooked” on redeeming stamps for “free” prizes, you can slowly (and secretly) increase the value of X so more dollars need to be spent on groceries to earn those cherished stamps and fill a book (you can also increase the number of stamps per book and number of books per “prize” in the algorithm). Customers will reliably spend more money to earn little Susie that doll she has been demanding as well as the new gloves Mom has been eyeing in the Green Stamps Catalog. Works like a charm! Profits soared.
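To see how the knobs multiply, here is a toy version of that stamp arithmetic; the dollar, stamp and book figures are made up for illustration and are not S&H’s actual terms.

```python
def dollars_per_prize(dollars_per_stamp, stamps_per_book, books_per_prize):
    """Total grocery spending required to redeem one 'free' prize."""
    return dollars_per_stamp * stamps_per_book * books_per_prize

# Every knob named above (dollars per stamp, stamps per book, books per
# prize) multiplies into the real price of a prize.
early = dollars_per_prize(dollars_per_stamp=1, stamps_per_book=1200, books_per_prize=1)
later = dollars_per_prize(dollars_per_stamp=2, stamps_per_book=1500, books_per_prize=2)
print(f"Early program: spend ${early:,} in groceries per prize")
print(f"After quiet ratcheting: spend ${later:,} in groceries per prize")
```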
Big Ag(riculture)
As control of meat, fruit/vegetable and grain production, processing and distribution in the U.S. and elsewhere has become ever more concentrated in a few near-monopolistic corporations, those companies are able to impose draconian algorithms on the small(er) farmers and ranchers that produce their “raw materials” (beef, milk, chickens, corn etc.). Once these corporate behemoths dominate the market and become the only game in town, they can ratchet up their payment algorithm and “squeeze” their local suppliers by paying less and less per unit of product. With nowhere else to turn, the small farms and dairies and cattle operations are driven to exhaustion, bankruptcy and, in increasing numbers, even suicide.
What happens in Vegas …
People gamble with fantasies of striking it rich running through their heads. And sometimes they do. But gambling is a big business, and its investors expect big returns (and they don’t always play nice when those profits fall short of expectations). So it’s vital that “the house always wins”.
“Always” doesn’t mean winning with every gambler every time. If it did, customers would quickly notice they were on an extinction schedule and stop gambling. So casinos hire behavioral scientists to help them create a rewards algorithm that is “rich” enough to keep the rats … er, I mean the customers … pulling the slot machine handles while still ensuring that the machines pay out less than they take in. They also give their gamblers “free” drinks and snacks (rewards) and even hotel rooms for the high rollers, all to keep them hooked and happily spending their money with dreams of a big payout that rarely comes. Hope springs eternal under the right algorithm.
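Here is a toy simulation of that “the house always wins” arithmetic: a variable-ratio payout keeps the wins unpredictable (which resists extinction), while the expected value per pull stays negative. The win probability and payout are invented numbers, not real casino parameters.

```python
import random

def pull(win_prob=0.05, payout=18.0, stake=1.0):
    """One $1 pull: a rare win pays out, otherwise the stake is lost."""
    return payout - stake if random.random() < win_prob else -stake

random.seed(0)                          # reproducible toy run
pulls = 100_000
player_net = sum(pull() for _ in range(pulls))
expected_per_pull = 0.05 * 18.0 - 1.0   # negative: the house edge

print(f"Player net over {pulls:,} pulls: ${player_net:,.0f}")
print(f"Expected value per pull: {expected_per_pull:.2f} dollars")
```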
Social Media Platforms
In the early days of Facebook, behavioral scientists weaponized the human need for social approval and status by implementing a “Like” button. Who doesn’t like to be liked? Shortly after the Like button launched, the number of customers and the time each customer spent on the platform soared. People couldn’t get enough Likes, and couldn’t stop returning to the platform to check and re-check for likes and shares and comments. Since Facebook’s business model monetizes these participation stats as the basis for their advertising rates (essentially their only source of revenue/profit), attracting and holding customer attention is an existentially important activity, and social status became the drug of choice on Facebook, Twitter, Instagram etc.
Show me the money!
If you want to change someone’s behavior, the surest way is to attach a monetary value to the algorithm. If you want to help someone lose weight or stop smoking cigarettes (two behavioral goals with huge failure rates), attach a financial penalty for failing to reach the weekly/monthly weight or abstinence goal. This approach, known as contingency management, is among the MOST effective addiction treatment strategies going.
If you want someone to show up for work each day and produce a certain amount of output, pay them some money on a regular basis in the form of, oh say, a salary! Threaten to take that salary away if they get a bad performance review. Amazing how hard you can get people to work running that simple algorithm.
So what happens when it’s not only social status (Likes, Claps, Comments, Retweets etc.) on the line, but money too? Well, you generate even more behavioral control and a lot more emotion (mostly fear and anger). It’s a potent and potentially toxic brew if your algorithm threatens several levels of Maslow’s needs hierarchy at once.
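For the money version, here is a minimal sketch of a “deposit contract” of the kind used in weight-loss and smoking-cessation programs: stake money up front and forfeit a slice for every missed weekly goal. The dollar amounts and the eight-week schedule are assumptions for illustration.

```python
def remaining_deposit(deposit, penalty_per_miss, weekly_results):
    """Refund left after deducting a penalty for each missed weekly goal."""
    misses = sum(1 for met_goal in weekly_results if not met_goal)
    return max(0.0, deposit - penalty_per_miss * misses)

# Assumed 8-week program: True = goal met that week, False = missed.
weeks = [True, True, False, True, False, True, True, True]
print(f"Refund at program end: ${remaining_deposit(200.0, 25.0, weeks):.2f}")
```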
Big Brother IS watching you!
Does this all sound too dystopian? Well, read about how China is building and rolling out an algorithm that assigns a “social credit score” to every person based on their online behavior and their financial, health and criminal records. This score can then be used to make decisions about whether to hire a person or give them an auto loan or home mortgage or accept them to a school. Imagine the power of an algorithm like that to control and manipulate millions of people with existential rewards and punishments, and how that will generate powerful emotions of fear and anger and mistrust and helplessness. Big Brother is really watching you.
Pushing Back on an Algorithmic World
We should all be continuously alert and aware whenever we are working or living or playing in a space that is running an algorithm that determines when and how we get (or don’t get) what we want. We can try to figure out how the formula is working and how it’s affecting our mood and well-being. Since individuals can’t control corporate or state-run algorithms, the only choices we have are to either (a) strategize how to cope with the algorithm in a way that maximizes our benefit and minimizes harm, or (b) leave the space where the algorithm is running if the risk/reward ratio is too toxic.
[HINT: there are very few algorithm-free spaces left. That’s why time spent in “nature” and at home and with friends is so restorative.]
We should also try to create and operate our own beneficial algorithms in as many spaces in our lives as possible. As partners, leaders, parents, friends and in our dealings with non-human beings and the natural world, let’s all try to be as transparent and benevolent as possible in running the numbers that drive what matters most.