
The Art Of Manipulation


Editor’s Note: Nir Eyal is a founder of two startups and an advisor to several Bay Area companies and incubators. He is a Lecturer in Marketing at the Stanford Graduate School of Business and blogs about the intersection of psychology, technology, and business at NirAndFar.com. Follow him on Twitter @nireyal and see his previous TechCrunch posts here.

Let’s admit it: we in the consumer web industry are in the manipulation business. We build products meant to persuade people to do what we want them to do. We call these people “users,” and even if we don’t say it aloud, we secretly wish every one of them would become fiendishly addicted.

Users take our technologies with them to bed. When they wake up, they check for notifications, tweets, and updates before saying “good morning” to their loved ones. Ian Bogost, the famed game creator and professor, calls the wave of habit-forming technologies the “cigarette of this century” and warns of equally addictive and potentially destructive side-effects.

When Is Manipulation Wrong?

Manipulation is a designed experience crafted to change behavior — we all know what it feels like. We’re uncomfortable when we sense someone is trying to make us do something we wouldn’t do otherwise, like when we’re at a car dealership or a timeshare presentation.

Yet, manipulation can’t be all bad. If it were, what explains the numerous multi-billion dollar industries that rely heavily on users willfully submitting to manipulation? If manipulation is a designed experience crafted to change behavior, then Weight Watchers, one of the most successful mass-manipulation products in history, fits the definition.

Much like in the consumer web industry, Weight Watchers customers’ decisions are programmed by the designer of the system. Yet few question the morality of Weight Watchers. So what’s the difference? Why is manipulating users through flashy advertising or addictive video games thought to be distasteful while a strict system of food rationing is considered laudable?

A More Addictive World

Unfortunately, our moral compass has not caught up with what technology now makes possible. Ubiquitous access to the web, transferring greater amounts of personal data at faster speeds than ever before, has created a more addictive world. Addictiveness is accelerating, and according to Paul Graham of Y Combinator, we haven’t had time to develop societal “antibodies to addictive new things.” Graham puts responsibility on the user: “Unless we want to be canaries in the coal mine of each new addiction—the people whose sad example becomes a lesson to future generations—we’ll have to figure out for ourselves what to avoid and how.”

But what of the people who make these manipulative experiences? The corporations that unleash these addictive technologies are, after all, made up of human beings with a moral sense of right and wrong. We too have families and kids who are susceptible to addiction and manipulation. What shared responsibilities do we code slingers and behavior designers have to our users, to future generations, and to ourselves?

The Manipulation Matrix

I offer a simple decision-support tool for entrepreneurs, employees, and investors, to be used long before a product is shipped or code is written, even before customer development has begun. The Manipulation Matrix does not try to answer which businesses are moral or which will succeed. Nor does it describe what can and cannot become a habit-forming technology. The matrix seeks to help you answer not “Can I hook users?” but “Should I attempt to?”

To use the Manipulation Matrix, the maker needs to ask two questions. First, “Will I use the product myself?” and second, “Will the product help users materially improve their lives?”
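To make the matrix concrete, here is a minimal sketch in Python. The function and parameter names are my own, added purely for illustration; the four quadrant names it returns are the ones defined in the sections that follow.

```python
# A hypothetical helper illustrating the Manipulation Matrix: the maker's two
# honest yes/no answers map to one of four quadrants.

def manipulation_matrix(would_use_it_myself: bool, materially_improves_lives: bool) -> str:
    """Return the Manipulation Matrix quadrant for a maker's two answers."""
    if would_use_it_myself and materially_improves_lives:
        return "Facilitator"   # you would use it, and it improves users' lives
    if materially_improves_lives:
        return "Peddler"       # you believe it helps others, but you wouldn't use it
    if would_use_it_myself:
        return "Entertainer"   # you would use it, but purely for fun
    return "Dealer"            # you wouldn't use it, and it helps no one


# Example: a product its maker believes helps users but would never use personally
print(manipulation_matrix(would_use_it_myself=False, materially_improves_lives=True))
# -> Peddler
```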

The Facilitator

When you create something that you will use and believe makes the user’s life better, you’re facilitating a healthful habit. It’s important to note that only you can decide if you would actually use the service and what “materially improving the life of the user” really means.

If you find yourself squirming as you ask yourself those questions, or needing to create a preamble starting with “If I were a…,” STOP! You failed. You have to actually want to use the product and believe it materially benefits your life as well as the lives of your users. The one exception is if you would have been a user in your younger years. For example, in the case of an education company, you may not need to use the service right now, but you are positive you would have used it in your not-so-distant past. Note, however, that the further you are from your former self, the lower your odds of success.

While I don’t know Mark Zuckerberg or the Twitter founders personally, I believe from their well-documented stories that they would see themselves as making products in this quadrant. There is also a long list of companies creating new products to improve lives by facilitating healthful habits. Whether getting users to exercise more, creating a habit of journaling, or improving back posture, these companies are run by authentic entrepreneurs who desperately want their products to exist, firstly to satisfy their own needs.

But what about when an addiction to a well-intended product becomes extreme, even harmful? For a product in this quadrant, I agree with Paul Graham in saying the responsibility falls to the user. In any normal distribution, a small percentage of people will be on the extremes. If the designers make a product that they would use themselves, and they believe it improves the lives of their users, they have fulfilled their moral obligation. To take liberties with Mahatma Gandhi, facilitators “build the change they want to see in the world.”

The Peddler

But heady altruistic ambitions can, at times, get ahead of reality. Too often, designers of manipulative technology have a strong motivation to improve the lives of their users, but when pressed, they admit they would not actually use their own creations. Their holier-than-thou products often try to “gamify” some task no one actually wants to do by inserting hackneyed incentives like badges or points that don’t actually hold value for the user.

Fitness apps, charity websites, and products that claim to suddenly turn hard work into fun often fall in this quadrant. But possibly the most common example is peddler advertising. Countless companies convince themselves they’re making ad campaigns users will love. They expect their videos to go viral and their branded apps to be used daily. Their reality distortion fields keep them from asking the critical question: “Would I actually find this useful?” The answer to this uncomfortable question is nearly always no, so they bend their thinking into the mind of a user they believe might find the ad valuable.

Materially improving users’ lives is a tall order. But attempting to create a persuasive technology that you don’t find valuable enough to use yourself is nearly impossible. There’s nothing immoral about peddling; it’s just that the odds of success are depressingly low. You’ll lack the empathy and insights needed to create something users actually want. The peddler’s project tends to end up a time-wasting failure because, fundamentally, no one finds it useful or fun. If it were, the peddler would be using it instead of hawking it.

The Entertainer

Sometimes makers just want to have fun. If a creator of a potentially addictive technology makes something that they would use but can’t in good conscience claim improves the lives of their users, they’re making entertainment.

Entertainment is art, and it is important for its own sake. Art provides joy, helps us see the world differently, and connects us with the human condition. These are all important and age-old pursuits. Entertainment, however, has particular attributes that the entrepreneur, employee, and investor should be aware of when using the Manipulation Matrix.

Art is often fleeting; products that form addictions around entertainment tend to fade quickly from users’ lives. A hit song, repeated over and over again in the mind, becomes nostalgia after it is replaced by the next single. A blog article like this one is read, shared, and thought about for a few minutes until the next interesting piece of brain candy comes along. Games like Farmville and Angry Birds engross users for a while, but then are relegated to the gaming dustbin along with other hyper-addictive has-beens like Pac-Man and Tetris.

Entertainment is a hits-driven business because the brain adapts to stimulus. Art is about creating continuous novelty, and building an enterprise on ephemeral desires is a constantly running treadmill. In this quadrant, the sustainable business isn’t the game, the song, or the book — it’s the distribution system for getting those goods to market while they’re still hot.

The Dealer

Creating a product that the designer does not believe improves users’ lives and that the maker would not use is exploitation. In the absence of these two criteria, presumably the only reason you’re hooking users is to make a buck. Certainly there is money to be made addicting users to behaviors that do little more than extract cash, and where there is cash, there will be someone willing to take it.

The question is: Is that someone you? Casinos and drug dealers offer users a good time, but when the addiction takes hold, the fun stops.

In a satirical take on Zynga’s Farmville franchise, Ian Bogost created Cow Clicker, a Facebook app in which users did nothing but incessantly click on virtual cows to hear a satisfying “moo.” Bogost intended to lampoon Farmville by blatantly implementing the same game mechanics and viral hacks he thought would be laughably obvious to users. But after the app’s usage exploded and some people became frighteningly obsessed with the game, Bogost shut it down, bringing on what he called “The Cowpocalypse.”

Judging for Yourself

Bogost was right in comparing addictive technology to the cigarette. Certainly, the incessant need for a smoke in what was once the majority of the adult population has been replaced by a nearly equal compulsion to constantly check our devices. But unlike the addiction to nicotine, new technologies offer an opportunity to dramatically improve the lives of users. It’s clear that like all technologies, recent advances in the habit-forming potential of web innovation have both positive and negative effects.

But if the innovator has a clear conscience that the product materially improves people’s lives — first among them, the creator’s — then the only path is to push forward. Users bear ultimate responsibility for their actions and makers should not be blamed for the misuse or overuse of their products.

However, as the march of technology makes the world a more addictive place, innovators need to consider their role. It will be years, perhaps generations, before society develops the antibodies to new addictions. In the meantime, users will have to judge the yet unknown consequences for themselves, while creators will have to live with the moral repercussions of how they spend their professional lives.

My hope is that the Manipulation Matrix helps innovators consider the implications of the products they create. Perhaps after reading this, you’ll start a new business. Maybe you’ll join an existing company with a mission you believe in. Or perhaps you’ll decide it’s time to quit your job, which you now realize no longer agrees with your moral compass.

Thank you to Amy Jo Kim, Jess Bachman, and Max Ogles for reading early versions of this essay.

Photo Credit: byJess.net, Sarah G…, and NirAndFar.com

