How Technology Removes Our Choices
How product designers are using various techniques to lure people to choose from limited options
If you use Google to search for “Italian restaurant,” you are likely to see a small box at the top of the screen with a few results below a map. The positioning is significant: viewers are far more likely to click on those results than on anything else on the page, much as shoppers are more likely to pick up products from eye-level supermarket shelves than from higher or lower ones.
But whereas in the physical world this limitation primarily affects our shopping experience, in the online and technology worlds, this algorithmic and sometimes intentional selection affects every subsequent thing that we see or do on that page—and far beyond it. The menu is the interface that controls the manner of engagement and sets limits on it, and the way menus are layered can radically alter the way we behave with technology.
For example, on iPhones Apple offers an important privacy feature: the toggle that wipes the in-app advertising identifiers that app makers can use to analyze and track users. Unfortunately, Apple buries that feature three layers deep in the settings menus. As a result, few people use it, even though using it regularly could significantly benefit their privacy by making it much harder for companies to track their behavior in smartphone apps.
(The industry would say that using it would lead people to have less personalized and less useful experiences, which is certainly true; there is always a trade-off.)
Apple has in general taken a strong leadership position in protecting the privacy of its customers—by minimizing storage of customer data and by designing systems such as Apple Pay to present fewer opportunities for third parties to access and potentially intercept those data. But its placement of that single toggle deep in the weeds on the iPhone illustrates how decisions by product makers influence our freedom of choice and our relationship with technology. By clearing that identifier regularly, phone users would strip away some of app developers' ability to accurately target and personalize in-product offers, e-mails, and other entreaties that further guide or limit our choices and set the agenda for us.
Another example is the ability to set notifications on the iPhone. Apple does not allow us to make global changes to the notification settings of all our apps. This means we must go through the apps one by one and adjust each one's notification settings. Sure, we can silence them all by putting the device in “Do Not Disturb” mode, but that is a clumsy fix. Apple’s menu design for managing notifications reduces our choices, and not necessarily to our advantage (which seems odd from Apple, a company that became dominant precisely by simplifying technology).
As a number of thinkers in this field, led by former Google design ethicist Tristan Harris, explain, menus also frame our view of the world. A menu that shows our “most important” e-mails becomes a list of the people we have corresponded with most often recently rather than of those who are most important to us. A message that asks “Who wants to meet for brunch tomorrow?” goes out to the most recent group of people we have sent a group text to, or to preset groups of friends, effectively locking in these groups and locking out new people we have met. Among the canned responses that Google automatically suggests in its Inbox e-mail program, we have yet to see “Pick up the phone and call this person” as an option, even if, after a heated e-mail exchange, a call or a face-to-face conversation may well be the best way to communicate and to smooth the waters.
A feed of world news becomes a list built by a nameless, faceless algorithm of topics and events the system decides interest us. It limits our choice by confining it to options within a set of patterns deriving from our past consumption history, and this may or may not relate to our immediate needs or interests. Unfortunately, no one has yet developed an effective algorithm for serendipity. From the start of the day, a feed of what we missed on Facebook or Twitter as we slept presents us with a menu of comparisons that stokes our fear of missing out (FOMO). This is so by design. However benign its intent, its effect is to significantly limit our frames of reference and our thinking.
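The feed behavior described above can be sketched as a tiny content-based filter. This is a toy model with hypothetical names, not any real platform's algorithm; it simply shows how ranking by past-consumption patterns locks out everything unfamiliar:

```python
from collections import Counter

def build_feed(candidate_items, consumption_history, k=5):
    """Rank candidate items by overlap with topics the user has
    already consumed -- a toy content-based filter.

    Items whose topic never appears in the history score zero and
    sink to the bottom, illustrating how such a feed confines
    choice to patterns derived from past behavior.
    """
    topic_counts = Counter(item["topic"] for item in consumption_history)
    ranked = sorted(
        candidate_items,
        key=lambda item: topic_counts.get(item["topic"], 0),
        reverse=True,
    )
    return ranked[:k]

history = [{"topic": "tech"}, {"topic": "tech"}, {"topic": "sports"}]
candidates = [
    {"id": 1, "topic": "tech"},
    {"id": 2, "topic": "opera"},   # never consumed: effectively locked out
    {"id": 3, "topic": "sports"},
]
feed = build_feed(candidates, history, k=2)
```

Note that nothing in this ranking rewards novelty: the "opera" item can never outrank familiar topics, which is exactly the missing "algorithm for serendipity."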
In May 2016, Tristan Harris published an influential essay titled “How Technology Is Hijacking Your Mind—from a Magician and Google Design Ethicist,” describing the many ways by which smartphones suck people into their vortex and demand constant attention. Harris traced the lineage of (both inadvertent and intentional) manipulation common in the design of technology products directly to the numerous techniques that slot-machine designers use to entice gamblers to sit for hours losing money.
Inspired by Harris and other advocates of more-mindful technology product design, a small but growing Silicon Valley movement in behavioral design is advocating greater consideration of the ethics and the human outcomes of technology consumption. (After leaving Google, Harris launched a website, Time Well Spent, that focuses on helping people build healthier interactions with technology.)
Harris, New York University marketing professor Adam Alter, and others have criticized the various techniques that product designers are using to encourage us to consume ever more technology even to our own clear detriment.
Tightly controlling menus to direct our attention is one common technique (one that is not as easily available to offline businesses). For his part, Harris suggests that we ask four questions whenever we’re presented with online menus: (1) What’s not on the menu? (2) Why am I being given these options and not others? (3) Do I know the menu provider’s goals? (4) Is this menu empowering for my original need, or are the choices actually a distraction?
We assure you, once you start asking these questions, you will never look at the Internet or at software applications in the same light again!
Another technique, alluded to in the title of Harris’s slot-machine article, is the use of intermittent variable rewards: unpredictability in the rewards of an interaction. The behaviorist psychologist B.F. Skinner introduced this concept with his “Skinner box” research. Skinner put rats into boxes and taught them to push levers to receive a food pellet. The rats learned the connection between behavior and reward quickly, in only a few tries. With further research, Skinner learned that the best way to keep the rats motivated to press the lever repeatedly was to reward them with a pellet only some of the time—to give intermittent variable rewards. Otherwise, the rats pushed the lever only when they were hungry.
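The difference between a continuous schedule (a pellet every press) and an intermittent variable schedule can be made concrete with a small simulation. The probabilities here are illustrative, not drawn from Skinner's data:

```python
import random

def run_schedule(presses, reward_probability, seed=42):
    """Simulate lever presses under a reinforcement schedule.

    reward_probability = 1.0 models continuous reinforcement: every
    press pays, so the outcome is fully predictable.  A value such as
    0.3 models an intermittent variable schedule: about 30% of presses
    pay, but the subject cannot predict which ones -- the uncertainty
    that makes the behavior compulsive.
    """
    rng = random.Random(seed)  # fixed seed keeps the sketch reproducible
    rewards = [rng.random() < reward_probability for _ in range(presses)]
    return sum(rewards)

continuous = run_schedule(100, 1.0)  # every press rewarded
variable = run_schedule(100, 0.3)    # some presses rewarded, unpredictably
```

The point of the model is not the totals but the unpredictability: under the variable schedule, no individual press carries any information about whether the next one will pay.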
The casinos took the concept of the Skinner box and raised it to a fine art, designing multiple forms of variable rewards into the modern computerized versions of slot machines.
Those machines now take in 70 to 80 percent of casino profits (or, according to one industry official, even 85 percent). Players not only receive payouts at seemingly random intervals but also receive partial payouts that feel like a win even if the player in fact loses money overall on a turn. With the newer video slots, players can place dozens of bets on the repetition of a screen icon in various directions and in varying sequence lengths.
Older mechanical slot machines displayed three reels and one line. Newer video slot machines display digital icon grids of five by five or more. This allows for many more types of bets, and multiple bets in the same turn. For example, the player can bet on how many times the same icon will appear in a single row, how many times it will appear on a diagonal, and how many times it will appear in a screen full of icons, all in one turn. This lets players win one or more small bets during a turn and feel the thrill of victory, even though in aggregate they lose money on their collective bets for the turn. The brain’s pleasure centers do not distinguish well between actual winning and the techniques that researchers call losses disguised as wins (LDW). The machines are also programmed to highlight near misses (nearly enough of the right numbers), since near misses stimulate the same neurons as real wins do.
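A loss disguised as a win is easy to define precisely: any spin whose payout is positive but smaller than the total amount wagered across all lines. The sketch below uses an invented payout table, not real machine odds, purely to show how often such spins can occur:

```python
import random

def classify_spin(bet, payout):
    """Classify one multi-line spin by its net result."""
    if payout == 0:
        return "loss"
    if payout < bet:
        # the machine celebrates this with lights and sound,
        # but the player has lost money on the turn
        return "loss disguised as win"
    return "win"

def simulate(spins=1000, bet=10, seed=7):
    """Count outcome types over many spins of a toy machine.

    The payout table is illustrative: with many lines in play, most
    spins pay *something*, but usually less than the total bet.
    """
    rng = random.Random(seed)
    counts = {"loss": 0, "loss disguised as win": 0, "win": 0}
    for _ in range(spins):
        payout = rng.choice([0, 0, 2, 4, 6, 8, 15])
        counts[classify_spin(bet, payout)] += 1
    return counts
```

Because four of the seven entries in this toy payout table are positive but below the bet, most spins register in the brain as "wins" while steadily draining the player's balance.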
Machine designers use myriad other clever sensory tricks—both visual and auditory—to stimulate our neurons in ways that encourage more playing. As explained in a 2014 article in The Conversation, “Losses disguised as wins, the science behind casino profits”: “Special symbols might be placed on the reels that provide 10 free spins whenever three appear anywhere within the game screen. These symbols will often make a special sound, such as a loud thud when they land; and if two symbols land, many games will begin to play fast tempo music, display flashing lights around the remaining reels, and accelerate the rate of spin to enhance the saliency of the event. When you win these sorts of outcomes you feel as though you have won a jackpot; after all, 10 free spins is 10x the chances to win big money right? The reality is that those 10 free spins do not change the already small probability of winning on any given spin and are still likely to result in a loss of money. For many games, features such as this have entirely replaced standard jackpots.”
What helps these techniques entice humans to keep playing is that our brains are hard-wired to become easily addicted to variable rewards. This makes sense when you consider that finding food in prehistoric, pre-agricultural times was a perfect example of intermittent variable rewards. According to research by Robert Breen, video-based gambling games (of which slots represent the majority) that rely on intermittent variable rewards produce gambling addiction three to four times faster than does betting on card games or sporting events.
Smartphones were not explicitly designed to behave like slot machines, but their effect is nearly the same. As Harris writes: “When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got. When we pull to refresh our email, we’re playing a slot machine to see what new email we got. When we swipe down our finger to scroll the Instagram feed, we’re playing a slot machine to see what photo comes next. When we swipe faces left/right on dating apps like Tinder, we’re playing a slot machine to see if we got a match. When we tap the [red badge showing us the number of notifications in an app], we’re playing a slot machine to [see] what’s underneath.”
Through this lens we can see how many actions deeply embedded in the technology we use act as variable-reward systems; look at the technology in your life and you will find intermittent variable rewards in nearly every product, system, or device. Embedded in everything from e-mail to social media to chat systems to Q&A sites such as Quora, this reward structure is omnipresent and not easy for us to control without going to extremes and without constant vigilance.