Excerpt from Wired.com, Feb 2018
BJ Fogg is an unlikely leader for a Silicon Valley movement. He’s a trained psychologist and twice the age of the average entrepreneur with whom he works. His students describe him as energetic, quirky, and committed to using tech as a force for good: In the past, he’s taught classes on making products to promote peace and using behavior design to connect with nature. But every class begins with his signature framework, Fogg’s Behavior Model. It suggests that we act when three forces—motivation, trigger, and ability—converge.
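Fogg's model says a behavior happens only when motivation, ability, and a trigger line up at the same moment. As a minimal sketch of that idea (the multiplicative form and the threshold value here are illustrative assumptions, not something the article specifies):

```python
# Illustrative sketch of Fogg's Behavior Model: a behavior occurs when
# a trigger arrives while motivation and ability together clear a
# threshold. The multiplicative combination and the 0.5 threshold are
# assumptions made for this example only.

def behavior_occurs(motivation: float, ability: float,
                    trigger_present: bool, threshold: float = 0.5) -> bool:
    """Return True when a trigger fires and motivation x ability
    is high enough to cross the activation threshold."""
    if not trigger_present:
        return False
    return motivation * ability >= threshold

# High ability (the app is one tap away) means even modest motivation
# suffices once a notification (trigger) arrives:
behavior_occurs(motivation=0.6, ability=0.9, trigger_present=True)   # True
# No trigger, no behavior, regardless of motivation:
behavior_occurs(motivation=0.6, ability=0.9, trigger_present=False)  # False
```

The sketch also shows why the model runs in reverse: removing the trigger (silencing notifications) or lowering ability (deleting the app) blocks the behavior even when motivation is unchanged.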
Now, from Facebook’s former president claiming that Silicon Valley’s tools are “ripping apart the social fabric of society” to France formally banning smartphones in public schools, we are starting to reexamine the sometimes toxic relationships we have with our devices. Looking at the source of product designers’ education may help us understand the downstream consequences of their creations—and how to reverse them. Tristan Harris, the former Google Design Ethicist, now leads the Time Well Spent movement.
Harris left Google in 2015 to expand the conversation around persuasive design outside of Mountain View. “Never before has a handful of people working at a handful of tech companies been able to steer the thoughts and feelings of a billion people,” he said in a recent talk at Stanford. “There are more users on Facebook than followers of Christianity. There are more people on YouTube than followers of Islam. I don’t know a more urgent problem than this.”
Harris has channeled his beliefs into his advocacy organization, Time Well Spent, which lobbies the tech industry to align with societal well-being. Three years later, his movement has begun to gain steam. Just look at Facebook, which recently restructured its news feed algorithm to prioritize the content that people find valuable (like posts from friends and family) over the stuff that people mindlessly consume (like viral videos). In a public Facebook post, Mark Zuckerberg wrote that one of Facebook’s main priorities in 2018 “is making sure the time we all spend on Facebook is time well spent.” Even, he said, if it’s at the cost of how much time you spend on the platform.
Facebook’s reckoning shows that companies can redesign their products to be less addictive—at the very least, they can try. Perhaps in studying the model that designers used to hook us to our phones, we can understand how those principles can be used to unhook us as well.
Finding the Cure
Fogg acknowledges that our society has become addicted to smartphones, but he believes consumers have the power to unhook themselves. “No one is forcing you to bring the phone into the bedroom and make it your alarm clock,” he says. “What people need is the motivation.”
Nir Eyal’s next book, Indistractable, focuses on how to do that, using Fogg’s model in reverse. It takes the same three ideas—motivation, trigger, and ability—and reorients them toward ungluing us from our phones. For example, you can remove triggers from certain apps by adjusting your notification settings. (Or better yet, turn off all your push notifications.) You can decrease your ability to access Facebook by simply deleting the app from your phone.
“People have the power to put this stuff away and they always have,” says Eyal. “But when we preach powerlessness, people believe that.”
Others, like Harris and venture capitalist Roger McNamee, disagree. They believe corporations’ interests are so intertwined with advertisers’ demands that, until we change the system, companies will always find new ways to maximize consumers’ time spent with their apps. “If you want to fix this as quickly as possible, the best way would be for founders of these companies to change their business model away from advertising,” says McNamee, who was an early investor in Facebook and a mentor to Zuckerberg. “We have to eliminate the economic incentive to create addiction in the first place.”
There is merit to both arguments. The same methods that addict people to Snapchat might keep them learning new languages on Duolingo. The line between persuasion and coercion can be thin, but a blanket dismissal of behavior design misses the point. The larger discussion around our relationship with our gadgets comes back to aligning use with intent—for product designers and users.
Where We Go Next
Harris and McNamee believe manipulative design has to be addressed on a systems level. The two are advocating for government regulation of internet platforms like Facebook, in part as a public health issue. Companies like Apple have also seen pressure from investors to rethink how gadget addiction is affecting kids. But ultimately, business models are hard to change overnight. As long as advertising is the primary monetization strategy for the web, there will always be those who use persuasive design to keep users around longer.
So in the meantime, there are tangible steps we can all take to break the loop of addiction. Changing your notification settings or turning your phone to grayscale might seem like low-hanging fruit, but it’s a place to start.
“It’s going to take the companies way longer than it would take you to do something about it,” says Eyal. “If you hold your breath and wait, you’re going to suffocate.”
Our society is being hijacked by technology.
What began as a race to monetize our attention is now eroding the pillars of our society: mental health, democracy, social relationships, and our children.
What we feel as addiction is part of something much bigger.
There’s an invisible problem that’s affecting all of society.
Facebook, Twitter, Instagram, Google have produced amazing products that have benefited the world enormously. But these companies are also caught in a zero-sum race for our finite attention, which they need to make money. Constantly forced to outperform their competitors, they must use increasingly persuasive techniques to keep us glued. They point AI-driven news feeds, content, and notifications at our minds, continually learning how to hook us more deeply—from our own behavior.
Unfortunately, what’s best for capturing our attention isn’t best for our well-being:
- Snapchat turns conversations into streaks, redefining how our children measure friendship.
- Instagram glorifies the picture-perfect life, eroding our self-worth.
- Facebook segregates us into echo chambers, fragmenting our communities.
- YouTube autoplays the next video within seconds, even if it eats into our sleep.
These are not neutral products.
They are part of a system designed to addict us.
The race for attention is eroding the pillars of our society.
The race to keep us on screen 24/7 makes it harder to disconnect, increasing stress, anxiety, and reducing sleep.
The race to keep children’s attention trains them to replace their self-worth with likes, encourages comparison with others, and creates the constant illusion of missing out.
The race for attention forces social media to prefer virtual interactions and rewards (likes, shares) on screens over face-to-face community.
Social media rewards outrage, false facts, and filter bubbles – which are better at capturing attention – and divides us so we can no longer agree on truth.
The whole system is vulnerable to manipulation.
Phones, apps, and the web are so indispensable to our daily lives—a testament to the benefits they give us—that we’ve become a captive audience. With two billion people plugged into these devices, technology companies have inadvertently enabled a direct channel to manipulate entire societies with unprecedented precision.
Technology platforms make it easier than ever for bad actors to cause havoc:
- Pushing lies directly to specific zip codes, races, or religions.
- Finding people who are already prone to conspiracies or racism, and automatically reaching similar users with “Lookalike” targeting.
- Delivering messages timed to prey on us when we are most emotionally vulnerable (e.g., Facebook found depressed teens buy more makeup).
- Creating millions of fake accounts and bots impersonating real people with real-sounding names and photos, fooling millions with the false impression of consensus.
Meanwhile, the platform companies profit from growth in users and activity.
Won’t we just adapt? Four reasons why it’s different this time.
People always worry that new technology will harm society. Four distinct forces make today different from anything in the past, including TV, radio, and computers:
No other media drew on massive supercomputers to predict what it could show to perfectly keep you scrolling, swiping or sharing.
No other media steered two billion people’s thoughts 24/7 – checking 150 times per day – from the moment we wake up until we fall asleep.
No other media redefined the terms of our social lives: self-esteem, when we believe we are missing out, and the perception that others agree with us.
No other media used a precise, personalized profile of everything we’ve said, shared, clicked, and watched to influence our behavior at this scale.
Exponentially-growing platforms are easily exploited.
As content grows exponentially, platform companies rely increasingly on automation:
- YouTube automates billions of videos to play next for 1.5 billion users.
- Facebook automates millions of ads shown to 2 billion users.
- Twitter automates showing millions of #trending topics to hundreds of millions of users.
Unfortunately, these automatic algorithms are easily gamed to manipulate society at a massive scale, because platforms lack the capacity to reliably check for conspiracies, lies, and fake users.
Profiting from the problem, platforms won’t change on their own.
We can’t expect attention-extraction companies like YouTube, Facebook, Snapchat, or Twitter to change, because it’s against their business model.
Platforms would lose money if they solved the problem:
- Facebook would lose revenue if they blocked advertisers from micro-targeting lies and conspiracies to the people most likely to be persuaded.
- Twitter’s stock price would fall if they were to remove the millions of bots on their platform, which academics estimate at 15% of their user base.
- Facebook would lose revenue if their tools didn’t allow advertisers to automatically test millions of variations of content — word choices, color, images — to capture the most minds.
So we have to change the rules of the game.
The Way Forward
In the future, we will look back at today as a turning point towards humane design: when we moved away from technology that extracts attention and erodes society, towards technology that protects our minds and replenishes society.
Humane Design is the solution.
To design humane technology, we need to start by deeply understanding ourselves.
Humane Design starts by understanding our most vulnerable human instincts so we can design compassionately to protect them from being abused:
- How are we vulnerable to getting overwhelmed, stressed or outraged?
- How are we vulnerable to micro-targeted persuasion? (e.g. messages that use our personality and traits against us?)
- How are we vulnerable to the expectation of being available 24/7 to each other?
We are creating humane design standards, policy, and business models that more deeply align with our humanity and how we want to live.
How? Four levers to redefine our future.
Inspire Humane Design
Apple, Samsung, and Microsoft can help solve the problem, because keeping people hooked to the screen isn’t their business model. They can redesign their devices and core interfaces to protect our minds from constant distractions, minimize screen time, protect our time in relationships, and replace the App Store marketplace of apps competing for usage with a marketplace of tools competing to benefit our lives and society.
Apply Political Pressure
Governments can pressure technology companies toward humane business models by including the negative externalities of attention extraction on their balance sheets, and creating better protections for consumers. We are advising governments on smart policies and better user protections.
Create a Cultural Awakening
Consumers don’t want to use products that they know are harmful, especially when it harms their kids. We are transforming public awareness so that consumers recognize the difference between technology designed to extract the most attention from us, and technology whose goals are aligned with our own. We are building a movement for consumers to take control of their digital lives with better tools, habits and demands to make this change.
Talented employees are the greatest asset of technology companies – and the ones companies are most afraid to lose. Most engineers and technologists genuinely want to build products that improve society, and no one wants to participate in an extraction-based system that ruins society. We are empowering employees who advocate for non-extraction-based design decisions and business models.
We have built a world-class team of deeply concerned former tech insiders and CEOs who intimately understand the culture, business incentives, design techniques, and organizational structures driving how technology hijacks our minds.
Since 2014, we have articulated the problem to millions of people through mainstream media outlets like 60 Minutes, TED, and Real Time with Bill Maher. We have convened top industry leaders in private settings and presented directly to Heads of State and political leaders to drive change. We’ve also worked with leading designers and technology executives to find ways to align their products with the well-being of humanity.
Co-Founder & Executive Director
Tristan Harris, called the “closest thing Silicon Valley has to a conscience” by The Atlantic magazine, is the former Design Ethicist at Google. He became a world expert on how technology steers the thoughts, actions, and relationships that structure two billion people’s lives, and left Google to bring the issue into public conversation. Tristan spent over a decade understanding subtle psychological forces, from his childhood as a magician, to working with the Stanford Persuasive Technology Lab, to his role as CEO of Apture, a technology company that was acquired by Google. His work has been featured on 60 Minutes, TED, The Atlantic, the PBS News Hour, and more. He has worked with major technology CEOs and briefed political leaders, including Heads of State. Tristan holds several patents from his work at Apple, Wikia, Apture, and Google.
Roger McNamee is co-founder of Elevation Partners and a 35-year veteran of technology investing. Roger helped Facebook in its early days as an advisor to Mark Zuckerberg, and introduced Sheryl Sandberg to Mark. Starting in 2016, he became concerned that Facebook was not taking responsibility for how its platform was being exploited to influence elections and disenfranchise different groups. Roger is best known for early-stage venture investments in Electronic Arts, Rambus, Overture, Facebook, and Yelp, as well as for the leveraged buyout of Seagate.
Renée researches the impact of social network communities on specific policies and political issues. She is particularly interested in the spread of pseudoscience, and in the role of automation and bots in mass collective action. She regularly writes and speaks about the role that tech platforms play in the proliferation of disinformation and conspiracy theories.
Co-Founder & Chief Strategy Officer
Aza Raskin helped build the web at Mozilla as head of user experience, was named to Inc and Forbes 30-under-30 and became the Fast Company Master of Design for his work founding Massive Health, a consumer health and big data company. The company was acquired by Jawbone, where he was VP of Innovation. Before that, he founded Songza.com (acquired by Google), and studied dark matter physics. For Aza, the problem is especially personal: his father, Jef Raskin, created the Macintosh project at Apple with the vision that technology should help, not harm, humans.
Randima (Randy) Fernando
Co-Founder & Chief Operating Officer
Randy Fernando brings a values-driven mix of management, technology, and nonprofit backgrounds. For 7 years, he was Executive Director at Mindful Schools, a nonprofit he helped to grow from a fledgling $200K organization to a $4 million leader in the mindfulness field, impacting nearly a million children worldwide. Prior to that, Randy ran several award-winning projects over 7 years at NVIDIA, including three books he authored on real-time computer graphics. His work has been used in many top-flight video games, including Grand Theft Auto V, Assassin’s Creed IV, and Far Cry 4.
Key Advisors & Supporters
Dr. Natasha Schüll
Author, Addiction by Design
Dr. Jon Kabat-Zinn
Founder, Mindfulness-Based Stress Reduction
The National Day of Unplugging
Other Ways You Can Help
We’ve got ways for everyone to join this rapidly growing movement.
Help Us Find the Best Talent
Share our open job postings with outstanding candidates.
Offer Financial Support
Your financial support helps bring our projects to life more quickly.