
Monday, May 6, 2019

Preventing Human Downgrading: A Race to the Top for Wellbeing


In the past I've written about Tristan Harris and his efforts to help guide technology toward improving our lives and relationships rather than addicting us. (See One to Watch: The Center for Humane Technology as well as Your Phone is a Slot Machine Purposely Designed to Be Addictive.) Two weeks ago, Tristan and the Center for Humane Technology held an event and announced "A New Agenda for Tech," challenging Silicon Valley leaders to "reverse human downgrading by inspiring a new race to the top and realigning technology with humanity."
 
The talk is a must-watch--please set aside 45 minutes to watch it. (If that seems long, consider that the average person spends 60+ minutes/day on YouTube--and watching this talk will make you aware of an eye-opening unintended result of YouTube's artificial intelligence recommendation engine, one which affects us all and is changing culture worldwide.) It's important for educators, parents, kids, politicians, technology leaders--everyone--to understand what has happened and is happening, and to act on it. The following email I received last week summarizes the event's content more concisely than I can:
Introducing “Human Downgrading”
Our primary goal was to move public discourse in Silicon Valley from a cacophony of disconnected grievances and scandals ("they took our data!") to a meaningful humane agenda of actions that address the vast surface area of problems arising from technology’s race for attention. 
In last week’s presentation, we explained how seemingly separate problems – tech addiction, teen depression, shortening attention spans, political polarization, the breakdown of truth, outrage-ification of culture, and the rise of vanity/micro-celebrity culture – are actually not separate issues. They are all symptoms of one underlying problem: the race between tech giants to capture human attention, which becomes a race to overwhelm human weaknesses. Put together, that race creates “human downgrading.” 
Giving a name to the connected system of human downgrading is crucial, because without it, solution creators end up working in silos and attempt to solve the problem by playing an infinite “whack-a-mole” game. It’s like working on coral reefs, ocean acidification, or hurricanes before there was a recognition of systemic climate change. Shared language creates the opportunity and leverage to develop systemic solutions and unite the voices of concerned technologists, investors, researchers, media, policymakers, and parents. 
Human downgrading is a problem that reduces our capacity to solve all other problems. It suppresses critical thinking and nuance, makes us more lonely, and reduces our capacity to find common ground and shared values. 
The Solution?
While the problem is immense, the good news is that the solution involves one thing: better protecting the vulnerabilities of human nature. Technologists must approach innovation and design with an awareness of, and protections for, the ways we're manipulable as human beings. Instead of more artificial intelligence or more advanced tech, we actually just need more sophistication about what protects and heals human nature and social systems. To that end, we are developing a new model that technologists can use to explore and assess how technology affects us at the individual, relational, and societal levels. 
What’s Next?
This is the beginning of a long journey. Instead of a race to the bottom of "how can we most effectively manipulate you," we must create a new race to the top to completely reverse human downgrading. To start facilitating this transition, we are announcing four initiatives: 
Opportunities for key stakeholders to plug into working groups to take action.
"Your Undivided Attention" — a new podcast launching in June where Tristan and Aza gather insights about the invisible limits and ergonomics of human nature from a wide range of experts to address human downgrading. 
Design guides to help designers assess and redesign their products across human sensitivities and social spaces. 
A Humane Technology conference in the next year to bring together people working on many different aspects of human downgrading. 
I'm thankful that Tristan and his colleagues are pushing for this. We have already seen some of the fruits of their labor: as mentioned during the video, Apple's Screen Time and Google's Digital Wellbeing tools, which help users take control of their technology and be more mindful of its use, are now in place on over a billion phones. That's promising change happening already, and I look forward to more to come. A while back I joined their online community forums, and I encourage you to do the same. Keep talking about these topics with those around you--with family, friends, and colleagues. Read more in related posts, too:

Monday, March 13, 2017

Your Phone is a Slot Machine Purposely Designed to Be Addictive

This video on the Time Well Spent site nicely explains how apps are designed to be addictive.

In his TED Talk, How Better Tech Could Protect Us From Distraction, and in an essay, Tristan Harris explains that the apps on your phone are like a slot machine, designed to be addictive. He certainly has the background and inside experience to say so, as a former Google Design Ethicist and a student at the Stanford Persuasive Technology Lab (yes, there is such a place, "looking for back doors in people's minds to influence their behavior"). That phones are addictive is not news; what is news is that a former insider is openly explaining how tech companies work to make them even more addictive, and that some insiders are pushing for an alternative. I really admire this and appreciate his work.

This isn't the first time I have heard and blogged about Tristan. I first wrote about him two years ago (see Techcognition in an Attention Economy) after hearing him on WNYC's Note to Self podcast, one of my favorites. Host Manoush Zomorodi interviewed Tristan again last week about addictive apps and his new company, Time Well Spent. The episode, Will You Do a Snapchat Streak with Me?, is worth hearing. I listened with great interest after seeing so many kids addicted to Snapchat. (With my own kids, we don't allow Snapchat until age 16, and we often wish we hadn't ever allowed it!) A few of the main points Tristan made:
  • 40 people at three companies are shaping how billions of people behave.
  • Recently the Netflix CEO said its biggest competitors are YouTube, Google, and sleep.
  • On Snapchat's Streaks: Snapchat's goal is to hook kids and make the app a habit (vs. an alternative like Duolingo, where users set usage goals for learning Spanish). 
  • No one is malicious, but technology isn't neutral. It has an intelligent engine in it, fine-tuned to make you use it more (see the sketch just below).
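To make that last point concrete, here is a minimal, hypothetical Python sketch of the kind of "engine" Tristan describes: an epsilon-greedy bandit that learns which notification to send by rewarding whatever gets the app opened most. The variant names and probabilities are invented for illustration; no real product's code is implied.

```python
import random

# Hypothetical sketch: an epsilon-greedy bandit that "fine-tunes" which
# notification variant to send, maximizing the chance the user opens the app.
# Nothing here is any real company's code; it only illustrates the mechanism.

VARIANTS = ["friend_tagged_you", "streak_expiring", "new_followers"]
EPSILON = 0.1  # how often to explore a random variant

sends = {v: 0 for v in VARIANTS}   # times each variant was sent
opens = {v: 0 for v in VARIANTS}   # times each variant led to an app open

def open_rate(variant):
    return opens[variant] / sends[variant] if sends[variant] else 0.0

def pick_variant():
    # Mostly exploit the best-performing variant, occasionally explore.
    if random.random() < EPSILON:
        return random.choice(VARIANTS)
    return max(VARIANTS, key=open_rate)

def record_outcome(variant, opened):
    sends[variant] += 1
    if opened:
        opens[variant] += 1

# Simulated users who (hypothetically) respond most to streak pressure.
TRUE_OPEN_PROB = {"friend_tagged_you": 0.30, "streak_expiring": 0.55,
                  "new_followers": 0.20}

for _ in range(10_000):
    v = pick_variant()
    record_outcome(v, random.random() < TRUE_OPEN_PROB[v])

for v in VARIANTS:
    print(f"{v}: sent {sends[v]} times, open rate {open_rate(v):.2f}")
# The engine converges on whichever message pulls users back most often --
# no malice required, just an objective that rewards more use.
```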
I found his description of phones as slot machines especially enlightening. In an essay on his website, Tristan asks: 
If you’re an app, how do you keep people hooked? Turn yourself into a slot machine. One major reason why is the #1 psychological ingredient in slot machines: intermittent variable rewards (he links to Wikipedia for more on this). The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices? ... here’s the unfortunate truth — several billion people have a slot machine in their pocket:
When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got. 
When we pull to refresh our email, we’re playing a slot machine to see what new email we got. 
When we swipe down our finger to scroll the Instagram feed, we’re playing a slot machine to see what photo comes next. 
When we swipe faces left/right on dating apps like Tinder, we’re playing a slot machine to see if we got a match. 
When we tap the # of red notifications, we’re playing a slot machine to see what’s underneath.
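The schedule Tristan names, intermittent variable rewards, is simple enough to simulate. Here is a small Python sketch (all probabilities are made up for illustration) of 150 daily phone checks as slot machine pulls: most come up empty, a few pay off, and the payoff is never predictable.

```python
import random

# A minimal sketch of the "intermittent variable reward" schedule Tristan
# describes: each phone check is a lever pull whose payoff is unpredictable.
# The probabilities below are invented for illustration.

CHECKS_PER_DAY = 150  # the average cited in the essay

def pull_the_lever():
    """One phone check: usually nothing, occasionally a small jackpot."""
    if random.random() < 0.30:        # ~30% of checks find anything at all
        return random.randint(1, 5)   # a variable number of notifications
    return 0

payoffs = [pull_the_lever() for _ in range(CHECKS_PER_DAY)]
hits = sum(1 for p in payoffs if p)

print(f"{CHECKS_PER_DAY} checks, {hits} paid off, "
      f"{CHECKS_PER_DAY - hits} came up empty")
# Like a slot machine, it's exactly this unpredictability -- reward sometimes,
# nothing most of the time, on a schedule you can't anticipate -- that makes
# the next check feel worth it.
```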
I'm struck by this analogy and concerned by how early we are handing our kids slot machines and hooking them into this cycle. So what can we do about it?
"Imagine a future where technology is 
built on our values, not our screen time."

Tristan explains that thousands of engineers are working to keep your attention. If they stopped fighting the war for your time, they would lose money. So the only solutions are regulation or consumer demand for something different. In his TED Talk, Tristan points out that McDonald's didn't offer salads until consumers demanded them. So, too, must we demand that technology be designed to use our time differently. I love the phrase at the end of the Panda Dancing video on their site linked above: "Imagine a future where technology is built on our values, not our screen time." That sounds great, and it's a future we should start demanding!