Monday, May 6, 2019

Preventing Human Downgrading: A Race to the Top for Wellbeing

In the past I've written about Tristan Harris and his efforts to guide technology toward making us better--improving our lives and relationships--rather than addicting us. (See One to Watch: The Center for Humane Technology as well as Your Phone is a Purposely Designed Slot Machine Designed to Be Addictive.) Two weeks ago, Tristan and the Center for Humane Technology held an event and announced "A New Agenda for Tech," challenging Silicon Valley leaders to "reverse human downgrading by inspiring a new race to the top and realigning technology with humanity."
The talk is a must-watch--please set aside 45 minutes to do so. (If that seems long, consider that the average person spends 60+ minutes per day on YouTube--and watching this talk will make you aware of an eye-opening, unintentional result of YouTube's artificial intelligence recommendation engine that affects us all and is changing worldwide culture.) It's important for educators, parents, kids, politicians, technology leaders, and everyone else to understand what has happened and is happening, and to act on it. The following email I received last week summarizes the content of the talk more concisely than I could:
Introducing “Human Downgrading”
Our primary goal was to move public discourse in Silicon Valley from a cacophony of disconnected grievances and scandals ("they took our data!") to a meaningful humane agenda of actions that address the vast surface area of problems arising from technology’s race for attention. 
In last week’s presentation, we explained how seemingly separate problems – tech addiction, teen depression, shortening attention spans, political polarization, the breakdown of truth, outrage-ification of culture, and the rise of vanity/micro-celebrity culture – are actually not separate issues. They are all symptoms of one underlying problem: the race between tech giants to capture human attention, which becomes a race to overwhelm human weaknesses. Put together, that race creates “human downgrading.” 
Giving a name to the connected system of human downgrading is crucial, because without it, solution creators end up working in silos and attempt to solve the problem by playing an infinite “whack-a-mole” game. It’s like working on coral reefs, ocean acidification, or hurricanes before there was a recognition of systemic climate change. Shared language creates the opportunity and leverage to develop systemic solutions and unite the voices of concerned technologists, investors, researchers, media, policymakers, and parents. 
Human downgrading is a problem that reduces our capacity to solve all other problems. It suppresses critical thinking and nuance, makes us more lonely, and reduces our capacity to find common ground and shared values. 
The Solution?
While the problem is immense, the good news is that the solution involves one thing: better protecting the vulnerabilities of human nature. Technologists must approach innovation and design with an awareness of the ways we're manipulable as human beings, and a commitment to protecting them. Instead of more artificial intelligence or more advanced tech, we actually just need more sophistication about what protects and heals human nature and social systems. To that end, we are developing a new model that technologists can use to explore and assess how technology affects us at the individual, relational, and societal levels. 
What’s Next?
This is the beginning of a long journey. Instead of a race to the bottom of "how can we most effectively manipulate you," we must create a new race to the top to completely reverse human downgrading. To start facilitating this transition, we are announcing four initiatives: 
Opportunities for key stakeholders to plug into working groups to take action.
"Your Undivided Attention" -- a new podcast launching in June where Tristan and Aza gather insights about the invisible limits and ergonomics of human nature from a wide range of experts to address human downgrading. 
Design guides to facilitate assessment across human sensitivities and social spaces to help guide designers in redesigning their products. 
A Humane Technology conference in the next year to bring together people working on many different aspects of Human Downgrading. 
I'm thankful that Tristan and his colleagues are pushing for this, and we have already seen some of the fruits of their labor. As mentioned in the video, Apple's Screen Time and Google's Digital Wellbeing tools--which help users take control of their own technology and be more mindful of its use--are now in place on over a billion phones. That's promising change happening already, and I look forward to more to come. A while back I joined their online community forums, and I encourage you to do the same. Keep talking about these topics with those around you--with family, friends, and colleagues.
