Keep Your Friends Close…: Technology and the Politics of Fear

Tracy Dennis-Tiwary
Jun 29, 2016 · 5 min read

Speaking at the Personal Democracy Forum (PDF) 2016 was one of those paradigm-shifting conference experiences for me. Before PDF, I tended to hear technophilic, almost Pollyannaish narratives about how technology can make our lives, and our civic lives, better. I was clearly behind the times, because I now see the narrative shifting and morphing into a much more challenging, questioning viewpoint that might best be described by the saying "keep your friends close, but keep your enemies closer."

In almost every talk I heard, technology and the digital economy were described as a double-edged sword: a way to ignite change, but one with high potential costs and full of booby traps. Those who create technology? A mixed bag at best. Anil Dash didn't mince words when he called the technocrats and Silicon Valley billionaires liars and the new robber barons. Kentaro Toyama compared the digital economy to The Matrix, in which our personal data is the lifeblood of those same Silicon Valley billionaires, our evil robot overlords.

I have to admit that I take grim pleasure in the aptness of these metaphors, and have uttered identical words myself. However, it is also clear that these ideas are polarizing and, like extremism in politics, privilege emotions above logic, driving more fractious and divisive discourse. Luna Malbroux's hilarious talk about "EquiTable," a faux app she developed to create dialogue about social justice and equity, is a nice example of how to break away from bitter recriminations and instead use humor as a powerful weapon for change.

But if technology is a very sharp double-edged sword, how do we wield it without cutting ourselves? How do we, as Yvette Alberdingk Thijm described in her talk about using technology as civic witnesses, harness technology for good without allowing others to use it against us?

Keep your friends close…

PDF yielded many ideas and solutions; I mention only a few below (including mine). I was particularly interested in those demanding that technology serve humanistic goals and that the well-being of people be part and parcel of how we design and build technology. To do this, we have to open our eyes and take a cold, hard look at how our romance with technology has caused us to take our hands off the wheel (no driverless-car pun intended).

My talk (text can be found here) centered on technology and mental health. I argued that the psychological and emotional nature of the tech we build is not peripheral or ancillary; it is fundamental to shaping how we use tech for healing. Right now, technology and digital culture are precisely and relentlessly designed to hijack our attention and our emotional brains for the economic benefit of their creators. This is the basis of the attention economy. To gather, mine, and sell our personal data, technology needs to be addictive, keeping us looking, clicking, buying, swiping, checking, and clutching our devices, eyeballs on the screen, hoping to hear the next best thing, to feel connected, soothed, and understood. This is counter to health promotion, and it creates imbalance instead of balance, weakness instead of strength. The notion that technology is designed to hijack our brains was beautifully and compellingly described in a blog post by Tristan Harris just a few days after PDF.

I ended my talk with a call to action: we must reclaim the technology culture to serve and amplify humanity and well-being, rather than serve the attention economy. We must further anchor this new culture in key values, including the value that our attention is sacred and valuable, not just the coin of the realm. We must own and be responsible for how we spend our precious attention.

Sherry Turkle observed how our excitement over the rapid pace of technological advances makes us forget some fundamental, common-sense things we know about life. For example, after research suggested that self-reported declines in empathy among millennials could be caused by growing use of social media and digital communication, one researcher's solution was to build an "empathy app." Why would we ever think that technology could make us more empathic, that the thing that might have caused declines in empathy could also be the solution? Dr. Turkle described how many aspects of digital technology actually allow us to hide from the challenges of feeling and expressing emotions in our relationships, to "sidestep physical presence" and seek "frictionless relationships." The solution, as Dr. Turkle quipped, is to reclaim common sense and realize that we are the empathy app.

danah boyd called our attention to the immense ethical disconnect in how code, the digital infrastructure of our civic lives, is constructed. This is an industry in "perpetual beta," and thus there are few if any standards, audits, or inspections of code. There is also little consideration of the resources required to maintain the immense glut of data generated every day, and little awareness of how bias and inaccuracy are built into data analytics. These questions are of the utmost importance because an increasing number of decisions in our personal and civic lives are made on the basis of algorithms and digital profiling. She exhorted us to be careful about how and what we code.

…but keep your enemies closer

As in everything, knowledge is power. I felt that we at PDF, speakers, participants, and audience alike, implicitly but universally agreed to keep our eyes open, to look our crush, technology, in the face and see that she may not be on our side anymore, but to hope that it's not too late. Technology is empowering, BUT…. We all agreed to spend more time on the "buts," as well as on the when, how, and under what conditions we can reclaim technology for humanity. In his PDF talk, Kentaro Toyama invoked the great Isaac Asimov and the First Law of Robotics from Asimov's "I, Robot" (a robot may not injure a human being or, through inaction, allow a human being to come to harm). In Asimov's universe, the powers of technology are, at their core, designed and harnessed for the benefit of people. I believe that we must and can insist that our technology conform to this higher standard, and that with this as a guiding light, we can wield the double-edged sword of technology for more good than ill.


Originally published at psychescircuitry.wordpress.com on June 29, 2016.


Tracy Dennis-Tiwary

Psychologist and researcher. Anxiety wrangler. Founder of Wise Therapeutics. Science is the new rock & roll. wisedtx.com; dennis-tiwary.com.