
“It’s hard to be a moral person. Technology is making it harder.” Sigal Samuel / Vox.com (my highlights)

“I half-choked on my tea and stared at my laptop. I recognized the post as a plea for support. I felt fear for him, and then … I did nothing about it, because I saw in another tab that I’d just gotten a new email and went to check that instead. (…) I began to notice that digital technology often seems to make it harder for us to respond in the right way when someone is suffering and needs our help.”

Read more via Vox.com

“What if it’s also making us less empathetic, less prone to ethical action? What if it’s degrading our capacity for moral attention — the capacity to notice the morally salient features of a given situation so that we can respond appropriately? (…) Many a bystander has witnessed a car accident or a fist-fight and taken out their phone to film the drama rather than rushing over to see if the victim needs help.”

Read more via Vox.com

“Plain old attention — the kind you use when reading novels, say, or birdwatching — is a precondition for moral attention, which is a precondition for empathy, which is a precondition for ethical action. (…) “Decreating the self — that’s the opposite of social media,” she says, adding that Facebook, Instagram, and other platforms are all about identity construction. Users build up an aspirational version of themselves, forever adding more words, images, and videos, thickening the self into a “brand.”  Read more via Vox.com

“But what’s even more disconcerting is that our devices disconnect us even when we’re not using them. As the MIT sociologist Sherry Turkle, who researches technology’s adverse effects on social behavior, has noted: “Studies of conversation, both in the laboratory and in natural settings, show that when two people are talking, the mere presence of a phone on a table between them or in the periphery of their vision changes both what they talk about and the degree of connection they feel. People keep the conversation on topics where they won’t mind being interrupted. They don’t feel as invested in each other.”  Read more via Vox.com

“This would be a fair critique if there were symmetrical power between users and tech companies. But as the documentary The Social Dilemma illustrates, the companies understand us better than we understand them — or ourselves. They’ve got supercomputers testing precisely which colors, sounds, and other design elements are best at exploiting our psychological weaknesses (many of which we’re not even conscious of) in the name of holding our attention. Compared to their artificial intelligence, we’re all children, Harris says in the documentary. And children need protection.”  Read more via Vox.com

“[T]he “lack of addictive features” is part of why new social networks meant as more ethical alternatives to Facebook and Twitter — like Ello, Diaspora, or App.net — never manage to peel very many people off the big platforms. (…) More time on these platforms equals more money, so if the healthy thing for society was less use of Facebook and a very different kind of Facebook, that’s not in line with the business model and they’re not going to be for it.”  Read more via Vox.com

A must-watch keynote on the Future of Social Media

How could social media become ‘human’ again? How can we stop the disinformation, dehumanisation and dataism that have resulted from social media’s algorithmic obsessions?
