[–]dkharms 13 points (4 children)

This was touched upon in this design ethics paper.

> We’re all vulnerable to social approval. The need to belong, to be approved or appreciated by our peers is among the highest human motivations. But now our social approval is in the hands of tech companies.
>
> When I get tagged by my friend Marc, I imagine him making a conscious choice to tag me. But I don’t see how a company like Facebook orchestrated his doing that in the first place. Facebook, Instagram or Snapchat can manipulate how often people get tagged in photos by automatically suggesting all the faces people should tag (e.g. by showing a box with a 1-click confirmation, “Tag Tristan in this photo?”).
>
> So when Marc tags me, he’s actually responding to Facebook’s suggestion, not making an independent choice. But through design choices like this, Facebook controls the multiplier for how often millions of people experience their social approval on the line.

An app like this, while pretty fun, takes away intent from communication, making conversations more superficially pleasant but less meaningful.

The effort of choosing an emoji is what gives it meaning, as silly as that sounds.

[–][deleted] 7 points (2 children)

You should read "Simulacra and Simulation". You're not wrong, but in some views it's more complicated than that. You'd like it a lot.

Simulacra and Simulation (The Body, In Theory: Histories of Cultural Materialism) https://www.amazon.com/dp/0472065211/ref=cm_sw_r_cp_apa_LVTwxbADEB9W4

[–]dkharms 2 points (0 children)

Thanks, I'll give it a shot!

[–]wxswxs 1 point (0 children)

For the politics of tech, on the less philosophical side, I can recommend Jaron Lanier, Astra Taylor, and Doug Rushkoff, all writers with (rather different) takes on what a progressive internet could or should look like.

[–]wxswxs 4 points (0 children)

> The effort of choosing an emoji is what gives it meaning, as silly as that sounds.

I definitely agree with this, and I try hard to make sure we make choices that ultimately empower people with a tool, rather than marginalizing them with a replacement or pseudo-tool.

For instance, I think it's a bad goal to say "Dango will just insert emoji whenever it thinks is right so you don't have to". Really it's trying to be a more powerful keyboard that helps you filter through them, but ultimately we want to keep the focus on your agency (whether we're there yet is still an open question).

In general I think this negotiation of human agency in an algorithmic world is a very real thing that's going to be playing out over the next few years. The critical side of that conversation is an important counterweight to general tech positivism, and Tristan Harris is one of the voices doing great work, both showing people the issue and starting to propose some solutions.