How psychological and interpersonal processes are influenced by human-computer interactions

This article includes a video of a 2024-08-02 interview with Ian Axel Anderson, a Caltech faculty researcher in applied social psychology.[1] It is posted here to invite others to contribute additional perspectives, subject to the Wikimedia rules of writing from a neutral point of view, citing credible sources,[2] and treating others with respect.[3]

Ian Axel Anderson, a Caltech faculty researcher in applied social psychology,[1] discusses psychological and interpersonal processes that influence media use and human-computer interaction. Dr. Anderson has studied how Internet companies exploit social-psychological processes – such as habit formation, social learning, and attention – in ways that threaten democracy and the rule of law. He is a member of the Coalition for Independent Technology Research.[4]

Ian Anderson discusses how psychological and interpersonal processes are influenced by human-computer interaction, and how social media companies make money while destroying democracy and encouraging violence.
29:00 (mm:ss) audio podcast of an interview conducted 2021-08-02 with Ian Anderson by Karl Brooks and Spencer Graves about how Internet companies make money by exploiting human psychology to the detriment of individuals and society

Dr. Anderson’s experiments have studied habits, online posting and scrolling behavior, hate speech, extremism, conspiracies, rumors, well-being, identity, stereotypes, and social media influence. He mentioned "The Facebook Papers", tens of thousands of internal Facebook documents that former Facebook employee and whistleblower Frances Haugen released to the Securities and Exchange Commission and The Wall Street Journal in 2021. These documents establish that Facebook executives knew their algorithms were creating problems for many users and others – including inciting violence, such as the genocide of Rohingya Muslims in Myanmar – but prioritized company income over the wellbeing of users and of society more generally.[5] Anderson said that Facebook’s content moderation policies in the US and Europe are much friendlier to users and society than those in many other countries.

Mark Twain observed, “How easy it is to make people believe a lie, and how hard it is to undo that work again!” Anderson discusses research supporting this observation. The problem is exacerbated by the “continued influence effect”: the tendency for misinformation to continue to influence memory and reasoning even after a person agrees that the information was erroneous.[6] A more subtle effect is that people who read only headlines on social media have less actual knowledge, while believing they know more, than people who watch a standard news broadcast or read a longer report in a standard newspaper (but not a tabloid). The actual knowledge of both groups may translate into increased civic participation, but the actions of those informed only by social media headlines are less likely to be constructive.[7]

For countering misinformation on social media, crowdsourcing trustworthiness – i.e., aggregating lay judgments of news source quality – appears to be effective.[8]

Anderson is interviewed by Karl Brooks[9] and Spencer Graves.[10]

The threat


More on these threats to democracy and world peace is summarized in Category:Media reform to improve democracy.

Discussion


Notes

  1. Ian Axel Anderson, Wikidata Q128639294
  2. The rules of writing from a neutral point of view, citing credible sources, may not be enforced on other parts of Wikiversity. However, they can facilitate dialog between people with dramatically different beliefs.
  3. Wikiversity asks contributors to assume good faith, similar to Wikipedia. The rule in Wikinews is different: contributors there are told, "Don't assume things; be skeptical about everything." That's wise. However, we should still treat others with respect while being skeptical.
  4. Coalition for Independent Technology Research members, Wikidata Q128696184
  5. In defense of the decisions by Facebook and Meta executives, they could be fired or sued if they prioritized the wellbeing of users over shareholder value.
  6. The "continued influence effect" is listed in a table in the section on "Other memory biases" in the Wikipedia article on "List of cognitive biases". See also Cacciatore (2021).
  7. Schäfer and Schemer (2024).
  8. Pennycook and Rand (2019).
  9. Karl Boyd Brooks, Wikidata Q128214400
  10. Spencer Graves, Wikidata Q56452480

Bibliography
