This might be of interest to some.
The text of the email I received, plus a link to the whole report.
https://shorensteincenter.org/information-disorder-framework-for-research-and-policymaking/
"A new report published by First Draft, a project of the Shorenstein Center, commissioned by the Council of Europe, examines how disinformation campaigns have become widespread and, heavily relying on social media, contribute to a global media environment of information disorder.
The report is a comprehensive analysis of information disorder and its related challenges, such as filter bubbles and echo chambers, and provides a framework for dialogue that can be used by policymakers and researchers. It contains 34 recommendations for stakeholders such as technology companies, media, national governments, civil society, and educational agencies to help them identify strategies to address the problem.
The report was written by Dr. Claire Wardle, Research Fellow at the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School and Executive Director of First Draft, and writer and researcher Hossein Derakhshan.
“What we are witnessing is something completely new: disinformation campaigns, often playing with people’s emotions, spreading at great speed, with a potential to have an enormous impact on society,” said Wardle. “To fight misinformation, simply pushing out more factual information, without understanding the emotional and ritualistic elements of communication, could be a complete waste of time.”
Key findings of the report include:
- There are three different types of “information disorder”:
- Mis-information, when false information is shared, but no harm is meant;
- Dis-information, when false information is knowingly shared to cause harm;
- Mal-information, when genuine information is shared to cause harm, often by making public information designed to stay private.
- There is a complex web of motivations for creating, disseminating, and consuming “polluted” messages.
- Political: Some see disinformation as a tool to mobilize action and behavior change.
- Financial: Fabricated “news” sites may create lucrative opportunities.
- Psychological: Some who create and share misinformation are seeking prestige or reinforcement.
- Social: Some are motivated by a desire to connect with others around a shared group identity or world view.
- There are myriad content types and techniques for amplifying content:
- Messages that are most shared evoke an emotional response, have a powerful visual element and strong narrative, and are repeated.
- Algorithms that favor images and video have the potential to reach more users than articles or text posts—regardless of veracity.
- Visuals are overwhelmingly the most shared pieces of disinformation and are the most difficult to debunk.
- There are innumerable platforms hosting and reproducing this content:
- Seeing the same message many times across platforms makes users more likely to believe it to be true.
- Sharing information on social networks allows people to “perform” their identities with others who share their worldviews.
- Filter bubbles keep opposing viewpoints out of social media feeds. If platforms changed their algorithms to provide users with more challenging material, people would likely spend less time on them.
- The long-term implications of disinformation campaigns designed to sow mistrust and confusion and to sharpen socio-cultural divisions using existing nationalistic, ethnic, racial, and religious tensions are concerning.
- There is a risk that in the near future, audiences may have little trust in the information they find online."