Informační strategie boje proti dezinformaci: přehledová studie vlivných technologických společností po prezidentských volbách v USA v roce 2016

Title: Informační strategie boje proti dezinformaci: přehledová studie vlivných technologických společností po prezidentských volbách v USA v roce 2016
Variant title:
  • Information strategy in the war against disinformation: survey of influential technology companies after the presidential elections in the USA in 2016
Author: Ulrich, Petr
Source document: ProInflow. 2018, vol. 10, iss. 2, pp. 23-51
Extent: 23-51
ISSN: 1804-2406 (online)
Type: Article
Language: Czech
License: CC BY 3.0 CZ

Abstract

Purpose – The rapid development of information technologies, together with social networks, has created an unprecedented environment in which disinformation and the information operations associated with it have been elevated to highly effective and devastating weapons of mass destruction. A purpose-driven disinformation campaign in cyberspace can already influence the results of democratic elections at the level of a world superpower. This is therefore a very serious society-wide issue and a problem for which no effective defense yet exists. Our objective is to map current approaches to combating disinformation from the viewpoint of the operators of the technology platforms that are misused to spread disinformation, and to propose a simple classification model of defensive information strategies.

Design/methodology/approach – The main question we ask is how influential technology companies responded to the 2016 US presidential election in their processes, tools, and visions of further product direction. We narrow our attention to social networks and close the period under review in March 2018. This theoretical work is based on the following methodological approaches: a) document analysis of primary sources published by the individual technology companies under study; b) media analysis of news sources that comment on the situation both from the viewpoint of information strategy and from the viewpoint of various socio-cultural and political "triggers".

Results – Based on the analysis of the monitored technology companies and of the socio-cultural and political "triggers", we define a basic classification of information strategies for combating disinformation and project it into a first version of a functional model, which had not previously been created for this purpose. Of the individual strategies, the one built on automated fact-checking systems supported by artificial intelligence proved to be the most promising.

Originality/value – Content mediators, i.e. publishers and their editorial teams, already receive considerable attention in many studies and analyses that build on extensive historical research into the development of media and communication. By contrast, the information strategy of a communication platform operator, on which our analysis focuses, is in the context of the online world a new and as yet little explored area that raises new questions and challenges regarding the management of these platforms from the perspectives of information management and information policy.