Disinformation and Big Tech as a Security Threat
From BigTechWiki
- Disinformation was a major national security threat due to its ability to boost America’s adversaries and sow discord and division within the country. The American Security Project posited that the propagation of disinformation “could work towards increasing Russian and Chinese spheres of influence” and risked “negatively impacting the U.S.’s standing in the world as a global leader and cooperative partner.” The New Yorker described online disinformation as “an ongoing threat to our country” that was “already damaging our political system and undermining our public health.” The United Nations said the internet and social media had become “powerful tools for terrorist groups to radicalize, inspire and incite violence.”
- Disinformation contributed to the decline of liberal democracies over the past decade and sowed discord within America, undermining the country’s democratic institutions. The number of liberal democracies in the world had steadily declined over that period, with WIRED calling social media platforms “a prime culprit.” Former NSA General Counsel Glenn Gerstell said disinformation was a national security threat because it “either sow[ed] discord in our society or it undermine[d] confidence in our democratic institutions.” The American Security Project worried disinformation could “degrade the fundamentals of democratic societies,” such as trust in institutions, a free press and trust in free and fair elections. John McCain believed there was “no national security interest more vital to the U.S. than the ability to hold free and fair elections without foreign interference.”
- Disinformation intensified crisis situations by destabilizing the public mood and hindering the function of governments. In a research paper, Security and Defence Quarterly warned that disinformation intensified crisis situations, “contribute[d] to the destabilization of public moods,” hindered the function of governments and “increased the negative effects of crisis events.” A disinformed public could pressure policymakers into decisions based on false information, and there was little national defense entities could do to protect the country from disinformation.
- Disinformation spread like wildfire across social media, traveling roughly six times faster than accurate stories. Fake news and false rumors were found to penetrate more deeply into social networks and reach more people than accurate stories; a false story on social media was found to reach 1,500 people six times faster than a true story did. AI created new challenges in combating disinformation campaigns because it had begun to be able to create realistic photo, audio and video forgeries. The Congressional Research Service worried AI forgeries could be “used against the United States and U.S. allies to generate false news reports, influence public discourse, erode public trust, and attempt to blackmail diplomats.” The AI technology that manipulated videos and audio was becoming sophisticated enough to fool forensic analysis tools.[1]
- Russia and Iran were the leading purveyors of disinformation on Facebook between 2018 and 2021, most often targeting America with their campaigns. Facebook acknowledged that the U.S. was the most frequent target of disinformation campaigns, though government officials and experts said quantifying the exact amount of disinformation being produced at any one time was difficult.[2]
- Russia made disinformation campaigns an integral part of its military doctrine. Disinformation and the weaponization of information were integral to Russian military doctrine. The main goal of Russia’s disinformation campaigns was to create confusion and distrust within democratic societies in order to weaken foreign resolve against Russian policies. Russia aimed to weaken the United States through discord, division and distraction, believing that would make America less able to challenge Russia’s strategic objectives. Russia’s operations were quite invasive, going so far as to co-opt unwitting American groups to amplify narratives designed to divide Americans. Russia also aimed its disinformation campaigns at degrading cohesion among NATO member states.
- Russian disinformation campaigns were unnervingly effective, both because of the strategies behind them and because of divisions in America that Russia itself had fostered. Russia pushed disinformation in a comprehensive, integrated way to give its content an aura of authenticity, and it often mimicked both sides of divisive issues to maximize discord. Its campaigns worked in the U.S. because of the sharp partisanship of Americans, which Russia had worked to deepen. Russia’s disinformation campaigns had worked across the globe as well: Russia hacked Eastern European news sites, where it manipulated real content and inserted fake articles for immediate dissemination. A poll in Prague found disinformation campaigns had led 51% of Czechs to view the U.S.’s role in Europe negatively and had reduced Czechs’ approval of the EU to only 32%.
- China began conducting disinformation campaigns in the U.S., following Russia’s playbook on how to conduct them effectively. China used disinformation campaigns to incite chaos in the U.S. and shift the blame for COVID away from itself. Chinese agents created fake social media accounts, akin to Russian-backed trolls, that pushed false messages to create chaos in the U.S. The Department of Homeland Security believed a main goal of China’s disinformation war during COVID was to shift responsibility for the pandemic to other countries, including the United States. China used its plethora of social media accounts to spread falsehoods about the killing of George Floyd and the Black Lives Matter movement. China was also behind the false message claiming Trump planned to lock down the country in March 2020, a lie that spread so widely the National Security Council had to issue an announcement stating the claim was fake. China had the potential to build a frighteningly robust disinformation campaign infrastructure because of its vast intelligence resources.
- Iran joined China and Russia in conducting disinformation campaigns while also harnessing social media for espionage. Iran was behind COVID disinformation, using videos, cartoons and news stories to appeal to an American audience. Iran also sent emails and videos to voters in Arizona, Florida and Alaska purporting to be from the Proud Boys, saying “vote for Trump or we will come after you.” Iran further used Facebook to engage in espionage on other state actors.
- Facebook whistleblower Frances Haugen believed Facebook was a “national security issue.” Facebook was the No. 1 social media network for disinformation, and Haugen said the company was well aware its platform was being used by American adversaries to push and promote their interests at the expense of Americans. Despite knowing this, Facebook consistently understaffed its counter-espionage information operations and counterterrorism teams.
- Facebook knew that its core products led to the proliferation of disinformation on its site. Internal researchers found that Facebook’s “core product mechanics” let disinformation and hate speech flourish on the site, and one internal researcher warned that the platform’s “internal systems [were] not yet identifying, demoting and/or removing anti-vaccine comments often enough.” A study found that in a single month, nearly all of the most-shared Facebook links on voting came from sites that peddled disinformation: 17 of the 20 most-shared links about voting came from right-wing outlets that frequently spread false narratives and disinformation on voting and elections.
- Facebook refused to act on its internal researchers’ suggestions for addressing the platform’s amplification of divisive content. A May 2020 Wall Street Journal headline read “Facebook executives shut down efforts to make the site less divisive.” The report came after Facebook researchers informed senior executives that the platform’s algorithms “exploit[ed] the human brain’s attraction to divisiveness.” The researchers warned that “if left unchecked,” Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”
- Facebook was uninterested in publicly disclosing the extent of Russian interference on its platform. In 2018, Facebook’s then-chief information security officer, Alex Stamos, left the company after executives resisted his advocacy for more disclosure of Russian interference on the platform. Stamos had also suggested some restructuring to better address disinformation. He planned to leave earlier than he did, but company executives begged him to stay because they worried his departure would look bad. Facebook’s security team at large was also in conflict with the company’s legal and policy teams when it pushed for more disclosure of how nation states had misused the site.[3]
- Facebook had no incentive or desire to reform its platform to curb disinformation or act in America’s best interest because lies paid as well as the truth. Brookings believed that disinformation on Facebook and its subsidiaries was the “logical result of a revenue model” that rewarded the volume of information over its veracity: “when lies pay just as well as the truth, there is little incentive to only tell the truth.” Facebook whistleblower Frances Haugen said there were conflicts of interest between what was good for the public and what was good for Facebook, “and Facebook, over and over again, chose to optimize for its own interests.”
- ↑ https://www.theatlantic.com/technology/archive/2018/03/largest-study-ever-fake-news-mit-twitter/555104/
- ↑ https://www.nbcnews.com/politics/national-security/russia-iran-were-top-two-sources-disinfo-facebook-targeting-u-n1268550
- ↑ https://www.nytimes.com/2018/03/19/technology/facebook-alex-stamos.html