Google Disinformation

YouTube spreads disinformation

* YouTube’s algorithm gave equal weight to climate science videos and climate denial videos, and continued to serve new climate misinformation to users who had viewed similar content. Though Google took steps in October 2021 to demonetize climate denial videos on YouTube, it still allowed climate denial content if it was presented as “public discourse.”[1]
* YouTube hosted thousands of videos that pushed vaccine misinformation, one of which was viewed 2 million times and was only taken down after a reporter inquired about it.
* The January 6th Committee noted that YouTube allowed insurrectionists to livestream the attack on the Capitol and hosted election misinformation in the lead-up to the attack.
* A day before the 2020 election, YouTube said it had not implemented any policies specific to election misinformation, relying instead on the content policies it had developed over the previous three years. In the ten days after the election, videos claiming Trump had won and that voter fraud was rampant proliferated, but YouTube refused to remove them, arguing that it wanted to allow “discussion” of the results. By the time YouTube changed its policies in early December 2020 to curtail election misinformation, videos promoting the Big Lie had been viewed 2.4 billion times, including 180 million views of “voter fraud” content accrued after YouTube had supposedly banned it.<ref>https://www.washingtonpost.com/politics/2021/03/23/technology-202-where-is-youtube-ceo-susan-wojcicki/</ref>
* In early 2022, videos spreading misinformation about the 2020 election and the January 6th insurrection were still being posted to the platform, accruing hundreds of thousands of views.
* In 2019, a year after YouTube announced that context from Wikipedia would be displayed alongside popular conspiracy theory videos, that added context was still missing from some conspiracy videos.