Google Disinformation

From BigTechWiki
'''YouTube spreads disinformation'''


* YouTube’s algorithm put equal weight on climate science videos and climate denial videos and continued to serve up new climate misinformation to those who viewed similar content. Though Google took steps in October 2021 to demonetize climate denial YouTube videos, it still allowed climate denial content if it was presented as “public discourse.”<ref>https://www.bloomberg.com/news/articles/2021-01-08/why-climate-science-doesn-t-go-viral-on-youtube</ref>
* YouTube hosted thousands of videos that pushed vaccine misinformation, one of which was viewed 2 million times and only taken down after a reporter inquired about it.
* The January 6th Committee noted that YouTube allowed insurrectionists to livestream the attack on the Capitol and hosted election misinformation in the lead-up to the attack.
* A day before the 2020 election, YouTube said it had not implemented any specific policies around misinformation stemming from the election, instead relying on the content policies it had developed over the previous three years. In the ten days after the election, videos claiming Trump had won and that voter fraud was rampant proliferated, but YouTube refused to remove them, arguing that it wanted to allow “discussion” of the results. By the time YouTube changed its policies in early December 2020 to curtail election misinformation, videos promoting the Big Lie had been viewed 2.4 billion times, including 180 million views of “voter fraud” videos accrued after YouTube supposedly banned such content.<ref>https://www.washingtonpost.com/politics/2021/03/23/technology-202-where-is-youtube-ceo-susan-wojcicki/</ref>
* In early 2022, videos spreading misinformation about the 2020 election and the January 6th insurrection were still being posted to the platform, accruing hundreds of thousands of views.
* In 2019, a year after YouTube announced that information from Wikipedia would be displayed on popular conspiracy theory videos, the added context was still missing from some conspiracy videos.
* After Google and other tech companies announced seemingly decisive steps to cut off the Kremlin’s disinformation machine from their platforms during Russia's invasion of Ukraine, an analysis by NewsGuard found that "despite those statements, dozens of websites promoting Russian disinformation about the Ukraine war continue to receive advertising revenue from Google and other advertising companies. These include websites that hide their sources of funding and control, that are registered in countries such as Cyprus, or that are owned by business associates of Putin. These are part of the broader ecosystem of Russian disinformation where myths often originate on Kremlin-owned sites and are then spread by a network of sites repeating the myths."<ref>https://www.newsguardtech.com/special-reports/ads-russian-propaganda/</ref>
