Monday, April 1, 2024

Fox News Hires Disinformation Expert

Fox News, one of the most relentless critics of the war on disinformation, now faces a new challenge: its parent company, Fox Corporation, is building out an internal capability to combat disinformation. Last week, Fox Corporation posted a job opening for a corporate “trust and safety behavioral analyst” tasked with identifying misinformation and disinformation, with the goal of establishing a content moderation system across Fox’s businesses, including Fox News. The corporation plans to work closely with partners both within and outside the company on this effort, using pattern recognition, a core technique of artificial intelligence, to identify hostile users.
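
The posting does not spell out how that pattern recognition would work in practice. As a purely illustrative sketch, the snippet below scores a few hypothetical per-user behavioral signals (posting volume, community reports, copy-paste repetition) and surfaces accounts for human review; every feature name, weight, and threshold here is an assumption made for the example, not a detail from the listing.

```python
from dataclasses import dataclass

@dataclass
class UserActivity:
    """Hypothetical behavioral signals a trust-and-safety team might track per user."""
    user_id: str
    posts_per_hour: float        # raw posting volume
    flagged_post_ratio: float    # share of the user's posts reported by others (0-1)
    duplicate_post_ratio: float  # share of near-identical, copy-paste posts (0-1)

def hostility_score(a: UserActivity) -> float:
    """Combine the signals into one score in [0, 1]. Weights are illustrative guesses."""
    volume = min(a.posts_per_hour / 30.0, 1.0)   # cap so volume alone can't dominate
    return 0.2 * volume + 0.5 * a.flagged_post_ratio + 0.3 * a.duplicate_post_ratio

def flag_for_review(activities: list[UserActivity], threshold: float = 0.5) -> list[str]:
    """Return user IDs whose behavior pattern crosses the review threshold."""
    return [a.user_id for a in activities if hostility_score(a) >= threshold]

if __name__ == "__main__":
    sample = [
        UserActivity("user_a", posts_per_hour=2.0, flagged_post_ratio=0.01, duplicate_post_ratio=0.0),
        UserActivity("user_b", posts_per_hour=45.0, flagged_post_ratio=0.40, duplicate_post_ratio=0.70),
    ]
    print(flag_for_review(sample))  # ['user_b'] -- heavy volume, many reports, repeated content
```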

According to the listing, the analyst will be responsible for maintaining the ongoing community health and brand safety of Fox sites and apps that engage directly with users. A background in psychology, criminal justice, social media, gaming, news, or media is considered a plus for the role. Fox did not respond to questions about the job posting.

This corporate focus on disinformation stands in stark contrast to Fox News’s critical coverage of anti-disinformation efforts that regulate social media content, which the network often equates with censorship. Notably, when the Department of Homeland Security established a now-defunct Disinformation Governance Board in 2022, prominent Fox News hosts vehemently condemned the move. Hosts like Sean Hannity and Tucker Carlson referred to the board as a “Ministry of Truth,” drawing parallels to George Orwell’s dystopian novel “1984.” Brian Kilmeade echoed their sentiments, suggesting that the Biden administration was using Orwell’s work as a manual rather than a warning.

After the Disinformation Governance Board came to light, Fox News became fixated on the fight over disinformation. According to a defamation lawsuit later filed against Fox News by Nina Jankowicz, the DHS official who led the board, 70 percent of Fox’s one-hour segments in the week after the board’s disclosure referenced disinformation and Jankowicz herself; over the course of 2022, she was mentioned more than 300 times on the network. Fox’s corporate interest in disinformation, however, differs from the federal government’s, centering on the engagement and safety of its own user communities.

The use of AI in content moderation has become increasingly prevalent across industries. Companies like Facebook (now Meta) have reported that over 95 percent of hate speech takedowns are carried out by AI rather than by humans, and advances in the technology have made moderation more cost-effective and scalable than ever. Fox’s job posting references machine learning, large language models, and natural language processing as key components of its AI strategy to combat disinformation.
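
Fox has not described what its tooling would look like, but the basic shape of ML-assisted moderation is well established: a text classifier scores each item, confident predictions trigger automatic action, and borderline cases go to human reviewers. The toy sketch below illustrates that workflow with scikit-learn (TF-IDF features plus logistic regression); the training examples, thresholds, and policy labels are invented for the example and say nothing about Fox’s actual systems.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, invented training set: 1 = violates a (hypothetical) community policy, 0 = acceptable.
texts = [
    "you people are vermin and deserve what's coming",       # 1
    "get out of this country, nobody wants your kind here",  # 1
    "everyone who disagrees with me should be hunted down",  # 1
    "great reporting, thanks for the update",                # 0
    "I disagree with the article but it was well written",   # 0
    "can someone link the original study?",                  # 0
]
labels = [1, 1, 1, 0, 0, 0]

# TF-IDF features + logistic regression: the simplest NLP classifier that yields probabilities.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

def moderate(comment: str, remove_at: float = 0.9, review_at: float = 0.5) -> str:
    """Route a comment by the model's estimated probability of a policy violation."""
    p_violation = model.predict_proba([comment])[0][1]
    if p_violation >= remove_at:
        return "remove"            # high confidence: act automatically
    if p_violation >= review_at:
        return "human_review"      # borderline: escalate to a person
    return "allow"

for comment in ["thanks, really useful piece",
                "get out of this country, you vermin"]:
    print(f"{moderate(comment):>12}: {comment}")
```

At production scale the classifier would be a much larger model trained on millions of labeled examples, but the thresholds in the routing step are what determine how much of the decision stays in human hands.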

While AI technology has revolutionized content moderation practices, its impact on the broader disinformation debate remains relatively unexplored. The federal government is also leveraging AI to identify foreign influence operations, as highlighted in the Biden administration’s recent budget request to Congress. As AI continues to evolve, it is poised to play a significant role in combating disinformation across various platforms.

In conclusion, Fox Corporation’s decision to invest in combating disinformation reflects a shifting landscape where AI technology is increasingly utilized to address misinformation and safeguard user communities. As companies adapt to these technological advancements, the battle against disinformation is likely to intensify, raising important questions about censorship, freedom of speech, and the ethical implications of AI-driven content moderation.
