UAA project uses artificial intelligence to battle Covid ‘misinformation,’ spreads bad information on polio


University of Alaska Anchorage professors are concerned that bad information about Covid-19 spreads quickly on social media. Professor Shawn Butler is part of a project to combat the problem with artificial intelligence, such as Chat.openai.com, an increasingly popular web application that university teams are using to respond to what the artificial intelligence considers to be misinformation about Covid.

Butler, a computer scientist, is working with other professors and with the parallel efforts of the university’s Department of Journalism and Public Communication to combat false Covid claims on public-facing Facebook pages, using what the project team considers to be scientifically accurate information from credible sources. The journalism department’s Alaska Public Health Information Response Team project was started two years ago to fight misinformation about Covid.

But misinformation is sometimes in the eye of the beholder.

“The damage done with misinformation, especially on social media, is something we’ve never seen before,” Butler was quoted as saying on the University’s website. “We almost eradicated polio until people started saying they’re not going to take the vaccine because of something they read online.”

In fact, the Centers for Disease Control and Prevention website does not support Butler’s statement. It’s misinformation.

Read what the CDC says about polio in the United States.

The CDC says no cases of wild polio have originated in the United States since the 1990s. There is a rare exception: a case of polio detected in New York State in 2022 left a man paralyzed. But that case of polio came from the vaccine itself.

“During the fall of 2022, an unvaccinated man in Rockland County, New York, was paralyzed as a result of a polio infection. The NY man was infected with a type of poliovirus that came from the oral polio vaccine, called vaccine-derived paralytic polio,” according to an article published by the Children’s Hospital of Philadelphia. “The case was concerning because we know that only about 1 of every 2,000 people infected with this vaccine-derived strain will be paralyzed, meaning that many others are likely also being exposed to this virus. High vaccination rates in most U.S. communities mean that often we remain unaware of the presence of the virus, but recent drops in routine immunizations following the pandemic have left some communities more vulnerable.”

Polio occurs occasionally in the world, but mainly where people are not using social media. In 2021, only three places in the world had wild (non-vaccine-originated) polio: Afghanistan, Pakistan and Malawi. Read more from PolioEradication.org.

Identifying and responding to misinformation on the internet can be a daunting and time-consuming process, the University of Alaska Anchorage explained. And yet this is a government agency, headed by a government-funded professor, refining its criticism of the public and using artificial intelligence to tamp down alternative viewpoints.

“So Butler and her team developed a way to use machine learning to assist in automatically identifying Covid-19 misinformation through natural language processing analysis, where a model is fed a data set of text labeled as ‘misinformation’ or ‘not misinformation’ with point values assigned to certain keywords or phrases to train the model to identify misinformation that is not labeled,” the university said.

“Currently, Butler’s model boasts an 80% accuracy rate when identifying misinformation and a 50% accuracy rate when identifying what isn’t misinformation — a number she is confident will improve after providing the model with a much larger labeled data set,” UAA said.
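The keyword-scoring approach the university describes — a labeled data set trains point values for certain words or phrases, and unlabeled text that scores high is flagged — can be sketched in a few lines of Python. This is a hypothetical illustration, not Butler’s actual model: the function names, toy training data, simple count-difference weighting, and threshold are all assumptions.

```python
# Hypothetical sketch of a keyword-scoring misinformation classifier.
# Training assigns each word a point value (positive if it appears more
# often in posts labeled "misinformation"); scoring sums those values
# over a new post and flags it when the total crosses a threshold.
from collections import Counter

def train_keyword_weights(labeled_posts):
    """labeled_posts: list of (text, label) pairs, where label is
    "misinformation" or "not misinformation". Returns {word: point value}."""
    misinfo, other = Counter(), Counter()
    for text, label in labeled_posts:
        words = set(text.lower().split())
        (misinfo if label == "misinformation" else other).update(words)
    # Point value: how much more often the word shows up in misinformation.
    return {w: misinfo[w] - other[w] for w in set(misinfo) | set(other)}

def score(text, weights):
    return sum(weights.get(w, 0) for w in text.lower().split())

def flag(text, weights, threshold=1):
    return score(text, weights) >= threshold

# Toy example (invented labels, for illustration only):
data = [
    ("vaccines contain microchips", "misinformation"),
    ("vaccines reduce severe illness", "not misinformation"),
]
weights = train_keyword_weights(data)
print(flag("microchips in vaccines", weights))  # True with these toy weights
```

A real system of the kind the article quotes would use a much larger labeled corpus and a trained statistical model rather than raw count differences, but the flag-then-review workflow is the same.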

The team is satisfied that they are changing people’s minds with their debunking and what Butler calls “pre-bunking” — getting good information out in front, before bad information can go viral.

“In controlled situations, research shows that pre-bunking is more effective than debunking,” Butler said. “If somebody knows what the scam is, it’s easier for them to see it rather than be convinced once they have already fallen for it.”

Once the computer algorithm identifies potential misinformation, it notifies the Alaska Public Health Information Response Team, which then reviews the post and provides scientifically accurate information to correct the misinformation.

The UAA project could be seen by some as an early form of artificial intelligence being used for government psychological operations, also known as psy-ops. This kind of operation could be expanded to other types of government-approved propaganda or material on a variety of topics.

The project may also be a government-funded program perilously close to suppressing the First Amendment rights of citizens.

More information on the Alaska Public Health Information Response Team in this UAA story from 2021.

Must Read Alaska asked Chat.openai.com to rewrite this story

Must Read Alaska asked artificial intelligence to “improve this story.” Here is the version of the story from Chat.openai.com, the artificial intelligence tool, which makes it clear that the algorithm disagrees with a premise in the story as it pertains to the CDC’s position on polio in the USA. From the perspective of artificial intelligence, claims made in the story above would be considered “misinformation”.

“The University of Alaska Anchorage is taking a proactive approach in combating the spread of misinformation about Covid-19 on social media. Professor Shawn Butler, a computer science professor, is collaborating with the Department of Journalism and Public Communication to use artificial intelligence, such as the Chat.openai.com web application, to respond to false claims and misinformation about Covid-19 on public-facing Facebook pages. The project aims to provide scientifically accurate information from reputable sources to combat the spread of misinformation.

“However, the article contains some inaccuracies and misunderstandings. For example, the article incorrectly cites the Centers for Disease Control and Prevention (CDC) as not supporting Butler’s statement about polio, but the CDC’s statement actually addresses the eradication of polio in the United States and not the impact of misinformation on vaccine uptake. The article also suggests that the project may be a form of government psy-ops, which is not supported by the evidence presented.

“Despite these inaccuracies, the project is a promising initiative in the fight against the spread of misinformation about Covid-19. By using machine learning and natural language processing to identify potential misinformation, the project team can provide scientifically accurate information to correct false claims and promote public health. However, care must be taken to ensure that the project does not infringe on the First Amendment rights of citizens or become a tool for government propaganda.”