
Medill professor using data to tackle misinformation

Malthouse’s research looks at using AI to personalize news recommendations

Photo caption: Ed Malthouse's AI and fake news research.

EVANSTON, ILL. -- Professor Edward Malthouse enjoys looking at things from different angles. As the Erastus Otis Haven Professor at Medill and a research fellow at the Media Management Center, his research supports the larger news ecosystem and tackles misinformation.

“I think one of the most pressing problems of the day is misinformation,” he said. “There’s a lot of misinformation out there, and it’s hard to stay ahead of it.”

Malthouse began teaching at Northwestern in the Kellogg School of Management in 1995 after completing a PhD in statistics. He joined the IMC faculty in 1997 and in 2012 became director of the Spiegel Center for Database and Digital Marketing, also known as the Spiegel Research Center.

The Spiegel Research Center supports research in the fields of marketing and advertising, analyzing behaviors across platforms to provide insights to companies.

“It’s about understanding how newer forms of customer engagement affect financial outcomes,” said Malthouse.

In 2023, the National Science Foundation awarded a grant to Malthouse and a team of researchers from the University of Minnesota, Twin Cities. Their research, titled "A Research News Recommender Infrastructure with Live Users for Algorithm and Interface Experimentation," uses artificial intelligence to personalize news recommendations.

“We’re not rewriting any articles,” said Malthouse. “We’re helping you find stories of interest that match your topical interests using AI systems.”
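As a rough illustration of what that kind of topical matching can look like (a minimal sketch, not the project's actual system, using a hypothetical article list and interest profile), articles can be scored against a reader's stated interests and ranked by similarity:

```python
# Illustrative sketch: rank articles by similarity between their text and a
# reader's stated interests. The articles and interest profile are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = [
    "City council approves new transit funding plan",
    "Local startup uses AI to detect crop disease",
    "High school robotics team wins state championship",
]
reader_interests = "artificial intelligence, technology, education"

# Represent the articles and the interest profile in the same TF-IDF space.
vectorizer = TfidfVectorizer(stop_words="english")
article_vectors = vectorizer.fit_transform(articles)
interest_vector = vectorizer.transform([reader_interests])

# Score and rank articles by cosine similarity to the reader's interests.
scores = cosine_similarity(interest_vector, article_vectors).ravel()
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {articles[idx]}")
```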

For Malthouse, the NSF grant gave him the chance to create opportunities for students as well.

“It’s a huge honor, and it enabled me to fund a PhD student for a while,” he said.

Supported by the grant, the student focused on using large language models (LLMs) for news recommendations and on building a fake news detection tool.

“If you ask an LLM to evaluate a text using these traditional fact-checking questions, and then you bring back the responses to those questions, you can more accurately identify fake news articles than if you just did it from basic machine learning approaches,” Malthouse said.
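A simplified sketch of that idea (an assumption about how such a pipeline could be structured, not the research team's code, with the LLM call left as a hypothetical placeholder) poses each fact-checking question to a model and feeds its answers to a standard classifier:

```python
# Illustrative sketch: use an LLM's answers to traditional fact-checking
# questions as features for a fake news classifier.
from sklearn.linear_model import LogisticRegression

FACT_CHECK_QUESTIONS = [
    "Does the article name verifiable sources for its central claims?",
    "Are the quoted figures consistent with the cited sources?",
    "Does the headline accurately reflect the body of the article?",
]

def ask_llm(article_text: str, question: str) -> int:
    """Hypothetical placeholder for an LLM call.

    A real implementation would send the article and question to a model and
    map its yes/no answer to 1/0; here it simply returns a dummy value.
    """
    return 0

def question_features(article_text: str) -> list[int]:
    # One binary feature per fact-checking question.
    return [ask_llm(article_text, q) for q in FACT_CHECK_QUESTIONS]

# Train a simple classifier on labeled articles (1 = fake, 0 = credible).
train_articles = ["example credible article ...", "example fabricated article ..."]
train_labels = [0, 1]
X = [question_features(a) for a in train_articles]
classifier = LogisticRegression().fit(X, train_labels)

# Score a new article using the same question-derived features.
print(classifier.predict([question_features("new article text ...")]))
```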

Malthouse’s position with Spiegel, as well as his work with Medill’s Local News Initiative, gives him the opportunity to bring his research into the classroom. He builds many of the case studies in his curriculum from data gathered in projects he has completed over the years.

“It’s unusual for students to get access to these data sets from real companies,” said Malthouse. “I’m really grateful to the news organizations and other organizations that have provided this data to give to the students.”

Malthouse acknowledges the concerns about artificial intelligence in journalism. He notes that conversations surrounding the implementation of AI today mirror past conversations about different technologies.

“I remember the Medill faculty had a big debate over whether students should be allowed to use Wikipedia about 20 years ago,” he said.

He encourages newsrooms to “embrace it responsibly,” using AI to supplement traditional newsgathering and fact-checking methods.

“You need to be taught ways to corroborate what you find, and find independent sources that say the same thing,” he said.

Overall, Malthouse looks for ways companies can embrace artificial intelligence as a supportive assistant.

“I think it can be a very powerful tool to enhance your productivity, but you have to check everything first,” said Malthouse.


By: Victoria Ryan