Why Chris Chen wants us to rethink our relationship with AI


As artificial intelligence reshapes communication and culture, the College of Liberal Arts welcomes Chris Chen, a new assistant professor whose research examines the psychological and social effects of AI.


Cheng "Chris" Chen

By Hoku Tiwanak, CLA Student Writer - December 30, 2025

When Cheng “Chris” Chen first heard about Alexa and Siri during her Ph.D. program at Penn State University, she found herself intrigued by these new conversational agents. “I wanted to understand why people talk to these devices, what they can do, and how those interactions shape the way we think and behave,” she said. That curiosity set her on a path that now defines her research at the intersection of artificial intelligence, social psychology, and human behavior.

Chen recently started her new role as a tenure-track faculty member in the College of Liberal Arts' School of Communication, bringing her experience as an AI researcher and designer. Before arriving in Corvallis, she was an assistant professor of communication design at Elon University in North Carolina, specializing in user experience and user interface design and human-computer interaction.

Now an assistant professor of emerging media and technology, Chen teaches courses such as Global Media (NMC 280) and, this winter, Media Effects of AI (NMC 535). Her classes encourage students to think critically about the quickly evolving relationship between humans and technology and to understand both the opportunities and the risks that come with AI. “Technology can enhance our lives, but we shouldn’t become overly reliant on it or let it replace our cognitive abilities,” she said.

Chen’s most recent research, published in Media Psychology, examines racial bias in AI training data. Her study asked participants to assess image datasets used in AI training, revealing how often people fail to recognize bias, such as the misrepresentation of certain racial groups.

“When AI is trained on data that doesn’t reflect diverse populations, its performance suffers,” she explained. “The people designing AI bring their own perspectives and biases to the process; we have to keep human welfare and well-being in the loop.” Her goal is to help people identify and mitigate these biases before they become embedded in the systems we all use.

Looking ahead, Chen plans to continue collaborating with researchers at Penn State, Elon University, and international partners to study how people can become more aware of, and resilient to, AI bias. She finds her work most rewarding when it has real-world applications and hopes her findings will one day shape how AI is developed. The greatest challenge, she says, is the lack of access to proprietary AI design processes. “We often have to study the effects of AI after products are already released,” she said. “That makes it harder to prevent issues upfront, but it also makes our research more essential.”

As Chen begins her journey at the College of Liberal Arts, she aims to advance the conversation about how humans and technology can coexist more responsibly. She hopes to inspire students and researchers to think critically about the ethical and social implications of AI, ensuring that innovation continues to serve people first.