AI at CJC

Personalization vs. Personification in Corporate Communications

Tom Kelleher’s research on how people perceive artificial intelligence is leading him to pursue another angle: how those perceptions can help organizations determine how conversational their AI should be.

The Associate Dean in the University of Florida College of Journalism and Communications Division of Graduate Studies and Research has been studying “conversational voice” since the early 2000s, when social media was emerging and blogs were becoming popular. Then Twitter and Facebook gained popularity, and public relations morphed into a field with many more opportunities for PR professionals to have online conversations with their audiences.

Traditional corporate messaging tends not to feel very human, Kelleher said. “It’s automated and doesn’t feel very human, not personalized. It’s 1-to-many and doesn’t adapt to my needs.”

Conversation is the opposite — highly individualized human communication with real people that results in better PR outcomes.

In a project with Cen April Yue, Ph.D. 2020, called “A Conversation Model for Computer-Mediated Communication,” Kelleher describes two other scenarios:

  • Personalization, which is very automated but very individualized. “That’s where AI comes in. It’s a recommender system, the Netflix model. Sometimes you don’t want a whole conversation.” He likens it to ordering pizza, where you can be reminded what you ordered last time and maybe even modify it — automated but individualized.
  • Personification, which is human but not necessarily individualized. This includes messaging that goes out to a mass audience, like Tesla CEO Elon Musk talking about his cars. “He’s very human, but I don’t talk to him specifically, so it’s not individualized to me in any way.”

As public relations and journalism move further into the realm of AI, the lines are blurring — people don’t always know whether they’re talking to a real person or a really good machine. “That then becomes the practical question: To what degree should companies invest in real people and pay people to have these conversations for these outcomes, and when does it make sense to have a really good self-learning machine that can answer customer questions?” Kelleher asks.

Working with a team of CJC Ph.D. students on a series of studies on humanness vs. “machine-ness,” his research group has just completed a survey of people who interacted with an Amazon chatbot. The initial results show that 65 percent knew they were chatting with a virtual agent, but the others — about a third — thought they were talking to a real person or weren’t sure. “It goes to show, people aren’t even sure if they’re talking to an AI agent or real person.”

The team is finding that the more human the replies, the more people perceive a relationship investment by the business, which leads to trust, he said. “Maybe it doesn’t matter. Maybe a machine is just as efficient in getting the job done if you’re just recommending a drill set or a pair of shoes. But if you’re trying to get people to trust your organization, it might take more of a human voice or a real conversation with a real human.”

That’s important to public relations specialists, journalists and even social scientists, he said. Kelleher hopes the information he’s gathering now will lead to more front-end work on how AI is perceived. “An engineering faculty member can say, ‘OK, we built this AI, let’s test it and see if it works.’ Then convert over to social scientists, who ask, ‘What is it people are looking for? How will we know whether it works?’”

That information then helps managers, researchers and PR professionals decide whether they should hire more people or use AI. “If you hire a ton of people and they’re just on script, what are you paying for? Or vice versa: If you’re getting really good AI and it’s doing exactly what you want, why would you pay people to do that?”

“We can develop valid and reliable measures of how these technologies and/or real people are perceived as agents, and that can be used in asking all sorts of future research questions involving ethics and everything else.”
