Consumers are wary of products made using artificial intelligence, particularly in the medical, electronic and financial sectors.
Research coordinated by Washington State University found a low level of “emotional trust” in AI-supported goods.
The study, published in the Journal of Hospitality Marketing & Management, was based on experiments conducted with more than 1,000 people.
Lead author Mesut Cicek said the findings consistently showed products described as using artificial intelligence were less popular.
“When AI is mentioned, it tends to lower emotional trust, which in turn decreases purchase intentions,” he said.
“We found emotional trust plays a critical role in how consumers perceive AI-powered products.”
In one experiment, participants were presented with identical descriptions of smart televisions, except that one used the term “artificial intelligence” and the other did not.
“The group that saw AI included in the product description indicated they were less likely to purchase the television,” the study report said.
“Researchers also discovered that the negative response to AI disclosure was even stronger for ‘high-risk’ products and services — those which people commonly feel more uncertain or anxious about buying, such as expensive electronics, medical devices or financial services.”
Dr Cicek said the findings provided valuable insights for companies.
“Marketers should carefully consider how they present AI in their product descriptions or develop strategies to increase emotional trust,” he said.
“Emphasising AI may not always be beneficial, particularly for high-risk products. Focus on describing the features or benefits and avoid the AI buzzwords.”
The full report is on the WSU website.