Are Customers Lying to Your Chatbot?

by Editor
May 6, 2022

Automated customer service systems that use tools such as online forms, chatbots, and other digital interfaces have become increasingly common across a wide range of industries. These tools offer many benefits to both companies and their customers, but new research suggests they can also come at a cost: in two simple experiments, researchers found that people are more than twice as likely to lie when interacting with a digital system as when talking to a human. This is because one of the main psychological forces that encourages us to be honest is an intrinsic desire to protect our reputations, and interacting with a machine fundamentally poses less of a reputational risk than talking with a real person. The good news is that the researchers also found that customers who are more likely to cheat often choose a digital (rather than human) communication channel, giving companies an avenue to identify these users. Of course, there's no eliminating digital dishonesty. But with a better understanding of the psychology that makes people more or less likely to lie, organizations can build systems that discourage fraud, identify likely cases of cheating, and proactively nudge people to be more honest.

Imagine you just placed an online order from Amazon. What’s to stop you from claiming that the delivery never arrived, and asking for a refund — even if it actually arrived as promised? Or say you just bought a new phone and immediately dropped it, cracking the screen. You submit a replacement request, and the automated system asks if the product arrived broken, or if the damage is your fault. What do you say?

Dishonesty is far from a new phenomenon. But as chatbots, online forms, and other digital interfaces grow more and more common across a wide range of customer service applications, bending the truth to save a buck has become easier than ever. How can companies encourage their customers to be honest while still reaping the benefits of automated tools?

To explore this question, my coauthors and I conducted two simple experiments that allowed us to measure honest behavior in an unobtrusive way. First, a researcher asked participants to flip a coin ten times and told them they'd get a cash prize depending on the results. We had some participants report their coin flip outcomes to the researcher via video call or chat, while others reported their outcomes via an online form or voice assistant bot. They flipped the coins in private, so there was no way to know whether any individual participant had lied, but we were able to estimate the cheating rate for a group of participants (since, on average, only 50% of the coin flips should be successful).

What did we find? On average, when participants reported to a human, they reported 54.5% successful coin flips, corresponding to an estimated cheating rate of 9%. In contrast, when they reported to a machine, the estimated cheating rate was 22%. In other words, a bit of cheating is to be expected regardless, but our participants were more than twice as likely to cheat when talking to a digital system as when talking to a human. We also found that blatant cheating, which we defined as reporting an implausibly high success rate of nine or ten successful coin flips, was more than three times as common when reporting to a machine as when reporting to a human.
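To make the estimation logic concrete, here is a minimal sketch in Python, assuming the simplest one-sided misreporting model (our reading of the setup, not code from the study): honest reports succeed 50% of the time, and cheating means reporting a failed flip as a success, so the reported success rate is 0.5 plus half the cheating rate. It also shows why nine or ten reported successes count as implausible under a fair coin.

from math import comb

def estimated_cheating_rate(reported_success_rate: float) -> float:
    """Back out a group-level cheating rate from reported success rates.

    Model (assumed for illustration): honest flips succeed 50% of the
    time, and a cheater misreports a failed flip as a success, so
    reported = 0.5 + 0.5 * cheating_rate. We invert that here.
    """
    return (reported_success_rate - 0.5) / 0.5

# Reporting to a human: 54.5% reported successes -> ~9% estimated cheating.
print(f"{estimated_cheating_rate(0.545):.0%}")  # 9%

# Probability of nine or ten heads in ten fair flips: roughly 1.1%,
# which is why such reports can be treated as blatant cheating.
p_blatant = sum(comb(10, k) for k in (9, 10)) / 2**10
print(f"{p_blatant:.1%}")  # 1.1%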

Next, we determined through a follow-up survey that the main psychological mechanism driving this effect was participants' level of concern for their personal reputations. We asked a series of questions designed to measure participants' concern about how the researcher viewed them, and we found that those who had reported their coin flips to a machine felt much less close to the researcher, and as a result were much less concerned about their personal reputations, than those who had reported to the researcher directly. We therefore hypothesized that anthropomorphizing the digital reporting system (in our case, by giving it a human voice rather than a text-only interface) might make it feel more human, and thus make participants more worried about maintaining their reputations and less likely to lie. However, we found that participants still cheated just as much, suggesting that if people know they are interacting with a machine, giving that machine human features is unlikely to make much of a difference.

To be sure, it’s possible that advances in convincingly human-like AI systems could make this a more effective strategy in the future. But for now, it’s clear that digital tools make cheating a lot more prevalent, and there’s no obvious quick fix.

The good news is that our second experiment did identify a strategy that can help companies address this issue: while there's no eliminating dishonesty, it is possible to predict who is more likely to lie to a robot, and then steer those users toward a human communication channel instead.

In this experiment, we first assessed participants' general tendency to cheat by asking them to flip a coin ten times and report the results via an online form, and then categorized them accordingly as "likely cheaters" and "likely truth-tellers." In the next part of the experiment, we offered them the choice between reporting their coin flips to a human or via an online form. Overall, roughly half of the participants preferred a human and half preferred the online form, but when we took a closer look, we found that "likely cheaters" were significantly more likely to choose the online form, while "likely truth-tellers" preferred to report to a human. This suggests that people who are more likely to cheat proactively avoid situations in which they would have to lie to a person rather than to a machine, presumably due to a conscious or subconscious awareness that lying to a human would be more psychologically unpleasant.
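The article doesn't spell out the exact rule used to split participants into the two groups, so the sketch below is a hypothetical reconstruction: flag anyone whose reported success count would be implausible under a fair coin, with the 5% cutoff being an assumption rather than a figure from the study.

from math import comb

def fair_coin_tail(successes: int, flips: int = 10) -> float:
    """P(at least `successes` heads in `flips` flips of a fair coin)."""
    return sum(comb(flips, k) for k in range(successes, flips + 1)) / 2**flips

def classify(reported_successes: int, alpha: float = 0.05) -> str:
    """Label a participant by how plausible their report is under a fair coin.

    The 5% cutoff is a hypothetical choice for illustration; the study's
    actual categorization rule is not given in the article.
    """
    if fair_coin_tail(reported_successes) < alpha:
        return "likely cheater"
    return "likely truth-teller"

print(classify(6))  # likely truth-teller (P of >= 6 heads is about 38%)
print(classify(9))  # likely cheater (P of >= 9 heads is about 1.1%)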

Thus, if dishonest people tend to self-select into digital communication channels, this could offer an avenue to better detect and reduce fraud. Specifically, tracking whether customers opt for virtual rather than human communication channels could complement companies' existing efforts to identify users who are more likely to cheat, enabling these organizations to focus their fraud-detection resources more effectively. Of course, customers may figure out what companies are doing and try to game the system by choosing to speak with a real agent, thus avoiding being flagged as higher risk. But this is really a win-win, since according to our research, they'll be much more likely to behave honestly if they talk to a human.
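As one hedged sketch of how this signal might be operationalized, the function below bumps an existing fraud score when a claim was filed through a digital channel. The function name, the 1.5x multiplier, and the base score are all made up for illustration; in practice the weights would have to come from a company's own models.

def fraud_review_priority(base_risk_score: float,
                          chose_digital_channel: bool) -> float:
    """Blend an existing fraud score with the channel-choice signal.

    `base_risk_score` is assumed to come from whatever scoring model a
    company already runs (0.0 = lowest risk, 1.0 = highest), and the
    1.5x multiplier is a placeholder, not a figure from the research.
    """
    multiplier = 1.5 if chose_digital_channel else 1.0
    return min(1.0, base_risk_score * multiplier)

# The same claim ranks higher in the review queue when it was filed
# through a chatbot or online form than when it was made to a human agent.
print(f"{fraud_review_priority(0.4, chose_digital_channel=True):.2f}")   # 0.60
print(f"{fraud_review_priority(0.4, chose_digital_channel=False):.2f}")  # 0.40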

Ultimately, there’s no cure for digital dishonesty. After all, lying to a robot just doesn’t feel as bad as lying to a real human’s face. People are wired to protect their reputations, and machines fundamentally don’t pose the same reputational threat as humans do. But with a better understanding of the psychology that makes people more or less likely to lie, organizations can build systems that can identify likely cases of cheating, and ideally, nudge people to be more honest.
