
Now that AI voices are being used in scam calls, how can you fight back against scammers?

However crazy the world is about AI, one tragic fact is unavoidable: AI makes scams more convincing, and easier for scammers to pull off, than ever before.

Recently, a report by The Washington Post indicated that scammers are using artificial intelligence to sound like victims’ family members in distress. People, especially older people, are falling for it and losing thousands of dollars.

According to the report, scammers called targeted older people with AI-generated voices of their friends or relatives, luring them into transferring money to help the supposed loved one out of an emergency.

This type of scam, built on AI-generated voices mimicking real people, is nothing new. As early as 2019, Forbes reported that an employee at a UK-based energy firm transferred €220,000 (about $243,000) to the bank account of a Hungarian supplier after receiving a call from his “boss”. In fact, the voice belonged to a fraudster using AI voice technology to impersonate the boss.

Grandparent scams have become quite common these days. Scammers exploit victims’ love for their grandchildren, relatives, family, and friends, and with AI tools they can do it at a much lower cost. Before AI was widely available, scammers usually relied on text messages to extract money from older people by pretending to be someone they knew, and it is not easy to build that kind of trust through a text message.

Nowadays, however, with AI able to imitate anyone’s voice, it is extremely difficult to tell real voices from fake ones. In fact, AI-generated voices can be 80-90% similar to a person’s real voice. At the very least, the difference is impossible to pick out by ear alone, and it is even harder for older listeners.

Without a doubt, AI dramatically lowers the cost of running a scam.

Subbarao Kambhampati, a computer science professor at Arizona State University specializing in AI, pointed out that scammers can clone a voice from just three seconds of a person’s speech. They don’t need any advanced equipment, either: a quick Google search turns up countless such apps and services, and worse still, most of them are free to use.

Scams exist because they are profitable. When costs go down, scammers get a higher return on their effort. Since AI is capable in so many fields, it is expected to be widely adopted by scammers to defraud people of their money.

AI Voice Cloning Scam Red Flags

Here are three red flags that may indicate an AI voice cloning scam.

So, is there a solution to this problem? Absolutely.

If you want to fight against AI, use AI.

Let AI sense risky numbers.

When scammers send fake text messages or make calls with fake voices, they still depend on real phone numbers. Although scammers can forge the content of messages or calls, the numbers they use cannot be forged. Each phone number has its own caller ID that carries information about the number’s owner: who the caller is, where they are calling from, which organization they belong to, and so on.

With the number database built and maintained by the RealCall team, the caller ID is identified the instant a call arrives. If the number is suspicious, the call is blocked immediately in the background. That’s how the RealCall Blocklist works.
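To make the idea concrete, here is a minimal sketch of how a blocklist lookup of this kind could work. The class and function names are illustrative assumptions for the example, not RealCall’s actual implementation.

```python
# Minimal sketch of a shared blocklist lookup (illustrative only, not
# RealCall's real code).
from dataclasses import dataclass, field


@dataclass
class Blocklist:
    """A shared database of numbers reported or confirmed as risky."""
    risky_numbers: set[str] = field(default_factory=set)

    def report(self, number: str) -> None:
        # User reports feed the shared database.
        self.risky_numbers.add(number)

    def is_risky(self, number: str) -> bool:
        return number in self.risky_numbers


def handle_incoming_call(number: str, blocklist: Blocklist) -> str:
    """Identify the caller ID the moment the call arrives."""
    if blocklist.is_risky(number):
        return "blocked"   # rejected silently in the background
    return "ring"          # passed through to the user


blocklist = Blocklist()
blocklist.report("+15550100999")                        # flagged by other users
print(handle_incoming_call("+15550100999", blocklist))  # blocked
print(handle_incoming_call("+15550123456", blocklist))  # ring
```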

To add accuracy and flexibility, OpenAI technology is also combined with the RealCall Blocklist so that the number database can adapt to each user’s specific contact needs.

Let AI personalize.

What’s spam for one person is not necessarily spam for another.

Personal AI not only prevents spam and fraud but also learns which calls are valuable to users, so they can stay connected without being scammed. That is another great feature AI brings to scam call-blocking tools like RealCall.

Whenever you answer or decline a call, the relationship between you and that caller ID is fed back to the AI, so it gradually learns the contact preferences that belong to you alone.

For example, if you decline call after call from numbers in foreign countries or cities, the AI will reasonably conclude that you don’t want contact with such numbers. The next time a call arrives from a foreign country code or area code, it will be blocked automatically without being told, as in the sketch below.
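The sketch below illustrates this kind of preference learning under simple assumptions: it counts how often calls from a given number prefix are declined and starts auto-blocking once a threshold is crossed. The prefix grouping and thresholds are made up for the example and are not RealCall’s real rules.

```python
# Illustrative preference learning: decline a prefix often enough and
# future calls from that prefix get blocked automatically.
from collections import defaultdict


class ContactPreferences:
    def __init__(self, min_samples: int = 5, decline_ratio: float = 0.8):
        self.declines = defaultdict(int)
        self.answers = defaultdict(int)
        self.min_samples = min_samples
        self.decline_ratio = decline_ratio

    @staticmethod
    def prefix(number: str) -> str:
        # Group numbers by a coarse prefix (roughly country + area code).
        return number[:6]

    def record(self, number: str, answered: bool) -> None:
        key = self.prefix(number)
        if answered:
            self.answers[key] += 1
        else:
            self.declines[key] += 1

    def should_block(self, number: str) -> bool:
        key = self.prefix(number)
        total = self.answers[key] + self.declines[key]
        if total < self.min_samples:
            return False  # not enough evidence yet
        return self.declines[key] / total >= self.decline_ratio


prefs = ContactPreferences()
for _ in range(6):
    prefs.record("+44207xxxxxxx", answered=False)  # repeatedly declined
print(prefs.should_block("+44207yyyyyyy"))          # True
```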

RealCall AI: a One-to-One Mobile Communication Guardian

This is how RealCall works on your phone: OpenAI + RealCall Blocklist.

Simply put, the more you use RealCall, the lower your risk of receiving spam or scam calls.

Between you and scammers stands RealCall AI, a one-to-one mobile communication guardian.

Powered by OpenAI, the leading AI research and deployment company, RealCall AI automatically deals with risky and unwanted texts in the background. Built on a large language model, GPT-4 can analyze and process language input to identify patterns, sentiment, and intent, and to catch the subtle malicious intent hidden in a text message that ordinary people fail to notice. Combined with the continuously updated risky number database built from the RealCall team’s work and long-term user reports, RealCall AI can accurately and quickly identify scam-likely texts and handle them in each user’s personalized way.
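The following sketch shows how such a two-stage check could be wired together: an allowlist and blocklist lookup on the sender’s number, followed by a language-model intent check on the message text. The llm_classify function is a placeholder for a real model call, and its keyword-based logic, along with every name and number here, is an assumption for illustration only.

```python
# Simplified two-stage screening of an incoming text message
# (illustrative only).


def llm_classify(text: str) -> str:
    """Placeholder for a language-model intent check.

    A real implementation would send `text` to a model and ask it to
    label the intent, e.g. 'scam' or 'legitimate'.
    """
    scam_markers = ("wire transfer", "gift card", "urgent", "verify your account")
    return "scam" if any(m in text.lower() for m in scam_markers) else "legitimate"


def screen_text(sender: str, text: str, risky_numbers: set[str],
                user_allowlist: set[str]) -> str:
    """Decide what to do with an incoming text, in the user's preferred way."""
    if sender in user_allowlist:
        return "deliver"      # e.g., the user's real bank or hospital
    if sender in risky_numbers:
        return "block"        # known risky number from the shared database
    if llm_classify(text) == "scam":
        return "quarantine"   # hold for review instead of delivering
    return "deliver"


risky = {"+15550100999"}
allow = {"+15550123456"}
print(screen_text("+15550188888", "URGENT: wire transfer needed now", risky, allow))
# quarantine
```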

Rather than only blocking, RealCall AI also lets through the important and necessary text messages that genuinely matter to users, such as those from real hospitals, banks, and so on. Between you and scammers is RealCall AI, a one-to-one mobile communication guardian.
