Is That Late Night Caller Family – or AI?

When a family member or close friend calls you in peril, how often will you stop to question whether the call is legitimate? If you suddenly hear your teenage child's voice on the phone, telling you that they have landed in trouble with the authorities and need money to pay a fine, how likely are you to pause and question whether it is really their voice? This is a question that matters to individuals and to anyone handling AI in business alike.

The odds are that you will not question it – and that is exactly what hackers are counting on. Sometimes, your loved one will then seemingly pass the phone to a police officer, or in more sinister scenarios a kidnapper, who heightens the tension and gives you an unfamiliar account to send money to.

That is the terrifying reality of AI voice cloning technology in the 2020s – you may never truly know whether the person you are talking to is really the person closest to you.

The Rise of Voice Cloning

AI voice cloning is on the rise in the world of scamming, and it relies on scare tactics to put victims in a position where they feel they have no option but to help.

You may already be familiar with deepfake technology, used to create AI-generated images of people in online videos – and even in apps that show how you might look when older or as a member of the opposite sex. Today, this technology has developed to the point where it can simulate your voice from recordings of it, and in most cases all you have to do is answer the phone.

Everything from social media videos to podcast appearances can allow these threat actors to lift your voice and run it through AI, making it almost impossible to stop scammers from imitating you in an audio sense. However, there is a way to protect your circles – and AI in business – from these scams: the creation of a shared verbal password.

Personal Passwords

Agreeing on a family password is a simple safety precaution. It needs to be something everyone can remember, and it must be kept out of public knowledge.

When someone calls, emails, or texts you in urgent need of money, ask them for the password. If they cannot tell you, you know they are fake. You can also attempt to verify their identity with further questions only the real person could answer – a fallback that works even if you have not set a password with them. The password can even be left on a pad next to your phone, as no scammer is going to break into your home to find it.

With deepfakes ever evolving and destined to become a vital tool in cloning scams targeting AI in business, these techniques also help protect against future scams involving machine-generated video or photo avatars that are becoming incredibly sophisticated and realistic.

These alarming high-technology scams are a focus of AI events covering both personal and enterprise AI threats, helping to inform everyone of the dangers posed by threat actors preying on the vulnerable through voice cloning of their loved ones and work colleagues.

If you want to learn more about AI conferences in the coming months, look out for these AI events in London.
