Artificial intelligence isn’t just making phishing emails seem more realistic – it’s making vishing (voice phishing) attacks more convincing, too. With the help of AI, cybercriminals can clone a person’s voice from a small audio clip. Frighteningly, that includes the voices of your loved ones. After all, someone is more likely to wire money to a grandchild who needs college textbooks than to a complete stranger.
If you receive a phone call asking for money or sensitive information, pause and breathe before reacting. Even if the voice on the other end is one you know, do not engage further – hang up and call the real person directly using a trusted phone number. If money was already sent, contact your financial institution immediately and ask them to stop or reverse the transaction. Help protect others by spreading awareness – share this message with friends and family so they know how to stay safe, too.