January 2024 Scam: A.I. Scams

Scammers are constantly looking for new ways to defraud you and refining their techniques. Abusing technology like artificial intelligence (AI) is par for the course.

The Federal Trade Commission (FTC) has warned that con artists can use AI to clone the voice of a loved one. The technology behind this is called a deepfake: media that has been digitally fabricated or altered. All a scammer needs is a three-second audio clip, which is easy to obtain with so much content posted online these days. Deepfakes can also be applied to images and videos.

Tips to Protect Yourself:

  • Don’t trust the voice, video or image.

  • AI is not perfect so look/listen for imperfections.

    • Audio: unnatural speech, odd pronunciation, or strange word choices and pauses

    • Images: blurry or distorted details, such as faces and hands

    • Video: strange shadows, light flashes, or unnatural body language

  • Call the person directly to verify their story.

  • Create a code word and share it with family and close friends to help validate phone calls, if needed.

  • Never call phone numbers or click links found in texts and emails. Use a number that you know is legitimate.

  • Be wary if asked to pay with gift cards, money transfers, or cryptocurrency. Scammers prefer these methods because they are much harder to trace.

  • Don’t be pressured into making an immediate, emotional response.

Sources: bbb.org, ftc.gov
