TECHNOLOGY

The FBI wants you to come up with a code word to ward off deepfake AI attacks on your phone

I get that many of you would prefer that the FBI not put its sticky trigger fingers in your smartphones. But lately the G-men have been less interested in seeing what is on your phone and more focused on how you can protect your personal data and content. For example, just the other day the FBI warned iOS and Android users not to send cross-platform texts unless they use a platform that offers end-to-end encryption.

This is because Apple decided to support a version of the Rich Communication Services (RCS) protocol that doesn’t include end-to-end encryption. A good app to use instead is WhatsApp, which fully encrypts text messages sent from an iOS user to an Android user and vice versa. Texts sent from one iOS user to another, or from one Android user to another, are already encrypted end-to-end.
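To picture what end-to-end encryption actually buys you, here is a minimal sketch using the PyNaCl library. This is a hypothetical illustration of the general idea, not the actual Signal protocol WhatsApp runs: each person keeps a private key on their own device, and a message locked to the recipient's public key can only be opened by that recipient.

    # Simplified sketch of end-to-end encryption (assumes the PyNaCl library;
    # not how WhatsApp/Signal implement it in practice).
    from nacl.public import PrivateKey, Box

    # Each user generates a key pair on their own device.
    alice_private = PrivateKey.generate()
    bob_private = PrivateKey.generate()

    # Alice encrypts with her private key and Bob's public key.
    sending_box = Box(alice_private, bob_private.public_key)
    ciphertext = sending_box.encrypt(b"Running late, see you at 6")

    # Anything in the middle (carrier, messaging server) only sees ciphertext.
    # Bob decrypts on his own device with his private key and Alice's public key.
    receiving_box = Box(bob_private, alice_private.public_key)
    print(receiving_box.decrypt(ciphertext).decode())

The point of the sketch is the last two steps: the servers relaying the message never hold the keys, so they can pass the ciphertext along but can't read it.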

The FBI now warns in public service alert number I-120324-PSA that smartphone users are under attack from hackers who use AI to correct errors in fake emails and texts, making them look more realistic so they can trick more victims into revealing personal data. As we’ve told you many times, when you get an email or text that smells fishy (or even if it doesn’t), spelling errors or grammatical mistakes are usually a sign that the message is bogus. With AI, hackers can easily fix such telltale mistakes before they send you the fake email or text.

The G-men warn that generative AI can be used in scams to:

  • Produce photos to share with victims to help convince them that they are speaking with a real person.
  • Create images of celebrities and social media personalities that appear to show them promoting fraudulent schemes.
  • Create an audio clip of a loved one in a crisis situation requesting financial aid.
  • Produce video clips of company executives and law enforcement officials.
  • Create video clips to prove that an online contact is a real person.
If you get a phone call asking for personal data or money, hang up and call a verified phone number for the company the caller claims to represent.

Come up with a secret word that only you and your family know so you can verify whether a loved one is really in danger if you receive an AI-generated scam call.

Never, never, never share sensitive information with people you meet online or over the phone.

While you wouldn’t want the FBI cracking open your phone, you can heed their advice and keep your personal data safe. That in turn will keep attackers from accessing your financial accounts.


