AI Risks

Column 7.8 | Risk Warning on Using New AI Technologies to Implement Fraud


At present, the widespread application of AI technology provides the public with personalized, intelligent information services, but it also creates opportunities for online fraud. For example, criminals use face-swapping and voice-synthesis technologies to fabricate images, audio, and video, impersonating others to commit fraud and infringing on consumers' legitimate rights and interests.

Fraud methods

Fraud using new AI technologies mainly takes two forms, "voice cloning" and "face swapping": criminals gain trust by simulating another person's voice or appearance and then defraud the victim of money. They typically contact consumers through WeChat, QQ, or phone calls under pretexts such as "online store customer service", "marketing promotion", "recruitment and part-time jobs", or "marriage and dating", and collect samples of their voice, speech, or facial information. Using voice-cloning and face-swapping technologies, they then synthesize fake audio, video, or images of the consumer and, on pretexts such as borrowing money, investing, or handling an emergency, induce the victim's relatives and friends to transfer money or disclose sensitive information such as bank account passwords, after which the funds are transferred away immediately. Criminals may also synthesize audio and video of celebrities, experts, or officials and spread false information under their identities to carry out fraud.

Risk warning

"Seeing" online channels is not necessarily true

A defining feature of synthesis technologies such as "voice cloning" and "face swapping" is that they pass off the fake as real. Criminals can use them to easily disguise themselves as someone else, quickly screen target groups, and tailor fraud scripts to each victim. Therefore, where money is involved, a phone call in a "very familiar voice" or a video that "looks like an acquaintance" may well be a criminal's fraud routine, and consumers should stay vigilant.

Verify the other party's identity before transferring money

When a self-proclaimed "acquaintance" or "leader" urges you, for whatever reason, to send money via social software or phone, be sure to verify their identity first. You can ask questions whose answers only the two of you would know, confirm through a different communication channel or an in-person meeting, or check the person's identity and situation with mutual friends and family members. If the other party's identity cannot be confirmed, avoid any transfer.

Protect personal information such as photos and voice recordings

Consumers should raise their awareness of personal information protection: do not casually download unfamiliar software, register on unfamiliar platforms, or add strangers as friends, and stay alert to the security of personal social accounts. Avoid posting too many personal photos, voice recordings, and videos online, and do not store photos of ID cards, bank cards, and the like on your mobile phone for convenience. If you are defrauded or encounter suspicious circumstances, preserve the evidence and report the case to the local public security authority immediately.
