AI Risks

Column 7.8 | Risk warnings on using new AI technologies to implement fraud

At present, the widespread application of AI technology provides the public with personalized, intelligent information services, but it also creates opportunities for online fraud. For example, criminals use face replacement and voice synthesis to create fake images, audio, and video, impersonating others to commit fraud and infringe on consumers' legitimate rights and interests.

There are two main methods of using new AI technologies to commit fraud: voice cloning and face swapping, that is, gaining trust by simulating another person's voice or appearance and then defrauding victims of money. Criminals typically contact consumers through WeChat, QQ, phone calls, and similar channels under pretexts such as "online store customer service", "marketing promotion", "recruitment and part-time jobs", or "marriage and dating" in order to collect voice samples, phrases, or facial information. They then use voice-cloning and face-swapping technologies to synthesize fake audio, video, or images of the consumer, and use excuses such as borrowing money, investment opportunities, or emergencies to induce the consumer's relatives and friends to transfer money or provide sensitive information such as bank account passwords, after which the funds are immediately moved away. In addition, criminals may synthesize audio and video of celebrities, experts, or officials and use their identities to spread false information for fraudulent purposes.

Risk warnings:

1. The "seeing" of online channels is not necessarily true. A major feature of synthetic technologies such as "sound-like" and "face-changing" is that "using fake to the real", criminals can use such technologies to easily disguise themselves as others, and quickly screen target groups and customize fraud scripts to accurately implement fraud. Therefore, when it comes to financial transactions, a "very familiar phone call" and a "video that looks like an acquaintance" may be a fraudulent routine for criminals, and consumers should be vigilant.

2. When transferring money, be sure to verify the other party's identity. When someone claiming to be an "acquaintance", "leader", or similar asks you to send money via social software or phone for whatever reason, be sure to verify who they are. You can ask questions whose answers only the two of you would know, verify through another communication channel or in person, or confirm the person's identity and situation with mutual friends and family members. If the other party's identity cannot be confirmed, avoid making any transfer.

3. Protect photos, voice recordings, and other personal information. Consumers should strengthen their awareness of personal information protection: do not casually download unfamiliar software, register on unfamiliar platforms, or add unfamiliar friends, and stay alert to the security of personal social media accounts. Try to avoid exposing too many personal photos, voice recordings, or videos online, and avoid storing photos of ID cards, bank cards, and similar documents directly on your mobile phone. If you are defrauded or encounter suspicious circumstances, preserve the evidence and report the case to the local public security organ immediately.
