Tech News

Can GPT-4 be used for evil after all? Tests show it can create malware and offer advice with near-zero barriers. How can this be prevented?

GPT-4 has just debuted, and the outside world is eager to see how it can play a greater role in the real world. However, some information security experts have discovered that GPT-4 can be used to create phishing emails and malware, dramatically lowering the barrier to entry for hackers.

OpenAI recently released its new model, GPT-4. How this chatbot, with its excellent reasoning ability, can be applied to deliver more powerful value is undoubtedly the focus of many people's attention. However, while the AI is advancing the world, some researchers pointed out that they successfully induced GPT-4 to assist in developing malware and phishing emails.

Can GPT-4 also be used for evil?

According to Forbes, researchers at cybersecurity firm Check Point said they managed to get GPT-4 to help them build malware by avoiding the word "malware" in their conversations, thereby bypassing OpenAI's safeguards against criminal use.

When OpenAI released GPT-4, one of the main features it emphasized was safety, and it pledged that the technology would not be exploited by cybercriminals. The devil seems to be one step ahead, however: in the researchers' tests, GPT-4 was able to write malware in C++ that could collect PDF files and transmit them to a remote server. Notably, GPT-4 also advised them on how to get the malware to run on Windows 10, reduce its file size, and lower the likelihood of detection by antivirus software.

"GPT-4 can help malicious actors, even those with non-technical backgrounds, accelerate and realize their premeditated plans," Check Point said in its report. "What we see is that GPT-4 can be used for both positive and negative purposes. Good users will use GPT-4 to create all kinds of programs that help society, while malicious users will use this AI to carry out cybercrime quickly."

Daniel Cuthbert, a member of the Black Hat conference review board, pointed out that GPT-4 can help people without sufficient technical knowledge create malicious tools, making cybercrime easy and greatly lowering its barrier to entry.

Phishing emails without writing a line of code: could GPT-4 lower the barrier to cyberattacks?

Check Point has also tracked the use of ChatGPT for cybercrime in the past. In December last year, it discovered that cybercriminals had used ChatGPT to write Python scripts that could be used in malicious attacks. Even though the poster had no development background, the chatbot-assisted malware was mostly correct, with only a few coding errors.

Another security company, BlackBerry, also revealed that in its survey of 1,500 information technology experts, as many as 74% were concerned that ChatGPT could facilitate cybercrime. At the same time, 71% worried that the technology could be used by nation-states to launch cyberattacks on other countries.

According to Sergey Shykevich, threat intelligence group manager at Check Point, GPT-4 generates phishing emails and malware code far more readily than previous versions. The fact that it is currently available only to paying ChatGPT Plus subscribers, with limited access, may explain why bad actors have not yet abused it at scale.

In fact, when OpenAI released GPT-4, it got ahead of the issue, acknowledging that the model has "significant limitations" in cybersecurity-related tasks, while emphasizing that it has strengthened safeguards through model training and increased internal monitoring and detection.

That said, the phishing emails and malware created by GPT-4 are nothing special, and experts believe modern detection systems can identify them. Just as water can carry a boat, it can also capsize it: GPT-4 is ultimately a tool, and whether it does good or harm depends on how users wield it. GPT-4 also holds great promise for education, data management, and fraud detection.
