Sina Technology News, Beijing time, March 29 morning: Microsoft has reportedly launched a new artificial intelligence conversation tool, Security Copilot, that can help cybersecurity teams prevent hacker attacks and respond to them after they occur.
Microsoft recently released a family of artificial intelligence conversation tools collectively branded Copilot. Security Copilot combines the GPT-4 language model developed by AI specialist OpenAI with Microsoft's own cybersecurity expertise, the company said on Tuesday.
The idea is to use the tool to help workers quickly find connections between different pieces of evidence in a hack, such as a suspicious email, a malicious software file, or the traces left in an information system after an attack.
Microsoft and other security companies have been using machine learning for years to find vulnerabilities and root out suspicious behavior. But the latest AI technology not only speeds up analysis; it also lets users converse with the system in plain English, making it accessible to people who are not AI or cybersecurity experts.
Vasu Jakkal, Microsoft's vice president of cybersecurity, compliance, identity and privacy, said there is a shortage of cybersecurity workers in the United States, so the tool comes at a good time.
Since the pandemic began, Microsoft has seen a surge in hacking attacks, Jakkal said. Once a user clicks on a phishing link, it now takes a hacker an average of one hour and 12 minutes to gain free access to the victim's entire inbox; in the past, gaining such access would have taken weeks or even months.
Users can ask Security Copilot questions like, "How can I isolate a device that has been compromised by hackers?" They can also ask it to compile a list of everyone who sent or received emails containing a dangerous link in the period around an attack. Security Copilot can also readily write a summary report on how a hacking incident occurred and how it was handled.
Microsoft will initially make Security Copilot available to a limited set of customers and gradually expand access over time. Jakkal did not disclose when it will be widely available or who its first users will be.
Security Copilot draws on data from Microsoft researchers and from U.S. government agencies that have long tracked cyberattacks and cybercrime syndicates targeting the United States.
To take action against hackers, Security Copilot must work in concert with Microsoft's other security software, and Microsoft plans to integrate it into other software products in the future.
Microsoft has been very active in generative AI this year, launching a number of products, some of which are still immature. In announcing Security Copilot, Microsoft also stressed that the system still has flaws. During a demonstration by the Microsoft team, Security Copilot reportedly described a vulnerability in "Windows 9," an operating system that does not exist.
Security Copilot can also learn from its users. Users can adjust privacy settings to determine how much of their information it is allowed to collect. If users permit that collection, Jakkal said, Microsoft may use the data to help other customers.
Managing Editor: Liu Mingliang