Microsoft introduces an A.I. chatbot for cybersecurity experts

Satya Nadella, chief executive officer of Microsoft Corp., speaks during the Windows 10 Devices event in New York on Oct. 6, 2015. Microsoft Corp. introduced its first-ever laptop, three Lumia phones and a Surface Pro 4 tablet, the first indication of the company’s revamped hardware strategy three months after saying it would scale back plans to make its own smartphones.

John Taggart | Bloomberg | Getty Images

Microsoft on Tuesday announced a chatbot designed to help cybersecurity professionals understand critical issues and find ways to fix them.

The company has been busy bolstering its software with artificial intelligence models from startup OpenAI after OpenAI’s ChatGPT bot captured the public imagination following its November debut.

The resulting generative AI software can at times be “usefully wrong,” as Microsoft put it earlier this month when talking up new features in Word and other productivity apps. But Microsoft is proceeding nevertheless, as it seeks to keep growing a cybersecurity business that fetched more than $20 billion in 2022 revenue.

The Microsoft Security Copilot draws on GPT-4, the latest large language model from OpenAI — in which Microsoft has invested billions — and a security-specific model Microsoft built using daily activity data it gathers. The system also knows a given customer’s security environment, but that data won’t be used to train models.

In response to a typed prompt, the chatbot can compose PowerPoint slides summarizing security incidents, describe exposure to an active vulnerability or specify the accounts involved in an exploit.

A user can hit a button to confirm an answer if it’s right or select an “off-target” button to signal a mistake. That sort of input will help the service learn, Vasu Jakkal, corporate vice president of security, compliance, identity, management and privacy at Microsoft, told CNBC in an interview.

Engineers inside Microsoft have been using the Security Copilot to do their jobs. “It can process 1,000 alerts and give you the two incidents that matter in seconds,” Jakkal said. The tool also reverse-engineered a piece of malicious code for an analyst who didn’t know how to do that, she said.

That type of assistance can make a difference for companies that run into trouble hiring experts and end up hiring employees who are inexperienced in some areas. “There’s a learning curve, and it takes time,” Jakkal said. “And now Security Copilot with the skills built in can augment you. So it is going to help you do more with less.”

Microsoft isn’t talking about how much Security Copilot will cost when it becomes more widely available.

Jakkal said the hope is that many workers inside a given company will use it, rather than just a handful of executives. That means over time Microsoft wants to make the tool capable of holding discussions in a wider variety of domains.

The service will work with Microsoft security products such as Sentinel for tracking threats. Microsoft will determine if it should add support for third-party tools such as Splunk based on input from early users in the next few months, Jakkal said.

If Microsoft were to require customers to use Sentinel or other Microsoft products to turn on Security Copilot, that could influence purchasing decisions, said Frank Dickson, group vice president for security and trust at technology industry researcher IDC.

“For me, I was like, ‘Wow, this may be the single biggest announcement in security this calendar year,'” he said.

There’s nothing stopping Microsoft’s security rivals, such as Palo Alto Networks, from releasing chatbots of their own, but getting out first means Microsoft will have a head start, Dickson said.

Security Copilot will be available to a small set of Microsoft clients in a private preview before wider release at a later date.

WATCH: Microsoft threatens to restrict data from rival AI search tools
