Microsoft Copilot for Security brings GenAI to SOC teams
Microsoft's latest AI-powered tool, now generally available, has boosted security teams' efficiency, but infosec experts still see room for improvement.
Microsoft officially launched Copilot for Security on Monday, and while the generative AI tool might bolster security operations, enterprises could face implementation and integration challenges.
The tech giant unveiled Copilot for Security, originally called Security Copilot, in March 2023 to assist security and IT teams with threat detection and response. Following a series of rollout stages for the generative AI (GenAI) tool, Microsoft added a pay-as-you-go pricing model and new capabilities, such as knowledge base integrations and multilanguage support.
Vasu Jakkal, corporate vice president of security, compliance, identity and management at Microsoft, announced the launch in a blog post last month and emphasized that enterprises can use Copilot for Security as a standalone portal or embed the AI tool into existing security products.
Microsoft's latest chatbot uses OpenAI technology to help enterprises address security risks related to device and identity management, data protection, and incident response investigations. For example, Copilot for Security could be applied to reverse-engineer an exploit or discover which endpoint was affected during an attack.
"Copilot is informed by large-scale data and threat intelligence, including more than 78 trillion security signals processed by Microsoft each day, and coupled with large language models to deliver tailored insights and guide next steps," Jakkal wrote in the blog post.
Cybersecurity companies have increasingly implemented GenAI and large language models (LLMs) into their products and services over the last year in an effort to help organizations address an expanding threat landscape. Meanwhile, adversaries are also embracing GenAI. For example, in February, Microsoft published research on how nation-state threat actors are preparing for attacks by leveraging LLMs to research specific technologies and vulnerabilities.
Infosec professionals largely agreed that the tool will benefit security teams in several ways, but they cited challenges with implementation, data set limitations and integration with non-Microsoft tools.
Microsoft rolled out Copilot for Security in a private preview for select partners last year. NCC Group was one of the private preview participants. Sian John, CTO at NCC Group, told TechTarget Editorial that the preview was helpful to understand Copilot's capabilities and how it could augment the company's services. Since preview mode limits the number of users, NCC Group only engaged expert users. In addition, it conducted hackathons where teams shared knowledge and tested use cases.
NCC Group primarily applied the tool for security operations center (SOC) analyst use, but anticipates broader use as more capabilities become available.
"We created a quishing process to enrich the data involved in a QR phishing email, enhancing the productivity of an analyst in investigations and helping to create reports," John said. "We also were able to use Copilot to create queries that would investigate incidents across both [Microsoft] Sentinel and Splunk, improving productivity in a hybrid environment."
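The hybrid-environment use case John describes, one investigation expressed as equivalent queries for both Microsoft Sentinel and Splunk, can be sketched as follows. The query strings and the `query_for` helper are illustrative assumptions for a failed-sign-in investigation, not actual Copilot for Security output:

```python
# Illustrative sketch of the cross-SIEM query use case NCC Group describes:
# one investigation (failed sign-ins in the last 24 hours, grouped by account)
# written in both Sentinel's KQL and Splunk's SPL. Index and field names are
# hypothetical and would vary by deployment.
QUERIES = {
    # Microsoft Sentinel (KQL)
    "sentinel": (
        "SigninLogs "
        "| where TimeGenerated > ago(24h) "
        "| where ResultType != 0 "
        "| summarize FailedCount = count() by UserPrincipalName"
    ),
    # Splunk (SPL), assuming authentication events land in an "auth" index
    "splunk": (
        "index=auth action=failure earliest=-24h "
        "| stats count AS FailedCount BY user"
    ),
}

def query_for(backend: str) -> str:
    """Return the investigation query for the given SIEM backend."""
    try:
        return QUERIES[backend]
    except KeyError:
        raise ValueError(f"unsupported backend: {backend}") from None
```

In practice, an analyst would ask the assistant for the investigation in natural language and receive the backend-appropriate query, which is where John saw the productivity gain in a mixed Sentinel/Splunk estate.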
Microsoft marketed Copilot for Security as a timesaving tool for overworked security teams. Alessia Oliveri, senior product manager at NCC Group, told TechTarget Editorial that it saved NCC Group's SOC team approximately 50 hours per week.
While the preview went relatively well, Gerben van der Lei, strategy program manager at Fox-IT, part of NCC Group, highlighted one important lesson he learned: Good prompting skills are essential.
Potential integration challenges
While beneficial, GenAI products for cybersecurity aren't without risks. Hallucinations, which occur when the LLM generates inaccurate or nonfactual responses, are an ongoing concern as LLMs become more widely adopted.
Accuracy was a primary concern for Kalpana Singh, vice president and head of product marketing at Recorded Future. Last year, Recorded Future released its own OpenAI-based chatbot trained on the vendor's threat intelligence.
Singh said she expects enterprises will apply Copilot for Security for day-to-day incident response and to manage alerts, but she expressed concern about how the model will adapt to new constructs that enterprises introduce in their architecture. For example, she questioned how Copilot for Security would function if an enterprise began deploying IoT devices and establishing new connections to the network.
"I think it's an interesting tool for folks who are already in the Microsoft ecosystem and want to accelerate and make their security operations really efficient," Singh said.
In addition, she highlighted potential problems with non-Microsoft product integration. Singh said Microsoft products largely work only with other Microsoft products, which could pose a problem for enterprises outside the ecosystem.
"I have experience working with Microsoft on the integration side, and they can be very cagey about working with other tools. They will say, 'Oh, we have opened up,' but they are very cagey about actually opening up with the ecosystem," she said.
TechTarget Editorial asked Microsoft how Copilot for Security integration works with non-Microsoft products.
"Copilot has integrations with other ISVs to provide plugins and promptbooks to extend the insights customers get from the product," a Microsoft spokesperson said. "There are over 100 partners who joined the partner preview and are working on integrations and Copilot-related offerings."
Another related concern is that the product might contain a limited data set. Singh said Microsoft's intelligence gathering is driven by its own ecosystem. "I personally would not trust the [Microsoft] data to make informed decisions. I want the model to be trained on the company's [data]," she said.
The real test will be whether enterprises continue to face the same number of attacks following prolonged use of Copilot for Security, she said. As with any new product, Singh anticipates a learning curve for users, especially novice professionals.
Long learning curve
Forrester Research analysts also addressed the Copilot for Security learning curve in a blog post last month following Microsoft's announcement. Based on a preview event, the analysts said enterprises should expect 40 hours of training.
Jeff Pollard, vice president and principal analyst at Forrester, co-authored the blog post and told TechTarget Editorial that the training burden applies regardless of a company's size and resources. The primary challenge is that training requires users to adapt their behavior to the technology, and the extended training period could deter some initial investments.
"The payoff is worth it in terms of productivity, but it's not easy nor simple," Pollard said.
He agreed that enterprises will mainly implement the tool in security operations. For example, Pollard said it will make it easier for SOC analysts to write reports, create queries and parse scripts. "Reducing friction in those areas to improve SOC analyst experience will be big wins for security teams," he said.
While Singh cited concerns with product integration, Pollard said Copilot for Security stands out from other GenAI products because it's fully integrated into the broader Microsoft ecosystem. He emphasized how that offers an inherent competitive advantage for Microsoft.
While Copilot for Security is not necessarily remarkable in terms of functionality, Pollard said it does stand out from other offerings because it's now generally available.
"Most of the offerings satisfy similar use cases like summarization and contextualization, but vendors are playing with words by saying things like their offering is 'generally available but in limited release.' Microsoft is beating those vendors to market with a generally available solution, which gives it a head start," he said.
A different pricing model
Experts also addressed Copilot for Security's new pay-as-you-go licensing model. While Microsoft marketed it as a way for enterprises to "scale your usage and costs according to your needs and budget," Pollard and Singh raised several concerns.
Pollard said he is worried that CISOs will dislike the unpredictability of the model. Since it's a new tool, CISOs don't know how much their teams will use it, how much it will cost or how to manage those costs. "Pay-as-you-go can make experimentation a bit easier, but making a larger commitment will be a challenge for CISOs on a calendar fiscal year that didn't plan for this last year when they finalized spending plans," he said.
Singh's primary concerns also revolved around budget constraints. She questioned how the pricing model could affect security teams' decisions to use the tool for everything, and if they do, how they might impose usage constraints on users. If the tool is helpful and everyone starts using it, she warned, enterprises could lose track of the usage and receive unexpected bills. Singh has observed similar budget issues with Amazon Elastic Compute Cloud (EC2) instances.
"In my previous job, I heard developers would run these instances and let them run, and then suddenly at the end of the month there is this massive bill that comes to IT and DevOps because people just ran these test instances and forgot about it," she said. "I am still skeptical about the pay-as-you-go because I do feel that it has some impact on how surprised or shocked enterprises may be when receiving bills."
Arielle Waldman is a news writer for TechTarget Editorial covering enterprise security.