
Adding AI to Your SOAR Security Systems? Think ‘RoboCop’

Security leaders see plenty of potential as they integrate AI and LLMs into their security orchestration, automation, and response platforms, but bear in mind: It’s still early days, and expectations need to remain grounded.

Perspective

Fans of RoboCop, the sci-fi action flick about a law enforcement robot designed to be the perfect mix of human judgment and machine automation, might not realize there’s a good reason to re-watch it: The cult classic holds some surprising guidance about cybersecurity today.

Seriously, if you haven’t seen it in a while – or ever – RoboCop is worth streaming, because the issues it deals with are remarkably similar to what business and security leaders are facing as they consider how to best leverage AI to improve cybersecurity in general, particularly in the realm of security orchestration, automation, and response (SOAR), where AI advances seem capable of making swift improvements.

Quick backstory: Forty years ago, Orion Pictures purchased the script to RoboCop, but it took two years to find a director and get the film made. Paul Verhoeven rejected it twice before signing on to helm what’s now considered his cinematic masterpiece. The mashup of corporate culture satire, dark humor and uber violence follows the exploits of Murphy, a police detective nearly killed by thugs, then resuscitated by scientists who transform him into this half-man, half-machine hybrid, with a gun holster encased inside a titanium thigh. Hey, what could go wrong, right?


The unintended consequences of its programming (and a bit of haunted humanity lingering inside all that metal) send things awry, leaving the cyborg’s scientific and corporate overlords scratching their heads.

And this is pretty much where today’s cybersecurity professionals find themselves.

With cyberattacks hitting organizations at unprecedented rates, many large enterprises have invested in their SOAR capabilities to integrate with existing security toolsets and automate routine security tasks. Whether it’s automatically initiating security alert investigations and threat response, isolating infected machines, or orchestrating complex security workflows, also known as playbooks, the goal has been the same: Reduce the amount of time it takes to detect and respond to security incidents.

Enterprises also invest in these SOAR platforms hoping to address alert fatigue and overstressed staff. Scott Crawford, research director of the information security channel at S&P Global Market Intelligence, says about half of the end-user companies he’s consulted are so overwhelmed by security alerts that they can’t respond to them all in a reasonable time. According to the B2B market research firm MarketsandMarkets, that’s one reason why the worldwide market for SOAR tools hit about $1.1 billion in 2022 and is expected to reach nearly $2.3 billion by 2027.

Yet many enterprises have found the implementation of their SOAR platforms and processes disappointing. The reasons are many, and they’re likely to foreshadow the challenges enterprises will face when integrating AI into their SOAR systems.

Why playbooks evolve (or don’t) with SOAR

Integrating SOAR systems with existing security tools and processes is complex and requires expert customization. Many organizations also struggle with data mapping, quality, and consistency across disparate data sources, which undercuts SOAR effectiveness.

There are a lot of use cases that come to mind when one applies LLM technology to security – the most obvious being threat intelligence and analysis.

Scott Crawford, research director of the information security channel, S&P Global Market Intelligence

The result is that many of the “playbooks” and their component “flows” and “subflows” – the preset automated or semi-automated workflows that guide responders through the correct response to specific types of incidents, whether ransomware or indicators of compromise found on internal systems – never evolve beyond relatively simple security tasks, like taking forensic snapshots or taking systems offline.

“While helpful, these systems have not entirely lived up to their promise,” says Michael Farnum, advisory CISO at technology services firm Trace3. “The automation provided remains for the simplest tasks, and the result is they haven’t brought the productivity bump many had hoped for.”
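To make that limitation concrete, here is a minimal sketch of the kind of “simplest task” playbook described above: take a forensic snapshot of a suspect host, then take it offline if the alert is severe enough. The flow and the connector functions are hypothetical placeholders for illustration, not any particular SOAR or EDR vendor’s API.

```python
# A minimal sketch of a "simple task" SOAR playbook: snapshot, then isolate.
# The connector calls below are hypothetical stubs, not a real product's API.
from dataclasses import dataclass


@dataclass
class Alert:
    host: str
    severity: str
    indicator: str  # e.g., a file hash flagged as an indicator of compromise


def take_forensic_snapshot(host: str) -> str:
    # Placeholder for a connector call into a forensics/EDR tool.
    print(f"[playbook] capturing forensic snapshot of {host}")
    return f"snapshot-{host}"


def isolate_host(host: str) -> None:
    # Placeholder for a network-containment connector call.
    print(f"[playbook] isolating {host} from the network")


def simple_containment_playbook(alert: Alert) -> None:
    """Flow: preserve evidence first, then contain if severity warrants."""
    take_forensic_snapshot(alert.host)
    if alert.severity in ("high", "critical"):
        isolate_host(alert.host)


if __name__ == "__main__":
    simple_containment_playbook(Alert(host="ws-042", severity="high", indicator="abc123"))
```

The ordering here – preserve evidence, then contain – mirrors the forensic-snapshot and offline-isolation steps mentioned above, and it is about as far as many deployed playbooks go.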

Others, however, remain optimistic that AI will be a value-add, especially for more complex security tasks.

[Read also: Ultimate AI cybersecurity guide – benefits, risks, and rewards]

Jason Rader, global CISO at Insight Technology Solutions, was an early adopter of generative AI capabilities integrated into the tools his security operations team used. “We are excited to have AI capabilities in several different products,” says Rader.

Generative AI in particular has helped his security operations team streamline time-consuming but essential tasks, such as writing Python scripts and SQL queries and drafting reports. On the other hand, some of the generative AI capabilities integrated into the orchestration and related platforms he’s seen have been "less than optimal" when it comes to analysis and more complex playbooks, such as processes that require multiple steps. "We didn’t expect this [AI] to be purpose-perfect yet," Rader adds.

[Read also: Racing to deploy GenAI? Security starts with good governance]

Security teams have new hope that large language models (LLMs) can improve SOAR technologies and help them respond to threats both more rapidly and more intelligently. The challenge is getting the right mix of program maturity, human judgment, and AI decision-making and automation.

Transforming SOAR ops: Advanced LLMs fuel AI’s promise

The advancements in LLMs promise to transform enterprise SOAR operations by providing automated triage of incidents, threat detection, incident response, security data analytics, natural language interfaces, adaptive playbooks, and predictive capabilities. The hope is that, over time, the AI capabilities built within these tools (or the custom LLMs enterprises build in-house) can help security teams to better parse through all the threat intelligence, security alerts, and event data generated from networks and applications, identify the threats most significant to their industry and organization quickly – and respond automatically when possible.

It’s not possible yet, but in the future, playbooks will be created by AI systems, perhaps even created or at least adjusted on the fly.

Michael Farnum, advisory CISO, Trace3

Security experts see four ways LLMs are already beginning to transform SOAR platforms:

Security data analysis. These systems can detect potential threats by analyzing threat intelligence and enterprise systems’ event data. “There are a lot of use cases that come to mind when one applies LLM technology to security – the most obvious being threat intelligence and analysis,” says Crawford.

Automated alert triage. That data analysis can also be used to prioritize incident response based on the business criticality of the at-risk data and systems and the configuration and status of the security systems defending those assets. These systems can present analysts with appropriate response playbooks. “Currently, AI is helping with the most routine of playbooks, but the playbooks will likely become more sophisticated over time and handle more complicated tasks,” says Farnum. A simple sketch of what that triage scoring could look like appears after this list.

That will require both innovation (the buzzword that drives all business decisions and best practices today, and thus what enterprise leaders like) and testing (time-consuming, costly, unsexy – that is, all the things chief information security officers have a tough time selling to CEOs and the board).

[Read also: How CISOs can talk cyber risk so that CEOs actually listen]

“Playbooks are combinations of the scripts and integrations that you’ve been using for some time, and any playbooks developed by AI should be tested before making that playbook active in the environment,” says Rader.

Playbook creation and optimization. AI can review historical incident data, instantly identify patterns of what worked and what didn’t, and guide security analysts to create the most effective playbooks by focusing only on the steps needed to do the job. “It’s not possible yet, but in the future, playbooks will be created by AI systems, perhaps even created or at least adjusted on the fly,” says Farnum.

Automated predictive orchestration. Using the same breadth of data to analyze threat intelligence and IT environment information, AI can investigate the incident in real time, develop or select the best response tools, and even intelligently reallocate resources based on changes in the threat landscape. “Some of these businesses are saying they can do basic tasks now, to some degree, with the hopes of the technology maturing over time. How effective will it be here? The jury is still out,” said Crawford.
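As promised above, here is a hedged sketch of the triage idea: score alerts by business criticality and by whether the at-risk asset is already well defended, then surface a plain-language summary and a suggested playbook for an analyst to review. The scoring weights, fields, and the summarize_for_analyst() stub standing in for an LLM call are assumptions made for this example, not features of any particular SOAR product.

```python
# A hedged sketch of LLM-assisted alert triage: score, rank, and summarize
# alerts for an analyst. Weights and the LLM stub are illustrative only.
from dataclasses import dataclass


@dataclass
class TriagedAlert:
    alert_id: str
    asset: str
    criticality: int       # 1 (low business impact) .. 5 (crown jewels)
    edr_coverage: bool      # is the asset already monitored/defended?
    suggested_playbook: str
    score: float = 0.0


def score(alert: TriagedAlert) -> float:
    # Higher business criticality and missing defenses push the alert up the queue.
    return alert.criticality * (2.0 if not alert.edr_coverage else 1.0)


def summarize_for_analyst(alert: TriagedAlert) -> str:
    # Placeholder for an LLM call that would draft a plain-language summary;
    # a real integration would send alert context to a model and return its text.
    return (f"Alert {alert.alert_id} on {alert.asset}: criticality {alert.criticality}, "
            f"suggested playbook '{alert.suggested_playbook}' (score {alert.score:.1f})")


def triage(queue: list[TriagedAlert]) -> list[str]:
    for a in queue:
        a.score = score(a)
    # Present the highest-impact alerts to the analyst first.
    return [summarize_for_analyst(a) for a in sorted(queue, key=lambda a: a.score, reverse=True)]


if __name__ == "__main__":
    print("\n".join(triage([
        TriagedAlert("A-1", "db-prod-01", criticality=5, edr_coverage=False,
                     suggested_playbook="ransomware-containment"),
        TriagedAlert("A-2", "kiosk-17", criticality=1, edr_coverage=True,
                     suggested_playbook="reimage-endpoint"),
    ])))
```

Keeping the model’s output as a recommendation for an analyst, rather than an automatic action, reflects the same caution Rader notes above about testing any AI-developed playbook before making it active in the environment.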

Crawl, walk, run…automate

These AI-and-SOAR capabilities are within reach, but without some upfront work on data governance and security processes, enterprises risk running into the same challenges that dogged earlier generations of SOAR tools: complexity and low data quality.

You cannot skip the maturity step to get to the AI and automation steps.

Jason Rader, global CISO, Insight Technology Solutions

Before embracing AI-powered automation in cybersecurity, organizations need to take a hard look at their security maturity. If current SOAR processes aren’t working well, automating them won’t fix that; automation risks amplifying existing weaknesses and even creating new blind spots. Mature security programs have clearly defined workflows, high-quality data, and experienced teams who can properly train and oversee AI systems.

“You must have a certain level of maturity within your security program before fully leveraging AI and automation, because you can’t automate something that hasn’t been defined within your organization,” Rader says. “And you cannot skip the maturity step to get to the AI and automation steps. After your program is optimized, you can start to automate more.”

How to best prepare for the future of AI and SOAR integration

Despite global tensions, economic uncertainty, and the rise in cyber threats, it’s still an exciting time for security, as AI and automation start to take off and reduce analysts’ workload in different ways. However, experts don’t think there will be fully automated SOCs or security operations in the short term or even the intermediate term. “No one wants to oversee a widespread, self-inflicted denial-of-service attack because an AI made a poor decision,” says Crawford.

[Read also: A security chief on protecting the Marvel universe – what’s possible with AI and how to implement it]

To prepare for the eventual levels of AI and SOAR integration that experts foresee, Rader advises organizations to get to work maturing their existing processes and improving their data sources and governance. He also recommends investing in a SOAR platform now and getting the security team comfortable with it.

“As new AI and automation abilities come out, they’ll be able to add those capabilities because they will understand the platform and be ready to integrate the AI,” he advises.

It’s also essential to keep the security team engaged with the platform and the overall direction of security operations. “It’s going to be people working with AI for some time because we haven’t gotten to RoboCop yet,” Rader says.

George V. Hulme

George V. Hulme is an information security and business technology writer. He is a former senior editor at InformationWeek magazine, where he covered the IT security and homeland security beats. His work has appeared in CSO Online, Computerworld and Network Computing.
