Archive for August, 2011
Q1 Labs’ CSO, Chris Poulin, recently authored a paper defining best practices for IT Security in a cloud environment. In it, he covers the hurdles organizations can expect when securing their public or private cloud environments, the steps necessary to create an effective security policy, and the similarities between SIEM and cloud environments.
What are a few of the steps cloud providers and customers can take when building out their own cloud security plan? A major part of the process is to start with a risk assessment: understand your current data types, locations, business processes, and information flows, and know where the critically sensitive data lives. Just like any other enterprise environment, cloud computing requires customers and cloud providers to map their information topology before any reasonable security policy can be defined and implemented.
Step 1: Discovery
Know where all of your data is, no matter how you classify it. The key is uncovering the difference between the data that can and cannot be housed in the cloud. An eDiscovery process is recommended to locate buried and even misplaced data. Too often organizations find that Personally Identifiable Information (PII) is mixed with less critical data and matched with the wrong security protocols.
Step 2: Classification
After you understand where your data is, it needs to be classified appropriately and distributed to systems whose security controls match the data’s sensitivity. This step alone can help you make progress toward meeting various compliance regulations.
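As a minimal sketch of what this classification step might look like in practice, the snippet below maps data types to hypothetical sensitivity tiers and the minimum controls each tier requires before the data can be placed in a cloud system. The tier names, data types, and control labels are illustrative assumptions, not part of the paper:

```python
# Hypothetical sketch: map discovered data types to sensitivity tiers and the
# minimum security controls a hosting system must enforce for that tier.
SENSITIVITY_TIERS = {
    "pii":       {"tier": "restricted",   "controls": ["encryption-at-rest", "dlp", "audit-logging"]},
    "financial": {"tier": "confidential", "controls": ["encryption-at-rest", "audit-logging"]},
    "marketing": {"tier": "public",       "controls": []},
}

def required_controls(data_type):
    """Return the controls a data store must enforce for this data type.

    Unclassified data defaults to manual review rather than silently
    inheriting a weak tier.
    """
    entry = SENSITIVITY_TIERS.get(data_type, {"tier": "unclassified", "controls": ["review"]})
    return entry["controls"]
```

The key design choice is the default: anything not yet classified is routed to review, so newly discovered data never lands in the cloud with the wrong protocols by omission.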
Step 3: Data transit
SIEM can help define your data transit policy by monitoring endpoints, firewalls, and network activity to determine whether data should be allowed to proceed to the cloud. Content-aware network profiling from Data Loss Prevention (DLP) solutions can be fed to the SIEM to perform more complex correlations with other data feeds. For example, watch for PII such as a Social Security number in a patient healthcare record, and combine that detection with the firewall logs and network activity already in the SIEM to build a bigger picture of malicious activity.
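To make the PII example concrete, here is a minimal sketch of the kind of content-aware check a DLP solution might run on outbound payloads, emitting an event a SIEM could then correlate with firewall logs for the same destination. The event fields and function name are assumptions for illustration; real DLP engines use far richer detection than a single regex:

```python
import re

# Naive pattern for a formatted U.S. Social Security number (illustrative only;
# production DLP uses validation, context, and many more identifier types).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def inspect_payload(payload, destination):
    """Return a DLP event for the SIEM if the payload appears to carry PII,
    or None if nothing is detected."""
    if SSN_PATTERN.search(payload):
        return {
            "event": "pii-in-transit",   # correlate with firewall/network logs
            "indicator": "ssn",
            "destination": destination,
        }
    return None
```

A SIEM rule could then join `pii-in-transit` events with firewall activity to the same destination to decide whether the transfer to the cloud should proceed.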
As Chris Poulin has blogged, there is no question that more modern SIEM (a.k.a. Security Intelligence) solutions have their place in the cloud. It’s not a matter of if SIEM is ready for the cloud, but if the cloud is ready for SIEM. For more on IT Security best practices in cloud environments, take a spin through Chris’ complete writeup.
Related: SIEM and Cloud might be cousins
This is part 2 of an ongoing series of posts that answer “Six Things You Always Wanted to Know About Security Intelligence but Were Afraid to Ask.”
Now that we have a good understanding of Security Intelligence, let’s draw a clear picture of how modern Security Intelligence solutions evolved – and differ – from first-generation SIEM products. SIEM has become a widely deployed technology over the last 5+ years, and for good reason. But due to scale limitations and lack of visibility, legacy SIEM products can no longer go toe-to-toe with the advanced targeted threats (AKA, advanced persistent threats) making headlines today.
Log Management and Security Information and Event Management (SIEM) products are a standard element of the IT security landscape today. Large and small organizations in private and public sectors have widely adopted the solutions, and Gartner has published its SIEM Magic Quadrant report for a number of years. (Reportedly it’s one of Gartner’s most popular MQ reports across all IT disciplines.) The popularity of SIEM owes to its value: sophisticated monitoring and reporting on diverse network activity, enabling the identification of potential security risks and ensuring compliance with regulatory and policy requirements.
But first-generation SIEM products are now obsolete. Yes, obsolete. Here’s where they lag Security Intelligence solutions:
- No network activity monitoring. In the past, event logs from devices, applications and servers gave you a rough idea of what was happening on your network. Today, that’s just a starting point. Security Intelligence now requires real-time visibility into the flows, user activity, social media usage, mobile access and application content traversing your network – something first-gen SIEM can’t offer. Is that conversation using port 80 really web traffic, or is it a hidden botnet IRC communication? Have intruders compromised a user account and used it to post sensitive information to social media sites? Are your employees committing fraud or transmitting sensitive intellectual property inappropriately? Without integrating network behavior analysis / anomaly detection into SIEM, you won’t know until it’s too late.
- Not architected to scale. First-gen SIEM products did a passable job of collecting and correlating event logs for moderately sized organizations. But add in flow data, perform a few simultaneous searches, or deploy in a very large enterprise, and first-gen SIEMs choke. The reason is simple: they’re not architected to scale. They depend on external relational databases, which struggle to support the volume of I/O operations involved in demanding scenarios. Security Intelligence solutions are built from the ground up on purpose-built databases, so they can collect and correlate massive volumes of data in real time and still respond nimbly to ad hoc searches.
- No pre-exploit security awareness. The Security Intelligence timeline doesn’t begin at the point of exploit or breach. That’s just when the clock starts ticking on your detection and remediation activities. Modern Security Intelligence solutions inherently differ from first-generation SIEM products by integrating pre-exploit risk and vulnerability management capabilities, as one example. This allows you to identify, prioritize and reduce risks associated with misconfigured devices and unpatched vulnerabilities. In this way you actually reduce the number of breaches, as well as detect and remediate the ones that occur.
- Reliance on signature-based detection. The game has changed. You can’t sit back, update your malware signatures, and expect to protect your network. First-gen SIEM offerings relied too much on the assumption of a finite and familiar set of threats. This approach fails when the threat vectors grow exponentially more diverse by the day.
- Too slow to deploy, too expensive to staff. When first-gen SIEMs hit the market, early adopters were willing to spend plenty of time and money to get them up and running. Connectors and rules needed to be written, users needed to be trained, and so on. Once in production, their staffing requirements could also be significant: they spit out too many false positives, requiring additional staff to investigate volumes of incidents. Modern Security Intelligence solutions use a broader set of data (event, flow, asset, topology, vulnerability, configuration, etc.) and advanced automation to cut through the noise and reduce – not expand – security staffing requirements. One organization, for example, reduced ongoing security staff time requirements by 88% with Security Intelligence.
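The port-80 question raised above (“Is that conversation using port 80 really web traffic?”) can be illustrated with a crude payload heuristic of the kind network behavior analysis layers on top of event logs. This is a hypothetical sketch, not how any particular product implements flow analysis:

```python
# Hypothetical sketch: label a port-80 flow by its opening payload bytes.
# Real network behavior analysis inspects full flow records, timing, and
# statistical profiles, not just a prefix.
HTTP_PREFIXES = (b"GET ", b"POST ", b"HEAD ", b"PUT ", b"DELETE ", b"OPTIONS ", b"HTTP/")
IRC_PREFIXES = (b"NICK ", b"USER ", b"JOIN ")   # classic IRC handshake commands

def classify_port80_flow(first_payload_bytes):
    """Crude heuristic: does traffic on port 80 actually look like HTTP?"""
    if first_payload_bytes.startswith(HTTP_PREFIXES):
        return "http"
    if first_payload_bytes.startswith(IRC_PREFIXES):
        return "suspected-irc"      # possible botnet C&C hiding on port 80
    return "unknown-port80"         # escalate for deeper inspection
```

The point of the sketch is the category of check, not the specific prefixes: event logs alone would record only “allowed outbound TCP/80,” while payload-aware flow analysis can flag the IRC handshake hiding inside it.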
In sum, Security Intelligence solutions have made first-generation SIEM point products obsolete, and now help organizations protect against more challenging and diverse threats, with far less effort. They expand the scope of analysis to identify and prioritize risks before the point of exploit, and detect and resolve breaches faster through user activity and content visibility. They also scale to far greater volumes of data at radically reduced storage costs. And they are deployable and manageable with less manual work, satisfying stringent budget and ROI parameters.
Or in the words of Coherent, Inc., a leading provider of photonics-based solutions:
“We recently had an incident where someone was trying to port scan one of our email servers. Our previous system would not have seen this intrusion. Because of QRadar, we quickly – in a matter of minutes – located the individual computer and shut down the activity before any further damage could be done. The ability to locate and analyze information quickly – almost instantaneously – and in a fashion we could not do before has saved us incredible amounts of time.”
Stay tuned for the next post in this series, where we’ll look in depth at the question of how much staffing and expertise is needed to use Security Intelligence solutions. Subscribe here with your favorite RSS reader today.
With the advent of the “Smart Grid”, the electric and power industry has been progressing through its version of the Renaissance. Historically, the biggest concern for this industry was physical security, e.g., how do we keep the physical grid secure from tampering? Now the focus seems to be on service: moving toward the Smart Grid in order to smooth the delivery of electricity to an increasing number of customers, provide new monitoring services, and reduce the frequency of blackouts. This effort has been led by states like California working closely with NIST’s Smart Grid Interoperability Panel. But have they left cyber-security out of the big picture?
Similar to SCADA systems, most smart meters are delivered and implemented with little to no security measures in place. As a result, a rapidly growing number of energy providers and critical infrastructure suppliers are implementing security intelligence solutions to help them collect, normalize, and analyze network event and device data generated by their smart grids. They are recognizing that as smart meters become more intelligent, the risk profile increases accordingly, exposing the nation’s energy grid to more advanced attacks (what Gartner calls Advanced Targeted Threats).
In June 2011, the Obama administration released a report titled “A Policy Framework for the 21st Century Grid”, which is tasked with defining the future of our nation’s energy policy. One of the report’s goals focuses directly on establishing policies and best practices for cyber-security, specifically standards and a knowledge-based culture.
The Administration is moving in the right direction by working with states and private companies to develop standards and guidelines to drive a more secure power grid, but we still have a ways to go before our critical infrastructure is adequately protected. For now, states like California are making noticeable progress on smart grid adoption, and private companies like Portland General Electric are making similar progress securing their infrastructure with security intelligence solutions. However, the vast majority of the industry is still operating in the dark, as revealed in a recent study by the Ponemon Institute, “State of IT Security: Study of Utilities and Energy Companies.” This study found that nearly half of global energy organizations did not view IT Security as a strategic initiative.
You’ve heard this before – but a cyber-terrorism attack would have a catastrophic impact on the nation’s electric grid, shutting down critical businesses, slowing our ability to respond locally with law enforcement, disabling cell phones and other communication devices, and more. U.S. Defense Secretary Leon Panetta recently warned that “The next Pearl Harbor we confront could very well be a cyber attack that cripples our power systems, our grid, our security systems, our financial systems, our governmental systems”.
Clearly our power grid (smart grid or not) is vulnerable to attack. Hopefully, as we move closer towards broader smart grid adoption, the industry will make progress adopting security intelligence solutions to help protect our critical infrastructure assets. Do you think the electric and power industry is prepared to adequately protect itself from attacks?
Recently, Gartner published a new report titled “Strategies for Dealing With Advanced Targeted Threats”. The message in this report is how to strategically deal with ATTs (Advanced Targeted Threats), which is Gartner’s expanded definition of APTs (Advanced Persistent Threats) in order to emphasize the focused nature of these high-magnitude attacks. A lot of emphasis is placed on the need for network activity monitoring, to the extent of even calling out “flows”, as we also saw in this year’s SIEM Magic Quadrant report.
Below is a breakdown of the report, beginning with The Problem definition:
- The term “advanced persistent threat” (APT) has been overhyped in the press and is distracting organizations from a very real problem. Targeted attacks are penetrating standard levels of security controls and causing significant business damage to enterprises that do not evolve their security controls. Gartner estimates that, for the average enterprise, 4% to 8% of executables that pass through antivirus and other defenses are malicious. Enterprises need to focus on reducing vulnerabilities and increasing monitoring capabilities to deter or more quickly react to evolving threats, and not focus on what country the attacks are coming from.
A major point that supports SIEM in general, and flows/behavior anomaly detection in particular, is the analyst’s portrayal of “lean-forward” planning. This approach is especially needed in Critical Infrastructure: the recent Ponemon survey pointed out the discrepancy in spending between physical and IT security, a balance that is only now beginning to shift because of the potential for APTs/ATTs.
Here are some more highlights from the Gartner report:
- Advanced attacks (often called “advanced persistent threats”) are using techniques that demand an evolution of existing defenses, and an introduction of new security controls and processes. Enterprises need to focus on the effectiveness and efficiency of their infrastructure protection approaches.
- Simply adding more layers of defense does not necessarily increase security against targeted threats — security controls need to evolve.
- Gartner estimates that, for the average enterprise, 4% to 8% of executables that pass through antivirus and other defenses are malicious.
- Enterprises need to focus on reducing vulnerabilities and increasing monitoring capabilities to deter or more quickly react to evolving threats, and not focus on what country the attacks are coming from.
- All the innovative techniques used in these attacks are detectable. One key to preventing their success is to focus on avoiding, minimizing or shielding the vulnerabilities they are exploiting.
- Security information and event management (SIEM) products or other approaches that correlate information across defense “silos” should be used to gain better exception monitoring capabilities.
- A lean-forward, continuous monitoring process includes the following steps:
1. Establish a baseline.
2. Update threat information.
3. Monitor and inspect network traffic and host logs.
4. Investigate possible threat activity.
5. Activate an incident response process, or update defenses or work-arounds.
6. Go to Step 1.
- Some SIEM and next-generation firewall products have added some of the flow analysis features of network behavior analysis.
- You must be prepared to invest in and staff lean-forward processes.
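The six-step continuous monitoring process above can be sketched as a loop. This is a hypothetical skeleton: `collect_events`, `detect`, `respond`, and `update_intel` are placeholder callables standing in for real log collection, correlation, incident response, and threat-intelligence feeds:

```python
def lean_forward_loop(collect_events, detect, respond, update_intel, cycles=1):
    """Run the lean-forward monitoring cycle a fixed number of times.

    A production deployment would run continuously; `cycles` keeps the
    sketch demo-friendly.
    """
    baseline = collect_events()                        # 1. establish a baseline
    handled = []
    for _ in range(cycles):
        intel = update_intel()                         # 2. update threat information
        events = collect_events()                      # 3. monitor network traffic and host logs
        for incident in detect(events, baseline, intel):  # 4. investigate possible threat activity
            respond(incident)                          # 5. incident response / update defenses
            handled.append(incident)
        baseline = events                              # 6. go to step 1 with a refreshed baseline
    return handled
```

Note how the baseline is refreshed every cycle: the “go to Step 1” in Gartner’s list is what makes the process continuous rather than a one-time assessment.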
Bottom line: Advanced Targeted Threats are front and center in our minds, and this report emphasizes important elements of a responsive strategy for dealing with these threats. It is a must read for all information security professionals concerned with staying ahead of these threats, especially in Critical Infrastructure.
Read more about how Q1 Labs’ Security Intelligence has been protecting Critical Infrastructure customers in our recent release, “A Year on from Stuxnet, More than 100 Critical Infrastructure Customers Rely on Q1 Labs for Security Intelligence.”