Archive for April, 2012
Posted by Michael Applebaum in Network Intelligence, Security Intelligence, Threat Management
Earlier this week, IBM announced a network behavioral analysis (NBA) extension for its Network IPS offering which is based on the QRadar Security Intelligence platform.
Using advanced behavioral analytics and anomaly detection, the new QRadar Network Anomaly Detection appliance continuously analyzes network traffic in real-time — using deep packet inspection and passive monitoring of Layer 7 flow data, performed by QFlow and VFlow Collectors — to rapidly identify and prioritize advanced threats such as zero-day attacks and “low and slow” data breaches, as well as more common attacks such as botnets and other malware infections.
In addition, the new appliance correlates its own behavioral information about network activity with alerts and events from the IBM Security Network IPS console, IBM SiteProtector. It also leverages contextual information – to aid in prioritizing the most critical threats – from additional sources including vulnerability assessments, user activity and identity information, and threat intelligence feeds.
By applying behavioral algorithms to network traffic data, the new appliance can immediately flag abnormal events such as:
- Outbound network traffic detected to regions where the company does not conduct any business.
- FTP traffic observed in a department that doesn’t regularly use FTP services.
- A known application running on a non-standard port, or in areas where it is not allowed (e.g. unencrypted traffic running in secure areas of the network).
- Hosts that are sending an abnormally high volume of packets, indicating a potential malware infection.
Prioritizing Threats and Gaining Greater Visibility
QRadar Network Anomaly Detection allows organizations to quantify multiple risk factors in order to evaluate the significance of a reported threat, for example the business value of the targeted assets and any vulnerabilities identified for those assets (missing patches, for instance). It leverages core QRadar functionality – such as auto-discovery of assets, protocols and services – to provide a comprehensive asset profile database and real-time network view that is continuously updated based on passive monitoring of network flows, without consuming bandwidth or impacting the network infrastructure.
Integrating QRadar Network Anomaly Detection with IBM Network IPS also provides IBM Network IPS customers with enhanced visibility into their data via QRadar’s Big Data capabilities such as instant search (Google-like indexing across large volumes of unstructured data) as well as sophisticated network security dashboards and pre-configured compliance reports.
Upgradeable to Full QRadar SIEM
QRadar Network Anomaly Detection will be upgradeable to the full-blown SIEM capabilities provided by QRadar SIEM. The full SIEM delivers additional capabilities including the ability to collect and correlate events from a wider range of sources such as firewall logs, Windows and Linux host logs, application logs, database activity monitoring and vulnerability assessment technologies such as IBM Guardium, and configuration/patch management systems such as IBM Security End-Point Manager (BigFix). QRadar SIEM also offers a more comprehensive library of pre-configured correlation rules, dashboards and compliance reports.
Leverages X-Force Threat Intelligence
Like QRadar SIEM, the new appliance receives IP Reputation data from IBM X-Force research, providing insight into suspect entities from a massive URL database containing information about more than 15 billion Web pages and images – believed to be the world’s 2nd largest URL database (after Google) – which are monitored and classified on a continuous basis.
The X-Force feed provides QRadar Network Anomaly Detection with a list of potentially malicious IP addresses such as malware hosts, spam sources, anonymous proxies and other threats. If the appliance sees any traffic to or from these sites, it can immediately alert the organization and provide rich contextual information about the observed activity.
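The abnormal-event checks listed above (traffic to non-business regions, FTP in a department that does not use it, a known application on a non-standard port, abnormally high packet volume) and the reputation-feed match described here, as an added illustration that is not QRadar code: the flow records, the per-department baseline, the region list and the reputation entry are all constructed, and the volume rule is a simple median multiplier rather than the appliance's own algorithm.

```python
# Added example: simple flow-based anomaly rules over constructed Layer-7 flow data.
# Nothing here is QRadar's implementation; baselines and feeds are made up for the demo.
from collections import defaultdict
from statistics import median

# Constructed flow records: (src_host, dept, app, dst_port, dst_region, dst_ip, packets)
FLOWS = [
    ("10.1.1.5",  "finance", "http", 80,   "US", "203.0.113.10",   120),
    ("10.1.1.6",  "finance", "ftp",  21,   "US", "198.51.100.7",   40),
    ("10.2.2.9",  "eng",     "ssh",  22,   "DE", "203.0.113.22",   90),
    ("10.2.2.10", "eng",     "http", 8081, "US", "203.0.113.30",   110),
    ("10.3.3.4",  "hr",      "http", 80,   "KP", "203.0.113.99",   100),
    ("10.1.1.7",  "finance", "http", 80,   "US", "192.0.2.44",     9000),
    ("10.2.2.11", "eng",     "http", 80,   "US", "198.51.100.200", 95),
]
BUSINESS_REGIONS = {"US", "DE"}                       # regions where the organisation operates
DEPT_APPS = {"finance": {"http"}, "eng": {"http", "ssh"}, "hr": {"http"}}  # learned baseline
STANDARD_PORTS = {"http": {80, 443}, "ftp": {21}, "ssh": {22}}
BAD_IPS = {"198.51.100.200"}                          # constructed reputation-feed entry

def flag_flows(flows, regions=BUSINESS_REGIONS, dept_apps=DEPT_APPS,
               std_ports=STANDARD_PORTS, bad_ips=BAD_IPS, volume_factor=10):
    """Return (src_host, reason) findings, one per rule hit, for a batch of flows."""
    findings = []
    for src, dept, app, port, region, dst, _pkts in flows:
        if region not in regions:
            findings.append((src, f"traffic to non-business region {region}"))
        if app not in dept_apps.get(dept, set()):
            findings.append((src, f"{app} not normally used by {dept}"))
        if app in std_ports and port not in std_ports[app]:
            findings.append((src, f"{app} on non-standard port {port}"))
        if dst in bad_ips:
            findings.append((src, f"traffic to reputation-listed IP {dst}"))
    # Volume rule: total packets per host compared with the median host, which is
    # robust to a single very noisy host dragging the mean upward.
    per_host = defaultdict(int)
    for src, *_, pkts in flows:
        per_host[src] += pkts
    med = median(per_host.values())
    for src, total in per_host.items():
        if med and total > volume_factor * med:
            findings.append((src, f"abnormal packet volume {total} (median {med})"))
    return findings

if __name__ == "__main__":
    for host, reason in flag_flows(FLOWS):
        print(f"{host}: {reason}")
```

Each finding would be one input to the prioritization step described in the next section, where asset value and known vulnerabilities decide which of these hits is looked at first.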
SNORT Compatibility
IBM also announced the newest version of its Network IPS, which now provides hybrid protection combining the open source capabilities and common rule syntax of SNORT with the broad protection found in IBM’s Protocol Analysis Module (PAM). This gives clients the ability to easily create and share custom IPS rules in a popular open source format while continuing to leverage IBM’s advanced network IPS capabilities.
Considered to be one of the industry’s most comprehensive threat detection engines, IBM’s PAM leverages packet, content, file and session inspection to go beyond the protection offered by traditional IPS technologies and defend against advanced threats such as browser attacks, data leakage and malicious web applications.
Since PAM is modular and extensible and does not depend solely on signature detection, new security protections can be easily added over time. For example, “shell-code heuristics” have been built into PAM to increase its ability to detect obfuscated or dynamic threats.
PAM is also fed updates from IBM X-Force, including protections for new vulnerabilities discovered by IBM’s X-Force R&D team as well as threat information obtained from the real-time monitoring of 12 billion security events per day and 20,000+ devices for IBM’s managed services clients in more than 130 countries worldwide.
IBM’s Vision for Advanced Threat Protection
This announcement demonstrates IBM’s commitment to evolving its IPS technology to provide advanced threat protection at the network layer, in combination with QRadar Security Intelligence and X-Force Threat Intelligence. This vision will continue to be expanded and delivered over time.
To read the full press release of the announcement, click here.
To read a detailed blog posting describing the benefits of combining IPS with Security Intelligence, click here.
Posted by Chris Poulin in Compliance, Federal, Security Intelligence, SIEM
Last week I participated in a panel on Continuous Monitoring at FOSE. Joining me were Mark Crouter from MITRE as the moderator, John “Rick” Walsh, chief of technology and business processes in the Cybersecurity Directorate of the Army’s Office of the CIO, and Angela Orebaugh, Fellow and Senior Associate at Booz Allen Hamilton. Auspicious company indeed.
For those not tuned into the federal government’s cybersecurity initiatives, the concept of continuous monitoring evolved from the previous approach in FISMA (the Federal Information Security Management Act), which mandated annual reviews of federal agencies’ security programs. After a few years of implementation it was widely recognized that the reviews generated rooms full of paper, which were obsolete as soon as they were printed, but didn’t elevate information security plan effectiveness to an acceptable level. Between 2006 and 2010, the number of security incidents rose by over 650%. The resulting strategy is embodied in FISMA 2012 (2.0), which is aimed at continuous monitoring of security controls, determining gaps between current and accepted security baselines, and quantifying risk.
Rick has been facing the challenges of implementing continuous monitoring within the government, and his experience has been that the different business processes, missions, and systems create obstacles, but once overcome, the solution yields financial and process efficiencies, and improved security. One of the biggest challenges is enumerating the assets, but once done is sure to reveal duplication of systems and opportunities to consolidate systems and software licensing.
Angela framed the conversation in her intro, which was appropriate since she co-authored NIST Special Publication 800-137, Information Security Continuous Monitoring for Federal Information Systems and Organizations. She has also been involved with the Security Content Automation Protocols (SCAP, pronounced ess-cap) project, which provides a set of standards for describing vulnerabilities (CVE, common vulnerabilities & exposures), systems (CPE, common platform enumeration), and configuration standards (CCE, common configuration enumeration), as well as a scoring system (CVSS), a test definition language (XCCDF), and a vulnerability definition language (OVAL). Angela advocated use of SCAP as a foundation for continuous monitoring.
Questions from the audience mainly focused on how to implement continuous monitoring, including getting buy-off from senior management and budgeting. The key is to show short-term results that are meaningful to business stakeholders. While continuous monitoring is in the process of being mandated, the danger is treating it as a checklist and doing the bare minimum to comply; whereas, when done right, continuous monitoring can be the cornerstone for real security improvements, including interrupting the kill chain through early attack detection, providing total visibility (including for troubleshooting operational problems), and giving management a security dashboard with both technical and business gauges. The State Department was one of the first successful adopters of continuous monitoring and was able to not only ameliorate their high-risk vulnerabilities by 90%, but also slash the cost of certification and accreditation by 62%.
One of the more amorphous questions was how continuous is continuous? Does data need to be analyzed in real-time or near real-time? Does this apply to all systems? The answer is that it depends on each individual agency’s goals and the telemetry that can be collected from the systems. Organizations don’t want to have to retool systems to provide events as they occur–unless the systems are critical enough to warrant that cost and effort and there is no other way to gain the needed visibility. The panel all agreed that some systems only need to report into a central monitoring solution on an occasional basis–vulnerability scanners, for example–while network monitoring should report in near real-time, which means in one-minute intervals for most systems that create NetFlow records. Ultimately, there is no one-size-fits-all answer.
My overall impression from the panel is that continuous monitoring to the federal sector is what we call Security Intelligence in private industry, and both need to be defined and implemented per the enterprise or agency’s specific needs. The primary difference is that continuous monitoring is focused on metrics: quantifying the delta between expected state of assets and the measured states and classifying these differences as vulnerabilities. The scorecard approach provides a common baseline for different organizations to compare themselves against each other, and for management to better understand their organizational security posture at any given moment in time and compare it against past performance.
I was asked at the GTRA conference how the public and private sectors differ. My view is that the government does more up-front analysis and planning, while the private sector sees a need and builds a solution. Between well-considered frameworks, like FISMA 2.0, and tools like QRadar and OpenPages, the federal government and industry have an opportunity to collaborate on a complete Security Intelligence solution incorporating continuous monitoring and meaningful security scorecards and dashboards.
Click here to learn how Security Intelligence can help Federal organizations address continuous monitoring requirements. Find out how QRadar Risk Manager addresses the need for configuration auditing, and assessing the risk of configuration changes, across multi-vendor network environments (switches, routers, firewalls and IDS/IPS).
Posted by Heather Howland in Cybersecurity, Security Intelligence, SIEM
Have your security practices been guided by old wives’ tales and horror stories of installations past? In this article for Security Week, Chris Poulin explains why it’s time to revisit your security posture, especially when it comes to SIEM and Security Intelligence. Don’t let superstition influence your strategy!
“Another area where superstitious habits aren’t effectively influenced through SFP [self fulfilling prophecy] is information security. And yet we continue to spend a good part of our security technology budget on the latest iteration of firewall technology–application firewalls, UTM gateways, data diodes–and anti-virus, the perennial favorite, even though conservative figures estimate that A/V protects endpoints from less than 50% of current malware. Granted, much of this spend is aimed at preventing data leakage, which is a positive shift from the perimeter defense strategy, designed primarily to keep out external threats.”
Read the full article to learn how a next generation SIEM, the cornerstone of Security Intelligence, can help keep your organization protected against today’s threats, and why Chris thinks trying to operate without Security Intelligence is equivalent to insanity!