Thursday, July 28, 2016

"Run-time Cyber Economics – Applying Risk-Adaptive Defenses"

G2, Inc. has been involved in OpenC2 for quite some time. This is a great read relating to active cyber defense and OpenC2.

Run-time Cyber Economics – Applying Risk-Adaptive Defenses

Posted by: CyberSecurityChief Categories: Active Cyber Defense Articles

Well, it has been a long break since the last article of this series but I feel duty-bound to do this third article on cybersecurity investment since I find the possibilities resulting from a “risk-adaptive” security approach to be compelling. Generally cyber defenses must be pre-planned with cost-benefits carefully weighed prior to investing in new tools to bolster defenses. However a risk-adaptive approach can change cyber investment to a fluid, service-oriented, run-time decision that can be made using well-understood economic principles. Learn how risk-adaptive defenses can raise the quality of an organization’s security posture while reducing capex and opex.

Today’s new defense strategies are focused on hunting and mitigating threats. In general, the threat vectors don’t change greatly; however, malware is being packaged and delivered in ways designed to evade detection and deceive users. Some examples of this malware trend can be found here and here. In reaction to this trend, cyber protection systems are beginning to move away from static signature-based approaches (Intel’s pending sale of McAfee anti-virus is one sign of this waning faith in static signature-based defenses) toward an integrated proactive model based on a range of narrow-aperture collectors feeding big data behavioral models that can sense anomalies. These sense-making models output alerts to cyber decision-making systems that produce courses of action (COAs). The COAs are implemented by orchestrators, which synchronize detection mechanisms and instigate mitigation services, such as sending updates to next-gen firewalls or signaling a suite of other software-based protection services. These services are designed to respond quickly to stop attacks or prevent data breaches.

This is all well and good, but how do we balance the investment in these different tools against the risk posture and scale of the organization? That is, how do these tools reduce the cyber value-at-risk at a rate that makes investment in them worthwhile? This question was recently highlighted in a “Security Challenges” market landscape report by SDxCentral. When asked to indicate all of their major security challenges, respondents to the SDxCentral survey identified no single overwhelming problem: 49% cited “Lack of visibility,” followed by the “Cost effectiveness of security solutions at scale” at 44%. Also evident from this survey is that organizations lack common measures to quantify cyber risk, curtailing their ability to make clear strategic decisions about optimal cybersecurity investment levels.
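The collector-to-orchestrator flow described above can be sketched in a few lines. This is a minimal illustration, not any specific product or standard: the Alert fields, the decide_coa rules, and the Orchestrator class are all invented names for the purpose of the example.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    source: str          # which collector / host raised the alert
    category: str        # e.g. "beaconing", "exfiltration"
    confidence: float    # 0.0 - 1.0 score from the behavioral model

def decide_coa(alert):
    """Map a sense-making alert to a course of action (COA)."""
    if alert.confidence < 0.5:
        return {"action": "monitor", "target": alert.source}
    if alert.category == "exfiltration":
        return {"action": "block_egress", "target": alert.source}
    return {"action": "quarantine", "target": alert.source}

class Orchestrator:
    """Fans a COA out to the protection services that implement it."""
    def __init__(self):
        self.dispatched = []

    def execute(self, coa):
        # In practice this would push updates to firewalls, EDR
        # agents, and other software-based protection services.
        self.dispatched.append(coa)
        return coa

orch = Orchestrator()
coa = orch.execute(decide_coa(Alert("host-17", "exfiltration", 0.9)))
```

The point of the sketch is the separation of concerns: sensing, deciding, and acting are distinct services, which is what makes run-time investment decisions possible at each stage.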
Adaptive Cyber Defenses Can Be Effective in Reducing Attacker Dwell Time and Minimizing Loss to Cyber Intrusions

One aspect of cyber security that is being measured is the cost of breaches. Studies have shown that the cost impact to the business grows exponentially the longer an attack goes undetected. This is where a proactive cyber strategy can help. The benefit of a proactive cyber strategy lies in its ability to drastically reduce the “dwell time” of an attacker on compromised platforms by accelerating the OODA loop. This reduction in dwell time inhibits the attacker’s ability to pivot and move laterally across the network to cause more harm and drive up costs. However, a successful proactive cyber strategy depends on overcoming four main challenges:

1 – As you might surmise, the sense-making process is often a bottleneck: there is too much data and not enough context provided by the collectors. The sense-making process must enrich the collected data at a rate and level of accuracy (i.e., low false positive rate) that matches the cyber threat. One method of enrichment is to use one or more (more is preferred) threat intelligence sources. Cyber threat intelligence providers supply Indicators of Attack (IOA) and/or Indicators of Compromise (IOC) to help direct the sense-making service on what to look for and where to look.
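That enrichment step is easy to picture in code. The sketch below matches raw events against an indicator feed; the feed contents, event fields, and function name are illustrative only (the addresses come from documentation-reserved ranges).

```python
# Hypothetical threat-intel feed keyed by indicator (IP or hostname).
ioc_feed = {
    "198.51.100.7": {"campaign": "example-apt", "severity": "high"},
    "bad.example.net": {"campaign": "example-apt", "severity": "medium"},
}

def enrich(event, feed):
    """Attach intel context to a collector event when an IOC matches."""
    intel = feed.get(event.get("dst_ip")) or feed.get(event.get("dst_host"))
    enriched = dict(event)
    enriched["intel"] = intel            # None means no indicator matched
    enriched["suspect"] = intel is not None
    return enriched

hit = enrich({"src_ip": "10.0.0.5", "dst_ip": "198.51.100.7"}, ioc_feed)
miss = enrich({"src_ip": "10.0.0.5", "dst_ip": "203.0.113.9"}, ioc_feed)
```

Real enrichment pipelines pull from multiple feeds and score the match, but the shape is the same: context is joined onto the event before it reaches the decision-making service, which is what keeps the false positive rate down.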

2 – The second major challenge is defining COAs. There are two main issues here:

The first issue is that there is no mutually understood, generally accepted, machine-readable, and shareable language spanning the different IT organizations and the business side involved in COA development and incident response, one that allows all parties to really connect, perform critical impact and root-cause analysis, make faster and more efficient decisions, implement response strategies, and, ultimately, work with less friction. A standards effort that is working to help in this area is the OpenC2 COA Standardization WG. The OpenC2 working group, a partnership between NSA, DHS, and industry, is initially focused on defining a language at a level of abstraction that will enable command and control of the cyber defense entities that execute the actions, with enough generality to provide flexibility in device implementations and accommodate future products. This effort, if successful, should help reduce the upfront investment cost of COA development by defining a common language for COAs and for sharing them.
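To make the abstraction level concrete, here is what an OpenC2-style command might look like. The specification was still being drafted at the time of writing, so the field names below follow the general action/target/actuator pattern the working group has discussed and should not be read as normative; the asset ID and network range are invented.

```python
import json

# Illustrative OpenC2-style command: deny traffic to a network range,
# directed at a hypothetical packet-filtering actuator "edge-fw-01".
command = {
    "action": "deny",
    "target": {"ipv4_net": "198.51.100.0/24"},
    "actuator": {"slpf": {"asset_id": "edge-fw-01"}},
}

wire = json.dumps(command)      # what would travel to the device
decoded = json.loads(wire)      # what the actuator would act on
```

The value of such a language is that the same command can be sent to any conformant firewall or protection service regardless of vendor, which is exactly the interoperability that today's per-product scripting lacks.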

The second major issue is identifying the specific courses of action that need to be performed for a given intrusion set to enable a given set of services that protect a given mission or business system. This issue requires algorithm development. A variety of mathematical theories can be used to model and analyze cybersecurity. Resource-allocation problems in network security can be formulated as optimization problems. In dynamic systems, control theory is useful for formulating the dynamic behavior of the system. Game theory also provides rich mathematical tools and techniques for expressing security problems. This DTIC report highlights some of the issues and an approach to COA algorithm development.
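A toy version of the resource-allocation formulation makes the idea tangible: pick the subset of candidate mitigations that maximizes expected risk reduction without exceeding an operational cost budget. The mitigation names, costs, and risk-reduction values below are invented for illustration, and the exhaustive search stands in for the real optimization machinery.

```python
from itertools import combinations

# Hypothetical candidate mitigations for one intrusion set.
mitigations = [
    ("patch_host",      {"cost": 3, "risk_reduction": 5.0}),
    ("isolate_segment", {"cost": 6, "risk_reduction": 8.0}),
    ("reset_creds",     {"cost": 2, "risk_reduction": 3.0}),
]

def best_coa(options, budget):
    """Exhaustive search over subsets; fine for a handful of options."""
    best, best_value = (), 0.0
    for r in range(1, len(options) + 1):
        for subset in combinations(options, r):
            cost = sum(m["cost"] for _, m in subset)
            value = sum(m["risk_reduction"] for _, m in subset)
            if cost <= budget and value > best_value:
                best, best_value = subset, value
    return [name for name, _ in best], best_value

chosen, value = best_coa(mitigations, budget=8)
```

This is a knapsack-style formulation; at realistic scale one would reach for integer programming or the game-theoretic models mentioned above, but the decision being automated is the same trade-off.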

3 – The third challenge involves integrating the tools needed to automate the COAs. This challenge is being addressed through community efforts led by NSA, DHS, and Johns Hopkins Applied Physics Lab. In addition, work by OASIS’ STIX provides COA, Incident, Threat and other related schemas which can be leveraged by tools seeking interoperability across threat data, COA, and cyber impact. See figure below for an example.

4 – The fourth major challenge involves culture change – i.e., overcoming a lack of confidence in automating the decision-making of cyber response. Generally, most organizations will insist on having a human in the loop of the COA decision-making process until confidence is well established in the COA algorithms. Having a vetting process will also be essential prior to sharing COAs or accepting shared COAs from other organizations.
Adaptive Defenses Require OODA Loops at Each System [and Business] Layer

As pointed out by Emami-Taba et al., it is necessary to take a holistic approach when implementing adaptive defenses. Depending on the architecture layer, the source of the data to be monitored is different, and the adaptive cyber decision-making and responses are different. For example, to detect a cyber attack at the network level, the data to be monitored can be packet data, network traffic, etc. Intrusion detection systems are a cyber detection, decision-making, and response mechanism at the network layer. Intrusion-detection systems can take adaptive actions such as intensifying monitoring efforts when malicious behavior is detected. Likewise, at the application layer, a cyber attack can be detected from various data sources. For example, the system can monitor the number of transactions by a specific user or the access rights of a user to a particular piece of sensitive data. An adaptive access control system may prevent access to the data if the behavior of the user appears abnormal. Therefore, COAs should not be limited to actions in only one layer of the systems being defended but should cover the system stack from top to bottom. This holistic approach mirrors the advances made by tool vendors in their newer approach to malware: security companies are aiming lower in the system stack, essentially running their software in a position where they can observe all activity on the device – examples include Tanium and Bromium. However, it is also important to connect adaptive defenses to the business layer so mission dependencies can be evaluated, business disruption can be assessed, the value-at-risk can be determined, and appropriate risk mitigation action can be taken – i.e., what is the risk impact to the business of a particular attack and COA response? The answer to this question is what the respondents to the SDxCentral survey were searching for.
Risk-Adaptive Defenses Relate Protections to an Economic Model

“Risk-adaptive” defenses can be used to help provide visibility and governance to cyber defenses since they can quantify risk and manage allocation of protections using an economic model. One example of a risk-adaptive approach is Fuzzy MLS, an access control model which in a limited context can be used to quantify risk associated with information access. The ability to quantify risk makes it possible to treat risk that an organization is willing to take as a limited and countable resource. This enables the use of a variety of economic principles to manage the resource (risk) allocation with the goal of achieving the optimal utilization of risk, i.e., allocate risk in a manner that optimizes the risk vs. benefit trade-off.

According to Pau-Chen Cheng, lead author of Fuzzy MLS: “when a security administrator creates the [access control] policy, she is guessing and codifying what risk-benefit trade-offs will be acceptable for information accesses that will happen in the future. Clearly, for an organization with dynamic needs the future risk-benefit trade-offs are not predictable and the guesses made about future risk-benefit trade-offs, encoded in the security policy are likely to be in conflict with the real risk-benefit trade-offs at the time of access.”

The main feature of Fuzzy MLS is that it treats access control as an exercise in risk management, where access control decisions are made on the basis of risk, risk tolerance, and risk mitigation, with risk carrying the usual connotation of expected damage. Viewed in terms of risk, the process of setting a traditional access control policy is actually determining a fixed trade-off between the risk of leakage of sensitive information and the organization’s need to provide such information to its employees so they can perform their jobs. This fixed trade-off sets up a non-adaptive, binary access control decision model in which accesses have been pre-classified as having either acceptable or unacceptable risk, and only the accesses with acceptable risk are allowed.

Fuzzy MLS devises a way to compute an estimate of the risk associated with an access by quantifying the “gap” between the subject’s value and the object’s value. With these quantified estimates of risk, a risk scale can be built such that each access is associated with a point on the scale. With such a scale, the access control model can be made risk-adaptive by adjusting the point of trade-off on the scale as the needs and environment change. Fuzzy MLS goes one step further by expanding this point of trade-off into a region on the scale. An access associated with a point below the lower bound of the region (also called the soft boundary) is allowed; an access associated with a point above the upper bound of the region (the hard boundary) is denied. The region is further divided into bands of risk such that each band is associated with one or more risk mitigation measures or courses of action. An access located in a band is allowed only if the risk mitigation measures / courses of action associated with that band can be applied to the access. Thus, the Fuzzy MLS model depicts a risk management system that resembles a fuzzy control system, hence the name “Fuzzy MLS.”
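The soft boundary, hard boundary, and mitigation bands described above lend themselves to a short sketch. The boundary values, band edges, mitigation names, and the simple gap-based risk formula below are all illustrative choices, not the actual Fuzzy MLS calculus.

```python
SOFT, HARD = 2.0, 8.0      # soft and hard boundaries on the risk scale
BANDS = [                  # (upper edge of band, required mitigation)
    (4.0, "log_and_audit"),
    (6.0, "supervisor_approval"),
    (8.0, "redact_sensitive_fields"),
]

def access_risk(subject_level, object_level):
    """Quantify the 'gap' between subject and object sensitivity."""
    return max(0.0, float(object_level - subject_level)) * 2.0

def decide(risk, available_mitigations):
    """Risk-adaptive decision: allow, deny, or allow-with-mitigation."""
    if risk < SOFT:
        return ("allow", None)                # below the soft boundary
    if risk >= HARD:
        return ("deny", None)                 # above the hard boundary
    for upper, mitigation in BANDS:           # inside the banded region
        if risk < upper:
            if mitigation in available_mitigations:
                return ("allow", mitigation)
            return ("deny", None)

decision = decide(access_risk(2, 4), {"supervisor_approval"})
```

Making the model adaptive then amounts to moving SOFT, HARD, and the band edges at run time as the threat environment and business needs change, rather than rewriting the policy itself.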

One of the keys to a risk-adaptive system such as Fuzzy MLS is coming up with values that can be used to quantify the subject/object gaps and perform risk trade-offs. I suggest that two types of values be assigned to cover two scenarios: 1) for access control scenarios, the value of an asset to the organization could be applied; and 2) for attack scenarios, the value of an asset to an attacker could be applied. In the former case, the value of an asset can be established through well-known processes such as business impact analysis, where the levels of confidentiality, integrity, and availability can be ascertained and translated into economic terms. In the latter case, a different approach is needed, as a seemingly worthless piece of data from an organizational perspective might be extremely valuable to an attacker seeking to profile and phish a target. This attacker view of asset value requires intelligence about how attackers rate and target your assets.

The perishability or shelf life of an asset should also be determined. In this way you can adaptively change protections over a time span when the value of the asset changes – e.g., M&A plans have a relatively short shelf life and access protections need to increase around these plans during that time period. In the same regard, a vulnerability can be considered an asset from an attacker’s perspective that has a shelf life. In contrast, detections against weaponizer artifacts are defender assets that have a very durable shelf life.
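One simple way to model that perishability is to let an asset's value decay over time, with the half-life encoding how perishable it is. The decay model and the numbers below are an illustrative assumption, not a claim about how any organization should value M&A plans.

```python
def asset_value(initial_value, age_days, half_life_days):
    """Exponential decay of asset value; a short half-life models a
    highly perishable asset such as pre-announcement M&A plans."""
    return initial_value * 0.5 ** (age_days / half_life_days)

# Hypothetical M&A plans: highly valuable now, nearly worthless after
# the deal is public (modeled here with a 30-day half-life).
ma_now = asset_value(100.0, 0, 30)
ma_later = asset_value(100.0, 90, 30)   # three half-lives later
```

Feeding a time-decayed value into the risk scale means the soft and hard boundaries around an asset tighten and relax automatically as its shelf life runs out, which is exactly the adaptive behavior described above.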

To conclude, risk-adaptive cyber defenses can help in the prioritization and selection of courses of action by instantiating economic principles that reflect the true mission impact of a mitigation or remediation course of action.

Source: July 26th 2016

Wednesday, January 27, 2016

G2 invited to speak at the RSA Conference 2016

G2, Inc. was invited to speak at the RSA Conference 2016 for the second year in a row. G2 is most recently known as the prime contractor responsible for supporting NIST in the development of the Cybersecurity Framework (CSF). G2 cybersecurity engineers Greg Witte and Tom Conkle will present on G2’s experiences helping customers across the nation use the CSF.

The session, “Effectively Measuring Cybersecurity Improvement: A CSF Use Case,” highlights G2’s experience helping customers use the CSF to achieve measurable and continuous improvement. The session harnesses the momentum established by the 2014 release of the CSF, as organizations work with G2 to leverage the Framework in developing and maintaining effective cybersecurity program outcomes.

This session explains how G2 engineers helped an organization use the CSF as a common language for communicating program goals and activities among stakeholders. Through quantitative measurement of current and planned activities (including references to standards, e.g., ISO 27002), in alignment with senior executives’ priorities, G2 helped to clearly articulate gaps between their existing cybersecurity program and the one needed to achieve their risk objectives. 

The talk will illustrate how Board members were able to quickly understand the identified security deficiencies, enabling resource and planning discussions.

Join G2 in West Room 2014 on Thursday, March 3, 2016 at 9:10am PT to learn more about the CSF case study.