Tuesday, October 28, 2014

G2 Attending 6th Cybersecurity Framework Workshop


Over the next two days, Paul Green, Brian Hubbard, Tom Conkle, and several other G2'ers will be supporting and contributing to the 6th Cybersecurity Framework Workshop, hosted at the University of South Florida in sunny Tampa, FL.

The purpose: "Executive Order 13636, Improving Critical Infrastructure Cybersecurity, directed NIST to work with stakeholders to develop a voluntary framework for reducing cyber risks to critical infrastructure. Version 1.0 of the Cybersecurity Framework, released on February 12, 2014, was developed in an open manner with input from stakeholders in industry, academia, and government, including a public review and comment process, workshops, and other means of engagement.

In the time since the Framework's publication, NIST's primary goal has been to raise awareness of the Framework and encourage its use as a tool to help industry sectors and organizations manage cybersecurity risks.

The purpose of this workshop is to gather input to help NIST understand stakeholder awareness of, and initial experiences with, the framework and related activities to support its use. NIST is planning to release a formal Request for Information (RFI) asking for further feedback in these areas. Responses to the RFI will inform the workshop agenda.

Target Audience
Critical Infrastructure Owners and Operators and cybersecurity staff. Specifically those who have operational, managerial and policy experience and responsibilities for cybersecurity, technology and/or standards development for Critical Infrastructure companies." (Source: NIST.gov)

For additional (non-NIST-sponsored) open discussion about the Cybersecurity Framework, check out CForum.

Thursday, October 2, 2014

G2's Director of R&D, Dr. Pat Muoio, was recently interviewed by AFCEA Signal about cloud security.

Isolation Mechanisms Help Protect Data in Public Cloud

October 1, 2014
By Sandra Jontz

Usage has spurred growth in the virtualization market.

Explosive amounts of data and the strains on limited financial resources have prompted corporations and governmental agencies alike to explore joint tenancy in the cloud for storing, processing and transmitting data. But while good fences—or in this case isolation mechanisms—make good neighbors, in the virtual world of cloud security the idiom might not ring entirely true. In the public cloud arena, risks arise when organizations place their data in a cloud system but cannot control who their neighbors might be.

“There’s a risk that your data or your processes could bleed or be accessible from your cloud by your neighbors in a way you don’t intend them to be,” says Pat Muoio, director of research and development at G2 Incorporated in Maryland. “The kinds of mechanisms you need to protect against these risks of multitenancy are strong isolation mechanisms. A lot of virtualization systems provide isolation of your data and your processes from the next guy’s data and processes, but making sure that the mechanisms … are sound and strong, I think, is a key way to address this multitenancy risk.”

Cloud security risks run just as high as those in networking generally. "That's just a risk of [information technology] in general, not just a risk to cloud," says Muoio, who served as a senior executive supervising more than 100 researchers in the federal government and developed capabilities to operate safely in compromised environments. In addition, she provided strategic direction to secure wireless technology, resilient systems, trustworthy computing, science of security, cryptography and system design and analysis.

Putting all of the security burdens of network computing on the back of the “poor cloud” is not useful, she adds. “We have to think about what’s different about the cloud,” says Muoio, whose technical focus areas include cyber physical systems, cybersecurity and advanced data processing. “For the most part, in my mind, those differences only become acute when we’re talking about the public cloud,” Muoio continues. “When we start talking about putting your data somewhere else, I think the risks change a little.”

The cloud offers attractive, affordable solutions that do not require much upfront investment and can be paid for based on usage or through subscriptions. It will be a booming market: a study by Global Media and Entertainment Solutions for the Cloud reports that while the cloud market earned roughly $100 million in 2013, it is expected to grow nearly ninefold by 2020. The Office of Management and Budget (OMB) already requires federal agencies to adopt a "cloud first" policy when contemplating information technology purchases.

Generally, public cloud use appeals to researchers, smaller companies and individuals who might need a lot of computing power for short durations. It also is attractive for cloud bursting, when running an application on a private cloud or data center is not enough and a user needs to burst into a public cloud for a brief capacity spike. “You might need a lot of compute power for an hour or two, or only once a week or so. If you were to buy that size of a computer, it would be very expensive and you might not get as much use out of it to justify that expense,” Muoio explains. “A lot of big companies are actually slower to move to public clouds because they have richer internal resources and have a better understanding of their compute load, which is much more steady.”
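The rent-versus-buy economics Muoio describes can be roughed out with back-of-the-envelope numbers. All figures in this sketch are illustrative assumptions, not taken from the article:

```python
# Back-of-the-envelope comparison of renting burst capacity vs. buying hardware.
# Every figure here is an illustrative assumption.
HOURS_PER_WEEK = 2            # "compute power for an hour or two ... once a week or so"
WEEKS_PER_YEAR = 52
RENTAL_RATE = 25.0            # assumed $/hour for a large cloud instance
PURCHASE_COST = 150_000.0     # assumed cost of a comparable on-premises machine
LIFETIME_YEARS = 3            # assumed useful life of the purchased hardware

rental_total = HOURS_PER_WEEK * WEEKS_PER_YEAR * RENTAL_RATE * LIFETIME_YEARS
print(rental_total)                    # 7800.0
print(rental_total < PURCHASE_COST)    # True: occasional bursts favor renting
```

For a steady, well-understood load (the large-company case Muoio mentions), the hours-per-week figure grows until the comparison flips the other way.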

The growth of cloud usage spurred increased attention to and investment in virtualization, Muoio says. This is key to some possible solutions such as the growing trend of bring your own device (BYOD), in which employees use their own mobile devices such as cellular phones and tablets for work purposes. “You can save a lot of money if you work with virtual machines rather than be limited by the barriers that are on physical machines,” she says. “Absent virtualization, if I wanted to keep my work separate from your work, we’d have to put them on different physical computers. Whereas now, you use half a computer, I use half a computer, we can share it because we have these virtualization technologies.”

But access to the data—when users want it and how users want it—presents an additional concern. Data is stored off the cloud user's premises and in somebody else's space. There is a risk of not being able to gain access to the data if, for example, a network crashes. Consumers should conduct ample research when choosing suitable vendors to meet their needs, Muoio advises. "You would be doing poor due diligence in picking a contractor if you need a 99 percent availability and that vendor only offers 80 percent."
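Muoio's availability figures translate directly into allowed downtime per year, which makes the gap between the two SLAs concrete. A quick sketch (the percentages are from the quote above; the conversion is ordinary arithmetic):

```python
def annual_downtime_hours(availability_pct: float) -> float:
    """Convert an availability percentage into hours of allowed downtime per year."""
    hours_per_year = 365 * 24  # 8,760 hours, ignoring leap years
    return (1 - availability_pct / 100) * hours_per_year

# A 99% SLA still permits roughly 87.6 hours (about 3.7 days) of downtime a year;
# an 80% SLA permits about 1,752 hours (73 days).
print(round(annual_downtime_hours(99.0), 1))   # 87.6
print(round(annual_downtime_hours(80.0), 1))   # 1752.0
```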

Midsize and larger corporations have migrated toward using technology that takes them from a “recovery” of data mindset to a “resiliency” one. This technology provides seamless backup between data centers or access to cloud computing when one center is compromised and shuts down, says Matt Waxman, vice president of product management for the data protection and availability division of EMC Corporation. The company created the VPLEX technology, which the U.S. military uses as a backup system between data centers.

“There’s a big difference between recovery and resiliency, and VPLEX really plays into the resiliency. Whether it’s a power failure or a flood or a hurricane, … it keeps your applications online across two data centers without the need for any human intervention,” Waxman explains. “It’s a hardware and software solution that effectively can turn storage of data into this continuous availability model.”

Although technology such as VPLEX offers a recovery and resilience solution, Muoio points out that other techniques also are available. For example, if users rely on public cloud computing centers and access to multiple data centers is out of reach, they can mitigate problems through diligent tagging, such as specifying a date to delete stored data from the cloud, she advises.
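As one concrete illustration of the tagging technique Muoio describes, many public cloud storage services let customers attach lifecycle rules that delete data after a set period. The sketch below builds such a rule in the shape AWS S3's lifecycle API accepts; the rule name and prefix are hypothetical, and the dictionary is only constructed locally, not sent to any cloud provider:

```python
# A minimal sketch of a data-expiration lifecycle rule, shaped like the
# configuration AWS S3 accepts. Rule ID and prefix are hypothetical examples.
def expiration_rule(rule_id: str, prefix: str, days: int) -> dict:
    """Build a lifecycle rule that deletes objects under `prefix` after `days` days."""
    return {
        "ID": rule_id,
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Expiration": {"Days": days},
    }

lifecycle = {"Rules": [expiration_rule("purge-scratch-data", "scratch/", 90)]}
# With boto3, a dict of this shape could be applied via
# s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=lifecycle)
```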

“Understand the relationship and how much trust you are willing to put in [a company]. Put in the cloud the data that matches the trust you have in the system’s integrity. You can see companies making choices where they might put less sensitive data out there and keep their intellectual property in-house. Part of using these resources is understanding what they are good for, what they are too risky for.”