Clear Metrics for Cloud Security? Yes, Seriously

Since the publication of my first article -- Cloud Security: Danger (and Opportunity) Ahead -- it has seemed that new information and cloud solutions appear daily. I'm gratified, for example, to see that NIST, the National Institute of Standards and Technology, has published its 15th draft on cloud computing and, with it, agreed with much of the definition I proposed in the previous article: "Service-based data processing and storage capability which is flexible, extensible and virtual."

NIST suggested cloud computing has the following salient characteristics: "On-demand self-service, based upon ubiquitous network access, using location-independent resource pooling; feature rapid elasticity and provide a measured service."

It's interesting to note that NIST specifically called out the piece about the service having to be measured. I wholeheartedly agree and take this to be a step in the maturity of cloud computing.

Security Models

The Jericho Forum proposed an interesting approach to cloud computing security. It starts with a description of cloud layers that lets us envision the problem. The forum proposed that security (and identity management) are elements that cross all layers, and in effect provide a design it calls Collaboration Oriented Architecture (COA).

Once this foundation was laid, the forum defined cloud security as a cube-shaped model that highlights the various architectural possibilities. The one addressed here is, of course, the outsourced/external/de-perimeterized option. At about the same time, the Cloud Security Alliance, of which I am a member, designed a not-too-different view. The CSA broke cloud computing down into three delivery types:

  1. Infrastructure as a Service (IaaS)
  2. Platform as a Service (PaaS)
  3. Software as a Service (SaaS)

It then proceeded to define the cloud consumption models:

  1. Private
  2. Public
  3. Managed
  4. Hybrid

The CSA's model of service delivery stacks, however, is exceedingly complex. While I do not disagree with the reference model, allow me to define the problem statement a bit differently here. Let's expand the basic three tenets of security:

  1. Confidentiality
  2. Availability
  3. Integrity

Clearly, in the case of cloud computing, and especially in the public/external case, we no longer have any control. Once the bits "leave our network," control passes elsewhere. Losing one control typically mandates an increase in the others. Here, we have another set of problems. Let us explore the remaining controls:

Confidentiality

Typically, we handle confidentiality through technologies such as encryption and access control. We can still encrypt, but imagine what happens to a large data set. It has to be sent, or assembled, in the cloud, remain there in encrypted form, and be transferred back to us for processing.

Once the data is at our location, we have to decrypt it, perform the operations needed, then re-encrypt and resend it to the cloud. Doable, yes. But the performance tax here is huge. While today's routers and servers no longer see their performance cut to one-sixth by encryption, we still pay a heavy price.
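The round-trip described above can be sketched in a few lines. This is a minimal illustration of the workflow only: the XOR "cipher," the key, and the bucket name are all hypothetical stand-ins (a real deployment would use an algorithm such as AES), chosen to keep the sketch self-contained.

```python
from itertools import cycle

def toy_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR keystream standing in for a real cipher such as AES.
    Illustrative only -- NOT secure. XOR twice with the same key
    returns the original bytes, so it doubles as the decryptor."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

KEY = b"local-secret-key"   # hypothetical key; never leaves our premises
cloud_store = {}            # hypothetical remote bucket

# 1. Encrypt before the bits "leave our network," then upload.
plaintext = b"large data set" * 1000
cloud_store["bucket-1"] = toy_cipher(plaintext, KEY)

# 2. To process the data, pull the ciphertext back and decrypt locally.
local_copy = toy_cipher(cloud_store["bucket-1"], KEY)
assert local_copy == plaintext

# 3. Perform the needed operation, re-encrypt, and re-upload.
processed = local_copy.upper()
cloud_store["bucket-1"] = toy_cipher(processed, KEY)
```

Every step in the cycle (download, decrypt, process, re-encrypt, upload) is paid again on each use of the data; that repetition is the performance tax.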

One other element within confidentiality is the ability to destroy data. In a cloud that we do not own, and on storage media that we do not control, there is a high probability that the same media will be used for other purposes. These storage buckets are dynamic, and the service/platform/application provider might allocate them to other users.

This sharing, and in many cases repeated sharing, of storage media leads to the need for assured destruction. We must follow a strict regime that states how long data is to be kept, when and by whom it is destroyed, and how such destruction is verified. Since degaussing tapes and shredding CDs is out of the question, we must employ more agile software (or, dare we say, hardware?) based methods to assure that destruction.
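One software-based method often discussed for exactly this situation is crypto-shredding: encrypt each data set with its own key kept on premises, and "destroy" the data by destroying the key. The sketch below assumes hypothetical names throughout, and its keystream cipher is a toy stand-in for a real algorithm.

```python
import os
import hashlib

keys = {}    # local key store we control; the cloud never sees it
cloud = {}   # hypothetical remote storage

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # SHA-256-derived keystream; stand-in for a real cipher, illustrative only
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(d ^ s for d, s in zip(data, stream))

def store(name: str, data: bytes) -> None:
    keys[name] = os.urandom(32)               # one fresh key per data set
    cloud[name] = toy_encrypt(data, keys[name])

def shred(name: str) -> None:
    """Assured destruction: without the key, the ciphertext left on the
    provider's (possibly reused) media is unrecoverable."""
    del keys[name]

store("q3-report", b"sensitive figures")
shred("q3-report")
assert "q3-report" not in keys and "q3-report" in cloud
```

The appeal of this approach is that the destruction step happens entirely on media we do control, so it can be verified without the provider's cooperation.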

This question becomes infinitely more complicated when we consider that data at rest does not necessarily "rest" on a certain part of a certain hard drive. The data can, and usually does, move between storage locations on the drives. The onus is still on us to assure confidentiality, but we don't manage the drives. The only practical solution here is to demand regular scrubbing of storage media from the service providers. Do we think such a requirement is feasible?

Finally, lest someone think I am only talking about the storage aspect of cloud computing, the above discussion is easily applicable to processing in a cloud as well.

Availability

When dealing with a cloud-computing resource, we are at the mercy of the network, the remote server, and whatever controls are applicable along the way, be they host- or network-related.

Yes, we always were at the mercy of such risks, but we owned them before. At what point does the enterprise take notice? As we can see from recent, published outages at Google and elsewhere, users are very sensitive to the information they require, and rightly so.

Even when taking steps to "assure" access, which in reality translates into reducing exposure to this particular risk, we have typically resorted to building redundancy into the system.

Here, that would presumably add lines, servers, networking equipment and personnel. Doable, but at what cost? What does the complexity of redundancy mean to an organization? What is the true cost of operations?

Let's look at an example: We have a volume of data that at times swells by a factor of 10, so cloud computing seems like the perfect solution. Here is what may happen:

  1. We ask the cloud service provider for burst capacity in data storage. We will estimate this payment at 10 percent of our regular cloud computing cost.
  2. We ask our network services provider to create another redundant, highly available path to the cloud service provider. We will estimate that cost at 25 percent of our regular data communications cost.
  3. And now we must consider what we are to do if such a data burst occurs when we have no availability to send it to the cloud. Are we going to dispose of it? Cease operations? No and no. So we must plan for (at least) the local storage of such data regardless of whether we use cloud computing services.
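The three steps above can be turned into a back-of-the-envelope cost model. The percentages are the estimates from the list; the base costs and the local-fallback figure are hypothetical placeholders, there only to make the arithmetic concrete.

```python
# Hypothetical annual base costs (placeholders for illustration)
base_cloud_cost = 100_000    # regular cloud computing spend
base_network_cost = 60_000   # regular data communications spend

burst_storage_premium = 0.10 * base_cloud_cost     # step 1: +10%
redundant_path_premium = 0.25 * base_network_cost  # step 2: +25%
local_fallback_cost = 20_000                       # step 3: storage we keep anyway

total_premium = (burst_storage_premium
                 + redundant_path_premium
                 + local_fallback_cost)
print(f"Availability premium: ${total_premium:,.0f}")
# 10,000 + 15,000 + 20,000 = 45,000
```

Note that step 3 is a cost we would carry even without the cloud, which is precisely the article's point: redundancy for availability is never free, and some of it cannot be outsourced at all.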

Integrity

We can detect changes after they are made. From hashing to redundancy checks, from digital signatures to trip-wiring, we are able to ascertain that a change occurred. But we can no longer prevent changes.
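The hashing approach mentioned above is simple to sketch: record a digest before the data leaves our control, and compare on return. The record contents here are hypothetical; the point is that the check flags tampering but does nothing to stop it.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest recorded locally before data leaves our control."""
    return hashlib.sha256(data).hexdigest()

original = b"ledger entry: $1,000"
stored_digest = fingerprint(original)   # kept on premises

# Later, the copy comes back from the cloud -- possibly altered in transit
# or at rest (hypothetical alteration shown here).
returned = b"ledger entry: $9,000"
tampered = fingerprint(returned) != stored_digest
assert tampered   # we can detect the change -- but we could not prevent it
```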

The bastion of defense-in-depth crumbles when we talk about cloud computing. We do not own the moats, the walls, or the doors. Accepting data without verification should be unthinkable, yet verifying all inbound data will be complex and costly, adding yet another layer to the mix of technologies and methodologies we must wrangle.

Indeed, the cloud unchecked could lead to a wave of new attacks aimed directly at data whose guardians (by virtue of possession) are not incentivized to protect it from change, only to speed it on its way.

Cloud computing could be a gold rush for people designing man-in-the-middle attacks, too. While most hosting companies will boast of their monitoring and security, few, if any, can assure you that they have never been compromised. In fact, a cloud data provider, with its already built-in doorway (or tunnel) to you, makes an attacker's life easier.

They can now both alter the data AND assure that it, and associated payloads, make their way to the intended destination.

So even if we are the best-meaning of CIOs, and the furthest thing from our minds is flouting the law, we are faced with a few obstacles. Let's state some, in no particular order:

  1. How do we comply with breach notification laws?
  2. What happens if we hold data regarding an EU national?
  3. What must we do when we disclose risk information to auditors? To the SEC?
  4. How do we comply with rules relating to CALEA? E-discovery? Data forensics?

Lastly, we do remember that data has a lifecycle. That lifecycle (DPLC) mandates, ultimately, that the data be disposed of in a secure manner. Remember those cloud buckets? They must be certifiably erased when we are done with them. How do we do that in a cloud?

If we remember the example we used above, authenticity of data is a problem that must be addressed.

Sometimes seen as a combination of non-repudiation, integrity and accountability, authenticity is a super-set that defines the reliability we assign and the trust we place in our data.
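One common building block for that super-set is a keyed message authentication code, which binds integrity to a key only we (or a trusted party) hold, so a valid tag also vouches for origin. A minimal sketch, assuming a hypothetical shared key and record:

```python
import hmac
import hashlib

SECRET = b"key-held-outside-the-cloud"   # hypothetical shared secret

def tag(data: bytes) -> bytes:
    """HMAC-SHA256 tag over the data: integrity plus origin
    authenticity in one check."""
    return hmac.new(SECRET, data, hashlib.sha256).digest()

def is_authentic(data: bytes, t: bytes) -> bool:
    # constant-time comparison to avoid timing side channels
    return hmac.compare_digest(tag(data), t)

record = b"quarterly filing"
record_tag = tag(record)     # computed before the record enters the cloud

# On retrieval from the cloud, verify before placing any trust in it.
assert is_authentic(record, record_tag)
assert not is_authentic(b"quarterly filing (edited)", record_tag)
```

A tag alone does not give full non-repudiation (anyone holding the shared key could have produced it); for that, asymmetric digital signatures are the usual next step.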

Should data in/from a cloud be seen as less-trusted data? If so, is there any worth to it? Would the cloud end up being used only for data we couldn't care less about?

Only time will tell.

Ariel Silverstone is a veteran of the Israeli Defense Forces with experience in physical and information security and is a regular contributor to CSO Magazine. During his IT and management consulting career, he focused on providing IT strategy, engineering, and assimilation solutions for a portfolio of primarily Fortune 500 clients, including USAA, Chase Manhattan, Citibank, GTE, General Motors, Ford Motor Company, Vanguard Funds and others. He has also been a director at Symantec Corp. and CISO for Temple University.

This story, "Clear Metrics for Cloud Security? Yes, Seriously" was originally published by CSO.
