Privacy and Visibility – The dichotomy of encryption and inspection

The encoding or encryption of communications and information is a very old practice, and the concept is relatively simple. One of the easiest examples is simply to reverse the alphabet: A for Z, B for Y and so on. The reversal rule is the 'key' to deciphering the message. We needn't go into the detailed but fascinating history of cryptography and of the key itself. Instead we only need to touch on a few historical milestones and how they have shaped the world today.
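As a quick illustration, here is a minimal sketch of that alphabet-reversal (Atbash-style) cipher in Python; the function name is my own and purely illustrative:

```python
import string

ALPHABET = string.ascii_uppercase              # 'A'..'Z'
ATBASH = str.maketrans(ALPHABET, ALPHABET[::-1])  # A<->Z, B<->Y, ...

def encipher(message: str) -> str:
    """Apply the alphabet-reversal cipher; non-letters pass through."""
    return message.upper().translate(ATBASH)

# The reversal table is its own inverse, so deciphering is just re-applying it.
print(encipher("ATTACK AT DAWN"))             # ZGGZXP ZG WZDM
print(encipher(encipher("ATTACK AT DAWN")))   # ATTACK AT DAWN
```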

Cryptography is indeed an old practice. The ancient Greeks, and the Spartans in particular, would write messages on strips of cloth wrapped around wooden staffs of various widths and then send just the cloth strip with the courier. Only if the strip was rewound around a staff of the right width could the message be deciphered; here the 'key' is the width of the staff. That information would be known or communicated to the receiver ahead of time so that they had the right staff on hand to decipher the message. Obviously, anyone who intercepted both the message and the information about the staff's width could also decipher it. So we see that the secrecy of the key is a very important thing in all of this.

Let's move forward to the Mongol Empires of Asia. When one Khan wished to send a secret message to another, they would shave the heads of two boys: on one they would tattoo the key, and on the other the encrypted message. They would then let the boys' hair grow back and send them on separate caravans to wherever they needed to go, the 'key' child travelling ahead of the 'message' child. Once the children were received, their heads were shaved, the information was recorded, and they were placed into the service of the receiving Khan or used to carry a return response. Here we start to see an intentional effort to hide not only the message but the key as well, a practice known as steganography. This begins a long line of intrigue and secrecy that is still prevalent today in the world of cryptography in cyberspace; key cracking remains a very important attack method, as does hash reversal to obtain clear-text passwords.

If we move forward again to the Second World War, we have the legendary Enigma machines of Nazi Germany. In the earlier portion of the war this cipher was effectively unbreakable. That was due not only to the complexity of the method, which relied on a series of geared rotor disks of various ratios within the machine, but also to the sophistication of the key distribution: little black codebooks carried by every field commander who handled communications. Obtaining one of these books was obviously a big leap forward for the Allies, but without detailed knowledge of the Enigma method it was of little use, and the books were updated and redistributed regularly, so each was valid for only a limited time. Nevertheless, through the diligent reverse engineering of Marian Rejewski, Alan Turing and others, the Enigma code was cracked, and this breakthrough contributed strongly to the Allied victory. What many folks don't realize is that this work was also the foundation of modern computing as we know it today. That's right: modern computing started as a need to crack a code and decipher the encrypted messages of the enemy. So from time immemorial, the pattern had not changed.

Guess what? – Times have changed!

Today we use encryption without even thinking about it, and that is the intention. It's a part of daily life and transparent to us as users. There is encryption of data in motion, encryption of data at rest, and encryption for access authentication and transactions. As an example, the disk on the laptop I'm using is encrypted; I never have to deal with it other than a brief message during boot-up. It's the same with web sites: when I go to a secure site, the encryption and key exchange are automatic, and I never need to deal with their complexities. This is all very nice, but it also creates an infrastructure that can potentially be compromised by someone with the right skills and tools. The result is an 'arms race' of sorts, with stronger encryption methods and much longer keys that are regenerated very frequently, and the evolution of cracking methods following close behind. Quantum key generation (QKG) is a recent step in this evolution: there are two things that make a key hard to crack, its length and the randomness of its generation, and QKG can easily provide both. Indeed, with the advent of true quantum encryption, a practically unbreakable cipher will finally be achieved, one beyond the known powers of modern computing to crack. So it sounds like we won, right? It sounds like we have beaten the bad guys and will soon have secure communications without fear of interception. While this sounds good, like all things in security, not quite.
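To put the 'length and randomness' point in rough numbers, here is a small back-of-the-envelope sketch in Python; the figures are illustrative only and not tied to any particular product or key-generation scheme:

```python
# A brute-force attacker must search roughly 2**n keys for an n-bit key.
def keyspace(bits: int) -> int:
    return 2 ** bits

for bits in (56, 128, 256):
    print(f"{bits}-bit key -> {keyspace(bits):.3e} possible keys")

# Randomness matters as much as length: if a nominally 256-bit key is
# derived from an 8-character lower-case password, the effective search
# space collapses to 26**8 regardless of the stated key length.
effective = 26 ** 8
print(f"8-char lower-case password -> {effective:.3e} effective keys")
```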

This has alarmed the authorities. Many law enforcement and national justice agencies are raising the concern that unbreakable (or at least computationally impractical to break) ciphers are essentially impossible to eavesdrop on in anything near real time. This becomes a very attractive avenue for criminal and terrorist organizations to establish and maintain communications with little risk of being monitored, and it is already done on a regular basis with existing 'strong' cryptographic methods. Law enforcement is reduced to analyzing communication patterns, which is eerily similar to the situation the Allies were in at the start of WWII. So we come to the following dichotomy, as illustrated below.

  • GOOD – There are malicious entities that strongly desire to extract or intercept and read your data. They would also like to impersonate you. As a consequence you desire secrecy through the use of encryption.
  • BAD – There are malicious entities that wish to communicate with one another in secret; as a consequence, they too desire secrecy through the use of encryption.
  • GOOD – Law enforcement agencies desire the ability to intercept and decrypt criminal or terrorist communications in order to gain better insight into their activities and plans.
  • BAD – Malicious entities also desire the ability to intercept and decrypt your communications in order to gain better insight into your data and identity.
  • BAD – Malicious entities also desire to use your own encryption methods as a cover of darkness to move laterally and northbound towards your sensitive high risk assets.

Two things to note here: first, the bad points outnumber the good! Second, the fifth point is a very scary thought if it has never occurred to you before. Many folks make the easy but incorrect assumption that it is enough to invest in a strong outer perimeter defense and then use encryption within that perimeter to protect sensitive data, both in motion and at rest. At first glance this seems like good prudence, but if the practice is taken too far it can be a very bad thing from a security perspective.

There is now the realization that cyber-communications require some sort of independent inspection visibility into the end-to-end data path. The reasons for this are manifold, as two simple cases illustrate:

1) A user's password and account access privileges can be compromised, and the intruder uses that compromise to gain access to encrypted services. From there, the normal encrypted channels provide avenues for further infection and the establishment of command and control.

2) A user's device can be compromised, allowing the attacker to use the user's identity to further infiltrate the network and attached systems. There are several ways for this to occur with mobile edge devices, which at times present vulnerabilities due to the user's behavior, a shortcoming in the edge protection mechanisms, or a combination of both.

There are many examples in both categories, and others, that illustrate that assuming total end-point protection is not a realistic proposition. A determined intruder will get access. The question then is how far they get and what the impact is. How deep can they penetrate into the network, and how much damage can they do before they are discovered? The data is scary: roughly 80% of breaches are discovered not by internal IT security but by third-party resources such as law enforcement agencies or financial partners, and though this number has improved recently (it shows folks are listening!) the ratio is still far too high. Add to this the fact that the average dwell time is around 240 days before breach or discovery. Again, this number is improving, but not nearly fast enough, and it will likely never reach 'day zero'. These are disturbing results. They mean that much of the security infrastructure we have built is being used against us in ways we didn't intend, allowing intruders to move or extract content at will, particularly if we are not diligent in practice and become complacent about the technology.

This clearly demonstrates that we require visibility into the network's information flows in order to see anomalies quickly and investigate their cause, because one thing is certain: if you are a target they will get in, eventually. So it is no longer just a question of maintaining a secure perimeter with a strong inspection and sandboxing environment; it is about detecting the threat before it gets too far into the secure domains of interest. These could hold credit card data, personal health care records, criminal justice records, industrial control networks, intellectual property and more. The list goes on, and it will only grow with the Internet of Things. If you haven't noticed, the whole paradigm has shifted from encryption and secrecy to visibility and inspection. Indeed, there is an almost yin-yang relationship between the two in a truly comprehensive security practice.

 

Visibility into the SDN Fx Fabric

SDN Fx Fabric Connect yields what is referred to as a 'stealth' network topology. There are two key terms here that need to be pointed out: stealth, the ability to be undetectable or undiscoverable, and topology, the actual layout of the network infrastructure and its internal switching paths. As pointed out earlier, it is important to realize that stealth networking is focused on this alone; network data is not encrypted by default.

All accepted encryption methods work over SDN Fx, however, so an IT architect could in theory encrypt end to end across the stealth network. So the question is: where do we get the visibility we require?

This is where our work with various security partners began. Visibility into network traffic is a critical asset for a proper security practice. What we have arrived at is part technology, part practice, and it rests on three points.

Point one – End-to-end encryption is a bad thing as a general practice

The simple fact of the matter is that the bad guys use encryption as well. Many will argue that they could pick out these covert sessions, but that is increasingly untenable. If end points that use valid encryption become compromised, the very same 'trusted' encrypted session can be used to further infiltrate and compromise, or for the exfiltration of data. If you are encrypting end to end, there is no place to see any indication of abnormality other than changes in traffic patterns.
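To make 'changes in traffic patterns' a bit more concrete, here is a minimal, hypothetical sketch of flagging flows whose byte counts deviate sharply from a baseline; the flow records and threshold are made up for illustration and are not tied to any particular monitoring product:

```python
from statistics import mean, stdev

# Hypothetical per-flow byte counts collected over a baseline window
# (e.g. nightly exports from a flow collector).
baseline = [12_400, 11_900, 13_100, 12_700, 12_950, 11_650, 12_300]

def is_anomalous(observed_bytes: int, history: list, z_max: float = 3.0) -> bool:
    """Flag a flow whose volume is more than z_max standard deviations
    from the baseline mean. Even fully encrypted traffic exposes this
    kind of metadata (volume, timing, peers)."""
    mu, sigma = mean(history), stdev(history)
    return abs(observed_bytes - mu) > z_max * sigma

print(is_anomalous(12_800, baseline))   # False: within normal variation
print(is_anomalous(250_000, baseline))  # True: possible exfiltration over a 'trusted' tunnel
```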

 

Point two – There needs to be enforcement of encryption-free zones

In response to point one, when encryption is used there must always be a point where the clear data can be inspected. Ideally, this should be coordinated at the most efficient and cost-effective level, which requires some thought about implementation and purpose. Further on in this article we provide some additional insight into this challenge.

Point three – There needs to be coordination between the security and network infrastructure to deliver the points highlighted above

For the above two points to be realized, the network service paths must be coordinated to ensure that the required data is decrypted, inspected and, if required, re-encrypted. This brings about the concept of a network security service chain, which can be very powerful when properly leveraged with the right technology.
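As a conceptual illustration only (not vendor code), a service chain can be thought of as an ordered list of functions applied to each traffic flow; the function names and toy policy here are hypothetical:

```python
from typing import Callable, List

# A hypothetical 'flow' is just a payload plus some metadata.
Flow = dict

def firewall(flow: Flow) -> Flow:
    # Drop flows to disallowed ports (toy rule for illustration).
    if flow["dst_port"] not in (443, 8443):
        raise ValueError("blocked by policy")
    return flow

def ids_inspect(flow: Flow) -> Flow:
    # Hand the clear payload to an inspection engine (stubbed here).
    flow["inspected"] = True
    return flow

def service_chain(flow: Flow, chain: List[Callable[[Flow], Flow]]) -> Flow:
    """Apply each network service function in order; the chain's order
    is the policy (e.g. decrypt -> inspect -> re-encrypt)."""
    for fn in chain:
        flow = fn(flow)
    return flow

result = service_chain({"dst_port": 443, "payload": b"..."}, [firewall, ids_inspect])
print(result["inspected"])  # True
```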

The result is the triangulated relationship illustrated below. In the lower left-hand corner is 'Stealth': the ability to reduce or eliminate an outsider's ability to interpret the network topology or move within it. This is paired with 'Hyper-segmentation', the capacity for micro-segmentation with dynamic elasticity. On the lower right we have 'Encryption', which on the surface might seem an unqualified good (in the end it is, I promise, stay tuned) but, as we've seen, can be a bad thing as well: malicious entities can turn the very encryption you trust against you.

At the apex of the triangle is the requirement for visibility into the traffic patterns traversing the network. Without this capability the overall security practice is drastically compromised. The holistic aspect of this approach is obvious: if we can intercept or corral information into proper inspection boundaries, then we can enforce encryption-free zones where security instrumentation can be applied to best effect.

The following diagram illustrates several SDN Fx topologies where encryption-free zones can be engineered. Any encrypted data patterns that are allowed should require a strict policy exception and should be monitored by a separate encryption-free zone further into or out of the network. This requires service chaining the zones into whatever form the service in question requires.

At the traditional data center demarcation we see two forms of encryption-free zone. The first is at the User to Network Interface (UNI) boundary, which deals with normal 802.1Q tagged data. The second is at the Network to Network Interface (NNI) level, which is based upon 802.1ah framed data; SDN Fx itself is based upon 802.1ah transport. This provides two potential areas where encryption-free zones can be enforced to give total visibility into traversing traffic. Any use of encryption there should be prohibited, or at least require a policy exception and alternative methods of inspection. While this approach covers the majority of north-south user traffic, there is also the need to inspect any incoming traffic from the DMZ or other, perhaps federated, demarcations.
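As a rough sketch of what distinguishing UNI from NNI traffic looks like at the frame level, the following Python snippet classifies a raw Ethernet frame by its outer EtherType; the constants for 802.1Q (0x8100), the backbone B-tag (0x88A8) and the 802.1ah I-tag (0x88E7) are standard values, but the steering logic itself is purely illustrative:

```python
import struct

ETH_8021Q   = 0x8100   # customer VLAN tag, typically UNI-side traffic
ETH_BTAG    = 0x88A8   # backbone B-tag TPID (shared with 802.1ad S-tags)
ETH_ITAG_AH = 0x88E7   # 802.1ah backbone service instance tag (I-tag)

def classify_frame(frame: bytes) -> str:
    """Coarsely label a frame by its outer EtherType to decide which
    encryption-free inspection zone (UNI or NNI) would apply."""
    (ethertype,) = struct.unpack("!H", frame[12:14])
    if ethertype == ETH_8021Q:
        (tci,) = struct.unpack("!H", frame[14:16])
        return f"UNI / 802.1Q tagged, VLAN {tci & 0x0FFF}"
    if ethertype in (ETH_BTAG, ETH_ITAG_AH):
        return "NNI / 802.1ah backbone-encapsulated frame"
    return "untagged or other"

# Toy example: dst MAC + src MAC + 802.1Q tag carrying VLAN 10.
frame = b"\xff" * 6 + b"\xaa" * 6 + struct.pack("!HH", ETH_8021Q, 10)
print(classify_frame(frame))   # UNI / 802.1Q tagged, VLAN 10
```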

We also see that encryption-free zones can be implemented in certain edge situations where there is a high volume of encrypted peer-to-peer traffic. This is increasingly a requirement in certain distributed IoT frameworks, due to the push of both computing and data out to the network edge, a phenomenon often referred to as 'fog computing'. Inspecting these types of traffic patterns requires encryption-free zones relatively close to the network edge. Here, too, the UNI/NNI boundary is the best place to implement inspection, and it may be very specific to a function or a service. Note also that with the recent introduction of VOSS 6.0, SDN Fx can now mirror I-SIDs (Virtual Service Networks) from the UNI side to monitoring systems, which provides an excellent way to focus on certain edge behaviors in a very dedicated and discreet fashion. This approach does, however, require that the encryption function be decoupled from the actual device or end station. If that cannot be accomplished, the traffic should be carried to a demarcation where such an encryption-free zone can be enforced. This, incidentally, is the approach of our Surge 2.0 Secure IoT solution offering.

The notion of a network security service chain is a very powerful one, and not necessarily new. I first came upon this issue back in 2006 and filed a patent on the notion of mid-com interception for inspection, which issued as US Pat. No. 7,739,728. SDN Fx provides the ability to control service paths and realize this concept of 'data corralling': controlled network hyper-segments are directed into encryption-free zones where inspection can then occur. The diagram below illustrates a security service chain, now in development, for intercepting clear data for inspection entirely within a single network element.

As can be seen, an encrypted hyper-segment comes into one side of the network element, where it is decrypted, cataloged and then placed into a security inspection service chain that would involve full packet capture as well as threat and intrusion services. Additional services such as load balancing or firewalls may also be desired, particularly at major traffic junctions. On the outbound side of the network element the data is re-encrypted and sent on to its destination. Note that clear data is never on the wire; the whole service chain occurs within a single network element. This is the core essence of the security inspection point highlighted in patent # 7,739,728. The true evolution, however, is in the embodiment of a platform that supports such functions in a truly virtualized fashion, and this has now been achieved in our development labs.
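Purely as a software analogy for that flow (not the actual switch implementation), the sketch below uses Python's cryptography library to decrypt a payload, hand the clear bytes to a stubbed inspection step, and re-encrypt before forwarding; the keys and the inspection rule are hypothetical:

```python
from cryptography.fernet import Fernet

# Hypothetical keys: one shared with the sender, one used on the outbound leg.
inbound_key, outbound_key = Fernet.generate_key(), Fernet.generate_key()
inbound, outbound = Fernet(inbound_key), Fernet(outbound_key)

def inspect(clear: bytes) -> None:
    """Stub for the inspection service chain (packet capture, IDS/IPS, ...)."""
    if b"malware-signature" in clear:
        raise RuntimeError("threat detected; drop flow and alert")

def inspection_point(ciphertext: bytes) -> bytes:
    """Decrypt, inspect in the clear inside the element, then re-encrypt.
    Clear data never leaves this function, i.e. it is never 'on the wire'."""
    clear = inbound.decrypt(ciphertext)
    inspect(clear)
    return outbound.encrypt(clear)

# Simulate an encrypted hyper-segment arriving at the element.
incoming = inbound.encrypt(b"patient record #1234")
forwarded = outbound.decrypt(inspection_point(incoming))
print(forwarded)   # b'patient record #1234'
```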

Security technology partners can provide the threat intelligence and analytics placed within these zones, whether UNI or NNI. The end result is a holistic, secure framework that incorporates encryption for content security, but in a controlled and monitored fashion that always retains a point of inspection for traversing data. Why is this important? Well, we are all aware of the severe damage that the WannaCry ransomware attack wreaked. It took down large parts of national healthcare infrastructure, most visibly the UK's NHS, to the point where only emergency cases were being handled, and many other countries were affected as well. The reality is that encryption offers no protection against such a worm; in fact, the worm will use your own encryption services to propagate quietly and undetected. SDN Fx makes it very difficult for the worm to propagate, thanks to hyper-segmentation and the isolation of critical communities, so while you might be impacted, a properly designed network should keep your critical assets safe. But that is clearly not enough. You need to be able to detect the worm before it does significant damage, and encryption-free inspection zones offer exactly that capability.

This brings a new level of meaning to the motto – “Prevention is an ideal, but detection is an absolute must!”

 
