Information in transit from one place to another is never completely safe from interception.
Once you accept that fundamental fact, the goal becomes clear: make data in transit so difficult to intercept and decipher that nobody will bother making the effort. That presupposes interception is easy in the first place, and unfortunately, it is!
Tempest
According to Wikipedia: “In 1985, Wim van Eck published the first unclassified technical analysis of the security risks of emanations from computer monitors. This paper caused some consternation in the security community, which had previously believed that such monitoring was a highly sophisticated attack available only to governments; van Eck successfully eavesdropped on a real system, at a range of hundreds of meters, using just $15 worth of equipment plus a television set.”
The same Wikipedia article reports that Bell Labs was able to recover 75% of the plaintext being processed in a secure facility from a distance of 80 feet, and that was during World War II!
As a result, the world’s foremost hackers, the US National Security Agency (NSA), established the codename “Tempest” to describe “spying on information systems through leaking emanations, including unintentional radio or electrical signals, sounds, and vibrations.” Tempest standards exist for shielding, filtering, standard distances between system components, and more, all designed to prevent such data interception.
Points of Entry
Protecting data in transit means defending against two steps: first interception, then translation or deciphering. The second step presupposes that you are using data encryption. If you’re communicating with most cloud services, that’s just about always a given. If you’re not, you should have no expectation of privacy whatsoever. Encrypting data in transit, whether through a firewall or other measures, is a necessity.
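To make that concrete, here is a minimal Python sketch of what “encrypted in transit” usually means in practice: wrapping a plain TCP connection in TLS, the protocol behind the HTTPS connections most cloud services use. The hostname here is a placeholder for illustration, not a real service.

```python
import socket
import ssl

# Hypothetical cloud endpoint, used purely for illustration.
HOST = "storage.example-cloud.com"
PORT = 443

# A default SSL context verifies the server's certificate against the
# system trust store and checks that the hostname matches.
context = ssl.create_default_context()

with socket.create_connection((HOST, PORT)) as raw_sock:
    # Wrap the plain TCP socket in TLS; everything sent after the
    # handshake is encrypted on the wire.
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("Negotiated protocol:", tls_sock.version())
        tls_sock.sendall(
            b"HEAD / HTTP/1.1\r\nHost: " + HOST.encode() + b"\r\nConnection: close\r\n\r\n"
        )
        print(tls_sock.recv(1024))
```

Anything sent over the raw socket instead of the wrapped one would travel as readable plaintext, which is exactly the exposure described above.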
Unfortunately, data criminals have techniques available to them for stealing encryption keys, so every effort made to prevent them from intercepting your data in the first place is absolutely worthwhile.
There are three basic parts of the journey when data is in transit. The cloud provider’s data center is the source, the data travels over the network, and your computer, tablet, smartphone or other device is the endpoint.
While the cloud provider will encrypt data on its way to you, it should never hold the encryption key. It doesn’t need it; it never needs to decrypt the data. When the provider receives data from you, all it does is store it until the next request to send it on.
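One way to keep the key out of the provider’s hands is to encrypt on your own device before anything is uploaded. The sketch below uses Python’s cryptography package (an assumption on our part, not something any particular provider mandates) to show the idea: the key never leaves your machine, and the provider only ever stores ciphertext. The upload step is hypothetical.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key is generated and kept on *your* side; the provider only
# ever sees ciphertext and has nothing to decrypt it with.
key = Fernet.generate_key()
fernet = Fernet(key)

plaintext = b"contents of quarterly-report.xlsx"
ciphertext = fernet.encrypt(plaintext)

# upload_to_cloud(ciphertext)  # hypothetical upload step: only the
#                              # encrypted blob leaves your machine

# Later, after downloading the blob back, the same locally held key
# recovers the original data.
assert fernet.decrypt(ciphertext) == plaintext
```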
Data in transit across the network must be protected, preferably using the IPSec suite of protocols, modes, and associations. This includes an authentication header to verify the identity of the sender, encapsulation, tunneling, and other techniques designed to make it nearly impossible for unauthorized invaders to obtain your data.
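Real IPSec lives at the IP layer and negotiates its keys and algorithms automatically, so in practice you configure it rather than code it. Still, a toy Python sketch can show the spirit of its authentication header: a keyed checksum travels with each payload so the receiver can confirm who sent it and that nothing was altered in transit. The shared key and packet layout here are illustrative only, not the actual IPSec wire format.

```python
import hashlib
import hmac
import os

# Toy illustration only: real IPsec negotiates keys via IKE and
# authenticates packets at the IP layer, not in application code.
shared_key = os.urandom(32)  # pre-shared key known to both endpoints


def protect(payload: bytes) -> bytes:
    # Prepend an HMAC "authentication header" so the receiver can
    # verify both the sender and the payload's integrity.
    tag = hmac.new(shared_key, payload, hashlib.sha256).digest()
    return tag + payload


def verify(packet: bytes) -> bytes:
    tag, payload = packet[:32], packet[32:]
    expected = hmac.new(shared_key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed: packet rejected")
    return payload


packet = protect(b"hello, cloud")
assert verify(packet) == b"hello, cloud"
```

A forged or tampered packet fails the comparison and is simply dropped, which is the same basic guarantee the IPSec authentication header provides.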
Endpoint protection is critical. This includes defending the device against viruses, worms, Trojans, and other malware, along with data loss and leakage prevention, content filtering, user authentication, network access control, and a variety of other techniques designed to protect both the device itself and the session it is running on the network from compromise.
You’ve Got a Friend
Data security across the internet is a complex maze of protocols, modes, techniques, and technologies that no casual user should expect to master on their own. Turn to your friends at CloudStrategies to point you toward the right solutions and ensure that your data gets from you and your people to your cloud service provider and back, safely.