Saturday, May 14, 2016

Entropy, Randomness and Modern Cryptosystems


We use encryption technologies to keep our secret data safe and secure, but there are a number of pitfalls associated with doing so.


We take a secret plaintext message and encrypt it with a strong secret key to produce the ciphertext. The goal is that an adversary who does not know the secret key should not be able to recover the secret plaintext message from the ciphertext. But no modern encryption algorithm is absolutely secure, and attackers often manage to extract meaningful information about the plaintext from the ciphertext. Entropic security is a security definition that captures how difficult it is for an attacker who does not know the secret key to extract meaningful information about the plaintext from the ciphertext.
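
As a rough illustration, here is a minimal Python sketch of this setup. It assumes the third-party cryptography package (its Fernet recipe is just one convenient choice, not something specific to this discussion): a random secret key encrypts the plaintext, and only the ciphertext is exposed to the adversary.

    # Minimal sketch: encrypting a secret plaintext with a randomly generated key.
    # Assumes the third-party "cryptography" package is installed.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()             # secret key, shared only between the parties
    cipher = Fernet(key)

    plaintext = b"attack at dawn"
    ciphertext = cipher.encrypt(plaintext)  # this is all the adversary gets to see

    # With the key, decryption recovers the message; without it, recovery should be infeasible.
    assert cipher.decrypt(ciphertext) == plaintext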



What is Entropy?


In cryptography, a cryptosystem is said to be semantically secure if it is computationally infeasible for an attacker to extract any information about the plaintext given only the ciphertext and its length.

Some encryption schemes, such as RSA without encryption padding and block ciphers used in Electronic Codebook (ECB) mode or with a constant initialization vector, are not semantically secure. They always produce the same ciphertext for a given plaintext and key, even over separate executions of the encryption algorithm. An attacker can exploit this to do statistical analysis on the ciphertext and learn a great deal about the plaintext.
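
A small sketch of this weakness, assuming the third-party pycryptodome package is available: two identical plaintext blocks encrypted under AES in ECB mode produce two identical ciphertext blocks, whereas a randomized mode such as CBC with a fresh IV hides the repetition.

    # Sketch: ECB mode leaks plaintext structure because it is deterministic per block.
    # Assumes the third-party pycryptodome package (Crypto.Cipher.AES).
    import os
    from Crypto.Cipher import AES

    key = os.urandom(16)
    block = b"SIXTEEN BYTE MSG"                        # exactly one 16-byte AES block
    plaintext = block * 2                              # two identical blocks

    ecb = AES.new(key, AES.MODE_ECB)
    ct_ecb = ecb.encrypt(plaintext)
    print(ct_ecb[:16] == ct_ecb[16:])                  # True: the repetition is visible

    cbc = AES.new(key, AES.MODE_CBC, os.urandom(16))   # fresh random IV each time
    ct_cbc = cbc.encrypt(plaintext)
    print(ct_cbc[:16] == ct_cbc[16:])                  # False (with overwhelming probability)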





Entropic security of an encryption scheme is similar to semantic security, but it only applies when the messages are drawn from a distribution with high entropy. In other words, an encryption scheme is said to be entropically secure if, for such highly unpredictable messages, it is computationally infeasible for an adversary to extract any information about the plaintext from the corresponding ciphertext.


In Information Theory, entropy is a measure of the unpredictability of the information content of a message. In other words, it is the expected value of the information contained in each message. Randomness is a measure of uncertainty in an outcome, which is why it is quantified using information entropy.
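
As a small worked example (a sketch, not a formal definition), the Shannon entropy H(X) = -Σ p(x) log2 p(x) of the byte values in a message can be computed directly: a perfectly predictable message has zero entropy, while a message using all 256 byte values uniformly has the maximum of 8 bits per byte.

    # Sketch: Shannon entropy of the byte distribution of a message, in bits per byte.
    import math
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        counts = Counter(data)
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(shannon_entropy(b"aaaaaaaa"))        # 0.0 -- completely predictable
    print(shannon_entropy(bytes(range(256))))  # 8.0 -- every byte value equally likely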



Entropy and Modern Cryptosystems


Modern cryptosystems rely heavily on randomly generated keys. We randomly generate a secret key and encrypt secret data using that key.

For example, in SSL communications, a very large random number is generated and used to derive the keys that encrypt the session. These random keys are generated from entropy that is collected from specific, predefined sources and fed into a random number generator. And that is how entropy, randomness and modern cryptosystems are related to each other.
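
In practice, applications do not read entropy sources directly for every key; they ask the operating system's cryptographically secure random number generator, which is seeded from those sources. A minimal Python sketch (the 32-byte key size here is just an illustrative choice):

    # Sketch: generating a secret key from the OS CSPRNG, which is seeded from
    # the system's entropy sources (e.g. /dev/urandom on Linux).
    import secrets

    session_key = secrets.token_bytes(32)   # 256 bits of randomness, e.g. for AES-256
    print(session_key.hex())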



How is entropy generated?


There are a number of ways entropy is generated and collected in a modern system. A number of them are mentioned below:

  • The Linux kernel generates entropy from keyboard timings, mouse movements and IDE timings and makes the random data available through the special files /dev/random and /dev/urandom (see the sketch after this list).
  • Some software packages use userspace processes to gather random characters and utilize them.
  • Modern CPUs and hardware often include integrated generators that produce high-quality, high-speed entropy and provide it to the operating system through /dev/hw_random.
  • Some companies manufacture entropy generation devices to generate high quality entropy in an efficient manner.
  • One can even collect entropy from a computer's microphone or webcam, or by building a sensor to measure the air turbulence inside a disk drive.
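
As a small sketch of the first point, on a Linux system one can inspect the kernel's entropy estimate and read random bytes directly from the special files it exposes:

    # Sketch: inspecting the Linux kernel's entropy pool and reading random bytes.
    # Assumes a Linux system (these paths are standard there).
    with open("/proc/sys/kernel/random/entropy_avail") as f:
        print("Kernel entropy estimate (bits):", f.read().strip())

    with open("/dev/urandom", "rb") as f:
        print("16 random bytes:", f.read(16).hex())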



This article gives basic information on entropy and how it relates to modern cryptosystems. Hope you enjoyed it.
