Are We Nearing the Goal of Ubiquitous Encryption?
We all know that data encryption was invented to keep data secure. At this stage, however, data must be decrypted before it can be used and then re-encrypted afterward, and this extra work drags down the processor and the overall performance of the system. If we can develop systems where the encryption process happens at the processor level, everything becomes faster.
Photo credit: blog.cbadigital.co.uk
Technology watchers are saying that in the near future, all data, whether at rest or in transit, will already be encrypted, and this will hold for all data, sensitive or not. This is becoming possible because of new inroads in on-chip acceleration, which are making the ubiquitous use of encryption a feasible solution.
Data encryption at present requires four steps: initial encryption, rekeying, clearing, and ongoing encryption/decryption. Initial encryption simply means that pre-existing data must be transformed into ciphertext for the first time. Rekeying involves re-encrypting the data under a new key and overwriting the original key to ensure continuing security. Clearing is also done on a periodic basis to keep data secure: a copy of the entire data set is decrypted, with one operation performed per data block. Ongoing encryption/decryption allows new data to be ingested into the system and read back out. The whole process requires that a portion of the CPU's bandwidth be reserved for encryption alone.
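The lifecycle above can be sketched in a few lines of Python. This is a toy illustration only: the cipher below is a SHA-256-based keystream XOR, not a real encryption algorithm, and the function names (`encrypt`, `decrypt`, `keystream`) are hypothetical helpers, not part of any library or of the systems discussed here. The point is the workflow, in particular how rekeying forces a full decrypt/re-encrypt pass.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from key + nonce (toy construction,
    # NOT a vetted cipher; real systems would use AES or similar).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    nonce = os.urandom(16)  # fresh nonce per message
    return nonce + bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Step 1: initial encryption -- pre-existing plaintext is encrypted for the first time.
old_key = os.urandom(32)
blob = encrypt(old_key, b"pre-existing record")

# Step 2: rekeying -- decrypt under the old key, re-encrypt under a new one,
# then discard (overwrite) the old key. Note this touches every data block.
new_key = os.urandom(32)
blob = encrypt(new_key, decrypt(old_key, blob))
old_key = None  # original key is no longer retained

# Step 4: ongoing encryption/decryption -- every read pays a decrypt cost.
assert decrypt(new_key, blob) == b"pre-existing record"
```

Because rekeying and clearing both walk the entire data set, their CPU cost scales with total data volume, which is exactly the overhead that on-chip acceleration aims to eliminate.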
CPU vendors at present are rolling out support for "cryptographic acceleration" in hardware through dedicated instruction sets. By using finely tuned on-chip cryptographic instructions, the performance impact of encryption is reduced to nearly zero. Now that encryption can happen at the chip level, performance improves to the point where encryption becomes transparent to application users.
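What "transparent to application users" means in practice is that encryption happens behind the interface the application already uses. A minimal sketch, assuming a hypothetical in-memory store (the class and its XOR "cipher" are invented for illustration and deliberately insecure): callers just `put` and `get` plaintext, while everything held at rest is ciphertext.

```python
import os

class TransparentStore:
    """Toy store that encrypts on write and decrypts on read, so callers
    never handle keys or ciphertext. The repeating-key XOR used here is
    NOT secure; it stands in for hardware-accelerated AES."""

    def __init__(self):
        self._key = os.urandom(32)
        self._blobs = {}  # name -> ciphertext

    def _xor(self, data: bytes) -> bytes:
        # Symmetric toy transform: applying it twice returns the original.
        return bytes(b ^ self._key[i % len(self._key)] for i, b in enumerate(data))

    def put(self, name: str, data: bytes) -> None:
        self._blobs[name] = self._xor(data)   # data at rest is ciphertext

    def get(self, name: str) -> bytes:
        return self._xor(self._blobs[name])   # decrypted on the way out

store = TransparentStore()
store.put("note", b"hello")
assert store.get("note") == b"hello"          # caller sees only plaintext
```

When the per-block transform inside `_xor` is replaced by dedicated on-chip instructions, its cost approaches zero and the wrapper becomes effectively free, which is the scenario the article describes.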