Happy New Year! Hope you all had a wonderful holiday. Recently, I've been writing about Zscaler's 2014 Security Cloud Forecast. Our CEO Jay Chaudhry also wrote about some of the challenges organizations may face inspecting 2048-bit SSL traffic in 2014.
We all know that SSL is important for securing Internet traffic. So what's the downside? you might ask.
To improve security, more traffic is being encrypted. It used to be that only the login pages for services like Facebook and Google were encrypted; now those services are shifting to encrypting all of their pages. At this writing, some 30% of enterprise traffic is SSL-encrypted, a significant increase over previous years.
On top of this growing trend, the default key length for SSL certificates is increasing from 1024 bits to 2048 bits, which makes inspecting SSL traffic even more challenging.
Why does this represent a security risk? In essence, the answer goes back to APTs and other sophisticated attacks. Malware is in the business of covering its tracks. Botnet traffic back to the command-and-control server is often encrypted, and if you’re going to inspect it, you have to decrypt it.
That botnet traffic might well be carrying the blueprints for your new product design, your financials, or other sensitive information. Two clicks in Gmail and those documents are attached to an outbound message. Because the traffic is encrypted, the vast majority of data loss prevention tools don't detect that type of leakage.
To inspect encrypted traffic, you have to decrypt it. This is processor-intensive and impacts network performance. More traffic is being encrypted, which means more of this work and more of a strain on the network. And the key length is doubling, which means roughly a five-fold increase in the processing required to decrypt that traffic.
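You can get a feel for why doubling the key length costs so much with a quick back-of-the-envelope experiment. The sketch below (a toy illustration, not how an SSL inspection appliance actually works) times raw modular exponentiation, the core operation in an RSA private-key decryption, at 1024 and 2048 bits. Real TLS stacks use CRT shortcuts and optimized math libraries, which is why the practical slowdown lands around five-fold rather than the theoretical worst case, but even this crude timing shows the cost growing much faster than the key length:

```python
import random
import time

def modexp_time(bits, trials=20):
    """Average time for one k-bit modular exponentiation (the heart of RSA)."""
    random.seed(0)  # fixed seed so the comparison is repeatable
    n = random.getrandbits(bits) | (1 << (bits - 1)) | 1  # odd k-bit modulus
    d = random.getrandbits(bits) | (1 << (bits - 1))      # k-bit exponent
    c = random.getrandbits(bits) % n                      # "ciphertext"
    start = time.perf_counter()
    for _ in range(trials):
        pow(c, d, n)  # c^d mod n -- the expensive private-key step
    return (time.perf_counter() - start) / trials

t1024 = modexp_time(1024)
t2048 = modexp_time(2048)
print(f"2048-bit op is ~{t2048 / t1024:.1f}x slower than 1024-bit")
```

The exact ratio will vary by machine and big-integer implementation, but the doubling of key length consistently costs several times more CPU per decryption, and an inline inspection device pays that price on every SSL connection it opens.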
As a result, SSL is making your job harder.