2009 Web Security Predictions
2009 is a year when some fundamental shifts in web technology will begin to take hold and attackers will adjust accordingly. With the emergence of revolutionary changes such as cloud computing, widespread adoption of next generation web application technologies and the 'real' web arriving on mobile devices, we must anticipate that attackers will adjust their tactics to leverage these shifts. From a corporate perspective, anticipating attack and business trends can make the difference between being prepared for an emerging threat and being blind-sided by a new attack. We have combined the wealth of knowledge shared among Zscaler researchers with our unique access to web traffic to assemble our top 10 web security predictions for the coming year.
1.) Cloud Computing Is Ready For Primetime
2008 was the year in which 'cloud computing' truly emerged on the public stage and gained acceptance as a force to be reckoned with. In the coming year the honeymoon period ends and it's time for cloud computing to prove its worth. We're obviously biased, but we fully expect 2009 to be a critical year for security in the cloud. We've moved past piqued curiosity to detailed evaluations and bake-offs of the many competing solutions that are beginning to emerge. This is the year when the pretenders will quickly be relegated to the sidelines and by year-end dominant players will begin to emerge.
For this process to occur, companies are going to need to define criteria for assessing security in the cloud. This will require a different approach than traditional software assessments, as you don't get to hold and touch SaaS solutions. That in turn makes it even more important that evaluators hold SaaS vendors' feet to the fire. Don't accept at face value a vendor's claim that a feature exists - make sure you understand how it works and determine whether it will meet your needs.
2.) The ‘Mobile Web’ Is No Longer A Separate Platform
Check historical security predictions and you'll see that every year for the past decade was labeled as the moment when mobile malware would explode. It made sense. After all, more and more people were leveraging mobile devices, so attackers needed to adjust their focus eventually...yet they didn't. The number of large-scale attacks on mobile devices has been minimal. How can this be, and will 2009 be any different?
Mobile malware has largely remained on the sidelines for two reasons. First, the multitude of platforms has limited the payback from putting in the work to exploit vulnerabilities in a single implementation. Second, mobile applications have largely lacked the feature set of their desktop counterparts, which has limited their use.
Thanks in part to the competitive innovation injected by Apple's iPhone, the ‘mobile web’ is no longer a separate platform. What do we mean by that? There is no longer a clear distinction between web content for mobile and traditional browsers. You don't need WAP, WML, etc. to access the web on your phone. Standard web applications are now realistically accessible despite the limited screen real estate on a mobile device.
3.) Is Client Side Browser Storage a Feature or a Ticking Time Bomb?
The line between web and native desktop applications continues to blur. Rich Internet Application technologies such as Flash and Silverlight, along with development approaches such as AJAX, have made web applications much more interactive and user friendly. Despite these advancements, however, a critical differentiator between desktop and web applications remains the need for connectivity. Sure, Google Docs is a great alternative to Microsoft Office - until you board a plane. You can't use web applications unless you have access to them.
This too is starting to change as browsers gain access to client side storage solutions. Flash storage, Google Gears and the structured client-side storage detailed in the HTML 5 specification all address this issue. While this opens new doors for web applications, as with any new technology, insecure implementations can increase risk and create new headaches for corporate security staff. Our early review of these technologies suggests that they are not well understood and are indeed being poorly implemented. This in turn will lead to the leakage of sensitive information and client side equivalents of XSS and SQL injection. Stay tuned for further posts on this topic.
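The client side SQL injection risk can be illustrated with a short sketch. Both Google Gears and the HTML 5 client-side database are backed by SQLite, so the sketch below uses Python's built-in sqlite3 module to stand in for a browser-side store; the table, the data and the query pattern are hypothetical, but the concatenation bug is exactly the one we expect to see in poorly implemented web applications:

```python
import sqlite3

# An in-memory SQLite database stands in for a client-side store such
# as Google Gears or the HTML 5 structured client-side storage, both of
# which are backed by SQLite.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE notes (owner TEXT, body TEXT)")
db.execute("INSERT INTO notes VALUES ('alice', 'secret plans')")
db.execute("INSERT INTO notes VALUES ('bob', 'grocery list')")

def notes_for_unsafe(owner):
    # Vulnerable: untrusted input concatenated straight into the query,
    # the client side equivalent of classic SQL injection.
    return db.execute(
        "SELECT body FROM notes WHERE owner = '" + owner + "'").fetchall()

def notes_for_safe(owner):
    # Parameterized query: the input is bound as data, never parsed as SQL.
    return db.execute(
        "SELECT body FROM notes WHERE owner = ?", (owner,)).fetchall()

payload = "bob' OR '1'='1"           # attacker-supplied "owner" value
print(notes_for_unsafe(payload))     # leaks every row, including alice's
print(notes_for_safe(payload))       # matches no rows
```

The lesson carries over directly: data cached in the browser deserves the same parameterized-query discipline as data on the server.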
4.) Web API Vulnerabilities Lead to Mass Attacks
Code reuse has always been encouraged - why reinvent the wheel? It's also a good security practice, as open, stable code has been scrutinized by many eyeballs and is therefore more likely to be secure. In web development, however, that is not always the case. While we may leverage functionality created by others, the security of that code may not have been assessed by anyone other than the original creators. That occurs because we're not necessarily dealing with open source code or even compiled binaries, but rather web based APIs. In this case, we're largely counting on the providers of a given service to ensure that it's secure. Google, for example, makes available a plethora of APIs for everything from maps to social networks and, despite investing in securing code prior to release, has seen its fair share of vulnerabilities. When a vulnerability is discovered in such an API, hundreds or thousands of sites can instantly be affected. Fortunately, patching can generally be completed quickly, as only the servers hosting the API need to be updated, but this can also break associated applications. From an attacker's perspective, a web API vulnerability can signal a target rich environment with a small window of opportunity - a good reason to keep quiet once a vulnerability is discovered.
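One practical defense is to treat third party API responses as untrusted input rather than trusting the provider to be flaw-free. The sketch below is a hypothetical mashup fragment (the function and response fields are ours, not any real API's) showing how raw embedding turns a provider's injection flaw into your site's XSS, and how output encoding contains it:

```python
from html import escape

def render_place_name(api_response: dict) -> str:
    # Hypothetical mashup: embed a place name returned by a third party
    # maps-style API into our own page. If the provider is compromised or
    # has an injection flaw, raw embedding makes their bug our XSS.
    name = api_response.get("name", "")
    unsafe = "<h2>" + name + "</h2>"         # trusts the API blindly
    safe = "<h2>" + escape(name) + "</h2>"   # treats it as untrusted input
    return safe

# A malicious payload arriving via the "trusted" API:
evil = {"name": "<script>steal(document.cookie)</script>"}
print(render_place_name(evil))
```

Output encoding at the point of use costs almost nothing and removes the need to trust every upstream service in the mashup chain.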
5.) Abusing the Architecture Of The Web As Opposed To Vulnerabilities In Specific Applications
6.) Internet Explorer Gets Competition
We've often argued that vulnerability statistics are a poor indicator of the relative security of a product. Rather than providing insight into the relative security of products, they tend to instead reveal the popularity of products. Attackers/researchers want the biggest bang for their buck. If they're going to spend time looking for vulnerabilities in a certain product, they're more likely to focus on the one with the largest user base. Internet Explorer (IE) has held the browser crown for several years now and that, in our opinion, is why it has tended to see a more significant number of vulnerability reports. The landscape, however, is beginning to change. Not only are mobile browsers starting to take off (see prediction #2) but there are some interesting new challengers in the market such as Google Chrome. While we don't expect Microsoft to fall from the pole position any time soon, we do expect the focus to shift to some of these new and intriguing entrants. Expect to see a decreased number of IE vulnerabilities in 2009 and more from mobile browsers and especially Google Chrome. While Google has a decent security track record, they haven't faced the same difficult but important learning curve climbed by Microsoft over the years. Browser development is also a different game than web application development, Google's forte, so expect some tarnish on the Chrome in '09.
7.) Data Leakage Via The Web Reaches A Tipping Point
The Internet is converging on ports 80 and 443. Why? They're almost always open for outbound connections on corporate networks. Whether you're dealing with VoIP, P2P or malicious code, network aware applications are becoming increasingly intelligent. While they may initially attempt to connect via high ports using proprietary protocols for efficiency, they will often try a variety of approaches and ultimately fall back to tunneling content through HTTP/HTTPS traffic. Combine this with the fact that users are increasingly encouraged to share content online (Facebook, MySpace, YouTube, etc.) and it's easy to see why data leakage via web channels is fast becoming a top priority for many enterprises. 2007 and 2008 were banner years for data leakage solutions in general (Vontu and Provilla were acquired), but in 2009 the focus will shift to web based DLP.
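To see why port-based controls miss this traffic, consider how little it takes to dress arbitrary data up as web traffic. The sketch below is illustrative only (the host name and path are invented): any payload, base64-encoded inside an ordinary-looking HTTP POST, sails through a firewall that only checks the port:

```python
import base64

def tunnel_frame(payload: bytes, host: str = "updates.example.com") -> bytes:
    # Illustrative only: arbitrary application data (VoIP, P2P, or an
    # exfiltrated file) wrapped in an innocuous-looking HTTP POST. To a
    # port-based firewall this is indistinguishable from web traffic.
    body = base64.b64encode(payload)
    headers = (
        "POST /sync HTTP/1.1\r\n"
        "Host: %s\r\n"
        "Content-Type: application/x-www-form-urlencoded\r\n"
        "Content-Length: %d\r\n\r\n" % (host, len(body))
    )
    return headers.encode("ascii") + body

frame = tunnel_frame(b"customer list, Q4 pricing")
print(frame.decode("ascii"))
```

Spotting this kind of tunneling requires inspecting the content of allowed traffic, not just its destination port - which is precisely where web based DLP comes in.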
8.) The Long Overdue Death of URL Filtering
Network administrators are finally starting to give up on managing web traffic through URL filtering. URL filtering is a dated technology that tends to focus on blocking traffic at the domain level and leaves the administrator with a binary decision - allow or deny access to a given resource. This approach was reasonable when the web was dominated by static, corporate content. This is no longer the case. Today, content is extremely dynamic and often user generated, and this changes the rules. A page that was good today may be bad tomorrow, and while a domain such as Facebook.com may be perfectly acceptable, individual pages could contain objectionable or malicious content. In 2009, enterprises will seek solutions that support dynamic content filtering and page level reputation.
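The difference between the two models is easy to show in miniature. In the sketch below (the policy tables and the page path are hypothetical), a domain-level filter waves through a malicious page on an acceptable domain, while a page-level reputation check catches it:

```python
from urllib.parse import urlparse

# Hypothetical policy data: the domain is acceptable overall, but one
# user-generated page on it has a bad reputation.
DOMAIN_POLICY = {"badsite.example": "block"}          # legacy blocklist
PAGE_REPUTATION = {"facebook.com/groups/warez": "malicious"}

def domain_filter(url: str) -> str:
    # Legacy URL filtering: one binary decision for the whole domain.
    host = urlparse(url).netloc
    return DOMAIN_POLICY.get(host, "allow")

def page_filter(url: str) -> str:
    # Page-level reputation: start from the domain decision, then
    # override it for specific dynamic pages.
    parts = urlparse(url)
    if PAGE_REPUTATION.get(parts.netloc + parts.path) == "malicious":
        return "block"
    return domain_filter(url)

url = "http://facebook.com/groups/warez"
print(domain_filter(url))   # the legacy filter allows the whole domain
print(page_filter(url))     # page-level reputation blocks the bad page
```

A real solution would of course draw reputation from a continuously updated feed rather than a static table, but the decision logic is the same.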
9.) Controlling Web Application Functionality, Not Just Content
It is no longer enough to block/allow access to sites. Administrators are demanding control over functionality, not just content. There may be perfectly legitimate reasons to permit access to a given resource but not want to permit specific functionality. Take YouTube for example. While it may be seen primarily as an entertainment site, many businesses have begun to leverage it as a marketing tool. If URL filtering is your only option for controlling access to YouTube (see prediction #8), you're stuck with allowing/disallowing access to the site as a whole. You may however wish to permit viewing videos to allow for competitive intelligence (or to avoid being the fun police) but not want to permit uploading content for fear of data leakage (see prediction #7). Administrators are increasingly demanding the ability to manage the who, what, when, where and how much of web security. They want the granularity to determine that only the Marketing Department can upload videos during work hours to YouTube, so long as they don't use more than 5% of available bandwidth. Solutions need to understand not just the destination of a web request but also the business logic of that destination in order to permit the granular level of control required to manage third party web applications.
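The Marketing/YouTube example above can be sketched as a small policy function. Everything here is a hypothetical illustration of the who/what/when/where/how-much dimensions, not any vendor's actual rule language:

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_dept: str   # who
    site: str        # where
    action: str      # what: "view" or "upload"
    hour: int        # when (24-hour clock)
    bw_used: float   # how much: fraction of bandwidth already consumed

def allow(req: Request) -> bool:
    # Hypothetical policy mirroring the example in the text: anyone may
    # view YouTube, but only Marketing may upload, only during work
    # hours, and only while usage stays under 5% of available bandwidth.
    if req.site != "youtube.com":
        return True
    if req.action == "view":
        return True
    if req.action == "upload":
        return (req.user_dept == "Marketing"
                and 9 <= req.hour < 17
                and req.bw_used < 0.05)
    return False

print(allow(Request("Engineering", "youtube.com", "upload", 10, 0.01)))  # False
print(allow(Request("Marketing", "youtube.com", "upload", 10, 0.01)))    # True
```

Note that the policy can only be enforced if the gateway can tell an upload from a view in the first place - that is the "business logic of the destination" the text refers to.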
10.) Mobile Platforms Open Up For Developers…And Attackers
Desktops and mobile devices have evolved along nearly opposite paths - until now. Desktops have always been about an open architecture - you're free to add whatever applications you like, after all, you own it. Cell phones on the other hand have traditionally been black boxes - don't like a key feature? Buy a new phone. Much of this was driven by the control-crazy mobile carriers who until recently have wielded the power. They didn't want you to be able to add features, as it would then be harder for them to differentiate their 'exclusive' handset offerings. This has changed, however, as device manufacturers have begun calling the shots and the carriers are now tripping over one another to argue about who is the most 'open'. While this is a positive development overall, it does have security implications. The more freedom you provide someone, the more likely that freedom will be abused. Providers are taking different approaches to an 'open' model. Apple for example allows you to install whatever you want, so long as they approve of it. Are they, however, assessing submitted applications for security weaknesses, or just undesirable functionality? We suspect it's the latter. Google on the other hand is taking a more 'pure' open source approach and not locking down as many portions of the O/S for developers. Will attackers abuse these new open platforms? You bet.