Does April 8, 2014 ring a bell? That was the day that Windows XP officially went end of life. Yet, somehow, I still see Windows XP systems out in the wild. Not just at your grandmother’s house — I’m talking about at large, multinational corporations. XP’s sticking power has been both baffling and impressive, still boasting an 8% market share on PCs despite being 16 years old and 3 revisions behind Microsoft’s current OS.
How did XP become so ubiquitous? Why has XP stuck around for so long? In short, it’s mostly because it just worked. Most things businesses use still run on it, and the work most people do can still be accomplished with it. XP came around just after the dot-com boom, when everyone was getting online and companies were investing heavily in a distributed client/server model. Tremendous investment was made on top of the XP platform, and some of that infrastructure is still in use today.
Another reason? Cloud. Cloud is diminishing the role of the end user client, turning it into a consumption device rather than a creation platform. More and more applications are being moved to the cloud, allowing users to access their work from multiple locations, multiple devices, and numerous device types.
EUC devices—the gateway to your network
Despite the current hype around peer-to-peer models like blockchain, client/server computing still dominates the landscape. End users log on with an end user computing (EUC) device—still a PC more often than not within enterprises—and consume data hosted on a server via a client application. However, these client applications are more commonly becoming web-based, rather than the traditional “fat client” of the 1990s and 2000s, leading to opportunities to change the way we think about EUC. This has already happened in the consumer market, where mobile device usage already exceeds traditional desktop computing.
While servers are typically far more powerful than the clients that connect to them, modern client systems are still powerful computing devices that can be very useful to an attacker if control falls into the wrong hands. The City of Atlanta is one of many organizations to fall victim to ransomware attacks recently, joining the Colorado Department of Transportation, Davidson County (North Carolina), Boeing, and many others. Most of these attacks originated on end user devices.
The most common malware delivery vector is social engineering of end users. One of the most infamous data breaches of recent years, Target's, was achieved through a chain of events that started with a phishing email to an employee of one of Target's vendor partners. Once the malware was present on that user's PC, the attackers could steal the credentials the vendor used to log on to Target's systems.
The easiest way to prevent malware from being successful? Deny it the means to execute. There are many ways to do this, each with its own pros and cons. The approaches I most commonly see involve restricting the existing end user client, utilizing virtual desktop infrastructure (VDI), and using cloud-enabled clients.
Restricting the end user client
Although Mac OS has made surprising gains in the enterprise market of late, the majority of enterprises are still predominantly running Windows PCs. As a result, the majority of exploits designed to attack end users are written for Windows PCs. Anti-malware software for PCs has improved over the years, but attackers continue to find ways to evade detection.
Application whitelisting is a popular way to restrict end user clients. It takes a different approach from most anti-malware solutions: rather than blacklisting known-bad software, which attackers can evade, it locks down your devices so that only known, approved applications can execute. NIST endorses this approach and has published a detailed guide on how to implement application whitelisting in your environment.
The more of your applications that are browser-based, the easier application whitelisting is. If you're still heavily reliant on fat clients, it may not be right for you, as it can be a bit of an operational burden. It can, however, help extend the life of your existing desktop architecture, allowing it to be locked down to serve specific functions, run browser-based apps, or even serve as a terminal for VDI.
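To make the mechanism concrete, here is a minimal sketch of hash-based allowlisting in Python. This is a toy illustration only: real products such as Microsoft AppLocker enforce policy inside the operating system, and the allowlist here (with its single example hash) is entirely hypothetical.

```python
import hashlib

# Hypothetical allowlist of SHA-256 digests for approved binaries.
# In a real deployment this would be centrally managed and tamper-protected.
APPROVED_HASHES = {
    # SHA-256 of an empty file, included purely for illustration
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_approved(path: str) -> bool:
    """Allow execution only if the file's hash is on the allowlist."""
    return sha256_of(path) in APPROVED_HASHES
```

Note the deny-by-default posture: anything not explicitly on the list is refused, which is the inverse of blacklist-style anti-malware and the reason whitelisting is harder to evade.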
VDI—finally ready for prime time
Virtual desktop infrastructure (VDI) has been around for years, but it has taken time to mature. Early VDI solutions were costly to license and required both expensive back-end hardware and always-on network connectivity. On top of this, many early-adopter organizations failed to take the right approach to get the most out of the technology, generally applying a "lift and shift" mentality: moving their existing desktop model into the virtual world without taking advantage of the unique features VDI can offer.
Newer VDI solutions are more affordable thanks to new licensing models, and they can make you much more secure by turning your datacenter into a private cloud of non-persistent virtual desktops. These solutions reduce the need for patching by providing the user a freshly updated, fully patched desktop each time they log in. This is a huge win for enterprises, as patching remains one of the biggest operational pain points. Although we generally assume most breaches result from difficult-to-prevent hacking, approximately 57% of breach victims reported that the exploit used against them was a known vulnerability for which a patch was available but had not been properly distributed.
With the reduced cost of enterprise storage, as well as the ubiquity of broadband and 4G internet, the virtual desktop is now a viable solution for the enterprise, enabling organizations to adopt bring your own device (BYOD) programs, leverage thin clients, and hyper-restrict PC endpoints, leaving fewer opportunities for malware to take hold within their enterprise footprint.
Using cloud-enabled applications and devices
With the advent of browser-based client computing, as well as the rise of Software-as-a-Service, there is opportunity to take advantage of a completely different type of end-user computing device: the cloud-enabled client. One of the most popular cloud-enabled client operating systems is Google’s Chrome OS. Chrome OS devices like Chromebooks are designed to run cloud apps, while simultaneously allowing you to reduce your end user computing attack surface.
In 2017, Chrome OS devices accounted for 5.5% of all PCs sold, coming in third behind Windows and Mac OS X, and ahead of Linux. These devices are highly secure, very manageable, and capable of working with most browser-based applications.
Thin clients, typically based on Linux or stripped-down versions of Windows, are often thought of as mere terminals for VDI, but many can also run an assortment of browser-based apps. These devices are often incompatible with common malware, offer write filters that prevent unapproved applications from executing, are inexpensive, and last for years.
The IoT is just going to make this worse
A casino in Las Vegas recently had a database of high-profile clients stolen by hackers who accessed their network via a vulnerability in a network-connected thermostat used to monitor the temperature of water in the lobby fish tank.
Many internet-connected devices run highly insecure operating systems (some of them Windows XP-based!), up to and including some router/firewall devices used by businesses. As the Target example shows, one weak link can allow attackers to spread across your network. Looking beyond your EUC devices, it's critical to make sure you are choosing the right devices, keeping your hardware portfolio refreshed, and keeping firmware and software patches up to date.
Operating system vendors are already planning for the demise of the OS
Looking strategically toward the future, you should be moving beyond the traditional end user desktop and laptop setup. After all, Microsoft itself is reorganizing around the cloud and de-emphasizing Windows; do you want to be more reliant on Windows than Microsoft is? Some are even calling the cloud the death of Windows. And it's not just Microsoft planning for a future that isn't OS-centric; Red Hat is looking beyond Linux as well.
Many organizations have struggled with the sunset of Windows XP (and Windows Server 2003, its server counterpart). Some are still struggling with it today, four years after it went end of life. By moving to a less OS-centric model, you can begin to future-proof your environment while providing better security, more flexibility, and a better user experience.
This article is published as part of the IDG Contributor Network.