Desktop virtualization has been making strong inroads into today’s workforce. Success stories abound of how firms are benefiting from the move to desktop virtualization. From tellers at banks to customer service executives at BPOs to mobile business executives, it has made a difference to IT operations and delivered tangible benefits. However, there have rarely been examples of desktop virtualization for power users. Power users are typically coders, testers, graphic artists, designers, scientists requiring complex calculation outputs, research analysts and so on: essentially, anyone requiring intense compute and storage power with a highly customizable system in tow.

The reason isn’t hard to find. The need for high processing power and storage, and the non-standard, highly individualistic & customized nature of each user’s system, bring their own implementation challenges. Add to that users’ expectations of performance akin to, if not exceeding, the normal desktop environment they are used to.

However, in our experience there is no reason why power users have to be devoid of the benefits that desktop virtualization offers. The challenge lies in delivering high compute & storage without negating the business case. An added challenge is delivering IT simplicity & efficiency without compromising on the customization needed.

We undertook such an exercise for a global managed analytics provider. The firm provides services in market research, retail and brand pricing. Its users fell into three major types:

  1.  Analysts, who did client research & generated reports out of reams of data.
  2. Programmers, who created custom built applications suited for each client engagement.
  3. Testers, who tested the applications developed on different platforms & use case scenarios.

Employees use the firm’s proprietary big data platform as well as more than 130 applications, including IBM SPSS, IBM Dimension 6, Flowspeed, FTP and several different industry platforms, to deliver results. In addition, several applications had customized Outlook plug-ins. The firm’s technology infrastructure can best be described as high-compute, high-storage, non-standard and highly customized, yet one requiring accelerated provisioning, flexibility for ramp-ups and ramp-downs, and rapid roll-outs and updates.

In such a scenario, while it is normal for VDI implementers to focus on uniformity & standardization, a one-size-fits-all approach to VDI implementation just doesn’t apply. On the other hand, addressing individualistic customization needs can lead to over-provisioning of resources and a VM sprawl that becomes difficult for administrators to manage efficiently. Although VMs are easily created, they carry the same licensing, support, security and compliance issues that physical machines do, which can defeat the gains of virtualization. The skill lies in walking the fine line between customization & standardization. We are glad to report that the company is currently considering expanding its virtualized set-up.

Virtualization has several benefits, reducing TCO, improving application delivery and enhancing end-user experience being the more significant ones. The key to reaping these benefits lies not only in identifying the type of virtualization best suited to your business requirements, but also in choosing the right technology partner capable of guiding you through and beyond your infrastructure overhaul.

We came across an interesting blog post that discusses performance management on the cloud and the toss-up between public and private clouds. You can read it here:

Why Performance Management is Easier in Public than On-Premise Clouds — Performance is one of the major concerns in the cloud. But the question should not really be whether or not the cloud performs, but whether the application in question can and does perform in the cloud.

The main problem here is that application performance is either not managed at all or managed incorrectly, and therefore this question often remains unanswered. Granted, performance management in cloud environments is harder than in physical ones, but it can be argued that it is easier in public clouds than in on-premise clouds or even a large virtualized environment.

How do I come to that conclusion? Before answering that, let’s look at the unique challenges that virtualization in general, and clouds in particular, pose to the realm of APM.

We believe performance management will become easier in private clouds than in public ones. This is mainly because the two different groups that manage infrastructure in public clouds can also be siloed, and this can result in a number of performance problems for end-users. So whether public or private, it is critical that all the dependent factors are woven together and proactively monitored.

I believe the basis of performance management has to be end-user experience management. Unfortunately, most approaches to monitoring are inward-focused and don’t really look at what effect breaches of various system thresholds have on end-users. Admittedly, it’s not easy to put in place a system that consistently measures end-user performance, but it’s also not that complex if attempted through proper process charts. We have repeatedly seen customers with non-integrated performance management that ends up as a reactive system, because what is being monitored has no relation to what is being delivered to end-users.

My recommendation, be it private or public cloud, is to start your performance management from the end-user and move up the ladder to the data center. Connect all the points, identify dependencies, define relative thresholds (relate them to the kind of impact each will have on the end-user) and create an agent-less system to monitor end-user experience. This has proven effective for us in both private & public cloud deployments. It can be a lot easier in a private cloud, where a single integrated system can connect them all; in a public cloud we may be restricted by the different methods employed by different vendors to measure performance.
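The idea of relative thresholds tied to end-user impact can be sketched in a few lines. This is a minimal illustration, not a product implementation: the level names, multipliers and baseline figures are all assumptions chosen for the example, the point being that impact is judged against a per-application baseline rather than a raw system metric.

```python
# Sketch: classify end-user experience against a per-application baseline.
# All names and thresholds below are illustrative assumptions.
from statistics import median

# Relative thresholds: impact is defined as a multiple of the baseline
# response time, so the same rule works for fast and slow applications.
IMPACT_LEVELS = [
    (1.5, "ok"),        # up to 1.5x baseline: acceptable to users
    (3.0, "degraded"),  # up to 3x baseline: users notice slowness
]

def classify(samples_ms, baseline_ms):
    """Map measured end-user response-time samples to an impact label."""
    m = median(samples_ms)  # median resists one-off outliers
    for factor, label in IMPACT_LEVELS:
        if m <= factor * baseline_ms:
            return label
    return "breached"       # beyond 3x baseline: treat as an end-user incident

print(classify([220, 250, 240], baseline_ms=200))    # -> ok
print(classify([900, 950, 1000], baseline_ms=200))   # -> breached
```

An agent-less monitor would feed `samples_ms` from synthetic probes or network-level measurements rather than from software installed on each desktop, which is what keeps the approach workable across both private and public clouds.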

With mobility being the new employee mantra, IT teams are struggling to keep in step with the challenges that the mobile workforce brings. According to Gartner, by 2017 nearly 38% of organizations will embrace BYOD and stop providing devices to their employees. Our conversations with Indian CTOs tell us that mobility is a top concern while BYOD is still some way off, though it is being embraced in niche areas like agency workforces in insurance or sales forces in FMCG. Overall, though, desktop-based systems aren’t disappearing anytime soon, and IT heads still have huge inventories of PCs that they regularly need to refresh to ensure productivity levels. So many CTOs may find themselves wondering: is it better to refresh my desktops, or should I look at virtual desktop solutions and consider thin clients? The answer, as always, is: it depends! We discuss three of the most common cases below.


Most organisations follow a 5-year hardware refresh cycle, but in India it is not uncommon to come across enterprises that will stretch that to 7 years or beyond! Essentially, it’s a case of “if it ain’t broke, don’t fix it”. In such cases, can IT expect to establish a case for virtualized desktops rather than invest in new PCs? On the face of it, you don’t have to be a genius to say NO! But these are precisely the cases where business is driving IT to go beyond replacement. So can a business case be built? How do you compare a 35,000 PC price to the initial investment required to bring in the IT efficiency that virtual desktop solutions bring? The point is that you need to compare apples to apples. Even if a PC costs you just 35,000, what does support cost? How much power does the PC consume? How much does a data breach cost? And so on. We have found that if IT can think business saving rather than IT saving, a business case can easily be built for replacing 500 desktops with virtualized desktops.
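The apples-to-apples comparison above is easy to make concrete. The sketch below works through a hypothetical 5-year total cost of ownership for 500 seats; apart from the 35,000 PC price mentioned in the article, every figure (support, power, thin-client price, shared infrastructure) is an assumed placeholder, not a quote, and real numbers would come from your own vendors and bills.

```python
# Illustrative 5-year TCO comparison: PC refresh vs. virtual desktops.
# Only the 35,000 PC price comes from the article; all other figures
# are assumptions for the sake of the example.

def five_year_cost(unit_price, annual_support, annual_power, units, shared_infra=0):
    """Total cost of ownership for a fleet over a 5-year refresh cycle."""
    per_unit = unit_price + 5 * (annual_support + annual_power)
    return units * per_unit + shared_infra

# Traditional PC refresh: each desktop carries its own support and power cost.
pc_tco = five_year_cost(unit_price=35_000, annual_support=4_000,
                        annual_power=1_500, units=500)

# Virtual desktops: cheaper endpoints, lower per-seat running costs, but a
# shared back-end investment (servers, storage, licensing) spread across seats.
vdi_tco = five_year_cost(unit_price=12_000, annual_support=1_000,
                         annual_power=300, units=500,
                         shared_infra=6_000_000)

print(f"PC refresh : {pc_tco:,}")   # 31,250,000
print(f"VDI        : {vdi_tco:,}")  # 15,250,000
```

With these assumed inputs the virtualized option wins comfortably, but the real value of the exercise is that it forces the hidden per-seat costs (support, power, breach risk) into the same ledger as the sticker price.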


With support for Windows XP coming to an end, enterprises are saddled with multiple systems spread across the company that are now vulnerable to data loss and security breaches. Lack of support for XP may also mean software compatibility issues, which can lead to user dissatisfaction and productivity loss. In such cases CTOs are faced with the question of whether to move to Windows 7 or use that as a trigger to transform delivery. As always, in many cases it will be driven by hard numbers. Migrating to Windows 7 means investing in Windows 7 licenses and, frequently, hardware upgrades. Assuming you are ready to spend on both, the residual life of the existing desktop is worth considering. We have found that the additional investment in licenses and hardware upgrades, on an already sweated asset, makes little sense, especially when all that investment can come to naught if the PC itself starts to wear out. The same amount, when directed towards a virtual desktop solution, creates an opportunity to benefit from the IT efficiencies while postponing the need to change the PC. As and when the PC wears out, it can be replaced with a lower-cost thin client, with the benefits of virtualization gained from the start.


Many visionary CTOs have used business expansion as a trigger to transform. The business case here is not unlike that of the PC refresh case, except that you need to factor your organisation’s refresh cycle and attendant financials into those of a virtualized environment. Beyond the economics, this is the “perfect” case to test a new and better technology. It also gives you a clean slate as far as end-users are concerned, so governance and culture around the use of desktops & applications can be laid down with no baggage or comparisons to unlimited storage and downloads. With the right partner, IT heads are quickly able to demonstrate the many advantages that virtual desktop solutions can deliver and, in our experience, never look back.


Every CTO has to find a way to balance being a pragmatist and a visionary. Most CTOs understand that newer technologies such as desktop virtualization have benefits, but convincing non-technical managements and boards means justifying the investment with a “business case”. Unfortunately, in most cases the term business case is used to convey the economic case while business value is ignored. Perhaps that is the key to getting support for your desktop virtualization initiative: emphasize the business value while showing you have done enough hard work on the economic case.

