Application virtualisation is catching on. Philip Anderson of EOH Navigor maintains that this is thanks to the vast amount of processing power now available on the market, which makes it possible for companies to move towards smaller hardware footprints while benefiting from the far greater resources these provide. As such, application virtualisation would seem an important space to watch.
When it comes to processing power, the sky is fast becoming the limit. The development of blade servers means that companies can bulk up on processing power and memory while cutting down on cost and space. While this in itself is good news for anyone who runs software, the better news is that virtualisation builds on these gains, enabling users to shift resources among the multiple applications running on a single hardware platform.
In practical terms, application providers can now change where their applications run to better suit their customers' needs. If a company has a more interactive data requirement during the day and more of a batch-processing need at night, the provider can allocate its hardware resources accordingly.
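As a minimal sketch of the idea, the day/night allocation described above can be thought of as a simple time-based scheduler. The workload names, vCPU counts and business hours below are purely illustrative assumptions, not part of any real hypervisor's API.

```python
from datetime import time

# Hypothetical resource pool: 32 vCPUs shared by two workloads.
TOTAL_VCPUS = 32

def allocate(now: time) -> dict:
    """Toy day/night scheduler: favour the interactive workload during
    business hours (08:00-18:00 here, an assumed window), and the batch
    workload overnight."""
    daytime = time(8) <= now < time(18)
    interactive = 24 if daytime else 8
    return {
        "interactive": interactive,
        "batch": TOTAL_VCPUS - interactive,
    }

# During the day, interactive work gets the lion's share...
print(allocate(time(10)))  # {'interactive': 24, 'batch': 8}
# ...and at night the batch jobs take over.
print(allocate(time(23)))  # {'interactive': 8, 'batch': 24}
```

In a real deployment this kind of policy would be enforced by the hypervisor or management layer rather than hand-rolled, but the principle is the same: the hardware stays put while the shares shift.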
Virtualisation thus gives application providers much more flexibility and the opportunity to utilise resources in a far more cost-effective way.
Another big benefit of virtualisation is that virtual machines are not tied to a specific physical server; their data sits on a storage area network, on a disc array for example. Should any hardware problem arise (a server going down, for instance), the application is simply shifted to another resource and restarted there. Complete redundancy is therefore now possible.
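The failover step can be sketched in a few lines. This is an illustrative toy, not a real hypervisor API: the function, host names and VM name are all invented for the example. The point is that because the virtual machine's disk image lives on shared storage, any surviving host can start it.

```python
def failover(vm: str, hosts: dict) -> str:
    """Restart `vm` on the first healthy host.

    `hosts` maps host name -> is_up (True if the host is alive).
    Shared storage is assumed, so no data needs to be copied."""
    for name, is_up in hosts.items():
        if is_up:
            return f"{vm} started on {name}"
    raise RuntimeError("no healthy host available")

# server-a has gone down; the VM restarts on server-b.
hosts = {"server-a": False, "server-b": True}
print(failover("erp-vm", hosts))  # erp-vm started on server-b
```

Production high-availability tooling adds health checks, fencing and split-brain protection on top of this basic loop, but the shared-storage principle is what makes the restart possible at all.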
While these opportunities and concepts have arguably been around for over twenty years in the form of huge computer mainframes, it is only now that companies are really starting to harness them, largely because the technology has finally become affordable. Small and mid-sized businesses can now invest in a huge amount of processing power that is literally the size of a box, making for easy housing and maximising their return on investment (ROI).
Despite all of the above, uptake has remained limited, but interest is growing. The testing and bedding-down period for virtualisation seems to be coming to an end, because companies are finally beginning to appreciate that their current hardware set-ups involve an incredible amount of duplication, which virtualisation eliminates. It is now possible, for example, to build an identical production set-up using far less hardware. This also has a knock-on effect on operating licences.
With market interest growing, vendors and product developers are starting to drive these products themselves. Oracle, for example, has just launched its own virtual server software and is additionally certifying its database and other solutions to run on virtual servers.
This is having a huge impact on the market. Not only do customers now have the support they need, but they are also being given a template, allowing easier uptake and smoother running of their solutions. VMware is also playing a significant role in the virtualisation software space, while hardware vendors such as IBM are making their own architectures more scalable than ever before.
The slow but steady virtualisation revolution would thus seem to have begun. Before investing in new database software, then, business decision-makers should draw up their own virtualisation roadmap and a plan to achieve it. In this way they will maximise both their hardware and software investments for as long as possible.