The conversation on virtualisation to date has centred on infrastructure – servers, desktops, networks. That conversation is now heading into the application layer, and specifically application testing, a critical part of any software development project.
Back in the development dark ages (about 15 years ago), thought leaders in the field realised that the waterfall approach to software development was not necessarily the best or most efficient.
Waterfall development happened sequentially – requirement, specification, design, code, test. This meant that any major flaws in a project’s specification, design or code were only discovered at the end of the project – often far too late to be remedied or mitigated.
The waterfall approach consequently gave way to agile development methodologies, whereby software is developed in small chunks. The specify, design, code sequence is repeated many times over the course of a project, and developers test early and often.
Or so best practice dictates.
“Something else happened that made it difficult to test early – the move to as-a-service computing. Today developers are building systems of systems; these systems consume services from other service providers and pull them all together into one offering,” says IndigoCube MD Ziaan Hattingh.
“This is difficult to test because, for example, if you’re developing a payment system that links to SWIFT or Mastercard, you need access to that service to test the system.”
These systems of systems are so complex to test, Hattingh notes, that people tend to leave testing until last.
“This means they often find big, dangerous flaws only at the end of the development project, when they are very expensive to fix.”
Until virtualisation technology took that one necessary step forward and made virtual testing possible, that is.
“A virtual testing environment enables you to create virtual services – so you can develop your software, then plug it into virtual services (like a mock Mastercard system) to test as you go. The virtual system responds appropriately, so you can test the solution early and often and fix problems as you go.”
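To make the idea concrete, here is a minimal sketch of such a virtual service in Python, using only the standard library. The /authorise endpoint, the JSON fields and the approve-or-decline rule are invented for illustration; in practice, service virtualisation tools typically generate stubs like this from recorded traffic rather than by hand.

```python
# A minimal sketch of a virtual (stubbed) payment service. The /authorise
# endpoint, the JSON fields and the approve/decline rule are all invented
# for illustration; a real Mastercard or SWIFT mock would differ.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class VirtualPaymentService(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the request sent by the system under test.
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")

        if self.path == "/authorise":
            # Respond "appropriately", as a scripted stub would:
            # approve small amounts, decline large ones.
            approved = request.get("amount", 0) <= 10_000
            body = json.dumps({
                "transaction_id": "TEST-0001",  # canned test data
                "status": "APPROVED" if approved else "DECLINED",
            }).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # The system under test is pointed at this address instead of the
    # live payment provider.
    HTTPServer(("localhost", 8080), VirtualPaymentService).serve_forever()
```

The system under test simply swaps the live provider’s address for the stub’s, and development can proceed without waiting for access to the real service.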
While virtual systems are not real, they are realistic. And they offer another big benefit – cost savings.
“Creating a test environment is expensive,” Hattingh comments. “You need a laboratory environment, licenses for test software, hardware and so on. And if you are developing multiple features or services, you need a separate test environment for each. Virtual testing environments are substantially more cost-effective, and they allow you to do performance testing early on.
“Your networks, servers, databases and database types all affect performance and come together to form your performance profile. This is also usually left until last, because that is when all these elements are finally brought together,” he continues.
“Now you can test your software’s performance early, as functions become ready, so that by the end, performance, functionality and integration have all been tested,” he says.
“You will probably still do final testing on the full environment, but you won’t be in for any big, costly surprises.”
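An early performance check against such a stub can be equally lightweight. The sketch below times repeated calls to the hypothetical virtual service above; the URL, payload and sample size are illustrative assumptions, and a real project would graduate to a dedicated load-testing tool for that final pass on the full environment.

```python
# A rough sketch of early performance testing against the virtual service
# above. The timings characterise the system plus the stub, not the live
# provider; URL, payload and sample size are illustrative assumptions.
import json
import time
import urllib.request

URL = "http://localhost:8080/authorise"  # the virtual service, not production

def one_request(amount: int) -> float:
    """Send one authorisation request and return the elapsed seconds."""
    payload = json.dumps({"amount": amount}).encode()
    req = urllib.request.Request(
        URL, data=payload, headers={"Content-Type": "application/json"}
    )
    start = time.perf_counter()
    with urllib.request.urlopen(req) as resp:
        resp.read()
    return time.perf_counter() - start

if __name__ == "__main__":
    timings = sorted(one_request(amount=500) for _ in range(100))
    print(f"median: {timings[50] * 1000:.1f} ms, "
          f"95th percentile: {timings[94] * 1000:.1f} ms")
```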