There is nothing worse than phoning a call centre and waiting while the consultant apologises for their system being slow, or trying to complete a transaction online only for it to fail partway through.

By Lourens Sanders, solution architect at Infinidat

What the customer sees on the front end relies on the back end, and the customer experience is thus heavily tied to the database. Storage can become a bottleneck that causes performance issues if the incorrect storage architecture is in place to support the required workload.

Intelligent software in the form of neural caching, paired with Dynamic Random-Access Memory (DRAM), offers a cost-effective way to prevent storage from becoming a performance and customer experience bottleneck.

An online shopping example

When a user shops online, many backend processes need to happen. For an item to be reserved for a sale, the system needs to query the database and return the result.

The payment gateway then queries its own database to process the payment, and the transaction must be logged to complete the sale, all as quickly as possible. Each of these steps involves many processes, of which storage is an important component.
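The checkout flow above can be sketched as a chain of storage-backed steps. A minimal illustration, with hypothetical names and in-memory dictionaries standing in for real database tables:

```python
inventory = {"sku-123": 5}   # item stock, normally a database table
payment_log = []             # completed transactions, normally a ledger table

def reserve_item(sku):
    """Query the 'database' and reserve one unit if stock remains."""
    if inventory.get(sku, 0) > 0:
        inventory[sku] -= 1
        return True
    return False

def process_payment(sku, amount):
    """Stand-in for a payment-gateway call against its own database."""
    return {"sku": sku, "amount": amount, "status": "approved"}

def complete_sale(sku, amount):
    """Each step is a separate storage-backed query; latency in any
    one of them delays the whole transaction."""
    if not reserve_item(sku):
        return None
    receipt = process_payment(sku, amount)
    payment_log.append(receipt)   # the sale must be logged to complete
    return receipt

print(complete_sale("sku-123", 19.99)["status"])  # approved
```

Every function here hits storage in a real system, which is why latency at any one step compounds across the whole transaction.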

If the storage at the back end cannot handle the volume of transactions being processed, it may contribute to degraded performance. If there is latency and the product searches take too long, customers may lose interest and take their business elsewhere.

The speed and performance of the underlying storage are essential for the customer experience as a whole.

High-performance storage is a must

E-commerce is not the only environment that requires a high-performance storage layer.

There are many others, including online transactional processing applications such as financial trading, virtualised workloads and virtual machines, databases, streaming services and, importantly in today's world, DevOps: the ability to develop and roll out applications in real time.

Any lag or latency will negatively impact the ability of all these applications to function, so when data is queried, it needs to be readily available.

Demystifying DRAM and the neural cache

Intelligent storage software, in the form of neural caching algorithms, is the solution to cost-effective high-performance storage at a petabyte scale, even in multi-tiered storage architectures.

DRAM is a type of volatile system memory that holds the data of applications that are currently running, enabling those applications to access it extremely quickly. Because it is random-access, applications can query data either sequentially or in any order; access time does not depend on where the data is located in memory.

The real performance boost comes from the neural cache, the software component: a caching technique that uses machine learning and artificial intelligence to improve storage performance by accurately predicting which data will be needed and prefetching it into DRAM for near-instant access.
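The predict-and-prefetch idea can be illustrated with a toy cache. Real neural caching uses ML models trained on I/O telemetry; this sketch substitutes a simple frequency heuristic (learn which block tends to follow each block, then prefetch the likely successor into a DRAM-like dictionary). All names are hypothetical:

```python
from collections import defaultdict, Counter

class PrefetchCache:
    """Toy stand-in for a neural cache: learns access-pattern transitions
    and prefetches the predicted next block into a fast tier."""

    def __init__(self, backend):
        self.backend = backend                    # slow tier: block id -> data
        self.dram = {}                            # fast tier (simulated DRAM)
        self.successors = defaultdict(Counter)    # learned access patterns
        self.last = None
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.dram:
            self.hits += 1
        else:
            self.misses += 1
            self.dram[block] = self.backend[block]   # fetch from slow tier
        if self.last is not None:
            self.successors[self.last][block] += 1   # learn the transition
        likely = self.successors[block]
        if likely:                                    # prefetch predicted next block
            predicted = likely.most_common(1)[0][0]
            self.dram.setdefault(predicted, self.backend[predicted])
        self.last = block
        return self.dram[block]

backend = {i: f"block-{i}" for i in range(4)}
cache = PrefetchCache(backend)
for b in (1, 2, 3):
    cache.read(b)        # training pass: all misses
cache.dram.clear()       # simulate data aging out of DRAM
for b in (1, 2, 3):
    cache.read(b)        # blocks 2 and 3 are prefetched ahead of access
print(cache.hits, cache.misses)  # 2 4
```

On the second pass only the first block misses; the predictor has already staged the rest in the fast tier, which is the effect the neural cache aims for at scale.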

The algorithm intelligently places data onto different storage media, with metadata tagging for tracking purposes, so that the system can accurately predict which data is needed at the front end based on different input/output profiles and access patterns.
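Placement across tiers driven by access metadata might look like the following sketch. The tier names, thresholds and metadata fields are illustrative assumptions, not the vendor's actual policy:

```python
def place(blocks, hot_threshold=100, warm_threshold=10):
    """Assign each block to a storage tier from its access-count metadata.

    blocks: dict of block id -> metadata tags (access count, I/O profile).
    Heavily accessed blocks land on the fastest media; cold data on disk.
    """
    placement = {}
    for block_id, meta in blocks.items():
        if meta["accesses"] >= hot_threshold:
            placement[block_id] = "dram"
        elif meta["accesses"] >= warm_threshold:
            placement[block_id] = "flash"
        else:
            placement[block_id] = "disk"
    return placement

blocks = {
    "b1": {"accesses": 500, "pattern": "random"},
    "b2": {"accesses": 42,  "pattern": "sequential"},
    "b3": {"accesses": 3,   "pattern": "sequential"},
}
print(place(blocks))  # {'b1': 'dram', 'b2': 'flash', 'b3': 'disk'}
```

A production system would weigh the I/O profile tags as well as raw counts, but the principle is the same: metadata tracking lets the software decide which media each block belongs on.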

A variety of applications

Any application can benefit from such a caching mechanism, particularly those that require batch processing, because searching for the data blocks to include in the batch can waste significant time.

With DRAM and neural cache, the entire batch can run quickly, because the data it needs is already available in DRAM when the job starts.
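The batch-processing benefit can be demonstrated with a simulated slow tier. The latency figure and helper names are made up for illustration; the point is only that a batch whose blocks were prefetched into DRAM avoids the per-block disk penalty:

```python
import time

SLOW_TIER_LATENCY = 0.001  # simulated per-block disk latency (1 ms)

def fetch_from_disk(block_id):
    time.sleep(SLOW_TIER_LATENCY)   # model the slow storage tier
    return f"data-{block_id}"

def run_batch(block_ids, dram):
    """Process a batch; blocks missing from DRAM pay the disk penalty."""
    return [dram.get(b) or fetch_from_disk(b) for b in block_ids]

batch = list(range(200))

cold_start = time.perf_counter()
run_batch(batch, dram={})                       # every block misses
cold = time.perf_counter() - cold_start

dram = {b: fetch_from_disk(b) for b in batch}   # prefetch ahead of the run
warm_start = time.perf_counter()
run_batch(batch, dram)                          # all hits, no disk I/O
warm = time.perf_counter() - warm_start

assert warm < cold
```

The cold run pays the slow-tier latency for every block; the warm run, with data staged in advance, completes in a fraction of the time.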

When it comes to virtualisation, the neural cache can also bring significant benefits, enabling faster deployment of servers, containerised workloads, backup and restore applications, and virtual machines.

Any instance where storage can be a bottleneck to performance can benefit from DRAM and neural cache, with superior performance gained using the right combination of media, architecture and intelligent software capability.