Big data is not a silver bullet; however, when applied to appropriate use cases, it can bring significant value to banks and insurance companies. This was the core message from the 2014 Big Data in Financial Services conference that was chaired by Gary Allemann, MD of Master Data Management, in Dubai earlier this month.

Most credit card users will, at some stage, have received a call from their bank to verify a transaction. Maybe the transaction amount was much larger than usual, maybe the goods purchased were unusual, or the transaction was made in an unusual location. This is an example of big data analytics for fraud detection.
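The kind of check described above can be sketched as a simple rule-based screen. This is a toy illustration only, not any bank's actual model: the thresholds, field names and transaction history below are all invented for the example.

```python
# Minimal rule-based transaction screening, in the spirit of the fraud
# checks described above. All thresholds and fields are illustrative.

def flag_transaction(txn, history):
    """Return a list of reasons a transaction looks unusual, if any."""
    reasons = []
    amounts = [t["amount"] for t in history]
    average = sum(amounts) / len(amounts)
    if txn["amount"] > 3 * average:                        # much larger than usual
        reasons.append("amount far above customer average")
    if txn["country"] not in {t["country"] for t in history}:
        reasons.append("unfamiliar location")              # unusual location
    if txn["category"] not in {t["category"] for t in history}:
        reasons.append("unfamiliar merchant category")     # unusual goods
    return reasons

history = [
    {"amount": 40, "country": "ZA", "category": "groceries"},
    {"amount": 65, "country": "ZA", "category": "fuel"},
]
suspect = {"amount": 900, "country": "AE", "category": "jewellery"}
print(flag_transaction(suspect, history))
```

A transaction that trips one or more rules would trigger the verification call; in practice, banks combine many more signals and statistical models rather than fixed rules.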

Another common use for big data in financial services is to better understand customer behaviour – whether to reduce churn or to cross-sell. An integrated view of customer activities across channels can deliver insight as to where that customer is in a buying cycle, or whether similar activities were the precursor to churn in other clients. Big data provides the insight to improve the customer experience and, ultimately, results in increased revenue.
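The integrated cross-channel view described above can be sketched in a few lines. This is a deliberately simplified example: the event log, channel names and list of churn-precursor actions are invented, and a real implementation would draw on far richer data and learned models.

```python
# Toy sketch of an integrated customer view across channels, and a naive
# churn-risk score based on it. All event and action names are illustrative.
from collections import defaultdict

# Actions that, hypothetically, preceded churn in other clients.
CHURN_PRECURSORS = {"complaint", "balance_transfer_out", "statement_download"}

def integrated_view(events):
    """Group raw (customer, channel, action) events into one timeline per customer."""
    view = defaultdict(list)
    for customer_id, channel, action in events:
        view[customer_id].append((channel, action))
    return view

def churn_risk(customer_events):
    """Fraction of a customer's actions that match known churn precursors."""
    hits = sum(1 for _, action in customer_events if action in CHURN_PRECURSORS)
    return hits / len(customer_events)

events = [
    ("c1", "web", "statement_download"),
    ("c1", "call_centre", "complaint"),
    ("c1", "branch", "balance_transfer_out"),
    ("c2", "mobile", "login"),
]
view = integrated_view(events)
print(churn_risk(view["c1"]))   # every c1 action is a precursor -> 1.0
```

The point of the sketch is the structure, not the scoring: once activity from every channel lands in one place, questions like "where is this customer in the buying cycle?" become straightforward queries.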

“For each use case discussed, the importance of sound planning, quality data and data governance was stressed. Big data brings with it additional complexity and, if approached in an ad hoc way, is likely to fail. Businesses must also address privacy concerns, particularly when big data analytics is applied in the cloud.

“One approach, implemented by a global exchange, is to allow users to access cloud-based processing power – for example, to run real-time fraud analytics on trading data – while storing all sensitive data within the secure internal environment. This hybrid approach balances the need for governance with the cloud’s ability to reduce costs,” says Allemann.
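One common way to implement the hybrid pattern described above is tokenisation: sensitive fields are swapped for opaque tokens before records leave the internal environment, and the token-to-value mapping never does. The field names and token scheme below are invented for illustration; this is not the exchange's actual design.

```python
# Hedged sketch of the hybrid governance pattern: sensitive values stay in an
# internal vault; only de-identified records are sent to cloud analytics.
import secrets

vault = {}   # internal, secure mapping of token -> sensitive value

def tokenise(record, sensitive_fields):
    """Return a copy of the record safe to ship to the cloud."""
    safe = dict(record)
    for field in sensitive_fields:
        token = "tok_" + secrets.token_hex(8)
        vault[token] = safe[field]     # original value stays on premises
        safe[field] = token            # only the token leaves
    return safe

trade = {"account": "ACC-001", "symbol": "XYZ", "volume": 5000}
cloud_copy = tokenise(trade, ["account"])
# cloud_copy carries the trading data needed for analytics, but no account ID
```

Analytics in the cloud can still aggregate, join and score on the tokens; re-identification requires the internal vault, which is exactly the control the governance requirement demands.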

Big data is, in many ways, synonymous with the Hadoop platform. One important lesson is that Hadoop does not replace existing Business Intelligence (BI) and data warehousing solutions. However, where traditional Enterprise Data Warehouse (EDW) solutions are highly structured and difficult to adapt to new data sources, Hadoop provides the flexibility to rapidly integrate data from multiple sources – both structured and unstructured.

“This is the basis of a use case described by a global bank – to optimise their data warehouse through the use of a Hadoop landing area that reduces both the time and cost of data integration, and also allows them to cost effectively store much more data,” he adds.
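The landing-area idea rests on schema-on-read: raw data is stored untouched, and structure is applied only when the data is consumed, which is what removes the upfront integration cost. The sketch below mimics this with an in-memory file; a real landing zone would sit on HDFS, and the record layout here is invented for illustration.

```python
# Sketch of a schema-on-read landing area: records land as-is, and a schema
# is applied only at read time. File layout and fields are illustrative.
import io
import json

landing_zone = io.StringIO()          # stands in for a raw file in HDFS

def land(raw_line):
    """Store incoming data untouched -- no upfront modelling required."""
    landing_zone.write(raw_line + "\n")

def read_with_schema(fields):
    """Apply a schema only when reading, keeping just the fields needed."""
    landing_zone.seek(0)
    return [{f: json.loads(line).get(f) for f in fields}
            for line in landing_zone if line.strip()]

# Source systems can dump records with differing shapes; nothing breaks.
land('{"id": 1, "amount": 120.5, "note": "free text"}')
land('{"id": 2, "amount": 80.0, "channel": "mobile"}')
rows = read_with_schema(["id", "amount"])
```

Because new sources can start landing data immediately, the warehouse team models only what the business actually consumes, which is the time and cost saving the bank describes.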

“The bank stressed the importance of commercial support for their Hadoop environment: Hadoop’s open source nature carries risk. The framework is evolving and has a number of competing options – some of which may not survive. It is also technically complex, requiring an investment in Java, R, Python and other specialist development environments.”

The real value of big data analytics is best realised through a self-service big data platform, such as Datameer. Datameer provides a single, intuitive application for the full analytics life cycle – combining data integration, data management and self-service analytics on top of Hadoop.

“Self-service big data platforms, such as Datameer, can shield your organisation from much of this technical complexity, providing the governance and management framework to exploit big data in a fraction of the time and effort required using the pure Hadoop stack,” concludes Allemann.