The Internet of Things promises to deliver many benefits and opportunities to businesses and consumers alike. But one thing that has not been fully considered is the massive explosion it will create in the amount of data being generated worldwide, says Sofocles Socratous, vice-president of sales and marketing at Seagate.
How will we manage this surge, and how will we measure it in the coming years? What will happen to all the data that gets collected, and what impact will those decisions have on the technology ecosystem?
Experts at Cisco have projected that the number of machine-to-machine connected objects will reach 50-billion by 2020, which equates to more than 6.5 connected devices per person. Research firm IDC has further noted that this “Internet of Things” market will exceed $50-billion in value by 2018, and that is just in Central and Eastern Europe, the Middle East, and Africa.
As a result, by 2020 there will be approximately 44 zettabytes of data in the world, compared with the 4 zettabytes that exist today. To give you some perspective, a zettabyte is one of the largest units of measure in common use for data storage, and a single zettabyte is enough to store around two-billion years of music.
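That figure can be sanity-checked with a quick back-of-the-envelope calculation, assuming compressed audio at roughly 1 MB per minute (an illustrative assumption, not a number from Seagate or IDC):

```python
# Rough sanity check of the "two billion years of music per zettabyte" figure.
# Assumes compressed audio at ~128 kbps, i.e. roughly 1 MB per minute of music.

ZETTABYTE = 10**21            # bytes in one zettabyte (decimal definition)
BYTES_PER_MINUTE = 10**6      # ~1 MB of audio per minute (assumption)

minutes = ZETTABYTE / BYTES_PER_MINUTE
years = minutes / (60 * 24 * 365.25)

print(f"{years:.2e} years of music per zettabyte")  # ~1.9e+09, i.e. about two billion years
```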
IDC predicts that 13 of those 44 zettabytes created by 2020 will need to be stored somewhere – yet current installed capacity suggests that only about 6.5 zettabytes of storage will be available by 2020.
In short, to make the Internet of Things a reality we clearly need to address the requirement for transformational storage and service solutions now. The good news is that several innovative technologies are in development to help solve this problem.
For example, heat-assisted magnetic recording (HAMR) is expected to push the limits of magnetic recording up to 100 times higher, although it’s unlikely to be commercially available before 2016. Until then, the focus therefore needs to be on how we store data more efficiently.
We have been talking recently about putting companies and employees on a ‘data diet’, and it’s certainly true that smarter data policies in the workplace, better de-duplication methodologies, and more stringent backup strategies will all have their part to play.
Yet this alone is almost certainly not enough. Luckily, there are other exciting developments from a cloud and data centre perspective too. Reliably storing large amounts of data is not without costs, and there is great potential for storage systems to offer more compelling economics from both a CAPEX and OPEX perspective.
There’s an opportunity for new and emerging cloud technologies to deliver the infrastructure that modern organisations need, innovating across the full information stack.
Seagate’s Kinetic platform, for example, is making hard drives more intelligent. Previously they were great at storing data; moving forward they will be able to talk to the rest of the stack to make decisions about where and how data should be stored. This can reduce TCO by up to 40% as well as enhance performance, increase rack density and boost agility.
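To illustrate the general idea of drives that applications address directly by key rather than through a traditional block layer, here is a minimal sketch; the class and method names are hypothetical stand-ins for illustration and not the actual Kinetic API:

```python
# Hypothetical sketch of a key/value-style drive interface, in the spirit of
# object-addressed drives. All names here are illustrative, not the real Kinetic API.

class KeyValueDrive:
    """Toy in-memory stand-in for a drive that exposes put/get/delete by key."""

    def __init__(self) -> None:
        self._store: dict[bytes, bytes] = {}

    def put(self, key: bytes, value: bytes) -> None:
        self._store[key] = value          # the drive decides physical placement

    def get(self, key: bytes) -> bytes:
        return self._store[key]

    def delete(self, key: bytes) -> None:
        self._store.pop(key, None)

drive = KeyValueDrive()
drive.put(b"sensor/42/2020-01-01", b'{"temp": 21.5}')
print(drive.get(b"sensor/42/2020-01-01"))
```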
One approach that will also gain momentum is a more efficient tiered model for storage. By intelligently layering conventional hard disk drives with solid state hybrid drives (SSHDs) and solid state drives (SSDs), IT architects will be able to organise data much more effectively. This gives businesses quick and easy access to the most critical data from all devices while ensuring that the less valuable data is still available and secure on conventional HDDs.
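As a rough sketch of how such a tiering policy might be expressed in software (the tier thresholds below are illustrative assumptions, not vendor guidance):

```python
# Illustrative access-frequency-based tiering rule.
# Thresholds are assumptions chosen for illustration only.

def choose_tier(accesses_per_day: float) -> str:
    """Map how often a piece of data is read to a storage tier."""
    if accesses_per_day >= 100:      # hot data: lowest latency
        return "SSD"
    elif accesses_per_day >= 1:      # warm data: balance of cost and speed
        return "SSHD"
    else:                            # cold data: cheapest capacity
        return "HDD"

for rate in (500, 20, 0.1):
    print(rate, "accesses/day ->", choose_tier(rate))
```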
We’ll see more data centres built around this approach as the industry works to manage increases in data from the Internet of Things.
There is still plenty of work to be done from a data management perspective before the promises of the Internet of Things can become a reality. But exciting developments are underway to make it happen – as long as we continue to see investment in, and understanding of, the challenges at hand. Furthermore, we need to work together as an industry to bring this compelling vision of interconnectedness to life.