Kathy Gibson is at Fujitsu Forum in Munich – There has been a lot of hype about artificial intelligence (AI), but its uptake in enterprise organisations has not been as high as expected.
Just 42% of companies are adopting AI, according to an independent survey by Fujitsu.
Among the 54% of companies surveyed that are not working with AI, the reasons cited include missing hygiene factors: the availability of skills and expertise, awareness of use cases and applicability, availability of the quality data required for AI to be effective, and operational maturity.
There are also technology challenges, including the fact that general-purpose hardware does not deliver the acceleration required.
Many organisations have also been unable to proceed beyond their proof-of-concept projects due to a lack of visibility around return on investment (ROI).
They are also frequently unclear about which components and software need to be integrated.
Fujitsu is taking a human-centric approach to AI with its Zinrai framework, says Udo Würtz, Fujitsu distinguished engineer and deputy chief technology officer for business line products.
“When we talk to companies in Germany, for instance, we find they are facing more issues in terms of competition.
“For example, less expensive machines may not be as reliable as those produced in Germany, but they include an AI component that allows for predictive maintenance and predictive availability – which allows for big cost savings.”
The challenges around implementing AI are huge, but companies have to find a way to address them, Würtz says.
Fujitsu spends $2-billion on R&D, and among the solutions it is bringing to market is AI acceleration via a deep learning unit (DLU) that has been developed to run on top of standard Fujitsu hardware.
The DLU has several advantages over GPU technology in the deep learning space, he adds.
In fact, the DLU is expected to deliver 10 times the performance per watt of comparable GPU-based systems – and it is designed to handle large-scale neural networks.
“We believe the market for DLU is huge,” says Würtz. “We are moving to a higher demand for big data and AI to bring additional value to those products.
“Plus the bigger an installation is, the higher the throughput will be – and the input pipe becomes critical. This gives impetus to technologies like storage. And this is the benefit of Fujitsu because we have a huge portfolio that we can bring together.”
The next layer of the Zinrai solution is the driver and microcode, followed by the operating system and container system.
“On the Zinrai software stack, there are a lot of Fujitsu ingredients but, at the same time, we believe in open source,” Würtz says.
For instance, the operating system is Ubuntu Server, while the container platform is Docker, with supported tools including Prometheus and Grafana, among others.
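As an illustration only – assuming the open-source Docker SDK for Python and the public Prometheus and Grafana container images, not Fujitsu’s own deployment tooling – a monitoring stack of the kind described could be stood up along these lines:

```python
# A minimal sketch of spinning up part of the open-source stack mentioned
# above (Prometheus for metrics, Grafana for dashboards) with the Docker SDK
# for Python. This is an illustration only, not Fujitsu's Zinrai tooling;
# the container names are hypothetical.
import docker

client = docker.from_env()  # talks to the local Docker daemon

# Metrics collection.
prometheus = client.containers.run(
    "prom/prometheus",
    name="demo-prometheus",      # hypothetical name for this sketch
    detach=True,
    ports={"9090/tcp": 9090},
)

# Dashboards on top of the collected metrics.
grafana = client.containers.run(
    "grafana/grafana",
    name="demo-grafana",         # hypothetical name for this sketch
    detach=True,
    ports={"3000/tcp": 3000},
)

print("Prometheus:", prometheus.short_id, "Grafana:", grafana.short_id)
```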
Using the Zinrai frameworks, customers can build their own AI solutions, or use the neural networks out of the box and put their use case on top of that. The third option is to use co-created applications from partners and bring them to life on the framework.
Würtz explains that the out-of-the-box systems include audio recognition, reinforcement learning, natural language processing, sentiment analysis, image recognition, real-time object detection, recommendations, generative adversarial networks, question answering and more.
“But it’s not that easy,” he adds. “For instance, when we talk about image recognition, there are four different neural networks required to deliver this. You need partners with the skills to bring the AI dream to life.”
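By way of illustration – not Fujitsu’s Zinrai API – the sketch below shows what out-of-the-box image recognition with a pretrained open-source network looks like in practice, assuming Python with a recent PyTorch and torchvision and a placeholder image path; a production pipeline would, as Würtz notes, chain several such networks together.

```python
# A minimal sketch of "out of the box" image recognition using open-source
# tooling (PyTorch/torchvision). It illustrates pretrained-network inference
# in general, not Fujitsu's Zinrai framework. "example.jpg" is a placeholder.
import torch
from PIL import Image
from torchvision import models, transforms

# Load a pretrained classification network (just one of the networks a full
# image-recognition pipeline would chain together).
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.eval()

# Standard ImageNet preprocessing.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify(image_path: str, top_k: int = 5):
    """Return the top-k ImageNet class indices and scores for one image."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)          # shape: [1, 3, 224, 224]
    with torch.no_grad():
        logits = model(batch)
    probs = torch.nn.functional.softmax(logits[0], dim=0)
    scores, indices = torch.topk(probs, top_k)
    return list(zip(indices.tolist(), scores.tolist()))

if __name__ == "__main__":
    print(classify("example.jpg"))
```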
Hardware that is faster than generic solutions, and built for AI, gives customers an advantage in terms of performance and power consumption.
Fujitsu’s solutions built using the DLU and the Zinrai framework will encompass entry-level and scale-out systems. “We have already started an early access programme and have delivered hardware for this,” Würtz says.