Kathy Gibson reports from Gitex – Dubai is known as one of the world’s most technologically advanced cities, with an impressive array of intelligent and automated features making the city easy to navigate and govern.

But, as they say in the classics, you ain’t seen nothing yet. The vision for the city of the future is breathtaking in its ambition.

At the heart of this vision is how the city can employ technology like Internet of Things (IoT) and artificial intelligence (AI) to become more human-centric.

This seemingly contradictory goal is what keeps Dr Magd Zoorob, senior vice-president: future technologies at Expo City Dubai, busy.

He tells the Digital Cities conference at Gitex Global 2023 in Dubai that human-centricity is a concept that looks at the people living in the city first, with technology layered on as a neutral participant.

“Technology must abstract and automate non-value work to empower people,” he says. “It must be inclusive across all people, with its purpose being to give people back the maximum amount of time and availability.”

The vision will be enabled by user-centric services that are personalised, efficient, accessible, dynamic, and interconnected.

“Services must react and respond to the needs of the user depending on their intentions, what they are doing on that day,” Dr Zoorob explains. “And this is where AI comes in, giving services the ability to change flows depending on what people need.”

Dubai’s human-centric digital city roadmap has five levels, which Dr Zoorob expects to be achieved by around the year 2100.

The city is currently somewhere between Level 0 and Level 1, he says, with Level 0 representing a traditional city with no intelligence or automation and no interactivity.

Dubai currently has a number of smart city IoT solutions connected, with central management and monitoring of all city assets and devices through a flexible dashboard.

The platform runs on a scalable, open IoT cloud with customisable insights and KPIs, and integrates with external systems through application programming interfaces (APIs).

These systems will soon provide more insights and benefits, including anomaly detection, root cause detection, predictive maintenance, and machine learning (ML) models for devices and buildings, coupled with AI insights.
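The article does not describe these models, but the anomaly-detection idea can be illustrated with a minimal sketch: flag any sensor reading that sits unusually far from the device's normal range (the data and threshold here are hypothetical).

```python
from statistics import mean, stdev

def detect_anomalies(readings, threshold=2.0):
    """Return indices of readings more than `threshold` standard
    deviations from the mean -- a simple z-score anomaly detector."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [i for i, r in enumerate(readings)
            if abs(r - mu) / sigma > threshold]

# Hypothetical temperature readings from a building sensor:
temps = [21.0, 21.2, 20.9, 21.1, 35.0, 21.0, 21.3]
print(detect_anomalies(temps))  # [4] -- the 35.0 spike stands out
```

Production systems would use far richer models (seasonality, multivariate signals), but the core step is the same: learn what "normal" looks like for a device, then surface deviations.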

“This is the basic layer that enables human-centricity,” Dr Zoorob says.

As the city moves closer to achieving its Level 1 goals, it will layer AI on top of these connected platforms.

Initiatives already being piloted include an AI waste management model that will guide people to the right recycling bin depending on the object being disposed of.

Importantly, this system will also link all smart waste disposal bins, enabling it to understand the volume and weight of waste being disposed of. This information can be fed back to recycling service providers, creating the beginning of a closed-loop system.
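The two halves of that pilot — routing an object to the right bin, and aggregating what the connected bins collect — can be sketched roughly as follows (the labels, bins, and weights are all hypothetical; the real system would sit behind an image classifier and networked bin sensors):

```python
# Hypothetical object-label -> bin routing table.
BIN_FOR = {"plastic bottle": "plastics", "newspaper": "paper",
           "glass jar": "glass", "banana peel": "organic"}

def route(label):
    """Guide a classified object to the appropriate recycling bin."""
    return BIN_FOR.get(label, "general")

def aggregate(events):
    """Sum per-bin weights from (bin_name, weight_kg) disposal events,
    the data that would be fed back to recycling service providers."""
    totals = {}
    for bin_name, kg in events:
        totals[bin_name] = totals.get(bin_name, 0.0) + kg
    return totals

print(route("glass jar"))  # glass
print(aggregate([("plastics", 0.2), ("paper", 0.5), ("plastics", 0.3)]))
```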

Another nascent initiative uses AI to increase business efficiency. “This is like a website assistant, but with private models based on the user’s own data. So the user can ask questions on the site and receive tailored information.”
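The core idea of such an assistant — answering questions from the user's own data rather than a general model — can be shown with a toy retrieval step: pick the snippet from a private document set that best matches the question (the documents and matching rule here are purely illustrative).

```python
def best_match(question, documents):
    """Return the document sharing the most words with the question --
    a toy stand-in for the retrieval step in a private-data assistant."""
    q_words = set(question.lower().split())
    def overlap(doc):
        return len(q_words & set(doc.lower().split()))
    return max(documents, key=overlap)

# Hypothetical snippets from a user's own facilities documents:
docs = [
    "Facilities requests are handled within 24 hours.",
    "Recycling bins are emptied every Tuesday and Friday.",
]
print(best_match("When are the recycling bins emptied?", docs))
```

A real deployment would use embeddings and a language model to phrase the answer, but the tailoring comes from the same place: the retrieval layer only ever sees the user's own data.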

This model is already being introduced internally by Dr Zoorob’s team for facilities management.

By 2030, the city expects to start moving to Level 2, where services will become more autonomous.

In this level, the engagement of citizens and visitors will be more personalised depending on who the user is.

This autonomy will be presented as AI guidance or AI guardians for people and businesses, Dr Zoorob explains.

“An AI guardian is a model that is personal to you, it walks with you across city services and learns what your behaviour is. So the services you engage with understand your intentions and aims without having to redescribe them for every disparate service.

“You could initiate the intention to do certain activities and the AI can launch them without you having to spin them up or request them.”
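Stripped of the AI, the guardian concept as Dr Zoorob describes it is a per-user context that travels across services, so intentions never need re-describing. A minimal sketch (every name and service here is hypothetical):

```python
class Guardian:
    """A personal profile that accumulates learned preferences and
    hands them to any city service the user engages with."""
    def __init__(self):
        self.preferences = {}

    def learn(self, key, value):
        self.preferences[key] = value

    def request(self, service, intent):
        # Pass accumulated context so the user never re-describes it.
        return service(intent, self.preferences)

def transport_service(intent, prefs):
    """A stand-in city service that tailors itself to the profile."""
    mode = prefs.get("mobility", "walking")
    return f"Routing '{intent}' via {mode}"

g = Guardian()
g.learn("mobility", "metro")
print(g.request(transport_service, "go to Expo City"))
# Routing 'go to Expo City' via metro
```

The learning Dr Zoorob describes would replace the hand-set preferences with a model that updates itself from behaviour, but the architectural point survives: context lives with the user, not with each disparate service.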

These AI models will learn and evolve over time, he stresses, as they are not predefined or locked.

Adding even more human-centricity to these interactions, the AI guardian could take the form of an avatar.

“These models are under testing now and we are looking at how to integrate them,” Dr Zoorob says.

Planning for Levels 3 and up is not so detailed, but the broad outlines are as follows:

* Level 3, anticipated by 2050, will start to automate tasks as well as the flow of goods and information to add more autonomy. This AI orchestration will enable autonomous services to be implemented around the user.

* Level 4, in around 2070, is where the infrastructure of the city will have to be rethought, with the flow of data, goods, and services digitised and automated. Everything will run through AI models that track assets, so that intention and movement can be understood and monitored at every step. This level will be driven by AI models that can query and redefine themselves.

* Level 5, going beyond 2100, is where we can start building a completely circular economy, which can only be accomplished once all the models are built and are interacting.