By 2028, enterprise spending on battling misinformation and disinformation will surpass $30 billion, cannibalising 10% of marketing and cybersecurity budgets to combat a multifront threat, according to Gartner.
In the new Gartner book World Without Truth, authors Dave Aron, distinguished VP analyst and Gartner Fellow in the Gartner for High Tech Leaders and Providers practice; Andrew Frank, distinguished VP analyst in the Gartner for Marketing Leaders practice; and Richard Hunter, retired distinguished VP analyst and Gartner Fellow, explain that false information poses significant financial and reputational risks to organisations.
As a result, all executive leaders must address it as an enterprisewide priority.
In fact, a December 2024 Gartner survey of 200 senior business and technology executives found that 72% of respondents identified misinformation, disinformation, and malinformation as very or relatively important issues to their executive committees. However, only 30% said it was a top-five concern.
“The lack of reliable information needs to be seen as a meta-issue that compromises everyone’s ability to understand and deal with all other issues,” says Frank. “In a world without truth, how can society decide how big a concern climate change is, what its causes are, and how to address them? How can society address global health challenges? And at the corporate level, how can companies maintain relationships with their customers, employees, investors, and other stakeholders in a world where people feel they can no longer believe what they read, see, or hear?”
Aron adds: “The disinformation threat will continue to grow, fuelled by increasingly elaborate uses of synthetic reality, behavioural science and digital media. However, there are concrete steps that organisations can take to marginalise the impact and track new threat vectors.”
Three forces driving the rise of disinformation
“False information has been part of human culture for as long as societies have existed, but three distinct phenomena are combining to radically amplify its power and danger,” says Frank.
- Lower-cost, mass-customised communications. The Internet, social media, and mobile apps allow multimedia content to instantly reach vast numbers of people at almost no cost. These tools can tailor messages so that each person, or each customer or citizen segment, receives something different.
- Easier generation of realistic-looking fake content. Generative AI can help create increasingly convincing and compelling conversational dialogues as well as deepfake images, audio, and video content.
- Greater potential influence and impact of communications. Behavioural science, combined with big data, analytics, and AI, enables the creation of the most compelling messages for individuals with the aim of changing their mental models, decisions, and actions.
Trust is crucial
As AI raises new challenges to transparency and trust in content, the book proposes TrustOps: a proactive, integrated approach to enhancing organisational trustworthiness, credibility, and transparency while mitigating risks from misinformation and harmful associations.
“The core idea is to treat trust not as an incidental outcome of marketing or compliance, but as a deliberate operational objective – to protect content integrity and foster consumer confidence,” says Hunter.
The book recommends that organisations form Trust Councils, composed of representatives from across organisational boundaries and led by C-suite leaders from communications, IT, finance, legal, HR, and marketing. Broad participation is needed because each function has a unique perspective and responsibilities related to trustworthy content.
Additionally, TrustNets of companies, technologies, and tools can create “tunnels of trust” through the Internet. TrustNets ensure trust among participants through verification, transparency, and security.
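The verification side of a TrustNet can be illustrated with a minimal content-integrity check. The sketch below is purely hypothetical (the book does not prescribe an implementation): it assumes two TrustNet participants share a key, and uses an HMAC signature so the recipient can confirm that a piece of content really came from a trusted participant and was not altered in transit.

```python
import hmac
import hashlib

# Hypothetical shared key between two TrustNet participants
# (a real deployment would use public-key signatures and key management).
SHARED_KEY = b"example-trustnet-key"

def sign_content(content: bytes, key: bytes = SHARED_KEY) -> str:
    """Produce the signature a publisher attaches to content."""
    return hmac.new(key, content, hashlib.sha256).hexdigest()

def verify_content(content: bytes, signature: str, key: bytes = SHARED_KEY) -> bool:
    """Check that content arrived unaltered from a trusted participant."""
    expected = hmac.new(key, content, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

article = b"Official statement: quarterly results are unchanged."
sig = sign_content(article)

print(verify_content(article, sig))                 # True: content is intact
print(verify_content(article + b" edited", sig))    # False: content was tampered with
```

Tampered or spoofed content fails verification, which is the basic property a "tunnel of trust" relies on; transparency and security would layer additional controls on top.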
Four levers to combat disinformation
Along with TrustOps, Gartner recommends using four levers to combat disinformation: rules, governance, and processes; education; nudges and incentives; and technology and tools.
Rules, governance, and processes: Organisations must define and implement policies and practices that establish sound principles around trust and minimise threats of harm from new technologies and adversaries. As new countermeasures emerge to address evolving disinformation tactics, corporate leadership must continuously learn from advisors, legal sources, and industry consortia, and adapt its practices accordingly.
Education: Within the enterprise, making the connection between AI and TrustOps is a key strategy for mobilisation. For example, leverage cross-functional teams for employee education, strategy, and response delivery.
Nudges and incentives: Behavioural science techniques such as nudges, or subtle interventions, can reduce susceptibility to disinformation. Organisations can use nudges and incentives to reinforce desired habits of scepticism toward dubious content among consumers, customers, employees, and partners.
Technology and tools: To detect and debunk false content, organisations should adopt narrative intelligence tools that track influence operations that propagate disinformation across a variety of social media platforms.