The world’s biggest technology companies are set to spend nearly $700-billion this year building data centres to power artificial intelligence. Global data centre energy consumption is expected to double by 2028, with AI workloads driving the majority of that growth.
Refiant AI has announced a $5-million seed round led by VoLo Earth Ventures, a top-decile climate technology fund with $225-million invested across 35 portfolio companies.
Refiant AI uses nature-inspired algorithms to radically compress AI models, slashing the hardware and energy required to run them. The company has already demonstrated that it can compress a 120-billion parameter AI model to run on a standard laptop, reducing energy requirements by over 80% while preserving near-identical quality.
“AI’s growing energy footprint is one of the most urgent and underappreciated challenges in the climate space,” says Sid Gutta, co-founder of Refiant AI.
“The industry’s default answer is to build more data centres and consume more power. Ours is to make the AI itself dramatically more efficient.”
Radical efficiency
Modern AI models are enormous and growing every quarter, with parameter sizes running into the trillions.
Running them requires banks of GPUs, cooling systems, and vast amounts of electricity. For most organisations, using AI means sending data to power-hungry cloud infrastructure operated by a handful of tech giants – with the associated energy cost, carbon footprint and loss of data control.
Refiant’s founders believe the more sustainable path is to make current AI radically more efficient. The start-up recently compressed a 120-billion parameter AI model – one of the most powerful open-source models available at the time – to run on a MacBook Pro with just 12GB of RAM. The same model would normally require hardware with at least 80GB of memory.
The model retained 95% to 99% of its fidelity, ran alongside a second AI model on the same machine, and the entire process took four hours with no cloud computing required.
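The memory figures above imply an aggressive compression ratio. As a rough back-of-envelope (assuming weights dominate the memory footprint; the bit-widths below are illustrative, not a description of Refiant's actual technique):

```python
# Memory needed to hold a model's weights at different precisions.
# Illustrative only -- Refiant has not disclosed how its compression works.

def weight_memory_gb(params: float, bits_per_param: float) -> float:
    """Memory in GB (10^9 bytes) to store `params` weights at `bits_per_param`."""
    return params * bits_per_param / 8 / 1e9

PARAMS = 120e9  # 120-billion parameter model

for bits in (16, 8, 4):
    print(f"{bits:>2}-bit weights: {weight_memory_gb(PARAMS, bits):.0f} GB")
# 16-bit weights: 240 GB
#  8-bit weights: 120 GB
#  4-bit weights:  60 GB

# Fitting 120B parameters into 12GB of RAM implies roughly:
implied_bits = 12e9 * 8 / PARAMS
print(f"~{implied_bits:.1f} bits per parameter")  # ~0.8 bits per parameter
```

Under these assumptions, even standard 4-bit quantisation would still need around 60GB, so landing in 12GB suggests well under one bit per parameter on average.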
Energy consumption was measured in a Faraday cage to eliminate external electromagnetic interference and ensure accurate readings. Under these conditions, the compressed model achieved approximately 3 000 tokens per kilowatt-hour – up to 100 times more energy efficient than running the same model on conventional data centre hardware.
The energy required to process a single AI prompt on standard infrastructure could power roughly 100 equivalent prompts using Refiant’s approach.
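The 100-to-1 figure follows directly from the stated efficiency numbers. A quick sketch (the per-prompt token count is an illustrative assumption, not from the article):

```python
# Back-of-envelope from the article's figures: ~3 000 tokens/kWh for the
# compressed model, stated as up to 100x more efficient than running the
# same model on conventional data centre hardware.

COMPRESSED_TOKENS_PER_KWH = 3_000
EFFICIENCY_FACTOR = 100  # "up to 100 times" -- treating the upper bound

baseline_tokens_per_kwh = COMPRESSED_TOKENS_PER_KWH / EFFICIENCY_FACTOR  # 30

TOKENS_PER_PROMPT = 500  # assumed prompt+response size, for illustration

energy_baseline_kwh = TOKENS_PER_PROMPT / baseline_tokens_per_kwh
energy_compressed_kwh = TOKENS_PER_PROMPT / COMPRESSED_TOKENS_PER_KWH

prompts_per_baseline_prompt = energy_baseline_kwh / energy_compressed_kwh
print(f"{prompts_per_baseline_prompt:.0f} compressed prompts per baseline prompt")
# 100 compressed prompts per baseline prompt
```

The prompt size cancels out, so whatever the workload, the energy budget of one conventional prompt covers about a hundred prompts under the compressed model, matching the claim above.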
The implications are major. Replace racks of GPU servers drawing thousands of watts with standard laptops, multiply that across thousands of organisations, and the energy savings become material at grid level. For countries with limited data centre infrastructure, it means the ability to run powerful AI locally rather than exporting compute to Silicon Valley.
AI and climate: not a trade-off
Businesses today are caught between two competing imperatives: reducing their carbon footprint and adopting AI to stay competitive. For Refiant, the commercial opportunity and the environmental one are inseparable.
“Those two mandates don’t have to be in tension,” says co-founder Mathew Haswell. “AI adoption and sustainability commitments can coexist – but only if the technology itself becomes more efficient. Organisations shouldn’t have to choose between deploying AI and meeting their energy targets – and they shouldn’t have to send their data halfway around the world to do it.”