

Lumai Launches Iris Server, World’s First Optical System for Real-Time AI Inference

Lumai unveils the Iris inference server, the world’s first optical system enabling real-time execution of billion-parameter AI models with 90% lower energy consumption.

OXFORD, United Kingdom, April 28, 2026 (GLOBE NEWSWIRE) — Lumai, the optical compute company specializing in scalable AI solutions, has unveiled its Lumai Iris inference server, touted as the world’s first optical computing system capable of running billion-parameter large language models (LLMs) in real time. This announcement marks a pivotal milestone in AI infrastructure, showcasing the commercial potential of optical compute for large-scale AI inference tasks.

The Lumai Iris servers use light to accelerate inference workloads, offering speed and efficiency advantages over traditional silicon-based systems. Notably, Lumai’s optical compute technology promises up to 90% lower energy consumption than conventional GPU-based architectures, making it a more environmentally sustainable option.

The Lumai Iris lineup comprises three server models: Nova, Aura, and Tetra. The first of these, Iris Nova, is already available for evaluation by hyperscalers, neo-clouds, enterprises, and research institutions, and the family as a whole aims to address the growing demand for efficient AI inference.

Powering the Inference Era

As artificial intelligence transitions into a new phase characterized by real-world deployment, the demand for efficient inference workloads has surged. However, traditional data centers are facing critical power and scalability challenges, with silicon-based architectures struggling to keep pace with increasing demands. According to the International Energy Agency, global data center power demand is projected to double by 2030, highlighting the urgency for more efficient computing approaches.

The Lumai Iris family of inference servers presents a solution to this “Energy Wall,” delivering significantly enhanced performance per kilowatt. This new computing paradigm allows for the scaling of AI capabilities without the burdensome energy and cost implications associated with existing systems.

Moreover, traditional silicon architectures are encountering fundamental physical limitations, often referred to as the “Silicon Ceiling.” Each new silicon generation offers marginal improvements while requiring substantially more power and financial investment to enhance scalability. Dr. Xianxin Guo, CEO and Co-Founder of Lumai, emphasized this shift: “As the industry transitions into the inference era, we are simultaneously crossing the threshold into the post-silicon era. By shifting the computation paradigm from electrons to photons, Lumai can deliver an order-of-magnitude increase in performance with significant energy savings.”

The Lumai Iris system utilizes optical computing to enable more efficient execution of essential AI operations. Developed from extensive research at the University of Oxford, this technology employs light in a three-dimensional volume, overcoming the limitations of conventional two-dimensional chip designs. The approach allows for massive spatial parallelism, facilitating the execution of millions of operations simultaneously, which is particularly advantageous for compute-bound workloads.
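Conceptually, the spatial parallelism described above amounts to computing a matrix–vector product "in one shot": each input activation modulates a beam of light, the beams fan out across a grid of weights, and detectors sum the results per output. The sketch below is purely illustrative, with made-up values and function names that are not Lumai's API; it shows the mathematical operation the optics perform in parallel, written out sequentially in plain Python.

```python
# Purely illustrative sketch (hypothetical, not Lumai's actual API or design):
# an optical tensor engine performs a matrix-vector product in a single pass.
def optical_matvec(weights, activations):
    # In the optical system, each activation modulates a light beam that fans
    # out across one column of the weight grid, so every product forms
    # simultaneously (spatial parallelism); photodetectors then integrate the
    # light along each output row. Here we compute the same result serially.
    return [sum(w * a for w, a in zip(row, activations)) for row in weights]

W = [[1.0, 2.0], [3.0, 4.0]]   # toy 2x2 weight grid
x = [10.0, 1.0]                # toy input activations
print(optical_matvec(W, x))    # -> [12.0, 34.0]
```

In an electronic chip these multiply-accumulates are serialized or tiled across a 2D array; the claimed advantage of a 3D optical volume is that the full grid of products can form at once as light propagates.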

Iris Nova is equipped to perform real-time inference on models such as Llama 8B and 70B, utilizing a sophisticated hybrid processing architecture. This architecture combines digital processing for system control and software with an optical tensor engine dedicated to executing core mathematical operations, ensuring a seamless integration into existing data center infrastructures.
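The hybrid split described above can be pictured as ordinary digital code orchestrating a layer's control flow while offloading only the heavy tensor math. The following sketch is a hypothetical illustration under that assumption; the function names and the simulated engine are inventions for exposition, not Lumai's interfaces.

```python
# Hypothetical illustration of the hybrid architecture: digital logic handles
# control and integration, while core matrix math is dispatched to an
# optical tensor engine (simulated here in plain Python).
def optical_tensor_engine(weights, activations):
    # Stand-in for the optical matrix-vector product.
    return [sum(w * a for w, a in zip(row, activations)) for row in weights]

def layer_forward(x, w_proj):
    # Digital side: orchestration and nonlinearity.
    y = optical_tensor_engine(w_proj, x)   # offload the core matmul
    return [max(0.0, v) for v in y]        # e.g. a ReLU applied digitally

print(layer_forward([1.0, 2.0], [[2.0, -1.0], [0.5, 0.5]]))  # -> [0.0, 1.5]
```

Keeping control flow, activation functions, and software interfaces on conventional digital hardware is what allows such a system to slot into existing data center infrastructure while the optics handle only the compute-bound linear algebra.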

The Advanced Research and Invention Agency (ARIA), a UK government-backed organization funding advanced AI initiatives, has expressed optimism about Lumai’s advancements. Suraj Bramhavar, Program Director at ARIA, stated, “The demands on existing AI processors necessitate an urgent search for alternative scaling pathways. Lumai is leading the charge in demonstrating that optical processors could provide one such pathway, and ARIA is excited to partner with them to explore the shift beyond our traditional digital computing paradigm.”

The Lumai Iris Nova inference server is now available for evaluation, with future systems in the Iris family anticipated to further enhance performance and efficiency for broader deployment across hyperscale and enterprise environments. Lumai’s groundbreaking optical AI technology not only aims to transform AI infrastructure but also to promote sustainable intelligence at a global scale.

For more information about Lumai’s innovations or to request an evaluation of the Lumai Iris Nova inference server, visit lumai.ai.

About Lumai: Founded in 2021 as a spin-off from leading optics research at the University of Oxford, Lumai is dedicated to creating the next-generation AI infrastructure for the Inference Era. The company is committed to achieving materially faster inference, significantly higher execution efficiency, and up to 90% lower energy consumption than conventional GPU architectures. Lumai has been recognized with several awards, including the Falling Walls Award for Science Breakthrough of the Year 2025, and is an alumnus of Intel Ignite’s inaugural London cohort.

Media Contact:
Stephanie Olsen
Lages & Associates
(949) 453-8080
[email protected]


