OXFORD, United Kingdom, April 28, 2026 (GLOBE NEWSWIRE) — Lumai, the optical compute company specializing in scalable AI solutions, has unveiled its Lumai Iris inference server, touted as the world’s first optical computing system capable of running billion-parameter large language models (LLMs) in real time. This announcement marks a pivotal milestone in AI infrastructure, showcasing the commercial potential of optical compute for large-scale AI inference tasks.
The Lumai Iris servers use light to accelerate inference workloads, offering speed and efficiency advantages over traditional silicon-based systems. Notably, Lumai’s optical compute technology promises up to 90% lower energy consumption than conventional GPU-based architectures, making it a more environmentally sustainable option.
The Lumai Iris lineup includes three server models: Nova, Aura, and Tetra, with the first model, Iris Nova, already available for evaluation by hyperscalers, neo-clouds, enterprises, and research institutions. This diverse family of servers aims to address the growing demand for efficient AI inference.
Powering the Inference Era
As artificial intelligence transitions into a new phase characterized by real-world deployment, the demand for efficient inference workloads has surged. However, traditional data centers are facing critical power and scalability challenges, with silicon-based architectures struggling to keep pace with increasing demands. According to the International Energy Agency, global data center power demand is projected to double by 2030, highlighting the urgency for more efficient computing approaches.
The Lumai Iris family of inference servers presents a solution to this “Energy Wall,” delivering significantly enhanced performance per kilowatt. This new computing paradigm allows for the scaling of AI capabilities without the burdensome energy and cost implications associated with existing systems.
Moreover, traditional silicon architectures are encountering fundamental physical limitations, often referred to as the “Silicon Ceiling.” Each new silicon generation offers marginal improvements while requiring substantially more power and financial investment to enhance scalability. Dr. Xianxin Guo, CEO and Co-Founder of Lumai, emphasized this shift: “As the industry transitions into the inference era, we are simultaneously crossing the threshold into the post-silicon era. By shifting the computation paradigm from electrons to photons, Lumai can deliver an order-of-magnitude increase in performance with significant energy savings.”
The Lumai Iris system utilizes optical computing to enable more efficient execution of essential AI operations. Developed from extensive research at the University of Oxford, this technology employs light in a three-dimensional volume, overcoming the limitations of conventional two-dimensional chip designs. The approach allows for massive spatial parallelism, facilitating the execution of millions of operations simultaneously, which is particularly advantageous for compute-bound workloads.
Iris Nova is equipped to perform real-time inference on models such as Llama 8B and 70B, utilizing a sophisticated hybrid processing architecture. This architecture combines digital processing for system control and software with an optical tensor engine dedicated to executing core mathematical operations, ensuring a seamless integration into existing data center infrastructures.
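To illustrate the hybrid division of labor described above, the following is a minimal conceptual sketch, not Lumai's actual software stack: all class and method names here are hypothetical, and Lumai has not published an API. The idea is that control flow and nonlinearities stay on the digital host while dense matrix multiplications, the compute-bound core of transformer inference, are dispatched to a pluggable tensor engine.

```python
import numpy as np

class OpticalTensorEngine:
    """Hypothetical stand-in for an optical matrix-multiply accelerator.
    Here it simply calls NumPy; a real optical engine would encode the
    operands in light and read the product back from a detector array."""
    def matmul(self, a: np.ndarray, b: np.ndarray) -> np.ndarray:
        return a @ b

class HybridLayer:
    """One transformer-style feed-forward layer: the digital host runs
    control logic and the activation, the engine runs the matmuls."""
    def __init__(self, w_in: np.ndarray, w_out: np.ndarray,
                 engine: OpticalTensorEngine):
        self.w_in, self.w_out, self.engine = w_in, w_out, engine

    def forward(self, x: np.ndarray) -> np.ndarray:
        h = self.engine.matmul(x, self.w_in)   # offloaded: dense matmul
        h = np.maximum(h, 0.0)                 # on host: ReLU nonlinearity
        return self.engine.matmul(h, self.w_out)

rng = np.random.default_rng(0)
engine = OpticalTensorEngine()
layer = HybridLayer(rng.normal(size=(16, 64)),
                    rng.normal(size=(64, 16)), engine)
y = layer.forward(rng.normal(size=(1, 16)))
print(y.shape)  # (1, 16)
```

The sketch shows why such a split integrates cleanly with existing infrastructure: the accelerator only needs to expose a matrix-multiply primitive, and everything else remains ordinary host-side code.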
The Advanced Research and Invention Agency (ARIA), a UK government-backed organization funding advanced AI initiatives, has expressed optimism about Lumai’s advancements. Suraj Bramhavar, Program Director at ARIA, stated, “The demands on existing AI processors necessitate an urgent search for alternative scaling pathways. Lumai is leading the charge in demonstrating that optical processors could provide one such pathway, and ARIA is excited to partner with them to explore the shift beyond our traditional digital computing paradigm.”
The Lumai Iris Nova inference server is now available for evaluation, with future systems in the Iris family anticipated to further enhance performance and efficiency for broader deployment across hyperscale and enterprise environments. Lumai’s groundbreaking optical AI technology not only aims to transform AI infrastructure but also to promote sustainable intelligence at a global scale.
For more information about Lumai’s innovations or to request an evaluation of the Lumai Iris Nova inference server, visit lumai.ai.
About Lumai: Founded in 2021 as a spin-off from leading optics research at the University of Oxford, Lumai is dedicated to creating the next-generation AI infrastructure for the Inference Era. The company is committed to achieving materially faster inference, significantly higher execution efficiency, and up to 90% lower energy consumption than conventional GPU architectures. Lumai has been recognized with several awards, including the Falling Walls Award for Science Breakthrough of the Year 2025, and is an alumnus of Intel Ignite’s inaugural London cohort.
Media Contact:
Stephanie Olsen
Lages & Associates
(949) 453-8080
[email protected]