
InfraAI: Building an AI Infrastructure Fit for the Future

Posted by Karen Davidson on Thursday, March 19, 2026

Artificial Intelligence is now accessible to nearly everyone: download an app, write a few lines of code, or subscribe to an API, and you can integrate it into your own product. But accessibility is only half the story. As artificial intelligence grows more powerful and widespread, the infrastructure that supports it is struggling to keep up. At UT Austin's Department of Computer Science, researchers at the InfraAI Center are focused on solving this critical challenge by building a new kind of infrastructure that will keep AI running reliably as demand becomes truly ubiquitous. The challenge touches every sector as AI works its way into every corner of modern life.

InfraAI is a consortium that partners with industry to solve real-world business challenges. The Center is currently working with researchers at Google, Meta, Cisco, Amazon and Nutanix. Industry partners participate in research discussions, technical workshops and collaborative projects that help shape the Center's research agenda, while giving students a front-row seat to the most complex challenges in the industry. By combining academic research with close collaboration with industry partners, InfraAI aims to design AI systems that can adapt, learn and scale to meet every future challenge.

Led by Aditya Akella and Chris Rossbach, both professors in the Computer Science department and experts in operating systems design, the Center brings together researchers across networking, operating systems, robotics and hardware to build a computing infrastructure that can learn and adapt as fast as the AI it supports.

“The gap between evolving AI demands and rigid computing infrastructure is growing,” Akella notes. To close that gap, the Center is designing systems that can automatically adapt and optimize themselves. Whether it’s a tiny sensor on a factory floor or a massive warehouse-sized data center, these systems will use AI to improve their performance in real time.

InfraAI has two central research thrusts: 

  • Nimble AI Stacks: The Center is creating flexible, high-performing AI software stacks that evolve with models and hardware. Embedding intelligent learning into the stack itself will make AI infrastructure more efficient, adaptable, and broadly accessible to businesses large and small across industries. 
     
  • Intelligent Infrastructures: InfraAI is designing AI-driven systems that manage themselves. By developing new building blocks and abstractions, these systems adapt in real time, optimize performance, and maintain safety across cloud, edge, and consumer devices—reducing the need for constant human intervention.

Projects like SYMPHONY highlight InfraAI’s vision of computing systems that can automatically adapt to the needs of modern AI. SYMPHONY improves memory management for LLM workloads: in essence, it makes AI chatbots and assistants faster and more efficient by addressing a caching bottleneck. Today, when you chat with an AI agent, it must remember everything you have said so far in order to understand your next question. That conversation state is kept in a cache that consumes an enormous amount of fast GPU memory; when it cannot fit there, the system must either recompute it or fetch it from slower storage. SYMPHONY uses hints to predict what you are about to ask, so it can prepare that memory ahead of time and move it back into fast GPU memory before it is needed. Research studies show that SYMPHONY can handle more than eight times the traffic of current industry standards.
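The idea of hint-driven cache movement described above can be sketched in a few lines of Python. This is a toy illustration only: the class, method names, and two-tier "host vs. GPU" model are assumptions made for clarity, not SYMPHONY's actual design or API.

```python
class KVCacheManager:
    """Toy model of hint-driven prefetching: per-session conversation
    caches live in slow host storage, and a scheduling hint moves a
    cache into a small, fast GPU pool before the request arrives."""

    def __init__(self, gpu_slots):
        self.host = {}            # session_id -> cached conversation state
        self.gpu = {}             # session_id -> state resident in fast memory
        self.gpu_slots = gpu_slots

    def save(self, session_id, state):
        """Persist a session's conversation state in host storage."""
        self.host[session_id] = state

    def hint(self, session_id):
        """Hint that this session will send a request soon: prefetch its
        cache into GPU memory ahead of time, evicting another if full."""
        if session_id in self.gpu or session_id not in self.host:
            return
        if len(self.gpu) >= self.gpu_slots:
            # Evict an arbitrary resident cache back to host storage.
            evicted, state = self.gpu.popitem()
            self.host[evicted] = state
        self.gpu[session_id] = self.host[session_id]

    def serve(self, session_id):
        """A request served from GPU memory is a fast hit; otherwise the
        cache must be fetched on demand, which is the slow path."""
        if session_id in self.gpu:
            return "hit"
        self.hint(session_id)     # on-demand fetch: the slow path
        return "miss"
```

With one GPU slot, a hint issued before the request turns what would be a slow miss into a fast hit, which is the whole point of predicting the next request ahead of time.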

Also under the InfraAI research umbrella is Learning-Directed Operating Systems (LDOS), a research initiative backed by the U.S. National Science Foundation’s Expeditions in Computing program. Operating systems traditionally manage how computers allocate resources such as processors, memory and network capacity. LDOS is taking a clean-slate approach: instead of trying to patch up decades-old technology, the group is building a new type of system in which machine learning directs these allocation decisions automatically.
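To make the idea of learning-directed resource management concrete, here is a minimal, hypothetical sketch in which a simple learner, rather than a fixed heuristic, picks an allocation policy and reinforces whichever one produces good outcomes. The policy names and reward signal are invented for illustration; a real learning-directed OS would act on live kernel telemetry with far more sophisticated models.

```python
class LearnedAllocator:
    """Toy learning-directed allocator: maintains a weight per candidate
    allocation policy, greedily picks the highest-weighted one, and
    multiplicatively reinforces policies that earn positive reward
    (e.g. low observed latency)."""

    def __init__(self, policy_names, lr=0.5):
        self.weights = {name: 1.0 for name in policy_names}
        self.lr = lr

    def choose(self):
        # Greedily pick the policy with the highest learned weight.
        return max(self.weights, key=self.weights.get)

    def feedback(self, policy, reward):
        # Positive reward grows the weight; negative reward shrinks it.
        self.weights[policy] *= (1.0 + self.lr * reward)


# Simulated control loop: suppose "priority" scheduling happens to suit
# this workload (reward +1.0) while "fair_share" does not (reward -0.5).
alloc = LearnedAllocator(["fair_share", "priority"])
for _ in range(20):
    policy = alloc.choose()
    reward = 1.0 if policy == "priority" else -0.5
    alloc.feedback(policy, reward)
```

After a few iterations the allocator's weights shift toward the policy that the feedback signal favors, which is the essence of letting the system learn its own resource-management decisions rather than hard-coding them.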

As AI becomes increasingly integrated into fields like health care, logistics, transportation and manufacturing, the infrastructure supporting those systems must be reliable, flexible, fast and efficient. By reimagining the foundations of the digital age, the InfraAI Center is ensuring that the AI systems of tomorrow will be ready to handle future challenges – from the edge to the datacenter and everywhere in between. 
