
Rust is gradually becoming the default choice for AI infrastructure. The clearest indicator of this came when OpenAI’s Greg Brockman simply remarked, “Rust is a perfect language for agents.” The response was telling: within hours, developers across the AI ecosystem replied with the same sentiment in different words.
rust is a perfect language for agents, given that if it compiles it's ~correct
— Greg Brockman (@gdb) January 2, 2026
That alignment now has data to back it up. The 2025 Stack Overflow Developer Survey placed Rust as the most admired programming language, with a 72% rating. More tellingly, it is also one of the most desired. Only 29.2% of respondents used Rust extensively last year, yet 72.4% expressed the desire to use it next year. Rust has now held the ‘most loved language’ title for nine consecutive years.
That gap tells the real story. Rust is not winning because it is easy. It is winning because developers believe the trade-off is worth the effort for what lies ahead.
And what lies ahead is AI.
Over the past three to four months, Rust has shifted from a systems language admired by a dedicated minority to a default choice for serious AI infrastructure. This shift is not driven by theory or conference talks, but by daily friction. Modern AI systems run continuously. They touch files and networks, manage memory under pressure and act without supervision.
When something goes wrong, failures are often silent, difficult to debug and expensive to fix. Rust reduces the number of places where those failures can occur.
The anecdotes accumulating on X and Reddit show why. Developers describe moving memory-constrained services from Go to Rust. As one developer put it, “The Rust rewrite eliminated GC memory spikes and makes the services super predictable and stable on low-resource machines.”
Rust also aligns well with LLM-driven development. Its strict type system gives agents richer information to work with, there is enough training data for models to produce usable code, and a simple rule changes everything: if it compiles, it is usually correct. Brockman pointed this out directly.
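That rule is easier to see in code. The sketch below is a minimal, hypothetical example (the file path and helper function are invented for illustration) of what it means in practice: both the file read and the parse return a Result, and the compiler refuses to build any version of the program that ignores a failure path.

```rust
use std::fs;
use std::num::ParseIntError;

// Parsing returns a Result, so a caller cannot silently ignore a bad value.
fn parse_port(raw: &str) -> Result<u16, ParseIntError> {
    raw.trim().parse::<u16>()
}

fn main() {
    // Reading the file also returns a Result. The match must cover both
    // arms, otherwise the program does not compile.
    match fs::read_to_string("config/port.txt") {
        Ok(contents) => match parse_port(&contents) {
            Ok(port) => println!("listening on port {port}"),
            Err(e) => eprintln!("invalid port value: {e}"),
        },
        Err(e) => eprintln!("could not read config: {e}"),
    }
}
```

Nothing here is clever; the point is that the unhappy paths are part of the type signatures, so an agent or a human editing this code gets the missing cases flagged at compile time rather than in production.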
Why the Shift Is Explicit
A recent viral post by a Microsoft executive, Galen Hunt, claimed the company plans to migrate its entire C and C++ codebase to Rust by 2030. Although Hunt later clarified that his post was taken out of context, the sentiment resonated widely.
The same pattern shows up inside major AI labs. Andrej Karpathy provided one of the clearest signals late last year. While documenting experiments with AI-assisted coding, he mentioned building a tokeniser in Rust instead of relying on existing libraries. Tokenisers sit directly on the performance path of every model.
Choosing Rust there is a quiet admission that Python’s convenience ends where correctness and speed begin. Even Elon Musk has publicly agreed that Rust is the language for AGI.

At OpenAI, the shift is explicit rather than implied. The Codex command line interface was rewritten entirely in Rust and released publicly. This was not a cosmetic change. OpenAI said the Rust version is faster, more stable and easier to reason about as it scales. Engineers pointed directly to Rust’s guarantees as a reason. Code that compiles tends to behave. For autonomous agents that read and write files, manage tools and run unattended, that property matters far more than syntactic comfort.
A similar learning curve is visible at Anthropic. Engineers there have discussed learning Rust while building tooling around Claude, often with AI systems helping them write the code itself. That detail matters. Rust is no longer treated as too difficult for fast-moving teams. AI assistance has flattened the learning curve enough that teams can get Rust’s benefits without years of accumulated muscle memory.
Popular Meta apps, including Instagram and Facebook Messenger, relied on a decade-old C codebase for their messaging library, which led to problems with memory management and a subpar developer experience. In July last year, the company finally switched to Rust.
On an episode of the Meta Tech Podcast, three of the company’s engineers, Elaine W, Buping W and Pascal Hartig, shared their experience with using C, discussing the challenges they faced, the migration process and their success with Rust.
“I think one of the biggest things about Rust is the compile-time memory safety enforcement. C++ doesn’t have that. Since a lot of the issues that we face in the day-to-day are related to memory management, it doesn’t make sense that we wouldn’t choose a language that combats that most effectively,” Elaine said.
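A generic sketch (not Meta’s code) of the guarantee Elaine is describing: the analogous C++ pattern compiles and then reads moved-from or freed memory at runtime, while rustc rejects it before the program ever runs.

```rust
// Ownership of the buffer moves into `consume`; the buffer is freed when
// the function returns, and the compiler tracks that fact.
fn consume(buffer: Vec<u8>) {
    println!("sent {} bytes", buffer.len());
}

fn main() {
    let payload = vec![1u8, 2, 3];
    consume(payload); // ownership of `payload` moves here

    // The line below would be a use-after-move. In C++ the equivalent code
    // compiles and misbehaves at runtime; rustc refuses to compile it:
    // println!("{:?}", payload);
}
```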
Meta’s migration demonstrated that modern systems languages can deliver both an improved developer experience and greater production reliability.
Broader Ecosystem Shift
The ecosystem has followed. Rust-based inference engines and local runtimes have appeared with a shared goal: running models faster, safer and closer to the hardware. Projects like candle and mistral.rs circulate widely among engineers focused on deployment instead of demos. Python still dominates research notebooks, but production systems tell a different story.
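For a sense of what these projects look like in practice, the snippet below follows the basic tensor example from candle’s documentation (assuming the candle_core crate is added as a dependency): it multiplies two random matrices on the CPU, the kind of minimal building block these Rust runtimes are composed from.

```rust
use candle_core::{Device, Tensor};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Run on the CPU; candle can also target CUDA or Metal devices.
    let device = Device::Cpu;

    // Two random matrices with shapes (2, 3) and (3, 4).
    let a = Tensor::randn(0f32, 1.0, (2, 3), &device)?;
    let b = Tensor::randn(0f32, 1.0, (3, 4), &device)?;

    // Matrix multiplication yields a (2, 4) tensor.
    let c = a.matmul(&b)?;
    println!("{c}");
    Ok(())
}
```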
Companies around the world, such as Atlassian, Discord, Figma and Mozilla, have significantly benefited from adopting Rust in production. In fact, more than 90 companies across a wide range of industries now rely on Rust at various levels of their tech stack.
The Stack Overflow data reinforces this. Rust is not only admired as a language; Cargo, its build tool and package manager, ranks as the most admired cloud development and infrastructure tool, at 71%. Tooling friction compounds fast when systems sprawl across services, models and machines, and Cargo meaningfully reduces that friction.
There is also a broader infrastructure trend underneath the language debate. Modern AI systems depend on vector databases, streaming pipelines, schedulers and low-latency services. Several companies have rebuilt these components in Rust over recent months. These are not experiments. They are core systems where memory safety bugs or unpredictable pauses are unacceptable.
This is why comparisons with Go keep resurfacing. Many engineers acknowledge that Go works well and ships fast, but they also describe hitting ceilings: garbage collection introduces latency spikes, and predictability matters more as systems shrink onto smaller machines or run closer to the edge. Rust trades ease for control.
Recurring Reddit discussions capture this clearly, even when participants disagree. Developers describe rewriting heavy data processing jobs, cutting runtimes from tens of minutes to minutes while dramatically reducing memory usage.
What ties all of this together is a change in how AI is built. Rust fits this phase better than almost any other language. It offers C-level performance without C-level risk. The research world may still speak Python, but the systems that actually run modern AI are increasingly speaking Rust.