Ubuntu plans AI features with focus on local inference


Canonical is preparing to add AI features to Ubuntu over the next year, according to a blog post by Jon Seager, the company’s vice-president of engineering.

Seager said the features will fall into two categories: some will use AI models in the background to improve existing operating-system functions, while others will introduce "AI native" workflows for users who choose to adopt them. The planned Ubuntu AI features include speech-to-text, text-to-speech, improved accessibility functions, troubleshooting support, and personal automation. Canonical said its approach will prioritise model transparency, open-weight models, open-source harnesses, and local inference where possible.

Canonical’s internal AI use

Canonical’s plan also covers how its own engineering teams use AI tools. Seager said the company has begun a more deliberate internal effort to help engineers understand where AI tools are useful, without measuring staff by token use or the amount of code written with AI.

“I will not be measuring people at Canonical by how much they use AI, but rather continue to measure them on how well they deliver,” Seager wrote.

Canonical is also encouraging teams to test different AI tools, which Seager said is intended to help the company learn where those tools are useful. He said AI might help with development and education, but that production code needs to remain controlled and reviewable.

Seager also said low-quality AI-generated contributions to open source projects have never been acceptable and are not encouraged at Canonical. Engineers and contributors need to remain sceptical of AI-generated output.

The company’s plans for Ubuntu separate AI features into “implicit” and “explicit” categories. Implicit AI refers to existing OS functions that are improved with AI models without changing how users interact with the software; speech-to-text and text-to-speech are examples. Seager described these as accessibility features that can be improved with local inference and open-weight models.

Explicit AI features would be more visible to users, such as agent-based workflows for writing documents or applications and automating tasks. The company pointed to Ubuntu’s existing Snap packaging model as part of the foundation for managing risk.

Local inference and model access

Local inference is central to Canonical’s plan: models run on the user’s device via snaps, with the goal of reducing the complexity of setting up models and hardware-specific versions. Inference snaps are designed to provide optimised components for supported silicon platforms where available.

Inference snaps also follow the same confinement rules as other snaps. Canonical said this limits model access to the user’s machine and data.
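In practice, a snap declares its confinement level and requests access to system resources through named interfaces in its metadata. The fragment below is a hypothetical sketch (the snap name, command, and interface choices are illustrative; `confinement: strict` and `plugs` are real snapcraft.yaml keys):

```yaml
# Hypothetical snapcraft.yaml fragment for a strictly confined inference snap.
name: example-inference   # illustrative name, not a real snap
confinement: strict       # the snap runs sandboxed by default
apps:
  serve:
    command: bin/serve
    plugs:
      - network-bind      # e.g. to expose a local inference endpoint
      - home              # opt-in access to user files via an interface
```

Under strict confinement, anything not granted through an interface is denied, which is the mechanism Canonical points to for limiting a model's access to the user's machine and data.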

The company is also weighing model licensing terms, not treating access to model weights as the only measure of openness. Seager said Canonical will take a balanced view of licences when selecting models to make available in Ubuntu.

That approach reflects a distinction between open-weight models and the broader transparency expectations usually associated with open source software. Canonical said its preference is for local inference, open-source harnesses, and clearly defined interfaces to external services where users need them.

The company is also tracking newer models that support functions like tool calling. These capabilities allow models to interact with external APIs, search the web, access file systems, and assist with troubleshooting when given permission.
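Tool calling is typically wired up by a harness that maps model-emitted requests onto local functions. A minimal sketch in Python follows; the tool names, JSON request format, and dispatch logic are illustrative assumptions, not any Canonical or Ubuntu API:

```python
import json

def read_file(path: str) -> str:
    """Tool: return the contents of a local file (scoped by the harness)."""
    with open(path) as f:
        return f.read()

def check_service(name: str) -> str:
    """Tool: report a (stubbed) service status for troubleshooting."""
    return f"{name}: active"

# Registry of local functions the harness is willing to expose to the model.
TOOLS = {"read_file": read_file, "check_service": check_service}

def dispatch(tool_call_json: str) -> str:
    """Parse a model-emitted tool call and run the matching local function."""
    call = json.loads(tool_call_json)
    func = TOOLS[call["name"]]          # unknown tools raise KeyError
    return func(**call["arguments"])

# A model troubleshooting the network might emit a request like this:
request = json.dumps({"name": "check_service",
                      "arguments": {"name": "NetworkManager"}})
print(dispatch(request))  # -> NetworkManager: active
```

The key design point is that the model only ever produces a structured request; the harness decides which functions exist and executes them, which is where permissions can be enforced.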

Canonical said it plans to increase work on inference snaps by keeping up with newer model releases and adding optimised variants for more silicon platforms.

Agent workflows for Ubuntu

Seager also described a longer-term goal of making Ubuntu more context-aware. This includes using AI agents to help users navigate Linux workstation capabilities, especially where the desktop ecosystem remains fragmented across many tools and components.

The same approach could extend beyond desktop use. Seager said site reliability engineers managing Ubuntu systems could use AI to interpret logs during incidents, support root cause analysis, or run scheduled maintenance tasks under strict controls.

Canonical said such workflows should rely on existing production safeguards, including access controls, audit trails, scoped permissions, and separation between observation and action.

Seager said the question is not only whether organisations can trust agents, but whether agents can operate within the same controls already used in production systems. That includes read-only analysis, tightly scoped permissions for actions, and auditability for decisions and outcomes.
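The separation described here, read-only analysis versus tightly scoped and audited actions, can be sketched as a simple permission gate. The function and action names below are hypothetical illustrations, not any real Ubuntu tooling:

```python
import datetime

# Every attempted action is recorded, whether it was allowed or not.
AUDIT_LOG = []

def audited(action: str, granted: set) -> bool:
    """Allow an action only if it was explicitly granted; log every attempt."""
    allowed = action in granted
    timestamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    AUDIT_LOG.append((timestamp, action, allowed))
    return allowed

# An incident-response agent might be granted only read-only analysis:
granted = {"read_logs"}
print(audited("read_logs", granted))        # True: in scope
print(audited("restart_service", granted))  # False: mutating action not granted
```

Scoping grants per task and keeping the audit trail outside the agent's control mirrors the access-control and auditability safeguards already common in production systems.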

He also gave examples of user-facing tasks that could be handled through this model, like troubleshooting a Wi-Fi connection or setting up an open-source software forge with security and TLS already configured.

Hardware and efficiency limits

Hardware availability remains a constraint for local inference. Smaller models can run on more common hardware, but they do not yet match larger models for many tasks.

Seager said Canonical is watching developments in consumer-grade silicon with stronger inference capabilities. He also said performance needs to be considered alongside power efficiency, especially as local accelerators become more capable.

He noted that comparing cloud-based models and local models only by speed can miss part of the issue. Local accelerators can also reduce power draw for inference workloads, which is relevant as more AI functions are expected to run closer to the operating system.

Canonical did not give a specific release date for individual Ubuntu AI features. Seager said the features will be added over the next year as the company considers them mature enough for release.

(Photo by Gabriel Heinzer)

See also: OpenAI brings GPT-5.5 to Codex for coding tasks