GNOME's new AI assistant can even run Linux commands for you - here's how

Aug. 5, 2025

Key takeaways

  • Newelle is an AI assistant for the GNOME desktop.
  • It's capable of standard chats and even running commands.
  • However, Newelle does require Flatseal to run commands on Linux.

There's a new AI assistant available for the GNOME desktop, and it just reached version 1.0 status. That new AI assistant is called Newelle, and it's already proven to be a worthy contender for your desktop.

Newelle isn't just another large language model manager, but a full-blown assistant that can run Linux commands from human-readable descriptions (more on that in a bit), serve as a traditional AI chatbot, and more. Newelle uses Bai Chat as its backend and allows you to download and select from different LLMs (some of which will require an API key). 

Also: The top 5 GNOME extensions I install first (and what they can do for you)

Newelle features chat history, chat editing, profiles, a mini app, extensions, keyboard shortcuts, chat saving, recording, and more.

This app can easily serve as the missing link that gives GNOME something similar to what Gemini is to Android (although it does take a bit more work to get it there).

I've been using Newelle for a few days now and have found it to be quite a handy app. In fact, it's replaced Ollama/Msty as my go-to GUI for local LLMs. Not only is Newelle as easy to use as Ollama/Msty, it better fits the GNOME aesthetic, is faster, and doesn't take nearly as many system resources.

The only caveat I've found with Newelle is that getting certain commands to run properly can be a challenge. There are specific steps you must take to allow commands to run, but once you've taken care of those configurations, you can use Newelle to run commands on your Linux system.

Let's install Newelle and then configure it to run commands.

Installing Newelle

What you'll need: Newelle is installed on Linux via Flatpak, so you'll need a distribution with that universal package manager installed and working.

1. Open a terminal window

The first step is to open a terminal window. You can use any terminal application you have installed.
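
If you want to confirm Flatpak is ready to go before continuing, a quick version check from that same terminal will tell you (this is a general Flatpak check, not anything specific to Newelle):

flatpak --version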

2. Install Newelle

To install Newelle, run the following command:

flatpak install https://dl.flathub.org/repo/appstream/io.github.qwersyk.Newelle.flatpakref

When the command finishes, you can close the terminal window with the exit command.
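
You don't have to rely on the desktop menu later on, either; the application ID in the install URL above should also let you launch Newelle straight from the terminal:

flatpak run io.github.qwersyk.Newelle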

Also: How I feed my files to a local AI for better, more relevant responses

Adding an LLM

Before you configure Newelle to run commands, you'll need to select which large language model (LLM) to use. Some LLMs are easy to add, while others require an API key to function properly. Let's make this easy and set up a local LLM. Here's how.

1. Open Newelle

Open the Newelle app from your desktop menu. 

2. Open Settings

From the Newelle main window, click the three horizontal line menu button near the top left and select Settings.

The Newelle main window. The Newelle UI is well designed and easy to use. (Screenshot: Jack Wallen)

3. Select your model

From the LLM listing, click the downward-pointing arrow to download an LLM and then, once it's downloaded, select it by clicking the associated radio button. You can download as many LLMs as you want, but obviously, you can only use one at a time. Once you've taken care of this, you can close the Settings window.

Also: I tried Sanctum's local AI app, and it's exactly what I needed to keep my data private

The Newelle LLM configuration window. You can download as many LLMs as needed, but you can only use one at a time. (Screenshot: Jack Wallen)

Configuring Newelle to run commands

This is where it gets a bit tricky. You have to change a few specific bits to enable Newelle to run commands for you. 

1. Open Settings

Go back to the Settings window and click on the General tab.

2. Disable "Command virtualization"

Under Neural Network Control, disable "Command virtualization."

The Newelle General tab in Settings. This is the only configuration you must make in Newelle itself so commands can be run from the app. (Screenshot: Jack Wallen)

3. Install Flatseal

Because Newelle is installed within a sandboxed environment, you have to give it permission to access your file system. To do that, you must install Flatseal with the command:

flatpak install flathub com.github.tchx84.Flatseal
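
Once that finishes, Flatseal will appear in your desktop menu, or you can launch it from the same terminal with:

flatpak run com.github.tchx84.Flatseal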

4. Configure Newelle permissions

Open Flatseal and click on Newelle in the left sidebar. Scroll down until you see the Filesystem section. In that section, enable "All user files."

The Newelle entry in Flatseal, showing the "All user files" option. This is the first configuration you must make in Flatseal. (Screenshot: Jack Wallen)

5. Configure the System Bus

Scroll down to the System Bus section and click the + button associated with Talks. In the new field, add:

org.freedesktop.Flatpak

The Flatseal System Bus configuration section for Newelle. This is the final configuration you must make so Newelle can run Linux commands. (Screenshot: Jack Wallen)

You can now close Flatseal.
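
If you'd rather skip the GUI, the same two permissions can most likely be granted from the terminal with flatpak override; this is my assumption based on how Flatseal maps its toggles ("All user files" to the home filesystem, a System Bus Talks entry to a system-talk-name), not something from Newelle's documentation:

flatpak override --user --filesystem=home io.github.qwersyk.Newelle
flatpak override --user --system-talk-name=org.freedesktop.Flatpak io.github.qwersyk.Newelle

You can double-check what you've granted with the command flatpak override --user --show io.github.qwersyk.Newelle.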

Once you've taken care of the above, you can then run commands within Newelle. For instance, in the chat field, type: 

create a folder in my home directory named

Newelle will create the folder, and it's ready to use.
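
Under the hood, a sandboxed Flatpak app typically reaches the host shell through flatpak-spawn, which relays commands over the org.freedesktop.Flatpak interface you just permitted; that is presumably what Newelle is doing when it handles a request like the one above. Run from inside a Flatpak sandbox, the equivalent manual call would look something like this (the folder name is just a placeholder I made up):

flatpak-spawn --host mkdir -p ~/newelle-test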

Also: How I made Perplexity AI the default search engine in my browser (and why you should too)

And that's the gist of Newelle. Using it as a traditional AI chatbot is straightforward and simple enough that anyone can use it. Sure, getting it to run commands takes a few extra steps, but if you're using Linux, you should be OK with that process.
