There's a new AI assistant available for the GNOME desktop, and it just reached version 1.0 status. That new AI assistant is called Newelle, and it's already proven to be a worthy contender for your desktop.
Newelle isn't just another large language model manager, but a full-blown assistant that can run Linux commands from human-readable descriptions (more on that in a bit), serve as a traditional AI chatbot, and more. Newelle uses Bai Chat as its backend and allows you to download and select from different LLMs (some of which will require an API key).
Newelle features chat history, chat editing, profiles, a mini app, extensions, keyboard shortcuts, chat save, recording, and more.
Newelle can easily serve as a missing link, giving the GNOME desktop something similar to what Gemini is to Android (although it does take a bit more work to get it there).
I've been using Newelle for a few days now and have found it to be quite a handy app. In fact, it's replaced Ollama/Msty as my go-to GUI for local LLMs. Not only is Newelle as easy to use as Ollama or Msty, but it also better fits the GNOME aesthetic, is faster, and doesn't consume nearly as many system resources.
The only caveat I've found with Newelle is that getting certain commands to run properly can be a challenge. There are specific steps you must take to allow commands to run, but once you've taken care of those configurations, you can use Newelle to run commands on your Linux system.
Let's install Newelle and then configure it to run commands.
What you'll need: Newelle is installed on Linux via Flatpak, so you'll need a distribution with that universal package manager installed and working.
The first step is to open a terminal window. You can use any terminal window you have installed.
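Before installing anything, it's worth a quick sanity check that Flatpak is actually available. This is a minimal sketch that prints the Flatpak version and configured remotes if the flatpak CLI is on your PATH, and a hint otherwise:

```shell
# Verify that Flatpak is installed and that a remote (such as flathub) is configured.
if command -v flatpak >/dev/null 2>&1; then
    flatpak --version
    flatpak remotes   # should list "flathub" if the remote has been added
else
    echo "Flatpak is not installed; add it from your distribution's repositories first."
fi
```

If the flathub remote doesn't appear, consult your distribution's documentation for enabling Flathub before continuing.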
To install Newelle, run the following command:
flatpak install https://dl.flathub.org/repo/appstream/io.github.qwersyk.Newelle.flatpakref
When the command finishes, you can close the terminal window with the exit command.
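If you want to confirm the install succeeded before closing the terminal, a guarded check against Newelle's app ID (taken from the flatpakref above) looks like this:

```shell
# Check whether the Newelle Flatpak is registered with the local installation.
APP_ID="io.github.qwersyk.Newelle"
if command -v flatpak >/dev/null 2>&1 && flatpak info "$APP_ID" >/dev/null 2>&1; then
    echo "$APP_ID is installed"
else
    echo "$APP_ID is not installed (or flatpak is unavailable)"
fi
```

You can also launch the app directly from the terminal with flatpak run io.github.qwersyk.Newelle instead of using the desktop menu.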
Before you configure Newelle to run commands, you'll need to select which large language model (LLM) to use. Some LLMs are easy to add, while others require an API key to function properly. Let's make this easy and set up a local LLM. Here's how.
Open the Newelle app from your desktop menu.
From the Newelle main window, click the menu button (three horizontal lines) near the top left and select Settings.
The Newelle UI is well designed and easy to use.
From the LLM listing, click the downward-pointing arrow to download an LLM and then, once it's downloaded, select it by clicking the associated radio button. You can download as many LLMs as you want, but obviously, you can only use one at a time. Once you've taken care of this, you can close the Settings window.
You can download as many LLMs as needed, but you can only use one at a time.
This is where it gets a bit tricky. You have to change a few specific settings to enable Newelle to run commands for you.
Go back to the Settings window and click on the General tab.
Under Neural Network Control, disable "Command virtualization."
This is the only configuration you must make in Newelle so commands can be run from the app.
Because Newelle is installed within a sandboxed environment, you have to give it permission to access your file system. To do that, you must install Flatseal with the command:
flatpak install flathub com.github.tchx84.Flatseal
Open Flatseal and click on Newelle in the left sidebar. Scroll down until you see the Filesystem section. In that section, enable "All user files."
This is the first configuration you must make in Flatseal.
Scroll down to the Session Bus section and click the + button associated with Talks. In the new field, add:
org.freedesktop.Flatpak
This is the final configuration you must make so Newelle can run Linux commands.
You can now close Flatseal.
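If you'd rather skip the Flatseal GUI entirely, the same two permissions can be granted from the terminal with flatpak override. This is a sketch applied per-user against Newelle's app ID:

```shell
APP_ID="io.github.qwersyk.Newelle"

if command -v flatpak >/dev/null 2>&1; then
    # --filesystem=home mirrors Flatseal's "All user files" toggle;
    # --talk-name=org.freedesktop.Flatpak lets the sandboxed app ask the
    # Flatpak session service to spawn commands on the host system.
    flatpak override --user \
        --filesystem=home \
        --talk-name=org.freedesktop.Flatpak \
        "$APP_ID"

    # Review the overrides you've applied:
    flatpak override --user --show "$APP_ID"
else
    echo "flatpak not found; run this on your Linux desktop."
fi
```

Overrides applied this way show up in Flatseal too, so you can mix and match both approaches.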
Once you've taken care of the above, you can then run commands within Newelle. For instance, in the chat field, type:
create a folder in my home directory named
Newelle will create the folder, and it's ready to use.
And that's the gist of Newelle. Using it as a traditional AI chatbot is straightforward enough that anyone can manage it. Sure, getting it to run commands takes a few extra steps, but if you're using Linux, you should be comfortable with that process.