If we had to pick a winner of the AI chatbot race among the best web browsers, it would appear to be Opera. My favorite part: privacy is at its core.
That's what stuck with me most after attending the Opera Browser Days conference in Bristol last week, where the team gave us a sneak peek of what's in store.
"All Opera users will benefit from on-device AI by the end of 2025," reads the slide projected behind Krystian Kolondra, EVP of Desktop & Gaming at Opera, while he explains Opera's ambitious goal: building a local LLM model so that none of your data has to leave your device. "Which is perfect," he said. "Because then you have no issue when it comes to privacy."
Opera was an early pioneer in the generative AI space, integrating ChatGPT in February 2023. It then became the first multi-LLM AI browser solution when it introduced its in-house AI chatbot Aria later in the year with Opera One. Now, Opera wants to become the first browser to offer the "ultimate privacy" AI experience – local AI.
"Aria today is built on top of multiple large language models, like Google Gemini and OpenAI ChatGPT. But we're also building local models that are going to work that way," Kolondra told me.
An AI-based browsing experience
Right now, Opera isn't your usual default browser, nor does it want to be. The firm has always strived to be the users' choice, so the team is generally quick to jump on the newest technologies, experiment, and bring additional functionalities to its browsers. This is exactly what happened when Opera entered the generative AI space in 2023.
In February, the company announced a partnership with OpenAI – the maker of ChatGPT – to realize its ambition of seamlessly integrating artificial intelligence and generative computing technologies into its products. A month later, Opera began its transformation into an AI browser, introducing sidebar integrations of the popular GPT-based chatbots ChatGPT and ChatSonic, alongside Opera's first native proprietary AI feature, AI Prompts, on both Opera and Opera GX.
The real revolution, however, came with Opera One: an AI-powered browser built to bring new features for a generative AI-based future. The first of these was Aria – Opera's built-in AI service, which is completely free to use on all Opera applications (Windows, macOS, Linux, Android, and iOS) and available in 50 languages across 180 countries.
In an AI space where more data is always better, what's most interesting about Aria is that it goes in the opposite direction, allowing Opera to keep its privacy promise: users' anonymity.
For starters, as mentioned earlier, Opera is now a multi-LLM browser. This means it gets the best of both worlds: the power of multiple large language models (OpenAI's ChatGPT and Google Gemini) and the flexibility of its own AI engine.
As Joanna Czajka, Product Director at Opera and Lead of the Opera Desktop Browser, told me, this is advantageous because developers can switch to the best model for each feature. Plus, it means more privacy for users, as they interact directly with Aria – which isn't in charge of training the LLMs – to use AI functionalities in the browser.
Last week, Opera axed the log-in requirement for Aria. Arjan Van Leeuwen, Product Engineering Lead at Opera, told me the company did that to make it easier for people to start using Aria. "We believe this service will be an integral part of the browser," he told me. "We can't have users missing out on it just because they didn't want to create an account. That's why accountless is important."
AI training has been one of the most contentious privacy issues since the boom of generative AI. ChatGPT-like tools need data – lots of data – to carry out their functions. Besides scraping every corner of the internet for training data, LLM providers also use the personal data people share with their chatbots to keep improving them.
As per Opera's FAQ, Aria can only access the information you provide by writing in the chat and cannot remember any information or data from previous conversations. "Neither the browser nor other web services can access any data that you provide to Aria," it confirms, adding "OpenAI and Google can’t use questions you provide to the chat to improve their systems."
It's worth mentioning, though, that Opera stores the conversation history on its servers for 30 days, while OpenAI's servers also keep anonymized (not connected to your identity) parts of conversations for 30 days.
On its side, Google Gemini's privacy hub explains the Big Tech giant collects all your Gemini Apps conversations, related product usage information, info about your location, and your feedback "to provide, improve and develop Google products and services and machine learning technologies."
How does Aria process people's data?
The best example of this is how AI can assist with tab management. Imagine starting your morning browsing routine with all your tabs open from the day before. On the Aria Command Line, you can ask the AI engine to close all the open YouTube tabs to save you time.
As Czajka put it, we should think of Aria as preparing a recipe. "What is sent to our server – so, to the AI engine – is only the prompt ‘Close all my YouTube tabs.’ Nothing else," she said, adding that once Aria prepares the recipe, the very generic instructions are sent to the device to be read and performed. It's the browser that locally detects your YouTube tabs and executes the action.
"This is very crucial because there are so many awesome things you can do with Aria and we want to be able to give it to users but we thought, OK, let's make it private first."
So, despite not being in charge of the LLM training, Opera found a way to make the AI experience more private for users. With Aria, the engineering team supposedly has the flexibility to "anonymously and generically" teach Aria browsing patterns and add new capabilities to the browser interface.
A step closer to local AI
Sure, a multi-LLM browser model that uses the browser's very own private AI engine to interact with users sounds like a far more convenient – and frankly less scary – option. But imagine how much better it would be if none of your data left your machine.
This is what Opera is currently working on: large language models that are completely local. "Just a piece of software running on your computer," Van Leeuwen told me. "It's the ultimate privacy because nothing is going out from your device."
The goal is ambitious. LLMs are notoriously heavy pieces of software that not every computer can support without crashing; even when one can, there's a high risk that a local LLM will considerably slow down performance. The question now is: how ready are current devices for local AI?
This is exactly what Opera has set out to discover. Since April, anyone with a developer version of Opera One can download and start using one of the supported local LLMs, such as Llama from Meta, Gemma from Google, Mixtral from Mistral AI, and more.
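Opera hasn't detailed how the developer build wires these models in, but to get a sense of what "just a piece of software running on your computer" means in practice, here's a minimal sketch that queries a locally running model through Ollama's HTTP API – a popular open-source model runner, used here purely as a stand-in rather than Opera's actual integration.

```typescript
// Minimal sketch: ask a locally running model via Ollama's HTTP API.
// Ollama is a generic stand-in here; nothing in this example is Opera code.
async function askLocalModel(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // any locally installed model, e.g. gemma or mixtral
      prompt,
      stream: false,   // return the whole answer in one JSON object
    }),
  });
  const data = await res.json();
  return data.response; // the generated text – nothing left the machine
}

askLocalModel("Summarize this article in one sentence.").then(console.log);
```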
"The idea is to determine what kind of models people use, how they use them, and, of course, in the future, how we can offer them automatically," said Van Leeuwen.
With new-generation laptops – such as the new ARM-based Windows devices – becoming widespread, it isn't difficult to share the optimism seen among Opera engineers. "This is definitely the future," said Van Leeuwen.
While it's too early to say how far Opera is from implementing fully local AI, the first tests have shown promising results. This is why the provider says it's ready to experiment with a hybrid approach.
On this point, Van Leeuwen told me: "Right now we're at the stage where we know we can do some things locally. But, for some things, we still need cloud processing. What we're trying to determine is what we need for which and how we can get to something that's as private as possible, but still has as many of the features as you want."
The idea is simple – have a smaller local LLM on the device and a large model running in the cloud. The browser takes a user request without the data, asks the larger model for precise instructions, and executes the action on the device. This means the heavy lifting is done in the cloud, while your data remains local.
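Again, Opera hasn't published this design, but a hybrid split could be routed along these lines – a rough sketch in which the capability check, endpoints, and helper functions are all assumptions for illustration.

```typescript
// Rough sketch of a hybrid local/cloud split. The routing flag, endpoint, and
// helper functions are assumptions, not Opera's published architecture.

declare function askLocalModel(prompt: string): Promise<string>;  // small on-device LLM
declare function applyPlanLocally(plan: string): Promise<string>; // executes against local data

async function runHybrid(prompt: string, needsHeavyReasoning: boolean): Promise<string> {
  if (!needsHeavyReasoning) {
    // Simple requests never leave the device.
    return askLocalModel(prompt);
  }
  // Harder requests send only the prompt to the cloud; the returned plan
  // is then applied to local data on the device.
  const res = await fetch("https://cloud.example.com/plan", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  const { plan } = await res.json();
  return applyPlanLocally(plan);
}
```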
You won't need to wait too long to see this hybrid AI approach working in practice either. Some new features are expected to come to Aria very soon and they will already use this hybrid mode. "But in the future, we see that there will be also AI work happening on the device," Kolondra added.
Watchwords: privacy and security
Opera's approach to AI is just one way the browser seeks to keep its users secure and anonymous when surfing the web.
The provider claims to offer security and privacy by design, promising to keep its users anonymous by minimizing the identifiable data it collects.
As a Norwegian company with offices in the EU, Opera has to follow the bloc's strict privacy laws. The provider goes a step further by applying GDPR rules outside the EU, in markets where these protections don't generally apply.
All Opera browsers (besides Opera Mini) also come with a built-in virtual private network service. This is not just completely free to use but also operates under a strict no-log policy that has recently been independently audited. It's official – Opera VPN never records your activities or data, further boosting your online privacy when browsing.
As Opera users kept asking for more VPN options, the team partnered with TechRadar's best VPN pick, NordVPN, to develop a more comprehensive premium service – Opera VPN Pro.
As a privacy expert, the thought of generative AI spreading across all our daily apps and software fills me with a sense of fear for what it will mean for our digital rights.
Yet, after listening to Opera's new developments and putting questions to both the privacy and engineering teams, I felt perhaps there is a way to benefit from the power of artificial intelligence without giving up on what I care about the most – my personal data.
As I write this, I can still hear the last thing Kolondra told me: "We don't want users' data, at least the data that we could connect to one particular user. We don't believe it is useful for us in any way. We only need the data to make our products better."
"If we want to know something – the game that people play, the websites they like, or more – we just ask politely, with a survey. I don't know why many companies are not doing this way," he added.
He's right. The best way to achieve privacy is not to create room for abuse, mistakes, or leaks in the first place – in other words, not to ask for or give away the data at all. Local AI promises exactly that, and I cannot wait to see it working.