
Google Chrome stands as the most popular web browser, with a 67.97% share worldwide (translating to around 3.62 billion people), at least according to StatCounter. Apple's Safari comes in at a distant second with 17.09%.
In a surprising turn of events, security researcher Alexander Hanff (also known as "That Privacy Guy") released a damning report alleging that Google Chrome is silently downloading and installing a 4GB on-device AI model onto users’ devices without their knowledge or consent (via Tom's Hardware).
Perhaps more concerningly, Hanff flagged a similar issue with Anthropic's AI software, noting that the company's Claude Desktop app installed a Native Messaging bridge in seven Chromium-based browsers on every system where it was installed.
Alexander Hanff described his discoveries as part of a broader look at how Big Tech deploys AI features across its portfolios and properties. While the news of Google Chrome installing a 4GB AI model has sparked controversy, with many calling it bloatware or spyware, the model is actually designed to protect your privacy by running AI features locally when you use them.
For context, if you navigate to Chrome's User Data folder on your hard drive (C:\Users\&lt;username&gt;\AppData\Local\Google\Chrome\User Data), you'll discover a massive folder called OptGuideOnDeviceModel containing a large file named "weights.bin."
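If you'd rather not dig through Explorer, a short script can report whether the folder exists and how large it is. This is a minimal sketch based on the path described above; it assumes the standard Windows install location (via the LOCALAPPDATA environment variable), and it searches recursively because Chrome may nest the folder under a profile subdirectory.

```python
# Sketch: locate Chrome's OptGuideOnDeviceModel folder(s) and report their size.
# The install path is an assumption based on the article's description and may
# differ on non-Windows systems or portable installs.
import os
from pathlib import Path

def folder_size_bytes(folder: Path) -> int:
    """Sum the sizes of all files under `folder`, recursively."""
    return sum(f.stat().st_size for f in folder.rglob("*") if f.is_file())

def find_model_folders(user_data: Path):
    """Yield every OptGuideOnDeviceModel directory under the User Data dir."""
    if user_data.is_dir():
        yield from (p for p in user_data.rglob("OptGuideOnDeviceModel") if p.is_dir())

if __name__ == "__main__":
    user_data = Path(os.environ.get("LOCALAPPDATA", "")) / "Google" / "Chrome" / "User Data"
    found = False
    for model_dir in find_model_folders(user_data):
        found = True
        size_gb = folder_size_bytes(model_dir) / 1024**3
        print(f"{model_dir}: {size_gb:.2f} GB")
    if not found:
        print("No OptGuideOnDeviceModel folder found.")
```

On a machine where Chrome has downloaded the model, this should print the folder path alongside a size in the ballpark of the reported 4GB.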
weights.bin is Google's Gemini Nano on-device AI model, part of Google's broader strategy to integrate AI tech directly into its Chromium-based browser. Chrome depends on these weights to run AI features locally on your device instead of in the cloud, where processing raises its fair share of challenges, including data privacy questions.
While speaking to Android Authority, a Google spokesperson commented on the issue:
We’ve offered Gemini Nano for Chrome since 2024 as a lightweight, on-device model. It powers important security capabilities like scam detection and developer APIs without sending your data to the cloud. While this requires some local space on the desktop to run, the model will automatically uninstall if the device is low on resources. In February, we began rolling out the ability for users to easily turn off and remove the model directly in Chrome settings. Once disabled, the model will no longer download or update. More details in our help center article.
Interestingly, the download does seemingly come with a "warning" that Chrome will fetch the file, suggesting that the browser may have been granted blanket permission to download any files it requires to function.
"Chrome does not surface it," That Privacy Guy added. "If the user deletes it, Chrome re-downloads it."
So, Google Chrome will automatically download the AI model directly to your computer when you first interact with any feature that relies on its new AI-centric APIs. However, your device must meet the following specifications:
- OS: Windows 10 or 11; macOS 13 or later; Linux; Chrome OS on Chromebook Plus devices.
- Storage: 22GB of free space.
- CPU or GPU: Built-in models can run on either a GPU or a CPU:
  - GPU: more than 4GB of VRAM.
  - CPU: 16GB of RAM or more and four or more CPU cores.
- Network: Unlimited data or an unmetered connection for the initial model download.
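Two of the requirements above, free storage and CPU core count, can be checked from the standard library alone; here's a minimal sketch. RAM and VRAM checks need platform-specific tools, so they're left out, and the 22GB and four-core thresholds are taken directly from the list above.

```python
# Sketch: check a machine against two of Chrome's stated requirements
# (free disk space and CPU core count) using only the Python standard library.
import os
import shutil

REQUIRED_FREE_GB = 22    # free storage needed for the model download
REQUIRED_CPU_CORES = 4   # minimum cores for CPU-only execution

def meets_basic_requirements(path: str = ".") -> dict:
    """Return a per-check pass/fail report for the disk holding `path`."""
    free_gb = shutil.disk_usage(path).free / 1024**3
    cores = os.cpu_count() or 0
    return {
        "free_disk_ok": free_gb >= REQUIRED_FREE_GB,
        "cpu_cores_ok": cores >= REQUIRED_CPU_CORES,
        "free_gb": round(free_gb, 1),
        "cpu_cores": cores,
    }

if __name__ == "__main__":
    print(meets_basic_requirements())
```

Passing both checks doesn't guarantee the download will trigger, since Chrome also weighs the RAM, GPU, and network conditions listed above.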
To find out whether the foundational model has already been installed on your device, navigate to chrome://on-device-internals in Google Chrome’s address bar. You can also check the model's version, installation path, and folder size in the Model Status section.
Windows Central's take
It's quite obvious that users have been split on the usefulness of AI on their PCs for a while now. Some would rather have AI features completely scrapped from their systems, while others have found reasonable use cases to support their workflows and hobbies.
I'm not opposed to the idea of having AI features embedded into my operating system, as that seems to be the direction the world is headed anyway. Still, I'd rather "Big Tech" take a more transparent approach to its AI roadmap: give users more control over how AI is deployed on their devices, and at the very least, offer an option to opt out of features the everyday user may not consider essential.
