Security researcher Alexander Hanff has reported that Google Chrome silently downloads a 4GB AI model to user devices without notice or explicit consent. The file, identified as "weights.bin," is part of Google's on-device AI system powered by the Gemini Nano model. According to Hanff, the browser automatically evaluates a user's hardware and, if the system meets specific requirements, initiates the download in the background.
Silent Background Downloads
Hanff, also known as "That Privacy Guy," conducted a controlled test on a fresh macOS profile to verify the behavior. Using filesystem event logs, he observed that the browser created a directory and downloaded the 4GB payload during idle browsing time without any human interaction. The process reportedly completed in approximately fourteen minutes.
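A similar observation can be reproduced without specialized tooling by polling the size of the relevant directory while the browser runs. The sketch below is a minimal illustration; the profile path and the "OptGuideOnDeviceModel" directory name are assumptions about where Chrome stores the model on macOS, not details confirmed by the report:

```python
# Minimal sketch: poll a directory's total size to watch a background
# download grow over time. Adjust the path for your OS and installation.
import os
import time

def dir_size(path: str) -> int:
    """Total size in bytes of all regular files under `path`."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total

if __name__ == "__main__":
    # Hypothetical location; the actual directory name may differ.
    watch = os.path.expanduser(
        "~/Library/Application Support/Google/Chrome/OptGuideOnDeviceModel"
    )
    while os.path.isdir(watch):
        print(f"{dir_size(watch) / 1e9:.2f} GB")
        time.sleep(30)  # sample every 30 seconds
```

Logging these samples alongside timestamps would show the download starting and completing with no user interaction, matching the behavior Hanff describes.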
Users who locate and delete the file may find that it is re-downloaded automatically unless they disable specific experimental flags or remove the browser entirely. There is currently no straightforward setting within Chrome to prevent this download, nor is there a clear consent flow to inform users that a multi-gigabyte AI model is being stored on their local machine.
Legal and Environmental Concerns
The researcher argues that this practice may violate European privacy laws, specifically the ePrivacy Directive’s rules regarding the storage of data on user devices and the GDPR’s requirements for transparency and lawful processing. While these claims have not been tested in court, they highlight a growing tension between the rapid deployment of AI features and regulatory expectations.
Beyond the legal implications, Hanff raised concerns about the environmental and financial costs of distribution at this scale. He estimates that if the model is pushed to hundreds of millions of devices, the total emissions impact could reach tens of thousands of tons of CO2 equivalent. For users on metered or capped internet connections, a silent 4GB transfer can also carry a direct financial cost.
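The emissions figure is consistent with a back-of-envelope calculation. The sketch below uses illustrative assumptions for the device count, network energy per gigabyte transferred, and grid carbon intensity; none of these numbers come from the report:

```python
# Back-of-envelope CO2 estimate for distributing a 4GB model at scale.
# All figures below are illustrative assumptions, not measured values.
MODEL_SIZE_GB = 4
DEVICES = 300_000_000        # "hundreds of millions" of devices (assumed)
KWH_PER_GB = 0.05            # assumed network energy intensity per GB
KG_CO2_PER_KWH = 0.4         # assumed average grid carbon intensity

total_gb = MODEL_SIZE_GB * DEVICES
total_kwh = total_gb * KWH_PER_GB
total_tonnes_co2 = total_kwh * KG_CO2_PER_KWH / 1000  # kg -> metric tonnes

print(f"{total_tonnes_co2:,.0f} tonnes CO2e")  # ~24,000 tonnes under these assumptions
```

Under these assumed inputs the estimate lands in the tens of thousands of tonnes, the same order of magnitude Hanff cites; different energy-intensity figures would shift the result proportionally.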
A Pattern of System Integration
This report follows a previous analysis by Hanff regarding Anthropic’s Claude Desktop app, which he claimed installed a browser integration bridge across multiple Chromium-based browsers without user disclosure. In that instance, the integration was designed to reinstall itself if removed.
Hanff suggests that both the Anthropic and Google cases reflect a broader pattern among large technology companies, where user devices are treated as deployment targets rather than environments under the user's control. Google has not provided a detailed public response to the findings at this time. The core issue remains whether browsers should require an explicit opt-in before downloading gigabytes of data to a user's local storage.
