The increasing enshittification of every service pushed me to GrapheneOS long before Google could force this shit on me
The article doesn’t really say, but this is just for desktop Chrome right now, right? I’ve long had Chrome disabled since graphene isn’t an option unless I build it myself, but I do worry about that pesky WebView that snuck its way into everything.
Man I sure do wish Linux mobile was a thing. I really want a full on Linux phone.
How unsurprising anymore in this hellish world where corporations hate your desire for anonymity… but try to hide theirs, such as dark expense accounts, tax evasion, secret offshore banking accounts, connections with crime and hate groups, etc.
It’s funny because they’re trying to find ways to cut cloud costs by offloading to users, but when that’s not a concern, they shove everything into the cloud and then ensure no local running option is available or viable.
They want all of your data in the cloud so they own it.
They want all their crap on your device, so you pay for it.
Bingo
When something is free, you are the product.
You’re forgetting FLOSS. That wisdom only applies when the product is made for financial incentive and is free, and not always then either.
deleted by creator
deleted by creator
My Google Chrome took 10GB of space on my system disk, so I needed to uninstall it. Now I use only Firefox.
All solved with one simple step, DON’T INSTALL THIS GARBAGE!
At least all the carbon emissions will be offset by Windows being carbon aware! Right? Right…?!
“Windows Update is committed to helping reduce carbon emissions.”
Now, excuse me as I generate a picture of a man with 15 fingers in my browser, using my computer with ChatGPT built into it, while next to my phone with Gemini built into my texting app and assistant and maps app and photos app and
Alarming, but not surprising.
The setup that works for me is LibreWolf as primary browser and Firefox ESR if a site doesn’t work.
I don’t do web development or anything, but I haven’t run into anything that hasn’t worked recently. Librewolf works for almost everything, but if some stupid login page doesn’t like some privacy thing that librewolf is doing, I’ll try one more time with some more loose permissions, then it’s over to normal firefox.
i honestly feel proud that i left chrome years ago, when they removed manifest v2, it pushed me to switch to other browsers based on firefox, currently i use librewolf
Can someone ELI5 why they are doing this? I thought all the AI shit was in the cloud?
AI runs in the cloud because it needs a powerful server to run the biggest (i.e. “smartest”) models.
The cloud servers aren’t doing anything special that another powerful-enough computer couldn’t do, just a huge amount of data processing.
You can run an ai chat on a steam deck or directly on a phone, if it’s not too demanding (“smarter” models are bigger data files, so won’t fit in the memory of a small device).
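The “bigger model = bigger data file” point can be sketched with some back-of-the-envelope math. This is a rough illustration, not Google’s actual logic; the function name and the “ignore overhead” simplification are mine (real inference also needs memory for the KV cache and runtime):

```python
def fits_on_device(params_billions: float, bits_per_weight: int, free_ram_gb: float) -> bool:
    """Rough check: can a quantized model's weights fit in available memory?

    Weights only; ignores KV cache and runtime overhead, so real needs are higher.
    """
    weight_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb <= free_ram_gb

# A 3B-parameter model quantized to 4 bits needs ~1.5 GB of weights,
# so it can squeeze onto a phone with 8 GB of RAM:
print(fits_on_device(3, 4, 8))    # -> True
# A 70B model at the same quantization needs ~35 GB, so it can't:
print(fits_on_device(70, 4, 8))   # -> False
```

That’s why phones and Steam Decks get the “nano”-class models while the “smart” ones stay on servers.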
Today, for instance, I had a phone call from “Spectrum Internet support” and part-way through the call my phone blared an alarm and said “possible scam” on screen.
The phone itself interpreted the conversation as sus.
https://support.google.com/phoneapp/answer/15654065?hl=en
For Pixel 9 and later devices: Scam Detection is powered by Gemini Nano on-device
The cloud being a bunch of computational power (servers). A bunch of phones in a network also can be utilized for said computational power. Passing the savings on to you! ;)
I recall this. Pretty much the same idea, but this time it is opt-out (or is it, actually?)
They need their features to work offline too probably.
It’s also cheaper, if they can offload a portion to the user’s computer.
Cheaper for them, that is.
What I want to see is throttleable models, kind of like progressive JPEG, where the default model is “nano” and it has a watch function that analyzes if more tokens might be needed for a certain task and scales up as needed — identifying if the resources are too much for the device and offloading to the cloud (with explicit permission) only if (but always if) needed. Over time as the technology improves, larger models move to the endpoint.
And then people could have a basic set of sliders: on-device only, on-cloud only, or somewhere in between, based on the user’s preferences.
That’s basically model routing, and it has existed for a while. OpenAI’s GPT-5 and llama-swap do that, for example. If the task is simple, it uses a smaller, less intensive model, and only uses the slower, larger one if the task is more complex.
Though most tend to operate with models on the same device/service, rather than a model run elsewhere.
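The routing idea above can be sketched roughly like this. The complexity heuristic and model names are invented for illustration; real routers (llama-swap, etc.) use their own scoring, and the “explicit permission” slider is the commenter’s proposal, not a shipping feature:

```python
def estimate_complexity(prompt: str) -> int:
    # Crude proxy: longer prompts and "reasoning" keywords score higher.
    score = len(prompt.split())
    if any(kw in prompt.lower() for kw in ("prove", "analyze", "step by step")):
        score += 50
    return score

def route(prompt: str, allow_cloud: bool = True) -> str:
    """Pick a model tier; only leave the device with the user's permission."""
    if estimate_complexity(prompt) < 40:
        return "local-nano"       # cheap, runs on-device
    if allow_cloud:
        return "cloud-large"      # offloaded, with explicit permission
    return "local-nano"           # user opted out of cloud: best effort locally

print(route("What's the weather?"))                       # -> local-nano
print(route("Analyze this contract step by step. " * 5))  # -> cloud-large
```

The “sliders” idea maps onto the `allow_cloud` flag: on-device only forces the local branch regardless of the score.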
In an internet browser?
Yeah, even there. A page loading is one thing, but browser features are somewhat independent of the content. There’s also a good chance this is being used as a hook for other Google products like Drive or Docs (which are basically websites under the hood) to allow offline file management, creation, etc.
It’s a bad choice, but it wouldn’t be the first bad choice Google has made.
Well everything else is in it.
Shit, Chrome supports the use of COM ports. It’s an OS within an OS.
this makes zero sense because it’s on device; it’s no different from the damage that just owning a phone already causes. are people here special?
I swear people just want a reason to freak out. At least make sense if you’re going to post such a stupid article title.
Downloading 4 GB of data without your consent is a pretty big issue. If you’re on a bandwidth starved or data capped network, that becomes a huge problem. Wasting 4GB of space on this nonsense that users aren’t gonna use too is pretty absurd.
nice strawman, how many trees did you burn posting this comment?
I didn’t write my comment to disagree with your point or to make a bad faith argument. I merely wanted to point out another thing that is of greater concern to me and most likely many others. That is in no way mutually exclusive to your point and doesn’t invalidate it. So get that snark out of here.
I frankly don’t think the amount of “trees being burnt” from posting a comment is relevant compared to the potential of hundreds of devices performing an unnecessary download. Hell, a billionaire’s carbon footprint in a couple of seconds likely already far exceeds that. At least this conversation is going somewhere. In any case, this is an unquantifiable cost anyways.
i would burn a whole forest right now just to unread your comment
worse than a 4gb llm already
no shit
deleted by creator
you’re right!
Just because the software is running on your device doesn’t mean it can’t phone home. You might not care about your local storage but 4GB is pretty steep for some of us.
Not to mention the performance impact of actually running the thing.
Honestly shouldn’t even explain it to him. With a thought process like his he deserves what’s coming to him.
strawman, and you entirely missed the point of the article. how many trees did you burn scrolling Instagram today?
Probably a ridiculous notion, but personally I wouldn’t feel 100% sure my device isn’t joining a distributed botnet/AI datacenter for other people’s slop generation, burning my energy, my battery, my SSD (higher temps shorten their half life) without my consent.
Half life? ;)
another strawman, it’s a local LLM. just using your phone means you’re doing irresponsible damage. how many trees did you burn with this comment?
How many trees to send some text? Lol, my aging phone depletes its battery more if I don’t use it xD
I just have a high level of distrust of FAANG and of proprietary software; I still use most stuff, I’m not very extreme in practice.
Who’s to say that local LLM model won’t be used when we leave our device on standby, to ease a usage spike from other users? It’s technically possible; the things stopping many companies are the potential legal trouble (but there’s an ever-longer TOS for that), different levels of morality, company culture, etc. Many botnets were built from simple games installed by regular people connected on a residential line, and some VPNs have been caught using their clients’ lines as endpoints for other clients in different countries.
You could say it’s a ridiculous thought for now regarding this “hidden” Gemini local LLM; I just feel like it’s a possibility.
so fucking glad I wiped that piece of shit off my devices.
it’s all been downhill since.
since edge is chrome based i’d expect another 4GB copy there. fuck windows
4GB for google’s ai and 4GB for microsoft’s ai lol
trained to click yes to anything. tl;dr legalese nobody reads
The Gemini on my device just became 10x smarter. Google Assistant couldn’t answer “what was the temperature a year ago”, but Gemini has no problem with it.