super interesting to me that "AI = the worst chatbots = theft = ecocide = right-coded" has become an article of faith for a decent swath of users of Bluesky
like, I totally get why, but it's also a case study in how something can become politicized
https://bsky.app/profile/ajh.bsky.social/post/3le7ylw6g4k2h
Reposted from A Jay Holmgren:
I can’t even fully grasp the anti-AI fury on this site. Were math profs like this when statistical software first came out?
Comments
Nuance doesn’t put butts in seats these days.
I think you’ve been doing this well for a while! Certainly not a (majority-)hit among any of the faceless masses I know of
1. YouTube / Spotify / Google / etc. harming creators.
2. Crypto being almost definitionally a project to waste energy for right-coded (or Ron Paul libertarian-coded) purposes.
There’s no agenda; it’s just a very natural fit.
freely available for people to run on their own devices
Open models aside, until very recently the best available LLMs could be accessed for free
Google has racked up trillions of dollars in value by making people believe that they are champions of "open"
There's an argument that consolidation benefits the billionaire class
If anything, the existence of the open web allows them to say "see! We're not the only option"
If the models are huge vessels for wealth, having open SOTA models on laptops all over would undermine that narrative.
There's a parallel world in which the underlying ideas were aggressively patented and used by one organization to build tools exclusive to the highest bidders
The hilarious thing is they thought no one except a bunch of academics would use it; there's that pic of their presentation
Corporations open-sourcing LLMs makes it nigh impossible to close Pandora's box on content theft in training.
Also, for those late to the party, it reduces their competitors' edge and gains them free, unpaid labour/publicity to help them catch up.
Better to turn up one's nose than to roll up one's sleeves.
AI created for the benefit of humanity would be a pretty easy sell. Cure cancer, solve wealth inequality, invent fusion power.
That's not what we're getting right now because the billionaire oligarchs running it are driven by greed
Most of what we are calling "AI" these days was created by individual researchers with a strong bias towards working in the open. That's why we are where we are today, with high-quality, openly licensed releases and healthy competition across multiple labs.
Unfortunately, that sort of outcome is the stuff of science fiction. LLMs are just text extrusion machines; they have no capacity for thought, research, or invention.
This is an absurd criticism.
I can say with some certainty that an AI use important enough to win a Nobel prize will be met with derision and scorn on this website.
'Cuz it happened.
The problem is that LLMs are getting all the attention when they are marginally useful at best, actively harmful at worst, and widely recognized as a dead end to AGI.
- timing: AI has blown up at a point when a lot of people are struggling. Correlation isn’t causality, but it can feel like it
- opacity: a lot of redundancies at the moment are being blamed on AI efficiencies even when AI isn’t actually involved
- product quality: most consumer AI products so far… aren’t great? So users are naturally cynical about why it’s being forced on them
Same politics too, also opposed by the degrowth people.
AI is mostly unregulated and driven by profits
I was challenging that: it's not quite as obvious a case to make as it would be if free, openly licensed models did not exist at all
AI has a branding problem right now and it is being driven by the desire for investment returns as quickly as possible
Just thought of an example where the coding on it is changing or has flipped: nuclear power
*n=2
https://bsky.app/profile/drrambio.bsky.social/post/3ldzm5kt4bs23
LLMs are just Transformers trained on text.
Also, knowledge transfers between modalities, so even in some non-text applications, you still want to include text training or train off a foundation that learned text.
I've tried engaging with them, and they refuse to believe that AI can be useful. They would prefer to stick their heads in the sand rather than learn and adapt.
A sibling comment to yours epitomizes this with the oft-heard claim that “there is no useful purpose to [gen]AI” (paraphrased).
Enshittification is real. Are there potential truly productive uses for gen AI? Maybe. But currently it seems to be operating at a net negative, and that is how people experience it.
This too is a lack of “critical thinking”.
It’s rather bold to assume people are aware enough of the size of the full body of evidence to properly conclude they have a preponderance of it.
art pushers while threatening to put multiple industries out of business. A lot of trust that tech can solve for the future has been burned.
"new projects in Phoenix, Arizona, and Mt. Pleasant, Wisconsin, will pilot zero-water evaporated designs in 2026."
https://www.microsoft.com/en-us/microsoft-cloud/blog/2024/12/09/sustainable-by-design-next-generation-datacenters-consume-zero-water-for-cooling/
"this design will avoid the need for more than 125 million liters of water per year per datacenter."
I get the hostility if your mental model is mainly that it's ChatGPT writing people's essays for them from a single prompt, or Midjourney stealing commissions from the artists it was trained on
It’s not as broad a field as “all of mathematics” but it’s maybe bigger than, say, “calculus” or “Newtonian motion.”
No, that's not it. The magic is deeper/emergent or whatever.
Creative uses may be a fraction of AI, but niche players like Midjourney/Sudowrite have led many writers and artists to fear an existential threat. Such articulate foes: again unique.
Bsky users didn't decide who is advancing the AI/crypto agenda. That was done outside of this platform.
My above comment was an indifferent statement of observation.
I know I'm abandoning a term, but I don't think I can fight the marketing forces on this.
I basically assume that if something is "AI" it's an LLM/bullshit machine and not what I, as a layperson, view as the good ML models.
And yes, most of us over a certain age are well aware it used to mean many other things. But not any more.
Ecologically… I don’t fully understand that side of it but the people explaining that it’s bad are people who I generally find trustworthy.
The resources required (water, electricity, human effort) to create some dumb AI entertainment…
I don’t understand how it compares to other dumb entertainments people enjoy on the internet/social media.
There’s an online geography game I play every day. How much water, electricity, & human effort goes into that?
Gesture recognition? Right-coded.
Voice control? You got it, right-coded.
The conflation is deliberate on the part of the chatbot people. They want the successes of ML to improve the reputation of their garbage. See “AI will solve climate change” for example.
I still haven’t seen a clear definition of AI…
https://slate.com/technology/2019/02/openai-gpt2-text-generating-algorithm-ai-dangerous.html
“The blog post fretted that it could be used to generate false news articles, impersonate people online, and generally flood the internet with spam and vitriol.”
One weird consequence is that increasingly AI = generative AI in everyday speech
I didn't expect to see positive posts about luddites in 2024, but here we are.
Luddites were actually early precursors to the Labour Union movement.
Yes, we'd be much better off if we just produced everything by hand. Let's get rid of farm machinery and have 90% of the population back on the farm.
He's found a kindred spirit! Just like every fucking programmer who thought humanities classes were a waste of tuition and time...
https://www.bloodinthemachine.com/p/one-year-of-blood-in-the-machine
Productivity improvements are what make our standard of living possible. The transitions suck and we need to do everything possible to ameliorate the pain, but there is a huge amount of opposition based on fear.