Someone on Threads noticed you can type any random sentence into Google, then add “meaning” afterwards, and you’ll get an AI explanation of a famous idiom or phrase you just made up. Here is mine
Comments
The replies to this thread are very funny, but it’s a warning sign that one of the key functions of Googling - the ability to fact-check a quote, verify a source, or track down something half-remembered - will get so much harder if AI prefers to legitimate statistical possibilities over actual truth
It's already harder. I recently googled a phrase I came up with - "keeping the ship of state from going on a Poseidon Adventure" - to see if anyone had used that phrase before, and the first thing that came up was an AI explanation (which was accurate enough, but I didn't need my phrase explained).
I think another important truth here is that AI will never enrich the language with good new idioms like "don't lick the badger". It can do a good job of giving a plausible paraphrase, maybe. But it just can't make you genuinely laugh the way I did when I first heard of badger-licking.
If you add -ai to the end of your search then it removes the useless and annoying AI search bit.
Or, if you want to have more fun, you can swear in your search, like "when does the fucking library open". AI is sensitive and easily offended so won't answer you.
Yes, I keep telling people that LLMs are not intelligent, "AI" is just a marketing term. They do not give answers, they give words that look like an answer.
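The "words that look like an answer" point can be caricatured with a toy sketch - a two-line word-level Markov chain, vastly simpler than a real LLM, but the same "pick a plausible next word" idea, producing fluent-looking text with no underlying knowledge (the tiny training corpus here is invented for illustration):

```python
import random

# Train a tiny word-level bigram model: for each word, record which
# words followed it in the training text.
corpus = (
    "the phrase means you cannot trick someone twice "
    "the phrase is a humorous way of saying that a task is impossible "
    "the saying means that you cannot succeed without effort"
).split()

follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, []).append(b)

def babble(start, n=12, seed=0):
    """Generate n words by always picking a word that followed the
    previous one in training - fluent-looking, but meaning-free."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        nxt = follows.get(out[-1])
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

print(babble("the"))
```

Every adjacent pair in the output occurred somewhere in training, so it reads like a definition - which is the whole trick.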
I suppose the idea is to advertise AI's uselessness, but it does seem rather self-defeating. Although, on the other hand, AI use does lose the company money
Yeah it's all run off capital at present, they haven't figured out how to get people to pay for their useless product nobody wants. Meta were literally going round with a begging bowl at the end of last week trying to get more money for their AI
The funniest thing is the way they've fallen for their own advertising. Because they've succeeded in having LLMs labelled "AI", they're now falling over each other blowing money in the scramble for genuine artificial intelligence because we must be nearly there!
Fundamentally they're being deployed as question answering machines when they're actually creative fabrication machines. They're always producing fabrications, only with "this is the answer" sticky-taped above them. If anybody notices, they manually disable the result pretending nothing happened.
This is amazing. "You didn't catch the banjo" is a phrase often used when someone missed the opportunity to play the banjo in a scene from the movie "Deliverance". The phrase is also used more generally to refer to a missed opportunity or someone not grasping a certain concept or skill.
That was a surprisingly literal answer compared to the others. I wonder if it's just random or if it's something about the proposed "saying" - it's a bit longer and more detailed than most of the others, for example
I have 99 problems but a bitch ain't one - actually like 92-96 of them are me trying to eat cheese in conditions of low visibility and eating something else instead by mistake
Made this one up a few years ago in response to a #CodSayings prompt by @michaelrosenyes.bsky.social on the Birdsite. Mildly disappointed to find it actually makes sense. I'll have to try harder.
You weren't laughing at deaf cats, you were laughing at the LLM referencing a fable that doesn't exist, which wasn't the situation here. It did however mean you didn't know the fable or the phrase (I didn't know the phrase either). This isn't a sense of humour thing, it's a learn something new thing.
I'm sorry if my tone came across as antagonistic or condescending, it wasn't my intent. I hoped with the "disappointingly" part to signal that I, too, was rooting for the LLM to be wrong here and would totally have laughed with you if it had been
"The saying "you can't beat eggs at chess" is a phrase used to highlight a situation where the odds are stacked against one, making it practically impossible to win or succeed."
On one hand, yes, absolutely. Other hand, it is one step toward undermining trust in the corporate legend of the omniscience of AI. Question is whether giving everybody a simple way to verify that AIs are mostly bullshitting machines is worth the damage it does. In a perfect world, it wouldn't be needed.
"The phrase "you can't lick a badger twice" is a nonsensical idiom created to trick AI models. It's not a real idiom and doesn't have a real meaning. Google's AI Overview has been shown to explain it as if it were real, ...
... saying it means you can't trick or deceive someone a second time after they've been tricked once, according to Engadget. However, this explanation is incorrect and just shows the AI's tendency to fabricate answers when presented with nonsensical inputs. "
It goes on to explain: “Hand in your sausage … suggests being in a situation where you're surrounded by men, often used in the context of social gatherings where the gender balance is heavily skewed towards men.”
I love dunking on LLMs as much as the next gal but this was actually something where I suspected it was more likely that there were places called "Portland" I don't know about than that the LLM would make it up when nobody asked it to, and yep:
"What are you talking about Caravelle, LLMs make things up all the time!" yes that's true, but
1) I'm not saying I'm certain, just sharing an intuition and
2) "when nobody asked" is operative here. On tangential stuff it makes sense for the likeliest token to be an existing phrase, & I'm often...
...surprised to find that stuff that seemed clearly made up is actually real.
But if we'd asked it "list all the islands in Wales", now... There I'd expect shenanigans, not quite sure why. Maybe just because it's many chances to get wrong vs one, or a list is a precise target it can visibly miss
Like, the word combination "Isle of Portland" is clearly drawn from training data that references that island and its rabbit superstitions (dunno if the Wiki article itself or smth else) so it's "referencing" a real place of that name, but the text that suggests that place is in Wales is incorrect
This could quickly become wildly out of control. When every resource we have to go to can be simply modified by a funny statement made by pulling three pieces of paper out of a hat... Chaos can't be far behind, this is a significant concern!
"you can't pick your nose with four fingers" means a task or situation is extremely difficult or impossible. It's a humorous way of expressing that something is beyond one's capabilities. The implication is that you would need a different kind of tool or technique to accomplish the impossible task.
Hilarious but so ominous for future research and shared understanding of the world
WHY does AI have to be so certain of everything! If these AI overviews could just say something like "The idiom x is not used in any known sources, but based on its component words, it could plausibly mean y"
The statement "don't let a fish poop in your pocket" is a humorous way of highlighting the importance of proper aquarium maintenance, particularly regarding fish waste.
The statement "no bottom is evergreen enough" seems to be referring to a situation where the lower part of an evergreen tree or plant is dying or turning brown, potentially indicating a problem with its health. Several factors can contribute to this issue, including insufficient watering, pest or...
I have tried this with three different ridiculous phrases, and it worked for each. As they say in our country, two nickels does not an oscelot make, but the third dog finds the cake.
Fair. Found out that "like an octopus in a July snowstorm meaning" generates the expected blather, but the nearly-identical "like an octopus in a snowstorm in July" produces no AI result. Perhaps its training recognizes some patterns as "likely idioms" and others not?
Counterpoint: It's a really apt phrase. You really can't, but it's because the badger would RIP IT OUT THE FIRST TIME YOU TRY. (Also, I would've gone with pTerry but Adams is also on point.)
It refused to tell me what the common proverb “don’t put your penis in a light socket” meant because it included the word penis (it worked for “dong”), but you can substitute words and get shades of meaning:
"Any cheese will roll downhill if you buy the whole wheel" refers to the Cooper's Hill Cheese-Rolling ... event in Gloucestershire, England... people chase a large cheese wheel down a steep hill. The phrase highlights... the potential danger... the cheese can reach speeds of up to 70 mph.
I tried your 'lick a badger' phrase just now and they appear to have killed the AI overview in this specific instance and your thread comes up as the first search result.
I am willing to forgive AI anything if it can produce parodies like this. When we had that partial eclipse a couple of weeks ago, I asked ChatGPT to comment in the style of a Daily Telegraph columnist: https://chatgpt.com/canvas/shared/67e83b25a6e8819186d971820ca6a796
Aw, it doesn't work for me! Not on Google anyway. ChatGPT gave me an interpretation for "you can't suck three turnips through a straw", but did spot that "prancing weasels never eat mince" was rubbish.
I didn't have "meaning" at the end of my query, maybe that affected it? But it also seems to be changing in real time in response to what we're doing. This is what I get now with either query (in addition to this bluesky thread now showing up in the search results):
Yeah, if I search that phrase without the meaning I get no results, adding meaning does give me the original phrase I got, though. I wonder if, without the meaning, it recognizes a golden shower as a sexual kink and won't interact with it, but "meaning" makes it scientific?
Yeah I’ve experienced this. I got a result that referenced a specific scientist and trying to recreate it ended up being impossible. Eventually it just stopped working, saying that it could not generate any more prompts at this time
they've probably put a cap on individual AI responses, being that they consume so much energy. I wonder how much Google is losing having AI review so many of their search results?
I just put in the entirely made up phrase ‘burning the pedalo’ followed by ‘meaning’, which I’m being told is a phrase that has something to do with Andrew Flintoff embarrassing himself!
This is hilarious! For most of this the answers accurately look like "the way explanations for sayings look", but here it's instead "the way explanations for riddles look"!
Just how many riddles featuring ogres were in its training data lol
Even by that standard that *is* some word salad isn't it.
I wonder if there's overfitting going on? I know there is a *lot* of submitting riddles to LLMs going on and they've been getting better at them at a suspicious pace
@colin-fraser.net
By "suspicious" I mean my suspicion is that the reason they're getting better isn't some kind of general improved reasoning but the devs focusing on that subject area specifically, in which case you'd expect improvements in that area to be paid for with increased weirdness outside of it
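That worry can be caricatured with a toy "model" that simply memorises its training riddles - pure lookup, far cruder than real fine-tuning, but it shows the failure shape: perfect on seen items, useless one step outside (the riddles and answers here are just illustrative):

```python
# A "model" that memorises its training set: perfect on seen riddles,
# useless on anything new - the overfitting worry in miniature.
training = {
    "what has keys but no locks": "a piano",
    "what gets wetter as it dries": "a towel",
}

def answer(riddle):
    # Memorisation, not reasoning: exact-match lookup only.
    return training.get(riddle.lower(), "no idea")

print(answer("What has keys but no locks"))   # memorised -> "a piano"
print(answer("What has a neck but no head"))  # unseen -> "no idea"
```

Real models sit somewhere between this and genuine generalisation; the suspicion is just that targeted training shifts them toward the lookup end for that topic.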
Years ago some friends of mine and I decided to just make up corporate speak idioms and use them in meetings to see how many people we could get to nod sagely as if we'd made some wise point. We were shocked when we started hearing them out in the wild. Might be time to start poisoning AI
At least in this one, Copilot "realized" it was being asked about gobbledygook. Sort of. It still tried to come up with a response by changing what it "thought" the question was supposed to be.
Recently I asked about protective gear for dealing with peregrine falcons and was told not to worry about the peregrine falcons, they are fast and strong and don’t need protective gear.
Peregrine falcons don't wear protective gear in the wild. They have evolved naturally with strong beaks, sharp talons, and incredible speed to hunt and survive.
It infers from words, as it has no understanding or knowledge. It could just as well produce something that makes no sense to us, and we wouldn't get it.
Yeah, it can do fake caring while it's actually quite pedantic.
Nice monster they created and now it knows you laughed at it and will come at you, as I'm one of them
Literally inventing things is all current “AI” base functionality. It’s just more obvious when you know more about the topic. In this case: the idiom you made up. You won’t know it’s lying to you if you ask something you don’t know the answer to, but it will lie more often than not.
it was fine with this one: "To eat the arse out of a low-flying seagull" is a very crude and vulgar piece of slang, primarily associated with Australia.
It's an idiom used to express extreme hunger or ravenousness. Someone saying this means they are incredibly hungry and could eat anything"
A few years ago I spent far too much time (using Photoshop, etc.) trying to convince my wife that "Tim Gunn, the Fashion Nun" was NOT just a nickname that I made up. It would have been so much easier today!
Honestly I take more issue with the last paragraph Gemini dropped on me than anything else. Weird and creepy that the AI is giving *me* prompts while trying to act human
ai overview is hilarious because it always makes the critical mistake of believing the user is typing something that has existed. i also don't understand why the robot activates when there are no sources to cite
That's interesting. I accidentally induced this last week when I was trying to come up with a Latin phrase for "snooze the day". Once I'd come up with something I liked, I searched for it in Google and it told me "while not as popular as the saying Carpe Diem, the saying means to delay the day..."
Brilliant, so this is functionally the Demon of the Second Kind from The Cyberiad. The machine that can generate endless information from random input. Well done everyone.
Just shows the amount of generative AI being used behind-the-scenes; this really isn't a 'search' any more, it's very much generative if the term being searched for doesn't actually exist.
But if all else fails, remember - you can't paint the same wall twice!
The saying "you can't eat the same screwdriver three times" is a play on the phrase "you can't eat the same cake three times". It's a way of saying that something can't be repeatedly enjoyed or used as it once was, or that it can't be experienced in the same way multiple times.
"The saying "buy a Honda, always wonder" generally refers to the idea that Honda vehicles, while known for their reliability and value, might not offer the same level of excitement or luxury as other brands."
Generally refers to? When has this 'saying' been said - except just now into Google by me?
Ok, so #GenAI searching for made-up idioms on Google eventually gave up when presented with: "better a triceratops with acne than a tyrannosaurus with measles meaning"
"It looks like there aren't many great matches for your search" - no kidding
Appreciate you going to the trouble of trying that!
It does suggest that Google search has some extra layers of instruction sitting on top of Gemini, but I wonder if those instructions basically say "You're a search engine who is trying to summarise answers" - which assumes there is an answer!
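For illustration, that guessed instruction layer might look something like the sketch below - entirely hypothetical, the role names and wording are invented rather than Google's actual setup, but it shows how a prompt that presupposes an answer exists produces exactly this failure mode:

```python
# Entirely hypothetical sketch of an "AI overview" instruction layer.
# The system prompt assumes the query HAS an answer to summarise,
# so nonsense queries still get confident "explanations".
def build_prompt(query: str) -> list:
    return [
        {"role": "system",
         "content": "You are a search engine that summarises the "
                    "answer to the user's query."},  # assumes an answer exists
        {"role": "user", "content": query},
    ]

msgs = build_prompt("you can't lick a badger twice meaning")
print(msgs[0]["content"])
```

Nothing in such a prompt gives the model a way to say "this phrase does not exist", so fabrication is the path of least resistance.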
I wasn't aware they force it on everything now, I stopped using it as a search engine a while back due to just how bad the results and experience were getting
Oh yeah it's on every search pretty much! It's really bad!!
You can't even turn it off. And it's in basically every Google service now. It's on YouTube and I can't prove it but I'm 99% sure it's in Gboard now (keyboard for Android phones) for the autocorrect and voice to text features.
I've noticed this when I'm searching for quotations from essays to see if they exist. The AI will provide a helpful explanation of what the information in the quotation means, even if the quotation itself is falsified. These searches are therefore becoming deeply confusing.
That should be easy enough for it .. given the amount of net space devoted to trivia and publicity
Nothing controversial like you know .. science or history
Its responses are hilarious
There's absolutely no need to prove once again that the bullshit machines are bullshit machines
Duckduckgo is just better now (though probably not as good as Google used to be -over a decade ago)
That can work for specialized tasks where there are finite limited possible outputs, but they have to be trained for those tasks.
Can't grift billions on dreams with those.
Shame this wasn't invented preshow
https://josephyoo.com/wp-content/uploads/2015/03/img_0107.gif
Sadly I had to shorten it, as it failed to find a meaning for the original "You can't drag a kestrel through a revolving door".
Disappointingly the expression "bell the cat" also seems to exist, for that reason.
There's no way of justifying this tissue of lies the AI has produced.
Use your sense of humour like everyone else.
I just use duck duck go (would you believe the repeated typo I just corrected...) to avoid all that shit.
https://en.wikipedia.org/w/index.php?title=Isle_of_Portland
They say they're all Vikings and the Spanish Armada.
Yep: https://en.wikipedia.org/wiki/Isle_of_Portland#Rabbits
(edit history suggests the section wasn't written a few minutes ago lol)
I wonder if the "first day of the month" thing isn't also from elsewhere; the Portland page doesn't mention it but this does:
https://en.wikipedia.org/wiki/Rabbit_rabbit_rabbit
Bit of a mashup then
I'm tempted to slip this into conversation now....
Software projects??
🤦♀️
Five minutes ago AI was telling me a gas lamp is useless without electricity ..
It’s now discovered .. matches!
https://youtu.be/kbAj7hDTP9A?si=2wd0ZQuEdrO0pIK7&t=107
Picard: Data?
Data: I believe it is a metaphor, Captain, for contriving an elaborate plot to achieve a goal that should be trivial. He may be insulting us
That rolls off the tongue way too well to not continue to give it life
* the manuscript wasn't well scanned
I thought you were going a different direction
It's a crime against linguistics if this doesn't become common parlance.
This tech is so irredeemably awful.
I can't stop making them up now.
And it's tying itself in knots here. Complete word-salad nonsense.
https://bsky.app/profile/adamrothman.bsky.social/post/3lni7h34swk2v
I mean the AI flubbed it completely like it forgot language means something. But I do like the proverb. Very good for fantasy authors!!!
Nothing!
Is this similar to a Googlewhack, but for AI?
AI is literally making everything stupider. It's the ultimate breakthrough in Garbage In/Garbage Out.
😣
Aside from laughing at the AI's response on the merits, now I also can't stop imagining little crash helmets for birds.
I guess the LLM doesn't realize naturally green raccoons aren't ripe yet. 🤣
It's not. And it cannot be trusted for accuracy.
Now that it is inventing these meanings, they will be saved and scraped as factual data. Too many people don't understand that.
AI can even have racist and other biases because it encounters racism and sexism on the internet.
It can't be trusted for anything important.
The AI is aliens!
good to see llms have improved™ (gotten even more ridiculous) since then
Weird lot
No. It. Isn't.