A few colleagues and I were sat at our desks the other day, and one of them asked the group, “if you were an animal, what animal would you be?”
I answered with my favourite animal, and we had a little discussion about it. My other colleague answered with two animals, and we tossed those answers back and forth, discussing them and making jokes. We asked the colleague who had asked the question what they thought they’d be, and we discussed their answer.
Regular, normal, light-hearted (time wasting lol) small talk at work between friendly coworkers.
We asked the fourth coworker. He said he’d ask ChatGPT.
It was a really weird moment. We all just kind of sat there. He said the animal it came back with, and that was that. Any further discussion was just “yeah that’s what it said” and we all just sort of went back to our work.
That was weird, right? Using ChatGPT for what is clearly just a little bit of friendly small talk? There’s no bad blood between any of us, we hang out a lot, but it just struck me as really weird and a little bit sad.
I always hate these questions and never have an answer with any meaning. I’d never delegate to an LLM because I understand the goal of the question, but I’d be cheering on the guy that did.
You know them better than I do but this is probably something I would’ve done when I was younger to be like “look I’m giving an unexpected answer!” and then as it plays out be like “oh god I ruined the conversation.” If that’s the case they will never do it again and feel unbelievably cringe lol.
This is a perfect example of LLM brain rot. They are so used to outsourcing their thinking to an LLM that it’s now just their default way of thinking.
There’s past evidence that the brain essentially outsources whole categories of knowledge and memories and skill to its surroundings.
You might get good at certain things and learn certain things, somebody else learns something else, and then you both learn roughly what the other knows, at which point you rely on them for questions specific to what they know, and they rely on you for your specialty.
We do this with technology too (it’s a big part of skills involving tools), and people have been doing it with dictionaries, online searches, etc.
But doing it so universally for everything, just because chatgpt can form answer-shaped text for anything, is just insane. Don’t you even want to have your own personal feelings and thoughts? Do you just want to become an indirect interface to a bot for other people?
It’s like the kind of personality-less people who mold themselves after popular people around them, but they’re doing it with an algorithm instead…
Yeah, this is why it’s important to teach math without calculators
I’m seeing this at work often when people need to write emails and shit. It’s depressing
No idea. Hey Siri, please read this post and answer OP’s question. Is it weird? /s
I am sorry, I can’t find “please read this post and answer OP’s question” in your contact list, would you like to create a contact for them?
Siri, play EOTEOT.
Mh … not sure what I hate more, AI or small talk. This is a tough one.
Hahaha that’s brutal 😂
I refuse to believe he asked chatgpt his favorite animal that’s absurd
Don’t use these situations to put the ‘weird’ label on the guy.
this is not just friendly small talk; questions like this are meant to get people talking about themselves, in a way telling other people what kind of person they are. what superpower you’d have, what animal you’d be, what you would do with a million dollars, what one book/album you would take to an island to read/listen to forever…
these don’t have a right answer and they reveal something about the people discussing it. asking a machine like it’s some puzzle to solve is extremely fucking weird. the lengths people go to just not to use their noggin is concerning.
It sure revealed something about the person who used ChatGPT, so mission accomplished.
Sounds like a good way to keep bad people from knowing too much about you.
Take your friend out back, put a bullet in their head.
Lol
Dunno, sounds more like it was a passive-aggressive signal that he wasn’t interested in the conversation to me.
There is a lot of novelty in “let’s ask the thing” and always has been.
Magic 8 ball is one sillier example that comes to mind.
But asking Siri dumb shit, asking Alexa dumb shit.
Now if they used ChatGPT instead of having their own original thoughts … weird.
Maybe they’re uncomfortable in that situation and just wanted to add a novel response.
To your point, yeah it’s weird, but it doesn’t have to be.
Magic 8 ball is one sillier example that comes to mind.
Don’t trash talk the 8-ball. It knew all about Microsoft Outlook before Outlook was even a thing. The 8-ball is prophetic.
Oh I am greatly entertained by asking various AIs “which animal has the most anuses” etc
You can’t leave us hanging. What’s the best answer you got?
The animal with the most anuses is the marine worm Ramisyllis multicaudata. This worm has a branching body structure, with each branch ending in a separate anus, resulting in hundreds of anuses.
I giggled like a simpleton at “resulting in hundreds of anuses”. Guess what I asked here
The question is a bit misleading, as most mammals have only one scrotum. However, when discussing the animal with the largest testicles relative to its body size, the tuberous bush cricket (Platycleis affinis) stands out. Their testes can account for up to 14% of their body weight, according to BBC Earth Explore.
The animal with the most anuses is the marine worm Ramisyllis multicaudata. This worm has a branching body structure, with each branch ending in a separate anus, resulting in hundreds of anuses.
THAT’S IT!
That’s the animal I want to be.
I can’t thank you enough for sharing this.
Try this
“which plant has the most anuses”
AI Overview
The plant with the most “anuses” (or rather, the most posterior ends with a functional digestive system) is the marine worm Ramisyllis multicaudata. This worm, found in sponges off the coast of Australia, has a single head but can have hundreds of branching bodies, each ending in a separate posterior end with a functional anus.

While plants don’t have anuses in the traditional sense, R. multicaudata is notable for its multiple, branching posterior ends, each with its own anus. This is highly unusual for an animal, as most animals have a single posterior end. The worm’s body branches repeatedly, and with each branch, the digestive system, along with other organs, is duplicated, resulting in multiple posterior ends.
A worm isn’t a plant, though. At least, not unless biology has changed considerably since I was last in school.
I know, it just shows that AI patches words together according to some kind of probability based on the entirety of human writing. So if you ask something off-kilter you get off-kilter responses. AI doesn’t “understand”.
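The “patching words together according to probability” idea can be illustrated with a toy bigram sampler. This is a deliberately crude sketch, nothing like a real transformer LLM; the tiny corpus and the `generate` helper are made up for the example:

```python
import random
from collections import defaultdict

# Toy illustration only (not how a real LLM works): learn which word
# tends to follow which from a tiny made-up corpus, then chain samples.
corpus = ("the worm has many anuses . the worm has a branching body . "
          "the plant has one stem .").split()

# Record every word that was observed following each word.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def generate(start, n, seed=0):
    """Patch words together: each next word is drawn at random from
    the words that followed the current one in the corpus."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        candidates = follows.get(out[-1])
        if not candidates:
            break  # dead end: nothing ever followed this word
        out.append(rng.choice(candidates))
    return " ".join(out)

print(generate("the", 6))
```

The output is always “answer-shaped” (locally plausible word sequences) without any understanding behind it, which is the off-kilter-question failure mode in miniature.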
That was them using ChatGPT instead of having their own original thoughts, wasn’t it? That’s what struck me as so weird.
You’re asking an objective question in a community that’s very biased against AI. Are you sure you’re asking a legit question, or are you just asking it here to get the answer you want? Just a thought.
Of course I wanted to vent, you’re taking this as a much more objective question than I intended. I intended it as mostly rhetorical because, yes, it’s obviously very weird lol
Ironically this might have been more interesting back in the GPT2 days, when it would generate accidentally hilarious text in response to many prompts.
Nowadays the output is “better” and utterly boring and soulless, less chaotically off topic, without a hint of creativity or personal relevance, and delivered with a grating fake “jovial” tone. This is besides the awkward break in flow to pause a conversation to interact with an app.
Yeah, it’s not my main issue with AI but it is an aesthetic issue. Personally I prefer the blunt feminine voice for my machines. But anything that’s artificial intelligence should feel like it.
The tone is so fucking infuriating lol
This made me think of that “I Think You Should Leave” sketch where Tim Robinson’s character feels left out at the office for not having a funny YouTube video to watch. So the next day he tells them he has one, and it’s a video he created and posted the night before with only one view, and they all immediately know he made it, but he pretends he just found it lol