We all know by now that ChatGPT is full of incorrect data, but I trusted it would not go wrong when I asked for a list of sci-fi book recommendations (mostly short story anthologies in Spanish), including titles, publisher, print year and, of course, ISBN.

Some of the books do exist, but the majority are nowhere to be found. I picked the one that caught my interest the most and contacted the publisher directly after I could not find it on their website or anywhere else.

This is what they replied (Google Translate):


ChatGPT got it wrong.

We don’t have any books with that title.

In the ISBN it has given you, the last digit is incorrect. The correct one (9788477028383) corresponds to “The Holy Fountain” by Henry James.

Nor have we published any science fiction anthologies in the last 25 years.
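Their point about the last digit checks out: the final digit of an ISBN-13 is a check digit computed from the other twelve, so a wrong last digit is mechanically detectable. Here is a minimal sketch of that standard checksum (the function names and structure are mine; the ISBN is the one the publisher cited):

```python
def isbn13_check_digit(first_twelve: str) -> int:
    """Compute the ISBN-13 check digit from the first 12 digits."""
    # Weights alternate 1, 3, 1, 3, ... across the first 12 digits.
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first_twelve))
    return (10 - total % 10) % 10

def is_valid_isbn13(isbn: str) -> bool:
    """True if a 13-digit string has a correct check digit."""
    digits = isbn.replace("-", "")
    if len(digits) != 13 or not digits.isdigit():
        return False
    return isbn13_check_digit(digits[:12]) == int(digits[12])

# The ISBN the publisher cited validates; change the last digit and it fails.
print(is_valid_isbn13("9788477028383"))  # True
print(is_valid_isbn13("9788477028384"))  # False
```

So an invented ISBN that fails this check is a dead giveaway, even before you try to look the book up.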


A quick search on the “old site” shows that others have experienced the same with ChatGPT and ISBN searches… For some reason I thought it would not go wrong in this case, but it did.

  • Not_mikey@lemmy.world · 1 year ago

    What would be your definition of intelligence if ChatGPT is not intelligent?

    My definition would be something along the lines of the ability to use knowledge, ideas and concepts to solve a particular problem. For example, if you ask “what should I do if I see a black bear approaching?”, both you and ChatGPT would answer by using the knowledge that black bears can be scared off to arrive at the solution “make yourself look big and yell.”

    The only difference is the type of knowledge available. People can have experiential knowledge, e.g. you saw a guy scare off a bear one time by yelling and waving his arms. ChatGPT doesn’t have that because it doesn’t have experiences. It does have contextual knowledge like us: you read or heard from someone that you can scare off a bear. This type of knowledge, though, is inherently probabilistic; the person who told you could always be giving false information. That doesn’t make you unintelligent for using it, and it doesn’t mean you don’t understand accuracy if it turns out to be false; it’s just that your brain made a guess that it was true, and that guess was wrong.