Google’s AI-driven Search Generative Experience have been generating results that are downright weird and evil, ie slavery’s positives.

  • EnderMB@lemmy.world
    link
    fedilink
    English
    arrow-up
    6
    arrow-down
    3
    ·
    1 year ago

    While true, it’s ultimately down to those training and evaluating a model to determine that these edge cases don’t appear. It’s not as hard when you work with compositional models that are good at one thing, but all the big tech companies are in a ridiculous rush to get their LLM’s out. Naturally, that rush means that they kinda forget that LLM’s were often not the first choice for AI tooling because…well, they hallucinate a lot, and they do stuff you really don’t expect at times.

    I’m surprised that Google are having so many issues, though. The belief in tech has been that Google had been working on these problems for many years, and they seem to be having more problems than everyone else.