• Septimaeus@infosec.pub · 2 days ago

    There was a comment yesterday that offered a simpler explanation than the headline’s conclusion.

    The papers were published by Iranian researchers, and in Farsi “scanning” (روبشی) and “vegetative” (رویشی) differ by only one character (ب vs. ی), which also happen to be adjacent on the keyboard.

    That is, there’s some evidence that this is a typo or mistranslation that has been reused among non-native speakers, rather than a hallucination. If so, an LM could still be replicating the error, but I’ve definitely seen humans do the exact same thing, especially when there’s a strong language barrier.
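    For what it’s worth, the one-character claim is easy to check programmatically. A quick Python sketch (the variable names and transliteration comments are mine, not from the papers):

```python
# Compare the two Farsi terms letter by letter.
scanning = "روبشی"    # "scanning" (rubeshi)
vegetative = "رویشی"  # "vegetative" (ruyeshi)

# Both words are the same length, so a position-by-position
# comparison is enough; no full edit-distance needed.
diffs = [(i, a, b)
         for i, (a, b) in enumerate(zip(scanning, vegetative))
         if a != b]

print(len(diffs))  # 1: only the third letter differs (ب vs ی)
```

    A single substitution is exactly the kind of slip you’d expect from adjacent keys.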

    Edit: brevity

    • bitcrafter@programming.dev · 1 day ago
      A couple of decades ago I got really confused because I found a lot of papers referring to “comer” cubes but could not find an actual definition. Eventually I figured out that these were actually “corner” cubes: somewhere a transcription error had merged the “rn” into an “m”, and the error kept getting propagated because people were just copying and pasting.

      • Septimaeus@infosec.pub · 1 day ago
        That’s an apt example from English, especially given the visual similarity of the error.

        It’s the kind of error we would expect AI to be especially resilient against, since the phrase “corner cube” probably appears many times in the training dataset.

        Likewise, scanning electron microscopes are common instruments in many schools and commercial labs, so an AI writing tool would likely infer that a correction is needed, given the close similarity.

        Transcription errors by human authors, however, have been dutifully copied into subsequent works ever since we began writing things down.

  • xep@fedia.io · 2 days ago
    The term is “Vegetative Electron Microscopy,” same as the other articles on this you may have seen.

  • 𝘋𝘪𝘳𝘬@lemmy.ml · 2 days ago
    So at least 22 papers from the study were AI-generated and not checked afterwards.

    This says more about the authors, i.e. the AI users who claim authorship, than about AI.

    • fluxion@lemmy.world · 2 days ago
      And AI is dumb AF and we’ve already basically thrown in the towel on having it run everything/everyone

      • Opinionhaver@feddit.uk · 2 days ago
        Saying “AI is dumb” is like saying “plants taste bad”

        You’re probably talking about our current Large Language Models.

        • fluxion@lemmy.world · 2 days ago
          No, I’m talking about shitty AI products being used for shit they shouldn’t be used for, like determining which US workers to fire or spreading election propaganda to elect clowns. Not quite the super-intelligent overlords I thought would take over society.

          • Septimaeus@infosec.pub · 6 hours ago
            It’s semantics, but I think the person above is just pointing out that “AI” is an old umbrella term covering a lot of technologies, previous, current, and future, and that it shouldn’t necessarily be bound forever to one era’s misapprehension and misuse of a particular subset of those technologies.

            Prior examples of AI included early work by Alan Turing. Current examples include tools that enable people with disabilities. Future examples might offer solutions to major problems we face as a society. It would be a shame if use of a term as a buzzword was all it took to kill a discipline.