I used to look forward to emails from Oxford about new textbooks coming out and emerging fields of research.

Now they send me this fucking atrocity of an email about an event for AI “innovation” simulation in Healthcare, generating studies.

It took me a mere 148 ms to unsubscribe; the only reason the date isn’t censored is that I hope somebody attends and makes a ruckus. FUCK YOU OUP.

  • FiniteBanjo@feddit.onlineOP · 1 day ago

    It literally says “How academic research can be simulated”, “why some models are more scalable than others”, and “Which publications are practically useful, and which ones aren’t”.

    This is not ambiguous: it’s talking about trash chatbot GenAI and LLMs used to write and summarize papers.

    When I said in the title that these assbots have killed people because of their use in healthcare, I was not exaggerating. Physicians have been misinformed, leading to punctured arteries and then disability or death.

      • FiniteBanjo@feddit.onlineOP · 20 hours ago

        That’s fair, but I still think the wording here does not imply normal machine learning use, and that Oxford Publishing would have been careful to clarify if it did.

    • lime!@feddit.nu · 1 day ago

      …how did you get llms from that? simulation is part of academic research, all machine learning systems use models, and “useful” could just as well mean “for the purposes of the simulation”.

      was there more info about the event?

      • FiniteBanjo@feddit.onlineOP · 20 hours ago

        We live in a very brief but ongoing time period where AI means LLMs to the vast majority of people, and if Oxford weren’t shilling like so many other higher education institutions, they would have been careful about their wording.

        • lime!@feddit.nu · 19 hours ago

          that’s an assumption. it’s a marketing term and they want people to come to their seminar; of course they would use it.

          • FiniteBanjo@feddit.onlineOP · 19 hours ago

            They’re definitely not getting people like me or any reputable physicians or medical researchers to go to their seminar this way. This is the equivalent of naming a movie “Melania”, imo.

            • lime!@feddit.nu · 18 hours ago

              we’re a very small niche here. talking to people in my vicinity gives an entirely different perspective on ml tools than does talking to the professionals i work with. most randos are neutral to vaguely positive on the subject, though not enough to spend money on it.

              • FiniteBanjo@feddit.onlineOP · 18 hours ago

                Lmao, fuck off mate. Why are you trying to normalize AI and Slopping in the Fuck_AI community?

                • lime!@feddit.nu · 9 hours ago

                  i’m not. i’m so annoyed by it being everywhere, and i don’t think there is a way to use current generative machine learning models ethically. but i studied this in uni twelve years ago, so i know how it was used before the bubble, and there is nothing in what you posted that says this event is about generative systems.

                  putting everything in the same category is not helpful because it discredits genuinely useful medical tools that have been proven to work, while simultaneously helping the openai fuckery seem more legit.