• stealth_cookies@lemmy.ca
    link
    fedilink
    English
    arrow-up
    56
    arrow-down
    4
    ·
    4 months ago

    Why is this news? No company that deals with intellectual property, proprietary information, or other sensitive data should be using public LLM tools, due to the risk of leaking that data. That is why these companies are offering more sandboxed versions of these tools to protect against the issue.

  • surewhynotlem@lemmy.world

    With copilot you can lock your data into your own tenant. You don’t leak data that way (except to Microsoft I guess)

    • demonsword@lemmy.world

      With copilot you can lock your data into your own tenant.

      OpenAI is almost a Microsoft branch, so what you said doesn’t make much sense to me

      • brick@lemm.ee

        The selling point for M365 Copilot is that it is a turnkey AI platform that does not use data input by its enterprise customers to train generally available AI models. This prevents their internal data from being output to randos using ChatGPT. OpenAI definitely does use ChatGPT conversations to further train ChatGPT so there is a major risk of data leakage.

        Same situation with all other public LLMs. Microsoft’s investments in OpenAI aren’t really relevant in this situation.

      • Martin@feddit.nu

        A lot of companies have existing contracts with Microsoft regarding data management, but those contracts are probably not relevant to an “almost branch of Microsoft”.

  • Ghostalmedia@lemmy.world

    Ehh, kinda.

    The DWP is trialing an internal tool based on Microsoft Copilot, a digital assistant, to help automate tasks, Civil Service World reported.

    Looks like they’re wrapping it in some stuff to govern its usage. They could’ve done this with vanilla ChatGPT, but they’re probably partnering with Microsoft because Microsoft is massive enough to actually build part of the solution.

    MS staffed up to do a fuck load of enterprise copilot stuff. I personally know a number of companies working with them on partnered copilot projects.

    • Jagermo@feddit.de

      It’s probably about data control. If it’s part of your IT and you can control access and data retention, fine. But if everyone feeds the ChatGPT website with documents and sensitive info, that’s an issue.
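
      The data-control concern above can be sketched in code. Below is a minimal, illustrative scrub pass that masks obvious identifiers before any text leaves the organisation’s boundary; the regex patterns (email addresses, UK National Insurance numbers) are simplified assumptions, not a real governance control, and a production setup would need far more than regex.

      ```python
      import re

      # Naive patterns for illustration only; real PII detection is much harder.
      PATTERNS = {
          "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
          "NI_NUMBER": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),
      }

      def scrub(text: str) -> str:
          """Replace each pattern match with a [LABEL] placeholder."""
          for label, pattern in PATTERNS.items():
              text = pattern.sub(f"[{label}]", text)
          return text

      print(scrub("Contact jane.doe@dwp.gov.uk re claim QQ123456C."))
      # -> Contact [EMAIL] re claim [NI_NUMBER].
      ```

      The point isn’t the specific patterns; it’s that a sandboxed internal deployment can enforce a step like this centrally, whereas a public chat website can’t.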

      • Ghostalmedia@lemmy.world

        Absolutely. There is a ton of governance stuff required at an organizational and national level. Want to use AI to automate some tasks involving PII? You’re legally required to keep that stuff under tight control.

  • Obinice@lemmy.world

    The hell is social security? We don’t have one of those.

    Is this another one of those American terms they’re trying to replace our terminology with? It smells of that.

    Also Britain is an island, it doesn’t include Northern Ireland, which is part of the UK, and if we’re talking about a nation wide thing I suspect it’ll include more than just Britain, but in fact the entire UK.

    Sorry, I’m just tired of Americans trying to force their terminology on us after so many years of putting up with it :-(

  • kirbowo808@kbin.social

    Not really surprised about the irony considering the UK gov was willing to ban encryption and apps like Signal, just so they could further spy on us.

  • R0cket_M00se@lemmy.world

    Copilot is an official Microsoft offering like Outlook or PowerBI, of course they’d rather you use a product officially sanctioned by the company you already pay to handle your productivity software.

  • paddirn@lemmy.world

    My company has had a similar thing with AI and banned most of the bigger AI sites (Bard, ChatGPT, HuggingFace, and even Copilot originally). We’re in Microsoft’s ecosystem though, so Copilot in Edge is just there and AI has been added in to Adobe products, so I think they’ve just sort of given up trying to contain it.

  • AutoTL;DR@lemmings.world [bot]

    This is the best summary I could come up with:


    “Users must not attempt to access public AI applications (such as ChatGPT) when undertaking DWP business, or on DWP-approved devices,” the guidance now reads.

    The department is exploring how AI could help staff complete writing tasks and assist work coaches with clients in job centers.

    The DWP is trialing an internal tool based on Microsoft Copilot, a digital assistant, to help automate tasks, Civil Service World reported.

    It’s currently unclear why the DWP moved to scrap OpenAI’s ChatGPT and experiment with internal tools, but it’s likely the department is the latest body to decide privacy concerns around the LLM were too great to ignore.

    “We actively work to reduce personal data in training our systems like ChatGPT, which also rejects requests for private or sensitive information about people.”

    The framework outlined 10 key principles staff should uphold when using generative AI, covering ethics, laws, and understanding of the technology’s limitations.


    The original article contains 570 words, the summary contains 150 words. Saved 74%. I’m a bot and I’m open source!

    • Fisch@lemmy.ml

      There was that whole thing where MS trained it on open source code from GitHub, which means that they didn’t just use ChatGPT but made their own model

      • The Octonaut@mander.xyz

        Microsoft Copilot and GitHub Copilot are not the same thing, despite Microsoft owning both. It’s just that a lot of people like the “copilot” image. I assume they’ll eventually change one of them to the similarly positive “Fuckbuddy”, because “Crutch” sounds too negative

          • R0cket_M00se@lemmy.world

            I bet they regret blowing their load on Cortana already with their useless Windows 10 feature.

            Now their actual AI isn’t named after the AI from the video game they specifically used for brand recognition. What a fuck-up.

        • /home/pineapplelover@lemm.ee

          Dude, I thought both Copilots were the same thing, wtf. So is regular Copilot just ChatGPT, or does it also have GitHub data? Do they both have ChatGPT and GitHub data?