• GreenKnight23@lemmy.world · +4 · 56 minutes ago

    bunch of greedy fucks.

    greed should be a registered mental illness that’s no different than OCD, schizophrenia, or PTSD.


  • Melvin_Ferd@lemmy.world · +1/−4 · 2 hours ago

    I don’t think this is as dramatic as a lot of you are saying it is. It works or it doesn’t. This is what VC should do.

    • dantheclamman@lemmy.world (OP) · +4 · 1 hour ago

      Have you ever used a chatbot for technical support? It’s infuriating. Yet the industry is barreling in that direction before the tech is ready, customers be damned. This is not what VC should do.

      • Melvin_Ferd@lemmy.world · +2/−1 · 1 hour ago

        Have you ever had to use an indifferent college student who barely speaks English for technical support?

  • Manticore@lemmy.nz · +31 · 8 hours ago

    Isn’t the MO for venture capitalists to run businesses into the ground, make them owe debt to themselves, cannibalise businesses from the inside, and then run away with a profit while the business goes bankrupt?

    It’s not surprising to see a decision that kills a business, because the entire point is to kill the golden goose.

  • arrakark@lemmy.ca · +95/−2 · 13 hours ago

    LOL. If you have to buy your customers to get them to use your product, maybe you aren’t offering a good product to begin with.

    • Jesus@lemmy.world · +16 · 11 hours ago

      There is another major reason to do it. Businesses are often in multi-year contracts with call center solutions, and a lot of call center solutions have technical integrations with a business’s internal tooling.

      Swapping out a solution requires time and effort for a lot of businesses. If you’re selling a business on an entirely new vendor, you have to have a sales team hunting for businesses that are at a contract renewal period, you have to lure them with professional services to help with implementation, etc.

    • dantheclamman@lemmy.world (OP) · +31 · 13 hours ago

      That stood out to me too. This is effectively the investor class coercing the use of AI, rather than the ground-up adoption that has driven tech in the past.

      • Jimmycakes@lemmy.world · +36 · edited · 12 hours ago

        That’s not what this is. They find profitable businesses, replace employees with AI, and pocket the spread. They aren’t selling the AI.

    • venusaur@lemmy.world · +8/−3 · 13 hours ago

      Plenty of good non-AI technologies are out there that businesses are just slow to adopt, or just don’t have the budget for.

    • Initiateofthevoid@lemmy.dbzer0.com · +8/−1 · 7 hours ago

      The idea of AI accounting is so fucking funny to me. The problem is right in the name. They account for stuff. Accountants account for where stuff came from and where stuff went.

      Machine learning algorithms are black boxes that can’t show their work. They can absolutely do things like detect fraud and waste by detecting abnormalities in the data, but they absolutely can’t do things like prove an absence of fraud and waste.
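
      As a toy illustration of the distinction (the transaction amounts here are made up): a simple anomaly detector can flag something that sticks out, but an empty flag list proves nothing.

```python
# Flag transactions far from the mean (z-score style); made-up amounts.
from statistics import mean, stdev

amounts = [120, 95, 110, 105, 98, 5000, 102]

mu, sigma = mean(amounts), stdev(amounts)
flagged = [a for a in amounts if abs(a - mu) > 2 * sigma]
print(flagged)  # the obvious outlier is caught...
# ...but if this list came back empty, that would NOT prove the books
# are clean, only that nothing stuck out statistically.
```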

      • vivendi@programming.dev · +3 · 7 hours ago

        For usage like that you’d wire an LLM into a tool use workflow with whatever accounting software you have. The LLM would make queries to the rigid, non-hallucinating accounting system.

        I still don’t think it would be anywhere close to a good idea, because you’d need a lot of safeguards, and if it fucks up your accounting you’ll have some unpleasant meetings with the local equivalent of the IRS.
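
        A minimal sketch of that wiring, with a stand-in “LLM” and a made-up ledger (none of these names are a real API): the model only emits a structured tool request, and the deterministic accounting code computes the actual number.

```python
# Toy tool-use loop: the "LLM" requests data, the ledger does the math.
# LEDGER, fake_llm, and tool_get_balance are all hypothetical stand-ins.

LEDGER = {
    "2024-Q1": {"revenue": 120_000, "expenses": 90_000},
    "2024-Q2": {"revenue": 135_000, "expenses": 99_000},
}

def tool_get_balance(period: str) -> int:
    """Deterministic tool call: the accounting system computes the figure."""
    entry = LEDGER[period]
    return entry["revenue"] - entry["expenses"]

def fake_llm(question: str) -> dict:
    """Stand-in for the model: it only picks a tool and its arguments."""
    return {"tool": "get_balance", "period": "2024-Q2"}

request = fake_llm("What was net income last quarter?")
answer = tool_get_balance(request["period"])
print(answer)  # 36000, computed by the ledger code, not the model
```

        The safeguards would live around this loop: validating the request, logging it, and never letting the model write to the ledger directly.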

    • Korhaka@sopuli.xyz · +5 · 8 hours ago

      How easy will it be to fool the AI into getting the company in legal trouble? Oh well.

    • vivendi@programming.dev · +1 · 7 hours ago

      This is because autoregressive LLMs work on high-level “tokens”. There are LLM experiments that can access byte-level information and correctly answer such questions.

      Also, they don’t want to support you, omegalul. Do you really think call centers are hired to give a fuck about you? This is intentional.

      • Repple (she/her)@lemmy.world · +1 · 3 hours ago

        I don’t think that’s the full explanation, though, because there are examples of models that will correctly spell out the word first (i.e., they know the component letter tokens) and still miscount the letters after doing so.

        • vivendi@programming.dev · +1 · 2 hours ago

          No, this literally is the explanation. The model understands the concept of “strawberry”: it can output it (and that itself is very complicated) in English as “strawberry”, in Persian as توت فرنگی, and so on.

          But the model does not understand how many Rs exist in “strawberry”, or how many ت exist in توت فرنگی.

          • Repple (she/her)@lemmy.world · +1 · edited · 2 hours ago

            I’m talking about models printing out the component letters first, not just the full word: as in “S - T - R - A - W - B - E - R - R - Y”, and then still miscounting. You’re absolutely right that it reads in words at a time, encoded as vectors, but if it holds a relationship from that encoding to the component spelling (which it seems it must, given that it outputs the letters individually), then something else is wrong. I’m not saying all models fail this way, and I’m sure many fail in exactly the way you describe, but I have seen this failure mode (which is what I was trying to describe), and in that case an alternate explanation would be necessary.

            • vivendi@programming.dev · +1 · edited · 1 hour ago

              The model ISN’T outputting the letters individually; byte-level models (as I mentioned) do, not transformers.

              The model output is more like: Strawberry → <S-T-R><A-W-B> → <S-T-R-A-W-B><E-R-R> → <S-T-R-A-W-B-E-R-R-Y>

              Tokens can be a letter, part of a word, any single lexeme, a whole word, or even multiple words (“let be”).

              Okay, I did a shit job demonstrating the time axis. The point is that the model doesn’t know the underlying letters of the previous tokens, and this process only moves forward in time.
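
              A toy tokenizer (the two-token vocabulary is invented, not a real BPE) makes the point concrete: the model sees opaque IDs, and the letter count is not recoverable from the IDs alone.

```python
# Invented vocabulary; real tokenizers learn theirs from data.
VOCAB = {"straw": 101, "berry": 102}

def toy_tokenize(word: str) -> list:
    """Greedily split a word into known vocabulary pieces."""
    ids, rest = [], word
    while rest:
        for piece, tid in VOCAB.items():
            if rest.startswith(piece):
                ids.append(tid)
                rest = rest[len(piece):]
                break
        else:
            raise ValueError("no token for: " + rest)
    return ids

ids = toy_tokenize("strawberry")
print(ids)                      # [101, 102] is all the model "sees"
print("strawberry".count("r"))  # 3, but that 3 appears nowhere in the IDs
```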

  • otacon239@lemmy.world · +22/−1 · 12 hours ago

    I am so glad I got out of IT before AI hit. I don’t know how I would have handled customer calls asking why our chat is telling them their shit works when it doesn’t, or to cover their computer in cooking oil or whatever.

    And only after they banged their head against the AI for two hours and are already pissed will they reach someone. No thanks.

    Thank god I can troubleshoot on my own.

    • tauisgod@lemmy.world · +29 · 12 hours ago

      When VC and PE call a company or industry “mature”, it means they don’t see increasing revenue, only something to be sucked dry and sold for parts. To them, consistent revenue is worthless; it must be skyrocketing or nothing. If you want to see this in action right now, look at what Broadcom is doing to VMware. They also saw VMware as a “mature company”.

  • kbal@fedia.io · +4 · 9 hours ago

    Makes sense to me. AI bullshit generators may be worse than useless for most of the things people try to do with them, but they might just be the perfect tool for rationalizing the systematic looting of formerly productive companies by private equity.

  • Optional@lemmy.world · +15 · 12 hours ago

    “What if we threw a ton of money after the absolute shit ton of money we threw away?”