• partial_accumen
    55 hours ago

    yea, i’m surprised, 32GB is goddamn ridiculous for anything, let alone for a shitty hp branded autocorrect

    32GB is actually considered the bare minimum for most of the common locally run LLM models. Most folks don’t run an LLM locally. They use a cloud service, so they don’t need a huge pile of RAM on their own machine. However, more privacy-focused or heavy users with cost concerns might choose to run an LLM locally so they’re not paying per token. Running locally vs. using the cloud is comparable to buying a car outright vs. renting one when you need it. If you only need a car once a year, renting is clearly the better choice. If you’re driving to work every day, then buying the car yourself is clearly the better deal overall.
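    The rent-vs-buy tradeoff above boils down to a simple break-even calculation. A minimal sketch, where every figure (hardware premium, token volume, per-token price) is a made-up assumption for illustration, not real pricing:

    ```python
    # Rough break-even sketch: one-time local hardware cost vs. paying a
    # cloud service per token. All numbers below are illustrative only.

    def break_even_months(hardware_cost, tokens_per_month, price_per_million):
        """Months of cloud usage after which the hardware cost pays for itself."""
        monthly_cloud_cost = tokens_per_month / 1_000_000 * price_per_million
        return hardware_cost / monthly_cloud_cost

    # Assumed: $400 premium for the bigger-RAM model, 20M tokens/month of
    # heavy use, $2 per 1M tokens from a cloud provider.
    months = break_even_months(hardware_cost=400,
                               tokens_per_month=20_000_000,
                               price_per_million=2.0)
    print(f"Break-even after ~{months:.0f} months")  # ~10 months at these rates
    ```

    The point is just that the answer flips with usage: at a tenth of that token volume the break-even stretches past eight years, and renting (the cloud) wins.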

    You are perfectly fine not liking AI, but you’re also out-of-touch if you think 32GB is too big for anything. Lots of other use cases need 32GB or more and have nothing to do with AI.

    I agree with your frustration with subscription laptops. I hope people don’t use it.

    • @U7826391786239@lemmy.zip
      74 hours ago

      well hp is aware that laptops are quickly becoming out of reach money-wise for a larger and larger chunk of consumers, they just had to figure out some way to exploit that.

      $420 a year for a laptop doesn’t sound like robbery at first, until you consider it’s just money out the window, and they’re 100% harvesting every 1 and every 0 input and output from that laptop that they still own/control. i haven’t even looked at the fine print, which i’m willing to bet makes the whole thing exponentially worse

    • XLE
      34 hours ago

      It all reads like a giant racket. AI requires 32GB of RAM on your laptop, 32GB of RAM is expensive, so you have to lease, and the lease is expensive because AI requires RAM to run in the cloud too. It’s a solution in search of a problem, and it keeps creating new problems along the way.

      • partial_accumen
        33 hours ago

        It’s only a problem if you want to run AI. If you don’t want AI, locally or cloud-based, then there’s no need to spend the money on the high-end 32GB model (for AI purposes) or to pay for a cloud subscription. No one is required to get the 32GB model if they don’t want it.