• partial_accumen
    32 months ago

    It’s only a problem if you want to run AI. If you don’t want AI, locally or cloud-based, then there’s no need to spend the money on the high-end 32GB model (for AI purposes) or pay for a cloud subscription. No one is required to get the 32GB model if they don’t want it.

    • emmanuel_car
      32 months ago

      Sure, but it has made RAM more expensive for everyone, not just those who want to run LLMs.