• 0 Posts
  • 4 Comments
Joined 1 year ago
Cake day: October 24th, 2023


  • Yeah you make a really good point there! I was perhaps thinking too simplistically and scaling from my personal experience with playing around on my home machine.

    Although realistically, it seems the situation is pretty bad because freaky-giant-mega-computers are both training models AND answering countless silly queries per second. So at scale it sucks all around.

    Minus the terrible fad-device-cycle manufacturing aspect, if they’re really sticking to their guns on pushing this LLM madness, do you think this wave of onboard “AI chips” will make any impact on lessening natural resource usage at scale?

    (Also off-topic, but I wonder how much of a sweet, juicy exploit target these “AI modules” will turn out to be.)


  • To be fair: “For each answer it gives”? Nah. You can even run a model on your home computer. It might not be so bad if we just had an established model and asked it questions.

    The “forest destroying” is really in training those models.

    Of course, at this point I guess it’s just semantics, because as long as it gets used, those companies are gonna be non-stop training those stupid models until they’ve created a barren wasteland and there’s nothing left…

    So yeah, overall pretty destructive and it sucks…