It’s literally the same thing; the only real difference is how much usage each GPU is getting at a time. But everyone seems to assume all these data centers are running at full load at all times for some reason?
Most likely you have no idea how any of this works and are just joining the “AI bad because energy usage” crowd, having done zero research and with no hands-on knowledge of how these tools actually work.
One user vs. a public service is apples to oranges, and it’s honestly hilarious that you’re so willing to compare them.
It’s explicitly and literally not the same thing.