https://www.amazon.se/-/en/NVIDIA-Tesla-V100-16GB-Express/dp/B076P84525 price in my country: 81,000 SEK, or about 7,758.17 USD
My current setup:
NVIDIA GeForce RTX 4050 Laptop GPU
CUDA cores: 2560
Memory data rate: 16.00 Gbps
My laptop GPU works fine for most ML and DL tasks. I am currently finetuning a GPT-2 model on some data that I scraped, and it worked surprisingly well on my current setup, so it's not like I'm complaining.
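For context, here is a minimal sketch of the kind of finetuning run I mean, assuming the Hugging Face transformers stack and a plain-text dump of the scraped data (the file name "scraped.txt", batch size, and other hyperparameters are just placeholders, not my exact settings). fp16 and a small batch size are what keep it inside a laptop GPU's VRAM:

```python
# Minimal GPT-2 finetuning sketch (illustrative settings, not my exact run).
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

class TextChunks(Dataset):
    """Tokenize one big text file and slice it into fixed-length blocks."""
    def __init__(self, path, tokenizer, block_size=512):
        text = open(path, encoding="utf-8").read()
        ids = tokenizer(text, return_tensors="pt").input_ids[0]
        self.blocks = [ids[i:i + block_size]
                       for i in range(0, len(ids) - block_size, block_size)]
    def __len__(self):
        return len(self.blocks)
    def __getitem__(self, i):
        return self.blocks[i]

device = "cuda"
tok = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)

loader = DataLoader(TextChunks("scraped.txt", tok), batch_size=2, shuffle=True)
opt = torch.optim.AdamW(model.parameters(), lr=5e-5)
scaler = torch.cuda.amp.GradScaler()  # fp16 keeps memory use low on a small GPU

model.train()
for epoch in range(3):
    for batch in loader:
        batch = batch.to(device)
        opt.zero_grad()
        with torch.cuda.amp.autocast():
            # labels=input_ids gives the standard shifted causal-LM loss
            loss = model(batch, labels=batch).loss
        scaler.scale(loss).backward()
        scaler.step(opt)
        scaler.update()
    print(f"epoch {epoch}: loss {loss.item():.3f}")

model.save_pretrained("gpt2-finetuned")
tok.save_pretrained("gpt2-finetuned")
```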
I do, however, own a desktop PC with an old GTX 980, and I was thinking of replacing that with the V100.
So my question to this community is: for those of you who have bought your own super-duper GPU, was it worth it? And what were your experiences and realizations once you started tinkering with it?
Note: Please refrain from giving me snarky comments about using cloud GPUs. I am not interested in that (and I am in fact already using one for another ML task that doesn't involve finetuning). I am interested in hearing hardware hobbyists' opinions on this matter.
No. The V100 is not Ampere architecture, and at that price it is simply not worth it. A 3090 is cheaper and has 24 GB.
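A quick sketch of how to see the gap from PyTorch, assuming a CUDA build of torch is installed: the V100 is Volta (compute capability 7.0), while the 3090 is Ampere (8.6), the generation that added native bf16/TF32 tensor-core support.

```python
import torch

# Print compute capability and VRAM of the current GPU.
# A V100 reports capability 7.0 with 16 GB; an RTX 3090 reports 8.6 with 24 GB.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: compute capability {props.major}.{props.minor}, "
          f"{props.total_memory / 1024**3:.0f} GB VRAM")
```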