I have an RTX 2080 Ti.
I still play at 1080p 60 Hz, and the 2080 Ti is plenty for that. But I’m looking to train some ML models, and its 11 GB of VRAM is limiting there.
So I plan to buy a new card. I also don’t want an ML-only GPU, since I don’t want to maintain two GPUs.
Since I’m upgrading, I need to think about future-proofing. At some point I’ll move to at least 2K, though I’m still not sold on 4K offering any perceivable benefit.
Given all this, I wanted to check with folks who have either card (3090 or 4090): should I consider the 4090?
How much ML training will you do, and what kind of models? Are you just a hobbyist, or are you a student or researcher in ML?
If the former, you may be better served by renting a machine for training instead. Vast.ai is one such service, and you can rent machines with a 4090 for something like 50 cents an hour. For hobbyist stuff that usually ends up cheaper than buying a whole card, especially if it turns out you need multiple GPUs to train your model effectively.
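For a rough sense of the break-even point (both prices below are assumptions, not quotes; check current listings):

```python
# Back-of-the-envelope break-even: renting a 4090 vs. buying one outright.
card_price = 1600.00    # assumed retail price of a 4090, USD
rent_per_hour = 0.50    # assumed Vast.ai rate for a 4090 machine, USD/hr

breakeven_hours = card_price / rent_per_hour
print(f"Break-even after {breakeven_hours:.0f} GPU-hours of rental")
# -> Break-even after 3200 GPU-hours
```

Unless you’re training for thousands of hours, renting wins on cost, and you can scale to multi-GPU machines without buying more hardware.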
If you’re a researcher, though, a 3090 might be a good buy. IMO the gains from a 4090 won’t be dramatic unless you’re doing specific mixed-precision work (the newer-gen tensor cores support more data types, e.g. FP8). Be aware that the large models that actually need 24 GB of VRAM usually require many GPUs to train in a reasonable amount of time, so a large-VRAM card is more useful for quick development and debugging than for training large models, and for that the 4090 isn’t all that much better.
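To make the mixed-precision point concrete, here’s a minimal sketch of a bf16 training step in PyTorch (the model and sizes are placeholders, not anyone’s actual setup). bf16 autocast runs fine on both a 3090 (Ampere) and a 4090 (Ada); FP8 is Ada-only and needs extra tooling like NVIDIA’s Transformer Engine rather than vanilla PyTorch:

```python
import torch
import torch.nn as nn

model = nn.Linear(1024, 1024).cuda()   # stand-in for a real model
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
x = torch.randn(32, 1024, device="cuda")
target = torch.randn(32, 1024, device="cuda")

for step in range(10):
    opt.zero_grad(set_to_none=True)
    # autocast runs the matmuls in bf16; numerically sensitive ops stay fp32.
    # Unlike fp16, bf16 needs no GradScaler.
    with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
        loss = nn.functional.mse_loss(model(x), target)
    loss.backward()                     # params are fp32, so grads are fp32
    opt.step()
```

If this is the extent of your mixed-precision usage, the 3090 already covers it; the 4090’s extra tensor-core data types only pay off if you go out of your way to use them.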