This is an excellent question that opens up plenty of passionate debate. When I am renting hardware to run my own models, I prefer renting CPU-only hardware without a GPU. My motivations: AVX-capable CPUs are cheap, and I don't risk exceeding VRAM.
Before you consider me crazy, I recommend having a look at:
To reply to your question: small non-convolutional models might be slower on GPU. I would use a GPU only on bigger models with convolutions, where I would expect an improvement of 2x to 8x. My own models are trained in CPU-only environments because I have found a better price-to-performance ratio on CPU.
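For anyone who wants to check this on their own hardware, here is a minimal timing sketch (assuming PyTorch is installed; the layer sizes and batch size are arbitrary placeholders, not taken from this issue):

```python
import time
import torch

def bench(model, x, n_iter=100):
    """Average the wall time of n_iter forward passes."""
    with torch.no_grad():
        model(x)  # warm-up (triggers lazy initialization)
        if x.is_cuda:
            torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(n_iter):
            model(x)
        if x.is_cuda:
            torch.cuda.synchronize()  # wait for queued GPU kernels
    return (time.perf_counter() - start) / n_iter

# A small fully connected model: kernels this size are often too small
# to amortize GPU kernel-launch and host/device transfer overhead.
mlp = torch.nn.Sequential(
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
)
x = torch.randn(32, 64)

print(f"CPU: {bench(mlp, x) * 1e6:.1f} us/iter")
if torch.cuda.is_available():
    print(f"GPU: {bench(mlp.cuda(), x.cuda()) * 1e6:.1f} us/iter")
```

On many machines the CPU line wins for a model this small, while swapping in a convolutional network with larger inputs tips the result the other way.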
Depending on where I am renting hardware, I can get 20 CPU cores for the cost of 1 GPU. In any case, one model can be cost-effective on GPU and the next one may not be. It's a moving target.
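The comparison worth re-running for each model is cost per unit of work rather than raw speed. A back-of-envelope sketch (every number below is a made-up placeholder; substitute measured epoch times and your provider's actual hourly prices):

```python
# Hypothetical figures only; plug in real benchmarks and price lists.
cpu = {"usd_per_hour": 0.40, "seconds_per_epoch": 300}
gpu = {"usd_per_hour": 0.40, "seconds_per_epoch": 100}  # e.g. a 3x speedup

def usd_per_epoch(instance):
    """Cost of one training epoch on a given instance."""
    return instance["usd_per_hour"] * instance["seconds_per_epoch"] / 3600

print(f"CPU: ${usd_per_epoch(cpu):.4f} per epoch")
print(f"GPU: ${usd_per_epoch(gpu):.4f} per epoch")
```

Whichever side wins depends on the model architecture and the current rental prices, which is exactly why it's a moving target.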