Laptop/Desktop for Training Deep Learning Models

Any recommendations for a laptop/desktop with a good GPU for training and deploying deep learning models, please? Normally I would use GCP or AWS, but I am between jobs right now and AWS might be expensive. I was thinking of buying a laptop with the NVIDIA RTX 3080 Ti. Cyber Monday deals are selling them in the ballpark of $2000 (the price is steep, but I might be willing to invest if it leads to reasonable model training times).

I wanted your opinion. Do you think I should go with a 3080 Ti or some other graphics card? Should I spend the money on AWS instead? Should I opt for a desktop or a laptop?

I already have a laptop with an Intel i7 and an NVIDIA GeForce RTX 3050 Ti. I could also just try running things on that, with the understanding that things will be slow.

Note this laptop would be for replicating state-of-the-art open-source code like YOLOX, PointPillars, etc., and also for experimenting with the latest papers + open-source code in NeRFs, GANs, etc.

2 Likes

I’m excited to see what people have to say about this.

@kausthubk, didn’t you just buy/build a station?

I think @Matt might have some good insight here as well.

Any input here from @trust_level_1?

2 Likes

If you’re looking to get a nice laptop with a GPU, that sounds like a good idea; there are real benefits to training your models outside of the cloud. It’s important to know whether you want the machine for inference serving or for model training. If it’s mainly for model training, I’d highly encourage you to start with free resources like Google Colab, or even purchase their premium subscription. That said, a dedicated GPU for model training can work out less expensive and better equipped over time. You’d get a better deal with a desktop of course – even better if you just buy the bare board and build around it.

Either way, I think you’ll make a good choice.

2 Likes

Oh I have so many thoughts on this @harpreet.sahota (nah same machine setup as always - just upgraded my mics hahaha).

But here are my thoughts:

To Cloud or Not To Cloud - that is the question

from math import inf

def i_should_cloud(you) -> bool:
    # Cloud makes sense if you'll reliably shut instances down (discipline),
    # or if money simply isn't a concern.
    return you.discipline == "high" or you.bank_balance == inf

(P.S.: Sorry for writing code without unit tests)

Downsides of on-prem

Speed is less of an issue than the size of your VRAM tbh - SOTA models can be too damn huge to fit into memory - take that into consideration. Either way, a local card is simply not going to live up to the kinds of GPUs you can get on the cloud (Tesla A100s :star_struck: etc.).
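
As a rough rule of thumb (a back-of-envelope sketch only; real usage also includes activations, which depend on batch size and architecture), you can estimate whether a model's weights plus optimizer state will fit on a given card:

def fits_in_vram(n_params: float, vram_gb: float,
                 bytes_per_param: int = 4, optimizer_multiplier: int = 3) -> bool:
    # fp32 weights + Adam moments are roughly 3-4x the raw parameter memory;
    # activations come on top of this and scale with batch size.
    needed_gb = n_params * bytes_per_param * optimizer_multiplier / 1e9
    return needed_gb <= vram_gb

print(fits_in_vram(n_params=100e6, vram_gb=8))   # ~1.2 GB -> True, fine on a laptop card
print(fits_in_vram(n_params=7e9, vram_gb=16))    # ~84 GB -> False, nowhere near fitting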

Why I hate laptops

  • battery issues
  • size & heat management issues for long training runs
  • hardware upgrade limitations

Why I hate desktops

  • I live in Australia… I really don’t need a space heater.
  • Still limited in terms of upgrade path

Downsides of cloud

Cloud is genuinely expensive - be wary of compute instances that you just leave running - that’s a time bomb your wallet will NOT like, especially if you’re linked up to a powerful GPU.
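
To make that concrete (the hourly rate below is a made-up placeholder - check your provider's actual pricing):

HOURLY_RATE_USD = 3.00      # hypothetical on-demand rate for a big GPU instance
hours_forgotten = 24 * 7    # instance left running for a week

print(f"Forgetting to stop it for a week: ${HOURLY_RATE_USD * hours_forgotten:,.2f}")
# -> Forgetting to stop it for a week: $504.00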

What’re you trying to do?

  • What kind of models are you trying to train?
  • Can they fit in the VRAM of a standalone graphics card?
  • What CAN and CAN’T you do with a commercial gaming card? Make sure it suits your purpose.

Operating System Considerations

macOS

For on-prem, just don’t bother. Usually I would recommend this for developer experience… but since moving to Apple Silicon, basically none of the key packages work reliably (incl. numpy, pandas, etc.) and you’d have to build some dependencies from source. It’s a hassle - wait for that to settle before going for a Mac again.

If you’re going pure cloud or dev-in-docker then this is still a pretty good option.

Linux

It’s a pain to set up, but once it is set up it’s the easiest to get up and running with training networks etc. This is great if all you do with this machine is ML.

I’d recommend this as a dual boot option for on-prem.

Windows

If it’s Win11 you’ll have Windows Subsystem for Linux (WSL) with GPU passthrough. I’ve found this actually works extremely well post-June 2022. It’s my personal setup of choice because I like having access to other things like CAD software and good pro-grade software like Office 365, etc., that just make my life easy.

It also means far less finicky driver and machine setup.
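
If you want to sanity-check that GPU passthrough is actually working inside WSL, something like the snippet below will tell you (assuming a CUDA-enabled PyTorch install; running `nvidia-smi` in the WSL shell works too):

import torch

# True if the GPU is visible to CUDA from inside WSL
print(torch.cuda.is_available())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # e.g. your laptop's RTX card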

Hardware Configurations to Consider

Towers/Desktops

  • Good: A solid choice if your intent is a static, pure home workstation
  • Good: Easy maintenance
  • Good: Value for money (no portability premium)
  • Bad: Upgrades are limited by tower size. You may be able to upgrade RAM, storage, and CPU, but changing to a higher-class GPU is sometimes impossible without a bigger case

Gaming “Laptops”

Let’s be real… these aren’t really laptops, but I’ve used a Dell G7 before and quite enjoyed using it.

  • Bad: Portability premium for a machine that is best treated as a desktop anyway
  • Bad: Little to no upgradability

Laptops + eGPUs

This is a powerful combo if you want full power at home but only need the internal CPU to demo models live in front of people, or want to just use the laptop as a regular laptop.

  • Good: Extremes of portability (unplug the GPU and take just the laptop with you)
  • Good: Configurability (you can use the same GPU with other laptops as well)
  • Bad: Expensive
  • Bad: Limits you to laptops that have a Thunderbolt 3 port

If you can afford it, I would 100% recommend an off-board GPU.

2 Likes

Epic responses from @kausthubk and @Matt! Thanks so much for taking the time to write out such thoughtful responses and helping out our fellow community member @krswamin!

Hope you found these helpful, Krithika!

1 Like

I started my journey into deep learning with a GeForce 960M, which was enough to get into bronze on Kaggle back in the day.

Doing DL on a laptop is doable, but I don’t recommend having it as THE ONLY machine you own.

Pros:

  • Can train deep learning models from the forest
  • Ideal for quick debugging / prototyping

Cons:

  • Thermals. Laptops ARE NOT DESIGNED FOR SUSTAINED GPU/CPU LOAD. Sustained = HOURS of training. This will overheat the battery and reduce its lifetime significantly. LiPo does not love 50+°C, which is very common inside a tiny laptop case when the GPU is in use. The problem is prolonged exposure of the battery to heat, not occasional load spikes or gaming. (A small temperature-monitoring sketch follows after this list.)
  • GPU memory. At most you can get 16 GB of VRAM in a top-notch laptop. Usually it is 8 GB, and what can you train with that amount these days?
  • Speed. Mobile versions of RTX chips are just slower than their desktop counterparts.
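
If you do train on a laptop, it is worth keeping an eye on temperatures during long runs. A minimal sketch (assumes an NVIDIA driver so `nvidia-smi` is on the PATH; stop it with Ctrl+C):

import subprocess
import time

# Poll the GPU temperature once a minute during a long training run.
while True:
    temp = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"]
    ).decode().strip()
    print(f"GPU temperature: {temp} °C")
    time.sleep(60)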

Usually you can get a decent PC for the same budget, and it will be much faster.
I use an HP Envy with a 2060 (6 GB) as my laptop on the go, while doing the heavy lifting on desktop GPUs.

Why I hate desktops

  • Still limited in terms of upgrade path

Well, laptops are even more limited on this matter :slight_smile:

If you’re concerned about heat, think about moving a headless PC case out into the garage and connecting to it via RDP/SSH from your laptop.
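
For example, a long training run can be kicked off on the headless box over SSH without ever sitting next to it (the hostname and paths below are made-up placeholders):

import subprocess

# Launch a training job on the garage machine and detach from it.
subprocess.run([
    "ssh", "me@garage-box",
    "cd ~/projects/yolox && nohup python train.py > train.log 2>&1 &",
], check=True)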

3 Likes

Hey @kausthubk, the other folks have already given some great suggestions. Some things to consider:

  • A laptop is OK, but not powerful or extendable. You can’t change the GPU, and limited memory prevents you from working with some of the bigger models.

  • Better to get a machine that can support the newest GPUs, with a goal of eventually running multiple GPUs. The data scientists at the top do not work with a single GPU; the skill of leveraging multiple GPUs is the separating factor for the top teams. If you can afford it, the A6000 is good. (A minimal multi-GPU sketch follows below.)
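
A minimal sketch of spreading a model across whatever GPUs are available, using PyTorch’s DataParallel (DistributedDataParallel is what you’d want for serious multi-GPU training; this just illustrates the idea):

import torch
import torch.nn as nn

model = nn.Linear(512, 10)             # stand-in for your real model
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)     # splits each batch across the GPUs
model = model.to("cuda" if torch.cuda.is_available() else "cpu")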

Building your own environment in the cloud is not a bad skill to have either. You can have a strategy of using Colab Pro for dev and then moving your notebook to your own instance to crunch it faster. Think of a really cheap dev (writing code) environment paired with a faster training environment. The faster you train and shut down your instance, the more you save.

Getting in with some of these service providers is a great way to get better deals in the long term.

3 Likes

Thank you Matt, Kausthub, EKhvedchenya, and Mark for all your replies! Such great responses! Much appreciated!

1 Like