Deep Learning on a Budget (i.e. $550)

The GPU power required for deep learning on large datasets does not come cheap, especially on the cloud. At the time of writing, AWS charges $0.90 per hour for a single Nvidia K80. Unless you're a prodigy at picking hyperparameters, training deep nets on large datasets this way adds up very fast. In fact, one month of continuous training costs more than an entry-level DIY deep learning machine.
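To make the comparison concrete, here's a quick back-of-the-envelope calculation using the figures above (the AWS rate and the build cost are from this post; cloud prices change, so treat them as assumptions):

```python
# Break-even point: renting a K80 on AWS at $0.90/hr vs. a $550 DIY build.
aws_rate_per_hour = 0.90   # single K80 on AWS, per the figure quoted above
machine_cost = 550         # total cost of the build described in this post

break_even_hours = machine_cost / aws_rate_per_hour
break_even_days = break_even_hours / 24
print(f"Break-even after {break_even_hours:.0f} GPU-hours "
      f"(about {break_even_days:.0f} days of continuous training)")
```

Roughly 611 GPU-hours, or under a month of continuous training, and the machine has paid for itself.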

If you’re like me, and you would like to build a deep learning machine on a budget, it’s not that difficult. I built a new machine for myself over Christmas for less than $600. There are really only seven parts you need to worry about:

  • processor
  • motherboard
  • power supply
  • memory
  • hard drive
  • graphics card
  • chassis

For deep learning applications you will be doing the vast majority of your computation on GPUs, so forking over a lot of money for a powerful processor is not necessary. The processor we're going to look at is fast; the only downside is that it has just two cores. You may want a more powerful processor for other applications, but this post is about building a bare-bones deep learning machine. A Core i3-7100 will work just fine, and at $120 it's a steal.

The motherboard is an essential component of any CPU build. For deep learning, there really is only one feature of the motherboard that we're concerned with – a PCIe 3.0 x16 slot for our GPU. This criterion is critical because it minimizes latency between the GPU and the CPU, but it's standard on virtually all motherboards with the LGA 1151 socket that we need for our i3-7100. In this case I'll just recommend the motherboard I used, the ASUS H110M-A/M.2. At $59 it might be the cheapest LGA 1151 board out there (it also takes an NVMe SSD if you want to upgrade).

Moving on, the power supply is absolutely critical. At a minimum, it should support the maximum power requirements of all of our components at only 90% of its rated output. For our machine, we won't need over 350W, so we'll be safe with a 400W power supply. We'll go with the Rosewill RD400-2-SB. It doesn't carry an 80 Plus bronze, silver, gold, platinum, or titanium certification, but we're not building a mission-critical machine. We're on a budget, and at $37 this is exactly what we need.
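The 90% rule above is easy to check with one line of arithmetic (the 350W figure is this post's estimate of the build's peak draw, not a measured number):

```python
# PSU sizing: the build's estimated peak draw should be at most 90%
# of the supply's rated output.
estimated_peak_watts = 350          # this post's estimate for the whole build
min_psu_rating = estimated_peak_watts / 0.9
print(f"Minimum PSU rating: {min_psu_rating:.0f} W")
```

That works out to about 389W, so a 400W supply clears the bar with a little room to spare.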

Now on to memory. We don't want anything fancy here; in fact, 8GB is plenty for our entry-level deep learning machine. As a rule of thumb, you should have at least as much CPU RAM as GPU RAM, and having twice as much is ideal. Our GPU will have 3GB of RAM, so 8GB gives us more than double, with 2GB left over that can always be dedicated to the OS. I suggest getting a single DIMM because there really isn't a difference in price. We'll go with the 8GB G.Skill Ripjaws V Series 2400MHz DDR4 SDRAM. This is slightly overclocked, which doesn't really matter for us, but at $53 we're still doing pretty well.
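The rule of thumb above can be written down as a sanity check, using the capacities from this build:

```python
# Rule of thumb: CPU RAM >= GPU RAM (minimum), CPU RAM >= 2x GPU RAM (ideal).
gpu_ram_gb = 3   # the 3GB GTX 1060 chosen below
cpu_ram_gb = 8   # the single 8GB DIMM

assert cpu_ram_gb >= gpu_ram_gb        # minimum requirement
assert cpu_ram_gb >= 2 * gpu_ram_gb    # ideal: twice the GPU RAM

headroom = cpu_ram_gb - 2 * gpu_ram_gb
print(f"Headroom beyond 2x GPU RAM: {headroom} GB")  # left over for the OS
```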

Now for the hard drive. This choice comes down to three criteria: size, speed, and noise. Size is pretty obvious, and since you're building a machine to work with large datasets we might need to splurge a little here. The last time I checked, the ImageNet dataset was over 1TB, so I'm going to suggest a 2TB drive. That's enough to get you started, and you can always add more storage later. Speed is another concern, but 7200RPM drives are now common and affordable, so we'll definitely opt for one. Finally comes noise, and this is where we'll save some money. A lot of cheap drives function very well but make a lot of noise; you pay extra to avoid it, and that's a luxury we can't afford (I just turn on my stereo to drown it out). Here we're going to go with an HGST/Hitachi UltraStar 7200RPM 2TB SATA drive. At $52 this is another steal.

Now on to the GPU. Obviously, this is the most important component of our machine, and that's why it will also be the most expensive. That said, the price per flop of Nvidia GPUs has dropped drastically in the past year (and as far as we're concerned, Nvidia is the only game in town). This drop came because Nvidia released its much-anticipated line of 10-series cards. These cards use its new Pascal architecture and are far superior to the previous generation. Don't be deceived by discounted 9-series cards; they are obsolete for our purposes. We're going to focus on the GTX 1060, which I think of as Nvidia's entry-level VR card. There are many choices from various manufacturers, but under the hood they're all the same. The one I wound up with is the Gigabyte GeForce GTX 1060. It's $200, so we're still doing well. The only downside is that it has just 3GB of RAM. For $40 more we could go to 6GB, but I opted against it. With most datasets you don't need that much RAM, but if you hit one that does, you'll get a really nasty out-of-memory error on the 3GB card. Not to worry, though: all you need to do is halve the mini-batch size and you'll be up and running again (in extreme cases you might have to halve it again). This won't necessarily be ideal, so if you do plan to work with high-dimensional data, like color images, you may want to spend the extra $40. For now we'll just go with the 3GB card.
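The halve-and-retry trick above is simple enough to sketch. This is a framework-agnostic illustration, not real GPU code: `run_epoch` is a hypothetical stand-in for your framework's training step, and the 64-sample capacity just simulates a card running out of memory.

```python
# Sketch of the recovery strategy: when the card throws an out-of-memory
# error, halve the mini-batch size and try again.
def run_epoch(batch_size, gpu_capacity=64):
    """Hypothetical training step; pretends the GPU fits at most 64 samples."""
    if batch_size > gpu_capacity:
        raise MemoryError(f"batch of {batch_size} does not fit on the GPU")
    return batch_size  # epoch "succeeded" at this batch size

def train_with_backoff(batch_size, min_batch=1):
    while batch_size >= min_batch:
        try:
            return run_epoch(batch_size)
        except MemoryError:
            batch_size //= 2   # the "halve the mini-batch size" trick
    raise RuntimeError("model does not fit even at the minimum batch size")

print(train_with_backoff(256))  # backs off 256 -> 128 -> 64, then succeeds
```

In a real framework you would catch its specific out-of-memory exception rather than `MemoryError`, but the backoff loop is the same idea.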

Last but not least is the chassis or case. There is certainly no need to spend a lot here unless you’d like the machine to look flashy or store a bunch of drives. If you’re on a budget I hope that’s the least of your worries. I went with a $30 Rosewill Micro ATX Tower. The FBM-05 should work just fine. It has an extra internal bay if you want to add another hard drive later, and it has three external bays.

So, that’s it. Well, you need a mouse, keyboard, and monitor, but hopefully you’ve got those already. If not, you can look on eBay or ask the IT department at your school to see if they can help you out. You’ll also want an operating system, and I’d recommend Ubuntu: free, of course, and a little easier to use with GPUs. I prefer CentOS, but it isn’t always straightforward to get the X server running with the Nvidia driver.

All of the prices quoted here include shipping and come from a single online retailer, so it’s pretty simple and straightforward: just go there, order the parts, and put them together. I could have made a post showing you how to do that, but there are plenty of videos on YouTube that can help.

OK, so, that’s it. A $550 deep learning machine.
