How to turn your Gaming PC into a Data Science Machine

There are tons of young students out there looking for an introduction to Machine Learning and Data Science as a career path. However, one of the biggest limitations to conducting this kind of work is the computational power required. Training neural networks demands heavy parallel number-crunching, similar to the requirements of PC gaming.

It is fairly common for Computer Science students to already own gaming systems capable of performing machine learning tasks. All that is really required is setting up the environment to run the tools.
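Before installing any frameworks, it helps to confirm that the machine's NVIDIA driver is actually visible to software. As a minimal sketch (the `gpu_available` helper is hypothetical, not from any library), you can check for the driver's `nvidia-smi` tool from Python; the typical setup order after that is NVIDIA driver, CUDA toolkit, cuDNN, and then a framework such as TensorFlow or PyTorch:

```python
import shutil
import subprocess

def gpu_available() -> bool:
    """Return True if the NVIDIA driver's nvidia-smi tool is on PATH and runs cleanly."""
    smi = shutil.which("nvidia-smi")
    if smi is None:
        return False  # no driver utility found; likely no NVIDIA GPU set up
    try:
        return subprocess.run([smi], capture_output=True).returncode == 0
    except OSError:
        return False

print("NVIDIA driver detected:", gpu_available())
```

On a gaming PC with the graphics driver already installed for games, this check usually passes before any deep learning software is added.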

Graphics cards allow you to do some pretty unique things when it comes to image processing. In particular, graphics cards excel at parallel processing.

When we need to, say, render a game onto a computer screen, vast numbers of calculations must be performed at once and in sync with each other. This calls for a lot of cores. While the typical CPU features 2–12 cores, a GPU can feature thousands: a GTX 1070 Ti contains 2,432 CUDA cores.
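The "same operation applied to many values at once" pattern is easy to see even on a CPU. As a small illustrative sketch (NumPy here stands in for what a GPU does in hardware), a single vectorized expression applies one identical multiply-add to every element of an array; on a GPU, each element could map onto one of those thousands of CUDA cores:

```python
import numpy as np

# One value per CUDA core of a GTX 1070 Ti, just to make the analogy concrete.
pixels = np.arange(2432, dtype=np.float32)

# The same multiply-add is applied to every element: "single instruction,
# multiple data" — the workload shape GPUs are built to parallelize.
brightened = pixels * 1.5 + 10.0

print(brightened.shape)  # (2432,)
```

On a GPU, a framework would dispatch this as one kernel launch rather than 2,432 sequential operations.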

All these cores matter for machine learning. Much data science work involves image processing and the use of neural networks, and when training a network, thousands of identical computations are performed across identical neurons. This means GPUs accelerate training by an extraordinary amount. You can do data science on a CPU, but training might take days compared to hours on a GPU.
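Those "thousands of identical computations" can be sketched with a single dense layer. In this toy example (the layer sizes are arbitrary, chosen only for illustration), every one of the 1,000 neurons performs the same dot-product-plus-bias computation, just with different weights — exactly the kind of identical, independent work a GPU spreads across its cores:

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.standard_normal(784).astype(np.float32)         # e.g. a flattened 28x28 image
W = rng.standard_normal((1000, 784)).astype(np.float32) # one weight row per neuron
b = np.zeros(1000, dtype=np.float32)

# Each of the 1,000 rows of W @ x is the same computation repeated with
# different parameters; a GPU evaluates these rows in parallel.
activations = np.maximum(W @ x + b, 0.0)                # ReLU(Wx + b)

print(activations.shape)  # (1000,)
```

During training, this forward pass (and its backward counterpart) is repeated millions of times, which is why the parallel speedup compounds into the days-versus-hours difference described above.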

Nvidia is currently the backbone of most machine learning work. Unfortunately, AMD (or rather OpenCL) does not provide much support for neural network primitives. Nvidia, by contrast, works with researchers to provide cuDNN, the CUDA Deep Neural Network library. This lets researchers spend more time training and developing models rather than hand-coding and optimizing GPU usage themselves.

