
Group 3 - CUDA for Machine/Deep Learning

Pawan Hage - 111508027
Pratapsing Kachave - 111508031
Swapnil Kadam - 111508032
Nagesh Kamble - 111508033
What is CUDA?
1. CUDA is a parallel computing platform and programming model developed by
Nvidia for general-purpose computing on its own GPUs (graphics processing units).

2. CUDA enables developers to speed up compute-intensive applications by
using the power of the GPU for the parallelizable parts of the computation.
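As a minimal sketch of what "using the GPU for the parallelizable part" means, the hypothetical kernel below adds two vectors by giving each element its own GPU thread. (This example is not from the slides; it needs an NVIDIA GPU and compiles with `nvcc`.)

```cuda
#include <cstdio>

// Each thread computes one element of c = a + b.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)                                      // guard: grid may overshoot n
        c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);  // unified memory, visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;  // enough blocks to cover n
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();                   // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The `<<<blocks, threads>>>` launch configuration is what makes the million additions run concurrently rather than in a CPU-style loop.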
GPU vs CPU
GPU
1. Hundreds of simpler cores
2. Thousands of concurrent threads

CPU
1. Very few complex cores
2. Optimized for single-thread performance
Advantages of GPUs
1. Used anywhere an image needs to be processed, geometry needs to be drawn
on screen, or a mass of physics equations needs to be solved, such as in
PCs, smartphones, and supercomputers.

2. GPUs are much faster at graphics-related and massively parallel jobs.
Disadvantages of GPUs
GPUs perform poorly at branch prediction.

A GPU cannot drive itself; it needs a CPU to control it.

Modern graphics cards are power hungry. Having two of them in a system can
almost double the amount of power required to run them in tandem.

GPUs are very costly.


Advantages of CPUs over GPUs
CPUs are comparatively cheap and come in a wide variety of memory configurations.

Even the costliest GPUs offer only up to about 12 GB of memory and can cost twice
as much as an entire computer.
Advantages of CUDA
● Huge increase in processing power over conventional CPU processing; early
reports suggest speedups of 10x to 200x over CPU processing speed.
● The C language is widely used, so it is easy for developers to learn how to
program for CUDA.
● All graphics cards in the G80 series and beyond support CUDA.
● Harnesses the power of the GPU through parallel processing, running
thousands of simultaneous threads instead of the single, dual, or quad
threads of a CPU.
Disadvantages of CUDA
● Limited user base: only NVIDIA G80 and onward video cards can use CUDA,
which excludes all ATI users.
● Speeds may be bottlenecked by the bus between CPU and GPU.
● Developers are still sceptical as to whether CUDA will catch on.
● Mainly developed for researchers; not many uses for average users.
● The system is still in development.
Why a GPU is Necessary for Machine/Deep
Learning
Machine learning has four main tasks:

1) Preprocessing input data
2) Training the deep learning model
3) Storing the trained deep learning model
4) Deployment of the model

Among all these, training the deep learning model is the most compute-intensive
task. When you train a deep learning model, two main operations are performed:

● Forward pass
● Backward pass

Both of these operations are essentially matrix multiplications.

In a neural network, the first matrix can be considered the input to the
network, and the second matrix the weights of the network.

A GPU can perform all of these multiply-accumulate operations at the same time
instead of one after the other.
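The input-times-weights operation above can be sketched as a CUDA kernel. This is a hypothetical, simplified example (names `W`, `x`, `y`, `rows`, `cols` are illustrative, not from the slides): a dense layer's forward pass y = W·x is a matrix-vector product, and every output element can be computed by its own thread in parallel.

```cuda
// Forward pass of one dense layer: y = W * x.
// W is rows x cols (row-major), x has cols entries, y has rows entries.
__global__ void denseForward(const float *W, const float *x, float *y,
                             int rows, int cols) {
    int r = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per output row
    if (r < rows) {
        float sum = 0.0f;
        for (int c = 0; c < cols; ++c)
            sum += W[r * cols + c] * x[c];  // dot product of row r with x
        y[r] = sum;
    }
}
```

On a CPU the `rows` dot products run one after another; here they all run concurrently, which is why training, being dominated by such products, benefits so much from the GPU. (In practice, libraries such as cuBLAS or cuDNN would be used instead of a hand-written kernel.)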
System requirements
To use CUDA on your system, you will need the following installed:

● CUDA-capable GPU
● A supported version of Linux with a gcc compiler and toolchain
● NVIDIA CUDA Toolkit (available at
http://developer.nvidia.com/cuda-downloads)
CUDA Toolkit
The NVIDIA® CUDA® Toolkit provides a development environment for creating
high-performance GPU-accelerated applications.

With it you can optimize and deploy your applications on GPU-accelerated
embedded systems, desktop workstations, enterprise data centers, cloud-based
platforms, and HPC supercomputers.

The toolkit includes debugging and optimization tools, a C/C++ compiler, and a
runtime library to deploy your application.

GPU-accelerated CUDA libraries enable drop-in acceleration for image and video
processing, deep learning, and graph analytics.

For custom algorithms, you can use available integrations with commonly used
languages and numerical packages, as well as well-published development APIs.
CUDA Application Domains
1. Bioinformatics
2. Medical imaging
3. Data science
4. Defense
5. Electronic design automation
6. Imaging and computer vision
7. Machine Learning/Deep Learning
8. Numerical analytics
THANK YOU... Any Questions?
