
Deep Learning – Creating Minds

Saneev Kumar Das
Computer Science & Engineering, College of Engineering & Technology, B.P.U.T., Odisha, India
saneevdas.061995@gmail.com

Abstract. Deep learning is a sub-field of machine learning, which in turn is a sub-field of artificial intelligence. Computers today are equipped with highly competent hardware as well as software to meet specific human needs. This paper explains how deep learning has created huge scope for performing in a manner similar to the human brain. Furthermore, it discusses certain advancements in deep learning technology and provides the reader with in-depth knowledge of this emerging technology.

Keywords. Deep Learning, TensorFlow, Caffe, Neural Networks, Convolution.

1. Introduction

Real-world data, when collected, is largely unstructured, i.e., in technical terms, unlabeled. Structuring or labeling such data demands high accuracy, and machine learning technology is not capable enough to provide it; this gap drove the evolution of a sub-field of machine learning called deep learning. Deep learning focuses primarily on imitating the working methodology of the human brain in making decisions as well as recognizing patterns. Machine learning is not capable enough of processing natural data in raw form [5]. Deep learning has shown its efficiency in image recognition [1, 4] and speech recognition [3, 12], surpassing machine learning when it comes to predicting the activities of potential drug molecules [13], analyzing particle-accelerator data [11, 14], and reconstructing brain circuits. It uses artificial neural networks modeled loosely on the human brain, which comprises billions of neurons. Digital data in the real world has grown enormously, giving rise to the concept of big data, which demands heavy processing; in such cases the machine learning technique cannot show its efficiency, and deep learning is currently needed to perform big-data processing. In this paper we further discuss the differences between machine learning and deep learning, the implementation of deep learning, and advancements in deep learning.
2. Deep Learning vs. Machine Learning

Based on performance, machine learning works well in the case of small data, whereas deep learning performs efficiently when large amounts of data need to be processed.

Based on training time, machine learning takes less time to train on the data, whereas deep learning, due to its higher number of parameters, takes comparatively more time to train.

Based on accuracy, deep learning outperforms machine learning, the reasoning being that the longer the training, the higher the accuracy.

Based on hardware requirements, deep learning requires GPUs and high-end devices, whereas machine learning can work well with low-end machines.

Based on tasks, machine learning can perform supervised, unsupervised as well as reinforcement learning, whereas deep learning, being a sub-field, is applied chiefly to unsupervised and reinforcement learning.
3. Implementation and Frameworks

A Python-based framework called TensorFlow is a popularly used one. It comes bundled with supporting tools such as TensorBoard and is well documented, which eases the learning curve for beginners.
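As a brief, illustrative sketch of how TensorFlow is typically used (the model, data and training loop below are assumptions chosen for illustration and are not taken from this paper), the following Python snippet fits a simple linear model with gradient descent using tf.GradientTape, assuming TensorFlow 2.x:

# Minimal TensorFlow 2.x sketch (illustrative only): fitting y = 3x + 2
# by gradient descent with tf.GradientTape. The data, learning rate and
# number of steps are assumptions chosen for the example.
import tensorflow as tf

xs = tf.constant([[0.0], [1.0], [2.0], [3.0]])   # synthetic inputs
ys = 3.0 * xs + 2.0                              # targets for y = 3x + 2

w = tf.Variable(0.0)
b = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for step in range(200):
    with tf.GradientTape() as tape:
        pred = w * xs + b
        loss = tf.reduce_mean(tf.square(pred - ys))  # mean squared error
    grads = tape.gradient(loss, [w, b])
    optimizer.apply_gradients(zip(grads, [w, b]))

print(float(w), float(b))  # should approach 3.0 and 2.0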
A high-speed framework called Caffe came into existence to work effectively with convolutional neural networks; it can be used from C, C++, Python, MATLAB, and the command line. It has also found application in vision recognition.

Furthermore, the Microsoft Cognitive Toolkit was introduced to serve a purpose similar to Caffe, with particular emphasis on training under the reinforcement learning methodology.

A framework called PyTorch, used by Facebook, Google, etc., is the Python successor to the Lua-based Torch library and is competent enough to execute tensor computations using the popularly known CUDA platform.
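The tensor computations on CUDA mentioned above can be illustrated with the following minimal PyTorch sketch; the matrix sizes are arbitrary, and the code simply falls back to the CPU when no CUDA device is available:

# Minimal PyTorch sketch (illustrative only): a tensor computation that
# runs on a CUDA GPU when available and otherwise on the CPU.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Two random matrices moved to the chosen device.
a = torch.randn(1024, 512, device=device)
b = torch.randn(512, 256, device=device)

# Matrix multiplication executed on the GPU (or the CPU fallback).
c = a @ b
print(c.shape, c.device)  # torch.Size([1024, 256]) on cuda:0 or cpu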
A Python-based framework called Keras offers a very user-friendly interface for the construction of recurrent networks as well as convolutional neural networks.
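To illustrate the user-friendly Keras interface for building convolutional networks, the following sketch constructs a small convolutional classifier; the input shape (28x28 grayscale images), the layer configuration and the ten output classes are assumptions made for the example, not details from the paper:

# Minimal Keras sketch (illustrative only): a small convolutional network
# for 28x28 grayscale images with 10 output classes. The architecture is
# an assumption chosen for illustration.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(16, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()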
A very powerful framework called Chainer, which is capable of modifying neural networks at runtime, is Python-based and supports multi-GPU setups as well as CUDA.

A framework called MXNet, supported from C++, Python, R, and Julia, handles long short-term memory, recurrent, and convolutional networks.

A recently developed Java-based framework with deep-network support, called Deeplearning4j, works with convolutional neural networks, recurrent neural networks, long short-term memory, recursive neural tensor networks, restricted Boltzmann machines, and deep belief networks.

All of the above frameworks support deep learning.
4. Advancements in Deep Learning

In today's era, deep learning has found very large scope and has transformed the entire technological landscape. It has found application in fields such as healthcare, self-driving cars, disaster detection, etc.

Technological advancements include the performance of deep learning on NLP tasks such as speech recognition, parsing and machine translation. Furthermore, long short-term memory is a wide area of application for deep learning. BERT, i.e., bidirectional encoder representations from transformers, recently introduced by Google, is a bidirectional language model that performs eleven complex NLP tasks, including sentiment analysis, paraphrase detection, etc. Also, video-to-video synthesis and improved word embeddings are some of the recent advancements in the deep learning field of study.
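As an illustrative sketch only, one common way to apply a pretrained BERT-style model to a task such as sentiment analysis is through the Hugging Face transformers library; this library, the pipeline call and the default model it downloads are assumptions introduced here for illustration and are not discussed in the paper:

# Illustrative sketch (assumption, not from the paper): applying a
# pretrained BERT-style model to sentiment analysis via the Hugging Face
# `transformers` library. Requires `pip install transformers` and an
# internet connection to download the default pretrained model.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Deep learning frameworks have become remarkably easy to use.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]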

5. Concluding Remarks
Deep learning is a future trend and needs a great deal of further research and advancement to keep pace with the changing world and to overcome its current limitations. To justify the title of this paper: the human mind is irreplaceable and can never be fully substituted by deep learning, but deep learning may in future perform many of the tasks that a human mind carries out on a regular basis. The implementation of deep learning across various sectors shows the enormous demand for deep learning in dealing with big data.

References
1. Farabet, C., Couprie, C., Najman, L. & LeCun, Y. Learning hierarchical features for scene labeling. IEEE Trans. Pattern Anal. Mach. Intell. 35, 1915–1929 (2013).
2. Szegedy, C. et al. Going deeper with convolutions. Preprint at http://arxiv.org/abs/1409.4842 (2014).
3. Hinton, G. et al. Deep neural networks for acoustic modeling in speech recognition. IEEE Signal Processing Magazine 29, 82–97 (2012).
4. Krizhevsky, A., Sutskever, I. & Hinton, G. ImageNet classification with deep convolutional neural networks. In Proc. Advances in Neural Information Processing Systems 25 1090–1098 (2012).
5. LeCun, Y., Bengio, Y. & Hinton, G. Deep learning. Nature 521, 436–444 (2015).
6. Kavukcuoglu, K. et al. Learning convolutional feature hierarchies for visual recognition. In Proc. Advances in Neural Information Processing Systems 23 1090–1098 (2010).
7. Kingma, D., Rezende, D., Mohamed, S. & Welling, M. Semi-supervised learning with deep generative models. In Proc. Advances in Neural Information Processing Systems 27 3581–3589 (2014).
8. Mnih, V. et al. Human-level control through deep reinforcement learning. Nature 518, 529–533 (2015).
9. Collobert, R. et al. Natural language processing (almost) from scratch. J. Mach. Learn. Res. 12, 2493–2537 (2011).
10. LeCun, Y., Bottou, L., Bengio, Y. & Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 86, 2278–2324 (1998).
11. Ciodaro, T., Deva, D., de Seixas, J. & Damazio, D. Online particle detection with neural networks based on topological calorimetry information. J. Phys. Conf. Series 368, 012030 (2012).
12. Mikolov, T., Deoras, A., Povey, D., Burget, L. & Cernocky, J. Strategies for training large scale neural network language models. In Proc. Automatic Speech Recognition and Understanding 196–201 (2011).
13. Ma, J., Sheridan, R. P., Liaw, A., Dahl, G. E. & Svetnik, V. Deep neural nets as a method for quantitative structure-activity relationships. J. Chem. Inf. Model. 55, 263–274 (2015).
14. Kaggle. Higgs boson machine learning challenge. Kaggle https://www.kaggle.com/c/higgs-boson (2014).
