S. PRAVEEN KUMAR
3rd Year, CSE Department
Email: praveencse333@gmail.com
Cloud computing refers to the delivery of computing resources over the Internet. Instead of keeping data on your own hard drive or updating applications yourself, you use a service over the Internet, at another location, to store your information or run applications. Doing so may give rise to certain privacy implications. For that reason the Office of the Privacy Commissioner of Canada (OPC) has prepared responses to Frequently Asked Questions (FAQs), along with a Fact Sheet that provides detailed information on cloud computing and the privacy challenges it presents.
Cloud Computing
Cloud computing is the delivery of computing services over the Internet. Cloud services allow individuals and businesses to use software and hardware that are managed by third parties at remote locations. Examples of cloud services include online file storage, social networking sites, webmail, and online business applications. The cloud computing model allows access to information and computing resources from anywhere a network connection is available. Cloud computing provides a shared pool of resources, including data storage space, networks, computer processing power, and specialized corporate and user applications.[1]
TYPES OF CLOUDS
Cloud providers typically centre on one type of cloud functionality provisioning: Infrastructure, Platform, or Software/Application, though there is no restriction on offering multiple types at the same time. This can often be observed in PaaS (Platform as a Service) providers, which offer specific applications too, such as Amazon EC2 and Zimory.
Hadoop consists of three subprojects:
Hadoop Common: the common utilities package shared by the other modules.
HDFS: the Hadoop Distributed File System.
Hadoop MapReduce: the framework in which processing occurs on the nodes where the data resides.
HDFS is designed around the following assumptions:
Hardware failure is the norm rather than the exception.
Streaming data access.
Large data sets.
Moving computation is cheaper than moving data.
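The MapReduce model described above can be illustrated in miniature. The sketch below is plain Python rather than the Hadoop Java API, and the function names (map_phase, shuffle, reduce_phase) are illustrative, not part of Hadoop; it shows the classic word-count job, where the framework would run many map tasks in parallel and then group intermediate pairs by key before reducing.

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit an intermediate (word, 1) pair for every word."""
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group intermediate values by key, as the framework does
    between the map and reduce phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: collapse all intermediate values for one key into a result."""
    return (key, sum(values))

# Two toy "input splits"; in Hadoop each would be processed on the node
# that stores it, following the moving-computation principle above.
documents = ["the cloud stores data", "the cloud scales"]
pairs = [p for doc in documents for p in map_phase(doc)]
result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
```

Because each map call touches only its own document and each reduce call only its own key, both phases parallelize without coordination, which is what lets Hadoop scale to thousands of nodes.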
Conclusion
As discussed in this paper, Cloud Computing serves us in various aspects. Analyzing new and diverse digital data streams can reveal new sources of economic value, provide fresh insights into customer behaviour, and identify market trends early on. But this influx of new data creates challenges for IT departments. To derive real business value from big data, you need the right tools to capture and organize a wide variety of data types from different sources, and to be able to easily analyze them within the context of all your enterprise data. Hadoop is a large-scale, open-source software framework dedicated to scalable, distributed, data-intensive computing. The framework breaks large data into smaller parallelizable chunks and handles scheduling: it maps each piece to an intermediate value, reduces the intermediate values to a solution, and supports user-specified partition and combiner options. It is fault tolerant, reliable, and supports thousands of nodes and petabytes of data. If you can rewrite your algorithms as Maps and Reduces, and your problem can be broken into independent pieces, Hadoop is a natural fit.
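The user-specified partition and combiner options mentioned above can be sketched as follows. This is an assumption-laden Python illustration, not Hadoop's API: the hash-modulo partitioner mirrors Hadoop's default behaviour, and all names (mapper, combiner, partition) are hypothetical.

```python
from collections import defaultdict

def mapper(line):
    """Map: emit (word, 1) for each word in one input line."""
    for word in line.split():
        yield word, 1

def combiner(pairs):
    """Combiner: pre-aggregate on the map side to cut shuffle traffic."""
    local = defaultdict(int)
    for key, value in pairs:
        local[key] += value
    return local.items()

def partition(key, num_reducers):
    """Partitioner: decide which reducer receives this key."""
    return hash(key) % num_reducers

NUM_REDUCERS = 2
buckets = [defaultdict(int) for _ in range(NUM_REDUCERS)]
lines = ["big data big insight", "big value"]
for line in lines:
    for key, value in combiner(mapper(line)):
        # Reduce: each bucket sums the values routed to it by the partitioner.
        buckets[partition(key, NUM_REDUCERS)][key] += value

totals = {k: v for bucket in buckets for k, v in bucket.items()}
```

The combiner is an optimization, not a requirement: because addition is associative and commutative, summing locally before the shuffle produces the same totals while sending fewer intermediate pairs across the network.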
REFERENCES:
[1] www.priv.gc.ca
[2] www.cse.buffalo.edu/~bina/CloudComputingJun28.
[3] Hadoop Distributed Filesystem. http://hadoop.apache.org
[4] http://www.forbes.com/sites/louiscolumbus/2014/02/24/the-best-cloud-computing-companies-and-ceos-to-work-for-in-2014/
[5] http://www.javacodegeeks.com/2012/05/mapreduce-for-dummies.html
[6] HDFS Java API: http://hadoop.apache.org/core/docs/current/api/
[7] HDFS source code: http://hadoop.apache.org/core/version_control.html