
Cloud computing

From Wikipedia, the free encyclopedia



For other uses, see Cloud (disambiguation).

Cloud Computing visual diagram

Cloud computing refers to the on-demand provision of computational resources (data, software) via
a computer network, rather than from a local computer. Users or clients can submit a task, such as word
processing, to the service provider, without actually possessing the software or hardware. The
consumer's computer may contain very little software or data (perhaps a minimal operating
system and web browser only), serving as little more than a display terminal connected to the Internet.
Since the cloud is the underlying delivery mechanism, cloud-based applications and services may support
any type of software application or service in use today.

In the past, both data and software had to be stored and processed on or near the user's computer. The
development of local area networks allowed systems in which multiple CPUs and storage devices could be
organized to increase the performance of the entire system. In an extension of that concept, cloud
computing fundamentally allows a functional separation between the resources used and the user's
computer: the resources usually reside outside the local network, for example in a remote datacenter.
Consumers now routinely use data-intensive applications driven by cloud technology that were previously
unavailable due to cost and deployment complexity.[citation needed] In many companies, employees and
departments are bringing a flood of consumer technology into the workplace, which raises legal
compliance and security concerns for the corporation.[citation needed]

The term "software as a service" is sometimes used to describe programs offered through "The Cloud".

A common shorthand for a provided cloud computing service (or even an aggregation of all existing cloud
services) is "The Cloud".[citation needed]

An analogy often used to explain cloud computing is that of public utilities such as electricity, gas, and water.
Centralized and standardized utilities freed individuals from the difficulties of generating their own electricity
or pumping their own water, along with all of the development and maintenance tasks involved in doing so.
With cloud computing, this translates into reduced costs of software distribution compared with providers
who still ship software on physical media such as DVDs. Consumers benefit because software no longer has
to be installed and is updated automatically, although savings in terms of dollars have yet to be demonstrated.

The principle behind the cloud is that any computer connected to the Internet is connected to the same
pool of computing power, applications, and files. Users can store and access personal files such as
music, pictures, videos, and bookmarks or play games or do word processing on a remote server rather
than physically carrying around a storage medium such as a DVD or thumb drive. Even those who
use web-based email such as Gmail, Hotmail, Yahoo, a company-owned email system, or an e-mail client
program such as Outlook, Evolution, Mozilla Thunderbird or Entourage are making use of cloud email
servers. Hence, desktop applications that connect to cloud email can also be considered cloud
applications.

Contents

• 1 How it works

• 2 Technical description

• 3 Overview

o 3.1 Comparisons

o 3.2 Characteristics

o 3.3 Architecture

• 4 History

• 5 Key characteristics
• 6 Layers

o 6.1 Client

o 6.2 Application

o 6.3 Platform

o 6.4 Infrastructure

o 6.5 Server

• 7 Deployment models

o 7.1 Public cloud

o 7.2 Community cloud

o 7.3 Hybrid cloud and hybrid IT delivery

o 7.4 Combined cloud

o 7.5 Private cloud

• 8 Cloud engineering

• 9 Cloud storage

• 10 The Intercloud

• 11 Issues

o 11.1 Privacy

o 11.2 Compliance

o 11.3 Legal

o 11.4 Open source

o 11.5 Open standards

o 11.6 Security

o 11.7 Availability and performance

o 11.8 Sustainability and siting

o 11.9 Use by Hackers

• 12 Research

• 13 Criticism of the term

• 14 See also

• 15 References

• 16 External links
How it works

Cloud computing utilizes the network as a means to connect the user to resources that are based in the
'cloud', as opposed to actually possessing them. The 'cloud' may be accessed via the Internet or a
company network, or both. Cloud services may be designed to work equally well with Linux, Mac and
Windows platforms. With smartphones and tablets on the rise, cloud services have changed to allow
access from any device connected to the Internet, giving mobile workers access on the go, as in
telecommuting, and extending the reach of business services provided through outsourcing.

The service provider may pool the processing power of multiple remote computers in "the cloud" to
achieve the task, such as backing up of large amounts of data, word processing, or computationally
intensive work. These tasks would normally be difficult, time consuming, or expensive for an individual
user or a small company to accomplish, especially with limited computing resources and funds. With
'cloud computing', clients need only a simple computer, such as a netbook (a class of device created with
cloud computing in mind) or even a smartphone, with a connection to the Internet or a company network,
in order to make requests to and receive data from the cloud; hence the term "software as a service"
(SaaS). Computation and storage are divided among the remote computers in order to handle large
volumes of both, so the client need not purchase expensive hardware or software to handle the task.
The outcome of the processing task is returned to the client over the network at a speed that depends on
the Internet connection.
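
The request/response pattern described above can be made concrete with a short sketch. The endpoint URL, the task fields, and the response shape below are hypothetical, invented for illustration; a real provider's API will differ.

```python
# Minimal sketch of the pattern described above: a thin client submits a task to
# a remote "cloud" service over HTTP and receives the result over the network.
# The endpoint URL, payload fields, and response shape are hypothetical.
import json
import urllib.request

CLOUD_ENDPOINT = "https://cloud.example.com/api/v1/tasks"  # hypothetical service

def submit_task(task_type: str, payload: dict) -> dict:
    """Send a task to the cloud and return the provider's JSON response."""
    body = json.dumps({"type": task_type, "payload": payload}).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # the result comes back over the network

if __name__ == "__main__":
    # The client machine only needs a network connection and this small script;
    # all heavy computation and storage happen on the provider's side.
    result = submit_task("word_count", {"text": "cloud computing example"})
    print(result)
```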

Technical description
The National Institute of Standards and Technology (NIST) provides a concise and specific definition:

Cloud computing is a model for enabling convenient, on-demand network access to a shared
pool of configurable computing resources (e.g., networks, servers, storage, applications, and
services) that can be rapidly provisioned and released with minimal management effort or
service provider interaction.[1]

Cloud computing provides computation, software, data access, and storage services that do not require
end-user knowledge of the physical location and configuration of the system that delivers the services.
Parallels to this concept can be drawn with the electricity grid, where end-users consume power without
needing to understand the component devices or infrastructure required to provide the service.

Cloud computing describes a new supplement, consumption, and delivery model for IT services based on
Internet protocols, and it typically involves provisioning of dynamically scalable and
often virtualized resources.[2][3] It is a byproduct and consequence of the ease-of-access to
remote computing sites provided by the Internet.[4] This may take the form of web-based tools or
applications that users can access and use through a web browser as if they were programs installed
locally on their own computers.[5]

Cloud computing providers deliver applications via the internet, which are accessed from a Web browser,
while the business software and data are stored on servers at a remote location. In some cases, legacy
applications (line of business applications which until now have been prevalent in thick client Windows
computing) are delivered via a screen sharing technology such as Citrix XenApp, while the compute
resources are consolidated at a remote data center location; in other cases entire business applications
have been coded using web based technologies such as AJAX.

Most cloud computing infrastructures consist of services delivered through shared data centers. The
Cloud may appear as a single point of access for consumers' computing needs; notable examples include
the iTunes Store and the iPhone App Store. Commercial offerings may be required to meet service level
agreements (SLAs), but specific terms are less often negotiated by smaller companies.[6]
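
The NIST wording above (a shared pool of configurable resources, rapidly provisioned and released with minimal management effort) can be illustrated with a toy model. The pool size, tenant names, and allocation rule are assumptions made up for the example, not any provider's mechanism.

```python
# Toy illustration of the NIST definition quoted above: resources drawn on
# demand from a shared, configurable pool and released again with minimal
# interaction. Pool contents and sizes are invented for the example.
class ResourcePool:
    """A shared pool from which compute resources are provisioned on demand."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.allocated = {}          # tenant -> number of units in use

    def provision(self, tenant: str, units: int) -> bool:
        """Rapidly provision units for a tenant if the pool has spare capacity."""
        in_use = sum(self.allocated.values())
        if in_use + units > self.capacity:
            return False             # pool saturated; request cannot be met
        self.allocated[tenant] = self.allocated.get(tenant, 0) + units
        return True

    def release(self, tenant: str) -> None:
        """Release everything a tenant holds back into the shared pool."""
        self.allocated.pop(tenant, None)

pool = ResourcePool(capacity=100)    # e.g. 100 virtual-machine slots
pool.provision("analytics-team", 20) # scaled up for a burst of work...
pool.release("analytics-team")       # ...and released when the burst is over
```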

Overview

Comparisons

Cloud computing shares characteristics with:

1. Autonomic computing — "computer systems capable of self-management."[7]
2. Client–server model – client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).[8]
3. Grid computing — "a form of distributed computing and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks."
4. Mainframe computer — powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, enterprise resource planning, and financial transaction processing.[9]
5. Utility computing — the "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity."[10]
6. Peer-to-peer – distributed architecture without the need for central coordination, with participants being at the same time both suppliers and consumers of resources (in contrast to the traditional client–server model).
7. Service-oriented computing – cloud computing provides services related to computing while, in a reciprocal manner, service-oriented computing consists of the computing techniques that operate on software-as-a-service.[11]

Characteristics

The key characteristic of cloud computing is that the computing is "in the cloud"; that is, the processing
(and the related data) is not in a specified, known or static place. This is in contrast to a model in which
the processing takes place in one or more specific servers that are known. All the other concepts
mentioned are supplementary or complementary to this concept.

Architecture

Cloud computing sample architecture

Cloud architecture,[12] the systems architecture of the software systems involved in the delivery of cloud
computing, typically involves multiple cloud components communicating with each other over application
programming interfaces, usually web services and 3-tier architecture. This resembles the Unix
philosophy of having multiple programs each doing one thing well and working together over universal
interfaces. Complexity is controlled and the resulting systems are more manageable than
their monolithic counterparts.

The two most significant components of cloud computing architecture are known as the front end and the
back end. The front end is the part seen by the client, i.e. the computer user. This includes the client’s
network (or computer) and the applications used to access the cloud via a user interface such as a web
browser. The back end of the cloud computing architecture is the ‘cloud’ itself, comprising various
computers, servers and data storage devices.
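
A minimal sketch of this front-end/back-end split follows, assuming Python's standard-library HTTP server as a stand-in for the "cloud" side; the port and the response text are arbitrary choices for the example.

```python
# Sketch of the front-end/back-end split described above: the "back end" is a
# tiny HTTP service standing in for the cloud, and the "front end" is any client
# (normally a web browser) that talks to it over the network.
from http.server import BaseHTTPRequestHandler, HTTPServer

class BackEnd(BaseHTTPRequestHandler):
    """The 'cloud' side: servers, storage and application logic live here."""

    def do_GET(self):
        # A real back end would route to many services behind this interface.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"response produced in the cloud back end")

if __name__ == "__main__":
    # The front end (e.g. a browser pointed at http://localhost:8080/) only
    # needs the user interface; everything else happens behind this endpoint.
    HTTPServer(("localhost", 8080), BackEnd).serve_forever()
```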

History
The term "cloud" is used as a metaphor for the Internet, based on the cloud drawing used in the past to
represent the telephone network,[13]and later to depict the Internet in computer network diagrams as
an abstraction of the underlying infrastructure it represents.[14]

Cloud computing is a natural evolution of the widespread adoption of virtualization, service-oriented
architecture, autonomic and utility computing. Details are abstracted from end-users, who no longer have
need for expertise in, or control over, the technology infrastructure "in the cloud" that supports them.[15]

The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that
"computation may someday be organized as a public utility." Almost all the modern-day characteristics of
cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison
to the electricity industry and the use of public, private, government and community forms, were
thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility.

The actual term "cloud" borrows from telephony in that telecommunications companies, who until the
1990s primarily offered dedicated point-to-point data circuits, began offering Virtual Private Network (VPN)
services with comparable quality of service but at a much lower cost. By switching traffic to balance
utilization as they saw fit, they were able to utilize their overall network bandwidth more effectively. The
cloud symbol was used to denote the demarcation point between that which was the responsibility of the
provider, and that which was the responsibility of the user. Cloud computing extends this boundary to
cover servers as well as the network infrastructure.[16] The first scholarly use of the term “cloud
computing” was in a 1997 lecture by Ramnath Chellappa.[17]

After the dot-com bubble, Amazon played a key role in the development of cloud computing by
modernizing their data centers, which, like most computer networks, were using as little as 10% of their
capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud
architecture resulted in significant internal efficiency improvements whereby small, fast-moving "two-pizza
teams" could add new features faster and more easily, Amazon initiated a new product development
effort to provide cloud computing to external customers, and launched Amazon Web Service (AWS) on
a utility computing basis in 2006.[18][19] The first exposure of the term "cloud computing" to public media
was by then-Google CEO Eric Schmidt at SES San Jose 2006.[20] It was reported in 2011 that Amazon has
thousands of corporate customers, from large ones like Pfizer and Netflix to start-ups. Among them are
many companies that run entirely on Amazon's web services, including Foursquare, a location-based
social networking site; Quora, a question-and-answer service; Reddit, a site for news-sharing;
and BigDoor, a maker of game tools for Web publishers.[21]

In 2007, Google, IBM and a number of universities embarked on a large-scale cloud computing research
project.[22] In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for
deploying private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European
Commission-funded project, became the first open-source software for deploying private and hybrid
clouds, and for the federation of clouds.[23] In the same year, efforts were focused on providing QoS
guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the
framework of the IRMOS European Commission-funded project.[24] By mid-2008, Gartner saw an
opportunity for cloud computing "to shape the relationship among consumers of IT services, those who
use IT services and those who sell them"[25] and observed that "[o]rganisations are switching from
company-owned hardware and software assets to per-use service-based models" so that the "projected
shift to cloud computing ... will result in dramatic growth in IT products in some areas and significant
reductions in other areas."[26]

Key characteristics

 Agility improves with users' ability to rapidly and inexpensively re-provision technological
infrastructure resources.[27]

Application Programming Interface (API) accessibility to software that enables machines to
interact with cloud software in the same way the user interface facilitates interaction between humans
and computers. Cloud computing systems typically use REST-based APIs (see the sketch after this list).

 Cost is claimed to be greatly reduced and in a public cloud delivery model capital
expenditure is converted to operational expenditure.[28] This ostensibly lowers barriers to entry, as
infrastructure is typically provided by a third-party and does not need to be purchased for one-time or
infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-
based options and fewer IT skills are required for implementation (in-house).[29]

 Device and location independence[30] enable users to access systems using a web browser
regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure
is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from
anywhere.[29]

 Multi-tenancy enables sharing of resources and costs across a large pool of users thus
allowing for:

 Centralization of infrastructure in locations with lower costs (such as real estate, electricity,
etc.)

 Peak-load capacity increases (users need not engineer for highest possible load-levels)
 Utilization and efficiency improvements for systems that are often only 10–20% utilized.[18]
 Reliability is improved if multiple redundant sites are used, which makes well designed cloud
computing suitable for business continuity and disaster recovery.[31]

Scalability via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis
near real-time, without users having to engineer for peak loads.

 Performance is monitored, and consistent and loosely coupled architectures are constructed
using web services as the system interface.[29]

Security could improve due to centralization of data,[32] increased security-focused resources,
etc., but concerns can persist about loss of control over certain sensitive data, and the lack of
security for stored kernels.[33] Security is often as good as or better than under traditional systems, in
part because providers are able to devote resources to solving security issues that many customers
cannot afford.[34] However, the complexity of security is greatly increased when data is distributed
over a wider area or greater number of devices and in multi-tenant systems which are being shared
by unrelated users. In addition, user access to security audit logs may be difficult or impossible.
Private cloud installations are in part motivated by users' desire to retain control over the
infrastructure and avoid losing control of information security.

Maintenance of cloud computing applications is easier, because they do not need to be installed on
each user's computer. They are easier to support and to improve, as the changes reach the clients
instantly.
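
As referenced in the API item above, the following is a minimal sketch of REST-style access to a cloud provider, assuming a hypothetical base URL and resource schema rather than any particular vendor's API.

```python
# Sketch of REST-style API access: resources are addressed by URL and
# manipulated with standard HTTP verbs. The base URL, resource names and fields
# are hypothetical, not any real provider's API.
import json
import urllib.request

BASE_URL = "https://api.cloud.example.com/v1"   # hypothetical provider endpoint

def list_servers() -> list:
    """GET /servers — read the collection of virtual servers."""
    with urllib.request.urlopen(f"{BASE_URL}/servers") as resp:
        return json.load(resp)

def create_server(name: str, size: str) -> dict:
    """POST /servers — re-provision infrastructure programmatically."""
    body = json.dumps({"name": name, "size": size}).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/servers",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# create_server("web-01", "small") would ask the provider for a new machine;
# the same call pattern works from any script, not just a human-facing console.
```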

Layers

Once an Internet Protocol connection is established among several computers, it is possible to share
services within any one of the following layers.

Client

See also: Category:Cloud clients

A cloud client consists of computer hardware and/or computer software that relies on cloud computing for
application delivery, or that is specifically designed for the delivery of cloud services and that, in either case,
is essentially useless without it. Examples include some computers, phones and other devices, operating
systems and browsers.[35][36][37][38][39] Cloud Desktop as a Service, or Hosted Desktop, is a term often used to
refer to a collection of virtual objects, software, hardware, configurations, etc., residing in the cloud, which a
client uses to interact with remote services and perform computer-related tasks.[40]

Application

See also: Category:Cloud applications

Cloud application services or "Software as a Service (SaaS)" deliver software as a service over the
Internet, eliminating the need to install and run the application on the customer's own computers and
simplifying maintenance and support. People tend to use the terms "SaaS" and "cloud" interchangeably,
when in fact they are two different things.[citation needed] Key characteristics include:[41][clarification needed]

 Network-based access to, and management of, commercially available (i.e., not custom)
software

 Activities that are managed from central locations rather than at each customer's site,
enabling customers to access applications remotely via the Web

Application delivery that typically is closer to a one-to-many model (single-instance, multi-tenant
architecture) than to a one-to-one model, including architecture, pricing, partnering, and
management characteristics (see the sketch after this list)

 Centralized feature updating, which obviates the need for downloadable patches and
upgrades
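
The one-to-many (single-instance, multi-tenant) delivery model listed above can be sketched as follows; the tenants and records are invented, and real SaaS offerings enforce isolation with far more machinery than a dictionary.

```python
# Sketch of the single-instance, multi-tenant model: one running application
# serves every customer while keeping each tenant's data separate.
class MultiTenantApp:
    """A single application instance shared by many customers (tenants)."""

    def __init__(self):
        self._data = {}                      # tenant -> that tenant's records

    def save(self, tenant: str, record: str) -> None:
        """Store a record in the tenant's own partition of the shared instance."""
        self._data.setdefault(tenant, []).append(record)

    def records_for(self, tenant: str) -> list:
        """Tenants only ever see their own data, never each other's."""
        return list(self._data.get(tenant, []))

app = MultiTenantApp()                       # one instance, centrally updated
app.save("acme-corp", "invoice #1")
app.save("globex", "invoice #7")
assert app.records_for("acme-corp") == ["invoice #1"]   # isolation per tenant
```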

Platform

See also: Category:Cloud platforms

Cloud platform services, also known as Platform as a Service (PaaS), deliver a computing
platform and/or solution stack as a service, often consuming cloud infrastructure and sustaining cloud
applications.[42] PaaS facilitates deployment of applications without the cost and complexity of buying and
managing the underlying hardware and software layers.[43][44]
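
As a rough illustration of the division of labour PaaS implies, the snippet below is the kind of artifact a customer might hand to a platform: plain application code (here a minimal WSGI app), with the servers, runtime and scaling left to the provider. How the app is declared and deployed varies by platform and is not shown.

```python
# Illustrative sketch only: a plain WSGI application a customer could deploy to
# a platform service; the provider supplies the servers and runtime underneath.
def application(environ, start_response):
    """A minimal web application; the platform decides where and how it runs."""
    start_response("200 OK", [("Content-Type", "text/plain; charset=utf-8")])
    return [b"served from a platform-managed runtime"]

# For local testing only; on a PaaS the provider's infrastructure hosts the app.
if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    make_server("localhost", 8000, application).serve_forever()
```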

Infrastructure

See also: Category:Cloud infrastructure

Cloud infrastructure services, also known as Infrastructure as a Service (IaaS),
deliver computer infrastructure – typically a platform virtualization environment – as a service. Rather
than purchasing servers, software, data-center space or network equipment, clients instead buy those
resources as a fully outsourced service. Suppliers typically bill such services on a utility computing basis;
the amount of resources consumed (and therefore the cost) will typically reflect the level of activity. IaaS
evolved from virtual private server offerings.[45]
Cloud infrastructure often takes the form of a tier 3 data center with many tier 4 attributes, assembled
from hundreds of virtual machines.
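
A small sketch of the utility-style billing mentioned above: cost follows measured consumption rather than purchased hardware. The instance sizes and hourly rates are invented for the example.

```python
# Sketch of utility-style IaaS billing: the bill tracks resource consumption
# rather than hardware purchased. Sizes and rates are invented for illustration.
HOURLY_RATES = {"small": 0.05, "medium": 0.20, "large": 0.80}   # $ per VM-hour

def monthly_cost(usage_hours: dict) -> float:
    """usage_hours maps an instance size to the VM-hours consumed that month."""
    return sum(HOURLY_RATES[size] * hours for size, hours in usage_hours.items())

# A bursty workload: one small VM all month, plus ten large VMs for one week.
bill = monthly_cost({"small": 720, "large": 7 * 24 * 10})
print(f"amount owed this month: ${bill:.2f}")    # the cost follows the activity
```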

Server

The servers layer consists of computer hardware and/or computer software products that are specifically
designed for the delivery of cloud services, including multi-core processors, cloud-specific operating
systems and combined offerings.[35][46][47][48]

Deployment models

Cloud computing types

Public cloud
Public cloud or external cloud describes cloud computing in the traditional mainstream sense, whereby
resources are dynamically provisioned on a fine-grained, self-service basis over the Internet, via web
applications/web services, from an off-site third-party provider who bills on a fine-grained utility
computing basis.[29]

Community cloud
A community cloud may be established where several organizations have similar requirements and seek
to share infrastructure so as to realize some of the benefits of cloud computing. The costs are spread
over fewer users than a public cloud (but more than a single tenant). This option may offer a higher level
of privacy, security and/or policy compliance. In addition it can be economically attractive as the
resources (storage, workstations) utilized and shared in the community are already exploited and have
reached their return on investment. Examples of community clouds include Google's "Gov Cloud".[49]

Hybrid cloud and hybrid IT delivery


The main responsibility of the IT department is to deliver services to the business. With the proliferation of
cloud computing (both private and public) and the fact that IT departments must also deliver services via
traditional, in-house methods, the newest catch-phrase has become “hybrid cloud computing.”[50] Hybrid
cloud is also called hybrid delivery by major vendors, including HP, IBM, Oracle and VMware, who
offer technology to manage the performance, security and privacy concerns that result from the mixed
delivery methods of IT services.[51]

A hybrid storage cloud uses a combination of public and private storage clouds. Hybrid storage clouds
are often useful for archiving and backup functions, allowing local data to be replicated to a public cloud.
[52]
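
The archive/backup pattern just described might look like the following sketch, where the upload function is a placeholder for whichever API the chosen public storage provider actually exposes.

```python
# Sketch of hybrid storage: local (private) data is replicated to a public
# storage cloud for archiving/backup. The upload function is a stand-in for a
# real provider's object-storage API.
from pathlib import Path

def upload_to_public_cloud(name: str, data: bytes) -> None:
    """Placeholder for a real provider's object-storage upload call."""
    print(f"replicated {name} ({len(data)} bytes) to the public cloud")

def replicate_directory(local_dir: str) -> None:
    """Copy every file in a local directory to the public storage cloud."""
    for path in Path(local_dir).rglob("*"):
        if path.is_file():
            upload_to_public_cloud(path.name, path.read_bytes())

# replicate_directory("/srv/backups")  # local data stays in use; the copy in
#                                      # the public cloud serves as the archive
```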

Another perspective on deploying a web application in the cloud is using Hybrid Web Hosting, where the
hosting infrastructure is a mix between cloud hosting and managed dedicated servers – this is most
commonly achieved as part of a web cluster in which some of the nodes are running on real physical
hardware and some are running on cloud server instances.[citation needed]

Combined cloud
Two clouds that have been joined together are more correctly called a "combined cloud". A combined
cloud environment consisting of multiple internal and/or external providers[53] "will be typical for most
enterprises".[54] By integrating multiple cloud services users may be able to ease the transition to public
cloud services while avoiding issues such as PCI compliance.[55]

Private cloud
Douglas Parkhill first described the concept of a "private computer utility" in his 1966 book The Challenge
of the Computer Utility. The idea was based upon direct comparison with other industries (e.g. the
electricity industry) and the extensive use of hybrid supply models to balance and mitigate risks.

"Private cloud" and "internal cloud" have been described as neologisms, but the concepts themselves
pre-date the term cloud by 40 years. Even within modern utility industries, hybrid models still exist despite
the formation of reasonably well-functioning markets and the ability to combine multiple providers.

Some vendors have used the terms to describe offerings that emulate cloud computing on private
networks. These (typically virtualization automation) products offer the ability to host applications or
virtual machines in a company's own set of hosts. These provide the benefits of utility computing – shared
hardware costs, the ability to recover from failure, and the ability to scale up or down depending upon
demand.

Private clouds have attracted criticism because users "still have to buy, build, and manage them" and
thus do not benefit from lower up-front capital costs and less hands-on management,[54] essentially
"[lacking] the economic model that makes cloud computing such an intriguing concept".[56][57] Enterprise IT
organizations use their own private cloud(s) for mission-critical and other operational systems to protect
critical infrastructures.[58] Therefore, for all intents and purposes, "private clouds" are not an
implementation of cloud computing at all, but are in fact an implementation of a technology subset: the
basic concept of virtualized computing.

Cloud engineering
Main article: Cloud engineering

Cloud engineering is the application of a systematic, disciplined, quantifiable, and interdisciplinary
approach to the ideation, conceptualization, development, operation, and maintenance of cloud
computing, as well as the study and applied research of the approach, i.e., the application of engineering
to the cloud. It is a maturing and evolving discipline to facilitate the adoption, strategization,
operationalization, industrialization, standardization, productization, commoditization, and governance of
cloud solutions, leading towards a cloud ecosystem.[further explanation needed] Cloud engineering is also known as
cloud service engineering.

Cloud storage
Main article: Cloud storage

See also: Cloud storage gateway

Cloud storage is a model of networked computer data storage where data is stored on multiple virtual
servers, generally hosted by third parties, rather than being hosted on dedicated servers. Hosting
companies operate large data centers, and people who require their data to be hosted buy or lease
storage capacity from them and use it for their storage needs. The data center operators, in the
background, virtualize the resources according to the requirements of the customer and expose them
as virtual servers, which the customers can themselves manage. Physically, the resource may span
multiple servers.
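
A toy model of this arrangement is sketched below: the customer works against one logical store while the operator spreads the data across several servers behind the scenes. The server names and the hash-based placement rule are assumptions for illustration only.

```python
# Toy model of cloud storage: one logical namespace, physically spread over
# multiple servers by the operator. Server names and placement rule are invented.
import hashlib

SERVERS = ["storage-node-1", "storage-node-2", "storage-node-3"]  # hypothetical

class VirtualStore:
    """One logical store physically spread over multiple servers."""

    def __init__(self):
        self._nodes = {name: {} for name in SERVERS}

    def _place(self, key: str) -> str:
        # Simple hash-based placement; real operators use far richer schemes.
        digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
        return SERVERS[digest % len(SERVERS)]

    def put(self, key: str, value: bytes) -> None:
        self._nodes[self._place(key)][key] = value

    def get(self, key: str) -> bytes:
        return self._nodes[self._place(key)][key]

store = VirtualStore()
store.put("photos/holiday.jpg", b"...")   # the customer sees one logical store
print(store.get("photos/holiday.jpg"))    # retrieval is routed to the right node
```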

The Intercloud
Main article: Intercloud

The Intercloud[59] is an interconnected global "cloud of clouds"[60][61] and an extension of
the Internet "network of networks" on which it is based.[62] The term was first used in the context of cloud
computing in 2007 when Kevin Kelly stated that "eventually we'll have the intercloud, the cloud of clouds.
This Intercloud will have the dimensions of one machine comprising all servers and
attendant cloudbooks on the planet."[60] It became popular in 2009[63] and has also been used to describe
the datacenter of the future.[64]
The Intercloud scenario is based on the key concept that each single cloud does not have infinite physical
resources. If a cloud saturates the computational and storage resources of its virtualization infrastructure,
it would not be able to satisfy further requests for service allocations sent from its clients. The Intercloud
scenario aims to address such situations; in theory, each cloud can use the computational and
storage resources of the virtualization infrastructures of other clouds. Such a form of pay-for-use may
introduce new business opportunities among cloud providers, if they manage to move beyond the theoretical
framework. Nevertheless, the Intercloud raises many more challenges than solutions concerning cloud
federation, security, interoperability, quality of service, vendor lock-in, trust, legal issues, monitoring
and billing.[citation needed]
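
The saturation-and-delegation idea described above can be sketched as follows; the capacities and the federation rule are invented, and real inter-cloud federation would also have to address the security, billing and interoperability concerns just listed.

```python
# Sketch of the Intercloud idea: a saturated cloud borrows capacity from peer
# clouds instead of refusing the request. Capacities and the rule are invented.
class Cloud:
    def __init__(self, name: str, capacity: int, peers=None):
        self.name = name
        self.capacity = capacity
        self.in_use = 0
        self.peers = peers or []          # other clouds it is federated with

    def allocate(self, units: int) -> str:
        """Serve locally if possible, otherwise delegate to a federated peer."""
        if self.in_use + units <= self.capacity:
            self.in_use += units
            return f"allocated on {self.name}"
        for peer in self.peers:
            if peer.in_use + units <= peer.capacity:
                peer.in_use += units
                return f"{self.name} saturated; allocated on peer {peer.name}"
        return "no capacity anywhere in the federation"

partner = Cloud("cloud-b", capacity=50)
home = Cloud("cloud-a", capacity=10, peers=[partner])
print(home.allocate(8))    # fits locally
print(home.allocate(8))    # overflows to the peer cloud
```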

The concept of a competitive utility computing market which combined many computer utilities together
was originally described by Douglas Parkhill in his 1966 book, the "Challenge of the Computer Utility".
This concept has been subsequently used many times over the last 40 years and is identical to the
Intercloud.

Issues

Privacy

The cloud model has been criticized by privacy advocates for the greater ease with which the companies
hosting the cloud services control, and thus can monitor at will, lawfully or unlawfully, the communication
and data stored between the user and the host company. Instances such as the secret NSA program that,
working with AT&T and Verizon, recorded over 10 million phone calls between American citizens cause
uncertainty among privacy advocates about the greater powers such arrangements give to telecommunication
companies to monitor user activity.[65] While there have been efforts (such as US-EU Safe Harbor) to
"harmonize" the legal environment, providers such as Amazon still cater to major markets (typically the
United States and the European Union) by deploying local infrastructure and allowing customers to select
"availability zones."[66]

Compliance

In order to obtain compliance with regulations including FISMA, HIPAA and SOX in the United States,
the Data Protection Directive in the EU and the credit card industry's PCI DSS, users may have to
adopt community or hybrid deployment modes which are typically more expensive and may offer
restricted benefits. This is how Google is able to "manage and meet additional government policy
requirements beyond FISMA"[67][68] and Rackspace Cloud are able to claim PCI compliance.[69] Customers
in the EU contracting with cloud providers established outside the EU/EEA have to adhere to the EU
regulations on export of personal data.[70]
Many providers also obtain SAS 70 Type II certification (e.g. Amazon,[71] Salesforce.com,[72] Google[73] and
Microsoft[74]), but this has been criticised on the grounds that the hand-picked set of goals and standards
determined by the auditor and the auditee are often not disclosed and can vary widely.[75] Providers
typically make this information available on request, under non-disclosure agreement.[76]

Legal

In March 2007, Dell applied to trademark the term "cloud computing" (U.S. Trademark 77,139,082) in the
United States. The "Notice of Allowance" the company received in July 2008 was canceled in August,
resulting in a formal rejection of the trademark application less than a week later. Since 2007, the number
of trademark filings covering cloud computing brands, goods and services has increased rapidly. As
companies sought to better position themselves for cloud computing branding and marketing efforts,
cloud computing trademark filings increased by 483% between 2008 and 2009. In 2009, 116 cloud
computing trademarks were filed, and trademark analysts predict that over 500 such marks could be filed
during 2010.[77]

Other legal cases may shape the use of cloud computing by the public sector. On October 29, 2010,
Google filed a lawsuit against the U.S. Department of Interior, which opened up a bid for software that
required that bidders use Microsoft's Business Productivity Online Suite. Google sued, calling the
requirement "unduly restrictive of competition."[78] Scholars have pointed out that, beginning in 2005, the
prevalence of open standards and open source may have an impact on the way that public entities
choose to select vendors.[79]

Open source
Open source software has provided the foundation for many cloud computing implementations, one
prominent example being the Hadoop framework.[80] In November 2007, the Free Software
Foundation released the Affero General Public License, a version of GPLv3 intended to close a
perceived legal loophole associated with free software designed to be run over a network.[81] There are
many open source platform offerings, including AppScale, CloudFoundry, OpenShift, and Heroku.

Open standards
See also: Category:Cloud standards

Most cloud providers expose APIs which are typically well-documented (often under a Creative
Commons license[82]) but also unique to their implementation and thus not interoperable. Some vendors
have adopted others' APIs[83] and there are a number of open standards under development, including
the OGF's Open Cloud Computing Interface. The Open Cloud Consortium (OCC)[84] is working to develop
consensus on early cloud computing standards and practices.

Security
Main article: Cloud computing security

The relative security of cloud computing services is a contentious issue that may be delaying its
adoption.[85] Issues barring the adoption of cloud computing are due in large part to the private and public
sectors' unease surrounding the external management of security-based services. It is the very nature of
cloud computing services, private or public, that they promote external management of the services
provided. This gives cloud computing service providers a strong incentive to prioritize building and
maintaining strong management of secure services.[86]

Organizations have been formed to provide standards for a better future in cloud computing
services. One organization in particular, the Cloud Security Alliance, is a non-profit organization formed to
promote the use of best practices for providing security assurance within cloud computing.[87]

Availability and performance


In addition to concerns about security, businesses are also worried about acceptable levels of availability
and performance of applications hosted in the cloud.[88]

There are also concerns about a cloud provider shutting down for financial or legal reasons, which has
happened in a number of cases.[89]

Sustainability and siting


Although cloud computing is often assumed to be a form of "green computing", there is as yet no
published study to substantiate this assumption.[90] The siting of the servers affects the environmental
impact of cloud computing. In areas where the climate favors natural cooling and renewable electricity is
readily available, the environmental effects will be more moderate. Thus countries with favorable conditions,
such as Finland,[91] Sweden and Switzerland,[92] are trying to attract cloud computing data centers.

SmartBay, a marine research infrastructure of sensors and computational technology, is being developed
using cloud computing, an emerging approach to shared infrastructure in which large pools of systems
are linked together to provide IT services.[93]

Use by Hackers
As with privately purchased hardware, hackers posing as legitimate customers can purchase the
services of cloud computing for nefarious purposes, such as password cracking and launching
attacks.[94] In 2009, a banking trojan illegally used the popular Amazon service as a command
and control channel that issued software updates and malicious instructions to PCs that were infected by
the malware.[95]

Research
Many universities, vendors and government organizations are investing in research around the topic of
cloud computing.[96][97]

Joint government, academic and vendor collaborative research projects include the IBM/Google
Academic Cloud Computing Initiative (ACCI). In October 2007 IBM and Google announced the multi-
university project designed to enhance students' technical knowledge to address the challenges of cloud
computing.[98] In April 2009, the National Science Foundation joined the ACCI and awarded grants
to 14 academic institutions.[99]

In July 2008, HP, Intel Corporation and Yahoo! announced the creation of a global, multi-data center,
open source test bed, called Open Cirrus,[100] designed to encourage research into all aspects of cloud
computing, service and data center management.[101] Open Cirrus partners include the NSF, the
University of Illinois (UIUC), Karlsruhe Institute of Technology, the Infocomm Development Authority
(IDA) of Singapore, the Electronics and Telecommunications Research Institute (ETRI) in Korea, the
Malaysian Institute for Microelectronic Systems (MIMOS), and the Institute for System Programming at the
Russian Academy of Sciences (ISPRAS).[102] In September 2010, more researchers joined the HP/Intel/Yahoo
Open Cirrus project for cloud computing research. The new researchers are China Mobile Research
Institute (CMRI), Spain's Supercomputing Center of Galicia (CESGA by its Spanish acronym), Georgia
Tech's Center for Experimental Research in Computer Systems (CERCS) and China Telecom.[103][104]

In July 2010, HP Labs India announced a new cloud-based technology designed to simplify taking
content and making it mobile-enabled, even from low-end devices.[105] Called SiteonMobile, the new
technology is designed for emerging markets where people are more likely to access the internet via
mobile phones rather than computers.[106] In November 2010, HP formally opened its Government Cloud
Theatre, located at the HP Labs site in Bristol, England.[107] The demonstration facility highlights high-
security, highly flexible cloud computing based on intellectual property developed at HP Labs. The aim of
the facility is to lessen fears about the security of the cloud.[108] HP Labs Bristol is HP’s second-largest
central research location and currently is responsible for researching cloud computing and security.[109]

The IEEE Technical Committee on Services Computing[110] in the IEEE Computer Society sponsors the IEEE
International Conference on Cloud Computing (CLOUD).[111] CLOUD 2010 was held on July 5–10, 2010 in
Miami, Florida.

On March 23, 2011, Google, Microsoft, HP, Yahoo, Verizon, Deutsche Telekom and 17 other companies
formed a nonprofit organization called Open Networking Foundation, focused on providing support for a
new cloud initiative called Software-Defined Networking.[112] The initiative is meant to speed innovation
through simple software changes in telecommunications networks, wireless networks, data centers and
other networking areas.[113]
Criticism of the term
Some have come to criticize the term as being either too unspecific or even misleading. CEO Larry
Ellison of Oracle Corporation asserts that cloud computing is "everything that we already do", claiming
that the company could simply "change the wording on some of our ads" to deploy their cloud-based
services.[114][115][116][117][118] Forrester Research VP Frank Gillett questions the very nature of and motivation
behind the push for cloud computing, describing what he calls "cloud washing"—companies simply
relabeling their products as "cloud computing", resulting in mere marketing innovation instead of "real"
innovation.[119][120] GNU's Richard Stallman insists that the industry will only use the model to deliver
services at ever increasing rates over proprietary systems, otherwise likening it to a "marketing hype
campaign".[121]
