Cloud computing
Cloud computing refers to accessing computing resources that are typically owned and operated by a third-party provider on a consolidated basis in data center locations. It aims to deliver cost-effective computing power over the Internet, or over virtual private networks (VPNs) or even virtual private line networks (i.e., Layer 2 VPNs) mapped onto facility providers' infrastructure. Consumers of cloud computing services purchase computing capacity on demand and are generally not concerned with the underlying technologies used to achieve the increase in server capability.
In terms of the problem it solves, it is less new technology and more "a new deployment model." [1]
It has similarities to a number of network-enabled computing methods, but some unique properties of its own. The core point is that users, whether end users or programmers, request resources without knowing the location of those resources, and are not obliged to maintain them. The resource may be anything from an application programming interface, to a virtual machine on which the customer writes an application, to Software as a Service (SaaS), where the application is predefined and the customer can parameterize but not program. Free services such as Google's and Yahoo's are one type of SaaS, while some well-defined business applications, such as the customer relationship management offered by Salesforce.com, are among the most successful paid SaaS applications. PayPal and eBay are arguably SaaS models, paid, at the low end, on a per-transaction basis.
"What goes on in the cloud manages multiple infrastructures across multiple organizations and consists of one or more frameworks overlaid on top of the infrastructures tying them together. Frameworks provide mechanisms for:
- self-healing
- self-monitoring
- resource registration and discovery
- service level agreement definitions
- automatic reconfiguration
"The cloud is a virtualization of resources that maintains and manages itself. There are of course people resources to keep hardware, operation systems and networking in proper order. But from the perspective of a user or application developer only the cloud is referenced[2]
It is by no means a new concept in computing. Bruce Schneier reminds us that it has distinct similarities in the processing model, although not the communications model, with the timesharing services of the 1960s, which were made obsolete by personal computers: "Any IT outsourcing -- network infrastructure, security monitoring, remote hosting -- is a form of cloud computing."
The old timesharing model arose because computers were expensive and hard to maintain. Modern computers and networks are drastically cheaper, but they're still hard to maintain. As networks have become faster, it is again easier to have someone else do the hard work. Computing has become more of a utility; users are more concerned with results than technical details, so the tech fades into the background.[3]
Trust and Security
But what about security? Schneier, in an essay that originally appeared in The Guardian, asks whether it is more dangerous to have your email on Hotmail's servers, your spreadsheets on Google's, your personal conversations on Facebook's, and your company's sales prospects on Salesforce.com's.
You don't want your critical data to be on some cloud computer that abruptly disappears because its owner goes bankrupt. You don't want the company you're using to be sold to your direct competitor. You don't want the company to cut corners, without warning, because times are tight, or to raise its prices and then refuse to let you have your data back. These things can happen with software vendors, but the results aren't as drastic.
I have always said mainframes are like cats: they may deign to give you a small amount of their time, but it's on their terms and you have to be satisfied with it. PCs, on the other hand, are like dogs: their entire attention is focused on you, and you call the shots. SaaS is similar to the "network is king" mentality in that it converts PCs back into cats. You no longer have the ability (or authority) to customize your desktop exactly the way you want; you have to use the applications the IT department provides, rather than ones that meet your requirements; they keep adding more overhead software; and when they fail, it's your problem, not theirs.
There is no single industry-accepted definition.[4] Some services broker extra capacity available on enterprise servers, as well as resources in pools of managed virtual servers. Others sell capacity on virtual servers. Yet others include any external computing resource, even outsourced backup services, within the definition.
The applications of cloud/utility computing models are expanding rapidly as connectivity costs fall, and as computing hardware becomes more efficient at operating at scale. The economic incentives to share hardware among multiple users are increasing; the drawbacks in performance and interactive response that used to discourage remote and distributed computing solutions are being greatly reduced. As a result, the services that can be delivered from the cloud have expanded past web applications to include storage, raw computing, or access to any number of specialized services.
Offerings
While the details of the service vary, some common features related to sizing and capacity apply:
- Separation of application code from physical resources.
- Ability to use external assets to handle peak loads (not having to engineer for highest possible load levels).
- Not having to purchase assets for one-time or infrequent intensive computing tasks.
Virtual machine services
Amazon Elastic Compute Cloud (Amazon EC2) is a cloud offering on which customer developers write applications for a wide assortment of virtual machines, which the customer builds from choices among operating systems, databases, web servers, etc.[5]
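As a rough sketch of how a customer might start such a virtual machine programmatically, the example below uses the boto3 Python SDK (not mentioned in the cited source); the AMI ID is a placeholder, and AWS credentials and a default region are assumed to be configured separately.

```python
# Sketch: launch one small virtual machine on Amazon EC2 with boto3.
# The AMI ID below is a placeholder, not a real image; credentials and
# region are assumed to be configured outside this script.
import boto3

ec2 = boto3.resource("ec2")
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t2.micro",           # small, inexpensive instance type
    MinCount=1,
    MaxCount=1,
)
print("Launched:", instances[0].id)
```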
Java services also are offered in clouds, but there is differentiation among the offerings. Nikita Ivanov describes two basic approaches, which are not mutually exclusive, although different products tend to have one or the other dominate.[6] The first is much like the way a traditional data center is organized, where the developers have little control over infrastructure. "The second approach is something new and evolving as we speak. It aims to dissolve the boundaries between a local workstation and the cloud (internal or external) by providing relative location transparency, so that developers write their code, build and run it in exactly the same way whether it is done on a local workstation, on a cloud thousands of miles away, or on both."
- Heavy UI oriented. These applications or frameworks usually provide UI-based consoles, management applications, plugins, and the like, which are the only way to manage resources on the cloud, such as starting and stopping an image. The key characteristic of this approach is that it requires substantial user input and human interaction, so these offerings tend to be less dynamic and less on-demand. Good examples would be RightScale, GigaSpaces, and ElasticGrid.
- Heavy framework oriented. This approach strongly emphasizes dynamic resource management on the cloud. Its key characteristic is that it requires no human interaction: all resource management can be done programmatically by the grid/cloud middleware, making it more dynamic, automated, and truly on-demand. Google App Engine (for Python) and GridGain are good examples (see the sketch after this list).
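To make the contrast concrete, here is a minimal request handler of the kind written for Google App Engine's original Python runtime (drawn from the general shape of the historical webapp framework, not from the cited article); the developer supplies only application code, and the platform provisions, runs, and scales the instances behind it.

```python
# Minimal App Engine (original Python runtime) handler: the developer writes
# only this; App Engine decides where, and on how many machines, it runs.
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    def get(self):
        self.response.headers['Content-Type'] = 'text/plain'
        self.response.out.write('Hello from the cloud')

application = webapp.WSGIApplication([('/', MainPage)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()
```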
SaaS
- Salesforce
- Medical records
Quite a few business services are really SaaS, such as PayPal and eBay; they are services that facilitate transactions between users. The creation of various credit card and check payment features is an example of how SaaS can be customized without programming.
Compute-intensive services
Some cloud computing applications try to replicate compute-intensive supercomputer applications using highly distributed parallel processing.[7]
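The basic pattern is to split a job into independent chunks and process them in parallel. The toy sketch below does this with local worker processes standing in for the many virtual machines a cloud framework would actually use; the workload and all numbers are arbitrary.

```python
# Toy stand-in for distributed parallel processing: a Monte Carlo estimate
# of pi split into independent chunks. A cloud framework would farm the
# chunks out to many virtual machines; here local processes play that role.
import random
from multiprocessing import Pool

def count_hits(samples):
    """Count random points that land inside the unit quarter-circle."""
    hits = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

if __name__ == "__main__":
    chunks = [250_000] * 8                   # eight independent work units
    with Pool(processes=8) as pool:          # workers standing in for cloud nodes
        total_hits = sum(pool.map(count_hits, chunks))
    print("pi is roughly", 4.0 * total_hits / sum(chunks))
```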
Cloud storage
Cloud storage is a model of networked data storage where data is stored on multiple virtual servers, generally hosted by third parties, rather than on dedicated servers.[8] There are cloud storage services for the small and home office market, such as Carbonite (backup)[9] and MozyHome (backup).
Even the consumer services differ, ranging from a "flash drive in the cloud" to offerings with encryption, incremental backups, data sharing, etc.
There are high-end massive data backups, as well as federated databases.
Hosting companies operate large data centers, and people who require their data to be hosted buy or lease storage capacity from them. The data center operators, in the background, virtualize the resources according to the requirements of the customer and expose them as virtual servers, which the customers can themselves manage. Physically, the resource may span multiple servers.
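From the customer's side, such storage usually looks like a simple put/get interface with no visible physical location. The sketch below assumes Amazon S3 and the boto3 SDK (neither is named in this section) and uses a hypothetical bucket and key.

```python
# Sketch: store and retrieve an object in cloud storage (Amazon S3 via boto3).
# The bucket and key names are hypothetical; where the bytes physically live
# is the provider's concern, not the customer's.
import boto3

s3 = boto3.client("s3")
s3.put_object(Bucket="example-backup-bucket",
              Key="reports/2009-q3.csv",
              Body=b"date,revenue\n2009-07-01,1234\n")

obj = s3.get_object(Bucket="example-backup-bucket", Key="reports/2009-q3.csv")
print(obj["Body"].read().decode())
```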
Underlying architecture
Internally, cloud computing almost always uses several kinds of virtualization. The application software runs on virtual machines, which can migrate among colocated or networked physical processors. The virtual machines may be deliberately machine- and OS-independent (e.g., the Java virtual machine), or virtualized operating system instances.
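As an illustration of the migration step (a sketch under assumed tooling, not a description of any particular provider's internals), the libvirt Python bindings can live-migrate a guest between two hypervisors; the host and guest names below are hypothetical.

```python
# Sketch: live-migrate a running virtual machine from one physical host to
# another using the libvirt Python bindings. Host and guest names are
# hypothetical; real clouds trigger this kind of operation automatically.
import libvirt

src = libvirt.open("qemu+ssh://host-a.example.com/system")   # source hypervisor
dst = libvirt.open("qemu+ssh://host-b.example.com/system")   # destination hypervisor

guest = src.lookupByName("tenant-vm-42")                     # hypothetical guest name
guest.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)  # move it while it runs

src.close()
dst.close()
```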
For reasons of commercial reliability, however, the resources will rarely be consumer-grade PCs, either from a machine resource or a form factor viewpoint. Disks, for example, are apt to use Redundant Array of Inexpensive Disks (RAID) technology for fault protection. Blade servers, or at least rack-mounted server chassis, will be used to decrease data center floor space and, often, cooling and power distribution complexity. These details are hidden from the application user.
The cloud provider can place infrastructure in geographic areas that have reduced costs of land, electricity, and cooling. While Google's developers may be in Silicon Valley, the data centers are in rural areas further north.
Other advantages of the model include:
- Sharing of peak-load capacity among a large pool of users, improving overall utilization.
- Separation of infrastructure maintenance duties from domain-specific application development.
- Ability to scale to meet changing user demands quickly, usually within minutes (a toy version of the scaling decision is sketched below).
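The sketch below is illustrative only; the thresholds and the utilization metric are hypothetical, and a real deployment would feed in monitoring data and call a provider's provisioning API with the result.

```python
# Toy scaling rule: grow or shrink the pool of virtual servers to track load.
# Thresholds and the CPU metric are hypothetical placeholders.
def desired_instances(current, avg_cpu_percent, low=30.0, high=70.0):
    """Return how many instances we want, given average CPU utilization."""
    if avg_cpu_percent > high:
        return current + 1              # scale out under heavy load
    if avg_cpu_percent < low and current > 1:
        return current - 1              # scale in when capacity sits idle
    return current

print(desired_instances(current=4, avg_cpu_percent=85))   # -> 5
print(desired_instances(current=4, avg_cpu_percent=20))   # -> 3
```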
There are two different types of cloud computing customers. The first pays at most a nominal fee for these services, or uses them for free in exchange for ads: e.g., Gmail and Facebook. These customers have no leverage with their outsourcers: they can lose everything, and companies like Google and Amazon won't spend a lot of time caring. The second type of customer pays considerably for these services: to Salesforce.com, MessageLabs, managed network companies, and so on. These customers have more leverage, provided they write their service contracts correctly. Still, nothing is guaranteed.
In either case, the provider runs customers' applications on large networked groups of servers, often built from low-cost PC technology, with specialized connections that spread data-processing chores across them.
References
- ↑ V. Bertocci (April 2008), Cloud Computing and Identity, MSDN
- ↑ Kevin Hartig (15 April 2009), "What is Cloud Computing?", Cloud Computing Journal
- ↑ "Cloud Computing", Schneier on Security, June 4, 2009
- ↑ Eric Knorr, Galen Gruman (7 April 2008), What cloud computing really means
- ↑ Amazon Elastic Compute Cloud (Amazon EC2), Amazon.com
- ↑ Nikita Ivanov, Java Cloud Computing - Two Approaches, GridGain Computing Platform
- ↑ Aaron Ricadela (16 November 2007), "Computing Heads for Clouds", Business Week
- ↑ Lucas Mearian (13 July 2009), "Consumers find rich array of cloud storage options: Which online service is right for you?", Computerworld
- ↑ About Carbonite, Carbonite Computer Company