Migrating interactive media and web applications to the private cloud

TRANSCRIPT

By Dave Ohara
This research was underwritten by Equinix.

GigaOM Pro (pro.gigaom.com)
© 2011 GigaOM. All rights reserved.

CLOUD COMPUTING
December 2011
Table of contents

EXECUTIVE SUMMARY
INTRODUCTION
    Audience
LIFE AFTER THE PUBLIC CLOUD
    Growing pains
    Using cloud metrics
UNDERSTANDING PRIVATE CLOUD TECHNOLOGIES
    Enabling technologies
    A new way of thinking about data center design
CREATING A MIGRATION STRATEGY
    Finding the geographic location
        Performance considerations
        Other considerations
    Finding a data center
        Building or buying a dedicated space
        Leasing a dedicated space
        Using co-location
            Carrier-specific and carrier-neutral providers
        Using a hosting service
    Creating a bid
MAKING HARDWARE DECISIONS
THE IMPORTANCE OF ENERGY CONSUMPTION
    Planning rack density
    Planning and flexibility
PLANNING STORAGE
EVALUATING YOUR APPLICATIONS
IMPLEMENTING THE SOFTWARE LAYERS
    Virtualization software
    Management software
IT PERSONNEL
ZYNGA: A CASE STUDY
    Zynga and AWS
    The Zynga private cloud
    What's next
    More information
PERSPECTIVES ON THE FUTURE
KEY TAKEAWAYS
ABOUT DAVE OHARA
ABOUT GIGAOM PRO
FURTHER READING
Executive summary

Web content that relies on interactivity, social networking and personalization is
becoming the dominant form, but it puts particular demands on the network, since it
requires a low-latency environment that can provide sites that respond quickly to user
input. Content delivery networks (CDNs), which are designed to deliver large amounts
of static content, may be too slow to provide this environment. Even public clouds may
not be adequate: They provide a generic network that cannot be tuned to the demands
of a particular application. Outages are also a problem.
So if your company has a cloud application with a predictable audience size or one that
is costing you more than $25,000 a month to host, you may want to consider
maintaining a private cloud. It is important to remember that using a private cloud
does not preclude also using the public cloud. There is a spectrum of possibilities,
including using a hybrid solution.
This paper is an overview of the factors that decision makers who are developing a
public-to-private cloud-migration strategy should consider, recognizing that the public
versus private cloud strategy is not an all-or-nothing proposition. There is
considerable flexibility along a spectrum of implementation choices. The paper
describes some pitfalls that must be avoided along the way, and it provides a case
study of Zynga, a company that has found a way to use both the private and public
clouds to create a hybrid solution.
Introduction

The word on the street is that the smart money is now in private and hybrid clouds.
Despite ubiquitous marketing messages that promote the public cloud, private and
hybrid clouds are currently growing, in terms of percentages, at a faster rate.
The changing nature of media delivery and web applications is the primary motivator
behind this trend. These applications are becoming increasingly interactive,
personalized and tied to social networking sites. These developments are, in turn,
driving networking trends such as application-specific peering and the use of edge
networks for application logic. The requirements of these new types of applications can
turn the one-size-fits-all approach of the public cloud and content delivery networks
into a competitive disadvantage. Consequently, many growing media and content
companies now find that the increased flexibility of privately managed, vertically
integrated data centers makes more sense than public cloud services. For these
companies, private cloud and hybrid cloud computing provide opportunities for
competitive advantage through reduced latency, increased throughput and lower
ongoing operating expenses.
Audience

This paper is intended for executives — both at startups and mid- to large-sized
enterprises — who determine their organization’s business strategies. Developing a
private or hybrid cloud involves the expertise of many disciplines, from finance to
facilities management to software and network architecture. Readers are not assumed
to be experts in any of these areas. In fact, one of the motivations behind this paper is
to introduce relevant concepts across disciplines. For example, experts in site selection
will need to coordinate with network architects. Facilities managers will need to
understand at a high level what the software and network architects are saying, and
vice versa.
Also, it is not necessary to be actively planning a migration to a private cloud to benefit
from this paper. Even if migrating to a private or hybrid cloud is not the right choice
for you now, the research here can help you to anticipate the changes that growth
brings and to understand the forces that are influencing current data center trends.
Although it is focused on media delivery and web applications such as retail
operations, the principles discussed have wider application, too.
Life after the public cloud

In less than a decade, cloud computing has come to dominate how media and web
applications are delivered, and there are good reasons for this success. Cloud
computing, roughly speaking, can be said to have four main benefits:
The cloud provides on-demand allocation of resources, which gives system
developers the perception of infinite resources for data storage, computation and
network bandwidth. With cloud computing, capacity planning becomes a
nonissue for developers, who never need to worry about their application
outgrowing the fixed capabilities of any particular data center. The on-demand
allocation of resources greatly shortens the time needed for deployment of a new
application. There’s no ramp-up time for the data center itself.
The cloud provides a self-service approach for administrators to manage
resources. For example, there might be a web portal for requesting additional
compute instances. Resources can also be managed automatically by software.
Self-service management reduces labor costs and is operationally efficient.
The cloud provides scalable resource usage. Applications can run on a single
virtual server instance or thousands of servers that are distributed across the
globe. Scaling from a few servers to thousands can be accomplished in minutes
or hours, not weeks or months. The ability to scale up quickly is one of the most
important cloud benefits for new applications.
The cloud provides measurable resource utilization. For the public cloud, this
means that you pay as you go based on what you use. The pay-as-you-go model
helps conserve valuable startup capital, and it gives visibility to operational costs.
(For private clouds, resource measurement may take the form of
interdepartmental accounting chargebacks or monthly reports.)
It is therefore no surprise that many startups use the public cloud services offered by
companies such as Amazon, Rackspace and SoftLayer to present their initial product
offerings. Letting someone else take responsibility for networking, storage and
computing resources allows a product team to concentrate on tasks that make its
business grow. This is the promise of Infrastructure-as-a-Service offerings, and it
works.
Growing pains

It is also true, however, that as products stabilize and audiences grow, the value
proposition of the public cloud can become less attractive for an application. Questions
of performance, control and cost come to the fore. What worked for the startup phase
of a company may not be ideal as it matures, as is the case with Zynga, whom we
profile later in this report.
These issues tend to come into play once an application has reached the point that its
operational expenditures for cloud services exceed $25,000 per month. By the time
cloud services are costing $100,000 per month, these issues may become pressing.
To understand this, it is helpful to look at an example.
Web-based applications that deliver static content such as video and software
downloads often use commercial content delivery networks. CDNs employ hierarchical
replication and distribution to reduce the distance that data must travel when a user of
the service requests a file. Files can be served from the network’s edge locations, which
are in proximity to end users. Akamai and Limelight Networks are well-known CDN
providers, although public cloud providers like Amazon and Microsoft, as well as
transit providers like Level 3, are also entering the CDN business. Even telcos have
started to develop CDN offerings for their networks. The following diagram shows the
operation of a traditional CDN.
Figure 1: media distributed to edge networks by a traditional CDN
Source: Dave Ohara/GreenM3
The role of the CDN is being reshaped by the trend toward more-interactive,
media-based applications, such as the web-based game FarmVille, which attracts
more users when latency is low.
Traditional CDN design is intended for static media such as large video files.
Interactive and dynamically produced or personalized media reduce the usefulness of
the original hierarchical CDN distribution approach. An alternative to this approach is
to move some of the application logic to the edge network in addition to media files.
For example, you might build a tiered application structure where some processing is
performed on edge servers that forward certain requests to a more centralized data
center, as seen below in Figure 2.
Figure 2: media and application logic distributed to edge networks
Source: Dave Ohara/GreenM3
This example shows that network design and application design are often
interconnected for applications that are large enough to merit the additional
investment in this kind of customization. In other words, when an application reaches
some level of maturity, the organization is likely to have the resources to invest in a
network and application structure that is optimized for particular attributes of that
application. Different applications compete with one another based on the user’s
perception of performance; studies have shown that web applications for retail sales
produce progressively less revenue as latency increases. For mature applications that
have numerous competitors and predictable capacity requirements, the one-size-fits-
all approach of a public cloud provider (even with the additional support of a
traditional CDN) may fail to produce an interactive application with competitive
performance.
Reducing network latency by integrating application and network design is one of the
reasons you might want to consider migrating to a private or hybrid cloud. In the case
described above, the application and the network need to be modified to remedy the
performance bottleneck. It’s a case where system architects need more control over the
environment than what would be offered by a public cloud provider. Such a
modification would not be possible for an application that is hosted by a cloud service
provider, since the cloud service provider handles all details of the network design.
Note: Although this paper compares public and private clouds, it is also useful to
briefly compare the private cloud approach to traditional IT. One of the interesting
consequences of building applications in the cloud is that the development and
operations teams work together more closely than in traditional IT. There are a
number of benefits to the integration of development and operations, including
timely feedback on the efficacy of decisions. With a private or hybrid cloud, changes
to the application and network structure will quickly be visible, as there will be an
increase or decrease in performance, revenue and traffic. In traditional IT,
efficiencies provided by changes to the code or the IT configuration are difficult to
identify, because they are executed by different teams and do not occur
simultaneously. For example, the deployment of additional data center capacity may
happen months after an application change is made. As a result of this lack of agility,
servers in traditional IT centers often are over-provisioned and are not configured
for the specific needs of the applications they run. Traditional IT focuses on a
centralized data center and tends to neglect the distributed nature of large-scale
Internet applications. Private and hybrid clouds, in contrast, are fully distributed
computing platforms that include network design as an integral component.
Also, it is still unclear how the existing CDN players will adapt to the trend of
moving applications to edge networks. In general, the more interactive or personalized
your media distribution is for users, the less value a traditional CDN will offer. New
service providers, such as Cotendo, have entered the market to focus on the problem
of content distribution in an age of highly interactive applications.
Relying on an out-of-the-box data center and network isn’t always what an application
needs. Examples of such a scenario include:
Your system architects realize that the public cloud is “generic.” A retail business
and a gaming business use the same public cloud even though they have very
different networking requirements. An architect can create a data center that is
tailored to your specific needs. You may even benefit from using different servers
and storage devices than are supplied by your public cloud provider.
Your websites aren’t as responsive as you would like. Slow response times
translate into lost revenue, because customers become frustrated and go to a
competitor’s site.
You rely on your cloud service provider for continuous service, but you have
experienced downtime when your sites aren’t available. This is an obvious source
of lost revenue.
You can improve performance to important markets by being geographically
closer to those markets.
You feel that it is risky to be completely dependent on an outside provider. From
a business perspective, you are uncomfortable with being vulnerable to changes
in price as well as to any problems with the network.
You want to control network security to decrease the risk of hacking, computer
viruses and other forms of attack.
You want to be sure that you can comply with changing regulatory standards,
such as practices for the storage and transmission of personal information.
You have realized that public clouds lower your up-front capital costs but that a
mature application may incur sizable recurring costs. Your operating expense
grows as your business grows. At a certain point the higher operating expense
outweighs the benefits of lower capital investment in infrastructure.
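The opex crossover point in the last item can be made concrete with a simple break-even calculation. The dollar figures below are purely illustrative, not taken from any particular deployment:

```python
def breakeven_months(private_capex, private_monthly_opex, public_monthly_cost):
    """Months until cumulative public-cloud spending exceeds the cost of
    building and running a private cloud. Returns None when the public
    cloud's monthly bill never exceeds the private opex (no crossover)."""
    monthly_savings = public_monthly_cost - private_monthly_opex
    if monthly_savings <= 0:
        return None
    # Cumulative costs are equal once the savings have repaid the capex.
    return private_capex / monthly_savings

# Illustrative figures only: a $1.2M build-out that costs $40k/month to run,
# versus a $100k/month public-cloud bill, pays for itself in 20 months.
months = breakeven_months(1_200_000, 40_000, 100_000)
```

A real analysis would also account for hardware refresh cycles, staffing and the cost of the migration itself, all of which push the crossover further out.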
If any of these reasons are compelling, you will want to evaluate your situation in more
detail. Metrics and a migration strategy can help you.
Note: It is true that some of the largest web applications such as Netflix, Yelp,
Newsweek, IMDb, Foursquare and Zynga rely on a public cloud provider such as
Amazon. Very large organizations have the influence to negotiate custom
agreements with cloud services providers that alleviate some or all of the issues
mentioned above. For example, the largest organizations can demand custom
engineering of the network, and they can even specify where data centers are
located. There are only a handful of applications, such as Zynga and Netflix, that can
demand a custom infrastructure.
Using cloud metrics

Metrics give you a way to quantify your business objectives and to measure them over
time. Metrics and business goals should always align. Clearly uptime and cost are
concerns, but they are only two factors among many.
Network performance metrics can also be useful to help you evaluate how well the
public cloud is working for you. You may want to collect data on bandwidth, latency,
the number of hops a packet must traverse before it reaches its destination, the
amount of time it takes to establish a connection with a server, the amount of time it
takes for downloads and how often your application is unavailable because of
downtime. How metrics affect the decision is dependent on an organization’s business
model. A financial institution cannot tolerate seconds of downtime, but a startup
search service could survive for minutes.
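As a rough illustration of how a few of these metrics might be collected, the Python sketch below times TCP connection setup and folds failed probes into an availability figure. The hostnames and measurement policy are hypothetical; a real program would probe from the regions where your users actually are:

```python
import socket
import time

def probe_endpoint(host, port=443, timeout=5.0):
    """Time a TCP handshake to a server; a rough connection-latency sample."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start  # seconds to establish the connection
    except OSError:
        return None  # unreachable: counts toward the downtime metric

def summarize(samples):
    """Aggregate probe results into availability and average connect time."""
    ok = [s for s in samples if s is not None]
    return {
        "availability": len(ok) / len(samples) if samples else 0.0,
        "avg_connect_seconds": sum(ok) / len(ok) if ok else None,
    }

# e.g. samples = [probe_endpoint("www.example.com") for _ in range(10)]
```

Tracking such numbers over time, per market, is what turns "our site feels slow" into a decision you can defend.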
Understanding private cloud technologies

From the service-oriented point of view of the cloud, the application stack can be
divided into three layers.
The Infrastructure-as-a-Service (IaaS) layer corresponds to physical facilities, network,
power and computing hardware. The Platform-as-a-Service (PaaS) layer includes
instances of a computer operating system that are hosted in partitions managed by
virtualization software. The Software-as-a-Service (SaaS) layer is the application,
which provides services to users who connect using the Internet. There are
opportunities for a service-oriented approach at each layer of the technology stack.
Cloud services generally operate at the IaaS layer. They provide the virtualized
resources for computing hardware and networking.
Enabling technologies

There are several technologies that enable public and private cloud computing. The
first is virtualization, which uses binary files (or images) that represent a server’s
configuration and data state. With virtualization, a single server can run multiple
operating system instances at the same time, and the configuration state of a particular
virtual server ("a compute instance") can easily be replicated. For example, an image
can be loaded on many servers quickly as a way to scale up the number of compute
instances in response to higher-than-expected demand.
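To make the image-based scale-up concrete, here is a minimal sketch. The `CloudAPI` class and its methods are hypothetical stand-ins for whatever provisioning interface a real private cloud exposes; they are not a real SDK:

```python
# Hypothetical sketch of image-based scale-up. `CloudAPI` stands in for
# a real provisioning interface; it is not an actual SDK.
class CloudAPI:
    def __init__(self):
        self._instances = []

    def launch_from_image(self, image_id):
        """Boot a new virtual server from a stored image; return its id."""
        instance_id = f"vm-{len(self._instances)}"
        self._instances.append((instance_id, image_id))
        return instance_id

    def running_instances(self):
        return [iid for iid, _ in self._instances]

def scale_to(api, image_id, target_count):
    """Launch clones of one golden image until the fleet reaches target_count."""
    while len(api.running_instances()) < target_count:
        api.launch_from_image(image_id)
    return api.running_instances()
```

The key property is that every new server is a byte-for-byte clone of a known-good image, so capacity can be added without per-machine configuration.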
The second enabling technology is software that allows for automated operation of the
data center. Open-source software such as OpenStack makes it possible for you to
build a private cloud without the investment in custom management software that was
previously required. Facebook has launched the Open Compute Project, an open-
source initiative that provides cost-efficient and energy-efficient designs for data
centers and servers. The Open Data Center Alliance is another organization that offers
open-source software to create cloud infrastructures.
You can also use commercial tools to build, manage and operate cloud-based
infrastructure. Nimbula has tools to create a cloud operating system that supports IaaS
in both public and private clouds in a manner that is similar to Amazon EC2. Another
commercial tool for managing cloud environments is RightScale. (For more about
RightScale, see the Zynga case study later in this paper.) Commercial cloud-
management software that enables enterprises to integrate the management of the
cloud with their existing IT systems is produced by all the enterprise IT software
providers.
Choosing cloud management software is daunting. It can be difficult to evaluate the
tools provided by each vendor or open-source community. Your technical team can
evaluate the different options and, in concert with the executive leadership, choose the
cloud-management software that best serves your business goals.
A new way of thinking about data center design

The service-oriented approach to cloud computing has inspired some significant
changes in the way that data centers are designed. In cloud-based designs, there is a
focus on system-level reliability rather than on redundant hardware at each layer of
the system. For example, a cloud-oriented data center would not use expensive,
redundant power supplies in an attempt to make individual servers more reliable.
Instead, the application would be written in a way that lets it continue on a new
virtualized server after a hardware failure; the replacement server may be located
within the same data center or in another geographic location.
maintain the overall robustness of the system, even when individual components fail.
This allows a cloud-based data center to be created from inexpensive, commodity
hardware. The traditional IT data center that invests in reliability at each level of the
stack is a dying breed.
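The failover behavior described above can be sketched at the application level. The replica names and the failure model below are illustrative, not a prescription:

```python
# Application-level resiliency sketch: if one commodity server fails, retry
# the request against a sibling replica instead of relying on redundant
# hardware in any single machine.
def request_with_failover(replicas, send, max_attempts=3):
    """Try each replica in turn; a single node failure is absorbed by
    moving to the next replica rather than surfacing to the user."""
    last_error = None
    for host in replicas[:max_attempts]:
        try:
            return send(host)
        except ConnectionError as exc:
            last_error = exc  # treat the node as failed and try a sibling
    raise last_error or ConnectionError("no replicas reachable")
```

Because the retry lives in software, the servers underneath can be cheap and individually unreliable without the user ever noticing.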
Cloud-based data center designs include the adoption of software engineering
principles by non-software disciplines. The principles of componentization and the
standardization of interfaces are used to reduce the need for each component to have
customized configuration. It is quite common now for developers to spend time in
operations, which is where the new term “DevOps” comes from.
Also, note that a cloud-based approach to data center and network design requires
cooperation from all layers of the system, including the application. A private cloud
provides the same kinds of virtualized execution environments and other services as a
public cloud. In this sense, it is quite different from traditional enterprise data centers,
which generally do not provide on-demand, self-service access to virtualized
computing resources.
If you use a public cloud today, your application will generally be easier to migrate to a
private cloud than applications that run on legacy IT data centers. This represents an
opportunity to bring cloud-oriented efficiencies into the corporate data center and to
apply the best practices developed in public clouds to a private cloud.
As you consider the possibility of migrating from the public cloud to a private cloud,
you must keep in mind that the system you are building is significantly different from a
traditional data center and that creating a new team to build the private cloud may be
easier than asking existing teams to support both it and the traditional data center.
Creating a migration strategy

Let’s assume that the benefits of moving at least some of your applications to a private
cloud make sense to your organization and you decide that this is something you need
to explore. How do you proceed? The first step is to develop a migration strategy. Keep
in mind that the cloud environment, because it is based on virtualization and
automated system administration, is more dynamic than a traditional IT environment.
The more that you rely on automation rather than on manual intervention, the easier
the transition will be.
A migration strategy defines how your company can move from a public cloud to a
private or hybrid cloud, how much the move will cost and how long it will take.
Migration strategies need to be carefully considered and very detailed. In fact, public
cloud providers count on the difficulty of creating a good migration strategy as a way of
keeping their customers locked in.
Your migration strategy should address the following areas:
Selecting geographic location(s)
Type of data center — built, leased, co-located or hosted
Desired lease terms
Funding plan
Single or multicarrier networking
Peering requirements
Content distribution strategy (hierarchical, peered, hybrid)
Selection of hardware configurations and rack density
Data storage plan
Opportunities for optimizing the application
Analysis of whether a hybrid cloud strategy should be used
Implementation of a configuration system
Choice of virtualization software
Plan for data center management software
Human resources plan for IT personnel
Of course, your migration strategy will reflect the nature of your company and its
business objectives. A key point to remember is that you are not creating a traditional
IT data center. You are creating a cloud, and some of the guiding principles are:
Users expect the cloud to scale in minutes. Automation is your only option.
Plan for elastic capacity. You should be able to scale up as demand increases and
down as it decreases. The application should have as much capacity as it needs.
Plan for high availability. For example, if a server fails, users should never know
about it. The goal is to eliminate downtime.
Emphasize resiliency over redundancy. Assume that some percentage of your
hardware will fail and that it will not be repaired or replaced.
Standardize the environment. Use commodity components.
Automate the environment to minimize the amount of human involvement
needed to run the data center. The efforts of your IT staff can be better spent
elsewhere.
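A minimal sketch of the "automation is your only option" principle: a control loop that sizes the fleet from measured utilization instead of waiting for a human. The thresholds and the doubling/halving policy are illustrative, not a recommendation:

```python
# Toy autoscaling policy: grow the fleet under load, shrink it when idle.
def desired_capacity(current_servers, avg_utilization,
                     scale_up_at=0.75, scale_down_at=0.30):
    """Return the fleet size for the next control interval; never below one."""
    if avg_utilization > scale_up_at:
        return current_servers * 2           # scale up fast; users expect minutes
    if avg_utilization < scale_down_at:
        return max(1, current_servers // 2)  # reclaim idle capacity
    return current_servers
```

In a real private cloud this loop would be driven by your monitoring system and would call the provisioning layer, but the shape of the decision is the same.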
In essence, your migration strategy must include a way to replace the management
functionality that was supplied by your public cloud provider.
The following sections of this paper describe some of the elements of the migration
strategy in more detail.
Finding the geographic location

One of the first planning steps is to decide on a physical location for the data center.
Where do you need to be? An analysis of your traffic patterns will help you to decide
where to put the data center. You should also analyze which ISPs best serve your
markets. Here are some areas to consider.
Performance considerations
Where you place the data center can affect performance, so you may want to be as
close to your end users as possible. This is particularly important for media companies
whose products rely on real-time interactivity and personalization. These applications
will be sensitive to problems such as long latency periods and routes with many hops.
(For more information, see the Equinix study cited earlier in this paper.)
Ask yourself these questions:
Do you have a large amount of traffic to a particular location?
Do you need to locate the data center in another country? Although the United
States is one of the cheapest places in the world to build and run data centers,
you may need an international presence to reach particular markets quickly.
Do you have a dependency on another company? For example, do your
applications run on the Facebook platform? If you do have such a dependency,
you might want to consider locating near that company’s data center so that you
can build a direct connection to it.
Do you use a particular ISP and want to be near its backbone?
Do you have real-time interactions with users or personalize your websites? If so,
low latency will give you an advantage. For example, it might help you to conduct
real-time auctions for advertising. These auctions award each set of eyeballs to
the winner of the auction with every page view. You may even want to build
direct connections to your advertising partners.
Other considerations
Can you locate in an area that offers tax incentives? See the Data Center
Knowledge article on tax incentives for more information.
Can you locate in an area that offers other advantages, such as an underutilized
power grid? The main things data centers need are power and connectivity. An
underutilized grid means an abundance of capacity, which lowers your costs and
leaves headroom for growth; an overutilized grid means that power to the data
center will be expensive and possibly limited. One of the worst mistakes you can
make is to build a data center where you can't get enough power.
Finding a data center
After you’ve decided on a geographic area, you have several options. You can build a
dedicated space, buy a dedicated space, lease a dedicated space or co-locate.
“Co-location” means that you rent rack space from a provider. In all cases, you will need to
know how much rack space you will need now and for some time in the future. You
may have 500 servers initially but want enough space for 1,000 servers. Your business
plan should help you estimate your requirements. Rather than square feet, these
requirements are often expressed in terms of power consumption. For example, you
may decide you need 100 kW of space. (A rule of thumb is that one rack consumes 7.5
kW.)
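The sizing arithmetic can be sketched as follows, using the 7.5 kW per rack rule of thumb above and assuming roughly 200 W per server (a figure the paper uses later):

```python
import math

RACK_KW = 7.5  # rule-of-thumb power draw per rack, from the text

def racks_needed(servers, watts_per_server=200):
    """Racks required for a given server count, under the 7.5 kW/rack guideline."""
    total_kw = servers * watts_per_server / 1000.0
    return math.ceil(total_kw / RACK_KW)

# The paper's example: 500 servers today, room to grow to 1,000
today = racks_needed(500)    # 100 kW of load -> 14 racks
future = racks_needed(1000)  # 200 kW of load -> 27 racks
```

Note that the 500-server case works out to 100 kW, which is why requirements are often quoted as a power figure rather than square footage.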
Building or buying a dedicated space
This option is selected only by very large companies and is outside the scope of this
paper.
Leasing a dedicated space
Currently, data center space around the world is competitively priced, with a range of
companies to choose from, like Digital Realty Trust, CoreSite, DuPont Fabros and
Vantage Data Centers. This space is marketed differently than general-purpose office
space. If you don’t have a data center location specialist in your company, most large
commercial real estate brokerage companies, such as Jones Lang LaSalle, Grubb &
Ellis, CBRE, Cushman & Wakefield and Colliers have data center executives who can
help you. A common practice in many companies is to use the brokers with whom the
corporate real estate group has an established relationship. If you know your power
requirements, they can tell you what is on the market that is suitable. These executives
are usually familiar with the relevant issues such as financing, taxes and regulatory
constraints. If you want to rent space, they can also help you to negotiate a good lease,
which can be a complex process. Commercial real estate brokerage companies work on
commission.
Using co-location
Co-location can be a very efficient option for creating a private cloud. A co-location
provider rents you rack space, maintains the Internet connections and is responsible
for the environmental conditions, such as uninterruptible power supplies (UPS),
air-conditioning and safety features such as security and fire prevention. Co-location
offers the same flexibility as if you had your own data center. You control the hardware
and the software. However, if you want to keep your capital expense low and if you
don’t need a specific hardware configuration, you may want to lease equipment from
the co-locator rather than provide your own. If you need multiple locations, look for a
co-location provider with multiple data centers that are near the markets you want to
reach now and in the foreseeable future. Equinix is an example of such a company.
When you host an application in a public cloud, you should ask whether eventual
co-location of your own servers is an option. This is a desirable feature, even if you don’t
need that capability today.
Carrier-specific and carrier-neutral providers
Some co-location providers are tied to a particular carrier, but others are
carrier-neutral. A carrier-neutral provider such as Equinix offers connections to many
different ISPs. Access to multiple carriers allows for failover strategies, more
competitive pricing and increased opportunities for establishing peering relationships.
If you do not have the in-house expertise to decide which carriers you want to use, you
may want to consult a co-location broker, who will help you find the best services and
the best prices. A broker can also tell you which co-location providers offer those
connections and are located in the appropriate areas. If you use multiple ISPs, check
that the co-location site supports Ethernet handoffs, which means that they can bridge
between carriers in an automatic and scalable fashion.
Carrier neutrality is the most common option in the United States and Western
Europe, but it is not universally available worldwide. In many markets outside these
areas, the major service providers hold monopolies in their country, so many data
centers offer limited carrier options. If you do find a co-locator that is carrier-neutral
in these markets, the space will most likely be more expensive, but the competitive
carrier choices can offset the increased costs. Adding capacity in the United States and
Western Europe is significantly easier than in other markets, and you should consider
working with consultants or companies who have experience building a worldwide
data center strategy that includes carrier locations.
Peering reduces Internet access costs by allowing companies to exchange traffic over
direct connections rather than through paid transit on the public Internet. You can go
to www.peeringdb.com and
search for companies such as Google, Facebook, Microsoft and Yahoo to see what they
have deployed globally, where they are located and who you can contact to peer with
their networks. For example, Google has bought 111 Eighth Ave. in New York City for
approximately $1.9 billion. This is one of Manhattan's largest buildings and is a key
Internet hub.
Using a hosting service
A variant of co-location is to use a hosting service for your private cloud. These
providers offer more support than a standard co-location company. They can design,
provision, configure, deploy and manage your cloud. Of course, this can become very
expensive, but it offers more flexibility than the public cloud. If you don’t have an
experienced engineering team of your own, you may want to consider this option.
Rackspace and SoftLayer are examples of this kind of hosting service.
Creating a bid
After you have decided on the type of facility, you can create a bid request package,
which should be as detailed as you can make it. Expect the bids to be close to one
another. If one bid is significantly higher or lower than the others, it probably means
that there was some miscommunication between you and the provider.
Given that the bids are competitive, examine other distinguishing, less tangible
features, such as your confidence in the operations team, to help you make a decision.
Making hardware decisions
One advantage of setting up a private cloud is that you can control the hardware
configuration. Before you purchase new hardware (or lease it), first take an inventory
of your existing hardware. It’s easy to lose track of equipment. Desks and benches
littered with hardware that no one knows anything about are common sights in many
companies. Many companies already have a configuration management database in
place, which helps you keep track of your assets.
The types of applications you run can influence your hardware decisions. Applications
that are compute-intensive might benefit from a different hardware configuration than
applications that are data-intensive. Even if you are not doing it now, you might be
planning to perform detailed analytics on the data you accumulate, and you might
want some of your hardware purchases to reflect this.
Two examples of technologies that are more likely to be used in private clouds than in
public clouds are OpenFlow software and microservers. OpenFlow allows networks to
be virtualized and run at a lower cost than current solutions. Microservers are small
units with hundreds of energy-efficient processors that are used in large-scale
environments. Microservers provide groups of dedicated compute nodes where
multicore CPU architectures and virtualization aren't practical. For example,
Online.net, a large hosting company in France, uses microservers: each server has a
single CPU and runs one OS and one application, with twelve servers per chassis.
Examples of these energy-efficient processors are the Intel Atom and ARM
processors.
The importance of energy consumption
It is important to keep your energy costs low. The power and cooling infrastructure
accounts for 85 percent of the cost in building a data center. Only 10 percent of the
cost is in building the shell. By measuring the energy efficiency of your transactions,
you can reduce the costs of your operations. For example, eBay measures the
performance per watt of its transactions.
Note: To give you a sense of the amount of energy that a very large company's data
centers use, you can see the data that Google has published about the amount of
energy it uses to support search, Gmail and YouTube.
Also, cloud metrics that measure end-to-end efficiency, such as transactions per watt
of electricity, can be helpful in aligning diverse teams in your organization to a
common goal. For example, if application developers are held accountable for the
amount of power used per transaction, they will be motivated to create the most
power-efficient applications possible and coordinate their efforts with network and
data center architects.
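As an illustration, a transactions-per-watt metric could be computed like this. All of the numbers below are invented; the point is only the shape of the calculation.

```python
def transactions_per_watt(transactions, watt_hours, period_hours):
    """End-to-end efficiency: completed transactions per watt of average draw."""
    avg_watts = watt_hours / period_hours   # average power over the period
    return transactions / avg_watts

# e.g. 9 million transactions served over 24 hours while consuming 1.2 MWh
tpw = transactions_per_watt(9_000_000, 1_200_000, 24)
```

Tracking this number release over release gives application, network and data center teams a single shared target.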
Planning rack density
Many people who design clouds use 7.5 kW per rack as a guideline for density. Higher
densities that consume approximately 20 kW can create areas where the temperatures
are hot enough that they may damage your equipment and heat up the surrounding
server racks. These areas are called hotspots. Load your racks so that the servers are
evenly distributed. Don’t try to place as many servers as you can in a rack. You
generally pay for space in a data center based on the power you use, not the square
footage you occupy, so there is no real benefit to extreme rack density. High rack
density is heavily marketed due to its higher profit margins. But data center experts at
Facebook and Amazon who are focused on low total cost of ownership (TCO) deploy
7.5 kW per rack. Low density requires slower air movement and fewer fans, which
translates into using less energy and reduced costs. Rack density is still a hotly debated
topic.
One server uses approximately 200 W. This means that you can have about 35 servers
per rack. If you assume that a server costs between $2,500 and $3,000, then one rack
of servers costs between $87,500 and $105,000. Each server might run 12 virtual
machines (VM) or more at a time. There is usually one VM per core if the instance is
busy, but it is possible to run more than one VM on a single core if the load is low.
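The arithmetic in this paragraph can be checked with a short script. The power, price and VM figures come from the text; the function names are ours.

```python
RACK_WATTS = 7500     # the 7.5 kW/rack design guideline
SERVER_WATTS = 200    # approximate draw of one server

# 7,500 W / 200 W = 37 servers; "about 35" once you leave some headroom
servers_per_rack = RACK_WATTS // SERVER_WATTS

def rack_cost(servers=35, low=2500, high=3000):
    """Hardware cost range for one rack of servers."""
    return servers * low, servers * high

def vm_capacity(racks, servers_per_rack=35, vms_per_server=12):
    """Rough VM count for a deployment, at ~12 VMs per server."""
    return racks * servers_per_rack * vms_per_server
```

So one rack lands in the $87,500 to $105,000 range and hosts on the order of 420 VMs.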
It is interesting to note that even software licensing terms are following the trend
toward using power as the yardstick for data center capacity. A recent shift in virtualization
licensing by VMware reflects this. It has changed its licensing model so that it is now
based on the RAM footprint in the servers. The amount of RAM tends to be better
correlated with system load (and power consumption) than the number of processor
cores is.
Planning and flexibility
When you plan for power consumption, remember that one of the advantages of
building a private cloud is that you have flexibility. The 7.5 kW per rack value is a
guideline, not a fixed value. Some racks might use 10 kW, some will use 7.5 kW and
others might use 12 kW. After you have a sense of how much power you will consume,
broaden that value to include an acceptable range of possibilities. Consider options
that use one-third less power and other options that use one-third more power than
you had initially planned. Don't lock yourself into a single number as you formulate a
plan.
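One simple way to follow this advice is to bracket the estimate programmatically. The 150 kW figure below is just an example.

```python
def power_plan_range(planned_kw):
    """Bracket a power estimate: one-third below and one-third above the plan."""
    return planned_kw * 2 / 3, planned_kw, planned_kw * 4 / 3

# Plan for 150 kW, but evaluate options from 100 kW up to 200 kW
low, mid, high = power_plan_range(150)
```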
Planning storage
Just as you need servers, you also need data storage. Your strategy for storage should
adhere to cloud principles, such as automated system administration, just as your
strategy for servers does. Think in terms of storage virtualization, which is analogous
to server virtualization. Here are some goals:
Be able to quickly provision (and de-provision) storage. You should be able to
provide storage whenever or wherever it is needed. You should also be able to
quickly de-provision or repurpose the storage when you don’t need it anymore.
Be able to manage the storage efficiently. Because you must be able to react
quickly to changing demands, a centralized storage management console is a
good idea.
Be able to replicate data in a variety of locations. It’s a given that you will have
redundant copies of your data. You must also decide on how often the
duplication will occur and where you will store the copies. Multiple locations that
are dispersed geographically are now possible.
Be highly reliable. Because storage availability is so critical, you must have a
good failover strategy. Downtimes mean lost revenues and lost consumer
confidence.
Be scalable. You want to be able to scale up and back quickly. The ability to
quickly expand capacity to meet demand without a complex set of integration
and allocation processes — and without downtime — is critical for a production
cloud.
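These goals can be illustrated with a toy provisioning interface. This is a sketch, not any vendor's API; in a real deployment the pool would sit on a SAN or a distributed filesystem.

```python
class StoragePool:
    """Toy model of a virtualized storage pool. All names are illustrative."""

    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.volumes = {}  # name -> (size_gb, replica_sites)

    def provision(self, name, size_gb, replica_sites=("site-a", "site-b")):
        """Quickly allocate a volume, replicated to geographically dispersed sites."""
        if size_gb > self.free_gb():
            raise RuntimeError("pool exhausted: scale out before provisioning")
        self.volumes[name] = (size_gb, tuple(replica_sites))

    def deprovision(self, name):
        """Return capacity to the pool when the volume is no longer needed."""
        self.volumes.pop(name)

    def free_gb(self):
        return self.capacity_gb - sum(size for size, _ in self.volumes.values())

pool = StoragePool(1000)
pool.provision("game-db", 400)   # allocated and replicated on demand
pool.deprovision("game-db")      # capacity returned when no longer needed
```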
Evaluating your applications
Many companies that use the private cloud integrate their infrastructure team with
their development team. It’s not an "over-the-wall" kind of situation. The people who
write the code support the code. This approach can work well, because the application
developers make most of the decisions about the application’s design and how it
communicates with the infrastructure.
You may need to evaluate your applications and make sure that they are designed to
operate in the private cloud. One difference between the public cloud and the private
cloud is that, with the public cloud, application developers write to the provided
software platform. They adapt the application to an abstracted environment, and they
are removed from the underlying infrastructure. In the private cloud, there is the
potential to integrate the infrastructure with the application. For example, developers
can use techniques that limit processing to servers in the same rack or switch, which is
critical to high performance in some cases. This is analogous to vertical integration in
manufacturing, where tightly meshed stages can make the final product more efficient
and its design cleaner.
Another issue to investigate is whether your public cloud applications were written to
facilitate migration. Are they abstracted from the underlying mechanisms? There
should be as few points of dependency on the vendor-provided platform as you can
manage. For example, if you use the public cloud for peak shaving, you might insert a
layer between your application and the cloud to communicate between them. Clear
component layering can help to make your migration path easier.
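The layering idea can be sketched as follows. The backend classes are hypothetical stand-ins for provider-specific services; the point is that the application only ever talks to the intermediate layer.

```python
class QueueBackend:
    """Interface the application depends on, instead of a vendor API."""
    def send(self, msg):
        raise NotImplementedError

class PublicCloudQueue(QueueBackend):
    def __init__(self): self.sent = []
    def send(self, msg): self.sent.append(("public", msg))

class PrivateCloudQueue(QueueBackend):
    def __init__(self): self.sent = []
    def send(self, msg): self.sent.append(("private", msg))

class AppQueue:
    """The application talks only to AppQueue; migrating clouds means
    swapping the backend, not rewriting the application."""
    def __init__(self, backend):
        self._backend = backend
    def send(self, msg):
        self._backend.send(msg)

q = AppQueue(PublicCloudQueue())   # today, on the public cloud
q = AppQueue(PrivateCloudQueue())  # after migration: same application code
```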
You should keep in mind that public and private clouds are not your only choices.
Many companies find that a hybrid approach works best. You can use the public cloud
to handle high-demand periods that would overload the private data center. In
addition, you can place some parts of an application on the public cloud and other
parts on a private cloud. For example, a distributed login process that uses a federated
trust model might be split across public and private infrastructures.
Implementing the software layers
As mentioned in the previous section, the public cloud provides many software layers
and services that you rely on. To build a private cloud means that you must provide
those layers yourself. Many companies use open-source software, such as OpenStack,
to provide at least some of the required functionality. Open-source software is
attractive because it is free and because you don’t have to sacrifice the flexibility that is
one of the reasons to build a private cloud in the first place. Proprietary software
licensed on physical CPUs can, for example, make it difficult to move virtual machines
(VM) from one host to another. To build a private cloud, you will need, at a minimum,
virtualization software and management software.
Virtualization software
A cloud cannot exist without a virtualization layer. Server virtualization hides the
physical servers, including their number, identities and processors, behind a layer of
abstraction. Storage virtualization does the same for storage devices, such as storage
area networks (SANs). Server virtualization includes the use of a hypervisor or virtual
machine manager (VMM). Hypervisors allocate physical resources and allow multiple
operating systems, or guests, to run concurrently on a host computer. It’s possible that
you will have to use multiple types of hypervisors in your cloud, which may affect your
decision about the management software that you need. It's also possible that you will
not be able to find an integrated tool set that gives you all the management capabilities
that you need. In this case, you may have to write some of the software yourself.
Management software
Just as a cloud cannot exist without virtualization, it also requires management tools
that centralize management tasks and promote automation. Some of the tools are:
Infrastructure management software. Infrastructure management software
allows you to manage the physical and virtualized environment. For example, it
gives you control over VMs, storage, and backup and recovery operations.
Service-level management software. Service-level management allows you to
manage workloads, and it occurs above the virtualization layer. This layer
supplies automation capabilities.
Change management. Change management is the process of authorizing and
propagating upgrades and releases.
Release and deployment management. In the private cloud, someone must be
responsible for deploying new releases and patches, such as a new version of the
operating system.
Knowledge management. Knowledge management is the process of gathering,
analyzing, storing and sharing information within an organization.
Incident and problem management. Incident and problem management
identifies and resolves operational issues. Identifying a problem also determines
who is responsible for resolving it. If a problem is related to the operating
system, it might be an IT responsibility. If it is related to the application, then it
is the application owner's responsibility. How responsibilities are delegated
depends on how your company is organized.
Access management. Access management at the application level ensures that
only authorized users can use the services they request.
Systems administration. Systems administration is the performance of tasks that
are required to keep the system healthy. Because many of these tasks are
repetitive, anything that can be automated should be.
Configuration management system. Automated management software requires
knowledge of the system's configuration, so it needs databases that describe all of
your company's assets. The other management software systems rely on it for
information. These databases cover not only physical assets and configurations but
also logical ones.
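At its core, a configuration management database is a queryable registry of physical and logical assets. A minimal, purely illustrative sketch:

```python
class ConfigDB:
    """Toy configuration management database: a registry of physical and
    logical assets that other management tools can query."""

    def __init__(self):
        self._assets = {}

    def register(self, asset_id, kind, **attrs):
        """Record an asset and its attributes."""
        self._assets[asset_id] = {"kind": kind, **attrs}

    def find(self, kind):
        """Return the IDs of all assets of a given kind."""
        return [aid for aid, a in self._assets.items() if a["kind"] == kind]

cmdb = ConfigDB()
cmdb.register("rack-01", "rack", location="ashburn", kw=7.5)
cmdb.register("srv-001", "server", rack="rack-01")
cmdb.register("vm-042", "vm", host="srv-001")  # logical assets belong here too
```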
IT personnel
Obviously, what you can do is partially controlled by your IT staff. To build a private
cloud requires an experienced staff that understands how to implement cloud
principles. People who have experience in traditional IT departments have a different
view of how things should work and what expectations must be met. You may want to
use consultants, consider outsourcing some responsibilities or hire some new people.
Some companies have opted for new hires, because information is now a core asset,
and analysis of that information is a critical activity. For example, Best Buy has
recently decided to bring its IT operations back in-house for this reason. The company
is building a new data center in Minneapolis and plans to hire 350 people. For more
information, see the Trefis article "Best Buy's Headed to $35, Hires More Geeks to
Grow Online Sales."
Zynga: a case study
Zynga is a gaming company whose products exemplify the current trend toward
interactivity, personalization and social networking. That games should be social is one
of the tenets of the company’s operating philosophy. Zynga games run on the
Facebook platform, which is one way the company makes this goal tenable.
With a customer base of more than 280 million active users each month, Zynga is one
of the largest users of cloud computing, and its growth has been so fast that it has
taken everyone, including its own management, by surprise.
When Zynga launched FarmVille, the company thought 200,000 daily active users in
the first two months would be a success. As it turned out, for the first 26 weeks
FarmVille added 1 million new users per week. The game currently has 70 million
active users per month. With a growing portfolio made up of new and mature games,
Zynga has moved from a public cloud to a hybrid approach, using private data centers
as well as Amazon Web Services (AWS).
For much of its four-year history, Zynga has relied on AWS for its infrastructure.
Although it expects this dependence to continue, Zynga wants to diversify and create a
hybrid solution. To that end, it is now leasing its own data centers in Santa Clara, Calif.
and Ashburn, Va. One reason is that its significant dependence on AWS makes it
vulnerable. In its filing with the SEC, Zynga says that, because of this dependence, any
significant interruption in the network service could affect its business. For example,
the SEC filing says specifically that in April 2011, several of Zynga's most important
games, including FarmVille and CityVille, were interrupted for several hours due to an
AWS network outage.
Zynga also realizes the importance of responsiveness. In its SEC filing it says, "If a
particular game is unavailable when players attempt to access it or navigation through
a game is slower than they expect, players may stop playing the game and may be less
likely to return to the game as often, if at all."
Another reason that low latency is so important is that Zynga's games are data-driven.
Zynga relies on metric-based player feedback to know how to target its marketing and
to create new content.
Zynga is so committed to gaining control of its infrastructure that it is spending
between $100 million and $200 million in the second half of 2011 for this purpose.
Zynga and AWS
Zynga had just run out of data center capacity when it launched FarmVille, so it used
Amazon’s EC2 platform. Given the game's explosive growth, that was a fortuitous
decision. Mark Williams, the company's VP of network operations, has said that
without Amazon, FarmVille would have failed.
On Amazon, Zynga uses Apache and PHP on the front end, memcached (from
Couchbase) for active user play and MySQL on the back end. It uses memcached to
store key-value pairs for active player sessions, and it writes the values to the
database as necessary. It uses the RightScale cloud management platform to acquire instances at
will. As convenient as AWS has proven, because of the risks associated with relying so
heavily on a third party and the desire to improve performance, Zynga now has a new
model for introducing games.
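The session-storage pattern described above, with hot state in a memcached-style cache and writes persisted to the database as necessary, can be sketched with plain dicts standing in for memcached and MySQL. This is our illustration of the pattern, not Zynga's code.

```python
# In-memory stand-ins: cache plays the role of memcached, database of MySQL
cache, database = {}, {}

def save_session(player_id, state, flush=False):
    """Keep hot session state in the cache; persist to the database when asked."""
    cache[player_id] = state
    if flush:
        database[player_id] = state

def load_session(player_id):
    """Read from the cache first; fall back to the database on a miss."""
    if player_id in cache:
        return cache[player_id]
    state = database.get(player_id)
    if state is not None:
        cache[player_id] = state  # repopulate the cache for future reads
    return state
```

The appeal of this design is that the database only sees the writes that must survive, while the high-frequency reads and updates of active play stay in memory.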
Zynga plans to introduce its new games on Amazon. Given how difficult it has been to
predict growth, Zynga relies on AWS to accommodate even the most popular games.
Within less than a day, Zynga can allocate to a new game enough Elastic Compute
Cloud (EC2) capacity to support 10 million daily active users. Each new game is
watched for the next three to six months. If at any time growth becomes flat or
predictable, the game is moved from Amazon and into Zynga’s data centers, where it
can be optimized to work with the company's resources.
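That launch-and-watch policy reduces to a small decision rule. The growth threshold below is our invention; the 26-week watch window corresponds to the three-to-six-month period described in the text.

```python
def placement(weeks_live, weekly_growth_pct, threshold_pct=5, watch_weeks=26):
    """Toy version of the policy above: launch on EC2, then move a game
    in-house once its growth flattens or the watch period ends."""
    if weeks_live < watch_weeks and weekly_growth_pct > threshold_pct:
        return "public-ec2"       # still growing unpredictably: stay elastic
    return "private-zcloud"       # flat or predictable: optimize on owned capacity
```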
Williams tries to maintain a fifty-fifty split between using EC2 and Zynga's data
centers. At any one time, the split could be thirty-seventy or seventy-thirty, depending
on the ratio of new games to predictably growing ones. Williams said he doesn’t want
to test Amazon’s capacity and that he would rather not have all of his eggs in any one
basket, but he values the ability to grow quickly.
The Zynga private cloud
Zynga currently leases data center space from two wholesale data center providers,
DuPont Fabros Technology (DFT) and Digital Realty Trust (DLR). In the wholesale
data center model, a tenant leases dedicated, fully built data center space. Zynga felt
that this approach offered greater control and security than shared co-location space,
and it was quicker and cheaper than building an entire data center. This approach
relieved Zynga of the capital investment needed to construct the data center.
Another advantage was that Zynga was able to locate several of its data centers
adjacent to some of Facebook's data centers, which allowed it to connect to Facebook's
facilities directly. This resulted in decreased latency for more-responsive games that
users would enjoy playing.
Zynga calls its cloud the zCloud, and it operates similarly to AWS. However, it is
designed specifically for social games in terms of availability, network connectivity,
server processing power and storage throughput. Zynga achieves this by providing
redundant power to each rack, state-of-the-art servers with high-memory capacity, a
fully non-blocking network infrastructure, the use of in-line hardware-based load
balancers and local disk storage. The zCloud integrates with operational and
management tools from RightScale. Essentially, RightScale provides the management
interface for both Zynga’s public EC2 resources and private Cloud.com resources,
allowing Zynga to easily migrate games from Amazon to Cloud.com using the same
configuration templates. Zynga uses CloudStack software to build the private cloud
aspects of its infrastructure. Along with CloudStack, Zynga uses other open-source
technologies such as Apache, MySQL, memcached, Couchbase and Nagios. This new
infrastructure, coupled with the ability to use AWS when there are particularly strong
surges, allows Zynga to add up to 1,000 servers a day.
Williams says that Zynga's vendors state that the zCloud is one of the world’s largest
private clouds. He believes that it will give Zynga the flexibility it needs to deliver
games and features to its users in a rapid fashion.
What's next
One of Williams’ next big projects is to help Zynga unify its infrastructure as it
diversifies across new platforms, including mobile. Right now, Zynga dedicates
infrastructure separately to each social network’s instance of any one game. Now, with
launches such as FarmVille for iPhone, there is an increased emphasis on keeping the
state of users' games synchronized, no matter how many platforms they use to play
them.
More information
Here are links to two articles that discuss Zynga's hybrid cloud solution.
“Gaining Altitude in the Cloud”
“Lessons From FarmVille: How Zynga Uses the Cloud”
Perspectives on the future
People tend to think of data centers as large rooms filled with expensive equipment
that costs their company money, but data centers are much more than overhead. In
fact, they can be thought of as factories that produce valuable products. Increasingly,
organizations view information gathered from customers during their interactions
with web-based applications as raw material that can be processed into various
value-added by-products. When indexed and analyzed, the data collected as part of these
operations has tangible economic value. Analytics is a growing field, and many
innovative techniques are being developed.
YouTube is just one example of an organization that uses information to generate new
revenue. YouTube is developing systems that convert the spoken soundtrack of an
uploaded video into text, using automated speech recognition to transcribe the words
in the audio track. The transcription is then embedded in the
video as metadata. YouTube uses the audio transcript to search for keywords. The
keywords allow YouTube to effectively target advertising to viewers of that video. In
other words, by knowing what is said in a given video it is possible to conduct
additional advertising keyword auctions that would not otherwise be possible. The new
revenue comes from data that already exists in the data center.
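A crude stand-in for that pipeline: pull the most frequent non-stopwords from a transcript as candidate advertising keywords. This is illustrative only, not YouTube's actual method.

```python
from collections import Counter

STOP = {"the", "a", "an", "to", "of", "and", "in", "is", "it"}

def ad_keywords(transcript, top_n=3):
    """Extract the most frequent non-stopwords from a transcript as
    candidate advertising keywords."""
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    counts = Counter(w for w in words if w and w not in STOP)
    return [w for w, _ in counts.most_common(top_n)]
```

A real system would use far better signals (phrases, topic models, named entities), but even this toy version shows how text already sitting in the data center becomes auctionable inventory.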
Another example is the gaming company Zynga. Its games are free, and they attract
many players. Zynga earns revenue by selling virtual goods such as badges and
graphics to these users. Because only a small percentage of the players spend money,
Zynga needs to know how best to target its products at the users who will most likely
make purchases. To do this, Zynga tracks and analyzes player behavior. The result of
the analysis supports highly targeted ads that drive new revenue. The player data
already exists; using it in novel ways produces new sources of revenue.
As the examples of web analytics show, you should keep your options open. You
may find that the public cloud works best for you now but your business model
projects that in two years you will have grown enough to make the private cloud viable.
Last, it is easy to be swayed by the latest and greatest trends. Let your business model
drive your technology choices, now and in the future.
Key takeaways
- A growing percentage of web content is based on interactivity, social networking and personalization.
- To satisfy consumers, this content requires a low-latency environment so that sites respond quickly to input.
- Personalization also requires responsive sites so that the appropriate advertising appears in a timely fashion.
- Companies that produce interactive content may find that public cloud providers and traditional CDNs cannot provide the correct environment. The response time may be too slow, and outages are also a problem.
- Companies that spend more than $25,000 per month on public cloud services may find a private or hybrid cloud to be less expensive than using the public cloud only.
- Use metrics to evaluate the performance of your public cloud.
- Develop a detailed migration strategy.
- Gather input from every department that might be affected. Building a cloud is an interdisciplinary activity.
- Consider co-location providers as well as other options when you think about the location of the data center.
- Consider whether you can use open-source as well as commercial products to implement virtualization and service layers.
- Plan to make increasing use of analytics, because the data center is a source of valuable information.
- Use your business plan to drive your technology choices. Plan for the future and not just for immediate contingencies.
About Dave Ohara

Dave Ohara holds a degree in Industrial Engineering and Operations from the
University of California, Berkeley. He joined HP after graduation, working in process
engineering–type jobs in PC manufacturing, quality and reliability, and distribution
logistics. After five years, he joined Apple to redesign its distribution logistics system
and quickly moved through a variety of hardware and software product development
positions. He has also worked at Microsoft, on the Far East versions of Windows 3.1.
About GigaOM Pro

GigaOM Pro gives you insider access to expert industry insights on emerging markets.
Focused on delivering highly relevant and timely research to the people who need it
most, our analysis, reports and original research come from the most respected voices
in the industry. Whether you’re beginning to learn about a new market or are an
industry insider, GigaOM Pro addresses the need for relevant, illuminating insights
into the industry’s most dynamic markets.
Please visit us at http://pro.gigaom.com
Further reading

Infrastructure Q3: OpenStack and flash step into the spotlight
Last quarter we highlighted the fast maturation of the Platform-as-a-Service and big
data spaces. Those two trends only picked up speed during the third quarter of 2011.
Joining them on the cusp of IT greatness, though, are the OpenStack project and flash
storage. The former gathered serious validation from big-name companies, while the
latter saw less funding than last quarter but a significant number of product launches.
A field guide to cloud computing: current trends, future opportunities
Cloud computing has grown from a pie-in-the-sky vision to a major IT movement over
the past few years. As its promise has grown, though, so too has its scope: Cloud
computing now encompasses a wide range of IT functions delivered as services,
including infrastructure, software, application platforms and storage. This report
examines the key areas that are driving cloud spending and innovation, highlights the
current state of each market and provides informed insights into where each
individual sector — and cloud computing in general — is headed.
Defining internal clouds: from Appistry to VMware
Internal clouds are real and they’re here, but many efforts are still in their early days.
The problem is that transitioning to a cloud-enabled environment can involve large
degrees of technical, cultural and budgetary evolution, and it is of utmost importance
that organizations deploy the right solution. With this in mind, customers need to
consider many things, and we have profiled numerous solutions and companies to
create a guide to deploying the right cloud solution to the right enterprise.
Want more information?
Contact Dave Ohara, the author of this report,
or any of the other experts at GigaOM Pro.
Discuss this report online.
Suggest a research topic.