Saturday 19 February 2011

Data backup software and cloud-based backup working together

CLOUD SERVICES: CLOUD STORAGE AND CLOUD-BASED BACKUP DEFINED
Although the terms are often used interchangeably, there's a big difference between cloud storage and cloud backup. Cloud storage is storage as a service. To tap into cloud storage, you get an account with a service provider; they provide you with their API and you use some type of software that enables you to store data via that API. Voila! You have storage with unlimited capacity. You don't manage the storage where your data resides, and you don't even have to ask for additional capacity. All you have to do is pay the bill.
All cloud storage services charge a "storage fee," a monthly rate based on how many gigabytes of data are stored in your account. In addition, some cloud storage providers may charge a fee for each gigabyte that's downloaded or uploaded -- essentially a "bandwidth fee." With cloud storage, you still have to manage the application that's sending the data into the cloud.
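To make that pricing model concrete, here's a minimal sketch of how such a bill might be computed. The rates and the function name are illustrative assumptions, not any particular provider's pricing.

```python
# Hypothetical cloud storage bill: a storage fee plus an optional
# bandwidth fee, as described above. All rates are made-up examples.

def monthly_bill(stored_gb, uploaded_gb, downloaded_gb,
                 storage_rate=0.15, bandwidth_rate=0.10):
    """Return the month's charge in dollars.

    storage_rate   -- fee per GB stored per month (assumed)
    bandwidth_rate -- fee per GB transferred in or out (assumed)
    """
    storage_fee = stored_gb * storage_rate
    bandwidth_fee = (uploaded_gb + downloaded_gb) * bandwidth_rate
    return storage_fee + bandwidth_fee

# Example: 500 GB stored, 50 GB uploaded, 5 GB restored this month.
print(f"${monthly_bill(500, 50, 5):,.2f}")  # $80.50
```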
To be considered a cloud backup service, a cloud service must provide all of the above plus the software to make the backups happen. A cloud backup service typically provides some type of client software that must be installed on all the systems to be backed up. Backups are then automatically scheduled to occur on a regular basis. The backup software generally uses techniques such as delta-level backups or full deduplication to minimize network traffic.
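As a rough illustration of how that client software keeps traffic down, the sketch below hashes fixed-size chunks and transfers only chunks the service hasn't seen before. Real products typically use variable-size chunking and a provider-specific transfer mechanism; `upload` here is a hypothetical placeholder.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # fixed 4 MB chunks; real tools often vary chunk size

def backup_file(path, seen_hashes, upload):
    """Send only chunks not already stored; return bytes actually uploaded."""
    sent = 0
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in seen_hashes:   # new data: transfer it
                upload(digest, chunk)       # placeholder for the provider's API call
                seen_hashes.add(digest)
                sent += len(chunk)
            # a duplicate chunk costs only a hash lookup, not a transfer
    return sent
```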
The provider's service-level agreement (and the price they charge) will determine what happens when things don't go as planned. At a minimum, the service may provide an on-screen pop-up notification or an email message to tell you that things are going well (or not). The service may also have the ability to automatically escalate the problem when failed backups aren't addressed.


TRADITIONAL DATA BACKUP SOFTWARE MEETS THE CLOUD

Some companies may use a cloud backup service for all of their backups, while others may opt for a combination of traditional backup methods and cloud services. There are two very different ways to go about integrating traditional data backup software and the cloud: you can use a traditional backup system in parallel with a cloud backup system, or you can use backup software that has the ability to use a cloud storage system as its target.
If the main reason you're considering cloud-based backup is the "hands-off" aspect, then running the two systems in parallel is the route to take. You can continue using traditional backup software to perform the bulk of your backups, then use cloud backup software to handle those parts where it would be most beneficial. The most common practice is to start by performing remote-site and laptop backups using the cloud backup service. Many companies aren't yet backing up their laptops, and backing them up with traditional backup software is problematic, to say the least. Most companies back up their remote sites, but they often use less-than-desirable methods because their remote offices don't have dedicated IT staff. A cloud backup service can solve both the laptop and remote-office problems; all you have to do is write a check.
Using cloud storage as a target for a traditional backup software package is a bit more problematic, but it's not without its advantages. The same things that are true of cloud storage for traditional data are true of cloud storage for backups: no management, endless capacity, etc. As a "bonus" you automatically get off-site backups, which is still a hassle for many companies. There may be more challenges than advantages, however, when it comes to using cloud storage as the destination for traditional backups.
TOO MUCH DATA FOR THE CLOUD?
The first challenge is that traditional backup sends and stores a lot of data. Traditional backup systems typically perform full backups once a week, and even backup apps that don't perform repeated fulls on file systems (e.g., IBM's Tivoli Storage Manager) perform full backups on applications. (Many companies even perform daily full backups of some key applications.) In addition, all traditional backup applications perform full-file incremental backups: if even a single byte in a file has changed, its modification time is updated or its archive bit is set, and the entire file is included in that night's backup.
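The selection logic behind a full-file incremental is simple, which is exactly why it's wasteful. A minimal sketch, assuming a catalog mapping each path to the time it was last backed up:

```python
import os

def files_to_back_up(root, last_backup_times):
    """Select every file whose mtime is newer than its last backup time.

    last_backup_times maps path -> timestamp recorded at the previous run.
    Note that the entire file is selected even if only one byte changed.
    """
    selected = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_backup_times.get(path, 0.0):
                selected.append(path)  # whole file goes into tonight's backup
    return selected
```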
Both of these typical practices create a lot of data that's sent across the network and stored on the target device. If the target is a cloud storage service, that means significantly increased bandwidth requirements and higher charges to store the data in the cloud. Remember, traditional backup systems are the reason data deduplication was developed: backup applications can create roughly 20 GB on "tape" for every 1 GB on primary disk. At that ratio, a 10 TB data center would need to pay for approximately 200 TB of cloud storage every month.
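The arithmetic behind that estimate is worth spelling out; the 20:1 ratio is the rule of thumb quoted above, not a universal constant:

```python
primary_tb = 10        # primary disk capacity in the example data center
backup_ratio = 20      # ~20 GB retained on "tape" per 1 GB of primary data
cloud_tb = primary_tb * backup_ratio
print(f"{primary_tb} TB primary -> ~{cloud_tb} TB of cloud storage billed monthly")
```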

In addition to the cloud storage vendor's fees for disk capacity used and the amount of data transferred, there are the costs associated with having sufficient bandwidth to get the data to the cloud storage vendor. If you regularly create a 10 TB full backup and want to send it to the service over the wire, using a cloud storage vendor isn't likely to be practical. But even if your backup needs aren't that extreme, the behavior of traditional backup will make the cloud part of your backup system cost quite a bit.
The second challenge ironically involves one of the key advantages of using a cloud backup service: having backup data stored off-site. Assuming you solve the problem of getting the data off-site in the first place, you then have the problem of all of your data being in a different location than your servers. Obviously, this can significantly hamper your ability to meet your recovery time objectives (RTOs). This means that any copy of your data that's stored in the cloud should be just that, a copy. More specifically, it shouldn't be the copy you rely on for routine data recoveries. Using cloud storage as the only copy of large amounts of data that need to be transferred across the Internet is simply a disaster waiting to happen.
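Running the numbers on a WAN restore shows why. A back-of-the-envelope sketch; the link speed and efficiency factor are illustrative assumptions:

```python
def restore_hours(data_tb, link_mbps, efficiency=0.8):
    """Rough time to pull a restore across a WAN link.

    efficiency discounts protocol overhead; it and the link speed
    are assumptions for illustration only.
    """
    bits = data_tb * 8e12                      # decimal terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 3600

# Restoring 1 TB over a 100 Mbps line takes more than a full day.
print(f"{restore_hours(1, 100):.1f} hours")    # ~27.8 hours
```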
This sounds like a problem for data deduplication to solve, right? Sort of. A lot of backup software packages can deduplicate the data before sending it over the Internet. That can certainly address the challenge of getting the backups onto cloud storage, but it doesn't address the challenge of getting the data back. So the rule about not relying on a single copy of your backups stored in the cloud still applies whether you're able to use deduplication or not.


DATA BACKUP SOFTWARE CAN LINK TO THE CLOUD


There are now a number of companies with software and hardware products that support backing up to the cloud. The first backup application vendor to announce support was Zmanda Inc., a commercial firm that offers its version of Amanda, an open source backup program. Amanda Enterprise 3.1 is capable of backing up directly to Amazon's Simple Storage Service (S3) cloud storage service.
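Amanda's S3 integration is internal to the product, but the basic operation, pushing a backup image into an S3 bucket, looks roughly like this with Amazon's current Python SDK (boto3). The bucket and file names are made up for illustration.

```python
import boto3

s3 = boto3.client("s3")  # credentials are read from the environment/AWS config

# Upload a local backup image to a bucket; all names here are hypothetical.
s3.upload_file(
    Filename="/var/backups/fileserver-full.tar.gz",
    Bucket="example-backup-bucket",
    Key="fileserver/fileserver-full.tar.gz",
)
```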

If your company uses a backup application that doesn't yet support backing up to the cloud, you might want to consider Nasuni Corp.'s Filer, which provides an NFS/CIFS NAS gateway to cloud storage. Any decent backup software package can back up to an NFS or CIFS mount.
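Because the gateway presents itself as an ordinary mount point, no special integration is required; anything that writes files works. A minimal sketch, assuming the gateway is mounted at /mnt/cloud-nas:

```python
import tarfile
from datetime import date

# Write a compressed archive straight onto the mounted gateway;
# the mount path and source directory are assumptions for illustration.
target = f"/mnt/cloud-nas/backups/home-{date.today():%Y%m%d}.tar.gz"

with tarfile.open(target, "w:gz") as archive:
    archive.add("/home", arcname="home")  # the directory tree to protect
```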



MAKE SURE YOU TEST IT AND TEST IT AGAIN

Cloud-based backup services can be great complements to traditional backup systems, especially when those systems provide some level of integration. Because a cloud backup service will require little if any hardware to be installed on your site, it's relatively easy to perform a full proof of concept using real data. This is especially important because implementation may require substantial investments in licenses and have a profound effect on your backup environment. As with any backup product or service, you should test everything and believe nothing.

Vince Bailey




Why is cloud computing so hard to understand?

Why is cloud computing so hard to understand? It would be an equally fair question to ask why today’s Information Technology is so hard to understand. The answer would be because it covers the entire range of business requirements, from back-office enterprise systems to various ways such systems can be implemented. Cloud computing covers an equal breadth of both technology and, equally important, business requirements. Therefore, many different definitions are acceptable and fall within the overall topic.
But why use the term "cloud computing" at all? It originates from the work to develop easy-to-use consumer IT (Web 2.0) and its differences from existing difficult-to-use enterprise IT systems.
A Web 2.0 site allows its users to interact with other users or to change content, in contrast to non-interactive Web 1.0 sites where users are limited to the passive viewing of information. Although the term Web 2.0 suggests a new version of the World Wide Web, it does not refer to new technology but rather to cumulative changes in the ways software developers and end-users use the Web.
World Wide Web inventor Tim Berners-Lee clarifies, "I think Web 2.0 is, of course, a piece of jargon; nobody even knows what it means. If Web 2.0 for you is blogs and wikis, then that is 'people to people.' But that was what the Web was supposed to be all along. The Web was designed to be a collaborative space where people can interact."
In short, Web 2.0 isn’t new technology; it’s an emerging usage pattern. Ditto for cloud computing; it’s an emerging usage pattern that draws on existing forms of IT resources. Extending Berners-Lee’s definition of Web 2.0, the companion to this book, Dot Cloud: The 21st Century Business Platform, helps clarify that cloud computing isn’t a new technology: “The cloud is the 'real Internet' or what the Internet was really meant to be in the first place, an endless computer made up of networks of networks of computers."
"For geeks," it continues, "cloud computing has been used to mean grid computing, utility computing, Software as a Servicevirtualization, Internet-based applications, autonomic computing, peer-to-peer computing and remote processing -- and various combinations of these terms. For non-geeks, cloud computing is simply a platform where individuals and companies use the Internet to access endless hardware software and data resources for most of their computing needs and people-to-people interactions, leaving the mess to third-party suppliers."
Cloud’s birth in the new world
Again, cloud computing isn’t new technology; it’s a newly evolved delivery model. The key point is that cloud computing focuses on the end users and their abilities to do what they want to do, singularly or in communities, without the need for specialized IT support. The technology layer is abstracted, or hidden, and is simply represented by a drawing of a "cloud." This same principle has been used in the past for certain technologies, such as the Internet itself. At the same time, as the Web 2.0 technologists were perfecting their approach to people-centric collaboration, interactions, use of search and so on, traditional IT technologists were working to improve the flexibility and usability of existing IT resources.

The trend toward improving the cost and flexibility of current in-house IT capabilities by using virtualization can be said to be a part of cloud computing as much as shifting to Web-based applications supplied as services from a specialist online provider. Thus it is helpful to define cloud computing in terms of usage patterns or "use cases" for internal cost savings or external human collaboration more than defining the technical aspects.
There are differences in regional emphases on what is driving the adoption of cloud computing. The North American market is more heavily focused on a new wave of IT system upgrades; the European market is more focused on the delivery of new marketplaces and services; and the Asian market is more focused on the ability to jump past on-premise IT and go straight to remote service centers.
How the cloud shift affects front-office activities
There is a real shift in business requirements that is driving the "use" as a defining issue. IT has done its work of automating back office business processes and improving enterprise efficiency very well, so well that studies show the percentage of an office worker’s time spent on processes has dropped steadily. Put another way, the routine elements of operations have been identified and optimized. But now it's the front office activities of interacting with customers, suppliers and trading partners that make up the majority of the work.
Traditional IT has done little to address this, as its core technologies and methodologies of tightly-coupled, data-centric applications simply aren’t suitable for the user-driven flexibility that is required in the front office. The needed technology shift can be summarized as one from "supply push" to "demand pull" of data, information and services.
Business requirements are increasingly being focused on the front office around improving revenues, margins, market share and customer services. To address these requirements, a change in the core technologies is needed in order to deliver diversity around the edge of the business where differentiation and real revenue value are created. Web 2.0 user-centric capabilities are seen as a significant part of the answer.
The technology model of flexible combinations of "services" instead of monolithic applications, combined with user-driven orchestration of those services, supports this shifting front office emphasis on the use of technology in business. It's not even just a technology and requirement match; it’s also a match on the supply side. These new Web 2.0 requirements delivered through the cloud offer fast, even instantaneous, implementations with no capital cost or provisioning time.
This contrasts to the yearly budget and cost recovery models of traditional back office IT. In fact many cloud-based front office services may only have a life of a few weeks or months as business needs continually change to suit the increasingly dynamic nature of global markets. Thus the supply of pay-as-you-go instant provisioning of resources is a core driver in the adoption of cloud computing. This funding model of direct cost attribution to the business user is in stark contrast to the traditional overhead recovery IT model.
While cloud computing can reduce the cost and complexity of provisioning computational capabilities, it also can be used to build new shared service centers operating with greater effectiveness "at the edge" of the business where there’s money to be made. Front office requirements focus on people, expertise and collaboration in any-to-any combinations.
According to Dot Cloud, "There will be many ways in which the cloud will change businesses and the economy, most of them hard to predict, but one theme is already emerging. Businesses are becoming more like the technology itself: more adaptable, more interwoven and more specialized. These developments may not be new, but the advent of cloud computing will speed them up."

Cloud computing crime poses unique forensics challenges


Cloud services are relatively new, insofar as use by the general public for storage is concerned, said Martin Novak, physical scientist at the National Institute of Justice (NIJ). Over time, it's expected that clouds will contain more and more evidence of criminal activity. To help extract that evidence, the Department of Justice's research arm, the NIJ, recently revealed plans to fund research into improved electronic forensics in several areas, including the cloud.

Over time, the use of digital evidence in criminal and civil matters will continue to expand. Cloud providers and customers need to set up their infrastructures to meet these lawful requests or face fines and other legal repercussions. Furthermore, they need to do so without violating local privacy laws or accidentally giving away competitive secrets.
Swamped by justice
The demands of cloud forensics could prove costly as lawsuits and investigations become more complex. A 2009 study by McKinsey & Company found that electronic discovery requests were growing by 50% annually. This is mirrored by a growth in e-discovery spending from $2.7 billion in 2007 to $4.6 billion in 2010, according to a Socha Consulting LLC survey.
In the U.S., courts are becoming insistent on the need for systems to gather and preserve digital evidence. In early 2010, Judge Shira Scheindlin imposed sanctions on 13 parties that neglected to meet discovery obligations. She wrote, "Courts cannot and do not expect that any party can meet a standard of perfection. Nonetheless, the courts have a right to expect that litigants and counsel will take the necessary steps to ensure that relevant records are preserved when litigation is reasonably anticipated and that such records are collected, reviewed and produced to the opposing party."
The U.S. government has also attempted to expand the scope of data that can be lawfully requested without a warrant through a National Security Letter (NSL). In August, the Obama administration requested to add "electronic communication transaction records" to the data included in an NSL, which would require providers to include the addresses a user has emailed, the times and dates of transactions, and possibly a user's browser history. This will create a need to ensure that the provider's infrastructure can deliver on these requests in a timely manner.
The trouble with cloud forensics
"Cloud forensics is difficult because there are challenges with multi-tenant hosting, synchronization problems and techniques for segregating the data in the logs," said Keyun Ruan, a PhD candidate at the Centre for Cyber Crime Investigation in Ireland. "Right now, most of the cloud service providers are not open to talking about this because they don't know the issue ."
Traditional computer forensics must address the following steps: collection of media at the crime scene or location where the media was seized; preservation of that media; and validation, analysis, interpretation, documentation and courtroom presentation of the results of the examination. In traditional computer forensics, the evidence contained within the media is within the control of law enforcement from the moment of seizure. Assuming that the cloud in question is within the United States, the forensic challenges raised by cloud computing are related to control of the evidence, including collection, preservation and validation.
"With cloud computing, law enforcement does not have physical control of the media nor the network on which it resides," Novak said. "Many users will have access to a particular cloud. How does law enforcement seize only that portion of the media where the evidence may exist? How will they know if they have gotten everything that they will need during the analysis, interpretation, documentation and presentation phases?"
Another challenge comes from the massive databases used in customer relationship management systems and social graphs that current forensics cannot address.
At the moment, investigators rely on traditional evidence-gathering methods while documenting the steps taken by law enforcement during the seizure and examination phases. "While this approach might suffice in some cases," Novak said, "there is a paucity of case law specific to forensics as applied to cloud computing."
The non-localized nature of cloud computing will also usher in a debate about jurisdiction. As Novak noted, "One of the long-term issues related to cloud computing are those clouds that physically exist on a foreign server. What legal jurisdiction does law enforcement have in these cases? Do they have jurisdiction at all? Will the country in question be cooperative in terms of obtaining evidence?"
Developing a cloud forensics strategy
Organizations face numerous federal and state laws relating to the preservation of information related to taxes, securities and employment regulation. At the same time, they need to maintain compliance with other laws relating to the destruction of information that is no longer needed.
Cloud computing also raises new questions about who owns the data and the customer's expectations of privacy. Laws vary on the legal protections regarding data in the cloud from country to country. "We need to make a list of comparisons about privacy and how to deal with confidential data," Ruan said.
The case law around expectations of cloud privacy is still in its early stages. But in the recent case of State v. Bellar, Oregon Court of Appeals Judge Timothy Sercombe wrote, "Nor are a person's privacy rights in electronically stored personal information lost because that data is retained in a medium owned by another. Again, in a practical sense, our social norms are evolving away from the storage of personal data on computer hard drives to retention of that information in the 'cloud,' on servers owned by Internet service providers. That information can then be generated and accessed by hand-carried personal computing devices. I suspect that most citizens would regard that data as no less confidential or private because it was stored on a server owned by someone else."
In the long run, new cloud-based electronic discovery tools might help to keep these costs down. Companies including Orange, Autonomy, Clearwell and Kazeon have launched hosted services for collecting, preserving and analyzing digital evidence. Gartner research director Debra Logan said she expects that many corporations will start investing in e-discovery infrastructure and that, by 2012, companies without this infrastructure will spend 33% more to meet these requests.
The complex nature of cloud computing may lead to specialization. Bill Jeitner, CEO of BK Forensics, said, "Cloud forensics will be closer to a medical field analogy where you will have a general forensics practitioner and there will be different areas of cloud computing."
But like any tool, investigators want to get the most benefit at the least cost.
"You have to think in conjunction with other tools," Jeitner explained. "You are not going to do an analysis for thousands of dollars when you can get that same information easier. You don't go into everything 100%; you look at what you need to solve the crime."

The cloud: everything to everyone, or too good to be true?


As we increasingly see technologies redefined with the suffix “as a service”, are we entering a time of “everything as a service”? How will the future of the cloud change the way we work, bring new benefits to market and improve customer service? How can the cloud reinvent organisations?

IT decision-makers are already looking to the cloud to support new IT delivery models such as software-as-a-service (SaaS), platform-as-a-service (PaaS) and infrastructure-as-a-service (IaaS). And some experts argue many firms are moving towards an “everything-as-a-service” (EaaS) model using the cloud.
More and more traditional IT applications are being made available through the cloud, such as:
- Cloud data storage with data-retention IT consulting services that deliver a flexible solution for managing and retrieving data over the internet on a pay-as-you-go basis
- Cloud-based, on-demand identity and access management services that make it easier and more cost-effective for enterprise clients to securely extend and manage user access to cloud-based resources, while maintaining control over policies and governance
- Security-as-a-service, which gives companies the ability to quickly and easily add robust security services without purchasing expensive equipment, enabling customers to secure new locations with the latest security technology
- Enterprise mobility-as-a-service, which allows on-the-go workers to quickly and securely access their corporate networks, while making it easier for IT managers to manage a global mobile workforce

Cloud and the role of IT

Analyst Gartner sees the cloud playing a central role in the future development of IT organisations and their role within businesses. In its recently published Top Predictions for IT Organisations and Users for 2011 and Beyond report, Gartner says a clear linkage of IT investments and business results will become “an imperative” for IT organisations.
“With costs still under pressure, growth opportunities limited and the tolerance to risk low, IT faces increased levels of scrutiny from stakeholders both internal and external,” said Gartner analyst Darryl Plummer. “As organisations plan for the years ahead, our predictions focus on the impact this scrutiny will have on outcomes, operations, users and reporting. All parties expect greater transparency, and meeting this demand will require that IT become more tightly coupled to the levers of business control.”
Gartner said last year’s themes of rebalancing supply, consumer demand and regulation are still present, but the view has shifted further toward external effects, and the cloud plays an important part here. The analyst says that by 2015, tools and automation will eliminate 25% of labour hours associated with IT services. Cloud computing, says Gartner, will hasten the use of tools and automation as firms move towards self-service, automated provisioning and metering, for instance, to deliver services.
Forrester Research analyst James Staten says the cloud is not something firms should worry about; it is something they should just get on with, as business adoption is inevitable, despite a number of expected business cloud failures. Staten predicts that many cloud deployments will fail, but he says this experience should be put to good use: "This is a good thing, because through this failure you will learn what it really takes to operate a cloud environment. Knowing this, your strategy should be to fail fast and fail quietly. Don’t take on highly visible or wildly ambitious cloud efforts. Start small, learn, and then expand."
The potential of the cloud has also not been missed by government, particularly in the face of the economic downturn and in response to public sector cutbacks. The UK government has decided to back cloud computing, saying the technology offers “real economic benefits” for business and the public. Communications minister Ed Vaizey recently said cloud computing could drastically reduce costs for new companies and expand mobile capabilities. “Access to the networked resources provided by clouds enables companies to enter markets without having to meet the capital costs of building their own computer infrastructure,” Vaizey said. He also said the cloud would play a major role in addressing the explosion in the number of portable devices with limited storage capacity being launched onto the market and adopted by business.
“Access to clouds enables organisations to transcend that storage limitation and provide a level of functionality which would normally be associated with much larger machines,” he said. But Vaizey warned cloud providers that consumers and governments need to co-operate to ensure that issues such as individual privacy and data security are fully addressed: “Cloud computing is a good illustration of the need for international co-operation to ensure the very important developments on the internet are taken forward.”
The UK government is already involved in the development of its G-Cloud initiative, which will see all government departments step up their drive to share server, network and internet resources via the cloud in an attempt to cut costs in hardware, software, bandwidth and data storage. Local government is not being left behind either. Councils are being encouraged to share network and communications resources more widely to save cash, and the cloud is coming into play. Recently, the London boroughs of Merton, Kingston-upon-Thames and Sutton announced they intended to move their IT infrastructure to a “community cloud” model in the next 12 months, sharing common resources and citizen data via a secure hub located in the cloud.
Andrew Miller, head of information security in government at consultant PricewaterhouseCoopers (PwC), said, “Vaizey’s comments about the security and privacy challenges facing cloud computing are particularly apt. While providers have been quick to articulate the commercial proposition to attract customers, the security models to protect customer information within the cloud architecture are currently lagging behind. Hopefully this message from government will spur providers to embed security within the cloud architecture rather than attempt to bolt it on later.”

Cloud standards

The cloud industry is taking standards and security seriously. The Cloud Industry Forum (CIF), made up of cloud providers and other related technology suppliers, recently launched a code of practice for the delivery of cloud services. The aim of the code is to standardise and certify companies offering cloud computing services. The draft code, drawn up by more than 200 organisations, covers a wide range of issues, including security, operational issues, delivery, financial viability of suppliers, governance and technology standards and interoperability.
CIF chairman Andy Burton said, “We firmly believe that the market needs a credible and certifiable code of practice that provides transparency of cloud services, so that consumers can have clarity and confidence in their choice of provider. The market now has that benchmark.”
CIF member Phil Haylor added, “Cloud-based computing is growing at a phenomenal rate and so this sector needs control mechanisms. By laying down the code of practice, the CIF has established a credible gauge for customers to assess a vendor’s capability to deliver a robust and secure high quality cloud service. With this clarity of information in place the industry can move forward and be judged on its ability to deliver.”

Tesco in the cloud

As well as standards, the cloud also needs big commercial hitters to back it, and they don’t come any bigger than Tesco, the UK’s largest retailer. Tesco has adopted cloud web technology to support its popular Tesco Clubcard voucher campaign, which allows customers to double the value of their earned discount vouchers by registering on the main Tesco website.
Tesco uses website accelerator services hosted in the cloud by a third party to cope with this promotion, which takes pressure off Tesco’s already busy server farm. Tesco adopted this technology after “discovering that ISPs couldn’t support the kind of bandwidth Tesco demanded”. Ed Camp, head of server and storage at Tesco.com, said, “We knew it would be straightforward to plug the cloud services into our website to take the load from our infrastructure and provide the support we needed, no matter how much traffic visited our website.”
So with central and local government backing for the cloud in place, the support of big commercial operations, and emerging standards coming to the fore, perhaps the cloud really is becoming “everything to everybody”.