These Clouds Have Silver Linings

We recently had the opportunity to discuss cloud computing with several prominent hosting companies, including GoDaddy, Hostway, and Verio. We also spoke with two companies that help build large-scale datacenters: Green Platform Corporation (GPC), which provides a leading-edge anti-vibration equipment rack, and ZT Systems, a leader in manufacturing standards-based servers built to each customer’s requirements. Our goal was to identify, and share with our readers, the factors these companies consider critical to building and maintaining a rapidly scaling hosting platform. For readers interested in cloud solutions, we also investigated how to make the cloud accessible to businesses of all sizes, especially SMBs, and the benefits such businesses can expect from moving IT resources into the cloud.

While few can agree on a firm definition of “the cloud,” it may not matter very much. What cloud offerings have in common is that they are services that can be turned on and off as needed, without the end user having to master the underlying technologies. For example, customers don’t need to know how to provision server hardware, add a network drop, or install an operating system. According to Ivan Hurtt, Sr. Product Manager of Managed Computing Solutions for Verio, “Customers don’t necessarily care what software and hardware they are using. They just want the complexity gone and the solution provided.”

Customers turn to the cloud for flexibility. A business might run a pilot project in which it needs 50 servers for two weeks. Traditionally, it would have to buy the servers, which would require a substantial capital investment as well as the knowledge and resources to install and configure them. Now, it can simply ask a cloud provider to spin up 50 server instances within minutes, and spin them down just as easily when the pilot is done. The cloud provider takes care of the underlying technologies and the customer reaps the benefits. According to Flavio Andrade, Hosting Product Manager at GoDaddy, “We are in the business of enabling everyone’s dreams of being successful. We need to stand behind them when they take off. With our 4G hosting environment we can transparently scale customer sites across servers and clusters.”
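To make that elasticity concrete, here is a minimal sketch of what such a pilot project looks like from the customer’s side. It uses the public AWS EC2 API via the boto3 Python library purely as a stand-in; the providers interviewed each expose their own interfaces, and the image ID, region, and instance type below are placeholder values.

    # Minimal sketch of elastic provisioning, using AWS EC2 via boto3
    # as a stand-in for any cloud provider's API. The image ID, region,
    # and instance type are placeholders, not real values.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Spin up 50 instances for the two-week pilot.
    response = ec2.run_instances(
        ImageId="ami-00000000",    # placeholder machine image
        InstanceType="m1.small",   # placeholder instance size
        MinCount=50,
        MaxCount=50,
    )
    instance_ids = [i["InstanceId"] for i in response["Instances"]]

    # ... run the pilot ...

    # When the pilot ends, spin the instances back down just as easily.
    ec2.terminate_instances(InstanceIds=instance_ids)

No hardware to buy, rack, or cable: the provider absorbs all of that complexity behind a handful of API calls.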

To accomplish this abstraction, GoDaddy, Hostway, and Verio have dedicated a great deal of time, effort, and resources to designing and building their cloud solutions to be as modular and automated as possible. The underlying hardware and software are standards-based so they can be swapped in and out as needed. “We provide some of the largest service providers and content delivery networks with standards-based hardware designed to meet their requirements,” says Brent Miller, VP of Sales for ZT Systems. Brian Krouse, GoDaddy’s Senior Director of Product Development for Hosting, explained the importance of standards-based computing to cloud service providers: “Elasticity and scalability are critical. At our scale we absolutely need standards because in a heterogeneous environment you end up trying to manage a mess. Our hardware (servers, storage, networking devices) is all standardized and we generally use commodity hardware with our in-house developed management software.”

While standards-based computing is important, each of the hosting providers interviewed feels that the real value they add is in the proprietary tools they’ve written to provision and manage their clouds. Hurtt of Verio says, “We provision servers very quickly. We can turn on a new server in less than three seconds, move and consolidate data on the fly…. Most of the really cool stuff that enables us to offer cloud services directly to consumers is in the proprietary back-end we’ve built.” Likewise, Krouse of GoDaddy stresses, “you really have to be efficient, and the rate of growth in the datacenter needs to be automated.”

Why do cloud service providers write their own tools instead of using open-source or commercially available management software? Krouse offers an explanation: “We wrote a lot of our platform ourselves. We rely heavily on open source and have hundreds of developers on staff. You can’t just take hardware and software off the shelves and build an ISP.” In addition, service providers must be on the cutting edge of technology because they face challenges that most other businesses don’t, such as the deleterious effect on performance and availability of massive amounts of vibration from hard drives, power supplies, and other components. For this reason, GPC provides an anti-vibration solution built around a tunable carbon fiber rack which, according to CEO/CTO Gus Malek-Madani, “has been shown to increase storage system performance by more than 50%.”

In many ways, the cloud is about bringing enterprise-class equipment and services to the masses. As providers take advantage of economies of scale, so can consumers. “In many ways, we are the next channel for our technology partners as companies are changing the ways in which they consume technology,” says Aaron Hollobaugh, VP of Marketing at Hostway. Through a recently announced partnership with StillSecure, Hostway has expanded the number of security services it provides and dropped protection prices to very aggressive levels. Security is equally important at GoDaddy, where a 24/7 security operations team maintains vigilance against distributed denial-of-service (DDoS) attacks.

In the virtual world, just as in the real world, no two clouds are exactly alike, yet to the untrained eye the similarities are overwhelming. Moreover, cloud service providers are building at such scale that even a small cost savings or a performance tweak can yield widespread and long-term improvements. In a recent Wired article, “Mystery Men Forge Servers for Giants of Internet,” my former PC Mag colleague (and current Wired editor) Cade Metz writes about the desire of the world’s largest internet companies to “buy servers designed specifically for their sweeping online operations. Because their internet services are backed by such an enormous number of servers, these companies are looking to keep the cost and the power consumption of each system to a minimum. They want something a little different from the off-the-shelf machines purchased by the average business.”

According to Metz, “the net’s biggest names have caused a tectonic shift in the worldwide server market. These are the companies that need more servers than anyone else on the planet, and they’re moving away from traditional server makers such as Dell and HP.” John Mills of ZT Systems backs up Metz’s assertion: “ZT Systems has been designing and building computers in the USA for over 17 years, and for the last 5 we’ve seen tremendous growth in demand for our solutions.” And why stop at custom-designed servers? Economies of scale can be tapped further with specialized equipment such as GPC’s anti-vibration, carbon fiber–based rack. Again, this is not something the average business would buy, but it is something the average business can benefit from as it uses cloud services built on top of this cutting-edge hardware.

Andrade of GoDaddy explains the significance of working closely with vendors: “For us and our customers to benefit, a service provider needs to innovate and move forward. Many times this is accomplished by working closely with vendors.” Verio’s Hurtt puts it this way: “Even with standards-based hardware, there is always a slightly different way to do something. Any generic hardware design needs some tweaking for our environment, and while it is great to have an independent standards body that can make sure products adhere, we also need to go beyond standards in order to provide the best solution for our customers.”

Cloud Market Wrap-Up

Cloud adoption continues to expand. In a report published in April 2011, Forrester estimated the total size of the public cloud market at $25B and projected it to grow to $160B by 2020. In addition, a Morgan Stanley survey (May 2011) found that public cloud usage is expected to grow at a 23% CAGR through 2014.
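As a quick sanity check (our arithmetic, not the analysts’), the Forrester projection implies a compound annual growth rate of roughly 23% over the nine years from 2011 to 2020, consistent in magnitude with Morgan Stanley’s near-term estimate:

    \text{CAGR} = \left(\frac{\$160\text{B}}{\$25\text{B}}\right)^{1/9} - 1 \approx 1.229 - 1 \approx 23\%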

SMBs are leveraging the cloud to significantly reduce their hardware costs, while larger enterprises are using public clouds mainly to rapidly extend the functionality of internally developed applications, with mobility extensions among the most common use cases.

Furthermore, larger organizations are accessing the cloud to execute compute-intensive workloads such as analytics. While 37% of the companies surveyed indicated they would use public cloud deployments exclusively, the remainder are gravitating toward a mix of the three deployment models: public, private, and hybrid.

Research shows that companies view cloud computing as a way to:

  • Lower IT costs,
  • Increase corporate agility, and
  • Provide the foundation for leading edge, flexible architectures that will result in higher quality systems.

However, the top barriers to cloud adoption continue to be:

  • Security,
  • Regulatory and compliance issues,
  • Data privacy,
  • Reliability,
  • Interoperability,
  • Vendor lock-in, and
  • Complexity.

Interestingly, customers don’t always feel that public clouds are less secure than their own data centers.

GigaOM ranks Amazon as the largest public cloud provider, with 2010 revenue of $500 million, and Amazon continues to expand AWS through investment. Rackspace, Salesforce, Microsoft, Google, IBM, and other global players are also aiming to be providers of public cloud infrastructure and services, making multi-million-dollar investments to remain competitive on offerings and price.

VA Hospitals Leverage Technology to Provide Greater Access to Healthcare

We recently had the opportunity to speak with Leonard Goldschmidt, M.D., Ph.D., National Director for the Department of Veterans Affairs (VA) Diabetic Teleretinal Screening Program, about the VA’s use of different technologies and how these technologies increase the quality of and access to health care. From smartphone apps that help prevent and diagnose illness to the extensive use of tele-health technologies, VA hospitals are at the forefront of improving patient care through the use of technology.

The VA, which spends more than $500 million a year on research programs, currently has over 50,000 patients nationwide participating in tele-health programs from home. Over 700,000 diabetic patients have had their retinas imaged by retinal cameras since the program’s inception in 2006. Other patients answer questions via web sites, telephone, or smartphone, and use home-based devices such as stethoscope- and blood pressure-equipped peripherals to track their condition on a daily basis. Results are analyzed by nurse practitioners, who follow up directly with the patient. This home-based care coordination program has improved patient care at the VA because caregivers have access to more accurate and timely patient status information. Dr. Goldschmidt says, “more timely and better information means that I can treat my patients better,” and right now the VA is gathering the right information.

As long as significant security issues hang over cloud computing, there is an overall hesitancy to adopt any technology that may compromise the security and privacy of a health record. All patient records, including charts, scanned images, and diagnostic images, are stored in an electronic health record (EHR) system that has been in use since 1999. The VA patient database is “second in size only to that of the IRS, and the security of our health records is paramount,” according to Dr. Goldschmidt. For this reason, all technology solutions must be vetted internally before being deployed, because they affect the health care process. The future of VA healthcare seems bright, because there is both internal and public demand for technology solutions that improve the health of our nation’s veterans.

The views expressed here are Dr. Goldschmidt’s, not the VA’s.