The National Science Foundation thinks scientists need their own clouds

The National Science Foundation is investing $20 million to help launch two cloud computing testbeds that will allow scientists to experiment with new types of computing architectures. The agency is giving $10 million apiece to the projects, called Chameleon and CloudLab, in the hope that they will help scientists figure out whether different types of hardware, processors or distributed designs are better suited to particular computing challenges.

Chameleon will comprise 650 computing nodes and 5 petabytes of storage, and will give researchers access to a broad range of hardware options. According to a press release, these will include “low-power processors, graphics processing units (GPUs) and field-programmable gate arrays (FPGAs), as well as a variety of network interconnects and storage devices. Researchers can mix-and-match hardware, software and networking components and test their performance.”

CloudLab aims to accomplish similar results, albeit with a very different architecture — its 15,000 processing cores and 1 petabyte of storage will be spread across three university data centers. According to the release, “Each site will have unique hardware, architecture and storage features, and will connect to the others via 100 gigabit-per-second connections on Internet2’s advanced platform, supporting OpenFlow (an open standard that enables researchers to run experimental protocols in campus networks) and other software-defined networking technologies.”

The testbeds seem like a wise investment within the realm of science, where certain applications and research areas can almost certainly benefit from on-demand access to infrastructure that won’t make its way into public clouds anytime soon. Many scientists have already benefited from the sheer scale of resources available on public cloud platforms such as Amazon Web Services, Microsoft Azure and Google Compute Engine, but the limited hardware options and configurability of those platforms also restrict the types of jobs that can be run. Security and performance concerns can likewise keep research off public clouds when it involves sensitive data or requires guaranteed fast network connections.

However, like any private or specialized cloud effort before it, the NSF’s NSFCloud initiative (under which these two projects fall) will likely have to figure out how to match the user experience that public clouds provide. A clunky experience might suffice for testing out architectures that will eventually be deployed locally on physical gear, but if the goal is to host production jobs and achieve real results on NSF-funded infrastructure, the features (limited, though always expanding) and ease of use that commercial clouds provide might prove hard to resist.

Image copyright Chris Coleman, School of Computing, University of Utah.
