UIC News
The University of Illinois at Chicago
Software links cloud computing data centers


Robert Grossman (center), director of the National Center for Data Mining, UIC computer scientist Yunhong Gu and Joel Mambretti of Northwestern demonstrate their Sector software at the annual meeting of the American Association for the Advancement of Science.

Photo: Paul Francuch


Whimsically named computer “clouds” have a new way of talking that, while not as loud as thunder, moves as fast as lightning and brings many together in conversation.

Cloud computing extends the ability to compute beyond a single PC or a rack of computers to an entire “data center”: floors, or a whole building, full of linked computers ready to team up and tackle complicated problems.

There are hundreds of data centers all over the world. But until now, it was almost impossible to distribute a computation — basically, information processing — across two or more data centers.

Enter “Sector,” distributed computing platform software developed at the UIC-based National Center for Data Mining and demonstrated last weekend at the annual meeting of the American Association for the Advancement of Science.

Robert Grossman, center director and professor of mathematics and computer science, says Sector successfully performed a computation across computer data centers located from coast to coast at UIC, the StarLight computer network switching facility at Northwestern University, Johns Hopkins University in Baltimore and the University of California, San Diego.

“Sector can work with very large data sets — terabytes in size — that are distributed across data centers,” Grossman says.

“We now have high-performance networks that can be used to connect data centers, but until Sector, there hasn’t been the software that would allow you to do a large data computation across data centers at scale.”

Massive search engines like Google and Yahoo, along with other big Internet companies such as Microsoft and Facebook, use cloud computing as the foundation for the applications they provide to users, but the cloud software they use works with only one data center at a time.

Sector allows computations to span multiple centers simultaneously and quickly, as Grossman and his colleagues successfully demonstrated at the AAAS meeting, using the dreamy-sounding Open Cloud Testbed, facilities run by the Open Cloud Consortium.
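Sector itself is distributed computing platform software, and the article does not describe its programming interface. As an illustrative sketch only, the following Python snippet shows the general pattern the article describes: one logical computation fanned out across several sites, each processing its own slice of the data, with the partial results combined at the end. The site names and the toy workload are hypothetical stand-ins, not Sector's actual API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical data slices held at the four sites named in the
# demonstration. In a real deployment each slice could be terabytes;
# here they are tiny lists so the sketch is runnable.
DATA_CENTERS = {
    "UIC": [1, 2, 3],
    "StarLight": [4, 5, 6],
    "JohnsHopkins": [7, 8, 9],
    "UCSD": [10, 11, 12],
}

def process_slice(records):
    """Per-site stage: run the computation on locally stored records."""
    return sum(records)  # placeholder workload

def distributed_compute(centers):
    """Fan the work out so every site processes its own slice in
    parallel, then combine the partial results into one answer."""
    with ThreadPoolExecutor(max_workers=len(centers)) as pool:
        partials = pool.map(process_slice, centers.values())
    return sum(partials)

print(distributed_compute(DATA_CENTERS))  # -> 78
```

The point of the pattern is that the data never has to be gathered into a single data center: each site works on what it already holds, and only the small partial results travel over the network.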

“It worked very well,” he says. “We were very pleased.”

He said distributing a computation across four data centers simultaneously showed only about a 5 percent performance penalty compared with using a single data center, thanks to the robust nature of Sector.

Sector was developed over the past two years under the leadership of National Center for Data Mining research scientist Yunhong Gu. Open-source software, free and available to any scientific or commercial project, Sector has built-in security features and is relatively easy to use, Grossman says.

That’s a key attribute, adds Joel Mambretti, director of the International Center for Advanced Internet Research at Northwestern University and a partner in the Sector project.

“Existing architectures and methods are proprietary and closed. This is the only cloud architecture technology based on open standards and open architecture, and it’s the only architectural approach that can handle large data sets,” Mambretti says. “That’s why the science community is excited about this.”

Grossman says linking data centers is necessary because it’s unrealistic to expect everything you need will be housed at a single data center. He adds that Sector’s more overcast style of cloud computing makes it possible to build data centers in locations that are cheaper to operate, and where electricity rates to power and cool computers are lower.

“You can reduce the total cost of data centers and have your computations work transparently to the location of the data,” Grossman says. “That would be a significant transition.”

francuch@uic.edu



