Model of Corporate Networks Could Improve Stability

Because they are so complex, servers at corporate data centers are often stuck running obsolete software on inefficient networks. A new computer model from researchers at MIT could enable IT managers to better monitor their information infrastructure.

2011-10-14

Corporate information infrastructure is complex and unpredictable, which makes IT managers reluctant to change it and leaves network designs inefficient. A new model of this kind of infrastructure, however, could improve predictions of how changes to a network will play out.

[Image: Corporate servers often consist of scores of processors and connections.]

Researchers at MIT’s Department of Civil and Environmental Engineering unveiled the computer model at the Institute of Electrical and Electronics Engineers’ (IEEE's) Cluster Conference in September.

The model can predict how changes will affect complex networks of up to 64 or even 128 separate processors, overcoming a key limitation of previous models: servers at corporate data centers are especially hard to analyze because they are connected to many other servers, both at the data center and around the world. While previous researchers have modeled collections of servers at data centers, the new research models “every processor in every server, every connection between processors and disk drives, and every connection between servers and between data centers.” It even models how processing tasks are distributed across a network.
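The researchers' code has not been published, so the following Python sketch is purely illustrative of that level of detail. The Processor, Link, DataCenter, and Network classes are hypothetical stand-ins for the entities the article describes: every processor, every processor-to-disk connection, and every link within and between data centers.

```python
# Illustrative sketch only: the MIT model is not public, so every class
# and field name here is an assumption, not the researchers' design.
from dataclasses import dataclass, field

@dataclass
class Processor:
    server: str           # identifier of the server this CPU belongs to
    clock_hz: float       # cycles the processor can execute per second

@dataclass
class Link:
    src: str              # endpoint identifiers (processor, disk, or server)
    dst: str
    bandwidth_bps: float  # capacity of the connection

@dataclass
class DataCenter:
    name: str
    processors: list = field(default_factory=list)
    disks: list = field(default_factory=list)
    links: list = field(default_factory=list)   # processor-disk, server-server

# A network is then a set of data centers plus the wide-area links between
# them, mirroring the "every processor ... every connection" granularity.
@dataclass
class Network:
    datacenters: list = field(default_factory=list)
    wan_links: list = field(default_factory=list)  # data center to data center
```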

“We take the software application and we break it into very basic operations, like logging in, saving files, searching, opening, filtering—basically, all the classic things that people do when they are searching for information,” Sergio Herrero-Lopez, a graduate student in CEE, explained in a statement.

Even a single operation can be incredibly complex, according to Herrero-Lopez, but the model accounts for all of the computational resources an individual task requires. “It takes [into account] this many cycles of the CPU, this bandwidth, and this memory,” he said.
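Again as an illustration only, one way to encode that idea is a catalog of basic operations, each tagged with the CPU cycles, bandwidth, and memory it consumes. Every name and figure below is invented, since the model's actual cost data is unpublished.

```python
# Hypothetical per-operation resource accounting; the numbers are made up.
from dataclasses import dataclass

@dataclass
class Operation:
    name: str
    cpu_cycles: float       # CPU work the operation consumes
    bandwidth_bytes: float  # data moved over links
    memory_bytes: float     # working memory required

# The "classic things people do when searching for information",
# each broken down into the resources it needs.
CATALOG = [
    Operation("log_in",    cpu_cycles=2e6, bandwidth_bytes=4e3, memory_bytes=1e6),
    Operation("save_file", cpu_cycles=8e6, bandwidth_bytes=5e5, memory_bytes=2e6),
    Operation("search",    cpu_cycles=5e7, bandwidth_bytes=1e5, memory_bytes=3e7),
    Operation("open",      cpu_cycles=3e6, bandwidth_bytes=2e5, memory_bytes=4e6),
    Operation("filter",    cpu_cycles=2e7, bandwidth_bytes=5e4, memory_bytes=1e7),
]

def cpu_seconds(processor_hz: float, op: Operation) -> float:
    """Crude lower bound on the CPU time one operation needs."""
    return op.cpu_cycles / processor_hz
```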

The researchers tested their model on networks at Ford Motor Co., where it accurately predicted the behavior of network processes. Because the computer model is modular, it can readily be adapted to different companies’ networks and needs.

“You can use the simulator for performance estimation and capacity planning, or to evaluate hardware or software configurations or bottlenecks—what happens if a link between two data centers is broken?—or for background processes or denial-of-service attacks,” Herrero-Lopez said. “It’s nonintrusive—you’re not touching the system—it’s modular, and it’s cheap.”
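To make the broken-link scenario concrete, here is a toy what-if experiment in the same spirit; the site names and topology are invented, and a simple reachability check stands in for the full simulator. Remove a wide-area link from a hypothetical network and see which data centers can still reach one another.

```python
# Toy what-if experiment: which sites survive a broken wide-area link?
from collections import deque

def reachable(links: set, start: str) -> set:
    """Breadth-first search over undirected links from a starting site."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for a, b in links:
            neighbor = b if a == node else a if b == node else None
            if neighbor is not None and neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

# Invented three-site wide-area topology.
wan = {("boston", "detroit"), ("detroit", "austin"), ("boston", "austin")}

print(reachable(wan, "boston"))        # healthy network: all three sites
# What happens if the links into one data center are broken?
degraded = wan - {("boston", "detroit"), ("detroit", "austin")}
print(reachable(degraded, "detroit"))  # detroit is now isolated
```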

Source: Smarter Technology