NSF grant, new tools give SU supercomputing power
November 10, 2003
Jill Leonhardt, jlleonha@maxwell.syr.edu
In 1996, health care analysts at the University of Virginia studied a common emergency-room procedure, right heart catheterization, in critically ill patients. They used classical parametric modeling methods, that is, methods that assume the data follow a model with a fixed functional form governed by a small set of parameters, and reported that, counter to long-held assumptions, use of this procedure appeared to be related to an increase in death rates. Though parametric modeling methods are widely used and accepted, they rest, like so many other complex assessments, on certain assumptions and their attendant limitations.
Then, in 2001, another group of analysts, including Maxwell School of Citizenship and Public Affairs economics professor Jeffrey Racine, used computationally intensive “nonparametric” statistical methods they had recently developed to reassess the efficacy of right heart catheterization, using data obtained from UVA. The nonparametric approach involved feeding huge amounts of data on the procedure into a supercomputer, with a minimum of accompanying assumptions. “Once we eliminated the rigid assumptions that were built into the original analysis, a completely different conclusion emerged. We found that the procedure, if anything, lowered the death rate for critically ill patients,” says Racine.
This is just one example of how nonparametric modeling, which relies on actual patterns present in the data rather than a range of potentially false human assumptions, may yield a completely different result from the traditional parametric approach. But until recently, the enormous cost of access to the supercomputers necessary to do this sort of computationally intensive analysis put nonparametric capabilities out of reach for many institutions.
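The contrast can be sketched in a few lines. The following is a hypothetical illustration (not drawn from the UVA study): a straight-line parametric model imposes a false assumption on curved data, while a kernel-based nonparametric fit lets the patterns in the data speak for themselves.

```python
import math

# Toy data: a nonlinear relationship that a straight line cannot capture.
xs = [i * 0.1 for i in range(63)]          # x in [0, 6.2]
ys = [math.sin(x) for x in xs]             # the true curve is sin(x)

# Parametric fit: ordinary least squares for the assumed model y = a + b*x.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx
linear_pred = [a + b * x for x in xs]

# Nonparametric fit: Nadaraya-Watson kernel regression with a Gaussian kernel.
# No functional form is assumed; each prediction is a local weighted average.
def kernel_fit(x0, bandwidth=0.3):
    weights = [math.exp(-((x - x0) / bandwidth) ** 2 / 2) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

kernel_pred = [kernel_fit(x) for x in xs]

def mse(pred):
    return sum((p - y) ** 2 for p, y in zip(pred, ys)) / n

# The data-driven fit tracks the curve; the line's false assumption costs it dearly.
print(mse(linear_pred) > mse(kernel_pred))
```

The tradeoff is computational: the linear fit needs a handful of sums, while the kernel fit touches every data point for every prediction, which is why large nonparametric analyses call for supercomputing power.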
This is about to change at the Maxwell School. With a grant of $162,810 from the National Science Foundation, including a cost share of $48,843 from the SU Office of Sponsored Programs, Racine has purchased a computer cluster – known as a “Beowulf cluster” – to be housed in the Center for Policy Research (CPR). The center’s new cluster can do the same kind of sophisticated nonparametric data analysis that formerly only supercomputers could do.
Scientists like Racine have figured out that instead of acquiring a traditional supercomputer, they can hook together clusters of commodity “off-the-shelf” processors, such as those available from Intel and AMD, to leverage the individual computing capability of each. Thus, Beowulf cluster computers are little more than collections of tightly coupled desktop processors, the kind one can buy at any computer store, hooked together using open source software and libraries developed by a consortium of government and private sector groups in the 1990s. Such a cluster efficiently breaks a large job into smaller chunks and assigns each chunk to an available processor. With the falling prices of commodity processors and the availability of free open source software such as GNU/Linux, these clusters can provide supercomputing speed at a fraction of the price of a traditional supercomputer. In fact, the third fastest supercomputer in the world is currently a cluster computer.
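The divide-and-conquer idea behind the cluster can be sketched as follows. This is a minimal illustration using Python's standard thread pool as a stand-in for the cluster's processors; a real Beowulf cluster would distribute the chunks across separate nodes with message-passing libraries.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """The work done by one processor: its piece of a big summation."""
    return sum(x * x for x in chunk)

# One large job: sum the squares of a million numbers.
data = list(range(1_000_000))

# Scatter: split the job into equal chunks, one per worker.
n_workers = 4
size = len(data) // n_workers
chunks = [data[i * size:(i + 1) * size] for i in range(n_workers)]

# Each worker computes its chunk independently and in parallel.
with ThreadPoolExecutor(max_workers=n_workers) as pool:
    partials = list(pool.map(partial_sum, chunks))

# Gather: combine the partial results into the final answer.
total = sum(partials)
assert total == sum(x * x for x in data)   # same answer as the serial job
```

Because the chunks are independent, adding more processors shortens the job roughly in proportion, which is the economic appeal of building throughput from cheap commodity parts.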
The NSF grant has funded the purchase of a 37-node cluster of dual hyperthreaded 3.06 GHz Xeon processors; with two physical processors per node, each presenting two logical processors, the cluster delivers the throughput of up to 148 processors while occupying only the space of two over-sized filing cabinets. Racine says that at least nine faculty members and as many as 30 graduate students will benefit initially from this massive new supercomputing capability, with that number growing as other Maxwell research faculty develop projects that take advantage of the Beowulf cluster computer’s capabilities.
“CPR has a number of faculty and students who work with numerically intensive econometric methods or who routinely struggle with computational aspects of modeling large datasets. With this cluster computer, we will be able to teach students not only the theory of these methods but how to apply them to large, real-world databases like those with which they will work after graduation,” says Racine. He anticipates an enhanced ability to “develop policy prescriptions based on nonparametric modeling instead of the standard parametric approach.”
Vice Chancellor and Provost Deborah A. Freund, who is herself an economist, shares Racine’s enthusiasm, not only about the new computing capability but also about having won the NSF grant. “SU has firmly established itself as a leading research institution and, as such, we are increasingly dependent on outside support to fund our cutting-edge work. I applaud Professor Racine for his diligence, for going after this grant and for convincing the NSF that our top-notch economics department and Center for Policy Research deserve its support.”
Racine has a number of projects planned for the cluster, including using simulation to verify theoretical conjectures about newly developed nonparametric estimators, and reassessing union and gender wage gaps with these methods.
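A simulation study of this kind might look like the following hypothetical sketch, which checks the textbook result that a kernel density estimator's error shrinks as the sample grows. It illustrates the general method only; it is not one of Racine's actual estimators or conjectures.

```python
import math
import random

random.seed(7)

TRUE_F0 = 1.0 / math.sqrt(2 * math.pi)   # standard normal density at x = 0

def kde_at_zero(sample):
    """Gaussian kernel density estimate at x = 0, bandwidth h = n^(-1/5)."""
    n = len(sample)
    h = n ** -0.2
    return sum(math.exp(-(x / h) ** 2 / 2) for x in sample) / (n * h * math.sqrt(2 * math.pi))

def mean_sq_error(n, reps=200):
    """Monte Carlo estimate of the estimator's error at sample size n."""
    errs = []
    for _ in range(reps):
        sample = [random.gauss(0.0, 1.0) for _ in range(n)]
        errs.append((kde_at_zero(sample) - TRUE_F0) ** 2)
    return sum(errs) / reps

# Theory predicts the error falls as the sample grows; the simulation agrees.
err_small, err_large = mean_sq_error(50), mean_sq_error(800)
assert err_large < err_small
```

Real studies of this sort repeat such experiments across thousands of designs, sample sizes, and bandwidths, which is exactly the embarrassingly parallel workload a Beowulf cluster handles well.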