Research Computing Gains Momentum at University
Research computing continues to grow at Syracuse University. Supported by Information Technology Services (ITS), enhanced resources offer University researchers more support, greater capacity and an expanding toolset. A National Science Foundation award in 2013 funded infrastructure upgrades that expanded data-intensive computational capabilities and increased the number and diversity of ITS’s interactions with research communities on campus and around the world.
“ITS continues to foster relationships with the research community on campus while engaging with national-level research efforts,” says Eric Sedore, associate chief information officer for Infrastructure Services in ITS. “These relationships shape the ways in which we build and maintain advanced infrastructure to support research computing on campus.”
The most recent development is Crush, an agile compute cloud composed of a loosely coupled set of heterogeneous hardware organized by the Crush Management Framework, a set of tools created by ITS. The Green Data Center houses the majority of Crush, with a smaller portion located in the Machinery Hall Data Center. Crush was developed specifically to support a variety of compute-intensive operations, with the ability to run several projects simultaneously without disruption.
Adjunct Professor Damian Allis and his team of experimentalists are working on four projects involving several hundred structure calculations. “Crush has been instrumental in allowing us to conduct our projects in parallel. Each project is very computationally demanding, some requiring several weeks,” Allis says. “In 2014, we would have prioritized the projects due to resource limitations. We are now queuing calculations for all four projects and producing high-level analyses. A year’s worth of work will be available to us almost nine months sooner than expected.”
Leveraging Crush’s computing strength, ITS worked with Heath Hanlin, associate professor and department chair of Art, Design and Transmedia in the College of Visual and Performing Arts, to construct a render farm. Hanlin produced “Branches,” a high-resolution, computationally complex 3-D animation, his newest exploration of line, light and sound.
“A computer artist is dependent upon the machine to realize the work. I could have built a personal render farm, but that would only go so far,” Hanlin says. “It was exciting to witness ITS’s depth of services and how responsive and excited they were to support the initiative.”
Using SideFX’s Houdini procedural modeling and rendering software, and a combination of scripting and programming languages, Hanlin completed “Branches” in a year. The first five months were dedicated to calculations and computational analysis.
“Crush outperformed initial estimates. We ran continuously on Crush for 90 days. We were anticipating five months,” Hanlin says. Rendering 14,452 frames, approximately 1.5 million compute hours, in 4K high frame rate (HFR) resolution produced a five-minute short at 48 frames per second.
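Hanlin’s figures hold together, assuming the 1.5 million hours refers to aggregate core-hours across the render farm rather than wall-clock time (the 90-day run was wall-clock). A quick back-of-the-envelope check; the per-frame figure below is derived, not stated in the article:

```python
# Sanity-check the "Branches" render figures.
# Assumption: "1.5 million hours" means aggregate core-hours across
# the farm, not wall-clock time (the continuous run took 90 days).

fps = 48                 # high frame rate: 48 frames per second
duration_s = 5 * 60      # five-minute short
frames_expected = fps * duration_s
print(frames_expected)   # 14400 -- close to the 14,452 frames rendered

total_core_hours = 1_500_000
frames_rendered = 14_452
hours_per_frame = total_core_hours / frames_rendered
print(round(hours_per_frame, 1))   # 103.8 -- roughly 104 core-hours per 4K frame
```

The small gap between 14,400 and 14,452 frames is plausibly test frames or a slightly longer final cut; the article does not say.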
Crush uses 10-gigabit connectivity and consists of 1,300 cores and six terabytes of memory. This summer, ITS will expand Crush to more than 8,000 cores and more than 20 terabytes of memory.
Crush is built upon the foundation established with OrangeGrid. “OrangeGrid (OG) is similar to Crush in scale but different in research support. Crush’s components—among them solid state drive storage and dual 10-gigabit connectivity—allow it to support additional forms of research,” Sedore says. “OG is a significant and vital resource to the campus, supporting a diverse population of researchers in need of a High Throughput Compute environment.”
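High Throughput Computing environments like OG run very large numbers of independent jobs rather than one tightly coupled parallel job. The article does not describe OG’s actual software stack, so the sketch below illustrates the model generically with Python’s standard library: a parameter sweep split into independent tasks that a pool of worker slots drains.

```python
# Illustrative sketch of the high-throughput model: many independent
# jobs with no communication between them, drained by a pool of
# workers. The task and pool here are stand-ins; the article does not
# name OG's scheduler or job format.
from concurrent.futures import ThreadPoolExecutor

def one_job(params: int) -> int:
    """Stand-in for a single independent compute job."""
    return params * params  # placeholder workload

def run_sweep(param_grid, workers=4):
    # Each grid point is its own job; aggregate throughput scales with
    # the number of worker slots, much as OG scales with available cores.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(one_job, param_grid))

print(run_sweep(range(10)))  # squares of 0..9
```

The defining property is that jobs never wait on each other, which is why workloads like Allis’s hundreds of structure calculations map so naturally onto a grid of donated cores.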
The NSF award funded vital infrastructure updates, including upgrading the campus network backbone from 10 to 40 gigabits per second and building connectivity from 1 to 10 gigabits per second. Since the upgrades, OG peaks at 15,000 central processing unit (CPU) cores overnight, a 30 percent increase since spring 2014. In June 2015 alone:
- OG provided 6.5 million hours of compute time.
- Eight SU researchers utilized nearly 2 million hours of compute time.
- OG shared 6 million hours of compute time with the Open Science Grid’s (OSG) research efforts in medicine, chemistry, physics, computer science and bioinformatics. OSG is a nationally funded research grid that connects organizations and enables them to share and utilize resources collectively.
Between November and January, research groups like the Syracuse University Gravitational Wave Group (SUGWG) consumed 600,000 compute hours, five times the amount consumed in 2012. Thirteen million hours were contributed to several public science initiatives, including Einstein@Home, Collatz Conjecture, NFS@Home, POEM@HOME, Rosetta@Home, malariacontrol.net, World Community Grid, climateprediction.net and Constellation.
Paired with the University’s Academic Virtual Hosting Environment (AVHE), OG continues to provide solutions to many researchers on campus. In addition to the growth in OG use, AVHE usage has more than doubled since 2014.
AVHE is a private virtual research cloud built to support small to moderate-sized research efforts. Currently, there are over 450 running virtual research machines (VMs) in the AVHE, and researchers use over 675 terabytes of storage. Virtualization provides flexibility and hardware sharing, allowing multiple researchers to operate simultaneously on an underlying server and storage infrastructure. Additionally, the AVHE provides high availability and automatically migrates workloads to alternate resources in the event of physical server failure.
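The failover behavior described above can be sketched as a simple placement loop: when a physical host fails, its VMs are reassigned to surviving hosts with spare capacity. The host and VM names below are hypothetical, and AVHE’s actual orchestration layer is not described in the article; this is only a minimal illustration of the policy.

```python
# Minimal sketch of high-availability VM migration, assuming a simple
# capacity-based placement policy. All names are illustrative only.

def migrate_on_failure(hosts, failed):
    """Reassign VMs from a failed host to surviving hosts with free slots.

    hosts: dict mapping host name -> {"capacity": int, "vms": list[str]}
    """
    orphans = hosts.pop(failed)["vms"]
    for vm in orphans:
        # Place each orphaned VM on the surviving host with the most free slots.
        target = max(hosts, key=lambda h: hosts[h]["capacity"] - len(hosts[h]["vms"]))
        if hosts[target]["capacity"] - len(hosts[target]["vms"]) <= 0:
            raise RuntimeError("no spare capacity for " + vm)
        hosts[target]["vms"].append(vm)
    return hosts

hosts = {
    "node-a": {"capacity": 4, "vms": ["vm1", "vm2"]},
    "node-b": {"capacity": 4, "vms": ["vm3"]},
    "node-c": {"capacity": 4, "vms": ["vm4", "vm5", "vm6"]},
}
after = migrate_on_failure(hosts, failed="node-a")
print(sorted(after["node-b"]["vms"]))  # both orphans land on the least-loaded host
```

A production system would also fence the failed host and restart VMs from shared storage, which is what lets migration happen without researcher intervention.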
Barry Davidson, a mechanical and aerospace engineering professor in the College of Engineering and Computer Science, is working with his graduate students to perform finite element analyses. His team first utilized a dedicated quad-core server, which required two hours to complete each run. They transitioned to OG and AVHE and decreased run times from two hours to just 10 minutes.
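Those run times amount to roughly a twelvefold speedup per analysis, which is what makes the larger models Davidson mentions tractable:

```python
# Speedup from the dedicated quad-core server to OG/AVHE.
old_run_min = 2 * 60   # two hours per finite element run
new_run_min = 10       # reported run time after the transition
speedup = old_run_min / new_run_min
print(speedup)         # 12.0 -- each run completes twelve times faster
```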
“Our current focus requires somewhat larger models, which we would not be able to consider without the use of AVHE, Crush or OrangeGrid,” Davidson says.
Postdoctoral biology researcher Kirill Borziak and Associate Professor of Biology Steve Dorus, in collaboration with Weeden Professor of Biology Scott Pitnick and Biology Professor John Belote, all in the College of Arts and Sciences, utilized OG to run a covariance analysis of genes from 15 fly species, and AVHE to implement pipelines for assembly and analysis of next-generation sequencing data.
“The number of nodes available on OrangeGrid is impressive. We’ve been using OrangeGrid since October, and slowly we are increasing our load,” Borziak says. “As of now, we have 15,000 jobs submitted to OrangeGrid. Our local computer has only 12 CPU cores. The additional nodes from OrangeGrid have sped up our research tremendously.”
For more information about research computing at SU, including how to put it to work on one of your research projects, contact Sedore at 315-443-3534 or firstname.lastname@example.org.