Athena doesn’t look like much. Just a bunch of black computing towers, punctuated by cooling units, in a nondescript room. But looks can be deceiving.
Named for the Greek goddess of wisdom, Athena is currently the most powerful
computer on the UW campus, helping physicists and astronomers tackle fundamental questions about our universe.
The push to acquire Athena was spearheaded by David Kaplan, director of the Institute for Nuclear Theory (INT), Tom Quinn, professor of astronomy, and Richard Coffey, director of IT for physics and astronomy. They were responding to what can be a frustrating Catch-22 for scientists whose research involves complex calculations — a field known as computational science. Scientists can apply for time on a huge computer at a national lab, but they will likely be turned down if they cannot demonstrate expertise at using such machines efficiently.
The team envisioned a UW supercomputer—not nearly as powerful as the machines at national labs, but about 1,000 times more powerful than an individual computer workstation—that could push existing research projects to new extremes and familiarize scientists with working with computers on a grand scale.
After several failed attempts to fund Athena through grants, Kaplan turned to the UW’s Office of Research, which was looking for ways to support e-Science (science that links many computers). The Office of Research joined with the College of Arts and Sciences to provide $700,000 for the project.
“They gave a lot of money up front, believing this would work,” says Kaplan. “That took vision.”
The remaining funding came from departments willing to invest in the
project. In the end, three A&S units ponied up: the Department of Astronomy, the Department of Physics, and the INT. The computer is housed in the UW’s Center for Experimental Nuclear Physics and Astrophysics.
Brain Power to Harness Computing Power
Athena’s power equals that of 133 high-end PC servers working in close communication.
It calculates at nearly ten teraflops (Tflops), or about ten trillion calculations per
second. But how that power and speed are harnessed, and how those PCs communicate, is complicated. So complicated, in fact, that the team would not consider purchasing Athena without hiring a computational science expert to program it.
“Scientists may know how to program their desktop, but programming a supercomputer requires a whole other set of skills because it is doing many tasks simultaneously,” says Kaplan. “Careful choreography is required so that individual computations can be done in the right order and assembled into something useful, in a way that takes advantage of the speed of the machine.”
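The choreography Kaplan describes can be sketched at a small scale: divide a computation into independent pieces, run them simultaneously, and reassemble the partial results in the right order. The following is a minimal illustration in Python's standard multiprocessing module, not Athena's actual code; the task (summing squares) is a stand-in chosen only because it splits cleanly.

```python
from multiprocessing import Pool

def partial_sum(chunk):
    """Worker task: sum the squares in one slice, independently of the rest."""
    lo, hi = chunk
    return sum(x * x for x in range(lo, hi))

if __name__ == "__main__":
    # Divide the problem into equal-sized pieces, one per worker process.
    chunks = [(i * 250_000, (i + 1) * 250_000) for i in range(4)]
    with Pool(processes=4) as pool:
        # map() returns results in submission order, so the pieces
        # reassemble correctly even though they finish at different times.
        partials = pool.map(partial_sum, chunks)
    total = sum(partials)
    # Same answer as the serial computation, assembled from parallel parts.
    assert total == sum(x * x for x in range(1_000_000))
```

Real scientific codes are far harder to split than this, because the pieces usually depend on one another; managing that dependence is exactly the skill Kaplan is pointing to.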
That’s where Jeff Gardner comes in. Gardner received a Ph.D. in astronomy at the UW and then spent five years at the Pittsburgh Supercomputing Center (PSC), programming one of those massive national computers, before returning to the UW as a senior research scientist to help with Athena.
“A truly amazing part of this collaboration was the hiring of Jeff,” says Coffey, who led the national search for the position. “Three departments pooled their limited resources to hire an expert in the field, filling this often overlooked gap between the science and the computing.”
Gardner is currently working with faculty on about a dozen research projects that use Athena. Some faculty meet with him sporadically, others weekly. All pay for his time through research grants.
“Every project, every scientific code is different, so it has to be parallelized in a different way,” says Gardner. “You need a set of tools and experience to figure out how to go about it.”
Gardner also assists in writing grant proposals, “because that’s where plans formulate,” he explains. “What we don’t want is for faculty to propose something that our technology can’t do.”
Computing Complex Interactions
What Athena can do is impressive. Research projects range from the grandest scale—studying the universe—to the smallest, looking at atomic interactions. What all have in common is the complexity of interactions being studied.
One example is Tom Quinn’s study of structure formation in the universe. Quinn looks at the creation of our galaxy and neighboring galaxies. He does this, in part, by gathering measurements of remote objects—dating back to when the universe was about 100,000 years old—and comparing them to the galaxies we see today, factoring in the role of gravity.
Athena By the Numbers

10 trillion calculations. In one second, Athena can compute 10 trillion calculations, compared to 10 billion on the average PC.

9 months. From conception to deployment, it took nine months to get the Athena cluster in place.

One quarter. That's the fraction of the purchase price used to cool and provide power to Athena.

1,024 cores. Most modern computers have two cores; Athena has 1,024.

20.7 billion pages. Athena can store the equivalent of 20.7 billion pages of data.

40 hairdryers. Athena's heat output is equivalent to 40 hairdryers, all blowing at once.

15 minutes. Without an integrated cooling system, the room housing Athena would overheat in just 15 minutes.
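The first figure squares with the earlier claim that Athena is about 1,000 times more powerful than an individual workstation, a ratio worth checking directly:

```python
# Figures from the sidebar above (orders of magnitude, not exact specs).
athena_ops_per_sec = 10e12  # ~10 trillion calculations per second
pc_ops_per_sec = 10e9       # ~10 billion on the average PC

speedup = athena_ops_per_sec / pc_ops_per_sec
print(speedup)  # 1000.0 -- the "1,000 times a workstation" figure
```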
“That sort of calculation can’t be done on the back of an envelope,” says Quinn, massively understating the computational challenge. “With pencil and paper, you can figure out how three objects interact with gravity. But the universe has billions of objects in each galaxy.”
Clearly Quinn needs tremendous computing power to handle such calculations. But just having multiple processors do the math simultaneously won’t work. “The issue is how to get all those processors to work together,” says Quinn. “It’s not a problem I can easily divide up, because the calculations are all very interconnected.”
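Quinn's point about interconnectedness shows up even in the most naive gravity calculation: every object pulls on every other, so with N objects there are roughly N² pairwise terms, and no object's answer is complete until it has touched data from all the others. The toy direct-sum sketch below illustrates the structure only; production simulation codes use far more sophisticated algorithms.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def accelerations(positions, masses):
    """Direct-sum gravitational acceleration on each body from all others.

    The double loop makes the coupling explicit: computing acc[i] requires
    the position and mass of every j, which is why the work is hard to
    divide cleanly across processors.
    """
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r = math.sqrt(sum(d * d for d in dx))
            f = G * masses[j] / r**3
            for k in range(3):
                acc[i][k] += f * dx[k]
    return acc

# Two equal 1 kg masses 1 m apart: each accelerates toward the other.
a = accelerations([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]], [1.0, 1.0])
```

At galactic scales, with billions of objects, even clever approximations leave an enormous amount of tightly coupled arithmetic, which is precisely what machines like Athena exist to handle.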
Much of Quinn’s research requires using a massive computer at a national center. But before he can tap into that resource, he needs to devise algorithms—with Gardner’s help—that will work on a multi-processor system.
“I need something I can test on,” says Quinn. With Athena, he is able to try different algorithms and compare results. He has the luxury of time for testing because his department has part ownership in the computer. And if a result leads to additional questions, he can pursue those immediately.
“It happens fluidly,” says Quinn. “National centers aren’t set up to do that sort of thing.”
More Power, More Grants
When Athena was proposed, the units funding the project believed it would eventually pay for itself by helping to generate new grants. Within months of its arrival, that already started to happen. Several major NSF grants tied to Athena have been funded, with others pending.
Some challenges remain. While programming and maintaining the system have been manageable, the administration involved in sharing the computer and related personnel across departments has been daunting. Yet all agree that the benefits of Athena overshadow any administrative headaches.
“In 2001, the fastest computer on the planet for unclassified research was six teraflops,” says Gardner, “and thousands of scientists across the country had to compete with one another for time on it. Athena is ten teraflops and is shared among just three departments.
“It is truly remarkable that we have access to so much computing right on campus.”