Cosmos Briefing: Supercomputing and Big Data

It’s hard to get your head around the size of datasets tackled by supercomputers, and the complexity of the problems they’re tasked to solve. Fortunately, the guest speakers at yesterday’s Cosmos Briefing – people using supercomputing resources for research and business – provided the answers.

Supercomputers, says astronomer Dr Karen Lee-Waddell, director of the Australia SKA Regional Centre (AusSRC), are “used for computing large intensive processes and so they can solve a multitude of complex math equations…numerous times over very large data sets”.

“They’re processing up to terabytes of data and storing petabytes of data – and that’s pretty much what we need them to do,” Lee-Waddell adds.

All this processing power comes – and runs – at an impressive cost.

“Gadi, the current supercomputer that we have here, was set up with a budget of around $70 million,” says Professor Sean Smith, director of the National Computational Infrastructure, in Canberra. “So they’re expensive to buy up front, but they’re also pretty expensive to run. Gadi runs at about two to three megawatts, 24 hours, 360 days a year. That’s about the size of a small-to-medium-sized suburb in terms of electricity draw.”

With delightful understatement, Smith describes the cost of running Gadi as “entirely non-trivial”.
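To put Smith’s numbers in perspective, here is a rough back-of-envelope sketch in Python using the figures from his quote (a two-to-three-megawatt draw, 24 hours a day, 360 days a year); the electricity tariff is a placeholder assumption for illustration, not a figure from the briefing.

```python
# Rough annual energy estimate for a machine drawing 2-3 megawatts
# around the clock, 360 days a year (figures quoted by Smith).

power_mw = 2.5                    # midpoint of the quoted 2-3 MW draw
hours_per_year = 24 * 360         # "24 hours, 360 days a year"
energy_mwh = power_mw * hours_per_year

# Hypothetical tariff, purely for illustration -- not a quoted figure.
assumed_price_per_mwh = 100.0     # A$/MWh

annual_cost = energy_mwh * assumed_price_per_mwh

print(f"Energy use: {energy_mwh:,.0f} MWh/year (~{energy_mwh / 1000:.1f} GWh)")
print(f"Illustrative bill at A${assumed_price_per_mwh:.0f}/MWh: A${annual_cost:,.0f}")
```

Even at that placeholder tariff, the arithmetic puts the electricity bill alone above A$2 million a year.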

The reason you need all this computational firepower is the ever-increasing size of datasets.

“When I started my career in industry, post my graduate studies, we were dealing with business questions that you could easily [solve] with spreadsheets,” says Ivan Galea, vice president, Analytics and Data Science at software maker Atlassian. “Now we’re dealing with millions of users, and that’s in a business company like Atlassian; for consumer companies like Facebook, Google, you’re dealing with billions-plus users, and every user is leaving a large thumbprint of events and actions.”

Without supercomputing, says Galea, “you’re not able to derive meaningful insights with the speed and the time that are necessary to derive content to make business impact”.

Not surprisingly, the future of supercomputing is bright, shaded only by the scale of the questions it will be asked to answer and the benefits it will likely bring to our lives.

Lee-Waddell says that, 10 years from now, supercomputing will answer all the biggest questions that we have: “Like how did black holes and stars originally form? Was Einstein right about gravity? And, you know, the big one: Are we alone in the universe?”

Smith cites the example of personalised medicine. “Genomics is going to explode,” he says. “It’ll be possible to have your genome read and stored and correlate that with other sorts of phenotypical data, correlated with smart-sensor data – what your Apple Watch is telling you about your blood pressure and so forth.

“And in that sense, [it will] be able to look to enhance health outcomes for individuals, and massively save money for the health system in terms of prevention of disease.”

Galea thinks the biggest business outcome will be the continued convergence of consumer products and business products. “Ten years ago, we didn’t have a smartphone with the power that we have today,” he says. “And within a decade, we were able to do things on a smartphone that far [exceed] the compute for rockets 50 years ago – first rocket that went to the moon.”
