One story is dominating the astronomy news this week – the announcement of exoplanet and, in the words of some sections of the media, potential ‘Earth twin’ Kepler-22b. I don’t want to talk about that though; I want to give some attention to a few other interesting bits of astro-news that are in danger of being eclipsed. If you do want to discuss the much-hyped new planet however, it’s the topic for this week’s astronomy twitter journal club.
Firstly, a new era of submillimetre astronomy began at the James Clerk Maxwell Telescope in Hawaii with the unveiling of the SCUBA-2 camera. This large instrument is the successor to SCUBA, which I’ve written about here before. Indeed it’s SCUBA’s faster, more sensitive son, and should hopefully prove an invaluable tool in understanding the dusty Universe. To do this it has to be cooled to within 0.1 degrees of absolute zero, which the press release confidently states makes it “…colder than anything in the Universe that we know of”. Except, as was quickly pointed out on twitter, for the Planck satellite, which also has to operate at these chilly temperatures.
The next story on my list is an intriguing e-petition calling for astronomy weather reports to be included in the normal weather forecast:
We the undersigned request that The Met Office produces regular Stargazing / Astronomy focused weather information, to be shown as part of the BBC Weather reports. Not only would this be a boon to amateur astronomers, it will also help keep the study of Astronomy prevalent in the public consciousness, which in turn helps encourage the study of science, which should be a boon to the economy.
Onto the Hubble Space Telescope and the publication of the 10,000th refereed scientific paper using its observations. Reaching this milestone really reflects how Hubble has managed to remain a cutting-edge instrument throughout its long life. The paper in question reports the finding of the faintest supernova ever associated with a long duration gamma ray burst – you can read it here. I wonder if the Space Telescope Science Institute will send the authors a prize for their achievement?
I was going to end with this story reporting the discovery of the most massive black holes ever seen. However, I really want to devote a separate post to this (coming soon, hopefully). Instead, here’s one of the most iconic images of our own planet – the blue marble hanging in space – which was taken nearly 40 years ago today. Thanks to Galileo’s Pendulum for the tip.
More is better, right? Bigger telescopes and bigger surveys are both undoubtedly good things, but to make the best use of these advances we need to be able to handle the corresponding increase in data flow, and the resulting pressure on the astronomical archives that will have to cope with it.
This is a cross posting with the Astronomy Twitter Journal Club who are going to be discussing this topic on twitter (search for the #astrojc hashtag) this Thursday at 20:10 GMT. If you’re interested please come and join in.
This ‘data tsunami’ is almost upon us, according to a new paper by G. Bruce Berriman and Steven Groom. The recent addition of large datasets from the Spitzer and WISE telescopes has massively increased queries to the online Infrared Science Archive (IRSA), and, unsurprisingly, slowed down the response time of the database. This is only going to get worse as the archive’s growth is expected to accelerate over the next few years.
The paper also points out that how astronomers use archives is going to change. At the moment, raw datasets are typically downloaded and then reduced on a user’s own computer. However, once data reach petabyte scales it’s likely that they’ll have to be handled in situ, if only to avoid breaking the internet.
So what can be done? And, more importantly, can we do whatever we’re going to do in as cheap a way as possible? Firstly, we need better ways to search multiple online datasets efficiently – the excellent Virtual Observatory is already developing techniques to help here.
Next, we need to explore new technologies like cloud computing. The Square Kilometre Array (which will generate 10 gigabytes per second) will have theSkyNet, the (worryingly named) community based cloud which will harness the power of volunteers’ computers to process its data.
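To get a sense of what a 10 gigabyte-per-second instrument means for an archive, here’s a quick back-of-the-envelope calculation (the 10 GB/s figure is the one quoted above; the daily and yearly totals are just arithmetic):

```python
# Rough data-volume estimates for an instrument producing 10 GB every second.
RATE_GB_PER_S = 10

SECONDS_PER_DAY = 86_400
per_day_tb = RATE_GB_PER_S * SECONDS_PER_DAY / 1_000            # GB -> TB
per_year_pb = RATE_GB_PER_S * SECONDS_PER_DAY * 365 / 1_000_000  # GB -> PB

print(f"{per_day_tb:.0f} TB per day")    # 864 TB per day
print(f"{per_year_pb:.0f} PB per year")  # 315 PB per year
```

Hundreds of petabytes a year, in other words – far more than anyone will be downloading to a laptop.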
Finally we need to talk more, especially to IT experts in computer infrastructure, and then share what we’ve learned in the authors’ proposed new journal dedicated to information technology in astronomy. We then need to properly reward the effort people put into this area, as well as giving young astronomers a grounding in software engineering to better prepare them for this data-heavy future.
If we do all that then, the authors suggest, we’ll be able to survive the coming data flood. Fingers crossed.
G. Bruce Berriman & Steven L. Groom (2011). How Will Astronomy Archives Survive the Data Tsunami? ACM Queue. arXiv: 1111.0075v1
Guess what’s the largest hurdle impeding scientific progress in astronomy? Lack of money? Governmental disinterest? Nope, according to a paper published yesterday it’s our bad programming skills.
Modern astronomers are much more likely to be found in front of a computer these days than behind a telescope. We spend our time analysing our data and to do so we need to write computer programs. Some of these are pretty simple (`read table, make graph, join dots’), but often they’re much more sophisticated (`read table, decipher Universe’ etc). However, whilst we generally received some basic programming training during our degrees, we tend not to be proper software developers. As the authors of the paper point out this can mean that our programs:
…often contain the goto statement every 10–20 lines; names of variables do not
follow any conventions, i.e. a1, a2, aa1; the code is unreadable: no or bad indentations,
very long function bodies and/or source files. There is a lot of hard-coding of file and
device names, file system paths.
When the author is
returning to the same program after several months or years, he/she often finds that the
existing procedure/function calls do not satisfy his/her needs, however is not willing to
modify them to keep the backward compatibility. Then, a wrapper routine is created
which is calling some underlying procedures/functions in a slightly different way.
These aren’t the only flaws they highlight but they’re the ones, I’m embarrassed to admit, I might, perhaps, maybe inflict on my own code (though not the GOTO statement – I don’t do that)!
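For the non-programmers: a hypothetical Python sketch of the sort of thing the authors mean, alongside a tidier version. The variable names, file path and column layout here are all invented for illustration, not taken from the paper:

```python
# The kind of script the paper complains about: cryptic names,
# a hard-coded file path, and no hint of what the numbers mean.
def do1():
    a1 = open("/home/me/data/run3/final2_REAL.txt").readlines()
    a2 = []
    for x in a1:
        a2.append(float(x.split()[1]) * 2.5)
    return a2

# A tidier equivalent: descriptive names, the path and scale factor
# passed in as parameters, and a docstring saying what it does.
def read_scaled_column(path, scale=2.5):
    """Read the second column of a whitespace-separated table, scaled."""
    with open(path) as table:
        return [float(line.split()[1]) * scale
                for line in table if line.strip()]
```

Both functions do the same job; only the second one will still make sense when its author returns to it in six months.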
(GOTO as seen by Futurama from this page of programmer’s jokes)
So, is hiring software developers to code `properly’ for us the answer? Nope: our programs might be inefficient and unwieldy but
…at the end the program does what it is supposed to, because the author
knows exactly what it should do. Even though it may sometimes crash during run-time
or have very poor performance,
whereas a programming professional could write pretty code but might not spot an obvious flaw in the output. It’s not all bleak though, as they also give some examples of well-written pieces of astronomy software such as the excellent TOPCAT which Niall has guest-blogged about before over at AstroBetter.
The authors’ solution to the problem is to introduce mandatory undergraduate courses in these skills and to wean us all off the obsolete programming languages (in their opinion) that we insist on using. This sounds like a good idea, especially as this problem will only increase in future as the amount of data we have to handle rises. This only leaves the question of what to teach everyone which, as xkcd points out, is another kettle of fish entirely.
Igor Chilingarian & Ivan Zolotukhin (2010). The True Bottleneck of Modern Scientific Computing in Astronomy. Astronomical Society of the Pacific. arXiv: 1012.4119v1