
Querying the ESO archive

As the big observatories of the world observe ever more astronomical objects, their archives become powerful research tools. Finding out whether an object has been observed with a certain instrument takes just a few mouse clicks, provided the observatory has a public archive, like the one ESO provides for all VLT instruments.

However, if you would like to perform more complex searches, the web interface may not be enough, even though it has become very convenient to use, especially together with Astropy's astroquery package.
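For instrument-specific queries, astroquery's ESO module can replace the web form. A minimal sketch, assuming SINFONI as the instrument and 'target' as the query-form field name (the actual field names can be listed with help=True on the query call); this is an illustration, not the script used for the project below:

```python
def query_sinfoni(target_name):
    """Query the ESO archive for all SINFONI observations of a target.

    astroquery is imported lazily so the sketch can be read and
    syntax-checked without the package (or network access) available.
    """
    from astroquery.eso import Eso  # pip install astroquery

    eso = Eso()
    # 'target' is assumed to match the field name of the ESO query form;
    # other form fields can be passed via column_filters the same way.
    return eso.query_instrument("sinfoni",
                                column_filters={"target": target_name})

# Example (requires network access):
# table = query_sinfoni("NGC 1068")
# print(len(table), "SINFONI datasets found")
```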

For a research project, I recently needed to find all local AGNs ever observed with a certain instrument (SINFONI at the VLT). Since I didn’t know the target names or programmes, I retrieved all unique observed coordinates, resolved them via Simbad (which also gives an object’s class) and then selected the AGNs among all the targets.

Since ESO unfortunately does not provide direct access to the archive database, a query like “give me all unique observed coordinates” is not possible per se. So I had to download all headers, parse the relevant information and build my own database (SQLite for the moment).
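Once the headers live in a local SQLite database, that very query becomes trivial. A minimal sketch with an invented table layout (the real schema is described further down):

```python
import sqlite3

# Illustrative schema: one row per dataset, coordinates in degrees.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sinfoni (dpid TEXT PRIMARY KEY, ra REAL, dec REAL)")
conn.executemany(
    "INSERT INTO sinfoni VALUES (?, ?, ?)",
    [
        ("SINFO.2015-04-13T04:46:11.730", 40.670, -0.013),
        ("SINFO.2015-04-14T05:02:33.120", 40.670, -0.013),  # same pointing
        ("SINFO.2015-05-01T01:10:00.000", 266.417, -29.008),
    ],
)

# "give me all unique observed coordinates"
coords = conn.execute("SELECT DISTINCT ra, dec FROM sinfoni").fetchall()
print(coords)  # two unique pointings
```

Each unique coordinate pair can then be fed to a Simbad resolver to classify the target.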

I wrote a script to collect the metadata. It does the following:

  • query the ESO archive for all observations of a day
  • then parse the resulting HTML file for the unique identifier of each dataset (“data product ID” or DPID, e.g. XSHOO.2015-04-13T04:46:11.730)
  • download the header for the given DPID
  • parse the header for relevant information and construct an SQL insert statement
  • insert all into a database
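Steps two and four can be sketched as follows. Both the DPID regex and the header layout are assumptions for illustration, not the actual script:

```python
import re

# Step 2: pull data-product IDs out of the archive's HTML response.
# DPIDs look like <INSTRUMENT>.<ISO timestamp with milliseconds>,
# e.g. XSHOO.2015-04-13T04:46:11.730.
DPID_RE = re.compile(r"\b[A-Z]+\.\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}\b")

def extract_dpids(html):
    """Return the unique DPIDs found in an archive result page."""
    return sorted(set(DPID_RE.findall(html)))

# Step 4: parse a FITS-style "KEY = value / comment" header dump and
# build a parameterised INSERT statement for a chosen set of keywords.
def header_to_insert(header_text, table, keywords):
    cards = {}
    for line in header_text.splitlines():
        if "=" in line:
            key, _, rest = line.partition("=")
            cards[key.strip()] = rest.split("/")[0].strip().strip("'").strip()
    cols = ", ".join(keywords)
    marks = ", ".join("?" for _ in keywords)
    sql = f"INSERT INTO {table} ({cols}) VALUES ({marks})"
    values = tuple(cards.get(k) for k in keywords)
    return sql, values
```

Using parameterised statements (the `?` placeholders) keeps odd characters in header values from breaking the SQL.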

There are also scripts that

  • query the database for all programmes and search metadata (PI/CoI names, titles) for them
  • get atmospheric data for all observations (querying the ambient conditions server)

Finally, a top-level script calls all of these in a meaningful order; I run it about once a month, or whenever needed, to update the database.

My database consists of one table per ESO instrument that I am interested in (currently MIDI, SINFONI and X-SHOOTER), a table with programme metadata (PI/CoI names and titles), and a table with atmospheric data, as well as tables with basic information about calibrators and science objects that I use for matching up observations and for building LaTeX tables in a scripted way. This setup has become quite handy over recent years: it helped me build the largest sample of AGNs observed interferometrically with MIDI (Burtscher et al. 2013) as well as the largest sample of local AGNs observed with SINFONI (Burtscher et al. 2015) and a follow-up paper (submitted).
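The scripted LaTeX tables amount to formatting query results row by row. A minimal, self-contained sketch (the column layout and the helper name rows_to_latex are invented for illustration):

```python
def rows_to_latex(rows, header):
    """Format (name, exptime_h) rows as a small LaTeX tabular."""
    lines = [r"\begin{tabular}{lr}", r"\hline",
             " & ".join(header) + r" \\", r"\hline"]
    for name, exptime in rows:
        lines.append(f"{name} & {exptime:.1f} \\\\")
    lines.append(r"\hline")
    lines.append(r"\end{tabular}")
    return "\n".join(lines)

print(rows_to_latex([("NGC 1068", 12.5), ("Circinus", 8.0)],
                    ["Target", "t [h]"]))
```

Regenerating the table from the database after every update beats editing LaTeX by hand.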

In case you are interested in tables that I have already compiled and am maintaining, please contact me and I will be happy to share the database with you. It is currently about 700 MiB and I update it every month.

Apart from the science, one can also use this database for other plots of interest, such as a map of the exposure depth of SINFONI:


Interestingly, the Galactic Center (at 17:45h, -29 deg) is not the field with the deepest SINFONI integration time (“just” about 400 hours). Instead, the Extended Chandra Deep Field South is the deepest SINFONI field, with about 600 hours of integration time. Another field with deep coverage is the COSMOS south field (10:00h, +02 deg), with about 300 hours of total integration time.


Update 12 Jan 2016: I have now put my code and the database online. Please see the GitHub project page for further details on how to use them.

Sync your life

If you would like to sync contacts and calendar entries wirelessly between your Mac and your iOS device(s) (such as iPhone, iPad, …), you can either subscribe to Apple’s “MobileMe” service at 79 € per year or set up a sync solution manually, e.g. using Google’s services. While “MobileMe” is certainly the much more convenient solution, it is also expensive.

Testing your memory using memtest

Not yours, actually, but your Mac’s. I recently got two 4 GB modules (yes, that’s 8 GB of memory… wow!) and wanted to check whether they were OK. Faulty memory modules can cause system crashes and all kinds of weird behaviour. The program for this purpose is memtest, which appears to be freely available under the GPL, although the official download website asks you to pay a small amount for the download. However, there are alternative download sites and even alternative frontends to the memtest command-line tool.

IDL and page orientation

In case you ever want to rotate a PostScript file that is in seascape orientation into ordinary landscape orientation (e.g. because IDL produces seascape PostScript files), you can use this ghostscript command:

gs -dBATCH -dNOPAUSE -sOutputFile=pstest.pdf -sDEVICE=pdfwrite "-dAutoRotatePages=/None" -c "<< /Orientation 1 >> setpagedevice" 0 rotate 0 0 translate -f pstest.ps
where of course pstest.pdf and pstest.ps are the names of the files.
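Shell quoting is the error-prone part of that command. If you call it from Python anyway, building the argument list explicitly sidesteps the quoting entirely (the options are copied from the command above; the helper name is my own):

```python
def seascape_to_landscape_cmd(ps_in, pdf_out):
    """Argument list for the gs command above. Note that gs treats
    every token after -c as PostScript until it reaches -f."""
    return [
        "gs", "-dBATCH", "-dNOPAUSE",
        f"-sOutputFile={pdf_out}", "-sDEVICE=pdfwrite",
        "-dAutoRotatePages=/None",
        "-c", "<< /Orientation 1 >> setpagedevice",
        "0", "rotate", "0", "0", "translate",
        "-f", ps_in,
    ]
```

Run it with subprocess.run(seascape_to_landscape_cmd("pstest.ps", "pstest.pdf"), check=True).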
A more convenient way is to rotate within IDL, though, e.g. by using David Fanning’s fixps.pro.

iOS 4 with free iPhone: Tethering still possible

I couldn’t find out on the internet whether Apple’s new iOS 4 would still allow tethering for free iPhones, e.g. iPhones bought in Italy (like mine) and used with any SIM card. So I had to try it out myself: it does.

Convert multiple ps files into pdf using ps2pdf

Unfortunately ps2pdf doesn’t accept multiple input files. But a script as simple as the following will convert all ps files in your working directory into pdf and delete the originals:

## Written by Leonard Burtscher (burtscher@mpia.de)
## 20 Apr 2010
## Convert all ps files of a directory into pdf using ps2pdf,
## then delete the original ps files.
filelist=`find . -name \*.ps`

for i in $filelist; do
        ps2pdf "$i"
        rm "$i"
        echo "Converted $i into pdf, deleted original file."
done


On the plural of AGN

In the current view of galaxy formation and cosmology most galaxies undergo phases of high nuclear activity, where they accrete lots of matter (compared to their Eddington accretion rate) and shine brightly. Such an object is then called an Active Galaxy and the thing in the center that actually shines and where lots of other stuff is happening is called an Active Galactic Nucleus or AGN.

Now what would one call two such objects? Certainly they are Active Galactic Nuclei, since the Latin word nucleus (second declension) has the plural ending -i in the nominative.

However, this does not mean that the plural of AGN necessarily must be AGN, AGNi or some other funny, but counter-intuitive, abbreviation.

In fact, AGNs is the only grammatically valid plural form of AGN. This is because the term AGN is so widely used that it has become lexicalized, i.e. it has become a word in itself. In other words: people actually say “The galaxy has an A-G-N” (instead of “The galaxy has an Active Galactic Nucleus”). And since regular English words take an ‘s’ in the plural, the plural of AGN is therefore AGNs.

Working in multiple locations

I guess it is becoming more and more common to work not just in one place but to have your office wherever you are. For years I have found it extremely practical to keep all my data on one mobile computer (currently a MacBook Pro) and work with this laptop in various locations. But different locations require different settings: the network setup may change, so may the printing system, the file systems you connect to, and many other things. Since I found it annoying to change all of this by hand every time, I put together a set of scripts with which all settings can be switched from one location to another with one simple shell command. When I get to my institute, I call a script called switchmpia.sh, which looks like this:

rm ~/.bash_profile
ln -s ~/.bash_profile_mpia ~/.bash_profile
sudo mv /etc/cups/client.conf.off /etc/cups/client.conf
osascript switch_network_mpia.osascript
echo "**************         Environment set to MPIA        ***************"
echo "Printer settings only become effective after restart of applications."

It changes the default .bash_profile to one that has some institute-specific settings (see below), changes the CUPS settings (so that I can use my USB printer at home and the CUPS network printers in my institute) and calls an Apple Script that changes my Mac’s network settings (see below).
My .bash_profile_mpia looks like this:
export MIDIDATA=/Volumes/astrodata/MIDIDATA
# …
echo "Environment: MPIA"
source ~/.bash_profile_generic
It sets a number of environment variables that I use in my scripts to values appropriate for the disks that I have available in my office, then it calls my generic .bash_profile.
The above-mentioned AppleScript switch_network_mpia.osascript looks like this:
tell application "System Events" to tell (process 1 whose frontmost is true) to click menu item "MPIA" of menu 1 of menu item "Umgebung" of menu 1 of menu bar item 1 of menu bar 1
It changes my Mac’s network location (my system uses a German localization, hence the menu item “Umgebung”) to the settings for my institute (mainly setting the correct proxy servers). A good website to learn how AppleScript can help with scripting system preferences is macosxautomation.com. The AppleScript was the trickiest bit for me, and I was happy to receive very helpful comments from Pierre L. in Apple’s discussion forum (see the link for more information). Thanks again, Pierre L.!

Privacy of letters

I recently got a letter back that I had sent to an outdated address. The envelope was a little damaged, and on the back I found a note that the letter had been received damaged back in Eppelheim (where I live). Interestingly, there was also a signature testifying to this. I take this as evidence of how seriously the privacy of letters is actually taken.

Recipient unknown -- letter not opened

When to submit to astro-ph

Two days ago I submitted my first, recently accepted, paper to the widely used preprint server arXiv/astro-ph. People have studied the long-term citation rates of astrophysics papers on astro-ph as a function of their position in the daily listing. Even astro-ph seems to have realized this and offers a page publishing part of the information relevant for being first in the listing. They do not tell you, however, at what time one has to submit in order to be first, nor how resubmissions are treated. So I had to find out myself.

I waited until the evening (22:00 CEST = 16:00 EDT) to upload my paper to the server and finally submitted it at 22:02:00 CEST. However, due to differences between the TeX system on astro-ph and the one on my computer, the paper didn’t appear as it should have, and I had to edit a number of things. I resubmitted the final, corrected version the next day, still before the deadline of course, at 16:29:51 CEST.

The paper appeared on position 21 (out of 25) in the Cosmology and Extragalactic Astrophysics section. This is what Konrad and I conclude:

  • Papers are sorted in descending order: the one submitted last appears at the top of the page.
  • Numbers are assigned in ascending order as soon as a paper is first accepted by the system (later submission means a higher number); the date and time at which you change your paper after its first acceptance by the arXiv system have no influence on its position.

The best time to have a paper accepted by astro-ph is therefore 21:59:59 CEST.

Update: As Konrad found out, the inverse is true for the position of the paper in the daily mailing. So if you want to be first in the mailing, submit at 22:00:00 CEST. If you want to be first on the web page, submit at 21:59:59 CEST.
