Today I’m starting my new job as a Postdoctoral Associate at Duke University where I’ll be working with Pankaj Agarwal. I am now trying to find a car (anyone here selling a car in the Raleigh/Durham area?).
The list of accepted papers for ISAAC 2009 in Hawaii has now been posted.
I’m happy to see that my paper “Counting in the Presence of Memory Faults” is on the list (number 136). It is co-authored with Gerth Brodal, Allan Jørgensen, and Gabriel Moruz. The paper presents counting algorithms designed in the faulty-memory RAM model proposed by Finocchi and Italiano.
A resilient counting algorithm, as defined in our paper, is a data structure with an increment operation and a query operation. The query operation returns the number of increment operations that have been performed on the data structure, within an additive error.
It is simple to design such a data structure in the standard RAM using a single integer variable, but this does not work in the faulty-memory RAM, where arbitrary memory cells can get corrupted at any point in time. In the paper we present upper and lower bounds on the tradeoff between the additive error and the time used per operation.
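To give a feel for the model, here is a minimal toy sketch of the classic replication idea: store 2δ+1 copies of the counter and answer queries with the median, so up to δ corruptions cannot change the answer. This is just an illustration of resilience, not the tradeoff algorithms from the paper (it pays O(δ) time per operation for zero error; the paper studies what happens when you allow some error to go faster).

```python
class ReplicatedCounter:
    """Toy resilient counter for the faulty-memory RAM model.

    Stores 2*delta + 1 copies of the count; as long as at most `delta`
    memory cells are corrupted, the median of the copies is always an
    uncorrupted value.  Illustrative only -- not the paper's algorithm.
    """

    def __init__(self, delta):
        self.copies = [0] * (2 * delta + 1)

    def increment(self):
        # Increment every copy; costs O(delta) time per operation.
        for i in range(len(self.copies)):
            self.copies[i] += 1

    def query(self):
        # At most delta copies are corrupted, so the median of the
        # 2*delta + 1 copies is guaranteed to be a correct value.
        return sorted(self.copies)[len(self.copies) // 2]


# Example: tolerate up to 2 corruptions.
counter = ReplicatedCounter(delta=2)
for _ in range(10):
    counter.increment()
counter.copies[0] = 999  # simulate adversarial corruptions
counter.copies[3] = -5
print(counter.query())   # still reports 10
```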
I have finally created a twitter account, follow me while it’s still hot!
Also, it’ll be awesome, I promise!
A few months ago I made the following video and uploaded it to YouTube (an HD version is available there):
From the YouTube description:
This video shows which parts of Denmark will flood if the ocean rises. The flood maps were computed using the TerraSTREAM software package, and the results were visualized by importing the output rasters into the open-source GIS GRASS.
The algorithm used by TerraSTREAM takes dikes and other natural features into account and works directly on the original input terrain; the output was only down-sampled afterwards for visualization. The few seemingly disconnected flat areas are artifacts of this down-sampling.
The input grid DEM used in this computation was around 100 gigabytes. The video was used in a recent article (in Danish) about TerraSTREAM’s ability to compute flood maps like the one shown here, even on very large terrains. This type of flood map can be used in an initial screening phase when assessing flood risks.
In my opinion TerraSTREAM is a good example of I/O-efficient (external memory) algorithms put to use in efficiently solving real world problems on massive datasets.
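The core idea behind such a flood map can be sketched in a few lines: a cell floods only if it lies below the water level and is connected to the open ocean (here, the grid boundary) through cells that are also below the water level, which is exactly why dikes keep low-lying areas dry. The in-memory breadth-first search below is a hypothetical toy stand-in for the actual I/O-efficient TerraSTREAM computation, which handles grids far larger than main memory.

```python
from collections import deque

def flood(dem, rise):
    """Toy sea-level flood fill on a small elevation grid.

    A cell floods if its elevation is below `rise` AND it is reachable
    from the grid boundary (the ocean) through cells below `rise`.
    A ring of high cells (a dike) therefore protects the area behind
    it even if that area lies below sea level.
    """
    rows, cols = len(dem), len(dem[0])
    flooded = [[False] * cols for _ in range(rows)]
    queue = deque()
    # Seed the search from boundary cells that lie below the water level.
    for r in range(rows):
        for c in range(cols):
            on_edge = r in (0, rows - 1) or c in (0, cols - 1)
            if on_edge and dem[r][c] < rise:
                flooded[r][c] = True
                queue.append((r, c))
    # Spread the water to 4-connected neighbours below the water level.
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and not flooded[nr][nc] and dem[nr][nc] < rise):
                flooded[nr][nc] = True
                queue.append((nr, nc))
    return flooded

# A low-lying interior cell (elevation 1) protected by a dike (elevation 9):
dem = [[0, 0, 0, 0, 0],
       [0, 9, 9, 9, 0],
       [0, 9, 1, 9, 0],
       [0, 9, 9, 9, 0],
       [0, 0, 0, 0, 0]]
result = flood(dem, rise=2.0)
print(result[0][0])  # True: the open ocean floods
print(result[2][2])  # False: below sea level, but the dike keeps it dry
```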
Update: A similar, interactive example can be seen at our global flood map site.
I am currently managing the Aarhus University (and only Danish!) site for the Nordic Collegiate Programming Competition (NCPC 2008).
We have 5 teams here – a 250% increase from last year (which was itself an increase from the year before). We are two hours into the competition, and all the teams here are doing well, having solved at least two problems each. Right now, “MADALGO Men” is in the lead in the “student” class, and only one team in the open class has solved more problems! There are about 140 teams competing in total.
Take a look at the Live Scoreboard.
The live scoreboard can be seen here: http://acm.tju.edu.cn/toj/vcontest/ranklist1951.html .
We have about 15 people participating right now; only those who have submitted something show up on the scoreboard.
The contest ends at 19:00 Central European Time.
This is probably most interesting for students at Aarhus University. I’m coaching the teams from Aarhus that’ll go to the North-West European Regional Finals (NWERC 2008) in Utrecht. If you are interested in trying out for the teams you can read more here.
We have some talented people participating this year, and I think they’ll do well at NWERC :) This will be my fifth NWERC event. I participated in 2003, 2004 and 2005 and coached in 2006. I was at Duke during NWERC 2007, so I couldn’t go that year.
Remember: programming is fun!
There’s only one session left of the second day of SoCG 2008 here at the University of Maryland in College Park. The temperature is close to 40 degrees C and it’s very humid so everyone stays inside in the air-conditioned building of the conference venue. As usual it’s hard to dress such that you’re not completely fried when you go outside, and such that you don’t freeze when you’re inside. The conference is taking place at the business school, I’m definitely not used to seeing stock tickers in computer labs – fun detail.
Bardia presented our paper, I/O-Efficient Algorithms for Computing Contours on a Terrain, as the very first talk of the day. The audience raised a couple of interesting points which I’m sure we’ll think more about after the conference.
The closest thing to a conference excursion was two buses that took us to Dupont Circle in downtown Washington, DC for dinner yesterday. I went to a Korean restaurant with a couple of people and had a good Korean-style steak thingy, and I enjoyed walking around the Dupont Circle area, which I hadn’t explored on foot before.
Tonight is the conference banquet, that should be fun.