Deblur Famous / Interesting Photos
While developing research tools for deblurring blurry images caused by camera shake, we ran into many real-world examples that are either historically famous or interesting from a research point of view. We had so much fun with them! That's why we are sharing some of them with the community on this page, to stimulate scientific discussion and inspire future research.
If you know of or have famous or interesting photos that you want us to give a try, please feel free to drop me an email at arphid AaTt GOOGLE EMAIL DO0OT C0M. Please just send me a link to the photo, no email attachments. Please read the following disclaimer before contacting me.
1. The ONLY purpose of this webpage is for entertainment and scientific research.
2. Please only send blurry photos caused by camera shake; we cannot deal with motion blur or defocus blur.
3. We only have limited time for this besides work and life. So, if you sent me something and didn't hear back from me for a while, that's completely normal. There could be multiple reasons: we are too busy; your photo is not interesting to us (sorry); or your photo is not reasonable (sorry again). We cannot respond to every request or deblur every picture. We appreciate your understanding.
4. As our research is evolving rapidly, we may update the results with better ones later.
5. When you send us something, you need to make sure we have the rights to use and display it for research purposes.
(entries added 10/23/11 and 10/26/11)
Robert Capa’s Omaha Beach photo (June 6, 1944)
Image description: this is perhaps the most famous blurry photo in American history. It was taken by Robert Capa on June 6, 1944, and captures an American soldier landing on Omaha Beach on D-Day, in Normandy, France. What you see here is a scanned version (there are multiple versions on the Internet). This photo is interesting to us because of its extremely bad quality. Look at the noise and artifacts added by the scanner. It's an extreme test for our algorithm.
(Blur kernel recovered, visualized at 4×)
To me the system does a reasonable job. It recovers some details that you cannot easily see in the original. Of course the noise gets boosted somewhat; we applied a small amount of noise removal to the output, but a decent denoising algorithm could help further here. I am not saying the output is a better picture than the original: for artwork, nothing beats the original. The most interesting thing is the estimated blur kernel, or in other words, how Robert Capa moved his camera. The kernel shown above not only depicts the camera trajectory, but also shows how long the camera stayed at each spatial location (white means longer, black means shorter). It seems Robert kept the camera steady for a while, then suddenly moved it to the left before the shutter closed completely. Of course, all of this happened in the fraction of a second that his shutter was open. :)
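Our actual pipeline is far more involved, but the blur model behind this kernel picture can be sketched in a few lines. Everything below is illustrative, not our implementation: the kernel weights record how long the camera dwelt at each offset, and blurring is (approximately) a weighted sum of shifted copies of the sharp image.

```python
import numpy as np

def shake_kernel(trajectory, size):
    """Build a blur kernel from (dy, dx, dwell) samples of a camera path.

    Brighter kernel pixels = the camera dwelt longer at that offset,
    matching the white/black interpretation described above.
    """
    k = np.zeros((size, size))
    c = size // 2
    for dy, dx, dwell in trajectory:
        k[c + dy, c + dx] += dwell
    return k / k.sum()  # a blur kernel must sum to 1 (it redistributes light)

def blur(image, kernel):
    """Blur as a weighted sum of shifted copies (cross-correlation;
    true convolution would flip the kernel, which doesn't matter here)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(image, dtype=float)
    for y in range(kh):
        for x in range(kw):
            out += kernel[y, x] * padded[y:y + image.shape[0],
                                         x:x + image.shape[1]]
    return out

# A toy trajectory in the spirit of the Capa kernel: steady at the centre
# for most of the exposure, then a quick two-pixel move.
traj = [(0, 0, 8), (0, -1, 1), (0, -2, 1)]
K = shake_kernel(traj, 7)

sharp = np.zeros((32, 32))
sharp[16, 16] = 1.0          # a single bright point
blurry = blur(sharp, K)      # the point is smeared along the trajectory
```

A single bright point blurred this way turns into a small copy of the kernel itself, which is exactly why a recovered kernel can be read as the camera's trajectory.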
The bottom image was provided by Wez Crozier from the UK, who combined the original and the deblurred one into a much nicer composite. It shows that deblurring could be part of your workflow for creating something really nice.
How our research started: Hwanho Sunrise Park in Pohang, Korea
Image description: we have some special feelings about this one, since it was an early motivation for my colleagues in Korea to start working on this problem. This shot was very blurry, and at the beginning we all thought it was impossible. Then one midnight, it worked!
(Blur kernel recovered)
The most interesting thing to me is the size of the blur kernel: the camera trajectory is almost 55 pixels long. It means that to recover a single pixel, we have to consider at least 55 nearby pixels (in practice far more than that). The final output looks good. I especially like the trees in the background, and the small black structure on the right side of the temple (not sure what it is, a chair maybe?). They need to fix some broken tiles on the roof. :)
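To get a feel for why a long trajectory makes this hard, here is a generic non-blind deconvolution sketch (a textbook Wiener filter, not our algorithm; our system also has to estimate the kernel, which is the hard part). Once the kernel spans dozens of pixels, every observed pixel mixes light from that whole neighbourhood, so inverting the blur is a global operation, conveniently done in the frequency domain.

```python
import numpy as np

def wiener_deblur(blurry, kernel, snr=1e-6):
    """Invert a (circular) convolution blur with a known kernel.

    The snr term regularizes frequencies the kernel nearly destroyed;
    without it, 1/H would explode wherever H is close to zero.
    """
    H = np.fft.fft2(kernel, s=blurry.shape)   # kernel's transfer function
    G = np.conj(H) / (np.abs(H) ** 2 + snr)   # regularized inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(blurry) * G))

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))

# A 9-pixel horizontal "shake": every blurry pixel averages 9 neighbours,
# so no pixel can be recovered from local information alone.
k = np.ones((1, 9)) / 9
blurry = np.real(np.fft.ifft2(np.fft.fft2(sharp)
                              * np.fft.fft2(k, s=sharp.shape)))

restored = wiener_deblur(blurry, k)
```

With a noise-free input and the true kernel, this toy inversion is nearly exact; real photos add noise and an unknown kernel, which is where the actual research lives.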
This bad camera shake reminds me of another scientific application: measuring how shaky someone's hands are by asking them to take a picture. Imagine in the future, when your car is stopped by a police officer, you will be offered two choices: a breathalyzer or a camera!
Image description: of course we are not the only group working on deblurring. Microsoft Research has an interesting paper in which they recorded gyro sensor readings while taking the picture, then deblurred the picture using that additional sensor information. Below is one of their examples. We don't have the gyro data; we just fed the image to our system, and it worked pretty well overall.
The interesting thing about this example is the person in the photo; you can see an enlarged zoom-in version above. The person has some ghosting artifacts. Why? Because the person was moving fast, so she has both motion blur and camera-shake blur on her. Since the camera shake is relatively small, the motion blur dominates those pixels, and we are applying the wrong blur kernel to them. This is a limitation of the algorithm: you would have to manually remove the artifacts on the person somehow. A future research direction!
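The failure mode is easy to reproduce numerically. The toy 1-D experiment below (illustrative only, not our system) blurs one signal with a short "shake" kernel and another with a long "motion" kernel, then deconvolves both with the shake kernel, as any single-kernel method would. The first is restored well; the second keeps large residual artifacts, the 1-D analogue of the ghosting on the moving person.

```python
import numpy as np

def circ_blur(x, k):
    """Circularly convolve a 1-D signal with a kernel via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k, n=x.size)))

def wiener_1d(signal, kernel, snr=1e-6):
    """1-D Wiener deconvolution with an assumed-known kernel."""
    H = np.fft.fft(kernel, n=signal.size)
    G = np.conj(H) / (np.abs(H) ** 2 + snr)
    return np.real(np.fft.ifft(np.fft.fft(signal) * G))

rng = np.random.default_rng(1)
scene = rng.random(256)

shake = np.ones(3) / 3      # small camera-shake blur (static background)
motion = np.ones(15) / 15   # long motion blur (the moving person)

blur_background = circ_blur(scene, shake)
blur_person = circ_blur(scene, motion)

# Deconvolve BOTH with the shake kernel, as a single-kernel method would.
bg_restored = wiener_1d(blur_background, shake)
person_restored = wiener_1d(blur_person, shake)

bg_err = np.max(np.abs(bg_restored - scene))          # small
person_err = np.max(np.abs(person_restored - scene))  # large: "ghosting"
```

Handling such spatially varying blur, where different regions need different kernels, is exactly the open problem mentioned above.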