Matt C pointed me at a really cool site, astrometry.net. You upload a photo of the night sky you took and it does a bunch of image recognition to figure out where you were pointing. Sort of obvious in retrospect that it’s possible; it’s a more general version of what self-aligning scope computers do. But much harder to do with arbitrary photos at various scales and angles. Also super cool to have as a web service. The Flickr thing is clever, too.
| Field | Value |
|---|---|
| Center (RA, Dec) | (11.843, 65.819) |
| Center (RA, hms) | 00h 47m 22.285s |
| Center (Dec, dms) | +65° 49′ 09.090″ |
| Size | 29.7 × 44.5 deg |
| Pixel scale | 160 arcsec/pixel |
| Orientation | Up is 46.8 degrees E of N |
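The degree and hms forms of the RA are just two notations for the same angle: 360° of right ascension maps to 24 hours, so you divide degrees by 15. A minimal sketch of the conversion (the function name is mine, not astrometry.net's):

```python
def deg_to_hms(ra_deg):
    """Convert right ascension in degrees to (hours, minutes, seconds).

    360 deg = 24 h, so 1 hour of RA = 15 degrees.
    """
    hours = ra_deg / 15.0
    h = int(hours)
    m = int((hours - h) * 60)
    s = ((hours - h) * 60 - m) * 60
    return h, m, s

print(deg_to_hms(11.843))  # (0, 47, 22.32)
```

The seconds come out at 22.32 rather than the table's 22.285 because the solver's internal RA has more precision than the rounded 11.843 shown here.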
The annotated image is clearly understandable.
Not a lot to say about this image; it wasn’t anything particularly good. But this inspires me to try to get better sky images, now that something can make sense of them!
(It strikes me you could use this to do lens distortion calibration, too. The red-green image shows the offset between where each star appears in the photo and where it actually is; you could use those residuals to map the lens distortion and then warp it away.)
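The idea above can be sketched with a toy fit: given matched pairs of observed star positions and catalog-predicted positions, estimate a single radial distortion coefficient. This is entirely hypothetical code of mine, assuming the simple one-parameter model r_obs = r_true · (1 + k·r_true²) about the image center; a real calibration would use a richer model (e.g. Brown–Conrady) and real matches from the solver.

```python
import math

def fit_radial_k(center, matches):
    """Least-squares fit of k in the model delta_r = k * r_true**3.

    matches: list of ((x_obs, y_obs), (x_true, y_true)) pixel pairs,
    observed position vs. where the catalog says the star should be.
    """
    cx, cy = center
    num = den = 0.0
    for (xo, yo), (xt, yt) in matches:
        r_true = math.hypot(xt - cx, yt - cy)
        r_obs = math.hypot(xo - cx, yo - cy)
        num += (r_obs - r_true) * r_true**3
        den += r_true**6
    return num / den

# Synthetic check: distort ideal positions with a known k and recover it.
k_true = 1e-7
center = (512.0, 512.0)
ideal = [(100.0, 200.0), (800.0, 150.0), (300.0, 900.0), (950.0, 700.0)]
matches = []
for xt, yt in ideal:
    dx, dy = xt - center[0], yt - center[1]
    scale = 1 + k_true * math.hypot(dx, dy) ** 2
    matches.append(((center[0] + dx * scale, center[1] + dy * scale), (xt, yt)))

print(fit_radial_k(center, matches))  # recovers ~1e-7
```

Once k is known, undoing the distortion is just applying the inverse warp to the image.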