Star Detection
Watney's star detection (the DefaultStarDetector) is a fairly simple affair and could no doubt be refined further. However, for solving purposes it has so far been "good enough". It only works with monochrome pixel buffers (8, 16 and 32 bit), so it relies on the image readers to convert whatever pixel format the image uses into monochrome. This is how it works (illustrative sketches of the steps follow the list):
- To determine the background value (where signal starts), we calculate a histogram from the image
  - We find the mean pixel value
  - We calculate the standard deviation (stdDev)
  - We take the background value to be mean + 3 * stdDev; anything above that is considered signal
- We create "pixel bins" from the pixels that are above the background value
- The image is scanned line-by-line, and contiguous signal pixels form into bins
- Adjacent bins are joined together
- Filters are applied to weed out non-starlike bins
  - The DefaultStarDetectionFilter weeds out stars that are too small, too large, streak-like (satellite trails and such) or non-round
- Star centers are determined from the brightest 50% of a star's pixels, unless the star is small, in which case the geometric center is used
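
To make the background estimation concrete, here is a minimal sketch of the threshold computation, assuming a monochrome buffer that has already been flattened into an array of pixel values. This is an illustration only, not the actual DefaultStarDetector code; it also computes the mean and standard deviation directly from the pixels rather than from a histogram, which gives the same result for integer pixel data.

```csharp
using System;

// Illustrative only: not the actual DefaultStarDetector implementation.
static class BackgroundEstimation
{
    // Background cutoff = mean + 3 * stdDev; pixels above it count as signal.
    public static double EstimateThreshold(ushort[] pixels)
    {
        double mean = 0;
        foreach (var p in pixels)
            mean += p;
        mean /= pixels.Length;

        double variance = 0;
        foreach (var p in pixels)
            variance += (p - mean) * (p - mean);
        variance /= pixels.Length;

        return mean + 3 * Math.Sqrt(variance);
    }
}
```

Using mean + 3 * stdDev as the cutoff means only pixels well above the noise floor are treated as signal.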
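
The binning step can be pictured as a single pass over the rows, where each signal pixel either continues the run to its left, joins the bin directly above it, or starts a new bin. This is a simplified sketch with made-up names; it omits the case where one run bridges two previously separate bins (the "adjacent bins are joined together" step), which the real detector handles.

```csharp
using System.Collections.Generic;

// Simplified sketch of the line-by-line binning; names are assumptions.
static class PixelBinning
{
    // A bin is just the list of (x, y, value) pixels that belong to it.
    public static List<List<(int X, int Y, double Value)>> CollectBins(
        double[] pixels, int width, int height, double threshold)
    {
        var bins = new List<List<(int X, int Y, double Value)>>();

        // Bin index of each pixel on the previous/current row, -1 = background.
        var prevRow = new int[width];
        var currRow = new int[width];
        for (int x = 0; x < width; x++)
            prevRow[x] = -1;

        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
                currRow[x] = -1;

            for (int x = 0; x < width; x++)
            {
                double v = pixels[y * width + x];
                if (v <= threshold)
                    continue;

                int binIndex;
                if (x > 0 && currRow[x - 1] >= 0)
                    binIndex = currRow[x - 1];      // continue the run to the left
                else if (prevRow[x] >= 0)
                    binIndex = prevRow[x];          // join the bin directly above
                else
                {
                    bins.Add(new List<(int X, int Y, double Value)>());
                    binIndex = bins.Count - 1;      // start a new bin
                }

                bins[binIndex].Add((x, y, v));
                currRow[x] = binIndex;
            }

            // The current row becomes the previous row for the next iteration.
            (prevRow, currRow) = (currRow, prevRow);
        }

        return bins;
    }
}
```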
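
The filtering stage then reduces to geometric checks on each bin. The limits and the roundness measure below are invented example values, not the actual DefaultStarDetectionFilter criteria; they only show the kind of test that rejects too-small, too-large and streak-like bins.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Example-only filtering criteria; the real DefaultStarDetectionFilter differs.
static class BinFiltering
{
    // Rejects bins that are too small, too large, or too elongated (streaks).
    // All limit values here are made-up examples.
    public static bool LooksLikeStar(List<(int X, int Y, double Value)> bin,
        int minPixels = 3, int maxPixels = 500, double maxAspectRatio = 1.6)
    {
        if (bin.Count < minPixels || bin.Count > maxPixels)
            return false;

        int w = bin.Max(p => p.X) - bin.Min(p => p.X) + 1;
        int h = bin.Max(p => p.Y) - bin.Min(p => p.Y) + 1;
        double aspect = (double)Math.Max(w, h) / Math.Min(w, h);

        return aspect <= maxAspectRatio;
    }
}
```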
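
Finally, a sketch of the star center computation: for larger bins, a flux-weighted centroid over the brightest 50% of the bin's pixels; for small bins, the plain geometric center. The cutoff for "small" and the weighting scheme are assumptions for illustration.

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch; the small-star cutoff and weighting are assumptions.
static class StarCenters
{
    public static (double X, double Y) GetCenter(
        List<(int X, int Y, double Value)> bin, int smallStarPixelCount = 4)
    {
        if (bin.Count <= smallStarPixelCount)
        {
            // Small star: plain geometric center of the bin's pixels.
            return (bin.Average(p => (double)p.X), bin.Average(p => (double)p.Y));
        }

        // Larger star: flux-weighted centroid over the brightest 50% of pixels.
        var brightest = bin.OrderByDescending(p => p.Value)
                           .Take(bin.Count / 2)
                           .ToList();
        double totalFlux = brightest.Sum(p => p.Value);
        double cx = brightest.Sum(p => p.X * p.Value) / totalFlux;
        double cy = brightest.Sum(p => p.Y * p.Value) / totalFlux;
        return (cx, cy);
    }
}
```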
From this we get a good set of stars which can be used to form quads.