Pixel blooming in Google Earth

Discussion in 'Digital Cameras' started by Jeff R., Jan 12, 2008.

  1. Jeff R.

    Jeff R. Guest

    I happened upon these two rather impressive blown highlights in Google
    Earth, at:

    150°59'49.49"E 33°49'0.68"S

    (and a hundred metres or so north of that) and was just wondering... Is
    this common in Google's sat shots? Anyone seen others?

    Didn't the goddam astronaut check his histogram before posting?

    It actually begs another set of questions...
    My scanner frequently blooms just like those shots if I'm scanning highly
    reflective items, but my digital cameras don't bleed in a linear fashion
    like that - just a big white patch.

    Is Google using a huge flatbed scanner in its satellites?
    Jeff R., Jan 12, 2008

  2. Here's a nice starburst one.
    52°30'40.78"N 4°56'59.52"E
    /\\BratMan/\\, Jan 12, 2008

  3. Matt Ion

    Matt Ion Guest

    Giant disco ball??
    Matt Ion, Jan 12, 2008
  4. Jeff R.

    Jeff R. Guest

    Thanks for that.
    That shows the sort of reflection/sensor overload that I'd expect to see in
    a conventional-type digital camera sensor. My camera shows the same
    diffraction spikes if I stop it down far enough. (Not a function of the
    sensor, but of the aperture blades.)

    Yet the original post shows bleeding which is more consistent with an
    overloaded flatbed scanner, or the type of CCD camera that astronomers use
    on the back of their telescopes.
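    The linear bleed described above can be sketched as a toy model of CCD
    column blooming: once a pixel's well fills, the surplus charge spills into
    the next pixel along the readout column, producing a streak rather than a
    round white patch. This is a minimal illustration, assuming a one-directional
    spill and an arbitrary full-well value of 255; real sensors bloom in both
    directions along the column and may have anti-blooming drains.

    ```python
    FULL_WELL = 255  # assumed full-well capacity, arbitrary units

    def bloom_column(column, full_well=FULL_WELL):
        """Toy CCD blooming model: charge above the full well spills
        into the next pixel down the column."""
        out = list(column)
        for i in range(len(out) - 1):
            excess = out[i] - full_well
            if excess > 0:
                out[i] = full_well
                out[i + 1] += excess  # overflow bleeds linearly down the column
        # any excess at the last pixel is lost off the edge of the array
        out[-1] = min(out[-1], full_well)
        return out

    # One very bright pixel (about 4x full well) streaks down the column:
    col = [0, 0, 1000, 0, 0, 0, 0, 0]
    print(bloom_column(col))  # -> [0, 0, 255, 255, 255, 235, 0, 0]
    ```

    The streak length grows with how far over saturation the source is, which
    matches the long linear bleeds in the original coordinates rather than the
    clipped white blob a typical consumer camera produces.
    
    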

    It just gets me wondering what sort of imaging setup is used by Google. Two
    posters have suggested aircraft rather than satellites... I wonder.

    ...and thanks for all the comments to date.
    Jeff R., Jan 12, 2008
  5. Matt Ion

    Matt Ion Guest

    I expect Google does very little of their own imaging, if any. Most
    photos would be purchased from a variety of other sources, such as NASA
    and other space geo-imaging outfits, with close-zoom images coming from
    assorted aerial-photo suppliers.

    Satellite photos can only get you so much detail - even the best sensors
    and lenses can't magically dissipate atmospheric interference, haze,
    pollution, and so on. Small detail could only come from sources WITHIN
    the atmosphere. Mosaics would have to be compiled over fairly extensive
    time frames to cover the entire surface of the globe without cloud cover
    and other such interference, so you'll have varying light conditions
    from one area of the surface to the next, as well.
    Matt Ion, Jan 12, 2008
