Researchers Detail AI that De-hazes and Colorizes Underwater Photos (venturebeat.com)

Kyle Wiggers, writing for VentureBeat: Ever notice that underwater images tend to be blurry and somewhat distorted? That's because phenomena like light attenuation and back-scattering adversely affect visibility. To remedy this, researchers at Harbin Engineering University in China devised a machine learning algorithm that generates realistic water images, along with a second algorithm that trains on those images to both restore natural color and reduce haze. They say that their approach qualitatively and quantitatively matches the state of the art, and that it's able to process upwards of 125 frames per second running on a single graphics card. The team notes that most underwater image enhancement algorithms (such as those that adjust white balance) aren't based on physical imaging models, making them poorly suited to the task. By contrast, this approach taps a generative adversarial network (GAN) -- an AI model consisting of a generator that attempts to fool a discriminator into classifying synthetic samples as real-world samples -- to produce a set of images of specific survey sites that are fed into a second algorithm, called U-Net.
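For readers unfamiliar with the second network: a U-Net is an encoder-decoder with skip connections that carry fine image detail from the encoder straight to the decoder, which is why it's popular for restoration tasks. Below is a minimal, illustrative PyTorch sketch of that shape; the layer sizes, names, and depth are my own assumptions for demonstration, not the architecture from the paper.

# Minimal U-Net-shaped network in PyTorch. Illustrative stand-in for the
# restoration network described in the summary, NOT the authors' architecture.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(3, 32)        # encoder level 1
        self.enc2 = conv_block(32, 64)       # encoder level 2
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec1 = conv_block(64 + 32, 32)  # decoder consumes the skip connection
        self.out = nn.Conv2d(32, 3, 1)       # project back to RGB

    def forward(self, x):
        s1 = self.enc1(x)                    # full-resolution features
        s2 = self.enc2(self.pool(s1))        # downsampled features
        d1 = self.dec1(torch.cat([self.up(s2), s1], dim=1))  # skip connection
        return torch.sigmoid(self.out(d1))   # restored image in [0, 1]

net = TinyUNet()
restored = net(torch.rand(1, 3, 64, 64))     # fake hazy frame -> restored frame

Per the summary, the GAN's generator supplies the synthetic "realistic water images" that a restoration network like this trains against.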
  • Ever notice that underwater images tend to be blurry and somewhat distorted? That's because phenomena like light attenuation and back-scattering adversely affect visibility. Well, OK. But put that water under a microscope and there's probably all kinds of shit floating around that can also attenuate light, even water from a swimming pool.
    • Enhance! Enhance!!

    • Fun fact: Every vegan eats animals with every meal. Tardigrades, rotifers, nematodes, and maybe more.

      I just *need* to plug Journey To The Microcosmos [youtube.com] here, because it is the best if you want to know more about micro-organisms.

    • Re:Underwater photos (Score:5, Informative)

      by Xest ( 935314 ) on Wednesday January 01, 2020 @12:08PM (#59576212)

      I'm a fairly experienced underwater photographer with some published images, so I can add a bit to this discussion. The premise of what you quote is a little bit weird:

      "Ever notice that underwater images tend to be be blurry and somewhat distorted?"

      No, not with properly taken ones. In this respect there's no difference between underwater photography and land photography: it'll only be blurry and distorted if you do a shit job, i.e. fail to shoot at a fast enough shutter speed to avoid motion blur, fail to hit your subject with sufficient lighting to freeze it in place, use an inappropriate aperture setting, or simply fail to achieve correct focus.

      What you do find with most amateur underwater photographs is that they're lacking realistic colour. Before I go into that, though, it's important to talk about what "realistic colour" means. You see, the deeper you dive, the more colour you lose underwater; however, human eyes are pretty good at adapting to that. In the first 15 metres or so you'll find that things look astoundingly vibrant and colourful to the eye, but take a photo and there'll be a significant lack of colour. This is a combination of the fact that cameras can't capture light as well as our eyes, and the fact that cameras aren't backed by processors as powerful as our brains, which automatically adjust for the loss of certain colours in the spectrum. So realistic colour can mean one of a few things in underwater photography:

      1) How a subject looks to a diver's own eyes underwater (fairly colourful)

      2) How a subject would appear if it were not in the water (true colour)

      3) How a subject appears to a camera without any artificial lighting (significant loss of natural colour)

      Now here's the thing: to achieve 1) or 2) with a camera you have to do one of a few things. To achieve 1) you have to edit the photo, whether that's letting the camera do it with in-camera white balance, or performing an identical procedure out of the water on raw files using white balance in a tool like Lightroom. To achieve 2) you simply use artificial light underwater; this has its limitations in that it too will only illuminate so far, so you have to get close to your subject.
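      (For the curious: the white-balance edit described for achieving 1) boils down to applying per-channel gains. Here's a minimal numpy sketch assuming a crude gray-world heuristic, i.e. that the scene should average to neutral; in-camera white balance and raw tools like Lightroom are considerably more sophisticated.)

      # Crude "gray-world" white balance in numpy: a rough sketch of what a
      # white-balance adjustment does, assuming the scene should average to
      # neutral gray. Real raw processors are far more subtle than this.
      import numpy as np

      def gray_world(img):
          """img: float array, shape (H, W, 3), values in [0, 1]."""
          means = img.reshape(-1, 3).mean(axis=0)         # per-channel averages
          gains = means.mean() / np.maximum(means, 1e-6)  # boost dim channels (red, underwater)
          return np.clip(img * gains, 0.0, 1.0)

      hazy = np.random.rand(480, 640, 3) * [0.3, 0.7, 0.9]  # fake blue-green cast
      balanced = gray_world(hazy)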

      Given the requirement to do something to achieve 1) or 2), I've heard people say "Well that's cheating, it's not really what it looks like if you have to edit it". I disagree: by achieving 1) or 2) you are making it look like what it looks like to the human eye, either in shallower water or with a torch shone upon it in the water. If you don't do this you're simply ending up with photos that don't match any reality other than the one generated by the technical limitations of modern cameras, all because of some obscure and meaningless notion of photographic purity.

      This is the number one thing people need to learn about when trying to make their underwater photos look like they did to their own eyes in the water, or under torchlight in the water.

      Subsequent issues are indeed to do with how much particulate shit there is in the water, whether it's algae or microorganisms; this sort of stuff can truly ruin photos if you're using underwater lighting, because if you position your lights wrong it'll reflect off these particles and show up as hundreds of speckles on your photo. You can mitigate this by angling your strobes so that the backscatter reflects outwards away from the lens, but the best way to deal with it is to simply get closer to your subject to minimise the amount of shit in the water between you and it; this becomes easier the wider the angle of your lens.

      So here's the problem with this AI: it doesn't really seem to be doing much other than manually white balancing. It's not altering the photo to adjust colours sufficiently to mimic lighting underwater, and it's not removing backscatter from particulate in the water. I've no doubt it's possible to train an AI to remove backscatter and so forth, but right now this

      • by ceoyoyo ( 59147 )

        Water scatters light more than air does, and it also generally carries more and larger particles, which scatter light further. Both effects make the mean path length of a ray of light in water much shorter than in air. The scattering and absorption are also frequency-dependent, so white balance changes with propagation distance.

        The effect they're after is really de-hazing, apparently so they can do object recognition more reliably.

        • by Xest ( 935314 )

          Makes sense. One of the biggest problems shooting wide angle with strobes is that whilst you can light up a near subject to give perfect colours, anything past a few metres is still discoloured. This is the downside of shooting with artificial light, of course: if you colour-balance what's in the distance by increasing the amount of light in the red spectrum, everything close up will be too red.

          I don't see this sort of AI ever really improving macro photography because you're typically so close

          • by ceoyoyo ( 59147 )

            The colour alteration will depend on the total path length to the object (in water) from the light source, as well as the particular scattering properties of the stuff in the water, so technically the AI could improve on simple white balance by using depth and colour information it can estimate from the image. That might make a noticeable difference for artificial lighting, but as you've noticed, simple algorithms are usually enough for most naturally lit underwater photos.

            The haze is a bit different. It de

  • Bathroom windows are no longer safe. Find pattern... apply algorithm... Pr0n!
  • by ffkom ( 3519199 ) on Wednesday January 01, 2020 @08:52AM (#59575906)
    I do a lot of underwater videos, and I am absolutely not impressed with their results. To achieve similar results, just use a white balance filter that operates in the long-medium-short (LMS) color space, which models human vision much more accurately than the other usual color spaces. You can find a good implementation in, e.g., the "colgate" filter that is part of the free, open-source "frei0r" library, usable from e.g. ffmpeg, in real time if you need. Add a little contrast in case the water is murky, and possibly a little saturation, and you are done.
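    (To illustrate the idea, not frei0r's actual code: below is a von Kries-style white balance that applies its per-channel gains in LMS space, using the standard linear-sRGB to XYZ (D65) matrix and the Bradford XYZ-to-LMS transform.)

    # Von Kries-style white balance in LMS space, a rough illustration of the
    # approach taken by frei0r's "colgate" filter; this is NOT its exact code.
    import numpy as np

    RGB2XYZ = np.array([[0.4124, 0.3576, 0.1805],    # linear sRGB -> XYZ (D65)
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
    XYZ2LMS = np.array([[ 0.8951,  0.2664, -0.1614],  # Bradford transform
                        [-0.7502,  1.7135,  0.0367],
                        [ 0.0389, -0.0685,  1.0296]])
    RGB2LMS = XYZ2LMS @ RGB2XYZ
    LMS2RGB = np.linalg.inv(RGB2LMS)

    def lms_white_balance(img, measured_white):
        """Scale LMS channels so `measured_white` (linear RGB) maps to neutral."""
        white_lms = RGB2LMS @ np.asarray(measured_white)
        target_lms = RGB2LMS @ np.ones(3)               # reference white
        gains = target_lms / np.maximum(white_lms, 1e-6)
        lms = img.reshape(-1, 3) @ RGB2LMS.T * gains    # RGB -> LMS, apply gains
        out = lms @ LMS2RGB.T                           # back to linear RGB
        return np.clip(out.reshape(img.shape), 0.0, 1.0)

    frame = np.random.rand(480, 640, 3)
    corrected = lms_white_balance(frame, measured_white=[0.3, 0.7, 0.9])

    (From ffmpeg, frei0r filters are applied with something like -vf frei0r=colgate, assuming your build was compiled with frei0r support.)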

    These "Oh look we did X with some artificial neural network!" news become increasingly annoying in that they too often fail to mention that equivalent, less randomized results have been achieved with simple, well understood algorithms before.
    • by dfghjk ( 711126 )

      "Human eye vision" has absolutely nothing to do with image degradation that occurs underwater, so an "accurate model" of it is irrelevant to the problem. LMS can be freely converted to and from other color spaces so you approach amounts to diddling bits without any fundamental understanding of the cause of the problem. You may do "a lot" of "underwater-videos" but you're a knob-turner, nothing more.

      You cannot correct for degradation after the data is lost. You can synthesize missing data after the fact,

      • by ffkom ( 3519199 )
        You are nothing but a rude troll who has obviously neither an idea of white balancing nor of who I am and what my experience is. Go read some science papers [academia.edu] on the relevance of using the LMS color space for white balance corrections to educate yourself. And learn some manners, while you are at it.
    • I achieved similar (if not better) results with a friend's underwater photos by creating a Photoshop batch process (a rough numpy equivalent is sketched after this list) which simply ran:
      • Auto-levels to remove the haze. This stretches the luminosity histogram to remove unused light and dark luminosity values (what your brain interprets as haze), thus increasing the contrast.
      • Auto-colors to stretch the histogram curves of each color channel independently. Green and especially red light are absorbed more strongly by water and are thus lacking in the picture. S
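      (Here's that rough numpy equivalent of the two-step batch; the clip percentiles are my own arbitrary choice, and Photoshop's actual auto-levels/auto-colors logic differs in detail.)

      # Approximation of the Photoshop batch described above: percentile-based
      # levels on luminosity (auto-levels), then an independent stretch of each
      # color channel (auto-colors). Percentiles are arbitrary assumptions.
      import numpy as np

      def stretch(channel, lo_pct=1, hi_pct=99):
          lo, hi = np.percentile(channel, [lo_pct, hi_pct])
          return np.clip((channel - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

      def auto_levels_and_colors(img):
          """img: float (H, W, 3) in [0, 1]."""
          # Auto-levels: stretch overall luminosity to kill the low-contrast "haze".
          luma = img.mean(axis=2, keepdims=True)
          img = np.clip(img * (stretch(luma) / np.maximum(luma, 1e-6)), 0.0, 1.0)
          # Auto-colors: stretch each channel separately to restore absorbed reds/greens.
          return np.dstack([stretch(img[..., c]) for c in range(3)])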
      • Yeah, the problem is much less severe at low depth, where most of the color spectrum still reaches. At 30m (or even less), reds are just gone. They even make you look at a color chart during training, and the first two swatches (red and orange) look gray. No amount of fiddling with the pixels will bring these colors back, because they could've been anything originally. Even gray. The only solution is to bring your own light with you.

        Interestingly though I read about another solution to this problem j

  • That last sentence sounds like the old comedy skit of the car mechanic explaining that you'll need to spend a lot of money because of the broken rafilator, especially 'cause yours has Framis injection.
  • by Anonymous Coward

    ...you just invented white balance!!!

  • It's a neural network, a.k.a. a generic function you throw in when you have no clue how to write an algorithm for the task. You merely train it. What it actually does, and how it does it, is by definition unknown, often not what you thought it does, and sometimes surprising. Otherwise you wouldn't need a neural net and could code a faster, precise algorithm yourself.

    • by ceoyoyo ( 59147 )

      They trained the neural net on simulated underwater images created with a model of light scattering in water.

      You could, however, use a neural net to estimate the parameters of such a model to apply the inverse effect. Neural networks are quite good at estimating parameters of physical models from complicated or incomplete data (such as a photograph).
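      (A common way to simulate such images is the veiling-light/transmission model from the dehazing literature: I = J*t + B*(1-t), with t = exp(-beta*d) per channel. The sketch below uses made-up coefficients; the summary doesn't specify the paper's actual simulation model.)

      # Generic underwater image-formation model for synthesizing training pairs:
      #   I(x) = J(x) * t(x) + B * (1 - t(x)),   t(x) = exp(-beta * d(x))
      # J is the clean image, B the backscattered "veiling" light, d the distance,
      # beta a per-channel attenuation coefficient. Coefficients here are made up.
      import numpy as np

      def simulate_underwater(clean, depth, beta=(0.60, 0.15, 0.08), veil=(0.05, 0.35, 0.45)):
          """clean: (H, W, 3) in [0, 1]; depth: (H, W) distances in metres."""
          beta = np.asarray(beta)                  # red attenuates fastest
          t = np.exp(-depth[..., None] * beta)     # per-channel transmission
          return clean * t + np.asarray(veil) * (1.0 - t)

      clean = np.random.rand(240, 320, 3)
      depth = np.full((240, 320), 5.0)             # pretend everything is 5 m away
      hazy = simulate_underwater(clean, depth)     # paired sample for training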

  • Looks like msmash did a search for AI and posted everything to Slashdot.
    Thanks, I guess.
  • by balbeir ( 557475 ) on Wednesday January 01, 2020 @06:40PM (#59577316)
    Seems to be similar to this https://petapixel.com/2019/11/... [petapixel.com]
  • I don't think these people know what the words "blurry" and "distorted" actually mean.

"Little else matters than to write good code." -- Karl Lehenbauer

Working...