Google explains how its ‘Portrait Light’ works, the artificial intelligence lighting of the Pixel



Google is, first and foremost, a software company, especially when it comes to its products built on machine learning or, as it is known for short, artificial intelligence. Its physical products, such as its phones (its own or those of the Android ecosystem) and its smart assistants, often seem to serve mainly as showcases for its advances in AI.

Hence, the Mountain View team does not hesitate to publish, from time to time and without giving away too many proprietary details, how its features work. We have already learned how it separates depth layers to apply portrait mode, and how it processes sound to recognize the songs we hum. Now we return to photography, with Google explaining how its “Portrait Light” works, the artificial light generated by the Pixel camera app.


This is how the Pixel “Portrait Light” works


As Google itself explains, “Portrait Light” is already present in the cameras of the Pixel 4, Pixel 4 XL, Pixel 4a, Pixel 4a 5G and Pixel 5, and it lets you change the lighting of a photograph after it has been captured. It is, as you would expect, a retouching mode built on artificial intelligence, and Google has now explained how it developed the effect.

The “Portrait Light” mode was trained with data from a computational illumination system called Light Stage, and it allows a directional light to be placed in a photograph automatically, relighting the subject once the picture has already been captured.

Reflectance

Google says that to gather its training data with the Light Stage it photographed thousands of individuals on a rig with 64 cameras placed at different viewpoints and 331 individually programmable LED light sources. By photographing each person with one light at a time, Google’s AI was able to generate synthetic portraits under many different lighting environments and train on them.
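To get a feel for why one light at a time is so useful, here is a minimal sketch (not Google’s pipeline; the array names and the toy lighting environment are assumptions for illustration): because light adds linearly, the single-light captures can be combined with per-light weights to simulate almost any lighting environment.

```python
import numpy as np

def relight_from_olat(olat_images, env_weights):
    """Synthesize a relit portrait from one-light-at-a-time (OLAT) captures.

    olat_images: (n_lights, H, W, 3) array, one linear-RGB photo per LED.
    env_weights: (n_lights, 3) array, the RGB color/intensity the target
                 lighting environment assigns to each LED's direction.
    Returns an (H, W, 3) image lit by the combined environment.
    """
    # Light transport is linear, so the relit image is simply a weighted
    # sum of the single-light captures.
    return np.einsum("nhwc,nc->hwc", olat_images, env_weights)

# Toy example: 331 lights, tiny 4x4 "portraits", a warm key light plus dim fill.
rng = np.random.default_rng(0)
olat = rng.random((331, 4, 4, 3))
weights = np.full((331, 3), 0.01)   # faint ambient fill from every LED
weights[42] = [1.0, 0.9, 0.7]       # one strong, warm "key" light
relit = relight_from_olat(olat, weights)
print(relit.shape)  # (4, 4, 3)
```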

To light the scene, whether live or after the fact, the Pixel camera uses machine learning to estimate which lighting profile best suits the photograph being taken. The AI behind the Google camera evaluates the direction, relative intensity and color of each light source in the scene, as well as the pose of the head at the moment of capture.
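Google’s estimator is a trained neural model, but a crude, purely illustrative stand-in helps explain what “direction, intensity and color” of the existing light means in practice: check which side of the face is brighter and what color its brightest pixels are. Everything below is an assumption made for the sketch, not Google’s method.

```python
import numpy as np

def crude_light_estimate(face_crop):
    """Very rough stand-in for a lighting estimator (illustration only).

    face_crop: (H, W, 3) linear-RGB crop of the face.
    Returns an approximate light direction in the image plane,
    a relative intensity, and a normalized RGB light color.
    """
    h, w, _ = face_crop.shape
    luma = face_crop.mean(axis=-1)

    # Brightness-weighted centroid: if the left half of the face is brighter,
    # the centroid shifts left, hinting that the light comes from that side.
    ys, xs = np.mgrid[0:h, 0:w]
    total = luma.sum() + 1e-8
    cx = (xs * luma).sum() / total - (w - 1) / 2
    cy = (ys * luma).sum() / total - (h - 1) / 2
    direction = np.array([cx, cy])
    direction /= np.linalg.norm(direction) + 1e-8

    # Relative intensity and color taken from the brightest 5% of pixels.
    bright = face_crop[luma >= np.quantile(luma, 0.95)]
    intensity = luma.max() / (luma.mean() + 1e-8)
    color = bright.mean(axis=0) / (bright.mean() + 1e-8)

    return direction, intensity, color
```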

Estimate

With that information, Google estimates the ideal position for its synthetically generated light and places it virtually outside the scene, generating highlights and shadows on the photographed person that were never really there, yet are practically indistinguishable from real lights in a photography studio.
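Where exactly that virtual light ends up is again decided by a trained model, but the underlying idea follows studio practice: place the lamp so it complements the light that is already there and the way the head is turned. A toy placement rule, with the 45° / 30° angles and the coordinate convention being assumptions of this sketch, could look like this:

```python
import numpy as np

def place_virtual_light(existing_dir, head_yaw_deg, elevation_deg=30.0):
    """Pick a direction for a synthetic portrait light (toy heuristic).

    existing_dir: (3,) dominant direction of the real scene light.
    head_yaw_deg: how far the head is turned from the camera, in degrees.
    Returns a unit 3-D direction for the virtual light.
    """
    existing_dir = np.asarray(existing_dir, dtype=float)
    existing_dir /= np.linalg.norm(existing_dir)

    # Put the virtual lamp on the opposite side from the real key light;
    # if the real light is roughly frontal, favor the side the head faces.
    if abs(existing_dir[0]) > 0.1:
        side = -np.sign(existing_dir[0])
    else:
        side = 1.0 if head_yaw_deg >= 0 else -1.0

    azimuth = np.radians(45.0) * side          # off to one side of the camera
    elevation = np.radians(elevation_deg)      # above eye level, studio-style

    direction = np.array([
        np.cos(elevation) * np.sin(azimuth),   # x: left/right of the camera
        np.sin(elevation),                     # y: height
        np.cos(elevation) * np.cos(azimuth),   # z: towards the subject
    ])
    return direction / np.linalg.norm(direction)
```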


Thousands of individuals photographed with 64 cameras and 331 individually programmable spotlights trained the AI

The advantage of Google’s AI is that, now that “Portrait Light” has been released and is fully operational, it keeps learning from every photograph we take. Behind the scenes, the AI also tracks every detail of the lighting using geometric models that help it understand the scene. Remember that, once captured, the scene is completely flat, and Google reconstructs it in three dimensions to keep playing with the light. The system is similar to the layer separation used to apply background blur.
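That geometry is what makes the new light look plausible. As a simplified, Lambertian sketch (not Google’s actual relighting network, and with every name here assumed for illustration), per-pixel surface normals recovered from the reconstructed face tell you how strongly each pixel would catch a virtual directional light; multiplying the photo by that light map adds the new highlights and leaves the rest in shadow.

```python
import numpy as np

def apply_virtual_light(image, normals, light_dir, light_rgb, strength=0.6):
    """Relight an image with one virtual directional light (Lambertian toy model).

    image:     (H, W, 3) linear-RGB portrait.
    normals:   (H, W, 3) unit surface normals from the reconstructed geometry.
    light_dir: (3,) direction from the surface towards the virtual light.
    light_rgb: (3,) color of the virtual light.
    """
    light_dir = np.asarray(light_dir, dtype=float)
    light_dir /= np.linalg.norm(light_dir)

    # Lambert's cosine law: brightness falls off with the angle between the
    # surface normal and the light direction; surfaces turned away get nothing.
    lambert = np.clip(normals @ light_dir, 0.0, 1.0)[..., None]

    # Add the new light on top of the existing illumination.
    relit = image * (1.0 + strength * np.asarray(light_rgb) * lambert)
    return np.clip(relit, 0.0, 1.0)
```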

Google considers that “Portrait Light”, trained with the Light Stage system, is just the first step on the road to creative lighting controls for photos already captured with a mobile phone. We can therefore expect the Mountain View team’s machine learning to keep evolving, and to get, little by little, a relighting mode even more capable than the current one.

More information | Google


The news “Google explains how its ‘Portrait Light’ works, the artificial intelligence lighting of the Pixel” was originally published in Xataka Android by Samuel Fernandez.
