18.08.2021

Ten Things You Should Know About the Google Pixel 2


With all the hype surrounding the announcement of the Google Pixel 2 and Pixel 2 XL, with their world-leading DxOMark mobile camera score, it's easy to get lost. What is important to know about these new phones? Where did Google leave flaws? Why is the camera better than its predecessor's? And why are photographers so intrigued by the new camera technology?

Dual Pixel AF

The new Pixel smartphones use technology found in Canon's DSLRs and mirrorless cameras, and previously seen in the Samsung Galaxy S7. Dual Pixel lets the camera perceive a three-dimensional scene without using two camera modules. Splitting the photosite under each microlens into two halves mimics the left and right human eyes: objects appear with a slight offset between the left and right sub-pixels, which makes it possible to build a depth map. The same information is also used for very fast autofocus.
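To make the geometry concrete, here is a minimal sketch of the classic stereo relation behind that depth map: depth equals focal length times baseline, divided by disparity. The numbers below are purely illustrative assumptions (Google does not publish its sensor parameters), and a dual-pixel baseline really is tiny, on the order of half the lens aperture.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m, eps=1e-6):
    """Classic stereo relation: depth = focal length * baseline / disparity.

    disparity_px: per-pixel offset between the left and right sub-pixel
                  views (tiny for a dual-pixel sensor).
    focal_px:     focal length expressed in pixels.
    baseline_m:   distance between the two viewpoints in metres; for a
                  dual-pixel sensor, roughly half the aperture diameter.
    """
    # Clamp disparity so distant objects (disparity ~ 0) don't divide by zero.
    return focal_px * baseline_m / np.maximum(disparity_px, eps)

# Illustrative numbers only: a phone lens with a ~2 mm effective baseline.
disparity = np.array([[0.5, 1.0], [2.0, 4.0]])  # pixels
print(disparity_to_depth(disparity, focal_px=3000, baseline_m=0.002))
# Larger disparity means a closer object, so depth falls as disparity grows.
```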

Officially, the technology is called Dual Pixel AF. It has several advantages over the dual camera used in Apple's 2017 flagship iPhone 8 Plus, and the upcoming iPhone X is also expected to carry a dual camera module.

Dual Pixel AF is very fast and accurate, and it works even in low light. Focus transitions in video are smooth. Considering how well the Pixel 2 stabilizes 4K video, you can shoot professional-quality footage given the right lighting and scene composition skills.

Dual Pixel + Machine Learning

The offset between sub-pixels makes it possible to build a depth map of the field of view. Machine-learning algorithms that analyze the scene and identify objects also contribute to constructing that map.

The system is smart. However, the stereo mismatch between the two views is very small compared to a dual camera, where the offset between modules is much more significant. This can make it hard for the camera to separate the subject from the background, and it explains the weaker bokeh results in the DxO tests, which exposed the shortcomings of Google's background-blur algorithms. Background separation works best when the subject is close to the camera.

It is also worth noting that the new smartphone takes portrait-mode pictures at the full 12MP resolution, while the previous generation could only produce 5MP files.

Google's blur follows a more complex algorithm than Apple's simple Gaussian blur. The blurred area looks nicer, but it comes at a cost: while Apple's algorithm works in real time, the Pixel 2 needs some time to compute all the parameters. You can keep shooting, but you will only see the result after a few seconds.
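Google hasn't published its portrait pipeline, and as just noted it is not a plain Gaussian blur, but a toy sketch can show the basic idea of using the depth map as a blend mask. Everything here (the mask shape, the tolerance, the blur strength) is an illustrative assumption, not Google's method:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def portrait_blur(image, depth, subject_depth, tolerance=0.1, sigma=8.0):
    """Toy portrait mode: keep pixels near the subject's depth sharp and
    blend everything else toward a blurred copy of the frame.

    image: HxWx3 float array in [0, 1]; depth: HxW map normalised to [0, 1].
    """
    # Blur the spatial axes only (sigma 0 along the colour-channel axis).
    blurred = gaussian_filter(image, sigma=(sigma, sigma, 0))
    # 1.0 at the subject's depth, fading to 0.0 as depth distance grows.
    mask = np.clip(1.0 - np.abs(depth - subject_depth) / tolerance, 0.0, 1.0)
    return mask[..., None] * image + (1.0 - mask[..., None]) * blurred

# Synthetic demo: random pixels with a simple left-to-right depth gradient.
rng = np.random.default_rng(0)
img = rng.random((64, 64, 3))
depth = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
result = portrait_blur(img, depth, subject_depth=0.2)
```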

Hardware Protection

Unsurprisingly, the internals of the new Pixel 2 phones aren't all that remarkable, since they're no different from other 2017 flagships. Both versions received a top-end set of components housed in a body with IP67 dust and water protection. That's a big step up from the previous generation's IP53 rating, and it matches what Apple offers. The housing is wear resistant and very well protected.

We lost the headphone jack but gained front-facing stereo speakers. The XL has slimmer bezels, though still not as thin as on Samsung phones. There is no dual camera. The RAM and processor are similar to those in other Android flagships. You can summon the assistant simply by squeezing the phone in your hand.

Nothing really stands out. But wait, there's more.

AI First

Google CEO Sundar Pichai constantly declares in his speeches that the company has implemented something for the first time. Google is now shifting its strategy from being first with innovative smartphone hardware to being first with smart assistants. It offers not just handheld devices with good processing power, but smart devices that adapt to our needs and make our lives easier. And Google leads in this area thanks to its own research and the data the company collects through its search service.

Artificial intelligence is increasingly being used to make many services better. Pichai recently gave an example from a fitness app: every time he opens it, he has to navigate to a different page. Instead of adding an in-app setting to change the default start page, he argues, the AI should simply learn your preferences and do what you normally do.

What does this mean for photography and videography? It's just a thought for now, but imagine a camera that knows your taste in photography: how you edit your photos, what style you shoot in, which filters you apply. How about learning your taste in music, so that when Google Assistant automatically builds a video from your photo and video library, it picks your favorite tracks for the soundtrack?

The possibilities are endless, and we're likely to see a lot of cool stuff in the new Pixel phones.

Google Lens

Sundar Pichai first presented Google Lens at the I/O developer conference earlier this year, describing it as a combination of machine vision and artificial intelligence. It's now available for the first time in the Photos app and Google Assistant on the new Pixel phones. Google's machine-vision algorithms analyze what the camera sees and use artificial intelligence to do cool things, like identifying what type of flower is in the frame.

This type of technology applies directly to photography. Pichai talked about how automatic recognition of objects in a scene has improved: everything from fences to motorcycles, food, and animals has become clearer to the algorithm. Google keeps getting better at identifying these objects and understanding how they interact.

And once the system understands what an object in the frame is, it can do almost anything with it: delete it, change the lighting, search by content. You no longer need to caption pictures in advance to search for, say, animals, food, sky, birds, planes, or people; the Photos app has been able to do this kind of search for a long time. It also searches not only by objects but by events, such as birthdays, vacations, and trips. Many are looking forward to seeing how the addition of Google Lens will make the camera and assistant on the new phones better.
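Google doesn't publish the models behind Lens, but the "what flower is this" trick is, at its core, image classification. Here's a sketch using a stock ImageNet classifier from torchvision as a stand-in for Google's unpublished models; the file name is hypothetical:

```python
import torch
from torchvision import models, transforms
from PIL import Image

# A stock ImageNet classifier standing in for Google's unpublished models.
weights = models.MobileNet_V2_Weights.DEFAULT
model = models.mobilenet_v2(weights=weights)
model.eval()

# Standard ImageNet preprocessing: resize, crop, normalise.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open("flower.jpg").convert("RGB")  # hypothetical local photo
batch = preprocess(img).unsqueeze(0)           # add a batch dimension
with torch.no_grad():
    probs = torch.softmax(model(batch)[0], dim=0)

# Print the three most likely labels with their confidences.
top = probs.topk(3)
for p, idx in zip(top.values, top.indices):
    print(f"{weights.meta['categories'][idx]}: {p:.1%}")
```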

Perhaps smart object detection could even have fixed the lens-flare issue of the original Google Pixel, but the new smartphone doesn't have that problem in the first place.

Goodbye ugly glare

Luckily, the annoying flare issues that plagued the first Pixel phones appear to have been fixed by moving the camera module out from under the glass panel, which has itself been shrunk and streamlined to sit flush with the rest of the phone.

The camera module now protrudes slightly from the back, but that's a compromise that had to be made to eliminate the issues and fit in the best technology.

Incredibly smooth video

When the first Pixel was released, Google claimed that its camera kept up with other devices thanks to digital stabilization alone and did not need optical stabilization. Despite the lack of OIS (optical image stabilization), the software was very effective at correcting camera shake, and the algorithms have only gotten better since. But Google didn't stop there. OIS matters when it comes to camera size: the stabilization hardware makes the module bulkier.

This year Google took a different path, combining OIS with electronic image stabilization (EIS); the larger camera module made room for both. The results look quite impressive. The original Pixels already had very good video stabilization, even in 4K, but the OIS + EIS combination is on another level: recording is even smoother, judging by the footage Google has provided.
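Google hasn't detailed its EIS pipeline (the real thing also uses gyroscope data and rolling-shutter correction), but the core of any EIS is the same: estimate the camera path, smooth it, and counter-shift each frame by the difference. A minimal sketch, assuming a simple moving-average smoother:

```python
import numpy as np

def eis_corrections(frame_shifts, window=15):
    """Toy EIS: smooth the camera path with a moving average and return
    the per-frame counter-shift that removes the jitter.

    frame_shifts: (N, 2) array of estimated (dx, dy) motion per frame.
    """
    path = np.cumsum(frame_shifts, axis=0)  # raw camera trajectory
    kernel = np.ones(window) / window
    # Naive edge handling: np.convolve zero-pads, biasing the first and
    # last frames; a real pipeline would treat boundaries more carefully.
    smoothed = np.column_stack(
        [np.convolve(path[:, i], kernel, mode="same") for i in range(2)]
    )
    return smoothed - path  # shift to apply to each frame

# A jittery pan: steady 2 px/frame to the right plus random shake.
rng = np.random.default_rng(1)
shifts = np.column_stack([np.full(100, 2.0), np.zeros(100)])
shifts += rng.normal(0.0, 3.0, size=(100, 2))
corrections = eis_corrections(shifts)
```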

For shooting in low light, stabilization is critical. It allows you to shoot at slower shutter speeds and get sharper images. You will also get better results in macro photography.

Where is the color management?

The bad news is that Google hasn't said anything about proper color management on the new phones. Previous Pixels had beautiful OLED displays, but colors were wildly inaccurate and often oversaturated due to the lack of any color management or properly calibrated display modes.

iPhones have some of the most accurate screens around. Their wide-gamut displays are carefully calibrated and reproduce very accurate color. They cover most of the DCI-P3 color space, but more importantly, iOS can switch automatically between two properly calibrated gamut modes, sRGB and DCI-P3. The switch happens on the fly, without delay, depending on the type of content currently being viewed.

This means you see photos and videos with the colors their creators intended, not however the smartphone decides to render them. It also means that when you send images from your iPhone to print, you'll get the same colors as on the display, just a little darker, because paper isn't backlit the way a display is.
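Why does switching modes matter? Because the same RGB numbers mean different colors in different gamuts. The sketch below derives the standard RGB-to-XYZ matrices from the published sRGB (BT.709) and P3 chromaticities and converts a fully saturated sRGB red into P3 coordinates. An unmanaged wide-gamut panel that skips this conversion shows (1, 0, 0) at full P3 saturation, which is exactly the oversaturation described above:

```python
import numpy as np

def rgb_to_xyz(primaries, white_xy):
    """Build the linear RGB -> XYZ matrix from (x, y) chromaticities."""
    # XYZ direction of each primary at unit luminance (columns = R, G, B).
    cols = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]).T
    wx, wy = white_xy
    white = np.array([wx / wy, 1.0, (1 - wx - wy) / wy])
    # Scale each primary so the three together produce the white point.
    return cols * np.linalg.solve(cols, white)

D65 = (0.3127, 0.3290)
SRGB = rgb_to_xyz([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)], D65)
P3 = rgb_to_xyz([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], D65)

# A fully saturated sRGB red, expressed in linear Display P3 coordinates.
srgb_red = np.array([1.0, 0.0, 0.0])
p3_red = np.linalg.solve(P3, SRGB @ srgb_red)
print(p3_red)  # roughly [0.82, 0.03, 0.02]: well inside the P3 gamut
```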

The Samsung Galaxy S8 is also calibrated for both DCI-P3 and sRGB modes, though you have to switch between them manually. The new Pixel phones don't provide any calibration or color-profile management, even though the feature is implemented in software in the new Android Oreo; as on Windows, however, it is left up to individual applications. Keep in mind that color-profile management alone does not guarantee accurate color reproduction without display calibration, and that has to be done by the manufacturer. At least there was hope that smartphone makers would do it, as Samsung did with the Galaxy S8.

HDR display?

Unfortunately, there was no mention of this either, nor of 10-bit images or HDR playback of photos and video using the HDR10 or Dolby Vision standards. This leaves much to be desired.

The iPhone X will play HDR video from several streaming services, but more importantly for photographers, it will show HDR photos, for the first time ever on any device. Apple is pushing the industry forward by talking about HDR display for photos, not just video. Remember that this has nothing to do with HDR capture; the point is that photos displayed on an OLED panel can reproduce a wider range of tones.

Simply put, photos taken on an iPhone X and viewed on an iPhone X will look better than they do on any other device, thanks to HDR display support and accurate color reproduction. This is important, and Google seems to have missed the point completely.

An HDR display means less editing and less exposure correction. HDR capture is useful because it preserves shadow detail while keeping noise down, but on conventional displays you have to brighten dark areas considerably before that detail becomes visible. The iPhone X can present a wide range of tones without aggressive adjustments to your shots.
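The reason SDR screens force those adjustments is tone mapping: the scene's dynamic range has to be squeezed into the display's narrow one. Neither Apple nor Google documents its exact pipeline, so here is the textbook Reinhard operator, used only to show how highlights get crushed on an SDR display:

```python
import numpy as np

def reinhard(luminance):
    """Classic Reinhard tone mapping: compress HDR luminance into [0, 1)."""
    return luminance / (1.0 + luminance)

# Relative scene luminances spanning about ten stops.
scene = np.array([0.01, 0.1, 1.0, 4.0, 10.0])
print(reinhard(scene))
# -> approx [0.0099, 0.0909, 0.5, 0.8, 0.909]
# The 4.0 and 10.0 highlights land at 0.80 and 0.91: a 2.5x difference in
# the scene becomes ~14% on screen. An HDR display can keep them apart.
```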

Unlimited storage is limited

Google's promise to give Pixel 2 and Pixel 2 XL owners unlimited cloud storage for full-resolution photos and videos turned out to have a catch. It couldn't have been otherwise.

The company said that owners of any Pixel phone will be able to store all the photos and videos they want in Google Photos for as long as they use the phone, with every picture and video saved at full resolution.

Ordinarily, free unlimited storage is only available if you let Google transcode photos down to 16MP and videos down to Full HD. You can also upload files at their original resolution, but they count against the free 15GB quota shared with Gmail and Google Drive.
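For a sense of what that 16MP cap means in practice, here is a small sketch that downscales a photo to fit under it, roughly the way the "high quality" tier would; the file names and the JPEG quality setting are assumptions for illustration:

```python
from PIL import Image

MAX_PIXELS = 16_000_000  # the "high quality" tier caps photos at 16MP

def shrink_to_cap(src, dst, max_pixels=MAX_PIXELS):
    """Downscale an image to fit under the megapixel cap, preserving
    aspect ratio; images already under the cap pass through unchanged."""
    img = Image.open(src)
    w, h = img.size
    if w * h > max_pixels:
        scale = (max_pixels / (w * h)) ** 0.5
        img = img.resize((int(w * scale), int(h * scale)), Image.LANCZOS)
    img.save(dst, quality=90)

shrink_to_cap("photo.jpg", "photo_high_quality.jpg")
```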

Pixel 2 owners get a small reprieve: their unlimited full-resolution storage will last through the end of 2020. After that, the situation changes, and new photos and videos taken on any Pixel phone will be compressed to "high quality" (16MP photos and 1080p video).

This information is written in small print:

"Free, unlimited storage in original quality for photos and videos taken with Pixel until the end of 2020, and free unlimited high-quality storage for photos taken with Pixel thereafter." This statement is a bit confusing.

Google says that once the full-resolution free storage period expires, existing files will remain untouched, but new uploads will be compressed.

Google is betting on the quality of the Pixel 2's camera and on unlimited storage to make the phone stand out from the crowd, including the competing iPhone X and Samsung Galaxy Note 8. In fact, Google believes its single 12-megapixel camera is better than the competitors' dual cameras, and the Pixel 2 did score higher on DxOMark. But keep in mind that its 98 points is an aggregate number: looking at individual categories, Samsung and Apple win on many counts.

