Pixel 4 series opens for pre-order: dual-lens algorithms, and Night Sight evolves to shoot the Milky Way

Possibly the last flagship launch of the year (?): Google unveiled its new Pixel 4 phones in New York last night. The leaks before the event had already told us plenty, so what did the conference actually add?

The appearance matches the leaks. The Pixel 4 has no notch, but to fit the face-recognition hardware and the Soli sensor chip it has a wide top bezel, so the top and bottom borders are a little asymmetrical. Those who like the all-screen look may find it a bit... retro.

The back of the Pixel 4 no longer has the two-tone finish; this time it really is plain black, white, and orange.

Key specifications (unless noted otherwise, both models share the same spec):

Operating system: Android 10

Processor: Snapdragon 855, Titan M Security Module, Pixel Neural Core

Memory: 6GB

Storage: 64GB, 128GB

Screen:

Pixel 4→5.7-inch, FHD+, OLED, up to 90Hz refresh rate, HDR support

Pixel 4 XL→6.3-inch, QHD+, OLED, up to 90Hz screen refresh rate, HDR

Battery:

Pixel 4→2800mAh

Pixel 4 XL→3700mAh

Supports 18W PD 2.0 fast charging (charger included in the box) and Qi wireless charging

Two rear cameras:

12MP main camera, f/1.7, 1.4μm pixels, dual-pixel phase-detection autofocus, OIS and EIS, 77-degree field of view, supports 4K/30fps video

16MP telephoto (2x), f/2.4, 1.0μm pixels, phase-detection autofocus, OIS and EIS, spectral and flicker sensor, 52-degree field of view

Front single lens: 8MP, f/2.0, 1.22μm pixels, fixed focus, 90-degree field of view

Size:

Pixel 4→68.8 x 147.1 x 8.2mm

Pixel 4 XL→75.1 x 160.4 x 8.2 mm

Dual SIM: nano SIM+eSIM

Others: IP68, 5CA support, stereo speakers, three microphones, Active Edge squeeze support

The highlighted features are:

A new generation of Night Sight, with more accurate low-light colors and improved detail, can capture astrophotography-style photos.

A new dual exposure control adjusts the bright and dark parts of the image separately.

Machine learning corrects white balance accuracy under complex lighting.

Re-enhanced HDR+ mode

The telephoto lens plus Super Res Zoom delivers better detail when zooming.

The extra lens makes portrait-mode cutouts more accurate.

The first phone with a built-in miniature radar sensor (the Soli sensor), enabling Motion Sense for quick mid-air gesture control.

Face unlock with a higher security level, backed by the Titan M security chip.

The OLED screen supports a 90Hz refresh rate, activated per app as needed, making motion smoother.

Google Assistant has a new interface and speed upgrades.

A new Recorder app transcribes speech to text in real time (English only).

The biggest difference in the Pixel 4's camera is the switch to a standard + telephoto dual-lens setup. That's a big change for Pixel phones, but in the broader market, going without an ultra-wide may make it less attractive...

As for why there's no ultra-wide, I expect someone will ask at the on-site Q&A later, and I'll add the answer when I see it.

2x optical + Super Res Zoom brings better digital zoom

Even though it's only a dual camera, there's still a big square camera bump. Why make such a prominent square in such a high-profile way? I'm a little puzzled. The press conference mentioned that this square also houses the spectral and flicker sensor, a microphone for video recording, and a high-brightness fill flash.

The Pixel 4's 2x optical zoom lens is, of course, for zoom shots. Whether you start from the 1x main lens or the 2x lens, when you zoom in the Pixel 4 can use the 2x optical image as the basis and, combined with the "Super Res Zoom" algorithm, produce a higher-magnification photo with better clarity than ordinary digital zoom. However, the telephoto's aperture is f/2.4, and I don't know whether image quality will suffer on cloudy days or indoors; I hope to test it if I get the chance.
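
To make the pipeline concrete, here is a minimal Python sketch of the lens-selection idea described above. The helper name and the optical/digital split are my own illustration, not Google's actual Super Res Zoom pipeline, which additionally merges multiple burst frames to recover detail.

```python
def choose_base_lens(zoom: float):
    """Hypothetical helper: start from whichever lens leaves the
    smallest remaining digital factor for the zoom algorithm to fill in."""
    if zoom >= 2.0:
        return "telephoto (2x)", zoom / 2.0  # 2x is optical, rest is digital
    return "main (1x)", zoom                 # the whole factor is digital

for z in (1.0, 1.5, 2.0, 3.0, 8.0):
    lens, digital = choose_base_lens(z)
    print(f"{z}x requested -> {lens} + {digital:.2f}x digital upscale")
```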

Portraits from farther away

Portrait mode is also different thanks to the extra lens. On the Pixel 3, machine learning and dual-pixel data were used to estimate a depth map and blur the background. On the Pixel 4, machine learning computes depth from both the dual pixels and the dual lenses, so an accurate depth map can still be captured even when the subject is farther from the camera. That way, when shooting large objects or full-body portraits, where you have to stand farther back, the quality of the cutout and background blur will be better.
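
A rough sketch of why the second lens helps at distance: stereo depth follows depth = f·B/d, so a larger baseline B between the two viewpoints produces a larger, easier-to-measure disparity d at the same subject distance. The numbers below are illustrative assumptions, not Pixel 4 calibration data.

```python
def disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Stereo triangulation rearranged: disparity d = f * B / Z."""
    return focal_px * baseline_m / depth_m

f_px = 3000.0  # assumed focal length in pixels
for name, baseline in [("dual-pixel (~1 mm baseline)", 0.001),
                       ("dual-lens  (~1 cm baseline)", 0.010)]:
    d = disparity_px(f_px, baseline, depth_m=3.0)
    print(f"{name}: {d:.2f} px disparity for a subject 3 m away")
```

At 3 m the dual-pixel signal shrinks to about a pixel, while the dual-lens baseline still yields a clearly measurable shift, which is consistent with the claim that farther subjects now get accurate depth maps.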

Live HDR+

The Pixel 4 further enhances HDR+. It shoots up to 9 short-exposure frames in a row, then merges them, recovering detail and lifting the dark areas. The more frames merged, the less noise: merging nine frames cuts the noise to about one third, giving a cleaner HDR+ image.
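
The "one third" figure matches the standard statistics of frame averaging: merging N frames with independent noise reduces the noise standard deviation by 1/√N, and √9 = 3. A quick numpy check (a toy average, not HDR+'s actual robust merge):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((256, 256), 0.5)          # flat gray test scene
sigma = 0.10                              # assumed per-frame noise level

frames = scene + rng.normal(0.0, sigma, size=(9, 256, 256))
merged = frames.mean(axis=0)              # naive 9-frame average

print(f"one frame:  noise ≈ {frames[0].std():.4f}")
print(f"nine-frame: noise ≈ {merged.std():.4f} (expected {sigma / 3:.4f})")
```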

Also, in the past you couldn't see the HDR+ result in the viewfinder in real time. The Pixel 4 adds a "Live HDR+" feature: machine learning computes an approximation so that, while framing, you can see in real time what the HDR+ photo will look like. Shooting HDR+ becomes what-you-see-is-what-you-get.

Dual exposure control

And it's not just what-you-see-is-what-you-get: the Pixel 4 goes further with a "Dual Exposure Control" feature that lets you adjust the contrast of HDR+.

A good HDR+ shot should have wide dynamic range, clear detail in both the bright and dark areas, and natural exposure, but maybe you don't want such an evenly balanced exposure for a particular scene. With dual exposure control you can separately and manually adjust the overall capture brightness and the exposure of the dark areas (the tone mapping), watching the effect as you go. This lets you, for example, pull out a photo where the background keeps the brightness you want while the subject becomes a silhouette.
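
As a toy model of the two sliders (my own simplification, not Google's tone mapper): one control scales the overall capture exposure, the other lifts or keeps only the shadows via the tone curve.

```python
import numpy as np

def dual_exposure(img, brightness=1.0, shadows=1.0):
    """brightness: global exposure gain (slider 1).
    shadows: gamma-style shadow lift; values > 1 brighten dark areas (slider 2)."""
    out = np.clip(img * brightness, 0.0, 1.0)   # overall exposure
    return out ** (1.0 / shadows)               # shadow tone mapping

ramp = np.linspace(0.05, 1.0, 5)                # dark-to-bright test values
print(dual_exposure(ramp, brightness=0.5, shadows=1.0))  # silhouette look
print(dual_exposure(ramp, brightness=0.5, shadows=2.5))  # shadows lifted
```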

Machine Learning White Balance

On the Pixel 3, Google used machine learning to correct white balance in Night Sight, so that objects in low light wouldn't pick up color casts or lose their original saturation. On the Pixel 4, this machine-learning white balance correction is applied to every shooting mode, so that even under strange, complex lighting the camera can capture correct white balance, which matters especially when there are people in the frame and accurate skin tones are needed... I find this feature really necessary~~ I always get weird white balance on cloudy days and under KTV lighting, and it's really annoying...
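
For reference, the classical non-learned baseline is the gray-world assumption: scale each channel so the scene averages out to neutral. Google's version replaces this fragile heuristic with a trained model, but the sketch below shows what "correcting white balance" means computationally.

```python
import numpy as np

def gray_world_awb(img):
    """Gray-world white balance: per-channel gains that push the scene's
    average color to neutral gray (a classical heuristic, not the
    learned model the Pixel 4 uses)."""
    means = img.reshape(-1, 3).mean(axis=0)   # average R, G, B
    gains = means.mean() / means              # gains that neutralize the cast
    return np.clip(img * gains, 0.0, 1.0)

# a warm color cast, e.g. tungsten/KTV lighting; shape (H, W, 3) in [0, 1]
tinted = np.full((4, 4, 3), [0.80, 0.55, 0.35])
print(gray_world_awb(tinted)[0, 0])           # ≈ neutral gray after correction
```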

Night Sight can shoot the stars

The eye-catching Night Sight mode has evolved on the Pixel 4 to handle astrophotography, such as dense starfields and the Milky Way.

After you press the shutter, the camera takes 15 exposures of up to 16 seconds each and composites them. The whole process takes about 4 minutes, so this is a mode that needs a tripod or some other way to hold the phone still. While taking these astrophotos, machine learning is again used to correct the white balance in such low light.
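
The exposure budget is simple arithmetic: 15 frames × 16 s = 240 s ≈ 4 minutes. The per-frame cap matters because a single 4-minute exposure would smear stars into trails as the sky rotates, while averaging many shorter aligned frames still cuts noise by roughly 1/√15.

```python
frames, per_frame_s = 15, 16        # figures quoted on stage

total_s = frames * per_frame_s
print(f"total capture: {total_s} s = {total_s / 60:.0f} min")
print(f"noise vs. one 16 s frame: x{frames ** -0.5:.2f}")   # ≈ 0.26
```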

As for noise... the Pixel 4 uses what's called semantic segmentation: in short, when shooting Night Sight it identifies the sky, darkens that region, and applies noise reduction to it. On top of that, the sensor generates hot pixels during long exposures, a special kind of white speckle noise, and the longer the exposure, the more hot pixels appear. With a total exposure of about 4 minutes in astrophotography mode, there must be plenty of this noise. On stage it was said that the Pixel 4 uses a clever algorithm to handle hot pixels; how clever, who knows... in short, they seem to get removed from the picture. But I'd like to know how it avoids mistaking stars for white noise and accidentally removing them...
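
The talk didn't reveal the actual hot-pixel algorithm, so here is one plausible heuristic for that exact question: a hot pixel is an isolated single-pixel outlier, while a real star is blurred by the optics across several pixels, so a pixel that is far brighter than all eight of its neighbours can be replaced without touching stars.

```python
import numpy as np

def suppress_hot_pixels(img: np.ndarray, thresh: float = 0.2) -> np.ndarray:
    """Replace any pixel that is much brighter than ALL eight neighbours
    with the local median (a guess at the idea, not Google's algorithm)."""
    out = img.copy()
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            window = img[y - 1:y + 2, x - 1:x + 2]
            neighbours = np.delete(window.ravel(), 4)   # drop the centre pixel
            if img[y, x] - neighbours.max() > thresh:   # brighter than all 8
                out[y, x] = np.median(window)
    return out

sky = np.zeros((8, 8))
sky[2:4, 2:4] = 0.8        # a "star": a bright 2x2 blob, should survive
sky[6, 6] = 0.9            # a hot pixel: isolated, should be removed
cleaned = suppress_hot_pixels(sky)
print(cleaned[6, 6], cleaned[2, 2])   # -> 0.0 (removed) 0.8 (kept)
```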

Looking forward to moon mode?

Finally, and quite interestingly, Marc Levoy, the Google Distinguished Engineer who leads the Pixel camera team and has done extensive research on computational photography, talked on stage about what the Pixel 4 can't shoot...

He wasn't really tearing down his own stage; it was more showing off and building anticipation...

He mentioned the moon.

The Pixel 4 can also photograph the moon, but not in the Huawei style of zooming in on the moon by itself; instead, it uses the aforementioned dual exposure control to capture a round yellow moon within the real scene. The Pixel 4 can also capture the landscape in front of you in places lit only by moonlight.

However, the Pixel 4 can't yet capture both the moon and the landscape well in one shot, because the dynamic range between the brightness of the full moon and the landscape is too great, beyond even what a single-lens camera can handle. The current Pixel 4 still can't get a good shot of that, but...

Who knows about the future, though. In their view the Pixel phone is a camera defined by software, and software updates can keep making the camera more amazing, so let's look forward to it.

These camera updates were mostly expected, but re-reading about the dual exposure control and the astrophotography Night Sight mode announced at launch did rekindle some anticipation.

Face ID/Unlock

At the top of the screen, besides the 8MP camera for photos and video, there is a whole set of sensors: two infrared cameras for facial recognition, a dot projector, a flood illuminator, the Soli radar sensor chip, a proximity sensor, an ambient light sensor, and so on.

The Pixel 4 implements facial recognition with precision and security comparable to the iPhone's. Google previously noted that typical face unlock requires you to pick up the phone and hold it at a certain angle for it to sense you, then swipe the screen to reach the home screen and your apps; the Pixel 4 shortens this process.

As you reach for the Pixel 4, the Soli sensor detects that you want to unlock the phone and automatically wakes the facial-recognition sensor group. Once the sensors and algorithm recognize your face, the phone unlocks the moment you pick it up.

Earlier Google posts mentioned that facial recognition works at any angle, even with the phone held upside down.

I don't think users will hold the phone upside down, but they may well hold it sideways; if it can recognize you in landscape, that seems more convenient than the iPhone, though it still appears you have to pick the phone up and face it. (Having used the face unlock for a while now, I think the speed is very good. The next improvement should be widening the recognizable angles, so it works even when the phone is lying flat on the desk...)

Besides unlocking the phone, facial recognition can also authenticate Google Play purchases, just as fingerprints can now.

Motion Sense

This is a feature that was officially announced very early, and the model sold in Taiwan has it too.

It works through the Soli sensor chip on the front, which uses radar to sense gesture movements and maps them to different functions. The gestures implemented on the Pixel 4 are...

Wave your hand above the screen to skip songs; move your hand toward the phone to silence an incoming call; when the alarm rings, reaching toward the phone lowers the volume first and a wave then turns it off; and, as mentioned earlier, reaching for the phone pre-activates facial recognition, and so on.

Whether the gesture operation can be performed depends on whether the program supports it.

But some of you may be thinking: haven't Samsung and LG done this already? Palm-gesture selfies, the LG G8's Air Motion, the recent Note10's S Pen remote control, and so on are all operated at a distance. How different is the experience?

Never used them... so I don't know...

Today's remote gesture controls still suffer from misreads, sluggish response, and distance limits. If the Pixel 4's Motion Sense can avoid these problems, that will be the difference.

Also, radar sensing paired with the right algorithms can distinguish subtle finger movements (rubbing, turning, twisting the fingers, and so on), not just simple palm waves. For now, though, the gestures announced for the Pixel 4 all look like fairly simple (shallow?) applications (the alarm's two-step lower-then-dismiss is the more special one).

When this technology was shown at I/O earlier, the demos included air gestures like strumming a virtual soundboard to make a speaker play music, or repeatedly waving goodbye to turn the sound off. So Soli can in fact map many kinds of palm and finger movements to different mid-air commands, which makes it a gesture-control solution with real "potential".

90Hz screen refresh rate

As the screen specs show, the Pixel 4 supports a 90Hz refresh rate, enabled or disabled per app. For example, when you're gaming or browsing photos the screen refreshes at 90Hz to keep things smooth, and otherwise it stays at 60Hz to save battery, letting the Pixel 4 last a full day. The Pixel 4 also has a smart screen adjustment feature that automatically tunes the color temperature to the ambient light.

Titan M protects private data inside the device

Besides the Snapdragon 855 processor, the Pixel 4 also carries the Pixel Neural Core chip. This engine, designed specifically for the Pixel 4, powers the neural-network processing behind Motion Sense, face unlock, Google Assistant, and handy camera features such as Lens suggestions and frequently photographed faces.

The Pixel 4 is also equipped with the exclusive "Titan M" security chip, which protects the user's most sensitive personal data, such as face data, and enables Google's advanced security services.

Having read through it all, what I'm most curious about is still the camera and Motion Sense. Pixel phones keep bringing us new technology, and as a tech fan I should be eager to try it, but from a practical standpoint the thing I care about most is: video.

This time there was almost no mention of updates to video recording. The specs list 4K/30fps as the maximum; for a flagship, the lack of 4K/60fps feels a bit sad, and the powerful algorithms don't seem to extend to video. Besides the missing ultra-wide, this is another possible shortcoming. (We'll only know after testing, though.)

Also, there wasn't much focus on the gaming experience; compared with phones that offer a dedicated game mode, whether the experience holds up may have to be checked later.