With the new Pixel 4 series unveiled this year, Google once again demonstrated how artificial intelligence can be used to build a modern smartphone, while fully integrating Google's own services into the Pixel 4 series.
The Pixel 4 series phones launched this time differ only in body size and battery capacity; everything else, including the processor, memory, storage capacity, and even the body colors and camera configuration, is identical, making it easier for consumers to choose according to their needs.
Soli on the Pixel 4 series looks more like a public beta
According to Nanda Ramachandran, Google's general manager of global Pixel products, the Pixel 4 series debuts Soli, a radar-based technology that enables a new kind of human-computer interaction, letting users perform more operations through in-air gestures.
For now, this design will not be opened up to other OEM manufacturers and will only be available on the Pixel 4 series. However, relevant API resources will be provided to developers so they can build different app experiences on top of Soli. For example, in the Pokémon app demonstrated on the Pixel 4, users can interact with Pikachu by waving at the screen.
In my view, Google's idea is to first ship this feature on the Pixel 4 series and observe how it fares in the consumer market. By also leveraging its developer ecosystem to create richer Soli applications, Google may be able to push the feature to more OEM manufacturers in the future.
In fact, manufacturers including Samsung and LG applied gesture controls to their phones long ago, but they failed to build an application ecosystem around them, and the controls ended up as short-lived features on specific models.
If Soli can gain traction through the Pixel 4 series and attract more developers, however, it could popularize more in-air gestures and other applications that use radar for depth sensing, ultimately expanding phones beyond touch and voice control to gestures and facial recognition as new modes of human-computer interaction.
More demonstrations of artificial intelligence technology
The Pixel 4 launched this time places even more emphasis on applying artificial intelligence and on integrating software and hardware design.
A similar approach was already evident in last year's Pixel 3 series. Even with the Pixel 3a series launched in the first half of this year, Google confirmed that, despite removing the Pixel Visual Core and adopting a lower-performance processor, it could still deliver the Pixel 3's AI-driven features, such as its standout night photography. Overall computing speed may not match the Pixel 3 series, but the price can be reduced significantly, attracting more consumers.
Nanda Ramachandran stated that the same marketing strategy will continue, suggesting Google may still release derivatives of the Pixel 4 series with simplified hardware and more affordable prices, though he would not directly confirm whether the Pixel 4a name will be used.
Compared with last year's Pixel 3 series, this year's Pixel 4 actually simplifies many of the previous design elements. Aside from the square camera module housing the dual rear lenses and the move from the original two-tone composite back to a plainer finish, the Pixel 4 series has no especially distinctive look; it doesn't even adopt the narrower bezels or higher screen-to-body ratio common in current phones.
Instead, the focus is on integrating artificial intelligence with the software and hardware of the Pixel 4 series.
For example, building on the Live Caption real-time captioning feature introduced in Android 10, the Recorder app on the Pixel 4 series can transcribe recordings into text in real time. Transcripts can later be retrieved through search, and even annotated and organized with tags (see note).
Note: at present this feature works mainly in English; whether it will support Chinese appears to depend on Google's subsequent efforts.
Another, more obvious example is the camera.
In addition to further improving HDR+ and smart white balance, Google now lets users adjust exposure and dynamic contrast directly while shooting for more precise results. Accelerated by the new Pixel Neural Core chip, adjustments can be previewed instantly in the viewfinder with no perceptible delay.
The high-resolution Super Res Zoom feature that debuted last year on the Pixel 3 is now paired with a new 2x telephoto lens that captures more image information; combined with AI-based correction, it achieves up to 8x high-resolution zoom, letting users shoot subjects even farther away. The second lens also enables long-range depth-of-field portraits and more natural rendering of fine edge details, such as animal fur.
Night photography has also been improved, making it easy to capture the Milky Way and starry skies. Google even indicated that a future software update will allow users to capture more natural shots of the moon in extremely dark environments without abnormal overexposure.
To further showcase the AI photography experience, Google does not offer a direct switch between the wide-angle and 2x telephoto lenses on the Pixel 4 series. Instead, both lenses are activated simultaneously for corrective computation, achieving a high-magnification zoom while maintaining high resolution and satisfying most users' desire to shoot farther.
As for why Google omitted an ultra-wide-angle lens from the Pixel 4 series, its relatively low usage rate may be one reason, but more likely Google wants to focus on showcasing what its AI-driven photography can achieve, while also demonstrating a different dual-lens application model to OEM manufacturers.
In no rush to enter 5G, but prepared for it
According to Nanda Ramachandran, widespread 5G adoption is indeed getting closer, and Google has made preparations for it. However, Google believes the technology still has room to mature, so it will not rush 5G products to market; the Pixel 4 series launched this year sticks to 4G network specifications.
Regarding the numerous pre-launch leaks about the Pixel 4 series, Nanda Ramachandran clarified that aside from the previously announced Pixel 4 name and the confirmation that it would feature Soli, the other leaked information did not come from official promotion.