The iPhone 12 Pro's LiDAR will be supplied by Sony to enhance camera and AR capabilities

Zhidong news, September 4 — Fast Company recently reported that Sony's LiDAR camera, which appeared in the iPad Pro earlier this year, will be used in this year's new iPhone Pro models. The LiDAR system, to be produced by Sony, uses light pulses to accurately measure the distance between an object and the camera lens, according to Fast Company. With this data, the camera can autofocus more precisely, separate foreground from background more cleanly, and produce effects such as a more realistic Portrait mode.

Since the iPhone X in 2017, high-end iPhones have carried a different kind of front-facing 3D structured-light camera system, called TrueDepth, used for facial recognition and animated 3D expressions. The system projects 30,000 infrared dots onto the user's face to form a 3D map of its contours, providing high-precision measurement at distances of less than one meter.

The iPhone's new LiDAR system will measure the distance of objects both near and far. It uses a "direct time of flight" method: a vertical-cavity surface-emitting laser (VCSEL) emits a large number of photons, which travel at the speed of light and reflect off objects or surfaces in the camera's field of view.

Some photons bounce back to a small sensor near the camera. The software then uses their round-trip time to derive the distance from the object or surface to the camera lens. This is very similar to radar, except that LiDAR uses laser light instead of radio waves.

Some iPhones already on the market can achieve a Portrait-mode effect, but they do so by measuring how an object's apparent position differs between two cameras. This difference, called "parallax," is larger for nearby objects and smaller for distant ones, so the distance to an object can be estimated from it.
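The two depth-estimation principles described above can be sketched numerically. This is a minimal illustration only: the timings and camera parameters below are made-up values for demonstration, not figures from Apple or Sony.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_from_time_of_flight(round_trip_seconds: float) -> float:
    """Direct time of flight: a photon travels to the surface and back,
    so the one-way distance is half the total path length."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def depth_from_parallax(focal_px: float, baseline_m: float,
                        disparity_px: float) -> float:
    """Parallax: nearby objects shift more between two camera views
    (large disparity), so depth is inversely proportional to disparity."""
    return focal_px * baseline_m / disparity_px

# A surface about 3 m away returns photons after roughly 20 nanoseconds:
print(round(depth_from_time_of_flight(20e-9), 3))  # 2.998 (meters)

# Two lenses 1 cm apart with a 2800 px focal length (illustrative values):
print(depth_from_parallax(2800.0, 0.01, 20.0))  # 1.4 m  (large disparity -> near)
print(depth_from_parallax(2800.0, 0.01, 2.0))   # 14.0 m (small disparity -> far)
```

Both methods recover depth, but time of flight measures it directly per pulse, while parallax infers it from the geometry of two views, which is why LiDAR handles long distances and low-texture scenes more robustly.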
When a LiDAR system measures depth directly, the effect is more realistic. The LiDAR camera will bring new perception capabilities to the iPhone: the sensor collects accurate distance data, which lets the camera better understand the objects in front of it.

For example, it can make objects in the foreground look sharper and the background more blurred. Refining these nuances can improve the iPhone's autofocus and yield more realistic photos, closer to those taken by traditional full-size lens cameras.

The LiDAR system will also upgrade the phone's AR experience. The depth data, combined with data from the other camera sensors and the motion sensors, will help AR software place digital images more accurately within the real-world scene seen through the camera lens.

LiDAR can also measure spaces, objects, or people with higher accuracy. For example, a medical rehabilitation AR application might use depth data to measure the range of motion of a patient's arm more precisely.

The sensor chip for the iPhone X's front camera was supplied by the Franco-Italian semiconductor company STMicroelectronics, and many in the industry believe the sensor chip in the iPad Pro was supplied by it as well.

Over the past two years, however, Sony's image sensor business has grown as its focus on developing CMOS image sensors has paid off. In fact, Sony renamed its semiconductor division "Imaging and Sensing Solutions" in the second quarter of 2019 to reflect this new source of growth.

Sony has developed advanced sensor technology integrated with computer-vision AI and has won many smartphone image-sensor contracts, which greatly propelled this move.
Sony's time-of-flight depth-sensing modules have already been used in high-end Samsung and Huawei smartphones and in Apple's iPad Pro, and will now appear in the new iPhone Pro models. Sony has long been a major supplier of camera sensors for Apple and other smartphone makers.

According to leaked documents, Apple plans to release four new iPhones sometime this fall. All four devices are reported to support 5G wireless connectivity, but, as always, advances in camera functionality will take the lead.

Many reports say that at the top of the line are two Pro phones: a 6.1-inch device that may be called the iPhone 12 Pro, and a 6.5-inch device that may be called the iPhone 12 Pro Max. According to Fast Company, it is these two high-end devices that will adopt the new LiDAR system.

According to information leaked by Jon Prosser, the camera settings of a PVT (production validation test) build of the iPhone 12 Pro Max show several LiDAR-related camera functions that can be switched on and off. For video, the LiDAR system will continuously track the depth of different parts of an object as it moves across the screen. This lets the camera software separate a 3D object from the video background and apply special effects to it, such as ultra slow motion. The options screen also shows a new "enhanced night mode" that keeps the shutter open longer to collect more light for the shot.

According to Fast Company's report, Sony's LiDAR camera in this year's new iPhones will improve photo quality and AR applications, although the change may not be immediately obvious. The deeper significance of the LiDAR camera is that it extends Apple's AR strategy on the application side, an important step in Apple's exploration of the interaction models and value of the next generation of smartphones.