The third beta of iOS 13 and iPadOS has revealed a subtle but potentially game-changing development in FaceTime video calls. If you have ever been on a video call, you know you usually look at the screen to see the person you're talking to, not at the camera. That means the other person never sees you looking directly at them; your gaze always appears to point slightly away.
iOS 13 and iPadOS aim to solve this long-standing quirk of video calls with a software correction: the system detects your face and adjusts your gaze, so that the person or people watching you on screen have the feeling that you are looking directly at them.
The finding was first shared on Reddit, and images already posted on Twitter show the effect working remarkably well.
The secret is ARKit: iOS builds a three-dimensional map of your face and uses it to subtly reposition both the nose and the eyes.
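Apple has not published how the correction is implemented, but ARKit's face-tracking API does expose the ingredients such a feature would need. The sketch below uses real ARKit types (`ARFaceTrackingConfiguration`, `ARFaceAnchor`, `lookAtPoint`); the correction step itself is only described in comments, as an assumption about how it could work:

```swift
import ARKit

// Illustrative sketch: how ARKit face tracking could feed a gaze
// correction like the one seen in the iOS 13 beta. The ARKit calls are
// standard API; the warping logic is Apple's and remains unpublished.
final class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X or later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // ARKit provides a 3D mesh of the face, per-eye transforms,
            // and an estimated look-at point in face coordinate space.
            let gaze = faceAnchor.lookAtPoint
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            // A correction would then warp the eye and nose regions of the
            // outgoing video frame so the rendered gaze points at the
            // camera rather than at `gaze` — that step is hypothetical here.
            _ = (gaze, leftEye, rightEye)
        }
    }
}
```

Since this depends on the TrueDepth sensor, it would also explain why the feature could be limited to recent iPhones and iPads.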
Anticipating that the correction may feel uncomfortable to some users, Apple has included an option in both systems to disable it.
Of course, if this feature is not removed in upcoming betas, it is something we will be able to test as soon as iOS 13 and iPadOS reach their stable release for everyone. Some questions remain, such as whether the correction will work only on one-on-one calls or also on group calls, and whether it will also be present in macOS Catalina.