Apple uses ARKit for FaceTime eye contact

Apple presents what is probably the most impressive augmented reality face filter to date.

Apple's FaceTime video chat app gains a new feature in iOS 13 that goes by the euphonious name "attention correction": if the user looks at the screen instead of the camera during a video chat, the software corrects their gaze so that they appear to be looking into the camera, maintaining digital eye contact.

The effect is intended to solve a common problem in video conferences: participants look at each other on the monitor rather than into the camera, and so appear to look past one another instead of making eye contact.

Eye movements directed beyond the screen are transmitted unaltered, and attention correction can be deactivated entirely.

An advanced AR face filter

In technical terms, the eye correction is an augmented reality trick familiar from Snapchat's and Facebook's face filters: a depth map of the face is created with the "TrueDepth" 3D face-scanning system of the iPhone XS, XS Max, and XR.

Apple's AR framework ARKit can then use this depth map to superimpose digitally rendered eyes precisely over the real ones.
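Apple has not published how attention correction is implemented, but ARKit's public face-tracking API does expose the per-frame eye data such an effect builds on. The following is only a minimal sketch of that API surface, not Apple's implementation; the class name `EyeTrackingDelegate` is a hypothetical example.

```swift
import ARKit

// Minimal sketch of ARKit face tracking as exposed on TrueDepth devices.
// NOT Apple's attention-correction code (which is not public); it only shows
// the per-frame eye data an eye-contact effect could build on.
class EyeTrackingDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (e.g. iPhone XS/XS Max/XR).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Pose of each eye relative to the face anchor, plus an
            // estimate of the point the user is looking at.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gaze = face.lookAtPoint
            // An attention-correction effect could re-render the eye
            // regions so the gaze vector points toward the camera.
            _ = (leftEye, rightEye, gaze)
        }
    }
}
```

Because face tracking needs the TrueDepth sensor, this code only runs on a compatible device, which also explains why Apple restricted the FaceTime feature to those iPhone models.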

Apple may be providing Facebook's research department with an interesting example of an implementation here: the company is working on very realistic VR avatars that even synchronise eye movements with the person being tracked, but these do not yet offer direct eye contact due to camera distortions.

Source: mixed
