There’s always an awkwardness to video calls: do you look at the camera and lose sight of what the other person is doing, or do you stare at the screen and give up any “eye contact”? Luckily, Apple has addressed this problem with a feature called FaceTime Attention Correction.
Apple is known for solving problems we didn’t know we had. Here, machine learning finds your eyes and raises them so it looks like you’re looking at the person on the other end of the call, without you having to stare into the camera. It’s a relatively small change, but it makes a video call feel far more intimate; it no longer feels as if there is a barrier between you and the other person.
The feature uses the TrueDepth camera system on the front of the phone, the same camera system used for unlocking the phone and for Animoji, in conjunction with ARKit. It builds a 3D face map and depth map of the user’s face, determines where the eyes are, and adjusts them accordingly.
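To get a feel for the tracking side of this, here is a minimal Swift sketch using ARKit’s face-tracking API, which exposes per-eye transforms and a gaze point from the TrueDepth camera. The actual eye-adjustment step is private to Apple, so the `EyeTracker` class and its use of the data are purely illustrative; only the `ARFaceAnchor` properties shown are real API.

```swift
import ARKit

// Illustrative sketch: reading eye data from ARKit face tracking.
// ARFaceAnchor (driven by the TrueDepth camera) provides the kind of
// data an effect like Attention Correction could build on.
class EyeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth-equipped device.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Where each eye sits relative to the face anchor,
            // and the point the user appears to be looking at.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gaze = face.lookAtPoint
            _ = (leftEye, rightEye, gaze) // hand off to your own adjustment logic
        }
    }
}
```

Since this depends on ARKit and a physical TrueDepth camera, it only runs on a supported iOS device, not in a simulator.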
Sometimes FaceTime raises the eye line convincingly; other times the effect can be quite unsettling. Still, it’s a great step in the right direction and will likely improve over time.