The next version of FaceTime may use artificial intelligence to make it appear to the person you are calling that you are looking at the camera, even when your eyes are fixed on your device’s screen.
Have you ever tried looking directly at your phone’s camera – usually located above the screen – during a video call? If so, your gaze probably drifted back to the other person on the screen within a few seconds.
The problem with looking at your screen during a video call is the lack of eye contact with the person you are talking to.
According to app designer Mike Rundle, the beta version of iOS 13 – currently available to testers – automatically corrects the gaze of FaceTime users. This means that even if your eyes are fixed on your phone’s screen, the person on the other end will see you as if you were looking at the camera – and therefore looking them in the eye.
Mr. Rundle demonstrated this in a post on his Twitter account. The post contains four screenshots; the one showing the gaze correction is labeled “Looking at screen with FaceTime in iOS 13 beta 3”. In that image, Mike Rundle is looking at his phone’s screen, but his eyes appear to be fixed on the camera.
Guys – "FaceTime Attention Correction" in iOS 13 beta 3 is wild.
— Will Sigmon (@WSig) July 2, 2019
For now, this feature is available only in the beta, and only on the iPhone XS and XS Max. Since this is a test version, the feature could be modified or dropped before the official release of iOS 13 this fall.