thispersondoesnotexist is a popular website that generates human faces with AI. It is a very interesting tool and produces impressive results. The site also links to the code in case you want to test it locally and better understand how the algorithm works. It's an amazing tool and worth knowing. However, because of the quality of the results, this site in particular has attracted scammers looking for faces to put on their fake accounts. In this article, we'll look at the spots we can shine a light on to manually detect generated faces.

To get started, let's look at a set of six examples:

At first glance, they may look like ordinary photos. But looking in more detail, you can identify some inconsistencies that give away that these photos are not real. With some color tags, it's easier to analyze them:

In yellow: the site usually generates images with a single-color or blurred background. However, sometimes it includes detailed landscapes or backgrounds that end up as distorted noise. The effect is especially striking in the lower-left photo, where the woman appears to be in a garden painted by Salvador Dalí. Also notice how, in the upper-right photo, the background to the left of the face is slightly distorted compared to the right side, although it still looks like the same environment.

In red: I reviewed more than two hundred images generated by the code, and the most frequent and visible problem is the ears. Ears are almost always generated independently of each other, which leads to differences in size, placement, and shape, mainly in the earlobes.

In blue: another rarer but very visible artifact is in the shoulders. Sometimes an image is generated with differences between the two sides. See in the top-center image how the right side does not have the same collar as the other side, or in the image at the right, where the left shoulder simply doesn't exist.

In green: last but not least, skin and limb inconsistencies. In the upper-center photo, in the neck region, we see a sudden change in skin tone. And in the lower-right photo, we see what should be fingers, also in a different skin tone.

These "mistakes" happen because the site puts together multiple AI templates that are created for different purposes. When used individually, both background images and limbs (eg hands) are rendered to perfection. See the sample images in the code repositories: [1]

These cities represent background environments. Note that there are still many errors, especially in water reflections, but the result is sharp.

The algorithms themselves are excellent and their evolution is an unquestionable advance. However, criminals will take advantage of these advances to deceive their victims.

Speaking specifically of the images generated on the site, there are other factors that allow identification: the photo is always a close-up of the face, and the faces always have very similar angles and positioning. I believe that most scammers, or at least a large portion of the less sophisticated ones, will not bother to download the code and run it locally. They will simply grab an image directly from the site that they deem appropriate. That said, the points noted here can be a good start for identifying fake faces.
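A couple of these traits are easy to check programmatically as a first pass over a suspicious profile photo. The following is a minimal Python sketch, not a detector: it only flags superficial hints, and the fixed 1024x1024 square output size is an assumption about how the site serves its images, used here purely for illustration.

```python
# Minimal screening sketch: flags superficial traits of images taken
# directly from the site. These are heuristics, not proof of a fake.
from PIL import Image

# Assumed native output size of the generator behind the site.
EXPECTED_SIZE = (1024, 1024)

def quick_screen(path: str) -> list:
    """Return reasons why a photo *might* be a generated face."""
    flags = []
    with Image.open(path) as img:
        if img.size == EXPECTED_SIZE:
            flags.append("exact 1024x1024 resolution (assumed generator output size)")
        if img.width == img.height:
            flags.append("perfectly square crop, unusual for casual profile photos")
    # A further check (not implemented here) would compare the position of the
    # eyes across photos: generated faces tend to share the same alignment.
    return flags

if __name__ == "__main__":
    import sys
    for reason in quick_screen(sys.argv[1]):
        print("suspicious:", reason)
```

None of these checks is conclusive on its own; a legitimate photo can be square, and a cropped or resized fake will pass. They are only meant to prioritize which photos deserve the manual inspection described above.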

DISCLAIMER: not all use of fake images like these on social profiles is reprehensible. If you need a higher level of privacy, using such a realistic face on your profile can significantly increase your security.