With the flood of AI-generated imagery, can we still distinguish what is real and what is “fake”? Is it really fake if it was generated by an algorithm?
This website shows a new face every time it is loaded, and that person does not actually exist. It uses StyleGAN2 and was created in 2019, but I ran into it again recently and was fascinated (again).
The algorithm still holds up remarkably well, with very few flaws, at least to my untrained eye (see “How can you tell” below).
Have I Been Trained
Reverse identification of images used to train models.
Search 5.8 billion images used to train popular AI art models
Which face is real
Our aim is to make you aware of the ease with which digital identities can be faked, and to help you spot these fakes at a single glance.
How can you tell
TL;DR: potential tells
- Background problems
- Fluorescent bleed
Stable Attribution’s algorithm decodes an image generated by an A.I. model into the most similar examples from the data that the model was trained with.
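Stable Attribution doesn’t publish its internals, but the general idea it describes, finding the training examples most similar to a generated image, can be sketched as nearest-neighbor search over image embeddings. The vectors below are random stand-ins (a real system would embed images with an encoder such as CLIP); everything here is illustrative, not Stable Attribution’s actual method.

```python
import math
import random

random.seed(0)

def rand_vec(dim):
    """A random vector standing in for an image embedding."""
    return [random.gauss(0, 1) for _ in range(dim)]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical "training set" of 200 embedded images.
bank = [rand_vec(64) for _ in range(200)]

# A "generated" image whose embedding sits very close to training example 42.
query = [x + 0.01 * random.gauss(0, 1) for x in bank[42]]

def most_similar(query, bank, k=3):
    """Indices of the k bank vectors most cosine-similar to the query."""
    ranked = sorted(range(len(bank)),
                    key=lambda i: cosine(query, bank[i]),
                    reverse=True)
    return ranked[:k]

print(most_similar(query, bank))  # training example 42 ranks first
```

Scaled to billions of images (as Have I Been Trained claims), exact search like this becomes impractical, and approximate nearest-neighbor indexes are typically used instead.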