Topic 4 – Images

Image or video: reliable?

Images and videos make up a large part of what we watch every day. How can we step back and analyse them?

1. Keep in mind that images and videos are not to be taken at face value: the mere fact that a film exists does not make it true. It may have been manipulated or selectively interpreted.

2. Try to put it in context, using the 5 Ws: Who is the author? Where did it take place? When? How was it broadcast? What is the message? You have to determine what it is all about. The caption adds elements, but sometimes gives only one point of view, or even an interpretation.

3. Read the comments: they sometimes contain details or relevant remarks.

4. Check whether the image appears elsewhere on the web. For this, you can use a reverse image search tool such as Google Images (https://images.google.com) or TinEye (https://tineye.com/).

5. Don’t rely on how many times it has been shared or forwarded: that is no proof of reliability. It only means the message has a strong impact and makes people react.
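Step 4 can even be partly scripted. The sketch below builds reverse-image-search links for a suspect image so you can open them in a browser. The query-string formats for Google Lens and TinEye are assumptions based on commonly observed URL patterns, not officially documented APIs, and the example image URL is hypothetical.

```python
# Minimal sketch: build reverse-image-search URLs for a suspect image.
# NOTE: the "uploadbyurl" and "search?url=" query formats are assumed
# from observed behaviour, not guaranteed by any official API.
from urllib.parse import quote


def reverse_search_urls(image_url: str) -> dict:
    """Return search-engine URLs that look up where an image appears online."""
    encoded = quote(image_url, safe="")  # percent-encode the whole URL
    return {
        "google": f"https://lens.google.com/uploadbyurl?url={encoded}",
        "tineye": f"https://tineye.com/search?url={encoded}",
    }


# Hypothetical suspect image:
for name, url in reverse_search_urls("https://example.com/suspect-photo.jpg").items():
    print(name, url)
```

Opening each printed link in a browser shows where else the picture has been published, which often reveals the original context and date.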

What about images generated by artificial intelligence?

Thanks to, or because of, artificial intelligence (AI), fake-news propagators have a new means of spreading misinformation. Using a variety of software tools, they can now generate fictitious yet highly realistic images. Some of these images, linked to current events, have caused much confusion. It is therefore all the more important to pay attention to certain points in order to sort the real from the fake. Too few AI tools make the task easier by leaving a visual signature such as a watermark.

How can you spot AI-generated images? Here are some visual clues:

AI has shortcomings that lead to visual inconsistencies: the hands of the people depicted are very often deformed, asymmetries are common, and faces can be disproportionate. Teeth, ears and hair may look strange too.
AIs also struggle to generate reflections, so look closely at shadows, mirrors and water.

Since an AI-generated shot is a blend of several other images, experts also advise scrutinising potential changes in light. Overall, it is in the background of the photo that the AI is most likely to give itself away: the further away an element is, the more likely it is to be blurred, distorted, or to have incorrect perspective.

But AIs are getting better every day, and we probably won’t be able to rely on this type of anomaly to detect fake images for much longer. That’s why we need to be extra vigilant and adopt a fact-checking attitude to avoid being trapped. For instance, compare any suspect image with other photos of the same event from one or more reliable sources. A reverse image search is also a good way of tracing the source of a publication.