Since news broke of former finance minister Ken Ofori-Atta’s detention by Immigration and Customs Enforcement (ICE) officials in the US, social media users have shared an image and a video purporting to show the moment the former minister was arrested.
The content was shared across platforms – X, Facebook, TikTok and Instagram. One of these posts, shared on Facebook on January 9, 2026, showing Mr Ofori-Atta handcuffed and being escorted into a van by two ICE officials, had garnered at least 1.6K likes, 257 shares and 192 comments.
Similar posts on other platforms such as X gained close to 2,000 views. On TikTok and Instagram, the same image and video were shared, some even by non-Ghanaian accounts or pages (see here, here, and here).
While the content appears, on its face, to be AI-generated, some users may still be susceptible to believing it, as can be gleaned from comments here, here, and here. As is often the case, a number of commenters asked whether the image was authentic or AI-generated.

In this piece, GhanaFact analyzes the image and video to show that they are indeed fake. Three details give the content away:
1. The four-pointed star at the bottom-right corner of the image is the logo of Google’s AI platform, Gemini.
2. The former minister appears to be taken away from the detention center, not brought in.
3. While the inscription behind him reads ‘ICE DETENTION CENTER, FARMVILLE, VA’, Ofori-Atta is being detained at the CAROLINE DETENTION FACILITY.

In addition, detection tools can help individuals spot fakes and AI-generated content. These include Google SynthID Detector, Hive Moderation, and WasitAI, among many others.
For example, GhanaFact ran the circulating images of Mr. Ofori-Atta’s arrest through Google SynthID Detector, and the platform gave a categorical verdict that the image is AI-generated.
“SynthID detected in all or part of the uploaded content. SynthID confidence: Very High,” the results noted.

Analysis of the arrest video
In the case of the six-second video depicting the arrest, GhanaFact first saw it on Facebook. It was an animated version of the AI-generated image, capturing Ofori-Atta being ushered into the white ICE van by the two officers.
When GhanaFact passed the video through the Google Gemini platform, it returned a result that “the image and its associated video contain significant signs of being AI-generated or digitally manipulated, though there is real-world news regarding the subject.”
After providing a summary of the real-world news on the subject, Gemini summarized its findings as follows: “while the event of his detention is real, the specific image and video showing him at a “Farmville” facility are likely AI-generated “deepfakes” or dramatizations created to illustrate the news.”


Ofori-Atta’s arrest was first reported by his lawyers in Ghana via a press statement. There has been no comment on the matter from either ICE or the Federal Bureau of Investigation (FBI), which are said to have been involved in the arrest and detention.
No images of Ofori-Atta before, during or after the arrest have been published by any credible media outlet, and reports indicate that he has also rejected consular support from the Ghana Embassy in the US.
This is not the first time a video has been created from a viral image. GhanaFact has worked on a similar case, in which a digitally altered image from 2022 of former president Akufo-Addo and Serwaa Broni, allegedly in a private jet, was animated in 2025.
Some practical tips for spotting AI-generated mis/disinformation
As the information environment continues to evolve, AI-generated images and videos are being used to blur the line between what is real and what isn’t. But while misleading content has become increasingly common, there are ways to determine whether a piece of media is a deepfake, manipulated, or AI-generated.
First, there are manual ways to detect whether an image or video is AI-generated; this approach was largely applied above in the case of the Ofori-Atta image.
These include checking images for signs of facial inconsistencies or visual clues in the subject’s environment (if any). Some of the things to look out for include whether:
- There are identifiable faces of the subjects in the image
- Anybody’s proportions are unexpected or distorted in any way, for example misaligned ears or unnaturally asymmetrical eyes
- The image exists or matches other verifiable images of the event
AI-generated content often has an airbrushed look, making it appear smooth, glossy or flawless. This, too, is an indicator that an image was generated with AI.
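One of the checks above, whether an image matches other verifiable images of the event, can be partly automated with perceptual hashing, the basic idea behind reverse image search. The sketch below is purely illustrative: the function names and the toy 4x4 “images” are assumptions made for this example, not part of any specific detection tool. It shows how a simple “average hash” lets a lightly edited copy of an image still match its original, while an unrelated image does not.

```python
def average_hash(pixels):
    """Compute a simple average hash of a grayscale image (2D list of 0-255 values).

    Each pixel contributes one bit: 1 if it is brighter than the image's mean
    brightness, 0 otherwise. Visually similar images tend to share hashes.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the images match."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 4x4 "images": `brightened` is a slightly edited copy of `original`
# (as a re-shared social media image might be); `unrelated` is a different picture.
original = [[10, 10, 200, 200],
            [10, 10, 200, 200],
            [200, 200, 10, 10],
            [200, 200, 10, 10]]
brightened = [[min(p + 20, 255) for p in row] for row in original]
unrelated = [[200, 10, 200, 10],
             [10, 200, 10, 200],
             [200, 10, 200, 10],
             [10, 200, 10, 200]]

print(hamming_distance(average_hash(original), average_hash(brightened)))  # prints 0
print(hamming_distance(average_hash(original), average_hash(unrelated)))   # prints a larger number
```

Real reverse image search engines use far more robust versions of this idea, but the principle is the same: a circulating image that matches no verified photograph of an event is a red flag.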
Conclusion
The recent circulation of AI-generated images and videos of former finance minister Ken Ofori-Atta underscores the importance of media literacy in identifying and detecting such synthetic media.
Written by Gifty Danso











