The Rise of AI-Generated Action Figures: A Fun Trend with Hidden Consequences

May 1, 2025

As April rolled in, social media platforms like LinkedIn and X saw a remarkable surge in the popularity of personalized action figures. Each figure is an artistic rendition of the individual who created it, showcasing their likeness with striking accuracy. These digital creations come complete with unique accessories reflecting the creator's personality: think reusable coffee cups, yoga mats, and headphones. The trend has captivated users, encouraging them to share their newly minted digital avatars with friends and followers.
Central to this phenomenon is OpenAI's GPT-4o-powered image generator. This state-of-the-art tool extends the capabilities of ChatGPT, allowing users to effortlessly edit images, generate text, and much more. Notably, the image generator can also produce images reminiscent of the enchanting style of Japanese animated film studio Studio Ghibli, which has further fueled its popularity as it rapidly spread across the internet.
Creating one of these action figures or Studio Ghibli-inspired images is a simple process that requires only a free ChatGPT account and a photograph. However, this ease of use comes at a cost: users must provide a significant amount of personal data to OpenAI, which could potentially be utilized to refine its models and improve its services.
Hidden Data Risks
When using an AI image editor, the data you share often remains obscured. According to Tom Vazdar, an area chair for cybersecurity at the Open Institute of Technology, uploading an image to ChatGPT may inadvertently yield an extensive collection of metadata. This encompasses the EXIF data linked to the image file, which can reveal crucial information such as the time the photo was taken, its location, and even GPS coordinates.
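To illustrate how much EXIF metadata an ordinary photo can carry, here is a minimal sketch using the third-party Pillow library (an assumption for illustration; it is not tied to ChatGPT or OpenAI's pipeline). The first function lists a file's EXIF tags, such as camera make or capture time; the second re-encodes the image without them, which is one common way to strip metadata before uploading anywhere.

```python
from io import BytesIO
from PIL import Image, ExifTags  # third-party: pip install Pillow


def read_exif(image_bytes: bytes) -> dict:
    """Return the image's EXIF tags as a {tag_name: value} dict."""
    img = Image.open(BytesIO(image_bytes))
    exif = img.getexif()
    # Map numeric tag IDs (e.g. 271) to readable names (e.g. "Make").
    return {ExifTags.TAGS.get(tag_id, tag_id): value
            for tag_id, value in exif.items()}


def strip_metadata(image_bytes: bytes) -> bytes:
    """Re-encode the image; saving without an exif= argument drops the EXIF block."""
    img = Image.open(BytesIO(image_bytes))
    out = BytesIO()
    img.save(out, format="JPEG")
    return out.getvalue()
```

A photo's GPS coordinates live in a dedicated EXIF sub-directory (the GPS IFD), so a tool like this is also where location data would surface; conversely, re-encoding as above removes it along with everything else.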
Moreover, OpenAI collects data regarding the device you use to access the platform, which includes details like the type of device, operating system, browser version, and unique identifiers. Vazdar further elaborates that because platforms like ChatGPT operate through conversational interactions, they also gather behavioral data. This includes what users type, the kinds of images they request, their interactions with the interface, and the frequency of those actions.
It's important to note that uploading a high-resolution photo does not only share your facial features; it also reveals everything else contained within the image. This might include the background, other individuals present, items in your living space, and any readable documents or badges. Camden Woollven, group head of AI product marketing at risk management firm GRC International Group, highlights that this information could inadvertently expose much more than the user intended.
This voluntarily provided data, which is backed by user consent, represents a gold mine for training generative models, especially those that integrate visual inputs, asserts Vazdar. The wealth of information can significantly enhance the capabilities of AI products.
While OpenAI dismisses claims that it is orchestrating these viral photo trends as a strategy to amass user data, the company undeniably benefits from the situation. Users willingly uploading their own images eliminates the need for OpenAI to scrape the internet for facial data. Vazdar points out that whether by design or simply a fortuitous development, this trend is providing the company with vast quantities of fresh, high-quality facial data that encompasses a diverse range of ages, ethnicities, and geographical locations.
OpenAI asserts that it does not actively pursue personal information to train its models, nor does it utilize public data from the internet to create user profiles for advertising purposes or for selling data. An OpenAI spokesperson clarified to WIRED that the images submitted through ChatGPT may be retained for the purpose of improving the models, emphasizing the company's commitment to user privacy and data security.
Angela Thompson
Source of the news: Wired