Exposing.ai has developed a free tool that you can use to check whether your images have been used to build face recognition systems. Developers often scrape pictures posted on websites to train facial recognition systems without the knowledge of the people who posted them.
The system developed by Exposing.ai can determine whether your photos are among the pictures that have been scraped. The tool checks publicly available image datasets to see whether Flickr photos were used in surveillance research.
Flickr is an American image and video hosting platform. The service is a logical target for the tool's launch, as its photos are regularly used in AI research.
All you have to do is enter your Flickr username, photo URL, or hashtag in the website's search bar, and the tool will automatically scan over 3.5 million photos. The search engine checks whether your photos are present in the datasets by referring to their Flickr identifiers; if an exact match is found, the results are displayed on the screen. According to the tool's creators, revealing how yesterday's photographs became today's training data is part of the project's goal.
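For intuition, here is a minimal sketch in Python of the kind of exact-match lookup described above. The real service's implementation is not public; the dataset index, the `flickr_photo_id` helper, and the sample IDs below are hypothetical stand-ins, assuming only that Flickr photo URLs embed a numeric photo ID and that scraped datasets recorded those same IDs.

```python
import re

# Illustrative stand-in for the identifiers recorded in a scraped dataset
# (e.g., MegaFace); the real tool indexes millions of such IDs.
DATASET_PHOTO_IDS = {"8437965245", "14268295831", "5121139071"}

def flickr_photo_id(photo_url: str):
    """Extract the numeric photo ID from a Flickr photo URL.

    Flickr photo pages follow the pattern:
    https://www.flickr.com/photos/<user>/<photo_id>/
    """
    match = re.search(r"flickr\.com/photos/[^/]+/(\d+)", photo_url)
    return match.group(1) if match else None

def was_scraped(photo_url: str) -> bool:
    """Return True if the photo's Flickr ID appears in the dataset index."""
    photo_id = flickr_photo_id(photo_url)
    return photo_id is not None and photo_id in DATASET_PHOTO_IDS

if __name__ == "__main__":
    url = "https://www.flickr.com/photos/example_user/8437965245/"
    print(was_scraped(url))  # True: this illustrative ID is in the index
```

Note that the match is on metadata (the Flickr photo ID) rather than on pixel content, which is why the search is fast but can only surface photos that were scraped directly from Flickr.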
In 2019, IBM released a dataset of nearly a million images scraped from Flickr without the uploaders' knowledge. The company said at the time that the dataset would help researchers reduce the rampant biases in facial recognition. At the same time, the dataset could be used to develop powerful surveillance technology.
Exposing.ai found that nearly 3,600,000 Flickr images were used to build MegaFace, the largest publicly available dataset of photos used for facial recognition. This dataset has been used in some 78 projects across 14 countries. Unfortunately, not much can be done once your images have been incorporated into such datasets, although some dataset maintainers may allow you to remove your pictures to prevent future use.
Currently, Exposing.ai only works on Flickr and on a limited set of image training datasets. The creators may add more search options in the future. The tool's main objective is to reveal how photos are being used for development without the uploader's consent. A change in company policies and laws may help prevent this practice.
Consultant Intern: Kriti Maloo is currently pursuing her B.Tech from the Indian Institute of Technology (IIT) Bhubaneswar. She is interested in data analytics and its applications in various domains. She is a bibliophile and loves to explore new advancements in the field of technology.