A new app called 'Gallery Guardian' has been launched in response to the risks of young people sending or receiving explicit photos. The app scans every image a child takes or receives on their device and, using an image-recognition algorithm, automatically detects any images that include nudity. If nudity is detected, the app sends an alert to the phone of the young person's parent or carer, without saving or displaying the image itself.
Like most filtering programmes, this approach can be a great help with younger age groups, as children first gain access to the internet and mobile devices. It could also be a useful tool for preventing the grooming of younger children, and could help start early conversations about why some photos are not OK.
But for older teens, asking (or demanding) that the app be installed on their phone could undermine the trust in the parent/child relationship. It also wouldn't stop a determined young person from finding ways around the programme: using an alternative user login, uninstalling the app, or getting hold of a second, secret phone.
This app is an interesting example of a technological solution to a real-life problem. Technology does have a role to play, but educating young people about the consequences of sending or receiving explicit images also needs to happen, and will have a greater impact on older age groups. See our free resource, which will help you talk with young people about the consequences of sexting.