When Facebook purchased Instagram in 2012, it could already see the writing on the wall. The billion-dollar start-up was a hit with teen users, while Facebook's own userbase in the same age group was declining. Instagram's buzzy photo- and video-sharing features were quickly becoming the equivalent of the high school quad: a place for teens to post their best selfies and connect with friends online.
Today, Instagram has over 900 million users worldwide, about 25% of whom are 18 or younger. In the U.S. alone, 22 million teens log into Instagram each day, and they spend 50% more time on Instagram than they do on Facebook.
Unfortunately, those who seek to do harm to children and teens have taken notice. According to the National Center for Missing and Exploited Children, 2020 was a record-breaking year for online child exploitation, with more than 21.7 million instances reported. Cases of online child grooming are up 30% in the last year, and Facebook-owned apps (Facebook, Messenger, Instagram, WhatsApp) plus Snapchat account for more than 70% of these cases. Instagram was responsible for more than a quarter of them, making it the leading social media platform for child grooming.
Predators use several avenues on Instagram to groom children and teens. They exploit the comment system to network with each other and fish for victims, hijack the hashtag system to drop explicit content into otherwise innocuous feeds, and use the DM system to message kids directly.
According to a recent investigative report by The Atlantic, a network of predators on Instagram used the hashtag #dropboxlinks to find and share explicit photos of underage children. The report noted that the alleged child-porn-trading users set up anonymous accounts with throwaway usernames or handles.
“I’ll trade, have all nude girl videos,” one user commented in September 2018. “DM young girl Dropbox,” said another. “DM me slaves,” said someone else. Many others commented “HMU to trade Dropbox links” on various throwaway accounts. “Young boys only,” another user posted several times, and…an account with “Dropbox” in their username posted “DM if u want young girl links.”
Playing Whac-A-Mole with Child Predators
After The Atlantic's report, Instagram restricted this hashtag, but there's little doubt the abuse has taken some other form and is propagating elsewhere on the platform. Instagram is taking defensive action against these predators, but given the incentive structures in place, this will remain a game of Whac-A-Mole. It's unlikely that Instagram will institute a blanket policy of identity-proofing member accounts to protect children; that would shrink its userbase, which is critical to its advertising business model. Predators will continue to exploit the platform's features, and its ability to create accounts anonymously, to groom child and teen users.
Instituting an identity system on Instagram isn't universally popular. One need only look at a recent post by Idris Elba, the talented British actor, writer, producer, and rapper. He received thousands of negative comments on a post suggesting that social media companies should require verified identity to open an account. He noted:
“If cowards are being supported by a veil of privacy and secrecy, then social media is not a safe place. If cowards want to spout racial rhetoric, then say it with your name, not your username.”
Unfortunately, this fell on deaf ears. For many respondents, protecting their anonymity and withholding their personal information from Instagram was more important than guarding against hate speech and reducing the dangers posed to children and teens on the platform.
A User Driven Solution to Child Grooming on Instagram
But what’s most popular isn’t always what’s best.
There needs to be a solution that protects children on Instagram, and it's clear to me that it has to be user-driven: we could spend a lifetime waiting for Instagram to adopt anything that might cut into its advertising revenue. Let's empower users instead of waiting for Instagram.
Predators typically groom children online through a series of initial conversations that can appear innocent. The predators, typically adult men, attempt to establish a relationship to gain trust. They use fake usernames and photos, lie about their age, and often use a burner phone to communicate with the child outside the platform once the relationship advances. The groomer will know popular music artists, clothing trends, or whatever activity or hobby the child may be interested in. They try to relate and build trust…then they prey on the child's desire for romance, adventure, and sexual information.
A sure way to prevent this is to break the cycle at the outset. If Instagram users had an easy-to-use, free identity-proofing app readily available, it could go a long way toward preventing online child grooming.
Dentity has developed this solution. It is entirely user-driven and doesn't require the social media platform to do anything. We've built an enterprise-class, bank-grade identity-verification app that allows users to verify their identity and request that others do the same. The only way to pass our advanced AI technology is with a valid, government-issued identity document tied to a billable mobile phone number. We don't accept texting apps, burner phones, or VoIP lines, because they can't be reliably correlated with a verified identity. Our technology sniffs those out; they're often used by people specifically trying to hide who they are.
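To make the gatekeeping concrete, here is a minimal illustrative sketch (not Dentity's actual implementation, and the labels are hypothetical) of the two checks described above: a valid government-issued ID, plus a billable mobile line with VoIP, burner, and texting-app numbers rejected.

```python
# Hypothetical line-type labels a phone-intelligence lookup might return.
VOIP_OR_BURNER_TYPES = {"voip", "prepaid_burner", "texting_app"}

def passes_identity_proofing(document_valid: bool, line_type: str) -> bool:
    """Pass only with a valid government-issued ID *and* a billable
    mobile line; VoIP, burner, and texting-app numbers are rejected."""
    if not document_valid:
        return False
    return line_type.lower() not in VOIP_OR_BURNER_TYPES

# A valid ID on a standard mobile line passes; the same ID on a
# VoIP number, or an invalid ID on any line, does not.
print(passes_identity_proofing(True, "mobile"))   # True
print(passes_identity_proofing(True, "voip"))     # False
print(passes_identity_proofing(False, "mobile"))  # False
```

The point of requiring both checks together is that neither alone stops a determined predator: a stolen ID without a traceable line, or a real phone behind a fake persona, would each pass a single-factor check.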
Anonymity is the greatest threat to children online. Predators hide behind fake accounts untethered from any verified identity. If users had to "say it with their name and not their username," Instagram would be a safer place. Please vote for identity and create an account with Dentity.