Dabbling in the world of user-generated content is a risky business for startups that don’t have the appropriate safeguards in place to protect user data – especially when that data belongs to children.
Knowing the risks
We were horrified to discover this month that 4.6 million Snapchat users – the majority of them children – were hacked and had their usernames and phone numbers published online.
Despite the fact that millions of social media users are victims of cyberbullying, sexting, trolling and privacy violations, startups are given little direction on how to develop their businesses and products responsibly.
You may be surprised to hear that Snapchat never intended its app to be used by young children, yet children now make up the majority of its user base.
Had the company kept 10-year-old users ‘front of mind’ during the development phase, it might have been more attuned to the impact that inappropriate use of the app could have on them.
Inappropriate use
Even without illegal hacking, there are always risks associated with use of the platform itself.
Ask.fm admitted earlier this year that its platform has suffered from the bad PR it received following issues such as the widely reported suicide of Hannah Smith, who had reportedly been targeted by bullies on the network.
Ultimately, users are accountable for the appropriate use of social media sites, and last week the Government underlined this by announcing that internet trolls could face up to two years in prison for online abuse.
But just as developers shoulder a huge share of the accountability for data protection, they should also encourage responsible use of their products.
Protect yourself
So what would you do if the product your company developed ended up in the hands of a 10-year-old child? We’ve outlined a few things to consider that will help you mitigate the risk should this happen:
- Can a child capture and share inappropriate content using your product? If so, to what extent is the child protected from a security violation?
- Do you have protective features such as ‘reporting’ or ‘blocking’ in place to prevent trolling or cyberbullying on your platform? And what systems do you have in place to manage a high volume of complaints through these features? (A minimal sketch of one possible design follows this list.)
- How would you respond to the media or the police if questioned about inappropriate use of your platform by a child?
- Do you have the appropriate terms and conditions in place to protect yourselves against any liability you may face, should a child use your product irresponsibly?
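To make the second point on that list concrete, here is a minimal sketch of how a report-and-block system might be structured. Everything in it (the `ModerationService` name, the escalation threshold, the in-memory queue) is a hypothetical illustration rather than any real platform’s API; a production system would add persistence, authentication and a human moderation team behind the escalation step.

```python
from collections import deque
from dataclasses import dataclass, field

# Hypothetical sketch: all names and thresholds here are assumptions
# for illustration, not any real platform's API.

@dataclass
class Report:
    reporter_id: str
    target_id: str
    reason: str  # e.g. "bullying", "inappropriate content"

@dataclass
class ModerationService:
    # Reports wait in a queue so a high volume can be triaged in order.
    queue: deque = field(default_factory=deque)
    # Per-user block lists: blocker_id -> set of blocked user ids.
    blocks: dict = field(default_factory=dict)
    # Once a user is reported this many times, they are flagged for
    # human review (the threshold of 3 is an arbitrary assumption).
    escalation_threshold: int = 3
    _report_counts: dict = field(default_factory=dict)

    def block(self, blocker_id: str, blocked_id: str) -> None:
        """Blocking takes effect immediately, with no moderator involved."""
        self.blocks.setdefault(blocker_id, set()).add(blocked_id)

    def is_blocked(self, viewer_id: str, author_id: str) -> bool:
        """Check whether content from author_id should be hidden from viewer_id."""
        return author_id in self.blocks.get(viewer_id, set())

    def report(self, report: Report) -> None:
        """Queue a report and escalate repeat offenders for human review."""
        self.queue.append(report)
        count = self._report_counts.get(report.target_id, 0) + 1
        self._report_counts[report.target_id] = count
        if count >= self.escalation_threshold:
            self.escalate(report.target_id)

    def escalate(self, target_id: str) -> None:
        # Placeholder: in practice this would notify a human moderation team.
        print(f"User {target_id} escalated for human review")


if __name__ == "__main__":
    service = ModerationService()
    service.block("alice", "troll42")
    assert service.is_blocked("alice", "troll42")
    for reporter in ("alice", "bob", "carol"):
        service.report(Report(reporter, "troll42", "bullying"))
```

The design choice worth noting: blocking acts instantly without moderator involvement, while reports are queued and only escalate to human review past a threshold. That separation is what lets a small team keep up with a high volume of complaints.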
You’d be surprised how quickly parents, investors, the media and even the police take an interest in your brand if it ends up under the spotlight for the wrong reasons.
To protect your business, we strongly advise keeping social responsibility at the forefront of your mind through every phase of the development and marketing process.
Emma Robertson is a Digital Creative Strategist and Co-Founder of Digital Awareness UK, the online safety organisation set up to directly address the need for greater awareness of social media-related problems in schools. Emma tweets at @DigitalSisters