Why not effective Internet safety education plus a bunch of apps that block NSFW content, set time limits, etc.? (An AI bot that hovers in one corner and highlights and explains not-OK content, maybe?)
IMO a safe children’s social network is a bit of an oxymoron. You’re never going to make something safe, unless it’s a closed system!
I don't agree with your oxymoron statement. First, social networks aren't inherently unsafe. Second, I don't think they're "open" or "closed" in a binary sense. Every system with humans in it, real or online, is going to have bad actors. You just need varying levels of ID verification, authentication, filtering, etc.
I wasn't really involved with Club Penguin because I didn't have kids using it during its heyday, but my understanding is that they grew it to be pretty big without any major safety issues [0].
I think this whole notion of "completely closed and safe" is what gave rise to the helicopter parenting I mentioned above. That's not a good thing.
> I'm more of the mindset "this is inevitable so i hope someone takes the moral/responsible route with it"
I might have misunderstood, but you're suggesting an actually safe, well-designed social network for kids, right?
Agreed, systems aren't open or closed in a binary sense. But social networks only thrive when the channels for self-expression are flexible, widely available, and heavily used. With kids, however, safety is the No. 1 priority, and as long as the channels are open, it's impossible to safeguard against dodgy content and (worse) users. Really determined people will always find ways around a system, and the more "open" the system is (for example, private messages and chat functionality), the higher the risks. That's what I meant by "oxymoron."
Club Penguin is interesting, because IMO it's an example of a fairly closed social network. Kids can only interact via a fixed set of responses, like emoticons and coins. (At least that's what it was like more than five years ago when I watched my younger siblings play.) So there's no way to type in arbitrary text, all profiles are avatar-based, and they have pretty fierce moderators. It's very well designed, actually. But as a way to educate kids about social media in general, maybe not, just because the major social networks like Facebook are designed too differently (i.e. more open). That's why I'm an advocate of internet safety education: you're right, it's impossible to shield them!
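The closed-message model described above amounts to a whitelist: the client never submits free text, only an identifier for a server-approved phrase, so dodgy content can't enter the system in the first place. A minimal sketch (the phrase list and function name are made up for illustration, not Club Penguin's actual implementation):

```python
# Closed-vocabulary chat sketch: users pick from a fixed set of
# approved phrases, so arbitrary text never reaches other users.
# The phrase catalogue below is invented for illustration.
APPROVED_PHRASES = {
    1: "Hi there!",
    2: "Want to play a game?",
    3: "Good job!",
    4: "See you later!",
}

def send_message(phrase_id: int) -> str:
    """Return the approved phrase for an ID; reject anything else."""
    if phrase_id not in APPROVED_PHRASES:
        raise ValueError("unknown phrase id: free text is not allowed")
    return APPROVED_PHRASES[phrase_id]

print(send_message(2))  # -> Want to play a game?
```

The point is that moderation shifts from policing what users say to curating what they *can* say, which is why this design scales safely but feels too restrictive to teach kids about open networks.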
The only way I can think of to implement a safe social network for kids is within the school itself. Okay, you're with your classmates all the time, but the social network can be more of a democratic educational platform, where kids can access content and discuss topics anytime. In the U.K. there are a few like that already, though they're still a bit primitive (Frog, I hear, is quite buggy).