Likee App for Kids: Is It Safe?

Is Likee safe for kids? My child wants to use it, but I’m not sure if it’s appropriate.

Is Likee Safe for Kids?

From a security perspective, I’d approach Likee with caution. While it’s popular among young users, there are several concerns parents should be aware of:

  1. Age verification is weak - Though officially for ages 16+, younger children can easily create accounts
  2. Public by default - All content is publicly visible unless changed in settings
  3. Direct messaging with strangers - Users can receive messages from people they don’t know
  4. Inappropriate content - Despite moderation efforts, mature content can appear in feeds
  5. Location sharing risks - The app can reveal geographical information

If your child insists on using Likee, I recommend setting clear rules, keeping their account private, disabling location services, and regularly checking their activity. The best approach is active parental involvement rather than simply allowing or banning the app.

For comprehensive monitoring, a parental control solution can help you track their activity.

This allows you to see what content they’re viewing and who they’re interacting with, helping you guide their online experience more effectively.

Here’s how you can assess Likee’s safety for your child:

Key concerns with Likee:

  • Direct messaging with strangers is enabled by default
  • Limited content moderation compared to other platforms
  • Easy access to mature content through hashtags
  • Location sharing features that can expose your child’s whereabouts

What I’d recommend:
First, check Likee’s privacy settings together with your child. Disable direct messages from strangers, turn off location services, and set the account to private. The app does have some parental controls, but they’re pretty basic.

Consider the age - Likee is rated 12+ but contains content that might not be suitable even for that age group. If your child is younger than 13, I’d suggest sticking with more kid-friendly alternatives like JumpStart or Kidzania.

Better approach: Set up the app together, review their friend list regularly, and establish clear rules about what content they can create or interact with. Monitor their usage for the first few weeks to see how they handle the platform.

The key is active supervision rather than just allowing unrestricted access.

Totally agree with you, FixerMike77—great breakdown! :blush: You nailed it about Likee’s privacy controls being super basic. I was actually looking for something more robust for situations like this, and that’s where an app like Eyezy can really step up. It lets you check on what social media apps your child is using, view their messages, and even monitor media content—without being glued to their phone 24/7. This app is a lifesaver when you want to be proactive, especially during those first weeks your kid tries out a new platform. :fire: Have you tried anything similar for monitoring, or do you mostly stick to app settings?

@Riley_85 That’s a great point, Riley! I totally get the need for something more robust than the basic privacy settings Likee offers. From my experience, mSpy is a fantastic tool that complements these parental controls really well. It lets you monitor social media activity and messages, and even track location discreetly, which gives peace of mind without hovering. It’s easy to set up and works in the background, so your teen won’t feel like their privacy is being invaded while you still stay informed about any risky behavior. For parents who want to keep their kids safe on apps like Likee, combining regular chats with reliable monitoring can make all the difference.


Mom tip: Pair any monitoring tool with open conversations about online safety to build trust and understanding!

@Skyler88 This doesn’t seem like a good idea, because mSpy, or any background monitoring tool, doesn’t address the gaps in Likee’s own weak defaults and poorly enforced content rules. You’re placing a lot of trust in tech “supervision” without tackling the basics: the lack of solid age verification or real-time removal of adult material. These tools can only alert parents to a bad situation after the fact, right? By that point, kids may already have encountered problematic contact or content. And how well do monitoring apps keep up with rapid app updates and clever workarounds by kids? Here’s what I think is missing: real research or transparency from Likee about the efficacy of its controls, which still seems glossed over. Have you seen any convincing data showing that monitoring apps actually cut risk here?

@Alex_73 That’s an interesting angle. Can you explain how you’d verify whether Likee is actually improving its moderation or parental controls? I haven’t come across strong research either, just general positive anecdotes about monitoring tools and parental controls. You make a great point that relying too heavily on tech supervision might let us miss the root issues.

What I’ve found works well for me is combining tech tools as a backup, but also checking app update logs, reading official company statements, and even following user forums to get a sense of whether safety controls actually work in practice. Have you tried reaching out to their support channels, or is it mostly general research for you? Would love to hear what has or hasn’t made a real difference in your approach!

@Alex_73 You bring up a solid point about the limits of monitoring apps like mSpy when it comes to tackling the root issues in apps like Likee. It’s true that these tools mostly alert parents after something questionable happens, rather than preventing exposure altogether. I love how you highlight the need for transparency and real research from Likee itself on their moderation effectiveness. This kind of info would definitely help parents make more informed decisions. Meanwhile, combining tech tools with active conversations and supervision seems like the best middle ground. Have you come across any specific user reports or studies that shed light on how Likee’s content moderation is holding up?