Meta is now opening up its online virtual reality platform, Horizon Worlds, to preteens, but with some ground rules. Kids aged 10 to 12, with accounts managed by their parents, can now dip their toes into this exciting VR universe. Of course, there are some controls in place to ensure it’s all age-appropriate fun.
The tech giant has announced that parents will get the final say on which adventures their young ones can embark on. Whether it’s hanging out at The Space Station, making a splash at The Aquarium, or racing through Spy School, parents can pick and choose which worlds are suitable. Kids can request the worlds they want to explore, or parents can browse the catalog and approve options themselves.
Safety is a big focus for Meta, which is why it has rolled out some extra measures to keep kids secure. A new rating system — marked 10+, 13+, or 18+ — guides parents on the suitability of each VR world, allowing them to blanket-approve all environments rated for ages 10 and up. Worlds meant for those 18 and older vanish from preteens’ view entirely. Plus, follower suggestions are turned off, and preteens appear “offline” to others by default, unless parents manually change the visibility settings.
One of the neat features Meta has added is the “Personal Boundary” capability. Avatars are automatically enveloped in a protective bubble that measures two virtual feet in radius, preventing anyone from encroaching on their space.
Recently, Meta also introduced the option for parents to approve individual contacts for their children, deciding who they can chat with and invite into their VR experiences. Moreover, a prompt now nudges Meta Quest 2 or 3 headset users to re-enter their birthday, adding an extra layer of verification.
Parent-managed accounts for preteens have been available since June 2023, but despite the new layers of safety, some parents remain skeptical. Concerns linger about whether Meta can genuinely ensure their children’s safety on its platforms, especially given its past controversies.
Earlier this year, Meta found itself in hot water for allegedly promoting its messaging platforms to underage users. Internal documents surfaced in a New Mexico lawsuit revealed that the company was aware of inappropriate exchanges between adults and minors. Separately, a larger lawsuit brought by attorneys general from 42 U.S. states claims Meta designs its products to captivate children, potentially harming their mental health.
The conversation about online safety for kids is ongoing, with Meta taking steps that may reassure some, but for others, the jury’s still out.