Our Commitment to Minor Safety
We are committed to making BandLab a safe and positive platform for everyone, particularly for younger users. As BandLab is not intended for children, you must be at least 13 years old to create an account*. If we learn that a user is under the minimum age, their account will be terminated. Minors (aged 13 years to the applicable age of majority) also need the consent of a parent/guardian to create an account.
[*Minimum age may differ depending on the user’s jurisdiction, e.g. users from Australia must be at least 16 years old.]
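For illustration only, this age-gating rule can be sketched as a simple signup check. The default minimum age of 13 and the Australian minimum of 16 come from this page; the jurisdiction codes, the age-of-majority default of 18, and the helper name `can_create_account` are hypothetical, not BandLab's actual implementation.

```python
# Minimal sketch of the age-gating rule described above.
# The values 13 (default) and 16 (Australia) come from this page;
# everything else here is an illustrative assumption.

DEFAULT_MINIMUM_AGE = 13
MINIMUM_AGE_OVERRIDES = {"AU": 16}  # e.g. Australia requires age 16+

def can_create_account(age: int, jurisdiction: str, has_guardian_consent: bool,
                       age_of_majority: int = 18) -> bool:
    """Return True if signup is allowed under the rules on this page."""
    minimum_age = MINIMUM_AGE_OVERRIDES.get(jurisdiction, DEFAULT_MINIMUM_AGE)
    if age < minimum_age:
        return False  # under-age accounts are terminated if discovered
    if age < age_of_majority:
        return has_guardian_consent  # minors need parent/guardian consent
    return True
```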
Minor Safety Protection
To protect minors, we combine safety policies and products to prevent exploitation, abuse, and endangerment. We take three key approaches to creating an age-appropriate platform for minors:
- Protection from harm — Removing or limiting content that violates our policies and puts minors at risk of sexual exploitation, abuse, psychological harm, or dangerous behavior.
- Privacy by design — Providing enhanced privacy settings to reduce exposure of minors’ content and personal information, and to prevent unwanted contact.
- Active prevention — Building new safety products, and partnering with specialist platforms and law enforcement to detect and eliminate child sexual abuse material.
Our goal is to create a safe and positive environment for minors to express their creativity and explore music, and to give their parents or guardians peace of mind.
Minor Safety Policies
We have strict policies to protect minors from abuse, exploitation, and self-endangerment. As a platform, we have zero tolerance for child sexual abuse material (CSAM) and child sexual exploitation (CSE), whether real, fictional, AI-generated or digitally altered. Confirmed cases will be reported to law enforcement.
BandLab also restricts any content that depicts or encourages minors to be in unsafe, exploitative, or harmful situations. This includes self-generated content that could expose them to harm. Under our Depiction of Minors policy, benign content (e.g. music performances, everyday activities) depicting minors may remain on the platform, but with limited visibility to reduce the risk of exploitation.
BandLab has zero tolerance for CSAM and CSE, including grooming, solicitation, sextortion, and any promotion or facilitation of pedophilia or CSAM trade. This covers all content that sexualizes or exploits minors, whether real, fictional, AI-generated, or self-produced. Examples include explicit sexual acts, suggestive imagery or videos, sexualized posing or dancing, fetish or cosplay content implying underage traits, as well as written or audio material describing or simulating child sexual abuse.
All confirmed violations are removed, associated accounts are terminated, and reports are filed with the National Center for Missing and Exploited Children (NCMEC). Survivors or advocates may share educational or awareness content, provided it avoids explicit visuals, detailed descriptions, or personal information to prevent revictimization.
Endangerment of minors extends beyond sexual exploitation and includes physical, emotional, and psychological abuse, as well as neglect and unsafe behavior. We do not allow content that portrays or encourages minors to engage in physically harmful or dangerous acts, including smoking, drug use, unsafe “challenges”, stunts, or roughhousing. To avoid retraumatizing victims, content showing or implying physical abuse (e.g. bruises, burns, or visible injuries), or depicting coercion, neglect, or exposure to unsafe environments, is not allowed.
We restrict content that can cause emotional or psychological harm to minors, including humiliation, bullying, intimidation, harassment, or unwarranted discussions about a minor’s sexual orientation or personal life. Such content will be removed, and repeated violations may result in account suspension. Professional stunts involving child performers (e.g. circus acts) and educational or news content addressing child safety or abuse are allowed when clearly contextualized, non-distressing, and free of identifying information.
BandLab supports young people’s creative expression and recognizes that parents or media may at times share images of minors without realizing the potential risks of exploitation. To minimize these risks, benign depictions of minors may remain on the platform but will have limited visibility.
Benign depictions include minors engaged in age-appropriate activities or music performances that do not involve indecent exposure, personal information, or suggestive framing. Content showing minors in full or partial nudity, in sexualized posing or dancing, or in clothing that emphasizes intimate body parts is not allowed.
In exceptional cases, involuntary depictions of nude or partially clothed minors in vulnerable situations, such as warzones, natural disasters, or fundraising campaigns, may remain on the platform if shared respectfully and without exposing minors' personal data.
Minor Safety Products
BandLab’s minor safety products are designed to protect younger users by reducing exposure to sensitive content, preventing exploitation, and ensuring age-appropriate interactions. Through sensitive-content filtering, stricter privacy settings, and proactive detection features, we help minors explore and create music safely while meeting global child-safety and privacy standards.
Building an Age-Appropriate Environment for Everyone
We allow a wide range of non-violative content on BandLab, but not all of it is suitable for minors or for users who prefer to avoid sensitive material. To address this, we’ve built a multi-tier content classification system that filters age-inappropriate content for minors and blurs sensitive content for everyone else.
Violative content refers to material that violates our Community Guidelines — including Minor Safety policies or applicable laws. Such content, whether posted publicly or privately, is removed for all users. Common examples include mature material, bullying or harassment, illegal activities, and spam.
Sensitive content is allowed on the platform but its visibility is limited based on the user’s age or viewing preferences. To illustrate, we provide the rationale and a list of common examples for some of our sensitive content policies:
- Regulated Goods & Services — may increase substance abuse risks or normalize gun violence for minors:
  - Depictions of alcohol, tobacco, or legally prescribed drugs.
  - Descriptive stories or discussions of substance abuse recovery.
  - Depictions of real firearms or explosives.
  - Firearms or explosives in video games, especially when hyper-realistic or showing excessive gore.
- Mental Health Expressions — depictions that may unintentionally increase the risk of self-harm contagion, especially among younger viewers:
  - Mild expressions of mental health struggles (e.g., sadness, stress, anxiety).
  - Potentially triggering personal accounts of mental health challenges or eating disorder recovery.
- Artistic Expressions of Mature Themes — mature content that may require parental guidance for younger audiences:
  - Suggestive imagery in album art, music videos, or performances.
  - Violent, dark, or controversial imagery, including fictional depictions of mental health, social, or identity issues.
  - Suggestive lyrics or graphic language.
Please note that the above list is not exhaustive, as we regularly review emerging content that could cause psychological, physical, or developmental harm to minors. Where needed, we will introduce new sensitive content policies or additional age tiers, ensuring our platform evolves in step with online risks and safety regulations. For the latest updates, check back on this page.
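For illustration, the tiering described above can be sketched as a simple visibility decision. The tier names, the `resolve_visibility` helper, and the exact hide/blur behavior in each case are assumptions based on this description, not BandLab's actual implementation.

```python
from enum import Enum

class Tier(Enum):
    VIOLATIVE = "violative"   # removed for all users
    SENSITIVE = "sensitive"   # age- and preference-gated
    GENERAL = "general"       # shown to everyone

class Visibility(Enum):
    REMOVED = "removed"
    HIDDEN = "hidden"    # filtered out for minors
    BLURRED = "blurred"  # blurred until the viewer opts in
    SHOWN = "shown"

def resolve_visibility(tier: Tier, viewer_is_minor: bool,
                       viewer_allows_sensitive: bool) -> Visibility:
    """Decide how a piece of content is presented to a given viewer."""
    if tier is Tier.VIOLATIVE:
        return Visibility.REMOVED      # removed whether posted publicly or privately
    if tier is Tier.SENSITIVE:
        if viewer_is_minor:
            return Visibility.HIDDEN   # age-inappropriate content is filtered
        if not viewer_allows_sensitive:
            return Visibility.BLURRED  # blurred behind a sensitivity screen
    return Visibility.SHOWN
```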
Partnering with Industry Experts to Protect Minors
At BandLab, we have zero tolerance for CSAM and CSE. To strengthen our safeguards, we have integrated Thorn’s specialized detection models into our moderation systems.
Thorn’s CSAM technologies combine hash-matching of known abusive material with AI models trained to spot new and previously unseen content in both images and videos. They also include text-based detection that can help identify grooming, coercion, or other exploitative behavior before it escalates. These models are purpose-built by child safety experts and are continuously updated with trusted data sources to stay effective at scale.
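As a rough illustration of this two-stage approach (and emphatically not Thorn’s actual API), a pipeline of this shape might first check an upload against hashes of known material and then consult a classifier score for new, previously unseen content. Every name in the sketch below is hypothetical, and a real system would use perceptual rather than cryptographic hashing.

```python
import hashlib

# Illustrative sketch only; NOT Thorn's API. It shows the general shape of
# a two-stage pipeline: hash-matching of known material first, then a
# trained classifier's score for new content.

KNOWN_HASHES: set[str] = set()  # hashes of known abusive material (placeholder)

def media_hash(data: bytes) -> str:
    # Real systems use perceptual hashes (robust to re-encoding); a
    # cryptographic hash stands in here to keep the sketch self-contained.
    return hashlib.sha256(data).hexdigest()

def review_upload(data: bytes, classifier_score: float,
                  threshold: float = 0.9) -> str:
    """Return an action for an upload: 'block_and_report' or 'allow'."""
    if media_hash(data) in KNOWN_HASHES:
        return "block_and_report"      # known material: immediate action
    if classifier_score >= threshold:  # score from an ML model (assumed)
        return "block_and_report"      # suspected new material: escalate
    return "allow"
```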
If potential CSAM or CSE is flagged to us, we act immediately by removing the content and terminating the account, including any linked accounts, to reduce further risk. All suspected violations are reported to the National Center for Missing and Exploited Children (NCMEC), which works closely with law enforcement agencies globally to review cases and protect victims.
By combining advanced detection capabilities with NCMEC’s global enforcement network, we ensure our platform stays proactive in identifying and preventing potential exploitation and abuse of minors. We also encourage our community to help keep BandLab safer by reporting any content or behavior that may raise concerns about minors’ safety.
Default Privacy Settings for Minor Accounts
At BandLab, we are committed to creating a safe and welcoming community for everyone, and that includes putting additional measures in place to protect minors. To safeguard their experience, we apply age-based protections in line with global child safety and privacy regulations.
- Minimum age — You must be at least 13 years old (or the applicable minimum age in your jurisdiction) to use BandLab.
- Default privacy settings — For minors, certain features are restricted to reduce unwanted interactions and promote a safer environment. These include limits on direct messaging and public comments, applied automatically until the user reaches the applicable age of majority in their jurisdiction.
These safeguards are designed to:
- Protect privacy by limiting exposure to unwanted or inappropriate interactions.
- Promote safety by encouraging respectful and positive engagement on the platform.
- Ensure compliance by meeting global child online safety standards.
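For illustration, the age-based defaults described above can be sketched as follows. The field names, the `default_settings` helper, and the age-of-majority value of 18 are assumptions made for the sketch, not BandLab's actual account schema.

```python
from dataclasses import dataclass

# Minimal sketch of age-based default privacy settings as described above.
# Field names and the age-of-majority default are illustrative assumptions.

@dataclass
class PrivacySettings:
    direct_messages_enabled: bool
    public_comments_enabled: bool

def default_settings(age: int, age_of_majority: int = 18) -> PrivacySettings:
    """Apply restrictive defaults until the user reaches the age of majority."""
    if age < age_of_majority:
        # Limits on direct messaging and public comments for minors,
        # applied automatically at account creation.
        return PrivacySettings(direct_messages_enabled=False,
                               public_comments_enabled=False)
    return PrivacySettings(direct_messages_enabled=True,
                           public_comments_enabled=True)
```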