This website is a safe space for those affected.
No personal data will be stored or passed on to a third party.
We would like to offer support to people affected by this disposition, and to provide them with ways that can help them live with it.
Sexual violence against children does not happen very often.
MYTH
Sexual violence against children happens far more often than most people assume; many cases are never reported or detected.
If children say NO, they actually mean YES.
MYTH
No means NO! This is true regardless of whether a child or an adult is speaking. All children have the right to be taken seriously when they say no. Incidentally, the same applies to saying “yes”.
No child wants or enjoys sexual acts.
FACT
Children who become victims of sexual assault are never “asking for it”, and they neither want nor enjoy sexual interactions with adults. Regardless of a child’s clothing, posing or behaviour, nothing entitles an adult to engage in sexual activities with that child.
Studies with convicted offenders show that most sexual offences are well planned.
FACT
Adults do not commit sexual assaults because they have lost control. They can restrain their sexual impulses, and it is their responsibility to do so.
Studies show that most sexual offenders appear “normal” to friends and colleagues and on personality tests.
FACT
Those who commit sexual assaults are not, as is often suggested or claimed, emotionally or mentally damaged or unstable.
The victim often does not know the offender.
MYTH
Most victims know their offender, who can be a friend, neighbour, acquaintance or even a member of their own family.
A child who has never been beaten or injured cannot be a victim of sexual violence.
MYTH
The absence of injuries does not mean that a child has not been the victim of a sexual assault. Oftentimes an offender will threaten a child with violence or weapons, causing the child to freeze and become unable to resist. That in no way means that the child consents to the sexual interaction; if anything, it means that the child is trying to protect himself or herself. In critical situations, including sexual assault, children, just like adults, react differently than they normally would. Some are visibly shocked and upset, whereas others may appear calm and controlled. A calm and controlled victim can be just as traumatized as any other but may simply be unable to express it.
Child sexual abuse material (CSAM) is a more accurate name than child pornography.
FACT
The term "child sexual abuse material" is more accurate than "child pornography" because it emphasizes the abuse and exploitation involved, rather than suggesting consent. Unlike adult pornography, which involves consensual acts, material with children is inherently abusive. Using the term "pornography" is minimizing the harm and normalizing the abuse. This terminology shift highlights the severity and rejects any downplaying of the issue.
Child sexual abuse material (CSAM) consists only of explicitly sexualized scenes.
MYTH
Child abuse material is not limited to explicit sexual content. It can range from seemingly ordinary photos with no overtly erotic content to explicit depictions of sexual acts. The COPINE project outlines a ten-level severity scale that includes non-explicit images which may still be sexualized by offenders. Additionally, child abuse material can involve written stories, audio files, and digitally manipulated content, all of which can be harmful regardless of explicitness.
Watching child sexual abuse material (CSAM) does not harm anyone, because it has already been produced and has no direct victims.
MYTH
Every view or download of child sexual abuse material perpetuates harm, fueling further exploitation. Survivors live in fear of their images resurfacing and being recognized, causing lasting anxiety, vulnerability, and powerlessness. This ongoing trauma leads to severe emotional and mental struggles, including anxiety, depression, and difficulty in relationships, work, and education.
Even if I use only freely available images and videos and do not pay for abuse material, I contribute to the production of child sexual abuse material and cause further harm.
FACT
Using freely available images and videos, even without paying for abusive material, can still contribute to the production and spread of child sexual abuse material (CSAM). By viewing, sharing, or circulating such content, you increase demand and perpetuate harm, indirectly supporting the exploitation and abuse of children.
AI-generated materials can cause harm.
FACT
AI-generated and manipulated content can cause real harm, even if it does not depict real people. For instance, AI-created material that sexualizes minors, though fictional, normalizes harmful attitudes and makes it harder for authorities to distinguish real abuse from computer-generated abuse, hindering the protection of real victims. Similarly, altering real images of children to create explicit content can lead to harassment, sextortion, and trauma. In short, such digital material poses serious risks to mental health, privacy, and safety, especially when it exploits or sexualizes minors.
When I view images of children on major platforms like Google, Instagram, TikTok or YouTube, I can be sure that they contain no CSAM, because these platforms apply strict community rules and do not allow such material to be posted.
MYTH
While platforms like Google, Instagram, TikTok, and YouTube have strict guidelines and use detection tools to remove CSAM, they cannot guarantee that all illegal content is kept out. Detection systems can be bypassed, and the sheer volume of content makes it challenging to catch everything. There are also delays in identifying and removing harmful material, and some content may not be flagged accurately by automated systems or users. As a result, these platforms cannot fully ensure protection from CSAM.