Virtual child sexual abuse material depicts fictitious children, but it can be used to disguise real abuse


Child sexual abuse material (previously known as child pornography) can be a confronting and uncomfortable topic.

Child sexual abuse material specifically refers to the possession, viewing, sharing, and creation of images or videos containing sexual or offensive material involving children.

But less publicized is another form of this material: virtual child sexual abuse material (VCSAM).

What's virtual child sexual abuse material (VCSAM)?

VCSAM depicts fictitious children in formats such as text, drawings, deepfakes, or computer-generated graphics. It's also known as fictional or pseudo pornography, or fantasy images.

Recent technological advancements mean fictitious children can now be virtually indistinguishable from real children in child sexual abuse material.

Some offenders create VCSAM through a morphing technique which uses technology to transform real images into exploitative ones.

A non-sexual image of a real child could be visually altered to include sexual content. For example, an image of a child holding a toy could be altered to depict the child holding adult genitals.

Morphing can also happen in the reverse, where an image of an adult is morphed to look like a child—for example adult breasts are altered to look prepubescent.

Another type of VCSAM includes photo-editing multiple images to create a final, more realistic airbrushed image.

But what might be most troubling about VCSAM is it may still feature images and videos of real children being sexually abused.

In fact, certain software can be used to make images and videos of real victims look like "fictional" drawings or cartoons.

In this way, offenders can effectively disguise a real act of child sexual abuse, potentially preventing law enforcement from bringing victims to safety.

It may also enable offenders to avoid detection.

Why do some people engage with VCSAM?

There's limited evidence revealing why some people might engage with VCSAM.

To learn more about this offending group, we recently investigated the possible psychological basis for people who engage with such material.

We discovered several potential reasons why offenders might use VCSAM.

Some use it for relationship-building.

Despite the diverse offending group, some offenders who use child sexual abuse material have been found to have limited intimate relationships and heightened loneliness.

Online communities of other deviant but like-minded people may therefore provide offenders with a greater sense of belonging, social validation, and support. Such interactions may also, in turn, serve as positive reinforcement for their criminal behavior.

Others may use this material to achieve sexual arousal.

It could be argued the material may also normalize the sexualization of children.

In fact, professionals and law enforcement seem to share the concern that VCSAM may "fuel the abuse" of children by framing the offenders' criminal behavior as acceptable.

Sometimes the material is used for "grooming."

Adult offenders may show child sexual abuse material to children, breaking down the child's inhibitions to falsely normalize the abusive act being depicted.

This is one form of grooming—that is, predatory conduct aimed to facilitate later sexual activity with a child.

Such material can also be used to teach children how to engage in sexual activities.

For example, offenders may use VCSAM to show children material depicting young—and, most alarmingly, happy—cartoon characters engaging in sexual activities.

An urgent cause for concern

Clearly, VCSAM is incredibly harmful.

It can be used to disguise the abuse of real children, as a gateway to "contact offending" against children (meaning abusing them in real life), and as a grooming technique.

Child welfare officials have sounded the alarm about the increasing creation and distribution of VCSAM for over a decade.

And it seems this problem will only escalate with the development of increasingly sophisticated software and digital technologies.

So while VCSAM remains illegal and is frequently prosecuted, detecting—and ultimately preventing—these often obscure acts of abuse remains a challenge.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: Virtual child sexual abuse material depicts fictitious children, but it can be used to disguise real abuse (2022, June 10), retrieved 15 June 2024.
