Cabaero: Taylor Swift, child pornography, and AI

Beyond 30

A mother in Pinamungajan, Cebu, was arrested recently for selling photos and videos of her naked children to customers abroad. What she didn’t realize was that her business was about to collapse because of artificial intelligence (AI).

The mother ignored the advice against producing pornography and against posting photos or videos of children online, advice given because anyone could take such material, apply AI to it, make pornography, and earn from it without her knowing. She went into child pornography, using her own children, selling the material to clients who were usually abroad. That was her business model.

Makers of deepfakes, or altered videos, and of non-consensual AI pornography, such as the material that used the images of celebrity singer Taylor Swift, may take away the mother’s business model. Videos of children in provocative poses, which were that mother’s commerce, can be overshadowed by computer-generated pornographic images.

The deepfaked pornographic images of Swift were viewed by thousands, perhaps millions, before they were taken down by the platform X, formerly Twitter. Swift was not the first victim of deepfakes. Many other personalities, including Pope Francis, have been victimized. In some cases, even images of dead people were manipulated so they appeared to promote a product or say something they never said.

What the controversy over Swift’s AI pictures highlights is the threat posed by AI users who violate privacy, infringe on intellectual property rights, and go against common decency.

In the case of the mother arrested on Jan. 25, 2024, in Pinamungajan, Cebu, reports said she asked her client abroad for P2,000 to pay her electricity bill. In return, she showed a video of her minor daughter taking a bath and, on her instruction, posing provocatively.

One of the fears about AI is that the technology could be used to replace human labor, or perhaps to replace the business model of a mother out to sell images of her children. With AI around, anyone who viewed her images could copy them and use them for similar purposes, without the mother knowing or benefiting from it. It is the deepfake maker who could resell the manipulated material. New online child pornography images or videos can be created from existing and similar material.

A positive outcome would be if advocates against child pornography used deepfake or AI-generated images to catch sexual predators, in the same way the “Sweetie” avatar was created in 2013 to go after buyers of online pornography.

“Sweetie” was a computer-generated child used to lure online sexual predators into a sting operation. A non-government international effort tricked these predators into giving their details so authorities could pursue them. With AI, a similar sting operation is possible.

Then the mother’s business, and the industry she is part of, could cave in to the pressure to stop the online exploitation of children.
