This piece is part of an ongoing series exploring what it means to be a woman on the internet.
When the world realized late last year that you could convincingly superimpose one person's face onto another person's face in a video, it was because men used the "deepfake" technology to force their favorite actresses to appear in their pornography of choice. Of course, they boasted about it on Reddit and 4chan, which prompted a frantic debate about the ethics of using artificial intelligence to swap people's faces -- and identities.
In the midst of that controversy, two California lawyers with expertise in digital privacy and domestic violence advocacy found they were equally alarmed by how the technology was poised to destroy the lives of unwitting victims, some of whom they might one day aid or represent in court.
Imagine, for example, a survivor of domestic abuse discovering that her partner used deepfake technology to overlay her likeness onto a porn actress's face, and then deployed that counterfeit image or video as a means to control, threaten, and abuse her.
Adam Dodge, legal director of the domestic violence agency Laura's House in Orange County, California, and Erica Johnstone, a partner at a San Francisco law firm and co-founder of the nonprofit Without My Consent, were horrified by the possibility. Then they decided to do something about their fear.
"A lot of people didn’t even realize this technology existed, much less that it could be misused or weaponized"
In April, they published an advisory for domestic violence advocates, detailing how fake video technology could add another brutal dimension of trauma to emotionally and physically violent relationships.
"A lot of people didn’t even realize this technology existed, much less that it could be misused or weaponized against the population we serve every day," says Dodge.
The reality of deepfake technology will unnerve women who specifically avoided creating intimate photos or videos so they'd never have to worry about seeing themselves in nonconsensual porn, or revenge porn, wherein a victim's intimate photo or video is posted online without their permission.
Photos and videos pulled from publicly available social media accounts and sites by open-source scraping tools can be fed into software capable of churning out pornographic deepfakes in a matter of hours. The perpetrator can effectively hijack someone else's identity, make it look like she appeared in pornography, and leverage search engine optimization and cybermobs to target her.
"This is nonconsensual porn on steroids," says Dodge.
In May, Rana Ayyub, an investigative journalist in India, wrote about being digitally attacked on social media by users who spread a pornographic deepfake video of her.
"The slut-shaming and hatred felt like being punished by a mob for my work as a journalist, an attempt to silence me," Ayyub wrote. "It was aimed at humiliating me, breaking me by trying to define me as a 'promiscuous,' 'immoral' woman."
Neither Dodge nor Johnstone knows of a case where a domestic violence victim's abuser created a pornographic deepfake as revenge or leverage, but both believe that scenario is imminent. They're choosing to publicize the possibility now because they both watched in the past as law enforcement, lawyers, judges, and advocates scrambled to respond to the rise of nonconsensual porn.
The problem, as Dodge and Johnstone describe it, is that preparedness is uneven: some states learned from this experience and should be able to offer victims of fake video technology protection and recourse through the legal system, while other states remain woefully unprepared.
In California, for example, domestic abuse survivors whose former or current partners have posted nonconsensual porn of them can file a restraining order through family court. The same should be true for deepfake victims, says Johnstone, since publishing doctored images or video could count as false impersonation, stalking, harassment, or other forms of intimate partner abuse defined by state law. The perpetrator might also violate the law by stalking or engaging in harassment and intimidation to obtain the hundreds of photos needed to use a face-swapping AI program or app.
Additionally, the state of California, under the leadership of then-Attorney General Kamala Harris, launched an eCrime Unit in 2011, and eventually provided training for investigators and prosecutors with specific emphasis on "cyber exploitation" and nonconsensual porn.
Johnstone imagines that if a victim who is well-organized, persistent, and has a compelling narrative tries to file a police report against her perpetrator in California, she'll have a good shot at encountering an investigator with experience or training. She also shouldn't be funneled into a legal system that's ambivalent or even hostile toward her cause. (Johnstone created a checklist so that people in other states can advocate for similar protections.)
Yet nonconsensual porn laws vary by state, and training can only do so much. It's impossible for law enforcement to investigate every case, and an investigation may not result in a criminal sentence even when they do. Victims may need to hire an expensive private attorney, and even then may not win financial restitution in civil court.
Carrie Goldberg, a prominent New York lawyer who's taken on numerous nonconsensual porn cases, says she's worried about how deepfake victims will be treated.
"Even if there is [a nonconsensual porn] law in their state, cops can be disbelieving or make my clients feel like they're getting upset over something trivial," Goldberg wrote in an email. "So, imagine if they walked in and said, 'Hey, a doctored image of me participating in a gangbang is ruining my life.' They’d be dismissed at a greater rate."
Since there is no federal law that protects victims of nonconsensual porn, and state laws don't include commercial pornography in their policies against revenge porn, Goldberg says civil lawyers may need to use "creative tools" like copyright infringement and defamation suits to seek justice for their clients.
Johnstone sees a proactive role for the clients themselves. While she's wary of issuing blanket statements about restricting access to one's personal videos and photos -- "a certain amount of trust is necessary for relationships" -- the advisory she wrote with Dodge recommends that victims make social media accounts private, ask family and friends to remove or limit access to photos that include the victim, and use Google search to identify public photos and videos for removal.
Women who may not suspect their partners of using fake video technology should still know the warning signs, which include a partner asking for access to and downloading a cache of personal photos, as well as making frequent requests to pose for images or videos. Johnstone recommends setting "house rules" on a case-by-case basis about when photos are taken and in what circumstances.
"When someone flees an abusive relationship, [the abuser] looks for ways to recapture that level of power and control"
"If you want to be really cynical, assume this person would use whatever content you give them access to [in order] to shame you and humiliate you online," she says.
If that sounds like a far-fetched dystopia, know that Johnstone has represented clients whose profile images, consensual yet private intimate photos, and pictures from average photo shoots were used to embarrass them digitally, in perpetuity.
For victims of domestic violence, Dodge says deepfake technology poses a particularly malicious threat: "When someone flees an abusive relationship, [the abuser] looks for ways to recapture that level of power and control, and threatening to release a video or photo is a very powerful way to do that."
Even if the victim knows that photo or video is fake, she'll endure the painful task of trying to convince others that it's false -- or she may even decide to stay with or return to an abuser, believing nothing she can do will stop his behavior.
The debut of fake video technology, says Johnstone, marks a new phase in our tech-obsessed society: one poised to harm the most vulnerable among us, like domestic violence victims, and to fundamentally threaten our understanding of what's real in the world.
"The next generation of identity theft is not that you're reading fake things about a person but you’re also seeing them playing out," she says. "You used to say, 'You can’t believe everything you read.' Now it's that you can't believe everything you see."
Topics: Artificial Intelligence, Social Good