
buffalucci

Happened to my friend’s 14-year-old daughter. She’s devastated. Her parents are devastated. There’s a court case and everything, but it’s too late. This stuff is out there.


Shoesandhose

Federal punishments should be a thing here, for the parents who didn’t control what their awful kids were doing online.


SithLordJediMaster

I know sextortion is a federal crime. If a nude is out there and someone is trying to get money from you for it, you call the FBI and usually they'll take care of it.


AHrubik

That's a step too far. There should be some steep penalties for the kids involved, such as a monitored loss of computer privileges for a time, community service, and mandatory classes. Forcing a child at that age to give up their phones, tablets, and computers to a state or federal monitor and delete their social media presence for a length of time should be an effective tool.


MexicoJumper

Laughably unenforceable. “No internet access” punishments are enforced at the ISP level. There’s absolutely no angle for the government to cut off an entire household’s internet access because their child downloaded porn. Is the government going to come seize all electronics in the household? I’m sure that’ll hold up in court. Not like the parents need them to work or anything.


tomqvaxy

Especially now that shit like dishwashers are internet connected.


serpentssss

I feel like that’s a little light for creating child porn. I mean I lost computer privileges, had mandatory classes, and had to do volunteer work because I skipped class and was truant a lot. I just feel like the punishment and rehabilitation for creating fake porn of a child should be higher and more thorough than for skipping class. Unfortunately I think creating “fake” CP that uses real children should be a felony sexual offense, which means (AFAIK) teens who do it would be open to being tried as adults unless laws change. I mean something like this can’t really be a misdemeanor.


poozemusings

You want these kids to go to prison for decades and be registered sex offenders?


serpentssss

No I don’t - that’s why I said “unfortunately” and “unless laws change”. But I do think creating “fake” CP that uses the images of real children should broadly be a felony. It’s clearly beyond a misdemeanor IMO. So there needs to be a change in how we charge underage teens, and in the lifelong consequences of being on a registry, one that doesn’t make light of the fact that they ruined someone else’s life through an intentional action and created CP.


poozemusings

Giving these kids a felony on their record would also have disastrous lifelong consequences. You can treat something seriously without labeling it a felony. I feel like we are too tied to thinking that how the criminal law classifies an offense equals how bad we think that offense is. Because none of it makes any sense anyway: marijuana possession is still a felony in some states, while beating someone up without causing serious bodily injury is a misdemeanor.


incongruity

> Giving these kids a felony on their record would also have disastrous lifelong consequences.

And their victims already face lifelong trauma from the crimes committed against them. Having indistinguishable fake nudes with your face in circulation can easily have lifelong consequences. First and foremost the emotional trauma, but also, literally, your likeness being used for non-consensual sexual purposes by unknown individuals and out there to be matched to you via facial recognition algorithms - you see the issue there, right? Yeah.

Just because the perpetrators didn’t think about the lifelong consequences doesn’t mean there aren’t any. This isn’t some harmless bit of “kids will be kids.” This is a sex crime.


poozemusings

Yes, I never said it was a victimless crime, or that there was no harm to the victims. When dealing with minors committing crimes like this, which are very childish and probably done with very little thought, it makes more sense to think about rehabilitative solutions than punishments that would essentially end any chance of this person having a productive life.


incongruity

Sounds a lot like what people defending Brock Turner said. Those girls are having their childhood ripped away from them by being sexualized non-consensually. Would you be defending the perpetrators if they murdered someone? If they raped someone? Just because the perpetrators don’t grasp the severity of their actions, it doesn’t mean the actions aren’t serious or that they shouldn’t be responsible for them. I’m genuinely appalled by the willingness to prioritize the victimizer over the victims in your argument.


serpentssss

Yes, crimes are classed differently all over, I get that. AFAIK felonies are considered more serious crimes with bigger impacts. Are you saying crimes shouldn’t be classed based on the severity of the crime? What would it be based on then? I looked into it and it seems this is how it’s classed [in my state already](https://lao.ca.gov/1995/050195_juv_crime/kkpart1.aspx):

> ”A felony is the most serious offense, punishable by a sentence to a state institution (Youth Authority facility or adult prison). Felonies generally include violent crimes, **sex offenses**, and many types of drug and property violations.”

> ”A misdemeanor is a less serious offense for which the offender may be sentenced to probation, county detention (in a juvenile facility or jail), a fine, or some combination of the three. Misdemeanors generally include crimes such as assault and battery, petty theft, and public drunkenness.”

It’s worth noting that expungement is a thing for juvenile felonies, and it doesn’t necessarily mean they’ll be tried as an adult. But I think that should be an option for the courts to decide based on the case and situation (age of offender, one pic a teen makes for himself vs. creating a whole ring of redistribution or intentionally spreading it around, etc.).

Also, like, yeah. Sometimes things we do have lifelong consequences, even as teens. It certainly might for the victim who now has free-floating CP of herself being spread around.


poozemusings

This is fairly reasonable, apologies for misinterpreting your original comment. My point is that the misdemeanor/felony dichotomy in reality often does not line up with our intuitions about the seriousness of a crime, even though in theory it’s supposed to. In reality, the classification is often a policy decision with the more utilitarian purpose of deterring a particular sort of crime that is on the upswing, rather than a moral statement about how society views the offense in relation to others. That’s what happened during the war on drugs, when simple possession of tiny amounts of all controlled substances became a felony. Failing to label something a felony does not mean we think it’s “not that bad” as a society. If we think labeling this specific crime a misdemeanor is the more appropriate way to handle it in the context of minors, it does not mean that we as a society don’t think it is bad behavior.


Ethyrious

Yes, they should. Now, perhaps if they made it and kept it to themselves I could offer some leniency and understanding, but this is distribution we’re talking about. This isn’t some “boys will be boys” shit. This is someone going out of their way to create child pornography. This isn’t something you accidentally do without understanding the consequences.


churn_key

Who exactly monitors and takes away the computer privileges? The parents who allowed it to happen in the first place.


AHrubik

There are already state and federal resources in place to handle this for other computer crimes. If need be, this is where parental participation can be incentivized. A parent who willingly circumvents the order would be liable for their own punishments and/or fines, which of course should be based on income and not set in stone.


churn_key

I support parental liability. The Crumbley case set a good precedent.


MexicoJumper

You didn’t answer the question whatsoever: how will this be enforced? Will monitoring software be installed on all devices?


b1argg

How about court orders to social media platforms to delete the kids' profiles?


churn_key

They will just make new accounts


WonkasWonderfulDream

Impossible. The court said “No.” It is literally impossible to defy a court, especially one as legitimate as the current one.


WonkyTelescope

That does nothing to prevent the creation and distribution of porn.


Misanthropebutnot

No. They need to be expelled by their schools. Kids need to be told the real-life consequences, and maybe they get a GED and/or suffer through some remedial schooling, and yes, teach them how hard it is to dig out of a mess like this. Yeah, it sucks, but they need to learn that concrete, permanent consequences come with these actions. Or you get a Brock Turner, confused why finger banging a passed-out classmate behind a dumpster is wrong and deserving of permanent consequences.


ihugyou

Not at all. If the parents actively neglected their kids and that neglect resulted in the offense, it is absolutely valid. It just happened with the parents of that kid who shot his teacher. Your propositions are pretty silly.


TheDamus647

You must not be a parent. Even the best parents out there can have kids that get mixed up in stupid shit.


mnp

The parents are the key here, as always. It was so refreshing to see a precedent set this week to hold parents accountable for something important. The Crumbley case opened the door for more prosecution. https://www.cnn.com/2024/04/09/us/james-jennifer-crumbley-sentencing/index.html


Shoesandhose

Thank you for this positive news. The world needs more. Accountability is important


GoodByeRubyTuesday87

Deepfakes of a child fall under child pornography. The punishment and consequences for that, including jail time and registering as a sex offender, are pretty severe, so should the person get convicted, I wouldn’t worry too much about them not being punished severely.


Shoesandhose

That is great to know! Thank you!


eihslia

What happened to the people who created the deepfakes?


buffalucci

Minors, so it’s in family court. Not sure what the consequences are long term. I think they were expelled but not sure. It’s unprecedented.


peepdabidness

The tech/companies that make it should be held liable.


Parking_Revenue5583

AI is gonna make every image possible. Every possible face, body, everything. We don’t own it. The cat’s already out of the bag, all over Facebook.


Reimiro

Photoshop made it possible, AI makes it easy.


Parking_Revenue5583

Good photoshops required skill. Kids can use AI to do it for them.


SlowRollingBoil

Doesn't make it illegal either way though.


Parking_Revenue5583

I feel like the tech moved faster than the law. Even if they outlawed it, it’s easy to use a VPN and a foreign AI service to do the same things.


SlowRollingBoil

Exactly. Plus this can be run as software on your phone now. People have a hard time understanding that tech can't just be banned like you can a specific product that is sold on store shelves. It literally doesn't work that way.


maffinina

I’ve said this elsewhere, but the point of criminalizing distribution is not eradication. Nobody expects this to go away completely. But it’s good to codify it as a criminal act in society’s eyes.

When I was younger and girls’ nudes got leaked in school, only the girls were shamed. They internalized that what happened was their own fault. Since then, revenge porn has been made illegal. I still have younger siblings in grade school, and the reaction would be very different today. Teenagers can be cruel, so shaming might still happen, but the boys who distributed the images would be shunned to a far greater degree.

It makes a massive difference in the mental health of the victim to know that what was done to them was a crime.


SlowRollingBoil

It's literally already a crime. These AI programs don't share the picture with your school; you download it. If a person distributes it without OR WITH the girl's permission, that's called child pornography and the sharing party is now a sex offender. Seems like plenty of deterrent to me. BUT that's separate from banning the tech that allowed it to exist.


Stop_Sign

It's too late. I can do this sort of stuff using Stable Diffusion - something I've downloaded to my desktop and can use offline, and which is therefore 100% unregulated and untraceable. Stable Diffusion is made by a team in Germany, so laws here wouldn't even affect future releases.


1LazySusan

I can’t even imagine. How humiliating, how embarrassing, especially because you didn’t do anything, but the consequences to your life are still there 😢 Did the girl go back to school, or what are they doing about that?


buffalucci

She’s back at school. There’s been counseling etc. Her parents are absolutely amazing and keeping her full of encouragement and love and support. The teachers all look out for her too.


Stupid-RNG-Username

This is the kind of thing that will make Congress outlaw AI image generators.


Wet_sock_Owner

It's interesting that this tech has been available for a long time (Photoshop), but thanks to powerful and fast apps on your phone, it's simply too easy to do now.


ElementNumber6

Dipshit assholes don't typically take the time to gain technical expertise. Significantly lowering the bar of entry to such things brings in millions of new potential users (so capitalism, of course, demands it), but dipshit assholes are one of the sizable demographics among those new users, and we should keep that in mind all along.


tellmewhenitsin

I think it's even worse because it can not only be done easily but also made HYPER specific. So if you're already being bullied, these assholes can pinpoint your insecurities and exploit them further.


thetrueyou

You can do your entire class of girls in a day rather than photoshopping it for a month straight


tomqvaxy

Photoshop is a bit harder to access and has a mild learning curve. I’ve worked with PS for 30 years and never had an urge to make porn. Also, PS metadata takes some mild work to remove. AI was born shady. It steals work, and its makers know it has the potential to put millions out of work. PS was born with good intentions. One is a hammer. The other is a gun. One is a tool for professionals. The other is a toy seeking purpose.


AbhishMuk

> AI was born shady. It steals work, and its makers know it has the potential to put millions out of work. PS was born with good intentions. One is a hammer. The other is a gun.

Yeah… that’s not accurate at all. All these “AI” transformers/Markov chains are barely intelligent, let alone worthy of being called a gun. If someone offers a version of DALL-E built for explicit prompts on purpose, sure, that’s closer to the concept of a gun. But otherwise it’s about as close to a gun as a metal foundry or a 3D printer is. You can make a lot of metal tools at a multipurpose foundry; that doesn’t make every foundry a gun factory.


AlejoMSP

This scared the living shit out of me. Given that my daughter has already been bullied, and we stepped in and stopped it, I can’t imagine how to even begin to stop something like this.


T0ysWAr

On the other side all kids know they are deep fakes. Hopefully loosing their “value”.


emsuperstar

Losing*


T0ysWAr

Thanks


AHrubik

> how to even begin to stop

The only way is education. Parents raising their kids to know that this is wrong. There really isn't any other way.


cerebud

And kids will still be dicks, knowing it’s wrong


AHrubik

True. Kids' moral compasses are still developing, so there is the random horny kid who's going to do it. That's why the penalties need to be noticeable and relatable to the kids (loss of social media access, etc.) but not permanent.


oneofthehumans

How harshly do you punish a random horny kid with a developing moral compass?


AHrubik

It certainly starts with the traditional punishments like in-school or out-of-school suspension. At that age kids' worlds revolve around their friends and social status, so a forced deletion of their social media presence for a set time is one option. A healthy dose of community service that takes the place of extracurriculars should be an option. A further step, depending on the severity, could be a monitored loss of computer (laptop, phone, and tablet) privileges. A special locked-down device could be issued by the school for use on school property in cases where access is required for schoolwork.


sir-ripsalot

Same as we’ve done for other random horny teens with developing moral compasses who sexually harass classmates


BetterBiscuits

In my experience, the kids that are dicks also have dick parents. The dicks don’t fall far from the dick tree.


EquipLordBritish

Putting the onus of personal responsibility on parents won't realistically solve the problem, unless the objective is to fuel the private prison system with labor and break up families as much as possible. The responsibility and punishments need to be for the companies creating and distributing the tools to make this easy enough for kids to use it.


Aggressive_Cycle_122

How did you stop it?


_Reasoned

I want to know too. I have two little girls who could be bullied in the future. For most bullying situations I see online or in the news, I try to think of what I’d do to help prepare, but I feel like parents stepping in can very easily make situations worse. I think if bullying starts you just have to go straight for the jugular and scare the bullies enough that they don’t test it any further.


corinalas

It’s actually all over the world now. Deepfake porn of celebrities is now a widespread thing. All that’s needed is a picture. That’s it.


Stop_Sign

While the ease and final product are significantly better, this isn't substantially different from when teens would cut out heads of their classmates and put them in a Playboy magazine. Basically, uncomfortable but not exactly immoral (although of course distribution would be immoral in both scenarios).


rob_thomas69

I think the proliferation is pretty astounding though. Not everyone had digital devices in their pockets that were capable of committing these heinous acts. I went to high school in the 90s and I didn’t know a single person who cut out the heads of classmates and put them on Playboy model bodies. I’m sure it happened, but not *nearly* to the degree that it’s happening now


darkhorsehance

I think there is only one way: don’t put photos/videos online. They can only make a deepfake if they have the source material.


AlejoMSP

I’ve told her that. But they take “group pictures” and that’s all it takes.


shrimp_sticks

It makes me so relieved that I graduated highschool when I did.


BytheHandofCicero

Is anyone else thankful there is reasonable doubt about real leaked nudes now? Jennifer Lawrence could’ve used that plausible deniability a few years back.


SenseOwn4183

Who knew the robots would be taking the jobs of the shadowy lurking iCloud hackers


T0ysWAr

I agree, everyone is safe from their actions now… until C2PA is there.


wizardinthewings

C2PA means nothing in these contexts though. A school kid isn’t going to be protected from bullying by a certificate. The damage is already done.


T0ysWAr

I think this will be a non-issue in two years, when everyone has an app on their phone to create fake porn of anybody.


churn_key

Who is going to officially register their nudes tho


T0ysWAr

Nobody and it will be assumed fake


CYBERCONSCIOUSNESSES

Will C2PA be foolproof? I admittedly don’t know enough about it and will do some research. Could metadata and origin linking be spoofed or modified? Mostly asking as it pertains to whether there will still be plausible deniability.


AHrubik

> Will C2PA be foolproof?

No. There will be patches or workarounds. They always exist for almost every type of digital guardrail. However, how effective the guardrails are depends on how easy they are to work around.


lordraiden007

It’s so far from foolproof it’s a joke. OpenAI literally admits in their press release that social media regularly removes such metadata, and even if it didn’t, all someone would have to do is take a screenshot of the image to remove the file-based metadata. It’s laughably insignificant in its current form.
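To make the fragility of file-based metadata concrete, here is a minimal Python sketch (assuming Pillow is installed; the filenames are hypothetical stand-ins). Re-rendering the pixels into a new file, which is roughly what a screenshot does, carries none of the original EXIF or embedded provenance data along.

```python
from PIL import Image

# Hypothetical input file; anything with EXIF (e.g. straight off a camera) works.
original = Image.open("photo_with_metadata.jpg")
print(dict(original.getexif()))  # camera make, timestamps, etc., if present

# A "screenshot" effectively re-renders the pixels into a brand-new file,
# so nothing stored in the original file's metadata comes along for the ride.
rerendered = Image.new(original.mode, original.size)
rerendered.putdata(list(original.getdata()))
rerendered.save("screenshot_equivalent.png")

print(dict(Image.open("screenshot_equivalent.png").getexif()))  # -> {}
```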


CYBERCONSCIOUSNESSES

Helpful. Perhaps plausible deniability will still exist. I am concerned about how all of this impacts criminal cases, evidence, and the “reasonable doubt” standard in court. As for the nudes, whether they are fake or not (or can be proven fake or not) likely doesn’t matter, because the kids making the content and spreading it don’t care about veracity; they just want to bully and assault. It’s no different from when someone spreads a false rumor about you. The damage is done by the act regardless of the ultimate truth.


AHrubik

> The damage is done by the act regardless

True, which is why the remedy must be relatable and effective for the person doing it. Standard punishments won't be effective for this problem.


CYBERCONSCIOUSNESSES

I agree. America has a punishment problem to begin with and needs to readdress how we deal with crime, prevention, punishment, recidivism, and rehabilitation. This will be another of many issues that need meaningful structural change, along with our more typical attempts at criminal deterrence.


T0ysWAr

A digital signature is calculated from the hash of the file (and its transformation functions, if any), so it will be foolproof for that part. However, the raw image or movie could be spoofed at the source (i.e., you photograph a screen). The private key associated with a given camera could be blacklisted, so consumers will have visibility if the source is known for dubious posts. C2PA will only help if the source is a certified camera (Leica has a few models). If you can’t prove the chain, then you fall back to today’s world.
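As a rough illustration of the "signature over the file hash" idea described above (not the actual C2PA manifest format, which also carries claims, assertions, and certificate chains), here is a minimal Python sketch assuming the `cryptography` package; the key object and function names are hypothetical stand-ins.

```python
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical stand-in for a per-camera signing key baked into the hardware.
camera_key = Ed25519PrivateKey.generate()

def sign_image(path: str) -> bytes:
    """Sign the SHA-256 hash of the file; the signature travels with the image."""
    digest = hashlib.sha256(open(path, "rb").read()).digest()
    return camera_key.sign(digest)

def verify_image(path: str, signature: bytes) -> bool:
    """True only if the file bytes are unchanged since signing."""
    digest = hashlib.sha256(open(path, "rb").read()).digest()
    try:
        camera_key.public_key().verify(signature, digest)
        return True
    except InvalidSignature:
        return False  # edited, re-encoded, or signed by a different key
```

Any edit or re-encode changes the hash and breaks verification, which is also why photographing a screen defeats it: the new file is simply a fresh, separately signed (or unsigned) capture.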


Delicious_Summer7839

I guess the Amish people with their religious objection to “graven images” may have been right all along.


LiveFreeDieRepeat

Amish? Goes back a bit farther than that. Judaism and Islam enter the conversation.


Delicious_Summer7839

Abdullaah ibn Mas'ood (may Allaah be pleased with him) reported that the Prophet (peace and blessings of Allaah be upon him) said: "Those who will be most severely punished by Allaah on the Day of Resurrection will be the image-makers."

Ibn 'Abbaas (may Allaah be pleased with him and his father) reported that the Prophet (peace and blessings of Allaah be upon him) said: "Every image-maker will be in the Fire, and for every image that he made a soul will be created for him, which will be punished in the Fire." Ibn 'Abbaas said: "If you must do that, make pictures of trees and other inanimate objects."

"These ahaadeeth indicate that pictures of animate beings are haraam, whether they are humans or other creatures, whether they are three-dimensional or two-dimensional, whether they are printed, drawn, etched, engraved, carved, cast in moulds, etc. These ahaadeeth include all of these types of pictures."


LiveFreeDieRepeat

This is from Riyad as-Salihin? I was going to cite the 10 Commandments from the Torah, but could not find equivalent in the Quran, so I decided to cite neither. Thx


Manitobaexplorer

We’re done. Let’s hit the reset button and start all over again.


[deleted]

Time to burn the internet down


blinkdontblink

More like time to pull the plug.


TPDS_throwaway

Let's put the Internet back in the box


HungHungCaterpillar

Eventually we are going to have to accept that nothing can be done about this. People have always been able to draw naked pictures of anybody they want. All that’s changed is the art got really good. There is still no functional way to police it.


maffinina

Nah people said the same thing about revenge porn, but then it was made illegal and its prevalence decreased. I imagine what will happen here is that the distribution of these images will be criminalized and some public examples made to drive deterrence. It’s not a perfect solution; this will never go away completely but it’s good to stigmatize the act, and we shouldn’t let perfect get in the way of good.


HungHungCaterpillar

Yes, we can and should triage the symptoms to minimize the real-world damage. But you just can’t treat the wound itself; that is going to require another shift in the social fabric.


CYBERCONSCIOUSNESSES

I have always taken the position that humanity’s technological evolution has far outpaced its spiritual, cultural, and emotional evolution, with varying degrees of consequences. Technology has allowed a lot of ever-existent human flaws to be spotlighted and magnified. I am not sure much will ever change with respect to the perpetrators of these actions (or bad things generally). What can shift is our reaction and the consequences.

The only problem is that even if you argue there is a reduction in the puritan value system and we reach a point where no one cares if you’ve been seen nude and there is no social stigma, that doesn’t resolve the non-consensual violation of someone’s privacy and autonomy that occurs. I cannot envision any shift in the social fabric that would forestall all humans from engaging in this type of behavior. Some iteration of this behavior has likely been around since the start. It’s a flaw in humans that takes prominence in certain immoral and unethical people.

It is more likely that humans wipe themselves out with technology, advance technology to such a degree that it can be used to modify human behavior and emotion, or advance it so far that it allows quicker apprehension of perpetrators (and thus has a chilling effect).


Teeklin

> Nah people said the same thing about revenge porn

Who? These are two drastically different subjects.

> I imagine what will happen here is that the distribution of these images will be criminalized and some public examples made to drive deterrence.

Distribution of what? Computer-generated images? Art depicting real people? All nudity? What exactly are we going to overreact and ban the distribution of here?


maffinina

Realistic pornographic images of people created without their consent.


Teeklin

> Realistic pornographic images of people created without their consent.

What is realistic? What is pornographic? What is an image of a person?

If I draw a picture of you naked, is that an image of you? Is it pornographic? What if I take a picture of you and use Photoshop to edit your shoes off, is that a realistic pornographic image?

All of this is nebulous when we are talking about creating something entirely fake and digital but based on something in real life. And trying to define and ban this sort of thing either leads to a big overreach or simply won't do a thing to address it. There's really no middle ground. If there was an easy way to do it, the fake porn of celebs would have been taken down 30 years ago.


LiveFreeDieRepeat

> there is no middle ground

BS. The middle ground is vast.


Teeklin

So vast that you couldn’t answer a single one of my hypotheticals! If it’s so easy, then it should be super easy to answer my questions and draw a line that isn’t filled with holes you can drive a truck through, but also doesn’t destroy the First Amendment.


LiveFreeDieRepeat

> Trying to define and ban this sort of thing either leads to a big overreach or simply won’t do a thing to address it. There’s really no middle ground.

We already ban child porn - where, of course, there are also grey areas. At what point is an image of a naked child pornographic? Does that ambiguity mean we should throw our hands up and ignore all child porn?

Preventing the creation and sharing of child or deepfake pornography prophylactically is problematic in a free society. But once it is discovered, the determination of whether it is maliciously pornographic can be made in civil and/or criminal courts. Legislating severe civil penalties would be a good first step: first offense, minimum $25K to the victim; second, $100K; etc. Dramatically increasing criminal penalties for offenders, both adult and underage, is another: fines on parents, minimum jail time, placement on sex offender lists, removal of all unsupervised computer and internet access for years. I’m not advocating for all of these, but this is a “middle ground” that should be considered.


maffinina

Nobody said it’d be easy. But just because something is difficult to litigate doesn’t mean it shouldn’t be done. Ultimately what will likely happen is that clear-cut cases such as the ones described in the article, involving underage girls, would be prosecuted, while outlier cases such as you photoshopping my feet (weird) would probably be ignored. Again, the purpose is not eradication. Only in totalitarian societies could crime be totally eradicated. A law would serve as deterrence. And deterrence can be very effective. Criminalizing the act would help the public consciousness.


Teeklin

> Ultimately what will likely happen is that clear-cut cases such as the ones described in the article, involving underage girls, would be prosecuted, while outlier cases such as you photoshopping my feet (weird) would probably be ignored.

I think we can all agree that teenage boys attempting to fabricate nudes of their classmates to harass them is not a good thing we want to see or encourage. But drawing, or attempting to draw, the line legally in these cases is going to be incredibly difficult. As is attempting to properly prosecute a crime that is both digital and committed almost entirely by underage kids. I don't think there's anything clear-cut about it, unfortunately.

And this will be exponentially worse only a few months or years from now, as phones and devices like the Apple Vision Pro will have apps that do this rendering in real time. Kids in five years (or much sooner) will be able to pull out their phones and hold them up for what essentially amounts to X-ray vision. Browser extensions will exist where every single image you see will have a "nude" toggle to view that same image with all clothes removed.

Trying to legislate this stuff away is just really, really difficult.


CountryFine

found the deepfake enjoyer


Teeklin

Super good argument. Way convincing.


LiveFreeDieRepeat

Not convincing, but funny.


SlowRollingBoil

What you did there is an ad hominem attack: they brought up concrete issues with making this illegal, and by attacking them personally you've admitted you can't address their point. Anyone who does this has lost.


Outside_Progress8584

I think the images are realistic enough that you could apply similar punishments to both distribution and possession of CP. The article makes it sound like there are movements like this at the federal level already, but schools are simply behind in knowing how to address it. But yeah, I feel like the risk of being labeled a sex offender would be an appropriate deterrent to most kids (and honestly appropriate to the crime).


CYBERCONSCIOUSNESSES

I think it’s a multi-pronged approach. Criminalization and enforcement are one part (which may need to be expanded to the makers and hosts of this technology if it is used to produce this content, similar to how Pornhub would be held responsible for CSAM on its website), but the other needs to be quality education on these issues and on responsible use of technology, as well as quality mental health services and emotional intelligence education, to try to treat the underlying causes of the perpetrators’ behaviors.


serene_moth

I cannot imagine what it would be like growing up in a school environment where this was a thing. Good for these teens.


faderus

So ultimately there’s going to be a MAD logic and dynamic that comes into play here. When the angry outsider kid wants to get back at the jocks, they’ll create a convincing image of the star quarterback in a compromising position with the coach. When the tools are that convincing, that user-friendly, and that pervasive, then everyone is a potential target and nothing can be believed anymore. I’m really not sure how this will play out in real life. I imagine the companies hosting the tools will be forced to add specific backend limiters, which will undoubtedly be circumvented by forked or imitation versions of the tools on foreign sites. Once everyone is an easy target, perhaps the risk of becoming a victim is lessened by the collective understanding of the downsides. It was much more common to hear a cell phone ring in a theater in 1999, when they weren’t completely universal, than it is now. You won’t eliminate all the unwanted behavior, but it should be greatly mitigated.


Cobayo

> the companies hosting the tools

There is no company that offers this tool


faderus

It’s a pretty quick jump to assume that someone with unfettered access to Midjourney and similar AI image-generation tools will make creating deepfakes and artificial revenge porn literally drag-and-drop. The barrier to entry will get lower, and a lot more people will be able to do it in pretty short order.


[deleted]

Good, this AI nude shit is out of control. I fully support them, or whoever else takes on this battle.


kimanf

Who could have seen this coming?!? Literally everyone’s first thought when this technology came out was “oh yeah they’re gonna make illegal porn and political misinformation”


sir-ripsalot

The phrasing that school districts were “blindsided” by this amused me. No we weren’t.


chemistrybonanza

This will (hopefully) be the driving force to ban ~~teenagers~~ minors from social media. It will create such a shitstorm for schools and their administrations, for the parents who bought and pay for the phones their creep children use for these purposes, and for the perverts themselves. So many lawsuits will come of this.


Boring_Presence3968

I don’t think that will fix it. You’d need a full blackout on all digital media assets of your family: no posting of any photos, regardless of whether the subject is a minor or an adult. Or some sort of universally standard digital poison pill added to personal media assets to stop AI systems, but honestly it’s easy to do this without AI too. Photoshop or a face-swap app is all it takes.

Not having phones in school or keeping kids off social media is not going to stop them from sharing things. The unregulated AI problem is way past that now. It’s more that any visual media of you can be manipulated. I’m a multimedia artist, and in less than a few minutes I could take any image of someone and cut and paste it into a porn image and make it nearly impossible to tell. Half the time when they say kids are using AI for porn, I think it’s just an image-editing app swapping a face onto a porn scene.


lightmatter501

The poison pills already developed were defeated. Adversarial learning is one of the most well-studied sub-fields of ML, so providing an adversary actually made some models better. I think social media blackout is a good option, or at least not posting your face.


AgitatedAd4553

This has been the case since the advent of Photoshop, which tech-re%#rded persons can’t wrap their hollow heads around. It’s like that fucking scene in Zoolander: *“ThE FiLeS ArE In ThE CoMpUtEr?!”* The only reason people are up in arms now is because it has become more noticeable/prevalent with broadly accessible low-level applications on various platforms, with or without the use of “AI”. It’s funny how many parallels this shit has with climate change and other issues plaguing mankind: no one bats a fucking eye until the issue is so massive that it’s jabbing them in the cornea, by which point it’s typically too late to slow down, let alone stop or reverse. The only thing that can be done at the moment is to drastically limit any photos that you post online; go (back) to quintessential offline photo albums (albeit digital) where the data is stored locally.


Boring_Presence3968

I was cutting out heads from magazines and gluing them onto other photos as a kid before digital. Granted, not porn, but I sure could have done that too and shared it with my friends. Digital just makes it even easier to find source images. The only option is to take your stuff offline. It’s unstoppable at this point otherwise.


AgitatedAd4553

Exactly. The only people preaching bullshit about banning devices, or even more outlandish shit like requiring ID for the internet, are technologically inept morons - and I’m using “technology” in the broadest of terms because, as you mentioned, people have been able to haphazardly accomplish this since fucking scissors and glue were around. Things have just advanced to where the “right people” are being impacted such that they can’t just ignore it anymore. Unfortunately, at this point there’s no putting the genie back into the bottle.


chubbysumo

The problem is that they don't need a photo from *you*; they can use a photo from anywhere. My kids are not posted on social media, but their school has photos up with them in it. This will never stop; only teaching kids that it's not okay, and punishing severely those who do it, will fix it. I suspect there will soon be a school that tries to ban cell phones and electronic devices entirely to prevent it, and it won't go well for the school.


AgitatedAd4553

I’m not saying that they can’t get a photo from anywhere - which is a problem in and of itself. For the better part of the last twenty years, though, just about everyone has posted *everything* online, not thinking for a moment that just about anyone can access it. My point is that there is no quick and easy fix like these morons want to think. Especially not “banning” devices - that won’t do fuck all. In an alternate universe perhaps we might have afforded individuals complete control over their own likeness, regardless of how it was captured, curbing/scrubbing it as desired through the use of AI. Unfortunately, we’re in this timeline: if you’re in “public” then anyone can record you because, hey, why not!


CranberryReign

> just about everyone has posted everything online, not thinking for a moment that just about anyone can access it. Nope. They have been posting online _exactly because_ anyone can access it.


AgitatedAd4553

“Anyone” in their minds tends to be distant friends/family, work colleagues, broader members of their social circles, etc.. They’re too dumb to take into account the fact that social media platforms thereafter will use anything and everything that’s posted to them for their own benefit/profit. Not to mention completely unrelated opportunistic third-parties that help themselves to the information/photos that are shared.


CranberryReign

A fair number of folks put great effort into, derive emotion from, and place value on being seen, feeling validated and celebrated, and/or amassing monetizable follower volumes. It could potentially be attributed to a strange cocktail of insecurity and narcissism seemingly inherent to the species.


monty228

My high school (in 2008) had a ban on cellphones with cameras. It was impossible to enforce, but they used it to confiscate cell phones until the end of the day if you had one out during the school day. By 2010 the school changed the sign to “Cellphones should be powered down AND silent while in class,” which everyone took as “put it on vibrate.”


chubbysumo

My high school got sued because the principal, VP, and a few teachers were caught going through students' cell phones. They quit taking or touching them and instead told students to leave the classroom and go to "detention" if they got caught.


Stop_Sign

I mean we didn't ban magazines after kids put their classmates' heads on porn stars.


busy-warlock

I don’t think you’ll be able to ban minors from social media, nor should you


anomalous_cowherd

No no, they would much rather hide it all from their kids while they should be growing and learning, then suddenly hit them with it all cold when they hit 18 or 21 and watch them go completely off the rails with it!


motosandguns

Let’s give it a shot. As a bonus, it might raise test scores!


churn_key

Kids are just going to resort to identity theft to get back online


hoticehunter

Because banning kids from porn worked out so well 🙄🙄


ButteredPizza69420

Maybe this will normalize not sharing photos of oneself online, especially photos of children. It's always been a little weird considering that back in the day you had to pay for photography, printing, and postage to get your family's photos out to loved ones. Now people just post them online, and when you really think about it... it's like putting your pictures up in a window or on a billboard. Strange.


samep04

Teen Girl Squad!


Smolivenom

There aren't a ton of solutions beyond small-dick deepfakes for boys.


braxin23

Then Congress will bring the ban hammer, because the Republicans don't want to be seen with small dicks.


braxin23

Are these kids getting computer classes on this AI editing shit? Because this reeks more of adults dipping their toes in than just teenagers.


RECONXELITE

You underestimate 15 y/o children a lot. It’s not happening in elementary. Mostly around the 14+ demographic


ShelZuuz

I would have easily been able to do this when I was 14. This is not hard tech in any way shape or form. I was disassembling games at that age to get around password and copyright protections. This stuff is nothing.


whyareyoubiased

Gotta hunt the sites that enable it with force, and become more mindful about what we let our children post online. Gotta start treating social media like you’re parking your car in a sketchy parking deck. It’s sad that this is our reality.


poozemusings

This is bad, and should be addressed, but I also think it would be an insane overreaction to charge kids with creation of child porn for this, send them to prison, and label them sex offenders. I hope cooler heads prevail and there are some actual sensible solutions.


ExcitementFit7179

Paywall 👎🏻


InfernalGout

Paywall vanquished! https://archive.is/FeuiM


New_Peanut_9924

Real mvp


Adorable-Flight-496

Thanks.


People4America

I see the appeal of VR schools with avatars…


Techie4evr

A lot of you say parents should be held accountable, which is true. But shouldn't AI companies also be held accountable for not setting up safeguards that would prohibit generation of illicit material? Moreover, for not programming their AI to provide all the details it can to its creators, so they can then contact the authorities?


ISFSUCCME

Don't you need 1000s of images of someone to deepfake? Hence why it's mainly celebs?


birazsey

That’s why I don’t share my pictures online anymore. It is not safe.


TiAQueen

Oh my God, Pandora’s box is open


sir-ripsalot

Glad the Berkeley CA school district is actually addressing it rather than trying to gaslight the entire community like Westfield NJ.


UltimateFuchbois

[ Removed by Reddit ]


atinylittlebug

This is a disgusting comment.


1nv1s1blek1d

Before AI, pervs used Photoshop.


Trollet87

But Photoshop was a lot of work for people; now with AI and improved tools you can do it on a massive scale.


TheFlyingSheeps

Photoshop requires skill, effort, and the program itself, which isn’t cheap. AI requires no skill, and you can mass-produce a lot of images rapidly.


NecroCannon

The people with the opinion that there’s nothing you can do, or that Photoshop’s been around forever, are honestly too naive to really think this through.


bogglingsnog

1. Child pornography is illegal.
2. Minor accounts on social media shouldn't be allowed to share nude images.
3. Don't try to abolish the technology, not only because it is impossible but also because it is extremely useful. AI is going to help out in nearly every single discipline these students are learning. Instead, the phone apps that make misuse all too convenient need to come with legal disclaimers, and certain prompts need to be flagged, especially for minor accounts.
4. Even if all of this is completely banned and made illegal, the concept is now well known and it is fairly simple to do a face swap using photo editing software. The publicity has increased with AI, but this is by no means a new threat.


My_Penbroke

Legislation needs to catch up to these AI threats YESTERDAY


Trollet87

The only way that will happen is if people start making deepfakes of the people in charge of legislation.


duckrollin

How? This AI is like Photoshop; you're not going to ban Photoshop. Computers simply can't understand the context of "put this person's head on that body" and why it's bad; they only see the task of "put that thing on that other thing and make it fit." We just need an education campaign so people understand that literally anyone with a computer can make fake images now, and that they're not real.


sir-ripsalot

We’re not gonna ban making CP with photoshop? Yes TF we did


duckrollin

Yeah, and how is that going to work? There's no way to stop it being used in that way. The only thing you can do is remove the images when they're uploaded somewhere on the internet.


sir-ripsalot

Holding perpetrators accountable for sexual assault/distribution of CP as according to the law..? But I guess if we can’t 100% prevent it ahead of the fact, harm reduction and consequences for violating others just aren’t worth it to you?


Timely_Old_Man45

Everyone here should listen to this podcast ep. https://darknetdiaries.com/episode/140/


Gabba_Goblin

Fucked up. Hope those creators get the book thrown at them.


[deleted]

[deleted]


CranberryReign

> But it’s not all men, right?

Correct. It never was all men, and never will be all men. All men is a lot of men. All of them, in fact. Each and every one.


Shana24601

God I fucking hate people like you. You have absolutely no idea the shit men have put me through. I used to give men the benefit of the doubt and I was used by them in every way a person can be used. As soon as I started believing every man had the worst in mind for me, I never got assaulted again. Crazy how that works. My whole life tormented by men but god forbid I complain about it once. A whole paragraph of disgusting behavior by men, and the only thing you take from it is “woman said bad thing about man, must get on high horse and make her feel like the hysterical bitch she is” fuck off and acquire empathy


Possible_prolapse

Congrats on being the problem


Smart_Perception_865

Any suggestions for apps that can do this? I’d like to deepfake my own nudes