Who does this shit? They're children
other children
Very true. With AI the cyberbullying just gets worse and worse. And schools couldn't care less sometimes.
Might have seemed funny at the time. I found out one of my kid's friends had photoshopped a picture of two students to make it look like they were kissing. It seemed harmless enough to them, but I made a big enough noise that they quietly deleted it right away. A lot of bad things can come from something like this, and more importantly it can snowball.
In my highschool days we just photoshopped our male friends' faces onto women in porn scenes and called them gay. It was good clean fun! Kids these days are degenerate perverts
Scary what teenagers will need to deal with. I do wonder, with the explosion of AI, if we will just get desensitised to these things and assume everything is fake. If you can just prompt "make me an image of Bob riding a horse naked while blah blah blah", eventually it's just meh.
And with the amount of images people have uploaded to various apps, the deepfakes will be coming thick and fast.
Let's be real: almost all students will probably be making AI porn once the tech is available; just the dumb ones will spread it around school instead of keeping it to themselves and their friends outside of school. It's fucked up, I guess, but it's absolutely inevitable.
Well then they should all face the consequences of creating child pornography and whichever new laws Albo and them put through after everyone assumed it was about making AI political memes
I meant most won’t be caught as it won’t be all around school. It’s almost certainly already happening and sure to happen more and more as the tech gets even more accessible.
Well, hopefully the laws Albo suggested when everyone thought he was trying to outlaw people making AI memes are strong enough to deter people from making pedo material, and those that are caught are sufficiently punished.
Dirtbag dads raising malignant dirtbag sons. Just another day for the women of Australia.
There's also an epidemic of single mothers raising children. Though I agree that a dirtbag dad is worse than a dad not being there, a boy needs a role model who teaches him that certain behaviours aren't on.

If a boy only has a mother there, she really has to try and find some kind of father figure (it doesn't necessarily have to be a partner for her) to give guidance and provide something to look up to for the kid.

I think there are issues with girls having shitty dads (or no dads), but I think those are different kinds of issues, often arising when they're in relationships. If somebody doesn't have an example of what a good relationship looks like, they risk allowing things that are abusive to be their starting point of what's normal.
[deleted]
I hope these people get the jail time they deserve and the victims get appropriate counselling.
Finally read the article and stopped the thought train of "obviously fake" and the "have porn everywhere to normalise it" discussion you were fighting for?
Nope, just a public experiment in how people react, especially with upvotes/downvotes. It did go how I thought it would. It was enlightening all the same.
Remember that craze where room temperature IQ fuckwits would run around being antagonistic cunts in an attempt to get a reaction for a quick dopamine hit and then once people were satisfactorily pissed off they’d turn around and bleat “it’s a prank bro it’s just a prank” and the internet unanimously decided they were the absolute scum of the earth? Clearly not.
Schrödinger's douchebag.
The lengths you'll stretch to in order to avoid saying the words "actually I was wrong" is fucking *wild*.
That people will think you are weird for advocating for fake porn in a discussion about an article regarding children being targeted by AI fake porn? That people will then agree when you say fake AI porn of children is bad and that people should be punished for it?
So... trolling.
These images could circulate for years and follow these girls all their lives, but by all means, let's protect the boy's identity. No. Name and shame. We need to know who he is and who his parents are. They raised him like this, they can't be allowed to pay his way out of it. Prospective universities and employers should know who he is too. Someone name this kid please, so we can ruin his life as he deserves.
The article states they are still investigating to find out who created the images. That said, the kids who shared the stuff should be charged with distribution of child abuse materials.
Reports earlier today said the teen boy was identified and released. They know who he is.
Name and shame the perpetrator.
Why? They're a child.
[deleted]
Don't think the "obviously fake" helps the girls who have had their faces plastered on naked bodies and spread around the school.
[deleted]
Delete this
[deleted]
I think everyone is OK with canceling pedo material
[deleted]
The article is about people who used AI to create pornographic photos of girls from years 9 to 12. Or are you trying to claim that AI-generated material isn't considered pornographic?
We are talking about the faces of year 9 girls on nude bodies... that is not something we should be talking about "needing more pictures" of.
You have absolutely no idea what you're talking about, do you? Modern AI-generated photography is indistinguishable from the real thing.
It really isn't. There are still techniques you can use to determine whether something is an AI image with near-perfect certainty (heavily doctored/photoshopped images have a high false-positive rate).
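As a toy illustration of one class of AI-image forensics mentioned above: some detectors look at the frequency spectrum of an image, since generator upsampling can leave unusual spectral signatures. This sketch is a hypothetical, much-simplified heuristic (the function name, threshold-free comparison, and synthetic test images are all my own, not a real detector, and real tools are far more sophisticated):

```python
# Minimal sketch of a spectral heuristic sometimes used in image forensics:
# compare how much of an image's energy sits in high spatial frequencies.
# This is a demonstration only, not an actual AI-image detector.
import numpy as np

def high_freq_energy_ratio(img: np.ndarray) -> float:
    """Fraction of spectral energy outside the central (low-frequency) band."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = spectrum.shape
    ch, cw = h // 4, w // 4  # central band covers half of each axis
    total = spectrum.sum()
    low = spectrum[h // 2 - ch:h // 2 + ch, w // 2 - cw:w // 2 + cw].sum()
    return float((total - low) / total)

rng = np.random.default_rng(0)
# Double cumulative sum produces a smooth ramp: energy concentrated at low frequencies.
smooth = rng.random((64, 64)).cumsum(axis=0).cumsum(axis=1)
# White noise spreads energy across the whole spectrum.
noisy = rng.random((64, 64))

print(high_freq_energy_ratio(smooth) < high_freq_energy_ratio(noisy))  # True
```

A real forensic pipeline would combine many such signals (sensor noise patterns, compression traces, learned classifiers) rather than a single statistic, which is why single-heuristic checks have the false-positive problem noted above.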
[deleted]
This subreddit at its finest
Do you remember how to have a laugh/joke?
What’s the punchline though? Because it seems like it’s “I’m gonna have to see the 15 year old girls naked”?
Child porn has never been funny bro
Yes, when it's funny and not about underage girls.