SleepingTigers Posted January 27 Wouldn't banning AI outright be more effective, since that would cover other harmful AI too? But I'm glad at least something is being done. Hope Taylor sues.
QueenBLadyG Posted January 27 7 minutes ago, Cruel Summer said: You started your side of the conversation in this thread by suggesting that the current search limits were "doing way too much" and saying she wasn't the first person to deal with this, then asking where this outrage was before. That probably doesn't read to everyone as genuine concern about prior non-celebrity victims not having been taken seriously enough, but instead might come across as dismissing the current situation, as if to imply that she should just get over it, or that these measures shouldn't be tried, or that people in general are overreacting - just because it's happened to others before who weren't given this same attention. Not saying that's how I read it or how you actually meant it, but I think that's why people reacted the way they did to you. Everyone who has a moral opposition to this kind of material being distributed probably agrees with you that attention to this kind of situation is long overdue and that it shouldn't have taken one of the biggest celebrities in the world being a target for serious conversations to be had about it. I just think that it's already being dealt with; the photos are actively being taken down and deleted, and it's over the top to ban her name. I would say the same if it was Beyoncé. I do think that people should have thicker skin. I can understand how some would consider it "dismissive", but that's how I am. Ignore the hate and evil because it's not going away, and it doesn't define you.
Illuminati Posted January 27 1 minute ago, QueenBLadyG said: They do post a lot of child porn to Twitter, unfortunately, but I was simply bringing up a viral story I had read before this happened on Twitter. I wasn't stating my opinion on this as an only-Twitter thing. I was talking in general. I sense your opinion on this has somewhat shifted, but the way you were coming into this initially was that it shouldn't be regulated at all because they waited too long. I would want that girl to be the last to experience that kind of harm, regardless of how the law (or TOS changes) came about. We could do a lot worse than have Taylor influence the laws that are being passed; look at the grip JK Rowling has in the UK with her anti-trans meltdowns.
QueenBLadyG Posted January 27 3 minutes ago, kyoshi said: I believe you would say this about any victim, because you really dgaf about them, and the White House also has no plans to do anything about the AI images btw, they're just using it for some brownie points and to get people to vote for genocide joe Reality doesn't mean I "dgaf". Joe needs that 2nd term.
QueenBLadyG Posted January 27 (edited) 5 minutes ago, Illuminati said: I sense your opinion on this has somewhat shifted, but the way you were coming into this initially was that it shouldn't be regulated at all because they waited too long. I would want that girl to be the last to experience that kind of harm, regardless of how the law (or TOS changes) came about. We could do a lot worse than have Taylor influence the laws that are being passed; look at the grip JK Rowling has in the UK with her anti-trans meltdowns. No, I've always felt they should be highly regulated. I was just saying that it took this for everybody to get their undies in a bunch? No one should have to deal with this. I feel terrible for Taylor. I know she felt extremely violated. And that's a whole other story and conversation to be had.... Oh, JK..... Edited January 27 by QueenBLadyG
byzantium Posted January 27 1 hour ago, HRHCOLLECTION said: Should she post a statement on this situation or fly under the radar? I feel like she does not want to draw more attention to it. The second she speaks, the entire country will know about it and more people will likely look for it.
kyoshi Posted January 27 16 minutes ago, QueenBLadyG said: Reality doesn't mean I "dgaf". Joe needs that 2nd term. Pretty much, you don't give a **** and ofc you support genocide, no surprises there
byzantium Posted January 27 1 hour ago, QueenBLadyG said: Just search 'tay' or 'taylor'. And this is doing way too much. She isn't the first person to deal with AI pictures. Where was this outrage before? Take your stan goggles off for once. Just because something was done wrong before does not mean we have to perpetuate that until the end of time.
DoubleRainbow! Posted January 27 Wish Twitter did the same for every single female artist who has gone through the same thing
QueenBLadyG Posted January 27 1 minute ago, kyoshi said: Pretty much, you don't give a **** and ofc you support genocide, no surprises there Life was a willow and it bent right to your wind (oh)
QueenBLadyG Posted January 27 (edited) 4 minutes ago, byzantium said: Take your stan goggles off for once. Just because something was done wrong before does not mean we have to perpetuate that until the end of time. How can I take off my Swiftie goggles? Edited January 27 by QueenBLadyG
Popboi. Posted January 27 1 hour ago, QueenBLadyG said: You mean like the little 14 year old girl who killed herself because of her fake AI porn pictures being distributed??? And why would something need to be "viral" to be taken care of? You have to be popular to be helped or shown compassion? That's a different situation with different actions to be taken, though. Of course stuff like Taylor's name being unsearchable on X happens when the original post of the AI pics was on X. Legal action can be taken against the host platform where the pics were uploaded; Elon's team must've gotten a warning and is now working overtime to clean that up. The other case did not originate on X, only the news of it did, and it will have its due criminal investigation. "Viral" = visibility = more people outraged. Obvious math, even more so when the victim is a loved public figure like Taylor. You're comparing apples and oranges.
QueenBLadyG Posted January 27 (edited) 5 minutes ago, Popboi. said: That's a different situation with different actions to be taken, though. Of course stuff like Taylor's name being unsearchable on X happens when the original post of the AI pics was on X. Legal action can be taken against the host platform where the pics were uploaded; Elon's team must've gotten a warning and is now working overtime to clean that up. The other case did not originate on X, only the news of it did, and it will have its due criminal investigation. "Viral" = visibility = more people outraged. Obvious math, even more so when the victim is a loved public figure like Taylor. You're comparing apples and oranges. An incident shouldn't have to be "viral" for action to be taken; that's the obvious point of the question that was asked. And I wasn't speaking only about X. I was making a blanket statement about AI. Edited January 27 by QueenBLadyG
Popboi. Posted January 27 1 minute ago, QueenBLadyG said: An incident shouldn't have to be "viral" for action to be taken; that's the obvious point of the question that was asked. And I wasn't speaking only about X. I was making a blanket statement about AI. Actions of this nature can't be taken for cases they don't even know about or that won't have repercussions for them. "Viral" is not a goalpost that has to be reached in this case; it's a vehicle that brings subject matter to the right people so action can be taken. Nothing can be legally done against image-generating AI (outside trademarked content) for now. Taylor's mess might kickstart a precedent to legislate against it, so really, stop seeing it as some sort of negative thing (implicitly said by half the thread being you contradicting every other post).
Goaty Posted January 27 54 minutes ago, QueenBLadyG said: I do think that people should have thicker skin. I can understand how some would consider it "dismissive", but that's how I am. Ignore the hate and evil because it's not going away, and it doesn't define you. Thicker skin? About non-consensual AI deepfake porn?? This isn't a couple of losers calling her ugly or a snake. There have been several famous people who have been victims of these kinds of sexual crimes, and they've all talked about how terrible, frightening, and violated they felt. God forbid X do a little "overreacting" to make sure it's not as easily spread online. They're likely not even doing these measures out of the goodness of their heart anyway. This has blown up to the point the White House is commenting, and I'm sure X is scrambling to legally cover their asses
Goaty Posted January 27 "Just ignore the hate and evil, it doesn't define you" get ******* real
*.Digambar.* Posted January 27 38 minutes ago, QueenBLadyG said: How can I take off my Swiftie goggles? You'll first need to take off your hive goggles.
The Man Who Posted January 27 Now we just need her to get that entire site taken down.
QueenBLadyG Posted January 27 2 minutes ago, Popboi. said: Actions of this nature can't be taken for cases they don't even know about or that won't have repercussions for them. "Viral" is not a goalpost that has to be reached in this case; it's a vehicle that brings subject matter to the right people so action can be taken. Nothing can be legally done against image-generating AI (outside trademarked content) for now. Taylor's mess might kickstart a precedent to legislate against it, so really, stop seeing it as some sort of negative thing (implicitly said by half the thread being you contradicting every other post). You don't have to be famous or have your case go "viral" for action to be taken and repercussions handed out, though. At least you would think that. Some parties behind AI image generators have already taken action by excluding certain terms so the programs can't create these kinds of images. But this, with Taylor, is going to speed it all up, obviously, as we can see by the reaction. I think banning someone's name is over the top. It doesn't make me unable to still love and support them, all while understanding it could be uncomfortable for them.
QueenBLadyG Posted January 27 4 minutes ago, *.Digambar.* said: You'll first need to take off your hive goggles. I've already stated I'd feel the same if it was Beyoncé.
45seconds Posted January 27 Those images were upsetting. I can't imagine how that must feel
chiliam Posted January 27 2 hours ago, satellites.™ said: Not her fault, but this backlash is going to be MASSIVE. All of the women affected daily by AI and porn, and the first time the victim is Elon's fave white woman he bans it immediately to protect her. Oh baby. Such a dumb and nasty comment; you are really blinded by your hatred. And that's how the law works: there has to be a big media storm or an unprecedentedly serious case for lawmakers to move. Regulations need to be made for AI, and Taylor's viral case just helps push them through faster, yet it seems like you guys would prefer nothing to be done. Lol
The Music Industry (Author) Posted January 27 2 hours ago, QueenBLadyG said: I just think that it's already being dealt with; the photos are actively being taken down and deleted, and it's over the top to ban her name. I would say the same if it was Beyoncé. I do think that people should have thicker skin. I can understand how some would consider it "dismissive", but that's how I am. Ignore the hate and evil because it's not going away, and it doesn't define you. Telling a victim of online porn to just have "thicker skin" is WILD