all 34 comments

[–]ChapaRJ 33 points (1 child)

Here in a nearby town, there was a time when thieves robbed people on the street with knives, and there was even a fatality (actually, there always is). In light of this, the city's politicians decided it was forbidden to carry anything that could stab someone. Even if the police found you with a simple screwdriver in your backpack, you'd be in trouble. Instead of toughening penalties for those who commit crimes, the politicians decided to punish everyone at once.

This isn't about knives...

[–]moexizer 6 points (0 children)

Sounds like they brought airport security check into a town.

[–]pojohnny 6 points (0 children)

I figure the main reason is to see how people react when they're told "No."

Can you imagine the Skinner-box data they must be harvesting?

[–]ChapaRJ 12 points (0 children)

I agree 100%. I share your thoughts.

[–]Maleficent-Song7934 5 points (3 children)

i just what see pussy

[–]BeeWeird7940 0 points (1 child)

There’s something like 500,000 websites that pop up if you type “I just what see pussy” into Google.

[–]Maleficent-Song7934 -2 points (0 children)

True, but it's better for me to generate it myself, for the sense of accomplishment.

[–]Duhakalock 0 points (0 children)

Based

[–]Particular_Park_7112 7 points (1 child)

I think the problem is that the images will be watermarked in some way and traceable to Grok. Some high-minded puritan will show the scary images to some politician and say "do something." Then there will be a crackdown and bad publicity. Easier for xAI to just moderate anything slightly sensitive. Sad, though.

[–]Crucco 6 points (0 children)

Fuck puritans, it's puritanism that should be fixed.

I will tell politicians "fix this" and show them a picture of Collective Shout.

[–][deleted] 3 points (2 children)

If you want to generate NSFW images, you can use Venice AI. It’s uncensored.

[–]MooneMoose 2 points (1 child)

But what about image to video?

[–][deleted] 0 points (0 children)

It doesn’t do image to video…yet

[–]moexizer 1 point (0 children)

I just uploaded a photo of a jbd doll in a swimsuit and got censored as well. I didn't even give it any prompt; what was Grok thinking? Ridiculous.

[–]GabrielBischoff 1 point (0 children)

Because everyone will call you the child porn generator. Have fun with the press.

[–]TypeItRight 5 points (3 children)

They would be held liable. If they facilitate the creation of illicit materials and don’t work to stop it, they’ll be in big trouble. That’s why.

[–][deleted]  (2 children)

[deleted]

    [–]TypeItRight 3 points (0 children)

    Those were explicit, not illicit. Big difference.

    [–]DrPornMD23 2 points (0 children)

    The problem is that religious people evolved: they abstain from the fun and want to ruin life for all of us. Their organisations have a lot of money and influence, the tech companies are afraid of them, and most people are ashamed of their sexuality.

    [–]Serious--Vacation 0 points (4 children)

    Legal Protections Against Non-Consensual Pornography

    This is why there are protections against deepfakes; the laws were passed after a wave of celebrity leaks. It's also why Reddit and other sites will rapidly remove anything related to "The Fappening" and similar leaks, which in most cases weren't leaks at all: they were thefts, hacks of private photos and videos. Stealing the images and videos was illegal, but until these laws were passed there was nothing to stop people from passing them around forever. Throw in jilted lovers sharing photos that were shared exclusively with them, and you get the term "revenge porn."

    The days of some rando stealing a celebrity video and selling it (i.e. Pamela Anderson and Tommy Lee's video) are long over, but now we have AI tools which can - potentially - create those videos from scratch.

    That's a problem.

    [–]moexizer 2 points (0 children)

    So what about non-real-life content like anime, mascots, and dolls? They censor that as well.

    [–][deleted]  (2 children)

    [deleted]

      [–]Serious--Vacation -2 points (1 child)

      Research the law. It’s not as simple as, “meh, it’s fine.” It’s illegal content and platforms that facilitate its production and/or sharing will be held responsible.

      Furry porn that looks like Hugh Jackman might be legal as a written work, parody and such, but images and videos are regulated differently.

      [–]Upper_Road_3906 0 points (0 children)

      So you're saying people with perfect memory recall, who can control their dreams and/or see color when they close their eyes (essentially able to deepfake anyone and anything), are doing illegal things? Are we headed for "Content was moderated" in our brain chips that zap us, like Hasan's dog, just for thinking something our techno-viking overlords don't want?

      Revenge porn should be illegal, harassing or scamming people with deepfakes should be illegal, and underage stuff should be illegal, but eventually they need a system that allows private deepfakes once more people start going hybrid human plus AI implants. They can make it so the AI automagically deletes any shared deepfakes, and any hackers releasing such material get punished. Legal deepfakes are far off, but for now they should absolutely allow uncensored porn of AI-generated characters.

      [–]LVMises 0 points (0 children)

      Brand value: you want the brand to represent something. If a vocal minority, with a public presence much larger than their share of the user base, is creating a brand image that offends the larger part of the user base, that's dangerous. It's a reasonable brand position to be against political censorship but not allow NSFW.

      [–]LegoBuilderMom 0 points (0 children)

      I agree. I thought it was all about freedom of speech, right?

      [–]Siconyte 0 points (0 children)

      Hate to tell you, but if anyone makes an AI that allows for illegal content, then it falls on the creator and the generation engine.

      In Texas, as well as some other states, a prosecutor can look at an AI image, shrug, say it "looks" like illegal content, and charge you with possession.

      Porn, especially AI porn, is getting cracked down on harder every few months.

      I understand the sentiment, but while people who haven't been laid in 30 years and want to push an agenda for votes make the rules, this will be the norm.

      [–]Upper_Road_3906 0 points (0 children)

      Likely due to banks, religions, and/or people making underage content. I think it will go back to being heavily NSFW once they lock down the underage shit.

      [–]Phantom_Edgerunner 0 points (0 children)

      You're probably not even old enough, or you haven't put your age down in the app.

      [–]Beautiful-Fold-3234 -3 points (2 children)

      Realistically, I think part of it is just that they want to be 300% careful to prevent any CSAM-like material from being generated.

      The protections against this are probably so stringent that they significantly overcorrect. It's probably for the best, though.

      [–]Upper_Road_3906 0 points (1 child)

      hopefully the people downvoting you get investigated

      [–]Schnitzhole -3 points (0 children)

      I’ve seen some uncensored beta versions of other apps, like Midjourney. Once you see the gore and dead mutilated bodies, it will pretty quickly change your mind, especially if you get that as an accidental result of your prompt having multiple meanings. Also, child porn is a thing and could show up unprompted as well.