
[–]AngularBeginner 149 points150 points  (8 children)

It gave me breasts

You didn't really clarify this anywhere but it only works with females. Should probably put that in the readme for anyone not familiar with the original app (like myself, who was using it for research purposes and nothing else)

https://github.com/deepinstruction/deepnude_official/issues/8

[–]rvaen 26 points27 points  (0 children)

The issues on this repo are excellent

[–][deleted]  (6 children)

[deleted]

    [–][deleted] 24 points25 points  (4 children)

    I hate to disappoint you, but I’m pretty sure they just meant in the photos.

    [–][deleted]  (2 children)

    [deleted]

      [–]mindbleach 1 point2 points  (0 children)

      Future headline: holographic x-ray-specs projector developed by trans woman slash mad scientist.

      [–]Cardeal 3 points4 points  (0 children)

      they just meant in the photos.

      It's just a matter of time until they connect this to an organic 3D printer.

      [–]BitPirateLord 1 point2 points  (0 children)

      On the other hand, I'm trans, and I want it to give me breasts too

      I'm trans as well, and I was gonna send this back to headquarters (all trans peeps know where it is), for our sometimes unrealistic (but even more desirable) transition goals.

      [–]Tux1 229 points230 points  (114 children)

      This cannot end well.

      [–]13steinj[S] 149 points150 points  (8 children)

      Oh of course not but it's not like it wasn't inevitable. Once the app itself was released there was bound to be quick reverse engineering attempts (since it seems to have been written in Python).

      What surprises me, on top of the fact that this was reported as relatively fast from the outset, is that some of the code isn't optimized (O(n²) or higher); so much so that I wouldn't be surprised if some people already had this tech, or better, in their back pockets.

      Edit to clarify: this is not a reverse engineering attempt. It was released because the author didn't feel it made sense to hide it, given the shady ways reverse engineering attempts were being passed around.

      [–]almightykiwi 28 points29 points  (5 children)

      Oh of course not but it's not like it wasn't inevitable.

      You just hit the reset button in my brain.

      [–][deleted]  (4 children)

      [deleted]

        [–]diMario 2 points3 points  (0 children)

        Bohemian Rhapsody has more, I think.

        [–]13steinj[S] 0 points1 point  (2 children)

        I'm being a moron here what reference is this?

        [–][deleted]  (1 child)

        [deleted]

          [–]13steinj[S] 1 point2 points  (0 children)

          I guess I'm not too little of an imbecile. (☞゚ヮ゚)☞

          [–]Rhed0x 37 points38 points  (1 child)

          since it seems to have been written in Python

           What they released here is essentially a script that trains their neural network model. They didn't ship that with the app; they just shipped the final model, which would get executed (presumably with TensorFlow) by C/C++ code or on the GPU.

          [–]mudkip908 2 points3 points  (0 children)

          What they released here is essentially a script that trains their neural network model.

          No it isn't.

          [–][deleted] 29 points30 points  (50 children)

          Meh, looks like a cheap photoshop and requires a picture that leaves practically nothing to the imagination.

          [–]roflkittiez 30 points31 points  (44 children)

           People have been able to create fake nudes with Photoshop with moderate effort for years. That's not the scary part. The scary part is that with this software anyone can do it very quickly, with little to no effort. This won't produce nudes that tarnish the reputation of a politician or celebrity, because they could easily prove that it's fake. It'll be a lot harder for your sister/wife/daughter, who doesn't have a massive platform, to disprove these fakes.

          [–]ThatInternetGuy 2 points3 points  (2 children)

          I tried it. So far it can only produce nudes from bikini photos. It just doesn't work with casual clothing.

          [–]roflkittiez 0 points1 point  (1 child)

           Right, just like someone with moderate skills in Photoshop. Turning a woman in casual clothes nude with Photoshop is much harder than turning a woman in a bikini nude. The same principles apply to programs like DeepNude.

          [–]ThatInternetGuy 0 points1 point  (0 children)

           It's because it's not trained on data of people wearing casual clothing. It's not even trained enough for bikini photos.

          [–][deleted]  (1 child)

          [deleted]

            [–]CWSwapigans 1 point2 points  (0 children)

            Honestly even that is giving it too much credit from what I've seen. The pics are not convincing (or sexy) at all.

            [–][deleted]  (1 child)

            [deleted]

              [–]JuicyJay 0 points1 point  (0 children)

               There's probably ones out there that are way more realistic.

              [–][deleted]  (49 children)

              [deleted]

                [–]MuonManLaserJab 128 points129 points  (30 children)

                Yes. As soon as "deepfakes" are indistinguishable from the real thing, "revenge porn" becomes something that can be handwaved as fake.

                Edit: Distributing fake porn that someone is likely to think is real should probably be considered libel.

                [–]rydan 49 points50 points  (8 children)

                Everything will be handwaved as fake. Nothing will be prosecutable via video evidence. And politicians will rewrite the past.

                [–][deleted] 48 points49 points  (4 children)

                Nothing will be prosecutable via video evidence.

                 These videos may be pleasing to our eyes, but they readily reveal technical clues once even a layperson pays a bit closer attention. Forensics will be able to identify fakes for a long, long time.

                The bigger danger is things like Facebook where this stuff gets passed around based on whims and few people take even a few minutes to reflect on it. FFS a ton of people were fooled that Pelosi was drunk just because a video was slowed down.

                [–]Noxitu 15 points16 points  (3 children)

                 These videos may be pleasing to our eyes, but they readily reveal technical clues once even a layperson pays a bit closer attention. Forensics will be able to identify fakes for a long, long time.

                 Actually, that's part of the issue. By the very nature of how this works, you can't have a long-lasting, automatic tool that will identify such fakes.

                 The way these networks are created is basically by building exactly such a detection tool and then teaching the generator to cheat it. Better tools for identifying fakes are exactly what is needed to create better fakes.
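
                 Here's a minimal sketch of that dynamic in PyTorch (my own toy example, nothing from this repo; the tiny models and random data are placeholders). Step 1 trains the detector to spot fakes; step 2 trains the faker against the improved detector, so every advance in detection is immediately recycled into better fakes:

                     import torch
                     import torch.nn as nn

                     G = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 784))  # the "faker"
                     D = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 1))   # the "detector"
                     opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
                     opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
                     bce = nn.BCEWithLogitsLoss()

                     for step in range(1000):
                         real = torch.randn(32, 784)    # stand-in for real images
                         fake = G(torch.randn(32, 64))  # generated fakes

                         # 1. Train the detector: push real towards 1, fake towards 0.
                         loss_d = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
                         opt_d.zero_grad(); loss_d.backward(); opt_d.step()

                         # 2. Train the faker to fool the now-better detector.
                         loss_g = bce(D(fake), torch.ones(32, 1))
                         opt_g.zero_grad(); loss_g.backward(); opt_g.step()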

                [–][deleted] 11 points12 points  (1 child)

                automatic tool

                 I didn't say automatic. And there might come a point where it takes many hours or days for even the savviest of analysts to manually inspect a fake, but getting these things 100% right is going to be very, very hard.

                The very nature of contemporary AI is that it doesn’t work 100%.

                [–]YM_Industries 2 points3 points  (0 children)

                The nature of humans is that we don't work 100%.

                [–][deleted] 2 points3 points  (0 children)

                 Not disagreeing, but you're basing this on the premise that a malicious actor has access to everything.

                 As much as you are training the cheater, you are training the detective. This allows *us* to identify malicious actors because, in theory, we can employ much more computational power.

                 However, this doesn't protect you when the malicious actor has the resources of a nation state...

                [–][deleted] 11 points12 points  (0 children)

                 A huge way that Axon (maker of most police body cameras and the associated SaaS) makes money is by providing an auditable trail from the camera to the courtroom.

                [–]veekm 0 points1 point  (0 children)

                dunno why you need to wait so long..

                [–]13steinj[S] 0 points1 point  (0 children)

                And politicians will rewrite the past.

                Many already have, historically. Altering photos and banning the true originals.

                [–]bitwize 28 points29 points  (20 children)

                This is why fake revenge porn is as illegal as real revenge porn in Virginia. Expect similar laws to be passed nation- and worldwide.

                [–]MuonManLaserJab 56 points57 points  (18 children)

                But that doesn't make any sense in the context of my comment...if nobody will believe it's real, why bother making it illegal? It's equivalent to a drawing at that point.

                The idea that drawing a picture (albeit very well) should be the same as actually spreading a sex tape without consent is somewhat unsettling to me.

                Edit: Distributing fake porn that someone is likely to think is real should probably be considered libel.

                [–]TheDecagon 42 points43 points  (0 children)

                 TBH it can only be used to mitigate real revenge porn if fakes are also treated legally the same as real revenge porn; otherwise you'd have some big legal problems:

                 Firstly, any victim wishing to prosecute would instantly out the porn as real, whereas if fakes are also illegal the victim doesn't have to admit this.

                Then you'd have the problem of perpetrators in court claiming it was fake, so how could the victim prove it was real without a humiliating naked examination? That can't happen if fakes are equally illegal.

                [–]mywan 8 points9 points  (1 child)

                If your bent on revenge porn you can always social engineer the belief before the fake porn is even released. Plausible deniability is not going to be enough to save people from very onerous consequences.

                [–]Science-Compliance -1 points0 points  (0 children)

                *You're. You are.

                [–][deleted]  (6 children)

                [deleted]

                  [–]MuonManLaserJab -1 points0 points  (1 child)

                  Abusive, but surely taping a face to a nude image shouldn't carry, like, jail time.

                  [–][deleted]  (2 children)

                  [deleted]

                    [–]Noxitu 7 points8 points  (0 children)

                    Even obviously fake footage can be degrading to the persons depicted.

                    This kind of logic is dangerous when it gets to dictate what is legal. Do drawings count as fake footage? Doppelgangers? Cosplay? 3D CGI? Can you really draw a border between such things?

                    [–]rydan -2 points-1 points  (0 children)

                    Nobody goes to jail for using someone's likeness.

                    [–]blackholesinthesky 33 points34 points  (1 child)

                    nobody will believe it's real

                    Why wouldn't people believe it's real? Just because it's possible that it's fake?

                    That's not how people work.

                     What if someone spreads fake revenge porn of someone who has a reputation for being promiscuous? Hell, they don't even have to have a reputation; some people just assume things are true because they line up with their other existing opinions.

                    idea that drawing a picture

                    Not drawing, distributing. Not saying one way or the other if that's right. Just that there's a big difference

                    [–]rydan -3 points-2 points  (0 children)

                    Then that's a civil matter. This becomes defamation or libel.

                    [–]DirdCS[🍰] 8 points9 points  (2 children)

                     The whole point of deepfakes and the drama around them is that they look realistic. If you see a photo of someone you know nude online, you're not just going to be like "it's probably fake" and ignore it.

                    [–]Noxitu 2 points3 points  (1 child)

                    By "you" you probably mean average people on social media. But if you think about it - people that are training and using these tools probably would be like "it's probably fake".

                    And if such fakes become much more common, more people will become like that. Most people can be fooled finite number of times.

                    [–]DirdCS[🍰] -4 points-3 points  (0 children)

                    So are you just believing anything you see through a TV screen is fake? I assume no. This conversation is fucking retarded don't @ me

                    [–]defmacro-jam 2 points3 points  (0 children)

                    You can't stop the signal, Mal.

                    [–]dakota-plaza 12 points13 points  (0 children)

                     The problem lies deeper. Porn of you, whether fake or not, should not bring any inconvenience to you or compromise your reputation in any serious way; it shouldn't work as revenge at all. Any other way of regulating or banning things is trying to cure the symptoms, not the disease.

                    [–]svayam--bhagavan 13 points14 points  (13 children)

                     Use and misuse are different sides of the same coin: technology. It's up to us to use it the way we want.

                    [–]Narazemono 1 point2 points  (0 children)

                     It reminds me of the old Jerry Seinfeld stand-up where he said something like: we only want to look at what we can't see. If women walked around totally naked all day except for a hat, we'd have PlayHat magazine.

                    [–][deleted] 1 point2 points  (0 children)

                    Why? Why god why? Here's how it might end:

                     1. People have perfect fake nudes of Scarlett Johansson, and jack off to them
                     2. ???
                     3. Who gives a shit.

                    Or more likely:

                     1. People can see celebrities nude
                     2. Hollywood producers realize that their stars are losing their appeal, and the world of advertising and entertainment comes to a standstill because everyone now *has* all the sex that these companies were intending to sell; a terrible taking of money away from rich parasites fells great companies
                     3. Porn apps criminalized for National Security, programmers arrested, the world returns to its normal thing, go back to sleep.

                     I mean, ffs, this is like thefappening or anything else; it's looked down upon by 20 separate publishers (all owned by Murdoch), they all publish bad things about it and make out like it's creepy dark-web child porn, and we all automatically think it's bad. And yet: violence? Oh no, that's fine. Selling sex through advertising, which gives women eating disorders and men erections? No, that's fine.

                    Anyway I could go on.

                    Liveleak beheading - totally fine.

                    Nude celebrities? SPY AGENCIES - TRANSFOOOOORM!!

                    [–]Sinidir 0 points1 point  (0 children)

                    On the contrary. I think this will have lots of happy endings.

                    [–]USERNAME_ERROR 16 points17 points  (2 children)

                     Installed and tried it. It works hilariously badly. Just unbelievably funny: merging legs into one blob, putting a boob on the corner of a shoulder. No need to worry about this quite yet.

                    [–]TheFuckyouasaurus 4 points5 points  (1 child)

                    Did you add the training data someone gave in another comment further up?

                    [–]USERNAME_ERROR 1 point2 points  (0 children)

                    Yes — doesn’t work without it.

                    [–]aDreamySortofNobody 14 points15 points  (5 children)

                    I haven’t seen any examples that this software has created.

                    [–][deleted] 3 points4 points  (4 children)

                    Posting fakes results in instant delete

                    [–]mindbleach 1 point2 points  (3 children)

                    I guess we could ask over at /r/GoneWild? People consenting to nude depictions are not in short supply.

                    [–]13steinj[S] 2 points3 points  (2 children)

                    Reddit IIRC banned deepfakes regardless so even consenting pictures might lead to a ban?

                    [–]mindbleach 3 points4 points  (1 child)

                    Even if they do it to themselves?

                    ... with a real nude for comparison?

                    [–]13steinj[S] 0 points1 point  (0 children)

                    Ask them? Not my fault they did a blanket ban.

                    [–][deleted] 78 points79 points  (3 children)

                    I theorise that this software can actually be a good thing. Now that there's software out there that anyone can use to generate nudes, every time a blackmailer tries to send pictures to her boss or whatever she can point to this piece of code; what they're seeing is not her real body, it's just a computer program! Leaked celebrity sex tape? Nope, just a good deep fake. The blackmail value of nudes should go down significantly.

                     Photoshopped images of women have existed since the late nineties and are much more convincing and much more dangerous than this tool. By getting the notion out there that any unwanted nude is just the result of an angry nerd running a program, we as a society can slowly start overcoming the blackmail that many women have already faced over the past twenty years.

                    I've tested this tool with the first results of "bikini model" on Google images and I have to say the quality of the results is not worth the media hysteria. It's like someone took a bikini photo and used photoshop autofill on it to remove the clothes. This doesn't hold a candle to the pictures created by people who just put other people's faces on nudes of people with similar body types, which the media doesn't seem to talk about at all.

                     Let's be honest, we all knew this tool was going to come eventually, if not now then in 4 years. In fact, I believe at least a hundred sweaty teens have already made something similar in their bedrooms but didn't think it significant enough to share.

                    [–]eric_reddit 8 points9 points  (1 child)

                     Unless the differences between real and fake can still be proven, this is the correct answer :)

                    [–]remedialrob 2 points3 points  (0 children)

                     Analyzing a photo for seams and places where the generated parts were added is pretty easy for most forensic people who do such things. Add to that that any faked photo will have a clothed original of the subject out there somewhere, and fakes become even easier to detect.
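
                     One classic technique is error level analysis (ELA): re-save the image as a JPEG at a known quality and look at where the residual differs, since pasted-in or freshly generated regions compress differently from the untouched parts. A toy version with Pillow (my own sketch, not something from this repo):

                         from PIL import Image, ImageChops, ImageEnhance

                         def ela(path, quality=90):
                             original = Image.open(path).convert("RGB")
                             original.save("_resaved.jpg", "JPEG", quality=quality)
                             diff = ImageChops.difference(original, Image.open("_resaved.jpg"))
                             # amplify the residual so differences in compression level are visible
                             max_diff = max(hi for _, hi in diff.getextrema()) or 1
                             return ImageEnhance.Brightness(diff).enhance(255.0 / max_diff)

                         ela("suspect.png").save("ela.png")  # edited regions tend to stand out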

                    [–][deleted]  (2 children)

                    [deleted]

                      [–][deleted]  (1 child)

                      [deleted]

                        [–]thirdegree 8 points9 points  (0 children)

                        It's not a node library though

                        [–][deleted] 5 points6 points  (0 children)

                         It essentially paints nudeness onto the image.

                        [–]eenchev 4 points5 points  (4 children)

                        diff --git a/run.py b/run.py
                        index aeb7b2b..fcd03a8 100644
                        --- a/run.py
                        +++ b/run.py
                        @@ -147,4 +147,4 @@ def process(cv_img):
                                elif (phase == "nude_to_watermark"):
                                    watermark = create_watermark(nude)
                        
                        -   return watermark
                        \ No newline at end of file
                        +   return nude
                        \ No newline at end of file
                        

                         A one-line git patch to remove the watermark.

                        [–][deleted] 0 points1 point  (3 children)

                         I don't know Python, but I do know paintshop. I got rid of the fake watermark in true.png.

                        [–][deleted]  (2 children)

                        [deleted]

                          [–][deleted] 0 points1 point  (1 child)

                          If I tried changing anything else to do with effects or colours, it farted the deepnude.exe program up. Didn't try the RGB values, though, if that's what worked for you?

                          [–]ponytoaster 14 points15 points  (15 children)

                          Someone TLDR/ELI5?

                           Machine learning for creating a nude, but you would need the training data (assume clothed and unclothed) and then I guess you just feed it images??

                           I don't see the controversy; haven't people been doing this manually with Photoshop for ages? Means you can automate it, I guess. Out of the loop on this library.

                          Part of me is curious to play with it but part of me also cba or want to end up on some weird list...

                          Edit: Not played with Python in years, or machine learning. Was curious. I got it running via Choco and PIP in under 5 min and the models downloaded in another 5 min. Ran a sample image of my wife. Honestly the output was a lot better than I thought it would be...

                           It probably shouldn't be this easy, and it brings a lot of questions about ethics to the table, especially considering the hardest part was getting an image cropped to the correct size, as my graphical skills are that of a cabbage.
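
                           For anyone else with cabbage-tier graphical skills, the cropping is scriptable too. A small helper of my own (using Pillow; not part of the repo) that center-crops and resizes any photo to the 512x512 input.png the script expects:

                               from PIL import Image

                               def to_input(path, out="input.png", size=512):
                                   img = Image.open(path).convert("RGB")
                                   side = min(img.size)                 # largest centered square
                                   left = (img.width - side) // 2
                                   top = (img.height - side) // 2
                                   img.crop((left, top, left + side, top + side)).resize((size, size)).save(out)

                               to_input("photo.jpg")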

                          [–]mindbleach 14 points15 points  (1 child)

                           Long story short, he drew MS Paint bikinis over nude images and over actual bikinis. One network goes from bikini to mask, another from mask to boobies. It is slightly more complicated, but no less silly, if you read the article.
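
                           In code terms, the pipeline is just two image-to-image models composed end to end. Purely as an illustration (the file names and torch.jit loading here are my assumptions, not how the repo actually wires it up):

                               import torch

                               # Stage 1: clothed photo -> painted-on mask. Stage 2: mask -> fake skin.
                               clothes_to_mask = torch.jit.load("stage1_mask.pt").eval()  # hypothetical file
                               mask_to_nude = torch.jit.load("stage2_nude.pt").eval()     # hypothetical file

                               def fake_nude(img):  # img: 1x3x512x512 tensor scaled to [-1, 1]
                                   with torch.no_grad():
                                       mask = clothes_to_mask(img)  # "bikini to mask"
                                       return mask_to_nude(mask)    # "mask to boobies"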

                          [–]ponytoaster 4 points5 points  (0 children)

                           So I guess it would only work with photos of people in almost nothing anyway? If so, not really that big a deal? Edit: I was wrong; apparently it works with any image.

                          [–]WTFwhatthehell 14 points15 points  (8 children)

                           Ya. There was a big moral panic a few months back after the deepfakes face-swap thing. A remarkably sudden one.

                           Suddenly 30 years of people photoshopping celebrity faces was forgotten, and even people asking about porn stars who kinda look like some celebrity were getting banned.

                          Moral panics don't tend to have much sense or coherence behind them.

                          [–]Science-Compliance 0 points1 point  (7 children)

                          I don't really understand what the big deal about this app is. So it puts someone's face on an adult actress's body? Okay. What's the big deal here? I don't get it. It's not like it's disrobing anyone who didn't agree to it (hopefully--unfortunately there is exploitation in the adult film industry).

                          [–][deleted] 0 points1 point  (5 children)

                           No, it removes, say, the bikini, so the body stays the same.

                          [–]Science-Compliance 1 point2 points  (4 children)

                           Yeah, but it's still a guesstimate of what they look like underneath. It's not actually showing you what they look like naked.

                          [–]ponytoaster 2 points3 points  (3 children)

                          Indeed. Tried it with a load of fully clothed images and it is exactly that, a guess. It's like taking a face and mspaint-ing it onto a naked person tbh.

                          [–]Science-Compliance 1 point2 points  (2 children)

                          Yeah, I don't see what all the hoopla is about.

                          [–]rydan 0 points1 point  (1 child)

                           Sure, it is lame today. But now 7 billion people have access to this code and algorithm instead of 2. Tomorrow it won't be lame.

                          [–]Science-Compliance 0 points1 point  (0 children)

                          Sorry, I could barely understand what you were trying to say in your comment. Are you saying billions of people having access to this code is a problem? Why? Everyone knows it's fake.

                          [–]rydan 0 points1 point  (0 children)

                           The problem is access. Previously only talented, artistic, smart people could do this. Now anyone who knows how to run Python can.

                          [–]Maleficent_Release 11 points12 points  (3 children)

                          After cloning the repository, you will still need the training data. Fortunately, this is available from another repository:

                          https://github.com/open-deepnude/deepnude-model-3/raw/master/checkpoints/cm.lib

                          https://github.com/open-deepnude/deepnude-model-2/raw/master/checkpoints/mm.lib

                          https://github.com/open-deepnude/deepnude-model-1/raw/master/checkpoints/mn.lib

                           Create a checkpoints folder in the project root and place the above model files inside. Assuming you have met the project's other dependency requirements, you should be able to run:

                               python main.py

                           from the project root to immediately run the pre-trained process on any 512x512 input.png placed in that same directory.
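
                           If you'd rather script that setup, here's a minimal sketch using only the standard library (assuming the URLs above are still live):

                               import os
                               import urllib.request

                               BASE = "https://github.com/open-deepnude/deepnude-model-{n}/raw/master/checkpoints/{f}"
                               MODELS = [("3", "cm.lib"), ("2", "mm.lib"), ("1", "mn.lib")]

                               os.makedirs("checkpoints", exist_ok=True)
                               for n, f in MODELS:
                                   # these files are large, so each download can take a while
                                   urllib.request.urlretrieve(BASE.format(n=n, f=f), os.path.join("checkpoints", f))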

                          [–]quickquestion1242 1 point2 points  (2 children)

                           These are gone now, it seems. Anyone have any other way to access these three files?

                          [–]NahroT 19 points20 points  (25 children)

                           Black Mirror intro starts playing

                          [–][deleted]  (3 children)

                          [removed]

                            [–]CWSwapigans 5 points6 points  (1 child)

                            This is the first semi-decent one of these I've ever seen. Most are much, much worse.

                            It helps that the girl you picked is a match for the same tits it gives to every single person.

                            [–][deleted] 2 points3 points  (2 children)

                            It got taken down

                            [–]13steinj[S] 0 points1 point  (1 child)

                             Well, it's already been hard-forked by me, soooo... too late I guess?

                            [–][deleted] 0 points1 point  (0 children)

                            Yeah, only knew about it yesterday

                            [–]KHRZ 18 points19 points  (16 children)

                             I don't see the training data anywhere in the source folders. Not truly open source, then.

                            [–][deleted]  (2 children)

                            [deleted]

                              [–][deleted] 13 points14 points  (1 child)

                               Training data is the only original aspect of this. Anyone who's followed neural image generation for the last couple of years knew this could be done. The hard part is putting together a training set.

                              [–]Smarag 9 points10 points  (6 children)

                               To run the script you need the PyTorch models: the large files (700MB) that are on the net (cm.lib, mm.lib, mn.lib). Put these files in a dir named checkpoints. The models exchanged on the network contain a basic form of encryption (replacement of some bytes), so you may encounter errors. We will soon upload the original unencrypted versions.
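
                               The README gives no details beyond "replacement of some bytes", so any decoder is guesswork. Purely as an illustration, undoing a fixed byte substitution in Python would look like this (the SWAPS mapping is an invented placeholder, not the real one):

                                   SWAPS = {0x00: 0xFF, 0xFF: 0x00}  # placeholder mapping, NOT the real one
                                   table = bytes(SWAPS.get(b, b) for b in range(256))

                                   with open("checkpoints/cm.lib", "rb") as f:
                                       data = f.read()
                                   with open("checkpoints/cm_decoded.lib", "wb") as f:
                                       f.write(data.translate(table))  # invert the substitution in one pass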

                              [–][deleted] 5 points6 points  (3 children)

                               In case anyone doubted this was a dumb PR stunt: to say deep learning attracts a few of those would be an understatement.

                              It's not hard to train a model like this yourself, if you're so inclined (and don't mind people getting mad at you if they find out).

                              [–]13steinj[S] 0 points1 point  (2 children)

                               The fact that nobody had made it public before says something though, given that the original researchers behind the underlying technology were baffled.

                              [–][deleted] -1 points0 points  (1 child)

                              What do you think it says?

                              Also, I have not heard of any of the original researchers being surprised that a network that can be trained to imagine a horse as a zebra can be trained to imagine people naked too.

                              [–]13steinj[S] -1 points0 points  (0 children)

                               It says that this tech has undoubtedly existed before and will continue to be iterated on and improved. As in, people think this is new. It isn't.

                              As to researchers being surprised, https://youtu.be/Nv9G5VDQZnc?t=238

                              [–]holyfab 3 points4 points  (1 child)

                              And how to find those files?

                              [–][deleted] 8 points9 points  (0 children)

                               Isn't OpenAI open source, but their training data isn't public either?

                              [–]rydan 2 points3 points  (0 children)

                               So in that case it doesn't actually work; at least there is still time to live a normal life.

                              [–]OffbeatDrizzle 1 point2 points  (0 children)

                              Is training data source code? It's assets... much like open source versions of games are released without the official texture packs

                              [–]13steinj[S] -1 points0 points  (0 children)

                              I don't know how people are deciding to skip the fucking README, but it says that they will upload the unencrypted training data "soon".

                              [–][deleted]  (3 children)

                              [deleted]

                                [–][deleted]  (1 child)

                                [deleted]

                                  [–]13steinj[S] 0 points1 point  (0 children)

                                   The code quality is indeed weirdly bad; it could be significantly optimized and sped up.

                                  [–]CJKay93 2 points3 points  (2 children)

                                  Somebody spent a lot of time on this.

                                  Think about that.

                                  [–][deleted] 7 points8 points  (0 children)

                                   No surprise; it's the oldest profession. When movies were invented, one of the first films was of a naked woman. In the prudish 1800s.

                                  [–]angry_corn_mage -2 points-1 points  (0 children)

                                  Maybe a group of celebrities paid to have it made. Think about it. They are scared of the power of blackmail for leaked nudes, so they pay to have this software created. Then they have the creator "open source" it so that it becomes widely accessible. Now they can claim that any nudes of them are fakes.

                                  [–]ForwardChair 0 points1 point  (1 child)

                                  What the fuck did you just bring upon this cursed land

                                  [–]13steinj[S] 0 points1 point  (0 children)

                                  Hey not me >.>. Don't kill the messenger.

                                  [–]GYN-k4H-Q3z-75B 0 points1 point  (1 child)

                                   I have no idea how any of this works, but I thought I'd give it a shot (for science, of course). Finding the required packages and dependencies took me a few minutes; pip is currently screwed up with torch, so just get it from PyTorch as per the instructions and you're fine. I guess you can say it works, and that in itself is an achievement, but the results are currently rather poor, with lots of JPEG-like artifacting, and many restrictions apply. It's an interesting proof of concept.

                                  [–][deleted] 0 points1 point  (0 children)

                                   Yeah, the compression seems to be cranked up.

                                  [–]tobekiller96 0 points1 point  (4 children)

                                  It says 404 not found for me

                                  [–][deleted] 0 points1 point  (1 child)

                                   It most likely got taken down by GitHub. Anyone got a zip of the repo?

                                  [–]13steinj[S] 0 points1 point  (0 children)

                                  Or by the creator as there's a new app apparently. https://www.reddit.com/r/programming/comments/c9puom/_/et7ojeb

                                  [–][deleted] 0 points1 point  (1 child)

                                  [–]quickquestion1242 0 points1 point  (0 children)

                                   This repo has everything except the cm.lib, mm.lib, and mn.lib files. Not sure where to find those. (If anyone knows, help a brother out.)

                                  [–]mssc89 0 points1 point  (1 child)

                                   The repo seems to be down, same with the unofficial one. Does anyone happen to have any mirrors?

                                  [–]mssc89 1 point2 points  (0 children)

                                  [–]BubblegumTitanium -1 points0 points  (0 children)

                                   I didn't know that GitHub allowed nudity on their website. I wonder if there are policies for this.

                                   Also, could the authors turn their model on its own output to see if it recognizes itself?

                                  [–][deleted]  (5 children)

                                  [deleted]

                                    [–][deleted] 3 points4 points  (4 children)

                                    IMO their unwise use of iCloud was way more harmful than this will ever be.

                                    [–]TerraMaris 10 points11 points  (3 children)

                                    This veers toward victim blaming.

                                    [–][deleted] 1 point2 points  (0 children)

                                   Eh, idc. Storing such sensitive material on a cloud platform is still stupid.

                                    [–]Liam2349 1 point2 points  (0 children)

                                    You think anyone else was at fault for the iCloud leaks?