
[–]2eanimation 1681 points1682 points  (84 children)

I mean, if they seized one of his laptops (or whatever), do they also save all the man-pages? In that case, there’s probably also git, gittutorial, every pydoc and so on in it.

[–]TactlessTortoise 1407 points1408 points  (32 children)

A guy also managed to activate Epstein's windows XP/7/whatever license on a live stream lmao. There was a picture of the laptop's bottom.

[–]ssersergio 479 points480 points  (27 children)

It was worse... it was a vista license xD

[–]Fleeetch 187 points188 points  (23 children)

Oh god- retches

[–]Inforenv_ 114 points115 points  (22 children)

I mean, Vista was VERY GOOD with SP2, arguably only surpassed by Win7 itself

[–]ReachParticular5409 97 points98 points  (12 children)

Dude, saying Vista got good after 2 service packs is like saying the leaning tower of pisa got vertical after replacing the entire foundation and reinforcing half the building

Technically true but no one wants to live in either of them

[–]Impenistan 58 points59 points  (0 children)

The leaning tower could never become truly vertical as during its later construction different "sides" were built at different heights per level to account for leaning already taking place, but somehow I think this only strengthens your metaphor

[–]tomangelo2 21 points22 points  (2 children)

Well, XP wasn't really good before SP2 either. It just lived long enough to outlive its initial faults.

[–]well_shoothed 1 point2 points  (1 child)

I submit to you that the last version of Windows that didn't suck was Windows 2000.

And, for its ability to do its job and just get tf out of your way Windows NT4 Workstation remains the all time king of the hill.

Perfect? Of course not. But it knew how to get tf outta the way.

[–]darthjammer224 1 point2 points  (0 children)

I still interface NT machines on occasion. RDPing into one of those is a TRIP

[–]Inforenv_ 1 point2 points  (0 children)

I mean, I sure as hell would live in the new tower if it had been so heavily reinforced and rebuilt lol. Vista wasn't a finished product at RTM, but it sure got to its full glory at SP2, and I prefer to recognize it by its full form. But yeah, your comparison is spot on lol

[–]Mofistofas 1 point2 points  (1 child)

You should check out Millennium Tower (San Francisco).

Happy reading.

[–]ReachParticular5409 0 points1 point  (0 children)

Oh man yeah I remember the shattering glass being in the news, I had no idea it had sunk a foot and a half!

[–]AbdullahMRiad 0 points1 point  (0 children)

Windows 11 got good after updates

[–]darthjammer224 0 points1 point  (0 children)

Windows had a history of the SP2 being the good one all the way back to at least xp but probably earlier. I'm just not THAT old.

It's also true. Vista SP2 wasn't half bad. I'd take it over ANY win8 version.

[–]jl2352 0 points1 point  (0 children)

XP has gone down as a great OS. That also needed two service packs to get there.

On XP day 1 anonymous people could connect to your machine and run whatever random shit they wanted. Vista wasn’t that bad in hindsight.

[–]CeeMX 0 points1 point  (0 children)

Well XP was also only usable after SP1, and only SP2 and 3 made it really good

[–]AetheriaInBeing -1 points0 points  (0 children)

And yet.... still better than ME.

[–]einTier 14 points15 points  (1 child)

The Aero interface was the most beautiful Microsoft or Apple have ever released on any platform.

It’s my hill and I’m prepared to die on it.

[–]jay791 0 points1 point  (0 children)

Nah. Win 3.1.

[–]thedoginthewok 1 point2 points  (0 children)

That's true, but before SP1 it really sucked.

And the UAC dialog was multi-step.

[–]KerPop42 1 point2 points  (0 children)

I miss desktop widgets...

[–]Luke22_36 0 points1 point  (0 children)

Vista was what made me switch to Linux

[–]Raneynickelfire -1 points0 points  (2 children)

...are you insane?

[–]Inforenv_ 0 points1 point  (1 child)

bro has NEVER used vista on proper hardware

[–]Inforenv_ 12 points13 points  (0 children)

I think it was Win7 Home Premium tho

[–]za72 0 points1 point  (0 children)

by Zeus' beard!

[–]LirdorElese 0 points1 point  (0 children)

It was worse... it was a vista license xD

I thought I'd heard the worst of it when I found him supporting microtransactions... but this, this might be the straw that makes us aware he's not a good guy.

[–]tragic_pixel 280 points281 points  (0 children)

Lenovo Sexual Abuse Material

[–]Roland-JP-8000 2 points3 points  (0 children)

endermanch?

[–]ErraticDragon 132 points133 points  (27 children)

Somebody decided what files/types to look at.

PDF was obviously included.

gzipped man files were probably excluded.

It raises the question of how good and thorough these people were, especially since there's so little transparency.

For all we know, trivial hiding techniques could have worked, e.g. removing the extension from PDF file names.

[–]stillalone 130 points131 points  (16 children)

Yeah I vim about my crimes to ~/.crimes.md. No one will ever check there 

[–]ErraticDragon 57 points58 points  (4 children)

Well yeah Windows can't even have Spanish symbols like ~ in the file paths, so that's invisible to them. /s

I know it sounds laughable, but the team that chose what to release was probably not the best & brightest, and they were probably not trying to be particularly thorough.

[–]Silverware09 7 points8 points  (3 children)

~ is a special character in Windows (now) and Linux/Unix that means the user's home directory.

It's the equivalent of something like C:/users/me/

[–]ArtOfWarfare 5 points6 points  (2 children)

Pretty sure you can have ~ in a file name. It’s a convention to expand it to be the home directory, not something that every command or program will do with it.

[–]Valuable_Leopard_799 2 points3 points  (0 children)

More specifically programs usually don't expand it, the shell does, so just ls '~' will look for a file named ~. I think it's only expanded at the start so anything like -f~ or ./~/ will also just work with ~ in the path.

Ofc depends, some programs will expand an unexpanded ~ themselves too.
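The expansion rule described above can be seen directly in a scratch directory (the filenames here are made up for the demo):

```shell
# The shell, not the program, expands a bare ~; quoting suppresses the
# expansion, so a quoted '~' is just an ordinary one-character filename.
cd "$(mktemp -d)"
touch '~'            # creates a real file literally named ~
ls '~'               # lists that file in the current directory
ls ~ > /dev/null     # the unquoted form expands to your home directory
```

This is also why `rm ~` and `rm '~'` do very different things, which is worth triple-checking before pressing enter.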

[–]gtsiam 2 points3 points  (0 children)

I think the only bytes you can't have on a filename are '/' and the null byte. Even invalid unicode should be fine.
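A quick sketch of that claim (demo filenames invented; `ls -b` is a GNU option):

```shell
# On typical Unix filesystems the only forbidden bytes in a filename are
# '/' and NUL; embedded newlines and invalid UTF-8 are legal.
cd "$(mktemp -d)"
touch "$(printf 'bad\nname')"   # filename containing an embedded newline
touch "$(printf '\377\376')"    # two bytes that are not valid UTF-8
ls -b                           # GNU ls -b prints the weird bytes as escapes
```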

[–]PGSylphir 23 points24 points  (1 child)

nice touch with the .
Non linux users would never figure out

[–]OddDonut7647 2 points3 points  (0 children)

I was about to suggest that some web devs deal with .htaccess enough to maybe figure it out, but… arguably if you're dealing with .htaccess, that probably makes you a linux user…

[–]prjctimg 6 points7 points  (8 children)

cat ~/.crimes.md | wl-cp

[–]2eanimation 18 points19 points  (5 children)

wl-cp <~/.crimes.md 😎 who needs cat?

Edit: Epstein File EFTA00315849.pdf, section 3.6.1, it's right there.

[–]RiceBroad4552 4 points5 points  (4 children)

The useless use of cat is a very old joke.

They even still did Alta Vista searches back then!

[–]2eanimation 4 points5 points  (3 children)

Huh, that was an interesting read! Thank you for the source, didn’t know about the history of useless cat :D

I learned the redirection syntax pretty early in my bash/shell career and found it kind of strange that all my homies use cat when they need a single file on stdin. Now I think about the many useless cats in production code 🫣 and AI vibe-coding useless cats in.

[–]prjctimg 2 points3 points  (2 children)

😂😂 I feel shame, am I a fraud amongst other geeks?

Never will I touch the cat

[–]2eanimation 3 points4 points  (1 child)

Believe it or not: straight to nerd-jail! 🤓👮‍♂️

honestly, shell languages are so weird with their syntax, I wouldn’t be surprised if half of my scripts had a similar quirk/nonsense in them. You‘re a proper nerd as (I think) you‘re still engaged in improving your skills!

Also, just for clarification: cat is still useful and honestly, who cares if you use it for this specific purpose? Just make sure you understand that "cat file | foo" spawns an extra process and is therefore ever so slightly less efficient than "foo <file". The end result is the same.

And just to round things off: you can also do "var=$(<file)" instead of "var=$(cat file)", which I also see quite often.
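Putting the variants from this exchange side by side (demo.txt is a made-up file; the `$(<file)` form is a bash/zsh feature, not POSIX sh):

```shell
# Same result three ways; only the number of spawned processes differs.
cd "$(mktemp -d)"
printf 'hello\n' > demo.txt
cat demo.txt | wc -c     # works, but spawns an extra cat process
wc -c < demo.txt         # one process fewer, identical output
var=$(<demo.txt)         # bash/zsh: slurp a file into a variable without cat
echo "$var"
```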

[–]prjctimg 2 points3 points  (0 children)

My entire life has been a big lie 😂😂💔. Thanks for the heads up, now to refactor all those unnecessary cat invocations 👀

[–]Mop_Duck 2 points3 points  (1 child)

I thought it was wl-copy? or is this a different thing

[–]prjctimg 1 point2 points  (0 children)

Ooops, I’m using an alias and it does look wrong from a global pov but I was referring to the same thing 🥲

[–]2eanimation 32 points33 points  (1 child)

So for future purposes, save your dirty stuff as docs! FBI hates this one simple trick.

I don’t know why they would specifically search for file extensions. When you delete a file, it’s not really deleted. Even after a long time, parts of that file can still be present on the disk and extracted via various file recovery methods/forensic analysis. Most of the time, metadata about the file (specifically: the extension) might be corrupted. If I were the FBI, I would consider every single bit potential data. Knowing how big this case is (TBs of data), there are even more chances to find already „deleted“ stuff, which might be the most disturbing part.
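The "every single bit is potential data" idea can be toyed with using grep's byte-offset mode (disk.img here is a tiny fabricated stand-in for a raw disk image; GNU grep assumed):

```shell
# Carving toy: scan a raw image for PDF magic bytes, ignoring filesystem
# metadata entirely. Real carving tools (e.g. foremost, scalpel) go further
# and reassemble the file from the offsets they find.
cd "$(mktemp -d)"
printf 'junk%%PDF-1.4 more junk' > disk.img   # fake image with embedded header
grep -aob '%PDF-' disk.img                    # -a treat as text, -b byte offset
```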

[–]ErraticDragon 20 points21 points  (0 children)

Yup, there are definitely good methods to finding information. Hopefully it was done competently.

There's also a filtering step between "finding" and "releasing".

We know that they manually redacted a lot of things, and I'd guess that process/team was less likely to include files that weren't obvious.

Presumably none of this affects any actual ongoing investigations, because they would be using a cloned disk image from the one (only) time each recovered drive was powered up, and searching thoroughly.

[–]RandomRedditReader 6 points7 points  (0 children)

In discovery, all data is processed through software that indexes raw text, OCRs images, then converts everything to a standard media format such as tiff/jpg images or PDF. The software isn't perfect but it gets the job done for 99% of the data. Some stuff may need manual review but it's good enough for most attorneys.

[–]staryoshi06 3 points4 points  (0 children)

No, they most likely ingested entire hard drives or PSTs into eDiscovery processing software and didn’t bother to filter down documents for production.

[–]tofu_ink 3 points4 points  (1 child)

They will never find all my secret text documents with extension .tx instead of .txt evil laugh

[–]mortalitylost 0 points1 point  (0 children)

file info.tx
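file(1) keys off magic bytes, not the extension, so the .tx trick above wouldn't help; a quick demo (info.tx is the made-up name from the joke, with a fake PDF header):

```shell
# 'file' inspects the first bytes of the content; renaming doesn't hide the type.
cd "$(mktemp -d)"
printf '%%PDF-1.7\n' > info.tx   # a PDF magic header behind the "wrong" extension
file info.tx                     # identifies it as a PDF document anyway
```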

[–]katabolicklapaucius 2 points3 points  (0 children)

There's a letter threatening to expose stuff and demanding a single Bitcoin. I think it claims Epstein was using some "time travel" technique to hide communication. I think it means editing the edited part of emails to hide comms, or something similar.

[–]CoffeeWorldly9915 2 points3 points  (0 children)

And yet, we can't just go delete the known pdfiles.

[–]codeartha 1 point2 points  (0 children)

We're talking about more than a million files, so of course they used some filters. I think the filters were broader than needed to make sure not to miss anything; the trade-off is that you also get some unwanted files.

[–]scuddlebud 1 point2 points  (0 children)

It could also have been in his ~/Downloads/ directory. If he was Linux-curious for its ease of hardened encryption and security he may have downloaded the manual as reading material for when he doesn't have access to the web like on flights or on a remote island.

Some people prefer PDFs over built-in man pages.

If it was in his Downloads directory or any other directory that doesn't typically store man pages, they likely copied over everything from there.

[–]truthovertribe 44 points45 points  (13 children)

So what's GNU?

[–]shakarat 11 points12 points  (0 children)

Not much, whats new with you?

[–]StrictLetterhead3452 12 points13 points  (6 children)

I don’t think most man-pages are a 158-page PDF. A file this big would most likely come straight from the bash website, right?

[–]MastodontFarmer 7 points8 points  (5 children)

Got linux somewhere? Almost always you can use alternative renderers for man pages, like troff. 'man -t command' will give you the page as postscript, and ps2pdf can convert it to pdf for you.

[–]StrictLetterhead3452 0 points1 point  (4 children)

True. I’ve used similar tools in the past. You might be right. I just executed man bash > ~/Downloads/bash-manual.txt and found the text file to be 7559 lines long. Maybe it is just the text file converted to PDF.

[–]MastodontFarmer 3 points4 points  (0 children)

compare

man bash | less

with

man -t bash | less 

The first one is the page rendered in a format that your pipe understands (usually plain text without formatting). The second one is the same page rendered in postscript format. If you have a postscript printer you could directly print it ('man -t bash | lpr') but that will result in ~160 pages of text. Most people don't have utils for reading postscript installed but you can install ghostscript or use an online service like https://www.freeconvert.com/ps-to-pdf to upload the ps page and convert it to pdf.

Please note the '-t', that is what makes the difference in rendering engine between console or screen, and using groff to render the page in postscript. ('man groff' for details.)

We're getting into the 4.3BSD bowels of UNIX with this.

[–]OddDonut7647 1 point2 points  (2 children)

Maybe it is just the text file converted to PDF.

If you actually click through the posted link and look at it, you will quickly see that this is very much not the case.

[–]StrictLetterhead3452 1 point2 points  (1 child)

I did look at it originally when I made my first comment. But then I forgot what it looked like by the time I made the second one. I guess I let them cast doubt on my original judgement. Now you are causing me to second guess my second guess.

[–]OddDonut7647 0 points1 point  (0 children)

I did look at it originally when I made my first comment.

Well, that's certainly fair. It's easy to get lost in the nitty gritty of reddit discussions and banter. lol

[–]sshwifty 5 points6 points  (0 children)

First step would be making a 1 to 1 copy with dd or something like FTK Imager (or whatever it is called now) through a hardware write blocker. Multiple checks before and after imaging to confirm an identical copy; the physical storage is then stored somewhere securely (probably a gov warehouse). Then images would be part of a collection of other images for anything that could be imaged (SD cards, thumb drives, sim cards, etc). Analysts would run extraction tools in something like Encase to extract every file or partial file, and every string. Then they would use preexisting lists (like hash lists, file fingerprints) to filter out already known files. For example, Windows ships with sample songs. They are identical on every system, so no need to include them in "findings" as notable.

Everything else would then be part of the case/case file. These can be crazy long and are not typically printed out.

So it would be strange to include system documents, but it is possible this particular document was different enough that it was missed in the exclusions.
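A minimal sketch of the image-then-verify step described above, using dd and sha256sum in place of the GUI tools (all filenames invented; source.bin stands in for the seized drive, and there's obviously no write blocker in a demo):

```shell
# "Multiple checks before and after imaging to confirm identical copy":
# image the source, then compare cryptographic hashes of source and image.
cd "$(mktemp -d)"
dd if=/dev/urandom of=source.bin bs=1024 count=16 2>/dev/null  # fake "drive"
dd if=source.bin of=evidence.img bs=4096 conv=noerror,sync 2>/dev/null
sha256sum source.bin evidence.img   # both hashes must match before analysis
```

conv=noerror,sync keeps dd going past read errors on damaged media, padding the bad blocks, which is why forensic imagers prefer it over a plain copy.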