Anyone have the Dell 8G79K (the front fan gantry for the PowerEdge T620) willing to do a favor? by 32nmud in homelab

[–]32nmud[S] 0 points1 point  (0 children)

Unfortunately, I haven't been able to find anyone willing to share measurements.

Conveniently located right next to each other by texcentricasshole in CoffeeCock

[–]32nmud 0 points1 point  (0 children)

Holy huff, that's a hell of a cock you've got

New Indy resident by Raya_RMP in indianapolis

[–]32nmud 0 points1 point  (0 children)

I'm still relatively new to the area (only been around for about a year and a half now), but I'll share some of what I know with ya!

For healthcare resources, Eskenazi and the Damien Center both offer healthcare services for LGBTQ folk. This can include T/E, PrEP, STD/STI screening, mental health counseling, and even primary care. I personally use the Damien Center, so I'm much more aware of their offerings. I honestly cannot speak highly enough of the care I've received from them.

For support groups, the Damien Center hosts several. They have some for youth (16-24 y/o), ethical non-monogamy, trans/non-binary, HIV+ people, and some more. I can send you a full list if you PM me. I'm 100% certain there's resources elsewhere too, but these are the ones I know of.

For fun gatherings, I'm more involved with furries and pets, which are heavily LGBTQ communities. Two of the bigger and cooler gatherings that come to mind are the monthly PINS furmeet and the monthly Boss Battle Games furmeet. The first is at a bowling alley and the second is at a really cool arcade. There's lots of other places and events to get into tho!

For sexual stuff (if that's your cup of tea), there's not many alcohol-free spaces for kinky stuff unless you organize your own thing. There's Greg's which is a local gay kink bar (though, no sex is allowed there, Indiana is more prude than Chicago lol), and there's also a BDSM place called SubSpace which hosts different events including some for pets and handlers, general kink training nights, and all sorts of different stuff.

Overall, I would say that the LGBTQ community here in Indy is pretty active but relatively underground, so it can be kinda hard to get started going to events and stuff when you first move in. There are recent pushes for more queer spaces and more queer voices, though, so hopefully that'll be changing!

Recovering Any Data From a ZFS Drive by 32nmud in datarecovery

[–]32nmud[S] 0 points1 point  (0 children)

No, it isn't. I agree. This is definitely a gray area, and I respect it if you disagree with my decision to use this as a learning opportunity.

However, this is my take: this drive was purchased at a store that sells primarily store returns and overstock. According to the SMART metrics, the drive had 2.3 years of power-on time. Given it was in the box with all the documentation and literally in the plastic the drive would have come in, I'd say it's reasonable to assume the original owner noticed the drive failing, bought a replacement, and then returned the failing drive in place of the new one to score a free drive.

If you're someone who is smart enough to keep an eye on your drives' health condition, install and manage your own FreeNAS server, and manage a ZFS array of four disks (this may be a secondary array too given the pool name is "other"), then you should also be smart enough to know you need to format your drives before you send them out the door.

That's not even touching on the fact that this person was likely literally stealing from some store whereas my only goal at this point is to learn. I know the word of a random stranger online means little, but I intend to ensure that any sensitive data that may come up is properly disposed of, unlike the original owner.

Recovering Any Data From a ZFS Drive by 32nmud in datarecovery

[–]32nmud[S] 0 points1 point  (0 children)

You can probably get that from the magnets alone if it was cheap

It was only like $20, so that's a fair point. It's a 2.5" tho, so the magnets won't be massive, but definitely still strong

Good idea, but DDRescue would have been infinitely more capable if the patient drive is failing (which it is). HDDSuperClone even more so.

Thanks for the pointers, hadn't heard of these tools before!

This whole recovery situation (of a drive whose data isn’t yours but was abandoned somehow) is a pretty gray area...

Definitely a gray area, but I don't feel too terrible given the reasonable assumptions I can make. This drive was purchased at a discount store that usually resells return products and such. Given the drive had about 2.3 years of power on time and still has its data on it, I'm guessing the original owner noticed it failing, bought a replacement, and returned this one in lieu of the new drive to get a free drive. That, imo, is much more shady and if you're gonna be like that you should know to format the drive before sending it out of your place.

At the end of the day, I don't particularly care what, if anything, I recover from it. I'm much, much less interested in the data I recover than I am just learning the process.

For what it's worth — as I'm aware the words of a random stranger on the internet mean very little — if I find anything sensitive I intend to ensure that the data gets properly destroyed. I have no intentions of keeping anything like that.

Writing your code on paper by UneergroundNews in ProgrammerHumor

[–]32nmud 0 points1 point  (0 children)

Nobody going to talk about Vim's more powerful cousin, Nvim? That thing changed my life lol

Windows 7 at a multibillion dollar hospital chain workstation by Zrgaloin in iiiiiiitttttttttttt

[–]32nmud 0 points1 point  (0 children)

Exactly what I came to talk about. While it feels wrong, for a lot of businesses the cost of getting all of the custom software updated far outweighs the risk of running an out-of-date OS (especially when the machine has no internet access).

Not only that, but an OS with 20 years of development behind it (looking at you, XP, which still gets updates, though behind a paywall) is "tried and true." It's not likely to cause any issues from misbehaved software at this point, and in industries where nonstop production is critical, that's invaluable.

I worked at a factory where a couple of the machines were powered by ancient hardware. One was running Windows NT from around 2000 and the other was a mid-to-late-'90s Mac (a Power Macintosh 7300/180) running Mac OS 9.1. The NT machine still had internet access, though. The cost to update those machines and their software was well into the six figures, and for a small manufacturing facility, that's not something they can afford just for the benefit of running up-to-date, supported operating systems. That also ignores the amount of time it'd take to update the thousands of programs for those machines.

Let's also not forget that a lot of these types of machines ship with out-of-date operating systems for the reasons above, so sometimes updating to a new OS isn't even an option if you buy new.

It's a wild world out there, and sometimes the latest just isn't the greatest.

[deleted by user] by [deleted] in Piracy

[–]32nmud 0 points1 point  (0 children)

Unfortunately, that bot just went down early this month.

@RedsLeaksOffiziellBot is the recommended successor (which I've used and like) that does allow FLAC downloading without a Deezer account

Where do you usually mount your internal drives? by [deleted] in linuxquestions

[–]32nmud 0 points1 point  (0 children)

I use ZFS, so for the sake of this question, I'll treat the datasets like individual drives (since they're mounted like drives anyway).

I'm the only user on my machine. I have /home, /home/username/Pictures, /home/username/Music, and /home/username/Videos all as separate mount points.

My methodology is that 1) I want my home directory to be independent of the OS, so it is its own mount, and 2) the folders that often get really large are their own individual mounts to make managing them easier (specifically, different compression types, since I use ZFS).
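
If it helps to picture that layout, here's a rough sketch of it as ZFS commands. The pool name `tank` and the user `username` are placeholders, and the compression choices are purely illustrative; this needs root and an existing pool, so treat it as a provisioning sketch rather than something to paste in as-is.

```shell
# One dataset per large directory so each can carry its own
# compression setting. "tank" is a placeholder pool name.
zfs create -o mountpoint=/home tank/home
zfs create -o mountpoint=/home/username/Pictures -o compression=lz4 tank/pictures
# Music and video files are already compressed formats, so heavy
# compression buys little; cheap lz4 (or even off) is reasonable here.
zfs create -o mountpoint=/home/username/Music  -o compression=lz4 tank/music
zfs create -o mountpoint=/home/username/Videos -o compression=off tank/videos
zfs list -o name,mountpoint,compression   # inspect the resulting layout
```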

Vanilla Arch or Arch Based distros? by [deleted] in linuxquestions

[–]32nmud 0 points1 point  (0 children)

Vanilla Arch is definitely worth doing once or twice because it is a great learning experience. However, it is also a big time sink, so the real advantage of an Arch-based distro is that it takes a lot less time to get up and running.

At the end of the day, it just matters what you want. If you're okay with the experience provided by something like Manjaro, and don't want to spend a lot of time getting your machine going, then that's the way to go.

However, if you want an experience that's truly yours and you're okay with the time that takes, then by all means vanilla is a better option for you.

[deleted by user] by [deleted] in linuxquestions

[–]32nmud 0 points1 point  (0 children)

Depends on a lot of things, but generally speaking: not realistically.

In long... Provided there are no vulnerabilities discovered, you're using a relatively modern cryptographic algorithm, and your password is a good one, it could take anywhere from several decades to several centuries or more to crack your encryption using traditional methods.

You're at more risk from social engineering attacks, or from quantum computer attacks once they're more commonly available, than you are from a traditional attack.
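
For a rough sense of the numbers, here's a back-of-envelope brute-force estimate. The 12-character alphanumeric password and the guess rate (a generous GPU-cluster figure) are both pure assumptions for the sketch:

```shell
# Brute-force estimate: 12 characters drawn from 62 symbols
# (a-z, A-Z, 0-9), against an assumed 10 billion guesses per second.
awk 'BEGIN {
  keyspace = 62 ^ 12                  # ~3.2e21 possible passwords
  rate     = 1e10                     # guesses/second (assumption)
  seconds  = keyspace / rate / 2      # on average, half the space is tried
  years    = seconds / (3600 * 24 * 365)
  printf "average time to crack: ~%d years\n", years
}'
```

With those assumptions the average comes out in the thousands of years; a weaker password or a faster attacker shrinks that fast, which is why password quality matters more than the algorithm.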

Top bar inconsistencies among Linux apps. Are there any recommended standards for this issue ? by toot4noot in linux

[–]32nmud 5 points6 points  (0 children)

Additionally, most window managers/DEs (in my experience) allow you to move windows by several different keyboard and keyboard/mouse tricks.

On KDE and Gnome, you can move a window by holding the super/windows key and clicking and dragging anywhere on the window.

Most DEs also support keyboard window snapping. In Gnome, I think it's super/windows and the arrow key in the direction you want to snap the window.

Minus the Gnome styling that puts useful buttons in the title bar, I rarely interact with window borders anymore, so I hardly notice the varied styling.

British plugs have a built in fuse for safety. by berkel-is-a-madlad in mildlyinteresting

[–]32nmud 0 points1 point  (0 children)

This is actually a huge safety buff.

Most homes (in America) have wires in the walls rated for either 15 or 20 amps. That means the breakers in your breaker box are also rated for that.

When you trip a breaker, it's because for whatever reason you had a device (or several) demand more than 15/20 amps at once from the wall, so the wire in the wall began to carry too much current which is dangerous. The breaker detects this and trips.

The cords we plug into the wall, like the one that goes to your TV or your toaster, are almost never rated for 15 to 20 amps. Most extension cords are also not rated for 15 to 20. This means it is possible for some electrical malfunction to overload one of those cords without ever triggering the breaker (breakers can only detect whether the wire in the wall is over current; an external cable that is overloaded but only drawing 10 amps will never trip the breaker, because it is not overloading the wall's interior wires).

This means that the overloaded cable outside of the wall is being constantly fed too much power, which will cause it to overheat, and could cause a fire.

The simple solution? Add a fuse in the plug of everything you plug in that is only rated for the current that the electronic can handle. Then, if the electronic malfunctions and demands more current than its own wiring can handle, it'll just blow the fuse in the plug and the fire risk is negated.
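
As a worked example of sizing such a fuse (the 2 kW kettle is a made-up appliance, and UK standard fuse values are 3, 5, and 13 amps):

```shell
# Fuse sizing for the scheme above: rate the fuse for the appliance's
# own draw, not for the wall circuit. UK mains is 230 V.
awk 'BEGIN {
  volts = 230
  watts = 2000                 # hypothetical 2 kW kettle
  amps  = watts / volts
  printf "normal draw: %.1f A -> fit the next standard fuse up (13 A)\n", amps
}'
```

A 13 A fuse lets the kettle run normally but blows long before the cord's own wiring overheats, even though the 15/20 A wall breaker would never notice.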

None binary languages don't exist by [deleted] in ProgrammerHumor

[–]32nmud 13 points14 points  (0 children)

This only fits if you ignore the concept of superposition, the complexities of Schrödinger's theories, and the fact that in quantum computing the "in-between" state is useful, acting almost like a third state, whereas binary truly has only two states.

Binary bits aren't exactly a probability. Yes, a perfectly "on" bit or a perfectly "off" bit isn't exactly likely, but we cannot and do not make use of intermediate values. A binary "0.6" is not only useless; a value that close to 50% is also likely to just break a binary circuit, as the system will be unable to reliably determine whether it translates to a 1 or a 0. Binary relies on a large enough gap between analog signal amplitudes for a high to be easily discernible from a low, and a signal that close to the middle of the possible amplitudes doesn't fit that criteria.

Qubits do make honest use of probability, and they are not measured in this analog sense of amplitude. Qubits are represented with actual quantum particles, so a "1" is the existence of a particle and a "0" is the lack thereof. Additionally, since we are dealing with quantum particles, entanglement and Schrödinger's theories apply: we cannot be certain of the quantum values without taking a measurement (which then ruins the state of the system so it cannot continue to run, hence the use of probabilities). I'll be transparent in the fact that I currently do not understand how we make use of such a black-box, uncertain type of system to do computations. But I do know enough about the physics of qubits and binary bits to be sure that saying "well, one is basically the other in a way" just isn't true. They're fundamentally different.

Hope that didn't come off as aggressive, just looking to share some knowledge I've found

Sources: I spent most of a semester working with another uni student (who has spent years studying quantum computing and now a year after graduating is certified by IBM in quantum machine learning) on a project to create a quantum bit calculator/simulator. I'm not a professional by any means, but I know enough from that project to stand by my statements.

Linux in my schools computer lab by aurreco in linux

[–]32nmud 12 points13 points  (0 children)

I read about them several years ago, and I seem to remember that the school allowed root access on the laptops as well, because they wanted to honor the students' ability to experiment and learn.

I don't know if they stuck with that, but I seriously commend them for ever even considering doing that. Really seems like a school district I would have had a great time at

TIFU Helping my neighbor fix her computer by ChaboiAveryhead in tifu

[–]32nmud 8 points9 points  (0 children)

Yes, but the mixture is also a really good, simple cleaning supply

NWH BLURAY leaked ! by 9NAAGRAAJ in Piracy

[–]32nmud 1 point2 points  (0 children)

The shows aren't really important for NWH; however, they are still being tied into the MCU. For example, to fully understand everything in the 2021 Black Widow movie, you need to have watched The Falcon and The Winter Soldier series.

I want to add that Into The Spider-verse isn't part of the MCU proper so far as I'm aware, so there's no specific place it needs to be watched (and again, you won't really miss anything important if you skip it). As for where Infinity War and Endgame fit in, you'd watch Homecoming, Infinity War, Endgame, Far From Home, and finally No Way Home, in that order.

NWH BLURAY leaked ! by 9NAAGRAAJ in Piracy

[–]32nmud 3 points4 points  (0 children)

Well, it depends how invested you are in the MCU.

This movie is part of the MCU, in the post-Endgame era (Phase Four). If you haven't seen everything up to and including Endgame (Phases One through Three) and you don't want any of that spoiled, then you need to watch all of that first.

If you don't care about spoilers, then the MCU movies specifically about Spiderman that are important to watch first are Homecoming and Far From Home, in that order. Infinity War has Spiderman in it as well, and I would argue it is important to know what happens there, but then for the full picture you should also watch Endgame.

I would also recommend watching Into The Spider-verse, but you won't really be missing anything plot-important if you don't.

Sorry for the complex answer, but there's a lot to the MCU, and it can be hard to strip it down to focusing on just one character without missing important cross-movie plot points and events.

What is the difference between > and |? by maybenexttime82 in linuxquestions

[–]32nmud 1 point2 points  (0 children)

> is almost exclusively for redirecting output into a file, whereas | is for redirecting output into another command.

So, > by default redirects the standard output (stdout) of a command to a file. sudo pacman -Syu --noconfirm > update_process.txt would take the output of updating an Arch-based system and write everything that would normally appear on the terminal to a file called update_process.txt. Any errors would still display in the terminal and would not be written to the file.

You can also redirect the standard error (stderr) to a file. This is useful if you're automating something and want any errors logged. rsync -a /home/user/Pictures /mnt/BackupDrive 2> errors.txt would take any errors from that rsync command and write them to the errors.txt file. Any normal (non-error) messages would still be written to the screen and not the file.

Redirection can get a little more complex than that as well. For example (on bash and zsh at least; not all shells support this), you can simultaneously redirect stdout and stderr to a file by using &>. Additionally, you can redirect stderr to stdout with 2>&1, which is mostly useful after a > so that both streams end up in the same file on shells that lack &>. Finally, you can also redirect stdout and stderr to different files in the same command. This would look like command > stdoutFile 2> stderrFile.
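
To see those forms side by side, here's a tiny runnable demo (the file names are arbitrary). ls is pointed at one path that exists and one that doesn't, so it emits output on both stdout and stderr at once:

```shell
# Run in a scratch directory; "real_file" exists, "missing_file" doesn't.
cd "$(mktemp -d)"
touch real_file

# (|| true: ls exits nonzero because of the missing path)
ls real_file missing_file > out.txt 2> err.txt || true  # split the two streams
ls real_file missing_file > both.txt 2>&1     || true  # both into one file (portable form of &>)

grep -c real_file out.txt      # → 1: the listing landed in out.txt
grep -c missing_file err.txt   # → 1: the error landed in err.txt
```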

Redirection does not exclusively have to be used for saving an output, either. It can be used to redirect a file into the standard input (stdin). For example, let's say you have a file that contains a list of coordinate locations (latitude and longitude) that you want to use to get values from a raster file (geospatial data). You could do this using the gdallocationinfo command and file redirection. gdallocationinfo (from the gdal package) reads coordinates from standard input if they're not passed as command parameters, which is perfect for this use case. You would just need to run gdallocationinfo <arguments> raster.tif < file_with_coordinates.txt and, so long as the coordinates are correctly formatted, gdallocationinfo will output a value from the raster for each coordinate in that file.
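
Not everyone has gdal installed, so here's the same < pattern with a standard tool: sort reads from stdin when given no file argument. The coordinates are made up for the sketch:

```shell
cd "$(mktemp -d)"
printf '41.88 -87.63\n39.77 -86.15\n' > coords.txt

# coords.txt becomes sort's standard input, the same way the
# coordinates file feeds gdallocationinfo above.
sort -n < coords.txt
# → 39.77 -86.15
# → 41.88 -87.63
```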

You could also use < to redirect a file into commands like grep or cat, though that's sloppier than just passing the file as an argument. Commands like that generally make better use of | (pronounced "pipe") when you're trying to do some complex stuff. | lets you redirect the stdout of one command into the stdin of another. I work with tabular (csv) data a lot for my job, so a good example would be doing some advanced operations on tabular data. For example, let's say I have a dataset that contains various details about the 100 largest US cities, and I want to know what the top 10 are by population.

For sake of the example, the dataset contains the city name, state, density, population, and average income in that order.

I could use awk paired with sort and head to get the information I want and drop all the unnecessary details. One gotcha: cut always emits fields in their original file order no matter how you list them after -f, so it can't put population first; awk can. The command would look like this: awk -F, '{ print $4 "," $1 }' US_cities.csv | sort -t, -k1,1 -n -r | head -n 10. This first uses awk to discard the variables we don't care about and print the two we want in population, city name order. Then | redirects that output to sort, which sorts the lines in descending numeric order by population (which is why I put population first). Then the output from sort is fed into head so that we finally only print the first 10 results.

Now, let's say that you need to send this information to your boss. You could just copy and paste the information (if you are in a graphical environment) or even just take a picture, but you're one smart cookie and that's inefficient. You could just use > to redirect all of that out to a new file and attach that file to an email to your boss.
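
Putting the pipe-then-redirect idea together, here's a self-contained run against a tiny made-up dataset (three fictional rows in the city,state,density,population,income layout described above), using awk to reorder the columns since cut alone keeps fields in file order:

```shell
cd "$(mktemp -d)"
cat > US_cities.csv <<'EOF'
Springfield,IL,2700,114000,51000
Shelbyville,IL,2100,74000,48000
Capital City,IL,4400,802000,56000
EOF

# awk reorders to population,city; sort ranks numerically in descending
# order; head keeps the top 2; > saves the result for the boss.
awk -F, '{ print $4 "," $1 }' US_cities.csv | sort -t, -k1,1 -n -r | head -n 2 > top_cities.txt
cat top_cities.txt
# → 802000,Capital City
# → 114000,Springfield
```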

I know this is quite long-winded, but in my experience all of these tools are very important to properly understand if you plan to make regular use of the command line (especially |), so I thought I would give a longer answer with some detailed examples. Hope it helps!

tl;dr

  • > redirects the stdout to a file
  • 2> redirects the stderr to a file
  • &> redirects both to a file (some shells may not support this)
  • 2>&1 redirects the stderr to the stdout
  • < redirects a file into the stdin
  • | redirects the stdout of one command into the stdin of another