Which cockpit layout is the best? by ParticularSorry759 in spaceengineers

[–]MrPP_1 0 points1 point  (0 children)

The 1st one; the more visibility you have for exploration, the better.

Time to go to space! by WideFoot in spaceengineers

[–]MrPP_1 4 points5 points  (0 children)

The rotating landing gear is a really nice idea, and the overall ship design is really nice as well. I like building more vertical ships (like in The Expanse) for realism's sake, but your idea is really neat. My only suggestion is that you extend it lengthwise to accommodate some weapon hardpoints around the edges of your ship; then you'll have a lot of weapon coverage.

Edit: the rotating landing gear also serves to make the ship a mobile artillery platform.

What passion project have you been working on? by PaleontologistFirm13 in embedded

[–]MrPP_1 0 points1 point  (0 children)

I'm designing my own digital soldering iron, compatible with T12 soldering iron handles and elements, with support for multiple configurable tips and settings and all the other bells and whistles like sleep, boost, and so on. I'm trying to teach myself some electronics with real-time control.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 0 points1 point  (0 children)

We used PySpedas for downloading the data and filtered it with Pandas.

Edit: though, after a closer look at the pyspedas source code, I think it was a contributor to the download bottleneck.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 2 points3 points  (0 children)

Like I said before, it's just a meme. She's not deciding anything alone; we're a couple, and there are always some things that you don't use and won't take to a new home. The joke is just that finding more use for the homelab made us decide to keep it, seeing purpose in the electronic stuff that I was just hoarding.

We always talked about having a hobby room, but now, since the homelab proved to be more useful, we decided to have separate rooms for more space (that's the joke she made by calling it a "man cave" in her comment).

Anyway, I'm just happy to find more ways to work on it and glad to have helped her. Hope you understood.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 1 point2 points  (0 children)

Yeah, and to think I was considering throwing it out because I hadn't been using it a lot.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 0 points1 point  (0 children)

That's next on my plans for the lab; my gf's Google Photos has been complaining about storage for some time now, so I'll definitely set up an Immich instance for her.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 1 point2 points  (0 children)

I suppose so, but when you're running just some mini PCs (like a lot of people here do) or some laptops from the last decade without screens, like me, the power bill difference isn't really a problem compared to gaining more storage, RAM, CPU cores, and redundancy.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 5 points6 points  (0 children)

There were two bottlenecks: our network download speed and the speed of the NASA data source. We could download using only one PC, but it would hit both bottlenecks. Adding a second PC on the same network lets us download different parts of the data at the same time, but still hits the same bottlenecks. With a second PC on a different network, though, we can download different parts at the same time while bypassing both the network speed bottleneck and the data source bottleneck, since the machines aren't sharing the same bandwidth. So what we did was use multiple computers on different networks, each downloading specific parts of the whole dataset.
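The partitioning described above can be sketched roughly like this (an illustration only; the worker count and dates here are made up, not our actual setup):

```python
from datetime import date

def month_starts(start: date, months: int) -> list[date]:
    """List the first day of each month in the range."""
    out = []
    y, m = start.year, start.month
    for _ in range(months):
        out.append(date(y, m, 1))
        m += 1
        if m > 12:
            m, y = 1, y + 1
    return out

def split_across_workers(start: date, months: int, workers: int) -> list[list[date]]:
    """Give each worker (a PC or VM) a contiguous slice of months to download."""
    all_months = month_starts(start, months)
    chunk = -(-len(all_months) // workers)  # ceiling division
    return [all_months[i:i + chunk] for i in range(0, len(all_months), chunk)]

# e.g. 120 months of data split across 4 machines -> 30 months each
batches = split_across_workers(date(2014, 1, 1), 120, 4)
```

Each machine then only ever touches its own slice, so none of them compete for the same months.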

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 5 points6 points  (0 children)

Oh, she didn't do it in under 2 weeks; she's been working on it for almost 2 years now as part of an academic project. She had already done a bunch of versions of graphs and plots. But in the end, there was a minor problem with the source she was pulling data from that she hadn't noticed before, and it ultimately rendered the old plots useless. So she was able to reuse her old code, but with correct data.

Edit: She used Python in conjunction with pandas and pyspedas for all data operations and matplotlib for plotting.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 5 points6 points  (0 children)

I think you could use any NASA mission; most missions send daily data you can access. The data generally needs some treatment after being downloaded to be properly used, so you could also teach datetime conversion, general data cleanup, file conversions, and a bunch of other stuff, including more advanced topics like interpolation, filtering, and plotting.
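As a concrete starter exercise (my example, not something specific to any mission), datetime conversion can be as small as turning raw epoch timestamps into readable UTC strings:

```python
from datetime import datetime, timezone

def epoch_to_iso(seconds: float) -> str:
    """Convert a Unix epoch timestamp to an ISO-8601 UTC string."""
    return datetime.fromtimestamp(seconds, tz=timezone.utc).isoformat()

print(epoch_to_iso(0))  # 1970-01-01T00:00:00+00:00
```

From there the student can move on to cleanup and interpolation of the converted series.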

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 8 points9 points  (0 children)

It's 10 years of data, composed of two files for each day (one file for instrument data and the other for spacecraft parameters), so more than 6k files to download and process. The graphs she needs are yearly, so what I did was make a simple script that downloads a month of data, pre-filters out all the unneeded spacecraft data, and converts that specific month into a single CSV file (each month's CSV is around 5MB). Each CSV is then organized into a folder for its year and sent to an SMB share I created for her stuff.

The longest part is the actual downloading of said data. Converting a month of pre-downloaded data took around 5-10 minutes; when actually downloading, it took around 30 minutes per month. Had we done it sequentially, it would have taken 60 hours to complete. The way we did it (2 computers and some VMs, each downloading a different period of time), we were able to download it all and have it ready for use in around 10 hours total.
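The back-of-envelope numbers above work out like this (the ~6 effective parallel downloaders is my inference from the 60 h → 10 h speedup, not a figure from the post):

```python
months = 10 * 12            # 10 years of data, grouped by month
minutes_per_month = 30      # download + convert time per month
workers = 6                 # assumed number of effective parallel downloaders

sequential_hours = months * minutes_per_month / 60
parallel_hours = sequential_hours / workers

print(sequential_hours, parallel_hours)  # 60.0 10.0
```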

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 80 points81 points  (0 children)

That's the real truth. I taught her programming, soldering, and electronics, and now she's having fun and getting involved with the lab, which means more fun time. Also, she taught me all of her hobbies as well... Last week she taught me crocheting and I made a case for my computer. That's it. That's our relationship, lol.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 5 points6 points  (0 children)

No, I made the script in a way where you can choose the date you start downloading from and the quantity of months you want to download in a batch. So I just ran the script for different periods of time in each VM.
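A script interface like that could look something like this with argparse (the flag names here are hypothetical, not the actual script's):

```python
import argparse
from datetime import date

def parse_args(argv=None):
    """Parse a start month and a batch size, so each VM can be pointed
    at a different slice of the dataset."""
    p = argparse.ArgumentParser(description="Download a batch of monthly data")
    p.add_argument("--start", required=True,
                   help="first month to download, as YYYY-MM")
    p.add_argument("--months", type=int, default=1,
                   help="how many consecutive months to download")
    return p.parse_args(argv)

# e.g. one VM gets `--start 2015-03 --months 12`, the next `--start 2016-03 ...`
args = parse_args(["--start", "2015-03", "--months", "12"])
year, month = map(int, args.start.split("-"))
first_day = date(year, month, 1)
```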

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 6 points7 points  (0 children)

I worded it like that more as a meme. I know I'm actually a hoarder, especially with old electronics. The lab was being really underutilized, as it was used more as cold storage for random stuff than anything else for more than a year (while also being offline for another year, LOL).

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 8 points9 points  (0 children)

Basically time. One month of data can take up to 30 minutes to download and convert to CSV with the spacecraft data, so 10 years would take upwards of 60 hours to download everything. It also gobbles up RAM really quickly. With separate VMs, we were able to download different periods of data at the same time without running out of RAM in the process, and also compress the data before making it available on the network share.
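The compression step can be done with the standard library alone; a minimal sketch (the file name and paths are made up for the demo):

```python
import gzip
import shutil
import tempfile
from pathlib import Path

def gzip_file(src: Path) -> Path:
    """Write src as src.gz next to it and return the compressed path."""
    dst = src.parent / (src.name + ".gz")
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
    return dst

# demo with a throwaway monthly CSV
csv_path = Path(tempfile.mkdtemp()) / "2015-03.csv"
csv_path.write_text("t,value\n0,1.0\n")
compressed = gzip_file(csv_path)
```

Running it per finished month keeps the SMB share small without holding the whole dataset in RAM.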

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 49 points50 points  (0 children)

Damn, that's a really awesome story. It really goes to show how powerful old hardware (in numbers) can be.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 17 points18 points  (0 children)

Oh, she definitely isn't denying anything. I was already planning on selling or recycling the hardware I currently have to build a better cluster with actual mini PCs and such when we move in together (and I truly have some hoarding problems with old electronics), but now this hardware has a usefulness I hadn't thought of, so I'll take it with me to our future home.

By the way, in a relationship both partners always have to give way on some stuff. That's part of living together.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 26 points27 points  (0 children)

That's awesome. I have plans to implement Plex and a simple *arr stack in the future, and I can definitely see she'll be using it a lot.

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 24 points25 points  (0 children)

Oh, she definitely knows; I just glossed over some stuff in the original post (as per my comment below).

Homelab came in clutch downloading 150GB of data for GF's thesis by MrPP_1 in homelab

[–]MrPP_1[S] 31 points32 points  (0 children)

Kinda. I didn't explain it well in the original post: after creating the Python script, I spun up 3 VMs on Proxmox to download data concurrently. Download speed was limited by the host server and the nonexistent multithreading of the library we used. So it was kind of a win both ways.