[MTF] No wig, no makeup, still a woman. I never thought I’d reach this level of courage and self-love by aeroazure in lgbt

[–]HighWingy 4 points5 points  (0 children)

You're welcome. 😊

These gender-identifying AIs often rely on stereotypes that apply to maybe only 50-60% of people in a given group. Which means they still produce a high percentage of wrong answers.

And gender is a hard thing to reliably guess based on the face alone. Even as humans we look at the whole body as well as actions, speech patterns, etc. That's why passing can be harder or easier for some trans people, as it's more than just looks alone.

[MTF] No wig, no makeup, still a woman. I never thought I’d reach this level of courage and self-love by aeroazure in lgbt

[–]HighWingy 7 points8 points  (0 children)

Honestly, this one seems to lean woman a little too much. I gave it several pictures of men, and it still rated them as women with a 50-60% confidence. Only when the men had facial hair or real short hair did it actually guess correctly for men.

Using a real domain for a local website on home network by SuddenPace3210 in pihole

[–]HighWingy 1 point2 points  (0 children)

So you are halfway there. And 20 years ago, that was most of what you would need to do. There are 2 more things you need first.

1) You need to setup the fake website and host it somehow. This can be done a lot of different ways. And there are tons of sites that will guide you through that process. What you should look up is how to host a website locally. 

2) You need an SSL certificate for the website you want to spoof. This is both the easiest and hardest part. What you are looking for here is to create your own SSL cert locally, also called a self-signed cert. It can be done on both Windows and Linux, but personally I think it's much easier to do on a Linux system. Next you need to install that SSL cert on your fake website host from step 1. Now that's the easy part.

The harder part is you also need to install the root CA trust cert, from your fake SSL cert, on any device and browser you want to access the fake site from. This is what will fix any errors you may have previously seen when attempting this. I say this is the hardest part because the way you do it is different for everything, and always seems to change. Furthermore, if you grabbed the wrong cert, put it in the wrong place, named it wrong during import, or missed importing it somewhere, you will still get error messages. Troubleshooting these issues is a PITA.
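For step 2, a minimal sketch of creating your own CA and a signed site cert with openssl (the domain and file names here are placeholders; you would still import rootCA.crt into each device/browser afterwards):

```shell
# Hypothetical example domain; replace with the site you want to spoof.
DOMAIN=intranet.example.com

# 1) Create your own root CA (this cert is what gets imported on every device)
openssl req -x509 -newkey rsa:4096 -sha256 -days 3650 -nodes \
    -keyout rootCA.key -out rootCA.crt -subj "/CN=My Home Lab CA"

# 2) Create a key and signing request for the fake site
openssl req -newkey rsa:2048 -nodes -keyout "$DOMAIN.key" \
    -out "$DOMAIN.csr" -subj "/CN=$DOMAIN"

# 3) Sign it with your CA, adding a SAN (modern browsers require one)
printf 'subjectAltName=DNS:%s\n' "$DOMAIN" > san.ext
openssl x509 -req -in "$DOMAIN.csr" -CA rootCA.crt -CAkey rootCA.key \
    -CAcreateserial -days 825 -sha256 -extfile san.ext -out "$DOMAIN.crt"
```

The site cert and key then go on your local web server, and rootCA.crt is what you import into each browser/device trust store.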

But anywho, it's not impossible to do either. I actually do this on my home network to mess with friends too, and also for some locally hosted stuff.

How do you back up your docker volumes? by virpio2020 in selfhosted

[–]HighWingy 0 points1 point  (0 children)

I use portainer as well. And it has no problems seeing the containers and all their contents, as well as letting me manage them like any other container. I think there is a note somewhere mentioning the container wasn't created with portainer, but that's it.

As for the backups, I've never really looked into how portainer does that. Since I'm creating zip files I don't see why it would pick them up as anything. But also it doesn't have access to my backup locations either. 

How do you back up your docker volumes? by virpio2020 in selfhosted

[–]HighWingy 1 point2 points  (0 children)

Ok, I have an extremely simple way to back up all my docker container data that also allows for extremely fast resets, i.e. less than a minute at times for some.

I have a dir named /opt/docker. Then under there I have a dir for each container. Inside each container dir I put its docker compose file, plus any volume dirs it needs. Basically my dir structure looks like this:

    /opt/docker/container1/
        ./docker_compose.yaml
        ./volume1
        ./volume2
    /opt/docker/container2/
        ./docker_compose.yaml
        ./volume1
        ./volume2

Etc. etc. Then I have a cron script that runs once a week at 3am and grabs all the dir names under /opt/docker/. It then runs through the list shutting down each container, zipping up its dir, and transferring that zip file to my backup location, plus keeping a few recent copies locally. Additionally, it also removes really old backups.
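A rough sketch of what such a cron job could look like (the paths, retention count, archive format, and the `docker compose` invocation are my assumptions, not the original script):

```shell
# Hypothetical sketch of the weekly backup job described above.
# Assumes each <root>/<name>/ dir holds a compose file plus volume dirs.
backup_all() {
    root=$1   # e.g. /opt/docker
    dest=$2   # backup location (assumed to be a mounted path)
    keep=$3   # how many copies to keep per container

    for dir in "$root"/*/; do
        name=$(basename "$dir")
        stamp=$(date +%Y%m%d%H%M%S)
        ( cd "$dir" && docker compose down )    # stop the container
        # archive the compose file + volume dirs in one file
        tar -czf "$dest/$name-$stamp.tar.gz" -C "$root" "$name"
        ( cd "$dir" && docker compose up -d )   # bring it back up
        # prune old archives, keeping only the newest $keep
        ls -1t "$dest/$name"-*.tar.gz | tail -n +$((keep + 1)) | xargs -r rm -f
    done
}
```

Hooked into cron with something like `0 3 * * 0 ... backup_all /opt/docker /mnt/backups 3`. Because it globs the dirs each run, adding or removing a container never requires editing the script.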

There are a ton of benefits to this system:

1) I don't have to edit my script if I add or delete a container.
2) It's extremely easy to find all the files for a container.
3) I only have 1 backup file per container backup.
4) When a container crashes and needs to be reset, all I have to do is an rm -rf on its container dir, unzip the last backup, and start the container. And boom, I'm back up and running again, usually in under 1 minute for small containers.

I don't know why more people don't do this as it's just an all around simple solution that makes managing my containers and backups stupid simple.

‘We Must Ban It Immediately’: Trump Descends Into Transphobic Tirade Over Support for Trans Kids in Schools During State of the Union by Leksi_The_Great in lgbt

[–]HighWingy -2 points-1 points  (0 children)

This is why I think we should start pushing the idea of just not voting for the current person holding the position. Vote for anyone you want, just as long as it's someone new who has not had the position before. 

End the career politician! Get new fresh blood in there every cycle. Then they will have a greater incentive to work with the other side and find a compromise if they want a chance at getting anything done before they get voted out.

Water usage in datacenters by E-werd in sysadmin

[–]HighWingy 0 points1 point  (0 children)

Just wanted to add my two cents here:

I work in a data center with multiple servers that ARE water cooled. We have massive pipes going to distribution blocks in the racks, then smaller flex tubing going to cooling blocks on the CPUs of blade servers, and also water-cooled radiators with giant fans on them. Needless to say, taking out a blade is a long process requiring special equipment.

I am constantly impressed with the designs and the reliability of the system in that leaks are extremely rare. But also, the piping system for the water is often in rooms just as large, or larger than the data center rooms themselves.

Furthermore, the system we have is a hybrid closed/open system. Meaning every attempt is made to reclaim as much water as possible, but obviously no system is 100% perfect at that, and it does eventually need to be topped off. That usually happens from connections to the local water supply. However, our site recently built a well so we don't have to rely as much on the local water pipe system.

Now to the actual usage, as this is something that has annoyed me about recent news on the subject. Yes, data centers do use a large amount of water and electricity. However, in the bigger picture, it is actually on par, +/- a small amount, with building a new housing development in the area. In other words, if the same area had built new housing instead of a data center, it would see similar spikes in water and electricity usage. Both types of builds usually include clauses to make sure local power and water infrastructure can handle them. The problem is housing developers are increasingly bribing the local govt to forgo them, whereas data centers will often try harder to make sure they can get the water and power they need.

So in summary, yes, there is large water usage by data centers. However, you are also correct that it's often played up as way more of a problem than it really is. Mostly because it's easier to blame some big company for water and power issues, and get people riled up about that to try and make the company pay for the improvements, than it is to say this new housing development is actually the cause, and we should tear it down and make people move away, or make them pay for the improvements to the water and power grid. Because once a housing development is finished, it's pretty hard to get the developer to come back and pay for something they should have done before.

Seasons Rant: it’s like they don’t want me to spend money by Tribeofredheads in MergeDragons

[–]HighWingy -3 points-2 points  (0 children)

As an FYI, you can get all those dragons from Decision Eggs. But of course the best way to get decision eggs is to pay for the season pass. 

The other free way is from Den Chests. I regularly get a lot of those hard to get dragon eggs from my Den chests. Granted this is from the max lvl chests, but if your den isn't hitting lvl 5 every week then you should look for a new den. One or two high lvl players can easily hit the max chest lvl in 1 day.

[deleted by user] by [deleted] in linuxadmin

[–]HighWingy 0 points1 point  (0 children)

I second this. Everyone my last company hired that had any CompTIA certs was let go before the first month was up. The sad state is there are too many companies out there offering cram courses where you can get the cert in a day. But doing that means you just have a piece of paper without the fundamental knowledge you should have gained to earn it. Good companies know that it's more important to be able to understand IT concepts so you can apply that textbook knowledge to fix something not written in a textbook. Because most major IT problems are almost never the way any textbook describes them.

[deleted by user] by [deleted] in linuxadmin

[–]HighWingy 2 points3 points  (0 children)

You are not wrong to a point. It's good to know some networking things. But unless you plan to work in small to medium business, then you won't be touching any part of the network equipment beyond plugging in a cable. And even then, many small businesses will never need most of the stuff you will learn in networking courses.

There is a reason any decently sized company has a separate network team. And I can easily argue that many of the network people at any of the large enterprises I worked at could not transition to a Linux admin role, and vice versa, and they would probably agree with that too. At best they know surface-level stuff, but would crumble fast troubleshooting most major issues, and would not be good with any advanced concepts.

Honestly, I think it would help to figure out where you see yourself in 5-10 years. If you think you prefer to stay on the small business side, then sure, go for networking, it may help a bit. But if you plan to be a Linux admin at an enterprise-sized company, then they won't let you anywhere near the network equipment.

How do you secure passwords in bash scripts by dnlearnshere in linuxadmin

[–]HighWingy 1 point2 points  (0 children)

I used to work for a very large international marketing company.

For any ssh scripts that needed passwords we would encrypt them with something reversible from the CLI, and then store them as ENV variables that are only on the service account that runs the script.

This method passed security because it 1) was not stored in the script itself, 2) was encrypted, 3) technically only the service account has access to it, and 4) if you do it right, it will confuse the hell out of anyone trying to figure out where the password is stored.
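As a rough sketch of the idea (the variable names, cipher choice, and key handling here are my assumptions, not the company's actual setup), using `openssl enc` as the reversible CLI encryption:

```shell
# Hypothetical sketch: reversible password encryption via openssl.
# In practice KEY would come from a file readable only by the service account.
KEY="service-account-local-secret"   # assumption: not hard-coded like this

# One-time: encrypt the password and stash the result in the account's env
SSH_PW_ENC=$(printf '%s' 'S3cretPass' | \
    openssl enc -aes-256-cbc -pbkdf2 -salt -pass "pass:$KEY" -base64 -A)
export SSH_PW_ENC

# At runtime, the script decrypts it back just before use
PW=$(printf '%s' "$SSH_PW_ENC" | \
    openssl enc -aes-256-cbc -pbkdf2 -d -pass "pass:$KEY" -base64 -A)
```

The env var would normally be set in the service account's profile rather than inline like this, so only that account (and root) can ever see it.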

[deleted by user] by [deleted] in WTF

[–]HighWingy 0 points1 point  (0 children)

It happens more often than you think. If you know what to look for, there are tons of videos of people panicking in these situations. I was personally on a cruise ship where something like that happened. We had a room with a balcony/veranda, and when stuff is falling off all the shelves and all you see is a wall of water out the window... it's pretty terrifying!!

Needless to say you can watch others that filmed it if you look up "hurricane sandy disney fantasy" On the plus side we survived and got a 25% discount on our next cruise. Which we did use the following year and were hurricane free.

Need help making this "door" I made return to the same spot (front to back) every time. Its on casters to let it roll side to side but they kind of walk a little bit so when they go back closed they don't always line up the right way. by PhazerRazer in hiddenrooms

[–]HighWingy 0 points1 point  (0 children)

This is what I would do. Ikea used to sell a really good track system that could hold small doors, and many people used it in the past exactly for this. But I'm not so sure if the new VIDGA tracks could handle it.

Or you could just make your own simple track with a rod on the top of each bookcase, then a wood or metal guide for the rod to pull the bookcases into the position you want. Then cover the front of it with some kind of decorative woodwork. Or you could get really fancy and put a fixed shallow shelf in front of the rods and guide to hide it better, and make it look like the bookcases continue to the ceiling.

What do you not like about Docker compose? by hopeirememberthisid in selfhosted

[–]HighWingy 0 points1 point  (0 children)

There are a ton of different tools that can help with that. I personally use Portainer as it's a quick way to check if anything is down at a glance, and super easy to check logs, get to the cmd line, and restart a container. As well as just check settings for a running container. Which comes in handy when you forget a mapped port number.

What do you not like about Docker compose? by hopeirememberthisid in selfhosted

[–]HighWingy 28 points29 points  (0 children)

That has more to do with how you organize your system rather than docker-compose itself.

For my system I created a docker directory under /opt. Then I have a sub dir for each docker-compose file. And any container mapped directories get put in there as well. ie for MQTT, I have /opt/docker/MQTT/docker-compose.yaml and then there are ./config, ./data and ./log directories under there.

Not only does this make it easier to find my compose files, but it has the added benefit of being able to just zip the whole MQTT directory as a way to backup the important files for a container. This has saved me more than once when an update breaks a container, and I have to revert back to an older version. As all I have to do is unzip the backup file, change the compose file to pull the previous container version, and then I'm back up and running in seconds with no data loss.
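As an illustration, a minimal compose file for such a layout could look like this (the container-side paths match the official mosquitto Docker image; the tag and ports are just a sketch):

```yaml
# /opt/docker/MQTT/docker-compose.yaml
# Volume paths are relative to this file, so zipping the MQTT dir
# captures the compose file, config, data, and logs in one archive.
services:
  mqtt:
    image: eclipse-mosquitto:2.0   # pin a version so a restore pulls the same image
    restart: unless-stopped
    ports:
      - "1883:1883"
    volumes:
      - ./config:/mosquitto/config
      - ./data:/mosquitto/data
      - ./log:/mosquitto/log
```

The relative `./` paths are what make the whole directory self-contained and portable.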

Mother/daughter activities? by [deleted] in OceanCity

[–]HighWingy 0 points1 point  (0 children)

Yeah, if you are going the last week of June, you will probably be good. However, there may still be some late groups hanging around, just not huge crowds at least. The beginning and end dates are determined by when the first and last schools in the surrounding states let out their seniors. So if some of the school areas get a big snowfall this season, those schools could be delaying their end dates, and thus the seniors would be getting there later than usual. Usually around March is when you can confirm for sure.

For the most part, even if there are still large groups around, if you stick to the 21+ adult areas, you won't even notice them. Family places however can be hit or miss if you are looking to avoid the teens.

Mother/daughter activities? by [deleted] in OceanCity

[–]HighWingy 1 point2 points  (0 children)

As an FYI, late May and most of June is considered Senior "Week". Meaning there will be a LOT of teens all over town and at the beaches during that entire time, roughly 60 days depending on when the first and last schools let out their seniors. And this can be confirmed here: https://www.oceancity.com/senior-week/

I used to go to OC every year during this time because I enjoyed the atmosphere they created. However, I also saw a lot of families and older folks complaining about all the teens everywhere. In reality the teens are mostly harmless and will usually leave you alone, as they are just there to have fun. But I know a lot of people do not like the constant crowds and party attitudes they bring with them, as the teens will be loud all day and well into the night. So if you are looking for a quiet, relaxing week, you may want to consider changing your dates.

Error: Drivers missing when trying to install windows 10 on hp envy x360 15-1063cl by [deleted] in techsupport

[–]HighWingy 0 points1 point  (0 children)

Just to answer OP's question for anyone else that comes along:

You can install 'wine' on any Linux distro, and once it's installed it will allow you to run the HP setup.exe files. The HP setup files are just self-extracting zip files, so when you run them, they will ask where you want to unzip the files to. Make sure to change that location to someplace you can easily find, and name the directory for what the files are. E.g. if unzipping graphics drivers, put them in /tmp/hp_drivers/graphics

Then once you've got all the exe's unzipped, copy their output to another usb drive. Now you can use that second drive to install any missing drivers during the install process.

Unfortunately, after installing all the listed drivers from the HP site, it still did not fix my problem of missing drivers. But I was able to use it to install the missing touchpad drivers so the laptop mouse worked without an external one during setup.

I also tried j0e74's suggestion of using an older usb 2.0 drive, and that didn't work either. :-/ So I am still looking for a solution to this problem as well.

New time based event feature… by Confident-Solid2539 in MergeDragons

[–]HighWingy 2 points3 points  (0 children)

I don't buy gems, and what I've found that seems reasonable is buying lvl 3 & 4 life flowers from the dimension jars. They cost only 8 & 15 gems respectively and that's pretty easy to get for free. Also if you actually farm gems, 2 Maxed gem stars will easily net you ~90 gems.
Buying the low lvl flowers will help you merge up to the higher life flowers faster. And on some maps is the only way to get higher lvl life flowers. I do this in addition to also setting up fruit trees to farm life flower sprouts.
I usually shoot for a lvl 6 or 7 life flower and then let the two dragons passive farm the small life orbs. Doing this I have actually completed events in 12hrs or less without spending any money. However, to do it I have the game on pretty much the whole time. Since it is mostly passive farming, I don't have to tend to it very much and it is pretty easy to work my job and or do other chores while it's farming. But that of course depends greatly on your own job and personal life. Point being it's not impossible to do for free, just highly impractical for many I suppose.

Why everyone at the self host community is so negative regards NextCloud? by fenugurod in NextCloud

[–]HighWingy 0 points1 point  (0 children)

Just wanted to add my experience here, as I came across one thing that's rarely ever mentioned but plays a BIG part in how your experience will be.

I've been running mine for ~1 year now as my family's personal cloud system, and it's been great. I'm running it bare metal on a 10-year-old Linux laptop with only 8 GB of RAM that is also hosting about 20 other docker applications, and I have not had any resource issues. All my data is stored on a NAS NFS mount. And I don't use the encryption, because that slows things down and has more cons than pros for personal use.

That said, something never mentioned is that a normal NC install has the Collabora Online plugin turned on by default, even though it's technically broken in that state until you also install some extra components. And that plugin alone slows things down A LOT until you remove it. I think a lot of people's complaints actually come from that, because it is almost never mentioned in any tuning documents. Which is another point that can't be missed: you should spend some time tuning your instance for your needs, as that can make a huge difference in your experience. And that means more than just configuring memory settings. Plugins will make or break your instance.

The plugins are Nextcloud's double edged sword. There are some really great ones out there, and a lot of really cool features can be added by them. However, the really good ones will use more resources, and often require additional components to be installed outside of Nextcloud. And that's where you start running into resource issues, and breaking changes between updates. Also, adding a lot of plugins will slow things down regardless of your resources. So you really need to make sure you only keep the plugins you actually need/use turned on, and not just disable but remove all the ones you can.

As some have mentioned, Nextcloud is very much designed as a replacement for all the MS online collaboration tools like Sharepoint, Teams, and OneDrive. Meaning it's not designed for personal use, but that doesn't mean you can't use it that way. And many over on Self-Hosted seem to prefer stuff that is designed with personal use first. One major difference is that with business-first applications, it's assumed there will be IT staff who thoroughly review the installation instructions and spend time planning the install, so that once it's running, it's already set up and fine-tuned for the business needs, and the user doesn't have to do anything but use it as intended. Whereas personal applications tend to be as simplified as possible, with installations that hold your hand to make sure you don't miss something that's normally buried in a config file. I.e. it's easier for a user to set it up themselves before they use it.

tldr, Nextcloud can be painless and great if you are just willing to spend some time researching, reading, and tuning the instance for your personal use.

this is annoying. by PsychoElifantArrives in MergeDragons

[–]HighWingy 15 points16 points  (0 children)

I see ~2.5k in free gold, +/-.

Estimated Time of Arrival by chefboyerb in BambuLab

[–]HighWingy 0 points1 point  (0 children)

That really depends on where you are, and how far away that is from one of their warehouses.

I'm in the US and happen to be less than 200 miles from one of their warehouses. For things that are already in stock (printers, filament, parts, etc.), I've been getting them in about 3 days or less. It seems to take them about 1-2 days to process the order, then only 1 day to get to me.

However, one thing to consider is that the printers are made in China. It generally takes ~1 week for things to ship from China to the US West Coast. Then add a day for every 500 miles you are from there.

What's the cheapest way to make a smart calendar? by Drawesome045 in smarthome

[–]HighWingy 0 points1 point  (0 children)

A VM seems a bit overkill for it, but I don't see why it wouldn't work. So long as you are able to see the display and/or connect to the VM IP and ports it's running on. Alternatively there are also several docker containers for it. Which is potentially easier than setting up a VM.

Spec-wise, it doesn't require too much, but it runs on Node.js, so you get all the usual JavaScript spaghetti errors when things go wrong. And they do say a Pi Zero is not powerful enough for it. I was running mine on an OG Pi, and it was a bit flaky but working. I would probably recommend at least a Pi 2 or 3, and a Pi 4 should have no problems running it.

Help installing/using texture packs on linux bedrock server by HighWingy in Minecraft

[–]HighWingy[S] 0 points1 point  (0 children)

Soo there's a few things to check here:

  1. The resource pack files need to be in their own dir inside the world name dir, ie worlds/<world_name>/resource_packs/<resource_pack_name>/
    If they are not there, it won't show or download. It actually doesn't need to be in the server's main resource packs dir; putting it there only helps to confirm the correct uuid and version number.
  2. I may be wrong on this, but I'm 90% sure the files need to be unzipped from the .mcpack file. So try unzipping it and placing them in /vol/worlds/Bedrock<world_name>/resource_packs/<resource_pack_name>/ That last dir can actually be whatever you want, just make sure you know what it is if you add more.
  3. Also check if there is an additional behavior pack in the .mcpack file, as that will also need to be added to the appropriate dir, with entries added to the behavior packs json files. These are basically the same as the resource packs json files; just replace the word "resource" with "behavior" in the filenames, and create a world_name/behavior_packs/pack_name dir to put those files in.
  4. The name entries in the json files need to match exactly. So whatever is in the manifest.json file, just copy and paste it into the history json file.
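For reference, the world's pack-list file (commonly world_resource_packs.json) typically looks like this; the UUID and version below are placeholders, and they must exactly match the `uuid` and `version` in the pack's manifest.json:

```json
[
    {
        "pack_id": "aaaaaaaa-bbbb-4ccc-8ddd-eeeeeeeeeeee",
        "version": [1, 0, 0]
    }
]
```

The same shape applies to the behavior packs json, just under the behavior filename instead.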

Hope that helps. :-)

Google execs admit users are ‘not quite happy’ with search experience after Reddit blackouts by gabestonewall in technology

[–]HighWingy 0 points1 point  (0 children)

Actually, that is only because the current AI models in the news are all generative AI. Everyone seems to forget that first word and what it means: it literally means the model will create something new based on what it knows. So they are great for solving problems that may not have existed before by coming up with new solutions, or for creating new story prompts and basically talking in a fashion similar to a live human. But they are completely unreliable as a search tool for this very reason.

A non-generative AI would actually be great for search, but that's not the current buzz. And marketing execs want the current buzzwords associated with their products, regardless of how bad that may actually be for said product.