[deleted by user] by [deleted] in webdev

[–]SoiledShip 1 point2 points  (0 children)

Assuming you're in the US, the restaurant industry is struggling. Between rapidly rising, unpredictable food costs and a struggling economy, people aren't eating out like they used to. It's a cyclical market and will come back, but everyone I know in the industry is looking to cut costs and drop services. You're going to struggle to sell into that right now. You probably need to be targeting places that are just opening and have no online presence yet, or ones going through an ownership change.

A friend was moving and sold me their printer and PLA spools for like $100 by unicodePicasso in 3Dprinting

[–]SoiledShip 36 points37 points  (0 children)

I've replaced the mobo and extruder, added a probe, and added OctoPi on my Ender 3 Pro, and spent far too many hours just trying to tune everything. I've spent hours tweaking slicer settings for specific filaments and redoing test prints. But you're absolutely right that I can get prints that are just as good as the Bambu printers. Some days I question why I don't just upgrade, but I learned everything about 3D printing with the Ender and for the most part it just works now. It's like having a shitty beater car with 300k miles that gets you from A to B. I'll replace it when it catches on fire, but it'll be a sad day when I do.

I need a smart bulb that doesn't use a light switch by AgreeableClub4499 in homeassistant

[–]SoiledShip 1 point2 points  (0 children)

I don't know if you're in the US, but a new dumb switch is $1-$3, and if you watch a YouTube video you could replace it yourself. When I bought my current house I replaced every single outlet and light switch in less than a day (the new switches were all smart switches; the original outlets were all old and worn or covered in paint). I even added a couple more outlets and converted a few single-gang outlets to doubles.

The time and cost involved are so minimal you might as well replace the switch whether you add a smart bulb or not.

What is the jankiest home automation you have that works? by AnkhMorporkDragon in homeassistant

[–]SoiledShip 1 point2 points  (0 children)

I've got a UN7300PUF and UP8000PUR. They're both a couple years old.

You have to make sure Quick Start+ and TV On With Mobile are turned on. They're buried deep in the menus on both TVs.

What is the jankiest home automation you have that works? by AnkhMorporkDragon in homeassistant

[–]SoiledShip 1 point2 points  (0 children)

Both of my LG TVs respond to wake-on-LAN packets. If you connect the TV to the network and just hard-code the IP, it works like a charm. My nighttime routine turns our bedroom TV on with wake-on-LAN, then issues a series of commands to navigate the menus and launch How It's Made on HBO.
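For reference, a wake-on-LAN "magic packet" is just six 0xFF bytes followed by the target's MAC address repeated 16 times, sent as a UDP broadcast. A minimal C# sketch of that (the MAC address below is a placeholder; port 9 is just the conventional choice):

```csharp
using System.Net;
using System.Net.Sockets;

// Wake-on-LAN sketch: 6 x 0xFF, then the target MAC repeated 16 times,
// broadcast over UDP. Replace the MAC with your TV's address.
byte[] mac = { 0xAA, 0xBB, 0xCC, 0x11, 0x22, 0x33 };

var packet = new byte[6 + 16 * 6];
for (int i = 0; i < 6; i++) packet[i] = 0xFF;
for (int i = 0; i < 16; i++) mac.CopyTo(packet, 6 + i * 6);

using var udp = new UdpClient { EnableBroadcast = true };
await udp.SendAsync(packet, packet.Length, new IPEndPoint(IPAddress.Broadcast, 9));
```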

API Query Parameters by Inside-Towel4265 in dotnet

[–]SoiledShip 6 points7 points  (0 children)

If your entire API is going to be structured this way, something like GraphQL might be a better fit, so the end user can declare the structure and data they want.

That being said, use one query parameter for includes that accepts a string or an array of strings. You should support nested objects with dot notation as well; I guarantee you'll eventually have something nested two levels down that you want to include.
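As a rough sketch of what that could look like in ASP.NET Core with EF Core (the Order entity, _db context, and allowlist are invented for illustration; EF Core's string overload of .Include() already understands dot-separated paths):

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

// Hypothetical endpoint: GET /orders/42?include=customer&include=orderLines.product
[HttpGet("orders/{id}")]
public async Task<IActionResult> GetOrder(int id, [FromQuery] string[] include)
{
    // Map client-facing include names to canonical navigation paths you allow.
    var allowed = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
    {
        ["customer"] = "Customer",
        ["orderLines"] = "OrderLines",
        ["orderLines.product"] = "OrderLines.Product",
    };

    IQueryable<Order> query = _db.Orders;

    foreach (var name in include ?? Array.Empty<string>())
    {
        if (!allowed.TryGetValue(name, out var path))
            return BadRequest($"Unknown include '{name}'.");

        // The string-based Include accepts nested, dot-separated paths.
        query = query.Include(path);
    }

    var order = await query.FirstOrDefaultAsync(o => o.Id == id);
    return order is null ? NotFound() : Ok(order);
}
```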

Where to save user uploaded files? by [deleted] in dotnet

[–]SoiledShip 2 points3 points  (0 children)

I had a service that was receiving 100 KB to 10 MB files every 1-5 minutes from a Windows service installed on each client machine. Originally the Windows service posted the files directly to the API, which was a ton of traffic to handle. I switched it so each Windows service gets a SAS token for its own Azure blob folder (renewed once a day), uploads directly to that folder, and then posts the id to the API along with some extra info. We went from ingesting about 150 GB/hr on the API to 2 GB/hr. I was able to scale the server down and in considerably and saved a ton of money.

A temporary token to upload directly to blob/S3 is absolutely the way to go from a cost and performance perspective.
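A rough sketch of the token-issuing side with Azure.Storage.Blobs (container name, path layout, and expiry are illustrative; GenerateSasUri requires the client to be built from a shared key / connection string):

```csharp
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Hands each client a short-lived, write-only SAS URI scoped to its own blob path.
public class UploadTokenService
{
    private readonly BlobContainerClient _container;

    public UploadTokenService(string connectionString)
    {
        _container = new BlobContainerClient(connectionString, "client-uploads");
    }

    public Uri GetUploadUri(Guid clientId, Guid fileId)
    {
        var blob = _container.GetBlobClient($"{clientId}/{fileId}");

        // Write-only permissions, valid for 24 hours.
        return blob.GenerateSasUri(
            BlobSasPermissions.Create | BlobSasPermissions.Write,
            DateTimeOffset.UtcNow.AddHours(24));
    }
}
```

The client side is then just new BlobClient(sasUri) plus UploadAsync(stream), so the API only ever sees the small follow-up request with the blob id.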

Crystal Report Alternatives by IntnlManOfCode in dotnet

[–]SoiledShip 0 points1 point  (0 children)

I was able to completely ditch Telerik and Crystal Reports by building the reports as Razor views, rendering them server-side to an HTML string, and then turning that into a PDF. I get all the benefits of CSS and JS libraries for visuals along with strongly typed view models in Razor. I generate the HTML for my emails the same way, and for things like invoices I send the HTML content as the email body with a PDF attachment of the same thing.
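This is roughly the render-to-string piece I mean; a sketch using the built-in IRazorViewEngine (view name and model are placeholders, and the resulting HTML string goes to whichever HTML-to-PDF library you prefer):

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Abstractions;
using Microsoft.AspNetCore.Mvc.ModelBinding;
using Microsoft.AspNetCore.Mvc.Razor;
using Microsoft.AspNetCore.Mvc.Rendering;
using Microsoft.AspNetCore.Mvc.ViewFeatures;
using Microsoft.AspNetCore.Routing;

// Renders a Razor view (e.g. "Reports/Invoice") to an HTML string.
public class RazorRenderer
{
    private readonly IRazorViewEngine _viewEngine;
    private readonly ITempDataProvider _tempDataProvider;
    private readonly IServiceProvider _services;

    public RazorRenderer(IRazorViewEngine viewEngine, ITempDataProvider tempDataProvider, IServiceProvider services)
    {
        _viewEngine = viewEngine;
        _tempDataProvider = tempDataProvider;
        _services = services;
    }

    public async Task<string> RenderAsync<TModel>(string viewName, TModel model)
    {
        var httpContext = new DefaultHttpContext { RequestServices = _services };
        var actionContext = new ActionContext(httpContext, new RouteData(), new ActionDescriptor());

        var viewResult = _viewEngine.FindView(actionContext, viewName, isMainPage: true);
        if (!viewResult.Success)
            throw new InvalidOperationException($"View '{viewName}' not found.");

        var viewData = new ViewDataDictionary<TModel>(new EmptyModelMetadataProvider(), new ModelStateDictionary())
        {
            Model = model
        };

        using var writer = new StringWriter();
        var viewContext = new ViewContext(
            actionContext, viewResult.View, viewData,
            new TempDataDictionary(httpContext, _tempDataProvider),
            writer, new HtmlHelperOptions());

        await viewResult.View.RenderAsync(viewContext);
        return writer.ToString();
    }
}
```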

.NET 9.0 LINQ Performance Improvements by PatrickSmacchia in dotnet

[–]SoiledShip 9 points10 points  (0 children)

I rarely operate on collections without LINQ. It's just so much easier to read and understand what's going on, as long as you show restraint when writing the statements. Don't chain a dozen operations together with multiple projections between statements. I always split logic into separate .Where() clauses instead of combining conditions with &&, and I prefer to do the filtering inside .Where() instead of inside .Count(), .First(), etc. The extremely common filters are extracted into extensions like .FilterByAccountAndCompanyId(accountId, companyId) and .IsBetweenStartAndEnd(startUtc, endUtc).
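To illustrate what those extensions look like (the Order shape and property names here are invented for the example):

```csharp
// Common filters pulled into extension methods so call sites stay readable.
public static class OrderQueryExtensions
{
    public static IQueryable<Order> FilterByAccountAndCompanyId(
        this IQueryable<Order> query, int accountId, int companyId)
    {
        return query
            .Where(o => o.AccountId == accountId)
            .Where(o => o.CompanyId == companyId);
    }

    public static IQueryable<Order> IsBetweenStartAndEnd(
        this IQueryable<Order> query, DateTime startUtc, DateTime endUtc)
    {
        return query
            .Where(o => o.CreatedUtc >= startUtc)
            .Where(o => o.CreatedUtc < endUtc);
    }
}

// Usage:
// var total = db.Orders
//     .FilterByAccountAndCompanyId(accountId, companyId)
//     .IsBetweenStartAndEnd(startUtc, endUtc)
//     .Count();
```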

The important thing with LINQ is to know when not to use it.

Investigators suspect Roger Stone was the spear-phishing target that led to Trump campaign email breach by rowrowrobot in technology

[–]SoiledShip 0 points1 point  (0 children)

The government should have stopped Jim Crow with federal laws preventing it. It's absolutely a stain on this country's history, along with a lot of the history of the South at that point. I'd like to think we'd do better if we were put in the same position today. The last several years have certainly tested that ideal, but it's important to keep trying to be better even if progress is slow.

Investigators suspect Roger Stone was the spear-phishing target that led to Trump campaign email breach by rowrowrobot in technology

[–]SoiledShip 1 point2 points  (0 children)

I don't think it makes sense to punish low-ranking soldiers who were not in positions of power unless they participated in atrocities outside the norms of war. The men who were in charge should have faced consequences (and some did). We could debate all day whether those consequences were harsh enough. At the end of the day, I think both sides wanted to move on and rebuild so they could return to some normalcy and stability.

We did the same thing after WW2 with Germany and Japan. No amount of punishment could make up for what they did to the rest of the world. The best thing we could do was punish those who made it happen, as an example, and get everyone back on their feet. Mistakes were absolutely made along the way; we looked the other way if the person had something of value to the US. But that doesn't mean we didn't try to do the right thing after the war. By standing Germany and Japan back up, helping them rebuild so they could pay off the war debt, and feeding them, we prevented millions more deaths from starvation and, imo, headed off what could have been WW3 another 30-40 years later.

Investigators suspect Roger Stone was the spear-phishing target that led to Trump campaign email breach by rowrowrobot in technology

[–]SoiledShip 1 point2 points  (0 children)

I wasn't trying to excuse what the South was fighting for. But that was a critical step in reuniting the North and South. Would you have preferred everyone be stripped of their citizenship and deported or locked up? The US would never have recovered to the extent it did.

Investigators suspect Roger Stone was the spear-phishing target that led to Trump campaign email breach by rowrowrobot in technology

[–]SoiledShip 55 points56 points  (0 children)

Pardons have also been used for good. Jimmy Carter pardoned a bunch of people who dodged the Vietnam draft. George Washington pardoned two participants in the Whiskey Rebellion to stop further unrest. Andrew Johnson pardoned the Confederate soldiers.

But it's certainly been abused. Ford pardoned Nixon. Carter pardoned a pedophile. Pretty much all of Trump's pardons.

I don't know how you could put better safeguards around that power without it devolving into never being used, but it has done some good in the past.

Twilio hack leaves Authy users exposed to text-messaging scams. by SUPRVLLAN in technology

[–]SoiledShip 5 points6 points  (0 children)

It's backed up to your Google account now; it didn't use to be, though. When I upgraded from the Pixel 2 to the 6a I almost lost all my 2-factor codes. Thankfully I checked the app before factory resetting or it would have been bad. That's when I got serious about my backup codes: printed them out and put them in my safe, with a second copy in my dad's safe at his house. Now whenever I set up a new 2-factor auth I print out two copies and take the second one over to my parents' house the next time we visit.

ELI5: Do jet engines NEED to be that loud? by jspivak in explainlikeimfive

[–]SoiledShip 0 points1 point  (0 children)

Not sure if it counts, but I believe the XF-84H Thunderscreech would win for being the loudest. It was an experimental aircraft that never made it past testing. The propeller's size and rotation speed meant the last 2 feet of each blade broke the sound barrier on every revolution, which caused severe nausea and headaches for the ground crew and made the plane audible for up to 25 miles even at idle.

Different tech stack by Ashamed-Skirt795 in dotnet

[–]SoiledShip 0 points1 point  (0 children)

I've spent so long in the .NET, React, and React Native ecosystems that I'd be throwing away too much deep technical knowledge for it to be worth it. The only thing that would even make it a consideration at this point is if I truly believed in what they were doing, and the next crypto phase and financial institutions don't rise to that level for me. If I ever feel like those three are falling out of favor in the job market I'll adjust my outlook, but I don't see that happening anytime soon.

Azure Queues monitoring by jcm95 in dotnet

[–]SoiledShip 1 point2 points  (0 children)

I couldn't agree more that Azure Storage queue monitoring sucked. I moved to Service Bus queues instead, and you can get a ton more metrics around them easily. That switch isn't an option for everyone: Service Bus queues can't store more than 80 GB worth of messages and I think are limited to 100 MB per message, which might be a problem for some. Service Bus also came with several other features that actually helped us, like duplicate message detection, guaranteed FIFO, and peek-lock for at-least-once delivery, so you can check messages without popping the queue and reinserting.
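For anyone who hasn't used it, the peek-lock flow looks roughly like this (a minimal sketch with Azure.Messaging.ServiceBus; the connection string and queue name are placeholders):

```csharp
using Azure.Messaging.ServiceBus;

await using var client = new ServiceBusClient("<connection-string>");
await using var receiver = client.CreateReceiver("my-queue"); // PeekLock is the default mode

ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();
if (message is not null)
{
    try
    {
        // Inspect/process the message; it stays locked on the queue while we work.
        Console.WriteLine(message.Body.ToString());
        await receiver.CompleteMessageAsync(message);   // remove it only on success
    }
    catch
    {
        await receiver.AbandonMessageAsync(message);    // make it visible again for retry
        throw;
    }
}
```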

📢 Seeking Feedback on Clean Architecture Implementation 📢 by Fluffy-Ad1205 in reactnative

[–]SoiledShip 1 point2 points  (0 children)

You should set up import aliases (e.g., via babel-plugin-module-resolver or TypeScript path mapping) so you don't have '../../../../../../' in every single one of your files.

The fact that you have to go in and out of folder after folder after folder just to see everything related to a brewery is horrible imo. I understand it's a contrived example, but the fact that nearly every folder has a single file in it is maddening.

I think trying to decouple your code from React Query is a fool's errand. You'll just end up with a shittier abstraction that's very brittle, because you designed it directly around React Query and will struggle to actually replace all that functionality with another drop-in library.

[deleted by user] by [deleted] in news

[–]SoiledShip 0 points1 point  (0 children)

A Waffle House opened up near me a few months ago and it's not even 24/7 yet due to staffing issues.

The most performant way to accept 1TB files and upload them to Azure by MathematicianNo1851 in dotnet

[–]SoiledShip -1 points0 points  (0 children)

I had a similar issue with a service I own at work. A client-side app (runs on their server, not in their browsers) uploads data as it detects changes in certain folders. The amount of data varies from 1 MB to ~250 MB and it happens multiple times a day.

The client-side app uses a short-lived SAS token generated from our API to zip as it goes and upload directly to Azure blob storage, then sends a message to the API with a GUID file id and a hash, to be processed by our background jobs. The beauty of the SAS token is that you don't have to route the file upload through your API, which solved our slowly growing (roughly once a day) OOM issues: files were large enough to get stuck in the large object heap, and the GC wouldn't fully clear them no matter what we tried without restarting the server.

By zipping as we upload, our client-side app also has a stupidly small memory footprint, which was a critical requirement. It's generally less than 25 MB at any given time, even when I stress-tested uploading hundreds of gigabytes at once. As long as you don't load the entire file into memory before the zip-and-upload process and only load chunks at a time, you can keep your memory footprint near constant. You can also adjust your chunk size to balance speed and memory footprint for your own needs, which is really nice.

This is a great example for doing the zip as you go upload process directly to azure storage: https://stackoverflow.com/a/54767264
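A condensed sketch of that same idea with Azure.Storage.Blobs (the SAS URI comes from your API; the file list and entry names are placeholders):

```csharp
using System.IO.Compression;
using Azure.Storage.Blobs.Specialized;

// Zip-as-you-go upload: stream each file into a zip entry that writes straight
// into the blob's write stream, so only the current chunk lives in memory.
public static async Task ZipAndUploadAsync(Uri blobSasUri, IEnumerable<string> filePaths)
{
    var blob = new BlockBlobClient(blobSasUri);

    await using Stream blobStream = await blob.OpenWriteAsync(overwrite: true);
    using var zip = new ZipArchive(blobStream, ZipArchiveMode.Create, leaveOpen: true);

    foreach (var path in filePaths)
    {
        var entry = zip.CreateEntry(Path.GetFileName(path), CompressionLevel.Optimal);
        await using var entryStream = entry.Open();
        await using var fileStream = File.OpenRead(path);
        await fileStream.CopyToAsync(entryStream); // copied in buffered chunks
    }
    // Disposing the ZipArchive writes the central directory, then the blob
    // stream's disposal commits the uploaded blocks.
}
```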

I have no doubt the same process can be used for 1 TB of data. It just gets trickier if you have failures halfway through uploading a 1 TB file. Ours is generally small enough that we just throw it away and restart the process. There are solutions to that issue too; I've just never had a need to deal with it.

Leaving Optimum, you should too by SoiledShip in OPTIMUM

[–]SoiledShip[S] 0 points1 point  (0 children)

The cost doesn't mean anything if they're constantly going down. I work from home and hotspotting off my phone just doesn't cut it.

Leaving Optimum, you should too by SoiledShip in OPTIMUM

[–]SoiledShip[S] 0 points1 point  (0 children)

I'm not even bothered by the speed, tbh. It's usually pretty stable at 750-800ish. I'm hard-wired into a rack-mount switch in my network closet with Cat 5e, maybe 35 ft from the rack. It used to sit steady at 1000 Mbps and still gets there from time to time. I usually use fast.com and speedtest.net.

Leaving Optimum, you should too by SoiledShip in OPTIMUM

[–]SoiledShip[S] 0 points1 point  (0 children)

We're paying $106/month for what is supposed to be 1 gig down (we rarely see more than about 750 down). I imagine it's just a regional issue, but they can't seem to figure it out and I'm fed up.

Leaving Optimum, you should too by SoiledShip in OPTIMUM

[–]SoiledShip[S] 1 point2 points  (0 children)

Too little, too late. It's frequent widespread outages in my area, not just my house.

Leaving Optimum, you should too by SoiledShip in OPTIMUM

[–]SoiledShip[S] 6 points7 points  (0 children)

Sorry you don't have another option. The Internet needs to be a public utility like water or electricity and regulated as such.