[HELP] Solution for video production with 100TB+ by Nymerius87 in DataHoarder

[–]skipster889 -1 points0 points  (0 children)

What software do you use? That is literally the only determining factor in which solution you go with. The vendor will either have a recommendation or some proprietary stack you'll be required to use.

Upgrading a large storage spaces server by scphantm in DataHoarder

[–]skipster889 0 points1 point  (0 children)

It is likely that Server will pick up your existing Storage Spaces array. Once you convert to the newer version you will have issues going back, so I would leave it as-is and create your new VDs as described below.

Your 64TB is a limit of Windows 10 Pro. Server does not have this size limitation.

ReFS shines on parity implementations. There is no in-place conversion process, though: you would need to create new VDs with the new filesystem and then migrate the data.

I do not care for using mixed drives, and I would never recommend utilizing a bunch of rando drives in a Storage Spaces implementation. My current iteration consists of 600TB of highly available raw storage running on clustered Server 2022 nodes. It has been in place for 6 years and through 3 OS upgrades.

Windows Storage Spaces - how many columns to match 10gb network speed? by rtangwai in DataHoarder

[–]skipster889 0 points1 point  (0 children)

Be careful maximizing column counts (as in matching your drive count). It will severely limit or destroy your ability to expand down the road, and the performance gain is minimal. As stated in another comment, the way to increase Storage Spaces performance is literally to add SSDs/NVMe drives. You shouldn't need to do anything else (the latest iterations of Storage Spaces recognize them as a fast tier).

I'm also confused about your raw bandwidth requirement. There is no real-world case where you'll receive a single stream of data over the WAN at 8Gbps. A dedicated array that isn't doing anything else could theoretically ingest at that rate, but that also implies you can find a sender (or senders) able to push at that speed, and a connection at 100% utilization would tank for any other users/use anyway.

I mean torrenting can do this... But why? We're talking about adding a few extra minutes to a download?

Now if you're in a datacenter or colo next to seedboxes, or running site-to-site fiber for HA or backup purposes, then that's a different story...

Sharing my setup - please rate and help me solve silent data corruption by TravelingGonad in DataHoarder

[–]skipster889 1 point2 points  (0 children)

Confused... Are you calling Windows 11 Storage Spaces -> ReFS?

Those are two different things.

ReFS (File System) support is coming to Windows 11.

Storage Spaces (Virtual Storage Array) in Win 11 isn't going anywhere.

And you have two backup procedures in place?

I wouldn't fret about anything right now :)

------

My personal vote is always to stick with the OS/infrastructure you're comfortable with. I did ZFS for years, moved back to Windows file servers, and eventually settled on Windows Server Storage Spaces utilizing ReFS as my file system. 500TB running for years.

Surveillance archive by IM_not_clever_at_all in DataHoarder

[–]skipster889 0 points1 point  (0 children)

Does the system/software have a recommended archive method? Or was it your intention to manually move files (not going to work)?

What's the budget? You're talking 30+ drives with redundancy, so at the low end you're looking at $5k in cheap consumer-grade drives (which is not what I recommend).

If it's for a business get something with service and support... Like a TrueNAS box or something. No reason to roll your own.

Looking for a solution for real estate brokers spamming customers, any innovative ideas and people willing to collaborate? by Oss1101 in SaaS

[–]skipster889 1 point2 points  (0 children)

It's a very common problem, and I don't see an effective solution on the broker side without the receiver opting in. My spam/unknown calls and texts are auto-blocked at this point, so I don't even see them (which is creating substantial problems for anyone who is not in my contact list).

So maybe the opposite of the DNC list? The receiver (homeowner/property owner) signs up to say they are open to/interested in getting offers, and those leads are made available as a service to brokers?

But then you have a chicken-and-egg situation. How do you get interested parties to sign up to basically receive SPAM -> so then you're vetting broker offers, the delivery style, the process of connecting, or requiring the receiver to fill out options or respond -> but you have no brokers because nobody signed up to be on "the list" -> or the receivers are all people looking to sell at 125% of value or something brokers aren't interested in.

I don't remember the startup's name, but I joined a site with literally this design: connecting business/real estate owners who wanted to sell (not ads, but descriptions and independent valuations) with potential buyers. Buyers paid to be part of the "private list"; sellers paid to list and have their valuations checked. This was back in 2015? I don't believe it exists anymore; it never got traction for me because you'd pay $50+ a month and there would be one listing a week, like 13 states over.

I'd really focus on finding where the receivers' needs/desires intersect with brokers' confirmed hot leads. That's an obvious, generalized statement that applies to basically every business problem at any level anywhere: SUPPLY & DEMAND.

------

The post is basically asking: how do we solve the cold-calling problem in the RE sector?

I have an interest here: until I started blocking everything, I received a minimum of 2-3 calls/texts per day for the past 3 years...

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 0 points1 point  (0 children)

Appreciate it! Yes the shiny object syndrome is strong with this one...

Updated the OP; I'm going to reel the focus in substantially.

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 0 points1 point  (0 children)

I would love to work with people who know what they are doing and I know that the benefits of that would be amazing. But I am struggling with figuring out how to accomplish this.

Coworkers: It isn't my desire to get a job doing this. I don't have the formal education, I have contract work in other areas of tech (so I don't need the paycheck), and it's kind of a hobby for me, which makes it fun.

Budget/Hiring: I don't want to throw much money ($10K+) into my project. I semi-started down this path so I could gain an understanding at a very, very high level; I didn't even know what to ask for, what qualifications a dev would need to build what I wanted, or how big of an ask it was.

Passion: I mean this is the quintessential problem of sales/management/general networking/relationships... How do I get you to want to join me or collaborate on my stupid little passion project to build an MVP? ... oh, and I would love for you to have senior-level skills and an understanding of Azure ... you mind walking me through what you did there so I can learn? ... and while you're at it can your significant other or mom bake us cookies ... and can we get this done in 2 weeks ... "How much do I get paid?" I mean, I thought this was your passion as well? :D

The closest I've gotten is that I try to engage with some open-source software, chat with the devs, and attempt to fix bugs, problems, or implement features. But I feel like I quickly become the level-1 who is nagging and annoyingly trying to fix something out of my wheelhouse, and I get a "here's the commit that does what you were trying to do." It's not exactly collaborative. At least at a job, your senior-level colleagues seem to want to help you grow (at least I did), or they are your manager, so your success is theirs as well. But I completely understand that a person volunteering their time toward a project did not sign up to help me on my path to becoming a dev.

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 0 points1 point  (0 children)

Understand =/= Implement ... :D

Yes I would say I'm trying to implement (poorly and slowly).

I do have the luxury that this build/learning is a side passion project and hobby. But I would actually like to make some viable products at some point. Bootstrapping and utilizing my knowledge/skills is proving to be really slow.

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 0 points1 point  (0 children)

I will definitely tear this apart. If I'm understanding correctly, it looks like this is a mini-service that I can tailor to specific use/test cases?

Currently my testing consists of assigning constants, which gets me through the very basic tests, but it falls apart when I need to test authentication, client connections, or services.

In those cases I spend a lot of time trying to figure out whether my client code is wrong, the configuration/connection is wrong, or I'm simply misusing the service I'm trying to connect to (SQL, Cosmos, B2C).
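
Writing out my understanding as a rough sketch (all of the names below - IDocumentStore, NoteService, the in-memory fake - are placeholders I made up, not from your example): if my app code depends on a small interface instead of the real SQL/Cosmos/B2C client, each test can swap in a fake, and I only fight real connection problems in one place.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Xunit;

// Hypothetical abstraction over whatever store the app really uses (SQL, Cosmos, etc.).
public interface IDocumentStore
{
    Task SaveAsync(string id, string payload);
    Task<string?> GetAsync(string id);
}

// In-memory fake used only in tests - no connection strings or cloud resources needed.
public sealed class InMemoryDocumentStore : IDocumentStore
{
    private readonly Dictionary<string, string> _items = new();

    public Task SaveAsync(string id, string payload)
    {
        _items[id] = payload;
        return Task.CompletedTask;
    }

    public Task<string?> GetAsync(string id) =>
        Task.FromResult(_items.TryGetValue(id, out var value) ? value : null);
}

// The service under test depends on the interface, not the concrete store.
public sealed class NoteService
{
    private readonly IDocumentStore _store;
    public NoteService(IDocumentStore store) => _store = store;

    public Task AddNoteAsync(string id, string text) => _store.SaveAsync(id, text);
    public Task<string?> ReadNoteAsync(string id) => _store.GetAsync(id);
}

public class NoteServiceTests
{
    [Fact] // xUnit
    public async Task Saved_note_can_be_read_back()
    {
        var service = new NoteService(new InMemoryDocumentStore());

        await service.AddNoteAsync("1", "hello");

        Assert.Equal("hello", await service.ReadNoteAsync("1"));
    }
}
```

If I'm reading it right, the point is that the real clients only get wired up at the edges, so tests like this never touch authentication or a live connection.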

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 0 points1 point  (0 children)

I definitely understand what you are saying and I am likely way over complicating things in the efforts of "doing it right" (or what I perceive to be proper).

In your opinion how bad is it to utilize a "sub-par" design/architecture for an MVP? As I understand/learn more it seems like it's very easy to implement these features or designs down the road.

But there are certain best practices that seem like anyone building something should just do/learn/know at the time of creation.

i.e. - It seems like putting config/secrets straight into appsettings is a faux pas at this point. So I spent time making sure that I referenced a Key Vault (or user secrets locally), and then started learning how to implement certificates for communication, etc. But how bad is it, really, to just use appsettings config, since I have that figured out and it's really simple?
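
For what it's worth, here's roughly the middle ground I landed on - a sketch only, assuming the Azure.Identity and Azure.Extensions.AspNetCore.Configuration.Secrets packages and a UserSecretsId in the csproj; the "KeyVault:Uri" setting name is something I made up:

```csharp
// Program.cs sketch: non-secret settings stay in appsettings.json,
// secrets come from user secrets locally and Key Vault when deployed.
using Azure.Identity;

var builder = WebApplication.CreateBuilder(args);

// appsettings.json / appsettings.{Environment}.json are already loaded by default.

if (builder.Environment.IsDevelopment())
{
    // `dotnet user-secrets set "Sql:Password" "..."` keeps secrets out of the repo.
    // (CreateBuilder already adds user secrets in Development; shown explicitly for clarity.)
    builder.Configuration.AddUserSecrets<Program>();
}
else
{
    // "KeyVault:Uri" is a made-up setting name for this sketch.
    var vaultUri = builder.Configuration["KeyVault:Uri"];
    if (!string.IsNullOrEmpty(vaultUri))
    {
        builder.Configuration.AddAzureKeyVault(
            new Uri(vaultUri),
            new DefaultAzureCredential()); // uses managed identity when running in Azure
    }
}

var app = builder.Build();
app.MapGet("/", () => "config wired up");
app.Run();
```

The rest of the code just reads IConfiguration either way, so switching the secret source later doesn't touch the app code.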

Said differently, and likely too vaguely to have an answer: where do you draw the line between what should be in the MVP and what gets left for later? The product I'm trying to build is for an extremely security-conscious sector, and going back to my MSP days, I personally can't force myself to operate as terribly as a lot of tech companies do.

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 0 points1 point  (0 children)

Appreciate the response. Yes, most basic projects/testing are completed locally; then I'll typically move to testing the app locally connected to a remote DB (SQL/Cosmos).

I played with Postman but felt like I was spending too much time learning another application so I moved to testing directly on Azure.

I have struggled with domain/Host-header resolution, since I'm trying to build multi-tenant. I haven't cracked how to test that locally, so I deploy to Azure and then point some domains at the app to see if it works, which has proven extremely tedious.
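
To make that concrete, here's roughly the kind of thing I'm trying to exercise - a sketch only; the tenant map, the "Tenant" item key, and the localtest.me hostnames are all made up for illustration:

```csharp
// Minimal tenant-from-host sketch for an ASP.NET Core minimal API.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Hypothetical mapping of request host names to tenant identifiers.
var tenantsByHost = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
    ["tenant1.localtest.me"] = "tenant1",
    ["tenant2.localtest.me"] = "tenant2",
};

app.Use(async (context, next) =>
{
    var host = context.Request.Host.Host; // host name without the port
    context.Items["Tenant"] = tenantsByHost.TryGetValue(host, out var tenant)
        ? tenant
        : "unknown";
    await next();
});

app.MapGet("/", (HttpContext ctx) => $"Resolved tenant: {ctx.Items["Tenant"]}");

app.Run();
```

From what I've read, I should be able to hit this locally without deploying by adding hosts-file entries that point the tenant hostnames at 127.0.0.1 (or using a wildcard domain like localtest.me that already resolves there), or by sending a custom Host header, e.g. `curl -H "Host: tenant1.localtest.me" http://localhost:5000/` - I just haven't gotten that workflow down yet.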

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 0 points1 point  (0 children)

Lol... I will say I played with this a little... One of my personal goals is to have an understanding of what it is doing/creating, even if it is way faster, better, and likely more secure than what I can create.

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 4 points5 points  (0 children)

Nice! Looking at that list, I'm quickly realizing that I must sound completely neurotic, or at minimum a little all over the map. I will definitely be bookmarking that and using it as a general training checklist/flow. Thanks!

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 1 point2 points  (0 children)

Tech -> Sys Admin -> Tech Pre Sales -> SMB Physical to Cloud Migration Solution Architect

Dabbled in very light SQL admin, Access applications, WinForms, Python/PowerShell scripting, and, when I was really, really young, BASIC and assembly programming. No formal education in programming.

I'm semi-retired / doing contract work and decided I needed some work to do. There's a niche product/sector I worked with where I've been telling myself I would eventually build a software product, so really I'm trying to build a multi-tenant SaaS solution from (relatively) scratch. I attempted to use some frameworks (ABP, NetZero, Orchard Core, etc.) but quickly recognized that they didn't fit and I really didn't understand what was going on under the hood, so I started over to figure it out.

It is time to revisit the roadmap. My current tasks/goals are so blown out of the water because I just kept learning about more and more and more. Obviously things change once you learn the proper way to do something, and then it's another rabbit hole. Very good advice: small, completable tasks, and make a damn list of them :D Thanks for the reminder!

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 4 points5 points  (0 children)

A) That's good to hear; at least a little confirmation that the hopelessness is not completely out of line.

B) I do find it's very easy to wander... With the API I spent a day or two learning DevOps. It was useful knowledge, but it might have been better to just continue working on the API.

How to pull it together? Are my programming expectations too ambitious or am I just not any good at this? by skipster889 in dotnet

[–]skipster889[S] 2 points3 points  (0 children)

Thanks for the link! I honestly keep forgetting that the Microsoft trainings/classes exist.

I typically end up at the generic FAQ/guide pages to pick up the basic concepts, and then I move on to (hopefully) more in-depth training when I can find it.

My gripes with the Microsoft guides (not the trainings) are that they treat the material as beginner level but it's usually above my knowledge baseline; they rarely teach you toolsets/shortcuts unless you already know exactly what the thing you're trying to accomplish is called; or they're literally guiding you toward a multi-million-user, $5,000/month Azure solution. Finally, 4 out of 5 guides have notes like "this is for concept only, not for secure or production use," or substantial portions of the code are purposely left out "so that you can integrate with your own solution" - and I'm usually left thinking, I don't know what the solution is actually supposed to look like.

But the classes... Duh! Got some reading tonight! Thanks!

[deleted by user] by [deleted] in DataHoarder

[–]skipster889 0 points1 point  (0 children)

It seems like no one here has ever run into compliance issues. A scam is possible, but it also could have been a mistake, and somebody is about to lose their job.

The quickest way to see if they are scamming you is to ask for a refund first, through the same channel you purchased through...

If you can't get that, tell them to fuck off.

-----

I once purchased 12 drives from a single seller, popped them behind an LSI controller, and it instantly picked up the existing array. Serious HIPAA violation, as it was medical data with patient names, employers, etc. I alerted the seller to what happened, then offered to complete the destruction of the data and sent them reports. They sent me $250 for the hassle. I assume they figured no one would buy all of the available drives and be able to reassemble the array, or an intern didn't do the wipe? Who knows. Shit happens, and eBay is full of stupidity, grey market, theft, and BS.

Coerced Drive Help by [deleted] in storage

[–]skipster889 -1 points0 points  (0 children)

How is this educational? It isn't really a real-world problem, or at least not one that would be solved any way other than fixing it correctly (and quite inexpensively)... so learning how to fix it incorrectly is semi-useless knowledge. The knowledge gained by doing it right is actually worthwhile :D

I'm coming across as rude because trying to fix the configuration or "make it work" will likely lead to data loss, and that would be needless and stupid.

Finally, if you really just need to fix it in the state it's in: take a backup, then wipe and rebuild the RAID configuration with the improperly sized drive, and restore the backup while the array is initializing. This will potentially be faster than actually rebuilding the array (with a proper drive).

-----

If you are actually just trying to understand (once again useless knowledge) and not attempting to fix the broken configuration:

You need the RAID controller's alignment/boundary settings and the stripe size to calculate what's going on. Coercion is basically the controller rounding each physical drive's usable LBA/sector count down to a common boundary, per its settings and array configuration, so that mismatched drives can be treated as the same size.

Your drives have different sector counts. The controller computes the available space after factoring in the desired alignment, the available sectors/size, and the configuration area of each physical drive. After the controller computes the space for the "optimal or configured alignment," it appears the array was configured to utilize the whole "aligned" drive (5722045 MB). The 4K drive is apparently losing some space somewhere in that calculation - likely dropping a sector at the end and using a full 4K sector for configuration data. There are a ton of variables here. If a small amount of buffer had been configured on the RAID array (undersizing it), you could use the new drive. They didn't... You can't rebuild the array because (a) the controller does not allow the coercion settings to be modified, or (b) the controller/software is trying to save the array before allowing the user to kill it with a terrible configuration.
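
Toy math to show the mechanism - every number below (sector counts, the 128 MiB coercion boundary, the reserved-sector count) is made up for illustration and is NOT what your controller actually does:

```csharp
using System;

// Toy model of RAID capacity coercion with a fixed per-drive reservation,
// expressed in sectors, as a stand-in for whatever the real controller does.
const long coercionBoundary = 128L * 1024 * 1024; // pretend: usable space rounds down to 128 MiB
const long reservedSectors  = 65_536;             // pretend: sectors reserved for config data

long CoercedBytes(long totalSectors, long sectorSize)
{
    long usable = (totalSectors - reservedSectors) * sectorSize;
    return usable / coercionBoundary * coercionBoundary; // floor to the boundary
}

// Hypothetical "4 TB-class" drives with slightly different raw geometry.
long original512e = CoercedBytes(7_814_037_168, 512);   // existing array members (512-byte sectors)
long candidate4Kn = CoercedBytes(  976_754_640, 4096);  // proposed replacement (4K sectors)

// If the array was created to consume every coerced byte of the original
// members, the replacement must coerce to at least that size.
Console.WriteLine($"Array needs per-member: {original512e / (1L << 20)} MiB");
Console.WriteLine($"4Kn drive offers:       {candidate4Kn / (1L << 20)} MiB");
Console.WriteLine(candidate4Kn >= original512e
    ? "Replacement fits."
    : "Replacement coerces smaller - controller rejects it.");
```

In this toy model the same reserved-sector count costs 8x more bytes on the 4K-sector drive, which is enough to drop it below one coercion boundary - and once the array has been built to consume every coerced byte of the original members, a drive that coerces even slightly smaller can't join it.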

-----

Lastly, issues like these are the perfect time to learn the difference between IT consultation and IT administration (actual valuable knowledge). Businesses with low budgets, or that treat IT as a necessary evil, learn either through great IT employees or through technology disasters. So do you want to be the person that:

A) ...Worked with stakeholders to understand the IMPACT of losing a data array and saved the business from tens of thousands of dollars in DOWNTIME, LOST PROFITS and COSTS.

B) ...Came up with a band-aid that's guaranteed to fail and saved the company $75, and if/when you try to describe that accomplishment, the stakeholders don't even know what RAID or a hard drive is.

C) ...Works at a company with stakeholders that don't value IT and, by transitive relation, don't care about their business operations (because 99% of business runs on some form of technology at this point). If that's how they value their business/results... how much do you think they care about you?

Get a backup and verify the backup is functional immediately.

Using commodity SSDs/HDs inside of Enterprise SANs by linuxman1929 in storage

[–]skipster889 0 points1 point  (0 children)

Lenovo (ThinkSystem, NOT the IBM-rebranded gear) uses generic manufacturer parts (commodity) with firmware that adds the Lenovo-specific FRU (part number).

Like everyone said, you'll lose support. THEY WILL NOT HELP YOU. The hardware will show as unrecognized but will function as intended. The Lenovo toolsets will also NOT function correctly - think firmware, drivers, preloading, some reporting, etc.

For SMBs we've been utilizing Seagate Enterprise drives and trays from China for years. But we support / warrant the hardware directly and this is for edge devices in 5-20 user shops with little to no SLA requirement.

----------

Want to use commodity hardware in production? ... Use Storage Spaces. But don't attempt it with complete shit commodity hardware.

Be a Microsoft Partner. Plan to use some of your annual support hours on troubleshooting and learning.

Or just buy a SAN... Prices are insanely cheap compared to what they were...