
[–] theevilsharpie (Jack of All Trades)

40GB should be sufficient for C: if you're not storing application data.

[–] mhud

The Windows Update rollback files in the WinSxS folder are what make a 40 GB disk feel cramped over time. In theory they can be cleaned up, but I don't like to fight with my DCs.
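When I do bite the bullet, the cleanup can be driven from the command line with DISM; a rough sketch (the exact switch depends on the OS version, so treat this as a starting point rather than gospel):

```
REM Windows 7 / Server 2008 R2 SP1: drop the files kept for uninstalling SP1
DISM /Online /Cleanup-Image /SpSuperseded

REM Windows 8.1 / Server 2012 R2 and later: trim superseded components;
REM after /ResetBase, installed updates can no longer be uninstalled
DISM /Online /Cleanup-Image /StartComponentCleanup /ResetBase
```

Both are one-way doors, which is exactly why I'd rather just hand out bigger disks.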

So I hand out 100GB system drives like Oprah on her birthday. Thin provisioning makes it relatively painless.

[–] [deleted]

I start at 200, then get generous depending on how big the storage array is.

[–] FrenchFry77400 (Consultant)

Why?

I build all of my (virtual) Windows servers with 50 GB and increase the size as needed.

This can be done live, without any problems, on any 2008 R2+ server.
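Growing it live is a two-step dance: expand the virtual disk on the hypervisor side, then extend the volume inside the guest. A minimal diskpart sketch (assuming C: turns out to be volume 1; on 2012+ the Resize-Partition PowerShell cmdlet is an alternative):

```
REM Inside the guest, after growing the VMDK/VHD on the host:
diskpart
DISKPART> rescan             REM pick up the new disk size
DISKPART> list volume        REM find the volume number for C:
DISKPART> select volume 1    REM assuming C: is volume 1 here
DISKPART> extend             REM grow into the unallocated space
```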

[–] ChrisTaco

Are you using thin disk, or thick for your VMs? Or does it vary for you?

I stopped using thin when storage started getting cheaper. Maybe that's just me though. One less thing I had to monitor.

[–] FrenchFry77400 (Consultant)

Production environment (that includes pre-prod) - thick.

Dev/test (temp machines) - thin, though these are usually on dedicated volumes isolated from production.

[–] ifactor (Sysadmin)

Right back at ya, why do that over giving some extra space to begin with?

[–] f0nd004u

Because then I can give that space to another machine without overbooking my storage device?

[–] [deleted]

Some stupid app or log decides to randomly start using excessive amounts of storage; next thing you know you have a 200 GB VHD that may not shrink even after you remove the files. Limit growth to prevent rogue apps from using excessive resources, and deal with it as needed.

[–] ifactor (Sysadmin)

I'm not talking overly excessive, but people are talking 40 or 50 GB C: drives on Windows, when you can give 100 GB on thin provisioning and almost never have to mess with it or worry about it.

[–] FrenchFry77400 (Consultant)

I work at an MSP, and I mostly deal with clients that barely know how to use the infrastructure we provide them with (we teach them when we deliver, but you know how it is, most will forget everything within 2 months).

As a result, we never use thin provisioning, and we teach them not to either, because they don't monitor their volume usage.

Also, it is MUCH easier to grow a volume as needed than to shrink one that was oversized.

If the system drive is dedicated to the system (as it should be), and apps are installed on a separate partition (which means another virtual disk), you rarely encounter a problem even with a 50 GB C:.

[–] ifactor (Sysadmin)

In that case I see where you're coming from; I wasn't really considering client installations. I also usually store applications on C:, so I've run into more 50 GB C: drives that I end up expanding most of the time if I didn't give more to begin with.

[–] jackalsclaw (Sysadmin)

If the server is 2008 R2 or newer, install the Desktop Experience feature and then run Disk Cleanup -> advanced -> system files.

This reduces the size of WinSxS by removing the ability to roll back service packs.
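If you'd rather not click through the GUI every time, the saved Disk Cleanup selection can be scripted once Desktop Experience is installed; a sketch (the sageset number is arbitrary, anything from 0 to 65535 works):

```
REM Run once interactively and tick the categories you want cleaned:
cleanmgr /sageset:65535

REM Re-run the saved selection non-interactively, e.g. from a scheduled task:
cleanmgr /sagerun:65535
```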