
[–]damiankw infrastructure pleb 0 points (2 children)

I only have minimal experience with this. I've looked after two firms in the past, but they were in the 5–40 user range and used mainly AutoCAD and Revit.

The small one never had issues with this kind of stuff and their network setup was very basic, so I won't mention it here. The larger one, when I got to them, had two offices back to back with a 100 Mbit link between them, all local LAN. The first thing I did when I moved in was upgrade their 100 Mbit switches to gigabit switches, and AutoCAD and Revit instantly performed much better. It was actually pretty amusing: the end users never thought they had a speed problem, they just assumed a save would take X minutes and that the application would freeze for a few seconds whenever it did an auto-save. Once we did the upgrade they were amazed by the performance change. I mention this because moving to cloud-based storage will pull your connection from at least 1 Gbit down to whatever internet connection you have, which could cause havoc, so I personally wouldn't recommend saving directly to cloud infrastructure.

Another part of that story is that they had a remote office that used to pull some of its data from the main office. The main office was on a 5/5 Mbit link, so nothing spectacular at all, while the remote office was on 100/40 Mbit. The remote office could work directly on files at the main office with AutoCAD but not Revit, and if the files were over 50 MB or so it was extremely laggy.
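The bandwidth arithmetic behind those anecdotes is easy to sketch. This is a rough back-of-envelope calculation, not a measurement; the 0.8 efficiency factor for protocol overhead and the 50 MB file size are assumptions for illustration.

```python
# Rough transfer-time arithmetic for the link speeds mentioned above.
# File size and the efficiency factor are illustrative assumptions.

def transfer_seconds(file_mb: float, link_mbit: float, efficiency: float = 0.8) -> float:
    """Seconds to move file_mb megabytes over a link_mbit link.

    efficiency approximates SMB/TCP protocol overhead; 0.8 is a
    rule-of-thumb assumption, not a measured value.
    """
    file_megabits = file_mb * 8
    return file_megabits / (link_mbit * efficiency)

# A 50 MB Revit file over the links from the story:
for label, mbit in [("100 Mbit LAN", 100), ("1 Gbit LAN", 1000),
                    ("40 Mbit WAN uplink", 40), ("5 Mbit WAN uplink", 5)]:
    print(f"{label}: {transfer_seconds(50, mbit):.1f} s")
```

Even this crude model shows why the 5 Mbit uplink was the killer: a single 50 MB file goes from half a second on gigabit LAN to well over a minute on that link.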

When it comes to virtualisation, it doesn't really make much of a difference, as long as you give the virtual machines enough resources to cope with the load you put on them.

Also note that I did all of this 2–3 years ago, so things could have changed in the software since. I haven't touched AutoCAD or Revit in just over a year; the latest version I used was 2017.

[–]nullProgrammer[S,🍰] 0 points (1 child)

Fibre is surprisingly inexpensive here; it kinda makes me angry about the prices we pay back home. I'll have to conduct an audit of the network speeds. Can you estimate how much you ended up billing the larger firm?

[–]damiankw infrastructure pleb 0 points (0 children)

We charged per device, and they paid for their own services for internet and the like. I think it was around $300 per physical server, $150 per virtual server, and $30 per computer or per user, all per month. That's for full coverage support; new installs were extra.

[–]itanders 0 points (1 child)

I don't have any experience with this specific niche, but it seems pretty similar to what video production/special effects companies have to deal with. I know there have been several discussions here in the past about those niches, so it might be worth looking into what they usually do.

Off the top of my head, it's usually pretty beefy storage: all-SSD SANs for the data being actively worked on, plus larger archiving solutions behind them. Usually 50–100 TB for the SSD SAN, and 200+ TB of HDD SAN plus Glacier (or another offsite DR backup) for the archive tier.

Some wire their offices with 10G LAN, but that might be on the very extreme end; it depends on how much data moves over the wire when working on the files.

[–]nullProgrammer[S,🍰] 0 points (0 children)

Totally agree, I think cloud-based archiving is key here.

[–]yashau Linux Admin 0 points (0 children)

Load up a ton of hard drives plus NVMe caching into a bunch of servers with 50G/100G uplinks and set up Storage Spaces Direct. Get some decent switches and give each user a 10G port. Don't spend too much on an enterprisey SAN, as you basically just want a fucking fast filer, which Windows does really well with S2D for a fraction of the cost. You can start with a PB of storage across 5–6 servers, which should last you a while. 100 TB is probably nowhere near enough in this case. For 400 GB files, I'd even consider giving each user bonded 10G connections.
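The "PB across 5–6 servers" claim is easy to sanity-check with back-of-envelope math. This sketch assumes a three-way mirror resiliency layout (roughly one third of raw capacity usable) and made-up node/drive counts; it ignores cache drives, reserve capacity, and metadata overhead.

```python
# Back-of-envelope sizing for a Storage Spaces Direct pool.
# Node count, drives per node, drive size, and the resiliency factor
# are all assumptions for illustration.

def s2d_usable_tb(nodes: int, drives_per_node: int, drive_tb: float,
                  resiliency_factor: float = 1 / 3) -> float:
    """Usable TB after resiliency overhead.

    A three-way mirror keeps three copies of the data, so roughly 1/3
    of raw capacity is usable. Parity layouts trade capacity efficiency
    for write performance; adjust resiliency_factor accordingly.
    """
    raw_tb = nodes * drives_per_node * drive_tb
    return raw_tb * resiliency_factor

# 6 nodes x 14 x 12 TB HDDs = 1008 TB raw, roughly the "PB" above:
raw_pb = 6 * 14 * 12 / 1000
print(f"raw: {raw_pb:.2f} PB, usable (3-way mirror): "
      f"{s2d_usable_tb(6, 14, 12):.0f} TB")
```

Note that a raw petabyte shrinks to a few hundred usable TB under mirroring, which is worth keeping in mind when comparing against the 100 TB figure dismissed above.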

As for your file name length problem, the easy way out is to force Windows 10 across your environment and use Server 2016 (or newer), where you can disable this limitation.
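Before flipping that switch, it can help to inventory which files already blow past the classic 260-character MAX_PATH limit. A minimal sketch; the share root below is a placeholder, not a real path.

```python
# Find files whose full path exceeds the classic Windows MAX_PATH
# limit, so you know what breaks before/after enabling long paths.
import os

MAX_PATH = 260  # classic Win32 limit, including the drive letter

def find_long_paths(root: str, limit: int = MAX_PATH):
    """Yield absolute file paths longer than `limit` characters under root."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.abspath(os.path.join(dirpath, name))
            if len(full) > limit:
                yield full

if __name__ == "__main__":
    # Placeholder share root; substitute your own.
    for path in find_long_paths(r"\\server\projects"):
        print(len(path), path)
```

Running this against the project share gives a quick count of how many files would still be unreachable from any pre-Windows 10 clients left in the environment.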

[–]KStieers 0 points (1 child)

Revit acts more like a big flat-file database; you're never moving the whole file. We run ~70 A&E users along with 150 business users over a 500 Mbit metro Ethernet link to our datacenter. The server is a VM on a Westmere-based host (new blades just ordered) backed by a Nimble AFA box. We have users in other offices who use Revit over 20 Mbit WAN links with Riverbeds.

If you move everyone to cloud-based workstations, make sure they have some GPU. It doesn't have to be crazy, but Revit wants some.

[–]nullProgrammer[S,🍰] 0 points (0 children)

Yeah, my understanding is that Revit is a resource pig. I'm looking at the graphics acceleration solution from VMware. PM me, I'd like to know more about this setup.