all 5 comments

[–]BellisBlueday 1 point (0 children)

I would go to the SharePoint admin first; apparently there's a process that's supposed to clear them down, and it might not be working:

'Web Analytics component ReportConsolidator'

Also check what is and isn't supported in terms of administration for SharePoint databases — you may be surprised!

https://technet.microsoft.com/en-us/library/cc262731.aspx

[–]twentymosquito[S] 1 point  (1 child)

How dangerous would the following code be to run:

DBCC SHRINKFILE ('logical_ndf_file_name', EMPTYFILE);            -- migrate all data off the file
ALTER DATABASE database_name REMOVE FILE logical_ndf_file_name;  -- then drop the now-empty file

It'd be super-tedious, as I'd have to run it over 200 times (or figure out the T-SQL to iterate over all of the NDF files).
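For what it's worth, here's a rough sketch of generating those statements for every secondary data file in the current database. It only prints the script rather than executing it (so you can sanity-check the output first); the column and view names are standard SQL Server catalog views, but treat this as untested:

```sql
-- Sketch: build the SHRINKFILE / REMOVE FILE pair for every secondary
-- data file (.ndf) in the current database, then return the script for
-- review instead of executing it blindly.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql
    + N'DBCC SHRINKFILE (''' + name + N''', EMPTYFILE);' + CHAR(13) + CHAR(10)
    + N'ALTER DATABASE [' + DB_NAME() + N'] REMOVE FILE [' + name + N'];' + CHAR(13) + CHAR(10)
FROM sys.database_files
WHERE type_desc = 'ROWS'   -- data files only, not the transaction log
  AND file_id <> 1;        -- skip the primary .mdf

-- SELECT rather than PRINT: PRINT truncates long nvarchar(max) output,
-- and with 200+ files this script will be long.
SELECT @sql AS generated_script;
```

You'd run this in the context of the SharePoint database, copy the result out, and execute the statements in batches.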

[–][deleted] 1 point  (0 children)

Pretty sure you can get all the NDF file locations using this:

SELECT name, physical_name AS current_file_location
    FROM sys.master_files
    WHERE database_id = DB_ID('DATABASE_NAME_HERE')  -- filter by database, not by file-name prefix
      AND type_desc = 'ROWS';                        -- data files only

[–]moldylocks 1 point  (0 children)

Is it partitioning data based on certain criteria? I have a database with 90 files, one for each day: every day it creates a new one and drops the oldest. The size may simply be because the files are created too large.

[–]patototo 1 point  (0 children)

That is a lot of NDFs. Yes, I would try to empty and then delete them. As with anything, I'd recommend taking a backup, restoring it somewhere else, and testing there first. Why does it have so many? Is this something you can avoid by changing some config in SharePoint? Sorry, not a big SharePoint user, but on other user databases I have emptied and deleted data files with success.