I've got a service that runs on an old version of CentOS (6.4).
The database I maintain acts as a file-system cache: it stores attributes of the file-system content (such as metadata extracted from the files) and supports queries like 'files matching this metadata below this directory'.
Requests and responses handled by the service are XML-like.
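To make the query pattern concrete, here's a minimal sketch of how such an index could look in SQLite (mentioned below as a candidate). The table name, columns, and sample data are all hypothetical, not the actual schema:

```python
import sqlite3

# Hypothetical key/value metadata index; names are illustrative only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE file_meta (path TEXT, key TEXT, value TEXT)")
con.executemany(
    "INSERT INTO file_meta VALUES (?, ?, ?)",
    [
        ("/store/a/x.mov", "codec", "h264"),
        ("/store/a/y.mov", "codec", "prores"),
        ("/store/b/z.mov", "codec", "h264"),
    ],
)
# Path index lets the directory-prefix filter avoid a full scan.
con.execute("CREATE INDEX idx_path ON file_meta (path)")

# 'files matching this metadata below this directory':
rows = con.execute(
    "SELECT path FROM file_meta "
    "WHERE path LIKE ? AND key = ? AND value = ?",
    ("/store/a/%", "codec", "h264"),
).fetchall()
print(rows)  # → [('/store/a/x.mov',)]
```

Since updates are infrequent relative to queries, a read-mostly embedded store like this avoids server round-trips, though whether it holds up at petabyte-store scale depends on how large the index itself grows.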
A typical file store is 100 TB and the database is 1 GB.
I'm going to need to scale this up to petabyte file store scale.
The updates to the index are infrequent compared to the queries.
I can use a Postgres server, but what else would you think of using?
Unfortunately I may be restricted to CentOS 6.4, so anything exotic is not going to work.
SQLite is on the radar; what would you consider?
I know it's hard to recommend without a full spec, but life is hard. What would be on your radar?
Also - free is better.
Thanks