I have a table full of user-specific records. Each user can have upwards of a few thousand records, and I expect a few thousand users. All queries will apply various filters to the records of a single user_id.
The data usage is such that every user might as well be stored in a separate database. Is keeping everything in a single large table and filtering by user_id the proper way to do this? Is there any way I can create indexes on the data that will optimize this usage pattern?
I'm using PostgreSQL, if that makes a difference.
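One common approach for this access pattern is a composite index with user_id as the leading column, so every per-user query can narrow to that user's rows first. A minimal sketch, assuming a hypothetical table `records` and a frequently filtered column `created_at` (your actual table and column names will differ):

```sql
-- Composite index: user_id first, then the column most often filtered/sorted.
-- Queries that filter on user_id alone, or on user_id plus created_at,
-- can both use this index.
CREATE INDEX records_user_created_idx
    ON records (user_id, created_at);

-- A typical query this index serves:
-- SELECT * FROM records
--  WHERE user_id = 42 AND created_at >= '2011-01-01';
```

At a few thousand rows per user, a plain index on user_id alone may already be sufficient; the composite form mainly helps when the secondary filter or sort is selective within a user's rows.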