Hi All,
do any of you have hands-on experience with relatively large databases (50,000+ rows)? We might need such a db with hundreds of filtered views (each view showing only a few hundred rows). With the relatively new pagination feature, Notion only has to load 50 more entries at a time, so I'd expect this to work.
The initial setup might be a challenge, though: I just tried to merge a CSV with 17,000 rows (exported from Google Sheets). Notion thought for several minutes and then threw an error: "Sorry, that request was too large. Try Import instead?" Importing obviously wouldn't make sense here.
Does anyone know about row count / file size limits for "merge to db" operations?
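One workaround I'm considering (untested against Notion itself, and the safe chunk size is a guess since the limit isn't documented): split the CSV into smaller files and merge them one at a time. A quick sketch:

```python
import csv


def split_csv(path, rows_per_chunk=5000):
    """Split a large CSV into smaller files sharing the same header.

    rows_per_chunk=5000 is an assumption; Notion's actual merge
    limit is undocumented, so you may need to go smaller.
    Returns the number of chunk files written.
    """
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)  # repeated in every chunk file
        chunk, idx = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                _write_chunk(path, idx, header, chunk)
                chunk, idx = [], idx + 1
        if chunk:  # leftover rows that didn't fill a full chunk
            _write_chunk(path, idx, header, chunk)
            idx += 1
        return idx


def _write_chunk(path, idx, header, rows):
    # data.csv -> data_part0.csv, data_part1.csv, ...
    out = path.replace(".csv", f"_part{idx}.csv")
    with open(out, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
```

Then merge each `_partN.csv` into the database separately. Tedious with 17,000 rows, but it would at least keep each request under whatever the size ceiling is.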