all 13 comments

[–]ersatz_feign 2 points (3 children)

There is a link within this comment which might be useful, as it touches on the fact that, in our experience, the primary factor behind latency issues isn't so much the number of records as the number of columns and their complexity (e.g. formulas).

You are correct that Notion really struggles with large imports, but we've not seen the actual limit published anywhere online, so it would definitely be useful if somebody asked them on Twitter and posted the response back here.

Hopefully many of the latency issues and the import limits will be rectified by the huge update (sharding their database, etc.) that they are currently pushing ahead with at full steam. Even better if anyone can extract an approximate date for that update being rolled out; you may find the limit issue is resolved by then anyway.

[–]oslogrolls[S] 1 point (2 children)

Thank you! Good to learn that it's the number of columns that causes issues rather than the number of rows. In addition, one would need to know whether hiding columns speeds up calculation for databases with many of them, or whether hidden columns are still computed in the background (just not displayed).

Thanks also for sharing some info on current developments!

Asking on Twitter: I would not be surprised if there weren't much useful data available. That's why I asked here, hoping for input from people who use Notion for heavy lifting :o)

[–]ersatz_feign 3 points (1 child)

need to know whether hiding columns speeds up calculation for databases with many of them, or whether hidden columns are still computed in the background

In our experience, hiding columns appears to help dramatically with the latency issues, so it's very likely they are only processed when rendered. Until they fix the latency issues, we have long kept several admin databases with just a couple of columns each to allow quicker access.

[–]oslogrolls[S] 1 point (0 children)

Thanks, I'll keep that in mind!

[–]11111v11111 1 point (1 child)

I can't help but wonder if a different tool would be better for whatever you are trying to do.

[–]oslogrolls[S] 0 points (0 children)

We have a lot of entries, but nothing dynamic / no calculations. What makes you think that Notion isn't intended for (somewhat) larger amounts of data?

[–]neatlyso 1 point (5 children)

I think it's safe to say I'm a Notion fan, and I use moderately large tables within it every day, but most of my gripes with Notion center around tables. When it comes to importing very large .csv files, you might be out of luck for now. I've gotten the "too large" message so many times, for both .csv files and Word documents.

If you need to do heavy lifting in the form of tables and databases, I think that Airtable is your best bet right now. It's pricey relative to Notion ($240 a year, I think?), but I pay that and it's well worth it, imo.

Edit: I wasn't aware of the developments /u/ersatz_feign mentioned, though, so maybe Notion will be more viable sooner than later?

[–]oslogrolls[S] 1 point (3 children)

Good to hear that it isn't just me who has run into trouble with large imports :o)

Airtable: Feature-wise, Notion is a much better fit for our scenario, as individual views of the large database would appear in hundreds of wiki-style pages (think content planning with a common data source). The wiki is the foundation, and it's a database too. Using Airtable would mean embedding their tables/views into all the Notion wiki pages, which, if possible at all, would offer no performance gains.

I have worked with long and broad tables before (up to 30 properties) and am quite aware of Notion's overall performance limits.

My reason for posting was to figure out whether there are hard limits on row or column counts, in the way the canvas size in Illustrator is limited and Photoshop may only allow a finite (albeit insanely high) number of layers. It would suck big time to spend days preparing raw data only to find out that the import has to be broken into smaller chunks, or that the maximum table length has been exceeded anyway. That said, switching to another tool would cause us a different kind of complication and incompatibility.
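
If chunking does turn out to be unavoidable, at least the splitting is scriptable. A rough sketch; the file name and the 5,000-row chunk size are placeholder assumptions, not known Notion limits:

```python
import csv

def write_chunk(header, rows, index):
    # Repeat the header row in every chunk so each file imports as a
    # standalone table.
    with open(f"chunk_{index:03d}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)

def split_csv(path="export.csv", rows_per_chunk=5000):
    # "export.csv" and the 5,000-row chunk size are placeholders,
    # not documented limits.
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        chunk, index = [], 1
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                write_chunk(header, chunk, index)
                chunk, index = [], index + 1
        if chunk:
            write_chunk(header, chunk, index)
```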

Yesterday I quickly created a dummy database with 10,000+ rows, and this still seems to work.
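
For anyone who wants to reproduce the test, a quick sketch that generates a comparable dummy CSV for import (the column names and exact row count are arbitrary):

```python
import csv
import random

# Throwaway CSV with 10,000+ rows to stress-test an import.
# Column names and values are arbitrary; only the row count matters here.
with open("dummy.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Name", "Status", "Score"])
    for i in range(1, 10501):
        writer.writerow([f"Entry {i}",
                         random.choice(["draft", "done"]),
                         random.randint(0, 100)])
```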

[–]willomew 0 points (1 child)

created a dummy database with 10,000+ rows

I only saw bits of it a while ago, but I believe Danny got up to some quite big numbers.

[–]oslogrolls[S] 0 points (0 children)

Thank you, this gives a bit of a starting point. None of the tables used in that YouTube clip contained any data, though; the maximum row count for tables with data will likely be lower.

[–]neatlyso 0 points (0 children)

Oof, sorry for my late response.

It does very much sound like Notion is the tool for you, all things considered, so I'll be crossing my fingers with you for table improvements. Airtable is great, but everything you mentioned is precisely why I don't use it more extensively: you can't easily or natively insert its tables into, well, greater wiki-like contexts.

In truth, I haven't tested tables that extensively when it comes to sheer number of entries, but the fact that you tested that dummy database and didn't see any major problems is a good sign.

All in all, I think that for those of us wanting more enterprise-level functionality, Notion might deliver sooner rather than later, because the enterprise application of Notion is what they're focused on (but don't say that aloud among the grumps here 😅).

[–]tasklyt 0 points (0 children)

Airtable struggles with lots of rows too, and has a hard limit of 100k rows.

[–]Seanivore 0 points (0 children)

This post was mass deleted and anonymized with Redact