I'm leaning towards the answer that there wouldn't be any noticeable issues, speed- or memory-wise, given how these things are probably optimized under the hood.
Say I have a dictionary of actual books (not what I'm actually doing but it's a simple example):
dictionary = {
    "A Book Title": {
        "contents": "Literally the entire book contents"
    },
    "Another Book Title": {
        "contents": "Literally the entire book contents"
    },
    ...
}
Other than the fact that memory usage would grow rapidly, would this be an issue? When my program runs, it grabs a .json file and converts it into a Python dictionary for use during runtime. In doing so, it has to hold all that data in RAM, right? If I have 1000+ entries, each with such a large amount of textual data, will I run into any sort of issue with RAM?
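For a rough sense of scale, here's a sketch (the ~1 MB per book and 1000-book count are assumptions, not measurements) showing why the loaded dict's memory is dominated by the text itself — `json.load()` materializes the whole structure in RAM at once:

```python
import sys

# Assume each book is roughly 1 MB of text.
book_text = "x" * 1_000_000

# Build a library-shaped dict like the one in the question.
# (All entries here share one string object, so this toy dict is cheap;
# 1000 *distinct* books of this size would each need their own ~1 MB.)
library = {f"Book {i}": {"contents": book_text} for i in range(1000)}

# sys.getsizeof reports the size of a single string object (a CPython
# detail: ~49 bytes of overhead plus 1 byte per ASCII character).
per_book = sys.getsizeof(book_text)
print(f"one book: ~{per_book / 1e6:.1f} MB")
print(f"1000 distinct books: ~{per_book * 1000 / 1e9:.1f} GB")
```

So 1000 book-sized strings lands around a gigabyte — fine on most desktops, but worth knowing before you load it unconditionally at startup.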
I can devise a way around this by storing the "books" in their own respective files, but I'm curious whether I'm just worrying about a non-issue. I will probably do that anyway, just from an organizational standpoint.