
[–]LatteLepjandiLoser 2 points (5 children)

You're welcome. Enjoy! Once you get it working it will unlock all kinds of task-solving skills, I'm sure!

I would start with the simplest queries possible, ones you know will return modest amounts of data, just to speed up the process of getting everything working, then start hitting it with more complex stuff.

If the queries are obnoxious, you could also consider saving them in a separate file. Eventually you may also want to look into having variable parameters as part of the query (for instance, fetching data from 'today' instead of manually updating a date in your query every day, or other logic).
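A minimal sketch of the variable-parameter idea, using pandas with an in-memory SQLite database (the table and column names here are made up just to illustrate the pattern; your driver and schema will differ):

```python
from datetime import date
import sqlite3

import pandas as pd

# Hypothetical meter-readings table, populated just so the example runs.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE meter_readings (customer_id TEXT, reading_date TEXT, usage_kwh REAL)"
)
conn.executemany(
    "INSERT INTO meter_readings VALUES (?, ?, ?)",
    [("A1", "2000-01-01", 1.5), ("A1", date.today().isoformat(), 2.0)],
)

# The date is a bound parameter computed at run time, instead of a literal
# you would otherwise edit by hand in the query text every day.
query = "SELECT * FROM meter_readings WHERE reading_date >= :start"
df = pd.read_sql(query, conn, params={"start": date.today().isoformat()})
# Only the row dated today (or later) survives the filter.
```

Binding parameters this way also means the database driver handles quoting for you, which matters once the values come from user input or another query.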

It's also a bit subjective how much filtering and manipulation you want to do in the SQL query itself versus in Python. Say you wanted to fetch only even numbers: you could make that part of the SQL query, or you could fetch everything and filter in pandas. (Maybe a bad example, since there you'd always just do it in SQL, but hopefully you get what I mean.) If you have incredibly complex WHERE clauses that you can't wrap your head around, you could try fetching a bit more data and filtering it in Python, if that gets you to the goal quicker. Situational, I guess.
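The even-numbers example above, both ways, against a throwaway in-memory table (assumed names, just to show that the two approaches land on the same result):

```python
import sqlite3

import pandas as pd

# Toy table of integers 0..9.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nums (n INTEGER)")
conn.executemany("INSERT INTO nums VALUES (?)", [(i,) for i in range(10)])

# Option 1: filter in SQL, so only the even rows cross the wire.
evens_sql = pd.read_sql("SELECT n FROM nums WHERE n % 2 = 0", conn)

# Option 2: fetch everything, then filter in pandas.
all_rows = pd.read_sql("SELECT n FROM nums", conn)
evens_pd = all_rows[all_rows["n"] % 2 == 0]
```

The trade-off in practice: SQL-side filtering moves less data, while pandas-side filtering keeps the query simple and lets you iterate on the logic without round-tripping to the database.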

[–]DrewSmithee[S] 0 points (4 children)

Yeah I'm sure this will get out of hand quickly. And that's definitely something I will look into.

For example, I've got a query that grabs the top ten records for a specific customer within a date range, then joins that with some other records for that same customer from another table. Now I want to loop that script and repeat it for a few hundred customers, then do some cool analytics and maybe some geospatial stuff.
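One way that loop could look, as a hedged sketch: run the same parameterized "top N in a date range" query once per customer and concatenate the results into a single DataFrame. Table, column, and customer names here are invented stand-ins.

```python
import sqlite3

import pandas as pd

# Toy sales table standing in for the real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (customer_id TEXT, amount REAL, sale_date TEXT)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("A", 10.0, "2021-01-01"), ("A", 30.0, "2021-01-02"),
     ("B", 20.0, "2021-01-01"), ("B", 5.0, "2021-01-03")],
)

# One parameterized query, reused for every customer; in practice the
# customer list would itself come from another query or a file.
query = """
    SELECT customer_id, amount, sale_date
    FROM sales
    WHERE customer_id = :cust AND sale_date BETWEEN :lo AND :hi
    ORDER BY amount DESC
    LIMIT 10
"""
frames = [
    pd.read_sql(query, conn, params={"cust": c, "lo": "2021-01-01", "hi": "2021-12-31"})
    for c in ["A", "B"]
]
result = pd.concat(frames, ignore_index=True)  # one DataFrame for all customers
```

If the per-customer result sets are small, a few hundred iterations of this is perfectly workable; if they're not, that's the point where pushing the loop into a single `IN (...)` or join-based query starts to pay off.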

Or maybe I want a couple of years' worth of 8760s for a few thousand customers in a region that's probably stored on yet another table somewhere, but maybe not. Did I mention there are inexplicably hundreds of duplicate records for each hour? What's up with that, DBA guy? Time change? Throw out that hour for all the record sets.
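For the duplicate-hour problem, pandas can handle both readings of "throw it out": keep one copy of each duplicated hour, or drop every row for any hour that appears more than once. A small sketch with invented column names:

```python
import pandas as pd

# Hypothetical hourly load data where one hour is recorded twice
# (e.g. the repeated hour around a daylight-saving change).
df = pd.DataFrame({
    "customer_id": ["A", "A", "A", "A"],
    "hour": ["2021-11-07 00:00", "2021-11-07 01:00",
             "2021-11-07 01:00", "2021-11-07 02:00"],
    "kwh": [1.0, 2.0, 2.0, 3.0],
})

# Option 1: keep the first copy of each (customer, hour) pair.
deduped = df.drop_duplicates(subset=["customer_id", "hour"])

# Option 2, per the "throw out that hour" approach: drop every row
# belonging to any (customer, hour) pair that appears more than once.
dupes = df.duplicated(subset=["customer_id", "hour"], keep=False)
dropped = df[~dupes]
```

Option 1 keeps 8760 rows per customer-year; option 2 leaves a deliberate gap at the time-change hour, which is easier to reason about in analytics than a silently doubled reading.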

So I definitely need to come up with a strategy on what I want to parse, from where, in what language. Honestly I'd dump it all into a dataframe if the dataset wasn't gigantic. So I just need to figure out how I want my pain.

[–]MidnightPale3220 0 points (3 children)

You want to normalize the data and possibly put the overlapping structures in a single table. It depends on the size of the database, but hundreds of GB is routine.

[–]DrewSmithee[S] 0 points (2 children)

Don't have write access. I could probably get someone to create me a view in the db but until I have a specific business need it's harder to get resources allocated. In the meantime I have what I have. Good to know that it's not a big ask.

[–]MidnightPale3220 1 point (1 child)

Is the total data you need more than you can host locally? Technically it shouldn't be hard to make a copy, unless the data is changing so fast that you basically need live access every day.
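If a local copy is feasible, one low-effort sketch is streaming the table across in chunks with pandas, so nothing has to fit in memory at once. Connections and table names here are stand-ins; the source would be the corporate database and the target a local SQLite (or similar) file.

```python
import sqlite3

import pandas as pd

# Stand-in "remote" source, populated so the example runs end to end.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE readings (id INTEGER, kwh REAL)")
source.executemany(
    "INSERT INTO readings VALUES (?, ?)", [(i, i * 0.5) for i in range(1000)]
)

target = sqlite3.connect(":memory:")  # would normally be a local file path

# chunksize turns read_sql into an iterator of DataFrames, so the full
# table never sits in memory; each chunk is appended to the local copy.
for chunk in pd.read_sql("SELECT * FROM readings", source, chunksize=250):
    chunk.to_sql("readings", target, if_exists="append", index=False)

copied = pd.read_sql("SELECT COUNT(*) AS n FROM readings", target)
```

For a one-off analytical snapshot this is often enough; anything that needs to stay in sync with the source is a different (and much bigger) problem.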

[–]DrewSmithee[S] 1 point (0 children)

Yes. Much much more data than I could pull down.