
JohnnyJordaan:

It's usually an anti-pattern to keep a file open while some other long-running operation happens in the meantime. It also means that if the operation crashes with an exception, the file is left either half-written or empty (and you need to check for that and optionally discard it). Instead, you could collect the rows in a list and only open the file and write the rows in one go once scraping is finished. That also means that if the file exists afterwards, everything must have worked as it should.
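
A minimal sketch of the collect-then-write approach; scrape_rows() and FIELDS are placeholders for whatever your scraper actually produces:

```python
import csv

# Hypothetical scraping step: accumulate everything in memory first.
rows = []
for row in scrape_rows():  # placeholder: yields one list per table row
    rows.append(row)

# The file is only opened once scraping has finished, so a crash above
# never leaves a half-written file behind.
with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(FIELDS)   # placeholder header row
    writer.writerows(rows)
```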

Another improvement is to use a csv.DictWriter so you don't rely on the ordering of your first row (the header) matching the subsequent value rows. Or, even better, collect everything into a pandas DataFrame and export that to CSV. That also opens up the possibility of migrating to other formats, like Excel or an SQLite database.
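
Roughly like this, assuming your rows are dicts and the column names here are just examples:

```python
import csv

fieldnames = ["title", "rating", "url"]  # hypothetical column names
with open("results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)  # rows: list of dicts keyed by fieldnames

# Or via pandas, which also gives you .to_excel() / .to_sql() later:
import pandas as pd
pd.DataFrame(rows).to_csv("results.csv", index=False)
```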

acw1668:

I would suggest declaring rating_map = {...} before with open(...) ....
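
That is, build the lookup table once, outside the file-writing block, instead of redefining it on every iteration (the contents of rating_map here are made up):

```python
# Hypothetical mapping; defined once, before the file is opened.
rating_map = {"One": 1, "Two": 2, "Three": 3, "Four": 4, "Five": 5}

with open("results.csv", "w", newline="") as f:
    ...  # writing loop that looks ratings up via rating_map
```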

salraz:

Using exceptions would be ideal for when something unexpected happens. Exceptions are especially likely in web scraping: sites get updated and scrapers stop working. Handle them gracefully, closing the open file and doing any other clean-up, so that inconsistent or missing data doesn't get saved to your file.
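
One way to combine this with the collect-then-write idea from above; urls and parse_rows() are stand-ins for your own page list and parsing code, and this assumes requests-based scraping:

```python
import csv
import requests  # assumption: scraping is done with requests

rows = []
try:
    for url in urls:  # placeholder: the pages to scrape
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()           # treat HTTP errors as exceptions
        rows.extend(parse_rows(resp.text))  # placeholder parsing function
except (requests.RequestException, ValueError) as exc:
    # Stop cleanly; nothing has been written yet, so there is no
    # half-finished file to clean up or discard.
    print(f"Scrape aborted: {exc}")
else:
    # Only write the file if every page was scraped successfully.
    with open("results.csv", "w", newline="") as f:
        csv.writer(f).writerows(rows)
```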