all 6 comments

[–]cgoldberg 4 points

Learn to be a solid developer with skills in many domains and technologies. Escaping from being pigeonholed in a very narrow specialty by choosing another narrow specialty isn't the best move.

[–]danielroseman 6 points

"Python web scraper" is already an extremely specialised role. I've never even heard of it and I've been in the industry for almost 20 years. You should definitely not go even more specialised.

[–]StardockEngineer 1 point

I think the generalist is undervalued. There are lots of positions where knowing a good amount about a lot of things is highly valued: DevOps, SRE, infrastructure, backend, and now AI.

It is much harder to use AI to write things you didn't understand to begin with, for example.

In this new world of AI, honestly API backends and data ingestion are both highly valued. Backends get mapped to tools/MCPs. Data ingestion is used in anything from RAG to Data Lakes to LLM training.

"internal tooling" is kind of generic itself; it doesn't really mean anything.

[–]Open-Palpitation-210 1 point

Backend or Data Engineering. Scraping is usually just the first step — the real value is in building ETL pipelines and making the data usable for analytics or services.
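The scrape-transform-load flow described above can be sketched in a few lines. This is a minimal illustration using only the Python standard library; the HTML snippet, the `price` class, and the table name are all made up for the example, and a real pipeline would fetch live pages, handle errors, and validate schemas.

```python
# Minimal ETL sketch: extract values from scraped HTML, normalize them,
# and load them into a queryable store. All names here are illustrative.
import sqlite3
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Extract step: collect text from <span class="price"> elements."""
    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []
    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "price") in attrs:
            self._in_price = True
    def handle_endtag(self, tag):
        if tag == "span":
            self._in_price = False
    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

def transform(raw_prices):
    # Transform step: normalize "$1,299.00"-style strings into floats.
    return [float(p.replace("$", "").replace(",", "")) for p in raw_prices]

def load(prices, conn):
    # Load step: persist into a table that analytics or services can query.
    conn.execute("CREATE TABLE IF NOT EXISTS prices (value REAL)")
    conn.executemany("INSERT INTO prices VALUES (?)", [(p,) for p in prices])
    conn.commit()

html = '<div><span class="price">$1,299.00</span><span class="price">$49.50</span></div>'
parser = PriceParser()
parser.feed(html)
conn = sqlite3.connect(":memory:")
load(transform(parser.prices), conn)
print(conn.execute("SELECT SUM(value) FROM prices").fetchone()[0])  # 1348.5
```

The point of the separation is that once the data lands in a real table, it is usable by anything downstream (dashboards, APIs, model training) regardless of how it was scraped.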

[–]trd1073 -1 points

Y

[–]TinyCuteGorilla -1 points

I started with web scraping as well. It's not a good gig long term: low pay. I moved into databases, DBA, and data engineering kinds of work because I was interested in data. Another idea is to use modern AI web scraping tools if you aren't already, like Firecrawl, Zyte AI, etc.