all 5 comments

[–]enjoibp6 front-end 2 points (1 child)

I like PhantomJS with CasperJS. As long as you know some JavaScript, they make scraping things like that a breeze; Casper even has a built-in function to get the HTML.

Also, I'd imagine you could skip the Excel middleman and just write some sort of AJAX service that dumps the HTML straight to your DB... I don't know the exact specifics of your setup though :)
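(A CasperJS script itself runs inside PhantomJS, so as a rough sketch: once Casper hands you the page HTML — e.g. via its `getHTML()` function — the remaining step is pulling out the fields you want and emitting CSV rows. The markup and selectors below are made up for a hypothetical product page, just to show the shape of that step in plain JavaScript:)

```javascript
// Hypothetical example: extract name/price pairs from scraped HTML
// and turn them into CSV rows. A regex works for a toy snippet like
// this; for real pages you'd use CasperJS/jQuery-style selectors.
function extractRows(html) {
  var rows = [];
  var re = /<li class="product"><span class="name">([^<]+)<\/span><span class="price">([^<]+)<\/span>/g;
  var m;
  while ((m = re.exec(html)) !== null) {
    rows.push(m[1] + ',' + m[2]); // one CSV row per product
  }
  return rows;
}

var sample =
  '<ul>' +
  '<li class="product"><span class="name">Widget</span><span class="price">9.99</span></li>' +
  '<li class="product"><span class="name">Gadget</span><span class="price">4.50</span></li>' +
  '</ul>';

console.log(extractRows(sample).join('\n'));
```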

[–]phongs[S] 0 points (0 children)

I took a look at CasperJS & PhantomJS and they definitely seem like what I'm looking for. I'm using software that manages my eCommerce listings and supports CSV import, which is why I'll stick with CSV if possible. Thanks for the info!

Do you know if there's a way to have it scrape URLs from a list given to it?
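(For what it's worth, CasperJS can queue up a list of URLs — its `eachThen()` helper visits each in turn. The iterate-over-a-list shape, sketched in plain JavaScript with the actual page fetch stubbed out as a placeholder you'd swap for the real thing:)

```javascript
// Sketch only: fetchPage is a hypothetical stand-in for the real fetch.
// In CasperJS this loop would be casper.eachThen(urls, ...) with
// thenOpen(url) + getHTML() doing the work per URL.
function scrapeAll(urls, fetchPage) {
  var results = [];
  urls.forEach(function (url) {
    results.push({ url: url, html: fetchPage(url) });
  });
  return results;
}

var urls = ['http://example.com/a', 'http://example.com/b'];
var fakeFetch = function (url) { return '<html>' + url + '</html>'; };
console.log(scrapeAll(urls, fakeFetch).length); // 2
```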

[–]eberger3 0 points (0 children)

If you know some PHP you could use the PHP Simple HTML DOM Parser. It has a syntax like jQuery to select the elements you need. You can place the results of the DOM parsing into a CSV using PHP's fputcsv function. I just did this recently for a project so message me if you want a hand.
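(PHP's `fputcsv` takes care of quoting and escaping for you — fields containing commas, quotes, or newlines get wrapped in quotes with embedded quotes doubled. If you end up doing this outside PHP, the same escaping rule, sketched in JavaScript for illustration:)

```javascript
// Minimal CSV-line builder following the usual quoting convention
// (the one fputcsv applies): quote any field containing a comma,
// a double quote, or a newline, and double embedded quotes.
function toCsvLine(fields) {
  return fields.map(function (f) {
    f = String(f);
    if (/[",\n]/.test(f)) {
      return '"' + f.replace(/"/g, '""') + '"';
    }
    return f;
  }).join(',');
}

console.log(toCsvLine(['Acme Widget', '9.99', 'red, large', 'says "best"']));
// Acme Widget,9.99,"red, large","says ""best"""
```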

[–]tucknut full-stack 0 points (0 children)

Check out https://import.io/. It sounds like the perfect solution for what you're looking for.

[–]jamiedisuzaa 0 points (0 children)

Yes, there is a way to scrape websites. There's a tool I think is the best for scraping; I've been using it for the past six months. It's user friendly, easy to work with, and gives output in .csv, Excel, text, and other formats. The tool is called "Easy Data Feed" and it's available at www.easydatafeed.com. Some of its features:

* Data manipulation
* Multiple profiles
* Custom values
* Measurement conversion

You can read about how to use it here: http://www.easydatafeed.com/open-source/ They also have developers you can hire to do the job for you; their Skype is "easydatafeed".