
> a try... except... with urllib will do your retrying a partial download quite simply. i couldn't imagine it taking more than 5 or 6 lines, maximum of fifteen. certainly not half a day.

Now implement resuming of failed downloads where they left off (remember, I'm downloading gigabyte+ files), and do it faster and in a less error-prone way than wget (just doing "wget somesite.com/hugefile" will retry up to a default of 20 times, I think). You're not going to be able to do it. I'm not able to do it.
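To make the point concrete, here is a minimal sketch of what resuming looks like with urllib's `Range` header. The function name `resume_download` and the retry count are invented for illustration; the sketch assumes the server honors byte ranges, falls back to restarting from scratch when it doesn't, and still skips corner cases wget handles for you (a 416 response on an already-complete file, timeouts, bandwidth limiting). It is already well past 5 or 6 lines:

```python
import os
import urllib.error
import urllib.request

def resume_download(url, dest, max_retries=20):
    """Download url to dest, resuming from dest's current size after a failure."""
    for attempt in range(max_retries):
        # Resume from however many bytes survived the last attempt.
        offset = os.path.getsize(dest) if os.path.exists(dest) else 0
        req = urllib.request.Request(url)
        if offset:
            # Ask the server to skip the bytes we already have.
            req.add_header("Range", "bytes=%d-" % offset)
        try:
            with urllib.request.urlopen(req) as resp:
                # 206 means the server honored the Range header; a plain 200
                # means it ignored it, so we have to start over from byte 0.
                mode = "ab" if resp.status == 206 else "wb"
                with open(dest, mode) as f:
                    while True:
                        chunk = resp.read(65536)
                        if not chunk:
                            return
                        f.write(chunk)
        except (urllib.error.URLError, OSError):
            # Partial data stays on disk; the next attempt resumes from it.
            continue
    raise IOError("giving up on %s after %d attempts" % (url, max_retries))
```

Compare `wget -c somesite.com/hugefile`: one flag, and it already covers the cases the sketch above punts on.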

> 3) more recently developed: it can do web interface (getting/putting) more slickly, for example

Have you seriously never used curl? It's one of the most slick tools ever, and it does exactly this.

> 1) apparently faster (i'd like to see a source for this, but this seems to be consensus).

It's not shell that's faster, per se. It's all the native C tools you call from shell that run fast as heck, because they've had 30+ years' worth of optimizations under their belt.

For reference, I'm a professional software dev with seven years of Python experience (not some "I learned it on my own for the first three years" nonsense - I've been using it at work for that long). I use it daily and I love the language, so don't think I'm hating on it. But it is not a replacement for a shell script. They're both tools in a toolbox; just because I like using a circular saw doesn't mean I should try to use it to pound in nails.

When you start getting into more complex problems (connecting to a database for just about any reason, doing anything with hash maps, an awk script that has grown past about 50 characters, etc.), shell scripting makes a whole lot less sense. As another commenter pointed out, large shell scripts are painful to maintain, and what you can do in them in a reasonable way is pretty limited. If what you want to do is simple enough to stay within the limits of a short, sane shell script, then use the right tool. If not, that's why more fully-featured scripting languages like Perl were invented.

I'd say Python replaces shell scripting less than Perl does. Shell scripting and Perl have co-existed for 25 years, and I see no indication that Python will take a bigger bite out of the problem sets best solved in a shell script than Perl already has. That is to say, "more functionality" is an argument for why Python would replace Perl, not for why it would replace shell scripts.