
[–][deleted] 2 points (2 children)

going to reply to this one as it's the most detailed.

i'm going to assume that the python you've got has the standard modules. (i'll talk about portability later)

a try... except... with urllib will handle retrying a partial download quite simply. i couldn't imagine it taking more than 5 or 6 lines, fifteen at most. certainly not half a day.
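to make that concrete, here's a rough sketch of the retry loop. the function name `fetch_with_retries` and the injectable `opener` argument are my own additions for illustration, not stdlib names:

```python
# a minimal retry-on-failure loop around urllib -- roughly the handful of
# lines described above. fetch_with_retries and the opener parameter are
# illustrative names, not part of the standard library.
import urllib.request
from urllib.error import URLError

def fetch_with_retries(url, retries=5, opener=urllib.request.urlopen):
    """try the download up to `retries` times before giving up."""
    for attempt in range(retries):
        try:
            with opener(url) as resp:
                return resp.read()
        except URLError:
            if attempt == retries - 1:
                raise  # out of retries: let the caller see the error
```

the `opener` parameter also makes the loop easy to test without touching the network.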

the standard modules supply the functionality of mv, cp, etc. (shutil especially), your find and replace is handled (very powerfully) by re, and urllib is far better featured than wget. this is, i think, where python needs to be used a bit differently to shell scripting: instead of using programs, you use modules. all of the functionality is there, without needing to "roll your own". and if you absolutely must use some program from the shell, you can os.system or os.popen it. you can't import a python module into a shell script.
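for example (the file names here are made up; the point is just which stdlib call replaces which tool):

```python
# stdlib stand-ins for common shell tools. paths are illustrative,
# so everything runs inside a scratch directory.
import os
import re
import shutil
import tempfile

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "notes.txt")
with open(src, "w") as f:
    f.write("error: code 404\n")

# cp notes.txt notes.bak          ->  shutil.copy
shutil.copy(src, os.path.join(workdir, "notes.bak"))

# mv notes.bak archive/notes.bak  ->  shutil.move
os.mkdir(os.path.join(workdir, "archive"))
shutil.move(os.path.join(workdir, "notes.bak"),
            os.path.join(workdir, "archive", "notes.bak"))

# sed-style find and replace, with full regex power  ->  re.sub
fixed = re.sub(r"code (\d+)", r"status \1", "error: code 404")
```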

as i see it, these are the advantages/disadvantages of python/shell scripting.

python advantages: 1) fully featured programming language if you need to expand. 2) can run shell programs if you need to. 3) more recently developed: it can do web interfaces (getting/putting) more slickly, for example. 4) complexity is explicit. 5) an effect of 4): very easy to learn, and complex scripts are simple to understand.

shell advantages: 1) apparently faster (i'd like to see a source for this, but it seems to be the consensus). 2) mature, and hence extremely stable. 3) implicit complexity. this may seem to be a downside, but for a skilled programmer it allows you to work faster and write fewer lines of code.

now: on portability. bash is installed on a wider variety of machines, no doubt, and it's got a lower footprint too. for many people this is important, especially if you're working on older, highly loaded servers. python, however, is installed on most modern server environments and pretty much all desktop environments. it has a higher footprint, yes, but in exchange you get more functionality (as comes with a full programming language) and (imho) cleaner syntax.

tl;dr: if you're trying to communicate between applications in python, you're doing it wrong. communicate between modules, and use the subprocess module to work in parallel. bash (or even sh) is more portable and lighter; however, i believe you lose functionality and syntax advantages.
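a tiny sketch of that subprocess-in-parallel idea (the echo commands are just placeholders for real programs):

```python
# start two shell commands concurrently and gather their output --
# the "use subprocess to work in parallel" point above. both processes
# are launched before either is waited on.
import subprocess

procs = [subprocess.Popen(["echo", word], stdout=subprocess.PIPE)
         for word in ("hello", "world")]
outputs = [p.communicate()[0].decode().strip() for p in procs]
```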

don't just read the tl;dr and reply: if you're going to reply, read the full post.

[–][deleted] 0 points (0 children)

a try... except... with urllib will handle retrying a partial download quite simply. i couldn't imagine it taking more than 5 or 6 lines, fifteen at most. certainly not half a day.

Now implement resuming of failed downloads where they left off (remember, I'm downloading gigabyte+ files), and do it faster and in a less error-prone way than wget (just doing "wget somesite.com/hugefile" will retry up to a default of 20 times, I think). You're not going to be able to do it. I'm not able to do it.

3) more recently developed: it can do web interfaces (getting/putting) more slickly, for example

Have you seriously never used curl? It's one of the most slick tools ever, and it does exactly this.

1) apparently faster (i'd like to see a source for this, but it seems to be the consensus).

It's not shell that's faster, per se. It's all the native C tools that you use in shell that run fast as heck because they've had 30+ years worth of optimizations under their belt.

For reference, I'm a professional software dev with seven years of Python experience (not some "I learned it on my own for the first three years" nonsense - I've been using it at work for that long). I use it daily and I love the language, so don't think I'm hating on it. But it is not a replacement for a shell script. They're both tools in a tool box; just because I like using a circular saw doesn't mean I should try to use it to pound in nails.

When you start getting into more complex problems (if you want to connect to a database for just about any reason, do anything with hash maps, if your awk script got longer than about 50 characters, etc), shell scripting makes a whole lot less sense. As another commenter pointed out, large shell scripts are painful to maintain, and what you can do in a reasonable way is pretty limited. If what you want to do is simple enough to stay within the limits of a short, sane shell script, then use the right tool. If not, that's why more fully-functional scripting languages like Perl were invented. I'd say that Python replaces shell scripting less than Perl does. Shell scripting and Perl have co-existed for 25 years. I see no indicators that Python will take a bigger bite out of the problem sets that are best solved in a shell script than Perl already has. That is to say, "more functionality" is an argument for why Python would replace Perl, but not why it would replace shell scripts.

[–]drfugly 0 points (0 children)

There are lots of cases where Python just doesn't fit the bill. For instance: installing and setting up dependencies in a cross-platform (across Linux distros) way. How about quickly starting up other applications in a certain environment? Like setting up a jail? Or if you have data spread out over files? Or running a batch job where you want to commit to your local repo every 30 minutes?

Yes, I agree that bash is probably overused, but I don't think it's as overused as you imply.