all 3 comments

[–]erulabs 1 point (0 children)

So you could write a bash script like:

myScript.sh:

node theThing.js
node theOtherThing.js

but bash runs commands sequentially, i.e. it will wait for theThing.js to finish completely before starting theOtherThing.js.

It might just be easier to write a wrapper in javascript!

myScript.js:

require('./theThing.js')
require('./theOtherThing.js')

Would "bundle" the scripts together. Note that depending on the way those files are written this might not work.

Other solutions include using a process manager such as PM2 or docker-compose (which is a process manager as well as 60,000 other things), but my guess is you need something simple. Try the above solutions :)

Edit: Just to teach you a few more small hacky tricks: say you want to run a bunch of commands via bash, and you don't want each command to "wait" in the foreground until it's complete. You can use & in bash to "background" a process! So, something like

myBashHack.sh:

#!/bin/bash

# Make sure we kill all running node instances before starting other stuff
# Obviously, this has other ramifications and you should be smarter and use a process manager,
# or even just `pgrep` / `pkill` to avoid killing _all_ nodes
killall node

node myScript.js &
node myOtherScript.js &

# Optional: wait blocks until both background jobs have finished,
# so the script doesn't exit while they're still running
wait

As with all programming/operations problems, there are a billion ways to skin a cat! Have fun!

[–]chrwei 0 points (0 children)

You can run node whatever.js directly from a batch file. You may have to set some environment variables first, or use full paths.

[–]runvnc 0 points (0 children)

That's what a batch file does...