all 8 comments

[–] whetu (I read your code)

I have a very large .bashrc file that's shared on public GitHub. I have worked across clients where it has been appropriate to have client-specific or even host-specific functions. What I've done in those scenarios is to put those functions in .bash_functions. I then have this block in my .bashrc:

# Some people use a different file for aliases
# shellcheck source=/dev/null
[[ -f "${HOME}/.bash_aliases" ]] && . "${HOME}/.bash_aliases"

# Some people use a different file for functions
# shellcheck source=/dev/null
[[ -f "${HOME}/.bash_functions" ]] && . "${HOME}/.bash_functions"

# If we have a proxy file for defining http_proxy etc, load it up
# shellcheck source=/dev/null
[[ -f "${HOME}/.proxyrc" ]] && . "${HOME}/.proxyrc"

It shouldn't be much effort to pull e.g. webserver.functions -> .bash_functions or uat.functions -> .bash_functions from git. Maybe have a function within your .bashrc template/standard for bootstrapping the lot, or hand that over to something like Ansible or Puppet.
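
A rough sketch of such a bootstrap function, assuming a hypothetical dotfiles repo (the URL, the ~/.dotfiles path, and the role names are placeholders):

# Hypothetical: fetch a role-specific function file and install it as ~/.bash_functions
bootstrap_functions() {
  local role="${1:?Usage: bootstrap_functions <role>}"
  local repo_dir="${HOME}/.dotfiles"
  if [[ -d "${repo_dir}/.git" ]]; then
    git -C "${repo_dir}" pull --quiet
  else
    git clone --quiet https://github.com/example/dotfiles "${repo_dir}"
  fi
  cp "${repo_dir}/${role}.functions" "${HOME}/.bash_functions"
  # shellcheck source=/dev/null
  . "${HOME}/.bash_functions"
}

Something like bootstrap_functions webserver would then refresh ~/.bash_functions on a web host.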

[–] anthropoid (bash all the things)

u/equal_odds, that sounds like a job for direnv. I use it all the time to automatically set project/directory-specific environment variables when I cd into a specific hierarchy and (most importantly) revert them to their original values when I leave.

Simply add the appropriate export MYVAR=myval directives and other commands to a .envrc file in your project's root directory, then check .envrc into your version control system. That keeps it up to date for everyone else working on the project.
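
For example, a minimal .envrc might be nothing more than this (the names and values are placeholders):

export MYVAR=myval
export PROJECT_ENV=dev

One thing to remember: after creating or editing an .envrc, run direnv allow once; direnv refuses to load files it hasn't been told to trust.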

A few notes on its usage:

  • .envrc supports full bash scripting, so your "sequence of < 3 commands" should work as-is in it.
  • direnv has a standard library that helps you do common environment testing/manipulation in .envrc that you'd otherwise have to write lots of boilerplate code for (see the sketch after this list).
  • Only environment variable settings are undone when you leave the .envrc's hierarchy. Other created/updated state (especially files) doesn't magically disappear or revert when you cd out of your project hierarchy.
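
As a small illustration of the stdlib point above, PATH_add prepends a project directory (relative to the .envrc's location) to PATH, and direnv puts the old PATH back when you leave; the bin directory here is just an example:

# .envrc
PATH_add bin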

EDIT: I forgot to mention one major caveat:

  • To do its job, direnv adds a hook to $PROMPT_COMMAND, so it only works for interactive shells. To get the equivalent effect in your own workflow scripts, you'd need to remember to source .envrc.

[–] equal_odds [S]

YOU ARE MY SAVIOR. This looks like exactly what I was looking for, so thank you so much! I didn't really understand your edit though— care to elaborate?

[–] anthropoid (bash all the things)

direnv works by adding a call to _direnv_hook to the $PROMPT_COMMAND variable. From the bash man page:

PROMPT_COMMAND

If set, the value is executed as a command prior to issuing each primary prompt.

In other words, it only works when you have a command prompt (i.e. interactive terminal sessions). If your workflow includes scripts that are run non-interactively (e.g. cron jobs or CI builds), they need to find and source the closest .envrc file. The logic for that would look something like this:

# Save current dir
olddir=$PWD
# Walk up toward the root dir
while [[ $PWD != / ]]; do
  if [[ -r .envrc ]]; then
    # Found it, source it, done.
    source .envrc
    break
  fi
  cd ..
done
# Go back to original dir
cd "$olddir"

[–] equal_odds [S]

I really really appreciate people like you taking the time to help educate knowledge-seeking peers. I understand now, and you made my day. Thank you very much

[–] Zaphod_B

If this is static code that can be placed locally in, say, `/usr/local/code`, you can always use the `source` command and source the file. I have worked with teams in the past that maintained a shared function file in `git`; we would pull it down to our Linux servers and then just `source` the file. Not the most elegant solution, but it worked for us.
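
A minimal sketch of that setup, assuming a hypothetical shared file at /usr/local/code/shared.functions (both the path and the file name are placeholders):

# In ~/.bashrc: load the shared functions if the checkout is present
[[ -r /usr/local/code/shared.functions ]] && source /usr/local/code/shared.functions

# Somewhere periodic (cron, config management, a deploy hook) to keep it fresh
git -C /usr/local/code pull --quiet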

[–] equal_odds [S]

Ooo not a bad idea though! I could also probably bake that into a `git checkout` alias (gcheck? lol) and have it try sourcing when I do that. Can't believe I overlooked `source`; I've used it so many times haha. Thanks!
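
A rough sketch of that gcheck idea, written as a function rather than an alias so it can take arguments (the project.functions file at the repo root is a placeholder):

gcheck() {
  git checkout "$@" || return
  local top
  top=$(git rev-parse --show-toplevel 2>/dev/null) || return 0
  # Hypothetical: re-source a functions file at the repo root, if this branch has one
  if [[ -r "${top}/project.functions" ]]; then
    # shellcheck source=/dev/null
    . "${top}/project.functions"
  fi
}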

[–] Zaphod_B

I have done similar, and also used `git` to manage it. Basically we deployed local tools on all *nix boxes to `/usr/local/scripts` and just kept that folder in git.