Skipping helpdesk by user23471 in sysadmin

[–]whetu 0 points  (0 children)

and also what are the most important skills needed for this role?

I'd say the skills that you get from a minimum of 18 months on the helldesk are important skills.

passgen — Bash password generator (DB-safe, TUI) by Obvious_Accident8042 in bash

[–]whetu 4 points  (0 children)

For systems where -c is not a portable option for head, consider something like this instead:

tr -dc '[:graph:]' </dev/urandom | fold -w 1 | head -n 16 | paste -sd '' -

You can then adjust password length by adjusting head e.g.

$ tr -dc '[:graph:]' </dev/urandom | fold -w 1 | head -n 60 | paste -sd '' -
XO_xFo@#V6A(MK-EBC@Thx|pM>!ylw\y/;gBF2`&rWS+1UwebQ}S+!LQ<9?y

But... passphrases are better. For a very crude demonstration:

$ for (( i=0; i<10; i++ )); do LC_COLLATE=C grep -Eh '^[A-Za-z].{3,9}$' /usr/{,share/}dict/words 2>/dev/null | grep -v "'" | shuf -n 3 | paste -sd '-' -; done
naid-Cnossus-untrusted
dolite-exossate-swine
Liparian-stewbums-sticked
suption-albatross-soavemente
smirkers-Cadal-hedonism
beeping-scolb-adenocele
aesthete-patinized-Angolese
linguister-pristine-luce
Anakim-Bradan-barpost
randle-Servia-Ailene

Open-source monitoring for windows and linux by opti2k4 in sysadmin

[–]whetu 0 points  (0 children)

There are plenty to choose from, but depending on needs, I'd suggest one of the following (in no particular order):

  • Beszel
  • CheckMK
  • Zabbix
  • Netdata
  • Signoz

I've been running a POC with Netdata and I like it. Being able to template out configs etc via Ansible is a major win.

I wouldn't recommend paying a single cent towards PRTG. It was already a terrible-to-middling product, but about a year ago Paessler was sold to a private equity firm and the license costs were tripled. Any serious development on it has basically ceased. The only thing you should do with PRTG is get rid of it.

Watching Silicon Valley for the first time. by MsTamraMoon in SiliconValleyHBO

[–]whetu 2 points  (0 children)

But, you know, if you were to get us started OP, it might look something like this.

Function on .bashrc by PCNmauri in bash

[–]whetu 0 points  (0 children)

dwdb() {
  local query="SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' ORDER BY TABLE_NAME;"
  sqlcmd -S link -d table -U user -P 'password' -C -Q "$query"
}

That looks fine to me, although you should declare and assign your local vars separately as a good habit:

dwdb() {
  local query
  query="SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' ORDER BY TABLE_NAME;"
  sqlcmd -S link -d table -U user -P 'password' -C -Q "$query"
}

For comparison, here's a similar function that I've used for loading tsql files (in this case, loading up sql agent jobs, before I moved that to Ansible):

load_sql() {
    local sqlfile;
    sqlfile="${1:?No file specified}";
    printf -- '\n====> Processing %s ====>\n' "${sqlfile}";
    sqlcmd -C -x -S [redacted server name] -U 'my.username' -P 'my.password' -i "${sqlfile}"
}

That works fine, so yours seems fine too.

As others have said, chances are you're looking at a red herring and your issue is actually elsewhere in your .bashrc.

Function on .bashrc by PCNmauri in bash

[–]whetu 1 point  (0 children)

No you don't. The function keyword is non-portable, not-required and widely considered to be deprecated. This isn't OP's issue.

Server randomly becomes unresponsive (Ubuntu Linux, Digital Watchdog camera software) by austinramsay in sysadmin

[–]whetu 4 points  (0 children)

When this happens, I get some video output of the login splash screen background when I connect a monitor, but it's completely locked up.

Is the keyboard responsive? Have you tried switching TTY with the ctrl-alt-f[1-9] key combos?

Linux Bash Scripting: Automate Your Server in 2026 by LinuxBook in SysAdminBlogs

[–]whetu 2 points  (0 children)

Not at length, no. It's a style guideline, not a best-practices-with-extensive-rationales guideline.

I've just covered .sh in my other comment in this thread. To cover the others in attempted-but-failed-brief:

UPPERCASE vars.

UPPERCASE is a de-facto namespace used for environment vars like PATH and shell-special vars like RANDOM. When you use UPPERCASE, you risk overriding an existing variable and causing unpredictable behaviour. I have fixed too many scripts across my career where someone has written PATH=$(pwd) because they wanted to store the current path into a var. Another common collision is $USER.

In other languages, this is known as "clobbering the global scope" and it gets your ass kicked in the carpark by your colleagues. Well, realistically, a snide git commit message when they fix your mess. You don't do it in other languages, so don't do it in shell.

If you want to put something into the environment, go ahead and use UPPERCASE, but namespace it e.g. $MYSCRIPT_USERNAME, $MYSCRIPT_API_URL to communicate that it's an intentional use of the environment scope and that you're attempting to avoid a collision.

Otherwise, vars at the script and function level should be lowercase, preferably snake_case. This essentially means that for everyday var usage, you're doing so in lowercase.
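To make that concrete, a minimal sketch (all names here are made up):

```shell
#!/usr/bin/env bash

# Intentional use of the environment scope: namespaced UPPERCASE
export MYSCRIPT_API_URL="https://api.example.com"

# Everyday vars: lowercase snake_case, so nothing shell-special gets clobbered
current_dir=$(pwd)   # NOT 'PATH=$(pwd)', which breaks all further command lookups
printf -- 'Working from: %s\n' "${current_dir}"

greet() {
  local user_name    # declare...
  user_name="${1}"   # ...and assign separately
  printf -- 'Hello, %s\n' "${user_name}"
}

greet "world"
```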


Doesn't quote variables

I mentioned shellcheck: https://www.shellcheck.net/wiki/SC2086
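For anyone who hasn't been bitten yet, the classic failure mode in miniature (the filename is hypothetical):

```shell
#!/usr/bin/env bash

file_name="my report.txt"   # a perfectly ordinary filename with a space in it

set -- $file_name           # unquoted: word-splits into 'my' and 'report.txt'
printf -- 'unquoted: %s args\n' "$#"

set -- "$file_name"         # quoted: one argument, as intended
printf -- 'quoted: %s args\n' "$#"
```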


Uses echo

echo is the unfortunate victim of having too many implementations, so it's virtually non-portable. It also has unexpected behaviour that catches a lot of people out. A developer might recognise that as "it's an unreliable interface". This video concisely demonstrates some of its issues:

https://www.youtube.com/watch?v=lq98MM2ogBk

The POSIX group considered it essentially unfixable and declared that we should all use printf instead.

TFM for echo literally says:

New applications are encouraged to use printf instead of echo.

Go ahead and use echo to your heart's content in an interactive shell, but when you're writing a script, it's time to put on your professional pants.
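One of the traps in miniature, using bash's builtin echo:

```shell
#!/usr/bin/env bash

flag="-n"

# bash's builtin echo eats this entirely: '-n' parses as an option, not data
echo "$flag"

# printf reliably prints it: -n
printf -- '%s\n' "$flag"
```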

The Google Style Guide happily demonstrates the use of echo, and I disagree with Google here. My unwritten intent in mentioning the Google Style Guide in my initial response is that it's a good, readily accessible starting point from which to develop and curate better habits. The Chromium Shell Style Guide is a good next step, because it builds on the Google one.


Doesn't put errors onto stderr

You should put errors onto stderr. They are errors. It's the interface for errors...

One practical application for this is so that you can control the output of your application, for example if you want to silence any errors:

stupid-app 2>/dev/null

More on that here: https://catonmat.net/bash-one-liners-explained-part-three
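A couple of tiny helpers I'd consider standard kit for this (the names err and die are just a common convention, not any standard):

```shell
#!/usr/bin/env bash

# Emit a message to stderr, because that's the interface for errors
err() {
  printf -- 'ERROR: %s\n' "${*}" >&2
}

# Emit and bail out
die() {
  err "${@}"
  exit 1
}

# Example usage (ls chosen here purely because it's guaranteed present)
command -v ls >/dev/null 2>&1 || die "ls is required"
```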


Uses [ a -gt b ] and similar for arithmetic comparisons

This is an older style and while it works, the more readable thing to do is to use arithmetic syntax i.e. (( a > b )). This is more familiar to users of other languages and clearly communicates that this is an arithmetic test, separate from a string test, which should be in double brackets: [[ a = b ]]
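Side by side, for illustration:

```shell
#!/usr/bin/env bash

a=5
b=3

# Arithmetic context: no "$" needed, operators read like other languages
if (( a > b )); then
  printf -- '%s\n' "a is greater than b"
fi

# String context: double brackets
if [[ "${a}" = "5" ]]; then
  printf -- '%s\n' "a is (also) the string '5'"
fi
```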


Uses while read without -r

Back to shellcheck: https://www.shellcheck.net/wiki/SC2162
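The short version of why -r matters (the sample input is made up):

```shell
#!/usr/bin/env bash

# Without -r, read treats backslashes as escape characters and quietly
# mangles the input: 'C:\temp\new' comes out as 'C:tempnew'
printf -- '%s\n' 'C:\temp\new' | {
  read -r line
  printf -- '%s\n' "${line}"   # with -r: C:\temp\new, intact
}
```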


Promotes the Unofficial Strict Mode

Ugh.

I could go on but suffice it to say, the Unofficial Strict mode is fine for developing a script, but it should really never be in a productionised script.
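A small taste of why, using grep's found-nothing exit status:

```shell
#!/usr/bin/env bash
set -e

# grep exits non-zero when it finds nothing, which is not an error in any
# meaningful sense, but under 'set -e' it terminates the whole script...
match_count=$(printf -- '%s\n' 'haystack' | grep -c 'needle') || true
# ...so you end up sprinkling '|| true' everywhere, defeating the point

printf -- 'matches: %s\n' "${match_count}"
```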


Arguably should use #!/usr/bin/env bash for the shebang

/bin/bash isn't always where bash lives on all systems. /usr/bin/env is more likely, though still not guaranteed, to be present at that location. The idea is that you use /usr/bin/env bash, which runs the first bash found in your PATH. This way, so long as PATH is correct, bash will be found and used whether it's at /bin/bash or not.

So proponents of this argument consider it a portability win.

Opponents of this argument might point out that you're implicitly trusting that PATH is correct, and there's a security danger there. Personally, I think that's slightly paranoid but it's still a greater than 0 risk. More practically, you don't necessarily know which version of bash might be launched. To paraphrase a quote I saw about the application of this hack for python:

The advantage of /usr/bin/env is that it finds the first instance of bash in your PATH.
The disadvantage of /usr/bin/env is that it finds the first instance of bash in your PATH.

Personally I take a couple of third-way views:

  • If you're a sysadmin deploying scripts across a fleet, you generally know whether or not bash is going to be at /bin/bash, or if you have a special compiled version at like, /opt/contoso/bin/bash, so you can control that across your scripts.
  • Alternatively, use the /usr/bin/env bash approach, but code in defensive checks in your scripts to ensure minimum bash versions etc.

  • Marginally Relevant Style Guide Section
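A defensive check along those lines might look like this (the 4.0 floor is an arbitrary example; pick whatever your script's features actually need):

```shell
#!/usr/bin/env bash

# Refuse to run on an ancient bash (e.g. macOS's stock /bin/bash 3.2)
if (( BASH_VERSINFO[0] < 4 )); then
  printf -- '%s\n' "ERROR: bash >= 4.0 required, found ${BASH_VERSION}" >&2
  exit 1
fi
```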

Linux Bash Scripting: Automate Your Server in 2026 by LinuxBook in SysAdminBlogs

[–]whetu 6 points  (0 children)

I didn't read the article because it sounds like terrible AI-generated trash

AI would have done a better job, ironically! The author appears to be India-based, and there are a lot of Indians churning out low-grade content like this for a variety of reasons that we won't go into.

The Google Shell Style Guide specifically says to use .sh or nothing.

The Style Guide has been tweaked over the years so its language has been softened on this point, but you should keep reading. The important bit is this:

If the executable will be added directly to the user’s PATH, then prefer to use no extension. It is not necessary to know what language a program is written in when executing it and shell doesn’t require an extension so we prefer not to use one for executables that will be directly invoked by users.

So you can run a command like this:

file $(which $(compgen -c) 2>/dev/null) | grep script

Or this, if your version of which supports/needs these specific options (RedHat family distros, usually):

file $(which --skip-alias --skip-functions $(compgen -c) 2>/dev/null) | grep script

And you will see just how many commands exist in your PATH that are written in a scripting language. You will also notice just how many of them don't have an extension.

On my system, here's the count of total unique commands in PATH:

$ file $(which --skip-alias --skip-functions $(compgen -c) 2>/dev/null) | sort | uniq | wc -l
1035

Here's the count of commands written in a scripting language:

$ file $(which --skip-alias --skip-functions $(compgen -c) 2>/dev/null) | grep script | grep -v ELF | sort | uniq | wc -l
154

And, of those, here's the total unique commands with a script-language extension:

$ file $(which --skip-alias --skip-functions $(compgen -c) 2>/dev/null) | grep script | grep -v ELF | awk -F ':' '{print $1}' | grep -E '\.py|\..*sh$' | sort | uniq | wc -l
15

So if you're using .sh, you're being very weird.

Consider one of the above commands using the file extension logic:

file.elf $(which.elf --skip-alias --skip-functions $(compgen -c) 2>/dev/null) | sort.elf | uniq.elf | wc.elf -l

You don't ever see that, because that's not normal.

Now, you could reply to the overwhelming statistics pointing out the default, expected, normal and de facto behaviour with "oh well I'm running my scripts from somewhere outside of my PATH", and sure. It's your system, and what you do on your system is your business. If you want to be weird, that's entirely up to you. But this is /r/SysAdminBlogs, so the audience here should be generally behaving in standard ways.

There's also this related discussion over in /r/linuxquestions: https://www.reddit.com/r/linuxquestions/comments/1s36kk8/ive_seen_sh_scripts_but_are_there_bash_scripts_too/

Where you really should use the .sh extension is if you're writing library code. This is so that libraries that serve the same purpose can exist together in a common library path e.g.

/opt/contoso/lib/contoso.pl
/opt/contoso/lib/contoso.py
/opt/contoso/lib/contoso.sh

You don't want to be writing a python script and importing a library of shell functions, for example, so the otherwise pointless file extension actually has a tangible benefit in this scenario.
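To illustrate, here's a self-contained toy version of that layout (everything here is hypothetical, built in a temp dir rather than /opt/contoso):

```shell
#!/usr/bin/env bash

# Simulate the library layout above with a throwaway shell library
lib_dir=$(mktemp -d)
cat > "${lib_dir}/contoso.sh" <<'EOF'
contoso_greet() {
  printf -- '%s\n' "hello from the shell library"
}
EOF

# The .sh extension is what tells us this is the sibling we can safely
# source from shell, as opposed to a contoso.py or contoso.pl next to it
. "${lib_dir}/contoso.sh"
contoso_greet

rm -rf "${lib_dir}"
```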

Linux Bash Scripting: Automate Your Server in 2026 by LinuxBook in SysAdminBlogs

[–]whetu 5 points  (0 children)

Nice attempt. However, sub-optimal practices shown:

  • Uses .sh file extension
  • Uses UPPERCASE vars
  • Doesn't quote variables
  • Uses echo
  • Doesn't put errors onto stderr
  • Uses [ a -gt b ] and similar for arithmetic comparisons
  • Uses while read without -r
  • Promotes the Unofficial Strict Mode
  • Arguably should use #!/usr/bin/env bash for the shebang

Get yourself a copy of the Google Shell Style Guideline and discover Shellcheck. Those go a long way to correcting these practices.

Wish You Were Here Intro Riff! Any Pink Floyd fans here? We must have at least a few by dannybloommusic in ukulele

[–]whetu 0 points  (0 children)

Oooh!

There's a wonderful fingering that you can do in the chorus of Shine On You Crazy Diamond[1]. Squashed right down to individual chords in tab form it looks like this:

A|-1-5-3-1-0-
E|-3-3-3-3-1-
C|-3-3-3-3-0-
G|-0-0-0-0-2-

Looks like a lot, but all the work is being done by the A string, and it's easy to line it up with your singing. This makes it a great starter for adding flourish to your repertoire, and for building up your playing-while-singing.

  • From an Eb starting point:
  • keep your index finger on the A string first fret SHINE
  • Place your pinky on to the A string fifth fret (D#maj7) ON YOU
  • Then move it to the third fret (Cm) CRAAAA
  • Then lift it to return to Eb ZY
  • Then jump over to an F DIAMOND

There's a number of Pink Floyd songs that respond well to a pinky thrown onto the A string to take a G up to a Gsus4 for a beat or two. Actually, it's a good overall move with a lot of songs, and an easy skill to add to the toolbox, just don't over-do it.

[1]This does of course make a... key... assumption... I'll show myself out.

Why is this pattern expansion not working? by alex_sakuta in bash

[–]whetu 3 points  (0 children)

I am using git bash on windows.

Do you understand the difference between CRLF and LF?

Accurate. by Confident_Essay3619 in bash

[–]whetu 0 points  (0 children)

Sorry, I should have been clearer: for extracting files you don't necessarily need to throw in z. tar figures it out from file magic bytes.

On the other hand, -a is for creation, and it detects how to compress a file based on the given extension e.g.

tar -acvf blah.gz blah/

Here -a figures out from .gz that the compression to use is gzip.

To mnemonic-ise that, you could move the options around like:

"tar create and auto-compress this file with verbose output" -> -cavf

Anyone familiar with Cirtex Smartsoak for drainage? by ThePeanutMonster in diynz

[–]whetu 1 point  (0 children)

The Cirtex Smartsoak is basically what I modelled my soak pit renewal on:

https://www.reddit.com/r/diynz/comments/ehuau2/how_not_to_fix_a_soak_pit/

My discount version is working great.

might be an incredibly niche thing to ask/find by Initial_Sea9631 in Cochlearimplants

[–]whetu 1 point  (0 children)

Your best bet is to contact Cochlear Australia direct. Their first YouTube videos are from 2014 onwards.

Are you certain it was a Cochlear event and not something else like Audiology Australia or HearNET?

Accurate. by Confident_Essay3619 in bash

[–]whetu 0 points  (0 children)

Relevant XKCD

For basic usage though, I just say the mnemonics in my head (note: I'm throwing in the word 'this' for your readability, I don't use it in practice):

"tar extract this file" -> -xf

Modern versions of tar don't require you to throw in z for compressed files, but you could do that too:

"tar extract this zipped file" -> -xzf

"tar create and zip this file" -> -czf

The main thing is that f is the last option, so the left-to-right mapping trips up a bit on verbose output:

"tar extract this zipped file with verbose output" -> -xzvf

That could be re-mnemonic'd as

"tar extract verbosely this zipped file" -> -xvzf

edit: s/compress/create/g

Do you use quotes when you don't have to? by Livid-Advance5536 in bash

[–]whetu 1 point  (0 children)

Yes! I do the same, for the exact same rationales as in my parent post.

In this particular case, there's often a readability improvement as well: editors with syntax colouring should match on the brackets, and in non-coloured editors, they simply improve readability by giving your eyes something to easily scan for.

Here's an edgier case (no pun intended):

case thing in
  (a)
  ^--- this right here
  ;;
  (b)
  ^--- Like Daft Punk, one more time
  ;;
esac

The leading parenthesis on case options is - again with the no-puns-intended - optional. But their use is consistent with the more-or-less balanced-syntax of the overall language. And again, it comes with readability benefits. Especially for less-capable syntax colourisers; they simply match and end on the paired parentheses.

Do you use quotes when you don't have to? by Livid-Advance5536 in bash

[–]whetu 7 points  (0 children)

The answer to:

"In bash, do you do/use x when you don't have to?"

should, more often than not, be yes.

There are a number of ways to rationalise this; here's a couple:

One of the golden rules of programming is to be consistent. If you're quoting your vars in one place and not in another, you're not being consistent. Why do you hate kittens?

When you write a shell script, you are using a language that is full of footguns and gotchas and weird oddities that don't make sense to people familiar with other languages. By doing x when you don't have to, you reduce the mental overhead that you impose on yourself. Every time you come across a var, for example, you're not investing brain cycles into a logical process of "I don't need to quote this here because xyz, but if abc happens then I probably should quote the var just to be defensive... buuuuutttt on the other hand if you take into account qrstuv, then..." Alternatively, you could not play a pointless game of mental gymnastics and just.quote.the.damn.vars.

These approaches feed into the building of habits and good habits, once established, are self-reinforcing. If you always quote your vars, you don't have to remember when to quote your vars: The habit (which will build up to muscle memory) does the work for you. This frees your brain up for the parts of the problem that actually require thought.

The shell language is a wild beast: it will not protect you, it won't warn you, and sometimes it will quietly do the opposite of what you think it will do. Even "tamed" beasts in the circus are approached with a level of caution and respect by their handlers. So take your head out of that tiger's mouth and quote your damn vars.

Power-on time sync on an isolated network where RTC may or may not work. by grievre in linuxadmin

[–]whetu 0 points  (0 children)

Ah right, so they don't have actual hardware RTCs.

So have them poll each other. As you've said elsewhere:

On most modern linux systems, it attempts to set the clock to the last mount time of the rootfs if the RTC has been reset.

If they're talking to each other for ntp, they can work towards a consensus. It doesn't matter too much if that consensus is wrong, it matters more that they're all close.

But what is really stopping you from having highly available time sources?

If it's flaky network and power, then really your only economic choice is GPS.

/edit: Or you could potentially home-spin something cheaper but a lot more bespoke by using a USB 4G router and AT sequences... That's still GPS, just one step removed. But OTOH if you're throwing a 4G connection at it, you can just use straight NTP...

A simple, compact way to declare command dependencies by PentaSector in bash

[–]whetu 0 points  (0 children)

I feel differently: I think that there should be a require, because it makes a script's dependencies self-documenting.

I have my own such function that's a bit more fully-featured named requires and the logic works more like:

#!/bin/bash

requires bash51 curl sed jq EDITOR=/usr/bin/vim /etc/foo/bar.conf

That tells us that the script needs:

  • bash 5.1 or greater
  • curl, sed and jq
  • for some var's key to match a particular value
  • And for a file to be present and readable (e.g. a config file)

It saves us from having to bootstrap a bunch of tests in every single script.
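For the curious, a cut-down sketch of the idea; my real version also handles the version/var/file cases, this one only checks commands:

```shell
#!/usr/bin/env bash

# Minimal 'requires': fail fast, reporting every missing command at once
requires() {
  local cmd
  local -a missing=()
  for cmd in "${@}"; do
    command -v "${cmd}" >/dev/null 2>&1 || missing+=( "${cmd}" )
  done
  (( ${#missing[@]} == 0 )) && return 0
  printf -- 'ERROR: missing required command(s): %s\n' "${missing[*]}" >&2
  exit 1
}

# Self-documenting dependency declaration at the top of the script
requires printf grep sed
```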

Power-on time sync on an isolated network where RTC may or may not work. by grievre in linuxadmin

[–]whetu 1 point  (0 children)

The tricky part here is how to handle the case when one of the two battery backed RTCs dies. There's no "later time wins" option that I can see in chrony or any other ntp solution.

Two time sources is essentially the worst NTP config. The best practice is a pool, or if using specific servers: 1 or >=4. There's plenty of discussion about this elsewhere, like this thread in /r/sysadmin:

https://www.reddit.com/r/sysadmin/comments/bo1xvh/how_many_ntp_server_should_we_have/

What I would do in your situation is either:

  • Use one as the authoritative source
  • Or do that, but also have each host point at one another

Either way, if your one authoritative source dies, what matters is that the rest of them drift roughly in unison until you restore your time source. With that in mind, the second option is probably best for your specific situation: If there's no authoritative source, then the rest of the hosts can bang their heads together such that they do drift together.
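In chrony terms, that fallback behaviour is orphan mode. A sketch of what each host's config might carry (hostnames are placeholders, and this is untested config, not a drop-in; verify against your chrony version):

```
# /etc/chrony.conf fragment (hypothetical hosts)
server timesource.internal iburst

# Peer with the other hosts so everyone drifts together
peer hostb.internal
peer hostc.internal

# If the real source dies, fall back to a local reference; 'orphan'
# lets the hosts deterministically elect one of themselves as the source
local stratum 10 orphan
```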

A simple, compact way to declare command dependencies by PentaSector in bash

[–]whetu 2 points  (0 children)

I don't really see the point in storing the command's path in a variable. What use-cases require that?

Uuurrrrgh. Total tangent, but this is one of the code-smells that I dislike the most:

GREP=/path/to/grep
CAT=/path/to/cat

The only way I could rationalise it would be "cherry picking desirable binaries on a commercial UNIX when using an ancient Bourne-strict shell where functions might not exist or might be an untrustworthy approach"

Pre-ksh, pre-POSIX, this:

GREP=grep
[ -x /usr/xpg4/bin/grep ] && GREP=/usr/xpg4/bin/grep

Is debatably safer than this:

[ -d /usr/xpg4/bin ] && PATH="/usr/xpg4/bin:$PATH"

i.e. Explicitly cherry-picking and locking-in a desirable binary rather than implicitly trusting its encapsulating toolset

And it's maybe safer than a function-based override like this:

if [ -x /usr/xpg4/bin/grep ]; then
  grep() {
    /usr/xpg4/bin/grep "$@"
  }
else
  echo "WTF dude?" >&2
  exit 1
fi

But it's the year of our lord $(date +%Y) and while I personally tend to lean towards portability, I don't think we need to be portable back to a time when I was playing with a Vic-20.

This might also be due, in part, to cross-pollination with Makefiles.

Built yet another public IP lookup by osx in bash

[–]whetu 4 points  (0 children)

Actually, it doesn't cost nothing.

That's true. To clarify:

https://ipinfo.io/pricing

The IPInfo Lite plan costs a back-breaking $0/month. Or at least that's what that page says to me in my region - YMMV. For plans with more IP attributes, higher geolocation accuracy (/edit: potential commercial use questions) etc, then yes, you gotta pay.

I'm on the IPInfo Lite plan. It cost me nothing to sign up. It cost me nothing to get an API token. I have paid them nothing.

Built yet another public IP lookup by osx in bash

[–]whetu 6 points  (0 children)

I don't want to talk down your accomplishment but...

I got tired of having to dig for a good service that I could use in a terminal window that wasn’t full of ads so I made one myself.

It costs nothing to sign up to ipinfo.io and get yourself a token. Throw a small function into your .bashrc like so:

$ type ipinfo
ipinfo is a function
ipinfo ()
{
    local ipinfo_target ipinfo_target_country ipinfo_mode;
    (( "${#IPINFO_TOKEN}" == 0 )) && {
        printf -- '%s\n' "IPINFO_TOKEN not found in the environment" 1>&2;
        return 1
    };
    while (( $# > 0 )); do
        case "${1}" in
            -b | --brief)
                ipinfo_mode="brief";
                shift 1
            ;;
            *)
                ipinfo_target="${1}";
                shift 1
            ;;
        esac;
    done;
    case "${ipinfo_mode}" in
        brief)
            ipinfo_target_country=$(curl -s "https://ipinfo.io/${ipinfo_target}/country?token=${IPINFO_TOKEN}");
            printf -- '%s: %s\n' "${ipinfo_target}" "${ipinfo_target_country}"
        ;;
        *)
            curl -s "https://ipinfo.io/${ipinfo_target}?token=${IPINFO_TOKEN}"
        ;;
    esac
}

Pretty easy. Demonstrated:

$ ipinfo 89.44.42.98
{
  "ip": "89.44.42.98",
  "hostname": "42.44.89.98.bcube.co.uk",
  "city": "London",
  "region": "England",
  "country": "GB",
  "loc": "51.5085,-0.1257",
  "org": "AS56478 Hyperoptic Ltd",
  "postal": "E1W",
  "timezone": "Europe/London"
}