This is quaint, like an article about how bad opiates are, but it's from before fentanyl.
We bash developers are in a really bad spot. If you know one of us, please check on him or her. ChatGPT has enabled our digital hoarding like never before (some of us call it a "scripts folder"-- check for files like build5.sh or *.yml for really bad cases).
I don't know how to write bash. But I'm doing for loops and if statements like never before. I justify it by saying I'm only hurting myself. I've already gone through one hard drive, writing a new index.html over and over and over to the same spot, burning it into the device like the "2" in Sonic the Hedgehog 2 on a 90s CRT.
I lost one of my best friends after I asked him why we don't just use string literals and yaml files for everything. Another when I told him to just put his CSS in a style tag. I can't stop. I can't go back. All I can do is stay away from Makefile people and hope it doesn't get even worse. Don't write shell scripts, kids.
It blows my mind how big a deal argument parsing is in legacy scripting languages (including python).
That snippet is concise, but relies on the user supplying arguments in the order you demand. AFAICS, non-trivial scripts require you to loop over arguments, testing against long and short forms of parameters, and shifting if the parameter expects an argument.
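In bash, the loop-and-shift pattern being described looks roughly like this (the option names and the function are made up for illustration):

```shell
# Hand-rolled long/short option parsing: loop over "$@", test each token
# against the long and short forms, and shift past values that options take.
parse_args() {
  verbose=0
  output=""
  while [ $# -gt 0 ]; do
    case "$1" in
      -v|--verbose) verbose=1 ;;
      -o|--output)  output="$2"; shift ;;  # this option expects a value
      --)           shift; break ;;        # end of options
      -*)           echo "unknown option: $1" >&2; return 2 ;;
      *)            break ;;               # first positional argument
    esac
    shift
  done
}

parse_args --verbose -o report.csv
echo "verbose=$verbose output=$output"
```

It works, but every script grows its own slightly different copy of this boilerplate, which is exactly the complaint.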
In PowerShell, you define your parameter and apply constraints to it, and the shell handles the parsing, validation, and usage output for you. Voila.
Thanks for that. I'm currently migrating a set of bash scripts to PS1 for Windows compatibility, so this is gonna help me! Undeniably a good feature. Sure, there's the case/getopts idiom for parsing args in bash, as you say, but I'd say this is a place where PS excels over it. I totally agree that arg parsing should be standardized and simple like this. A nice CLI is just such a joy to use. Ideally something that includes usage info and (like the PS example, seemingly) optionality.
Syntactically I'd even prefer a single line, like
[Pram(musthave):In(1, 20)/"Foos excessive or inadequate":int/$foo:"Foo count"/"Please supply how many foos you want."]
Sure, Windows doesn't have "legacy" like VMS or Unix etc., but it's been in use by everyday people longer than any other current OS except macOS, which really isn't the same OS it launched as, whereas Windows has maintained backwards compatibility (for better or for worse) for decades. So... legacy? Why not.
Not only is parameter expansion delightfully esoteric, it can replace sed for many applications. Or, as you’ve done, inverse basename.
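A few of the expansions being alluded to, with a made-up path; each runs in-process, with no fork/exec:

```shell
path="/usr/local/bin/tool.sh"

# "inverse basename" and friends, no external process needed:
base="${path##*/}"    # strip everything up to the last slash -> basename
dir="${path%/*}"      # strip the last component -> dirname
stem="${path%.sh}"    # strip a known suffix

# simple substitution standing in for a sed call (bash-specific syntax):
msg="hello world"
sub="${msg/world/bash}"

echo "$base | $dir | $stem | $sub"
```

The `${var/pattern/replacement}` form is a bashism; the prefix/suffix-stripping forms are plain POSIX.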
Since it's a shell built-in, it avoids the overhead of exec. Ever wonder why asdf is slow? Go peek at the source code. This is why. I have a draft PR that fixes it, but there are still some failing tests I need to sort out.
Replacing external program calls (like typical sed usage) with shell built-ins, including bash-specific ones, was also core to neofetch's design when it was made as an alternative to screenfetch. I'll say that reading the programs of dylanaraps, someone worthy of being called a bash wizard, is a nice way to learn how to do complex shell scripting.
We use Makefiles to run our data science pipelines, so my apprentices (yes, we have a data science apprenticeship program, with school leavers doing data science degrees on day release) are all familiar with writing and running Makefiles.
Bash is great generally because Bash devs write for a stable target and don't concentrate on using the latest and greatest bleeding edge features. So you can expect a bash script written today to run on a distro from 15 years ago. This is something few other programming languages have achieved (except perl).
With Python or Rust or whatever, a compiler/toolchain basically only lasts 3 months because some new feature is added and used that makes your rustc compiler from 4 months ago unable to compile it. Or with Python you pretty much have to assume your system Python won't work, so you install an entirely new toolchain: an environment manager (venv, pyenv, conda, etc.) to manage your containerization, and then within that a dependency manager (pip, PyPI, etc.) to actually get the lib versions you need. Only then can you start trying to actually execute the program.
Bash? Just chmod +x and run the script. It'll work no matter where you're running it. And it won't stop working.
You will get people who decry bashisms that pollute the Bourne shell. IMO they often make things safer, so I want them - like [[ ]], which behaves much more predictably than [ ].
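A quick illustration of why: `[` is an ordinary command, so an unquoted variable gets word-split before the test even runs (with a multi-word value, `[ $var = ... ]` dies with "too many arguments"), while `[[ ]]` is shell syntax, skips the splitting, and adds pattern matching. A minimal sketch:

```shell
var="two words"

# Inside [[ ]] the unquoted $var is safe, and == does glob-style matching.
if [[ $var == two* ]]; then
  match=yes
else
  match=no
fi
echo "match=$match"
```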
Shells are so good at starting, stopping, and backgrounding processes, and at piping, redirecting, and substituting output, that they make programming languages feel very clunky imo. Plus you can now almost always expect bash to be installed on whatever machine you land on.
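The backgrounding-and-collecting idiom in one small sketch (the jobs and log paths here are invented):

```shell
# Kick off two independent jobs in the background, then wait for both.
(sleep 0.2; echo "fetch done")  > /tmp/job1.log &
(sleep 0.1; echo "crunch done") > /tmp/job2.log &

wait   # blocks until every background child has finished

cat /tmp/job1.log /tmp/job2.log
```

Doing the equivalent in a general-purpose language usually means importing a process or threading library and writing ten times the code.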
Reading the "Technology Overview" section of the linked "How I built a modern website in 2021" [0] site is terrifying.
It lists something like 15-20 technologies to build the site. I guess most of them are beneficial, but... why are some so proud of how large their dependency tree is?
I relate! Funny thing about the article, though: I read it as the guy using an ironic tone to exaggerate the negatives, when he actually loves it. I don't think he actually feels bad about it!
Yes! I only wish there were a better interface, like an editor integration. Just something where you can go through each check one by one and decide whether to change it or not. I tried writing this in a simple way but gave up...
I don't know which editor you are using, but the shellcheck plugin for VS Code works really well. It underlines warnings and errors, and on hover gives you a short description and a link to a long description. There's also a "fix" button, but it only works for the simplest of cases.
Thanks for the link, man! Helpful. Is nvim, like, command line? I don't like to be in a GUI when I'm coding. I mean, I guess I could hook up my Quake-style iTerm2 dropdown overlay to display a GUI window (but I think that only works on Windows with ConEmu). It's just easier for me to stay in the command line and switch to a GUI to eval or research... or watch videos! Haha
Use the editor's normal support for running a build/compile to run shellcheck with the `--format gcc` flag. Shellcheck will output in the traditional path:line:column:message format, which will be navigable like compiler errors.
This post inspired me to add a similar post to my own blog. Why?
Today I needed to parse some JSON at the command line, do some basic search-and-replaces, and convert it all to CSV. The API call was easy enough with wget, and I was going to use JQ to parse the JSON. But after fiddling with it for 20 minutes and not having much luck, I resorted to my favorites:
Awk, sed, grep, cut. I saved the API call output to a couple of temp files, parsed it with the aforementioned Unix staples in a for loop, and in less time than it would have taken me to properly learn JQ, I had the output and was able to ship it off to my customer.
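A rough sketch of that kind of pipeline; the JSON payload and field names here are invented stand-ins (the real data was presumably messier), but the shape is the classic split-grep-sed-reassemble dance:

```shell
# Fake API response standing in for the real one.
cat > /tmp/resp.json <<'EOF'
[{"id": 1, "name": "foo"}, {"id": 2, "name": "bar"}]
EOF

# Break the blob at commas so each field lands on its own line, keep the
# fields we care about, strip the JSON syntax, and re-pair lines into CSV.
tr ',' '\n' < /tmp/resp.json |
  grep -E '"(id|name)"' |
  sed -E 's/.*"(id|name)"[[:space:]]*:[[:space:]]*"?([^"}]+)"?.*/\2/' |
  paste -d, - - > /tmp/out.csv

cat /tmp/out.csv
```

For the record, the jq spelling of roughly the same thing would be `jq -r '.[] | [.id, .name] | @csv'`, which is shorter once you already know jq - which was exactly the problem.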
Sometimes the best tool for the job is the tool you know, love, and trust.
Try `gron`. Great for taking JSON and converting it into something that can be grep'ed, and then if you need JSON out on the other end, you can then use `gron -u` to reverse the operation.
Saved my sanity more times than I can count, and allows me to avoid the chthonic black magic known as `jq`.
You had some JSON blob, and you actually needed to glean something from it quickly and painlessly. To paraphrase Zawinski et al., you then found you had two problems.
In this age of big data, the debate between parsable and human-readable state has been resolved. JSON is an anti-pattern that requires supplementary tooling to make any sense of it, never mind naive readability.
It's funny that you mention that! I was discussing it with a co-worker to see if they were any better with JQ than I was. They weren't. But then I realized I probably could have used ChatGPT to do it. It really is a great tool for that kind of thing.
set -euo pipefail causes more confusion than good, IMO. Not everything has an exit (or return, more specifically) code of 0 for success, which is insane, but it’s shell.
I’m good with -u on its own, but the others, meh… that’s what shellcheck is for.
EDIT: Read this [0] and weep/revel in its glory, depending on your POV.
I agree that -o pipefail is not great since it's totally fine for pipes to break in some cases (e.g. I've occasionally seen it cause problems with pipelines like `grep | head`).
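That failure mode is easy to reproduce: `head` exits as soon as it has what it needs, the upstream writer eventually gets SIGPIPE when the pipe buffer fills, and pipefail then reports the whole pipeline as failed (typically with status 128+13 = 141):

```shell
set -o pipefail

# head quits after two lines; grep keeps writing and is killed by SIGPIPE.
seq 1000000 | grep 9 | head -n 2 > /dev/null
status=$?

set +o pipefail
echo "pipeline exit status: $status"
```

With `set -e` also in effect, that nonzero status would abort the script, even though nothing actually went wrong.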
Hard disagree about -e though. I get it's not perfect but it has probably saved me dozens of hours of frustration and heartbreak. If you forget to handle errors for a particular line, then the script will just keep going and that's how you accidentally delete your entire home directory. Sure you can rely on shellcheck but for me personally I don't have shellcheck available 100% of the time as I often find myself writing scripts as raw strings (e.g. ssh -c).
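A minimal demonstration of the difference, run in throwaway child shells so nothing real is at stake:

```shell
# Without -e, execution barrels on past the failing command.
out_no_e=$(bash -c 'false; echo continued')

# With -e, bash aborts at `false`, so the echo never runs.
out_with_e=$(bash -c 'set -e; false; echo continued' || true)

echo "without -e: [$out_no_e]"
echo "with    -e: [$out_with_e]"
```

Swap `false` for a `cd` that fails just before an `rm -rf ./*` and the "keeps going" behavior stops being funny.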
I'm a fan of Bash3 Boilerplate (https://github.com/kvz/bash3boilerplate) and use a modified version of it for my shell scripts now. I like the code style and the logging, although I've amended the logging slightly to add a pipe input for long running processes that you want to see the output from before it finishes.
I prefer having a stop-on-unexpected script as it makes errors far more explicit and it's not too onerous to work round the peculiarities of return codes.
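The usual way to work round those return-code peculiarities is to exempt specific commands explicitly, e.g. with `|| true` (the file name here is made up):

```shell
set -euo pipefail

printf 'alpha\nbeta\n' > /tmp/haystack.txt

# grep exits 1 on "no match", which would kill a set -e script even though
# zero matches is a perfectly reasonable answer here. `|| true` exempts it.
count=$(grep -c needle /tmp/haystack.txt || true)
echo "matches: $count"
```

Every other command in the script still stops on unexpected failure; only the commands you've consciously marked get a pass.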
Greg's wiki (https://mywiki.wooledge.org/) is my go to resource for looking up snippets and learning to avoid the footguns - that and shellcheck are the key to "robust" bash scripts.
I really liked that article!
I came to bash this year, and in my company we make things work the same way (as a principle). ChatGPT was a great helper and instructor for me learning bash and brought me up to speed in no time. After my first weeks I was writing bash scripts I hadn't thought were possible in a shell, which makes me want to "feel at home" when I open a shell - that part of the post resonated with me a lot.
I would be happy to find more bash scripters out there, for inspiration and solutions to problems that expand my horizon of what's possible.