Unix is my IDE: script everything
I don't like remembering stuff. That is what computers are for. An ode to automation.
Inspired by this, I finally got around to publishing a couple of small development utility scripts at https://github.com/latk/Unix-is-my-IDE.
Sure, programming on the command line instead of an IDE can be a real chore. There is so much stuff to remember that you only need once in a while. So why do I think the command line is the lazy choice?
Because of composable tooling.
There are a couple of aspects to this, and they are somehow all related:
- a script is executable knowledge
- scripting can save time
- scripting prevents human error and encourages correctness
- scripting encourages you to understand your system
- scripts are reproducible and can be version controlled
- it's just really cool :)
Here's a little story of a build script: Once upon a time, there was a software project that needed to install various dependencies. Because a lot of code had to be compiled on fairly lame hardware, this was a multi-hour affair – and you had to sit through it because something was bound to break, and you had to jump to the keyboard to fix it. Installations and upgrades were such a huge pain that they were avoided, and as a result nobody really had any recent experience with a complete installation from scratch. That was not good.
Then, they started writing a small script that did all the necessary steps. At first it just set up some environment variables and ran the compiler. A simple ./configure && make && make install affair.
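That first cut might have looked something like this – a minimal sketch, where the install prefix and compiler settings are invented for illustration:

```sh
#!/bin/sh
set -e  # abort on the first error instead of plowing on

# Illustrative settings; the real project's values were different.
PREFIX="$HOME/opt/myproject"
export CC=gcc
export CFLAGS="-O2"

./configure --prefix="$PREFIX"
make
make install
```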
Of course something would break, so they edited the script and re-ran it until it worked.
Over time, it grew and became smarter. It could automatically update the dependencies. It could set up the library paths as needed. It could apply patches to the project to work around bugs that could not be fixed upstream. It could handle the subtle differences between operating systems. It even helped you manage your Git credentials securely. In the end it could do quite a lot, and could even run unattended as a nightly build on a Jenkins server.
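Handling operating system differences in shell usually comes down to a case statement on uname. A sketch of the pattern, with invented per-platform settings:

```sh
#!/bin/sh
# Pick the platform's dynamic-linker path variable; the paths are made up.
case "$(uname -s)" in
    Linux)  libpath_var=LD_LIBRARY_PATH ;;
    Darwin) libpath_var=DYLD_LIBRARY_PATH ;;
    *)      echo "unsupported platform: $(uname -s)" >&2; exit 1 ;;
esac
export "$libpath_var=$HOME/opt/myproject/lib"
```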
I am super proud of having written most of that script. Tracking down obscure bugs wasn't fun. I learned way more about dynamic linking and POSIX “compliance” than I ever wanted to. And a 700+ line shell script isn't pretty. But it felt great to know I would never have to solve the same problem twice: once the solution was part of the script, it would always work. My knowledge had become executable knowledge.
So there were all kinds of positive effects emanating from this script, and that really sold me on automation. There are many related ideas and stories about how automation is valuable, or where automation would have prevented a bigger problem. A couple of examples I can think of:
- The whole idea of DevOps and Continuous Deployment requires a high level of automation. In return, you get faster development cycles, higher perceived quality, and better resilience in the face of failures.
- Test Driven Development (TDD) and variants such as Behaviour Driven Development (BDD) emphasize automated tests. In the case of BDD, the test is the specification, so the requirements have been turned into executable knowledge.
- As an anecdote for the above, here's the story of HP's firmware team embracing automation. To be fair, you probably don't have to wait 8 weeks to get feedback on your commits, but the story illustrates vividly how automation can catalyze powerful changes.
Of course automation can cause problems of its own. The Feb 2017 outage of Amazon S3 was caused by a typo while invoking a script. The good news is: the script can be hardened to detect unlikely input, so a similar outage isn't going to happen again. And here's the story of a new hire being fired on the first day of their job because they accidentally deleted the production database with a testing tool. Human error can't be prevented, but it can be made less likely by putting more logic into scripts.
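In shell, such hardening can be as simple as validating arguments and demanding confirmation before doing anything destructive. A sketch, with an invented naming convention for disposable servers:

```sh
#!/bin/sh
# remove-servers.sh: illustrative guard against fat-fingered input
set -eu

[ "$#" -ge 1 ] || { echo "usage: $0 server..." >&2; exit 1; }

for server in "$@"; do
    case "$server" in
        test-*) ;;  # only servers matching the test naming scheme may go
        *) echo "refusing to touch '$server'" >&2; exit 1 ;;
    esac
done

printf 'Really remove: %s? [y/N] ' "$*"
read -r answer
[ "$answer" = y ] || exit 1
```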
Writing scripts is only possible if you have the necessary tools at your disposal. If you can only configure a compiler through a GUI wizard, you probably can't script that as easily as invoking a program with various command line flags.
That is where Unix/Linux really shines: many tools are command line first, are thoroughly configurable and composable, and typically have decent documentation in their man pages. Given such tools, I can easily recombine them into useful scripts that let me do my job more comfortably, without having to remember the detailed procedure. And if I need to know it, I can always peek at the script. Because of this power, Unix is the IDE of my choice.
Usually I write per-project Makefiles to script recurring tasks: running tests, compiling documentation, or zipping up releases. For example, this blog has a Makefile that can invoke the blog engine to build static HTML files, can optionally build draft posts, can spin up a preview server on localhost, and can run a deployment script that rsyncs the files to my public web server.
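That Makefile is roughly along these lines – a simplified sketch, where blog-engine, the target names, and the paths are all stand-ins, not the actual setup:

```make
# Illustrative Makefile for a static blog; names are made up.
# (Recipe lines must be indented with tabs.)
OUT := _site
REMOTE := user@example.com:/var/www/blog

build:            ## render posts to static HTML
	blog-engine render --out $(OUT)

drafts:           ## also render draft posts
	blog-engine render --drafts --out $(OUT)

preview: build    ## serve the site on localhost
	python3 -m http.server --directory $(OUT) 8000

deploy: build     ## rsync the rendered site to the web server
	rsync -az --delete $(OUT)/ $(REMOTE)

.PHONY: build drafts preview deploy
```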
Some scripts are more general, so I put them into a ~/bin directory.
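For those to be found everywhere, ~/bin just has to be on the PATH, e.g. via the shell's startup file:

```sh
# in ~/.profile (or your shell's startup file of choice)
PATH="$HOME/bin:$PATH"
export PATH
```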
When writing these, it's always nice to realize that I can reuse another part of my toolbox.
For example, in one of the scripts I published today, I use the Ack tool not for searching files (what it's intended for), but to get a list of files that should be searched. In another, I figured that git ls-files will give me all tracked files in a repository so that I can open them in an editor.
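On the command line, those two tricks look roughly like this ($EDITOR stands in for whatever editor you use):

```sh
# List the files ack would search, without actually searching them:
ack -f

# Open every file tracked by Git in your editor
# (simple version; filenames with spaces would need more care):
$EDITOR $(git ls-files)
```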
So to summarize: scripting is all kinds of good, and a great programming habit to have. Pervasive scripting is only possible in an environment that is conducive to automation and makes the necessary building blocks easily available: the command line, not a GUI.