While I agree with some of the comments here along the lines of "Fuck you, I had to MAINTAIN your bullshit Taco Bell system", for any project that will likely run no more than once (prototypes, one-off analyses, etc.) or will never be checked into source control, the power of the shell should not be underestimated.
I had an intern a couple years ago. Nice guy, but he didn't listen when we said "Keep this simple". We had all the data from an A/B test he ran, and we needed to do the analysis. He broke out MapReduce on EMR and all sorts of other complexity. It was a few MB of data!
After his analysis went pretty poorly, I wrote up a shell script in a few hours (sed, awk, xargs, woo) and got us the data we needed. I'd never ask someone to maintain that madness, but I was able to break it down into simple functions, piped into each other, in a single file.
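To give a sense of the shape of it (a made-up sketch, not the actual script; assume a results.csv with user_id,variant,converted columns):

    #!/bin/sh
    # Hypothetical example of the pattern: small functions in one file, piped together.
    # Assumes results.csv has lines like: user_id,variant,converted

    strip_header() { sed 1d; }

    pick_cols() { awk -F, '{ print $2, $3 }'; }   # keep variant and converted flag

    summarize() {
      awk '{ n[$1]++; conv[$1] += $2 }
           END { for (v in n)
                   printf "%s: %d/%d = %.1f%%\n", v, conv[v], n[v], 100*conv[v]/n[v] }'
    }

    strip_header < results.csv | pick_cols | summarize

Each stage does one dumb thing, and the pipeline at the bottom reads top to bottom. Nothing you'd put in source control, but for a few MB of data it's done in an afternoon.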
Using MapReduce for a few MB of data? Jeez. That's a whole new can of worms.
I remember being younger and doing things the way I found "interesting" rather than the way I found practical. Eventually, on a project or two, you end up with basically nothing to show for all that complexity because you were focusing on the wrong things. That's when I learned to put functionality first.