
Taco Bell Programming

Posted by timothy
from the how-dare-you-insult-the-code-monkeys dept.
theodp writes "Think outside the box? Nah, think outside the bun. Ted Dziuba argues there's a programming lesson to be learned from observing how Taco Bell manages to pull down $1.9 billion by mixing-and-matching roughly eight ingredients: 'The more I write code and design systems, the more I understand that many times, you can achieve the desired functionality simply with clever reconfigurations of the basic Unix tool set. After all, functionality is an asset, but code is a liability. This is the opposite of a trend of nonsense called DevOps, where system administrators start writing unit tests and other things to help the developers warm up to them — Taco Bell Programming is about developers knowing enough about Ops (and Unix in general) so that they don't overthink things, and arrive at simple, scalable solutions.'"
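The summary's "mix and match roughly eight ingredients" idea can be sketched as a pipeline of four standard tools. This is a minimal illustration, not from the article; the function name and sample log lines are invented.

```shell
# Four standard Unix tools composed into a "top talkers" report:
# cut extracts a field, sort groups it, uniq -c counts, sort -rn ranks.
top_talkers() {
    cut -d ' ' -f 1 |   # first field: the client address
    sort |              # group identical addresses together
    uniq -c |           # count each group
    sort -rn |          # most frequent first
    head -5             # top five
}

printf '%s\n' '1.1.1.1 GET /' '2.2.2.2 GET /' '1.1.1.1 GET /x' | top_talkers
```

No custom parsing or counting code is written; the "functionality" lives entirely in the composition.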
  • by syousef (465911) on Sunday October 24, 2010 @06:06PM (#34006784) Journal

    Seriously, what's going on with the articles here? "My code is like a Taco"? Is that flying because of CmdrTaco's username?

    Nothing new here:
    1) Code reuse. Woopdeedoo. The whole industry has invested heavily in many paradigms for reusing code: The reusable library, module reuse, object reuse etc.
    2) Stringing Unix commands together is news? Did I just take a DeLorean back to 1955? (Well, that's a slight exaggeration; Unix has only been around since the '70s.)

    Finally, who wants to compare their code reuse to a crappy junk food chain? I'd rather think of myself as a professional who earns commensurate pay than a junk food server who needs to be trained to ask "would you like fries with that?".

  • From TFA (Score:4, Interesting)

    by Jaime2 (824950) on Sunday October 24, 2010 @06:10PM (#34006806)
    From the article:

    I made most of a SOAP server using static files and Apache's mod_rewrite. I could have done the whole thing Taco Bell style if I had only manned up and broken out sed, but I pussied out and wrote some Python.

    It seems that only software he knows counts as "Taco Bell ingredients". I'd trust Axis (or any other SOAP library) much more than sed to parse a web service request. Heck, if you discount code that you don't directly maintain, SOAP requires very little code other than the functionality of the service itself. I had a boss like this once. He would let you do anything as long as you used tools he was familiar with, but if you brought in a tool that he didn't know, you had to jump through a thousand extra testing hoops. He stopped doing actual work and got into management in the early '90s, so he pretty much didn't know any modern tools. He once made me do a full regression test on a 50KLOC application to get approval to add an index to a Microsoft SQL Server table.

  • by Tablizer (95088) on Sunday October 24, 2010 @06:13PM (#34006820) Homepage Journal

    I've found the best reuse comes from simple modules, not from complex ones that try to do everything. The one that tries to do everything will still be missing the one feature you need. It's easier to add the features you need to the simple one because it's, well, simpler. With the fancier one you have to work around all the features you don't need to add those that you do need, creating more reading time and more mistakes.

  • Unexpected (Score:4, Interesting)

    by DWMorse (1816016) on Sunday October 24, 2010 @06:18PM (#34006858) Homepage

    Unexpected comparison of trained coders and developers, many with certifications and degrees, to untrained, sub-GED Taco Bell employees... well... frankly, knuckle-draggers.

    Also, I don't care if your code is minimal and profitable, if it gives me a sore stomach as Taco Bell does, I'm opting for something more complex and just... better. Better for me, better for everyone.

    I get the appeal of promoting minimalistic coding styles with food concepts, and it's a refreshing change from the raggedy car analogies... but come on. Taco Bell? Really??

  • Re:Simplicity (Score:4, Interesting)

    by swamp boy (151038) on Sunday October 24, 2010 @06:24PM (#34006886)

    Sounds like your coworkers are busily filling out their resumes with all the latest fad software tools. Like you, I despise such thinking, and it's why I pass on any job opportunity where 'web apps' and 'java' are used in the same description.

  • Generalized experts (Score:1, Interesting)

    by Anonymous Coward on Sunday October 24, 2010 @06:45PM (#34006988)

    This is why I hire generalized experts, not narrow specialists. The wider a developer's knowledge, the more holistic the solutions they can provide. Have an Oracle DBA? Every problem is solved with Oracle. Got a Java-only dev? Everything is solved in Java.

    Large, broad projects are usually best served by a wide range of technologies, understood and built by a team.

  • by Anonymous Coward on Sunday October 24, 2010 @07:02PM (#34007096)

    Software development is a craft with already half-a-century of knowledge, full of characteristics of its own. We should stop trying to adapt examples of other craft/engineering/whatever and focus more on ours. We already have some good ideas of what works and what doesn't, and running to the methodological fad of the week is one that doesn't.

    Frankly, getting ideas on how to program from how Taco Bell cooks its food? I wonder if I would ever bother to ask cooking tips from RMS.

  • by Giant Electronic Bra (1229876) on Sunday October 24, 2010 @07:07PM (#34007124)

    Once, about 20 years ago, I worked for a company whose line of business generated a VERY large amount of data which for legal reasons had to be carefully reduced, archived, etc. There were various clusters of VMS machines which captured data from different processes to disk, from where it was processed and shipped around. There were also some of the 'new fangled' Unix machines that needed to integrate into this process. The main trick was always constantly managing disk space. Any single disk in the place would probably have 2-10x its capacity worth of data moving on and off it in any given day. It was thus VITAL to constantly monitor disk usage in pretty much real time.

    On VMS the sysops had developed a system to manage all this data which weighed in at 20-30k lines of code. This stuff generated reports, went through different drives and figured out what was going in where, compared it to data from earlier runs, created deltas, etc. It was a fairly slick system, but really all it did was iterate through directories, total up file sizes, and write stuff to a couple report files, and send an email if a disk was filling up too fast.

    So one day my boss asks me to write basically the same program for the Unix cluster. I had a reputation as the guy who could figure out weird stuff, and I'd even played a small amount with Unix systems before. So I whipped out the printed man pages and started reading. Now I figured I'd have to write a whole bunch of code; after all, I was duplicating an application that had like 30k lines of code in it, not gigantic but substantial. Pretty soon, though, I learned that every command line app in Unix could feed into the other ones with a pipe or a temp file, and that those apps produced ALL the data that I wanted, in pretty much the format that I needed. All that I really had to do was glue it together properly. Pretty soon (thank God it starts with A) I found awk, and then sed. 3 days after that I had 2 awk scripts, a shell script that ran a few things through sed, a cron job, and a few other bits. It was maybe 100 lines of code, total. It did MORE than the old app. It was easy to maintain and customize. It saved a LOT of time and money.
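    The kind of glue described above can be sketched in a few lines. This is a hedged reconstruction, not the poster's actual scripts; the 90% threshold, the `ops` mail recipient, and the cron schedule are all invented for illustration.

    ```shell
    # Reads `df -P` output and prints any filesystem over a 90% threshold.
    # df -P guarantees a stable, POSIX-format layout for awk to parse.
    over_threshold() {
        awk 'NR > 1 {
            use = $5            # capacity column, e.g. "95%"
            sub(/%/, "", use)   # strip the percent sign
            if (use + 0 > 90)   # compare numerically
                print $6, use "%"
        }'
    }

    df -P | over_threshold

    # Driven from cron, with mail(1) carrying the alert:
    # */5 * * * * df -P | over_threshold | mail -s "disk space warning" ops
    ```

    The iterate-total-report-alert loop that took 30k lines on VMS reduces to one pipeline plus a crontab entry.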

    There's PLENTY to recommend the KISS principle in software design. Not every problem can be solved with a bit of shell coding of course, but it is always worth remembering that those tools are tried and true and can be deployed quickly and cheaply. Often they beat the pants off fancier approaches.

    One other thing to remember from that project. My boss was the one that wrote the 30k LoC monstrosity. The week after I showed her the new Unix version, I got downsized out the door. People HATE it when you show them up...

  • by sakdoctor (1087155) on Sunday October 24, 2010 @07:27PM (#34007234) Homepage

    Had a friend confuse bulbs of garlic with cloves of garlic. Niiice.

  • by 19thNervousBreakdown (768619) <davec-slashdot&lepertheory,net> on Sunday October 24, 2010 @07:53PM (#34007386) Homepage

    How, exactly, are they brittle? I've heard this term used a number of times, but never actually seen a prediction of brittleness be an accurate predictor of any amount of bugs, maintenance issues, or really any negative outcome. As far as I can tell, it's just a weasel word to be used when you don't like something for aesthetic reasons or understand it fully.

    So, prove me wrong. Explain exactly what's bad about using code that's been more heavily used and tested in production systems than just about anything else for more than 20 years.

  • Re:Simplicity (Score:4, Interesting)

    by SimonInOz (579741) on Sunday October 24, 2010 @08:20PM (#34007544)

    “Debugging is twice as hard as writing code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.” - Brian Kernighan

  • by Blakey Rat (99501) on Sunday October 24, 2010 @09:17PM (#34007942)

    And why one well-paid Unix admin can outperform 10 MCSEs struggling to keep their servers afloat.

    Speaking of weasel words, why are you setting up an unfair comparison? What if the Windows admins are "well-paid" as well? What happens if the Windows admin is good but doesn't have an MCSE?

    What universe are you living in where there are no good Windows admins and no bad Unix admins? I can tell you from experience: there are good and bad of both. (The worst, however, are Lotus Notes admins. Ugh.)

    Microsoft has gotten better lately wrt tools, but overall Windows servers still are light years behind a *nix server with solid hardware.

    Ok I'll bite: how? In what way are they behind? What tools do they lack?

    Please make sure your argument applies in 2010, i.e. don't give me examples from Windows NT 4 like so many *nix fans usually do.

  • by Anonymous Coward on Sunday October 24, 2010 @09:26PM (#34007976)

    Uhh, what? I can do everything you just mentioned in shell, using standard UNIX utilities and cron for retries. You do not need a magical distributed application to simply crawl websites. You can even have it crawl from multiple boxes if you really want, I can think of very simple UNIX solutions to that too. Web crawling isn't that magical.
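    The parent's claim can be sketched in a couple of lines. The URL list, parallelism, and wget flags below are invented for illustration, and `echo` stands in for `wget` so the sketch runs without touching the network.

    ```shell
    # A shell-only crawler: xargs fans the URL list out to parallel wget
    # processes; cron (not shown) reruns it to pick up failures.
    # The real fetch would be a single pipeline, e.g.:
    #   xargs -n 1 -P 8 wget -q --tries=3 -P pages/ < urls.txt

    # Dry run, with echo standing in for wget:
    printf '%s\n' http://example.com/a http://example.com/b |
        xargs -n 1 echo would fetch
    ```

    Distributing it across boxes is just a matter of splitting the URL list, e.g. with split(1), and shipping each piece to a different machine.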

  • by MightyMartian (840721) on Sunday October 24, 2010 @10:07PM (#34008148) Journal

    And for real age, it's something that's been known since Unix went into wide-scale usage in the 1970s. The original Bourne shell with the toolset of the time, while obviously limited in some respects, was pretty damned powerful. Pop in some of the newer updates like bash and you have a helluva environment.

  • Re:8 keywords? (Score:3, Interesting)

    by Suzuran (163234) on Sunday October 24, 2010 @10:13PM (#34008188)

    I think programming on an old machine should be required for any sort of programming course. It would teach people to conserve resources and think about how the machine works.
    He who cannot program in 64K cannot program in more.

  • Re:From TFA (Score:3, Interesting)

    by metamatic (202216) on Sunday October 24, 2010 @10:32PM (#34008296) Homepage Journal

    Heck, if you discount code that you don't directly maintain, SOAP requires very little code other than the functionality of the service itself.

    However, any time you change the API--even to make a change that no client should notice--you have to regenerate the glue code from the WSDL and recompile all your client programs. Which is why these days, I build REST-based web services.

  • by flnca (1022891) on Monday October 25, 2010 @02:15AM (#34009180) Journal

    what another person might spend a week doing in C (which is spectacularly unsuited for such tasks anyway).

    A skilled C programmer also needs less than 1 hour for something like that. The standard C library has a lot of text processing functions (like sscanf()), plus it has a qsort(). Ever wonder why the C I/O library is suitable for managing database files? All the field functions in fscanf()/fprintf() etc. are suitable for database management.

    Also, C is still one of the prime choice languages for writing compilers, which do a lot of text processing.

  • by dkf (304284) <> on Monday October 25, 2010 @04:16AM (#34009650) Homepage

    something that uses a bunch of smaller tools might have more brittleness than something that is entirely contained in code controlled by the maintainer

    Not necessarily. The unix tools are very well specified by comparison with most libraries used in nearly any language you care to name (they're in the POSIX spec) so there's a substantial amount that you can rely on, and rely on long-term. They can be composed poorly, of course, but bad programmers can write bad programs in anything so it's (close to) a null argument.

    Brittleness in shell scripts typically refers to assumptions of particular filesystem layouts or that nobody will be silly enough to put odd characters in filenames (if only that were true!) but piped IO is very stable and well tested.
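    The filename point above is easy to demonstrate. The paths below are temporary and invented: a name containing a space breaks naive word-splitting, while a NUL-delimited pipeline stays correct.

    ```shell
    # Naive splitting vs. NUL-delimited handling of an awkward filename.
    dir=$(mktemp -d)
    touch "$dir/safe.txt" "$dir/has space.txt"

    # Naive: word-splitting sees three "files" where there are two
    naive=$(ls "$dir" | wc -w | tr -d ' ')

    # Robust: -print0 / -0 keep each filename intact as one argument
    robust=$(find "$dir" -type f -print0 | xargs -0 -n 1 echo | wc -l | tr -d ' ')

    echo "naive count: $naive, robust count: $robust"
    rm -r "$dir"
    ```

    This is exactly the brittleness dkf describes: the pipeline itself is rock-solid, and the failures come from assumptions about what filenames look like.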

  • by ShakaUVM (157947) on Monday October 25, 2010 @07:48AM (#34010492) Homepage Journal

    Show me a new grad who is good at programming and I'll bet they didn't learn programming at university. A lot of new grads *think* they are good at programming. But apart from a handful here and there that cut their teeth on other projects, a new grad writing good code out of the gate is virtually unheard of. Hell, most people with 5 years working experience are crap.

    Most "real" CS people have been playing around with writing code since a young age. I'd written motion prediction code for a robot, an Axis and Allies simulator, a full AI suite, and a bunch of other stuff before I started college, but I think the university classes really polished my skills. Finite math taught me how to think about structuring loops so they always run correctly, my Theory class let me think about FSMs, CFGs, and Turing machines in a more logical manner, my programming languages and compilers classes really made me understand what was really happening when I hit cc (and also helped explain some of the bizarre compiler errors I'd seen over the years when my own compiler did the same thing), and most importantly, the UCSD CS TAs were absolute Nazis about proper coding technique. Not arbitrarily so, but if you've ever seen some code that made you want to punch someone, that's the sort of thing they knock 25% of your grade off for. Honestly, it really helped.

    You're right, though - Computer Science is a very weird mishmash of different stuff all jumbled together.

    >>And even given the complete failure to actually learn anything that could be called science in their computer science degree 95% of the graduating class hasn't written more than 10K lines of code in their entire life.

    Mmm, just looking at my class assignments (that I saved) across 16 classes (quarters, not semesters), I wc at 20k lines of code. This doesn't count stuff that I wrote for fun, for work, or stuff that I deleted because it doesn't matter any more. The actual number should be several times that, that I wrote for school.

    IMO, if you're not writing software as a CS student, you're doing something wrong.

  • by bigrockpeltr (1752472) on Monday October 25, 2010 @09:52AM (#34011408)
    What most people fail to realize is that tail, cut, sort, and uniq are most likely written in C, so why reinvent the wheel? That's the only reason to use shell scripting when the tools already exist; good luck implementing those commands from scratch purely in bash. Technically, in C you can just write

    system("tail -n 100 /var/log/apache2/access_log | cut -f1 -d\" \" | sort | uniq");

    and achieve the same result in 1 line as well :P
