
The Growth of Command Line Options, 1979-Present (danluu.com)

Dan Luu, writing in a blog post: The sleight of hand that's happening when someone says that we can keep software simple and compatible by making everything handle text is the pretense that text data doesn't have a structure that needs to be parsed. In some cases, we can just think of everything as a single space separated line, or maybe a table with some row and column separators that we specify (with some behavior that isn't consistent across tools, of course). That adds some hassle when it works, and then there are the cases where serializing data to a flat text format adds considerable complexity since the structure of data means that simple flattening requires significant parsing work to re-ingest the data in a meaningful way. Another reason commands now have more options is that people have added convenience flags for functionality that could have been done by cobbling together a series of commands. These go all the way back to v7 unix, where ls has an option to reverse the sort order (which could have been done by passing the output to tac).

[...] Over time, more convenience options have been added. For example, to pick a command that originally had zero options, mv can move and create a backup (three options; two are different ways to specify a backup, one of which takes an argument and the other of which takes zero explicit arguments and reads an implicit argument from the VERSION_CONTROL environment variable; one option allows overriding the default backup suffix). mv now also has options to never overwrite and to only overwrite if the file is newer. mkdir is another program that used to have no options where, excluding security things for SELinux or SMACK as well as help and version options, the added options are convenience flags: setting the permissions of the new directory and making parent directories if they don't exist. If we look at tail, which originally had one option (-number, telling tail where to start), it's added both formatting and convenience options. For formatting, it has -z, which makes the line delimiter null instead of a newline. Some examples of convenience options are -f to print when there are new changes, -s to set the sleep interval between checking for -f changes, and --retry to retry if the file isn't accessible.
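To make the excerpt concrete, these are the mv and tail flags it describes (GNU coreutils; file names invented):

mv --backup=numbered new.conf app.conf             # keep app.conf.~1~, app.conf.~2~, ...
VERSION_CONTROL=numbered mv -b new.conf app.conf   # same backup style, via the implicit env variable
mv -b -S .orig new.conf app.conf                   # override the default backup suffix
tail -f -s 5 --retry /var/log/app.log              # follow, poll every 5 seconds, retry if inaccessible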

  • by argStyopa ( 232550 ) on Thursday March 05, 2020 @12:50PM (#59799804) Journal

    Technical articles on slashdot?
    I come here for politics, memes, and celebrity news, not this shit.
    Jesus, if I wanted technical gobbledygook I'd call the damned Geek Squad.

    • Re: (Score:3, Funny)

      Seriously, I come here to get my daily dose of Coronavirus fear mongering and global warming alarmism. Crap like this has no place on Slashdot!
Just hold on. I'm sure there are many of us who have used, coded for/on, and administered Unix and Linux systems for many, many years (30+ in my case) who would like to learn about this "command line option" thing. :-)

      • I had a printer tech come in to fix something with the network board on our printer.
The poor little devil had never even HEARD of the ability to copy files to a printer from the command line,
i.e. COPY /A C:\directory.txt \\Server\Canon1

        He looked at me like I stepped out of the Matrix.

        He was also impressed with ipconfig /all > c:\Documents\ipconfig_info.txt .

        I felt like Merlin - imbued with magical powers ...and inconceivably old.

This is not a technical article so much as an anti-technical article that hates on choice and considers any feature the lowest common denominator of users doesn't use to be harmful.

      They don't seem to understand; the people they're trying to serve already have GUI interfaces to these command-line programs.

  • by Anonymous Coward on Thursday March 05, 2020 @01:08PM (#59799892)

    ... you're not done coding yet.

    I get why mplayer(1) needs so many switches. But the vast majority of command line utilities? I'd rather that most of the additional functionality be implemented in additional filters instead of making do-one-thing-and-do-it-well utilities become so bloated (and, probably, brittle). I'm waiting for the slew of new command line switches for "clear(1)"---you just know that someone's going to think that three are simply not enough.

    Prime bugaboo? Why do so many utilities implement their own sort functions when, in most cases, the perfectly good sort(1) utility would suffice? It's reminiscent of the MS-DOS days when every program had to have a complex, major-pain-in-the-behind-to-configure print function that worked like no other program's print function.
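    To make the point concrete, composition with sort(1) instead of a private sort might look like (a sketch):

    ps aux | sort -rn -k3 | head -5    # five most CPU-hungry processes; ps itself does no sorting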

    • by Z00L00K ( 682162 )

Just saying: if you want command line options, try OpenVMS. The DIRECTORY command alone has enough options that you'll want a snack before you've tried them all.

because it can be significantly faster to sort on a native data structure than to dump it as text to be sorted separately (which also requires the user to eyeball the separator and field offset and fiddle with the command line options).

that's exactly what this article is implying, i think. it's not "just text" being sent to sort; it's text which you then have to, as a user, learn to reinterpret the structure of because sort(1) doesn't know it, and shove that in as a bunch of command-line options. if the output o
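      to make that concrete (a sketch): before sort(1) can help, the user has to re-learn the layout the producing tool already knew, e.g.

      ls -l | sort -k5,5 -n    # sort by size: you had to eyeball that field 5 is the size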

      • by Aighearach ( 97333 ) on Thursday March 05, 2020 @02:49PM (#59800312)

        this is why people end up using perl or python for tasks instead of shell scripting

        No. People use scripting languages instead of shell scripts because of the awful syntax of shell scripting, compared to modern scripting languages that use normal types of syntax and grammar.

        • No. People use scripting languages instead of shell scripts because of the awful syntax of shell scripting

          Everyone who has ever spent an hour trying to fix the quotes in a 10 line bash script should upvote your post.
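          The classic trap (a minimal sketch):

          f="My File.txt"
          rm $f      # unquoted: word-splits into "My" and "File.txt"
          rm "$f"    # quoted: removes the intended file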

        • imho part of the awfulness of the syntax is because the language is only intended to shuttle text between one utility and the other, and then accreted a bit more cruft from special variant cases of that, like the backticks, and so on. it was never designed to do what it winds up doing, so of course it sucks.

          basically if you don't have one observation per row, formatted as either fixed-width or delimited fields, the unix toolchain quickly breaks down even for data analysis.
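          e.g. the happy path, one observation per row with a clean delimiter (file name invented):

          awk -F'\t' '{ sum += $3 } END { print sum / NR }' obs.tsv    # mean of column 3

          anything nested or quoted, and this style falls apart.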

      • by mvdwege ( 243851 )

Or, as I always put it: plain text output is great, because you can parse it. And it sucks, because you have to parse it.

I understand why the original Unix went for it; the flexibility was exactly the Unix paradigm: solve the easy 90% first, and hand over the tools to the admin to cobble together a solution for the remaining 10%. That makes for easy, quickly developed tools. And a lot of work, predictably idiosyncratic to each individual admin, to fix all possible problems.

        • "predictably idiosyncratic to each individual admin"

          meaning: it's a job security system for neckbeards.

    • If you don't like the command-line option, don't use it. Problem solved. Having convenience options doesn't mean your code is against the "unix way." The unix way isn't minimalist (that would be assembly), it's functional. Divide your programs along lines that make sense, so they can be combined to do greater things.
      • A lot of people "forget" [never knew] that the "do one thing" stuff was only for the subset of command line programs that are piped together as filters. It was also always the "unix way" to have big monolithic programs with lots of features for other use cases.

        • I've long suspected that was the intent as well. As advice, it works great when it's strictly dealing with small, text-based utilities that you can pipe together in near infinite combinations. But how is "do one thing and do it well" supposed to apply to something like, say, a digital paint program, which is monstrously complex and has hundreds of features, albeit all related to digital painting?

          These days, I even hear people applying that principle to Linux internals in the eternal systemd debate, conven

          • by mvdwege ( 243851 )

            "Do one thing well", especially in the case of simple base utilities, is merely a corollary of the real Unix Way: fix the easy problems first, even if the work is only 90% complete. And make sure the admin has the tools to implement the remaining 10% as they see fit. Remember that Unix was a stripping down of Multics, one of those systems typical of that generation, that tried to do everything. And as an example, see the old Unix Haters Handbook, that was written by folks bitter that those comprehensive sys

            • As for systemd: that's mostly Lennart's personality of someone who just isn't happy with 90% solutions, and wanting to solve the problem 100%. Since that means he gets to impose his tastes on the end result, that tends to rub people wrong, especially if you're used to "Well, I can just add another element to the pipeline / another function to the startup script".

The problem is that there's no such thing as a 100% solution. If there were, there'd be one Linux desktop, one paint program, one word processor, and one email program. 100% solutions are like One Ring solutions--simple and powerful in theory, but full of quirky caveats in practice.

              • by mvdwege ( 243851 )

                Well, yes, I thought that was obvious in what I said.

That being said, the solution here seems to satisfy the most people, including me.

And sometimes the attempt at a 100% solution, for all its quirks, just wins. There is only one GIMP, LibreOffice does not have a lot of competition either, and how long did it take for LLVM to produce decent competition for GCC (and on anything but C-based languages, it's still behind)?

                Sometimes the problem space is complex enough that it takes a lot of resources to get a soluti

    • Adding filters when the output isn't text just magnifies complexity by hiding it in the closet and adding a system for closet management.

      • Adding filters when the output isn't text just magnifies complexity

        indeed. Try using "sort" to reorder a JSON file.
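        A structure-aware filter is what's needed instead; jq, for instance (assuming users.json holds an array of objects):

        jq 'sort_by(.name)' users.json    # reorder array elements by a field
        jq -S '.' users.json              # sort object keys instead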

    • by guruevi ( 827432 )

I have some of those old scripts running. I actually just cleaned one up; it was just pipe after pipe after pipe - cat - echo - grep - awk - tr - sort - uniq - awk - echo - sed - and it became unreadable. The thing I did find out troubleshooting those scripts is that the Unix tools I've relied on have morphed and changed a ton over the last decade.
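      For flavor, the sort of accreted pipeline described might have looked like (input and fields invented):

      cat app.log | grep ERROR | awk '{print $4}' | tr -d '[]' | sort | uniq -c | sort -rn | sed 's/^ *//'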

      Not sure why but the GPL2 and GPL3 versions of GNU CoreUtils sort differently and you get things sorted in a different order on different machines, everything else

  • by jellomizer ( 103300 ) on Thursday March 05, 2020 @01:10PM (#59799904)

    I remember earlier on the Debate between VMS Command Lines and Unix Command Lines.

VMS commands, while wordier, also had fewer options, since there were more words that could be used and programmed into the OS, while Unix back in the old days liked to keep commands 2 or 3 characters long with more options.

While the Unix command line method seems to have become victorious, we have a new contender with smart speakers, which in essence use a more natural-language type of command. But it is still a command line interface: you state a sentence, then there is an output after that sentence.

    • by hawk ( 1151 ) <hawk@eyry.org> on Thursday March 05, 2020 @01:45PM (#59800076) Journal

Several years ago, Dennis Ritchie assured me that Ken Thompson would agree with my insistence that the five letter names of "mkdir" and "rmdir" were bugs, and that they should have been two letter commands . . .

      hawk

For Unix conventions I would agree with that.
An mk command should encompass all filesystem makes: make file (like touch), make folder/directory.
An rm command should encompass all filesystem removals.

I guess we could go crazy and follow the Mac OS method: rm /dev/cdrom would unmount the CD-ROM and eject it, and mk /dev/cdrom would mount it.

        We can get away with a lot of stuff with a single command and deal with options and handle circumstances differently.
        However we get to a point where we are trying too hard and ma

      • Funny that, someone at Microsoft must've agreed with you long ago. MS-DOS aliased MKDIR and RMDIR to MD and RD, respectively, a feature that still lives on in Windows.

        • by hawk ( 1151 )

          that was common in unix user configurations *long* before MS got the bright idea to adopt unix-like syntax (or before it had directories, for that matter . . .)

  • Trying to even conceive of a comparison between CP/M and Windows 10, or UNIX on a PDP-11 vs GNOME 3.
    • Re: (Score:2, Informative)

      by Anonymous Coward
      Well GNOME 3 is a lot slower.
    • by Z00L00K ( 682162 )

      Modern operating systems - you'd use a CRT with both.

Anyone here that has been programming with the ASR33 terminal?

  • Computers work using a simple set of opcode instructions. The most elegant approach is to just run a hex editor on the console: I can solve any problem by entering a sequence of these basic opcodes. By composing these fundamental building blocks together, I can build up any solution I need.

    The best part is that none of the opcodes have any command line options that I have to remember.

    • The best part is that none of the opcodes have any command line options that I have to remember.

      Not exactly true, depending on architecture. The opcodes take arguments that work a lot like command line options, and can be exotic. They are definitely more limited in number, but they're very, very picky about them.

Random opcode [arm.com] I have lying around today.

Still, there's a reason all the options exist, and it falls into the "keep it simple, but not too simple" bucket.

Still, there's a reason all the options exist, and it falls into the "keep it simple, but not too simple" bucket.

        On a RISC system this is even true!!

    • OK now look up the word "elegant."

      Hint: Spock is not elegant.

  • Piping commands (Score:2, Informative)

    by Dan East ( 318230 )

    Another reason commands now have more options is that people have added convenience flags for functionality that could have been done by cobbling together a series of commands. These go all the way back to v7 unix, where ls has an option to reverse the sort order (which could have been done by passing the output to tac).

    What about performance? Especially back in the day on much slower machines. From a performance perspective (both memory and speed), having the reverse sort functionality built directly into ls is far superior.

    Also, how the heck is tac supposed to sort an ls -l, since the file name is not at the beginning of each line? Oh, we can pipe the ls -l output through awk, and re-order the columns so the filename is first, pipe that through tac, then pipe that output through another awk that puts the column orders b

    • Re:Piping commands (Score:5, Informative)

      by suutar ( 1860506 ) on Thursday March 05, 2020 @02:05PM (#59800154)

      tac is just "reverse standard input". It doesn't actually sort; ls still needs to do that. But you could use tac instead of the -r option.
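      i.e. with GNU coreutils these produce the same listing (a minimal illustration):

      ls -1r        # built-in reverse flag
      ls -1 | tac   # the same order by composing with a filter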

    • by nodan ( 1172027 )

      Putting the sort into "ls" makes the binary (a little) bigger. Having just one sort tool probably saves space, especially when memory is precious like it was 40+ years ago. That's the fundamental concept in *x: have small specialized tools that do one thing, but do it well.

tac doesn't sort, it just reverses the order of lines.

    • by flink ( 18449 )

      Also, how the heck is tac supposed to sort an ls -l, since the file name is not at the beginning of each line?

      Assuming you want to sort ls by file name:
      ls -l | sort -k 9 | tac
      - or -
      ls -1 | sort | tac | xargs ls -ld

  • Besides convenience, command line tools got more options because they are much more powerful these days. Just compare rcs vs. cvs vs. svn vs. git.

    No need to know all options - most of the time, you use a small subset anyway and you can configure aliases for your shell.
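    For instance (a sketch; the alias names are invented):

    alias ll='ls -lahtr'                                   # shell-level shorthand
    git config --global alias.lg 'log --oneline --graph'  # git-level alias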

  • by gosand ( 234100 ) on Thursday March 05, 2020 @01:41PM (#59800034)

    I guess over time the command line options have grown. Change is hard, and there are probably easier ways to do things. But my fingers just know "ls -lahtr" and "find . | grep -i this | grep -i that | grep -iv something "

I know from using imagemagick that there are LOTS of options, but I would generally only use "convert -geometry wwxhh $file.jpg $file_wwxhh.jpg". Sometimes it's the UI, like using HandBrake to figure out what options I wanted for converting videos, then taking the generated command line and just putting it in a script. Plenty of other command line tools in the toolbox as well.

    At work I wanted to parse out some info, and tried it in Notepad++ using search/replace. I was getting frustrated, and then remembered I had gitbash installed. Boom, got what I wanted right away using pipes with cat, sort, uniq, and sed.
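    The kind of one-liner meant here might look like (file name and pattern invented):

    cat report.txt | sed -n 's/.*user=\([^ ]*\).*/\1/p' | sort | uniq -c | sort -rn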

    Bottom line... command line for the win!

I like Notepad++, but yeah, if you want to parse, sort, search, and replace across several hundred megabytes or tens of thousands of files, then Windows GUIs are a joke; Linux is a dream to use, of course.
      • by gosand ( 234100 )

I like Notepad++, but yeah, if you want to parse, sort, search, and replace across several hundred megabytes or tens of thousands of files, then Windows GUIs are a joke; Linux is a dream to use, of course.

I once had a team who was trying to create test data - a csv of 5 million rows - to reproduce a client issue. They were using Excel (because csv is Excel, duh)!
It was spinning for 10 minutes just trying to open the file, and then they were going to search and replace values for new data.

I used a shell script to generate 1000 rows of data, then with cat and sed I manipulated it into a 1MM-row file with unique values. That was my template for creating their 5MM-row file, and lots of other test data file
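        A sketch of that multiply-a-seed approach (file names and field layout invented):

        seq 1 1000 | awk '{ printf "user%d,item%d,%d\n", $1, $1 % 37, $1 }' > seed.csv
        # stamp out 1000 uniquely-prefixed copies: 1,000,000 distinct rows
        for i in $(seq 1 1000); do sed "s/^/batch$i-/" seed.csv; done > million.csv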

  • by hey! ( 33014 ) on Thursday March 05, 2020 @01:46PM (#59800078) Homepage Journal

I actually worked briefly with Unix v7 in the early 80s. The philosophy of "each program does one thing" was what was known back in the day as the "software tools movement". The idea was that while it might occasionally be useful to have a Swiss army knife (ours was called "awk"), the best tools for serious work do just one thing well.

The reason this was useful wasn't because the philosophy was *elegant*. It simply fit the kind of data processing tasks we largely did back then: take a file full of data (or more generally a stream) and transform it into another file full of data (or potentially some summary number like a line count). Putting a reverse option on "sort" was just common sense in an era when computer memories were measured in hundreds of KB and disk packs in tens of MB. Workloads were tiny and tasks simple by today's standards. Almost nobody was writing programs that persisted in memory indefinitely and responded to events, which is the norm today and which doesn't fit the STM paradigm.

A lot of the growth in command line options has to do with the inherent limitations of the data we were handling: text or streams of crude binary data. You have to impose order on those streams, and for that you need conventions. Fixed width or delimited? Pipe delimited or comma? How do you embed the delimiters? How do you say "null"? If everybody chose the same conventions and followed them strictly, a lot of command line options would simply not be needed.
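    Each unresolved convention surfaces as a flag somewhere; for instance:

    sort -t: -k3 -n /etc/passwd    # declare the delimiter (:) and the field (3), numeric
    cut -d, -f2 data.csv           # declare the delimiter (,) -- quoted commas still break it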

    I don't know much about it, but Microsoft's Powershell seems to address some of these limitations by passing *objects* between programs. That immediately eliminates the need for command line options that deal with selecting lexical or syntactic conventions.

    • I don't know much about it, but Microsoft's Powershell seems to address some of these limitations by passing *objects* between programs.

      PowerShell object output works well for programs that support it, but adding support for outputting objects to your own program is not intuitive. The advantage of text is that outputting text is dead simple and easy to work with. Every program does it, even if the author wasn't thinking about it.

    • I still try to subscribe to this philosophy. Has served me well over the years.
Yeah, the Achilles heel of the Unix Philosophy turned out to be interdependency.

One tool doing one thing well was a great philosophy... until you needed 100 tools to do a task. Then you had to ensure all 100 tools were configured properly, and things got really fragile. Oops, a tool accidentally introduced a slight variation in its output. Now your Rube Goldberg contraption fails at tool #38. Or tool #78 stopped development and is no longer being maintained.

      I'm using one of those "applications" right now

      • A tool should continue to work as it has even if it's not being actively developed. A tool which has different behavior in a new version when using the same parameters or settings as the old version is being maintained by idiots.
        • by guruevi ( 827432 )

Welcome to GNU CoreUtils. Ever since going to GPLv3 and getting woke around internationalization, the whole thing has become useless across OSes. I have a Red Hat Linux 5 box (not RHEL, RHL) and an old Solaris SPARCstation box (they are used to drive some old scientific instruments); they get the same outputs from sort. Upgrade Ubuntu between LTS versions and your sort outputs change without any other (e.g. locale) changes.
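          Locale-sensitive collation is the usual culprit for exactly this; pinning the locale makes sort byte-order stable across machines:

          printf 'a\nB\n' | sort             # order depends on LC_COLLATE
          printf 'a\nB\n' | LC_ALL=C sort    # always B before a (byte order)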

    • I don't know much about it, but Microsoft's Powershell seems to address some of these limitations by passing *objects* between programs. That immediately eliminates the need for command line options that deal with selecting lexical or syntactic conventions.

      There is also nushell [nushell.sh] that takes this approach. It looks interesting.

  • I would absolutely be lost with tail -f. Once I was working on a VMS machine, and I pressured the "VMS expert" into writing it for me. Cut about 50% of the time out of debugging.
    • I would absolutely be lost with tail -f.

      I'd be lost without it!

      (Not really, it is only about 5 lines of Perl or Ruby to implement it.)
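      A rough shell version of the same idea (the parent says Perl or Ruby; the logic is the same, sketched here assuming $1 is the file to follow):

      n=$(wc -l < "$1")                   # lines printed so far
      while sleep 1; do
        t=$(wc -l < "$1")                 # recount each second
        [ "$t" -gt "$n" ] && tail -n +"$((n + 1))" "$1" && n=$t
      done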

There doesn't seem to be any point or theory being discussed. While what is presented may all be true, what is the point of the post? What does the poster want to get out of the Slashdotter conversation? Other than that it will create directionless babble (and that never happens on /. as it is).
  • by urusan ( 1755332 ) on Thursday March 05, 2020 @02:54PM (#59800338)

    The summary makes it sound like Dan Luu's blog post is railing against this trend, but it's actually a far more balanced view of the situation. If anything, he's arguing against McIlroy's railing against this trend.

    He even ends with "To be clear, I'm not saying that I or anyone else could have done better with the knowledge available in the 70s in terms of making a system that was practically useful at the time that would be elegant today. It's easy to look back and find issues with the benefit of hindsight. What I disagree with are comments from Unix mavens speaking today; comments like McIlroy's, which imply that we just forgot or don't understand the value of simplicity, or Ken Thompson saying that C is as safe a language as any and if we don't want bugs we should just write bug-free code. These kinds of comments imply that there's not much to learn from hindsight; in the 70s, we were building systems as effectively as anyone can today; five decades of collective experience, tens of millions of person-years, have taught us nothing; if we just go back to building systems like the original Unix mavens did, all will be well. I respectfully disagree."

    • Actually, this is a perfect example of what happens when the nerd world collides with the real world. Even the popularization of Linux has its downside.
      • by urusan ( 1755332 )

        Most of the growth (half of it) happened by 1996, long before Linux became popular. This trend has been going on for a long time, and is very much a trend of the nerd world.

  • "cat -v Considered Harmful"

Someone should get the old Usenix proceedings online.
I seem to recall that Rob Pike (hey Rob, are you out there?) did a somewhat controversial talk one year about the proliferation of command line args, particularly in Berkeley Unix, called "cat -v considered harmful".
    The discussion has been going on for a long time.

  • Taking an extreme example, gcc. So many options. It could be simplified to a degree by splitting it into gcc-x86, gcc-arm, etc. After all, at the moment it has a modal command line, which can't be a good thing. And then there's rsync. If I only had the time to read up on all its options...

  • by codeButcher ( 223668 ) on Friday March 06, 2020 @05:33AM (#59802486)
    When will we be able to supply a JSON string to a command?
Nostalgia is when sudo was "Sudo", an unqualified God...
