
Steve Bourne Talks About the History of Sh

An anonymous reader writes "Steve Bourne, the creator of the Bourne shell, or sh, talks about its history as the default shell of Unix Version 7. Bourne worked on the shell in 1975 and said the process took no more than 6 months. Sh aimed to improve on the Thompson shell. 'I did change the shell so that command scripts could be used as filters. In the original shell this was not really feasible because the standard input for the executing script was the script itself. This change caused quite a disruption to the way people were used to working. I added variables, control flow and command substitution. The case statement allowed strings to be easily matched so that commands could decode their arguments and make decisions based on that. The for loop allowed iteration over a set of strings that were either explicit or by default the arguments that the command was given. I also added an additional quoting mechanism so that you could do variable substitutions within quotes. It was a significant redesign with some of the original flavor of the Thompson shell still there. Also I eliminated goto in favour of flow control primitives like if and for. This was also considered a rather radical departure from the existing practice. Command substitution was something else I added because that gives you a very general mechanism to do string processing; it allows you to get strings back from commands and use them as the text of the script as if you had typed it directly. I think this was a new idea that I, at least, had not seen in scripting languages, except perhaps LISP,' he says."
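
For readers who have never written sh, here is a minimal sketch (ours, not from the interview) of the features Bourne names: variables, command substitution, the for loop, the case statement, and variable substitution inside quotes:

    # command substitution: the output of date becomes text in the script
    today=`date`
    # variable substitution works inside double quotes
    echo "report generated on $today"
    # for iterates over an explicit list (or the script's arguments by default)
    for f in *.txt; do
        # case matches strings against patterns
        case $f in
            draft*) echo "skipping $f" ;;
            *)      echo "processing $f" ;;
        esac
    done
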
  • PowerShell (Score:4, Insightful)

    by Anonymous Coward on Thursday March 05, 2009 @01:26PM (#27079297)
    Welcome to 1975, Microsoft.
  • Re:PowerShell (Score:5, Insightful)

    by CannonballHead ( 842625 ) on Thursday March 05, 2009 @01:46PM (#27079603)

    Because most Windows users need a shell. Right.

    UNIX wasn't exactly one of those home-user targeted operating systems. It makes sense to have a rather powerful shell on it, scripting abilities, compilers, etc.

    Windows 95, 98, XP, etc., all the non-server ones, didn't need a shell. I grew up using Windows and never once needed something like that. Arguably, it would be nice on the server side, I guess... but Windows did appear to try to get AWAY from the command line.

    Besides. If they included a shell, everyone would just complain how they're copying UNIX and thus are even more useless. :)

  • Re:perl (Score:2, Insightful)

    by Sir Groane ( 1226610 ) on Thursday March 05, 2009 @02:04PM (#27079861) Homepage
    If all you're doing is moving files around or creating tarballs, then all those backticks in Perl can become a PITA

    shell is still around 'cos it's still the right tool for some jobs...
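
    For instance, the sort of one-liner in question, as a sketch (paths are illustrative):

    # one line of sh to build a dated tarball; in Perl each external
    # command here would need backticks or system()
    tar cf - /var/www | gzip > "backup-$(date +%Y%m%d).tar.gz"
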
  • Re:perl (Score:3, Insightful)

    by goombah99 ( 560566 ) on Thursday March 05, 2009 @02:06PM (#27079889)

    Your forkbomb script is rather a pointless action as I'm sure you are aware.

    If you are parsing text or doing any sort of complicated extraction, you have to use grep, awk and sed repeatedly in bash to accomplish the job. Repeatedly launching these from a program can produce something that is easily 100-1000 times slower than the equivalent Perl would be.
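
    A minimal sketch of the cost being described (file names are illustrative):

    # a fresh sed process forked for every input line -- the slow pattern
    while read -r line; do
        echo "$line" | sed 's/foo/bar/'
    done < big.txt > slow.out

    # the same job with a single process launch
    sed 's/foo/bar/' big.txt > fast.out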

    Thus the window of usefulness for a scripting language is extended three orders of magnitude.

    For example, let's say the ease of scripting means you prefer a script over a compiled language until a job takes longer than, say, ten minutes. The equivalent bash program, if it does any serious parsing, might take 3 hours. This is why one might still use a scripting language even when "speed" is an issue.

    Moreover, for any complex set of operations, expressing the logic is always simpler in Perl. Yet the syntax is almost the same.

    Hence I wondered why shell script is still used, given that Perl seems preferable for almost any use.

  • Re:PowerShell (Score:4, Insightful)

    by CannonballHead ( 842625 ) on Thursday March 05, 2009 @02:09PM (#27079949)

    Just USE UNIX, then you don't need to worry about Windows copying it or not copying it.

    It seems the problem is people are willing to admit that Windows has something going for it, and thus wish it would be more like UNIX in some ways. Why not wish UNIX were more like Windows? I guess that's what some distros of Linux are doing. Finally. :)

    (yes, I know, Windows has "95% of users" going for it... but not always...)

  • by Savage-Rabbit ( 308260 ) on Thursday March 05, 2009 @02:13PM (#27079993)

    Because most Windows users need a shell. Right.

    I think the original comment was directed at Windows Server users, not Windows consumer desktop users (unless the user of that desktop is a developer or an admin). I'll agree that most consumer desktop users don't need a shell. I may be a developer these days, but I have been an administrator for Linux, Solaris, AIX, several lesser-known incarnations of *NIX, Windows NT, Windows 2000 Server and Windows 2003, and I can tell you that there are times when you really miss the command-line power of the Unix shell on Windows servers.

    There are tasks you simply can't do on a Windows server except through the GUI. That's fine if you don't have to do them often, but when you have, say, a project where you have to do the same set of tasks a few thousand times in a row and want to complete it in a sane amount of time, scripting is a must. The only alternative for some such problems, even on Windows 2003, is to write a C# program, because you can't solve them by scripting. Writing a C# program is something I wouldn't expect an average Windows admin to be able to do, any more than I would require a Unix admin to be a seasoned Java developer. IMHO an average Windows Server admin or Unix admin should be seasoned at scripting, but I wouldn't expect either to be seasoned at C# or Java programming; VB or Perl would be good, though. I am not prepared to take a server OS seriously unless I can do more on its command line than I can with the slick GUI management tools.

  • by RAMMS+EIN ( 578166 ) on Thursday March 05, 2009 @02:30PM (#27080277) Homepage Journal

    The parent comment was modded funny, but I think Greenspun's Tenth is still relevant today. And, applied to Unix, it's definitely true. Imagine what Unix would be like if there were only C. But there isn't only C; there is also the shell and various scripting languages. The shell's most important feature is that it's interactive, like Lisp's read-eval-print loop. Today's popular scripting languages on Unix (say, Perl and Python) implement many of the other features of Lisp, allowing programs to be expressed a lot more succinctly and conveniently than in C. But all these are part of the same universe: the shell works mostly by running other programs, and the scripting languages do some of their tasks by going through the shell or C libraries. So, with everything together, you end up with something vaguely like what Lisp offers in a single package.

    Of course, the world hasn't stood still, and the Unix universe now offers many features that aren't really present, or at least not standardized, in the Lisp universe.

    And, in the meantime, Java has come along, re-inventing and re-implementing tons of features from Lisp and Unix.

  • Re:PowerShell (Score:3, Insightful)

    by Hatta ( 162192 ) on Thursday March 05, 2009 @03:28PM (#27081141) Journal

    When the only tool you have is a hammer, every problem looks like a nail. When the only tool you have is a GUI, every problem looks like a clickfest. Until you know the command line, you don't realize how handy it is. So I would argue that every user needs the command line; they just don't know it yet. I'm a pretty normal desktop user, more skilled than most perhaps, but the tasks I do are pretty common. There's almost nothing I do that doesn't benefit from a CLI.

    But this is old news now, Windows has a CLI. I hear it's pretty powerful too. I don't spend enough time on Windows to bother learning it, but I'm glad they have it. If there are any useful ideas there, I'm sure they'll make it into Bash or ZSH or whatever.

  • Re:PowerShell (Score:3, Insightful)

    by Hatta ( 162192 ) on Thursday March 05, 2009 @03:30PM (#27081183) Journal

    Cygwin + the Terminator terminal makes a pretty nice environment when you're stuck with windows.

  • Re:PowerShell (Score:5, Insightful)

    by Tetsujin ( 103070 ) on Thursday March 05, 2009 @03:33PM (#27081225) Homepage Journal

    Welcome to 1975, Microsoft.

    Meh, give Powershell some credit. It exposes a lot more functionality with a lot better organization than a Unix shell would. They took the basic paradigm of the shell and made it fit the .NET environment - so users can express themselves using the same basic style as they'd use in a Unix shell, but working with a more powerful set of libraries and data types. I think it's significant, and I think the Unix world could learn a thing or two from it, about keeping what's good about the shell, but moving the basic technology out of the 1970s.

  • Re:perl (Score:2, Insightful)

    by Ex-Linux-Fanboy ( 1311235 ) on Thursday March 05, 2009 @04:02PM (#27081587) Homepage Journal

    I've never fully understood why bash is used anymore when perl is around

    The right tool for the right job. For example, I've been using sh/bash for a bunch of SQA regression tests for a command-line caching DNS server I'm working on (my current open-source project). Here is one of the simpler tests so you can get an idea of the syntax:

    for VALUE in 0 1 ; do

    # write the test configuration, varying handle_noreply each pass
    cat > dwood2rc << EOF
    chroot_dir="$( pwd )"
    ipv4_bind_addresses="127.0.0.1"
    upstream_servers["."]="127.0.0.2"
    recursive_acl="127.0.0.1/16"
    maxprocs=8
    timeout_seconds=1
    handle_noreply=${VALUE}
    EOF

    # start the server under test and give it a second to come up
    ../../src/DwMain -f dwood2rc > /dev/null &
    sleep 1
    echo handle_noreply=$VALUE
    askmara -t 8 Awww.example.com.
    sleep 1
    # tear down before the next pass
    killall DwMain > /dev/null 2>&1
    sleep 1

    done

    Now, yes, one could do a test like this in Perl, but all we're really doing is making a file with some parameters we're testing, then running the program being tested with those parameters. Here, DwMain is the DNS server I'm testing and askmara is like dig, but simpler.

    I used to be a big-time Perl scripter, but I feel it's usually too big and complicated for the tasks I'm doing.

    For embedded systems, keep in mind the Perl core library is well over a megabyte in size; a full *NIX system in busybox (with sh, awk, ls, and pretty much any other command you would type at the command line) is only about 500k in size. This matters in things like routers and mini-Linux distributions (I once made a Linux distribution that was under 30 megs in size that included a GUI and the Firefox web browser).

    Also, the thing that annoys me with Perl is that there is no standard that defines how Perl should act; the only standard is the Perl interpreter itself, and it has changed in strange ways that sometimes make debugging Perl scripts difficult [google.com]. What guarantee is there that my Perl scripts will run in Perl 6 or whatnot?

    Also, when people add a lot of stuff from CPAN, Perl starts getting into "DLL hell".

    sh, on the other hand, has its behavior defined by POSIX, and if I make a POSIX-compliant script, there's a pretty good chance it will continue to run for the foreseeable future.
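
    For example, a sketch of the difference (the first form is a bashism, the second is portable):

    name=$1                  # illustrative
    # [[ $name == foo* ]]    # the bash-only test; not defined by POSIX
    case $name in            # the POSIX form; runs under dash, ash, busybox sh
        foo*) echo match ;;
    esac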

  • by tepples ( 727027 ) <tepples.gmail@com> on Thursday March 05, 2009 @04:07PM (#27081647) Homepage Journal

    The question, though, is why C# or Java "programming" is so different from "scripting" that you'd expect a sysadmin to know the latter, but not the former.

    Perhaps because the syntactic salt of C# and Java makes them more cumbersome than the "P" languages for the sorts of automation tasks that sysadmins handle routinely:

    • The developer must compile a program explicitly, unlike the "P" languages that automatically call the compiler to produce bytecode.
    • The developer must define explicitly what class a particular translation unit represents, compared to Python where each file implicitly describes a module.
    • C# and Java use named interfaces instead of the typical duck-typing approach of Python where any object that implements a given set of methods will work.

    Not to mention that a lot of sysadmins learn some of their languages through hobby projects on shared web hosting, and more shared web hosting environments have "P" languages than ASP.NET and Java servlets.

  • Re:PowerShell (Score:2, Insightful)

    by Anonymous Coward on Thursday March 05, 2009 @04:35PM (#27081997)

    so users can express themselves using the same basic style as they'd use in a Unix shell, but working with a more powerful set of libraries and data types.

    Like a Unix user would be calling Perl or Python?

    The nice thing about Unix isn't the shell, or the utilities (awk, sed, etc.), or the scripting languages: it's the fact that they can all be linked together via pipes. As long as you move your data around as text, you can send it to anything on a Unix system.
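
    For example (a sketch; any tools compose the same way):

    # rough count of processes per user: four unrelated programs, one pipe,
    # plain text flowing between them
    ps aux | awk 'NR > 1 {print $1}' | sort | uniq -c | sort -rn | head -5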

  • by Tetsujin ( 103070 ) on Thursday March 05, 2009 @06:04PM (#27083343) Homepage Journal

    so users can express themselves using the same basic style as they'd use in a Unix shell, but working with a more powerful set of libraries and data types.

    Like a Unix user would be calling Perl or Python?

    Not quite... The shell user can call Perl or Python to access libraries or datatypes - but these concepts are meaningless within the framework of the shell itself. In Powershell, a commandlet returning an object yields something you can work with in the shell - see what object methods or data fields it provides, run methods, pass the object to another commandlet, etc.

    Powershell provides a powerful set of baseline assumptions for the format of data over the pipe - and so both the shell itself and commandlets running in the shell can take advantage of these assumptions. In Unix, the typical approach is to "roll your own format" each time - which is trivial for trivial problems, but substantially harder as you start worrying about questions like, what happens when my data fields contain the character I want to use as a delimiter?
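
    A sketch of that delimiter problem, using colon-delimited records and cut (the data is illustrative):

    # works while no field contains a colon...
    printf '%s\n' 'Smith, John:10:20' | cut -d: -f2        # prints: 10
    # ...but breaks as soon as one does
    printf '%s\n' 'Smith 2:1 degree:10:20' | cut -d: -f2   # prints: 1 degree
    # now every consumer needs quoting rules the format never defined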

    This is further complicated by the fact that existing Unix programs, outputting text, typically format that text for human consumption. The exact format of a program's input or output may change from release to release with no coherent strategy for establishing any kind of compatibility. In comparison, in Powershell a piece of data passed from one process to another has a predictable underlying structure - it's formatted for consumption by another process rather than for display at the terminal. But since the shell itself also recognizes this format, it has a reasonable default behavior for displaying a command output - or if necessary you can pipe through a command whose job is to make nicely formatted output of selected columns of another program's result.

    Now, what are the benefits of serializing to text format? You can look at it, printed on-screen, and know what it represents and how to parse it, right? The problem is this becomes less and less true as the data format becomes more intricate, more comprehensive - which is bound to happen as you start doing things like handling more complex problems, implementing data formats that account for future upgrades, and so on. The strength of PowerShell's approach (the same approach, basically, as you'd find in any other capable, interactive programming language) is that it knows enough about the format of data it handles that it can make that format easy to view and work with - easier, in fact, than plain text, because you see a representation of the data's meaning rather than of its underlying structure.

    As another example, consider what it would take to provide any kind of higher-order programming in the shell. There's a limited degree of this available already, of course: if you want to pass a "function" to something, you form it into a shell script, put it in a directory somewhere, and provide the full pathname to that function as an argument to your higher-order program - which will then use something like "system()", "popen()" or "exec()" to invoke it.
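
    A minimal sketch of that first style, where the "function" is a script path (everything here is hypothetical):

    # the "function": a script that doubles its argument
    cat > double.sh << 'EOF'
    #!/bin/sh
    echo $(( $1 * 2 ))
    EOF
    chmod +x double.sh

    # a "higher-order" map: applies the passed-in script to each argument
    map() {
        fn=$1; shift
        for x in "$@"; do "$fn" "$x"; done
    }
    map ./double.sh 1 2 3    # prints 2, 4, 6 on separate lines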

    Now, what if you want to include a set of data, representing the state of an "object" with that "function"? You can do that, too - you can write out a data file representing the current state, and pass both the script and the data file names to your higher-order program. Or you could have a program running in the background, communicating via some form of IPC - maybe over a named pipe or listening to a particular network socket or hosted by an object broker, and pass the necessary reference to the higher-order function. Or, about the nicest you can manage in the shell (though decidedly not a clean solution IMO) - start a process in the background which maintains the state you're working with, and have a second executable which communicates with the background process, passing on commands and bringing back results.

    The problem is, none of those me

  • Re:Yes, PowerShell (Score:4, Insightful)

    by buchner.johannes ( 1139593 ) on Thursday March 05, 2009 @06:10PM (#27083457) Homepage Journal

    We should make a coreutils package that outputs XML, JSON or similar, so we don't need stupid cut/grep/head tricks anymore and can, for example, directly access a column, or sum stuff up.

    The last command in the pipe chain would output in a user-readable format.
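
    A sketch of what that could look like, assuming a hypothetical "ls --json" that emits one object per file (jq is a real tool; the ls flag is not):

    # e.g. each line: {"name":"a.txt","size":1024,"mtime":1236270000}
    ls --json | jq -s 'map(.size) | add'                 # total bytes, no awk needed
    ls --json | jq -r 'select(.size > 1048576) | .name'  # names of files over 1 MiB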

  • Re:perl (Score:3, Insightful)

    by goombah99 ( 560566 ) on Thursday March 05, 2009 @06:14PM (#27083531)

    Okay, now suppose you wanted an exception test for the killall or the askmara. Or suppose you wanted a time-out if they never returned. Finally, assume you wanted to log the result of the action. Maybe you want to use a command-line variable to supply, say, a password and the number of retries.

    Yes, you could do all that in shell script; it is simply easier in Perl.
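
    For comparison, a sketch of a timeout-plus-retry-plus-logging version in sh, reusing askmara from the grandparent's script (uses coreutils' timeout(1); variable names are illustrative):

    RETRIES=${2:-3}          # second command-line argument, default 3
    LOG=test.log
    i=0
    while [ $i -lt $RETRIES ]; do
        # give askmara 5 seconds before declaring it hung
        if timeout 5 askmara -t 8 Awww.example.com. >> "$LOG" 2>&1; then
            echo "askmara ok" >> "$LOG"
            break
        fi
        echo "askmara failed, retrying" >> "$LOG"
        i=$((i + 1))
    done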

    Your assurance that your Perl scripts will run is that the first line of the program specifies which Perl interpreter to use. If it requires 5.6, you tell it to use 5.6.

  • And this speaks to why, IMHO, it was unfair (besides being stupid) to change the rules on software patents in 1986. Prior to that time, the huge amount of seminal, fundamental, wonderful work (by geniuses and people much smarter than me) in software and systems could not be patented, so it was either secret (for a while) or open. All those giants back then had no opportunity to set up a licensing toll bridge. And now an infinite regression of trivialities is patented.

    Imagine what progress in computing would have been if Alan Kay had been able to patent windowing GUIs, or if object-oriented programming had been patented, or paged virtual memory, timesharing, CDMA, TCP, IP, programming macros, relocatable code linkers, electronic mail, image morphing, most computer graphics and imaging techniques, ... the list goes on.

    Some of the core ideas incorporated by Berners-Lee in his WWW creation could have been patented either by him, or by NeXT Computer before he had a chance. And then where would we be?

    Hell, I personally could have gotten patents on client-server image-processing nets, steganography, SAN, image paging and pre-fetch, pan-and-zoom image and map display, a whole raft of specialized raster-op (bitblit) functions, physical mapping of image files onto disk sectors, and street-address interpolation for geolocation. And that's just a sample of the bigger stuff I was involved in from 1983-1985. Oh yeah - a collaborative sketchpad over Ethernet, in 1982!

    At the time (early and mid 1980's) NONE of this was patentable. And now people are getting held up for $millions for stuff we didn't even bother to document or publish, because it was so trivial. And (just for perspective) I was just a regular schmoe - not one of the lights of programming.

    rant, rant, rant... I totally agree with what you said :) I was not and am not worthy either. And certainly neither are the market- and legal-droid twits at Amazon and Microsoft and elsewhere who browbeat the software writers into signing off on the post-placental detritus that modern software patents are and will always be.
