Steve Bourne Talks About the History of Sh 232
An anonymous reader writes "Steve Bourne, the creator of the Bourne shell, or sh, talks about its history as the default shell of Unix Version 7. Bourne worked on the shell in 1975 and said the process took no more than 6 months. Sh aimed to improve on the Thompson shell. 'I did change the shell so that command scripts could be used as filters. In the original shell this was not really feasible because the standard input for the executing script was the script itself. This change caused quite a disruption to the way people were used to working. I added variables, control flow and command substitution. The case statement allowed strings to be easily matched so that commands could decode their arguments and make decisions based on that. The for loop allowed iteration over a set of strings that were either explicit or by default the arguments that the command was given.
I also added an additional quoting mechanism so that you could do variable substitutions within quotes. It was a significant redesign with some of the original flavor of the Thompson shell still there. Also I eliminated goto in favour of flow control primitives like if and for. This was also considered a rather radical departure from the existing practice.
Command substitution was something else I added because that gives you a very general mechanism to do string processing; it allows you to get strings back from commands and use them as the text of the script as if you had typed it directly. I think this was a new idea that I, at least, had not seen in scripting languages, except perhaps LISP,' he says."
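The features Bourne describes here still behave the same way in any POSIX-compatible sh; a small sketch (the script itself is made up for illustration):

```shell
#!/bin/sh
# for-loop over an explicit set of strings, case to decode each one,
# and command substitution to pull a string back from another command.
for action in start stop reload; do
  case $action in
    start)  echo "started at $(date)" ;;
    stop)   echo "stopped" ;;
    *)      echo "unknown action: $action" ;;
  esac
done
```

With no list after `for action`, the loop would default to iterating over the script's arguments, exactly as described above.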
Sh! (Score:5, Funny)
That was a pre-emptive "sh!" Now, I have a whole bag of "sh!" with your name on it.
Re:Sh! (Score:5, Funny)
"I have a whole bag of "sh!" with your name on it."
In other words, you have the whole shebang?
Real history. (Score:5, Funny)
The history of "Sh" started when the first kid was born, and it has continued to this day. Later forked versions include "Shh!" and "STFU".
Re:Real history. (Score:5, Funny)
*BASH*
Re:Real history. (Score:5, Funny)
tcsh, tcsh, tcsh. -Mom
Re:Real history. (Score:4, Funny)
*BASH*
Bourne Again SHell - I remember when I first learned of it, thinking "Wow! Unix meets Jesus!".
Re: (Score:2)
So you've only been using unix for 7 years? Noob!
[Bourne Identity - 2002]
Re: (Score:3, Interesting)
P1: I wrote my first /bin/sh script on 9/10/1984
P2: I'm still in touch with the other intern from that phase, who calls me and asks things like, "Is it 2>&1 or 2&>1? I never can remember..."
C: Long term usage does not imply expertise.
I like to think I'm pretty good, but I still review Csh Programming Considered Harmful [faqs.org] for more esoteric usage of /bin/sh, when I have only that old tool available.
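For the record, the answer to the joke above: it's `2>&1`, and its position matters. A quick sketch:

```shell
# 2>&1 duplicates stderr onto whatever stdout currently points at,
# so it must come *after* the file redirection to capture both streams.
log=$(mktemp)
{ echo "to stdout"; echo "to stderr" >&2; } > "$log" 2>&1
wc -l < "$log"   # both lines landed in the log
rm -f "$log"
```

Written the other way round (`2>&1 > "$log"`), stderr would still point at the terminal.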
Re: (Score:2)
The history of "Sh" started when the first kid was born, and it has continued to this day. Later forked versions include "Shh!" and "STFU".
:D
You know, I was gonna write a shell and give it a name with "sh" at the end, but maybe I should call it "STFU" instead. :D
Re:Real history. (Score:5, Funny)
Shell To Frustrate Users?
PowerShell (Score:4, Insightful)
Re:PowerShell (Score:5, Insightful)
Because most Windows users need a shell. Right.
UNIX wasn't exactly one of those home-user targeted operating systems. It makes sense to have a rather powerful shell on it, scripting abilities, compilers, etc.
Windows 95, 98, XP, etc., all the non-server ones, didn't need a shell. I grew up using Windows and never once needed something like that. Arguably, it would be nice on the server side, I guess... but Windows did appear to try to get AWAY from the command line.
Besides. If they included a shell, everyone would just complain how they're copying UNIX and thus are even more useless. :)
Re: (Score:2, Interesting)
Oh, how I wish they would copy UNIX. File management would almost be tolerable.
--Brought to you by the letters "c" and "p".
Re:PowerShell (Score:4, Insightful)
Just USE UNIX, then you don't need to worry about Windows copying it or not copying it.
It seems the problem is people are willing to admit that Windows has something going for it, and thus wish it would be more like UNIX in some ways. Why not wish UNIX was more like Windows? I guess that's what some distros of Linux are doing. Finally. :)
(yes, I know, Windows has "95% of users" going for it... but not always...)
Re: (Score:2)
It seems the problem is people are willing to admit that Windows has something going for it, and thus wish it would be more like UNIX in some ways.
DirectX.
Re: (Score:2)
Re: (Score:2)
SCO's price for a "Linux licence".
Re:PowerShell (Score:4, Funny)
I thought he was referring to Apple's price for a Mac Mini with OS X installed.
Re:"Just USE UNIX..." (Score:2)
Sorry to state the obvious, but Linux IS a flavor of UN*X regardless of what the lawyers at the Open Group say.
Re: (Score:3, Insightful)
Cygwin + the Terminator terminal makes a pretty nice environment when you're stuck with windows.
Do windows users need a shell? (Score:5, Insightful)
Because most Windows users need a shell. Right.
I think the original comment was directed at Windows Server users, not Windows consumer desktop users (unless the user of that consumer desktop is a developer or an admin). I'll agree that most consumer desktop users don't need a shell.
I may be a developer these days, but I have been an administrator for Linux, Solaris, AIX, several lesser-known incarnations of *NIX, Windows NT, Windows 2000 Server and Windows 2003. I can tell you that there are times when you really miss the command-line power of the Unix shell on Windows servers. There are tasks you simply can't do on a Windows server except through the GUI, which is fine if you don't have to do them often. But when you have, say... a project where you have to do the same set of tasks a few thousand times in a row and want to complete it in a sane amount of time, scripting is a must.
The only alternative for solving some such problems, even on Windows 2003, is to write a C# program, because you can't solve them by scripting. Writing a C# program is something I wouldn't expect an average Windows admin to be able to do, any more than I'd require a Unix admin to be a seasoned Java developer. IMHO an average Windows Server admin or Unix admin should be seasoned at scripting, but I wouldn't expect either to be seasoned at C# or Java programming; VB or Perl would be good, though. I am not prepared to take a server OS seriously unless I can do more on its command line than I can with the slick GUI management tools.
Re:Do windows users need a shell? (Score:5, Interesting)
You can use perl and python for windows.
For example, for perl there's Bundle::Win32
http://search.cpan.org/~jdb/Bundle-libwin32-0.30/libwin32.pm [cpan.org]
Useful stuff like: Win32::TieRegistry , Win32::ChangeNotify
But be good and don't write malware. The antivirus people might give up trying to detect perl malware (think about it - polymorphic TMTDOWTDI perl malware...), they might just flag/blacklist perl itself :).
Re: (Score:2)
Hence, me stating the following?
Arguably, it would be nice on the server side, I guess... but Windows did appear to try to get AWAY from the command line.
I guess it's not even fashionable on slashdot to read the comment you are replying to ;) hehe.
I admit, shells and command-lines are pretty nice. If I want to know the IP of my windows box I do ipconfig, not double click the network connection.
But most people aren't running Windows Server. Most people are running Windows XP, Vista, whatever. They don't need a shell for most things... and, as someone replied to you, you can use python and perl on Windows. And there's always
Re: (Score:2)
I guess it's not even fashionable on slashdot to read the comment you are replying to ;) hehe.
You said it would be 'nice', which implies you could live without it, even on the server side. I was trying to make the point that a powerful shell on a server, or scripting ability for an admin, is not optional; it is required. :-)
Re: (Score:2)
I suppose it depends on what the admin is doing. Running a simple web server or something like that doesn't require a ton of shell power, in my experience...
That said, point taken. I think there's a reason Linux/UNIX is way more popular on the server side than it is on the home consumer side. :)
Re: (Score:3, Informative)
The question, though, is why C# or Java "programming" is so different from "scripting" that you'd expect a sysadmin to know the latter, but not the former.
C# and Java vs. the "P" languages (Score:4, Insightful)
The question, though, is why C# or Java "programming" is so different from "scripting" that you'd expect a sysadmin to know the latter, but not the former.
Perhaps because the syntactic salt of C# and Java makes them more cumbersome than the "P" languages for the sorts of automation tasks that sysadmins handle routinely:
Not to mention that a lot of sysadmins learn some of their languages through hobby projects on shared web hosting, and more shared web hosting environments have "P" languages than ASP.NET and Java servlets.
Re: (Score:2)
Does Windows seriously not come with any way to automate things? I mean, besides batch scripts, which, unless I'm mistaken, allow you to do some of the things you could do under DOS, but that don't actually interface to what you would normally work with under Windows much.
Windows Script Host (Score:3, Informative)
Does Windows seriously not come with any way to automate things?
Windows Script Host [wikipedia.org] allows a program written in JScript or VBScript to control any app that exposes APIs through OLE Automation [wikipedia.org].
Re: (Score:2)
Every version since Win98 has included the Windows Script Host [wikipedia.org] by default. This allows one to automate quite a variety [microsoft.com] of tasks out of the box using vbscript or javascript. It's a little clunky for some things (e.g. recursive file searches), but is generally flexible enough for most needs.
Yes, PowerShell (Score:5, Informative)
Available from Microsoft for XP, 2003; included in Server 2008 and Windows 7.
The name is really lame, but it *is* damn powerful. At least for Windows, which has most of its API exposed through object-oriented technologies (COM, .NET and WMI) that are easily used in a unified way by PowerShell.
Just a few quick samples:
Re: (Score:2)
Here's another example: finding all empty folders below the current one
In PowerShell:
$a = Get-ChildItem . -Recurse | Where-Object {$_.PSIsContainer -eq $True}
$a | Where-Object {$_.GetFiles().Count -eq 0} | Select-Object FullName
Using find:
find . -type d -depth -empty
Hmm... No COM, .NET or WMI technologies required.
Why so verbose? (Score:3, Informative)
when
ls -r | ?{-not($_|ls)}
would suffice?
Explanation:
1) list all items recursively from the current location
2) filter only those items where ls returns an empty set.
Btw, the objects returned from that command are *still* DirectoryInfo objects, allowing even further operations or property accesses.
Also, this command will work the same even if the "current location" is a node in the registry, the certificate store, the credentials store, a group in active directory etc etc.
In other words if you
Re:Yes, PowerShell (Score:4, Insightful)
We should make a coreutils package that outputs XML, JSON or similar, so we don't need stupid cut/grep/head tricks anymore and can, for example, directly access a column, or sum stuff up.
The last command in the pipe chain would output in a user-readable format.
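No such JSON-emitting coreutils package exists, but the idea can be faked today; a sketch assuming GNU `stat` (file names and sizes are made up):

```shell
# Emit one structured record per file (here JSON-ish lines), keep the
# pipeline machine-readable, and let only the last stage summarize.
cd "$(mktemp -d)"
printf 'abc'   > a.txt   # 3 bytes
printf 'defgh' > b.txt   # 5 bytes
stat --printf '{"name":"%n","size":%s}\n' a.txt b.txt |
  sed 's/.*"size":\([0-9]*\)}.*/\1/' |
  awk '{ total += $1 } END { print total }'   # prints 8
```

The grep/cut/head tricks disappear because the "size column" is addressable by name in every record, not by byte offset in a human-formatted listing.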
Re: (Score:2)
Bah. Every version of Windows I've used has a command shell. Even when the available commands were basically equivalent to MS-DOS, it was possible to do almost anything via a .BAT file. One can pass variables, prompt for input, do conditional branching, loops, even create and call new batch files dynamically.
I personally enjoy the challenge of writing .BAT files that rely solely on native commands. As far back as Windows 95 I've written .BAT files that could: Run automatically on a per computer OR per user
Re: (Score:2)
vbscript, jscript, batch, and on most 2008 boxes powershell.
Optional:
perlscript
Also there are quite a few third party scripting engines available. There are a TON of things you can do on the command line that you can't do in the GUI including almost all AD debugging. I write batch files every week and vbscripts probably at least monthly. I can't wait until powershell supports remote objects so that it becomes more useful in a networked environment
Re: (Score:2)
Cygwin makes windows bearable.
Re: (Score:2)
Re: (Score:2)
Imagine the noise of a beowulf cluster of washing machines
Re:PowerShell (Score:4, Funny)
Imagine the noise of a beowulf cluster of washing machines
They're called laundromats.
Re:PowerShell (Score:5, Funny)
I've heard of those... they process threads there, don't they?
Re: (Score:2)
I don't know about any other sysadmin, but I regularly need to go into the MS-DOS shell in order to do something really fast or control something. E.g. if you want to check why a certain file doesn't show up in Explorer, you can drop down into the shell and see the file and change its attributes, or delete/copy large amounts of files based on extension or any other part of the name (using * and ?).
Re: (Score:2, Interesting)
Many Mac users have found the Unix shell hidden under Mac OS X to be quite useful. And remember that pre-Mac OS X, the Mac OS not only lacked the concept of environment variables, it didn't even have a command-line prompt.
Of course, it isn't just the shell, it's the whole OS philosophy that's important. It's why people who use Linux/Unix based systems can easily cobble together their own backup solutions. Use "rsync" with Amazon's S3 service, and you have an online backup solution that's cheap and sec
Re: (Score:2)
It was in my home. Using Unix (in one form or another) since 1986, baby, yeah!
Re: (Score:2)
winipcfg -release
winipcfg -renew
Re: (Score:3, Insightful)
When the only tool you have is a hammer, every problem looks like a nail. When the only tool you have is a GUI, every problem looks like a clickfest. Until you know the command line, you don't realize how handy it is. So I would argue that every user needs the command line, they just don't know it yet. I'm a pretty normal desktop user, more skilled than most perhaps, but the tasks I do are pretty common. There's almost nothing I do that doesn't benefit from a CLI.
But this is old news now, Windows has a
Re: (Score:2)
Start->Run.
That's the GUI interface to the Windows command line (whereas CMD is the command line interface itself). That there exists such a thing indicates that command lines are useful even to regular users.
PowerShell ideas are more relevant to Windows (Score:2)
PowerShell ideas are more relevant to Windows than they are to *nix. PowerShell is object-oriented: The pipes are objects, not text. That saves a lot of parsing and allows interaction (like calling methods) with the objects flowing through.
PowerShell also unifies the object-oriented models on Windows: COM, .NET and WMI. Most Windows APIs are now either fully object-oriented (e.g. DirectX, Speech) or have been wrapped by object-oriented models such as .NET or WMI.
*nix generally does not expose API
Re: (Score:2)
Windows 95, 98, XP, etc., all the non-server ones, didn't need a shell
Umm, except for XP, all versions of Windows in your list had "command.com." All DOS versions of Windows executed "autoexec.bat" at start up with the DOS shell command.com. XP has "cmd.exe"
I grew up using Windows and never once needed something like that.
*you* may not have needed it.
but Windows did appear to try to get AWAY from the command line.
Yes, but because *you* don't see it, doesn't mean it isn't there and not used.
Besides. If they
Re: (Score:2)
Umm, except for XP, all versions of Windows in your list had "command.com." All DOS versions of Windows executed "autoexec.bat" at start up with the DOS shell command.com. XP has "cmd.exe"
I realize that. I used it. I started with DOS and would alternately install Windows 3.1 and Wayne Gretzky Hockey 3, as they would not both fit on my 20mb hard drive. Windows XP still has command.com by the way. But cmd is way nicer, and faster.
*you* may not have needed it.
I said I didn't need it, I didn't say I didn't use it. :)
Yes, but because *you* don't see it, doesn't mean it isn't there and not used.
Again, I didn't say it wasn't there, I said Windows appeared to try to get away from it being necessary. Mac OS appeared to do the same thing.
Windows on its own is useless. The only things that make it non-useless have more to do with 3rd party support than anything Microsoft does. That's why monopolies are bad, because, even though Windows sucks, users have little practical choice.
Now I understand the rest of your post... you hate MS and hate W
Re: (Score:2)
I realize that. I used it.
Then why did you say:
Windows 95, 98, XP, etc., all the non-server ones, didn't need a shell. I grew up using Windows and never once needed something like that. Arguably, it would be nice on the server side, I guess... but Windows did appear to try to get AWAY from the command line.
That paragraph absolutely tries to say that Windows does not have a shell. If it did have a shell, which you claim to know that it did, why would you say: "it would be nice on the server side, I guess."
N
Re: (Score:2)
Agreed, it was ad hom. Attempt at figuring out a perceived slant in the post. Meh, was unnecessary though, apologies.
why would you say: "it would be nice on the server side, I guess."
Because it's true. I personally do not think Windows really has/had the equivalent of bash or something (I've found batch scripts to be clunky and DOS not nearly as easy as bash), but it does have a command line which can be used. I phrased it "it would be nice..." because I haven't actually done a lot on Windows Server * aside from setting up basic functionality (DHCP, ActiveDirectory, D
Re: (Score:2)
Windows on its own is useless.
That's why they include Free Cell. :-)
Re: (Score:2)
--I discovered NDOS while using an old version of Norton Utilities (back when they were actually useful, and not bloated.) That led me to 4DOS and a whole world of useful stuff you could do in the extended-capability Command shell they supplied.
--WayCool stuff, if you were an old DOS hound like meself. ;-)
Re: (Score:2)
I grew up using Windows and never once needed something like that.
Well, kiddo, most of us geeks grew up using DOS and TOS and those Basic cartridges. Windows was a cutesy little app that ran on top of DOS. I spent pretty much all of the 80's and 90's working from a command shell, and even today on my Windows XP desktop, I have a couple hundred batch files and Perl scripts that follow me wherever I go. There is a wealth of tasks that are done more concisely and efficiently with a few text commands than any GUI could ever encompass.
Just look at the very handy things you
Re: (Score:2)
Hehe, hi gramps.
FWIW, I actually started with DOS as well. I never used BASIC cartridges but I actually did use BASIC. I knew that Windows ran on top of DOS, and most (hey, I was young) games that I played were DOS games. I still enjoy them once in a while, in fact. I still have a copy, on 5.25" floppies, of DOS 2.1, I think.
But I'd say I grew up using Windows, still, since I used it more than I used DOS. But I have known commands cd, rmdir, copy, etc., since I was fairly young... I think around 7?
Re: (Score:2)
Windows 95, 98, XP, etc., all the non-server ones, didn't need a shell. I grew up using Windows and never once needed something like that. Arguably, it would be nice on the server side, I guess... but Windows did appear to try to get AWAY from the command line.
MS-DOS had a half-decent command-line environment - don't knock it. For those of us that grew up with DOS, it was great, and moving to an all-GUI "Windows" environment was a painful shift.
I say MS-DOS had a half-decent CLI, but DOS is much better now [freedos.org]. You're welcome, btw. :-)
Re: (Score:2)
FWIW, I'm a software tester and regularly script in perl, bash, python, and a lot of xml stuff as well. The products that I test were primarily command-line products, and I'm involved in automating testing of that side of the product, hence the scripting. But thanks for assuming ignorance ;)
Re:PowerShell (Score:5, Insightful)
Welcome to 1975, Microsoft.
Meh, give Powershell some credit. It exposes a lot more functionality with a lot better organization than a Unix shell would. They took the basic paradigm of the shell and made it fit the .NET environment - so users can express themselves using the same basic style as they'd use in a Unix shell, but working with a more powerful set of libraries and data types. I think it's significant, and I think the Unix world could learn a thing or two from it, about keeping what's good about the shell, but moving the basic technology out of the 1970s.
Re: (Score:2, Insightful)
so users can express themselves using the same basic style as they'd use in a Unix shell, but working with a more powerful set of libraries and data types.
Like a Unix user would be calling Perl or Python?
The nice thing about Unix isn't about the shell, or the utilities (awk, sed, etc.), or the scripting languages: it's the fact that they can be all link together via pipes. As long as you move your data around as text, you can send it to anything on a Unix system.
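A trivial illustration of that glue: every stage below is an unrelated program, joined only by text flowing over a pipe (a sketch):

```shell
# Most common login shells on this system: cut, sort and uniq know
# nothing about each other -- plain text is the only contract.
cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn
```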
PowerShell and critique of the Unix shell (Score:5, Insightful)
so users can express themselves using the same basic style as they'd use in a Unix shell, but working with a more powerful set of libraries and data types.
Like a Unix user would be calling Perl or Python?
Not quite... The shell user can call Perl or Python to access libraries or datatypes - but these concepts are meaningless within the framework of the shell itself. In Powershell, a commandlet returning an object yields something you can work with in the shell - see what object methods or data fields it provides, run methods, pass the object to another commandlet, etc.
Powershell provides a powerful set of baseline assumptions for the format of data over the pipe - and so both the shell itself and commandlets running in the shell can take advantage of these assumptions. In Unix, the typical approach is to "roll your own format" each time - which is trivial for trivial problems, but substantially harder as you start worrying about questions like, what happens when my data fields contain the character I want to use as a delimiter?
This is further complicated by the fact that existing Unix programs, outputting text, typically format that text for human consumption. The exact format of a program's input or output may change from release to release with no coherent strategy for establishing any kind of compatibility. In comparison, in Powershell a piece of data passed from one process to another has a predictable underlying structure - it's formatted for consumption by another process rather than for display at the terminal. But since the shell itself also recognizes this format, it has a reasonable default behavior for displaying a command output - or if necessary you can pipe through a command whose job is to make nicely formatted output of selected columns of another program's result.
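A concrete instance of the delimiter problem described above (the record is made up):

```shell
# A comma-delimited record falls apart as soon as a field
# itself contains the delimiter:
record='Smith, John,engineer'
printf '%s\n' "$record" | cut -d, -f1   # prints "Smith", not "Smith, John"
```

Quoting or escaping conventions can patch this, but each tool in the pipe has to agree on them, which is exactly the "roll your own format" burden.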
Now, what are the benefits of serializing to text format? You can look at it, printed on-screen, and know what it represents and how to parse it, right? The problem is this becomes less and less true as the data format becomes more intricate, more comprehensive - which is bound to happen as you start doing things like handling more complex problems, implementing data formats that account for future upgrades, and so on. The strength of PowerShell's approach (the same approach, basically, as you'd find in any other capable, interactive programming language) is that it knows enough about the format of data it handles that it can make that format easy to view and work with - easier, in fact, than plain text, because you see a representation of the data's meaning rather than of its underlying structure.
As another example, consider what it would take to provide any kind of higher-order programming in the shell. There's a limited degree of this available already, of course: if you want to pass a "function" to something, you form it into a shell script, put it in a directory somewhere, and provide the full pathname to that function as an argument to your higher-order program - which will then use something like "system()", "popen()" or "exec()" to invoke it.
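The pattern the parent describes, reduced to its simplest in-shell form (function names are made up for illustration; in the cross-program case the first argument would be a full path to a script):

```shell
# A higher-order "map": the first argument names the command to apply
# to each remaining argument.
map() {
  cmd=$1; shift
  for arg in "$@"; do "$cmd" "$arg"; done
}
shout() { printf '%s!\n' "$1"; }   # the "function" being passed
map shout hello world              # prints "hello!" then "world!"
```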
Now, what if you want to include a set of data, representing the state of an "object" with that "function"? You can do that, too - you can write out a data file representing the current state, and pass both the script and the data file names to your higher-order program. Or you could have a program running in the background, communicating via some form of IPC - maybe over a named pipe or listening to a particular network socket or hosted by an object broker, and pass the necessary reference to the higher-order function. Or, about the nicest you can manage in the shell (though decidedly not a clean solution IMO) - start a process in the background which maintains the state you're working with, and have a second executable which communicates with the background process, passing on commands and bringing back results.
The problem is, none of those me
Greenspun's Tenth Rule (Score:5, Funny)
it allows you to get strings back from commands and use them as the text of the script as if you had typed it directly. I think this was a new idea that I, at least, had not seen in scripting languages, except perhaps LISP,
Greenspun's Tenth Rule: "Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp"
Re: (Score:2)
I always wondered, why one would implement yet another language, when one can simply use dynamic libraries or include the JIT compiler functionality of one's preferred language in it. (And then calling that module with only that as parameters, which it is allowed to have access to.)
If one needs a real sandbox, one could still run the compiled module in it, instead of creating yet another sandbox implementation.
Re:Greenspun's Tenth Rule (Score:5, Informative)
These days it is quite easy to get embedded perl or lisp etc.
Re: (Score:2)
Some of us had to file our teeth rectangular in order to bite IBM 360 punch cards!
Single-language platforms (Score:2)
I always wondered, why one would implement yet another language, when one can simply use dynamic libraries or include the JIT compiler functionality of one's preferred language in it.
Back then, it's because UNIX was young, and it didn't yet have a standard interpreted language, as Sir Groane pointed out.
Nowadays, it's because you have to deploy your app on a half-dozen platforms, each with a different preferred language. For instance, XNA has C#, J2ME MIDP has Java, iPhone has Objective-C, Internet Channel has ActionScript, etc. The easiest way is to write your business logic in one language, and then write either interpreters in the deployment languages or compilers from your langua
Re:Greenspun's Tenth Rule (Score:4, Insightful)
The parent comment was modded funny, but I think Greenspun's Tenth is still relevant today. And, applied to Unix, it's definitely true. Imagine what Unix would be like if there only were C. But there isn't only C, there is also the shell and various scripting languages. The shell's most important feature is that it's interactive, like Lisp's read-eval-print loop. Todays popular scripting languages on Unix (say, Perl and Python) implement many of the other features of Lisp, allowing programs to be expressed a lot more succinctly and conveniently than in C. But all these are part of the same universe: the shell works mostly by running other programs, and the scripting languages do some of their tasks by going through the shell or C libraries. So, with everything together, you end up with something vaguely like what Lisp offers in a single package.
Of course, the world hasn't stood still, and the Unix universe now offers many features that aren't really present, or at least not standardized, in the Lisp universe.
And, in the meantime, Java has come along, re-inventing and re-implementing tons of features from Lisp and Unix.
Re: (Score:2, Funny)
cat
overlordofmu:x:1000:1000::/home/overlordofmu/:/bin/lisp
Example ---
mu login:
Password:
CMU Common Lisp 19e (19E), running on mu
With core:
Dumped on: Thu, 2008-05-01 11:56:07-05:00 on usrtc3142
See for support information.
Loaded subsystems:
Python 1.1, target Intel x86
CLOS based on Gerd's PCL 2004/04/14 03:32:47
*
Aren't I amusing!?!?
Re: (Score:2)
Re:Greenspun's Tenth Rule (Score:5, Funny)
Greenspun's Tenth Rule: "Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp"
As a corollary, we can see that any C or Fortran program that does not contain an ad hoc, informally-specified, bug-ridden slow implementation of half of Common Lisp is insufficiently complicated.
The first rule of Sh... (Score:2)
...is you can't talk about Sh.
Seriously.
Sh!
Sh? (Score:5, Funny)
$ Sh
sh: Sh: command not found
Compiler research (Score:3, Interesting)
I wonder if the Bell Labs researchers got eureka moments when their applied research in compilers worked, the way CERN physicists do when they detect a theoretical particle.
This inspired me to write a tiny *NIX shell (Score:2, Interesting)
I saw this article on OSnews this morning, and it inspired me to write a tiny open-source (public domain) *NIX shell, which can be seen at http://www.samiam.org/software/yash.html. I know the busybox [busybox.net] guys are looking to rewrite their *NIX shell to be more modular; this code would be a good starting point.
- Sam
Now there's a man with hair on his chest (Score:5, Funny)
Re:Now there's a man with hair on his chest (Score:5, Insightful)
And this speaks to why IMHO it was unfair (besides being stupid) to change the rules on software patents in 1986. Prior to that time, the huge amount of seminal, fundamental, wonderful work (by geniuses and people much smarter than me) in software and systems could not be patented, so it was either secret (for a while) or open. All those giants back then had no opportunity to set up a licensing toll bridge. And now, an infinite regression of trivialities are patented.
Imagine what progress in computing would have been if Alan Kay had been able to patent windowing GUIs, or if object-oriented programming had been patented, or paged virtual memory, timesharing, CDMA, TCP, IP, programming macros, relocatable code linkers, electronic mail, image morphing, most computer graphics and imaging techniques, ... the list goes on.
Some of the core ideas incorporated by Berners-Lee in his WWW creation could have been patented either by him, or by NeXT Computer before he had a chance. And then where would we be?
Hell, I personally could have gotten patents on client-server image processing nets, steganography, SAN, image paging and pre-fetch, pan-and-zoom image and map display, a whole raft of specialized raster-op (bitblit) functions, physical mapping of image files onto disk sectors, and street-address interpolation for geolocation. And that's just a sample of the bigger stuff I was involved in from 1983-1985. Oh yeah - a collaborative sketchpad over ethernet, in 1982!
At the time (early and mid 1980's) NONE of this was patentable. And now people are getting held up for $millions for stuff we didn't even bother to document or publish, because it was so trivial. And (just for perspective) I was just a regular schmoe - not one of the lights of programming.
rant, rant, rant... I totally agree with what you said :) I was not and am not worthy either. And certainly neither are the market- and legal-droid twits at Amazon and Microsoft and elsewhere who browbeat the software writers into signing off on the post-placental detritus that modern software patents are and will always be.
Cambridge Phoenix system (Score:3, Interesting)
Command substitution a "new idea?" (Score:3, Informative)
"Command substitution was something else I added because that gives you very general mechanism to do string processing; it allows you to get strings back from commands and use them as the text of the script as if you had typed it directly. I think this was a new idea that I, at least, had not seen in scripting languages, except perhaps LISP,' he says."
Surely this feature was present in Calvin Mooers' TRAC [wikipedia.org], circa 1964 or thereabouts. I've forgotten the distinction between expanded macros by means of single or double slashes, but I believe one or the other of them substituted the macro expansion back in the stream for further processing. My recollection is that it was fundamental to the way TRAC was used in practice. My recollection is also that TRAC was moderately well-known in the community at the time, so the idea was "in the air."
I believe it also existed in a host of "macro" capabilities in assembly languages... familiar to me in MIDAS, an assembly language for the PDP-1 circa 1965 or so. MIDAS survived into the PDP-6 and PDP-10, may have been developed earlier for the TX-0, and I think may have been patterned on advanced macro assemblers for the IBM 709.
My /bin/sh rules (Score:3, Informative)
My scripting rules on UNIX-like systems:
perl (Score:3, Interesting)
I've never fully understood why bash is used anymore when perl is around.
No, I'm not trolling. In most applications that take a significant amount of time to run, perl is orders of magnitude faster than the equivalent, awkward bash script.
The syntax of perl is sufficiently close to bash that anyone fluent in bash ought to have little trouble moving to perl.
So in total seriousness, what is the point of using bash for scripting?
Re: (Score:2, Insightful)
shell is still around 'cos it's still the right tool for some jobs...
Re: (Score:2)
1) Better error handling and checking. (Yes, you can check for exit codes with bash, but some stuff doesn't give useful exit codes for enough scenarios, so you have to do more tests to figure out what really happened and what stage things got to.)
2) Logging and auditing[1]
3) More complicated job flows and handling, and process control.
[1] Yes on some distros you can do some stuff with logger (you do have to figure out whethe
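A hedged sketch of the exit-status juggling point 1 describes, plus the logger(1) auditing mentioned in the footnote. All paths, tags, and messages here are invented for illustration:

```shell
#!/bin/sh
# Sketch: tar's exit status alone rarely says which stage failed,
# so stderr has to be captured and inspected separately.
tar -cf /tmp/backup.tar /etc/hosts 2>/tmp/tar.err
status=$?
if [ "$status" -ne 0 ]; then
    echo "tar failed with status $status: $(cat /tmp/tar.err)" >&2
    exit "$status"
fi
# [1] auditing via logger(1), where the distro provides it
command -v logger >/dev/null 2>&1 && logger -t backup "archive created"
echo "backup ok"
```

The extra `2>/tmp/tar.err` capture and the follow-up inspection are exactly the "more tests" overhead being complained about; a perl script would get the same information from one structured call.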
Re: (Score:3, Informative)
If all you're doing is moving files around or creating tarballs etc., then all those backticks in perl can become a PITA.
It's easy to wrap shell into perl when one feels the backticks are getting in the way. To be pedantic:
<perl commands go here>
system <<EOC; # the bash script follows
echo "embedded bash script begins"
ls
tar -xf fooberries.tar
mv tweedlee.txt tweedeldum.txt
echo "bash script done, now resuming perl interpreter"
EOC
<further perl commands here.>
Re: (Score:2)
#!/usr/bin/perl
use strict;
use warnings;
print "#!/usr/bin/perl\nsystem<<EOC;\n";
print foreach(<>);
print "EOC\n";
Man, I rock.
The Unix Shell and Scripting Languages (Score:4, Interesting)
Backticks? Why on earth would you use backticks to move files around? That's what File::Copy is for. And Archive::Tar handles tarballs.
Write Perl code, not shell scripts wrapped in Perl code.
All of this raises an issue that interests me, with regard to the shell and scripting languages...
The shell is supposed to be a convenient interface for accessing the functionality your system has to offer - but because of the way that functionality is offered it's hard to take advantage of it. The shell hasn't got much in the way of support for datatypes, namespaces, and so on. This makes it a lot easier (and, often, more efficient) to program in a scripting language like Perl or Python, and implement all kinds of useful functionality as libraries for that language, instead of as shell programs.
So scripting languages have the advantage of providing a much more structured and full-featured programming environment - a better foundation on which to build more complicated programs and more sophisticated tools. But the whole thing is one degree separated from the normal interaction with the shell - it's not trivial to expose all that functionality implemented for the scripting language to code outside the scripting language... The scripting language becomes a rich environment all its own, but that functionality isn't part of the shell environment, because the shell environment doesn't support the organizational concepts that make that code manageable within the framework of the scripting language.
I feel like this situation is a problem - I believe in what some people call "The Unix Way" - chaining together small tools to do bigger jobs, but the shell doesn't have the organizational constructs to make this work for complex problems - and as a result people are doing this great work on adding functionality to the system, but it's getting packaged up as scripting language modules since the shell can't handle it. It's something I'd really like to correct.
Re: (Score:2)
The original point of perl is to address your concerns. You want to be close to the machine (in the unix sense, not the C sense), but you want enough high-level functionality. And you want to chain stuff.
Perl is the ultimate glue language. It's not as lovely as python or ruby, but it's much better adapted to gluing and staying as close to shell scripting as possible. At the same time it has nice libraries that mean it can do so much more if you need to go there.
Re: (Score:2)
The original point of perl is to address your concerns.
But it's not a shell, and frankly I don't want it to be my shell.
My point is that I see the fact that we needed a strong string processing language to help with shell pipelining as a flaw in the way pipelines are handled. I also see the fact that so much good functionality is exposed as Perl or Python modules, but not as shell utilities, as indicative of a severe limitation in what the shell is capable of as a programming language (overcoming this limitation goes beyond just adding a stronger set of data t
Re: (Score:2)
Re: (Score:2, Interesting)
Write a perl shell then, and see how it's received?
Re: (Score:2)
http://www.focusresearch.com/gregor/sw/psh/ [focusresearch.com]
Re: (Score:2)
Re:perl (Score:4, Interesting)
Are you saying people should use Perl as an interactive shell? Or are you saying people should never use bash non-interactively?
The entire concept behind 'shell scripting' is to make it easy to tie together the same commands you type into the interactive shell. When I get used to doing 'rm' and 'cp' I can write an easy shell script which does the two together.
Of course, once you get to large shell scripts, then it becomes much more sane to use a real language rather than a shell script.
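A minimal sketch of that kind of wrapper: a script that just ties together the same cp and rm one would type interactively. The `install_file` helper and all file names are invented for illustration:

```shell
#!/bin/sh
set -eu

# install_file: copy a new version into place, then drop the stale backup.
# Just the two commands you would have typed at the prompt, in order.
install_file() {
    src=$1; dest=$2
    cp "$src" "$dest"     # the cp you would have typed
    rm -f "$src.bak"      # followed by the rm you would have typed
}

printf 'v2\n' > /tmp/demo.txt    # sample new version
touch /tmp/demo.txt.bak          # stale backup to clean up
install_file /tmp/demo.txt /tmp/demo.installed
```

The point of the shell here is exactly that there is no translation step: the script body is the interactive session, verbatim.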
Re: (Score:2)
I've run out of fingers and toes trying to count the number of inefficient perl scripts I've replaced with much faster and simpler shell scripts. Bad code can be written in any language.
Re: (Score:2, Informative)
Bash is also a command line interpreter. This allows you to try out stuff before writing a script, which you can also do in the same environment: just use cat.
Just try this:
$ perl
ls
What ? No output ? :)
Re: (Score:2)
Nice.
Re: (Score:2)
I've never fully understood why perl is around. Not bashing (mind the pun), but it seems to bring nothing to the sh/awk/sed/... table, but fills all the closets with junk, and lays version incompatibility traps in the hallways. I swear that the infamous Gates memo (about trying to get some application) could be modified (using sed, of course) to be about perl, and everyone would nod their head in agreement. The idea is to make things simple, so you have a chance of making them work at all.
Re: (Score:2, Insightful)
I've never fully understood why bash is used anymore when perl is around
The right tool for the right job. For example, I've been using sh/bash for a bunch of SQA regression tests for a command-line caching DNS server I'm working on (my current open-source project). Here is one of the simpler tests so you can get an idea of the syntax:
for VALUE in 0 1 ; do
cat > dwood2rc << EOF
chroot_dir="$( pwd )"
ipv4_bind_addresses="127.0.0.1"
upstream_servers["."]="127.0.0.2"
recursive_acl="127.0.0.1/16
Re: (Score:3, Insightful)
Okay, now suppose you wanted to have an exception test for the killall or the askmara. Or suppose you wanted a time-out if they never returned. Finally, assume you wanted to log the result of the action. Maybe you want to use a command line variable to supply, say, a password and the number of retries.
Yes, you could do all that in a shell script; it is simply easier in perl.
Your assurance that your perl scripts will run is that the first line of the perl program specifies which perl interpreter
Re: (Score:3, Insightful)
Your forkbomb script is rather a pointless action as I'm sure you are aware.
If you are parsing text or doing any sort of complicated extraction, you have to repeatedly use grep, awk, and sed in bash to accomplish the job. Repeatedly launching these in a program can produce something that is easily 100-1000 times slower than the equivalent perl would be.
Thus the window of usefulness for a scripting language is extended three orders of magnitude.
for example, let's say the ease of use of scripting means that unt
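For concreteness, a hedged sketch of the multi-process pattern described above: every stage of the pipeline is a separate grep/awk/sort process, and inside a loop those fork/exec costs are exactly what perl's built-in regexes avoid. File names and data are invented for the example:

```shell
#!/bin/sh
# Extract the unique login shells from a passwd-style file.
# grep, awk, and sort are three separate processes; running this once is
# cheap, but running it per-file in a large loop pays fork/exec each time.
printf 'root:x:0:0:root:/root:/bin/sh\n# comment\nbin:x:1:1:bin:/bin:/sbin/nologin\n' > /tmp/pw.txt
grep -v '^#' /tmp/pw.txt | awk -F: '{print $7}' | sort -u > /tmp/shells.txt
cat /tmp/shells.txt
```

In perl the same filter-split-dedupe would happen in one process, which is where the claimed orders-of-magnitude gap comes from when iteration counts get large.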
Re: (Score:2)
Yes, well, I wonder why perl is used given that Ruby seems to be always preferable for any use...
[Go ahead, mod me flamebait, and I will become more powerful than you can possibly imagine! Bwahahaha!]