
Web Development with Apache and Perl

davorg writes "In the dim and distant past when I first started doing web development, there was a book that everybody had a copy of. It was called 'How to Set Up and Maintain a Web Site' and it was by Lincoln Stein. The reason everyone owned (and, more importantly, read) it was that it contained a complete high-level snapshot of everything you needed to know in order to run a web site at that time. Unfortunately, after a second edition in 1997 the book was never updated. I suppose the subject area has grown so much that everyone assumed a complete overview would be too high-level to be useful. They were probably right." davorg's review continues below.
Web Development with Apache and Perl
author: Theo Petersen
pages: 400
publisher: Manning
rating: 8
reviewer: davorg
ISBN: 1-930110-06-5
summary: Good Overview of the State of the Art in Open Source Web Development

I mention Stein's book because that's what this new book reminded me of most (that, by the way, is a huge compliment). Petersen realises that an overview of the whole web development area would be difficult to write (and, ultimately, unhelpful) so he restricts himself to a subset of the available technologies - Perl and Apache - and gives a thorough review of the state of the art of web development in these areas.

But before he gets into the details of Apache and Perl, in chapter 1 Petersen takes a look at the wider world of Open Source Software and in the process presents one of the best arguments I've seen in print for why a company should choose Open Source Software. In chapters 2 and 3 he takes the same approach with web servers and scripting languages, giving compelling reasons for choosing Apache and Perl.

Having chosen his architecture, Petersen moves on in part 2 to look at some common tools for web development. Chapter 4 looks at databases: the two main open source databases (MySQL and PostgreSQL) are compared, and MySQL is chosen as the basis for the rest of the examples. Chapter 5 discusses the shortcomings of the standard CGI architecture and introduces mod_perl as an alternative. This is a good introduction to a technology that some people find hard to get to grips with. Petersen takes us through the use of Apache::Registry before moving on to the complexity and power of mod_perl handlers.
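
For readers who haven't met mod_perl, here is a minimal sketch of the handler style the chapter builds up to. It's my own illustration rather than an example from the book; it assumes the mod_perl 1.x API that was current when the book was written, and the module and URL names are invented:

    # A tiny mod_perl 1.x content handler (hypothetical module name)
    package My::Hello;
    use strict;
    use Apache::Constants qw(OK);

    sub handler {
        my $r = shift;                    # the Apache request object
        $r->content_type('text/plain');
        $r->send_http_header;
        $r->print("Hello from mod_perl\n");
        return OK;
    }
    1;

    # httpd.conf glue mapping a URL to the handler:
    #
    #   PerlModule My::Hello
    #   <Location /hello>
    #       SetHandler perl-script
    #       PerlHandler My::Hello
    #   </Location>

Apache::Registry, by contrast, runs ordinary CGI scripts inside the persistent interpreter, so you get most of the speed benefit without rewriting them as handlers.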

Chapter 6 looks at the importance of security in web applications and discusses in some depth the problems of user authentication and the use of SSL for secure data transmission. Chapter 7 looks at ways to separate content from presentation. First we look briefly at server-side includes, but the majority of the chapter is taken up with a review of the various templating systems that are available for Perl. The chapter finishes with a detailed look at two of the most popular templating solutions - HTML::Mason and Template Toolkit.
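
To give a flavour of what the templating chapter covers, here is a minimal Template Toolkit sketch. Again this is my own illustration, not an example from the book, and the template and data are invented:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Template;

    # The presentation lives in the template, not in the code
    my $template = '
    <h1>[% title %]</h1>
    <ul>
    [% FOREACH chapter IN chapters -%]
      <li>[% chapter %]</li>
    [% END -%]
    </ul>
    ';

    my $tt = Template->new;
    $tt->process(\$template, {
        title    => 'Part 2: Web development tools',
        chapters => [ 'Databases', 'mod_perl', 'Security', 'Templating' ],
    }) or die $tt->error;

HTML::Mason takes a different approach, embedding Perl components directly in the pages, which makes the book's side-by-side comparison of the two particularly useful.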

Part 3 of the book looks at three different types of web site in great detail. In each case Petersen uses the example to take a brief survey of a number of existing tools. For example, chapter 8 looks at a community web site and contains information about a number of web-based forums and chat rooms; it also takes an extended look at Slashcode, the software that runs Slashdot. Chapter 9 takes a similar approach for intranet sites, and chapter 10 for online stores.

In part 4 we take a longer-term view of a web site. Chapter 11 looks at content management systems and chapter 12 looks at performance tuning. Both chapters are full of useful advice on how to make running a web server as painless as possible.

I think this is a very useful book to have on your bookshelf. Anyone who is developing web applications using Apache and Perl will find something useful in it. Obviously, for a single book to cover so much ground, there sometimes isn't quite as much technical detail as you might like, but there is a good bibliography that will show you where to go for more information. In my opinion the high-level approach makes the book particularly useful for a couple of groups of potential readers. Firstly, I think it makes a great introduction to the subject for someone coming to Apache and Perl for the first time. Secondly (and perhaps most importantly), I can see the book (in particular the first three chapters) being very useful reading material for a manager who is choosing between Open Source Software and some proprietary technology.


You can purchase Web Development with Apache and Perl from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Comments:
  • by MarsBar ( 6605 ) <geoff@geoff.dj> on Thursday August 08, 2002 @10:24AM (#4032569) Homepage
    My, this guy is making all the right decisions today...
  • by krog ( 25663 ) on Thursday August 08, 2002 @10:32AM (#4032622) Homepage
    Apache and Perl was the way to go in 1996, but times have changed. Systems like PHP and (here comes the -1 Flamebait mod) ASP are faster and more efficient than Perl CGI. Serious webmasters do it in Java or C anyhow, for serious speed.
  • by Aliks ( 530618 ) on Thursday August 08, 2002 @10:34AM (#4032632)
    Don't want to be a troll, but no matter how good the book is, surely for material like this the web itself is the best medium.

    A paper book is certainly more portable, and for most people easier on the eye, especially when you read for a long time, but . . .

    Topical material is quickly out of date

    You can't easily search for the topic or phrase you want

    You can't easily look up a reference for a term or concept you don't understand

    If a certain paragraph doesn't make sense you can't look for alternative statements of the same concept

    Once you've read it there isn't any easy way to look up a particular section when you next need it (the book's at home, borrowed by a colleague, etc.)

    Books cost!
  • by Daniel Dvorkin ( 106857 ) on Thursday August 08, 2002 @10:54AM (#4032775) Homepage Journal
    Old-school Perl CGI is relatively slow, yes, though an awful lot of that has to do with the quality of the back-end code. But mod_perl is unbelievably fast -- and I say that as someone who makes his living developing and maintaining a database with a Web interface written in PHP, so it's not like I'm prejudiced in Perl's favor. "Serious webmasters" choose whichever tool gets the job done best, and if they're also good programmers, they write good, clean code that executes fast enough that the client doesn't have to care what's driving the site.

    That being said (anti-flamebait!) I'm amazed you mentioned ASP, not because of its flamebait value, but because, um, it sucks. I swear to God, there must be "if(client != "IEWin"){slow_down(); crash_unpredictably();}" in the source somewhere. Even these days, old-school Perl CGI is often the right tool for the job. Unless you're developing 100% for IEWin, IIS, and MS-SQL, ASP never is.
  • The arguments (Score:3, Insightful)

    by InsaneCreator ( 209742 ) on Thursday August 08, 2002 @10:59AM (#4032819)
    Petersen takes a look at the wider world of Open Source Software and in the process presents one of the best arguments I've seen in print for why a company should choose Open Source Software

    Could you _please_ share some of them with us? Because I'm really sick & tired of the same old "Anyone can change the source and fix the bugs" argument. Sooner or later people will have to realise that the companies we want to convince that switching to Linux is a good idea couldn't care less about the ability to change the source. Some of them are even turned off by this argument, since they believe it contains an implicit "You will have to fix the buggy code yourself", just like when you're buying a house and the ad says it's perfect for a creative, DIY-type person, and you know you'll have to replace the roof, the walls and the floors within 2 weeks. :)
  • by bluGill ( 862 ) on Thursday August 08, 2002 @11:13AM (#4032917)

    Depends on what you want to do. Online is better when you are at a computer (with net access) and know what you want to know about. Paper is better when you just want to learn something interesting. I have not yet found a good way to skim online documentation and, when something catches my eye, drill down deep. Scrolling doesn't seem to work that way for some reason. An index is worse: it doesn't tell you if something is useful, so you have to look, which means waiting for pages to load. After a few misses you give up.

    Both types have their place. I wouldn't want to be without my printer, even though I haven't turned it on in several weeks. There will never be a paperless office, because when the problem gets really tough I print out everything and then (as Fred Brooks said) take to the floor to figure things out. Don't try to get rid of paper; use paper and the computer to complement each other.

  • Why Perl (Score:5, Insightful)

    by LinuxParanoid ( 64467 ) on Thursday August 08, 2002 @11:47AM (#4033177) Homepage Journal
    When I started web programming 2 years ago and faced the choice between working with Java, Perl and up-n-coming PHP, I looked around and it seemed to me that the people using Perl were doing the most innovative, creative stuff on the web. (Slashdot's "distributed moderation" scheme, which I regard as a quantum improvement over USENET moderation for providing large-traffic yet readable forums, was just one example.)

    I wanted to do innovative, creative stuff, so I started writing Perl.

    No regrets. I don't think that aspect of Perl has been particularly usurped. Nor do I think there's another language which provides a platform for faster time-to-market and feature iteration.

    As mentioned elsewhere in this thread, Java and PHP have their own distinct advantages too.

    --LP
  • by PerlPunk ( 548551 ) on Thursday August 08, 2002 @11:51AM (#4033212) Homepage Journal

    Perl & Apache is an excellent combination for bringing sanity to legacy systems.

    As far as dealing with legacy systems goes, nothing is better than Perl. For example, in a project I'm working on, my company has a vast array of legacy tools which require using telnet to get the work done, and the web interface I'm building (CGI & Net::Telnet) gets the work done beautifully. (Try doing this in PHP or Java.) The admin people are happy, and the development time so far has been almost negligible. Perl is the supreme glue language.
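
    The pattern is roughly this; an illustrative sketch, with the host, login and command all invented:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use CGI;
        use Net::Telnet;

        my $q = CGI->new;
        print $q->header('text/plain');

        # Invented host and credentials, for illustration only
        my $telnet = Net::Telnet->new(Host => 'legacy.example.com', Timeout => 10);
        $telnet->login('opsuser', 'secret');

        # Run one of the legacy tools and relay its output to the browser
        print $telnet->cmd('status_report');
        $telnet->close;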

  • by @madeus ( 24818 ) <slashdot_24818@mac.com> on Thursday August 08, 2002 @11:58AM (#4033261)
    Systems like PHP and (here comes the -1 Flamebait mod) ASP are faster and more efficient than Perl CGI.

    Perl with mod_perl is *very* fast and no slower than PHP with mod_php. Perl on its own certainly isn't very nippy under high load, but then PHP without mod_php isn't great either, though many PHP fans don't take the time to understand the difference before deciding which is better (which is understandable when you take into account that PHP is now used more than Perl by less experienced users, due to its greater ease of use).

    ASP pages (at least served by IIS) are significantly less able to cope with high load than Perl with mod_perl. IIS really strains when serving dynamic content (though it's superb at serving static content, which isn't so useful :). Of course, it's theoretically possible that an ASP solution under Unix might give comparable, or even better, performance, but given its tiny market share and the comparatively small amount of development time (measured in developer-hours) behind it, that's not realistically likely; I don't think anyone would even claim it is. It's more a way of giving customers ASP without the system administration headache of running NT servers.

    There seems to be a big PHP bandwagon among the hip and trendy crowd these days. That's not entirely a bad thing, because (like Perl) PHP is a GoodThing(TM), but it is a problem when people are just repeating what everyone else is doing (which is always a bit worrying).

    The most compelling argument is that PHP is of course easier, and so on balance PHP markup is much less prone to errors than Perl scripting.

    Serious webmasters do it in Java or C anyhow, for serious speed.

    Sorry, but I really dispute this. It's not that serious webmasters don't do it in Java or C; it's that it's not always very quick (I'll justify that in a minute), and there's an implication that 'serious webmasters' work only, or even mostly, in Java or C, which I disagree with.

    If you are writing a large-scale project with many developers, some find it easier to keep control of the project if it's written in Java or C (rather than Perl) because of Perl's flexible nature (which some see as a curse on a large project). I can certainly see the merits of this, and I am a big fan of Java.

    The drawback with C is that C code takes much, much longer to write, takes longer to modify, and is more prone to errors due to its complexity. Very little CGI is written in C for this reason. I think the most appropriate place for C CGIs is particularly CPU- or memory-intensive functions whose functionality will remain relatively static over time. That way you can rapidly build flexible CGI interfaces around a very fast C program, meaning that your servers are not tied up with that function but you can still easily modify and adapt the front end of your site. Perl modules written in C are typically created for exactly this sort of context (there's a sketch of one way to do it at the end of this comment).

    Java is excellent in that it imposes a strictness similar to C's while still letting you produce code comparatively rapidly, and that code is less error-prone than either C or Perl (IMO), so it's great for development among many developers. The drawbacks are that it still takes longer to code a solution in Java than in Perl (though it's quicker than C), and Java doesn't really become an option for high-load situations unless you have large Sun hardware with plenty of RAM (several gigabytes) to throw at it, at which point it really shines.

    So 'speed' is a relative term. Yes, code in C will certainly execute faster than in Perl, but if it takes you a month instead of a week to write, you may have just wasted three weeks of project time (and this can doom a project or kill a startup). It's much, much quicker to write and modify code in Perl (or PHP) than in Java or C (though modifying _other_people's_ Perl can be a different story; I've never had any problem doing that, but I can see how other people might).

    I once worked out timescales for a project that came to roughly 2 months of Perl coding, or 4 months of Java, or 6 months of C, noted the problems and advantages of each, and let my manager pick one. We went with Perl (with the option to have parts in Java or C as we came across parts of the project that would benefit), even though Perl alone might struggle with the load of the task, because it was felt that Perl would be easier to maintain, given that more people in the company were comfortable with it; that Perl meant rapid development, letting us bend with ever-changing requirements; and that Perl monkeys are easier and cheaper to find (should we all leave and minor adjustments to the software be required at a later stage).

    As a final note, I'd say that PHP certainly is quite appropriate in many situations and I don't dislike it at all. But those with a strong Unix bent, unless they already know Perl well, might be better off doing at least a couple of projects in Perl: Perl becomes a key 'life skill', because it can be used to write extremely useful scripts that do very complex things very quickly, so learning it is very useful.
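
    For what it's worth, here's one way (of several; XS is the traditional route) to push a hot spot down into C while keeping the glue in Perl. An illustrative sketch using the Inline module, with the function invented:

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Inline compiles the C below on first run and caches the object code
        use Inline C => '
        int checksum(char *buf) {
            int total = 0;
            while (*buf) total += *buf++;
            return total;
        }
        ';

        # The glue stays plain Perl; the hot loop runs as compiled C
        print checksum("some large buffer"), "\n";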

  • Re:The arguments (Score:3, Insightful)

    by pmz ( 462998 ) on Thursday August 08, 2002 @12:33PM (#4033526) Homepage
    Because I'm really sick & tired of the same old "Anyone can change the source and fix the bugs" argument.

    That's fine, but if you haven't seen other arguments, you haven't been looking very hard.

    1) The code is auditable. Individuals might find little value in this (although some really do), but corporations and governments can fund a few people to look for trojans, back doors, or spyware, or even just to check overall quality and functionality. This gives people and organizations an extra tool in their risk-mitigation toolbox. A real-world example is Sun Microsystems' adoption of GNOME for Solaris. Could Sun be as sure that GNOME met their needs had the source been closed to them? Probably not. Has Sun found ways to "pay for" GNOME? Absolutely. They don't hide the fact that GNOME is Open Source, and they have contributed back to the project. There is a pretty good chance that people using GNOME on Linux or FreeBSD have benefitted somewhat from Sun's work.

    2) The developers' motives are different. Why is it that OpenOffice.org uses an easy-to-work-with, documented native file format? Why have binary configuration files not appeared in the /etc/ directories of Linux distributions? Why aren't there Open Source EULA clauses preventing benchmarking, reverse engineering, or criticism of the software? Why are Open Source developers not very concerned about collecting marketing data about you and your habits?

    3) Open Source software lasts longer and is more stable (in a configuration-management mindset). If I install a Linux distribution today, there will always be an upgrade and maintenance path independent of any corporation. Linux has long since reached a threshold, where it can never go away. Many Open Source applications are like this, too. Emacs will still be here in ten years, as will GCC and XFree86, for example. An Open Source system almost never ceases to be useful. An executable is corrupted? Just recompile it. Discover a previously unknown bug? Contact the authors directly and talk about it. You need to collect data from old files? Unpack-n-grep.

    4) Open Source software lasts longer and is more stable (from a robustness point of view). Many Open Source applications are so mature that people take their stability for granted. How often does Emacs fail due to flaws in its code? Under normal use, almost never (the last time I saw Emacs crash was years ago). The same is true for Apache, GCC, XFree86, and the Linux kernel. While there is a huge pool of Open Source projects that are not this mature, it is a normal process for the occasional super-star to emerge and become best-of-breed in its category.

    Now, how much of all this is true of Microsoft and their products?
  • by cgenman ( 325138 ) on Thursday August 08, 2002 @01:43PM (#4034118) Homepage
    Lots of reasons.
    For one, subways, buses, trains, airplanes, sidewalks, and doctors' offices.

    Two, it's always better to have a paperback copy of documentation when fixing a computer. Even if you have another computer around (and you should), paperbacks still leave smaller holes in the wall.

    Three, when you are done with it you can loan it to a colleague. I've worked with enough clueless people to know that a little education can go a long way in smoothing a long term working relationship.

    Likewise, many of the supposed advantages purported of HTML books are not so, namely timeliness and loanability. For some reason it seems to cost as much to keep a site up to date after two years (or so) as it did to create the thing in the first place. Why would the publisher plunk down the coin to keep the site current? If they are going to receive payment for updated versions of the HTML book, how is that any different from buying a dead-tree edition? And how, exactly, can you loan out an HTML book with the publisher's blessing? I understand that in HTML you can just page through and download the bloody thing, but... that seems like a lot more trouble than figuring out who you loaned your book to.

    And furthermore, if you are at a computer you can look up concepts you don't understand without any trouble. The portability of white pulp doesn't preclude reading it at a computer. And when Mozilla / IE / Netscape / AOL / ICab / Opera / Lynx / Konqueror crashes your system, your paperback will still be running fine and you will still know what you were looking up. (Opera, BTW, automatically saves the windows you have open, allowing you to painlessly resume after crashes).

    I don't think, in the current state, that searching and easy dissemination outweigh portability, ease-of-reading, and loanability in this circumstance.

    Dead-tree technology has been around for three thousand years. Three thousand years of R and D is a lot of engineering. Let's not be so quick to abandon that power, eh?

    -Chris
