Boolean Logic: George Boole's The Laws of Thought

Ian writes "The Globe and Mail has a piece about the man behind Boolean Logic - George Boole - The Isaac Newton of logic. 'It was 150 years ago that George Boole published his classic The Laws of Thought, in which he outlined concepts that form the underpinnings of the modern high-speed computer.'"
  • by A nonymous Coward ( 7548 ) * on Saturday March 27, 2004 @03:39PM (#8690572)
    Also that year, Grace Hopper, an admiral in the U.S. Navy, recorded the first computer "bug" -- a moth stuck between the relays of a pre-digital computer.

    The computer was digital, it just used relays instead of integrated circuits. It wasn't stuck between relays, it was stuck in a relay.

    And while I'm at it, a nitpick. She wasn't an admiral until much later.
    • by jeepee ( 607566 ) on Saturday March 27, 2004 @03:48PM (#8690620) Homepage Journal

      Grace Hopper found a bug? Was it a Grasshopper?

      1. The computer was digital, it just used relays instead of integrated circuits. It wasn't stuck between relays, it was stuck in a relay.

      I stumbled on a copy of the image of the log book, showing the moth taped to it. Quite interesting. Just to be annoying, and somewhat in the spirit of this thread, here's a tip:

      1. 01101000 00111001 00110110 00110101 00110110 00110110 01101011 00101110 01101010 01110000 01100111

      If you can't figure that out, the image probably won't make much sense to you either.
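      For anyone who would rather not decode the octets by hand, here is a minimal Python sketch; the binary string is copied from the comment above, and running it prints the hidden filename:

```python
# Decode a space-separated string of 8-bit binary values into ASCII text.
bits = ("01101000 00111001 00110110 00110101 00110110 00110110 "
        "01101011 00101110 01101010 01110000 01100111")

decoded = bytes(int(octet, 2) for octet in bits.split()).decode("ascii")
print(decoded)  # prints the filename hinted at in the comment above
```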

    • Call that a nitpick? (Score:3, Informative)

      by panurge ( 573432 )
      I believe she eventually became a rear admiral, not an admiral. Also, she was a reservist when she found the bug.

      She was also already a PhD when she was called up for active service in WW2, so the grandparent post is really highly inaccurate.

      Grace Hopper - the third programmer of the Harvard Mark I, and a fitting successor to Ada Lovelace.

    • Prof. MacHale also notes that subsequent to The Laws of Thought, Boole undertook to rewrite the Bible in his mathematical logic.

      I'm very curious about this ... how exactly would you go about representing text with mathematical logic? He must have needed to invent some ad-hoc method to do this, right?

      Anyone know anything about this?

      • Boole undertook to rewrite the Bible in his mathematical logic.

        And 101101 sayeth unto 111000, "Don't partake of thy apple, for it is full of 110101001010". But sayeth 111000 back to 101101, "11111111 That! I am hungry!". Behold he biteth into thy apple, and suddenly God made him naked, and his "1" showed, and 101101 laughed because she thoughteth it was a decimal. But her "00" also showed, and 111000 laughed because he thought they were two decimals.
    • It gets worse (Score:3, Informative)

      by fm6 ( 162816 )
      Plus they credit her with coining the term "bug". The squashed moth was a joke. "Bug" was slang for electronic glitches long before then.
  • by amigoro ( 761348 ) on Saturday March 27, 2004 @03:40PM (#8690579) Homepage Journal
    A comment has to be insightful AND funny OR it is NOT worth reading.


  • by James A. M. Joyce ( 764379 ) on Saturday March 27, 2004 @03:44PM (#8690596) Journal
    Let's not forget Lovelace, Ritchie, Knuth, von Neumann, Turing...
  • by amigoro ( 761348 ) on Saturday March 27, 2004 @03:48PM (#8690616) Homepage Journal
    George Boole was one of the teachers of Ada Lovelace [westga.edu] the first computer programmer. Some people [techtv.com] don't agree that Ada was the first computer programmer. Some people also don't think that Charles Babbage's Analytical Engine was the world's first computer.



    • Some people also don't think that Charles Babbage's Analytical Engine was the world's first computer.


      Some people also don't think that Earth is not flat.

    • by Waffle Iron ( 339739 ) on Saturday March 27, 2004 @04:24PM (#8690888)
      Some people also don't think that Charles Babbage's Analytical Engine was the world's first computer.

      It was, however, the world's first vaporware.

    • Some people also don't think that Charles Babbage's Analytical Engine was the world's first computer.

      My general rule is, "If all it does is react passively to the Earth's magnetic field and displace its weight in water, it's not a computer."

      Since the Analytical Engine was never completed, I feel it falls in that category.

      However, you don't need a working computer to be a programmer. I've programmed for computers that hadn't been built yet as well as some that never got off the drawing board.
    • Some people also don't think that Charles Babbage's Analytical Engine was the world's first computer.

      His Analytical Engine was the design for the first computer, but it was never built due to lack of funding. Herman Hollerith used the designs of Babbage's Analytical Engine and Difference Engine to create his Tabulating Machine (an early computing device) for the company that later became International Business Machines (IBM). So, Charles Babbage is known as the father of the computer for his desig

    • That's what you get for watching that crap they call Tech TV.

      Which should be called:
      "All the cool things pseudo geeks want to hear so they can think they are hip on technology, and other mindless drivel."
  • by Anonymous Coward on Saturday March 27, 2004 @03:48PM (#8690617)
    ...but it apparently made even less sense in Boolean if you can believe that's even possible. And if he wasn't a Unitarian this whole mess of ours would have been in base 3.
  • Laws of Thought (Score:5, Informative)

    by smchris ( 464899 ) on Saturday March 27, 2004 @03:50PM (#8690630)

    Not in Project Gutenberg yet :(

    There's nothing like reading the original works.
  • Why... (Score:5, Funny)

    by Phosphor3k ( 542747 ) on Saturday March 27, 2004 @03:53PM (#8690662)
    Offtopic: Why did it take me 15 seconds to realize the word "Google" was nowhere in the story title? Anyone else have that problem?
    • Offtopic: Why did it take me 15 seconds to realize the word "Google" was nowhere in the story title? Anyone else have that problem?

      Not only that, but I got really excited that Google was finally going to have boolean search queries.

      Alas....
    • weird!
    • It is cmomon knweldoge taht hmuan biengs prceoss informiotan in shuc a way that mkaes yuor conuifsion undstanedrable.

      If you read that in under 10 seconds, you just proved my ponit.

      LK
  • John von Neumann (Score:5, Interesting)

    by Jagasian ( 129329 ) on Saturday March 27, 2004 @03:53PM (#8690665)
    I think Turing and von Neumann had far more to do with the underpinnings of modern computers than Boole.

    Boole's great achievement was his attempt to formalize logic algebraically at a time when logic was informal and far too meta for even mathematicians to consider formally. While this is great and all, it doesn't result in a general purpose computer.

    However, Turing machines and von Neumann machines are in every way general purpose computers.
    • The UTM is not binary. It reads marks on a tape, without any presumption of the format of those marks. In the same way, the Difference Engines were base 10 (that's base 9+1, for binarists, octalists and hexadecimalists) and some of the 40s work used base-10 designs.

      The importance of Boole's ideas, therefore, was that they provided a grand unifying framework for computer design.

      In fact Turing's ideas were more fertile for programming, and it's a pity that he lived in the UK after WW2 and was held back by the
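      To make the point about the UTM concrete, here is a minimal Turing-machine simulator in Python; the three-symbol machine at the bottom is made up for the example, and nothing about the tape alphabet is binary:

```python
# Minimal Turing machine simulator; the tape alphabet is arbitrary, not binary.
from collections import defaultdict

def run(transitions, tape, state="start", blank="_", max_steps=1000):
    """transitions maps (state, symbol) -> (new_state, write_symbol, move)."""
    cells = defaultdict(lambda: blank, enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        new_state, write, move = transitions[(state, cells[head])]
        cells[head] = write
        state = new_state
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in range(min(cells), max(cells) + 1))

# A toy three-symbol machine: rewrite every 'a' as 'b', halt on the first blank.
table = {
    ("start", "a"): ("start", "b", "R"),
    ("start", "b"): ("start", "b", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run(table, "abaa"))  # -> "bbbb_" (the trailing blank was touched by the head)
```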

      • von Neumann's computer architecture is still used today! Random access memory accessed by a CPU through a bus. That seems like a big deal to me!

        Also, just like you said, Turing machines don't require binary implementation, and therefore Boole's ideas played only a small part if any in Turing machines.

        The truth is far more sophisticated. Everything worth anything is the result of a continuum of research and researchers. However, my argument is that von Neumann's contributions are the first to resemble
        • The "Von Neumann architecture" was in fact part of the original Manchester and Cambridge machines and would have been partof Ace. You are right about the continuum of ideas, but really calling it a Von Neumann architecture is about as accurate as calling it an A E Neuman architecture.
    • Don't forget Shannon (Score:3, Informative)

      by Jim McCoy ( 3961 )
      If you are going to list "those who made all this possible" you cannot ignore Claude Shannon. Creating information theory was important, but his master's thesis, "A Symbolic Analysis of Relay and Switching Circuits" (1937), is equally remarkable: it proved that Boole's logic could be implemented in digital hardware.

      It was, by a long margin, the most important master's thesis in history.
  • Funny name (Score:5, Funny)

    by DRUNK_BEAR ( 645868 ) on Saturday March 27, 2004 @03:53PM (#8690667)
    Am I the only one who finds that the name Grace Hopper and the expression "computer bug" go well together? :oP
  • Book title (Score:3, Informative)

    by imnoteddy ( 568836 ) on Saturday March 27, 2004 @03:54PM (#8690673)
    To be pedantic, the title of Boole's book was "An Investigation of the Laws of Thought" [mathbook.com]
    • correction (Score:2, Insightful)

      by Anonymous Coward
      Parent post is completely wrong. The complete title is actually "An Investigation of the Laws of Thought, on which are founded the Mathematical Theories of Logic and Probabilities".
      • Re:correction (Score:4, Interesting)

        by imnoteddy ( 568836 ) on Saturday March 27, 2004 @04:51PM (#8691074)
        Parent post is completely wrong. The complete title is actually "An Investigation of the Laws of Thought, on which are founded the Mathematical Theories of Logic and Probabilities".

        Parent post is not completely wrong - I got the first part of the title right. :-) And I blame Dover [yahoo.com]

  • by classicvw ( 743849 ) on Saturday March 27, 2004 @03:56PM (#8690687)

    "Also that year, Grace Hopper, an admiral in the U.S. Navy, recorded the first computer "bug" -- a moth stuck between the relays of a pre-digital computer.)"

    Ahh, but relays are digital.... They are either on or off. That was binary the last I looked.

  • by oever ( 233119 ) on Saturday March 27, 2004 @04:01PM (#8690724) Homepage
    To be or not to be!

    While that's a pretty clumsy way of saying it, Shakespeare was ahead of Boole.

    I suggest we all add the following statement (or equivalent) to our code in honor of this great mind.

    typedef bool shakespear;

    • But since people are modding you as insightful:

      A simple pithy statement like "to be or not to be" is a long way from formalized digital logic. Boole's contribution wasn't realising the idea of having two finite states; philosophers had had that idea for a long time, and often over-applied it in an attempt to prove arguments. What Boole did was to create a formalized and complete system for logic. Boolean operators can be used in a binary system to construct any more complicated operator. The basic Boolean operations,
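      As a small illustration of that completeness claim, here is a Python sketch that builds XOR and implication out of nothing but AND, OR and NOT, and checks them against their truth tables:

```python
# Build richer connectives from AND, OR and NOT alone, then check them exhaustively.
AND = lambda a, b: a and b
OR = lambda a, b: a or b
NOT = lambda a: not a

def XOR(a, b):      # "exactly one of a, b"
    return AND(OR(a, b), NOT(AND(a, b)))

def IMPLIES(a, b):  # "if a then b"
    return OR(NOT(a), b)

for a in (False, True):
    for b in (False, True):
        assert XOR(a, b) == (a != b)
        assert IMPLIES(a, b) == ((not a) or b)
print("XOR and IMPLIES match their truth tables")
```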
  • Boole vs. Real World (Score:5, Interesting)

    by G4from128k ( 686170 ) on Saturday March 27, 2004 @04:02PM (#8690729)
    As wonderful as binary is, it falls utterly short of capturing the fuzzy analog nature of life and the real world. Our recent debate on whether Sedna (or Pluto) is a planet is but one example of how the real world fails to fit into simple binary categories. Even at the subatomic level, the wave-particle duality gives the lie to the fiction of discreteness.

    I'm not saying that binary is not great for doing all manner of wonderfully powerful proofs, logic, and computation. I'm only saying that it is a mere approximation to the real world and can thus fail when the real world does not dichotomize to fit into Boole's logic.

    Boolean Logic illustrates both the tremendous power and the weakness of mathematical systems. On the one hand, the power of proof guarantees that a man-made mathematical system with certain axioms will undeniably have certain properties. On the other hand, math gives no guarantee that the real world obeys those axioms.
    • Boole vs binary (Score:3, Informative)

      by Anonymous Coward
      Binary != boolean algebra. There are a number of axioms (the number 5 sticks in my brain for some reason), such that any algebra that satisfies those axioms can be called a boolean algebra.

      One good boolean algebra is the logical algebra of probabilities. Every datum is a real between 0 and 1. x or y would be x + y - xy; x and y would be xy; not x would be 1 - x. (All of this is off the top of my head BTW). It's a perfectly valid boolean algebra.

      To say that boolean algebra is about "true" or "false" i
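      A small Python sketch of the connectives described above, purely to illustrate the arithmetic (checking which of the axioms such a structure actually satisfies is left as an exercise):

```python
# Logical connectives over values in [0, 1], as described in the parent comment.
def p_or(x, y):   # x OR y
    return x + y - x * y

def p_and(x, y):  # x AND y
    return x * y

def p_not(x):     # NOT x
    return 1 - x

x, y = 0.5, 0.25
print(p_or(x, y), p_and(x, y), p_not(x))  # 0.625 0.125 0.5
# Restricted to the values 0 and 1, these reduce to the familiar OR, AND and NOT.
```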

    • As wonderful as binary is, it falls utterly in capturing the fuzzy analog nature of life and the real world. Our recent debate on whether Sedna (or Pluto) is a planet is but one example of how the real world fails to fit into simple binary categories.

      Boolean logic can, through a series of very simple "greater than/less than" questions, describe the mass, radius, orbit and every other reasonably measurable quantity. It can also be used to measure subjective opinions - a set of "is an object bigger than/sma
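      A sketch of that idea in Python: if the only primitive available is a yes/no "is it bigger than t?" question, repeated comparisons still pin a quantity down to any precision you like (the quantity and the interval here are made up for the demo):

```python
# Locate an unknown quantity using only boolean "greater than" answers (bisection).
def locate(is_greater_than, lo, hi, questions=20):
    """is_greater_than(t) -> bool; returns an interval containing the value."""
    for _ in range(questions):
        mid = (lo + hi) / 2
        if is_greater_than(mid):
            lo = mid
        else:
            hi = mid
    return lo, hi

true_value = 0.271828  # pretend this is the "real world" quantity being measured
print(locate(lambda t: true_value > t, 0.0, 1.0))  # an interval of width 2**-20
```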
    • As wonderful as binary is, it falls utterly in capturing the fuzzy analog nature of life and the real world.

      I wouldn't say "utterly"... using approximation and extrapolation to fill in the weak spots and get a "close enough" answer has been good enough to let the human race put a man on the moon, collect energy from doing naughty things with atoms, and create several series of fun-to-play WWII first-person shooters (I'm most enthusiastic about that last one, but YMMV).

      I've read up on multivalent l

  • by Anonymous Coward on Saturday March 27, 2004 @04:06PM (#8690759)
    Set Theory is a Boolean Algebra. Odd there was no (explicit) mention of this. It is important both to mathematics in general and to Computer Science.

    Just as an aside, a mathematical structure is a Boolean Algebra if, and only if, it contains two operations (generally denoted + and *) such that for all elements A, B and C in the structure ...

    A + B = B + A
    A * B = B * A
    (A + B) + C = A + (B + C)
    (A * B) * C = A * (B * C)
    A + (B * C) = (A + B) * (A + C)
    A * (B + C) = (A * B) + (A * C)

    and there exist two elements 0 and 1 in the structure such that ...

    A + 0 = A
    A * 1 = A

    and for each A an element exists that's the "negation" of A ...

    A + ~A = 1
    A * ~A = 0

    In logic, + is equivalent to OR, * is equivalent to AND, a tautology is equivalent to 1, and a contradiction is equivalent to 0. ~ is NOT.

    Similar comparisons can be made in Set Theory. In the same order as above: Union, Intersection, Universal Set, Empty Set, and Set Complement.

    So, if you prove one algebraic identity in Set Theory, you also prove the same exact identity in Propositional Logic (and vice versa).

    (shrug)
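    For the familiar two-element case, the axioms above can be checked by brute force; a quick Python sketch:

```python
# Exhaustively verify the listed axioms for the two-element Boolean algebra.
from itertools import product

vals = (False, True)
plus = lambda a, b: a or b    # "+"  (OR / union)
times = lambda a, b: a and b  # "*"  (AND / intersection)
neg = lambda a: not a         # "~"  (NOT / complement)
ZERO, ONE = False, True

for A, B, C in product(vals, repeat=3):
    assert plus(A, B) == plus(B, A)
    assert times(A, B) == times(B, A)
    assert plus(plus(A, B), C) == plus(A, plus(B, C))
    assert times(times(A, B), C) == times(A, times(B, C))
    assert plus(A, times(B, C)) == times(plus(A, B), plus(A, C))
    assert times(A, plus(B, C)) == plus(times(A, B), times(A, C))
    assert plus(A, ZERO) == A
    assert times(A, ONE) == A
    assert plus(A, neg(A)) == ONE
    assert times(A, neg(A)) == ZERO
print("all axioms hold for ({False, True}, OR, AND, NOT)")
```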
    • Similar comparisons can be made in Set Theory. In the same order as above: Union, Intersection, Universal Set, Empty Set, and Set Complement.

      Not all set theories have a universal set. Instead of having a universal set, you often define a domain of discourse. Therefore if D is your domain of discourse, you define:

      ~A = D - A

      However, it's important to note that D is not a "universal set".
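      In concrete terms, with Python sets (the domain of discourse here is made up for the example):

```python
# Complement taken relative to a domain of discourse, not a "universal set".
D = {1, 2, 3, 4, 5, 6}  # domain of discourse (example values)
A = {2, 4, 6}

complement_A = D - A              # ~A = D - A
print(complement_A)               # {1, 3, 5}

assert A | complement_A == D      # A + ~A = 1, where "1" is D
assert A & complement_A == set()  # A * ~A = 0, where "0" is the empty set
```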
  • Boole was born in Lincoln, England, in 1815, the eldest son of a poor shoemaker who also had a passion for mathematics. He was a precocious child. His mother boasted that young George, 18 months, wandered out of the house and was found in the centre of town, spelling words for money.

    Spelling expert? He would have stood out as a slashdotter.
  • Null ruined it all (Score:5, Interesting)

    by Tablizer ( 95088 ) on Saturday March 27, 2004 @04:30PM (#8690930) Journal
    Some complain that the introduction of "null" into some systems (such as databases) ruins the simplicity of Boolean logic. It creates a "3-value logic" which can get messy to grok.

    I generally agree. I think nulls are perhaps fine for numeric calculations in some cases, such as an average over zero records, but not for Booleans and not for strings. But sometimes it is hard to allow them in one place and not the other. It is a controversial topic nevertheless. Chris Date has written some white papers on how to get rid of null.
    • On the contrary, I find nulls extremely useful for both booleans and strings. With booleans, how do you express "don't know"? In the same way, with strings, it's useful to know whether data has yet been entered; the difference between "not yet asked" and "no comment".
      • With booleans, how do you express "don't know"?

        Good question. Hugh Darwen may have some answers [freeola.com]. When you do express "don't know" as a null, how do you, later on - when you get the null back as a query result - tell which of these it means:

        • not applicable
        • unknown
        • false
        ?
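        One common answer is to keep a single "unknown" value and define the connectives around it; a minimal Python sketch of SQL-style (Kleene) three-valued AND/OR, with None standing in for the null:

```python
# Kleene-style three-valued AND/OR, with None standing in for "unknown".
def and3(a, b):
    if a is False or b is False:
        return False  # a definite False wins regardless of the unknown
    if a is None or b is None:
        return None   # cannot decide without the missing value
    return True

def or3(a, b):
    if a is True or b is True:
        return True   # a definite True wins regardless of the unknown
    if a is None or b is None:
        return None
    return False

print(and3(True, None))   # None -- why "WHERE flag AND other" can silently drop rows
print(or3(True, None))    # True
print(and3(False, None))  # False
```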
    • Getting rid of null is easy in programming languages; just add a polymorphic sum ("container," if you will) called "option", and then have your functions return a "list option" rather than a "list" -- then anything of list type really has a list in it. Many functional languages do this very thing, and it is superior in every way to null. (Except that some people aren't "used" to it.)
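      Python has no built-in option type, but the idea can be approximated with an explicit wrapper, so "no value" is a distinct case in the return type rather than a null that can leak anywhere; a minimal sketch (the lookup function and its data are invented for the example):

```python
# A tiny "option" wrapper: absence of a value is explicit, not a silent null.
from dataclasses import dataclass
from typing import Generic, TypeVar, Union

T = TypeVar("T")

@dataclass
class Some(Generic[T]):
    value: T

@dataclass
class Nothing:
    pass

Option = Union[Some[T], Nothing]

def find_user(name: str) -> "Option[str]":
    users = {"ada": "Ada Lovelace"}  # example data
    return Some(users[name]) if name in users else Nothing()

result = find_user("ada")
if isinstance(result, Some):
    print("found:", result.value)
else:
    print("no such user")            # the "no value" case must be handled explicitly
```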
    • Some languages such as VHDL have even more logic values. The ones I can think of off the top of my head are 1, 0, X for undefined (if you assign two conflicting values to one signal, the state is undefined), and U for unknown; there are also strong and weak values, which are close to 1 and 0 respectively but not quite there... and there are a bunch of other ones I have never really used.
  • Atanasoff Missing (Score:5, Informative)

    by olafo ( 155551 ) on Saturday March 27, 2004 @04:45PM (#8691031)
    It appears this "Computer History" attempt overlooks John Vincent Atanasoff [vt.edu], credited by most reliable sources (Smithsonian, etc.) as developer of the "first electronic digital computer" [ameslab.gov] years before the ENIAC. In fact, the ENIAC was derived from Atanasoff's ABC Computer at Iowa State after an ENIAC developer visited Atanasoff (stayed several days in Atanasoff's home), "stole" his ideas and proposed a larger version as the ENIAC to the army. Atanasoff's ABC computer was the first to solve Schroedinger's equation represented by the solution of a 39x39 system of matrix equations. However, time caught up with the ENIAC visitor, and the notebook he kept when he visited Atanasoff was his undoing when the U.S. Court in Minneapolis overturned previous patent rulings for computer developments and ruled they were all derived from Atanasoff's ABC computer. Hopefully, this attempt at a computer museum will soon be updated to accurately reflect the original development of the electronic computer by Atanasoff at Iowa State in 1942.
    • by LeJoueur ( 766021 )
      Just to be a bit pedantic: according to Simon Singh's book "The Code Book", the first "computer" was COLOSSUS, the British code-breaking machine built at Bletchley Park during WWII (http://www.acsa.net/a_computer_saved_the_world.htm). It never received as much publicity as the ENIAC because it was a war secret... Cheers
  • by a.ameri ( 665846 ) on Saturday March 27, 2004 @07:35PM (#8692080)
    Well, George Boole proposed the basic principles of Boolean algebra in 1854 in his treatise "An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities". While I admire George Boole, and I certainly give him credit for creating this branch of algebra, it should be noted that Boole himself had nothing to do with computers or digital systems. In the middle of the 19th century, many mathematicians were working on something called "principles of logic". Their goal was to describe human thought in a purely mathematical format. They aimed to model human logic as a branch of science, and they wanted to formulate it and find the principles of the human way of thinking. If you have ever taken a discrete mathematics course, you have certainly seen nonsense statements like "If today is Sunday AND Betty is happy THEN the sky is red".

    This was what those mathematicians were aiming for. George Boole also proposed a set of principles, which at the time no one thought had any practical use. This branch of mathematics was a purely theoretical one. Mathematicians mostly abandoned the subject after experience showed that human thought cannot be reduced to mathematical notation.

    It wasn't until the 1940s that someone at Bell Labs (Claude Shannon) realised that Boolean algebra could be used in digital systems, specifically in implementing digital circuits. Even the ENIAC, often called the first computer, used a decimal rather than a binary representation. It was almost by accident that Boolean algebra, which at the time was considered a useless, purely theoretical branch of math, found an application and became a widely studied subject.

    What I am trying to say is that George Boole himself by no means had any interest in digital systems, in programming, in computers, or in anything even remotely related to electronics. While, as I said, we should all give him immense credit for his work on Boolean algebra, it should be noted that many people contributed much more to computing and electrical science than Boole did. Charles Babbage and Lady Ada were actually writing computer programs in the 19th century; their only problem was that they had no computer at the time! And certainly, the father of today's computer architecture is von Neumann.

    Give credit where credit is due, but over-crediting someone - saying that George Boole invented the foundation of computers, for example - is certainly not correct.
  • In the first semester of computer science last year, I had Boolean algebra.
    I found it very interesting at first, but the final parts were really annoying. The important thing is that after finishing the semester I realized that, even though I've been a programmer since '95 and have experience with Turbo Pascal, JavaScript, LambdaMOO, PHP, C, C++ and Object Pascal, I've just gotten better at programming because of Boole!

    Boole is one of those things that looks simple and useless at first, but that is not the tru

  • It was 150 years ago that George Boole published his literary classic The Laws of Thought, wherein he devised a mathematical language for dealing with mental machinations of logic. It was a symbolic language of thought -- an algebra of logic (algebra is the branch of mathematics that uses letters and other general symbols to represent numbers and quantities in formulas and equations).

    It's a sad day when the editors of Science feel they need to define what algebra is to their audience.

  • by 1iar_parad0x ( 676662 ) on Sunday March 28, 2004 @04:06AM (#8694564)
    IMHO, it was the discovery of a real-world application of the idempotent law that was Boole's greatest accomplishment. One could argue that Leibniz and Boole discovered this independently. This is not unlike Hamilton's discovery of an application for non-commutative algebra.

    Boole's contribution to logic was profound. First, a real-world model for a mathematical system ensures the consistency of that system. Boole's work provided an abstraction for elementary set theory. The key to this abstraction is idempotency. The aggregate of set A and itself is the set A (i.e. A+A=A). Thus, Boolean algebra [wolfram.com] formalizes the basic set-theoretic operations of union and intersection, which in turn is almost trivially isomorphic to a Boolean ring [wolfram.com]. I could create all kinds of stupid rules [insert your favorite slam on mathematics here] that have no meaning in the real world. Most importantly, Boole seems to have been the first to attempt to bridge the gap between abstract thought and mathematics. Admittedly there was some previous work attempting to formalize and classify all syllogistic reasoning. It was the first step towards a unified theory of logic and ultimately what is hoped to be a universal theory of symbolism (see Chomsky's mathematical linguistics).
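    A concrete illustration of the idempotency and of the Boolean-ring view, using Python sets (symmetric difference playing the role of ring addition, intersection the role of multiplication):

```python
# Idempotency and the Boolean-ring view, illustrated with Python sets.
A = {1, 2, 3}
B = {3, 4}

# Idempotency of the lattice operations: A + A = A and A * A = A.
assert A | A == A
assert A & A == A

# Boolean ring: addition is symmetric difference, multiplication is intersection.
add = lambda x, y: x ^ y
mul = lambda x, y: x & y

assert add(A, A) == set()                  # every element is its own additive inverse
assert mul(A, A) == A                      # ring multiplication is idempotent
assert A | B == add(add(A, B), mul(A, B))  # union recovered from the ring operations
print("set algebra and the Boolean ring agree on A and B")
```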

    The irony about mathematics is that often the best ideas are childishly simple. It's not the proof of deep theorems [slashdot.org] (although that has its place) that often has the greatest impact. It's the fresh application of mathematical rigour to some real-world scenario. Thus, mathematics is often at its weakest when done in isolation. Incidentally, Knuth's work in algorithm analysis was revolutionary. In a world described by (K-Complexity (AIT)|cellular automata|simple computer programs), algorithm analysis and ultimately a proof of P != NP may hold the key to the fundamental laws of nature (i.e. physics, biology, and chemistry).

    Incidentally, Martin Davis's The Universal Computer [maa.org] is a great popular-science book on this topic. A free copy of the introduction is here [nyu.edu]. This book manages to introduce the ideas of Turing (Turing-Post?) Machines [wolfram.com] and the Diagonal Method [wolfram.com] to the lay reader. The author is a respected logician and computer scientist who studied under Church and Post.

"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian

Working...