Boolean Logic : George Boole's The Laws of Thought
Ian writes "The Globe and Mail has a piece about the man behind Boolean Logic - George Boole - The Isaac Newton of logic. 'It was 150 years ago that George Boole published his classic The Laws of Thought, in which he outlined concepts that form the underpinnings of the modern high-speed computer.'"
Doesn't start out well (Score:5, Informative)
The computer was digital, it just used relays instead of integrated circuits. It wasn't stuck between relays, it was stuck in a relay.
And while I'm at it, a nitpick. She wasn't an admiral until much later.
Bug found by a Bug (Score:4, Funny)
Grace Hopper Found a bug? Was it a Grasshopper?
Re:Bug found by a Bug (Score:4, Funny)
Re:Bug found by a Bug (Score:2)
I'm just trying to figure out why this article is from the 332167 department....
Re:Doesn't start out well (Score:2)
I stumbled on a copy of the image of the log book, showing the moth taped to it. Quite interesting. Just to be annoying, and somewhat in spirit with this thread, here's a tip;
If you can't figure that out, the image probably won't make much sense to you either.
Re:Doesn't start out well (Score:4, Interesting)
Note the previous run! (Score:2)
1000 - Arctan stopped
arctan(1.2700) = 0.9037847025 (actual value 0.9037846992)
You've got to appreciate that.. 2 hours of work on a huge machine gave 5 decimals of precision. Today, any pocket calculator can do that in milliseconds!
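For scale, the same computation is a one-liner today; a quick sketch using Python's standard library (the tolerance against the logged value is my own choice):

```python
import math

# The 1940s relay machine spent about two hours on this run;
# a modern CPU evaluates it in well under a microsecond.
value = math.atan(1.27)
print(f"arctan(1.2700) = {value:.10f}")

# Agrees with the "actual value" recorded in the log, 0.9037846992.
assert abs(value - 0.9037846992) < 1e-5
```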
Spoiler (Score:2, Interesting)
Yes, I'm aware that an AC beat me to it, but he's at -1 right now, so I'm posting this because it's more visible.
Re:Spoiler (Score:2)
Tell us what picture you are posting next time. I thought you were following standard Slashdot procedures and posting an attractive picture of the nerdy female in question.
Call that a nitpick? (Score:3, Informative)
She was also already a PhD when she was called up for active service in WW2, so the grandparent post is really highly inaccurate.
Grace Hopper - the third programmer in the United States, and a fitting successor to Ada Lovelace.
The Real Bible Code? (Score:2)
Prof. MacHale also notes that subsequent to The Laws of Thought, Boole undertook to rewrite the Bible in his mathematical logic.
I'm very curious about this ... how exactly would you go about representing text with mathematical logic? He must have needed to invent some ad-hoc method to do this, right?
Anyone know anything about this?
Re:The Real Bible Code? (Score:3, Funny)
And 101101 sayeth unto 111000, "Don't partake of thy apple, for it is full of 110101001010". But sayeth 111000 back to 101101, "11111111 That! I am hungry!". Behold he biteth into thy apple, and suddenly God made him naked, and his "1" showed, and 101101 laughed because she thoughteth it was a decimal. But her "00" also showed, and 111000 laughed because he thought they were two decimals.
It gets worse (Score:3, Informative)
Boolean Logic (Score:5, Funny)
Can't parse that (Score:2)
Re:Boolean Logic Demorganized (Score:2)
A comment has to be insightful AND funny, otherwise, it is NOT worth reading.
Demorganized:
A comment must NOT be insightful OR funny, otherwise, it is worth reading.
Re:Boolean Logic (Score:5, Funny)
Isaac Newton of modern computers? (Score:5, Informative)
Re:Isaac Newton of modern computers? (Score:2, Insightful)
RTF Post.
Re:Isaac Newton of modern computers? (Score:2, Funny)
Oh, on the other hand...
Boole Was Ada's Teacher (Score:5, Interesting)
Re:Boole Was Ada's Teacher (Score:3, Funny)
Some people also don't think that Earth is not flat.
Re:Boole Was Ada's Teacher (Score:5, Funny)
It was, however, the world's first vaporware.
Re:Boole Was Ada's Teacher (Score:3, Interesting)
My general rule is, "If all it does is react passively to the Earth's magnetic field and displace its weight in water, it's not a computer."
Since the Analytical Engine was never completed, I feel it falls in that category.
However, you don't need a working computer to be a programmer. I've programmed for computers that hadn't been built yet as well as some that never got off the drawing board.
Re:Boole Was Ada's Teacher (Score:2)
Some people also don't think that Charles Babbage's Analytical Engine was the world's first computer.
His Analytical Engine was the design for the first computer, but it was never created due to lack of funding. Herman Hollerith used the designs of Babbage's Analytical Engine and Difference Engine to create his Tabulating Machine (an early computing device) for the company that later became International Business Machines (IBM). So, Charles Babbage is known as the father of the computer for his desig
Re:Boole Was Ada's Teacher (Score:2)
=Smidge=
Re:Boole Was Ada's Teacher (Score:2)
Which should be called:
"All the cool things pseudo geeks want to hear so they can think they are hip on technology, and other mindless drivel."
Re:Boole Was Ada's Teacher (Score:2)
Boole tried to rewrite the Bible... (Score:3, Funny)
Laws of Thought (Score:5, Informative)
Not in Project Gutenberg yet
There's nothing like reading the original works.
Soon! (Score:5, Informative)
Why... (Score:5, Funny)
Re:Why... (Score:2)
Not only that, but I got really excited that Google was finally going to have boolean search queries.
Alas....
i did too (Score:2)
Re:Why... (Score:2)
If you read that in under 10 seconds, you just proved my ponit.
LK
Re:Why... (Score:2)
Then again, probably 10k people read an article so at least 5 others just went to the store and bought a coke, too
John von Neumann (Score:5, Interesting)
Boole's great achievement was his attempt to formalize logic algebraically at a time when logic was informal and far too meta for even mathematicians to consider formally. While this is great and all, it doesn't result in a general purpose computer.
However, Turing machines and von Neumann machines are in every way general purpose computers.
Turing machine generalised, not binary. (Score:3, Informative)
The importance of Boole's ideas, therefore, was that they provided a grand unifying framework for computer design.
In fact Turing's ideas were more fertile for programming, and it's a pity that he lived in the UK after WW2 and was held back by the
Re:Turing machine generalised, not binary. (Score:2)
Also, just like you said, Turing machines don't require binary implementation, and therefore Boole's ideas played only a small part if any in Turing machines.
The truth is far more sophisticated. Everything worth anything is the result of a continuum of research and researchers. However, my argument is that von Neumann's contributions are the first to resemble
Re:Turing machine generalised, not binary. (Score:2)
Re:Turing machine generalised, not binary. (Score:2)
Don't forget Shannon (Score:3, Informative)
It was, by a long margin, the most important master's thesis in history.
Re:John von Neumann (Score:2)
I don't have time to give you a computer science lesson on Turing machines, but just Google it and read up on what they are.
Re:John von Neumann (Score:3, Informative)
You have this little box which has an infinitely long tape fed into it. The box is a finite state machine which reads the tape, and based upon the state of the machine, writes to the tape, moves the tape left or right, and maybe changes its internal state.
There's a kind of neat Turing Machine Simulator here [igs.net] if you want.
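That description maps almost line-for-line onto code; here is a minimal sketch of such a box (the rule table and the unary-increment example are illustrative inventions, not taken from the linked simulator):

```python
from collections import defaultdict

def run_turing(rules, tape, state="start", pos=0, max_steps=1000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (write, move, next_state),
    where move is -1 (left) or +1 (right).
    """
    cells = defaultdict(lambda: "_", enumerate(tape))  # "_" marks blank cells
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = rules[(state, cells[pos])]
        cells[pos] = write
        pos += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Illustrative rule table: append one mark to a unary number (increment).
rules = {
    ("start", "1"): ("1", +1, "start"),  # skip over the existing marks
    ("start", "_"): ("1", +1, "halt"),   # write one more mark, then halt
}
print(run_turing(rules, "111"))  # -> 1111
```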
Re:John von Neumann (Score:2)
The internet functions in this way. At any given time, people are adding more servers and therefore more tape to the net.
Re:John von Neumann (Score:2)
Funny name (Score:5, Funny)
Book title (Score:3, Informative)
correction (Score:2, Insightful)
Re:correction (Score:4, Interesting)
Parent post is not completely wrong - I got the first part of the title right. :-) And I blame Dover [yahoo.com]
"pre-digital computers"?? (Score:3, Insightful)
"Also that year, Grace Hopper, an admiral in the U.S. Navy, recorded the first computer "bug" -- a moth stuck between the relays of a pre-digital computer.)"
Ahh, but relays are digital.... They are either on or off. That was binary the last I looked.
Re:"pre-digital computers"?? (Score:5, Informative)
Not to mention that it is unlikely that Hopper ever claimed to find the first "bug".
The comment next to the moth taped in the logbook [navy.mil] seems to indicate that the word had been in use for some time, and Hopper was making a bit of a joke.
Shakespeare published first (Score:5, Insightful)
While that's a pretty clumsy way of saying it, Shakespeare was ahead of Boole.
I suggest we all add the following statement (or equivalent) to our code in honor of this great mind.
typedef bool shakespear;
I'm guessing you were being funny (Score:2)
A simple pithy statement like "to be or not to be" is a long way from formalized digital logic. Boole's contribution wasn't realising the idea of having two finite states; philosophers had that idea for a long time, and often over-apply it in an attempt to prove arguments. What Boole did was to create a formalized and complete system for logic. Boolean operators can be used in a binary system to construct any more complicated operator. The basic Boolean operations,
Boole vs. Real World (Score:5, Interesting)
I'm not saying that binary is not great for doing all manner of wonderfully powerful proofs, logic, and computation. I'm only saying that it is a mere approximation to the real world and can thus fail when the real world does not dichotomize to fit into Boole's logic.
Boolean Logic illustrates both the tremendous power and the weakness of mathematical systems. On the one hand, the power of proof guarantees that a man-made mathematical system with certain axioms will undeniably have certain properties. On the other hand, math gives one no guarantee that the real world obeys those axioms.
Boole vs binary (Score:3, Informative)
One good boolean algebra is the logical algebra of probabilities. Every datum is a real between 0 and 1. x or y would be x + y - xy; x and y would be xy; not x would be 1 - x. (All of this is off the top of my head BTW). It's a perfectly valid boolean algebra.
To say that boolean algebra is about "true" or "false" i
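The operations listed above are easy to check numerically; a small sketch (the example values are mine) showing they reduce to the classical connectives at the endpoints and satisfy De Morgan's law in between:

```python
def p_or(x, y):  return x + y - x * y
def p_and(x, y): return x * y
def p_not(x):    return 1 - x

# At the endpoints 0 and 1 these are just the classical connectives...
assert p_or(0, 1) == 1 and p_and(1, 1) == 1 and p_not(0) == 1

# ...and De Morgan's law  not(x and y) == (not x) or (not y)
# also holds for values strictly between 0 and 1.
x, y = 0.3, 0.8
assert abs(p_not(p_and(x, y)) - p_or(p_not(x), p_not(y))) < 1e-12
print("endpoint and De Morgan checks pass")
```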
It's not binary (or even boolean) logic that fails (Score:2)
Boolean logic can, through some very simple series of "greater than/less than" questions, describe the mass, radius, orbit and every other reasonably measurable quantity. It can also be used to measure subjective opinions - a set of "is an object bigger than/sma
Re:Boole vs. Real World (Score:2)
As wonderful as binary is, it falls utterly in capturing the fuzzy analog nature of life and the real world.
I wouldn't say "utterly"... using approximation and extrapolation to fill in the weak spots and get a "close enough" answer have been good enough to let the human race put a man on the moon, collect energy from doing naughty things with atoms, and create several series of fun to play WWII first person shooters (I'm most enthusiastic about that last one, but YMMV).
I've read up on multivalent l
Re:Boole vs. Real World (Score:2)
Re:Boole vs. Real World (Score:2)
Re:Boole vs. Real World (real numbers real?) (Score:5, Interesting)
Excellent point. But again, I'm not sure that the real world actually obeys the laws of real numbers either. Again, wave-particle duality makes a mess of mathematical notions of pure discrete and pure continuous. Some theories of physics suggest the existence of a quantum mechanical foam at dimensions of about 10^-33 meters. Perhaps the physical world is neither continuous (in the infinite-digit real number sense) nor discrete (in the exactly-N-bits binary sense). Perhaps continuous real numbers are a good approximation, but whether real numbers are real (or just a very convenient mathematical construct) is debatable.
Similarly, a question asking the color of something (which has finitely many answers) could be reformulated as a sequence of yes/no questions. For example, if the color is in 24-bit format, start with: Is the first bit a 1? and so on.
An interesting example. Yet real-world colors aren't 24-bit, although they can be approximated with a 24-bit color measuring system. It's a crude approximation, unfortunately. I don't even know of a 24-bit system that has the color gamut of human vision, let alone one that properly measures the hyperspectral reflectance, transflectance, absorption, & fluorescence properties of real-world materials. Yes, if you assume a 24-bit approximation, then binary yes/no questions suffice. My point is that one is forced to make a big (sometimes right, sometimes wrong) assumption in reducing the physical world to any N-bit approximation.
After all, everything you do on a computer, from playing video games to chatting via Instant messaging, ultimately gets reduced to binary form.
So very true.
To me, the deeper issue is whether the real world obeys the mathematical axioms of an algebra [wikipedia.org], Boolean or otherwise. The real world is nonlinear, and that throws a wrench in the axioms right there. I also wonder about the axiom of closure -- whether interactions of physical quantities in physical systems have consequences outside the algebraic variables of the system.
Again, I'm sure that algebras and real numbers or N-bit numbers are excellent approximations, as long as we don't forget that they are only approximations.
Re:Boole vs. Real World (real numbers real?) (Score:3, Interesting)
I believe that approximations are the best we can do. I've been trained as a mathematician. But, I don't believe in the square root of 2 in any physical sense. Some may argue, well construct a square 1 unit on a side, then the diagonal is square root of two. I argue, is it possible to construct a physical square 1 unit to a side? Each side would have to
Re:Boole vs. Real World (math == chainsaw) (Score:3, Interesting)
So, I don't even believe in 1 as a physically measurable number!
Cool, I can tell that you and I are on a similar wavelength. Whether that wavelength is representable as a real number or as a 24-bit color is another matter.
I say again, every measurement is an approximation. Ergo, choose N large enough that no one can practically tell the difference. Then the approximation becomes reality.
And I agree 100% that a large enough N creates
Re:Boole vs. Real World (real numbers real?) (Score:2)
That depends. If you're dealing with a discrete value, you can measure it exactly. Continuous ones can sometimes be calculated exactly, but you can't actually measure them exactly. (Actually, I think the typical use of measurement involves mapping from some space, discrete or continuous, to a discrete one. Discrete space can map 1-to-1 to another discrete space (on a given interval), continuous space cannot.)
This is a different concept from needing
No mention of Set Theory. (Score:5, Informative)
Just as an aside, a mathematical structure is a Boolean Algebra if, and only if, it contains two operations (generally denoted + and *), such that for all elements A, B and C in the structure
A + B = B + A
A * B = B * A
(A + B) + C = A + (B + C)
(A * B) * C = A * (B * C)
A + (B * C) = (A + B) * (A + C)
A * (B + C) = (A * B) + (A * C)
and there exist two elements 0 and 1 in the structure such that
A + 0 = A
A * 1 = A
and for each A an element exists that's the "negation" of A
A + ~A = 1
A * ~A = 0
In logic, + is equivalent to OR, * is equivalent to AND, a Tautology is equivalent to 1, and a contradiction is equivalent to 0. ~ is NOT.
Similar comparisons can be made in Set Theory. In the same order as above: Union, Intersection, Universal Set, Empty Set, and Set Complement.
So, if you prove one algebraic identity in Set Theory, you also proved the same exact identity in Propositional Logic (and vice versa).
(shrug)
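For the record, the axiom list above can be brute-force checked on the two-element structure {0, 1}; a small sketch (using OR for + and AND for *, as the post suggests):

```python
from itertools import product

OR  = lambda a, b: a | b
AND = lambda a, b: a & b
NOT = lambda a: 1 - a

for A, B, C in product((0, 1), repeat=3):
    assert OR(A, B) == OR(B, A) and AND(A, B) == AND(B, A)     # commutativity
    assert OR(OR(A, B), C) == OR(A, OR(B, C))                  # associativity
    assert AND(AND(A, B), C) == AND(A, AND(B, C))
    assert OR(A, AND(B, C)) == AND(OR(A, B), OR(A, C))         # distributivity
    assert AND(A, OR(B, C)) == OR(AND(A, B), AND(A, C))
    assert OR(A, 0) == A and AND(A, 1) == A and AND(A, A) == A # identities
    assert OR(A, NOT(A)) == 1 and AND(A, NOT(A)) == 0          # complements
print("all Boolean-algebra axioms hold on {0, 1}")
```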
Re:No mention of Set Theory. (Score:2)
Not all set theories have a universal set. Instead of having a universal set, you often define a domain of discourse. Therefore if D is your domain of discourse, you define:
~A = D - A
However, it's important to note that D is not a "universal set".
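Python's set type expresses the relative complement directly; a tiny illustration (the example sets are my own):

```python
D = {1, 2, 3, 4, 5}   # domain of discourse
A = {2, 4}

complement = D - A    # ~A relative to D
assert complement == {1, 3, 5}

# The complement only behaves like negation *inside* D:
assert A | complement == D
assert A & complement == set()
```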
Not a slashdotter (Score:2, Funny)
Spelling expert? He would have stood out as a slashdotter.
Re:Not a slashdotter (Score:2, Funny)
Null ruined it all (Score:5, Interesting)
I generally agree. I think nulls are perhaps fine for numeric calculations in some cases, such as the average if there are zero records, but not Booleans and not strings. But sometimes it is hard to limit it to one but not the other. It is a controversial topic nevertheless. Chris Date has written some white papers on how to get rid of null.
Re:Null ruined it all (Score:2)
Re:Null ruined it all (Score:2, Interesting)
Good question. Hugh Darwen may have some answers [freeola.com]. When you express "don't know" as nulls, how do you, later on - when you get the null as a result in a query - get its meaning out of:
Re:Null ruined it all (Score:2)
Re:Null ruined it all (Score:2)
Atanasoff Missing (Score:5, Informative)
COLOSSUS missing too! (Score:2, Informative)
Giving George Boole too much credit? (Score:3, Interesting)
This was what those mathematicians were aiming for. George Boole also proposed a set of principles which, at the time, no one thought had any practical use. This branch of mathematics was a purely theoretical one. Mathematicians mostly abandoned this subject after it was proven by experience that human thought cannot be formulated into mathematical notation.
It wasn't until the 40s that someone at Bell Labs (forgot his name) suddenly found out that Boolean Algebra could be used in digital systems, specifically in implementing digital circuits. Even the first computer built, the ENIAC, used a decimal system, and didn't have anything to do with binary systems. It was only by accident that Boolean Algebra, which at the time was a completely useless and theoretical branch of math, found an application and became a widely studied subject.
What I am trying to say is that George Boole himself by no means had any interest in digital systems, in programming, in computers, or in anything even remotely related to electronics. While, as I said, we should all give him immense credit for his work on Boolean Algebra, it should be noted that many people contributed much more to computer and electrical science than George Boole. Charles Babbage and Lady Ada were actually writing computer programs in the 19th century; their only problem was that they had no computer at that time! And certainly, the father of today's computer architecture is von Neumann.
Give credit where credit is due, but over-crediting someone, like saying George Boole invented the foundation of computers, is certainly not correct.
It doesn't look like it at first, but Boole is great (Score:2, Interesting)
I found it very interesting at first, but the final parts were really annoying. But the important thing is that after finishing the semester I realized that, even though I've been a programmer since '95 and have experience with Turbo Pascal, JavaScript, LambdaMOO, PHP, C, C++ and Object Pascal, I've just gotten better at programming because of Boole!
Boole is one of those things that looks simple and useless at first, but that is not the tru
I don't believe this... (Score:2)
It's a sad day when the editors of Science feel they need to define what algebra is to their audience.
Why Boole's work was signifigant (Score:5, Informative)
Boole's contribution to logic was profound. First, a real world model for any mathematical property ensures the consistency of that model. Boole's work provided an abstraction for elementary set theory. The key to this abstraction is idempotency. The aggregate of set A and itself is the set A (i.e. A+A=A). Thus, Boolean algebra [wolfram.com] formalizes the basic set theoretic operations of union and intersection, which in turn is almost trivially isomorphic to a Boolean ring [wolfram.com]. I could create all kinds of stupid rules [insert your favorite slam on mathematics here] that have no meaning in the real world. Most importantly, Boole seemed to be the first to attempt to bridge the gap between abstract thought and mathematics. Admittedly there was some previous work in attempting to formalize|classify all syllogistic reasoning. It was the first step towards a unified theory of logic and ultimately what is hoped to be a universal theory of symbolism (see Chomsky's mathematical linguistics).
The irony about mathematics is that often the best ideas are childishly simple. It's not the proof of deep theorems [slashdot.org] (although that has its place) that often has the greatest impact. It's the fresh applications of mathematical rigour to some real world scenario. Thus, mathematics is often at its weakest when done in isolation. Incidentally, Knuth's work in algorithm analysis was revolutionary. In a world described by (K-Complexity (AIT)|cellular automata|simple computer programs), algorithm analysis and ultimately a proof of P not= NP may hold the key to the fundamental laws of nature (i.e. physics, biology, and chemistry).
Incidentally, Martin Davis's The Universal Computer [maa.org] is a great popular science book on this topic. A free copy of the introduction is here [nyu.edu]. This book manages to introduce the ideas of Turing (Turing-Post?) Machines [wolfram.com] and the Diagonal Method [wolfram.com] to the lay reader. The author is a respected logician and computer scientist who studied under Church and Post.
Re:Sneaky political criticism and genius-bashing (Score:4, Insightful)
Duh. Not the BBC (Score:2)
I was reading a BBC news page when writing the reply... Oh well, too much lager
Simon
Uh.... that's uh... nice...? (Score:2)
Re:Uh.... that's uh... nice...? (Score:2)
Or how Conrad Black controls both the major news media of Britain and The National Post, which is, incidentally enough, _NOT_ The Globe And Mail.
I'm missing your logic at how we're being oppressed by Britain when they have zero control over us except by name.
Re:Quick, how many here can define "bit"? (Score:3, Informative)
Re: Quick, how many here can define "bit"? (Score:2)
This is different from how I'd say it, but seems equivalent. Everyone else -- 4 other attempts at the time of this writing -- failed it. Scary, is it not?
Re: Quick, how many here can define "bit"? (Score:2)
Yes, everyone needs to know this. Maybe not the "precise definition", but certainly something better than "binary digit".
This is -- as you point out -- the foundation of Informatics, a crucial part of the all-encompassing Computer Science.
Not knowing this causes DB tables with 15-character fields for IP-addresses, 32-characters for MD5 checksums, and other monstrocities all too common nowadays...
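On the IP-address point: the dotted-quad string is just a display format for a 32-bit value, which the standard library makes explicit (a sketch; the schema-design advice above is the poster's):

```python
import ipaddress

addr = ipaddress.IPv4Address("192.168.0.1")
as_int = int(addr)                 # the whole address fits in one 32-bit integer
assert as_int == 0xC0A80001

# The conversion round-trips losslessly, so storing the integer loses nothing.
assert ipaddress.IPv4Address(as_int) == addr
```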
Re: Quick, how many here can define "bit"? (Score:2)
Not knowing where a dictionary can be found causes such monstrosities as monstrocities
Re:Quick, how many here can define "bit"? (Score:2)
Does not "with no prior knowledge" say, the values are equally probable? Certainly, that is an important part of the definition, and the explanation of, for example, why archiving (compressing) can exist...
I think most of us can (Score:2)
Re:Quick, how many here can define "bit"? (Score:3, Funny)
Re:Quick, how many here can define "bit"? (Score:2)
Jackass.
LK
Re:Quick, how many here can define "bit"? (Score:2)
Most typically wrong. You are merely explaining where the word "bit" comes from. I asked for the definition of the term. Try defining meter (as in kilometer) as homework. And no: "It is something that's metered," is not going to cut it.
Thanks, moron.
Re:Quick, how many here can define "bit"? (Score:2)
Like I said before, a bit is a binary digit, 1 or 0.
LK
Re:Quick, how many here can define "bit"? (Score:2)
Bzz. Wrong on both counts. The last time the meter was defined the way you describe was in 1795 [colostate.edu].
Meter is a measure of distance. Bit is a measure of information. Both have definitions.
Your not knowing this foundation of Informatics -- despite it being offered and discussed elsewhere in this very thread -- perfectly illustrates my original point (moderated off-topic by now).
[Your stubborn insistence on "1 or 0" (what happened to "left or right", "black or white", then?), coupled with calling me "Jackass" point at
Re:Quick, how many here can define "bit"? (Score:2)
Zeros and ones are not bits either... I tried. I tried hard. I mentioned the definition of bit in this thread and I gave you a link to one of the online dictionaries, which offers a different (simpler!), but equivalent definition. If you still don't get it and insist on the stupid "binary digit" crap, there is no hope for you. You are, officially, a moron, and I can only pray, I never have to depend on a program written with your participation.
Good bye...
Re:Quick, how many here can define "bit"? (Score:2)
There can be less. If I told you your name (or anything else you already knew) -- that would be zero information. If I told you something, you did not know, but suspected, it would be above zero, but, possibly (depending on the number of other choices) less than a bit.
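This is exactly Shannon's surprisal: an outcome with probability p carries -log2(p) bits, which is below one bit whenever p > 1/2. A quick numeric sketch (the probabilities are my own examples):

```python
import math

def info_bits(p):
    """Information content (surprisal) of an event with probability p."""
    return -math.log2(p)

assert info_bits(0.5) == 1.0         # a fair coin flip: exactly one bit
assert info_bits(1.0) == 0.0         # something you already knew: zero information
assert info_bits(0.9) < 1.0          # something you strongly suspected: under a bit
print(f"{info_bits(0.9):.3f} bits")  # well below one bit
```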
Re:AND before OR? (Score:2, Informative)
AND = x (multiply)
OR = + (add)
The order of operations states that we multiply before adding.
Re:AND before OR? (Score:5, Insightful)
In fact, for most practical purposes, AND *is* multiplication and OR *is* addition. Just compare the truth tables with multiplication and addition tables (one minor technicality, of course, is that addition carries while OR does not; the carry bit is simply the result of A AND B).
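The carry remark is the classic half adder: XOR gives the sum bit and AND gives the carry, recovering ordinary addition from logic alone. A small check (standard construction, sketched here):

```python
def half_adder(a, b):
    """Add two one-bit values: sum is XOR, carry is AND."""
    return a ^ b, a & b   # (sum, carry)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        assert a + b == (c << 1) | s   # arithmetic recovered from logic
print("half adder matches arithmetic on all inputs")
```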
Re:AND before OR? (Score:3, Insightful)
Re:AND before OR? (Score:2)
In logics, you usually start with a very simple syntax that forces parentheses around all binary operators. You can then add syntactic shortcuts such as a + b + c = ((a + b) + c).
Introducing precedence has nothing to do with the logic itself.
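Most programming languages bake that precedence shortcut into the grammar; in Python, for example, `and` binds tighter than `or`, mirroring * before +. A small check (the example values are mine):

```python
from itertools import product

# "and" binds tighter than "or" on every input combination.
for a, b, c in product((False, True), repeat=3):
    assert (a or b and c) == (a or (b and c))

# The grouping matters: with a=True, b=c=False the two readings differ.
a, b, c = True, False, False
assert (a or (b and c)) is True
assert ((a or b) and c) is False
```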