Forget Math to Become a Great Computer Scientist?
Coryoth writes "A new book is trying to claim that computer science is better off without maths. The author claims that early computing pioneers such as Von Neumann and Alan Turing imposed their pure mathematics background on the field, and that this has hobbled computer science ever since. He rejects the idea of algorithms as a good way to think about software. Can you really do computer science well without mathematics? And would you want to?"
Damn straight! (Score:4, Insightful)
Computer Science != Software Engineering (Score:5, Insightful)
As if computer science wasn't stunted enough (Score:5, Insightful)
COMPUTING IS HARD. You can't dumb it down just because it would be nice to do so. And I'm sorry but mathematics is just the way in which meaning is expressed for machines. There's no free lunch here. And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.
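The claim about non-terminating algorithms can be made concrete. A minimal sketch (names illustrative, not from the book): an "endless" loop is just a terminating step function applied over and over, so it decomposes into a series of terminating runs.

```python
def step(state):
    """One terminating pass of the 'non-terminating' algorithm:
    consume a unit of work and update the state."""
    return state + 1

def run(state, budget):
    # The endless `while True: state = step(state)` decomposes into
    # a series of terminating runs; here each run gets a finite budget.
    for _ in range(budget):
        state = step(state)
    return state

# Two terminating runs compose into the same result as one longer run.
assert run(run(0, 5), 5) == run(0, 10) == 10
```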
Applied mathematics (Score:5, Insightful)
WTF is the author smoking?.. There are of course parts of CS that are less involved in math, but math is still, overall, a fundamental part of the field.
Porn (Score:5, Insightful)
Sure thing Einstein (Score:5, Insightful)
Depends (Score:2, Insightful)
Math not essential - Logic is! (Score:5, Insightful)
He has no idea what math is (Score:5, Insightful)
Wrong, on many levels (Score:5, Insightful)
He also ignores the vast array of work on non-deterministic algorithms, stating that "Any program utilising random input to carry out its process, such...is not an algorithm". Sure, it's not a deterministic algorithm, but even if you artificially restrict your definition of algorithm to just be deterministic, it's a useful tool in analysing such problems.
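For what it's worth, a "program utilising random input" is still perfectly analysable; a toy example (Monte Carlo estimation of pi, nothing from the book) shows why the mathematical toolkit matters even here:

```python
import random

def estimate_pi(n, seed=42):
    """Monte Carlo: the fraction of random points in the unit square
    that land inside the quarter circle approximates pi/4."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

# Not deterministic in its behaviour, yet the error shrinks like 1/sqrt(n) -
# a guarantee that comes straight out of the supposedly useless mathematics.
assert abs(estimate_pi(100_000) - 3.14159265) < 0.05
```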
Finally, statements such as "Computer science does not need a theory of computation" are just so bizarre as to be funny. I suggest he forget all he knows about formal computational theory, and I'll contract "Theseus Research" to write me a program that decides the halting problem for an arbitrary input program. I wonder what his bid will be, given that he doesn't need a theory of computation (the very theory that would tell him you can't do it, at least with our models of computation - and probably with any).
Now, none of this is to say you can't make progress in computer science without the mathematics that has been developed so far - but you will either spend a lot of time producing unreliable results, reinvent the wheel, or just end up creating a new branch of mathematics.
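The impossibility he'd be bidding against fits in a few lines. A sketch of the standard diagonal argument (the oracle here is a stand-in, since no real one can exist):

```python
def diagonal(halts):
    """Given any claimed halting oracle `halts(prog)`, construct a
    program the oracle must misjudge."""
    def trouble():
        if halts(trouble):
            while True:      # oracle said "halts" -> loop forever
                pass
        # oracle said "loops forever" -> halt immediately
    return trouble

# An oracle claiming `trouble` runs forever is refuted on the spot:
t = diagonal(lambda prog: False)
assert t() is None  # it returns immediately, contradicting the oracle
# (An oracle claiming it halts would be refuted by the infinite loop.)
```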
Math is a subset of the bigger picture of ..... (Score:5, Insightful)
And computer science, the software side, is really the science of abstraction physics.
http://threeseas.net/abstraction_physics.html [threeseas.net]
At some point in the higher levels of abstraction creation and use, the lower mathematical level acts more as a carrier wave for the higher-level abstraction than as a means of performing mathematical calculation. The analogy is radio waves carrying the music you hear over the radio: the carrier wave is discarded after it has done its job. Likewise, the mathematics of computers boils down to the binary flipping of transistor switches, upon which the higher levels of mathematics are carried.
With a correct approach to the abstraction-manipulation machines computers really are, we can accomplish a lot more - similar to the difference between the limitations of Roman numerals in math vs. the decimal system with its zero placeholder.
What if you grew them instead? (Score:2, Insightful)
Re:Computer science ? (Score:2, Insightful)
Making better engines uses the science of Physics and chemistry..
Cooking uses the science of chemistry..
To me it's like saying : 'Lego Science'.. It's not 'science'.. You don't need to know the physical properties of a Lego block to assemble something.. Although you need some insight into how the thing works - but it's not science per se !
Then again, it depends on how 'science' is defined !
--Ivan
Teh Maths (Score:5, Insightful)
But the kicker is that you can't just tell a student to "study vector math" because one day they'll write a Quake mod, because, truth be told, they probably won't. That's the trouble with every example you give when students ask how math will be useful - I could pull any number of examples from my own life, but the problem is, they probably won't happen in a student's life. Instead, they'll have their own trials. The best you can tell someone is to study all the math they can, because some day it *might* be useful, and they'll want to have that tool in their toolkit.
And that's just not a very satisfying answer to students who want to make sure that they'll be damn well using what you're teaching in the future.
But believe me, I thought I'd never have an application for eigenvectors, and now not only do I have to dust off my brain on the topic, but I have to parse someone else's code (PhD thesis code no less) and add functionality to it. Two other friends of mine got stuck on legacy Fortran apps which are essentially mathematical solvers (one for differential equations, the other for huge linear algebra problems), and both of them are extremely happy they paid attention in their respective math classes.
So, yeah. To CSE students out there: take math. Pay attention. It could very well save your neck some day at a job, and if it doesn't, at least try to make it interesting to yourself by thinking of applications where you might use it. You can find applications quite easily for all the math taught in the first two years of college.
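For the eigenvector case above, the workhorse in many legacy solvers is some flavour of power iteration. A toy version (plain Python, 2x2 matrix chosen purely for illustration):

```python
def power_iteration(A, steps=100):
    """Repeatedly apply A and renormalise; the vector converges to the
    dominant eigenvector and the scale factor to the dominant eigenvalue."""
    v = [1.0, 0.0]
    norm = 1.0
    for _ in range(steps):
        w = [A[0][0] * v[0] + A[0][1] * v[1],
             A[1][0] * v[0] + A[1][1] * v[1]]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = [w[0] / norm, w[1] / norm]
    return norm, v

# [[2, 1], [1, 2]] has eigenvalues 3 and 1; the dominant eigenvector is ~ (1, 1).
val, vec = power_iteration([[2.0, 1.0], [1.0, 2.0]])
assert abs(val - 3.0) < 1e-6 and abs(vec[0] - vec[1]) < 1e-6
```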
Re:As if computer science wasn't stunted enough (Score:5, Insightful)
People have been fucking saying this about various versions of BASIC since the beginning. Instead of trashing it, what did BASIC's various incarnations teach us?
It taught us that Microsoft could roll what amounts to a scripting language into its Office line and make the programs ever more powerful, without users having to learn something completely new and difficult. An education in just about any language, a book or a list of commands, and some time, and you will have a fully functional module or two that saves you a ton of time and energy.
I honestly think a lot of the hostility, here, towards VB has to do with the fact that now pretty much anyone can write code and that it's from Microsoft. If you're somehow saying that if they used C/C++ or even Perl that their code would somehow be wonderful or safe, you're insane.
COMPUTING IS HARD. You can't dumb it down just because it would be nice to do so. And I'm sorry but mathematics is just the way in which meaning is expressed for machines. There's no free lunch here. And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.
I agree, and while most applications require this, if you look at VB as a way either to get people started coding or to do quick things because it's built into the system, without worrying about math-based algorithms, it serves its need.
I'm no math whiz but I can write code (in languages other than VB) and so can plenty of others. Enough putting people down and being on your high-horse because you write in such and such. Math is important to CS and so is easy access to be able to write code.
Re:Computer science ? (Score:3, Insightful)
The Science in Computer Science consists largely of niches carved out of other disciplines e.g. algorithm analysis and crypto are mathematics, user interface design is psychology, computer graphics is really about approximating physics, audio compression is mathematics, psychology and physiology, AI steals ideas from biology... every now and then we find out that the physics department, or the electrical engineers, or the chemists, are actually doing almost identical research to us.
Anti-Intellectualism (Score:3, Insightful)
Just as few telescope makers are astrophysicists, most programmers aren't computer scientists. The author himself is evidently not one. Instead, he is one of the more vocal members of an angry, ignorant mob trying to burn down the edifice of computer science. Its members do not understand it, so they fear it and try to destroy it --- look what's happened to computer science at universities!
It was bad enough when courses about a programming language replaced ones about algorithms and data structures (I'm looking at you, Java and D-flat). It was bad enough when pure profit became the raison d'etre of computer science departments. It was bad enough when I noticed my peers start to joke about how they didn't care about this "maths bullshit" and just wanted to earn more money. It was bad enough when the object, not the bit, became the fundamental unit of information.
But what this author advocates is still worse. He's proposing that we replace the study of computer science with vocational programming, and call that emaciated husk "computer science." We already have a "theory of process expression", and that's the rigorous study of algorithms and data structures. We've constructed that over the past 50-odd years, and it's served us quite well.
That field has given us not only staples, like A* pathfinding, but a whole vocabulary with which we can talk about algorithms -- how do you say that a scheduler is O(log N) in the number of processes except to, well, say it's O(log N)? You can't talk about computer science without talking about algorithms.
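To make the vocabulary point concrete, here's roughly what "an O(log N) scheduler" means in code (a toy, not any real kernel's scheduler): picking the next process out of a binary heap costs O(log N) in the number of queued processes.

```python
import heapq

class Scheduler:
    """Toy priority scheduler: add and next_process are both O(log N)
    in the number of queued processes, courtesy of the binary heap."""
    def __init__(self):
        self.queue = []

    def add(self, priority, pid):
        heapq.heappush(self.queue, (priority, pid))   # O(log N)

    def next_process(self):
        return heapq.heappop(self.queue)[1]           # O(log N)

s = Scheduler()
for prio, pid in [(3, "editor"), (1, "interrupt"), (2, "daemon")]:
    s.add(prio, pid)
assert [s.next_process() for _ in range(3)] == ["interrupt", "daemon", "editor"]
```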
The author's fearful denunciation of algorithms is only one manifestation of the anti-intellectualism that's sweeping computer science. "We don't need to understand the underpinnings of how things work", the angry mob chants, "but only implement the right interfaces and everything will happen automatically."
The members of this angry mob sometimes manage to cobble something of a program together, but it's more like a huge rock pile than a colonnade. It often barely works, uses too much memory, doesn't handle corner cases, and is likely to crash. (See worsethanfailure.com.) Members of this mob even think that if the same algorithm is expressed in two different languages, it's two different processes. People like this ask painful questions like, "i know quicksort in C# but can someone teach it to me in PHP?"
Argh.
Even "new" developments in programming are just new claptraps for old ideas, with fashions that come and go over the years. The only really new things are algorithms, and increasingly, we're calling people who couldn't independently create bubble sort "computer scientists." It's ridiculous. Call computer science what it is, and create a separate label and department for people who can program, but not discover new things.
It's this idea that one doesn't need to understand or think to be successful that's at the center of the article, and it's not just affecting computer science. Look around you. I wonder whether we'll fall into an old science fiction cliché and regress so far that we are unable to understand or recreate the technology of our ancestors.
Re:Computer science ? (Score:5, Insightful)
That said, I believe there's a useful set of relationships well understood in other fields:
Science = The search for fundamental knowledge and predictive models;
Engineering = The creative application of the results of science;
Technology = The routine application of the results of engineering.
giving us, for example:
Science: Physics
Engineering: Electrical engineering
Technology: TV Repair, Cable TV Installation
The punch line is that application of this model to computing works as follows:
Science: Mathematics
Engineering: Programming, Informatics, "Computer Science"
Technology: Coding, Computer Installation, Home Computer Repair, etc.
Mathematics IS the science in "Computer Science".
Anyone who has studied advanced Mathematics knows that Math is not about numbers; think of mathematical logic, Boolean algebra, abstract algebra, set theory, topology, category theory, etc. ad infinitum. Dijkstra defined Mathematics as "the art of precise reasoning". In the same sense, "computation" doesn't mean "number crunching", but more generally the automated manipulation of information.
It is true that there are legitimate concerns in today's computational landscape (networking, concurrency, etc.) which didn't figure in the mathematical/engineering world view of the 1940s, but that's simply a sign that the field has grown up (i.e. grown beyond the limited perspectives of its founders). That's also true in many other applications of Mathematics. For example, early research in differential equations paid much more attention to linear differential equations (because they were more tractable). However, we now know that most "interesting" systems in the real world involve non-linearity.
Science, Engineering, and Technology share with living systems an important rule: "Grow or die!" Fortunately, the field of computing has grown.
I think the author is making a more subtle point (Score:5, Insightful)
If you buy that argument, then treating CS as if it were merely another branch of mathematics will not help solve those problems.
Of course, this also takes us into the perennial debate over where to draw the line between "computer science" and "software engineering". One could certainly define away the author's problem by saying that his examples are software engineering issues rather than computer science issues. And it's true that software engineering has been driving a lot of the theory with respect to expressiveness (design patterns and the like). But that view also seems to really impoverish computer science - if all you leave the field of computer science is the stereotypical mathematics, why not just become an applied mathematics major?
Re:As if computer science wasn't stunted enough (Score:4, Insightful)
It's just not that easy to do some of the cool stuff we want to do. No amount of wishing it were different is ever going to change that.
Re:Sure thing Einstein (Score:2, Insightful)
Re:So I am not alone (Score:2, Insightful)
The majority of _your_ work might be.
Re:As if computer science wasn't stunted enough (Score:4, Insightful)
No, the hostility is because now pretty much anyone THINKS he can write code, which lowers the valuation of people who actually can do it. That lowers software quality on two fronts: people who can program are forced to write lower quality code because they need to write more to compete with too many amateurs (in the derogatory sense of the word), and people who can't really program write code that doesn't handle errors properly and fails, often silently and undetected, when the input deviates from the expected.
Some people shouldn't code production systems (Score:5, Insightful)
That said, I'm sure you're good at what you do. I bet you can write good code in VB, as well as many other languages. This isn't a personal insult. VB, PHP, and other brutish languages are equally bad in my eyes.
These languages are brutish because they oversimplify key concepts. That oversimplification also makes them attractive to new programmers, and new programmers universally write terrible code. The languages themselves aren't bad, the coders are. That said, more experienced coders will generally choose more capable languages, so most of the time, a program written in a brutish language will be a bad one.
We need fewer programmers, not more. Maybe professional certification would help somewhat.
(Incidentally, we were lucky that Javascript became the de-facto client-side web language. We could have done far, far worse, and although we can change server languages, we can't change a user's web browser!)
Re:Applied mathematics (Score:3, Insightful)
I agree... (Score:1, Insightful)
If you've never had to deal with someone else's poor work then you [are the luckiest bastard on the planet, but more likely you] may want to consider a career change...
Re:Applied mathematics (Score:3, Insightful)
But you don't engineer a bridge by thinking about the interaction of individual atoms, not because that isn't the "right" way of doing it, but because it takes too long and is too expensive.
The article makes a good point saying that the obsession with mathematics at the exclusion of all else in computational theory is not necessarily a good thing for the IT field. Mathematics is at such a low level of abstraction that it is mostly useless when it comes to thinking about solutions to most of the problems "real world" software architects (outside universities) run across, like large-systems architecture, parallel computing, and most classes of high-level optimization. As a result, "real world" software architects mostly ignore the improvements in the field of theoretical computational science, since it has little bearing on what they do.
Most notably: a big problem right now facing the IT field is the end of Moore's law, and the growing need to parallelize everything. What we see in practice is that most programmers don't really know how to write multi-threaded code, and as a result few applications are multi-threaded. The solution here is not to require all programmers to be CS grads, because that is too expensive and a big waste of resources. A programmer shouldn't have to know about loop invariants, just like a mason shouldn't have to know about load distribution between pylons. The solution is for the theoreticians to focus on something useful to real world architects.
But, let me be clear about this: a software architect should know their mathematics, just like a bridge architect should know their physics. If you don't know why you're designing a system a certain way, you can't know it is the right way to design it like that.
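The parallelism gripe is easy to illustrate: even trivially parallel work trips people up when they skip the coordination step. A minimal worked example (Python threads; names illustrative):

```python
import threading

def parallel_sum(data, workers=4):
    """Split a sum across threads, each writing only to its own slot,
    then combine - the join() is the part beginners tend to forget."""
    chunk = (len(data) + workers - 1) // workers
    partials = [0] * workers

    def work(i):
        partials[i] = sum(data[i * chunk:(i + 1) * chunk])

    threads = [threading.Thread(target=work, args=(i,)) for i in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()   # wait for every worker before reading partials
    return sum(partials)

assert parallel_sum(list(range(1001))) == 500500
```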
And what constitutes the growth environment? (Score:1, Insightful)
Re:Math not essential - Logic is! (Score:2, Insightful)
I think the problem is that people are completely ignorant about what mathematics is. In school you get taught sums, a bit of trigonometry, quadratic equations and calculus. That is the extent of most people's exposure to mathematics, but maths is much, much broader than this and much more powerful and useful.
However, given that you are just a programmer or software consultant, you are correct in saying that you don't need to worry your pretty little head about the bigger picture.
CS - MA = IS (Score:3, Insightful)
It covers networking, scripting, database management, web design, hardware, etc. It's computer science without the science.
Also, Computer Science != Programming:
Lemme guess (Score:5, Insightful)
Computers are (by their very definition as well as by the word used to describe them) mathematical machines. A computer can essentially do NOTHING BUT calculate. It can in its core add, subtract, shift and move data around. How is this supposed to work without math?
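Indeed, even multiplication is built out of exactly the primitives the parent lists. Shift-and-add, sketched in Python:

```python
def multiply(a, b):
    """Multiply two non-negative integers using only the operations
    named above: add, shift, and test a bit."""
    result = 0
    while b:
        if b & 1:          # lowest bit set -> add the shifted addend
            result += a
        a <<= 1            # shift addend left
        b >>= 1            # shift multiplier right
    return result

assert multiply(13, 11) == 143
assert multiply(0, 99) == 0
```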
Hammer, nail (Score:2, Insightful)
"If all you have is a hammer, everything looks like a nail" applies to some degree to the responses to the review.
We use math for these machines because that's how they were designed. They didn't have to be, although from our perspective a half-century on, it seems impossible that they might work any other way.
Computers may need math because of how they were created, but consider that an animator didn't need math to animate, rotate or transform a figure. Though it may be reduced to math, an artist doesn't need math to give depth, shading and perspective to an image. In fact, computers make such analog tasks incredibly math-intensive, as a previous poster noted.
Despite the depth and complexity of the resulting orchestrations, no math created -- though it may describe aspects of -- Beethoven's Ninth Symphony. Learning language and grammar remain elusive to mathematicians, and even Chomsky's "universal" theories end up flummoxed by the Pirahã language. The multiple readings of T. S. Eliot's The Waste Land would take more time to track than the Internet in real time.
Even in the sciences from antiquity, increasing description and formulation result in increasing complexity, but not necessarily increasing understanding. Earth, air, fire and water made sense in societal context; then extended elements; then the periodic table; then subatomic particles, light as particles and waves, and behavior of quarks. Magnetism remains elusive, as does an elegant theory of everything.
Each of these may use math as a description or even a tool, but the careful tuning analysis of the different kinds of gamelans applies only to the analysis, not to the gamelans themselves. The reference is to itself, and the gamelans go on with or without analysis.
In other words, were our computers not initially based on creating algorithms to manipulate the basic elements chosen to operate them, impelling the ultimate triumph of binary data over other representations, math might have receded to its place as just one tool of computer activity.
Dennis
We Are All Mozart [maltedmedia.com]
Re:Some people shouldn't code production systems (Score:4, Insightful)
I agree professional certification may help improve software in critical areas. Hell, at my workplace we sometimes hire EEs over CS grads if they're capable of learning to code. CS certification would probably improve our applicant pool (but we probably couldn't afford to hire them). It seems to me, though, that it would also have many deleterious effects if the requirement were applied with too wide a brush.
Re:Math not essential - Logic is! (Score:3, Insightful)
Math isn't important to software engineering, but it is of great importance to computer science.
Re:Well... Yeah (Score:3, Insightful)
Yes, because that's how you design and implement a file system.
Re:Some people shouldn't code production systems (Score:5, Insightful)
Certifications aren't worth the paper they're printed on. (The same, it seems, goes for degrees.)
Re:There's A Point Here (Score:3, Insightful)
To use your War analogy, Math is not War, but Math is necessary for War. (Unless you like losing, of course.) Someone may have done all the mathematics long ago, and stored it in a computer for you to use, but it's still necessary. You can be infantry in a war without knowing how to add. Heck, I'd bet you could even be a low-level official without anything higher than elementary school math.
Programming is the same way. To use a PC, or script something up in VBScript, no math is necessary at all. To write a compiler (without which, computers can do nothing useful), you need college-level math. And for some applications, you need all the math that's known to humans.
For years I've heard this same 'you don't need math to program' argument, and it's like saying you don't need roads to drive cars on. Sure, it's -possible-, but it's far from efficient and you're very limited as to what you can do with it.
Re:Applied mathematics (Score:5, Insightful)
Well, actually you do, and in multiple aspects too. From designing and evaluating the longevity of the materials used, to the interactions between components, those aspects must be considered in multi-million dollar projects such as bridge building.
Besides that, civil and structural engineers also have to consider the mechanics of materials and also wave propagation. What field of science covers that? Physics, of course.
No it doesn't. The only point it makes is that the author does not have a clue about what mathematics is. Mathematics isn't algebra or calculus. Math isn't adding up numbers or multiplying things. Mathematics is structured deductive reasoning, which builds up from a solid starting point (axioms) and serves to represent, communicate and help with the construction of ideas. Those tasks are the basis of computer science, whether you are writing an operating system, designing a database, building a network or even creating a new language/protocol. If you take math out of the picture, you simply cannot do anything in CS, whether you actually know you are applying mathematical concepts or not.
Re:Computer science ? (Score:2, Insightful)
I'd say that Computer Science (or better, Computational Science) should consist of logic, Boolean algebra and so on....
And should be a separate science.
Re:Idiotic. (Score:3, Insightful)
I started programming at 5 - boolean algebra was the first maths I learned, because it flowed naturally from learning programming, though it took a few years before I knew it had a name. But really, boolean algebra is just logic with symbols.
You mention programmers/software engineers and computer scientists separately, and you're right to. The two have about as much in common as a builder and an architect - they'll share some vocabulary and some understanding of methods, but what they need to do their jobs are vastly different.
I enjoy reading CS research papers, and I have an interest in some subsets of CS - particularly compiler design - but I don't particularly enjoy maths, and I tend to avoid maths-heavy papers simply because my interest in CS is a hobby and maths-heavy papers take more effort. (In compiler design you need very little maths apart from some very basic graph theory anyway. When people write maths-heavy papers on compiler design, it tends to be a sign they don't understand what they're writing about well enough to explain it plainly - so far I've seen very few exceptions to that.)
But ultimately CS isn't my career - software engineering IS. The two are different fields, and it's time people actually realize that... More importantly, it's time more schools realize that, and start offering differentiated computer science and software engineering degrees.
Someone with an MSc or even a PhD in Computer Science can easily be useless as a software engineer. You wouldn't expect an architect to be able to step right into the job of a builder, after all, and you'd be skeptical of the choices of someone who trained as an architect but wanted to become a builder. I've had to deal with my share of highly educated "software engineers", and frankly none of the best software engineers who have worked for me have had anything above a BSc in CS. Many of them had no degree, or unrelated degrees that gave them a good appreciation of the specific domain they developed software for, whereas very few of the people I've hired with MScs and PhDs in CS have done particularly well (there are the odd exceptions). The pattern is marked enough that an MSc or PhD in CS has become a warning sign for me: it makes me probe actual engineering skills a lot more thoroughly, and ask some pointed questions about what drove them to pursue their degrees and why they subsequently went into software engineering.
But even in CS, the extent of the maths you need depends massively on your focus. As I mentioned, compiler design rarely needs much maths (some people use more, but not because it's necessary - people like different tools), and a lot of other areas use only some small subset or other of maths.
I hardly took any maths at university, and it's rare for me to come across CS papers, even outside of compiler/programming language design, that I'd have any problems following due to the maths content. What maths content there tends to be is most often limited enough for context alone to be sufficient to get most of it. When I do run into problems, I can usually easily find papers that have no trouble expressing the same information without much maths, which signals that it's very much a communications issue rather than something inherent to the problem. The cases where the maths is so integral to the message that it does much beyond reducing the potential audience are very limited.
Unnecessary use of maths in CS papers is one of my pet peeves. I'm not advocating "dumbing down" research, but scientists shouldn't reach for "big words" when there is no reason to.
Re:Teh Maths (Score:3, Insightful)
On the subject at hand though, the real key to why math is needed in Computer Science is due to the analytical nature of the subject. If you do not know how to solve problems, then you will hit many dead-ends in Computer Science. Math isn't just about solving mathematical problems, it is about looking at a problem and working out a solution. Computer Science is about looking at a problem and working out a solution.
So, you can take the math out of Computer Science, but the training that you get from learning how to solve math problems can be applied directly to computer science. As a major, Computer Science is or should be as much about how to come up with solutions as it is about knowing how to do this and that.
So you can have one person who takes the same computing subjects as a computer science major, but the computer science major will tend to be better at figuring out how to break a problem down, thanks to the math that is part of the major's requirements.
Re:Yes and no (Score:3, Insightful)
Whether that is a sensible way to look at things or not really depends on your viewpoint. I'd argue it's pointless.
That you can explain something using maths doesn't mean that everyone thinks about maths or "use maths" in any conscious way when they do that something.
I could do the transformation in your example before I'd ever heard of boolean algebra, and learned to spot it without having to think much more about it after having thought it through step by step a few times. My guess would be I figured it out at 7-8 years old, based on what I remember of the complexity of my programming back then. I'd argue that I was/am not "using" De Morgan's law, but just learned a pattern by rote that I understood through language, not maths.
If you still insist on calling it maths, then fine. But then the logical extension is to conclude that people complaining there is "too much maths" in CS are highly unlikely to be complaining about basic stuff like that, which people can/will figure out without any background in maths as/when needed.
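Whatever you call it, the pattern in question is easy to state, and checking De Morgan's law over every possible input is a two-liner:

```python
from itertools import product

# "not both a and b" rewritten as "not a, or not b" - the rewrite many
# programmers learn by rote long before they learn its name.
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
    assert (not (a or b)) == ((not a) and (not b))
```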
I could care less about Computer Science (Score:4, Insightful)
What I'd like is an arts program that concentrates on programming. I'd like something that stresses *reading* and *writing*. I want people to learn how to *communicate* in these programming languages; not just with the computer, but also with their fellow programmers. I'd like people to do research in language design where they ask the question, "How can I allow the programmer to be more expressive about their intent?" I'd like classes on collaborative design. I could go on forever.
I was at the art gallery the other day and wandered into the modern art section. They had a display of a particular type of performance art where someone would write out a description of an artwork on 3x5 index cards. A bunch of other artists would take the description and make the art. Along with the index cards and pictures of the finished work, there were a couple of letters. The letters described the disappointment the original artist had in the finished work. They even went so far as to accuse the artists following the instructions of being "incompetent".
I described this to a programmer colleague of mine. His response was, "Wow... I didn't know I was a performance artist". I can count the number of times in the last 20 years that I've had to do hard math in my job as a programmer on my fingers. But questions like, "How the hell did you think *that* was readable", "How can I turn a bunch of requirements into something that isn't crap", "How do I get 10 guys working on a project and have a single vision", etc, etc, etc; those questions I ask every day.
Sure computer science is important and personally I think math is a part of that. But, someday I hope someone will realize that programming is an *artistic* endeavor and we need to do a crap load of research in that area.
Re:Computer Science != Software Engineering (Score:5, Insightful)
Disagree (Score:3, Insightful)
Algebra is an obvious key to understanding computation. Discrete mathematics including probability and combinatorics tend to pop up in computing problems over a wide range of disciplines.
On the other hand, it would not be unfair to suggest that computing is more useful to calculus than calculus is to computer science. Continuous mathematics, like calculus, shows up rarely if ever in most computer science specialties.
Fant also seems to be stuck on the word "algorithm." Computer scientists have a very different definition of an algorithm than mathematicians. LISP was the only moderately successful attempt to introduce computer scientists to the mathematical notion of an algorithm. I'll take the groans and dearth of hands raised to the question, "Is LISP your primary programming language?" as proof of just how little regard computer scientists have for the mathematical notion of an algorithm.
Re:Damn straight! (Score:4, Insightful)
My last honors math class had 3 boys in it, out of twenty.
Re:As if computer science wasn't stunted enough (Score:5, Insightful)
But not for humans.
I have no idea what you just said, and I've been coding for years.
I agree that computing is hard. Well, I find it easy, but I agree that, in general, if you're going to use a computer, you're going to learn some logic, and I will not help you to avoid thinking.
But 99% of the apps I write require little to no knowledge of mathematics, beyond basic algebra and arithmetic, and maybe a few binary tricks. In particular, Calculus was a required course for CS in college, and I have never used Calculus in my computing, even programming. Ever.
I have not read that book, but I would argue that a big reason computer science is stunted is this absurd relation to mathematics. You need better-than-average communication skills if you're ever going to work on a team -- at least to comment your code. You need philosophy -- yes, philosophy! -- at least as much as you need math, and a decent logic class would be even more useful. And you need research skills a bit beyond justfuckinggoogleit.com, if you're going to RTFM -- considering you may have to go looking for a manual, or even the right software to begin with.
Re:As if computer science wasn't stunted enough (Score:3, Insightful)
No, apparently not.
I work for a major UK public service, working on tools and content surrounding an international standard. A lot of the code is VB6 and VBA. And the bulk of it has "On Error Resume Next" at the top of every routine. One of our contractors has an IDE plugin that inserts this piece of code automatically (the very idea of this is enough to make me froth at the mouth). The rationale is twofold;
Neither of these is an excuse - where you expect errors, handle them. Where you don't, present them to the user, log them, email to the Pope, whatever your environment demands. Ignoring them is setting yourself up for immense pain, in the form of "it just doesn't work" bug reports from users that take an epic amount of time to resolve.
This isn't helped by the general style of the code, which is heavily error-dependent; i.e., it uses error conditions to check for things like whether an item is in a collection or not. The VBA.Collection class has a lot to do with this, as it doesn't provide another means of checking for contents.
But I know for a fact there are places where actual logic errors occur in normal use. It's one of those situations where a large part of the requirements are expressed as the codebase, and the specifications are so vague and complex that it's just easier to heave a sigh and accept it. Where possible, I replace the error handling with something more structured, and it's actually possible to get a stack dump out of a lot of the code when things go bad now.
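To make the anti-pattern concrete, here's a rough Python sketch of what that error-dependent style amounts to (my own illustrative code, not the actual VB6 codebase):

```python
# The anti-pattern: using an exception as a membership test, roughly
# what VBA code ends up doing when VBA.Collection has no Contains method.
def contains_via_error(collection, key):
    try:
        collection[key]          # raises KeyError if the key is absent
        return True
    except KeyError:
        return False

# The structured alternative: an explicit membership check, so real
# errors aren't swallowed along with the expected "not found" case.
def contains_explicit(collection, key):
    return key in collection

items = {"a": 1, "b": 2}
assert contains_via_error(items, "a") is True
assert contains_via_error(items, "z") is False
assert contains_explicit(items, "b") is True
```

The point isn't that exceptions are evil, it's that when errors double as control flow you can no longer tell a genuine fault from a routine lookup miss.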
There's no excuse. Yes, it makes your codebase and your binaries larger (but importantly, not slower). But you can have "proper" error handling in VB6. At my previous post, where I was the only programmer brought in to maintain another train wreck, they went from scratching their heads and guessing where the error was to knowing to the line where the error occurred and which values passed as parameters caused it. And my effort was running some automatic code inserters, rewriting any existing error handling, and recompiling the binary.
Re:Damn straight! (Score:3, Insightful)
I dunno. The guy's argument from the article seems kinda flawed.
"A logic circuit is not a sequence of operations."
No, it's a subset of a sequence of operations. It's a component that fits into a deterministic set thereof, and *should* be calculated via boolean or classical arithmetic.
"An operating system is not supposed to terminate, nor does it yield a singular solution."
Then what's "Shut Down" do? And while it doesn't yield a singular solution, it yields a given solution for a given set of inputs.
"An operating system cannot be deterministic because it must relate to uncoordinated inputs from the outside world."
Non-sequitur. Inputs and processes being parallelized does not preclude the individual logical paths from being deterministic, even if the logical paths use each others' states as inputs.
"Any program utilising random input to carry out its process, such...is not an algorithm."
An input does not determine whether something qualifies as an algorithm; even if the random value is a preferred or generated input, it counts as external to the program.
Not to mention that internally generated randoms are algorithmic in nature.
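Case in point, a minimal linear congruential generator: "random" output from a fully deterministic recurrence (constants are the common Numerical Recipes ones; this is a toy, not a recommendation):

```python
# A minimal linear congruential generator. The output looks random,
# but the process is an algorithm in the strictest sense: same seed,
# same sequence, every time.
def lcg(seed, n):
    x = seed
    out = []
    for _ in range(n):
        x = (1664525 * x + 1013904223) % 2**32
        out.append(x)
    return out

# Deterministic through and through.
assert lcg(42, 3) == lcg(42, 3)
```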
No, seriously. It seems that his entire argument is directed towards changing semantics to take the emphasis off of the mathematical underpinnings of computer science. Rar.
Re:Computer Science != Software Engineering (Score:2, Insightful)
If you are gong to go into "Computer Engineering" designing chips, hardware, developing new low level languages, creating the next user interface layer, writing new operating systems, etc.... then you need mathematics. There is no way you can do your job without it. Maybe even a dual degree in Computer Engineering and (Mechanical or Electrical Engineering).
If you are going to develop scientific and engineering based applications you need a strong understanding of mathematics.
If you are going to design business applications, do systems integration, web applications, etc... then you do not need a strong background in mathematics, but you would stand to benefit from a management, accounting, finance, English, and psychology background; because you more than anyone else are going to be working with the end user. If you cannot win over the end user with your design, functionality, user interface then they will not want to use your product.
Re:wahay! (Score:5, Insightful)
Re:I wholeheartedly agree (Score:3, Insightful)
Re:The True Nature of Computing (Score:5, Insightful)
From here you've both made a giant leap to assume that programs can't be described by an algorithm. You haven't understood that the difference between a "computation" and "reactive software" is actually a technical triviality that is easily overcome. Indeed it is so trivial that most languages simply ignore it and have stateful operations for input/output. Reactive programs are normally modelled as a sequence of algorithmic steps; everything that the program does apart from sending / receiving data is modelled by an algorithm. So we can either consider this "non-algorithm" to be a sequence of algorithms or consider the program as an algorithm operating over a larger state that includes the environment. The input/output actions become algorithmic state transitions over the program/environment state. Look at the way programs in CSP/CCS or other process algebras are written to see how this works. To see how the theory of algorithms can be applied to reactive systems, take a look at multi-headed Turing Machines.
Finally, if you're going to lob a technical term into a discussion then you should understand what it means. Automaton is a well defined term in CS, and it doesn't mean what you think. In particular, what you are describing is not a decision problem, and so there is no problem of language recognition to be solved. I vaguely remember reading the crank research that you are pointing to before, and would like to ask you a simple question. Name one problem that you believe can be computed by a UBM, but not by a UTM?
Re:Math not essential - Logic is! (Score:5, Insightful)
Re:Computer Science != Software Engineering (Score:3, Insightful)
Re:Damn straight! (Score:5, Insightful)
mathematics as a base for CS was great in the 50's and 60's, but the real problems in computer software are people problems, not algorithmic ones. Once you program a calculation correctly, it generally stays that way.
But determining the optimal layout of a form to benefit the users of the system requires observing people and their needs. Understanding what parts of a program are going to be changed because of changing user needs is more important in program design than deciding whether you need a heap sort or insertion sort. Yes, you should know the difference, but you seldom need to program it, just choose the correct one from the system library.
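To the parent's point about reaching for the system library instead of hand-rolling the sort, in practice the choice collapses to one call (Python shown; its built-in sort happens to be Timsort, but you never needed to know that):

```python
# Hypothetical records: (name, id). The "which sort algorithm?" question
# is answered once, by the library; you just supply the key.
records = [("carol", 3), ("alice", 1), ("bob", 2)]
by_id = sorted(records, key=lambda r: r[1])

assert by_id == [("alice", 1), ("bob", 2), ("carol", 3)]
```

Knowing *why* the library sort is O(n log n) is the CS part; knowing *when* to just call it is the engineering part.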
CS graduates tend to design programs for machine efficiency, not human efficiency. But it is humans that are expensive, not machines.
Re:Damn straight! (Score:5, Insightful)
You are a programmer, not a computer scientist. I'd hire you to write code based on a specification. I wouldn't hire you to design rendering algorithms. It is too bad they didn't teach you the difference between compsci and programming during day one of your CS program.
Holy ignorant masses Batman! (Score:5, Insightful)
"The notion of the algorithm," he concludes "simply does not provide conceptual enlightenment for the questions that most computer scientists are concerned with."
The assertion that computer science is not math is similar to the assertion made in the book "The World is Flat" saying the world is now "flatter" than it used to be. In the case of the flat world, Friedman (the author of "The World is Flat") claims the world is flat to create a sense of shock that he can then use to get his message about globalization across. In the case of "computer science is not math" Fant here is trying first to shock as a method of capturing attention...
Most Americans use math in the singular. The Brits say maths. That is because there are multiple branches of mathematics. What we are discovering is that the tie between arithmetic and calculus and computer science is falsely reinforced. The fact is there are other branches of mathematics that are more important to computer science. There are also many new branches of mathematics that need to be developed in order to solve the new kinds of problems we are trying to solve in modern computer science.
I am really bothered by programmers who, when I interview them, say they have been writing software for years and can't remember ever having to use math.
I know they can't possibly mean that... or they don't know what math is...
I know that in several years of programming you must have at least been tempted to write an if statement or at least one loop of some kind.
The if statement uses a form of algebra called boolean algebra. It was named after George Boole [wikipedia.org], who was very much a mathematician. There are many programmers today who use the if statement, and this form of mathematics makes up a large part of many programmers' jobs. I guess it must be falling out of fashion.
I know how to perform boolean algebraic operations on a white board, and I have many times been confronted with a gigantic morass of if and else if statements and, using simple truth tables and a little boolean math, have reduced enormous sets of ifs down to just a few.
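A tiny illustration of that kind of reduction (the conditions here are hypothetical, but the equivalence check is exactly what a truth table does):

```python
from itertools import product

# A nested morass of ifs...
def messy(a, b, c):
    if a:
        if b:
            return True
        else:
            if c:
                return True
            else:
                return False
    else:
        return False

# ...which boolean algebra reduces to a single expression:
def clean(a, b, c):
    return a and (b or c)

# Exhaustively verify the reduction over all 8 rows of the truth table.
for a, b, c in product([False, True], repeat=3):
    assert messy(a, b, c) == clean(a, b, c)
```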
The new computer science needs to focus on solving problems involving processes. Processes are like algorithms in that they have a set of instructions, but they are unlike algorithms in that they also have many temporal components, may exhibit parallelism and asynchronous invocations, and may not have a finite product. These are the types of problems addressed in newer mathematical disciplines that are trying to see information processes not as tied to computing machinery but as tied to the natural world.
Computer Science may point to a new kind of science that describes an underlying natural computational order of the universe. We are starting to observe computational processes everywhere, in the brains of animals, to the interactions of ecosystems, to quantum mechanics. We may lack the right mathematics to describe these things and we may have to invent new kinds of math but that doesn't mean that math becomes unimportant. An understanding of math can help when studying logic and so too would it help in studying any new disciplines that we may need to invent.
New kinds of math are invented every day to describe new kinds of problems. To say you don't need math to study any formal science let alone computer science is just silly. It is just something shocking to say that grabs attention... and the article nearly contradicts itself by the end... and it's only 7 paragraphs. The distinction Fant makes is nearly academic. Just as the distinction between a Statistician, a Geometer ( a mathematician who studies geometry ), and a Logician is academic. Yet that is not what the readers of the headline will read... Fant is arguing to make computer science a new kind of science much as Wolfram has. Yet it would be sil
Re:wahay! (Score:3, Insightful)
Re:As if computer science wasn't stunted enough (Score:4, Insightful)
I wouldn't use the word "painless". Error handling will always be painful in the general case. It's like going to the dentist. It's painful, but if you don't do it, you can predict endless, total pain later on. But I think you knew that.
A haha haha haha. That's just great.
But I suspect your real problem is that no one replied: "Good for you. Now go on and use version control, or the guards will see you out". You're describing a situation where critical systems are written by people with an unprofessional attitude to their work (not using the tools they should know they need), and no one (except for an AC on Slashdot) appears to check their work.
I hate micro-management as much as the next guy, but somehow it's more attractive than no management at all.
Re:wahay! (Score:3, Insightful)
I think it's not so much an "if" as a "when". Maybe without Einstein, E=mc² wouldn't have been known for another 20 years. Imagine how drastically that would have changed the 20th century. Now if Jobs hadn't had this artistic side to him, and that offset GUIs by 10 years, then things like the internet and the adoption of PCs might well be at about the 1997 level right now. And that's assuming that the current Federal administration would have pushed for the internet in the same way that Gore did.
Rigorous thinking (Score:1, Insightful)
The world is full of people who oversimplify what is involved in programming (like my boss, for example...). There is a tendency to leave out relevant variables to make the problem at hand seem simpler to solve, and simpler to implement, than it really is. Because of this, a lot of people think programming is easier than it really is, and they also write programs which seem to get the job done but which wind up causing them problems further down the road. Such people usually aren't big math geeks themselves, don't see the relevance, and are very unlikely to ever become truly great programmers. When things get difficult, they will either run to a truly great programmer (who probably knows a lot about math, too), or try to make the case that the problem cannot be solved (either because there is no solution or because Microsoft didn't provide a robust enough toolset).
The times when you need to actually perform some calculus in order to write a program are quite rare (depending on the nature of your product, of course). Mostly basic arithmetic is all you need most of the time. However, the mental skills that are developed from the study of mathematics are crucial for the ability to program well.
I will also predict that the likelihood that a person will agree with me on this increases in direct proportion to their years of programming experience in the field (i.e., after completing school).
What math do you need? (Score:3, Insightful)
What math do you need in computer science today? It's a tough call. But today, I'd study number-crunching rather than discrete math.
I have a classical computer science education - automata theory, number theory, combinatorics, mathematical logic - all discrete math. That's what CS was supposed to be about in the 1980s. It hasn't been enormously useful, and I'm writing this as someone who ran a successful project to develop a proof of correctness system. Mathematical logic gets used a little, but tree-like algorithms are more important. I'm not sure automata theory is useful for much of anything. It's one of those things, like proofs in plane geometry, taught in schools because it was the first theory in the field.
Number-crunching, probability, and statistics seem to be more useful today. If you do anything in graphics, you have to have the 3D transformation math down cold. I've had to do tensor calculus and integration of non-linear differential equations (I used to do physics engines for animation) and that required plowing through difficult math books and getting consulting from university math departments. Bayesian statistics have become very important - it's used in spam filters, search engines, computer vision, and most things that have to intelligently reduce messy real-world data. I didn't get enough of that at Stanford.
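The Bayesian spam-filter case the parent mentions is worth making concrete. Here's a toy naive-Bayes spam score in Python; the word probabilities are invented for illustration, not trained on any real corpus:

```python
from math import log

# Illustrative per-word probabilities (made up, not trained):
# P(word | spam) and P(word | ham).
p_word_spam = {"viagra": 0.8, "meeting": 0.1}
p_word_ham  = {"viagra": 0.05, "meeting": 0.6}

def spam_log_odds(words, prior_spam=0.5):
    """Sum of log likelihood ratios; positive means 'leans spam'."""
    score = log(prior_spam / (1 - prior_spam))
    for w in words:
        if w in p_word_spam:
            score += log(p_word_spam[w] / p_word_ham[w])
    return score

assert spam_log_odds(["viagra"]) > 0   # leans spam
assert spam_log_odds(["meeting"]) < 0  # leans ham
```

Underneath every "intelligent" filter of this kind is exactly this sort of probability-and-logarithms arithmetic, which is why the statistics background matters.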
On the other hand, where are you going to use this stuff? Outside of Google, Microsoft, and universities, who does real CS research any more? All the good research centers (DEC WRL, HP Labs, IBM Almaden, PARC, etc.) are gone, or a shadow of what they once were.
Re:wahay! (Score:5, Insightful)
Oh, I see. All this time I was led to believe that Donald Knuth created TeX to satisfy the desperate need for a half-decent digital typography tool, but after all it must have been due to some class Steve Jobs took when he dropped out of college. Knowing that TeX remains to this day the best typesetting system, and knowing a bit about Adobe and the history of PostScript, I guess that half-baked assertion makes sense and must be true.
...or maybe not.
Please. Steve Jobs doesn't walk on water, nor is he behind every single thing which can be counted as progress in the computer world. This whole Jobs-worshipping thing is starting to become ridiculous.
Re:As if computer science wasn't stunted enough (Score:5, Insightful)
The book (which I haven't read, but I have come across enough crank bullshit over the years to quote verbatim) is based on the idea that algorithms are the wrong model for programs. It's a poor misguided idea based on a trivial technicality - an algorithm (by definition) takes an input, performs a computation, and produces an output. Programs do not, and are generally called reactive as they maintain a dialogue with their environment of many inputs and outputs. It's a technical triviality because, as the GP points out, you can take a series of algorithms and substitute them as the "guts" between each of the I/O operations. Nothing much is lost in this modelling. If you really need to analyse the bits that are missing then just make an I/O operation an atomic part of the model. Process calculi (used for concurrent and parallel systems) take this approach. If you really want to appease the anal fanatic cranks (like the book author) then just explain that all of their reactive components are parts of a large algorithm that encompasses their operation and their "environment".
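A minimal sketch of that modelling (names are mine and purely illustrative): a "reactive" program as pure algorithmic steps sandwiched between I/O actions:

```python
# Each step is the purely algorithmic "guts" between a receive and a
# send: a pure function from (state, input) to (state, output).
def step(state, message):
    new_state = state + [message]
    reply = f"ack {len(new_state)}"
    return new_state, reply

def run(inputs):
    # The "reactive" loop: receive input, run an algorithm, emit output.
    # Nothing here resists algorithmic description.
    state, outputs = [], []
    for msg in inputs:
        state, reply = step(state, msg)
        outputs.append(reply)
    return outputs

assert run(["a", "b"]) == ["ack 1", "ack 2"]
```

The never-terminating dialogue is just this loop run forever; each iteration is an ordinary terminating algorithm.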
But now to my point. I bet that you know more maths than you think you do. It's just that the type of maths that you learnt is not the type they teach you in school. It has nothing to do with real numbers, or calculus. It's a branch called discrete maths that is the heart of computer science. You know how to use trees and graphs? They form a big part of discrete maths. How about changing state as your code executes? That's actually a branch called monad theory. Or do you know more than one language? You said you did CS in college, so I'll guess that you're familiar with parsing / compiler writing / language design. A mathematician would probably call that abstract algebra (or at least it's a close fit).
So you know much more dry dusty old math than you suspect - but for the past fifty years these parts of maths have been called CS. Something that is lost on the book author....
Re:wahay! (Score:2, Insightful)
Admittedly, those tools target publishing, but still...
Re:Computer Science without math... (Score:5, Insightful)
The reason computer science is so heavily influenced by math is the binary architecture that every piece of hardware is designed around. Every real world problem, right down to choosing the color of a font, has to be translated into the digital world by algorithmic approximation - a lot of math! The problem is that it is this very abstraction that makes computers so "flexible" in what they can do. Analog computers existed many years ago but they could only ever be built for a single purpose.
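Even "choosing the colour of a font" becomes arithmetic once digitised. A trivial sketch (my own illustration) of quantising a continuous intensity to an 8-bit channel:

```python
def quantize(intensity):
    """Map a continuous intensity in [0.0, 1.0] to an 8-bit channel value.

    This is the 'algorithmic approximation' step: the continuum is
    collapsed to one of 256 discrete levels.
    """
    return min(255, int(round(intensity * 255)))

assert quantize(0.0) == 0
assert quantize(1.0) == 255
assert quantize(0.25) == 64
```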
Unfortunately(?) it is much easier to design and mass produce something which is based on a finite lowest common denominator (bits) than it is to do so based on the continuum that a non-digital solution would require.
That said, who's to say that a beautiful painting rendered in Gimp/PhotoShop isn't a program of sorts? Certainly it has input, (from the original creator), and output, (its effect on us), and the "code" can be modified to change both!
Re:wahay! (Score:4, Insightful)
Yes, but I wrote a new book that claims that debate is better off without logic. Early debating pioneers such as Kant and Aristotle imposed their logic background on the field, and this has hobbled debate ever since. I reject the idea of convincing arguments as a good way to resolve any conflict.
Re:Applied mathematics (Score:1, Insightful)
No, but we do. The structure of a honeycomb isn't "hexagonal" until a human is there to call it hexagonal. Prior to that it's just a beehive, made in the way bees make beehives.
Imagine 2 apples. Until someone is there to count them, the "set of 2 apples" doesn't exist. The apples exist, sure; but the set that encompasses them, and is of cardinality 2, exists only in our minds.
Mathematicians would be a lot less Platonistic, I think, if they'd take courses in semiotics. There's a big difference between the symbol and its referent. Apples exist; but integers exist only when there's someone there to count them. That's why you can have human societies with no conception of "number" at all - where the only "amounts" of anything they can perceive are none, one, and many.
You can't escape from mathematics.
Sure you can. It's not inherent to the universe. Mathematics, being a language, is inherent to the way our minds model reality - with symbols that stand in for referents.
Re:Applied mathematics (Score:3, Insightful)
My argument is that the description may stem from the human perspective. Your use of language supports this, as it implicitly refers to Mathematics as a tool to describe--to model--the universe, rather than the universe as an application of mathematics.
Consider, for the sake of argument, the Formic [wikipedia.org] perspective. While the Formics come from a fictional world, they don't hold logic as a fundamental building block of the universe. They lack a distinction between truth and falsehood. Instead, they consider everything that has happened truth, and 'forget' anything that turns out to be false. Ender notes this specifically in one of the later books (I can't recall which), and how it increases the difficulty of communication with the Hive Queen. Indeed, the three invasions described by the first book were due to a miscommunication between our race and theirs--Formics didn't understand individuality before they met humanity.
Another perspective, this time using your bee example: While bees create hexagonal structures, we have no evidence that they do this consciously, nor do we have evidence that an alien culture would manipulate their world consciously. Instead, either could "just know" the solution, and thus have no use for geometry. Humans, meanwhile, see these hexagons and say "See! Math is fundamental, even animal's use it!" However, far from being a fundamental concept of the universe, the human is merely applying his world-view (which includes the form and function of a hexagon) to the beehive.
The philosophical question underlying this is which perspective defines the other? On the one hand, mathematics could underlie the entirety of the universe. If this is the case, then we could, theoretically, find the truth of it. Unfortunately, if mathematics is hard-wired into the brain--if the basic axioms of mathematics are assumptions made by the human mind as a means of interpreting the world--we couldn't see past those assumptions. Every attempt to do so would necessarily rely upon them, creating a circular argument or self-fulfilling world view, so to speak.
A, perhaps, interesting analogy*: Imagine you view the world through emerald lenses. Everything you see would be tinted green, but, having perceived the world this way for the entirety of your life, you would be none the wiser, unless someone or something showed you otherwise. Even then, you would be flabbergasted, possibly to the point of denial, if someone were to show you evidence of non-tinted vision. Now, consider the glasses to be mathematics. The eyes are analogous to your brain, and the assumptions fundamental to mathematics are the tint of the glasses. Can you say you'd be any the wiser? **
This logical paradox, of sorts, prevents us from knowing the truth of the universe. In fact, an entire branch of philosophy--epistemology--is dedicated toward investigating what constitutes knowledge & truth. Those philosophers have concluded, at the time of this writing, that mathematics is only true because it is defined independently of our universe. Furthermore, any attempt to apply pure mathematical reasoning to the world at large creates incredible complexity. Consider quantum mechanics, string theory, astrophysics, and other such sciences. Each of these, while functional, sacrifices a great deal of the elegance of mathematics due to an intersection of pure reason with the real world.
Re:Computer Science != Software Engineering (Score:1, Insightful)
Re:Some people shouldn't code production systems (Score:2, Insightful)
I've seen some truly terrible code by folks with degrees in both Comp Sci and Software Engineering, and even worse code by people who are doing degrees in so called 'ICT'.
The code style was all over the place, one guy's test code didn't even compile!
As Excelsior said, a good technical interview is the only way. After the first interview, we then actually get interviewees to write a simple program in 4 hours and review it with them afterwards. Sometimes that's the only way to tell if they're any good or whether they'll fit into our organisation.
Re:Damn straight! (Score:3, Insightful)
I think with CS we're still in the "tinkerer stage", where wonderful new things come just as often from the guy working behind his computer in the attic as from the computer science major working in some dev center for IBM.
Re:Damn straight! (Score:3, Insightful)
But I also have 30 years experience as a developer and consultant in IT security. CS people with your attitude are the cause of most problems with security because you assume that there always is a technical solution to problems. Real problems always involve people and people are what computers are for.
Remember Hamming's foreword to his book on Numerical Analysis:
"The purpose of all computing is insight, not numbers"
Without insight into human behaviour and use of computer systems, you risk creating useless shelfware or avionics software that kills people. Read the RISKS digest for the number of cases where software was written without human considerations and thereby caused harm or failed.
Re:wahay! (Score:3, Insightful)
Re:The True Nature of Computing (Score:4, Insightful)
Wow. That's probably the most nonsensical statement I've read on Slashdot in a while, including the huge Iraq-related threads... Quite an accomplishment!
Math == Latin (Score:3, Insightful)
Also, it is what Latin used to be in the middle ages - the common language of people all over the world. Scientists from different continents may be barely able to communicate in their respective mother languages or in english, but if they write down their formulas, they both know exactly what the other is talking about.
But no, the most important part is that math still evolves, and rapidly. As so many other critics, the author of the article appears to have a very limited understanding of math.
Re:Damn straight! (Score:2, Insightful)
Re:Some people shouldn't code production systems (Score:1, Insightful)
The worst code I have ever seen was designed and implemented by CS majors. Give me an SE, a CE, or even an English BA to design and write software. Heck, forget the degree - give me talent and an ability to think logically. Yes, CS theory is important, but leave the application of that theory to people who know how to do it right.
Re:Some people shouldn't code production systems (Score:3, Insightful)
Rubbish.
In IT, all a certification means is that one of the candidate's previous employers had a training budget. I'd consider the two candidates exactly equal. If I could only hire one, I'd fall back on my gut feel of which one interviewed better.
Charles Miller
Re:Damn straight! (Score:3, Insightful)
"mathematics as a base for CS was great in the 50's and 60's, but the real problems in computer software are people problems, not algorithmic ones. Once you program a calculation correctly, it generally stays that way."
You, like the author of this article, are missing the point of an education in mathematics. It isn't to simply teach you algorithms; it's a way of thinking through abstraction, which is crucial to computer science.
"But determining the optimal layout of a form to benefit the users of the system requires observing people and their needs."
That work should be done by a web designer, not a CS grad. Yes, many projects fail to adequately separate out the view from the rest of the app or force the same developers who wrote the backend to also write the user interface. But complaining that the problem is that those developers are learning math instead of human/computer interaction is like making your lawyer do your taxes and then complaining that their education was too focused on constitutional law and not the tax code. If you make someone do the wrong job, they often won't have the right education to do it.
Re:Computer Science without math... (Score:4, Insightful)
Re:Damn straight! (Score:3, Insightful)
Also, anecdotally, I code monkey (by that I mean: do grunt coding work) for a computer vision research group, and it seems, from my admittedly limited experience, that the best work being done in computer vision is being done by people that have a great command of "pure" math. Without math, computer vision is reduced to trying to code things that kind of work (the hack-it approach), whereas with math the computer vision field is building mathematical models of things like shape and is able to push the limits much farther than I think would otherwise be possible. IMHO.
Re:Damn straight! (Score:2, Insightful)
While HCI and CS are two separate, yet intertwined disciplines, they are fundamentally different art forms, with different manners of thought, problem solving, techniques, and problem spaces. It would be a mistake to confuse one for the other. That being said, it's been quite useful for me, as I stumble through my career to have had a good grounding in CS fundamentals. While I'll never need to determine if a particular interface is Big O or not, that I have a better than the average bear's idea of what goes on below those pretty interfaces I design allows me to meet both the users needs and make the wire frames I deliver to whatever poor engineer is going to have to build this thing not want to find the nearest firearm and start taking shots at me.
To your greater point, I think there is some merit: as we move closer and closer to ubiquitous computing, the greatest challenge facing system designers won't be how to eke out more horsepower from the processor, it will be shoehorning in the interactivity seamlessly for the user and the environment. One area where you do see a merging of pure Math/CS and HCI is, ironically, the field of aerospace. One of the posters mentioned trying to fly the (sexy) new 787 without a grounding in math... and while I grant that pilots need to know a whole lot of hard science, one of HCI's (er, rather Human Factors') most obvious areas of impact is in the cockpit and instrument design. Boeing/Airbus/Fokker/whoever spend a lot of time, money, and research figuring out the most intelligent, intuitive, and natural way of informing the pilot of everything he or she needs to know to make split-second decisions that have literal life-or-death consequences.
In a graduate course I took on dependable system design, in the very first class the professor had us read portions of the cockpit voice recorder transcript [avweb.com] for American Airlines flight 965 [wikipedia.org]. This was the flight which crashed in the mountains near Cali, Colombia back in 1995. One of the underlying causes of the crash was that the autopilot's interface for entering waypoints was overly complicated; the pilot-in-command chose a wrong waypoint with a name similar to the one he needed (without the system sanity-checking the entry and throwing a query back to the cockpit crew) and literally turned his 757 into a mountain.
In this case, all the math and science couldn't save the airplane, but perhaps a system that was designed to check user inputs against some sense of "hey, is this the right data point" might have allowed the pilots to get out of the situations before anything worse than needing to do a five minute loop around the mountains back onto their flight path.