
Forget Math to Become a Great Computer Scientist?

Coryoth writes "A new book is trying to claim that computer science is better off without maths. The author claims that early computing pioneers such as John von Neumann and Alan Turing imposed their pure mathematics background on the field, and that this has hobbled computer science ever since. He rejects the idea of algorithms as a good way to think about software. Can you really do computer science well without mathematics? And would you want to?"
  • Damn straight! (Score:4, Insightful)

    by Anonymous Coward on Sunday July 08, 2007 @07:08AM (#19787965)
    Who needs math? Bogosort is as good a sort algorithm as any. Hey, without math, how would you be able to tell?
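    (For the record, a bogosort sketch in Python - only the math, an expected running time on the order of n * n!, tells you how bad it is:)

        import random

        def bogosort(xs):
            # Shuffle until sorted. A correct sort; analysis is the only
            # thing that tells you it takes expected O(n * n!) time.
            while any(xs[i] > xs[i + 1] for i in range(len(xs) - 1)):
                random.shuffle(xs)
            return xs

        print(bogosort([3, 1, 2]))  # [1, 2, 3], eventually
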
  • by Anonymous Coward on Sunday July 08, 2007 @07:09AM (#19787979)
    Maths IS needed for computer science. Just be sure not to confuse Computer Science with Software Engineering. Software engineering is only a part of the computer science sphere.

  • by cyborg_zx ( 893396 ) on Sunday July 08, 2007 @07:09AM (#19787981)
    Do the lessons of VB6 teach us nothing?

    COMPUTING IS HARD. You can't dumb it down just because it would be nice to do so. And I'm sorry but mathematics is just the way in which meaning is expressed for machines. There's no free lunch here. And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.
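    (A sketch of that deconstruction in Python: the non-terminating program is just a terminating step function applied over and over; the event names are made up for illustration:)

        def step(state, event):
            # One terminating "algorithm": old state + one input
            # -> new state + one output.
            count = state["count"] + 1
            return {"count": count}, "handled event %d: %s" % (count, event)

        def run(events):
            # The non-terminating loop is step() applied repeatedly;
            # here it is driven from a finite list so the demo ends.
            state = {"count": 0}
            for event in events:
                state, output = step(state, event)
                print(output)

        run(["key_a", "key_b", "quit"])
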
  • by Rakshasa Taisab ( 244699 ) on Sunday July 08, 2007 @07:10AM (#19787987) Homepage
    Taking CS without math is like taking engineering without any physics.

    WTF is the author smoking?.. There are of course parts of CS that are less involved in math, but overall it is still a fundamental part.
  • Porn (Score:5, Insightful)

    by Dwedit ( 232252 ) on Sunday July 08, 2007 @07:10AM (#19787989) Homepage
  • by Anonymous Coward on Sunday July 08, 2007 @07:10AM (#19787993)
    Good luck doing a kernel, file system, network stack, crypto, image processing, window manager, animation or 3D without math or algorithms. I look forward to reviewing some of this guy's code.
  • Depends (Score:2, Insightful)

    by capt.Hij ( 318203 ) on Sunday July 08, 2007 @07:15AM (#19788021) Homepage Journal
    This is just another stupid generalization. There are some areas where you can do good computer science without math. There are other areas where you absolutely need mathematics. For example, you cannot do scientific computing without mathematics. Broad generalizations like this about such a widespread field just show the ignorance/narrow-mindedness of the author.
  • by DeadlyEmbrace ( 740321 ) on Sunday July 08, 2007 @07:15AM (#19788025)
    I attained a Computer Science BS in 1986. At the time everyone was getting Math minors. I opted for a communication minor instead. I've worked in high-tech engineering environments with real-time programming for many years. What I found is that I've never needed the intense mathematics attained by those with math minors. I needed to be able to implement equations that staff mathematicians would develop. Though math is a fundamental of computer science, I believe the abilities to logically assess a situation from multiple perspectives, communicate your approach to the customer, and then implement a maintainable solution are the key components required of computer scientists.
  • by aleph taw ( 1113487 ) on Sunday July 08, 2007 @07:15AM (#19788029)
    This guy just doesn't seem to understand what math is. Substituting his "theory of expressions" for the theory of computation just shifts the focus to another field of math.
  • by adamwright ( 536224 ) on Sunday July 08, 2007 @07:19AM (#19788053) Homepage
    Mainly, he claims to want to create a "comprehensive theory of process expression". Fair enough, but as soon as you want to extract usable, reliable results from your "comprehensive theory", you've really just created a branch of mathematics. Maths is not just numbers and calculus, but any systematic treatment of relations in a symbolic fashion - unless he plans a lot of fairly useless hand waving ("Oh, my process is expressed as *insert long-winded ambiguous English description*"), he will be working within the remit of mathematics. Heck, one of my areas of study is the development of processes (studied through the use of process calculi) - a highly mathematical tool.

    He also ignores the vast array of work on non-deterministic algorithms, stating that "Any program utilising random input to carry out its process, such...is not an algorithm". Sure, it's not a deterministic algorithm, but even if you artificially restrict your definition of algorithm to just be deterministic, it's a useful tool in analysing such problems.

    Finally, statements such as "Computer science does not need a theory of computation" are just so bizarre as to be funny. I suggest he forgets all he knows about formal computational theory, and I'll contract "Theseus Research" to write me a program that decides the halting problem for arbitrary programs. I wonder what his bid will be, given that he doesn't need a theory of computation (which would tell him you can't do it, at least with our models of computation - and probably with any).

    Now, all of this is not to say you can't make progress in computer science without the mathematics that's currently been developed - however, you will either spend a lot of time achieving unreliable results, be reinventing the wheel, or just be creating a new branch of mathematics.
  • by 3seas ( 184403 ) on Sunday July 08, 2007 @07:21AM (#19788063) Homepage Journal
    ....Abstraction.

    And computer science, the software side, is really the science of abstraction physics.

    http://threeseas.net/abstraction_physics.html [threeseas.net]

    At some point in the higher levels of abstraction creation and use, you use the lower mathematical level more as a carrier wave of the higher-level abstraction than for the purpose of performing a mathematical calculation. The analogy is that of using radio waves to carry the music you hear over the radio: the carrier wave is discarded after it has done its job. Likewise, the mathematics of computers boils down to the binary flipping of transistor switches, upon which the higher levels of abstraction are carried.

    With a correct approach to the abstraction-manipulation machines computers really are, we can accomplish a lot more - similar to the difference between doing math with the limitations of Roman numerals vs. the decimal system with its zero placeholder.
  • by Colin Smith ( 2679 ) on Sunday July 08, 2007 @07:26AM (#19788099)
    Hmmm?

  • by ivan_w ( 1115485 ) on Sunday July 08, 2007 @07:26AM (#19788101) Homepage
    That's engineering!

    Making better engines uses the science of physics and chemistry.

    Cooking uses the science of chemistry.

    To me it's like saying 'Lego Science'. It's not 'science'. You don't need to know the physical aspects of a Lego block to assemble something. Although you need some insight into how the thing works - it's not science per se!

    Then again, it depends on how 'science' is defined!

    --Ivan
  • Teh Maths (Score:5, Insightful)

    by ShakaUVM ( 157947 ) on Sunday July 08, 2007 @07:34AM (#19788143) Homepage Journal
    This is something I've thought a lot about. There have been any number of times that math has helped me in my software development efforts. Things like trig to predict the path of a moving target in Robowars (back when I was in high school) to various vector and angle related maths in CustomTF for Quake 1 (www.customtf.com) to partial derivatives to calculate the slope on a surface. I've also needed math for various economics related things over the years, and probability and statistics have also been exceptionally useful to me. Currently I'm having to decipher a guy's code which is all eigenmath, so my linear algebra course is saving me from having to hire someone just to explain all the math to me.

    But the kicker is that you can't just tell a student that they should "study vector math" because one day they'll write a Quake Mod, because, truth be told, they probably won't. It's the trouble with all examples you give when students ask how math will be useful -- I could pull any number of examples from my life, but the problem is, they probably won't happen in a student's life. Instead, they'll have their own trials. The best you can tell someone is to study all the math they can, because some day it *might* be useful, and they'll want to have that tool in their toolkit.

    And that's just not a very satisfying answer to students who want to make sure that they'll be damn well using what you're teaching in the future.

    But believe me, I thought I'd never have an application for eigenvectors, and now not only do I have to clean out my brain on the topic, but I have to parse someone else's code (PhD thesis code no less) and add functionality to it. Two other friends of mine got stuck on legacy Fortran apps which are essentially mathematical solvers (one for differential equations, the other for huge linear algebra problems), and both of them are extremely happy they paid attention in their respective math classes.

    So, yeah. To CSE students out there: take math. Pay attention. It could very well save your neck some day at a job, and if it doesn't, at least try to make it interesting to yourself by thinking of applications where you might use it. You can quite easily find applications for all the math taught in the first two years of college.
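    (One concrete instance of the vector math mentioned above: leading a moving target reduces to solving a quadratic for the intercept time. A Python sketch with made-up numbers:)

        import math

        def intercept_time(dx, dy, vx, vy, s):
            # Target at (dx, dy) relative to the shooter, moving at (vx, vy);
            # projectile speed s. Solve |D + V*t| = s*t for the earliest t > 0.
            a = vx * vx + vy * vy - s * s
            b = 2 * (dx * vx + dy * vy)
            c = dx * dx + dy * dy
            if abs(a) < 1e-9:                  # degenerate: equation is linear
                return -c / b if b < 0 else None
            disc = b * b - 4 * a * c
            if disc < 0:
                return None                    # projectile too slow to intercept
            r = math.sqrt(disc)
            ts = [t for t in ((-b - r) / (2 * a), (-b + r) / (2 * a)) if t > 0]
            return min(ts) if ts else None

        t = intercept_time(100, 0, 0, 10, 20)  # target crossing sideways
        print(t)                               # aim at (100, 10 * t)
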
  • by garcia ( 6573 ) on Sunday July 08, 2007 @07:56AM (#19788261)
    Do the lessons of VB6 teach us nothing?

    People have been fucking saying this about various versions of BASIC since the beginning. Instead of trashing it, what did BASIC's various incarnations teach us?

    It taught us that Microsoft could roll what amounts to a scripting language into its Office line and make the programs ever more powerful without users having to relearn something completely new and difficult. Give someone an education in just about any language, a book or a list of commands, and some time, and they will have a fully functional module or two that saves a ton of time and energy.

    I honestly think a lot of the hostility, here, towards VB has to do with the fact that now pretty much anyone can write code and that it's from Microsoft. If you're somehow saying that if they used C/C++ or even Perl that their code would somehow be wonderful or safe, you're insane.

    COMPUTING IS HARD. You can't dumb it down just because it would be nice to do so. And I'm sorry but mathematics is just the way in which meaning is expressed for machines. There's no free lunch here. And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.

    I agree, and while most applications require this, if you look at VB as a way either to get people started coding or to do quick things because it's built into the system, without concerning yourself with the necessity of math-based algorithms, it serves its need.

    I'm no math whiz but I can write code (in languages other than VB) and so can plenty of others. Enough putting people down and getting on your high horse because you write in such and such. Math is important to CS, and so is easy access to writing code.
  • by Anonymous Coward on Sunday July 08, 2007 @07:59AM (#19788277)
    It's an old joke that any subject that has "Science" in its name is not a science, e.g. Political Science, Social Science, Computer Science.

    The Science in Computer Science consists largely of niches carved out of other disciplines, e.g. algorithm analysis and crypto are mathematics, user interface design is psychology, computer graphics is really about approximating physics, audio compression is mathematics, psychology and physiology, AI steals ideas from biology... every now and then we find out that the physics department, or the electrical engineers, or the chemists, are actually doing almost identical research to us.
  • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Sunday July 08, 2007 @08:04AM (#19788313)
    Algorithms exist whether you think about them or not, but if you don't think about them, you'll accidentally create terrible ones.

    Just as few telescope makers are astrophysicists, most programmers aren't computer scientists. The author himself is evidently not one. Instead, he is one of the more vocal members of an angry, ignorant mob trying to burn down the edifice of computer science. Its members do not understand it, so they fear it and try to destroy it -- look what's happened to computer science at universities!

    It was bad enough when courses about a programming language replaced ones about algorithms and data structures (I'm looking at you, Java and D-flat). It was bad enough when pure profit became the raison d'etre of computer science departments. It was bad enough when I noticed my peers start to joke about how they didn't care about this "maths bullshit" and just wanted to earn more money. It was bad enough when the object, not the bit, became the fundamental unit of information.

    But what this author advocates is still worse. He's proposing that we replace the study of computer science with vocational programming, and call that emaciated husk "computer science." We already have a "theory of process expression", and that's the rigorous study of algorithms and data structures. We've constructed that over the past 50-odd years, and it's served us quite well.

    That field has given us not only staples, like A* pathfinding, but a whole vocabulary with which we can talk about algorithms -- how do you say that a scheduler is O(log N) in the number of processes except to, well, say it's O(log N)? You can't talk about computer science without talking about algorithms.
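    (A minimal sketch of what "an O(log N) scheduler" means concretely, assuming Python; the scheduler and its pick-least-runtime policy are invented for illustration, not any real kernel's:)

        import heapq

        class Scheduler:
            # A toy priority scheduler backed by a binary heap keyed on
            # virtual runtime: enqueue and pick_next are both O(log N)
            # in the number of runnable processes.
            def __init__(self):
                self._heap = []

            def enqueue(self, vruntime, pid):
                heapq.heappush(self._heap, (vruntime, pid))   # O(log N)

            def pick_next(self):
                vruntime, pid = heapq.heappop(self._heap)     # O(log N)
                return pid

        sched = Scheduler()
        for pid, vr in [(1, 30), (2, 10), (3, 20)]:
            sched.enqueue(vr, pid)
        print(sched.pick_next())   # 2: the least-run process goes first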

    The author's fearful denunciation of algorithms is only one manifestation of the anti-intellectualism that's sweeping computer science. "We don't need to understand the underpinnings of how things work", the angry mob chants, "but only implement the right interfaces and everything will happen automatically."

    The members of this angry mob sometimes manage to cobble something of a program together, but it's more like a huge rock pile than a colonnade. It often barely works, uses too much memory, doesn't handle corner cases, and is likely to crash. (See worsethanfailure.com.) Members of this mob even think that if the same algorithm is expressed in two different languages, it's two different processes. People like this ask painful questions like, "i know quicksort in C# but can someone teach it to me in PHP?"

    Argh.

    Even "new" developments in programming are just new claptraps for old ideas, with fashions that come and go over the years. The only really new things are algorithms, and increasingly, we're calling people who couldn't independently create bubble sort "computer scientists." It's ridiculous. Call computer science what it is, and create a separate label and department for people who can program, but not discover new things.

    It's this idea that one doesn't need to understand or think to be successful that's at the center of the article, and it's not just affecting computer science. Look around you. I wonder whether we'll fall into an old science fiction cliché and regress so far that we are unable to understand or recreate the technology of our ancestors.
  • by joel.neely ( 165789 ) on Sunday July 08, 2007 @08:04AM (#19788315)
    The term itself is a product of the academic environment, similar to the equally dubious "Library Science" and "Management Science". For what it's worth, the European term "informatics" would have been better, but never caught on.

    That said, I believe there's a useful set of relationships well understood in other fields:

    Science = The search for fundamental knowledge and predictive models;
    Engineering = The creative application of the results of science;
    Technology = The routine application of the results of engineering.

    giving us, for example:

    Science: Physics
    Engineering: Electrical engineering
    Technology: TV Repair, Cable TV Installation

    The punch line is that application of this model to computing works as follows:

    Science: Mathematics
    Engineering: Programming, Informatics, "Computer Science"
    Technology: Coding, Computer Installation, Home Computer Repair, etc.

    Mathematics IS the science in "Computer Science".

    Anyone who has studied advanced Mathematics knows that Math is not about numbers; think of mathematical logic, Boolean algebra, abstract algebra, set theory, topology, category theory, etc. ad infinitum. Dijkstra defined Mathematics as "the art of precise reasoning". In the same sense, "computation" doesn't mean "number crunching", but more generally the automated manipulation of information.

    It is true that there are legitimate concerns in today's computational landscape (networking, concurrency, etc.) which didn't figure in the mathematical/engineering world view of the 1940s, but that's simply a sign that the field has grown up (i.e. grown beyond the limited perspectives of its founders). That's also true in many other applications of Mathematics. For example, early research in differential equations paid much more attention to linear differential equations (because they were more tractable). However, we now know that most "interesting" systems in the real world involve non-linearity.

    Science, Engineering, and Technology share with living systems an important rule: "Grow or die!" Fortunately, the field of computing has grown.
  • by Stradivarius ( 7490 ) on Sunday July 08, 2007 @08:04AM (#19788319)
    I believe the author's point isn't that you don't need to know any mathematics, or that it doesn't have an important role to play in CS. He's simply arguing that some of the main issues in computer science are not fundamentally mathematical problems (even if they require some mathematics).

    If you buy that argument, then treating CS as if it were simply another branch of mathematics will not help solve those problems.

    Of course, this also takes us into the perennial debate about where to draw the line between "computer science" and "software engineering". One could certainly define away the author's problem by saying that his examples are software engineering issues rather than computer science issues. And it's true that software engineering has been driving a lot of the theory with respect to expressiveness (design patterns and the like). But that view also seems to really impoverish computer science - if all you leave the field of computer science is the stereotypical mathematics, why not just become an applied mathematics major?
  • by cyborg_zx ( 893396 ) on Sunday July 08, 2007 @08:09AM (#19788339)
    The point is that VB is fine if you want to do basically trivial things with what you've already got - basic component connection. As soon as you want to do something non-trivial it all falls apart. The language design is simply worse. It's not that you cannot fuck up royally in C/C++; it's just that some stuff can be really hard to do elegantly in VB6. The newer versions have rectified this somewhat, but having a rigorous approach to language design is something worth investing in - even if it's 'too mathematical' for some people's tastes.

    It's just not that easy to do some of the cool stuff we want to do. No amount of wishing it were different is ever going to change that.
  • by msormune ( 808119 ) on Sunday July 08, 2007 @08:11AM (#19788345)
    There's math and then there's advanced math. I once built a simple 3D modelling program (it used the 3D Studio mesh file format), and got by just using basic trigonometry and algebra. This stuff does not have to be taught at a university.
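    (The kind of basic trigonometry that carries a simple 3D modeller a long way - rotation and a perspective projection, sketched in Python; the numbers are arbitrary:)

        import math

        def rotate_z(x, y, z, angle_deg):
            # Rotate a point about the Z axis: high-school trig, nothing more.
            a = math.radians(angle_deg)
            return (x * math.cos(a) - y * math.sin(a),
                    x * math.sin(a) + y * math.cos(a),
                    z)

        def project(x, y, z, d=4.0):
            # Simple perspective projection onto a screen plane at distance d.
            return (d * x / (d + z), d * y / (d + z))

        p = rotate_z(1.0, 0.0, 0.0, 90)
        print(p)             # ~(0.0, 1.0, 0.0)
        print(project(*p))   # the 2D screen coordinates
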
  • by nospam007 ( 722110 ) on Sunday July 08, 2007 @08:12AM (#19788353)
    ... I've taken courses in algorithms, language theory, databases etc. and the majority of the work is not maths and if it is it's so obvious anyone can see it.
    --
    The majority of _your_ work might be.
  • by Anonymous Coward on Sunday July 08, 2007 @08:16AM (#19788371)
    fact that now pretty much anyone can write code

    No, the hostility is because now pretty much anyone THINKS he can write code, which lowers the valuation of people who actually can do it. That lowers software quality on two fronts: people who can program are forced to write lower quality code because they need to write more to compete with too many amateurs (in the derogatory sense of the word), and people who can't really program write code that doesn't handle errors properly and fails, often silently and undetected, when the input deviates from the expected.
  • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Sunday July 08, 2007 @08:22AM (#19788389)
    Let me make this clear: your ability to write code in no way makes you a computer scientist. It's like saying that the ability to operate a forklift makes you a structural engineer. Stop it already.

    That said, I'm sure you're good at what you do. I bet you can write good code in VB, as well as many other languages. This isn't a personal insult. VB, PHP, and other brutish languages are equally bad in my eyes.

    These languages are brutish because they oversimplify key concepts. That oversimplification also makes them attractive to new programmers, and new programmers universally write terrible code. The languages themselves aren't bad, the coders are. That said, more experienced coders will generally choose more capable languages, so most of the time, a program written in a brutish language will be a bad one.

    We need fewer programmers, not more. Maybe professional certification would help somewhat.

    (Incidentally, we were lucky that Javascript became the de facto client-side web language. We could have done far, far worse, and although we can change server languages, we can't change a user's web browser!)
  • by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Sunday July 08, 2007 @08:24AM (#19788399)
    The thing about math, though, is that it's universal. If we ever discover an alien civilization that's a peer to our own, I'm sure it will have an identical formulation of Pythagoras' theorem. Given different starting conditions, we might have used different notation or words for computer science, but the concepts are inherent in the problems we solve, and therefore would eventually have been discovered and described regardless.
  • I agree... (Score:1, Insightful)

    by Anonymous Coward on Sunday July 08, 2007 @08:24AM (#19788401)
    Before groupthink nukes this person's comment into oblivion, could you please reflect on the last time you had to deal with someone else's shitty code? (I'm sure you don't have to think back very far.)

    If you've never had to deal with someone else's poor work then you [are the luckiest bastard on the planet, but more likely you] may want to consider a career change...
  • by jsebrech ( 525647 ) on Sunday July 08, 2007 @08:33AM (#19788447)
    Taking CS without math is like taking engineering without any physics.

    But you don't engineer a bridge by thinking about the interaction of individual atoms, not because that isn't the "right" way of doing it, but because it takes too long and is too expensive.

    The article makes a good point saying that the obsession with mathematics to the exclusion of all else in computational theory is not necessarily a good thing for the IT field. Mathematics is at such a low level of abstraction that it is mostly useless when it comes to thinking about solutions to most of the problems "real world" software architects (outside universities) run across, like large-systems architecture, parallel computing, and most classes of high-level optimization. As a result, "real world" software architects mostly ignore the improvements in the field of theoretical computational science, since it has little bearing on what they do.

    Most notably: a big problem facing the IT field right now is the end of Moore's law, and the growing need to parallelize everything. What we see in practice is that most programmers don't really know how to write multi-threaded code, and as a result few applications are multi-threaded. The solution here is not to require all programmers to be CS grads, because that is too expensive and a big waste of resources. A programmer shouldn't have to know about loop invariants, just like a mason shouldn't have to know about load distribution between pylons. The solution is for the theoreticians to focus on something useful to real world architects.

    But, let me be clear about this: a software architect should know their mathematics, just like a bridge architect should know their physics. If you don't know why you're designing a system a certain way, you can't know it is the right way to design it like that.
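    (To make the parallelization point concrete, a minimal Python sketch; the work function and chunk sizes are made up. The point is that independent, stateless units of work are easy to parallelize safely:)

        from concurrent.futures import ProcessPoolExecutor

        def crunch(chunk):
            # A pure, independent unit of work: no shared mutable state,
            # so there is nothing to get wrong under concurrency.
            return sum(x * x for x in chunk)

        if __name__ == "__main__":
            data = list(range(1_000_000))
            chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
            with ProcessPoolExecutor() as pool:
                print(sum(pool.map(crunch, chunks)))
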
  • by Anonymous Coward on Sunday July 08, 2007 @08:34AM (#19788451)
    Hmmm?
  • by Anonymous Coward on Sunday July 08, 2007 @08:46AM (#19788511)
    I also hate to break it to you, but logic is a part of mathematics.

    I think the problem is that people are completely ignorant about what mathematics is. In school you get taught sums, a bit of trigonometry, quadratic equations and calculus. That is the extent of most people's exposure to mathematics, but maths is much, much broader than this, and much more powerful and useful.

    However, given that you are just a programmer or software consultant, you are correct in saying that you don't need to worry your pretty little head about the bigger picture.
  • CS - MA = IS (Score:3, Insightful)

    by MobyDisk ( 75490 ) on Sunday July 08, 2007 @08:48AM (#19788525) Homepage
    Computer Science - Math = Information Systems
    It covers networking, scripting, database management, web design, hardware, etc. It's computer science without the science.

    Also, Computer Science != Programming:

    "Computer science does not need a theory of computation; it needs a comprehensive theory of process expression."
    That's not computer science, that's programming. The author is confusing the two. I know many great self-taught programmers who can't tell me what O(n) means. They get a feel for what data structures to use, but rarely create their own. There's plenty of use for such people - it's probably the majority of programmers. But it isn't CS.
  • Lemme guess (Score:5, Insightful)

    by Opportunist ( 166417 ) on Sunday July 08, 2007 @08:49AM (#19788533)
    The author really sucks at math but heard that there's big bucks in the computer stuff, right?

    Computers are (by their very definition as well as by the word used to describe them) mathematical machines. A computer can essentially do NOTHING BUT calculate. It can in its core add, subtract, shift and move data around. How is this supposed to work without math?
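    (A small illustration of that point: multiplication built from nothing but the add/shift/test primitives listed above. A Python sketch, assuming non-negative operands:)

        def multiply(a, b):
            # Shift-and-add multiplication, the way hardware does it.
            result = 0
            while b:
                if b & 1:          # lowest bit of b set?
                    result += a    # add
                a <<= 1            # shift left
                b >>= 1            # shift right
            return result

        print(multiply(13, 11))    # 143
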
  • Hammer, nail (Score:2, Insightful)

    by Kalvos ( 137750 ) <bathory@maltedmedia.com> on Sunday July 08, 2007 @08:53AM (#19788549) Homepage

    "If all you have is a hammer, everything looks like a nail" applies to some degree to the responses to the review.

    We use math for these machines because that's how they were designed. They didn't have to be, although from our perspective a half-century on, it seems impossible that they might work any other way.

    Computers may need math because of how they were created, but consider that an animator didn't need math to animate, rotate or transform a figure. Though it may be reduced to math, an artist doesn't need math to give depth, shading and perspective to an image. In fact, computers make such analog tasks incredibly math-intensive, as a previous poster noted.

    Despite the depth and complexity of the resulting orchestrations, no math created -- though it may describe aspects of -- Beethoven's Ninth Symphony. Language learning and grammar remain elusive to mathematicians, and even Chomsky's "universal" theories end up flummoxed by the Pirahã language. The multiple readings of T. S. Eliot's The Waste Land would take more time to track than the Internet in real time.

    Even in the sciences from antiquity, increasing description and formulation result in increasing complexity, but not necessarily increasing understanding. Earth, air, fire and water made sense in societal context; then extended elements; then the periodic table; then subatomic particles, light as particles and waves, and behavior of quarks. Magnetism remains elusive, as does an elegant theory of everything.

    Each of these may use math as a description or even a tool, but the careful tuning analysis of the different kinds of gamelans does not apply to the gamelans, but only their analysis. The reference is to itself, and the gamelans go on with or without analysis.

    In other words, were our computers not based initially in creating algorithms to manipulate the basic elements chosen to operate them, impelling the ultimate triumph of binary data over other representations, math might have receded to its place as just one tool of computer activity.

    Dennis
    We Are All Mozart [maltedmedia.com]

  • by GTMoogle ( 968547 ) on Sunday July 08, 2007 @08:58AM (#19788583)
    Who needs fewer programmers? Some programmers would have higher salaries. Some companies would have better code. Many video games might not exist. Would Linux? Firefox? Would computers be as useful today if all those crappy enterprise VB applications had consumed all the programming talent that currently invests itself in more generally useful areas?

    I agree professional certification may help improve software in critical areas. Hell, at my workplace we sometimes hire EEs over CS if they're capable of learning to code. CS certification would probably improve our applicant pool (but we probably couldn't afford to hire them). It seems to me that it would also have many deleterious effects if the requirement for it were applied with too wide a brush.
    by j0nb0y ( 107699 ) <jonboy300@yahoo.com> on Sunday July 08, 2007 @09:12AM (#19788671) Homepage
    I think you're confusing software engineering with computer science.

    Math isn't important to software engineering, but it is of great importance to computer science.
  • Re:Well... Yeah (Score:3, Insightful)

    by RPoet ( 20693 ) on Sunday July 08, 2007 @09:20AM (#19788729) Journal

    Ok, someone said you needed math to get a filesystem going. I'm sorry, but you really don't need to know how to use a Fourier series, or to know the Achilles numbers by heart, to open a file, save some stuff in it, and then close it, eh...

    Yes, because that's how you design and implement a file system.
  • by FishWithAHammer ( 957772 ) on Sunday July 08, 2007 @09:23AM (#19788749)
    And then you have circumstances where quality programmers (myself, not to toot my own horn too much) are then screwed because your Almighty Certifications cost an arm and a leg.

    Certifications aren't worth the paper they're printed on. (The same, it seems, goes for degrees.)
  • by Aladrin ( 926209 ) on Sunday July 08, 2007 @09:28AM (#19788805)
    I think you are confused. The question is not 'Is Math Computer Science?', the question is 'Is Math -necessary- for Computer Science.'

    To use your War analogy, Math is not War, but Math is necessary for War. (Unless you like losing, of course.) Someone may have done all the mathematics long ago, and stored it in a computer for you to use, but it's still necessary. You can be infantry in a war without knowing how to add. Heck, I'd bet you could even be a low-level official without anything higher than elementary school math.

    Programming is the same way. To use a PC, or script something up in VBScript, no math is necessary at all. To write a compiler (without which, computers can do nothing useful), you need college-level math. And for some applications, you need all the math that's known to humans.

    For years I've heard this same 'you don't need math to program' argument, and it's like saying you don't need roads to drive cars on. Sure, it's -possible-, but it's far from efficient and you're very limited as to what you can do with it.
  • by GreatBunzinni ( 642500 ) on Sunday July 08, 2007 @09:32AM (#19788827)

    But you don't engineer a bridge by thinking about the interaction of individual atoms, not because that isn't the "right" way of doing it, but because it takes too long and is too expensive.

    Well, actually you do, and in multiple respects too: from designing and evaluating the longevity of the applied materials to the interactions between components, those aspects must be considered in the multi-million dollar projects that bridge building involves.

    Besides that, civil and structural engineers also have to consider the mechanics of materials and also wave propagation. What field of science covers that? Physics, of course.

    The article makes a good point saying that the obsession with mathematics at the exclusion of all else in computational theory is not necessarily a good thing for the IT field.

    No it doesn't. The only point it makes is that the author does not have a clue about what mathematics is. Mathematics isn't algebra or calculus. Math isn't adding up numbers or multiplying things. Mathematics is structured deductive reasoning, which builds up from a solid starting point (axioms) and serves to represent, communicate and help with the construction of ideas. Those tasks are the basis of computer science, whether you are writing an operating system, designing a database, building a network or even creating a new language/protocol. If you take math out of the picture, you simply cannot do anything in CS, whether you actually know you are applying mathematical concepts or not.

  • by JAlexoi ( 1085785 ) on Sunday July 08, 2007 @09:34AM (#19788843) Homepage
    I have studied advanced math and I can tell that it's all about analysis.
    I'd say that Computer Science(or better Computational Science) should consist of logic, Boolean algebra and so on....
    And should be a separate science.
  • Re:Idiotic. (Score:3, Insightful)

    by vidarh ( 309115 ) <vidar@hokstad.com> on Sunday July 08, 2007 @09:50AM (#19788955) Homepage Journal
    I "forget" about it on a regular basis. Almost none of the software engineering work I've done over the last 12 years, or the hobby programming I did for another 15 before that have required much maths beyond basic boolean algebra, some understanding of computational complexity, and assorted other stuff that's mostly been at most at high school level.

    I started programming at 5 - boolean algebra was the first maths I learned, because it flowed naturally from learning programming, though it took a few years before I knew it had a name. But really, boolean algebra is just logic with symbols.

    You mention programmers/software engineers and computer scientists separately, and you're right to. The two have about as much in common as a builder and an architect - they'll share some vocabulary and some understanding of methods, but what they need to do their jobs is vastly different.

    I enjoy reading CS research papers, and I have an interest in some subsets of CS - particularly compiler design - but I don't particularly enjoy maths, and tend to avoid maths heavy papers simply because my interest in CS is a hobby and maths heavy papers take more effort (and in compiler design you need very little maths apart from some very basic graph theory anyway - when people write maths heavy papers on compiler design, then to me it tends to be a sign they don't understand what they are writing about well enough to explain it plainly - so far I've seen very few exceptions to that).

    But ultimately CS isn't my career - software engineering IS. The two are different fields, and it's time people actually realize that... More importantly, it's time more schools realize that, and start offering differentiated computer science and software engineering degrees.

    Someone with an MSc or even a PhD in Computer Science can easily be useless as a software engineer. You wouldn't expect an architect to be able to step right into the job of a builder, after all, and you'd be skeptical about the choices of someone who picked an education as an architect if they wanted to become a builder. I've had to deal with my share of highly educated "software engineers", and frankly none of the best software engineers who have worked for me have had anything above a BSc in CS; many of them had no degree, or unrelated degrees that gave them a good appreciation of the specific domain they develop software for, whereas very few of the people I've hired with MScs and PhDs in CS have done particularly well (there are the odd exceptions). It's marked enough that an MSc or PhD in CS is now a warning sign that causes me to probe actual engineering skills a lot more thoroughly, as well as to ask some pointed questions about what drove them to pursue their degrees and why they subsequently went into software engineering.

    But even in CS, the extent of maths you need depends massively on what your focus is. As I mentioned, compiler design rarely need to use much maths (some people do, but not because it's necessary - people like different tools), and a lot of other areas use only some small subset or other of maths.

    I hardly took any maths at university, and it's rare for me to come across CS papers, even outside of compiler/programming language design, that I'd have any problems following due to the maths content. What maths content there tends to be is most often limited enough for context alone to be sufficient to get most of it. When I do run into problems, I can usually easily find papers that have no trouble expressing the same information without much maths, which signals that it's very much a communications issue rather than something inherent to the problem. The cases where the maths is so integral to the message that it actually makes much difference, apart from reducing the potential audience, are very limited.

    Unnecessary use of maths in CS papers is one of my pet peeves. I'm not advocating "dumbing down" research, but scientists that use "big words" when there is no reason to...

  • Re:Teh Maths (Score:3, Insightful)

    by Targon ( 17348 ) on Sunday July 08, 2007 @09:58AM (#19789009)
    You could say then, that math is used in so many different areas in life, that it is foolish to try to ignore it.

    On the subject at hand though, the real key to why math is needed in Computer Science is due to the analytical nature of the subject. If you do not know how to solve problems, then you will hit many dead-ends in Computer Science. Math isn't just about solving mathematical problems, it is about looking at a problem and working out a solution. Computer Science is about looking at a problem and working out a solution.

    So, you can take the math out of Computer Science, but the training that you get from learning how to solve math problems can be applied directly to computer science. As a major, Computer Science is or should be as much about how to come up with solutions as it is about knowing how to do this and that.

    So you can have one person who takes the same computer subjects as a computer science major, but the computer science major will tend to be better at figuring out how to break a problem down, due to the math that is part of the requirements of the major.
  • Re:Yes and no (Score:3, Insightful)

    by vidarh ( 309115 ) <vidar@hokstad.com> on Sunday July 08, 2007 @10:11AM (#19789071) Homepage Journal
    Any competent programmer will know how to do the above, but many will do it because they've either learned the basic truth tables and know how to apply them, or because they've learned the most common transformations by rote. They are "using maths" the same way a builder is "using physics" when putting in an RSJ to prop up a wall, or the way we're "using physics" when we depend on objects not suddenly floating into outer space.

    Whether that is a sensible way to look at things or not really depends on your viewpoint. I'd argue it's pointless.

    That you can explain something using maths doesn't mean that everyone thinks about maths or "use maths" in any conscious way when they do that something.

    I could do the transformation in your example before I'd ever heard of boolean algebra, and I learned to spot it without having to think much about it after having thought it through step by step a few times. My guess would be that I figured it out at 7-8 years old, based on what I remember of the complexity of my programming back then. I'd argue that I was/am not "using" De Morgan's law, but just learned a pattern by rote that I understood through language, not maths.

    If you still insist on calling it maths, then fine. But then the logical extension is to conclude that people complaining there is "too much maths" in CS are highly unlikely to be complaining about basic stuff like that, which people can/will figure out without any background in maths as/when needed.

  • by wrook ( 134116 ) on Sunday July 08, 2007 @10:13AM (#19789085) Homepage
    I really don't care what you do with Computer Science. There is a lot of research that requires math, as others have pointed out. And a lot of it is really valuable. Equally there is a lot of research bundled under "computer science" (because it uses computers I guess) that requires no math. Whatever.

    What I'd like is an arts program that concentrates on programming. I'd like something that stresses *reading* and *writing*. I want people to learn how to *communicate* in these programming languages; not just with the computer, but also with their fellow programmers. I'd like people to do research in language design where they ask the question, "How can I allow the programmer to be more expressive about their intent?" I'd like classes on collaborative design. I could go on forever.

    I was at the art gallery the other day and wandered into the modern art section. They had a display about a particular type of performance art where someone would write out a description of an artwork on 3x5 index cards. A bunch of other artists would take the description and make the art. Along with the index cards and pictures of the finished work, there were a couple of letters. The letters described the disappointment the original artist had in the finished work. They even went so far as to accuse the artists following the instructions of being "incompetent".

    I described this to a programmer colleague of mine. His response was, "Wow... I didn't know I was a performance artist". I can count the number of times in the last 20 years that I've had to do hard math in my job as a programmer on my fingers. But questions like, "How the hell did you think *that* was readable", "How can I turn a bunch of requirements into something that isn't crap", "How do I get 10 guys working on a project and have a single vision", etc, etc, etc; those questions I ask every day.

    Sure computer science is important and personally I think math is a part of that. But, someday I hope someone will realize that programming is an *artistic* endeavor and we need to do a crap load of research in that area.
  • by allthingscode ( 642676 ) on Sunday July 08, 2007 @10:35AM (#19789231)
    What most people do in computer programming is like carpentry, and for that all you need to do is memorize how to write a few loops and which methods to call. But then, every once in a while, you need something truly earthshaking, like solving string subsequence matching (comparing DNA sequences) in O(n**2) rather than O(2**n), and then you have to run to the people who can do the math. Another example is when people thought the best way to do AI was to mimic the neuron, but then, by applying some rigorous math, you end up with Support Vector Machines.
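    (The canonical instance of that speedup is the longest-common-subsequence dynamic program - assuming that is the kind of subsequence matching meant, a Python sketch:)

        def lcs_length(a, b):
            # Longest common subsequence by dynamic programming:
            # O(len(a) * len(b)) instead of the O(2**n) of enumerating
            # every subsequence.
            m, n = len(a), len(b)
            dp = [[0] * (n + 1) for _ in range(m + 1)]
            for i in range(1, m + 1):
                for j in range(1, n + 1):
                    if a[i - 1] == b[j - 1]:
                        dp[i][j] = dp[i - 1][j - 1] + 1
                    else:
                        dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
            return dp[m][n]

        print(lcs_length("AGGTAB", "GXTXAYB"))  # 4, e.g. "GTAB"
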
  • Disagree (Score:3, Insightful)

    by Spazmania ( 174582 ) on Sunday July 08, 2007 @10:37AM (#19789259) Homepage
    It's not so much that computer science isn't related to math. It's more that CS students are assigned the wrong math courses.

    Algebra is an obvious key to understanding computation. Discrete mathematics, including probability and combinatorics, tends to pop up in computing problems across a wide range of disciplines.

    On the other hand, it would not be unfair to suggest that computing is more useful to calculus than calculus is to computer science. Continuous mathematics, like calculus, shows up rarely if ever in most computer science specialties.

    Fant also seems to be stuck on the word "algorithm." Computer scientists have a very different definition of an algorithm than mathematicians. LISP was the only moderately successful attempt to introduce computer scientists to the mathematical notion of an algorithm. I'll take the groans and the dearth of hands raised to the question, "Is LISP your primary programming language?" as proof of just how little regard computer scientists have for the mathematical notion of an algorithm.

  • Re:Damn straight! (Score:4, Insightful)

    by Firethorn ( 177587 ) on Sunday July 08, 2007 @10:51AM (#19789377) Homepage Journal
    From my experience, it might mean fewer girls.

    My last honors math class had 3 boys in it, out of twenty.
  • And I'm sorry but mathematics is just the way in which meaning is expressed for machines.

    But not for humans.

    And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.

    I have no idea what you just said, and I've been coding for years.

    I agree that computing is hard. Well, I find it easy, but I agree that, in general, if you're going to use a computer, you're going to learn some logic, and I will not help you to avoid thinking.

    But 99% of the apps I write require little to no knowledge of mathematics, beyond basic algebra and arithmetic, and maybe a few binary tricks. In particular, Calculus was a required course for CS in college, and I have never used Calculus in my computing, even programming. Ever.

    I have not read that book, but I would argue that a big reason computer science is stunted is this absurd relation to mathematics. You need better-than-average communication skills if you're ever going to work on a team -- at least to comment your code. You need philosophy -- yes, philosophy! -- at least as much as you need math, and a decent logic class would be even more useful. And you need research skills a bit beyond justfuckinggoogleit.com, if you're going to RTFM -- considering you may have to go looking for a manual, or even the right software to begin with.

  • by Dr_Barnowl ( 709838 ) on Sunday July 08, 2007 @10:57AM (#19789437)

    DO PEOPLE NEVER LEARN!?!!


    No, apparently not.

    I work for a major UK public service, working on tools and content surrounding an international standard. A lot of the code is VB6 and VBA, and the bulk of it has "On Error Resume Next" at the top of every routine. One of our contractors has an IDE plugin that inserts this piece of code automatically (the very idea of this is enough to make me froth at the mouth). The rationale is twofold:

    • Unhandled errors fall off the top of the stack and terminate the program, which doesn't give you a chance to save your work. That's fair enough....
    • Apparently, users don't like seeing error dialogues

    Neither of these is an excuse - where you expect errors, handle them. Where you don't, present them to the user, log them, email to the Pope, whatever your environment demands. Ignoring them is setting yourself up for immense pain, in the form of "it just doesn't work" bug reports from users that take an epic amount of time to resolve.

    This isn't helped by the general style of the code, which is heavily error-dependent - i.e., it uses error conditions to check for things like whether an item is in a collection or not. The VBA.Collection class has a lot to do with this, as it doesn't offer another means of checking its contents.

    But I know for a fact there are places where actual logic errors occur in normal use. It's one of those situations where a large part of the requirements is expressed as the codebase, and the specifications are so vague and complex that it's just easier to heave a sigh and accept it. Where possible, I replace the error handling with something more structured, and it's actually possible to get a stack dump out of a lot of the code when things go bad now.

    There's no excuse. Yes, it makes your codebase and your binaries larger (but importantly, not slower). But you can have "proper" error handling in VB6. At my previous post, where I was the only programmer brought in to maintain another train wreck, they went from scratching their heads and guessing where the error was to knowing to the line where the error occurred and which values passed as parameters caused it. And my effort was running some automatic code inserters, rewriting any existing error handling, and recompiling the binary.
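    (For contrast, the same anti-pattern and its fix sketched in Python; the work function is a made-up stand-in:)

        import logging

        def process(item):
            # Hypothetical work function: rejects non-numeric input.
            return float(item) * 2

        def resume_next(item):
            try:
                return process(item)
            except Exception:
                pass        # the "On Error Resume Next" style: silent failure

        def handle_properly(item):
            try:
                return process(item)
            except ValueError as e:     # an error we expect: handle and log it
                logging.warning("bad item %r: %s", item, e)
                return None
            # anything unexpected propagates with a stack trace, as it should

        print(resume_next("oops"))      # None, and nobody ever learns why
        print(handle_properly("oops"))  # None, but the failure is logged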
  • Re:Damn straight! (Score:3, Insightful)

    by Fordiman ( 689627 ) <fordiman@gmail.com> on Sunday July 08, 2007 @11:01AM (#19789481) Homepage Journal
    Does boolean arithmetic count as math? I would say so.

    I dunno. The guy's argument from the article seems kinda flawed.

    "A logic circuit is not a sequence of operations."

    No, it's a subset of a sequence of operations. It's a component that fits into a deterministic set thereof, and *should* be calculated via boolean or classical arithmetic.

    "An operating system is not supposed to terminate, nor does it yield a singular solution."

    Then what's "Shut Down" do? And while it doesn't yield a singular solution, it yields a given solution for a given set of inputs.

    "An operating system cannot be deterministic because it must relate to uncoordinated inputs from the outside world."

    Non sequitur. Inputs and processes being parallelized does not preclude the individual logical paths from being deterministic, even if the logical paths use each others' states as inputs.

    "Any program utilising random input to carry out its process, such...is not an algorithm."

    An input does not determine whether something qualifies as an algorithm; even if the random value is a preferred or generated input, it's counted as external to the program.

    Not to mention that internally generated randoms are algorithmic in nature.

    No, seriously. It seems that his entire argument is directed towards changing semantics to take the emphasis off of the mathematical underpinnings of computer science. Rar.
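    (On the point that internally generated randoms are algorithmic: a linear congruential generator, sketched in Python with the well-known Numerical Recipes constants, is nothing but a deterministic recurrence:)

        def lcg(seed, a=1664525, c=1013904223, m=2**32):
            # "Random" numbers from a perfectly deterministic recurrence:
            # x(n+1) = (a * x(n) + c) mod m, scaled into [0, 1).
            x = seed
            while True:
                x = (a * x + c) % m
                yield x / m

        gen = lcg(seed=42)
        print([round(next(gen), 4) for _ in range(3)])
        # The same seed always yields the same sequence: an algorithm, not magic.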
  • by rjpotts ( 152026 ) on Sunday July 08, 2007 @11:14AM (#19789599) Homepage
    I think that there is a little more to it than that. Computer Science is an old term that used to cover the computer industry as a whole, just like Medical Science covers the medical field as a whole. Today you need to look at it with a little more granularity.

    If you are going to go into "Computer Engineering" - designing chips, hardware, developing new low level languages, creating the next user interface layer, writing new operating systems, etc. - then you need mathematics. There is no way you can do your job without it. Maybe even a dual degree in Computer Engineering and (Mechanical or Electrical) Engineering.

    If you are going to develop scientific and engineering based applications you need a strong understanding of mathematics.

    If you are going to design business applications, do systems integration, web applications, etc... then you do not need a strong background in mathematics, but you would stand to benefit from a management, accounting, finance, English, and psychology background; because you more than anyone else are going to be working with the end user. If you cannot win over the end user with your design, functionality, user interface then they will not want to use your product.
  • Re:wahay! (Score:5, Insightful)

    by kestasjk ( 933987 ) on Sunday July 08, 2007 @11:15AM (#19789603) Homepage
    It's lucky Jobs went to his calligraphy classes; if he hadn't we'd all still be using monochrome terminals. (A pretty arrogant thing to imply)
  • by IL-CSIXTY4 ( 801087 ) on Sunday July 08, 2007 @11:18AM (#19789627) Homepage
    Lots of people in this discussion mention that they don't use any of the math they were forced to take in college. I think the problem is that schools are requiring the wrong kinds of math, or maybe they're using math to "weed out" students instead of helping them. I think classes in formal logic and discrete math are invaluable to computer science students. Calc...eh, not so much.
  • by smallfries ( 601545 ) on Sunday July 08, 2007 @11:20AM (#19789657) Homepage
    What you have written is 100% nonsense and puts you in exactly the same crank camp as Fant. It is always interesting to hear people that don't understand computer science describe what is wrong with it. The model of interaction that you (and he) describe is normally called Reactive software, and it is true to say that it cannot be modelled by a Turing Machine as it performs interaction continuously rather than at the beginning and end of the computation.

    From here you've both made a giant leap to assume that programs can't be described by an algorithm. You haven't understood that the difference between a "computation" and "reactive software" is actually a technical triviality that is easily overcome. Indeed it is so trivial that most languages simply ignore it and have stateful operations for input/output. Reactive programs are normally modelled as a sequence of algorithmic steps: everything that the program does apart from sending/receiving data is modelled by an algorithm. So we can either consider this "non-algorithm" to be a sequence of algorithms, or consider the program as an algorithm operating over a larger state that includes the environment. The input/output actions become algorithmic state transitions over the program/environment state. Look at the way programs in CSP/CCS or other process algebras are written to see how this works. To see how the theory of algorithms can be applied to reactive systems, take a look at multi-headed Turing Machines.

    Finally, if you're going to lob a technical term into a discussion then you should understand what it means. Automaton is a well defined term in CS, and it doesn't mean what you think. In particular, what you are describing is not a decision problem, and so there is no problem of language recognition to be solved. I vaguely remember reading the crank research that you are pointing to before, and would like to ask you a simple question: name one problem that you believe can be computed by a UBM, but not by a UTM.
  • by Coryoth ( 254751 ) on Sunday July 08, 2007 @11:22AM (#19789669) Homepage Journal

    Math isn't important to software engineering, but it is of great importance to computer science.
    Math is important to software engineering; it's just that you can get by without it. However, if you want assurances of correctness, then type theory and proof theory are actually rather important; if you want to do concurrency well, then process algebras are a good way to do it right, with assurances that you are getting it right. You can get by quite happily without these things. You can, however, do an even better job with them.
  • by Strilanc ( 1077197 ) on Sunday July 08, 2007 @11:57AM (#19789953)
    Quicksort is randomized, has fantastic cache performance, and its worst case behavior is very unlikely. There are plenty of other randomized algorithms with bad worst-cases but which work better the majority of the time. That's what makes randomized algorithms so sexy.
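    (A sketch of randomized quicksort in Python; a real implementation partitions in place, which is where the cache performance mentioned above comes from:)

        import random

        def quicksort(xs):
            # Expected O(n log n); the O(n^2) worst case needs consistently
            # unlucky pivots, which the random choice makes vanishingly
            # improbable on any particular input.
            if len(xs) <= 1:
                return xs
            pivot = random.choice(xs)
            return (quicksort([x for x in xs if x < pivot])
                    + [x for x in xs if x == pivot]
                    + quicksort([x for x in xs if x > pivot]))

        print(quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
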
  • Re:Damn straight! (Score:5, Insightful)

    by WGR ( 32993 ) on Sunday July 08, 2007 @12:06PM (#19790013) Journal
    Figuring out which sort to use is very seldom what a computer software creator does.

    Mathematics as a base for CS was great in the '50s and '60s, but the real problems in computer software are people problems, not algorithmic ones. Once you program a calculation correctly, it generally stays that way.

    But determining the optimal layout of a form to benefit the users of the system requires observing people and their needs. Understanding what parts of a program are going to be changed because of changing user needs is more important in program design than deciding whether you need a heap sort or insertion sort. Yes, you should know the difference, but you seldom need to program it, just choose the correct one from the system library.

    CS graduates tend to design programs for machine efficiency, not human efficiency. But it is humans that are expensive, not machines.
  • Re:Damn straight! (Score:5, Insightful)

    by Anonymous Coward on Sunday July 08, 2007 @12:06PM (#19790017)

    As a CS major, I sucked HARD at all the math subjects that were supposed to be a given for a CS major, but I was still consistently at or near the top of my class in anything involving actual programming, wrote cleaner, more efficient code, and was able to troubleshoot code much better than virtually all my peers.
    That's like a mechanic who is better at replacing a tie rod than the engineer that designed it.

    You are a programmer, not a computer scientist. I'd hire you to write code based on a specification. I wouldn't hire you to design rendering algorithms. It is too bad they didn't teach you the difference between compsci and programming during day one of your CS program.

  • by Zarf ( 5735 ) on Sunday July 08, 2007 @12:21PM (#19790129) Journal
    From the article:

    "The notion of the algorithm," he concludes "simply does not provide conceptual enlightenment for the questions that most computer scientists are concerned with."

    The assertion that computer science is not math is similar to the assertion in the book "The World is Flat" that the world is now "flatter" than it used to be. In the case of the flat world, Friedman (the author of "The World is Flat") claims the world is flat to create a sense of shock that he can then use to get his message about globalization across. In the case of "computer science is not math", Fant is likewise trying to shock first as a method of capturing attention...

    Most Americans use "math" in the singular; the Brits say "maths", because there are multiple branches of mathematics. What we are discovering is that the tie between arithmetic and calculus on one side and computer science on the other has been falsely reinforced. The fact is there are other branches of mathematics that are more important to computer science, and there are also many new branches of mathematics that need to be developed in order to solve the new kinds of problems we are trying to tackle in modern computer science.

    I am really bothered by programmers who, when I interview them, say they have been writing software for years and can't remember ever having to use math.

    I know they can't possibly mean that... or they don't know what math is...

    I know that in several years of programming you must have at least been tempted to write an if statement or at least one loop of some kind.

    The if statement uses a form of algebra called boolean algebra. It is named after George Boole [wikipedia.org], who was very much a mathematician. Many programmers today use the if statement, and this form of mathematics makes up a large part of many programmers' jobs. I guess it must be falling out of fashion.

    I know how to perform boolean algebraic operations on a whiteboard, and I have many times been confronted with a gigantic morass of if and else-if statements and, using simple truth tables and a little boolean math, reduced enormous sets of ifs down to just a few.
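
    A made-up but typical example of what that reduction looks like in practice:

        from itertools import product

        # Before: a morass of nested ifs (invented permissions example).
        def can_edit_v1(is_admin, is_owner, is_locked):
            if is_admin:
                return True
            else:
                if is_owner:
                    if is_locked:
                        return False
                    else:
                        return True
                else:
                    return False

        # After: write out the truth table, apply a little boolean
        # algebra, and the whole tree collapses to one expression.
        def can_edit_v2(is_admin, is_owner, is_locked):
            return is_admin or (is_owner and not is_locked)

        # The truth table itself, as a check: all 8 rows agree.
        assert all(can_edit_v1(*row) == can_edit_v2(*row)
                   for row in product([True, False], repeat=3))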

    The new computer science needs to focus on solving problems involving processes. Processes are like algorithms in that they have a set of instructions, but they are unlike algorithms in that they also have many temporal components, may exhibit parallelism and asynchronous invocations, and may not have a finite product. These are the types of problems addressed in newer mathematical disciplines that are trying to see information processes not as tied to computing machinery but as tied to the natural world.

    Computer Science may point to a new kind of science that describes an underlying natural computational order of the universe. We are starting to observe computational processes everywhere, from the brains of animals to the interactions of ecosystems to quantum mechanics. We may lack the right mathematics to describe these things, and we may have to invent new kinds of math, but that doesn't mean math becomes unimportant. An understanding of math helps when studying logic, and so too would it help in studying any new disciplines we may need to invent.

    New kinds of math are invented every day to describe new kinds of problems. To say you don't need math to study any formal science, let alone computer science, is just silly. It is just something shocking to say that grabs attention... and the article nearly contradicts itself by the end... and it's only 7 paragraphs. The distinction Fant makes is nearly academic, just as the distinction between a statistician, a geometer (a mathematician who studies geometry), and a logician is academic. Yet that is not what the readers of the headline will read... Fant is arguing to make computer science a new kind of science much as Wolfram has. Yet it would be silly...

  • Re:wahay! (Score:3, Insightful)

    by qdaku ( 729578 ) on Sunday July 08, 2007 @12:35PM (#19790263)
    Most engineers never need calculus either, but we still take an awful lot of it. What use does a geological engineer have for vector calculus? You'd be surprised where it shows up sometimes. You need to know it to understand what's going on behind the scenes in various "standard" equations, modeling programs, etc., so you don't blindly follow what's laid down in a textbook. Most of the great engineers I know in industry and research (pretty closely connected in my industry, actually) are also damn sharp at math.
  • by jgrahn ( 181062 ) on Sunday July 08, 2007 @12:40PM (#19790293)

    Error handling isn't optional. Error handling isn't something that gets added into a system. It should be an integral part of the system's design. Furthermore, with exceptions, error handling is painless.

    I wouldn't use the word "painless". Error handling will always be painful in the general case. It's like going to the dentist. It's painful, but if you don't do it, you can predict endless, total pain later on. But I think you knew that.

    Also, another harbinger of doom: "I don't need to use version control. I'm the only one working on the system."

    A haha haha haha. That's just great.

    But I suspect your real problem is that no one replied: "Good for you. Now go and use version control, or the guards will see you out." You're describing a situation where critical systems are written by people with an unprofessional attitude to their work (not using the tools they should know they need), and no one (except for an AC on Slashdot) appears to check their work.

    I hate micro-management as much as the next guy, but somehow it's more attractive than no management at all.

  • Re:wahay! (Score:3, Insightful)

    by Original Replica ( 908688 ) on Sunday July 08, 2007 @12:49PM (#19790363) Journal
    Like saying relativity wouldn't have been discovered without Einstein.

    I think it's not so much an "if" as a "when". Maybe without Einstein, E=mc^2 wouldn't have been known for another 20 years. Imagine how drastically that would have changed the 20th century. Now, if Jobs hadn't had that artistic side to him, and that had set GUIs back by 10 years, then things like the internet and the adoption of PCs might well be at about the 1997 level right now. And that's assuming that the current Federal administration would have pushed for the internet in the same way that Gore did.
  • Rigorous thinking (Score:1, Insightful)

    by Anonymous Coward on Sunday July 08, 2007 @12:53PM (#19790389)
    In order to program well (by which I mean write code that performs quickly, uses minimal resources, has no (or few) bugs, is easy to change or enhance without creating new bugs, and consistently gets the correct result) you need to be able to think in a very rigorous, structured way. You also need to be able to hold many variables and logical connections in your mind all at once. It just so happens that the practice of mathematics exercises (and hence theoretically improves) this ability.

    The world is full of people who oversimplify what is involved in programming (like my boss, for example...). There is a tendency to leave out relevant variables to make the problem at hand seem simpler to solve, and simpler to implement, than it really is. Because of this, a lot of people think programming is easier than it really is, and they also write programs which seem to get the job done but which wind up causing them problems further down the road. Such people usually aren't big math geeks themselves, don't see the relevance, and are very unlikely to ever become truly great programmers. When things get difficult, they will either run to a truly great programmer (who probably knows a lot about math, too), or try to make the case that the problem cannot be solved (either because there is no solution or because Microsoft didn't provide a robust enough toolset).

    The times when you need to actually perform some calculus in order to write a program are quite rare (depending on the nature of your product, of course); basic arithmetic is all you need most of the time. However, the mental skills that are developed from the study of mathematics are crucial for the ability to program well.

    I will also predict that the likelihood that a person will agree with me on this increases in direct proportion to their years of programming experience in the field (i.e., after completing school).

  • by Animats ( 122034 ) on Sunday July 08, 2007 @12:57PM (#19790423) Homepage

    What math do you need in computer science today? It's a tough call. But today, I'd study number-crunching rather than discrete math.

    I have a classical computer science education - automata theory, number theory, combinatorics, mathematical logic - all discrete math. That's what CS was supposed to be about in the 1980s. It hasn't been enormously useful, and I'm writing this as someone who ran a successful project to develop a proof-of-correctness system. Mathematical logic gets used a little, but tree-like algorithms are more important. I'm not sure automata theory is useful for much of anything. It's one of those things, like proofs in plane geometry, taught in schools because it was the first theory in the field.

    Number-crunching, probability, and statistics seem to be more useful today. If you do anything in graphics, you have to have the 3D transformation math down cold. I've had to do tensor calculus and integration of non-linear differential equations (I used to do physics engines for animation) and that required plowing through difficult math books and getting consulting from university math departments. Bayesian statistics have become very important - it's used in spam filters, search engines, computer vision, and most things that have to intelligently reduce messy real-world data. I didn't get enough of that at Stanford.
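
    To give a taste of why the Bayesian side matters, here's a toy spam score - the word frequencies are invented, and a real filter would estimate them from a corpus of mail:

        # Toy naive-Bayes spam score: P(spam | words) via Bayes' rule,
        # assuming (naively) that words are independent.
        P_SPAM = 0.5                                    # prior
        P_WORD_SPAM = {"viagra": 0.30, "meeting": 0.01}
        P_WORD_HAM  = {"viagra": 0.001, "meeting": 0.10}

        def spam_probability(words):
            p_s, p_h = P_SPAM, 1.0 - P_SPAM
            for w in words:
                p_s *= P_WORD_SPAM.get(w, 0.01)
                p_h *= P_WORD_HAM.get(w, 0.01)
            return p_s / (p_s + p_h)                    # posterior

        print(spam_probability(["viagra"]))    # close to 1
        print(spam_probability(["meeting"]))   # close to 0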

    On the other hand, where are you going to use this stuff? Outside of Google, Microsoft, and universities, who does real CS research any more? All the good research centers (DEC WRL, HP Labs, IBM Almaden, PARC, etc.) are gone, or a shadow of what they once were.

  • Re:wahay! (Score:5, Insightful)

    by GreatBunzinni ( 642500 ) on Sunday July 08, 2007 @12:58PM (#19790429)

    If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do.

    Oh, I see. All this time I was led to believe that Donald Knuth created TeX to satisfy the desperate need for a half-decent digital typography tool, but after all it must have been due to some class that Steve Jobs took when he dropped out of college. Knowing that TeX remains to this day the best typesetting system, and knowing a bit about Adobe and the history of PostScript, I guess that half-baked assertion makes sense and must be true.

    ...or maybe not.

    Please. Steve Jobs doesn't walk on water, nor is he behind every single thing that counts as progress in the computer world. This whole Jobs-worshiping thing is starting to become ridiculous.

  • by smallfries ( 601545 ) on Sunday July 08, 2007 @01:08PM (#19790509) Homepage

    I have no idea what you just said, and I've been coding for years.


    The book (which I haven't read, but I've come across enough crank bullshit over the years to quote it verbatim) is based on the idea that algorithms are the wrong model for programs. It's a poor, misguided idea based on a trivial technicality: an algorithm (by definition) takes an input, performs a computation, and produces an output. Programs do not, and are generally called reactive, as they maintain a dialogue with their environment of many inputs and outputs. It's a technical triviality because, as the GP points out, you can take a series of algorithms and substitute them as the "guts" between each of the I/O operations. Nothing much is lost in this modelling. If you really need to analyse the bits that are missing then just make an I/O operation an atomic part of the model. Process calculi (used for concurrent and parallel systems) take this approach. If you really want to appease the anal fanatic cranks (like the book author) then just explain that all of their reactive components are parts of a large algorithm that encompasses their operation and their "environment".

    But now to my point. I bet that you know more maths than you think you do. It's just that the type of maths you learnt is not the type they teach you in school. It has nothing to do with real numbers or calculus. It's a branch called discrete maths that is at the heart of computer science. You know how to use trees and graphs? They form a big part of discrete maths. How about changing state as your code executes? That's modelled by monads, from category theory. Or do you know more than one language? You said you did CS in college, so I'll guess that you're familiar with parsing / compiler writing / language design. A mathematician would probably call that abstract algebra (or at least it's a close fit).
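
    For instance (a hypothetical example), the dict-of-lists you've probably written a hundred times is a graph in the discrete-maths sense, and walking it is textbook graph theory:

        from collections import deque

        # An adjacency list: a graph, even if you only ever
        # called it "a dict of lists". The data is invented.
        deps = {"app": ["net", "ui"], "ui": ["net"], "net": []}

        def reachable(graph, start):
            # Breadth-first search - straight out of discrete maths.
            seen, queue = {start}, deque([start])
            while queue:
                node = queue.popleft()
                for nxt in graph[node]:
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        print(reachable(deps, "app"))   # {'app', 'ui', 'net'}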

    So you know much more dry dusty old math than you suspect - but for the past fifty years these parts of maths have been called CS. Something that is lost on the book author....
  • Re:wahay! (Score:2, Insightful)

    by smittyoneeach ( 243267 ) * on Sunday July 08, 2007 @01:09PM (#19790523) Homepage Journal
    <sarcasm>Sure am glad that Donald Knuth poseur got no undeserved credit for TeX and METAFONT.</sarcasm>
    Admittedly, those tools target publishing, but still...
  • by itwerx ( 165526 ) on Sunday July 08, 2007 @01:42PM (#19790793) Homepage
    is the same as writing literature with a programming language.

    The reason computer science is so heavily influenced by math is the binary architecture that every piece of hardware is designed around. Every real-world problem, right down to choosing the color of a font, has to be translated into the digital world by algorithmic approximation - a lot of math! And it is this very abstraction that makes computers so "flexible" in what they can do. Analog computers existed many years ago, but they could only ever be built for a single purpose.

    Unfortunately(?), it is much easier to design and mass-produce something based on a finite lowest common denominator (bits) than something based on the continuum that a non-digital solution would require.

    That said, who's to say that a beautiful painting rendered in Gimp/Photoshop isn't a program of sorts? Certainly it has input (from the original creator) and output (its effect on us), and the "code" can be modified to change both!
  • Re:wahay! (Score:4, Insightful)

    by neapolitan ( 1100101 ) on Sunday July 08, 2007 @01:46PM (#19790829)
    > I'd be interested to debate the ideas, point by point.

    Yes, but I wrote a new book that claims that debate is better off without logic. Early debating pioneers such as Kant and Aristotle imposed their logic background on the field, and this has hobbled debate ever since. I reject the idea of convincing arguments as a good way to resolve any conflict.

  • by crashfrog ( 126007 ) on Sunday July 08, 2007 @02:00PM (#19790945) Homepage
    An organism can use math without perceiving it --- take bees, which produce hexagonal honeycomb structures. Do you think they perceive the hexagon shape, or the number six?

    No, but we do. The structure of a honeycomb isn't "hexagonal" until a human is there to call it hexagonal. Prior to that it's just a beehive, made in the way bees make beehives.

    Imagine 2 apples. Until someone is there to count them, the "set of 2 apples" doesn't exist. The apples exist, sure; but the set that encompasses them, and is of cardinality 2, exists only in our minds.

    Mathematicians would be a lot less Platonistic, I think, if they'd take courses in semiotics. There's a big difference between the symbol and its referent. Apples exist; but integers exist only when there's someone there to count them. That's why you can have human societies with no conception of "number" at all - where the only "amounts" of anything they can perceive are none, one, and many.

    You can't escape from mathematics.

    Sure you can. It's not inherent to the universe. Mathematics, being a language, is inherent to the way our minds model reality - with symbols that stand in for referents.
  • by coolGuyZak ( 844482 ) on Sunday July 08, 2007 @02:02PM (#19790961)

    Mathematics still describes them perfectly.

    My argument is that the description may stem from the human perspective. Your use of language supports this, as it implicitly refers to Mathematics as a tool to describe--to model--the universe, rather than the universe as an application of mathematics.

    I'm also at a loss to imagine an organism that can consciously manipulate its environment yet is unable to come up with basic geometry. I realize that proof through incredulity is no proof at all, but please elaborate.

    Consider, for the sake of argument, the Formic [wikipedia.org] perspective. While the Formics come from a fictional world, they don't hold logic as a fundamental building block of the universe. They lack a distinction between truth and falsehood. Instead, they consider everything that has happened truth, and 'forget' anything that turns out to be false. Ender notes this specifically in one of the later books (I can't recall which), and how it increases the difficulty of communication with the Hive Queen. Indeed, the three invasions described by the first book were due to a miscommunication between our race and theirs--Formics didn't understand individuality before they met humanity.

    Another perspective, this time using your bee example: while bees create hexagonal structures, we have no evidence that they do this consciously, nor do we have evidence that an alien culture would manipulate its world consciously. Instead, either could "just know" the solution, and thus have no use for geometry. Humans, meanwhile, see these hexagons and say "See! Math is fundamental, even animals use it!" However, far from being a fundamental concept of the universe, the human is merely applying his world-view (which includes the form and function of a hexagon) to the beehive.

    Sure, you might be able to build a neural net and train it without understanding mathematics. But you wouldn't understand how it worked; when you explored that, you'd find mathematics whether you liked it or not.

    The philosophical question underlying this is: which perspective defines the other? On the one hand, mathematics could underlie the entirety of the universe. If this is the case, then we could, theoretically, find the truth of it. Unfortunately, if mathematics is hard-wired into the brain--if the basic axioms of mathematics are assumptions made by the human mind as a means of interpreting the world--we couldn't see past those assumptions. Every attempt to do so would necessarily rely upon them, creating a circular argument or self-fulfilling world view, so to speak.

    A perhaps interesting analogy: imagine you view the world through emerald lenses. Everything you see would be tinted green, but, having perceived the world this way for the entirety of your life, you would be none the wiser unless someone or something showed you otherwise. Even then, you would be flabbergasted, possibly to the point of denial, if someone were to show you evidence of non-tinted vision. Now, consider the glasses to be mathematics. The eyes are analogous to your brain, and the assumptions fundamental to mathematics are the tint of the glasses. Can you say you'd be any the wiser?

    This logical paradox, of sorts, prevents us from knowing the truth of the universe. In fact, an entire branch of philosophy--epistemology--is dedicated toward investigating what constitutes knowledge & truth. Those philosophers have concluded, at the time of this writing, that mathematics is only true because it is defined independently of our universe. Furthermore, any attempt to apply pure mathematical reasoning to the world at large creates incredible complexity. Consider quantum mechanics, string theory, astrophysics, and other such sciences. Each of these, while functional, sacrifices a great deal of the elegance of mathematics due to an intersection of pure reason with the real world.

  • by Anonymous Coward on Sunday July 08, 2007 @02:07PM (#19790993)
    Actually, it is possible to find the median of a list in O(n) time, and when you use this to pick your pivot, quicksort is in fact O(n log n) in the worst case.
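
    For the curious, a sketch of the classic median-of-medians selection that makes this work (deterministic, worst-case O(n)); hand its result to quicksort as the pivot and the O(n^2) worst case disappears:

        def select(xs, k):
            # k-th smallest element in worst-case O(n) time.
            if len(xs) <= 5:
                return sorted(xs)[k]
            # Median of each group of five, then recurse on those medians.
            groups = [sorted(xs[i:i + 5]) for i in range(0, len(xs), 5)]
            medians = [g[len(g) // 2] for g in groups]
            pivot = select(medians, len(medians) // 2)
            less    = [x for x in xs if x < pivot]
            equal   = [x for x in xs if x == pivot]
            greater = [x for x in xs if x > pivot]
            if k < len(less):
                return select(less, k)
            if k < len(less) + len(equal):
                return pivot
            return select(greater, k - len(less) - len(equal))

        print(select([7, 1, 5, 3, 9, 8, 2, 6, 4], 4))   # 5, the median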
  • by mjmartin_uk ( 776702 ) on Sunday July 08, 2007 @02:14PM (#19791031)
    Screw people who say you need qualifications to be a 'good' programmer. Maybe it helps, but there are plenty of decent programmers in the world who write good code without the supposedly necessary educational background. No area of expertise should be selective based on qualifications alone (except maybe medicine).

    I've seen some truly terrible code by folks with degrees in both Comp Sci and Software Engineering, and even worse code by people doing degrees in so-called 'ICT'.

    The code style was all over the place; one guy's test code didn't even compile!

    As Excelsior said, a good technical interview is the only way. After the first interview, we actually get interviewees to write a simple program in 4 hours and review it with them afterwards. Sometimes that's the only way to tell whether they're any good or whether they'll fit into our organisation.
  • Re:Damn straight! (Score:3, Insightful)

    by quintesse ( 654840 ) on Sunday July 08, 2007 @02:39PM (#19791229)
    Well, we would be talking about a mechanic who could probably build cars all on his own, in new and imaginative configurations sometimes never thought of by the people who designed them. He might not be very good at calculating beforehand how to make the motor run 15% more efficiently, though.

    I think with CS we're still in the "tinkerer stage", where wonderful new things come just as often from the guy working behind his computer in the attic as from the computer science major working in some dev center for IBM.
  • Re:Damn straight! (Score:3, Insightful)

    by WGR ( 32993 ) on Sunday July 08, 2007 @03:43PM (#19791701) Journal
    I have a B.Math in CS and statistics and a CISSP, and I have taught CS at several colleges and universities, so I do know the math.

    But I also have 30 years' experience as a developer and consultant in IT security. CS people with your attitude are the cause of most problems with security, because you assume that there is always a technical solution to problems. Real problems always involve people, and people are what computers are for.

    Remember Hamming's foreword to his book on Numerical Analysis:
    "The purpose of all computing is insight, not numbers"

    Without insight into human behaviour and use of computer systems, you risk creating useless shelfware or avionics software that kills people. Read the RISKS digest for the number of cases where software was written without human considerations and thereby caused harm or failed.
  • Re:wahay! (Score:3, Insightful)

    by fbjon ( 692006 ) on Sunday July 08, 2007 @03:58PM (#19791813) Homepage Journal
    TeX is not a UI renderer, though. Did any OS have nice fonts back then?
  • by msuarezalvarez ( 667058 ) on Sunday July 08, 2007 @04:18PM (#19791949)

    The algorithm is the worst thing to have happened to computing. It is the primary reason that software is unreliable. Programming is hard precisely because it is based on the algorithm. I hope that this new realization among some of us that computing should not be based on the algorithm becomes more widespread. It will usher in the next computer revolution.

    Wow. That's probably the most nonsensical statement I've read on Slashdot in a while, including the huge Iraq-related threads... Quite an accomplishment!

  • Math == Latin (Score:3, Insightful)

    by Tom ( 822 ) on Sunday July 08, 2007 @05:36PM (#19792615) Homepage Journal
    The article completely ignores the most important thing about math: it's a language. A very good language, even - more modern than any of our natural languages, capable of expressing non-Newtonian, non-Euclidean and non-Aristotelian facts about the world that we now know to be true but still struggle to fit into our mother tongues.

    Also, it is what Latin was in the Middle Ages - the common language of learned people all over the world. Scientists from different continents may be barely able to communicate in their respective mother tongues or in English, but if they write down their formulas, each knows exactly what the other is talking about.

    But no, the most important part is that math still evolves, and rapidly. As so many other critics, the author of the article appears to have a very limited understanding of math.
  • Re:Damn straight! (Score:2, Insightful)

    by kartan ( 906030 ) on Sunday July 08, 2007 @07:36PM (#19793671)
    That's exactly the GP's point. You work on a renderer. You're "replacing a tie rod."
  • by Anonymous Coward on Sunday July 08, 2007 @08:13PM (#19793951)

    Let me make this clear: your ability to write code in no way makes you a computer scientist. It's like saying that the ability to operate a forklift makes you a structural engineer. Stop it already.
    Let me make this clear: your training as a computer scientist in no way makes you qualified to write code. It's like saying that a degree in structural engineering makes you a qualified forklift operator.

    The worst code I have ever seen was designed and implemented by CS majors. Give me an SE, a CE, or even an English BA to design and write software. Heck, forget the degree - give me talent and an ability to think logically. Yes, CS theory is important, but leave the application of that theory to people who know how to do it right.
  • "If you have two candidates with identical creditentials except for the certification, it will and should make a difference."

    Rubbish.

    In IT, all a certification means is that one of the candidate's previous employers had a training budget. I'd consider the two candidates exactly equal. If I could only hire one, I'd fall back on my gut feel of which one interviewed better.

    Charles Miller
  • Re:Damn straight! (Score:3, Insightful)

    by nwbvt ( 768631 ) on Sunday July 08, 2007 @09:12PM (#19794371)

    "mathematics as a base for CS was great in the 50's and 60's, but the real problems in computer software are people problems, not algorithmic ones. Once you program a calculation correctly, it generally stays that way."

    You, like the author of this article, are missing the point of an education in mathematics. It isn't to simply teach you algorithms, its a way of thinking through abstraction which is crucial to computer science.

    "But determining the optimal layout of a form to benefit the users of the system requires observing people and their needs."

    That work should be done by a web designer, not a CS grad. Yes, many projects fail to adequately separate out the view from the rest of the app or force the same developers who wrote the backend to also write the user interface. But complaining that the problem is that those developers are learning math instead of human/computer interaction is like making your lawyer do your taxes and then complaining that their education was too focused on constitutional law and not the tax code. If you make someone do the wrong job, they often won't have the right education to do it.

  • by fractoid ( 1076465 ) on Sunday July 08, 2007 @10:29PM (#19794961) Homepage
    You probably aren't programming anything that requires math. Try 3D graphics programming - you need a lot of linear algebra, and some calculus if you're doing any kind of shading. Physics simulations require more differential equations than you can shake a stick at. Lossy compression requires frequency analysis and coordinate transforms. Of course, making business database front ends doesn't require much in the way of maths... *sigh* :/
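
    Even the simplest 3D operation makes the point - spinning a point about the Z axis is a matrix-vector product (a bare-bones sketch; real engines push the same maths through a library):

        import math

        def rotate_z(point, angle):
            # Rotation about the Z axis: a 3x3 matrix-vector product.
            x, y, z = point
            c, s = math.cos(angle), math.sin(angle)
            return (c * x - s * y,
                    s * x + c * y,
                    z)

        print(rotate_z((1.0, 0.0, 0.0), math.pi / 2))   # ~(0.0, 1.0, 0.0)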
  • Re:Damn straight! (Score:3, Insightful)

    by quadelirus ( 694946 ) on Sunday July 08, 2007 @11:01PM (#19795181)
    I agree. I am a CS student and that type of complaint seems to be running rampant through the field. There is a huge difference between computer SCIENCE and software engineering. If you don't want to be doing the math/research side of computers, computer science is not your field.

    Also, anecdotally: I code-monkey (by that I mean do grunt coding work) for a computer vision research group, and it seems, from my admittedly limited experience, that the best work in computer vision is being done by people who have a great command of "pure" math. Without math, computer vision is reduced to trying to code things that kind of work (the hack-it approach), whereas with math the field is building mathematical models of things like shape and is able to push the limits much farther than I think would otherwise be possible. IMHO.
  • Re:Damn straight! (Score:2, Insightful)

    by tarpy ( 447542 ) <tarpy@tarpi f y .com> on Monday July 09, 2007 @09:27AM (#19799319) Homepage

    But determining the optimal layout of a form to benefit the users of the system requires observing people and their needs. Understanding what parts of a program are going to be changed because of changing user needs is more important in program design than deciding whether you need a heap sort or insertion sort. Yes, you should know the difference, but you seldom need to program it, just choose the correct one from the system library. CS graduates tend to design programs for machine efficiency, not human efficiency. But it is humans that are expensive, not machines.
    I think you're confusing HCI [wikipedia.org] with CS. As a person who specialized in HCI in my undergrad [www.cmu.edu], I can tell you my coursework was radically different from my friends who were pure MathCS. While we did a number of courses in common, it was not out of the ordinary for me to be over in the College of Fine Arts taking an industrial design course, or over at the social sciences department taking a course in decision science (which is actually what my degree is in).

    While HCI and CS are two separate yet intertwined disciplines, they are fundamentally different art forms, with different manners of thought, problem solving, techniques, and problem spaces. It would be a mistake to confuse one for the other. That being said, it's been quite useful for me, as I stumble through my career, to have had a good grounding in CS fundamentals. While I'll never need to work out the Big O of a particular interface, having a better-than-the-average-bear's idea of what goes on below those pretty interfaces I design allows me both to meet the users' needs and to make sure the wireframes I deliver don't drive whatever poor engineer has to build the thing to find the nearest firearm and start taking shots at me.

    To your greater point, I think there is some merit: as we move closer and closer to ubiquitous computing, the greatest challenge facing system designers won't be how to eke more horsepower out of the processor; it will be shoehorning the interactivity in seamlessly for the user and the environment. One area where you do see a merging of pure MathCS and HCI is, ironically, aerospace. One of the posters mentioned trying to fly the (sexy) new 787 without a grounding in math... and while I grant that pilots need to know a whole lot of hard science, one of HCI's (er, rather, Human Factors') most obvious areas of impact is in the cockpit and instrument design. Boeing/Airbus/Fokker/whoever spend a lot of time, money, and research on figuring out the most intelligent, intuitive, and natural way of informing the pilot of everything he or she needs to know to make split-second decisions that have literal life-or-death consequences.

    In a graduate course I took on dependable system design, in the very first class the professor had us read portions of the cockpit voice recorder transcript [avweb.com] for American Airlines flight 965 [wikipedia.org]. This was the flight that crashed in the mountains near Cali, Colombia back in 1995. One of the underlying causes of the crash was that the autopilot's interface for entering waypoints was overly complicated; when the pilot-in-command chose a wrong waypoint with a name similar to the one he needed (without the system sanity-checking it and throwing a query back to the cockpit crew), the autopilot literally turned his 757 into a mountain.

    In this case, all the math and science couldn't save the airplane, but perhaps a system designed to check user inputs against some sense of "hey, is this the right data point?" might have allowed the pilots to get out of the situation with nothing worse than a five-minute loop around the mountains back onto their flight path.
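
    The check itself is trivial to express - what was missing was the requirement to perform it. A purely hypothetical sketch (invented names and units, nothing like real avionics code):

        import math

        def distance_km(a, b):
            # Flat-plane approximation: fine for a sketch, not for avionics.
            return math.hypot(a[0] - b[0], a[1] - b[1])

        def accept_waypoint(candidate, current, max_detour_km, ask_crew):
            # Don't silently accept a waypoint far off the present
            # position; query the crew instead of just turning.
            if distance_km(candidate, current) > max_detour_km:
                return ask_crew("Waypoint is %.0f km away - confirm?"
                                % distance_km(candidate, current))
            return True

        # Similar names, very different places: the guard catches it.
        print(accept_waypoint((120.0, 45.0), (0.0, 0.0), 50.0,
                              lambda msg: False))   # False until crew confirms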
