
Forget Math to Become a Great Computer Scientist?

Posted by Zonk
from the this-is-why-i-wasn't-a-good-programmer dept.
Coryoth writes "A new book is trying to claim that computer science is better off without maths. The author claims that early computing pioneers such as Von Neumann and Alan Turing imposed their pure mathematics background on the field, and that this has hobbled computer science ever since. He rejects the idea of algorithms as a good way to think about software. Can you really do computer science well without mathematics? And would you want to?"
This discussion has been archived. No new comments can be posted.

  • Computer science ? (Score:3, Interesting)

    by ivan_w (1115485) on Sunday July 08, 2007 @07:11AM (#19787995) Homepage
This is all fine.. But it doesn't explain something I have long strived to understand:

    What is computer science ?

    Computer engineering.. yeah.. I can understand that.. But man.. Computer SCIENCE ?

    That's like saying 'car science', 'cooking science' or 'go at the bar and have a drink science' !

    --Ivan
  • Sadly mistaken (Score:3, Interesting)

    by Rumagent (86695) on Sunday July 08, 2007 @07:15AM (#19788023)
    Isn't that pretty much the same as arguing that a surgeon doesn't have to know about anatomy? What we do is inherently mathematical - there exists no other way of defining and understanding complexity, computability and so on.

    I agree that you do not need a good understanding of mathematics to create a homepage, but for anything remotely interesting you do.
  • Yes and no (Score:3, Interesting)

    by QX-Mat (460729) on Sunday July 08, 2007 @07:23AM (#19788083)
    I feel that there are a lot of software engineering areas where you don't need much in the way of maths experience - just logical thinking. Most real world math related implementations I've done haven't relied on a high level of maths... linear interpolation and solving quadratics are probably as "tricky" as math goes outside of academia... but...
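For reference, those two "tricky" bits are only a few lines each; a quick Python sketch (function names are just for illustration):

```python
import math

def lerp(a, b, t):
    """Linearly interpolate between a and b for t in [0, 1]."""
    return a + (b - a) * t

def solve_quadratic(a, b, c):
    """Real roots of ax^2 + bx + c = 0 (a != 0), sorted; [] if none."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return []
    r = math.sqrt(disc)
    return sorted({(-b - r) / (2 * a), (-b + r) / (2 * a)})
```

Which rather supports the point: logical thinking carries you a long way before the maths itself gets hard.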

That's not the end of it. I've also done a lot of image manipulation work, and you NEED a good math background when you step beyond simple 2D convolution filters. Knowing your physics also helps - being able to identify trends and patterns in waveforms, and then applying the necessary maths, is a great help. When digging into aliasing and reconstruction, not just filtering, a high math proficiency is a must.

I've taken to game programming recently. If you know your maths, the physics comes easily. If you know your maths, especially advanced vector and matrix theory (with integration and differentiation being prerequisites), things become a breeze. I didn't know enough, and I still struggle from time to time today. Experience is helping me, but sometimes I wish I had a math background to fall back on.

I guess my ramblings are leading to a poor conclusion. Without maths you're limited in what you can do - but you're only limited by lateral field... In most cases you can take a specific soft eng field and go to town without hitting maths. I'm a very good software engineer and reverser, and I got here without having a math background. When I wanted to expand into games programming and image processing, things became much harder without the math.

With all that said, I'm very, very guilty of obscuring simple procedures with valid but pointless math - and I know for a fact there's too much pointless formal theory in computer science now. The pointless formal theory is actually what pushed me away from doing a masters in computer science to find something more vocational and rewarding!

    Matt
  • Re:wahay! (Score:5, Interesting)

    by smilindog2000 (907665) <bill@billrocks.org> on Sunday July 08, 2007 @07:45AM (#19788189) Homepage
I sometimes run into great algorithm programmers who were poor at math, but they're rare, and usually can be explained away based on what kind of drugs they did in college. For a good algorithms guy, I love hiring good mathematicians and physicists. You can train them into great programmers a lot quicker than the other way around. However, algorithms are really a very small part of the programming space we work in. I choose to work in this space because it suits me, but most programmers never need calculus. To build a tree-based data structure and a GUI to drive it takes about an 8th grade level of knowledge. Doing a GUI really well takes creativity I've never had (apparently a lot of guys like me work at M$. I don't know where Apple finds its GUI guys).

The summary of the author's points in the article makes the book sound dead wrong on several counts, though it could just be the review. Procedural languages are the natural way to code most programs, and here's why: we've been recording recipes as a sequence of steps, with if statements and loops, since the invention of writing. It's become encoded in our genes. That's really all that early computer scientists put in our early languages like FORTRAN. It's all the stuff we've added since then that's up for debate, in my mind. The author makes money by pushing the boundaries of computing-model research. I get big programs written by teams by restricting what language features are used, and how. It'd be interesting to debate the ideas, point by point.
  • Re:wahay! (Score:5, Interesting)

    by atrocious cowpat (850512) on Sunday July 08, 2007 @08:28AM (#19788417)
"Doing a GUI really well takes creativity I've never had (apparently a lot of guys like me work at M$. I don't know where Apple finds its GUI guys)."

    Maybe the question should rather be: Why doesn't Microsoft look for the kind of GUI-guys Apple hires. And the answer to that might well be found at the top of each company. A quote from Steve Jobs' Commencement address at Stanford (June 12, 2005):

"Because I had dropped out [of college] and didn't have to take the normal classes, I decided to take a calligraphy class [...]. It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating. None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it's likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do."

    Read the whole thing [stanford.edu], it's quite interesting (if not to say: inspiring).
  • by Anonymous Coward on Sunday July 08, 2007 @08:46AM (#19788505)
    Funny story about work.

The current back-end system that translates front-end customer orders to actual tangible products often fails silently, and the person who wrote it (who's still with us) thinks that's okay.

    Eventually, management got tired of people not getting their orders, or getting the wrong person's order, and not having a way of detecting that there's any problem. So they hired a new guy to write a new production system.

Talking to the new guy, I learned that the system is almost working, but it fails silently, and he'll add error handling if he has time.

    DO PEOPLE NEVER LEARN!?!!

    Error handling isn't optional. Error handling isn't something that gets added into a system. It should be an integral part of the system's design. Furthermore, with exceptions, error handling is painless. There's no excuse for not thinking about it.
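A toy sketch of the point in Python (the names and order format here are made up for illustration, not the actual system): the translation step raises a typed exception instead of silently returning garbage, so a failure anywhere in the pipeline surfaces immediately.

```python
class OrderError(Exception):
    """Raised when a customer order cannot be turned into a product."""

def translate_order(order):
    # Fail loudly: a malformed order raises instead of silently
    # producing a bogus (or missing) product record.
    if "product_id" not in order:
        raise OrderError("order %s has no product_id" % order.get("id"))
    return {"product": order["product_id"], "qty": order.get("qty", 1)}
```

Callers that can recover catch OrderError; everyone else lets it propagate. Nothing disappears silently.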

    This system is also much more complicated than its predecessor. It needs a dedicated server, uses a long-running daemon process that polls(!) a database for something that really should be a simple event-driven process, and still fails silently!

    Also, another harbinger of doom: "I don't need to use version control. I'm the only one working on the system."
  • by syn1kk (1082305) on Sunday July 08, 2007 @08:49AM (#19788535)
    I am a DSP programmer, which basically means that all the stuff I code are mathematical formulas transformed into C code. I mention DSP because writing DSP algorithms forces the programmer to know his math really well... enough so that he can distill the complex math into an efficient C code implementation.

I remember trying to get my specific algorithm to run under 500 microseconds and the best I could get was like 10000 microseconds. My coworker, who looked at the underlying math equations for my code, easily saw a better solution just by looking at the math equations for 5 minutes. After I changed my code to suit the new math equation, I got my code to run at 280 microseconds.

    The whole point of this example:
When you approach the problem from a mathematical viewpoint, you can see far more clearly how to optimize an algorithm. In my case, I got lost looking at the C code and missed the elegant mathematical solution because I did not look at the math equations. So I ended up not being able to "distill the complex math into an efficient C code implementation" and find the elegant solution.

    In my case the elegant-math-derived-solution was about 35 times faster (10000 / 280) than the original solution I had come up with.

    -----

Bottom line: The syntax and compact notation of math equations let you look at a problem from a much higher level of abstraction, and this higher level of abstraction is much more conducive to seeing the elegant best solution (solutions that improve your algorithm by orders of magnitude rather than by some linear constant).
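The same effect shows up even in trivial cases. Here is a toy Python illustration (not the GMSK code, obviously): the loop and the closed form compute the same quantity, but spotting the algebraic identity turns O(n) work into O(1).

```python
def sum_of_squares_loop(n):
    """Straightforward code-level view: add up the squares one by one."""
    total = 0
    for k in range(1, n + 1):
        total += k * k
    return total

def sum_of_squares_closed(n):
    # Math-level view: 1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6
    return n * (n + 1) * (2 * n + 1) // 6
```

Staring at the loop, the speedup is invisible; staring at the sum, it's a textbook identity.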

    p.s. if you were wondering what I was working on --> the function was a GMSK modulator ( http://en.wikipedia.org/wiki/GMSK [wikipedia.org] ) for a transmitter.
  • by QuoteMstr (55051) <dan.colascione@gmail.com> on Sunday July 08, 2007 @09:01AM (#19788605)
You make a good point, but I wonder whether, given how much bad code is written, and how much effort is spent maintaining it, we'd produce more quality code, in the end, with fewer, better programmers.
  • by garcia (6573) on Sunday July 08, 2007 @09:07AM (#19788641) Homepage
    Let me make this clear: your ability to write code in no way makes you a computer scientist. It's like saying that the ability to operate a forklift makes you a structural engineer. Stop it already.

    I never made any such connection. All I said was that people here on Slashdot are on their high horse and think that their shit doesn't stink when it comes to writing code. Please stop THAT.

    We need fewer programmers, not more. Maybe professional certification would help somewhat.

    I don't consider myself a "programmer". I have a background in CS (a minor in it) and I use that knowledge to do my job better. If I had to wait for a "programmer" at my institution to write the code I use I'd never get my work done.
  • by Chemisor (97276) on Sunday July 08, 2007 @09:18AM (#19788707)
    > Good luck on doing a kernel, file system, network stack, crypto, image processing,
    > window manager, animation or 3D without math or algorithms.

And when, may I ask, did you last do any of these things? Only a minuscule portion of us are working on the kernel, file system, or network stack (and none of them involve any math beyond simple algebra). Only one or two of us have ever written a window manager, and that's the way it should be. Only NSA people work with crypto on a regular basis; the rest of us just use premade libraries (made by cryptographers, who require years of practice to become good in their tiny little area of expertise). 3D is all done in hardware these days; software renderers went out of style in the last century and if you are still doing it, your software must be either really slow or running at 320x240. And as for image processing, most of us don't do that either, and when we do, we don't invent the algorithms; we ask mathematicians to do that.
    Face it, most of us write code that has absolutely no math in it (I don't count algebra - it's just the way you write code). We make user interfaces, write database queries, and, I am sorry to say, rewrite parts of application frameworks we don't like (and there are a lot of such parts). Instead of math and algorithmic theory, these tasks would benefit from knowing how to structure your code properly, how to ensure portability and ease of localization, and, most importantly, how to correctly think about object oriented design.

    > I look forward to reviewing some of this guys code.

    Yeah, do look forward to it, since you'll be waiting a long time to see anyone write an "algorithm". I haven't written one in years, and even then it was for a software renderer made for fun. Real programming is about arranging objects and the control and data flow between them, not about crunching numbers. Numeric algorithms only exist in academia and in a few specialized libraries that nobody wants to, or should, rewrite. Design is what programming is about and that is what programmers should know and be taught. Mathematicians be damned!
  • by timeOday (582209) on Sunday July 08, 2007 @10:02AM (#19789029)

    Maths IS needed for computer science. Just be sure not to confuse Computer Science with Software Engineering.
I don't think that's what we're talking about. A more interesting question, I think, is whether "true AI," should it come to pass, will be derived from basic principles (i.e. math) or based on heuristics (i.e. not math). After laying the groundwork in the first few years of digital computers, the theory of computing has not progressed very much! There is no proof that encryption is secure. Quicksort, which is O(n^2) in the worst case, generally outperforms the O(n log n) algorithms. There is still not even a proof that P != NP, even though it seems obvious. I think what has been proven is that most problem classes of interest are undecidable or intractable. But so what? You can still get along quite well in the world without a provably optimal solution for most choices. So now theory is concerned with deriving probabilistic bounds on accuracy and runtime for heuristic methods. I would call that nice to have, but is it necessary?
  • by Anonymous Brave Guy (457657) on Sunday July 08, 2007 @10:54AM (#19789413)

3D is all done in hardware these days

I hate to break this to you, but no. Drawing in 3D is often heavily supported by hardware and coded using libraries like OpenGL or Direct3D. But how do you think these libraries know where to put things? Someone's doing all the modelling to decide what to draw, and the requirements for this can be quite specialised if you're working in a field like CAD, or games development, or special effects. And did you think newer versions of the libraries themselves grow on trees?

    Face it, most of us write code that has absolutely no math in it (I don't count algebra - it's just the way you write code). We make user interfaces, write database queries, and, I am sorry to say, rewrite parts of application frameworks we don't like (and there are a lot of such parts). Instead of math and algorithmic theory, these tasks would benefit from knowing how to structure your code properly, how to ensure portability and ease of localization, and, most importantly, how to correctly think about object oriented design.

    OK, but we're not talking about Software Engineering, we're talking about Computer Science. We're not talking about join-the-dots programming, we're talking about how you design the dots and the ways to join them. And you'd better hope the people who are doing that have sound theoretical bases for the models they adopt. Otherwise your database is going to be slow, your garbage collector will leak, your network stack is going to be insecure, and your operating system is going to deadlock.

  • by coolGuyZak (844482) on Sunday July 08, 2007 @11:16AM (#19789619)

    Statements like this make a sweeping assumption: that the fundamental theorems of mathematics are not the formalization of concepts hard-wired into the brain. For instance, the existence of prime numbers wouldn't be obvious to an organism that never used integers. Similarly, it may be possible to discover alien life that never had a use for the Pythagorean Theorem (perhaps they don't perceive space?).

    Thus, I believe that your statement is incomplete. Some classes of problems, particularly algorithms, use math by their nature. However, had the discipline branched off of, say, psychology, those classes of problems could be as atrophied as human computer interaction was a few years ago. It is reasonable to assume that CS as a whole would be vastly different. Would architectures resemble the brain? Would they be chemical rather than electrical? Programming languages may be easier to use, but chances are they would lack orthogonality, closure, etc. What would be more entertaining is a computer programmed like Pavlov's Dog...

    In an extreme formulation of this idea, certain elements of computer science may not even exist--imagine algorithm development with my latter example. To consider something a bit closer to home, what if the base discipline of computer science was linguistics [perl.org]?

  • by QuoteMstr (55051) <dan.colascione@gmail.com> on Sunday July 08, 2007 @11:27AM (#19789707)
    No, math is not hard-wired into the brain. It's hard-wired into the universe.

    An organism can use math without perceiving it --- take bees, which produce hexagonal honeycomb structures. Do you think they perceive the hexagon shape, or the number six? No. They've just evolved behaviors to produce those shapes. Mathematics still describes them perfectly.

    I'm also at a loss to imagine an organism that can manipulate its environment consciously that is unable to come up with basic geometry. I realize that proof through incredulity is no proof at all, but please elaborate.

    Anyway, regardless of whether computer science had originated in linguistics, chemistry, biology, or history, there would have eventually been a need to formally describe how it works. To do that, mathematical concepts would be involved.

    If computer science had originated in psychology and its first focus had been human-computer interaction (odd, since computer science existed before computers), then we would have had a need to describe data structures used in HCI, and a way to explain how to manipulate them. Bam, you've just re-discovered the mathematics of CS.

    Sure, you might be able to build a neural net and train it without understanding mathematics. But you wouldn't understand how it worked; when you explored that, you'd find mathematics whether you liked it or not.

    You can't escape from mathematics. It's there whether you want to use it or not, whether you use the numeral '1' or a picture of a very small cow to represent unity, or the word "axiom" or "chicken" to describe a basic assumption.
  • by Anonymous Coward on Sunday July 08, 2007 @12:00PM (#19789973)
    Steve Jobs is half Arab, his real father was a Syrian student, named Abdulfattah John Jandali (http://en.wikipedia.org/wiki/Steve_Jobs). Arabs are well known for their love of calligraphy.

    Anyway, the older he gets, the more Arabic he looks, he DEFINITELY has an Arabic (well, Semitic) nose, with proper clothing he could easily pass for a Middle East born, Arab or Jew.

I wonder how his Arabic background affects his business. Does the government suspect him as a (potential) terrorist? He uses a corporate jet, which is good under these circumstances. He is also careful and does not wear a beard or a turban.
  • ...like solving string subsequence matching (comparing DNA sequences)...

    Last year I worked on just that. The (Smith-Waterman) algorithm is well studied, so I didn't have to derive all the math for it. What I did have to derive is the speedup gained by using our hardware. That required some algebra. I also did Gaussian smoothing on the data. That required some image processing math. Once upon a time I coded PHP/MySQL stuff for various web companies. I had to do two different kinds of math with that: accounting and statistical work including Chi squared, etc. Graphing and displaying all that data was real simple algebra stuff. It wasn't satisfying for me so I looked into more serious science work.

I found the more serious work. My minor in math is, for the most part, insufficient for my current work. In the past year I:
    1. Worked out an edge detection algorithm using wavelets. Wavelets use tensor math -- math not covered until the third and fourth year for math majors. I never could get a full grip on the math. Fortunately, I found and ended up using a book that had all the algorithms for it already coded.
    2. Worked on path planning for robots using clothoids and Bezier curves. The algorithms to interpolate my existing data for those are too math-heavy for me. Have you ever tried to find the intersection point of two clothoids or Bezier curves? Find the nearest point on a clothoid to a given point? Or mix the two? It's tough stuff. It's loaded with numeric methods. My BS in computer science and minor in math didn't prepare me for that.
    3. Worked on converting 3D data between various map projections.
    4. Worked on CAD software that allows manipulation of 3D shapes in a 3D environment. It's loaded with trig and linear algebra.
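For the curious, evaluating a single point on a Bezier curve is one of the few short parts of that work; a Python sketch of De Casteljau's algorithm (the intersection and nearest-point queries mentioned above are the genuinely hard, numeric-methods-heavy part, and are not shown):

```python
def bezier_point(ctrl, t):
    """Evaluate a Bezier curve with control points `ctrl` at parameter t
    via De Casteljau's algorithm: repeatedly lerp adjacent points until
    one point remains."""
    pts = [tuple(p) for p in ctrl]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]
```

Works for any degree and any dimension; the curve passes through the first and last control points at t = 0 and t = 1.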

    I could go on with various little details. Suffice it to say that it's darn frustrating when you're supposed to code a fancy wavelet demo and you can't read any book on the topic because it's over your head.

I had a class in college on algorithms. The teacher was fantastic. He had an excellent skill at pointing out "now that's where computer science becomes science." I remember his passion for back-propagation and all the little tricks to it he knew from study and experiment. That was fun.
  • by ml10422 (448562) on Sunday July 08, 2007 @01:28PM (#19790689)
    I'm just a GUI programmer, but every once in a while, the need for math crops up without warning: How do I quickly find the object closest to the place where the user clicked? How do I fill that polygon that the user just drew with color? What's the matrix math for converting between color spaces?
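The first of those is the classic nearest-neighbour query; a brute-force Python sketch (the object layout here is hypothetical), using squared distance so no square root is needed — the ordering is the same either way:

```python
def closest_object(objects, click):
    """Return the object whose (x, y) position is nearest the click.
    Squared Euclidean distance preserves the ordering, so sqrt is
    unnecessary."""
    cx, cy = click
    return min(objects, key=lambda o: (o["x"] - cx) ** 2 + (o["y"] - cy) ** 2)
```

For a handful of widgets the linear scan is fine; with thousands of objects you'd reach for a spatial index (a grid or k-d tree), which is exactly where the math creeps back in.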
  • by tempest69 (572798) on Sunday July 08, 2007 @01:33PM (#19790743) Journal
Science is about asking questions that haven't been suitably answered for your purposes yet. E.g. how can we losslessly edit lossy image formats? Or how can we optimize the new database once we have installed solid state drives? Or how do we prevent a lockup with this new 128-core chip with 5 dedicated lock bits?


Engineering is about building things once you understand the concept behind them. So building a JPEG viewer/writer isn't science anymore; it was back circa 1980. Don't get me wrong, building one without libraries is a mess, but it could be done. Or building a DES encryption box: it isn't easy, but it's not science anymore. It's engineering, and we need a bunch of good software engineers because they are really hard to find.


Most coders don't really fall into the engineer category: unless they are design-pattern zealots or have a robust methodology for producing code, they are just using logic and application. This is where most of the really cool toys get built; it's also where 99.8% of the absurdly buggy code comes from. Most computer scientists don't engineer their experiments, i.e. buggy test code. Without these people we don't really value software engineers. Without computer science, software engineers are stuck using the same tools over and over.



    Storm

  • Re:wahay! (Score:3, Interesting)

    by smitty_one_each (243267) * on Sunday July 08, 2007 @01:49PM (#19790855) Homepage Journal
    Right, well, asserting that Knuth's work had absolutely no influence on operating system fonts, as Jobs appears to have done in his statement, seems fishy.
  • Re:wahay! (Score:4, Interesting)

    by Larry Lightbulb (781175) on Sunday July 08, 2007 @02:14PM (#19791033)

    Procedural languages are the natural way to code most programs, and here's why: we've been recording recipes as a sequence of steps, with if statements and loops, since the invention of writing.
It's worth noting that about 150 years ago Isabella Mary Mayson, universally known as Mrs Beeton, changed how recipes were written. She had the radical idea of putting the ingredients at the start of the recipe, and came up with the thought that it might be a good idea to write down how long something should be cooked for.
  • Re:wahay! (Score:1, Interesting)

    by Anonymous Coward on Sunday July 08, 2007 @03:16PM (#19791485)
    And oddly enough, according to the people who designed the original Mac, Steve Jobs tried to kill the Mac off a number of times... only the passion of the engineers kept it going on the sly. Every time Jobs found out they were still working on it, he tried to kill it off again. The Mac doesn't exist *because* of Steve Jobs, it exists *in spite* of Steve Jobs.
  • by Anonymous Coward on Sunday July 08, 2007 @04:16PM (#19791925)

    Quicksort, which is O(n^2), generally outperforms the O(nLog(n)) algorithms.


Yeah, that is also explained by math. By the way, you can make sure that Quicksort runs in O(n log n), but it is not a good idea, because MATHEMATICS shows that the worst case is so improbable that guarding against it is not practical.
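For illustration, the usual compromise is a random pivot - expected O(n log n) on every input, with the quadratic worst case still possible but astronomically unlikely. A Python sketch:

```python
import random

def quicksort(xs):
    """Quicksort with a random pivot. Expected O(n log n) comparisons;
    the O(n^2) worst case requires consistently terrible pivot luck."""
    if len(xs) <= 1:
        return xs
    pivot = random.choice(xs)
    less = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

The guaranteed-O(n log n) variants (e.g. median-of-medians pivot selection) carry constant factors large enough that, in practice, the probabilistic argument wins.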
  • Project Euler (Score:3, Interesting)

    by TeknoHog (164938) on Sunday July 08, 2007 @07:31PM (#19793635) Homepage Journal
    A few weeks ago I came across Project Euler [projecteuler.net]. Most of the exercises are good examples why math is good for coding; they have brute-force solutions that take a lot of time, but clever solutions should always take less than a minute to run.
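Problem 1 is the canonical example: sum the multiples of 3 or 5 below a bound. Brute force loops over every integer; inclusion-exclusion plus the arithmetic-series formula gets the same answer in constant time. A Python sketch:

```python
def brute_force(n):
    """O(n): test every integer below n."""
    return sum(k for k in range(n) if k % 3 == 0 or k % 5 == 0)

def closed_form(n):
    """O(1): sum multiples of 3, plus multiples of 5, minus multiples
    of 15 (counted twice), each via the arithmetic-series formula."""
    def s(m):
        q = (n - 1) // m          # count of multiples of m below n
        return m * q * (q + 1) // 2
    return s(3) + s(5) - s(15)
```

At n = 1000 both are instant, but the same trick is what separates a minutes-long brute force from a sub-second solution on the later problems.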
  • Re:Damn straight! (Score:3, Interesting)

    by nwbvt (768631) on Monday July 09, 2007 @01:33AM (#19796311)

    "Are you really implying that your CS is useless. "

Are you really implying that the lawyer who doesn't know accounting is useless? Or if you refuse to accept that analogy, what about a lawyer who primarily knows constitutional law when you are looking for a divorce attorney? Are they useless because their education focused on a subject different from the one you are currently looking for?

Yes, the user interface is an important part of most software systems. But software systems have many parts, and the user interface is usually a very small one, despite its importance. You ask about my particular work: no, I do not develop our product's UI. That is done by other developers who I work closely with and who (hopefully) have more training in the subject. However, at my previous position they had me developing the web tier (it should be noted that the actual designs were developed by dedicated usability experts; we merely implemented them), and that was the main reason I started looking for something different.

You are also right in that software applications often have problems in the UI. But that is not because CS students are learning mathematics. It's because employers fail to recognize that developing a UI is different from developing back-end software and put the wrong person on the job (as happened at my previous job). It's a problem with management, not education.

Do you know why frameworks like Struts became popular in the first place? It's because they separated the view from the model and the controller of the application (hence the term MVC). The reason for this is not some pure academic pursuit of software purity; the goal is to have a different person, someone who specializes in human-computer interaction, develop the application's view.

    "I am the original author and I do have a math degree from U of Waterloo and do know math very well."

I'm sorry, but if you think memorizing formulas is what mathematics is about, I don't care what degree you have: you don't know mathematics.

    "The algorithmic problems of computer science are minuscule compared to the human interface design problems and a CS education that spends more time on algorithmic complexity than psychology produces graduates who make poor CS researchers, let alone practitioners."

Have you spent even an hour on a university campus in the past decade? There are programs all over the world dedicated to human-computer interaction. In fact they are often very popular with students, who enjoy working with something that at least appears to be physical. The fact that you (and many employers) are ignorant of such programs does not mean there is a problem in our education system; it means there is a problem with you and those employers.

But again, the UI is not the extent of software. I am aware that many people with a poor understanding of software do not realize that, since the UI is all they see, but if you have half the qualifications you claim, you should know that is not the case.

  • by Apro+im (241275) on Monday July 09, 2007 @01:36AM (#19796331) Homepage
    ... and here we find the fundamental problem. Programming != Computer Science.

    More accurately, a programmer is not necessarily a computer scientist any more than a computer scientist is necessarily a programmer. Neither is better or worse than the other, and both should know something about the other's skill set, but in practice, there are many amazing programmers who are poor computer scientists, and even more great computer scientists who are poor programmers.

    I would classify programmers as people who can get a computer to do what they want it to, and the measure of the skill of a programmer is how their code performs on some set of metrics (performance, reusability, readability, etc.)

    On the other hand, computer scientists are people who figure out what they can get a computer to do and how to do it. More often than not, these people work in research labs and in academia, and their measure of performance is how many (usefully) novel methods they've found of doing things or how many new things they've figured out they can make computers do. In most cases, aptitude in more advanced math does help computer scientists, although in some sub-fields, there is less dependency on this.
