Forget Math to Become a Great Computer Scientist?
Coryoth writes "A new book is trying to claim that computer science is better off without maths. The author claims that early computing pioneers such as Von Neumann and Alan Turing imposed their pure mathematics background on the field, and that this has hobbled computer science ever since. He rejects the idea of algorithms as a good way to think about software. Can you really do computer science well without mathematics? And would you want to?"
Damn straight! (Score:4, Insightful)
Re:Damn straight! (Score:5, Funny)
Re:Damn straight! (Score:4, Insightful)
My last honors math class had 3 boys in it, out of twenty.
Re:Damn straight! (Score:5, Funny)
Re: (Score:3, Informative)
It was well known that men tend in much greater numbers towards genius and retardation. Part of his reasoning was that there may just be a smaller pool to draw from for the very top coupled with fundamental differences in brain development between th
Re: (Score:3, Insightful)
I dunno. The guy's argument from the article seems kinda flawed.
"A logic circuit is not a sequence of operations."
No, it's a subset of a sequence of operations. It's a component that fits into a deterministic set thereof, and *should* be calculated via boolean or classical arithmetic.
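The claim that a logic circuit "*should* be calculated via boolean or classical arithmetic" is easy to make concrete. Here's a minimal sketch in Python of a half adder, a hypothetical example (not from the article) showing a circuit reduced to boolean arithmetic:

```python
# A logic circuit modelled as plain boolean arithmetic: a half adder.

def half_adder(a: int, b: int) -> tuple:
    """Return (sum_bit, carry_bit) for two one-bit inputs."""
    s = a ^ b      # XOR gives the sum bit
    carry = a & b  # AND gives the carry bit
    return s, carry

# Exhaustively checking the truth table is exactly the kind of
# deterministic, calculable behaviour described above.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        assert a + b == s + 2 * c
```

The exhaustive check works precisely because the circuit is a finite, deterministic function of its inputs.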
"An operating system is not supposed to terminate, nor does it yield a singular solution."
Then what's "Shut Down" do? And while it doesn't yield a singular solution, it
Re:Damn straight! (Score:5, Informative)
I think he's astroturfing for the pro-patents lobby.
One of the reasons you can't patent software in the EU (and probably many other places) is that algorithms are essentially mathematical constructs, and maths is generally regarded as unpatentable.
So maybe one of the big software houses has decided that the next time they go to court over patents, it might be useful to have a scholarly book saying how algorithms are not in fact math based, and should therefore be patentable.
It would also explain the odd references to circuit boards - which are another arguing point in the patent debate. If it has a physical expression, the argument goes, then it can't be maths.
Re:Damn straight! (Score:5, Insightful)
mathematics as a base for CS was great in the '50s and '60s, but the real problems in computer software are people problems, not algorithmic ones. Once you program a calculation correctly, it generally stays that way.
But determining the optimal layout of a form to benefit the users of the system requires observing people and their needs. Understanding what parts of a program are going to be changed because of changing user needs is more important in program design than deciding whether you need a heap sort or insertion sort. Yes, you should know the difference, but you seldom need to program it, just choose the correct one from the system library.
CS graduates tend to design programs for machine efficiency, not human efficiency. But it is humans that are expensive, not machines.
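The "know the difference, but choose from the library" point above can be sketched in Python. This is an illustrative example (not from the post): a textbook insertion sort next to the library routine you'd actually ship.

```python
# Know the algorithm; ship the library call.
import random

def insertion_sort(xs):
    """Textbook insertion sort -- worth understanding, rarely worth hand-rolling."""
    out = list(xs)
    for i in range(1, len(out)):
        key = out[i]
        j = i - 1
        while j >= 0 and out[j] > key:
            out[j + 1] = out[j]  # shift larger elements right
            j -= 1
        out[j + 1] = key
    return out

data = [random.randint(0, 1000) for _ in range(100)]
# In production code you'd just call the library's sort (Timsort in CPython):
assert insertion_sort(data) == sorted(data)
```

The library version is the one that has been profiled, debugged, and maintained by someone else, which is the whole point of the parent's argument.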
Re: (Score:3, Insightful)
But I also have 30 years experience as a developer and consultant in IT security. CS people with your attitude are the cause of most problems with security because you assume that there always is a technical solution to problems. Real problems always involve people and people are what computers are for.
Remember Hamming's foreword to his book on Numerical Analysis.
"The purpose of all comp
SCIENCE != ENGINEERING != APPLICATION (Score:3, Interesting)
Engineering is about building things once you understand the concept behind them. So building a jpg viewer/writer isn't science anymore; it was back circa 1980. Don't get me wrong, building one in
Re:Damn straight! (Score:5, Insightful)
You are a programmer, not a computer scientist. I'd hire you to write code based on a specification. I wouldn't hire you to design rendering algorithms. It is too bad they didn't teach you the difference between compsci and programming during day one of your CS program.
Re: (Score:3, Insightful)
I think with CS we're still in the "tinkerer stage" where wonderful new things come just as often from the guy working behind his computer in the attic as from the computer science major working in some dev center for IBM.
Computer Science != Software Engineering (Score:5, Insightful)
Re:Computer Science != Software Engineering (Score:5, Interesting)
Re:Computer Science != Software Engineering (Score:5, Insightful)
Re:Computer Science != Software Engineering (Score:4, Interesting)
Last year I worked on just that. The (Smith-Waterman) algorithm is well studied, so I didn't have to derive all the math for it. What I did have to derive is the speedup gained by using our hardware. That required some algebra. I also did Gaussian smoothing on the data. That required some image processing math. Once upon a time I coded PHP/MySQL stuff for various web companies. I had to do two different kinds of math with that: accounting and statistical work including Chi squared, etc. Graphing and displaying all that data was real simple algebra stuff. It wasn't satisfying for me so I looked into more serious science work.
I found the more serious work. My minor in math is, for the most part, insufficient for my current work. In the past year I
1. Worked out an edge detection algorithm using wavelets. Wavelets use tensor math -- math not covered until the third and fourth year for math majors. I never could get a full grip on the math. Fortunately, I found and ended up using a book that had all the algorithms for it already coded.
2. Worked on path planning for robots using clothoids and Bezier curves. The algorithms to interpolate my existing data for those are too math-heavy for me. Have you ever tried to find the intersection point of two clothoids or Bezier curves? Find the nearest point on a clothoid to a given point? Or mix the two? It's tough stuff. It's loaded with numeric methods. My BS in computer science and minor in math didn't prepare me for that.
3. Worked on converting 3D data between various map projections.
4. Worked on CAD software that allows manipulation of 3D shapes in a 3D environment. It's loaded with trig and linear algebra.
I could go on with various little details. Suffice it to say that it's darn frustrating when you're supposed to code a fancy wavelet demo and you can't read any book on the topic because it's over your head.
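For a taste of the Bezier maths mentioned above, here is a minimal sketch of evaluating a point on a cubic Bezier curve via de Casteljau's algorithm. This is a generic illustration, not the poster's actual CAD or path-planning code:

```python
# De Casteljau evaluation of a cubic Bezier curve: repeated linear
# interpolation between control points -- pure algebra, no magic.

def lerp(p, q, t):
    """Linear interpolation between points p and q at parameter t."""
    return tuple(a + t * (b - a) for a, b in zip(p, q))

def cubic_bezier(p0, p1, p2, p3, t):
    """Point on the cubic Bezier defined by four control points, 0 <= t <= 1."""
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# The curve reproduces its endpoints exactly at t=0 and t=1:
assert cubic_bezier((0, 0), (1, 2), (3, 2), (4, 0), 0.0) == (0, 0)
assert cubic_bezier((0, 0), (1, 2), (3, 2), (4, 0), 1.0) == (4, 0)
```

Evaluating a point is the easy part; the hard problems the poster mentions (nearest point, curve-curve intersection) require numeric root-finding on top of this.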
I had a class in college on algorithms. The teacher was fantastic. He had an excellent skill at pointing out "now that's where computer science becomes science." I remember his passion for back-propagation and all the little tricks to it he knew from study and experiment. That was fun.
Re: (Score:3, Insightful)
Re: (Score:3, Informative)
As if computer science wasn't stunted enough (Score:5, Insightful)
COMPUTING IS HARD. You can't dumb it down just because it would be nice to do so. And I'm sorry but mathematics is just the way in which meaning is expressed for machines. There's no free lunch here. And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.
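One way to read the deconstruction claim above: an unbounded loop whose body is an ordinary terminating function. Here's a hypothetical Python sketch (illustrative names, not from the post):

```python
# A "non-terminating" program decomposed into terminating algorithms:
# each pass through the outer loop is a plain input -> output computation.

def handle_event(event):
    """A terminating algorithm: takes an input, computes, returns an output."""
    return event.upper()

def run(events):
    # The outer loop never needs to terminate in principle; in a real
    # system 'events' would be an endless stream ('while True: wait()').
    results = []
    for event in events:
        results.append(handle_event(event))
    return results

assert run(["open", "click"]) == ["OPEN", "CLICK"]
```

Nothing about the endless outer loop changes the analysis of each step, which is why the "programs aren't algorithms" objection loses most of its force.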
Re:As if computer science wasn't stunted enough (Score:5, Insightful)
People have been fucking saying this about various versions of BASIC since the beginning. Instead of trashing it, what did BASIC's various incarnations teach us?
It taught us that Microsoft could roll what amounts to a scripting language into its Office line and make the programs ever more powerful without having to relearn something completely new and difficult. An education in just about any language, a book or a list of commands, and some time and you will have a fully functional module or two that saves you a ton of time and energy.
I honestly think a lot of the hostility, here, towards VB has to do with the fact that now pretty much anyone can write code and that it's from Microsoft. If you're somehow saying that if they used C/C++ or even Perl that their code would somehow be wonderful or safe, you're insane.
COMPUTING IS HARD. You can't dumb it down just because it would be nice to do so. And I'm sorry but mathematics is just the way in which meaning is expressed for machines. There's no free lunch here. And he's wrong about algorithms too - since a non-terminating algorithm is always expressible by deconstruction into a series of terminating algorithms.
I agree and while most applications require this, if you look at VB as a way to either get people started coding or to do quick things because it's built into the system instead of concerning yourself with the necessity of math-based algorithms, it serves its need.
I'm no math whiz but I can write code (in languages other than VB) and so can plenty of others. Enough putting people down and being on your high-horse because you write in such and such. Math is important to CS and so is easy access to be able to write code.
Re:As if computer science wasn't stunted enough (Score:4, Insightful)
It's just not that easy to do some of the cool stuff we want to do. No amount of wishing it were different is ever going to change that.
Re:As if computer science wasn't stunted enough (Score:4, Insightful)
No, the hostility is because now pretty much anyone THINKS he can write code, which lowers the valuation of people who actually can do it. That lowers software quality on two fronts: people who can program are forced to write lower quality code because they need to write more to compete with too many amateurs (in the derogatory sense of the word), and people who can't really program write code that doesn't handle errors properly and fails, often silently and undetected, when the input deviates from the expected.
Re:As if computer science wasn't stunted enough (Score:5, Interesting)
The current back-end system that translates front-end customer orders to actual tangible products often fails silently, and the person who wrote it (who's still with us), thinks that's okay.
Eventually, management got tired of people not getting their orders, or getting the wrong person's order, and not having a way of detecting that there's any problem. So they hired a new guy to write a new production system.
Talking to the new guy, he said that the system is almost working, but it fails silently, and he should add error handling if he has time.
DO PEOPLE NEVER LEARN!?!!
Error handling isn't optional. Error handling isn't something that gets added into a system. It should be an integral part of the system's design. Furthermore, with exceptions, error handling is painless. There's no excuse for not thinking about it.
This system is also much more complicated than its predecessor. It needs a dedicated server, uses a long-running daemon process that polls(!) a database for something that really should be a simple event-driven process, and still fails silently!
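The point about exceptions above, sketched in Python with hypothetical order-processing names (not the actual system described):

```python
# Fail loudly, not silently: the caller can choose how to react to a
# failure, but it can never pass unnoticed the way a bare 'return None' can.

class OrderError(Exception):
    """Raised when an incoming order cannot be translated to a product."""

def translate_order(order):
    if "product_id" not in order:
        raise OrderError(f"order {order.get('id', '?')} has no product_id")
    return {"product": order["product_id"], "qty": order.get("qty", 1)}

try:
    translate_order({"id": 17})
except OrderError as e:
    print("rejected:", e)
```

Designed this way from the start, the error path is just part of the function's contract rather than something "added if there's time".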
Also, another harbinger of doom: "I don't need to use version control. I'm the only one working on the system."
Re: (Score:3, Insightful)
DO PEOPLE NEVER LEARN!?!!
No, apparently not.
I work for a major UK public service, working on tools and content surrounding an international standard. A lot of the code is VB6 and VBA. And the bulk of it has "On Error Resume Next" at the top of every routine. One of our contractors has an IDE plugin that inserts this piece of code automatically (the very idea of which is enough to make me froth at the mouth). The rationale is twofold
Re:As if computer science wasn't stunted enough (Score:4, Insightful)
I wouldn't use the word "painless". Error handling will always be painful in the general case. It's like going to the dentist. It's painful, but if you don't do it, you can predict endless, total pain later on. But I think you knew that.
A haha haha haha. That's just great.
But I suspect your real problem is that no one replied: "Good for you. Now go on and use version control, or the guards will see you out." You're describing a situation where critical systems are written by people with an unprofessional attitude to their work (not using the tools they should know they need), and no one (except for an AC on Slashdot) appears to check their work.
I hate micro-management as much as the next guy, but somehow it's more attractive than no management at all.
Some people shouldn't code production systems (Score:5, Insightful)
That said, I'm sure you're good at what you do. I bet you can write good code in VB, as well as many other languages. This isn't a personal insult. VB, PHP, and other brutish languages are equally bad in my eyes.
These languages are brutish because they oversimplify key concepts. That oversimplification also makes them attractive to new programmers, and new programmers universally write terrible code. The languages themselves aren't bad, the coders are. That said, more experienced coders will generally choose more capable languages, so most of the time, a program written in a brutish language will be a bad one.
We need fewer programmers, not more. Maybe professional certification would help somewhat.
(Incidentally, we were lucky that Javascript became the de-facto client-side web language. We could have done far, far worse, and although we can change server languages, we can't change a user's web browser!)
Re:Some people shouldn't code production systems (Score:4, Insightful)
I agree professional certification may help improve software in critical areas. Hell, at my workplace we sometimes hire EEs over CS if they're capable of learning to code. CS certification would probably improve our applicant pool (but we probably couldn't afford to hire them). It seems to me that it would also have many deleterious effects if the requirement for it were applied with too wide a brush.
Re: (Score:3, Interesting)
Re:Some people shouldn't code production systems (Score:5, Insightful)
Certifications aren't worth the paper they're printed on. (The same, it seems, goes for degrees.)
Re:Some people shouldn't code production systems (Score:5, Funny)
Re:$1 billion error at Fannie Mae (Score:5, Informative)
It was garden variety executive directed securities fraud. Not errors created by poor VB scripts in Excel.
VI. MISAPPLICATIONS OF GAAP, WEAK INTERNAL CONTROLS, AND
IMPROPER EARNINGS MANAGEMENT
As noted in previous chapters of this report, the extreme predictability of the financial results reported by Fannie Mae from 1998 through 2003 was an illusion deliberately and systematically created by senior management. This chapter provides specific examples of how senior executives exploited the weaknesses of the Enterprise's accounting to accomplish improper earnings management and misapply Generally Accepted Accounting Principles (GAAP), and how they used a variety of transactions and accounting manipulations to fine-tune the Enterprise's annual earnings results. Those actions aimed to perpetuate management's reputation for achieving smooth and predictable double-digit growth in earnings per share and for keeping Fannie Mae's risk low, while assuring maximum funding of the pool from which senior management would receive bonus payments under the Enterprise's Annual Incentive Plan as well as maximum payments under other, longer-term executive compensation plans.
To provide context for the technical material that follows, the chapter first expands on several issues raised in the previous chapters by elaborating on the concept of improper earnings management and describing the circumstances that demonstrate that Fannie Mae senior management must have been aware of the evolving official concerns about such practices.
Following those discussions, the chapter reviews the improper accounting policies and control weaknesses that created opportunities for inappropriate manipulation of earnings at the Enterprise. The chapter then describes inappropriate accounting undertaken to avoid recording other-than-temporary impairment losses to avoid earnings volatility. The chapter concludes with discussions of several additional techniques used by senior management to fine-tune reported earnings results.
The actions and inactions of Fannie Mae senior management described in this chapter constituted unsafe and unsound practices that involved failures to comply with a number of statutory and other requirements. Several independent authorities, for example, require the Enterprise to verify and submit financial information. The Fannie Mae Charter Act--the statute that created the Enterprise--specifically requires that quarterly and annual reports of financial conditions and operations be prepared in accordance with GAAP. The Federal Housing Enterprises Financial Safety and Soundness Act of 1992, OFHEO's organic statute, requires Fannie Mae to provide OFHEO with reports on its financial condition and operations.
Similarly, regulations promulgated by OFHEO under that statute require the Enterprise to prepare and submit financial and other disclosures that include supporting financial information and certifications, on matters such as its financial condition, the results of its operations, business developments, and management's expectations.
Moreover, in accordance with applicable safety and soundness authorities, Fannie Mae should have had an effective system of internal controls in place under which:
policies and procedures would be sufficient to assure that the organizational structure of the Enterprise and the assignment of responsibilities within that structure would provide clear accountability;
policies and procedures would be adequate to manage and safeguard assets, and assure compliance with applicable law and regulation;
policies and procedures would assure reports and documents would be generated that are timely, complete, and sufficient for directors and management to make informed decisions by providing relevant information with an appropriate level of detail; and
policies and procedures for managing changes in risk would be sufficient to permit the prudent management of balance sh
I think the author is making a more subtle point (Score:5, Insightful)
If you buy that argument, then treating CS as if it were merely simply another branch of mathematics will not help solve those problems.
Of course, this also takes us into the perennial debate over where to draw the line between "computer science" and "software engineering". One could certainly define away the author's problem by saying that his examples are software engineering issues rather than computer science issues. And it's true that software engineering has been driving a lot of the theory with respect to expressiveness (design patterns and the like). But that view also seems to really impoverish computer science - if all you leave the field of computer science is the stereotypical mathematics, why not just become an applied mathematics major?
Re:I think the author is making a more subtle poin (Score:5, Informative)
Re:As if computer science wasn't stunted enough (Score:5, Funny)
Re:As if computer science wasn't stunted enough (Score:5, Insightful)
But not for humans.
I have no idea what you just said, and I've been coding for years.
I agree that computing is hard. Well, I find it easy, but I agree that, in general, if you're going to use a computer, you're going to learn some logic, and I will not help you to avoid thinking.
But 99% of the apps I write require little to no knowledge of mathematics, beyond basic algebra and arithmetic, and maybe a few binary tricks. In particular, Calculus was a required course for CS in college, and I have never used Calculus in my computing, even programming. Ever.
I have not read that book, but I would argue that a big reason computer science is stunted is this absurd relation to mathematics. You need better-than-average communication skills if you're ever going to work on a team -- at least to comment your code. You need philosophy -- yes, philosophy! -- at least as much as you need math, and a decent logic class would be even more useful. And you need research skills a bit beyond justfuckinggoogleit.com, if you're going to RTFM -- considering you may have to go looking for a manual, or even the right software to begin with.
Re:As if computer science wasn't stunted enough (Score:5, Insightful)
The book (which I haven't read, but I've come across enough crank bullshit over the years to quote verbatim) is based on the idea that algorithms are the wrong model for programs. It's a poor misguided idea based on a trivial technicality - an algorithm (by definition) takes an input, performs a computation, and produces an output. Programs do not, and are generally called reactive as they maintain a dialogue with their environment of many inputs and outputs. It's a technical triviality because, as the GP points out, you can take a series of algorithms and substitute them as the "guts" between each of the I/O operations. Nothing much is lost in this modelling. If you really need to analyse the bits that are missing then just make an I/O operation an atomic part of the model. Process calculi (used for concurrent and parallel systems) take this approach. If you really want to appease the anal fanatic cranks (like the book author) then just explain that all of their reactive components are parts of a large algorithm that encompasses their operation and their "environment".
But now to my point. I bet that you know more maths than you think you do. It's just that the type of maths you learnt is not the type they teach you in school. It has nothing to do with real numbers or calculus. It's a branch called discrete maths that is at the heart of computer science. You know how to use trees and graphs? They form a big part of discrete maths. How about changing state as your code executes? That's actually a branch called monad theory. Or do you know more than one language? You said you did CS in college so I'll guess that you're familiar with parsing / compiler writing / language design. A mathematician would probably call that abstract algebra (or at least it's a close fit).
So you know much more dry dusty old math than you suspect - but for the past fifty years these parts of maths have been called CS. Something that is lost on the book author....
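The "trees and graphs are discrete maths" point made concrete: a breadth-first search computes shortest hop-counts, a classic discrete-maths result you run rather than prove. This is a generic sketch, not from the post:

```python
# Breadth-first search over an adjacency-list graph: the dictionary of
# distances it builds is exactly the shortest-path metric of graph theory.
from collections import deque

def bfs_distances(graph, start):
    """Shortest hop-counts from 'start' in an unweighted directed graph."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nbr in graph.get(node, []):
            if nbr not in dist:           # first visit is the shortest path
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
assert bfs_distances(g, "a") == {"a": 0, "b": 1, "c": 1, "d": 2}
```

A programmer who writes this fluently is doing discrete mathematics whether they call it that or not.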
Applied mathematics (Score:5, Insightful)
WTF is the author smoking? There are of course parts of CS that are less involved in math, but it is still overall a fundamental part.
Re: (Score:3, Insightful)
But you don't engineer a bridge by thinking about the interaction of individual atoms, not because that isn't the "right" way of doing it, but because it takes too long and is too expensive.
The article makes a good point saying that the obsession with mathematics at the exclusion of all else in computational theory is not necessarily a good thing for the IT field. Mathematics is on such a low level of abstraction that it is mostly use
Re:Applied mathematics (Score:5, Insightful)
Well, actually you do, and in multiple respects too. From designing and evaluating the longevity of the applied materials to the interaction between components, those aspects must be considered in multi-million dollar projects that include bridge building.
Besides that, civil and structural engineers also have to consider the mechanics of materials and also wave propagation. What field of science covers that? Physics, of course.
No it doesn't. The only point it makes is that the author does not have a clue about what mathematics is. Mathematics isn't algebra or calculus. Math isn't adding up numbers or multiplying things. Mathematics is structured deductive reasoning, which builds up from a solid starting point (axioms) and serves to represent, communicate and help with the construction of ideas. Those tasks are the basis of computer science, whether you are writing an operating system, designing a database, building a network or even creating a new language/protocol. If you take math out of the picture, you simply cannot do anything in CS, whether you actually know you are applying mathematical concepts or not.
Re: (Score:3, Insightful)
Re:Applied mathematics (Score:5, Interesting)
Statements like this make a sweeping assumption: that the fundamental theorems of mathematics are not the formalization of concepts hard-wired into the brain. For instance, the existence of prime numbers wouldn't be obvious to an organism that never used integers. Similarly, it may be possible to discover alien life that never had a use for the Pythagorean Theorem (perhaps they don't perceive space?).
Thus, I believe that your statement is incomplete. Some classes of problems, particularly algorithms, use math by their nature. However, had the discipline branched off of, say, psychology, those classes of problems could be as atrophied as human computer interaction was a few years ago. It is reasonable to assume that CS as a whole would be vastly different. Would architectures resemble the brain? Would they be chemical rather than electrical? Programming languages may be easier to use, but chances are they would lack orthogonality, closure, etc. What would be more entertaining is a computer programmed like Pavlov's Dog...
In an extreme formulation of this idea, certain elements of computer science may not even exist--imagine algorithm development with my latter example. To consider something a bit closer to home, what if the base discipline of computer science was linguistics [perl.org]?
Re:Applied mathematics (Score:4, Interesting)
An organism can use math without perceiving it --- take bees, which produce hexagonal honeycomb structures. Do you think they perceive the hexagon shape, or the number six? No. They've just evolved behaviors to produce those shapes. Mathematics still describes them perfectly.
I'm also at a loss to imagine an organism that can consciously manipulate its environment yet is unable to come up with basic geometry. I realize that proof through incredulity is no proof at all, but please elaborate.
Anyway, regardless of whether computer science had originated in linguistics, chemistry, biology, or history, there would have eventually been a need to formally describe how it works. To do that, mathematical concepts would be involved.
If computer science had originated in psychology and its first focus had been human-computer interaction (odd, since computer science existed before computers), then we would have had a need to describe data structures used in HCI, and a way to explain how to manipulate them. Bam, you've just re-discovered the mathematics of CS.
Sure, you might be able to build a neural net and train it without understanding mathematics. But you wouldn't understand how it worked; when you explored that, you'd find mathematics whether you liked it or not.
You can't escape from mathematics. It's there whether you want to use it or not, whether you use the numeral '1' or a picture of a very small cow to represent unity, or the word "axiom" or "chicken" to describe a basic assumption.
Re: (Score:3, Insightful)
My argument is that the description may stem from the human perspective. Your use of language supports this, as it implicitly refers to Mathematics as a tool to describe--to model--the universe, rather than the universe as an application of mathematics.
Porn (Score:5, Insightful)
Re:Porn (Score:5, Funny)
It's a symbiotic relationship.
Sure thing Einstein (Score:5, Insightful)
Oh? And when did you last write any? (Score:3, Interesting)
> window manager, animation or 3D without math or algorithms.
And when, may I ask, did you last do any of these things? Only a minuscule portion of us are working on the kernel, file system, or network stack (and none of them involve any math beyond simple algebra). Only one or two of us has ever written a window manager, and that's the way it should be. Only NSA people work with crypto on a regular basis; the rest of u
Re: (Score:3, Interesting)
I hate to break this to you, but no. Drawing in 3D is often heavily supported by hardware and coded using libraries like OpenGL or Direct3D. But how do you think these libraries know where to put things? Someone's doing all the modelling to decide what to draw, and the requirements for this can be quite specialised if you're working in a field like CAD, or games development, or special effects. And who did you think newer versions of the libraries themselves grow on t
Re: (Score:3, Informative)
But that wasn't what I was talking about, as I thought was pretty clear from my previous comment. Sure, the rendering itself is fairly straightforward, but how do you decide what to render? And yes, I do do this for a living, and the maths and algorithms involved in serious CAD (for example) are not tr
Re:Sure thing Einstein (Score:5, Informative)
Knock yourself out. [theseusresearch.com] Whether you agree or disagree with this guy, it's obvious his credentials put him at a level above 95% of the people criticizing him here.
Computer science ? (Score:3, Interesting)
What is computer science?
Computer engineering.. yeah.. I can understand that.. But man.. Computer SCIENCE?
That's like saying 'car science', 'cooking science' or 'go to the bar and have a drink science'!
--Ivan
Re:Computer science ? (Score:5, Insightful)
That said, I believe there's a useful set of relationships well understood in other fields:
Science = The search for fundamental knowledge and predictive models;
Engineering = The creative application of the results of science;
Technology = The routine application of the results of engineering.
giving us, for example:
Science: Physics
Engineering: Electrical engineering
Technology: TV Repair, Cable TV Installation
The punch line is that application of this model to computing works as follows:
Science: Mathematics
Engineering: Programming, Informatics, "Computer Science"
Technology: Coding, Computer Installation, Home Computer Repair, etc.
Mathematics IS the science in "Computer Science".
Anyone who has studied advanced Mathematics knows that Math is not about numbers; think of mathematical logic, Boolean algebra, abstract algebra, set theory, topology, category theory, etc. ad infinitum. Dijkstra defined Mathematics as "the art of precise reasoning". In the same sense, "computation" doesn't mean "number crunching", but more generally the automated manipulation of information.
It is true that there are legitimate concerns in today's computational landscape (networking, concurrency, etc.) which didn't figure in the mathematical/engineering world view of the 1940s, but that's simply a sign that the field has grown up (i.e. grown beyond the limited perspectives of its founders). That's also true in many other applications of Mathematics. For example, early research in differential equations paid much more attention to linear differential equations (because they were more tractable). However, we now know that most "interesting" systems in the real world involve non-linearity.
Science, Engineering, and Technology share with living systems an important rule: "Grow or die!" Fortunately, the field of computing has grown.
Re: (Score:3, Informative)
Actually, "Informatics" (which is, as you say, an incorrect term in English) is used in other languages to label "Computer Science". In Dutch it is "Informatica", in German it is "Informatik" and in French it is "Informatique" (sorry, I am now at the boundaries of my own language skills). All three translate to "Computer Science".
I have to admit that I prefer the English term, because it says much more than the Dutch, French and German terms. Fact is: "Informatics" is the same thing as "Computer Science".
Re: (Score:3, Insightful)
The Science in Computer Science consists largely of niches carved out of other disciplines e.g. algorithm analysis and crypto are mathematics, user interface design is psychology, computer graphics is really about approximating physics, audio compression is mathematics, psychology and physiology, AI steals ideas from biology... every now and then we find out that the p
Sadly mistaken (Score:3, Interesting)
I agree that you do not need a good understanding of mathematics to create a homepage, but for anything remotely interesting you do.
Math not essential - Logic is! (Score:5, Insightful)
Re: (Score:3, Insightful)
Math isn't important to software engineering, but it is of great importance to computer science.
Re:Math not essential - Logic is! (Score:5, Insightful)
He has no idea what math is (Score:5, Insightful)
Re: (Score:3, Funny)
2. Narrowly define algorithms so that they don't include all computer expressions.
3. Proclaim that computer science doesn't need math.
4. ????
5. Profit!!
Wrong, on many levels (Score:5, Insightful)
He also ignores the vast array of work on non-deterministic algorithms, stating that "Any program utilising random input to carry out its process, such...is not an algorithm". Sure, it's not a deterministic algorithm, but even if you artificially restrict your definition of algorithm to just be deterministic, it's a useful tool in analysing such problems.
Finally, statements such as "Computer science does not need a theory of computation" are just so bizarre as to be funny. I suggest he forgets all he knows about formal computational theory, and I'll contract "Theseus Research" to write me a program to determine the halting problem for an arbitrary program. I wonder what his bid will be, given that he doesn't need a theory of computation (that would tell him you can't do it, at least with our models of computation - and probably with any).
Now, all of this is not to say you can't make progress in computer science without the mathematics that has been developed so far - however, you will either spend a lot of time achieving unreliable results, reinvent the wheel, or just create a new branch of mathematics.
Math is a subset of the bigger picture of ..... (Score:5, Insightful)
And computer science, the software side, is really the science of abstraction physics.
http://threeseas.net/abstraction_physics.html [threeseas.net]
At some point, in the higher levels of abstraction creation and use, the lower mathematical level serves more as a carrier wave for the higher-level abstraction than as a means of performing a mathematical calculation. The analogy is using radio waves to carry the music you hear over the radio: the carrier wave is discarded after it has done its job. Likewise, the mathematics of computers boils down to the binary flipping of transistor switches, upon which the higher levels of mathematics are carried.
With a correct approach to the abstraction-manipulation machines computers really are, we can accomplish a lot more, similar to the difference between using the limitations of Roman numerals for math and using the decimal system with its zero placeholder.
Re: (Score:3, Informative)
How in depth the book goes I do not know, but I do know I've been on about the abstraction perspective for nearly two decades, communicating it to everyone I can, including those in positions at universities.
I have noticed these last few years there are others beginning to grasp the bigger picture, such as J. Wing of CMU and her "Computational Thinking" perspective http://www.cs.cmu.edu/computational_thinking.html [cmu.edu] per
Yes and no (Score:3, Interesting)
That's not the end of it. I've also done a lot of image-manipulation work, and you NEED a good math background once you step beyond simple 2D convolution filters. Knowing your physics also helps: being able to identify trends and patterns in waveforms, and then apply the necessary maths, is a great help. When digging into aliasing and reconstruction, not just filtering, high math proficiency is a must.
I've taken to game programming recently. If you know your maths, the physics comes easily. If you know your maths, especially advanced vector and matrix theory (with integration and differentiation as prerequisites), things become a breeze. I didn't know enough, and I still struggle from time to time today. Experience is helping me, but sometimes I wish I had a math background to rely on.
I guess my ramblings are leading to a poor conclusion. Without maths you're limited in what you can do - but only laterally, by field... In most cases you can take a specific software engineering field and go to town without hitting maths. I'm a very good software engineer and reverser, and I got here without a math background. When I wanted to expand into games programming and image processing, things became much harder without the math.
With all that said, I'm very, very guilty of obscuring simple procedures with valid but pointless math - and I know for a fact there's too much pointless formal theory in computer science now. That pointless formal theory is actually what pushed me away from doing a masters in computer science and towards something more vocational and rewarding!
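As a worked illustration of the kind of vector math meant above (a standard textbook identity, not anyone's actual game code), here is the reflection formula r = v - 2(v·n)n that shows up constantly in bounce physics:

```python
def reflect(v, n):
    """Reflect vector v off a surface with unit normal n:
    r = v - 2*(v . n)*n  -- the workhorse of simple bounce physics."""
    dot = sum(vi * ni for vi, ni in zip(v, n))          # v . n
    return tuple(vi - 2 * dot * ni for vi, ni in zip(v, n))

# A ball moving at (3, -4) bouncing off a flat floor with normal (0, 1):
print(reflect((3.0, -4.0), (0.0, 1.0)))  # (3.0, 4.0)
```

Knowing the dot product makes this one line of algebra; without it, you end up special-casing every wall orientation.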
Matt
Re: (Score:3, Insightful)
Whether that is a sensible way to look at things or not really depends on your
Teh Maths (Score:5, Insightful)
But the kicker is that you can't just tell a student that they should "study vector math" because one day they'll write a Quake Mod, because, truth be told, they probably won't. It's the trouble with all examples you give when students ask how math will be useful -- I could pull any number of examples from my life, but the problem is, they probably won't happen in a student's life. Instead, they'll have their own trials. The best you can tell someone is to study all the math they can, because some day it *might* be useful, and they'll want to have that tool in their toolkit.
And that's just not a very satisfying answer to students who want to make sure that they'll be damn well using what you're teaching in the future.
But believe me, I thought I'd never have an application for eigenvectors, and now not only do I have to clean out my brain on the topic, but I have to parse someone else's code (PhD thesis code no less) and add functionality to it. Two other friends of mine got stuck on legacy Fortran apps which are essentially mathematical solvers (one for differential equations, the other for huge linear algebra problems), and both of them are extremely happy they paid attention in their respective math classes.
So, yeah. To CSE students out there: take math. Pay attention. It could very well save your neck some day at a job, and if it doesn't, at least try to make it interesting to yourself by thinking of applications where you might use it. You can quite easily find applications for all the math taught in the first two years of college.
Re: (Score:3, Insightful)
On the subject at hand though, the real key to why math is needed in Computer Science is due to the analytical nature of the subject. If you do not know how to solve problems, then you will hit many dead-ends in Computer Science. Math isn't just about solving mathematical problems, it is about looking at a problem and working out a solution. Computer Science is about looking at a problem an
Anti-Intellectualism (Score:3, Insightful)
Just as few telescope makers are astrophysicists, most programmers aren't computer scientists. The author himself is evidently not one. Instead, he is one of the more vocal members of an angry, ignorant mob trying to burn down the edifice of computer science. Its members do not understand it, so they fear it and try to destroy it --- look what's happened to computer science at universities!
It was bad enough when courses about a programming language replaced ones about algorithms and data structures (I'm looking at you, Java and D-flat). It was bad enough when pure profit became the raison d'etre of computer science departments. It was bad enough when I noticed my peers start to joke about how they didn't care about this "maths bullshit" and just wanted to earn more money. It was bad enough when the object, not the bit, became the fundamental unit of information.
But what this author advocates is still worse. He's proposing that we replace the study of computer science with vocational programming and call that emaciated husk "computer science." We already have a "theory of process expression": it's the rigorous study of algorithms and data structures. We've constructed it over the past 50-odd years, and it's served us quite well.
That field has given us not only staples like A* pathfinding, but a whole vocabulary with which to talk about algorithms -- how do you say that a scheduler is O(log N) in the number of processes except to, well, say it's O(log N)? You can't talk about computer science without talking about algorithms.
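To make the O(log N) example concrete, here's a toy sketch (hypothetical, not any real kernel's scheduler) built on a binary heap, whose push and pop are each O(log N) in the number of queued processes:

```python
import heapq

class ToyScheduler:
    """Toy priority scheduler: add() and pick_next() are O(log N)
    in the number of runnable processes, thanks to the binary heap."""
    def __init__(self):
        self._heap = []  # (priority, pid) pairs; smaller = more urgent

    def add(self, pid, priority):
        heapq.heappush(self._heap, (priority, pid))    # O(log N)

    def pick_next(self):
        return heapq.heappop(self._heap)[1]            # O(log N)

sched = ToyScheduler()
sched.add("editor", 5)
sched.add("audio", 1)
sched.add("indexer", 9)
print(sched.pick_next())  # audio  -- the most urgent process runs first
```

Without the vocabulary of asymptotic analysis, there's no concise way to say why this beats scanning a linked list of processes on every context switch.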
The author's fearful denunciation of algorithms is only one manifestation of the anti-intellectualism that's sweeping computer science. "We don't need to understand the underpinnings of how things work", the angry mob chants, "but only implement the right interfaces and everything will happen automatically."
The members of this angry mob sometimes manage to cobble something of a program together, but it's more like a huge rock pile than a colonnade. It often barely works, uses too much memory, doesn't handle corner cases, and is likely to crash. (See worsethanfailure.com.) Members of this mob even think that if the same algorithm is expressed in two different languages, it's two different processes. People like this ask painful questions like, "i know quicksort in C# but can someone teach it to me in PHP?"
Argh.
Even "new" developments in programming are mostly old ideas in new trappings, with fashions that come and go over the years. The only really new things are algorithms, and increasingly we're calling people who couldn't independently invent bubble sort "computer scientists." It's ridiculous. Call computer science what it is, and create a separate label and department for people who can program but not discover new things.
It's this idea that one doesn't need to understand or think to be successful that's at the center of the article, and it's not just affecting computer science. Look around you. I wonder whether we'll fall into an old science fiction cliché and regress so far that we are unable to understand or recreate the technology of our ancestors.
CS - MA = IS (Score:3, Insightful)
It covers networking, scripting, database management, web design, hardware, etc. It's computer science without the science.
Also, Computer Science != Programming:
Lemme guess (Score:5, Insightful)
Computers are (by their very definition as well as by the word used to describe them) mathematical machines. A computer can essentially do NOTHING BUT calculate. It can in its core add, subtract, shift and move data around. How is this supposed to work without math?
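To illustrate (a standard textbook construction, not any particular CPU's circuit), even multiplication reduces to exactly the add/shift/test primitives just listed:

```python
def shift_add_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only the primitives
    the post lists: add, shift, and a conditional test on one bit."""
    result = 0
    while b:
        if b & 1:          # low bit of the multiplier is set:
            result += a    #   add in the shifted multiplicand
        a <<= 1            # shift multiplicand left
        b >>= 1            # shift multiplier right
    return result

print(shift_add_multiply(13, 11))  # 143
```

Every higher-level operation a computer performs ultimately bottoms out in reductions like this one, which is the poster's point.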
Without math -- you get lost in the code! (Score:5, Interesting)
I remember trying to get a specific algorithm of mine to run in under 500 microseconds, and the best I could get was about 10,000 microseconds. My coworker, who looked at the underlying math equations for my code, easily saw a better solution just by studying the math for 5 minutes. After I changed my code to suit the new math equation, I got it to run in 280 microseconds.
The whole point of this example:
When you approach the solution from a mathematical viewpoint, it becomes much easier to see how to optimize an algorithm. In my case, I got lost looking at the C code and missed the elegant mathematical solution because I never looked at the math equations. So I was unable to "distill the complex math into an efficient C implementation" and find the elegant solution.
In my case the elegant-math-derived-solution was about 35 times faster (10000 / 280) than the original solution I had come up with.
-----
Bottom line: the syntax and compact notation of math equations let you look at a problem from a much higher level of abstraction, and this higher level is much more conducive to seeing the elegant, best solution (solutions that improve your algorithm by orders of magnitude rather than by some constant factor).
p.s. if you were wondering what I was working on --> the function was a GMSK modulator ( http://en.wikipedia.org/wiki/GMSK [wikipedia.org] ) for a transmitter.
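The GMSK code itself isn't shown here, so as a hypothetical stand-in, here is the same pattern in miniature: an algebraic rewrite (Horner's rule) that a line-by-line reading of the code would never suggest:

```python
def poly_naive(coeffs, x):
    """Direct translation of  c0 + c1*x + c2*x^2 + ...
    Each term recomputes a power of x from scratch."""
    return sum(c * x**i for i, c in enumerate(coeffs))

def poly_horner(coeffs, x):
    """The same polynomial after an algebraic rewrite (Horner's rule):
    c0 + x*(c1 + x*(c2 + ...)) -- one multiply and add per term, no powers."""
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

coeffs = [2.0, -1.0, 3.0]  # 2 - x + 3x^2
assert poly_naive(coeffs, 2.0) == poly_horner(coeffs, 2.0) == 12.0
```

The speedup comes entirely from staring at the equation, not the code, which is exactly what my coworker did.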
I could care less about Computer Science (Score:4, Insightful)
What I'd like is an arts program that concentrates on programming. I'd like something that stresses *reading* and *writing*. I want people to learn how to *communicate* in these programming languages; not just with the computer, but also with their fellow programmers. I'd like people to do research in language design where they ask the question, "How can I allow the programmer to be more expressive about their intent?" I'd like classes on collaborative design. I could go on forever.
I was at the art gallery the other day and wandered into the modern art section. They had a display of a particular type of performance art where someone would write out a description of an artwork on 3x5 index cards. A bunch of other artists would take the description and make the art. Along with the index cards and pictures of the finished work, there were a couple of letters. The letters described the disappointment the original artists had in the finished work. They even went so far as to accuse the artists following the instructions of being "incompetent".
I described this to a programmer colleague of mine. His response was, "Wow... I didn't know I was a performance artist". I can count the number of times in the last 20 years that I've had to do hard math in my job as a programmer on my fingers. But questions like, "How the hell did you think *that* was readable", "How can I turn a bunch of requirements into something that isn't crap", "How do I get 10 guys working on a project and have a single vision", etc, etc, etc; those questions I ask every day.
Sure computer science is important and personally I think math is a part of that. But, someday I hope someone will realize that programming is an *artistic* endeavor and we need to do a crap load of research in that area.
Disagree (Score:3, Insightful)
Algebra is an obvious key to understanding computation. Discrete mathematics including probability and combinatorics tend to pop up in computing problems over a wide range of disciplines.
On the other hand, it would not be unfair to suggest that computing is more useful to calculus than calculus is to computer science. Continuous mathematics, like calculus, shows up rarely if ever in most computer science specialties.
Fant also seems to be stuck on the word "algorithm." Computer scientists have a very different definition of an algorithm than mathematicians. LISP was the only moderately successful attempt to introduce computer scientists to the mathematical notion of an algorithm. I'll take the groans and dearth of hands raised to the question, "Is LISP your primary programming language?" as proof of just how little regard computer scientists have for the mathematical notion of an algorithm.
Holy ignorant masses Batman! (Score:5, Insightful)
"The notion of the algorithm," he concludes, "simply does not provide conceptual enlightenment for the questions that most computer scientists are concerned with."
The assertion that computer science is not math is similar to the assertion made in the book "The World is Flat" saying the world is now "flatter" than it used to be. In the case of the flat world, Friedman (the author of "The World is Flat") claims the world is flat to create a sense of shock that he can then use to get his message about globalization across. In the case of "computer science is not math" Fant here is trying first to shock as a method of capturing attention...
Most Americans use math in the singular. The Brits say maths. That is because there are multiple branches of mathematics. What we are discovering is that the tie between arithmetic and calculus and computer science is falsely reinforced. The fact is there are other branches of mathematics that are more important to computer science. There are also many new branches of mathematics that need to be developed in order to solve the new kinds of problems we are trying to solve in modern computer science.
I am really bothered by programmers who, when I interview them, say they have been writing software for years and can't remember ever having to use math.
I know they can't possibly mean that... or they don't know what math is...
I know that in several years of programming you must have at least been tempted to write an if statement or at least one loop of some kind.
The if statement uses a form of algebra called Boolean algebra, named after George Boole [wikipedia.org], who was very much a mathematician. Many programmers today use the if statement, and this form of mathematics makes up a large part of many programmers' jobs. I guess it must be falling out of fashion.
I know how to perform Boolean algebraic operations on a whiteboard, and I have many times been confronted with a gigantic morass of if and else-if statements; using simple truth tables and a little Boolean math, I have reduced enormous sets of ifs down to just a few.
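A made-up miniature of that whiteboard exercise (the flag names are hypothetical, invented for illustration):

```python
def access_verbose(admin, owner, locked):
    """A typical morass of ifs, transcribed literally."""
    if admin:
        return True
    else:
        if owner:
            if locked:
                return False
            else:
                return True
        else:
            return False

def access_boolean(admin, owner, locked):
    """The same truth table after Boolean simplification:
    admin OR (owner AND NOT locked)."""
    return admin or (owner and not locked)

# The two agree on all 8 rows of the truth table:
from itertools import product
assert all(access_verbose(a, o, l) == access_boolean(a, o, l)
           for a, o, l in product([False, True], repeat=3))
```

Ten lines of branching collapse to one expression, and the truth table is the proof that the collapse is safe.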
The new computer science needs to focus on solving problems involving processes. Processes are like algorithms in that they have a set of instructions, but unlike algorithms they also have many temporal components, may exhibit parallelism and asynchronous invocations, and may not have a finite product. These are the types of problems addressed in newer mathematical disciplines that are trying to see information processes not as tied to computing machinery but as tied to the natural world.
Computer Science may point to a new kind of science that describes an underlying natural computational order of the universe. We are starting to observe computational processes everywhere: in the brains of animals, in the interactions of ecosystems, in quantum mechanics. We may lack the right mathematics to describe these things, and we may have to invent new kinds of math, but that doesn't mean math becomes unimportant. An understanding of math helps when studying logic, and so too would it help in studying any new disciplines we may need to invent.
New kinds of math are invented every day to describe new kinds of problems. To say you don't need math to study any formal science let alone computer science is just silly. It is just something shocking to say that grabs attention... and the article nearly contradicts itself by the end... and it's only 7 paragraphs. The distinction Fant makes is nearly academic. Just as the distinction between a Statistician, a Geometer ( a mathematician who studies geometry ), and a Logician is academic. Yet that is not what the readers of the headline will read... Fant is arguing to make computer science a new kind of science much as Wolfram has. Yet it would be sil
What math do you need? (Score:3, Insightful)
What math do you need in computer science today? It's a tough call. But today, I'd study number-crunching rather than discrete math.
I have a classical computer science education - automata theory, number theory, combinatorics, mathematical logic - all discrete math. That's what CS was supposed to be about in the 1980s. It hasn't been enormously useful, and I'm writing this as someone who ran a successful project to develop a proof-of-correctness system. Mathematical logic gets used a little, but tree-like algorithms are more important. I'm not sure automata theory is useful for much of anything. It's one of those things, like proofs in plane geometry, taught in schools because it was the first theory in the field.
Number-crunching, probability, and statistics seem to be more useful today. If you do anything in graphics, you have to have the 3D transformation math down cold. I've had to do tensor calculus and integration of non-linear differential equations (I used to do physics engines for animation) and that required plowing through difficult math books and getting consulting from university math departments. Bayesian statistics have become very important - it's used in spam filters, search engines, computer vision, and most things that have to intelligently reduce messy real-world data. I didn't get enough of that at Stanford.
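A minimal sketch of the Bayesian idea behind those spam filters (toy counts and an invented word; real filters add smoothing, combine many words, and work in log-probabilities):

```python
def spam_probability(spam_count, ham_count, n_spam, n_ham):
    """P(spam | word) via Bayes' rule with equal priors:
    P(s|w) = P(w|s) / (P(w|s) + P(w|h))."""
    p_w_spam = spam_count / n_spam   # how often the word appears in spam
    p_w_ham = ham_count / n_ham      # how often it appears in legit mail
    return p_w_spam / (p_w_spam + p_w_ham)

# Toy corpus: the word appeared in 40 of 50 spam mails, 1 of 100 ham mails.
print(spam_probability(40, 1, 50, 100))  # ~0.988: almost certainly spam
```

The point is that the filter is nothing but conditional probability; the engineering is in the data handling around it.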
On the other hand, where are you going to use this stuff? Outside of Google, Microsoft, and universities, who does real CS research any more? All the good research centers (DEC WRL, HP Labs, IBM Almaden, PARC, etc.) are gone, or a shadow of what they once were.
Math == Latin (Score:3, Insightful)
Also, it is what Latin used to be in the Middle Ages: the common language of scholars all over the world. Scientists from different continents may be barely able to communicate in their respective mother tongues, or in English, but if they write down their formulas, each knows exactly what the other is talking about.
But no, the most important part is that math still evolves, and rapidly. As so many other critics, the author of the article appears to have a very limited understanding of math.
Re: (Score:3, Funny)
Re:I am able (Score:5, Funny)
However aren't they all integers, and therefore morally equivalent?
Re:wahay! (Score:5, Interesting)
The summary of the author's points in the article makes the book sound dead wrong on several counts, though it could just be the review. Procedural languages are the natural way to code most programs, and here's why: we've been recording recipes as a sequence of steps, with if statements and loops, since the invention of writing. It's become encoded in our genes. That's really all that early computer scientists put in our early languages like FORTRAN. It's all the stuff we've added since then that's up for debate, in my mind. The author makes money by pushing the boundaries of computing-model research. I get big programs written by teams by restricting what language features are used, and how. It'd be interesting to debate the ideas, point by point.
Re:wahay! (Score:5, Interesting)
Maybe the question should rather be: Why doesn't Microsoft look for the kind of GUI-guys Apple hires. And the answer to that might well be found at the top of each company. A quote from Steve Jobs' Commencement address at Stanford (June 12, 2005):
"Because I had dropped out [of college] and didn't have to take the normal classes, I decided to take a calligraphy class [...]. It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating. None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it's likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do."
Read the whole thing [stanford.edu], it's quite interesting (if not to say: inspiring).
Re:wahay! (Score:5, Insightful)
Re: (Score:3, Insightful)
I think it's not so much an "if" as a "when". Maybe without Einstein, E=mc^2 wouldn't have been known for another 20 years. Imagine how drastically that would have changed the 20th century. Now, if Jobs hadn't had this artistic side to him, and that offset GUIs by 10 years, then things like the internet and the adoption of PCs might well be at about the 1997 level right now. And that's assuming that the current Federal administration woul
Re: (Score:3, Funny)
what... the mouse was invented BEFORE apple???? gasp
Re:wahay! (Score:5, Insightful)
Oh, I see. All this time I was led to believe that Donald Knuth created TeX to satisfy the desperate need for a half-decent digital typography tool, when in fact it must have been due to some class Steve Jobs took after he dropped out of college. Knowing that TeX remains to this day the best typesetting system, and knowing a bit about Adobe and the history of PostScript, I guess that half-baked assertion makes sense and must be true.
...or maybe not.
Please. Steve Jobs doesn't walk on water, nor is he behind every single thing that can be counted as progress in the computer world. This whole Jobs-worshiping thing is starting to become ridiculous.
Re: (Score:3, Insightful)
Re: (Score:3, Interesting)
Re: (Score:3, Insightful)
Re:wahay! (Score:4, Insightful)
Yes, but I wrote a new book that claims that debate is better off without logic. Early debating pioneers such as Kant and Aristotle imposed their logic background on the field, and this has hobbled debate ever since. I reject the idea of convincing arguments as a good way to resolve any conflict.
Re:wahay! (Score:4, Interesting)
Re:wahay! (Score:5, Funny)
Re:The True Nature of Computing (Score:5, Insightful)
From here you've both made a giant leap in assuming that programs can't be described by an algorithm. You haven't understood that the difference between a "computation" and "reactive software" is actually a technical triviality that is easily overcome. Indeed, it is so trivial that most languages simply ignore it and have stateful operations for input/output. Reactive programs are normally modelled as a sequence of algorithmic steps: everything the program does apart from sending and receiving data is modelled by an algorithm. So we can either consider this "non-algorithm" to be a sequence of algorithms, or consider the program as an algorithm operating over a larger state that includes the environment. The input/output actions then become algorithmic state transitions over the program/environment state. Look at the way programs in CSP/CCS or other process algebras are written to see how this works. To see how the theory of algorithms can be applied to reactive systems, take a look at multi-headed Turing machines.
Finally, if you're going to lob a technical term into a discussion, then you should understand what it means. Automaton is a well-defined term in CS, and it doesn't mean what you think. In particular, what you are describing is not a decision problem, and so there is no problem of language recognition to be solved. I vaguely remember reading the crank research that you are pointing to, and would like to ask you a simple question: name one problem that you believe can be computed by a UBM, but not by a UTM.
Re: (Score:3, Informative)
Re:The True Nature of Computing (Score:4, Insightful)
Wow. That's probably the most nonsensical statement I've read on Slashdot in a while, including the huge Iraq-related threads... Quite an accomplishment!
Re:Computer Science without math... (Score:5, Insightful)
The reason computer science is so heavily influenced by math is the binary architecture that every piece of hardware is designed around. Every real world problem, right down to choosing the color of a font, has to be translated into the digital world by algorithmic approximation - a lot of math! The problem is that it is this very abstraction that makes computers so "flexible" in what they can do. Analog computers existed many years ago but they could only ever be built for a single purpose.
Unfortunately(?) it is much easier to design and mass produce something which is based on a finite lowest common denominator (bits) than it is to do so based on the continuum that a non-digital solution would require.
That said, who's to say that a beautiful painting rendered in Gimp/PhotoShop isn't a program of sorts? Certainly it has input, (from the original creator), and output, (its effect on us), and the "code" can be modified to change both!
Re:Computer Science without math... (Score:4, Insightful)
Re:Computer Science without math... (Score:4, Interesting)
More accurately, a programmer is not necessarily a computer scientist any more than a computer scientist is necessarily a programmer. Neither is better or worse than the other, and both should know something about the other's skill set, but in practice, there are many amazing programmers who are poor computer scientists, and even more great computer scientists who are poor programmers.
I would classify programmers as people who can get a computer to do what they want it to, and the measure of the skill of a programmer is how their code performs on some set of metrics (performance, reusability, readability, etc.)
On the other hand, computer scientists are people who figure out what they can get a computer to do and how to do it. More often than not, these people work in research labs and in academia, and their measure of performance is how many (usefully) novel methods they've found of doing things or how many new things they've figured out they can make computers do. In most cases, aptitude in more advanced math does help computer scientists, although in some sub-fields, there is less dependency on this.
Re: (Score:3, Insightful)
Yes, because that's how you design and implement a file system.
Re: (Score:3, Insightful)
To use your War analogy, Math is not War, but Math is necessary for War. (Unless you like losing, of course.) Someone may have done all the mathematics long ago, and stored it in a computer for you use, but it's still necessary. You can be infantry in a war without knowing how to add. Heck, I'd bet you could even be a low-level official without anything higher than eleme
Re: (Score:3, Insightful)
I started programming at 5 - boolean algebra was the first maths I learned, because it flowed naturally from learning programming, though it took a few years
Re: (Score:3, Insightful)