





Math Toolkit for Real-Time Programming

author    | Jack W. Crenshaw
pages     | 466
publisher | CMP Books
rating    | 8
reviewer  | oxgoad
ISBN      | 1929629095
summary   | A casual discussion of algorithms ranging from abs to numerical calculus.
Who & What
Jack W. Crenshaw, Ph.D. (Physics) wrote his first computer program in 1956 for an IBM 650. He has been working with real-time software for embedded systems ever since -- contributing several years to NASA during the Mercury, Gemini, and Apollo programs. In addition to other activities, he is currently a contributing editor for Embedded Systems Programming magazine and author of the Programmer's Toolbox column.
In Math Toolkit for Real-Time Programming, his effort is focused on describing the pitfalls of vendor-provided math libraries and providing robust replacements. In section one he gives a thorough overview of constants and the various ways to declare them, naming conventions, and error handling. In section two he builds a library of proven algorithms ranging from square roots to trigonometric functions to logarithms. Did you suffer through calculus in college with a barely passing grade? Section three will teach you more about numerical calculus in half an hour than you may have learned in three semesters.
Kudos
Math Toolkit is written in an easy-to-understand, anecdotal manner. You might be tempted to think that the author was animatedly relating the history of computing square roots while having lunch with you. This approach works very well and keeps what could be a rather heavy subject from becoming too much of a burden. Most chapters have historical tidbits liberally sprinkled throughout.
Even if college algebra left you with post-traumatic stress disorder, you will not have any trouble with section two. Indeed, you may find yourself intently following the author on the trail of the perfect arctangent algorithm -- much as a sleuth on the trail of a villain.
The depth of knowledge shown, and its presentation, is exceptional. The author's years of experience are evident in his self-confident writing style. You will rarely see a clearer overview of numerical calculus.
Quibbles
The cover of the book states: "Do big math on small machines." This, combined with the Real-Time Programming phrase in the title, might lead one to believe that the book's primary audience is the embedded microcontroller crowd. Sadly, not so. There is very little here for the die-hard assembler programmer other than some very handy integer square root and sine routines -- and even those examples are in C++. Based on the cover, I would have liked to see a greater emphasis on processors lacking a floating-point unit. Some code examples in pseudo-assembler would also have been welcome, as the author uses C++ for all examples.
Crimes
As is so often the case nowadays, there are various typographical errors scattered throughout. This seems to be an epidemic in current technical books. Fortunately, it didn't affect the readability of Math Toolkit.
Conclusions
I believe Math Toolkit for Real-Time Programming would be a great, perhaps mandatory, addition to the bookshelf of anyone who is involved in writing code that has a heavy math component. Other than the somewhat misleading cover, I cannot find anything truly negative to say about this work. Congratulations are in order to Mr. Crenshaw on a job well done.
The book also includes a CD-ROM of all example source code. In reality, to get the best benefit from the book, you should mostly ignore the CD-ROM and work through the examples. To quote the author: "Never trust a person who merely hands you an equation."
Table of Contents
- Getting The Constants Right
- A Few Easy Pieces
- Dealing with Errors
- Fundamental Functions
- Getting the Sines Right
- Arctangents: An Angle-Space Odyssey
- Logging in the Answers
- Numerical Calculus
- Calculus by the Numbers
- Putting Numerical Calculus to Work
- The Runge-Kutta Method
- Dynamic Simulation
- Appendix A: A C++ Tools Library
Disclosure
I received a review copy of this book from the publisher. Thus, my loyalties and opinions may be completely skewed. Caveat Lector.
You can purchase Math Toolkit for Real-Time Programming from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
is Real Time programming still a Real Issue? (Score:3, Interesting)
However, with the maturity of operating systems, many of them now include device drivers, APIs, objects and other goodies that insulate the average programmer from the hassle of issues like latency. So my question is, other than good academic study, would it pay for the rest of us to spend the $$ on such a book?
Though I admit, having to write my share of real-time apps back in the day has me curious enough to put the book on my wishlist.
Re:is Real Time programming still a Real Issue? (Score:4, Interesting)
Hey, I understand completely what you're saying. I for one am glad I don't have to deal with such things as latency and pre-emption. In fact, here is a link to a nifty article entitled "Real Time Issues in Linux [helsinki.fi]" that essentially sums up what you asked with a resounding yes.
Re:is Real Time programming still a Real Issue? (Score:1)
Yes.
Real Time != Real Fast
Re:is Real Time programming still a Real Issue? (Score:5, Informative)
This deserves some more explanation, since everyone here seems to have missed this point.
A Real Time system is one where the output isn't correct unless it arrives on time. Real Time systems are deterministic - not necessarily fast. The key is to use bounded-time algorithms so that you can predict the worst-case execution time at compile time. RTOSes aren't designed to be fast; they are designed to have deterministic schedulers and kernel services.
Of course, faster processors make it easier to meet real time deadlines, but as processors get faster I'm seeing engineers ignore the real time analysis and design because the code passed the last test they ran. Then they are surprised when it fails in the field...
Jeff
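To make the determinism point concrete, here is a minimal C++ sketch (mine, not from the book or the parent post) contrasting a routine whose worst-case execution time can be counted at compile time with a convergence loop whose iteration count depends on the data:

    // Deterministic: the loop bound is a compile-time constant, so the
    // worst-case execution time equals the typical case and can be counted.
    constexpr int kTaps = 8;
    int fir(const int (&coeff)[kTaps], const int (&hist)[kTaps]) {
        int acc = 0;
        for (int i = 0; i < kTaps; ++i)      // always exactly kTaps iterations
            acc += coeff[i] * hist[i];
        return acc;
    }

    // Not deterministic: the iteration count depends on the input value
    // (x >= 0 assumed), so a worst case has to be proven separately
    // (or the loop capped at a fixed maximum).
    double newton_sqrt(double x) {
        double g = (x > 1.0) ? x : 1.0;
        while (g * g - x > 1e-9 * (x + 1.0)) // data-dependent exit condition
            g = 0.5 * (g + x / g);
        return g;
    }

The first function runs the same number of iterations every time; the second is the kind of "it passed the last test" loop the parent post is warning about.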
Re:is Real Time programming still a Real Issue? (Score:5, Insightful)
The above generally doesn't apply to anyone doing serious embedded work with small and midrange microcontrollers. Often an operating system is thin to non-existent on these platforms. Some of the lower-range parts may have a 2-byte hardware stack, 28 bytes of RAM and maybe 512 bytes of program memory. Obviously, you won't be doing much sophisticated numerical work on these smallest of microcontrollers, but for more midrange parts, I've found this book to be a godsend.
The book is not aimed at PC users.
Re:is Real Time programming still a Real Issue? (Score:2, Informative)
But in many situations you might find yourself programming for, say, a small 1 MHz CPU in a time-critical control system at a factory or chemical plant or something like that.
That's when you'll need your skills in real-time programming.
Re:is Real Time programming still a Real Issue? (Score:4, Interesting)
I work with a group of eight other people updating 40-year-old Assembler on an IBM Series 1. Something tells me that if this were included in our training programs, those who are
SUF
FER
ING
through the digit-crunching wouldn't have such a hard time. Most people consider this back-in-the-day, but there's an aaaawwwful lot out there that still reeks of old German engineering and chunk-button ATMs.
Banks and messaging don't need real-time (Score:2)
Satellite controllers may need real-time programming -- there's physical stuff moving, and if a signal needs to be responded to in the 100ms before the bird turns another degree, you need hard real-time. But there's nothing that a bank does that needs real-time, unless the device in an ATM that hands out the cash is really badly designed. Yes, you need to know that the customer has taken the cash out of the slot or that the receipt printer has finished, but if you find out 100ms late some of the time, it isn't going to hand out the wrong amount of money; it's just going to be slightly later drawing the next screenful of customer interaction. Some of their stuff needs to get high volumes of work done quickly, but that's a throughput problem, not a real-time problem, and you might get better throughput if some of the transactions have to wait their turn rather than preempting other ones.
integer square root (Score:2, Interesting)
If you still need a decent integer square root algo, check out this page. [azillionmonkeys.com] I used the mborg_isqrt2 variant on that page as a starting point for writing my highly optimized Intellivision version [spatula-city.org] for SDK-1600. [spatula-city.org] My optimized version takes about 600 - 700 cycles for a 16-bit square root, on a machine where most operations take 6 to 8 cycles. (The version I was replacing took 4000 - 10000 cycles.)
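For anyone who just wants a plain, bounded-iteration integer square root in portable code, a textbook shift-and-subtract version might look like the C++ sketch below (this is not the mborg_isqrt2 variant linked above, and it is certainly not cycle-optimized):

    #include <cstdint>

    // Returns floor(sqrt(x)) for a 32-bit unsigned input.
    // Exactly 16 iterations regardless of x, so the cycle count is easy to
    // bound on a small target -- no multiplies, divides, or floats needed.
    std::uint16_t isqrt32(std::uint32_t x) {
        std::uint32_t root = 0;
        std::uint32_t bit  = 1u << 30;      // highest power of four <= 2^31
        while (bit != 0) {
            if (x >= root + bit) {
                x    -= root + bit;
                root  = (root >> 1) + bit;
            } else {
                root >>= 1;
            }
            bit >>= 2;
        }
        return static_cast<std::uint16_t>(root);
    }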
This book looks like it might be interesting to me. Here at work, we had our own math expert, but he's retired (or semi-retired). We've contracted with him to do math libraries, and that works for now. But what about 10 years from now? There's a lot of subtlety in some of these algorithms (it's not always just as easy as whipping through a Taylor series expansion), so it's probably time someone in our group started learning. :-)
--Joe
Re:is Real Time programming still a Real Issue? (Score:1)
what are you photographing with that beastie?
Re:is Real Time programming still a Real Issue? (Score:2, Insightful)
It means predictable (bounded) latency! It's a secondary issue if that latency is low or high.
My Linux is reasonably fast, but it's still far from real time: each time I touch my xawtv window, the whole machine freezes for a second...
Re:is Real Time programming still a Real Issue? (Score:2)
While mature OS'es are indeed very nice to have, they are not universally available. Mature OS == larger code size == larger HW demands == uses more power == larger battery == heavier equipment. And yes, this is still a very real issue. You are not always in a situation where you can throw more hardware at the problem.
As for latency, there are situations where you need absolute control over the timing. I recently participated in the development of a portable heart defibrillator. If there is a delay between the order to give a shock and the actual delivery of that shock, you may kill the patient instead of reviving him. For such jobs, you need guarantees, not promises.
So yes, I'd say there is definitely a need for this book.
Wow, I'm old, I haven't seen Runge-Kutta in years (Score:5, Insightful)
I wonder how much better we could be if coders knew basic math, if they knew how those little bitty chips actually computed the sine of something instead of assuming it just works. We would probably have rock-solid operating systems without all the glitzy GUI stuff.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:2, Funny)
And I remember when being able to spell the words they used was de rigueur for anyone with an education.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:2)
Huh? What's the use of sine in an OS besides to draw glitzy GUI stuff?
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:3, Interesting)
Funny this topic should come up. I just did a 'Store Locator' for the company I work for (I'm the IT Manager, believe it or not). All I have is your basic HS diploma, and in creating the search, I realized I don't know a damn thing about sine and cosine. I don't know how they're used, or how they're applied. I have a feeling that they're somehow related to geometry (which makes sense, seeing as I have to get a distance between two points on a curve -- the earth), but I'm not sure.
Sure, it's probably taken me longer to write this post than it took to find the PHP code I used as a basis for the search, but how much math is REALLY needed overall?
I slept through school, I did really badly, all because I felt it was worthless. I did feel that my business class, business law, and basic Algebra have been useful. But overall, it wasn't worth my time. Hell, I had a physics teacher who'd pick on me because I was flunking (it's amazing what good test grades + 0 homework does to you), but I just found physics interesting -- jeez, it was only HS. I was testing the waters, not padding my GPA. I believe that's what HS is FOR.
And if you KNOW what you want to do (I knew I wanted to fix/program computers when I played on my Apple ][ in 6th grade), what the hell is college for?
The ease of the internet sure hasn't helped my perception.
Am I the only one?
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:3, Insightful)
But to get a job writing computer graphics software, or audio processing, or designing any sort of embedded hardware, knowledge of advanced math is required. The people who want to do this kind of work pursue higher educations, and if they enjoy what they're doing then that's great, too.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1)
Well, that sounds a bit belittling. I think building networks (I'm beyond admin, I just do EVERYTHING - including PBX) can be just as difficult as programming, and you get the same rewards.
I don't really grind out anything. Hell, I put up a TV antenna last summer, and hooked up the security cameras to a Linux box for motion detection around Xmas. I'd much rather be doing 90 different things than concentrate on programming in 'X'.
Maybe I should have left out the 'Manager' part :) (I'm just the only one here.)
Then again, maybe I AM that good, and you're all just jealous! muhuhuhahaha! :)
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:2)
What came out as a belittling tone probably slipped through because I know that colleges around the country are churning out graduates with BSes in Information Technology or similar majors, all of whom are going to be going after YOUR JOB. Now it sounds like you've got a good mind and a good head start in the IT world so I wouldn't be too worried, but just know that your field isn't going to be getting any less competitive.
Cheers!
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1)
From what I've seen, I think they're mostly programmers and MCSEs. That's not too damaging to me. For practical, in-house purposes, I can pick up whatever I need programming-wise. I completely understand I won't be programming games, or advanced simulations any time soon (Hell, you can find my pitiful posts on wine-devel about trying to get FoxPro running.. rick@v a leoinc). But those positions always seemed like a small percentage of the market as a whole. Everybody needs a network, internet access, firewalls, phones - infrastructure. It just seems like a bigger target to me.
Fortunately for me, most people I run into are sorely lacking on what I would lump together as basic infrastructure.
(But at this moment, I have to put PHP aside, so I can figure out an EDI issue with FoxPro.) I love having so many different things. How many EDI people know PCs? Networks? The consultant who interviewed me for this job didn't know many, so here I am!
Ok.. enough of the ego-boosting stuff for now :)
Personally, I think experience can replace college. You just have to be resourceful, and create a resume that shows it. I think I did a good job doing that.
Now, Social Skills OTOH....It would have been good for me to live in a dorm for a few years. I dormed weekends with my girlfriend - which got me to where I am today, family-wise :)
So maybe it wouldn't have been such a good idea to live in a dorm :P
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:2)
Almost everybody in the EDI field sort of stumbled their way into it accidentally.
I used to be the primary EDI guy where I worked a few years ago, but I haven't touched it since. I think I can still look at an unparsed 850 (X12) transaction set and tell you what's on it without thinking about it, and writing a program to parse all those nested loops is pretty fun. I can probably belch out a 997 FA after chugging a pint of beer. That would impress a very small number of people, unfortunately.
But I also know a lot about PC's (served time doing desktop support) and a little networking.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1, Insightful)
A) Create superb programs
B) Fix them
and
C) Put some thought into the design so that others can use, understand, and change it easily.
not
Just hack something up and yell 'FINISHED!' when it seems to run "Good enough".
No, computers don't need math (Score:3, Interesting)
Scientific or engineering programming needs the math because it is math programming. The rest, forget it -- maybe you add some numbers for a shopping cart, multiply for sales tax, but programming has little use for math.
I learned long ago that when an 8-bitter needs trig functions, you use a lookup table generated externally.
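As a rough illustration of what "generated externally" can look like (the table size and Q15 scaling are my own choices, not from the post or the book): a few lines of C++ run on the host PC emit a constant table that gets pasted into the 8-bit target's source.

    #include <cmath>
    #include <cstdio>

    // Runs on the host PC, not the target: prints a 256-entry sine table in
    // Q15 fixed point (256 steps = one full turn), ready to paste into the
    // 8-bit firmware as a const array in ROM.
    int main() {
        const double kPi = 3.14159265358979323846;
        std::puts("static const short sine_q15[256] = {");
        for (int i = 0; i < 256; ++i) {
            double s = std::sin(2.0 * kPi * i / 256.0);
            std::printf("%6ld,%s", std::lround(s * 32767.0),
                        (i % 8 == 7) ? "\n" : " ");
        }
        std::puts("};");
        return 0;
    }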
Ah yes, discrete math. (Score:2, Informative)
What's sad is that discrete math isn't really taught in public school. (At least, it wasn't when I was in school.) One day, I found a Discrete Math textbook at the local library in the 'For sale, $0.25' bin. I opened it up and thought "Oh my goodness, this is a programming and algorithms book!" To my mind, 'math' had always meant either calculation (symbolic or otherwise, your typical Algebra and Calculus), or geometry and proofs. While geometric proofs may border on discrete math, they really seem different to me. They're not algorithms.
Discrete Math branches into useful concepts such as graph theory (you couldn't do network routing successfully without it!), some of the basics of sorting, and so on. Basically, it was the math of "machines" -- that branch of mathematics which concerns itself with stepwise algorithms. Dijkstra's algorithm (least-cost path through a weighted graph) and Prim's and Kruskal's algorithms (minimum-cost spanning trees) were all in there. I thought the book was great.
And, of course, not a single line of code in it. (At least, not in any computer programming language.) But I still thought of it as a programming book.
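For concreteness, here is a minimal modern C++ sketch of Dijkstra's least-cost-path algorithm using a priority queue -- my own illustration, not taken from the textbook being described:

    #include <limits>
    #include <queue>
    #include <utility>
    #include <vector>

    // Dijkstra's least-cost-path algorithm over an adjacency list.
    // graph[u] holds (neighbour, edge weight) pairs; weights must be >= 0.
    std::vector<long> dijkstra(
            const std::vector<std::vector<std::pair<int, long>>>& graph,
            int source) {
        const long INF = std::numeric_limits<long>::max();
        std::vector<long> dist(graph.size(), INF);
        // Min-heap of (distance so far, vertex).
        std::priority_queue<std::pair<long, int>,
                            std::vector<std::pair<long, int>>,
                            std::greater<>> pq;
        dist[source] = 0;
        pq.push({0, source});
        while (!pq.empty()) {
            auto [d, u] = pq.top();
            pq.pop();
            if (d > dist[u]) continue;          // stale queue entry
            for (auto [v, w] : graph[u]) {
                if (dist[u] + w < dist[v]) {    // found a cheaper path to v
                    dist[v] = dist[u] + w;
                    pq.push({dist[v], v});
                }
            }
        }
        return dist;                            // INF means unreachable
    }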
--Joe
Re:Ah yes, discrete math. (Score:2)
I find it's also true that there isn't much math involved in general programming, be it abstract algebra, topology, or analysis. Crap, even linear algebra doesn't come up explicitly that often unless you're writing a numeric package (or graphics, or interpolation, or sim.).
I say this with a tear in my eye b/c I'm in the middle of hunting for a more rewarding way to spend my math knowledge (not a lot of open spaces in crypto or AI programs). Grad school is the only option (if I can get in).
The one thing I do think higher math helps with is understanding the structure of problems. Math really is the science of patterns. It may take a long time to understand what topic in higher math applies, but I guarantee that there is an area of math that fits your problem.
Re:No, computers don't need math (Score:2)
It's actually an odd combination of continuous and discrete mathematics, hence Concrete Mathematics [stanford.edu].
Re:No, computers don't need math (Score:2, Informative)
Actually, even if you're just doing basic sorts, searches, and data-structure manipulation, it's amazing how much math goes into it. Ever considered the algorithmic complexity of using binary trees versus randomized data structures like skip lists [nec.com]?
You can be a "computer programmer", but to be a good one who actually has a brain and knows the pros and cons of the algorithms you're coding requires math. At least the basics of probability theory and calculus.
Follow-up (Score:1)
Note that I'm not making judgements here, I'm just underscoring the point that there are some jobs that require certain knowledge, and others that don't.
And FYI (so you can impress your coworkers and/or significant other <g>): The sine of an angle refers to the y-coordinate of the point at which a line drawn from a starting point at that angle would intersect with a circle of radius=1 drawn around the starting point.
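And to tie that back to the store locator upthread: the reason sin and cos show up there is great-circle distance. Here is a hedged C++ sketch using the haversine formula (my own illustration -- the radius constant is approximate and the function is not from any library mentioned here):

    #include <cmath>

    // Great-circle distance between two points given in degrees of
    // latitude/longitude. The Earth is (nearly) a sphere, so planar
    // straight-line distance is the wrong tool for a store locator.
    double great_circle_km(double lat1, double lon1, double lat2, double lon2) {
        const double kPi = 3.14159265358979323846;
        const double kEarthRadiusKm = 6371.0;       // mean radius, approximate
        auto rad = [kPi](double deg) { return deg * kPi / 180.0; };
        double dlat = rad(lat2 - lat1);
        double dlon = rad(lon2 - lon1);
        double a = std::sin(dlat / 2) * std::sin(dlat / 2) +
                   std::cos(rad(lat1)) * std::cos(rad(lat2)) *
                   std::sin(dlon / 2) * std::sin(dlon / 2);
        return 2.0 * kEarthRadiusKm * std::atan2(std::sqrt(a), std::sqrt(1.0 - a));
    }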
Re:Follow-up (Score:2, Funny)
A couple minutes of going through results granted me a simple:
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:5, Insightful)
College (esp for computer engineering and CS) fundamentally teaches you:
1. How to solve problems
2. A toolset (ie math, algorithms) to go about solving those problems
True, you may not ever use calculus, but as a computer scientist you will use matrix theory because it is the best way to solve some problems.
This is not only for scientific/research work, either. If you try to write anything performance-related, you'll have to use higher math. Computer science ain't easy.
Let me stress again that college teaches you about your subject matter and how to solve problems for it. You can come up with this stuff by yourself, but in my experience only a tiny percentage working without a college degree will ever accrue enough to offset what they missed in college.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1)
Thanks for not bashing, and please don't think I'm attacking when I say this but:
If anything was learned from this post, it's that there are a lot of PROGRAMMERS who read Slashdot. IMHO, Programming in itself is a limited set of jobs in the IT industry.
Let me stress again that college teaches you about your subject matter and how to solve problems for it. You can come up with this stuff by yourself, but in my experience only a tiny percentage working without a college degree will ever accrue enough to offset what they missed in college.
Your post sounds depressing, but don't worry about me, I'm all set (maybe I'm even in that small percentage). Maybe I'll go back to college when my kids are teenagers. I'll still be less than 40. :)
Yes, I did EVERYTHING early - against the grain, thankyouverymuch
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:3, Insightful)
I changed majors from CS to Mathematics halfway through because I realized that programming is easy; you can always learn a new language or a new technique by picking up the appropriate O'Reilly book on the subject. But writing good programs -- programs that are robust, that scale well, that do as much as possible as quickly as possible -- is really applied math. And math is hard.
You simply have no idea how much you don't know, and with the attitude you have, you probably never will.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1)
But what kind of programs? Like another poster said, math isn't really involved in the mainstream too much anymore. The dealer locator is the only thing that has made me think of anything math related in quite a while.
I also didn't spell out my duties. While the initial post may be directed at programmers, how many programmers are directly affected by math? I don't just program, and my programming isn't very intense. I've taken ONE week-long C++ crash course, and that's it. While I still haven't done anything in C++, I've done things in FoxPro, C, PHP, Perl - simple stuff. Want an example of what I've done? www.havokmon.com/stuff. Little blurbs. Nothing major. I didn't need advanced math, and yes, they apply to my job :)
How do I compare with the rest of the industry? I don't know. I have NOT worked for a company that produces applications. I have worked for companies that produce their OWN applications. The only app I know of that had any intense math in it, was a 'sales tool'. You could visually zoom in on any locations on a US map, and get population, sales density, and some other figures. I'm SURE that required heavy math. But that's one app out of MANY.
You make a good point, and I understand where you're coming from. But IMHO, database knowledge is much more important. If you want to know if it's efficient, you watch it execute. If it seems fast enough, it's fast enough. Remember the 90/10 theory. I learned a long time ago (from Netware server performance, actually) that spending 90% of my time trying to tweak out 10% more performance really isn't worth it in the end.
Now, if you're talking embedded systems, or console game programming - ok. But otherwise there are WAY too many constantly changing variables to try and tweak stuff over 90%.
YMMV :)
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1)
(Okay, I am a bit biased; I'm a college math professor, and in addition I do a lot of research and consulting related to numerical computation).
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1)
Hehe, I'm coming from the opposite direction: A little BASIC on a TI99/4a, a little Apple ][ hardware install, a little PICK navigation to FDISK and boot to my games on my mom's PC, hardware, networking, OS...
At this point, my next 'advanced' task is to write a replacement shipping EDI application. Java I hope, but I don't see any advanced math coming into play. You know what's really scary, sometimes I forget (like now) what you call the base of a number, without the decimals. Is that an integer? Pretty bad, but I haven't needed it. I can't even remember what I used in my C++ class last month.. Not an int, a float? ah well.
Maybe when I get to OpenGL programming, which I assume may require algorithm design, I'll go take some math classes :) But I don't see that day coming any time soon.
At this point, with advanced math, I'm like Sean Connery in 'Indiana Jones and the Last Crusade',
"I wrote it down so I wouldn't HAVE to remember".
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:2)
Lord Kelvin put it best (though the notebooks of Lazarus Long come darned close):
Anyone who doesn't know at least some basic math and statistics is a sucker for all the fallacies pushed by advertisers, politicians, and groups with an agenda. And once you've done your Google search for a formula to plug and chug, how do you test it if you don't know enough math to really understand it?
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:4, Interesting)
Now I've always been big on math but I was kind of surprised at how few people were willing to take a single class to earn a full-fledged minor.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1)
Wait! We're back in the early 80's again! =-)
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:2)
It's not that trading raw power against development costs is unreasonable where that choice exists -- far from it -- but rather that hand-waving away questions of efficiency on the assumption that God (or Moore's Law[1]) will provide is a sure recipe for the sort of bloated and near-unmaintainable messes that are so common today. A Mbyte here, a Mbyte there, an assumption that the compiler will find and optimize the invariant components of loops... if you're not careful these all start adding up to measurable numbers ("why is this so s l o w . . .").
[1]And, of course, one can always paraphrase Parkinson's Law [google.com] for IT: programs and data expand to fill the processor power and storage available.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:2)
When I have, I've never needed anything beyond what I could find in a textbook or online in less than 5 minutes.
Heck, I've practically never even had any reason to use floating point math over that period of time.
Sure, there are lots of areas where you do need maths beyond what you can pick up from a book in 5 minutes, but there are far more where maths is irrelevant.
Beyond basic algebra, maths is just another set of domain knowledge that you'll need to acquire to do particular types of software development, not something that is an inherent requirement in order to be a good coder.
Engineer versus Programmer (Score:2)
It will depend really on what you call yourself. I am an engineer and I have been programming for almost 25 years; however, my background is definitely skewed towards scientific programming. You can even see it in the sequence of programming languages that I learned over my career:
BASIC -> FORTRAN -> ASSEMBLER -> PASCAL -> C -> LISP -> XLISP -> C++ -> JAVA
I don't call myself a programmer, but an engineer who programs. This is because you will notice there are some important tools missing from the above list -- things such as Perl, which we know every real programmer would have in their toolbox.
Re:Wow, I'm old, I haven't seen Runge-Kutta in yea (Score:1)
I think this prepared us pretty well for what would be a more theoretical-type CS career (i.e. not just going to work as a programmer or web developer, but also continuing on to your masters or PhD).
Some of the ideas the department was really big on were proving correctness, for example, by induction. Instead of giving you a compiler and an API and saying "here, do this," they made you write it out and actually write a proof about why/how your program works (now imagine people actually doing that for their OS's CreateProcessEx function!).
Re:flimsy review (Score:3, Informative)
BTW, if anyone wants to take a gander at Numerical Recipes in C/Fortran they are available here [nr.com].
An indispensible treasure (Score:3, Troll)
I'm surprised to see it posted on /., though, because he's pretty harsh towards the gaming community. In fact, he says near the beginning that game-related technology in CPUs (MMX and so forth) is taking away much-needed brainpower from research that should be reaching towards making chips do more math per unit time (not to mention driving up production costs for toy-obsessed, joyless loners). He calls for an immediate end to the pandering that Intel et al do to get into the pocketbooks of the socially-inept, technology pseudo-elite and wants real reform in the area of empowering science.
Powerful stuff.
Re:An indispensible treasure (Score:5, Funny)
Just wondering...
-ubermuffin
Re:An indispensible treasure (Score:2)
If you're doing a lot of number crunching or data manipulation (in big sets, with hashes, etc) you're probably spending most of your time in the libraries which are written in C. In fact, being that they're written by programmers skilled in that specific area, you're probably getting better performance than if you wrote them yourself.
Perl isn't an interpreted language, in the traditional sense. In most BASICs, when the execution comes back to a given line it's parsed again, executed, and dumped. If anything, they usually only cache a line or two to help tight loops. Perl is interpreted/compiled all at once, when you start.
Runtime speed is a little slower than other languages, but it's mainly because you've got a lot of runtime checks and hidden memory allocation turned on. Use C++ with automatic array expanding and garbage collecting and you'll see the same kind of performance hits.
That said, the ease of perl causes a lot of features to be misused by programmers who don't know how long it'll take. If you have two pieces of data (a header name and value for instance) it's common to toss them into a hash to keep them related. This isn't really a good idea unless you need to look them up by the header name. If you're just going to dump them out in arbitrary order, you should probably use two arrays in sync. Pre-allocate them to avoid a little delay at every operation. This way you avoid the overhead of the hashing algorithm that you're never going to use, and the slightly-slower lookups compared to an array.
You can also do more complex things this way. I've seen people use hashes here and read the list out by sorting the keys to the hash and iterating through. They'll then do this a few times, sorting at every step. If you want these arrays sorted, but you still don't care about finding a specific header, use an array of two-element arrays, sort the master based on the first element, and not only do you avoid almost all the overhead of hashes, but you have a permanently sorted array, no need to sort at every use.
These programming "errors" are worse in Perl than most older languages because they're easier to implement. In C you'd have to find a library function to create hashes, or write your own. If you started to write your own you'd quickly realize how many cycles you were burning and probably find an easier way unless your application demanded it. In perl (and many little "scripting" languages) you can do so much in a single command that you may not realize.
This is why if I were hiring I'd only take programmers with a "traditional" background of C or other low-level language, before they got to the Perl, Java, Python, or whatever modern rapid-development language we were using. ASM experience is even a plus. Nobody understands the cost of a routine like someone who programmed in ASM. And it's worth thinking about. Usually you say that requiring 512MB of ram ($40 these days) is worth it to save an hour or two of programming, but hopefully at other times you realize a CGI on a busy site can't be that greedy.
So, in conclusion: Perl isn't traditionally interpreted. It's almost as fast as, or faster than, C for anything that spends much time in libraries (most code). Most of what slows down Perl (or Python, or Java, or C++, etc.) is programmers who don't know what routines they really need to use. The cause of this is usually not enough experience in less "helpful" languages.
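Translated into C++ for concreteness (since that's the language used elsewhere on this page; the header values below are made up), the parent's parallel-structure point is simply that a vector of pairs, sorted once, beats repeatedly sorting the keys of a hash you never actually look things up in:

    #include <algorithm>
    #include <cstdio>
    #include <string>
    #include <utility>
    #include <vector>

    // If the headers are only ever walked in order (or sorted once), a plain
    // vector of pairs avoids the hashing overhead entirely; a hash map only
    // earns its keep when you actually need lookup by name.
    int main() {
        std::vector<std::pair<std::string, std::string>> headers = {
            {"Content-Type", "text/plain"},                 // made-up example data
            {"Date",         "Mon, 01 Jan 2001 00:00:00 GMT"},
            {"Connection",   "close"},
        };

        std::sort(headers.begin(), headers.end());  // sort once, not at every use

        for (const auto& h : headers)
            std::printf("%s: %s\n", h.first.c_str(), h.second.c_str());
        return 0;
    }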
Re:An indispensible treasure (Score:1)
Grrr. It's one thing to do a physics simulation with 64-bit doubles, and another thing to keep it stable with 32-bit floats. It's an art and not for the shallow thinker -- talk to real physicists (and gamers) at companies like Havok and MathEngine. As for Intel pandering, he ought to read a book like "Platform Leadership" to learn just what Intel has done to get stuff into the hands of peasants like me, for whom Cray did diddly squat. Scientists? Hah!
Forth Algorithms (Score:4, Interesting)
Unfortunately, it's tricky to find Forth books these days.
That's a shame, because along with Smalltalk, Lisp and APL, I think Forth is one of the "mind expanding" languages all programmers should at least experience, instead of just deciding C/Java/C++/VB is the one true language.
Math in CS programs (Score:2, Interesting)
I don't know of other programs, but I know at the University of Waterloo (where I am a computer science student), we must take quite a lot of math courses, ranging from linear algebra, calculus, and classical algebra to combinatorics & optimization and statistics. The math content for the CS program is very high, and in the end you get a BMath degree.
Maybe this is different at other schools (well, actually I know it is at most, most don't do nearly as much math), but I would hope not. I think to be a solid programmer a solid math background is a requirement.
Oh, and btw, for anyone nitpicking, UW now offers a BCS program, as well as the typical BMath Honours CS. The BCS seems to offer a bit more flexibility, so BCS students may choose not to take 'as much' math.
Re:Math in CS programs (Score:2, Interesting)
I think this may provide some insight into whether or not it's a GoodThing for CS students to have more math in their degrees. Microsoft hires more programmers from Waterloo than anywhere else. And just look at the QUALITY of their code.
On a somewhat tangential note, I'm in Communications Engineering at Carleton, and we badly need a stochastics course in our program, so Digital Comm doesn't keep flying over our heads. Sometimes more math is good.
Re:Math in CS programs (Score:1)
See, that is where your problem is! The school is setting curriculum based on employers. It should not happen this way. Your school is shortchanging every student who goes there by effectively (though obviously not completely) limiting its students' employment choices after school. Post-secondary education, especially at the university level, should educate its students in a way in which they can work almost anywhere, not just at the 1 or 2 big companies in the area.
And oh, as a side note for another reply: yes, MS hires more grads from U. Waterloo than anywhere else, and when even the slightest controversy comes about over MS controlling curriculum [slashdot.org], people get angry and fights start.
Neglected subject, good review, integer!=assembly (Score:5, Interesting)
A year (or so) ago I attended a lecture given by Guy Steele (of Lisp/Java/Crunchly fame) on his proposal to alter how IEEE floating point numbers are mapped to real numbers. It quickly flew over my head, but gave a great insight into the whole field. Steele then had a fair old "discussion" with the one person in the audience whose head hadn't been overflown (sic), as there was plainly still much controversy left in this area. On trying to do some "why didn't I get this stuff at college" reading, I found there wasn't a great deal of literature.
The reviewer's concern that coprocessor-less systems should be covered is valid, but I'm not sure going as far as assembly is necessary. For example, I once had the privilege of reading through Hitachi's libm implementation for their H8 series microprocessor/microcontroller (one would be generous to call the H8 a 16-bit system, and ungenerous to call it an 8-bit system). With one small exception (I think the cos table lookup) the whole thing was in (quite readable) C, and (at least for basic libm stuff) performance was perfectly acceptable. For didactic purposes, a C (or sane C++) implementation would be the thing one would want to find in a book -- I get very annoyed at embedded books where the examples are written in asm for the author's favourite (obscure) microcontroller.
Re:Neglected subject, good review, integer!=assemb (Score:3, Informative)
Steele is God. He also invented Scheme, wrote the original Common Lisp manual, co-wrote with Harbison a classic reference manual for C, and wrote parallel languages for the Connection Machine.
On trying to do some "why didn't I get this stuff at college" reading, I found there wasn't a great deal of literature.
This [nec.com] is widely considered a good introduction.
Under covered subject; average review... (Score:4, Interesting)
To be honest, a lot of embedded coding is done with C or C++ these days. I've been following Crenshaw's articles in Embedded Developer magazine for years now. He explains a lot of what they try to teach in college Calc, etc. in simple, practical terms, and reduces it to usable algorithms.
I'd probably buy the book and add it to my shelf.
Yes, it's an interesting book... (Score:3, Interesting)
As for the title, I agree it's a bit misleading. The book has pretty little to do with real-time (in fact nothing, as far as I could see). What it really should be called is "Computer arithmetic and a little numerical methods for dummies". This book will help you understand how to write your own libm, and give you some ideas for more advanced tasks, but that's about it.
For me, who didn't know much of this stuff, it was very interesting. It will probably not save you that course in numerical algorithms (which I for one haven't taken), but even then, it will probably contain some interesting tidbits you didn't know.
On the other hand, if you have years of experience in writing computer math routines, it will probably quickly become dull, but that's true about anything you already know.
KISS philosophy? Bitchin'! (Score:2)
It's pretty easy to see that the author is a heavy follower of the KISS philosophy.
You mean he wants to rock and roll all nite and code every day? Sweet!
KISS rulez, Man!
GMD
Don't click on Slashdot's book link (Score:3, Informative)
Save yourself some money!
Decimal libraries (Score:3, Interesting)
The only library I know that supports it is the BC-library sometimes used with PHP. (Well, I guess you could say that COBOL has such also.) It actually uses strings to hold the results so that there is no machine-based limitation on precision size. Plus, that improves its cross-language use since almost everything supports dynamic strings these days.
(Not the fastest approach I suppose, but most biz apps are not math intensive anyhow. Most code is devoted to comparing strings, codes, and IDs and moving things around from place to place. IBM used to include decimal-friendly operations in its CPUs. Those days seem gone for some reason, yet biz apps are still a huge domain.)
Re:Decimal libraries (Score:2)
Oh, and there are lots of old fixed point code floating around. Looking for "fixed point" instead of "decimal math" might help you find what you want...
Re:Decimal libraries (Score:1)
Integers have a limited length in most built-in stuff. What if you want to store 0.666666666666666666666 in a variable?
Besides, one should still wrap such things behind a library rather than manually manage the decimal position. You would then have an integer version of the BC library I mentioned.
Re:Decimal libraries (Score:2)
IBM's Packed Decimal maxes out at 15 digits plus sign.
COBOL does a good job of keeping track of the (implicit) decimal points.
If you need predictable results, you need to be aware of rounding issues. In general the round of the sum is not equal to the sum of the rounds.
In business calculations, if you add a list of numbers from top to bottom and add the same list of numbers from bottom to top, you get the same answers, both right.
In some scientific calculations, if you add a list of numbers from top to bottom and add the same list of numbers from bottom to top, you get different answers, both wrong.
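A tiny C++ demonstration of that last point (my own example values, not from the post): floating point addition is not associative, so summing the same list in two directions can give two different answers.

    #include <cstdio>
    #include <vector>

    // Summing the same list forward and backward gives different results
    // when the magnitudes vary widely. With exact integer (fixed point)
    // arithmetic the two sums would be identical.
    int main() {
        std::vector<float> v{1.0e8f};                 // one big value...
        for (int i = 0; i < 10000; ++i)
            v.push_back(0.01f);                       // ...and many small ones

        float forward = 0.0f, backward = 0.0f;
        for (float x : v) forward += x;
        for (auto it = v.rbegin(); it != v.rend(); ++it) backward += *it;

        std::printf("forward  = %.2f\nbackward = %.2f\n", forward, backward);
        return 0;
    }

Added forward, every 0.01 is swallowed by the huge running total; added backward, the small values accumulate first and survive.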
Re:Decimal libraries (Score:2)
Doing that with integers is trivial: keep sums to the tenth of a pence, sum them together, and write your own one-line inland_revenue_round() function that rounds the way Inland Revenue requires.
Same applies for what you're saying. Of course you need to care about rounding, but doing rounding of integer fixed point representations is trivial if you know the rounding rule you need to apply.
But you have to deal with rounding issues and precision with floating point as well, though lots of people don't realize that and screw up because the results are much closer to expected most of the time.
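A sketch of the integer approach in C++, with the caveat that the actual Inland Revenue rounding rule isn't given in this thread, so round-half-up on non-negative amounts is purely an assumed placeholder (as is the helper, which just echoes the parent's inland_revenue_round() name):

    #include <cstdint>

    // Money held as an integer count of tenths of a pence, per the parent
    // post. The real rounding rule is not specified here, so round-half-up
    // on non-negative amounts is an ASSUMED placeholder rule.
    using tenths_of_pence = std::int64_t;

    std::int64_t inland_revenue_round(tenths_of_pence amount) {
        return (amount + 5) / 10;        // whole pence, assumed round-half-up
    }

    // Usage: sum the exact line items first, round exactly once at the end.
    // tenths_of_pence total = 12345 + 6789;            // no floating point
    // std::int64_t pence = inland_revenue_round(total);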
Re:Decimal libraries (Score:2)
Basic rule of floating point is that numbers are NOT equal.
Specifically, an input number is not necessarily equal to the same number expressed as a constant.
Your example works well with integers, but the fun comes when they change the rules and you have other things that depend on those intermediate numbers.
One problem with holding numbers to more places than shown is that you get columns of numbers that do not add up to the total shown.
Re:Decimal libraries (Score:2)
Also, sine and cosine look-ups use the same table with different offsets.
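For example (a hedged sketch assuming a 256-entry full-circle table like the ones discussed above, not code from the book): cosine is just sine advanced a quarter turn, so the two lookups can share one table and differ only in the index offset.

    #include <cstdint>

    // Assumes a 256-entry table covering one full turn (so index 64 == 90
    // degrees), e.g. in Q15 fixed point.
    extern const std::int16_t sine_q15[256];

    std::int16_t fast_sin(std::uint8_t angle) {
        return sine_q15[angle];
    }

    std::int16_t fast_cos(std::uint8_t angle) {
        // +64 steps = +90 degrees; the uint8_t cast wraps the index mod 256.
        return sine_q15[static_cast<std::uint8_t>(angle + 64)];
    }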
Re:Decimal libraries (Score:1)
Also, Java's java.lang.math.BigDecimal class contains just the kind of functionality you describe - its docs are here [sun.com].
In general, I think you'll find lots of fixed point math libraries around - they're mostly intended for numerical computation and mathematical cryptography (e.g. RSA), but they should be quite applicable (if sometimes overkill) for your biz-app uses.
Re:Decimal libraries (Score:1)
I think you will find that floating point calculations have many more applications in real-time environments.
Fixed point calculation is integer based, sometimes using BCD representation in older machines.
Re:Decimal libraries (Score:2)
Why is that?
BTW, could you clarify what you mean by "real-time"? I have seen 2 different definitions before. One is that the response time has to be within a specified tolerance 100% of the time. The other is "interactive". I did not use that term IIRC.
Re:Decimal libraries (Score:1)
Sorry about not clarifying that. I suppose that also means I will get mod slammed. Knock me only 0.5 points down since I am still partly on topic, okay guys?
Re:Decimal libraries (Score:2)
I'd say the first definition is the correct one.
Also wrote "Let's Build A Compiler" series... (Score:2, Informative)
These articles don't go into a lot of the complicated stuff that's involved in modern compiler design-- Crenshaw keeps it simple, keeps it straightforward, and still produces a working (if not optimizing) compiler by the end of the second or third article.
No, it won't let you code a C compiler that will beat the pants off of gcc or Borland's latest offering, but the end result is pretty useful.
Amazon link, too (Score:2, Interesting)
For those who don't support Slashdot's Amazon embargo, here's their link to the book [amazon.com]. Not only are they selling the book for $35, they have 25 sample pages, including the entire index and the first half of the first chapter. (And no, I'm not in Amazon's affiliates program and don't make a dime if you buy the book using the link that I provided, as a quick glance at the URL will prove.)
Math ~= Calculus (Score:5, Insightful)
But by "math" the reference is almost always to calculus.
But math is not just calculus.
Math includes (and this is a MINIMAL list):
- Boolean logic: Using logical expressions and understanding what they do is just the predicate calculus. Using logic languages (Prolog primarily) is, well, logic.
- Linear algebra: Try to program more than minimal graphics without linear algebra.
- The structure of numbers: Computing square roots and the like. This kind of computing also typically involves calculus and its relatives.
- Calculus: Many parts of computational mathematics, including things like square roots, sin/cos, and the like. Also, finding tangents and normals to surfaces, which is a big part of reflection models in graphics. The logic involved is also used in the analysis of algorithms.
- Logical reasoning: Every time someone writes a loop or a recursive function, they are essentially using mathematical induction (albeit informally). Propagation of pre/post conditions (not just in procedure calls, but on the statement-to-statement level) is also logical reasoning (and informal proofs).
- Fourier analysis: Fourier analysis is essential in image manipulation (including compression) and graphics in general. Most algorithms involving sound processing also rely on Fourier analysis.
- Graph theory: Where doesn't graph theory show up? Dependency graphs, path algorithms of all sorts. Trees are graphs. Garbage collection involves graph theory. Programs are (on several levels) graphs. The internet as a network is a graph. Websites are graphs (and it can be interesting and revealing to look at them as such).
- Number theory: Cryptography!
If you're not doing any of these things, you may be programming, but you're probably not programming well.
Juris Hartmanis said (half jokingly) in his Turing Award lecture that "Computer Science is the engineering of mathematics." I think it's about as good a definition as any I have ever heard.
Faster Math for Game Programmers. (Score:3, Informative)
Work by Tang on combating destructive cancellation in range reduction, the new semi-table-based exponent and log methods, Intel's research into using SIMD based on Estrin's method for evaluating polynomials, and Muller's book on Elementary Functions are beyond Crenshaw's experience, and it shows. This is a homebrew book rather than an introduction to the state of the art. More information at the SCEA R&D Website [scea.com].
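For readers who haven't met Estrin's scheme, here is a minimal scalar C++ comparison with Horner's rule for a degree-7 polynomial (placeholder coefficients; the real payoff of Estrin comes from SIMD and instruction-level parallelism, which a scalar sketch can only hint at):

    // Degree-7 polynomial p(x) = c0 + c1*x + ... + c7*x^7, two ways.

    // Horner's rule: fewest operations, but a strictly serial dependency chain.
    double horner(const double c[8], double x) {
        double r = c[7];
        for (int i = 6; i >= 0; --i)
            r = r * x + c[i];
        return r;
    }

    // Estrin's scheme: groups terms into independent sub-expressions that a
    // superscalar or SIMD machine can evaluate in parallel.
    double estrin(const double c[8], double x) {
        double x2  = x * x;
        double x4  = x2 * x2;
        double p01 = c[0] + c[1] * x;
        double p23 = c[2] + c[3] * x;
        double p45 = c[4] + c[5] * x;
        double p67 = c[6] + c[7] * x;
        double p03 = p01 + p23 * x2;    // c0..c3 terms
        double p47 = p45 + p67 * x2;    // c4..c7 terms (before the x^4 shift)
        return p03 + p47 * x4;
    }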