NYC Mayor Bloomberg Vows To Learn To Code In 2012 120
theodp writes "New York City Mayor Michael Bloomberg has announced his intention to take a coding class in 2012 via Twitter ('My New Year's resolution is to learn to code with Codecademy in 2012! Join me.'). So, is this just a PR coup for Codecademy, or could EE grad (Johns Hopkins, '64) Bloomberg — who parlayed the $10 million severance he received after being fired as head of systems development at Salomon Brothers into his $19.5 billion Bloomberg L.P. fortune — actually not know how to program? Seems unlikely, but if so, perhaps Bloomberg should just apply to be a Bloomberg Summer 2012 Software Development intern — smart money says he'd get the gig!"
Cobol (Score:5, Funny)
Maybe he wants to know how to code in something besides cobol and fortran.
Re: (Score:2, Funny)
Maybe he wants to know how to code in something besides cobol and fortran.
Or morse...
Re:Cobol (Score:5, Interesting)
Fortran isn't that bad, considering it's from 1957. Anyone who can do Fortran could learn C++ very quickly. [begin rant] Cobol, on the other hand, was a step backwards the day it appeared in 1959, and its creators should be bludgeoned with a frozen fish for even writing the design document. And yes - I've written tons of Cobol - it doesn't grow on you. It's probably the first example of the fundamental misconception that it's desirable (if even possible) to make formal descriptions using informal language. The MBAs still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer. Using the word "plus" instead of the symbol "+" completely misses that fundamental point. [end rant]
Re:Fortran & COBOL are ok... apk (Score:5, Informative)
Should I use COBOL or ForTran (Formula Translator)?
No, it's "FORTRAN". While it does indeed stand for "formula translator", back in those days they didn't use CamelCase, and making portmanteaus and then writing them in all caps was normal. You can still see it in US military acronyms, such as "USCENTCOM" (US Central Command).
According to Wikipedia [wikipedia.org], they didn't start using camelcase for programming language names until the 1970s, and it only became fashionable for company names in the 80s.
Re: (Score:2, Informative)
No, it's "FORTRAN". While it does indeed stand for "formula translator", back in those days they didn't use CamelCase, and making portmanteaus and then writing them in all caps was normal.
Bzzzt [wikipedia.org]. Nowadays it's "Fortran". The Wikipedia article is an interesting read, for instance "Free-form source input, also with lowercase Fortran keywords" was first introduced in FORTRAN 90.
Re: (Score:2)
Object-oriented programming?
He said C++. While it is just about possible to write object oriented code in C++, I would be very surprised if more than 1% of C++ programmers managed it. What you are describing is programming with ADTs, which has some overlap with OOP but is as much OOP as defining functions is structured programming.
Re:Cobol (Score:5, Insightful)
The MBAs still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer.
That's because software *is* the description of what the computer should do. Check this great article: http://www.osnews.com/story/22135/The_Problem_with_Design_and_Implementation [osnews.com]
Re: (Score:2)
That really is a great article. Thanks for the link!
Re: (Score:2)
The MBAs still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer.
That's because software *is* the description of what the computer should do.
Only in some paradigms (procedural, functional programming). In logical programming, or SQL, you don't tell the computer what to do, you tell it what you want. Yet it is software.
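The distinction can be sketched with Python's built-in sqlite3 module (the "orders" table and its values are invented purely for illustration):

```python
import sqlite3

# In-memory database with a made-up "orders" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 10.0), (2, 25.0), (3, 40.0)])

# Declarative: state WHAT you want; the engine decides how to get it.
declarative = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE amount > 15").fetchone()[0]

# Procedural: spell out HOW, step by step.
procedural = 0.0
for _id, amount in conn.execute("SELECT id, amount FROM orders"):
    if amount > 15:
        procedural += amount

# Both describe "what the computer should do" - at different levels.
assert declarative == procedural == 65.0
```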
Re: (Score:2)
The problem with that article is that it only focuses on one aspect of design: the low-level stuff. Sure, there's no need to specify every damn function and field. That's a lot better done in the code itself. However, source code doesn't answer "why". It's hard to tease out the grand architecture by looking at lines of code. It would be silly to examine the shape of the earth by looking at every grain of dust on the surface. Sure, eventually you'd get the picture (oblate spheroid with serious perturbations
Fortran (Score:2)
Fortran makes it really really easy to do complex matrix arithmetic. It also makes text manipulation a serious PITA.
So, like so many other things, it's a trade-off between what a language makes easy to code and what you actually want to code.
Re: (Score:3)
Fortran makes it really really easy to do complex matrix arithmetic. It also makes text manipulation a serious PITA.
So, like so many other things, it's a trade-off between what a language makes easy to code and what you actually want to code.
You are right. However, with the arrival of numpy, I don't see the benefit of Fortran any more.
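For instance, a minimal numpy sketch (assuming numpy is installed; the matrices are arbitrary examples) of the whole-array, complex-valued arithmetic Fortran is usually praised for:

```python
import numpy as np

# Complex-valued matrices; classic Fortran handles these natively,
# and numpy brings the same array semantics to Python.
a = np.array([[1 + 2j, 0], [0, 1 - 2j]])
b = np.array([[0, 1j], [-1j, 0]])

product = a @ b          # matrix product, no explicit loops
conjugate = a.conj().T   # conjugate transpose

# Whole-array expressions, Fortran-90 style: a^H a is diagonal here.
assert product.shape == (2, 2)
assert np.allclose(conjugate @ a, np.diag([5.0, 5.0]))
```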
Re: (Score:2)
And COBOL makes text manipulation easy, and everything else a PITA!
Re: (Score:2)
I love COBOL statements like
MULTIPLY SEVEN BY SIXTEEN
rather than
7 * 16
How that made things more reasonable is beyond me. Yes, some variants of COBOL do allow more ordinary mathematical expressions like you would see in C++ or even FORTRAN, but this is a "feature" of COBOL that has always seemed a little off.
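To make the contrast concrete, here is a toy Python evaluator for that verbose style - not real COBOL tooling, just an illustration of how much machinery the English-like form hides compared to the plain expression:

```python
# Invented word-to-number table for the toy example.
WORDS = {"SEVEN": 7, "SIXTEEN": 16}

def eval_verbose(stmt):
    """Evaluate a 'MULTIPLY <A> BY <B>' statement the long way round."""
    verb, a, by, b = stmt.split()
    assert verb == "MULTIPLY" and by == "BY"
    return WORDS[a] * WORDS[b]

# The verbose statement and the symbolic expression mean the same thing.
assert eval_verbose("MULTIPLY SEVEN BY SIXTEEN") == 7 * 16 == 112
```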
Re: (Score:3)
COBOL has had FORTRAN-like arithmetic statements since the first standard was adopted.
Re: (Score:2)
Yeah, I was trying to be funny, but COBOL isn't so bad... you can also easily link C libraries with most recent COBOL compilers.
Re: (Score:2)
It's probably the first example of the fundamental misconception, that it's desirable (if even possible) to make formal descriptions using informal language
It's always seemed to me like some confused attempt to make complex things simpler by writing them in laborious English. E.g., "Integrating this function is proving to be quite hard. Perhaps the whole process of integration would be easier if we wrote 'Integrate' rather than using that confusing stretched-out S symbol."
Re:Cobol (Score:4, Informative)
Funny, but even this is ignoring reality. According to the TFA summary, he was an EE grad in 1964. While those languages do indeed date back that far, EE students were probably not taught them at the time, and in fact probably weren't taught any programming at all, as that was a different discipline (CS). Even when I went to undergrad EE school in the early 1990s, we were only taught a little QBASIC, FORTRAN, C++, MATLAB (1/2 semester each), and x86 assembly language (full semester). There was some more in the junior/senior classes, but only if you elected to take those, and it was all concentrated on microcontroller and embedded programming. Back in the mid-60s, I imagine programming simply wasn't considered important for EEs, and that any EEs who ended up working on computers (which were room-size and mega-expensive at the time) would learn any necessary programming on the job. The fundamentals of EE simply don't include programming; they include network theory (Ohm's and Kirchhoff's Laws), electromagnetics (Maxwell's equations), 3-phase power, etc. It's only been in very recent years (early/mid-90s and later) that they came up with "computer engineering" degrees, or put the two together ("electrical and computer engineering", or ECE, like at one university I went to).
Re: (Score:2)
My father took a class in FORTRAN in the early 1960's (I think it was 1962) at Brigham Young University. Yes, it was quite early and the computing equipment was quite primitive, but it was something being taught at the time. He later told me about it because he complained about how some of the students would mess up his software because they dared to touch some of the equipment inside of the computer, and how some of the tubes in that computer (yes, tubes, not even transistors) even had a water-cooled hea
Re: (Score:2)
When I got my EE in the early 90's, there wasn't any officially sanctioned CS degree on offer. You could either do an "Engineering and Applied Science" degree focusing on coding and algorithms, or an EE degree focusing on computer design and use.
If you went the EE route you still had to learn all the antenna theory and 3-phase stuff as well, but at the end you understood how computers worked -and- how to use them.
Re: (Score:2)
That must have been unique to your school, because my 2nd-rate state university had a CS program, and I'm pretty sure CS has been around in other universities since the 60s. Wasn't Dijkstra a famous CS professor?
Re: (Score:2)
I graduated from high school in '83 and there was no shortage of schools offering computer science programs at the time, although admittedly many of the departments were relatively new.
I took some time off to experience life and went back to school in the early 1990's where I know a separate Computer Engineering program was being created at the engineering college at the university I was attending, and there were four different majors offered at four different colleges that had a substantial
Re:Cobol (Score:4, Funny)
He'll have to learn APL, or at least Perl. After all, he's going "to take a coding class in 2012 via Twitter." Anything else and the program won't fit in a single tweet!
[rimshot]
Re: (Score:2)
Bloomberg's coding is probably going to end up running about as well as his Spanish sounds.
Seriously, I live in Newark, NJ and I can barely speak Spanish for the life of me, but every time there's a press conference and Bloomberg speaks Spanish it is downright hilarious.
retweet (Score:3)
He was never a programmer (Score:5, Informative)
Re: (Score:2)
Bloomberg LP even at one time claimed to own 2-screen setups. These days, there's not much on the Bloomberg Terminal platform that isn't available over the web from them or other sources.
Re: (Score:2)
I figured that was the case, but there is a distinction between resolving to learn something and not being able to do it. False dichotomy and all that.
Unfortunately, since you posted facts, theodp won't be able to practice his critical thinking skills. Or, alternatively, practice being more subtle about driving page views for Codecademy with the rhetorical question. A few more obvious problems with this story are the following assumptions, apparently based solely on a tweet and a short bio:
Heading up developm
Why was this mod'ed "Funny"? (Score:5, Insightful)
If you look at just about all tech companies, the person who got it going was the sales guy. In some cases the tech guy is also a great salesman - Larry Ellison of Oracle or Zuckerberg of Facebook - actually, FB is just a marketing data collection company.
In my years in software development, I've seen some really great ideas and implementations just get buried because the geek didn't know how to sell their value.
All the tech bigshots knew how, or knew someone who knew how, to sell the value of their stuff.
Wozniak had the luck of having God's gift of salesmanship, Steve Jobs, as his friend. All the gazillionaire techies had someone with them who had the contacts and sales ability to take their idea and make it into something.
"Build a better mousetrap and the world will beat a path to your door" is a lie. The countless examples of inferior technology ruling the marketplace are proof.
Re: (Score:2)
Mike Bloomberg was always the business/sales guy at the company.
So he was basically like Steve Jobs, only without the gift for picking mega-popular designs.
Re: (Score:2)
The Bloomberg Terminal isn't mega-popular in its market?
Re:Much rather (Score:5, Funny)
I'd much rather he learn empathy, humility, and how to not be a giant fucking jackass.
Well then, learning to code is definitely NOT the way to go.
Head of systems development? (Score:5, Insightful)
So? Just because you manage a department doesn't mean you can do the work they are doing. He was there to manage people, not to code - a vastly different skill set.
Sure, it's nice if you can do the job of your people, so you can have a deeper understanding of what is going on, but it's not a requirement.
Re: (Score:3)
You're an IT manager aren't you?
I have been in the past, but I am not currently. In my case I do have experience in the IT/engineering field as a 'worker bee', for 20+ years, but I still don't feel that is a prerequisite for managing IT people. General knowledge of the field, sure, but I would not expect a manager to know low-level stuff like how to sit down and code an application, or recite the resistor color codes (two examples), to be an effective manager of people.
Re: (Score:2)
The most important job of a manager is to know WHO is really performing well, who is the problem, and why in each case.
Sometimes not having enough technical knowledge can hurt this significantly.
Re: (Score:2)
...and recognize those cases when adding some high performer reduces team productivity. (Or the other way round)
And a method of recognizing "performance" in the first place.
Re: (Score:2, Interesting)
Sure, its nice if you can do the job of your people, so you can have a deeper understanding of what is going on, ..
That's why I like the announcement in principle. Even if it turns out to be just a publicity stunt, it at least shows that Bloomberg thinks learning something different would be good - or at least thinks that his voters think so.
According to the BBC, the reaction of the London mayor was that he's too busy for things like that. - Now, that shows a politician that needs to get rebooted. If politicians would do a couple of things below their pay scale or volunteer for longer than a photo opportunity they m
Re:Head of systems development? (Score:5, Informative)
According to BBC, the reaction of the London mayor was that he's too busy for things like that.
That's completely wrong [bbc.co.uk]. The BBC actually reports [...] that the mayor "is in awe of his good friend Michael Bloomberg, and if re-elected will explore whether he can join him on that course." I believe you got Boris Johnson (current mayor) confused with Ken Livingstone (former mayor and current candidate for the opposing party). It was Ken Livingstone who stated, "If I'm elected, I'll be a bit too busy to take any education courses."
Anyway, it's certainly nice if politicians broaden their minds, but it's reasonable that they have to allocate their time and set priorities.
Bloomberg on the Internet in 2001 (Score:2)
BW 2001 [businessweek.com]: Bloomberg still insists that the Net is too "unreliable" a way to deliver his product. Servers go down, security is dicey, and he has faith in a closed system. There's a Bloomberg Web site with data and news for free. But the CEO was an early skeptic of the Internet gold rush, and these days he figures that he has been proved more right than wrong.
Re:Bloomberg on the Internet in 2001 (Score:4, Informative)
Re: (Score:2)
Agreed, in 2001 that was about correct. Systems generally weren't that easy to make reliable, and for what he was doing it wouldn't have been worth it.
If someone asked me today why I'm not making a WP7 app, the answer is: They don't have enough of a market share for it to be worth our time yet. If, 3 years from now, WP7 owns the whole damn marketplace that doesn't mean my opinion about what I'm doing right this minute is wrong. The world changes, the question is whether or not you can evolve with it, an
Even today (Score:3)
Even today, critical communications don't travel over the public internet:
a) Mastercard & VISA card processing networks
b) ACH & Fedwire money transfers
c) US DoD communications.
Using the IP protocol isn't really the problem (why invent new hardware now), but control & management of the network is a big deal. Besides, his servers & his clients can be concentrated in Manhattan. Bloomberg made the right choice then, and it's still the right choice.
Re:Even today (Score:4, Informative)
Bloomberg terminals now operate over the internet if I'm not mistaken.
They can encapsulate their feed over the Internet, but that limits functionality and requires extra login steps. The standard setup is over their own network, which has extra security (including protection against Van Eck phreaking of the terminal itself). What you get in the browser is a very, very, very limited subset of the functionality the terminal itself provides. Although the terminal itself, as an interface, has all the usability of a cash register.
"Hiring in NYC/SF only" (Score:2)
They don't make programming tools like they used to (Score:5, Insightful)
Common in the 60s: Punch cards, text only dumb terminals, mainframes...
Common Now: Online storage, visual designers, client/server setups....
If your knowledge of computers ends in the '60s, there's a lot of updating to be done. Mayor Bloomberg has the right idea... every 10 years or so it's time to retrain on the current tools.
Re: (Score:1)
Terminals were a luxury in the 60's. Teletype machines were generally cheaper (if you were lucky enough to get to use one), even though they consumed a lot of paper.
What does the submitter have against Bloomberg? (Score:1)
Did Bloomberg do something to the story submitter? Sounds like Bloomberg kicked his dog or something.
Re: (Score:1)
It's important to hate him because he's rich. Anyone who has money is evil because they are able to fulfill their desires and I am not.
Bloomberg to Helpdesk: (Score:2, Funny)
Bloomberg: I need you to perform a privilege escalation on my compiler.
Helpdesk: Before we proceed, can you describe the symptoms?
Bloomberg: Yeah, it sometimes spits out some incomprehensible message, or the program says "Segmentation Fault." I don't care about its needs, I have work to do, now. So I'm calling in a privilege escalation. Now!
Helpdesk: Sir, I'm not sure that's going to help, do you know what a privilege escalation means?
Bloomberg: Yes, I think I do, or haven't you noticed that I'm the ri
Citytime (Score:2)
THAT explains the Citytime fiasco, eh - maybe he's looking to get in on the Nth version
http://www.nytimes.com/2011/11/01/nyregion/bloomberg-administration-admits-mishandling-citytime-and-nycaps-programs.html [nytimes.com]
My first program (Score:5, Funny)
10 PRINT "I've got lots of money"
20 GOTO 10
30 END
Why (Score:3)
Just curious -> why? Personal interest, or business venture?
And someone make sure he starts with C++. If he survives that, he won't have any trouble picking up other languages.
C/C++ is pretty bad place to start learning (Score:3)
And someone make sure he starts with C++. If he survives that, he won't have any trouble picking up other languages.
I've always been baffled by people who think that C/C++ is a good starting point when you want to learn/teach programming. I think that the most important thing to understand - whether you end up working as a programmer or not - is the basic structure/flow of the program (conditionals, loops, modularity/functions). Then the basic programming concepts (recursion, abstract data types, etc.) and then the libraries/APIs for your platform so that you can actually create something interesting/useful. I don't thin
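The basics that comment lists - conditionals, loops, functions, recursion - fit in one small sketch (Python used here purely as an example of a beginner-friendly language):

```python
def factorial_loop(n):
    """Iterative factorial: a loop plus an accumulator variable."""
    result = 1
    for i in range(2, n + 1):  # loop over 2..n
        result *= i
    return result

def factorial_rec(n):
    """Recursive factorial: the same idea expressed as self-reference."""
    if n <= 1:                 # conditional: the base case stops recursion
        return 1
    return n * factorial_rec(n - 1)

# Two different control-flow structures, one result.
assert factorial_loop(5) == factorial_rec(5) == 120
```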
Re: (Score:1)
A fair point. I started programming in JavaScript before I moved to PHP, VB.Net (I regret this one), C#, Java, and C++, in that order. However, it could just as easily have been C++ first.
If a person is really committed to learning a programming language, they would be fine learning C++ first which would teach not only all the fundamentals but also give some idea of how the system works.
It can also be unnecessary, depending on what they want to do, though (as you suggested).
Re: (Score:2)
If we are talking about programming in general, I think I started with Logo, then Java / Q-Basic, then C, then JavaScript, then C++. Something like that, with HTML / VRML mixed in for good measure. Ah, good old VRML.
Currently enjoying C# as my primary language, and doing PHP work for a small project. Have a book on Ruby to finish reading, the AMD APP OpenCL reference for when I have some free time.
Re:C/C++ is pretty bad place to start learning (Score:4, Insightful)
I think C++ is a good starting point simply because it teaches memory management and class design.
Understanding the concept of a class is one of the most difficult programming concepts a novice will encounter. And they are used everywhere.
Just try explaining the concept of a class to a non-programmer. I will bet money that they will nod their heads, and still have no idea what you're talking about.
And memory management -> something you need to understand, even if you use a garbage collector.
If he's just taking a programming class to get a taste (dilettante) for programming, then by all means teach him Visual Basic or JavaScript or whatever. However, if he's taking a programming class to learn programming (he wants the programmer skillset a.k.a. a real programmer), then C++ is where he wants to be. Once you understand the concepts in C++ (which can be brutal / metal when it comes to learning), the hardest part of learning how to program is past.
Why, do you ask? Because otherwise you end up in sad scenarios, like when the PhDs in your Computer Science department do not know how to install an operating system, when the undergrads in your class have difficulty understanding the difference between an AMD processor and an Intel processor, or why one should never write a program in JavaScript that consumes 8 GB of the client computer's memory.
TL;DR: C++ will expose him to the greatest number of programming concepts in the shortest period of time, and give him the minimal amount of understanding necessary to eventually grow into a respected programmer.
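For what it's worth, the "explaining a class" half of that argument can be illustrated even without C++: a class just bundles data with the functions that operate on it. A minimal sketch (Python used only for brevity; BankAccount is an invented teaching example):

```python
class BankAccount:
    """A class bundles state (owner, balance) with behavior (deposit)."""

    def __init__(self, owner, balance=0):
        self.owner = owner      # data: who holds the account
        self.balance = balance  # data: current funds

    def deposit(self, amount):  # behavior: a function tied to the data
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

# Each instance carries its own state; methods act on that state.
acct = BankAccount("Mike", 100)
acct.deposit(50)
assert acct.balance == 150
```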
Re: (Score:2, Informative)
Like so many other classes required for a CS degree, I use nothing from it in my day-to-day work as a Ruby developer.
As a Ruby developer I just have to point out, without C you can't understand the Ruby source or write native extensions.
A Ruby developer without C is totally weak.
Re: (Score:2)
"I use nothing from it in my day-to-day work as a Ruby developer".
You're a Ruby developer today. You may not be in 5 or 10 years from now. Then, your educational background will help you flex into a different position. Ruby developers who know just that will be only Ruby developers forever, because they are one-trick ponies.
Re: (Score:2)
Understanding by doing is a completely different thing than what is gained by repetition.
Re: (Score:2)
C++ is a terrible language for teaching OO. There are other languages that have a stronger OO model and it's consistent throughout the language.
Teach C for basic programming concepts, memory management and that kind of thing. Use a .NET lang/Java, Python or Haskell for the modern and OO stuff.
Re: (Score:2)
When teaching people to program I start with an introduction to binary and hexadecimal, making them do a few things like note the patterns various numbers contain, and add up a couple of things (which illustrates why powers of two are important and convenient, etc.)
From there it is a brief introduction to logic gates. I demonstrate simple addition, and make them construct a circuit that will do add with carry.
Then I do a brief introduction to assembly (x86 these days, I used to choose Alpha.) I do
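The add-with-carry exercise described above can be sketched in Python, with plain boolean operators standing in for the gates:

```python
def full_adder(a, b, carry_in):
    """A full adder built from XOR/AND/OR gates (bits are 0 or 1)."""
    s1 = a ^ b                              # XOR gate: partial sum
    sum_out = s1 ^ carry_in                 # XOR gate: final sum bit
    carry_out = (a & b) | (s1 & carry_in)   # AND/OR gates: carry logic
    return sum_out, carry_out

# Exhaustively check the circuit against ordinary integer addition.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            s, cout = full_adder(a, b, c)
            assert 2 * cout + s == a + b + c
```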
Re: (Score:3)
The advisor for my intro to programming project promptly nixed C++ and went with Python
A teacher who had worked with Fortran in the 70's said this: "Automatic memory management? You lucky bastard."
Moreover, Python has a fairly straightforward syntax without being _just_ a teaching language
Perl! (Score:3, Funny)
good example of lifelong learning (Score:4, Insightful)
This is not good. (Score:1, Funny)
Bloomberg is a narcissist, he's going to write a Hello World program and think he's an expert in all things technology related.
LK
He should take Constitutional law classes instead (Score:2)
Re: (Score:2)
Oooooh the butthurt.
Seems like bad PR to me. (Score:2)
I went there and looked at their other courses. Only 3? The Bloomberg thing is advertising for something that isn't there.
Rosetta Stone for C++ (Score:2)
Solitaire+Work=Bad, Golf+Work=Good (Score:2)
Good memory, I'd forgotten about Bloomberg's Double Standard On Mixing Games With Work [techdirt.com].