NYC Mayor Bloomberg Vows To Learn To Code In 2012

theodp writes "New York City Mayor Michael Bloomberg has announced his intention to take a coding class in 2012 via Twitter ('My New Year's resolution is to learn to code with Codecademy in 2012! Join me.'). So, is this just a PR coup for Codecademy, or could EE grad (Johns Hopkins, '64) Bloomberg — who parlayed the $10 million severance he received after being fired as head of systems development at Salomon Brothers into his $19.5 billion Bloomberg L.P. fortune — actually not know how to program? Seems unlikely, but if so, perhaps Bloomberg should just apply to be a Bloomberg Summer 2012 Software Development intern — smart money says he'd get the gig!"
This discussion has been archived. No new comments can be posted.

  • Cobol (Score:5, Funny)

    by Anonymous Coward on Saturday January 07, 2012 @02:41PM (#38623974)

    Maybe he wants to know how to code in something besides cobol and fortran.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      Maybe he wants to know how to code in something besides cobol and fortran.

      Or morse...

    • Re:Cobol (Score:5, Interesting)

      by Kristian T. ( 3958 ) on Saturday January 07, 2012 @03:18PM (#38624314)

      Fortran isn't that bad, considering it's from 1957. Anyone who can do Fortran could learn C++ very quickly. [begin rant] Cobol, on the other hand, was a step backwards the day it appeared in 1959, and its creators should be bludgeoned with a frozen fish for even writing the design document. And yes - I've written tons of Cobol - it doesn't grow on you. It's probably the first example of the fundamental misconception that it's desirable (if even possible) to make formal descriptions using informal language. The MBAs still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer. Using the word "plus" instead of the symbol "+" completely misses that fundamental point. [end rant]

      • Re:Cobol (Score:5, Insightful)

        by zarlino ( 985890 ) on Saturday January 07, 2012 @04:15PM (#38624798) Homepage

        The MBAs still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer.

        That's because software *is* the description of what the computer should do. Check this great article: http://www.osnews.com/story/22135/The_Problem_with_Design_and_Implementation [osnews.com]

        • That really is a great article. Thanks for the link!

          The MBAs still think you can describe a piece of software in Word, and then it's a trivial process to make the software that customers want. Informal language is desirable to humans because it supports leaving out details - which is exactly what makes it useless for programming a computer.

          That's because software *is* the description of what the computer should do.

          Only in some paradigms (procedural, functional programming). In logical programming, or SQL, you don't tell the computer what to do, you tell it what you want. Yet it is software.
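As a small illustration of that declarative style — a sketch using Python's built-in sqlite3, with a made-up table, not anything from the thread:

```python
import sqlite3

# In SQL you state *what* you want; the engine decides how to fetch it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)",
                 [(1, 10.0), (2, 25.0), (3, 40.0)])

# No loops, no traversal order -- just a description of the result set.
rows = conn.execute(
    "SELECT id, amount FROM payments WHERE amount > 20 ORDER BY amount"
).fetchall()
print(rows)  # [(2, 25.0), (3, 40.0)]
```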

        • by siride ( 974284 )

          The problem with that article is that it only focuses on one aspect of design: the low-level stuff. Sure, there's no need to specify every damn function and field. That's a lot better done in the code itself. However, source code doesn't answer "why". It's hard to tease out the grand architecture by looking at lines of code. It would be silly to examine the shape of the earth by looking at every grain of dust on the surface. Sure, eventually you'd get the picture (oblate spheroid with serious perturbations

      • Fortran makes it really really easy to do complex matrix arithmetic. It also makes text manipulation a serious PITA.
        So, like so many other things, it's a trade-off between what a language makes easy to code and what you actually want to code.

        • Fortran makes it really really easy to do complex matrix arithmetic. It also makes text manipulation a serious PITA.
          So, like so many other things, it's a trade-off between what a language makes easy to code and what you actually want to code.

          You are right. However, with the arrival of numpy, I don't see the benefit of Fortran any more.
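A sketch of what the parent means about numpy covering Fortran's traditional strength — the array values here are invented purely for illustration, and numpy must be installed:

```python
import numpy as np

# Complex matrix arithmetic, the classic Fortran niche, in a few lines.
a = np.array([[1 + 2j, 0], [3j, 1]])
b = np.array([[2, 1j], [1, 0]])

product = a @ b            # matrix multiplication
conj_t = product.conj().T  # conjugate transpose

print(product)
print(conj_t)
```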

        • And COBOL makes text manipulation easy, and everything else a PITA!

          • by Teancum ( 67324 )

            I love COBOL statements like

            MULTIPLY SEVEN BY SIXTEEN

            rather than

            7 * 16

            How that made things more reasonable is beyond me. Yes, some variants of COBOL do allow more ordinary mathematical expressions like you would see in C++ or even FORTRAN, but this is a "feature" of COBOL that has always seemed a little off.

            • by cstacy ( 534252 )
              I love COBOL statements like

              MULTIPLY SEVEN BY SIXTEEN

              rather than

              7 * 16

              How that made things more reasonable is beyond me. Yes, some variants of COBOL do allow more ordinary mathematical expressions like you would see in C++ or even FORTRAN, but this is a "feature" of COBOL that has always seemed a little off.

              COBOL always had FORTRAN-like arithmetic statements since the first adopted
          • COBOL has things like decimal arithmetic built in. This is very useful for financial applications, where COBOL is still used quite a lot. If you buy a recent IBM machine, you will even find that the CPU has support for decimal floating point types in the FPU and these are used directly by the COBOL compiler.
            • Yeah, I was trying to be funny, but COBOL isn't so bad... you can also easily link C libraries with most recent COBOL compilers.
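The point about decimal arithmetic is easy to demonstrate outside COBOL too — a sketch with Python's standard decimal module:

```python
from decimal import Decimal

# Binary floats can't represent 0.10 exactly, which matters for money.
print(0.10 + 0.20 == 0.30)       # False

# Decimal arithmetic -- the kind COBOL (and IBM's decimal FPUs) builds
# in -- keeps exact cents.
total = Decimal("0.10") + Decimal("0.20")
print(total == Decimal("0.30"))  # True
```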

      • It's probably the first example of the fundamental misconception, that it's desirable (if even possible) to make formal descriptions using informal language

        It's always seemed to me like some confused attempt to make complex things simpler by writing them in laborious English. e.g. Integrating this function is proving to be quite hard. Perhaps the whole process of integration would be easier if we wrote "Integrate" rather than using that confusing stretched out S symbol.

      One doesn't use the word "plus" instead of the symbol "+" in COBOL. So, no, I don't believe you've written "tons of Cobol [sic]."
    • by RealRav ( 607677 )
      How can this possibly be modded off topic?
    • Re:Cobol (Score:4, Informative)

      by Grishnakh ( 216268 ) on Saturday January 07, 2012 @03:59PM (#38624660)

      Funny, but even this is ignoring reality. According to TFA summary, he was an EE grad in 1964. While those languages do indeed date back that far, EE students were probably not taught them at the time, and in fact probably weren't taught any programming at all, as that was a different discipline (CS). Even when I went to undergrad EE school in the early 1990s, we were only taught a little QBASIC, FORTRAN, C++, MATLAB (1/2 semester each), and x86 assembly language (full semester). There was some more in the junior/senior classes, but only if you elected to take those, and it was all concentrated on microcontroller and embedded programming. Back in the mid-60s, I imagine programming simply wasn't considered important for EEs, and that any EEs who ended up working on computers (which were room-size and mega-expensive at the time) would learn any necessary programming on the job. The fundamentals of EE simply don't include programming; they include network theory (Ohm's and Kirchhoff's Laws), electromagnetics (Maxwell's equations), 3-phase power, etc. It's only been in very recent years (early/mid-90s and later) that they came up with the "computer engineering" degrees, or put the two together ("electrical and computer engineering", or ECE, like at one university I went to).

      • When I got my EE in the early 90's, there wasn't any officially sanctioned CS degree on offer. You could either do an "Engineering and Applied Science" degree focusing on coding and algorithms, or a EE degree focusing on computer design and use.

        If you went the EE route you still had to learn all the antenna theory and 3-phase stuff as well, but at the end you understood how computers worked -and- how to use them.

        • That must have been unique to your school, because my 2nd-rate state university had a CS program, and I'm pretty sure CS has been around in other universities since the 60s. Wasn't Dijkstra a famous CS professor?

        • by Teancum ( 67324 )

          I graduated from high school in '83, and there was no shortage of schools offering computer science programs at the time, although admittedly many of the departments were relatively new.

          I took some time off to experience life and went back to school in the early 1990s, where I know a separate Computer Engineering program was being created at the engineering college at the university I was attending, and there were four different majors offered at four different colleges that had a substantial

      • Back then most Universities placed computer science within EE. It wasn't often a separate department or degree.
    • Re:Cobol (Score:4, Funny)

      by c++0xFF ( 1758032 ) on Saturday January 07, 2012 @04:09PM (#38624746)

      He'll have to learn APL, or at least Perl. After all, he's going "to take a coding class in 2012 via Twitter." Anything else and the program won't fit in a single tweet!

      [rimshot]

    • by Ihmhi ( 1206036 )

      Bloomberg's coding is probably going to end up running about as well as his Spanish sounds.

      Seriously, I live in Newark, NJ and I can barely speak Spanish for the life of me, but every time there's a press conference and Bloomberg speaks Spanish it is downright hilarious.

  • by alphatel ( 1450715 ) * on Saturday January 07, 2012 @02:41PM (#38623976)
    His account was hacked. Bloomie would never make a New Year's resolution.
  • by tomalpha ( 746163 ) * on Saturday January 07, 2012 @02:52PM (#38624096)
    Mike Bloomberg was always the business/sales guy at the company. Tom Secunda [wikipedia.org] was one of the original programmers of the first terminals. That was all in Fortran back then. A fair chunk of it probably still is. You can read this and oh so much more in his not-very-gripping autobiography, which was required reading for all team leads and managers at Bloomberg. [Ex Bloomberger].
    • Bloomberg LP even at one time claimed to own 2-screen setups. These days, there's not much on the Bloomberg Terminal platform that isn't available over the web from them or other sources.

    • I figured that was the case, but there is a distinction between resolving to learn something and not being able to do it. False dichotomy and all that.

      Unfortunately, since you posted facts, theodp won't be able to practice his critical thinking skills. Or alternately practice being more subtle about driving page views for Codecademy with the rhetorical question. A few more obvious problems with this story are the following assumptions apparently based solely on a tweet and a short bio:

      Heading up developm

    • by Anonymous Coward on Saturday January 07, 2012 @03:29PM (#38624394)

      If you look at just about all tech companies, the person who got it going was the sales guy. In some cases the tech guy is also a great salesman - Larry Ellison of Oracle or Zuckerberg of Facebook - actually, FB is just a marketing data collection company.

      In my years in software development, I've seen some really great ideas and implementations just get buried because the geek didn't know how to sell their value.

      All the tech bigshots knew how, or knew someone who knew how, to sell the value of their stuff.

      Wozniak had the luck of having God's gift of salesmanship, Steve Jobs, as his friend. All the gazillionaire techies had someone with them who had the contacts and sales ability to take their idea and make it into something.

      "Build a better mousetrap and the World will beat a path to your door" is a lie. The countless examples of inferior technology ruling the marketplace is proof.

    • Mike Bloomberg was always the business/sales guy at the company.

      So he was basically like Steve Jobs, only without the gift for picking mega-popular designs.

  • by nurb432 ( 527695 ) on Saturday January 07, 2012 @02:59PM (#38624166) Homepage Journal

    So? Just because you manage a department doesn't mean you can do the work they are doing. He was there to manage people, not code... a vastly different skill set.

    Sure, it's nice if you can do the job of your people, so you can have a deeper understanding of what is going on, but it's not a requirement.

    • Re: (Score:2, Interesting)

      by formfeed ( 703859 )

      Sure, it's nice if you can do the job of your people, so you can have a deeper understanding of what is going on...

      That's why I like the announcement, in principle. Even if it turns out just to be a publicity stunt, it at least shows that Bloomberg thinks that learning something different would be good - or at least thinks that his voters think that.

      According to the BBC, the reaction of the London mayor was that he's too busy for things like that. Now, that shows a politician that needs to get rebooted. If politicians would do a couple of things below their pay scale or volunteer for longer than a photo opportunity they m

      • by Asic Eng ( 193332 ) on Saturday January 07, 2012 @08:16PM (#38626252)

        According to BBC, the reaction of the London mayor was that he's too busy for things like that.

        That's completely wrong [bbc.co.uk]. The BBC actually reports [...] that the mayor is "in awe" of his good friend Michael Bloomberg, and if re-elected will explore whether he can join him on that course. I believe you got Boris Johnson (the current mayor) confused with Ken Livingstone (former mayor and current candidate for the opposing party). Ken Livingstone stated: "If I'm elected, I'll be a bit too busy to take any education courses."

        Anyway, it's certainly nice if politicians broaden their minds, but it's reasonable that they have to allocate their time and set priorities.

  • BW 2001 [businessweek.com]: Bloomberg still insists that the Net is too "unreliable" a way to deliver his product. Servers go down, security is dicey, and he has faith in a closed system. There's a Bloomberg Web site with data and news for free. But the CEO was an early skeptic of the Internet gold rush, and these days he figures that he has been proved more right than wrong.

    • by betterunixthanunix ( 980855 ) on Saturday January 07, 2012 @03:05PM (#38624214)
      Frankly, given the line of business he was in -- rapid news delivery to investors -- I am inclined to agree with him about the Internet. Delays on the network could translate into millions of dollars in losses for Bloomberg's customers, which could translate to millions in losses for Bloomberg. From a business perspective it made sense.
      • by Sir_Sri ( 199544 )

        Agreed, in 2001 that was about correct. Systems generally weren't that easy to make reliable, and for what he was doing it wouldn't have been worth it.

        If someone asked me today why I'm not making a WP7 app, the answer is: They don't have enough of a market share for it to be worth our time yet. If, 3 years from now, WP7 owns the whole damn marketplace that doesn't mean my opinion about what I'm doing right this minute is wrong. The world changes, the question is whether or not you can evolve with it, an

        • Even today, critical communications don't travel over the public internet:

          a) Mastercard & VISA card processing networks
          b) ACH & Fedwire money transfers
          c) US DoD communications.

          Using the IP protocol isn't really the problem (why invent new hardware now), but control & management of the network is a big deal. Besides, his servers & his clients can be concentrated in Manhattan. Bloomberg made the right choice then, and it's still the right choice.

  • Bloomberg does not want to learn to code -- he is promoting a business with operations in NYC that will bring jobs into NYC. I do not think there is anything wrong with the mayor of NYC promoting such an organization, but why should /. glorify Bloomberg instead of just glorifying Codecademy?
  • by LostCluster ( 625375 ) * on Saturday January 07, 2012 @03:02PM (#38624190)

    Common in the 60s: Punch cards, text only dumb terminals, mainframes...
    Common Now: Online storage, visual designers, client/server setups....

    If your knowledge of computers ends in the 60s, there's a lot of updating to be done. Mayor Bloomberg has the right idea... every 10 years or so it's time to retrain on the current tools.

    • by Tablizer ( 95088 )

      Terminals were a luxury in the 60's. Teletype machines were generally cheaper (if you were lucky enough to get to use one), even though they consumed a lot of paper.

  • Did Bloomberg do something to the story submitter? Sounds like Bloomberg kicked his dog or something.

    • It's important to hate him because he's rich. Anyone who has money is evil because they are able to fulfill their desires and I am not.

  • Bloomberg: I need you to perform a privilege escalation on my compiler.
    Helpdesk: Before we proceed, can you describe the symptoms?
    Bloomberg: Yeah, it sometimes spits out some incomprehensible message, or the program says "Segmentation Fault." I don't care about its needs, I have work to do, now. So I'm calling in a privilege escalation. Now!
    Helpdesk: Sir, I'm not sure that's going to help, do you know what a privilege escalation means?
    Bloomberg: Yes, I think I do, or haven't you noticed that I'm the ri

  • THAT explains the Citytime fiasco, eh - maybe he's looking to get in on the Nth version
    http://www.nytimes.com/2011/11/01/nyregion/bloomberg-administration-admits-mishandling-citytime-and-nycaps-programs.html [nytimes.com]

  • by jacobsm ( 661831 ) on Saturday January 07, 2012 @03:50PM (#38624570)

    10 Print "I've got lots of money"
    20 goto 10
    30 end

  • by lightknight ( 213164 ) on Saturday January 07, 2012 @04:14PM (#38624792) Homepage

    Just curious -> why? Personal interest, or business venture?

    And someone make sure he starts with C++. If he survives that, he won't have any trouble picking up other languages.

    • And someone make sure he starts with C++. If he survives that, he won't have any trouble picking up other languages.

      I've always been baffled by people who think that C/C++ is a good starting point when you want to learn/teach programming. I think that the most important thing to understand - whether you end up working as a programmer or not - is the basic structure/flow of the program (conditionals, loops, modularity/functions). Then the basic programming concepts (recursion, abstract data types, etc.) and then the libraries/APIs for your platform so that you can actually create something interesting/useful. I don't thin
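The basics the parent lists — conditionals, loops, modularity/functions — fit in a few lines of, say, Python; a toy sketch, not from any particular course:

```python
def classify(numbers):
    """A function (modularity): count how many entries are even vs. odd."""
    evens = odds = 0
    for n in numbers:      # a loop
        if n % 2 == 0:     # a conditional
            evens += 1
        else:
            odds += 1
    return evens, odds

print(classify([1, 2, 3, 4, 5]))  # (2, 3)
```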

      • A fair point. I started programming in JavaScript before I moved to PHP, VB.Net (I regret this one), C#, Java, and C++, in that order. However, it could just as easily have been C++ first.

        If a person is really committed to learning a programming language, they would be fine learning C++ first which would teach not only all the fundamentals but also give some idea of how the system works.

        It can also be unnecessary depending on what they want to do, though (as you suggested).

        • If we are talking about programming in general, I think I started with Logo, then Java / Q-Basic, then C, then JavaScript, then C++. Something like that, with HTML / VRML mixed in for good measure. Ah, good old VRML.

          Currently enjoying C# as my primary language, and doing PHP work for a small project. Have a book on Ruby to finish reading, the AMD APP OpenCL reference for when I have some free time.

      • by lightknight ( 213164 ) on Saturday January 07, 2012 @06:42PM (#38625712) Homepage

        I think C++ is a good starting point simply because it teaches memory management and class design.

        Understanding the concept of a class is one of the most difficult programming concepts a novice will encounter. And they are used everywhere.

        Just try explaining the concept of a class to a non-programmer. I will bet money that they will nod their heads, and still have no idea what you're talking about.

        And memory management -> something you need to understand, even if you use a garbage collector.

        If he's just taking a programming class to get a taste (dilettante) for programming, then by all means teach him Visual Basic or JavaScript or whatever. However, if he's taking a programming class to learn programming (he wants the programmer skillset a.k.a. a real programmer), then C++ is where he wants to be. Once you understand the concepts in C++ (which can be brutal / metal when it comes to learning), the hardest part of learning how to program is past.

        Why, do you ask? Because otherwise you end up in sad scenarios, like when the PhDs in your Computer Science department do not know how to install an operating system, when the undergrads in your class have difficulty understanding the difference between an AMD processor and an Intel processor, or why one should never write a program in JavaScript that consumes 8 GB of the client computer's memory.

        TLDR; C++ will expose him to the greatest number of programming concepts in the shortest period of time, and give him the minimal amount of understanding necessary to eventually grow into a respected programmer.
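For what it's worth, the "class" idea the parent calls hard to explain can be shown in a few lines. A toy Python sketch — the names are invented for illustration:

```python
class Account:
    """A class bundles data (state) with the functions that act on it."""

    def __init__(self, owner, balance=0):
        self.owner = owner      # data lives on the instance
        self.balance = balance

    def deposit(self, amount):  # behavior travels with the data
        self.balance += amount
        return self.balance

# Each instance carries its own independent state.
a = Account("Mike")
b = Account("Tom", 100)
a.deposit(50)
print(a.balance, b.balance)  # 50 100
```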

        • Re: (Score:3, Informative)

          by Andrevan ( 621897 )
          Why do CS PhDs, who spend 98% of their time doing theory (math), need to know anything about installing an OS? Why do undergrads, who probably use preassembled OEM boxes, need to understand the differences between hardware brands? More to the point, how does learning memory management or class design through C++ help one learn these things? To address a less ridiculous point, if I'm spending all my time in Java, Ruby or Python, why do I need to understand anything about pointers and memory management in C? For the sake
          • Re: (Score:2, Informative)

            by Aighearach ( 97333 )

            Like so many other classes required for a CS degree, I use nothing from it in my day-to-day work as a Ruby developer.

            As a Ruby developer I just have to point out, without C you can't understand the Ruby source or write native extensions.

            A Ruby developer without C is totally weak.

        • by siride ( 974284 )

          C++ is a terrible language for teaching OO. There are other languages that have a stronger OO model that is consistent throughout the language.

          Teach C for basic programming concepts, memory management and that kind of thing. Use a .NET lang/Java, Python or Haskell for the modern and OO stuff.

      • When teaching people to program I start with an introduction to binary and hexadecimal, making them do a few things like noting the patterns various numbers contain, and adding up a couple of things (which illustrates why powers of two are important and convenient, etc.)

        From there it is a brief introduction to logic gates. I demonstrate simple addition, and make them construct a circuit that will do add with carry.

        Then I do a brief introduction to assembly (x86 these days, I used to choose Alpha.) I do
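The "add with carry" exercise mentioned above can be sketched in software as a ripple-carry adder — a toy model in Python of the circuit, not the circuit itself:

```python
def full_adder(a, b, carry_in):
    """One-bit full adder built only from logic operations (XOR, AND, OR)."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def ripple_add(x, y, bits=8):
    """Add two integers one bit at a time, propagating the carry."""
    result, carry = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # overflow past `bits` wraps, as in real hardware

print(ripple_add(23, 42))  # 65
```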

      • The advisor for my intro to programming project promptly nixed C++ and went with Python.
        A teacher who had worked with Fortran in the '70s said this: "Automatic memory management? You lucky bastard."
        Moreover, Python has a fairly straightforward syntax without being _just_ a teaching language.

    • To build a website. He is having trouble finding developers who *want* to help him oppose OWS's web presence, so he's going to learn html and make "an anti-anti-wall-street web-page" all by himself.
    • Comment removed based on user account deletion
  • Perl! (Score:3, Funny)

    by RobertRCleveland ( 2547974 ) on Saturday January 07, 2012 @04:34PM (#38624970)
    Bloomberg should learn Perl. That'll make him ready for the Presidency! :)
  • by peter303 ( 12292 ) on Saturday January 07, 2012 @06:11PM (#38625504)
    Always learn new things in life, since technology evolves so fast. I feel sorry for my co-workers who refuse to learn on their own because it would cost them some time or money.
  • Bloomberg is a narcissist, he's going to write a Hello World program and think he's an expert in all things technology related.

    LK

  • No mayor has overstepped his legal boundaries like he has, running multiple illegal so-called sting operations not only in New York State but in other states as well. New York City also has some rather questionable intelligence units that partner way too closely with the FBI and CIA.
  • I went there and looked at their other courses. Only 3? The Bloomberg thing is advertising for something that isn't there.

  • Only way to go, man.

