
Forget 6-Minute Abs: Learn To Code In a Day

whyloginwhysubscribe writes "The usually excellent BBC 'Click' programme has an article on 'Why computer code is the new language to learn,' which features a London company that offers courses on learning to code in a day. The BBC clip has an interesting interview with a marketing director who, it seems to me, is going to go back and tell his programmers to speed up, because otherwise he could do it himself! Decoded.co's testimonials page is particularly funny: 'I really feel like I could talk credibly to a coder, given we can now actually speak the same language.'"
  • by Anonymous Coward on Tuesday August 14, 2012 @01:10PM (#40986211)

    Of course everyone starts somewhere. But to think that you can program after a one-day course is as ridiculous as thinking you know a foreign language after a one-day course. The problem is not in starting, the problem is in thinking you've reached the destination when in reality you are barely away from the starting point.

  • Re:language != logic (Score:5, Informative)

    by amicusNYCL ( 1538833 ) on Tuesday August 14, 2012 @01:11PM (#40986227)

    It's a course in HTML, CSS, and JavaScript. JavaScript is the only one of the three that is an actual programming language, so they aren't teaching people how to program; they're teaching people how those three languages interact to create a web page. It actually seems like a pretty useful course for any company that produces online products: send the marketing and sales teams, so that they at least get a glimpse of how these things work, a better understanding of what they're asking us to do, and more of an idea of what's possible. The #1 question I'm asked is "is it possible to..." Yes, it's possible; it's always possible. It's a question of time and money. I don't know how many times I have to answer that question before people realize they can skip straight to the second question ("what does it take to do it?"). A class like this may clue them in.
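
    For the curious, the whole "how the three interact" lesson fits on one page. Here's a minimal sketch (the ids and text are invented for illustration): HTML supplies the structure, CSS the presentation, and JavaScript the behavior.

        <!DOCTYPE html>
        <html>
          <head>
            <style>
              /* CSS: presentation */
              #greeting { color: navy; font-size: 2em; }
            </style>
          </head>
          <body>
            <!-- HTML: structure -->
            <p id="greeting">Hello</p>
            <button id="btn">Click me</button>
            <script>
              // JavaScript: behavior
              document.getElementById("btn").addEventListener("click", function () {
                document.getElementById("greeting").textContent = "Hello, world!";
              });
            </script>
          </body>
        </html>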

  • Re:language != logic (Score:4, Informative)

    by turbidostato ( 878842 ) on Tuesday August 14, 2012 @01:56PM (#40986751)

    "I was thinking about factorizing the product of two large primes. There are numerous problems in computer science that we can't do with current technology;"

    Excellent point... against your position. Factoring the product of two large primes is not only possible but trivial to do in principle; it's just that the general case is horribly time-consuming (and thus expensive). Hence the "don't ask me if it's possible and go directly to the second question".
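
    To put the point in code: the algorithm is trivial (here, plain trial division in JavaScript; the numbers are invented), and it always succeeds eventually. What you're buying with "time and money" is making "eventually" arrive before the heat death of the universe.

        // Smallest prime factor by trial division: always correct, not always affordable.
        function smallestFactor(n) {          // n is a BigInt, e.g. 15n
          for (let d = 2n; d * d <= n; d++) {
            if (n % d === 0n) return d;       // found a nontrivial factor
          }
          return n;                           // n itself is prime
        }

        smallestFactor(15n);                  // 3n, instantly
        // For the product of two 1024-bit primes the same loop is still
        // "possible"; it just wouldn't finish in any useful timeframe.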

  • by firewrought ( 36952 ) on Tuesday August 14, 2012 @02:12PM (#40986977)

    The similarity with spoken language is uncanny.

    The similarities go way deeper than that. Mainly, there is a strong isomorphism between how human and computer languages are encoded and interpreted/compiled.

    At the lowest level, a digital "alphabet" must be imposed on this unruly analogue world. Human languages use phonemes [wikipedia.org] (generally a few dozen distinct sounds) while computer languages use a character set (such as ASCII or Unicode). Each of these alphabets is basically a finite, unchanging set of meaningless symbols.

    One level up are morphemes [wikipedia.org], the words and word-parts constructed from phonemes. So "dog" is the name/morpheme we assign to the furry thing lifting its leg over your bedroom carpet; "urinate" is the morpheme we assign to its activity; and "ed" is the morpheme that signifies the activity has already completed (as in urinated). In computer languages this is called lexical analysis [wikipedia.org], and it happens very early during compilation, usually with the help of regexps. In both cases, this phase transforms sequences drawn from the fixed alphabet into a large, ever-growing set of meaningful symbols.
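
    To make the lexing half concrete, here's a minimal sketch of regexp-driven lexical analysis in JavaScript (the token names and input are invented for illustration):

        // Transform a character stream into a stream of meaningful tokens.
        const tokenSpec = [
          ["NUMBER", /^\d+/],
          ["IDENT",  /^[A-Za-z_]\w*/],
          ["OP",     /^[+\-*\/=]/],
          ["SPACE",  /^\s+/],
        ];

        function lex(source) {
          const tokens = [];
          let rest = source;
          while (rest.length > 0) {
            let matched = false;
            for (const [type, regex] of tokenSpec) {
              const m = rest.match(regex);
              if (m) {
                if (type !== "SPACE") tokens.push({ type, text: m[0] }); // whitespace carries no meaning
                rest = rest.slice(m[0].length);
                matched = true;
                break;
              }
            }
            if (!matched) throw new Error("Unexpected character: " + rest[0]);
          }
          return tokens;
        }

        // lex("dogs = 3 + 4") yields IDENT, OP, NUMBER, OP, NUMBER tokens.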

    The next level up is syntax, in which a governing grammar (itself consisting of a closed set of abstract categories) is used to parse the morphemes/lexical tokens into tree-like data structures that are subsequently used to determine relationships between word-units. This is where you start reading Chomsky or the Dragon book and reaching for the Midol. I don't know if it's Chomsky's fault or what, but the two fields share a lot of terminology here (e.g., syntax, grammar, parsing, production rules), while also using dissimilar terminology for roughly equivalent concepts (e.g., sentence<==>statement, clause<==>expression, paragraph<==>method).
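
    And the parsing half, continuing the sketch above: a toy recursive-descent parser (the two-rule grammar is invented) that turns the token list into the tree-like structure described here.

        // Grammar: expr -> term (("+" | "-") term)*
        //          term -> NUMBER (("*" | "/") NUMBER)*
        function parse(tokens) {
          let pos = 0;
          const peek = () => tokens[pos];
          const next = () => tokens[pos++];

          function term() {
            let node = next(); // expects a NUMBER token
            while (peek() && "*/".includes(peek().text)) {
              node = { op: next().text, left: node, right: next() };
            }
            return node;
          }

          function expr() {
            let node = term();
            while (peek() && "+-".includes(peek().text)) {
              node = { op: next().text, left: node, right: term() };
            }
            return node;
          }

          return expr();
        }

        // parse(lex("3 + 4 * 5")) groups as 3 + (4 * 5): the grammar, not the
        // token order alone, determines the relationships between the "words".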

    After that comes semantics (assignment of meaning) and pragmatics (what things mean in context), for which you could find some suggestive connections with compilation (type-checking and processor-specific optimizations, perhaps), but here the easy/clean comparisons start to break down... probably because we still have a very limited understanding of how the human brain works. In both cases, it seems that there has to be a translation from the abstract, extracted idea down into the series of electrical impulses that yield a change in state of the target brain/computer.

    As a completely separate topic, there is an isomorphism (in the sense of the term that Hofstadter uses in GEB [wikipedia.org]) between how both human and computer languages evolve and branch cladistically with time. (And unsurprisingly, there is yet another isomorphism between biological evolution and language evolution.... we live in an endlessly fascinating world.)

    Keep in mind, though, that we are ultimately finding similarities between things that are fundamentally different. Blindly inferring new "truths" about computer languages from human ones (or vice versa) is a recipe for looking silly.
