
IT's Next Hot Job: Hadoop Guru

gManZboy writes "JPMorgan Chase and other companies at this year's Hadoop World conference came begging for job applicants: They say they can't find enough IT pros with certain skills, including Hadoop MapReduce. That spells high pay. As for Hadoop's staying power as a career path (a la SQL 30 years ago), IBM, Microsoft and Oracle have all embraced Hadoop this year. Maybe the best news of all: 'Intelligent technologists will pick up Hadoop very quickly.'"


  • by geoffrobinson ( 109879 ) on Wednesday November 09, 2011 @05:41PM (#38005120) Homepage

    The trick is going to be getting the appropriate experience without having learned it on the job already.

    Yes, it can be done. However, this technology is geared towards environments with lots of nodes in big clusters (which can run Linux). That's not the same as simply learning a language.

    I got a job utilizing a "Big Data" database technology by being at the right place at the right time, when this technology was being rolled out. It's also hard to find people with that specialized experience.

    So I would suggest that companies hire people and train them. Just get quality people if you can't find someone with the specific skill set.

  • by ackthpt ( 218170 ) on Wednesday November 09, 2011 @05:52PM (#38005304) Homepage Journal

    If you want a strong user base, good, easy-to-use learning resources matter; projects that have them do better. When you hit the Hadoop main page, they tell you what it is, but not what you need to know in order to use it. They don't tell you what languages it supports, and they give no examples of usage. Essentially, they don't do you any favours.

    I spent some time trying to implement some nice free tools from IBM and Apache. I found I needed to download X and do a build of it, but halfway through it wanted Y to complete the build. OK... So I go find Y and try doing a build of it, but it needs something else from Apache, which doesn't like the version of Apache I'm running. So I get the other Apache thing and find I can't get it to start up. I go research it and find conflicting and incomplete information all over the web. I throw in the towel.

    One thing that's needed is one source of information and clear instructions for a basic, default build of the platform. Once that is reliable, then document ways to add foo and bar, or even plugh if it suits you.

  • by JonySuede ( 1908576 ) on Wednesday November 09, 2011 @06:30PM (#38005758) Journal

    Drink the Maven Kool-Aid, and your worries will be behind you.
    To use Hadoop, put:

            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-core</artifactId>
                <version>0.20.205.0</version>
            </dependency>

    in your pom.xml.

    Then write two classes like these:

    class MyMap extends MapReduceBase implements Mapper<K1, V1, K2, V2 >...
    class MyReduce extends MapReduceBase implements Reducer<K2, V2, K3, V3>...

    Feed those classes to a JobConf and feed that JobConf to a JobClient.

    The rest should be obvious to a seasoned programmer just from the naming of the class hierarchy; a minimal sketch follows below.

    The great Ward Cunningham is right: put two days into studying something and you are already halfway to expert. [infoq.com]
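
    To make those steps concrete, here is a minimal word-count sketch of the workflow described above, written against the old org.apache.hadoop.mapred API that matches hadoop-core 0.20.x. The class names (WordCount, MyMap, MyReduce) and the command-line input/output paths are illustrative assumptions, not anything Hadoop prescribes.

        import java.io.IOException;
        import java.util.Iterator;
        import java.util.StringTokenizer;

        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.io.IntWritable;
        import org.apache.hadoop.io.LongWritable;
        import org.apache.hadoop.io.Text;
        import org.apache.hadoop.mapred.*;

        public class WordCount {

            // Mapper: split each input line into words and emit (word, 1).
            public static class MyMap extends MapReduceBase
                    implements Mapper<LongWritable, Text, Text, IntWritable> {
                private static final IntWritable ONE = new IntWritable(1);
                private final Text word = new Text();

                public void map(LongWritable key, Text value,
                                OutputCollector<Text, IntWritable> output,
                                Reporter reporter) throws IOException {
                    StringTokenizer tokens = new StringTokenizer(value.toString());
                    while (tokens.hasMoreTokens()) {
                        word.set(tokens.nextToken());
                        output.collect(word, ONE);
                    }
                }
            }

            // Reducer: sum the counts collected for each word.
            public static class MyReduce extends MapReduceBase
                    implements Reducer<Text, IntWritable, Text, IntWritable> {
                public void reduce(Text key, Iterator<IntWritable> values,
                                   OutputCollector<Text, IntWritable> output,
                                   Reporter reporter) throws IOException {
                    int sum = 0;
                    while (values.hasNext()) {
                        sum += values.next().get();
                    }
                    output.collect(key, new IntWritable(sum));
                }
            }

            public static void main(String[] args) throws IOException {
                // Register the map and reduce classes with a JobConf...
                JobConf conf = new JobConf(WordCount.class);
                conf.setJobName("wordcount");
                conf.setMapperClass(MyMap.class);
                conf.setReducerClass(MyReduce.class);
                conf.setOutputKeyClass(Text.class);
                conf.setOutputValueClass(IntWritable.class);
                FileInputFormat.setInputPaths(conf, new Path(args[0]));
                FileOutputFormat.setOutputPath(conf, new Path(args[1]));

                // ...and hand that JobConf to a JobClient to run the job.
                JobClient.runJob(conf);
            }
        }

    Packaged into a jar, it would run with something like "hadoop jar wordcount.jar WordCount /input /output", where the input path exists in HDFS and the output path does not yet.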

