Monday, March 11, 2013

Installing Cloudera Hadoop on Mac

Do not search it in the app store 

I know that some of you searched for Hadoop in the app store :P .. It ain't there yet, too bad isn't it? And, it's not as easy as "sudo apt-get install hadoop" :)

But do not worry, it's not as difficult either. It's much like installing Apache Hadoop on Linux distributions. All it takes is a few simple baby steps to install the elephant on your Mac.

This tutorial will guide you through installing Hadoop in a pseudo-distributed mode on your Mac.

Step-1: Passwordless ssh.

Try ssh-ing into your own machine:

$ ssh localhost

Without passwordless ssh, you will need to enter your password to log in to your own system through ssh. To set it up, generate a key pair:

$ ssh-keygen

The command will ask for the location to store the id_rsa key files. Press enter to accept the default location.
It will then ask for a passphrase. Just press enter to not have any passphrase at all (after all, we are trying to achieve passwordless ssh).

The command sometimes ends with a weird random art of the key, like this:

+--[ RSA 2048]----+
|               -.|
|              .+o|
|              A=o|
|       .      ..=|
|        P .. . O |
|       . . .o    |
|          . F.. .|
|           +.o.o |
|          o O=+  |
+-----------------+

or just a cryptic fingerprint like this: a2:b1:5e:6f:2a:a2:d7:3f:d1:e5:5a:aa:ab:c5:e8:2a

But yeah, don't get scared. You do not have to remember them :P
Now go to your .ssh directory and copy the pub file to authorized_keys.

$ cd /Users/rajgopalv
$ cd .ssh
$ cp id_*.pub authorized_keys
$ ssh localhost
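The whole of step 1 can be sketched in one shot. It's shown here against a scratch directory so it is safe to re-run (point SSHDIR at ~/.ssh for the real thing); note that ssh insists on tight permissions on these files:

```shell
# Generate a key with an empty passphrase (-N "") and no prompts (-q),
# then authorize it. SSHDIR is a scratch dir here; use ~/.ssh for real.
SSHDIR=$(mktemp -d)
ssh-keygen -t rsa -N "" -f "$SSHDIR/id_rsa" -q
cat "$SSHDIR/id_rsa.pub" >> "$SSHDIR/authorized_keys"
# ssh refuses keys kept in loosely-permissioned files, so tighten them:
chmod 700 "$SSHDIR"
chmod 600 "$SSHDIR/authorized_keys"
ls "$SSHDIR"    # authorized_keys  id_rsa  id_rsa.pub
```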

Sometimes, when you log in to a host for the first time, you will be shown a security warning of some kind:

The authenticity of host 'localhost (::1)' can't be established.
RSA key fingerprint is <the fingerprint>.
Are you sure you want to continue connecting (yes/no)?

Just type "yes" and continue. The system should not ask for a password, and you should be able to log in successfully.

Step-2: Download the CDH4 tarballs.

Hadoop and its family of software are available in tarball format here:

Go ahead and download the stuff you want.
But to begin with, let me start by downloading the "hadoop-2.0.0+922" tarball. If you want to run mapreduce version 1 (i.e. not YARN), then download the "hadoop-0.20-mapreduce-0.20.2+1341" tarball too (recommended).

Now, extract these tarballs and place them wherever you want them installed. I personally prefer a "Softwares" folder in my home directory.

$ pwd
/Users/rajgopalv/Softwares
$ ls -ld hadoop*
drwxr-xr-x@ 14 rajgopalv  1668562246  476 Feb  7 07:00 hadoop-2.0.0-cdh4.1.3
drwxr-xr-x@ 29 rajgopalv  1668562246  986 Feb  6 11:20 hadoop-2.0.0-mr1-cdh4.1.3

Step-3: Configure

DFS configuration:
Go to the hadoop-2.0.0-cdh4.1.3/etc/hadoop directory and edit the core-site.xml to look like this (of course, you can use any directory you want for hadoop.tmp.dir).
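A minimal pseudo-distributed core-site.xml looks something like this (port 8020 matches what we use for the DFS later in this post; the hadoop.tmp.dir path is just my choice, pick any directory you can write to):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- where clients find the namenode -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
  <!-- base directory for hadoop's working files -->
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/Users/rajgopalv/Softwares/hadoop-tmp</value>
  </property>
</configuration>
```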

and hdfs-site.xml to look like this:
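A sketch of hdfs-site.xml for a single machine; with only one datanode there is no point keeping more than one copy of each block:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- only one datanode, so keep a single replica of each block -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```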



Similarly, configure map-reduce too. Go to the hadoop-2.0.0-mr1-cdh4.1.3/conf/ directory and edit the mapred-site.xml to look like this.
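A minimal mapred-site.xml for MRv1 looks something like this (port 8021 matches the mapreduce port mentioned later in this post):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- where tasktrackers and clients find the jobtracker -->
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>
```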


Step-4: Run!

Now it's time to format and start the DFS:
Go to the hadoop-2.0.0-cdh4.1.3 folder in your terminal.

$ cd /Users/rajgopalv/Softwares/hadoop-2.0.0-cdh4.1.3
$ bin/hdfs namenode -format

13/03/12 00:27:06 INFO namenode.NameNode: STARTUP_MSG:
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = Nucleus/
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.0.0-cdh4.1.3
STARTUP_MSG:   classpath = /Users/rajgopalv.......... [etc etc..]


blah blah blah

13/03/12 00:27:07 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
13/03/12 00:27:07 INFO util.ExitUtil: Exiting with status 0
13/03/12 00:27:07 INFO namenode.NameNode: SHUTDOWN_MSG:
SHUTDOWN_MSG: Shutting down NameNode at Nucleus/

The important thing to notice is "Exiting with status 0". Status 0 indicates all is well. :)
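That same status is available in the shell as $?, which is handy if the log scrolled past too fast. A quick generic illustration (true stands in for the namenode -format command):

```shell
# $? holds the exit status of the last command; 0 means success.
true        # stand-in for a command that succeeds
echo $?     # prints 0
```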
Now start the DFS.

$ sbin/

Now, http://localhost:50070/dfshealth.jsp should display the health of your DFS.
To start the mapreduce module, go to the hadoop-2.0.0-mr1-cdh4.1.3 directory in your terminal.

$ cd /Users/rvaithiyanathan/Softwares/hadoop-2.0.0-mr1-cdh4.1.3
$ bin/

Now, http://localhost:50030/jobtracker.jsp should show your mapreduce jobs.

Bravo! You are good to go.

Possible things that could go wrong: 

Set your JAVA_HOME before you start anything.

$ export JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Home/

or, on recent versions of OS X, let the system find it for you:

$ export JAVA_HOME=$(/usr/libexec/java_home)

Try different port numbers in the configuration.

Although I've shown here that the DFS is configured on port 8020 and mapreduce on 8021, some other software might already be using these ports. So feel free to try different ones.

Do you have permissions to the hadoop.tmp.dir?

The directory that you specified as hadoop.tmp.dir must be writable by you. This is the reason I've specified a directory under my home directory itself.

Also, check out the *.log files in hadoop-2.0.0-mr1-cdh4.1.3/logs/ and hadoop-2.0.0-cdh4.1.3/logs/. They can be a little cryptic if you are a beginner, but you will get used to them :)

Let me know if you have any doubts!


  1. got this :

    smaikap:hadoop-2.0.0-cdh4.3.0 Smaikap$ bin-mapreduce1/
    | Error: HADOOP_HOME is not set correctly |
    | Please set your HADOOP_HOME variable to the absolute path of |
    | the directory that contains hadoop-core-VERSION.jar |
    smaikap:hadoop-2.0.0-cdh4.3.0 Smaikap$ env
    smaikap:hadoop-2.0.0-cdh4.3.0 Smaikap$ env | grep hadoop
    smaikap:hadoop-2.0.0-cdh4.3.0 Smaikap$

  2. As of CDH4.3, there is no separate tarball for MRv1:
