Hi, I am trying to set up a multi-node Hadoop cluster, and I would like to know how to install Hadoop on all the slave machines from the master machine itself.

Ideally, all the slave machines should have the same configuration. Is there a way to send instructions to all the slaves simultaneously?

For example, if I run "sudo apt-get install git" from my master machine, it should install git on all the slave machines. I appreciate your help with this.
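The closest I have gotten is a manual loop over SSH from the master. This is only a rough sketch: the hostnames are placeholders, and it assumes passwordless SSH from the master to each slave.

```shell
# Placeholder slave hostnames -- replace with your own.
HOSTS="slave1 slave2 slave3"
CMD="sudo apt-get install -y git"

for h in $HOSTS; do
    # Dry run: prints the command that would be sent to each slave.
    # Drop the leading 'echo' to actually run it over SSH.
    echo ssh "$h" "$CMD"
done
```

It works, but it is serial and clumsy, which is why I am hoping there is a proper tool for this.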

Download the free version of the Cloudera Hadoop management tool. It will nicely do all that cruft for you.

Rubberman, if Cloudera can do this, is there some software from Apache that does the same job? Do you know what the feature is called in Cloudera?

If you need a quick-and-dirty way to control several machines in parallel (provided the number of machines is fairly small), you can use tmux. With a separate pane hosting an SSH session to each host, you can enable synchronize-panes in tmux to have your keystrokes echoed to every host at the same time.

Search for tmux synchronize-panes for more details.
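A rough sketch of the setup, assuming three slaves (the hostnames are placeholders) and that tmux is installed on the master:

```shell
# Start a detached session with one pane per slave host.
tmux new-session -d -s cluster 'ssh slave1'
tmux split-window -t cluster 'ssh slave2'
tmux split-window -t cluster 'ssh slave3'
tmux select-layout -t cluster tiled

# Mirror keystrokes to every pane in the window.
tmux set-window-option -t cluster synchronize-panes on

# Attach; anything you type now goes to all three hosts at once.
tmux attach -t cluster
```

Toggle synchronize-panes off again when you want to type into a single pane.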

I read about a Python-based tool named SaltStack yesterday. I think it can distribute commands the way you want.
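From what I read, once a Salt master is installed and the slaves are running as minions, commands like these run against every connected machine (the '*' target glob matches all minions):

```shell
# Run an arbitrary shell command on every minion.
salt '*' cmd.run 'uname -r'

# Use Salt's package module to install git on all minions at once.
salt '*' pkg.install git
```

I have not used it in production myself, so treat this as a sketch of the idea rather than a tested recipe.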