“Permission denied” errors when starting a single-node cluster in Hadoop

I’m working in Ubuntu 10.10 and am trying to start a single-node cluster in Hadoop. hadoop@abraham-Dimension-3000:/usr/local/hadoop$ bin/start-all.sh mkdir: cannot create directory `/usr/local/hadoop/bin/../logs': Permission denied starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out /usr/local/hadoop/bin/hadoop-daemon.sh: line 117: /usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out: No such file or directory head: cannot open `/usr/local/hadoop/bin/../logs/hadoop-hadoop-namenode-abraham-Dimension-3000.out' for reading: No such file or directory localhost: mkdir: cannot create directory … Read more
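The error says the user running start-all.sh cannot create the logs directory under the install tree. A minimal sketch of the usual fix, assuming the install prefix shown in the log and that Hadoop runs as the hadoop user:

```shell
# Assumption: Hadoop lives under /usr/local/hadoop and runs as user "hadoop".
HADOOP_HOME=/usr/local/hadoop

# Create the logs directory start-all.sh is failing to create...
sudo mkdir -p "$HADOOP_HOME/logs"

# ...and hand ownership of the whole tree to the hadoop user so the
# daemons can write their .out and .log files there.
sudo chown -R hadoop:hadoop "$HADOOP_HOME"
```

After this, re-running bin/start-all.sh as the hadoop user should get past the mkdir failure.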

How do I find out the version of Zookeeper I am running?

I have an Ubuntu-12.04 VM running on my laptop. I have installed ZooKeeper on it using the command sudo apt-get install zookeeper. Now, for traditional Ubuntu programs I check the version using the --version command-line option. For example, gcc --version gives the version of gcc as 4.6.3. Similarly, how do I find out the … Read more
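ZooKeeper has no --version flag, but there are two common ways to get the version: ask the package manager for the installed package's version, or send the stat four-letter command to a running server (assuming the default client port 2181):

```shell
# Version of the Ubuntu package that apt-get installed:
dpkg -s zookeeper | grep '^Version'

# Version reported by a running ZooKeeper server
# (assumes the default client port 2181 and that nc is installed):
echo stat | nc localhost 2181 | head -n 1
```

The second command prints a line starting with "Zookeeper version:", which reflects the server actually running rather than just the package metadata.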

How to undo an ssh-copy-id?

I have a 2-node Hadoop cluster. I ran this command on the master: $ ssh-copy-id -i /home/hadoop/.ssh/id_rsa.pub hadoop@192.168.1.1 How can I undo this? I would actually like to reassign the key. 192.168.1.1 is the slave. Answer Identify the public key that you copied when you ran ssh-copy-id: cat ~/.ssh/id_rsa.pub SSH to the server you copied … Read more
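ssh-copy-id only appends the public key as one line to ~/.ssh/authorized_keys on the remote host, so undoing it means deleting that line again. A sketch, using the key file and addresses from the question:

```shell
# The exact public key that ssh-copy-id appended on the slave.
KEY=$(cat /home/hadoop/.ssh/id_rsa.pub)

# On the remote host, filter that one line out of authorized_keys.
# grep -vF matches the key as a fixed string, not a regex.
ssh hadoop@192.168.1.1 "grep -vF '$KEY' ~/.ssh/authorized_keys > ~/.ssh/ak.tmp \
  && mv ~/.ssh/ak.tmp ~/.ssh/authorized_keys"
```

Once the old line is gone, running ssh-copy-id again with a new key pair effectively "reassigns" the key.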

How to install Hadoop?

I am trying to install Hadoop on Ubuntu 12.04. Following the instructions from http://michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/, I installed java-6-openjdk from the Ubuntu software-center. I have set JAVA_HOME in .bashrc and also in Hadoop’s conf/env.sh. While formatting the namenode, I am getting the following error: usr/lib/jvm/java-6-openjdk/bin/java no such file or directory. Thank you. But it’s a 64bit … Read more
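On 64-bit Ubuntu the OpenJDK 6 package installs under java-6-openjdk-amd64, not plain java-6-openjdk, which would explain the "no such file or directory" error. A sketch for locating the real JDK path and exporting it (the -amd64 path is an assumption about a 64-bit install):

```shell
# Where does the java binary actually live? readlink resolves the
# /etc/alternatives symlinks to the real installation directory.
readlink -f "$(which java)"

# Typical 64-bit Ubuntu 12.04 location (assumption; strip the trailing
# /jre/bin/java from the readlink output to get the right prefix):
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64
```

The same corrected path then needs to go into .bashrc and Hadoop's conf env file.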

Port binding error in PySpark

I have been trying to get PySpark to work. I use the PyCharm IDE on a Windows 10 machine. For the setup I took these steps: installed PySpark installed Java 8u211 downloaded and pasted the winutils.exe declared SPARK_HOME, JAVA_HOME and HADOOP_HOME in Path added spark folder and zips to Content Root already tried: export SPARK_LOCAL_IP="127.0.0.1" in … Read more
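Port-binding failures in local Spark are usually the driver picking a hostname it cannot bind to. The idea behind the export the question already tried can also be expressed as a Spark conf setting; both names below are real Spark settings, the rest is an assumption about this setup:

```shell
# Bind Spark's driver to loopback so it doesn't pick an unresolvable
# hostname. (On Windows cmd the syntax is: set SPARK_LOCAL_IP=127.0.0.1)
export SPARK_LOCAL_IP=127.0.0.1

# Equivalent per-application setting, passed when launching:
# pyspark --conf spark.driver.bindAddress=127.0.0.1
```

In PyCharm, the environment variable has to be set in the run configuration's environment, not just in a terminal, or the IDE-launched interpreter will never see it.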

Hadoop – java.io.IOException: Connection reset by peer when creating a new directory

I have installed hadoop 2.4.0 as a single node for learning purposes but after I start hadoop and create a directory using the command: hadoop fs -mkdir /tmp I get the following error: ls: Failed on local exception: java.io.IOException: Connection reset by peer; Host Details : local host is: “localhost.localdomain/127.0.0.1”; destination host is: “localhost”:9000; I’m … Read more
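"Connection reset by peer" against localhost:9000 usually means nothing is actually listening on the fs.defaultFS port, i.e. the NameNode did not come up. A quick diagnostic sketch, assuming the standard Hadoop and JDK tools are on PATH:

```shell
# Is the NameNode JVM running at all? jps lists Java processes by name.
jps | grep NameNode || echo "NameNode is not running - check its log under \$HADOOP_HOME/logs"

# Is anything listening on the port from the error message (9000)?
netstat -tln 2>/dev/null | grep ':9000' || echo "nothing listening on port 9000"
```

If the NameNode is missing, its log file typically names the real cause (an unformatted or corrupted namenode directory is common on fresh single-node installs).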

I can’t run hive from the command line

Hive 3.1.2 Hadoop 3.2.1 When I run hive from the command line, it gives me the error message below: which: no hbase in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin) SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/home/hadoop/apache-hive-3.1.2-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/home/hadoop/hadoop3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory] Exception in … Read more
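The two SLF4J bindings in the log come from Hive's lib directory and from Hadoop's. A common workaround (paths taken from the SLF4J warning above) is to move Hive's copy aside so only Hadoop's binding stays on the classpath:

```shell
# Assumption: HIVE_HOME matches the jar path shown in the SLF4J warning.
HIVE_HOME=/home/hadoop/apache-hive-3.1.2-bin

# Rename rather than delete, so the change is easy to revert.
mv "$HIVE_HOME/lib/log4j-slf4j-impl-2.10.0.jar" \
   "$HIVE_HOME/lib/log4j-slf4j-impl-2.10.0.jar.bak"
```

Note this only clears the duplicate-binding warning; the truncated "Exception in …" at the end may have a separate cause that the full stack trace would reveal.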

7Zip Cannot create symbolic link, access is denied to libhdfs.so and libhadoop.so

I am working on Windows 10 and trying to install Hadoop. I downloaded it from here. When trying to extract the Hadoop files (libhdfs.so and libhadoop.so) I am getting the error: Cannot create symbolic link: Access is denied. How do I fix this? Answer How do I fix this? I am getting the error: … Read more
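Creating symbolic links on Windows requires elevation (or Developer Mode), so 7-Zip fails for the libhdfs.so/libhadoop.so symlinks when run from a normal account. One option is to run 7-Zip as administrator; another sketch, assuming the tar that ships with Windows 10 and an example archive name, is to extract from an elevated prompt:

```shell
# From an elevated (run-as-administrator) prompt, tar can create the
# symlinks inside the archive that a non-elevated 7-Zip session cannot.
# The archive name and target directory here are examples.
tar -xzf hadoop-3.2.1.tar.gz -C C:/hadoop
```

On Linux-style test systems the same tar invocation round-trips symlinks intact, which is exactly what the extraction needs.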