I am running two Docker containers: one for the basic Hadoop services and one for Flume. The services are running successfully. I linked the two containers, and Docker set the environment variables automatically.
    126.96.36.199   7ab4ffb30dc0
    ff00::0         ip6-mcastprefix
    ff02::1         ip6-allnodes
    ff02::2         ip6-allrouters
    127.0.0.1       localhost
    ::1             localhost ip6-localhost ip6-loopback
    fe00::0         ip6-localnet

This is my /etc/hosts file in the Hadoop container. When I run
    hadoop fs -ls /
    hadoop fs -ls hdfs://127.0.0.1:8020/

it works fine. But if I run
it returns 7ab4ffb30dc0
So I tried

    hadoop fs -ls hdfs://188.8.131.52:8020/

and it says:

    Call From 7ab4ffb30dc0/184.108.40.206 to 7ab4ffb30dc0:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
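When the loopback address works but the container's own address is refused, the usual cause is that the NameNode RPC socket is bound to localhost only. As a hedged sketch (assuming Hadoop 2.x and the default RPC port 8020; this is not from the original post), the property below in hdfs-site.xml makes the NameNode accept connections on all interfaces:

    <!-- hdfs-site.xml (illustrative sketch): dfs.namenode.rpc-bind-host
         overrides the bind address derived from fs.defaultFS, so the
         NameNode RPC server listens on every interface instead of
         loopback only. -->
    <configuration>
      <property>
        <name>dfs.namenode.rpc-bind-host</name>
        <value>0.0.0.0</value>
      </property>
    </configuration>

The NameNode must be restarted for the change to take effect, and clients should still address it by its advertised hostname or IP, not 0.0.0.0.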
Hello, these are the steps I learned from that site to tackle the serverfault connection-refused problem. The client tried to connect to 127.0.0.1 and 0::.
    telnet localhost 44444
    Connected to localhost.
    Escape character is '^]'.
    Connection closed by foreign host.

This works, so the daemon is running, and netstat says the service is listening at :::44444.
    telnet localhost 25

also works; there is an entry for port 25.
I don't understand the difference between :::44444 and ::1:25 under localhost in the output of netstat -tna.
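In netstat's compressed notation, `:::44444` is the IPv6 wildcard address `::` plus port 44444 (listening on all interfaces), while `::1:25` is the IPv6 loopback address `::1` plus port 25 (reachable only from the local machine). A minimal Python sketch of the two bind styles; the OS picks free ports here, so the 44444/25 values from the question are not reused:

```python
import socket

# Bind one socket to the IPv6 wildcard "::" (netstat shows :::PORT) and
# one to the loopback "::1" (netstat shows ::1:PORT). Port 0 asks the
# OS for a free port, so this sketch does not collide with real services.
wildcard = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
wildcard.bind(("::", 0))
loopback = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
loopback.bind(("::1", 0))

wild_addr = wildcard.getsockname()[0]  # "::"  -> all interfaces
loop_addr = loopback.getsockname()[0]  # "::1" -> local machine only
print(wild_addr, loop_addr)

wildcard.close()
loopback.close()
```

So a service shown at `:::44444` is reachable from other hosts (firewall permitting), while one at `::1:25` is deliberately local-only.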
My suggestion is to read the information provided.
    java.net.ConnectException: Connection refused;

This generally means nothing is listening on the specified IP:Port.
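This behaviour can be demonstrated outside Hadoop entirely; the `try_connect` helper below is a hypothetical name of mine, not from the post. Connecting to a port with no listener is actively refused, and the very same port succeeds once something listens on it:

```python
import socket

def try_connect(host, port):
    """Return 'connected' or 'connection refused' for host:port."""
    try:
        with socket.create_connection((host, port), timeout=2):
            return "connected"
    except ConnectionRefusedError:
        return "connection refused"

# Reserve a free port number, then release it so nothing is listening on it.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

before = try_connect("127.0.0.1", port)   # no listener -> refused

server = socket.socket()
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", port))
server.listen(1)
after = try_connect("127.0.0.1", port)    # listener present -> connected
server.close()

print(before, after)
```

Python's `ConnectionRefusedError` is the analogue of Java's `java.net.ConnectException: Connection refused`: the TCP SYN reached a machine, and the machine answered with a reset because no socket was bound to that port.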
Note also the information provided at that link, whose first line is:
You get a ConnectionRefused Exception when there is a machine at the address specified, but there is no program listening on the specific TCP port the client is using -and there is no firewall in the way silently dropping TCP connection requests.
There is more helpful information at that link too; you should read it.