ssh: Could not resolve hostname DataNode1: Name or service not known
While building a Hadoop cluster, connecting to another node over SSH fails:
ssh hadoop@DataNode1
ssh: Could not resolve hostname DataNode1: Name or service not known
Solution:
On Linux, host name resolution is configured in /etc/hosts.
In this file you must map not only the local host name to its IP address, but also the host name and IP of every node you need to connect to (here, DataNode1). The same mappings should be present on every node in the cluster.
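For reference, a minimal /etc/hosts sketch (the IP addresses and the NameNode/DataNode2 names below are placeholders for this example; use your cluster's actual addresses and host names):

127.0.0.1       localhost
# Hadoop cluster nodes (example addresses)
192.168.1.100   NameNode
192.168.1.101   DataNode1
192.168.1.102   DataNode2

After saving the file on each node, the name resolves immediately and ssh hadoop@DataNode1 should connect without the error.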