- Apache Hadoop 3 Quick Start Guide
- Hrishikesh Vijay Karambelkar
Working across nodes without passwords (keyless SSH)
When Apache Hadoop is set up across multiple nodes, administrators and developers frequently need to connect to different nodes to diagnose problems, run scripts, install software, and so on. These scripts are usually automated and run in bulk. Similarly, master nodes need to connect to slave nodes over SSH to start and stop the Hadoop processes. To let the system connect to a Hadoop node without a password prompt, SSH access must be keyless. This works in one direction: if system A sets up keyless SSH to system B, then A can log in to B without a password, but not the other way around. Note that master nodes often run data nodes or MapReduce processes themselves, so the start and stop scripts may connect to the same machine over SSH. To achieve this, first generate a key pair for the SSH client on system A, as follows:
hadoop@base0:/$ ssh-keygen -t rsa
Press Enter when prompted for the passphrase (you do not want one) and accept the default file location. This creates two keys in the .ssh directory inside your home directory (such as /home/hadoop/.ssh): a private key (id_rsa) and a public key (id_rsa.pub). You may choose a different key type, such as ed25519. The next step is only necessary if you are working across two machines (for example, a master and a slave).
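For scripted setups, the prompts can be avoided entirely. A minimal sketch, assuming the default RSA key path under the hadoop user's home directory:

```shell
# Generate an RSA key pair non-interactively.
# -N ""  : empty passphrase, so no password is required later
# -f ... : output path for the private key (public key gets .pub appended)
# -q     : quiet mode, suppresses the key fingerprint banner
ssh-keygen -t rsa -b 4096 -N "" -f "$HOME/.ssh/id_rsa" -q
```

This produces the same id_rsa and id_rsa.pub files as the interactive run above.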
Now, copy the id_rsa.pub file of system A to system B. You can use the scp command to copy that, as follows:
hadoop@base0:/$ scp ~/.ssh/id_rsa.pub hadoop@base1:
The preceding command copies the public key to the target system (for example, base1) under the hadoop user's home directory. You should now be able to log in to the target system and check whether the file has been copied.
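As an alternative to copying and appending the key by hand, OpenSSH ships an ssh-copy-id helper that performs both steps in one command (the host name base1 here follows the example above):

```shell
# Copy the local public key to base1 and append it to
# ~/.ssh/authorized_keys there; prompts for the password one last time.
ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@base1
```

If you use ssh-copy-id, the cat commands below are not needed for the cross-machine case.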
SSH allows keyless entry only if the public key appears in the authorized_keys file in the .ssh folder of the target system. To ensure this on the same machine, append the key with the following command:
hadoop@base0:/$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
If the key was copied from another machine (as with the scp command above), append the copied file on the target machine instead:
hadoop@base1:/$ cat ~/id_rsa.pub >> ~/.ssh/authorized_keys
That's it! Now test your keyless entry by logging in to the target machine with SSH. If you face any issues, run the SSH client in verbose mode (ssh -vvv) or the SSH daemon in debug mode to see the error messages. Failures are usually caused by a permissions problem: make sure the .ssh directory is accessible only by its owner (mode 700), and that authorized_keys and the private key are set to permission 600 (owner read/write only).
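The permission fixes described above can be applied as follows; a sketch assuming the default key file names on the target system:

```shell
# sshd silently ignores keys whose files are too permissive,
# so tighten the whole ~/.ssh tree.
chmod 700 ~/.ssh                  # directory: owner only
chmod 600 ~/.ssh/authorized_keys  # owner read/write only
chmod 600 ~/.ssh/id_rsa           # private key: owner read/write only
chmod 644 ~/.ssh/id_rsa.pub       # public key may be world-readable
```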