[2014.02.10 Update] This article installs the KDC, Ambari, and all services on the same machine. If you apply these steps to a real fully distributed cluster, the installation described here will run into problems; please refer instead to Fully Distributed Hadoop Cluster by Ambari on Google Cloud Platform.
Following up on the Kerberos topic raised in Security for Hadoop - Data Encryption: if your Hadoop Cluster does not have Kerberos authentication enabled, it effectively has no security protection at all. This article shares how to enable Kerberos authentication using Apache Ambari.
Test environment:
A. HDP 2.x with Ambari 1.4.1 (in theory, Ambari 1.2.5 or later should work).
B. The KDC and the Hadoop Cluster are installed on separate machines (installing everything on a single Sandbox also works).
1. Install packages
1-1. First, install a Kerberos KDC (Key Distribution Center) server:
> yum install -y krb5-server krb5-libs krb5-auth-dialog krb5-workstation
1-2. On each KDC client:
> yum install -y krb5-libs krb5-workstation
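On the KDC host, a quick way to confirm the packages landed (a sanity check, not part of the original steps):
> rpm -q krb5-server krb5-libs krb5-workstation krb5-auth-dialog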
2. Configure the KDC server
2-1. Configure /etc/krb5.conf (this file must be the same on every Kerberos workstation):
[logging]
default = FILE:/var/log/krb5libs.log
kdc = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log
[libdefaults]
default_realm = LOCALDOMAIN (note: the realm name must be uppercase)
dns_lookup_realm = false
dns_lookup_kdc = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true
[realms]
LOCALDOMAIN = {
default_domain = localdomain
kdc = bdp-node1.localdomain
admin_server = bdp-node1.localdomain
}
[domain_realm]
.localdomain = LOCALDOMAIN
localdomain = LOCALDOMAIN
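Since kdc and admin_server above point at bdp-node1.localdomain, it is worth confirming on every node that this hostname actually resolves; a minimal check:
> getent hosts bdp-node1.localdomain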
2-2. Edit /var/kerberos/krb5kdc/kdc.conf:
[kdcdefaults]
kdc_ports = 88
kdc_tcp_ports = 88
[realms]
LOCALDOMAIN = {
#master_key_type = aes256-cts
acl_file = /var/kerberos/krb5kdc/kadm5.acl
dict_file = /usr/share/dict/words
admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab
supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
}
2-3. Edit /var/kerberos/krb5kdc/kadm5.acl:
*/admin@LOCALDOMAIN *
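For reference, each line in kadm5.acl has the form "principal permissions"; the wildcard entry above grants full rights (*) to any principal whose instance is /admin in LOCALDOMAIN. A narrower, purely illustrative alternative would be to grant full rights to one admin principal only:
root/admin@LOCALDOMAIN *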
3. Start the KDC/kadmin on bdp-node1 and add an admin principal
The first time you start kadmin, you may run into the following error:
> service kadmin start
kadmind: No such file or directory while initializing, aborting
3-1. To fix this, first create the KDC master key (the -s flag stores it in a stash file so the KDC can start without prompting for it):
> kdb5_util create -s
Enter KDC database master key:
[your password]
3-2. Then create an account with administrative privileges:
> kadmin.local -q "addprinc root/admin"
Authenticating as principal root/admin@LOCALDOMAIN with password.
WARNING: no policy specified for root/admin@LOCALDOMAIN; defaulting to no policy
Enter password for principal "root/admin@LOCALDOMAIN":
Re-enter password for principal "root/admin@LOCALDOMAIN":
Principal "root/admin@LOCALDOMAIN" created.
3-3. Start the KDC service, and start kadmin again now that the database exists:
> service krb5kdc start
> service kadmin start
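Optionally (not part of the original steps, and assuming the RHEL/CentOS-style init tools implied by the yum/service commands above), enable both daemons at boot and verify that the admin principal can obtain a ticket:
> chkconfig krb5kdc on
> chkconfig kadmin on
> kinit root/admin
Password for root/admin@LOCALDOMAIN:
> klist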
4. Copy /etc/krb5.conf to every client node
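A minimal way to push the file out, assuming the other cluster nodes are named bdp-node2 and bdp-node3 (hypothetical names; adjust to your own hosts):
> for h in bdp-node2 bdp-node3; do scp /etc/krb5.conf ${h}.localdomain:/etc/krb5.conf; done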
5. Use the Ambari web UI to generate the keytab CSV file and save it as keytabs.csv
5-1. From the Admin menu, choose Security; you will see the Enable Security option.
5-2. After clicking Enable Security, a step-by-step wizard appears.
5-3. Enter your realm name (remember, it must be uppercase); Ambari fills in the other settings for you (unless there is something you specifically want to change). Scroll to the bottom of the page and click Next.
5-4. The next screen lists the keytab settings required for every service role in the Hadoop Cluster. After confirming they are correct, download the CSV file first (it is recommended to finish steps 6-10 below before clicking Apply).
6. cp /var/lib/ambari-server/resources/scripts/keytabs.sh /root/
keytabs.sh is a script shipped with Ambari. Feeding it the keytabs.csv you just downloaded produces a shell script (keytabs_create.sh) that sets up the keytabs for you.
7. Generate the actual keytab-creation script
>chmod 755 keytabs.sh
>./keytabs.sh keytabs.csv > keytabs_create.sh
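Roughly speaking, the generated keytabs_create.sh is a series of kadmin.local commands of the following shape (an illustrative sketch only; the real principals, keytab file names, and paths come from your keytabs.csv):
> kadmin.local -q "addprinc -randkey nn/bdp-node1.localdomain@LOCALDOMAIN"
> kadmin.local -q "xst -k nn.service.keytab nn/bdp-node1.localdomain@LOCALDOMAIN"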
8. Execute keytabs_create.sh to create the keytabs, then copy them into place
>chmod 755 keytabs_create.sh
>./keytabs_create.sh
>cp -raf xxxx/etc/security/keytabs /etc/security/
9. Check: list all the created principals
>kadmin.local -q list_principals
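It is also worth confirming that at least one generated keytab actually works; for example (assuming the typical HDP names produced by the script, such as nn.service.keytab for the NameNode; substitute whatever appears in your keytabs.csv):
> klist -kt /etc/security/keytabs/nn.service.keytab
> kinit -kt /etc/security/keytabs/nn.service.keytab nn/bdp-node1.localdomain@LOCALDOMAIN
> klist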
10. Create a new user (howie) to access HDFS
>kadmin.local -q "addprinc howie"
Only once all of this is in place (and the security wizard has been applied) does your Hadoop Cluster have basic security protection.
Special thanks to Herb for contributing his notes.