# Kafka with SASL/PLAINTEXT

Install the standard Apache Kafka distribution (this experiment uses kafka_2.12-2.6.0). The environment requires SASL/PLAINTEXT authentication, so the setup mainly follows these two articles:

1. [https://codeforgeek.com/how-to-set-up-authentication-in-kafka-cluster/](https://codeforgeek.com/how-to-set-up-authentication-in-kafka-cluster/)
2. [https://www.itread01.com/content/1548468202.html](https://www.itread01.com/content/1548468202.html)

## Implementation

### Installing Kafka

Get Kafka from the [official site](https://kafka.apache.org/) and download it from the [downloads page](https://kafka.apache.org/downloads). This walkthrough uses the pre-built [2.6.0](https://downloads.apache.org/kafka/2.6.0/kafka_2.12-2.6.0.tgz) binary, which is the most convenient option. If you do not need authentication, the [Docker version](https://github.com/wurstmeister/kafka-docker) also works well.

After downloading, extract the archive and it is ready to use; following the Kafka [quick start](https://kafka.apache.org/documentation/#quickstart) is enough to start sending and receiving data.

Once plain produce/consume works, stop everything and start configuring authentication. There are two main modes, SSL and SASL; this guide uses SASL/PLAINTEXT. SSL encrypts the data transferred between brokers and clients, between brokers, or between brokers and tools, while SASL comes in several flavours:

- SASL/PLAIN: users cannot be added dynamically
- SASL/SCRAM (SHA-256/SHA-512): users can be added dynamically
- SASL/Kerberos: requires a separately deployed authentication service
- SASL/OAUTHBEARER: requires implementing the token creation and validation interface yourself, plus an additional OAuth service

[Source](https://kknews.cc/zh-tw/news/qlb5e6b.html)

The goal this time is SASL/PLAINTEXT.

### Configuring SASL/PLAINTEXT

In practice, two services have to be handled: ZooKeeper and the broker. Both must be running, and the broker has to authenticate against ZooKeeper before the setup is complete.

Start with the ZooKeeper credentials:

```
$ vi config/zookeeper_jaas.conf
```

Content:

```
Server {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="password"
    user_admin="password";
};
```

Note: do not forget the trailing `;`. Treat the whole entry as a single statement, so it must end with `;`.

Next, add the following to the server config `config/zookeeper.properties` to enable authentication:

```
#auth
authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
requireClientAuthScheme=sasl
jaasLoginRenew=3600000
```

Run ZooKeeper:

```
$ export KAFKA_OPTS="-Djava.security.auth.login.config=/home/ubuntu/kafka_2.12-2.6.0/config/zookeeper_jaas.conf"
$ bin/zookeeper-server-start.sh config/zookeeper.properties
```

With ZooKeeper done, the broker is next. Because of the environment variable, it is best to switch to another console; tmux is quite handy for managing multiple consoles.
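Since `KAFKA_OPTS` only lives in the current shell, a small wrapper script is another way to keep ZooKeeper running while you continue working in the same terminal. This is only a minimal sketch, assuming the install path `/home/ubuntu/kafka_2.12-2.6.0` used above; the script name and log file are made up:

```
#!/usr/bin/env bash
# start-zookeeper-sasl.sh -- illustrative helper; adjust KAFKA_HOME to your install path
KAFKA_HOME=/home/ubuntu/kafka_2.12-2.6.0

# Point the JVM at the ZooKeeper JAAS file, then start ZooKeeper in the background
export KAFKA_OPTS="-Djava.security.auth.login.config=${KAFKA_HOME}/config/zookeeper_jaas.conf"
nohup "${KAFKA_HOME}/bin/zookeeper-server-start.sh" \
      "${KAFKA_HOME}/config/zookeeper.properties" \
      > "${KAFKA_HOME}/zookeeper-sasl.log" 2>&1 &

echo "ZooKeeper started with PID $!, log: ${KAFKA_HOME}/zookeeper-sasl.log"
```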
For Kafka itself, similarly create a new file `config/kafka_server_jaas.conf`:

```
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="12345"
    user_admin="12345";
};

Client {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="password";
};
```

The `Client` section is the broker's connection to ZooKeeper, which is why its password is `password`. The `KafkaServer` section provides the credentials that clients use to connect to Kafka.

With the credentials in place, enable broker authentication in `config/server.properties`:

```
# AUTH
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
allow.everyone.if.no.acl.found=true

listeners=SASL_PLAINTEXT://0.0.0.0:9092
advertised.listeners=SASL_PLAINTEXT://:9092
```

Run Kafka:

```
$ export KAFKA_OPTS="-Djava.security.auth.login.config=/home/ubuntu/kafka_2.12-2.6.0/config/kafka_server_jaas.conf"
$ bin/kafka-server-start.sh config/server.properties
```

### Testing

There are two things to test: first a producer/consumer test inside the server itself, which requires setting up client authentication; after that, an external test with a Python consumer.

For the on-server producer/consumer test, configure the following.

Add to `config/consumer.properties`:

```
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
```

Add to `config/producer.properties`:

```
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
bootstrap.servers=localhost:9092
compression.type=none
```

Create `config/kafka_client_jaas.conf`:

```
KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="12345";
};
```

The console clients pick this file up through the same `KAFKA_OPTS` mechanism used for ZooKeeper and the broker (see the sketch after the Python example below).

Once that is done, start testing.

List the current ACLs:

```
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=localhost:2181 --list
```

Add an ACL for the `test` topic:

```
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:staff --operation Write --operation Read --topic test
```

Send messages:

```
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test --producer.config config/producer.properties
```

Receive messages:

```
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning --consumer.config config/consumer.properties
```

The Python part:

```
from kafka import KafkaConsumer

consumer = KafkaConsumer('sensorData',
                         bootstrap_servers=['103.124.73.XX:9092'],
                         security_protocol="SASL_PLAINTEXT",
                         sasl_mechanism="PLAIN",
                         sasl_plain_username="admin",
                         sasl_plain_password="12345")
for msg in consumer:
    print(msg)
```
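For completeness, here is one way the console tools above can be wired to `kafka_client_jaas.conf`, using the same `KAFKA_OPTS` mechanism as for ZooKeeper and the broker. This is a minimal sketch under the paths, topic, and credentials used above, not the only option (the JAAS entry can also be supplied per client via the `sasl.jaas.config` property in `producer.properties`/`consumer.properties`):

```
# Point the console clients at the KafkaClient JAAS entry
export KAFKA_OPTS="-Djava.security.auth.login.config=/home/ubuntu/kafka_2.12-2.6.0/config/kafka_client_jaas.conf"

# Send one test message over SASL/PLAIN ...
echo "hello-sasl" | bin/kafka-console-producer.sh \
    --broker-list localhost:9092 \
    --topic test \
    --producer.config config/producer.properties

# ... and read it back
bin/kafka-console-consumer.sh \
    --bootstrap-server localhost:9092 \
    --topic test \
    --from-beginning \
    --consumer.config config/consumer.properties
```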
Reference:

- [https://kafka-python.readthedocs.io/en/master/apidoc/BrokerConnection.html](https://kafka-python.readthedocs.io/en/master/apidoc/BrokerConnection.html)
- [https://github.com/wurstmeister/kafka-docker/issues/528](https://github.com/wurstmeister/kafka-docker/issues/528)
- [https://github.com/wurstmeister/kafka-docker](https://github.com/wurstmeister/kafka-docker)
- [http://wurstmeister.github.io/kafka-docker/](http://wurstmeister.github.io/kafka-docker/)
- [https://kafka.apache.org/documentation/#quickstart](https://kafka.apache.org/documentation/#quickstart)
- [https://codeforgeek.com/how-to-set-up-authentication-in-kafka-cluster/](https://codeforgeek.com/how-to-set-up-authentication-in-kafka-cluster/)
- [https://www.itread01.com/content/1548468202.html](https://www.itread01.com/content/1548468202.html)
- [https://medium.com/analytics-vidhya/kafka-ssl-encryption-authentication-part-two-practical-example-for-implementing-ssl-in-kafka-d514f30fe782](https://medium.com/analytics-vidhya/kafka-ssl-encryption-authentication-part-two-practical-example-for-implementing-ssl-in-kafka-d514f30fe782)
- [https://kknews.cc/zh-tw/news/qlb5e6b.html](https://kknews.cc/zh-tw/news/qlb5e6b.html)
- [https://www.mdeditor.tw/pl/2CDY/zh-tw](https://www.mdeditor.tw/pl/2CDY/zh-tw)