Operating Kafka from Python with Kerberos security authentication


This article shows how to authenticate with Kerberos and then produce messages to Kafka from Python.

1. Install the Kerberos client

CentOS:

yum install krb5-workstation

Run which kinit to check that the installation succeeded.

2. Copy the Kerberos configuration files

Copy krb5.conf, kafka.keytab and jaas.conf from the conf directory to the etc directory of the client machine. At the same time, add the hostnames and IPs of the KDC cluster referenced in krb5.conf to the client machine's hosts file, for example as shown below.
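A minimal sketch of this step, assuming the files sit in a local conf directory and the target paths used later in this article; the hostname and IP are placeholders, so take the real KDC entries from your krb5.conf:

cp conf/krb5.conf /etc/krb5.conf
mkdir -p /etc/conf && cp conf/kafka.keytab conf/jaas.conf /etc/conf/
echo "192.168.1.10  kdc01.example.com" >> /etc/hosts   # KDC host entry (placeholder values)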

3. Authenticate the client against Kerberos with kinit

Get the principal name from the keytab:

klist -kt kafka.keytab
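With the principal name in hand, obtain a ticket from the keytab and verify it; the principal below is only a placeholder, use the one printed by klist:

kinit -kt kafka.keytab kafka/your-host@EXAMPLE.COM
klist   # a valid ticket for the principal should now be listed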

4. Install python-gssapi

pip install gssapi
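If pip finishes without errors, a quick import check (it should print nothing) confirms that the bindings load:

python -c "import gssapi"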

Problems encountered along the way, with their fixes:

a. Running wget on Linux reports -bash: wget: command not found. Fix:

yum -y install wget

b. Error: bash: pip: command not found. Fix:

wget https://bootstrap.pypa.io/get-pip.py
python get-pip.py
pip -V  # check the pip version

python -m pip install --upgrade --force pip
easy_install -U setuptools
pip install --upgrade setuptools

c. pip install fails with Command "python setup.py egg_info" failed with error code 1:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-install-6HfDE3/gssapi/setup.py", line 109, in <module>
    raise Exception("Could not find main GSSAPI shared library. Please "
Exception: Could not find main GSSAPI shared library. Please try setting GSSAPI_MAIN_LIB yourself or setting ENABLE_SUPPORT_DETECTION to 'false'

Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-install-6HfDE3/gssapi/

The fix is to install the Kerberos development package, which provides the missing GSSAPI library:

yum install -y krb5-devel.x86_64

d. Fix for error: command 'gcc' failed with exit status 1:

yum install gcc python-devel

5. Install kafka-python

pip install kafka-python

6. Initialize the environment variables

export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/conf/jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf"

7. Python Kafka producer example

from kafka import KafkaProducer
from kafka.errors import KafkaError
import os


class Kafka_Producer():
    def __init__(self, kafkahost, kafkaport, kafkatopic):
        self.kafkaHost = kafkahost
        self.kafkaPort = kafkaport
        self.kafkatopic = kafkatopic
        self.producer = KafkaProducer(
            bootstrap_servers='{kafka_host}:{kafka_port}'.format(
                kafka_host=self.kafkaHost, kafka_port=self.kafkaPort),
            security_protocol="SASL_PLAINTEXT",   # SASL over a plaintext (non-TLS) connection
            sasl_mechanism="GSSAPI",              # Kerberos authentication
            sasl_kerberos_service_name="kafka",
            compression_type='gzip'               # compress messages with gzip
        )

    def sendFileData(self, params):
        """Read a file and send its contents as a single message."""
        try:
            with open(params, 'rb') as f:
                message = f.read().strip()
            self.producer.send(self.kafkatopic, message)
            self.producer.flush()
        except KafkaError as e:
            print(e)


def main():
    filePath = "/home/public/data/"
    topic = "demo"
    producer = Kafka_Producer("xxx.xx.xx.xx", "9092", topic)
    for fileName in os.listdir(filePath):
        producer.sendFileData(os.path.join(filePath, fileName))
    print('send success!!!')


if __name__ == '__main__':
    main()
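For completeness, here is a minimal consumer sketch using the same SASL/GSSAPI settings as the producer above; the topic, group id and broker address are placeholders to adapt to your environment:

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "demo",                                  # topic, same as the producer example
    bootstrap_servers="xxx.xx.xx.xx:9092",   # broker address placeholder
    group_id="demo-group",                   # hypothetical consumer group
    security_protocol="SASL_PLAINTEXT",
    sasl_mechanism="GSSAPI",
    sasl_kerberos_service_name="kafka",
    auto_offset_reset="earliest",
)

for message in consumer:
    # message.value is raw bytes; decode it if the payload is text
    print(message.topic, message.partition, message.offset, message.value)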

That is all for this walkthrough of operating Kafka from Python with Kerberos authentication. I hope it serves as a useful reference, and thank you for your support.
