
Kafka 0.9+ consumer support #1312

Closed

nomeez opened this issue Jun 2, 2016 · 7 comments


nomeez commented Jun 2, 2016

Proposal:

Kafka supports authentication (GSSAPI since v0.9 and PLAIN (username/password) since v0.10) and encryption (SSL) between the Kafka cluster and its clients.

Current behavior:

The Telegraf Kafka consumer can't access secured clusters.

Desired behavior:

Telegraf's Kafka consumer should be able to read data from secured clusters too.
It should therefore support the SASL_SSL and SASL_PLAINTEXT security protocols (with both the GSSAPI and PLAIN SASL mechanisms).

Other applications dealing with the same issue simply allow passing a set of attributes through to the underlying Kafka consumer library.
See the documentation here for which attributes are required: http://docs.confluent.io/3.0.0/kafka/security.html
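
For illustration, those attributes map roughly onto the following client-side settings when the consumer is built on the Go sarama library (a minimal sketch with hypothetical credential values; only the SASL/PLAIN mechanism is shown, GSSAPI is out of scope here):

```go
package kafkaexample

import (
	"crypto/tls"

	"github.com/Shopify/sarama"
)

// newSecureConfig sketches the settings a secured cluster needs on the client:
// TLS for encryption plus SASL/PLAIN credentials for authentication.
func newSecureConfig() *sarama.Config {
	conf := sarama.NewConfig()
	conf.Version = sarama.V0_10_0_0 // SASL/PLAIN requires 0.10+ brokers

	// Encryption (SSL / SASL_SSL listeners).
	conf.Net.TLS.Enable = true
	conf.Net.TLS.Config = &tls.Config{} // load CA and client certificates here as required

	// Authentication (SASL/PLAIN); user and password are placeholders.
	conf.Net.SASL.Enable = true
	conf.Net.SASL.User = "telegraf"
	conf.Net.SASL.Password = "secret"

	return conf
}
```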

Use case:

Since Kafka introduced these security features, there has been a move toward secured Kafka clusters. Influx products should keep up with this progress.

seuf (Contributor) commented Mar 3, 2017

The Kafka consumer input should update the sarama Kafka client library and use the new consumer config with SASL and SSL support.

sparrc changed the title from "Feature Request Telegraf Kafka-Consumer should support secure Kafka Clusters" to "Kafka 0.9+ consumer support" on Mar 6, 2017
sparrc (Contributor) commented Mar 6, 2017

If I understand the issue correctly, this is basically just a request to support consumption from Kafka 0.9+ clusters, which come with the security features mentioned.

sparrc (Contributor) commented Mar 6, 2017

For reference:

The Shopify sarama library briefly mentions the difference: https://godoc.org/github.com/Shopify/sarama

And Confluent wrote a blog post: https://www.confluent.io/blog/tutorial-getting-started-with-the-new-apache-kafka-0-9-consumer-client/

thomas-tomlinson commented

An additional benefit of the new consumer (0.9+) is that consumer offsets are stored within the broker cluster rather than in the ZooKeeper cluster. This also allows the configuration to reference a bootstrap host directly (any broker in the cluster) instead of the ZooKeeper nodes for the initial connection.
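
A minimal sketch of that point with the Go sarama client (broker address is hypothetical): the configuration only lists broker bootstrap addresses, and committed consumer offsets live in the brokers' internal __consumer_offsets topic rather than in ZooKeeper.

```go
package kafkaexample

import (
	"log"

	"github.com/Shopify/sarama"
)

// connectViaBootstrap connects using a broker bootstrap address only;
// no ZooKeeper address appears anywhere in the configuration.
func connectViaBootstrap() sarama.Client {
	conf := sarama.NewConfig()
	conf.Version = sarama.V0_9_0_0
	conf.Consumer.Offsets.Initial = sarama.OffsetOldest // where to start when no committed offset exists

	// Any reachable broker will do; the client discovers the rest of the cluster from it.
	client, err := sarama.NewClient([]string{"kafka-1:9092"}, conf)
	if err != nil {
		log.Fatal(err)
	}
	return client
}
```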

jackzampolin (Contributor) commented May 25, 2017

So after investigating further, there are two client libraries for the 0.9+ version of Kafka:

Official

Community

The official version relies on cgo, which would add a degree of difficulty to the implementation. It also doesn't support SSL/SASL. The community client is pure Go and supports everything we need. #2487, a pull request that implements the new API, uses the sarama-cluster implementation.
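
For context, a consumer built on sarama-cluster looks roughly like the sketch below (broker addresses, consumer group, and topic names are hypothetical; #2487 is the authoritative implementation):

```go
package main

import (
	"fmt"
	"log"

	"github.com/Shopify/sarama"
	cluster "github.com/bsm/sarama-cluster"
)

func main() {
	conf := cluster.NewConfig()
	conf.Version = sarama.V0_10_0_0
	conf.Consumer.Offsets.Initial = sarama.OffsetOldest
	conf.Consumer.Return.Errors = true

	// Bootstrap brokers plus consumer-group membership; offsets are committed
	// back to Kafka itself, so no ZooKeeper connection is involved.
	consumer, err := cluster.NewConsumer(
		[]string{"kafka-1:9092", "kafka-2:9092"}, // hypothetical brokers
		"telegraf_metrics_consumers",             // hypothetical consumer group
		[]string{"telegraf"},                     // hypothetical topic
		conf,
	)
	if err != nil {
		log.Fatal(err)
	}
	defer consumer.Close()

	// Drain consumer errors so the errors channel never blocks.
	go func() {
		for err := range consumer.Errors() {
			log.Println("consume error:", err)
		}
	}()

	for msg := range consumer.Messages() {
		fmt.Printf("%s/%d/%d\t%s\n", msg.Topic, msg.Partition, msg.Offset, msg.Value)
		consumer.MarkOffset(msg, "") // mark the message as processed
	}
}
```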

burdandrei (Contributor) commented

Well, I tried the binary from @seuf, and it works like a charm:
Full partition coverage, and everything looks good!

danielnelson (Contributor) commented

Closing; support was added in 1.4.
