kafka-users mailing list archives

From westfox <west...@gmail.com>
Subject Re: Error when kafka 0.9 client tries to connect to server, with SSL enabled
Date Tue, 05 Apr 2016 22:45:10 GMT
Ismael,

It works after following your advice. Thanks a lot!
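
For reference, this is roughly what I changed in server.properties (just a sketch; the hostname "kafka" and the 9092/9093 ports are the ones from my original config quoted below):

listeners=PLAINTEXT://:9092,SSL://:9093
advertised.listeners=PLAINTEXT://kafka:9092,SSL://kafka:9093
# advertised.host.name=kafka   <- removed; it only advertises a PLAINTEXT endpoint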

Ping

On Tue, Apr 5, 2016 at 5:16 PM, Ismael Juma <ismael@juma.me.uk> wrote:

> Hi Ping,
>
> The problem is advertised.host.name, which only advertises a PLAINTEXT
> port. You should use advertised.listeners instead.
>
> Ismael
>
> On Tue, Apr 5, 2016 at 8:42 PM, westfox <westfox@gmail.com> wrote:
>
> > Hi,
> >
> > I get an error when the client tries to talk to the Kafka server
> > (kafka_2.11-0.9.0.1) with SSL enabled. The same error happens for both
> > the consumer and the producer. Can anyone help?
> >
> > Thanks
> > Ping
> >
> >
> > ====================================== Error msg at server side
> >
> > [2016-04-05 19:26:04,836] ERROR [KafkaApi-1] error when handling request
> > Name: TopicMetadataRequest; Version: 0; CorrelationId: 1; ClientId:
> > kafka-consumer-1; Topics: test2 (kafka.server.KafkaApis)
> > kafka.common.BrokerEndPointNotAvailableException: End point SSL not found
> > for broker 1
> >         at kafka.cluster.Broker.getBrokerEndPoint(Broker.scala:141)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1$$anonfun$apply$mcV$sp$1$$anonfun$1$$anonfun$4.apply(MetadataCache.scala:57)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1$$anonfun$apply$mcV$sp$1$$anonfun$1$$anonfun$4.apply(MetadataCache.scala:57)
> >         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> >         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> >         at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> >         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
> >         at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
> >         at scala.collection.AbstractTraversable.map(Traversable.scala:104)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1$$anonfun$apply$mcV$sp$1$$anonfun$1.apply(MetadataCache.scala:57)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1$$anonfun$apply$mcV$sp$1$$anonfun$1.apply(MetadataCache.scala:54)
> >         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> >         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> >         at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
> >         at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
> >         at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
> >         at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
> >         at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
> >         at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
> >         at scala.collection.AbstractTraversable.map(Traversable.scala:104)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1$$anonfun$apply$mcV$sp$1.apply(MetadataCache.scala:54)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1$$anonfun$apply$mcV$sp$1.apply(MetadataCache.scala:51)
> >         at scala.collection.immutable.Set$Set1.foreach(Set.scala:79)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1.apply$mcV$sp(MetadataCache.scala:51)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1.apply(MetadataCache.scala:51)
> >         at kafka.server.MetadataCache$$anonfun$getTopicMetadata$1.apply(MetadataCache.scala:51)
> >         at kafka.utils.CoreUtils$.inLock(CoreUtils.scala:262)
> >         at kafka.utils.CoreUtils$.inReadLock(CoreUtils.scala:268)
> >         at kafka.server.MetadataCache.getTopicMetadata(MetadataCache.scala:50)
> >         at kafka.server.KafkaApis.handleTopicMetadataRequest(KafkaApis.scala:610)
> >         at kafka.server.KafkaApis.handle(KafkaApis.scala:71)
> >         at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:60)
> >         at java.lang.Thread.run(Thread.java:745)
> >
> >
> > ============================== Error msg at client side:
> >
> > 2016-04-05 16:30:09 [main] DEBUG o.apache.kafka.clients.NetworkClient -
> > Sending metadata request ClientRequest(expectResponse=true, callback=null,
> > request=RequestSend(header={api_key=3,api_version=0,correlation_id=1,client_id=kafka-consumer-1},
> > body={topics=[test]}), isInitiatedByNetworkClient,
> > createdTimeMs=1459873809880, sendTimeMs=0) to node -1
> > 2016-04-05 16:30:09 [main] WARN  o.apache.kafka.clients.NetworkClient -
> > Error while fetching metadata with correlation id 1 : {test=UNKNOWN}
> > 2016-04-05 16:30:09 [main] DEBUG o.apache.kafka.clients.NetworkClient -
> > Sending metadata request ClientRequest(expectResponse=true, callback=null,
> > request=RequestSend(header={api_key=3,api_version=0,correlation_id=2,client_id=kafka-consumer-1},
> > body={topics=[test]}), isInitiatedByNetworkClient,
> > createdTimeMs=1459873809990, sendTimeMs=0) to node -1
> > 2016-04-05 16:30:09 [main] WARN  o.apache.kafka.clients.NetworkClient -
> > Error while fetching metadata with correlation id 2 : {test=UNKNOWN}
> > 2016-04-05 16:30:10 [main] DEBUG o.apache.kafka.clients.NetworkClient -
> > Sending metadata request ClientRequest(expectResponse=true, callback=null,
> > request=RequestSend(header={api_key=3,api_version=0,correlation_id=3,client_id=kafka-consumer-1},
> > body={topics=[test]}), isInitiatedByNetworkClient,
> > createdTimeMs=1459873810094, sendTimeMs=0) to node -1
> >
> >
> > ============== server.properties
> >
> > broker.id=1
> > auto.leader.rebalance.enable=true
> >
> > auto.create.topics.enable=true
> > default.replication.factor=1
> >
> > delete.topic.enable=false
> >
> > advertised.host.name=kafka
> >
> >
> > log.dir=/data
> > log.dirs=/data
> >
> > num.partitions=1
> >
> >
> >
> > log.retention.hours=168
> >
> >
> > zookeeper.connect=zookeeper:2181
> > zookeeper.connection.timeout.ms=10000
> > controlled.shutdown.enable=true
> > zookeeper.session.timeout.ms=10000
> >
> >
> > listeners=PLAINTEXT://:9092,SSL://:9093
> > ssl.keystore.location=/kafka/config/ssl/server.keystore.jks
> > ssl.keystore.password=changeit
> > ssl.key.password=changeit
> > ssl.truststore.location=/kafka/config/ssl/server.truststore.jks
> > ssl.truststore.password=changeit
> >
> > ================ client consumer.properties
> > bootstrap.servers=kafka:9093
> >
> > group.id=group-test
> > client.id=kafka-consumer-1
> >
> > security.protocol=SSL
> > ssl.truststore.location=/client.truststore.jks
> > ssl.truststore.password=changeit
> >
>
