flink-issues mailing list archives

From "Piotr Nowojski (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (FLINK-10704) Fix sql client end to end test failure
Date Wed, 31 Oct 2018 09:07:01 GMT

    [ https://issues.apache.org/jira/browse/FLINK-10704?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16669791#comment-16669791 ]

Piotr Nowojski commented on FLINK-10704:
----------------------------------------

[~yanghua] I don't understand this:

> I do not agree to filter out `WARN` lines because of warning logs cannot be used as a measure of failure

What do you mean? If, as you say, "warning logs cannot be used as a measure of failure", then
why not filter them out as I proposed?

Btw, this global `check_logs_for_errors` is a broken design. We might want to keep some limited
global list of exclusions, but adding more and more entries to an ever-growing list is unmaintainable.
I think we should filter out this one specific warning in the Kafka tests and not change the
global rules for all of the tests.
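
For example, such an exclusion could live in the Kafka end-to-end test script itself rather than in the shared function. A minimal sketch, assuming a hypothetical helper name and in-place sed filtering (neither exists in the current scripts):

{code:bash}
# Hypothetical helper for the Kafka e2e test only: drop the known-benign
# AppInfoParser warning from the log files before the shared
# check_logs_for_errors scan runs (assumes GNU sed for in-place editing).
function filter_kafka_version_warning {
  for log_file in "$FLINK_DIR"/log/*; do
    sed -i '/Error while loading kafka-version.properties/d' "$log_file"
  done
}
{code}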

> Fix sql client end to end test failure
> --------------------------------------
>
>                 Key: FLINK-10704
>                 URL: https://issues.apache.org/jira/browse/FLINK-10704
>             Project: Flink
>          Issue Type: Bug
>          Components: E2E Tests, Kafka Connector
>            Reporter: vinoyang
>            Assignee: vinoyang
>            Priority: Major
>              Labels: pull-request-available
>
> The log file contains the following line:
> {code:java}
> 2018-10-29 03:27:39,209 WARN org.apache.flink.kafka010.shaded.org.apache.kafka.common.utils.AppInfoParser - Error while loading kafka-version.properties :null
> {code}
> The reason for this log is that we explicitly exclude the version description file of the kafka client when packaging the connector:
> {code:xml}
> <filters>
>    <filter>
>       <artifact>*:*</artifact>
>       <excludes>
>          <exclude>kafka/kafka-version.properties</exclude>
>       </excludes>
>    </filter>
> </filters>{code}
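> (As a quick check, listing the contents of the shaded connector jar should confirm that the properties file is really missing; the jar path below is only a placeholder, not the actual artifact name:)
> {code:bash}
> # The exclusion above removes kafka/kafka-version.properties from the shaded jar,
> # so this lookup should print nothing (jar path is a placeholder).
> unzip -l path/to/shaded-kafka-connector.jar | grep kafka-version.properties
> {code}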
> When the shell script greps the logs for the "error" keyword, this line matches, so the test fails.
> {code:bash}
> function check_logs_for_errors {
>   error_count=$(grep -rv "GroupCoordinatorNotAvailableException" $FLINK_DIR/log \
>       | grep -v "RetriableCommitFailedException" \
>       | grep -v "NoAvailableBrokersException" \
>       | grep -v "Async Kafka commit failed" \
>       | grep -v "DisconnectException" \
>       | grep -v "AskTimeoutException" \
>       | grep -v "WARN  akka.remote.transport.netty.NettyTransport" \
>       | grep -v "WARN  org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline" \
>       | grep -v "jvm-exit-on-fatal-error" \
>       | grep -v '^INFO:.*AWSErrorCode=\[400 Bad Request\].*ServiceEndpoint=\[https://.*\.s3\.amazonaws\.com\].*RequestType=\[HeadBucketRequest\]' \
>       | grep -v "RejectedExecutionException" \
>       | grep -v "An exception was thrown by an exception handler" \
>       | grep -v "java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnException" \
>       | grep -v "java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration" \
>       | grep -v "org.apache.flink.fs.shaded.hadoop3.org.apache.commons.beanutils.FluentPropertyBeanIntrospector - Error when creating PropertyDescriptor for public final void org.apache.flink.fs.shaded.hadoop3.org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)! Ignoring this property." \
>       | grep -ic "error")    # here: the case-insensitive match still counts the AppInfoParser WARN line
>   if [[ ${error_count} -gt 0 ]]; then
>     echo "Found error in log files:"
>     cat $FLINK_DIR/log/*
>     EXIT_CODE=1
>   fi
> }
> {code}
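> (To illustrate the failure: none of the exclusions above match the AppInfoParser line, and the final case-insensitive count picks it up because of the word "Error". The abbreviated log line below is only for demonstration:)
> {code:bash}
> # The WARN line survives every `grep -v` filter and is then counted by
> # the case-insensitive `grep -ic "error"`, so error_count becomes non-zero.
> echo "WARN ...AppInfoParser - Error while loading kafka-version.properties :null" | grep -ic "error"
> # prints: 1
> {code}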



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
