beam-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_Python_Verify #6975
Date Thu, 03 Jan 2019 06:57:57 GMT
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6975/display/redirect>

------------------------------------------
[...truncated 402.43 KB...]
          }
        }, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",

                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",

                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",

                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 2364 bytes>", 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-01-03T06:12:30.614740Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-01-02_22_12_29-3117641307869783981'
 location: u'us-central1'
 name: u'beamapp-jenkins-0103061224-011690'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-01-03T06:12:30.614740Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-01-02_22_12_29-3117641307869783981]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_12_29-3117641307869783981?project=apache-beam-testing
root: INFO: Job 2019-01-02_22_12_29-3117641307869783981 is in state JOB_STATE_RUNNING
root: INFO: 2019-01-03T06:12:29.931Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2019-01-02_22_12_29-3117641307869783981. The number of workers will be between 1 and 1000.
root: INFO: 2019-01-03T06:12:29.970Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically
enabled for job 2019-01-02_22_12_29-3117641307869783981.
root: INFO: 2019-01-03T06:12:33.537Z: JOB_MESSAGE_DETAILED: Checking permissions granted to
controller Service Account.
root: INFO: 2019-01-03T06:12:34.245Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1
in us-central1-b.
root: INFO: 2019-01-03T06:12:34.756Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations
into optimizable parts.
root: INFO: 2019-01-03T06:12:34.827Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-01-03T06:12:34.863Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
group: GroupByKey not followed by a combiner.
root: INFO: 2019-01-03T06:12:34.919Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations
into optimizable parts.
root: INFO: 2019-01-03T06:12:34.967Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns
into MergeBucketsMappingFns
root: INFO: 2019-01-03T06:12:35.141Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-01-03T06:12:35.244Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write,
and Flatten operations
root: INFO: 2019-01-03T06:12:35.313Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into
group/Reify
root: INFO: 2019-01-03T06:12:35.381Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow
into group/Read
root: INFO: 2019-01-03T06:12:35.437Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow
into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-01-03T06:12:35.499Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write
into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-01-03T06:12:35.542Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract
into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-01-03T06:12:35.609Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn)
into write/Write/WriteImpl/Pair
root: INFO: 2019-01-03T06:12:35.665Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify
into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-01-03T06:12:35.715Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
root: INFO: 2019-01-03T06:12:35.773Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into
pair_with_one
root: INFO: 2019-01-03T06:12:35.833Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair
into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2019-01-03T06:12:35.880Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles
into format
root: INFO: 2019-01-03T06:12:35.937Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one
into split
root: INFO: 2019-01-03T06:12:35.989Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
root: INFO: 2019-01-03T06:12:36.036Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-01-03T06:12:36.090Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite
into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2019-01-03T06:12:36.148Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default
resource spec.
root: INFO: 2019-01-03T06:12:36.218Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown
to workflow graph.
root: INFO: 2019-01-03T06:12:36.283Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-01-03T06:12:36.342Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-01-03T06:12:36.675Z: JOB_MESSAGE_DEBUG: Executing wait step start26
root: INFO: 2019-01-03T06:12:36.807Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2019-01-03T06:12:36.853Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-01-03T06:12:36.867Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-01-03T06:12:36.895Z: JOB_MESSAGE_BASIC: Executing operation group/Create
root: INFO: 2019-01-03T06:12:36.959Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-01-03T06:12:37.185Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session"
materialized.
root: INFO: 2019-01-03T06:12:37.230Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-01-03T06:12:37.348Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-01-03T06:12:47.117Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-01-03T06:14:39.154Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-01-03T06:15:25.968Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-01-03T06:15:26.027Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-01-03T06:17:34.916Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-01-03T06:17:39.321Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Read.out"
materialized.
root: INFO: 2019-01-03T06:17:39.369Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/InitializeWrite.out"
materialized.
root: INFO: 2019-01-03T06:17:39.473Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
root: INFO: 2019-01-03T06:17:39.521Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
root: INFO: 2019-01-03T06:17:39.573Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
root: INFO: 2019-01-03T06:17:39.623Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
root: INFO: 2019-01-03T06:17:39.657Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
root: INFO: 2019-01-03T06:17:39.695Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-01-03T06:17:39.745Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
root: INFO: 2019-01-03T06:17:49.391Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Close
root: INFO: 2019-01-03T06:17:49.514Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/GroupByKey/GroupByWindow+write/Write/WriteImpl/Extract
root: INFO: 2019-01-03T06:17:58.830Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/Extract.out"
materialized.
root: INFO: 2019-01-03T06:17:58.919Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
root: INFO: 2019-01-03T06:17:58.947Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
root: INFO: 2019-01-03T06:17:59.044Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output"
materialized.
root: INFO: 2019-01-03T06:17:59.068Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0).output"
materialized.
root: INFO: 2019-01-03T06:17:59.308Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/PreFinalize
root: INFO: 2019-01-03T06:18:03.399Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize.out"
materialized.
root: INFO: 2019-01-03T06:18:03.504Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
root: INFO: 2019-01-03T06:18:03.681Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output"
materialized.
root: INFO: 2019-01-03T06:18:03.787Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
root: INFO: 2019-01-03T06:18:08.109Z: JOB_MESSAGE_DEBUG: Executing success step success24
root: INFO: 2019-01-03T06:18:08.298Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-01-03T06:18:08.560Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-01-03T06:18:08.609Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-01-03T06:20:09.805Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool
from 1 to 0.
root: INFO: 2019-01-03T06:20:09.835Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce
the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-01-03T06:20:09.907Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-01-03T06:20:09.959Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-01-02_22_12_29-3117641307869783981 is in state JOB_STATE_DONE
root: INFO: Wait 20 seconds...
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*'
-> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1546495943\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
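
The `translate_pattern` step above turns the GCS glob into an anchored regular expression, much as Python's `fnmatch.translate` does, except that `*` is rendered as `[^/\\]*` so it cannot cross a path separator. A minimal sketch of that translation (a hypothetical helper, not Beam's actual `translate_pattern` implementation):

```python
import re

def translate_glob(pattern):
    # Escape every regex metacharacter in the literal parts of the
    # pattern, then re-introduce '*' as "any run of characters that is
    # not a path separator" -- the '[^/\\]*' fragments visible in the
    # log output above.
    escaped = re.escape(pattern)
    return escaped.replace(re.escape('*'), r'[^/\\]*')

# '*' matches within a single path segment only:
regex = translate_glob('gs://bucket/output/results*-of-*')
assert re.match(regex, 'gs://bucket/output/results-00000-of-00001')
assert not re.fullmatch(translate_glob('results*'), 'results/part')
```

The segment-local `*` is what lets the verifier list only the shard files (`results-00000-of-00003`, ...) without accidentally matching objects in nested "directories".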
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0708351135254 seconds.
root: WARNING: Retry with exponential backoff: waiting for 3.49371685546 seconds before retrying
_read_with_retry because we caught exception: IOError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*
 Traceback for above exception (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/utils/retry.py", line 184, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)
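
The waits logged here (~3.5 s, ~8.8 s, ~17.2 s, ~35.5 s) grow by a factor of roughly 2–2.5 per attempt, the signature of exponential backoff with jitter. A minimal sketch of the pattern driving `_read_with_retry` (names and defaults are illustrative, not Beam's actual `retry.py`):

```python
import random
import time

def with_exponential_backoff(fun, num_retries=4, initial_delay_secs=5.0,
                             factor=2, fuzz=0.5, sleep=time.sleep):
    """Call fun(); on IOError, sleep an exponentially growing, jittered
    delay and try again, re-raising once the retry budget is spent."""
    delay = initial_delay_secs
    for attempt in range(num_retries + 1):
        try:
            return fun()
        except IOError:
            if attempt == num_retries:
                raise
            # Randomize the delay so many clients do not retry in lockstep.
            sleep(delay * (1 - fuzz * random.random()))
            delay *= factor

# The verifier succeeds as soon as an attempt finds the output files:
calls = []
def flaky_read():
    calls.append(1)
    if len(calls) < 3:
        raise IOError('No such file or directory')
    return ['results-00000-of-00001']

assert with_exponential_backoff(flaky_read, sleep=lambda s: None) == ['results-00000-of-00001']
assert len(calls) == 3
```

In this build the output files never appeared, so every attempt raised `IOError` and the final one propagated, failing the test.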

apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*'
-> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1546495943\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0601589679718 seconds.
root: WARNING: Retry with exponential backoff: waiting for 8.79045856761 seconds before retrying
_read_with_retry because we caught exception: IOError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*
 Traceback for above exception (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/utils/retry.py", line 184, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)

apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*'
-> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1546495943\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0597159862518 seconds.
root: WARNING: Retry with exponential backoff: waiting for 17.1601862495 seconds before retrying
_read_with_retry because we caught exception: IOError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*
 Traceback for above exception (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/utils/retry.py", line 184, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)

apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*'
-> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1546495943\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0586860179901 seconds.
root: WARNING: Retry with exponential backoff: waiting for 35.5024848865 seconds before retrying
_read_with_retry because we caught exception: IOError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*
 Traceback for above exception (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/utils/retry.py", line 184, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)

apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1546495943/results*-of-*'
-> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1546495943\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0827510356903 seconds.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 25 tests in 3183.482s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_05_00-5540045652149300219?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_12_29-11753804420299607421?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_20_34-9068349041439216085?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_28_22-16872352801492152823?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_37_19-4251050553096392947?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_45_16-14362768026322522493?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_51_34-928763833326102565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_05_03-4689397316821601924?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_19_33-13146501909156516123?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_05_02-446256343266995205?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_05_02-4150926361510293245?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_17_07-9904244417684780384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_24_01-7670377387318559192?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_31_44-1350101093537153599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_05_01-12300050136213499563?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_05_02-2067045983457867087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_12_29-3117641307869783981?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_21_54-12037116280524517408?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_30_22-12797805034466738079?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_05_01-17775670976726179971?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_12_26-2713043786321227523?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_22_17-8273998190390626309?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_05_02-13932111299256847652?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-02_22_14_25-13988639084975388908?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 53 mins 4.45 secs.

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'
line: 276

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log
output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 57m 43s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/gk5saxjjzszh6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org

