beam-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_Python37 #19
Date Mon, 22 Jul 2019 13:10:58 GMT
See <https://builds.apache.org/job/beam_PostCommit_Python37/19/display/redirect>

------------------------------------------
[...truncated 252.38 KB...]
    args = (typing.Tuple[K, args[0]],) + args[1:]
  File "/usr/lib/python3.7/typing.py", line 743, in __getitem__
    return self.__getitem_inner__(params)
  File "/usr/lib/python3.7/typing.py", line 251, in inner
    return func(*args, **kwds)
  File "/usr/lib/python3.7/typing.py", line 769, in __getitem_inner__
    params = tuple(_type_check(p, msg) for p in params)
  File "/usr/lib/python3.7/typing.py", line 769, in <genexpr>
    params = tuple(_type_check(p, msg) for p in params)
  File "/usr/lib/python3.7/typing.py", line 139, in _type_check
    raise TypeError(f"{msg} Got {arg!r:.100}.")
TypeError: Tuple[t0, t1, ...]: each t must be a type. Got Any.
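
The TypeError above is raised by Python 3.7's typing module when a parameter
passed to typing.Tuple[...] is not a type. A minimal sketch that reproduces the
same message; FakeAny is a hypothetical stand-in for an object whose repr is
"Any" but which is not a typing type (the log's "Got Any." suggests Beam's
internal Any type constraint, which is an instance rather than a typing type):

    import typing

    class FakeAny:
        # Not a type and not callable, so typing._type_check rejects it;
        # the repr is what shows up in the error message.
        def __repr__(self):
            return 'Any'

    # On Python 3.7 this raises:
    # TypeError: Tuple[t0, t1, ...]: each t must be a type. Got Any.
    typing.Tuple[FakeAny(), int]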

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 23.720s

FAILED (SKIP=1, errors=1)

> Task :sdks:python:test-suites:direct:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ERROR
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

======================================================================
ERROR: test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/complete/autocomplete_test.py>", line 69, in test_autocomplete_it
    assert_that(checksum, equal_to([self.KINGLEAR_HASH_SUM]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 426, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 475, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 530, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 560, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 490, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 168, in stage_job_resources
    requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 487, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/processes.py>", line 91, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/processes.py>", line 83, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.7/subprocess.py", line 395, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.7/subprocess.py", line 487, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1.

 Pip install failed for package: -r
 Output from execution of subprocess: b'Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))\n  File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz\nCollecting mock (from -r postcommit_requirements.txt (line 2))\n  ERROR: Could not find a version that satisfies the requirement mock (from -r postcommit_requirements.txt (line 2)) (from versions: none)\nERROR: No matching distribution found for mock (from -r postcommit_requirements.txt (line 2))\n'
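
The command that failed is printed verbatim in the CalledProcessError: pip
download with --no-binary :all: against postcommit_requirements.txt. The
"(from versions: none)" output means the index lookup for mock returned
nothing at all, which usually points at a transient PyPI or network problem
rather than a genuinely missing package. A sketch of the failing staging
step, reconstructed from the command in the log (the Jenkins paths would
differ locally); Beam's processes.check_output wraps this call and re-raises
the failure as the RuntimeError shown above:

    import subprocess
    import sys

    # Command reconstructed from the log output.
    cmd = [
        sys.executable, '-m', 'pip', 'download',
        '--dest', '/tmp/dataflow-requirements-cache',
        '-r', 'postcommit_requirements.txt',
        '--exists-action', 'i',
        '--no-binary', ':all:',
    ]
    # Raises subprocess.CalledProcessError when pip exits non-zero.
    subprocess.check_output(cmd, stderr=subprocess.STDOUT)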
-------------------- >> begin captured logging << --------------------
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://dataflow-samples/shakespeare/kinglear.txt' -> 'gs://dataflow\\-samples/shakespeare/kinglear\\.txt'
root: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://dataflow-samples/shakespeare/kinglear.txt' -> 'gs://dataflow\\-samples/shakespeare/kinglear\\.txt'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0722121752-127772.1563797872.127924/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0722121752-127772.1563797872.127924/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0722121752-127772.1563797872.127924/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0722121752-127772.1563797872.127924/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_18_02-7237085553912691411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_32_57-5108652903346641600?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_40_56-1184725463703032088?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
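
The BeamDeprecationWarning fires whenever the deprecated Pipeline.options
property is read, as in the method_to_use line above. A minimal sketch of the
pattern the warning points toward, keeping a reference to the PipelineOptions
object instead of reading it back off the pipeline (option values here are
illustrative, not taken from the failing tests):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        PipelineOptions, StandardOptions)

    options = PipelineOptions()  # e.g. PipelineOptions(['--streaming'])
    with beam.Pipeline(options=options) as p:
        # Query the options object you constructed, not p.options.
        is_streaming = options.view_as(StandardOptions).streaming
        _ = p | beam.Create(['a', 'b']) | beam.Map(print)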
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:565: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_18_01-3800094776193914922?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_39_48-17263246457781543547?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_48_19-4594542589083536060?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
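
The replacement for the deprecated BigQuerySink is named in the warning
itself. A minimal WriteToBigQuery sketch; the table spec and schema are
placeholders, and actually running it requires GCP credentials:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'king', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # placeholder table spec
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))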
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
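
The FutureWarnings flag the experimental fileio transforms exercised by
fileio_test.py. A minimal match-and-read sketch using the same API; the file
pattern is a placeholder:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | fileio.MatchFiles('gs://my-bucket/input/*.txt')  # placeholder
         | fileio.ReadMatches()
         # Each element is a ReadableFile whose metadata carries the
         # matched path, mirroring the 'GetPath' step in the test.
         | beam.Map(lambda readable_file: readable_file.metadata.path)
         | beam.Map(print))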
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_18_01-10520559847905988137?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_39_32-2595770593749179686?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_48_05-4279159593473683215?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_18_02-9098522583680428594?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_31_18-14897671321209655214?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_39_15-16411104212581393383?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_48_00-12441258105062941944?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_56_06-4315495775347955873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_18_03-7576339307799299173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_27_41-1560073546237703815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_36_28-16343906557051473884?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_44_35-1871167654078829553?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_52_42-12544411726906708621?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_18_00-1542868565656585406?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_26_53-16533546640639536417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_35_40-13946453000475982246?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_44_26-18430419841146942773?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_54_02-4211802934517475182?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_06_02_38-3810790730517781701?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_18_02-16652685734737887889?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_28_10-18048879134052435117?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_39_41-3768764890650899608?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_47_59-9454917985936203193?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_18_03-3796693432181612521?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_28_16-5741476881381234614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_37_27-323386772012323234?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_46_05-16130956573661034900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-22_05_54_22-937551335408078671?project=apache-beam-testing.

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3186.774s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle>'
line: 49

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
line: 80

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 6s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/2n3w7qfrgbho6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


