beam-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_Python_Verify #7001
Date Sun, 06 Jan 2019 06:49:43 GMT
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7001/display/redirect>

------------------------------------------
[...truncated 451.93 KB...]
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Jan 06, 2019 6:03:22 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Jan 06, 2019 6:03:22 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/01/06 06:03:23 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/01/06 06:03:23 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1  | 19/01/06 06:03:23 INFO datanode.DataNode: Receiving BP-1822213002-172.18.0.2-1546754553926:blk_1073741825_1001 src: /172.18.0.3:33128 dest: /172.18.0.3:50010
datanode_1  | 19/01/06 06:03:23 INFO DataNode.clienttrace: src: /172.18.0.3:33128, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2107085487_67, offset: 0, srvID: 0df88507-b840-4654-a927-4702752ba22b, blockid: BP-1822213002-172.18.0.2-1546754553926:blk_1073741825_1001, duration: 14758060
datanode_1  | 19/01/06 06:03:23 INFO datanode.DataNode: PacketResponder: BP-1822213002-172.18.0.2-1546754553926:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/01/06 06:03:23 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/01/06 06:03:23 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/01/06 06:03:23 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_2107085487_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f039cd89aa0> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f039cd89b18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f039cd89b90> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f039cd89c08> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f039cd89c80> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f039cd89cf8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f039cd89d70> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f039cd89de8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f039cd89e60> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f039cd89ed8> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f039cd89f50> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/01/06 06:03:25 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/01/06 06:03:26 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-c75bee60117811e99d530242ac120004/1b0bfdb9-7912-4d4f-947e-422927b3e7b5.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/01/06 06:03:27 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-c75bee60117811e99d530242ac120004/1b0bfdb9-7912-4d4f-947e-422927b3e7b5.py-wordcount-integration
datanode_1  | 19/01/06 06:03:27 INFO datanode.DataNode: Receiving BP-1822213002-172.18.0.2-1546754553926:blk_1073741826_1002 src: /172.18.0.3:33148 dest: /172.18.0.3:50010
datanode_1  | 19/01/06 06:03:27 INFO DataNode.clienttrace: src: /172.18.0.3:33148, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1482956073_69, offset: 0, srvID: 0df88507-b840-4654-a927-4702752ba22b, blockid: BP-1822213002-172.18.0.2-1546754553926:blk_1073741826_1002, duration: 3857661
datanode_1  | 19/01/06 06:03:27 INFO datanode.DataNode: PacketResponder: BP-1822213002-172.18.0.2-1546754553926:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/01/06 06:03:27 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-c75bee60117811e99d530242ac120004/1b0bfdb9-7912-4d4f-947e-422927b3e7b5.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-1482956073_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7001_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7001_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7001_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7001_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7001_namenode_1 ... done
Aborting on container exit...

real	1m17.663s
user	0m1.021s
sys	0m0.170s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7001 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7001_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7001_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7001_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7001_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7001_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7001_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7001_test_net

real	0m0.471s
user	0m0.229s
sys	0m0.073s

> Task :beam-sdks-python:postCommitIT


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
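  # ([[ $PWD != *sdks/python ]] is a glob match: true unless the cwd already
  #  ends in sdks/python; the sed below then trims anything after that path,
  #  e.g. .../sdks/python/apache_beam becomes .../sdks/python.)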
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
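  # (find prints the path only if it exists, so a non-empty result means an
  #  SDK tarball is already present; ${SDK_LOCATION} is left unquoted so a
  #  glob such as dist/apache-beam-*.tar.gz expands before find sees it.)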
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=("--kms_key_name=$KMS_KEY_NAME")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")
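  # (The IFS assignment runs inside the command substitution's subshell, so
  #  the opts array is joined into one space-separated string without
  #  changing IFS for the rest of the script.)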

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.
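# (--test-pipeline-options is parsed on the test side; in Beam's Python SDK,
#  apache_beam.testing.test_pipeline.TestPipeline reads it to construct each
#  integration test's pipeline, and $TEST_OPTS typically carries the nose
#  test-selection arguments.)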

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner
--project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1
--sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test/cryptoKeyVersions/1
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470:
UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 25 tests in 2751.028s

OK
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_03_57-13201302787405368800?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_09_57-14023354200930925196?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_17_23-1166699517819245234?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_22_36-8517651036835521242?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_29_28-17063890844486636937?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_37_06-14464302783439323858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_42_44-2017736932704832807?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_03_59-3609787471530664462?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_18_41-15869236491160142149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_03_58-221362420126517839?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_03_59-9217891688740953209?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_16_05-14741436999533447917?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_21_48-7157663342292516790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_27_02-3141426345107236721?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_03_57-6710539812322028725?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_03_58-6964994847863501920?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_10_06-6101104002158499494?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_16_35-16225907238535014890?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_03_58-16626117159062179166?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_10_36-6094138622536356238?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_18_07-2188510749921773397?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_24_23-3243522129911072576?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_03_58-5029666696301871442?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-05_22_12_15-10983236778878677809?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 176

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 49m 33s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/f55fwr4cczu7i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org

