beam-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11 #33
Date Tue, 20 Jul 2021 16:03:13 GMT
See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Streaming_Java11/33/display/redirect?page=changes>

Changes:

[zhoufek] [BEAM-12474] Write PubsubIO parsing errors to dead-letter topic

[noreply] [BEAM-4152] Disable window tests on Dataflow (#15188)

[noreply] [BEAM-12548] Implement EqualsFloat test helper (#15175)

[noreply] Fix formatting using go fmt (#15189)

[noreply] [BEAM-4152] Add merging strategy for sessions to WindowingStrategy proto

[noreply] [BEAM-12613] Enable Python build tests for Samza (#15169)

[noreply] Fix spelling.


------------------------------------------
[...truncated 65.99 KB...]
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
9b42aa0d18fc: Preparing
93821ce7b4d4: Preparing
b5dff7010038: Preparing
fa1948845804: Preparing
fb0606e647ba: Preparing
0a6cc7fa3603: Preparing
743ef93eef7b: Preparing
0ee94ef367e5: Preparing
0bc160da78ab: Preparing
d39bc526141e: Preparing
c65143949103: Preparing
62ab8cd4222c: Preparing
6ea995e9b7d3: Preparing
c0848348e2f7: Preparing
79c550eb7bd2: Preparing
7095af798ace: Preparing
fe6a4fdbedc0: Preparing
e4d0e810d54a: Preparing
4e006334a6fd: Preparing
62ab8cd4222c: Waiting
6ea995e9b7d3: Waiting
c0848348e2f7: Waiting
0a6cc7fa3603: Waiting
e4d0e810d54a: Waiting
743ef93eef7b: Waiting
4e006334a6fd: Waiting
0ee94ef367e5: Waiting
7095af798ace: Waiting
fe6a4fdbedc0: Waiting
79c550eb7bd2: Waiting
93821ce7b4d4: Pushed
fb0606e647ba: Pushed
b5dff7010038: Pushed
9b42aa0d18fc: Pushed
0a6cc7fa3603: Pushed
fa1948845804: Pushed
0ee94ef367e5: Pushed
0bc160da78ab: Pushed
6ea995e9b7d3: Layer already exists
c65143949103: Pushed
c0848348e2f7: Layer already exists
79c550eb7bd2: Layer already exists
7095af798ace: Layer already exists
fe6a4fdbedc0: Layer already exists
e4d0e810d54a: Layer already exists
4e006334a6fd: Layer already exists
62ab8cd4222c: Pushed
743ef93eef7b: Pushed
d39bc526141e: Pushed
20210720124333: digest: sha256:adcb6c4a7a0914befe640d4aeafb8eb13134c29a6fe8ba57b013a7d3abe63941 size: 4310

> Task :sdks:java:testing:load-tests:run
Jul 20, 2021 12:48:35 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 20, 2021 12:48:35 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath:
will stage 191 files. Enable logging at DEBUG level to see which files will be staged.
Jul 20, 2021 12:48:36 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Jul 20, 2021 12:48:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related
to Google Compute Engine usage and other Google Cloud Services.
Jul 20, 2021 12:48:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jul 20, 2021 12:48:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <112637 bytes, hash 5aa1afeeab873967d94ee4769ca84dbd3263cde611c61403f1d092d8dad5cdb0>
to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-WqGv7quHOWfZTuR2nKhNvTJjzeYRxhQD8dCS2NrVzbA.pb
Jul 20, 2021 12:48:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 191 files from PipelineOptions.filesToStage to staging location to prepare
for execution.
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 191 files cached, 0 files newly uploaded in 0 seconds
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jul 20, 2021 12:48:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3c4a71,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1352434e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4f9a6c2d,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2b6fcb9f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@75de6341,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74170687, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@68f0f72c,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d96fa9e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3b545206,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77bb48d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@181d8899,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@12d5c30e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@b887730,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26586b74, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52f57666,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6e041285, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@267dc982,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@439b15f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3aa41da1,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@74fab04a]
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Read input/StripIds as step s2
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Collect start time metrics (input) as step s3
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Jul 20, 2021 12:48:41 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@44ed0a8f,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@32177fa5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a96d56c,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6ab4a5b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2abe9173,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@235d29d6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1fdca564,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@43f9dd56, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1d12e953,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@57cb70be, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2d4608a6,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20d87335, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a8a4e0c,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@26c89563, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3bd6ba24,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58f437b0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@20f6f88c,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4277127c, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c7e978c,
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@354e7004]
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Read co-input/StripIds as step s6
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding CoGroupByKey/GBK as step s12
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Ungroup and reiterate as step s14
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Collect total bytes as step s15
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator
addStep
INFO: Adding Collect end time metrics as step s16
Jul 20, 2021 12:48:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.32.0-SNAPSHOT
Jul 20, 2021 12:48:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-07-20_05_48_41-13777248125947355154?project=apache-beam-testing
Jul 20, 2021 12:48:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-07-20_05_48_41-13777248125947355154
Jul 20, 2021 12:48:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-07-20_05_48_41-13777248125947355154
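The same cancellation can also be requested from the submitting JVM instead of the gcloud CLI, using the PipelineResult handle returned by Pipeline.run(). A minimal sketch follows; the class name CancelRunningJob and the omitted pipeline construction are illustrative, not part of the load test itself.

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class CancelRunningJob {
      public static void main(String[] args) throws IOException {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline pipeline = Pipeline.create(options);
        // ... build the pipeline transforms here ...
        PipelineResult result = pipeline.run();
        // Request cancellation of the submitted job; when running on Dataflow this
        // has the same effect as the gcloud command shown above.
        result.cancel();
      }
    }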
Jul 20, 2021 12:48:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
WARNING: 2021-07-20T12:48:49.293Z: The workflow name is not a valid Cloud Label. Labels applied
to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified
job name: load0tests0java110dataflow0v20streaming0cogbk01-jenkins-07-tg76. For the best monitoring
experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
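The restriction behind this warning is that Cloud label values may only contain lowercase letters, digits, hyphens, and underscores, and are limited to 63 characters. The sketch below illustrates a cleanup in that spirit; it is an assumption for illustration only, not the exact rewriting Dataflow applies to the job name.

    public class LabelNames {
      // Illustrative only: maps a job name onto the label character set described
      // in the warning above (lowercase a-z, 0-9, '-' and '_', at most 63 chars).
      static String toCloudLabelValue(String jobName) {
        String cleaned = jobName.toLowerCase().replaceAll("[^a-z0-9_-]", "0");
        return cleaned.length() <= 63 ? cleaned : cleaned.substring(0, 63);
      }
    }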
Jul 20, 2021 12:48:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:54.232Z: Worker configuration: e2-standard-2 in us-central1-a.
Jul 20, 2021 12:48:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:54.793Z: Expanding SplittableParDo operations into optimizable parts.
Jul 20, 2021 12:48:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:54.830Z: Expanding CollectionToSingleton operations into optimizable
parts.
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:54.924Z: Expanding CoGroupByKey operations into optimizable parts.
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.091Z: Expanding SplittableProcessKeyed operations into optimizable
parts.
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.122Z: Expanding GroupByKey operations into streaming Read/Write
steps
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.188Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.303Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.347Z: Unzipping flatten CoGroupByKey-Flatten for input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.379Z: Fusing unzipped copy of CoGroupByKey/GBK/WriteStream, through
flatten CoGroupByKey/Flatten, into producer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.414Z: Fusing consumer CoGroupByKey/GBK/WriteStream into CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.445Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
into Read input/Impulse
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.473Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.506Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.532Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds)
into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.571Z: Fusing consumer Collect start time metrics (input)/ParMultiDo(TimeMonitor)
into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.608Z: Fusing consumer Window.Into()/Window.Assign into Collect start
time metrics (input)/ParMultiDo(TimeMonitor)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.644Z: Fusing consumer CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
into Window.Into()/Window.Assign
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.676Z: Fusing consumer Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
into Read co-input/Impulse
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.711Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.747Z: Fusing consumer Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.780Z: Fusing consumer Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
into Read-co-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.815Z: Fusing consumer Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
into Read co-input/ParDo(StripIds)/ParMultiDo(StripIds)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.846Z: Fusing consumer Window.Into()2/Window.Assign into Collect
start time metrics (co-input)/ParMultiDo(TimeMonitor)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.884Z: Fusing consumer CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
into Window.Into()2/Window.Assign
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.921Z: Fusing consumer CoGroupByKey/GBK/MergeBuckets into CoGroupByKey/GBK/ReadStream
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:55.967Z: Fusing consumer CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
into CoGroupByKey/GBK/MergeBuckets
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:56.001Z: Fusing consumer Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
into CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:56.063Z: Fusing consumer Collect total bytes/ParMultiDo(ByteMonitor)
into Ungroup and reiterate/ParMultiDo(UngroupAndReiterate)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:56.102Z: Fusing consumer Collect end time metrics/ParMultiDo(TimeMonitor)
into Collect total bytes/ParMultiDo(ByteMonitor)
Jul 20, 2021 12:48:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:48:56.492Z: Starting 5 workers in us-central1-a...
Jul 20, 2021 12:49:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:49:25.038Z: Your project already contains 100 Dataflow-created metric
descriptors, so new user metrics of the form custom.googleapis.com/* will not be created.
However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter.
If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jul 20, 2021 12:49:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:49:45.323Z: Autoscaling: Raised the number of workers to 5 so that the pipeline
can catch up with its backlog and keep up with its input rate.
Jul 20, 2021 12:50:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:50:34.403Z: Workers have started successfully.
Jul 20, 2021 12:50:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T12:50:34.437Z: Workers have started successfully.
Jul 20, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T16:00:39.194Z: Cancel request is committed for workflow job: 2021-07-20_05_48_41-13777248125947355154.
Jul 20, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T16:00:39.273Z: Cleaning up.
Jul 20, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T16:00:39.367Z: Stopping worker pool...
Jul 20, 2021 4:00:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T16:00:39.438Z: Stopping worker pool...
Jul 20, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T16:03:00.466Z: Autoscaling: Reduced the number of workers to 0 based on low
average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping
up with input rate.
Jul 20, 2021 4:03:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler
process
INFO: 2021-07-20T16:03:00.512Z: Worker pool stopped.
Jul 20, 2021 4:03:08 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-07-20_05_48_41-13777248125947355154 finished with status CANCELLED.
Load test results for test (ID): a114d546-e5f4-4b74-bfd3-5dec9b673326 and timestamp: 2021-07-20T12:48:35.740000000Z:
Metric:                                 Value:
dataflow_v2_java11_runtime_sec          11376.586
dataflow_v2_java11_total_bytes_count    7.2702688E9
Exception in thread "main" java.lang.RuntimeException: Invalid job state: CANCELLED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
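The exception above is the load-test harness converting the job's terminal state into a build failure. The real check lives in org.apache.beam.sdk.loadtests.JobFailure (JobFailure.java:55 in the trace) and may differ in detail; the following is only a sketch against the public PipelineResult API, with a hypothetical class name.

    import org.apache.beam.sdk.PipelineResult;

    public class TerminalStateCheck {
      // Sketch of a terminal-state check in the spirit of the failure above:
      // any terminal state other than DONE (for example CANCELLED or FAILED)
      // is surfaced as a RuntimeException so the Gradle task exits non-zero.
      static void failOnBadTerminalState(PipelineResult result) {
        PipelineResult.State state = result.waitUntilFinish();
        if (state != PipelineResult.State.DONE) {
          throw new RuntimeException("Invalid job state: " + state + ".");
        }
      }
    }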

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210720124333
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:adcb6c4a7a0914befe640d4aeafb8eb13134c29a6fe8ba57b013a7d3abe63941
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:adcb6c4a7a0914befe640d4aeafb8eb13134c29a6fe8ba57b013a7d3abe63941
  Associated tags:
  - 20210720124333
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210720124333
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210720124333].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:adcb6c4a7a0914befe640d4aeafb8eb13134c29a6fe8ba57b013a7d3abe63941].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 19m 51s
107 actionable tasks: 78 executed, 27 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6gh5zodkf6qbw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org

