beam-builds mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PerformanceTests_Spark #3434
Date Tue, 23 Jul 2019 00:48:23 GMT
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3434/display/redirect?page=changes>

Changes:

[github] Update Python 3 entry in Python SDK roadmap.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6518abfb3ea47a4802d76ca3c405c3f66e48eaa2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6518abfb3ea47a4802d76ca3c405c3f66e48eaa2
Commit message: "Merge pull request #9123 from tvalentyn/patch-57"
 > git rev-list --no-walk ab80cc5f031f7881b688e7fb5b05191fa6b3f80f # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4259186908169704807.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3279836306694721285.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1997446001854936900.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9212070207014343603.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins20212359493051442.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins561240374081327264.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages
(from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
(line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog,
blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi,
chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth,
requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4
certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5
cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8
ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0
requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5838181423510418338.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py>
--project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results
--k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true
--dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-23 00:48:20,002 71a6ac69 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/71a6ac69/pkb.log>
2019-07-23 00:48:20,002 71a6ac69 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1281-g37855a6
2019-07-23 00:48:20,004 71a6ac69 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-23 00:48:20,397 71a6ac69 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-23 00:48:20,422 71a6ac69 MainThread WARNING  The key "flags" was not in the default
config, but was in user overrides. This may indicate a typo.
2019-07-23 00:48:20,444 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning
resources for benchmark dpb_wordcount_benchmark
2019-07-23 00:48:20,446 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during
benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",>
line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",>
line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",>
line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",>
line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",>
line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",>
line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",>
line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
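
The provisioning failure above is a plain Python KeyError: at gcp_dpb_dataproc.py line 140 the worker group's disk_spec.disk_type is used as a subscript into some disk-type mapping, and the value 'nodisk' carried by this benchmark configuration has no entry there. A minimal sketch of that failure mode, assuming a hypothetical DISK_TYPE_MAP (the actual table and names in PerfKitBenchmarker may differ):

    # Minimal sketch of the failure mode, not the actual PerfKitBenchmarker code.
    # The mapping name and its contents are hypothetical, for illustration only.
    DISK_TYPE_MAP = {
        'pd-standard': 'pd-standard',
        'pd-ssd': 'pd-ssd',
    }

    def resolve_disk_type(disk_type):
        # Bare lookup, as implied by the traceback:
        # DISK_TYPE_MAP['nodisk'] raises KeyError: 'nodisk'
        return DISK_TYPE_MAP[disk_type]

    def resolve_disk_type_guarded(disk_type):
        # Defensive variant that turns the miss into a clearer provisioning error.
        try:
            return DISK_TYPE_MAP[disk_type]
        except KeyError:
            raise ValueError('Unsupported dpb disk type %r; expected one of %s'
                             % (disk_type, sorted(DISK_TYPE_MAP)))

Either teaching the Dataproc provider about disk-less worker specs or overriding the disk type via --config_override would presumably avoid the lookup miss; this log alone does not say which is intended.
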
2019-07-23 00:48:20,447 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-71a6ac69 --format json --quiet --project apache-beam-testing
2019-07-23 00:48:21,751 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-71a6ac69 --format json --quiet --project apache-beam-testing}
 ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-71a6ac69

2019-07-23 00:48:21,752 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-71a6ac69 --format json --quiet --project apache-beam-testing
2019-07-23 00:48:22,345 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-71a6ac69 --format json --quiet --project apache-beam-testing}
 ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-71a6ac69
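
Both cleanup commands fail with NOT_FOUND because the KeyError aborted provisioning before cluster pkb-71a6ac69 was ever created, so the delete and describe issued during teardown have nothing to act on. A rough sketch of an existence-checked teardown built from the same gcloud commands shown above (the helper names are hypothetical, not PKB code):

    import subprocess

    def cluster_exists(name, project):
        # Reuse the describe command from the log; exit code 0 means the cluster exists.
        cmd = ['gcloud', 'dataproc', 'clusters', 'describe', name,
               '--format', 'json', '--quiet', '--project', project]
        return subprocess.call(cmd) == 0

    def delete_if_present(name, project):
        # Only issue the delete when describe succeeded, avoiding the NOT_FOUND noise.
        if cluster_exists(name, project):
            subprocess.check_call(['gcloud', 'dataproc', 'clusters', 'delete', name,
                                   '--format', 'json', '--quiet', '--project', project])
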

2019-07-23 00:48:22,348 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception
running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",>
line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",>
line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",>
line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",>
line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",>
line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",>
line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",>
line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",>
line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-23 00:48:22,349 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark
1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-23 00:48:22,349 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark
run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-23 00:48:22,349 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete
logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/71a6ac69/pkb.log>
2019-07-23 00:48:22,349 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion
statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/71a6ac69/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org

