spark-dev mailing list archives

From Stavros Kontopoulos <stavros.kontopou...@lightbend.com>
Subject Re: JDK vs JRE in Docker Images
Date Wed, 17 Apr 2019 17:23:16 GMT
Hi Rob,

We are using registry.redhat.io/redhat-openjdk-18/openjdk18-openshift
(https://docs.openshift.com/online/using_images/s2i_images/java.html).
It looks like the most convenient option, as Red Hat leads the OpenJDK
updates, which is even more important from now on, also from a security
point of view.
There are some tools you might want to use at runtime, like jstack and jps,
when debugging apps, so it might be more convenient to have a JDK, but it
shouldn't be a requirement unless Spark does any compilation on the fly
behind the scenes (besides the use of Janino) or you need to use a tool like
keytool at container startup.
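To make the distinction concrete, here is a minimal sketch of how an
entrypoint script could probe for such tools and degrade gracefully in a
JRE-only image. The have_tool helper and the messages are my own
illustration, not anything Spark ships; only the tool names come from the
discussion above:

```shell
#!/bin/sh
# Hypothetical entrypoint guard: check which JDK tools the image provides
# before trying to use them.

# Returns success if the named binary is on the PATH.
have_tool() {
  command -v "$1" >/dev/null 2>&1
}

# keytool ships with the JDK (and most JREs); guard its use at startup.
if have_tool keytool; then
  echo "keytool available: could import extra CA certs into the truststore"
else
  echo "keytool missing: skipping truststore setup (likely a JRE-only image)"
fi

# Debugging and compilation tools are JDK-only; report what is present.
for t in jstack jps javac; do
  have_tool "$t" && echo "$t: present" || echo "$t: missing"
done
```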

Best,
Stavros

On Wed, Apr 17, 2019 at 4:49 PM Rob Vesse <rvesse@dotnetrdf.org> wrote:

> Folks
>
> For those using the Kubernetes support and building custom images, are you
> using a JDK or a JRE in the container images?
>
> Using a JRE saves a reasonable chunk of image size (about 50MB with our
> preferred Linux distro), but I didn’t want to make this change if there was
> a reason to have a JDK available. Certainly the official project
> integration tests run just fine with a JRE-based image.
>
> Currently the project’s official Dockerfiles use openjdk:8-alpine as a
> base, which includes a full JDK, so I didn’t know if that was intentional
> or just convenience?
>
> Thanks,
>
> Rob
>
