spark-dev mailing list archives

From Shivaram Venkataraman <>
Subject Re: Call to new JObject sometimes returns an empty R environment
Date Tue, 05 Jul 2016 17:00:19 GMT

[Please send SparkR development questions to the Spark user / dev
mailing lists. Replies inline]

> From:  <>
> Date: Tue, Jul 5, 2016 at 3:30 AM
> Subject: Call to new JObject sometimes returns an empty R environment
> To: SparkR Developers <>
>  Hi all,
>  I have recently moved from SparkR 1.5.2 to 1.6.0. I am doing some
> experiments using SparkR:::newJObject("java.util.HashMap") and I
> notice the behaviour has changed, and it now returns an "environment"
> instead of a "jobj":
>> print(class(SparkR:::newJObject("java.util.HashMap")))  # SparkR 1.5.2
> [1] "jobj"
>> print(class(SparkR:::newJObject("java.util.HashMap")))  # SparkR 1.6.0
> [1] "environment"
> Moreover, the environment returned is apparently empty: when I call
> ls() on the resulting environment, it returns character(0). This
> problem only happens with some Java classes, and I am not able to say
> exactly which classes cause it.

The reason this is different in Spark 1.6 is that we added support for
automatically deserializing Maps returned from the JVM as environments
on the R side. The pull request has some more details. The
reason BitSet / ArrayList "work" is that we don't do any special
serialization / de-serialization for them.
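The effect can be pictured with a tiny type-dispatch sketch (hypothetical code, not SparkR's actual SerDe implementation): any JVM object that is an instance of Map gets special-cased and unpacked on the R side, while classes with no special rule come back as opaque "jobj" handles.

```java
import java.util.*;

public class SerDeSketch {
    // Hypothetical dispatch mirroring the idea described above: Maps get
    // special deserialization (their contents are copied into an R
    // environment), while classes with no special rule are returned as
    // opaque "jobj" references to the JVM object.
    static String returnTypeFor(Object o) {
        if (o instanceof Map) {
            return "environment"; // special-cased: unpacked on the R side
        }
        return "jobj"; // default: opaque handle to the JVM object
    }

    public static void main(String[] args) {
        System.out.println(returnTypeFor(new HashMap<String, Integer>())); // environment
        System.out.println(returnTypeFor(new BitSet()));                   // jobj
        System.out.println(returnTypeFor(new ArrayList<String>()));        // jobj
    }
}
```

This is why HashMap behaves differently from BitSet or ArrayList: the dispatch keys off the runtime type, not off whether the class is parameterized.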

> If I try to create an instance of other classes such as
> java.util.BitSet, it works successfully. I thought it might be related
> to parameterized types, but it works fine with ArrayList and with
> HashSet, which both take a type parameter.
> Any suggestions on this change of behaviour (apart from "do not use
> private functions" :-))?

Unfortunately there isn't much more to say than that. The
serialization / de-serialization layer is an internal API and we don't
claim to maintain backwards compatibility for it. You might be able to
work around this particular issue by wrapping your Map in a different
JVM object before it is returned to the R side.


> Thank you very much
