spark-dev mailing list archives

From Jean-Georges Perrin <...@jgp.net>
Subject Issue with map Java lambda function with 3.0.0 preview and preview 2
Date Sat, 28 Dec 2019 17:38:13 GMT
Hey guys,

This code:

    Dataset<Row> incrementalDf = spark
        .createDataset(l, Encoders.INT())
        .toDF();
    Dataset<Integer> dotsDs = incrementalDf
        .map(status -> {
          double x = Math.random() * 2 - 1;
          double y = Math.random() * 2 - 1;
          counter++;
          if (counter % 100000 == 0) {
            System.out.println("" + counter + " darts thrown so far");
          }
          return (x * x + y * y <= 1) ? 1 : 0;
        }, Encoders.INT());

used to work with Spark 2.x, but with the two 3.0.0 previews, it fails with:

The method map(Function1<Row,Integer>, Encoder<Integer>) is ambiguous for the
type Dataset<Row>

If I define my mapping function as a class, it works fine. Here is the class:

  private final class DartMapper
      implements MapFunction<Row, Integer> {
    private static final long serialVersionUID = 38446L;

    @Override
    public Integer call(Row r) throws Exception {
      double x = Math.random() * 2 - 1;
      double y = Math.random() * 2 - 1;
      counter++;
      if (counter % 1000 == 0) {
        System.out.println("" + counter + " operations done so far");
      }
      return (x * x + y * y <= 1) ? 1 : 0;
    }
  }
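For what it's worth, the ambiguity looks like plain Java overload resolution: once scala.Function1 became a SAM type in Scala 2.12, both map() overloads can accept a bare lambda. Here is a stripped-down sketch of the mechanism (the interface and method names are made up for the demo, not Spark's):

```java
// Minimal illustration of the overload ambiguity, outside Spark.
// Both nested interfaces are functional (single abstract method),
// so a bare lambda matches either overload of map().
public class AmbiguityDemo {

  // Stand-in for scala.Function1, which is a SAM type in Scala 2.12.
  public interface ScalaLikeFunction1<T, R> {
    R apply(T t);
  }

  // Stand-in for org.apache.spark.api.java.function.MapFunction.
  public interface MapFunctionLike<T, R> {
    R call(T t) throws Exception;
  }

  public static <T, R> String map(ScalaLikeFunction1<T, R> f) {
    return "Function1 overload";
  }

  public static <T, R> String map(MapFunctionLike<T, R> f) {
    return "MapFunction overload";
  }

  public static void main(String[] args) {
    // map(x -> x);  // does not compile: ambiguous, both overloads apply

    // An explicit cast to one functional interface picks the overload:
    String chosen = map((MapFunctionLike<Integer, Integer>) x -> x + 1);
    System.out.println(chosen);  // prints "MapFunction overload"
  }
}
```

So casting the lambda in the original call, i.e. `.map((MapFunction<Row, Integer>) status -> { ... }, Encoders.INT())`, should resolve the ambiguity the same way the named class does.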

Any hint on what I did wrong, if anything?

jg



