spark-user mailing list archives

From Ben Spark <ben_spar...@yahoo.com.au>
Subject SparkR dataFrame read.df fails to read from aws s3
Date Thu, 09 Jul 2015 04:14:17 GMT
I have Spark 1.4 deployed on AWS EMR, but the SparkR DataFrame method read.df cannot load data from AWS S3.
1) "read.df" error message read.df(sqlContext,"s3://some-bucket/some.json","json")
15/07/09 04:07:01 ERROR r.RBackendHandler: loadDF on org.apache.spark.sql.api.r.SQLUtils failed
java.lang.IllegalArgumentException: invalid method loadDF for object org.apache.spark.sql.api.r.SQLUtils
	at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:143)
	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:74)
	at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:36)  at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) 2)
"jsonFile" is working though with some warning messageWarning message:
In normalizePath(path) :
  path[1]="s3://rea-consumer-data-dev/cbr/profiler/output/20150618/part-00000": No such file
or directory
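
For reference, here is a minimal sketch of how both calls are being made. The session setup via sparkR.init/sparkRSQL.init and the bucket path are placeholders standing in for my actual EMR session, not the real values.

library(SparkR)

# Assumed setup; on EMR the sparkR shell normally provides sc and sqlContext already.
sc <- sparkR.init()
sqlContext <- sparkRSQL.init(sc)

# 1) Fails with "invalid method loadDF for object org.apache.spark.sql.api.r.SQLUtils"
df1 <- read.df(sqlContext, "s3://some-bucket/some.json", "json")

# 2) Works, apart from the normalizePath() "No such file or directory" warning
df2 <- jsonFile(sqlContext, "s3://some-bucket/some.json")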