spark-user mailing list archives

From "Sela, Amit" <>
Subject Re: spark 2.0 readStream from a REST API
Date Thu, 11 Aug 2016 07:39:18 GMT
The currently available output modes are Complete and Append. Complete mode is for stateful processing
(aggregations), and Append mode is for stateless processing (i.e., map/filter).
Dataset#writeStream produces a DataStreamWriter, which lets you start a query. This
seems consistent with Spark’s previous behaviour of only executing upon an “action”,
and queries, I guess, are what “jobs” used to be.
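A minimal sketch of that distinction, assuming a Spark 2.0 session and the socket source from the docs (the host/port and variable names are illustrative, not from this thread):

```scala
// Sketch only: needs a running Spark 2.0 session and a socket server on localhost:9999.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("output-modes-sketch").getOrCreate()
import spark.implicits._

val lines = spark.readStream
  .format("socket")
  .option("host", "localhost")
  .option("port", 9999)
  .load()

// Stateless pipeline (flatMap): Append mode applies.
val words = lines.as[String].flatMap(_.split(" "))
val appendQuery = words.writeStream
  .outputMode("append")   // no aggregation in the plan
  .format("console")
  .start()

// Stateful pipeline (aggregation): Complete mode is supported.
val counts = words.groupBy("value").count()
val completeQuery = counts.writeStream
  .outputMode("complete") // streaming aggregation present
  .format("console")
  .start()
```

Nothing runs until `start()` is called on the DataStreamWriter, which is the "action"-like step referred to above.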


From: Ayoub Benali <<>>
Date: Tuesday, August 2, 2016 at 11:59 AM
To: user <<>>
Cc: Jacek Laskowski <<>>, Amit Sela <<>>,
Michael Armbrust <<>>
Subject: Re: spark 2.0 readStream from a REST API

Why is writeStream needed to consume the data?

When I tried it I got this exception:

INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
org.apache.spark.sql.AnalysisException: Complete output mode not supported when there are
no streaming aggregations on streaming DataFrames/Datasets;
at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$.org$apache$spark$sql$catalyst$analysis$UnsupportedOperationChecker$$throwError(UnsupportedOperationChecker.scala:173)
at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$.checkForStreaming(UnsupportedOperationChecker.scala:65)
at org.apache.spark.sql.streaming.StreamingQueryManager.startQuery(StreamingQueryManager.scala:236)
at org.apache.spark.sql.streaming.DataStreamWriter.start(DataStreamWriter.scala:287)
at .<init>(<console>:59)
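The exception says the query's plan contains no streaming aggregation, yet Complete output mode was requested. A sketch of two ways past it, assuming `streamingDF` stands in for the streaming Dataset from the custom source (the name and column are illustrative):

```scala
// Option 1: the plan is stateless, so use Append mode instead of Complete.
val q1 = streamingDF.writeStream
  .outputMode("append")
  .format("console")
  .start()

// Option 2: keep Complete mode, but add a streaming aggregation first.
val q2 = streamingDF.groupBy("someColumn").count()
  .writeStream
  .outputMode("complete")
  .format("console")
  .start()
```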

2016-08-01 18:44 GMT+02:00 Amit Sela <<>>:
I think you're missing:



Did it help?

On Mon, Aug 1, 2016 at 2:44 PM Jacek Laskowski <<>> wrote:
On Mon, Aug 1, 2016 at 11:01 AM, Ayoub Benali
<<>> wrote:

> the problem now is that when I consume the dataframe for example with count
> I get the stack trace below.

Mind sharing the entire pipeline?

> I followed the implementation of TextSocketSourceProvider to implement my
> data source and Text Socket source is used in the official documentation
> here.

Right. Completely forgot about the provider. Thanks for reminding me about it!

Jacek Laskowski
Mastering Apache Spark 2.0
Follow me at

