spark-user mailing list archives

From Arbab Khalil <akha...@an10.io>
Subject Re: how to use cluster sparkSession like localSession
Date Fri, 02 Nov 2018 05:55:45 GMT
Remove the master configuration from the code and then submit it to any
cluster; it should work.
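For example, a minimal sketch of what that looks like (the class name, the HDFS path, and the submit flags below are illustrative assumptions, not taken from the thread):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkTest {
    public static void main(String[] args) {
        // No .master(...) call here -- spark-submit (or spark.master in
        // spark-defaults.conf) decides where the job actually runs.
        SparkSession spark = SparkSession
                .builder()
                .appName("spark test")
                .getOrCreate();

        // Use a path every executor can reach (e.g. HDFS), not a local
        // IDE path -- on YARN the workers cannot see your workstation.
        Dataset<Row> testData = spark.read().csv("hdfs:///data/no_schema_iris.csv");
        testData.printSchema();
        testData.show();

        spark.stop();
    }
}
```

Then the same jar runs locally or on the cluster depending only on the submit command, e.g. `spark-submit --master yarn --deploy-mode client app.jar` versus `spark-submit --master "local[*]" app.jar`.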

On Fri, Nov 2, 2018 at 10:52 AM 崔苗 (Data and AI Product Development
Department) <0049003208@znv.com> wrote:

>
> Then what about Spark SQL and Spark MLlib? We use them most of the time.
> 0049003208
> 0049003208@znv.com
>
> On 11/2/2018 11:58, Daniel de Oliveira Mantovani
> <daniel.oliveira.mantovani@gmail.com> wrote:
>
> Please read about Spark Streaming or Spark Structured Streaming. Your web
> application can easily communicate with it through an API, and you won't
> have the overhead of starting a new Spark job, which is pretty heavy.
>
> On Thu, Nov 1, 2018 at 23:01 崔苗 (Data and AI Product Development
> Department) <0049003208@znv.com> wrote:
>
>>
>> Hi,
>> we want to execute Spark code without submitting an application.jar, like
>> this code:
>>
>> import org.apache.spark.sql.Dataset;
>> import org.apache.spark.sql.Row;
>> import org.apache.spark.sql.SparkSession;
>>
>> public class SparkTest {
>>     public static void main(String[] args) throws Exception {
>>         SparkSession spark = SparkSession
>>                 .builder()
>>                 .master("local[*]")
>>                 .appName("spark test")
>>                 .getOrCreate();
>>
>>         Dataset<Row> testData = spark.read()
>>                 .csv(".\\src\\main\\java\\Resources\\no_schema_iris.scv");
>>         testData.printSchema();
>>         testData.show();
>>     }
>> }
>>
>> The above code works well in IDEA; we do not need to generate a jar file
>> and submit it. But if we replace master("local[*]") with master("yarn"),
>> it doesn't work. So is there a way to use a cluster SparkSession like a
>> local SparkSession? We need to dynamically execute Spark code in a web
>> server according to different requests; for example, a filter request
>> will call dataset.filter(), so there is no application.jar to submit.
>>
>> 0049003208
>> 0049003208@znv.com
>>
>> --------------------------------------------------------------------- To
>> unsubscribe e-mail: user-unsubscribe@spark.apache.org
>
> --
>
> --
> Daniel de Oliveira Mantovani
> Perl Evangelist/Data Hacker
> +1 786 459 1341
>
>

-- 
Regards,
Arbab Khalil
Software Design Engineer
