spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-19593) Records read per each kinesis transaction
Date Fri, 17 Feb 2017 15:18:41 GMT

     [ https://issues.apache.org/jira/browse/SPARK-19593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-19593.
-------------------------------
    Resolution: Invalid

> Records read per each kinesis transaction
> -----------------------------------------
>
>                 Key: SPARK-19593
>                 URL: https://issues.apache.org/jira/browse/SPARK-19593
>             Project: Spark
>          Issue Type: Question
>          Components: DStreams
>    Affects Versions: 2.0.1
>            Reporter: Sarath Chandra Jiguru
>            Priority: Trivial
>
> The question is about the Spark Streaming + Kinesis integration.
> Is there a way to provide a Kinesis consumer configuration, e.g., the number of records read per transaction?
> Right now, even though I am entitled to read 2.8 GB/minute, I am restricted to roughly 100 MB/minute, because I am not able to increase the default number of records read in each transaction.
> I have raised a question on Stack Overflow as well; please take a look:
> http://stackoverflow.com/questions/42107037/how-to-alter-kinesis-consumer-properties-in-spark-streaming
> Kinesis stream setup:
> open shards: 24
> write rate: 440K/minute
> read rate: 1.42M/minute
> read byte rate: 85 MB/minute. I am allowed to read around 2.8 GB/minute (24 shards * 2 MB/s * 60 seconds)
> Reference: http://docs.aws.amazon.com/streams/latest/dev/kinesis-record-processor-additional-considerations.html
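
For context, here is a minimal sketch of the configuration surface the reporter is describing, assuming Spark 2.0.x's spark-streaming-kinesis-asl module. KinesisUtils.createStream exposes the initial stream position, checkpoint interval, and storage level, but no parameter that maps to the KCL's maxRecords setting (the number of records fetched per GetRecords transaction). Stream and application names below are illustrative.

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kinesis.KinesisUtils
    import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream

    // sc is an existing SparkContext, assumed in scope.
    val ssc = new StreamingContext(sc, Seconds(10))
    val stream = KinesisUtils.createStream(
      ssc,
      "my-kinesis-app",                           // KCL application name (illustrative)
      "my-stream",                                // Kinesis stream name (illustrative)
      "https://kinesis.us-east-1.amazonaws.com",  // endpoint URL
      "us-east-1",                                // region
      InitialPositionInStream.LATEST,
      Seconds(10),                                // checkpoint interval
      StorageLevel.MEMORY_AND_DISK_2)
    // None of these parameters corresponds to the KCL's maxRecords,
    // which is what bounds records read per GetRecords call.

The KCL itself does expose that knob; a consumer built directly on the KCL could set it like this (again, names are illustrative; the GetRecords API caps maxRecords at 10,000 records per call):

    import com.amazonaws.auth.DefaultAWSCredentialsProviderChain
    import com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisClientLibConfiguration

    val kclConfig = new KinesisClientLibConfiguration(
        "my-kinesis-app",                         // application name (illustrative)
        "my-stream",                              // stream name (illustrative)
        new DefaultAWSCredentialsProviderChain(),
        "worker-1")                               // worker id (illustrative)
      .withMaxRecords(10000)                      // records per GetRecords transaction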





