spark-user mailing list archives

From "Sesterhenn, Mike" <msesterh...@cars.com>
Subject Re: Time-unit of RDD.countApprox timeout parameter
Date Tue, 04 Oct 2016 21:19:12 GMT
It only exists in the latest docs, not in versions <= 1.6.

________________________________
From: Sean Owen <sowen@cloudera.com>
Sent: Tuesday, October 4, 2016 1:51:49 PM
To: Sesterhenn, Mike; user@spark.apache.org
Subject: Re: Time-unit of RDD.countApprox timeout parameter

The API docs already say: "maximum time to wait for the job, in milliseconds"

On Tue, Oct 4, 2016 at 7:14 PM Sesterhenn, Mike <msesterhenn@cars.com> wrote:

Never mind. Through testing it seems it is MILLISECONDS. This should be added to the docs.

________________________________
From: Sesterhenn, Mike <msesterhenn@cars.com>
Sent: Tuesday, October 4, 2016 1:02:25 PM
To: user@spark.apache.org
Subject: Time-unit of RDD.countApprox timeout parameter


Hi all,


Does anyone know what the unit is on the 'timeout' parameter to the RDD.countApprox() function?
(i.e., is it seconds, milliseconds, nanoseconds, ...?)


I was searching through the source but it got hairy pretty quickly.


Thanks....


