All you need is a client for the target REST service inside your Spark task. It could be as simple as an HttpClient. Most likely that client won't be serializable, in which case you initialize it lazily on the executors. There are useful examples in the Spark Knowledge Base GitBook that you can look at.
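A minimal sketch of the lazy-initialization pattern in PySpark flavour (in Scala you'd reach for a lazy val or mapPartitions instead). The service URL and the "city" field are hypothetical stand-ins, and urllib's opener stands in for whatever HTTP client you actually use; the point is only that the client is built on first use inside the worker, never serialized from the driver:

```python
import urllib.request

_client = None  # one per worker process, created on first use

def get_client():
    # Lazily build the (non-serializable) HTTP client on the executor.
    global _client
    if _client is None:
        _client = urllib.request.build_opener()
    return _client

def augment(record):
    # Intended to be called via rdd.map(augment). Only records with the
    # missing field trigger a REST call, so total call volume stays low.
    if record.get("city") is None:  # hypothetical missing field
        client = get_client()
        # A real job would do something like:
        # resp = client.open("http://rest.example.com/city?id=%s" % record["id"])
        # record["city"] = json.loads(resp.read())["city"]
    return record

# usage in the Spark job: augmented = rdd.map(augment)
```

With this shape the closure shipped to the executors contains only plain functions; the client object itself never crosses the wire.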

On Mar 31, 2015 1:48 PM, "Minnow Noir" <> wrote:
We have some data on Hadoop that needs to be augmented with data only available to us via a REST service. We're using Spark to search for, and correct, missing data. Even though there are a lot of records to scour for missing data, the total number of calls to the service is expected to be low, so it would be ideal to do the whole job in Spark as we scour the data.

I don't see anything obvious in the API or on Google relating to making REST calls from a Spark job.  Is it possible?