spark-user mailing list archives

From Steve Lewis <lordjoe2...@gmail.com>
Subject writing to local files on a worker
Date Sun, 11 Nov 2018 22:13:39 GMT
I have a problem where a critical step needs to be performed by a third-party
C++ application. I can ship or install this program on the worker nodes, and I
can construct a function holding all the data the program needs to process. The
problem is that the program is designed to read from and write to the local
file system. I can call the program from Java, read its output as a local file,
and then delete all temporary files, but I doubt it is possible to get the
program to read from HDFS or any other shared file system.

My question: can a function running on a worker node create temporary files
and pass their names to a local process, assuming everything is cleaned up
after the call?
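For what it's worth, a sketch of the pattern I have in mind (names and the
binary path are placeholders, not a working integration): the function would
run inside something like mapPartitions on the worker, write its input to a
temp file in the JVM's local temp directory, invoke the external program with
ProcessBuilder, read the output file back, and delete both files in a finally
block. Here the C++ tool is stood in for by /bin/cp so the sketch is runnable:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class LocalToolRunner {

    // Sketch: write input to a worker-local temp file, run a local process
    // on it, read the result file, and always clean up afterwards.
    public static String runTool(String input) throws IOException, InterruptedException {
        // Temp files land in java.io.tmpdir on the worker's local disk.
        Path in = Files.createTempFile("tool-in-", ".dat");
        Path out = Files.createTempFile("tool-out-", ".dat");
        try {
            Files.writeString(in, input);

            // "/bin/cp" stands in for the installed third-party C++ binary,
            // which would be invoked here with the temp file names as arguments.
            Process p = new ProcessBuilder("/bin/cp", in.toString(), out.toString())
                    .redirectErrorStream(true)
                    .start();
            if (p.waitFor() != 0) {
                throw new IOException("tool exited with code " + p.exitValue());
            }
            return Files.readString(out);
        } finally {
            // Clean up even if the process fails, so worker disks don't fill up.
            Files.deleteIfExists(in);
            Files.deleteIfExists(out);
        }
    }
}
```

The key assumption is that each task runs as an ordinary process on the worker
with normal local-filesystem access, so createTempFile and ProcessBuilder
behave exactly as they would in standalone Java.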

-- 
Steven M. Lewis PhD
4221 105th Ave NE
Kirkland, WA 98033
206-384-1340 (cell)
Skype lordjoe_com
