spark-user mailing list archives

From David Robison <david.robi...@psgglobal.net>
Subject creating a javaRDD using newAPIHadoopFile and FixedLengthInputFormat
Date Tue, 15 Nov 2016 13:44:40 GMT
I am trying to create a Spark JavaRDD using newAPIHadoopFile and the FixedLengthInputFormat.
Here is my code snippet:

// Configure FixedLengthInputFormat so each record is JPEG_INDEX_SIZE bytes
Configuration config = new Configuration();
config.setInt(FixedLengthInputFormat.FIXED_RECORD_LENGTH, JPEG_INDEX_SIZE);
config.set("fs.hdfs.impl", DistributedFileSystem.class.getName());

// Glob over the index files under the default filesystem
String fileFilter = config.get("fs.defaultFS") + "/A/B/C/*.idx";

// Each fixed-length record is keyed by its byte offset within the file
JavaPairRDD<LongWritable, BytesWritable> inputRDD = sparkContext.newAPIHadoopFile(fileFilter,
        FixedLengthInputFormat.class, LongWritable.class, BytesWritable.class, config);
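
In case it helps, this is roughly how I intend to consume the records afterwards (just a
sketch; the variable names are placeholders, and I copy the bytes because I understand the
Hadoop record reader may reuse the same Writable instances):

// Turn each fixed-length record into its own byte[] (copying out of the
// possibly reused BytesWritable before doing anything else with it)
JavaRDD<byte[]> records = inputRDD.map(tuple -> tuple._2().copyBytes());
long recordCount = records.count();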

When I run this, I get the following exception:

Error executing mapreduce job: com.fasterxml.jackson.databind.JsonMappingException: Infinite recursion (StackOverflowError)

Any idea what I am doing wrong? I am new to Spark.

David

David R Robison
Senior Systems Engineer
O. +1 512 247 3700
M. +1 757 286 0022
david.robison@psgglobal.net
www.psgglobal.net

Prometheus Security Group Global, Inc.
3019 Alvin Devane Boulevard
Building 4, Suite 450
Austin, TX 78741
