nutch-dev mailing list archives

From Marko Bauhardt <...@101tec.com>
Subject Re: Eclipse Nutch1.0 IOException
Date Fri, 29 May 2009 13:51:37 GMT
hi georg,
maybe an OutOfMemoryError, it is possible that the default memory size is
too low? take a look at the logfile, i think hadoop.log, for more details.
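
for example (just a sketch, assuming you start org.apache.nutch.crawl.Crawl
from an eclipse run configuration, the value is only a placeholder) you could
raise the heap under Run Configurations > Arguments > VM arguments:

  -Xmx1024m

and then watch the log while the job runs to see the real cause:

  tail -f logs/hadoop.log

(the hadoop.log location depends on your log4j setup, logs/hadoop.log under
the working directory is only the usual default in a nutch checkout)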

marko



On May 29, 2009, at 3:34 PM, Georg Kirschner wrote:

> Hi,
>
> I tried to hook up Eclipse and Nutch but got following error:
>
> "crawl started in: crawl
> rootUrlDir = urls
> threads = 10
> depth = 3
> topN = 50
> Injector: starting
> Injector: crawlDb: crawl/crawldb
> Injector: urlDir: urls
> Injector: Converting injected urls to crawl db entries.
> Exception in thread "main" java.io.IOException: Job failed!
> 	at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1232)
> 	at org.apache.nutch.crawl.Injector.inject(Injector.java:160)
> 	at org.apache.nutch.crawl.Crawl.main(Crawl.java:113)"
>
> Can anyone give me a clue here?
>
> Nutch 1.0
> Hadoop 0.19.1
> Eclipse Ganymede
> JDK 1.6.0_13
>
> Thanks
>
> Georg
>

