mahout-user mailing list archives

From Andrew Musselman <andrew.mussel...@gmail.com>
Subject Re: Can any one help
Date Tue, 08 Apr 2014 17:27:39 GMT
Looks like your Hadoop DFS is not operating correctly; "could only be replicated to 0 nodes, instead of 1" generally means the namenode has no live datanodes registered. You might need to stop
all your Hadoop services, format your namenode, and then restart.

E.g., https://www.google.com/search?q=could+only+be+replicated+to+0+nodes
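For a single-node Hadoop 1.x install like yours, the stop/format/restart cycle usually looks something like the sketch below (this assumes the stock scripts on your PATH or in $HADOOP_HOME/bin; adjust paths to your setup, and note that formatting erases everything in HDFS):

```shell
# Stop all Hadoop daemons (Hadoop 1.x control script)
stop-all.sh

# Verify nothing is left running; ideally only Jps itself is listed
jps

# Reformat the namenode -- WARNING: this destroys all data stored in HDFS
hadoop namenode -format

# Restart the daemons, then confirm at least one live datanode is registered
start-all.sh
hadoop dfsadmin -report
```

If `hadoop dfsadmin -report` still shows zero live datanodes after a restart, check the datanode log for a namespaceID mismatch; in that case a common fix is to clear the datanode's data directory (the location set by `dfs.data.dir` in hdfs-site.xml) before restarting.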


On Tue, Apr 8, 2014 at 10:23 AM, Neetha <netasusan@gmail.com> wrote:

> Hi,
>
>
> I am trying to run Mahout k-means clustering on Hadoop, but I am getting
> this error:
>
>
> hduser3@ubuntu:/usr/local/hadoop-1.0.1/mahout3$ bin/mahout seqdirectory -i mahout-work/reuters-out -o mahout-work/reuters-out-seqdir -c UTF-8 -chunk 5
> Warning: $HADOOP_HOME is deprecated.
>
> Running on hadoop, using /usr/local/hadoop-1.0.1/bin/hadoop and
> HADOOP_CONF_DIR=
> MAHOUT-JOB: /usr/local/hadoop-1.0.1/mahout3/examples/target/mahout-examples-0.7-job.jar
> Warning: $HADOOP_HOME is deprecated.
>
> 14/04/07 12:10:14 INFO common.AbstractJob: Command line arguments: {--charset=[UTF-8], --chunkSize=[5], --endPhase=[2147483647], --fileFilterClass=[org.apache.mahout.text.PrefixAdditionFilter], --input=[mahout-work/reuters-out], --keyPrefix=[], --output=[mahout-work/reuters-out-seqdir], --startPhase=[0], --tempDir=[temp]}
> 14/04/07 12:10:15 WARN hdfs.DFSClient: DataStreamer Exception:
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hduser3/mahout-work/reuters-out-seqdir/chunk-0 could only be replicated to 0 nodes, instead of 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1556)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1066)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at $Proxy1.addBlock(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>     at $Proxy1.addBlock(Unknown Source)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)
>
> 14/04/07 12:10:15 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
> 14/04/07 12:10:15 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/hduser3/mahout-work/reuters-out-seqdir/chunk-0" - Aborting...
> Apr 7, 2014 12:10:15 PM com.google.common.io.Closeables close
> WARNING: IOException thrown while closing Closeable.
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hduser3/mahout-work/reuters-out-seqdir/chunk-0 could only be replicated to 0 nodes, instead of 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1556)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1066)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at $Proxy1.addBlock(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>     at $Proxy1.addBlock(Unknown Source)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)
> 14/04/07 12:10:15 INFO driver.MahoutDriver: Program took 781 ms (Minutes: 0.013016666666666666)
> 14/04/07 12:10:15 ERROR hdfs.DFSClient: Exception closing file /user/hduser3/mahout-work/reuters-out-seqdir/chunk-0 :
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hduser3/mahout-work/reuters-out-seqdir/chunk-0 could only be replicated to 0 nodes, instead of 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1556)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hduser3/mahout-work/reuters-out-seqdir/chunk-0 could only be replicated to 0 nodes, instead of 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1556)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1066)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at $Proxy1.addBlock(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>     at $Proxy1.addBlock(Unknown Source)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)
> hduser3@ubuntu:/usr/local/hadoop-1.0.1/mahout3$ bin/mahout seqdirectory -i mahout-work/reuters-out -o mahout-work/reuters-out-seqdir -c UTF-8 -chunk 5
> Warning: $HADOOP_HOME is deprecated.
>
> Running on hadoop, using /usr/local/hadoop-1.0.1/bin/hadoop and
> HADOOP_CONF_DIR=
> MAHOUT-JOB: /usr/local/hadoop-1.0.1/mahout3/examples/target/mahout-examples-0.7-job.jar
> Warning: $HADOOP_HOME is deprecated.
>
> 14/04/07 12:11:56 INFO common.AbstractJob: Command line arguments: {--charset=[UTF-8], --chunkSize=[5], --endPhase=[2147483647], --fileFilterClass=[org.apache.mahout.text.PrefixAdditionFilter], --input=[mahout-work/reuters-out], --keyPrefix=[], --output=[mahout-work/reuters-out-seqdir], --startPhase=[0], --tempDir=[temp]}
> Exception in thread "main" org.apache.hadoop.ipc.RemoteException: java.io.IOException: failed to create file /user/hduser3/mahout-work/reuters-out-seqdir/chunk-0 on client 127.0.0.1.
> Requested replication 0 is less than the required minimum 1
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1249)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1186)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:628)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>
>     at org.apache.hadoop.ipc.Client.call(Client.java:1066)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at $Proxy1.create(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>     at $Proxy1.create(Unknown Source)
>     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:3245)
>     at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:713)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:182)
>     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:555)
>     at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:892)
>     at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:880)
>     at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:872)
>     at org.apache.mahout.utils.io.ChunkedWriter.<init>(ChunkedWriter.java:48)
>     at org.apache.mahout.text.SequenceFilesFromDirectory.run(SequenceFilesFromDirectory.java:79)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>     at org.apache.mahout.text.SequenceFilesFromDirectory.main(SequenceFilesFromDirectory.java:53)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>     at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>     at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:616)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> hduser3@ubuntu:/usr/local/hadoop-1.0.1/mahout3$
>
> Could you please help me solve this problem?
>
>
> Thanking you,
> Neetha Susan Thampi
>
