commons-dev mailing list archives

From Bernd Eckenfels <e...@zusammenkunft.net>
Subject Re: [VFS] HDFS failures on Windows
Date Wed, 18 Jun 2014 01:22:17 GMT
On Tue, 17 Jun 2014 20:26:11 -0400,
Gary Gregory <garydgregory@gmail.com> wrote:

> The build still breaks on Windows. Can you fix it please?

Sure, I am working on it; it is tracked as VFS-529.


> > I wanted to reproduce your problem, but had problems with the line
> > numbers in the stack trace. Can you check why you have different
> > ones? When I check it on my system, the line numbers match the 1.2.1
> > source.

That was actually caused by a local modification on my side (I had
switched to a newer Hadoop to see if it helps), so that's why the lines
did not match. But both versions have basically the same problem. In
2.x there seems to be better Windows support, but with specific setup
requirements, so it might be required to disable it there as well
(VFS-530).
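For VFS-530, the build-side change could look roughly like the
following pom.xml sketch: a profile that only activates on non-Windows
hosts. The profile id and property name here are illustrative, not
taken from the actual VFS build:

```xml
<!-- Illustrative sketch, not the actual VFS pom.xml: activate the HDFS
     tests only on non-Windows hosts. The profile id and the property
     name are hypothetical. -->
<profile>
  <id>hdfs-tests-unix</id>
  <activation>
    <os>
      <!-- Maven's OS activation supports negation with "!" -->
      <family>!windows</family>
    </os>
  </activation>
  <properties>
    <!-- a flag the Surefire configuration could consult -->
    <hdfs.tests.skip>false</hdfs.tests.skip>
  </properties>
</profile>
```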

Regards
Bernd



> > And if I actually disable stack-trace trimming in Surefire
> > (committed), it actually prints a helpful error:
> >
> > ...
> > Caused by: java.lang.RuntimeException: Error while running command
> > to get file permissions: java.io.IOException: Cannot run program
> > "ls": CreateProcess error=2, Das System kann die angegebene Datei
> > nicht finden ("The system cannot find the file specified")
> >         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
> >         at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
> >         at org.apache.hadoop.util.Shell.run(Shell.java:182)
> >         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
> >         at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
> >         at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
> >         at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:712)
> >         at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:448)
> >         at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getPermission(RawLocalFileSystem.java:423)
> >         at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:146)
> >         at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:162)
> >         at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1704)
> >         at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1651)
> >         at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1626)
> >         at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:421)
> >         at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:284)
> >         at org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest.<clinit>(HdfsFileProviderTest.java:95)
> > ...
> >
> > And it actually means that "ls.exe" is not on the PATH. So yes,
> > the test does not work on all Windows systems; it requires at least
> > an ls.exe. I will remove the automatic running of those tests on the
> > Windows platform (again), but with a better-named profile.
> >
> > As a quick fix it should be enough to add any ls.exe to the PATH;
> > in my case it was the portable Git distribution (from GitHub):
> >
> >
> > %LOCALAPPDATA%\GitHub\PortableGit_015aa71ef18c047ce8509ffb2f9e4bb0e3e73f13\bin\ls.exe
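To show concretely what Hadoop is tripping over, here is a small
standalone probe (a sketch, not Hadoop or VFS code) that spawns "ls"
the same way RawLocalFileSystem's loadPermissionInfo does. On a Windows
box without an ls.exe on the PATH, the spawn fails with an IOException
wrapping the same "CreateProcess error=2":

```java
// Standalone sketch (not Hadoop/VFS code): probe whether "ls" can be
// spawned, mirroring how Hadoop's RawLocalFileSystem shells out to
// "ls" to read file permissions.
public class LsProbe {

    static boolean canRunLs() {
        try {
            // Hadoop runs roughly "ls -ld <path>" via a child process.
            Process p = new ProcessBuilder("ls", "-ld", ".").start();
            return p.waitFor() == 0;
        } catch (java.io.IOException | InterruptedException e) {
            // On Windows without ls.exe on the PATH, start() throws an
            // IOException wrapping "CreateProcess error=2".
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(canRunLs()
                ? "ls is available"
                : "ls is missing from the PATH");
    }
}
```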
> >
> > Regards
> > Bernd
> >
> >
> > On Tue, 10 Jun 2014 11:02:19 -0400, Gary Gregory
> > <garydgregory@gmail.com> wrote:
> >
> > > Whoa... Cygwin? I have that installed, but it does not help.
> > >
> > > How about this: can you please turn off HDFS testing on Windows,
> > > like it was before.
> > >
> > > I'll be happy to test patches for you on my set up, Windows 7
> > > Professional 64-bit Service Pack 1.
> > >
> > > My understanding is that the HDFS jars we use do not run on
> > > Windows out of the box, because they rely on calling OS commands
> > > that are *nix-specific.
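A minimal sketch of the platform check such a test guard could use
(illustrative only, not the actual VFS test code; a real test would
typically feed this into JUnit's Assume to skip the tests):

```java
// Illustrative sketch, not the actual VFS test code: detect Windows so
// that tests which depend on Unix tools such as "ls" can be skipped.
public class OsCheck {

    static boolean isWindows() {
        // Standard JVM system property; contains "Windows" on Windows.
        return System.getProperty("os.name").toLowerCase().contains("windows");
    }

    public static void main(String[] args) {
        if (isWindows()) {
            System.out.println("skipping HDFS tests: they need Unix tools such as ls");
        } else {
            System.out.println("running HDFS tests");
        }
    }
}
```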
> > >
> > > Gary
> > >
> > >
> > > On Tue, Jun 10, 2014 at 9:45 AM, Bernd Eckenfels
> > > <ecki@zusammenkunft.net> wrote:
> > >
> > > > Hello,
> > > >
> > > > They do work for me, hm. Windows 7 x64, German locale. I will
> > > > try some other environments. Maybe it picks up some Cygwin
> > > > stuff or something on my system?
> > > >
> > > > Regards
> > > > Bernd
> > > >
> > > >
> > > > On Mon, 9 Jun 2014 18:05:18 -0400,
> > > > Gary Gregory <garydgregory@gmail.com> wrote:
> > > >
> > > > > Ecki enabled the HDFS tests on Windows but they sure fail for
> > > > > me, see below.
> > > > >
> > > > > Do they work for anyone else on Windows?
> > > > >
> > > > > Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time
> > > > > elapsed: 3.681 sec <<< FAILURE! - in
> > > > > org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest
> > > > > org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest
> > > > > Time elapsed: 3.68 sec  <<< ERROR!
> > > > > java.lang.ExceptionInInitializerError: null
> > > > >         at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:468)
> > > > >         at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getPermission(RawLocalFileSystem.java:418)
> > > > >         at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:146)
> > > > >         at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:162)
> > > > >         at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1643)
> > > > >         at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1590)
> > > > >         at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1565)
> > > > >         at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:421)
> > > > >         at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:284)
> > > > >         at org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest.<clinit>(HdfsFileProviderTest.java:95)
> > > > >
> > > > > org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest
> > > > > Time elapsed: 3.681 sec  <<< ERROR!
> > > > > java.lang.NoClassDefFoundError: Could not initialize class
> > > > > org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest
> > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> > > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > > >         at java.lang.reflect.Method.invoke(Method.java:606)
> > > > >         at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > > > >         at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > > > >         at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > > > >         at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
> > > > >         at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > > >         at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
> > > > >         at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> > > > >         at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> > > > >         at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> > > > >         at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> > > > >         at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
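A side note on the two errors above: the second NoClassDefFoundError is
just a follow-on of the first. When a class's static initializer fails,
the first use of the class throws ExceptionInInitializerError and every
later use throws NoClassDefFoundError ("Could not initialize class").
A minimal demo of that JVM behavior:

```java
// Minimal demo of the error pair in the report: a failing static
// initializer yields ExceptionInInitializerError on first use of the
// class and NoClassDefFoundError on every later use.
public class InitFailureDemo {

    static class Broken {
        // "if (true)" keeps javac from rejecting the unconditional throw.
        static { if (true) throw new RuntimeException("boom"); }
        static void use() { }
    }

    static String failureName() {
        try {
            Broken.use();
            return "none";
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(failureName()); // first access fails in <clinit>
        System.out.println(failureName()); // class is now marked erroneous
    }
}
```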
> > > > >
> > > > > Gary
> > > > >
> > > > >
> > > >
> > > >
> > > > ---------------------------------------------------------------------
> > > > To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
> > > > For additional commands, e-mail: dev-help@commons.apache.org
> > > >
> > > >
> > >
> > >
> >
> >
> >
> >
> 
> 



