commons-dev mailing list archives

From Bernd Eckenfels <e...@zusammenkunft.net>
Subject Re: [VFS] HDFS failures on Windows
Date Mon, 30 Jun 2014 20:00:20 GMT
Hello,

I have committed the (new) no-hdfs profile; can you check whether you
still see problems, Gary?

Unfortunately, I think it is not possible to disable this profile on
the command line (to test the case where the needed binaries have been
installed by hand).
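
For illustration only, an OS-activated profile along these lines would
skip the HDFS tests whenever the build runs on Windows. This is a rough
sketch, not necessarily the committed version; the exclude pattern is
just an assumption about where the HDFS provider tests live:

    <profile>
      <id>no-hdfs</id>
      <activation>
        <os>
          <family>windows</family>
        </os>
      </activation>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <configuration>
              <excludes>
                <!-- assumption: the HDFS provider tests sit under a
                     "hdfs" package, e.g. HdfsFileProviderTest -->
                <exclude>**/hdfs/**/*Test*.java</exclude>
              </excludes>
            </configuration>
          </plugin>
        </plugins>
      </build>
    </profile>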

Regards,
Bernd

On Tue, 17 Jun 2014 22:22:18 -0400,
Gary Gregory <garydgregory@gmail.com> wrote:

> On Tue, Jun 17, 2014 at 9:27 PM, Bernd Eckenfels
> <ecki@zusammenkunft.net> wrote:
> 
> > On Tue, 17 Jun 2014 20:26:11 -0400,
> > Gary Gregory <garydgregory@gmail.com> wrote:
> >
> > > The build still breaks on Windows. Can you fix it please?
> >
> > Sure, it is tracked under VFS-529 and I am on it.
> >
> >
> > > > I wanted to reproduce your problem, but had problems with the
> > > > line numbers in the stack trace. Can you check why you have
> > > > different ones? When I check it on my system the line numbers
> > > > match the 1.2.1 source.
> >
> > That was actually caused by a local modification on my side
> > (I switched to a newer Hadoop to see if it helps), so that's why the
> > lines did not match. But both versions have basically the same
> > problem. In 2.x there seems to be some better Windows support, but
> > with specific setup requirements. So I will see if they can be
> > provided or avoided, or if it would need the auto-disable as well.
> >
> 
> I'll watch for the commits then.
> 
> Thank you,
> Gary
> 
> 
> > Regards,
> > Bernd
> >
> >
> >
> > > > And if I actually disable stack-trace trimming (committed)
> > > > in Surefire, it actually prints a helpful error:
> > > >
> > > > ...
> > > > Caused by: java.lang.RuntimeException: Error while running
> > > > command to get file permissions : java.io.IOException: Cannot
> > > > run program "ls": CreateProcess error=2, Das System kann die
> > > > angegebene Datei nicht finden
> > > >     at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
> > > >     at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
> > > >     at org.apache.hadoop.util.Shell.run(Shell.java:182)
> > > >     at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
> > > >     at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
> > > >     at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
> > > >     at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:712)
> > > >     at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:448)
> > > >     at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getPermission(RawLocalFileSystem.java:423)
> > > >     at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:146)
> > > >     at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:162)
> > > >     at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1704)
> > > >     at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1651)
> > > >     at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1626)
> > > >     at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:421)
> > > >     at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:284)
> > > >     at org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest.<clinit>(HdfsFileProviderTest.java:95)
> > > > ...
> > > >
> > > > And it actually means that "ls.exe" is not in the PATH. So yes,
> > > > the test does not work on all Windows systems; it requires at
> > > > least an ls.exe. I will remove the automatic running of those
> > > > tests on the Windows platform (again), but with a better-named
> > > > profile.
> > > >
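> > > > (For reference, the trimming switch meant here is Surefire's
> > > > trimStackTrace option. A minimal sketch of the plugin
> > > > configuration, not necessarily the exact committed change:
> > > >
> > > >     <plugin>
> > > >       <groupId>org.apache.maven.plugins</groupId>
> > > >       <artifactId>maven-surefire-plugin</artifactId>
> > > >       <configuration>
> > > >         <!-- keep full stack traces in the test output -->
> > > >         <trimStackTrace>false</trimStackTrace>
> > > >       </configuration>
> > > >     </plugin>
> > > >
> > > > The same can be done for a single run with
> > > > -DtrimStackTrace=false on the command line.)
> > > >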
> > > > As a quick fix it should be enough to have any ls.exe on the
> > > > PATH; in my case it came from the portable Git distribution
> > > > (from GitHub):
> > > >
> > > > %LOCALAPPDATA%\GitHub\PortableGit_015aa71ef18c047ce8509ffb2f9e4bb0e3e73f13\bin\ls.exe
> > > >
> > > > Regards,
> > > > Bernd
> > > >
> > > > On Tue, 10 Jun 2014 11:02:19 -0400, Gary Gregory
> > > > <garydgregory@gmail.com> wrote:
> > > >
> > > > > Whoa... Cygwin? I have that installed, but it does not help.
> > > > >
> > > > > How about this: can you please turn off HDFS testing on
> > > > > Windows, like it was before?
> > > > >
> > > > > I'll be happy to test patches for you on my setup: Windows 7
> > > > > Professional 64-bit, Service Pack 1.
> > > > >
> > > > > My understanding is that the HDFS jars we use do not run on
> > > > > Windows out of the box because they rely on calling OS
> > > > > commands that are *nix-specific.
> > > > >
> > > > > Gary
> > > > >
> > > > >
> > > > > On Tue, Jun 10, 2014 at 9:45 AM, Bernd Eckenfels
> > > > > <ecki@zusammenkunft.net> wrote:
> > > > >
> > > > > > Hello,
> > > > > >
> > > > > > They do work for me, hm. Windows 7 x64 (German). I will try
> > > > > > some other environments. Maybe it picks up some Cygwin stuff
> > > > > > or something on my system?
> > > > > >
> > > > > > Regards,
> > > > > > Bernd
> >
> 
> 


---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
For additional commands, e-mail: dev-help@commons.apache.org

