db-derby-user mailing list archives

From Daniel John Debrunner <...@debrunners.com>
Subject Re: Length of binary streams must match length parameter(?)
Date Tue, 05 Oct 2004 01:12:18 GMT

Ralph Richard Cook wrote:

> I'm not sure if this is a bug or proper behavior, but here goes...
> For performance reasons, I'd like to reuse a byte array when inserting
> into a derby database. I'm inserting with a PreparedStatement, making a new
> ByteArrayInputStream around a reused byte[] buffer, depending on the
> parameter of the PreparedStatement.setBinaryStream method to say how many
> bytes to use out of the stream. However, if the length parameter doesn't
> match the actual length of the buffer used in the ByteArrayInputStream then
> derby complains, loudly. Here is a small sample program to show what I'm
> doing, followed by the Exception I get from it. Line 45 is the third call to
> PreparedStatement.execute.
> All that's needed to compile and run this in the classpath is derby.jar.
> Is this an actual bug, or are these two numbers supposed to match in all
> JDBC implementations? If they are supposed to match, why bother with the
> length parameter at all?

I thought I had covered this in the JDBC implementation notes document,
but I didn't.

Look at the overview of that document for details on the JDBC spec and
how definitions can sometimes be found in one of three places. In this
specific case the [TUTORIAL3] book has additional comments for the
PreparedStatement.setXXXStream() methods, indicating that the length
parameter must match the contents of the stream. I think the Derby code
has comments indicating this as well.
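If the length must match the stream's contents, the byte array can still be
reused: wrap only the valid region of the buffer using the three-argument
ByteArrayInputStream constructor, so the stream holds exactly as many bytes
as the length passed to setBinaryStream. A minimal sketch (the JDBC call
itself is shown as a comment since it needs a live connection; the names
buffer and bytesThisRow are just for illustration):

```java
import java.io.ByteArrayInputStream;

public class ReuseBuffer {
    public static void main(String[] args) {
        byte[] buffer = new byte[8192]; // reused across inserts
        int bytesThisRow = 100;         // actual payload size for this row

        // Wrap only the first bytesThisRow bytes, so the stream's
        // content length equals the length argument given to JDBC.
        ByteArrayInputStream in =
            new ByteArrayInputStream(buffer, 0, bytesThisRow);

        // With a PreparedStatement ps, this would then be:
        // ps.setBinaryStream(1, in, bytesThisRow);

        System.out.println(in.available()); // 100, not 8192
    }
}
```

With this, both numbers match on every insert regardless of how large the
reused buffer is.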

As for why that is the case, I don't know; that would be best asked of Sun.



