jclouds-notifications mailing list archives

From "ASF subversion and git services (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (JCLOUDS-1366) OutOfMemory when InputStream referencing to big file is used as payload
Date Fri, 04 Jan 2019 23:45:00 GMT

    [ https://issues.apache.org/jira/browse/JCLOUDS-1366?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16734682#comment-16734682 ]

ASF subversion and git services commented on JCLOUDS-1366:
----------------------------------------------------------

Commit 1f392212bd67d9c51fa117ab3f6f9318e455ed62 in jclouds's branch refs/heads/2.1.x from Andrew Gaul
[ https://git-wip-us.apache.org/repos/asf?p=jclouds.git;h=1f39221 ]

JCLOUDS-1366: JCLOUDS-1472: Fix InputStream MPU

Previously jclouds attempted to slice non-repeatable InputStream
Payloads in order to upload parts sequentially.  This never worked:
skip and close mutate the single underlying stream.  Also backfill a
test which spuriously succeeded.
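The failure mode the commit message describes can be reproduced with a plain-JDK sketch (illustrative code, not jclouds internals): "slicing" one non-repeatable stream by calling skip() does not rewind to an absolute offset, because skip() advances from the stream's current position, which earlier reads have already moved.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamSliceDemo {
    public static void main(String[] args) throws IOException {
        byte[] data = "0123456789".getBytes();
        InputStream stream = new ByteArrayInputStream(data);

        // "Part" 1: read the first 5 bytes; this advances the stream to offset 5.
        byte[] part1 = new byte[5];
        stream.read(part1);

        // Attempt to start "part" 2 at offset 5 by skipping. But skip()
        // is relative to the current position, so the stream lands at
        // offset 10 -- the end -- and the second part reads nothing.
        stream.skip(5);
        int next = stream.read();
        System.out.println(next); // -1: stream exhausted, part 2 is empty
    }
}
```

This is why sequential multipart upload over a single shared InputStream could never produce correct parts.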


> OutOfMemory when InputStream referencing to big file is used as payload
> -----------------------------------------------------------------------
>
>                 Key: JCLOUDS-1366
>                 URL: https://issues.apache.org/jira/browse/JCLOUDS-1366
>             Project: jclouds
>          Issue Type: Bug
>          Components: jclouds-blobstore
>    Affects Versions: 2.0.0, 2.0.3
>         Environment: Linux and Windows
>            Reporter: Deyan
>            Priority: Critical
>
> If I use an InputStream whose source is a large file (say 3 GB), I get an OutOfMemoryError. This is with default JVM options.
> Here is the code I am using to construct the blob:
> {code:java}
>  File bigFile = new File(file);
>  try (InputStream inputStream = new FileInputStream(bigFile)) {
>      Blob blob = blobStore.blobBuilder(blobName)
>              .payload(inputStream)
>              .contentLength(bigFile.length())
>              .contentDisposition(blobName)
>              .contentType(MediaType.OCTET_STREAM)
>              .userMetadata(ImmutableMap.of("a", "b", "test", "beta"))
>              .build();
>      blobStore.putBlob("test", blob, multipart());
>  }
> {code}
> Stacktrace:
> {code:java}
> java.lang.OutOfMemoryError: Java heap space
> 	at org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.getNextPayload(BasePayloadSlicer.java:101)
> 	at org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.next(BasePayloadSlicer.java:90)
> 	at org.jclouds.io.internal.BasePayloadSlicer$InputStreamPayloadIterator.next(BasePayloadSlicer.java:63)
> 	at org.jclouds.blobstore.internal.BaseBlobStore.putMultipartBlob(BaseBlobStore.java:363)
> 	at org.jclouds.blobstore.internal.BaseBlobStore.putMultipartBlob(BaseBlobStore.java:349)
> 	at org.jclouds.s3.blobstore.S3BlobStore.putBlob(S3BlobStore.java:262)
> {code}
>  If 'bigFile' itself is used as the payload, the bug is not reproducible.
>  
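The reporter's workaround (passing the File rather than an InputStream) works because a file can be re-opened and positioned independently for each part. A minimal JDK-only sketch of that property (names here are illustrative, not the jclouds slicing API):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.RandomAccessFile;

public class FileSliceDemo {
    // Each "slice" opens its own handle and seeks to an absolute offset,
    // so parts can be produced in any order, any number of times --
    // unlike a single shared, non-repeatable InputStream.
    static byte[] readSlice(File f, long offset, int len) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(f, "r")) {
            raf.seek(offset);
            byte[] buf = new byte[len];
            raf.readFully(buf);
            return buf;
        }
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("demo", ".bin");
        tmp.deleteOnExit();
        try (FileOutputStream out = new FileOutputStream(tmp)) {
            out.write("0123456789".getBytes());
        }
        // Slices are independent: the second part can be read first.
        System.out.println(new String(readSlice(tmp, 5, 5))); // 56789
        System.out.println(new String(readSlice(tmp, 0, 5))); // 01234
    }
}
```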



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
