beam-commits mailing list archives

From "Etienne Chauchot (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (BEAM-2993) AvroIO.write without specifying a schema
Date Fri, 29 Sep 2017 08:40:00 GMT

    [ https://issues.apache.org/jira/browse/BEAM-2993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16185511#comment-16185511 ]

Etienne Chauchot commented on BEAM-2993:
----------------------------------------

Well, I have a use case, but it is not a very common one :) The collections use a lazy Avro
coder whose responsibility is to determine the schema when the pipeline runs and then delegate
to {{AvroCoder}} (a sketch of the idea follows after the stack trace below). What it lacks is
a write method signature with no schema parameter; that is why I planned to use
{{AvroIO.writeCustomTypeToGenericRecords()}}. Please take a look at this code:
https://github.com/echauchot/beam/blob/69928dd59cdaf259e39f047735bd33a600673999/sdks/java/core/src/test/java/org/apache/beam/sdk/io/AvroIOTransformTest.java#L393.
It simply tests {{AvroIO.writeCustomTypeToGenericRecords()}} with a side input that contains
the schema, extracted with a {{ParDo}}. Besides, this code is not finished yet because it
suffers from a serialization problem in the write with sharding:
{code}
java.lang.IllegalArgumentException: unable to serialize org.apache.beam.sdk.io.WriteFiles$ApplyShardingKey@7a3793c7

	at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:53)
	at org.apache.beam.sdk.util.SerializableUtils.clone(SerializableUtils.java:90)
	at org.apache.beam.sdk.transforms.ParDo$SingleOutput.<init>(ParDo.java:592)
	at org.apache.beam.sdk.transforms.ParDo.of(ParDo.java:435)
	at org.apache.beam.sdk.io.WriteFiles.createWrite(WriteFiles.java:730)
	at org.apache.beam.sdk.io.WriteFiles.expand(WriteFiles.java:194)
	at org.apache.beam.sdk.io.WriteFiles.expand(WriteFiles.java:109)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:525)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:479)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:297)
	at org.apache.beam.sdk.io.AvroIO$TypedWrite.expand(AvroIO.java:1177)
	at org.apache.beam.sdk.io.AvroIO$TypedWrite.expand(AvroIO.java:842)
	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:525)
	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:460)
	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:284)
	at org.apache.beam.sdk.io.AvroIOTransformTest$AvroIOWriteTransformTest.testWriteGenericRecords(AvroIOTransformTest.java:417)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:327)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.junit.runners.Suite.runChild(Suite.java:128)
	at org.junit.runners.Suite.runChild(Suite.java:27)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
	at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
	at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
	at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
	at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
Caused by: java.io.NotSerializableException: org.apache.beam.sdk.io.AvroIOTransformTest$AvroIOWriteTransformTest
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
	at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
	at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:49)
	... 48 more
{code}
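
For reference, here is a minimal sketch of the lazy-coder idea mentioned above. The class name and the schema-prefixing strategy are only illustrative; the actual code lives in the branch linked above.
{code}
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.coders.AvroCoder;
import org.apache.beam.sdk.coders.CustomCoder;
import org.apache.beam.sdk.coders.StringUtf8Coder;

/**
 * Illustrative "lazy" Avro coder: the schema is unknown at pipeline construction
 * time, so it is taken from each GenericRecord at run time and the actual
 * encoding is delegated to AvroCoder.
 */
public class LazyGenericRecordCoder extends CustomCoder<GenericRecord> {

  @Override
  public void encode(GenericRecord value, OutputStream outStream) throws IOException {
    Schema schema = value.getSchema();
    // Prefix the payload with the schema so decode() can rebuild the delegate coder.
    StringUtf8Coder.of().encode(schema.toString(), outStream);
    AvroCoder.of(schema).encode(value, outStream);
  }

  @Override
  public GenericRecord decode(InputStream inStream) throws IOException {
    Schema schema = new Schema.Parser().parse(StringUtf8Coder.of().decode(inStream));
    return AvroCoder.of(schema).decode(inStream);
  }
}
{code}
Writing the schema next to every record like this is obviously wasteful; a real implementation would cache or register schemas. The sketch only shows where the schema becomes available: at run time, from the records themselves, which is why a schema-less write signature would be useful.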


> AvroIO.write without specifying a schema
> ----------------------------------------
>
>                 Key: BEAM-2993
>                 URL: https://issues.apache.org/jira/browse/BEAM-2993
>             Project: Beam
>          Issue Type: Improvement
>          Components: sdk-java-extensions
>            Reporter: Etienne Chauchot
>            Assignee: Etienne Chauchot
>
> Similarly to https://issues.apache.org/jira/browse/BEAM-2677, we should be able to write
> to Avro files using {{AvroIO}} without specifying a schema at build time. Consider the following
> use case: a user has a {{PCollection<GenericRecord>}} but the schema is only known
> while running the pipeline. {{AvroIO.writeGenericRecords}} needs the schema, but the schema
> is already available in {{GenericRecord}}. We should be able to call {{AvroIO.writeGenericRecords()}}
> with no schema.
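
For context, a minimal illustration of the current API for this use case (the helper method below is only for illustration):
{code}
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.io.AvroIO;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PDone;

class CurrentApiExample {
  // Today the schema must be supplied when the pipeline is built, even though
  // every GenericRecord in the collection already carries its own schema.
  static PDone write(PCollection<GenericRecord> records, Schema schema, String outputPrefix) {
    return records.apply(AvroIO.writeGenericRecords(schema).to(outputPrefix));
  }
}
{code}
The proposal is to drop the {{Schema}} parameter and let the transform obtain the schema from the records at run time.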



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
