It's on the 1.6 branch
On Thu, May 26, 2016 at 4:43 PM Andrés Ivaldi <> wrote:
I see. I'm using Spark 1.6.0, and that change seems to be for 2.0, or maybe it's in 1.6.1, looking at the history.
Thanks, I'll see about updating Spark to 1.6.1.

On Thu, May 26, 2016 at 3:33 PM, Anthony May <> wrote:

On Thu, 26 May 2016 at 16:02 Andrés Ivaldi <> wrote:
I realized that when the DataFrame executes an insert, it inserts by schema column order instead of by name, i.e.

dataframe.write(SaveMode).jdbc(url, table, properties)

Looking at the profiler, the executed statement is

insert into TableName values(a,b,c..) 

What I need is
insert into TableName (colA,colB,colC) values(a,b,c)

Could it be some configuration?
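
Since the generated statement is positional rather than by name, one workaround is to reorder the DataFrame's columns to match the target table's definition before writing. A minimal Scala sketch, assuming a DataFrame named `dataframe`, a JDBC `url`, placeholder credentials, and a hypothetical target table whose columns are `colA`, `colB`, `colC` in that order:

```scala
// Sketch (not tested against your setup): align the DataFrame's column
// order with the target table's schema, since the generated statement
// is "insert into TableName values(...)" — positional, not by name.
import java.util.Properties
import org.apache.spark.sql.{DataFrame, SaveMode}

val props = new Properties()
props.setProperty("user", "...")      // placeholder credentials
props.setProperty("password", "...")

// Column order as defined in the target table (assumed here).
val tableColumns = Seq("colA", "colB", "colC")

// Reorder the DataFrame's columns to match the table definition.
val aligned: DataFrame = dataframe.select(tableColumns.map(dataframe(_)): _*)

aligned.write.mode(SaveMode.Append).jdbc(url, "TableName", props)
```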


Ing. Ivaldi Andres
