spark-user mailing list archives

From Cui Lin <>
Subject Spark for Oracle sample code
Date Fri, 25 Sep 2015 23:12:03 GMT
Hello, All,

I found that most JDBC connection examples read the whole table and then
perform operations such as joins.

val jdbcDF ="jdbc").options(
  Map("url" -> "jdbc:postgresql:dbserver",
  "dbtable" -> "schema.tablename")).load()

Sometimes this is not practical, since the whole table's data is too big.
What makes sense to me is to use Spark SQL to get a subset of the data from
Oracle tables using a SQL-like statement.
I couldn't find such examples. Can someone show me?
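One pattern that seems to fit is passing a parenthesized subquery (with an alias) as the "dbtable" option, so the database itself filters the rows before Spark pulls them. A rough, untested sketch; the Oracle URL, driver class, schema, and column names below are placeholders, not anything from a real setup:

```scala
// Sketch: push a filtering query down to Oracle instead of loading the whole table.
// "dbtable" accepts any expression valid in a FROM clause, so a subquery works.
val subsetDF ="jdbc").options(Map(
  "url"     -> "jdbc:oracle:thin:@//dbhost:1521/ORCL",   // placeholder connection string
  "driver"  -> "oracle.jdbc.OracleDriver",
  // Oracle evaluates the inner SELECT; only matching rows cross the wire.
  "dbtable" -> "(SELECT id, name FROM schema.tablename WHERE created_at > DATE '2015-01-01') t"
)).load()
```

The alias ("t" above) is required because the subquery stands in for a table name in the generated SQL.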

Best regards!

