spark-issues mailing list archives

From "Yin Huai (JIRA)" <j...@apache.org>
Subject [jira] [Issue Comment Deleted] (SPARK-5420) Cross-language load/store functions for creating and saving DataFrames
Date Mon, 09 Feb 2015 02:03:35 GMT

     [ https://issues.apache.org/jira/browse/SPARK-5420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai updated SPARK-5420:
----------------------------
    Comment: was deleted

(was: h3. End user APIs added to SQLContext (load related)
h4. Load data through a data source and create a DataFrame
{code}
// This method is used to load data through a file-based data source (e.g. Parquet).
// We will use the default data source; right now, it is Parquet.
def load(path: String): DataFrame
def load(
      dataSourceName: String,
      option: (String, String),
      options: (String, String)*): DataFrame
// This is for Java users.
def load(
      dataSourceName: String,
      options: java.util.Map[String, String]): DataFrame
{code}
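
As a rough usage sketch (not part of the proposal itself), the load overloads above might be called as follows; the {{sqlContext}} instance, the short data source name "parquet", and the paths are assumptions for illustration only.
{code}
// Hypothetical usage sketch; sqlContext, the "parquet" source name, and the paths are assumptions.
// Load with the default data source (currently Parquet).
val df1 = sqlContext.load("/data/users.parquet")
// Load with an explicit data source name and varargs options.
val df2 = sqlContext.load("parquet", "path" -> "/data/users.parquet")
// Java-friendly overload taking a java.util.Map of options.
val javaOptions = new java.util.HashMap[String, String]()
javaOptions.put("path", "/data/users.parquet")
val df3 = sqlContext.load("parquet", javaOptions)
{code}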

h3. End user APIs added to HiveContext (load related)
h4. Create a metastore table for the existing data
{code}
// This method is used to create a table from a file-based data source.
// We will use the default data source; right now, it is Parquet.
def createTable(tableName: String, path: String, allowExisting: Boolean): Unit
def createTable(
      tableName: String,
      dataSourceName: String,
      allowExisting: Boolean,
      option: (String, String),
      options: (String, String)*): Unit
def createTable(
      tableName: String,
      dataSourceName: String,
      schema: StructType,
      allowExisting: Boolean,
      option: (String, String),
      options: (String, String)*): Unit
// This one is for Java users.
def createTable(
      tableName: String,
      dataSourceName: String,
      allowExisting: Boolean,
      options: java.util.Map[String, String]): Unit
// This one is for Java users.
def createTable(
      tableName: String,
      dataSourceName: String,
      schema: StructType,
      allowExisting: Boolean,
      options: java.util.Map[String, String]): Unit
{code} )
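
For comparison, a hypothetical usage sketch of the createTable overloads quoted above; the {{hiveContext}} instance, table names, paths, the "json" source name, and the schema are assumptions for illustration only.
{code}
// Hypothetical usage sketch; hiveContext, table names, paths, and the "json" source name are assumptions.
import org.apache.spark.sql.types._

// Register existing Parquet data as a metastore table using the default data source.
hiveContext.createTable("users", "/data/users.parquet", allowExisting = true)
// Explicit data source name with varargs options.
hiveContext.createTable("users_json", "json", true, "path" -> "/data/users.json")
// Explicit data source name with a user-supplied schema.
val schema = StructType(Seq(StructField("name", StringType), StructField("age", IntegerType)))
hiveContext.createTable("users_typed", "json", schema, true, "path" -> "/data/users.json")
{code}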

> Cross-language load/store functions for creating and saving DataFrames
> ----------------------------------------------------------------------
>
>                 Key: SPARK-5420
>                 URL: https://issues.apache.org/jira/browse/SPARK-5420
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Patrick Wendell
>            Assignee: Yin Huai
>            Priority: Blocker
>             Fix For: 1.3.0
>
>
> We should have standard APIs for loading or saving a table from a data store. Per comment
> discussion:
> {code}
> def loadData(datasource: String, parameters: Map[String, String]): DataFrame
> def loadData(datasource: String, parameters: java.util.Map[String, String]): DataFrame
> def storeData(datasource: String, parameters: Map[String, String]): DataFrame
> def storeData(datasource: String, parameters: java.util.Map[String, String]): DataFrame
> {code}
> Python should have this too.
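
A minimal sketch of how the proposed loadData call might look, assuming a {{sqlContext}} instance, the "parquet" source name, and a "path" parameter key, none of which are fixed by the proposal:
{code}
// Hypothetical sketch of the proposed API; the source name and parameter keys are assumptions.
val df = sqlContext.loadData("parquet", Map("path" -> "/data/events.parquet"))
{code}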



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
