  1. Spark SQL & DataFrames | Apache Spark

    Spark SQL is Spark's module for working with structured data, either within Spark programs or through standard JDBC and ODBC connectors.

  2. Spark SQL and DataFrames - Spark 4.0.1 Documentation - Apache …

    Spark SQL is a Spark module for structured data processing. Unlike the basic Spark RDD API, the interfaces provided by Spark SQL provide Spark with more information about the structure …

  3. Spark SQL, Built-in Functions - Apache Spark

    Jul 30, 2009 · There is a SQL config 'spark.sql.parser.escapedStringLiterals' that can be used to fall back to the Spark 1.6 behavior regarding string literal parsing. For example, if the config is …

  4. SQL Reference - Spark 4.0.1 Documentation - Apache Spark

    Spark SQL is Apache Spark’s module for working with structured data. This guide is a reference for Structured Query Language (SQL) and includes syntax, semantics, keywords, and …

  5. SQL Syntax - Spark 4.0.1 Documentation - Apache Spark

    The SQL Syntax section describes the SQL syntax in detail along with usage examples when applicable. This document provides a list of Data Definition and Data Manipulation Statements, …

  6. Getting Started - Spark 4.0.1 Documentation - Apache Spark

    The sql function on a SparkSession enables applications to run SQL queries programmatically and returns the result as a DataFrame.
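    The snippet above can be sketched in PySpark, assuming a local installation with a working Java runtime; the table name "people" and the sample rows are illustrative:

    ```python
    # Minimal sketch of running SQL programmatically via SparkSession.sql.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.master("local[1]").appName("sql-example").getOrCreate()

    # Register an in-memory DataFrame as a temporary view so SQL can reference it.
    df = spark.createDataFrame([("Alice", 34), ("Bob", 45)], ["name", "age"])
    df.createOrReplaceTempView("people")

    # spark.sql returns the result as a DataFrame, not raw rows.
    result = spark.sql("SELECT name FROM people WHERE age > 40")
    result.show()

    spark.stop()
    ```

    Because the result is a DataFrame, it can be further transformed with DataFrame operations or collected to the driver.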

  7. Spark SQL — PySpark 4.0.1 documentation - Apache Spark

    pyspark.sql.tvf.TableValuedFunction.posexplode_outer, pyspark.sql.tvf.TableValuedFunction.variant_explode, …

  8. JDBC To Other Databases - Spark 4.0.1 Documentation - Apache …

    The table below describes the data type conversions from Spark SQL Data Types to Microsoft SQL Server data types, when creating, altering, or writing data to a Microsoft SQL Server table …

  9. Overview - Spark 4.0.1 Documentation - Apache Spark

    It also supports a rich set of higher-level tools including Spark SQL for SQL and structured data processing, pandas API on Spark for pandas workloads, MLlib for machine learning, GraphX …

  10. Data Types - Spark 4.0.1 Documentation - Apache Spark

    All data types of Spark SQL are located in the package of pyspark.sql.types. You can access them by doing from pyspark.sql.types import *
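    As a sketch of the import described above (field names are illustrative), the type classes can be used to build an explicit schema without starting a Spark session:

    ```python
    # Construct an explicit DataFrame schema from pyspark.sql.types classes.
    from pyspark.sql.types import StructType, StructField, StringType, IntegerType

    schema = StructType([
        StructField("name", StringType(), nullable=False),
        StructField("age", IntegerType(), nullable=True),
    ])

    # simpleString gives a compact textual form of the schema.
    print(schema.simpleString())  # struct<name:string,age:int>
    ```

    Such a schema is typically passed to spark.createDataFrame or a reader to avoid relying on type inference.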