Describes the connection details of an IRIS cluster.
Implemented as a Java class within the InterSystems JDBC driver jar file.
Extends the given reader with IRIS specific methods.
Extends the given writer with IRIS specific methods.
Registers the InterSystems IRIS Spark Connector as a Spark SQL data source provider for the format "com.intersystems.spark", also known by its shorter alias "iris".
This allows clients to execute queries against a cluster by calling Spark's generic load and save functions. For example:

    spark.read
         .format("com.intersystems.spark")
         .option("query", "SELECT * FROM Owls")
         .load()

executes the query "SELECT * FROM Owls" on the default cluster and hands its rows back in the form of an appropriately partitioned DataFrame. Here read means 'execute a SELECT statement against the database', while write means 'execute batch INSERT statements against a database table'.

See the Apache Spark documentation for more on how to use the generic load and save functions.
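The write side mirrors the read side through the generic save function. A minimal sketch, assuming a dataframe owls and a target table "Owls" on the cluster; "dbtable" is the standard Spark SQL option naming the target table, and its use here with this connector is an assumption:

```scala
owls.write
    .format("com.intersystems.spark")  // or the shorter alias "iris"
    .option("dbtable", "Owls")         // target table on the cluster (assumed option name)
    .mode("append")                    // executes batch INSERT statements
    .save()
```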
A function that formats the current row of a JDBC ResultSet as an element of an RDD.
The function pair, for example:

    val pair: Format[(String,String)] = r ⇒ (r.getString(1), r.getString(2))

extracts a pair of strings from the first two columns of the current row of a result set, and so can be used to construct an RDD[(String,String)] from the result of any query against the cluster that includes at least two strings per record.
Format functions should normally restrict themselves to calling only pure (that is, non-side-effecting) member functions of the result set, such as getInt, getDouble, getDate, and the like, since they will be invoked for each and every record requested by the client.
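A Format can also target a user-defined type rather than a tuple. A brief sketch; the case class and the column layout it assumes (a VARCHAR followed by a DOUBLE) are hypothetical:

```scala
// Hypothetical row type for a query whose first two columns
// are a name (VARCHAR) and a wingspan (DOUBLE).
case class Owl(name: String, wingspan: Double)

// Calls only pure getters on the result set, as recommended above.
val owl: Format[Owl] = r ⇒ Owl(r.getString(1), r.getDouble(2))
```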
Extends the given context with IRIS specific methods.
Extends the given session with IRIS specific methods.
Augments the Java class Address with additional functionality.
Extensions for working with the Spark ML library.
© 2024 InterSystems Corporation, Cambridge, MA. All rights reserved.
Package object for the InterSystems IRIS Spark Connector.
Defines a custom interface for the InterSystems IRIS Spark Connector: a set of classes and types that together offer a more convenient, type-safe Scala interface to the connector than the generic string-based interface built into Spark itself.
scala> import com.intersystems.spark._
imports the custom interface into the current scope, allowing one, for example, to read and display the first few records of a table called 'Owls' from the default cluster, or to append the contents of a dataframe owls back to that same table.
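As a sketch of what the imported interface enables; the iris extension methods on the reader and writer are assumed here from the connector's description above (it extends the reader and writer with IRIS-specific methods), and their exact signatures may differ:

```scala
import org.apache.spark.sql.SparkSession
import com.intersystems.spark._   // brings the IRIS extension methods into scope

object Example {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("iris-example").getOrCreate()

    // Read the 'Owls' table from the default cluster and show its first records.
    val owls = spark.read.iris("Owls")      // assumed reader extension method
    owls.show()

    // Append the dataframe back to the same table within the cluster.
    owls.write.mode("append").iris("Owls")  // assumed writer extension method
  }
}
```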