Package: matlab.compiler.mlspark
Interface class to initialize a connection to a Spark enabled cluster
A SparkContext object serves as an entry point to Spark™ by initializing a connection to a Spark cluster. It accepts a SparkConf object as an input argument and uses the parameters specified in that object to set up the internal services necessary to establish a connection to the Spark execution environment.
sc = matlab.compiler.mlspark.SparkContext(conf) creates a SparkContext object that initializes a connection to a Spark cluster.
The properties of this class are hidden.
addJar | Add a JAR file dependency for all tasks that need to be executed in this SparkContext
broadcast | Broadcast a read-only variable to the cluster
datastoreToRDD | Convert a MATLAB datastore to a Spark RDD
delete | Shut down the connection to the Spark enabled cluster
getSparkConf | Get SparkConf configuration parameters
parallelize | Create an RDD from a collection of local MATLAB values
setCheckpointDir | Set the directory under which RDDs are to be checkpointed
setLogLevel | Set the log level
textFile | Create an RDD from a text file
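The following sketch illustrates a few of these methods, assuming sc was created as shown above. Note that count is an action on the resulting RDD object rather than a SparkContext method.

    sc.setLogLevel('WARN');                    % limit log output to warnings and errors
    myRDD = sc.parallelize({1, 2, 3, 4, 5});   % create an RDD from a local MATLAB cell array
    numElements = myRDD.count();               % count is a method of the returned RDD
    sc.delete();                               % shut down the connection to the cluster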
See the latest Spark documentation for more information.