includePackage {SparkR}		R Documentation
Description

This function can be used to include a package on all workers before the
user's code is executed. This is useful when functions from another R
package are used inside a function passed to RDD operations such as
lapply.

NOTE: The package is assumed to be already installed on every node in the
Spark cluster.
Usage

includePackage(sc, pkg)
Arguments

sc	SparkContext to use
pkg	Package name
Examples

## Not run:
##D library(Matrix)
##D
##D sc <- sparkR.init()
##D # Include the Matrix package so workers can call sparseMatrix()
##D includePackage(sc, Matrix)
##D
##D generateSparse <- function(x) {
##D   sparseMatrix(i = c(1, 2, 3), j = c(1, 2, 3), x = c(1, 2, 3))
##D }
##D
##D rdd <- lapplyPartition(parallelize(sc, 1:2, 2L), generateSparse)
##D collect(rdd)
## End(Not run)