Install versions of Spark for use with local Spark connections
(i.e., spark_connect(master = "local")).
spark_install(
  version = NULL,
  hadoop_version = NULL,
  reset = TRUE,
  logging = "INFO",
  verbose = interactive()
)

spark_uninstall(version, hadoop_version)
spark_install_dir()
spark_install_tar(tarfile)
spark_installed_versions()
spark_available_versions(
  show_hadoop = FALSE,
  show_minor = FALSE,
  show_future = FALSE
)
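The functions above can be combined into a typical interactive workflow. The sketch below is illustrative: the version string passed to spark_install() is an example, not a recommendation, and the download requires network access.

```r
library(sparklyr)

# See which Spark builds are available for download
spark_available_versions(show_hadoop = TRUE)

# Install a specific Spark version (version string is illustrative)
spark_install(version = "3.5")

# Confirm which versions are now installed locally
spark_installed_versions()

# Connect to the locally installed Spark, then disconnect
sc <- spark_connect(master = "local")
spark_disconnect(sc)
```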
spark_install() returns a list with information about the installed version.
version: Version of Spark to install. See spark_available_versions() for a list of supported versions.

hadoop_version: Version of Hadoop to install. See spark_available_versions() for a list of supported versions.

reset: Attempts to reset settings to defaults.

logging: Logging level to configure the install. Supported options: "WARN", "INFO".

verbose: Report information as Spark is downloaded and installed.

tarfile: Path to a TAR file conforming to the pattern spark-###-bin-(hadoop)?###, where ### references the Spark and Hadoop versions respectively.

show_hadoop: Show Hadoop distributions?

show_minor: Show minor Spark versions?

show_future: Should future versions which have not been released be shown?
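For machines without direct internet access, spark_install_tar() installs from a pre-downloaded distribution. The sketch below assumes a tarball has already been fetched manually; the file path and version numbers are hypothetical.

```r
library(sparklyr)

# Install from a manually downloaded distribution; the file name must
# follow the spark-###-bin-(hadoop)?### pattern (path is hypothetical)
spark_install_tar("~/Downloads/spark-3.5.0-bin-hadoop3.tgz")

# Inspect the directory that holds managed Spark installations
spark_install_dir()

# Remove an installation that is no longer needed (versions illustrative)
spark_uninstall(version = "3.5.0", hadoop_version = "3")
```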