Sebastiaan-Alvarez-Rodriguez/spark-deploy


spark-deploy

Framework and CLI tool to deploy Spark clusters on remote machines, using the metareserve reservation system.

Requirements

  • Python>=3.2
  • remoto>=1.2.0
  • metareserve>=0.1.0
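
A quick way to confirm the interpreter meets the Python requirement above (a minimal sketch; the version bound comes from the requirements list):

```python
import sys

# spark-deploy requires Python >= 3.2 (see the requirements above)
assert sys.version_info >= (3, 2), "Python 3.2 or newer is required"
print("Python version OK:", sys.version.split()[0])
```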

Installing

Execute pip3 install . --user in the repository root.

Usage

To start a cluster, use:

spark-deploy start

For more information, use:

spark-deploy -h
