Jupyter Notebook extension for Apache Spark integration.
To install, simply run `pip install jupyter-spark`. For development and testing, clone the project and run `pip install .` from a shell in the project's root directory.
The extension includes a progress indicator for the current Notebook cell if it invokes a Spark job, and queries the Spark UI service on the backend to get the required Spark job information.
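As a rough illustration of the kind of job information the backend works with, the sketch below derives a per-cell progress fraction from one job record. The field names follow Spark's public monitoring REST API, but the endpoint constant and the helper function are illustrative assumptions, not the extension's actual code:

```python
# Spark's monitoring REST API (served by the Spark UI, default port 4040)
# exposes job information at /api/v1/applications/<app-id>/jobs; the URL
# below and the helper are illustrative, not the extension's own code.
JOBS_ENDPOINT = "http://localhost:4040/api/v1/applications/{app_id}/jobs"

def job_progress(job):
    """Return a 0.0-1.0 completion fraction for one job record.

    `job` is a dict shaped like one entry of the jobs endpoint's JSON
    response; skipped tasks count toward completion.
    """
    total = job.get("numTasks", 0)
    if total == 0:
        return 0.0
    done = job.get("numCompletedTasks", 0) + job.get("numSkippedTasks", 0)
    return min(done / total, 1.0)

# A record shaped like the REST API's response:
sample = {"jobId": 3, "status": "RUNNING", "numTasks": 200,
          "numCompletedTasks": 120, "numSkippedTasks": 30}
print(f"job {sample['jobId']}: {job_progress(sample):.0%}")  # → job 3: 75%
```

A progress bar rendered in the cell would simply re-poll the endpoint and redraw with the updated fraction.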
To view all currently running jobs, click the "show running Spark jobs" button, or press `Alt+S`.
A proxied version of the Spark UI can be accessed at `localhost:8888/spark`.
NOTE: Uninstalling the extension via `pip uninstall jupyter-spark` will uninstall the server extension but leave the client extension in a partially installed state. To fully remove the extension:

- Run `pip uninstall jupyter-spark`
- Delete `spark.js` from your `nbextensions` folder.
- Delete any references to `jupyter-spark.spark` in `jupyter_notebook_config.json` (in your `.jupyter` directory)
- Delete any references to `spark` in `notebook.json` (in `.jupyter/nbconfig`)
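The two JSON edits in the last steps can be scripted. The sketch below is a generic helper, assuming the default `.jupyter` config locations listed above; it drops any key or list entry that exactly matches the given name, and leaves a file untouched if it does not exist. Verify the paths and the exact shape of your config files before relying on it:

```python
import json
from pathlib import Path

def strip_reference(config_path, needle):
    """Remove keys or list entries equal to `needle` from a JSON file.

    Walks the parsed JSON recursively; does nothing if the file is absent.
    """
    path = Path(config_path)
    if not path.exists():
        return

    def clean(node):
        if isinstance(node, dict):
            return {k: clean(v) for k, v in node.items() if k != needle}
        if isinstance(node, list):
            return [clean(v) for v in node if v != needle]
        return node

    cleaned = clean(json.loads(path.read_text()))
    path.write_text(json.dumps(cleaned, indent=2))

# Default locations, as described in the steps above:
jupyter_dir = Path.home() / ".jupyter"
strip_reference(jupyter_dir / "jupyter_notebook_config.json",
                "jupyter-spark.spark")
strip_reference(jupyter_dir / "nbconfig" / "notebook.json", "spark")
```

Deleting `spark.js` from the `nbextensions` folder still has to be done by hand, since that folder's location varies by installation.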