According to best practices, you should run only one Spark context per Java Virtual Machine (JVM) at a time. To move an application out of the WAITING state and into RUNNING, either stop or kill the application that is currently running, or launch separate clusters so the jobs run independently.
While an admin can permit multiple Spark contexts in the same JVM by setting spark.driver.allowMultipleContexts to true, doing so makes the JVM less stable: a crash in one Spark context can bring down the others. (Note that this property was only ever an escape hatch in Spark 1.x/2.x and was removed in Spark 3.0.)
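For reference, here is how the property would be set on a Spark 1.x/2.x deployment; this is a minimal sketch of the configuration discussed above, not a recommendation:

```
# spark-defaults.conf (Spark 1.x/2.x only; the property was removed in Spark 3.0)
# Discouraged: allows more than one SparkContext in the same JVM.
spark.driver.allowMultipleContexts  true
```

The same property can be passed per application with `--conf spark.driver.allowMultipleContexts=true` on `spark-submit`. The safer pattern is to share a single context via `SparkContext.getOrCreate()` (or `SparkSession.builder.getOrCreate()`), which returns the existing context instead of constructing a second one.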
Helpful links:
https://stackoverflow.com/questions/32827333/spark-multiple-contexts
https://medium.com/@achilleus/spark-session-10d0d66d1d24