diff --git a/examples/spark/README.md b/examples/spark/README.md
index 391ac94a55f..6b0d9222b65 100644
--- a/examples/spark/README.md
+++ b/examples/spark/README.md
@@ -156,7 +156,7 @@ kubectl proxy --port=8001
 ```
 
 At which point the UI will be available at
-[http://localhost:8001/api/v1/proxy/namespaces/default/services/spark-webui/](http://localhost:8001/api/v1/proxy/namespaces/default/services/spark-webui/).
+[http://localhost:8001/api/v1/proxy/namespaces/spark-cluster/services/spark-webui/](http://localhost:8001/api/v1/proxy/namespaces/spark-cluster/services/spark-webui/).
 
 ## Step Three: Start your Spark workers
 
@@ -294,7 +294,7 @@ kubectl get pods -lcomponent=zeppelin # Get the driver pod to interact with.
 ```
 
 At which point the Master UI will be available at
-[http://localhost:8001/api/v1/proxy/namespaces/default/services/spark-webui/](http://localhost:8001/api/v1/proxy/namespaces/default/services/spark-webui/).
+[http://localhost:8001/api/v1/proxy/namespaces/spark-cluster/services/spark-webui/](http://localhost:8001/api/v1/proxy/namespaces/spark-cluster/services/spark-webui/).
 
 You can either interact with the Spark cluster the traditional `spark-shell` /
 `spark-subsubmit` / `pyspark` commands by using `kubectl exec` against the