Category: spark-streaming

I’m trying to run a Spark application on k8s and it fails immediately after container creation, with no specific error; I see only exit code 101. Container name: spark-kubernetes-driver. Container image: image URL. Container state: Terminated. Exit code: 101. Driver logs: + CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@") + exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=10.1.0.10 ..
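Exit code 101 is the status SparkSubmit uses for a class-not-found failure, so it helps to separate image/entrypoint problems from problems in the application itself. Below is a minimal smoke-test sketch, assuming the application is a PySpark job (the app name is hypothetical); if this submits and finishes cleanly through the same entrypoint, the image and bind address are fine and the 101 points at the real job’s main class or classpath.

```python
from pyspark.sql import SparkSession

# Hypothetical app name; any trivial job works as a smoke test.
spark = SparkSession.builder.appName("k8s-smoke-test").getOrCreate()

# A trivial action so the driver actually schedules work.
print(spark.range(10).count())

spark.stop()
```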


So I have a Spark cluster running on Docker; I’m using [https://github.com/big-data-europe/docker-spark] to build my cluster. Then I add 2 more containers: one behaves as a server (plain Python) and one as a client (a Spark Streaming app). They both run on the same network. For the server (plain Python) I have something like import socket; s.bind(('', 9009)) ..
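A minimal sketch of that server/client pair for this setup. Port 9009 comes from the question; the payload and the hostname "server" (the server container’s alias on the shared Docker network) are assumptions.

```python
import socket
import time

# Plain-Python server: accept one client, then push one line per second.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(('', 9009))        # bind on all interfaces inside the container
s.listen(1)
conn, _addr = s.accept()
while True:
    conn.sendall(b"hello from server\n")
    time.sleep(1)
```

The Spark Streaming client then connects to the server container by name:

```python
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="socket-client")
ssc = StreamingContext(sc, 5)                  # 5-second micro-batches
ssc.socketTextStream("server", 9009).pprint()  # "server" = assumed alias
ssc.start()
ssc.awaitTermination()
```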


I have created and deployed a Spark cluster which consists of 4 containers running: spark-master, spark-slave, spark-submit, and a data-mount-container (to access the script from the local directory). I added the required dependency JARs in all these containers, and also deployed Kafka on the host machine, where it produces a stream via a producer. I launched the ..
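Because Kafka lives on the host rather than on the containers’ network, the consumer must use a broker address the containers can actually reach. A sketch of the DStream consumer, assuming Spark 2.x with the spark-streaming-kafka-0-8 package on the classpath; the topic name and the host.docker.internal:9092 broker address are assumptions (on Linux, the host’s IP on the Docker bridge would be used instead).

```python
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="kafka-dstream-consumer")
ssc = StreamingContext(sc, 10)               # 10-second micro-batches

stream = KafkaUtils.createDirectStream(
    ssc,
    topics=["events"],                       # hypothetical topic name
    kafkaParams={"metadata.broker.list": "host.docker.internal:9092"},
)
stream.map(lambda kv: kv[1]).pprint()        # print message values

ssc.start()
ssc.awaitTermination()
```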


I’m new to Docker. I’m trying to run a Spark Streaming application using Docker. I have Kafka and a Spark Streaming application running separately in 2 containers. My Kafka service is up and running fine; I tested with $KAFKA_HOME/bin/kafka-console-producer.sh and $KAFKA_HOME/bin/kafka-console-consumer.sh and I’m able to receive messages. But when I run my Spark Streaming application, it shows: ..
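When the console producer and consumer work inside the Kafka container but the Spark app does not, the usual suspects are a broker address that only resolves inside one container and a missing Kafka connector on Spark’s classpath. A Structured Streaming sketch, assuming the spark-sql-kafka-0-10 package is supplied (e.g. via --packages); the alias "kafka" and the topic name are assumptions, and Kafka must advertise a listener the Spark container can resolve.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-smoke-test").getOrCreate()

df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "kafka:9092")  # assumed alias
      .option("subscribe", "test-topic")                # hypothetical topic
      .load())

(df.selectExpr("CAST(value AS STRING)")
   .writeStream
   .format("console")
   .start()
   .awaitTermination())
```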
