Spark save csv file as stream inside a docker container

  apache-spark, docker, pyspark, python, spark-streaming

I wrote a PySpark app that saves CSV files using the overwrite output mode.
The files are written to /app/files. I dockerized the app, but when I use Docker volumes the program throws an error. The error and my docker-compose files are below.
Error :

java.io.IOException: Unable to clear output directory file:/app/files
prior to writing to it
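For context, this error most likely occurs because Spark's overwrite mode deletes the whole output directory and then recreates it, and a bind-mounted directory cannot itself be removed from inside the container. A minimal plain-Python sketch of the two behaviors (the function names are hypothetical, for illustration only):

```python
import os
import shutil

def overwrite_by_deleting_dir(path):
    # Roughly what Spark's overwrite mode does: remove the output
    # directory entirely, then recreate it. Removing the directory
    # fails when the path is a bind-mount mount point.
    if os.path.exists(path):
        shutil.rmtree(path)
    os.makedirs(path)

def overwrite_contents_only(path):
    # Safer for mounted volumes: clear the directory's contents
    # but keep the directory itself in place.
    for name in os.listdir(path):
        entry = os.path.join(path, name)
        if os.path.isdir(entry):
            shutil.rmtree(entry)
        else:
            os.remove(entry)
```

A common workaround along these lines is to write to a subdirectory of the mounted path (so Spark deletes the subdirectory, not the mount point), or to mount a parent directory instead of the output directory itself.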

docker-compose file that runs fine:

version: '2'
services:
  convert-pivot:
    image: convert-pivot
    restart: always
    container_name: convert-pivot

docker-compose file that gives the error:

version: '2'
services:
  convert-pivot:
    image: convert-pivot
    restart: always
    container_name: convert-pivot
    volumes:
      - ./files:/app/files

Source: Docker Questions
