java.lang.AssertionError: Error with Google Cloud Platform Run service running a PySpark Docker file

  docker, google-cloud-platform, java, networking, pyspark

WARN MetricsSystem: Stopping a MetricsSystem that is not running
Exception in thread Thread-5:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/", line 973, in _bootstrap_inner
  File "/opt/conda/lib/python3.9/", line 910, in run
    self._target(*self._args, **self._kwargs)
  File "/home/jovyan/", line 23, in receive_stream
    sc = SparkContext(appName="StreamTwitter")
  File "/opt/conda/lib/python3.9/site-packages/pyspark/", line 146, in __init__
    self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
  File "/opt/conda/lib/python3.9/site-packages/pyspark/", line 209, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/opt/conda/lib/python3.9/site-packages/pyspark/", line 321, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/opt/conda/lib/python3.9/site-packages/py4j/", line 1568, in __call__
    return_value = get_return_value(
  File "/opt/conda/lib/python3.9/site-packages/py4j/", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling

java.lang.AssertionError: assertion failed: Expected hostname or IPv6 IP enclosed in [] but got fddf:3978:feb1:d745:0:0:0:c001
  at scala.Predef$.assert(Predef.scala:223)
  at org.apache.spark.util.Utils$.checkHost(Utils.scala:1072)
  at org.apache.spark.executor.Executor.<init>(Executor.scala:89)

I'm developing a Streamlit app in a Docker container that uses PySpark on the back end. The Docker image runs fine on my PC, and the Streamlit app shows up on the Google Cloud Run service URL, but it freezes partway through the process as it initializes the Spark instance, due to the error above. How can I fix this? Thank you.
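The assertion comes from Spark's Utils.checkHost, which rejects a bare IPv6 address (Cloud Run's internal network hands the container an IPv6 address, and Spark expects IPv6 literals wrapped in []). A common workaround, sketched below as an assumption rather than a confirmed fix for this exact setup, is to force Spark to bind to the IPv4 loopback address before the SparkContext is created. SPARK_LOCAL_IP and SPARK_LOCAL_HOSTNAME are real Spark environment variables; the PySpark lines are left commented out since they need a Spark installation to run.

```python
import os

# Assumption: pinning Spark to the IPv4 loopback address avoids the
# bare-IPv6 hostname that Utils.checkHost rejects on Cloud Run.
# These must be set BEFORE the SparkContext is constructed.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"
os.environ["SPARK_LOCAL_HOSTNAME"] = "localhost"

# Equivalent settings via SparkConf (requires pyspark installed):
# from pyspark import SparkConf, SparkContext
# conf = (SparkConf()
#         .setMaster("local[*]")
#         .set("spark.driver.bindAddress", "127.0.0.1")
#         .set("spark.driver.host", "127.0.0.1"))
# sc = SparkContext(appName="StreamTwitter", conf=conf)

print(os.environ["SPARK_LOCAL_IP"])
```

The same effect can be had without touching the code by passing `ENV SPARK_LOCAL_IP=127.0.0.1` in the Dockerfile or `--set-env-vars` on the Cloud Run service, which is often cleaner for a container that runs Spark in local mode.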

Source: Docker Questions