How to solve: Elasticsearch ConnectionError: Max retries exceeded (Docker, Python)

I am trying to test docker run for my Python app, which uses Elasticsearch to fetch data. When I run the command below:

docker run -p 5000:5000 myapp

I get the following error:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1951, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1820, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1949, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1935, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "main.py", line 27, in launch_app
    ques = get_data_es(ques1)
  File "/app/Text_Cleaning.py", line 31, in get_data_es
    es.indices.refresh(index="esposts")
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/client/utils.py", line 92, in _wrapped
    return func(*args, params=params, headers=headers, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/client/indices.py", line 42, in refresh
    "POST", _make_path(index, "_refresh"), params=params, headers=headers
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/transport.py", line 362, in perform_request
    timeout=timeout,
  File "/usr/local/lib/python3.7/site-packages/elasticsearch/connection/http_requests.py", line 157, in perform_request
    raise ConnectionError("N/A", str(e), e)
elasticsearch.exceptions.ConnectionError: ConnectionError(HTTPConnectionPool(host='localhost', port=9200): Max retries exceeded with url: /radius_ml_posts/_refresh (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fe145ddf390>: Failed to establish a new connection: [Errno 111] Connection refused'))) caused by: ConnectionError(HTTPConnectionPool(host='localhost', port=9200): Max retries exceeded with url: /radius_ml_posts/_refresh (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fe145ddf390>: Failed to establish a new connection: [Errno 111] Connection refused')))

The Python code that retrieves the data is:

from datetime import datetime
from elasticsearch import Elasticsearch, RequestsHttpConnection

def get_data_es(question):
    # Connect to Elasticsearch (the host is hard-coded to localhost:9200)
    es = Elasticsearch(
        hosts=[{"host": "localhost", "port": 9200}],
        connection_class=RequestsHttpConnection,
        request_timeout=30,
    )
    # Refresh the index, then store a sample document
    doc = {"author": "vaibhav", "text": "event", "timestamp": datetime.now()}
    es.indices.refresh(index="esposts")
    es.index(index="esposts", id=1, body=doc)
    # Full-text query against the search_text field
    res = es.search(
        index="esposts",
        size=30,
        body={"query": {"query_string": {"default_field": "search_text", "query": question}}},
    )
    return res
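
For reference, the host is hard-coded to localhost here, and inside a Docker container localhost refers to the container itself, not to the machine where Elasticsearch is running. A minimal sketch of a configurable connection, assuming hypothetical ES_HOST / ES_PORT environment variables that are not part of the original code:

import os
from elasticsearch import Elasticsearch, RequestsHttpConnection

# ES_HOST / ES_PORT are placeholder names; they default to the original localhost:9200
ES_HOST = os.environ.get("ES_HOST", "localhost")
ES_PORT = int(os.environ.get("ES_PORT", "9200"))

es = Elasticsearch(
    hosts=[{"host": ES_HOST, "port": ES_PORT}],
    connection_class=RequestsHttpConnection,
    request_timeout=30,
)

With this, the Elasticsearch address can be set when the container is started, e.g. with docker run -e ES_HOST=... as sketched further below.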

My Dockerfile:

FROM python:3.7-slim

COPY . /app
WORKDIR /app

RUN pip --default-timeout=100 install -r requirements.txt

EXPOSE 5000
ENTRYPOINT ["python"]
CMD ["main.py"]
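
For context, a minimal Docker Compose sketch that would run the app and Elasticsearch on the same network, where containers reach each other by service name (the service names, the image tag, and the ES_HOST variable from the sketch above are assumptions, not part of the original setup):

version: "3.8"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.1
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  myapp:
    build: .
    ports:
      - "5000:5000"
    environment:
      - ES_HOST=elasticsearch    # the Compose service name doubles as the hostname
    depends_on:
      - elasticsearch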

When I run the app normally (in Spyder or Jupyter, outside Docker), it works fine.
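
Inside the container, localhost points at the container itself, so the hard-coded localhost:9200 cannot reach an Elasticsearch instance running on the host machine. A sketch of run commands that would point the container at the host instead (host.docker.internal is built into Docker Desktop; the --add-host form needs Docker 20.10+ on Linux; ES_HOST is the placeholder variable from the sketch above):

# Docker Desktop (Mac/Windows): host.docker.internal resolves to the host
docker run -p 5000:5000 -e ES_HOST=host.docker.internal myapp

# Linux, Docker 20.10+: map host.docker.internal to the host gateway
docker run -p 5000:5000 --add-host=host.docker.internal:host-gateway -e ES_HOST=host.docker.internal myapp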

Please help me solve this.

Best regards.

Source: StackOverflow