How to import CSV or JSON data into Elasticsearch using deviantony/docker-elk

I started picking up Elasticsearch and Docker a few days ago, and I am having trouble ingesting data into Elasticsearch. The Elastic stack repo I am using is this:

I tried to follow a tutorial that I found online:
but I could not find any indices when I loaded up Kibana.

Here is what I did:
I downloaded a sample data file and stored it in a folder called data under the project's root directory.
In docker-compose.yml, I added a bind mount that points to my external data folder.

    elasticsearch:
      build:
        context: elasticsearch/
      volumes:
        - type: bind
          source: ./elasticsearch/config/elasticsearch.yml
          target: /usr/share/elasticsearch/config/elasticsearch.yml
          read_only: true
        - type: bind
          source: ./data
          target: /usr/share/elasticsearch/data
      ports:
        - "9200:9200"
        - "9300:9300"
      environment:
        ES_JAVA_OPTS: "-Xmx256m -Xms256m"
        ELASTIC_PASSWORD: password
        # Use single node discovery in order to disable production mode and avoid bootstrap checks
        # see
        discovery.type: single-node
      networks:
        - elk
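With the bind mount above, a file saved at ./data/conn250K.csv on the host should appear inside the container at /usr/share/elasticsearch/data/conn250K.csv. A minimal Python sketch of that host-to-container path mapping (the container_path helper is hypothetical, just for illustration):

```python
from pathlib import PurePosixPath

# Bind mount from docker-compose.yml: host ./data -> container
# /usr/share/elasticsearch/data. The container_path helper below is
# hypothetical, only to illustrate the host-to-container mapping.
CONTAINER_TARGET = PurePosixPath("/usr/share/elasticsearch/data")

def container_path(host_relative):
    """Return the container-side path of a file placed under ./data."""
    return CONTAINER_TARGET / PurePosixPath(host_relative).name

print(container_path("data/conn250K.csv"))
# -> /usr/share/elasticsearch/data/conn250K.csv
```

Any path referenced inside a container's config has to use the container-side target, not the host-side source.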

And in my logstash.conf file, this is what I changed:

    input {
      tcp {
        port => 5000
      }
      file {
        path => "/usr/share/elasticsearch/data/conn250K.csv"
        start_position => "beginning"
      }
    }

    filter {
      csv {
        columns => [ "record_id", "duration", "src_bytes", "dest_bytes" ]
      }
    }

    output {
      elasticsearch {
        hosts => "elasticsearch:9200"
        user => "elastic"
        password => "password"
        index => "network"
      }
    }
After running "docker-compose up" in the terminal, I could not find any index pattern to create in Kibana, because no indices are generated. I can't figure out what is wrong.
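One way to check whether any indices exist at all is to query Elasticsearch's _cat/indices endpoint with the credentials from the compose file. A sketch that only builds the request (the cat_indices_request helper is hypothetical; actually sending it requires the stack to be up):

```python
import base64

# Hypothetical helper: build the URL and Basic-auth header for
# GET /_cat/indices, using the credentials from docker-compose.yml.
def cat_indices_request(host="localhost", port=9200,
                        user="elastic", password="password"):
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {
        "url": f"http://{host}:{port}/_cat/indices?v",
        "headers": {"Authorization": f"Basic {token}"},
    }

req = cat_indices_request()
print(req["url"])  # http://localhost:9200/_cat/indices?v
```

If Logstash had created the index, a line containing the index name "network" would show up in the response.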

Source: StackOverflow