I’ve set up 3 Docker containers following the diagram found here. In this case Turbine acts as a backend server which replies to requests from SCADA. This is done using the Python Flask and requests libraries. I need to be able to connect to the network to monitor traffic, extract the XML responses, and save them to ..
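The capture mechanism itself isn't shown, but the extract-and-save step can be sketched independently of it. A minimal sketch, assuming the captured response body is already available as bytes; `save_xml_response` and the output path are hypothetical names, not part of the original setup:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def save_xml_response(body: bytes, out_dir: str, name: str) -> Path:
    """Validate that `body` is well-formed XML and save it to `out_dir`.

    `body` would come from whatever is sniffing the SCADA<->Turbine
    traffic; this helper only handles the extract-and-save step.
    """
    ET.fromstring(body)  # raises ParseError if the payload isn't XML
    out = Path(out_dir) / f"{name}.xml"
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_bytes(body)
    return out

# Hypothetical sample payload, stand-in for a captured Turbine response.
sample = b"<reading><turbine id='1'><power>1500</power></turbine></reading>"
saved = save_xml_response(sample, "/tmp/captures", "turbine-1")
```

Validating before writing means malformed or non-XML payloads fail loudly instead of silently filling the capture directory.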
I’ve made a Python script that uploads the files into an S3 bucket. I need the script to run periodically from within the Docker container.

#!/usr/local/bin/python3
import boto3
from botocore.errorfactory import ClientError
import os
import glob
import json
import time

s3_client = boto3.client('s3')
s3_bucket_name = 'ap-rewenables-feature-data'
uploaded = None
max_mod_time = '0'
file_list = glob.glob('/data/*.json')
file_mod_time ..
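One cron-free way to get periodic execution inside a container is to make a scheduler loop the container's main process. The sketch below is an assumption, not the original script: `upload_once` is a hypothetical stand-in for the boto3 upload logic above.

```python
import time

def run_periodically(task, interval_s, iterations=None):
    """Call `task` every `interval_s` seconds.

    With iterations=None this loops forever, which makes it suitable as
    the container's entrypoint, e.g. CMD ["python", "uploader.py"].
    A finite `iterations` is used here so the sketch terminates.
    """
    count = 0
    while iterations is None or count < iterations:
        task()
        count += 1
        if iterations is None or count < iterations:
            time.sleep(interval_s)
    return count

# Hypothetical stand-in for the boto3 upload step.
calls = []
def upload_once():
    calls.append(time.time())

ran = run_periodically(upload_once, interval_s=0.01, iterations=3)
```

Running the loop as PID 1 also sidesteps the usual problem of cron daemons not starting (or not inheriting environment variables such as AWS credentials) inside minimal container images.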
I have a script that uploads the file to an S3 bucket every minute using crontab. I’m using Docker to run the script, and I need changes made to the file on the host to be reflected in the Docker container and uploaded to S3. The script uploads /data/boo.txt to S3 every minute. import boto3 ..
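Assuming the host directory is bind-mounted into the container (e.g. started with `-v /host/data:/data`, which is what makes host edits visible inside it), the per-minute job can also skip uploads when the file hasn't actually changed. This is a sketch under that assumption, not the original script: `should_upload` and the state-file path are hypothetical.

```python
import os

def should_upload(path, state_path):
    """Return True if `path` changed since the mtime recorded in `state_path`.

    Records the current mtime so the next cron invocation can compare
    again. Assumes `path` sits on a bind mount, so edits made on the
    host update the mtime seen inside the container.
    """
    mtime = os.path.getmtime(path)
    last = None
    if os.path.exists(state_path):
        with open(state_path) as f:
            last = float(f.read())
    with open(state_path, "w") as f:
        f.write(str(mtime))
    return last is None or mtime > last

# Usage idea: guard the boto3 call in the cron job.
# if should_upload("/data/boo.txt", "/tmp/boo.mtime"):
#     s3_client.upload_file("/data/boo.txt", s3_bucket_name, "boo.txt")
```

This keeps the cron schedule unchanged while avoiding redundant PUTs for an unmodified file.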
I’ve made a script to automate initial system setup on Linux Mint 20. The script runs fine until just after the Docker install finishes, right before the docker-compose installation. Once Docker is installed, the script stops at a bash prompt until the user enters exit, after which it continues its normal execution. It’s a small annoyance, but I’m not ..
I’m working on a small app that uses dockerode for container management. I’m trying to make the process running in the Docker container independent of the Node process. Essentially, what I’m trying to achieve is that if the Node process crashes, then after restarting it can reattach to the running container and continue as if nothing had happened. ..