docker-py reading container logs as a generator hangs

I am using docker-py to read container logs as a stream by setting the stream flag to True, as indicated in the docs. Basically, I am iterating through all my containers, reading each container's logs in as a generator, and writing them out to a file like the following:

for service in service_names:
    dkg = self.container.logs(service, stream=True)
    with open(path, 'wb') as output_file:
        try:
            while True:
                line = next(dkg).decode("utf-8")
                print('line is: ' + str(line))
                if not line or "\n" not in line:  # none of these work
                    break
                output_file.write(line.encode("utf-8"))
        except Exception as exc:                  # nor this
            print('an exception occurred: ' + str(exc))

However, it only reads the first service and hangs at the end of its log. It doesn’t break out of the loop, nor does it raise an exception (e.g. StopIteration). According to the docs, if stream=True it should return a generator. When I printed out the generator's type, it showed up as a docker.types.daemon.CancellableStream, so I don’t think it follows the traditional Python generator protocol of raising StopIteration when we hit the end of the container log and call next().
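For reference, this is the behaviour I expected, since an ordinary Python generator raises StopIteration once it is exhausted (a minimal stand-in example, not the docker-py stream itself):

```python
def fake_logs():
    # stand-in for a log stream yielding raw bytes
    yield b"line 1\n"
    yield b"line 2\n"

g = fake_logs()
print(next(g))  # b'line 1\n'
print(next(g))  # b'line 2\n'
try:
    next(g)
except StopIteration:
    print("end of stream")  # this is what never happens with my docker-py stream
```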

As you can see, I’ve tried checking whether the line is falsy or contains a newline, and even tried catching any type of exception, but no luck. Is there another way I can determine when it hits the end of the stream for a service, so I can break out of the while loop and move on to writing the next service? The reason I wanted to use a stream is that the large amount of data was causing my system to run low on memory, so I’d prefer to keep using a generator.
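One idea I considered is passing a default sentinel to next(), which on a normal generator returns the sentinel instead of raising StopIteration at exhaustion (I'm not sure whether CancellableStream honors this, so the sketch below uses a plain generator as a stand-in; read_stream and fake_logs are my own hypothetical names):

```python
import io

def read_stream(stream, output_file):
    # Drain a byte-yielding iterator chunk by chunk.
    # next(stream, None) returns None instead of raising
    # StopIteration when the iterator is exhausted.
    while True:
        chunk = next(stream, None)
        if chunk is None:
            break
        output_file.write(chunk)

def fake_logs():
    # stand-in for the docker-py log stream
    yield b"line 1\n"
    yield b"line 2\n"

buf = io.BytesIO()
read_stream(fake_logs(), buf)
print(buf.getvalue())  # b'line 1\nline 2\n'
```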

Source: StackOverflow