Monitor Python subprocess' output streams in real-time

Here is my solution to monitor the stdout and stderr of a subprocess in Python 3. Python’s documentation recommends using subprocess.Popen.communicate() rather than reading the pipes directly, to avoid deadlocking the application, but with communicate() you only get the output once the subprocess ends. Things get complicated if you need to monitor both stdout and stderr. I use the select module to solve this. I don’t think it works with MS Windows but that’s your problem, not mine :).
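To see the limitation concretely, here is a minimal sketch of the communicate() approach (the inline python3 -c command is my stand-in for a real program): both streams are returned in one shot, only after the process has exited.

```python
import subprocess

# communicate() waits for the process to finish, then returns
# everything at once -- nothing is available while it runs.
p = subprocess.Popen(
    ["python3", "-c",
     "import sys; print('out'); print('err', file=sys.stderr)"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = p.communicate()
print(stdout.decode().strip())
print(stderr.decode().strip())
```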

import subprocess
import logging
import select

p = subprocess.Popen(["spam", "--verbose"],
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
outputs = {p.stdout: {"EOF": False, "logcmd": logging.info},
           p.stderr: {"EOF": False, "logcmd": logging.error}}
while (not outputs[p.stdout]["EOF"] and
       not outputs[p.stderr]["EOF"]):
    # select() blocks until at least one pipe has data to read.
    for fd in select.select([p.stdout, p.stderr], [], [])[0]:
        output = fd.readline()
        if output == b"":
            outputs[fd]["EOF"] = True
        else:
            outputs[fd]["logcmd"](output.decode().rstrip())


  • The logging module
  • The select module
  • The subprocess module
  • Socket Programming HOWTO

Edit: Also read the subprocess source code, located at /usr/lib/python3.2/. You will learn how to do it with the threading module.
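For reference, here is a minimal sketch of that threading approach, assuming one reader thread per pipe (the pump helper is mine, and the inline python3 command stands in for the article’s "spam" placeholder):

```python
import subprocess
import threading
import logging

def pump(pipe, logcmd):
    # Each pipe gets its own reader thread, so a full buffer on
    # one stream can never block the other.
    for line in iter(pipe.readline, b""):
        logcmd(line.decode().rstrip())
    pipe.close()

p = subprocess.Popen(
    ["python3", "-c",
     "import sys; print('to stdout'); print('to stderr', file=sys.stderr)"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

threads = [threading.Thread(target=pump, args=(p.stdout, logging.info)),
           threading.Thread(target=pump, args=(p.stderr, logging.error))]
for t in threads:
    t.start()
for t in threads:
    t.join()
p.wait()
```

Unlike select(), this also works on MS Windows, where select() only accepts sockets.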

2012-06-02 edit

Mr Eric Pruitt was kind enough to share his improvement on this code. Thank you sir!

Date: 2012-06-02
From: Eric Pruitt

Hey Alexandre,

I was looking for some information on logging subprocess output when I found your post. I re-wrote the code in a little more compact form, and I thought you might be interested. Here is my version:

import select

def iterate_fds(handles, functions):
    # Map each file handle to the function that should log its lines.
    methods = dict(zip(handles, functions))
    while methods:
        for handle in select.select(tuple(methods), tuple(), tuple())[0]:
            line = handle.readline()
            if line:
                methods[handle](line)
            else:
                # EOF: stop watching this handle.
                del methods[handle]
In my program, I am calling the code like this:

iterate_fds((rsync.stderr, rsync.stdout), (logging.warning, logging.info))


Alexandre de Verteuil
Senior Solutions Architect

I teach people how to see the ~~matrix~~ metrics.
Monkeys and sunsets make me happy.