When building a Python web application with Flask, I ran into a common problem: calling subprocess.run with capture_output=True was blocking my entire application and causing timeouts. Here's how I fixed it with a simple multithreading approach.
The Problem
My Flask app would freeze whenever I called subprocess.run with capture_output=True:
```python
from flask import Flask, jsonify
import subprocess

app = Flask(__name__)

@app.route('/process')
def process_data():
    # This blocks the entire Python web application!
    result = subprocess.run(['python', 'script.py'],
                            capture_output=True, text=True)
    return jsonify({'output': result.stdout})
```
Issues:
- Page would freeze during processing
- Other users couldn't access the app
- Frequent timeouts on long processes
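To see the blocking behavior in isolation (a standalone sketch, not from the original post), you can time a subprocess.run call against a child process that sleeps. The caller is stuck for the full duration of the child:

```python
import subprocess
import sys
import time

start = time.monotonic()
# subprocess.run() does not return until the child process exits, so a
# single-threaded request handler is stuck here for the whole duration
result = subprocess.run(
    [sys.executable, "-c", "import time; time.sleep(2); print('done')"],
    capture_output=True, text=True,
)
elapsed = time.monotonic() - start
print(result.stdout.strip())  # done
print(elapsed >= 2)           # True: the caller waited the full 2 seconds
```

In a synchronous WSGI worker, every request handled by that worker waits behind this call, which is exactly the freeze described above.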
The Solution: Multithreading
I used Python's threading module to run subprocesses in the background:
```python
from flask import Flask, jsonify
import subprocess
import threading
import uuid

app = Flask(__name__)
jobs = {}

def run_subprocess(job_id, command):
    try:
        jobs[job_id]['status'] = 'running'
        result = subprocess.run(command, capture_output=True, text=True)
        jobs[job_id].update({
            'status': 'completed',
            'output': result.stdout,
            'error': result.stderr
        })
    except Exception as e:
        jobs[job_id].update({'status': 'error', 'error': str(e)})

@app.route('/process')
def process_data():
    job_id = str(uuid.uuid4())
    jobs[job_id] = {'status': 'queued'}

    # Start subprocess in background thread
    thread = threading.Thread(
        target=run_subprocess,
        args=(job_id, ['python', 'script.py'])
    )
    thread.daemon = True
    thread.start()

    return jsonify({'job_id': job_id, 'status': 'started'})

@app.route('/status/<job_id>')
def check_status(job_id):
    return jsonify(jobs.get(job_id, {'error': 'Job not found'}))

if __name__ == '__main__':
    app.run(threaded=True)
```
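One caveat worth noting (my addition, not from the original post): jobs is a plain dict shared across threads, and it lives only in the memory of a single process, so it won't be visible across multiple workers. CPython's GIL makes individual dict operations atomic, but if you add compound read-modify-write steps, guarding access with a threading.Lock is the safer pattern. A minimal sketch:

```python
import threading
import uuid

jobs = {}
jobs_lock = threading.Lock()

def set_status(job_id, **fields):
    # Guard compound updates to the shared dict with a lock
    with jobs_lock:
        jobs.setdefault(job_id, {}).update(fields)

def get_status(job_id):
    with jobs_lock:
        # Return a copy so callers never observe a concurrent mutation
        return dict(jobs.get(job_id, {'error': 'Job not found'}))

job_id = str(uuid.uuid4())
set_status(job_id, status='queued')
set_status(job_id, status='running')
print(get_status(job_id)['status'])  # running
```

For anything beyond a single-process deployment, a shared store (database, Redis) or a task queue like Celery would replace the in-memory dict.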
How It Works
- Start Process: The user gets an immediate response with a job ID
- Background Execution: subprocess.run with capture_output runs in a separate thread
- Check Status: The user can poll `/status/<job_id>` for updates
- Non-blocking: The Python web application stays responsive
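The steps above can be sketched end to end without Flask (a self-contained demo with a trivial child process standing in for script.py): the caller gets the job ID back immediately, and completion is observed by polling, exactly like the /status route does.

```python
import subprocess
import sys
import threading
import time
import uuid

jobs = {}

def run_job(job_id, cmd):
    # Same pattern as the Flask handler: record status, run, store output
    jobs[job_id] = {'status': 'running'}
    result = subprocess.run(cmd, capture_output=True, text=True)
    jobs[job_id] = {'status': 'completed', 'output': result.stdout}

job_id = str(uuid.uuid4())
thread = threading.Thread(
    target=run_job,
    args=(job_id, [sys.executable, '-c', "print('hello')"]),
)
thread.daemon = True
thread.start()

# The caller is free immediately; completion is detected by polling
deadline = time.monotonic() + 10
while time.monotonic() < deadline:
    if jobs.get(job_id, {}).get('status') == 'completed':
        break
    time.sleep(0.05)
print(jobs[job_id]['output'].strip())  # hello
```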
Frontend Integration
Simple JavaScript to poll for results:
```javascript
async function startProcess() {
    const response = await fetch('/process');
    const {job_id} = await response.json();

    // Poll every 2 seconds
    const interval = setInterval(async () => {
        const status = await fetch(`/status/${job_id}`);
        const result = await status.json();
        if (result.status === 'completed') {
            console.log('Output:', result.output);
            clearInterval(interval);
        } else if (result.status === 'error') {
            // Stop polling on failure too, or the interval runs forever
            console.error('Error:', result.error);
            clearInterval(interval);
        }
    }, 2000);
}
```
Key Benefits
- Responsive UI: No more frozen pages
- Better UX: Users get immediate feedback
- Scalable: Multiple processes can run simultaneously
- Production Ready: Works great with httpd/Apache restarts
Alternative: FastAPI
FastAPI offers built-in background tasks, which is another good option:
```python
from fastapi import FastAPI, BackgroundTasks
import subprocess

app = FastAPI()

def background_process():
    # Output is captured but discarded in this minimal example
    subprocess.run(['python', 'script.py'], capture_output=True)

@app.post("/process")
async def start_process(background_tasks: BackgroundTasks):
    background_tasks.add_task(background_process)
    return {"message": "started"}
```
Conclusion
Threading solved my subprocess blocking issues in the Python web application. The key was moving subprocess.run(..., capture_output=True) calls to background threads, keeping the main application responsive. This approach has been running smoothly in production, even handling httpd service restarts without issues.