I'm using a QProcess manager in my PyQt application, but I keep getting a
TypeError: done() missing 1 required positional argument: 'exit_status'. After removing the exit_status parameter, the program runs briefly but then crashes with a segmentation fault. What's going on, and how can I fix it?
If you've been working with QProcess in PyQt and connecting the finished signal to a handler, you may have run into a confusing situation: your slot receives fewer arguments than you expected, leading to a TypeError. Then, when you adjust the parameters to match, things seem to work — until a segmentation fault takes out your application entirely.
This is a known area of friction, and it comes down to differences in how QProcess.finished behaves across PyQt and PySide versions. Let's walk through what's happening and how to solve it.
Understanding the QProcess.finished signal
According to the Qt documentation, the QProcess.finished signal is emitted when a process finishes. In Qt 5, this signal had two overloads — one that sent (int exitCode, QProcess::ExitStatus exitStatus) and one that sent only (int exitCode). In Qt 6, the overloaded version was removed, and the signal now always sends both exitCode and exitStatus.
However, in practice, different versions of PyQt5 and PySide2 didn't always expose both parameters consistently. Some versions only forwarded one value through the signal, which meant your connected slot would receive fewer arguments than expected.
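You can reproduce the resulting TypeError without Qt at all. This plain-Python sketch simulates a binding that forwards only the exit code to a handler written for the two-argument form:

```python
def done(job_id, exit_code, exit_status):
    """A handler written expecting both exit values."""
    print(f"{job_id} exited with {exit_code} ({exit_status})")

# Simulate a binding that forwards only the exit code:
try:
    done("ce89ff61", 0)
except TypeError as e:
    print(e)  # done() missing 1 required positional argument: 'exit_status'
```

The error message is exactly the one from the question: Python counts the forwarded values against the handler's signature and finds one missing.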
Diagnosing the problem
A good way to see exactly what your signal is sending is to create a simple debug method that accepts anything:
def dump(self, *args):
    print(args)
Connect this to your finished signal (through whatever forwarding mechanism you're using) in place of your real handler. When the process completes, check the output. You might see something like:
('ce89ff6113c940d98c78e8a4392eb365', 0)
If you're expecting three values (a job ID, an exit code, and an exit status) but only receiving two, that confirms the signal is only sending one value through — and your handler's parameter list doesn't match.
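The diagnosis also works outside Qt. Here's a minimal sketch of the forwarding pattern with a dump-style catch-all, showing how a tuple like the one above comes about: the forwarder prepends the job ID, and the rest is whatever the signal sent.

```python
def fwd_signal(target, job_id):
    # Prepend job_id to whatever arguments the signal sends.
    return lambda *args: target(job_id, *args)

received = []

def dump(*args):
    received.append(args)

handler = fwd_signal(dump, "ce89ff6113c940d98c78e8a4392eb365")
handler(0)  # simulate a binding that forwards only the exit code
print(received[-1])  # ('ce89ff6113c940d98c78e8a4392eb365', 0)
```

Two values arrive even though the forwarder added one of them itself, which is how a "three-value" expectation quietly becomes a two-value reality.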
The fix: accept variable arguments
The most robust solution is to make your handler flexible about how many arguments it receives. Since the body of a typical done handler often doesn't depend on the exit code or exit status, you can use *args to absorb whatever the signal sends:
def done(self, job_id, *args):
    """
    Task/worker complete. Remove it from the active workers
    dictionary. We leave it in worker_state, as this is used
    to display past/complete workers too.
    """
    del self._jobs[job_id]
    self.layoutChanged.emit()
This works regardless of whether the signal sends one parameter or two after the job ID. Your code won't crash from a mismatched argument count, and it won't matter which version of PyQt or PySide you're running.
If you do need the exit code or exit status for your logic, you can pull them out of args safely:
def done(self, job_id, *args):
    exit_code = args[0] if len(args) > 0 else None
    exit_status = args[1] if len(args) > 1 else None
    print(f"Job {job_id} finished with code={exit_code}, status={exit_status}")
    del self._jobs[job_id]
    self.layoutChanged.emit()
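You can verify that this extraction behaves correctly under both signal shapes by calling the handler directly. This is a plain-Python sketch with a stand-in _jobs dictionary and no Qt involved:

```python
class Manager:
    def __init__(self):
        # Stand-ins for real QProcess objects.
        self._jobs = {"job-a": object(), "job-b": object()}

    def done(self, job_id, *args):
        # Pull out whatever the signal sent, defaulting to None.
        exit_code = args[0] if len(args) > 0 else None
        exit_status = args[1] if len(args) > 1 else None
        del self._jobs[job_id]
        return exit_code, exit_status

m = Manager()
print(m.done("job-a", 0))     # (0, None) -- only the exit code forwarded
print(m.done("job-b", 0, 1))  # (0, 1)    -- both values forwarded
```

Either way the cleanup runs, and the missing value simply comes through as None.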
Why the segmentation fault?
The segmentation fault is a separate but related issue. When your done handler crashes due to the TypeError, the job entry is never removed from your internal _jobs dictionary. This means the dictionary continues to hold a reference to a QProcess object that has already finished (and potentially been destroyed by Qt's internal cleanup). Later, when your application tries to interact with that stale reference — for example, when updating a view — it accesses memory that's no longer valid, causing the segfault.
By fixing the TypeError so that done actually completes successfully, the finished process gets properly cleaned up from the dictionary, and the segmentation fault goes away.
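The chain of failure can be sketched in plain Python: when the strict handler raises, the del never runs, so the stale entry survives. In the real application that stale entry holds a dead QProcess, and touching it later is what segfaults.

```python
class Manager:
    def __init__(self):
        self._jobs = {"job-a": "<QProcess>"}  # stand-in for a real process

    def done_strict(self, job_id, exit_code, exit_status):
        del self._jobs[job_id]

    def done_safe(self, job_id, *args):
        del self._jobs[job_id]

m = Manager()
try:
    m.done_strict("job-a", 0)  # binding forwarded only one value
except TypeError:
    pass
print("job-a" in m._jobs)  # True -- cleanup was skipped, stale entry remains

m.done_safe("job-a", 0)    # the *args version cleans up regardless
print("job-a" in m._jobs)  # False
```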
Upgrading to PyQt6
If you're moving from PyQt5 to PyQt6, this particular problem is less likely to occur. In PyQt6, the QProcess.finished signal consistently provides both exit_code (an int) and exit_status (a QProcess.ExitStatus enum value). You can write your handler with explicit parameters:
from PyQt6.QtCore import QProcess

def done(self, job_id, exit_code, exit_status):
    del self._jobs[job_id]
    self.layoutChanged.emit()
That said, using *args as shown above is still a perfectly valid defensive approach, especially if you want your code to be portable across different Qt bindings.
A complete working example
Here's a minimal, self-contained example that demonstrates running multiple QProcess instances with a forwarding pattern and a safe done handler. This uses PyQt6:
import sys
import uuid

from PyQt6.QtCore import (
    QAbstractListModel,
    QProcess,
    Qt,
)
from PyQt6.QtWidgets import (
    QApplication,
    QListView,
    QMainWindow,
    QPushButton,
    QVBoxLayout,
    QWidget,
)


def fwd_signal(target, job_id):
    """
    Create a forwarding function that prepends job_id
    to whatever arguments the signal sends.
    """
    return lambda *args: target(job_id, *args)


class ProcessManager(QAbstractListModel):
    def __init__(self):
        super().__init__()
        self._jobs = {}  # job_id: QProcess
        self._job_labels = {}  # job_id: description string

    def data(self, index, role=Qt.ItemDataRole.DisplayRole):
        if role == Qt.ItemDataRole.DisplayRole:
            job_ids = list(self._jobs.keys())
            job_id = job_ids[index.row()]
            return f"Running: {self._job_labels.get(job_id, job_id)}"
        return None

    def rowCount(self, parent=None):
        return len(self._jobs)

    def start_job(self, command, arguments=None):
        if arguments is None:
            arguments = []
        job_id = uuid.uuid4().hex
        p = QProcess()
        # Connect using the forwarding function and *args for safety.
        p.finished.connect(fwd_signal(self.done, job_id))
        self._jobs[job_id] = p
        self._job_labels[job_id] = f"{command} {' '.join(arguments)}"
        self.layoutChanged.emit()
        p.start(command, arguments)

    def done(self, job_id, *args):
        """
        Called when a process finishes. Using *args ensures this
        works regardless of how many values the finished signal sends.
        """
        exit_code = args[0] if len(args) > 0 else None
        exit_status = args[1] if len(args) > 1 else None
        label = self._job_labels.get(job_id, job_id)
        print(
            f"Finished: {label} "
            f"(exit_code={exit_code}, exit_status={exit_status})"
        )
        if job_id in self._jobs:
            del self._jobs[job_id]
        self.layoutChanged.emit()


class MainWindow(QMainWindow):
    def __init__(self):
        super().__init__()
        self.setWindowTitle("QProcess Manager")
        self.manager = ProcessManager()

        layout = QVBoxLayout()
        self.list_view = QListView()
        self.list_view.setModel(self.manager)
        layout.addWidget(self.list_view)

        btn = QPushButton("Start a process")
        btn.clicked.connect(self.start_process)
        layout.addWidget(btn)

        container = QWidget()
        container.setLayout(layout)
        self.setCentralWidget(container)

    def start_process(self):
        # Run a short-lived command. On Linux/macOS, "ping -c 3 127.0.0.1"
        # finishes after 3 pings. On Windows, use "ping -n 3 127.0.0.1".
        if sys.platform == "win32":
            self.manager.start_job("ping", ["-n", "3", "127.0.0.1"])
        else:
            self.manager.start_job("ping", ["-c", "3", "127.0.0.1"])


app = QApplication(sys.argv)
window = MainWindow()
window.show()
app.exec()
Click the button a few times to launch several ping processes. You'll see them appear in the list and disappear as they finish, with details printed to the console. No TypeError, no segfault.
Summary
- The QProcess.finished signal can send a different number of arguments depending on your PyQt/PySide version. Using *args in your handler makes it resilient to these differences.
- A TypeError in your cleanup handler can leave stale QProcess references in your data structures, which leads to segmentation faults when those references are accessed later.
- In PyQt6, the signal behavior is consistent (both exit_code and exit_status are always sent), but using *args remains a good defensive practice.
If you're working with external processes in your PyQt6 applications, the QProcess and external programs tutorial covers the fundamentals in more detail. And if you're looking at running concurrent work more broadly, multithreading with QThreadPool is worth exploring as an alternative approach.