Interrupts as User Interface
In modern systems, user interface usually means windows, widgets, and event loops. In classic DOS environments, the interface boundary often looked very different: software interrupts. INT calls were not only low-level plumbing; they were stable contracts that programs used as operating surfaces for display, input, disk services, time, and devices.
Thinking about interrupts as a user interface reveals why DOS programming felt both constrained and elegant. You were not calling giant frameworks. You were speaking a compact protocol: registers in, registers out, carry flag for status, documented side effects.
Take INT 21h, the core DOS service API. It offered file I/O, process management, memory functions, and console interaction. A text tool could feel interactive and polished while relying entirely on these calls and a handful of conventions. The interface was narrow but predictable.
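The shape of that contract can be sketched in a few lines. This is a hypothetical model, not real DOS behavior: the function number, "registers," and error code below are illustrative stand-ins for the register-in, register-out, carry-flag-for-status protocol.

```python
# Hypothetical model of an INT 21h-style call: registers in, registers out,
# carry flag signaling failure. Function numbers and services are illustrative.

def int21h(ah, dx=None, files=None):
    """Dispatch a tiny subset of DOS-style services by AH function number."""
    files = files if files is not None else {}
    if ah == 0x3D:                                 # "open file": DX -> filename
        if dx in files:
            return {"ax": files[dx], "carry": 0}   # AX = handle on success
        return {"ax": 0x02, "carry": 1}            # AX = error code, CF set
    return {"ax": 0x01, "carry": 1}                # unknown function

# The caller checks the carry flag, not an exception.
regs = int21h(0x3D, dx="README.TXT", files={"README.TXT": 5})
assert regs["carry"] == 0 and regs["ax"] == 5
```

The whole interface is one dispatch point plus a status convention, which is what made it so learnable.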
INT 10h for video and INT 16h for keyboard provided another layer. Combined, they formed a practical interaction stack:
- render character cells
- move cursor
- read key events
- update state machine
That is a full UI model, just encoded in BIOS and DOS vectors instead of GUI widget trees.
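The four-step stack above can be modeled as plain functions. This is a minimal sketch under stated assumptions: the screen is a dictionary of character cells, the key queue is a list, and the names are illustrative stand-ins for INT 10h and INT 16h services.

```python
# Minimal sketch of the BIOS-style interaction stack: cell grid, cursor,
# key queue, and a state machine driving them. Names are illustrative.

def render_cell(screen, row, col, ch):
    screen[(row, col)] = ch              # render one character cell

def move_cursor(state, row, col):
    state["cursor"] = (row, col)         # move cursor

def read_key(queue):
    return queue.pop(0) if queue else None   # read key event

def update(state, key):
    if key == "RIGHT":                   # update state machine
        r, c = state["cursor"]
        move_cursor(state, r, c + 1)

screen, state, keys = {}, {"cursor": (0, 0)}, ["RIGHT", "RIGHT"]
while (key := read_key(keys)) is not None:
    update(state, key)
render_cell(screen, *state["cursor"], "_")
```

Swap the dictionary for video memory and the list for a keyboard buffer, and this is the whole model.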
The benefit of such interfaces was explicitness. Every call had a cost and a contract. You learned quickly that "just redraw everything" could flicker and waste cycles, while selective redraws felt responsive even on modest hardware.
A classic loop looked like:
- read key via INT 16h
- map key to command/state transition
- update model
- repaint affected cells only
This remains good architecture. Event input, state transition, minimal render diff.
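The loop above, including the minimal render diff, can be sketched as follows. The counter model and render function are hypothetical; the point is that only changed cells reach the "repaint" step.

```python
# Sketch of the classic loop: key -> state transition -> model update ->
# repaint only the cells that changed since the last frame.

def diff(prev, curr):
    """Return only the cells whose contents changed between frames."""
    return {pos: ch for pos, ch in curr.items() if prev.get(pos) != ch}

def render(model):
    """Lay out the model as character cells on row 0."""
    return {(0, c): ch for c, ch in enumerate(f"count={model['count']}")}

model = {"count": 0}
prev_frame = {}

for key in ["+", "+"]:                   # read key
    if key == "+":                       # map key to command
        model["count"] += 1              # update model
    frame = render(model)
    dirty = diff(prev_frame, frame)      # repaint affected cells only
    prev_frame = frame
```

After the second keypress, `dirty` contains a single cell: the digit that changed. Everything else on the row is untouched.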
Interrupt-driven design also encouraged compatibility thinking. Programs often needed to run across BIOS implementations, DOS variants, and quirky hardware clones. Defensive coding around return flags and capability checks became normal practice.
Modern equivalent? Feature detection, graceful fallback, and compatibility shims.
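A hedged sketch of that pattern, using an invented `supports_color` capability as the thing being probed: check before you rely, and keep a plain fallback path.

```python
# Sketch of feature detection with graceful fallback, the modern analogue
# of probing BIOS capabilities before using them. Names are illustrative.

def detect_color_support(env):
    """Probe the environment; never assume the richest capability."""
    return env.get("supports_color", False)

def paint(text, env):
    if detect_color_support(env):
        return f"\x1b[32m{text}\x1b[0m"   # capable path: ANSI color
    return text                           # fallback path: plain text

assert paint("ok", {"supports_color": False}) == "ok"
```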
Error handling through flags and return codes built good habits too. You did not get exception stacks by default. You checked outcomes explicitly and handled failure paths intentionally. That style can feel verbose, but it produces robust control flow when applied consistently.
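The same discipline translates directly. A minimal sketch, with a hypothetical config reader returning an explicit error code instead of raising:

```python
# Sketch of return-code error handling: every outcome is checked explicitly
# and the failure path is handled intentionally, not via exceptions.

ERR_NOT_FOUND = 2                 # illustrative error code

def read_config(path, fs):
    """Return (data, error_code); error_code 0 means success."""
    if path not in fs:
        return None, ERR_NOT_FOUND
    return fs[path], 0

data, err = read_config("APP.CFG", fs={})
if err:                           # the failure path is a first-class branch
    data = "defaults"
```

Verbose, yes, but every control path is visible at the call site.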
There was, of course, danger. Interrupt vectors could be hooked by TSRs and drivers. Programs sharing this environment had to coexist with unknown residents. Hook chains, reentrancy concerns, and timing assumptions made debugging subtle.
Yet this ecosystem also taught composability. TSRs could extend behavior without source-level integration. Keyboard enhancers, clipboard utilities, and menu overlays effectively acted like plugins implemented through interrupt interception.
The modern analogy is middleware and event interception layers. Different mechanism, same concept.
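The hook-chain idea can be sketched in a few lines. This models the TSR pattern in spirit only: a new handler intercepts what it cares about and chains to the previous "vector" for everything else.

```python
# Sketch of a hook chain in the TSR spirit: each hook may act on an event
# and otherwise calls the previously installed handler.

def base_keyboard_handler(key):
    return f"key:{key}"               # the original "vector"

def make_hotkey_hook(next_handler):
    def hook(key):
        if key == "F10":              # intercept one hotkey...
            return "menu-overlay"
        return next_handler(key)      # ...chain everything else onward
    return hook

handler = make_hotkey_hook(base_keyboard_handler)   # "install" the hook
assert handler("a") == "key:a"
assert handler("F10") == "menu-overlay"
```

Stack several hooks and you have a plugin chain, which is exactly how resident utilities composed without source-level integration.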
Performance literacy was unavoidable. Each interrupt call touched real hardware pathways and constrained memory. Programmers learned to batch operations, avoid unnecessary mode switches, and cache where safe. This is still relevant in latency-sensitive systems.
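Batching is the easiest of these to show. A sketch with an invented `write_block` standing in for the expensive call: coalesce many small writes into a few large ones instead of paying the per-call cost every time.

```python
# Sketch of batching: coalescing many small writes into one expensive call,
# as DOS-era code batched output instead of issuing one interrupt per byte.

calls = []

def write_block(chars):
    calls.append("".join(chars))          # one expensive "call" per batch

def flush_batched(chars, batch_size=8):
    for i in range(0, len(chars), batch_size):
        write_block(chars[i:i + batch_size])

flush_batched(list("hello, interrupt world"))
assert len(calls) == 3                    # 22 chars -> 3 calls, not 22
```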
A practical lesson from INT-era code is interface minimalism. Many successful DOS tools provided excellent usability with:
- clear hotkeys
- deterministic screen layout
- immediate feedback
- low startup cost
No animation. No ornamental complexity. Just direct control and predictable behavior.
Documentation quality mattered more, too. Because interfaces were low-level, good comments and reference notes were essential. Teams that documented register usage, assumptions, and tested configurations shipped software that survived beyond one machine setup.
If you revisit DOS programming today, treat interrupts not as relics but as case studies in API design:
- small surface
- explicit contracts
- predictable error signaling
- compatibility-aware behavior
- measurable performance characteristics
These are timeless properties of good interfaces.
There is also a philosophical takeaway: user experience does not require visual complexity. A system can feel excellent when response is immediate, controls are learnable, and failure states are understandable. Interrupt-era tools often got this right under severe constraints.
You can even apply this mindset to current CLI and TUI projects. Build narrow, well-documented interfaces first. Keep interactions deterministic. Prioritize startup speed and feedback latency. Reserve abstraction for proven pain points, not speculative architecture.
Treating interrupts as a user interface is not about romanticizing old APIs. It is about recognizing that good interaction design can emerge from strict contracts and constrained channels. The medium may change, but the principles endure.
When software feels clear, responsive, and dependable, users rarely care whether the plumbing is modern or vintage. They care that the contract holds. DOS interrupts were contracts, and in that sense they were very much a UI language.