In the [8086](https://www.ampheo.com/search/8086) [microprocessor](https://www.ampheo.com/c/microprocessors), the interrupt with the lowest priority is the single-step interrupt, generated internally when the Trap Flag (TF) is set. Here's a detailed breakdown of the 8086's interrupt priority hierarchy and why the single-step interrupt sits at the bottom.

![500px-KL_AMD_D8086](https://hackmd.io/_uploads/Sk-duEoy-g.jpg)

**The Official 8086 Interrupt Priority Order**

From highest to lowest priority, the hierarchy is:

1. **Internal Hardware Interrupts & Software INT Instructions**: Divide Error (INT 0), INTO (INT 4), and INT *n* (e.g., INT 21h for DOS services). Note: this is a broad category; the INT *n* instruction, although software-triggered, is recognized at the highest priority level by the processor's internal logic.
2. **Non-Maskable Interrupt (NMI)**: A hardware interrupt for critical events such as power failure or memory parity errors. It cannot be disabled by software.
3. **Maskable Interrupt (INTR)**: The general-purpose hardware interrupt pin, which can be disabled by clearing the Interrupt Flag (IF) with the CLI instruction. This is how I/O devices (keyboard, timer, disk) request service.
4. **Single-Step Interrupt (generated by the Trap Flag)**: The lowest-priority interrupt.

**Why is the Single-Step Interrupt the Lowest?**

The reason is deeply tied to its purpose: debugging.

* **Function**: When the Trap Flag (TF) is set to 1, the 8086 automatically generates an internal type-1 interrupt after the execution of every single instruction. This allows a debugger (like DEBUG.COM in DOS) to take control after each instruction, display the state of registers and memory, and let the programmer step through code.
* **The Priority Logic**: If the single-step interrupt had a high priority, it would disrupt the normal handling of other critical interrupts.
* Imagine you are single-stepping through a program that communicates with a hard disk via the INTR pin.
* If a disk INTR occurs, the 8086 must service it promptly to avoid losing data. If the single-step interrupt had higher priority, the CPU would finish the current instruction, service the single-step interrupt (entering the debugger), and only then service the disk. That delay could be catastrophic.
* By giving it the lowest priority, the 8086 ensures that all other important interrupts (NMI for critical errors, INTR for I/O) are serviced before the debugger gets control. This keeps the system stable and responsive even during debugging.

In essence, the debugger (single-step) should observe the system, including its interrupt activity, without interfering with it.

**Visual Summary of the Interrupt Priority**

The following chart illustrates the hierarchy of [8086](https://www.ampheoelec.de/search/8086) interrupts, from the highest-priority internal events to the lowest-priority single-step debugger:

![deepseek_mermaid_20251107_847532](https://hackmd.io/_uploads/SkDiw4sybl.png)

**Key Takeaways**

* **Lowest Priority**: Single-Step Interrupt (via the Trap Flag).
* **Reason**: To ensure debugging does not interfere with the timely servicing of critical hardware events and I/O.
* **Practical Use**: This priority scheme is what allows developers to reliably debug complex, interrupt-driven software on the 8086.
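The priority logic described above can be sketched as a small simulation. This is a minimal, illustrative model only: the `PRIORITY` table and the `next_interrupt` helper are assumptions for this sketch, not part of any real 8086 interface.

```python
# Minimal model of 8086 interrupt-acknowledge priority (illustrative only).
# Lower number = higher priority, matching the hierarchy above.
PRIORITY = {
    "DIVIDE_ERROR": 0,   # internal hardware / software INT instructions
    "INT_N": 0,
    "INTO": 0,
    "NMI": 1,            # non-maskable hardware interrupt
    "INTR": 2,           # maskable hardware interrupt (gated by IF)
    "SINGLE_STEP": 3,    # Trap Flag debugger interrupt, lowest priority
}

def next_interrupt(pending, interrupt_flag=True):
    """Return the highest-priority pending interrupt, or None.

    A pending INTR is ignored while IF is clear (i.e. after CLI);
    everything else is recognized regardless of IF.
    """
    candidates = [i for i in pending
                  if not (i == "INTR" and not interrupt_flag)]
    if not candidates:
        return None
    return min(candidates, key=lambda i: PRIORITY[i])

# The disk scenario from the text: a disk INTR and the single-step
# interrupt are both pending, so the disk is serviced first.
print(next_interrupt({"INTR", "SINGLE_STEP"}))          # INTR
print(next_interrupt({"NMI", "INTR", "SINGLE_STEP"}))   # NMI
print(next_interrupt({"INTR"}, interrupt_flag=False))   # None (IF cleared by CLI)
```

Note how the single-step interrupt only "wins" when nothing else is pending, which is exactly why a debugger built on TF does not starve hardware interrupt handlers.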