Wasn't there some additional magic around BSD vs. System V-style signal handling? In one of these the signal handler is restored to the default after each invocation, so the first thing you have to do is reinstate your own handler (and you still might get surprised/terminated by a second signal that arrives too quickly to do this, and gets handled by the default signal handler).
Or is that an arcane thing that no longer applies to modern systems?
Yes, you are describing traditional System V-style signals, where the handler is reset to the default on each delivery. BSD made signal handlers persistent. Modern code (since the 1990s) uses sigaction() to install signal handlers, which lets you choose the sensible (BSD, persistent) semantics explicitly instead of relying on whatever semantics signal() happens to have on a given platform.