While Mr Gruber might not use any computer other than a Mac, the rest of the world does. There's a lot of value in following the lowest common denominator in this case -- encoding the type in the filename isn't a bad way of ensuring the type never gets lost.
His headline and his gibe about using technology from the Windows 3.1 era are about encoding filetypes in filenames and how you handle the consequences.
That's not what the article is about. Mac OS X already used file extensions to derive UTIs in the absence of type and creator codes. Extensions are a fine and practical replacement for type codes, since both describe the content, but they do nothing to replace creator codes, which can't be derived from an extension. There is no interoperability gain either: extensions were already the primary way to determine type and application binding for anything without type/creator codes, which means everything coming from other systems.
The point is that the new way is significantly more limited than the old way, which was the way the Mac had always worked. Understandably, this can be seen as a step backwards.
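The distinction can be sketched roughly like this: an extension lookup is a single system-wide mapping, so it can stand in for type codes, but it has no per-file slot for a creator. This is a toy sketch, not Apple's implementation; the lookup table is invented for illustration, though `TEXT`, `ttxt` (SimpleText), and `R*ch` (BBEdit) are real classic Mac OS type/creator codes.

```python
# Illustrative sketch: why an extension map can replace type codes
# but not creator codes. Not Apple's actual implementation.

# One system-wide table maps each extension to exactly one type.
# This is all an extension can tell you -- the same answer for every file.
EXTENSION_TO_TYPE = {
    "txt": "public.plain-text",
    "html": "public.html",
}

def file_type(filename: str) -> str:
    """Derive a type from the filename extension alone."""
    ext = filename.rsplit(".", 1)[-1].lower()
    return EXTENSION_TO_TYPE.get(ext, "public.data")

# Classic Mac OS stored both a type AND a creator per file, so two files
# of the same type could open in different applications:
classic_files = [
    {"name": "notes.txt", "type": "TEXT", "creator": "ttxt"},  # SimpleText
    {"name": "todo.txt",  "type": "TEXT", "creator": "R*ch"},  # BBEdit
]

# With extensions alone, both .txt files necessarily resolve to the same
# type and hence the same application binding -- the per-file creator
# information has nowhere to live in the filename.
assert file_type("notes.txt") == file_type("todo.txt")
```

The sketch makes the commenter's point concrete: the mapping's domain is extensions, not files, so any per-file distinction (which application "owns" this document) is unrepresentable without extra metadata.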
encoding the type in the filename isn't a bad way of ensuring the type never gets lost
The most likely way for the type to get lost is surely a user accidentally (or maliciously) changing the filename extension; that is why Windows pops up an alarmist warning when a user tries to change one. A largely invisible (but still changeable) attribute is much less likely to 'get lost' through user intervention.
Why can't one strive for better?
Given the reality of prevalent general ignorance, why not embrace it and eradicate writing in schools, replacing it with touch-typing lessons?