I understand that the drives themselves must be specified because of the differences in transfer rate among other things, but why would they bother doing this with cables if cable compatibility with all three protocols were a given?
You happened to quote my answer to this question, though admittedly it might be a little jargon-filled. I think @Certificate of Excellence and @HrutkayMods brought up good details here on testing cables, as well as a good alternative explanation for the issue you are running into, and I don't really have anything to add on those points.
What I will say is that this is pretty common across many connection protocols. Higher speeds over the same pinout are easy if you can increase the signaling frequency and add some way to handshake up to the higher frequency (think of how modems handshake to determine how quickly they can communicate). But faster signaling frequencies mean you are more susceptible to electromagnetic interference; any suitable wire is an antenna, after all. So these faster speeds place more requirements on shielding the cable from that interference. HDMI/DisplayPort are probably the most obvious examples of this, where an old cable may or may not handle newer speeds it wasn't rated for when it was made, depending on the length of the cable and how well it was built.
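To make the handshake idea concrete, here's a toy sketch in Python. It isn't any real protocol's link training; the rates, the `cable_quality` score, and the error test are all made up. The point is just the shape of the logic: try the fastest rate, check whether the link is clean, and fall back if it isn't.

```python
import random

# Try the fastest rate first, then fall back (hypothetical figures).
CANDIDATE_RATES_GBPS = [48, 40, 18, 10]

def link_is_clean(rate_gbps, cable_quality):
    """Pretend bit-error test: higher rates demand a better cable.
    cable_quality is a made-up 0..1 score standing in for shielding and length."""
    required_quality = rate_gbps / max(CANDIDATE_RATES_GBPS)
    # A little noise so marginal cables sometimes pass, sometimes fail.
    return cable_quality + random.uniform(-0.05, 0.05) >= required_quality

def negotiate_rate(cable_quality):
    """Walk down the rate list until the error test passes."""
    for rate in CANDIDATE_RATES_GBPS:
        if link_is_clean(rate, cable_quality):
            return rate
    return None  # no usable link at all

if __name__ == "__main__":
    for quality in (1.0, 0.6, 0.2):
        rate = negotiate_rate(quality)
        print(f"cable quality {quality}: "
              + (f"{rate} Gbps" if rate else "no usable link"))
```

This is also why the same physical cable can "work" at a lower speed and silently drop to it: the negotiation settles wherever the error test happens to pass.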
Ethernet is probably the cleanest of the bunch. By defining clear categories, there's not much wiggle room in terms of, say, a Cat 6 cable that somehow meets the requirements of Cat 7, so you can make very specific claims about the performance of a given Ethernet cable. But if you look at what makes the categories different, it's all about better and better rejection of interference.
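As a rough illustration of how clean those categories are, here's a small sketch using the commonly cited bandwidth ratings for each category (check the TIA/ISO specs for the authoritative numbers); the `meets_category` helper is just for illustration.

```python
# Commonly cited bandwidth ratings for twisted-pair categories, in MHz.
CATEGORY_BANDWIDTH_MHZ = {
    "Cat 5e": 100,
    "Cat 6": 250,
    "Cat 6a": 500,
    "Cat 7": 600,
    "Cat 8": 2000,
}

def meets_category(cable_bandwidth_mhz, category):
    """A cable satisfies a category only if it's rated for at least that bandwidth."""
    return cable_bandwidth_mhz >= CATEGORY_BANDWIDTH_MHZ[category]

print(meets_category(250, "Cat 7"))  # False: a Cat 6 cable doesn't satisfy Cat 7
```

The higher bandwidth ratings are achieved with tighter twists and more shielding, which is exactly the interference rejection mentioned above.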