One hex digit is a nibble, and you're right: a modern computer has no native way to operate on a 4-bit value. It has to fake it by loading a full byte and zeroing out (masking off) the top four bits.
IBM mainframes (the 360/370/390 series) did have hardware support for packed-decimal (BCD) arithmetic, where each byte holds two decimal digits, one in each nibble.