You don't seem to be following me. "Codec" is the abbreviation for "coder/decoder", so the combination of hardware and software in the receiver that converts the broadcast stream back into video is just as much a codec as the thing at the broadcast end. OK, so the encoders have improved (which is where the real heavy lifting is - I don't have the foggiest how it's done), but if you are saying that one device renders the video from the same encoded stream better than another (with otherwise identical video settings), then clearly the two devices must have different codecs (i.e. the decoder part of the "codec").
That said, I find that difficult to accept. The decoder uses the encoded stream as a recipe to reconstruct the video, and it simply follows the stream specification. There's no judgement call - that's all done by the encoder (which may improve in the way it encodes the video, but it can't do anything dramatically different because that would break compatibility with existing decoders). Every decoder should produce identical output from the same encoded stream (presuming they have been written to the same specification), so any remaining differences are the result of post-processing (artificial sharpening, for example) or user perception.
You have, of course, been comparing images without applying scaling anywhere?
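For what it's worth, if you wanted to test that claim on a PC, a rough sketch along these lines (Python with the PyAV and NumPy packages - the file name is just a placeholder) would hash the raw decoded frames, so you could compare two software decoders for bit-exact output:

import hashlib

import av  # PyAV bindings around FFmpeg's decoders


def frame_digest(path: str) -> str:
    """Return one hash covering every raw decoded video frame."""
    digest = hashlib.sha256()
    container = av.open(path)
    for frame in container.decode(video=0):
        # to_ndarray() exposes the decoded pixels with no scaling or
        # post-processing applied, which is what we want to compare.
        digest.update(frame.to_ndarray().tobytes())
    container.close()
    return digest.hexdigest()


print(frame_digest("clip.ts"))

For the modern codecs (H.264/HEVC and the like) the specification requires bit-exact reconstruction, so any two conforming decoders should give you the same hash; the differences only creep in once scaling and post-processing get involved.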