Watch recorded HD content from FOXSAT on HDR-FOX T2

Streaming from the Foxsat-HDR to the HDR-FOX T2: only the left/right transport buttons and pause work, and the picture quality is not quite as good as on the Freetime box.
But it works, and might be sufficient. The picture quality is undoubtedly the same as any other picture from the HDR-FOX - some people say it looks soft, but it simply doesn't apply any sharpening to the image. That can be done in the TV if required. Unless you are saying the HDR1000S uses a different codec, there can't be any fundamental difference between HDMI outputs from the same source data (assuming the video output settings are the same).
 

Of course it uses the same codecs; however, it clearly has superior video processing/decoding. If you think about it, you are saying that H.264/AVC decoders have not improved at all since the first-generation devices. Compare a Foxsat-HDR with an HDR1000S: the picture is a lot better. Sky Q users are widely reporting that the new box has superior pictures to the existing Sky HD boxes. It's the same source.
 
Haven't tried playing them over the network. Just used the network to copy them once I found that a copied .ts played OK.
 
Wishful thinking because of their investment?

Possibly, but I doubt it. Some less enlightened owners are putting it down to it having de-interlacing to 1080p built in, which would of course only make a difference if they have a rubbish TV. In a blind test between a Foxsat-HDR and an HDR1000S the difference is very obvious. It's so pronounced that I always stream HD content recorded on the Foxsat-HDR via the HDR1000S. If you replay the recording as well as stream it, flicking between HDMI inputs on the TV (the Foxsat has its own HDMI input), the difference is very marked.
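As an aside, the de-interlacing variable can be taken out of a comparison entirely by deinterlacing the recording in software first. A minimal sketch, driving ffmpeg's yadif filter from Python; the file names are placeholders rather than anything from this thread:

```python
import subprocess

# Deinterlace a 1080i recording to progressive frames in software using
# ffmpeg's yadif filter, so the TV's (or box's) deinterlacer is taken out
# of the comparison. "recording.ts" and the output name are placeholders.
subprocess.run(
    [
        "ffmpeg",
        "-i", "recording.ts",        # hypothetical 1080i H.264 recording
        "-vf", "yadif=mode=1",       # one frame per field, i.e. 50i -> 50p
        "-c:v", "libx264", "-crf", "18",
        "-c:a", "copy",              # leave the audio untouched
        "recording_progressive.mkv",
    ],
    check=True,
)
```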
 
So you are saying that the HDR1000S has different (i.e. newer) codecs; otherwise the outputs (the result of converting identical input using identical implementations of the algorithm) would be identical too (except for subsequent image "enhancement"). There can't be any other explanation.
 

How do codecs come into it? The video is compressed using the H.264/AVC codec. The quality of the uncompressed video output on HDMI depends on several factors, the primary one being how good a job the AVC decoder makes of reconstructing the video data lost by the broadcaster's compression encoders (in your words, the decoding algorithm is superior). AVC encoders are now much more efficient; don't you think the decoding process has also improved? Do all HD TVs have the same picture quality on HDMI? They all get the same data streams. The fact remains that until you actually compare the two items I refer to, you are arguing from zero experience.

You could make exactly the same argument about video scalers. The 576i-to-1080p scaler in my Blu-ray player makes DVDs look a lot better than they did in my Denon DVD player, despite the Blu-ray player costing about a third of the price of the Denon.
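The scaler point is easy to demonstrate concretely: feed the same frame to two different interpolation kernels and the results differ visibly. A minimal sketch using Pillow, assuming a 720x576 frame grab saved as "dvd_frame.png" (a placeholder name):

```python
from PIL import Image, ImageChops

# Upscale the same 576-line frame with two different interpolation kernels
# and save the per-pixel difference. "dvd_frame.png" is a placeholder.
src = Image.open("dvd_frame.png").convert("RGB")   # e.g. a 720x576 frame grab
target = (1920, 1080)

bilinear = src.resize(target, Image.Resampling.BILINEAR)
lanczos = src.resize(target, Image.Resampling.LANCZOS)

bilinear.save("upscaled_bilinear.png")
lanczos.save("upscaled_lanczos.png")

# A non-black difference image shows: two scalers, same input, different output.
ImageChops.difference(bilinear, lanczos).save("scaler_difference.png")
```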
 
You don't seem to be following me. "Codec" is the abbreviation for "coder/decoder", so the combination of hardware and software in the receiver which converts the broadcast stream back to video is just as much a codec as the thing at the broadcast end. OK, so the encoders have improved (which is where the real heavy lifting is; I don't have the foggiest how it's done), but if you are saying that one device renders the video from the same encoded stream better than another (with otherwise identical video settings), then clearly the two devices must have different codecs (i.e. the decoder part of "codec").

That said, I find it difficult to accept. The decoder uses the encoded stream as a recipe to reconstruct the video, and it just does so in accordance with the stream specification. There's no judgement call; that's all done by the encoder (which may improve in the way it encodes the video, but it can't do anything dramatically different because it could render existing decoders incompatible). Every decoder should produce identical output from the same encoded stream (presuming they have been written to the same specification), so any remaining differences are the result of post-processing (artificial sharpening) or user perception.

You have, of course, been comparing images without applying scaling anywhere?
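One way to test the "identical output" claim directly, rather than by eye, is to dump a checksum for every decoded frame and compare the lists produced by different decoder implementations. A sketch using ffmpeg's framemd5 muxer from Python; the recording path and the second ffmpeg build are placeholder assumptions:

```python
import subprocess

def frame_md5s(input_ts: str, ffmpeg: str = "ffmpeg") -> list[str]:
    """Decode the video stream and return one MD5 per decoded frame, using
    ffmpeg's framemd5 muxer. Which build (i.e. which decoder implementation)
    does the decoding is the variable under test."""
    result = subprocess.run(
        [ffmpeg, "-v", "error", "-i", input_ts, "-an", "-f", "framemd5", "-"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in result.stdout.splitlines()
            if line and not line.startswith("#")]

# Placeholder paths: decode the same recording with two different decoder
# builds and compare. Matching checksums mean bit-identical decoded frames.
a = frame_md5s("recording.ts")
b = frame_md5s("recording.ts", ffmpeg="/opt/other-build/ffmpeg")
print("bit-identical decodes" if a == b else "the decoders differ")
```

Matching lists would mean the two decoders produced bit-identical frames; any real difference would have to show up here before it could show up on screen.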
 

Both are original 1080i, with the TV in both cases doing the de-interlacing and the same TV settings on the two HDMI inputs. If you do a bit of research you should find a load of posts about the Foxsat-HDR picture compared to the HDR1000S. Are we all deluded? In fact the difference in picture quality may well be discernible in a photo of the screen. TBH I can't be bothered; you are clearly now arguing for the sake of it. Anyone else have the two boxes to corroborate or otherwise? The HDR-FOX T2 also has superior pictures (also much reported), though in this case the video stream isn't the same, of course.
 
TBH I can't be bothered; you are clearly now arguing for the sake of it.
That's just plain ridiculous. Either you are out of your depth (not understanding what a codec is), or you don't like having your beliefs challenged. I've given a full technical explanation of my side of the debate, so clearly I'm not just calling black white.

I would like not to bother with this, but for the benefit of the forum we should get to the bottom of it.

Summary: GT thinks the new units offer a better picture quality because they have improved decoders. BH thinks the decoders cannot reconstruct the picture from the encoded stream in any different way, and any perceived difference in picture quality is entirely due to other factors. How can we resolve this?
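One way to move past "it looks better to me" would be to capture the same frame from each box's HDMI output with a capture card and compare the captures numerically, for example with PSNR. A minimal sketch with placeholder filenames; it assumes the two captures are the same resolution and show exactly the same frame:

```python
import numpy as np
from PIL import Image

def psnr(path_a: str, path_b: str) -> float:
    """Peak signal-to-noise ratio between two same-sized RGB captures."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")          # the captures are identical
    return 10.0 * np.log10(255.0 ** 2 / mse)

# Placeholder filenames: HDMI captures of the same source frame from each box.
print(psnr("foxsat_hdr_frame.png", "hdr1000s_frame.png"))
```

If the captures come out effectively identical, any perceived difference must lie in later processing or perception; if they differ measurably, the boxes really are putting different pictures on HDMI from the same stream.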
 

What the hell does it matter? The picture quality is better on the HDR1000S than on the Foxsat-HDR. No amount of the considerable setting options available on a high-end Panasonic TV will improve the Foxsat picture to anywhere near that of the HDR1000S. That's a fact; there is nothing to resolve.

I found a test of various H.264/AVC decoders which determined that the presence or absence of a deblocking loop filter affects the capability of the actual decoder (this is a hardware device, not software).

From "Overview of H.264 Decoding":

H.264 uses an adaptive de-blocking filter that operates on the horizontal and vertical block edges within the prediction loop in order to remove artifacts caused by block prediction errors [13]. The filtering is generally based on 4x4 block boundaries, in which two pixels on either side of the boundary may be updated using a different filter. The filter smoothens block edges, improving the appearance of decoded frames. The filtered image is then used for motion-compensated prediction of future frames. The inclusion of the deblocking filter before the motion-compensated prediction stage is beneficial in terms of compression efficiency.

The conclusions are reported as:

The H.264 decoder is implemented on ARM9, TMS320DM642 and Pentium 4 processor. Various parameters such as PSNR, SSIM, MSAD and MSE are calculated for the different video sequences on the three processors. From tables, TI DSP performs better than the other processors for implementing H.264 decoder without deblocking filter than any other processors. Also the decoding time for TI processor is less compared to other processors considered
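The loop filter's contribution is something that can be experimented with directly: ffmpeg's H.264 decoder accepts a -skip_loop_filter option, so the same recording can be decoded with and without in-loop deblocking and the two decodes compared. A hedged sketch with a placeholder filename:

```python
import subprocess

# Decode a placeholder H.264 recording twice: once normally, once with the
# in-loop deblocking filter skipped, storing both losslessly (FFV1) so the
# decoded frames are preserved exactly. "recording.ts" is an assumption.
for name, extra in [("with_deblock", []),
                    ("no_deblock", ["-skip_loop_filter", "all"])]:
    subprocess.run(
        ["ffmpeg", "-v", "error", *extra, "-i", "recording.ts",
         "-an", "-c:v", "ffv1", f"{name}.mkv"],
        check=True,
    )

# Let ffmpeg's psnr filter report how much skipping the deblocking filter
# changed the decoded picture.
subprocess.run(
    ["ffmpeg", "-i", "with_deblock.mkv", "-i", "no_deblock.mkv",
     "-lavfi", "[0:v][1:v]psnr", "-f", "null", "-"],
    check=True,
)
```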
 