Two FoxT2s playing through the same TV displaying differently

mike_m

Active Member
I've had two HDR FOX T2s running happily alongside each other for many years now, but today I noticed a new discrepancy between the two. They're both playing through the same TV, but one of them is not as sharp as the other. The difference is not enormous, but enough that I see it clearly. I've checked all the settings I can think of on both the TV and the Hummys, and they're all identical. Both Hummys are connected via HDMI.

I attach two pics taken with a fixed camera on a tripod, switching the TV input from one Hummy to the other; both images have the same exposure and focus. The effect is visible on both the programmes themselves and the programme guide, though I'm using the programme guide to demonstrate the anomaly, as it allows a like-for-like comparison. As can be seen from the images, Hummy B is not as sharp as Hummy A, with the type broken up and fainter. The Freeview logo is noticeably blurry, and the blue block behind 'Dragon's Den' shows some echo artifacts. Also, the Hummy B image is slightly larger on screen, as if magnified.

Any ideas as to what might cause this?
 

Attachments

  • HumB.jpg (210.8 KB)
  • HumA.jpg (212.6 KB)
Presumably you have these on two different inputs on the TV. I suspect they have different scaling settings - use Full or whatever it's called.
To prove it, just use one input. Or swap them over and see whether the fault follows the input selection or the source.
 
1. Confirm the 'FOXes are both set to the same video output parameters (somebody might have accidentally pressed a button): press the "V-FORMAT" button once only to bring up the video output setting on-screen, and the "WIDE" button once only to bring up the presentation setting.

2. As per prpr's post above, check the TV settings for each input - one may be set to "just scan"/"1:1" and the other to "16:9"/"TV" (different manufacturers call them different things, but the former shows everything while the latter only shows the central 90-ish percent to mask edge defects in TV transmission - see https://hummy.tv/forum/threads/aspect-ratio.2045/post-42098). Yes, modern TVs do remember different settings for different inputs. The process of enlarging the image means the actual displayed pixels have to be computed from the values of several input pixels ("interpolation", AKA "scaling"), and the quality of the result depends on the quality of the scaler.
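By way of illustration, here is a minimal Python sketch of bilinear interpolation, one common way a scaler computes each displayed pixel from several input pixels. It's a toy under simple assumptions (greyscale values in nested lists); real TV scalers use more sophisticated filters, which is exactly where the quality differences come from.

```python
# A minimal sketch of bilinear interpolation, the kind of scaling an
# overscan mode performs: each output pixel is computed from the four
# nearest input pixels. Pure illustration -- real scalers use more
# sophisticated (and proprietary) filters.

def scale_bilinear(src, out_w, out_h):
    """Scale a 2D list of greyscale pixel values to out_w x out_h."""
    src_h, src_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        # Map the output coordinate back into the source image.
        fy = y * (src_h - 1) / (out_h - 1) if out_h > 1 else 0
        y0 = int(fy)
        y1 = min(y0 + 1, src_h - 1)
        wy = fy - y0
        row = []
        for x in range(out_w):
            fx = x * (src_w - 1) / (out_w - 1) if out_w > 1 else 0
            x0 = int(fx)
            x1 = min(x0 + 1, src_w - 1)
            wx = fx - x0
            # Weighted blend of the four neighbouring source pixels.
            top = src[y0][x0] * (1 - wx) + src[y0][x1] * wx
            bot = src[y1][x0] * (1 - wx) + src[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

# A sharp one-pixel edge gets smeared across the enlarged output --
# the kind of softening seen when an image is rescaled rather than
# displayed 1:1.
edge = [[0, 0, 255, 255]] * 4
for row in scale_bilinear(edge, 8, 4):
    print([round(v) for v in row])   # [0, 0, 0, 73, 182, 255, 255, 255]
```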

If both 'FOXes are set the same, then displaying their outputs on the same TV and same HDMI input cannot produce different results. It's a digital system.

See also Things Every... (click) section 16.
 
Last edited:
@prpr :
The HDRs are on two different inputs. I swapped the inputs, and the effect moved to the other input, i.e. the fault is coming down the wire from the Hummy rather than being in the TV. Sorry, I should have done this test first!
 
@Black Hole:
Yes, that was it! I had failed to check the V-format & wide settings. For some reason the V setting was on 576. Many thanks for that! No doubt pressed accidentally while fumbling about with no glasses on!
 
Mine regularly switches itself from 1080p to 576. I also notice occasional video hiccups at 1080p. At 1080i it's more stable, but still occasionally switches itself to 576. When this happens, pressing V-Format suggests that the Humax UI still thinks it's at 1080, but the telly's info panel says otherwise.
 
1080p has twice the HDMI data rate of 1080i, and nothing is actually transmitted at 1080p so...
 
All of my variables (i.e. those that can be changed from the remote) are set in the boot-settings package, plus a few other things. Hence, should SWMBO catch a button unwittingly, a reboot as a last resort will fix it.
Yet another very useful package in WebIf :doublethumbsup:
 
nothing is actually transmitted at 1080p so...
About 20% of Freeview HD is broadcast as 1080p.
The current sample for Crystal Palace from http://digitalbitrate.com/dtv.php?liste=1&live=9&lang=en&mux=BBCB-PSB3 has BBC ONE HD, CBBC HD, Channel 4+1 HD and CBeebies HD as progressive, with the others interlaced.
Channel 4+1 surprised me, as I've only ever noticed the occasional BBC and Channel 5 HD programme as progressive.

There is an option on the Humax Freeview HD recorders to present all HD channels to the TV as either all 1080p or all 1080i. On the HDR-FOX T2 the info panel for the programme being watched will show the broadcast 'v-format', but on my HDR-FOX T2 it can take a few seconds to switch. If you have the Info Display Time set to 3 seconds or less, it may not always show the correct value before the info panel auto-closes.

1080p has twice the HDMI data rate of 1080i
Are you serious?
When all other aspects are equal, the video data rates for a whole frame are the same.
 
Last edited:
There is no point in interlacing except to save bandwidth. Interlacing is used in an analogue system to double the perceived frame rate (50 fps, UK* - necessary to prevent a noticeable flicker) while only transmitting the equivalent of a full frame's worth of data 25 times per second. It is reasonable to assume interlacing is used in a digital system for a similar reason (in this case letting the interpolation recreate the missing alternate lines and produce 50 fps at the display, either by simply repeating the alternate lines from the previous half-frame or by computing a best guess from the previous half-frame and the current half-frame).
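For illustration, here is a toy Python sketch of the two reconstruction strategies just described. The field contents and the `weave`/`blend` names are illustrative, not what any particular TV or PVR actually implements.

```python
# Two ways to rebuild a full frame from interlaced half-frames.
# Each "field" is a list of alternate scanlines (illustrative 2x2
# sizes only; a real 1080i field is 1920x540).

def weave(cur_field, prev_field):
    """Fill the missing lines by repeating those of the previous field."""
    frame = []
    for cur, prev in zip(cur_field, prev_field):
        frame.append(cur)
        frame.append(prev)   # line carried over unchanged
    return frame

def blend(cur_field, prev_field):
    """Fill the missing lines with a best guess: here, the average of
    the previous field's line and the current field's line above it."""
    frame = []
    for i, cur in enumerate(cur_field):
        frame.append(cur)
        prev = prev_field[i]
        frame.append([(a + b) / 2 for a, b in zip(cur, prev)])
    return frame

cur  = [[10, 10], [30, 30]]   # current half-frame: lines 0 and 2
prev = [[20, 20], [40, 40]]   # previous half-frame: lines 1 and 3
print(weave(cur, prev))  # [[10, 10], [20, 20], [30, 30], [40, 40]]
print(blend(cur, prev))  # [[10, 10], [15.0, 15.0], [30, 30], [35.0, 35.0]]
```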

I stand by what I said before: the 1080p setting (1080p50) for the HDMI link is double the data rate of the 1080i setting (1080i25**), therefore more taxing on the link itself and also more likely to produce interference. Receiving/recording a broadcast in 1080i and passing it to the TV as 1080p employs the PVR's interpolator to convert from i to p. Passing it to the TV as 1080i employs the TV's interpolator instead. If the TV's interpolator is crap or if you are sure the source material is 1080p you might want to use 1080p for the HDMI link, but otherwise it's not worth it.

* 50 fps was chosen for TV to align with the UK mains electricity frequency, which would otherwise have produced intermodulation visual effects (bands of light and dark) with the CRT display technology of the time. The US and other parts of the world standardised on 60 fps instead, because that's their mains frequency.

** Incidentally, my LG TV identifies the HDR-FOX's "1080i" output as "50Hz", but I'm pretty sure it's actually 1080i25 for the reasons stated above and therefore the TV is reporting the field rate (or maybe the reconstructed frame rate) rather than the actual frame rate.
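A quick back-of-envelope check of that factor of two, counting active pixels only (ignoring blanking intervals and HDMI encoding overhead):

```python
# Active-pixel rates for the two HDMI link settings discussed above.
W, H = 1920, 1080

p50 = W * H * 50          # 1080p50: 50 full frames per second
i25 = W * (H // 2) * 50   # 1080i25: 50 half-height fields per second

print(f"1080p50: {p50:,} pixels/s")   # 103,680,000
print(f"1080i25: {i25:,} pixels/s")   # 51,840,000
print(p50 / i25)                      # 2.0
```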
 
Last edited: