Picture Quality




Had no problem with picture quality on my old CRT, but I'm cursing the day I had to buy an HD LCD!

I know that to get the best standard-def picture on an HD set, I should look to reduce the size of the screen rather than go for massive resolutions, as each SD "pixel" equates to several pixels on the screen, resulting in a blocky picture when viewed up close.

So, I've been grousing about the picture quality for a long time now (anyone noticed the "halo" around footballers?) and I simply put this down to being SD on an LCD *AND* the massive reduction in bandwidth required to shoehorn HD signals into the available frequencies.

But ... I noticed over the weekend that the picture from the Freeview tuner embedded in the TV is much better than the one being fed from my Hummy (less fuzz, sharper).

What gives? Is it simply because it is coming via SCART? Should I have altered settings on the Hummy when moving from CRT to LCD? Currently set to RGB, if I recall correctly. TV is a Samsung 32" - not a bargain-basement buy, probably mid-range (around £360, paid over a year ago). Same SCART lead as I used previously, AFAIK.

Any ideas?

RGB is the best quality SCART connection. Make sure that the SCART socket on the TV is RGB capable (not all are) and that you are using the TV SCART on the 9200 (the VCR SCART does not do RGB). If you are connected via a SCART socket without RGB, you will be using a composite video signal, the worst possible connection. Not sure where the comment about HD comes from - the 9200 is strictly an SD device. HD transmission uses a more efficient compression system than SD (H.264, i.e. MPEG-4 AVC, rather than MPEG-2), and SCART connections don't carry HD anyway (the best they do is SD 576i).
There have been a few threads over the years about this. Basically it is because the Humax does not apply any artificial sharpening to the SD signal and your TV probably does. Your TV probably has a sharpening function for the input that the Humax is connected to that you might want to experiment with.

Humax once said:
Humax have actually debated adding artificial 'enhancement' options to the menus but each time we realise that we are only doing it because some people prefer 'enhanced' pictures rather than the cleanest picture. Thus so far we have decided not to do this and if we do it I will feel a little more dirty.

... A little sharpening can be added to try and give the feeling of extra detail in the scene (which does not exist because the image originated as SD) but the degree to which sharpening is added is a matter of opinion it seems.

Sharpening is the process by which the device looks for 'edges' in a scene and then exaggerates them so that they are more distinct. It is worth remembering that no more detail can be created in a scene than was broadcast; what is lost in encoding can never be restored.
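A crude sketch of the idea, in Python rather than anything a TV actually runs: unsharp masking blurs the signal, treats the difference from the original as 'edge' detail, and adds a scaled copy of it back. The over- and undershoot either side of the edge is exactly the "halo" effect mentioned earlier.

```python
def sharpen(row, amount=1.0):
    """Unsharp mask on a list of luma samples: blur with a 3-tap
    filter, then add back the difference (the 'edge' detail)
    scaled by `amount`. Illustrative only - a real TV's
    sharpness control is more elaborate."""
    out = []
    for i, v in enumerate(row):
        left = row[max(i - 1, 0)]                  # clamp at the ends
        right = row[min(i + 1, len(row) - 1)]
        blurred = 0.25 * left + 0.5 * v + 0.25 * right
        detail = v - blurred                       # mostly edge information
        out.append(min(max(v + amount * detail, 0), 255))
    return out

# A soft edge ramping from dark to light:
edge = [50, 50, 50, 100, 150, 150, 150]
print(sharpen(edge, amount=1.5))
# -> [50.0, 50.0, 31.25, 100.0, 168.75, 150.0, 150.0]
```

Note the dip to 31.25 before the edge and the spike to 168.75 after it: no new detail, just exaggerated transitions.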

See http://forums.digitalspy.co.uk/showthread.php?t=1416176 if you want to read a thread about this relating to the Fox T2, including comments from Humax. It would make sense that their stance also influenced the 9200.

The reference to HD is this. Once upon a time, Freeview signals had a much wider bandwidth - I recall the BBC channels had two to three times as much as some of the others. But to satisfy the tech-heads, HD needed to fit into the available spectrum, and so bandwidth was severely reduced, which, AIUI, affects the picture that can be seen.


Many thanks - I'll check the link later. This reminds me of the "up-scaling" that takes place on some DVD players. Interestingly, my DVD recorder does upscale, but I can only guess that it only upscales DVD playback, as the Freeview tuner inside it is not much better than the Hummy. And as the DVD recorder uses an HDMI connection to the TV, it seems more likely that the artificial sharpening mentioned above is the reason.

I'm so unsure about getting an HD receiver. I'd like to think it would make me think "thank God, I get a decent picture for some of my viewing", but I'm inclined to think I will be "glass-half-empty" and think "oh God, I can't bear to watch these channels in SD, they're awful".




Aha, now I get it - you are referring to the loss of a mux to accommodate HD. I only recently replaced a 9200 in my kit rack with an HD FOX T2. Before then, the 9200 was connected via an RGB-to-component converter, and my amplifier upscaled it to 1080i and delivered the pictures over HDMI (along with a Topfield 5800 and a Denon DVD player). Both the 9200 and the 5800 delivered marginally better pictures than my Sony TV's built-in Freeview tuner, likely because the AV amplifier's Faroudja scaler is slightly better than the TV's. Satellite is not as bitrate-starved as Freeview where SD is concerned; before ITV1 HD launched, ITV1 West Country had a really good picture at around 7 Mbps. I've not checked recently, as I just watch ITV1 Granada in full 1920 x 1080.

Are you sure that you are actually viewing RGB? The difference you describe sounds more marked than just different signal processing. Incidentally, you can do the same yourself at the TV end, as the picture adjustments are almost certainly input-specific on a modern TV.
OK! Thanks for the help and advice.

Scart lead out of RGB enabled "TV Scart" socket on Hummy to the single Scart socket on the TV. Haven't checked yet whether the TV scart socket is RGB or not.

My Samsung is now set up with

Sharpness 50 (middle figure - unchanged from default)
Edge enhancement is now off
Digital noise filter is now off
MPEG noise filter is now off

I think the latter may explain why, when The One Show showed a sky shot of a bird flying last night, the sky looked very blocky. I'll continue to fiddle with the settings to see what comes out best for my tastes. As far as I am aware, all extra processing of the signal is now switched off.

I believe I understand the issue of pixel mapping - if a signal is sent as, say, 576 x 425 (whatever!) and you ask it to display on a 1080 HD screen, then each original pixel actually becomes several on the screen. Can I therefore assume that pixel mapping means 1:1 mapping, and that an SD picture pixel-mapped onto an HD screen will only utilise part of the screen? Unfortunately, nothing I saw on the TV menu suggested that option was available to me.
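The arithmetic behind that guess is easy to sketch. Taking 720 x 576 (the best SD broadcast case) purely as an illustration:

```python
# How much of a 1920x1080 panel a 720x576 SD frame would cover
# with strict 1:1 pixel mapping, versus the scale factors the TV
# applies when it fills the screen instead. Illustrative figures only.
sd_w, sd_h = 720, 576          # best-case SD broadcast resolution
hd_w, hd_h = 1920, 1080        # full-HD panel

coverage = (sd_w * sd_h) / (hd_w * hd_h)
print(f"1:1 mapping uses {coverage:.0%} of the panel")
# -> 1:1 mapping uses 20% of the panel

scale_x, scale_y = hd_w / sd_w, hd_h / sd_h
print(f"full-screen scaling: x{scale_x:.2f} horizontally, "
      f"x{scale_y:.2f} vertically")
```

So yes: mapped 1:1, SD would occupy only about a fifth of the panel, and filling the screen means each SD pixel is smeared across roughly five panel pixels.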

Initial comparison between the Freeview tuner in the TV and the Hummy (with the settings above) suggests that people's faces are "muddier" on the Hummy - by that I mean that large blocks of cheek appear to be all one colour, rather than lots of gradation of colour.

And a sweeping shot of a Wimbledon court this morning showed the wonderful jagged painted lines of the court quite beautifully - must make calling a ball out very difficult ;-). I guess that is edge enhancement set to off for you!

I do hope no-one suggests the SCART lead is crap - it's plastered into the wall!

Just to explain a bit more about pixel mapping and resolution. Standard Broadcast Definition in the UK uses 576 lines. The horizontal resolution varies with the channel.

The rubbish ones are only 544 pixels wide, ITV uses 704 and the BBC the full 720. If you have a full-HD 1920 x 1080 display, the TV will invent the extra pixels to give you 1920 x 1080; given a 16:9 transmission, you won't have any missing pixels (a process known as scaling). This works because pixels don't have any inherent shape - for instance, in a 544 x 576 picture they have to be interpreted as rectangular (often known as anamorphic).

A PAL DVD is compressed using MPEG-2, like SD DVB, and is normally the same resolution as the best SD TV, i.e. 720 x 576. This is the same for 4:3 and 16:9 content, so the extra quality is simply down to higher bitrates; again, the difference between the two is down to the perceived shape of the pixels (technically the Pixel Aspect Ratio, or PAR).

Pretty well all broadcast TV uses interlacing to deliver the pictures. This sends each picture in two halves, with the odd lines in one field followed by the even lines in the next. Each field takes 1/50 second, so the two combined give a full frame every 1/25 second. A CRT sends the lines to the screen as they are received, building the picture in front of your eyes. This is the "i" in 576i.
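Ignoring the analogue-blanking subtleties, the pixel-shape point boils down to one line of arithmetic: PAR = display aspect ratio / storage aspect ratio. A sketch for the three broadcast widths mentioned above, assuming a 16:9 display aspect ratio throughout:

```python
from fractions import Fraction

def pixel_aspect_ratio(width, height=576, dar=Fraction(16, 9)):
    """PAR needed for width x height samples to fill a display of
    aspect ratio `dar` (PAR = DAR / SAR). Simplified geometry -
    real broadcast specs adjust for analogue blanking."""
    sar = Fraction(width, height)   # storage aspect ratio
    return dar / sar

for w in (544, 704, 720):           # the broadcast widths mentioned above
    par = pixel_aspect_ratio(w)
    print(f"{w} x 576: pixel aspect ratio {par} (~{float(par):.2f})")
```

The narrower the stored picture, the more rectangular each pixel has to be: roughly 1.88x wider than tall at 544 pixels, down to about 1.42x at the full 720.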

HD TV uses the 1080i standard (the BBC are playing around on Freeview with 1080p25, but that's a different story). A 1920 x 1080 transmission is pixel-mapped to a 1920 x 1080 display (i.e. it needs no scaling). Freeview HD normally uses 1440 x 1080 (anamorphic), so this needs scaling in the horizontal direction only. Of course, if you show 1920 x 1080 pictures on an HD TV with fewer than 1920 x 1080 pixels, the pictures again need scaling (scaling down is much better than scaling up, though). For a short time only, BBC HD is transmitting 1920 x 1080 on all platforms. This is because the Wimbledon finals will be simulcast on BBC1 HD and in 3D on BBC HD; the extra pixels are required due to the way TVs show 3D (a whole other story :))

You may be losing a few pixels if your TV is set to overscan; many default to that to avoid artefacts at the edges of analogue transmissions. Turning off overscan will give you 1:1 pixel mapping from a 1920 x 1080 source.
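As a rough illustration of what overscan costs (the 5% figure is a commonly quoted default, assumed here, not a spec):

```python
# Illustrative only: a TV that overscans by 5% enlarges the picture
# so the outer 5% of each axis falls off the edge of the panel.
panel_w, panel_h = 1920, 1080
overscan = 0.05                      # assumed typical overscan amount

shown_w = round(panel_w * (1 - overscan))
shown_h = round(panel_h * (1 - overscan))
lost = 1 - (shown_w * shown_h) / (panel_w * panel_h)
print(f"visible area: {shown_w} x {shown_h} "
      f"({lost:.0%} of the pixels never reach the screen)")
```

On those assumptions, nearly a tenth of a 1080-line picture is thrown away before you ever see it, on top of the blurring caused by rescaling the remainder, which is why disabling overscan for HD sources is worthwhile.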