
BBC iPlayer Quality

But surely fps has always meant frames per second since the late 1800s. At that time, fields had never been heard of, other than the big green things that cows graze on. Only with the advent of TV did the concept arise of two interlaced fields making up a single frame.
 
Yes, I understood that films were recorded at 24fps but are run slightly fast, at 25fps, when broadcast, to match TV 'frame' rates. Wouldn't it make sense to convert that to 50i?
Or would it? :cautious:
 

That's what's done when films are broadcast on Freeview SD (or put on PAL DVD or VHS): they're sped up 4% (which makes the audio pitch wrong unless it is corrected back down) and converted to 50i by splitting each frame into two fields (each carrying half the vertical resolution). Frankly, that makes no difference compared with transmitting it at 25p with twice the vertical resolution; any decent decoder can sort that out and display damn nigh identical results.
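The arithmetic behind that 4% speed-up (and the resulting pitch error) is easy to check; a quick sketch:

```python
import math

FILM_FPS = 24.0   # theatrical film frame rate
PAL_FPS = 25.0    # PAL/Freeview SD frame rate

# Running 24fps material at 25fps speeds everything up by 25/24.
speedup = PAL_FPS / FILM_FPS                      # ~1.0417, i.e. about 4.2% fast

# Unless corrected, the audio is sharpened by the same ratio;
# expressed in semitones: 12 * log2(ratio).
pitch_shift_semitones = 12 * math.log2(speedup)   # ~0.71 of a semitone sharp

# A nominal two-hour feature loses nearly five minutes of running time.
runtime_film_min = 120.0
runtime_pal_min = runtime_film_min / speedup      # 115.2 minutes

print(f"speed-up: {(speedup - 1) * 100:.2f}%")
print(f"pitch shift: {pitch_shift_semitones:.2f} semitones")
print(f"2h film runs in {runtime_pal_min:.1f} min at 25fps")
```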

Where iPlayer 50fps does make a difference is with video source material that was actually shot at 50i, and therefore has more fluid motion than 25fps can represent. That's probably why they talk about EastEnders and similar stuff.
 
But at the time the technique was developed they had no 'decent decoder' to use. What makes me smile is the peeps who ask whether they should set their box to (say) 1080i or 1080p. Why don't they just try each and leave it set to whichever looks better? If they can't tell the difference (which is very likely) then it doesn't matter a toss which they use.
 
I had noticed a significant reduction in BBC iPlayer picture quality on the HDR Fox T2 recently. I assumed it was something at the BBC end; now I know what it was. Damn them and their "subjectively better".
Agreed. I forgot to record "The Living and the Dead"; as a new production we were expecting the usual good picture quality from the BBC iPlayer app on the T2, and were slightly disappointed. It was okay, but not as good as I expected. I spent time looking for a problem at my end and ended up here.

The problem with their "subjectively better" is that it is subjective; it depends on the comparison, the material and so on. As an example: some panned scenes may look better (especially if you consider the "juddering" effect of interlaced delivery), but that did not seem to hold true for the whole programme.

I have plenty of broadband bandwidth (70Mbps) to receive the other, better "profiles"; but how do we make the BBC iPlayer app on the T2 retrieve a "preferred" profile? 1280x720p @ 50fps would be good, of course! The BBC webpage seems to imply that the profile delivered is based on the bandwidth available, and the T2 fills its buffer in no time at all.

I have an Amazon "Fire TV" box - it doesn't appear to be 100% compatible with my Sony TV - but I will see if I can determine what it gets for the iPlayer app.
 
It occurs to me that we have access to the kernel with custom firmware, and all network accesses go through it. It might be possible to detect the request to fetch the mediocre iPlayer profile and substitute a better one. But this would require reliably detecting the correct network packets.
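A minimal sketch of that substitution idea, assuming the profile name appears as a token somewhere in the request URL. To be clear, the URL shape and the profile names below are invented placeholders, not the real BBC media-selector API; the real requests would have to be captured and inspected first:

```python
# Hypothetical sketch: rewrite an iPlayer media request so that a
# low-quality profile token is replaced with a preferred one.
# "sd_540p50" and "hd_720p50" are made-up names for illustration only.

PREFERRED = "hd_720p50"   # placeholder name for the 1280x720 @ 50fps profile

def rewrite_profile(url: str, unwanted: str = "sd_540p50") -> str:
    """Substitute the unwanted profile token if it appears in the request URL."""
    if unwanted in url:
        return url.replace(unwanted, PREFERRED)
    return url

# Example against a placeholder URL:
print(rewrite_profile("http://example.invalid/mediaselector/sd_540p50/manifest"))
```

The hard part, as noted above, is reliably spotting the right packets in the kernel's network path; the string substitution itself is trivial by comparison.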

I used to think that if something failed to record it was OK: I could use iPlayer on the HDR Fox T2. Now I'm finding it gives worse picture quality than Freeview SD, which is hardly surprising, as the "quarter HD" is a similar resolution to SD and has a considerably lower bit rate than the BBC SD channels. I use my second box as a backup recorder now, typically recording the SD channels instead of HD. iPlayer is degraded to the point of no longer being of interest to me.
 
It might not be possible to access the 1280x720 / 50fps profile on the HDR-FOX. That profile is available to iPlayer versions which have a 'best' quality setting and select the stream based on the internet connection (e.g. YouView and Roku boxes). Does the HDR-2000T have this capability too? If so, is it possible to spoof the Humax server and make an HDR-FOX appear like an HDR-2000T? Would this be enough, or would the iPlayer version on the HDR-FOX be problematic?
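If the server really does pick the app variant from the client's identification, a spoofing experiment might start with something like the sketch below. The User-Agent string here is an invented placeholder; the real strings sent by the two models would need to be captured first, and the server's selection may not be User-Agent based at all:

```python
import urllib.request

# Hypothetical sketch: build a request that presents a different model's
# identity. The UA string is a made-up placeholder, not a captured value.
SPOOFED_UA = "Mozilla/5.0 (HbbTV; Humax HDR-2000T)"

req = urllib.request.Request(
    "http://www.bbc.co.uk/iplayer",          # the portal URL quoted later in this thread
    headers={"User-Agent": SPOOFED_UA},
)

# The request object now carries the spoofed identity (no network I/O yet).
print(req.get_header("User-agent"))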
 
That's what's done when films are broadcast on Freeview SD (or put on PAL DVD or VHS), they're sped up 4% (which results in the audio pitch being wrong, unless it is corrected back down)

Depends on the source of the film. If it was filmed for TV then it was originally shot at 25fps, so no conversion in rate was needed. Only films shot for theatre use require 24 to 25fps conversion.
 

What about films shot for US TV, or streaming sources like Netflix? They will nominally be 30/60fps based, won't they? They are not 25fps. In fact, even BBC HD series such as The Tudors, and many others, or Sky's Game of Thrones on Blu-ray, are 24fps. I can't believe they were mastered at anything other than 48Hz.
 

Much US TV was shot on 35mm film, as were made-for-TV films. Almost all were shot at 24fps. The reason is that 24 × 2.5 = 60, so by a technique called 3:2 pulldown they could convert to the TV field rate with no speed issues. It causes a judder that some people can see (I usually can't, and I have plenty of NTSC DVDs) but no pitch issues. The US didn't shoot much TV on video; they were much more geared to film. I'm talking 10 years ago or more; I've no idea what the situation is now with so many digital options.
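The 3:2 cadence is easy to demonstrate; a quick sketch of how 24fps film frames map onto 60Hz fields:

```python
def pulldown_32(num_film_frames: int) -> list[int]:
    """Map 24fps film frames to 60Hz fields using the 3:2 cadence.

    Alternate frames are held for 2 fields then 3 fields, so each pair of
    film frames becomes 5 fields: 24fps * (5/2) = 60 fields per second.
    The returned list holds the source frame index of each field.
    """
    fields = []
    for i in range(num_film_frames):
        repeat = 2 if i % 2 == 0 else 3   # the 2, 3, 2, 3, ... pattern
        fields.extend([i] * repeat)
    return fields

print(pulldown_32(4))         # 4 film frames -> 10 fields
print(len(pulldown_32(24)))   # one second of film -> 60 fields
```

The judder mentioned above comes from that uneven hold: every other frame stays on screen 50% longer than its neighbour.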
 

Note that I specifically used the term "filmed" as distinct from "videoed". I don't know what the native frame rate for Blu-ray or any of the other DVD formats is, but in these days where everything is done digitally it could be anything, and we have the hardware to convert it on the fly.

But TV studio film (as in a 16mm or 35mm master) definitely had cameras that did 25fps (probably 30fps in the USA and other countries with 60Hz mains), so that when broadcast with the flying-spot type scanners they didn't get the annoying 5Hz flicker in the picture.
 
The US TV industry was heavily wedded to the film industry. I suspect they used 24fps cameras far more than you think, and with 3:2 pulldown they had a lot fewer problems with the frame rate than we did.

DVD frame rates are the same as the analogue TV standards, i.e. 25i or 60i, and that's all they can carry. Blu-ray can carry 25i, 60i and 24p, and maybe some others I've forgotten that are never used.
 
I could be wrong about that, but I thought that the requests were routed through a Humax server somehow.

I have a recollection of reading something along those lines. I currently have the "new-portal" installed. I have accessed BBC iPlayer in three different ways and in all cases it seems to behave the same:

1. Humax TV Portal.
2. new-portal.

This is what seems to happen with the new-portal (at least on my T2): there is a file, "TVApps.js", which lists BBC iPlayer as "http://az341951.vo.msecnd.net/webapps/iplayer/default.htm" (we might suspect the Humax portal also uses this). That's Azure-hosted HTML with a simple script which, in the case of the T2, I think (I could be wrong) simply passes the T2 "browser" on to "http://www.bbc.co.uk/iplayer".

3. I added "http://www.bbc.co.uk/iplayer" as a direct link to "TVApps.js":

Code:
allApps.push({uri:"http://www.bbc.co.uk/iplayer/", name:"TEST", img:"app/img/bbciplayer_logo.png"});

and it also seems to work identically when selecting the above.

I would not be surprised if the Humax portal behaves this way so that they can control things (including for different platforms) without having to issue OTA updates whenever some service or other changes. Maybe.
 
I think part of the problem is going to be
http://iplayerhelp.external.bbc.co.uk/tv/no_hdswitch_tv
when the T2 loads BBC iPlayer it is not showing "best" as an option, only SD and HD, perhaps because it thinks the old "Opera" browser in the T2 does not support HLS. And possibly it doesn't.

I also suspect based on the language here
http://www.bbc.co.uk/rd/blog/2015-07-the-development-of-new-video-factory-profiles-for-bbc-iplayer
that the decision on what is "best"/"HD", and what to send when the BBC iPlayer app is in use, is being made at the server end, and/or possibly based on what encodings are available. For example: when I played back an episode of "The Living and the Dead" on the Fire TV, having selected "best", it looked the same as it did on the T2, and the router indicated the same amount of data was downloaded. It takes less than 2 minutes to download to the device, so I can't think they consider the connection too slow.
 
We used to get 1280 x 720 for HD on the T2, we don't any more. That's a reduction in quality. I noticed a while back it doesn't look as good as it used to.
 
We used to get 1280x720p at 25fps. Of the new 960x540p at 50fps profile, they say "subjective assessments shows it delivers significantly better pictures on TV screens across a wide range of popular content (such as EastEnders and Top Gear) due to its higher frame rate". I think this is probably meant to apply to things moving in the scene, or while the camera is panning across everything; but subjectively I think there are aspects that don't look better, although I suppose the encoders and decoders used may have an impact too. 1280x720p at 25fps no longer appears in the list of profiles. The next better profile listed is "The 50fps, 1280x720[p] profile, however, will be available to those with 5Mbit/s broadband connections"; yet it is not being served out, it seems; or maybe the programmes we're watching are not being encoded with it, so it can't be served up. 1280x720p at 50fps should be good... I wonder if we should check that the T2 supports it (that is, that it would not be too much of a load on the system).
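For what it's worth, the raw pixel rates of the profiles being discussed can be compared directly (resolutions and rates as quoted in this thread). Interestingly, the new 540p50 profile pushes slightly more raw pixels per second than the old 720p25 one, which may be part of what the BBC's "subjectively better" claim rests on; the 720p50 profile would roughly double both:

```python
# (width, height, frames per second) for each profile mentioned above
profiles = {
    "old HD, 1280x720 @ 25fps": (1280, 720, 25),
    "new,    960x540 @ 50fps":  (960, 540, 50),
    "best,  1280x720 @ 50fps":  (1280, 720, 50),
}

for name, (w, h, fps) in profiles.items():
    pixel_rate = w * h * fps   # raw pixels per second the profile must encode
    print(f"{name}: {pixel_rate / 1e6:.2f} Mpixel/s")
```

This says nothing about the bit rate or encoder quality actually used, of course, which is where the subjective differences discussed above come in.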
 