[youtube-dl] Download files from youtube.com or other video platforms

MontysEvilTwin

Well-Known Member
The available options for youtube-dl can be found here.
On iPlayer, with the default configuration, it will probably download at 1280 x 720 at 50 fps. If you want to keep reasonable quality but cut down the download size, you could try limiting the frame rate to 25 fps by adding the following to the configuration file:
Code:
-f "[fps=25]"
Edit:
The above is fine for iPlayer, but something like the following may be a better general default:
Code:
-f "[fps<=30]"
This will stop the 50 fps iPlayer downloads, but will not exclude videos from other sources which may not be exactly 25 fps.
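If you want this as a permanent default rather than typing it each time, the same option can go in youtube-dl's configuration file, one option per line. A sketch (the --restrict-filenames line is just an optional extra that keeps filenames shell-safe, not something you need for the frame-rate cap):
Code:
```
-f "[fps<=30]"
--restrict-filenames
```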
 

Black Hole

May contain traces of nut
Damn it. That's BootHDR to blame. Does it actually work as My Video if it doesn't exist?
It created My Video and put the download in it, and the folder is navigable under USB-1 (in my case), so it's not a problem - it's just inconsistent with the normal location for HD-FOX recorded content.

PS: DM is 11 minutes and 321MB, and extremely crisp!
 

/df

Well-Known Member
One problem: on HD-FOX, although the download is to /media/drive1/My Video, the actual video root is ../Video.

Damn it. That's BootHDR to blame. Does it actually work as My Video if it doesn't exist?
I'll fix the package.

On my HD Fox (by which I mean, on any HD Fox to which I connect the same 500GB drive), 'My Video' is a link to Video, and 'My Music' is a link to Audio (which doesn't exist). Perhaps I set that up in a fit of zeal, but I thought it came from BootHDR.

A CF feature or package to create a link, say .Video, that could be used by packages regardless of HD/HDR would have been useful, but too late now, I guess.
 

Black Hole

May contain traces of nut
It's a nicety for HD-FOX, whereas it would be a complete no-no for HDR-FOX (which cannot access anything other than MyVideo via the SUI). The daft thing is that Humax chose to name the recording folder differently on the two models!

A CF feature or package to create a link, say .Video, that could be used by packages regardless of HD/HDR would have been useful, but too late now, I guess.
Good idea. There's nothing stopping a standardised mechanism for accessing the recording folder (wherever it might be) being introduced, even if the actual packages that need it only gradually get updated to use it. However, wouldn't it be more practical to have a mechanism specifically on HD-FOX CF deployments that picks up accesses to MyVideo and redirects them to Video?
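A minimal sketch of the symlink idea, demonstrated here in a scratch directory so it can be tried safely (on a real HD-FOX the root would be /media/drive1, as mentioned above):
```shell
# Build the HD-FOX-style layout in a scratch directory, then give it an
# HDR-FOX-compatible alias by symlinking "My Video" -> "Video".
ROOT=$(mktemp -d)
mkdir "$ROOT/Video"
ln -s Video "$ROOT/My Video"
ls -l "$ROOT/My Video"   # shows the link pointing at Video
```
Anything writing to "My Video" then lands in the real recording folder, so packages could use one name on both models.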
 
I have to say this is brilliant. Installing the package and running a download of a youtube video worked perfectly. Great work guys.

Now for the 'wants' list.

There is a problem with ITV which has already been reported.

What about Channel Four? I get 'ERROR: Unsupported URL'. When you go to the Channel Four website, the videos try to use Adobe Flash - is that not supported? Who uses Adobe Flash these days? Is there an option on their website to switch to a more up-to-date format?

Channel Four catch-up would be really handy, because they have adopted the rather nasty habit of showing the first episode on Freeview and the rest of a series on a catch-up box set. Do they really think people have asked for TV to be moved onto a PC screen? C4 receives a proportion of the licence fee, so I should be able to watch all the content on a TV, shouldn't I?

Rob.
 

Black Hole

May contain traces of nut
Options can be specified in this form on the command line:
Code:
humax# youtube [OPTIONS] URL [URL...]
or be put in the file /mod/.config/youtube-dl/config
I can't find the specified file. Are we supposed to create it if we need it?
 

Black Hole

May contain traces of nut
How can I keep the command alive in a Webshell session (iOS)? If I take the focus away from Safari (not actually closing it), the Webshell session expires.
I'm getting the same in a Chrome window in Win7. Would putting "&" on the end of the command line help?

Update: it seems so, except it then doesn't tell you when it's finished!
 

Black Hole

May contain traces of nut
I'm playing with youtube-dl to grab the first episode of Flatpack Empire. -f "worst" gave me a horrible 70MB file.
Code:
-f "[fps<=30]"
This is grabbing an 830MB file, which is coming in on my pittance of a broadband connection at roughly real-time, so that feels like what I would get if I used the iPlayer app and saved the stream.
 

Black Hole

May contain traces of nut
The download process seemed to stall for an extended period, so I assumed it had crashed or something. The "&" is a nuisance in that respect: there's nothing I know of that lets me intervene with a background process. I just re-issued the command string and it resumed the download (after the usual preamble).

(Now I've got "ERROR: unable to download video data: The read operation timed out" - presumably that was the previous process, but the current one has stalled again.)

I've launched a third time, but if anyone knows what I should be doing I would be grateful for some guidance (and even more grateful for a WebIF GUI to look after all this)!
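For the record, a job started with "&" can still be signalled from the same shell session while it is running; a minimal sketch, using sleep as a stand-in for the download command:
```shell
# Start a stand-in long-running job in the background and capture its PID.
sleep 30 &
PID=$!
kill "$PID"              # SIGTERM first; use kill -9 only if it is ignored
wait "$PID" 2>/dev/null  # reap the job so the PID disappears
kill -0 "$PID" 2>/dev/null || echo "job terminated"
```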
 

MymsMan

Ad detector
I'm getting the same in a Chrome window in Win7. Would putting "&" on the end of the command line help?

Update: it seems so, except it then doesn't tell you when it's finished!
You could use >>/mod/tmp/youtube.log to redirect the output to a log viewable on the diag panel
You can use pgrep youtube to check if it is still running
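Putting those two tips together, a hedged sketch of the pattern (sleep/echo stand in for the youtube command, and a temporary file stands in for /mod/tmp/youtube.log; on the box, 'pgrep youtube' does the liveness check instead of kill -0):
```shell
LOG=$(mktemp)                  # stands in for /mod/tmp/youtube.log
( sleep 1; echo "download complete" ) >>"$LOG" 2>&1 &
PID=$!
kill -0 "$PID" 2>/dev/null && echo "still running"
wait "$PID"                    # returns once the background job finishes
grep "download complete" "$LOG"
```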
 
prpr

Well-Known Member
I can't find the specified file. Are we supposed to create it if we need it?
Yes. Beware the hidden directory called ".config". Things like this are really quite a nuisance, which is why I stayed away from it when packaging.
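A sketch of creating it by hand, shown here against a scratch directory so it can be tested anywhere (on the Humax, MODROOT would simply be /mod):
```shell
MODROOT=$(mktemp -d)                     # on the box this is just /mod
mkdir -p "$MODROOT/.config/youtube-dl"   # note the leading dot: hidden directory
printf '%s\n' '-f "[fps<=30]"' > "$MODROOT/.config/youtube-dl/config"
cat "$MODROOT/.config/youtube-dl/config"
```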

No, it doesn't work.
 

Black Hole

May contain traces of nut
After many repeated attempts (no doubt due to my broadband; in some instances the HomePlugs dropped out - I'll try taking those out of the equation next time), I got an 800+ MB download of a 1-hour programme. That seems a bit excessive, but 70MB was awful, and trying it without the -f restriction would have been over 2GB!

You could use >>/mod/tmp/youtube.log to redirect the output to a log viewable on the diag panel
You can use pgrep youtube to check if it is still running
Thanks. To clarify:

Does the "&" cause the process to persist even if I deliberately close the webshell/Telnet session (not just drop out because it's gone background)?

If the process hangs rather than ending, how do I kill it? Even without the "&", Ctrl-C doesn't seem to abort - is that not a Linux thing? I resorted to terminating the Telnet session and starting again. In cases where I had used "&", eventually the old process terminated with the error message in post 37 (but it waited a long time).

The download appears to be in segments which are then presumably assembled together at the end, hence the ability to resume the same download even with a new instantiation. Where are these segments accumulated, and is any clean-up necessary if a download is terminated early (and not resumed)?
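On the segments question: youtube-dl normally keeps an unfinished download as a '.part' file alongside the final filename and resumes it on the next run; whether extra per-fragment temporary files also appear depends on the extractor. A sketch of listing and cleaning up leftovers, demonstrated in a scratch folder (the filename is just an example from this thread):
```shell
DIR=$(mktemp -d)                       # stands in for the download folder
touch "$DIR/Flatpack Empire.mp4.part"  # an abandoned partial download
find "$DIR" -name '*.part'             # list unfinished downloads
find "$DIR" -name '*.part' -delete     # remove only if you won't resume them
```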
 