[youtube-dl] Download files from youtube.com or other video platforms

From https://hummy.tv/forum/threads/save-last-streamed-programme.8452/

youtube-dl is a new package available now.
Credit for all this goes to /df. I just packaged it (tweaking slightly to make it easier).
Run as e.g.
Code:
humax# youtube https://www.bbc.co.uk/iplayer/episode/b09p1jkt/bbc-weekend-news-28012018

Documentation: https://github.com/rg3/youtube-dl/blob/master/README.md#description
Change log: https://github.com/rg3/youtube-dl/blob/master/ChangeLog

Options can be specified in this form on the command line:
Code:
humax# youtube [OPTIONS] URL [URL...]

See also: Black Hole's Summary Guide
 
Brill!

Query: where does the resulting download get filed? (People are bound to ask)

Pretty please: will the package get updated when the youtube-dl people roll out updates?

WebIF request: a nice box somewhere to paste a URL into.

Edit: To cut to the chase, I have created a summary guide HERE (click).
 
The youtube-dl documentation indicates that ITV is supported. I tried it out but it seems that ITV are still serving Flash streams. The download failed, asking for the installation of rtmpdump. Would this be feasible with rtmpdump?
 
The download failed, asking for the installation of rtmpdump. Would this be feasible with rtmpdump?
I compiled rtmpdump, which seemed to go OK. I can't download anything using it, though, as it just generates errors; I have no idea what they mean or why they're generated (and therefore how to fix them).
e.g. rtmpdump exited with error code 1, or -11, or complaints that ffmpeg didn't support https
 
How can I keep the command alive in a Webshell session (iOS)? If I take the focus away from Safari (not actually closing it), the Webshell session expires.
 
Seems to be a long delay before it starts; even youtube --help took a minute to display the text
 
How can I keep the command alive in a Webshell session (iOS)? If I take the focus away from Safari (not actually closing it), the Webshell session expires.
Dunno, apart from using Abduco, if that works in Webshell. iOS is frustratingly single-tasking sometimes.
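If abduco does work under Webshell, a rough sketch would be to start the download in a named session, then reattach once Safari has dropped the connection (the session name "ytdl" is just an example):
Code:
humax# abduco -c ytdl youtube https://www.bbc.co.uk/iplayer/episode/b09p1jkt/bbc-weekend-news-28012018
humax# abduco -a ytdl
The first command creates the session and runs the download inside it; the second reattaches to the same session from a later Webshell login.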
 
might be nice to have the option of an output sub-folder for those downloading a series
If anyone feels like modifying the script (in /mod/bin/youtube) and sending me a copy, I can update the package easily.
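In the meantime, youtube-dl's own -o output template might already give a per-series sub-folder without touching the script. Untested sketch only: "SeriesName" and the base path are just example values, and whether it overrides whatever output location the youtube wrapper sets depends on how the wrapper is written.
Code:
humax# youtube -o "/media/My Video/SeriesName/%(title)s.%(ext)s" URL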
Seems to be a long delay before it starts; even youtube --help took a minute to display the text
Took 34 seconds on my box. We're limited by the lack of power in the CPU.
I unpacked the youtube-dl file and that improved it to 12, but it's more hassle to package every time there's an update, as the Busybox unzip won't handle it and we haven't got an Unzip 3.0 (to go with Zip 3.0).
Considering how long the download can take, is an extra 20 or so seconds worth bothering about?
 
If anyone feels like modifying the script (in /mod/bin/youtube) and sending me a copy, I can update the package easily.

Took 34 seconds on my box. We're limited by the lack of power in the CPU.
For the impatient:
Code:
alias youtube-dl='echo "Starting youtube-dl (may take some time) ..." && python /mod/lib/youtube-dl'
Probably takes another 50ms or so!

As you may have noticed, the youtube-dl download package is meant to be a self-executable archive (if chmod +x) but it expects /usr/bin/env where we have /mod/bin/env. You can fix that, but it still won't be any faster:
Code:
sed -i -e '1 s@/usr/@/mod/@'  /mod/lib/youtube-dl
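Having done that (just an untested aside), the archive can then be executed directly, which is occasionally handy for checking the bundled version:
Code:
humax# chmod +x /mod/lib/youtube-dl
humax# /mod/lib/youtube-dl --version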

I unpacked the youtube-dl file and that improved it to 12, but it's more hassle to package every time there's an update, as the Busybox unzip won't handle it and we haven't got an Unzip 3.0 (to go with Zip 3.0). ...
The 7Zip CF package will do the job, no?
Code:
# 7z t /mod/lib/youtube-dl | grep -v Testing

7-Zip 9.20  Copyright (c) 1999-2010 Igor Pavlov  2010-11-18
p7zip Version 9.20 (locale=C,Utf16=off,HugeFiles=on,1 CPU)

Processing archive: /mod/lib/youtube-dl

Everything is Ok

Files: 781
Size:       4652292
Compressed: 1620403
 
A couple of observations:

It seems inordinately slow, but that may be because the downloader defaults to the highest quality file available? As a trial earlier, I tried to download a 5-minute Danger Mouse on my HD-FOX, and the target was several hundred megabytes (really?) with a 20-minute ETA at my 3Mbps. Tonight I was able to offer a service to my supported user who missed Portillo's railways programme, and as that machine is fitted with a WiFi dongle which connects to my phone hotspot (and links through to my iPad), I could add the youtube-dl package and grab it off iPlayer all without disturbing Emmerdale. However, the 28-minute programme was a 1.1GB(!) download, and took 20 mins at about 10Mbps.

While grabbing, I made the mistake of switching away from that tab in Safari, and the session closed. Bum. I started again expecting to go back to the beginning, but it seemed to resume the download where it left off (after the usual initial preamble). Is that expected?
 
The 7Zip CF package will do the job, no?
Oh, yes, I expect it will. I'd overlooked that. FWIW, I looked at compiling Unzip 6.0, which claims to be the third most portable software in the world. I gave up, having failed. It's horrendous stuff, out of the ark. No wonder the authors gave up as well.
 
It seems inordinately slow, but that may be because the downloader defaults to the highest quality file available?
It does. You could always read the documentation for youtube-dl and add your own parameters to the youtube command if you want something different.
it seemed to resume the download where it left off (after the usual initial preamble). Is that expected?
I believe so, yes. It certainly knows if you have downloaded the whole file and does nothing (much), so I would expect it to be able to resume.
 
I have restarted the Danger Mouse download, and it resumed.

One problem: on HD-FOX, although the download is to /media/drive1/My Video, the actual video root is ../Video.
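Until that's sorted in the script, one possible workaround (a sketch only; again it depends on how the youtube wrapper sets its output location) is to pass an explicit output template pointing at the real video root:
Code:
humax# youtube -o "/media/drive1/Video/%(title)s.%(ext)s" URL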
 
It does. You could always read the documentation for youtube-dl and add your own parameters to the youtube command if you want something different.
It would be worth expanding Post 1 to include a reference.
 
It seems inordinately slow, but that may be because the downloader defaults to the highest quality file available?
and
It does. You could always read the documentation for youtube-dl and add your own parameters to the youtube command if you want something different.
In my original post https://hummy.tv/forum/threads/save-last-streamed-programme.8452/#post-119056, I suggested how to control this. If you want something equivalent to FreeviewHD (2.5-3 Gbyte/hr), try -f "best[tbr<1900]" on the command line. Unless it's sport, or under-40s will be watching, or viewing will be from closer than the screen diagonal, the value 1900 (kbit/s) will probably give acceptable results: certainly good enough for D Mouse.
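On the command line that would look something like this (using the example URL from post 1; note the quotes, since [ and < are special to the shell):
Code:
humax# youtube -f "best[tbr<1900]" https://www.bbc.co.uk/iplayer/episode/b09p1jkt/bbc-weekend-news-28012018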

You can set an option as a default by putting it on a line by itself in the file ~/.config/youtube-dl/config (i.e. /mod/.config/youtube-dl/config). We can't use the system-wide yt-dl settings file /etc/youtube-dl.conf because /etc is read-only, but as there's only one user it isn't a problem to use the per-user file.
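For example, to make that bitrate cap the default (a sketch; the value is just the one suggested above, and the single quotes stop the shell treating < as a redirection):
Code:
humax# mkdir -p /mod/.config/youtube-dl
humax# echo '-f best[tbr<1900]' > /mod/.config/youtube-dl/config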

While grabbing, I made the mistake of switching away from that tab in Safari, and the session closed. Bum. I started again expecting to go back to the beginning, but it seemed to resume the download where it left off (after the usual initial preamble). Is that expected?

'man youtube-dl' (or see https://github.com/rg3/youtube-dl/blob/master/README.md#description):
Code:
       -c, --continue
              Force resume of partially downloaded files. By default,
              youtube-dl will resume downloads if possible.

       --no-continue
              Do not resume partially downloaded files (restart from
              beginning)

(edited for typo in '-f best' option)
 
Fair point, and I had forgotten. Worth getting into this thread though (as the master thread for the package), and worth everything being summarised in post 1.

We can't use the system-wide yt-dl settings file /etc/youtube-dl.conf because /etc is read-only
IIRC there is a way around that.
 