
[youtube-dl] Download files from youtube.com or other video platforms

IIRC swapper is a prpr product.
It was originally, but there was some unpleasantness surrounding it.

Permanent fix:
Code:
--- /mod/etc/init.d/S00swapper~
+++ /mod/etc/init.d/S00swapper
@@ -1,8 +1,7 @@
 #!/bin/sh
 
-[ "`cat /etc/model`" = "HDR" ] || exit 0
-
 swapfile=/mnt/hd3/.swap0
+[ "`cat /etc/model`" = "HD" ] && swapfile=/media/drive1/.swap0
 
 case "$1" in
     start)
 
Does it need to be got rid of once set up?
Just included for completeness, in case.
Here's my complete version of /mod/etc/init.d/S00swapper (it has better error checking than af123's):
Code:
#!/bin/sh

swapsize=128
swapfile=/mnt/hd3/.swap0
[ "`cat /etc/model`" = "HD" ] && swapfile=/media/drive1/.swap0

case "$1" in
        start)
                grep $swapfile </proc/swaps >/dev/null
                if [ $? -ne 0 ] ; then
                  dd if=/dev/zero of=$swapfile bs=1M count=$swapsize
                  mkswap $swapfile
                  swapon $swapfile
                  echo "Enabled swap file."
                fi
                ;;
        stop)
                grep $swapfile </proc/swaps >/dev/null
                if [ $? -eq 0 ] ; then
                  swapoff $swapfile
                  echo "Disabled swap file."
                fi
                [ -f $swapfile ] && rm $swapfile
                ;;
        *)
                exit 1
                ;;
esac

exit 0
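For anyone who hasn't done this before, this is roughly how I'd put the script in place and check it has taken effect (vi is just what I use - any editor will do; paths as per the script above):
Code:
# vi /mod/etc/init.d/S00swapper        # paste the script in (or apply af123's diff by hand)
# sh /mod/etc/init.d/S00swapper start  # creates the 128MiB file on first run, so it takes a little while
# cat /proc/swaps                      # the .swap0 file should now be listed
# free                                 # and the Swap: line should no longer be all zeros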
 
Can't guarantee the syntax or the correctness of my assumptions, but what about this?:

Code:
#!/bin/sh

swapsize=128
swapfile=/mnt/hd3/.swap0
[ "`cat /etc/model`" = "HD" ] && swapfile=/media/drive1/.swap0

case "$1" in
        start)
                grep $swapfile </proc/swaps >/dev/null
                if [ $? -ne 0 ] ; then
                  dd if=/dev/zero of=$swapfile bs=1M count=$swapsize
                  mkswap $swapfile
                  swapon $swapfile
                  echo "Enabled swap file."
                else
                  echo "Swap file already running"    # added
                fi
                ;;
        stop)
                grep $swapfile </proc/swaps >/dev/null
                if [ $? -eq 0 ] ; then
                  swapoff $swapfile
                  echo "Disabled swap file."
                else
                  echo "Swap file not running"    # added
                fi
                [ -f $swapfile ] && rm $swapfile
                ;;
        *)
                exit 1
                ;;
esac

exit 0
 
Looks like I should update it to support the HD then (assuming a writable disk is attached - 128MiB is not going to inconvenience anyone).
 
Thanks @prpr, @Black Hole and @af123. I will turn on the swap file and see if my youtube-dl downloads work. One more thing: does sysmon log CPU usage on the HD-FOX? It does seem to monitor network traffic but not CPU. Unless my installation is a bit broken, of course.
 
To update: the youtube-dl downloads are working fine with the swap file enabled. This includes repeat downloads, which either crashed or locked up the unit before. Thanks for the help.
 
Oh sh*t.

I started a process running on my HD-FOX to download The First Night of the Proms, over a week ago. It is trying to download 1.67GB (I wish I had chosen a smaller version), at a blistering 300kB/s-ish. When I started it off, I was getting estimates of ETA of a couple of hours (rough fag-packet figures indicate 100 minutes would be about right). After a few hours I had just a few percent. After a day I had about 15% with an ETA of 24 hours. It's now on 86% and the ETA is still 37 hours away. There is a real risk the download won't complete before the iPlayer item expires!

Something, somewhere, is throttling this and I have no idea where. My "normal" internet still works fine.

However, I now realise there is a fatal flaw. Re-reading the above, it seems I'm going to hit a storage limit when the youtube-dl process tries to stitch together the downloaded packets and convert them.

Can I start up a swap file while the download is still live, or do I have to stop it first? This wouldn't be too much of a problem I guess, since youtube-dl is able to pick up where it left off.

:eek:

While I've been thinking about this post, the download has completed and ffmpeg has started its thing... soon to crash, no doubt. I am heartened that MET's posts indicate it can be recovered without re-downloading. It's almost like the download wasn't happening while I didn't have the abduco session live - my log file might make interesting reading (although, with no timestamps, it might be difficult to figure out).

And it's crashed. Let's try that swap file...
 
Okay, well, I've instated a swap file as per instructions in post 80, fished around to reconstruct the command I used to start the download (stupidly I exited the abduco session I was running it in, so lost the command history), and now I am pleased to say the ffmpeg process has started without re-downloading the packets. I guess it might take a while, but (stupidly) I have not started it in an abduco session this time!!!!

Update: all done, didn't take too long.

A couple of comments for an update to youtube-dl, pretty please:

1. Prefix the status reports with the current system time;

2. Echo the command line to terminal at the start of the process, so that a log file tee'd from the terminal contains a record of the command used.
 
If you have the complete, original youtube-dl download file you can reprocess it manually using ffmpeg. From the command line:
Code:
ffmpeg -i "Input.mp4" -muxdelay 0 -map 0:1 -map 0:0 -c copy -bsf:a aac_adtstoasc "Output.mp4"
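For anyone wondering what the switches are doing, this is my reading of them (happy to be corrected) - the command itself is unchanged:
Code:
# -c copy              : remux only, no re-encoding, so it is quick even on the Humax
# -map 0:1 -map 0:0    : write the input's second stream (usually the audio) first, then the first (usually the video)
# -muxdelay 0          : don't apply the default muxing delay/offset
# -bsf:a aac_adtstoasc : rewrite the ADTS AAC headers (as delivered by an HLS download) into the form the MP4 container expects
ffmpeg -i "Input.mp4" -muxdelay 0 -map 0:1 -map 0:0 -c copy -bsf:a aac_adtstoasc "Output.mp4"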
 
A couple of comments for an update to youtube-dl, pretty please:

1. Prefix the status reports with the current system time;

2. Echo the command line to terminal at the start of the process, so that a log file tee'd from the terminal contains a record of the command used.
I suggest you write your own wrapper script around /mod/bin/youtube
 
I've extracted stats from the log file I captured from my loooonnnggg download and graphed them - see if you can make anything of it. I can't rightly remember when I kicked the process off, but one could possibly interpret the slowdowns as occurring daily. However, the x-axis is not necessarily linear in time: it is plotted linearly against percentage downloaded, and appears to be linear with log updates (as if a new line is written to the screen/log for every so many bytes downloaded). This indicates that the stalls occur roughly every so-many bytes.

The sharp increases in ETA correspond with the stalls, and at those times the recorded bytes-per-second download rate is near zero. I find it curious the slopes between the stalls (orange line) all seem to converge on the 100% point - not sure whether that's expected or if it means something.

[Attached graph: youtube-dl analysis.png]
 
Aha! Now I know what's going on, after re-plotting with an x-axis linear in time rather than linear in the amount of data transferred. What I did was derive an estimated time index for each captured record from the inverse of the reported data rate.

[Attached graph: youtube-dl analysis2.png]

So, obviously the download process is only actually doing something when I'm logged in and checking the progress! The bursts of non-zero(ish) data rate occur at times when I assume I was looking to find out what was happening.

Now, I assume there is no reason to suspect abduco of doing the dirty, or the download session being throttled by something in the CF, so one has to suspect my network. I figure maybe the router/homeplug combination is happy enough with download traffic while I have some browser interaction with that particular IP address, but as soon as the browser traffic stops, so does everything else.

Could that be a real thing???
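If it is the homeplugs (or the router) nodding off when the link looks idle, one crude test would be to leave something trickling across the link from a second telnet session on the HD-FOX and see whether the stalls disappear. The address below is just an example - substitute your router's:
Code:
# while :; do ping -c 1 192.168.1.1 >/dev/null; sleep 20; done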
 
I suggest you write your own wrapper script around /mod/bin/youtube
OK, my /mod/bin/youtube now looks like this:
Code:
#! /bin/sh
# Wrapper around youtube-dl: choose the output directory to suit the model,
# echo the command line, then pass everything on to youtube-dl.
if [ "`cat /etc/model`" = 'HDR' ]; then
  outdir='/mnt/hd2/My Video'
else
  outdir='/media/drive1/Video'
fi
echo "$0" "$@"
python /mod/bin/youtube-dl --config-location /mod/etc/youtube-dl.conf -o "$outdir/%(title)s.%(ext)s" "$@"

...but I guess I would have to hack the Python to get timestamps.
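For the record, this is roughly how I'm kicking it off - the URL and log file name are just examples, and I'm assuming /mod/bin is on the PATH (use the full path if not):
Code:
# youtube 'https://www.youtube.com/watch?v=XXXXXXXXXXX' 2>&1 | tee /mod/tmp/youtube.log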
 
..but I guess I would have to hack the Python to get timestamps.
You should be able to pipe the output from youtube-dl through awk to add timestamps.

DetectAds uses the following awk command string to add timestamps and process id to the output from Nicesplice
Code:
awk {\{ print strftime(\"%d/%m/%Y %H:%M:%S NS([pid])-\"), \$0; fflush(); \}}
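For anyone squinting at the escaping, my guess is that by the time it actually runs (with DetectAds' process id substituted in - 1234 below is just a placeholder) it boils down to something like:
Code:
awk '{ print strftime("%d/%m/%Y %H:%M:%S NS(1234)-"), $0; fflush(); }'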
 
You should be able to pipe the output from youtube-dl through awk to add timestamps.
...
awk is very useful but in CF<=3.13 it is neither installed directly nor provided by /bin/busybox. It can be installed separately or as part of the full busybox CF package, but then it depends on the /mod filesystem being available, which obvs it will be if you are running youtube-dl or DetectAds.

If no awk is available, shell, sed, etc, have to be used. Consequently I see your awk and raise you a shell function:
Code:
# dateline() { 
    local line;
    while read -r line; do 
        echo "$(date +%F\ %T): $line"
    done
}
# youtube-dl http://whatever | dateline
#    # or if you want to include error lines
# youtube-dl http://whatever 2>&1 | dateline
Any pipe-based solution has the limitation that the timestamp reflects the time when the output line was read rather than when the event of interest occurred, which might be somewhat earlier, eg, if the generating program buffers its output.

Also, a prefixed format like "%F %T" (eg, 2018-08-07 20:10:55) automatically sorts in time order whereas "%d/%m/%Y %H:%M:%S" doesn't do so directly.
 
I see your awk and raise you a shell function
:lol:

Now I've found a way to massage the data, I'm less bothered, but I still think a timestamp (in any log file) is pretty essential. OK, my tee'd log file is a hack in itself, so maybe a -l logfile.log command line switch would be the way to go.
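In the meantime, something along these lines would probably do me - very much a sketch, re-using the dateline function from above, with the log file location plucked out of thin air:
Code:
#! /bin/sh
# Sketch only: wrapper that timestamps all output and keeps a copy in a log
# file as well as showing it on the terminal.
logfile=/mod/tmp/youtube-dl.log

dateline() {
    local line
    while read -r line; do
        echo "$(date +%F\ %T): $line"
    done
}

if [ "`cat /etc/model`" = 'HDR' ]; then
  outdir='/mnt/hd2/My Video'
else
  outdir='/media/drive1/Video'
fi

{
  echo "$0" "$@"
  python /mod/bin/youtube-dl --config-location /mod/etc/youtube-dl.conf \
      -o "$outdir/%(title)s.%(ext)s" "$@" 2>&1
} | dateline | tee -a "$logfile"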
 