One-liner for automating decryption via DLNA using curl

dma

Member
Hi,

I had quite a lot to archive off my main disk to an exFAT USB drive. It's quite hard work doing this via the web interface, plus the DLNA server uses an index ID for the file name, which can be confusing.

I originally wrote a script to automate this, then reduced it to a simple one-liner that builds a curl command to download each video from the DLNA server on the Humax unit. This means the whole download stays on the Humax unit itself, and each file can be renamed to something more meaningful.

All the info you need is in the tblObject table in dms_cds.db. With some formatting, it's possible to build an SQL query that generates all the commands:

/mod/bin/sqlite3 -separator $'' /mnt/hd2/dms_cds.db "select 'curl localhost:9000/web/media/' || mediaID || '.TS -o \"/media/gpt-drive1/', title || '\"' from tblObject where class == 'object.item.videoItem.movie';"
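For anyone wondering what the empty -separator is doing, here's a minimal stand-alone sketch of the trick (a dummy table standing in for tblObject, not the real dms_cds.db, so it runs anywhere with sqlite3 on the PATH):

```shell
# With an empty separator, sqlite3 glues the two selected columns together,
# turning each row into a ready-to-run curl command. The table and values
# here are dummies standing in for tblObject in dms_cds.db.
sqlite3 -separator '' :memory: "
  create table t(mediaID int, title text);
  insert into t values(883, 'Rewind_20150430_1140.ts');
  select 'curl localhost:9000/web/media/' || mediaID || '.TS -o \"/media/gpt-drive1/', title || '\"' from t;"
# prints: curl localhost:9000/web/media/883.TS -o "/media/gpt-drive1/Rewind_20150430_1140.ts"
```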

This just selects every video in the database (of type object.item.videoItem.movie) and builds a curl command where the video is saved to my exfat USB drive mounted at /media/gpt-drive1/. It also means you can selectively grep for certain videos you want to download or just create a batch of jobs. Here's an example:

Let's say I want only videos with "rewind" in the title:

humax# /mod/bin/sqlite3 -separator $'' /mnt/hd2/dms_cds.db "select 'curl localhost:9000/web/media/' || mediaID || '.TS -o \"/media/gpt-drive1/', title || '\"' from tblObject where class == 'object.item.videoItem.movie';" | grep -i rewind
curl localhost:9000/web/media/883.TS -o "/media/gpt-drive1/Rewind_ 1985 Final - World Championship_20150430_1140.ts"
curl localhost:9000/web/media/934.TS -o "/media/gpt-drive1/Formula 1 Rewind_20150606_1600.ts"
curl localhost:9000/web/media/949.TS -o "/media/gpt-drive1/World Cup Rewind_20140610_2335.ts"
curl localhost:9000/web/media/1925.TS -o "/media/gpt-drive1/FA Cup Rewind_ 1990 FA Cup Final_20160521_1100.ts"
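To run a filtered batch like that unattended, one approach (a sketch of the "batch of jobs" idea, not something from the original post) is to collect the generated commands into a file, review it, then execute it with sh. Harmless echo commands stand in for the real curl downloads here, so the demo is safe to run anywhere; on the Humax, the printf line would be replaced by the sqlite3 | grep pipeline above:

```shell
# Collect the generated commands into a script file, check it over,
# then run the whole batch with sh. echo stands in for curl here.
printf '%s\n' \
  'echo downloading 883.TS' \
  'echo downloading 934.TS' > /tmp/archive-jobs.sh
sh /tmp/archive-jobs.sh
# prints:
# downloading 883.TS
# downloading 934.TS
```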

You can get almost 5MB/sec per download:

humax# curl localhost:9000/web/media/883.TS -o "/media/gpt-drive1/Rewind_ 1985 Final - World Championship_20150430_1140.ts"
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 1619M  100 1619M    0     0  4738k      0  0:05:49  0:05:49 --:--:-- 6104k

Hope this is useful to someone, it certainly made it easy for me...
 

Black Hole

May contain traces of nut
This may be useful to somebody who does not routinely decrypt all recordings.

The usual modus operandi is to decrypt recordings on completion (or even simultaneously with recording), and then any bulk transfers can be performed by direct disk access without involving the DLNA server, and the .ts file names are preserved.

For those who don't know:
  1. Install the auto-unprotect package;
  2. In WebIF >> Browse Media Files, click the "OPT+" button on the top line (next to "/media/My Video") and select "Recursive Auto-Decrypt".
Decryption will occur shortly after a recording completes (assuming the box is not in standby), or next time it turns on. Full details here: Decryption Guide

The OP is documenting a process which runs self-contained on the HDR-FOX and can therefore be left to get on with it without external intervention. Decryption (if not already decrypted) is a consequence of using the DLNA server as the source of data for the archiving (HiDef recordings will need to have been "unprotected" first).

Users less comfortable with the Linux command line (like me!) will find it easier to use Explorer in Windows to select and copy recordings to another location. To access the HDR-FOX internal (and any external USB) drive via the home network as Network Attached Storage, install the samba package. (When selecting files, ensure all the files that make up a recording are selected - that means .ts, .hmt, .nts, .thm. One defect of the OP's process is that it only transfers the naked .ts, which is sufficient in some circumstances but less than ideal if the recordings are to be played on the HDR-FOX in the future.)
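For command-line users, the same "take all the companion files" point can be sketched as a small loop (my addition, not from the thread, and illustrated on dummy files under mktemp so it runs anywhere; on the box, src would be the recording's path under /mnt/hd2/My Video and dst the USB drive):

```shell
# Create a stand-in recording set so the demo is self-contained.
tmp=$(mktemp -d)
src="$tmp/Some Recording_20160521_1100"
dst="$tmp/gpt-drive1"; mkdir "$dst"
for ext in ts hmt nts thm; do : > "$src.$ext"; done

# The actual idea: copy every companion file that exists, not just the .ts,
# so the recording stays fully playable on the HDR-FOX.
for ext in ts hmt nts thm; do
  [ -f "$src.$ext" ] && cp "$src.$ext" "$dst/"
done
ls "$dst"
```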
 

Wildebeest

Member
BH, thanks for posting that concise and very useful rundown, it is very timely for me and very helpful. (Thanks to OP as well - just that it's BH's info that happens to match what I need to do).
 