Customised HDF file released

Firstly, thank you so much! I was in the process of building my own box to replace my old Hummy 9200T, but the hummy died yesterday before my homebrew box was ready. Found this thread today & pretty much went straight out to buy a HDR-FOX T2! One thing I really wanted to be able to do was to schedule recordings from the web, but I didn't consider that off-the-shelf systems (with a little modding) might be able to do this until today.

Anyway, the disk info script wasn't working for me - I think it's because I don't have many recordings, so less than 100GB is used, which changes the array indexes of the split df output. So, I've added a regexp to remove multiple spaces & fixed up the indexes. Also, it was looking for ...00.png for the pie chart image file, which doesn't exist, so I changed it to use files 1..25 instead of 0..25. My diskspace.jim is now:
Code:
#!/mod/bin/jimsh

# Find the line for the recording partition (sda2) in the df output
foreach df [split [exec df -h] "\n\r"] {
    if {[string match *sda2* $df]} {
        # Collapse runs of whitespace so the field indexes are stable
        regsub -all -- {[[:space:]]+} $df " " df
        set fields [split $df]
        set size [lindex $fields 1]
        set used [lindex $fields 2]
        set perc [string trimright [lindex $fields 4] "%"]
    }
}

# Map 0-100% used onto the 25 pie-chart images (01..25)
set file [format "%02d" [expr {$perc * 25 / 100 + 1}]]

puts "<div style=\"float: right; background:url('/images/345_1_27_ST_USB_BG.png')\">"
puts "<img src=/images/345_2_14_ST_HDD_$file.png>"
puts "</div>"
puts "<span style=\"float: right\">"
puts "<br>"
puts "Total space: $size<br>"
puts "Used: $used ($perc%)"
puts "</span>"
I did some tcl about 6 years ago at uni - hopefully I've done it right & it's useful (works on my box at least :)).

Regarding scheduling from the web - I've found the EPG data (/mnt/hd1/dvbepg/epg.dat); any ideas what format it is? I've tried running it through the GNUWin32 version of file but it just says it's 'data'. I think it's encoded with the length of the data as 1 byte followed by the actual data; I'll try to decode that some more tomorrow.
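To illustrate the kind of layout I suspect (purely a guess at this point - the record structure of epg.dat is unconfirmed), a quick Python sketch:

```python
def read_length_prefixed(data):
    """Split a byte string into records, assuming each record is a
    1-byte length followed by that many bytes of payload.
    (Speculative layout for epg.dat - not a confirmed format.)"""
    records = []
    pos = 0
    while pos < len(data):
        length = data[pos]        # 1-byte length prefix
        pos += 1
        records.append(data[pos:pos + length])
        pos += length
    return records

# Made-up sample: two records, "abc" then "de"
sample = b"\x03abc\x02de"
print(read_length_prefixed(sample))  # [b'abc', b'de']
```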

Thanks again,
Nick
 
Welcome to the party.

I did some tcl about 6 years ago at uni - hopefully I've done it right & it's useful (works on my box at least :)).

Very useful, thanks - it works for me too. I'll update the package with your changes. Condensing the spaces improves it a lot.

Regarding scheduling from the web - I've found the EPG data (/mnt/hd1/dvbepg/epg.dat); any ideas what format it is?

I don't think anyone has looked at it yet. Raydon has analysed some of the file formats on the Foxsat and T2 so may have some more information - if you look at this page it shows some of the Foxsat formats, including an EPG block which doesn't seem to match epg.dat. Not directly helpful, but it does show the way in which Humax have historically stored data (such as 4-byte epoch dates).
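As an illustration of that convention (assuming big-endian byte order, which isn't confirmed for epg.dat), a 4-byte epoch date decodes like this in Python:

```python
import struct
from datetime import datetime, timezone

def decode_epoch(raw):
    """Interpret 4 bytes as a big-endian Unix timestamp - the way
    Humax appear to have stored dates on the Foxsat. The byte order
    here is an assumption, not a confirmed detail of epg.dat."""
    secs, = struct.unpack(">I", raw)
    return datetime.fromtimestamp(secs, tz=timezone.utc)

# 0x4D000000 = 1291845632 seconds since the epoch -> 8 Dec 2010 (UTC)
print(decode_epoch(b"\x4d\x00\x00\x00"))
```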

Alternatively it could be a standard Freeview EPG format...

More investigation required in either case!
 
Alternatively it could be a standard Freeview EPG format...

I think you've hit on it - I got the specs for the data transmitted by Freeview (ETSI EN 300 468, from http://www.etsi.org/deliver/etsi_en/300400_300499/300468/01.11.01_60/en_300468v011101p.pdf); it's mostly EIT data with another header before each section that I haven't identified yet. I'm writing a Python parser for it at the moment (that's my main language). Any idea if Python could be put on the box (I've never done cross-compiling/packaging before)? If Python was on there, I'd love to write a webif in Django (I've got nearly 3 years of Django experience vs. pretty much zero tcl/jimsh!). Anyway, on with the parser for now.
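For anyone following along: the start_time field of an EIT event is 40 bits - a 16-bit Modified Julian Date followed by hours/minutes/seconds in BCD (Annex C of the spec). A minimal Python decode, using the conversion formulas straight from Annex C:

```python
def decode_dvb_time(raw):
    """Decode the 40-bit DVB start_time field (ETSI EN 300 468,
    Annex C): 16-bit Modified Julian Date, then hh:mm:ss in BCD."""
    mjd = (raw[0] << 8) | raw[1]
    # MJD -> year/month/day using the Annex C formulas
    yp = int((mjd - 15078.2) / 365.25)
    mp = int((mjd - 14956.1 - int(yp * 365.25)) / 30.6001)
    day = mjd - 14956 - int(yp * 365.25) - int(mp * 30.6001)
    k = 1 if mp in (14, 15) else 0
    year = 1900 + yp + k
    month = mp - 1 - k * 12
    # Each time byte holds two BCD digits
    bcd = lambda b: (b >> 4) * 10 + (b & 0x0F)
    h, m, s = bcd(raw[2]), bcd(raw[3]), bcd(raw[4])
    return year, month, day, h, m, s

# Worked example from the spec: MJD 45218 (0xB0A2) is 1982-09-06
print(decode_dvb_time(b"\xb0\xa2\x12\x45\x00"))  # (1982, 9, 6, 12, 45, 0)
```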
 
Python could probably be put on with a bit of work - it needs some patches to cross compile but they're available on a few web sites. It's on my list of things to do.

The reason I knocked up that interface using TCL is because the box is very short on resources when it's doing its day job. Jim is a very small footprint TCL interpreter which makes it perfect for running in small embedded environments. I don't have much TCL experience either - I mainly program in C or assembler these days.

Even with the limited resources, it's surprising what can be done. I've packaged up the GCC C & C++ compilers (gcc package) and it's possible to compile fairly large things natively on the box (although I wouldn't try it when it's recording something I care about).

The ideal would probably be a small EPG parser written in C that is called by jim to extract information when it needs it. Similarly I'm working on a small utility that parses the .hmt files associated with recorded media. Jim could use that to display more information on recordings fairly easily and I might have a play around with that in the near future.

I named the interface package af123-webif because I imagined that other people would come along and write their own, and because mine is really just a proof of concept at the moment (although feel free to extend it and submit updates back to me). If it turns out to be practical to implement something using Python (or even Django) then that would certainly speed up development.
 
I've succeeded in getting the MediaTomb 0.11.0 UPnP media server up and running on the HDR T2. Now streaming video from the virtual disk to my PC using XBMC as the client. Works a treat! :)
I'll pass full details about the installation to af123, and hopefully he will bundle it up into an opkg package and host it on his site.
af123: PM with link to mediatomb package is in your mailbox now.
 
I've just uploaded a new package called dedup which adds a new command with the same name which I used to sort out large folders of programmes. It's another use-at-your-own risk thing, but it works for me.

It works on recordings where the EPG contained the episode name, a colon, and then the rest of the description, e.g.

19/22. Stuck Together, Torn Apart: Animated comedy series about family life...

but the name that's displayed in the media list is just a generic series name, in this case Family Guy

If you change to a directory and run dedup with no arguments, it just lists the recordings there and shows what they would be renamed to. If you run it as dedup -yes then it makes the changes. It:
  • Renames the files on disk to match the episode name;
  • Changes the title shown in the media list to match the episode name;
  • Identifies duplicate episodes and moves them to a sub-folder called dup/ (which you can subsequently remove if you're happy that they're duplicates).
It's just a shell script so easy for you to look at and modify if necessary. It uses a new binary package called hmt which extracts information from .hmt files (associated with recordings) in order to determine the episode name and to change the media list title.
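For illustration, the title-splitting part of the idea looks like this in Python (the real dedup is a shell script built on the hmt utility; this regex is just a sketch of the pattern above):

```python
import re

def episode_name(synopsis):
    """Extract the episode title from an EPG synopsis shaped like
    '19/22. Stuck Together, Torn Apart: Animated comedy...'.
    Returns None if the synopsis doesn't match that pattern.
    (Illustrative only - not the actual dedup implementation.)"""
    m = re.match(r"(?:\d+/\d+\.\s*)?([^:]+):", synopsis)
    return m.group(1).strip() if m else None

print(episode_name("19/22. Stuck Together, Torn Apart: Animated comedy series..."))
# Stuck Together, Torn Apart
```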
 
I've packaged up Raydon's mediatomb installation and it works a treat. It's available now

Code:
opkg update && opkg install mediatomb

I've made some (fairly minor) changes from the version that Raydon provided, so any bugs are likely to be my fault. On the other hand, if it works then that's all Raydon's work :)

The default database looks for video content in /media/virtual, which is the location that the virtual-disk package creates. You can add others using the web interface but note that all content under /media/My Video is likely to be encrypted if it was recorded straight from the air.

So, quick-start guide for people who already have telnet/ssh access to their box:
  • opkg update && opkg install virtual-disk mediatomb
  • Use the remote control to copy something you want to watch to the virtual disk; it will be decrypted as it is copied.
  • Watch the programme on another device on your network (I used XBMC on MacOSX and it worked perfectly).
  • If you use DHCP then the server may start too early during boot, log into the box and run:
Code:
/mod/etc/init.d/S90mediatomb stop && sleep 1 && /mod/etc/init.d/S90mediatomb start

(Raydon's shown me how he fixed this on the Foxsat so I will be releasing a new HDF file with that fix in it soon; the mediatomb package will then automatically wait for the network to come up before starting)
 
You can add others using the web interface...

I've updated the af123-webif package so that if mediatomb is installed then there's a link to the mediatomb web interface at the bottom of each page. MediaTomb can change its port number each time it's run.
 
Python could probably be put on with a bit of work - it needs some patches to cross compile but they're available on a few web sites. It's on my list of things to do.

Cool - I think I'll work on a web interface based on Python & Django (on my desktop, since I've copied epg.dat & all the sqlite files over) & hope that a) Python is a possibility in the (near?) future and b) it's not too hungry on the processor/memory.

The ideal would probably be a small EPG parser written in C that is called by jim to extract information when it needs it.

I was thinking of periodically (maybe once an hour - I saw cron in your repository) processing the EPG into a sqlite database. I'm assuming selecting from a sqlite db would be quicker than reparsing the whole file from Python every time (I'll try to verify that sometime), though if the parser was done in C then I'm not sure which would be faster. Anyway, I've attached the parser in case anyone's interested.
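As a sketch of the idea (the schema and column names here are my own invention, not the real epg.dat fields):

```python
import sqlite3

# Hypothetical schema - the real epg.dat fields still need mapping.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE epg (
    service_id INTEGER,
    start      INTEGER,   -- epoch seconds
    duration   INTEGER,   -- seconds
    title      TEXT,
    synopsis   TEXT)""")
conn.execute("CREATE INDEX epg_start ON epg (start)")

# Rows as produced by an hourly parser run (example data)
events = [(4165, 1300000000, 1800, "Family Guy",
           "19/22. Stuck Together, Torn Apart: ...")]
conn.executemany("INSERT INTO epg VALUES (?,?,?,?,?)", events)

# "What's on now" becomes one indexed query instead of a full reparse
now = 1300000900
row = conn.execute(
    "SELECT title FROM epg WHERE start <= ? AND start + duration > ?",
    (now, now)).fetchone()
print(row[0])  # Family Guy
```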

...feel free to extend it and submit updates back to me...

On that note, I've attached a new shtml/jim to dump the 3 databases I've seen on the box (setup, channels & recording schedule) to an HTML page. Don't know if it'll be useful for anyone else, but I find it easier & quicker to look at them like that than firing up jimsh/sqlite & trying to remember all the commands (I'm used to T-SQL - any chance of getting SQL Server on there? ;)).
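The attached db.jim does the dumping in Jim; the same idea in Python (illustrative only - table and column names below are made up) is just a cursor walk per table:

```python
import sqlite3

def dump_table_html(conn, table):
    """Render one table of a sqlite database as an HTML table,
    headers from cursor.description, one <tr> per row.
    (Same idea as the attached db.jim, just in Python.)"""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    rows = ["<tr>" + "".join(f"<th>{c}</th>" for c in cols) + "</tr>"]
    for r in cur:
        rows.append("<tr>" + "".join(f"<td>{v}</td>" for v in r) + "</tr>")
    return f"<h2>{table}</h2><table>" + "".join(rows) + "</table>"

# Demo against an in-memory database with an invented channels table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE chans (num INTEGER, name TEXT)")
conn.execute("INSERT INTO chans VALUES (1, 'BBC ONE')")
print(dump_table_html(conn, "chans"))
```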

Anyway, as McDonald's says, 'I'm Lovin' it'!
 

Attachments

  • db.jim.txt (958 bytes)
  • db.shtml.txt (136 bytes)
  • epgparse.py.txt (11.5 KB)
 

Thanks to everyone for their efforts - I have mediatomb up and running and it streams beautifully! So far I have seen no adverse effects in day-to-day activity. I have installed the following packages (and their dependencies).

  • mediatomb
  • af123-webif
  • utelnetd_0.1.9
  • virtual-disk
  • dedup
I'm hoping to get mediatomb to stream the media from my external USB drive attached to the Hummy. My current USB drive is NTFS (so I have to plug it into a PC to write to it). Should I format it as ext3 or FAT32 if I want to leave it permanently attached to the HDR and add files via FTP in future?
 
Where's the download now? Clicking on the HDR hdf link below gives an "'HDR_FOX_T2_upgrade.hdf' is unavailable. This file was deleted." message.

(I know a place where they will host the modded hdf files...)

Update HDF files and package repository:

I've updated the HDR_FOX_T2_upgrade.hdf file and added HD_FOX_T2_upgrade.hdf. These new images contain a number of bug fixes so that, for example, external drives formatted with ext3 now work properly.

The package repository has been restructured too and a number of new packages have been added. The old HDF image won't be able to retrieve packages from the new repository.

If you've previously tried the customised HDF file then I recommend that you remove the contents of /mnt/hd2/mod and update with this new one. Once updated, the first package you should probably install is busybox, which installs a small binary providing most of the missing user commands that you will want, including gzip, tar, a pager (more/less) and a simple vi clone (full vim packages are available if you prefer; they take more memory but have much more functionality). I also find ldd & file quite useful.

It's still early days, so I suggest you only use these updates if you are technical, with some Linux or other UNIX experience. If you can't work out what a package does from its name or a quick Google then you don't want it :)

On the other hand, please try af123-webif and build on it if you can!
 
Where's the download now?

The links in the first post in this thread are correct, and I'll keep those up to date. I'll remove all others now and point to the first post to avoid confusion.
 
Great work so far chaps! Just back from a month away, and disappointed to see there is still no HD update from Humax, but the progress made here has made up for it. I was wondering if anyone has had any success yet adding useful functionality to the HD model (in particular being able to media-serve / back up / re-compress recorded programmes)? I've started a thread in the HD section if people feel it's more appropriate to discuss there.
 
I was wondering if anyone has had any success yet adding useful functionality to the HD model (in particular being able to media-serve / back up / re-compress recorded programmes)?

Yes, someone has managed to load a custom firmware onto his HD but the package installation didn't work properly; he's investigating. I don't have an HD myself. It shouldn't be long before I can roll an update that works properly for that model.
 
I thought the web interface was configured for port 49152?? :confused: Mine is.

It will usually start on that port (and to be honest I've never seen it on a different one), but the documentation states:

By default MediaTomb will select a free port starting with 49152.... it is possible that the port will change upon server restart.

so it's not entirely guaranteed.
 