
Customised HDF file released

Discussion in 'HD/HDR-FOX T2 Customised Firmware' started by af123, Apr 14, 2011.

  1. csudcy

    csudcy New Member

    Firstly, thank you so much! I was in the process of building my own box to replace my old Hummy 9200T, but the hummy died yesterday before my homebrew box was ready. Found this thread today & pretty much went straight out to buy an HDR-FOX T2! One thing I really wanted to be able to do was to schedule recordings from the web, but I didn't consider that off-the-shelf systems (with a little modding) might be able to do this until today.

    Anyway, the disk info script wasn't working for me - I think it's because I don't have many recordings, so have <100GB used, which changes the array indexes of the split df output. So, I've added a regexp to remove multiple spaces & fixed up the indexes. Also, it was looking for ...00.png for the pie chart image file, which doesn't exist, so I changed it to use files 1..25 instead of 0..25. My diskspace.jim is now:
    Code:
    #!/mod/bin/jimsh
    
    # Find the /dev/sda2 line in the df output and pull out the size,
    # used and percentage-used fields.
    foreach df [split [exec df -h] "\n\r"] {
        if {[string match *sda2* $df]} {
            # collapse runs of whitespace so the field indexes are stable
            regsub -all -- {[[:space:]]+} $df " " df
            set fields [split $df]
            set size [lindex $fields 1]
            set used [lindex $fields 2]
            set perc [string trimright [lindex $fields 4] "%"]
        }
    }
    
    # pick the pie-chart image (345_2_14_ST_HDD_NN.png) closest to the used percentage
    set file [format "%02d" [expr {$perc * 25 / 100 + 1}]]
    
    puts "<div style=\"float: right; background:url('/images/345_1_27_ST_USB_BG.png')\">"
    puts "<img src=/images/345_2_14_ST_HDD_$file.png>"
    puts "</div>"
    puts "<span style=\"float: right\">"
    puts "<br>"
    puts "Total space: $size<br>"
    puts "Used: $used ($perc%)"
    puts "</span>"
    
    I did some tcl about 6 years ago at uni - hopefully I've done it right & it's useful (works on my box at least :)).

    Regarding scheduling from the web - I've found the EPG data (/mnt/hd1/dvbepg/epg.dat); any ideas what format it is? I've tried running it through the GNUWin32 version of file but it just says it's 'data'. I think it's encoded as a 1-byte length followed by that much data - I'll try to decode it some more tomorrow.
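
    If it really is just <1-byte length><data> repeated, something along these lines should walk through it - desktop tclsh only for now since I haven't checked what binary commands jimsh provides, and the record layout is just my guess:
    Code:
    # Rough probe of epg.dat - assumes each record is a 1-byte length
    # followed by that many bytes of data, which is unconfirmed.
    set fh [open epg.dat rb]
    while {![eof $fh]} {
        set lenByte [read $fh 1]
        if {$lenByte eq ""} { break }
        binary scan $lenByte cu len
        set data [read $fh $len]
        puts "$len bytes: [string range $data 0 40]"
    }
    close $fh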

    Thanks again,
    Nick
     
  2. af123

    af123 Administrator Staff Member

    Welcome to the party.

    Very useful, thanks - it works for me too, so I'll update the package with your changes. Condensing the spaces improves it a lot.

    I don't think anyone has looked at it yet. Raydon has analysed some of the file formats on the Foxsat and T2 so he may have more information - if you look at this page it shows some of the Foxsat formats, including an EPG block which doesn't seem to match epg.dat. Not directly helpful, but it does show the way in which Humax have historically stored data (such as 4-byte epoch dates).
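
    For what it's worth, those 4-byte epoch dates are just seconds since 1970, so if you spot a likely-looking value in a hex dump it's quick to check (desktop tclsh here, and the byte order is an assumption):
    Code:
    # Decode a suspected 4-byte epoch date as big-endian seconds since 1970
    # (byte order is a guess - try Iu and iu and see which looks sane).
    binary scan "\x4d\xa6\xf1\xc0" Iu secs
    puts [clock format $secs -gmt 1]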

    Alternatively it could be a standard Freeview EPG format...

    More investigation required in either case!
     
  3. csudcy

    csudcy New Member

    I think you've hit on it - I got the specs for the data transmitted by Freeview (ETSI EN 300 468, from http://www.etsi.org/deliver/etsi_en/300400_300499/300468/01.11.01_60/en_300468v011101p.pdf); it's mostly EIT data with another header before each section that I haven't identified yet. I'm writing a Python parser for it at the moment (that's my main language). Any idea if Python could be put on the box (I've never done cross-compiling/packaging before)? If Python were on there, I'd love to write a webif in Django (I've got nearly 3 years' Django experience vs. pretty much zero tcl/jimsh!). Anyway, on with the parser for now.
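
    In case it saves anyone else some head-scratching, the awkward bit of the EIT event loop is the start_time field: a 16-bit Modified Julian Date followed by hh:mm:ss as BCD (annex C of that spec). Something like this converts it - desktop tclsh, untested against a full dump, and it obviously doesn't cover the extra header I mentioned:
    Code:
    # Convert an EIT start_time (5 bytes: 16-bit MJD + hh:mm:ss in BCD,
    # per ETSI EN 300 468 annex C) into a readable UTC date.
    proc bcd {v} { expr {(($v >> 4) & 0x0f) * 10 + ($v & 0x0f)} }
    
    proc decode_start_time {bytes} {
        binary scan $bytes Sucucucu mjd h m s
        # MJD 40587 is 1970-01-01, so this yields a Unix timestamp
        set secs [expr {($mjd - 40587) * 86400
                        + [bcd $h] * 3600 + [bcd $m] * 60 + [bcd $s]}]
        return [clock format $secs -gmt 1]
    }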
     
  4. af123

    af123 Administrator Staff Member

    Python could probably be put on with a bit of work - it needs some patches to cross compile but they're available on a few web sites. It's on my list of things to do.

    The reason I knocked up that interface using TCL is because the box is very short on resources when it's doing its day job. Jim is a very small footprint TCL interpreter which makes it perfect for running in small embedded environments. I don't have much TCL experience either - I mainly program in C or assembler these days.

    Even with the limited resources, it's surprising what can be done. I've packaged up the GCC C & C++ compilers (gcc package) and it's possible to compile fairly large things natively on the box (although I wouldn't try it while it's recording something I care about).

    The ideal would probably be a small EPG parser written in C that is called by jim to extract information when it needs it. Similarly I'm working on a small utility that parses the .hmt files associated with recorded media. Jim could use that to display more information on recordings fairly easily and I might have a play around with that in the near future.
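
    To make that more concrete - the names below are made up (there's no epgdump binary), but the pattern is just jim exec'ing a small helper and splitting whatever it prints, exactly like the df trick in diskspace.jim:
    Code:
    # Illustration only: "epgdump" is a hypothetical C helper that prints
    # one tab-separated event per line (channel, start time, title).
    foreach line [split [exec /mod/bin/epgdump --now] "\n"] {
        set f [split $line "\t"]
        puts "<tr><td>[lindex $f 0]</td><td>[lindex $f 1]</td><td>[lindex $f 2]</td></tr>"
    }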

    I named the interface package af123-webif because I imagined that other people would come along and write their own, and because mine is really just a proof of concept at the moment (although feel free to extend it and submit updates back to me). If it turns out to be practical to implement something using Python (or even Django) then that would certainly speed up development.
     
  5. raydon

    raydon Well-Known Member

    af123: PM with link to mediatomb package is in your mailbox now.
     
  6. af123

    af123 Administrator Staff Member

    I've just uploaded a new package called dedup, which adds a command of the same name that I used to sort out large folders of programmes. It's another use-at-your-own-risk thing, but it works for me.

    It works on recordings where the EPG contained the episode name, a ':' and then the rest of the description, e.g.

    but the name that's displayed in the media list is just a generic series name - in this case Family Guy.

    If you change to a directory and run dedup with no arguments, it just lists the recordings there and shows what they would be renamed to. If you run it as dedup -yes then it makes the changes. It:
    • Renames the files on disk to match the episode name;
    • Changes the title shown in the media list to match the episode name;
    • Identifies duplicate episodes and moves them to a sub-folder called dup/ (which you can subsequently remove if you're happy that they're duplicates).
    It's just a shell script, so it's easy to look at and modify if necessary. It uses a new binary package called hmt which extracts information from .hmt files (associated with recordings) in order to determine the episode name and to change the media list title.
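
    The renaming rule itself is nothing cleverer than "take whatever comes before the first colon in the synopsis", i.e. (made-up synopsis, and the real script gets the text via hmt rather than hard-coding it):
    Code:
    # Illustrative only - dedup is a shell script; this just shows the rule.
    set synopsis "Brian Does Hollywood: Brian directs Stewie's TV commercial."
    set episode [lindex [split $synopsis ":"] 0]
    puts $episode   ;# -> Brian Does Hollywood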
     
  7. Dan

    Dan Member

    Great work - just ran dedup on my American Dad collection - worked a treat
     
  8. af123

    af123 Administrator Staff Member

    I've packaged up Raydon's mediatomb installation and it works a treat. It's available now:

    Code:
    opkg update && opkg install mediatomb
    I've made some (fairly minor) changes from the version that Raydon provided, so any bugs are likely to be my fault. On the other hand, if it works then that's all Raydon's work :)

    The default database looks for video content in /media/virtual, which is the location that the virtual-disk package creates. You can add others using the web interface but note that all content under /media/My Video is likely to be encrypted if it was recorded straight from the air.

    So, quick-start guide for people who already have telnet/ssh access to their box:
    • opkg update && opkg install virtual-disk mediatomb
    • Use the remote control to copy something you want to watch to the virtual disk; it will be decrypted as it is copied.
    • Watch the programme on another device on your network (I used XBMC on MacOSX and it worked perfectly).
    • If you use DHCP then the server may start too early during boot; if that happens, log into the box and run:
    Code:
    /mod/etc/init.d/S90mediatomb stop && sleep 1 && /mod/etc/init.d/S90mediatomb start
    (Raydon's shown me how he fixed this on the Foxsat so I will be releasing a new HDF file with that fix in it soon; the mediatomb package will then automatically wait for the network to come up before starting)
     
  9. af123

    af123 Administrator Staff Member

    I've updated the af123-webif package so that if mediatomb is installed then there's a link to the mediatomb web interface at the bottom of each page; that's handy because mediatomb can change port number each time it's run.
     
  10. csudcy

    csudcy New Member

    Cool - I think I'll work on a web interface based on Python & Django (on my desktop, since I've copied the epg.dat & all the sqlite files over) & hope that a) Python is a possibility in the (near?) future and b) it's not too hungry on the processor/memory.

    I was thinking of periodically (maybe once an hour - I saw cron in your repository) processing the EPG into a sqlite database. I'm assuming selecting from a sqlite db would be quicker than re-parsing the whole file every time from Python (I'll try to verify that sometime), though if the parser was done in C then I'm not sure which would be faster. Anyway, I've attached the parser in case anyone's interested.
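
    The table I have in mind is roughly the one below - shown with the desktop Tcl sqlite3 bindings to keep it in the same language as the rest of the thread, though my actual version will be Python, and the layout is only a first guess:
    Code:
    # Rough idea: stash parsed EIT events in sqlite so the web page can
    # query them instead of re-reading epg.dat every time.
    package require sqlite3
    sqlite3 db epg.db
    db eval {
        CREATE TABLE IF NOT EXISTS epg (
            service_id INTEGER,
            event_id   INTEGER,
            start      INTEGER,   -- unix timestamp
            duration   INTEGER,   -- seconds
            title      TEXT,
            synopsis   TEXT,
            PRIMARY KEY (service_id, event_id)
        )
    }
    # the parser would do one of these per event it finds (values made up)
    db eval {INSERT OR REPLACE INTO epg
             VALUES (4165, 12345, 1302795000, 1800, 'Family Guy', 'Example row')}
    db close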

    On that note, I've attached a new shtml/jim to dump the 3 databases I've seen on the box (setup, channels & recording schedule) to HTML. Don't know if it'll be useful for anyone else, but I find it easier & quicker to look at them like that than firing up jimsh/sqlite & trying to remember all the commands (I'm used to T-SQL - any chance of getting SQL Server on there? ;)).
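
    If you'd rather not grab the attachment, a stripped-down version of the same idea looks something like this (it assumes the sqlite3 command-line client is available to exec, and the database path is just a placeholder):
    Code:
    #!/mod/bin/jimsh
    # Minimal sketch: dump every table of one sqlite database as HTML.
    set db "/path/to/database.db"   ;# placeholder - point it at one of the three
    puts "<h2>$db</h2>"
    foreach table [exec sqlite3 $db .tables] {
        puts "<h3>$table</h3>"
        puts "<table border=1>"
        puts [exec sqlite3 -html -header $db "SELECT * FROM $table"]
        puts "</table>"
    }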

    Anyway, as McDonald's says, 'I'm lovin' it'!
     

  11. af123

    af123 Administrator Staff Member

    Very neat - works for me.
     
  12. Dan

    Dan Member

    Thanks to everyone for their efforts - I have mediatomb up and running and it streams beautifully! So far I have seen no adverse effects on day-to-day activity. I have installed the following packages (and their dependencies):

    • mediatomb
    • af123-webif
    • utelnetd_0.1.9
    • virtual-disk
    • dedup
    I'm hoping to get mediatomb to stream the media from my external USB drive attached to the Hummy. My current USB drive is NTFS (so I have to plug it into a PC to write to it) - should I format it as ext3 or FAT32 if I want to leave it permanently attached to the HDR and add files via FTP in future?
     
  13. raydon

    raydon Well-Known Member

    Format it as ext3. FAT32 has a 4GB limit on file size.
     
  14. suarez

    suarez New Member

    Where's the download now? Clicking on the HDR hdf link below gives a "'HDR_FOX_T2_upgrade.hdf' is unavailable. This file was deleted." message.

    (I know a place where they will host the modded hdf files...)

     
  15. Dan

    Dan Member

    *Edit - see af123's post below.
     
  16. af123

    af123 Administrator Staff Member

    The links in the first post in this thread are correct, and I'll keep those up to date. I'll remove all the others now and point to the first post to avoid confusion.
     
  17. raydon

    raydon Well-Known Member

    I thought the web interface was configured for port 49152? :confused: Mine is.
     
  18. Drutt

    Drutt Active Member

    Great work so far, chaps! Just back from a month away, and disappointed to see there is still no HD update from Humax, but the progress made here has made up for it. I was wondering if anyone has had any success yet in adding useful functionality to the HD model (in particular being able to media-serve / back up / re-compress recorded programmes)? I've started a thread in the HD section if people feel it's more appropriate to discuss there.
     
  19. af123

    af123 Administrator Staff Member

    Yes, someone has managed to load a custom firmware onto his HD but the package installation didn't work properly; he's investigating. I don't have an HD myself. It shouldn't be long before I can roll an update that works properly for that model.
     
  20. af123

    af123 Administrator Staff Member

    It will usually start on that port (and to be honest I've never seen it on a different one), but the documentation states:

    so it's not entirely guaranteed.