
Getting HDF from OTA?

matbl

Member
Hi all.

This might be a strange question, but does anyone know whether the data transmitted OTA is an HDF file just like the one you can download directly from Humax, or is the OTA in another format?

Background:
I'm in Sweden, so I don't have an HDR-FOX T2, but I do have a Humax BXR-HD+, which is the same box adapted for the Swedish market.
I would of course like to do the same thing you guys have done for the HDR-FOX T2, but for the BXR-HD+. Humax hasn't published any official update on their website, so there is nothing to start from, but there is an OTA running.
I have the equipment and knowledge to pull the OTA for the BXR-HD+ from the air into a demuxed file. But is this an HDF? How do I know where it starts? There doesn't seem to be a magic number or similar at the file start...

Someone with in-depth HDF knowledge would be a great help...
 
Code:
/*
        HDF File Header
        ========
        2 bytes --> Header Length (0x0012)
        2 bytes --> CRC-16 of 0x10 bytes after this CRC-16
        2 bytes --> Humax Model
        2 bytes --> Number of raw blocks
        4 bytes --> SystemID1
        4 bytes --> SystemID2
        4 bytes --> Total size of raw blocks

Looking for the system IDs is probably the easiest way to find it in the data stream. SystemID1 and SystemID2 define the range of system IDs to which the update can be applied. They're usually the same value in official firmware files.
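For anyone trying the same thing, here is a rough Python sketch of that scan, using the header layout quoted above. Big-endian byte order and the equal-system-ID heuristic are assumptions on my part, so verify against a known-good HDF first:

```python
import struct

HDR_LEN = 0x0012  # value of the header-length field in known HDFs

def find_hdf_headers(data: bytes):
    """Scan a raw dump for plausible HDF header starts.

    Heuristic: the first field is the header length (0x0012), and
    SystemID1 (offset 8) usually equals SystemID2 (offset 12) in
    official files. Big-endian byte order is an assumption here --
    check it against a known-good HDF.
    """
    hits = []
    for off in range(max(0, len(data) - 19)):  # 20-byte header window
        if struct.unpack_from(">H", data, off)[0] != HDR_LEN:
            continue
        sys1, sys2 = struct.unpack_from(">II", data, off + 8)
        if sys1 == sys2 and sys1 != 0:
            hits.append((off, sys1))
    return hits
```

If the broadcast image uses a wildcard ID range instead of equal IDs, the `sys1 == sys2` check would need relaxing.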
 

Thanks!
I thought of the system IDs after my post, but my main question still remains: does anyone know whether the OTA is formatted as an HDF file, or is it something else?
 
Not sure of the exact format, but it is possible to extract HDF data from an OTA. You first need to capture the data from the live stream using something like TSReader. A user named colibri created a utility for the German Humax PR-HD1000 and iCord which can extract and then export the data from the captured transport stream. The utility is named PR-HD1000-Heaven.exe; it might be worth a look.
You also have the problem of how to get your modified HDF back onto the box. I assume Humax has not published a procedure for reflashing via USB, or what the HDF filename has to be for your box to recognise it as valid.
 
Yeah, I have the data, but I need to gather some more information about how it's packed into the transport stream packets. There seems to be a public DVB standard for it, DVB-SSU, so hopefully it's just a matter of writing a utility that parses the stream and extracts the right data.
Thanks, I'll have a look at that utility. Maybe it will save me the time of writing my own...
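In case it helps anyone following along, here is a minimal Python sketch of just the transport layer: pulling the payload bytes for one PID out of a 188-byte-packet TS capture. The PID would come from the SSU signalling, and the DSM-CC section reassembly that DVB-SSU (ETSI TS 102 006) layers on top is not shown here:

```python
def extract_pid_payloads(ts: bytes, pid: int) -> bytes:
    """Concatenate the payloads of all TS packets carrying one PID.

    Transport layer only: the DSM-CC sections carried inside still
    need to be reassembled according to the DVB-SSU spec.
    """
    out = bytearray()
    for i in range(0, len(ts) - 187, 188):
        pkt = ts[i:i + 188]
        if pkt[0] != 0x47:                     # sync byte
            continue
        pkt_pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        if pkt_pid != pid:
            continue
        afc = (pkt[3] >> 4) & 0x3              # adaptation field control
        if afc in (0, 2):                      # no payload present
            continue
        payload = pkt[4:]
        if afc == 3:                           # skip adaptation field
            payload = pkt[5 + pkt[4]:]
        out.extend(payload)
    return bytes(out)
```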
 

I managed to extract the HDF file from the OTA data with my own utility.
The HDF header seems fine, and it lines up perfectly with the OTA start block.
The system IDs in the OTA HDF are set to the range 0000.0000 - FFFF.FFFF; the actual system ID is in the OTA metadata.
The problem is that the data sections seem to be corrupt.
I get this:
Code:
C:\Temp\D-TEMP\dsm-dump\humidify-1.0.2>humidify.exe ..\mmm.hdf
HDF Tool v1.0.2, by af123, 2011.

Opening ..\mmm.hdf, 12260029 bytes.

  Blocks:     377
  Model:      4
  System ID:  0000.0000 - ffff.ffff

File  Offset   Address  Type  Flags  Size       Uncompressed Size
----  ------   -------  ----  -----  ----       -----------------
   1  0000014  0000000  3     0      262        -
Oversized block.
4,0x8148: data block:
  File Offset:     33096 (0x8148)
  Block Length:    50056 (0xc388)
  CRC:              0x8372
  Flags:            0x53
  Type:             0xd7
  Original Length:  14543 (0x38cf)
  Address:          0xee897c33
  Datalen:          50044 (0xc37c)

   2  0000146  0000400  3     0      32758      -
Oversized block.
5,0x8154: data block:
  File Offset:     33108 (0x8154)
  Block Length:    51407 (0xc8cf)
  CRC:              0x57ba
  Flags:            0xc7
  Type:             0x16
  Original Length:  29823 (0x747f)
  Address:          0x3287748b
  Datalen:          51395 (0xc8c3)



C:\Temp\D-TEMP\dsm-dump\humidify-1.0.2>humidify.exe -t ..\mmm.hdf
HDF Tool v1.0.2, by af123, 2011.

Opening ..\mmm.hdf, 12260029 bytes.

  Blocks:     377
  Model:      4
  System ID:  0000.0000 - ffff.ffff

x  1.hdfbin-3-000000.raw          (262 bytes)
Block checksum error. (4308 vs. 40090).

Processed in: 0.01s

Any pointers to, or a specification of, the data section headers would be nice. :)
 
Data blocks follow the header and they look like:

Code:
        HDF Block
        =========
        2 bytes --> Block length (not including these two)
        2 bytes --> CRC-16 from next byte to end of block.
        1 byte  --> Flags: 0x00 Not compressed (-lh0-)
                           0x80 Compressed (-lh5)
        1 byte  --> Type
        2 bytes --> Decompressed length (or data length if not compressed)
        4 bytes --> Address
        ... data to end of block ...

Feel free to post the file somewhere if you'd like me to have a look, although it sounds like you're almost there.
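For reference, a quick Python walker for that block layout. Big-endian byte order is an assumption here, and the exact CRC-16 variant Humax uses isn't verified, so the CRC field is only reported, not checked:

```python
import struct

def walk_hdf_blocks(data: bytes, offset: int):
    """Iterate the data blocks that follow the HDF header.

    Per-block layout (as described above): 2-byte block length (not
    counting itself), then CRC-16, flags, type, decompressed length,
    4-byte address, then payload to the end of the block.
    """
    while offset + 2 <= len(data):
        blen = struct.unpack_from(">H", data, offset)[0]
        if blen < 10:                     # too short for the fixed fields
            break
        body = data[offset + 2 : offset + 2 + blen]
        crc, flags, btype, dlen, addr = struct.unpack_from(">HBBHI", body, 0)
        yield {
            "offset": offset,
            "crc": crc,                   # variant unverified, not checked
            "compressed": bool(flags & 0x80),
            "type": btype,
            "decompressed_len": dlen,
            "address": addr,
            "payload": body[10:],
        }
        offset += 2 + blen
```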
 
Solved it. I now have a correct HDF from the OTA in Sweden... :)

Code:
C:\Temp\D-TEMP\dsm-dump\humidify-1.0.2>humidify.exe -t ..\mmm.hdf
HDF Tool v1.0.2, by af123, 2011.

Opening ..\mmm.hdf, 12253863 bytes.

  Blocks:     377
  Model:      4
  System ID:  0000.0000 - ffff.ffff

x  1.hdfbin-3-000000.raw          (262 bytes)
x  2.hdfbin-3-000400.raw          (12249017 bytes)

Processed in: 0.80s
 

Next problem.
The HDF file I got had two blocks, both of type 3. The first was only 262 bytes. The second, when unpacked, turned out to be a second HDF file containing a single raw file, this time of type 1.
After extracting that, I can at least conclude that it isn't a squashfs filesystem.
What are the HDF block type numbers?

Edit:
I figured out what is wrong, but I can't do anything about it...
The OTA HDF has a second HDF inside it; call it 2.hdf.
2.hdf has a number of blocks flagged as compressed. The only problem is that humidify doesn't decompress them; it replaces them with all zeros.
So I loaded the HDF in HDFSmart, which shows that 11 of the 375 blocks are compressed, but they fail that utility's decompression test. The CRC is correct for all blocks, so it shouldn't be due to data errors.
Help?
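Since -lh5 is the standard LHA/LZH compression method, one way to test the compressed payloads outside humidify is to wrap a raw block payload in a minimal level-0 LZH header and hand the result to an ordinary LHA tool. A sketch, with the caveat that the file CRC in the header is zeroed (we don't know it), so the tool will warn about it; `block.bin` is an arbitrary name:

```python
import struct

def wrap_lh5(payload: bytes, orig_len: int, name: str = "block.bin") -> bytes:
    """Wrap a raw -lh5 stream in a minimal level-0 LZH header.

    Produces something an LHA tool can attempt to decompress.  The
    timestamp and the file CRC-16 are zeroed, so treat any CRC
    warning from the tool as expected; a decompression *failure*
    would mean the payload really isn't plain -lh5 data.
    """
    fname = name.encode("ascii")
    body = (
        b"-lh5-"                                  # method id
        + struct.pack("<II", len(payload), orig_len)  # sizes, little-endian
        + b"\x00\x00\x00\x00"                     # MS-DOS timestamp (zeroed)
        + b"\x20"                                 # file attribute
        + b"\x00"                                 # header level 0
        + bytes([len(fname)]) + fname             # filename
        + b"\x00\x00"                             # file CRC-16 (unknown here)
    )
    checksum = sum(body) & 0xFF                   # level-0 header checksum
    return bytes([len(body), checksum]) + body + payload + b"\x00"  # 0x00 = end marker
```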
 