[humidify] HDF file utility

As a sample of the new compression:
http :// snk_dot_to/f-cdh5gl8p
5150C_upgrade.hdf (carousel data dump)


Code:
** HEADER **
 
HDFBlockLength:  0x0012 (18)
HDFBlockCRC:     0xCC5B
HDFModelType:    0x0004
HDFTotalBlocks:  0x02A3 (675)

** BLOCK 1 **

HDFBlockLength:  0x0110 (272)
HDFBlockCRC:     0x0F1C
HDFBlockFlag1:   0x00
HDFBlockFlag2:   0x01
HDFDecompLength: 0x0106 (262)
HDFLoadAddress:  0x80100000

** BLOCK 2 **

HDFBlockLength:  0x001E (30)
HDFBlockCRC:     0xAC85
HDFBlockFlag1:   0x00
HDFBlockFlag2:   0x80
HDFDecompLength: 0x0014 (20)
HDFLoadAddress:  0x00000000

** BLOCK 3 **

HDFBlockLength:  0x8000 (32768)
HDFBlockCRC:     0x4550
HDFBlockFlag1:   0x00
HDFBlockFlag2:   0x01
HDFDecompLength: 0x7FF6 (32758)
HDFLoadAddress:  0x80100400

** BLOCK 4 **

HDFBlockLength:  0x8000 (32768)
HDFBlockCRC:     0xC989
HDFBlockFlag1:   0x00
HDFBlockFlag2:   0x01
HDFDecompLength: 0x7FF6 (32758)
HDFLoadAddress:  0x801083F6
etc
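In case it helps anyone poking at this, here is a rough Python sketch that walks the blocks using only what is visible in the dump above. The field widths, the byte order and what HDFBlockLength actually counts are guesses (every block in the dump shows HDFBlockLength - HDFDecompLength == 10, which happens to match CRC + flags + decompressed length + load address), so treat it as a starting point, not a description of the real format.
Code:
import struct

# Rough sketch only: field widths, byte order and what HDFBlockLength counts
# are guesses inferred from the dump above, not a confirmed HDF layout.
def walk_hdf(path):
    with open(path, "rb") as f:
        data = f.read()

    # Header: assume 2-byte length, 2-byte CRC, 2-byte model, 2-byte block count.
    hdr_len, hdr_crc, model, total = struct.unpack_from(">HHHH", data, 0)
    print(f"header: len={hdr_len} crc={hdr_crc:#06x} model={model} blocks={total}")

    pos = 2 + hdr_len                    # guess: the length field excludes itself
    for n in range(1, total + 1):
        if pos + 12 > len(data):
            break
        blk_len, blk_crc, flag1, flag2, decomp, load = struct.unpack_from(
            ">HHBBHI", data, pos)
        payload = data[pos + 12 : pos + 2 + blk_len]
        print(f"block {n}: len={blk_len} crc={blk_crc:#06x} "
              f"flags={flag1:#04x}/{flag2:#04x} decomp={decomp} "
              f"load={load:#010x} payload={len(payload)} bytes")
        pos += 2 + blk_len               # guess: same rule as the header
Running it over the sample should at least show whether the header block count and per-block fields line up with what humidify prints.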
 
I ran into the same problem. When I process the HDF file I get all zeroes. All the CRCs are correct, but there is no usable output. Maybe it really is a new compression format. Does anybody know anything about it?
 
Is there a new version of humidify for i386? I can't start it in QEMU.

I want to edit the firmware for the 7000i, aka the CXHD-5150C.

This one: https://yadi.sk/d/Xn7AilHBjCX5m
 
Yes, I have these versions. The problem is that they do not unpack the hdf properly (at least 1.0.2 doesn't). After unpacking there are two CCF files (2 and 256 bytes). If I attempt to unpack these two files with version 1.0.2, the unpacking succeeds, but they are not squashfs. So I want to try unpacking them with version 1.0.4, which I could not get to start.
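For what it's worth, a quick way to check whether those extracted files really are squashfs is to look at the first four bytes: a squashfs image starts with the magic "hsqs" (little-endian) or "sqsh" (big-endian). A minimal sketch; the file names are whatever you extracted:
Code:
import sys

# A squashfs image starts with the 4-byte magic 0x73717368:
# "hsqs" for little-endian images, "sqsh" for big-endian ones.
def is_squashfs(path):
    with open(path, "rb") as f:
        return f.read(4) in (b"hsqs", b"sqsh")

for name in sys.argv[1:]:
    print(name, "looks like squashfs" if is_squashfs(name) else "not squashfs")
The `file` utility will tell you the same thing if you have it to hand.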


I cannot run 1.0.4 because I do not have a Humax HD(R)-FOX :)
 
The tar archive already provided to you by prpr was unpacked on the T2 using humidify v1.0.4. What makes you think you will fare any differently?
Renaming the extracted raw files to hdf and further unpacking them on the T2 using v1.0.4 only results in files with large blocks containing zeroes, as already noted by others in this thread. See this post from Alex M, for example. There is no simple way to extract a squashfs from this hdf. Humax closed that particular door a long time ago.
 
... and it would not be possible to re-create a .hdf, as the algorithm and secret data required to reconstruct the checksum blocks are unknown.
 
I appear to have a problem with humidify when an hdf has been created with a particular raw file. I am not sure whether it is platform related or version related. It works with version 1.0.2 on an i386 platform but does not work with version 1.0.4 on the HDR.

On the HDR I get the following (note the small file size):
Code:
# humidify -t h.hdf
HDF Tool v1.0.4, by af123, 2011-2015.

Opening h.hdf, 22494602 bytes.

  Blocks:     0
  Model:      4
  System ID:  80bc.7e00

x  1.hdfbin-1-000000.raw          (32758 bytes)

Processed in: 0.06s
When the hdf file has been created with version 1.0.2 using the same raw file:
Code:
$ humidify -t h2.hdf
HDF Tool v1.0.2, by af123, 2011.

Opening h2.hdf, 22494602 bytes.

  Blocks:     693
  Model:      4
  System ID:  80bc.7e00

x  1.hdfbin-1-000000.raw          (22638592 bytes)

Processed in: 0.77s
Code:
$ cmp -l h.hdf h2.hdf
       3 204  50
       4 163 343
       7   0   2
       8   0 265
      17   0   1
      18   0 127
      19   0  75
      20   0 166
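Since cmp -l prints 1-based offsets with the byte values in octal, all of those differences sit in the first 20 bytes of the file. Offsets 7-8, for example, are 0x0000 in the broken file and 0x02B5 (693) in the good one, which lines up with the Blocks counts humidify reported. A throwaway sketch to eyeball the two headers side by side, assuming the same file names as the runs above:
Code:
# Quick-and-dirty comparison of the first bytes of the two files from the
# cmp run above. cmp -l uses 1-based offsets and octal values, so e.g.
# offsets 7-8 ("0 2" / "0 265") are bytes 6-7 here: 0x0000 vs 0x02B5 (693).
def head(path, n=32):
    with open(path, "rb") as f:
        return f.read(n)

a, b = head("h.hdf"), head("h2.hdf")
for i, (x, y) in enumerate(zip(a, b)):
    print(f"{i:3d}: {x:02x} {y:02x}" + ("  <-- differs" if x != y else ""))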
 
I take it all back. After reverting to version 1.0.2 and then reinstalling 1.0.4 it is now working. It must have been a corrupt humidify binary.
 
The links for version 1.0.2 don't work.
Shame you didn't say so in the first place; that is pertinent information for your enquiry.

Sorry, not my department. We might also want to know a bit more about why you want it, as these were your first posts and you've joined the forum specifically to post them.
 