Another computer question...

Obviously there's no purpose in doing it this way except for bragging rights...
:rolling:

I've found the performance of HiDef (any sort) played on my laptop can be, for want of a better word, crap.
.ts from the Humax won't play very well at all. Sometimes a conversion to mp4 (just using ffmpeg to copy sound/vision into an mp4 wrapper) will make it playable - just. Usually I have to return it to a Humax for reliable playing. Restricting the size of a VLC window doesn't improve things for me - resizing the output of the encoder to a smaller resolution does (obviously).
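
For the record, that "conversion" is just a stream copy into an mp4 container, no re-encoding. Something along these lines does it (wrapped in Python here purely for illustration; the file names are made up):

[code]
# Remux a Humax .ts into an mp4 wrapper without re-encoding.
# Needs ffmpeg on the PATH; file names are examples only.
import subprocess

subprocess.run(
    ["ffmpeg",
     "-i", "recording.ts",   # input transport stream from the Humax
     "-c", "copy",           # copy the audio/video streams untouched
     "recording.mp4"],       # mp4 wrapper as output
    check=True,              # raise if ffmpeg reports an error
)
[/code]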
 
a)*

* Not just a different keyboard, but a different system altogether (ie my iPad + clamshell case with Bluetooth keyboard). The Perixx runs my Win7 notebook and Zen (Linux tower) through a KVM switch... and I find myself having to check whenever there's a z involved! Bloody annoying. Otherwise I'm very comfortable with that keyboard.

I might take a hammer to it though - that's the only way in I can find to try to fix it.
 
but it won't play the video smoothly
Hmm, sorry to butt into an interesting thread with what might be an obvious comment you might have already considered, but frame rates can be significant; the PC screen is probably at 60Hz and a UK video source at 50Hz, so you often get staggering motion rendering unless you can set the display rate to the same as the source rate.

A modern UK TV can swap seamlessly between 60Hz and 50Hz display so we never notice.

Not all PC monitors handle 50Hz, but big TVs used as PC monitors usually can, if the PC's graphics card has been allowed to output it.
 
This is a window on a desktop, not a full-screen.
Which is precisely the problem.

60Hz desktop field rate, 50Hz TV field rate.

Interlaced TV adds significant interest, and conversion difficulty, to showing motion seamlessly on a progressive-scan screen; and then a different field rate adds a lot more bat's blood to the cauldron.

Our eyes cannot help but compare the possible differences: flicker, judder, jagged motion, or even a frame discontinuity scrolling through a smaller window against the still background of the desktop.

So at its very simplest, for every six fields shown by the desktop, one of the five original TV fields is repeated in that time. Add clever complexity to the processing and it might be bodged, or smeared, or hopefully even something better - closer to full standards conversion if the motion vectors in the video stream are still intact. It all depends on what algorithm the graphics card attempts, and whether it has proper hardware support for decent field-stored de-interlacing with help from the motion vectors used in the compression.
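
A toy illustration of that at its simplest (nothing like what a real driver does, just the arithmetic): if each 60Hz desktop refresh simply shows the newest 50Hz source field available, one source field in every five gets shown twice:

[code]
# Toy sketch: 50Hz source fields shown on a 60Hz desktop, no interpolation.
SOURCE_HZ = 50
DISPLAY_HZ = 60

for refresh in range(12):                 # a fifth of a second's worth
    t = refresh / DISPLAY_HZ              # time of this display refresh
    source_field = int(t * SOURCE_HZ)     # newest source field available
    print(f"display refresh {refresh:2d} shows source field {source_field}")
# Fields 0 and 5 come out twice: one repeat in every six refreshes.
[/code]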
 
Whatever.

I still don't think it's that, I'm not super-critical. Make the window small enough (ie fewer pixels to process) and the processing keeps up with the frame rate. So far as I know, rendering is independent of video scan read-out via the video RAM (possibly even two buffers), so the worst one should get is some tearing in moving objects.

What I get (if the window is large enough) is freezes for significant fractions of a second (ie several frames at a time), so I blame my graphics processing performance. Maybe it's a question of tuning VLC (or the relevant drivers) appropriately for my hardware, because I'm pretty sure I have "sufficient".
 
So far as I know, rendering is independent of video scan read-out via the video RAM (possibly even two buffers),

Hmm, decoding of compressed video is spread across quite a few fields - remember the I, P and B frames?

The PC might need to buffer a sequence up to a second or two long between I frames while modifying each field appropriately. Which is why I mentioned hardware support, with the custom DSPs built in, as in every TV's decoder.
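
A toy sketch of the reordering problem (not a real decoder, just the idea): the coded order of a GOP isn't the display order, because B frames need both their neighbouring reference frames decoded first, so the player has to decode ahead and hold frames back:

[code]
# Toy sketch only - shows why a player must buffer and reorder frames.
display_order = ["I0", "B1", "B2", "P3", "B4", "B5", "P6"]  # what you watch
coded_order   = ["I0", "P3", "B1", "B2", "P6", "B4", "B5"]  # what arrives

decoded = set()   # frames held in the buffer after decoding
shown = []
for frame in coded_order:
    decoded.add(frame)                     # decode each frame as it arrives
    # a frame can only be displayed once every earlier frame (in display
    # order) has already been decoded and displayed
    while len(shown) < len(display_order) and display_order[len(shown)] in decoded:
        shown.append(display_order[len(shown)])

print(shown)   # ends up back in display order, but only thanks to the buffer
[/code]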

The algorithms underlying MPEG2 grew out of designing standards converters, where if the perceived motion was smooth and correctly predicted (Wimbledon tennis ball bounce being a very tough test) it did not matter what happened to the detail, because the eye integrated most of the errors seen on CRTs.

But then digital displays came along with much better fine detail and the algorithm for hiding the lossy compression errors had to be changed, both for the standards converters and for the proposed DVB-T. Lots of interesting stuff done, and it's still being developed three decades later.

It is why I always let the Humax do all the decoding work and send the maximum screen resolution (1080p) over HDMI. Upscaling 576i or 1080i later in the TV is never as good.

But all this is moot if one is not bothered about what one sees on a PC screen in the way of TV.
:)
 
I've got myself some sh*t with a 64GB UPD. In the heat, my Win7 notebook slows to an unusable crawl (needs more RAM I expect), so I was swapping between the notebook and my still-under-development Linux tower (and an EeePC netbook running Win7 Starter), using said UPD as working storage.

I was importing the UPD contents into my notebook for safety when the UPD stopped working. I wasn't even writing to it! It would no longer mount, and although Windows gave it a drive letter it said there was no file system. Investigating, one utility (I forget which) told me the superblock is stuffed.

Everything was easily recreatable or existed elsewhere except one file, where the best I had was an auto-save from the last session which was a week out of date.

I have taken a binary image of the UPD (on my Linux system), before risking any "repair".
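
For anyone wanting to do the same, it's just a plain sector-by-sector copy of the device; dd is the usual tool, but here's the idea as a sketch (device and file names are examples only - check with lsblk first, and it needs root):

[code]
# Sketch of taking a raw image of the stick before risking any repairs.
# Roughly equivalent to: dd if=/dev/sdb of=upd.img bs=4M
DEVICE = "/dev/sdb"           # example device node for the UPD - verify first!
IMAGE = "upd.img"
CHUNK = 4 * 1024 * 1024       # read 4 MiB at a time

with open(DEVICE, "rb") as dev, open(IMAGE, "wb") as img:
    while True:
        block = dev.read(CHUNK)
        if not block:         # end of device
            break
        img.write(block)
[/code]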

A google came up with "4Card Recovery", which offers a free download and recovery of the first 1 GB of data for free. I set it loose on the UPD and after about an hour (of an ostensibly 18+ hour scan) it had reported FAT structures (I expected NTFS, but I'm not all that surprised it's FAT) and a few hundred files located, so I stopped the scan to see what's what. The desired file was listed with a correct-looking file length, but exporting it only produced about 60% of the actual content.

Maybe that's because I terminated the scan early. It's now running again. I doubt it needs to run to completion, because I doubt I've actually used much of the 64GB capacity, but I have no idea how long to leave it this time.

If that fails, I think it should be possible to examine the binary image and search for a known file name to locate the FAT directory and file pointer structures. This should be easier than if it were NTFS. The difficulty will be working with a 60GB binary image. This will be best done on my Linux tower, but I'm not familiar with the tools. I hope there is a hex editor...
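
Something along these lines should do for the search, without a hex editor (the search name is a made-up example; 8.3 short names are stored as upper-case ASCII and long names as UTF-16LE, split into 13-character pieces across directory entries, so keep the search string short):

[code]
# Search the raw image for a known file name to find FAT directory entries.
# Names/paths below are examples only.
NAME = "NOTES"                      # hypothetical fragment of the lost file's name
IMAGE = "upd.img"
CHUNK = 64 * 1024 * 1024            # scan 64 MiB at a time
needles = [NAME.upper().encode("ascii"),     # 8.3 short-name form
           NAME.encode("utf-16-le")]         # long-file-name form
overlap = max(len(n) for n in needles)       # so hits spanning chunks aren't missed

with open(IMAGE, "rb") as img:
    offset = 0                      # file offset of the start of the current chunk
    tail = b""
    while True:
        block = img.read(CHUNK)
        if not block:
            break
        data = tail + block
        for needle in needles:
            pos = data.find(needle)
            while pos != -1:
                print(f"found {needle!r} at offset {offset - len(tail) + pos:#x}")
                pos = data.find(needle, pos + 1)
        tail = data[-overlap:]      # hits entirely inside the overlap may print twice
        offset += len(block)
[/code]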

Now, the questions are these:
  • Anybody hazard a guess how a UPD can get corrupted when all you're doing is reading from it? I was literally in the process of using a file manager to copy a block of files from the UPD to my notebook's HDD.

  • NTFS is a journalling file system, which means the OS leaves a breadcrumb trail of what file system alterations are about to be made, then makes them, then tidies up. This is so that a crash in the middle leaves either the file system unaltered or a record of what alterations were in progress (see the sketch below). It would be nice to think that if I had formatted the UPD as NTFS it would have been protected from this insult, but I suspect that if the superblock can be corrupted even while being read, even NTFS would have got corrupted (and be harder to unpick).
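
A toy sketch of that journalling idea (nothing like real NTFS internals, just the principle of logging the intent before making the change):

[code]
# Toy sketch of write-ahead journalling: log the intent, make the change,
# then clear the log. After a crash, anything still in the journal was
# in flight and can be rolled back or replayed rather than left half-done.
journal = []                                  # stands in for the on-disk journal
metadata = {"FILE.TXT": {"size": 1000}}       # stands in for on-disk metadata

def update_size(name, new_size):
    journal.append(("set size", name, new_size))   # 1. breadcrumb first
    metadata[name]["size"] = new_size              # 2. then the real change
    journal.pop()                                  # 3. then tidy up

def recover_after_crash():
    for entry in journal:                          # only non-empty after a crash
        print("was in progress, roll back or replay:", entry)

update_size("FILE.TXT", 2048)
recover_after_crash()                              # prints nothing - clean shutdown
[/code]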
 
Anybody hazard a guess how a UPD can get corrupted when all you're doing is reading from it?
Like you said at the top - heat.
Among others, I have one UPD that is tiny - fingernail size - with a metal cover, and when plugged in, not even reading or writing, it gets really hot, such that it's uncomfortable gripping it to unplug.
So I suspect that under all the packaging these things run near the limit anyway and with a higher ambient and a bit of aging ... Pop!
 
Like you said at the top - heat.
Among others, I have one UPD that is tiny - fingernail size - with a metal cover, and when plugged in, not even reading or writing, it gets really hot, such that it's uncomfortable gripping it to unplug.
So I suspect that under all the packaging these things run near the limit anyway and with a higher ambient and a bit of aging ... Pop!
I don't see why, in my ambient temperature, which remained under 30 deg.C, a UPD should peg out when worldwide people must be using them at higher ambient temperatures.

The desired file was listed with a correct-looking file length, but exporting it only resulted in about 60% of the actual content.

Maybe that's because I terminated the scan early.
It turns out I had the whole file after all - I just wasn't expecting to find the export in a cryptic sub-folder created by 4Card Recovery instead of where I asked for it to go. The "truncated" file I was looking at was actually an old copy. All good now, situation normal (improbability level 1:1, and whatever I can't cope with is therefore my own problem).
 
I'd reformat it NTFS, give it a stern talking-to Basil Fawlty style, and test the behaviour for a while before binning it.
 
Ah. Thought you'd recovered everything you wanted.
I kept a knackered mp3 player for the few GB storage it provides. (Battery charge lasts about one minute, but when connected behaves okay. As expected, it doesn't lose files - yet.)
 