
Is there a way of stopping the Queue from processing?

4ndy
I have set Qtube to download Much Ado About Nothing from the BBC using the default settings. It turns out that the file is too big to process: the process gets as far as building the temp file, then crashes the box when the temp file reaches about 5.5GB. I reported a different example of this recently in the youtube-dl thread, here: https://hummy.tv/forum/threads/yout...com-or-other-video-platforms.8462/post-160795

I obviously had no way of predicting this might happen again.

My problem now is that every time I restart the box, the auto-process starts and goes around the loop again before I can hold or delete this item in the queue. Last time I had to delete the whole queue to break the cycle. I have tried setting an imminent recording and setting non-processing times, but both fail to prevent the queued item from restarting for some reason.

Is it possible to break this loop somehow, either by pausing the start of the queue on restart so that the item can be deleted, or with a stop button in the WebIf?

Please feel free to move this to a more appropriate thread, as the issue seems to cross many packages.
 
Stopping the cron service should prevent queued items from being processed: use the Service item on the WebIf home page, or run service crond stop at the command line.

Or set a period to disable auto-processing in WebIf>Settings>Auto-processing.

You might then need to kill a process that ps -Al at the command line lists as python or youtube-dl, using kill nnn, where nnn is the PID shown for the process.
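Put together, the steps above might look like this at the command line. The commands (service crond stop, ps -Al, kill) are as described; the PID 1234 is just an example, and the exact process name depends on which downloader the queue launched:

```shell
# Stop the cron service so the queue does not start any new runs
service crond stop

# Look for a still-running download process (shows up as python or youtube-dl)
ps -Al | grep -E 'python|youtube-dl'

# Suppose the listing shows PID 1234 for that process - terminate it
kill 1234

# Once the stuck queue item has been held or deleted, restart cron
service crond start
```

With crond stopped, the box can be rebooted and the offending item held or deleted at leisure before processing is re-enabled.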

You may need to set some default format selection for Qtube to avoid getting gigantic files that might not be playable. However, the recording of a typical HD Hootenanny is ~7GB and it ought to be possible to download a file of similar size.
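As a sketch of what such a default format selection might look like, this uses youtube-dl's standard format-selection syntax with a filesize filter; the 4G cap and the way Qtube passes the option through are assumptions, not a tested Qtube configuration, and "URL" is a placeholder:

```shell
# Prefer a format no larger than roughly 4 GB; fall back to the best
# available format if no size-limited one matches the filter.
youtube-dl -f "best[filesize<4G]/best" "URL"
```

Note that some streams do not report a filesize up front, in which case the filter cannot match and the fallback applies.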
 
However, the recording of a typical HD Hootenanny is ~7GB and it ought to be possible to download a file of similar size.
Download might be okay, but the subsequent ffmpeg processing to fix the stream (or whatever it does) will exceed the memory available. Swapper should alleviate the memory problem, but I imagine even that has its limits.
 
Or set a period to disable auto-processing in WebIf>Settings>Auto-processing.
I have tried that. The function prevents new youtube-dl processes from running during the non-auto times, but not when they are resumed immediately after a reboot. The same applies to the impending-recording period.
 
My problem now is that every time I restart the box, the auto-process starts and goes around the loop again before I can hold or delete this item in the queue. Last time I had to delete the whole queue to break the cycle. I have tried setting an imminent recording and setting non-processing times, but both fail to prevent the queued item from restarting for some reason.

Is it possible to break this loop somehow, either by pausing the start of the queue on restart so that the item can be deleted, or with a stop button in the WebIf?
You get three minutes after startup before the auto-processing goes into action. Is this really not enough to hold/delete the offending item?
The function prevents new youtube-dl processes from running during the non-auto times, but not when they are resumed immediately after a reboot. The same applies to the impending-recording period.
It's difficult to see how this fails. Have you got (i.e. can you generate) a relevant section from the /mod/tmp/auto.log file, with debug logging enabled?
 