oops - just deleted a symlink via ftp and it deleted all the files, not just the symbolic link!

rodp

Member
Hi All,

As per the subject title - I didn't realise it would start going through all the folders, so my fault! Is there something that can be used to undelete the files? I read about https://itsfoss.com/recover-deleted-files-linux/ (TestDisk). Would I need this, or is there something already on the Humax if you have the custom firmware installed?

I didn't do too much damage - I stopped it pretty quickly - but I just want to get back the few music files I deleted.

Secondary question - how do you remove a symlink you've just created? The -f option sounded like it would remove all active symlinks.

Thanks

Rodp
 
Ouch.

is there something already on the Humax if you have the custom firmware installed
I don't think so, no. We have the undelete package, but that operates at the user interface level (SUI and WebIF) by intercepting the delete operation and instead moving files to a trash can. It does not intercept file-system level operations (command line, FTP, SMB...).

Would I need this
Yes, but no. In general, tools available to install in mainstream Linux are compiled for PC hardware and would need recompiling for the (unique) HDR-FOX hardware environment. That is possible if the source code is available, and if you don't have the skills you'll have to appeal to those who do. The problem is that all the while the HDR-FOX is turned on, you are at risk of overwriting the data you are trying to recover, so I advise you only run it in Maintenance Mode.

What I recommend is you take the HDD out and work on it as an external drive from a PC running Linux. Then you can use all the tools as-is, and maybe even a GUI file recovery tool for mainstream Linux.

how do you remove a symlink you've just created? the option -f sounded like it would remove all active symlinks
You just delete the symlink. The file system keeps a count of how many linked entries there are (if it's a hard link, the file system doesn't even know which one is the original file). The file's data is only discarded when the last link is removed, ie when the link count drops to zero. It sounds to me like the -f switch told it to remove all of them (but I haven't read up on it).
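The link-count behaviour is easy to see with hard links; a minimal sketch you could try on any Linux box (the file names are just examples):

```shell
# two directory entries pointing at the same data
echo hello > orig
ln orig copy          # hard link: link count is now 2 (second column of ls -l)
rm orig               # removes one directory entry; the data survives
cat copy              # still prints: hello
rm copy               # last link removed, so the space is reclaimed
```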
 
Thanks for the reply Black Hole.

Just came across this thread talking about extundelete: https://hummy.tv/forum/threads/recovering-a-deleted-recording.7784/#post-106693. Looks a little complicated regarding the swap file and where the files are saved. One of the folders deleted was
Code:
/mnt/hd2/My Music/13 minutes to the moon
for example. @af123 or someone else... could you possibly walk me through the steps? I think I'll need to use the
Code:
--after dtime
as that music folder will contain a lot of deleted .ts files from the conversion to MP3 etc. How is the date and time written? Perhaps: 2023-03-09T00:00:00Z
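(A note in passing: extundelete's --after expects a Unix timestamp, ie seconds since the epoch, rather than an ISO date; GNU date on the recovery machine can do the conversion. The device name and restore path below are assumptions, not verified values for this box:)

```shell
# extundelete's --after takes seconds since the Unix epoch, not an ISO date;
# GNU date converts a human-readable date for you
date -d '2023-03-09 00:00:00 UTC' +%s
# prints: 1678320000
# hypothetical invocation (device and path are assumptions):
# extundelete /dev/sdb1 --after 1678320000 --restore-directory 'My Music/13 minutes to the moon'
```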

As a last resort I could take the HDD out, for sure, but I only have an old Ubuntu 16.04 live disc setup, so getting hold of some undelete software might be quite time consuming. I've found some info about PhotoRec, so if the HDD does need to be removed I might give this a go: https://www.digitalocean.com/community/tutorials/photorec-recover-deleted-files-in-linux-ubuntu


Thanks

Rodp
 
... take the HDD out and work on it as an external drive from a PC running Linux

+1

As noted, you have to distinguish symbolic links, which are like a shortcut or staging post, from hard links, built into the filesystem.

In typical UNIX/Linux filesystems, and even NTFS, there is a file object referenced by zero or more directory entries, such that an unreferenced file (not pointed to by a directory entry, not open in a process, etc) is discarded and its allocation returned to the free list (possibly with some secure gating so that existing blocks aren't made available to other users without being erased). Thus a pattern for using a temporary file is to create the file, open it, unlink it (ie, remove its directory entry), work on it, then close it, at which point it is automatically discarded.

In contrast, a symlink may be pointing to a file even after the file has been deleted: there isn't a general reverse link that the system could follow to update a symlink.
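That create-open-unlink pattern can be seen from the shell; a sketch assuming a Linux system with /proc (the fd number 3 is arbitrary):

```shell
# create a file, open it on fd 3, then unlink it;
# the data remains reachable through the open descriptor
tmp=$(mktemp)
exec 3<>"$tmp"              # open read/write on fd 3
rm "$tmp"                   # remove the directory entry (link count -> 0)
echo "scratch data" >&3     # still writable via the descriptor
cat /proc/self/fd/3         # on Linux this re-opens the unlinked file from the start
exec 3>&-                   # closing the last reference frees the storage
```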

In this case it seems that the deleting program wanted to follow symlinks when deleting. rm doesn't.

In this test, a symlink is made to a new file. Then
  • deleting the file leaves the symlink
  • re-creating the file and deleting the symlink leaves the new file
  • re-creating the symlink and deleting its target file leaves the symlink.
Code:
$ touch foo
$ ln -s foo bar
$ ls -l foo bar
lrwxrwxrwx 1 df df 3 Mar 10 12:14 bar -> foo
-rw-rw-r-- 1 df df 0 Mar 10 12:13 foo
$ rm foo
$ ls -l foo bar
ls: cannot access 'foo': No such file or directory
lrwxrwxrwx 1 df df 3 Mar 10 12:14 bar -> foo
$ touch foo
$ rm bar
$ ls -l foo bar
ls: cannot access 'bar': No such file or directory
-rw-rw-r-- 1 df df 0 Mar 10 12:14 foo
$ ln -s foo bar
$ rm $(realpath bar)
$ ls -l foo bar
ls: cannot access 'foo': No such file or directory
lrwxrwxrwx 1 df df 3 Mar 10 12:15 bar -> foo
$
 
As a last resort I could take the HDD out, for sure, but I only have an old Ubuntu 16.04 live disc setup, so getting hold of some undelete software might be quite time consuming. I've found some info about PhotoRec, so if the HDD does need to be removed I might give this a go: https://www.digitalocean.com/community/tutorials/photorec-recover-deleted-files-in-linux-ubuntu
I see that as the first resort, not last resort. It will be much less error prone for you to use a GUI recovery utility, which should be just a case of the utility listing what's recoverable and then you clicking on what you want to recover.

I see no difficulty in downloading a newer live Linux, but also no reason an older Linux wouldn't be good enough. Once you have Linux booted, it should be a simple matter to use the software manager to download and install a recovery program (which will not be a permanent addition to the live Linux), or perhaps look for a live download which has a recovery utility built in.
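For what it's worth, on a Debian/Ubuntu live session that should amount to something like the following (the package name is an assumption; the testdisk package also provides PhotoRec):

```shell
# on an Ubuntu/Debian live session, with a network connection:
sudo apt update
sudo apt install testdisk   # provides both testdisk and photorec
```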

The reason to use Linux is to ensure native support for the HDR-FOX's file system.
 
HDD is out, Ubuntu 16.04 is in use, and I was pleasantly surprised to find it already has PhotoRec on it. However, for every sector it reads it's saying "error reading sector".

Have I run into the Humax format not being understood?

I'm assuming I've chosen the correct disk/partition ('2 P' in the pic). I had three to choose from and this one had the biggest sector range.

Pretty sure my HDD is absolutely fine, so I must be doing something wrong. I didn't mount it; I just plugged it in and checked Linux had recognised it.

Thanks

Rodp


20230310_195431, low res (1).jpg
 
Ah! Observation - the HDD is not spinning for some reason, yet it's still working through the sectors. Could it have cached it, so the HDD has just spun down? The green light on the HDD caddy is lit up. Hmmm - puzzling.
 
Weird... I forced it to mount to view the files on the HDD, then went back to a new session of PhotoRec (this time it was no longer sdb but sdc) and now it's finding the files!!! It says it's finished, but the results don't look that good, which is a pity, as PhotoRec on Windows does a much better job. No proper filenames or directory structure either, so I'm none the wiser as to what it's recovered.

Can anyone recommend a better application?

Thanks

Rodp
 
Is this connected via SATA or USB? I'd recommend the former. You might want to consider doing a long SMART test first.

Ah, a quick search for Inateck asm1153e shows it's a USB3 enclosure.
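If it helps, starting a long test from the recovery PC looks something like this (the device name is an assumption, and it needs root plus the smartmontools package):

```shell
# start a long (extended) SMART self-test; it runs inside the drive's firmware
smartctl -t long /dev/sdb     # some USB enclosures need '-d sat' adding
# read the result (and attributes such as reallocated sectors) once it's done
smartctl -a /dev/sdb
```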
 
HDD back in! Here are my findings. As Black Hole suggested, recovery was more about the state of the file system and how tidy and unchanged since deleting it was. PhotoRec tried its hardest and recovered various portions of the files, but couldn't recover whole files. It also didn't return the file names. So I tried out R-Linux, which is much nicer to use as it has a GUI, showed me a folder structure, and reported the filenames in full, but most were classed as zero bytes (this is all using Ubuntu 16.04 from a live disc, by the way). So at least I could clarify what I lost, and if PhotoRec had been able to recover the files more fully I could at least have renamed them.

I think PhotoRec might have been able to do a better job if the files were less fragmented. I appreciate not many people think about this, as tech has moved on and the HDD is not necessarily the bottleneck these days, but... if I wanted to consider defragmenting the HDD, what is there that could do this? Perhaps SMART is already meant to be doing this? Is there a report for this?

If it doesn't do this, ideally it would be nice to do it whilst in the box. If not, then I'd just Google around for something and take the HDD out again (which is a bit of a faff, to be honest!), or perhaps transfer files onto a USB stick, but I'd still need to ensure the files go back unfragmented.

And whilst I think about it: this symlink thing is a little scary after my experience. I know it was the user at fault, and the FTP program saw the symlink and thought it was being helpful by going through all the folders it could see - I think there may be a setting to turn this off - but perhaps I need to think about making sure things (important parts which make the Humax work) are set to read only. I guess it would need a different username and set of permissions, but I'm not sure where to begin on that.
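As a starting point, a hedged sketch of locking things down (the paths are examples, and whether the Humax's toolset includes chattr is an assumption):

```shell
# strip write permission for everyone on a directory tree
chmod -R a-w '/mnt/hd2/My Music'     # example path
# on ext3, the immutable flag stops even root deleting a file by accident
chattr +i /mnt/hd2/somefile          # example file; undo with: chattr -i
```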

Thanks

Rodp
 
Sorry, one other point to make... from this experience, Linux overall doesn't seem to make it that easy to undelete files, which I was rather surprised about. It's a lot less painful in a Windows environment, and PhotoRec or R-Undelete seem to be able to do a better job, even without having to immediately turn the computer off.

Just an idea... I wonder if there is a way to stop journals or inodes being written to for a certain amount of time?

Thanks

Rodp
 
linux overall doesn't seem to be that easy to undelete files
That's not so much Linux as Ext3, and we're stuck with that on HDR-FOX. I'm not sure whether Ext4 is an improvement from that point of view, but it might be.

In general, file system architecture does not revolve around how easy it might be to recover an accidental deletion - GUIs have a recycle bin for that. Journalling file systems are about security of data in case the system crashes (or loses power) during a write.

Defragmentation doesn't seem to be a "thing" any more, I suspect mainly because HDD performance is now sufficient that fragmented files are not a bottleneck. I would be surprised if there were no defraggers around, though - there's bound to be somebody trying to wring out the last ounce of performance. If you can find a command-line one complete with source code, maybe some kind person could build it into fixdisk as a Maintenance Mode option.
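As far as I know there is no safe defragmenter for ext3 (ext4's e4defrag needs extent-based files), but filefrag from e2fsprogs will at least report how fragmented a given file is; a sketch with a hypothetical path:

```shell
# filefrag (part of e2fsprogs) reports how many extents a file occupies;
# a large number on ext3 means the file is heavily fragmented
filefrag '/mnt/hd2/My Video/example.ts'   # hypothetical path
# output is of the form: <path>: N extents found
```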

This symlnk thing is a little scary after my experience.
I'm not sure why FTP went wrong; maybe it's a quirk/bug of that particular implementation, and maybe somebody needs to look into that (I haven't looked into whether it was truly an accident or if you were the author of your own downfall). In general, they shouldn't be that scary, and the file system should just count down the number of directory links to the data. I started using links (admittedly in Windows) with some trepidation, but was pleasantly surprised how well they worked (NTFS), even when accessed with a Mac.
 
FTP itself doesn't acknowledge symlinks: there's no way for a client to ask the server about a symlink or to create one, and no way for the server to report one. So it's an implementation choice of the server whether to treat symlinks as plain files (ie just return some blob) or to follow a symlink to its target file or directory. The web says that some implementations may have extension commands, invoked using the FTP SITE command, that do understand symlinks, but of course that requires both the client and the server to have implemented the extension(s), and the client to invoke the extension command rather than the symlink-blind standard one, with or without user intervention.

To follow this up further, one would have to know which server was running on the Humax, and it would also be interesting to know which client was in use. However, based on the above, the obvious thing for a server to do is to treat symlinks like any other directory entries, and that seems to be what happened.
 