So say I've just rented a VPS for the weekend, and I wanna run wget against an FTP server over and over again to grab any new files, without overwriting ones I already have, without making duplicates, and with some way to "deal with" partial or failed downloads?
wget -r -np [address]
I'm planning on running one of these for every server using tmux, manually flicking through to clear failures, delete bad files, and update IP addresses when the same server switches, something like the sketch below.
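Roughly, as a sketch (the server list, session names, and log files here are placeholders I made up, adjust to taste):

# One detached tmux session per server; each wget appends to its
# own log so failures can be reviewed later.
SERVERS="ftp://192.0.2.10/ ftp://192.0.2.11/pub/"
for url in $SERVERS; do
    # Turn the URL into a tmux session / log name, e.g. ftp_192_0_2_10_
    name=$(printf '%s' "$url" | tr -cs 'a-zA-Z0-9' '_')
    tmux new-session -d -s "$name" "wget -r -np -c -a $name.log $url"
done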
I tried with -c -nc, but if a file becomes larger on the server (as in I downloaded an unfinished upload), it just ignores it.
-nc alone still skips the file, even if the remote one is newer. -c does continue if the file grows, but picture this: I upload a 10MB zip called "porn.mp4", download it, then someone deletes porn.mp4 and replaces it with a 1GB file of the same name that's an actual video. If I re-run the download with -c, the first 10MB of my copy will still be the zip, with the rest of the video appended after it.
I doubt this would happen, but it's still a concern. Is just re-downloading if the file's different the best idea?
If I just use -r -np it redownloads everything every time, -c might leave dregs like the above, and -nc ignores everything.
How decent are FTP servers for timestamping?
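From what I can tell, wget's -N (timestamping) mode is built for exactly this case: it re-fetches a file when the remote copy is newer or the sizes don't match, and skips it otherwise. For FTP it leans on whatever timestamps wget can parse out of the server's LIST output, so it's only as good as the server. A sketch with a placeholder URL (note that -N refuses to combine with -nc):

# -N: download only if the remote file is newer than the local copy,
# or if the sizes differ; depends on the server's listing timestamps.
wget -r -np -N ftp://192.0.2.10/pub/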
Alright, I'm using
wget -r -np -c
It seems to grab the whole file again if it feels like it anyway, depending on the server.
...Strike that, it transfers the whole file, but only appends the part past what I already have. My guess is those servers don't support REST (resume), so wget reads from byte zero and throws away the prefix.
Fuck this is weird
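One way to check what a given server is actually doing, assuming wget's debug output still prints the FTP control channel (file.bin is a placeholder): leave a truncated local copy in place, re-run with -c and --debug, and watch for the REST command.

# If the server honors resume you should see something like
# "--> REST 1234" answered with a 350; otherwise wget is re-reading
# the whole file and throwing away the part you already have.
wget -d -c ftp://192.0.2.10/pub/file.bin 2>&1 | grep -i REST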