I am transferring files from a folder on one server to another, and I am using wget to do so.
The problem is that wget gets terminated partway through, and when I rerun the command it starts from the very first file. I use -nc to skip files that already exist, but wget still traverses all the files from the top and only then skips the existing ones, so the skipping alone takes too much time.
I want to know whether there is any way to have wget start downloading directly from the first new file instead of checking each file from the top.
I hope I have made my question clear.
This is the command that I am using:
wget -H -r --level=1 -k -p -nc http://www.example.com/images/
You could try using a reject-list to skip already downloaded files.
If all your files are in the same directory, it could be as simple as:
wget -R "$(ls -1 | tr '\n' ',')" <your own options>
I am not sure what will happen with partial downloads.
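For example, combined with the options from the question's own command (a sketch that assumes all downloads land in the current directory and that no filename contains a comma):
# build a comma-separated reject list from the files already present locally
wget -R "$(ls -1 | tr '\n' ',')" -H -r --level=1 -k -p -nc http://www.example.com/images/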
I am new to PGP, and I am trying to set up a .bat file to decrypt my files so they can be loaded into an automated task. I was able to put together a .bat file that worked, but it kept prompting me for a password even though the passphrase was included in my command. Digging deeper, I found that gpg-agent needs to allow loopback pinentry, as mentioned here: https://lists.gnupg.org/pipermail/gnupg-devel/2015-May/029851.html
So when I include --pinentry-mode loopback, it now loops infinitely! If I remove it from my .bat statement, it still loops infinitely, without decrypting anything. Furthermore, I have to force-close the window.
Here is the command I am using:
echo MyPassPhrase | gpg -v --batch --yes --pinentry-mode loopback --passphrase-fd 0 --force-mdc -d testing.file.pgp
Even if I use:
gpg -v -o test.txt --force-mdc -d testing.file.pgp
it loops infinitely!
Something is obviously wrong. I am using GnuPG version 2.2.8. Should I downgrade? I have been having a lot of issues with this version.
I've never had any issues using this syntax:
gpg --batch --passphrase somepassphrase -o "Outfile.txt" --decrypt "Input.pgp"
I have a couple dozen automated tasks that do this several times a day.
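As far as I know, on GnuPG 2.1 and later the --passphrase option is ignored unless loopback pinentry is enabled, so a variant adapted to the question's setup might look like this (a sketch; the file names come from the question above):
rem sketch: requires loopback pinentry on GnuPG 2.1+; file names taken from the question
gpg --batch --pinentry-mode loopback --passphrase MyPassPhrase -o "test.txt" --decrypt "testing.file.pgp"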
I was wondering if someone knows how to do what I'm looking to do.
For my server, I download files from an FTP server daily at 5 AM. My batch script is pretty simple: it connects to the FTP server, downloads the files, processes them locally, and then deletes the processed files from the local directory. What I am unable to figure out is how to get the batch file to purge only the downloaded files from the server.
Here is the code I'm currently using (edited for privacy):
C:
cd "C:\targetfolder"
rem older variant kept for reference:
rem psftp -b download.cmd -i priv(second).ppk -P 2223 xxx@yyy.ca
psftp -b download(second).cmd -i priv(second).ppk -P 2223 xxx@yyy.ca
rem older variant kept for reference:
rem psftp -b download.cmd -i priv.ppk xxx@yyy.ca
psftp -b download.cmd -i priv.ppk -P 2223 xxx@yyy.ca
rename *.xxx *.xxx
del done*.*
So the script as it stands runs successfully every morning and downloads my new files. Is there a line (or lines) of code I'm missing that will delete only the downloaded files from the server?
I also want to mention that I cannot install any new software on my FTP server to manage the files, so it has to be processed in my batch code here.
Thank you in advance for any help you all may be able to provide!
EDIT1: Here is the script in download.cmd:
ls
cd target
ls
mget *
Solved, thanks to @MartinPrikryl.
I added "rm *" to the end of my cmd file and it's working nicely, even though it doesn't differentiate from downloaded and non-downloaded files, it does what I need.
Thanks @MartinPrikryl!
I have a link on my website that, when clicked, dynamically creates a CSV file and downloads it. I need a way to do this in a batch file so that the file can be downloaded automatically (via Task Scheduler). I have played around with wget, but I can't get the file. Thank you in advance for your help!
bitsadmin.exe /transfer "Job Name" downloadUrl destination
If you are using Windows 7, the same command also works in PowerShell.
Note:
downloadUrl: the download URL from the source website.
destination: the path of the file to download to.
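For example, adapted to the question's scheduled CSV download (the URL and local path here are placeholders; bitsadmin needs an absolute destination path):
rem placeholders: substitute your real CSV link and an absolute local path
bitsadmin.exe /transfer "CsvDownload" "http://www.example.com/export.csv" "C:\Downloads\export.csv"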
I use it as follows:
#plain wget
wget "http://blah.com:8080/etc/myjar.jar"
#wget but skirting proxy settings
wget --no-proxy "http://blah.com:8080/etc/myjar.jar"
Or to download to a specific filename (perhaps to enable consistent naming in scripts):
wget -O myjar.jar --no-proxy "http://blah.com:8080/etc/myjar1.jar"
If you're having issues, turn wget's logging on, and possibly debugging output as well (which is added on top of the logging):
# additional logging
wget -o myjar1.jar.log "http://blah.com:8080/etcetcetc/myjar1.jar"
# debug output (only if wget was compiled with debug support!)
wget -o myjar1.jar.log -d "http://blah.com:8080/etc/myjar1.jar"
Additional checks to try if you still have no success:
Can you ping the target host?
Can you "see" the target file in a browser?
Is the target file actually on the server?
I am having a bit of trouble grabbing some files that have a strange naming structure. What do I mean exactly? http://downloads.cloudmade.com/americas/northern_america/united_states/district_of_columbia#downloads_breadcrumbs
Look at that example. I want to start at the root of the site and recursively grab all the files that end in .shapefile.zip. wget appears to treat this as two separate extensions, .shapefile and .zip. Does anyone have some wget goodness to help me get started on this one?
You can recursively wget specific file types with:
wget -A '*.shapefile.zip' -r <url>
That said, I don't think .shapefile.zip is really an extension of .zip; it looks more like that site's naming convention.
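Put together with the URL from the question, a sketch might look like this (the #downloads_breadcrumbs fragment is dropped, since wget never sends URL fragments to the server):
# recursively accept only files ending in .shapefile.zip
wget -A '*.shapefile.zip' -r http://downloads.cloudmade.com/americas/northern_america/united_states/district_of_columbia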
I want to download only .htm or .html files from my server. I'm trying to use ncftpget and even wget but only with limited success.
With ncftpget I can download the whole tree structure, no problem, but I can't seem to specify which files I want; it's either all or nothing.
If I specify the file type like this, it only looks in the top folder:
ncftpget -R -u myuser -p mypass ftp://ftp.myserver.com/public_html/*.htm ./local_folder
If I do this, it downloads the whole site and not just .htm files:
ncftpget -R -u myuser -p mypass ftp://ftp.myserver.com/public_html/ ./local_folder *.htm
Can I use ncftp to do this, or is there another tool I should be using?
You can do it with wget
wget -r -np -A "*.htm*" ftp://site/dir
or:
wget -m -np -A "*.htm*" ftp://user:pass@host/dir
However, as per Types of Files:
Note that these two options do not affect the downloading of HTML files (as determined by a .htm or .html filename suffix). This behavior may not be desirable for all users, and may be changed for future versions of Wget.
Does ncftpget understand dir globs?
Try
ncftpget -R -u myuser -p mypass ftp://ftp.myserver.com/public_html/**/*.htm ./local_folder
** means any number of directories.
The wget command understands standard Unix file-globbing syntax.
wget -r -np --ftp-user=username --ftp-password=password "ftp://example.com/path/to/dir/*.htm"
Alternatively, you can use the -A option, which takes a comma-separated list of file name suffixes or patterns to accept.
wget -r -np --ftp-user=username --ftp-password=password -A '*.htm,*.html' "ftp://example.com/path/to/dir/"
The -R option is the opposite of -A, so you can use it to specify patterns NOT to fetch.
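For instance, a sketch of the inverse (the image patterns here are hypothetical) that mirrors everything except images:
# hypothetical reject patterns; host and credentials as in the -A example above
wget -r -np --ftp-user=username --ftp-password=password -R '*.gif,*.jpg,*.png' "ftp://example.com/path/to/dir/"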
Caveat: Make sure to quote patterns! Otherwise, your shell may expand the glob itself, leading to unexpected results.
Also! See the "Using wget to recursively download whole FTP directories" question on Server Fault.