Need to download file on website from command line - batch-file

I have a link on my website that, when clicked, dynamically creates a CSV file and downloads it. I need a way to do this from a batch file so that the file can be downloaded automatically (via Task Scheduler). I have played around with wget but I can't get the file. Thank you in advance for your help!

bitsadmin.exe /transfer "Job Name" downloadUrl destination
If you are using Windows 7, run the same command in PowerShell.
Note:
downloadUrl: the URL of the file to download from the referenced website
destination: the local path where the file should be saved
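For example, a minimal sketch (the URL and local path are placeholders; bitsadmin requires an absolute destination path):
bitsadmin.exe /transfer "CsvDownload" "http://www.example.com/export/report.csv" "C:\Downloads\report.csv"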

I use it as follows:
#plain wget
wget "http://blah.com:8080/etc/myjar.jar"
#wget but skirting proxy settings
wget --no-proxy "http://blah.com:8080/etc/myjar.jar"
Or to download to a specific filename (perhaps to enable consistent naming in scripts):
wget -O myjar.jar --no-proxy "http://blah.com:8080/etc/myjar1.jar"
If you're having issues, ensure wget logging is on, and possibly enable debug output (which is added on top of the normal logging):
# additional logging
wget -o myjar1.jar.log "http://blah.com:8080/etcetcetc/myjar1.jar"
#debug output (only if wget was compiled with debug support!)
wget -o myjar1.jar.log -d "http://blah.com:8080/etc/myjar1.jar"
Additional checks to make if you still have no success:
Can you ping the target host?
Can you "see" the target file in a browser?
Is the target file actually on the server?

Related

Using wget but only getting html files

I am trying to download multiple NetCDF files from the NASA website.
I was following their tutorial on how to download multiple files using wget for Windows (https://disc.gsfc.nasa.gov/data-access#windows_wget).
When I try the option to download multiple data files at once, I only get HTML files back, not the NetCDF files. Does anyone know what could be happening?
P.S.: I am executing the following command:
wget --load-cookies C:\.urs_cookies --save-cookies C:\.urs_cookies --auth-no-challenge=on --keep-session-cookies --user=<your username> --ask-password --content-disposition -i <url.txt>

What "option" to use with "WGET" for selecting only few files with particular extension from a FTP directory

I am trying to download files with a particular datestamp as an extension from a folder on an FTP server. Since the folder contains all the other files as well, I wanted to download only the files with a particular datestamp.
I tried using wget files_datestamp*.extension, which didn't work.
I also tried using wget -i files_datestamp*.extension, which downloads all.
My question is: What option to use with wget to download only particular files that I am interested in?
wget http://collaboration.cmc.ec.gc.ca/cmc/CMOI/NetCDF/NMME/1p0deg/#%23%23/CanCM3_201904_r4i1p1_20190501*.nc4
The link you've shared is over HTTP, not FTP. As a result, globbing over the filenames is not possible; that is feasible only over FTP.
With HTTP, you need access to a directory listing page that tells you which files are available. Then use -r --accept-regex=<regex here> to download your files.
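For instance, a sketch against the base directory from the question (the exact subdirectory and the regex are assumptions you would adapt to your datestamp):
wget -r -np -nd -l 1 --accept-regex="CanCM3_201904_r4i1p1_20190501.*\.nc4" "http://collaboration.cmc.ec.gc.ca/cmc/CMOI/NetCDF/NMME/1p0deg/"
Here -r recurses through the listing page, -np stops wget from ascending to parent directories, -nd flattens the output into the current folder, and --accept-regex keeps only URLs matching the pattern.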

Downloading artifacts from Jenkins using wget or curl

I am trying to download an artifact from a Jenkins project using a DOS batch script. The reason this is not trivial is that my artifact is a ZIP file whose name includes the Jenkins build number, so I don't know the exact file name.
My current plan of attack is to use wget pointing at: /lastSuccessfulBuild/artifact/
to do some sort of recursive/mirror download.
If I do the following:
wget -r -np -l 1 -A zip --auth-no-challenge --http-user=**** --http-password=**** http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/
(*s are chars I've changed for posting to SO)
I never get a ZIP file. If I omit the -A zip option, I do get the index.html, so I think the authorisation is working, unless it's some sort of session caching issue?
With -A zip I get as part of the response:
Removing ...+8080/job/MyProject/lastSuccessfulBuild/artifact/index.html since it should be rejected.
So I'm not sure if maybe it's removing that file and so not following its links? But doing -A zip,html doesn't work either.
I've tried several wget options, and also curl, but I am getting nowhere.
I don't know if I have the wrong wget options or whether there is something special about Jenkins authentication.
You can add /*zip*/desired_archive_name.zip to any folder of the artifacts location.
If your ZIP file is the only artifact that the job archives, you can use:
http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip
where myfile.zip is just a name you assign to the downloadable archive; it could be anything.
If you have multiple artifacts archived, you can either still get the ZIP of all of them and deal with the individual files on extraction, or place the artifact you want into a separate folder and apply the /*zip*/ to that folder.
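For example, a sketch combining this with wget or curl (the host and credentials are placeholders; a Jenkins API token is assumed to work in place of the password):
wget --auth-no-challenge --http-user=myuser --http-password=mytoken -O artifacts.zip "http://<jenkins-host>:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/artifacts.zip"
curl -u myuser:mytoken -o artifacts.zip "http://<jenkins-host>:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/artifacts.zip"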

Batch file to run JMeter script

I have a .jmx file with all of my test plans, and I want to create a batch file so that when I click it, everything starts and ends automatically. That way I can also provide it to my client to verify that I have performed the load test on his website. How can I achieve this?
JMeter does not support running a batch of .jmx files, as section 2.4.3 (Command Line Mode) of the documentation tells us.
I would recommend trying two approaches:
1) Follow best practices and run JMeter in non-GUI mode.
With this approach you use the command
jmeter -n -t D:\TestScripts\script.jmx -l D:\TestScripts\scriptresults.jtl
where the parameters are:
-n [This specifies JMeter is to run in non-gui mode]
-t [name of JMX file that contains the Test Plan]
-l [name of JTL file to log sample results to]
-j [name of JMeter run log file]
2) Use a cloud service to run your .jmx file.
BlazeMeter (http://blazemeter.com/) worked fine for me.
You can adjust the test plan settings.
Test results can be seen on the “Load Report” tab as soon as the test finishes.
For details, you can follow the Getting Started: Scripting with JMeter steps.
Hope this works for you.
Just write the following command in a text file and save the file as SomeName.bat:
@ECHO OFF
jmeter -n -t "Your .jmx file path" -l "Your .jtl file path"
For example:
@echo off
jmeter -n -t F:\DEV\WORKSPACE\buyer.jmx -l F:\DEV\WORKSPACE\output.jtl
After saving, double-click that .bat file. The test plan results will be stored in the .jtl file.
Note: Make sure you put this .bat file into the JMeter installation directory's bin folder.
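If you would rather keep the .bat file elsewhere, a sketch that calls JMeter by its full path instead (the installation and file paths below are placeholders for your own):
@echo off
rem Paths are placeholders for your JMeter installation and test plan.
call "C:\apache-jmeter-5.6\bin\jmeter.bat" -n -t "C:\tests\buyer.jmx" -l "C:\tests\output.jtl" -j "C:\tests\run.log"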

Batch file to download data from url

The website link is http://www.nrldc.in/WBS/DrwlSch.aspx?dt=%DATE%&st=DELHI .
I need a batch script to download the data from the URL.
Or, how can I download the data from a URL through a batch file?
Anyone, please help.
Tom
To do this you must use an external command-line application such as curl or wget!
An example would be:
"%myfiles%\wget.exe" --no-check-certificate -O "game\Update.zip" http://dl.dropbox.com/s/0kafa8pmnz6wivn/Update.zip
You can get wget from:
http://www.gnu.org/software/wget/
How can I download a file with a batch file without using any external tools?
My attempt to summarize the ways a file can be downloaded on Windows.
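As one sketch without external tools, a batch file can call PowerShell's Invoke-WebRequest, which is built into Windows with PowerShell 3.0 or later (the URL and output path here are placeholders):
@echo off
rem Uses only built-in Windows tooling; replace the URL and path with your own.
powershell -NoProfile -Command "Invoke-WebRequest -Uri 'http://www.example.com/data.csv' -OutFile 'C:\Downloads\data.csv'"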
