How can I download an array of files from a website? wget?

Let's say I want to download example.com/pics/0000.jpg through example.com/pics/9999.jpg.
What's the best way to do that?
I tried:
wget example.com/pics/{0000..9999}.jpg
but it said "Argument list too long".
What's a good script or program I can use to do this?
I don't code much. I'm thinking it will involve a shell script that uses wget to get 0000.jpg, then adds 1 to get the next picture, and so on until it reaches 9999.jpg.
Thanks.

Here's a Bash one-liner that does what you want:
for n in $(seq -f "%04g" 0 9999); do wget "http://example.com/pics/$n.jpg"; done
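If you have curl installed, it can do this without a shell loop, since curl's URL globbing supports numeric ranges (an alternative sketch, assuming the same URL layout):

curl -O "http://example.com/pics/[0000-9999].jpg"

The -O flag saves each file under its remote name, and curl preserves the leading zeros because the range endpoints are written with them.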

Related

Bash Loop 2 Arrays

I'm trying to create a loop for a couple of arrays but I get this error:
./bash.sh: 3: ./bash.sh: source[0]=/media/jon/my\ directory/: not found
This is what my code looks like:
sourceFiles[1]=/media/jon/ACER/Documents\ and\ Settings/Laura/Documents/Shared
destinationFiles[1]=/media/jon/My\ Book/Shared
for index in ${!sourceFiles[@]}
do
sudo rsync -a --delete ${sourceFiles[$index]} ${destinationFiles[$index]}
done
I'm somewhat green with bash scripts, and it's terribly frustrating that doing a simple loop is so difficult.
Update
I needed a #!/bin/bash at the top per the correct answer.
Your code looks ok. I think you're not using bash though ("not found" is not a bash error message). Are you perhaps using /bin/sh? On many systems that's a minimal POSIX shell, not bash.
A POSIX shell would not recognize sourceFiles[1]=... as an assignment and would consequently run it as a command. Hence the "not found" error.
Try enclosing your variables in double quotes in your sudo line:
sudo rsync -a --delete "${sourceFiles[$index]}" "${destinationFiles[$index]}"
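Putting both fixes together, a minimal corrected script might look like this (paths taken from the question; a sketch, not a tested backup script):

#!/bin/bash
# Arrays are a bash feature, so the shebang matters: plain /bin/sh may be a POSIX shell.
sourceFiles[1]="/media/jon/ACER/Documents and Settings/Laura/Documents/Shared"
destinationFiles[1]="/media/jon/My Book/Shared"

for index in "${!sourceFiles[@]}"
do
    # Quote the expansions so paths containing spaces stay single arguments.
    sudo rsync -a --delete "${sourceFiles[$index]}" "${destinationFiles[$index]}"
done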

Downloading file from online database using bash script

I want to download some files from an online database, but it does not allow me to download all the files at once. Instead it offers to download a file for a searched keyword. Because I have more than 20000 keywords, it's not feasible for me.
For example, I want to download the whole set of miRNA-mRNA interactions from starBase, but it does not offer an option to download all of them at once.
I wonder, how can I download it by writing some scripts. Can anybody help me?
Make a file called getdb.sh.
#!/bin/bash
echo "Download keywords in kw.txt."
for kw in $(cat kw.txt)
do
curl "http://www.mirbase.org/cgi-bin/get_seq.pl?acc=$kw" > "$kw.txt"
done
Create another file called kw.txt:
MI0000342
MI0000343
MI0000344
Then run this:
$ chmod +x getdb.sh
$ ./getdb.sh
Download keywords in kw.txt.
$ ls -1 *.txt
kw.txt
MI0000342.txt
MI0000343.txt
MI0000344.txt
Another way:
cat kw.txt | xargs -I{} curl -o {}.txt "http://www.mirbase.org/cgi-bin/get_seq.pl?acc={}"
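With more than 20000 keywords, it may also be worth pacing the requests so the server doesn't throttle you. A variation of getdb.sh along these lines (the one-second pause and retry count are arbitrary choices, not requirements of the site):

#!/bin/bash
# Read one accession per line from kw.txt; skip blank lines, pause between requests.
while read -r kw
do
    [ -z "$kw" ] && continue
    curl --retry 3 -o "$kw.txt" "http://www.mirbase.org/cgi-bin/get_seq.pl?acc=$kw"
    sleep 1
done < kw.txt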

Exact command for starting a batch file by using powershell

I know this question has been asked before and I found a thread on here which almost gives me the solution I need.
Here is the link: How to run batch file using powershell
But this only works when I write out the full path. For example:
c:\Users\Administrator\Desktop\dcp_bearbeitet\start.bat -p c:\Users\Administrator\Desktop\dcp_bearbeitet\start.prop
What I want to reach is a solution which accepts a path with parameters, like this one here:
c:\Users\Administrator\Desktop\dcp_bearbeitet\$title\start.bat -p c:\Users\Administrator\Desktop\dcp_bearbeitet\$title\start.prop
Here $title contains the name of the file I am using in this case. I know that I can create another parameter for the -p argument, and I know that this works, but unfortunately when I try the same method for the first part (the path to the batch file) I always get an error message.
I hope you guys know a way to solve this problem.
I think Invoke-Expression could help here.
Just construct your path like you want it to be, for example:
$title = "file"
$path = "C:\Users\Administrator\Desktop\dcp_bearbeitet\$title\start.bat -p c:\Users\Administrator\Desktop\dcp_bearbeitet\$title\start.prop"
and then invoke it:
Invoke-Expression $path
Regards Paul
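An alternative that avoids parsing the whole line as PowerShell code is the call operator &, which takes the script path and its arguments separately (a sketch, using the same $title as above):

$title = "file"
$base = "C:\Users\Administrator\Desktop\dcp_bearbeitet\$title"
& "$base\start.bat" -p "$base\start.prop"

Because the path is passed as one quoted argument, this also keeps working if $title ever contains spaces.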

How to query Maya in script for supported file translator plugins?

I'm trying to specify an FBX file in MEL using the command
file -f -pmt 0 -options "v=0;" -typ "FBX" -o
On one computer this works great. On another, it fails, but DOES work if I use
-typ "Fbx"
I think I'd like to query for the supported translators in my script, then either select the correct one or report an error. Is this possible? Am I mis-diagnosing the problem?
MEL has a command called pluginInfo. You could write a simple function that returns the proper spelling based on that. pluginInfo -v -query "fbxmaya"; will give you the version of the fbx plugin. I haven't used MEL in a while, so I won't try to make this perfect, but maybe something like: if (`pluginInfo -v -query "fbxmaya"`) { string $fbxType = "FBX"; } else { string $fbxType = "Fbx"; }. Then just plug that variable into file -f -pmt 0 -options "v=0;" -typ $fbxType -o.
It might be a different version of fbx. You'd have to provide another line which determines the version of fbx on that particular machine and pipes in the correct spelling.
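If you'd rather detect the right spelling than hard-code both, one possible approach (a sketch; it assumes the translator -list query described in the MEL docs) is to scan the registered translators case-insensitively:

// Sketch: find the FBX translator under whatever spelling this session registered.
string $translators[] = `translator -list`;
string $fbxType = "";
for ($t in $translators) {
    if (tolower($t) == "fbx") {
        $fbxType = $t;
        break;
    }
}
if ($fbxType == "") {
    error "No FBX translator found; is the fbxmaya plugin loaded?";
}
// Then: file -f -pmt 0 -options "v=0;" -typ $fbxType -o "scene.fbx";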

wget - specify directory and rename the file

I'm trying to download multiple files and need to rename them as I download. How can I do that and specify the directory I want them to download to? I know I need to be using -P and -O to do this, but it does not seem to be working for me.
Ok, it's too late to post my answer here, but I'll correct @Bill's answer.
If you read "man wget", you will see the following:
...
wget [option]... [URL]...
...
That is, the options come before the URL, so
wget -O /directory_path/filename.file_format https://example.com
is the form that matches the wget documentation.
Remember: Just because it works doesn't mean it's right!
I ran into a similar situation and came across your question. I was able to get what I needed by writing a little bash script that parsed a file of URLs in one column and the names in the second.
This is the script I used for my particular requirement. Maybe it will give you some guidance if you still need help.
#!/bin/bash
FILE=URLhtmlPageWImagesWids.txt
while read line
do
F1=$(echo $line|cut -d " " -f1)
F2=$(echo $line|cut -d " " -f2)
wget -r -l1 --no-parent -A.jpg -O $F2.jpg $F1
done < $FILE
This actually won't work, because -O combines all the results into one file.
You could try using the --no-directories or --cut-dirs switch, and then, in the loop, rename the files in the folder however you want.
wget your_url -O your_specified_dir/your_name
Like Bill was saying
wget http://example.com/original-filename -O /home/new_filename
worked for me !
Thanks
This may work for everyone:
mkdir Download1
wget -O "Download1/test 10mb.zip" "http://www.speedtest.com.sg/test_random_10mb.zip"
You need to use quotes (" ") for a name with spaces.
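If you have many files, one way to combine the target directory with per-file renaming is a small loop over a list (a sketch; urls.txt with "url name" pairs is a hypothetical input file):

#!/bin/bash
# Each line of urls.txt (hypothetical): <url> <new-filename>
mkdir -p downloads
while read -r url name
do
    wget -O "downloads/$name" "$url"
done < urls.txt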
I'm a little late to the party, but I just wrote a script to do this. You can check it out here: bulkGetter
