Check if a file exists on an FTPS site using cURL

I am using cURL to download multiple CSV files. I want a way to check whether a file exists on the FTPS site before kicking off the download, and if it doesn't exist, to have cURL check again at regular intervals.
I am trying to stick to cURL commands for this, as I am really not good at .NET programming. Any help would be appreciated.

$ curl ftp://[host]/[path] --ssl --head
(you might also need -k to skip certificate verification)
--ssl: Try to use SSL/TLS for the connection
--head: When used on an FTP or FILE file, curl displays the file size and last modification time only
It will return an error if the file doesn't exist. It will not keep checking; it only checks once, so you need to do the repeated checking with a scheduler, cron job, or script.
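For instance, a small batch loop can do that polling. A minimal sketch, not from the answer: [host], [path], the 60-second interval, and downloaded.csv are placeholders.

@echo off
rem Poll the FTPS site until the file exists (curl exits 0), then download it.
:check
curl --ssl --head -s "ftp://[host]/[path]" >nul
if %errorlevel% equ 0 goto download
rem Not there yet: wait 60 seconds and try again.
timeout /t 60 /nobreak >nul
goto check
:download
curl --ssl -s -o downloaded.csv "ftp://[host]/[path]"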

Related

curl fails to save into a file even when specifying output

I'm trying to save the curl result into a file so I can analyze it later using other tools.
curl https://www.revolico.com/compra-venta/electrodomesticos/search.html?q=lavadora&order=date
The command works fine and I get the result on screen. The problem arises when I want to save it to a file. I have read related questions on Stack Overflow and I think this one is a bit different.
When I use the command:
curl https://www.revolico.com/compra-venta/electrodomesticos/search.html?q=lavadora&order=date > testing.txt
Nothing is written into the file and the output once again goes to the screen.
I tried using the output option:
curl https://www.revolico.com/compra-venta/electrodomesticos/search.html?q=lavadora&order=date --ouput testing.txt
But curl says it doesn't recognize the command and prints the result to the screen. Same result using "-o".
I used curl against a localhost web server and everything worked OK. I think the difference lies in size, because I also tried it on www.google.com and it worked just fine too. Did anyone run into the same problem before?
Place the options before the URL, and quote the URL: the & in it is a command separator in both sh-style shells and CMD, so unquoted, everything after it (order=date and, in your first attempt, the > testing.txt redirect) is treated as a separate command, which is why the output keeps landing on the screen. Using --output is the best way to save the result, as it doesn't clobber stdout. If curl still "doesn't recognize the command" after these changes, maybe it is a really old or fake curl (like the alias in PowerShell)? I usually run with -vv interactively, as it can show things like redirects not being followed (see -L to follow them automatically): https://linux.die.net/man/1/curl
So something like the following might be useful, to either fix the issue (proper option placement, quoting, following redirects) or, failing that, show useful diagnostic information:
url="https://www.revolico.com/..."
curl -vv -L --output testing.txt "$url"

Programmatic way to get open files info similar to Computer Management\Shared Folders\Open Files? (Server 2012R2)

Trying to track open files/locks on a server due to application issues. I can use Computer Management\Shared Folders\Open Files and see this data. The fields I get are:
Open File, Accessed By, Type, # Locks, Open Mode
Using this tool, I can export the list to a CSV. While trying to come up with a batch file to do this automatically, I found OpenFiles.exe. The script works fine; the issue is I only get these fields:
ID, Accessed By, Type, Open File (Path\executable)
There is no option in OpenFiles.exe to get the # Locks data, which is frustrating considering the data I want is right there in the Windows tool! Has anyone tackled this issue before?
Try out the Handle utility from Sysinternals at https://learn.microsoft.com/en-us/sysinternals/downloads/handle
You should be able to get all of the information you are looking for with this command:
handle.exe -a -u -s
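Handle can also search: giving it a string lists only handles whose path contains that string. A sketch, not from the answer (the D:\Shares path is a placeholder; -accepteula just suppresses the first-run license prompt):

rem List handles whose path contains D:\Shares, including owning user names,
rem and save the output for comparison with the Open Files snap-in's CSV export.
handle.exe -accepteula -u D:\Shares > open_handles.txt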

Windows equivalent of using cat to encrypt a file with a passkey in a file

I have this script which sounds like what I want to do:
cat something_so_sign.xzy | gpg \
--passphrase-file "plaintext_passphrase.txt" \
--batch \
--pinentry-mode loopback \
-bsa
which I ran in a .bat file, and then realized it was written for Linux. I got that code from here: gpg encrypt file without keyboard interaction
But now I want to do this in CMD, and I've researched for hours without finding exactly what I want. Everything I find either generates a key automatically, doesn't specify a key, or requires user interaction throughout the run (I need this automated). I also like this particular script because it's simple and I understand it. What is the CMD equivalent of it?
I don't mind using a different method either, but the PowerShell version on the server is old (2.0) and it would be a huge, time-consuming effort to get someone to update it, so I can't install PowerShell modules. It would also not be ideal to have to download external programs. So I'm coming back to using a .bat file to encrypt my file, using a key that I'll store in its own file.
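A minimal CMD sketch, assuming gpg.exe (e.g. from Gpg4win) is on the PATH and keeping the file names from the original script. The cat is not actually needed, because gpg can read the input file directly; in a .bat file, ^ replaces the shell's backslash as the line-continuation character:

rem Writes a detached, ASCII-armored signature (-bsa) to something_so_sign.xzy.asc
rem with no keyboard interaction.
gpg --passphrase-file "plaintext_passphrase.txt" ^
    --batch ^
    --pinentry-mode loopback ^
    -bsa something_so_sign.xzy

(type is the CMD counterpart of cat if you really want the pipe form, but when gpg reads from stdin it writes the signature to stdout, so the direct-file form above is simpler.)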

How to download only lines with specific ID from a file on SFTP server

I have created a .bat file which runs a WinSCP script to download a log file (a response log) from a remote server. The log file obviously downloads as-is, with the responses from all requests. How can I limit the download to the part of the log specific to my request? My request has a unique ID; how can I trim the response down to that unique ID? TIA
The script within my .bat file is as below:
winscp.com /command ^
    "open sftp://xxx.com/ -hostkey=*" ^
    "get /var/log/jboss_sit/suFile.log" ^
    "exit"
It seems that you want to filter a text file down to the lines that contain a substring.
That's not possible with the SFTP protocol; it does not allow filtering files.
You have to use another interface for that. For example, if you have shell access, you can run the grep command via WinSCP's call command. Though if you do not need anything else, a console terminal client might be more appropriate than WinSCP.
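Assuming shell access really is available, a sketch of how that could look in your script. MY-UNIQUE-ID and /tmp/filtered.log are placeholders; call runs the command in the remote shell, so grep and the redirection happen server-side, and only the trimmed file is downloaded:

winscp.com /command ^
    "open sftp://xxx.com/ -hostkey=*" ^
    "call grep 'MY-UNIQUE-ID' /var/log/jboss_sit/suFile.log > /tmp/filtered.log" ^
    "get /tmp/filtered.log" ^
    "exit"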

psftp.exe get files from the server and delete

I'm using psftp.exe to download files from a server. Is there an easy way to delete these files once I have downloaded them, but leave any new ones that appeared on the server while I was downloading, so they are picked up next time?
Here's my command line:
psftp.exe domain.com -i keys\private.ppk
get *.xml
Edit: I want to download the files from a Linux box to a Windows PC.
There's no easy way to do this with psftp. You would have to parse its output to find files that were successfully downloaded.
Though you can do this easily with WinSCP. Just use get -delete *.xml command.
Full WinSCP script would be:
open sftp://domain.com/ -privatekey=keys\private.ppk -hostkey=...
get -delete *.xml
exit
See an introduction to WinSCP scripting.
See also a guide for converting PSFTP script to WinSCP.
You can also have WinSCP GUI generate script like this for you.
(I'm the author of WinSCP)
Martin's answer is good. The approach below is more industrial.
Moving the files to a staging area on the server before the download may be prudent. Generally you would move/rename the files on the server as the first step: they are going to be deleted anyway, so nothing should miss them, and you avoid picking up a file that is still being written.
(Restart from after this point in the event of a subsequent failure.)
Then perform the download.
Then perform the delete.
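In WinSCP script terms that could look like the sketch below. The /inbox and /staging paths are invented for illustration; mv is WinSCP's remote-move command, and restarting after a failure simply re-downloads whatever is still in /staging:

open sftp://domain.com/ -privatekey=keys\private.ppk -hostkey=...
mv /inbox/*.xml /staging/
get -delete /staging/*.xml
exit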
I would approach the issue differently. Instead of deleting the files from the server, add each downloaded filename to a local table of "already downloaded files". Then, when you scan the server again for new files, ignore any that are in that table.
That way, the next time you run your download script you only get the new files, but the old files remain on the server.
You could have another script that runs periodically and deletes all files over a certain age.
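A rough batch sketch of that bookkeeping, entirely hypothetical: it assumes the remote listing has already been saved to listing.txt, one file name per line, and it records fetched names in downloaded.txt:

@echo off
rem Build a psftp batch script containing only files not seen before.
if not exist downloaded.txt type nul > downloaded.txt
if exist fetch.txt del fetch.txt
for /f "usebackq delims=" %%F in ("listing.txt") do (
    findstr /x /c:"%%F" downloaded.txt >nul || (
        echo get "%%F">>fetch.txt
        echo %%F>>downloaded.txt
    )
)
rem Then run: psftp.exe domain.com -i keys\private.ppk -b fetch.txt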
WinSCP is alright, and Martin (the author) drops into practically every PuTTY thread to recommend it, but it's a heavily GUI-oriented app and not for me. If you really need everything to be done on the command line, then WinSCP is often not an option.
