Batch script to deploy explicitly mentioned SVN revisions to a testing server - batch-file

I want to write a batch script in which I enter the revision number(s) and it deploys those revisions in sequence to a specific location on the server.
Any help or a sample batch script in Windows would be highly appreciated.

Usually you want to perform a checkout for this:
svn checkout -r <REVISION> <URL> <TO-PATH>
You can then later call
svn update -r <REVISION> <PATH>
to update to a later revision. This uses the data stored in the .svn subdir(s) to perform the least amount of work.
If you instead used
svn export -r <REVISION> <URL> <TO-PATH>
you would also get the files from the repository, but without a way to connect them to where they came from.
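A minimal sketch of such a script, assuming a placeholder repository URL and deploy path (REPO_URL, DEPLOY_PATH, and the usage pattern are illustrative, not from the question); it checks out the first revision if no working copy exists yet, then updates through the remaining revisions in sequence:

@echo off
rem Usage: deploy.bat 101 105 110  (revision numbers, in deployment order)
rem REPO_URL and DEPLOY_PATH are placeholders -- adjust for your server.
set "REPO_URL=https://svn.example.com/repo/trunk"
set "DEPLOY_PATH=C:\deploy\testing"
if not exist "%DEPLOY_PATH%\.svn" (
    rem No working copy yet: check out at the first requested revision.
    svn checkout -r %1 "%REPO_URL%" "%DEPLOY_PATH%"
    shift
)
:next
if "%~1"=="" goto done
rem Update the existing working copy to the next requested revision.
svn update -r %1 "%DEPLOY_PATH%"
shift
goto next
:done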

Related

download multiple files from sftp with Jenkins

I have to download all files from an FTP folder using explicit FTP over SSL/TLS. I need this for a Jenkins job running on a Windows machine and didn't find any plugins, so I am trying to use a batch script with curl; the following code lists the contents of the folder.
set "$FILEPATH=C:\temp"
set "$REMOTEPATH=/files/"
curl -u user:pass --ftp-ssl ftp://hostname.com:port%$REMOTEPATH% -o %$FILEPATH%
I figured out that with curl I have to download files one by one, but how can I loop through all the files in an FTP directory and get them one by one?
Is there a better way to achieve that? I read about mget, but it doesn't seem to work with explicit FTP over SSL.
Thanks
I couldn't get it to work with batch directly, so I wrote a Python script instead, which I download from Git and execute as a step in the pipeline. Python has some nice libraries for this, so it works like a charm.
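For reference, the batch-only loop the question asks about could be sketched with curl's --list-only option, assuming the server returns a plain name-only listing over explicit FTPS (host, port, credentials, and paths are placeholders; as noted above, the Python route proved more reliable in practice):

@echo off
rem Fetch a name-only listing of the remote directory over explicit FTPS.
curl -u user:pass --ssl-reqd --list-only "ftp://hostname.com:21/files/" -o filelist.txt
rem Then download each listed file, one by one, into C:\temp.
for /f "usebackq delims=" %%F in ("filelist.txt") do (
    curl -u user:pass --ssl-reqd "ftp://hostname.com:21/files/%%F" -o "C:\temp\%%F"
)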

Is it okay to copy my checked out file from the actual directory to my home directory in clearcase?

I am new to ClearCase. Our organization's code is versioned using ClearCase and I have to edit some of it. The code consists of database .ddl files, two .ddl files per package.
I have checked out the pieces of code that I have to use, but I cannot see them anywhere; I have checked the FTP client I am using as well as my local machine.
Now I am confused about two parts:
After checking out, do I copy the .ddl files from their current location to my ClearCase home and then download them to my PC and use them? That is what I am doing right now.
Or is there another way to generate the .ddl files from PL/SQL Developer?
I can see the package and package body but cannot find the .ddl files.
Here are the ClearCase terminal commands and responses:
denoad32:ddl $ cleartool lsco -me
--04-03T03:02 Sayan.Sikdar checkout version "XXONT_OH_REL_SC_HOLD_PB.ddl" from /main/R12/8 (reserved)
--04-03T03:02 Sayan.Sikdar checkout version "XXONT_OH_REL_SC_HOLD_PS.ddl" from /main/R12/3 (reserved)
What I am doing right now: I have checked the files out, and I am copying them from their current location to my view home, then downloading and using them.
Basically, you have checked out the files with the command "cleartool co". In order to access the files, you need to be inside your ClearCase view. If you are in the same session as when you performed the checkout, you should have access to the files you checked out.
The usual workflow is:
1. check out the file
2. modify and save the file
3. check in the file
All of this must be done inside a ClearCase view.
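As a sketch, that cycle looks like this on the cleartool command line (the file name is taken from the lsco output above; the checkin comment is just an example):

cleartool checkout -nc XXONT_OH_REL_SC_HOLD_PB.ddl
(edit and save the file inside the view)
cleartool checkin -c "updated hold logic" XXONT_OH_REL_SC_HOLD_PB.ddl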
download them to pc and use them?
If your PC has a ClearCase client, it can host a ClearCase view (snapshot or dynamic) and will automatically download checked-out files.
is there any other way to generate the ddl files from PL/SQL developer
If there is, that would explain why you don't see those files: they can be generated.
For example, PostgreSQL can generate the DDL of tables or views with pg_dump:
pg_dump -U user_name -h host database -s -t table_or_view_names -f table_or_view_names.sql

how to create folder in sonatype nexus repository through command line

Is there a way to create folders and copy artifacts into them in a Sonatype Nexus repository, through the Windows command line or batch files?
I finally solved my problem using curl:
curl -u admin:admin123 -T C:\upload\Test.txt http://Nexus_Repo_URL/folder_to_be_created/Test.txt
"folder_to_be_created" is the folder created in the repository, and the file, 'Test.txt', is copied into it.
Just upload the artifacts to whatever path is needed using one of the methods described here:
https://support.sonatype.com/hc/en-us/articles/213465818-How-can-I-programatically-upload-an-artifact-into-Nexus-
Any folders needed will be created automatically.
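Building on the curl call above, a small batch loop can upload a whole directory the same way, letting Nexus create the path segments on the fly (the credentials and URL are the placeholders from the answer; %%~nxF expands to the file name with extension in a .bat file):

for %%F in (C:\upload\*) do (
    curl -u admin:admin123 -T "%%F" "http://Nexus_Repo_URL/folder_to_be_created/%%~nxF"
)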

Downloading artifacts from Jenkins using wget or curl

I am trying to download an artifact from a Jenkins project using a DOS batch script. The reason this is not trivial is that my artifact is a ZIP file whose name includes the Jenkins build number, so I don't know the exact file name.
My current plan of attack is to use wget pointing at: /lastSuccessfulBuild/artifact/
to do some sort of recursive/mirror download.
If I do the following:
wget -r -np -l 1 -A zip --auth-no-challenge --http-user=**** --http-password=**** http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/
(*s are chars I've changed for posting to SO)
I never get a ZIP file. If I omit the -A zip option, I do get the index.html, so I think the authorisation is working, unless it's some sort of session caching issue?
With -A zip I get as part of the response:
Removing ...+8080/job/MyProject/lastSuccessfulBuild/artifact/index.html since it should be rejected.
So I'm not sure if maybe it's removing that file and so not following its links? But doing -A zip,html doesn't work either.
I've tried several wget options, and also curl, but I am getting nowhere.
I don't know if I have the wrong wget options or whether there is something special about Jenkins authentication.
You can append /*zip*/desired_archive_name.zip to the URL of any folder in the artifacts location.
If your ZIP file is the only artifact that the job archives, you can use:
http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip
where myfile.zip is just a name you assign to the downloadable archive; it could be anything.
If you have multiple artifacts archived, you can either still get the ZIP file of all of them and deal with the individual files on extraction, or place the artifact that you want into a separate folder and apply the /*zip*/ to that folder.
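For example, using the question's own wget flags, a sketch of the call would be (host, job name, and credentials are placeholders):

wget --auth-no-challenge --http-user=user --http-password=password -O myfile.zip "http://jenkins.example.com:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip"

or, equivalently, with curl:

curl -u user:password -o myfile.zip "http://jenkins.example.com:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip"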

how to override the already existing workspaces in rtc using command scm or lscm

My requirement is to connect to RTC and automatically check out the files from the stream to the repository workspace.
I am using the following commands in the .bat file:
lscm login -r https://rtc.usaa.com/ccm -u uname -P password -n nickname -c
scm create workspace (workspacename) -r nickname -s (streamname)
lscm load workspace name -r nickname -d directorypath(c:codebase/rtc)
lscm logout -r nickname
When I execute the above batch file for the first time, it creates the workspace and loads the project into the workspace path.
When I execute it a second time, it creates a duplicate workspace with the same name and throws an exception while loading.
I want to overwrite the already existing workspace every time while loading, but I didn't find a command for that.
Can you please suggest any other way of doing this, or a command that solves my problem?
It is best to delete the existing local sandbox before loading the new one. In my setup, we execute the following steps:
1. Delete the local sandbox (if it makes sense, delete the existing repository workspace too)
2. Create new repository workspace
3. Load the new repository workspace to local sandbox
Either create a uniquely named workspace (perhaps by sticking a time stamp into the name?) and then delete it when you're done, or use the workspace's UUID from the creation step.
Instead of deleting and rewriting the files into the workspace, you can accept incoming changes before the load, and then use the "--force" option so that only the changed files are overwritten.
Accept using: scm accept --flow-components -r <> -u <> -p <> --target
Add --force at the end of the load command you are using.
This should work fine.
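Putting that together with the original script, subsequent runs of the batch file might look like this sketch (server URL, credentials, and names are placeholders; verify the exact option spellings against your scm/lscm version):

lscm login -r https://rtc.example.com/ccm -u uname -P password -n nickname -c
rem accept incoming changes into the existing repository workspace
lscm accept --flow-components -r nickname -u uname -P password --target workspacename
rem reload the sandbox, overwriting files whose content changed
lscm load workspacename -r nickname -d c:\codebase\rtc --force
lscm logout -r nickname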
