Remove a file from multiple archives?

I have 1000+ ePub books on my Ubuntu machine. (I know they are not exactly archives, but I can open them using Archive Manager.) I want to delete a file called stylesheet.css from all the ePubs. I don't want to edit each ePub individually. Is there any way to accomplish this?

Run this script in the directory containing epubs:
for f in *.epub; do zip -d "$f" stylesheet.css; done
zip has to be installed for this. It can be installed using
apt-get install zip
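If the ePubs are spread across subdirectories rather than a single flat folder, a find-based variant of the same idea should also work (a sketch, not tested against your layout):
find . -name '*.epub' -exec zip -d {} stylesheet.css \;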

Related

'mongo' is still not working on PowerShell after doing all recommended things

I installed MongoDB and tried to run it in the terminal. It just shows 'mongo' is not recognized as an internal or external command, operable program or batch file.
I have added the path to the bin folder to my environment variables too. One thing I noticed is that a file might be missing from the bin folder, namely mongo, because I only have the mongod and mongos files in there. I tried to uninstall and reinstall the program and it was still not working.
I have no idea what it is that I'm missing. Please help out.
I have finally found the solution.
The mongo shell no longer ships with the server binaries. It can be downloaded from the MongoDB Shell Download page.
Then extract the contents of the bin folder from the downloaded ZIP file into the bin folder of the MongoDB installation and run mongosh instead of mongo in the terminal.
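Once the bin folder containing mongosh is on the PATH, starting the shell against a local server is a single command (the connection string below is just the default local one, adjust host and port to your setup):
mongosh "mongodb://localhost:27017"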

git mv rename all files and folders to capitalize first letter

I am renaming all my components and their folders. All of them start with a lowercase letter, and I am capitalizing them. The way I am currently doing it for the folder names is
git mv folder test
git mv test Folder
and for the files directly
git mv file File
I have a folder that contains around 50 folders and 200 sub-files. Can I automate this? I need to capitalize all of them with git mv so that GitHub detects the changes. I use Windows 10.
Of course there are ways to automate it. Since you're apparently a JavaScript dev, you could look into the Node.js APIs (a rough sketch follows the list below):
fs.readdir to read each name in a directory
child_process.spawn to run git mv ... ...
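A rough sketch of that approach, using the synchronous variants of those APIs for simplicity. It assumes it is run from the repository root, renames only top-level entries, and skips hidden files; recursion into subfolders and error handling are left out.
// capitalize.js - hypothetical helper, run with `node capitalize.js` from the repo root
const fs = require('fs');
const { spawnSync } = require('child_process');

for (const name of fs.readdirSync('.')) {
  if (name.startsWith('.')) continue;                      // skip .git and other hidden entries
  const capitalized = name.charAt(0).toUpperCase() + name.slice(1);
  if (name === capitalized) continue;                       // already capitalized
  const temp = name + '.renaming';                          // two steps, since Windows is case-insensitive
  spawnSync('git', ['mv', name, temp], { stdio: 'inherit' });
  spawnSync('git', ['mv', temp, capitalized], { stdio: 'inherit' });
}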

What option to use with wget for selecting only a few files with a particular extension from an FTP directory

I am trying to download files that have a particular datestamp as part of their name from a folder on an FTP server. Since the folder contains all sorts of other files, I want to download only the files with that particular datestamp.
I tried using wget files_datestamp*.extension, which didn't work.
I also tried using wget -i files_datestamp*.extension, which downloads all.
My question is: what option should I use with wget to download only the particular files that I am interested in?
wget http://collaboration.cmc.ec.gc.ca/cmc/CMOI/NetCDF/NMME/1p0deg/#%23%23/CanCM3_201904_r4i1p1_20190501*.nc4
The link you've shared is over HTTP, not FTP. As a result, it is not possible to glob over the filenames; that is feasible only over FTP.
With HTTP, you need access to a directory listing page which tells you which files are available. Then use -r --accept-regex=<regex here> to download your files.
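For example, a sketch of that command (the regex is an assumption based on the file name in the question, and <listing-page-URL> stands for the actual directory page, which is partially masked above):
wget -r -np -nd -l 1 --accept-regex 'CanCM3_201904_r4i1p1_20190501.*\.nc4' <listing-page-URL>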

How to create a folder in a Sonatype Nexus repository through the command line

Is there a way to create folders and copy artifacts into the created folders in a Sonatype Nexus repository, through the Windows command line or batch files?
I have finally found a solution to my problem using curl:
curl -u admin:admin123 -T C:\upload\Test.txt http://Nexus_Repo_URL/folder_to_be_created/Test.txt
folder_to_be_created is the folder that is created in the repository, and the file Test.txt is copied into it.
Just upload the artifacts to whatever path is needed using one of the methods described here:
https://support.sonatype.com/hc/en-us/articles/213465818-How-can-I-programatically-upload-an-artifact-into-Nexus-
Any folders needed will be created automatically.
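As an illustration, an upload to a deeper path with the same placeholder server and credentials as above (the path segments here are made up) creates every intermediate folder along the way:
curl -u admin:admin123 -T C:\upload\Test.txt http://Nexus_Repo_URL/com/example/demo/1.0/Test.txt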

Downloading artifacts from Jenkins using wget or curl

I am trying to download an artifact from a Jenkins project using a DOS batch script. The reason this is more than trivial is that my artifact is a ZIP file whose name includes the Jenkins build number, so I don't know the exact file name.
My current plan of attack is to use wget pointing at /lastSuccessfulBuild/artifact/ to do some sort of recursive/mirror download.
If I do the following:
wget -r -np -l 1 -A zip --auth-no-challenge --http-user=**** --http-password=**** http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/
(*s are chars I've changed for posting to SO)
I never get a ZIP file. If I omit the -A zip option, I do get the index.html, so I think the authorisation is working, unless it's some sort of session caching issue?
With -A zip I get as part of the response:
Removing ...+8080/job/MyProject/lastSuccessfulBuild/artifact/index.html since it should be rejected.
So I'm not sure if maybe it's removing that file and therefore not following its links? But doing -A zip,html doesn't work either.
I've tried several wget options, and also curl, but I am getting nowhere.
I don't know if I have the wrong wget options or whether there is something special about Jenkins authentication.
You can add /*zip*/desired_archive_name.zip to the URL of any folder under the artifacts location.
If your ZIP file is the only artifact that the job archives, you can use:
http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip
where myfile.zip is just a name you assign to the downloadable archive, could be anything.
If you have multiple artifacts archived, you can either still get the ZIP file of all of them and deal with the individual ones on extraction, or place the artifact that you want into a separate folder and apply /*zip*/ to that folder.
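Put together with the authentication flags from the question, the download then becomes a single, non-recursive request (server address and credentials are the same placeholders used above):
wget --auth-no-challenge --http-user=**** --http-password=**** -O myfile.zip http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip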

Resources