How to parse / pass lines from a text file to a program to be run using that string - batch-file

I use an old program from a company that went out of business. It creates a "root" folder, and inside that root folder it creates other folders; inside each of those folders it creates a rar file of the output from the run.
Root
  Folder
    zipfile
  Folder
    zipfile
Note that inside the Root there are over 400 folders, and each has a zip file inside that I need to unrar.
Each folder is named for the time the program ran; the program runs anywhere from 2 to 6 times a minute, so it creates a new folder in the "root" folder for each run.
To see the information, I need to unrar the file it made in that folder.
The problem: I have over 400 folders inside a central folder, and every file has to be unrared into the folder it is currently in.
I know WinRar has an unrar here function, and I have generated a text file that has the directory listing of each folder I need to unrar. One folder per line.
Basically I need a batch file that will go to each folder, do an unrar here, and move on to the next folder.
Any help available? I am on a Win10 Home system and have my registered version 5.80 of WinRAR. I am not a PowerShell user; I am learning it starting today. I did use PowerShell to generate the txt file with the directory listing, though.
(Get-ChildItem -Recurse -Filter "*.zip").FullName | Out-File d:\pattern.txt
So I am not totally clueless.
Thank you.
Responding to Campo: I use WinRar to handle Zips as well as Rars; I do not find I need a separate zip extractor when I have WinRar installed.

The reason I asked about the extension in my comments is that you said you'd already used PowerShell to create a file list using Get-ChildItem, and that you wanted to start learning it.
As you said you were using PowerShell on Windows 10, I expect that you have PowerShell v5.0+. This, therefore, is a single line that can go in a PowerShell script or be entered directly at the PowerShell prompt; it should unzip each of your .zip files without introducing a third-party utility like WinRAR.
Please change the -Path string according to your specific location.
Get-ChildItem -Path "C:\Users\Frank\Root" -Filter "*.zip" -File -Recurse | ForEach-Object { Expand-Archive -Path $_.FullName -DestinationPath $_.DirectoryName -Force }
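The -DestinationPath $_.DirectoryName part is what reproduces WinRAR's "Extract Here" behaviour: each archive is expanded into the same folder it already sits in.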
You didn't clarify whether you want to delete each zip file once you've unarchived it; if you want that functionality, for the purpose of this exercise you could simply append a deletion command:
Get-ChildItem -Path "C:\Users\Frank\Root" -Filter "*.zip" -File -Recurse | ForEach-Object { Expand-Archive -Path $_.FullName -DestinationPath $_.DirectoryName -Force; Remove-Item $_.FullName }
Please note that the example is only a starting point for you. As you learn more, you can add robustness to it, for example to avoid deleting a zip file if its extraction failed, and to decide what to do in the event of errors, etc.
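For instance, here is a minimal sketch of that kind of robustness; the try/catch layout and the warning message are my own assumptions, not part of the original one-liner:

Get-ChildItem -Path "C:\Users\Frank\Root" -Filter "*.zip" -File -Recurse | ForEach-Object {
    $zip = $_
    try {
        # -ErrorAction Stop turns extraction failures into terminating errors the catch block can see
        Expand-Archive -Path $zip.FullName -DestinationPath $zip.DirectoryName -Force -ErrorAction Stop
        # Only reached if the extraction above succeeded
        Remove-Item $zip.FullName
    }
    catch {
        Write-Warning "Left $($zip.FullName) in place: $($_.Exception.Message)"
    }
}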

Related

Batch script to delete same name files with appended numbers

I'm working on creating a batch script which will allow me to delete files with the same name and an appended ascending number; for example:
fileName.txt
fileName (1).txt
fileName (2).txt
fileName (3).txt
fileName (4).txt
fileName (5).txt
fileName (6).txt
fileName (7).txt
etc....
Below is the code I came up with, but it only deletes the file with the plain name, not the copies with the appended number in parentheses.
@echo off
:: Change to the Downloads directory
cd %UserProfile%\Downloads
:: Deletes files
Del Awesome-Kicks-Test_File-Run.txt
All of these files are saved in my Downloads directory.
Could you please direct me on how best to go about deleting these files?
Since you are on Windows, you should have access to PowerShell. PS has much better support for these types of operations, and is available on all versions of Windows in common use.
One method is to use a regular expression to find all of the files that match the pattern. Here is a basic one that should work for the example you gave: fileName \(\d+\)
Using PowerShell, you then combine that regular expression with the Get-ChildItem, Where-Object, and Remove-Item cmdlets.
Get-ChildItem | Where-Object { $_.Name -match 'fileName \(\d+\)' } | Remove-Item
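If you'd like to preview the matches before deleting anything, a small variation should work; since your batch file targeted %UserProfile%\Downloads, the PowerShell equivalent of that path is used here, and -WhatIf is PowerShell's dry-run switch:

Get-ChildItem "$env:UserProfile\Downloads" | Where-Object { $_.Name -match 'fileName \(\d+\)' } | Remove-Item -WhatIf

Once the list of files it would remove looks right, drop the -WhatIf to actually delete them.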

Batch replacing text within .xml files within a zip archive through powershell

I've posted this question before but was not able to find a suitable solution for my problem, so I have been testing some more myself and have some new findings and theories as to why it might not work as intended. I hope a respectable time has passed for me to bump my question with the new info attached.
For quite a while now I have been tackling, in my free time, a script that can batch replace external link addresses in multiple Excel files within the script's folder. I have learned that you can't change external links via the usual PowerShell-to-Excel interaction, as these values are forced to read-only. However, there is a clever way to bypass that: convert the Excel file to a .zip archive, read/change the files inside, and then rename it back to the Excel format.
Through learning and digging around the web, I have compiled this script function that should create a backup, rename the file to an archive, replace the desired text within, and rename the file back afterwards.
function Update-ExcelLinks($xlsxFile, $oldText, $newText) {
    # Build BAK file name
    $bakFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".bak"
    # Build ZIP file name
    $zipFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".zip"
    # Create temporary folder
    $parent = [System.IO.Path]::GetTempPath();
    [string] $guid = [System.Guid]::NewGuid();
    $tempFolder = Join-Path $parent $guid;
    New-Item -ItemType Directory -Path $tempFolder;
    # Uncomment the next line to create backup before processing XLSX file
    # Copy-Item $xlsxFile $bakFile
    # Rename file to ZIP
    Rename-Item -Path $xlsxFile -NewName $zipFile
    # Not using Expand-Archive because it changes the ZIP format
    C:\7z\7za.exe x "$zipFile" -o"$tempFolder"
    # Replace old text with new text
    $fileNames = Get-ChildItem -Path $tempFolder -Recurse -Force -Include *.xml,*.bin.rels
    foreach ($file in $fileNames)
    {
        (Get-Content -ErrorAction SilentlyContinue $file.PSPath) |
            Foreach-Object { $_ -replace $oldText, $newText } |
            Set-Content $file.PSPath
    }
    # Changing working folder because 7Zip option -w doesn't work
    Set-Location -Path $tempFolder
    # Not using Compress-Archive because it changes the ZIP format
    C:\7z\7za.exe u -r "$zipFile" *.*
    # Rename file back to XLSB
    Rename-Item -Path $zipFile -NewName $xlsxFile
}
So far, I am able to find the desired file, rename it to .zip, and extract it to a temporary folder, and according to the PowerShell window output it updates the archive with new files. Afterwards, I am left with the .zip file without any of the desired changes inside. The files that are responsible for the external links data in the Excel files are located at
wk33\Gross Qty_wk33.zip\xl\externalLinks\_rels
and are presented in the form of files named externalLink1.bin.rels, numbered onwards. These files are essentially identical to .xml files, can be opened in Windows with either Notepad or Internet Explorer, and contain the external link definitions, including the "Target=" parameters.
The aim of my script is to change the week number within the "Target=" parameters from last week to the current one (for example, wk32 to wk33). The thing is that no changes happen even though no errors are displayed, and 7-Zip indicates that the files are packed into the zip successfully.
I have tested what happens if I unpack the .bin.rels file, change the week number inside manually through Notepad, and repeat the intended script process, and I can confirm that this works: when I open the file, the link is correctly updated.
The last 4 steps seem not to be working as intended, even though they look correct as far as I am aware. The changes are not made, and the file is not subsequently renamed back to its original .xlsb extension.
I've been trying for several weeks to make it work, but it seems nothing substantial changes. I would appreciate any additional insights or alternatives to my code that achieve the same task.
EDIT: The function is intended to be called from within the working directory, so it is supposed to be used as Update-ExcelLinks 'Gross Qty_wk33.xlsb' 'wk32' 'wk33', although I have also tried calling the file via its full path, as Update-ExcelLinks 'C:\Test\Gross Qty_wk33.xlsb' 'wk32' 'wk33'.

WinSCP Script upload recursively all files matching a mask to the same folder

I have a text file script that I call from a batch file, and it is not putting files recursively onto the FTP site. The folder structure has subfolders which contain the files I want to copy, among many other files. The put commands only copy from C:\storage itself. After reading the documentation and trying other methods, it is still not copying files recursively. (I do not want the folders themselves copied to the remote, only the RDF files from the subfolders.)
The folder structure is random on different PCs:
C:\storage\78286.S-92A.920024*.RDF
C:\Storage\folder1\78286.S-92A.920024*.RDF
C:\Storage\storage2\folder2\78286.S-92A.920024*.RDF
There are many RDF files, but the wildcard I am interested in is the one you can see above. Basically, I want to select all the *.RDF files (matching the wildcard above, from all the subfolders), but I do not want the subfolders themselves to be copied to the remote.
Please see code below.
option batch continue
option confirm off
option reconnecttime 900
open ftp://companyuser:!password#ftpsite.com/
lcd "C:\storage"
put "C:\storage\78286.S-92A.920024*.RDF" "/"
put "C:\storage\*\78286.S-92A.920024*.RDF" "/"
close
exit
It's not easy to do such custom processing with WinSCP scripting only.
But with the WinSCP .NET assembly from a PowerShell script, it's not difficult:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "username"
    Password = "password"
}

$session = New-Object WinSCP.Session

Write-Host "Connecting ..."
$session.Open($sessionOptions)

$localPath = "C:\storage"
$remotePath = "/"
$wildcard = "78286.S-92A.920024*.RDF"

$localFiles = Get-ChildItem -Include $wildcard -Recurse -Path $localPath

foreach ($localFile in $localFiles)
{
    Write-Host "Uploading $($localFile.FullName)..."
    $session.PutFiles($localFile.FullName, $remotePath).Check()
}
Just extract the contents of the WinSCP .NET assembly package along with the script (say, flatupload.ps1) and run it like:
powershell -ExecutionPolicy Bypass -File flatupload.ps1
The code is partly based on WinSCP example Recursively move files in directory tree to/from SFTP/FTP server while preserving source directory structure.
See also WinSCP forum question Ignore folder structure when copying the files.
Have a look at the documentation at https://winscp.net/eng/docs/commandline
You can use the command-line winscp.com.
Your script seems good; it just needs to be called with winscp.com.
For FTP clients on Linux, the mput/mget (multiple file operation) commands are available, but they are not available with WinSCP.
You can try a workaround: first create the folders using the mkdir command (with winscp.com) and then use the synchronize option with winscp.exe to update the folder contents.

Finding the date a file was moved into a folder

Ok, here is the scenario. I have a folder that people can use to share data around; quick and easy, no USB needed. However, they tend to leave the data in there, and it is filling up the HDD. I want to delete everything over 30 days old. I was going to do this with PowerShell:
$limit = (Get-Date).AddDays(-30)
$path = "C:\temp"
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force -whatif
Get-ChildItem -Path $path -Recurse -Force | Where-Object { $_.PSIsContainer -and (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null } | Remove-Item -Force -Recurse -whatif
I left the -whatif there because that makes it easier to test.
So I tested it, and it works fine, but during the testing I ran into a small problem: moved files retain their original creation date. Why is this important? Because a user might move something into the folder 5 minutes before the script runs, and it will get deleted if the creation date is older than 30 days. Copying a file is no problem; it gets a new creation date when copied. It's only the moved files.
So question, how do I find the date the file was moved into the folder so that it isn't accidentally deleted?
This topic may have already been covered here: Finding Modified Date of a file/folder
Is this more or less what you are looking for, or are you looking specifically for the creation date? I would imagine that some of the files may have been created on 1/1/2016, but the user might be modifying them every so often, so one could be last modified on 2/15/2016. I'm sure you wouldn't want to remove that; otherwise you would have one very angry end user.
Ok, so the short answer is that it can't be done. In the end, I had to copy the folder to an archive folder on another server as a backup and then delete everything in the folder. The backup then gets deleted 10 days later.
But some things I found out along the way:
- Using Robocopy to move the files also deletes the root folder. This might be OK in some cases, but the root folder has permissions on it that aren't created with a simple "make directory" command.
- Using Move-Item in PowerShell doesn't like to work between servers.
- PowerShell is limited by the path and file length limitations that Windows has.
What I ended up doing was copying all the files to an archive first:
robocopy $SourceFolder $DestinationFolder /e
Then I wanted to delete all the files, which Remove-Item can't do if the path is longer than 248 characters, because of the file and path length limitations. Again, robocopy to the rescue. I have a blank folder, with nothing in it, and to delete all the items in the other folder, regardless of path and file length, I mirror the blank folder:
robocopy /mir $EmptyFolder $SourceFolder
I already had a PowerShell script, so I stuck these commands in there. But since robocopy isn't a PowerShell command, you could also replace the variables with the actual folders and put the commands straight into a .bat file.
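For reference, here is a minimal sketch of how those two commands could sit together in a script; the folder paths are placeholders you would replace with your own:

# Placeholders; substitute your real paths
$SourceFolder      = "C:\SharedFolder"
$DestinationFolder = "\\OtherServer\Archive"
$EmptyFolder       = "C:\EmptyFolder"   # a folder that is kept permanently empty

# 1. Copy everything (including subfolders) to the archive
robocopy $SourceFolder $DestinationFolder /e

# 2. Mirror the empty folder over the source; this deletes all of its contents
#    without removing the root folder itself or disturbing its permissions
robocopy /mir $EmptyFolder $SourceFolder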
From here, I just put in a scheduled task to run the script once a month to keep the folder free of built-up clutter.
I hope this helps anyone else who is trying to do something similar.
As it turns out, there's a hidden fourth modification time in Windows: ChangeTime. As far as I understand it, it's stored in a separate place and isn't returned by the normal APIs; it's part of the Master File Table, I think. (Perhaps it depends on your file system type?) Whether that description is accurate or not, consider the following:
C:\Users\erhannis\Downloads\nsx220en>stat nsx-2.2.0
File: nsx-2.2.0
Size: 0 Blocks: 4 IO Block: 65536 directory
Device: c6194f59h/3323547481d Inode: 7318349397564576 Links: 1
Access: (0755/drwxr-xr-x) Uid: (1056947/ erhannis) Gid: (1049089/ UNKNOWN)
Access: 2016-03-24 14:41:17.880026600 -0400
Modify: 2016-03-24 14:41:17.880026600 -0400
Change: 2020-12-01 17:03:41.697541400 -0500
Birth: 2016-03-24 14:35:09.540397700 -0400
(See links for relevant programs.)
The time listed next to Change: is the time I moved the folder from one place to another. When I move a file, the Change date is updated. Note: I tried moving a folder full of files, and it appears only the root object's ChangeTime is updated. So, for your purposes, you would consider the effective ChangeTime of a given file to be the newest of the ChangeTime of the file and of any parent directory. (Or just stop recursing into directories when you hit a ChangeTime that is too new; it depends on how you're doing it.)
Perhaps more relevantly, the Linux find command has a parameter -cmin that lets you filter by "metadata more recently changed than X". So, for instance, find -cmin -5 recursively finds all items whose metadata (e.g., when it was last moved) changed less than five minutes ago. (find -cmin +5 would find files changed MORE than 5 minutes ago.) (On Windows, I rename the find command to something else so it doesn't conflict with the built-in Windows find command.)
Windows ports of both stat and find are available, incidentally, as part of a Git installation:
https://git-scm.com/download/win
There's also this PowerShell script I found; I haven't tried it, but it purports to get the ChangeTime of a file.
https://gallery.technet.microsoft.com/scriptcenter/Get-MFT-Timestamp-of-a-file-9227f399
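If you'd rather stay in PowerShell without extra downloads, here is an untested sketch along the same lines, using the Win32 GetFileInformationByHandleEx API; the MftTimes wrapper class and the example path are names I made up for illustration:

Add-Type @"
using System;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

public static class MftTimes
{
    [StructLayout(LayoutKind.Sequential)]
    private struct FILE_BASIC_INFO
    {
        public long CreationTime;
        public long LastAccessTime;
        public long LastWriteTime;
        public long ChangeTime;   // the hidden fourth timestamp
        public uint FileAttributes;
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool GetFileInformationByHandleEx(
        SafeFileHandle hFile, int infoClass, out FILE_BASIC_INFO info, uint size);

    public static DateTime GetChangeTime(string path)
    {
        // Works for files; directories would need FILE_FLAG_BACKUP_SEMANTICS instead
        using (var fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            FILE_BASIC_INFO info;
            // 0 = FileBasicInfo information class
            if (!GetFileInformationByHandleEx(fs.SafeFileHandle, 0, out info,
                    (uint)Marshal.SizeOf(typeof(FILE_BASIC_INFO))))
                throw new System.ComponentModel.Win32Exception();
            return DateTime.FromFileTime(info.ChangeTime);
        }
    }
}
"@

[MftTimes]::GetChangeTime("C:\temp\example.txt")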

Scheduled Powershell Moves to Specific Folders

Thanks for viewing my question. I was unable to find any information online in regard to my question, and I have only very basic experience in this area.
PowerShell Script:
- Query the folder for files (a list?)
- Move each file, based on its filename, to the folder with the same name. (Move with a pipe to the query?)
- The move will also parse the second part of the file name to pick the matching subfolder for the destination.
Files will contain many separate names, so the move has to be on a loop.
Ex. File: "Name 1"
The script executes and moves the file to the folder "Name", then to the subfolder "1".
Just to be clear, there will be multiple names and numbers, so multiple destination paths. Basically, every file will have a different destination, but the destination will correlate to the file name. If there is a language more accessible for this function, please let me know.
Something like the following will get you started:
$files = Get-ChildItem -File
foreach ($f in $files) {
    # "Name 1.txt" -> subpath "Name\1" (split the base name, not the extension)
    $dirname = $f.BaseName -split " " -join "\"
    # -Force: don't error if the folder already exists
    New-Item -ItemType Directory -Path ".\$dirname" -Force | Out-Null
    Move-Item -Path $f.FullName -Destination ".\$dirname"
}
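One thing to be aware of with this starting point: the split is on every space in the base name, so a file called "Name 1.txt" lands in .\Name\1, and a name containing additional spaces will nest correspondingly deeper.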
