Download .zip from a shared folder link (LuckyCloud) with a batch script

I have the following problem:
When I run the command below, I just get a 9 KB .zip file that won't open, while the original file is 43.9 MB. I've already tried several curl variants, including following the HTML redirect, all without success.
If it helps, here is the shared download link: https://sync.luckycloud.de/d/fb56e4a8239a4c6cac7a/
curl https://sync.luckycloud.de/d/fb56e4a8239a4c6cac7a/files/?p=%2FValheimServer%20Buddelkiste%20Modpack%20v3.4%20-%20Standart.zip -O -J -L
I tried PowerShell too, but I only get the same 9 KB file :/
# Source file location
$source = 'https://sync.luckycloud.de/d/fb56e4a8239a4c6cac7a/files/?p=%2FValheimServer%20Buddelkiste%20Modpack%20v3.4%20-%20Standart.zip'
# Destination to save the file
$destination = 'C:\Users\Anonymos\Downloads\Test.zip'
# Download the file
Invoke-WebRequest -Uri $source -OutFile $destination

I couldn't solve it with curl, but I could with PowerShell's Invoke-WebRequest:
powershell -c "Invoke-WebRequest 'https://sync.luckycloud.de/d/fb56e4a8239a4c6cac7a/files/?p=%%2FValheimServer%%20Buddelkiste%%20Modpack%%20v3.4%%20-%%20Standart.zip&dl=1' -OutFile '%FilePath%installdir\Test.zip'"
When running from a batch file, make sure every % in the link is escaped as %%.
"&dl=1" is a LuckyCloud-specific suffix; other hosters sometimes use "&download=1" instead.
You need this suffix to get the full file.
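If the batch %-escaping is the tricky part, the same download can be tested directly in PowerShell, where no %-doubling is needed (a minimal sketch using the share link from the question; the output path is just an example):
$uri = 'https://sync.luckycloud.de/d/fb56e4a8239a4c6cac7a/files/?p=%2FValheimServer%20Buddelkiste%20Modpack%20v3.4%20-%20Standart.zip&dl=1'
# &dl=1 asks LuckyCloud for the raw file instead of the share page
Invoke-WebRequest -Uri $uri -OutFile "$env:USERPROFILE\Downloads\Test.zip"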

Related

Batch replacing text within .xml files within a zip archive through powershell

I've posted this question before but was not able to find a suitable solution, so I have been testing some more myself and have new findings and theories as to why it might not work as intended. I hope a respectable time has passed for me to bump my question with new info attached.
For quite a while now I have been tackling, in my free time, a script that can batch-replace external link addresses in multiple Excel files within the script's folder. I have learned that you can't change external links via the usual PowerShell-to-Excel interaction, as these values are forced to read-only. However, there is a clever way to bypass that: convert the Excel file to a .zip archive, read and change the files inside, then rename it back to the Excel format.
Through learning and digging around the web, I have compiled this script function that should create a backup, rename the file to an archive, replace the desired text within, and rename the file back afterwards.
function Update-ExcelLinks($xlsxFile, $oldText, $newText) {
    # Build BAK file name
    $bakFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".bak"
    # Build ZIP file name
    $zipFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".zip"
    # Create temporary folder
    $parent = [System.IO.Path]::GetTempPath()
    [string] $guid = [System.Guid]::NewGuid()
    $tempFolder = Join-Path $parent $guid
    New-Item -ItemType Directory -Path $tempFolder
    # Uncomment the next line to create a backup before processing the XLSX file
    # Copy-Item $xlsxFile $bakFile
    # Rename file to ZIP
    Rename-Item -Path $xlsxFile -NewName $zipFile
    # Not using Expand-Archive because it changes the ZIP format
    C:\7z\7za.exe x "$zipFile" -o"$tempFolder"
    # Replace old text with new text
    $fileNames = Get-ChildItem -Path $tempFolder -Recurse -Force -Include *.xml,*.bin.rels
    foreach ($file in $fileNames)
    {
        (Get-Content -ErrorAction SilentlyContinue $file.PSPath) |
            Foreach-Object { $_ -replace $oldText, $newText } |
            Set-Content $file.PSPath
    }
    # Changing working folder because the 7-Zip option -w doesn't work
    Set-Location -Path $tempFolder
    # Not using Compress-Archive because it changes the ZIP format
    C:\7z\7za.exe u -r "$zipFile" *.*
    # Rename file back to XLSB
    Rename-Item -Path $zipFile -NewName $xlsxFile
}
So far, I am able to find the desired file, rename it to .zip, and extract it to a temporary folder, and according to the PowerShell window output, the archive is updated with the new files. Afterwards, however, I am left with a .zip file without any of the desired changes inside. The files responsible for the external links data in the Excel files are located at
wk33\Gross Qty_wk33.zip\xl\externalLinks\_rels
and are named externalLink1.bin.rels and numbered onwards.
These files are essentially .xml files, can be opened with either Notepad or Internet Explorer on Windows, and contain the external link targets.
The aim of my script is to change the week number within the "Target=" parameters from last week to the current one (for example, wk32 to wk33). The thing is that no changes happen, even though no errors are displayed and 7-Zip reports that the files are packed into the zip successfully.
I have tested what happens if I unpack the .bin.rels file, change the week number inside manually through Notepad, and repeat the intended script process, and I can confirm that this works: when I open the file, the link is correctly updated.
The last four steps therefore seem to be not working as intended, even though they look correct as far as I can tell. The changes are not made, and the file is not subsequently renamed back to its original .xlsb extension.
I've been trying for several weeks to make this work, but it seems nothing substantial changes. I would appreciate any additional insights or alternatives to my code that achieve the same task.
EDIT: The function is intended to be called from within the working directory, so it is supposed to be used as Update-ExcelLinks 'Gross Qty_wk33.xlsb' 'wk32' 'wk33', although I have also tried calling the file via its full path, as Update-ExcelLinks 'C:\Test\Gross Qty_wk33.xlsb' 'wk32' 'wk33'.
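One plausible culprit (an assumption, since the question is unresolved here): $zipFile holds a relative path, so after Set-Location switches to the temporary folder, 7-Zip's update creates or updates a zip inside that folder rather than the original archive, and the final Rename-Item then looks in the wrong place too. A minimal, untested sketch of the same function with absolute paths resolved up front:
function Update-ExcelLinks($xlsxFile, $oldText, $newText) {
    # Resolve an absolute path first, so Set-Location cannot change what it refers to
    $xlsxFull = (Resolve-Path $xlsxFile).Path
    $zipFull = [System.IO.Path]::ChangeExtension($xlsxFull, ".zip")
    $tempFolder = Join-Path ([System.IO.Path]::GetTempPath()) ([System.Guid]::NewGuid())
    New-Item -ItemType Directory -Path $tempFolder | Out-Null
    Rename-Item -Path $xlsxFull -NewName $zipFull
    C:\7z\7za.exe x "$zipFull" -o"$tempFolder"
    foreach ($file in Get-ChildItem -Path $tempFolder -Recurse -Force -Include *.xml,*.bin.rels) {
        (Get-Content $file.FullName) -replace $oldText, $newText |
            Set-Content $file.FullName
    }
    Push-Location $tempFolder
    # $zipFull is absolute, so the update hits the original archive, not a new one in the temp folder
    C:\7z\7za.exe u -r "$zipFull" *.*
    Pop-Location
    Rename-Item -Path $zipFull -NewName $xlsxFull
}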

WinSCP script to recursively upload all files matching a mask to the same folder

I have a text file (a WinSCP script) that I am calling from a batch file, and it is not putting files recursively onto the FTP site. The folder structure has subfolders which contain the files I want to copy, among many other files. The put commands only copy from C:\storage itself. After reading the documentation and trying other methods, it still does not copy files recursively (and no folders should be copied to the remote, just the RDF files from the subfolders).
The folder structure is random on different PC:
C:\storage\78286.S-92A.920024*.RDF
C:\Storage\folder1\78286.S-92A.920024*.RDF
C:\Storage\storage2\folder2\78286.S-92A.920024*.RDF
There are many RDF files, but I am only interested in those matching the wildcard above. Basically, I want to select all the matching *.RDF files from all the subfolders, but I do not want the subfolders themselves to be copied to the remote.
Please see code below.
option batch continue
option confirm off
option reconnecttime 900
open ftp://companyuser:!password@ftpsite.com/
lcd "C:\storage"
put "C:\storage\78286.S-92A.920024*.RDF" "/"
put "C:\storage\*\78286.S-92A.920024*.RDF" "/"
close
exit
It's not easy to do such custom processing with WinSCP scripting only.
But with WinSCP .NET assembly from a PowerShell script, it's not difficult:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "username"
    Password = "password"
}
$session = New-Object WinSCP.Session
try
{
    Write-Host "Connecting ..."
    $session.Open($sessionOptions)

    $localPath = "C:\storage"
    $remotePath = "/"
    $wildcard = "78286.S-92A.920024*.RDF"

    $localFiles = Get-ChildItem -Include $wildcard -Recurse -Path $localPath
    foreach ($localFile in $localFiles)
    {
        Write-Host "Uploading $($localFile.FullName)..."
        $session.PutFiles($localFile.FullName, $remotePath).Check()
    }
}
finally
{
    # Disconnect and clean up
    $session.Dispose()
}
Just extract the contents of the WinSCP .NET assembly package along with the script (say, flatupload.ps1) and run it like:
powershell -ExecutionPolicy Bypass -File flatupload.ps1
The code is partly based on the WinSCP example Recursively move files in directory tree to/from SFTP/FTP server while preserving source directory structure.
See also the WinSCP forum question Ignore folder structure when copying the files.
Have a look at the documentation at https://winscp.net/eng/docs/commandline.
You can use the command-line tool winscp.com. Your script looks fine; it just needs to be run with winscp.com.
The ftp client on Linux has the mput/mget commands for multiple-file operations, but WinSCP does not.
As a workaround, you can first create the folder with the mkdir command (via winscp.com) and then use the synchronize option to update the folder contents, as in the sketch below.
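A rough sketch of that workaround driven from PowerShell (assumes winscp.com is on the PATH and reuses the FTP details from the question; /rdf is a hypothetical remote folder, and note that synchronize mirrors the local directory structure, unlike the flat upload above):
# Hypothetical invocation; each quoted argument is one WinSCP script command
& winscp.com /command `
    "open ftp://companyuser:password@ftpsite.com/" `
    "mkdir /rdf" `
    "synchronize remote C:\storage /rdf" `
    "exit"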

How to set FromFile location in Powershell?

I am preparing a script which needs to use some images from the same folder as the script. The images are to be shown on a WinForms GUI.
$imgred = [System.Drawing.Image]::FromFile("red.png")
When I run the ps1 script manually from the folder, just by clicking, it loads the images and shows them. Unfortunately, I do not remember exactly how I set this up, but as far as I can tell, it was just the default program for ps1 files.
When I run the script from a cmd file (to hide the cmd window), it also loads them.
But when I open it with PowerShell ISE and run it, I get errors and no icons are shown on my GUI.
When I open it with PowerShell directly, it also fails to load them.
The only difference between those run modes I can find is with:
$scriptPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$scriptPath          # always the same: the location of the script
(Get-Location).Path  # script location when icons load, System32 folder when they fail to load
The behavior is the same after doing cd $scriptPath, so the current folder is most likely not the culprit.
I know I can write $scriptPath/red.png in each file-read line (FromFile), but what I want is to define the default location for FromFile once, and then have a simple filename work regardless of the way I run the script.
What do I have to change so that the default file-reading path is the same as my script's location?
Modifying the default location stack in PowerShell ($PWD) doesn't affect the working directory of the host application.
To see this in action:
PS C:\Users\Mathias> $PWD.Path
C:\Users\Mathias
PS C:\Users\Mathias> [System.IO.Directory]::GetCurrentDirectory()
C:\Users\Mathias
now change location:
PS C:\Users\Mathias> cd C:\
PS C:\> $PWD.Path
C:\
PS C:\> [System.IO.Directory]::GetCurrentDirectory()
C:\Users\Mathias
When you invoke a .NET method that takes a file path argument, like Image.FromFile(), the path is resolved relative to the process working directory reported by GetCurrentDirectory(), not $PWD.
If you want to pass a file path relative to $PWD, do:
$pngPath = Join-Path $PWD "red.png"
[System.Drawing.Image]::FromFile($pngPath)
or
[System.Drawing.Image]::FromFile("$PWD\red.png")
If you require a path relative to the executing script, in PowerShell 3.0 and newer you can use the $PSScriptRoot automatic variable:
$pngPath = Join-Path $PSScriptRoot "red.png"
If you need to support v2.0 as well, you could put something like the following at the top of your script:
if (-not (Get-Variable -Name PSScriptRoot -ErrorAction SilentlyContinue)) {
    $PSScriptRoot = Split-Path $MyInvocation.MyCommand.Definition -Parent
}
When using PowerShell in interactive mode, you could configure the prompt function to have .NET "follow you around" like so:
$function:prompt = {
    if ($ExecutionContext.SessionState.Drive.Current.Provider.Name -eq "FileSystem") {
        [System.IO.Directory]::SetCurrentDirectory($PWD.Path)
    }
    "PS $($executionContext.SessionState.Path.CurrentLocation)$('>' * ($nestedPromptLevel + 1)) "
}
but I would recommend against that; just get into the habit of providing fully qualified paths instead.

SOLR POST files with no extension

I am using SOLR 5 and I want to index documents that have no extensions. Unfortunately, changing the files to have extensions is not an option in my case.
The command I am using is simply:
bin/post -c mycore ../foldertobescaned -type application/pdf
The command works fine for documents that do have an extension, but otherwise I get:
Entering auto mode. File endings considered are xml,json,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
If renaming the files is not an option, you can use the following script as a workaround until Solr improves its post tool. It is a simple bash for loop that submits each file individually and works regardless of the file extension. Note that this script will be slower than running post on the whole folder, because each individual file transfer has to be initialized separately.
Save the script below as postFolderToSolr.sh inside your Solr folder (so that Solr's bin/ folder is a subdirectory), make it executable with chmod +x postFolderToSolr.sh, and then use it as follows: ./postFolderToSolr.sh mycore /home/user1/foldertobescaned/ application/pdf
Using no arguments or the wrong number of arguments prints a short usage message as help.
#!/bin/bash
set -o nounset

if [ "$#" -ne 3 ]
then
    echo "Post contents of a folder to Solr."
    echo
    echo "Usage: postFolderToSolr.sh <collectionName> </path/to/folder> <MIME>"
    echo
    exit 1
fi

collection=$1
inputPath=${2%/}  # remove trailing / if it exists
mime=$3

for element in "$inputPath"/*; do
    bin/post -c "$collection" -type "$mime" "$element"
done

Convert PowerShell script into non-readable format

I have a PowerShell script which installs a patch (a set of files to be added) on a customer machine. For this, I have created a batch file which executes the PowerShell script.
For the customer to run this batch file, the PowerShell script file must be placed onto the customer machine as well.
The PowerShell script is in text format, which can be read and understood by the customer easily.
Can we convert this script file into some non-readable format (e.g. bin or exe), so that it is not readable by the customer?
You can convert the script to Base64 encoding, so that it's not immediately readable. To convert a PowerShell script file to a Base64 String, run the following command:
$Base64 = [System.Convert]::ToBase64String([System.IO.File]::ReadAllBytes('c:\path\to\script file.ps1'));
To launch the Base64-encoded script, you can call PowerShell.exe as follows:
powershell.exe -EncodedCommand <Base64String>
For example, the following command:
powershell.exe -EncodedCommand VwByAGkAdABlAC0ASABvAHMAdAAgAC0ATwBiAGoAZQBjAHQAIAAiAEgAZQBsAGwAbwAsACAAdwBvAHIAbABkACEAIgA7AA==
Will return the following results:
Hello, world!
I tried the solution proposed by @TrevorSullivan, but it gave me the error:
The term '????' is not recognized as the name of a cmdlet, function,
script file or operable program...
As I found out later, the problem was bad encoding. I found another approach elsewhere, and combining the two gave me a working PowerShell command:
$Base64 = [System.Convert]::ToBase64String([System.Text.Encoding]::Unicode.GetBytes([System.IO.File]::ReadAllText("script.ps1")))
Then I can redirect the result to file:
$Base64 > base64Script.txt
from where I just copy the encoded command and paste it in place of <Base64String>:
powershell.exe -EncodedCommand <Base64String>
and it works without any problem.
Thanks, guys, for your posts. I took @Frimlik's post and created my own script to automate the process. I hope this helps someone.
Save the script to a ps1 file and run it.
Function Get-FileName($initialDirectory)
{
    [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
    $OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
    $OpenFileDialog.initialDirectory = $initialDirectory
    $OpenFileDialog.ShowDialog() | Out-Null
    $OpenFileDialog.filename
}

Function EncodePS1ToBat {
    $ScriptToEncode = Get-FileName
    # Encode the script into the variable $Base64
    $Base64 = [System.Convert]::ToBase64String([System.Text.Encoding]::Unicode.GetBytes([System.IO.File]::ReadAllText($ScriptToEncode)))
    # Get the path and name to be used for the output
    $filePath = Split-Path $ScriptToEncode -Parent
    $fileName = (Split-Path $ScriptToEncode -Leaf).Split(".")[0]
    # Output the encoded script into a batch file in the same directory as the original script
    "@echo off`npowershell.exe -ExecutionPolicy Bypass -EncodedCommand $Base64" |
        Out-File -FilePath "$filePath\$fileName`_Encoded.bat" -Force -Encoding ascii
}

# Call the function to encode the script to a batch file
EncodePS1ToBat
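If you ever need to check what such a batch file will run, the encoding is straightforward to reverse (a small sketch; $Base64 stands for the string after -EncodedCommand):
# Decode an -EncodedCommand payload back to the original script text
$bytes = [System.Convert]::FromBase64String($Base64)
[System.Text.Encoding]::Unicode.GetString($bytes)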
