No URLs matched - while copying files to Google Cloud Storage - batch-file

I am copying two files into Google Cloud Storage (GCS). When I run the script from Informatica I get an error, but when I run the same script from Unix it works fine.
Below is a link from GitHub where a similar issue is discussed. I don't understand what the issue is.
------------------------PART OF SCRIPT------------------------
echo "$LFS_File_Path/$File_Name";
gsutil cp "$LFS_File_Path/$File_Name" $GCS_Path;
if [[ $? -eq 0 ]]; then
echo "copy to GCS success for LFS Data File";
else
echo "copy to GCS Failed for LFS" >> $Log_File_Path/$Workflow_Name.txt ;
exit 1
fi
echo "$LFS_File_Path/$Del_File_Name";
gsutil cp "$LFS_File_Path/$Del_File_Name" $GCS_Path;
if [[ $? -eq 0 ]]; then
echo "copy to GCS success for LFS Delete Data File";
else
echo "copy to GCS Failed for LFS" >> $Log_File_Path/$Workflow_Name.txt ;
exit 1
fi
------------------------PART OF SCRIPT------------------------
Error:
CommandException: No URLs matched: /opt/u01/app/informatica/server/infa_shared/TgtFiles/BQ_RT/DW_ORDER_HEADER_DEL.csv
Similar topic:
https://github.com/GoogleCloudPlatform/gsutil/issues/501

It might be related to the permissions on that file, or maybe the command is being run as a regular user while only root has access to read inside that folder.

I resolved it after creating another directory.
It happens because of hidden files in that directory: running the gsutil commands on the hidden files showed some escape characters in them.
Try creating one more directory on the same filesystem and moving the data files into it from the original directory.
The error is mainly caused by those escape characters, so remove or move the hidden files.
That will avoid the error above.
Errors:
1) "No URLs matched"
2) CommandException: No URLs matched
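A minimal sketch of that workaround, assuming the same $LFS_File_Path, $File_Name and $GCS_Path variables as in the script above; the staging directory name is made up for illustration:
# Reveal any hidden files gsutil might trip over
ls -la "$LFS_File_Path"
# Stage the data file in a clean directory on the same filesystem
Stage_Dir="$LFS_File_Path/gcs_stage"
mkdir -p "$Stage_Dir"
mv "$LFS_File_Path/$File_Name" "$Stage_Dir"/
# Copy from the clean directory; quoting the destination avoids surprises from stray characters
gsutil cp "$Stage_Dir/$File_Name" "$GCS_Path"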


Bash array with spaces and no spaces in elements

I know this question has been asked in different manners, and I've referred to some answers on here to get to where I am now.
I'm trying to create a script to essentially watch a folder and virus scan files once they're not being written to.
The files/strings I need it to handle will sometimes contain spaces and sometimes not, as well as sometimes special characters. At the moment it will actually work only on files with spaces in the name (as in actually scan and move them), but not on files without spaces. Also, after each file (spaces or not) the while loop breaks, thus stopping the script, with the following output:
./howscan.sh: line 29: snmp.conf: syntax error: invalid arithmetic operator (error token is ".conf")
./howscan.sh: line 34: snmp.conf: syntax error: invalid arithmetic operator (error token is ".conf")
I had it working for file names without any spaces, but since I introduced the "${files[$i]}" method to use the array elements it only works on files with spaces and outputs the above error.
Feel free to omit the sepscan part of this; I'm sure if I can get it working with the other tasks it'll work for that too (I just wanted to show the full script for a complete understanding).
Current Script:
#!/bin/bash
set -x
workingpath='/mnt/Incoming/ToScan/'
outputpath='/mnt/Incoming/Scanned/'
logfile='/var/log/howscan.log'
faildir='/mnt/Incoming/ScanFailed/'
sepscan='/opt/Symantec/symantec_antivirus/sav manualscan -c'
# Change to working directory
cd $workingpath
# Exclude files with a given extension, in this case .aspx, and declare the remaining files as the array "files"
shopt -s extglob nullglob
# Loop the below - ie it's a watch folder
while true
do
# Exclude files with .aspx in the file name
files=( !(*.aspx) )
# If the array files is not empty then...
if [ ${#files[@]} -ne 0 ]; then
for i in ${files[*]}
# For every item in the array files process them as follows
# Declare any variables you wish to happen on files here, not globally
# Check each file to see if it's in use using fuser, do nothing and log it if its still in use, or process it and log the results
do
fileopen=`fuser "${files[$i]}" | wc -c`
# Here for 'fileopen' we are checking if the file is being writen to.
if [ $fileopen -ne 0 ]; then
echo `date` "${files[$i]}" is being used so has been ignored >> $logfile
else
echo `date` File "${files[$i]}" not being used or accessed >> $logfile
sepscan=`$sepscan "${files[$i]}" | wc -c`
if [ $sepscan = 0 ]; then
mv "${files[$i]}" $outputpath
echo `date` "${files[$i]}" has been virus scanned and moved to $outputpath >> $logfile
else
echo `date` "${files[$i]}" has been moved to $faildir as a virus or error was detected >> $logfile
fi
fi
done
fi
echo `date` 'No more files to process. Waiting 60 seconds...' >> $logfile
sleep 60
done
Let me know if I can provide anything else to help clarify my issue.
Update:
There is a file in the /mnt/Incoming/ToScan/ directory called snmp.conf by the way.
for i in ${files[*]}
should be
for i in ${!files[*]}
# or
for i in ${!files[@]}
${files[*]} expands to the contents of the array and undergoes word splitting. The above syntax expands to a list of indices of the array.
You might also need to double quote the variables, e.g.
if [ "$fileopen" -ne 0 ]; then

Making script continue when a command fails

I am writing a script to remove unwanted files in my $HOME.
I need the script to continue if a command within the for-loop returns an error.
I tried to follow the advice from another Stack Overflow thread which said to use this format:
command || true
However, this does not seem to work in my scenario, where the command is executed inside a for loop. The script exits the loop and continues executing the lines after the loop.
The script:
#!/usr/bin/env bash
files=(
"Desktop"
".yarnrc"
)
cd $HOME
for file in $files;
do
echo "current file: $file"
rm -r "$file" || :
done
echo "hello world"
Output:
current file: Desktop
rm: cannot remove 'Desktop': No such file or directory
hello world
The problem is that $files expands only to Desktop, not to all elements of the array: $files is equivalent to ${files[0]}.
cd
for file in "${files[#]}"; do
echo "current file: $file"
rm -r -- "$file"
done
You aren't using set -e, so whether rm succeeds or fails has no effect on the rest of the loop or script.
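A minimal sketch combining both points, in case set -e is added later; the file list is the one from the question:
#!/usr/bin/env bash
set -e                     # exit on error, to show why the guard matters
files=(
"Desktop"
".yarnrc"
)
cd "$HOME"
for file in "${files[@]}"; do
echo "current file: $file"
rm -r -- "$file" || :      # without "|| :", set -e would abort here on a missing file
done
echo "hello world"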

How to add a delay after getting the list before starting the loop in the shell script

The script below lists the files, changes the permissions on each file and moves it to a different folder. Sometimes the script moves a file before its contents have been fully generated.
I need to add a delay after getting the list and before starting the loop. Is this possible? Please advise how to implement this scenario.
Can we use the sleep command to achieve this?
Script to change the file permissions and move the files to the main folder:
# function starts here
function mv_purge_files
{
cd $SRC
if  [ "$?" = "0" ]; then
for c in $(/usr/bin/ls *)
do
echo "ext: changing file permission $c"
/usr/bin/chmod 775 $c
echo "ext: moving $c"
/usr/bin/mv $c $TGT/$c
done
else
echo "Error accessing folder " $SRC
fi
}
# program starts here
SRC=/temp/file.in
TGT=/tgt/purge
mv_purge_files
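The thread has no answer, but here is a minimal sketch of one way to do it with sleep, assuming the same SRC and TGT variables: take the listing first, wait, and then skip any file whose size is still changing. The size check uses the GNU form of stat and is an assumption on top of the original script, not part of it.
function mv_purge_files
{
cd "$SRC" || { echo "Error accessing folder $SRC"; return 1; }
files=( * )              # snapshot of the listing, taken before the delay
sleep 60                 # delay between getting the list and starting the loop
for c in "${files[@]}"
do
# skip files that grew during the delay (still being written)
size1=$(stat -c %s "$c"); sleep 2; size2=$(stat -c %s "$c")
if [ "$size1" != "$size2" ]; then
echo "ext: $c is still being written, skipping"
continue
fi
echo "ext: changing file permission $c"
/usr/bin/chmod 775 "$c"
echo "ext: moving $c"
/usr/bin/mv "$c" "$TGT/$c"
done
}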

SOLR POST files with no extension

I am using Solr 5 and I want to scan documents that have no extensions. Unfortunately, changing the files to have extensions is not an option in my case.
The command I am using is simply:
$bin/post -c mycore ../foldertobescaned -type application/pdf
The command works fine for documents that do have an extension, but otherwise I am getting:
Entering auto mode. File endings considered are xml,json,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
If renaming the files is not an option, you can use the following script as a workaround until Solr improves its post tool. It is a simple bash for loop that submits each file individually and works regardless of the file extension. Note that this script will be slower than using post on the whole folder, because each individual file transfer needs to be initialized.
Save the script below as postFolderToSolr.sh inside your Solr folder (so that Solr's bin/ folder is a subdirectory), make it executable with chmod +x postFolderToSolr.sh and then use it as follows: ./postFolderToSolr.sh mycore /home/user1/foldertobescaned/ application/pdf
Using no arguments or the wrong number of arguments prints a short usage message as help.
#!/bin/bash
set -o nounset
if [ "$#" -ne 3 ]
then
echo "Post contents of a folder to Solr."
echo
echo "Usage: postFolderToSolr.sh <colletionName> </path/to/folder> <MIME>"
echo
exit 1
fi
collection=$1
inputPath=${2%/} # remove suffix / if it exists
mime=$3
for element in "$inputPath"/*; do
bin/post -c "$collection" -type "$mime" "$element"
done

How I can retrieve and play audio file in cakephp

I am uploading audio files using a CakePHP app. I can move a file successfully to the folder and update its path in the database too, but I cannot retrieve that file successfully. The server replies with a 206 Partial Content status and sometimes with 304 Not Modified. Can anyone help me figure out why the server is responding with such statuses?
You are using Chromium. Mp3 is not supported by either Chromium or Firefox on Ubuntu. You can convert mp3 to ogg to play the files in Chromium and Firefox, or you can install Chrome if you don't want to convert mp3 to ogg. If you do want to convert them, you can use this shell script:
#!/bin/bash
echo "The script convert mp3->ogg or ogg->mp3.";
echo "The script takes one parameter: ";
echo "[mp3ogg] - converting mp3->ogg";
echo "[oggmp3] - converting ogg->mp3";
if [ "$1" = "" ]; then
echo "";
echo "Argument does not exist!!!";
exit 102;
fi
if [ "$1" = "mp3ogg" ]; then
for file in *.mp3; do
avconv -i "$file" "`echo '../ogg/'${file%.mp3}.ogg`";
done
exit 0;
fi
if [ "$1" = "oggmp3" ]; then
for file in *.ogg; do
avconv -i "$file" -acodec libmp3lame "`echo ${file%.ogg}.mp3`";
done
exit 0;
fi
exit 104;
But your only problem is that Chromium and Firefox will not support mp3; on Ubuntu they support only ogg.
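A hypothetical usage example, assuming the script above is saved as convert.sh in the directory that holds the mp3 files; the ../ogg output directory has to exist, because the mp3ogg branch writes into it:
chmod +x convert.sh
mkdir -p ../ogg
./convert.sh mp3ogg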
