SSIS: Unwanted line return in a dynamic connection string

In an SSIS package, I want to send data from several instances to flat files. To do so, I create a dynamic connection string made of 3 variables:
The extension (".txt")
A network path
The file name (which is the instance name variable (string) that I use elsewhere in my package)
When I evaluate my expression at this point, for
TRIM(#[User::FileName]+REPLACE(#[User::ServerName],"\\","")+#[User::ExtensionFile])
I receive:
\\test-01\TEMP\SQL01MyInstance.txt
But when I run the job, it is unable to create SQL01MyInstance.txt, and I receive this error:
[Flat File Destination [11]] Error: Cannot open the datafile "\\test-01\TEMP\SQL01MyInstance
.txt".
[SSIS.Pipeline] Error: Flat File Destination failed the pre-execute phase and returned error code 0xC020200E.
There's an unwanted space at the end of the file name; when I copy and paste the error message elsewhere, it appears to be a line return (before the .txt).
Does anybody know how I can get rid of that line return (which I'm assuming is making the job fail)?
Edit 1:
Rights on the destination folder are OK, because there's another flat file that I create in case of errors, and it is created normally after that failure; but that file does not use a dynamic name (normal behavior).

To remove the line return you can use the REPLACE() function with \r and \n:
REPLACE(REPLACE(TRIM(#[User::FileName]+REPLACE(#[User::ServerName],"\\","")+#[User::ExtensionFile]),"\r",""),"\n","")
where
\r : carriage return
\n : line feed

The TRIM function only trims the space character (versus other functions which trim all white space):
TRIM does not remove white-space characters such as the tab or line feed characters. Unicode provides code points for many different types of spaces, but this function recognizes only the Unicode code point 0x0020. When double-byte character set (DBCS) strings are converted to Unicode they may include space characters other than 0x0020 and the function cannot remove such spaces. To remove all kinds of spaces, you can use the Microsoft Visual Basic .NET Trim method in a script run from the Script component.
https://learn.microsoft.com/en-us/sql/integration-services/expressions/trim-ssis-expression
You can try this first to see if it works (trim each variable first, then concatenate):
TRIM(#[User::FileName]) + TRIM(REPLACE(#[User::ServerName],"\\","")) + TRIM(#[User::ExtensionFile])
If that doesn't work, you'll have to use the String.Trim() method from a Script Task/Component, as the Microsoft article above recommends (again, trim each variable first, then concatenate).
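If you go the script route, a minimal Script Task sketch in C# could look like the following (my assumption, not from the original answer; the three variables must be listed in the task's ReadWriteVariables):
public void Main()
{
    // .NET Trim() strips all leading/trailing white space, including
    // \r, \n and tabs (unlike the SSIS TRIM() expression, which only
    // removes the 0x0020 space character).
    foreach (string name in new[] { "User::FileName", "User::ServerName", "User::ExtensionFile" })
    {
        Dts.Variables[name].Value = Dts.Variables[name].Value.ToString().Trim();
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}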

Related

"Too few data elements" error in Knime CSV Reader

I am receiving the error below while executing a CSV file that includes around 400k rows.
Error:
ERROR CSV Reader 2:1 Execute failed: Too few data elements (line: 2 (Row0), source: 'file:/Users/shobha.dhingra/Desktop/SBC:Non%20SBC/SBC.csv')
I have tried executing another CSV file with a few lines and did not face this issue.
It is not about the number of lines, but about the content of the line (line 2 in your case). It seems your SBC.csv file is not well-formed: either it has extra header content, or the second line is missing the commas representing the missing cells.
You can use the CSV Reader node's Support Short Lines option to let KNIME handle this case by producing missing cells.
I get this error when end-of-line characters exist in a field. You could load the file into a text editor and look for non-printing characters (tabs, carriage returns, etc.) between your delimiters.
If you can't get a clean version of the file, consider using the regex [^ -~] to identify any character that is not a space or a printable ASCII character.
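For example, with grep (forcing the C locale so the [^ -~] range is interpreted byte-wise; SBC.csv is the file from the error above):
LC_ALL=C grep -n '[^ -~]' SBC.csv
Each matching line is printed with its line number, so you can jump straight to the offending record.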
I hope this helps.

Batch file dropping characters

I'm creating a simple batch file that uses the Azure REST API to download data from a blob. If I type the request directly into the command prompt, it works perfectly and my data appears in the directory. However, when I run it as a batch file, it does not work, and I can see in the command line that some characters from the blob connection string (which acts as an access token) have been dropped. I cannot share the full access token, but I can show that the drop happens at the end of the connection string, in what is known as the signature:
correct: "...5U%2BJgo%3D"
batch file output: "...5UBJgoD"
It appears the issue is with the special characters and some numbers. There are no other special characters in the signature, and the other numbers in the signature are not affected.
Other notes:
The connection string is indeed entered within a quoted ("") string
I tried forcing UTF-8 encoding by running chcp 65001 before the request line executes; it didn't work
You should escape each percent sign (%) with a double percent sign (%%). For example, you should type:
"...5U%%2BJgo%%3D"
It is quite useful to search the internet before you post here on Stack Overflow. So, check the links provided:
http://www.robvanderwoude.com/escapechars.php
https://ss64.com/nt/syntax-esc.html
Special Characters in Batch File
Hope this helps!

Unable to create directory in Oracle 12c

I am using Oracle 12.2. I wish to import Data Pump files. To do that, I want to create a directory object for the folder containing the files and then import. I use the following command to create the directory:
CREATE DIRECTORY dpump_dir1 AS ‘D:\dumpdir’;
I am getting this error:
SQL Error: ORA-00911: invalid character
00911. 00000 - "invalid character"
*Cause: identifiers may not start with any ASCII character other than
letters and numbers. $#_ are also allowed after the first
character. Identifiers enclosed by doublequotes may contain
any character other than a doublequote. Alternative quotes
(q'#...#') cannot use spaces, tabs, or carriage returns as
delimiters. For all other contexts, consult the SQL Language
Reference Manual.
Could anybody tell me what is going wrong?
The quotes being used in the code you provided are not simple straight single quotes; it's slightly easier to see when formatted as code:
CREATE DIRECTORY dpump_dir1 AS ‘D:\dumpdir’;
You can also use your text editor, or dump the string, to see which characters it contains:
select dump(q'[CREATE DIRECTORY dpump_dir1 AS ‘D:\dumpdir’;]', 1016) from dual;
DUMP(Q'[CREATEDIRECTORYDPUMP_DIR1AS‘D:\DUMPDIR’;]',1016)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Typ=96 Len=49 CharacterSet=AL32UTF8: 43,52,45,41,54,45,20,44,49,52,45,43,54,4f,52,59,20,64,70,75,6d,70,5f,64,69,72,31,20,41,53,20,20,e2,80,98,44,3a,5c,64,75,6d,70,64,69,72,e2,80,99,3b
You can see that it's reported as 49 bytes despite being 45 characters long, indicating that you have multibyte characters. Before the final semicolon, which is shown as 3b, you have the sequence e2,80,99, which represents the ’ right single quotation mark, and a bit earlier you have the sequence e2,80,98, which represents the ‘ left single quotation mark.
If you use plain quotes it should work:
CREATE DIRECTORY dpump_dir1 AS 'D:\dumpdir';
Presumably you copied and pasted the text from an editor which helpfully substituted curly quotes.

SSIS truncation error only in control flow

I have a package that is giving me a very confusing "Text was truncated or one or more characters had no match in the target code page" error, but only when I run the full package in the control flow, not when I run just the task by itself.
The first task takes CSV files and combines them into one file. The next task reads the output of the previous task and begins to process the records. What is really odd is that the truncation error is thrown by the flat file source in the 2nd step. This is the exact same flat file that was the destination in the previous step.
If there were a truncation error, wouldn't it be thrown by the previous step that created the file? Since the 1st step created the file without truncation, why can't I just read that same file in the very next task?
Note: the only thing that makes this package different from the others I have worked on is that I am dealing with special characters and using code page 65001 (UTF-8) to capture the fields that have special characters. My other packages all referenced flat file connection managers with code page 1252.
The problem was caused by the foreach loop and the ColumnNamesInFirstDataRow expression, where I have the formula "#[User::count_raw_input_rows] < 0". I initialize a variable to -1 and assign the expression to the flat file's ColumnNamesInFirstDataRow property. Inside the loop I update the variable with a row counter on each read of a CSV file. This writes the header the first time through (-1) but then avoids repeating it for all the other CSV files. When I exit the loop and try to read the input file, it treats the header as data and blows up. I only avoided this in my last package because I didn't tighten the column definitions for the flat file like I did with this package. Thanks for the help.
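Schematically, the property expression was:
ColumnNamesInFirstDataRow = #[User::count_raw_input_rows] < 0
With the variable initialized to -1 this evaluates to TRUE for the first CSV only; after the loop the counter is >= 0, so the combined file's header row is treated as data when it is read back.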

psql syntax error, possibly on '-', when importing a dump

(Using postgres 9.4beta2)
I have a dump that I want to import. I have done this with the psql command, as it is noted elsewhere that this is required when using COPY FROM stdin:
psql publishing < publishing.dump.20150211160001
I get this syntax error:
ERROR: syntax error at or near "fbc61bc4"
LINE 1: fbc61bc4-2875-4a3a-8dec-91c8d8b60bcc root
The offending line in the dump file is the one after the COPY statement, here are both those lines together:
COPY content_fragment (id, content, name, content_item_id, entity_version) FROM stdin;
fbc61bc4-2875-4a3a-8dec-91c8dcontent Content for root content fbc61bc4-2875-4a3a-8dec-91c8d8b60bcc 0
The items in the data appear to be tab-separated. Given that the error message says at or near "fbc61bc4" while the full string is "fbc61bc4-2875-4a3a-8dec-91c8dcontent", I am wondering: is psql not liking the '-' character?
This kind of error happens when the COPY itself fails because the table doesn't exist, or one of the mentioned columns doesn't, or the user lacks the permission to write into it, etc.
As the COPY fails, the SQL interpreter continues to the next line and interprets it as if it were an SQL statement, although it's actually data meant to be fed to COPY. Generally this leads to a syntax error, preceded by the error telling why the COPY failed (and often followed by tons of errors if there are many lines of data).
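To make the underlying error easier to spot, you can tell psql to stop at the first failure instead of ploughing on through the data lines, e.g.:
psql -v ON_ERROR_STOP=1 publishing < publishing.dump.20150211160001
With ON_ERROR_STOP set, the last message printed is the one explaining why the COPY itself failed.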
See another question: psql invalid command \N while restore sql, which shares the same root cause and has some useful comments.
