Error on loading csv to SybaseIQ using dbisql - sybase

I am trying to upload a bunch of CSVs to Sybase IQ using dbisql's LOAD command.
My CSVs look like this:
"ps,abc","jgh","kyj"
"gh",""jkl","qr,t"
and my Load command and other options are these:
set temporary option ISQL_LOG = '/path/for/log/file';
set temporary option DATE_ORDER = 'MDY';
Load table test (a, b, c)
Format bcp
Strip RTRIM
Escapes OFF
quotes ON
Delimited by ','
I put the statements above in a '.ctl' file and then execute it with the following command:
dbisql -c "uid = xxx; pwd = yyyy" -host aaaa -port 1089 -nogui test.ctl
On execution I get the following error:
Non-space text found after ending quote character for an enclosed field.
--(db_RecScanner.cxx 2473)
SQLCODE=-1005014, ODBC 3 state="HY000"
HOWEVER
It works fine with the following csv format
"ps,abc","jgh",kyj
"gh",""jkl",qrt
i.e. if the last column is not quoted, the load works. But all my files have double quotes around every field.
Also the following .ctl file
set temporary option ISQL_LOG = '/path/for/log/file';
set temporary option DATE_ORDER = 'MDY';
Load table test (a ASCII(8), b ASCII(7), c ASCII(7))
Strip RTRIM
Escapes OFF
quotes ON
Delimited by ','
is able to insert the data from the first CSV sample into the database, but it keeps the quotes and the commas and also mangles the data.
For example, the first row in the db would look like
"ps,abc" ,"jgh", "kyj""g
as opposed to
ps,abc jgh kyj
I am going nuts trying to figure out what the issue is. I have read the manuals for Sybase and dbisql, and according to them the first control file should load the data properly, but it doesn't. Any help on this will be appreciated. Thanks in advance.
Also my table structure is like this:
a(varchar(8)) b(varchar(7)) c(varchar(7))
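One sanity check is to run the file through a strict CSV parser before loading: it flags the stray quote in ""jkl" in the second sample row, which is exactly the kind of "non-space text after ending quote" that IQ complains about. A throwaway Python helper (function name and inlined data are mine, for illustration only):

```python
import csv

def find_bad_csv_lines(lines):
    """Return (line_number, error_message) pairs for lines that a
    strict CSV parser rejects, e.g. text after a closing quote."""
    bad = []
    for i, line in enumerate(lines, 1):
        try:
            next(csv.reader([line], strict=True))
        except csv.Error as exc:
            bad.append((i, str(exc)))
    return bad

sample = ['"ps,abc","jgh","kyj"',   # well-formed
          '"gh",""jkl","qr,t"']     # stray quote before jkl
print(find_bad_csv_lines(sample))   # flags line 2
```

If every line passes such a check and IQ still errors out, one other possibility worth ruling out is Windows line endings: with ROW DELIMITED BY '\n', a trailing \r lands right after the closing quote of the last field, while an unquoted last field would have it silently trimmed, which would match the behaviour described above.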

Related

Extra quotes from exporting from SQL Server to CSV

I am getting extra double quotes (") when exporting to CSV using an "INSERT INTO OPENROWSET" command from a SQL Server stored procedure (this is part of a C# .NET Core 2 automation program, so it needs to run without any intervention).
When I run:
SELECT data1 as DeleteThisLine
FROM tmpExportData
I get the following results:
DeleteThisLine
-----------------------------------------------------
"#HDR","#BATCH",20190611,Date Range: 06/11-06/19/2019
"#HDR","JOURNAL",,
If I right click and save as .csv, the resulting file looks exactly like the above results.
However when I run the "INSERT INTO OPENROWSET" command to export the results into a csv file:
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0','Text;Database=\\FileSever229\file\;HDR=YES;H=-1;FMT=Delimited','SELECT * FROM [ExportFile_06_24_120522.csv]')
SELECT data1 AS DeleteThisLine
FROM tmpExportData
It exports the file, but with extra double quotes at the beginning and end of every line and doubled quotes around the first 2 fields:
DeleteThisLine
-----------------------------------------------------------
"""#HDR"",""#BATCH"",20190611,Date Range: 06/11-06/19/2019"
"""#HDR"",""JOURNAL"",,"
How do I eliminate these extra quotes?
Background:
This is an unusual situation where I am creating a Batch file that is to be uploaded into an accounting system. It holds all that information in a single cell (data1) as each row in the upload file has a different number of columns and each row has a set of variables of different types. So all the data is in a single column. It was working great, and then these extra quotes appeared. I must have changed something in my "insert into OPENROWSET" command.
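No accepted fix is shown here, but if adjusting the OPENROWSET provider options doesn't pan out, the added quoting is mechanical enough to strip in a post-processing pass: the text driver wraps each line in quotes and doubles any quotes inside it. A minimal Python sketch of that reversal (helper name is mine, sample line taken from the question):

```python
def unwrap_line(line):
    """Undo one layer of quoting added by the text driver:
    drop the outer quotes, then collapse doubled inner quotes."""
    if line.startswith('"') and line.endswith('"'):
        line = line[1:-1]
    return line.replace('""', '"')

print(unwrap_line('"""#HDR"",""#BATCH"",20190611,Date Range: 06/11-06/19/2019"'))
# "#HDR","#BATCH",20190611,Date Range: 06/11-06/19/2019
```

This assumes no field legitimately contains doubled quotes of its own; it is a cleanup sketch, not a fix for the OPENROWSET settings themselves.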

Load table issue - BCP from flat file - Sybase IQ

I am getting the below error while trying to do bcp from a flat delimited file into Sybase IQ table.
Could not execute statement.
Non-space text found after ending quote character for an enclosed field.
I can't see any non-space text in the file, but this error is stopping me from doing the bulk copy. The column delimiter is |, the text qualifier is ", and the row delimiter is \n.
Below is the LOAD TABLE template I am using:
LOAD TABLE TABLE_NAME(a NULL('(null)'),b NULL('(null)'),c NULL('(null)'))
USING CLIENT FILE '/home/...../a.txt' //unix
QUOTES ON
FORMAT bcp
STRIP RTRIM
DELIMITED BY '|'
ROW DELIMITED BY '\n'
When I run the same LOAD with QUOTES OFF, it succeeds, but the same query fails with QUOTES ON. I would also like the quotes stripped off.
Sample Data
12345|"abcde"|(null)
12346|"abcdf"|"zxf"
12347|(null)|(null)
12348|"abcdg"|"zyf"
Any leads would be helpful!
If IQ bcp is the same as ASE, then I think those '(null)' fields are being interpreted as strings, not fields that are NULL.
You'd need to stream edit out those (null).
You're on unix, so use sed or perl.
E.g. pipe the file through perl -pne 's/\(null\)//g' before feeding it to the load; note the parentheses must be escaped, or the regex will strip only the word null and leave the parentheses behind.
QUOTES OFF might seem to work, but check the loaded data: I suspect you'll find double quotes inside the 2nd field and the literal string '(null)' where you expect a field to be NULL.
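The same stream edit can be sketched in Python; as with the perl version, the parentheses have to be escaped so the literal marker '(null)' is matched (helper name and inlined sample row are mine):

```python
import re

def strip_null_markers(line):
    """Remove literal '(null)' markers so the loader sees empty
    fields; r'\(null\)' escapes the regex metacharacters ( and )."""
    return re.sub(r'\(null\)', '', line)

print(strip_null_markers('12345|"abcde"|(null)'))   # 12345|"abcde"|
```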

Import CSV data into SQL Server

I have data in the csv file similar to this:
Name,Age,Location,Score
"Bob, B",34,Boston,0
"Mike, M",76,Miami,678
"Rachel, R",17,Richmond,"1,234"
While trying to BULK INSERT this data into a SQL Server table, I encountered two problems.
If I use FIELDTERMINATOR=',' then it splits the first (and sometimes the last) column
The last column is an integer column but it has quotes and comma thousand separator whenever the number is greater than 1000
Is there a way to import this data (using XML Format File or whatever) without manually parsing the csv file first?
I appreciate any help. Thanks.
You can parse the file with FileHelpers: http://filehelpers.sourceforge.net/
With that result, use the approach from "SQL Bulkcopy YYYYMMDD problem", or feed it straight into SqlBulkCopy.
Use MySQL load data:
LOAD DATA LOCAL INFILE 'path-to-/filename.csv' INTO TABLE `sql_tablename`
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
IGNORE 1 LINES;
The OPTIONALLY ENCLOSED BY '\"' part (the escape character plus quote) keeps the comma-containing data in the first column together as a single field.
IGNORE 1 LINES leaves the header row out.
The CHARACTER SET 'utf8' line is optional but good to use if names contain diacritics, as in José.
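If pre-processing the file is acceptable instead, both problems from the question (embedded commas and the quoted thousands separator in Score) can be normalised before BULK INSERT. A short Python sketch, assuming the column layout from the sample and picking pipe as a delimiter not present in the data:

```python
import csv
import io

def normalise(csv_text):
    """Parse the quoted CSV, strip the thousands separator from the
    last (Score) column, and re-emit pipe-delimited rows."""
    rows = []
    for row in csv.reader(io.StringIO(csv_text)):
        row[-1] = row[-1].replace(',', '')   # "1,234" -> "1234"
        rows.append('|'.join(row))
    return '\n'.join(rows)

sample = 'Name,Age,Location,Score\n"Rachel, R",17,Richmond,"1,234"\n'
print(normalise(sample))
```

The output can then be loaded with FIELDTERMINATOR='|' and no quote handling at all.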

SQLite define output field separator on a single line command

I need to use a single command line to fetch a set of records from a database.
If I do this:
$ sqlite3 con.db "SELECT name,cell,email FROM contacts"
I get output with a separator "|", where the output looks like this:
Alison|+12345678|alison@mail.com
Ben|+23456789|ben@mail.com
Steve|+34567890|steve@mail.com
Is there a way (still as a single command line, like the one above) to change the output field separator to something else, such as ";;;" or another more unique string? I ask because the output occasionally contains the character "|" inside the records, and that causes issues.
My desired result is:
Alison;;;+12345678;;;alison@mail.com
Ben;;;+23456789;;;ben@mail.com
Steve;;;+34567890;;;steve@mail.com
Or any other unique separator, which is not likely to be found inside the values.
(The command is executed on a Linux machine)
Thank you.
The -separator option does what you want:
sqlite3 -separator ';;;' con.db "SELECT ..."
The only way to format the output so that you are guaranteed to not get the separator in the values is to quote all strings:
sqlite3 con.db "SELECT quote(name), quote(cell), quote(email) FROM contacts"
However, this would require you to parse the output according to the SQL syntax rules.
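If a scripting language is available, another route is to skip the CLI's text output entirely and read rows through a driver, joining fields with whatever separator you like. A minimal Python sketch (an in-memory database with one made-up row stands in for con.db):

```python
import sqlite3

conn = sqlite3.connect(':memory:')          # stands in for con.db
conn.execute('CREATE TABLE contacts (name TEXT, cell TEXT, email TEXT)')
conn.execute("INSERT INTO contacts VALUES ('Alison', '+12345678', 'alison@mail.com')")

rows = [';;;'.join(r) for r in conn.execute('SELECT name, cell, email FROM contacts')]
print('\n'.join(rows))                      # Alison;;;+12345678;;;alison@mail.com
```

Since the fields never pass through a delimited text format, separator collisions cannot occur at all.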

Commas within CSV Data

I have a CSV file which I am importing directly into a SQL Server table. In the CSV file each column is separated by a comma. But my problem is that I have a column "address", and the data in this column contains commas. So some of the data from the address column spills into the other columns while importing into SQL Server.
What should I do?
For this problem the solution is very simple.
Steps are:
first select => flat file source => browse your file => set the "Text qualifier" (by default it is none) to a double quote (") => then follow the instructions of the wizard.
Good Luck
If there is a comma in a column, then that column should be surrounded by a single quote or double quote. Then if inside that column there is a single or double quote, it should have an escape character before it, usually a \.
Example format of CSV
ID - address - name
1, "Some Address, Some Street, 10452", 'David O\'Brian'
Newer versions of SQL Server (2017 and later) support the CSV format fully in BULK INSERT, including mixed use of " and ,:
BULK INSERT Sales.Orders
FROM '\\SystemX\DiskZ\Sales\data\orders.csv'
WITH ( FORMAT='CSV');
I'd suggest either using another format than CSV, or trying other characters as field separator and/or text delimiter. Look for a character that isn't used in your data, e.g. |, ^ or #. The format of a single row would become
|foo|,|bar|,|baz, qux|
A well-behaved parser must not interpret 'baz' and 'qux' as two separate columns.
Alternatively, you could write your own import voodoo that fixes any problems. For the latter, you might find this Groovy skeleton useful (not sure which languages you're fluent in, though).
Most systems, including Excel, will allow for the column data to be enclosed in single quotes...
col1,col2,col3
'test1','my test2, with comma',test3
Another alternative is to use the Macintosh version of CSV, which uses tabs as delimiters.
The best, quickest and easiest way to resolve the comma in data issue is to use Excel to save a comma separated file after having set Windows' list separator setting to something other than a comma (such as a pipe). This will then generate a pipe (or whatever) separated file for you that you can then import. This is described here.
I don't think adding quotes would help. The best way I can suggest is replacing the commas in the content with another mark, such as a space:
replace(COLUMN,',',' ') as COLUMN
Appending a quotation mark to the selected column on both sides works. You must also cast the column as NVARCHAR(MAX) to turn it into a string if the column is a TEXT.
SQLCMD -S DB-SERVER -E -Q "set nocount on; set ansi_warnings off; SELECT '""' + cast ([Column1] as nvarchar(max)) + '""' As TextHere, [Column2] As NormalColumn FROM [Database].[dbo].[Table]" /o output.tmp /s "," -W
