Insert to CLOB through ODP.NET

I am using ODP.NET with an Oracle database. I have to save data longer than 4000 characters to a CLOB column. When I do this through a simple SQL statement and ExecuteNonQuery, an exception occurs: PLS-00172: string literal too long.
The question is: how do I save this long data?
I cannot use or create procedures; there is no way to get permission. I can only use ODP.NET.
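
Since bind variables are not subject to the string-literal length limits, a common way to do this from ODP.NET (no procedures needed) is to pass the text as a parameter of type OracleDbType.Clob rather than embedding it in the SQL text. A minimal sketch, assuming the managed driver (Oracle.ManagedDataAccess.Client) and a hypothetical table my_table(id NUMBER, payload CLOB); the connection string is also an assumption:

using System;
using Oracle.ManagedDataAccess.Client;

class ClobInsert
{
    static void Main()
    {
        // Hypothetical connection string and table -- adjust for your environment.
        using (var conn = new OracleConnection("User Id=scott;Password=tiger;Data Source=orcl"))
        {
            conn.Open();
            string bigText = new string('x', 100000); // well over the 4000-char literal limit

            using (var cmd = conn.CreateCommand())
            {
                cmd.BindByName = true; // ODP.NET binds by position by default
                // The value travels as a bind variable, not a string literal,
                // so PLS-00172 / ORA-01704 never come into play.
                cmd.CommandText = "INSERT INTO my_table (id, payload) VALUES (:id, :payload)";
                cmd.Parameters.Add("id", OracleDbType.Decimal).Value = 1;
                cmd.Parameters.Add("payload", OracleDbType.Clob).Value = bigText;
                cmd.ExecuteNonQuery();
            }
        }
    }
}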

Related

NTEXT on SQL Server to NVARCHAR2(2000) on Oracle (ORA-12899: value too large for column)

My source is in SQL Server and the target is Oracle. Some tables have columns defined as NTEXT in SQL Server, so I created NVARCHAR2(2000) columns, which allow 4000 bytes, to store the data from the source.
When I pull the NTEXT data from SQL Server, I cast and substring it to fit into 4000 bytes on the target. I'm using IBM DataStage to extract the source data from SQL Server, and the code below converts the data to varchar(4000), taking a substring capped at 3950 characters.
cast(substring([text],1,3950) as varchar(4000)) as "TEXT"
However, an ORA-12899 error often occurs when inserting into the NVARCHAR2(2000) column on Oracle, which is sized at 4000 bytes.
Error message: ORA-12899: value too large for column (actual: 3095, maximum: 2000).
First, it is hard to understand why the error occurs even though the destination column is sized at 4000 bytes and I have already cut the data using SUBSTRING.
Second, I was wondering if I am missing anything in handling this issue, given that my team does not want to use CLOB on Oracle for the NTEXT data.
Please help me to resolve this issue. I'm working on many tables and the error occurs often.
NVARCHAR2(2000) is limited to 2000 characters, which occupy 4000 bytes in the AL16UTF16 national character set. You have to specify the truncation limit in characters (so 2000, not 4000). Try changing your function to:
cast(substr([text],1,2000) as nvarchar2(2000)) as "TEXT"
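
To see why the limit has to be counted in characters, here is the arithmetic as a quick C# sanity check (the string and lengths are hypothetical, chosen to mirror the "actual: 3095" in the error):

using System;
using System.Text;

class CharVsByte
{
    static void Main()
    {
        // Oracle counts NVARCHAR2 length in characters and caps this column at
        // 2000, because the national character set (AL16UTF16) uses 2 bytes
        // per character and the column's byte ceiling is 4000.
        string s = new string('a', 3095);
        Console.WriteLine(s.Length);                                  // 3095 characters
        Console.WriteLine(Encoding.BigEndianUnicode.GetByteCount(s)); // 6190 bytes as UTF-16
        // Truncating to 3950 on the SQL Server side is therefore not enough;
        // the cut has to happen at 2000 characters.
        string truncated = s.Length > 2000 ? s.Substring(0, 2000) : s;
        Console.WriteLine(truncated.Length);                          // 2000
    }
}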

SQL Server to Oracle migration - ORA-12899: value too large for column

While migrating a SQL Server database to Oracle, I end up with an error
ORA-12899: value too large for column
though the datatypes are the same.
This is happening with strings like 'enthält'. NVARCHAR(7) can hold the given string in SQL Server, whereas VARCHAR2(7) on Oracle cannot, and it throws the "value too large" error.
Is this something to do with the encoding on Oracle? How can we resolve this?
Thanks
You can create your Oracle table with something like varchar2(7 char); this causes the length to be allocated in units of characters, not bytes.
This succeeds:
create table tbl(x varchar2(7 char));
insert into tbl values ('enthält');
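
The failing insert comes down to byte arithmetic: in a multibyte database character set such as AL32UTF8 (an assumption; the question does not say which one is in use), 'ä' occupies two bytes, so the seven-character string needs eight bytes and overflows VARCHAR2(7) under the default BYTE length semantics. A quick check of the same arithmetic in C#:

using System;
using System.Text;

class ByteSemantics
{
    static void Main()
    {
        string s = "enthält";
        Console.WriteLine(s.Length);                      // 7 characters
        Console.WriteLine(Encoding.UTF8.GetByteCount(s)); // 8 bytes: 'ä' is 2 bytes in UTF-8
        // VARCHAR2(7) means 7 *bytes* by default, hence ORA-12899;
        // VARCHAR2(7 CHAR) means 7 characters and accepts the string.
    }
}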

Unable to store CLOB data in to CLOB defined column in DB2

It's a repeated issue, I guess, but I couldn't find a proper solution yet. Basically, I am trying to insert a huge XML document, i.e. 32,000+ characters, into a CLOB column through a DB2 procedure. The insert fails with the error below; it looks like DB2 is treating the input as a string literal rather than as a CLOB. Can you please suggest what needs to be done?
SP
CREATE OR REPLACE PROCEDURE logging (IN HEADERDATA CLOB(10M))
LANGUAGE SQL
BEGIN
INSERT INTO Logging(Header) VALUES (HEADERDATA);
COMMIT;
END
Error
The string constant beginning with
"'<?xml version="1.0" encoding="UTF-8"?><XXXXXXXX xmlns:xsi="http:" is too long..
SQLCODE=-102, SQLSTATE=54002, DRIVER=XXXXXX
Character literals in DB2 are limited to about 32K bytes. To handle larger LOBs you need to avoid passing them as SQL literals.
One way to do this without extra programming is to write your [future] CLOB contents to a file and use IMPORT or LOAD to insert the file's contents into the column.
Alternatively, you could wrap a simple Java program around your procedure call where you would use PreparedStatement.setClob() to handle your large XML document.
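If a .NET client is an option instead of Java, the same parameter-binding idea looks roughly like this with IBM's ADO.NET provider (a sketch only; the IBM.Data.DB2 package, connection string, and file name are all assumptions):

using System.IO;
using IBM.Data.DB2; // assumed: IBM's ADO.NET provider for DB2

class CallLogging
{
    static void Main()
    {
        // A bound parameter is not a SQL literal, so the ~32K literal
        // limit behind SQLCODE -102 does not apply to it.
        string xml = File.ReadAllText("payload.xml"); // 32,000+ characters is fine here
        using (var conn = new DB2Connection("Server=dbhost:50000;Database=MYDB;UID=user;PWD=secret"))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                cmd.CommandText = "CALL logging(@HEADERDATA)";
                cmd.Parameters.Add(new DB2Parameter("@HEADERDATA", DB2Type.Clob) { Value = xml });
                cmd.ExecuteNonQuery();
            }
        }
    }
}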

SSIS: XML to Excel export, truncation may happen due to inserting data from data flow column with a length of 4000 to database column with length 255

I'm using SQL 2008 R2 and Excel files in .xlsx format. I'm trying to export data to Excel through an XML source. Everything is working fine except for columns where the data is more than 255 characters. I get a warning:
truncation may happen due to inserting data from data flow column with
a length of 4000 to database column with length 255
I've tried changing the registry and the IMEX value as per the blog, but that doesn't seem to help. Also, my connection string (Excel Connection Manager) is:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=Extract_8202014.xlsx;Extended
Properties="EXCEL 12.0 XML;HDR=YES";
As per MSDN, for memo-type columns we should define the datatype as LongText if the table is created at run time. I've done the same, but it's not helping.

CakePHP truncating large varchar columns from SQL Server database

Using CakePHP 1.3.11 and SQL Server 2005 and the included MSSQL database driver.
I need to retrieve a varchar(8000) field; however, the typical find() queries truncate this field to 256 characters. The actual array value array['comment'] is truncated, so the data beyond character 256 isn't accessed by my application at all.
I tried changing the field to a text datatype, and with that change the query returns the full value of the column. Is there a way for Cake to read the full value of the column, or does it always truncate varchars to 256 characters?
The solution has been to use the text data type on the database side.
