I'm trying to migrate from an Oracle database to PostgreSQL.
Some of the columns in my Oracle database use the CLOB type.
Here are my questions:
Is the TEXT type the PostgreSQL equivalent of CLOB in Oracle?
Are there any risks in converting it directly to TEXT? I believe the TEXT limit is 1 GB in PostgreSQL, while the CLOB limit is 4 GB in Oracle.
Yes, TEXT is a good equivalent in PostgreSQL for CLOB in Oracle. But note that the maximum size for TEXT is roughly 1 GB, whereas the maximum for CLOB is 4 GB.
Almost: a CLOB can be larger (2 GB, if I remember correctly) than TEXT, which tops out at "just" 1 GB.
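If you do the conversion with a script rather than a one-shot dump, it helps to copy rows in batches so large CLOB values aren't all held in memory at once. A minimal sketch in Python; `fetch_rows` and `insert_rows` are hypothetical stand-ins for your driver calls (e.g. an Oracle cursor on the source side and an `executemany` on the PostgreSQL side):

```python
def copy_in_batches(fetch_rows, insert_rows, batch_size=500):
    """Copy rows from a source to a target in fixed-size batches.

    fetch_rows: an iterable yielding source rows (e.g. a cursor over
    "SELECT id, clob_col FROM t" on the Oracle side).
    insert_rows: a callable that writes one batch to the target
    (e.g. an executemany of "INSERT INTO t (id, text_col) ..." on
    the PostgreSQL side). Both names are placeholders.
    """
    batch = []
    copied = 0
    for row in fetch_rows:
        batch.append(row)
        if len(batch) >= batch_size:
            insert_rows(batch)
            copied += len(batch)
            batch = []
    if batch:  # flush the final partial batch
        insert_rows(batch)
        copied += len(batch)
    return copied
```

The batch size is a tuning knob: larger batches mean fewer round trips but more memory per batch, which matters when individual CLOB values run to megabytes.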
Currently, I'm using TDengine in a project and want a BLOB column to store unstructured binary data (e.g. CSV files). Does TDengine support this kind of data type for columns or tags, like MySQL does with TINYBLOB, MEDIUMBLOB, LONGBLOB, etc.?
TDengine supports the BINARY type, with a maximum length of 16 KB.
I am working on requirement to migrate Microsoft SQL Server to HSQL Database.
What is the HSQL Database alternative for SQL Server's varchar(max) data type?
You can use VARCHAR with a large maximum size, for example VARCHAR(1000000). Check the maximum length of the strings in that column in the SQL Server database and use a larger value. If the strings are typically longer than 32000 characters, consider using CLOB instead.
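The "check the maximum length, then pick a larger value" step can be automated once the column's values are fetched. A small sketch; the function name, the headroom factor, and the 32000-character cutoff (taken from the suggestion above) are my own choices:

```python
CLOB_THRESHOLD = 32000  # beyond this, prefer CLOB per the answer above

def suggest_hsqldb_type(values, headroom=2.0):
    """Suggest an HSQLDB column type for string data.

    Scans the observed values (e.g. fetched from the SQL Server
    column being migrated) and returns VARCHAR(n) with some headroom
    over the longest observed string, or CLOB for very long strings.
    """
    longest = max((len(v) for v in values), default=0)
    if longest > CLOB_THRESHOLD:
        return "CLOB"
    # give new data room to grow beyond the longest existing value
    return f"VARCHAR({int(longest * headroom) or 1})"
```

This only reflects the data you have today, so err on the side of generous headroom: HSQLDB stores VARCHAR values at their actual length, so an oversized declared maximum costs little.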
I need to upload some data from an Oracle table to a SQL Server table. The data will be uploaded to SQL Server using a Java process that relies on JDBC.
Is there any benefit in creating the SQL server columns using nvarchar instead of varchar?
Google suggests that nvarchar is used when Unicode characters are involved, but I am wondering whether nvarchar provides any benefit in this situation (i.e. when the source data of the SQL Server table comes from an Oracle database running in a Unix environment).
Thanks in advance
As you have found out, nvarchar stores Unicode characters, the same as NVARCHAR2 in Oracle. It comes down to whether your source data is Unicode, or whether you anticipate having to store Unicode values in the future (e.g. for internationalized software).
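One quick way to settle that question is to scan the source data for non-ASCII characters before choosing the column type. A hedged sketch (the function name is mine, not part of any API):

```python
def needs_nvarchar(values):
    """Return True if any value contains a non-ASCII character.

    If the Oracle source data turns out to be pure ASCII, plain
    varchar is sufficient on the SQL Server side; any character
    outside ASCII is a signal to use nvarchar instead.
    """
    return any(not v.isascii() for v in values)
```

In practice you would run this over a representative sample of the Oracle column (or simply check the source database's character set), since a scan of today's rows can't guarantee what future data will contain.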
The CLOB is XML data that is > 8k (sometimes > 32k). Any suggestions?
Unfortunately I was unable to run it within SQL Server, so I wrote a C# console application to import and parse the CLOB data, then write the results out to SQL Server.
You may have to use the IMAGE type, as the BINARY type is limited to (IIRC) 8000 bytes in SQL Server 2000. The limits for varchar and varbinary increased in SQL Server 2005, so it depends on what your target is. For 2005, if your data is ASCII, varchar will work; if it's Unicode text, use nvarchar; otherwise use varbinary.
If you're looking for sample code, you'll have to give us more information, like what language/platform you're using, and how you're accessing the two databases. Also, is this a one-time transfer, or something that you need to do programmatically in production?
Using just an SQL query, is it possible to write the contents of a varbinary cell to the file system? I have a column that stores PDFs, and for some quick testing I'd like to write the PDFs out to the filesystem.
Thanks for any help.
Similar Question Here.
How to dump all of our images from a VARBINARY(MAX) field in SQL Server 2008 to the filesystem?
You'll need to iterate through each row and perform a BCP (BulkCopy) dump of each varbinary field to the filesystem.
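If BCP isn't convenient, the same per-row dump can be done from a small script. A sketch in Python; the row shape `(filename, blob)` and the query mentioned in the docstring are assumptions, not the asker's actual schema:

```python
import os

def dump_varbinary_rows(rows, out_dir):
    """Write each (filename, blob) pair to a file under out_dir.

    `rows` would typically come from a database cursor over something
    like "SELECT doc_name, pdf_data FROM docs" (both column names are
    assumptions for illustration).
    """
    os.makedirs(out_dir, exist_ok=True)
    written = []
    for name, blob in rows:
        path = os.path.join(out_dir, name)
        with open(path, "wb") as f:
            f.write(blob)  # varbinary arrives client-side as bytes
        written.append(path)
    return written
```

One caveat: if the name column can contain path separators or duplicates, sanitize it (or generate names from the row's primary key) before writing, so rows can't overwrite each other or escape the output directory.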