Data gets truncated while inserting JSON data - sql-server

I have a SQL table with a column whose datatype is 'text'. I am trying to insert JSON data, held in a ColdFusion variable, using cfqueryparam with cf_sql_longvarchar.
However, when I checked the value in my column, I'm losing data. I compared the length of the data being stored against the capacity of the column in the SQL table, and there is plenty of room left in that column.

This probably has to do with the settings for your data source in the CF Administrator.
I would experiment with the 'Enable long text retrieval (CLOB)' option and the Long Text Buffer Size value and see what you come up with.
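To confirm where the truncation happens, it can help to compare what was actually stored against what was sent. A minimal check, assuming a table dbo.myTable with a text column json_col (both names are placeholders, not from the original post):

SELECT DATALENGTH(json_col) AS stored_bytes,                -- bytes actually stored; works on text
       LEN(CAST(json_col AS varchar(max))) AS stored_chars  -- LEN() cannot take text directly
FROM dbo.myTable
WHERE id = 1;

If stored_chars always caps out at the same value (the Long Text Buffer Size typically defaults to 64000 characters), that points at the data source settings rather than the column.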

Related

SQL Server 2019: consequences of keeping nvarchar column types while changing the database collation to Latin1_General_100_CI_AI_SC_UTF8

We need to store a large amount of UTF-8 collated data in XML datatype columns, and the XML files we store explicitly state encoding='UTF-8', which results in an error when trying to store the XML data in the column.
We are now in the process of switching the database default collation from the prior UTF-16-based Latin1_General_100_CI_AI_SC to Latin1_General_100_CI_AI_SC_UTF8. Do we need to switch all nvarchar columns to varchar in the process? We are not afraid of losing any data, and we (probably) have nothing beyond a few accented characters; our data is essentially all in the Latin alphabet. Is keeping nvarchar simply going to affect storage (use 2x the size)? Will there be a performance hit on joins? Any other consequences?
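Not a full answer, but the storage part of the question is easy to demonstrate. A minimal sketch of the size difference, runnable anywhere the UTF-8 collation is available (SQL Server 2019+):

DECLARE @v varchar(100) COLLATE Latin1_General_100_CI_AI_SC_UTF8 = 'résumé';
DECLARE @n nvarchar(100) = N'résumé';
SELECT DATALENGTH(@v) AS utf8_bytes,   -- 8: ASCII characters take 1 byte, each é takes 2
       DATALENGTH(@n) AS utf16_bytes;  -- 12: every character takes 2 bytes (4 for supplementary)

nvarchar stays UTF-16 regardless of the database collation, so keeping nvarchar means no space savings from the switch; and mixing varchar and nvarchar columns in joins forces implicit conversions, which is where the performance hits tend to hide.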

I have a problem inserting more than 255 chars per column into an Excel file using INSERT INTO OPENROWSET from SQL Server

I am getting an error while exporting data from SQL Server to an already created .xlsx file using OPENROWSET.
It works fine most of the time, but when a field's data comes in as a large string, the insert into Excel fails with this error:
The statement has been terminated. String or binary data would be truncated.
The data gets inserted into the table, but the insert into Excel raises this error. Please help me find a solution.
As the error says the data "would be truncated", you are probably providing a string value longer than the placeholder or field it is being written to can store.
For example, the source field may have datatype nvarchar(max), but somewhere in your SQL or in an existing mapping the value is assigned to a smaller type. If the source table holds a 5000-character string and it is assigned to an nvarchar(4000) during the process, data truncation will occur.
I would suggest checking the data mappings in your statements.
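A minimal repro of that kind of mapping truncation, with sizes matching the example above:

DECLARE @source nvarchar(max) = REPLICATE(CAST(N'x' AS nvarchar(max)), 5000);  -- CAST to max so REPLICATE isn't capped at 4000 chars
DECLARE @target nvarchar(4000) = @source;  -- variable assignment truncates silently
SELECT LEN(@source) AS source_len,         -- 5000
       LEN(@target) AS target_len;         -- 4000
-- an INSERT into an nvarchar(4000) column raises the
-- "String or binary data would be truncated" error instead of truncating silently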
Regedit: Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Excel
In Regedit, set the TypeGuessRows value under this key to a value greater than 8, for example 100000.
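For reference, the export pattern in question looks like this; the provider string is standard, but the file path, sheet name, and column names are placeholders rather than values from the original post:

-- requires 'Ad Hoc Distributed Queries' to be enabled on the server
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                       'Excel 12.0;Database=C:\exports\report.xlsx;HDR=YES',
                       'SELECT Comment FROM [Sheet1$]')
SELECT comment
FROM dbo.SourceTable;  -- with TypeGuessRows at its default of 8, the driver may infer a
                       -- 255-character text type for the sheet column and reject longer values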

Issue with varbinary(max) datatype in SQL Server

I have created a table with two columns of datatype varbinary(max), in which I save PDF files in binary format. There is no issue inserting the PDF files into these columns. But when I select even a single record, with only one varbinary column in the select list, it takes around one minute to fetch the record. The inserted PDF file is about 1 MB. Here is the SQL query that fetches a single record:
select binarypdffile from gm_packet where jobpacketid=1
Kindly suggest a way to improve performance with the varbinary datatype.
Could you try and time the following queries:
SELECT cnt = COUNT(*) INTO #test1 FROM gm_packet WHERE jobpacketid = 1
SELECT binarypdffile INTO #test2 FROM gm_packet WHERE jobpacketid = 1
The first one tests how long it takes to find the record. If it's slow, add an index on the jobpacketid field, as sketched below. Assuming these values come in sequentially, I wouldn't worry about performance as records get added in the future; otherwise you might need to rebuild the index once in a while.
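If the first query is the slow one, the index would be something along these lines (sketched from the column name in the question):

CREATE NONCLUSTERED INDEX IX_gm_packet_jobpacketid
    ON gm_packet (jobpacketid);
-- a narrow key index: the varbinary(max) blobs stay out of the index pages,
-- so the lookup itself becomes cheap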
The second tests how long it takes to fetch the data from the table (and store it back into another table). Since no data leaves the system, it should show raw database performance without any external influence.
Neither should take very long. If they don't, but it still takes a long time to run your original query in SSMS and get the binary data in the grid, then I'm guessing it's either a network issue (Wi-Fi?) or SSMS simply being very bad at rendering the blob in the GUI; that has been noticed before. =)

Entity Framework - Getting the length of data in a text column

I've hit an error when I've been working on a table that uses a text field.
If I was getting the length of a varchar column, I could do:
var result = (from t in context.tablename select t.fullname.Length);
However, if I run the same query on a text field:
var result = (from t in context.tablename select t.biography.Length);
I get the error:
Argument data type text is invalid for argument 1 of len function.
Having done a bit of reading on the subject, I understand why SQL Server raises this error, but I'm not sure of the best way around it. I know I could return the result and then get the length of the resulting string, but surely there is an easier way of doing this?
I think your best option is to update the column datatype to VARCHAR(MAX) if it is TEXT, or NVARCHAR(MAX) if it is NTEXT. There are plenty of resources on how to do this, but generally you make a new column of [N]VARCHAR(MAX), copy all your data across into the new column, then drop the old column, and finally rename the new column to the old name.
If you can't change the table schema, then you will need to create a view and do the type casting in the select of that view; but then you might as well have just changed the column datatype as mentioned above (unless you're not the db owner and you create the view in a different database). Be mindful that EF doesn't always play as nicely with views as it does with tables.
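A sketch of that migration, reusing the table and column names from the question (the exact names and the GO batch separators are assumptions about your setup):

ALTER TABLE tablename ADD biography_new varchar(max) NULL;
GO
UPDATE tablename
SET biography_new = CAST(biography AS varchar(max));  -- copy the data across
GO
ALTER TABLE tablename DROP COLUMN biography;
EXEC sp_rename 'tablename.biography_new', 'biography', 'COLUMN';

After that, t.biography.Length translates to LEN() on a varchar(max) column and works as expected.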
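If you do go the view route, the cast looks something like this (the view name is my own):

CREATE VIEW tablename_view
AS
SELECT fullname,
       CAST(biography AS varchar(max)) AS biography  -- LEN(), and therefore .Length, works here
FROM tablename;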

CakePHP truncating large varchar columns from SQL Server database

Using CakePHP 1.3.11 and SQL Server 2005 with the included MSSQL database driver.
I need to retrieve a varchar(8000) field; however, typical find() queries truncate this field to 256 characters. The actual array value array['comment'] is truncated, so the data beyond character 256 isn't accessible to my application at all.
I tried changing the field to a text datatype, and with that change the query returns the full value of the column. Is there a way for Cake to read the full value of a varchar column, or does it always truncate varchars to 256 characters?
The solution was to use the text datatype on the database side.
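The schema change itself, sketched with assumed table and column names, done via a new column since direct ALTER COLUMN conversion to text is not reliably supported:

ALTER TABLE comments ADD comment_text text NULL;
GO
UPDATE comments SET comment_text = comment;  -- varchar converts implicitly to text
GO
ALTER TABLE comments DROP COLUMN comment;
EXEC sp_rename 'comments.comment_text', 'comment', 'COLUMN';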
