Hash key (MD5) generation for JSON data (ntext) columns in SQL Server - sql-server

We have to generate a hash key column in a table for an incremental load. The table has multiple JSON data (ntext) columns whose lengths vary and can exceed 40,000 characters.
Currently we convert them to varchar and generate the hash from that, but varchar has a length limitation. Could you please suggest an approach?
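
One option, if you are on SQL Server 2016 or later (where HASHBYTES no longer has the 8,000-byte input limit), is to convert the ntext columns to nvarchar(max) and hash the concatenated result. A minimal sketch, assuming a hypothetical table dbo.SourceTable with ntext columns JsonCol1 and JsonCol2:

-- Sketch only: dbo.SourceTable, JsonCol1 and JsonCol2 are placeholder names.
-- Converting ntext to nvarchar(max) avoids the varchar length limitation;
-- CONCAT keeps a NULL column from nulling out the whole hash input.
SELECT
    HASHBYTES(
        'MD5',
        CONCAT(
            CONVERT(NVARCHAR(MAX), JsonCol1), N'|',
            CONVERT(NVARCHAR(MAX), JsonCol2)
        )
    ) AS HashKey
FROM dbo.SourceTable;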

Related

SQL Server 2019 database: consequences of keeping nvarchar column types while changing the collation to Latin1_General_100_CI_AI_SC_UTF8

We need to store a lot of UTF-8 collated data in XML data type columns, and the XML files we store explicitly state encoding='UTF-8', which results in an error when trying to store the XML data in the column.
We are now in the process of switching the default database collation from the prior UTF-16 based Latin1_General_100_CI_AI_SC to Latin1_General_100_CI_AI_SC_UTF8. Do we need to switch all nvarchar columns to varchar in the process? We are not afraid of losing any data, and we (probably) have little beyond a few encoded characters; our data is all in the 'latin' alphabet. Is keeping nvarchar simply going to affect storage (use 2x the size)? Will there be a performance hit on joins? Any other consequences?
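
For the specific error when storing XML that declares encoding='UTF-8', one common workaround (independent of the collation change) is to pass the document through varbinary(max), so the XML parser reads the raw bytes and honors the declared encoding. A minimal sketch, assuming a hypothetical table dbo.XmlStore with an XML column Doc:

-- Sketch only: dbo.XmlStore and Doc are placeholder names.
-- Converting the document to VARBINARY(MAX) first avoids the parser error
-- typically raised when nvarchar input declares encoding="UTF-8".
DECLARE @doc VARCHAR(MAX) =
    '<?xml version="1.0" encoding="UTF-8"?><root>latin text</root>';

INSERT INTO dbo.XmlStore (Doc)
VALUES (CONVERT(XML, CONVERT(VARBINARY(MAX), @doc)));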

Exporting a VARCHAR(MAX) Field from SQL to Excel in SSIS

I'm creating an SSIS package to export data from a SQL Server table into an Excel sheet. My table has one column of datatype VARCHAR(MAX), and this column can sometimes hold as many as 30,000 characters. So I used a Data Conversion block to convert that column's data type from its original text stream [DT_TEXT] to Unicode text stream [DT_NTEXT]. But when I execute the package, I get this error: An error occurred while setting up a binding for the "MyColumnName" column. The binding status was "DT_NTEXT".
I googled a lot but couldn't find an answer to my problem. I'd appreciate any help I can get.
You can skip the Data Conversion task and just force the type when you extract from SQL Server: SELECT CAST(MyBigColumn AS nvarchar(max)) AS MyBigColumnUnicode. But you're going to run into the problem that a cell in Excel cannot hold 100k characters.
Total number of characters that a cell can contain: 32,767
Excel specifications and limits
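
A minimal sketch of that source-query cast, assuming a hypothetical table dbo.MyTable holding the VARCHAR(MAX) column MyBigColumn; the LEFT() is only there to keep values inside Excel's per-cell limit:

-- Sketch only: dbo.MyTable and MyBigColumn are placeholder names.
-- Casting in the source query removes the need for the Data Conversion task;
-- LEFT() trims the value to Excel's 32,767-character cell limit.
SELECT CAST(LEFT(MyBigColumn, 32767) AS NVARCHAR(MAX)) AS MyBigColumnUnicode
FROM dbo.MyTable;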

Convert field from nvarchar to numeric: what SQL function performs best on many rows?

I'm bringing an external database table into SQL Server 2012; it has about 2 million rows and a Comments field of nvarchar(200).
The 1st step moves the data into a staging DB.
The 2nd step moves the data into a data warehouse; the incremental load, once a day, will be about 10,000 rows.
When moving the data to the DW I also want to split the numeric data out of the Comments field into a new column.
I'm thinking of using TRY_CONVERT(FLOAT, col), but want to know whether this is the best function performance-wise or whether there is a better way.
I'm guessing it's best to do this with SQL rather than in SSIS with a Derived Column expression?
I don't have access to the system yet to test, but I'm thinking this over to plan.
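
A minimal sketch of the TRY_CONVERT approach being considered, assuming a hypothetical staging table stg.ExternalComments with the nvarchar(200) Comments column:

-- Sketch only: stg.ExternalComments is a placeholder name.
-- TRY_CONVERT (SQL Server 2012+) returns NULL instead of raising an error
-- when a value cannot be converted, so non-numeric comments simply yield NULL.
SELECT
    Comments,
    TRY_CONVERT(FLOAT, Comments) AS CommentsNumeric
FROM stg.ExternalComments;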

Is there any specific SQL query to convert CLOB data to a string?

One of the columns in the DB holds CLOB data, and I want to validate whether a particular text is present in that CLOB data, but I am not able to achieve it through a SQL query, as the formats differ. Is there any specific SQL query to convert CLOB data to a string?
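
The exact query depends on the database engine; since CLOB is an Oracle data type, a minimal sketch assuming Oracle, with a hypothetical table my_table and CLOB column doc_clob, could search and slice the LOB like this:

-- Sketch only: my_table, id and doc_clob are placeholder names (Oracle syntax).
-- DBMS_LOB.INSTR searches inside the CLOB directly;
-- DBMS_LOB.SUBSTR returns the first 4000 characters as a VARCHAR2 string.
SELECT id,
       DBMS_LOB.SUBSTR(doc_clob, 4000, 1) AS doc_as_string
FROM   my_table
WHERE  DBMS_LOB.INSTR(doc_clob, 'particular text') > 0;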

Data gets truncated while inserting JSON data

I have a SQL table which has 'text' as a datatype. I am trying to insert JSON data, held in a ColdFusion variable, using cfqueryparam with cf_sql_longvarchar.
However, when I check the value in my column, I'm losing data. I compared the total length of my column in the SQL table to the data being held; there is enough length left in that column.
This probably has to do with the settings of your data source in the CF Admin.
I would experiment with the CLOB setting and your char buffer values and see what you can come up with.
