Select text column from table in SQL Server stored procedure - sql-server

I am having difficulty figuring this out. I have an incident table that contains the columns id, comments, incidentdate, and incidentdescID. There are 10 years' worth of data in this table. I wrote a stored procedure to extract the last 4 years' worth of data, but I am running into the following error.
Msg 8152, Level 16, State 10, Line 27
String or binary data would be truncated.
When I change the date range for the incident to be between 2015 and 2016, I do not get an error. When I change it to be between 2017 and 2018, I still do not get an error. But when I change it to be between 2016 and 2017, I get the error. Also, when I comment out the comments column, I do not get an error no matter what date range I use.
So I was thinking there might be a special character in the Comments column, which is a text column in the Incident table. If that is the case, how would I be able to select that column but remove the special characters in the stored procedure, without making changes to the table?

If you suspect your "Comments" column is the culprit, then, my friend, you can search it for junk values. I got this error once and fixed it by replacing CHAR(10) and CHAR(13) with blanks.
SELECT REPLACE(REPLACE(tbl.comments, CHAR(10), '*JUNK*'), CHAR(13), '*JUNK*') AS CleanedComments
FROM [your table name] tbl
Copy your query result into any editor and search for records containing the JUNK marker.
This typically happens when you are importing data from Excel or from source tables with an NVARCHAR data type, while your destination is a CSV or accepts only VARCHAR.
If that is your case, then you simply need to apply the REPLACE function to your column(s) in your procedure, as sketched below.
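A minimal sketch of that approach, using the Incident table and column names from the question (the exact date filter is an assumption), with a cast added because REPLACE does not accept the text data type directly:
SELECT id,
       incidentdate,
       incidentdescID,
       REPLACE(REPLACE(CAST(comments AS VARCHAR(MAX)), CHAR(13), ''), CHAR(10), '') AS comments  -- strip CR/LF
FROM Incident
WHERE incidentdate >= DATEADD(YEAR, -4, GETDATE());  -- last 4 years, as described in the question
If the truncation is caused by line breaks or other junk characters padding the value past the target column's size, stripping them this way avoids touching the table itself.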

Related

SQL Server: Error converting data type varchar to numeric (Strange Behaviour)

I'm working on a legacy system using SQL Server in 2000 compatibility mode. There's a stored procedure that selects from a query into a virtual table.
When I run the query, I get the following error:
Error converting data type varchar to numeric
which initially tells me that something stringy is trying to make its way into a numeric column.
To debug, I created the virtual table as a physical table and started eliminating each column.
The culprit column is called accnum, which stores a bank account number and has a source data type of varchar(21); I'm trying to insert it into a numeric(16,0) column, which obviously could cause issues.
So I made the accnum column varchar(21) as well in the physical table I created and it imports 100%. I also added an additional column called accnum2 and made it numeric(16,0).
After the data is imported, I proceeded to update accnum2 to the value of accnum. Lo and behold, it updates without an error, yet it wouldn't work with an insert into...select query.
I have to work with the data types provided. Any ideas how I can get around this?
Can you try to use a conversion in your insert statement, like this:
SELECT [accnum] = CASE ISNUMERIC(accnum)
                      WHEN 0 THEN NULL
                      ELSE CAST(accnum AS NUMERIC(16, 0))
                  END
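For instance, applied to the accnum2 column described in the question (the table name ImportedAccounts is made up for illustration), the same expression could drive the update:
UPDATE ImportedAccounts
SET accnum2 = CASE ISNUMERIC(accnum)
                  WHEN 0 THEN NULL
                  ELSE CAST(accnum AS NUMERIC(16, 0))
              END;  -- non-numeric account numbers become NULL instead of raising a conversion error
Bear in mind that ISNUMERIC returns 1 for some strings (such as '$' or '1e5') that still cannot be cast to NUMERIC(16,0), so it is only a rough filter.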

Bulk Load Data Conversion Error - Can't Find Answer

For some reason I keep receiving the following error when trying to bulk insert a CSV file into SQL Express:
Bulk load data conversion error (type mismatch or invalid character for the
specified codepage) for row 2, column 75 (Delta_SM_RR).
Msg 4864, Level 16, State 1, Line 89
Bulk load data conversion error (type mismatch or invalid character for the
specified codepage) for row 3, column 75 (Delta_SM_RR).
Msg 4864, Level 16, State 1, Line 89
Bulk load data conversion error (type mismatch or invalid character for the
specified codepage) for row 4, column 75 (Delta_SM_RR).
... etc.
I have been attempting to insert this column as both decimal and numeric, and keep receiving this same error (if I take out this column, the same error appears for the subsequent column).
Please see below for an example of the data, all data points within this column contain decimals and are all rounded after the third decimal point:
Delta_SM_RR
168.64
146.17
95.07
79.85
60.52
61.03
-4.11
-59.57
1563.09
354.36
114.78
253.46
451.5
Any sort of help or advice would be greatly appreciated, as it seems that a number of people on SO have come across this issue. Also, if anyone knows of another automated way to load a CSV into SSMS, that would be a great help as well.
Edits:
Create Table Example_Table
(
    [Col_1] varchar(255),
    [Col_2] numeric(10,5),
    [Col_3] numeric(10,5),
    [Col_4] numeric(10,5),
    [Col_5] date,
    [Delta_SM_RR] numeric(10,5)
)
GO
BULK INSERT Example_Table
FROM 'C:\pathway\file.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
);
Table Schema - This is a standalone table (further calculations and additional tables are built off of this single table, however at the time of bulk insert it is the only table)
It's likely that your data has an error in it. That is, that there is a character or value that can't be converted explicitly to NUMERIC or DECIMAL. One way to check this and fix it is to
Change [Delta_SM_RR] numeric(10,5) to [Delta_SM_RR] nvarchar(256)
Run the bulk insert
Find your error row: select * from Example_Table where [Delta_SM_RR] like '%[^-.0-9]%'
Fix the data at the source, or delete from Example_Table where [Delta_SM_RR] like '%[^-.0-9]%'
The last two statements return or delete rows where there is something other than a digit, period, or hyphen.
For your date column you can follow the same logic as above: change the column to VARCHAR, and then use ISDATE() to find the values which can't be converted.
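For example, assuming the Col_5 date column from the example table above were loaded as VARCHAR, the unconvertible rows could be found with something like:
SELECT *
FROM Example_Table
WHERE Col_5 IS NOT NULL
  AND ISDATE(Col_5) = 0;  -- 0 means the string cannot be interpreted as a date under the current language/dateformat settings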
I'll bet anything there is some weird character in your data set. Open your data set in Notepad++ and view the data. Any aberration should become apparent very quickly! The problem is coming from Col75 and it's affecting the first several rows, and thus everything that comes after that also fails to load.
Make sure that .csv is not using text qualifiers and that none of your fields in the .csv have a comma inside the desired value.
I am struggling with this issue right now. The issue is that I have a 68 column report I am trying to import.
Column 17 is a "Description" column that has a double quote text qualifier on top of the comma delimitation.
A bulk insert with a comma field terminator won't recognize the double-quote text qualifier and will munge all of the data to the right of the offending column.
It looks like, to overcome this, you need to create a .fmt file to instruct the Bulk Insert which columns it needs to treat as simply delimited, and which columns it needs to treat as delimited and qualified (see this answer).
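As an illustration only (the version line, column names, and lengths here are made up), a non-XML format file for a hypothetical three-column file in which only the middle Description column is wrapped in double quotes might look roughly like this:
12.0
3
1   SQLCHAR   0   255   ",\""    1   Col_1         SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   512   "\","    2   Description   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   255   "\r\n"   3   Col_3         SQL_Latin1_General_CP1_CI_AS
The idea is that the opening quote is consumed as part of the previous field's terminator and the closing quote as part of the qualified field's own terminator; the file is then referenced from BULK INSERT with the FORMATFILE option, e.g. WITH (FORMATFILE = 'C:\pathway\file.fmt').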

Dynamic SQL Insert: Column name or number of supplied values does not match table definition

I'm encountering some strange behavior with a dynamic SQL query.
In a stored procedure I construct an insert query string out of multiple strings, and I execute the insert query in the SP like this, because of the length restriction on a single nvarchar variable:
EXEC(@QuerySelectPT+@QueryFromPT+@QueryFromPT)
If I print each part of the query, put the parts together and execute them manually in Management Studio, the query works fine and inserts the data. But if I execute the query via EXEC() in the stored procedure, I get a
Column name or number of supplied values does not match table definition.
error message.
I did multiple checks on the number and spelling of the columns in my query and in my insert table, but I have not found any differences so far.
Any advice?
Your count of columns for the insert is different from the count of columns for the select. Print the statement before the EXEC and find the error.
It's a shot in the dark, but since you say the queries are valid and the final query works when you build and run it manually, the issue could be caused by string truncation.
Could you try:
EXEC(CAST(@QuerySelectPT AS VARCHAR(MAX))+@QueryFromPT+@QueryFromPT);
Also, as Management Studio's Messages tab and SELECT output are limited to about 4000 characters, I think, you can test whether the whole query is assembled correctly like this:
SELECT CAST(@QuerySelectPT+@QueryFromPT+@QueryFromPT AS XML)
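A minimal sketch of the truncation-safe pattern, with made-up query fragments and table names standing in for the real parts:
DECLARE @QuerySelectPT NVARCHAR(MAX) = N'INSERT INTO dbo.TargetTable (Col1, Col2) SELECT Col1, Col2 ';  -- hypothetical fragment
DECLARE @QueryFromPT NVARCHAR(MAX) = N'FROM dbo.SourceTable WHERE Col1 IS NOT NULL;';  -- hypothetical fragment
DECLARE @FullQuery NVARCHAR(MAX) = @QuerySelectPT + @QueryFromPT;  -- concatenation of MAX types stays MAX
EXEC sp_executesql @FullQuery;
Because every part is declared as NVARCHAR(MAX), the concatenation cannot silently truncate the statement before it reaches the execution call.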

Date Conversion Issue MS Access to SQL Server

I'm creating a table B from an existing table A. In table A I have a column ValDate which is varchar and contains dates. When I try to create table B, I have a function used in the query as given below, and I get a conversion error. Table A contains null values as well.
MS Access:
((DateDiff("d",Date(),format(Replace(Replace([Table A].ValDate,".","/"),"00/00/0000","00:00:00"),"dd/mm/yyyy")))>0)).
Tables were in MS Access and are being migrated to SQL Server 2012.
SQL Server:
((DATEDIFF(day,FORMAT( GETDATE(), 'dd-MM-yyyy', 'en-US' ),FORMAT( ValDate, 'dd-MM-yyyy', 'en-US' ))>0))
or
((DateDiff(day,GETDATE(),Format(Replace(Replace([TableA].[ValidFrom],'.','/'),'00/00/0000','00:00:00'),'dd/mm/yyyy')))
I tried converting the date using several approaches like Convert, Format and Cast, but I end up getting the error below.
Msg 8116, Level 16, State 1, Line 1
Argument data type date is invalid for argument 1 of isdate function.
I would really appreciate someone telling me what I'm missing here.
Since you have date data in a string field, it is very likely that you have some values that are not valid against your expected date format.
Copy the data into a SQL Server table and then check and validate the content of the string field.
Have a look at the TRY_CONVERT function, which can be helpful when checking the content of the string field containing the date values; a sketch follows at the end of this answer.
Once bad data is ruled out, you can apply your formula again with (hopefully) a different result.
A better solution would be to create a separate field with an appropriate data type to store the date values converted from the string field, and to apply your logic to that field.
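A minimal sketch of that TRY_CONVERT check, assuming the varchar values look like dd.mm.yyyy (as the Replace('.','/') in the question suggests) and that you are on SQL Server 2012 or later:
SELECT ValDate
FROM [Table A]
WHERE ValDate IS NOT NULL
  AND TRY_CONVERT(date, REPLACE(ValDate, '.', '/'), 103) IS NULL;  -- style 103 = dd/mm/yyyy; NULL marks values that cannot be converted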

fetching master table data, getting error

I want to get text values from a master table corresponding to a string (a comma-separated list of master table ids) stored in another table.
I am trying:
select maritialtype from tblmastermaritialstatus where MaritalStatusId in (select MaritalStatusId from tblPartnerBasicDetail where userid=1)
MaritalStatusId in tblPartnerBasicDetail is a string like '1,2,3'.
I am getting the error:
Msg 245, Level 16, State 1, Line 1 Conversion failed when converting
the varchar value '1,2,3' to data type tinyint.
How do I resolve it?
Comma-separated nvarchar data is not the same as a list of comma-separated integers.
You are doing something similar to:
WHERE 1 IN ('1,2,3')
1 is an integer, '1,2,3' is a single string (which cannot be implicitly converted). Therefore you are getting an error.
I would recommend normalising your data so that there is no need for comma seperated values.
In the long run this will save you a lot of issues.
However, if you wish to stick with CSV, you may find this article helpful:
http://www.nigelrivett.net/SQLTsql/InCsvStringParameter.html
Check the fn_ParseCSVString part specifically; a simpler alternative is sketched below.
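If you are on SQL Server 2016 or later, a minimal sketch of the parsing approach using the built-in STRING_SPLIT function (rather than the fn_ParseCSVString helper from the article), with the table and column names from the question, could look like this:
SELECT m.maritialtype
FROM tblmastermaritialstatus AS m
WHERE m.MaritalStatusId IN (
    SELECT CAST(s.value AS tinyint)          -- each id from the CSV, converted back to tinyint
    FROM tblPartnerBasicDetail AS p
    CROSS APPLY STRING_SPLIT(p.MaritalStatusId, ',') AS s
    WHERE p.userid = 1
);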
