Error handling with Dapper Plus BulkInsert

I am using the BulkInsert API from Z.Dapper.Plus and it is failing due to bad data. The message is: Parameter value '7900000000000000.000000' is out of range.
Is there a way to improve this message so I can readily identify the specific row causing the issue?

There is no way to improve the message in Dapper Plus. It is an error message from SQL Server telling you that the value cannot be stored in the current data type (pretty similar to this issue: https://stackoverflow.com/a/56349937/5619143).
You already have the offending value, so the easiest way to find the row in error is probably to look in your data source for the entity that has this value, or to add a validation that no value is out of bounds for your data type.
Another way is to set BatchSize = 1 and watch with SQL Profiler for the moment the error is thrown. The captured SQL will give you all the information about the row in error.
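For the validation route, here is a minimal C# sketch. It assumes a hypothetical Order entity whose Amount maps to a SQL Server decimal(18,6) column, which holds at most 12 integer digits; adjust the bound to your actual type. It flags the offending rows before BulkInsert is ever called:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class Order
    {
        public int Id { get; set; }
        public decimal Amount { get; set; }  // hypothetical column mapped to decimal(18,6)
    }

    public static class BulkInsertValidation
    {
        // decimal(18,6) stores at most 18 - 6 = 12 digits before the decimal point,
        // so any |value| >= 10^12 raises "Parameter value ... is out of range".
        private const decimal MaxAbsAmount = 1_000_000_000_000m;

        public static List<Order> FindOutOfRange(IEnumerable<Order> orders) =>
            orders.Where(o => Math.Abs(o.Amount) >= MaxAbsAmount).ToList();
    }

The value from the question, 7900000000000000, has 16 integer digits, so a bound like this would catch it; you can then log those Ids and pass only the clean rows to BulkInsert.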

Related

How to load Informatica Rejected rows due to 'Database errors' to a relational table

While running a mapping I am getting a couple of database errors and the job fails:
1.) Arithmetic overflow error
2.) Conversion failed when converting date and/or time from character string.
This is purely a data issue (a data type error and a data length issue) and I want to reject these records and write them to a separate error table.
The .bad files these records are written to consist of characters which look like junk (',N,N,N,N' AND ',D' AND ',0'); I am not sure on what basis we get these characters.
Do we get these for null values? How do I overcome this and get the exact output?
Is it possible to write these rejected records directly to a relational table (an error table with the same structure as the target table), or is there a workaround to achieve this?
You could use a router transformation to route every record that does not meet your criteria to the error table. This way you handle them before they become bad rows.
Hey Vankat, just looking at your problem: try to filter out the records that don't meet your criteria by putting a condition (data type, length) on a router transformation, and route them to the error table or capture those records in a flat file, as sketched below. Hope this gives you a clear picture.
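For illustration, a router group condition along these lines would divert both kinds of bad records before they hit the target (the column names and limits are hypothetical; IS_DATE and LENGTH are standard PowerCenter expression functions):

    -- Router group "TO_ERROR_TABLE": rows that would fail in the target
    NOT IS_DATE(ORDER_DATE_STR, 'MM/DD/YYYY')  -- would cause the date conversion error
    OR LENGTH(CUSTOMER_NAME) > 50              -- would overflow the target column length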

Make Kingswaysoft truncate input data that is too long

I have an SSIS project that I'm using to automate pulling CRM data into a SQL Server Database using Kingswaysoft. These SSIS packages are autogenerated, so my solution to this issue needs to be compatible with that.
The description field on Contact in CRM is an nvarchar(2000), but this CRM org still has old data, and some of those old contact records have a description longer than 2000 characters. When I try to pull those using Kingsway, I get this error:
Error: 0xC002F304 at Stage Data for contact, Export contact Data [2]: An error occurred with the following error message: "The input value for 'description' field (or one of its related fields) does not fit into the output buffer, please consider increasing the output column's Length property or changing its data type to one that can accommodate more data such as ntext (DT_NTEXT). This change can be done using the component's Advanced Editor window.".
This makes sense, since I'm pulling a column longer than specified in the metadata, but the problem is that I want to ignore this error, truncate the column, and continue the data load. Obviously I could set the column to DT_NTEXT and not worry about it, but since these packages are autogenerated I have no way of knowing beforehand which columns have old data and which don't, so I won't know which should be DT_NTEXT.
So is there a way to make Kingswaysoft truncate input data which is longer than what's specified in the metadata?
Thank you for choosing KingswaySoft as your integration solution. For this situation, unfortunately there is no way to make that work without making those changes in the component’s Advanced Editor.
If the source component simply ignored the error and truncated the value, you would lose some of your data and thus compromise data integrity during the integration. Therefore, you may need to change the data type to DT_NTEXT or increase the length of this field in order to handle this situation properly. Alternatively, you can try to change the field length on your CRM side so that the SSIS package can be generated correctly.

"Conversion failed because the data value overflowed the specified type" error applies to only one column of the same table

I am trying to import data from an Access database file into SQL Server. To do that, I created an SSIS package through the SQL Server Import/Export wizard. All tables passed validation when I executed the package through the Execute Package Utility with the "validate without execution" option checked. However, during the actual execution I received a chunk of errors (posted as a picture in the original question, since a blockquote uses a lot of space).
Upon the investigation, I found exactly the table and the column, which was causing the problem. However, this is problem I have been trying to solve for a couple days now, and I'm running dry on possible options.
Structure of the troubled table column
As noted in the error list, the trouble occurs in the RHF Repairs table, on the Date Returned column. In Access, the column in question is of Date/Time type. Inside the actual table, all inputs are in the form 'mmddyy', which, when clicked on, turns into 'mm/dd/yyyy' format.
In the SSIS package, this produced an OLE DB Source/Destination relationship where, in both the output columns and the external columns, the data type is DT_DATE (I still think this is the key cause of my problems). What bugs me the most is that the column adjacent to Date Returned is exactly the same as described above, yet none of the errors apply to it or to any other column of the same type; Date Returned is literally the only black sheep in the flock.
What have I tried
I have tried every option from the following thread; the error remains the same.
I tried the Data Conversion option, trying to convert this column into a datestamp or even a Unicode string. It didn't work.
I tried specifying the data type in the advanced source editor as both datestamp and Unicode string. I tried specifying it only in the output columns, then in both the external and output columns; same result.
Plowing through the data in the Access table also did not turn up anything; all entries use the same six-character formatting throughout.
At this point I have literally exhausted all the options I could think of. Can you please point me in the right direction on what else I could possibly try to resolve this? It has been driving me nuts for the last two days.
PS: On my end, I will plow through each row individually, while trying not to get discouraged by the fact that there are 4,000+ rows...
UPDATE:
I resolved this matter by plowing through the data. There were 3 faulty entries among the 4,000+ rows... Since the issue was resolved in a manner unlikely to help others, please close this question.
It sounds to me like you have one or more bad dates in the column. With 4,000 rows, I actually would visually scan and look for something very short or very long.
You could change your source to select only the top 1 row instead of all 4,000. Does that row insert? If so, that lends weight to the bad-date scenario. If a single row does not flow through, it is another issue.
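One way to avoid eyeballing all 4,000 rows: SQL Server's datetime type only accepts dates from 1753-01-01 onward, while an Access Date/Time field can go back to the year 100, so a quick query in Access (using the table and column names from the question) should list the overflow candidates directly:

    SELECT *
    FROM [RHF Repairs]
    WHERE [Date Returned] < #1/1/1753#;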
(I will just share my experience, how I overcame this problem, in case it helps someone)
My scenario:
One of the columns, Identifier, in the OLE DB data source had changed from int to bigint. I was getting the error message: Conversion failed because the data value overflowed the specified type.
Basically, it was telling me the source data size was greater than the destination data size.
What I have tried:
In both the OLE DB data source and the destination, I opened "Show Advanced Editor" and checked that the Identifier data type was bigint. But I was still getting the error message.
The solution that worked for me:
In the OLE DB data source --> Show Advanced Editor --> Input and Output Properties --> OLE DB Source Output --> there are two folders: External Columns and Output Columns.
In my case, although the Identifier column under External Columns showed the data type bigint, under Output Columns it showed int. I changed that data type to bigint and it solved my problem.
Now and then I run into this problem, especially when I have a big table with lots of data.
I hope it helps.
We had this error when someone had entered the year as 216 instead of 2016. The data source read the data fine, but it failed on the OLE DB destination task.
We use a script task in the data flow for validation. By adding a check that dates aren't too far in the past, we are able to trap this kind of error and at least generate a meaningful error message, so the problem can be found and corrected quickly.
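As a sketch of that validation, assuming a script component configured as a synchronous transformation with two outputs, GoodRows and ErrorRows, in the same exclusion group (the DateEntered column is hypothetical; the DirectRowTo... methods and the _IsNull property are generated by SSIS from the output and column names):

    // Runs once per row in the data flow.
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Treat a missing date, anything before 1900 (e.g. "216" typed
        // instead of "2016"), or a future date as bad data.
        if (Row.DateEntered_IsNull
            || Row.DateEntered < new DateTime(1900, 1, 1)
            || Row.DateEntered > DateTime.Today)
        {
            Row.DirectRowToErrorRows();   // divert to the error/logging path
        }
        else
        {
            Row.DirectRowToGoodRows();    // continue to the OLE DB destination
        }
    }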

string or binary data would be truncated - but don't know parameter values

The company program logs the "string or binary data would be truncated" error to an error table when trying to insert into a specific table. I realize from other posts that the cause is the current table structure: one or more fields are too short. Unfortunately, I do not have access to the values of the actual query; the stack trace dumps them as #parametername1, #parametername2, etc.
Is there any way I can see, with some kind of monitoring tool in SQL Server 2012, which parameter and value failed? The SQL Server returned the error, so presumably I can log it if I reproduce the error?
You can set up a trace in SQL Profiler. For the statement, it will give you the list of columns and the matching values being inserted.
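On SQL Server 2012 you can also capture this server-side with Extended Events instead of Profiler. A sketch (the session and file names are arbitrary) that fires only on the truncation error, number 8152, and records the statement that raised it:

    -- Error 8152 = "String or binary data would be truncated"
    CREATE EVENT SESSION capture_truncation ON SERVER
    ADD EVENT sqlserver.error_reported (
        ACTION (sqlserver.sql_text, sqlserver.client_app_name)
        WHERE error_number = 8152
    )
    ADD TARGET package0.event_file (SET filename = N'capture_truncation');
    GO
    ALTER EVENT SESSION capture_truncation ON SERVER STATE = START;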

Truncation error SQL Server 2005 Delete/Insert

So I am trying to do a bulk insert with SSIS and continually get:
"Microsoft SQL Native Client" Hresult: 0x80004005 Description: "String or binary data would be truncated."
This happens even though I already have a data conversion for every column into the exact same type as the table the rows are being inserted into. I used a view, and the data looks like it is supposed to just before the DB insert step. I still get the error.
Next I went into SQL Server Management Studio and set up an insert query into that damned table, and still got the same truncation error. I then did SET ANSI_WARNINGS OFF and the insert worked; the data looks good in the table. Now when I try to delete this row, I get the truncation error again.
My question, besides any basic input on the situation, is: how can I turn off ANSI_WARNINGS within SSIS so that the bulk load can go through?
It sounds like you have a column that is too narrow to accept the data you are submitting.
Can you verify whether or not this is the case?
I had a very similar issue arise frequently while we were nailing down a schema with a third party.
Can you select the LEN of all of the columns in the view? That could help find the issue.
Other than that, the only way I have found is to print out a report of the actual lengths of the source data columns.
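For example, a quick LEN report over the view might look like this (the view and column names are hypothetical; compare each result against the destination column's declared size):

    SELECT
        MAX(LEN(CustomerName)) AS max_customer_name,
        MAX(LEN(AddressLine1)) AS max_address_line1,
        MAX(LEN(Notes))        AS max_notes
    FROM dbo.StagingView;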
Sounds like you've got one row (possibly more, but it only takes one!) where a data value exceeds the length of the table column. Doing a data conversion to the shorter type will simply MOVE the error from the Destination to whatever transform does the conversion. What I'd recommend is creating a Flat File Destination and tying the error output of your transforms to it, with the error result changed to 'Redirect Row'. This will allow all the valid rows to go through, and it gives you a copy of the row(s) that are being truncated for you to handle manually.
Are there triggers on the table you're inserting into? Then the error may come from an action that the trigger takes.
It turns out that in SSIS you can set up the OLE DB Destination with Data Access Mode: "Table or view - fast load". When I chose this setting, the bulk insert went through without any warnings or errors, and the data looks perfect in the database. I'm not sure exactly what this change did, but it worked, and after 16 hours on one SSIS insert I'm happy with the results.
Thanks for the suggestions.
