SSIS Adds Date to time field upon exporting to Excel - sql-server

I am trying to run an SSIS package to take some logging data and export it into Excel for later use with a BI tool. The data has 3 time fields: a start time, a finish time, and a run time. They appear to be correct coming out of my script component; everything looks perfect when I use the data viewer tool.
However, when I open the export in Excel, each value in the Run Time column has a date prepended to the time. I am not sure what is causing this or how to fix it. The only thing I was able to notice was a property in the source's Advanced Editor that sets the column data type to date.
But every time I try to change it to DT_DBTIME (the same type coming out of the script), it just switches back to the date data type.
Is there a way to prevent the adding of this date? It makes the use of the BI tool impossible. Any help would be greatly appreciated.

That seems like odd behavior to me, but try adding a Data Conversion Transformation to your package. This should force whatever type of data you want, either string or time.

Have you tried DT_WSTR(1252) to cast the time using a Data Conversion Transformation?
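If the conversion keeps reverting, another option is to do the formatting in the script component itself and hand Excel a plain string, so the connection manager has nothing to re-type as a date. A minimal sketch, assuming the time comes out of a C# script component as a TimeSpan and a hypothetical DT_WSTR output column named RunTimeString (both names are assumptions, not from the original package):

// Hypothetical sketch: emit the run time as text so the Excel
// connection manager cannot reinterpret it as a date/time.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    TimeSpan runTime = Row.RunTime;                       // DT_DBTIME surfaces as TimeSpan
    Row.RunTimeString = runTime.ToString(@"hh\:mm\:ss");  // e.g. "01:23:45"
}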

I found the issue. It had something to do with the Excel connection manager auto-recognizing that field as a date/time field and therefore exporting it in that format. This change was happening in the connection between the final component and the destination, so casting did not work, since the change happened after it.
I simply changed the .xls file to a .csv, used the Flat File connection manager instead, and that did the trick!

Related

Make KingswaySoft truncate input data that is too long

I have an SSIS project that I'm using to automate pulling CRM data into a SQL Server database using KingswaySoft. These SSIS packages are autogenerated, so my solution to this issue needs to be compatible with that.
The description field on Contact in CRM is an nvarchar(2000), but this CRM org still has old data, and some of those old contact records have a description longer than 2000 characters. When I try to pull those using Kingsway, I get this error:
Error: 0xC002F304 at Stage Data for contact, Export contact Data [2]: An error occurred with the following error message: "The input value for 'description' field (or one of its related fields) does not fit into the output buffer, please consider increasing the output column's Length property or changing its data type to one that can accommodate more data such as ntext (DT_NTEXT). This change can be done using the component's Advanced Editor window.".
This makes sense, since I'm pulling a column longer than specified in the metadata, but the problem is that I want to ignore this error, truncate the column, and continue the data load. Obviously I could set the column to DT_NTEXT and not worry about it, but since these packages are autogenerated I have no way of knowing beforehand which columns have old data and which don't, so I won't know which should be DT_NTEXT.
So is there a way to make Kingswaysoft truncate input data which is longer than what's specified in the metadata?
Thank you for choosing KingswaySoft as your integration solution. For this situation, unfortunately there is no way to make that work without making those changes in the component’s Advanced Editor.
If the source component simply ignores the error and truncates the value, you will lose some of your data and thus affect data integrity during the integration. Therefore, you may need to change the data type to DT_NTEXT or increase the length of this field in order to handle this situation properly. Alternatively, you can try to change the field length on the CRM side so that the SSIS package can be generated correctly.
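That said, if the source column is widened to DT_NTEXT, a downstream script component can still enforce the 2000-character limit before the SQL Server destination, giving truncate-and-continue behavior. A minimal sketch, assuming a ReadWrite DT_NTEXT input column named Description (the column name and mapping are assumptions):

// Hypothetical sketch: truncate Description to the nvarchar(2000) limit.
// DT_NTEXT surfaces as a BlobColumn holding UTF-16 bytes.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    const int MaxLength = 2000;

    if (Row.Description_IsNull) return;

    byte[] raw = Row.Description.GetBlobData(0, (int)Row.Description.Length);
    string text = System.Text.Encoding.Unicode.GetString(raw);

    if (text.Length > MaxLength)
    {
        Row.Description.ResetBlobData();
        Row.Description.AddBlobData(
            System.Text.Encoding.Unicode.GetBytes(text.Substring(0, MaxLength)));
    }
}

This doesn't remove the Advanced Editor step for autogenerated packages, but it keeps the load running instead of failing on long rows.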

"Conversion failed because the data value overflowed the specified type" error applies to only one column of the same table

I am trying to import data from an Access database file into SQL Server. To do that, I created an SSIS package through the SQL Server Import/Export Wizard. All tables passed validation when I executed the package through the Execute Package Utility with the "validate without execution" option checked. However, during the actual execution I received the following chunk of errors (shared as a picture, since a blockquote uses a lot of space).
Upon investigation, I found exactly the table and column that were causing the problem. However, this is a problem I have been trying to solve for a couple of days now, and I'm running dry on possible options.
Structure of the troubled table column
As noted in the error list, the trouble occurs in the RHF Repairs table, on the Date Returned column. In Access, the column in question is of the Date/Time type. Inside the actual table, all entries are in 'mmddyy' form, which, when clicked upon, turns into 'mm/dd/yyyy' format.
In the SSIS package, the wizard created an OLE DB Source/Destination relationship for the table.
Inside this relationship, the data type in both the output columns and the external columns is DT_DATE (I still think this is a key cause of my problems). What bugs me the most is that the column adjacent to Date Returned is exactly the same as what I described above, yet none of the errors applied to it or to any other columns of the same type; Date Returned is literally the only black sheep in the flock.
What I have tried
I have tried every option from the following thread; the error remains the same.
I tried the Data Conversion option, converting this column to a timestamp or even a Unicode string. It didn't work.
I tried specifying the data type in the source's Advanced Editor as both a timestamp and a Unicode string. I tried specifying it only in the output columns, then in both the external and output columns; same result.
Plowing through the data in the Access table also did not give me anything; all entries use the same 6-character formatting throughout.
At this point, I have literally exhausted all the options I could think of. Can you please point me in the right direction on what else I could possibly try to resolve this? It has been driving me nuts for the last two days.
PS: On my end, I will plow through each row individually, while trying not to get discouraged by the fact that there are 4,000+ row entries...
UPDATE:
I resolved this matter by plowing through the data. There were 3 faulty entries among the 4,000+ rows... Since the issue was resolved in a manner unlikely to help others, please close this question.
It sounds to me like you have one or more bad dates in the column. With 4,000 rows, I actually would visually scan and look for something very short or very long.
You could change your source query to select the top 1 row instead of all 4,000. Does that row insert? If so, that would lend weight to the bad-date scenario. If a single row does not flow through, it is another issue.
(I will just share my experience of how I overcame this problem, in case it helps someone.)
My scenario:
One of the columns, Identifier, in the OLE DB data source had changed from int to bigint. I was getting the error message: Conversion failed because the data value overflowed the specified type.
Basically, it was telling me the source data size was greater than the destination data size.
What I have tried:
In both the OLE DB source and the destination, I opened the Advanced Editor and checked that the Identifier data type was bigint. But I was still getting the error message.
The solution that worked for me:
In the OLE DB data source --> Show Advanced Editor --> Input and Output Properties --> OLE DB Source Output, there are two folders: External Columns and Output Columns.
In my case, although the Identifier column under External Columns was showing the data type bigint, under Output Columns it was showing the data type int. I changed the output column's data type to bigint and that solved my problem.
Now and then I get this problem, especially when I have a big table with lots of data.
I hope it helps.
We had this error when someone had entered the year as 216 instead of 2016. The data source was reading the data OK, but it was failing on the OLE DB destination.
We use a script component in the data flow for validation. By adding a check that dates aren't too far in the past, we are able to trap this kind of error and at least generate a meaningful error message so the problem can be found and corrected quickly.
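A minimal sketch of that kind of check, assuming a C# script component in the data flow with a hypothetical DT_DBTIMESTAMP input column named DateReturned (the column name and cutoff are assumptions):

// Hypothetical sketch: flag dates that are implausibly old (e.g. a year
// typed as 216 instead of 2016) before they reach the OLE DB destination.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    DateTime earliestPlausible = new DateTime(1950, 1, 1); // assumed cutoff

    if (!Row.DateReturned_IsNull && Row.DateReturned < earliestPlausible)
    {
        bool cancel;
        ComponentMetaData.FireError(0, "Date check",
            string.Format("Suspicious DateReturned value: {0:yyyy-MM-dd}", Row.DateReturned),
            string.Empty, 0, out cancel);
    }
}

Redirecting such rows to an error output instead of failing the component would also work.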

Unable to change an SSIS Excel Destination Column Data Type

I have an SSIS package that imports data from SQL Server and places it into an Excel destination file. In the Advanced Editor of the ADO source component, I have a field, Description, with an External Data Type of Unicode String, length 4000, and an Output Data Type of Unicode Text Stream (this is to ensure a string with length > 255 can be imported into Excel). Now, when I go into the Advanced Editor of the Excel Destination component, the data type is stuck as Unicode String, length 4000. It allows me to change it, but it reverts back immediately after I click Save. Running the package results in a failure, since I have data in the Description field with a length > 255. I have searched countless threads regarding this issue, such as this one, but haven't come to a solution. Any help would be greatly appreciated.
This might be very simple: after you make any change related to the Source component, I find I have to double-click the green arrow leading out of it. Viewing that metadata does more than just show it; it also updates the metadata based on the source component. Only after that will the Destination component be able to "see" the changes to the Source component.
But if that isn't enough: when making these kinds of changes, before I could get them to take effect, I've often had to (1) delete the destination component, (2) delete the destination connection object in SSIS, and (3) delete/rename/move the actual Excel spreadsheet, and then generate a new one by clicking the button in the Destination component that generates a new destination file from the metadata.
I've had this issue with the Union All component before, and the only way I've managed to fix it without deleting and re-creating the component was to open it, set the offending fields to "ignore" on the inputs, press OK, then go back in, set the inputs back to the original fields, and press OK again.
This seemed to do the trick. Maybe a similar approach will work for other components.

Stubborn column data type in SSIS flat file connection manager won't change. :(

I have inherited an existing SSIS package that imports flat file data into my SQL Server 2005 database. I need to change the data type of the "Gross Sales" column from "numeric" to "currency". When I change the data type and export the package, the data type remains numeric.
I have also tried creating a new flat file connection to use in the same package, but for some strange reason the column still remains numeric instead of "cy" currency. I guess something is "stuck" in some other area that is forcing the last column to remain numeric?
Does anyone know the trick to changing the data type for a flat file data source?
Thanks for all the help, everyone. It looks like in my case I needed to delete and re-add the flat file source step and add a new flat file connection manager. Maybe there was a better way to do it and I was just clicking in the wrong order in the GUI maze of SSIS. :D
Did you try adding a Data Conversion Transformation in your Data Flow Task?
E.g., you can add a converted copy of Gross Sales, such as Gross Sales_CONV, and choose currency [DT_CY] as its data type.
I've also found that sometimes setting ValidateExternalMetadata to False on the source and destination components will help.
When some changes don't apply, even though there's no apparent reason for the data to be "stubborn" and resist change, try closing Visual Studio and opening it back up again.
In my case this works for the failure mentioned here, as well as when the Script Editor won't open. It looks like some information remains in a cache and prevents some functionality from working properly.
In my case I had a lot of columns in my flat file connection manager, so deleting the connection and adding all those columns back one by one was the last thing I wanted to try.

XML column in SSIS has byte-order-mark

I'm using an OLE DB data source in an SSIS package to pull a column from a database. The column is of the XML data type. In SSIS, it is automatically recognized as data type DT_NTEXT. It goes to a script component where I'm trying to load it into a System.Xml.XmlDocument. This is the code I'm using to get the XML data into a string:
System.Text.Encoding.Default.GetString(Row.Data.GetBlobData(0, (int)Row.Data.Length))
Is this the correct way?
One odd thing I'm seeing is that on one server I get a byte-order mark in the resulting string, and on another server I don't. I wouldn't mind knowing why that is the case, but what I really want is to get this string without the BOM.
Help me, Stack Overflow, you're my only hope...
This is the only way I was able to get it to work:
System.Text.UnicodeEncoding.Unicode.GetString(...).Trim()
The .Trim() removes the BOM. I'm not sure if this is the "right" way, but it's the only thing that's worked so far.
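For what it's worth, the BOM can also be stripped explicitly instead of with Trim(), which avoids removing legitimate leading or trailing whitespace from the XML. A sketch, assuming the same DT_NTEXT column as in the question:

// Decode the DT_NTEXT blob as UTF-16, then remove only a leading
// byte-order mark (U+FEFF) if one is present.
byte[] raw = Row.Data.GetBlobData(0, (int)Row.Data.Length);
string xml = System.Text.Encoding.Unicode.GetString(raw);

if (xml.Length > 0 && xml[0] == '\uFEFF')
{
    xml = xml.Substring(1);
}

System.Xml.XmlDocument doc = new System.Xml.XmlDocument();
doc.LoadXml(xml);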
