Excel destination character size in SSIS - sql-server

I am trying to export data from SQL Server 2008 to an Excel file using BIDS.
One of the fields coming from the SQL database, 'DESCRIPTION', is VARCHAR(4000).
I can export everything to Excel, but the 'DESCRIPTION' field size in Excel is restricted to Unicode 255, and no matter what I try it does not let me export data over 255 characters (it exports them as blank). I tried changing the SQL field to varchar(max) and to ntext, but neither attempt worked. I used the advanced editor in BIDS on the Excel destination to change the 'DESCRIPTION' character length manually, but as soon as I hit 'OK' it resets to Unicode 255.
Could anybody please help me resolve this issue?
Thanks,
Vishal

So, I did some testing. Excel data transformation is funky, but I came up with a solution. I created an Excel spreadsheet with the fields as needed. I then created fake dummy data in Excel with a character length far greater than 255 and hid the row. I then ran the SSIS data transformation into that Excel spreadsheet, which worked. It's a weird and not preferable option, but it works.

Problem: Excel only accepts 255 chars per cell when I attempt to use an Excel Destination in SSIS (2008 R2) from a SQL Server table. The SalesForce data loader would not accept the CSV (with “” text qualifiers) created by the SSIS flat file connection manager. SalesForce will only accept a CSV with “” text qualifiers, and it will accept a CSV as exported by Excel (2010).
Solution:
1. Create your Excel connection manager, set the name/path of the destination Excel file in your “Excel Destination Data Flow Component”, and map the metadata.
2. Open a new Excel file, remove all extra “sheets”, rename “sheet1” to the sheet name created in step #1 above, select all cells and format them as “text”, and add all the column header names to the first row of your template sheet. In the columns that need to hold more data than the 255-character limit, paste in filler text that exceeds your limit by 50% (just in case); see the sketch after these steps for one way to generate it. These columns are now configured to hold your large data. Save the file, naming it something like TEMPLATE_Excel_forLargeCellValues.xlsx.
3. Copy this template into your DESTINATION connection: before your “Excel Destination Data Flow Component” in the SSIS Control Flow, create a new “File System Task”. Create an SSIS package-level variable to hold the path/filename of your template Excel file. In your “File System Task”, set “IsSourcePathVariable” = TRUE, set “SourceVariable” to User::Template_Excel, set “IsDestinationPathVariable” = FALSE, and set “DestinationConnection” to the connection from step #1 above. Set “Operation” = Copy file and “OverwriteDestination” = TRUE. This will now copy your formatted Excel workbook/sheet into your destination folder with the file name you designated in step #1 above, and because you put a larger amount of sample data in the columns that require more than 255 chars, all your data will fit.
Note: It is not necessary to delay validation on any components.
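One way to produce the filler text for step #2 (a sketch, not part of the original write-up): enter a formula like the one below in each long column of the template sheet, then copy it and paste it back over itself as a value so the literal string is what gets saved. The length 400 is an arbitrary placeholder; use anything comfortably above your longest real value.
=REPT("X",400)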

You're saying that the Excel field is set to 255, right? Changing the SQL field won't have an effect on Excel; you'd have to modify the Excel file.

I don't believe you can modify the Excel output column to write more than 255 characters. Why not simply write your output to a CSV? It can be opened and later modified in Excel anyway.

The SSIS Excel engine recognizes the data type of the first 8 rows and assigns it to the Excel source or destination automatically. Even defining the Excel column as Memo won't work. I tried to resolve the error by changing the registry value TypeGuessRows of the Excel engine, but that did not work either. So I was left with no other option but to create a dummy row (the 2nd row) with more than 255 characters and hide it. The Excel source then identifies the column as a Unicode text stream. You have to write some logic in the SSIS package to exclude this row if you are trying to import the data from Excel (one sketch follows below). I heard that this issue is resolved in Excel versions 2010 and later, but BIDS 2008 does not have an option to choose any version after 2007, so this is the only solution if you are working with BIDS 2008 and Excel.
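One way to exclude the hidden dummy row on import (a sketch, assuming you start the dummy value with a recognizable marker such as ZZZ_DUMMY, which is not from the original answer): add a Conditional Split after the Excel source with an expression like the one below and leave that output unattached, so matching rows are discarded.
SUBSTRING(DESCRIPTION,1,9) == "ZZZ_DUMMY"
Every other row falls through to the default output and continues on to the destination.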

You have to select Microsoft Excel 97-2003 and use .xls as the file extension in your destination file name.

I got the same issue of the Excel destination not allowing more than 255 characters. After spending almost a day on it, I tried adding more characters (to keep it simple, I added more than 255 spaces) to the header of the column that was hitting the 255-character issue. And it magically worked!

You can insert dummy data (260 characters) under the header of the column you want in your Excel file (via an Execute SQL Task).
Script to create the sheet and insert the dummy row (run each statement in its own Execute SQL Task; the Excel OLE DB provider does not understand the GO batch separator):
CREATE TABLE `YourSheet` (`myColumn260char` LongText)
INSERT INTO YourSheet(myColumn260char) Values('....................................................................................................................................................................................................................................................................')
And you can delete the dummy row after the data has been loaded.

Related

SSIS Data Flow OLE DB To Excel Nvarchar Size Issue

Hopefully this is not an ignorant question, as I am still working to build my SSIS skills.
I have a package that takes an Excel sheet and loads it into a SQL Server table so that I can run analysis and UPDATE statements against the data. I am now looking to load that SQL table back into an Excel sheet. I have made an Excel sheet as a template replicating the SQL table.
The issue I am now having is with a field named "Comment" whose data type is nvarchar(MAX) in my SQL table. This column also contains NULL values. When I try to load these back into the Excel column I get an error.
[Excel Destination [28]] Error: An error occurred while setting up a binding for the "Comment" column. The binding status was "DT_NTEXT".
I thought perhaps I could do a Data Conversion to a string with the maximum character length (which is 757), but it truncates and errors at that size.
This data came from an Excel column in the first place, so I would think I could load it back into a column.
Thanks for the help!!
Previously I thought that Excel does not allow exporting data longer than 255 characters. After running several experiments, it turns out that exporting DT_NTEXT values to Excel can be done using SSIS:
You should create an Excel file with one dummy row that contains long text values (> 255 characters) and then use this Excel file as the destination. If the Excel file contains previous data, make sure to add this dummy row directly after the file header, and add ;IMEX=1 to the OLE DB connection string.
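For reference, a minimal sketch of where ;IMEX=1 sits in an Excel connection string (the file path below is a placeholder, not from the original answer):
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Output\Export.xlsx;Extended Properties="Excel 12.0 XML;HDR=YES;IMEX=1";
IMEX=1 goes inside the Extended Properties portion, alongside the Excel version and header settings.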

Ignoring column from Excel file while importing to SQL Server

I have multiple Excel files that have the same format. I need to import them into SQL Server.
The issue I currently have is that there are two text columns that I need to ignore completely, as they are free text and the character length in some rows exceeds what the server allows me to import, which results in a truncation error.
Because I don't need these columns for my analysis, the table I'm importing into doesn't include them, but for some reason the SSIS package still picks up those columns and cuts the import job off halfway through.
I tried using the maximum character length for those columns, which still results in the truncation error.
I need to create an SSIS package that ignores the two columns completely, without deleting the columns from Excel.
You can specify which columns you need to ignore from the Edit Mappings dialog.
I have added the image for your reference:
If you create the SSIS package in SSDT, the Excel file can be queried so that it returns only the required columns. In the package, create an Excel Connection Manager using the Excel file. Then, on the Control Flow of the package, add a Data Flow Task that has an Excel Source component in it. On this source, change the data access mode to SQL command, and the file can then be queried much like SQL. In the following example, TabName is the name of the Excel tab containing the data that will be returned. If either the tab or any column names contain spaces, they will need to be enclosed in square brackets, i.e. TabName would become [Tab Name].
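A minimal sketch of such a query, with ColumnA and ColumnB standing in for whichever columns you actually need (those names are placeholders, not from the original answer):
SELECT ColumnA, ColumnB FROM [TabName$]
Only the listed columns enter the data flow, so the oversized free-text columns are never read.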
Import/Export Wizard
Since you mentioned in the comments that you are using the SQL Server Import/Export Wizard: you can solve this if there is a fixed column range that you are looking to import (for example, the first 10 columns).
In the Import/Export Wizard, after selecting the destination options you will be asked whether you want to read from tables or from a query:
Select the query option, then use a simple SELECT query and specify the column range after the sheet name. For example:
SELECT * FROM [Sheet1$A:C]
The query above will read from the first 3 columns in Sheet1, since A:C represents the range from the first column A to the third column C.
Now, you can check the columns from the Edit Mappings dialog:
SSIS
You can use the same logic within an SSIS package: just write the same SQL command in the Excel Source after changing the access mode to SQL Command.
The solution is simple. I needed to write a query that excludes the columns. So instead of selecting "Copy data from one or more tables", you select "Write a query" and exclude the columns you don't need. This one worked 100%.

SSIS receive Excel column as DT_IMAGE

Good day to you, experts.
I'm stuck on a problem I'm having with an Excel 97-2003 .xls file.
When adding it as a source in SSIS, I'm getting an external column data type of DT_IMAGE.
The column represents an ID and is numeric only. I can't extract and work with the data because of the DT_IMAGE data type.
Setting IMEX=1 didn't help.
Thank you in advance.
Reading Excel files in SSIS is done using an OLE DB provider, which may not detect the appropriate Excel column type.
There are many other questions mentioning similar issues, such as:
SSIS Excel Import Forcing Incorrect Column Type
SSIS Excel Data Source - Is it possible to override column data types?
SSIS keeps force changing excel source string to float
As you mentioned in the question, you already added ;Extended Properties="IMEX=1" to the connection string with no luck, so I think there are 4 things you can try:
Sort the column data inside Excel.
Change the entire column formatting manually.
Go to the advanced editor on the Excel source, open the output column list, and set the type for each of the columns.
Add IMEX=1;MAXROWSTOSCAN=0 to the connection string (a sample connection string follows below).
If none of the above steps works, you should save the Excel sheet as a text file and use a Flat File Connection Manager instead.
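For an Excel 97-2003 .xls source, the connection string these items refer to looks roughly like this (the path is a placeholder, and I have left out the MAXROWSTOSCAN setting mentioned in item 4):
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\Input.xls;Extended Properties="Excel 8.0;HDR=YES;IMEX=1";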

Unable to preview SSIS Excel source data after skipping first few rows

I am using SQL Server 2008 BIDS. I am trying to read in an Excel file that has multiple sheets. The sheet names are mostly alphabetical (a few contain the special character '&'). The data starts at row 8. I have skipped the blank rows by setting the rows and columns in the OpenRowset property for the Excel source. I get the exact mappings. However, I am not able to preview the data. The package runs successfully (everything turns green), but there is no data in my destination.
The error I receive while I try to preview is:
There was an error displaying the preview.
Additional Information:
Index and Length must refer to a location within the string. Parameter name: Length (mscorlib)
Please let me know if I am doing anything wrong or missing any settings.
The links I have referred to:
Skipping rows when importing Excel into SQL using SSIS 2008
https://connect.microsoft.com/SQLServer/feedback/details/557049/ssis-fails-to-preview-excel-source-connector-due-to-incompatible-sheet-name
Thanks
I cracked it with the help of one of my friends.
In the properties of the Excel Source >> Custom Properties >> Open Rowset >>
SheetName$A12:J
It means: skip the rows before row 12, and take the data into account from A12 through the end of column J.
Problem solved.
The way to solve this is to use a SQL command as the data access mode.
SELECT * FROM [Sheet1$A20:K]
This will read the data correctly and no preview errors will occur.
I was getting this same error trying to import a .xlsx file into SSIS 2008. I first saved the file as .xls (Excel 97-2003). I was still getting the above error and couldn't preview the data with the new file.
In case this link breaks in the future: https://connect.microsoft.com/SQLServer/feedback/details/557049/ssis-fails-to-preview-excel-source-connector-due-to-incompatible-sheet-name
Once I removed the space from the Sheet name, I was able to preview the data. The tab was originally called Time Data; I changed it to TimeData.
Some other potential problems with the sheet name, according to the above site:
Sheets named with all numbers (ex: 4385)
Sheet names beginning with a number
In my case I was trying to export from SQL into an Excel file and receiving "Index and Length must refer to a location within the string" while trying to preview the destination data.
Removing the space from the Excel sheet name in the Excel Destination Editor fixed it for me.
After trying all the other answers with no luck, I renamed the spreadsheet.
In the Excel connection manager, I browsed to the renamed spreadsheet and unchecked "first row contains column names". I was then able to view the data in the preview window.

SQL import wizard drops leading zero

I've read all the posts about prefixing the numbers with ' and setting IMEX=1 in the connection string; nothing seems to do the trick for me.
Here's the setup: an Excel column with mixed data - 99% numbers (some starting with 0), 1% text.
Programmatically importing into a SQL Server 2005 table / column type varchar(255).
The import works fine locally, but once I move the code to production (GoDaddy), it drops the leading 0's in the column.
Any ideas?
P.S. I knew about the registry change solution; as a matter of fact, the value was set to 0 on my dev box, but the answer made me realize that the value wasn't set on the PRODUCTION SERVER :)
The ISAM driver only samples the first 8 rows, but you can change that behaviour through a registry change:
http://sqlserversd.wordpress.com/2008/09/14/ssis-excel-values-import-as-nulls/
But yes, using Excel for machine-to-machine data transfer is a nightmare... Is there no other way you can be sent the data?
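For reference, the registry value the linked post discusses is TypeGuessRows; with the 32-bit Jet 4.0 driver it usually lives under the key below (the exact path varies with the driver and Office version, so treat this as a pointer rather than a guarantee):
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel
On 64-bit Windows, the 32-bit driver's key sits under the Wow6432Node branch instead.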
Yes. The Excel driver only samples the first 8 rows to determine the data type.
This means that it assumes the column is numeric if "bob" does not appear in rows 1 to 8.
The target table column's datatype is irrelevant.
This issue has been around for a long time; I first saw it in 2003.
BOL notes on excel import
We usually save the file as a .csv file or .txt file and then the issue doesn't occur.
There is a quick and tricky way to do this. Follow these steps, BUT first copy all the data (columns and rows) from the actual Excel sheet into another Excel sheet, just to be on the safe side, so that you have the actual data to compare against.
Steps:
Copy all the values in the column and paste them into a notepad.
Now change the column type to Text in the Excel sheet (it will trim the preceding/trailing zeros); don't worry about that.
Go to Notepad and copy all the values that you have pasted just now.
Go to your Excel sheet and paste the values into the same column.
If you have more than one column with leading-zero values, just repeat the same steps.
Now your Excel document is ready to be imported with the zero values intact :).
Happy Days.
