I received an Excel workbook and opened it, but I could not see anything unless I went into PowerPivot. Why? It is an XLSX file, so I should be able to see it in regular Excel.
PowerPivot data is saved within the XLSX file, but not as a regular worksheet. PowerPivot stores its data in a different format that is optimized for fast analysis of large data sets, not for standard Excel operations such as putting a formula into a cell, which you can do in normal Excel sheets.
Hence, because these are different kinds of objects, you cannot see them in the same place when opening the Excel file.
If you have MS Office 2003 or earlier, go to the link below and download and install the Microsoft Office Compatibility Pack.
http://www.microsoft.com/downloads/details.aspx?FamilyId=941B3470-3AE9-4AEE-8F43-C6BB74CD1466&displaylang=en
In this scenario I am dealing with very large Excel files (~150 MB) which have an unknown number of sheets (we know there are fewer than 20). The number of sheets depends on the number of records in the Excel file; once a sheet is filled to its maximum of 1,048,576 rows, another one is created.
I would like to use SSIS to import all the data in these files into SQL Server tables.
Previously, instead of using SSIS, we had a stored procedure that read the Excel file using OPENROWSET, imported the data from the first sheet, and, if the row count had hit the hard limit, tested for the next sheet name. The approach is not well optimized, as it relies on the Microsoft ACE OLE DB provider (or the ancient Microsoft Jet provider), which is very slow compared to a bulk insert.
Importing Excel data using SSIS seems much faster for simple Excel files. Extracting the sheet names seems to require treating the file as a rowset collection, which slows the process down. My suggested approach is to use an outer Foreach Loop to handle the Excel files and an inner For Loop to handle the worksheets within each file. Once the next worksheet is requested and does not exist, the For Loop should break, but the outer loop should move on to the next file (something similar to a try/catch block).
My question is: is this possible? If yes, how?
(We have an alternative solution that uses CLR to convert the Excel files into CSV files and then imports the data, but we would like to keep the job within the SSIS package if possible.)
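One way to sketch the "break when the next worksheet is missing" idea is a Script Task inside the inner For Loop that probes the workbook's schema for the expected sheet name and sets a Boolean package variable. This is only a sketch, not a tested solution: the variable names (User::ExcelFilePath, User::SheetIndex, User::SheetExists) and the Sheet<N>$ naming convention are assumptions, and it presumes the ACE OLE DB provider is installed. The schema call still opens the workbook through ACE, but it does not read any data rows.

```csharp
using System.Data;
using System.Data.OleDb;

// Body of the SSIS Script Task's Main method (ScriptMain class).
public void Main()
{
    // Hypothetical package variables - adjust the names to your package.
    string filePath = Dts.Variables["User::ExcelFilePath"].Value.ToString();
    int sheetIndex = (int)Dts.Variables["User::SheetIndex"].Value;
    string sheetName = "Sheet" + sheetIndex.ToString() + "$";   // assumed worksheet naming

    string connStr = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filePath +
                     ";Extended Properties=\"Excel 12.0 XML;HDR=YES\"";

    using (OleDbConnection conn = new OleDbConnection(connStr))
    {
        conn.Open();
        // Schema-only call: returns one row per matching worksheet, reads no data rows.
        DataTable sheets = conn.GetOleDbSchemaTable(
            OleDbSchemaGuid.Tables,
            new object[] { null, null, sheetName, "TABLE" });

        Dts.Variables["User::SheetExists"].Value = (sheets.Rows.Count > 0);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
```

The inner For Loop's EvalExpression could then test @[User::SheetExists] so the loop exits cleanly when the sheet is not found, while the outer Foreach Loop simply continues with the next file.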
I have many SSRS reports, and all of them have the same issue when exporting data to CSV files (upgraded from VS 2005 to VS 2013).
I have some GROUP data in the header that is displayed based on a condition, as shown in the image below:
When exporting to CSV format, the header data also gets downloaded along with the table-grid data, as shown in the image below (yellow shaded).
I actually want to merge those header columns into the actual data rows, as shown in the image below (yellow shaded):
EDITED:
I have found one solution to remove the header values from the CSV: setting DataElementOutput to NoOutput (shown in the image below).
But I am still not able to merge those header values into the data grid.
Thank you in advance
Based on the following official documentation:
Breaking Changes in SQL Server Reporting Services
Redesigned CSV Data Renderer
In earlier versions of Reporting Services, when you exported a report to a CSV file format, the data was formatted in a way that preserved the way the data appeared on the report page. For matrix data regions, this resulted in a data format that was inconvenient to import into other applications in order to continue to work with the data.
In this release, when you export a report to a CSV file, you can choose between two supported formats: Default mode and Compliant mode. Default mode is optimized for Excel. Compliant mode is optimized for third-party applications.
The earlier format for CSV files is no longer available. However, for reports that do not use matrix data regions, you can use Compliant mode to get a file format closest to the earlier CSV file format.
It looks like the SSRS 2005 CSV renderer went through major changes in SSRS 2008 R2 to support new features.
Workaround
Check the following question, where they provided an extension to Reporting Services that adds the old CSV renderer:
Force SSRS 2008 to use SSRS 2005 CSV rendering
References and helpful links
MSDN - How to force SSRS 2008R2 to use the SSRS 2005 CSV rendering mode
Reporting Service 2008 - CSV export
How do I export a SSRS matrix to CSV without losing the structure?
I have 40 Excel sheets in a single folder. I want to load them all into different tables in a SQL Server database through an SSIS package. The difficulty I am having is due to the different number and names of columns in each Excel sheet.
Can this task be achieved through a single package?
Another option, if you want to do it in one data flow, is to write a custom C# source component with multiple outputs. In the script component you figure out the file type and send the data to the proper output.
The NPOI library (https://npoi.codeplex.com/) is a good way to read Excel files in C#.
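As a rough illustration (the class and method names here are made up for the example), reading a worksheet with NPOI might look like this:

```csharp
using System;
using System.IO;
using NPOI.SS.UserModel;

class ExcelReader
{
    // Reads the first worksheet of a workbook and prints every cell as text.
    // WorkbookFactory detects both .xls (HSSF) and .xlsx (XSSF) formats.
    public static void PrintFirstSheet(string path)
    {
        using (FileStream stream = File.OpenRead(path))
        {
            IWorkbook workbook = WorkbookFactory.Create(stream);
            ISheet sheet = workbook.GetSheetAt(0);

            for (int r = 0; r <= sheet.LastRowNum; r++)
            {
                IRow row = sheet.GetRow(r);
                if (row == null) continue;          // skip completely empty rows

                for (int c = 0; c < row.LastCellNum; c++)
                {
                    ICell cell = row.GetCell(c);
                    Console.Write((cell == null ? "" : cell.ToString()) + "\t");
                }
                Console.WriteLine();
            }
        }
    }
}
```

In a real source component you would map the cells to output columns instead of printing them.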
But if you have fixed file formats, I would prefer to create N data flows inside a Foreach Loop container. Use regular Excel source components and just ignore errors in each data flow. This lets you take a file and try to load it in each data flow one by one. On an error the package will not fail; it just moves on to the next data flow until it finds the proper file format.
It can only be done by adding multiple sources, or by using a script component that writes a flag indicating which sheet each row came from. Then you can use a Conditional Split and route the rows to multiple destinations.
We use SharePoint 2013 as a library holding thousands of Excel files, with almost never consistent formatting, to manage projects occurring on servers. Somewhere in these files, possibly formatted as table objects, is a common set of server names.
Somehow, without being able to change this process in the short term, I need to pull data from all these files to identify how many projects are targeting a particular server.
I've got access to SQL Server 2016 Enterprise and wonder whether something like PolyBase could help with this. I also wonder about SSIS, but I don't expect any two tables to look exactly alike.
Other tools may be an option, but I'm not sure what can handle this scale and variety. I think daily updates to the data would be enough, but even so it's still a mess.
How do I pull thousands of varied Excel tables into a database? Is this even possible?
Any longer-term solution that doesn't allow them to format and annotate like Excel is unlikely to actually be adopted.
The less you know in advance, the more difficult it will be...
Some ideas:
Technology
read about OPENROWSET (used in a FROM clause), which allows you to read directly from an Excel file
read about linked servers
Use Excel itself and its abilities through VBA to iterate through all your Excel sheets: open them, analyse them and fill the proper tables. Within Excel you know the most about your messy data...
Target structure
You might create thousands of tables, each representing one single sheet from all your Excel files. You could query these tables with dynamically created SQL (using the metadata in INFORMATION_SCHEMA) or think about full-text search.
You might import each sheet into one single XML structure (SELECT * ... FOR XML PATH('...')). In this case you'd need a target table with columns for the path and name of your Excel file, the name of the sheet, and an XML column for your data. Another approach would be to represent each file as one XML document and include all its sheets there. Try to define a common naming scheme for all your data. Querying XML allows you to query columns without knowing their actual names (XQuery with XPath using *).
If your Excel files are XLSX already, you might open them with an unzip tool and take the existing XML as-is (see the sketch below).
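As a minimal sketch of the unzip idea (assuming .NET and System.IO.Compression; the class and folder names are just for illustration), an .xlsx can be opened as a ZIP archive and its worksheet XML parts pulled out directly:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class XlsxXmlExtractor
{
    // Dumps the raw XML of every worksheet part in an .xlsx package.
    // An .xlsx file is just a ZIP archive; worksheet parts live under xl/worksheets/.
    public static void DumpSheetXml(string xlsxPath, string outputFolder)
    {
        Directory.CreateDirectory(outputFolder);

        using (ZipArchive archive = ZipFile.OpenRead(xlsxPath))
        {
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                bool isSheetXml =
                    entry.FullName.StartsWith("xl/worksheets/", StringComparison.OrdinalIgnoreCase) &&
                    entry.FullName.EndsWith(".xml", StringComparison.OrdinalIgnoreCase);
                if (!isSheetXml) continue;

                string target = Path.Combine(outputFolder, entry.Name);
                entry.ExtractToFile(target, overwrite: true);
                Console.WriteLine("Extracted " + entry.FullName + " -> " + target);
            }
        }
    }
}
```

Keep in mind that string cell values in the worksheet XML are usually indexes into xl/sharedStrings.xml, so some joining is still needed before the data is readable.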
To be honest: I do not think that any tool can do the magic to import such a wide range of mess automatically...
I'm writing an SSIS package to import the contents of several Excel files into a SQL Server database for my client. These files will be provided regularly and the system should be completely automated without user involvement.
The Excel files are provided by my client's business partners, so I don't have a lot of control over them.
One of the files seems to be in the Excel 2003 SpreadsheetML XML format. Note that this is different from Open XML. It seems from my research that SSIS cannot read this format. It can and does read "normal" Excel 2003 files just fine.
Does anyone know of a way (in code) to convert this file into either non-XML Excel 2003 or Excel 2007 so I can import it? It needs to be automated, so opening the file using Excel and "save as" another type is off the table.
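One possible route, sketched below under the assumption that only plain cell text needs to survive the conversion, is to parse the SpreadsheetML with LINQ to XML and rewrite it as an .xlsx using a library such as NPOI (EPPlus or the Open XML SDK would work just as well). The class name is made up for the example, and the sketch deliberately ignores SpreadsheetML details such as ss:Index column gaps, typed values, formulas and formatting.

```csharp
using System.IO;
using System.Xml.Linq;
using NPOI.XSSF.UserModel;

class SpreadsheetMlConverter
{
    // SpreadsheetML uses a single namespace for its elements and ss:-prefixed attributes.
    static readonly XNamespace Ss = "urn:schemas-microsoft-com:office:spreadsheet";

    // Converts an Excel 2003 SpreadsheetML file to .xlsx, copying cell text only.
    public static void Convert(string xmlPath, string xlsxPath)
    {
        XDocument doc = XDocument.Load(xmlPath);
        var workbook = new XSSFWorkbook();

        foreach (XElement wsElement in doc.Root.Elements(Ss + "Worksheet"))
        {
            string name = (string)wsElement.Attribute(Ss + "Name") ?? "Sheet";
            var sheet = workbook.CreateSheet(name);

            XElement table = wsElement.Element(Ss + "Table");
            if (table == null) continue;

            int rowIndex = 0;
            foreach (XElement rowElement in table.Elements(Ss + "Row"))
            {
                var row = sheet.CreateRow(rowIndex++);
                int cellIndex = 0;
                foreach (XElement cellElement in rowElement.Elements(Ss + "Cell"))
                {
                    string value = (string)cellElement.Element(Ss + "Data") ?? string.Empty;
                    row.CreateCell(cellIndex++).SetCellValue(value);
                }
            }
        }

        using (var output = new FileStream(xlsxPath, FileMode.Create, FileAccess.Write))
        {
            workbook.Write(output);
        }
    }
}
```

The other common option is automating Excel itself via COM interop to re-save the file, but unattended Office automation on a server is something Microsoft advises against, which makes a pure-code conversion more attractive for a fully automated SSIS job.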