I am working on a little Power BI project. The data source for this project is mostly Excel files with untransformed dynamic data.
Initially I used these Excel sheets as the data source and transformed them in Power Query prior to building the data model. However, now there is a need to load the historical data into SQL Server and connect Power BI to SQL Server as the data source.
Since the source files are not cleaned and transformed prior to importing them into Power Query, I cannot import them directly using an SSIS package (a Foreach Loop container over multiple dynamic files).
Is there a way I can get the data back into SQL Server from Power Query after transformation? Is this where PowerQuery Source comes into the picture? What does it do exactly?
It seems that PowerQuery Source will be the best choice in your case. There is a good article from Microsoft explaining it.
You will need to do three things:
For each file which you use as a source, create a connection of the PowerQuery type in the SSIS connection manager;
Using the PowerQuery source, copy the full query from the Power BI Advanced Editor into the "Query" field in the source settings;
On the "Connection Managers" tab of the source, map each Power Query source (each file) to the corresponding connection which you created in step 1.
The output of this source is a set of columns which you can write to the database using the usual data flow approach (e.g. mapping this source to an OLE DB destination).
The problem is simple: the client wants to store the result of an Execute SQL Task query in an Excel file. I have set the Full result set as an Object variable, but I can't consume that object anywhere.
You need to export data from SQL Server to Excel using SSIS, right? In SSIS, you need to create a data flow task. Inside the data flow task you need an OLEDB data source or an ADO.NET data source. Then you need an Excel destination. Connect the source and destination and configure the mappings and other settings. More detailed instructions can be found in this tutorial: https://codingsight.com/export-data-from-sql-server-to-excel-and-text-file-via-using-ssis-package/
Add a Data Flow Task that contains a Script Component configured as a source, where you generate output rows from the recordset, connected to an Excel Destination:
Using The SSIS Object Variable As A Data Flow Source
Implementing Recordset Source
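If it helps to see what that Script Component ends up containing, here is a minimal C# sketch of its CreateNewOutputRows override, assuming an Object variable named User::ResultSet holds the Full result set (captured through an OLE DB connection, so it arrives as an ADO recordset) and an output with the placeholder columns CustomerId and CustomerName has been defined:

```csharp
// Minimal sketch of the code inside the Script Component's generated
// ScriptMain class. The variable name (User::ResultSet) and the output
// columns (CustomerId, CustomerName) are placeholders.
using System.Data;
using System.Data.OleDb;

public class ScriptMain : UserComponent
{
    public override void CreateNewOutputRows()
    {
        var table = new DataTable();

        // Fill() converts the ADO recordset stored in the Object variable
        // (exposed here as a read-only variable) into a DataTable.
        new OleDbDataAdapter().Fill(table, Variables.ResultSet);

        foreach (DataRow row in table.Rows)
        {
            Output0Buffer.AddRow();
            Output0Buffer.CustomerId = (int)row["CustomerId"];
            Output0Buffer.CustomerName = row["CustomerName"].ToString();
        }
    }
}
```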
Alternatively, you can simply put the SQL command that you are executing in the Execute SQL Task into an OLE DB Source, which is simpler.
What's the correct way of exporting data from an Excel 2013 file to a SQL Server database? The data from the Excel file should be transferred into SQL Server whenever the Excel file is saved.
I know many answers for this are available, but my question is a bit different: every time the Excel data changes, or the user clicks the save button, the data in the database also needs to be updated.
The easiest way to do this is with an SSIS package. SSIS (SQL Server Integration Services) is a component of SQL Server that handles transformations between data formats.
You can create a package by right-clicking on your target database in SQL Server Management Studio and selecting Tasks > Import Data. In the wizard that comes up asking for a data source, choose "Microsoft Excel" from the top drop-down labelled Data Source, then follow the wizard through. You'll have the choice of importing the Excel data into a new table or mapping it into an existing table.
If you want to do this programmatically, you can save your package at the end of the wizard and then invoke it via code. But that's a different question.
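For what it's worth, invoking a saved package from code only takes a few lines with the SSIS runtime API. A rough sketch, assuming the package was saved as a .dtsx file (the path below is a placeholder):

```csharp
// Rough sketch of running a saved .dtsx package from C#.
// Requires a reference to the SSIS runtime assembly
// (Microsoft.SqlServer.ManagedDTS); the package path is a placeholder.
using System;
using Microsoft.SqlServer.Dts.Runtime;

class RunPackage
{
    static void Main()
    {
        var app = new Application();

        // Load the package saved at the end of the Import Data wizard.
        Package package = app.LoadPackage(@"C:\Packages\ImportExcel.dtsx", null);

        // Execute it and report success or failure.
        DTSExecResult result = package.Execute();
        Console.WriteLine("Package result: " + result);
    }
}
```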
What you want is not possible (as far as I know). You can use an SSIS package to migrate the Excel sheet into SQL Server, but it is impossible to determine whether someone clicked save or made changes to the Excel file. An SSIS package can be programmed to run on a schedule or on demand. You should investigate SSIS packages. It is not easy to learn, but it does what you need.
Maybe you would like to try a tool I have developed? It's an Excel Add-In that exports Excel data to SQL Server.
There is a feature to automatically export the data to SQL Server every time you press the save button in Excel. If you need to update the database every time a cell value changes, you will need to add a few lines of VBA code that push the data to SQL Server.
I currently give beta testers a free license, so if you are interested in testing it out, send me an email. :)
www.sqlpreads.com
I am new to SSIS. I am using SSIS 2012 to transfer data from Excel (COZYROC Excel Source Plus component) to a SQL Server database (OLE DB Destination). My requirement is that whenever columns in the Excel file do not match the mapped columns, or any columns are missing, I should log an error message in the database.
Please help to resolve this problem.
I don't believe that is possible.
SSIS (and SSRS and other applications) require a 'contract' between source and destination such that any changes to the source will throw a mapping error and force the developer to re-map the data flow.
SSIS is not capable of a scenario such as 'if the source columns change, pump the rest and log the changes to another path'.
This is also an example of why Excel makes a terrible data source for a normalized database/ETL project: users can easily change the Excel document in ways that break the data mapping.
Good luck.
I have a complex XSD schema and hundreds of XML files conforming to the schema.
How do I automate the creation of related SQL Server tables to store the XML data?
I've considered creating C# classes from the XSD schema using the xsd.exe tool and letting something like Subsonic figure out how to make a shiny database out of it, but I'm not sure it's the best way to approach it.
Has anyone managed to elegantly import XSD files into SQL Server?
A similar question with good answers: How can I create database tables from XSD files?
I suggest you use SQL Server Integration Services, which comes with SQL Server 2008 or 2005 (or Data Transformation Services if you're stuck with 2000).
Unfortunately it doesn't come with the free "Express" edition of SQL Server; however, SQL Server Developer Edition, which has the full Standard Edition functionality and would suit your needs, can be had for under $100.
SSIS is a big topic and I'm not going to go over all of the bells and whistles here but basically you:
Create a new SSIS project using BIDS (Business Intelligence Development Studio, a modified Visual Studio that comes with SSIS)
Drag a new Data Flow Task onto the Control Flow surface, then click the data flow tab.
Drag an "XML source" from toolbox into data flow panel, and then configure the XSD and XML file locations.
Drag an ADO.NET data destination from the toolbox onto the dataflow and connect one of the the outputs from the XML source to the input of the ADO.NET destination. If you want to create a new table based on the data output from the xml schema as opposed to using an existing one click on "New" when specifying the Connection Manager Settings in the ADO.NET Destination and it generate and execute the appropriate create table statement. Repeat this for any other outputs from the XML source (there will be one for each logical flat table generated from the schema).
You will most probably need to use other data transformation objects first to transform the data before it loaded into SQL server, but that is the general gist of it. If you need to run the process for a large amount of XML files you could put the task in a control loop and use a variable to set the XML file location.
The MS Documentation on using an XML source in SSIS is here: http://msdn.microsoft.com/en-us/library/ms140277(v=SQL.100).aspx
Just found XSD2DB on Sourceforge, according to the site:
XSD2DB is a command line tool written in C#, that will read a Microsoft ADO.NET compatible DataSet Schema File (XSD) and generate a database.
Checking it out.
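To get a feel for what an ADO.NET-compatible XSD gives you in code, here is a rough C# sketch of the same idea: load the schema into a DataSet, read a conforming XML file into it, and bulk copy each resulting table into SQL Server. The paths and connection string are placeholders, and it assumes the target tables already exist (creating them is the part a tool like XSD2DB would handle):

```csharp
// Rough sketch: XSD -> DataSet -> SQL Server. Paths, the connection string,
// and the assumption that matching tables already exist are placeholders.
using System.Data;
using System.Data.SqlClient;

class LoadXmlIntoSql
{
    static void Main()
    {
        var dataSet = new DataSet();

        // The XSD defines the tables and relations; the XML file fills them.
        dataSet.ReadXmlSchema(@"C:\Schemas\Orders.xsd");
        dataSet.ReadXml(@"C:\Data\Orders_001.xml");

        const string connectionString =
            "Data Source=.;Initial Catalog=Staging;Integrated Security=True";

        // Bulk copy each generated DataTable into a table of the same name.
        foreach (DataTable table in dataSet.Tables)
        {
            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = table.TableName;
                bulkCopy.WriteToServer(table);
            }
        }
    }
}
```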
How can I make identical output from a transformation go to two separate places e.g., an OLE DB destination and a DataReader destination?
Background:
I have an existing package that reads data from a text file, does some transformations, and loads the data into a SQL Server table.
Now I'm trying to make the package be callable from a reporting services report (SSRS). I'm following the instructions here: http://msdn.microsoft.com/en-us/library/ms159215.aspx
It says to make my data go into a DataReader destination and then the report will have access to that. So I want the output of the final transformation to go to both the SQL table and the DataReader destination.
Use a Multicast transformation and send the output to both a "DataReader destination" and an "OLE DB destination" in your SSIS package.
When you create your datasets in SSRS, you should use the name of the output object from your SSIS package. Your dataset in the report should then populate with the fields, data, etc. from the SSIS package.
Perhaps the Multicast step?