Suppose you have a Source Component and you are doing a simple one-to-one data load to a Destination Component. A sample screenshot is below:
Now when I open my Destination component, I find that it asks me to enter the name of the table where the data will be loaded.
The dropdown is populated with the names of the tables already present in my database.
My questions are:
Do I need to create the tables in the database manually before I can execute the data flow?
Is there any way to get all the source columns and generate a DDL statement to create the table, if required? I am NOT asking for dynamically creating a table, just a possible way to list all the input columns and their data types and auto-generate a sample DDL or SQL query. (Imagine a scenario where you have 200-500 columns in a text/flat file; manually creating a table with all those columns gets tough.)
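As a possible workaround outside SSIS itself, a small script can read the flat file's header row and auto-generate a starting CREATE TABLE statement. This is only a sketch: the file path is an assumption, and every column defaults to VARCHAR so you only have to adjust the types by hand instead of typing hundreds of column names.

```python
import csv

def generate_ddl(csv_path, table_name, default_type="VARCHAR(255)"):
    """Read the header row of a delimited file and emit a CREATE TABLE
    statement. All columns default to VARCHAR(255); adjust types by hand
    afterwards. Assumes a comma delimiter with a header in the first row."""
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f))
    cols = ",\n".join(f"    [{name.strip()}] {default_type}" for name in header)
    return f"CREATE TABLE [{table_name}] (\n{cols}\n);"

# Hypothetical usage against a flat file named source.txt:
# print(generate_ddl("source.txt", "StagingTable"))
```

Run the generated statement once in SSMS, and the new table then shows up in the destination component's dropdown.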
N.B.: Integration tools like Pentaho Kettle provide a similar feature. You don't have to worry about table creation: if the table isn't present, you can use a Kettle feature to create one in the database.
It would be really helpful if anyone could help me with this. I am also a newbie to SSIS, without much knowledge. Thanks :)
I have a MSSQL database that has a 'settings' table with more than 500 rows. I want to generate a script of UPDATE statements including all the data. This will run against an existing database, and the data will replace what is currently in the table.
I have attempted to use the answer found here: How to export all data from table to an insertable sql format? but I get INSERT statements, which won't work.
I cannot drop and re-create the table, because it has foreign keys.
Disclaimer: I did not design this. I wouldn't put foreign keys on settings.
I use the Vroom SQL Script Generator for that. It's free and it does exactly what you are asking: it generates insert/update scripts from a table. Just click "Export Data", select the objects you want to export, and click the Generate button. If you need to really fine-tune the way the scripts are generated, you can also use the built-in automation scripting support, which is referenced at the bottom of the link.
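If you would rather script this yourself instead of using a tool, the idea can be sketched as follows: build one UPDATE statement per row, keyed on the primary key column. This is not part of Vroom; the table and column names are placeholders, and the naive quoting is only suitable for a one-off script over trusted data.

```python
def update_statements(table, key_column, rows):
    """Build one UPDATE statement per row dict, keyed on key_column.
    Values are quoted naively (single quotes doubled); fine for a one-off
    settings script, not safe for untrusted input."""
    def quote(v):
        return "NULL" if v is None else "'" + str(v).replace("'", "''") + "'"
    stmts = []
    for row in rows:
        assignments = ", ".join(
            f"[{c}] = {quote(v)}" for c, v in row.items() if c != key_column
        )
        stmts.append(
            f"UPDATE [{table}] SET {assignments} "
            f"WHERE [{key_column}] = {quote(row[key_column])};"
        )
    return stmts

# The rows would come from e.g. SELECT * FROM settings, fetched as dicts.
```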
I'm having some trouble loading a fact table (Fact_Servicio) of a star schema I made in SQL Server; here is the diagram:
All the ID's are identity columns.
Our case is the following: we have a Service Desk software which produces daily reports, and we want to use this data for Business Intelligence, creating a Data Mart (star schema) which we will populate with this data and then present through Power BI.
Current problem: our issue is in the ETL process with SSIS. After creating the database in SQL Server, we made an SSIS package to populate all the data from the Excel file into this star schema. We start by populating the dimensions, and after that we attempt to populate the fact table Fact_Servicio, but we don't know how exactly to take the IDs of each dimension, join them with the fields we need from the Excel file (which are our previously defined measures), and then insert it all into the fact table. We tried using the Lookup transformation, but we cannot match any dimension ID with any column in the Excel file, because these IDs are created in the database and are auto-generated per record. (The Lookup task needs to match the columns we grabbed with columns in the Excel file, but the Excel file has no ID columns, and we would rather not create fields for them, because that would be a repetitive manual task every time we export the data from the Service Desk software.) Here are some pictures of our SSIS package structure:
Control Flow View:
Data Flow View of the Fact Table:
Look Up View:
Connection Tab
Columns Tab
Here is where we can't match columns because the ID's are created in the database.
Guys, if there is maybe another way to do this data load, go ahead and propose how you would do it; otherwise, what can we fix here, or what transformations from the toolbox can we use? We were also thinking about loading the big Excel sheet into one single table in SQL Server and working from there, but we aren't sure whether that would give us any advantage.
Thank you all!
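One common way around this is to have each Lookup match on the dimension's business key, i.e. a column that exists both in the Excel extract and in the dimension table, and return the dimension's surrogate ID. The sketch below shows the idea of what each Lookup does; the table and column names are invented for illustration, not taken from the actual schema.

```python
# Sketch of what each SSIS Lookup does: match on a business key present in
# both the Excel extract and the dimension, and return the surrogate key.
# All names here are hypothetical.

dim_cliente = {"ACME": 1, "Globex": 2}           # business key -> surrogate ID

excel_rows = [
    {"cliente": "ACME", "horas": 5},
    {"cliente": "Globex", "horas": 3},
]

fact_rows = []
for row in excel_rows:
    surrogate = dim_cliente.get(row["cliente"])  # Lookup: no match -> None
    if surrogate is None:
        continue  # in SSIS, route no-match rows to the error/no-match output
    fact_rows.append({"ID_Cliente": surrogate, "Horas": row["horas"]})
```

In SSIS terms: each Lookup's Columns tab maps the Excel business-key column to the matching dimension column, and you check the surrogate ID column to add it to the data flow before the fact-table destination.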
I'm using SQL Server 2012 Standard Edition
After setting up an SSIS package that imports data from a flat file, I wanted to know how I would proceed to insert data into staging tables, and what the structure of staging tables should be.
It's quite hard to answer your question without any sample input/output, but in general I would create the staging table with data types matching the source files (as destination-data mentioned above). Then, inside the Data Flow Task, you can use a Derived Column transformation to apply your business logic to the data before inserting the transformed rows into your destination table(s).
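To make the stage-then-transform idea concrete, here is a rough sketch: rows land in a staging structure typed like the source file (often all text), then a "derived column" style step applies the business logic before the final load. The column names and logic below are invented for illustration.

```python
# Hypothetical staged rows, typed like the flat file (everything is text).
staged = [
    {"first_name": " Ada ", "last_name": "Lovelace", "salary": "1000"},
]

# The "derived column" step: trim, combine, and cast before the final load.
transformed = []
for row in staged:
    transformed.append({
        "full_name": f"{row['first_name'].strip()} {row['last_name'].strip()}",
        "salary": int(row["salary"]),  # cast the text column to its real type
    })
```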
You can add error handling to the Derived Column step by selecting 'Configure Error Output'; there is a good example of how to do this here:
Hope this helps.
I am building a simple database with about 6-7 tables. I will be setting a schedule to do a clean import from a .txt file.
I want to take this data and create a report, like I would do in an Excel spreadsheet, convert it to a PDF, and post it to our company intranet for those interested to access it.
I'm trying to think of the best way to build my report. Would I just use an Excel spreadsheet with a direct connection to the database? Or would I create some sort of console application (C/C#/VB/VB.NET) that queries the database, generates the report in an Excel file, converts it to PDF, and saves it?
I'm quite comfortable in these languages, just not as experienced with reporting services (although I do have a lot of experience working with Excel and VBA macros). I want to get into SSRS and get familiar with it, as I will be doing a lot of projects like this in the future, and this seems like an easy one to get my hands dirty with, learn from, and build off of.
Any insight or suggestions would be greatly appreciated.
Thanks so much!
My suggestion:
Create desired SQL queries to retrieve the data in desired form
Link these queries to your Excel sheet, perhaps directly in the form of pivot tables for aggregating the results
Using VBA, you can easily create PDF from the data at the click of a button
The initial design will be time intensive, but after that, everything is automated and one just needs to press the button that creates the PDF.
How to link Access queries to your Excel file:
Data --> Get External Data
You can easily refresh all data whenever you open the Excel sheet by using the code below in the On Open event of the workbook:
ThisWorkbook.RefreshAll
If you need further clarification, do not hesitate to ask.
If your end goal is to create a PDF that will be out on your intranet then I would create the report in SSRS. Then you can schedule it to run and output a PDF to your network location.
I've had good experiences using a pivot table in Excel which is a connected table to your SQL database.
In the connection parameters in Excel there is a field where you can define your SQL query, whether it be to call a stored procedure or just a simple SELECT statement.
The main reason I prefer a pivot table SQL connection over a normal table connection is that if you have a chart that references the connected table, the chart formatting will be reset when you refresh the connection (if you need to update your report).
If I use a chart that references a pivot table (or a pivot chart) then the formatting is retained.
I have two tables in a database. I want to create a .csv file to track all activities done on those two tables. I need help combining the data from these two tables into one single .csv file in Oracle APEX.
Are you just trying to export the data to a CSV?
Write an SQL SELECT query to join the two tables together. If you need help with this, let us know.
Create a report and enter the SQL as its source.
Go to Report Attributes; there is an option called Enable CSV Output. Make sure this is set to Yes.
Run the page; there should be a hyperlink or button beneath the report which exports the file.
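The join-then-export idea in the steps above can also be sketched outside APEX (the table and column names below are hypothetical): in APEX, the join is the SELECT you put in the report source, and the CSV writing is what Enable CSV Output does for you.

```python
import csv
import io

# Hypothetical rows produced by joining the two tables on a shared key
# (in APEX, this is the SELECT ... JOIN in the report source).
joined = [
    {"activity_id": 1, "table_name": "ORDERS",    "action": "INSERT"},
    {"activity_id": 2, "table_name": "CUSTOMERS", "action": "UPDATE"},
]

# Write the joined rows as one CSV document.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["activity_id", "table_name", "action"])
writer.writeheader()
writer.writerows(joined)
csv_text = buf.getvalue()
```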