I have the following tasks in my SQL Server 2012 package:
Execute SQL Task to create a local temp table (I need to use a local temp table because I will be firing off concurrent instances of the same package)
Data Flow Task: to export from a SQL Server table into the local temp table created in step 1. I have the RetainSameConnection option set to TRUE on the data sources and DelayValidation set to TRUE on both tasks.
Once I run the package, I get this error:
The metadata could not be determined because statement
select * from #tmptable uses a temp table.
I researched this a lot but couldn't find a good solution. Any help, or a sample working SSIS package, would be appreciated.
I just had this same issue and found the solution is to create a fake dataset in the query, before your actual SELECT statement, that SELECTs static data in the same format.
SET FMTONLY ON
-- fake row with the same shape as the real query; this is all the metadata pass sees
select 0 as a, 1 as b, 'test' as c, GETDATE() as d
SET FMTONLY OFF
--now put the real query
select a, b, c, d from ##TempTable
See Kyle Hale's answer - SSIS Package not wanting to fetch metadata of temporary table
Go to the component's properties and set ValidateExternalMetadata to False so it won't try to check the metadata.
If that doesn't solve the problem then,
Use a global temporary table (##someTable) instead of a local temp table (#someTable). You can create that global temp table using SSMS, and when you then use it in SSIS it won't complain about the metadata.
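For example, a minimal sketch to run from SSMS (the columns are illustrative, not from the original); keep the SSMS session open so the table stays in scope while you design the package:

-- Run in SSMS and leave the session open during package development
CREATE TABLE ##someTable
(
    Id   int         NOT NULL,
    Name varchar(50) NULL  -- illustrative columns; match your real shape
);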
Update:
If you want to use a local temp table, then do this after you've followed the above steps:
From the SSIS menu in BIDS, select Work Offline mode and then change your global temp table to a local temp table. Make sure you do it from the Properties window, or better, use a variable to hold the SQL command text; otherwise SSIS will complain about the external metadata again.
Generally, I replace the temp table with a CTE or a table variable.
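As a rough sketch of that substitution (reusing the a/b/c/d columns from the example above), a table variable keeps the metadata visible to SSIS without any temp table tricks:

-- Table variable instead of a temp table; SSIS can read the metadata directly
DECLARE @tmp TABLE (a int, b int, c varchar(10), d datetime);
INSERT INTO @tmp (a, b, c, d)
SELECT 0, 1, 'test', GETDATE();
SELECT a, b, c, d FROM @tmp;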
Related
On the ETL server I have a DW user table.
On the prod OLTP server I have the sales database. I want to pull the sales only for users that are present in the user table on the ETL server.
Presently I am using an Execute SQL Task to fetch the DW users into an SSIS System.Object variable, then a Foreach Loop to loop through each item (userid) in that variable and, via a Data Flow Task, fetch the OLTP sales rows for each user and dump them into the DW staging table. The Foreach Loop is taking a long time to run.
I want to be able to do an inner join so that the response is quicker, but I can't do this since they are on separate servers. Neither can I use a global temp table to make the inner join, for the same reason.
I tried collecting the DW users into a comma-separated string variable and then using it (via STRING_SPLIT) to query the OLTP side, but this also takes a long time in the pre-execute phase (not sure why exactly), even for a small number of users.
I am also aware of the Lookup transform, but that too would result in all OLTP rows being brought over to the DW ETL server just to test the lookup condition.
Is there any alternate approach to be able to do an inner join by taking the list of users into the source?
Note: I do not have write permissions on the OLTP db.
Based on the comments, I think we can use a temporary table to solve this.
Can you help me understand this restriction? "Neither can I use a global temp table to make the inner join, for the same reason."
The restriction is that the OLTP server and the DW server are separate, so there can't be a global temp table common to both servers. Hope that makes sense.
The general pattern we're going to follow is:
Execute SQL Task to create a temporary table on the OLTP server (a sketch of this statement follows the list)
A Data Flow task to populate the new temporary table. Source = DW. Destination = OLTP. Ensure Delay Validation = True
Modify the existing Data Flow. Change the source to a query that uses the temporary table, i.e. SELECT S.* FROM oltp.sales AS S WHERE EXISTS (SELECT * FROM #SalesPerson AS SP WHERE SP.UserId = S.UserId); Ensure Delay Validation = True
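A minimal sketch of the step-1 statement, assuming the #SalesPerson shape implied by the step-3 query; it runs on the OLTP connection manager (with RetainSameConnection = True, as described below):

-- Executed by the Execute SQL Task against the OLTP server
IF OBJECT_ID('tempdb..#SalesPerson') IS NOT NULL
    DROP TABLE #SalesPerson;
CREATE TABLE #SalesPerson (UserId int NOT NULL PRIMARY KEY);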
A long form answer on using temporary tables (global to set the metadata, regular thereafter)
I don't use temp tables in SSIS.
Temporary tables live in tempdb, and your OLTP and DW connection managers likely do not point at tempdb. To be able to reference a temporary table, local or global, in SSIS you need to either define an additional connection manager for the same server that points explicitly at tempdb, so you can use the drop-down in the source/destination components (technically accurate but dumb), or use an SSIS variable to hold the name of the table and pick the "from variable" option in the source/destination components (the best option, maximum flexibility).
Soup to nuts example
I will use WideWorldImporters as my OLTP system and WideWorldImportersDW as my DW system.
One-time task
Open SQL Server Management Studio, SSMS, and connect to your OLTP system. Define a global temporary table with a unique name and the expected structure. Leave your connection open so the table structure remains intact during initial development.
I used the following statement.
DROP TABLE IF EXISTS ##SO_70530036;
CREATE TABLE ##SO_70530036(EmployeeId int NOT NULL);
Keep track of your query because we'll use it later on. But, as I advocate in my SSIS answers, perform the smallest task, test that it works, and then go on to the next. It's the only way to debug.
Connection Managers
Define two OLE DB Connection Managers. WWI_DW points to the named instance DEV2019UTF8 and WWI_OLTP points to DEV2019EXPRESS. Right-click on WWI_OLTP and select Properties. Find the property RetainSameConnection and flip it from the default of False to True. This ensures the same connection is used throughout the package. Since temporary tables go out of scope when the connection goes away, closing and reopening a connection in a package will result in a fatal error.
These two databases are on different instances, so we can't cheat and directly commingle the data.
Variables
Define 4 variables in SSIS, all of type String.
TempTableName - I used a value of ##SO_70530036 but use whatever value you specified in the One-time task section.
QuerySourceEmployees - This will be the query you run to generate the candidate set of data to go into the temporary table. I used SELECT TOP (3) E.[WWI Employee ID] AS EmployeeId FROM Dimension.Employee AS E WHERE E.[Is SalesPerson] = CAST(1 AS bit);
QueryDefineTables - Remember the drop/create statements from the one-time task? We're going to use the essence of them, but use the expression builder to let us dynamically swap the table name. I clicked the ellipses, ..., on the Expression section and used the following: "DROP TABLE IF EXISTS " + @[User::TempTableName] + "; CREATE TABLE " + @[User::TempTableName] + "( EmployeeId int NOT NULL);" You should be able to copy the Value from the row and paste it into SSMS to confirm it works.
QuerySales - This is the actual query you're going to use to pull your filtered set of sales data. Again, we'll use the Expression to let us dynamically reference the temporary table name. The prettified version of the expression looks something like
"SELECT
SI.InvoiceID
, SI.SalespersonPersonID
, SO.OrderID
, SOL.StockItemID
, SOL.Quantity
, SOL.OrderLineID
FROM
Sales.Invoices AS SI
INNER JOIN
Sales.Orders AS SO
ON SO.OrderID = SI.OrderID
INNER JOIN
Sales.OrderLines AS SOL
ON SO.OrderID = SOL.OrderID
WHERE
EXISTS (SELECT * FROM " + @[User::TempTableName] + " AS TT WHERE TT.EmployeeID = SI.SalespersonPersonID);"
Again, you should be able to pull the Value from the three queries and run them independently and verify they work.
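For example, the evaluated Value of QueryDefineTables should paste cleanly into SSMS as:

DROP TABLE IF EXISTS ##SO_70530036;
CREATE TABLE ##SO_70530036 ( EmployeeId int NOT NULL);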
Execute SQL Task
Add an Execute SQL Task to the Control Flow. I named mine SQL Create temporary table. My Connection Manager is WWI_OLTP; I changed the SQLSourceType to Variable and the SourceVariable to User::QueryDefineTables.
Every time your package runs, the first thing it will do is create the temporary table. That's good, because SSIS is a metadata-driven ETL engine and the next two steps would fail if the table didn't exist.
Data Flow Task - Prime the pump
This data flow is where we'll transfer DW data back to the OLTP system so we can filter in the source system.
Drag a Data Flow Task onto the Control Flow. I named mine DFT Load Temp. Before you click into it, right-click on the task, find the DelayValidation property, and change it from the default of False to True. Normally, a package validates all metadata before actual execution begins; the idea is that you want to know everything is good before any data starts moving. Since we're using temporary tables, we need to tell the execution engine "trust us, it'll be ready".
Double click inside the Data Flow Task.
Add an OLE DB Source. I named mine OLESRC SourceEmployees and used the connection manager WWI_DW. The data access mode changes to SQL command from variable, and I then select my variable User::QuerySourceEmployees.
Add an OLE DB Destination. I named mine OLEDST TempTableName and double-clicked to configure it. The Connection Manager is WWI_OLTP and again, since the table lives in tempdb, we can't select it from the drop-down. Change the Data access mode to Table name or view name variable - fast load and then select the variable User::TempTableName. Click the Mappings page and ensure source columns map to destination columns.
Data Flow Task - Transfer data
Finally, we will pull our source data, nicely filtered against the data from our target system.
Add an OLE DB Source. I named it OLESRC QuerySales. The Connection Manager is WWI_OLTP. The data access mode again changes to SQL command from variable, and the variable name is User::QuerySales.
From here, do whatever else you need to do to make the magic happen.
Instead of the 270k rows an unfiltered query returns, I have 67k, as there are only 3 employees in the temporary table.
Reference package
But wait, there's more!
Close out Visual Studio, open it back up, and try to touch something in the data flows. Suddenly, there are red Xs everywhere! Any time you close a data flow component, it fires a revalidate-metadata operation, and guess what, it can't do that because the connection to the temporary table is gone.
The package will run fine, and it will not throw VS_NEEDSNEWMETADATA, but editing/maintenance becomes a pain.
If you switched from the global temporary table to a local one, switch the table name variable's value back to the global name and re-run the definition statement in SSMS. Once that's done, you can continue editing the package.
I assure you, the local temporary table does work once you have the metadata set and you use queries via variables for source/destination.
No need for the global temporary table hack, or the SET FMTONLY OFF hack (which no longer works).
Just specify the result set metadata in the SQL query with WITH RESULT SETS, e.g.
EXEC ('
create table #t
(
ID INT,
Name VARCHAR(150),
Number VARCHAR(15)
)
insert into #t (Id, Name, Number)
select object_id, name, 12
from sys.objects
select * from #t
')
WITH RESULT SETS
(
(
ID INT,
Name VARCHAR(150),
Number VARCHAR(15)
)
)
If you need to parameterize the query, there's a bit of a catch, because there are limitations in how SSIS discovers parameters. SSIS runs sp_describe_undeclared_parameters, which doesn't really work with batches that call sp_executesql, because sp_executesql handles parameters in a unique way, one you couldn't replicate with a user stored procedure.
So to parameterize the query you'll either need to pass the parameter values into the query using "SQL command from variable" and SSIS expressions, or push all this T-SQL into a stored procedure.
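As a rough sketch of the stored-procedure route (the procedure name and @MinId parameter are illustrative, not from the original):

-- Hypothetical procedure wrapping the batch so SSIS can discover the parameter
CREATE PROCEDURE dbo.ListObjects
    @MinId int
AS
BEGIN
    CREATE TABLE #t
    (
        ID INT,
        Name VARCHAR(150),
        Number VARCHAR(15)
    );
    INSERT INTO #t (ID, Name, Number)
    SELECT object_id, name, '12'
    FROM sys.objects
    WHERE object_id >= @MinId;
    SELECT ID, Name, Number FROM #t;
END

The SSIS source query would then become something like EXEC dbo.ListObjects @MinId = ? WITH RESULT SETS ((ID INT, Name VARCHAR(150), Number VARCHAR(15)));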
I am using VS 2019 and comparing 2 databases using Schema Compare. Most of the time it finds the differences just fine and replicates them across. However, on some Stored Procedures it highlights a difference because of a schema name missing. e.g. [dbo]
Example:
Left Panel (local DB)
CREATE Procedure MyNewProcedure
(
@prmParam1 int
)
AS
Select * from TableA where Id = @prmParam1
And on the Right hand Panel (remote DB) it shows the schema name correctly:
CREATE Procedure [dbo].[MyNewProcedure]
(
@prmParam1 int
)
AS
Select * from TableA where Id = @prmParam1
If I run the update, it will create another version of this stored procedure but it won't belong to the dbo schema - it will take on the schema name of the connection.
It only does this for a handful of stored procedures; for all the rest it adds the [dbo]. to each of the create statements.
I cannot figure out why, as I have to manually create the SP on the remote DB and then do an Exclude for each of these in the compare so it won't delete my new SP on the remote and create a new one without the [dbo].
Anyone got any ideas or seen this before?
I don't think I am creating the original SP (local) in any different way - just a blank New Query window, type it in, and hit F5.
Thanks in advance,
Ro
I have created a package in SSIS in which I use some date variables inside my SQL statements (i.e. DECLARE @DateIn = '2018-02-22' and DECLARE @DateTo = '2018-03-22') in order to load the corresponding data into the tables of the data warehouse.
What I need is a task, or a separate package, that gives me the possibility to define the values of these variables externally every time I run it, so I can fill the warehouse tables with the data that corresponds to the dates I set each time.
From what I've read, I should maybe use a Script Task, an Execute SQL Task, or parameters.
Could you help me please, or suggest a good tutorial/link? I have found plenty but can't decide whether they meet the needs of what I am describing above.
Thank you
Create a DTSX package with variables DateStart and DateEnd
Create a table containing 3 columns: DateStart, DateEnd, Active
Create a stored procedure that reads DateStart and DateEnd where Active = 1 from your newly created table and does an alter on the SQL Server job, updating the values of the variables inside your DTSX package with your desired values using sp_update_jobstep (a sketch follows these steps).
See link
Example of the command:
dtexec /f YourPackage.dtsx
/set \package.variables[DateStart].Value;myvalue
/set \package.variables[DateEnd].Value;myvalue
Add sp_start_job inside the stored procedure to start the job with the new variable values.
Create a job with 1 step containing the execution of the stored procedure from Step 3.
All you need to do is update the values in your table created in Step 2 and then execute the job to run the stored procedure, which updates the DTSX exec command and starts it. You can trigger this from a website and control the table values from textboxes.
Note that specific permissions are required, and the SP that updates the SQL Agent job needs to be run by a sysadmin.
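A minimal sketch of the Step 3 procedure, assuming a job named 'RunMyPackage' whose step 1 runs the dtexec command, and a Step 2 table named dbo.PackageDates (all names are hypothetical):

-- Hypothetical names throughout; adjust to your own job and table
CREATE PROCEDURE dbo.UpdateAndRunPackageJob
AS
BEGIN
    DECLARE @cmd nvarchar(4000), @ds varchar(10), @de varchar(10);

    SELECT @ds = CONVERT(varchar(10), DateStart, 120),
           @de = CONVERT(varchar(10), DateEnd, 120)
    FROM dbo.PackageDates
    WHERE Active = 1;

    SET @cmd = N'dtexec /f YourPackage.dtsx'
             + N' /set \package.variables[DateStart].Value;' + @ds
             + N' /set \package.variables[DateEnd].Value;' + @de;

    -- rewrite the job step's command line, then kick off the job
    EXEC msdb.dbo.sp_update_jobstep @job_name = N'RunMyPackage', @step_id = 1, @command = @cmd;
    EXEC msdb.dbo.sp_start_job @job_name = N'RunMyPackage';
END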
Good question, by the way, for a new learner!
There are many ways to handle this scenario; I have mentioned a few of them below.
1 - Create variables DateIn and DateTo in the variables pane for storing the dates; the data type will be Date.
Now put the two entries in an Excel, text, or XML file for these two variables, read them with a Foreach Loop container, and assign the values to the variables.
2 - Create a SQL table in which you can store those values, either manually on a daily basis or by loading the table from an Excel, text, XML, or CSV file; then query the table in an Execute SQL Task, select the result set, and pass the result set values to the variables (see the sketch below).
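A minimal sketch of option 2, with a hypothetical control table (all names are illustrative):

-- Hypothetical control table holding the load window
CREATE TABLE dbo.LoadWindow (DateIn date NOT NULL, DateTo date NOT NULL);
INSERT INTO dbo.LoadWindow (DateIn, DateTo) VALUES ('2018-02-22', '2018-03-22');

-- Query for the Execute SQL Task (ResultSet = Single row);
-- on the Result Set page map DateIn -> User::DateIn and DateTo -> User::DateTo
SELECT TOP (1) DateIn, DateTo FROM dbo.LoadWindow;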
I hope it will solve your problem.
I am trying to create a global temp table using the results from one query, which can then be selected as a table and manipulated further several times without having to reprocess the data over and over.
This works perfectly in SQL Server Management Studio, but when I try to add the table through an Excel query, the table can be referenced at that time, yet it is not created in Temporary Tables in the tempdb database.
I have broken it down into a simple example.
If I run this in SQL Server Management Studio, the result of 1 is returned as expected, and the table ##testtable1 is created in Temporary Tables.
set nocount on;
select 1 as 'Val1', 2 as 'Val2' into ##testtable1
select Val1 from ##testtable1
I can then run another select on this table, even in a different session, as you'd expect. E.g.
Select Val2 from ##testtable1
If I don't drop ##testtable1, running the below in a query in Excel returns the result of 2 as you'd expect.
Select Val2 from ##testtable1
However, if I run the same Select... into ##testtable1 query directly in Excel, it correctly returns the result of 1, but the temp table is not created.
If I then try to run
Select Val2 from ##testtable1
As a separate query, it errors saying "Invalid object name '##testtable1'".
The table is not listed within Temporary Tables in SQL management studio.
It is as if a drop is performed on the table after the query has finished executing, even though I am not calling a drop.
How can I resolve this?
Read up on global temp tables (GTTs). They persist only as long as there is a session referencing them. In SSMS, if you close the session that created the GTT before using it in another session, the GTT is discarded. This is what is happening in Excel: Excel creates a connection, executes, and disconnects. Since there are no sessions using the GTT when Excel disconnects, the GTT is discarded.
I would highly recommend you create a normal table rather than use a GTT. Because of their temporary nature and dependence on an active session, you may get inconsistent results when using a GTT. If you create a normal table instead, you can be certain it will still exist when you try to use it later.
The code to create/clean the table is pretty simple.
IF OBJECT_ID('db.schema.tablename') IS NOT NULL
    TRUNCATE TABLE [tablename]
ELSE
    CREATE TABLE [tablename] (...)  -- column definitions elided
GO
You can change the TRUNCATE to a DELETE to clean up a specific set of data, and place this at the start of each one of your queries.
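For instance, something like this (the LoadDate column and variable are hypothetical):

-- Targeted cleanup instead of TRUNCATE
DECLARE @LoadDate date = '2018-02-22';  -- hypothetical slice to reload
DELETE FROM [tablename] WHERE LoadDate = @LoadDate;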
Is it possible you could use a view? Assuming that you are connecting to 5 DBs on the same server, can you union the data together in a view:
CREATE VIEW [dbo].[testView]
AS
SELECT *
FROM database1.dbo.myTable
UNION
SELECT *
FROM database2.dbo.myTable
Then in excel:
Data > New Query > From Database > From SQL Server Database
enter DB server
Select the view from the appropriate DB - done :)
OR call the view however you are doing it (e.g. VBA, etc.)
Equally, you could use a stored procedure and call that from VBA... basically anything that moves more of the complexity to the server side to make your life easier :D
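A rough sketch of that stored-procedure variant (the name is illustrative):

-- Hypothetical proc wrapping the same union, callable from Excel/VBA
CREATE PROCEDURE dbo.GetMyTableUnion
AS
BEGIN
    SELECT * FROM database1.dbo.myTable
    UNION
    SELECT * FROM database2.dbo.myTable;
END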
You can absolutely do this. Notice how I'm building a temp table from a SQL string called 'TmpSql'... this could be any query you want. Then I set it to recordset 1. Then I create another recordset 2 that goes and gets the temp table data.
Imagine if you were looping on the first cn.Execute while TmpSql changes. This lets you build a temporary table coming from many sources or changing variables. This is a powerful solution.
Dim cn As New ADODB.Connection   ' requires a reference to Microsoft ActiveX Data Objects
Dim rs1 As ADODB.Recordset
Dim rs2 As New ADODB.Recordset
Dim sql As String, GetTmp As String

cn.Open "Provider= ..."          ' connection string elided in the original
sql = "Select t.* Into #TTable From (" & TmpSql & ") t "
Set rs1 = cn.Execute(sql)        ' creates and populates #TTable; returns a closed recordset
GetTmp = "Select * From #TTable"
rs2.Open GetTmp, cn, adOpenDynamic, adLockBatchOptimistic   ' same connection, so #TTable is still in scope
If Not rs2.EOF Then Call Sheets("Data").Range("A2").CopyFromRecordset(rs2)
rs2.Close
cn.Close
I have a requirement where I need to create a local temp table inside an SSIS package, populate it, and later use it in a data flow. As a POC, I started by creating the temp table inside an Execute SQL Task and populating it, then using the values present in the temp table to populate the physical table. Following what I found on the net, I set RetainSameConnection to true, used Work Offline, and all of that. But I am only able to make it work with a global temp table; when I use a local temp table instead, I get an error at the OLE DB source saying that the #ABC.. tmp file could not be found. Any suggestions why it is not working for local tables?
You can set the Data Flow Task property DelayValidation = 'True'.
On your DB connection manager, go to Properties; there is a setting "RetainSameConnection" - change that to True.
Set the MARS Connection property in the connection manager to TRUE.