Our table-valued stored procedures are taking longer than usual to load data. Please find the details below:
Number of rows: 1.3M
Time taken to load (old): 3 min
Time taken to load (new): 25-40 min.
Nothing has changed on the server. Can you please give me pointers as to what needs to be looked into?
Application: we have deployed an SSIS package in SQL Server that extracts data from different Teradata sources in a loop, builds a DataTable, and executes a stored procedure using a Script Component.
Regards,
Swathi S
We have tried putting the insert within a transaction:
INSERT INTO TargetTable (col1, col2, ...)  -- TargetTable: destination table (name omitted in the original)
SELECT col1, col2
FROM @tvpvariable
We have an external table created; we need to run a SELECT on the table and return all the records, but the SELECT runs very slowly. It doesn't complete even after 30 minutes, and the table contains around 2 million records.
We also need to query this table from another database, and that query is also very slow; it doesn't return even after 30 minutes.
Select is of the form:
select col1, col2,...col3 from ext_table;
Need help in:
1. Any suggestions on reducing the time taken for execution?
Note: we need to select the entire content of the table, so a WHERE condition cannot be used.
Thanks in advance.
If you are not using the WHERE clause to push parameters to the remote database, then there is no way to optimize the performance of the query. You are returning the whole table.
My suggestion is to use SQL Data Sync to have a local copy of the table on this SQL Database that synchronizes with the remote Azure SQL Database at X interval of time.
I have created a package in SSIS in which I use some date variables inside my SQL statements (i.e. DECLARE @DateIn DATE = '2018-02-22' and DECLARE @DateTo DATE = '2018-03-22') in order to load the corresponding data into the tables of the data warehouse.
What I need to do is create a task or a separate package that lets me set the values of these variables externally each time I run it, so that the warehouse tables are filled with the data corresponding to the dates I set each time.
From what I've read, I should maybe use a Script Task, an Execute SQL Task, or package parameters.
Could you help me, please? Or could you suggest a good tutorial/link?
I have found plenty but can't decide whether they meet the needs of what I am describing above.
Thank you
1. Create a DTSX package with variables @DateStart and @DateEnd.
2. Create a table containing 3 columns: DateStart, DateEnd, Active.
3. Create a stored procedure that reads DateStart and DateEnd where Active = 1 from your newly created table and alters the SQL Server Agent job, updating the values of the variables inside your DTSX package with your desired values using sp_update_jobstep.
See link
Example of the command:
dtexec /f YourPackage.dtsx
/set \package.variables[DateStart].Value;myvalue
/set \package.variables[DateEnd].Value;myvalue
4. Add sp_start_job inside the stored procedure to start the job with the new variable values.
5. Create a job with 1 step containing the execution of the stored procedure from Step 3.
All you need to do is update the values in the table created in Step 2 and then execute the job, which runs the stored procedure to update the DTSX job's exec command and start it. You can trigger this from a website and set the table's values from textboxes.
Note that specific permissions are required, and the stored procedure that updates the SQL Agent job needs to be run by a sysadmin.
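A minimal sketch of the stored procedure that updates and starts the job; the table, job, and package names here are illustrative assumptions, not from the original:

```sql
-- Illustrative only: dbo.PackageDates, YourJobName, and YourPackage.dtsx are assumed names.
CREATE PROCEDURE dbo.usp_RunPackageWithDates
AS
BEGIN
    DECLARE @DateStart DATE, @DateEnd DATE, @cmd NVARCHAR(4000);

    -- Read the currently active date range from the parameter table
    SELECT TOP (1) @DateStart = DateStart, @DateEnd = DateEnd
    FROM dbo.PackageDates
    WHERE Active = 1;

    -- Rebuild the dtexec command line with the current dates
    SET @cmd = N'dtexec /f YourPackage.dtsx'
             + N' /set \package.variables[DateStart].Value;' + CONVERT(NVARCHAR(10), @DateStart, 120)
             + N' /set \package.variables[DateEnd].Value;'   + CONVERT(NVARCHAR(10), @DateEnd, 120);

    -- Update the job step, then start the job
    EXEC msdb.dbo.sp_update_jobstep
        @job_name = N'YourJobName',
        @step_id  = 1,
        @command  = @cmd;

    EXEC msdb.dbo.sp_start_job @job_name = N'YourJobName';
END
```

The job step is assumed to be of type Operating system (CmdExec), so that its command is the dtexec line being rewritten.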
By the way, good question for a new learner!
There are many ways to handle this scenario; I have mentioned a few of them below.
1. Create variables @DateIn and @DateTo in the Variables pane for storing the dates, with data type Date. Then put the two entries in an Excel, text, or XML file, read it with a Foreach Loop container, and assign the values to the variables.
2. Create a SQL table in which you store those values, either manually on a daily basis or by loading the table from an Excel, text, XML, or CSV file. Then query the table in an Execute SQL Task, select the result set, and map the result-set values to the variables.
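For the second approach (storing the dates in a SQL table and reading them with an Execute SQL Task), a minimal sketch; the table and column names are assumptions for illustration:

```sql
-- Hypothetical parameter table; names are illustrative.
CREATE TABLE dbo.LoadDates
(
    DateIn DATE NOT NULL,
    DateTo DATE NOT NULL
);

INSERT INTO dbo.LoadDates (DateIn, DateTo)
VALUES ('2018-02-22', '2018-03-22');

-- Query for the Execute SQL Task, with ResultSet set to "Single row".
-- On the Result Set page, map column DateIn to User::DateIn and DateTo to User::DateTo.
SELECT TOP (1) DateIn, DateTo
FROM dbo.LoadDates;
```

The package's data-flow queries can then reference User::DateIn and User::DateTo instead of hard-coded dates.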
I hope it will solve your problem.
I have a very bizarre error here I can't get my head around.
The following T-SQL:
CREATE TABLE #Contribs ( ID VARCHAR(100), Contribution FLOAT )
INSERT INTO #Contribs
EXEC [linkedserver].[catalogue].[schema].LocalContrib
SELECT * FROM #Contribs
creates a simple temp table in my server, fills it with data from a linked server and views the data.
When I run the remote procedure on its own, it provides me a list of (text,float) pairs.
When I run the whole T-SQL without requesting the actual execution plan, it shows me this list of pairs correctly inside the temp table.
When I run the whole T-SQL along with its actual execution plan, it fails and returns me the message 'Column name or number of supplied values does not match table definition'.
Does anyone know why this is happening or what I can do about it? It seems perverse to me that the display of the execution plan should interfere with the execution of the statement itself. It's rather annoying because I wish to examine the execution plan of a parent stored procedure that contains this code. I don't know what the procedure 'LocalContrib' being called looks like on the inside and I'm running SQL Server 2012.
Thanks.
I have a stored procedure in a SQL Server 2008 database that returns a set of values pulled from various different tables such as the following. I run the stored procedure as shown below, without any parameters.
EXEC [Data].[dbo].[sp_Usage]
Each row shows the product usage data such as
Last Login
No.of times used last month
last 3 months
last 6 months
App Version
for each unique AccountId
I want to run this stored procedure automatically every month/week and store the corresponding results in the database, without erasing the last week/month's data.
I plan to use this data over time to do data trending.
How should I execute this plan?
Any help or guidance will be much appreciated
Cheers!
Shiny
So your stored procedure presumably has a SELECT (list of columns) ..... inside it, right?
Why not change that to something like:
INSERT INTO dbo.YourBigTable(ExecutionDateTime, ...other columns here.....)
SELECT
GETDATE(), -- get current date/time
(list of other columns)
FROM .......
Just basically make the stored procedure run the INSERT into your target table directly. That would seem like the easiest way to go from my point of view.
Otherwise, you need to insert the data returned from the stored procedure into a temporary table, and then insert that temporary table's data, along with the execution date/time, into your target table.
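A sketch of the alternative described in the last paragraph, capturing the procedure's output with INSERT ... EXEC; the history table's columns are assumptions based on the usage fields listed in the question:

```sql
-- Hypothetical history table; column names and types are assumptions.
CREATE TABLE dbo.UsageHistory
(
    CapturedAt      DATETIME     NOT NULL DEFAULT GETDATE(),  -- when this snapshot was taken
    AccountId       INT,
    LastLogin       DATETIME,
    UsedLastMonth   INT,
    UsedLast3Months INT,
    UsedLast6Months INT,
    AppVersion      VARCHAR(50)
);

-- Capture one snapshot of the procedure's result set.
-- The column list must match the procedure's output exactly.
INSERT INTO dbo.UsageHistory
    (AccountId, LastLogin, UsedLastMonth, UsedLast3Months, UsedLast6Months, AppVersion)
EXEC [Data].[dbo].[sp_Usage];
```

Scheduling this statement as a weekly or monthly SQL Server Agent job accumulates the snapshots without erasing earlier ones, since CapturedAt records when each batch was inserted.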
I have done several SSIS packages over the past few months to move data from a legacy database to a SQL Server database. It normally takes 10-20 minutes to process around 5 million records, depending on the transformation.
The issue I am experiencing with one of my packages is very poor performance, because one of the columns in my destination is of the SQL Server XML data type.
Data comes in like this: 5
A script creates a Unicode string like this: <XmlData><Value>5</Value></XmlData>
Destination is simply a column with XML data type
This is really slow. Any advice?
I did a SQL trace and noticed that behind the scenes SSIS executes a convert on each row before the insert:
declare @p as xml
set @p = convert(xml, N'<XmlData><Value>5</Value></XmlData>')
Try using a temporary table to store the resulting 5 million records without the XML transformation, and then use SQL Server itself to move them from tempdb to the final destination:
INSERT INTO final_destination (...)
SELECT CAST(N'<XmlData><Value>' + raw_value + N'</Value></XmlData>' AS XML) AS batch_converted_xml, col1, col2, colX
FROM #tempTable
(Here raw_value stands for the staged source value, e.g. 5.) If 5,000,000 rows turn out to be too much data for a single batch, you can do it in smaller batches (100k rows should work like a charm).
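One way to sketch those smaller batches, assuming the staging temp table has an integer id column to range over (the id, raw_value, and xml_col names are illustrative assumptions):

```sql
-- Batched copy from the staging temp table; id is an assumed integer key.
DECLARE @i     INT = 0,
        @max   INT,
        @batch INT = 100000;

SELECT @max = MAX(id) FROM #tempTable;

WHILE @i < @max
BEGIN
    -- Move one range of rows, building the XML in a set-based statement
    INSERT INTO final_destination (xml_col, col1)
    SELECT CAST(N'<XmlData><Value>' + raw_value + N'</Value></XmlData>' AS XML),
           col1
    FROM #tempTable
    WHERE id > @i AND id <= @i + @batch;

    SET @i = @i + @batch;
END
```

Each iteration keeps the transaction and log usage bounded, instead of converting and inserting all 5 million rows in one statement.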
The record captured by the profiler looks like an OLE DB transformation issuing one command per row.