SQL Server instance hanging randomly

I have a SQL Server Agent job running every 5 minutes that executes an SSIS package from the SSIS Catalog. That package does the following:
- DELETE all existing data on OLTP_DB
- Extract data from the production DB
- DELETE all existing data on OLAP_DB
- Extract the transformed data from OLTP_DB into OLAP_DB
PROBLEM
The job mentioned above hangs randomly, for a reason I don't know. I only noticed it through Activity Monitor; every time it hangs, it shows waits like CXPACKET and ASYNC_NETWORK_IO.
If I try to run any query against that database, it does not respond; it just says "Executing..." and nothing happens until I stop the job.
The average running time for the job is 5 or 6 minutes, but when it hangs it can stay running for days if I don't stop it. :(
WHAT I HAVE DONE
- Set DelayValidation: True
- Improved my queries
- No transactions running
- No locking or blocking (I guess)
- Rebuilt and reorganized indexes
- Ran DBCC FREEPROCCACHE
- Ran DBCC FREESESSIONCACHE
- Etc.
My settings:
- Recovery model: Simple
- SSIS OLE DB Destination:
  1. Keep identity (checked)
  2. Keep nulls (checked)
  3. Table lock (checked)
  4. Check constraints (unchecked)
- Rows per batch: (blank)
- Maximum insert commit size: 2147483647
Note:
I have another (small) job running an SSIS package on the same instance but against different databases. When the main ETL mentioned above hangs, this small one sometimes hangs too, which is why I think the problem is with the instance (I guess).
I'm open to providing more information as needed.
Any assistance or help would be really appreciated!

As Jeroen Mostert said, it's showing CXPACKET, which means it's executing some work in parallel.
It's also showing ASYNC_NETWORK_IO, which means it's also transferring data over the network.
There could be many reasons. Just a few more hints:
- Have you checked whether the network connection is slow?
- What is the size of the data being transferred vs. the speed of the network?
- Is there an antivirus running that could slow the data transfer?
My guess is that there is a lot of data to transfer and it's simply taking a long time. I would suspect either I/O or the network, but since ASYNC_NETWORK_IO accounts for most of the cumulative wait time, I would go for the network.
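To see for yourself which sessions are waiting and on what while the job appears hung, a generic diagnostic against the DMVs (not specific to this package) could look like this:

-- List active requests, their current wait type, and the statement text:
SELECT r.session_id, r.status, r.wait_type, r.wait_time,
       r.blocking_session_id, t.text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID          -- exclude this diagnostic query itself
ORDER BY r.wait_time DESC;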

As Jeroen Mostert and Danielle Paquette-Harvey said, by right-clicking in Activity Monitor I could figure out that I had an object that was executing in parallel (for some reason, from the past). To fix the problem, I removed the parallel structure and put everything to run in one batch.
Now it is working like a charm!!
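For anyone facing something similar: if restructuring the package isn't an option, one generic way to force a single statement to run serially (not necessarily the exact change made here) is a MAXDOP hint:

-- Hypothetical table names; MAXDOP 1 disables parallelism for this statement only.
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT Col1, Col2
FROM dbo.SourceTable
OPTION (MAXDOP 1);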

Related

BULK INSERT Task Issues

First, I am new to SSIS, so I am still getting the hang of things.
I am using Visual Studio 19 and SSMS 19.
Regardless, I have set up an OLE DB package that loads a .TSV file into a SQL Server table. The issue is that it took 1 hour and 11 minutes to execute for 500,000 rows.
The data is extremely variable, so I have set up a staging table that is essentially all varchar(max) columns. Once all the data is inserted, I was going to look at some aggregations like MAX(LEN(<column_name>)) in order to better optimize the table and the SSIS package, as sketched below.
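For example (hypothetical column names), the sizing query I have in mind would look like this:

-- Find the widest value actually stored in each staging column:
SELECT
    MAX(LEN(Col1)) AS MaxLenCol1,
    MAX(LEN(Col2)) AS MaxLenCol2   -- repeat for each column
FROM dbo.StagingTable;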
Anyways, there are 10 of these files so I need to create a ForEach File loop. This would take at minimum (1.17 hours)*10=11.70 hours of total runtime.
I thought this was a bit long, so I created a Bulk Insert Task, but I am having some issues.
It seems very straightforward to set up.
I added the Bulk Insert Task to the Control Flow tab and went into the Bulk Insert Task Editor dialog box.
From there, I configured the Source and Destination connections, both of which went very smoothly. I only have one local instance of SQL Server on my machine, so I used localhost.<database_name> and the table name for the destination connection.
I run the package and it executes just fine, without any errors or warnings. It takes less than a minute for a roughly 600 MB .TSV file to load into a table with about 300 varchar(max) columns.
I thought this was too quick, and it was: nothing loaded, but the package executed!!!
I have tried searching for this issue with no success. I checked my connections too.
Do I need Data Flow Tasks for Bulk Insert Tasks? Do I need any connection managers? I had to configure Data Flow Tasks and connection managers for the OLE DB package, but the articles I have referenced do not do this for Bulk Insert Tasks.
What am I doing wrong?
Any advice from someone more well-versed in SSIS would be much appreciated.
Regarding my comment about using a derived column in place of a real destination, you can do this in a couple of steps:
1. Run the read task only and see how long this takes. Limit the total read to a sample size so your test does not take an hour.
2. Run the read task with a derived column as the destination. This tests the total read time plus the amount of time needed to load the data into memory.
If 1) takes a long time, it could indicate a bottleneck from slow read times on the disk where the file lives, or a network bottleneck if the file is on a shared drive on another server. If 2) adds a lot more time, it could indicate a memory bottleneck on the server where SSIS is running. Note that testing this on the server itself is the best way to measure performance, because it removes issues that probably won't exist there, such as network bottlenecks and memory constraints.
Lastly, please turn on the AutoAdjustBufferSize property. This changes how DefaultBufferSize (max memory in the buffer) and DefaultBufferMaxRows (total rows allowed in each buffer; these are the numbers you see next to the arrows in the data flow when you run the package interactively) are applied. Because your column sizes are so large, this hints to the server to maximize the buffer size, which gives you a bigger and faster pipeline to push the data through.
One final note: if you add the real destination and it has a significant impact on time, you can look into issues with the target table. Make sure there are no indexes (including a clustered index), make sure TABLOCK is on, and make sure there are no constraints or triggers.
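For comparison, the same load expressed directly in T-SQL (hypothetical table name and file path; a .TSV implies a tab field terminator) would look roughly like this, which can help isolate whether the problem is the task configuration or the data itself:

BULK INSERT dbo.StagingTable          -- hypothetical target table
FROM 'C:\data\file1.tsv'              -- hypothetical file path
WITH (
    FIELDTERMINATOR = '\t',           -- tab-separated columns
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,              -- skip a header row, if present
    TABLOCK                           -- allows minimal logging on a heap
);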

How to configure Oracle Data Integrator to restart when a job errors?

My company is using Oracle Data Integrator for ETL jobs. Recently, there has been an issue with a source database that causes the extract job to fail sometimes (very randomly, once or twice per 10 extract jobs). When we restart the job, most of the time it runs successfully.
So while we are trying to fix the connection to the source database, is there any way to automatically restart that particular job 1 or 2 times if it fails? How can I configure that?
Thanks!
You can enclose the scenario in a package. Then set the Processing after failure options in the package's Advanced tab.

How can I get my SSRS Data sources/Data sets to run in parallel?

I'm having issues with my SSRS reports running slowly. Using SQL Profiler, I found out that the queries are running one at a time. I did some research and found the suggestion to make sure "Use single transaction when processing the queries" was not checked in my data source; it was already off. I am now testing whether it is not only the datasets that won't run in parallel, but the data sources as well.
Using SQL Profiler, I'm finding that my single .NET client process logs into the first data source and sets up properties:
SELECT
DATABASEPROPERTYEX(DB_NAME(), 'Collation'),
COLLATIONPROPERTY(CONVERT(char, DATABASEPROPERTYEX(DB_NAME(), 'collation')),'LCID')
and then runs my SQL statement. After completion, the same ClientProcessID moves on to the next data source and does the same there.
Has anyone run into this problem before? Are there other issues at play?
Thanks
Are you running/testing these on the reporting server, or from your development machine? The dataset queries will not run in parallel in BIDS, but they should on the server. (Posted in comments by R. Richards)

Implications of using a WAITFOR DELAY task in an SSIS package on a scheduled server

I have a question regarding the implications of using WAITFOR DELAY in an Execute SQL Task in an SSIS package. Here's what's going on: I have source data tables that, due to the amount of data and the linked server connection (yada yada), are dropped and recreated every night. Before my package that utilizes this data runs, I have a For Loop container. In this container, an Execute SQL Task checks whether my source tables exist; if they do not, it sends me an email via an email task, then goes to an Execute SQL Task with a WAITFOR DELAY of 30 minutes (before looping and checking for the source tables again). Now, I thought I was pretty slick with this design, but others on my team are concerned because they do not know enough about this WAITFOR task. They are concerned that my package could interfere with theirs, slow down the server, use resources, etc.
From my Google searches, I didn't see anything that seemed like it would cause issues. Can anyone here speak to the implications of using this task?
SQL WAITFOR is ideal for this requirement, IMO. I've been using it in production SSIS packages for years with no issues. You can monitor it via SSMS Activity Monitor and see that it doesn't consume any resources.
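For reference, the polling statement inside the Execute SQL Task can be as simple as this (hypothetical table name); WAITFOR DELAY suspends the session rather than spinning, so it costs essentially nothing while waiting:

-- Sleep 30 minutes only if the source table does not exist yet:
IF OBJECT_ID('dbo.SourceTable', 'U') IS NULL   -- hypothetical table name
    WAITFOR DELAY '00:30:00';                  -- negligible CPU while waiting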

System.Data.SqlClient.SqlException: Timeout expired on commit

I am writing some code that imports a large amount of data into three tables, currently around 6 million rows across the three. I want to do this in a transaction, so that if there are any issues or the user cancels the import, nothing is imported. This works fine on my own development machine; however, on a slower Amazon EC2 instance with a micro SQL instance, I get the following exception:
System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding
Now, I know that the commit does finish eventually, because the data is present in the tables when I look. So my question is: can this be easily avoided without adding the connection timeout property to my connection string (I only want this one operation to not time out), or is this a really hard/dangerous thing to be doing?
I am not sure if I should instead import into holding tables and then call stored procedures to move the data when I am ready, since I would assume this results in a shorter transaction, something like the sketch below.
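A minimal sketch of that hand-off (hypothetical table and column names), where only the final move runs inside a short server-side transaction:

-- Move staged rows into the live table in one short transaction:
BEGIN TRANSACTION;
INSERT INTO dbo.LiveTable (Col1, Col2)   -- hypothetical names
SELECT Col1, Col2
FROM dbo.HoldingTable;
TRUNCATE TABLE dbo.HoldingTable;         -- clear the staging data
COMMIT TRANSACTION;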
I am using MS SQL Server 2012.
Do comment if I need to add more data.
Many thanks for your help
Check which SP is timing out. If you have a third-party tool like Redgate or Avicode, you can figure it out with that; otherwise, use Profiler. Then look at the execution plan for the SP or query. If you find any Key Lookups or RID Lookups, resolve those first and try again.
