How can I have my polling service call SSIS sequentially? - sql-server

I have a polling service that checks a directory for new files; when a new file appears, I invoke an SSIS package.
In some instances, I can't have SSIS run if another instance of SSIS is already processing another file.
How can I make SSIS run sequentially in those situations?
Note: running SSIS packages in parallel is fine in some circumstances but not in others. How can I support both?
Note: I don't want to go into WHEN/WHY it can't run in parallel at times; just assume sometimes it can and sometimes it can't. The main question is: how can I prevent an SSIS call IF it has to run in sequence?

If you want to control the flow sequentially, consider a design where you enqueue requests (for invoking SSIS) into a queue data structure. At any time, only the request at the head of the queue is processed; as soon as that request completes, the next one can be dequeued. A sketch of this dispatcher pattern follows below.
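A minimal sketch of that dispatcher in Go (the language is only for illustration), assuming a hypothetical runSSIS function that stands in for however you actually launch the package (dtexec, a stored procedure call, etc.). Requests flagged as sequential funnel through a single worker goroutine, so they can never overlap, while parallel-safe requests spawn concurrently:

package main

import (
	"fmt"
	"sync"
)

// Request describes one file to hand to SSIS. Sequential marks
// requests that must not overlap with any other sequential run.
type Request struct {
	File       string
	Sequential bool
}

// runSSIS is a hypothetical stand-in for however you invoke the
// package (dtexec, a stored procedure, an Agent job, ...).
func runSSIS(file string) {
	fmt.Println("processing", file)
}

func main() {
	seq := make(chan Request, 100) // buffered queue of sequential work
	var wg sync.WaitGroup

	// Exactly one worker drains the sequential queue, so those
	// requests can never overlap; FIFO order is preserved.
	wg.Add(1)
	go func() {
		defer wg.Done()
		for r := range seq {
			runSSIS(r.File)
		}
	}()

	dispatch := func(r Request) {
		if r.Sequential {
			seq <- r // enqueue; the single worker serializes these
			return
		}
		wg.Add(1)
		go func() { // parallel-safe requests run concurrently
			defer wg.Done()
			runSSIS(r.File)
		}()
	}

	dispatch(Request{File: "a.csv", Sequential: true})
	dispatch(Request{File: "b.csv", Sequential: true})
	dispatch(Request{File: "c.csv"})

	close(seq)
	wg.Wait()
}

Because only one goroutine ever reads from the sequential channel, those runs are serialized without any explicit locking, and everything else stays parallel.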

Related

Execute Script after two separate ADFv2 Pipelines have completed

I have two ADFv2 pipelines that import data into two separate srl tables within an Azure SQL database. Once both pipelines have completed, I need to execute a script.
The source .csv files that initiate the execution of each individual pipeline will be created on a daily basis, but I can only execute the script when both pipelines have completed...
Each separate pipeline is triggered via a Logic App by the creation of a separate .csv file.
I can use Logic Apps as well, but at the moment I can't find the best process to implement this.
Any help greatly appreciated.
Two situations:
1. If you don't mind the pipelines executing linearly, you could use the Execute Pipeline Activity: run the function only after the first two Execute Pipeline Activities have completed successfully.
2. If not, my idea is to use a queue trigger. After each pipeline executes, send a message to Azure Queue Storage, for example via a Web Activity (REST API). Configure a queue trigger on a function and have it check whether it has received two success messages before doing the follow-up jobs.
Of course, you could also use the ADF monitoring SDKs or REST API to poll the execution status and results of the two pipelines and then run the next jobs (see the sketch after this answer). Pick whichever solution suits you.
Besides, you could build on the Logic App idea you mentioned: it supports "run after" conditions for the 2 connectors, so once both are successful, you can do the next job.
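If you go the monitoring route, the check itself is a short polling loop. Here is a rough Go sketch against the ADF REST API's pipeline-run endpoint, assuming you have already acquired an AAD bearer token and captured both run IDs; the helper names are illustrative, not part of any SDK:

package adfpoll

import (
	"encoding/json"
	"fmt"
	"net/http"
	"time"
)

const urlFmt = "https://management.azure.com/subscriptions/%s/resourceGroups/%s" +
	"/providers/Microsoft.DataFactory/factories/%s/pipelineruns/%s?api-version=2018-06-01"

// runStatus fetches one pipeline run's status, e.g. "InProgress",
// "Succeeded", "Failed" or "Cancelled".
func runStatus(token, sub, rg, factory, runID string) (string, error) {
	req, err := http.NewRequest("GET", fmt.Sprintf(urlFmt, sub, rg, factory, runID), nil)
	if err != nil {
		return "", err
	}
	req.Header.Set("Authorization", "Bearer "+token)
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	var run struct {
		Status string `json:"status"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&run); err != nil {
		return "", err
	}
	return run.Status, nil
}

// terminal reports whether a run has finished, one way or another.
func terminal(s string) bool {
	return s == "Succeeded" || s == "Failed" || s == "Cancelled"
}

// WaitForBoth polls both runs until they finish, then reports whether
// both succeeded, at which point the caller can run the script.
func WaitForBoth(token, sub, rg, factory, runA, runB string) (bool, error) {
	for {
		a, err := runStatus(token, sub, rg, factory, runA)
		if err != nil {
			return false, err
		}
		b, err := runStatus(token, sub, rg, factory, runB)
		if err != nil {
			return false, err
		}
		if terminal(a) && terminal(b) {
			return a == "Succeeded" && b == "Succeeded", nil
		}
		time.Sleep(30 * time.Second) // poll interval
	}
}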

VS2017 SSIS Parallel Processing error (?)

I am trying to run parallel processes to read Excel files into an OLEDB Destination. However, at runtime SSIS doesn't show errors; it simply stops and states:
"Package Execution completed. Click here to switch to design mode, or select Stop Debugging from the Debug Menu".
No rows have been inserted by the parallel processes and I can't find the root cause of this 'completion' in the messages list. I've provided a screenshot as an example.
The MaxConcurrentExecutables is set to 5, the Run64Bit property is set to True (False didn't change anything), and the EngineThreads property is set to 1.
Could anyone help on this problem?
SSIS cannot read the same file simultaneously; yes, you are running into a locking issue.
The solution is to use one data connection and one data flow. In the data flow, read from the file, then add a Multicast, which will let you duplicate the data stream as many times as you want. From there, merge the tasks that were occurring in the two data flows into this one.
The net effect is that you will have one data flow: one data source, one Multicast, two data pipelines where you can do some transformations, and two data destinations.
I'm not 100% sure if this is true, but I think I know the reason why it fails.
The reason why it suddenly 'stops' executing could be that once SSIS reads from an Excel file to import data, it locks the file. The second Data Flow Task then cannot open or access the file, since it's already opened by the first Data Flow Task.
If someone could confirm this, it would be greatly appreciated!

Performing the synchronization with ExecuteOfflineCommand more effectively

I'm wondering whether there is a way to recognize that the OfflineCommand is being executed - an internal flag or something to indicate this command has been passed, or to mark that it executed successfully. With an unstable internet connection I have trouble recognizing whether a command went through. I keep retrieving the records from the database and comparing them each and every time to see whether it has been passed or not, but due to the flow of my application, I'm finding it very difficult to avoid duplicates. Is there any automatic process to make sure commands execute, or something else I can use?
2nd question: I can use a UITimer to check isOffline() on the forms to make sure the internet is connected. Is there something equivalent on the server page, where the queries are written, to see whether the internet has disconnected? When control has moved to the queries and the internet disconnects, the dialog opened from the form page freezes indefinitely and never ends; I have to close and re-open the app to continue the synchronization process. At the same time, I cannot set a timeout for the dialog because I'm not sure how long the synchronization process will take to complete. Please advise.
This extends the same topic, but I have created a new issue just to give more clarity to my questions.
executeOfflineCommand skips a command while executing from storage on Android
There is no way to know whether a connection will stay stable, as that requires knowledge of the future. You can work the way transaction services do, where the server side processes an offline command as a transaction using a 2-phase-commit approach.
In this approach you have an algorithm similar to this:
Client sends command to server
Server returns a special unique ID for the command
Client asks the server to perform the command with that unique ID
Server acknowledges that the command was performed
If the first 2 stages didn't complete, you just do them again; the worst that can happen is some orphan commands on the server.
If the 3rd stage didn't complete, you just do it again. The server knows whether it already processed the command and will simply acknowledge it if it did; a sketch of the client side of this handshake follows below.
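To make that concrete, here is a small sketch of the client side, in Go purely for illustration; the Transport interface and its method names are hypothetical, not a Codename One API:

package twophase

import "fmt"

// Transport abstracts the two server calls. The server is assumed to
// treat Commit as idempotent for a given command ID, which is what
// makes retrying safe. Both method names are hypothetical.
type Transport interface {
	// Register sends the command body and returns a unique ID
	// minted by the server (stages 1 and 2 above).
	Register(payload []byte) (id string, err error)
	// Commit asks the server to execute the command with that ID
	// (stages 3 and 4). If the server already ran it, it simply
	// acknowledges again instead of executing twice.
	Commit(id string) error
}

// Run drives the handshake, retrying each phase until it sticks.
// The worst case is orphan IDs on the server (registered but never
// committed), which the server can garbage-collect later.
func Run(t Transport, payload []byte, retries int) error {
	var id string
	var err error
	for i := 0; i < retries; i++ {
		if id, err = t.Register(payload); err == nil {
			break // we now hold a server-issued command ID
		}
	}
	if err != nil {
		return fmt.Errorf("could not register command: %w", err)
	}
	for i := 0; i < retries; i++ {
		if err = t.Commit(id); err == nil {
			return nil // acknowledged; executed exactly once
		}
	}
	return fmt.Errorf("could not commit command %s: %w", id, err)
}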

What's the right way to do long-running processes on app engine?

I'm working with Go on App Engine and I'm trying to build an API that needs to perform a long-running background task - in this case it needs to parse and chunk a big file out to task queues. I'd like it to return a 200 and close the user connection immediately and let the process keep running in the background until it's complete (this could take 5-10 minutes). Task queues alone don't really work for my use case because parsing the initial file can take more than the time limit for an API request.
At first I tried a Go routine as a solution for this problem. This failed because my app engine context expired as soon as the parent function closed the user connection. (I suppose I could try writing a go routine that doesn't require a context, but then I'd lose logging and I'd need to fetch the entire remote file and pass it to the go routine.)
Looking through the docs, it looks like App Engine used to have functionality to support exactly what I want to do, runtime.RunInBackground, but that functionality is now deprecated and the replacement isn't obvious.
Is there a "right" or recommended way to do background processing now?
I suppose I could put a link to my big file into a task queue, but if I understand correctly, even functions called through task queues have to complete execution within a specified amount of time (is it 90 seconds?). I need to be able to run longer than that.
Thanks for any help.
Try using:
appengine.BackgroundContext()
It should be long-lived, but it will only work on GAE Flex.
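For completeness, a sketch of how that might look in a handler, assuming the flexible environment; parseAndChunk is a hypothetical stand-in for the real file-splitting work:

package app

import (
	"context"
	"net/http"

	"google.golang.org/appengine"
	"google.golang.org/appengine/log"
)

// startParse answers 200 immediately and keeps working after the
// user connection closes. This only works on the flexible
// environment, where instances aren't frozen between requests.
func startParse(w http.ResponseWriter, r *http.Request) {
	fileURL := r.FormValue("file") // location of the big file

	go func() {
		// Long-lived context, not tied to the request; Flex only.
		ctx := appengine.BackgroundContext()
		log.Infof(ctx, "parsing %s in the background", fileURL)
		parseAndChunk(ctx, fileURL)
	}()

	w.WriteHeader(http.StatusOK) // user connection closes here
}

// parseAndChunk is where you'd fetch the file, split it, and fan the
// chunks out to task queues.
func parseAndChunk(ctx context.Context, url string) {
	// ...
}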

Execute one SSIS Package from Another, but using a DIFFERENT proxy user. Is it possible?

I have one SSIS Package that must run as Proxy A and another that must run as Proxy B. I would love to have the first package run, and, as one of its tasks, execute the second package. Is this possible?
Thanks a lot!
You could have the first package use sp_start_job to kick off a job that is set up to run the second package. If this is "fire-and-forget", that's all you need to do. If you need to wait until it's completed, things get messier: you'd have to loop around calling (and parsing the output of) sp_help_jobactivity and use WAITFOR DELAY until the run completes; a sketch follows below.
This is also more complex if you need to determine the actual outcome of running the second package.
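A sketch of that wait loop in T-SQL, assuming the second package is wrapped in an Agent job named 'Run Package B' (a placeholder) and that the caller can read msdb; it queries msdb.dbo.sysjobactivity directly instead of parsing sp_help_jobactivity's result set, since that is the same underlying data:

-- Start the job that runs the second package (job name is a placeholder).
EXEC msdb.dbo.sp_start_job @job_name = N'Run Package B';

DECLARE @running INT = 1;
WHILE @running = 1
BEGIN
    WAITFOR DELAY '00:00:10';   -- poll every 10 seconds

    -- A row with a start time but no stop time means the job is
    -- still executing in the current Agent session.
    SELECT @running = COUNT(*)
    FROM msdb.dbo.sysjobactivity AS a
    JOIN msdb.dbo.sysjobs AS j ON j.job_id = a.job_id
    WHERE j.name = N'Run Package B'
      AND a.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions)
      AND a.start_execution_date IS NOT NULL
      AND a.stop_execution_date IS NULL;
END;

-- For the actual outcome, check run_status in msdb.dbo.sysjobhistory
-- (1 = succeeded, 0 = failed).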
