How to detect an unfinished awaiting process before Window closure?

Having spent the last several days converting a working database frontend application to async/await, I fear I have worked myself into a corner. Being new to all this, I have a very simple question.
What happens to an "awaiting" process when the computer is suspended? Additionally, can a process that is "awaiting" be detected in a generic fashion so that the user can be warned that "the process is not finished" prior to Window closure? (The application has many Create and Update processes to manage the database backend).
TIA

Instead of

await DoAsync();

you could keep a reference to the task:

task = DoAsync();
await task;

and when closing your application, check elsewhere whether it has finished:

if (!task.IsCompleted)
    // warn the user that the process is not finished

Note that testing task.Status == TaskStatus.Running is unreliable here: a task returned by an async method typically reports WaitingForActivation while it awaits IO, not Running, so !task.IsCompleted is the robust check. This approach makes sense for long-running operations. Have a look at the Task class documentation, though.
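For several concurrent operations you can keep a small list of pending tasks and check it in the window's Closing handler. A minimal sketch of that idea in WPF follows; the SaveAsync operation and the _pendingTasks list are illustrative names, not from the original question:

using System.Collections.Generic;
using System.ComponentModel;
using System.Linq;
using System.Threading.Tasks;
using System.Windows;

public partial class MainWindow : Window
{
    // Every database operation that has started but not yet completed.
    private readonly List<Task> _pendingTasks = new List<Task>();

    private async void SaveButton_Click(object sender, RoutedEventArgs e)
    {
        Task task = SaveAsync();             // hypothetical long-running create/update
        _pendingTasks.Add(task);
        try { await task; }
        finally { _pendingTasks.Remove(task); }
    }

    protected override void OnClosing(CancelEventArgs e)
    {
        if (_pendingTasks.Any(t => !t.IsCompleted))
        {
            var answer = MessageBox.Show(
                "A database operation is still running. Close anyway?",
                "Unfinished work", MessageBoxButton.YesNo);
            if (answer == MessageBoxResult.No)
                e.Cancel = true;             // keep the window open
        }
        base.OnClosing(e);
    }

    private Task SaveAsync()
    {
        return Task.Delay(2000);             // stand-in for a real database call
    }
}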

Related

Salesforce Apex CPU Limit

We are currently running into the CPU limit. We have a lot of processes that are most likely not optimized; I have already combined some processes for the same object, but it is not enough. I am trying to understand the debug logs right now: there is one process that is being called multiple times (I assume once for each created record). Even if I create, for example, 60 records in one operation/DML statement, the Process Builder still gets called 60 times? (This is what I think is happening.) Is that the problem we are having right now? If so, is there a better way to do it? Right now we need the updates from the Process Builder to run, but I expected it would get bulkified or something like that. I was also thinking there might be some looping between processes. If there is more information you need, please let me know. Thank you.
Well, yes, the Process Builder will be invoked 60 times, 1 record at a time. But that shouldn't be your problem. The final update / create child records / email send (or whatever your action is) will be bulkified; it won't save 1 record at a time. If the process calls some Apex actions, they're supposed to support passing a collection of records, not just a single record.
You may be looking at the wrong place. CPU time suggests code problems, not config (flow, workflow, process builder... although if you're doing updates of fields on "this" record, it's possible you'd benefit from before-save flows). Try to compare timestamps related to METHOD_BEGIN and METHOD_END for triggers and code methods (including invocable action / process plugin interfaces).
Maybe there's code that doesn't need to run because key fields didn't change and there's nothing to recalculate or roll up. Hard to say without seeing the debug log.
Maybe the operation doesn't have to be immediate. Think about whether you can offload some work to "scheduled actions", "time-based workflows" or, in Apex terms, @future, batchable, or queueable. But they'd have to be relatively safe to run: if there's an error it won't display to the user because the action runs in the background, so you'd need to handle errors manually (send an email, create a record, make a Chatter post or bell notification).
You could try uploading the log to https://apextimeline.herokuapp.com/ and trying to make sense of the Gantt-chart-like output. Or capture the log the "pro" way, with https://help.salesforce.com/s/articleView?id=sf.code_dev_console_solving_problems_using_system_log.htm&type=5 or https://marketplace.visualstudio.com/items?itemName=financialforce.lana (you'll likely need a developer's help to make sense of it).

Extended windows

I have an always-on application listening to a Kafka stream and processing events. Events are part of a session, and I need to do calculations based on a session's data. I am running into a problem trying to run my calculations correctly due to the length of my sessions: 90% of my sessions are done after 5 minutes and 99% are done after 1 hour, but sessions may last more than a day, and because this is a real-time system there is no determined end. Sessions are unique and should never collide.
I am looking for a way to process a window multiple times, either with an initial wait period and processing of any later events after that, or a pure process-per-event structure. I will need to keep all previous events around (ListState), as well as previously processed values (ValueState).
I previously thought allowedLateness would let me do this, but it seems the lateness only governs when an event should still be processed; it does not extend the actual window. GlobalWindows may also work, but I am unsure whether there is a way to process a window multiple times. I believe I could use an evictor with GlobalWindows to purge the windows after a period of inactivity (although admittedly I have not researched this yet, because I was unsure how to trigger a GlobalWindow multiple times).
Any suggestions on how to achieve what I am looking to do would be greatly appreciated; I would also be happy to clarify any points as needed.
If SessionWindows won't do the job, then you can use GlobalWindows with a custom Trigger and Evictor. The Trigger interface has onElement and timer-based callbacks that can fire whenever and as often as you like. If you go down this route, then yes, you'll also need to implement an Evictor to dispose of elements when they are no longer needed.
The documentation and the source code are helpful when trying to understand how this all fits together.

'End Task' in Task Manager always sets CloseReason.UserClosing

I want to log when a customer tries to force-close the application. I'm aware that I have no chance of catching a process kill, but it should be possible to get informed about the CloseReason.TaskManagerClosing reason through the main form's closing event.
However, in all tests I did under Windows 8.1, I always got a CloseReason.UserClosing reason. And in this case (compared to a normal CloseReason.UserClosing) I have only about 0.2 s to run user code before my program is killed!
Is this a new behavior in Windows 8.1?
Yes, I see this. And yes, this is a Windows change, previous versions of Task Manager sent the window the WM_CLOSE notification directly. I now see it issue the exact same command that's issued when you close the window with the Close button (WM_SYSCOMMAND, SC_CLOSE). Or press Alt+F4 or use the system menu. So Winforms can no longer tell the difference between the Task Manager and the user closing the window and you do get CloseReason.UserClosing.
What happens next is expected, if you don't respond to the close command quickly enough then Task Manager summarily assassinates your program with TerminateProcess().
Do keep in mind that trying to save data when the user aborts your program through Task Manager is a bad practice. Your user will normally use this if your program is malfunctioning; you can't really trust the data anymore and you risk writing garbage. This is compounded by your saving code being aborted, so there are high odds of a partially written file or database data that isn't usable anymore.
There is no simple workaround for this; the odds that Windows will be patched to restore the old behavior are very close to zero. It is very important that you save your data in a transactional way so you don't destroy valuable data if the saving code is aborted. Use File.Replace() for file data and a database transaction for database writes.
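As an illustration of the file side, here is a sketch of the transactional save pattern with File.Replace() (the paths and method name are mine): write to a temporary file first, then swap it in atomically, so an aborted save never corrupts the real file.

using System.IO;

static void SaveTransactionally(string path, string contents)
{
    string temp = path + ".tmp";
    File.WriteAllText(temp, contents);            // may be aborted halfway; the target file stays intact
    if (File.Exists(path))
        File.Replace(temp, path, path + ".bak");  // atomic swap, previous version kept as a backup
    else
        File.Move(temp, path);
}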
An imperfect way to detect this condition is by using the Form.Deactivate and Activated events. If you saw the Deactivate event and then the FormClosing event fires, there are reasonable odds that another program is terminating yours.
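A sketch of that heuristic inside a WinForms form (the field name is mine; the rest follows the standard Form overrides):

private bool deactivated;

protected override void OnDeactivate(EventArgs e)
{
    base.OnDeactivate(e);
    deactivated = true;
}

protected override void OnActivated(EventArgs e)
{
    base.OnActivated(e);
    deactivated = false;
}

private void MainForm_FormClosing(object sender, FormClosingEventArgs e)
{
    if (deactivated)
    {
        // Reasonable odds another process (e.g. Task Manager) is closing us.
        // Log it, but don't attempt elaborate saving; the process may be
        // terminated before this code finishes.
    }
}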
But the normal way to deal with this is the common one: if the user ends the program without saving data, you display a dialog that asks whether to save. Task Manager ensures that this doesn't go further than that.
Another solution for determining when the Task Manager is closing the program is by checking if the main form, or any of its controls, has focus. When you close via the Task Manager, the application is not focused, whereas if you close via the close button or Alt+F4, the application does have focus. I used a simple if check:
private void MyForm_Closing(object sender, FormClosingEventArgs e)
{
    if (this.ContainsFocus)
    {
        // Put user close code here
    }
}

Should I use async await on UI Thread or Task.Run to process multiple files?

I am writing an application in WPF using .NET 4.5. The application allows the user to select multiple files and import them. When this happens the application will parse text files and store the data into a database. The files can be very large so I want the UI to remain responsive and allow the user to do other things while this is happening.
I'm new to asynchronous programming and would like to know what the best approach would be. According to a Microsoft article...
"The async and await keywords don't cause additional threads to be created. Async methods don't require multithreading because an async method doesn't run on its own thread. The method runs on the current synchronization context and uses time on the thread only when the method is active. You can use Task.Run to move CPU-bound work to a background thread, but a background thread doesn't help with a process that's just waiting for results to become available."
Since I am processing multiple files, would I be better off using Task.Run for each file so that they run on separate threads? If not, what is the advantage of running everything on the same thread using async/await?
Note that no other UI operation (aside from progress reporting in progress bars) really cares about when these files are done processing, so according to this article it seems like using Task.Run would benefit me. What are your thoughts?
What are you doing with the files? If you're doing anything CPU intensive, it would be better to move that processing off the UI thread. If it's really just going to be the IO, then you can do it all in the UI thread - and still make it parallel.
For example:
private async Task ProcessAllFiles(IEnumerable<string> files)
{
    List<Task> tasks = files.Select(x => ProcessFile(x))
                            .ToList();
    await Task.WhenAll(tasks);
    textBox.Text = "Finished!";
}

private async Task ProcessFile(string file)
{
    // Do stuff here with async IO, and update the UI if you want to...
}
Here you're still doing all the actual processing on the UI thread - no separate threads are running, beyond potentially IO completion ports - but you're still starting multiple asynchronous IO operations at the same time.
Now one other thing to consider is that this might actually slow everything down anyway. If you're processing multiple files on the same physical disk, you might find that accessing them sequentially would be faster just due to the nature of disk IO. It will depend on the disk type though (SSDs would act differently to "regular" disks, for example).
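If the parsing does turn out to be CPU-intensive, a variant of the above could push just the parse onto the thread pool per file. ParseRecords and ImportAsync below are hypothetical stand-ins for the asker's actual work:

private async Task ProcessFileAsync(string file)
{
    using (var reader = new StreamReader(file))
    {
        string text = await reader.ReadToEndAsync();              // async IO: the UI thread stays free
        var records = await Task.Run(() => ParseRecords(text));   // CPU-bound parse on a pool thread
        await ImportAsync(records);                               // hypothetical async database import
    }
    progressBar.Value++;   // after each await we're back on the UI thread
}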

Synchronizing Asynchronous request handlers in Silverlight environment

For our senior design project my group is making a Silverlight application that utilizes graph theory concepts and stores the data in a database on the back end. We have a situation where we add a link between two nodes in the graph and upon doing so we run analysis to re-categorize our clusters of nodes. The problem is that this re-categorization is quite complex and involves multiple queries and updates to the database so if multiple instances of it run at once it quickly garbles data and breaks (by trying to re-insert already used primary keys). Essentially it's not thread safe, and we're trying to make it safe, and that's where we're failing and need help :).
The create link function looks like this:
private Semaphore dblock = new Semaphore(1, 1);

// This function is on our service reference and gets called
// by the client code.
public int addNeed(int nodeOne, int nodeTwo)
{
    dblock.WaitOne();
    submitNewNeed(createNewNeed(nodeOne, nodeTwo));
    verifyClusters(nodeOne, nodeTwo);
    dblock.Release();
    return 0;
}

private void verifyClusters(int nodeOne, int nodeTwo)
{
    // Run analysis of nodeOne and nodeTwo in graph
}
All copies of addNeed should wait for the first one that comes in to finish before another can execute, but instead they all seem to run at once and conflict with each other in the verifyClusters method. One solution would be to force our front-end calls to be made synchronously, and in fact, when we do that everything works fine, so the code logic isn't broken. But once launched, our application will be deployed in a business setting and used by internal IT staff (or at least that's the plan), so we'll have the same problem. We can't force all clients to submit data at different times, so we really need to get it synchronized on the back end. Thanks for any help you can give; I'd be glad to supply any additional information that you could need!
I wrote a series that specifically addresses this situation (sequential asynchronous workflows) - let me know if it works for you.
Part 2 (which has a link back to Part 1):
http://csharperimage.jeremylikness.com/2010/03/sequential-asynchronous-workflows-part.html
Jeremy
Wrap your database updates in a transaction. Escalate to a table lock if necessary.
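A minimal sketch of that approach with TransactionScope, assuming ADO.NET against SQL Server (the connection string is illustrative, and passing the open connection into the existing helpers is my assumption):

using System.Data.SqlClient;
using System.Transactions;

public int addNeed(int nodeOne, int nodeTwo)
{
    // Serializable is the TransactionScope default, so conflicting requests
    // block or fail and can be retried, instead of interleaving half-finished
    // updates and re-inserting already used primary keys.
    using (var scope = new TransactionScope())
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();   // auto-enlists in the ambient transaction
        submitNewNeed(createNewNeed(nodeOne, nodeTwo), conn);
        verifyClusters(nodeOne, nodeTwo, conn);
        scope.Complete();   // commit; disposing without Complete() rolls back
    }
    return 0;
}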
