I have a scenario in my project where, after I upload a file, I need to wait for it to be processed in the backend. Based on that, success or failure is displayed.
Currently I am using Thread.sleep() as there is no element I could wait for.
Can you please suggest a way to handle this?
Thank You,
One option requires a flag in the DB stating that the file was processed successfully.
If you have such a flag, connect to the DB using JDBC and poll that record until the flag changes or the record appears (a JDBC polling sketch follows below).
Another way is an API response; it may require the developers to add a key to the API response, which you can then validate using a Java library like REST Assured.
Third, if the file is visible in the UI once processing completes, loop until the element with the filename appears in the UI. Make sure the loop breaks after a set time period, otherwise it could run indefinitely.
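A minimal sketch of the first option (with the bounded wait the third option warns about), assuming a hypothetical file_jobs table with a status column that the backend sets to PROCESSED; adapt the names to your schema:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class FileProcessingWait {

        // Hypothetical table/column names -- change them to match your schema.
        private static final String SQL =
                "SELECT status FROM file_jobs WHERE file_name = ?";

        /** Polls the DB until the file is marked PROCESSED, or the timeout expires. */
        public static boolean waitForProcessing(String jdbcUrl, String user, String pass,
                                                String fileName, long timeoutMillis) throws Exception {
            long deadline = System.currentTimeMillis() + timeoutMillis;
            try (Connection conn = DriverManager.getConnection(jdbcUrl, user, pass);
                 PreparedStatement ps = conn.prepareStatement(SQL)) {
                ps.setString(1, fileName);
                while (System.currentTimeMillis() < deadline) {
                    try (ResultSet rs = ps.executeQuery()) {
                        if (rs.next() && "PROCESSED".equalsIgnoreCase(rs.getString("status"))) {
                            return true;              // backend finished the file
                        }
                    }
                    Thread.sleep(2000);               // short, bounded pause between polls
                }
            }
            return false;                             // timed out -- treat as failure
        }
    }

The same loop-with-deadline shape works for the UI check in the third option; just replace the query with your element lookup.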
I'm wondering whether there is a way to recognize that an OfflineCommand has been executed, or some internal flag or marker that represents whether the command has gone through or been executed successfully. I have trouble recognizing whether a command has been passed or not over an unstable internet connection. I keep retrieving the records from the database and comparing them each and every time to see whether a command has gone through, but due to the flow of my application I'm finding it very difficult to avoid duplicates. Is there any automatic process to make sure commands are executed without duplicates, or some other approach?
Second question: on the forms I can use a UITimer to check isOffline() to determine whether the internet is connected. Is there something equivalent on the server page, or wherever the queries are written, to detect that the internet has been disconnected? When control moves to the queries and the internet drops, the dialog opened from the form page stays frozen indefinitely and never closes; I have to close and re-open the app to continue the synchronization process. At the same time, I cannot set a timeout for the dialog because I'm not sure how long the synchronization process will take to complete. Please advise.
Extending on the same topic, but I have created a new question just to give more clarity on my questions:
executeOfflineCommand skips a command while executing from storage on Android
There is no way to know whether a connection will stay stable, as that would require knowledge of the future. You can work the way transaction services do, where the server side processes an offline command as a transaction using a two-phase-commit approach.
In this approach you have an algorithm similar to this:
Client sends command to server
Server returns a special unique ID for the command
Client asks the server to perform the command identified by that unique ID
Server acknowledges that the command was performed
If the first two stages didn't complete, you just do them again. The worst thing that could happen is some orphaned commands on the server.
If the third step didn't complete, you just do it again. The server knows whether it already processed the command and will simply acknowledge it if it did.
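A rough client-side sketch of that retry loop in plain Java (the requestCommandId and executeCommand methods are hypothetical placeholders for your own networking calls; the server is assumed to treat executing the same ID twice as a no-op):

    public final class TwoPhaseCommandClient {

        public boolean sendReliably(String commandPayload, int maxAttempts) {
            String commandId = null;
            for (int attempt = 0; attempt < maxAttempts; attempt++) {
                try {
                    // Stages 1-2: register the command and obtain its unique ID.
                    if (commandId == null) {
                        commandId = requestCommandId(commandPayload);
                    }
                    // Stages 3-4: ask the server to execute that ID. Safe to repeat --
                    // the server just re-acknowledges if it already ran the command.
                    if (executeCommand(commandId)) {
                        return true;
                    }
                } catch (Exception networkFailure) {
                    // Connection dropped -- loop and retry. At worst an orphaned
                    // command (registered but never executed) is left on the server.
                }
            }
            return false;
        }

        // Hypothetical transport methods -- wire these to your server API.
        private String requestCommandId(String payload) throws Exception {
            throw new UnsupportedOperationException("not implemented in this sketch");
        }

        private boolean executeCommand(String id) throws Exception {
            throw new UnsupportedOperationException("not implemented in this sketch");
        }
    }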
I'm working with Go on App Engine and I'm trying to build an API that needs to perform a long-running background task - in this case it needs to parse and chunk a big file out to task queues. I'd like it to return a 200 and close the user connection immediately and let the process keep running in the background until it's complete (this could take 5-10 minutes). Task queues alone don't really work for my use case because parsing the initial file can take more than the time limit for an API request.
At first I tried a goroutine as a solution for this problem. This failed because my App Engine context expired as soon as the parent function closed the user connection. (I suppose I could try writing a goroutine that doesn't require a context, but then I'd lose logging, and I'd need to fetch the entire remote file and pass it to the goroutine.)
Looking through the docs, it looks like App Engine used to have functionality to support exactly what I want to do: runtime.RunInBackground. But that functionality is now deprecated and the replacement isn't obvious.
Is there a "right" or recommended way to do background processing now?
I suppose I could put a link to my big file into a task queue, but if I understand correctly, even functions called through task queues have to complete execution within a specified amount of time (is it 90 seconds?). I need to be able to run longer than that.
Thanks for any help.
try using:
appengine.BackgroundContext()
it should be long-lived but will only work on GAE Flex
Pleeeeease help me... :- )
"You are my only hope".
I need to execute an action asynchronously several thousand times. The action is supposed to fetch email content from an external API, and it is located in a different controller, so I use requestAction to get it. When I have all the results, these 1000 email contents are sent as 1000 emails during one request, using another API.
Unfortunately, when run sequentially this takes a lot of time, so I need to run these requests asynchronously.
My question is:
Can I execute
$this->requestAction($myUrl)
...in parallel? For example, 100 requests at a time? I've seen a few asynchronous examples in PHP, but they all used static files, and I need to preserve the CakePHP structure to be able to use requestAction.
Thanks to all who can help!
EDIT: By the way, when I tried to run the requests via fopen($url, 'r'); and then stream_get_contents(), the performance was worse than bad, but maybe it could be improved. I don't know, but requestAction definitely seems to be the better option (I think).
Is there any reason why you wouldn't use a shell for this?
Please take a look at: CakePHP Shells
"Deferred Execution" is probably what you are after here, basically you want to send one command not have the user waiting around for it? If so then you can use a Message Queue to handle this pretty easily.
We use CakeResque and Redis to send 1000's of emails and perform other API calls
CakeResque - Deferred Processing for CakePHP
There are other message queues available, but this one is really simple to get working and probably won't need many changes to your code.
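The answer above is CakePHP-specific, but the underlying pattern is just "push a job onto a queue during the web request and let a separate worker send the emails". As a rough illustration of that pattern, here is a small Java sketch using a Redis list through the Jedis client (the queue name, payload, and sendEmail call are made up for the example):

    import java.util.List;
    import redis.clients.jedis.Jedis;

    public class EmailQueueSketch {

        private static final String QUEUE_KEY = "email_jobs";   // made-up queue name

        /** Called from the web request: enqueue the job and return immediately. */
        public static void enqueue(String recipient) {
            try (Jedis jedis = new Jedis("localhost", 6379)) {
                jedis.lpush(QUEUE_KEY, recipient);
            }
        }

        /** Runs in a separate worker process: pop jobs one by one and send the emails. */
        public static void workerLoop() {
            try (Jedis jedis = new Jedis("localhost", 6379)) {
                while (true) {
                    // BRPOP blocks until a job arrives and returns [key, value].
                    List<String> job = jedis.brpop(0, QUEUE_KEY);
                    sendEmail(job.get(1));
                }
            }
        }

        private static void sendEmail(String recipient) {
            // Placeholder for the actual mail/API call.
            System.out.println("sending email to " + recipient);
        }
    }

CakeResque gives you the same enqueue/worker split without writing this plumbing yourself.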
In the end, I wasn't able to find a solution that used requestAction, but I was able to extract the code into a standalone PHP file, which is then processed using include.
And the most interesting part, the asynchronous requests, was done using a great library called cURL-easy, with the help of a utility function of mine. You can read about it (how to install it and how to use it) here:
Is making asynchronous HTTP requests possible with PHP?
I have a site which runs quite an intensive job from the admin panel.
I would somehow like to run it without interfering with the output it produces.
To make it clearer: on submitting a form I must unzip a big zip file, and for every image I find, I must insert it into my database. This process takes more than 3-4 minutes, so I would like to leave the job running and output that it is being processed.
I don't know if I can notify the user afterwards (maybe through an AJAX call), but I don't mind either way.
Is it possible?
thanks
It depends on what you want the user to do while you're processing the data.
Do you want them to wait? Then use an AJAX call and display a progress bar (or whatever you like) until the task is complete. When it finishes, inform the user (via AJAX again) and proceed with whatever functionality you've got in store.
If you want the file to be processed while the user is doing something else, you can call a (CakePHP) shell task in the background and let it do its magic. If you want the user to be able to view the status of an import, you can create an imports table that stores information about all imports. By using a "status" field, the user can see that, e.g., the task is still running.
When the shell task completes, you can change the "status" of the import and do whatever else is needed. This has to be done in the task itself, but that isn't a problem, because you can use models in shell tasks the same way you can use models in shells.
Hope this helps, comment if you need any further clarification.
I have a polling service that checks a directory for new files; if there is a new file, I call SSIS.
There are instances where I can't have SSIS run if another instance of SSIS is already processing another file.
How can I make SSIS run sequentially in these situations?
Note: running SSIS packages in parallel is fine in some circumstances but not in others; how can I support both?
Note: I don't want to go into WHEN/WHY it can't run in parallel at times; just assume that sometimes it can and sometimes it can't. The main question is how I can prevent an SSIS call IF it has to run in sequence.
If you want to control the flow sequentially, think of a design where you enqueue requests (for invoking SSIS) into a queue data structure. Only the request at the head of the queue is processed at any one time; as soon as that request completes, the next request can be dequeued.
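A minimal sketch of that design, written in Java for illustration and assuming the package is started with the dtexec command-line utility (the package path and variable name are placeholders; adapt the invocation to how you actually launch SSIS):

    import java.io.IOException;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class SequentialSsisRunner {

        // Files that must be processed one at a time wait in this queue.
        private final BlockingQueue<String> pendingFiles = new LinkedBlockingQueue<>();

        public SequentialSsisRunner() {
            // A single worker thread guarantees at most one sequential run at a time;
            // files that are allowed to run in parallel can bypass the queue entirely.
            Thread worker = new Thread(this::drainQueue, "ssis-sequential-worker");
            worker.setDaemon(true);
            worker.start();
        }

        /** Called by the polling service for files that must run in sequence. */
        public void enqueue(String filePath) {
            pendingFiles.add(filePath);
        }

        private void drainQueue() {
            while (true) {
                try {
                    String filePath = pendingFiles.take();   // blocks until work arrives
                    runPackage(filePath);                     // next file waits until this returns
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        }

        private void runPackage(String filePath) throws InterruptedException {
            try {
                // Placeholder invocation: package path and variable name are examples only.
                Process p = new ProcessBuilder(
                        "dtexec", "/F", "C:\\packages\\LoadFile.dtsx",
                        "/SET", "\\Package.Variables[User::FileName].Value;" + filePath)
                        .inheritIO()
                        .start();
                p.waitFor();                                  // wait for SSIS to finish
            } catch (IOException e) {
                e.printStackTrace();                          // keep the worker alive on failures
            }
        }
    }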