Suppose a user uploads something like photos, and 20 photos get uploaded at the same time. Each photo needs its own request to the backend to create it, which means I might launch 20 requests simultaneously; that's a lot, considering other types of requests can be in flight at the same time.
What I want to accomplish with redux-saga is for saga to put all these requests in a buffer and launch, say, 5 at a time. When some number out of those 5 have completed, say 4 (in case one takes too long), saga should launch the next ones. There is nothing in the documentation on how you would accomplish this with the tools they offer. `throttle`, offered by saga itself, is not what I'm looking for: it takes only the latest request of that type and simply ignores the others, whereas I need to save the others and fire them at a later time.
Is there no way to accomplish this without writing a custom module that does this management?
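For what it's worth, redux-saga's "Channels" recipe covers exactly this shape: buffer the actions with `actionChannel` and `fork` a fixed pool of worker sagas that `take` from the channel. The underlying bounded-concurrency idea can also be sketched in plain JavaScript (the names `runPool`/`tasks` below are illustrative, not part of any library):

```javascript
// Minimal bounded-concurrency pool (illustrative sketch, not redux-saga).
// Runs at most `limit` tasks at once; each time one finishes, the next
// queued task starts, so 20 uploads run 5 at a time until all are done.
function runPool(tasks, limit) {
  return new Promise((resolve, reject) => {
    const results = [];
    let next = 0;    // index of the next task to launch
    let active = 0;  // tasks currently in flight
    let done = 0;    // tasks finished

    function launch() {
      while (active < limit && next < tasks.length) {
        const i = next++;
        active++;
        tasks[i]()
          .then((r) => { results[i] = r; })
          .catch(reject)
          .finally(() => {
            active--;
            done++;
            if (done === tasks.length) resolve(results);
            else launch();  // a slot freed up: launch the next queued task
          });
      }
    }

    if (tasks.length === 0) resolve(results);
    else launch();
  });
}
```

In redux-saga itself the equivalent is an `actionChannel('UPLOAD_PHOTO')` plus five forked workers looping over `take(channel)`: the channel buffers the other 15 actions and each worker pulls the next one as soon as its current upload finishes.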
I'm making a request to an API that takes a very long time to execute (about 30 seconds to 4 minutes). Making the user wait is of course not a good idea, but I'm not sure which web technologies would let the server contact the browser (a subscription) automatically once the request has finished.
Any example code, and pointers to the right technologies, would be really appreciated. I'm using AWS APIs on the backend, and Next.js / Redux on the frontend.
Thanks
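A common pattern on AWS is to start the job asynchronously (e.g. kick off a Step Functions execution or drop a message on SQS), return a job id immediately, and have the browser poll a cheap status endpoint until the job finishes; for true push, API Gateway WebSockets or AppSync subscriptions avoid polling entirely. A minimal client-side polling helper, independent of any particular backend (the `checkStatus` function and its `{ done, value }` shape are assumptions for this sketch):

```javascript
// Polls `checkStatus` every `intervalMs` until it reports { done: true },
// giving up after `maxAttempts` tries. Resolves with the job's result.
async function pollUntilDone(checkStatus, intervalMs, maxAttempts) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await checkStatus();  // e.g. fetch(`/jobs/${id}`) and parse JSON
    if (status.done) return status.value;
    await new Promise((res) => setTimeout(res, intervalMs));
  }
  throw new Error('job did not finish in time');
}
```

On the client you would call this right after the "start job" request returns an id, and dispatch the result into Redux once it resolves.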
I'm sending data from my backend every 10 seconds and I want to display that data in React. Searching online suggests using socket.io to display real-time data. Is there a better way to do it?
If you're dead set on updating your data every 10 seconds, it makes more sense to poll from the client to the server, since HTTP requests can only be opened from client to server. With plain HTTP polling you won't need socket.io, although socket.io is an easy alternative if you need lower-latency updates.
Depending on how you generate the data on your backend, and specifically if it lives in a database, there is most likely a way to subscribe to changes in that database. That would update the data in real time, without a 10-second delay.
If you want a more detailed answer, you'll have to add more detail to your question: what data are you sending? Where does it come from, or how are you generating it?
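The plain-polling option above can be wrapped in a tiny helper (names here are illustrative; in a React component you would start it inside a `useEffect` and return the stop function as the cleanup):

```javascript
// Calls fetchData every intervalMs and hands each result to onData.
// Returns a stop() function that clears the interval; errors are logged
// and polling continues, so one failed request doesn't kill the loop.
function startPolling(fetchData, intervalMs, onData) {
  const timer = setInterval(async () => {
    try {
      onData(await fetchData());
    } catch (err) {
      console.error('poll failed:', err);
    }
  }, intervalMs);
  return () => clearInterval(timer);
}
```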
I'm working on an autodialer feature: when I trigger a button from the frontend (built in React), an agent gets a call, and then all the leads in that agent's assigned portal automatically receive back-to-back calls from the agent's number. Because this process is automatic, the agent doesn't know which lead is being called, so I want to establish a real-time connection so I can show a popup on the frontend with information about the lead who was just called.
I am currently building a React Native + Expo application in which essentially every page makes an API call, which adds up to a lot of API calls. The app is also connected to Firebase for other data. The thing is, the data on most of these pages doesn't change more than once or twice a day, so I really don't want end users hitting the API that often either.
My question is: is there a way to write and host a script that runs continuously, calls this API once every hour (or so), and writes the results to the Firebase DB, so that I only need to read from the database instead of each user individually making dozens of API calls?
Please let me know! I have spent days on Google and am no closer than before. I'm also willing to change my setup away from Firebase if it's not possible to accomplish this way. Thanks!
You can use a Cloud Functions scheduled trigger to run code periodically that can make changes to your database.
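Concretely, the scheduled function boils down to a "fetch once, write to the DB" step like the sketch below; in `firebase-functions` you would register it with a scheduled trigger such as `functions.pubsub.schedule('every 60 minutes')`. Here `fetchFromApi` and `writeToDb` are placeholders (assumptions) for your upstream API call and your Firestore/Realtime Database write:

```javascript
// One scheduled run: pull fresh data from the upstream API and cache it
// in the database, so app users only ever read the cached copy.
// fetchFromApi and writeToDb are injected placeholders, not real APIs.
async function refreshCache(fetchFromApi, writeToDb) {
  const data = await fetchFromApi();
  await writeToDb({
    data,
    updatedAt: Date.now(),  // lets clients see how stale the cache is
  });
  return data;
}
```

The app then reads only from Firebase, and the hourly function is the single place that ever touches the third-party API.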
I have the following problem: I have an API written in Symfony 3 and a frontend for it written in AngularJS, and the response times are very long. When I ask the API for a list of 500 events, I wait almost 30 s for them to come back; for a list of 300 orders, I wait 17 s.
The Symfony backend is probably where the problem lies, because even Postman asking for the list of events takes 30 s. Likewise, asking the API whether a ticket code exists or has already been used takes about 15 s to answer, which is a tragic time for such a simple request.
I am seriously wondering what could make it take this long. Since Postman shows the same times, the frontend can be ruled out, as it isn't involved at all in that case.
I have no idea what the problem may be.
Every minute or so my app creates some data and needs to send it out to more than 1,000 remote servers via URL Fetch callbacks. The callback URL for each server is stored on a separate entity. The lag between creating the data and sending it to the remote servers should be roughly under 5 seconds.
My initial thought is to use the Pipeline API to fan out URL Fetch requests to different task queues.
Unfortunately, task queues are not guaranteed to execute in a timely fashion: the gap between enqueuing a task and it actually running can be anywhere from minutes to hours. In my experience this gap is regularly over a minute, so task queues are not necessarily appropriate.
Is there any way from within App Engine to achieve what I want? Maybe you know of an outside service that can do the fan out in a timely fashion?
Well, there's probably no good solution within GAE here.
You could keep a backend instance running, hammering the datastore/memcache every second for new data to send out, and then spawning dozens of async URL fetches. But that's really inefficient...
If you want a third-party service, pubnub.com can do fan-out, though I don't know whether it would fit into your setup.
How about using the async URL Fetch API? You could then make a large number of simultaneous URL calls, all from a single location.
If performance is particularly sensitive, you could run them from a backend on a B8 instance.
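Whichever instance runs it, the fan-out itself amounts to firing the fetches concurrently and collecting failures for retry. A sketch (with `fetchUrl` as an injected stand-in for the actual URL Fetch call, and a concurrency cap so 1,000+ requests don't all fly at once):

```javascript
// Sends `payload` to every callback URL, at most `limit` in flight at a
// time. Returns the list of URLs that failed, so they can be retried.
async function fanOut(urls, payload, fetchUrl, limit) {
  const failed = [];
  for (let i = 0; i < urls.length; i += limit) {
    const batch = urls.slice(i, i + limit);
    // allSettled waits for the whole batch, so one bad callback URL
    // can't abort delivery to the rest.
    const outcomes = await Promise.allSettled(
      batch.map((url) => fetchUrl(url, payload))
    );
    outcomes.forEach((o, j) => {
      if (o.status === 'rejected') failed.push(batch[j]);
    });
  }
  return failed;
}
```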