How do I inject users according to a dataset? - gatling

I am new to Gatling. I have a CSV file that records the request rate (or number of users), and I want to send a specific number of requests to a website according to that CSV file, e.g. 500 requests per second. How can I do this?
Please help me with this problem. Thanks.
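For reference, here is a minimal sketch of one way to do this with Gatling's Java DSL. It assumes rates.csv holds one requests-per-second value per line (e.g. "500") and that each virtual user issues exactly one request, so users per second roughly equals requests per second; the base URL and file name are placeholders:

import static io.gatling.javaapi.core.CoreDsl.*;
import static io.gatling.javaapi.http.HttpDsl.*;

import io.gatling.javaapi.core.*;
import io.gatling.javaapi.http.*;

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

public class CsvRateSimulation extends Simulation {

    // Placeholder target; replace with the site under test.
    HttpProtocolBuilder httpProtocol = http.baseUrl("https://example.com");

    // Each virtual user performs a single request.
    ScenarioBuilder scn = scenario("csv-driven load")
        .exec(http("request").get("/"));

    // Turn each CSV row into an open-model injection step:
    // hold that many users/sec for one second, then move to the next row.
    static List<OpenInjectionStep> stepsFromCsv(String path) {
        try {
            return Files.readAllLines(Paths.get(path)).stream()
                .map(String::trim)
                .filter(line -> !line.isEmpty())
                .map(rate -> (OpenInjectionStep) constantUsersPerSec(Double.parseDouble(rate)).during(1))
                .collect(Collectors.toList());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    {
        setUp(scn.injectOpen(stepsFromCsv("rates.csv").toArray(new OpenInjectionStep[0])))
            .protocols(httpProtocol);
    }
}

Each CSV row becomes a constantUsersPerSec step held for one second; for coarser data you could hold each rate longer, or use Gatling's throttle instead to cap the request rate.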

Related

Turning a CSV file into an API

I want to create a project. I have a huge amount of data in a single .csv file, millions of rows, and I want to access it remotely using an API call.
My CSV file looks similar to this:
Name,Age
Red,20
Blue,25
It's a single CSV file and it has millions of rows.
I want to turn this CSV file into an API so that I can access it from anywhere, anytime, using an HTTPS request.
Like, www.myapi.example/api/csvfile/red
When I submit this call, I should get a response like Age:20, something like that.
Please give me an idea of how to do that. Thanks.
I expect to get a response like this after an HTTPS request:
{"Blue":"25"}

Processing a JSON file to post in batches

I'm working on a WordPress project that takes entries from a third-party API and adds them as individual posts on the site, with the data saved as metadata. Up until now it was going well, but with the number of entries on the API increasing, I'm starting to run into issues with the server timing out while making the API request.
What I have done now is write the API response, that is, all the entries in JSON format, to a file, and then create the posts on the site from that file. But I'm still running into timeout issues.
That brings me to my question.
I want to break the data (from the JSON file) up into smaller, manageable requests to the server, processing only, let's say, 20 entries at a time out of the 1000+. But I need to find out how to select the entry number in the JSON file: for example, after processing the first 20 entries, I want the function to go back to the JSON file but this time start at the 21st entry, if at all possible. Or is there a better method of programmatically creating posts from a JSON file with a large number of entries?
// Just some more info.
I'm using wp_remote_post() with blocking set to false to run the function that creates the posts in batches in the background.
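The core trick is simply to remember an offset into the decoded array and process one slice per run. The asker's code is PHP/WordPress, but the slicing logic is language-neutral; here is a sketch in Java with Jackson, where entries.json, BATCH_SIZE, and createPost are stand-ins:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.io.IOException;

public class BatchImporter {
    static final int BATCH_SIZE = 20;

    // Process entries [offset, offset + BATCH_SIZE) of a top-level JSON array
    // and return the next offset, or -1 once every entry has been handled.
    static int processBatch(File json, int offset) throws IOException {
        JsonNode entries = new ObjectMapper().readTree(json);
        int end = Math.min(offset + BATCH_SIZE, entries.size());
        for (int i = offset; i < end; i++) {
            createPost(entries.get(i)); // stand-in for the WordPress post insert
        }
        return end < entries.size() ? end : -1;
    }

    static void createPost(JsonNode entry) {
        System.out.println("imported " + entry); // placeholder
    }

    public static void main(String[] args) throws IOException {
        int offset = 0;
        while (offset != -1) {
            offset = processBatch(new File("entries.json"), offset);
        }
    }
}

In WordPress itself you would persist the offset between runs (e.g. with update_option()) and schedule the next batch with wp_schedule_single_event() instead of looping inside one request, so no single request has to outlive the timeout.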

Getting large images from MuleSoft into Salesforce

Currently I am making a synchronous call to MuleSoft which returns a raw image (no encoding is done) and then storing the image in a Document. Whenever we get images bigger than 6 MB, we hit the governor limit for maximum size. So I wanted to know: is there a way to get a reduced or compressed image?
I have no idea whether Mule has anything to preprocess or compress images...
In Apex you could try to make the operation asynchronous to benefit from the 22 MB limit. But there will be no UI element for it anymore; your component/user would have to periodically check whether the file got saved, or something like that.
You could always change the direction: make Mule push to Salesforce over the standard API instead of Apex code pulling from Mule. From what I remember, the standard Files API is good for up to 2 GB.
Maybe send a notification to Mule that you want file XYZ attached to account 123; Mule would insert a ContentVersion and ContentDocumentLink? And have Apex check periodically.
And when the file is no longer needed, a nightly job to delete files created by "Mr. Mule" more than a week ago?
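To make the "Mule pushes to Salesforce" direction concrete, here is a rough sketch of uploading a file as a ContentVersion through the standard REST API, written in Java; the instance URL, access token, and file name are placeholders, and the ContentDocumentLink step is only noted in a comment:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class PushFileToSalesforce {
    public static void main(String[] args) throws Exception {
        String instance = "https://yourInstance.my.salesforce.com"; // placeholder org URL
        String token = "ACCESS_TOKEN"; // obtained via OAuth beforehand

        // Base64-encode the image and create a ContentVersion record.
        String data = Base64.getEncoder()
            .encodeToString(Files.readAllBytes(Paths.get("image.jpg")));
        String body = "{\"Title\":\"image.jpg\",\"PathOnClient\":\"image.jpg\","
            + "\"VersionData\":\"" + data + "\"}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(instance + "/services/data/v58.0/sobjects/ContentVersion"))
            .header("Authorization", "Bearer " + token)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // contains the new ContentVersion id

        // Next step (not shown): look up the ContentVersion's ContentDocumentId and
        // insert a ContentDocumentLink pointing at the target record (account 123),
        // which is what makes the file appear on that record.
    }
}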

Retrieving the first few bytes of a huge file via $http.get

How can we prevent $http.get from downloading the entire contents of a huge text file that is 500 MB in size? Let's say I only want to retrieve the first 50 KB.
Front end: download specific bytes from the file on the server.
Try the following code to download the first 50 KB of your file:
$http.get('www.example.com/someapi', {
    headers: { 'Range': 'bytes=0-49999' }
});
Read more about the Range request header.
Back end: divide the file on the server into parts.
If you want to divide your huge file into parts and download them from Angular, you can make a route for each part, like the following:
http://rootPath.com/downloadFile?part=id
Then you can send a GET request to download the wanted part (50 KB, say); the part id tells the server which piece of the huge file the front end wants.
If the server supports Range requests, you can send the header Range: bytes=0-49999 or however many bytes you want.
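On the back end, honoring the Range header directly is usually simpler than pre-splitting the file into parts. Here is a rough sketch of a server that returns 206 Partial Content, in Java using the JDK's built-in HTTP server (the file name and port are placeholders):

import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.OutputStream;
import java.io.RandomAccessFile;
import java.net.InetSocketAddress;

public class RangeServer {
    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/huge.txt", exchange -> {
            String range = exchange.getRequestHeaders().getFirst("Range"); // e.g. "bytes=0-49999"
            try (RandomAccessFile file = new RandomAccessFile("huge.txt", "r")) {
                long start = 0;
                long end = file.length() - 1;
                if (range != null && range.startsWith("bytes=")) {
                    String[] parts = range.substring("bytes=".length()).split("-");
                    start = Long.parseLong(parts[0]);
                    if (parts.length > 1 && !parts[1].isEmpty()) {
                        end = Math.min(Long.parseLong(parts[1]), end);
                    }
                }
                // Read only the requested slice (sketch: assumes it fits in an int).
                byte[] chunk = new byte[(int) (end - start + 1)];
                file.seek(start);
                file.readFully(chunk);
                exchange.getResponseHeaders().set("Content-Range",
                    "bytes " + start + "-" + end + "/" + file.length());
                // 206 Partial Content tells the client only the range was returned.
                exchange.sendResponseHeaders(range == null ? 200 : 206, chunk.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(chunk);
                }
            }
        });
        server.start();
    }
}

Note that most static file servers (nginx, Apache, S3, and so on) already support Range requests out of the box, so you may not need any custom code at all.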

App Engine, Excel files and the 30-second request limit

How can I upload, parse, and download Excel files in Google App Engine when the work requires more than 30 seconds? I use Java POI and backend tasks, but once the backend does the job I cannot notify the client, and I cannot download the Excel file that is created by the backend task... Any suggestions would be much appreciated.
The best approach here is not to fight HTTP and a web service architecture but rather to work with it.
Introduce a notion of a job id. When your client uploads a file, immediately return a token that represents that job. Extra credit: include an estimated duration of the job. For starters, let's say it's 2 minutes.
The client is then responsible for querying the server for the state of that job id using the token. The server either returns the answer, or it returns the token back with an updated ETA.
For starters, you could just always tell the client to check back in 2 minutes (or whatever constant makes most sense for your workload). As your server processing becomes smarter, you could give more accurate estimates, and decrease the busy-waiting the client does.
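As an illustration only, here is a minimal in-memory sketch of that token pattern in Java; on App Engine the job state would have to live in Datastore (or Cloud Storage) and the work run on a task queue, since polling requests may land on different instances:

import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Supplier;

public class JobRegistry {
    private final Map<String, String> results = new ConcurrentHashMap<>();
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    // Upload handler calls this: kick off the slow Excel parse in the
    // background and hand the client a token immediately.
    public String submit(Supplier<String> slowJob) {
        String token = UUID.randomUUID().toString();
        pool.submit(() -> results.put(token, slowJob.get()));
        return token;
    }

    // Polling handler calls this: null means "still running, check back later",
    // anything else is the finished result (e.g. a download URL).
    public String poll(String token) {
        return results.get(token);
    }
}

The upload endpoint returns submit(...)'s token along with the 2-minute ETA; the status endpoint calls poll(token) and either serves the result or tells the client to retry later.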
