I used the Taurus Gatling guide to create a simple performance test and uploaded the YAML and Scala files to BlazeMeter. When I start the test in BlazeMeter there is no test result, and bzt.log contains a ClassNotFoundException.
The validator for the YAML says it's fine, and I can't find anything wrong, so I'm lost...
My blazemeter.yaml:
execution:
- executor: gatling
  scenario: products
  iterations: 15
  concurrency: 3
  ramp-up: 2

scenarios:
  products:
    script: productSimulation.scala
    simulation: test.productSimulation
My productSimulation.scala is mostly copied from their documentation:
package test

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class productSimulation extends Simulation {

  // parse load profile from Taurus
  val t_iterations = Integer.getInteger("iterations", 100).toInt
  val t_concurrency = Integer.getInteger("concurrency", 10).toInt
  val t_rampUp = Integer.getInteger("ramp-up", 1).toInt
  val t_holdFor = Integer.getInteger("hold-for", 60).toInt
  val t_throughput = Integer.getInteger("throughput", 100).toInt

  val httpConf = http.baseURL("https://mydomain/")

  val header = Map(
    "Content-Type" -> """application/x-www-form-urlencoded""")

  val sessionHeaders = Map(
    "Authorization" -> "Bearer ${access_token}",
    "Content-Type" -> "application/json")

  // 'forever' means each thread will execute scenario until
  // duration limit is reached
  val loopScenario = scenario("products").forever() {
    // auth
    exec(http("POST OAuth Req")
      .post("https://oauth-provider")
      .formParam("client_secret", "...")
      .formParam("client_id", "...")
      .formParam("grant_type", "client_credentials")
      .headers(header)
      .check(status.is(200))
      .check(jsonPath("$.access_token").exists
        .saveAs("access_token")))
    // read products
    .exec(http("products")
      .get("/products")
      .queryParam("limit", 200)
      .headers(sessionHeaders))
  }

  val execution = loopScenario
    .inject(rampUsers(concurrency) over rampUp) // during for gatling 3.x
    .protocols(httpConf)

  setUp(execution).maxDuration(rampUp + holdFor)
}
After learning that I can execute the Scala file directly as a test (by clicking the file itself instead of the YAML), I got more helpful exceptions.
Basically, I made two mistakes:
My variables are named t_concurrency, t_rampUp, etc., while the injection definition uses different names (concurrency, rampUp, holdFor). Oops.
Since Gatling 3.x the keyword for the injection duration is during, so the correct code is: rampUsers(t_concurrency) during t_rampUp
Now everything works.
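For reference, here is a minimal sketch of the corrected injection setup with those two fixes applied (the t_-prefixed variable names and the Gatling 3.x during keyword); the rest of the simulation stays as above:

// use the t_-prefixed values parsed from Taurus
val execution = loopScenario
  .inject(rampUsers(t_concurrency) during t_rampUp) // Gatling 3.x: 'during' instead of 'over'
  .protocols(httpConf)

setUp(execution).maxDuration(t_rampUp + t_holdFor)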
I'm using OkHttpClient in a Kotlin app to post a file to an API, where it gets processed. While the process is running, the API sends back messages to keep the connection alive until the result is complete. So I'm receiving the following (this is what is printed to the console using println()):
{"status":"IN_PROGRESS","transcript":null,"error":null}
{"status":"IN_PROGRESS","transcript":null,"error":null}
{"status":"IN_PROGRESS","transcript":null,"error":null}
{"status":"DONE","transcript":"Hello, world.","error":null}
I believe these are separated by a newline character, not a comma.
I figured out how to extract the data by doing the following, but is there a more technically correct way to transform this? It works, but it seems error-prone to me.
data class Status(val status: String?, val transcript: String?, val error: String?)
val myClient = OkHttpClient().newBuilder().build()
val myBody = MultipartBody.Builder().build() // plus some stuff
val myRequest = Request.Builder().url("localhost:8090").method("POST", myBody).build()
val myResponse = myClient.newCall(myRequest).execute()
val myString = myResponse.body?.string()

val myJsonString = "[${myString!!.replace("}", "},")}]".replace(",]", "]")
// Forces the response from "{key:value}{key:value}"
// into a readable json format "[{key:value},{key:value},{key:value}]"
// but hoping there is a more technically sound way of doing this
val myTranscriptions = gson.fromJson(myJsonString, Array<Status>::class.java)
An alternative to your solution would be to use a JsonReader in lenient mode. This allows parsing JSON which does not strictly comply with the specification, such as in your case multiple top level values. It also makes other aspects of parsing lenient, but maybe that is acceptable for your use case.
You could then use a single JsonReader wrapping the response stream, repeatedly call Gson.fromJson and collect the deserialized objects in a list yourself. For example:
val gson = GsonBuilder().setLenient().create()

val myTranscriptions = myResponse.body!!.use {
    val jsonReader = JsonReader(it.charStream())
    jsonReader.isLenient = true

    val transcriptions = mutableListOf<Status>()
    while (jsonReader.peek() != JsonToken.END_DOCUMENT) {
        transcriptions.add(gson.fromJson(jsonReader, Status::class.java))
    }
    transcriptions
}
Though, if the server continuously provides status updates until processing is done, it might make more sense to process each parsed status directly instead of collecting them all in a list before processing them.
I am using Gatling 3.6.1 and I am trying to repeat the request 10 times for the next 10 products from the feeder file. This is what I tried:
feed(products, 10)
  .repeat(10, "index") {
    exec(session => {
      val index = session("index").as[Int]
      val counter = index + 1
      session.set("counter", counter)
    })
    .exec(productIdsRequest())
  }
private def productIdsRequest() = {
  http("ProductId${counter}")
    .get(path + "products/${product_code${counter}}")
    .check(jsonPath("$..code").count.gt(2))
}
I am having trouble getting the counter value into my API URL.
I would like to have something like
products/${product_code1},
products/${product_code2} etc.
But instead, I get the error 'nested attribute definition is not allowed'.
So basically I would like every request to be called with one product from the feeder (from the batch of 10 products).
Can you please help?
Thanks!
Disclaimer: I don't know how your products feeder is defined.
If I understand you correctly, you just need to move .repeat to the top level, so that feed is called once per loop iteration and pulls a new record (and thus a new product_code) into the session each time:
.repeat(10, "counter") {
  feed(products)
    .exec(http("ProductId ${counter}")
      .get("products/${product_code}")
      .check(jsonPath("$..code").count.gt(2)))
}
I'm trying to attach a Reactive Streams Subscriber to an Akka source.
My source seems to work fine with a simple sink (like a foreach), but if I put in a real sink made from a Subscriber, I don't get anything.
My context is:
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}
import org.reactivestreams.{Subscriber, Subscription}
implicit val system = ActorSystem.create("test")
implicit val materializer = ActorMaterializer.create(system)
class PrintSubscriber extends Subscriber[String] {
  override def onError(t: Throwable): Unit = {}
  override def onSubscribe(s: Subscription): Unit = {}
  override def onComplete(): Unit = {}
  override def onNext(t: String): Unit = {
    println(t)
  }
}
and my test case is:
val subscriber = new PrintSubscriber()
val sink = Sink.fromSubscriber(subscriber)
val source2 = Source.fromIterator(() => Iterator("aaa", "bbb", "ccc"))
val source1 = Source.fromIterator(() => Iterator("xxx", "yyy", "zzz"))
source1.to(sink).run()(materializer)
source2.runForeach(println)
I get output:
aaa
bbb
ccc
Why don't I get xxx, yyy, and zzz?
Citing the Reactive Streams spec for Subscriber:
Will receive call to onSubscribe(Subscription) once after passing an instance of Subscriber to Publisher.subscribe(Subscriber). No further notifications will be received until Subscription.request(long) is called.
The smallest change you can make to see some items flowing through to your subscriber is:
override def onSubscribe(s: Subscription): Unit = {
  s.request(3)
}
However, keep in mind this won't make it fully compliant with the Reactive Streams spec. The fact that it is not so easy to implement correctly is the main reason behind higher-level toolkits like Akka Streams itself.
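For illustration, here is a rough sketch of a Subscriber that signals demand one element at a time as it processes items; it is still not a fully spec-compliant implementation (no null checks, no cancellation handling), just closer to the intended usage:

class PrintSubscriber extends Subscriber[String] {
  private var subscription: Subscription = _

  override def onSubscribe(s: Subscription): Unit = {
    subscription = s
    s.request(1) // signal initial demand, otherwise onNext is never called
  }

  override def onNext(t: String): Unit = {
    println(t)
    subscription.request(1) // request the next element once this one is handled
  }

  override def onError(t: Throwable): Unit = t.printStackTrace()

  override def onComplete(): Unit = ()
}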
I have created a build.gradle file, and in it I have this dependency:
compile 'io.gatling.highcharts:gatling-charts-highcharts:2.1.7'
I have also created a simulation:
package simulations
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._
class LukeSimulation extends Simulation {

  val httpConf = http
    .baseURL("http://--------:8295/") // Here is the root for all relative URLs
    .doNotTrackHeader("1")
    .acceptLanguageHeader("en-US,en;q=0.5")
    .acceptEncodingHeader("gzip, deflate")
    .userAgentHeader("Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:16.0) Gecko/20100101 Firefox/16.0")

  val headers_10 = Map("Content-Type" -> "application/x-www-form-urlencoded") // Note the headers specific to a given request

  val scn = scenario("TotalUsage") // A scenario is a chain of requests and pauses
    .exec(http("Usage")
      .get("/api/v1/account/10186413349/totalusage"))

  setUp(scn.inject(atOnceUsers(700),
    rampUsers(100000) over (30 minutes),
    constantUsersPerSec(200) during (10 minutes),
    rampUsersPerSec(200) to (1000) during (10 minutes)
  ).protocols(httpConf))

  //setUp(scn.inject(rampUsers(500) over(30 seconds)).protocols(httpConf))
  //assertThat(global.failedRequests.count.is(0))
}
How do I execute it with Gradle?
Create a task like this:
task gatling() << {
    javaexec {
        main = 'io.gatling.app.Gatling'
        classpath = sourceSets.test.runtimeClasspath
        args('--simulation',
             "simulations.LukeSimulation",
             '--results-folder',
             file('build/reports/gatling').absolutePath,
             '--mute')
        environment(['appUrl': appUrl])
        systemProperties(['gatling.core.directory.binaries': sourceSets.test.output.classesDir])
    }
}
If you want to pass a parameter to your Gatling run, you can do it this way:
def getAppUrl() {
    def appUrl = System.getProperty('appUrl')
    appUrl ?: 'http://localhost:8295'
}
The command will be: gradle gatling -DappUrl=http://localhost:8000
If you do not pass appUrl, it will default to what is written in the getAppUrl method.
To use the property in your Gatling code (the task exposes it as an environment variable), do this:
val appUrl = System.getenv("appUrl")
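As a minimal sketch of how that could be wired into the simulation (assuming the environment variable is set by the Gradle task above; the fallback URL is just an example default):

// read the URL exported by the Gradle task's environment() call,
// falling back to a local default when it is not set
val appUrl = Option(System.getenv("appUrl")).getOrElse("http://localhost:8295")

val httpConf = http
  .baseURL(appUrl)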
In my long but simple (and awesome) Gatling simulation, I have a few responses that ended with error 500. Is it possible to tell Gatling to collect these error response messages in a file during the simulation?
Not in production mode. You only have them when debug logging is enabled.
It is possible to collect whatever you want and save it into the simulation.log file. Use the extraInfoExtractor method when you define the protocol:
val httpProtocol = http
  .baseURL(url)
  .check(status.is(successStatus))
  .extraInfoExtractor { extraInfo => List(getExtraInfo(extraInfo)) }
Then define whatever criteria you want in your getExtraInfo(extraInfo: ExtraInfo) method. The example below outputs the request and response when debug is enabled via a Java system property, OR the response code is not 200, OR the status of the request is KO (it can be KO if you have set up some max response time and that time gets exceeded):
private val successStatus: Int = 200
// read -Ddebug safely; defaults to false when the property is not set
private val isDebug = sys.props.get("debug").exists(_.toBoolean)

private def getExtraInfo(extraInfo: ExtraInfo): String = {
  if (isDebug
    || extraInfo.response.statusCode.get != successStatus
    || extraInfo.status.eq(Status.valueOf("KO"))) {
    ",URL:" + extraInfo.request.getUrl +
      " Request: " + extraInfo.request.getStringData +
      " Response: " + extraInfo.response.body.string
  } else {
    ""
  }
}