Gatling compilation fails for a simulation with the nothingFor function

I am trying to run a simulation that uses the nothingFor function under Gatling 3.4.1. However, compilation fails with the error: could not find implicit value for evidence parameter of type io.gatling.core.controller.inject.InjectionProfileFactory[Product with Serializable]
Simulation
package abs

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

class DefaultSimulation extends Simulation {

  ...

  val httpProtocol = http
    .baseUrl(Setting.baseUrl)
    .userAgentHeader("Gatling/3.4.1")

  setUp(
    getScenario.inject(
      nothingFor(20 seconds),
      rampConcurrentUsers(0) to (20) during (10 seconds),
      constantConcurrentUsers(20) during (600 seconds)
    ),
    setScenario.inject(
      rampConcurrentUsers(0) to (20) during (30 seconds),
      constantConcurrentUsers(20) during (600 seconds)
    )
  ).protocols(httpProtocol)
}
Compilation error
[ERROR] i.g.c.ZincCompiler$ - C:/Installation/gatling-3.4.1/user-files/simulations/DefaultSimulation.scala:50:30: could not find implicit value for evidence parameter of type io.gatling.core.controller.inject.InjectionProfileFactory[Product with Serializable]
getScenario.inject(
^
[ERROR] i.g.c.ZincCompiler$ - one error found
[ERROR] i.g.c.ZincCompiler$ - Compilation crashed
xsbt.InterfaceCompileFailed: null
at xsbt.CachedCompiler0.handleErrors(CompilerBridge.scala:183)
at xsbt.CachedCompiler0.run(CompilerBridge.scala:172)
at xsbt.CachedCompiler0.run(CompilerBridge.scala:134)
at xsbt.CompilerBridge.run(CompilerBridge.scala:39)
at sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:89)
at sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$7(MixedAnalyzingCompiler.scala:185)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:240)
at sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4(MixedAnalyzingCompiler.scala:175)
at sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4$adapted(MixedAnalyzingCompiler.scala:156)
at sbt.internal.inc.JarUtils$.withPreviousJar(JarUtils.scala:232)
at sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:156)
at sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:203)
at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:571)
at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:571)
at sbt.internal.inc.Incremental$.$anonfun$apply$5(Incremental.scala:174)
at sbt.internal.inc.Incremental$.$anonfun$apply$5$adapted(Incremental.scala:172)
at sbt.internal.inc.Incremental$$anon$2.run(Incremental.scala:459)
at sbt.internal.inc.IncrementalCommon$CycleState.next(IncrementalCommon.scala:115)
at sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:56)
at sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:52)
at sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:248)
at sbt.internal.inc.Incremental$.$anonfun$incrementalCompile$8(Incremental.scala:414)
at sbt.internal.inc.Incremental$.withClassfileManager(Incremental.scala:499)
at sbt.internal.inc.Incremental$.incrementalCompile(Incremental.scala:401)
at sbt.internal.inc.Incremental$.apply(Incremental.scala:166)
at sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:571)
at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:489)
at sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:332)
at sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:419)
at sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:137)
at io.gatling.compiler.ZincCompiler$.doCompile(ZincCompiler.scala:258)
at io.gatling.compiler.ZincCompiler$.delayedEndpoint$io$gatling$compiler$ZincCompiler$1(ZincCompiler.scala:265)
at io.gatling.compiler.ZincCompiler$delayedInit$body.apply(ZincCompiler.scala:40)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1$adapted(App.scala:80)
at scala.collection.immutable.List.foreach(List.scala:431)
at scala.App.main(App.scala:80)
at scala.App.main$(App.scala:78)
at io.gatling.compiler.ZincCompiler$.main(ZincCompiler.scala:40)
at io.gatling.compiler.ZincCompiler.main(ZincCompiler.scala)
I have tried to explicitly import the required functions, but compilation fails with the same error:
import io.gatling.core.Predef.{nothingFor, rampConcurrentUsers, constantConcurrentUsers, _}
This simulation compiles and works without nothingFor(20 seconds).

Gatling has two different families of injection profile steps:
open, where you control the users' arrival rate
closed, where you control the number of concurrent users
You can't mix them, because those are two completely different and incompatible behaviors.
nothingFor belongs to the open family, while rampConcurrentUsers and constantConcurrentUsers belong to the closed one.
Use constantConcurrentUsers(0) during (20 seconds) instead, as shown in the sketch below.
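For example, staying entirely within the closed model, the setUp block from the question could look roughly like this (a minimal sketch that reuses getScenario, setScenario and httpProtocol from the original simulation):
setUp(
  getScenario.inject(
    // closed-model equivalent of nothingFor(20 seconds): hold 0 concurrent users for 20 seconds
    constantConcurrentUsers(0) during (20 seconds),
    rampConcurrentUsers(0) to (20) during (10 seconds),
    constantConcurrentUsers(20) during (600 seconds)
  ),
  setScenario.inject(
    rampConcurrentUsers(0) to (20) during (30 seconds),
    constantConcurrentUsers(20) during (600 seconds)
  )
).protocols(httpProtocol)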

Related

Error in loading YOLOv4 TFlite model in Flutter

I have loaded my custom YOLOv4 TFlite model in Flutter based on this repo by TexMexMax. I already followed all the instructions in the README.md but still got these errors:
/C:/flutter/.pub-cache/hosted/pub.dartlang.org/tflite_flutter-0.5.0/lib/src/delegates/gpu_delegate.dart:58:10: Error: The getter 'addressOf' isn't defined for the class 'TfLiteGpuDelegateOptionsV2'.
- 'TfLiteGpuDelegateOptionsV2' is from 'package:tflite_flutter/src/bindings/types.dart' ('/C:/flutter/.pub-cache/hosted/pub.dartlang.org/tflite_flutter-0.5.0/lib/src/bindings/types.dart').
Try correcting the name to the name of an existing getter, or defining a getter or field named 'addressOf'.
.addressOf);
^^^^^^^^^
/C:/flutter/.pub-cache/hosted/pub.dartlang.org/win32-1.7.4+1/lib/src/structs.dart:2992:7: Error: Struct 'STATSTG' is empty. Empty structs and unions are undefined behavior.
class STATSTG extends Struct {}
/C:/flutter/.pub-cache/hosted/pub.dartlang.org/tflite_flutter-0.5.0/lib/src/bindings/types.dart:6:7: Error: Struct 'TfLiteInterpreter' is empty. Empty structs and unions are undefined behavior.
class TfLiteInterpreter extends Struct {}
^
/C:/flutter/.pub-cache/hosted/pub.dartlang.org/ffi-0.1.3/lib/src/allocation.dart:47:33: Error: Expected type 'T' to be a valid and instantiated subtype of 'NativeType'.
final int totalSize = count * sizeOf<T>();
There is a list of errors similar to the above, but with different functions. The errors end with the following:
FAILURE: Build failed with an exception.
* Where:
Script 'C:\flutter\packages\flutter_tools\gradle\flutter.gradle' line: 1005
* What went wrong:
Execution failed for task ':app:compileFlutterBuildDebug'.
> Process 'command 'C:\flutter\bin\flutter.bat'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 10s
Exception: Gradle task assembleDebug failed with exit code 1
I also tried some of the solutions here, but still no luck. Apparently, according to this author, the TFLite plugins in Flutter don't support YOLOv4 yet, hence I followed the repo by TexMexMax, since they had modified the plugin so that Flutter could support a YOLOv4 model, but then I encountered the errors above.
I hope someone can enlighten me on why this is happening and what I should do to fix the problem. Thank you in advance!

Pyflink windowAll() by event-time to apply a clustering model

I'm a beginner with the PyFlink framework and I would like to know if my use case is possible with it.
I need to create a tumbling window and apply a Python UDF (a scikit-learn clustering model) on it.
The use case is: every 30 seconds I want to apply my UDF to the previous 30 seconds of data.
For the moment I have succeeded in consuming data from Kafka in streaming mode, but I am not able to create a 30-second window on a non-keyed stream with the Python API.
Do you know of an example for my use case? Do you know if the PyFlink API allows this?
Here is my first attempt:
from pyflink.common import Row
from pyflink.common.serialization import JsonRowDeserializationSchema, JsonRowSerializationSchema
from pyflink.common.typeinfo import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.connectors import FlinkKafkaConsumer, FlinkKafkaProducer
from pyflink.common.watermark_strategy import TimestampAssigner, WatermarkStrategy
from pyflink.common import Duration
import time

from utils.selector import Selector
from utils.timestampAssigner import KafkaRowTimestampAssigner

# 1. create a StreamExecutionEnvironment
env = StreamExecutionEnvironment.get_execution_environment()

# the sql connector for kafka is used here as it's a fat jar and could avoid dependency issues
env.add_jars("file:///flink-sql-connector-kafka_2.11-1.14.0.jar")

deserialization_schema = JsonRowDeserializationSchema.builder() \
    .type_info(type_info=Types.ROW_NAMED(
        ["labelId", "freq", "timestamp"],
        [Types.STRING(), Types.DOUBLE(), Types.STRING()])).build()

kafka_consumer = FlinkKafkaConsumer(
    topics='events',
    deserialization_schema=deserialization_schema,
    properties={'bootstrap.servers': 'localhost:9092'})

# watermark_strategy = WatermarkStrategy.for_bounded_out_of_orderness(Duration.of_seconds(5))\
#     .with_timestamp_assigner(KafkaRowTimestampAssigner())

ds = env.add_source(kafka_consumer)
ds.print()
ds = ds.windowAll()
# ds.print()
env.execute()
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.flink.api.java.ClosureCleaner (file:/home/dorian/dataScience/pyflink/pyflink_env/lib/python3.6/site-packages/pyflink/lib/flink-dist_2.11-1.14.0.jar) to field java.util.Properties.serialVersionUID
WARNING: Please consider reporting this to the maintainers of org.apache.flink.api.java.ClosureCleaner
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Traceback (most recent call last):
File "/home/dorian/dataScience/pyflink/project/__main__.py", line 35, in <module>
ds = ds.windowAll()
AttributeError: 'DataStream' object has no attribute 'windowAll'
Thanks.

Codename One: Android build failed with strange exception

Since today (28.03.) the build of my app (CN1 build server) throws a build exception which I don't understand. The build yesterday worked without an error. The error from the error log:
Dex: The number of method references in a .dex file cannot exceed 64K.
Learn how to resolve this issue at https://developer.android.com/tools/building/multidex.html
UNEXPECTED TOP-LEVEL EXCEPTION:
com.android.dex.DexIndexOverflowException: method ID not in [0, 0xffff]: 65536
com.android.dex.DexIndexOverflowException: method ID not in [0, 0xffff]: 65536
at com.android.dx.merge.DexMerger$8.updateIndex(DexMerger.java:565)
at com.android.dx.merge.DexMerger$IdMerger.mergeSorted(DexMerger.java:276)
at com.android.dx.merge.DexMerger.mergeMethodIds(DexMerger.java:574)
at com.android.dx.merge.DexMerger.mergeDexes(DexMerger.java:166)
at com.android.dx.merge.DexMerger.merge(DexMerger.java:198)
at com.android.builder.dexing.DexArchiveMergerCallable.call(DexArchiveMergerCallable.java:61)
at com.android.builder.dexing.DexArchiveMergerCallable.call(DexArchiveMergerCallable.java:36)
at java.util.concurrent.ForkJoinTask$AdaptedCallable.exec(ForkJoinTask.java:1424)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1689)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
:transformDexArchiveWithDexMergerForDebug FAILED
:transformDexArchiveWithDexMergerForDebug (Thread[Daemon worker,5,main]) completed. Took 0.334 secs.
FAILURE: Build failed with an exception.
Can anybody help me to understand what went wrong?
The error is:
Dex: The number of method references in a .dex file cannot exceed 64K.
In this case, add the build hint:
android.multidex=true
As written in the developer guide (link):
android.multidex -> Boolean true/false, defaults to false. Multidex allows Android binaries to reference more than 65536 methods. This slows builds a bit so we have it off by default, but if you get a build error mentioning this limit you should turn this on.

In Gatling, I got an error with "Exception in thread "main" java.lang.ExceptionInInitializerError"

When I run Engine.scala under the scala directory, it gives an error:
Exception in thread "main" java.lang.ExceptionInInitializerError
at Engine$.delayedEndpoint$Engine$1(Engine.scala:7)
at Engine$delayedInit$body.apply(Engine.scala:4)
at scala.Function0.apply$mcV$sp(Function0.scala:34)
at scala.Function0.apply$mcV$sp$(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App.$anonfun$main$1$adapted(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:389)
at scala.App.main(App.scala:76)
at scala.App.main$(App.scala:74)
at Engine$.main(Engine.scala:4)
at Engine.main(Engine.scala)
Caused by: java.lang.NullPointerException
at io.gatling.commons.util.PathHelper$.url2path(PathHelper.scala:34)
at IDEPathHelper$.<init>(IDEPathHelper.scala:7)
at IDEPathHelper$.<clinit>(IDEPathHelper.scala)
... 11 more
And BaiduHomePageSimulation.scala is:
package test.scala

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class BaiduSimulation extends Simulation {

  // Set the root path of the request
  val httpConf = http.baseUrl("https://www.baidu.com")

  /*
   * run for 10 seconds; for during, the default unit is seconds;
   * if you want to use milliseconds, e.g. during(100 milliseconds)
   */
  val scn = scenario("BaiduSimulation").during(10) {
    exec(http("baidu_home").get("/"))
  }

  // set the thread count
  setUp(scn.inject(atOnceUsers(10)).protocols(httpConf))
}
I don't know how to fix it. Has anyone encountered a similar issue?
I was getting this error and was able to resolve it by marking the scala directory as Test Sources Root:
Right-click the scala folder > Mark directory as > Test Sources Root

How to configure the Search plugin in CakePHP 2.0

I was trying to incorporate the Search plugin into my project and I thought I followed the instructions from the download site, but whenever I access Posts it gives me these three lines of errors:
Warning (2): call_user_func_array() expects parameter 1 to be a valid callback, class 'PrgComponent' does not have a method 'startup' [CORE\Cake\Utility\ObjectCollection.php, line 110]
Warning (2): call_user_func_array() expects parameter 1 to be a valid callback, class 'PrgComponent' does not have a method 'beforeRender' [CORE\Cake\Utility\ObjectCollection.php, line 110]
Warning (2): call_user_func_array() expects parameter 1 to be a valid callback, class 'PrgComponent' does not have a method 'shutdown' [CORE\Cake\Utility\ObjectCollection.php, line 110]
Can someone help me figure this out please?
Thanks!
In CakePHP 2.0, a component class extends Component, not Object.
So, write the following code:
class PrgComponent extends Component {
}
Download site? Which download site? There is no "download site" for the 2.0 version of this plugin yet.
The 2.0 branch is here https://github.com/CakeDC/search/tree/2.0 and all tests for the PrgComponent pass.
9/9 test methods complete: 9 passes, 0 fails, 13 assertions and 0 exceptions.
Time: 0.29582810401917 seconds
Peak memory: 11,688,424 bytes
PrgComponent.php Code coverage: 99.06%
Check the 2.0 branch out from git.
Are you even talking about the CakeDC search plugin?
