Image Extractor by AI Habitat produces a configuration error when importing Matterport dataset

I need help understanding the error message; it seems to say the file name is being changed to .json because the configuration fails. The full error message is long, but I've pasted the part that is repeated throughout it:
/Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.stage_config.json
I0412 19:04:17.735939 42397184 AttributesManagerBase.h:296] AttributesManager::createFromJsonOrDefaultInternal (Stage) : Proposing JSON name : /Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.stage_config.json from original name : /Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.ply | This file does not exist.
I0412 19:04:17.736085 42397184 AbstractObjectAttributesManagerBase.h:182] AbstractObjectAttributesManager::createObject (Stage) : Done making attributes with handle : /Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.ply
I0412 19:04:17.736093 42397184 AbstractObjectAttributesManagerBase.h:189] File (/Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.ply) exists but is not a recognized config filename extension, so new default Stage attributes created and registered.
I0412 19:04:17.736124 42397184 SceneDatasetAttributes.cpp:46]
What I did: I activated the Conda environment and ran the image extractor, after modifying the extractor so that its file path points to a .ply file in the Matterport dataset (roughly as in the sketch below).
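A minimal sketch of that change, assuming the ImageExtractor API from the habitat-sim image-extraction tutorial (the image size and output list are illustrative, not my exact values):

from habitat_sim.utils.data import ImageExtractor

# Path to the .ply mesh of the Matterport3D house (same path as in the log;
# note that the directory name "8194nk5LbLH 13" contains a space).
scene_filepath = (
    "/Users/kyra/Documents/GitHub/habitat-sim/matterport/scans/house1/"
    "8194nk5LbLH 13/poisson_meshes/8194nk5LbLH_10.ply"
)

extractor = ImageExtractor(
    scene_filepath,
    img_size=(512, 512),  # illustrative resolution
    output=["rgba"],      # observation types to extract
)

sample = extractor[0]  # observations rendered at the first camera pose
extractor.close()      # shut down the underlying simulator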
Setup:
1) Facebook's AI Habitat-sim built from source,
2) MacBook Air M1,
3) Conda environment with the dependencies (using pip install -r requirements.txt), but habitat-sim is not installed by Conda,
4) Matterport3D dataset (downloaded one house).
Thank you.

Related

Robot Framework: parameterizing using a YAML file

Hi, can anybody help me parameterize the string word so it is fetched from my YAML file? When I try to run it I get an error: "Using YAML variable files requires PyYAML module to be installed. Typically you can install it by running pip install pyyaml." But I have already installed PyYAML on my local machine. Your response is highly appreciated. Thank you so much.
Expected result: the ${String} parameter should get its value (Ralph) from my robot.yaml file.
(Screenshots of the .robot file, the robot.yaml file, the VS terminal and the CMD output were attached to the original question but are not reproduced here.)
In robot.yaml define PYTHONPATH like this:
PYTHONPATH:
- .
- string: "RALPH"
Make sure you have installed PyYAML, then include robot.yaml and the Collections library in the robot file:
Variables    path_to_file/robot.yaml
Library      Collections
After this you can extract the string value inside the test like this:
${value} =    Pop From Dictionary    ${PYTHONPATH[1]}    string
Log To Console    ${value}
This will print:
RALPH
The second item in the PYTHONPATH list is a dictionary, so you first need to access ${PYTHONPATH[1]} and then pop the needed key (in your case, string) in order to return its value.

While running a Gatling test, getting the following error: "java.lang.IllegalArgumentException: requirement failed: CSV headers can't be empty"

I am creating a Gatling script and trying to parameterize it. I have checked the path of the CSV; it is correct, and the CSV is present at the path mentioned in the configuration file. When running the Gatling test from my local machine I get the error below and the test fails.
java.lang.IllegalArgumentException: requirement failed: CSV headers can't be empty
at scala.Predef$.require(Predef.scala:337)
at io.gatling.core.feeder.SeparatedValuesParser$.$anonfun$stream$5(SeparatedValuesParser.scala:47)
at io.gatling.core.feeder.SeparatedValuesParser$.$anonfun$stream$5$adapted(SeparatedValuesParser.scala:46)
at scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
at io.gatling.core.feeder.SeparatedValuesParser$.$anonfun$stream$1(SeparatedValuesParser.scala:46)
at io.gatling.core.feeder.SeparatedValuesFeederSource.$anonfun$feeder$1(FeederSource.scala:118)
at scala.util.Using$.resource(Using.scala:261)
at io.gatling.core.feeder.SeparatedValuesFeederSource.applyBatch$1(FeederSource.scala:117)
at io.gatling.core.feeder.SeparatedValuesFeederSource.feeder(FeederSource.scala:125)
at io.gatling.core.feeder.SourceFeederBuilder.apply(FeederBuilder.scala:81)
at io.gatling.core.feeder.SourceFeederBuilder.apply(FeederBuilder.scala:50)
at io.gatling.core.action.builder.FeedBuilder.newFeedActor(FeedBuilder.scala:38)
at io.gatling.core.action.builder.FeedBuilder.$anonfun$build$1(FeedBuilder.scala:43)
at scala.collection.mutable.MapOps.getOrElseUpdate(Map.scala:146)
at scala.collection.mutable.MapOps.getOrElseUpdate$(Map.scala:143)
at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:265)
at io.gatling.core.action.builder.FeedBuilder.build(FeedBuilder.scala:43)
at io.gatling.core.structure.BuildAction.$anonfun$build$1(BuildAction.scala:28)
at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:169)
at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:165)
at scala.collection.immutable.List.foldLeft(List.scala:79)
at io.gatling.core.structure.BuildAction.build(BuildAction.scala:27)
at io.gatling.core.structure.BuildAction.build$(BuildAction.scala:26)
at io.gatling.core.structure.ScenarioBuilder.build(StructureBuilder.scala:51)
at io.gatling.core.structure.PopulationBuilder.build(PopulationBuilder.scala:100)
at io.gatling.core.scenario.SimulationParams.buildScenario(Simulation.scala:170)
at io.gatling.core.scenario.SimulationParams.$anonfun$scenarios$1(Simulation.scala:174)
at scala.collection.immutable.List.map(List.scala:246)
at io.gatling.core.scenario.SimulationParams.scenarios(Simulation.scala:174)
at io.gatling.app.Runner.run0(Runner.scala:79)
at io.gatling.app.Runner.run(Runner.scala:49)
at io.gatling.app.Gatling$.start(Gatling.scala:83)
at io.gatling.app.Gatling$.fromMap(Gatling.scala:40)
at Engine$.delayedEndpoint$Engine$1(Engine.scala:11)
at Engine$delayedInit$body.apply(Engine.scala:4)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1(App.scala:76)
at scala.App.$anonfun$main$1$adapted(App.scala:76)
at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:563)
at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:561)
at scala.collection.AbstractIterable.foreach(Iterable.scala:919)
at scala.App.main(App.scala:76)
at scala.App.main$(App.scala:74)
at Engine$.main(Engine.scala:4)
at Engine.main(Engine.scala)
I am not sure why the test is failing; any suggestion would be helpful.
Got the solution. In the CSV I was referring to, the header name was not defined for one of the columns. After removing the values that had no header name from the CSV file, the test ran successfully.
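To illustrate with made-up data (the file and column names here are hypothetical): the first CSV below triggers this error because its last column has values but an empty header; either remove that column, as I did, or give it a header name, as in the second CSV.

users_broken.csv:
username,password,
user1,pass1,GB
user2,pass2,FR

users_fixed.csv:
username,password,country
user1,pass1,GB
user2,pass2,FR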

CakePHP 3 - plugin translation file not working

Using CakePHP version 3.4.5:
1) I've written a plugin:
/plugins/Accounting/
2) Then, to create the .pot file from the view files, I run:
bin/cake i18n extract --plugin Accounting
3) This generates /plugins/Accounting/src/Locale/default.pot
But the translated text does not appear.
My locale is es_AR, and I've tried copying the file as:
/plugins/Accounting/src/Locale/accounting.pot
or
/plugins/Accounting/src/Locale/es_AR/default.pot
or
/plugins/Accounting/src/Locale/es_AR/accounting.pot
I also tried saving the files as accounting.po, but nothing happens.
Still, no text from the plugin views is translated (it does work for the app's views).
I've found it!
The problem was the file/directory permissions.
By default, cake i18n extract --plugin MyPlugin does the following:
1) creates the src/Locale/ directory inside the plugin structure,
2) creates the template translation file default.pot (instead of my_plugin.pot),
3) makes all of these with mode 750, owned by the Linux user currently logged in (not www-data).
So, in order to make it work:
1) change the permissions of the Locale structure to 755,
2) rename default.pot to my_plugin.po,
3) use __d('my_plugin', 'Text to be translated').
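Applied to the Accounting plugin from this question with locale es_AR, that works out to the following (assuming the translation domain is simply the underscored plugin name, here accounting):

/plugins/Accounting/src/Locale/es_AR/accounting.po

and, in the plugin views:

__d('accounting', 'Text to be translated')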

What are the EJB Remote Interfaces in the project when createEJBStubs.bat is run?

I understand that createEJBStubs is necessary to create the stubs whenever an EJB is accessed externally via a Java client (http://www-01.ibm.com/support/knowledgecenter/SSAW57_8.5.5/com.ibm.websphere.nd.doc/ae/rejb_3stubscmd2.html?lang=ko), and also that the stubs created will be for the EJB 3 interfaces available within the module. But how does it detect whether a given interface/bean is EJB 3 rather than 2.x? From the issue I have faced (below), it is not clear to me how this detection is done. Please help me understand this so that I can resolve the issue I am facing (below) -
DETAILED EXPLANATION OF MY ISSUE:
When I run createEJBStubs.bat C:\1\DEN\proj\target\proj.jar -updatefile C:\1\DEN\proj\target\proj.jar -verbose
I get the following error -
CNTR9241I: The C:\1\DEN\proj\target\proj-5.DEV-SNAPSHOT.jar Java archive (JAR) file has no level 3.0 enterprise beans with remote interfaces.
And when I run the same command with the -verbose option, I get the following output -
createWarProcessingRootDir: enter
Root directory for war processing: C:\Users\w723521\AppData\Local\Temp\_tempWar_1096410607889881622
War explosion root: C:\Users\w723521\AppData\Local\Temp\_tempWar_1096410607889881622\warExpls
War primary input root: C:\Users\w723521\AppData\Local\Temp\_tempWar_1096410607889881622\warExpls\primary
War primary classpath root: C:\Users\w723521\AppData\Local\Temp\_tempWar_1096410607889881622\warExpls\primary\classPth
War primary workspace root: C:\Users\w723521\AppData\Local\Temp\_tempWar_1096410607889881622\warExpls\primary\wrkSpace
War alternate workspace root: C:\Users\w723521\AppData\Local\Temp\_tempWar_1096410607889881622\warExpls\alternate\wrkSpace
War alternate output file: C:\Users\w723521\AppData\Local\Temp\_tempWar_1096410607889881622\warExpls\alternateOutputWar.war
War embedded in ear root: C:\Users\w723521\AppData\Local\Temp\_tempWar_1096410607889881622\warExpls\embInEar
createWarProcessingRootDir: exit
Dumping input parameters:
parameter 1 = C:\1\DEN\proj\target\proj-5.DEV-SNA
PSHOT.jar
parameter 2 = -updatefile
parameter 3 = C:\1\DEN\proj\target\proj-5.DEV-SNA
PSHOT.jar
parameter 4 = -trace
Processing the C:\1\DEN\proj\target\proj-5.DEV-SNAPSHOT.jar input file.
checkEJBVersion - jar name is C:\1\DEN\proj\target\proj-5.DEV-SNAPSHOT.jar
The output file name is C:\Users\w723521\AppData\Local\Temp\ejb3093395338317385883.jar
copyArchiveEntriesAndStubs(null,C:\1\DEN\proj\target\proj-5.DEV-SNAPSHOT.jar,C:\Users\w723521\AppData\Local\Temp\ejb3093395338317385883.jar,false)
Main output archive file (no pre-existing stubs) is C:\Users\w723521\AppData\Local\Temp\ejb3093395338317385883.jar
Preexisting stubs archive file is ejb5694574801473018226.jar
Writing non-stub entry MANIFEST.MF
Writing non-stub entry TestLocal.class
Writing non-stub entry TestRemote.class
Classloader updated for -cp null
Classloader updated for jar C:\1\DEN\proj\target\proj-5.DEV-SNAPSHOT.jar
getMetaData - entry
metadataComplete setting is false
findRemoteInterfaces
List of interfaces are: null
Number of pre-existing stubs = 0
CNTR9241I: The C:\1\DEN\proj\target\proj-5.DEV-SNAPSHOT.jar Java archive (JAR) file has no level 3.0 enterprise beans with remote interfaces.
Starting process of deleting workspace files...
Done with process of deleting workspace files...
Command Successful
I am not able to understand why it says that there are no EJB 3 beans with local or remote interfaces, even though the interfaces were created using EJB 3 annotations:
@Remote
public interface TestRemote {
    public void test();
}
How does the script detect whether a given interface is EJB 3 or not? (Based on the output I have obtained, it is not as clear as I expected it to be.)
proj-5.DEV-SNAPSHOT.jar would need to somehow specify both an EJB and its remote interface. For example, if ejb-jar.xml is used, a <session> with a <remote>pkg.TestRemote</remote> entry; or, as another example, some class in the JAR annotated with @Stateless and @Remote(TestRemote.class).

Preparing for multiple databases with Play Framework

I want to prepare my application to be compatible with several database types. To try this out I've used H2, MySQL and PostgreSQL, so I've added to build.sbt:
"mysql" % "mysql-connector-java" % "5.1.35",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41"
and I've added conf/prod.conf with all of the configuration except the database configuration, plus three files:
conf/h2.conf
include "prod.conf"
db.h2.driver=org.h2.Driver
db.h2.url="jdbc:h2:mem:dontforget"
db.h2.jndiName=DefaultDS
ebean.h2="fr.chklang.dontforget.business.*"
conf/mysql.conf
include "prod.conf"
db.mysql.driver=com.mysql.jdbc.Driver
db.mysql.jndiName=DefaultDS
ebean.mysql="fr.chklang.dontforget.business.*"
conf/postgresql.conf
include "prod.conf"
db.postgresql.driver=org.postgresql.Driver
db.postgresql.jndiName=DefaultDS
ebean.postgresql="fr.chklang.dontforget.business.*"
In addition to this, I have three folders under conf/evolutions:
evolutions/h2
evolutions/mysql
evolutions/postgresql
With all this in place, a user can start my application with this command:
-Dconfig.file=dontforget-conf.conf -DapplyEvolutions.default=true -Dhttp.port=10180 &
And this conf file is
include "postgresql.conf"
db.postgresql.url="jdbc:postgresql:dontforget"
db.postgresql.user=myUserName
db.postgresql.password=myPassword
But with this configuration, when my application tries to connect to the DB:
The default EbeanServer has not been defined? This is normally set via the ebean.datasource.default property. Otherwise it should be registered programatically via registerServer()]]
So I've tried to add this to my configuration:
ebean.datasource.default=postgresql
but when I add it I get:
Configuration error: Configuration error[Configuration error[]]
at play.api.Configuration$.play$api$Configuration$$configError(Configuration.scala:94)
at play.api.Configuration.reportError(Configuration.scala:743)
at play.Configuration.reportError(Configuration.java:310)
at play.db.ebean.EbeanPlugin.onStart(EbeanPlugin.java:56)
at play.api.Play$$anonfun$start$1$$anonfun$apply$mcV$sp$1.apply(Play.scala:91)
at play.api.Play$$anonfun$start$1$$anonfun$apply$mcV$sp$1.apply(Play.scala:91)
at scala.collection.immutable.List.foreach(List.scala:383)
at play.api.Play$$anonfun$start$1.apply$mcV$sp(Play.scala:91)
at play.api.Play$$anonfun$start$1.apply(Play.scala:91)
at play.api.Play$$anonfun$start$1.apply(Play.scala:91)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.api.Play$.start(Play.scala:90)
at play.core.StaticApplication.<init>(ApplicationProvider.scala:55)
at play.core.server.NettyServer$.createServer(NettyServer.scala:253)
at play.core.server.NettyServer$$anonfun$main$3.apply(NettyServer.scala:289)
at play.core.server.NettyServer$$anonfun$main$3.apply(NettyServer.scala:284)
at scala.Option.map(Option.scala:145)
at play.core.server.NettyServer$.main(NettyServer.scala:284)
at play.core.server.NettyServer.main(NettyServer.scala)
Caused by: Configuration error: Configuration error[]
at play.api.Configuration$.play$api$Configuration$$configError(Configuration.scala:94)
at play.api.Configuration.reportError(Configuration.scala:743)
at play.api.db.BoneCPApi.play$api$db$BoneCPApi$$error(DB.scala:271)
at play.api.db.BoneCPApi$$anonfun$getDataSource$3.apply(DB.scala:438)
at play.api.db.BoneCPApi$$anonfun$getDataSource$3.apply(DB.scala:438)
at scala.Option.getOrElse(Option.scala:120)
at play.api.db.BoneCPApi.getDataSource(DB.scala:438)
at play.api.db.DB$$anonfun$getDataSource$1.apply(DB.scala:142)
at play.api.db.DB$$anonfun$getDataSource$1.apply(DB.scala:142)
at scala.Option.map(Option.scala:145)
at play.api.db.DB$.getDataSource(DB.scala:142)
at play.api.db.DB.getDataSource(DB.scala)
at play.db.DB.getDataSource(DB.java:25)
at play.db.ebean.EbeanPlugin.onStart(EbeanPlugin.java:54)
So I don't understand how I can do this.
Yes! I've found it, after some time in debug mode.
There were two problems.
First problem: I had to add a key to my application.conf:
ebeanconfig.datasource
For me (for example), postgresql.conf is modified to:
db.postgresql.driver=org.postgresql.Driver
db.postgresql.jndiName=DefaultDS
ebean.postgresql="fr.chklang.dontforget.business.*"
ebeanconfig.datasource.default=postgresql
Second problem: include does not work in Play 2.3.x because the conf folder isn't added to the classpath (see "Load file from '/conf' directory on Cloudbees"), so we must concatenate prod.conf, postgresql.conf and dontforget.conf into one single file.
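As a sketch, the single merged file would contain at least the database block and the Ebean keys shown above (all of the values are the placeholders already used in this question):

# ... contents of prod.conf (everything except the database settings) ...

db.postgresql.driver=org.postgresql.Driver
db.postgresql.url="jdbc:postgresql:dontforget"
db.postgresql.user=myUserName
db.postgresql.password=myPassword
db.postgresql.jndiName=DefaultDS
ebean.postgresql="fr.chklang.dontforget.business.*"
ebeanconfig.datasource.default=postgresql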
I hope this helps other developers.
