Can anyone help with graph traversal using Gremlin? I have connected graph data where I need to query a user's contacts via the 'KNOWS' relationship,
along with the properties of each contact's outgoing connection vertices.
g.addV().property(id, 'user').as('user').
addV().property(id, 'user1').as('user1').
addV().property(id, 'user2').as('user2').
addV().property(id, 'user3').as('user3').
addV().property(id, 'user4').as('user4').
addV().property(id, 'Industry1').as('Industry1').
addV().property(id, 'Industry2').as('Industry2').
addV().property(id, 'skills1').as('skills1').
addV().property(id, 'skills2').as('skills2').
addE('KNOWS').from('user').to('user1').
addE('KNOWS').from('user').to('user2').
addE('KNOWS').from('user').to('user3').
addE('KNOWS').from('user').to('user4').
addE('WORKS').from('user1').to('Industry1').
addE('WORKS').from('user2').to('Industry1').
addE('WORKS').from('user2').to('Industry2').
addE('WORKS').from('user3').to('Industry2').
addE('SKILLEDIN').from('user3').to('skills1').
addE('SKILLEDIN').from('user4').to('skills1').
addE('SKILLEDIN').from('user4').to('skills2')
What is required: navigating from the user node, get all outgoing connections with the relationship 'KNOWS', and for each connection get its industries and skills.
The output should be the traversal result from user:
user1 -- Industry1, skills1
user2 -- Industry1, Industry2
user3 -- Industry2, skills1
user4 -- skills1, skills2
I think this is the output you want:
gremlin> g.V('user').out('KNOWS').
......1> group().
......2> by(id).
......3> by(project('industries','skills').
......4> by(out('WORKS').id().fold()).
......5> by(out('SKILLEDIN').id().fold()))
==>[user1:[industries:[Industry1],skills:[]],user2:[industries:[Industry1,Industry2],skills:[]],user3:[industries:[Industry2],skills:[skills1]],user4:[industries:[],skills:[skills1,skills2]]]
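If you would rather get each contact together with all of its outgoing neighbours in a single list (closer to the user1 -- Industry1, skills1 layout in the question), a small variation on the same idea should also work. This is just a sketch against the sample graph above, still keyed on the vertex ids:

g.V('user').out('KNOWS').
  project('contact','connections').
    by(id).
    by(out('WORKS','SKILLEDIN').id().fold())

Each result should then be a map containing the contact id plus the combined list of its industry and skill ids.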
When trying to run a Solr SQL query,
SELECT count(*) FROM products;
we are seeing the exception below from the Solr server. Ours is a SolrCloud setup.
Solr version: solr-8.8.2-PATCH2
solr-solrj version: 8.8.2
The complete stack trace is below:
2023-02-09 14:21:34.824 ERROR (qtp1209411469-15) [c:products s:shard3 r:core_node12 x:otmm_shard3_replica_n10] o.a.s.s.HttpSolrCall java.lang.RuntimeException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.solr.handler.sql.SolrRules
at org.apache.solr.servlet.HttpSolrCall.sendError(HttpSolrCall.java:746)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:592)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:427)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:357)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:201)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:548)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:602)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1434)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1349)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:191)
at org.eclipse.jetty.server.handler.InetAccessHandler.handle(InetAccessHandler.java:177)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:516)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:388)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:633)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:380)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:386)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.solr.handler.sql.SolrRules
at org.apache.solr.handler.sql.SolrTableScan.register(SolrTableScan.java:72)
at org.apache.calcite.plan.AbstractRelOptPlanner.onNewClass(AbstractRelOptPlanner.java:239)
at org.apache.calcite.plan.volcano.VolcanoPlanner.onNewClass(VolcanoPlanner.java:464)
at org.apache.calcite.plan.AbstractRelOptPlanner.registerClass(AbstractRelOptPlanner.java:230)
at org.apache.calcite.plan.volcano.VolcanoPlanner.registerImpl(VolcanoPlanner.java:1224)
at org.apache.calcite.plan.volcano.VolcanoPlanner.register(VolcanoPlanner.java:589)
at org.apache.calcite.plan.volcano.VolcanoPlanner.ensureRegistered(VolcanoPlanner.java:604)
at org.apache.calcite.plan.volcano.VolcanoPlanner.ensureRegistered(VolcanoPlanner.java:84)
at org.apache.calcite.rel.AbstractRelNode.onRegister(AbstractRelNode.java:268)
at org.apache.calcite.plan.volcano.VolcanoPlanner.registerImpl(VolcanoPlanner.java:1132)
at org.apache.calcite.plan.volcano.VolcanoPlanner.register(VolcanoPlanner.java:589)
at org.apache.calcite.plan.volcano.VolcanoPlanner.ensureRegistered(VolcanoPlanner.java:604)
at org.apache.calcite.plan.volcano.VolcanoPlanner.ensureRegistered(VolcanoPlanner.java:84)
at org.apache.calcite.rel.AbstractRelNode.onRegister(AbstractRelNode.java:268)
at org.apache.calcite.plan.volcano.VolcanoPlanner.registerImpl(VolcanoPlanner.java:1132)
at org.apache.calcite.plan.volcano.VolcanoPlanner.setRoot(VolcanoPlanner.java:265)
at org.apache.calcite.tools.Programs.lambda$standard$3(Programs.java:262)
at org.apache.calcite.tools.Programs$SequenceProgram.run(Programs.java:331)
at org.apache.calcite.prepare.Prepare.optimize(Prepare.java:166)
at org.apache.calcite.prepare.Prepare.prepareSql(Prepare.java:297)
at org.apache.calcite.prepare.Prepare.prepareSql(Prepare.java:208)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepare2_(CalcitePrepareImpl.java:642)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepare_(CalcitePrepareImpl.java:508)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepareSql(CalcitePrepareImpl.java:478)
at org.apache.calcite.jdbc.CalciteConnectionImpl.parseQuery(CalciteConnectionImpl.java:231)
at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:556)
at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
at org.apache.solr.client.solrj.io.stream.JDBCStream.open(JDBCStream.java:278)
at org.apache.solr.client.solrj.io.stream.ExceptionStream.open(ExceptionStream.java:52)
at org.apache.solr.handler.StreamHandler$TimerStream.open(StreamHandler.java:465)
at org.apache.solr.client.solrj.io.stream.TupleStream.writeMap(TupleStream.java:82)
at org.apache.solr.common.util.JsonTextWriter.writeMap(JsonTextWriter.java:164)
at org.apache.solr.common.util.TextWriter.writeMap(TextWriter.java:216)
at org.apache.solr.common.util.TextWriter.writeVal(TextWriter.java:69)
at org.apache.solr.response.TextResponseWriter.writeVal(TextResponseWriter.java:153)
at org.apache.solr.common.util.JsonTextWriter.writeNamedListAsMapWithDups(JsonTextWriter.java:387)
at org.apache.solr.common.util.JsonTextWriter.writeNamedList(JsonTextWriter.java:293)
at org.apache.solr.response.JSONWriter.writeResponse(JSONWriter.java:73)
at org.apache.solr.response.JSONResponseWriter.write(JSONResponseWriter.java:66)
at org.apache.solr.response.QueryResponseWriterUtil.writeQueryResponse(QueryResponseWriterUtil.java:65)
at org.apache.solr.servlet.HttpSolrCall.writeResponse(HttpSolrCall.java:890)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:583)
... 40 more
Kindly let me know what is going wrong with this. (I am also adding a few additional lines because the site is complaining that I should add more details.)
I am using SolrJ, and I have an application that supports only JDBC, so I am in a situation where I can only access Solr over JDBC via SolrJ.
It is basically a BIRT reporting tool, where we can define SQL and the output is automatically mapped to the report.
This is what we are trying to do.
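Roughly, this is how the JDBC side connects (a sketch only; the ZooKeeper hosts and the collection name are placeholders for our actual setup, and the driver class is SolrJ's JDBC driver):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SolrJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder ZooKeeper ensemble and collection for our SolrCloud setup
        String url = "jdbc:solr://zkhost1:2181,zkhost2:2181?collection=products";
        // SolrJ's JDBC driver (shipped with solr-solrj)
        Class.forName("org.apache.solr.client.solrj.io.sql.DriverImpl");
        try (Connection con = DriverManager.getConnection(url);
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT count(*) FROM products")) {
            while (rs.next()) {
                System.out.println(rs.getObject(1)); // the count value
            }
        }
    }
}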
I added the following in solr.in.sh:
SOLR_OPTS="$SOLR_OPTS -Dsolr.modules=sql"
In the Solr admin UI, under Args on the system dashboard, I can see:
-Dsolr.modules=sql
But it is still not working.
{
"responseHeader":{
"status":0,
"QTime":2},
"config":{"requestHandler":{"/export":{
"class":"solr.ExportHandler",
"useParams":"_EXPORT",
"components":["query"],
"invariants":{
"rq":"{!xport}",
"distrib":false},
"name":"/export",
"_useParamsExpanded_":{"_EXPORT":"[NOT AVAILABLE]"},
"_effectiveParams_":{
"distrib":"false",
"rq":"{!xport}"}}}}}
Edit: I just realized you had linked to the wrong docs for your version. The way you are trying to add modules was not available until 9.0 (https://issues.apache.org/jira/browse/SOLR-15914). The 8.8 docs are at https://solr.apache.org/guide/8_8/. Note that until recently this feature was called Parallel SQL: https://solr.apache.org/guide/8_8/parallel-sql-interface.html
For 8.8 you shouldn't need to configure modules since at that time /sql was an implicitly loaded request handler.
https://solr.apache.org/guide/8_8/implicit-requesthandlers.html
You may need to verify if the implicit handlers have been (mis)configured via the request parameters API (https://solr.apache.org/guide/8_8/implicit-requesthandlers.html#how-to-edit-implicit-handler-paramsets)
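For example, to confirm the implicit /sql handler is present and see its effective params, you can use the same Config API call you used for /export (host, port and collection name here are placeholders for yours):

curl "http://localhost:8983/solr/products/config/requestHandler?componentName=/sql&expandParams=true"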
Below refers to the 9.x versions of solr
java.lang.NoClassDefFoundError: Could not initialize class org.apache.solr.handler.sql.SolrRules
Probably indicates that you have not successfully loaded the sql module. Normally one enables modules at the very bottom of solr.in.sh not solr.sh as you said in your comments. solr.in.sh is the place where Solr intends for you to set up environment variables. One really shouldn't ever need to modify solr.sh directly, and doing so may make upgrades difficult in the future.
Check that there isn't another set of enablements in solr.in.sh that is overwriting what you've tried to do in solr.sh. Also check that you aren't enabling any other modules in other ways (several methods are shown here: https://solr.apache.org/guide/solr/latest/configuration-guide/solr-modules.html). You should pick one way of enabling modules (sysprops, solr.in.sh, solr.xml or solrconfig.xml) and then enable all modules that way to avoid having to understand any complicated precedence logic if possible. I don't know the precedence order, but I can probably figure it out if you have other modules and absolutely can't avoid using more than one method.
Also, you still haven't told us where you got the version ending in -PATCH2. This version spec sounds like you are working with some folks who are compiling their own custom Solr, so you should be sure to understand what they've changed in case they've customized something about how or where Solr loads jar files (not very likely, but one never knows).
So I have the following problem:
When I try to use the Ranker I have trained to search I get the following error message:
pysolr.SolrError: Solr responded with an error (HTTP 500): [Reason: Can not rerank results. Verify that your schema has not changed in incompatible ways.]
This is how I request the result:
pysolr_client._send_request("GET", path='/fcselect?q=%s&ranker_id=%s&wt=json' % (Question, ranker_id))
When I try to do it not through Python but through curl, I get the following error:
{"responseHeader":{"status":400,"QTime":1},"error":{"metadata":["error-class","org.apache.solr.common.SolrException","root-error-class","org.apache.solr.common.SolrException"],"msg":"Bad contentType for search handler :application/octet-stream request=...}","code":400}}
(I left out the request itself to not post the ranker id here).
The curl request I sent is the following:
curl -X POST -u "*username*":"*password*" "https://gateway.watsonplatform.net/retrieve-and-rank/api/v1/solr_clusters/*solr_cluster_id*/solr/Question_collection/fcselect?ranker_id=*ranker_id*&q=*question*?&wt=json"
For curl I found the following solution: just adding -H "Content-Type: application/json". It then shows me some documents as a result, but at the end of the response it still shows the same error. Additionally, I see the following trace:
org.apache.solr.common.SolrException: Can not rerank results. Verify that your schema has not changed in incompatible ways.
at com.ibm.watson.hector.plugins.utils.ExceptionHandlingUtil.logAndThrowSolrException(ExceptionHandlingUtil.java:36)
at com.ibm.watson.hector.plugins.ss.FCFeatureGeneratorComponent.rerank(FCFeatureGeneratorComponent.java:743)
at com.ibm.watson.hector.plugins.ss.FCFeatureGeneratorComponent.process(FCFeatureGeneratorComponent.java:348)
at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:272)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:155)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:2102)
at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:654)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:460)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:257)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:208)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
The problem is that between training the ranker and using it I haven't even touched anything else. Not the schema, not the collection, none of the names. And I only have one collection, one configuration, one of everything, except for the documents - 294 of them.
The whole process I went through worked for a Ranker without custom features. But with custom features it doesn't.
I have gone through this tutorial to create my Watson Ranker with custom features: https://medium.com/machine-learning-with-ibm-watson/developing-with-ibm-watson-retrieve-and-rank-part-3-custom-features-826fe88a5c63
As far as I understand, all I did thanks to this tutorial was change the trainingdata.txt file; the training process is the same.
And now I have run out of ideas about what to check to solve the problem.
Do you have any suggestions?
Thanks a lot in advance! :)
It was a stupid thing in the server.py of the rr_custom_scorer_proxy. When it writes the 'answer' CSV that has to be reranked by the ranker, it opens the file in 'wt' mode, which leads to blank lines between the rows. The ranker can't handle this and we get an error.
If you change it to 'wb', everything works well.
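For anyone hitting the same thing, here is a minimal sketch of the change (the file name and row contents are made up, not the proxy's actual code, and this assumes Python 2 as used by the proxy; in text mode the csv module's '\r\n' line endings can get doubled on Windows, which the ranker then reads as blank rows):

import csv

answer_csv = 'answers.csv'  # illustrative name, not the proxy's actual path

# was: open(answer_csv, 'wt')
with open(answer_csv, 'wb') as f:
    writer = csv.writer(f)
    writer.writerow(['answer_id', 'feature_1', 'feature_2', 'ground_truth'])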
I am trying to delete a list of records from Salesforce via Mule ESB.
I have already created an example project on GitHub, and the flow XML can be found here.
The xml snippet for deleting records is below:
<sfdc:delete config-ref="Salesforce__Basic_authentication" doc:name="Salesforce">
<sfdc:ids ref="#[payload]" />
</sfdc:delete>
where the payload data type is a List of Strings.
While deleting the records, I am getting the exception below:
ERROR 2015-11-05 23:39:39,755 [[tutorial].HTTP_Listener_Configuration.worker.01] org.mule.exception.DefaultMessagingExceptionStrategy:
********************************************************************************
Message : Could not serialize object (org.mule.api.serialization.SerializationException)
Type : org.mule.api.transformer.TransformerException
Code : MULE_ERROR--2
JavaDoc : http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/api/transformer/TransformerException.html
Transformer : ObjectToByteArray{this=7be7e15, name='_ObjectToByteArray', ignoreBadInput=false, returnClass=SimpleDataType{type=[B, mimeType='*/*', encoding='null'}, sourceTypes=[SimpleDataType{type=java.io.Serializable, mimeType='*/*', encoding='null'}, SimpleDataType{type=java.io.InputStream, mimeType='*/*', encoding='null'}, SimpleDataType{type=java.lang.String, mimeType='*/*', encoding='null'}, SimpleDataType{type=org.mule.api.transport.OutputHandler, mimeType='*/*', encoding='null'}]}
********************************************************************************
Exception stack is:
1. com.sforce.soap.partner.DeleteResult (java.io.NotSerializableException)
java.io.ObjectOutputStream:1184 (null)
2. java.io.NotSerializableException: com.sforce.soap.partner.DeleteResult (org.apache.commons.lang.SerializationException)
org.apache.commons.lang.SerializationUtils:111 (null)
3. Could not serialize object (org.mule.api.serialization.SerializationException)
org.mule.serialization.internal.AbstractObjectSerializer:68 (http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/api/serialization/SerializationException.html)
4. Could not serialize object (org.mule.api.serialization.SerializationException) (org.mule.api.transformer.TransformerException)
org.mule.transformer.simple.SerializableToByteArray:66 (http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/api/transformer/TransformerException.html)
********************************************************************************
Root Exception stack trace:
java.io.NotSerializableException: com.sforce.soap.partner.DeleteResult
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.apache.commons.lang.SerializationUtils.serialize(SerializationUtils.java:108)
at org.apache.commons.lang.SerializationUtils.serialize(SerializationUtils.java:133)
at org.mule.serialization.internal.JavaObjectSerializer.doSerialize(JavaObjectSerializer.java:44)
at org.mule.serialization.internal.AbstractObjectSerializer.serialize(AbstractObjectSerializer.java:64)
at org.mule.transformer.simple.SerializableToByteArray.doTransform(SerializableToByteArray.java:62)
at org.mule.transformer.simple.ObjectToByteArray.doTransform(ObjectToByteArray.java:78)
at org.mule.transformer.AbstractTransformer.transform(AbstractTransformer.java:415)
at org.mule.DefaultMuleMessage.getPayload(DefaultMuleMessage.java:425)
at org.mule.DefaultMuleMessage.getPayload(DefaultMuleMessage.java:373)
at org.mule.DefaultMuleMessage.getPayloadAsBytes(DefaultMuleMessage.java:714)
at org.mule.module.http.internal.listener.HttpResponseBuilder.build(HttpResponseBuilder.java:175)
at org.mule.module.http.internal.listener.HttpMessageProcessorTemplate.sendResponseToClient(HttpMessageProcessorTemplate.java:97)
at org.mule.execution.AsyncResponseFlowProcessingPhase.runPhase(AsyncResponseFlowProcessingPhase.java:83)
at org.mule.execution.AsyncResponseFlowProcessingPhase.runPhase(AsyncResponseFlowProcessingPhase.java:38)
at org.mule.execution.PhaseExecutionEngine$InternalPhaseExecutionEngine.phaseSuccessfully(PhaseExecutionEngine.java:65)
at org.mule.execution.PhaseExecutionEngine$InternalPhaseExecutionEngine.phaseSuccessfully(PhaseExecutionEngine.java:69)
at com.mulesoft.mule.throttling.ThrottlingPhase.runPhase(ThrottlingPhase.java:185)
at com.mulesoft.mule.throttling.ThrottlingPhase.runPhase(ThrottlingPhase.java:1)
at org.mule.execution.PhaseExecutionEngine$InternalPhaseExecutionEngine.process(PhaseExecutionEngine.java:114)
at org.mule.execution.PhaseExecutionEngine.process(PhaseExecutionEngine.java:41)
at org.mule.execution.MuleMessageProcessingManager.processMessage(MuleMessageProcessingManager.java:32)
at org.mule.module.http.internal.listener.DefaultHttpListener$1.handleRequest(DefaultHttpListener.java:126)
at org.mule.module.http.internal.listener.grizzly.GrizzlyRequestDispatcherFilter.handleRead(GrizzlyRequestDispatcherFilter.java:83)
at org.glassfish.grizzly.filterchain.ExecutorResolver$9.execute(ExecutorResolver.java:119)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeFilter(DefaultFilterChain.java:283)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.executeChainPart(DefaultFilterChain.java:200)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.execute(DefaultFilterChain.java:132)
at org.glassfish.grizzly.filterchain.DefaultFilterChain.process(DefaultFilterChain.java:111)
at org.glassfish.grizzly.ProcessorExecutor.execute(ProcessorExecutor.java:77)
at org.glassfish.grizzly.nio.transport.TCPNIOTransport.fireIOEvent(TCPNIOTransport.java:536)
at org.glassfish.grizzly.strategies.AbstractIOStrategy.fireIOEvent(AbstractIOStrategy.java:112)
at org.mule.module.http.internal.listener.grizzly.ExecutorPerServerAddressIOStrategy.run0(ExecutorPerServerAddressIOStrategy.java:102)
at org.mule.module.http.internal.listener.grizzly.ExecutorPerServerAddressIOStrategy.access$100(ExecutorPerServerAddressIOStrategy.java:30)
at org.mule.module.http.internal.listener.grizzly.ExecutorPerServerAddressIOStrategy$WorkerThreadRunnable.run(ExecutorPerServerAddressIOStrategy.java:125)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
You are actually deleting successfully. The problem is that your Mule flow ends after making the Salesforce call, and by default it tries to return the current payload. That payload is the result of the SF call, which is of type com.sforce.soap.partner.DeleteResult, and Mule doesn't know how to serialize it.
To start, try simply adding a "Set payload" component after the SF call and set payload to anything you want, say "hello world". Once you understand what's happening, you can add a groovy script to process the DeleteResult and actually determine if the deletion was successful or not, and possibly do something if it wasn't.
If you don't want to explicitly change the payload and want the exact payload coming out of the SFDC connector, then try adding an Object to JSON transformer after the SFDC connector, plus a logger to log the payload.
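As a rough sketch (namespace declarations omitted; element names are from the Mule 3 JSON module), the fragment after the delete could look like this:

<sfdc:delete config-ref="Salesforce__Basic_authentication" doc:name="Salesforce">
    <sfdc:ids ref="#[payload]" />
</sfdc:delete>
<!-- DeleteResult is not Java-serializable, so convert it (or replace the payload)
     before the HTTP listener builds the response -->
<json:object-to-json-transformer doc:name="Object to JSON"/>
<logger message="Delete results: #[payload]" level="INFO" doc:name="Logger"/>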
My ESB flow needs to get files from a dynamic folder. The folder name changes based on month and year. Hence, I configured my inbound endpoint as shown below, but I am getting the error below. I really appreciate any help on this.
Flow:
<flow name="DataMapperTestFlow" doc:name="DataMapperTestFlow">
<file:inbound-endpoint path="C:\#[new Date().format('yyyy\\MMMM')]" moveToDirectory="C:\#[new Date().format('yyyy\\MMMM')]\backup" pollingFrequency="10000" responseTimeout="10000" doc:name="File">
<file:filename-regex-filter pattern=".*.xls" caseSensitive="true"/>
</file:inbound-endpoint>
<custom-transformer class="ExcelToJava" doc:name="Java"/>
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="insertTestHeaders" connector-ref="NewDatabase" doc:name="InsertHeaders"/>
<set-payload value="#[payload.excelData.excelRows]" doc:name="Set Payload"/>
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="insertTestRows" connector-ref="NewDatabase" doc:name="InsertRows"/>
</flow>
Error:
org.mule.api.endpoint.MalformedEndpointException: The endpoint
"file:///C:/#[new Date().format('yyyy/MMMM')]" is malformed and cannot
be parsed. If this is the name of a global endpoint, check the name
is correct, that the endpoint exists, and that you are using the
correct configuration (eg the "ref" attribute). Note that names on
inbound and outbound endpoints cannot be used to send or receive
messages; use a named global endpoint instead.. Only Outbound
endpoints can be dynamic
"Only Outbound endpoints can be dynamic" quite says it all. You can have a look at the Mule Requester Module if it suits your needs, or try creating endpoints/flows programmatically with a scheduler and Java/Groovy/etc code.
Trying to set up a config file for a custom module: do I need to have a unique model for each 'resourceModel', or is it possible to have multiple table entities per model?
Is it possible to get something like this to work:
<config>
    ...
    <models>
        <namespace>
            <class>Namespace_Module_Model</class>
            <resourceModel>module_mysql4</resourceModel>
        </namespace>
        <module_mysql4>
            <class>Namespace_Module_Model_Mysql4</class>
            <entities>
                <table_1>
                    <table>table_1</table>
                </table_1>
                <table_2>
                    <table>table_2</table>
                </table_2>
                <table_3>
                    <table>table_3</table>
                </table_3>
                ...
            </entities>
        </module_mysql4>
    </models>
    ...
</config>
and then dynamically switch between the tables through the model?
And related: does anyone know what the possible children of the resource model node are, and their properties? I've seen 'entities', 'associations' and 'items'. Thx
It's not really clear what you're asking here. Magento has a basic one-resource-to-one-table resource, and, for EAV-style models, a one-resource-to-many-tables resource constructed in a specific manner.
The scenario you're describing above isn't directly supported by the system, but if you wanted to implement something like it, there's nothing stopping you from implementing a resource that works any way you want.
As for the possible children, create the simple config viewer described here to get a dump of the entire merged config, and then use an XPath viewer to examine all the nodes (and their children) that you're interested in.
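For example, a deliberately simple (and entirely hypothetical, untested) resource that can be pointed at any of the tables declared under <entities> might look something like this; the 'module' group alias and the 'entity_id' id column are placeholders that would have to match your real config:

<?php
class Namespace_Module_Model_Mysql4_Flexible extends Mage_Core_Model_Mysql4_Abstract
{
    protected function _construct()
    {
        // Default entity; 'module/table_1' resolves through <resourceModel>
        // to <module_mysql4><entities><table_1><table>
        $this->_init('module/table_1', 'entity_id');
    }

    /**
     * Switch the main table at runtime, e.g. $resource->useEntity('module/table_2').
     */
    public function useEntity($entityAlias)
    {
        $this->_setMainTable($entityAlias, $this->getIdFieldName());
        return $this;
    }
}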
Thx for the response & apologies if the question was unclear. After a few hours of debugging, I have it working with the following structure:
<models>
<modulename>
<class>Namespace_Modulename_Model</class>
<resourceModel>modulename_mysql4</resourceModel>
</modulename>
<modulename_type1>
<class>Namespace_Modulename_Model_Type1</class>
<resourceModel>modulename_mysql4</resourceModel>
</modulename_type1>
<modulename_type2>
<class>Namespace_Modulename_Model_Type2</class>
<resourceModel>modulename_mysql4</resourceModel>
</modulename_type2>
<modulename_mysql4>
<class>Namespace_Modulename_Model_Mysql4</class>
<entities>
<modulename>
<table>modulename</table>
</modulename>
<modulename_type1>
<table>modulename_type1</table>
</modulename_type1>
<modulename__type2>
<table>modulename_type2</table>
</modulename_type2>
</entities>
</modulename_mysql4>
</models>
So yes - there is a single table entity for each model declared (one model, one resource), but I would have assumed that each additional model/resourceModel combination would require its own separate Model_Mysql4 class in its own modulename_mysql4 node, à la:
<models>
<modulename>
<class>Namespace_Modulename_Model</class>
<resourceModel>modulename_mysql4</resourceModel>
</modulename>
<modulename_type1>
<class>Namespace_Modulename_Model_Type1</class>
<resourceModel>modulename_mysql4_type1</resourceModel>
</modulename_type1>
<modulename_type2>
<class>Namespace_Modulename_Model_Type2</class>
<resourceModel>modulename_mysql4_type2</resourceModel>
</modulename_type2>
<modulename_mysql4>
<class>Namespace_Modulename_Model_Mysql4</class>
<entities>
<modulename>
<table>modulename</table>
</modulename>
</entities>
</modulename_mysql4>
<modulename_mysql4_type1>
<class>Namespace_Modulename_Model_Mysql4_Type1</class>
<entities>
<modulename_type1>
<table>modulename_type1</table>
</modulename_type1>
</entities>
</modulename_mysql4_type1>
<modulename_mysql4_type2>
<class>Namespace_Modulename_Model_Mysql4_Type2</class>
<entities>
<modulename_type2>
<table>modulename_type2</table>
</modulename_type2>
</entities>
</modulename_mysql4_type2>
</models>
but that is not the case. Would love to hear a play by play explanation. Thx for the help!
Or:
<resources>
<modulename_setup>
<setup>
<module>modulename</module>
</setup>
<connection>
<use>core_setup</use>
</connection>
</modulename_setup>
<modulename_write>
<connection>
<use>core_write</use>
</connection>
</modulename_write>
<modulename_read>
<connection>
<use>core_read</use>
</connection>
</modulename_read>
</resources>