JBossFuse IllegalArgumentException in CXF Client - apache-camel

I'm getting a rather odd exception in a Camel route. It compares a class with itself and identifies them as different types, which I can only assume is the result of a ClassLoader issue. This is the exception, formatted for clarity (full trace below).
java.lang.IllegalArgumentException:
Part {http://ws.someco.com/messageserv}createMsgRequest
should be of type
com.someco.ws.messageserv.CreateMsgRequest,
not com.someco.ws.messageserv.CreateMsgRequest
Can anyone suggest a solution?
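To illustrate why I suspect ClassLoaders: the JVM keys type identity on the Class object, i.e. (class name, defining ClassLoader), not on the class name alone. A minimal sketch in plain Java (no Camel/CXF dependencies) of the check that CXF's checkPart() effectively performs:

```java
// Minimal sketch: two classes with the same fully-qualified name are only the
// same *type* if they were defined by the same ClassLoader. In an OSGi
// container like Fuse, two bundles can each load
// com.someco.ws.messageserv.CreateMsgRequest, yielding two distinct Class
// objects, so an identity comparison fails even though the printed names match.
public class ClassIdentityDemo {
    public static void main(String[] args) throws Exception {
        Class<?> a = String.class;
        Class<?> b = Class.forName("java.lang.String"); // same loader here -> same type

        System.out.println(a == b);                           // true only when the loaders match
        System.out.println(a.getName().equals(b.getName()));  // names alone don't prove identity

        // Logging the loader on both sides of a suspected mismatch reveals which
        // bundles are involved (null means the bootstrap loader):
        System.out.println(a.getClassLoader());
    }
}
```
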
These are some of the version details:
JBoss Fuse (6.2.1.redhat-084)
[installed ] [2.15.1.redhat-621084 ] camel-cxf camel-2.15.1.redhat-621084
[installed ] [1.2.0.redhat-621084 ] fabric-cxf fabric-1.2.0.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-specs cxf-3.0.4.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-core cxf-3.0.4.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-wsdl cxf-3.0.4.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-http cxf-3.0.4.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-http-jetty cxf-3.0.4.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-bindings-soap cxf-3.0.4.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-jaxws cxf-3.0.4.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-jaxrs cxf-3.0.4.redhat-621084
[installed ] [3.0.4.redhat-621084 ] cxf-databinding-jaxb cxf-3.0.4.redhat-621084
This is from the logged error:
2016-09-28 15:18:44,541 | ERROR | tp1704597020-170 | DefaultErrorHandler | 198 - org.apache.camel.camel-core - 2.15.1.redhat-621084 | Failed delivery for (MessageId: ID-IFDS3854-57235-1475090154837-1-1 on ExchangeId: ID-IFDS3854-57235-1475090154837-1-2). Exhausted after delivery attempt: 1 caught: java.lang.IllegalArgumentException: Part {http://ws.someco.com/messageserv}createMsgRequest should be of type com.someco.ws.messageserv.CreateMsgRequest, not com.someco.ws.messageserv.CreateMsgRequest
Message History
---------------------------------------------------------------------------------------------------------------------------------------
RouteId ProcessorId Processor Elapsed (ms)
[route2 ] [route2 ] [cxf://bean:msgservpocEndpoint ] [ 256]
[route2 ] [removeHeaders2 ] [removeHeaders[*] ] [ 1]
[route2 ] [log4 ] [log ] [ 1]
[route2 ] [to3 ] [ref:createMsgReqTransformer ] [ 97]
[route2 ] [process2 ] [ref:imageProcessor ] [ 0]
[route2 ] [setHeader2 ] [setHeader[operationName] ] [ 0]
[route2 ] [log5 ] [log ] [ 1]
[route2 ] [messageservEndpoin] [cxf:bean:messageservEndpoint?defaultOperationName=createMsg&defaultOperationNa] [ 152]
Exchange
---------------------------------------------------------------------------------------------------------------------------------------
Exchange[
Id ID-IFDS3854-57235-1475090154837-1-2
ExchangePattern InOut
Headers {CamelRedelivered=false, CamelRedeliveryCounter=0, operationName=createMsg}
BodyType java.util.ArrayList
Body [com.someco.ws.messageserv.CreateMsgRequest#161e2951]
]
Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
java.lang.IllegalArgumentException: Part {http://ws.someco.com/messageserv}createMsgRequest should be of type com.someco.ws.messageserv.CreateMsgRequest, not com.someco.ws.messageserv.CreateMsgRequest
at org.apache.cxf.jaxb.io.DataWriterImpl.checkPart(DataWriterImpl.java:292)[79:org.apache.cxf.cxf-rt-databinding-jaxb:3.0.4.redhat-621084]
at org.apache.cxf.jaxb.io.DataWriterImpl.write(DataWriterImpl.java:220)[79:org.apache.cxf.cxf-rt-databinding-jaxb:3.0.4.redhat-621084]
at org.apache.cxf.interceptor.AbstractOutDatabindingInterceptor.writeParts(AbstractOutDatabindingInterceptor.java:122)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.cxf.wsdl.interceptors.BareOutInterceptor.handleMessage(BareOutInterceptor.java:69)[78:org.apache.cxf.cxf-rt-wsdl:3.0.4.redhat-621084]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.cxf.endpoint.ClientImpl.doInvoke(ClientImpl.java:516)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:418)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.camel.component.cxf.CxfProducer.process(CxfProducer.java:116)[207:org.apache.camel.camel-cxf:2.15.1.redhat-621084]
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:139)[198:org.apache.camel.camel-core:2.15.1.redhat-621084]
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)[198:org.apache.camel.camel-core:2.15.1.redhat-621084]
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:448)[198:org.apache.camel.camel-core:2.15.1.redhat-621084]
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:191)[198:org.apache.camel.camel-core:2.15.1.redhat-621084]
at org.apache.camel.processor.Pipeline.process(Pipeline.java:121)[198:org.apache.camel.camel-core:2.15.1.redhat-621084]
at org.apache.camel.processor.Pipeline.process(Pipeline.java:83)[198:org.apache.camel.camel-core:2.15.1.redhat-621084]
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:191)[198:org.apache.camel.camel-core:2.15.1.redhat-621084]
at org.apache.camel.component.cxf.CxfConsumer$1.asyncInvoke(CxfConsumer.java:95)[207:org.apache.camel.camel-cxf:2.15.1.redhat-621084]
at org.apache.camel.component.cxf.CxfConsumer$1.invoke(CxfConsumer.java:75)[207:org.apache.camel.camel-cxf:2.15.1.redhat-621084]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:59)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)[:1.7.0_79]
at java.util.concurrent.FutureTask.run(FutureTask.java:262)[:1.7.0_79]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor$2.run(ServiceInvokerInterceptor.java:126)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.cxf.workqueue.SynchronousExecutor.execute(SynchronousExecutor.java:37)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:131)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)[74:org.apache.cxf.cxf-core:3.0.4.redhat-621084]
at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:251)[96:org.apache.cxf.cxf-rt-transports-http:3.0.4.redhat-621084]
at org.apache.cxf.transport.http_jetty.JettyHTTPDestination.doService(JettyHTTPDestination.java:261)[205:org.apache.cxf.cxf-rt-transports-http-jetty:3.0.4.redhat-621084]
at org.apache.cxf.transport.http_jetty.JettyHTTPHandler.handle(JettyHTTPHandler.java:70)[205:org.apache.cxf.cxf-rt-transports-http-jetty:3.0.4.redhat-621084]
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1088)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1024)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.Server.handle(Server.java:370)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)[86:org.eclipse.jetty.aggregate.jetty-all-server:8.1.17.v20150415]
at java.lang.Thread.run(Thread.java:745)[:1.7.0_79]

Related

JQ - Groupby and join using mapping

I submitted my first question and had hoped to apply the answer to the bigger JSON file, but I am just not getting it.
Using jq, I am trying to turn this JSON:
[{"field": "F1","results": [{"details": [
{"name": "P1","matches": [
{"displayName": "User1","smtpAddress": "user1@foo.bar"},
{"displayName": "User2","smtpAddress": "user2@foo.bar"}
]
},
{"name": "P2","matches": [
{"displayName": "User3","smtpAddress": "user3@foo.bar"},
{"displayName": "User4","smtpAddress": "user4@foo.bar"}
]
}]}]},
{"field": "F2","results": [{"details": [
{"name": "P3","matches": [
{"displayName": "User1","smtpAddress": "user1@foo.bar"},
{"displayName": "User5","smtpAddress": "user5@foo.bar"}
]
},
{"name": "P4","matches": [
{"displayName": "User6","smtpAddress": "user6@foo.bar"},
{"displayName": "User7","smtpAddress": "user7@foo.bar"}
]
}]}]}]
into CSV like this:
"F1","P1","User1 <user1@foo.bar>;User2 <user2@foo.bar>"
"F1","P2","User3 <user3@foo.bar>;User4 <user4@foo.bar>"
"F2","P3","User1 <user1@foo.bar>;User5 <user5@foo.bar>"
"F2","P4","User6 <user6@foo.bar>;User7 <user7@foo.bar>"
I cannot get the nested sub-array to be respected by map. Any explanation is appreciated.
jq -r '.[]
| .field as $field
| (.results[].details[]
| [$field, .name] +
[([.matches[] | "\(.displayName) <\(.smtpAddress)>"] | join(";")) ])
| @csv'

chart js error in next js site hosted on netlify [duplicate]

This question already has answers here:
Error "RangeError: minimumFractionDigits value is out of range" with ChartJS in Next.js app
(3 answers)
Closed 4 months ago.
Everything is working fine in development on my local machine, but when I deploy the site to Netlify, I get this weird error in the console.
The page data is fetched using the getStaticProps function and then passed to the page as props.
Here is what the data looks like:
[
[ 17671, 19856, 37527 ], [ 4887, 5418, 10305 ],
[ 2803, 2303, 5106 ], [ 8617, 9403, 18020 ],
[ 12664, 15722, 28386 ], [ 4227, 3359, 7586 ],
[ 1371, 1545, 2916 ], [ 17526, 14297, 31823 ],
[ 6883, 4781, 11664 ], [ 9805, 10000, 19805 ],
[ 11068, 12888, 23956 ], [ 4570, 4722, 9292 ],
[ 15428, 17309, 32737 ], [ 3565, 3656, 7221 ],
[ 8750, 10443, 19193 ], [ 1771, 1432, 3203 ],
[ 8495, 9979, 18474 ], [ 288, 265, 553 ],
[ 21130, 19321, 40451 ], [ 6867, 6556, 13423 ],
[ 2656, 2092, 4748 ], [ 967, 656, 1623 ],
[ 4540, 4505, 9045 ], [ 7025, 7108, 14133 ],
[ 53134, 59829, 112963 ], [ 8640, 9028, 17668 ],
[ 6759, 7119, 13878 ], [ 28803, 31362, 60165 ],
[ 7346, 7630, 14976 ], [ 9771, 10963, 20734 ],
[ 15783, 17397, 33180 ], [ 9847, 9706, 19553 ],
[ 15154, 17577, 32731 ], [ 1056, 874, 1930 ],
[ 3260, 2476, 5736 ], [ 1488, 1424, 2912 ],
[ 1656, 1154, 2810 ]
]
It is an array of arrays of 3 numbers.
How do I solve this, please?
You need to set swcMinify: false in your next.config.js. This is currently a bug in the SWC minifier, and you will need to wait until it is fixed before you can enable minification again.

How to insert csv file (900MB) data in SQL server quickly?

I have tried inserting using itertuples, but my file is too big. I even split the file into 4 different files, and even then it is too big: one quarter of the file takes more than 30 minutes. Is there an easier and quicker way to import the data into SQL Server?
Thanks in advance.
For importing big data faster, SQL Server has the BULK INSERT command. I tested it on my local server: importing an 870 MB CSV file (10 million records) into SQL Server took 12.6 seconds.
BULK INSERT dbo.import_test_data
FROM 'C:\Users\Ramin\Desktop\target_table.csv'
The full syntax of this command:
BULK INSERT
{ database_name.schema_name.table_or_view_name | schema_name.table_or_view_name | table_or_view_name }
FROM 'data_file'
[ WITH
(
[ [ , ] BATCHSIZE = batch_size ]
[ [ , ] CHECK_CONSTRAINTS ]
[ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ]
[ [ , ] DATAFILETYPE =
{ 'char' | 'native'| 'widechar' | 'widenative' } ]
[ [ , ] DATA_SOURCE = 'data_source_name' ]
[ [ , ] ERRORFILE = 'file_name' ]
[ [ , ] ERRORFILE_DATA_SOURCE = 'errorfile_data_source_name' ]
[ [ , ] FIRSTROW = first_row ]
[ [ , ] FIRE_TRIGGERS ]
[ [ , ] FORMATFILE_DATA_SOURCE = 'data_source_name' ]
[ [ , ] KEEPIDENTITY ]
[ [ , ] KEEPNULLS ]
[ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ]
[ [ , ] LASTROW = last_row ]
[ [ , ] MAXERRORS = max_errors ]
[ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]
[ [ , ] ROWS_PER_BATCH = rows_per_batch ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]
[ [ , ] TABLOCK ]
-- input file format options
[ [ , ] FORMAT = 'CSV' ]
[ [ , ] FIELDQUOTE = 'quote_characters']
[ [ , ] FORMATFILE = 'format_file_path' ]
[ [ , ] FIELDTERMINATOR = 'field_terminator' ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]
)]
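As a concrete sketch for the CSV case (table name, file path, and the header-row assumption are illustrative; FORMAT = 'CSV' requires SQL Server 2017 or later):

```sql
-- Hedged sketch: bulk-load a CSV with a header row into an existing table.
-- Adjust the terminators to match your file.
BULK INSERT dbo.import_test_data
FROM 'C:\data\target_table.csv'
WITH (
    FORMAT = 'CSV',          -- parse quoted fields (SQL Server 2017+)
    FIRSTROW = 2,            -- skip the header row, if your file has one
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK                  -- allows minimally logged, faster loads
);
```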

Azure Kinect Viewer cannot start device on HP Elitedesk 800 OS:Win10

I installed the Azure Kinect SDK successfully and it looks normal in Device Manager, but Kinect Viewer cannot start the device. My machine is an HP EliteDesk 800 running Windows 10 (build 18363). I tried a Lenovo laptop and Kinect Viewer works fine there, so the device is not damaged. Does anyone have a pointer on what I can do to fix this?
Device Manager view
Azure Kinect Viewer error
Log:
[ trace ] : k4a_device_start_cameras(). k4a_device_start_cameras starting
[ info ] : k4a_device_start_cameras(). Starting camera's with the following config.
[ info ] : k4a_device_start_cameras(). color_format:3
[ info ] : k4a_device_start_cameras(). color_resolution:1
[ info ] : k4a_device_start_cameras(). depth_mode:2
[ info ] : k4a_device_start_cameras(). camera_fps:2
[ info ] : k4a_device_start_cameras(). synchronized_images_only:1
[ info ] : k4a_device_start_cameras(). depth_delay_off_color_usec:0
[ info ] : k4a_device_start_cameras(). wired_sync_mode:0
[ info ] : k4a_device_start_cameras(). subordinate_delay_off_master_usec:0
[ info ] : k4a_device_start_cameras(). disable_streaming_indicator:0
[ trace ] : usb_cmd_io(). XFR: Cmd=80000001, CmdLength=13, PayloadSize=0, CmdData=00000000 00000000...
[ trace ] : usb_cmd_io(). XFR: Cmd=000000e1, CmdLength=4, PayloadSize=0, CmdData=00000004 000001ec...
[ error ] : depth_engine_start_helper(). Depth engine create and initialize failed with error code: 204.
[ error ] : deresult == K4A_DEPTH_ENGINE_RESULT_SUCCEEDED returned failure in depth_engine_start_helper()
[ error ] : depth_engine_start_helper(dewrapper, dewrapper->fps, dewrapper->depth_mode, &depth_engine_max_compute_time_ms, &depth_engine_output_buffer_size) returned failure in depth_engine_thread()
[ warning ] : capturesync_add_capture(). Capture Error Detected, Depth
[ info ] : queue_stop(). Queue "Queue_capture" stopped, shutting down and notifying consumers.
[ info ] : queue_stop(). Queue "Queue_depth" stopped, shutting down and notifying consumers.
[ info ] : queue_stop(). Queue "Queue_color" stopped, shutting down and notifying consumers.
[ error ] : dewrapper_start(). Depth Engine thread failed to start
[ error ] : dewrapper_start(depth->dewrapper, config, depth->calibration_memory, depth->calibration_memory_size) returned failure in depth_start()
[ trace ] : usb_cmd_io(). XFR: Cmd=000000f2, PayloadSize=0
[ error ] : cmd_status == CMD_STATUS_PASS returned failure in depthmcu_depth_stop_streaming()
[ error ] : depthmcu_depth_stop_streaming(). ERROR: cmd_status=0x00000063
[ trace ] : usb_cmd_io(). XFR: Cmd=0000000a, PayloadSize=0
[ error ] : depth_start(device->depth, config) returned failure in k4a_device_start_cameras()
[ info ] : k4a_device_start_cameras(). k4a_device_start_cameras started
[ info ] : k4a_device_stop_cameras(). k4a_device_stop_cameras stopping
[ info ] : k4a_device_stop_cameras(). k4a_device_stop_cameras stopped
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A23E0
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A2660
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A2860
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC4821BE40
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC4821A0C0
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC4821B4C0
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC481A2660
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC481A23E0
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC4821BE40
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC481A2860
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC4821B4C0
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC4821A0C0
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A8A0B80
[ trace ] : imu_t_destroy(). Destroyed imu_t 000001EC4820E470
[ trace ] : color_t_destroy(). Destroyed color_t 000001EC48226D80
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A8A0860
[ trace ] : dewrapper_t_destroy(). Destroyed dewrapper_t 000001EC481A19E0
[ trace ] : depth_t_destroy(). Destroyed depth_t 000001EC481A5E00
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A8A0400
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A89FFA0
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A89FC30
[ trace ] : capturesync_t_destroy(). Destroyed capturesync_t 000001EC481739D0
[ trace ] : calibration_t_destroy(). Destroyed calibration_t 000001EC4519EBE0
[ trace ] : usbcmd_t_destroy(). Destroyed usbcmd_t 000001EC4813C930
[ trace ] : depthmcu_t_destroy(). Destroyed depthmcu_t 000001EC4A760320
[ trace ] : usbcmd_t_destroy(). Destroyed usbcmd_t 000001EC4813CA10
[ trace ] : colormcu_t_destroy(). Destroyed colormcu_t 000001EC481EE3B0
[ trace ] : k4a_device_t_destroy(). Destroyed k4a_device_t 000001EC45109B40
[ trace ] : k4a_device_t_create(). Created k4a_device_t 000001EC4A7A7F60
[ trace ] : depthmcu_t_create(). Created depthmcu_t 000001EC4A7603A0
[ trace ] : usbcmd_t_create(). Created usbcmd_t 000001EC4813C930
[ info ] : find_libusb_device(). Container ID found: {ccc11f1c-eddb-48c8-da9d-6c737eda8c49}
[ info ] : populate_serialnumber(). Serial Number found 000603501312
[ trace ] : usb_cmd_io(). XFR: Cmd=00000115, PayloadSize=255
[ trace ] : colormcu_t_create(). Created colormcu_t 000001EC481F36B0
[ trace ] : usbcmd_t_create(). Created usbcmd_t 000001EC4813D2D0
[ info ] : find_libusb_device(). Container ID found: {ccc11f1c-eddb-48c8-da9d-6c737eda8c49}
[ info ] : populate_serialnumber(). Serial Number found 000603501312
[ trace ] : calibration_t_create(). Created calibration_t 000001EC479BB3D0
[ trace ] : usb_cmd_io(). XFR: Cmd=00000111, PayloadSize=10240
[ trace ] : capturesync_t_create(). Created capturesync_t 000001EC481739D0
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89EC90
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89ECE0
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89EB00
[ trace ] : depth_t_create(). Created depth_t 000001EC481A4A00
[ trace ] : usb_cmd_io(). XFR: Cmd=00000201, PayloadSize=18
[ trace ] : usb_cmd_io(). XFR: Cmd=00000201, PayloadSize=18
[ critical ] : ******************** Device Info ********************
[ critical ] : K4A SDK version: 1.4.1
[ trace ] : usb_cmd_io(). XFR: Cmd=00000115, PayloadSize=255
[ critical ] : Serial Number: 000603501312
[ critical ] : RGB Sensor Version: 1.6.110
[ critical ] : Depth Sensor Version:1.6.79
[ critical ] : Mic Array Version: 1.6.14
[ critical ] : Sensor Config: 6109.7
[ critical ] : Build type: Release
[ critical ] : Signature type: MSFT
[ critical ] : ****************************************************
[ trace ] : dewrapper_t_create(). Created dewrapper_t 000001EC481A1E60
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89EB50
[ trace ] : usb_cmd_io(). XFR: Cmd=000000f2, PayloadSize=0
[ trace ] : usb_cmd_io(). XFR: Cmd=0000000a, PayloadSize=0
[ trace ] : color_t_create(). Created color_t 000001EC4A88F6E0
[ trace ] : imu_t_create(). Created imu_t 000001EC4820E470
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89F5A0
[ trace ] : usb_cmd_io(). XFR: Cmd=80000004, PayloadSize=0
[ trace ] : usb_cmd_io(). XFR: Cmd=00000115, PayloadSize=255
[ trace ] : usb_cmd_io(). XFR: Cmd=00000115, PayloadSize=255
[ trace ] : usb_cmd_io(). XFR: Cmd=80000006, PayloadSize=1
[ trace ] : usb_cmd_io(). XFR: Cmd=80000006, PayloadSize=1
[ trace ] : k4a_device_start_cameras(). k4a_device_start_cameras starting
[ info ] : k4a_device_start_cameras(). Starting camera's with the following config.
[ info ] : k4a_device_start_cameras(). color_format:3
[ info ] : k4a_device_start_cameras(). color_resolution:1
[ info ] : k4a_device_start_cameras(). depth_mode:2
[ info ] : k4a_device_start_cameras(). camera_fps:2
[ info ] : k4a_device_start_cameras(). synchronized_images_only:1
[ info ] : k4a_device_start_cameras(). depth_delay_off_color_usec:0
[ info ] : k4a_device_start_cameras(). wired_sync_mode:0
[ info ] : k4a_device_start_cameras(). subordinate_delay_off_master_usec:0
[ info ] : k4a_device_start_cameras(). disable_streaming_indicator:0
[ trace ] : usb_cmd_io(). XFR: Cmd=80000001, CmdLength=13, PayloadSize=0, CmdData=00000000 00000000...
[ trace ] : usb_cmd_io(). XFR: Cmd=000000e1, CmdLength=4, PayloadSize=0, CmdData=00000004 000001ec...
[ trace ] : usb_cmd_io(). XFR: Cmd=00000022, CmdLength=4, PayloadSize=2000000, CmdData=00000002 000001ec...
[ error ] : depth_engine_start_helper(). Depth engine create and initialize failed with error code: 204.
[ error ] : deresult == K4A_DEPTH_ENGINE_RESULT_SUCCEEDED returned failure in depth_engine_start_helper()
[ error ] : depth_engine_start_helper(dewrapper, dewrapper->fps, dewrapper->depth_mode, &depth_engine_max_compute_time_ms, &depth_engine_output_buffer_size) returned failure in depth_engine_thread()
[ warning ] : capturesync_add_capture(). Capture Error Detected, Depth
[ error ] : dewrapper_start(). Depth Engine thread failed to start
[ info ] : queue_stop(). Queue "Queue_capture" stopped, shutting down and notifying consumers.
[ info ] : queue_stop(). Queue "Queue_depth" stopped, shutting down and notifying consumers.
[ info ] : queue_stop(). Queue "Queue_color" stopped, shutting down and notifying consumers.
[ error ] : dewrapper_start(depth->dewrapper, config, depth->calibration_memory, depth->calibration_memory_size) returned failure in depth_start()
[ trace ] : usb_cmd_io(). XFR: Cmd=000000f2, PayloadSize=0
[ error ] : cmd_status == CMD_STATUS_PASS returned failure in depthmcu_depth_stop_streaming()
[ error ] : depthmcu_depth_stop_streaming(). ERROR: cmd_status=0x00000063
[ trace ] : usb_cmd_io(). XFR: Cmd=0000000a, PayloadSize=0
[ error ] : depth_start(device->depth, config) returned failure in k4a_device_start_cameras()
[ info ] : k4a_device_start_cameras(). k4a_device_start_cameras started
[ info ] : k4a_device_stop_cameras(). k4a_device_stop_cameras stopping
[ info ] : k4a_device_stop_cameras(). k4a_device_stop_cameras stopped
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A23E0
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A2660
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A2860
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC48218230
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC48219C30
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC48219DB0
This looks like the remote-access issue (originally reported on Linux); see https://github.com/microsoft/Azure-Kinect-Sensor-SDK/issues/810. The depth engine failing to initialize with error code 204 typically means it could not create a suitable GPU context, which commonly happens over remote desktop sessions or on machines without a capable GPU.

Apache Camel JUnit to test messages sent to redirected route

I am trying to test the route app.cash.source-endpoint-quartz from my JUnit test; it internally redirects the flow to another route, app.accrual.source-endpoint-direct.
In my scenario, app.cash.source-endpoint-quartz sends a few messages to MQ, processing then continues, and based on a certain condition it redirects to app.accrual.source-endpoint-direct, which eventually sends a few more messages to the same MQ.
How do I test this?
Cash Route
from("{{app.cash.source-endpoint-quartz}}")
.routeId("cash-route")
.log(LoggingLevel.INFO, logger,"***** CASH ROUTE STARTED *****")
.doTry()
...
....
.to("direct:cashTransactionRoute") //Sub Route
.process(c -> {
TransactionMaster transactionMaster = (TransactionMaster) c.getIn().getHeader(Constants.HEADER_TRANSACTION_MASTER_CASH);
transactionMasterService.updateMsgStatus(transactionMaster, Status.SUCCESS);
})
.bean(transactionManager, "markSuccess")
...
...
Cash Sub Route
from("direct:cashTransactionRoute")
.routeId("cash-transaction-route")
...
.split(simple("${body}"))
.parallelProcessing()
...
.end()// End of split() and parallelProcessing()
.end()
.process(e -> {
...
})
.choice()
.when(simple("${body.size} != 0"))
.process(e -> {
e.getIn().getBody();
})
.to("{{app.accrual.source-endpoint-direct}}") //Redirect to accrual route
.end() //End of choice
.end();
Accrual Route
from("{{app.accrual.source-endpoint-direct}}") //Accrual Route
.routeId("accrual-route")
.log(LoggingLevel.INFO, logger,"***** ACCRUAL ROUTE STARTED *****")
...
...
application-test.yaml
app:
cash:
source-endpoint-quartz: direct-vm:cash
txn-type: CASH
accrual:
source-endpoint-direct: direct-vm:accrual
source-endpoint-quartz-1: direct-vm:accrual-quartz-1
source-endpoint-quartz-2: direct-vm:accrual-quartz-2
Below is my JUnit test, which produces an error when I run it.
@RunWith(CamelSpringBootRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT, properties = {"camel.springboot.java-routes-include-pattern=**/Cash*, **/Accrual*"})
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class, DataSourceTransactionManagerAutoConfiguration.class, SecurityAutoConfiguration.class})
@DirtiesContext(classMode = ClassMode.AFTER_EACH_TEST_METHOD)
@ActiveProfiles(profiles = {"test"})
public class CashRouteTest {
@EndpointInject(value = "{{app.cash.source-endpoint-quartz}}")
private ProducerTemplate producerTemplate;
@EndpointInject(value = "{{app.accrual.source-endpoint-direct}}")
private ProducerTemplate producerTemplateAccrual;
@EndpointInject(value = "{{app.mqservice}}")
private MockEndpoint mock;
@Test
public void cashRouteTest_PaymentWithAccrual() throws Exception {
Mockito.when(...).thenReturn(.....);
Mockito.when(...).thenReturn(.....);
...
producerTemplateAccrual.start();
producerTemplate.start();
producerTemplate.sendBody(null);
//producerTemplateAccrual.sendBody(null);
mock.expectedMessageCount(4);
mock.expectedBodiesReceived();
Assert.assertEquals(4, mock.getExchanges().size());
String xml = String.valueOf(mock.getExchanges().get(0).getIn().getBody());
MessageEnvelope messageEnvelope = (MessageEnvelope) XmlUtil.toObject(xml);
String actualPayload = XmlUtil.toXml(messageEnvelope.getPayload());
String expectedPayload = "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>....";
Collection<TransactionMaster> txnMasters = (Collection<TransactionMaster>) txnMasterRepo.findAll();
Collection<Transaction> txns = (Collection<Transaction>) txnRepo.findAll();
logger.info("actualPayload : {} ", actualPayload);
Assert.assertEquals(expectedPayload, actualPayload);
Assert.assertEquals(2, txnMasters.size());
Assert.assertEquals(4, txns.size());
Assert.assertEquals(Status.SUCCESS, Status.forValue(txnMasters.iterator().next().getRefStatusId()));
Assert.assertEquals(Status.SUCCESS, Status.forValue(txns.iterator().next().getRefStatusId()));
mock.assertIsSatisfied(5000);
}
}
When I run this JUnit test I get the error below.
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-route started and consuming from: direct-vm://cash
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-enrich-route started and consuming from: direct://cashEnrichRoute
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-transaction-route started and consuming from: direct://cashTransactionRoute
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Total 3 routes, of which 3 are started
Message History
---------------------------------------------------------------------------------------------------------------------------------------
RouteId ProcessorId Processor Elapsed (ms)
[cash-route ] [cash-route ] [direct-vm://cash ] [ 32018]
[cash-route ] [log1 ] [log ] [ 9]
[cash-route ] [doTry1 ] [doTry ] [ 0]
[cash-route ] [bean1 ] [bean[com.app.service.DbTransactionManager] ] [ 500]
[cash-route ] [bean2 ] [bean[com.app.service.CashTransactionSearch] ] [ 15]
[cash-route ] [choice1 ] [when[simple{Simple: ${body.size} == 0}]choice[] ] [ 204]
[cash-route ] [process1 ] [Processor#0x20bc4c09 ] [ 15]
[cash-route ] [process2 ] [Processor#0x1ca6323c ] [ 16]
[cash-route ] [to1 ] [direct:cashEnrichRoute ] [ 130]
[cash-enrich-route ] [split1 ] [split[Simple: ${body}] ] [ 130]
[cash-route ] [process3 ] [Processor#0x5aac9d67 ] [ 235]
[cash-route ] [process4 ] [Processor#0x753cc26d ] [ 75]
[cash-route ] [to2 ] [direct:cashTransactionRoute ] [ 0]
[cash-transaction-r] [split2 ] [split[Simple: ${body}] ] [ 385]
[cash-transaction-r] [process16 ] [Processor#0x20b3bbe7 ] [ 0]
[cash-transaction-r] [choice3 ] [when[simple{Simple: ${body.size} != 0}]choice[] ] [ 0]
[cash-transaction-r] [process17 ] [Processor#0x5190ae57 ] [ 0]
[cash-transaction-r] [to5 ] [{{app.accrual.source-endpoint-direct}} ] [ 0]
Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
org.apache.camel.component.directvm.DirectVmConsumerNotAvailableException: No consumers available on endpoint: direct-vm://accrual. Exchange[ID-SPLS1800411-10N-1564022207351-0-3]
I can see that when the context loads, the accrual route is not started, even though I call producerTemplateAccrual.start(); in my JUnit test.
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-route started and consuming from: direct-vm://cash
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-enrich-route started and consuming from: direct://cashEnrichRoute
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-transaction-route started and consuming from: direct://cashTransactionRoute
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Total 3 routes, of which 3 are started
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT, properties = {"camel.springboot.java-routes-include-pattern=**/Cash*, **/Accrual*"})
Because of the space after the comma between **/Cash* and **/Accrual*, the AccrualRoute was being ignored.
After removing the space, both routes are loaded.
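A plain-Java sketch of why the space matters (assumption: the pattern list is split on commas without trimming, so the second token keeps a leading space and never matches a path that starts with **):

```java
// Minimal sketch: splitting the include-pattern property on "," without
// trimming leaves a leading space on the second pattern, so " **/Accrual*"
// matches nothing and the AccrualRoute is silently skipped.
public class IncludePatternDemo {
    public static void main(String[] args) {
        String broken = "**/Cash*, **/Accrual*";
        String fixed  = "**/Cash*,**/Accrual*";

        System.out.println("[" + broken.split(",")[1] + "]"); // "[ **/Accrual*]" - leading space
        System.out.println("[" + fixed.split(",")[1] + "]");  // "[**/Accrual*]"
    }
}
```
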
