I submitted my first question and had hoped to apply the answer to a bigger JSON file, but I am just not getting it.
Using jq, I am trying to turn this JSON:
[{"field": "F1","results": [{"details": [
{"name": "P1","matches": [
{"displayName": "User1","smtpAddress": "user1#foo.bar"},
{"displayName": "User2","smtpAddress": "user2#foo.bar"}
]
},
{"name": "P2","matches": [
{"displayName": "User3","smtpAddress": "user3#foo.bar"},
{"displayName": "User4","smtpAddress": "user4#foo.bar"}
]
}]}]},
{"field": "F2","results": [{"details": [
{"name": "P3","matches": [
{"displayName": "User1","smtpAddress": "user1#foo.bar"},
{"displayName": "User5","smtpAddress": "user5#foo.bar"}
]
},
{"name": "P4","matches": [
{"displayName": "User6","smtpAddress": "user6#foo.bar"},
{"displayName": "User7","smtpAddress": "user7#foo.bar"}
]
}]}]}]
into CSV like this:
"F1","P1","User1 <user1#foo.bar>;User2 <user2#foo.bar>"
"F1","P2","User3 <user3#foo.bar>;User4 <user4#foo.bar>"
"F2","P3","User1 <user1#foo.bar>;User5 <user5#foo.bar>"
"F2","P4","User6 <user6#foo.bar>;User7 <user7#foo.bar>"
I cannot get the nested sub-array to be respected by map. Any explanation is appreciated.
jq -r '.[]
  | .field as $field
  | (.results[].details[]
     | [$field, .name] +
       [([.matches[] | "\(.displayName) <\(.smtpAddress)>"] | join(";"))])
  | @csv'
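For what it is worth, here is a sketch of the same transformation written with map instead of the inner collection, assuming the input shape shown above (untested beyond this sample):
jq -r '.[]
  | .field as $field
  | .results[].details[]
  | [$field, .name, (.matches | map("\(.displayName) <\(.smtpAddress)>") | join(";"))]
  | @csv'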
This question already has answers here: Error "RangeError: minimumFractionDigits value is out of range" with ChartJS in Next.js app (3 answers)
Closed 4 months ago.
Everything is working fine in development on my local machine, but when I deploy my site to Netlify, I get this weird error in the console.
The page data is fetched with the getStaticProps function and then passed to the page as props.
Here is what the data looks like:
[
[ 17671, 19856, 37527 ], [ 4887, 5418, 10305 ],
[ 2803, 2303, 5106 ], [ 8617, 9403, 18020 ],
[ 12664, 15722, 28386 ], [ 4227, 3359, 7586 ],
[ 1371, 1545, 2916 ], [ 17526, 14297, 31823 ],
[ 6883, 4781, 11664 ], [ 9805, 10000, 19805 ],
[ 11068, 12888, 23956 ], [ 4570, 4722, 9292 ],
[ 15428, 17309, 32737 ], [ 3565, 3656, 7221 ],
[ 8750, 10443, 19193 ], [ 1771, 1432, 3203 ],
[ 8495, 9979, 18474 ], [ 288, 265, 553 ],
[ 21130, 19321, 40451 ], [ 6867, 6556, 13423 ],
[ 2656, 2092, 4748 ], [ 967, 656, 1623 ],
[ 4540, 4505, 9045 ], [ 7025, 7108, 14133 ],
[ 53134, 59829, 112963 ], [ 8640, 9028, 17668 ],
[ 6759, 7119, 13878 ], [ 28803, 31362, 60165 ],
[ 7346, 7630, 14976 ], [ 9771, 10963, 20734 ],
[ 15783, 17397, 33180 ], [ 9847, 9706, 19553 ],
[ 15154, 17577, 32731 ], [ 1056, 874, 1930 ],
[ 3260, 2476, 5736 ], [ 1488, 1424, 2912 ],
[ 1656, 1154, 2810 ]
]
It is an array of arrays of 3 numbers.
How do I solve this?
You need to set swcMinify: false in your next.config.js. This is currently a bug in the SWC minifier, and you will need to wait until it is fixed before you can enable it again.
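A minimal sketch of that next.config.js, assuming you have no other options set there:
// next.config.js
module.exports = {
  swcMinify: false, // disable SWC minification until the minifier bug is fixed
};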
I have tried inserting with itertuples, but my file is too big. I even split it into 4 separate files, and even then it is too big: one quarter of the file takes more than 30 minutes to import. Is there an easier and quicker way to import data into SQL Server?
Thanks in advance.
For importing big data faster, SQL Server has a BULK INSERT command. I tested it on my local server: importing an 870 MB CSV file (10 million records) into SQL Server took 12.6 seconds.
BULK INSERT dbo.import_test_data
FROM 'C:\Users\Ramin\Desktop\target_table.csv'
The full syntax of this command:
BULK INSERT
{ database_name.schema_name.table_or_view_name | schema_name.table_or_view_name | table_or_view_name }
FROM 'data_file'
[ WITH
(
[ [ , ] BATCHSIZE = batch_size ]
[ [ , ] CHECK_CONSTRAINTS ]
[ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ]
[ [ , ] DATAFILETYPE =
{ 'char' | 'native'| 'widechar' | 'widenative' } ]
[ [ , ] DATA_SOURCE = 'data_source_name' ]
[ [ , ] ERRORFILE = 'file_name' ]
[ [ , ] ERRORFILE_DATA_SOURCE = 'errorfile_data_source_name' ]
[ [ , ] FIRSTROW = first_row ]
[ [ , ] FIRE_TRIGGERS ]
[ [ , ] FORMATFILE_DATA_SOURCE = 'data_source_name' ]
[ [ , ] KEEPIDENTITY ]
[ [ , ] KEEPNULLS ]
[ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ]
[ [ , ] LASTROW = last_row ]
[ [ , ] MAXERRORS = max_errors ]
[ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]
[ [ , ] ROWS_PER_BATCH = rows_per_batch ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]
[ [ , ] TABLOCK ]
-- input file format options
[ [ , ] FORMAT = 'CSV' ]
[ [ , ] FIELDQUOTE = 'quote_characters']
[ [ , ] FORMATFILE = 'format_file_path' ]
[ [ , ] FIELDTERMINATOR = 'field_terminator' ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]
)]
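For a plain comma-separated file like the one above, a sketch with the options you would typically set is shown below; FORMAT = 'CSV' needs SQL Server 2017 or later, and FIRSTROW, FIELDTERMINATOR, and ROWTERMINATOR are assumptions about the file layout, so adjust them to match your data:
BULK INSERT dbo.import_test_data
FROM 'C:\Users\Ramin\Desktop\target_table.csv'
WITH
(
    FORMAT = 'CSV',          -- SQL Server 2017+, handles quoted fields
    FIRSTROW = 2,            -- skip a header row if the file has one
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK                  -- table lock for a faster, minimally logged load
);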
I installed the Azure Kinect SDK successfully and the device looks normal in Device Manager, but the Azure Kinect Viewer cannot start it. My machine is an HP EliteDesk running Windows 10 (build 18363). I tried the device on a Lenovo laptop and the Kinect Viewer works fine there, so the device is not damaged. Does anyone have a pointer on how to fix this?
Device Manager view
Azure Kinect Viewer error
Log:
[ trace ] : k4a_device_start_cameras(). k4a_device_start_cameras starting
[ info ] : k4a_device_start_cameras(). Starting camera's with the following config.
[ info ] : k4a_device_start_cameras(). color_format:3
[ info ] : k4a_device_start_cameras(). color_resolution:1
[ info ] : k4a_device_start_cameras(). depth_mode:2
[ info ] : k4a_device_start_cameras(). camera_fps:2
[ info ] : k4a_device_start_cameras(). synchronized_images_only:1
[ info ] : k4a_device_start_cameras(). depth_delay_off_color_usec:0
[ info ] : k4a_device_start_cameras(). wired_sync_mode:0
[ info ] : k4a_device_start_cameras(). subordinate_delay_off_master_usec:0
[ info ] : k4a_device_start_cameras(). disable_streaming_indicator:0
[ trace ] : usb_cmd_io(). XFR: Cmd=80000001, CmdLength=13, PayloadSize=0, CmdData=00000000 00000000...
[ trace ] : usb_cmd_io(). XFR: Cmd=000000e1, CmdLength=4, PayloadSize=0, CmdData=00000004 000001ec...
[ error ] : depth_engine_start_helper(). Depth engine create and initialize failed with error code: 204.
[ error ] : deresult == K4A_DEPTH_ENGINE_RESULT_SUCCEEDED returned failure in depth_engine_start_helper()
[ error ] : depth_engine_start_helper(dewrapper, dewrapper->fps, dewrapper->depth_mode, &depth_engine_max_compute_time_ms, &depth_engine_output_buffer_size) returned failure in depth_engine_thread()
[ warning ] : capturesync_add_capture(). Capture Error Detected, Depth
[ info ] : queue_stop(). Queue "Queue_capture" stopped, shutting down and notifying consumers.
[ info ] : queue_stop(). Queue "Queue_depth" stopped, shutting down and notifying consumers.
[ info ] : queue_stop(). Queue "Queue_color" stopped, shutting down and notifying consumers.
[ error ] : dewrapper_start(). Depth Engine thread failed to start
[ error ] : dewrapper_start(depth->dewrapper, config, depth->calibration_memory, depth->calibration_memory_size) returned failure in depth_start()
[ trace ] : usb_cmd_io(). XFR: Cmd=000000f2, PayloadSize=0
[ error ] : cmd_status == CMD_STATUS_PASS returned failure in depthmcu_depth_stop_streaming()
[ error ] : depthmcu_depth_stop_streaming(). ERROR: cmd_status=0x00000063
[ trace ] : usb_cmd_io(). XFR: Cmd=0000000a, PayloadSize=0
[ error ] : depth_start(device->depth, config) returned failure in k4a_device_start_cameras()
[ info ] : k4a_device_start_cameras(). k4a_device_start_cameras started
[ info ] : k4a_device_stop_cameras(). k4a_device_stop_cameras stopping
[ info ] : k4a_device_stop_cameras(). k4a_device_stop_cameras stopped
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A23E0
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A2660
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A2860
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC4821BE40
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC4821A0C0
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC4821B4C0
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC481A2660
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC481A23E0
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC4821BE40
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC481A2860
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC4821B4C0
[ trace ] : k4a_image_t_destroy(). Destroyed k4a_image_t 000001EC4821A0C0
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A8A0B80
[ trace ] : imu_t_destroy(). Destroyed imu_t 000001EC4820E470
[ trace ] : color_t_destroy(). Destroyed color_t 000001EC48226D80
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A8A0860
[ trace ] : dewrapper_t_destroy(). Destroyed dewrapper_t 000001EC481A19E0
[ trace ] : depth_t_destroy(). Destroyed depth_t 000001EC481A5E00
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A8A0400
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A89FFA0
[ trace ] : queue_t_destroy(). Destroyed queue_t 000001EC4A89FC30
[ trace ] : capturesync_t_destroy(). Destroyed capturesync_t 000001EC481739D0
[ trace ] : calibration_t_destroy(). Destroyed calibration_t 000001EC4519EBE0
[ trace ] : usbcmd_t_destroy(). Destroyed usbcmd_t 000001EC4813C930
[ trace ] : depthmcu_t_destroy(). Destroyed depthmcu_t 000001EC4A760320
[ trace ] : usbcmd_t_destroy(). Destroyed usbcmd_t 000001EC4813CA10
[ trace ] : colormcu_t_destroy(). Destroyed colormcu_t 000001EC481EE3B0
[ trace ] : k4a_device_t_destroy(). Destroyed k4a_device_t 000001EC45109B40
[ trace ] : k4a_device_t_create(). Created k4a_device_t 000001EC4A7A7F60
[ trace ] : depthmcu_t_create(). Created depthmcu_t 000001EC4A7603A0
[ trace ] : usbcmd_t_create(). Created usbcmd_t 000001EC4813C930
[ info ] : find_libusb_device(). Container ID found: {ccc11f1c-eddb-48c8-da9d-6c737eda8c49}
[ info ] : populate_serialnumber(). Serial Number found 000603501312
[ trace ] : usb_cmd_io(). XFR: Cmd=00000115, PayloadSize=255
[ trace ] : colormcu_t_create(). Created colormcu_t 000001EC481F36B0
[ trace ] : usbcmd_t_create(). Created usbcmd_t 000001EC4813D2D0
[ info ] : find_libusb_device(). Container ID found: {ccc11f1c-eddb-48c8-da9d-6c737eda8c49}
[ info ] : populate_serialnumber(). Serial Number found 000603501312
[ trace ] : calibration_t_create(). Created calibration_t 000001EC479BB3D0
[ trace ] : usb_cmd_io(). XFR: Cmd=00000111, PayloadSize=10240
[ trace ] : capturesync_t_create(). Created capturesync_t 000001EC481739D0
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89EC90
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89ECE0
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89EB00
[ trace ] : depth_t_create(). Created depth_t 000001EC481A4A00
[ trace ] : usb_cmd_io(). XFR: Cmd=00000201, PayloadSize=18
[ trace ] : usb_cmd_io(). XFR: Cmd=00000201, PayloadSize=18
[ critical ] : ******************** Device Info ********************
[ critical ] : K4A SDK version: 1.4.1
[ trace ] : usb_cmd_io(). XFR: Cmd=00000115, PayloadSize=255
[ critical ] : Serial Number: 000603501312
[ critical ] : RGB Sensor Version: 1.6.110
[ critical ] : Depth Sensor Version:1.6.79
[ critical ] : Mic Array Version: 1.6.14
[ critical ] : Sensor Config: 6109.7
[ critical ] : Build type: Release
[ critical ] : Signature type: MSFT
[ critical ] : ****************************************************
[ trace ] : dewrapper_t_create(). Created dewrapper_t 000001EC481A1E60
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89EB50
[ trace ] : usb_cmd_io(). XFR: Cmd=000000f2, PayloadSize=0
[ trace ] : usb_cmd_io(). XFR: Cmd=0000000a, PayloadSize=0
[ trace ] : color_t_create(). Created color_t 000001EC4A88F6E0
[ trace ] : imu_t_create(). Created imu_t 000001EC4820E470
[ trace ] : queue_t_create(). Created queue_t 000001EC4A89F5A0
[ trace ] : usb_cmd_io(). XFR: Cmd=80000004, PayloadSize=0
[ trace ] : usb_cmd_io(). XFR: Cmd=00000115, PayloadSize=255
[ trace ] : usb_cmd_io(). XFR: Cmd=00000115, PayloadSize=255
[ trace ] : usb_cmd_io(). XFR: Cmd=80000006, PayloadSize=1
[ trace ] : usb_cmd_io(). XFR: Cmd=80000006, PayloadSize=1
[ trace ] : k4a_device_start_cameras(). k4a_device_start_cameras starting
[ info ] : k4a_device_start_cameras(). Starting camera's with the following config.
[ info ] : k4a_device_start_cameras(). color_format:3
[ info ] : k4a_device_start_cameras(). color_resolution:1
[ info ] : k4a_device_start_cameras(). depth_mode:2
[ info ] : k4a_device_start_cameras(). camera_fps:2
[ info ] : k4a_device_start_cameras(). synchronized_images_only:1
[ info ] : k4a_device_start_cameras(). depth_delay_off_color_usec:0
[ info ] : k4a_device_start_cameras(). wired_sync_mode:0
[ info ] : k4a_device_start_cameras(). subordinate_delay_off_master_usec:0
[ info ] : k4a_device_start_cameras(). disable_streaming_indicator:0
[ trace ] : usb_cmd_io(). XFR: Cmd=80000001, CmdLength=13, PayloadSize=0, CmdData=00000000 00000000...
[ trace ] : usb_cmd_io(). XFR: Cmd=000000e1, CmdLength=4, PayloadSize=0, CmdData=00000004 000001ec...
[ trace ] : usb_cmd_io(). XFR: Cmd=00000022, CmdLength=4, PayloadSize=2000000, CmdData=00000002 000001ec...
[ error ] : depth_engine_start_helper(). Depth engine create and initialize failed with error code: 204.
[ error ] : deresult == K4A_DEPTH_ENGINE_RESULT_SUCCEEDED returned failure in depth_engine_start_helper()
[ error ] : depth_engine_start_helper(dewrapper, dewrapper->fps, dewrapper->depth_mode, &depth_engine_max_compute_time_ms, &depth_engine_output_buffer_size) returned failure in depth_engine_thread()
[ warning ] : capturesync_add_capture(). Capture Error Detected, Depth
[ error ] : dewrapper_start(). Depth Engine thread failed to start
[ info ] : queue_stop(). Queue "Queue_capture" stopped, shutting down and notifying consumers.
[ info ] : queue_stop(). Queue "Queue_depth" stopped, shutting down and notifying consumers.
[ info ] : queue_stop(). Queue "Queue_color" stopped, shutting down and notifying consumers.
[ error ] : dewrapper_start(depth->dewrapper, config, depth->calibration_memory, depth->calibration_memory_size) returned failure in depth_start()
[ trace ] : usb_cmd_io(). XFR: Cmd=000000f2, PayloadSize=0
[ error ] : cmd_status == CMD_STATUS_PASS returned failure in depthmcu_depth_stop_streaming()
[ error ] : depthmcu_depth_stop_streaming(). ERROR: cmd_status=0x00000063
[ trace ] : usb_cmd_io(). XFR: Cmd=0000000a, PayloadSize=0
[ error ] : depth_start(device->depth, config) returned failure in k4a_device_start_cameras()
[ info ] : k4a_device_start_cameras(). k4a_device_start_cameras started
[ info ] : k4a_device_stop_cameras(). k4a_device_stop_cameras stopping
[ info ] : k4a_device_stop_cameras(). k4a_device_stop_cameras stopped
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A23E0
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A2660
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC481A2860
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC48218230
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC48219C30
[ trace ] : k4a_image_t_create(). Created k4a_image_t 000001EC48219DB0
This looks like the remote access issue on Linux. See https://github.com/microsoft/Azure-Kinect-Sensor-SDK/issues/810.
I am trying to test the route app.cash.source-endpoint-quartz from my JUnit test; it internally redirects the flow to another route, app.accrual.source-endpoint-direct.
The scenario is: app.cash.source-endpoint-quartz sends a few messages to MQ, processing then continues, and based on a certain condition the flow is redirected to the route app.accrual.source-endpoint-direct, which eventually sends a few more messages to the same MQ.
How do I test this?
Cash Route
from("{{app.cash.source-endpoint-quartz}}")
.routeId("cash-route")
.log(LoggingLevel.INFO, logger,"***** CASH ROUTE STARTED *****")
.doTry()
...
....
.to("direct:cashTransactionRoute") //Sub Route
.process(c -> {
TransactionMaster transactionMaster = (TransactionMaster) c.getIn().getHeader(Constants.HEADER_TRANSACTION_MASTER_CASH);
transactionMasterService.updateMsgStatus(transactionMaster, Status.SUCCESS);
})
.bean(transactionManager, "markSuccess")
...
...
Cash Sub Route
from("direct:cashTransactionRoute")
.routeId("cash-transaction-route")
...
.split(simple("${body}"))
.parallelProcessing()
...
.end()// End of split() and parallelProcessing()
.end()
.process(e -> {
...
})
.choice()
.when(simple("${body.size} != 0"))
.process(e -> {
e.getIn().getBody();
})
.to("{{app.accrual.source-endpoint-direct}}") //Redirect to accrual route
.end() //End of choice
.end();
Accrual Route
from("{{app.accrual.source-endpoint-direct}}") //Accrual Route
.routeId("accrual-route")
.log(LoggingLevel.INFO, logger,"***** ACCRUAL ROUTE STARTED *****")
...
...
application-test.yaml
app:
  cash:
    source-endpoint-quartz: direct-vm:cash
    txn-type: CASH
  accrual:
    source-endpoint-direct: direct-vm:accrual
    source-endpoint-quartz-1: direct-vm:accrual-quartz-1
    source-endpoint-quartz-2: direct-vm:accrual-quartz-2
Below is the JUnit test I tried, but I am getting an error.
@RunWith(CamelSpringBootRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT, properties = {"camel.springboot.java-routes-include-pattern=**/Cash*, **/Accrual*"})
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class, DataSourceTransactionManagerAutoConfiguration.class, SecurityAutoConfiguration.class})
@DirtiesContext(classMode = ClassMode.AFTER_EACH_TEST_METHOD)
@ActiveProfiles(profiles = {"test"})
public class CashRouteTest {

    @EndpointInject(value = "{{app.cash.source-endpoint-quartz}}")
    private ProducerTemplate producerTemplate;

    @EndpointInject(value = "{{app.accrual.source-endpoint-direct}}")
    private ProducerTemplate producerTemplateAccrual;

    @EndpointInject(value = "{{app.mqservice}}")
    private MockEndpoint mock;

    @Test
    public void cashRouteTest_PaymentWithAccrual() throws Exception {
        Mockito.when(...).thenReturn(.....);
        Mockito.when(...).thenReturn(.....);
        ...
        producerTemplateAccrual.start();
        producerTemplate.start();
        producerTemplate.sendBody(null);
        //producerTemplateAccrual.sendBody(null);

        mock.expectedMessageCount(4);
        mock.expectedBodiesReceived();
        Assert.assertEquals(4, mock.getExchanges().size());

        String xml = String.valueOf(mock.getExchanges().get(0).getIn().getBody());
        MessageEnvelope messageEnvelope = (MessageEnvelope) XmlUtil.toObject(xml);
        String actualPayload = XmlUtil.toXml(messageEnvelope.getPayload());
        String expectedPayload = "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>....";

        Collection<TransactionMaster> txnMasters = (Collection<TransactionMaster>) txnMasterRepo.findAll();
        Collection<Transaction> txns = (Collection<Transaction>) txnRepo.findAll();
        logger.info("actualPayload : {} ", actualPayload);

        Assert.assertEquals(expectedPayload, actualPayload);
        Assert.assertEquals(2, txnMasters.size());
        Assert.assertEquals(4, txns.size());
        Assert.assertEquals(Status.SUCCESS, Status.forValue(txnMasters.iterator().next().getRefStatusId()));
        Assert.assertEquals(Status.SUCCESS, Status.forValue(txns.iterator().next().getRefStatusId()));

        mock.assertIsSatisfied(5000);
    }
}
When I run this JUnit test I get the error below.
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-route started and consuming from: direct-vm://cash
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-enrich-route started and consuming from: direct://cashEnrichRoute
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Route: cash-transaction-route started and consuming from: direct://cashTransactionRoute
2019-07-25 10:36:47,179 [main] INFO o.a.camel.spring.SpringCamelContext - Total 3 routes, of which 3 are started
Message History
---------------------------------------------------------------------------------------------------------------------------------------
RouteId ProcessorId Processor Elapsed (ms)
[cash-route ] [cash-route ] [direct-vm://cash ] [ 32018]
[cash-route ] [log1 ] [log ] [ 9]
[cash-route ] [doTry1 ] [doTry ] [ 0]
[cash-route ] [bean1 ] [bean[com.app.service.DbTransactionManager] ] [ 500]
[cash-route ] [bean2 ] [bean[com.app.service.CashTransactionSearch] ] [ 15]
[cash-route ] [choice1 ] [when[simple{Simple: ${body.size} == 0}]choice[] ] [ 204]
[cash-route ] [process1 ] [Processor#0x20bc4c09 ] [ 15]
[cash-route ] [process2 ] [Processor#0x1ca6323c ] [ 16]
[cash-route ] [to1 ] [direct:cashEnrichRoute ] [ 130]
[cash-enrich-route ] [split1 ] [split[Simple: ${body}] ] [ 130]
[cash-route ] [process3 ] [Processor#0x5aac9d67 ] [ 235]
[cash-route ] [process4 ] [Processor#0x753cc26d ] [ 75]
[cash-route ] [to2 ] [direct:cashTransactionRoute ] [ 0]
[cash-transaction-r] [split2 ] [split[Simple: ${body}] ] [ 385]
[cash-transaction-r] [process16 ] [Processor#0x20b3bbe7 ] [ 0]
[cash-transaction-r] [choice3 ] [when[simple{Simple: ${body.size} != 0}]choice[] ] [ 0]
[cash-transaction-r] [process17 ] [Processor#0x5190ae57 ] [ 0]
[cash-transaction-r] [to5 ] [{{app.accrual.source-endpoint-direct}} ] [ 0]
Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
org.apache.camel.component.directvm.DirectVmConsumerNotAvailableException: No consumers available on endpoint: direct-vm://accrual. Exchange[ID-SPLS1800411-10N-1564022207351-0-3]
I see that when the context is loaded the accrual route is not started (the startup log above lists only the three cash routes), even though I have called producerTemplateAccrual.start(); in my JUnit test.
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT, properties = {"camel.springboot.java-routes-include-pattern=**/Cash*, **/Accrual*"})
The problem is the space after the comma in the include pattern ("Cash*, **"). Because of that space the AccrualRoute pattern does not match, so the route is ignored. After removing the space, both routes are loaded.
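That is, the corrected property looks like this:
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT, properties = {"camel.springboot.java-routes-include-pattern=**/Cash*,**/Accrual*"})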