Apache Camel - Create mock endpoint to listen to messages sent from within a processor

I have a route as follows:
from(fromEndpoint).routeId("ticketRoute")
    .log("Received Tickets : ${body}")
    .doTry()
        .process(exchange -> {
            List<TradeTicketDto> ticketDtos = (List<TradeTicketDto>) exchange.getIn().getBody();
            ticketDtos.stream()
                .forEach(t -> solaceMessagePublisher.sendAsText("BOOKINGSERVICE.TICKET.UPDATED", t));
            ticketToTradeConverter.convert(ticketDtos)
                .forEach(t -> solaceMessagePublisher.sendAsText("BOOKINGSERVICE.TRADE.UPDATED", t));
        })
    .doCatch(java.lang.RuntimeException.class)
        .log(exceptionMessage().toString() + " --> ${body}");
solaceMessagePublisher is a utility class in the application which performs some action on the passed object (second argument), converts it to a JSON string, and sends it to a JMS topic resolved from the destination key (first argument).
SolaceMessagePublisher.java
public void sendAsText(final String destinationKey, Object payload) {
    LOGGER.debug("Sending object as text to %s", destinationKey);
    String destinationValue = null;
    if (StringUtils.isNotEmpty(destinationKey)) {
        destinationValue = properties.getProperty(destinationKey);
    }
    LOGGER.debug("Identified Destination Value = %s from key %s", destinationValue, destinationKey);
    if (StringUtils.isEmpty(destinationValue)) {
        throw new BaseServiceException("Invalid destination for message");
    }
    sendAsTextToDestination(destinationValue, payload);
}

public void sendAsTextToDestination(final String destinationValue, Object payload) {
    if (payload == null) {
        LOGGER.debug(" %s %s", EMPTY_PAYLOAD_ERROR_MESSAGE, destinationValue);
        return;
    }
    final String message = messageCreator.createMessageEnvelopAsJSON(payload, ContextProvider.getUserInContext());
    if (LOGGER.isDebugEnabled()) {
        LOGGER.debug("Created message = " + message);
    }
    jmsTemplate.send(destinationValue, new MessageCreator() {
        @Override
        public Message createMessage(Session session) throws JMSException {
            LOGGER.debug("Creating JMS Text Message");
            return session.createTextMessage(message);
        }
    });
}
I am having a problem creating a mock endpoint to listen to the messages sent to this topic. The question is: how do I listen to messages sent to a topic that is outside the Camel context?
I have tried using mock:jms:endpoint in my test. It doesn't work.
My Test is as below
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { SiteMain.class })
public class TicketRouteCamelTest extends CamelSpringTestSupport {

    @Autowired
    protected BaseMessageEnvelopCreator messageCreator;

    private static final String MOCK_TICKET_UPDATED_QUEUE = "direct:mockTicketUpdated";

    @Before
    public void configureMockEndpoints() throws Exception {
        // mock input
        final AdviceWithRouteBuilder mockRouteAdvice = new AdviceWithRouteBuilder() {
            @Override
            public void configure() throws Exception {
                replaceFromWith(MOCK_TICKET_UPDATED_QUEUE);
            }
        };
        context().getRouteDefinition("ticketRoute").adviceWith(context(), mockRouteAdvice);
    }

    @Test
    public void testTicketRouteWithListOfTickets() throws Exception {
        // create test data
        TradeTicketDto tradeTicketDto = TradeTestDataHelper.getTradeTicketDto();

        // create an exchange and set its body with test data
        List<TradeTicketDto> list = new ArrayList<>();
        list.add(tradeTicketDto);
        list.add(tradeTicketDto);
        Exchange requestExchange = ExchangeBuilder.anExchange(context()).build();
        requestExchange.getIn().setBody(list);

        // create asserts on the mock endpoints
        MockEndpoint mockTicketUpdatedEndpoint = getMockEndpoint("mock:DEV/bookingservice/ticket/updated");
        mockTicketUpdatedEndpoint.expectedBodiesReceived(
                messageCreator.createMessageEnvelopAsJSON(list.get(0), ContextProvider.getUserInContext()),
                messageCreator.createMessageEnvelopAsJSON(list.get(1), ContextProvider.getUserInContext()));

        MockEndpoint mockTradeUpdatedEndpoint = getMockEndpoint("mock:DEV/bookingservice/trade/updated");
        mockTradeUpdatedEndpoint.expectedBodiesReceived(
                messageCreator.createMessageEnvelopAsJSON(list.get(0).getTicketInstruments().get(0), ContextProvider.getUserInContext()),
                messageCreator.createMessageEnvelopAsJSON(list.get(0).getTicketInstruments().get(1), ContextProvider.getUserInContext()),
                messageCreator.createMessageEnvelopAsJSON(list.get(1).getTicketInstruments().get(0), ContextProvider.getUserInContext()),
                messageCreator.createMessageEnvelopAsJSON(list.get(1).getTicketInstruments().get(1), ContextProvider.getUserInContext()));

        // send test exchange to the mock input endpoint
        template.send(MOCK_TICKET_UPDATED_QUEUE, requestExchange);

        // test the asserts
        assertMockEndpointsSatisfied();
    }
}
On running the test, the actual number of bodies received on the mock endpoint is 0.

Mock is NOT a queue for consumers/producers to exchange data. It's a sink for testing purposes where you can set up expectations on the mock.
If you want to simulate JMS via some other means, then take a look at the stub component: http://camel.apache.org/stub
It's also listed at the bottom of the testing docs at: http://camel.apache.org/testing
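For illustration only, here is a minimal sketch of the stub idea (an assumption about how the setup could be restructured, not the asker's actual code): if the publisher went through a Camel JMS endpoint instead of a JmsTemplate call inside the processor, the whole JMS leg could be stubbed out in a test simply by prefixing the endpoint URI with stub:.
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.test.junit4.CamelTestSupport;
import org.junit.Test;

public class StubbedTopicTest extends CamelTestSupport {

    @Override
    protected RouteBuilder createRouteBuilder() {
        return new RouteBuilder() {
            @Override
            public void configure() {
                // stub: swallows the real component and behaves like an in-memory (seda-style)
                // endpoint, so no broker is needed while the route still "sends" the message somewhere
                from("direct:start")
                    .to("stub:jms:topic:BOOKINGSERVICE.TICKET.UPDATED");
            }
        };
    }

    @Test
    public void bodyReachesStubbedTopic() throws Exception {
        template.sendBody("direct:start", "hello");
        // the stubbed endpoint can simply be consumed from in the test
        String received = consumer.receiveBody("stub:jms:topic:BOOKINGSERVICE.TICKET.UPDATED", 1000, String.class);
        assertEquals("hello", received);
    }
}
With the code in the question, the send happens inside SolaceMessagePublisher via JmsTemplate, so neither mock: nor stub: will see it; this approach only helps once the publish step itself goes through a Camel endpoint.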

Related

BroadcastProcessFunction Processing Delay

I'm fairly new to Flink and would be grateful for any advice with this issue.
I wrote a job that receives some input events and compares them with some rules before forwarding them on to kafka topics based on whatever rules match. I implemented this using a flatMap and found it worked well, with one downside: I was loading the rules just once, during application startup, by calling an API from my main() method, and passing the result of this API call into the flatMap function. This worked, but it means that if there are any changes to the rules I have to restart the application, so I wanted to improve it.
I found this page in the documentation which seems to be an appropriate solution to the problem. I wrote a custom source to poll my Rules API every few minutes, and then used a BroadcastProcessFunction, with the Rules added to the broadcast state using processBroadcastElement and the events processed by processElement.
The solution is working, but with one problem. My first approach using a FlatMap would process the events almost instantly. Now that I have changed to a BroadcastProcessFunction, each event takes 60 seconds to process, and it seems to be more or less exactly 60 seconds every time with almost no variation. I made no changes to the rule matching logic itself.
I've had a look through the documentation and I can't seem to find a reason for this, so I'd appreciate it if anyone more experienced in Flink could offer a suggestion as to what might cause this delay.
The job:
public static void main(String[] args) throws Exception {
// set up the streaming execution environment
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setStreamTimeCharacteristic(TimeCharacteristic.IngestionTime);
// read the input from Kafka
DataStream<KafkaEvent> documentStream = env.addSource(
createKafkaSource(getSourceTopic(), getSourceProperties())).name("Kafka[" + getSourceTopic() + "]");
// Configure the Rules data stream
DataStream<RulesEvent> ruleStream = env.addSource(
new RulesApiHttpSource(
getApiRulesSubdomain(),
getApiBearerToken(),
DataType.DataTypeName.LOGS,
getRulesApiCacheDuration()) // Currently set to 120000
);
MapStateDescriptor<String, RulesEvent> ruleStateDescriptor = new MapStateDescriptor<>(
"RulesBroadcastState",
BasicTypeInfo.STRING_TYPE_INFO,
TypeInformation.of(new TypeHint<RulesEvent>() {
}));
// broadcast the rules and create the broadcast state
BroadcastStream<RulesEvent> ruleBroadcastStream = ruleStream
.broadcast(ruleStateDescriptor);
// extract the resources and attributes
documentStream
.connect(ruleBroadcastStream)
.process(new FanOutLogsRuleMapper()).name("FanOut Stream")
.addSink(createKafkaSink(getDestinationProperties()))
.name("FanOut Sink");
// run the job
env.execute(FanOutJob.class.getName());
}
The custom HTTP source which gets the rules
public class RulesApiHttpSource extends RichSourceFunction<RulesEvent> {
private static final Logger LOGGER = LoggerFactory.getLogger(RulesApiHttpSource.class);
private final long pollIntervalMillis;
private final String endpoint;
private final String bearerToken;
private final DataType.DataTypeName dataType;
private final RulesApiCaller caller;
private volatile boolean running = true;
public RulesApiHttpSource(String endpoint, String bearerToken, DataType.DataTypeName dataType, long pollIntervalMillis) {
this.pollIntervalMillis = pollIntervalMillis;
this.endpoint = endpoint;
this.bearerToken = bearerToken;
this.dataType = dataType;
this.caller = new RulesApiCaller(this.endpoint, this.bearerToken);
}
@Override
public void open(Configuration configuration) throws Exception {
// do nothing
}
@Override
public void close() throws IOException {
// do nothing
}
@Override
public void run(SourceContext<RulesEvent> ctx) throws IOException {
while (running) {
if (pollIntervalMillis > 0) {
try {
RulesEvent event = new RulesEvent();
event.setRules(getCurrentRulesList());
event.setDataType(this.dataType);
event.setRetrievedAt(Instant.now());
ctx.collect(event);
Thread.sleep(pollIntervalMillis);
} catch (InterruptedException e) {
running = false;
}
} else if (pollIntervalMillis <= 0) {
cancel();
}
}
}
public List<Rule> getCurrentRulesList() throws IOException {
// call API and get rules
}
@Override
public void cancel() {
running = false;
}
}
The BroadcastProcessFunction
public abstract class FanOutRuleMapper extends BroadcastProcessFunction<KafkaEvent, RulesEvent, KafkaEvent> {
protected final String RULES_EVENT_NAME = "rulesEvent";
protected final MapStateDescriptor<String, RulesEvent> ruleStateDescriptor = new MapStateDescriptor<>(
"RulesBroadcastState",
BasicTypeInfo.STRING_TYPE_INFO,
TypeInformation.of(new TypeHint<RulesEvent>() {
}));
@Override
public void processBroadcastElement(RulesEvent rulesEvent, BroadcastProcessFunction<KafkaEvent, RulesEvent, KafkaEvent>.Context ctx, Collector<KafkaEvent> out) throws Exception {
ctx.getBroadcastState(ruleStateDescriptor).put(RULES_EVENT_NAME, rulesEvent);
LOGGER.debug("Added to broadcast state {}", rulesEvent.toString());
}
// omitted rules matching logic
}
public class FanOutLogsRuleMapper extends FanOutRuleMapper {
public FanOutLogsRuleMapper() {
super();
}
@Override
public void processElement(KafkaEvent in, BroadcastProcessFunction<KafkaEvent, RulesEvent, KafkaEvent>.ReadOnlyContext ctx, Collector<KafkaEvent> out) throws Exception {
RulesEvent rulesEvent = ctx.getBroadcastState(ruleStateDescriptor).get(RULES_EVENT_NAME);
ExportLogsServiceRequest otlpLog = extractOtlpMessageFromJsonPayload(in);
for (Rule rule : rulesEvent.getRules()) {
boolean match = false;
// omitted rules matching logic
if (match) {
for (RuleDestination ruleDestination : rule.getRulesDestinations()) {
out.collect(fillInTheEvent(in, rule, ruleDestination, otlpLog));
}
}
}
}
}
Maybe you can give the complete code of the FanOutLogsRuleMapper class; as posted, the match variable is always false.

How to read a file and send as stream to another endpoint

I need to read a file using Apache Camel and send it to another endpoint as a stream.
public class SimpleRouteBuilder extends RouteBuilder {
@Override
public void configure() throws Exception {
from("file:C:/inputFolder?noop=true").to("streamEndPoint");
}
}
Here is a simple working example of a from(file).to(rest-service) route:
//receiver
from("jetty://http://0.0.0.0:5514/path/getFile")
.process(exchange -> {
if (exchange.getIn().getAttachments() != null) {
for (String key : exchange.getIn().getAttachments().keySet()) {
DataHandler dataHandler = exchange.getIn().getAttachments().get(key);
System.out.println(String.format("Receive attachment:%s size:%s", dataHandler.getName(), dataHandler.getInputStream().available()));
}
}
});
//sender
from("file:/Users/user1/test/?delete=true&delay=5000")
.process(exchange -> {
GenericFile<File> body = exchange.getIn().getBody(GenericFile.class);
exchange.getIn().setHeader("Content-Type", MediaType.MULTIPART_FORM_DATA);
exchange.getIn().setHeader("CamelHttpMethod", "POST");
exchange.getIn().setBody(MultipartEntityBuilder.create()
.addPart(body.getFileName(), new FileBody(body.getFile(), ContentType.MULTIPART_FORM_DATA, body.getFileName()))
.build()
);
})
.to("http4://0.0.0.0:5514/path/getFile?synchronous=true")//camel-http4 component for sending to our rest service
;
Here is the whole example that you can run to see how it works.

Send data to an external resource: why content is null?

I have a simple Camel route. I want to extract a file from the queue and pass it using a POST request to an external resource. The route works and the request reaches the external resource:
import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;
public class MyRouteBuilder extends RouteBuilder {
@Override
public void configure() throws Exception {
from("activemq:alfresco-queue")
.process(new Processor() {
public void process(Exchange exchange) throws Exception {
byte[] bytes = exchange.getIn().getBody(byte[].class);
// All of that does not work...
// exchange.getIn().setHeader("content", bytes); gives "java.lang.IllegalArgumentException: Request header is too large"
// exchange.getIn().setBody(bytes, byte[].class); gives "size of content is -1"
// exchange.getIn().setBody(bytes); gives "size of content is -1"
// ???
// ??? But I can print file content here
for(int i=0; i < bytes.length; i++) {
System.out.print((char) bytes[i]);
}
}
})
.setHeader(Exchange.HTTP_METHOD, constant("POST"))
.setHeader(Exchange.CONTENT_TYPE, constant("multipart/form-data"))
.to("http://vm-alfce52-31......com:8080/alfresco/s/someco/queuefileuploader?guest=true")
.process(new Processor() {
public void process(Exchange exchange) throws Exception {
System.out.println("The response code is: " + exchange.getIn().getHeader(Exchange.HTTP_RESPONSE_CODE));
}
});
}
}
The problem is that the payload of the request is lost:
// somewhere on an external resource
Content content = request.getContent();
long len = content.getSize(); // is always == -1
// the file name is passed successfully
String fileName = request.getHeader("fileName");
How do I set and pass the payload of the POST request in this route/processor?
I noticed that ANY data set this way is lost too. Only the headers are sent to the remote resource.
Using a simple HTML form with <input type="file"> encoded as multipart/form-data, I can successfully send all the data to the external resource.
What could be the reason?
Updated.
The following code also gives null-content:
MultipartEntityBuilder multipartEntityBuilder = MultipartEntityBuilder.create();
multipartEntityBuilder.setMode(HttpMultipartMode.BROWSER_COMPATIBLE);
// this also gives null-content
//multipartEntityBuilder.addBinaryBody("file", exchange.getIn().getBody(byte[].class));
multipartEntityBuilder.addPart("file", new ByteArrayBody(exchange.getIn().getBody(byte[].class), exchange.getIn().getHeader("fileName", String.class)));
exchange.getOut().setBody(multipartEntityBuilder.build().getContent());
/********** This also gives null-content *********/
StringBody username = new StringBody("username", ContentType.MULTIPART_FORM_DATA);
StringBody password = new StringBody("password", ContentType.MULTIPART_FORM_DATA);
MultipartEntityBuilder multipartEntityBuilder = MultipartEntityBuilder.create();
multipartEntityBuilder.setMode(HttpMultipartMode.BROWSER_COMPATIBLE);
multipartEntityBuilder.addPart("username", username);
multipartEntityBuilder.addPart("password", password);
String filename = (String) exchange.getIn().getHeader("fileName");
File file = new File(filename);
try(RandomAccessFile accessFile = new RandomAccessFile(file, "rw")) {
accessFile.write(bytes);
}
multipartEntityBuilder.addPart("upload", new FileBody(file, ContentType.MULTIPART_FORM_DATA, filename));
exchange.getIn().setBody(multipartEntityBuilder.build().getContent());
One more detail. If I change this:
exchange.getOut().setBody(multipartEntityBuilder.build().getContent());
To this:
exchange.getOut().setBody(multipartEntityBuilder.build());
I get the following exception on the FUSE side (I see it through the hawtio management console):
Execution of JMS message listener failed.
Caused by: [org.apache.camel.RuntimeCamelException - org.apache.camel.InvalidPayloadException:
No body available of type: java.io.InputStream but has value: org.apache.http.entity.mime.MultipartFormEntity@26ee73 of type:
org.apache.http.entity.mime.MultipartFormEntity on: JmsMessage@0x1cb83b9.
Caused by: No type converter available to convert from type: org.apache.http.entity.mime.MultipartFormEntity to the required type:
java.io.InputStream with value org.apache.http.entity.mime.MultipartFormEntity@26ee73. Exchange[ID-63-DP-TAV-55652-1531889677177-5-1]. Caused by:
[org.apache.camel.NoTypeConversionAvailableException - No type converter available to convert from type:
org.apache.http.entity.mime.MultipartFormEntity to the required type: java.io.InputStream with value org.apache.http.entity.mime.MultipartFormEntity@26ee73]]
I wrote a small servlet application and got the content in the doPost(...) method from the HttpServletRequest object.
The problem was with the WebScriptRequest object on the external system (Alfresco) side.
@Bedla, thanks for your advice!
On the Alfresco side the problem can be solved as follows:
public class QueueFileUploader extends DeclarativeWebScript {
protected Map<String, Object> executeImpl(WebScriptRequest req, Status status) {
HttpServletRequest httpServletRequest = WebScriptServletRuntime.getHttpServletRequest(req);
// calling methods of httpServletRequest object and retrieving the content
...
The route:
public class MyRouteBuilder extends RouteBuilder {
@Override
public void configure() throws Exception {
from("activemq:alfresco-queue")
.process(new Processor() {
public void process(Exchange exchange) throws Exception {
MultipartEntityBuilder multipartEntityBuilder = MultipartEntityBuilder.create();
multipartEntityBuilder.setMode(HttpMultipartMode.BROWSER_COMPATIBLE);
multipartEntityBuilder.addPart("file", new ByteArrayBody(exchange.getIn().getBody(byte[].class),
exchange.getIn().getHeader("fileName", String.class)));
exchange.getIn().setBody(multipartEntityBuilder.build().getContent());
}
})
.setHeader(Exchange.HTTP_METHOD, constant(org.apache.camel.component.http4.HttpMethods.POST))
.to("http4://localhost:8080/alfresco/s/someco/queuefileuploader?guest=true")
// .to("http4://localhost:8080/ServletApp/hello")
.process(new Processor() {
public void process(Exchange exchange) throws Exception {
System.out.println("The response code is: " +
exchange.getIn().getHeader(Exchange.HTTP_RESPONSE_CODE));
}
});
}
}

How to browse messages from a queue using Apache Camel?

I need to browse messages from an ActiveMQ queue using a Camel route without consuming the messages.
The messages in the JMS queue are to be read (only browsed, not consumed) and moved to a database while ensuring that the original queue remains intact.
public class CamelStarter {
private static CamelContext camelContext;
public static void main(String[] args) throws Exception {
camelContext = new DefaultCamelContext();
ConnectionFactory connectionFactory = new ActiveMQConnectionFactory(ActiveMQConnectionFactory.DEFAULT_BROKER_URL);
camelContext.addComponent("jms", JmsComponent.jmsComponent(connectionFactory));
camelContext.addRoutes(new RouteBuilder() {
@Override
public void configure() throws Exception {
from("jms:queue:testQueue").to("browse:orderReceived") .to("jms:queue:testQueue1");
}
}
);
camelContext.start();
Thread.sleep(1000);
inspectReceivedOrders();
camelContext.stop();
}
public static void inspectReceivedOrders() {
BrowsableEndpoint browse = camelContext.getEndpoint("browse:orderReceived", BrowsableEndpoint.class);
List<Exchange> exchanges = browse.getExchanges();
System.out.println("Browsing queue: "+ browse.getEndpointUri() + " size: " + exchanges.size());
for (Exchange exchange : exchanges) {
String payload = exchange.getIn().getBody(String.class);
String msgId = exchange.getIn().getHeader("JMSMessageID", String.class);
System.out.println(msgId + "=" +payload);
}
}
}
As far as I know, it is not possible in Camel to read (without consuming!) JMS messages...
The only workaround I found (in a JEE app) was to define a startup EJB with a timer, holding a QueueBrowser, and delegating the message processing to a Camel route:
@Singleton
@Startup
public class MyQueueBrowser {
private TimerService timerService;
@Resource(mappedName="java:/jms/queue/com.company.myqueue")
private Queue sourceQueue;
@Inject
@JMSConnectionFactory("java:/ConnectionFactory")
private JMSContext jmsContext;
@Inject
@Uri("direct:readMessage")
private ProducerTemplate camelEndpoint;
@PostConstruct
private void init() {
TimerConfig timerConfig = new TimerConfig(null, false);
ScheduleExpression se = new ScheduleExpression().hour("*").minute("*/"+frequencyInMin);
timerService.createCalendarTimer(se, timerConfig);
}
@Timeout
public void scheduledExecution(Timer timer) throws Exception {
QueueBrowser browser = null;
try {
browser = jmsContext.createBrowser(sourceQueue);
Enumeration<Message> msgs = browser.getEnumeration();
while ( msgs.hasMoreElements() ) {
Message jmsMsg = msgs.nextElement();
// + here: read body and/or properties of jmsMsg
camelEndpoint.sendBodyAndHeader(body, myHeaderName, myHeaderValue);
}
} catch (JMSRuntimeException jmsException) {
...
} finally {
browser.close();
}
}
}
The Apache Camel browse component is designed exactly for that. Check here for the documentation.
I can't say more since you have not provided any other information.
Let's assume you have a route like this
from("activemq:somequeue").to("bean:someBean")
or
from("activemq:somequeue").process(exchange -> {})
All you have to do is put a browse endpoint in between, like this:
from("activemq:somequeue").to("browse:someHandler").to("bean:someBean")
Then write a class like this
@Component
public class BrowseQueue {
@Autowired
CamelContext camelContext;
public void inspect() {
BrowsableEndpoint browse = camelContext.getEndpoint("browse:someHandler", BrowsableEndpoint.class);
List<Exchange> exchanges = browse.getExchanges();
for (Exchange exchange : exchanges) {
......
}
}
}

Accept Multipart file upload as camel restlet or cxfrs endpoint

I am looking to implement a route where a restlet/cxfrs endpoint will accept a file as a multipart request and process it. (The request may have some JSON data as well.)
Thanks in advance.
Regards.
[EDIT]
I have tried the following code. I also tried sending a file using curl. I can see file-related info in the headers and debug output, but I am not able to retrieve the attachment.
from("servlet:///hello").process(new Processor() {
@Override
public void process(Exchange exchange) throws Exception {
Message in = exchange.getIn();
StringBuffer v = new StringBuffer();
HttpServletRequest request = (HttpServletRequest) in
.getHeaders().get(Exchange.HTTP_SERVLET_REQUEST);
DiskFileItemFactory diskFile = new DiskFileItemFactory();
FileItemFactory factory = diskFile;
ServletFileUpload upload = new ServletFileUpload(factory);
List items = upload.parseRequest(request);
.....
curl:
curl -vvv -i -X POST -H "Content-Type: multipart/form-data" -F "image=@/Users/navaltiger/1.jpg; type=image/jpg" http://:8080/JettySample/camel/hello
The following code works (but I can't use it, as it embeds Jetty, and we would like to deploy on Tomcat/WebLogic):
public void configure() throws Exception {
// getContext().getProperties().put("CamelJettyTempDir", "target");
getContext().setStreamCaching(true);
getContext().setTracing(true);
from("jetty:///test").process(new Processor() {
// from("servlet:///hello").process(new Processor() {
public void process(Exchange exchange) throws Exception {
String body = exchange.getIn().getBody(String.class);
HttpServletRequest request = exchange.getIn().getBody(
HttpServletRequest.class);
StringBuffer v = new StringBuffer();
// byte[] picture = (request.getParameter("image")).getBytes();
v.append("\n Printing All Request Parameters From HttpSerlvetRequest: \n+"+body +" \n\n");
Enumeration<String> requestParameters = request
.getParameterNames();
while (requestParameters.hasMoreElements()) {
String paramName = (String) requestParameters.nextElement();
v.append("\n Request Paramter Name: " + paramName
+ ", Value - " + request.getParameter(paramName));
}
I had a similar problem and managed to resolve it, inspired by the answer from brentos. The REST endpoint in my case is defined via XML:
<restContext id="UploaderServices" xmlns="http://camel.apache.org/schema/spring">
<rest path="/uploader">
<post bindingMode="off" uri="/upload" produces="application/json">
<to uri="bean:UploaderService?method=uploadData"/>
</post>
</rest>
</restContext>
I had to use "bindingMode=off" to disable xml/json unmarshalling because the HttpRequest body contains multipart data (json/text+file) and obviously the standard unmarshaling process was unable to process the request because it's expecting a string in the body and not a multipart payload.
The file and other parameters are sent from a front end that uses the file upload angular module: https://github.com/danialfarid/ng-file-upload
To solve CORS problems I had to add a CORSFilter filter in the web.xml like the one here:
public class CORSFilter implements Filter {
@Override
public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain) throws IOException,
ServletException {
HttpServletResponse httpResp = (HttpServletResponse) resp;
HttpServletRequest httpReq = (HttpServletRequest) req;
httpResp.setHeader("Access-Control-Allow-Methods", "GET, HEAD, POST, PUT, DELETE, TRACE, OPTIONS, CONNECT, PATCH");
httpResp.setHeader("Access-Control-Allow-Origin", "*");
if (httpReq.getMethod().equalsIgnoreCase("OPTIONS")) {
httpResp.setHeader("Access-Control-Allow-Headers",
httpReq.getHeader("Access-Control-Request-Headers"));
}
chain.doFilter(req, resp);
}
@Override
public void init(FilterConfig arg0) throws ServletException {
}
@Override
public void destroy() {
}
}
Also, I had to modify the unmarshalling part a little bit:
public String uploadData(Message exchange) {
String contentType = (String) exchange.getHeader(Exchange.CONTENT_TYPE);
MediaType mediaType = MediaType.valueOf(contentType); //otherwise the boundary parameter is lost
InputRepresentation representation = new InputRepresentation(exchange
.getBody(InputStream.class), mediaType);
try {
List<FileItem> items = new RestletFileUpload(
new DiskFileItemFactory())
.parseRepresentation(representation);
for (FileItem item : items) {
if (!item.isFormField()) {
InputStream inputStream = item.getInputStream();
// Path destination = Paths.get("MyFile.jpg");
// Files.copy(inputStream, destination,
// StandardCopyOption.REPLACE_EXISTING);
System.out.println("found file in request:" + item);
}else{
System.out.println("found string in request:" + new String(item.get(), "UTF-8"));
}
}
} catch (Exception e) {
e.printStackTrace();
}
return "200";
}
I'm using the Camel REST DSL with Restlet and was able to get file uploads working with the following code.
rest("/images").description("Image Upload Service")
.consumes("multipart/form-data").produces("application/json")
.post().description("Uploads image")
.to("direct:uploadImage");
from("direct:uploadImage")
.process(new Processor() {
@Override
public void process(Exchange exchange) throws Exception {
MediaType mediaType =
exchange.getIn().getHeader(Exchange.CONTENT_TYPE, MediaType.class);
InputRepresentation representation =
new InputRepresentation(
exchange.getIn().getBody(InputStream.class), mediaType);
try {
List<FileItem> items =
new RestletFileUpload(
new DiskFileItemFactory()).parseRepresentation(representation);
for (FileItem item : items) {
if (!item.isFormField()) {
InputStream inputStream = item.getInputStream();
Path destination = Paths.get("MyFile.jpg");
Files.copy(inputStream, destination,
StandardCopyOption.REPLACE_EXISTING);
}
}
} catch (FileUploadException | IOException e) {
e.printStackTrace();
}
}
});
You can do this with the REST DSL even if you are not using Restlet (for example Jetty) as your REST DSL component.
You need to turn rest binding off first for that route and create two classes to handle the multipart content that is in your body.
You need two classes:
DWRequestContext
DWFileUpload
and then you use them in your custom processor.
Here is the code:
DWRequestContext.java
import org.apache.camel.Exchange;
import org.apache.commons.fileupload.RequestContext;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
public class DWRequestContext implements RequestContext {
private Exchange exchange;
public DWRequestContext(Exchange exchange) {
this.exchange = exchange;
}
public String getCharacterEncoding() {
return StandardCharsets.UTF_8.toString();
}
//could compute here (we have stream cache enabled)
public int getContentLength() {
return (int) -1;
}
public String getContentType() {
return exchange.getIn().getHeader("Content-Type").toString();
}
public InputStream getInputStream() throws IOException {
return this.exchange.getIn().getBody(InputStream.class);
}
}
DWFileUpload.java
import org.apache.camel.Exchange;
import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.FileItemFactory;
import org.apache.commons.fileupload.FileUpload;
import org.apache.commons.fileupload.FileUploadException;
import java.util.List;
public class DWFileUpload extends
FileUpload {
public DWFileUpload() {
super();
}
public DWFileUpload(FileItemFactory fileItemFactory) {
super(fileItemFactory);
}
public List<FileItem> parseInputStream(Exchange exchange)
throws FileUploadException {
return parseRequest(new DWRequestContext(exchange));
}
}
you can define your processor like this:
routeDefinition.process(new Processor() {
@Override
public void process(Exchange exchange) throws Exception {
// Create a factory for disk-based file items
DiskFileItemFactory factory = new DiskFileItemFactory();
factory.setRepository(new File(System.getProperty("java.io.tmpdir")));
DWFileUpload upload = new DWFileUpload(factory);
java.util.List<FileItem> items = upload.parseInputStream(exchange);
//here I assume I have only one, but I could split it here somehow and link them to camel properties...
//with this, the first file sent with your multipart replaces the body
// of the exchange for the next processor to handle it
exchange.getIn().setBody(items.get(0).getInputStream());
}
});
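For context, a minimal sketch (my addition, not part of the original answer) of how this processor could be wired into a REST DSL route with rest binding turned off; the camel-jetty component, host/port, and the /uploader/upload path are assumptions:
import java.io.File;
import java.util.List;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.rest.RestBindingMode;
import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;

public class UploadRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // binding off so the raw multipart body is not pushed through JSON/XML unmarshalling
        restConfiguration()
            .component("jetty")
            .host("0.0.0.0").port(8080)
            .bindingMode(RestBindingMode.off);

        rest("/uploader")
            .post("/upload")
            .to("direct:handleUpload");

        from("direct:handleUpload")
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    // same parsing as in the snippet above: DWFileUpload reads the multipart body
                    DiskFileItemFactory factory = new DiskFileItemFactory();
                    factory.setRepository(new File(System.getProperty("java.io.tmpdir")));
                    DWFileUpload upload = new DWFileUpload(factory);
                    List<FileItem> items = upload.parseInputStream(exchange);
                    // the first uploaded part becomes the new body for the next processor
                    exchange.getIn().setBody(items.get(0).getInputStream());
                }
            });
    }
}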
I stumbled into the same requirement of having to consume a multipart request (containing file data, including binary) through the Apache Camel Restlet component.
Even though 2.17.x is out, since my project was part of a wider framework/application, I had to use version 2.12.4.
Initially, my solution drew a lot from the restlet-jdbc example; it yielded data in the exchange and successfully retrieved text files, but I was unable to retrieve correct binary content.
I attempted to dump the data directly into a file to inspect the content using the following code (abridged).
from("restlet:/upload?restletMethod=POST")
.to("direct:save-files");
from("direct:save-files")
.process(new org.apache.camel.Processor(){
public void process(org.apache.camel.Exchange exchange){
/*
* Code to sniff exchange content
*/
}
})
.to("file:///C:/<path to a folder>");
;
I used org.apache.commons.fileupload.MultipartStream from the Apache FileUpload library to write the following utility class to parse a multipart request from a file. It worked successfully when the output of a multipart request from Postman was fed to it. However, it failed to parse the content of the file created by Camel (even though, to the eye, the content of both files looked similar).
public class MultipartParserFileCreator{
public static final String DELIMITER = "\\r?\\n";
public static void main(String[] args) throws Exception {
// taking it from the content-type in exchange
byte[] boundary = "------5lXVNrZvONBWFXxd".getBytes();
FileInputStream fis = new FileInputStream(new File("<path-to-file>"));
extractFile(fis, boundary);
}
public static void extractFile(InputStream is, byte[] boundary) throws Exception {
MultipartStream multipartStream = new MultipartStream(is, boundary, 1024*4, null);
boolean nextPart = multipartStream.skipPreamble();
while (nextPart) {
String headers = multipartStream.readHeaders();
if(isFileContent(headers)) {
String filename = getFileName(headers);
File file = new File("<dir-where-file-created>"+filename);
if(!file.exists()) {
file.createNewFile();
}
FileOutputStream fos = new FileOutputStream(file);
multipartStream.readBodyData(fos);
fos.flush();
fos.close();
}else {
multipartStream.readBodyData(System.out);
}
nextPart = multipartStream.readBoundary();
}
}
public static String[] getContentDispositionTokens(String headersJoined) {
String[] headers = headersJoined.split(DELIMITER, -1);
for(String header: headers) {
System.out.println("Processing header: "+header);
if(header != null && header.startsWith("Content-Disposition:")) {
return header.split(";");
}
}
throw new RuntimeException(
String.format("[%s] header not found in supplied headers [%s]", "Content-Disposition:", headersJoined));
}
public static boolean isFileContent(String header) {
String[] tokens = getContentDispositionTokens(header);
for (String token : tokens) {
if (token.trim().startsWith("filename")) {
return true;
}
}
return false;
}
public static String getFileName(String header) {
String[] tokens = getContentDispositionTokens(header);
for (String token : tokens) {
if (token.trim().startsWith("filename")) {
String filename = token.substring(token.indexOf("=") + 2, token.length()-1);
System.out.println("fileName is " + filename);
return filename;
}
}
return null;
}
}
While debugging through the Camel code, I noticed that at one stage Camel converts the entire content into a String. After a point I had to stop pursuing this approach, as there was very little on the net applicable to version 2.12.4 and my work was not going anywhere.
Finally, I resorted to the following solution:
Write an implementation of HttpServletRequestWrapper to allow the input stream to be read multiple times. One can get an idea from "How to read request.getInputStream() multiple times".
Create a filter that uses the above to wrap the HttpServletRequest object, reads and extracts the file to a directory ("Convenient way to parse incoming multipart/form-data parameters in a Servlet"), and attaches the path to the request using the request.setAttribute() method. In web.xml, configure this filter on the Restlet servlet.
In the process method of the Camel route, cast exchange.getIn().getBody() to an HttpServletRequest object, extract the attribute (path), and use it to read the file as a byte array for further processing (see the sketch below).
Not the cleanest, but I could achieve the objective.
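A rough sketch of that last step (my addition; the "uploadedFilePath" attribute name and the class name are hypothetical and depend on what the filter actually sets):
import java.nio.file.Files;
import java.nio.file.Paths;

import javax.servlet.http.HttpServletRequest;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;

public class UploadedFileProcessor implements Processor {
    @Override
    public void process(Exchange exchange) throws Exception {
        // the servlet request is still available as the message body at this point
        HttpServletRequest request = exchange.getIn().getBody(HttpServletRequest.class);

        // "uploadedFilePath" is whatever attribute name the upstream filter used in setAttribute()
        String path = (String) request.getAttribute("uploadedFilePath");

        // read the file the filter extracted, as a byte array, for further processing
        byte[] content = Files.readAllBytes(Paths.get(path));
        exchange.getIn().setBody(content);
    }
}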
