I want to upload a JPG file and a JSON-serialized Java object. On the server I am using Apache CXF; on the client I am writing integration tests with rest-assured.
My server code looks like:
@POST
@Path("/document")
@Consumes(MediaType.MULTIPART_FORM_DATA)
public Response storeTravelDocument(
    @Context UriInfo uriInfo,
    @Multipart(value = "document") JsonBean bean,
    @Multipart(value = "image") InputStream pictureStream)
    throws IOException
{}
My client code looks like:
given().
multiPart("document", new File("./data/json.txt"), "application/json").
multiPart("image", new File("./data/image.txt"), "image/jpeg").
expect().
statusCode(Response.Status.CREATED.getStatusCode()).
when().
post("/document");
Everything works fine when I read the JSON part from a file, as in the first multiPart line. However, when I want to serialize the JSON instance I run into problems. I tried many variants, but none worked.
I thought this variant should work: on the client
JsonBean json = new JsonBean();
json.setVal1("Value 1");
json.setVal2("Value 2");
given().
contentType("application/json").
formParam("document", json).
multiPart("image", new File("./data/image.txt"), "image/jpeg").
...
and on the server
public Response storeTravelDocument(
    @Context UriInfo uriInfo,
    @FormParam(value = "document") JsonBean bean,
    @Multipart(value = "image") InputStream pictureStream)
but no. Can anyone tell me how it should be?
Try a different approach (it worked for me); I am not sure whether it is suitable in your case.
Make JsonBean a JAXB entity, that is, add @XmlRootElement above the class definition.
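A minimal sketch of what the annotated bean might look like, assuming the val1/val2 properties from the question:
@XmlRootElement
public class JsonBean {
    private String val1;
    private String val2;

    public String getVal1() { return val1; }
    public void setVal1(String val1) { this.val1 = val1; }
    public String getVal2() { return val2; }
    public void setVal2(String val2) { this.val2 = val2; }
}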
Then, instead of formParam
given().
contentType("application/json").
body(bean). //bean is your JsonBean
multiPart("image", new File("./data/image.txt"), "image/jpeg").
and then on the server:
public Response storeTravelDocument(
    @Context UriInfo uriInfo,
    JsonBean bean, // should be deserialized properly
    @Multipart(value = "image") InputStream pictureStream)
I've never tried that with the @Multipart part, but hopefully it will work.
Multipart/form-data follows the rules of multipart MIME data streams, see w3.org. This means that each part of the request forms a part in the stream. Rest-assured already supports simple fields (strings), files and streams, but not object serialization into a part. After asking on the mailing list, Johan Haleby (the author of rest-assured) suggested adding an issue. The issue has already been accepted, see issue 166.
The server will stay as it is:
@POST
@Path("/document")
@Consumes(MediaType.MULTIPART_FORM_DATA)
public Response storeTravelDocument(
    @Context UriInfo uriInfo,
    @Multipart(value = "document") JsonBean bean,
    @Multipart(value = "image") InputStream pictureStream)
    throws IOException
{}
The client code will look like:
given().
multiPartObject("document", objectToSerialize, "application/json").
multiPart("image", new File("./data/image.txt"), "image/jpeg").
expect().
statusCode(Response.Status.CREATED.getStatusCode()).
when().
post("/document");
Maybe the name "multiPartObject" will change. We will see once it is implemented.
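Until that is implemented, a workaround (my own sketch, not from the mailing list) is to serialize the bean yourself, for example with Jackson's ObjectMapper, and send the resulting JSON string as a plain string part. This assumes your rest-assured version has the multiPart(controlName, contentBody, mimeType) overload:
ObjectMapper mapper = new ObjectMapper();
String documentJson = mapper.writeValueAsString(objectToSerialize); // manual serialization instead of multiPartObject
given().
multiPart("document", documentJson, "application/json").
multiPart("image", new File("./data/image.txt"), "image/jpeg").
expect().
statusCode(Response.Status.CREATED.getStatusCode()).
when().
post("/document");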
Related
I am currently using Spring 4.1.6 with a RestTemplate to consume a third-party JSON web service whose behavior I cannot change. I am using Jackson databind v2.6.0.
Problem: sometimes the service returns a hashmap for a member, {member:{"key":"value",...}}, and sometimes the same member is just an empty array, {member:[]}. So I cannot ignore the property by default.
Is there a way to configure the deserialization to ignore empty arrays? I saw the Jackson property "WRITE_EMPTY_JSON_ARRAYS", but I am not quite sure how I can use it with my restTemplate and Spring configuration.
Are there other possibilities, e.g. some combination of @JsonXXX annotations? I saw @JsonSerialize, which can be used at class level, but I don't want to write a deserializer for all my classes just to handle this situation (however, if there is no other way, of course I will).
Example responses to illustrate the behavior of the service:
response with a hashmap
{"id":170,"categories":{"13":"caro"}}
response with empty array of the same member
{"id":170,"categories":[]}
Example of my RestTemplate usage:
BasicAuthRequestFactory requestFactory = new BasicAuthRequestFactory(httpClient);
restTemplate = new RestTemplate(requestFactory);
Article a = restTemplate.getForObject(new URI("http://..."), Article.class);
Error:
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of java.util.LinkedHashMap out of START_ARRAY token
at [Source: java.io.PushbackInputStream@4aa21f9d; line: 1, column: 1456] (through reference chain: ResponseArticleWrapper["data"]->Article["categories"])
at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
Example of my current annotated class:
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonInclude(Include.NON_NULL)
public class Article {

    @JsonProperty("id")
    private Integer id;

    @JsonProperty("categories")
    private Map<Integer, String> categories = new HashMap<Integer, String>();
}
Thank you in advance for any hints and examples.
Since jackson-databind 2.5 there is a DeserializationFeature for handling this case. It's turned off by default, so you need to enable it in your ObjectMapper:
@Bean
public ObjectMapper objectMapper() {
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.configure(DeserializationFeature.ACCEPT_EMPTY_ARRAY_AS_NULL_OBJECT, true);
    return objectMapper;
}
You can see how the custom ObjectMapper for RestTemplate is configured here: How can we configure the internal Jackson mapper when using RestTemplate?
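For completeness, here is a sketch of how that wiring could look; the converter setup below is a standard Spring approach and an assumption on my part, not code from the question:
@Bean
public RestTemplate restTemplate(ObjectMapper objectMapper) {
    // register the customized ObjectMapper with the converter used for JSON
    MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
    converter.setObjectMapper(objectMapper);
    RestTemplate restTemplate = new RestTemplate(); // or new RestTemplate(requestFactory) as in the question
    restTemplate.getMessageConverters().add(0, converter);
    return restTemplate;
}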
After you're done with the configuration, you can just let Spring wire it for you in your class:
@Autowired
private RestOperations restTemplate;
and use the provided restTemplate instance.
I would like to send JMS messages containing Java POJOs to ActiveMQ, and all messages should be converted to JSON documents. So I need a mechanism that converts a POJO to JSON and sends the created document as a text message to ActiveMQ. I would like to use the ProducerTemplate#send(...) method without needing to define routes. I am using routes on the server, but in my opinion doing so on the client side is overkill.
This is the XML config:
<camel:camelContext id="camel-client">
<camel:template id="camelTemplate" />
<camel:dataFormats>
<camel:json id="json" library="Jackson" />
</camel:dataFormats>
</camel:camelContext>
and the Java code:
@EndpointInject(uri = "jms:queue:test?jmsMessageType=Text")
private ProducerTemplate camelTemplate;

@Test
public void send() {
    Address address = new Address("Eric Mouller", "ForstenriederAlle 99", 81476);
    camelTemplate.sendBody(address);
}
The current implementation calls toString() on Address, but I would like it to be converted to JSON automatically. Is that possible?
From my understanding you are trying to take a Java object and convert it into a JSON string. Something like Gson would do wonders for you.
Gson gson = new Gson();
String addressJson = gson.toJson(address);
Reference:
https://google-gson.googlecode.com/svn/trunk/gson/docs/javadocs/com/google/gson/Gson.html
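Putting that together with the ProducerTemplate from the question, a sketch could look like this (the marshalling is done by hand here instead of by the Camel json data format):
Gson gson = new Gson();
Address address = new Address("Eric Mouller", "ForstenriederAlle 99", 81476);
// send the JSON string, so the JMS text message contains the serialized object
camelTemplate.sendBody(gson.toJson(address));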
I am using Camel's POJO producing, e.g.
public interface MyListener {
    String sayHello(String name);
}

public class MyBean {
    @Produce(uri = "activemq:foo")
    protected MyListener producer;

    public void doSomething() {
        // lets send a message
        String response = producer.sayHello("James");
    }
}
The interface uses the method sayHello with a String parameter, which is used as the body in Camel. However, if I try to use any other object here I get an exception from Camel saying no TypeConverter found for BeanInvocation for conversion to java.io.InputStream.
I know that if other objects were allowed it would have been mentioned somewhere, but I want to understand why it has been done like that and whether there is a way to work around it.
I haven't really used POJO messaging yet; maybe an experienced user can help you better with this.
But from what I understand, it should be able to support any kind of object, not just strings.
The error you're talking about seems to arise from a mismatch further down the route. I'm guessing there is some kind of issue with the consumption.
Can you please post the exact error stacktrace and the consumer method?
Thanks!
Struggling with the same problem right now. The only obvious workaround so far is to use @EndpointInject instead of @Produce - then you get a ProducerTemplate and can publish any object:
@EndpointInject(uri = "seda:report-send")
ProducerTemplate reportSender;
Now you can do
Object myObject = new Object();
reportSender.sendBody(myObject);
Or even
Object myObject = new Object();
Map<String, Object> headers = new HashMap<String, Object>();
headers.put("Subject", "Mail subject");
headers.put("contentType", "text/plain");
reportSender.sendBodyAndHeaders(myObject, headers);
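If the consumer expects JSON text rather than a serialized Java object, one option (my own sketch, not part of the original workaround) is to marshal the body yourself before handing it to the ProducerTemplate, e.g. with Jackson:
ObjectMapper mapper = new ObjectMapper();
// writeValueAsString declares a checked exception, so call this from code that handles it
reportSender.sendBodyAndHeaders(mapper.writeValueAsString(myObject), headers);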
My project is based on GAE/J and uses the recently launched PULL queue, but I think the question also applies to Python.
Basically, when I put a task into the PULL queue, I need to set some params on the task for the later consumer to pick up.
I have implemented the param setting in both ways:
1) By using param():
TaskOptions taskOptions = TaskOptions.Builder.
withMethod(TaskOptions.Method.PULL);
taskOptions.param("param", paramValue);
taskOptions.param("param2", paramValue2);
2) By using payload():
TaskOptions taskOptions = TaskOptions.Builder.
withMethod(TaskOptions.Method.PULL);
taskOptions.payload("payloadValue");
Both approaches work; however, I would like to know what the differences between the two are, and which way is preferable in terms of efficiency or convenience.
I can see that by using param() it is easy to set multiple parameters and also easy to retrieve them in the consumer.
But for single-parameter cases, payload() may be handier, as it saves the code for catching the exceptions thrown when the consumer extracts the parameters.
However, I would be happy to learn about any other differences between these two apart from what I have mentioned.
Per the Python documentation, I would say that in your case it is exactly the same:
In PULL requests, do not specify params if you already specified a payload. Params are encoded as application/x-www-form-urlencoded and set to the payload.
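To make the difference concrete on the consumer side, here is a rough sketch of leasing tasks from a pull queue with the Java Task Queue API (queue name, lease time and count are placeholders):
Queue queue = QueueFactory.getQueue("my-pull-queue");
List<TaskHandle> tasks = queue.leaseTasks(60, TimeUnit.SECONDS, 10);
for (TaskHandle task : tasks) {
    try {
        // tasks built with param(): the form-encoded payload can be split back into pairs
        List<Map.Entry<String, String>> params = task.extractParams();
        // tasks built with payload(): read the raw bytes exactly as they were stored
        byte[] rawPayload = task.getPayload();
        // ... process, then remove the task from the queue
        queue.deleteTask(task);
    } catch (Exception e) {
        // extractParams() fails if the payload is not application/x-www-form-urlencoded
    }
}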
There is a difference between the .param() and .payload() functions of TaskOptions. You can use them as follows:
taskOptions.param("param1", "Invoice_3344");
Now at the receiver end, let's say you are calling a servlet; then from the HttpRequest you can read the sent parameters as request parameters.
public class MyInvoiceTask extends HttpServlet {
    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        String invoiceNum = request.getParameter("param1");
    }
}
Now assume you want to serialize an entire custom class object that holds a lot of data. In such a case you would need to use the .payload() function, as internally it sends the payload data in the request body.
// Custom class object
Person person = new Person("Abc", "Mumbai", 22);
// Convert the object into JSON so that it can be passed as a String (required for the payload)
// Use the Gson library
Gson gson = new Gson();
String personObjString = gson.toJson(person);
// Put the payload into the task options as a byte array
taskOptions.payload(personObjString.getBytes());
Now at the receiver end, let's say again a servlet, we need to get the payload byte array from the HttpRequest object and convert it back into the custom object, i.e. a Person object in our case.
private byte[] getPayloadFromHttpRequest(HttpServletRequest req) throws IOException {
    InputStream inputStream = req.getInputStream();
    ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
    int length;
    byte[] buffer = new byte[1024];
    while ((length = inputStream.read(buffer)) >= 0) {
        byteArrayOutputStream.write(buffer, 0, length);
    }
    if (byteArrayOutputStream.size() > 0) {
        return byteArrayOutputStream.toByteArray();
    }
    return null;
}
// Now the received byte array can be used with Gson to convert it back into a Person object
byte[] payload = getPayloadFromHttpRequest(request);
Gson gson = new Gson();
String personJsonString = new String(payload);
Person person = gson.fromJson(personJsonString, Person.class);
I am trying to develop an API call using Apache CXF that takes in an attachment along with the request. I followed this tutorial and this is what I have got so far.
@POST
@Path("/upload")
@RequireAuthentication(false)
public Response uploadWadl(MultipartBody multipartBody) {
    List<Attachment> attachments = multipartBody.getAllAttachments();
    DataHandler dataHandler = attachments.get(0).getDataHandler();
    try {
        InputStream is = dataHandler.getInputStream();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return Response.ok().build();
}
I am getting an InputStream to the attachment and everything is working fine. However, I need to pass the attachment as a java.io.File object to another function. I know I can create a file here, read from the InputStream and write to it. But is there a better solution? Has CXF already stored it as a File? If so, I could just go ahead and use that. Any suggestions?
I'm also interested in this matter. While discussing it with Sergey on the CXF mailing list, I learned that CXF uses a temporary file if the attachment is over a certain threshold.
In the process I discovered this blog post that explains how to use CXF attachments safely.
You may be interested in the example on that page as well.
That's all I can say at the moment as I'm investigating right now, I hope that helps.
EDIT: At the moment, here's how we handle attachments with CXF 2.6.x when uploading a file using a multipart content type.
In our REST resource we have defined the following method:
@POST
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.MULTIPART_FORM_DATA)
@Path("/")
public Response archive(
        @Multipart(value = "title", required = false) String title,
        @Multipart(value = "hash", required = false) @Hash(optional = true) String hash,
        @Multipart(value = "file") @NotNull Attachment attachment) {
    ...
    IncomingFile incomingFile = attachment.getObject(IncomingFile.class);
    ...
}
A few notes on that snippet:
@Multipart is not standard JAXRS; it's not even in JAXRS 2, it's part of CXF.
In our code we have implemented bean validation (you have to do it yourself in JAXRS 1)
You don't have to use a MultipartBody, the key here is to use an argument of type Attachment
So yes, as far as we know, there is not yet a way to get the desired type directly in the method signature. So, for example, if you just want the InputStream of the attachment, you cannot put it in the method signature. You have to use the org.apache.cxf.jaxrs.ext.multipart.Attachment type and write the following statement:
InputStream inputStream = attachment.getObject(InputStream.class);
Also, we discovered with the help of Sergey Beryozkin that we could transform or wrap this InputStream; that's why in the above snippet we wrote:
IncomingFile incomingFile = attachment.getObject(IncomingFile.class);
IncomingFile is our custom wrapper around the InputStream. For that you have to register a MessageBodyReader; a ParamHandler won't help, as they work with Strings rather than streams.
@Component
@Provider
@Consumes
public class IncomingFileAttachmentProvider implements MessageBodyReader<IncomingFile> {

    @Override
    public boolean isReadable(Class<?> type, Type genericType, Annotation[] annotations, MediaType mediaType) {
        return type != null && type.isAssignableFrom(IncomingFile.class);
    }

    @Override
    public IncomingFile readFrom(Class<IncomingFile> type,
                                 Type genericType,
                                 Annotation[] annotations,
                                 MediaType mediaType,
                                 MultivaluedMap<String, String> httpHeaders,
                                 InputStream entityStream
    ) throws IOException, WebApplicationException {
        return createIncomingFile(entityStream, fixedContentHeaders(httpHeaders)); // the code that will return an IncomingFile
    }
}
Note, however, that it took a few attempts to understand what was passed and how, and to hot-fix bugs (for example, the first letter of the first header of the attachment part was eaten, so you got ontent-Type instead of Content-Type).
Of course the entityStream represents the actual InputStream of the attachment. This stream will read data either from memory or from disk, depending on where CXF put the data; there is a size threshold property (attachment-memory-threshold) for that. You can also say where the temporary attachments will go (attachment-directory).
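For reference, a sketch of how those properties can be set when the endpoint is built programmatically (the threshold value and directory below are placeholders; the same entries can also be declared as endpoint properties in a Spring configuration):
JAXRSServerFactoryBean sf = new JAXRSServerFactoryBean();
Map<String, Object> properties = new HashMap<String, Object>();
properties.put("attachment-memory-threshold", "102400"); // bytes kept in memory before spooling to disk
properties.put("attachment-directory", "/tmp/cxf-attachments"); // where spooled parts are written
sf.setProperties(properties);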
Just don't forget to close the stream when you are done (some tools do it for you).
Once everything was configured we tested it with Rest-Assured from Johan Haleby (some of the code is part of our test utils, though):
given().log().all()
.multiPart("title", "the.title")
.multiPart("file", file.getName(), file.getBytes(), file.getMimeType())
.expect().log().all()
.statusCode(200)
.body("store_event_id", equalTo("1111111111"))
.when()
.post(host().base().endWith("/store").toStringUrl());
Or, if you need to upload the file via curl, something like this:
curl --trace -v -k -f
--header "Authorization: Bearer b46704ff-fd1d-4225-9dd4-e29065532b73"
--header "Content-Type: multipart/form-data"
--form "hash={SHA256}3e954efb149aeaa99e321ffe6fd581f84d5a497b6fab5c86e0d5ab20201f7eb5"
--form "title=fantastic-video.mp4"
--form "archive=#/the/path/to/the/file/fantastic-video.mp4;type=video/mp4"
-X POST http://localhost:8080/api/video/event/store
To finish this answer, I'd like to mention that it is possible to have a JSON payload in multipart; for that you can use an Attachment type in the signature and then write
Book book = attachment.getObject(Book.class);
Or you can write an argument like:
@Multipart(value="book", type="application/json") Book book
Just don't forget to add the Content-Type header to the relevant part when performing the request.
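For example, with Rest-Assured this can be done by giving the part an explicit MIME type; bookJson below stands for the serialized Book and is an assumption for the sake of the example:
given().
multiPart("book", bookJson, "application/json").
multiPart("file", file.getName(), file.getBytes(), file.getMimeType()).
...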
It might be worth mentioning that it is possible to receive all the parts in a list; just write a method with a single argument of type List<Attachment>. However, I prefer to have the actual arguments in the method signature, as it's cleaner and less boilerplate.
@POST
void takeAllParts(List<Attachment> attachments)