Struts 2 interceptor: all action fields are null

I have implemented my first interceptor in Struts 2. I think it's very basic, but when it runs before the action, the action's fields are all set to null.
I'm not sure what I'm doing wrong.
My interceptor class is:
public class LoginInterceptor extends AbstractInterceptor {

    private static final long serialVersionUID = 1L;
    private static final Logger log = Logger.getLogger(LoginInterceptor.class);

    @Override
    public String intercept(final ActionInvocation invocation) throws Exception {
        Map<String, Object> session = ActionContext.getContext().getSession();
        User user = SessionUtilities.getUser(session);
        if (user != null) {
            return invocation.invoke();
        }
        Object action = invocation.getAction();
        if (!(action instanceof LoginAction)) {
            return "loginRedirect";
        }
        return invocation.invoke();
    }
}
My struts.xml file is
<interceptors>
    <interceptor name="login" class="ar.com.cipres.framework.logic.LoginInterceptor"/>
    <interceptor-stack name="loginStack">
        <interceptor-ref name="login"/>
    </interceptor-stack>
</interceptors>
<default-action-ref name="go.home"/>

<global-results>
    <result name="loginRedirect" type="redirect">go.loginform.action</result>
    <result name="exception" type="redirect">exception.jsp</result>
</global-results>
and finally one of my actions with problems is configured as
<action name="go.asociate.form" method="prepareAsociateData" class="ar.com.cipres.chacabuco.asociate.AsociateAction">
    <interceptor-ref name="loginStack"/>
    <result name="success">chaca/socios/asociateupdate.jsp</result>
</action>
Any comments so far?
Thank you.

I think I found the problem:
I must include the defaultStack, which I omitted when declaring the new stack.
So I need to modify my struts.xml file as follows:
<interceptors>
    <interceptor name="login" class="ar.com.cipres.framework.logic.LoginInterceptor"/>
    <interceptor-stack name="loginStack">
        <interceptor-ref name="login"/>
        <interceptor-ref name="defaultStack"/> <!-- Needed to keep the default functionality -->
    </interceptor-stack>
</interceptors>
I'm going to try it later.
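As the update says, the null fields are explained by the fact that the params interceptor, which copies request parameters onto action fields, lives inside the defaultStack; declaring a custom stack without it removes that behaviour. If most actions need the login check, the custom stack can also be made the package default so it does not have to be referenced on every action. A sketch, assuming the package extends struts-default (the package name here is illustrative):

```xml
<package name="default" extends="struts-default">
    <interceptors>
        <interceptor name="login" class="ar.com.cipres.framework.logic.LoginInterceptor"/>
        <interceptor-stack name="loginStack">
            <interceptor-ref name="login"/>
            <!-- defaultStack supplies params, validation, etc. -->
            <interceptor-ref name="defaultStack"/>
        </interceptor-stack>
    </interceptors>
    <!-- Actions in this package now run loginStack unless they declare their own -->
    <default-interceptor-ref name="loginStack"/>
</package>
```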

Related

Open MultipartFile (Blob storage) in a _blank tab

I have a Spring Boot 2.1.3 + Thymeleaf 3 webapp with a big form that contains some information and also file uploads. Uploading files works well, but when I reload the information stored in the DB back into the same form (to view or modify it), everything works except the part related to the files.
The code for the file-upload part is the following:
<div class="form-group row">
    <label for="allegato_durc" class="col-sm-5 col-form-label form-label">Allegato DURC</label>
    <div class="col-sm-7">
        <input type="file" th:field="*{documentiFornitoreDto.allegato_DURC}" class="form-control-file form-control-sm datif_input" id="allegato_durc">
    </div>
    <label for="allegato_CCIAA" class="col-sm-5 col-form-label form-label">Allegato CCIAA</label>
    <div class="col-sm-7">
        <input type="file" th:field="*{documentiFornitoreDto.allegato_CCIAA}" class="form-control-file form-control-sm datif_input" id="allegato_CCIAA">
    </div>
</div>
Even if the file is present, I see the input field empty as below:
I'm storing the MultipartFile as a MEDIUMBLOB in the DB and, when I reload the info from the DB, I rebuild the MultipartFile as follows:
public class ByteToMultipartFile implements MultipartFile {

    private byte[] fileContent;
    private String fileName;

    public ByteToMultipartFile(String fileName, byte[] fileContent) {
        this.fileContent = fileContent;
        this.fileName = fileName;
    }

    @Override
    public String getName() {
        return fileName;
    }

    @Override
    public String getOriginalFilename() {
        return fileName;
    }

    @Override
    public String getContentType() {
        // TODO Auto-generated method stub
        return null;
    }

    @Override
    public boolean isEmpty() {
        return fileContent.length == 0;
    }

    @Override
    public long getSize() {
        return fileContent.length;
    }

    @Override
    public byte[] getBytes() throws IOException {
        return fileContent;
    }

    @Override
    public InputStream getInputStream() throws IOException {
        return new ByteArrayInputStream(fileContent);
    }

    @Override
    public void transferTo(File dest) throws IOException, IllegalStateException {
        // TODO Auto-generated method stub
    }
}
Maybe there's something wrong with the class above?
Anyway, I would like to do two things:
1) Show the filename next to the Choose button ("Scegli file" in the image) when a file is present
2) Show a button that lets the user OPEN the file in the appropriate Windows app (if it is a .pdf, open it with Acrobat Reader, and so on)
Is either of these possible?
I have read right here, in an old post, that a file can be opened in a new _blank tab (or page, it makes no difference) this way:
<h4>@document.Name</h4>
which is roughly what I want. Now, the author writes that this attribute:
@document.ContentBlobURL
represents the blob-storage address in the DB. Does anyone know what it is? How can I retrieve that value?
I googled a lot but couldn't find anything interesting.
I would like to point out that, as you know, in a Spring Boot application (for example) with this structure:
if I save the file on disk, inside the static folder for example, I can open it via:
http://localhost:8080/costruzione_stampi.pdf
I would like the same thing, but without saving files to disk.
I hope someone will answer.
I found a solution, and I want to post it because I hope it helps somebody else.
Googling around, I found out that, for security reasons, the value of an
<input type="file" ...
cannot be set from data loaded from the DB (I tried with Multipart, File, Blob, byte[], etc.).
By this I mean that I can't set the input file's value with a procedure like the one below:
@Controller
public class AppController {

    @GetMapping("/loadDataInForm")
    public String showData(Model model) {
        // suppose this repository retrieves a Blob, a MultipartFile, or similar
        model.addAttribute("file", repository.getByPk(1));
        return "form.html";
    }
}
form.html:
.....
<input type="file" th:field="*{file}" id="file_data"> <!-- always contains nothing -->
I found some workarounds (one of them is here), but they are really not best practice.
Anyway, if you have a different need, for example showing a preview of the file chosen by the user (at upload time!), you can use this trick:
<input type="file" th:field="*{someDto.file}" id="allegato_durc" onchange="show();">
.....
<script type="text/javascript">
    function show() {
        const fileElem = document.getElementById('allegato_durc').files[0];
        const binaryData = [fileElem];
        const blob = new Blob(binaryData, {type: "image/jpg"});
        const objectURL = window.URL.createObjectURL(blob);
        window.open(objectURL, '_blank');
    }
</script>
Hope it helps.
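For the original goal, opening a DB-stored file in a _blank tab without ever writing it to disk, the usual approach is an HTTP endpoint that streams the stored bytes with a Content-Type header and an inline Content-Disposition; the page then links to that URL with target="_blank". In a Spring Boot app this would be a @GetMapping that returns the byte[] with those headers. The sketch below shows the same idea with only the JDK's built-in HTTP server so it is self-contained; the path and blob contents are made up:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class BlobServer {

    // Hypothetical in-memory "blob" standing in for the bytes loaded from the DB.
    static final byte[] BLOB = "fake pdf bytes".getBytes();

    public static HttpServer start() throws Exception {
        // Port 0 = pick any free port
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/files/report.pdf", exchange -> {
            // "inline" asks the browser to open the file in the tab
            // rather than download it; the app used is chosen by MIME type.
            exchange.getResponseHeaders().set("Content-Type", "application/pdf");
            exchange.getResponseHeaders().set("Content-Disposition",
                    "inline; filename=\"report.pdf\"");
            exchange.sendResponseHeaders(200, BLOB.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(BLOB);
            }
        });
        server.start();
        return server;
    }
}
```

The browser decides how to render the response from the Content-Type, so a PDF served as application/pdf opens in the PDF viewer with nothing ever saved to the static folder.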

"No association found!" error while indexing data to Solr using Spring Data Solr

I am trying out a sample service application with Spring Data MongoDB + Spring Data Solr, where MongoDB is used to persist the data and Solr to index and search it.
The save operation to MongoDB succeeds in the service class, but calling the Solr save() method crashes the service with the error log below:
SEVERE [com.sun.jersey.spi.container.ContainerResponse] (default task-1) The RuntimeException could not be mapped to a response, re-throwing to the HTTP container: org.springframework.data.solr.UncategorizedSolrException: No association found!; nested exception is java.lang.IllegalStateException: No association found!
at org.springframework.data.solr.core.SolrTemplate.execute(SolrTemplate.java:171)
Analysing the log further down, it says:
Caused by: java.lang.IllegalStateException: No association found!
at org.springframework.data.mapping.PersistentProperty.getRequiredAssociation(PersistentProperty.java:166)
The line getConverter().write(bean, document) inside convertBeanToSolrInputDocument() in SolrTemplate is throwing the error.
The DAO method:
public String addToRepo(MyEntity myEntity) {
    mongoOperation.save(myEntity);                  // works fine, data saved to MongoDB
    solrOperation.save("collectionName", myEntity); // throws the exception above
    return "success";
}
I am using Spring 5 + solrj-6.1.0 + spring-data-solr-4.0.2.
The SolrOperations bean has been loaded as follows:
ApplicationContext SOLR_CONFIG_APP_CTX =
        new AnnotationConfigApplicationContext(SpringSolrConfig.class);
public static final SolrOperations SOLR_OPS =
        (SolrOperations) SOLR_CONFIG_APP_CTX.getBean("solrTemplate");
SpringSolrConfig.java
@Configuration
public class SpringSolrConfig extends AbstractSolrConfig {

    public SolrClientFactory solrClientFactory() {
        SolrClient solrClient = new HttpSolrClient.Builder(solrUrl).build();
        HttpSolrClientFactory solrClientFactory = new HttpSolrClientFactory(solrClient);
        return solrClientFactory;
    }
}
The SpringConfig.xml file looks like this:
<mongo:mongo host="195.168.1.140" port="27017"/>
<mongo:dbfactory dbname="myDB"/>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
    <constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
</bean>
<repositories base-package="sample.package.repositories"/>
<bean id="myEntityRepo" class="sample.package.repositories.MyEntityRepositoryInterface"/>
<solr:repositories base-package="sample.package.repositories"/>
<solr:solr-server id="solrServer" url="http://localhost:8983/solr"/>
<bean id="solrTemplate" class="org.springframework.data.solr.core.SolrTemplate">
    <constructor-arg index="0" ref="solrServer"/>
</bean>
Thanks in advance for helping me troubleshoot this!
I updated my SpringSolrConfig file as below to fix the problem. Courtesy: https://jira.spring.io/browse/DATASOLR-394
@Configuration
public class SpringSolrConfig extends AbstractSolrConfig {

    String solrUrl = "http://localhost:8983/solr/"; // TODO read this ideally from spring-configuration.xml

    public SolrClientFactory solrClientFactory() {
        SolrClient solrClient = new HttpSolrClient.Builder(solrUrl).build();
        return new HttpSolrClientFactory(solrClient);
    }

    @Bean
    public SolrTemplate solrTemplate() {
        SolrTemplate solrTemplateObj = new SolrTemplate(solrClientFactory());
        // Ensures the default MappingSolrConverter is not used to convert
        // the bean to a Solr document before indexing
        solrTemplateObj.setSolrConverter(new SolrJConverter());
        return solrTemplateObj;
    }
}

camel-xmljson: define output types

I am using the camel-xmljson component to transform XML to JSON in Spring.
I have the following code in my camel-context:
<dataFormats>
    <xmljson id="xmljson" forceTopLevelObject="false"/>
</dataFormats>

<route id="LCG-Producer" autoStartup="false">
    <from uri="activemq:to_in"/>
    <marshal ref="xmljson"/>
    <to uri="activemq:to_out"/>
</route>
The body to transform is the following:
<?xml version="1.0"?>
<example>
<cadena1>aaaaaaa</cadena1>
<entero1>1511</entero1>
</example>
The problem is that the generated JSON is the following:
{
    "cadena1": "aaaaaaa",
    "entero1": "1511"
}
The field "entero1" is generated as a String, but it is an Integer; I need this output (without the quotes):
{
    "cadena1": "aaaaaaa",
    "entero1": 1511
}
Is there any way to define the data format?
I have checked the source code of XmlJsonDataFormat, and I think it is not possible.
However, you can use the same serializers to write your own processor.
For example (disclaimer: the code is not efficient, it is only to give you ideas):
@Override
public void process(Exchange exchange) throws Exception {
    XMLSerializer serializer = new XMLSerializer();
    JSON json = serializer.read(exchange.getIn().getBody(String.class));
    StringWriter writer = new StringWriter();
    JsonConfig jsonConfig = new JsonConfig();
    jsonConfig.registerJsonValueProcessor("entero1", new JsonValueProcessor() {
        @Override
        public Object processArrayValue(Object value, JsonConfig jsonConfig) {
            return new BigDecimal(value.toString());
        }
        @Override
        public Object processObjectValue(String key, Object value, JsonConfig jsonConfig) {
            return new BigDecimal(value.toString());
        }
    });
    JSONSerializer.toJSON(json, jsonConfig).write(writer);
    writer.close();
    exchange.getIn().setBody(writer.toString());
}
According to the document http://json-lib.sourceforge.net/usage.html#xml, the XMLSerializer treats all data as strings unless a "type" attribute is specified on the XML element, e.g.:
<example>
    <cadena1>aaaaaaa</cadena1>
    <entero1 type="number">1511</entero1>
</example>

How do I properly use Mocks when testing Camel routes?

I am trying to write a Camel test that checks that a content-based router is routing XML files correctly. Here are the endpoints and the route in my blueprint.xml:
<endpoint uri="activemq:queue:INPUTQUEUE" id="jms.queue.input"/>
<endpoint uri="activemq:queue:QUEUE1" id="jms.queue.1"/>
<endpoint uri="activemq:queue:QUEUE2" id="jms.queue.2"/>

<route id="general-jms.to.specific-jms">
    <from ref="jms.queue.input"/>
    <choice>
        <when>
            <xpath>//object-type = '1'</xpath>
            <log message="Sending message to queue: QUEUE1"/>
            <to ref="jms.queue.1"/>
        </when>
        <when>
            <xpath>//object-type = '2'</xpath>
            <log message="Sending message to queue: QUEUE2"/>
            <to ref="jms.queue.2"/>
        </when>
        <otherwise>
            <log message="No output was able to be determined based on the input."/>
        </otherwise>
    </choice>
</route>
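As an aside, the routing decision above hinges entirely on the //object-type = '…' XPath predicates, and these can be sanity-checked outside Camel with the JDK's built-in XPath engine. A small self-contained sketch (the payload and class name are made up):

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class RoutePredicateCheck {

    // Evaluates the same boolean predicate the <choice> block uses.
    static boolean objectTypeIs(String xml, String type) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        String expr = "//object-type = '" + type + "'";
        return (Boolean) XPathFactory.newInstance().newXPath()
                .evaluate(expr, doc, XPathConstants.BOOLEAN);
    }
}
```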
Right now, all I am trying to do is send in a sample source file that has an <object-type> of 1 and verify that it is routed to the correct queue (QUEUE1) with the correct data (the entire XML file should be sent to QUEUE1). Here is my test code:
public class RouteTest extends CamelBlueprintTestSupport {

    @Override
    protected String getBlueprintDescriptor() {
        return "/OSGI-INF/blueprint/blueprint.xml";
    }

    @Override
    public String isMockEndpointsAndSkip() {
        return "activemq:queue:QUEUE1";
    }

    @Test
    public void testQueue1Route() throws Exception {
        getMockEndpoint("mock:activemq:queue:QUEUE1").expectedBodiesReceived(
                context.getTypeConverter().convertTo(String.class, new File("src/test/resources/queue1-test.xml")));
        template.sendBody("activemq:queue:INPUTQUEUE",
                context.getTypeConverter().convertTo(String.class, new File("src/test/resources/queue1-test.xml")));
        assertMockEndpointsSatisfied();
    }
}
When I run this test, I see the log message that I put in the route definition that says it is sending it to QUEUE1, but the JUnit test fails with this error message: java.lang.AssertionError: mock://activemq:queue:QUEUE1 Received message count. Expected: <1> but was: <0>.
Can someone help me understand what I am doing wrong?
My understanding is that Camel will automatically mock the QUEUE1 endpoint since I overrode isMockEndpointsAndSkip() and provided the QUEUE1 endpoint URI. I thought this meant I should be able to reference that endpoint in the getMockEndpoint() method just by prepending "mock:" to the URI, and then have a mocked endpoint on which I can set expectations (i.e., that it has received the input file).
If I am unclear on something please let me know and any help is greatly appreciated!
The solution is to use CamelTestSupport.replaceRouteFromWith.
This method lacks any documentation, but it works for me when invoked like this:
public class FooTest extends CamelTestSupport {

    @Override
    public void setUp() throws Exception {
        replaceRouteFromWith("route-id", "direct:route-input-replaced");
        super.setUp();
    }

    // other stuff ...
}
This will also prevent the original consumer of the route's from destination from starting. For example, that means it is no longer necessary to have an ActiveMQ instance running when a route with an ActiveMQ consumer is to be tested.
After working on this for quite some time, the only solution I came up with that actually worked for me was to use the createRouteBuilder() method in my test class to add a route to a mocked endpoint at the end of the route defined in my blueprint.xml file. Then I can check that mocked endpoint against my expectations. Below is the final code for my test class; the blueprint XML remained the same.
public class RouteTest extends CamelBlueprintTestSupport {

    @Override
    protected String getBlueprintDescriptor() {
        return "/OSGI-INF/blueprint/blueprint.xml";
    }

    @Test
    public void testQueue1Route() throws Exception {
        getMockEndpoint("mock:QUEUE1").expectedBodiesReceived(
                context.getTypeConverter().convertTo(String.class, new File("src/test/resources/queue1-test.xml")));
        template.sendBody("activemq:queue:INPUTQUEUE",
                context.getTypeConverter().convertTo(String.class, new File("src/test/resources/queue1-test.xml")));
        assertMockEndpointsSatisfied();
    }

    @Test
    public void testQueue2Route() throws Exception {
        getMockEndpoint("mock:QUEUE2").expectedBodiesReceived(
                context.getTypeConverter().convertTo(String.class, new File("src/test/resources/queue2-test.xml")));
        template.sendBody("activemq:queue:INPUTQUEUE",
                context.getTypeConverter().convertTo(String.class, new File("src/test/resources/queue2-test.xml")));
        assertMockEndpointsSatisfied();
    }

    @Override
    protected RouteBuilder createRouteBuilder() throws Exception {
        return new RouteBuilder() {
            public void configure() throws Exception {
                // Forward each output queue to a mock endpoint we can assert on
                from("activemq:queue:QUEUE1").to("mock:QUEUE1");
                from("activemq:queue:QUEUE2").to("mock:QUEUE2");
            }
        };
    }
}
While this solution works, I don't fully understand why I can't just use isMockEndpointsAndSkip() instead of manually defining a new route at the end of my existing blueprint.xml route. It is my understanding that having isMockEndpointsAndSkip() return "*" will inject mocked endpoints for all of the endpoints defined in your blueprint.xml file, and that you can then check your expectations on those mocked endpoints. But for some reason, this does not work for me.

SessionScoped Bean loses data on post-back on Google Appengine

I use the Eclipse 3.7 GAE plugin for development. My application uses JSF and the datastore, and was set up as per https://sites.google.com/a/wildstartech.com/adventures-in-java/Java-Platform-Enterprise-Edition/JavaServer-Faces/javaserver-faces-20/configuring-javaserver-faces-20-to-run-on-the-google-appengine. On my development system it works well, but when deployed to GAE, the SessionScoped bean loses its data on post-back:
<!-- Input facelet -->
<h:outputLabel for="popupCal">Date </h:outputLabel>
<p:calendar value="#{editEntry.current.date1}" id="popupCal"/>
<h:outputLabel for="code">Code </h:outputLabel>
<h:inputText id="code" value="#{editEntry.current.accountCode}"/>
<h:outputLabel for="amt">Amount </h:outputLabel>
<h:inputText id="amt" value="#{editEntry.current.amountInDollars}"/>
<h:commandButton action="#{editEntry.createCashExpenditure}" value="Create Entry"/>
@ManagedBean(name="editEntry")
@SessionScoped
public class EditEntry extends AbstractEntryBean implements Serializable {

    @ManagedProperty(value="#{sessionBean}")
    protected SessionBean sessionBean;

    @ManagedProperty(value="#{dao}")
    protected Dao dao;

    @PostConstruct
    public void init() {
        Logger.getLogger(getClass().getName()).log(Level.WARNING, "dao is null? {0}", dao == null);
        setTran_id(0L);
        entries.clear();
        setCurrent(new Entry());
        getCurrent().clear();
        ...
        this.refreshEntries();
    }

    public void refreshEntries() {
        entries = dao.getEntries(current.getFinyr(), getTran_id());
        Logger.getLogger(getClass().getName()).log(Level.INFO, "entries has {0} items", entries.size());
    }

    public String createCashExpenditure() {
        if (dao == null) {
            Logger.getLogger(getClass().getName()).log(Level.WARNING, "dao is null");
            return null;
        }
        entries.clear();
        Entry e = new Entry();
        e.clear();
        e.setAccountCode(current.getAccountCode());
        e.setAccountName(dao.lookupAccoutName(e.getAccountCode()));
        e.setAmount(current.getAmount());
        e.setDate1(current.getDate1());
        e.setTran_id(getTran_id());
        Key key = dao.saveEntry(e, sessionBean.getFinyr());
        e.setId(key.getId());
        entries.add(e);
        current = e;
        this.setTran_id(e.getTran_id());
        Logger.getLogger(getClass().getName()).log(Level.INFO, "current account is: {0}", current.getAccountCode());
        return "newEntry?faces-redirect=true";
    }
    ...
}
newEntry.xhtml:
<p:dataTable id="items" value="#{editEntry.entries}" var="item">
    <!-- editEntry.entries is EMPTY! -->
When EditEntry.createCashExpenditure() is invoked, the log shows that EditEntry.current is correctly populated and saved to the datastore; the Datastore viewer also displays the data. But on post-back, in the newEntry.xhtml facelet, editEntry.entries becomes empty and EditEntry.current loses all its data.
I have put in place the ForceSessionSerializationPhaseListener mentioned in http://java.zacheusz.eu/google-app-engine-http-session-vs-jsf-en/394/ and the log shows this listener is invoked.
In web.xml, javax.faces.PROJECT_STAGE is Production.
I faced the same issue: after a redirect, the previous session data is gone. It only happens when deployed online.
I think it is due to javax.faces.STATE_SAVING_METHOD being set to 'client' in web.xml, so before the redirect I need to explicitly put the object into the session as below:
getSessionScope().put(sessionname, sessionObj);

public Map getSessionScope() {
    return getFacesContext().getExternalContext().getSessionMap();
}
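A likely root cause worth noting: unlike a single-JVM servlet container, App Engine persists HTTP sessions between requests by Java-serializing them, so everything reachable from a session-scoped bean must be Serializable; anything in the graph that is not (an injected DAO, for instance) does not survive the round trip. The sketch below demonstrates, with only the JDK, the serialize/deserialize round trip the container effectively performs (class and method names here are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SessionRoundTrip {

    // Serialize a session attribute the way a distributed container would.
    static byte[] freeze(Serializable attribute) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(attribute);
        }
        return bos.toByteArray();
    }

    // Restore it on the next request.
    static Object thaw(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }
}
```

If any object reachable from the session graph is not Serializable, freeze() throws NotSerializableException, which on GAE typically surfaces as session data silently going missing after the next request.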