I have been trying this camel-sql example to fetch some rows from a back-end Oracle database.
Datasource definition:
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver"/>
<property name="url" value="jdbc:oracle:thin:#localhost:port:sid"/>
<property name="username" value="username"/>
<property name="password" value="password"/>
</bean>
Routes Definition:
<route id="QueryTable">
<from uri="timer:foo?period=5s"/>
<to uri="sql:{{sql.selectOrder}}"/>
<to uri="file:target/data/?fileName=data.txt"/>
</route>
It seems to fetch the rows properly, but it does not create the file and reports the following exception. Any help would be highly appreciated.
at org.apache.camel.impl.converter.BaseTypeConverterRegistry.mandatoryConvertTo(BaseTypeConverterRegistry.java:198)
at org.apache.camel.impl.MessageSupport.getMandatoryBody(MessageSupport.java:105)
... 16 more
http://camel.apache.org/sql-component.html
For select operations, the result is an instance of List<Map<String, Object>> type, as returned by the JdbcTemplate.queryForList() method.
You received a List<Map<String, Object>> as the result of the query. If you want to save these results to a file, you have to split the list into records (use a splitter), convert each record to a string (perhaps with some sort of template, or just by joining the field values) and save (append) that string to the file.
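For example, a minimal sketch of that approach in the same Spring XML DSL (the {{sql.selectOrder}} query and the target directory are taken from the question; each row is written here using the Map's default toString(), so in practice you would plug in your own formatting):
<route id="QueryTableSplit">
    <from uri="timer:foo?period=5s"/>
    <to uri="sql:{{sql.selectOrder}}"/>
    <!-- split the List<Map<String, Object>> into one exchange per row -->
    <split>
        <simple>${body}</simple>
        <!-- turn the row (a Map) into a String, one line per row -->
        <transform>
            <simple>${body}\n</simple>
        </transform>
        <!-- append every row to the same file -->
        <to uri="file:target/data/?fileName=data.txt&amp;fileExist=Append"/>
    </split>
</route>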
Thanks for the kind help. Here is what I used to save the result set to a file.
<from uri="timer:foo?repeatCount=1" />
<to uri="sql:{{sql.selectOrder}}"/>
<marshal>
<csv delimiter="|" />
</marshal>
<to uri="file://target/test.csv?fileName=data.txt" />
And of course, pom.xml has to have the following dependency added.
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-csv</artifactId>
</dependency>
Related
This is a simple select statement against a MySQL server, printing the output to the console. Somehow the result keeps getting printed in a loop, even though I haven't used any timer as such.
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:camel="http://camel.apache.org/schema/spring"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
                           http://camel.apache.org/schema/spring http://camel.apache.org/schema/spring/camel-spring.xsd">
<context:component-scan base-package="com.java"/>
<bean destroy-method="close" class="org.apache.commons.dbcp.BasicDataSource" id="dataSource">
<property value="com.mysql.jdbc.Driver" name="driverClassName"/>
<property value="jdbc:mysql://localhost:3306/world" name="url"/>
<property value="root" name="username"/>
<property value="mysql" name="password"/>
</bean>
<!-- configure the Camel SQL component to use the JDBC data source -->
<bean class="org.apache.camel.component.sql.SqlComponent" id="sql">
<property name="dataSource" ref="dataSource"/>
</bean>
<camel:camelContext>
<camel:route>
<camel:from uri="sql:select id from city where countryCode='AFG' and district ='Kabol' "/>
<camel:log message="${body[id]}"/>
<camel:to uri="stream:out"/>
</camel:route>
</camel:camelContext>
</beans>
Client code is:
public static void main(String[] args) throws InterruptedException {
    PropertyConfigurator.configure("C:/Users/payal.bansal/workspace/CamelOneProject/src/resources/log4j.properties");
    //AbstractApplicationContext ctx = new ClassPathXmlApplicationContext("resources/camel-configThree.xml");
    AbstractApplicationContext ctx = new ClassPathXmlApplicationContext("resources/camel-configEight.xml");
    Thread.sleep(5000);
    ctx.close();
}
The SQL from endpoint will repeat the SQL call every 500 milliseconds by default (it is a scheduled consumer).
http://camel.apache.org/sql-component.html
If you want the SQL to fire only once, then use a timer
from timer
to sql
And configure the timer to only run once.
http://camel.apache.org/timer
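For example (a sketch reusing the query from the question; the timer name is arbitrary, and repeatCount=1 makes it fire exactly once):
<camel:route>
    <!-- fire once, then never again -->
    <camel:from uri="timer:runOnce?repeatCount=1"/>
    <!-- as a producer, the sql endpoint only runs when the timer triggers -->
    <camel:to uri="sql:select id from city where countryCode='AFG' and district ='Kabol'"/>
    <!-- the producer returns a List<Map<String, Object>>, so log the whole body -->
    <camel:log message="${body}"/>
    <camel:to uri="stream:out"/>
</camel:route>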
OK, this is a continuation of my original question (WSO2 ESB filtering messages to an output file).
After a couple more days of research into the iterate and aggregate mediators I have hit a wall and would be very grateful for any advice on how to resolve my blocking issue. The task at hand is simple: read an XML file, filter on certain records, and produce an output XML file containing only the records that meet the criteria. Based on the previous discussion and research, the solution is as follows: use an iterate mediator to access each XML record and a filter mediator to decide what to keep, then use an aggregate mediator to write all those records to one file. Without the aggregator, my configuration produces one record per file.
Here is my proxy in its current state:
<?xml version="1.0" encoding="UTF-8"?>
<proxy xmlns="http://ws.apache.org/ns/synapse"
name="RenFileFilterProxy"
transports="vfs"
statistics="disable"
trace="disable"
startOnLoad="true">
<target>
<inSequence>
<log level="full"/>
<clone>
<target sequence="RenIqtFilterSequence"/>
</clone>
</inSequence>
<outSequence>
<log level="custom">
<property name="sequence" value="In the outSequence"/>
</log>
<log level="full"/>
<aggregate id="QuizType">
<completeCondition>
<messageCount min="-1" max="-1"/>
</completeCondition>
<onComplete xmlns:z="RowsetSchema" expression="//z:row">
<call-template target="FileWriteTemplate">
<with-param name="targetFileName" value="TEST_FILE"/>
<with-param name="addressUri" value="vfs:file:///var/process/ren/rrout"/>
</call-template>
</onComplete>
</aggregate>
</outSequence>
</target>
<parameter name="transport.vfs.ActionAfterProcess">MOVE</parameter>
<parameter name="transport.PollInterval">15</parameter>
<parameter name="transport.vfs.MoveAfterProcess">file:///var/process/ren/extractedfiles</parameter>
<parameter name="transport.vfs.FileURI">file:///var/process/ren/extractedfiles</parameter>
<parameter name="transport.vfs.MoveAfterFailure">file:///var/process/ren/failure</parameter>
<parameter name="transport.vfs.FileNamePattern">test.xml</parameter>
<parameter name="transport.vfs.ContentType">application/xml</parameter>
<parameter name="transport.vfs.ActionAfterFailure">MOVE</parameter>
<description/>
</proxy>
Here is the RenIqtFilterSequence:
<sequence xmlns="http://ws.apache.org/ns/synapse" name="RenIqtFilterSequence">
<log level="custom">
<property name="sequence" value="RenIqtFilterSequence"></property>
</log>
<iterate xmlns:ns2="http://org.apache.synapse/xsd" xmlns:ns="http://org.apache.synapse/xsd" xmlns:z="RowsetSchema" expression="//z:row" id="QuizType">
<target>
<sequence>
<log>
<property name="iteratesequence" value="Iterating through the the records"></property>
</log>
<filter xmlns:rs="urn:schemas-microsoft-com:rowset" xpath="//z:row/@name='RP'">
<then>
<log level="custom">
<property name="sequence" value="Condition Write"></property>
</log>
<log level="full"></log>
<log level="custom">
<property name="sequence" value="Getting ready to aggregate"></property>
</log>
</then>
<else>
<log level="custom">
<property name="sequence" value="Condition Drop"></property>
</log>
<drop></drop>
</else>
</filter>
</sequence>
</target>
</iterate>
</sequence>
The issue I am facing now is that I cannot get to the outSequence to perform my aggregate logic. The fact is that I do not need to call a back-end service from the inSequence for this process, so I am struggling to find a clean way to reach my outSequence. Any recommendations? I tried to include the aggregate mediator in my inSequence, but it did not work and no output was produced. I figured the aggregator needs to be in the outSequence to capture all the records I filtered, so that I can then write them all out to one file.
I believe I need some configuration here:
<filter xmlns:rs="urn:schemas-microsoft-com:rowset" xpath="//z:row/@name='RP'">
<then>
<log level="custom">
<property name="sequence" value="Condition Write"></property>
</log>
<log level="full"></log>
<log level="custom">
<property name="sequence" value="Getting ready to aggregate"></property>
</log>
===============> Need to transfer to OutSequence somehow here!!!!<============
</then>
<else>
I have tried several approaches based on articles and blogs online, but I just cannot put the final pieces in place. Any advice would be much appreciated. Thank you for taking the time to read this.
I have an existing setup of MyBatis + Spring for working with an Oracle DB. I have a set of Java mapper interfaces and a set of corresponding mapper XMLs (each referencing its Java mapper). I now need to add support for MSSQL as well, but I am finding it hard to do. I have created a separate set of XMLs (specific to the MSSQL queries) in com/mycomp/mob/db/mappers/mssql.
Below is an extract of my applicationContext.xml (here DBDataSource is an internal class which reads a config file to get all DB details).
<bean name="dataSource" class="com.mycomp.mob.core.db.DBDataSource">
<constructor-arg index="0" name="dbAlias" value="mob" />
<constructor-arg index="1" name="cfgSection" value="primary" />
</bean>
<bean class="org.mybatis.spring.mapper.MapperScannerConfigurer">
<property name="basePackage" value="com.mycomp.mob.db.mappers.oracle" />
</bean>
<bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="typeAliasesPackage" value="com.mycomp.mob.db.mappers.oracle" />
<property name="configLocation" value="classpath:mybatis-config.xml" />
</bean>
<bean id="sqlSession" class="org.mybatis.spring.SqlSessionTemplate">
<constructor-arg index="0" ref="sqlSessionFactory" />
</bean>
Below is part of mybatis-config.xml (here the ${} params are replaced using the earlier DBDataSource).
<configuration>
<typeAliases>
<package name="com.mycomp.mob.db.model" />
</typeAliases>
<environments default="development">
<environment id="development">
<transactionManager type="JDBC" />
<dataSource type="POOLED">
<property name="driver" value="${driver}" />
<property name="url" value="${url}" />
<property name="username" value="${username}" />
<property name="password" value="${password}" />
</dataSource>
</environment>
</environments>
<mappers>
<mapper resource="com/mycomp/mob/db/mappers/oracle/tenant.xml" />
<mapper resource="com/mycomp/mob/db/mappers/oracle/functionenty.xml" />
</mappers>
</configuration>
com.mycomp.mob.db.model contains the POJOs for tenant and functionentry.
Usage is as below:
ITenantMapper mapper = sqlSessionFactory.openSession().getMapper(ITenantMapper.class);
Tenant t = mapper.getTenant();
Now, only one DB (whichever is configured as primary in the DB config file) will be used at a time. So how do I make sure that the XML corresponding to that particular DB is invoked by the Java mapper interface?
Also, can I configure the same Java mapper class name inside the MSSQL mapper XMLs?
Is the flow I am using correct, or do I need to change it to support multiple databases?
Create two data sources and two SqlSessionFactories in Spring: one for Oracle and another for MSSQL.
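A sketch of that layout, assuming the bean names, a "secondary" config section for MSSQL, and a com.mycomp.mob.db.mappers.mssql package are acceptable; each SqlSessionFactoryBean loads only its own mapper XMLs, and each MapperScannerConfigurer is tied to its factory via sqlSessionFactoryBeanName:
<!-- one data source per database; "secondary" is an assumed config section name -->
<bean id="oracleDataSource" class="com.mycomp.mob.core.db.DBDataSource">
    <constructor-arg index="0" name="dbAlias" value="mob" />
    <constructor-arg index="1" name="cfgSection" value="primary" />
</bean>
<bean id="mssqlDataSource" class="com.mycomp.mob.core.db.DBDataSource">
    <constructor-arg index="0" name="dbAlias" value="mob" />
    <constructor-arg index="1" name="cfgSection" value="secondary" />
</bean>

<!-- one session factory per database, each loading only its own mapper XMLs -->
<bean id="oracleSqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
    <property name="dataSource" ref="oracleDataSource" />
    <property name="typeAliasesPackage" value="com.mycomp.mob.db.model" />
    <property name="mapperLocations" value="classpath:com/mycomp/mob/db/mappers/oracle/*.xml" />
</bean>
<bean id="mssqlSqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
    <property name="dataSource" ref="mssqlDataSource" />
    <property name="typeAliasesPackage" value="com.mycomp.mob.db.model" />
    <property name="mapperLocations" value="classpath:com/mycomp/mob/db/mappers/mssql/*.xml" />
</bean>

<!-- one scanner per database, bound to its factory -->
<bean class="org.mybatis.spring.mapper.MapperScannerConfigurer">
    <property name="basePackage" value="com.mycomp.mob.db.mappers.oracle" />
    <property name="sqlSessionFactoryBeanName" value="oracleSqlSessionFactory" />
</bean>
<bean class="org.mybatis.spring.mapper.MapperScannerConfigurer">
    <property name="basePackage" value="com.mycomp.mob.db.mappers.mssql" />
    <property name="sqlSessionFactoryBeanName" value="mssqlSqlSessionFactory" />
</bean>
With mapperLocations set per factory, the <mappers> and <environments> sections would move out of the shared mybatis-config.xml, since the data source and mapper XMLs are now supplied per factory by Spring.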
I am working with Apache Camel Bindy to process CSV files of different data models,
e.g. file one is of data model one and file two is of data model two.
In the Camel route, I associated two instances of BindyCsvDataFormat with the different data models:
<bean id="bindyDataformat" class="org.apache.camel.dataformat.bindy.csv.BindyCsvDataFormat">
<constructor-arg name="type" value="com.barclays.creditit.cls.eoddata.model.risk.DataModel1" />
</bean>
<bean id="aBindyDataformat" class="org.apache.camel.dataformat.bindy.csv.BindyCsvDataFormat">
<constructor-arg name="type" value="DataModel2" />
</bean>
The route looks like this:
<from uri="direct:start"/>
<bean ref="fileReader"/>
<unmarshal ref="bindyDataformat" />
<bean ref="flattener"/>
<bean ref="fileReader"/>
<unmarshal ref="aBindyDataformat" />
<bean ref="flattener"/>
When I run the code, though, the factory has both models associated automatically, not one per run, and both files are read into objects of the first data model and never the second. Any suggestions about how I could get this to work?
Thanks!
Create two different routes with different file filters and process each of them separately with one of the Bindy data formats.
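A rough sketch of that idea, reusing the bean ids from the question (the input directory and the include patterns are assumptions; the Camel file component's include option filters file names by regular expression):
<!-- route one: only files belonging to the first data model -->
<route id="dataModel1Route">
    <from uri="file:data/in?include=model1.*\.csv"/>
    <unmarshal ref="bindyDataformat"/>
    <bean ref="flattener"/>
</route>
<!-- route two: only files belonging to the second data model -->
<route id="dataModel2Route">
    <from uri="file:data/in?include=model2.*\.csv"/>
    <unmarshal ref="aBindyDataformat"/>
    <bean ref="flattener"/>
</route>
If the two model classes share a package, it may also be worth checking Bindy's package-scanning behaviour in older Camel versions, which can pull both models into a single factory.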
I have some static files (some are HTML, some are images, and some are pure data files like .csv or .xls) that I want to share through the ESB. I can make that happen if I run a separate HTTP server that receives the requests for these through the ESB. Instead, I would like to handle it in the ESB itself. Based on the incoming request URL (say an HTTP GET request to http://myesb.com:8280/getstatus.html), I would like to pull these static files from the local server's folders.
I tried the VFS method, but it looks like it has a built-in "refresh" (polling) mechanism that I don't want. I want to "GET" this data only when clients request it.
In short, I would like a simple mapping like this:
http://myesb.com:8280/getstatus.html will fetch the contents of the /var/myapp/status/appstatus.html file.
Update
I tried the following sequence - I don't know how to make it work :(
<sequence xmlns="http://ws.apache.org/ns/synapse" name="app1status">
<in>
<log level="custom">
<property name="Reached app1status page - in" value="app1 Status"/>
<property name="transport.vfs.ContentType" value="text/html"/>
<property xmlns:ns="http://org.apache.synapse/xsd" name="TRPURL:" expression="get-property('From')"/>
</log>
<property name="transport.vfs.FileURI" value="vfs:file:///opt/platform/traffic/app1status1.html" scope="transport" type="STRING"/>
<property name="HTTP_METHOD" value="GET"/>
<property name="ClientApiNonBlocking" action="remove" scope="axis2"/>
<header name="To" action="remove"/>
<property name="RESPONSE" value="true" scope="default" type="STRING"/>
</in>
<out>
<log level="custom">
<property name="::::::Out:::::Reached app1status" value=" From OUT"/>
<property name="messageType" value="text/html"/>
<property name="ContentType" value="text/html"/>
</log>
<send/>
</out>
</sequence>
Note the following in the <in> mediator:
<property name="transport.vfs.FileURI" value="vfs:file:///opt/platform/traffic/app1status1.html" scope="transport" type="STRING"/>
My intent is to have the contents of the app1status1.html file retrieved and sent back as the response. But I am not able to get the contents retrieved and added to the "RESPONSE".
Let me know how it can be done.
Thanks for your time.
Define a REST API and, based on GET/PUT, pull data from or post data to your server.
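A sketch of such an API in Synapse configuration (the API name, the /status context, and the script-mediator file read are all assumptions; the script uses Java interop to load the file and assumes the status page is well-formed XHTML without an XML declaration):
<api xmlns="http://ws.apache.org/ns/synapse" name="StatusAPI" context="/status">
    <resource methods="GET" uri-template="/getstatus.html">
        <inSequence>
            <script language="js"><![CDATA[
                // hypothetical sketch: load the static file and use it as the response payload
                var path = "/var/myapp/status/appstatus.html";
                var bytes = java.nio.file.Files.readAllBytes(java.nio.file.Paths.get(path));
                mc.setPayloadXML(new XML(new java.lang.String(bytes, "UTF-8")));
            ]]></script>
            <property name="messageType" value="text/html" scope="axis2"/>
            <respond/>
        </inSequence>
    </resource>
</api>
With this layout the file would be served from http://myesb.com:8280/status/getstatus.html, since an API needs its own context path; a class mediator could replace the script if binary files (images, .xls) also have to be served.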