How to configure a Sybase SQL Anywhere 16 data source in Spring Boot? - sybase

I want to declare a data source that is actually a Sybase SQL Anywhere 16 database. I have done the same for MySQL using com.mysql.jdbc.jdbc2.optional.MysqlDataSource, which implements the javax.sql.DataSource interface.
@Configuration
@PropertySource("classpath:db.properties")
public class DbConfig {

    @Autowired
    private Environment env;

    @Bean
    public DataSource getDataSource() {
        MysqlDataSource mysqlDS = new MysqlDataSource();
        mysqlDS.setURL(env.getProperty("MYSQL_DB_URL"));
        mysqlDS.setUser(env.getProperty("MYSQL_DB_USERNAME"));
        mysqlDS.setPassword(env.getProperty("MYSQL_DB_PASSWORD"));
        return mysqlDS;
    }
}
I want to know which jar is required for Sybase SQL Anywhere 16 and how to configure it in the same way as the code above. This data source is actually required to implement the code below.
@Override
protected void configure(AuthenticationManagerBuilder auth) throws Exception {
    auth.jdbcAuthentication().dataSource(dataSource)......
}

I have done it with the following code.
@Bean
public DataSource getDataSource() {
    SybDataSource sybDS = new SybDataSource();
    try {
        // setPortNumber expects an int, so the property value is parsed here
        sybDS.setPortNumber(Integer.parseInt(properties.getProperty("db_port")));
        sybDS.setServerName(properties.getProperty("db_host"));
        sybDS.setUser(properties.getProperty("db_username"));
        sybDS.setPassword(properties.getProperty("db_password"));
        sybDS.setDatabaseName(properties.getProperty("db_name"));
    } catch (Exception e) {
        e.printStackTrace();
    }
    return sybDS;
}
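On the jar question: the SybDataSource class used above is provided by Sybase's jConnect JDBC driver (jconn4.jar, package com.sybase.jdbc4.jdbc); SQL Anywhere 16 also ships its own driver jar (sajdbc4.jar). If you prefer not to depend on SybDataSource directly, a plain Spring DriverManagerDataSource works too. The sketch below is only an illustration under assumptions: the property keys, the Environment field, and the host/port values are not taken from the code above.

// import org.springframework.jdbc.datasource.DriverManagerDataSource;
@Bean
public DataSource sqlAnywhereDataSource() {
    // Simple, non-pooling DataSource wrapper from spring-jdbc
    DriverManagerDataSource ds = new DriverManagerDataSource();
    ds.setDriverClassName("com.sybase.jdbc4.jdbc.SybDriver");   // from jconn4.jar
    // jConnect URL format: jdbc:sybase:Tds:<host>:<port>/<database>
    ds.setUrl("jdbc:sybase:Tds:" + env.getProperty("db_host") + ":"
            + env.getProperty("db_port") + "/" + env.getProperty("db_name"));
    ds.setUsername(env.getProperty("db_username"));
    ds.setPassword(env.getProperty("db_password"));
    return ds;
}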

Related

Can I write sync code in RichAsyncFunction

When I need to work with I/O (querying a DB, calling a third-party API, ...), I can use RichAsyncFunction. But I need to interact with Google Sheets via the Google Sheets API: https://developers.google.com/sheets/api/quickstart/java. This API is synchronous. I wrote the code snippet below:
public class SendGGSheetFunction extends RichAsyncFunction<Obj, String> {
    @Override
    public void asyncInvoke(Obj message, final ResultFuture<String> resultFuture) {
        CompletableFuture.supplyAsync(() -> {
            syncSendToGGSheet(message);
            return "";
        }).thenAccept((String result) -> {
            resultFuture.complete(Collections.singleton(result));
        });
    }
}
But I found that messages are sent to Google Sheets very slowly; it seems they are sent synchronously.
Most of the code executed by users in AsyncIO is synchronous originally. You just need to ensure it is actually executed in a separate thread, most commonly via a (statically shared) ExecutorService.
private class SendGGSheetFunction extends RichAsyncFunction<Obj, String> {

    private transient ExecutorService executorService;

    @Override
    public void open(Configuration parameters) throws Exception {
        super.open(parameters);
        executorService = Executors.newFixedThreadPool(30);
    }

    @Override
    public void close() throws Exception {
        super.close();
        executorService.shutdownNow();
    }

    @Override
    public void asyncInvoke(final Obj message, final ResultFuture<String> resultFuture) {
        executorService.submit(() -> {
            try {
                // Note: ResultFuture#complete expects a Collection<String>; if
                // syncSendToGGSheet returns a single String, wrap it, e.g. with
                // Collections.singleton(...)
                resultFuture.complete(syncSendToGGSheet(message));
            } catch (SQLException e) {
                resultFuture.completeExceptionally(e);
            }
        });
    }
}
Here are some considerations on how to tune AsyncIO to increase throughput: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Flink-Async-IO-operator-tuning-micro-benchmarks-td35858.html
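For completeness, the throughput knobs discussed in that thread (timeout and capacity) are passed when the async function is attached to the stream. A minimal wiring sketch, with illustrative values and a placeholder source; neither the job class, the source, nor the chosen numbers come from the original code:

import java.util.concurrent.TimeUnit;
import org.apache.flink.streaming.api.datastream.AsyncDataStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SendGGSheetJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder source; in the real job the messages come from upstream operators.
        DataStream<Obj> messages = env.fromElements(new Obj(), new Obj());

        // timeout bounds how long a single request may take; capacity bounds how many
        // requests may be in flight at once -- both values here are assumptions.
        DataStream<String> results = AsyncDataStream.unorderedWait(
                messages,
                new SendGGSheetFunction(),   // assumes the function is accessible here
                30, TimeUnit.SECONDS,
                100);

        results.print();
        env.execute("send-to-google-sheet");
    }
}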

Flink -- get data from Cassandra as generic ResultSet and convert it to DataSet

I have a StreamExecutionEnvironment job that consumes simple CQL select queries from Kafka.
I try to handle these queries asynchronously using the following code:
public class GenericCassandraReader extends RichAsyncFunction {

    private static final Logger logger = LoggerFactory.getLogger(GenericCassandraReader.class);

    private ExecutorService executorService;
    private final Properties props;
    private Session client;

    public ExecutorService getExecutorService() {
        return executorService;
    }

    public GenericCassandraReader(Properties props, ExecutorService executorService) {
        super();
        this.props = props;
        this.executorService = executorService;
    }

    @Override
    public void open(Configuration parameters) throws Exception {
        client = Cluster.builder().addContactPoint(props.getProperty("cqlHost"))
                .withPort(Integer.parseInt(props.getProperty("cqlPort"))).build()
                .connect(props.getProperty("keyspace"));
    }

    @Override
    public void close() throws Exception {
        client.close();
        synchronized (GenericCassandraReader.class) {
            try {
                if (!getExecutorService().awaitTermination(1000, TimeUnit.MILLISECONDS)) {
                    getExecutorService().shutdownNow();
                }
            } catch (InterruptedException e) {
                getExecutorService().shutdownNow();
            }
        }
    }

    @Override
    public void asyncInvoke(final UserDefinedType input, final AsyncCollector<ResultSet> asyncCollector) throws Exception {
        getExecutorService().submit(new Runnable() {
            @Override
            public void run() {
                ListenableFuture<ResultSet> resultSetFuture = client.executeAsync(input.query);
                Futures.addCallback(resultSetFuture, new FutureCallback<ResultSet>() {
                    public void onSuccess(ResultSet resultSet) {
                        asyncCollector.collect(Collections.singleton(resultSet));
                    }
                    public void onFailure(Throwable t) {
                        asyncCollector.collect(t);
                    }
                });
            }
        });
    }
}
Each response of this code provides a Cassandra ResultSet with a different number of fields.
Any ideas for handling a Cassandra ResultSet in Flink, or should I use another technique to reach my goal?
Thanks in advance for any help!
The Cassandra ResultSet is not thread-safe. It is better to use the Flink Cassandra connector, or at least to write your implementation in a similar way.
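For the batch/DataSet side, the connector (flink-connector-cassandra) exposes a CassandraInputFormat that maps each row onto a typed Flink tuple, so the query needs a fixed column list rather than a generic ResultSet. A hedged sketch; the keyspace, table, columns, and contact point below are made up for illustration:

import com.datastax.driver.core.Cluster;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.batch.connectors.cassandra.CassandraInputFormat;
import org.apache.flink.streaming.connectors.cassandra.ClusterBuilder;

public class CassandraBatchRead {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        ClusterBuilder clusterBuilder = new ClusterBuilder() {
            @Override
            protected Cluster buildCluster(Cluster.Builder builder) {
                return builder.addContactPoint("127.0.0.1").withPort(9042).build();
            }
        };

        // Each row is deserialized into a typed tuple instead of a raw ResultSet.
        DataSet<Tuple2<String, Integer>> rows = env.createInput(
                new CassandraInputFormat<Tuple2<String, Integer>>(
                        "SELECT name, age FROM demo_keyspace.users;", clusterBuilder),
                TypeInformation.of(new TypeHint<Tuple2<String, Integer>>() {}));

        rows.print();
    }
}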

Process works where transform does not

I am trying to hit a REST endpoint in Camel, convert that data into a class (and, for simplicity and testing, convert that class into a JSON string), and make a POST to a local server. I can get it to do everything except make that final POST; it just seems to hang.
App:
@SpringBootApplication
public class App {

    /**
     * A main method to start this application.
     */
    public static void main(String[] args) {
        SpringApplication.run(App.class, args);
    }

    @Component
    public class RestTest extends RouteBuilder {
        @Override
        public void configure() throws Exception {
            restConfiguration().component("restlet").host("localhost").port(8000).bindingMode(RestBindingMode.json);
            rest("/test").enableCORS(true)
                .post("/post").type(User.class).to("direct:transform");
            from("direct:transform")
                .transform().method("Test", "alter")
                .to("http4:/localhost:8088/ws/v1/camel");
        }
    }
}
Bean:
#Component("Test")
public class Test {
public void alter (Exchange exchange) {
ObjectMapper mapper = new ObjectMapper();
User body = exchange.getIn().getBody(User.class);
try {
String jsonInString = mapper.writeValueAsString(body);
exchange.getOut().setHeader(Exchange.HTTP_METHOD, constant(HttpMethods.POST));
exchange.getOut().setHeader(Exchange.CONTENT_TYPE, MediaType.APPLICATION_JSON);
exchange.getOut().setBody(jsonInString);
} catch (JsonGenerationException e) {
e.printStackTrace();
} catch (JsonMappingException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
User:
public class User {

    @JsonProperty
    private String firstName;

    @JsonProperty
    private String lastName;

    public void setFirstName(String firstName) {
        this.firstName = firstName;
    }

    public String getFirstName() {
        return firstName;
    }

    public void setLastName(String lastName) {
        this.lastName = lastName;
    }

    public String getLastName() {
        return lastName;
    }
}
UPDATE
I was able to get it to work with process instead of transform, but it errors when a response is sent back to Camel from the POST:
from("direct:transform")
.process(new Processor() {
public void process(Exchange exchange) throws Exception {
ObjectMapper mapper = new ObjectMapper();
User body = exchange.getIn().getBody(User.class);
try {
String jsonInString = mapper.writeValueAsString(body);
exchange.getOut().setHeader(Exchange.HTTP_METHOD, constant(HttpMethods.POST));
exchange.getOut().setHeader(Exchange.CONTENT_TYPE, MediaType.APPLICATION_JSON);
exchange.getOut().setBody(jsonInString);
} catch (JsonGenerationException e) {
e.printStackTrace();
} catch (JsonMappingException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
})
.to("http4://0.0.0.0:8088/ws/v1/camel");
Error
com.fasterxml.jackson.databind.JsonMappingException: No serializer found for class org.apache.camel.converter.stream.CachedOutputStream$WrappedInputStream and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS)
at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:284)
at com.fasterxml.jackson.databind.SerializerProvider.mappingException(SerializerProvider.java:1110)
at com.fasterxml.jackson.databind.SerializerProvider.reportMappingProblem(SerializerProvider.java:1135)
at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.failForEmpty(UnknownSerializer.java:69)
at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.serialize(UnknownSerializer.java:32)
at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:292)
at com.fasterxml.jackson.databind.ObjectWriter$Prefetch.serialize(ObjectWriter.java:1429)
at com.fasterxml.jackson.databind.ObjectWriter._configAndWriteValue(ObjectWriter.java:1158)
at com.fasterxml.jackson.databind.ObjectWriter.writeValue(ObjectWriter.java:988)
at org.apache.camel.component.jackson.JacksonDataFormat.marshal(JacksonDataFormat.java:155)
at org.apache.camel.processor.MarshalProcessor.process(MarshalProcessor.java:69)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:109)
at org.apache.camel.processor.MarshalProcessor.process(MarshalProcessor.java:50)
at org.apache.camel.component.rest.RestConsumerBindingProcessor$RestConsumerBindingMarshalOnCompletion.onAfterRoute(RestConsumerBindingProcessor.java:363)
at org.apache.camel.util.UnitOfWorkHelper.afterRouteSynchronizations(UnitOfWorkHelper.java:154)
at org.apache.camel.impl.DefaultUnitOfWork.afterRoute(DefaultUnitOfWork.java:278)
at org.apache.camel.processor.CamelInternalProcessor$RouteLifecycleAdvice.after(CamelInternalProcessor.java:317)
at org.apache.camel.processor.CamelInternalProcessor$InternalCallback.done(CamelInternalProcessor.java:246)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:109)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:197)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:97)
at org.apache.camel.component.restlet.RestletConsumer$1.handle(RestletConsumer.java:68)
at org.apache.camel.component.restlet.MethodBasedRouter.handle(MethodBasedRouter.java:54)
at org.restlet.routing.Filter.doHandle(Filter.java:150)
at org.restlet.routing.Filter.handle(Filter.java:197)
at org.restlet.routing.Router.doHandle(Router.java:422)
at org.restlet.routing.Router.handle(Router.java:639)
at org.restlet.routing.Filter.doHandle(Filter.java:150)
at org.restlet.routing.Filter.handle(Filter.java:197)
at org.restlet.routing.Router.doHandle(Router.java:422)
at org.restlet.routing.Router.handle(Router.java:639)
at org.restlet.routing.Filter.doHandle(Filter.java:150)
at org.restlet.engine.application.StatusFilter.doHandle(StatusFilter.java:140)
at org.restlet.routing.Filter.handle(Filter.java:197)
at org.restlet.routing.Filter.doHandle(Filter.java:150)
at org.restlet.routing.Filter.handle(Filter.java:197)
at org.restlet.engine.CompositeHelper.handle(CompositeHelper.java:202)
at org.restlet.Component.handle(Component.java:408)
at org.restlet.Server.handle(Server.java:507)
at org.restlet.engine.connector.ServerHelper.handle(ServerHelper.java:63)
at org.restlet.engine.adapter.HttpServerHelper.handle(HttpServerHelper.java:143)
at org.restlet.engine.connector.HttpServerHelper$1.handle(HttpServerHelper.java:64)
at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
at sun.net.httpserver.AuthFilter.doFilter(AuthFilter.java:83)
at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:82)
at sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(ServerImpl.java:675)
at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
at sun.net.httpserver.ServerImpl$Exchange.run(ServerImpl.java:647)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
This leads to the question: what is the fundamental difference between process and transform?
A "Processor" in camel is the lowest level message processing primitive. Under the covers a transform definition is executed as a org.apache.camel.processor.TransformProcessor which is itself a processor. In fact mostly everything is a Processor under the hood so strictly anything you can accomplish with transform you can accomplish with a pure processor:
Your last error is because you need to unmarshal the output of your HTTP call to an object before it can be marshalled back to JSON using Jackson. Something like this:
from("direct:transform")
.transform().method("Test", "alter")
.to("http4:/localhost:8088/ws/v1/camel")
.unmarshal().json(JsonLibrary.Jackson, User.class);

CamelTestSupport read placeholders from yml file

I am trying to test my Camel routes using CamelTestSupport. I have my routes defined in a class like this:
public class ActiveMqConfig {
    @Bean
    public RoutesBuilder route() {
        return new SpringRouteBuilder() {
            @Override
            public void configure() throws Exception {
                from("activemq:{{push.queue.name}}").to("bean:PushEventHandler?method=handlePushEvent");
            }
        };
    }
}
And my test class looks like this:
@RunWith(SpringRunner.class)
public class AmqTest extends CamelTestSupport {

    @Override
    protected RoutesBuilder createRouteBuilder() throws Exception {
        return new ActiveMqConfig().route();
    }

    @Override
    protected Properties useOverridePropertiesWithPropertiesComponent() {
        Properties properties = new Properties();
        properties.put("pim2.push.queue.name", "pushevent");
        return properties;
    }

    protected Boolean ignoreMissingLocationWithPropertiesComponent() {
        return true;
    }

    @Mock
    private PushEventHandler pushEventHandler;

    @BeforeClass
    public static void setUpClass() throws Exception {
        BrokerService brokerSvc = new BrokerService();
        brokerSvc.setBrokerName("TestBroker");
        brokerSvc.addConnector("tcp://localhost:61616");
        brokerSvc.setPersistent(false);
        brokerSvc.setUseJmx(false);
        brokerSvc.start();
    }

    @Override
    protected JndiRegistry createRegistry() throws Exception {
        JndiRegistry jndi = super.createRegistry();
        MockitoAnnotations.initMocks(this);
        jndi.bind("pushEventHandler", pushEventHandler);
        return jndi;
    }

    @Test
    public void testConfigure() throws Exception {
        template.sendBody("activemq:pushevent", "HelloWorld!");
        Thread.sleep(2000);
        verify(pushEventHandler, times(1)).handlePushEvent(any());
    }
}
This is working perfectly fine, but I have to set the placeholder {{push.queue.name}} using the useOverridePropertiesWithPropertiesComponent method, and I want it to be read from my .yml file instead.
I am not able to do it. Can someone suggest how?
Thanks
Properties are typically read from .properties files, but you can write some code that reads the yaml file in the useOverridePropertiesWithPropertiesComponent method and puts the values into the Properties instance that is returned.
Thanks Claus.
I got it working by doing this:
@Override
protected Properties useOverridePropertiesWithPropertiesComponent() {
    YamlPropertySourceLoader loader = new YamlPropertySourceLoader();
    try {
        PropertySource<?> applicationYamlPropertySource = loader.load(
                "properties", new ClassPathResource("application.yml"), null);
        Map source = ((MapPropertySource) applicationYamlPropertySource).getSource();
        Properties properties = new Properties();
        properties.putAll(source);
        return properties;
    } catch (IOException e) {
        LOG.error("Config file cannot be found.");
    }
    return null;
}
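One note on that snippet: the three-argument load shown above is the Spring Boot 1.x YamlPropertySourceLoader API. In Spring Boot 2.x, load takes a name and a Resource and returns a list of property sources, so the equivalent would look roughly like this (a hedged sketch, not tested against a specific Boot version):

List<PropertySource<?>> sources = loader.load("properties", new ClassPathResource("application.yml"));
Properties properties = new Properties();
for (PropertySource<?> ps : sources) {
    // each loaded source is a MapPropertySource backed by the yaml keys/values
    properties.putAll(((MapPropertySource) ps).getSource());
}
return properties;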

Bidirectional communication from Camel to a Vert.x socket server

I am trying to use Camel NettyComponent to communicate with a SocketServer written in Vert.x.
This is my server code:
public class NettyExampleServer {

    public final Vertx vertx;
    public static final Logger logger = LoggerFactory.getLogger(NettyExampleServer.class);
    public static int LISTENING_PORT = 15692;

    public NettyExampleServer(Vertx vertx) {
        this.vertx = vertx;
    }

    private NetServer netServer;
    private List<String> remoteAddresses = new CopyOnWriteArrayList<String>();
    private final AtomicInteger disconnections = new AtomicInteger();

    public int getDisconnections() {
        return disconnections.get();
    }

    public List<String> getRemoteAddresses() {
        return Collections.unmodifiableList(remoteAddresses);
    }

    public void run() {
        netServer = vertx.createNetServer();
        netServer.connectHandler(new Handler<NetSocket>() {
            @Override
            public void handle(final NetSocket socket) {
                remoteAddresses.add(socket.remoteAddress().toString());
                socket.closeHandler(new Handler<Void>() {
                    @Override
                    public void handle(Void event) {
                        disconnections.incrementAndGet();
                    }
                });
                socket.dataHandler(new Handler<Buffer>() {
                    @Override
                    public void handle(Buffer event) {
                        logger.info("I received {}", event);
                        socket.write("I am answering");
                    }
                });
            }
        });
        netServer.listen(LISTENING_PORT);
    }

    public void stop() {
        netServer.close();
    }
}
I tried to build a Route like the following:
public class NettyRouteBuilder extends RouteBuilder {

    public static final String PRODUCER_BUS_NAME = "producerBus";
    public static final String CONSUMER_BUS_NAME = "receiverBus";

    private Processor processor = new Processor() {
        @Override
        public void process(Exchange exchange) throws Exception {
            exchange.setPattern(ExchangePattern.InOut);
        }
    };

    @Override
    public void configure() throws Exception {
        from("vertx:" + PRODUCER_BUS_NAME)
            .process(processor)
            .to("netty:tcp://localhost:" + NettyExampleServer.LISTENING_PORT
                + "?textline=true&lazyChannelCreation=true&option.child.keepAlive=true")
            .to("vertx:" + CONSUMER_BUS_NAME);
    }
}
My tests show that:
If I eliminate the processor from the route, the delivery succeeds but there is no answer from the server.
If I keep the processor, the data is delivered to the server but an exception is raised because no data is received.
I have created a small project here: https://github.com/edmondo1984/netty-camel-vertx . How do I use the Camel Netty component to create a bidirectional route?
To communicate between Vert.x and Camel, the best tool is one of these:
Camel Vert.x endpoint
Vert.x Camel connector
You can find an example here
If you cannot, or have other requirements, it is also possible to use a common connector like Netty on both sides.
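If Netty stays on the Camel side, the request/reply behaviour the processor tries to force can also be expressed on the endpoint itself. A hedged sketch based on the route above; the sync option is camel-netty's request/response toggle, and the remaining options from the original URI are simply omitted here for brevity:

from("vertx:" + PRODUCER_BUS_NAME)
    // sync=true makes the TCP call request/reply, so the server's answer
    // becomes the message body forwarded to the consumer bus
    .to("netty:tcp://localhost:" + NettyExampleServer.LISTENING_PORT
        + "?textline=true&sync=true")
    .to("vertx:" + CONSUMER_BUS_NAME);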
