Hystrix dashboard stuck on loading screen - hystrix

I have the Hystrix dashboard running on localhost:8988/hystrix and I want to monitor a request between OrderService and ProductService. The endpoint "hystrix.stream" is already registered, but the Hystrix dashboard gets stuck on the loading screen without showing any results.
This is the service client that calls ProductService, which I want to monitor:
@Service
public class ProductServiceClient {
private final RestTemplate restTemplate;
public ProductServiceClient(RestTemplate restTemplate) {
this.restTemplate = restTemplate;
}
@HystrixCommand(fallbackMethod = "getDefaultProductById")
public Optional<ProductDto> getProductById(Long productId) {
ResponseEntity<ProductDto> productResponse = restTemplate
.getForEntity("http://product-service/api/product/{id}",
ProductDto.class,
productId);
if (productResponse.getStatusCode() == HttpStatus.OK) {
return Optional.ofNullable(productResponse.getBody());
} else {
log.error("Unable to get product with ID: " + productId
+ ", StatusCode: " + productResponse.getStatusCode());
return Optional.empty();
}
}
Optional<ProductDto> getDefaultProductById(String productId) {
log.info("Returning default ProductById for product Id: " + productId);
ProductDto productDto = new ProductDto();
productDto.setId(productId);
productDto.setName("UNKNOWN");
productDto.setDescription("NONE");
return Optional.ofNullable(productDto);
}
}
I added the @EnableCircuitBreaker annotation to the main class (a sketch of that class follows the dependency list), and I use these dependencies:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-config</artifactId>
<version>2.1.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
<version>2.1.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-netflix-hystrix</artifactId>
<version>2.1.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-openfeign</artifactId>
<version>2.1.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-sleuth</artifactId>
<version>2.1.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-zipkin</artifactId>
<version>2.1.1.RELEASE</version>
</dependency>
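For reference, a minimal sketch of what that main class could look like; the class name and the load-balanced RestTemplate bean are assumptions (suggested by the eureka-client dependency and the http://product-service URL), not code from the original post:
package com.example.order;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.client.circuitbreaker.EnableCircuitBreaker;
import org.springframework.cloud.client.loadbalancer.LoadBalanced;
import org.springframework.context.annotation.Bean;
import org.springframework.web.client.RestTemplate;

// Hypothetical main class; the original post only states that
// @EnableCircuitBreaker was added to it.
@SpringBootApplication
@EnableCircuitBreaker
public class OrderServiceApplication {

    // A load-balanced RestTemplate so that "http://product-service/..."
    // is resolved through Eureka (an assumption based on the eureka-client dependency).
    @Bean
    @LoadBalanced
    public RestTemplate restTemplate() {
        return new RestTemplate();
    }

    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }
}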

Issue
In my case, the error occurs after upgrading spring-cloud to Hoxton.SR6.
In the browser console, I get the following error:
Uncaught TypeError: e.indexOf is not a function
which seems to be a jQuery version issue.
Solution
Downgrade the spring-cloud version to Hoxton.SR4 and the error is gone.
So you might need to check your browser's console output to see whether there are errors.
Tips
Another possible reason is that you haven't yet called any of your application's API methods marked with @HystrixCommand.
In that case, make a call to the API and the dashboard will show the charts.

Try sending some requests by calling the controller.
If there are no requests, the dashboard will keep loading.
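For illustration, a minimal controller sketch like the one below (class name, mapping, and wiring are assumptions, not taken from the original post) can be used to generate such calls, so that metrics start flowing to hystrix.stream:
package com.example.order;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller used only to generate traffic for the dashboard.
@RestController
public class ProductLookupController {

    private final ProductServiceClient productServiceClient;

    public ProductLookupController(ProductServiceClient productServiceClient) {
        this.productServiceClient = productServiceClient;
    }

    // Calling GET /products/{id} executes the @HystrixCommand method,
    // which is what makes data appear on the Hystrix dashboard.
    @GetMapping("/products/{id}")
    public ProductDto getProduct(@PathVariable Long id) {
        return productServiceClient.getProductById(id).orElse(null);
    }
}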

Related

Handle multipart/form-data with Quarkus

I'm facing an issue: I can't get my form data in my resource, the variables are always null.
My resource:
@POST
@Path("/upload-logo")
@Consumes(MediaType.MULTIPART_FORM_DATA)
@Produces(MediaType.TEXT_PLAIN)
public String uploadLogo(@MultipartForm LogoMultipartForm logoMultipartForm) throws IOException {
return this.companyService.uploadLogo(username, logoMultipartForm.logo);
}
The form model
public class LogoMultipartForm {
#FormParam("logo")
public byte[] logo;
#FormParam("filename")
#PartType("text/plain")
public String fileName;
}
My fetch request:
uploadLogo: async (file: File) => {
const form = new FormData();
form.append("logo", file, "logo.png");
form.append("filename", "test");
const { query, abort } = HttpClient.POST(`${COMPANY_URL}/upload-logo`, form);
let promise = query
.then((res: any) => {
console.log("Response", res);
if (res.status === 200) {
return res.text();
} else {
throw res;
}
})
.then((url: any) => url);
promise.cancel = abort;
return promise;
},
And my HttpClient:
POST: function (url: string, body: any, config?: any) {
const controller = new AbortController();
const signal = controller.signal;
return { query: fetch(url, { signal, method: "POST", body, ...config }) as any, abort: () => controller.abort() };
},
To be sure, I tested with a proxy and the request is effectively well-formed, yet the fileName and logo variables are always null.
This is my pom.xml:
<dependencies>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-junit5</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.rest-assured</groupId>
<artifactId>rest-assured</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-smallrye-graphql</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-flyway</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-jdbc-postgresql</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-hibernate-orm-panache</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-reactive-pg-client</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy-qute</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-oidc</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-rest-client</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy-multipart</artifactId>
</dependency>
</dependencies>
I don't see where the problem is.
I have already read the official RESTEasy documentation for multipart and still don't see the problem. I have also tested with MultipartFormDataInput and all parts are empty.
Thanks in advance for your help! :)
You are missing the dependency:
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-multipart-provider</artifactId>
</dependency>
Be sure that your @PartType("text/plain") annotation comes from the org.jboss.resteasy package, as does @MultipartForm (import org.jboss.resteasy.annotations.providers.multipart.MultipartForm;).
There is a good example of how it works in the Quarkus tutorial. Look at the packages!
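To make the imports explicit, the form class from the question might end up looking like the sketch below (field names are from the question; the key point is where the annotations come from, and the package name is a placeholder):
package com.example.form;

import javax.ws.rs.FormParam;

// Must come from the org.jboss.resteasy package:
import org.jboss.resteasy.annotations.providers.multipart.PartType;

public class LogoMultipartForm {

    @FormParam("logo")
    public byte[] logo;

    @FormParam("filename")
    @PartType("text/plain")
    public String fileName;
}
Likewise, the @MultipartForm annotation on the resource method parameter must be org.jboss.resteasy.annotations.providers.multipart.MultipartForm, not another class with the same simple name.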
I spent hours figuring out how to send multipart data from one client to another, so I hope this example helps you.
RestClient
@RegisterRestClient(configKey = "ted-api")
@RegisterClientHeaders(ClientHeaderFactory.class)
@RegisterProvider(value = ExceptionMapper.class)
@ApplicationScoped
public interface TedClient {
@POST
@Consumes(MediaType.MULTIPART_FORM_DATA)
@Path("/notices/submit")
Response sendData(@MultipartForm TedBody data);
}
Body
@Getter
@Setter
@NoArgsConstructor
public class TedBody {
@FormParam("notice")
@PartFilename("notice-filename.xml") // @PartFilename is required for multipart uploads; many developers miss it
@PartType(MediaType.TEXT_XML)
private byte[] notice;
@FormParam("metadata")
@PartType(MediaType.APPLICATION_JSON)
private AuthorData authorData;
public TedBody(byte[] notice) {
this.notice = notice;
this.authorData = new AuthorData();
}
}
In case your system needs to receive a multipart file instead, the example may be even simpler.
Rest
@POST
@Path("/upload")
@Consumes(MULTIPART_FORM_DATA)
@Produces(APPLICATION_JSON)
public Response saveMultipartNoticeContainer(
@MultipartForm @Valid NoticeRequestDto requestDto) {
var noticeDto = noticeService.save(requestDto);
return Response.accepted().entity(noticeDto).build();
}
import org.jboss.resteasy.annotations.providers.multipart.PartType;
@Builder
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
@JsonAutoDetect(fieldVisibility = JsonAutoDetect.Visibility.ANY)
public class NoticeRequestDto {
@FormParam("notice")
@PartType(MediaType.APPLICATION_OCTET_STREAM)
private InputStream data;
}
This works for us on Quarkus 2.15.3.Final.

flink ClassNotFoundException ProcessFunction

I have a Flink demo that finds the values of a column of one DataSet that are not in another DataSet. I wrote it with Flink SQL. The code seems fine, but it does not work.
The versions I use are:
flink.version: 1.7.1
java.version: 1.8
scala.binary.version: 2.12
This is my Flink demo:
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.operators.DataSource;
import org.apache.flink.api.java.tuple.Tuple1;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.BatchTableEnvironment;
import org.apache.flink.types.Row;
import java.util.ArrayList;
import java.util.List;
public class TestUnScoreItem {
public static void main(String[] args) throws Exception {
final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
List<Tuple3<String, String, Integer>> leftList = new ArrayList<>();
leftList.add(new Tuple3<>("U1", "Item1", 4));
leftList.add(new Tuple3<>("U1", "Item3", 7));
leftList.add(new Tuple3<>("U1", "Item5", 2));
leftList.add(new Tuple3<>("U2", "Item2", 9));
leftList.add(new Tuple3<>("U2", "Item3", 3));
leftList.add(new Tuple3<>("U3", "Item1", 3));
List<Tuple1<String>> rightList = new ArrayList<>();
rightList.add(new Tuple1<>("Item1"));
rightList.add(new Tuple1<>("Item2"));
rightList.add(new Tuple1<>("Item3"));
rightList.add(new Tuple1<>("Item4"));
rightList.add(new Tuple1<>("Item5"));
DataSource<Tuple3<String, String, Integer>> userScoreSet = env.fromCollection(leftList);
DataSource<Tuple1<String>> allItemSet = env.fromCollection(rightList);
BatchTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);
tableEnv.registerDataSet("userScoreTable", userScoreSet, "user,item,score");
tableEnv.registerDataSet("allItemTable", allItemSet, "item2");
Table unScoreTable = tableEnv.sqlQuery("select user, item from userScoreTable where item not in (select item2 from allItemTable) ");
DataSet<Row> result = tableEnv.toDataSet(unScoreTable, Row.class);
result.print();
}
}
And I get this exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/streaming/api/functions/ProcessFunction
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.flink.table.plan.nodes.dataset.DataSetAggregate.translateToPlan(DataSetAggregate.scala:107)
at org.apache.flink.table.plan.nodes.dataset.DataSetSingleRowJoin.translateToPlan(DataSetSingleRowJoin.scala:99)
at org.apache.flink.table.plan.nodes.dataset.DataSetCalc.translateToPlan(DataSetCalc.scala:91)
at org.apache.flink.table.plan.nodes.dataset.DataSetJoin.translateToPlan(DataSetJoin.scala:165)
at org.apache.flink.table.plan.nodes.dataset.DataSetCalc.translateToPlan(DataSetCalc.scala:91)
at org.apache.flink.table.api.BatchTableEnvironment.translate(BatchTableEnvironment.scala:498)
at org.apache.flink.table.api.BatchTableEnvironment.translate(BatchTableEnvironment.scala:476)
at org.apache.flink.table.api.java.BatchTableEnvironment.toDataSet(BatchTableEnvironment.scala:147)
at com.jychan.easycode.recommend.training.TestUnScoreItem.main(TestUnScoreItem.java:65)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.api.functions.ProcessFunction
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 21 more
Process finished with exit code 1
Does someone know how to fix it? Or is there another way to get the same result? Thanks!
Add these dependencies:
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.7.1</flink.version>
<java.version>1.8</java.version>
<scala.binary.version>2.12</scala.binary.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-scala_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-common</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.7</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
<scope>runtime</scope>
</dependency>
</dependencies>
Oh, I found what's wrong:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
remove the "provided" it work. thanks all.

Why Camel SCR deprecated?

I was looking into camel-scr and in its pom.xml I saw:
<artifactId>camel-scr</artifactId>
<name>Camel :: SCR (deprecated)</name>
<description>Camel with OSGi SCR (Declarative Services)</description>
Why is this deprecated? What alternative will the community use in the future?
My guess is that it was simply too complex with all the annotations and properties, and hence probably didn't get much use compared to the much simpler OSGi blueprints.
Using Apache Camel with Declarative Services (SCR) is pretty straightforward with the help of OsgiDefaultCamelContext. You can create the context manually, add routes and configuration, and register it with OSGi using the bundleContext.registerService method.
Example:
package com.example;
import java.util.Dictionary;
import java.util.Hashtable;
import java.util.Map;
import java.util.Properties;
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.core.osgi.OsgiDefaultCamelContext;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Deactivate;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@Component(
immediate = true
)
public class OsgiDSCamelContextComponent {
private static final Logger LOGGER = LoggerFactory.getLogger(OsgiDSCamelContextComponent.class);
CamelContext camelContext;
ServiceRegistration<CamelContext> camelContextRegistration;
@Activate
public void onActivate(BundleContext bundleContext, Map<String, ?> configs){
// Create new OsgiDefaultCamelContext with injected bundleContext
OsgiDefaultCamelContext newCamelContext = new OsgiDefaultCamelContext(bundleContext);
newCamelContext.setName("OsgiDSCamelContext");
// Add configs from com.example.OsgiDSCamelContextComponent.cfg
// available for use with property placeholders
Properties properties = new Properties();
properties.putAll(configs);
newCamelContext.getPropertiesComponent()
.setInitialProperties(properties);
camelContext = newCamelContext;
try {
// In Apache Camel 3.x CamelContext needs to be started before adding RouteBuilders.
camelContext.start();
camelContext.addRoutes(new RouteBuilder() {
@Override
public void configure() throws Exception {
from("timer:exampleTimer?period=3000")
.routeId("exampleTimer")
.log("Hello from Camel using Declarative services");
}
});
//Create dictionary holding properties for the CamelContext service.
Dictionary<String, Object> serviceProperties = new Hashtable<>();
serviceProperties.put("context.name", "OsgiDSCamelContext");
serviceProperties.put("some.property", "SomeValue");
// Register the new CamelContext instance as a service to Karaf with given properties
camelContextRegistration = bundleContext.registerService(CamelContext.class,
camelContext, serviceProperties);
} catch (Exception e) {
LOGGER.error(e.getMessage(), e);
}
}
@Deactivate
public void onDeactivate(){
// Stop camel context when bundle is stopped
if(camelContext != null){
camelContext.stop();
}
// unregister camel context service when bundle is stopped
if(camelContextRegistration != null){
camelContextRegistration.unregister();
}
}
}
Now you could also use DS service components to register RouteBuilder service(s) and inject them into the CamelContext component using the @Reference annotation and a List<RouteBuilder>.
package com.example.routes;
import org.apache.camel.builder.RouteBuilder;
import org.osgi.service.component.annotations.Component;
@Component(
immediate = true,
property = {
"target.context=exampleContext"
},
service = RouteBuilder.class
)
public class ExampleRouteBuilderService extends RouteBuilder {
@Override
public void configure() throws Exception {
from("timer:exampleTimer?period=3000")
.routeId("exampleTimer")
.log("Hello from Camel using Declarative services");
}
}
@Reference(
target = "(target.context=exampleContext)",
cardinality = ReferenceCardinality.AT_LEAST_ONE,
policyOption = ReferencePolicyOption.GREEDY
)
List<RouteBuilder> routeBuilders;
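A possible sketch (not from the original answer) of a component that pulls in those RouteBuilder services together with the CamelContext registered above and adds the routes; the component name is hypothetical, and the target filter reuses the "context.name" property set when the context was registered:
package com.example;

import java.util.List;

import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.component.annotations.ReferenceCardinality;
import org.osgi.service.component.annotations.ReferencePolicyOption;

// Hypothetical component: collects RouteBuilder services and adds them
// to the already created and started CamelContext service.
@Component(immediate = true)
public class RouteBuilderCollector {

    @Reference(
        target = "(target.context=exampleContext)",
        cardinality = ReferenceCardinality.AT_LEAST_ONE,
        policyOption = ReferencePolicyOption.GREEDY
    )
    List<RouteBuilder> routeBuilders;

    // Matches the "context.name" service property set in the activator above.
    @Reference(target = "(context.name=OsgiDSCamelContext)")
    CamelContext camelContext;

    @Activate
    public void onActivate() throws Exception {
        for (RouteBuilder routeBuilder : routeBuilders) {
            camelContext.addRoutes(routeBuilder);
        }
    }
}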
Just be extra careful when using more advanced options like @Modified or policy = ReferencePolicy.DYNAMIC, as these can prevent the context from being recreated when the configuration changes or the list is modified. This can lead to issues like routes getting added twice.
Dependencies for OSGI R6
<dependencies>
<!-- OSGI -->
<dependency>
<groupId>org.osgi</groupId>
<artifactId>osgi.core</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>osgi.annotation</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>osgi.cmpn</artifactId>
<scope>provided</scope>
</dependency>
<!-- Camel -->
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>${camel.version}</version>
</dependency>
<dependency>
<groupId>org.apache.camel.karaf</groupId>
<artifactId>camel-core-osgi</artifactId>
<version>${camel.version}</version>
</dependency>
</dependencies>
Dependencies for OSGI R8
<dependencies>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>osgi.core</artifactId>
<version>${osgi.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>org.osgi.service.component.annotations</artifactId>
<version>1.4.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>org.osgi.service.metatype.annotations</artifactId>
<version>1.4.0</version>
<scope>provided</scope>
</dependency>
<!-- Camel -->
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>${camel.version}</version>
</dependency>
<dependency>
<groupId>org.apache.camel.karaf</groupId>
<artifactId>camel-core-osgi</artifactId>
<version>${camel.version}</version>
</dependency>
</dependencies>
There is no SCR support out of the box; we'll support only OSGi Blueprint.
You'll need to build your own SCR support or reuse the camel-scr code.

I am getting a 404 Not Found error when trying to hit a Jersey 2 service

I am creating a Jersey 2.x service using Servlet 3.1, without any web.xml.
My pom is given below:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com</groupId>
<artifactId>testex</artifactId>
<packaging>war</packaging>
<version>0.0.1-SNAPSHOT</version>
<name>testex Maven Webapp</name>
<url>http://maven.apache.org</url>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.6.2</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-client</artifactId>
<version>2.22.2</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-server</artifactId>
<version>2.22.2</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>3.1.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.4</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.6</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
<finalName>testex</finalName>
</build>
</project>
I have configured the service using the @ApplicationPath resource config given below:
package com.testex.service;
import javax.ws.rs.ApplicationPath;
import org.glassfish.jersey.server.ResourceConfig;
#ApplicationPath("resources")
public class TestexResourceConfig extends ResourceConfig {
public TestexResourceConfig() {
super();
registerClasses(TestexService.class);
}
}
And my service is given below:
package com.testex.service;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import com.testex.config.Config;
import com.testex.config.ConfigLoader;
import com.testex.provider.ProjectDetailsProvider;
import com.testex.provider.VersionsProvider;
import com.testex.provider.bean.Project;
import com.google.gson.Gson;
#Path("/rest")
public class TestexService {
private final Config config;
public TestexService() {
super();
config = ConfigLoader.getConfig();
}
@GET
@Path("getDefaultProjectId")
@Produces(MediaType.APPLICATION_JSON)
public Response getDefaultProjectId() {
return Response.status(200).entity(new Gson().toJson(new Project(String.valueOf(config.getProjectId()))))
.build();
}
@GET
@Path("getProjectDetails")
@Produces(MediaType.APPLICATION_JSON)
public Response getProjectDetails(#QueryParam("projectId") int projectId) {
return Response.status(200).entity(new Gson().toJson(new ProjectDetailsProvider().getProjectDetails(projectId)))
.build();
}
@GET
@Path("getProjectVersionDetails")
@Produces(MediaType.APPLICATION_JSON)
public Response getProjectVersionDetails(#QueryParam("projectId") int projectId) {
return Response.status(200).entity(new Gson().toJson(new VersionsProvider().getVersionsForProject(projectId)))
.build();
}
}
On deploying it to Tomcat 8 under the context root "testex", I hit the URL below:
http://localhost:8080/testex/resources/rest/getDefaultProjectId
But I get a 404 Not Found error. Please help.
Well, I figured it out. All I needed to do was add the dependency below to my pom, and everything works like a charm now. Without the jersey-container-servlet module there is no ServletContainerInitializer to pick up the @ApplicationPath-annotated ResourceConfig on a Servlet 3.x container, so the application is never deployed and every request returns 404:
<dependency>
<groupId>org.glassfish.jersey.containers</groupId>
<artifactId>jersey-container-servlet</artifactId>
<version>2.22.2</version>
</dependency>

Splitting database column into multivalued Solr field

I'm going nuts trying to figure out how to get the Data Import Handler's splitBy construct to work. I was expecting it to split the input column into a multivalued field. Here's a test case to reproduce the problem:
import java.io.File;
import java.io.IOException;
import java.sql.SQLException;
import static org.junit.Assert.*;
import javax.sql.DataSource;
import org.apache.commons.dbutils.QueryRunner;
import org.apache.commons.io.FileUtils;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.core.CoreContainer;
import org.hsqldb.jdbc.JDBCDataSource;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
public class TestSplitBy {
SolrServer server;
File configPath = new File(FileUtils.getTempDirectory(), Long.toString(System.nanoTime()));
String solrconfig_xml = "<config><luceneMatchVersion>LUCENE_41</luceneMatchVersion><requestHandler name=\"search\" class=\"solr.SearchHandler\" default=\"true\"><lst name=\"defaults\"><str name=\"fl\">*</str><str name=\"df\">id</str></lst></requestHandler><requestHandler name=\"/dataimport\" class=\"org.apache.solr.handler.dataimport.DataImportHandler\"><lst name=\"defaults\"><str name=\"config\">data-config.xml</str></lst></requestHandler></config>";
String data_config_xml = "<dataConfig>" +
"<dataSource url=\"jdbc:hsqldb:mem:testdb\" user=\"SA\" driver=\"org.hsqldb.jdbc.JDBCDriver\" />" +
"<document>" +
"<entity name=\"item\" transformer=\"RegexTransformer\" query=\"SELECT * FROM test\">" +
"<field column=\"type\" name=\"type\" splitBy=\",\" />" +
"</entity>" +
"</document>" +
"</dataConfig>";
String schema_xml = "<schema version=\"1.3\" name=\"test\">" +
"<types>" +
"<fieldType name=\"string\" class=\"solr.StrField\" sortMissingLast=\"true\" omitNorms=\"true\" />" +
"</types>" +
"<fields>" +
"<field stored=\"true\" name=\"id\" type=\"string\" />" +
"<field stored=\"true\" name=\"type\" type=\"string\" multiValued=\"true\"/>" +
"</fields>" +
"<uniqueKey>id</uniqueKey>" +
"</schema>";
DataSource getDataSource() {
JDBCDataSource ds = new JDBCDataSource();
ds.setUser("SA");
ds.setUrl("mem:testdb");
return ds;
}
void populateDb(DataSource ds) {
QueryRunner runner = new QueryRunner(ds);
try {
runner.update("DROP TABLE test IF EXISTS");
runner.update("CREATE TABLE test(id INTEGER, type VARCHAR(256));");
runner.update("INSERT INTO test VALUES 1, 'foo,bar,baz'");
} catch (SQLException e) {
System.err.println(e);
}
}
void writeSolrConfig() throws IOException {
File corePath = new File(configPath, "collection1");
corePath.mkdir();
File confPath = new File(corePath, "conf");
confPath.mkdir();
FileUtils.write(new File(confPath, "data-config.xml"), data_config_xml);
FileUtils.write(new File(confPath, "schema.xml"), schema_xml);
FileUtils.write(new File(confPath, "solrconfig.xml"), solrconfig_xml);
}
void startSolr() {
System.setProperty("solr.solr.home", configPath.getAbsolutePath());
CoreContainer.Initializer initializer = new CoreContainer.Initializer();
CoreContainer coreContainer = initializer.initialize();
server = new EmbeddedSolrServer(coreContainer, "collection1");
}
@Before
public void setup() throws IOException {
populateDb(getDataSource());
writeSolrConfig();
startSolr();
}
@After
public void tearDown() {
server.shutdown();
FileUtils.deleteQuietly(configPath);
}
@Test
public void testSplitBy() throws Exception {
SolrQuery query = new SolrQuery();
query.set("qt", "/dataimport");
query.setParam("command", "full-import");
QueryResponse response = server.query(query);
Thread.sleep(500);
response = server.query(new SolrQuery("*:*"));
for (SolrDocument doc: response.getResults()) {
assertNotNull(doc.getFieldValues("type"));
assertEquals(3, doc.getFieldValues("type").size());
}
}
}
And the POM for the test case:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>solr</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>Solr Sanity</name>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr</artifactId>
<version>4.1.0</version>
<type>war</type>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-dataimporthandler</artifactId>
<version>4.1.0</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-solrj</artifactId>
<version>4.1.0</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>commons-dbutils</groupId>
<artifactId>commons-dbutils</artifactId>
<version>1.5</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>2.2.9</version>
<type>jar</type>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
</dependency>
</dependencies>
</project>
Any insight on how to get those types to split correctly into multiple values?
Turns out there were a couple of issues with this unit test:
HSQL's column names are case sensitive (and default to upper case).
If the Solr field name and the DB column name are identical, an extra token containing the entire DB value is also added.
The field definition should look like:
<field column="solrField" splitBy="," sourceColName="TYPE" />
And in general, when using the RegexTransformer to mix single-valued fields from a DB with multivalued fields:
If using splitBy, the column attribute is the name of the Solr field and sourceColName is the database column.
If not using splitBy, the column attribute is the database column name and the name attribute is the Solr field.
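Applied to the test case above, the corrected entity definition inside data_config_xml could look roughly like the sketch below (it keeps the multivalued "type" field from schema.xml and relies on HSQL upper-casing the column name to TYPE):
String data_config_xml = "<dataConfig>" +
    "<dataSource url=\"jdbc:hsqldb:mem:testdb\" user=\"SA\" driver=\"org.hsqldb.jdbc.JDBCDriver\" />" +
    "<document>" +
    "<entity name=\"item\" transformer=\"RegexTransformer\" query=\"SELECT * FROM test\">" +
    // column names the Solr field; sourceColName names the (upper-case) DB column
    "<field column=\"type\" splitBy=\",\" sourceColName=\"TYPE\" />" +
    "</entity>" +
    "</document>" +
    "</dataConfig>";
If the identical-name caveat above still produces a duplicate token (the comparison may not be case sensitive), rename the Solr field as in the answer's column="solrField" example and add a matching field to schema.xml.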
