I am trying to use UNION to collect all the unique identifiers (jcr:uuid) of multiple node types, but I get an InvalidQueryException, even though the individual SELECT statements succeed on their own.
Jackrabbit version: 2.12.1
SELECT [jcr:uuid] FROM [base:nodeType1]
UNION
SELECT [jcr:uuid] FROM [base:nodeType2]
Stack trace:
SELECT [jcr:uuid] FROM [base:nodeType1]
UNION(*)SELECT [jcr:uuid] FROM [base:nodeType2]; expected: <end>
at org.apache.jackrabbit.rmi.server.ServerObject.getRepositoryException(ServerObject.java:113)
at org.apache.jackrabbit.rmi.server.ServerQueryManager.createQuery(ServerQueryManager.java:74)
at sun.reflect.GeneratedMethodAccessor2078.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:323)
at sun.rmi.transport.Transport$1.run(Transport.java:200)
at sun.rmi.transport.Transport$1.run(Transport.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:196)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:568)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:826)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:683)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:682)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at sun.rmi.transport.StreamRemoteCall.exceptionReceivedFromServer(StreamRemoteCall.java:276)
at sun.rmi.transport.StreamRemoteCall.executeCall(StreamRemoteCall.java:253)
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:162)
at org.apache.jackrabbit.rmi.server.ServerQueryManager_Stub.createQuery(Unknown Source)
at org.apache.jackrabbit.rmi.client.ClientQueryManager.createQuery(ClientQueryManager.java:66)
Your query contains more than one selector, so you have to give each selector a name in your query. The name is only optional when there is exactly one selector.
Try this:
SELECT [jcr:uuid] FROM [base:nodeType1] AS a
UNION
SELECT [jcr:uuid] FROM [base:nodeType2] AS a
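For completeness, here is a minimal sketch of executing the corrected statement through the standard JCR API. The Session setup, the class name, and the exact result column name are assumptions, not taken from the question:

import javax.jcr.Session;
import javax.jcr.query.Query;
import javax.jcr.query.QueryManager;
import javax.jcr.query.QueryResult;
import javax.jcr.query.Row;
import javax.jcr.query.RowIterator;

public class UuidUnionExample {

    // Assumes an already-authenticated Session; prints the jcr:uuid of every matching node.
    public static void printUuids(Session session) throws Exception {
        String statement =
                "SELECT [jcr:uuid] FROM [base:nodeType1] AS a "
              + "UNION "
              + "SELECT [jcr:uuid] FROM [base:nodeType2] AS a";

        QueryManager queryManager = session.getWorkspace().getQueryManager();
        Query query = queryManager.createQuery(statement, Query.JCR_SQL2);
        QueryResult result = query.execute();

        for (RowIterator rows = result.getRows(); rows.hasNext(); ) {
            Row row = rows.nextRow();
            // Depending on the repository, the column may be exposed as "jcr:uuid" or "a.jcr:uuid".
            System.out.println(row.getValue("jcr:uuid").getString());
        }
    }
}

Each branch of the UNION has exactly one selector, which is why the same alias can be reused on both sides.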
Flink version: 1.15.3, JDK version: 1.8
My conf/flink-conf.yaml file:
jobmanager.rpc.address: test1
jobmanager.rpc.port: 6123
jobmanager.bind-host: test1
taskmanager.bind-host: test1
taskmanager.host: test1
taskmanager.numberOfTaskSlots: 2
parallelism.default: 2
I start a local cluster:
./bin/start-cluster.sh
Then I start the SQL Client CLI by calling:
./bin/sql-client.sh embedded
Then I enter the simple query below and press Enter to execute it:
SET 'sql-client.execution.result-mode' = 'tableau';
SET 'execution.runtime-mode' = 'batch';
SELECT
name,
COUNT(*) AS cnt
FROM
(VALUES ('Bob'), ('Alice'), ('Greg'), ('Bob')) AS NameTable(name)
GROUP BY name;
I get this error:
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.runtime.rest.util.RestClientException: [org.apache.flink.runtime.rest.handler.RestHandlerException: Could not upload job files.
at org.apache.flink.runtime.rest.handler.job.JobSubmitHandler.lambda$uploadJobGraphFiles$4(JobSubmitHandler.java:201)
at java.util.concurrent.CompletableFuture.biApply(CompletableFuture.java:1105)
at java.util.concurrent.CompletableFuture$BiApply.tryFire(CompletableFuture.java:1070)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1595)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.util.FlinkException: Could not upload job files.
at org.apache.flink.runtime.client.ClientUtils.uploadJobGraphFiles(ClientUtils.java:86)
at org.apache.flink.runtime.rest.handler.job.JobSubmitHandler.lambda$uploadJobGraphFiles$4(JobSubmitHandler.java:195)
... 11 more
Caused by: java.io.IOException: Could not connect to BlobServer at address localhost/127.0.0.1:35138
at org.apache.flink.runtime.blob.BlobClient.<init>(BlobClient.java:103)
at org.apache.flink.runtime.rest.handler.job.JobSubmitHandler.lambda$null$3(JobSubmitHandler.java:199)
at org.apache.flink.runtime.client.ClientUtils.uploadJobGraphFiles(ClientUtils.java:82)
... 12 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:606)
at org.apache.flink.runtime.blob.BlobClient.<init>(BlobClient.java:97)
... 14 more
]
Why does this happen?
This method works fine in my Spring Boot application when I use the H2 database:
Method
@Bean
public JdbcBatchItemWriter<PE> writer() {
    JdbcBatchItemWriter<PE> itemWriter = new JdbcBatchItemWriter<PE>();
    itemWriter.setDataSource(dataSource());
    itemWriter.setSql("INSERT INTO PETABLE (TASK_ID, TASK_DESCRIPTION, ORIGINAL_DATE, POLICY_NO, BRAND_DESC, LOAN_NO, LINE_OF_BUSINESS, LAST_NAME, BUSINESS_RULE, BUSINESS_RULE_DESC) VALUES (:taskid, :taskdescription, :originaldate, :policyno, :branddescription, :loannumber, :lineofbusiness, :lastname, :businessrule, :businessruledescription)");
    itemWriter.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<PE>());
    return itemWriter;
}
But it throws a bad SQL grammar / Invalid object name 'BATCH_JOB_INSTANCE' error when I use SQL Server. I created PETABLE in SQL Server manually.
SQL Server database config:
@Bean
public DataSource dataSource() {
    DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
    dataSourceBuilder.driverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
    dataSourceBuilder.url("jdbc:sqlserver://termineserver:1433;databaseName=termineserverdatabase");
    dataSourceBuilder.username("***");
    dataSourceBuilder.password("***");
    return dataSourceBuilder.build();
}
H2 database config:
@Bean
public DataSource dataSource() {
    EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
    return embeddedDatabaseBuilder.addScript("classpath:org/springframework/batch/core/schema-drop-h2.sql")
            .addScript("classpath:org/springframework/batch/core/schema-h2.sql")
            .addScript("classpath:petable.sql")
            .setType(EmbeddedDatabaseType.H2)
            .build();
}
Table creation
CREATE TABLE PETABLE (
TASK_ID VARCHAR(10),
TASK_DESCRIPTION VARCHAR(500),
ORIGINAL_DATE VARCHAR(100) ,
POLICY_NO VARCHAR(100),
BRAND_DESC VARCHAR(100),
LOAN_NO VARCHAR(100),
LINE_OF_BUSINESS VARCHAR(100),
LAST_NAME VARCHAR(100),
BUSINESS_RULE VARCHAR(100),
BUSINESS_RULE_DESC VARCHAR(100)
) ;
Error when using MS-SQL database:
2021-03-04 13:25:00 ERROR - Unexpected error occurred in scheduled task.
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? and JOB_KEY = ?]; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'BATCH_JOB_INSTANCE'.
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:234)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
at org.springframework.jdbc.core.JdbcTemplate.translateException(JdbcTemplate.java:1402)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:620)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:657)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:688)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:700)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:756)
at org.springframework.batch.core.repository.dao.JdbcJobInstanceDao.getJobInstance(JdbcJobInstanceDao.java:145)
at org.springframework.batch.core.repository.support.SimpleJobRepository.getLastJobExecution(SimpleJobRepository.java:293)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:343)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:197)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:294)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:98)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy57.getLastJobExecution(Unknown Source)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:98)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:343)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:197)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration$PassthruAdvice.invoke(SimpleBatchConfiguration.java:127)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy61.run(Unknown Source)
at com.mtb.stc.StcApplication.perform(StcApplication.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:65)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:93)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'BATCH_JOB_INSTANCE'.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:258)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1535)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:467)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:409)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7151)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2478)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:219)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:199)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:331)
at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeQuery(ProxyPreparedStatement.java:52)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeQuery(HikariProxyPreparedStatement.java)
at org.springframework.jdbc.core.JdbcTemplate$1.doInPreparedStatement(JdbcTemplate.java:666)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:605)
... 45 common frames omitted
You need to create all the metadata tables that Spring Batch needs (BATCH_JOB_INSTANCE, BATCH_JOB_EXECUTION, and so on). Your H2 configuration creates them by running schema-h2.sql, but nothing creates them in your SQL Server database.
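One way to do that, sketched under the assumption that you keep your manual DataSource bean (the class and bean names here are mine), is to run the SQL Server DDL script that ships inside spring-batch-core against the DataSource at startup:

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.datasource.init.DataSourceInitializer;
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;

@Configuration
public class BatchSchemaConfig {

    // Creates BATCH_JOB_INSTANCE, BATCH_JOB_EXECUTION, etc. in SQL Server at startup.
    @Bean
    public DataSourceInitializer batchSchemaInitializer(DataSource dataSource) {
        ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
        populator.addScript(new ClassPathResource("org/springframework/batch/core/schema-sqlserver.sql"));
        populator.setContinueOnError(true); // tolerate "object already exists" on later runs

        DataSourceInitializer initializer = new DataSourceInitializer();
        initializer.setDataSource(dataSource);
        initializer.setDatabasePopulator(populator);
        return initializer;
    }
}

Alternatively, depending on your Spring Boot version, letting Boot manage the DataSource and setting spring.batch.initialize-schema=always may achieve the same thing.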
This code converts a pandas DataFrame to a Flink table, does the transformation, and then converts back to pandas. It works fine when I chain filter, filter, then select, but gives me an error when I add group_by and order_by.
import pandas as pd
import numpy as np

# PyFlink imports (old planner, streaming mode)
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.table import EnvironmentSettings, StreamTableEnvironment

f_s_env = StreamExecutionEnvironment.get_execution_environment()
f_s_settings = EnvironmentSettings.new_instance().use_old_planner().in_streaming_mode().build()
table_env = StreamTableEnvironment.create(f_s_env, environment_settings=f_s_settings)

df = pd.read_csv("dataBase/New/candidate.csv")
col = ['candidate_id', 'candidate_source_id', 'candidate_first_name',
       'candidate_middle_name', 'candidate_last_name', 'candidate_email',
       'created_date', 'last_modified_date', 'last_modified_by']
table = table_env.from_pandas(df, col)

table.filter("candidate_id > 322445")\
     .filter("candidate_first_name === 'Libby'")\
     .group_by("candidate_id, candidate_source_id")\
     .select("candidate_id, candidate_source_id")\
     .order_by("candidate_id").to_pandas()
My error is:
Py4JJavaError: An error occurred while calling o3164.orderBy.
: org.apache.flink.table.api.ValidationException: A limit operation on unbounded tables is currently not supported.
at org.apache.flink.table.operations.utils.SortOperationFactory.failIfStreaming(SortOperationFactory.java:131)
at org.apache.flink.table.operations.utils.SortOperationFactory.createSort(SortOperationFactory.java:63)
at org.apache.flink.table.operations.utils.OperationTreeBuilder.sort(OperationTreeBuilder.java:409)
at org.apache.flink.table.api.internal.TableImpl.orderBy(TableImpl.java:401)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:745)
If you look in the documentation, you will see that with the Table API, ORDER BY is only supported for batch queries. If you switch to SQL, then you can have streaming queries that sort on an ascending time attribute.
Sorting by anything else in an unbounded streaming query is simply impossible, since sorting requires full knowledge of the input.
Any idea what could be causing this error to be constantly produced in all.log and error.log? Openfire 4.3.2 on Windows connected to SQL Server 2012 (SP4), although that shouldn't matter. Could it be a result of incorrect SQL in the ofProperty table? Where should I look?
We have a working web/Candy.js chat application…
2019.05.08 20:46:12 ERROR [TaskEngine-pool-16]: org.jivesoftware.openfire.pubsub.PubSubPersistenceManager - Incorrect syntax near the keyword 'LEFT'.
java.sql.BatchUpdateException: Incorrect syntax near the keyword 'LEFT'.
at net.sourceforge.jtds.jdbc.JtdsStatement.executeBatch(JtdsStatement.java:1069) ~[jtds-1.3.1.jar:1.3.1]
at org.apache.commons.dbcp2.DelegatingStatement.executeBatch(DelegatingStatement.java:223) ~[commons-dbcp2-2.5.0.jar:2.5.0]
at org.apache.commons.dbcp2.DelegatingStatement.executeBatch(DelegatingStatement.java:223) ~[commons-dbcp2-2.5.0.jar:2.5.0]
at org.jivesoftware.openfire.pubsub.PubSubPersistenceManager.purgeItems(PubSubPersistenceManager.java:1893) [xmppserver-4.3.2.jar:4.3.2]
at org.jivesoftware.openfire.pubsub.PubSubPersistenceManager.access$000(PubSubPersistenceManager.java:57) [xmppserver-4.3.2.jar:4.3.2]
at org.jivesoftware.openfire.pubsub.PubSubPersistenceManager$2.run(PubSubPersistenceManager.java:283) [xmppserver-4.3.2.jar:4.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [?:1.8.0_202]
at java.util.concurrent.FutureTask.run(Unknown Source) [?:1.8.0_202]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_202]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:1.8.0_202]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_202]
We used SQL Profiler to track down the Incorrect syntax near the keyword 'LEFT' message.
Here's what we got: the error was produced by the following bad SQL:
DELETE FROM ofPubsubItem
LEFT JOIN (SELECT id FROM ofPubsubItem WHERE serviceID= #P0 AND nodeID= #P1 ORDER BY creationDate DESC LIMIT #P2 ) AS noDelete
ON ofPubsubItem.id = noDelete.id
WHERE ...
This SQL is incorrect; the correct statement would include 'FROM ofPubsubItem' twice, like:
DELETE FROM ofPubsubItem
FROM ofPubsubItem LEFT JOIN (SELECT id FROM ofPubsubItem WHERE serviceID= #P0 AND nodeID= #P1 ORDER BY creationDate DESC LIMIT #P2 ) AS noDelete
ON ofPubsubItem.id = noDelete.id
WHERE
I'm going to submit a bug report to Openfire. I'm not sure whether that SQL syntax is ANSI or SQL Server specific, but Openfire is supposed to support SQL Server.
When the Spring Data findAll(Pageable) method of an entity is called, the following query is executed by Hibernate:
select TOP(?) customer0_.id as id1_0_, customer0_.name as name2_0_ from customer customer0_
After that the following query is executed:
Hibernate: select customer0_.id as id1_0_, customer0_.name as name2_0_ from customer customer0_ order by customer0_.id asc offset 0 rows fetch next 20 rows only
An exception is raised:
o.h.engine.jdbc.spi.SqlExceptionHelper : could not execute query [select customer0_.id as id1_0_, customer0_.name as name2_0_ from customer customer0_ order by customer0_.id asc]
com.microsoft.sqlserver.jdbc.SQLServerException: The index 1 is out of range.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:206)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.setterGetParam(SQLServerPreparedStatement.java:940)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.setValue(SQLServerPreparedStatement.java:954)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.setInt(SQLServerPreparedStatement.java:1249)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setInt(HikariProxyPreparedStatement.java)
at org.hibernate.dialect.pagination.SQLServer2005LimitHandler.bindLimitParametersAtStartOfQuery(SQLServer2005LimitHandler.java:132)
at org.hibernate.loader.Loader.prepareQueryStatement(Loader.java:1950)
at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:1909)
at org.hibernate.loader.Loader.executeQueryStatement(Loader.java:1887)
at org.hibernate.loader.Loader.doQuery(Loader.java:932)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:349)
at org.hibernate.loader.Loader.doList(Loader.java:2615)
at org.hibernate.loader.Loader.doList(Loader.java:2598)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2430)
at org.hibernate.loader.Loader.list(Loader.java:2425)
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:502)
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:370)
at org.hibernate.engine.query.spi.HQLQueryPlan.performList(HQLQueryPlan.java:216)
at org.hibernate.internal.SessionImpl.list(SessionImpl.java:1481)
at org.hibernate.query.internal.AbstractProducedQuery.doList(AbstractProducedQuery.java:1441)
at org.hibernate.query.internal.AbstractProducedQuery.list(AbstractProducedQuery.java:1410)
at org.hibernate.query.Query.getResultList(Query.java:146)
at org.hibernate.query.criteria.internal.compile.CriteriaQueryTypeQueryAdapter.getResultList(CriteriaQueryTypeQueryAdapter.java:72)
at org.springframework.data.jpa.repository.support.SimpleJpaRepository.readPage(SimpleJpaRepository.java:589)
at org.springframework.data.jpa.repository.support.SimpleJpaRepository.findAll(SimpleJpaRepository.java:409)
at org.springframework.data.jpa.repository.support.SimpleJpaRepository.findAll(SimpleJpaRepository.java:377)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.executeMethodOn(RepositoryFactorySupport.java:504)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:489)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:461)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.projection.DefaultMethodInvokingMethodInterceptor.invoke(DefaultMethodInvokingMethodInterceptor.java:61)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:282)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:136)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.jpa.repository.support.CrudMethodMetadataPostProcessor$CrudMethodMetadataPopulatingMethodInterceptor.invoke(CrudMethodMetadataPostProcessor.java:133)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.repository.core.support.SurroundingTransactionDetectorMethodInterceptor.invoke(SurroundingTransactionDetectorMethodInterceptor.java:57)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213)
at com.sun.proxy.$Proxy170.findAll(Unknown Source)
at sa.tvtc.farm.web.rest.CustomerResource.getAllCustomers(CustomerResource.java:107)
at sa.tvtc.farm.web.rest.CustomerResource$$FastClassBySpringCGLIB$$48eccfb1.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:721)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at com.ryantenney.metrics.spring.TimedMethodInterceptor.invoke(TimedMethodInterceptor.java:48)
at com.ryantenney.metrics.spring.TimedMethodInterceptor.invoke(TimedMethodInterceptor.java:34)
at com.ryantenney.metrics.spring.AbstractMetricMethodInterceptor.invoke(AbstractMetricMethodInterceptor.java:59)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:656)
at sa.tvtc.farm.web.rest.CustomerResource$$EnhancerBySpringCGLIB$$9b6861c0.getAllCustomers(<generated>)
Configuration:
application-dev.yml
spring:
    datasource:
        type: com.zaxxer.hikari.HikariDataSource
        ...
    jpa:
        database-platform: org.hibernate.dialect.SQLServer2012Dialect
        database: SQL_SERVER
pom.xml
<hibernate.version>5.2.8.Final</hibernate.version>
<hikaricp.version>2.6.0</hikaricp.version>
<mssql-jdbc.version>6.1.0.jre8</mssql-jdbc.version>
<liquibase-hibernate5.version>3.6</liquibase-hibernate5.version>
<dependency>
    <groupId>com.microsoft.sqlserver</groupId>
    <artifactId>mssql-jdbc</artifactId>
    <version>${mssql-jdbc.version}</version>
</dependency>
<dependency>
    <groupId>com.github.sabomichal</groupId>
    <artifactId>liquibase-mssql</artifactId>
    <version>${liquibase-mssql.version}</version>
</dependency>
Using the previous version of Hibernate fixed the problem:
<hibernate.version>5.2.7.Final</hibernate.version>