When I try to do an insert I get this error. I'm using Grails version ... could it be that I'm missing something? Thank you!
2019-03-08 09:08:56,705 [http-bio-8090-exec-20] ERROR events.PatchedDefaultFlushEventListener - Could not synchronize database state with session
org.hibernate.exception.ConstraintViolationException: Could not execute JDBC batch update
at ar.com.mlan.api.PdCController$_save_closure2.doCall(PdCController.groovy:201)
at ar.com.mlan.api.PdCController.save(PdCController.groovy:161)
at grails.plugin.cache.web.filter.PageFragmentCachingFilter.doFilter(PageFragmentCachingFilter.java:198)
at grails.plugin.cache.web.filter.AbstractFilter.doFilter(AbstractFilter.java:63)
at grails.plugin.springsecurity.rest.RestTokenValidationFilter.processFilterChain(RestTokenValidationFilter.groovy:118)
at grails.plugin.springsecurity.rest.RestTokenValidationFilter.doFilter(RestTokenValidationFilter.groovy:84)
at grails.plugin.springsecurity.web.filter.GrailsAnonymousAuthenticationFilter.doFilter(GrailsAnonymousAuthenticationFilter.java:53)
at grails.plugin.springsecurity.rest.RestAuthenticationFilter.doFilter(RestAuthenticationFilter.groovy:143)
at grails.plugin.springsecurity.web.authentication.logout.MutableLogoutFilter.doFilter(MutableLogoutFilter.java:82)
at com.brandseye.cors.CorsFilter.doFilter(CorsFilter.java:82)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
The root cause is a ConstraintViolationException, so check the constraints on your domain object against the data you are trying to save.
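If the domain constraints look correct, the exception itself usually tells you which database constraint was violated. A minimal plain-Hibernate sketch (not the original Grails controller) that forces the insert with an explicit flush and prints the offending constraint:

import org.hibernate.Session;
import org.hibernate.exception.ConstraintViolationException;

public class SaveDiagnostics {
    // 'session' and 'entity' are assumed to come from the surrounding application.
    static void saveAndReport(Session session, Object entity) {
        try {
            session.save(entity);
            session.flush(); // force the insert so the violation surfaces here, not at commit
        } catch (ConstraintViolationException e) {
            // Name of the violated DB constraint (may be null depending on database/driver)
            System.err.println("Violated constraint: " + e.getConstraintName());
            // Underlying JDBC error with the vendor-specific message
            System.err.println("SQL error: " + e.getSQLException().getMessage());
            throw e;
        }
    }
}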
I'm trying to run a query using JdbcTemplate against an Oracle database, where M_DB is the schema name and M_USER_DB is a user for the M_DB schema. M_USER_DB has certain tables on which I am trying to execute some queries in a Spring Boot project, like:
jdbcTemplate.query("SELECT * FROM M_USER_DB.C_USER_INFO", new ResultSetExtractor<HashMap<String, String>>()
However, when running the program it throws the error below:
No data read; nested exception is java.sql.SQLException: No data read
at org.springframework.jdbc.support.SQLStateSQLExceptionTranslator.doTranslate(SQLStateSQLExceptionTranslator.java:104)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
In application.properties, the database/datasource connection is for my primary schema M_DB (not the user M_USER_DB). From SQL Developer, I can connect to M_DB and run a query like
select * from M_USER_DB.C_USER_INFO
without connecting as M_USER_DB. I believe I do not need to create primary and secondary JdbcTemplates here.
Any suggestion as to what the reason could be, or what I am missing? Thanks in advance for the help.
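For reference, a self-contained version of the call in the question looks roughly like this (a sketch only: the DAO class name and the USER_NAME column are placeholders, and it returns a list of rows rather than the single HashMap in the original snippet). One thing worth checking in the real extractData implementation is that columns are only read after rs.next() has returned true.

import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.ResultSetExtractor;

public class UserInfoDao {
    private final JdbcTemplate jdbcTemplate;

    public UserInfoDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public List<HashMap<String, String>> loadUserInfo() {
        return jdbcTemplate.query(
                "SELECT * FROM M_USER_DB.C_USER_INFO",
                new ResultSetExtractor<List<HashMap<String, String>>>() {
                    @Override
                    public List<HashMap<String, String>> extractData(ResultSet rs)
                            throws SQLException, DataAccessException {
                        List<HashMap<String, String>> rows = new ArrayList<>();
                        while (rs.next()) {                       // read columns only after next()
                            HashMap<String, String> row = new HashMap<>();
                            row.put("userName", rs.getString("USER_NAME")); // placeholder column
                            rows.add(row);
                        }
                        return rows;
                    }
                });
    }
}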
I am using the Kafka JDBC sink connector to sink data to Azure SQL Server. I tested the connector with one database and it worked fine, but when I added more databases, I started seeing the following error:
USE statement is not supported to switch between databases. Use a new connection to connect to a different database.
Config:
tasks.max: 1
topics: topic_name
connection.url: jdbc:sqlserver://server:port;database=dbname;user=dbuser
connection.user: dbuser
connection.password: dbpass
transforms: unwrap
transforms.unwrap.type: io.debezium.transforms.ExtractNewRecordState
transforms.unwrap.drop.tombstones: false
auto.create: true
value.converter: org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable: true
insert.mode: upsert
delete.enabled: true
pk.mode: record_key
Stack:
2020-12-10 11:56:36,990 ERROR WorkerSinkTask{id=NAME-sqlserver-jdbc-sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. Error: java.sql.SQLException: com.microsoft.sqlserver.jdbc.SQLServerException: USE statement is not supported to switch between databases. Use a new connection to connect to a different database.
(org.apache.kafka.connect.runtime.WorkerSinkTask) [task-thread-NAME-sqlserver-jdbc-sink-0]
org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: com.microsoft.sqlserver.jdbc.SQLServerException: USE statement is not supported to switch between databases. Use a new connection to connect to a different database.
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:87)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:560)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:323)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:226)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:198)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:185)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:235)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.sql.SQLException: com.microsoft.sqlserver.jdbc.SQLServerException: USE statement is not supported to switch between databases. Use a new connection to connect to a different database.
I have identified the issue. I initially thought it was caused by having multiple databases on the database server, but it turned out that the topic name has the form prefix.dbo.table_name instead of just table_name.
Hence, the connector was treating prefix.dbo as another database.
The solution is to use a RegexRouter transform (dropPrefix below) to strip the prefix.
For example, to save data from topics hello.dbo.table1 and hello.dbo.table2 to table1 and table2 in the database, use the following config:
tasks.max: 1
topics: hello.dbo.table1, hello.dbo.table2
connection.url: jdbc:sqlserver://server:port;database=dbname;user=dbuser
connection.user: dbuser
connection.password: dbpass
transforms: dropPrefix,unwrap
transforms.dropPrefix.type: org.apache.kafka.connect.transforms.RegexRouter
transforms.dropPrefix.regex: hello\.dbo\.(.*)
transforms.dropPrefix.replacement: $1
transforms.unwrap.type: io.debezium.transforms.ExtractNewRecordState
transforms.unwrap.drop.tombstones: false
auto.create: true
value.converter: org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable: true
insert.mode: upsert
delete.enabled: true
pk.mode: record_key
If it doesn't work as a single connector, then you'll need to create one connector per database.
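To sanity-check the regex before deploying, the renaming RegexRouter performs on the topic name can be approximated with a plain Java regex (a rough equivalent for testing, not the SMT itself; RegexRouter only renames when the whole topic name matches):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DropPrefixCheck {
    public static void main(String[] args) {
        Pattern dropPrefix = Pattern.compile("hello\\.dbo\\.(.*)");
        for (String topic : new String[] {"hello.dbo.table1", "hello.dbo.table2", "other.topic"}) {
            Matcher m = dropPrefix.matcher(topic);
            String renamed = m.matches() ? m.replaceFirst("$1") : topic;
            System.out.println(topic + " -> " + renamed); // hello.dbo.table1 -> table1
        }
    }
}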
Is it possible to import data from Cassandra into Apache Solr?
I am currently importing data from MySQL into Apache Solr using Solr's DataImportHandler. Is it possible to use Cassandra in place of MySQL?
Update 1:
I tried to connect to Cassandra from a simple Java program using the JDBC driver given here (https://code.google.com/a/apache-extras.org/p/cassandra-jdbc/). My idea was that if the Java code works, Solr should also be able to import from Cassandra. But it didn't work, and I got the following error:
log4j:WARN No appenders could be found for logger (org.apache.cassandra.cql.jdbc.CassandraDriver).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/cassandra/cql/jdbc/AbstractJdbcType
at org.apache.cassandra.cql.jdbc.CassandraConnection.(CassandraConnection.java:146)
at org.apache.cassandra.cql.jdbc.CassandraDriver.connect(CassandraDriver.java:92)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:270)
at CqlConnection.main(CqlConnection.java:14)
Caused by: java.lang.ClassNotFoundException: org.apache.cassandra.cql.jdbc.AbstractJdbcType
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Disclaimer: I've never tried this, so I cannot vouch for performance etc. :)
Solr's DataImportHandler contrib uses JDBC to connect to a relational data source; the official Solr wiki describes how to configure JDBC for it.
Now, for Cassandra, you could use the Cassandra-jdbc driver and set up your DIH config with SQL that this driver supports (see the connectivity sketch after the notes below).
Please note:
- I have not used Cassandra-jdbc in a production setup, so there might be shortcomings; you may want to pilot it first.
- As mentioned above, I don't know about the performance aspect either; I'd recommend spiking it out.
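As a quick standalone connectivity check before wiring it into DIH, something along these lines should work, provided the cassandra-jdbc jar and the Cassandra jars it depends on are all on the classpath (the NoClassDefFoundError in the question usually means one of those dependency jars is missing). Host, port, keyspace, and the query below are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CqlConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Explicitly load the driver named in the log output above.
        Class.forName("org.apache.cassandra.cql.jdbc.CassandraDriver");

        // URL format used by the cassandra-jdbc project: jdbc:cassandra://host:port/keyspace
        try (Connection conn = DriverManager.getConnection("jdbc:cassandra://localhost:9160/mykeyspace");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT key FROM my_table")) {
            while (rs.next()) {
                System.out.println(rs.getString("key"));
            }
        }
    }
}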
Please post back any findings!
I'm running SonarQube 4.0, a PostgreSQL 9.3 database, and sonar-runner 2.3 on different servers.
Connectivity between SonarQube and the PostgreSQL database is fine: I can reach the console at http://[sonar-host]:9000, log in, change the password, and do all the usual activities. I can also see the tables created under my schema in the database.
The issue I have is with the sonar-runner. When I run sonar-runner from my source directory I get the exception below:
09:52:21.736 DEBUG - To prevent a memory leak, the JDBC Driver [org.postgresql.Driver] has been forcibly deregistered
INFO: ------------------------------------------------------------------------
INFO: EXECUTION FAILURE
INFO: ------------------------------------------------------------------------
Total time: 2.791s
Final Memory: 5M/220M
INFO: ------------------------------------------------------------------------
ERROR: Error during Sonar runner execution
org.sonar.runner.impl.RunnerException: Unable to execute Sonar
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.api.EmbeddedRunner.doExecute(EmbeddedRunner.java:102)
at org.sonar.runner.api.Runner.execute(Runner.java:90)
at org.sonar.runner.Main.executeTask(Main.java:70)
at org.sonar.runner.Main.execute(Main.java:59)
at org.sonar.runner.Main.main(Main.java:41)
Caused by: Unknown database status: FRESH_INSTALL
I know there is a similar post on Stack Overflow, but all the properties files have the correct URLs, usernames, and passwords.
sonar-project.properties:
sonar.host.url=http://[sonar-host]:9000
sonar.jdbc.url=jdbc:postgresql://[dbhost]:5432/sonarqube-sit
sonar.jdbc.username=sonarqube-sit
sonar.jdbc.password=xEgIVeyr0kw1
sonar.jdbc.schema=sonarqubesit
sonar.jdbc.driverClassName=org.postgresql.Driver
sonar.login=admin
sonar.password=7nQ36mrnk0UCsX1
sonar.properties:
sonar.jdbc.username=sonarqube-sit
sonar.jdbc.password={aes}ftWqQkON7XUqwNmJyHqzJA==
sonar.jdbc.url=jdbc:postgresql://[dbhost]:5432/sonarqube-sit
sonar.jdbc.schema=sonarqubesit
pg_hba.conf has entry for each of the servers and users
I also get an error in the postgres log, shown below, when I run the sonar-runner:
2014-01-22 09:52:21 GMT ERROR: relation "schema_migrations" does not exist at character 15
2014-01-22 09:52:21 GMT STATEMENT: select * from schema_migrations
Firstly, I can see the schema_migrations table in the database under the sonarqubesit schema. Secondly, I don't know why it's trying to use schema_migrations, as I'm not migrating to a different schema.
Can anyone help me resolve this, please? I'm assuming it is something to do with the way the schema is defined in the sonar-runner properties, but I can't see anything obvious.
I resolved the issue by deleting the database and re-creating it, but this time without creating a new schema.
I also re-installed SonarQube and used the default 'public' schema, i.e. left the sonar.jdbc.schema property commented out (#) in sonar.properties.
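For anyone hitting this with a non-default schema: a quick way to see which schema the runner's JDBC connection actually lands in (and hence where SonarQube looks for schema_migrations) is a small standalone check like the sketch below. The connection details are placeholders; use the same URL, username, and password as in sonar.properties.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PgSchemaCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://dbhost:5432/sonarqube-sit";   // placeholder host
        try (Connection conn = DriverManager.getConnection(url, "sonarqube-sit", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT current_schema(), current_setting('search_path')")) {
            if (rs.next()) {
                System.out.println("current_schema = " + rs.getString(1));
                System.out.println("search_path    = " + rs.getString(2));
            }
        }
        // If current_schema is not the schema that holds schema_migrations, the
        // "relation does not exist" error in the postgres log is expected.
    }
}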
I have an Oracle database configured in Tomcat's server.xml:
<Resource name="jdbc/sgfdb" auth="Container"
driverClassName="oracle.jdbc.OracleDriver"
url="jdbc:oracle:thin:#databaseurl:1521:schema"
username="username" password="password" maxActive="20" maxIdle="10"
maxWait="-1"
factory="oracle.jdbc.pool.OracleDataSourceFactory"
type="oracle.jdbc.pool.OracleDataSource"/>
Then in my web app (a Spring MVC project), I declare this in context.xml:
<Context>
<ResourceLink name="jdbc/sgfdb"
global="jdbc/sgfdb"
type="javax.sql.DataSource"/>
</Context>
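For reference, the web app resolves that link via JNDI at runtime; a minimal lookup sketch (not from the original project) that is handy for checking the pool and credentials outside Hibernate:

import java.sql.Connection;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class DataSourceLookup {
    // Must run inside the web app (e.g. from a servlet); outside Tomcat there is no JNDI context.
    public static void testConnection() throws Exception {
        Context initCtx = new InitialContext();
        Context envCtx = (Context) initCtx.lookup("java:comp/env");
        DataSource ds = (DataSource) envCtx.lookup("jdbc/sgfdb"); // name from the ResourceLink above
        try (Connection conn = ds.getConnection()) {
            System.out.println("Connected as: " + conn.getMetaData().getUserName());
        }
    }
}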
I was able to connect to this database before. I didn't work on it for a week, and now when I try to start it, I always get:
SEVERE: Servlet.service() for servlet [action] in context with path [/WebUI] threw exception [Request processing failed; nested exception is javax.persistence.QueryTimeoutException: Could not open connection] with root cause
java.sql.SQLException: ORA-01017: invalid username/password; logon denied
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:440)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:389)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:382)
at oracle.jdbc.driver.T4CTTIfun.processError(T4CTTIfun.java:573)
at oracle.jdbc.driver.T4CTTIoauthenticate.processError(T4CTTIoauthenticate.java:431)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:445)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:191)
at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(T4CTTIoauthenticate.java:366)
at oracle.jdbc.driver.T4CTTIoauthenticate.doOAUTH(T4CTTIoauthenticate.java:752)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:366)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:536)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:228)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:32)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:521)
at oracle.jdbc.pool.OracleDataSource.getPhysicalConnection(OracleDataSource.java:280)
at oracle.jdbc.pool.OracleDataSource.getConnection(OracleDataSource.java:207)
at oracle.jdbc.pool.OracleDataSource.getConnection(OracleDataSource.java:157)
at org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider.getConnection(InjectedDataSourceConnectionProvider.java:70)
at org.hibernate.internal.AbstractSessionImpl$NonContextualJdbcConnectionAccess.obtainConnection(AbstractSessionImpl.java:278)
at org.hibernate.engine.jdbc.internal.LogicalConnectionImpl.obtainConnection(LogicalConnectionImpl.java:297)
at org.hibernate.engine.jdbc.internal.LogicalConnectionImpl.getConnection(LogicalConnectionImpl.java:169)
at org.hibernate.engine.jdbc.internal.proxy.ConnectionProxyHandler.extractPhysicalConnection(ConnectionProxyHandler.java:82)
at org.hibernate.engine.jdbc.internal.proxy.ConnectionProxyHandler.continueInvocation(ConnectionProxyHandler.java:138)
at org.hibernate.engine.jdbc.internal.proxy.AbstractProxyHandler.invoke(AbstractProxyHandler.java:81)
at $Proxy36.prepareStatement(Unknown Source)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$5.doPrepare(StatementPreparerImpl.java:147)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl$StatementPreparationTemplate.prepareStatement(StatementPreparerImpl.java:166)
at org.hibernate.engine.jdbc.internal.StatementPreparerImpl.prepareQueryStatement(StatementPreparerImpl.java:145)
at org.hibernate.loader.Loader.prepareQueryStatement(Loader.java:1720)
at org.hibernate.loader.Loader.doQuery(Loader.java:828)
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:289)
at org.hibernate.loader.Loader.doList(Loader.java:2447)
at org.hibernate.loader.Loader.doList(Loader.java:2433)
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2263)
at org.hibernate.loader.Loader.list(Loader.java:2258)
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:470)
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:355)
at org.hibernate.engine.query.spi.HQLQueryPlan.performList(HQLQueryPlan.java:195)
at org.hibernate.internal.SessionImpl.list(SessionImpl.java:1215)
at org.hibernate.internal.QueryImpl.list(QueryImpl.java:101)
at org.hibernate.ejb.QueryImpl.getSingleResult(QueryImpl.java:284)
at org.hibernate.ejb.criteria.CriteriaQueryCompiler$3.getSingleResult(CriteriaQueryCompiler.java:258)
at mycompany.services.impl.JobServiceImpl.getNumberOfJobs(JobServiceImpl.java:51)
at mycompany.controller.ExecJobController.execJobList(ExecJobController.java:78)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126)
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578)
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:789)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:641)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:225)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:169)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:927)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:999)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:565)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:307)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
I am sure the username and password are correct.
Could anyone please give me a hint about what's going on here?
I just solved a problem similar to this. If an 11g database is configured for case-sensitive passwords but you're connecting with a 10g client, the 10g client sends the password to the database entirely in upper case, hence an invalid password even though the password you typed is clearly correct. You need to upgrade the client to 11g so that it sends the password in the correct case (for a quick test, you can change your password to all upper case and you'll be able to connect).
I bumped into this thread because I was facing the same problem. The username and password were correct; I was able to log in with those credentials in SQL*Plus and from other applications. The datasource URL was also correct.
While analyzing the errors, I found that the ojdbc6.jar I was using targeted Oracle 11.1.0.7, whereas my Oracle was 11.2.0.4. I downloaded the latest ojdbc6.jar, tried to connect, and voila!
Solved. The problem is that I shouldn't use
factory="oracle.jdbc.pool.OracleDataSourceFactory"
After removing that, it works fine!
If your username and password are lowercase and their uppercase forms differ by locale (like i and İ in Turkish), this might cause the problem.
Some Java-Oracle connection libraries convert them to upper case without taking locale differences into account.
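A minimal sketch of that locale pitfall: upper-casing the same lowercase string gives different results depending on the locale, so a library that upper-cases credentials under a Turkish default locale sends a value that no longer matches what the database stores. The username below is made up:

import java.util.Locale;

public class TurkishCasePitfall {
    public static void main(String[] args) {
        String user = "scott_i"; // hypothetical username containing 'i'
        System.out.println(user.toUpperCase(Locale.ROOT));            // SCOTT_I
        System.out.println(user.toUpperCase(new Locale("tr", "TR"))); // SCOTT_İ (dotted capital I)
    }
}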
The username could be incorrect.
The password could be incorrect.
The server/instance you are connecting to could be incorrect, or different between your machine and the server, or between the application and SQL Developer.
The database might be configured to use case sensitive passwords.
The password might contain a semicolon (;), which truncates the connection string when the application builds it, while still allowing you to use it from SQL Developer(?)
You could have a typo somewhere(?)
I had a similar issue where the password worked in SQL Developer but not in code (Java). I reset the password and it worked fine. I'm not sure of the root cause, but it works. Hope this helps!
It is possible that the user you are trying to log on as has an expired password.
For me, an incompatible version of the OracleDriver was causing this issue.
Your application should either register the Oracle driver manually (which the jar I needed to work with was doing), or, since Java 6, it is enough to have ojdbc.jar on your application's classpath.
So google a driver version compatible with your Oracle installation and either declare it in your pom file (with the plugin needed to put it in the resulting jar) and register it manually from code, or put ojdbc.jar somewhere your jar can see it.
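A minimal sketch of the manual-registration route (the URL, SID, and credentials are placeholders; with a JDBC 4.0+ ojdbc jar on the classpath the registration step also happens automatically):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class OracleDriverCheck {
    public static void main(String[] args) throws SQLException {
        // Explicit registration of the Oracle driver.
        DriverManager.registerDriver(new oracle.jdbc.OracleDriver());

        String url = "jdbc:oracle:thin:@dbhost:1521:ORCL"; // placeholder connection details
        try (Connection conn = DriverManager.getConnection(url, "username", "password")) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}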
Useful links:
About connecting to an Oracle DB:
https://www.codejava.net/java-se/jdbc/connect-to-oracle-database-via-jdbc
About the Java classpath: https://docs.oracle.com/javase/7/docs/technotes/tools/windows/classpath.html
You should change the value of FipsAlgorithmPolicy in the Windows registry:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\FipsAlgorithmPolicy] "Enabled"=dword:00000000
You don't need to reboot the OS.