I'm working on an application with Liquibase, Spring Boot and Hibernate.
The database used is PostgreSQL. In order to populate the DB at startup I configured a data.sql file in src/main/resources containing some INSERT statements.
In addition, after boot Liquibase also tries to apply all the changesets, one of them being the creation of a table that is populated by the data.sql file. So I get the following non-blocking error when executing the mvn command to start the app:
2018-04-25 14:33:53.417 ERROR 11060 --- [neut-Executor-1] liquibase : classpath:config/liquibase/master.xml: config/liquibase/changelog/20180424154826_added_entity_Container.xml::20180424154826-1::jhipster: Change Set config/liquibase/changelog/20180424154826_added_entity_Container.xml::20180424154826-1::jhipster failed. Error: ERROR: relation "container" already exists [Failed SQL: CREATE TABLE public.container (id BIGINT NOT NULL, name VARCHAR(255) NOT NULL, description VARCHAR(2000), container_type VARCHAR(255), created TIMESTAMP WITHOUT TIME ZONE, CONSTRAINT PK_CONTAINER PRIMARY KEY (id))]
2018-04-25 14:33:53.453 ERROR 11060 --- [neut-Executor-1] i.g.j.c.liquibase.AsyncSpringLiquibase : Liquibase could not start correctly, your database is NOT ready: Migration failed for change set config/liquibase/changelog/20180424154826_added_entity_Container.xml::20180424154826-1::jhipster:
Reason: liquibase.exception.DatabaseException: ERROR: relation "container" already exists [Failed SQL: CREATE TABLE public.container (id BIGINT NOT NULL, name VARCHAR(255) NOT NULL, description VARCHAR(2000), container_type VARCHAR(255), created TIMESTAMP WITHOUT TIME ZONE, CONSTRAINT PK_CONTAINER PRIMARY KEY (id))]
liquibase.exception.MigrationFailedException: Migration failed for change set config/liquibase/changelog/20180424154826_added_entity_Container.xml::20180424154826-1::jhipster:
Reason: liquibase.exception.DatabaseException: ERROR: relation "container" already exists [Failed SQL: CREATE TABLE public.container (id BIGINT NOT NULL, name VARCHAR(255) NOT NULL, description VARCHAR(2000), container_type VARCHAR(255), created TIMESTAMP WITHOUT TIME ZONE, CONSTRAINT PK_CONTAINER PRIMARY KEY (id))]
at liquibase.changelog.ChangeSet.execute(ChangeSet.java:619)
at liquibase.changelog.visitor.UpdateVisitor.visit(UpdateVisitor.java:51)
at liquibase.changelog.ChangeLogIterator.run(ChangeLogIterator.java:79)
at liquibase.Liquibase.update(Liquibase.java:214)
at liquibase.Liquibase.update(Liquibase.java:192)
at liquibase.integration.spring.SpringLiquibase.performUpdate(SpringLiquibase.java:431)
at liquibase.integration.spring.SpringLiquibase.afterPropertiesSet(SpringLiquibase.java:388)
at io.github.jhipster.config.liquibase.AsyncSpringLiquibase.initDb(AsyncSpringLiquibase.java:94)
at io.github.jhipster.config.liquibase.AsyncSpringLiquibase.lambda$afterPropertiesSet$0(AsyncSpringLiquibase.java:77)
at io.github.jhipster.async.ExceptionHandlingAsyncTaskExecutor.lambda$createWrappedRunnable$1(ExceptionHandlingAsyncTaskExecutor.java:68)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: liquibase.exception.DatabaseException: ERROR: relation "container" already exists [Failed SQL: CREATE TABLE public.container (id BIGINT NOT NULL, name VARCHAR(255) NOT NULL, description VARCHAR(2000), container_type VARCHAR(255), created TIMESTAMP WITHOUT TIME ZONE, CONSTRAINT PK_CONTAINER PRIMARY KEY (id))]
at liquibase.executor.jvm.JdbcExecutor$ExecuteStatementCallback.doInStatement(JdbcExecutor.java:309)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:55)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:113)
at liquibase.database.AbstractJdbcDatabase.execute(AbstractJdbcDatabase.java:1277)
at liquibase.database.AbstractJdbcDatabase.executeStatements(AbstractJdbcDatabase.java:1259)
at liquibase.changelog.ChangeSet.execute(ChangeSet.java:582)
... 12 common frames omitted
Caused by: org.postgresql.util.PSQLException: ERROR: relation "container" already exists
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2455)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2155)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:288)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:430)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:356)
at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:303)
at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:289)
at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:266)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:262)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.jdbc.pool.StatementFacade$StatementProxy.invoke(StatementFacade.java:114)
at com.sun.proxy.$Proxy129.execute(Unknown Source)
at liquibase.executor.jvm.JdbcExecutor$ExecuteStatementCallback.doInStatement(JdbcExecutor.java:307)
... 17 common frames omitted
What I'm thinking is that Hibernate executes data.sql just a moment before the Liquibase migration process, so maybe there is a way to avoid this exception. Is it possible to force Hibernate to import data.sql after the Liquibase migration?
You have to decide which mechanism you want to use to execute the changes. You can use Hibernate to create your tables and insert data, or you can use Liquibase to create your tables and data, but not both at the same time. What I did on previous projects was set Hibernate to validate-only mode (spring.jpa.hibernate.ddl-auto=validate) and use Liquibase to create the tables and insert the data.
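For example, the inserts from data.sql could be moved into a Liquibase changeset instead. A minimal sketch as an SQL-formatted changelog (the changeset author/id, its inclusion from master.xml and the column values are assumptions, not taken from your project):
--liquibase formatted sql
--changeset me:20180425-load-container-data
-- Seed row that was previously inserted by data.sql (values are illustrative)
INSERT INTO public.container (id, name, description, container_type, created)
VALUES (1, 'demo container', 'seeded by Liquibase instead of data.sql', 'DEFAULT', CURRENT_TIMESTAMP);
--rollback DELETE FROM public.container WHERE id = 1;
With that in place, data.sql can be removed so that only Liquibase touches the schema and the data.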
I want to extract all data from my PostgreSQL database to SQL Server using Kafka Connect and a JDBC sink. I want to get rid of some queries and check whether I can stream data using insert.mode=insert only.
This is my source config:
name=debezium_pg_connectors
connector.class=io.debezium.connector.postgresql.PostgresConnector
tasks.max=1
plugin.name=pgoutput
database.hostname=XXX.XXX.XXX.XX
database.port=5432
database.user=XXXXXX
database.password=XXXXXX
database.dbname=XXXXX
database.server.name=XXXXX
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=XXXXXX
table.whitelist=XXXXXXX
time.precision.mode=connect
transforms=unwrap
transforms.unwrap.type= io.debezium.transforms.ExtractNewRecordState
This is my sink config:
name=jdbc-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=pj_user
connection.url=<connection>
auto.create=true
auto.evolve=true
insert.mode=insert
pk.mode=record_key
table.name.format=<table>
transforms=unwrap,route
transforms.unwrap.type=io.debezium.transforms.UnwrapFromEnvelope
transforms=route
transforms.route.type=org.apache.kafka.connect.transforms.RegexRouter
transforms.route.regex=([^.]+)\\.([^.]+)\\.([^.]+)
transforms.route.replacement = $3
fields.whitelist=...
In my SQL Server table, I have an auto-generated column called key with the uniqueidentifier data type, used as the primary key. However, there's a failure every time I try to push my data to the sink:
[2020-03-03 12:45:11,487] ERROR WorkerSinkTask{id=jdbc-sink-0} RetriableException from SinkTask: (org.apache.kafka.connect.runtime.WorkerSinkTask:552)
org.apache.kafka.connect.errors.RetriableException: java.sql.SQLException: java.sql.BatchUpdateException: Cannot insert the value NULL into column 'key', table '<table>'; column does not allow nulls. INSERT fails.
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:93)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:539)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:322)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.sql.SQLException: java.sql.BatchUpdateException: Cannot insert the value NULL into column 'key', table '<table>'; column does not allow nulls. INSERT fails.
... 12 more
If anyone has any ideas to help me, any help and advice is appreciated. Thanks.
Make sure that your key column has a default value:
ALTER TABLE [tableName] ADD DEFAULT NEWSEQUENTIALID() FOR [key]
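Naming the default constraint explicitly also makes it easier to find and drop later. A sketch with hypothetical table and constraint names (the column is bracketed because KEY is a reserved word in T-SQL):
-- NEWSEQUENTIALID() is only valid as a DEFAULT on a uniqueidentifier column,
-- which matches the column described above.
ALTER TABLE dbo.pj_user
    ADD CONSTRAINT DF_pj_user_key DEFAULT NEWSEQUENTIALID() FOR [key];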
I am not sure whether my issue is related to the Scala Play 2.5.x Framework or to PostgreSQL, so I am going to describe my setup.
I am using Play 2.5.6 with Scala and PostgreSQL 9.5.4-2 from the BigSQL Sandboxes. I use the Play Framework's default Evolutions package to manage the DB versions.
I created a new database in BigSQL Sandbox's PGSQL, and PGSQL created a default schema called public. I use this schema for development.
I would like to create a table with the following script (1.sql in DB evolution config):
# Initialize the database
# --- !Ups
CREATE TABLE user (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL,
email TEXT NOT NULL,
creation_date TIMESTAMP NOT NULL
);
# --- !Downs
DROP TABLE user;
Besides that, I would like to read the table with code like this:
val resultSet = statement.executeQuery("SELECT id, name, email FROM public.user WHERE id=" + id.toString)
I get an error if I execute any of the mentioned code, or even if I run the CREATE TABLE... statement in pgAdmin. The issue is with the user table name: if I prefix it with public (i.e. public.user) everything works fine.
My questions are:
Is it normal to prefix the table name with the schema name every time? It seems odd to me.
How can I make the public schema a default option so I do not have to qualify the table name? (e.g. CREATE TABLE user (...); will not throw an error)
I tried the following:
I set the search_path for my user: ALTER USER my_user SET search_path to public;
I set the search_path for my database: ALTER database "my_database" SET search_path TO my_schema;
search_path correctly shows this: "$user",public
I got the following errors:
In Play: p.a.d.e.DefaultEvolutionsApi - ERROR: syntax error at or near "user"
In pgadmin:
ERROR: syntax error at or near "user"
LINE 1: CREATE TABLE user (
********** Error **********
ERROR: syntax error at or near "user"
SQL state: 42601
Character: 14
This has nothing to do with the default schema. user is a reserved word.
You need to use double quotes to be able to create such a table:
CREATE TABLE "user" (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL,
email TEXT NOT NULL,
creation_date TIMESTAMP NOT NULL
);
But I strongly recommend not doing that. Find a different name that does not require a quoted identifier.
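For example, a name like app_user avoids the reserved word entirely (the name is just an illustration):
CREATE TABLE app_user (
  id SERIAL PRIMARY KEY,
  name TEXT NOT NULL,
  email TEXT NOT NULL,
  creation_date TIMESTAMP NOT NULL
);
-- Queries can then reference the table without quoting or schema-qualifying it:
-- SELECT id, name, email FROM app_user WHERE id = 1;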
I'm confused by the errors I get when trying to create an in-memory H2 DB for my Spring Boot application. The relevant configuration is
db.url=jdbc:h2:mem:test;MODE=MySQL;DB_CLOSE_DELAY=-1;INIT=runscript from 'classpath:create.sql'
hibernate.hbm2ddl.auto=create
And create.sql:
CREATE TABLE `cities` (
`name` varchar(45) NOT NULL,
PRIMARY KEY (`name`)
) ;
INSERT INTO `cities` VALUES ('JAEN'),('ALBACETE');
But I get the error Caused by: org.h2.jdbc.JdbcSQLException: Table "CITIES" already exists;
What's weird is that if I remove the CREATE TABLE statement, I get:
Caused by: org.h2.jdbc.JdbcSQLException: Table "CITIES" not found;
The only thing that works is using DROP TABLE IF EXISTS, but I don't think I should need to.
What's going on? What's the proper way of pre-populating static data into an H2 memory DB?
1) The Hibernate way: use an import.sql file, or specify files explicitly:
spring.jpa.properties.hibernate.hbm2ddl.import_files=file1.sql,file2.sql
http://docs.spring.io/spring-boot/docs/current/reference/html/howto-database-initialization.html
2) The Spring Boot way: use the default schema.sql and data.sql files,
or specify files through properties:
spring.datasource.schema = file1.sql
spring.datasource.data = file1.sql, file2.sql
http://docs.spring.io/autorepo/docs/spring-boot/1.0.2.RELEASE/reference/html/howto-database-initialization.html
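For example, with the Spring Boot approach the statements from create.sql could be split into a schema.sql and a data.sql on the classpath (a sketch reusing the cities table from the question):
-- schema.sql: DDL only
CREATE TABLE cities (
  name VARCHAR(45) NOT NULL,
  PRIMARY KEY (name)
);
-- data.sql: static data only
INSERT INTO cities VALUES ('JAEN'), ('ALBACETE');
With either option, only one mechanism should create the schema; the INIT=runscript clause in the H2 URL (which, as far as I know, runs each time a new connection is opened) can then be dropped so it no longer races with hbm2ddl.auto=create.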
I can't solve my problem with my local Oracle database.
I'm trying to connect to my local Oracle database (Oracle Database 11g Express Edition).
Later on I will use JNDI to connect to another Oracle database, but I think this should still work.
Driver: ojdbc6.jar in /lib
db.default.driver=oracle.jdbc.driver.OracleDriver
db.default.url="jdbc:oracle:thin:@localhost:1521:xe"
db.default.user="user"
db.default.pass="pass"
So I know I do connect to the database, but the error says that a table does not exist. I'm not even creating or querying a table (no model exists, but I've tried with a model too, same error). Something seems to go wrong right at the beginning and I don't know how to debug this.
Error:
java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist
oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:457)
oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:400)
oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:926)
oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:476)
oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:200)
oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:543)
oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:197)
oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:1213)
oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1492)
oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1710)
oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:2006)
oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:1709)
com.jolbox.bonecp.StatementHandle.executeQuery(StatementHandle.java:503)
play.api.db.evolutions.Evolutions$.executeQuery(Evolutions.scala:118)
play.api.db.evolutions.Evolutions$.databaseEvolutions(Evolutions.scala:334)
play.api.db.evolutions.Evolutions$.evolutionScript(Evolutions.scala:306)
play.api.db.evolutions.EvolutionsPlugin$$anonfun$onStart$1$$anonfun$apply$1.apply$mcV$sp(Evolutions.scala:435)
play.api.db.evolutions.EvolutionsPlugin.withLock(Evolutions.scala:478)
play.api.db.evolutions.EvolutionsPlugin$$anonfun$onStart$1.apply(Evolutions.scala:434)
play.api.db.evolutions.EvolutionsPlugin$$anonfun$onStart$1.apply(Evolutions.scala:432)
scala.collection.immutable.List.foreach(List.scala:309)
play.api.db.evolutions.EvolutionsPlugin.onStart(Evolutions.scala:432)
play.api.Play$$anonfun$start$1$$anonfun$apply$mcV$sp$1.apply(Play.scala:63)
play.api.Play$$anonfun$start$1$$anonfun$apply$mcV$sp$1.apply(Play.scala:63)
scala.collection.immutable.List.foreach(List.scala:309)
play.api.Play$$anonfun$start$1.apply$mcV$sp(Play.scala:63)
play.api.Play$$anonfun$start$1.apply(Play.scala:63)
play.api.Play$$anonfun$start$1.apply(Play.scala:63)
When reading about it, I've only found that I might not have permission for some table, but the thing is that I use the same login in Oracle SQL Developer and it works.
As nico_ekito wrote, you need to create this table manually.
This one works for me:
CREATE TABLE play_evolutions
(
id Number(10,0) Not Null Enable,
hash VARCHAR2(255 Byte),
applied_at Timestamp Not Null,
apply_script clob,
revert_script clob,
state Varchar2(255),
last_problem clob,
CONSTRAINT play_evolutions_pk PRIMARY KEY (id)
);
Try to manually create a play_evolutions table with the following columns (by adapting the types to the ones used by Oracle):
id int not null primary key,
hash varchar(255) not null,
applied_at timestamp not null,
apply_script text,
revert_script text,
state varchar(255),
last_problem text
In conf/application.conf
Un-comment the following line:
evolutionplugin=disabled
This is if you don't need Evolutions (to track schema changes).
I'm trying to add two tables to Magento but it still doesn't work! I don't get the tables in MySQL.
There's no error message; just nothing happens.
I can't find where the mistake is;
I already checked all my pages.
This is my XML code in config.xml:
<models>
<interactivebanner>
<class>Kiwi_InteractiveBanner_Model</class>
<resourceModel>InteractiveBanner_resource</resourceModel>
</interactivebanner>
<interactivebanner_resource>
<class>Kiwi_InteractiveBanner_Model_Resource</class>
<entities>
<interactivebanner>
<table>interactivebanner</table>
</interactivebanner>
<interactivebanner2>
<table>interactivebanner_prod</table>
</interactivebanner2>
</entities>
</interactivebanner_resource>
</models>
And this is the setup script:
<?php
$installer = $this;
$installer->startSetup();
$installer->run("
DROP TABLE IF EXISTS `{$this->getTable('interactivebanner/interactivebanner')}`;
create table `{$this->getTable('interactivebanner/interactivebanner')}`
(
ENTITY_ID int not null,
NAME varchar(100),
LINK varchar(100),
STATUS int,
primary key (ENTITY_ID)
);
DROP TABLE IF EXISTS `{$this->getTable('interactivebanner/interactivebanner_prod')}`;
create table `{$this->getTable('interactivebanner/interactivebanner_prod')}`
(
PROD_ID int not null,
ENTITY_ID int,
POSI_V float,
POSI_H float,
primary key (PROD_ID)
);
alter table banner_pro add constraint FK_RELATION_1 foreign key (ENTITY_ID)
references banner (ENTITY_ID) on delete restrict on update restrict;
");
$installer->endSetup();
Is this an existing module that you want to upgrade?
If so, you will have to bump the module version so that your upgrade script runs. If it's a separate module, you have to create a mysql4-install-<version>.php setup script with a version matching the one declared in config.xml so that it runs and creates the tables. You will also need a module declaration in app/etc/modules/Myself_Interactivebanner.xml so that Magento knows the module exists.
To get a more certain answer, provide more details about your environment.
If I had to guess, your install script is not running. This can happen for multiple reasons. I would look at this post that helped me get my install script running:
My Magento Extension Install Script Will Not Run
I found a mistake in my config.xml!
A missing uppercase letter :)
Sorry.