Using SQL Server 2008 Schema with JPA - sql-server

I have an MS SQL Server (2008) database with two schemas: dbo as the default for general purposes, e.g. authentication, and myapp for domain objects. I want to have JPA entities in both schemas. I use Spring Boot for configuration.
Entity tables are created in the right schema as they should be, e.g. myschema.job, but relationship tables, e.g. Job_Employee, are created in the default schema dbo. How can I set which schema automatically created tables are stored in (without changing the default schema, as this just shifts the problem)?
@Entity
@Table(schema="myschema")
public class Job {[...]
My application.yml looks like:
spring:
  profiles: dev
  datasource:
    datasource1:
      url: jdbc:sqlserver://localhost;databaseName=mydb;schema=myschema
      username: SA
      password: ###
    datasource2:
      url: jdbc:sqlserver://localhost;databaseName=mydb;schema=dbo
      username: SA
      password: ###
  jpa:
    show-sql: true
    hibernate.ddl-auto: create-drop
    properties:
      hibernate.dialect: org.hibernate.dialect.SQLServer2012Dialect
      hibernate.default_schema: dbo
And the datasources are configured in:
@Configuration
@EnableJpaRepositories
public class JPAConfiguration {

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource.datasource1")
    public DataSourceProperties firstDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource.datasource1")
    public DataSource firstDataSource() {
        return firstDataSourceProperties().initializeDataSourceBuilder().build();
    }

    @Bean
    @ConfigurationProperties("spring.datasource.datasource2")
    public DataSourceProperties secondDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @ConfigurationProperties("spring.datasource.datasource2")
    public DataSource secondDataSource() {
        return secondDataSourceProperties().initializeDataSourceBuilder().build();
    }
}
Thanks!

The answer is: every collection must be mapped to the right schema as well, with the @JoinTable annotation.
E.g. in our case:

@OneToMany(cascade=CascadeType.ALL)
@JoinTable(schema="myschema")
private List<Employee> employee;

This results in a table called myschema.job_employee.
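Putting the fix together, here is a minimal sketch of the owning entity; the field and Employee type come from the question, while the id mapping is illustrative:

```java
@Entity
@Table(schema = "myschema")
public class Job {

    @Id
    @GeneratedValue
    private Long id;

    // Without an explicit schema on @JoinTable, Hibernate creates the
    // join table in the default schema (dbo); setting it here keeps
    // the generated job_employee table in myschema as well.
    @OneToMany(cascade = CascadeType.ALL)
    @JoinTable(schema = "myschema")
    private List<Employee> employee;
}
```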

Related

Specify a schema using r2dbc-mssql

Is there any way to specify a default schema in a properties file using r2dbc-mssql?
The connection works fine with:
spring:
  r2dbc:
    url: 'r2dbc:mssql://zzzzz.database.windows.net:1433/dbname'
    username: 'xxxxxx'
    password: 'xxxxxx'
but I have to use a static schema:
@Table("schemaname.foo")
public class Foo {

    @Id
    private Long id;
I've found something similar in r2dbc-postgresql:
https://github.com/pgjdbc/r2dbc-postgresql/issues/37
In a Spring Boot application, I think you can execute a SQL statement to switch schemas in the @PostConstruct method of a @Configuration class:

@Configuration
class DatabaseConfig {

    @Autowired
    ConnectionFactory connectionFactory;

    @PostConstruct
    void init() {
        Mono.from(connectionFactory.create())
            .flatMap(c -> Mono.from(
                c.createStatement("switch to your schema here; different databases use different commands").execute()))
            .subscribe();
    }
}

Spring Batch "Invalid object name BATCH_JOB_INSTANCE"

I've created a Spring Batch job to query an Azure SQL Server database and write the data into a CSV file. I do not have create permissions for the database, and I get the error Invalid object name BATCH_JOB_INSTANCE when running the batch. I don't want the Spring Batch meta-data tables to be created in the main database; alternatively, it would be helpful if I could have them in another local or in-memory DB like H2.
I've also already added spring-batch-initialize-schema=never, which was the suggestion in most answers to similar questions on here, but that didn't help.
Edit:
I resolved the Invalid object name error by preventing the meta-data tables from being created in the main database: I extended the DefaultBatchConfigurer class and overrode the setDataSource method, so they are created in the in-memory map-based repository instead. Now I want to try two options:
How to have the meta-data tables created in a local or in-memory DB like H2.
Or, if I have the meta-data tables created already in the main database, in a different schema than the main table I'm fetching from: how to point my job to those meta-data tables in the other schema, to store the job and step details there.
@Configuration
public class SpringBatchConfig extends DefaultBatchConfigurer {

    @Override
    public void setDataSource(DataSource dataSource) {
        // intentionally empty: no DataSource is set, so the
        // map-based (in-memory) job repository is used
    }
    ...
My application.properties file looks like this:
spring.datasource.url=
spring.datasource.username=
spring.datasource.password=
spring.datasource.driver-class-name=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring-batch-initialize-schema=never
spring.batch.job.enabled=false
spring.jpa.hibernate.ddl-auto=update
spring.jpa.show-sql=true
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.SQLServer2012Dialect
I've created a demo with two datasources: batch meta-data is stored in an H2 DB, and the job datasource is Azure SQL.
We need to define a DataSourceConfig class and use the @Primary annotation on the DataSource bean:
@Configuration
public class DataSourceConfig {

    @Bean(name = "mssqlDataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource appDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "h2DataSource")
    @Primary
    // @ConfigurationProperties(prefix = "spring.datasource.h2")
    public DataSource h2DataSource() {
        return DataSourceBuilder.create()
                .url("jdbc:h2:mem:thing:H2;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE")
                .driverClassName("org.h2.Driver")
                .username("sa")
                .password("")
                .build();
    }
}
In the ItemReaderDbDemo class, we use @Autowired @Qualifier("mssqlDataSource") to specify the dataSource in the Spring Batch task:
@Configuration
public class ItemReaderDbDemo {

    // generates Job objects
    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    // generates Step objects for executing tasks
    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    @Qualifier("mssqlDataSource")
    private DataSource dataSource;

    @Autowired
    @Qualifier("dbJdbcWriter")
    private ItemWriter<? super Todo> dbJdbcWriter;

    @Bean
    public Job itemReaderDbDemoJob() {
        return jobBuilderFactory.get("itemReaderDbDemoJob").start(itemReaderDbStep()).build();
    }

    @Bean
    public Step itemReaderDbStep() {
        return stepBuilderFactory.get("itemReaderDbStep")
                .<Todo, Todo>chunk(2)
                .reader(dbJdbcReader())
                .writer(dbJdbcWriter)
                .build();
    }

    @Bean
    @StepScope
    public JdbcPagingItemReader<Todo> dbJdbcReader() {
        JdbcPagingItemReader<Todo> reader = new JdbcPagingItemReader<Todo>();
        reader.setDataSource(dataSource);
        reader.setFetchSize(2);
        reader.setRowMapper(new RowMapper<Todo>() {
            @Override
            public Todo mapRow(ResultSet rs, int rowNum) throws SQLException {
                Todo todo = new Todo();
                todo.setId(rs.getLong(1));
                todo.setDescription(rs.getString(2));
                todo.setDetails(rs.getString(3));
                return todo;
            }
        });
        SqlServerPagingQueryProvider provider = new SqlServerPagingQueryProvider();
        provider.setSelectClause("id,description,details");
        provider.setFromClause("from dbo.todo");
        // sort by id, descending
        Map<String, Order> sort = new HashMap<>(1);
        sort.put("id", Order.DESCENDING);
        provider.setSortKeys(sort);
        reader.setQueryProvider(provider);
        return reader;
    }
}
Here is my application.properties:
logging.level.org.springframework.jdbc.core=DEBUG
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.jdbcUrl=jdbc:sqlserver://josephserver2.database.windows.net:1433;database=<Your-Database-Name>;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;
spring.datasource.username=<Your-UserName>
spring.datasource.password=<Your-Password>
spring.datasource.initialization-mode=always
It returns the expected result from my Azure SQL database. By the way, my Azure SQL username does not have create permissions for the database.
How to have the meta data tables to be created in a local db or in-memory db like h2db.
You can use spring.batch.initialize-schema=embedded for that.
Or If I have the meta data tables created already in the main database, in a different schema than my main table I'm fetching from. How to point my job to those meta-data tables in another schema, to store the job and step details data in those.
Spring Batch works against a DataSource, not a particular schema. If the meta-data tables are in a different schema, then you need to create a second datasource pointing to that schema and set it on the job repository.
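One way to sketch that (the batchDataSource qualifier and the batch_meta schema name are assumptions, not from the question) is a DefaultBatchConfigurer that builds the JobRepository through JobRepositoryFactoryBean with a schema-qualified table prefix:

```java
@Configuration
public class BatchRepositoryConfig extends DefaultBatchConfigurer {

    @Autowired
    @Qualifier("batchDataSource")   // second datasource pointing at the meta-data schema
    private DataSource batchDataSource;

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource);
        factory.setTransactionManager(getTransactionManager());
        // The prefix is prepended to every meta-data table name, so a schema
        // qualifier fits here: batch_meta.BATCH_JOB_INSTANCE and so on.
        factory.setTablePrefix("batch_meta.BATCH_");
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}
```

Recent Spring Boot versions also expose the prefix as a property (spring.batch.jdbc.table-prefix), which may be enough without a custom configurer.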
I know this post is a little bit old, but I'd like to give an update.
In newer versions of Spring Boot, spring.batch.initialize-schema is deprecated. I'm using Spring Boot 2.7.1, and the newer property is spring.batch.jdbc.initialize-schema.
In my case, the error message was caused by the user not having the CREATE TABLE permission needed to create the corresponding Spring Batch tables. Adding the permissions fixed the issue.
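For reference, on SQL Server those permissions can be granted like this (the login name is illustrative); creating a table requires CREATE TABLE plus ALTER on the target schema:

```sql
GRANT CREATE TABLE TO batch_app_user;
GRANT ALTER ON SCHEMA::dbo TO batch_app_user;
```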

How to configure databases in springboot interchangeably

I need to enable the code to use different databases interchangeably by changing configuration. I have Oracle SQL and Azure SQL Server. By changing the helm chart (or configuration), I would like to choose which database to use. Things I know are:
Datasource is configured in helm chart. I have a yaml file that declares driver, url, username and password for database.
env:
  - name: datasource.project.driverClassName
    value: 'oracle.jdbc.OracleDriver'
  - name: datasource.project.url
    value: 'url'
  - name: datasource.project.username
    value: 'username'
  - name: datasource.project.password
    value: 'password'
In my project, I create beans for the database:
@Configuration
@EnableConfigurationProperties
public class ProjectDataSourceConfig {

    public static final String DB_TX_MANAGER = "";

    @Bean
    @Primary
    @ConfigurationProperties("datasource.project")
    public DataSourceProperties projectDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    public DataSource projectDataSource() {
        return projectDataSourceProperties().initializeDataSourceBuilder().type(ComboPooledDataSource.class).build();
    }

    @Bean
    public NamedParameterJdbcTemplate projectJdbcTemplate() {
        return new NamedParameterJdbcTemplate(projectDataSource());
    }

    @Bean(name = DB_TX_MANAGER)
    public DataSourceTransactionManager projectDbtransactionManager() {
        return new DataSourceTransactionManager(projectDataSource());
    }
}
My goal is to find a way to load either Oracle SQL or Azure SQL Server by modifying the configuration file. I am not sure whether just changing driverClassName, url, username and password is sufficient.
I found this very easy: since I don't use Hibernate, I can simply change the driver, url, username and password to Azure SQL's, and it works.
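For example, the same env block switched over to the SQL Server driver would look like this (the URL and credentials are placeholders):

```yaml
env:
  - name: datasource.project.driverClassName
    value: 'com.microsoft.sqlserver.jdbc.SQLServerDriver'
  - name: datasource.project.url
    value: 'jdbc:sqlserver://myserver.database.windows.net:1433;databaseName=mydb'
  - name: datasource.project.username
    value: 'username'
  - name: datasource.project.password
    value: 'password'
```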

How to access two SQL servers in Spring Boot

I have an application that runs Spring MVC.
I need it to access two different databases (two SQL servers) in my app.
How do I configure this?
You can access the first database with an EntityManager and use a JdbcTemplate to access the second database
1. application.properties
#SQL Server 1
spring.datasource.url = [url]
spring.datasource.username = [username]
spring.datasource.password = [password]
spring.datasource.driverClassName = [sql Server Driver class name]
#SQL Server 2
spring.secondaryDatasource.url = [url]
spring.secondaryDatasource.username = [username]
spring.secondaryDatasource.password = [password]
spring.secondaryDatasource.driverClassName = [sql Server Driver class name]
2. Create a @Configuration class and declare two datasource beans. Create a JdbcTemplate to access SQL Server 2:
@Bean
@Primary
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource primaryDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean
@ConfigurationProperties(prefix = "spring.secondaryDatasource")
public DataSource secondaryDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean
public JdbcTemplate jdbcTemplate() {
    return new JdbcTemplate(secondaryDataSource());
}
Usage Example
@Repository
public class CustomerRepositoryImpl implements CustomerRepository {

    private final JdbcTemplate jdbcTemplate;

    public CustomerRepositoryImpl(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }
}
You can also look at the documentation:
https://docs.spring.io/spring-boot/docs/1.2.3.RELEASE/reference/htmlsingle/#howto-use-two-entity-managers
and this site
https://www.baeldung.com/spring-data-jpa-multiple-databases

Is it possible to configure multiple database connections in Dropwizard?

I am working on some code that leverages Dropwizard and will require connecting to at least two different databases (I plan to use Hibernate as well). I was unable to find any examples/documentation showing how to configure two different database connections in the database block of the .yml configuration file. Is this possible in Dropwizard? If not, what workarounds have people used in the past? Thank you in advance for your help!
You can configure multiple databases in Dropwizard. In the config.yml you can have multiple database configurations like this:
database1:
  driverClass: org.postgresql.Driver
  user: user
  password: pwd
  url: jdbc:postgresql://localhost:5432/db1
  validationQuery: select 1
  minSize: 2
  maxSize: 8
database2:
  driverClass: org.postgresql.Driver
  user: user
  password: pwd
  url: jdbc:postgresql://localhost:5432/db2
  validationQuery: select 1
  minSize: 2
  maxSize: 8
And in the configuration class, expose both config details:
public class DBConfig extends Configuration {

    private DatabaseConfiguration database1;
    private DatabaseConfiguration database2;

    public DatabaseConfiguration getDatabase1() {
        return database1;
    }

    public DatabaseConfiguration getDatabase2() {
        return database2;
    }
}
And in your service, configure which DAO uses which database:
@Override
public void run(MyConfiguration configuration,
                Environment environment) throws ClassNotFoundException {
    ...
    final DBIFactory factory = new DBIFactory();
    // Note that the name parameter when creating the DBIs must be different,
    // otherwise you get an IllegalArgumentException
    final DBI jdbi1 = factory.build(
            environment, configuration.getDatabase1(), "db1");
    final DBI jdbi2 = factory.build(
            environment, configuration.getDatabase2(), "db2");
    final MyFirstDAO firstDAO = jdbi1.onDemand(MyFirstDAO.class);
    final MySecondDAO secondDAO = jdbi2.onDemand(MySecondDAO.class);
    ...
}
