Spring Boot 2.6: data.sql is not initialized even with spring.jpa.defer-datasource-initialization set to true

I am upgrading my Spring Boot application from 2.2 to 2.6. As part of this upgrade, when I run my JUnit tests against an in-memory H2 database, all tests started failing with missing data despite having a data.sql file.
I reviewed the Spring docs and made the following changes in my application.yml file:
spring.sql.init.mode: always
spring.jpa.defer-datasource-initialization: true
spring.jpa.generate-ddl: true # tried this option without the Hibernate one too
spring.jpa.hibernate.ddl-auto: create-drop # tried this option alone too, along with changing it to create
Despite the options above, the data from data.sql is not visible in my test cases, and they all fail. Did anyone encounter a similar issue? If so, what would be a possible resolution?

You just need to rename your data.sql to import.sql. With spring.jpa.hibernate.ddl-auto set to create or create-drop, Hibernate executes import.sql itself right after creating the schema, so the script no longer depends on the ordering of Spring's script-based initializer.
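A minimal sketch of the resulting test setup (the table and row are illustrative, not from the original question):
# src/test/resources/application.yml
spring.jpa.hibernate.ddl-auto: create-drop # Hibernate creates the schema, then runs import.sql
-- src/test/resources/import.sql (renamed from data.sql)
INSERT INTO person (id, name) VALUES (1, 'Alice');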

Related

Why isn't my database cluttered with artifacts from my Spring DBUnit tests?

I am currently testing my Spring repositories and decided to use a MariaDB server instance instead of an in-memory Derby instance because of some complications in a test that involved a database view.
While the tests eventually succeeded without errors or failures, I noticed that I hadn't added a @DatabaseTearDown annotation to my test case. So I decided to check my database for unwanted rows left over from the test and found that it was just as empty as before the test.
Could someone here explain why this is happening?
As you said, you use @Transactional on your test cases; its default behavior in tests is to enclose the entire test in a transaction and automatically roll it back, thus making sure your database is in the same state as it was before the test.
Check this StackOverflow answer - https://stackoverflow.com/a/9817815/1589165
This is documented in the Spring docs - http://docs.spring.io/spring/docs/current/spring-framework-reference/html/integration-testing.html#testcontext-tx
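A minimal sketch of the pattern, assuming the Spring TestContext framework with JUnit 4 (PersonRepository, Person, and the context file are hypothetical):
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml") // hypothetical context file
@Transactional // each test runs in a transaction that is rolled back afterwards
public class PersonRepositoryTest {

    @Autowired
    private PersonRepository repository; // hypothetical repository bean

    @Test
    public void savesAndFindsPerson() {
        repository.save(new Person("Alice"));
        assertEquals(1, repository.count());
        // no teardown needed: the insert above is rolled back automatically
    }
}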

Bluemix Monitoring and Analytics: Resource Monitoring - JsonSender request error

I am having problems with the Bluemix Monitoring and Analytics service.
I have 2 applications with bindings to a single Monitoring and Analytics service. Every ~1 minute I get the following log line in both apps:
ERR [Resource Monitoring][ERROR]: JsonSender request error: Error: unsupported certificate purpose
When I remove the bindings, the log message no longer appears. I also grepped my code for anything related to "JsonSender" or "Resource Monitoring" and did not find anything.
I am doing some major refactoring work on our server, which might have broken things. However, our code does not use the Monitoring service directly (we don't have a package that connects to the monitoring server or anything like that), so I would be very surprised if the problem were due to the refactoring changes. I did not check the logs before making the changes.
Any ideas will help.
Bluemix has 3 production environments: ng, eu-gb, and au-syd. I tested ng and eu-gb, both using 2 applications bound to the same M&A service, and tested with multiple instances. They all work fine.
Meanwhile, I received a similar problem report from someone who claims to be using Node.js 4.2.6.
So there is some more information we need in order to identify the problem:
1. Which version of Node.js are you using? (the Bluemix default or another one)
2. Which production environment are you using? (ng, eu-gb, au-syd)
3. Are you using any environment variables in your application? (either created in code or set as USER-DEFINED variables)
4. One more thing: could you please try deleting the M&A service and creating it again, in case we are trapped in a previous M&A fault:
cf ds <your M&A service name>
cf cs MonitoringAndAnalytics <plan> <your M&A service name>
Node.js versions 4.4.* all appear to work.
Node.js uses OpenSSL and apparently did/does not like how one of the M&A server certificates was constructed.
Unfortunately, Node.js does not expose the OpenSSL verify-purpose API.
Please consider upgrading to 4.4 while we consider how to change the server's certificates in the least disruptive manner, as there are other application types that do not have an issue with them (e.g. Liberty and Ruby).
Setting Node.js version 4.2.4 in package.json worked for me; however, this is just a workaround that bypasses the problem. The actual fix is being handled by the core team. Thanks.
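For reference, a minimal sketch of pinning the Node version in package.json via the engines field (the surrounding fields are illustrative):
{
  "name": "my-app",
  "version": "1.0.0",
  "engines": {
    "node": "4.2.4"
  }
}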

OpenCart - Sensible workflow, database migration?

I'm working on an OpenCart project. (Note: this is my first time dealing with it.)
I want to somehow implement my usual workflow of:
working on localhost, experimenting, etc,
deploying the changes to the production server (sometimes to a staging server before that),
adding the database changes.
Now, how should I achieve this?
What I already did with Git is create an automated deployment flow, which consists of the following:
building a deployment version (checking out master/HEAD's upload/ directory and removing the upload/install directory),
copying the upload/ directory's contents to the target server (see the sketch below).
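A minimal sketch of that flow as a shell script, assuming a local clone and rsync access to the target (paths and host are hypothetical):
# export upload/ from master into a clean build directory
mkdir -p build && git archive master upload/ | tar -xf - -C build
# strip the installer from the deployment version
rm -rf build/upload/install
# copy the contents of upload/ to the target server
rsync -av build/upload/ user@production:/var/www/shop/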
This works fine, but it won't solve the database migration issue.
I think it's not even as simple as updating certain tables in the target server's database from my local database, since, for example, the "settings" table contains data that's specific to the environment.
So I can't just overwrite the settings table with my local version.
It seems to me that the easiest - and ugliest - solution would be to develop on the prod server in parallel with the localhost changes. For example: if I install a module that causes changes in the database, I would need to replicate every step I took installing and configuring that module in the local environment. The same goes for every admin setting I change (meta changes, etc.).
This sounds awfully painful to me, so I hope there's a better solution out there other than doing every database-related change twice...
Thanks in advance!

Spring Roo with GAE error on most basic tests

I have been playing around with Spring Roo for some time now, read most of the documentation, and tried most of the tutorials I could find (pizza, wedding, ...) - all worked well. My next step was to create a basic application that can be deployed on Google App Engine. I just can't get it to work.
I found a simple tutorial that consists of 2 entities and seems to run fine on GAE. It was created using Roo-1.1.0.M2. I did the exact same steps but could not get it to run. I tried Roo-1.1.0.RELEASE as well as the current Roo-1.1.1 development branch. I always ran into the same problem as Ron.
I was able to strip down the test.roo file to
project --topLevelPackage com.springsource.failureexample
persistence setup --provider DATANUCLEUS --database GOOGLE_APP_ENGINE --applicationId failureexample
entity --class ~.domain.Person --testAutomatically
perform tests
Using this example, I get something like ERROR DataNucleus.Transaction - Operation rollback failed on resource: org.datanucleus.store.appengine.DatastoreXAResource, and the build fails in testCountPeople(com.springsource.failureexample.domain.PersonIntegrationTest).
The problem is also described at Roo's issue tracker (where I provided my minimal test script), but since it obviously worked in other setups, I might have misunderstood or overlooked something important in the setup process.
Could you provide some ideas of what my error might be?
How about creating at least one field in your entity?
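For example, in the Roo shell after the entity command (a minimal sketch; the field name is arbitrary):
field string --fieldName name --notNull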

How to actually use Liquibase in a Maven project with SVN

Last week I read the Liquibase quick start and all the related tutorials, but I feel like I don't get the whole drift of using it for consecutive changes to the database, and I still have some open questions.
I'm developing an application in NetBeans using Maven, with 3 modules: a dbunit module, a service module, and the webapp module. As you might have guessed, the dbunit module does the database work, the service module sits on top of it, and the webapp uses the services. The parent POM declares all groupIds, artifactIds, and versions for all jars and plugins.
I managed to generate the changelog file from the command line, since the database already exists. Supposing I set up everything correctly using the Liquibase Maven plugin:
Question 1: What should the Liquibase goal be, given that right now I'm not making any database changes?
Question 2: If I want to add, for example, a new table to the database, do I add the new changeSet to the same changelog file, or do I have to create a new changelog.xml file?
Question 3: I believe that when the dbunit module runs, it will run the changesets, but is it necessary to add the plugin to the webapp module too (maybe to run the Liquibase goal before deployment with the Cargo plugin), or will the dbunit module take care of that?
Question 4: How exactly does Subversion help with keeping the state of the changelog (assuming there is only one changelog; refer to question 2)?
Thanks for reading this and for your help.
See http://www.liquibase.org/documentation/maven
You should bind your Liquibase execution to a phase like:
<phase>process-resources</phase>
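A minimal sketch of that binding in the parent or dbunit POM (the plugin version and file paths are assumptions):
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>1.9.5.0</version> <!-- assumed version; use the one matching your setup -->
  <configuration>
    <changeLogFile>src/main/resources/changelog.xml</changeLogFile>
    <propertyFile>src/main/resources/liquibase.properties</propertyFile>
  </configuration>
  <executions>
    <execution>
      <phase>process-resources</phase>
      <goals>
        <goal>update</goal> <!-- applies any changesets not yet run against the database -->
      </goals>
    </execution>
  </executions>
</plugin>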
But I use a Spring executor too. Every time my app starts up, it runs a Liquibase executor that applies any missing changelogs to the database. This is nice because, when you are done with your work and your tests, Liquibase has updated your dev database but not your production database. Then, when you install your WAR and start your webapp, Liquibase automatically brings that database up to the current state, so you don't have to remember to do extra steps before deploying a new version.
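A minimal sketch of such a Spring-managed executor, assuming an existing dataSource bean (the changelog path is an assumption; in older Liquibase versions the class is liquibase.spring.SpringLiquibase):
<bean id="liquibase" class="liquibase.integration.spring.SpringLiquibase">
  <property name="dataSource" ref="dataSource"/>
  <property name="changeLog" value="classpath:changelog.xml"/>
</bean>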
Keep your changelog.xml as a master file with includes to the individual changelog files, like this:
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog/1.9"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog/1.9
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-1.9.xsd">
    <include file="changelog-0001.xml"/>
</databaseChangeLog>
When you want a new table, add a changelog-0002.xml file and reference it from your master changelog.xml.
See answer 1. I would put it in your webapp module too.
You should have many changelog files, so this question is not really applicable.
