Semantic Versioning 2.0.0 says:
For this system to work, you first need to declare a public API
My web application does not expose any public API; however, it uses a database. Can the database schema be considered the public API, and should a change in the database schema then increment the major version?
According to this, the answer is yes: the database schema can be considered the public API.
I am in the process of making a C# .NET Standard library which should eventually be published on NuGet.
Since the library encapsulates communication with a database, I wanted to use EF Core (Code First) to set up the database model.
Users of the library should be able to generate the database model for an arbitrary database (i.e. the user specifies the connection string) from the code contained in the library, i.e. the migrations should be located in the library.
My concern is: is this scenario even supported by EF Core?
So far, browsing the internet has turned up no information about this scenario.
My questions are:
Is this scenario somehow supported in EF Core?
If not, what are the ways to implement a library which contains the database model (either as a SQL script or as migration code), so that the user of the library can specify a connection string and generate the database model?
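For reference, the scenario being asked about might look roughly like the sketch below, assuming the SQL Server provider and a hypothetical Customer entity; the migrations are compiled into the library assembly, and the consumer supplies only the connection string.

    using Microsoft.EntityFrameworkCore;

    // Hypothetical entity shipped with the library.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // DbContext shipped with the library; the consumer supplies the options,
    // and therefore the connection string.
    public class LibraryDbContext : DbContext
    {
        public LibraryDbContext(DbContextOptions<LibraryDbContext> options) : base(options) { }

        public DbSet<Customer> Customers { get; set; }
    }

    // Consumer-facing helper: point the context at an arbitrary database and apply
    // the migrations that are compiled into the library assembly.
    public static class DatabaseInstaller
    {
        public static void EnsureDatabase(string connectionString)
        {
            var options = new DbContextOptionsBuilder<LibraryDbContext>()
                .UseSqlServer(connectionString) // assumes the Microsoft.EntityFrameworkCore.SqlServer package is referenced
                .Options;

            using (var context = new LibraryDbContext(options))
            {
                context.Database.Migrate(); // creates the database if needed and applies pending migrations
            }
        }
    }

One design point to keep in mind is that generated migrations can contain provider-specific SQL, so supporting arbitrary database engines from a single set of migrations is harder than supporting arbitrary connection strings for one engine.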
I am following the documentation to create a healthcheck for my Dropwizard application for database connectivity. I wanted to know: how does the healthcheck get a Database class? What do I need to import in order to use the Database class mentioned in the documentation?
http://dropwizard.io/0.8.0-rc1/docs/manual/core.html#health-checks
This is just an example providing a general idea of what can be accomplished with healthchecks. There isn't a concrete Database class that you can import. The actual healthcheck would depend on your choice of DB connection: JDBI or Hibernate. Dropwizard also easily allows you to define your own DAO for whatever vendor / protocol you need. And if you read through that documentation you'll see that those database objects are configured just like everything else - with YAML.
Note that you can upgrade to the 0.8.0 release instead of the release candidate.
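For illustration, a minimal healthcheck along those lines might look like the sketch below, assuming a plain JDBC DataSource; the class name and the SELECT 1 validation query are placeholders, not something Dropwizard ships.

    import com.codahale.metrics.health.HealthCheck;

    import javax.sql.DataSource;
    import java.sql.Connection;
    import java.sql.Statement;

    // Hypothetical healthcheck: verifies that a connection can be obtained and a trivial query runs.
    public class DatabaseHealthCheck extends HealthCheck {

        private final DataSource dataSource;

        public DatabaseHealthCheck(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        @Override
        protected Result check() throws Exception {
            try (Connection connection = dataSource.getConnection();
                 Statement statement = connection.createStatement()) {
                statement.execute("SELECT 1"); // adjust the validation query for your vendor
                return Result.healthy();
            } catch (Exception e) {
                return Result.unhealthy("Cannot connect to database: " + e.getMessage());
            }
        }
    }

You would then register it in your Application's run() method, e.g. environment.healthChecks().register("database", new DatabaseHealthCheck(dataSource));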
I'm currently working on a Grails project which has a static production database with a lot of data in it. I would like to test my application using the production data, but instead of having to clone the production database I'd like to set up a proxy database in front of the production database.
Essentially, reads would go all the way to the production database, while writes would stop at the proxy database (preferably an H2 database). If a row that came from the production database was updated, the updated row would be saved to the proxy database and returned, instead of the production row, on subsequent queries.
I'd like to do all of this as transparently to the application as possible. My current line of thinking is that I'd need to fork the Hibernate GORM implementation and make it support this use case. Has this been done before? Is there a better way?
Forking the Hibernate GORM implementation may not be a good idea. You will be stuck on your fork and will have to somehow keep it up to date with the original plugin (e.g. bug fixes, new features).
Maybe a custom TestMixin that allows you to override all registered domain classes, with new implementations of save(), get(), find(), etc., could be an option. You can work with the metaClass to override these static methods, and this will be triggered only in tests annotated with the mixin.
With this approach you can use multiple datasources in the test environment and control which one is used.
Wanted to get your opinions.
When working in a team and using a version control system, at some point one developer will commit code that depends on new modifications that he made to his local database.
Well, once the other team members update the code, their local db is not ready for it.
Is there a nice way to handle this?
Thanks.
Liquibase is good and The Right Thing (tm) for managing database evolution during development.
What's your development platform?
Some web frameworks have 'database migration' features for this purpose, e.g:
Rails migrations
CodeIgniter migration class
FuelPHP migrations
Yii Framework database migration
Is it necessary to do backups of GAE's Datastore?
Does anyone have any experience, suggestions, tricks for doing so?
Backups are always necessary to protect against human error. Since App Engine encourages you to build multiple revisions of your code that run against the same dataset, it's important to be able to go back.
A simple dump/restore tool is explained in the Bulkloader documentation.
Something else I've done in the past for major DB refactors is:
Change the entity name in your new code (e.g. User -> Customer or User2 if you have to)
When looking up an entity by key:
Try the key and return if possible
Try the key for the old db.Model class. If you find it, migrate the data, put() the new entity, and return it (see the sketch after these steps)
Use the entity as usual
(You may have to use a task queue to migrate all the data. If you always fetch the entities by key it's not necessary.)
Deploy a new version of your code so that both coexist server-side. When you activate the new version, it is like a point-in-time snapshot of the old entities. In an emergency, you could reactivate the old version and use the old data.
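As a rough sketch of that lookup-and-migrate step, assuming the old Python db API and hypothetical User (old) and Customer (new) models with a single name property:

    from google.appengine.ext import db

    class User(db.Model):        # old kind, kept only so existing data can still be read
        name = db.StringProperty()

    class Customer(db.Model):    # new kind used by the new version of the code
        name = db.StringProperty()

    def get_customer(key_name):
        """Return the Customer, lazily migrating an old User entity on first access."""
        customer = Customer.get_by_key_name(key_name)
        if customer is not None:
            return customer
        old = User.get_by_key_name(key_name)   # fall back to the old kind
        if old is None:
            return None
        customer = Customer(key_name=key_name, name=old.name)
        customer.put()                          # persist the migrated entity
        return customer

Entities that are never looked up this way would still need the task-queue sweep mentioned above.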
You can now use the managed export and import feature, which can be accessed through gcloud or the Datastore Admin API:
Exporting and Importing Entities
Scheduling an Export
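For example, from the command line (the bucket name and export folder below are placeholders; exports land in a timestamped folder inside the bucket):

    # one-off export of all entities to a Cloud Storage bucket you own
    gcloud datastore export gs://my-backup-bucket

    # restore from a previous export by pointing at its overall_export_metadata file
    gcloud datastore import gs://my-backup-bucket/2020-01-01T00:00:00_12345/2020-01-01T00:00:00_12345.overall_export_metadata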