Seed does not run when Entity Framework migrations are applied automatically

When I deploy my app to our staging server, it checks the database on startup and applies any pending migrations, but it does not run the Seed() method. I just changed the connection string to point at an empty database and restarted the app; it recreated the entire database, but inserted no seed data.
Am I missing something here? Do I need to do anything special to get Seed() to run when migrations are applied automatically/implicitly?
Everything runs as expected in my dev environment, because there I apply the migrations explicitly with update-database, and Seed runs every time I call update-database.
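For context, the common way to have EF 6 apply migrations automatically at application start is the MigrateDatabaseToLatestVersion initializer, which is also supposed to call the Seed method of the migrations configuration once the context is first used. Below is a minimal sketch of that setup; MyContext, Role, and the seed data are placeholder names, not taken from the question.

using System;
using System.Data.Entity;
using System.Data.Entity.Migrations;

// Placeholder entity and context; the real ones come from the app
public class Role { public int Id { get; set; } public string Name { get; set; } }
public class MyContext : DbContext { public DbSet<Role> Roles { get; set; } }

internal sealed class Configuration : DbMigrationsConfiguration<MyContext>
{
    public Configuration()
    {
        AutomaticMigrationsEnabled = false;
    }

    protected override void Seed(MyContext context)
    {
        // AddOrUpdate keeps the seed idempotent across repeated runs
        context.Roles.AddOrUpdate(r => r.Name, new Role { Name = "Admin" });
    }
}

public static class Startup
{
    public static void ApplyMigrationsAndSeed()
    {
        // Applies pending migrations and runs Seed when the context is first used
        Database.SetInitializer(new MigrateDatabaseToLatestVersion<MyContext, Configuration>());
        using (var db = new MyContext())
        {
            db.Database.Initialize(force: false);
        }
    }
}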

Related

How to create a database using a Database Project in Visual Studio if it doesn't exist?

I have a Database Project for a personal project and I am trying to deploy my code to my DEV server. I frequently delete and re-create the DEV server; right now it is newly created with just SQL Server installed. Every time I want to deploy my code I have to create the database manually and then publish the database project. I want to automate the database creation as part of the database project deployment.
Right now I have a script that creates the database, but I have to execute it manually. It works perfectly, but I want to automate this step as well.
Is this even possible? If so, how? Please explain step by step. Also, what should I specify for Initial Catalog in the connection string?
Edit:
I tried to create the database with
CREATE DATABASE LocalDbTest
in a pre-deployment script, but it didn't work. The database gets created, but the tables are not created under it. Since I had to use master as the default database, the tables end up under master. Visual Studio won't let me select LocalDbTest as the default database because it doesn't exist yet, so I have to select master. I tried to switch databases with:
USE LocalDbTest
GO
I placed it just after the CREATE DATABASE statement, but that didn't work either, because the generated deployment script switches back to the default database. This part is added automatically when the script is generated:
USE [$(DatabaseName)];
GO
Also, Visual Studio won't let me prefix the table name with the database name, as in:
CREATE TABLE [LocalDbTest].[dbo].[TestTable]
I get the error:
When you create an object of this type in a database project, the object's name must contain no more than two parts.
Answer:
If you have a script ready for database creation, you can use the Pre-build event to call SQLCMD and run your script.
Edit:
If you have trouble pointing at a database that does not exist, you may have to manually edit the publish profile (e.g. dev.publish.xml) and set the TargetDatabaseName element explicitly. You can also set the CreateNewDatabase element to True if you want the database to be recreated every time it is published.
In short: use a publish profile and hardcode the target database in it.
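For illustration, those settings live in the publish profile's PropertyGroup. A minimal hand-edited example is shown below; the server and database names are made up, and the rest of the generated profile is omitted.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- Hardcode the database the project deploys to -->
    <TargetDatabaseName>LocalDbTest</TargetDatabaseName>
    <TargetConnectionString>Data Source=DEVSERVER;Integrated Security=True</TargetConnectionString>
    <!-- Drop and recreate the database on every publish -->
    <CreateNewDatabase>True</CreateNewDatabase>
  </PropertyGroup>
</Project>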

SQL Server Database Project

I want to use a database project for script deployment to Azure SQL Server; I don't want to import the full database, just use the database project for delta scripts. I added a project and included one script file, with None as the build action, that contains a CREATE TABLE statement. When I publish the project it completes successfully, but the CREATE statement is not executed. What is wrong here? Is there another way to do this?
TLDR: Set your build action to "Post Deployment Script".
Longer:
What happens in SSDT is that all the files with a build action of "Build" are compiled into a model of what the database should look like. When the deployment happens, that model is compared to the target database and, if there are any differences, a change script is generated and then optionally deployed.
Any file marked as a pre- or post-deployment script is prepended or appended to that change script and runs as part of the deployment.
Any file with a build action of "None" is ignored by SSDT; you could put anything in there, even an ASCII picture of a donkey, and the project would still build and deploy (obviously your ASCII donkey won't get deployed anywhere).
If you just want to use SSDT to run your deployments, you can set the build action to pre- or post-deployment and the script will be included. This is a pretty odd setup, though: either don't use SSDT, or use SSDT and put the model of your entire database in there.
Personally, I would use SSDT properly and live the dream.
Ed

What is the common practice to create a DB schema in Cloud Foundry?

I have been searching for a while for a best practice to initialize the relational database schema and pre-populate data.
There are a couple of ways to make it happen:
Install cf-ex-phpmyadmin and import the schema and data through it
Use the VMC CLI tool to create a tunnel to the service, as described in this link
If using Ruby or Python, use the db migration command in the manifest.yml. However, it will be executed on each instance and every time the instance re-stages.
Which one is commonly used and most effective?
VMC is very old and is no longer supported. I'd be surprised if it even works against a Cloud Foundry installation that has been deployed within the last couple of years. You should use the new cf CLI.
If you were to put the command in your manifest, you could avoid having it run on every instance by adding a conditional guard that only runs the migrations when $CF_INSTANCE_INDEX equals 0. However, it's not always a great idea to run migrations in your start command: there is a hard timeout on the start command, and you don't want long-running migrations to be interrupted.
A good suggestion I've heard [1] is that migrations should be handled as a separate part of your deploy process, either over cf ssh or by running them locally, pointed at the URL and credentials of your database service instance.
[1] credit to Travis Grathwell for this suggestion.

See which data would be deleted before updating database

I'm using EF 6 Code First and have just changed my data model.
When I run the update-database command from the NuGet Package Manager Console, I get the error:
Automatic migration was not applied because it would result in data
loss. Set AutomaticMigrationDataLossAllowed to 'true' on your
DbMigrationsConfiguration to allow application of automatic migrations
even if they might cause data loss. Alternately, use Update-Database
with the '-Force' option, or scaffold an explicit migration.
I understand this error, but is there any way I can see which data would be deleted, before I run -Force and actually delete it?
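One way to preview the damage (not from the original thread, just a sketch of a common approach) is to have EF generate the SQL instead of executing it, either with Update-Database -Script in the Package Manager Console or programmatically via MigratorScriptingDecorator, and then look for DROP TABLE / DROP COLUMN statements. Configuration below is assumed to be the project's existing DbMigrationsConfiguration class.

using System;
using System.Data.Entity.Migrations;
using System.Data.Entity.Migrations.Infrastructure;

public static class MigrationPreview
{
    public static void PrintPendingSql()
    {
        // Allow the script to be generated even though it would drop data
        var configuration = new Configuration { AutomaticMigrationDataLossAllowed = true };

        // ScriptUpdate produces the SQL that Update-Database would run, without touching the database
        var scripter = new MigratorScriptingDecorator(new DbMigrator(configuration));
        string sql = scripter.ScriptUpdate(sourceMigration: null, targetMigration: null);

        Console.WriteLine(sql); // inspect for DROP TABLE / DROP COLUMN before running -Force
    }
}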

Magento: database synchronization between production, staging & development

I've been reading up today on database synchronization in Magento.
One thing I am currently struggling with is what needs to be synced during development and during deployments to production. Assuming that a batch of changes will consist of changes to the DB and code alike, below is my understanding of a model workflow (I do not currently use a staging server, so that is bypassed in this example):
1. Sync dev DB from production DB
2. Check out a working copy of the code to the dev machine
3. Make changes and test them on the dev server
4. Accept changes and commit them to the SVN repository
5. Touch Maintenance.flag on the production server and prepare for upgrades (this altogether eliminates sync issues from users interacting with live data that is about to change, right?)
6. Merge branches to trunk and deploy the repository to the production server
7. Sync dev DB back to production DB and test changes
So items # 1 & 7 I don't fully understand when working with Magento:
What needs to be synced and what doesn't?
It seems ridiculous to sync order and customer info to me so I wouldn't do it.
I would want product schema and data synced though obviously, and any admin changes, module changes, etc. How to handle that?
What about HOW to sync? (MySql dumps, import/export, etc)
Currently I'm using Navicat 10 Premium which has structure and data sync features (I haven't experimented yet but they look like a huge help)
So I don't necessarily need specifics here (but they would help). More or less I want to know what works for you and how happy you are with that system.
If you are using the CE version, then:
Ditch SVN and use Git :)
Never sync a database; prepare your database upgrades as extension upgrade scripts
Have three sites: dev, stage, live
The live database is copied over to stage and dev when needed
Make all your admin changes on live and just copy the whole database down the line
This way you never have to sync a database, and if you do all config changes via extension upgrade scripts you can cold-boot your Magento onto a new database structure wherever you want without losing the data structure
I use PHPUnit to build a dev DB. I wrote a short script which dumps XML data from the live database, and I used it table by table, munging anything sensitive and deleting what I didn't need. The schema for my dev database never changes and never gets rebuilt; only the data gets dropped and recreated on each PHPUnit run.
This may not be the right solution for everyone, because it's never going to be good for syncing dev up to stage/production, but I don't need to do that.
The main benefit is how little data I need for the dev DB. It's about 12,000 lines of XML and populates maybe 30 different tables. A few small core tables persist because I don't write to them, and many tables are empty because I don't use them.
The database is a representative sample and is very small: small enough to edit as a text file, and it takes only a few seconds to populate each time I run the tests.
Here's what it looks like at the top of each PHPUnit test. There's good documentation for PHPUnit and DbUnit.
<?php
require_once dirname(dirname(__FILE__)) . DIRECTORY_SEPARATOR . 'top.php';
require_once "PHPUnit/Extensions/Database/TestCase.php";

class SomeTest extends PHPUnit_Extensions_Database_TestCase
{
    /**
     * @return PHPUnit_Extensions_Database_DB_IDatabaseConnection
     */
    public function getConnection() {
        // Credentials for the dev database (constants defined elsewhere, e.g. in top.php)
        $database = MY_DB;
        $hostname = MY_HOST;
        $user     = MY_USER;
        $password = MY_PASS;
        $pdo = new PDO("mysql:host=$hostname;dbname=$database", $user, $password);
        return $this->createDefaultDBConnection($pdo, $database);
    }

    /**
     * @return PHPUnit_Extensions_Database_DataSet_IDataSet
     */
    public function getDataSet() {
        // The seed file DbUnit uses to repopulate the tables before each test
        return $this->createXMLDataSet(dirname(dirname(__FILE__)) . DIRECTORY_SEPARATOR . 'Tests/_files/seed.xml');
    }
}
So now you just need a seed file that DbUnit reads from to repopulate your database each time the unit tests are run.
Start by copying your complete database twice. One copy will be your dev database and the second will be your "pristine" database, which you can dump XML data from if you start having key issues.
Then use something like my XML dumper against the "pristine" database to get your XML dumps and begin building your seed file.
generate_flat_xml.php -tcatalog_product_entity -centity_id,entity_type_id,attribute_set_id,type_id,sku,has_options,required_options -oentity_id >> my_seed_file.xml
Edit the seed file to use only what you need. The small size of the dev db means you can examine differences just by looking at your database versus what's in the text files. Not to mention it is much faster having less data.

Resources