Get the script of a Django database - SQL Server

I have to modify my models, but I already have data in my database, and I need to clean the database before changing the models. How can I generate a script file containing all of the data currently in my Django database, so that I can wipe the database and later load the data back in from that script?

You can use the following command to dump your data into a JSON file.
python manage.py dumpdata --natural-primary --indent 4 > initial_data.json
Once you have the JSON file, use this to load it back into the database.
python manage.py loaddata initial_data.json
Also, you said you have to modify your models. Once you do that, you may not be able to load the dump as-is, since the schema will have changed.
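If you do dump and reload, it usually also helps to use natural keys and skip Django's contenttypes and auth permission entries, which often cause integrity errors on loaddata. These are standard dumpdata options, though this exact invocation is a sketch rather than part of the original answer:
python manage.py dumpdata --natural-foreign --natural-primary --exclude contenttypes --exclude auth.permission --indent 4 > initial_data.json
python manage.py loaddata initial_data.json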

You don't need to do anything special to preserve the data in your database. Django's migrations are specifically designed to alter existing tables as well as create new ones.
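In practice this means that after editing your models you just generate and apply migrations; the existing rows stay in place as long as the changes are ones Django can apply automatically:
python manage.py makemigrations
python manage.py migrate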

Related

Symfony existing table error when updating database

I have database entities in my Symfony application. I have created another table directly in the database and want to bring it into my program as an entity. When I execute the following command, I get an error saying that some tables already exist.
php bin/console make:migration
How can I update the entities and generate only the ones that do not exist yet?
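Not covered in the question itself, but one commonly suggested direction (an assumption; the exact commands depend on your Symfony and Doctrine versions) is to import the mapping for the existing table and regenerate the entity class, rather than re-running make:migration:
php bin/console doctrine:mapping:import "App\Entity" annotation --path=src/Entity
php bin/console make:entity --regenerate App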

Connecting values in a database to Django models

I have manually imported data (via a .csv file) into a database. How can I now connect the imported values to a Django model so that I can reference the data via application logic (i.e. Model.objects.all().values('field'))?
I know that this is the opposite of the standard process, and as a result I have not been able to find any references to it online.
I would like to be able to call Model.objects.all().values('field')
and display a column of the csv that I imported into the database.
My model is a GeoDjango-based model with environmental data mapped to it. The .shp file is too large to sync directly with the proxy database I am using (it would take an estimated 300+ days), so I transferred the values to a CSV and imported the CSV directly into the model's assigned table in the database. However, in the Django shell I can see that the CSV values were not synced with the models despite being imported into the same table.
It sounds like you want to use the inspectdb feature of Django. inspectdb automatically generates models based on an existing database schema. You can set up your database like you normally would in your settings.py. Then run:
python manage.py inspectdb > models.py
... to generate your models.
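If you only need models for particular tables, inspectdb also accepts table names as arguments (the table name below is a placeholder). Note that the generated classes default to managed = False, so Django will not try to create or alter those tables:
python manage.py inspectdb my_table > models.py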
I wasted 50 reps on this question. Easy fix; it just took me a day to figure out. My CSV was not being imported properly, but I was not receiving any error messages from the managed database I was importing into (Google Cloud). Once I ran some tests to confirm, I noticed nothing had been imported. I then reformatted and re-imported the file, and everything was in sync and ran smoothly. No need for any obscure Django commands or for writing custom commands to sync with databases in unique circumstances. Just plain old CSV formatting issues. Great.
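For reference, a correctly formatted CSV can be loaded into an existing PostgreSQL table with psql's \copy; the host, database, table, and file names below are placeholders:
psql -h db-host -U db-user -d db-name -c "\copy myapp_mymodel FROM 'data.csv' WITH CSV HEADER"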

Copy Postgres database structures but not data

We are creating a Dockerfile that can spin up a Postgres database container, using this image as a basis
https://hub.docker.com/_/postgres/
Every time we test, we want to create a fresh database using the production database as a template - that is, copy the database structures without copying the data in the tables.
Can someone provide an example of doing this? I need something concrete with database urls etc.
There are some good examples here, but some are a bit nebulous to a Postgres newb.
I see examples like this:
pg_dump production-db | psql test-db
I don't know what "production-db" / "test-db" refer to (are they URL strings?), so I am lost. Also, I believe this copies over all the data in the DB, and we really just want to copy the database structures (tables, views, etc).
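Not part of the original question, but as a sketch: pg_dump's --schema-only flag dumps the structure without any table data, and the source and target are given as ordinary connection parameters rather than URLs (host, user, and database names below are placeholders):
pg_dump --schema-only --no-owner -h prod-host -U prod-user production_db | psql -h localhost -U postgres test_db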

What's the difference between generate-migrations-db and generate-migrations-models

What is the difference between
symfony doctrine:generate-migrations-db
and
symfony doctrine:generate-migrations-models
I can't notice any difference. I've tried with tables in the DB, no schema.yml, and no models: both commands have no effect, and no migrations are generated.
I've tried with tables in the DB, a generated schema.yml, and no models: again both commands have no effect, and no migrations are generated.
And lastly I've tried with tables in the DB, a generated schema.yml, and generated models: now both of them generate the same migration classes :|
I can't really understand the difference. Or at least, what is the best way to start using migrations, considering both scenarios: having models but no DB, and having a DB but no models?
Thanks.
Given an existing database with 1+ tables, running
./symfony doctrine:generate-migrations-db
will result in migration files being created for each table. Similarly, given a directory like /lib/model/doctrine filled with pre-existing model classes, running
./symfony doctrine:generate-migrations-models
will result in migration files being created for each model.
With tables in the database, with or without schema.yml contents, and no models in lib/model/doctrine, you just need to ensure your database.yml file has the credentials to connect to your database correctly.
Once you figure out why your migration files are not generating, here is roughly what I would do to get started with migrations:
1. Generate a fresh schema from your existing database with ./symfony doctrine:build-schema
2. Manually clean up the schema file, and re-establish any relations you've already got in your existing model files (if any).
3. Reconfigure config/databases.yml to point to a new blank database.
4. Build migrations with ./symfony doctrine:generate-migrations-diff. This will create migrations based on your schema file to bring your (blank) database up to date.
5. Run ./symfony doctrine:migrate and watch for errors. Fix them by fixing your schema file. Delete the migrations created in Step 4, flush your database with ./symfony doctrine:drop-db && ./symfony doctrine:build-db, and go back to Step 4. Continue until your schema generates a clean set of migration files that run without error.
6. Rebuild your models with ./symfony doctrine:build --model --forms --filters
Now you have a clean schema.yml file, clean migrations that can bring a blank database up to date, and models that directly relate to your schema.yml file and database.
When you want to make a new change to your database, it's now as simple as the steps below (the full command sequence is repeated after the list):
1. Make the desired change in your schema.yml.
2. Run ./symfony doctrine:generate-migrations-diff
3. Manually review the generated migrations.
4. Run ./symfony doctrine:migrate to make sure the migrations run without error.
5. Rebuild your models, forms, and filters with ./symfony doctrine:build --model --forms --filters
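Pulled together, that loop is just the commands from the list above, run in order after editing config/doctrine/schema.yml:
./symfony doctrine:generate-migrations-diff
./symfony doctrine:migrate
./symfony doctrine:build --model --forms --filters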
Going through this process can be frustrating at times, but it's something you only have to do once to create a really good base to build upon.

Django - Compare Model Code to Database

I maintain a Django project with a database that has several model constraints that have fallen out of sync with the actual database. So, for example, some model fields have null=False set, but the database permits NULLs for the corresponding database column.
I'm curious whether there is a utility, either in Django or a third-party Python script, that will take the SHOW CREATE TABLE output (in this case, using MySQL syntax) for each table and compare it with the python manage.py sql output, highlighting the discrepancies.
Granted, in an ideal situation, the database wouldn't fall out of sync with the Django model code in the first place, but since that's where I am, I'm curious if there's a solution to this problem before I write one myself or do the comparison manually.
./manage.py inspectdb generates a models file corresponding to the schema that currently exists in the database.
You can diff it against your current model files using standard Unix diff or any other fancy diffing tool to find the differences and plan your migration strategy.
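For example (the dump file name and app path are placeholders), the comparison can be as simple as:
./manage.py inspectdb > models_from_db.py
diff -u models_from_db.py myapp/models.py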
While the former seems simpler and better, you can also see the diff at the SQL level. ./manage.py sqlall prints the SQL Django would generate for your current models, and show create table table-name shows the SQL with which each table was actually created in the database.
You might want to look at http://code.google.com/p/django-evolution/, which aims to auto-migrate the state of the db to match the current models. Note, however, that this project is old and appears to be abandoned.
I did come up with a quick and dirty means of doing what I described. It's not perfect, but if you run ./manage.py testserver, the test database will be created based on the model code. Then (using MySQL-specific syntax), you can dump the schema for the regular database and the test database to files:
$ mysqldump -uroot -p [database_name] --no-data=true > schema.txt
$ mysqldump -uroot -p [test_database_name] --no-data=true > test_schema.txt
Then you can simply diff schema.txt and test_schema.txt and find the differences.
For PostgreSQL, do a manage.py syncdb on a temporary empty database, then dump production and temporary databases with pg_dump -sOx and compare the resulting files. Among visual diff tools, at least GNOME Meld seems to cope well with PostgreSQL dumps.
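A sketch of that PostgreSQL approach, assuming a scratch settings module that points at an empty database (the module and database names are placeholders):
python manage.py syncdb --settings=myproject.settings_scratch
pg_dump -sOx production_db > prod_schema.sql
pg_dump -sOx scratch_db > model_schema.sql
meld prod_schema.sql model_schema.sql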
