Connecting values in a database to Django models - django-models

I have manually imported data (via a .csv file) into a database. How can I now connect the values in the database to a Django model so that I can reference the data via application logic (i.e. Model.objects.all().values('field'))?
I know that this is the opposite of the standard process, and as a result I haven't been able to find any references to it online.
I would like to be able to call Model.objects.all().values('field') and have it return a column of the CSV that I imported into the database.
My model is a GeoDjango-based model with environmental data mapped to it. The shapefile is too large to sync directly with the proxy database I am using (it would take an estimated 300+ days), so I transferred the values to a CSV and imported the CSV directly into the model's assigned table in the database. However, when using the Django shell I can see that the values from the CSV were not synced with the model, despite being imported into the same table.

It sounds like you want to use Django's inspectdb feature. inspectdb automatically generates models based on an existing database schema. You can set up your database connection as you normally would in settings.py, then run:
python manage.py inspectdb > models.py
... to generate your models.
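If the table already exists and you just need a model that points at it, you can also write the model by hand and tell Django not to manage the table. A minimal sketch (the model name, field names, and table name below are placeholders for your own):

from django.db import models

class EnvironmentalRecord(models.Model):
    station_id = models.CharField(max_length=50)   # placeholder fields matching your CSV columns
    value = models.FloatField()

    class Meta:
        managed = False                              # Django won't create, alter, or drop this table
        db_table = 'myapp_environmentalrecord'       # the existing table the CSV was loaded into

With that in place, EnvironmentalRecord.objects.values('value') reads straight from the imported rows.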

I wasted 50 rep on this question. Easy fix; it just took me a day to figure out. My CSV was not being imported properly, but I was not receiving any error messages from the managed database I was importing into (Google Cloud). Once I ran some tests to confirm, I noticed nothing had been imported at all. I then reformatted and re-imported the file, and everything was in sync and ran smoothly. No need for any obscure Django commands or custom commands for syncing with databases in unique circumstances. Just plain old CSV formatting issues. Great.
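For anyone hitting a similarly silent failure, a quick sanity check from the Django shell makes it obvious whether the rows actually landed in the table (the app and model names below are placeholders):

python manage.py shell
>>> from myapp.models import EnvironmentalRecord    # placeholder app/model
>>> EnvironmentalRecord.objects.count()             # 0 here means the CSV import silently did nothing
>>> EnvironmentalRecord.objects.values('field')[:5] # spot-check a few rows once the count looks right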

Related

Load all CSVs from path on local drive into AzureSQL DB w/Auto Create Tables

I frequently need to validate CSVs submitted by clients to make sure that the headers and values in the file meet our specifications. Typically I do this with the Import/Export Wizard, having the wizard create the table based on the CSV (the file name becomes the table name, and the headers become the column names). Then we run a set of stored procedures that checks the information_schema for said table(s) and matches it up with our specs, etc.
Most of the time this involves loading multiple files at a time for a client, which becomes very time-consuming and laborious very quickly when using the import/export wizard. I tried using an xp_cmdshell SQL script to load everything from a path at once to get the same result, but xp_cmdshell is not supported on Azure SQL DB.
https://learn.microsoft.com/en-us/azure/azure-sql/load-from-csv-with-bcp
The above says that one can load using bcp, but it also requires the table to exist before the import... I need the table structure to mimic the CSV. Any ideas here?
Thanks
If you want to load the data into your target SQL DB, you can use Azure Data Factory (ADF) to upload your CSV files to Azure Blob Storage, and then use a Copy Data activity to load the data from the CSV files into Azure SQL DB tables, without creating those tables upfront.
ADF supports auto-creating sink tables (the 'auto create table' option on the sink). See this, and this
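If you would rather keep everything script-driven (along the lines of the xp_cmdshell idea), a rough Python sketch using pyodbc can create each table from the CSV header and then bulk-insert the rows. The connection string, file name, and NVARCHAR(4000) column type below are assumptions to adapt to your environment:

import csv
import pyodbc

# Assumed connection details: replace with your Azure SQL server/database/credentials
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=myserver.database.windows.net;Database=mydb;"
    "Uid=myuser;Pwd=mypassword;Encrypt=yes"
)
cursor = conn.cursor()

path = "client_file.csv"      # hypothetical CSV submitted by a client
table = "client_file"         # table named after the file, as the wizard does

with open(path, newline="") as f:
    reader = csv.reader(f)
    headers = next(reader)
    # Create the table with one NVARCHAR column per CSV header
    columns = ", ".join(f"[{h}] NVARCHAR(4000)" for h in headers)
    cursor.execute(f"CREATE TABLE [{table}] ({columns})")
    # Bulk-insert the remaining rows (assumes every row has the same number of fields as the header)
    placeholders = ", ".join("?" for _ in headers)
    cursor.fast_executemany = True
    cursor.executemany(f"INSERT INTO [{table}] VALUES ({placeholders})", list(reader))

conn.commit()
conn.close()

Looping that over a folder of CSVs covers the "load everything from a path at once" part, and your existing information_schema checks can run unchanged afterwards.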

Cleared SQL Server tables still retain some data

I made a custom application that has been running for several years and is full of company data.
Now I need to replicate the application for another customer, so I set up a new server, cloned the databases, and emptied all the tables.
Then I shrank the database and its files.
On the SQL Server side the databases look empty, but if I run a grep search on the database files (.mdf and .log) I can still find occurrences of the previous company name, even in the system databases.
How do I really clean a SQL Server database?
Don't use backup/restore to clone a database for distribution to different clients. These commands copy data at the physical page/extent level, which may contain artifacts of deleted data, dropped objects, etc.
The best practice for this need is to create the new database's schema and system data from scratch using T-SQL scripts (ideally source-controlled). If you don't already have these scripts, T-SQL scripts for schema and data can be generated from an existing database using the SMO API via .NET code or PowerShell. Here's the first answer I found with a search that uses the Microsoft.SqlServer.Management.SMO.Scripter class. Note you can script data too (INSERT statements) by specifying the ScriptData scripting option for the desired tables.
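If you don't want to write .NET or PowerShell against SMO, the mssql-scripter command-line tool (a pip-installable wrapper around the same scripting engine) can produce equivalent output. Roughly something like the following, though check the tool's help for the exact flags, and the server, database, and credentials here are placeholders:

pip install mssql-scripter
mssql-scripter -S myserver -d MyCleanDb -U myuser -P mypassword --schema-and-data -f clean_db.sql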

Importing Data Using Unix to Oracle DB

I want to import data on a weekly basis to an Oracle DB.
I receive this data in EDR format at a specific location on a server. For now I'm uploading it manually using the Toad for Oracle uploader wizard. Is there any way to upload it automatically using Unix or any kind of scripting?
I would suggest trying SQL*Loader from a shell script.
Code:
sqlldr userid=username/password@server control=loader.ctl
There are two important files:
a. the data file to be uploaded;
b. the control file, which names the table to be inserted into, the delimiter character, the column fields, etc. Basically it describes how to load the data.
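As a rough sketch (the table name, columns, file paths, and schedule below are all made up for illustration), the control file, a small wrapper script, and a weekly cron entry could look like this:

-- loader.ctl: describes how to load the weekly file
LOAD DATA
INFILE '/data/incoming/weekly_feed.csv'
APPEND
INTO TABLE edr_data
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(record_id, event_time, cell_id, bytes_used)

#!/bin/sh
# load_weekly.sh: run SQL*Loader against the latest file
sqlldr userid=username/password@server control=/home/oracle/loader.ctl log=/home/oracle/loader.log bad=/home/oracle/loader.bad

# crontab entry: run the script every Monday at 02:00
0 2 * * 1 /home/oracle/load_weekly.sh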
Oracle Reference

Get the script of a Django database

I have to modify my models, but I already have data in my database, and I have to clean it in order to modify my models. How can I make a script file with all of the data currently in the Django database, so that I can clean the database and later load the data from the script back into it?
You can use the following command to dump the data into a JSON file.
python manage.py dumpdata --natural-primary --indent 4 > initial_data.json
Once you have the JSON file, just use this to load it back into the database.
python manage.py loaddata initial_data.json
Also, you said you have to modify your models; once you do that, you won't be able to load the dump, since the schema will have changed.
You don't need to do anything special to preserve the data in your database. Django's migrations are specifically designed to alter existing tables as well as creating new ones.
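For example, once the dump is safely stored, the usual workflow for changing models is just:

python manage.py makemigrations
python manage.py migrate

Django generates and applies ALTER statements against the existing tables, so the data stays in place unless you remove the fields that hold it.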

Data transfer from one database to another database in Odoo

I have one database, and I want to transfer its data to a new database. All tables have the same fields in both databases. I can use the export feature of OpenERP, but I need to maintain the relationships between the Odoo tables, and there are so many tables that I don't know which ones to import first into the new database so that the other tables' data imports without problems.
Is there an easy and simple way I can do this?
There are two ways in which you can take a backup:
• By hitting the database manager URL: server/web/database/manager.
• By using the import/export and validation functionality provided by Odoo.
Backup -> We can take a full backup of the system and keep the zip file for a future restore. For that, we have to hit this URL: http://localhost:8069/web/database/manager
Restore -> In a similar manner, we can restore the database on the new server by uploading the zip file we just downloaded.
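As an alternative to the web database manager, Odoo's data lives in PostgreSQL, so a command-line copy is also possible. A rough sketch (the database names, PostgreSQL user, and filestore path are assumptions; the filestore location depends on your data_dir setting):

pg_dump -Fc -U odoo old_db > old_db.dump
createdb -U odoo new_db
pg_restore -U odoo -d new_db old_db.dump
cp -r ~/.local/share/Odoo/filestore/old_db ~/.local/share/Odoo/filestore/new_db

Copying the whole database this way preserves every relationship between tables, so you don't have to work out an import order.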
