Transforming the Entirety of a PostgreSQL database into an Azure SQL Database [closed] - sql-server

I am attempting to migrate an entire PostgreSQL database, roughly 10 GB with 315 tables and 10,000 columns, into Azure SQL format on an Azure SQL server.
I have a remote connection to both the PostgreSQL server and the Azure SQL server, and I am running SSMS 18 with the SQL Server Import and Export Wizard. I connect to the PostgreSQL source via ODBC and to the destination with a SQL Server password-authentication account.
I open the SQL Server Import and Export Wizard, select my database, and select all 315 tables. I've had to manually go through hundreds of data types to map them, as I have no mapping file and no idea how to create one.
I run my script, and hit truncation errors.
Error 0xc020902a: Data Flow Task 2: The "Source 9 - table_name" failed because truncation occurred, and the truncation row disposition on "Source 9 - table_name.Outputs[ADO NET Source Output].Columns[column_name]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
(SQL Server Import and Export Wizard)
I have tried many things to solve this:
#1: Ignoring the columns. Even when set to ignore, the tool still tries to import them, and still crashes.
#2: Setting truncation fail mode to ignore. Apparently, other people's copies of the SQL Import and Export Wizard had the tools to change the truncation fail state. However, mine does not.
#3: Ignoring the tables. Unfeasible, too many tables, and I need them.
#4: varchar(MAX). Using MAX instead of the default results in no behavior change.
Any advice is appreciated, and I am willing to completely switch what program I am using as long as I do not have to manually map hundreds of data types again.
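(Not from the original post, but as a hedged illustration: one way to see which source columns are likely tripping the truncation check is to list the declared character lengths straight from the PostgreSQL catalog and compare them with the widths the wizard mapped them to. The type filter below is an assumption.)

```sql
-- PostgreSQL: list every character-typed column and its declared length,
-- so the values can be compared against the widths the wizard mapped them to.
SELECT table_schema,
       table_name,
       column_name,
       data_type,
       character_maximum_length
FROM information_schema.columns
WHERE data_type IN ('character varying', 'character', 'text')
ORDER BY table_schema, table_name, ordinal_position;
```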

Related

Is there a way to monitor which data items are queried in SQL Server? [migrated]

I am working on a largish SQL Server database and we have Extended Events logging switched on.
One of the main tables has a column called DataItem which contains a relatively small (<100) number of values across the millions of records.
The client would like a report showing who has accessed each DataItem, when it was accessed, and with which technology.
Is there any SQL Server function or other software that can provide this?
Extended Events gives the who, when, and how, but not the what.
You can use SQL Server Audit for this.
https://learn.microsoft.com/en-us/sql/relational-databases/security/auditing/sql-server-audit-database-engine?view=sql-server-ver16
Create a server audit, where you set its name, the file location, and how large the audit file can grow, then enable it.
In the database you would like to audit, add a Database Audit Specification.
Choose your Audit Action Type, in this case SELECT, but perhaps you would also like to see the UPDATE, INSERT and DELETE actions as well.
Don't forget to enable the specification too.
After running some selects, you can read back the audit information.
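A minimal T-SQL sketch of those steps; the audit name, file path, database and table names below are placeholders, not from the original answer:

```sql
-- Create and enable a server audit that writes to a file (path is an assumption).
USE master;
CREATE SERVER AUDIT DataItemAudit
    TO FILE (FILEPATH = N'D:\Audits\', MAXSIZE = 1 GB);
ALTER SERVER AUDIT DataItemAudit WITH (STATE = ON);

-- In the user database, audit SELECT (and optionally DML) on the table of interest.
USE MyDatabase;   -- placeholder database name
CREATE DATABASE AUDIT SPECIFICATION DataItemAuditSpec
    FOR SERVER AUDIT DataItemAudit
    ADD (SELECT, UPDATE, INSERT, DELETE ON dbo.MainTable BY public)
    WITH (STATE = ON);

-- Read the captured events back from the audit files.
SELECT event_time, server_principal_name, action_id, statement
FROM sys.fn_get_audit_file(N'D:\Audits\*.sqlaudit', DEFAULT, DEFAULT);
```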

MSSQL - select didn't return values for all columns properly

I would like to ask a perhaps basic question. My SSIS package runs a SQL query and exports the result to a *.csv file. Everything has worked fine for months (the SQL select runs and the records end up in the CSV export), but yesterday I had an issue: the SSIS package ran without error, yet in the *.csv export one column contained no records (the column header was in the CSV file, but no data), while the other columns did have records, even though they come from the same table as the missing column. When I ran the SSIS package manually again (about 4 minutes later), the data was in the column. SSIS is deployed and runs on the same server as the database.
My question is: given that there is no error in the SSIS execution report, the data flow task ran correctly, and the query now returns all data without any editing, is there any chance to find out what caused the data loss? Maybe in a server log?
Thank you.
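(No answer was posted, but if the package is deployed to the SSIS catalog, a first place to look is the messages the catalog logged for that execution. A hedged sketch; the package name is a placeholder:)

```sql
-- SSISDB: review the messages logged for recent executions of the package.
SELECT e.execution_id,
       e.start_time,
       m.message_time,
       m.message_type,
       m.message
FROM SSISDB.catalog.executions AS e
JOIN SSISDB.catalog.event_messages AS m
    ON m.operation_id = e.execution_id
WHERE e.package_name = N'ExportToCsv.dtsx'   -- placeholder package name
ORDER BY m.message_time;
```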

Copying database from SQL Server 2008 to SQL Server 2016 [duplicate]

This question already has answers here:
Is there any way to generate database scripts from a SQL query in SQL Server?
(2 answers)
Closed 4 years ago.
I am presently using SQL Server 2008. I want to move the entire application and database to SQL Server 2016 without data, i.e. I don't want to copy the data itself, but I need all the tables and everything else from the previous server. Can anyone please help me with this?
You'll want to script out each database. To do so:
right click on the database
tasks
generate scripts
using the wizard, choose all the data objects you want (tables, views, etc.) or all of them
save to a location
open the file from that location
boom, there is your code to set up all your tables, etc. for your database!
Just repeat for each database you want to 'copy'. Just remember (as requested) that each table will have no data in it. Alternatively, you can back up and restore each database and then truncate each table, though that is probably a lot more work (a rough sketch of that alternative follows below).
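A rough sketch of that alternative, with placeholder database, file, and logical names (sp_MSforeachtable is an undocumented but commonly used procedure):

```sql
-- Restore a copy of the database on the new server (names and paths are placeholders).
RESTORE DATABASE MyDb
FROM DISK = N'C:\Backups\MyDb.bak'
WITH MOVE N'MyDb' TO N'D:\Data\MyDb.mdf',
     MOVE N'MyDb_log' TO N'D:\Data\MyDb_log.ldf';

-- Then empty every table. TRUNCATE fails on tables referenced by foreign keys,
-- so DELETE is used here as the simpler (if slower) option.
USE MyDb;
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';
EXEC sp_MSforeachtable 'DELETE FROM ?';
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';
```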

Access database design in SQL [closed]

We have 52 MS Access databases and each database has 4 tables. The total data in our databases is around 5 million records. We are now planning to move to SQL Server, and we have designed our new database, which will be a SQL Server database with approximately 60 tables.
My question is - how will we integrate the 52 Access databases into one SQL Server database?
Is it possible, or would we have to create 52 databases in SQL Server too in order to migrate our data? These 52 databases are interrelated with each other and have the same structure in Access.
If I was you (and I'm not, but if I was...) I would load all of that data into 4 tables. Just append all the data from each Access database into one table. Doctor, Project, Contract, Institution. However, as I'm appending each database, I would add a new field to each table; Country. Then, when you append the data for England to the tables, you also populate the Country field of that table with "England". Etc... with all your countries.
Now, when it comes time to access the data, you can force certain users to only be able to see the data for England, and certain other people to only see the data for Spain, etc... This way, those 4 tables can house all of your data, and you can still filter by any country you like.
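A hedged T-SQL sketch of that approach, assuming the Access data has already been pulled into a staging table; every name here is a placeholder:

```sql
-- Add the discriminating Country column to the consolidated table.
ALTER TABLE dbo.Doctor ADD Country nvarchar(50) NOT NULL DEFAULT N'Unknown';

-- Append one country's data, stamping every row with its country of origin.
INSERT INTO dbo.Doctor (DoctorId, DoctorName, Country)
SELECT DoctorId, DoctorName, N'England'
FROM staging.Doctor_England;   -- placeholder staging table for the England database
```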
From a technical point of view, there's no problem in creating only one SQL Server database, containing all 52 * 4 tables from the MS Access databases. SQL Server provides various options for logically separating your objects, for example by using Schemas. But then again, if you decide to create separate databases, you still have the ability to write queries across databases, even if the databases are not hosted on the same SQL Server instance (although there might be a performance penalty when writing queries across linked servers).
It's difficult to give a more precise answer with the limited detail in your question, but in most cases, a single database with multiple database schema (perhaps 1 schema for each MS access database) would probably be the best solution.
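For illustration, a small sketch of the one-database, schema-per-source layout (all names hypothetical):

```sql
-- One database, one schema per original Access database.
CREATE SCHEMA England;
GO
CREATE SCHEMA Spain;
GO

-- The same four tables are created once per schema.
CREATE TABLE England.Doctor (DoctorId int PRIMARY KEY, DoctorName nvarchar(100));
CREATE TABLE Spain.Doctor   (DoctorId int PRIMARY KEY, DoctorName nvarchar(100));
GO

-- A single query can still read across schemas (or across databases).
SELECT 'England' AS Source, COUNT(*) AS TotalRows FROM England.Doctor
UNION ALL
SELECT 'Spain', COUNT(*) FROM Spain.Doctor;
```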
Now, for migrating the data from MS Access to SQL Server, you have various options. If you just need to perform a one-time migration, you could simply use the Import-Export wizard that comes with SQL Server. The wizard automatically creates the tables in the destination database for you, and it also lets you save SSIS-packages that you can use to migrate the data again.

how to take sql database backup without data [duplicate]

This question already has answers here:
Possible Duplicate: Backup SQL Schema Only?
Closed 12 years ago.
Can anybody tell me how to take a SQL database backup without data? I want to take all the tables and structures from SQL Server 2008 and import them into another SQL Server 2008 instance. I don't need the data.
Use "tasks" -> "Generate scripts" and choose what you want to script. Run, save to a file, open the file against the new database and run the script after changing the database name to match (if it changed)
The first time, you can use the script database option, as Otavio suggested.
For subsequent times you can use a tool like Redgate SQL Compare or the Schema Compare functionality of Visual Studio Database Edition. These tools allow you to sync the schema (i.e. the 'table structure') from one database to another.
You can do this by making scripts.
The way to go is:
Right-click the table you want to script to the other database.
Script Table as -> CREATE To -> New Query Editor Window.
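For illustration only (not the wizard's literal output), the generated script looks roughly like this, and you run it against the new database; the names here are hypothetical:

```sql
-- Example of the kind of CREATE script SSMS generates for a table.
USE [NewDatabase];
GO
CREATE TABLE [dbo].[Customer](
    [CustomerId] [int] IDENTITY(1,1) NOT NULL,
    [Name]       [nvarchar](200)     NOT NULL,
    CONSTRAINT [PK_Customer] PRIMARY KEY CLUSTERED ([CustomerId] ASC)
);
GO
```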
