I have to export all tables from a database and then import them back. I generated bcp commands via the SQLAzureMW tool, and the tool somehow orders the tables in the proper way to avoid FK dependency issues. I don't think this tool has an option to generate them via cmd or something, and I'm not able to use the UI in my scenario. So the question is: can I get the list of the tables in that order via T-SQL?
I'm also not able to use the backup/restore approach, so I'm looking for other ways to accomplish the task. BCP works really fast and I prefer it, but I don't want to hardcode the order of the tables; if someone adds a new table with dependencies, the script will no longer work.
This stored procedure:
EXEC sp_msdependencies @flags = 8
helped me and gave me the correct order. I then just did the export/import via bcp in that order and everything worked as expected.
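For anyone who wants to script the next step as well, here is a minimal sketch that captures that ordered list and builds one bcp export command per table. sp_msdependencies is undocumented, so the output columns used below (oType, oObjName, oOwner, oSequence) are an assumption that may vary between versions, and the bcp switches are only illustrative:
-- Capture the dependency-ordered object list (column layout assumed;
-- sp_msdependencies is undocumented and may differ between versions).
CREATE TABLE #deps (oType int, oObjName sysname, oOwner sysname, oSequence int);
INSERT INTO #deps EXEC sp_msdependencies @flags = 8;
-- Emit one bcp export command per table, in dependency order.
SELECT 'bcp "' + DB_NAME() + '.' + oOwner + '.' + oObjName
     + '" out "' + oObjName + '.dat" -n -T'
FROM #deps
ORDER BY oSequence;
DROP TABLE #deps;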
I am trying to generate a data only script for a view using SSMS Generate Scripts feature. However, even though I select Data Only, it still only generates the view definition and not the data generated by the view. Is there something else I need to do?
I'm using SSMS for SQL Server 2014.
I know this is old, but I will answer it for other people who stumble on it.
Generate Scripts -> Data Only is bugged for views.
The easiest option without searching for other stored procedures or external tools is to copy the view contents into a table. Generate Scripts -> Data Only works fine with tables.
For example,
SELECT *
INTO NEWTABLE
FROM dbo.Component
Then you can do Generate Scripts on the NEWTABLE and select Data Only in Advanced and it will work.
You can then delete the NEWTABLE.
Given that Generate Scripts still doesn't appear to work for view data as of SSMS v17.9.1, an alternative depending on your needs might be to use the SQL Server Import and Export Wizard. You can read data from a view and write it to a table, across different databases and servers without resorting to a linked server.
SSMS is still poor at this; VS has been able to do this for a while:
Use menu VIEW->SQL SERVER OBJECT EXPLORER
Create a new server
Navigate down to your table or view, right-click -> View Data
Use the filter to limit the dataset to what you are interested in
Then use the SCRIPT command (also available on context menu)
This works for views and tables.
Not super easy, but I'll give it an A-. Way better than other hacks that used to be available (including SSMS.ExportData, which is not great).
Hope that helps someone. I just had to export some rows and had to re-remember how to do this.
greg
I want to move the structure of the user-defined tables and all the stored procedures and functions from an old database to a new database through a SQL query...
I created the database through a stored procedure and that worked fine, but it is not creating the tables inside the database.
I tried the following queries, but they are not working, and I don't know whether the approach is proper or not...
SELECT INSERT newdb..+name SELECT * FROM olddb..name FROM sys.tables
Or can I use the method below?
select * into newDB.sys.tables from oldDB.sys.tables where type='U'
How can I achieve this...
Can anyone help me...?
Thank you...
I think the easiest way to do this is to generate a database script.
Here is a nice article about generating a script from a database. Just generate the script using the given article and run it against your new DB. Make sure you select the "Schema only" option in "Types of data to script" while generating the script.
PS: never try to do this with both data and schema on production; it might cost quite a bit of resources, which could slow down your websites.
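If you do want to stay purely in T-SQL, here is a rough sketch of the dynamic-SQL approach the question was attempting. It copies structure only (columns and types; no constraints, indexes, or defaults), and it assumes both databases are on the same instance, everything lives in the dbo schema, and a reasonably recent SQL Server; the database names are the question's own:
DECLARE @sql nvarchar(max) = N'';
-- Build one SELECT ... INTO per user table; WHERE 1 = 0 creates the
-- table structure without copying any rows.
SELECT @sql += N'SELECT * INTO newdb.dbo.' + QUOTENAME(name)
             + N' FROM olddb.dbo.' + QUOTENAME(name)
             + N' WHERE 1 = 0;' + CHAR(13)
FROM olddb.sys.tables
WHERE type = 'U';
EXEC sys.sp_executesql @sql;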
Often I need to extract the complete schema of an existing SQL Server DB to a file. I need to cover every object -- tables, views, functions, SPs, UDDTs, triggers, etc. The purpose is so that I can then use a file-diff utility to compare that schema to a baseline reference.
Normally I use Enterprise Manager or Management Studio to script out the DB objects and then concatenate those files to make one big file in a consistent predictable order. I was wondering whether there's a way to accomplish this task in Python? Obviously it'd take an additional package, but having looked at a few (pyodbc, SQLAlchemy, SQLObject), none of them seem really suited to this use case.
If you can connect to SQL Server and run queries in Python, then yes, it's possible, but it will take a lot of effort and testing to get it to work correctly.
The idea is to use the system tables to get details about each object and then generate DDL statements based on them. The definitions of programmable objects (views, procedures, functions, triggers) already exist in sys.syscomments; table DDL has to be reconstructed from the column metadata.
Start off by executing and examining this in SSMS before you start working in Python.
-- User tables
select *
from sys.tables

-- Columns of every object
select *
from sys.all_columns

-- Views
select *
from sys.views

-- Source text of programmable objects (views, procedures, functions)
select *
from sys.syscomments
The full documentation for the system tables is available on MSDN.
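As a hedged illustration of that metadata-to-DDL idea, the query below joins the catalog views to list every user table's columns and types; this is the raw material a script would format into CREATE TABLE statements (constraints, defaults, and indexes would need further queries against other catalog views):
-- One row per column of every user table, with its data type.
SELECT t.name AS table_name,
       c.name AS column_name,
       ty.name AS type_name,
       c.max_length,
       c.is_nullable
FROM sys.tables AS t
JOIN sys.columns AS c ON c.object_id = t.object_id
JOIN sys.types AS ty ON ty.user_type_id = c.user_type_id
ORDER BY t.name, c.column_id;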
I've used this PowerShell strategy in the past. Obviously, that isn't Python, but it is a script you can write then execute from within Python. Give this article a read as it may be your easiest (and cheapest) solution: http://blogs.technet.com/b/heyscriptingguy/archive/2010/11/04/use-powershell-to-script-sql-database-objects.aspx
As a disclaimer, I was only exporting stored procedures, not every single object.
Is there any handy tool that can make updating tables easier? Usually I get an Excel file with the original value in one column and the new value in another column. Then I write a formula in Excel to create the UPDATE statement. Is there any way to simplify the updating task?
I believe the approach in SQL Server 2000 and 2005 would be different, so could we discuss them both? Thanks.
In addition, these updates are usually requested by "non-programmers" (meaning they don't understand SQL, so it may not be feasible to let them write queries). Is there any tool that can let them update the table directly without having DBAs do this task? Also, that tool needs to limit the privilege to modify only certain tables, and it should preferably have a way to roll back changes.
Create a DTS package that will import a CSV file, make the updates, and then archive the file. The user can drop the file in a specific folder designated for the task, or this can be done by an ops person. Schedule the DTS package to run every hour, day, etc.
In case your users insist on keeping Excel, you've got several different possibilities for getting the data transferred to SQL Server. My preferred one would be to use DTS/SSIS, as mentioned by buckbova.
However, another method is to use OPENROWSET(), which makes it possible to query your Excel file as if it were a table. I wrote a small article about it here: http://blog.hoegaerden.be/2010/03/29/retrieving-data-from-excel/
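To make that concrete, here's a hedged sketch of what such an update could look like. The file path, sheet name, and table/column names are all illustrative, and the server needs the ACE OLE DB provider installed (Jet on older versions) plus the 'Ad Hoc Distributed Queries' option enabled:
-- Join the spreadsheet to the target table and apply the new values.
UPDATE t
SET t.Value = x.NewValue
FROM dbo.TargetTable AS t
JOIN OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\updates.xlsx;HDR=YES',
                'SELECT OriginalValue, NewValue FROM [Sheet1$]') AS x
    ON x.OriginalValue = t.Value;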
Another approach that hasn't been mentioned yet (I'm not a big fan of letting regular users edit data directly in the DB): any possibility of creating a small custom application for them?
There you go, a couple more possible solutions :-)
Valentino.
I think the best approach is to expose a view on your data to the users who are allowed to do updates, and set up triggers on the view to perform the actual updates on the underlying data. Restrict changes to only the columns they should be changing.
This technique can work on SQL Server 2000 and 2005.
I would add audit triggers on the underlying tables so you can always track changes.
You'll have complete control, and they can connect to it with Access or whatever and perform their maintenance.
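A minimal sketch of that view-plus-trigger pattern, with made-up object and column names (INSTEAD OF triggers are available in both SQL Server 2000 and 2005):
-- Expose only the columns users are allowed to change.
CREATE VIEW dbo.CustomerPhoneView
AS
SELECT CustomerID, Phone
FROM dbo.Customer;
GO
-- Intercept updates to the view and apply only the permitted change
-- to the underlying table.
CREATE TRIGGER dbo.trgCustomerPhoneView_Update
ON dbo.CustomerPhoneView
INSTEAD OF UPDATE
AS
BEGIN
    UPDATE c
    SET c.Phone = i.Phone
    FROM dbo.Customer AS c
    JOIN inserted AS i ON i.CustomerID = c.CustomerID;
END;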
You could create some accounts in SQL Server for these users and limit their access to only certain tables and columns, with only SELECT / UPDATE / INSERT privileges. Then you could create an Access database with linked tables to these.
I need to test several different processes for the application we're building. Each process requires a particular table in our database to have data, and all of these tables have foreign key constraints from other tables as well.
I've written SQL scripts that populate the table I'm interested in as well as its dependencies, but it turns out that in a few of these scripts I've duplicated a lot of code when populating the dependency tables.
I would like to take the duplicated code out and put it in a separate script, but I don't know how, if it's even possible, to execute a SQL script from within another one.
An important part of all of this would also be the ability to get the @@IDENTITY value in the calling script from the called one.
Any help will be greatly appreciated.
Best regards.
Clarification: By script I mean a file saved on disk. I don't want to be creating and deleting temporary stored procedures for this.
When I hear the word "script", I think of a file containing a series of commands; if you're asking how to get SQL Server to load a file of commands from another file of commands, I'm not sure of an easy way to do that.
If you can save your duplicate code as a stored procedure, you can certainly call a stored procedure from another stored procedure within SQL Server. You could then pass in a parameter holding the @@IDENTITY value (and you may want to look at SCOPE_IDENTITY() instead).
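A minimal sketch of that approach, with made-up table and procedure names; the key is returned through an OUTPUT parameter, and SCOPE_IDENTITY() is used because, unlike @@IDENTITY, it ignores identities generated by triggers:
CREATE PROCEDURE dbo.InsertParent
    @Name nvarchar(100),
    @NewId int OUTPUT
AS
BEGIN
    INSERT INTO dbo.Parent (Name) VALUES (@Name);
    -- The identity value generated by the INSERT above, in this scope.
    SET @NewId = SCOPE_IDENTITY();
END;
GO
-- Calling script:
DECLARE @ParentId int;
EXEC dbo.InsertParent @Name = N'example', @NewId = @ParentId OUTPUT;
INSERT INTO dbo.Child (ParentId) VALUES (@ParentId);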
HTH,
Stu