copy only tables with data from one database to another database - sql-server

I have two databases, dbOne (version 10.50.1600, located on the office server) and dbTwo (version 10.0.1600, located on my local server).
I want to copy dbOne's tables with their data to dbTwo.
Is there any way or script to do it? I don't want to upgrade my local server version!

"Import and Export Data" tool provided by SQL Server is a good tool to transfer data between two different servers.

How about generating the database scripts as in the following articles:
http://www.codeproject.com/Articles/598148/Generate-insert-statements-from
and
http://msdn.microsoft.com/en-us/library/ms186472(v=sql.105).aspx

It's possible to transfer data from one server to another using a SQL linked server query, if both are on the same network. Below are the steps.
Copying table structures
Generate a script of all tables from the server1 database using the Generate Scripts utility, then execute it in the server2 database.
Copying table data
sp_addlinkedserver [ @server= ] 'server' [ , [ @srvproduct= ] 'product_name' ]
[ , [ @provider= ] 'provider_name' ]
[ , [ @datasrc= ] 'data_source' ]
[ , [ @location= ] 'location' ]
[ , [ @provstr= ] 'provider_string' ]
[ , [ @catalog= ] 'catalog' ]
Insert into databaseserver2.db1.dbo.table1 (columnList)
select columnList
from databaseserver1.db1.dbo.table1
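For example, a minimal sketch run on the local server (the linked server name and column names below are hypothetical placeholders for your own):
-- Register the office server as a linked server (SQL Server to SQL Server)
EXEC sp_addlinkedserver @server = N'OFFICESERVER', @srvproduct = N'SQL Server';
-- If needed, map a login with sp_addlinkedsrvlogin before querying

-- Copy one table's rows across; the target table must already exist in dbTwo
INSERT INTO dbTwo.dbo.table1 (col1, col2)
SELECT col1, col2
FROM OFFICESERVER.dbOne.dbo.table1;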

Here are the general steps you need to take in order for this to work.
Migrating tables
Create scripts for tables in db1. Just right click the table and go to “Script table as -> Create to”
Re-order the scripts so that tables that don’t depend on any other tables are executed first
Execute scripts on db2
Migrating data
The most convenient way is to use SQL Server Import/Export wizard

Related

snowflake show tables with cluster_by

I can use show tables in <database name> to show all tables in a database.
The results returned show whether a table has clustering enabled - they include the cluster_by column.
Is there a way to get back a list of all tables that have a value in cluster_by?
The documentation for show-tables shows only:
SHOW [ TERSE ] TABLES [ HISTORY ] [ LIKE '<pattern>' ]
[ IN { ACCOUNT | DATABASE [ <db_name> ] | SCHEMA [ <schema_name> ] } ]
[ STARTS WITH '<name_string>' ]
[ LIMIT <rows> [ FROM '<name_string>' ] ]
You can always ask INFORMATION_SCHEMA:
SELECT TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, CLUSTERING_KEY
FROM INFORMATION_SCHEMA.TABLES
WHERE CLUSTERING_KEY IS NOT NULL;
or using RESULT_SCAN
SHOW TABLES IN DATABASE TEST;
SELECT *
FROM TABLE(result_scan(last_query_id()))
WHERE "cluster_by" <> '';
Reference: INFORMATION SCHEMA TABLES VIEW, RESULT_SCAN

Create External Table in Azure SQL Data warehouse to a wild card based file or folder path

I know we can create an External table in Azure SQL Data warehouse pointing to
a LOCATION that is either a file path or a folder path. Can this file or folder path be based on a wildcard pattern instead of an explicit path?
Here, my file path is a location in Azure Data Lake Store.
-- Syntax for SQL Server
-- Create a new external table
CREATE EXTERNAL TABLE [ database_name . [ schema_name ] . | schema_name. ] table_name
( <column_definition> [ ,...n ] )
WITH (
LOCATION = 'folder_or_filepath',
DATA_SOURCE = external_data_source_name,
FILE_FORMAT = external_file_format_name
[ , <reject_options> [ ,...n ] ]
)
[;]
Polybase / External Tables do not support wildcards at this time. Simply have one folder per external table you require. If you feel this is an important missing feature you can create a request and vote for it here:
https://feedback.azure.com/forums/307516-sql-data-warehouse
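For example, a minimal sketch of an external table pointed at a single folder rather than a wildcard (the data source, file format, and folder names below are hypothetical and must already exist):
CREATE EXTERNAL TABLE dbo.SalesExternal
(
    SaleId   INT,
    SaleDate DATE,
    Amount   DECIMAL(18, 2)
)
WITH (
    LOCATION = '/landing/sales/2017/',  -- one folder per external table
    DATA_SOURCE = AzureDataLakeStore,   -- pre-created external data source
    FILE_FORMAT = CsvFileFormat         -- pre-created external file format
);
PolyBase then reads every file it finds under that folder (files and folders starting with '_' or '.' are skipped), which is the closest you can get to wildcard behaviour today.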
Bear in mind Polybase (in Azure SQL Data Warehouse) can now read files either in blob storage or in Azure Data Lake Storage (ADLS). Therefore, as another workaround, Azure Data Lake Analytics (ADLA) and U-SQL do support wildcard-style file sets, so you could use U-SQL to move just the files you want from blob storage into your lake, e.g.
// Move data from blob store to the data lake,
// capturing each file name via the {filepath} file set pattern
DECLARE @inputFilepath string = "wasb://someContainer@someStorageAccount.blob.core.windows.net/someFilter/{filepath}.csv";
DECLARE @outputFilepath string = "output/special folder/output.csv";

@input =
    EXTRACT
        ..., // your column list
        filepath string
    FROM @inputFilepath
    USING Extractors.Csv();

// Keep only the files that match your filter
@input =
    SELECT * FROM @input
    WHERE filepath.Contains("yourFilter");

// Export as csv
OUTPUT @input
TO @outputFilepath
USING Outputters.Csv(quoting : false);

// Now the data is in the Data Lake, which Polybase can also use as a source

How do I communicate between two servers? I want some data from server A on server B. Can I do it through a web socket, and if yes, how?

I want to access data from server 1 on server 2. How can I do this? Is it possible through ptunnel? I tried sudo ptunnel -p 192.168.0.66 -lp 8080 -da 192.168.0.66 -dp 9090 on the two servers, but it gives me the error [err]: Failed to bind listening socket: Address already in use.
Follow these steps to create a Linked Server:
Server Objects -> Linked Servers -> New Linked Server
Provide Remote Server Name.
Select Remote Server Type (SQL Server or Other).
Select Security -> 'Be made using this security context' and provide the login and password of the remote server.
Click OK and you are done!
[HERE][1] is a simple tutorial for creating a linked server.
OR
You can add linked server using query.
Syntax:
sp_addlinkedserver [ @server= ] 'server' [ , [ @srvproduct= ] 'product_name' ]
[ , [ @provider= ] 'provider_name' ]
[ , [ @datasrc= ] 'data_source' ]
[ , [ @location= ] 'location' ]
[ , [ @provstr= ] 'provider_string' ]
[ , [ @catalog= ] 'catalog' ]
Read more about sp_addlinkedserver.
You have to create the linked server only once. After creating it, you can query it as follows:
select * from LinkedServerName.DatabaseName.OwnerName.TableName
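For example, a minimal sketch that also maps the security context from T-SQL (the equivalent of the 'Be made using this security context' option above; the server name, login, and table names are hypothetical):
-- Register the remote instance as a linked server
EXEC sp_addlinkedserver @server = N'REMOTESERVER', @srvproduct = N'SQL Server';

-- Map local connections to a login that exists on the remote server
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'REMOTESERVER', @useself = N'FALSE',
     @rmtuser = N'remoteLogin', @rmtpassword = N'remotePassword';

-- Query through the linked server using four-part naming
SELECT * FROM REMOTESERVER.SomeDatabase.dbo.SomeTable;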

SSIS 2008 R2 Setting Run64BitRuntime Value Through Script Task

I have an SSIS project with multiple packages. I would like to create a "Master" package which would run the individual packages in a sequence. The first package contains a Data Flow task which imports data from Excel files, so my Run64BitRuntime setting is set to "false". The following package that needs to be run contains a Fuzzy Lookup, which requires that the Run64BitRuntime setting is set to "true".
Is there a way that I can change this project property setting through a Script Task, so that I can fully automate this process?
You deploy the packages to an SSIS Catalog (it may be on the same instance of SQL Server).
In the SSISDB database, we have several stored procedures. Some of them are listed below.
[SSISDB].[catalog].[set_execution_parameter_value]
[SSISDB].catalog.start_execution
[SSISDB].catalog.create_execution
Every SP has its own purpose; see here:
http://technet.microsoft.com/en-us/library/ff878034.aspx
See the syntax of
create_execution [ @folder_name = ] folder_name
, [ @project_name = ] project_name
, [ @package_name = ] package_name
[ , [ @reference_id = ] reference_id ]
[ , [ @use32bitruntime = ] use32bitruntime ]
, [ @execution_id = ] execution_id OUTPUT
The @use32bitruntime parameter lets you switch between 32-bit and 64-bit execution.
With the above set of SPs you can have great control over package execution.
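For example, a minimal sketch of launching one package in 64-bit mode from an Execute SQL Task in the master package (the folder, project, and package names below are hypothetical placeholders):
DECLARE @execution_id BIGINT;

-- Create an execution for the Fuzzy Lookup package and force 64-bit (use32bitruntime = 0)
EXEC [SSISDB].[catalog].[create_execution]
      @folder_name     = N'MyFolder'
    , @project_name    = N'MyProject'
    , @package_name    = N'FuzzyLookupPackage.dtsx'
    , @use32bitruntime = 0
    , @execution_id    = @execution_id OUTPUT;

-- Optional: run synchronously so the calling task waits for the package to finish
EXEC [SSISDB].[catalog].[set_execution_parameter_value]
      @execution_id
    , @object_type     = 50
    , @parameter_name  = N'SYNCHRONIZED'
    , @parameter_value = 1;

EXEC [SSISDB].[catalog].[start_execution] @execution_id;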

Remote Computer Name in SQL Server

I have an app that runs on many computers and connects to SQL Server.
I want to log the machine names of those computers in a table every time they connect. How can I do that?
I want to know if there is a command like
"Select @@MachineName"
It's up to you how you want to log this information, but HOST_NAME() returns the name of the workstation connecting to the server.
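A minimal sketch of logging it on each connection (the table and column names here are hypothetical; run the INSERT from the app when it connects, or wire it into a logon trigger):
-- Hypothetical audit table for connection logging
CREATE TABLE dbo.ConnectionLog
(
    LogId       INT IDENTITY(1,1) PRIMARY KEY,
    MachineName SYSNAME      NOT NULL,
    LoginName   SYSNAME      NOT NULL,
    ConnectedAt DATETIME2(0) NOT NULL DEFAULT SYSUTCDATETIME()
);

-- HOST_NAME() returns the client workstation name; SUSER_SNAME() the login that connected
INSERT INTO dbo.ConnectionLog (MachineName, LoginName)
VALUES (HOST_NAME(), SUSER_SNAME());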
Create a linked server (allowing distributed, heterogeneous queries against OLE DB data sources) using the following command:
sp_addlinkedserver [ @server= ] 'server' [ , [ @srvproduct= ] 'product_name' ]
[ , [ @provider= ] 'provider_name' ]
[ , [ @datasrc= ] 'data_source' ]
[ , [ @location= ] 'location' ]
[ , [ @provstr= ] 'provider_string' ]
[ , [ @catalog= ] 'catalog' ]
Then access it like:
Select * from [server-name].[db-name].dbo.[tablename]
Also, make sure the security login you are using on both servers is the same (or at least exists on the other server too).
