Import SQL Server tables as data into an Access DB - sql-server

I have a SQL Server database (let's use Northwind) that has an unknown number of tables. I would like to import these tables into an MS Access database as DATA (not as tables), into an MTT_Table.
All standard imports create each table as a physical table within MS Access, not as data.
I have a table in MS Access that needs to store the names of tables in other systems - not sure if that makes sense.
Is there any way to read an arbitrary number of tables and store them as data, over an ODBC connection, all through VBA?
The expected output would be the table names as data values, ideally with each MS Access row also populated with metadata about the table.

Use the INFORMATION_SCHEMA views to create a view in SQL Server:
CREATE VIEW dbo.Sample_View
AS
SELECT TABLE_NAME
FROM [Your_Database].INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
Now import this view into Access via the standard ODBC import (External Data > ODBC Database).
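If you then want the view's rows stored as data inside MTT_Table rather than merely linked, an append query can copy them over. A minimal sketch, assuming the view has been linked into Access under the name Sample_View and that MTT_Table has a TableName text column (both names are assumptions, not from the question):
INSERT INTO MTT_Table (TableName)
SELECT TABLE_NAME
FROM Sample_View;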

Your question is a bit broad (what information do you want about the tables?), but generally this can be achieved by querying the INFORMATION_SCHEMA metadata views over ODBC:
SELECT * INTO MTT_Table
FROM [ODBC;Driver={SQL Server};Server=my\server;Database=myDb;Trusted_Connection=Yes;].INFORMATION_SCHEMA.TABLES
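SELECT ... INTO recreates MTT_Table on every run. If MTT_Table already exists and should keep its own design, an append version of the same pass-through source works too. A sketch only; the TableName and TableType columns are assumptions about MTT_Table's design, not taken from the question:
INSERT INTO MTT_Table (TableName, TableType)
SELECT TABLE_NAME, TABLE_TYPE
FROM [ODBC;Driver={SQL Server};Server=my\server;Database=myDb;Trusted_Connection=Yes;].INFORMATION_SCHEMA.TABLES;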

Related

How to get the structure of multiple SQL tables (T-SQL) into an Excel sheet?

I have to get the structure of 198 SQL tables. Normally, I know about sp_help and ALT+F1 to get a single table's structure.
How can I get the structure of multiple tables? If I provide a list of table names, the output should be the structure (table name (field name, data type, length)) of all those tables.
I have only read access to the SQL DB, and I am new to SQL.
Environment details:
Client tool: Microsoft SQL Server Management Studio 2014
I have searched on SO; there are answers for a single table, but that doesn't solve my question.
This query will return a list of all tables and their columns, with plenty of detail about data type, size, nullability and so on:
USE YourDatabaseNameHere;
SELECT t.TABLE_TYPE, c.*
FROM INFORMATION_SCHEMA.TABLES t
INNER JOIN INFORMATION_SCHEMA.COLUMNS c ON t.TABLE_SCHEMA=c.TABLE_SCHEMA
AND t.TABLE_NAME=c.TABLE_NAME
WHERE t.TABLE_TYPE = 'BASE TABLE'; --With 'VIEW' you'd find views, or just omit the WHERE...
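If you only want the fields the question mentions (table name, field name, data type, length) for a specific list of tables, a narrowed variant might look like this; the names in the IN list are placeholders for your 198 tables:
USE YourDatabaseNameHere;
SELECT c.TABLE_NAME, c.COLUMN_NAME, c.DATA_TYPE, c.CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS c
WHERE c.TABLE_NAME IN ('Table1', 'Table2') -- extend with your own table names
ORDER BY c.TABLE_NAME, c.ORDINAL_POSITION;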
You can use plain Excel to connect to the database and read this result into a sheet.
UPDATE
I missed that you can use SSMS. Just paste the query into a new query window and execute it. The result can be copy-pasted into Excel.
I suggest using a DB schema tool, which helps you design the database and understand the existing relational database mapping.
Using SQL Server 2008 R2:
This will create a script for you; you can then run it on another SQL Server, and it will create the same database with all tables.
Right-click on the database name
Go to Tasks
Go to Generate Scripts
Next > Next > Next > set your path > Next > Finish

Tables created from the model DB - MS SQL Server

A while back I must have forgotten to change the active database (as in USE mydbname) and created a bunch of tables in, I think, the master database. Ever since then, whenever a new database is created, these tables appear in it.
I think one of the four default databases (master, model, msdb, tempdb) works as a template for new databases, so the "extra" tables must be stored in one of them. Based on this description, could you please advise me how to get rid of these tables so that new databases are created empty?
Do you want to find the user tables sitting in a system database?
You can try this:
EXEC sys.sp_MSforeachdb @command1 = N'use ?;select ''?'', * from sys.tables WHERE type=''U'' and is_ms_shipped=0'
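Once the query above shows which database holds the stray tables (model is the template copied for every new database), you can drop them there. A sketch with a placeholder table name:
USE model;
DROP TABLE dbo.stray_table; -- replace with a table name found by the query above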

How can I create a dynamic join between 2 SQL Server tables on 2 different servers?

I have data on 2 SQL Server 2012 database servers. I need to create a view containing data from both servers.
My first step was to import the join table from Server2 into Server1 and create the view. The problem, though, is that I need to keep the exported table up to date, so a static "export" of the table is not ideal.
What methods could I use to create a dynamic join between 2 tables on 2 different servers?
You could establish a linked server and use four-part names:
CREATE VIEW dbo.my_view
AS
SELECT * -- cols list
FROM dbo.table_name t
JOIN server_name.database_name.schema_name.table_name c
ON t.id = c.id;
Notes:
If the view will be part of a transaction, MS DTC (Microsoft Distributed Transaction Coordinator) must be enabled.
Depending on how you build your query, performance may be degraded.
Not every data type can be used (XML, for example).
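If the linked server does not exist yet, it can be registered with sp_addlinkedserver; a minimal sketch where server_name and remote_host are placeholders:
EXEC sp_addlinkedserver
    @server = N'server_name',    -- name used in the four-part query above
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'remote_host';   -- the remote SQL Server instance
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'server_name',
    @useself = N'TRUE';          -- connect with the caller's own credentials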

Run a query on SQL Server through Teradata and store the result in Teradata

I have one table in SQL Server and 5 tables in Teradata. I want to join those 5 Teradata tables with the SQL Server table and store the result in a Teradata table.
I have the SQL Server name, but I don't know how to run a query on SQL Server and Teradata simultaneously.
This is what I want to do:
SQL Server query:
Select distinct store
from store_Desc
Teradata query:
select cmp_id, state, sde
from xyz
where store in (
select distinct store
from [the SQL Server table])
You can create a table (or a volatile table if you do not have write privileges) to do this. Export the result from SQL Server as text, or pull it in from the language of your choice.
CREATE VOLATILE TABLE store_table (
column_1 datatype_1,
column_2 datatype_2,
...
column_n datatype_n);
You may need to add ON COMMIT PRESERVE ROWS before the semicolon above, depending on your transaction settings.
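For example, with a single store column (the DEC(4, 0) type is an assumption) the clause sits right before the semicolon:
CREATE VOLATILE TABLE store_table (
    store DEC(4, 0)
) ON COMMIT PRESERVE ROWS;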
From a host language you can loop over the statement below or use an execute-many.
INSERT INTO store_table VALUES(value_1, value_2, ..., value_n);
Alternatively, you can import from a text file using Teradata SQL Assistant: go to File and select Import, then execute the statement below and navigate to your file.
INSERT INTO store_table VALUES(?, ?, ..., n);
Once you have inserted your data you can query it by simply referencing the table name.
SELECT cmp_id,state,sde
FROM xyz
WHERE store IN(
SELECT store
FROM store_table)
The DISTINCT is most easily applied on export from SQL Server, to minimize the number of rows you need to upload.
EDIT:
If you are doing this many times, you can automate it with a script; here is a very simple example in Python:
import pyodbc
con_ss = pyodbc.connect('sql_server_odbc_connection_string...')
crs_ss = con_ss.cursor()
con_td = pyodbc.connect('teradata_odbc_connection_string...')
crs_td = con_td.cursor()
# pull data from SQL Server
data_ss = crs_ss.execute('''
SELECT distinct store AS store
from store_Desc
''').fetchall()
# create table in teradata
crs_td.execute('''
CREATE VOLATILE TABLE store_table (
store DEC(4, 0)
) PRIMARY INDEX (store)
ON COMMIT PRESERVE ROWS;''')
con_td.commit()
# insert values; you could also use executemany, but this is easier to read
for row in data_ss:
    crs_td.execute('''INSERT INTO store_table VALUES(?)''', row)
con_td.commit()
# get final data
data_td = crs_td.execute('''SELECT cmp_id,state,sde
FROM xyz
WHERE store IN(
SELECT store
FROM store_table);''').fetchall()
# from here, write to a file or process the rows as you like
Is fetching the data from SQL Server through ODBC an option?
The best option may be to use Teradata Parallel Transporter (TPT) to fetch the data from SQL Server, using its ODBC operator as the producer combined with the Load or Update operator as the consumer, to insert it into an intermediate table on Teradata. You must then perform the rest of the operations on Teradata; for those, you can use BTEQ/SQLA to store the results in the final Teradata table. You can also put the same SQL in TPT's DDL operator instead of BTEQ/SQLA and get everything done in a single job script.
To allow tables residing in separate DB environments (in your case SQL Server and Teradata) to be used in a single SELECT statement, Teradata has recently released Teradata QueryGrid. But I'm not sure about the exact level of support for SQL Server, and it would involve licensing hassle and quite a learning curve for this simple job.

Some tables are not visible but I can query them using RODBC and SQL Server

I am running R 3.0.1 and connecting to a SQL Server using RODBC. I am able to create the ODBC connection and execute queries without a problem. However, there are several different databases nested within the connection. I can query them, but cannot see the tables to get column names or other descriptives. Here's what I'm doing:
db_conn <- odbcConnect("db_name", "login", "pw")
sqlTables(db_conn)
TABLE_CAT TABLE_SCHEM TABLE_NAME TABLE_TYPE REMARKS
db_name_one schema_name table_1_name TABLE <NA>
And so on. I can see all of the tables in db_name_one, but not those in db_name_two or db_name_three. However, I can query the other databases using:
sqlQuery(db_conn, "select top 10 * from db_name_two.table_name")
With no problems. This would be great if I had all of the table and column names memorized, but obviously I don't.
You will need to specify the database name in order to see tables in that database. For example:
# 'catalog' argument is for database names
# see tables in a database
sqlTables(db_conn, catalog = "db_name_two")
# see columns in a table of a database
sqlColumns(db_conn, catalog = "db_name_two", sqtable = "table1")
