How to programmatically generate DDL from an Oracle database?

I've got a monumentally tedious task: find several tables in a huge schema and generate the DDL for those tables.
Say I've got a SchemaA with 1000 tables. I need to find out whether TableA exists in SchemaA; if it does, generate the DDL and save it to the filesystem; if it doesn't, print its name or write it to a file. Any ideas?

The DBMS_METADATA package (assuming you are on a reasonably recent version of Oracle) will generate the DDL for any object in the database. So
SELECT dbms_metadata.get_ddl( 'TABLE', 'TABLEA', 'SCHEMAA' )
FROM dual;
will return a CLOB with the DDL for SchemaA.TableA. You could catch the exception that is thrown when the object doesn't exist, but I would tend to suggest that you query the data dictionary (i.e. DBA_OBJECTS) to verify that there is a table named TableA in SchemaA, i.e.
SELECT COUNT(*)
FROM dba_objects
WHERE owner = 'SCHEMAA'
AND object_name = 'TABLEA'
AND object_type = 'TABLE'
Note that if you don't have access to DBA_OBJECTS, you could use ALL_OBJECTS instead. The concern there, however, is that there may be a TableA in SchemaA that you don't have SELECT access on. That table would not appear in ALL_OBJECTS (which has all the objects that you have access to) but it would appear in DBA_OBJECTS (which has all the objects in the database regardless of your ability to access them).
You can then either write the DDL to a file or, if the count is 0, indicate that the object doesn't exist. From a stored procedure, you can use the UTL_FILE package to write to a file on the database server. If you are trying to write to a file on the client file system, you would need to use a language that has access to the client operating system's resources. A small C/Java/Perl/etc. program should be able to select a CLOB and write that data to a file on the client operating system.
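To tie it together, here is a minimal PL/SQL sketch of the check-then-save flow. It assumes an Oracle directory object named DDL_DIR already exists and is writable; the directory and file names are placeholders, so adjust them for your environment.
DECLARE
  l_count  PLS_INTEGER;
  l_ddl    CLOB;
  l_file   UTL_FILE.FILE_TYPE;
  l_offset PLS_INTEGER := 1;
BEGIN
  -- Verify the table exists before asking DBMS_METADATA for its DDL
  SELECT COUNT(*)
    INTO l_count
    FROM dba_objects
   WHERE owner = 'SCHEMAA'
     AND object_name = 'TABLEA'
     AND object_type = 'TABLE';
  IF l_count = 0 THEN
    -- Table is missing: report its name instead
    DBMS_OUTPUT.PUT_LINE('TABLEA not found in SCHEMAA');
  ELSE
    l_ddl  := DBMS_METADATA.GET_DDL('TABLE', 'TABLEA', 'SCHEMAA');
    -- DDL_DIR is a placeholder directory object (create one with CREATE DIRECTORY)
    l_file := UTL_FILE.FOPEN('DDL_DIR', 'tablea.sql', 'w', 32767);
    -- Write the CLOB out in VARCHAR2-sized chunks
    WHILE l_offset <= DBMS_LOB.GETLENGTH(l_ddl) LOOP
      UTL_FILE.PUT(l_file, DBMS_LOB.SUBSTR(l_ddl, 32000, l_offset));
      l_offset := l_offset + 32000;
    END LOOP;
    UTL_FILE.FCLOSE(l_file);
  END IF;
END;
/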

Related

How to do an inner join rather than for each loop in SSIS?

On the ETL server I have a DW user table.
On the prod OLTP server I have the sales database. I want to pull the sales only for users that are present in the user table on the ETL server.
Presently I am using an Execute SQL Task to fetch the DW users into an SSIS System.Object variable, then a Foreach Loop to iterate over each item (user id) in this variable and, via a Data Flow Task, fetch the OLTP sales for each user and dump them into the DW staging table. The Foreach Loop is taking a long time to run.
I want to be able to do an inner join so that the response is quicker, but I can't do this since they are on separate servers. Neither can I use a global temp table to make the inner join, for the same reason.
I tried collecting the DW users into a comma-separated string variable and then using it (via string_split) to query OLTP, but this also takes a long time in the pre-execute phase (not sure why exactly), even for a small number of users.
I am also aware of the Lookup transform, but that too would result in all OLTP rows being brought over to the DW ETL server just to test the lookup condition.
Is there any alternate approach to be able to do an inner join by taking the list of users into the source?
Note: I do not have write permissions on the OLTP db.
Based on the comments, I think we can use a temporary table to solve this.
Can you help me understand this restriction? "Neither can I use a global temp table to make the inner join, for the same reason."
The restriction is that the OLTP server and DW server are separate, so there can't be a global temp table common to both servers. Hope that makes sense.
The general pattern we're going to follow is:
Execute SQL Task to create a temporary table on the OLTP server
A Data Flow task to populate the new temporary table. Source = DW. Destination = OLTP. Ensure Delay Validation = True
Modify the existing Data Flow: change the source to a query that uses the temporary table, i.e. SELECT S.* FROM oltp.sales AS S WHERE EXISTS (SELECT * FROM #SalesPerson AS SP WHERE SP.UserId = S.UserId); Ensure Delay Validation = True
A long form answer on using temporary tables (global to set the metadata, regular thereafter)
I don't use temp tables in SSIS
Temporary tables live in tempdb. Your OLTP and DW connection managers likely do not point at tempdb. To be able to reference a temporary table, local or global, in SSIS you need to either define an additional connection manager for the same server that points explicitly at tempdb so you can use the drop-down in the source/destination components (technically accurate but dumb), or use an SSIS variable to hold the name of the table and use the "From Variable" option in the source/destination components (best option, maximum flexibility).
Soup to nuts example
I will use WideWorldImporters as my OLTP system and WideWorldImportersDW as my DW system.
One-time task
Open SQL Server Management Studio, SSMS, and connect to your OLTP system. Define a global temporary table with a unique name and the expected structure. Leave your connection open so the table structure remains intact during initial development.
I used the following statement.
DROP TABLE IF EXISTS ##SO_70530036;
CREATE TABLE ##SO_70530036(EmployeeId int NOT NULL);
Keep track of your query because we'll use it later on but as I advocate in my SSIS answers, perform the smallest task, test that it works and then go on to the next. It's the only way to debug.
Connection Managers
Define two OLE DB connection managers. WWI_DW points to the named instance DEV2019UTF8 and WWI_OLTP points to DEV2019EXPRESS. Right-click on WWI_OLTP and select Properties. Find the property RetainSameConnection and flip it from the default of False to True. This ensures the same connection is used throughout the package. As temporary tables go out of scope when the connection goes away, closing and reopening a connection in a package will result in a fatal error.
These two databases are on different instances, so we can't cheat and directly commingle data.
Variables
Define 4 variables in SSIS, all of type String.
TempTableName - I used a value of ##SO_70530036 but use whatever value you specified in the One-time task section.
QuerySourceEmployees - This will be the query you run to generate the candidate set of data to go into the temporary table. I used SELECT TOP (3) E.[WWI Employee ID] AS EmployeeId FROM Dimension.Employee AS E WHERE E.[Is SalesPerson] = CAST(1 AS bit);
QueryDefineTables - Remember the drop/create statements from the one-time task? We're going to use the essence of them but use the expression builder to let us dynamically swap the table name. I clicked the ellipses, ..., on the Expression section and used the following "DROP TABLE IF EXISTS " + @[User::TempTableName] + "; CREATE TABLE " + @[User::TempTableName] + "( EmployeeId int NOT NULL);" You should be able to copy the Value from the row and paste it into SSMS to confirm it works.
QuerySales - This is the actual query you're going to use to pull your filtered set of sales data. Again, we'll use the Expression to allow us to dynamically reference the temporary table name. The prettified version of the expression would look something like
"SELECT
SI.InvoiceID
, SI.SalespersonPersonID
, SO.OrderID
, SOL.StockItemID
, SOL.Quantity
, SOL.OrderLineID
FROM
Sales.Invoices AS SI
INNER JOIN
Sales.Orders AS SO
ON SO.OrderID = SI.OrderID
INNER JOIN
Sales.OrderLines AS SOL
ON SO.OrderID = SOL.OrderID
WHERE
EXISTS (SELECT * FROM " + @[User::TempTableName] + " AS TT WHERE TT.EmployeeID = SI.SalespersonPersonID);"
Again, you should be able to pull the Value from the three queries and run them independently and verify they work.
Execute SQL Task
Add an Execute SQL Task to the Control Flow. I named mine SQL Create temporary table. My Connection Manager is WWI_OLTP, I changed the SQLSourceType to Variable, and the SourceVariable is User::QueryDefineTables.
Every time your package runs, the first thing it will do is create the temporary table, which is good because SSIS is a metadata-driven ETL engine and the next two steps would fail if the table didn't exist.
Data Flow Task - Prime the pump
This data flow is where we'll transfer DW data back to the OLTP system so we can filter in the source system.
Drag a Data Flow Task onto the Control Flow. I named mine DFT Load Temp and before you click into it, right click on the Task and find the DelayValidation property and change this from the default of False to True. Normally, a package validates all metadata before actual execution begins as the idea is you want to know everything is good before any data starts moving. Since we're using temporary tables, we need to tell the execution engine "trust us, it'll be ready"
Double click inside the Data Flow Task.
Add an OLE DB Source. I named mine OLESRC SourceEmployees and I use the connection manager WWI_DW. My data access mode changes to SQL command from variable, and then I select my variable User::QuerySourceEmployees.
Add an OLE DB Destination. I named mine OLEDST TempTableName and double clicked to configure it. The Connection Manager is WWI_OLTP and again, since the table lives in tempdb, we can't select it from the drop down. Change the Data access mode to Table name or view name variable - fast load and then select your variable name User::TempTableName. Click the Mapping tab and ensure source columns map to destination columns.
Data Flow Task - Transfer data
Finally, we will pull our source data, nicely filtered against the data from our target system.
Add an OLE DB Source. I named it OLESRC QuerySales. The Connection Manager is WWI_OLTP. Data access mode again changes to SQL command from variable and the variable name is User::QuerySales
From here, do whatever else you need to do to make the magic happen.
Instead of having 270k rows with an unfiltered query, I have 67k as there are only 3 employees in the temporary table.
Reference package
But wait, there's more!
Close out visual studio, open it back up and try to touch something in the data flows. Suddenly, there are red Xs everywhere! Any time you close a data flow component, it fires a revalidate metadata operation and guess what, it can't do that as the connection to the temporary table is gone.
The package will run fine; it will not throw VS_NEEDSNEWMETADATA, but editing/maintenance becomes a pain.
If you switched from a global temporary table to a local one, switch the table name variable's value back to a global and then run the define statement in SSMS. Once that's done, you can continue editing the package.
I assure you, the local temporary table does work once you have the metadata set and you use queries via variables for source/destination.
No need for the global temporary table hack, or the SET FMTONLY OFF hack (which no longer works).
Just specify the result set metadata in the SQL query with WITH RESULT SETS, e.g.
EXEC ('
create table #t
(
ID INT,
Name VARCHAR(150),
Number VARCHAR(15)
)
insert into #t (Id, Name, Number)
select object_id, name, 12
from sys.objects
select * from #t
')
WITH RESULT SETS
(
(
ID INT,
Name VARCHAR(150),
Number VARCHAR(15)
)
)
If you need to parameterize the query, there's a bit of a catch because there are some limitations in how SSIS discovers parameters. SSIS runs sp_describe_undeclared_parameters, which doesn't really work with batches that call sp_executesql, because sp_executesql has a unique way of handling parameters, one you couldn't replicate with a user stored procedure.
So to parameterize the query you'll either need to pass the parameter values into the query using the "query from variable" and SSIS expressions, or push all this TSQL into a stored procedure.
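If you push the T-SQL into a stored procedure, a sketch might look like the following; the procedure name and the @type parameter are made up for illustration.
CREATE PROCEDURE dbo.GetObjectsByType
    @type CHAR(2)
AS
BEGIN
    SET NOCOUNT ON;
    -- Same temp-table shape as the EXEC example above
    CREATE TABLE #t
    (
        ID INT,
        Name VARCHAR(150),
        Number VARCHAR(15)
    );
    INSERT INTO #t (ID, Name, Number)
    SELECT object_id, name, 12
    FROM sys.objects
    WHERE type = @type;
    SELECT ID, Name, Number FROM #t;
END
The SSIS source query then becomes EXEC dbo.GetObjectsByType ? WITH RESULT SETS ((ID INT, Name VARCHAR(150), Number VARCHAR(15))), and the parameter can be mapped in the usual way.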

T-SQL - Where is #tempimport?

I have a legacy db from an application that is no longer supported. I'm trying to deconstruct a stored procedure to make a modification, but I can't figure out where #tempimport is located. I've searched high and low through the schema, but I'm not finding anything close. This is how it's used:
SET NOCOUNT ON;
DECLARE @saleid UNIQUEIDENTIFIER
SELECT TOP 1 @saleid = [sale_id] FROM #tempimport;
Is this a T-SQL specific thing, or can I actually find the data in this table somewhere?
Tables with the # preceding the name are temporary tables.
You can find them in the tempdb database under System Databases
Tables that are prefixed with a # are temporary tables that are created in the tempdb system database.
Try to find the code that creates this table in your DB schema:
select * from information_schema.routines
where object_definition(object_id(routine_name)) like '%#tempimport%'
Try to find #tempimport in the schemas of neighboring DBs.
Try to find #tempimport in your application sources, if you have them.
Try to profile your application (with the SQL Profiler tool) and search for #tempimport there.
Additional
#tempimport can be created by any connected application, not only from code stored in your DB.
You can dig deeper and try to monitor the moment of #tempimport's creation. Example:
select left(name, charindex('_',name)-1)
from tempdb..sysobjects
where charindex('_',name) > 0 and
xtype = 'u' and not object_id('tempdb..'+name) is null
and left(name, charindex('_',name)-1) = '#tempimport'
source: Is there a way to get a list of all current temporary tables in SQL Server?
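On SQL Server 2005 and later you can also peek at the catalog views directly while the table is expected to be alive; a quick sketch:
SELECT name, create_date
FROM tempdb.sys.tables
WHERE name LIKE '#tempimport%';
Local temporary table names are padded with underscores and a session suffix in tempdb, which is why the LIKE pattern is needed.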

Loading data of one table into another residing on different databases - Netezza

I have a big file which I have loaded into a table in a Netezza database using an ETL tool; let's call this database Staging_DB. Now, after some verifications, the content of this table needs to be inserted into a similarly structured table residing in another Netezza DB; let's call this one PROD_DB. What is the fastest way to transfer data from Staging_DB to PROD_DB?
Should I be using the ETL tool to load the data into PROD_DB? Or,
Should the transfer be done using external tables concept?
If there is no transformation to be done, then the better way to transfer is a cross-database data transfer. As described in the Netezza documentation, Netezza supports cross-database access where the user has object-level permission on both databases.
You can check permissions with the following command -
dbname.schemaname(loggedin_username)=> \dpu username
Please find a working example below -
INSERT INTO PROD_DB..TBL1 SELECT * FROM Staging_DB..TBL1
If you want to do some transformation and then insert into another database, you can write UDT procedures (also called resultset procedures).
Hope this will help.
One way you could move the data is by using Transient External Tables. Start by creating a flat file from your source table/db. Because you are moving from Netezza to Netezza you can save time and space by turning on compression and using internal formatting.
CREATE EXTERNAL TABLE 'C:\FileName.dat'
USING (
delim 167
datestyle 'MDY'
datedelim '/'
maxerrors 2
encoding 'internal'
Compress True
REMOTESOURCE 'ODBC'
logDir 'c:\' ) AS
SELECT * FROM source_table;
Then create the table in your target database using the same DDL in the source and just load it up.
INSERT INTO target SELECT * FROM external 'C:\FileName.dat'
USING (
delim 167
datestyle 'MDY'
datedelim '/'
maxerrors 2
encoding 'internal'
Compress True
REMOTESOURCE 'ODBC'
logDir 'c:\' );
I would write an SP on the production db and do a CTAS from the staging to the production database. The beauty of an SP is that you can add transformations as well.
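A sketch of what that stored procedure could look like in NZPLSQL follows; the procedure and table names are placeholders, and the CTAS runs while connected to PROD_DB.
CREATE OR REPLACE PROCEDURE LOAD_TBL1()
RETURNS BOOLEAN
LANGUAGE NZPLSQL
AS
BEGIN_PROC
BEGIN
    -- CTAS from staging into prod; fold any transformations into the SELECT
    EXECUTE IMMEDIATE 'CREATE TABLE TBL1_NEW AS SELECT * FROM STAGING_DB..TBL1';
    RETURN TRUE;
END;
END_PROC;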
One other option is the nz_migrate utility provided by Netezza, and that is the fastest route, I believe.
A simple SQL query like
INSERT INTO PROD_DB..TBL1 SELECT * FROM Staging_DB..TBL1
works great if you just need to do that.
Just be aware that you have to be connected to the destination database when executing the query, otherwise you will get an error code
HY0000: "Cross Database Access not supported for this type of command"
even if you have read/write access to both databases and tables.
In most cases you can simply change the catalog using a "Set Catalog" command
https://www-304.ibm.com/support/knowledgecenter/SSULQD_7.0.3/com.ibm.nz.dbu.doc/r_dbuser_set_catalog.html
set catalog='database_name';
insert into target_db.target_schema.target_table select * from source_db.source_schema.source_table;

How can I compare tables in two different databases using SQL?

I'm trying to compare the schemas of two tables that exist in different databases. So far, I have this query
SELECT * FROM sys.columns WHERE object_id = OBJECT_ID('table1')
The only thing is that I don't know how to use the sys.columns to reference a database other than the one that the query is connected to. I tried this
SELECT * FROM db.sys.columns WHERE object_id = OBJECT_ID('table1')
but it didn't find anything.
I'm using SQL Server 2005
Any suggestions? thanks!
Take a look at Red Gate's SQL Compare.
To answer your specific question, you need to fully qualify the table reference.
SELECT * FROM db.sys.columns WHERE object_id = OBJECT_ID('db.SchemaName.table1')
Late one but hopefully useful.
Even though chama asked for SQL solutions, I'd still recommend using third-party tools such as ApexSQL Diff or the tools from Red Gate that Joe already mentioned (I've used both and they worked great).
The reason is that a query comparing two tables using the information schema has to be quite complex in order to catch all differences.
Note that all of the examples mentioned here only cover columns; none of the queries shown will show the difference between nvarchar(20) and nvarchar(50), or differences in foreign keys, indexes, and so on.
Short answer is yes – this is possible using information schema views but it can be rather complex if you want to compare every detail in those two tables.
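As a starting point for going beyond existence checks, a sketch along these lines (db1 and db2 are placeholders for your database names) will surface type and length mismatches, though not keys or indexes:
select c1.table_name, c1.column_name,
       c1.data_type as db1_type, c2.data_type as db2_type,
       c1.character_maximum_length as db1_length, c2.character_maximum_length as db2_length
from db1.information_schema.columns c1
join db2.information_schema.columns c2
  on c1.table_schema = c2.table_schema
 and c1.table_name = c2.table_name
 and c1.column_name = c2.column_name
where c1.data_type <> c2.data_type
   or isnull(c1.character_maximum_length, -1) <> isnull(c2.character_maximum_length, -1);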
All you need is to specify the DB name and schema when calling the OBJECT_ID function, like:
SELECT *
FROM DB_NAME.sys.columns
WHERE object_id = OBJECT_ID('DB_NAME.SCHEMA_NAME.table1')
Try the information_schema, e.g.:
select *
from
db1.information_schema.columns col1
join db2.information_schema.columns col2
on col1.table_schema = col2.table_schema
and col1.table_name = col2.table_name
and col1.column_name = col2.column_name
...
The information_schema simplifies sticking together the information from all the sys.columns, sys.objects, etc. views. It exists automatically in your DB. I think it's actually an ISO standard thing, so it should work on various DB systems.
More information about the information_schema can be found here
Comparing whether the object or columns exists in both schemas is only a tiny bit of the solution. What if they exist in both databases but are different?
For my bsn ModuleStore project, I implemented a scripting routine which actually scripts most DB objects including table and view columns, indexes, namespaces etc. as XML using T-SQL code only. This may be a good place to start. You can find it on Google code, and the file in question (which generates the SQL query for dumping the object schema to XML) is here.
Code -
if object_id('tempdb..#a') is not null drop table #a
if object_id('tempdb..#b') is not null drop table #b
select *
into #a
from [databasename1].information_schema.columns a
--where table_name = 'aaa'
select *
into #b
from [databasename2].information_schema.columns b -- add linked server name and db as needed
--where table_name = 'bbb'
select distinct (a.table_name), b.TABLE_SCHEMA + '.' + (b.table_name) TableName, b.TABLE_CATALOG DatabaseName
from #a a
right join #b b
on a.TABLE_NAME = b.TABLE_NAME and a.TABLE_SCHEMA = b.TABLE_SCHEMA
where a.table_name is null -- and a.table_name not like '%sync%'
Just in case you are using MS VS 2015 (Community is a free download): the SQL Server tools include a Schema Compare utility. "SQL Server Data Tools (SSDT) includes a Schema Compare utility that you can use to compare two database definitions".
This is a GPL Java program I wrote for comparing data in any two tables, with a common key and common columns, across any two heterogeneous databases using JDBC: https://sourceforge.net/projects/metaqa/
It intelligently forgives (numeric, string and date) data type differences by reducing them to a common format. The output is a sparse tab delimited file with .xls extension for use in a spreadsheet.
The program ingests SQL that is used to produce a source table that can be compared with the target table. The target table SQL can be generated automatically. The target table is read one row at a time and therefore should be indexed on the common key.
It detects missing rows on either side and common keyed rows with other optional column differences. Obviously the meta data can be accessed by SQL so whether your concern is with the data, or with the meta-data, it will still work.
This is very powerful in a data migration or System migration project, and also for auditing interfaces. You will be astounded at the number of errors it detects. Minimal false positives still do occur.
Informix, Oracle and SQL-Server were the first JDBC targets and you can extend that list if desired.

Can DTS Test for Presence of MS-Access Table

I have an Access database in which I drop the table and then create the table afresh. However, I need to be able to test for the table in case the table gets dropped but not created (i.e. when someone stops the DTS package just after it starts -roll-eyes- ). If I were doing this in the SQL database I would just do:
IF (EXISTS (SELECT * FROM sysobjects WHERE name = 'Table-Name-to-look-for'))
BEGIN
DROP TABLE [Table-Name-to-look-for]
END
But how do I do that for an Access database?
Optional answer: is there a way to have the DTS package ignore the error and just go to the next step rather than checking to see if it exists?
SQL Server 2000
I'm not sure whether you can query the system objects table in an Access database from a DTS package.
If that doesn't work, why not just try doing a SELECT * from the Access table in question and then catch the error if it fails?
Try the same T-SQL, but in MS Access the system objects table is called MSysObjects.
Try this:
SELECT * FROM MSysObjects WHERE Name = 'your_table';
and see if it works from there.
You can take a look at these tables if you go to Tools -> Options -> View (a tab) and check Hidden Objects and System Objects, so you can see both. If you open the table, you should see your table names, queries, etc. Do not change these manually or the DB could panic :)
Martin.
P.S.: Your IF EXISTS should also check the object type:
IF EXISTS (SELECT * FROM sysobjects WHERE id = object_id(N'[dbo].[Your_Table_Name]') AND OBJECTPROPERTY(id, N'IsUserTable') = 1)
Microsoft Access has a system table called MSysObjects that contains a list of all database objects, including tables. Table objects have Type 1, 4 and 6.
It is important to reference the type:
... Where Name='TableName' And Type In (1,4,6)
Otherwise, what is returned could be some object other than a table.
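Putting the pieces together, a minimal existence check against the Access database could look like this; note that Access may deny read access to MSysObjects by default, in which case the try-a-SELECT-and-catch-the-error approach above is the fallback.
SELECT COUNT(*) AS TableExists
FROM MSysObjects
WHERE Name = 'Table-Name-to-look-for'
AND Type IN (1, 4, 6);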
