Comparing two Oracle schemas (other users) - database

I have been tasked with comparing two Oracle schemas with a large number of tables to find the structural differences between them. Up until now I have used the DB Diff tool in Oracle SQL Developer, and it has worked very well. The issue is that now I need to compare tables in a user that I cannot log into, but I can see it through the Other Users section in SQL Developer. Whenever I try to use the diff tool to compare those objects to the other schema, it does not work. Does anyone have any idea how to do this? It would save me a very large amount of work. I have some basic SQL knowledge if that is what's needed. Thanks.

If you have been GRANTed permissions in that other schema, issue an
alter session set current_schema = OTHER_SCHEMA_NO_QUOTES_REQUIRED;
then run whatever tool.
Otherwise, it's select * from all_tables where owner = 'OTHER_USER';, select * from all_indexes where owner = 'OTHER_USER';, etc.
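If the diff tool won't cooperate with the Other Users view, the data-dictionary approach can be scripted directly. A minimal sketch (SCHEMA_A and SCHEMA_B are placeholder names) that lists column-level differences visible to your account:

```sql
-- Columns present in SCHEMA_A but missing (or typed differently) in SCHEMA_B.
-- Swap the operands to see the reverse direction.
SELECT table_name, column_name, data_type, data_length, nullable
  FROM all_tab_columns
 WHERE owner = 'SCHEMA_A'
MINUS
SELECT table_name, column_name, data_type, data_length, nullable
  FROM all_tab_columns
 WHERE owner = 'SCHEMA_B';
```

The same MINUS pattern works against all_indexes, all_constraints, etc., as long as your account can see both schemas in the ALL_* views.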

Just reviving this question with a correct answer.
If you can get your DBA to grant you proxy access, you can do the following without knowing the password of the end schema:
ALTER USER {user you do not have the pw to - let's call it ENDSCHEMA} GRANT CONNECT THROUGH {user you have the pw for - let's call it YOURSCHEMA};
Then you create a connection in SQL Developer where:
username: YOURSCHEMA[ENDSCHEMA]
password: YOURSCHEMA's password
Then you can proceed and do a Database Diff on both schemas without ever knowing the password for ENDSCHEMA.
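The DBA-side setup described above can be sketched like this (ENDSCHEMA and YOURSCHEMA are the placeholder names from the answer):

```sql
-- Run as a DBA:
ALTER USER endschema GRANT CONNECT THROUGH yourschema;

-- Verify the proxy grant via the PROXY_USERS dictionary view:
SELECT proxy, client FROM proxy_users WHERE proxy = 'YOURSCHEMA';

-- To clean up afterwards:
-- ALTER USER endschema REVOKE CONNECT THROUGH yourschema;
```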

First you have to run
ALTER SESSION SET CURRENT_SCHEMA = SCHEMA;
But also, in the first screen of the diff tool, under DDL Comparison Options - Schema - you have to choose "Maintain" (not "Consolidate").
(I have SQL Developer with an Italian interface, so I translated the options; the names could be a little different.)
It works for me.

You can easily compare everything in two or more schemas using Toad for Oracle, as in these screenshots (images not shown):
Step 1: open the schema compare tool.
Step 2: the compare window will appear; add the schemas you want to compare (by default the connected schema will be the one above), then click Next.
Step 3: select the objects you want to compare in both schemas, then run.

Related

ODBC query against sysobjects does not return user defined objects

I'm running the following query
SELECT *
FROM sysobjects
WHERE type = 'P'
ORDER BY name
against several SQL Server 2019 databases using ODBC.
For one of the databases the query refuses to return user-defined objects over ODBC. It returns all system procedures, and if I drop the type filter it returns all system objects, but no user-defined tables, views or procedures.
Executing the same query as that user in SSMS does return the user-defined objects.
I'm using the same SQL Server login/user in all the databases and I have double-checked the permissions over and over again. Effective permissions for the user in all databases are:
CONNECT
SELECT
VIEW ANY COLUMN ENCRYPTION KEY DEFINITION
VIEW ANY COLUMN MASTER KEY DEFINITION
VIEW DATABASE STATE
VIEW DEFINITION
The ODBC datasources are all set up the same way.
But obviously I'm missing something - any 2 cents welcome!
DOH - I lied when I said the ODBC sources were all set up the same. For some reason I had missed changing the default database to the one in question in this case, and I managed to miss that several times when I went through and checked.
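A quick way to guard against a wrong default database in a DSN is to qualify the catalog explicitly (the database name below is a placeholder), since a three-part name pins the query to one database regardless of the DSN's default:

```sql
-- MyDatabase is a placeholder name
SELECT name, type
FROM MyDatabase.sys.objects
WHERE type = 'P'
ORDER BY name;

-- Or check which database the connection actually landed on:
SELECT DB_NAME() AS current_database;
```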

Azure SQL Database - change user permissions on a read-only database for cross-database queries

We use Azure SQL Database, and therefore had to jump through some hoops to get cross-database queries set up. We achieved this following this great article: https://techcommunity.microsoft.com/t5/azure-database-support-blog/cross-database-query-in-azure-sql-database/ba-p/369126 Things are working great for most of our databases.
The problem comes in for one of our databases which is read-only. The reason it's read-only is b/c it is being synced from another Azure SQL Server to derive its content. This is being achieved via the Geo-Replication function in Azure SQL Database. When attempting to run the query GRANT SELECT ON [RemoteTable] TO RemoteLogger as seen in the linked article, I of course get the error "Failed to update because the database is read-only."
I have been trying to come up with a workaround for this. It appears user permissions are one of the things that do NOT sync as part of the geo-replication, as I've created this user and granted the SELECT permission on the origin database, but it doesn't carry over.
Has anyone run into this or something similar and found a workaround/solution? Is it safe/feasible to temporarily set the database to read/write, update the permission, then put it back to read-only? I don't know if this is even possible - I was told by one colleague that they think it will throw an error along the lines of "this database can't be set to read/write b/c it's syncing from another database..."
I figured out a work-around: Create a remote connection to the database on the ORIGIN server. So simple, yet it escaped me until now. Everything working great now.
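For reference, pointing the remote connection at the origin server uses the same elastic-query objects as the linked article; a minimal sketch, in which the server, database, credential, and table names are all placeholders:

```sql
-- All names and secrets below are placeholders.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL OriginCred
WITH IDENTITY = 'RemoteLogger', SECRET = '<password>';

-- Point at the ORIGIN (read/write) server instead of the read-only replica:
CREATE EXTERNAL DATA SOURCE OriginDb
WITH (TYPE = RDBMS,
      LOCATION = 'origin-server.database.windows.net',
      DATABASE_NAME = 'OriginDatabase',
      CREDENTIAL = OriginCred);

CREATE EXTERNAL TABLE dbo.RemoteTable (Id int, Payload nvarchar(100))
WITH (DATA_SOURCE = OriginDb);
```

The GRANT SELECT from the article then runs on the origin database, which is writable.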

Cannot describe in Snowflake. No active database

A newbie to Snowflake and I cannot use the database navigator to look at a table.
It gives me the following error: "Cannot perform DESCRIBE. This session does not have a current database. Call 'USE DATABASE', or use a qualified name."
I changed the context to use the right database.
I changed my role to SYSADMIN.
I entered "USE DATABASE CITIBIKE".
Still no joy.
You can try specifying your database while creating the connection with Snowflake, as below:
const snowflake = require('snowflake-sdk');
const connection = snowflake.createConnection({
  username: "username",
  password: "password",
  account: "accountname",
  database: "database name"
});
connection.connect((err, conn) => {
  if (err) console.error('Unable to connect: ' + err.message);
});
Source:
https://docs.snowflake.com/en/user-guide/nodejs-driver-use.html
Note: you can also specify schema, warehouse and role.
This worked for me.
Are you using a worksheet or have you tried at the worksheet level with the necessary permissions to the object? Perhaps try using the qualified name? For example, if you're trying to describe a table in Snowflake:
desc table database_name.schema_name.table_name
Please note if any of the objects here have special characters, were stored in lowercase, or have spaces, you may need to put double quotes around them.
Edit: After further investigation and understanding of the issue, this is actually a UI issue the Snowflake Dev team is working to resolve. Hoping to see a fix in the next release.

Using dba_CopyLogins as part of a Database Migration

I'm considering using dba_CopyLogins (found on SQLSoldier here, and referenced in many migration-related threads on dba.stackexchange) to get the logins moved over as part of a migration from SQL 2008 R2 to SQL 2012.
Edit: Why not just use sp_help_revlogin?
As I understand it (please correct me if I'm wrong), sp_help_revlogin comes with a lot of caveats. It does not map the users to the databases, it does not copy over the passwords, and it does not handle explicit permissions. I'm hoping to avoid using a script to transfer logins that results in me still having to touch 50+ individual logins one at a time.
Most of our user logins (possibly all, but I've not checked each one) are Windows logins, so maybe I'm making mountains out of molehills.
I have some questions concerning how to use [dba_CopyLogins] properly, since comments on other users' questions have been unclear/contradictory or did not address my question, and to call me an "Accidental DBA" would be being kind.
First: I'm migrating from SQL 2008 R2 (box A) to SQL 2012 (box B). Is dba_CopyLogins meant to be created/run on box A, or on box B?
Second: @PartnerServer... is that meant to be the server where the logins exist?
Is this where I put the name of the linked server object?
Third: Box A and B are on different domains; does that matter?
Fourth: Just looking for confirmation that this script will fail if a login already exists with the same name. Will it fail only that login, or will the whole script abort?
I used, as mentioned in the comments, the sp_help_revlogin method.
You can read more about it here: https://support.microsoft.com/en-us/kb/246133
The script must be run on box A and then the stored procedure called.
The stored procedure will output a script for recreating the logins. When you run it on box B, it will fail only on existing logins.
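The steps above can be sketched as follows (the procedure names come from the KB article; the generated statement shown in comments is a hypothetical example of its output):

```sql
-- On box A, after creating sp_hexadecimal and sp_help_revlogin from KB 246133:
EXEC sp_help_revlogin;

-- The output is a script of statements like the hypothetical one below,
-- which you then run on box B:
-- CREATE LOGIN [SomeSqlLogin]
--     WITH PASSWORD = 0x0100... HASHED, SID = 0x..., CHECK_POLICY = OFF;
```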
Different domains? Yes, it is a problem - if you use integrated security and there is no trust between the domains.
But if you use integrated security and you have the same users in the destination domain, with sp_help_revlogin you can adapt the script to map the logins to the destination domain. I used this method to align stuff.
Another issue is the default DB. It will be lost, and sometimes this is a problem for some applications (with bad configuration).
I used to run this script on the source server and then run its output on the destination server:
SELECT 'EXEC sp_defaultdb ''' + name + ''', ''' + dbname + ''''
FROM master.dbo.syslogins
WHERE [name] IS NOT NULL
AND dbname IS NOT NULL

Determine which user deleted a SQL Server database?

I have a SQL Server 2005 database that has been deleted, and I need to discover who deleted it. Is there a way of obtaining this user name?
Thanks, MagicAndi.
If there has been little or no activity since the deletion, then the out-of-the-box default trace may be of help. Try running:
DECLARE @path varchar(256)
SELECT @path = path
FROM sys.traces
WHERE id = 1
SELECT *
FROM fn_trace_gettable(@path, 1)
[In addition to the out-of-the-box trace, there is also the less well-known 'black box' trace, which is useful for diagnosing intermittent server crashes. This post, SQL Server’s Built-in Traces, shows you how to configure it.]
I would first ask everyone who has admin access to the Sql Server if they deleted it.
The best way to retrieve the information is to restore the latest backup.
Now to discuss how to avoid such problems in the future.
First, make sure your backup process is running correctly and frequently. Take transaction log backups every 15 minutes, or every half hour if it is a highly transactional database. Then the most you lose is half an hour's worth of work. Practice restoring the database until you can easily do it under stress.
In SQL Server 2008 you can add DDL triggers (not sure if you can do this in 2005), which allow you to log who made changes to the structure. It might be worth your time to look into this.
Do NOT allow more than two people admin access to your production database - a DBA and a backup person for when the DBA is out. These people should load all changes to the database structure and code, and all of the changes should be scripted out, code-reviewed and tested first on QA. No unscripted, "run by the seat of your pants" code should ever be run on prod.
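The DDL-trigger suggestion above can be sketched like this (the trigger and audit-table names are made up; DROP_DATABASE is a server-scoped event, so the trigger lives at the server level and the audit table must already exist):

```sql
-- Trigger and log-table names are hypothetical.
CREATE TRIGGER trg_audit_drop_database
ON ALL SERVER
FOR DROP_DATABASE
AS
BEGIN
    -- Record who dropped what, and when, from the event payload
    INSERT INTO master.dbo.ddl_audit (event_time, login_name, event_xml)
    VALUES (GETDATE(), ORIGINAL_LOGIN(), EVENTDATA());
END;
```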
Here is a bit more precise T-SQL:
SELECT DatabaseID,NTUserName,HostName,LoginName,StartTime
FROM
sys.fn_trace_gettable(CONVERT(VARCHAR(150),
( SELECT TOP 1
f.[value]
FROM sys.fn_trace_getinfo(NULL) f
WHERE f.property = 2
)), DEFAULT) T
JOIN sys.trace_events TE ON T.EventClass = TE.trace_event_id
WHERE TE.trace_event_id =47 AND T.DatabaseName = 'delete'
-- 47 Represents event for deleting objects.
This can be used both when you know the database/object name and when you don't. The results include the DatabaseID, NTUserName, HostName, LoginName and StartTime columns (screenshot not shown).
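To confirm the event ID used in the WHERE clause above (47 is Object:Deleted), you can look it up in the trace-event catalog:

```sql
-- Lists deletion-related trace events; 47 should appear as Object:Deleted
SELECT trace_event_id, name
FROM sys.trace_events
WHERE name LIKE '%Delet%';
```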