How to check whether a cursor (dbh) is active for a Sybase connection in Python

In DB2 we have the ibm_db module, whose active() function checks whether the connection/cursor (dbh) is active or not.
I am looking for a similar facility in the Sybase module.

There is no active() method in the Sybase module. You need to run a lightweight query (select 1) and catch the exception if it fails.
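A minimal sketch of that approach, written against the generic Python DB-API (the exact module and exception class depend on which Sybase driver you use, so treat those names as assumptions):

def is_connection_active(conn):
    """Return True if the connection can still run a trivial query."""
    try:
        cur = conn.cursor()
        cur.execute("select 1")   # cheap probe query
        cur.fetchall()
        cur.close()
        return True
    except Exception:             # e.g. Sybase.Error / the driver's DatabaseError
        return False

Call is_connection_active(dbh) before reusing a cached connection and reconnect if it returns False.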

Related

Debezium SQL Server Connector - "Couldn't obtain database name"

I'm trying to set up a Debezium SQL Server Connector against a SQL Server instance that is controlled by DBAs at my workplace. I've been able to start up Zookeeper and Kafka Server without issue, and Kafka Connect itself works with sample Connectors, but when attempting to start a Debezium SQL Server Connector instance I've been getting the error "Couldn't obtain database name".
[2022-07-12 16:36:04,269] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:117)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 1 error(s):
Unable to connect. Check this and other connection properties. Error: Couldn't obtain database name
Here is my debezium config:
name=Dbz-SqlServer-connector
connector.class=io.debezium.connector.sqlserver.SqlServerConnector
database.hostname=MyDbHost
database.port=1433
database.user=MyUsername
database.password=MyPassword
database.dbname=MyDatabase
database.server.name=MyDbHost
table.include.list=dbo.CdcTest
database.history.kafka.bootstrap.servers=localhost:9092
database.history.kafka.topic=dbhistory.CdcTest
I've tried this in a .properties file passed to a standalone Connect instance, and as a JSON POST to a distributed Connect instance. I have tried all of the same steps on both my local Windows machine as well as on a linux VM, with the same results.
Confluent and Docker are not options for me in this situation.
For SQL Server login credentials, I am using a local account on the SQL Server instance that does have access to the database in question. I found the source code for Debezium's connectors on their GitHub and was able to locate that specific error message within the code:
private static final String GET_DATABASE_NAME = "SELECT name FROM sys.databases WHERE name = ?";
...
public String retrieveRealDatabaseName(String databaseName) {
    try {
        return prepareQueryAndMap(GET_DATABASE_NAME,
                ps -> ps.setString(1, databaseName),
                singleResultMapper(rs -> rs.getString(1), "Could not retrieve exactly one database name"));
    }
    catch (SQLException e) {
        throw new RuntimeException("Couldn't obtain database name", e);
    }
}
I'm not completely familiar with Java but it appears that basically something is going wrong when the connector is trying to run "SELECT name FROM sys.databases WHERE name = 'MyDatabase'". When I run this against the database myself, logged in with the same account I'm using, it seems to work just fine, so I'm really not sure where to go from here. It is fair to say that since I'm not in full control of the SQL Server environment that I'm using, there may be some permissions issues that I'm not aware of, but from what I'm able to test it seems like it should be working.
I would greatly appreciate any help at all, whether just suggestions on settings/configs to check or a full-blown solution.
Thank you!
Update: I've built a simple console app to run that sys.databases query against MyDbHost, master database, as the relevant account, and it's working just fine so I feel like that confirms that my connection info is correct and account permissions are also correct. Seems like this is an issue within the Debezium connector.
It turned out that my problem was a mistake in the connector's config settings. I misunderstood which specific pieces of data to put into database.hostname and database.server.name, and once I corrected those fields the connector works.
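For anyone hitting the same confusion: database.hostname should be the actual network host of the SQL Server instance, while database.server.name is only a logical name that Debezium uses as a namespace and Kafka topic prefix. A hedged sketch with placeholder values (not the poster's real settings):

# real host (or IP) the SQL Server instance listens on
database.hostname=sqlserver01.internal.example
# logical name, used only as a topic prefix; must be unique per connector
database.server.name=cdctest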

Oracle container database 12c, connecting using JDBC

I'm trying to learn how to use Oracle Container database, and just do basic JDBC connections. I installed a dockerised version of Oracle:
https://hub.docker.com/_/oracle-database-enterprise-edition
Which according to the data sheet comes set up with a CDB database called ORCLCDB and a PDB database called ORCLPDB1.
So I figured out I can connect to it like this:
jdbc:oracle:thin:@localhost:1555:ORCLCDB
with username sys, password Oradoc_db1, and setting the special internal_logon jdbc parameter equal to "sysdba" to avoid the error "local oracle CDB: ORA-28009: connection as SYS should be as SYSDBA or SYSOPER"
And I figured out I can switch to the PDB by entering this:
ALTER SESSION SET CONTAINER=ORCLPDB1
And I can then create a new user:
CREATE USER MYUSER IDENTIFIED BY MYPASSWORD1
But then I'm stuck. I think there should be some way to connect directly to the PDB using a JDBC connect string. Every time I google about this, it talks about tnsnames blah blah, but people who use JDBC connections are typically using Tomcat on a server, or otherwise don't have the Oracle Client installed. They expect to be able to connect to Oracle with just the thin driver installed, nothing else.
I've tried the obvious using:
jdbc:oracle:thin:@localhost:1555:ORCLPDB1
with username myuser or sys, but I always get:
ORA-12505, TNS:listener does not currently know of SID given in connect descriptor
At this point I'm stuck.
You need to use a SERVICE_NAME in order to connect to an Oracle container database
Please alter your connect string like this:
jdbc:oracle:thin:@localhost:1555/ORCLPDB1
A SERVICE_NAME is denoted by a "/"
A SID (SystemIDentifier) is denoted by a ":" (not to be used)
Note! Default listener port is 1521, not sure why you specifically want a different port.
Best of luck!
Apparently the correct answer is this...
jdbc:oracle:thin:@localhost:1521/ORCLPDB1.localdomain
Then I can connect as SYS using the method above. If I want to connect as the created user, I also need...
grant create session to myuser;
and then turn off the internal_logon JDBC parameter.
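Putting the thread together, the working sequence is roughly this (a recap of the steps above, using the service name and domain suffix already shown, not anything universal):

-- connect to the CDB as SYS with internal_logon=sysdba, then:
ALTER SESSION SET CONTAINER = ORCLPDB1;
CREATE USER MYUSER IDENTIFIED BY MYPASSWORD1;
GRANT CREATE SESSION TO MYUSER;
-- afterwards connect straight to the PDB as MYUSER using the service-name URL form:
-- jdbc:oracle:thin:@localhost:1521/ORCLPDB1.localdomain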

Cannot Import DB using the Azure Portal

I am trying to import a DB in the Azure Portal. The original DB I exported from was on a different server, but is setup identically to where I am trying to import. I am importing by going to the target server and clicking the import button. I then choose the storage account, container, and bacpac file I would like to import. I check that the database size and type are the same for the import as the bacpac file. I also double check that the collation is the same on the import as it is in the bacpac. I then confirm. It attempts to do the import for about 20 minutes before giving the error message below. I can see the DB is created when I go to the sql server and click the sql databases blade, but the tables inside the DB are empty.
Could not import package.
Warning SQL72012: The object [data_0] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Warning SQL72012: The object [log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Error SQL72014: .Net SqlClie
I have seen some responses regarding similar issues, but they all seem to be using SSMS. Does anyone have any ideas on how to fix this issue inside the Azure Portal? Also, does anyone know what check box the warnings are talking about? There is no checkbox when I am doing the import setup.
The warnings you are getting are a bit of a red herring; the issue is the error itself. The line you posted only shows a generic error, which should be followed by the actual error text. Try going to the actual database server and checking the import/export history.
Try also importing using PowerShell, which may give you some more details on the error you are getting:
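# (assumption: $importRequest here is the object returned by an earlier New-AzSqlDatabaseImport call)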
$importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
[Console]::Write("Importing")
while ($importStatus.Status -eq "InProgress") {
    $importStatus = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
    [Console]::Write(".")
    Start-Sleep -s 10
}
[Console]::WriteLine("")
$importStatus
Without knowing the actual error you are getting, it's a bit of a stab in the dark trying to guess what the issue is. Given the version of SQL Server you are exporting from, I can guess this is an on-prem database server.
One of the common reasons bacpac files tend to fail when importing is that your source database server isn't configured to allow contained databases.
If that's the case, you need to go to your source database server (where you are exporting from) and enable that option:
sp_configure 'contained database authentication', 1;
GO
RECONFIGURE;
GO
Once this is run, recreate your bacpac file and try to import that.
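As a quick sanity check (not part of the original answer), you can confirm the setting took effect on the source server before re-exporting:

SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'contained database authentication';  -- value_in_use should now be 1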
As I mentioned, this is a complete stab in the dark, as you haven't provided the error you are actually getting.

How to import a database (or Schema) using impdp command in Oracle?

I've never worked with an Oracle database before; everything I've done with databases has used only MySQL. I got a .dpdmp file which I need to import into an Oracle database.
I tried the example provided in this link, but not a single statement is executed. In total it completes with 208 errors:
http://gerardnico.com/wiki/database/oracle/oracle_db_datapump
impdp system/root DIRECTORY=data_dump_dir DUMPFILE=MYDUMPFILE.DPDMP
Looking at the log files, I assume this is the root cause of the problem:
Processing object type SCHEMA_EXPORT/USER
ORA-39083: Object type USER:"MYUSER" failed to create with error:
ORA-65096: invalid common user or role name
Since the user creation failed, every statement executed after it also resulted in an error. The dump file was created in Oracle 10g, whereas my Oracle is 12c. Is this due to the version conflict?
Please create the user "MYUSER" first and then try to import with additional parameters like
remap_tablespace=old_tbsp:new_tbsp
This will import the data into your new tablespace. (The ORA-65096 error means the import is connecting to the 12c container root, where an ordinary user name like MYUSER is not allowed; create the user inside a pluggable database, and point the import at that PDB rather than at the CDB root.)
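For reference, a sketch of what the command from the question might look like with that remap added (the tablespace names are placeholders to replace with the real 10g source and 12c target names):

impdp system/root DIRECTORY=data_dump_dir DUMPFILE=MYDUMPFILE.DPDMP REMAP_TABLESPACE=OLD_TBSP:NEW_TBSP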

Error: SQL Server Import data to a new database

Trying to import data into a new database with SQL Server Management Studio's Import and Export Wizard. It shows the error below:
Warning 0x80049304: Data Flow Task 1: Warning: Could not open global shared memory to
communicate with performance DLL; data flow performance counters are not available. To
resolve, run this package as an administrator, or on the system's console.
Please tell me what I should do. Thanks!
(SQL Server 2012 64 bit, Windows 7)
To resolve, run this package as an administrator, or on the system's console.
Personally I think you should try this first. Then if that fails, tell us what it says.
