Apologies if this question has already been asked; I just couldn't find an answer that covers my case.
I'm trying to create a connection (database link) between an Oracle 11g database and an MS SQL Server database, which are on two different servers. I've followed the instructions at this link:
http://www.dba-oracle.com/t_heterogeneous_database_connections_sql_server.htm
The only difference is that in my listener on the SQL Server machine I'm using DG4ODBC rather than hsodbc.
I've listed my steps below, but I can't figure out how to resolve the error.
1) Installed the Oracle client on my SQL Server machine.
2) Created a 64-bit ODBC data source on the SQL Server machine, named dg4odbc, pointing to my target SQL Server database.
3) Created a file called initdg4odbc.ora in
D:\app\user\product\11.2.0\client_2\hs\admin with the following content:
# HS init parameters
#
HS_FDS_CONNECT_INFO = dg4odbc
HS_FDS_TRACE_LEVEL = on
4) Updated my listener.ora as below:
LISTENER =
  (ADDRESS_LIST =
    (ADDRESS = (PROTOCOL = tcp)(HOST = sqlserver)(PORT = 1521))
  )

SID_LIST_LISTENER =
  (SID_LIST =
    (SID_DESC =
      (SID_NAME = dg4odbc)
      (ORACLE_HOME = D:\app\user\product\11.2.0\client_2)
      (PROGRAM = dg4odbc)
    )
  )

#CONNECT_TIMEOUT_LISTENER = 0
5) When I stop and start the listener, I get this message:
Instance "dg4odbc", status UNKNOWN, has 1 handler(s) for this service...
The command completed successfully
6) On my Oracle database server, updated the tnsnames.ora file to include:
dg4odbc.world =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = sqlserver)(PORT = 1521))
    (CONNECT_DATA = (SID = dg4odbc))
    (HS = OK)
  )
7) When I try to ping the TNS entry using tnsping dg4odbc, I get the following error:
Used parameter files:
E:\oracle\product\11.2.0.4\dbhome_1\network\admin\sqlnet.ora
TNS-03505: Failed to resolve name
Could you please tell me where I'm going wrong? I'm getting a bit desperate to get this connection working.
I really appreciate your help on this.
Thanks
After I triggered and refreshed the DAG, the task went from running to delayed to failed. The Airflow error log told me to check the error on the SQL Server side, and when I checked the logs of my SQL Server Docker container I got "Failed to start system task System Task". I'm not sure whether I need to specify a schema, but the rest of the connection parameters are correct.
[entrypoint.sh]
"${AIRFLOW_CONN_MY_SRC_DB:=mssql+pyodbc://SA:P#SSW0RD#mssqlcsc380:1433/?driver=ODBC+Driver+17+for+SQL+Server}"
[dag.py]
import datetime as dt

from airflow import DAG
# On older Airflow 1.10 installs the operator lives in airflow.operators.mssql_operator instead
from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator
from airflow.utils.dates import days_ago

with DAG(
    'mssql_380_dag',
    start_date=days_ago(1),
    schedule_interval=None,
    catchup=False,
    default_args={
        'owner': 'me',
        'retries': 1,
        'retry_delay': dt.timedelta(minutes=5),
    },
) as dag:
    get_requests = MsSqlOperator(
        task_id='get_requests',
        mssql_conn_id='my_src_db',
        sql='select * from Request',
        dag=dag,
    )
The issue was just that it couldn't find the table, so I specified the database in the query, which fixed it, even though the database should have been recognized since I'd passed it in the connection string.
sql = 'use csc380db; select * from Request',
Say you have your tables stored in a SQL Server DB, and you want to perform multi-table actions, i.e. join several tables from that same database.
The following code can connect to and retrieve data from SQL Server:
library(dplyr)
library(DBI)    # dbConnect() is a DBI generic
library(odbc)

con <- dbConnect(odbc::odbc(),
                 .connection_string = "Driver={SQL Server};Server=.;Database=My_DB;")
Table1 <- tbl(con, "Table1")
Table1 # View glimpse of Table1
Table2 <- tbl(con, "Table2")
Table2 # View glimpse of Table2
Table3 <- tbl(con, "Table3")
However, after a few result sets have been retrieved over the same connection, the following error eventually occurs:
Error: [Microsoft][ODBC SQL Server Driver]Connection is busy with results for another hstmt
My googling so far has led me to the answer that the backend does not support multiple active result sets (MARS); I guess more than two active result sets is the limit? (The backend is DBI and odbc.)
So my question is: what is best practice if I want to collect data from several tables in the same SQL Server DB?
Open a connection for each table?
Actively close the connection and open it again for the next table?
Does the backend support MARS being passed in the connection string?
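For context, the kind of multi-table action I'm after looks roughly like this (only a sketch; Table1 and Table2 are the lazy references from above, and the key column "ID" is made up):
# Join two lazy table references; dbplyr should translate this into a single
# SQL join that runs on the server when collect() is called
joined <- Table1 %>%
  inner_join(Table2, by = "ID") %>%
  collect()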
To make a connection that can hold multiple result sets, I've had luck with the following connection code:
con <- DBI::dbConnect(odbc::odbc(),
Driver = "SQL Server Native Client 11.0",
Server = "my_host",
UID = rstudioapi::askForPassword("Database UID"),
PWD = rstudioapi::askForPassword("Database PWD"),
Port = 1433,
MultipleActiveResultSets = "True",
Database = my_db)
On top of that, I found that the pool package can do the job:
library(pool)

pool <- dbPool(odbc::odbc(),
Driver = "SQL Server Native Client 11.0",
Server = "my_host",
UID = rstudioapi::askForPassword("Database UID"),
PWD = rstudioapi::askForPassword("Database PWD"),
Port = 1433,
MultipleActiveResultSets = "True",
Database = my_db)
It is quicker and more stable than the DBI connection; one minor drawback, however, is that the database doesn't show up in the Connections tab for easy reference.
For both methods, remember to close the connection/pool when done. For the DBI method it's:
dbDisconnect(con)
Whereas the pool-method is closed by calling:
poolClose(pool)
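For what it's worth, both handles work with dplyr for multi-table queries; a minimal sketch is below (the table and column names are made up, and tbl() accepts a pool just like a DBI connection):
library(dplyr)
# Lazy references to two (hypothetical) tables on the same pool
orders <- tbl(pool, "Orders")
customers <- tbl(pool, "Customers")
# dbplyr translates the join and count into one SQL query, so only a
# single result set is open when collect() runs
orders_per_customer <- orders %>%
  left_join(customers, by = "CustomerID") %>%
  count(CustomerName) %>%
  collect()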
I am working off of a server housing various SQL databases (accessed via Microsoft SQL Server Management Studio) and am going to use R to perform analyses and explore a specific database within the server. I have network security that permits communication between machines, drivers installed on the R server, and RODBC installed.
When I attempt to establish a Windows ODBC connection in Control Panel > Administrative Tools > Data Sources (ODBC), I can only add a data source for the entire SQL Server instance, not just for the specific database I want to look at. I have pasted the code I have been experimenting with below.
library(RODBC)
channel <- odbcConnect("Example", uid="xxx", pwd="****")
sqlTables(channel)
sqlTables(channel, tableType = "TABLE")
res <- sqlFetch(channel, "samp.le", max = 15) # not recognized as a table
library(RODBC)
ch <- odbcDriverConnect('driver={"SQL Server"}; server=Example; database=dbasesample; uid="xxxx", pwd = "****"')
Response: Warning messages:
1: In odbcDriverConnect("driver={\"SQL Server\"}; server=sample; database=dbasesample; uid=\"xxxx", pwd = \"xxxx\"") :
[RODBC] ERROR: state IM002, code 0, message [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
2: In odbcDriverConnect("driver={\"SQL Server\"}; server=sample; database=dbasesample; uid=\"xxxx\", pwd = \"xxxx!\"") :
ODBC connection failed
Any insight into this issue would be much appreciated.
Although when querying with the sqlQuery() function you can specify the database, schema and table, e.g.:
library(RODBC)
con = odbcConnect(dsn = 'local')
sample_query = sqlQuery(con,'select * from db.dbo.table')
I have not found a way to specify the database from within the function parameters when using sqlFetch() or sqlSave(). An indirect way would be to define the default database in the DSN (as noted in the comments), but then you would need a different DSN for each database you want to use.
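Another indirect workaround is a DSN-less connection string via odbcDriverConnect(), which lets you name the database directly; a sketch (the server and table names are placeholders, and it assumes the standard SQL Server ODBC driver is installed):
library(RODBC)
# DSN-less connection; the database is chosen in the connection string itself
ch <- odbcDriverConnect('driver={SQL Server};server=my_host;database=dbasesample;trusted_connection=yes')
sqlFetch(ch, "some_table", max = 15)
odbcClose(ch)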
A better solution is to use the odbc and DBI packages instead of RODBC and define the database in the connection statement, e.g.:
library(dplyr)
library(DBI)
library(odbc)
con <- dbConnect(odbc::odbc(), dsn = 'local', database = 'db')
copy_to(con, rr2, temporary = FALSE)   # rr2 is a local data frame to upload
By the way, I found copy_to to be much faster than the equivalent sqlSave of RODBC.
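For completeness, reading the data back over the same connection is straightforward; a minimal sketch (the table name rr2 is simply whatever copy_to created above):
# Plain SQL through DBI
df <- dbGetQuery(con, "select * from rr2")
# Or a lazy dplyr/dbplyr reference
rr2_tbl <- tbl(con, "rr2")
# Close the connection when done
dbDisconnect(con)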
Working through a tutorial to pull database data with:
install.packages('RODBC')
require(RODBC)
myNewDB=odbcConnect("QV Training")
And I get the error:
In odbcDriverConnect("DSN=QV Training")
Data source name not found and no default driver specified
In odbcDriverConnect("DSN=QV Training") : ODBC connection failed
Is 'QV Training' meant to be the name of a database that may no longer be present?
How does R know where to look for the database anyway?
Thank you!
In Windows (unsure of other OSes) you need to go into the ODBC Data Source Administrator, and add the data source. The ODBC Data Source Administrator is accessed via the 'Administrative Tools' section of Control Panel (in Windows 10 at least).
The connection command is then simply
conn <- odbcConnect("QV Training")
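If you're not sure which DSNs R can actually see (a common gotcha is a 32-bit/64-bit mismatch between R and the ODBC administrator you used), a quick check from R is:
library(RODBC)
# Named vector of the DSNs visible to this R session (name -> driver)
odbcDataSources()
# Once "QV Training" appears in that list, the connection should work
conn <- odbcConnect("QV Training")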
library(RODBC)
# Connect to the "Oracle" DSN; rows_at_time sets how many rows are fetched per call
con <- odbcConnect("Oracle", uid = "system", pwd = "root", rows_at_time = 500)
# Total size of each data file in MB
sqlQuery(con, "select file_name, sum(bytes)/1024/1024 AS MB from dba_data_files group by file_name")
# Pull the full dba_data_files view into a data frame
d <- sqlQuery(con, "select * from dba_data_files")
close(con)
I am trying to connect to a remote database using the TOAD client. My Oracle version is 10.2g. I followed the instructions specified in this link, except for step 13, because I can't understand what it explains. My Oracle TNS_ADMIN path is C:\oracle\product\10.2.0\db_1\network\ADMIN and ORACLE_HOME is C:\oracle\product\10.2.0\db_1. I found many questions regarding this, but none of them answered mine.
We encountered this issue as well with Toad 10.x and Oracle Instant Client 11.2.x. The (rather silly) solution for us was to add a comment to the very top of the tnsnames.ora file. So for Step #8 in the linked instructions, the tnsnames.ora file would have this instead:
# Leave whitespace before your first entry below...
VIS =
(DESCRIPTION =
(ADDRESS_LIST =
(ADDRESS = (PROTOCOL = TCP)(HOST = <host_name or ip_address>)(PORT = 1521))
)
(CONNECT_DATA =
(SID = <instance_name>)
(SERVER = DEDICATED)
)
)
Could you please verify your TOAD installation directory?
If it is something like C:\Program Files (x86)\..., the below link may help:
Yves' Blog
Apparently there are issues when installing the 32-bit version of TOAD into a directory whose path contains '(' and ')' characters.
In my case, it was solved by removing extra white space before the connection name. There should be no white space just before "TEST.CONN", and none of the other entries in the tnsnames.ora file can have any either.
(No White Space)TEST.CONN=
(DESCRIPTION=
(ADDRESS=
(PROTOCOL=TCP)
(HOST=)
(PORT=)
)
(CONNECT_DATA=
(SERVICE_NAME=)
)
)