In "Talend Data Integration" I want to create a connection using JDBC to a Progress OpenEdge database. I have no experience whatsoever with this type of connection.
My ODBC connections to the same resources work fine, but Talend requires a JDBC connection to function properly.
The connection settings in Talend I have at the moment are:
DB Type: General JDBC
JDBC URL: jdbc:sqlserver://db-name:port;databaseName=**
Driver jar: ??? (which jar-file do I need for OpenEdge?)
Class name: ??? (which class name do I need for OpenEdge?)
User name: *
Password: *
Schema: ??? (don't know what this means...?)
Mapping file: ??? (which xml-file do I need for Progress OpenEdge?)
EDIT: I am using Windows 7 on a 64-bit machine, using Talend Open Studio for Data Integration version 5.3.0.r101800.
Setup OpenEdge ODBC connection:
new OdbcConnection("Driver={Progress OpenEdge 10.2B Driver}; HOST=" + host + "; PORT=" + portNumber + "; DB=" + databaseName + "; DefaultIsolationLevel=READ COMMITTED; UID=" + user + "; PWD=" + password + ";");
The ODBC driver is not included with OpenEdge. The driver must be downloaded and installed separately!
Setup OpenEdge JDBC connection:
String connectionString = "jdbc:datadirect:openedge://localhost:" + portNumber + ";databaseName=" + databaseName + ";user=" + user + ";password=" + password;
String cname = "com.ddtek.jdbc.openedge.OpenEdgeDriver";
Class.forName(cname);
connection = DriverManager.getConnection(connectionString);
Include the driver in the classpath from: C:\Progress\OpenEdge\java\openedge.jar
In the management console at http://localhost:9090/fathom.htm, set the SQL Configuration Java classpath to: #{startup\dlc}\java\openedge.jar;#{startup\dlc}\java\util.jar
More information:
Try 'system' or 'sysprogress' for the user;
Try 'SYSTEM' or 'PUB' for the catalog or schema;
Some tools will ask you for an external catalog name of your own choosing, not one taken from Progress.
I found the solution:
What you need is a set of jar files that are provided with your specific installation of Progress OpenEdge. These files, which are located in a folder called "java", are not commonly available on the internet, and they must match the exact version that you are using. If necessary, contact your database provider. Use these files (you may not find all of them, depending on your version of Progress OpenEdge):
progress.jar
openedge.jar
util.jar
base.jar
pool.jar
spy.jar
My URL was wrong (it was still set to MySQL). Instead, use:
jdbc:datadirect:openedge://your-server-name:your-port;databaseName=your-db-name
As class name, use:
com.ddtek.jdbc.openedge.OpenEdgeDriver
I left schema and mapping file blank, and that worked. Good luck!
I've had a look around SO and couldn't find this particular issue.
So I have an external config.txt file which is used to obtain values that are stored in variables in the Python program. I have a variable in the Python script that stores the key:value pairs in dictionary form. (The idea is that it takes the config settings and performs a SQL Server query from the program.)
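For context, a minimal sketch of how such a config.txt could be parsed into the config dictionary; the "KEY: value" line format and the file name are assumptions, not something shown in the post:

# Hypothetical parsing of config.txt into a dict; the one-pair-per-line
# "KEY: value" format is an assumption, not taken from the question.
config = {}
with open('config.txt') as f:
    for line in f:
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blank lines and comments
        key, _, value = line.partition(':')
        config[key.strip()] = value.strip()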
My code looks like this (also showing the output of the print statements):
driver = config['DRIVER']
server = config['SERVER']
database = config['DATABASE']
trusted = config['Trusted_Connection']
print(driver) # = {ODBC Driver 17 for SQL Server};
print(server) # = server1;
print(database) # = db1;
print(trusted) # = yes
#1. working code
sql_conn = odbc.connect('DRIVER={ODBC Driver 17 for SQL Server}; SERVER=server1; DATABASE=db1; Trusted_Connection=yes')
#2. non working code
sql_conn = odbc.connect('\''DRIVER='+ str(driver) + ' SERVER=' + str(server) + ' DATABASE=' + str(database) + ' Trusted_Connection='+ str(trusted)+'\'')
When I run the first version, everything works as expected. However, when I try the second one I get:
pyodbc.InterfaceError: ('IM002', '[IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified (0) (SQLDriverConnect)')
Is this something to do with the conversion of the dict values to strings, or perhaps with pyodbc?
So I managed to find the fix, and it looks like it was an error in how the connection string was built:
incorrect:
odbc.connect('\''DRIVER='+ str(driver) + ' SERVER=' + str(server) + ' DATABASE=' + str(database) + ' Trusted_Connection='+ str(trusted)+'\'')
correct:
odbc.connect('DRIVER='+driver+';SERVER='+server+';DATABASE='+database+';Trusted_Connection='+trusted)
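As a follow-up, a slightly more readable way to build the same connection string from the config dictionary (just a sketch, assuming the same config keys shown above and that pyodbc is imported as odbc; the trailing semicolons already present in the config values are harmless to ODBC):

import pyodbc as odbc

# Assemble the connection string from the config dict instead of manual '+' concatenation.
conn_str = (
    f"DRIVER={config['DRIVER']};"
    f"SERVER={config['SERVER']};"
    f"DATABASE={config['DATABASE']};"
    f"Trusted_Connection={config['Trusted_Connection']}"
)
sql_conn = odbc.connect(conn_str)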
I am working off of a server housing various SQL databases (accessed via Microsoft SQL Server Management Studio) and am going to use R to perform analyses and explore a specific database within the server. I have network security that permits communication between machines, drivers installed on the R server, and RODBC installed.
When I attempt to establish a Windows ODBC connection in Control Panel > Administrative Tools > Data Sources, I can only add a data source for the entirety of the SQL server, not just for the specific database I want to look at. I pasted the code I have been experimenting with below.
library(RODBC)
channel <- odbcConnect("Example", uid="xxx", pwd="****");
sqlTables(channel)
sqlTables(channel, tableType = "TABLE")
res <- sqlFetch(channel, "samp.le", max = 15) # not recognized as a table
library(RODBC)
ch <- odbcDriverConnect('driver={"SQL Server"}; server=Example; database=dbasesample; uid="xxxx", pwd = "****"')
Response: Warning messages:
1: In odbcDriverConnect("driver={\"SQL Server\"}; server=sample; database=dbasesample; uid=\"xxxx", pwd = \"xxxx\"") :
[RODBC] ERROR: state IM002, code 0, message [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
2: In odbcDriverConnect("driver={\"SQL Server\"}; server=sample; database=dbasesample; uid=\"xxxx\", pwd = \"xxxx!\"") :
ODBC connection failed
Any insight into this issue would be much appreciated.
Although, when querying with the sqlQuery() function, you can specify the database, schema and table, e.g.
library(RODBC)
con = odbcConnect(dsn = 'local')
sample_query = sqlQuery(con,'select * from db.dbo.table')
I have not found a way to define the database from within the function parameters while using sqlFetch() or sqlSave(). An indirect way would be to define the default database in the dsn (as written in the comments). But then, you would need a different dsn for each database you would like to use.
A better solution would be to use the odbc and DBI packages instead of RODBC, and define the database in the connection statement e.g.
library(dplyr)
library(DBI)
library(odbc)
con <- dbConnect(odbc::odbc(), dsn = 'local', database = 'db')
copy_to(con, rr2, temporary = F)
By the way, I found copy_to to be much faster than the equivalent sqlSave of RODBC.
Working through a tutorial to pull database data with:
install.packages('RODBC')
require(RODBC)
myNewDB=odbcConnect("QV Training")
And I get the error:
In odbcDriverConnect("DSN=QV Training")
Data source name not found and no default driver specified
In odbcDriverConnect("DSN=QV Training") : ODBC connection failed
Is 'QV Training' meant to be the name of a database that may no longer be present?
How does R know where to look for the database anyway?
Thank you!
In Windows (unsure of other OSes) you need to go into the ODBC Data Source Administrator, and add the data source. The ODBC Data Source Administrator is accessed via the 'Administrative Tools' section of Control Panel (in Windows 10 at least).
The connection command is then simply
conn <- odbcConnect("QV Training")
library(RODBC)
con <- odbcConnect("Oracle", uid="system", pwd="root", rows_at_time = 500)
sqlQuery(con, "select file_name,sum(bytes)/1024/1024 AS MB from dba_data_files group by file_name")
d <- sqlQuery(con, "select * from dba_data_files")
close(con)
I migrated from PB7 to PB10.5 on a SQL Server DB. The system gives me this message:
"DBMS MSS Microsoft SQL Server 6.x is not supported in your current
installation"
I changed the database connection settings from:
Old connect used in PB7:
DBMS = MSS Microsoft SQL Server 6.x
Database = databaseName
ServerName = serverName
LogId = LogId
AutoCommit = 1
DBParm = ""
UserId = userid
DatabasePassword =
LogPassword=password
Lock=
Prompt=0
To this in PB10.5:
DBMS =SNC SQL Native Client(OLE DB)
Database =databaseName
ServerName =serverName
LogId =LogId
AutoCommit = 0
DBParm = "
Database='databaseName'
TrimSpaces=1"
UserId=userid
DatabasePassword=
LogPassword=password
Lock=
Prompt=0
The system runs without the previous error message, but when I retrieve any old stored Arabic data in a DataWindow it appears unreadable, like:
ÚãáíÇÊ ÇÎÑì
I can't believe this question got overlooked -- sorry about that. It is a common question when migrating from older versions of PowerBuilder to PowerBuilder version 10 and higher. Good news: it is very easy to fix, it just can be time consuming depending on how many places you need to fix.
I've already written a blog article on the subject, or just DuckDuckGo "migrating PowerBuilder Unicode issues".
Converting ANSI and Unicode Strings for PowerBuilder Migrations to Version 10 and Higher
Here is a summary of the conversion process:
Convert data to ANSI
Blob lbl_data
String ls_data
lbl_data = Blob("PowerBuilder is cool!", EncodingANSI!)
ls_data = String(lbl_data, EncodingANSI!)
Convert data read from a file to ANSI (the file name below is illustrative)
Blob lbl_data
String ls_data
Integer li_file
li_file = FileOpen("C:\legacy_data.txt", StreamMode!)
FileReadEx(li_file, lbl_data)
FileClose(li_file)
ls_data = String(lbl_data, EncodingANSI!)
I am new to PowerBuilder.
I want to retrieve the data from MS Access tables and update it to the corresponding SQL tables. I am not able to create a permanent DSN for MS Access because I have to select different MS Access files with the same table information. I can create a permanent DSN for SQL Server.
Please help me create the DSN dynamically when selecting the MS Access file and push all the tables' data to SQL using PowerBuilder.
Also, please give the full PowerBuilder code to solve the problem if it's possible.
In Access we strongly suggest not using DSNs at all, as it is one less thing for someone to have to configure and one less thing for the users to screw up (see "Using DSN-Less Connections"). You should see if PowerBuilder has a similar option.
Create the DSN manually in the ODBC administrator
Locate the entry in the registry
Export the registry syntax into a .reg file
Read and edit the .reg file dynamically in PB
Write it back to the registry using PB's RegistrySet ( key, valuename, valuetype, value )
Once you've got your DSN set up, there are many options to push data from one database to the other.
You'll need two transaction objects in PB, each pointing to its own database. Then, you could use a Data Pipeline object to manage the actual data transfer.
You want to do the DSN-less connection referenced by Tony. I show an example of doing it at PBDJ and have a code sample over at Sybase's CodeXchange.
I am using this code, try it!
//// Profile access databases accdb format
SQLCA.DBMS = "OLE DB"
SQLCA.AutoCommit = False
SQLCA.DBParm = "PROVIDER='Microsoft.ACE.OLEDB.12.0',DATASOURCE='C:\databasename.accdb',DelimitIdentifier='No',CommitOnDisconnect='No'"
Connect using SQLCA;
If SQLCA.SQLCode = 0 Then
Open ( w_rsre_frame )
else
MessageBox ("Cannot Connect to Database", SQLCA.SQLErrText )
End If
or
//// Profile access databases mdb format
transaction aTrx
long resu
string database
database = "C:\databasename.mdb"
aTrx = create transaction
aTrx.DBMS = "OLE DB"
aTrx.AutoCommit = True
aTrx.DBParm = "PROVIDER='Microsoft.Jet.OLEDB.4.0',DATASOURCE='"+database+"',PBMaxBlobSize=100000,StaticBind='No',PBNoCatalog='YES'"
connect using aTrx ;
if atrx.sqldbcode = 0 then
messagebox("","Connection success to database")
else
messagebox("Error code: "+string(atrx.sqlcode),atrx.sqlerrtext+ " DB Code Error: "+string(atrx.sqldbcode))
end if
// do stuff...
destroy atrx