These are the commands I am running:
bin/zookeeper-server-start etc/kafka/zookeeper.properties &
bin/kafka-server-start etc/kafka/server.properties &
bin/schema-registry-start etc/schema-registry/schema-registry.properties &
bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/quickstart-sqlserver.properties &
bin/kafka-avro-console-consumer --new-consumer --bootstrap-server localhost:9094 --topic test3-sqlserver-jdbc-ErrorLog --from-beginning
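One quick sanity check before starting the console consumer (assuming ZooKeeper on the default localhost:2181) is to confirm the topic exists at all:
bin/kafka-topics --list --zookeeper localhost:2181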
I am trying to connect to SQL Server using the Confluent Platform (Kafka Connect) and am facing the following issues:
When I connect to the default schema, i.e. dbo, the connection is established but no data is fetched into the Kafka consumer. The connection details I am using are:
name=test-sqlserver-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:sqlserver://********:1433;database=AdventureWorks2012;user=****;password=****
mode=incrementing
incrementing.column.name=ErrorLogID
topic.prefix=test3-sqlserver-jdbc-
table.whitelist=ErrorLog
schema.registry=dbo
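Whether the connector and its task actually started can be checked through the Connect REST interface (a quick check, assuming the default standalone REST port 8083; the connector name is the one from the properties above):
curl http://localhost:8083/connectors
curl http://localhost:8083/connectors/test-sqlserver-jdbc-autoincrement/status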
When I try to connect to any other schema, the connector throws an error. The connection details I am using are:
name=test-sqlserver-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:sqlserver://********:1433;database=AdventureWorks2012;user=****;password=****
mode=incrementing
incrementing.column.name=AddressID
topic.prefix=test3-sqlserver-jdbc-
table.whitelist=Address
schema.registry=Person
Error:
INFO Source task WorkerSourceTask{id=test-sqlserver-jdbc-autoincrement-0} finished
initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:138)
[2017-03-07 17:55:47,041] ERROR Failed to run query for table
TimestampIncrementingTableQuerier{name='Address', query='null',
topicPrefix='test3-sqlserver-jdbc-', timestampColumn='null',
incrementingColumn='AddressID'}:
com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Address'.
(io.confluent.connect.jdbc.JdbcSourceTask:239)
[2017-03-07 17:55:52,124] ERROR Failed to run query for table
TimestampIncrementingTableQuerier{name='Address', query='null',
topicPrefix='test3-sqlserver-jdbc-', timestampColumn='null',
incrementingColumn='AddressID'}: com.microsoft.sqlserver.jdbc.SQLServerException:
Invalid object name 'Address'. (io.confluent.connect.jdbc.JdbcSourceTask:239)
[2017-03-07 17:55:53,684] INFO Reflections took 9299 ms to scan
262 urls, producing 12112 keys and 79402 values
(org.reflections.Reflections:229)
[2017-03-07 17:55:57,181] ERROR Failed to run query for table
TimestampIncrementingTableQuerier{name='Address', query='null',
topicPrefix='test3-sqlserver-jdbc-', timestampColumn='null',
incrementingColumn='AddressID'}:
com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'Address'.
(io.confluent.connect.jdbc.JdbcSourceTask:239)
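For reference: an unqualified table name like 'Address' resolves to the default dbo schema on SQL Server, which matches the 'Invalid object name' error above, and as far as I can tell schema.registry is not an option the JDBC source connector reads. Below is a hedged sketch of a config that names the schema explicitly through the connector's query option (query and table.whitelist are mutually exclusive, and in query mode topic.prefix is used as the full topic name; the connector name below is made up):
name=test-sqlserver-jdbc-person-address
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:sqlserver://********:1433;database=AdventureWorks2012;user=****;password=****
mode=incrementing
incrementing.column.name=AddressID
query=SELECT * FROM Person.Address
topic.prefix=test3-sqlserver-jdbc-Address
Depending on the connector version, a schema.pattern option may also be available as an alternative to a custom query.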
Related
I get the following error in the database manager DBeaver 22.1.4 against an Informix database:
SQL-Error [IX000]: Connection not established
Strangely, only a SELECT on this particular table produces the error above; a SELECT on any other table returns a result.
The full error message is:
org.jkiss.dbeaver.model.impl.jdbc.JDBCException: SQL-Error [IX000]: Connection not established
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCConnectionImpl.prepareStatement(JDBCConnectionImpl.java:197)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCConnectionImpl.prepareStatement(JDBCConnectionImpl.java:1)
at org.jkiss.dbeaver.model.DBUtils.createStatement(DBUtils.java:1391)
at org.jkiss.dbeaver.model.DBUtils.makeStatement(DBUtils.java:1359)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeStatement(SQLQueryJob.java:553)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.lambda$1(SQLQueryJob.java:486)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:172)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.executeSingleQuery(SQLQueryJob.java:493)
at org.jkiss.dbeaver.ui.editors.sql.execute.SQLQueryJob.extractData(SQLQueryJob.java:894)
at org.jkiss.dbeaver.ui.editors.sql.SQLEditor$QueryResultsContainer.readData(SQLEditor.java:3643)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.lambda$0(ResultSetJobDataRead.java:118)
at org.jkiss.dbeaver.model.exec.DBExecUtils.tryExecuteRecover(DBExecUtils.java:172)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetJobDataRead.run(ResultSetJobDataRead.java:116)
at org.jkiss.dbeaver.ui.controls.resultset.ResultSetViewer$ResultSetDataPumpJob.run(ResultSetViewer.java:4945)
at org.jkiss.dbeaver.model.runtime.AbstractJob.run(AbstractJob.java:105)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:63)
Caused by: java.sql.SQLException: Connection not established
at com.informix.util.IfxErrMsg.buildExceptionWithMessage(IfxErrMsg.java:416)
at com.informix.util.IfxErrMsg.buildException(IfxErrMsg.java:397)
at com.informix.util.IfxErrMsg.getSQLException(IfxErrMsg.java:371)
at com.informix.jdbc.IfxStatement.<init>(IfxStatement.java:134)
at com.informix.jdbc.IfxPreparedStatement.<init>(IfxPreparedStatement.java:116)
at com.informix.jdbc.IfxSqliConnect.prepareStatement(IfxSqliConnect.java:5902)
at com.informix.jdbc.IfxSqliConnect.prepareStatement(IfxSqliConnect.java:2367)
at com.informix.jdbc.IfxSqliConnect.prepareStatement(IfxSqliConnect.java:101)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCConnectionImpl.prepareStatement(JDBCConnectionImpl.java:244)
at org.jkiss.dbeaver.model.impl.jdbc.exec.JDBCConnectionImpl.prepareStatement(JDBCConnectionImpl.java:146)
... 15 more
Does anybody have an idea?
The following code runs inside a Docker container. I have a connection specified as follows, which works: I can run a simple query over it.
from sqlalchemy import create_engine
engine = create_engine("mssql+pyodbc://username:pw@hostname?driver=driver_name")
con_xpf = engine.connect()
con_xpf.execute("use db_name;")
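For what it's worth, here is a minimal sketch of the same engine with the database and the ODBC driver spelled out in the URL, so the separate use statement is not needed (username, pw, hostname and db_name are the placeholders from above, and the driver name assumes ODBC Driver 17):
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://username:pw@hostname/db_name"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)
con_xpf = engine.connect()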
After that I create a sqlite3 DB and connect to it:
import sqlite3

DBNAME = "data/NEWDB.db"
con = sqlite3.connect(DBNAME)

# Note the comma after "EEEEEEEEE": without it Python concatenates the two strings
# and the list ends up with 8 entries against 9 chunk sizes.
tbllist = ["AAAAA","BBBBBBBB","CCCCCCCCC","DDDDDDDDD","EEEEEEEEE",
           "FFFFFFFFF","GGGGGGGGG","HHHHHH","IIIIIIIII"]
chunksizes = [500000,800000,500000,1000000,500000,500000,500000,900000,500000]
The problem starts here, when I run the following code to read from the first DB and write into the sqlite DB:
import math

import pandas as pd
from tqdm import tqdm

def loads_data_from_xpf(tbllist, chunksizes):
    for tbl, chunksize in zip(tbllist, chunksizes):
        cnt = 0
        maxcnt = math.ceil(pd.read_sql("SELECT COUNT(*) CNT FROM x.{}".format(tbl), con_xpf).CNT[0] / chunksize)
        with tqdm(range(maxcnt)) as pbar:
            for chunk in pd.read_sql("SELECT * FROM x.{} A".format(tbl), con_xpf, chunksize=chunksize):
                chunk.to_sql(tbl, con, if_exists="append")
                pbar.update()

loads_data_from_xpf(tbllist, chunksizes)
I got the following error:
OperationalError: (pyodbc.OperationalError) ('08S01', '[08S01] [Microsoft][ODBC Driver 17 for SQL Server]TCP Provider: Error code 0x71 (113) (SQLGetData)')
(Background on this error at: http://sqlalche.me/e/13/e3q8)
Could the problem be due to the size of the query? Do you see an error anywhere? I have been stuck for a while; any help would be appreciated.
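For reference, here is a hedged sketch of the same loop with a fresh SQL Server connection per table, so that a dropped connection (error code 0x71 / 113 suggests the TCP connection to the server going away mid-read during SQLGetData) only affects the table being copied. It assumes the database is part of the engine URL as in the sketch further up, and x is the schema placeholder used above:
import math
import sqlite3

import pandas as pd
from tqdm import tqdm

def copy_table(engine, con_sqlite, tbl, chunksize):
    # Count first so the progress bar knows how many chunks to expect.
    total = pd.read_sql("SELECT COUNT(*) AS CNT FROM x.{}".format(tbl), engine).CNT[0]
    maxcnt = math.ceil(total / chunksize)
    # One connection per table; it is closed even if the read fails part-way through.
    with engine.connect() as con_xpf, tqdm(total=maxcnt, desc=tbl) as pbar:
        for chunk in pd.read_sql("SELECT * FROM x.{}".format(tbl), con_xpf, chunksize=chunksize):
            chunk.to_sql(tbl, con_sqlite, if_exists="append", index=False)
            pbar.update()

# Usage, with the lists defined above:
for tbl, chunksize in zip(tbllist, chunksizes):
    copy_table(engine, con, tbl, chunksize)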
EDIT
I can now see the sqlite DB file updating, meaning data is being pushed into it. But I now have the following error, which comes from the Jupyter kernel. Help please:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/nbclient/client.py", line 841, in async_execute_cell
exec_reply = await self.task_poll_for_reply
concurrent.futures._base.CancelledError
During handling of the above exception, another exception occurred:
nbclient.exceptions.DeadKernelError: Kernel died
Has anybody actually got this to work, i.e. using a csv-file as a data source with PolyBase? I just get the error message below:
Msg 105082, Level 16, State 1, Line 12 (the rest is in the code block below).
I have the latest updates for SQL Server 2019 EE, the OS, ODBC and the Microsoft Access Text Driver. The user has the right credentials. Creating the external data source is no problem; it's when creating the external table that the error occurs. Can anybody see any obvious error?
OPEN MASTER KEY DECRYPTION BY PASSWORD = 'TOPSECRET_PSW';
GO
CREATE EXTERNAL DATA SOURCE TestCSV
WITH
(
LOCATION = 'odbc://localhost',
CONNECTION_OPTIONS = 'Driver=Microsoft Access Text Driver (*.txt, *.csv);Dbq=D:\APA\',
PUSHDOWN = OFF
);
CREATE EXTERNAL TABLE dbo.testCSV
(
Header1 nvarchar(128)
,Header2 nvarchar(128)
)
WITH
(
LOCATION='[testCSV.csv]',
DATA_SOURCE = TestCSV
)
Msg 105082, Level 16, State 1, Line 12
105082;Generic ODBC error: [Microsoft][ODBC Text Driver]General error Unable to open registry key Temporary (volatile) Ace DSN for process 0x1608 Thread 0x3450 DBC 0x8d76d578 Text'. Additional error <2>: ErrorMsg: [Microsoft][ODBC Text Driver]Invalid connection string attribute SERVER, SqlState: 01S00, NativeError: 8 Additional error <3>: ErrorMsg: [Microsoft][ODBC Text Driver]Invalid connection string attribute SERVER, SqlState: 01S00, NativeError: 8 Additional error <4>: ErrorMsg: [Microsoft][ODBC Text Driver]Invalid connection string attribute SERVER, SqlState: 01S00, NativeError: 8 Additional error <5>: ErrorMsg: [Microsoft][ODBC Text Driver]Invalid connection string attribute SERVER, SqlState: 01S00, NativeError: 8 .
You need to apply the latest SQL Server 2019 CU to get rid of that message. However, even once that error disappears, I couldn't make it work on Docker; are you using a physical machine?
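For reference, the patch level an instance is actually running can be checked before and after applying a CU with the standard server properties:
SELECT SERVERPROPERTY('ProductVersion')     AS ProductVersion,
       SERVERPROPERTY('ProductUpdateLevel') AS ProductUpdateLevel;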
I'm trying to import some data from different SqlServer databases using ExecuteSQL in NiFi, but it returns an error. I've already imported a lot of other tables from MySQL databases without any problem, and I'm trying to use the same workflow structure for the SqlServer DBs.
The structure is as follows:
There's a .txt file with the list of tables to be imported
This file is fetched, split and updated, so there's a FlowFile for each table of each DB that has to be imported
These FlowFiles are passed to ExecuteSQL, which executes their contents
For example:
file.txt
table1
table2
table3
is turned into 3 different FlowFiles:
FlowFile1
SELECT * FROM table1
FlowFile2
SELECT * FROM table2
FlowFile3
SELECT * FROM table3
which are passed to ExecuteSQL.
Here is the configuration of ExecuteSQL (identical for the SqlServer tables and the MySQL ones):
ExecuteSQL
As the only difference from the MySQL imports is in the connectors, this is how a generic MySQL connector has been configured:
SETTINGS / PROPERTIES
Database Connection URL: jdbc:mysql://00.00.00.00/DataBase?zeroDateTimeBehavior=convertToNull&autoReconnect=true
Database Driver Class Name: com.mysql.jdbc.Driver
Database Driver Location(s): file:///path/mysql-connector-java-5.1.47-bin.jar
Database User: user
Password: Sensitive value set
Max Wait Time: 500 millis
Max Total Connections: 8
Validation query: No value set
And this is how a SqlServer connector has been configured:
SETTINGS / PROPERTIES
Database Connection URL: jdbc:jtds:sqlserver://00.00.00.00/DataBase;useNTLMv2=true;integratedSecurity=true;
Database Driver Class Name: net.sourceforge.jtds.jdbc.Driver
Database Driver Location(s): /path/connectors/jtds-1.3.1.jar
Database User: user
Password: Sensitive value set
Max Wait Time: -1
Max Total Connections: 8
Validation query: No value set
Note that one (and only one!) SqlServer connector works, and the ExecuteSQL processor imports its data without any problem. The even stranger thing is that the database reached through this connector is located in the same place as two others (the connection URL and user/password are identical), yet only the first one works.
Note that I've also tried appending ?zeroDateTimeBehavior=convertToNull&autoReconnect=true to the SqlServer connections, assuming it was a date-type problem, but it made no difference.
Here is the error that is being returned:
12:02:46 CEST ERROR f1553b83-a173-1c0f-93cb-1c32f0f46d1d
00.00.00.00:0000 ExecuteSQL[id=****] ExecuteSQL[id=****] failed to process session due to null; Processor Administratively Yielded for 1 sec: java.lang.AbstractMethodError
Error retrieved from logs:
ERROR [Timer-Driven Process Thread-49] o.a.nifi.processors.standard.ExecuteSQL ExecuteSQL[id=****] ExecuteSQL[id=****] failed to process session due to java.lang.AbstractMethodError; Processor Administratively Yielded for 1 sec: java.lang.AbstractMethodError
java.lang.AbstractMethodError: null
at net.sourceforge.jtds.jdbc.JtdsConnection.isValid(JtdsConnection.java:2833)
at org.apache.commons.dbcp2.DelegatingConnection.isValid(DelegatingConnection.java:874)
at org.apache.commons.dbcp2.PoolableConnection.validate(PoolableConnection.java:270)
at org.apache.commons.dbcp2.PoolableConnectionFactory.validateConnection(PoolableConnectionFactory.java:389)
at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2398)
at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2381)
at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2110)
at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1563)
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:305)
at org.apache.nifi.dbcp.DBCPService.getConnection(DBCPService.java:49)
at sun.reflect.GeneratedMethodAccessor1696.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:84)
at com.sun.proxy.$Proxy449.getConnection(Unknown Source)
at org.apache.nifi.processors.standard.AbstractExecuteSQL.onTrigger(AbstractExecuteSQL.java:195)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
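For reference: the stack trace shows commons-dbcp2 calling Connection.isValid() on the jTDS connection, which is what the pool falls back to when no validation query is configured, and JtdsConnection.isValid is where the AbstractMethodError is raised. A hedged workaround sketch is to give the SqlServer DBCPConnectionPool an explicit validation query, so the pool validates with a real statement instead of isValid(); whether that is the whole story here is an assumption:
Validation query: SELECT 1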
I have an SSIS 2008 R2 package that is failing with a foreign key violation. I'm trying to trap the error that appears in the debug window:
"The INSERT statement conflicted with the FOREIGN KEY constraint
"xxxx". The conflict occurred in database "xxxxxx", table "xxxxx",
column 'xxxx'."
I've tried enabling SSIS logging and also creating an event handler for the "OnError" event, but the logs only provide the generic SSIS message:
"SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method
on component "xxxxx Destination" (1458) failed with error code
0xC020844B while processing input "ADO NET Destination Input" (1461).
The identified component returned an error from the Process Input
method. The error is specific to the component, but the error is fatal
and will cause the Data Flow task to stop running. There may be error
messages posted before this with more information about the failure."
I would like the exact error to be logged (i.e. that this is a foreign key violation) instead of the DTS error. How can this be done?
Try to use an OLE DB Destination instead.
And just for the record, if you have a Script Task in the SSIS package you can log the error using the Dts object:
try
{
// your code
}
catch (Exception ex)
{
//An error occurred.
Dts.Events.FireError(0, "Script Task Example", ex.Message + "\r" + ex.StackTrace, String.Empty, 0);
Dts.TaskResult = (int)ScriptResults.Failure;
}
MSDN - Logging in the Script Task