Random failures in Npgsql with SSL

Windows Server 2012 R2, Npgsql 3.1.8, PostgreSQL 9.6.3, client and server on the same system (connecting via 'localhost'). The client is a single-threaded PowerShell script calling a C# DLL. The database runs as a service.
During database initialization (it's the installer of my product) I send a large batch of CREATE TABLE statements and the like (~200 tables) in one large transaction. The client complains that the far side closed the connection, and the PostgreSQL log says
ERROR: invalid string in message
FATAL: invalid frontend message type 233
or
ERROR: invalid byte sequence for encoding "UTF8": 0x86
FATAL: invalid frontend message type 104
or
ERROR: invalid byte sequence for encoding "UTF8": 0xe5 0x5e 0xaa
FATAL: invalid frontend message type 198
or
ERROR: invalid byte sequence for encoding "UTF8": 0xeb 0xa6 0x02
FATAL: invalid frontend message type 162
The problem reproduces 100% in a given configuration, but the behavior changes when I vary things. The most annoying part is that raising the server log level to maximum makes the bug go away (adding some logging changes the failure). With statement tracing on I can still get a failure, and I can see it happens in the middle of the long batch of CREATE TABLE statements, but exactly where in the batch it fails varies. Other things that change the behavior include giving the server (a VM) more memory and using a faster disk for the database.
The client sets up the connection with pooling on and SSL enabled, and uses the default value of UseSslStream (false, I think).
EDIT:
Here is a more detailed trace with the error in context:
LOG: 00000: execute <unnamed>: create table locks44_ ("_rowkey" text NOT NULL, "acquired" timestamptz, "expires" timestamptz, "ownerrole" Text, "ownertx" Text, "name" Text, "uniquevalue" Text, "_partitionkey" text NOT NULL, primary key (_partitionkey,_rowkey))
LOCATION: exec_execute_message, postgres.c:1952
LOG: 00000: execute <unnamed>: create table locks45_ ("_rowkey" text NOT NULL, "acquired" timestamptz, "expires" timestamptz, "ownerrole" Text, "ownertx" Text, "name" Text, "uniquevalue" Text, "_partitionkey" text NOT NULL, primary key (_partitionkey,_rowkey))
LOCATION: exec_execute_message, postgres.c:1952
LOG: 00000: execute <unnamed>: create table locks46_ ("_rowkey" text NOT NULL, "acquired" timestamptz, "expires" timestamptz, "ownerrole" Text, "ownertx" Text, "name" Text, "uniquevalue" Text, "_partitionkey" text NOT NULL, primary key (_partitionkey,_rowkey))
LOCATION: exec_execute_message, postgres.c:1952
LOG: 00000: execute <unnamed>: create table locks47_ ("_rowkey" text NOT NULL, "acquired" timestamptz, "expires" timestamptz, "ownerrole" Text, "ownertx" Text, "name" Text, "uniquevalue" Text, "_partitionkey" text NOT NULL, primary key (_partitionkey,_rowkey))
LOCATION: exec_execute_message, postgres.c:1952
LOG: 00000: execute <unnamed>: create table locks48_ ("_rowkey" text NOT NULL, "acquired" timestamptz, "expires" timestamptz, "ownerrole" Text, "ownertx" Text, "name" Text, "uniquevalue" Text, "_partitionkey" text NOT NULL, primary key (_partitionkey,_rowkey))
LOCATION: exec_execute_message, postgres.c:1952
LOG: 00000: execute <unnamed>: create table locks49_ ("_rowkey" text NOT NULL, "acquired" timestamptz, "expires" timestamptz, "ownerrole" Text, "ownertx" Text, "name" Text, "uniquevalue" Text, "_partitionkey" text NOT NULL, primary key (_partitionkey,_rowkey))
LOCATION: exec_execute_message, postgres.c:1952
LOG: 00000: execute <unnamed>: create table locks50_ ("_rowkey" text NOT NULL, "acquired" timestamptz, "expires" timestamptz, "ownerrole" Text, "ownertx" Text, "name" Text, "uniquevalue" Text, "_partitionkey" text NOT NULL, primary key (_partitionkey,_rowkey))
LOCATION: exec_execute_message, postgres.c:1952
ERROR: 08P01: invalid string in message
LOCATION: pq_getmsgstring, pqformat.c:637
FATAL: 08P01: invalid frontend message type 97
EDIT2: this seems to be a known bug that is worked around by setting UseSslStream = true, and it appears to already be fixed upstream: https://github.com/npgsql/npgsql/issues/1362

It seems to be fixed upstream: https://github.com/npgsql/npgsql/issues/1362
Alternatively, set UseSslStream = true (this fixed it for me).
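For reference, a minimal client-side sketch of the workaround, assuming Npgsql 3.1 connection-string keywords; the host, database and credentials below are placeholders for the installer's real settings:
using Npgsql;

// Only "Use SSL Stream=true" is new here: it makes Npgsql use .NET's SslStream
// instead of its bundled TLS implementation. Everything else is a placeholder.
var connString = "Host=localhost;Database=mydb;Username=myuser;Password=mypassword;" +
                 "SSL Mode=Require;Pooling=true;Use SSL Stream=true";
using (var conn = new NpgsqlConnection(connString))
{
    conn.Open();
    // run the CREATE TABLE batch in one transaction, as before
}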

Related

External table error: ORA-29913, ORA-29400, KUP-04040

I created an external table, and when I select from it I get the error below.
I am working with Oracle 19c.
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file customer.csv in EXTERNAL not found
------------code----------------
CREATE TABLE customers
(Email VARCHAR2(255) NOT NULL,
Name VARCHAR2(255) NOT NULL,
Phone VARCHAR2(255) NOT NULL,
Address VARCHAR2(255) NOT NULL)
ORGANIZATION EXTERNAL(
type oracle_loader
DEFAULT DIRECTORY external
ACCESS PARAMETERS
(
records delimited by newline
fields terminated by ','
missing field values are null
REJECT ROWS WITH ALL NULL FIELDS)
LOCATION ('customer.csv'))
REJECT LIMIT UNLIMITED;
customer.csv data
salma.55#gmm.com,salma,0152275522,44al,
mariam.66#hotmail.com,mariam,011145528,552www,
ahmed.85#gmail.com,ahmed,0111552774,44eee,
"DEFAULT DIRECTORY external" means you are looking in a named directory that you have called "external".
For example, if I had done:
create directory XYZ as '/tmp';
then
default directory XYZ
means I'll be searching in /tmp for my files. So look at DBA_DIRECTORIES to see where your "EXTERNAL" directory is pointing.
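For example, something like this; the path and the grantee are placeholders, and you need the CREATE ANY DIRECTORY privilege (or a DBA) to run the second statement:
-- where does the EXTERNAL directory currently point? (use ALL_DIRECTORIES if you lack the DBA view)
select directory_name, directory_path from dba_directories where directory_name = 'EXTERNAL';
-- repoint it at the folder that actually contains customer.csv (path is an example)
create or replace directory external as '/home/oracle/data';
grant read, write on directory external to your_user;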

BCP Error Invalid field size for datatype With Native Method

I keep getting the "Invalid field size for datatype" error trying to BCP in several tables. This is despite the fact that I'm using Native (Unicode) mode and I am absolutely certain that the table definitions match because the target is created from a DACPAC of the source right before I attempt the import. Some tables work, but most fail with the error. I've tried using format files as well, and that didn't help either. I've read all the documentation and all the BCP-related questions I could find on SO. Any ideas?
Here's my export command. The $query_array is just a dictionary of "Select * From $tbl" mixed with "Select * From $tbl Where some date filter":
&bcp $query_array[$tbl] QueryOut "Data\$tbl.dat" -N -S $SqlAddress -d $SqlDB -T;
And here's the import:
&bcp $tgtTable in "Data\$tbl.dat" -N -S $SqlAddress -d $SqlDB -E -T -ErrorAction Stop;
One of the tables that fails looks like this:
CREATE TABLE [Foo].[Bar](
[Blah1] [numeric](38, 0) NOT NULL,
[Blah2] [numeric](38, 0) NULL,
[Blah3] [numeric](38, 0) NULL,
[Blah4] [numeric](38, 0) NULL,
[Blah5] [datetime2](7) NULL,
[Blah6] [varchar](30) NULL,
[Blah7] [datetime2](7) NULL,
CONSTRAINT [PK_Foo] PRIMARY KEY CLUSTERED
(
[Blah1] ASC
)
)
The overall procedure is pretty simple:
Generate DACPAC from source
Export each table using BCP in Unicode Native to a dat file
Create target DB from DACPAC
Import each table using BCP in Unicode Native format from dat file
This is driving me crazy. It should just work. I'm guessing that there's a problem with the datetime2 fields, but I can't imagine what it is. TIA.
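For completeness, the format-file variant I tried looks roughly like this; the .fmt path is just an example name, generated per table with bcp's format option and then passed to the import in place of -N:
&bcp $tgtTable format nul -N -f "Data\$tbl.fmt" -S $SqlAddress -d $SqlDB -T;
&bcp $tgtTable in "Data\$tbl.dat" -f "Data\$tbl.fmt" -S $SqlAddress -d $SqlDB -E -T;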

liquibase.exception.DatabaseException: ERROR: relation "container" already exists

I'm working on an application with liquibase, spring-boot and hibernate.
The database used is PostgreSQL. In order to populate the DB at startup I configured a data.sql file in src/main/resources, containing some insert statements.
In addition, at startup Liquibase also tries to apply all the changesets, one of which creates the table that data.sql populates. So I get the following non-blocking error when running the mvn command to start the app:
2018-04-25 14:33:53.417 ERROR 11060 --- [neut-Executor-1] liquibase : classpath:config/liquibase/master.xml: config/liquibase/changelog/20180424154826_added_entity_Container.xml::20180424154826-1::jhipster: Change Set config/liquibase/changelog/20180424154826_added_entity_Container.xml::20180424154826-1::jhipster failed. Error: ERROR: relation "container" already exists [Failed SQL: CREATE TABLE public.container (id BIGINT NOT NULL, name VARCHAR(255) NOT NULL, description VARCHAR(2000), container_type VARCHAR(255), created TIMESTAMP WITHOUT TIME ZONE, CONSTRAINT PK_CONTAINER PRIMARY KEY (id))]
2018-04-25 14:33:53.453 ERROR 11060 --- [neut-Executor-1] i.g.j.c.liquibase.AsyncSpringLiquibase : Liquibase could not start correctly, your database is NOT ready: Migration failed for change set config/liquibase/changelog/20180424154826_added_entity_Container.xml::20180424154826-1::jhipster:
Reason: liquibase.exception.DatabaseException: ERROR: relation "container" already exists [Failed SQL: CREATE TABLE public.container (id BIGINT NOT NULL, name VARCHAR(255) NOT NULL, description VARCHAR(2000), container_type VARCHAR(255), created TIMESTAMP WITHOUT TIME ZONE, CONSTRAINT PK_CONTAINER PRIMARY KEY (id))]
liquibase.exception.MigrationFailedException: Migration failed for change set config/liquibase/changelog/20180424154826_added_entity_Container.xml::20180424154826-1::jhipster:
Reason: liquibase.exception.DatabaseException: ERROR: relation "container" already exists [Failed SQL: CREATE TABLE public.container (id BIGINT NOT NULL, name VARCHAR(255) NOT NULL, description VARCHAR(2000), container_type VARCHAR(255), created TIMESTAMP WITHOUT TIME ZONE, CONSTRAINT PK_CONTAINER PRIMARY KEY (id))]
at liquibase.changelog.ChangeSet.execute(ChangeSet.java:619)
at liquibase.changelog.visitor.UpdateVisitor.visit(UpdateVisitor.java:51)
at liquibase.changelog.ChangeLogIterator.run(ChangeLogIterator.java:79)
at liquibase.Liquibase.update(Liquibase.java:214)
at liquibase.Liquibase.update(Liquibase.java:192)
at liquibase.integration.spring.SpringLiquibase.performUpdate(SpringLiquibase.java:431)
at liquibase.integration.spring.SpringLiquibase.afterPropertiesSet(SpringLiquibase.java:388)
at io.github.jhipster.config.liquibase.AsyncSpringLiquibase.initDb(AsyncSpringLiquibase.java:94)
at io.github.jhipster.config.liquibase.AsyncSpringLiquibase.lambda$afterPropertiesSet$0(AsyncSpringLiquibase.java:77)
at io.github.jhipster.async.ExceptionHandlingAsyncTaskExecutor.lambda$createWrappedRunnable$1(ExceptionHandlingAsyncTaskExecutor.java:68)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: liquibase.exception.DatabaseException: ERROR: relation "container" already exists [Failed SQL: CREATE TABLE public.container (id BIGINT NOT NULL, name VARCHAR(255) NOT NULL, description VARCHAR(2000), container_type VARCHAR(255), created TIMESTAMP WITHOUT TIME ZONE, CONSTRAINT PK_CONTAINER PRIMARY KEY (id))]
at liquibase.executor.jvm.JdbcExecutor$ExecuteStatementCallback.doInStatement(JdbcExecutor.java:309)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:55)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:113)
at liquibase.database.AbstractJdbcDatabase.execute(AbstractJdbcDatabase.java:1277)
at liquibase.database.AbstractJdbcDatabase.executeStatements(AbstractJdbcDatabase.java:1259)
at liquibase.changelog.ChangeSet.execute(ChangeSet.java:582)
... 12 common frames omitted
Caused by: org.postgresql.util.PSQLException: ERROR: relation "container" already exists
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2455)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2155)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:288)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:430)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:356)
at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:303)
at org.postgresql.jdbc.PgStatement.executeCachedSql(PgStatement.java:289)
at org.postgresql.jdbc.PgStatement.executeWithFlags(PgStatement.java:266)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:262)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.jdbc.pool.StatementFacade$StatementProxy.invoke(StatementFacade.java:114)
at com.sun.proxy.$Proxy129.execute(Unknown Source)
at liquibase.executor.jvm.JdbcExecutor$ExecuteStatementCallback.doInStatement(JdbcExecutor.java:307)
... 17 common frames omitted
What I'm thinking is that Hibernate executes data.sql just a moment before the Liquibase migration runs, so maybe there is a way to avoid this exception. Is it possible to force Hibernate's import of data.sql to happen after the Liquibase migration?
You have to decide which mechanism you want to use to execute the changes: either use Hibernate to create your tables and insert data, or use Liquibase to create your tables and data, but not both at the same time. What I did on previous projects was set Hibernate to validate-only mode (spring.jpa.hibernate.ddl-auto=validate) and use Liquibase to create the tables and insert the data.
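A minimal application.properties sketch along those lines; property names are for Spring Boot 2.x (Boot 1.x uses liquibase.enabled and spring.datasource.initialize=false instead), and the inserts from data.sql would then move into a Liquibase changeset:
# Liquibase owns schema creation and seed data
spring.liquibase.enabled=true
# Hibernate only validates the schema, it no longer creates or populates anything
spring.jpa.hibernate.ddl-auto=validate
# stop Spring Boot from running data.sql at startup
spring.datasource.initialization-mode=never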

Cannot import data file into SQL Server using BCP. How can I make this work?

I have been trying for hours to load a data file into SQL Server using BCP but can't make it work.
This is the command I am running: bcp EDW.stg.STG_ACCOUNT in .\dm_account000 -T -S 111.1.1.111,1111 -t "," -c and all I got was this error:
Starting copy...
SQLState = 22005, NativeError = 0
Error = [Microsoft][ODBC Driver 11 for SQL Server]Invalid character value for cast specification
I already checked the date conversion problem that is mentioned on many websites but nothing seems to be wrong with my last three fields in the data file. Here is my data file and at the footer you can see the encoding, etc.
And here is the spec of my target table:
CREATE TABLE [stg].[STG_ACCOUNT](
[CD_ACCOUNT] [varchar](20) NULL,
[CD_CNPJ] [varchar](30) NULL,
[NA_ACCOUNT] [varchar](255) NULL,
[NA_OWNER] [varchar](100) NULL,
[NA_OWNER_SHARED] [varchar](100) NULL,
[NA_OWNER_GROWTH_TEAM] [varchar](100) NULL,
[DS_INDUSTRY] [varchar](50) NULL,
[DS_ACCOUNT_STATUS] [varchar](30) NULL,
[NA_ACCOUNT_CITY] [varchar](50) NULL,
[AB_ACCOUNT_STATE] [char](30) NULL,
[DS_REGION] [varchar](50) NULL,
[IN_CLASSIFICATION_PARTNER] [int] NULL,
[IN_CLASSIFICATION_PARTNER_N] [int] NULL,
[DS_COMPETITOR_PARTNER] [char](3) NULL,
[DS_HEADQUARTER_BRANCH] [varchar](30) NULL,
[DS_CHAIN_FRANCHISE] [varchar](30) NULL,
[DS_ECONOMIC_GROUP] [varchar](255) NULL,
[DS_WATCH_LIST] [varchar](50) NULL,
[IN_STRATEGIC] [int] NULL,
[DT_CREATED] [date] NULL,
[DT_LAST_ACTIVITY] [date] NULL,
[DT_LAST_MODIFIED] [date] NULL
)
Can anybody help me with this issue? I don't really know what else I can try. Thanks in advance.
Try adding -C ACP to specify the code page.
From MSDN...
-C { ACP | OEM | RAW | code_page } Specifies the code page of the data in the data file. code_page is relevant only if the data contains char, varchar, or text columns with character values greater than 127 or less than 32.
Note: We recommend specifying a collation name for each column in a format file, except when you want the 65001 option to have priority over the collation/code page specification.
Code page values:
ACP: ANSI/Microsoft Windows (ISO 1252).
OEM: Default code page used by the client. This is the default code page used if -C is not specified.
RAW: No conversion from one code page to another occurs. This is the fastest option because no conversion occurs.
code_page: Specific code page number; for example, 850.
Versions prior to version 13 (SQL Server 2016) do not support code page 65001 (UTF-8 encoding). Versions beginning with 13 can import UTF-8 encoding to earlier versions of SQL Server.
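Applied to the command in the question, that would be (same server, file and delimiter as posted):
bcp EDW.stg.STG_ACCOUNT in .\dm_account000 -T -S 111.1.1.111,1111 -t "," -c -C ACP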

How to store special characters in sqlite

I have text and blob types for two columns.
Whenever I use special chars like ;, /, =, etc., SQLite raises an OperationalError.
I am using the Python sqlite3 API.
Why is this happening and how do I fix it?
edit1
this is the table:
create table mycontent(
id integer primary key autoincrement,
subject text not null,
body text not null
);
Below is the typical error I get when, for example, there is a ; character in the subject field:
sqlite3.OperationalError: near ";": syntax error
However, when I try to store simple values like "1" in all fields, it works.
edit2
cursor.execute("""
INSERT INTO mycontent
(subject, body)
VALUES (%s, %s);
""" % (kwargs["subject"], kwargs["body"])
)
cursor.execute("""
INSERT INTO mycontent
(subject, body)
VALUES (?, ?);
""", (kwargs["subject"], kwargs["body"])
)
It's the ? placeholders: use parameter markers in the SQL and pass the values as a separate tuple so sqlite3 binds them. With the % formatting above, the values are spliced directly into the SQL text, so a ; in the subject ends the statement early and causes the syntax error.
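A self-contained sketch of the parameterized version; it uses an in-memory database just for illustration, with the table definition from edit1:
import sqlite3

conn = sqlite3.connect(":memory:")  # use your real database file here
conn.execute("""
    create table mycontent(
        id integer primary key autoincrement,
        subject text not null,
        body text not null
    )""")
# ? placeholders are bound by sqlite3, so ;, /, = and quotes in the values are harmless
conn.execute("INSERT INTO mycontent (subject, body) VALUES (?, ?)",
             ("subject; with / special = chars", "body text"))
conn.commit()
print(conn.execute("SELECT subject, body FROM mycontent").fetchall())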
