I have a PostgreSQL database with some Unicode tables.
I have already set up replication successfully, but the original Unicode data cannot be sent to SQL Server.
I have already tried:
mssql.use.ntypes.for.sync=true
but I still get an error.
Could someone help me?
Thanks
Well, after making the following changes, it works.
Change the connection settings in the properties file to:
sendStringParametersAsUnicode=true
mssql.use.ntypes.for.sync=true
But I can see that it puts noticeable pressure on server performance.
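For reference, a minimal sketch of how the two settings from above could sit together in the engine properties file, assuming a SymmetricDS-style setup with the Microsoft JDBC driver (host and database name are placeholders):
db.url=jdbc:sqlserver://<host>:1433;databaseName=<database>;sendStringParametersAsUnicode=true
mssql.use.ntypes.for.sync=true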
Platform: Google Data Studio
Data Source: MySQL
The connection was working before, so there are no issues with the credentials.
All of a sudden, I started getting the error below:
All the IPs on Google Data Studio's list have been whitelisted.
The only thing that comes to mind is a limit on how much data GDS can process.
The data source table has around 200K rows.
I'm not sure what the limit is for GDS with MySQL; there's no indication anywhere.
Any help solving this, or any additional info, would be appreciated.
Thanks
If you use a firewall, be sure to double-check the Google IP addresses. They may have added new IPs (in my case, the last one was missing).
Check them here!
After doing so, I had to change the host name of the database connection to a URL alias (www.yourserver.com, i.e. a URL pointing to your server), and then change it back to the IP to make it work.
Sounds like the connector cannot establish a new connection.
Cloud SQL Connector:
At the time of writing, the connector seems unable to establish a new connection once the existing one has timed out, and modifying the JDBC URL to include query parameters gives you an error when authenticating.
This is probably due to the connector appending its own parameters.
(There seems to be a bug here when a connection no longer exists.)
MySQL Connector (with IP Address):
This connector allows you to add query parameters to the JDBC URL. Enable SSL and append useSSL=true to the URL,
e.g. jdbc:mysql://<ip>/<database>?useSSL=true
This worked as expected and establishes new connections when required.
Example Source Setup
I'm suffering from this issue too; my experience is that using the MySQL connector instead of the Cloud SQL connector provides better stability, in combination with setting wait_timeout to a value above 12 hours.
This issue has been reported on the official Google Data Studio bug tracker. Please vote the reports up if you are also affected!
Issue 130205306: MySQL connection does not exist (Apr 9, 2019)
Issue 118470083: Data source password not stored for MySQL sources (Oct 26, 2018)
I am having some issues with database backups.
My database is in simple recovery mode and a database backup runs every night. The backup job sometimes fails, throwing the error below.
ERROR:
The operating system returned the error '112(failed to retrieve text for this error. Reason: 15105)' while attempting 'SetEndOfFile' on '\backups\sqlbackups\filename'.
Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
As for "problems with the query", "property not set correctly", or "parameters not set correctly": this job has been running for the past 2 years.
I am still unsure why this happens sometimes.
If anyone has hit the same issue and figured out the likely reason, please share.
Server info: SQL Server 2008 R2, Standard
Database info: simple recovery mode, acting as a publisher, about 1.4 TB in size
Thanks in advance
It seems you don't have enough space at the destination. Operating system error 112 means "There is not enough space on the disk", so the backup drive is most likely full. Make sure there is enough free space on the drive and try again. If you use a third-party tool to back up your databases, set its auto-delete option to remove old backups.
So, I'm creating a multi-tier DataSnap TCP/IP database server in Delphi XE7. On the server I used an MSSQL database and exposed it through a DataSetProvider.
On the client side I made an SQLConnection to connect to the DataSnap server, then a DSProviderConnection, and then a ClientDataSet.
When I run the client application I add several rows and post the modifications, but if I deactivate the ClientDataSet and then reactivate it, the data is lost.
Please help me; I can't see what the problem is, and there is no error or anything like that.
and thank you
Either don't deactivate the ClientDataSet (the data will get lost as you found) or save the data to a file or stream and reload it afterwards (SaveToFile/Stream, LoadFromFile/Stream).
I am working on an iOS project that has a Sybase UltraLite database, synchronized with a Sybase SQL Anywhere 12 database using MobiLink.
Everything was working properly until today, when I decided to add some fields so that they synchronize with the main database.
I updated the schema of the consolidated database from the main engine, then updated the schema of the remote database from the consolidated engine, mapped the added fields together, and deployed a new UltraLite database.
Please note that this is not the first time I have done a similar task; I regularly add fields and sync the databases.
After the update, when I synchronize using the blank UltraLite database, MobiLink fails, giving only this error: Synchronization Failed: -1305 (MOBILINK_COMMUNICATIONS_ERROR) %1:201 %2: %3:0
I have researched error number 201 in Sybase and it points to SQLE_NOT_PUBLIC_ID,
and the Sybase documentation gives the error's probable cause as:
"The option specified in the SET OPTION statement is PUBLIC only. You cannot define this option for any other user."
I have tried redeploying, and I have tried moving the engine to a Windows PC; all attempts give the same error. I have no clue where this SET OPTION statement came from or how to solve it.
Any hints are appreciated!
The problem was simply caused by a small network timeout value in the MobiLink stream parameters.
info.stream_parms = (char *)"host=192.168.0.100;port=3309;timeout=1";
I just changed the value from timeout=1 to timeout=300 and it worked!
Please, I need help from someone, possibly BalusC, or anyone out there.
I have been trying to upload picture files larger than 390 KB into a varbinary(max) column in Microsoft SQL Server 2008, but I can't; instead I get the exception below.
java.sql.SQLException: [Microsoft][ODBC SQL Server Driver][SQL Server]The text, ntext, or image pointer value conflicts with the column name specified
I am using the PrimeFaces FileUpload component to get the image.
The call I am using to send the file to the database looks like this:
pstmnt.setBinaryStream(2, uploadedFile.getInputStream(), uploadedFile.getSize());
When I upload files equal to or below 390 KB, the upload succeeds.
I have been researching how to solve this, and that research led me to updating my SQL Server driver to sqljdbc4 and jTDS.
I also tried enabling FILESTREAM on my database and on the column that holds the picture, but it still does not work. Please, I need help; I have been stuck on this for almost three weeks now. Any kind of help will be appreciated. Thank you all.
It's now working. It was actually my JDBC driver. I changed it to jTDS as Pedrag Maric advised, but it wasn't working at first because I hadn't configured it properly for my app. Simply put, I wasn't passing the jTDS URL to DriverManager.getConnection("jdbc:jtds:sqlserver://..."). Even after adding the JAR file to my classpath, I was still using DriverManager.getConnection("jdbc:odbc:databasename", "sa", "****"). I was so happy when it worked.
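For anyone hitting the same wall, a minimal sketch of that combination (register the jTDS driver, connect with a jdbc:jtds:sqlserver URL, and stream the file with setBinaryStream) could look roughly like this; the host, database, credentials, table and column names are just placeholders, not values from the original app:
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ImageUpload {
    public static void storeImage(String name, InputStream imageStream, int imageSize) throws Exception {
        // jTDS driver class; the jtds JAR must be on the classpath
        Class.forName("net.sourceforge.jtds.jdbc.Driver");
        try (Connection con = DriverManager.getConnection(
                "jdbc:jtds:sqlserver://localhost:1433/mydb", "sa", "****");
             PreparedStatement pstmnt = con.prepareStatement(
                "INSERT INTO pictures (name, image) VALUES (?, ?)")) {
            pstmnt.setString(1, name);
            // Streams the whole file into the varbinary(max) column
            pstmnt.setBinaryStream(2, imageStream, imageSize);
            pstmnt.executeUpdate();
        }
    }
}
Called with uploadedFile.getInputStream() and (int) uploadedFile.getSize() from the PrimeFaces upload, this is essentially the same setBinaryStream call as in the question, just going through jTDS instead of the JDBC-ODBC bridge.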