How to configure autocommit to true in the Flyway command line? (Snowflake issue)

We are having issues with autocommit being set to false in Flyway.
We need to set autocommit to true.
This is what we see in the Snowflake query activity: alter session /* JDBC:SnowflakeConnectionV1.setAutoCommit*/ set autocommit=false
I did not find any reference in the Flyway documentation discussing how to do this with the Flyway command line tool.
I only found this topic, but it uses the Flyway Java API, not the Flyway CLI: https://github.com/flyway/flyway/issues/1534

You should be able to set autocommit=true in the JDBC connection string. Keep in mind that whenever you run a DDL command (creating an object), this automatically commits your transaction regardless of this setting. See this section of the Snowflake documentation for more info.
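For example, a minimal sketch of the JDBC URL in flyway.conf (account, database, and warehouse names are placeholders, and whether your Snowflake JDBC driver version accepts an autocommit parameter in the URL is an assumption you should verify against the driver documentation):
# flyway.conf (sketch only; placeholders in angle brackets)
flyway.url=jdbc:snowflake://<account>.snowflakecomputing.com/?db=<database>&warehouse=<warehouse>&autocommit=true
The same URL can also be passed on the command line with -url=... when invoking the Flyway CLI.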

Related

Liquibase diffChangeLog does not report stored procedures

I am new to Liquibase. I am using Liquibase version 4.0 on Windows 10 Home edition:
Starting Liquibase at 23:08:42 (version 4.0.0 #19 built at 2020-07-13 19:45+0000)
Liquibase Version: 4.0.0
Liquibase Community 4.0.0 by Datical
I added one table and a stored procedure (SP) to DATABASEONE. I also generated a diff between the two databases and it worked fine:
liquibase diffChangeLog
liquibase.properties
driver=com.microsoft.sqlserver.jdbc.SQLServerDriver
classpath=../mssql-jdbc-8.4.0.jre11.jar
url=jdbc:sqlserver://localhost;databaseName=DATABASETWO
username=sa
password=Password123
referenceUrl=jdbc:sqlserver://localhost;databaseName=DATABASEONE
referenceUsername=sa
referencePassword=Password123
changeLogFile=diff.xml
However, Liquibase does not report differences in stored procedures.
How can I get newly added (or missing) stored procedure information in diffChangeLog?
EDIT
I used a Pro license key (14-day trial) to generate the diff for stored procedures.
I also generated liquibase updateSQL > update.sql. Now I want to run update.sql against another database. How can I do that?
Running an update against the update.sql generated by liquibase updateSQL > update.sql will include extra output such as debug info, not just straight SQL. Instead, you will want to:
use generateChangeLog to produce a changelog file (let's call it change.postgresql.xml)
run liquibase update using the changelog file produced above and the URL of the destination DB, for example:
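A rough sketch of the two commands (URLs, credentials, and the changelog file name are placeholders for whatever you already use):
liquibase --changeLogFile=change.postgresql.xml --url=jdbc:postgresql://localhost/SOURCEDB --username=<user> --password=<pass> generateChangeLog
liquibase --changeLogFile=change.postgresql.xml --url=jdbc:postgresql://localhost/TARGETDB --username=<user> --password=<pass> update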
Note: if they are different DB platforms, say postgres -> mysql, then you will have to review the generated changelog (in this case change.postgresql.xml) for DB objects that are not supported by the target DB (in this example mysql).
Unfortunately, you can't diff stored functions / procedures with Liquibase Open Source, according to the diff-changelog documentation:
Additional Functionality with Liquibase Pro
While Liquibase Open Source stores all changesets in a changelog, Liquibase Pro creates a directory called Objects and places the directory at the same level as your changelog. The Objects directory contains a subdirectory for each of the following stored logic types:
checkconstraint
package
packagebody
procedure
function
trigger
synonyms

Liquibase creates log tables with user schema

When implementing Liquibase on SQL Server, the tables below are created in the USER schema, not the DBO schema. I'm not sure how to fix this.
AMAR.DATABASECHANGELOG
AMAR.DATABASECHANGELOGLOCK
It should be
DBO.DATABASECHANGELOG
DBO.DATABASECHANGELOGLOCK
I don't have any issues creating tables; they are always created in the dbo schema. I'm using integratedSecurity, and the Liquibase version is 3.4.2.
You can specify
--liquibaseSchemaName=dbo
as a parameter when running Liquibase from the command line.
I prefer to put that into my liquibase.properties file.
This works for me using Liquibase 3.5.3 - don't know about 3.4.2 though
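For example (a sketch; combine it with the URL, driver, and credentials you already use):
liquibase --liquibaseSchemaName=dbo --changeLogFile=changelog.xml update
or, equivalently, in liquibase.properties:
liquibaseSchemaName=dbo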
Have a look at the changelogSchemaName parameter (http://www.liquibase.org/documentation/maven/generated/updateSQL-mojo.html) to specify the schema where Liquibase stores its tables.

How to resolve Liquibase changeSet error?

I use Liquibase in my Spring/Hibernate web application. I added a new changeset to update my database. When I try to restart my web application, Liquibase wants to drop the database. But in my case I want to update my DB schema without dropping the existing schema. How can I do this?
Thanks in advance
Lakshmi Priya.K
There is a "dropFirst" property on the Spring implementation of Liquibase:
http://www.liquibase.org/documentation/spring.html
This might be your issue.
Are you sure liquibase is dropping the schema and not just the objects within the schema?
Update
I found more details about the "dropFirst" property (see CORE-964). It is indeed performing a dropAll liquibase action.
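If dropFirst is the culprit, a minimal sketch of the Spring bean definition with it explicitly disabled (the bean id, data source reference, and changelog path are placeholders) would be:
<bean id="liquibase" class="liquibase.integration.spring.SpringLiquibase">
    <!-- dataSource ref and changeLog path are placeholders for your own configuration -->
    <property name="dataSource" ref="dataSource"/>
    <property name="changeLog" value="classpath:db/changelog/db.changelog-master.xml"/>
    <property name="dropFirst" value="false"/>
</bean>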
Avoiding unused @Autowired fields in the controllers and specifying the Liquibase configuration in root-context.xml without any spaces solved the issue.
Liquibase now works fine without dropping the database.
Thanks
Lakshmi Priya.K

DB2: How to backup a DB2 database?

DB2 v10.1 database on WINDOWS 7.
Can somebody share how to create a backup of a DB2 database? I could not find detailed instructions.
Thanks in advance for any help in this matter
Have you tried looking at the documentation? Perhaps the "Data Recovery Reference"?
http://pic.dhe.ibm.com/infocenter/db2luw/v10r1/topic/com.ibm.db2.luw.admin.ha.doc/doc/c0006150.html
In a db2cmd window, type DB2 HELP BACKUP for more complete command syntax. The simplest form of the command is
DB2 BACKUP DATABASE <database name>
Optim Studio in 9.7 and 10.1 and Control Center in 9.7 have GUI's to assist with these tasks as well.
For a local backup you can use a simple command, also provided in the other answers:
db2 backup database <name>
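For example, backing up to a specific directory (the database name and path are placeholders):
db2 backup database MYDB to D:\db2backups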
If you want a more automated, "enterprise" solution, you should look into IBM Tivoli Storage Manager (TSM), for example. DB2 supports making backups to networked TSM storage on the fly, with incremental backups, without disrupting the local database; i.e., you can run queries while the backup is running.
For TSM you need log archiving enabled on the database; you can do that with the following command:
db2 update db cfg using LOGARCHMETH1 TSM
After you have enabled log archiving you can create a backup script and schedule it:
set DB2INSTANCE=DB2
"C:\IBM\ProductName\db2\BIN\db2cmd.exe" /c DB2.EXE backup db WPSDB user <DOMAINUSERNAME> using <DOMAINUSERPASSWORD> online use tsm include logs
Here's a link to a full tutorial: http://www.codeyouneed.com/db2-backup-using-tsm/
For detailed step by step guide to configure DB2 backup, you can refer:
DB2 v9.7 on AIX(x64) backup configuration for TSM v7.1
Every step, from planning and preparation to execution, is explained with diagrams.
Basic steps are:
Download the appropriate TSM API (32/64-bit, based on db2level) from Passport Advantage
Extract TSMCLI_AIX.tar
Log in as root and enter "SMITTY INSTALL"
Select required components:
tivoli.tsm.client.ba.64bit,
tivoli.tsm.client.api.64bit etc.
If you are not using the TSM client GUI, there is no need to install
Tivoli.tsm.client.jbb.64bit
Tivoli.tsm.filepath
Now apply the steps mentioned in the link above as an example to configure file-level and DB2-level backups for your environment.

PostgreSQL how to see which queries have run

I have a PostgreSQL DB at my computer and I have an application that runs queries on it.
How can I see which queries have run on my DB?
I use a Linux computer and pgAdmin.
Turn on the server log:
log_statement = all
This will log every call to the database server.
I would not use log_statement = all on a production server; it produces huge log files.
The manual about logging-parameters:
log_statement (enum)
Controls which SQL statements are logged. Valid values are none (off), ddl, mod, and all (all statements). [...]
Resetting the log_statement parameter requires a server reload (SIGHUP). A restart is not necessary. Read the manual on how to set parameters.
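A minimal sketch of the two steps (the postgresql.conf location depends on your installation):
# in postgresql.conf
log_statement = 'all'
Then reload the configuration, for example from psql:
SELECT pg_reload_conf();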
Don't confuse the server log with pgAdmin's log. Two different things!
You can also look at the server log files in pgAdmin, if you have access to the files (which may not be the case with a remote server) and have set it up correctly. In pgAdmin III, have a look at: Tools -> Server status. That option was removed in pgAdmin 4.
I prefer to read the server log files with vim (or any editor / reader of your choice).
PostgreSQL is very advanced when it comes to logging techniques.
Logs are stored in the Installationfolder/data/pg_log folder, while log settings are placed in the postgresql.conf file.
The log format is usually set to stderr, but the CSV log format is recommended. To enable CSV format, change:
log_destination = 'stderr,csvlog'
logging_collector = on
To log all queries, which is very useful for new installations, set the minimum execution time for a query to 0:
log_min_duration_statement = 0
To view the active queries on your database, use:
SELECT * FROM pg_stat_activity
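For example, to show only the sessions that are currently executing a statement (a small variation on the query above):
SELECT pid, state, query_start, query
FROM pg_stat_activity
WHERE state <> 'idle';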
To log specific statement types, set:
log_statement = 'all' # none, ddl, mod, all
For more information on logging queries, see the PostgreSQL documentation on logging.
I found the log file at /usr/local/var/log/postgres.log on a Mac installation from Homebrew.
While using Django with postgres 10.6, logging was enabled by default, and I was able to simply do:
tail -f /var/log/postgresql/*
Ubuntu 18.04, django 2+, python3+
You can also look in the pg_log folder if log collection is enabled in postgresql.conf with this log directory name.
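For reference, the settings that control this in postgresql.conf look like the following ('pg_log' is just the directory name used in the answer above; newer PostgreSQL versions default to 'log'):
logging_collector = on
log_directory = 'pg_log'    # relative to the data directory unless an absolute path is given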
