Server properties in loops in Capistrano (capistrano3)

I am new to Capistrano.
I need to get the server properties in tasks using a loop. I am using this code:
server 'IP_address', user: 'root', password: 'pass', roles: %w{web}, database: 'production1'
server 'IP_address', user: 'root', password: 'pass', roles: %w{web}, database: 'production2'
task :backup_FilesDatabaseServerfiles do
  on roles(:web) do |h|
    puts h.database
  end
end
How can I fetch database options in the above task?

This should do it.
task :backup_FilesDatabaseServerfiles do
  on roles(:web) do |server|
    p server.properties.database
  end
end
Per Capistrano 3: use server custom variable in task
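For example, the property can then feed whatever command the task actually runs. A minimal sketch, assuming the same custom database property from the question and a placeholder echo standing in for the real backup command:

task :backup_FilesDatabaseServerfiles do
  on roles(:web) do |server|
    db = server.properties.database
    # Placeholder only; substitute your real backup invocation here.
    execute :echo, "backing up #{db}"
  end
end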

Related

Role 'DBT_DEV_ROLE' specified in the connect string does not exist or not authorized

I am following this tutorial: https://quickstarts.snowflake.com/guide/data_engineering_with_dbt/#1
I ran this in my worksheet, selecting the securityadmin role and then the sysadmin role:
-------------------------------------------
-- dbt credentials
-------------------------------------------
USE ROLE securityadmin;
-- dbt roles
CREATE OR REPLACE ROLE dbt_dev_role;
CREATE OR REPLACE ROLE dbt_prod_role;
------------------------------------------- Please replace with your dbt user password
CREATE OR REPLACE USER dbt_user PASSWORD = "<mysecretpassword>";
GRANT ROLE dbt_dev_role,dbt_prod_role TO USER dbt_user;
GRANT ROLE dbt_dev_role,dbt_prod_role TO ROLE sysadmin;
-------------------------------------------
-- dbt objects
-------------------------------------------
USE ROLE sysadmin;
CREATE OR REPLACE WAREHOUSE dbt_dev_wh WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 1 INITIALLY_SUSPENDED = TRUE;
CREATE OR REPLACE WAREHOUSE dbt_dev_heavy_wh WITH WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 1 INITIALLY_SUSPENDED = TRUE;
CREATE OR REPLACE WAREHOUSE dbt_prod_wh WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 1 INITIALLY_SUSPENDED = TRUE;
CREATE OR REPLACE WAREHOUSE dbt_prod_heavy_wh WITH WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 1 INITIALLY_SUSPENDED = TRUE;
GRANT ALL ON WAREHOUSE dbt_dev_wh TO ROLE dbt_dev_role;
GRANT ALL ON WAREHOUSE dbt_dev_heavy_wh TO ROLE dbt_dev_role;
GRANT ALL ON WAREHOUSE dbt_prod_wh TO ROLE dbt_prod_role;
GRANT ALL ON WAREHOUSE dbt_prod_heavy_wh TO ROLE dbt_prod_role;
CREATE OR REPLACE DATABASE dbt_hol_dev;
CREATE OR REPLACE DATABASE dbt_hol_prod;
GRANT ALL ON DATABASE dbt_hol_dev TO ROLE dbt_dev_role;
GRANT ALL ON DATABASE dbt_hol_prod TO ROLE dbt_prod_role;
GRANT ALL ON ALL SCHEMAS IN DATABASE dbt_hol_dev TO ROLE dbt_dev_role;
GRANT ALL ON ALL SCHEMAS IN DATABASE dbt_hol_prod TO ROLE dbt_prod_role;
I have this in my profiles.yml file:
dbt_hol:
  target: dev
  outputs:
    dev:
      type: snowflake
      ######## Please replace with your Snowflake account name
      account: xyz.eu-central-1
      user: TEST
      ######## Please replace with your Snowflake dbt user password
      password: password
      role: dbt_dev_role
      database: dbt_hol_dev
      warehouse: dbt_dev_wh
      schema: public
      threads: 200
    prod:
      type: snowflake
      ######## Please replace with your Snowflake account name
      account: xyz.eu-central-1
      user: TEST
      ######## Please replace with your Snowflake dbt user password
      password: password
      role: dbt_prod_role
      database: dbt_hol_prod
      warehouse: dbt_prod_wh
      schema: public
      threads: 200
Although I am following the tutorial, when I run dbt debug I get the following error:
Connection:
account: xyz.eu-central-1
user: TEST
database: dbt_hol_dev
schema: public
warehouse: dbt_dev_wh
role: dbt_dev_role
client_session_keep_alive: False
Connection test: ERROR
dbt was unable to connect to the specified database.
The database returned the following error:
>Database Error
250001 (08001): Failed to connect to DB: xyz.eu-central-1.snowflakecomputing.com:443. Role 'DBT_DEV_ROLE' specified in the connect string does not exist or not authorized. Contact your local system administrator, or attempt to login with another role, e.g. PUBLIC.
What could I be doing wrong?
As far as I can see, you are trying to connect as the user TEST:
Connection:
account: xyz.eu-central-1
user: TEST
database: dbt_hol_dev
schema: public
warehouse: dbt_dev_wh
role: dbt_dev_role
client_session_keep_alive: False
Connection test: ERROR
On the other hand, you granted dbt_dev_role only to the user dbt_user and to the sysadmin role:
GRANT ROLE dbt_dev_role,dbt_prod_role TO USER dbt_user;
GRANT ROLE dbt_dev_role,dbt_prod_role TO ROLE sysadmin;
You need to grant the role to the user TEST.
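A minimal sketch of that grant, assuming TEST is the user from your profiles.yml (run it under securityadmin, as in the tutorial script):

USE ROLE securityadmin;
GRANT ROLE dbt_dev_role TO USER TEST;
GRANT ROLE dbt_prod_role TO USER TEST;
-- Verify which roles the user can now assume
SHOW GRANTS TO USER TEST;

After that, dbt debug should be able to assume dbt_dev_role for the dev target.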

2 SQL Server Databases in Django Project

I have a problem loading data into database1 (default). The system is only supposed to load data that already exists in database2 (source). It works on my professor's machine, but there it runs in Docker with two different ports; I have SQL Server installed directly. The system starts, but when I want to load a record into database1 it tells me that the record does not exist in database2, so it is not loaded. Conversely, if I try to load a record that is not in database2, it loads correctly. I searched for how to change the SQL Server ports but could not get it to work. Can anyone help me?
DATABASES = {
    'default': {
        'ENGINE': 'sql_server.pyodbc',
        'NAME': 'database1',
        'HOST': 'name\\name',
        'PORT': '',
        'USER': 'user1',
        'PASSWORD': 'password1',
        'OPTIONS': {
            'driver': 'ODBC Driver 13 for SQL Server',
        },
    },
    'source': {
        'ENGINE': 'sql_server.pyodbc',
        'NAME': 'database2',
        'HOST': 'name\\name',
        'PORT': '',
        'USER': 'user2',
        'PASSWORD': 'password2',
        'OPTIONS': {
            'driver': 'ODBC Driver 13 for SQL Server',
        },
    },
}
And this is the database router:
def decide_on_model(model):
    """Small helper function to pipe all DB operations of a worlddata model to the world_data DB"""
    return 'source' if model._meta.app_label == 'source' else None


class TutoriasRouter:
    """
    Implements a database router so that:
    * Django related data - DB alias `default` - MySQL DB `world_django`
    * Legacy "world" database data (everything "non-Django") - DB alias `world_data` - MySQL DB `world`
    """
    def db_for_read(self, model, **hints):
        return decide_on_model(model)

    # def db_for_write(self, model, **hints):
    #     return decide_on_model(model)

    def db_for_write(self, model, **hints):
        return 'default'

    def allow_relation(self, obj1, obj2, **hints):
        # Allow any relation if both models are part of the worlddata app
        # if obj1._meta.app_label == 'source' and obj2._meta.app_label == 'source':
        #     return True
        # # Allow if neither is part of worlddata app
        # elif 'source' not in [obj1._meta.app_label, obj2._meta.app_label]:
        #     return True
        # # by default return None - "undecided"
        return True

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # allow migrations on the "default" (django related data) DB
        if db == 'default' and app_label != 'source':
            return True
        # allow migrations on the legacy database too:
        # this will enable to actually alter the database schema of the legacy DB!
        # if db == 'source' and app_label == "source":
        #     return True
        return False
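For reference, this is what the router as written does. A sketch for a Django shell, where Producto is a hypothetical model registered under the source app:

# Hypothetical model; any model whose app_label is 'source' is affected.
from source.models import Producto

obj = Producto.objects.first()   # db_for_read -> 'source' (database2)
obj.save()                       # db_for_write -> 'default' (database1)

# Reading from the default database has to be requested explicitly:
exists_in_default = Producto.objects.using('default').filter(pk=obj.pk).exists()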

PostgreSQL "Database does not exist" but it does

I have been trying to create a database in PostgreSQL for several days and have run into several issues but seem to be stuck.
I have manually created a database in PostgreSQL called postgres_development because bundle exec rake db:create wasn't working.
Now, I am trying to run bundle exec rake db:migrate but it is not recognizing that I have a database called postgres_development.
This is my Rakefile.
require 'rake'
require 'rspec/core/rake_task'
require 'active_support'
require 'active_support/core_ext'
require_relative 'config'

namespace :db do
  desc "Drop, create, and migrate the database"
  task :reset => [:drop, :create, :migrate]

  desc "Create #{APP_NAME} databases"
  task "create" do
    puts "Creating #{APP_NAME} development and test databases if they don't exist..."
    system("#SET PGPASSWORD=#{DB_PASSWORD}; createdb --username=#{DB_USERNAME} --password=#{DB_PASSWORD} #{DB_NAME} && #SET PGPASSWORD=#{DB_PASSWORD}; createdb --username=#{DB_USERNAME} --password=#{DB_PASSWORD} #{TEST_DB_NAME}")
  end

  desc "Drop #{APP_NAME} databases"
  task "drop" do
    puts "Dropping #{APP_NAME} development and test databases..."
    system("dropdb #{DB_NAME} && dropdb #{TEST_DB_NAME}_test")
  end

  desc "Migrate the database"
  task "migrate" do
    ActiveRecord::Migrator.migrations_paths << File.dirname(__FILE__) + 'db/migrate'
    ActiveRecord::Migration.verbose = true
    ActiveRecord::MigrationContext.new("/db/migrate/").migrate
  end

  desc "Populate the database with sample data"
  task "seed" do
    require APP_ROOT.join('db', 'seeds.rb')
  end
end

namespace :generate do
  desc "Create a database migration\n rake generate:migration NAME=create_people"
  task :migration do
    unless ENV.has_key?('NAME')
      raise "Must specify NAME for migration, e.g. rake generate:migration NAME=create_people"
    end
    migration_name = ENV['NAME']
    class_name = migration_name.camelize
    timestamp = Time.now.strftime('%Y%m%d%H%M%S')
    filename = "#{timestamp}_#{migration_name}.rb"
    path = APP_ROOT.join('db', 'migrate', filename)
    if File.exist?(path)
      raise "ERROR! File '#{path}' already exists"
    end
    puts "Creating migration at #{path}"
    File.open(path, 'w+') do |f|
      f.write("class #{class_name} < ActiveRecord::Migration\n\tdef change\n\n\tend\nend")
    end
  end
end

desc 'Start IRB with application environment loaded'
task "console" do
  exec "irb -r./config"
end

desc "Run the specs"
RSpec::Core::RakeTask.new(:spec)

task :default => :specs

# Will this not work?
#desc "Run the specs"
#task 'specs' do
#  exec "rspec spec"
#end
And this is my config.rb in the same folder, postgres, which I renamed from activerecord-template in a failed attempt to get it to connect to my database.
require 'pathname'
require 'pg'
require 'active_record'
require 'logger'

## Load all files and configure the db
APP_ROOT = Pathname.new(File.expand_path(File.dirname(__FILE__)))
APP_NAME = APP_ROOT.basename.to_s

DB_PATH = APP_ROOT.join('db', APP_NAME + "_development.db").to_s
DB_NAME = APP_NAME + "_development.db"
TEST_DB_NAME = APP_NAME + "_test.db"
DB_USERNAME = 'postgres'
DB_PASSWORD = '****'

if ENV['DEBUG']
  ActiveRecord::Base.logger = Logger.new(STDOUT)
end

Dir[APP_ROOT.join('models', '*.rb')].each do |model_file|
  filename = File.basename(model_file).gsub('.rb', '')
  autoload ActiveSupport::Inflector.camelize(filename), model_file
end

ActiveRecord::Base.establish_connection :adapter => 'postgresql',
                                        :database => DB_NAME,
                                        :host => 'localhost',
                                        :username => DB_USERNAME,
                                        :password => DB_PASSWORD
Any thoughts on what is going on here would be greatly appreciated!
That seems obvious: You created a database postgres_development, then you try to connect to a database with a different name, namely postgres_development.db.
How is that supposed to work?
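A minimal sketch of the fix under that diagnosis: drop the .db suffix in config.rb so the name passed to createdb and establish_connection matches the database you actually created:

# config.rb -- the ".db" suffix is a leftover SQLite-style name;
# PostgreSQL databases are plain names, e.g. postgres_development.
DB_NAME = APP_NAME + "_development"
TEST_DB_NAME = APP_NAME + "_test"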
In your Rakefile, in the migrate task, add a . in front of the /, i.e. change it to ActiveRecord::MigrationContext.new("./db/migrate/").migrate.
In db/migrate/20181108113458_create_people.rb, add [4.2] at the end of the first line, i.e. change it to class CreatePeople < ActiveRecord::Migration[4.2].

SQL Connection Error with Spring Boot

I am getting the error "SQL state [null]; error code [0]; The connection is closed.; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: The connection is closed." when using springBootVersion = '1.5.4.RELEASE', JDBC driver version 6, and SQL Server 2008 R2 with database mirroring. My connection settings:
spring:
  datasource:
    initialize: false
    username: user1
    password: pass1
    type: org.apache.tomcat.jdbc.pool.DataSource
    connection-properties: statementPoolingCacheSize=200;failoverPartner=Server2;applicationName=${spring.application.name}
    url: jdbc:sqlserver://Server1;databaseName=DBName
    tomcat:
      initial-size: 10
      max-active: 50
      max-idle: 10
      max-wait: 1000
      validation-query: "SELECT 1"
Can you please help me in determining why this error is occurring?
The Tomcat connection pool did not work with failover, so HikariCP (com.zaxxer.hikari.HikariDataSource) was used instead:
spring:
  datasource:
    initialize: false
    type: com.zaxxer.hikari.HikariDataSource
    username: user1
    password: pass1
    jdbcUrl: jdbc:sqlserver://server1;databaseName=DB1;statementPoolingCacheSize=200;failoverPartner=server2;applicationName=${spring.application.name}
    dataSourceClassname: com.microsoft.sqlserver.jdbc.SQLServerDataSource
    hikari:
      pool-name: TimeCommerce
      connection-timeout: 1000
      leak-detection-threshold: 5000
      maximum-pool-size: 40
      max-lifetime: 60000
      minimum-idle: 20
      connection-test-query: SELECT 1
      test-on-borrow: true
      test-on-connect: true
      test-on-return: true

Cannot connect to Redshift database with a driver even though play.api.db.DB can do this for the same driver

I am trying to connect to a Redshift server and run some SQL commands. Here is the code I have written:
import java.sql.{Connection, DriverManager, ResultSet}

Class.forName("org.postgresql.Driver")
val url: String = s"jdbc:postgres://${user}:${password}#${host}:${port}/${database}"
val connection: Connection = DriverManager.getConnection(url, user, password)
val statement = connection.createStatement(ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)
val setSearchPathQuery: String = s"set search_path to '${schema}';"
statement.execute(setSearchPathQuery)
But I am getting the following error:
java.sql.SQLException: No suitable driver found for jdbc:postgres://user:password#host:port/database
But when I use the Play framework's default database library with the same configuration, I am able to connect to the database successfully. Below is the configuration for the default database:
db.default.driver=org.postgresql.Driver
db.default.url="postgres://username:password#hostname:port/database"
db.default.host="hostname"
db.default.port="port"
db.default.dbname = "database"
db.default.user = "username"
db.default.password = "password"
The problem was with the URL. The correct format is:
jdbc:postgresql://hostname:port/database
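Applied to the snippet above (same variable names as in the question; just a sketch), the prefix is jdbc:postgresql and the credentials are passed to getConnection rather than embedded in the URL:

import java.sql.{Connection, DriverManager}

// user, password, host, port and database are the same values used in the question.
val url: String = s"jdbc:postgresql://${host}:${port}/${database}"
val connection: Connection = DriverManager.getConnection(url, user, password)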
