Using Prometheus to Monitor PostgreSQL

I am using Prometheus to monitor a PostgreSQL database via postgres_exporter installed on the DB server.
I used this tutorial, https://computingforgeeks.com/monitor-postgresql-with-prometheus-grafana/, to install and configure postgres_exporter, and it is working fine.
As part of the configuration, I have to create a postgres_exporter.env file with the following contents:
DATA_SOURCE_NAME="postgresql://postgres:password@localhost:5432/?sslmode=disable"
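For context, the exporter picks this value up through the systemd unit created in that tutorial, which loads the file with an EnvironmentFile= line. A minimal sketch of such a unit follows; the paths, unit name, and user are illustrative assumptions, not the tutorial's exact values:
# /etc/systemd/system/postgres_exporter.service -- sketch; paths and user are assumptions
[Unit]
Description=Prometheus PostgreSQL exporter
After=network.target

[Service]
User=postgres
EnvironmentFile=/opt/postgres_exporter/postgres_exporter.env
ExecStart=/opt/postgres_exporter/postgres_exporter
Restart=always

[Install]
WantedBy=multi-user.target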
Our security team is concerned about the clear-text password in the DB connection string set in the DATA_SOURCE_NAME variable in postgres_exporter.env.
Is there any way to encrypt the DB password in this configuration file, please?
Thanks!

Related

Automating import of data-tier application (SQL database) from Azure with a Master Key

When I extracted a data-tier application from a Microsoft Azure SQL database that has a Master Key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However, the steps provided in the answer did not work on my installation.
The steps are:
1. Disable auditing on the server (or database)
2. Drop the database master key with the DROP MASTER KEY command.
Microsoft Tech Support verified that this solution did not work on my installation of SQL Server; after taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the Master Key from the bacpac file. I have a PowerShell script that removes the Master Key from the bacpac file, but it requires extracting and renaming files and running scripts from Windows PowerShell to get the DB imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum, so please do not be harsh with this post. I am trying to do the best I can to help others and save them the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows PowerShell script (also cobbled together from multiple sources) to extract a data-tier application (database) from a Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code in my scripts on other blogs and the like; I am not able to give those folks the credit they are due because I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to credit you for your work.
There may be configuration settings on your PC and your local SQL server that need adjustments as this entire solution requires pretty much full access to your computer. If you run into trouble with compatibility, let me know and I will do the best I can to let you know how my system is configured in case it will help you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The PowerShell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are as follows (steps 1 and 2 are optional, as I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB; a rough T-SQL sketch of steps 1-3 appears after the list):
1. Back up the current DB as MyLocalDB.bak.
2. Restore that backup from step 1 to a new DB with the previous day stamped at the end of the DB name (e.g., MyLocalDB20171231).
3. Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on).
4. Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name)
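For readers who just want the shape of steps 1 to 3, here is a rough T-SQL sketch. The database names come from the list above, but the backup path and the logical file names are assumptions and will differ on your system.
-- Sketch of steps 1-3 only; backup path and logical file names are assumptions
BACKUP DATABASE MyLocalDB
    TO DISK = 'C:\Backups\MyLocalDB.bak'
    WITH INIT;

RESTORE DATABASE MyLocalDB20171231
    FROM DISK = 'C:\Backups\MyLocalDB.bak'
    WITH MOVE 'MyLocalDB'     TO 'C:\Data\MyLocalDB20171231.mdf',
         MOVE 'MyLocalDB_log' TO 'C:\Data\MyLocalDB20171231_log.ldf';

DROP DATABASE MyLocalDB;  -- frees the original name for the copy restored from Azure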
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC MASTER..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
The PowerShell script does the following:
The target for the extract is a file named today.bacpac (hardcoded), so the first step is to delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This needs to be removed from the files before the bacpac file is imported into your local DB, or the import will fail (this may not be required in SQL Server 2017, according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone; it won't remove anything that isn't there, and it only adds a little overhead to the program.
Open the today.bacpac file (a zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash length. This is required so that the file does not appear to have been tampered with when SQL Server opens the bacpac file.
Re-zip the files into a new file, today-patched.bacpac (a rough sketch of this zip manipulation appears after the SqlPackage command below).
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa; Password=MySAPassword; Initial Catalog=MyLocalDB; Integrated Security=false;"
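For anyone curious what that zip manipulation looks like before grabbing the repo, here is a rough PowerShell sketch of just that step. This is not the script from GitHub: the working-folder path and the exact XML edit are assumptions based on the steps above, and it leaves out the SHA fix-up in Model.xml.
# Sketch only -- not GetAzureDB-forGitHub.ps1; paths and XML details are assumptions
Add-Type -AssemblyName System.IO.Compression.FileSystem

$bacpac  = 'C:\Git\GetUpdatedAzureDB\today.bacpac'
$patched = 'C:\Git\GetUpdatedAzureDB\today-patched.bacpac'
$work    = 'C:\Git\GetUpdatedAzureDB\bacpac-work'

# A bacpac is an ordinary zip archive, so unpack it to a working folder
if (Test-Path $work)    { Remove-Item $work -Recurse -Force }
if (Test-Path $patched) { Remove-Item $patched -Force }
[IO.Compression.ZipFile]::ExtractToDirectory($bacpac, $work)

# Remove the MasterKey node (shown here for Origin.xml, as described above)
$originPath = Join-Path $work 'Origin.xml'
[xml]$origin = Get-Content $originPath -Raw
$node = $origin.SelectSingleNode('//*[local-name()="MasterKey"]')
if ($node) { $node.ParentNode.RemoveChild($node) | Out-Null }
$origin.Save($originPath)

# Re-zip the edited contents into today-patched.bacpac
[IO.Compression.ZipFile]::CreateFromDirectory($work, $patched)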
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!

Knime with database

How do I add a new database driver through the KNIME preferences? Generally:
File -> Preferences -> Add File / Add Directory
The only files accepted are *.jar or *.zip.
MY QUESTION
I have installed ODBC64 on my PC. Now I need to add that file in the KNIME preferences and use the driver in the Database Connector node.
How do I add and use the file in KNIME?
Also, what is meant by the database URL jdbc:mysql://host:port/database_name? What are the host and port?
Can anyone please briefly explain and help me out?
I'm assuming from your database URL of jdbc:mysql:// that you want to connect to a MySQL database. Based on that, there is a thread on the KNIME forum which answers pretty much all of your question, but the process is the same for any other sort of database. The steps are as follows:
Download the JDBC driver (e.g. from https://dev.mysql.com/downloads/connector/j/ for MySQL). NB: KNIME now comes bundled with several drivers already installed, MySQL among them; the installed drivers are listed in the Database Connector node.
In the database URL, you need to change the parts in <>, i.e. the hostname, port number, and database name. The hostname may be localhost if it is a local database. The port number you will need to get from your database administrator, or it will be whatever you set it to if you are running a local database (3306 is the default for MySQL). So for a database called 'myDB' on the default port on your local machine, the URL would be jdbc:mysql://localhost:3306/myDB
For some of the shipped drivers there are also connector nodes, e.g. MySQL Connector, SQLite Connector, PostgreSQL Connector, etc., which still require the server name/port and database name, but take them as individual inputs rather than requiring you to edit the URL.
Recent versions of KNIME are based on Java 8, which dropped support for ODBC, so you should first find an alternative (JDBC) driver for your database; only then can you connect to it from KNIME, as described on the KNIME documentation page for DB connectors.
There are several nodes which allow you to connect to a DB (MySQL in particular).
I remember there was a dedicated MySQL node for connecting to the DB.
Just remember this: you have to enter the IP address and port, then the credentials, and then point to the DB you want to open by default.

Publishing a VB.NET Application with SQL Express DB (using LocalDB)

I have written a VB.Net application that uses an SQL Express DB file containing a single table and a handful of stored procedures.
I have successfully built and exported the application to my VPS.
The problem is knowing what to do about the database file; there is a wealth of stuff online, but nothing that specifically suits my needs.
I plan to use LocalDB on the VPS, but since it is command-line only, it is hard to know whether the scripts I have run have been successful after creating an instance, starting it, etc.
I want to keep installation requirements to an absolute minimum on my VPS machine (and, in time, on other end users' machines), hence using LocalDB and not SQL Express.
So, what do I have to do on the VPS to enable my application to connect to the database? This was simple when it was Access: supply the MDB file and run the AccessDatabaseEngine redistributable, job done.
The connection on my development machine runs as expected.
The connection string in my code is:
Const strSQLConnection As String = "Data Source= (localdb)\v11.0;Database=SoccerTrader;Trusted_Connection=True"
Can anyone help, please? This is driving me around the bend; surely it can't be that difficult?
===========================
I have found the following in an MSDN blog which says:
Database as a File: LocalDB connection strings support AttachDbFileName property that allows attaching a database file during the connection process. This lets developers work directly with databases instead of the database server. Assuming a database file (*.MDF file with the corresponding *.LDF file) is stored at “C:\MyData\Database1.mdf” the developer can start working with it by simply using the following connection string: “Data Source=(localdb)\v11.0;Integrated Security=true;AttachDbFileName=C:\MyData\Database1.mdf”.
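Adapted to the folder I copy the files to in step 3 below, that pattern would give a connection string along these lines (a sketch, not yet proven on the VPS):
Data Source=(localdb)\v11.0;Integrated Security=true;AttachDbFileName=C:\BESTBETSoftware\SoccerBot\SoccerTrader.mdf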
================ ADDED 12th June =====================
OK, this is really bugging me now... I have read around this till it is coming out of my ears and nothing specifically seems to target what I am trying to do. All the blogs I read refer to installing / running SQL Server and changing permissions etc.
As I have mentioned I am using a VPS and propose to use LocalDB on the VPS to access a simple/small database file for a VB.Net application I am writing.
This is the story so far.
1) I have built a working prototype on my development PC and connected using SQL Express to a database file SoccerTrader.mdf - no problem.
In the Visual Studio project properties I added a requirement to the project that checks for SQL Server and, if it is missing, installs it.
2) I installed the project on the VPS and, as expected, SQL Server 2012 LocalDB was installed.
3) I copied the SoccerTrader.MDF and SoccerTrader.LDF files into "C:\BESTBETSoftware\SoccerBot" on the VPS.
4) For practical reasons, given the problems I am having getting this to work, I implemented an input box so I can specify the connection string when the application runs. The connection strings I have used give the following (screenshot): http://i.stack.imgur.com/i2tro.png
I have not changed any file permissions on the development PC and the database state is NOT read only....
So, the question is: where do I go from here? What have I missed? Why is it not working?
I have managed to sort the problem.
Seemingly, the connection string I was using was OK; it was my error handling that wasn't 'clean' enough. It transpired that the connection was being made on my VPS, but when the application attempted to update the table, the directory I had created and put the MDF file into would not permit write access.
I moved the MDF into the C:\Users\Public\Documents folder and all works as it should.
You have to specify the full path of the DB file, including the folder name or IP address.

How do I use the Flyway command-line client and flyway.conf to migrate multiple databases?

I started using Flyway and it's easy, but I have only tried it with one database:
setting flyway.url to that database
using flyway migrate from the command line to execute all the scripts
This is the current setting:
flyway.driver=net.sourceforge.jtds.jdbc.Driver
# Jdbc url to use to connect to the database
flyway.url=jdbc:jtds:sqlserver://'databaseName'
# User to use to connect to the database (default: <>)
flyway.user=user
# Password to use to connect to the database (default: <>)
flyway.password=user
But I am having a problem with multiple databases and scripts for each DB.
How can I set up Flyway to migrate all the databases?
Should I run a separate schema for each?
Can I add multiple databases' info to the config file?
What should I set my flyway.url to in the properties file?
I am assuming you are using the command-line client. It has a -configFile option that lets you choose which config file to use. Simply use one config file per database and you should be ok.
For example, if you have Flyway installed in a folder called c:/Flyway:
Copy your conf/flyway.conf file to a file called conf/prod.conf.
Open it and update the username, password, and url properties, e.g:
flyway.url=jdbc:postgresql://dbinstance.eu.rds.amazonaws.com:5432/myapp
Then run in a command prompt (shell):
flyway -configFile=c:/flyway/conf/prod.conf migrate
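To cover several databases, the same pattern simply repeats: one conf file per database and one migrate run per file. A sketch follows; the extra conf file names, server address, and script folders are made up for illustration:
flyway -configFile=c:/flyway/conf/customers.conf migrate
flyway -configFile=c:/flyway/conf/orders.conf migrate
Each conf file points at its own database and, if the scripts differ per database, its own script folder, e.g. in customers.conf:
flyway.url=jdbc:jtds:sqlserver://dbserver:1433/customers
flyway.locations=filesystem:c:/flyway/sql/customers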

Create MySQL database on server

I'm following a tutorial for constructing a PHP and MySQL e-commerce website. I'm uploading the files to my server at the moment, but I need some assistance determining how to proceed.
The README of the tutorial gives the following instructions:
INSTALLATION INSTRUCTIONS
1.) Unzip plaincart.zip to the root folder under your
HTTP directory ( or under your preferred directory)
2.) Create a database and database user on your web
server for Plaincart
3.) Use the sql dump in plaincart.sql to generate the
tables and example data
4.) Modify the database connection settings in
library/config.php.
5.) If you want to accept paypal modify the settings
in include/paypal/paypal.inc.php . More information
about this paypal stuff can be found in
http://www.phpwebcommerce.com/shop-checkout-process/
OK, so I obviously am capable enough to complete #1! :)
So, on to number 2: how do I create a database on my server?
I understand number 3; it refers to using the SQL dump file in plaincart.sql to generate the tables and some example data once the database has been created.
I can't tell about #4 and #5 yet, but we'll see when we get there.
So, I guess I just need to know how to construct a MySQL database on my web server.
Easiest way: install phpMyAdmin on the remote server and do it from that web interface.
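If your host only gives you a MySQL console (or you prefer SQL over the web interface), step 2 boils down to statements like the following. The database name, user name, and password here are placeholders, not values from the tutorial:
CREATE DATABASE plaincart;
CREATE USER 'plaincart_user'@'localhost' IDENTIFIED BY 'choose-a-password';
GRANT ALL PRIVILEGES ON plaincart.* TO 'plaincart_user'@'localhost';
FLUSH PRIVILEGES;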
