Querying MS SQL Server from a Jenkins Pipeline - sql-server

I've been using Jenkins (2.289.3) in a Docker container (https://hub.docker.com/r/jenkins/jenkins). The next update, Jenkins 2.312, migrates the Docker container from Java 8 to Java 11.
I have some pipelines that use the SourceForge jTDS JDBC driver to query SQL Server (http://jtds.sourceforge.net/).
Example:
import java.sql.DriverManager
import groovy.sql.Sql
con = DriverManager.getConnection('jdbc:jtds:sqlserver://servername', 'user', 'password');
stmt = con.createStatement();
To make this work on Java 8, I ran this inside the Docker container:
cp jtds-1.3.1.jar ${JAVA_HOME}/jre/lib/ext
which loads the jar for use inside Jenkins. The extension mechanism (jre/lib/ext) was removed in Java 9, so this option no longer exists on Java 11.
It seems pipelines support the @Grab syntax, e.g.
@Grab(group='net.sourceforge.jtds', module='jtds', version='1.3.1')
If I add this to my pipeline, I can see the jars are downloaded to /var/jenkins_home/.groovy/grapes/, but it doesn't seem to actually load the jar:
java.lang.ClassNotFoundException: net.sourceforge.jtds.jdbc.Driver
or
java.sql.SQLException: No suitable driver found for jdbc:jtds:sqlserver://servername
depending on which commands I run. Either way, it appears to be due to the jar not being loaded.
All the Groovy examples use
@GrabConfig(systemClassLoader=true)
but this appears not to be supported in pipelines.
I've considered using a command-line client, but I need to parse the results of queries, and I haven't seen a tool that works well for this (i.e. one that would load results into a JSON file or similar).
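For context, this is roughly the result handling I'm after; a sketch that assumes the driver is already loadable (the query and output path are placeholders):
import groovy.sql.Sql
import groovy.json.JsonOutput
// open a connection through groovy.sql.Sql, naming the jTDS driver class
def sql = Sql.newInstance('jdbc:jtds:sqlserver://servername', 'user', 'password',
        'net.sourceforge.jtds.jdbc.Driver')
try {
    // rows() returns a list of GroovyRowResult maps, which serialize cleanly to JSON
    def rows = sql.rows('SELECT name FROM sys.databases')
    new File('/var/jenkins_home/test/results.json').text = JsonOutput.toJson(rows)
} finally {
    sql.close()
}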
I've also tried setting the -classpath argument in the Docker container, e.g.
ENV JAVA_OPTS=-classpath /var/jenkins_home/test/jtds-1.3.1.jar
Running ps in the Docker container, I can see that the java process runs with the classpath command line option specified, but it doesn't seem to actually load the jar for use (presumably because Jenkins is launched via java -jar jenkins.war, and java ignores -classpath when -jar is given).
I'm a bit lost on how to get this working, can anyone help? Thanks.

Well, I've found a workaround. It doesn't seem ideal, but it does work.
The original code:
import java.sql.DriverManager
import groovy.sql.Sql
con = DriverManager.getConnection('jdbc:jtds:sqlserver://servername', 'user', 'password');
stmt = con.createStatement();
Assuming we have the jar saved in /var/jenkins_home/test/jtds-1.3.1.jar, it can be updated to:
import java.sql.DriverManager
import groovy.sql.Sql
def classLoader = this.class.classLoader
while (classLoader.parent) {
    classLoader = classLoader.parent
    if (classLoader.getClass() == java.net.URLClassLoader) {
        // load our jar into the URLClassLoader
        classLoader.addURL(new File("/var/jenkins_home/test/jtds-1.3.1.jar").toURI().toURL())
        break
    }
}
// loading the driver class registers it with DriverManager
Class.forName("net.sourceforge.jtds.jdbc.Driver")
con = DriverManager.getConnection('jdbc:jtds:sqlserver://servername', 'user', 'password');
stmt = con.createStatement();
Once this code has been run once, the jar seems to be accessible globally (even in other pipelines that don't load the jar).
Based on this, it seems like a good way to handle this is at Jenkins initialization rather than in the script at all. I created /var/jenkins_home/init.groovy with these contents:
def classLoader = this.class.classLoader
while (classLoader.parent) {
    classLoader = classLoader.parent
    if (classLoader.getClass() == java.net.URLClassLoader) {
        classLoader.addURL(new File("/var/jenkins_home/jars/jtds-1.3.1.jar").toURI().toURL())
        break
    }
}
Class.forName("net.sourceforge.jtds.jdbc.Driver")
And after that, the scripts behave similarly to how I'd expect them to work with the jar on the classpath.
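To make this reproducible, the jar and init script can be baked into the image. A sketch of the Dockerfile, relying on the official image copying /usr/share/jenkins/ref into JENKINS_HOME at startup (the version tag is a placeholder):
FROM jenkins/jenkins:2.312
# files under /usr/share/jenkins/ref are copied into /var/jenkins_home when the container starts
COPY jtds-1.3.1.jar /usr/share/jenkins/ref/jars/jtds-1.3.1.jar
COPY init.groovy /usr/share/jenkins/ref/init.groovy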

Related

Setup airflow - mssql connection without using airflow UI

I have a working airflow - docker - mssql connection (mssql is in the airflow container, and I am able to communicate with mssql via the CLI).
I would like to create a DAG which is able to read a database without using the airflow UI.
With the huge help of this answer (https://stackoverflow.com/a/47464524/20325057) I changed the env. variable (AIRFLOW_CONN_MY_MSSQL: mssql+pyodbc://sa:Database2022@localhost,1433) in docker-compose.yaml like this:
...
AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
AIRFLOW__API__AUTH_BACKENDS: 'airflow.api.auth.backend.basic_auth'
AIRFLOW_CONN_MY_MSSQL: mssql+pyodbc://sa:Database2022@localhost,1433
_PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
...
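For reference, Airflow derives the connection id (my_mssql) from the AIRFLOW_CONN_MY_MSSQL variable name, so the connection can be checked from inside the container without the UI; assuming the Airflow 2.x CLI:
airflow connections get my_mssql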
And created a simple DAG like that:
from airflow import DAG
from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator
from datetime import datetime
default_args = {
    'owner': 'airflow',
    'start_date': datetime.now()
}
with DAG("airflow_mssql_test3", schedule='@daily', default_args=default_args, catchup=False) as dag:
    read_mssql = MsSqlOperator(
        task_id='reading_mssql',
        mssql_conn_id='my_mssql',
        # printing first row of the table:
        sql=r"""SELECT TOP 1 * FROM stores;""",
    )
In PyCharm I get this single-line error:
WARNING:root:OSError while attempting to symlink the latest log directory
I have Windows 10, but it's running in a Docker container, so I do not know where the problem is.
At the same time in Airflow (2.4.1) I have this error:
from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator
ModuleNotFoundError: No module named 'airflow.providers.microsoft.mssql'
which is strange, as the providers and packages are installed.
I am really stuck, and would appreciate any kind of help.

Airflow initdb command fails after linking with postgresql

I am trying to connect Airflow with a PostgreSQL DB.
In airflow.cfg I changed sql_alchemy_conn = spostgresql+psycopg2://127.0.0.1:5432/airflow, where airflow is the name of my DB, which is installed on the same machine.
After updating the config file, I run airflow initdb and get the following error which I cannot understand:
File "/some_path/env/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 232, in load
"Can't load plugin: %s:%s" % (self.group, name)
sqlalchemy.exc.NoSuchModuleError: Can't load plugin: sqlalchemy.dialects:spostgresql.psycopg2
I found this on the web, which seems to "solve" this problem, but the solution was not clear to me at all.
Can someone tell me what the problem is and how to solve it?
Looks to me like you have a typo in your SQLAlchemy connection string (an s at the beginning of postgresql). Try changing:
spostgresql+psycopg2://127.0.0.1:5432/airflow
to
postgresql+psycopg2://127.0.0.1:5432/airflow
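i.e., the line in airflow.cfg should read:
sql_alchemy_conn = postgresql+psycopg2://127.0.0.1:5432/airflow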

Unable to run a python flink application on cluster

I am trying to run a Python Flink application on a standalone Flink cluster. The application works fine on a single-node cluster, but it throws the following error on a multi-node cluster:
java.lang.Exception: The user defined 'open()' method caused an exception: An error occurred while copying the file.
Please help me resolve this problem. Thank you.
The application I am trying to execute has the following code:
from flink.plan.Environment import get_environment
from flink.plan.Constants import INT, STRING, WriteMode
env = get_environment()
data = env.from_elements("Hello")
data.map(lambda x: list(x)).output()
env.execute()
You have to configure "python.dc.tmp.dir" in "flink-conf.yaml" to point to a distributed filesystem (like HDFS). This directory is used to distribute the Python scripts.
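For example, an entry along these lines, where the HDFS path is just an assumption:
python.dc.tmp.dir: hdfs:///tmp/flink-python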

Configuring a Postgresql connection with Play 2 and Slick-Play

I'm learning how to build an application using Scala and the Play 2 Framework. I've created a new project using the activator tool, based on the current "play-scala-intro" template.
The template has a sample app using Play-Slick 1.0 for managing dependencies and is configured with an H2 DB, which worked without problems.
When I tried to change to a Postgres DB, I ran into trouble. I get an error 500, telling me:
"Cannot connect to database [default]".
In the stack trace, the exception is:
"Configured Slick driver org.postgresql.Driver is not an instance of
requested profile slick.profile.BasicProfile"
So... What I already did:
I added the dependency to my build.sbt file:
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41"
In my configuration file (application.conf), the DB connection is configured as:
slick.dbs.default.driver=org.postgresql.Driver
slick.dbs.default.db.url="jdbc:postgresql://localhost:5432/hello_play"
slick.dbs.default.db.user="postgres"
slick.dbs.default.db.password=""
PS: I've tried with slick.dbs.default.driver="org.postgresql.Driver" too...
PS2: My DB password is empty. I'm connecting with PgAdmin without problems.
slick.dbs.default.driver must be a Slick driver (the Slick profile that generates SQL for your database), not a JDBC driver; the JDBC driver class belongs in slick.dbs.default.db.driver. Your db config should look something like this:
slick.dbs.default.driver="slick.driver.PostgresDriver$"
slick.dbs.default.db.driver="org.postgresql.Driver"
slick.dbs.default.db.url="jdbc:postgresql://localhost:5432/hello_play"
slick.dbs.default.db.user="postgres"
slick.dbs.default.db.password=""

ClassNotFoundException with Netbeans and H2 Database

So I followed the tutorial on the H2 documentation page and used the "Connecting to a Database using JDBC" method of connecting to the database. I first added the h2-*.jar file to the lib folder (through NetBeans) and used the following to make the connection to the database I had previously created.
Class.forName("org.h2.Driver");
connection = DriverManager.getConnection("jdbc:h2:~/" + DatabaseName);
This turned out to work in the IDE environment; however, when I attempted to run the application directly from the executable jar, I got the following error:
java.lang.ClassNotFoundException: org.h2.Driver ...
This error occurs at the Class.forName() class-loader call. So I did a little looking around and found that this is a prominent problem. One solution people use is to extract the class loader from the current thread, like so:
Thread t = Thread.currentThread();
ClassLoader cl = t.getContextClassLoader();
cl.getClass().getClassLoader();
Class toRun = cl.loadClass("org.h2.Driver");
Unfortunately this still seems to result in the same error, so I'm wondering what I'm doing wrong. Should I be doing something to make sure the driver is on the classpath? I have no idea how to do that if that's the case.
Thanks!
You need to add the h2-*.jar file to the classpath when running the application. Note that java ignores -cp when -jar is given, so name the main class explicitly, for example
java -cp h2.jar:yourApp.jar your.main.Class
(your.main.Class is a placeholder for the application's main class, h2.jar for the actual h2-*.jar file name; use ; instead of : as the path separator on Windows).
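Alternatively, to keep plain java -jar working, the driver can be declared in yourApp.jar's manifest; a sketch, assuming the h2 jar sits in the same directory as yourApp.jar (the jar name is a placeholder):
Class-Path: h2.jar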
