Sharing sqljdbc_auth.dll among multiple war files in Tomcat server running at the same time - sql-server

The following is my use case:
I have started using Camunda Platform and am accessing the Camunda process engine from a custom application written in Java, so there are two applications (WAR files) deployed in the same Tomcat server. Initially the Camunda application used the built-in H2 database while my custom application used a SQL Server database with integrated authentication, and this worked perfectly fine. I have now migrated the H2 database to SQL Server as well and am again using integrated authentication for the process engine's connection to SQL Server.
Problem:
After deploying the two WARs and restarting the Tomcat server, the sqljdbc_auth.dll present in the Tomcat bin folder is loaded successfully by the Camunda application; the process engine accepts requests, accesses the database and returns correct responses in the Camunda web applications (Cockpit, Tasklist, Admin). But when I try to log in to my custom application, I get the following error:
"
null.null Failed to load the sqljdbc_auth.dll cause : Native Library C:\Users\Aakanksha\Desktop\BACKUP\$CAMUNDA_HOME\server\apache-tomcat-8.0.24\bin\sqljdbc_auth.dll already loaded in another classloader
"
I understand why this is happening and have already tried the following solutions:
Sol.1 -
Added the sqljdbc4.jar file to $TOMCAT_HOME/lib folder
Added the sqljdbc_auth.dll file to $TOMCAT_HOME/bin folder
Sol.2 - Added sqljdbc4.jar and sqljdbc_auth.dll to each WAR file separately, i.e. to their WEB-INF/lib folders.
Sol.3 -
Removed the DLL from the $TOMCAT_HOME/bin folder and added it to Windows/System32
Added this path to the PATH environment variable
Sol.4 -
Added the DLL to the Java/JDK bin folder.
Sol.1, Sol.3 and Sol.4 - The DLL was loaded and used successfully by one WAR but not by the other, which failed with the same error.
Sol.2 - led to the error "
com.microsoft.sqlserver.jdbc.AuthenticationJNI. Failed to load the sqljdbc_auth.dll cause : no sqljdbc_auth in java.library.path
"
Both of my applications run at the same time and connect to different SQL Server databases. Is it even possible for two WARs running at the same instant to use a shared DLL to connect to different databases?
Kindly share your suggestions and ideas.

I was having the same issue. We changed our WAR files to use JNDI lookups against datasources defined in the Tomcat configuration. That works great; we can now deploy multiple WAR files.
For more information see
http://www.baeldung.com/spring-persistence-jpa-jndi-datasource
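As a rough sketch of that kind of setup (the resource name jdbc/CustomAppDB, the pool sizes and the connection URL below are placeholders, not taken from the question): the datasource is declared once at the Tomcat level, so only Tomcat's common classloader ever loads the driver and sqljdbc_auth.dll, and each WAR merely looks the pool up.

<!-- $TOMCAT_HOME/conf/context.xml (or a per-application META-INF/context.xml). -->
<!-- sqljdbc4.jar stays in $TOMCAT_HOME/lib and sqljdbc_auth.dll in $TOMCAT_HOME/bin. -->
<Resource name="jdbc/CustomAppDB"
          auth="Container"
          type="javax.sql.DataSource"
          driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver"
          url="jdbc:sqlserver://localhost:1433;databaseName=CustomAppDB;integratedSecurity=true"
          maxTotal="20"
          maxIdle="5"/>

Inside either WAR the pool is then obtained via JNDI instead of bundling the driver in WEB-INF/lib, along these lines (DataSourceLookup is a hypothetical helper; the JNDI name must match the Resource declared above):

// Hypothetical helper class used by the web application.
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.sql.DataSource;

public class DataSourceLookup {
    public static DataSource customAppDataSource() throws NamingException {
        return (DataSource) new InitialContext()
                .lookup("java:comp/env/jdbc/CustomAppDB");
    }
}

A second datasource for the Camunda database can be declared the same way, so both WARs share the single copy of the native library without ever loading it in their own classloaders.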
Johan

Related

How to load file locally with path vs path in Azure Devops build

I am trying to load a file in my PostDeployment script in my DB project:
FROM OPENROWSET(BULK'C:\Development\MyProject.Db\Scripts\My_Script.sql',SINGLE_CLOB)
Of course my build pipeline in Azure DevOps complains that it cannot find this script. But when I change it to the following, my local build complains:
FROM OPENROWSET(BULK'$(Build.ArtifactStagingDirectory)\MyProject.Db\Scripts\My_Script.sql',SINGLE_CLOB)
What is the right way here? I would like to keep my local path for local builds and switch to the build path (hoping that one is correct; I didn't even get a chance to try it because the build complains) when the Azure Pipelines build runs. How can I achieve this?
Generally, post-deployment scripts run on the server once the database has been deployed there, so you can't directly use absolute (or relative) paths in the scripts to access files on your local machine.
In Azure Pipelines, if we want to access remote services from local machines, there are many existing methods and tasks for creating and using service connections to those services.
However, for the reverse direction, accessing local machines from remote services, there is no existing, easy way. As far as I know, you may need to expose the IP address and port of the local machine to the remote service, and you may also need to configure proxy and firewall settings on the local machine.
If the files you want to access in the post-deployment scripts have already been deployed to the server, you can use their paths on the server instead of paths on your local machine.
[UPDATE]
Each pipeline is assigned a working directory ($(Pipeline.Workspace)) inside the agent's working directory ($(Agent.WorkFolder)).
If the files you want to access are source files in the repository, and the repository has been checked out into the pipeline's working directory, you can access them under $(Build.SourcesDirectory).
$(Build.SourcesDirectory) is the default working directory of the job, so you can reference the source files with either relative or absolute paths.
For example:
Using a relative path:
MyProject.Db/Scripts/My_Script.sql
OR
./MyProject.Db/Scripts/My_Script.sql
Using an absolute path:
$(Build.SourcesDirectory)/MyProject.Db/Scripts/My_Script.sql

Electron - How to setup db with sqlite in Windows

I have created an Electron app and built it with electron-builder. It creates a package in the dist folder, which I can install and then run the resulting application.
I have a SQLite database in the root folder of my project, with some data in it. But when I package the app and run the resulting exe, it either fails to connect to the database or the database appears empty. If I simply run the project with Electron without packaging, it connects to the database and makes use of the data.
Also, if I visit the installation folder, I find a copy of the database from my application, but without any rows in it. Inside the .asar archive there is a database populated as I would want, but supposedly I cannot edit that one.
Would you have any pointers on what could be causing this? How can I properly connect to the database in the root folder of my project using SQLite, Sequelize, Windows and Electron?
Thanks in advance
Ensure that electron-builder doesn't pack the database file into the app ASAR (use the asarUnpack option).
If your packaged app needs to modify the database then have it copy the file to the location returned by app.getPath('userData') and work with that copy. Your app generally does not have permission to write to the directory in which it is installed.
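A rough sketch of that approach in the main process (the file name app.db, the electron-builder extraResources assumption and the development fallback path are illustrative, not taken from the question):

// Copy the bundled, read-only database to a writable per-user location
// on first run, then open that copy.
const { app } = require('electron');
const fs = require('fs');
const path = require('path');

function resolveDatabasePath() {
  // Writable location, e.g. C:\Users\<user>\AppData\Roaming\<your app> on Windows.
  const writablePath = path.join(app.getPath('userData'), 'app.db');

  if (!fs.existsSync(writablePath)) {
    // Assumes the database is shipped via electron-builder's extraResources,
    // so the bundled copy sits under process.resourcesPath when packaged
    // (asarUnpack instead places files under resources/app.asar.unpacked).
    const bundledPath = app.isPackaged
      ? path.join(process.resourcesPath, 'app.db')
      : path.join(__dirname, 'app.db');
    fs.copyFileSync(bundledPath, writablePath);
  }
  return writablePath;
}

Point Sequelize (or whatever SQLite client you use) at resolveDatabasePath() rather than at a path inside the installation directory, which is usually not writable.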

Search in all project files on remote host in PhpStorm

I have lots of files in a project on a remote host, and I want to find out from which file another PHP file is called. Is it possible to use Ctrl+Shift+F search on a remote host project?
Is it possible to use Ctrl+Shift+F search on a remote host project?
Currently it's not possible. (2022-06-09: it is now possible with remote development using JetBrains Gateway; see the end of this answer.)
In order to search file contents in a locally running IDE, the files must be read first. For that, the IDE must download them... which can be quite a time- and connection-consuming task over (S)FTP connections (depending on how far away the server is, how fast your connection is, bandwidth limits, etc.).
Even if the IDE could do this transparently for search, as it does with the Remote Edit functionality (where it downloads a remote file but, instead of placing it in the actual project, stores it in a temporary location), it would still need to download the files.
If you execute one search (one term) and then need to do another (a slightly modified term or a completely different search string), the IDE would need to re-download those files again (a waste of time and connection).
Therefore it makes much more sense to download your project (all of it, or only the desired files) locally and then run such searches on the local files.
If it has to be a purely remote search (where nothing gets downloaded locally)... then you can simply establish an SSH/RDP/etc. connection to that remote host (BTW: PhpStorm has built-in SSH Console functionality) and run the search directly on the remote server with OS-native tools (find/grep and the like) or some remote software (e.g. mc or notepad++).
P.S. (on a related note)
Some of the disadvantages when doing Remote Edit: https://stackoverflow.com/a/36850634/783119
EDIT 2022-06-09:
BTW, JetBrains now has JetBrains Gateway for remote development, where you run the IDE core on a remote server and connect to it via SSH using a dedicated local app or a plugin for your IDE (PhpStorm has come bundled with such a plugin since version 2021.3).
To check more on JetBrains Gateway:
https://www.jetbrains.com/remote-development/gateway/
https://blog.jetbrains.com/blog/2021/11/29/introducing-remote-development-for-jetbrains-ides/

Filezilla/Wildfly server deployment error (Possible database dependencies)

This is my first time on here. I am having an issue deploying a Java application I made in MyEclipse. I am using Filezilla to host my Wildfly 9.0.2 test server. I exported my project to a .war file, and upon dragging it into the test server I am met with a deployment.failed. Upon viewing the file in Notepad, it declares "Services with missing/unavailable dependencies". One such error can be seen below:
[ "jboss.naming.context.java.module.myproject.myproject.env.common.jdbc.database_connection is missing [jboss.naming.context.java.database.connection] "
There are five of these similar errors, and all point to a different database connection of some type that I am not using within my project. I understand the issue, but I do not know where these dependencies are declared or how I can go about removing them.
Any help will be greatly appreciated.
Kind Regards,
Paul
Creating the WAR file will use the project's deployment assembly (assuming you're using MyEclipse 2013 or later). Right-click on the project and select Properties, then go to the MyEclipse / Deployment Assembly page. This lists all of the files that are added to the deployment (i.e. to the WAR file).
However, the message seems to suggest that the project is using a database connection which can't be found when running on the server. A first thought was that you're using the inbuilt Derby database but don't have it running when you deploy to Wildfly. But you say that you're not using a database. Also, I'm not familiar with how Filezilla can host a J2EE server - I thought Filezilla was an FTP client and server solution. Perhaps you could give more details if this answer doesn't help.
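For context, and purely as an illustration (the names below are guesses, not taken from your project): bindings like the one in the error are typically created by a <resource-ref> in WEB-INF/web.xml, an @Resource annotation, or a jta-data-source entry in persistence.xml, and Wildfly then requires a matching datasource to exist on the server. A web.xml entry of roughly this shape would produce such a dependency:

<!-- WEB-INF/web.xml - illustrative names only -->
<resource-ref>
    <res-ref-name>common/jdbc/database_connection</res-ref-name>
    <res-type>javax.sql.DataSource</res-type>
    <res-auth>Container</res-auth>
</resource-ref>

Removing references that your project doesn't actually use, or defining the corresponding datasources on the Wildfly server (e.g. in standalone.xml), should clear the missing-dependency errors.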

ClickOnce publishing incomplete and then removes everything

I am using Visual Studio Express 2013 and trying to deploy Windows Forms software to a web server running Windows Server 2012. It is failing consistently, as follows:
It successfully uploads 16 supporting DLLs, ranging in size from 74KB to 5,679KB
It successfully uploads a DLL that is part of the solution being deployed
It adds the <name>.application and <name>.exe.config.deploy files, both at 0KB
It successfully uploads <name>.exe.deploy and <name>.exe.manifest
It successfully uploads another DLL that is part of the solution being deployed
It appears to (see later error message) successfully upload a <name>.ico.deploy file
Then all files except two (<name>.application and <name>.exe.config.deploy) are removed, and the <name>.exe.config.deploy file is shown as 1KB, rather than 0KB as before.
A refresh of the folder then shows <name>.application as 11KB, rather than 0KB as before.
Then these files disappear as well, leaving an empty folder.
According to Visual Studio, three files failed:
<name>.application
<name>.exe.config.deploy
<name>.ico.deploy
The same error is given for each failed file:
Failed to copy file to . Unable to add to the Web site. Can't connect to on port 21. Check the server name and proxy settings. If the settings are correct, the server may be temporarily unavailable.
This seems very odd given that it has successfully connected to the FTP server to upload all the preceding files.
I have tried massively extending timeouts on the FTP site but it made no difference.
Any ideas?
Thanks
Gee
