This is my first time on here. I am having an issue deploying a Java application I made in MyEclipse. I am using Filezilla to host my WildFly 9.0.2 test server. I exported my project to a .war file, and upon dragging it into the test server I am met with a .failed deployment marker. Upon viewing the file in Notepad, it declares "Services with missing/unavailable dependencies". One such error can be seen below:
[ "jboss.naming.context.java.module.myproject.myproject.env.common.jdbc.database_connection is missing [jboss.naming.context.java.database.connection] "
There are five of these similar errors, and each points to a different database connection of some type that I am not using within my project. I understand the issue, but I do not know where these dependencies are declared or how I can go about removing them.
Any help will be greatly appreciated.
Kind Regards,
Paul
Creating the WAR file will use the project's deployment assembly (assuming you're using MyEclipse 2013 or later). Right click on the project and select Properties. Then go to the MyEclipse/Deployment Assembly page. This will have all of the files that are added to the deployment (or to the WAR file).
However, the message seems to suggest that the project is using a database connection which can't be found when running on the server. A first thought was that you're using the built-in Derby database but don't have it running when you run on WildFly. But you say that you're not using a database. Also, I'm not familiar with how Filezilla can host a J2EE server - I thought Filezilla was an FTP client and server solution. Perhaps you could give more details, if this answer doesn't help.
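As for where such dependencies get declared: bindings in the java:module/env namespace usually come from resource-ref entries in WEB-INF/web.xml (or equivalent @Resource annotations) inside the WAR. As a hedged sketch, an entry like the one below would create exactly this kind of binding dependency (the names are copied from your error; the res-type and lookup target are assumptions):

    <!-- WEB-INF/web.xml: hypothetical entry that would produce the error -->
    <resource-ref>
        <res-ref-name>common/jdbc/database_connection</res-ref-name>
        <res-type>javax.sql.DataSource</res-type>
        <lookup-name>java:database/connection</lookup-name>
    </resource-ref>

If entries like this appear in your web.xml (or in a jboss-web.xml mapping) and you don't actually use them, removing them should clear the errors; otherwise they may be generated by MyEclipse or pulled in from a library.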
I'm starting to configure Keycloak to run in a production environment, and I need to use a database in order to run more than one instance with a single configuration repository. I'm using Oracle as the DBMS.
But I didn't find the scripts to create the database in Keycloak's git repository.
Does anyone know where I can find them?
You don't need to run a separate set of SQL scripts; Keycloak creates the schema for you on first startup.
A bit of advice, as it's not really obvious at first: you'll either need to remove the default Keycloak datasource (KeycloakDS) and install your own, or manually modify standalone.xml to point at the setup you want. It took me a little while to figure out the order in which I needed to do things.
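For reference, a minimal sketch of what the modified datasource in standalone.xml might look like for Oracle (the host, service name, credentials, and driver module name are all placeholders, and you'll also need to install the Oracle JDBC driver as a WildFly module):

    <datasource jndi-name="java:jboss/datasources/KeycloakDS" pool-name="KeycloakDS" enabled="true" use-java-context="true">
        <!-- placeholder host/port/service: point this at your Oracle instance -->
        <connection-url>jdbc:oracle:thin:@//dbhost:1521/KEYCLOAK</connection-url>
        <driver>oracle</driver>
        <security>
            <user-name>keycloak</user-name>
            <password>change-me</password>
        </security>
    </datasource>
    <!-- elsewhere in the same subsystem, under <drivers>: -->
    <driver name="oracle" module="com.oracle.ojdbc">
        <driver-class>oracle.jdbc.OracleDriver</driver-class>
    </driver>

On the next startup, Keycloak should then create its tables in that schema.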
I have been searching for a while for a best practice for initializing a relational database schema and its pre-populated data.
There are a couple of ways to make it happen:
Install the cf-ex-phpmyadmin app and import the schema and data through it
Use the VMC CLI tool to create a tunnel to the service, as described in this link
If using Ruby or Python, use the DB migration command in the manifest.yml. However, it will be executed on each instance, and every time the instance re-stages.
Which one is commonly used and most effective?
VMC is very old and is no longer supported. I'd be surprised if it even works against a Cloud Foundry installation that has been deployed within the last couple years. You should use the new cf CLI.
If you were to put the command in your manifest, you could avoid having it run on every instance by adding a conditional guard that only runs the migrations when $CF_INSTANCE_INDEX equals 0. However, it's not always a great idea to run migrations in your start command: there is a hard timeout on the start command, and you don't want long migrations to be interrupted.
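A minimal sketch of that guard in a manifest.yml, assuming a Ruby app (the app name and commands are placeholders):

    applications:
    - name: myapp
      # run migrations on instance 0 only, then start the app
      command: if [ "$CF_INSTANCE_INDEX" = "0" ]; then bundle exec rake db:migrate; fi; bundle exec rails server -p $PORT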
A good suggestion I've heard [1] is that migrations should be handled as a separate part of your deploy process, either by cf ssh or running them locally, pointed at the URL and credentials of your database service instance.
[1] credit to Travis Grathwell for this suggestion.
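As a hedged illustration of the cf ssh route (the app name, database host, and ports are placeholders):

    # open a tunnel from local port 63306 to the bound database service
    cf ssh -L 63306:mysql.internal.example.com:3306 myapp
    # then, in another shell, run the migrations against localhost:63306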
While configuring SQL Server 2012 Master Data Services, I am having the following problem:
The required .svc handler mappings are not installed in IIS.
What I want to do is query my database using a URL, so that I can retrieve data directly via the URL itself, much like passing query-string parameters to SQL Server.
How do I deal with this? I have followed several documents but have had no luck.
To fix this issue, open a command prompt and go to the .NET directory
(for example %windir%\Microsoft.NET\Framework64\v4.0.30319).
Run the command:

    aspnet_regiis -i
For further details, check: SVC Handler mapping error in MDS Configuration Manager
I've come across these types of errors a few times when installing MDS. The problem usually comes about because just having IIS installed is not enough; there are loads of other role services and features that you need to enable and install as well, which the setup program doesn't tell you about.
Thankfully they are all documented here:
Web Application Requirements (Master Data Services)
And, if you've missed any, you can go back, install them and then re-launch the configuration tool to complete the setup without having to re-install MDS from scratch.
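If you're scripting the setup on Windows Server 2012, a hedged PowerShell sketch of enabling some of them is below (the feature names are an illustrative subset, not the authoritative list - that's in the document above):

    # illustrative subset of the IIS role services MDS needs
    Install-WindowsFeature Web-Server, Web-Mgmt-Console, Web-Asp-Net45, Web-Net-Ext45, Web-Windows-Auth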
I have an SSIS package that sets some variable data from a SQL Server package configuration table (selecting the "Specify configuration settings directly" option).
This works well when I'm using the database connection that I specified when developing the package. However, when I run it on a 64-bit server in the testing environment (either as an Agent job or by running the package directly) and specify the new connection string in the connection managers, the package still reads the settings from the DB server that I specified during development.
All the other connections pick up the correct connection strings; it only seems to be the package configuration that reads from the wrong place.
Any ideas or am I doing something really wrong?
The only way I was able to do this was to use Windows Environment Variables. You can specify things like connection strings and user preferences in environment variables, and then pick up those environment variables from your SSIS Task.
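For example, a hedged sketch of creating a machine-level variable to hold the connection string (the variable name and connection string are placeholders):

    rem create a machine-level environment variable for the package to read
    setx SSIS_CONFIG_DB "Data Source=PRODSQL01;Initial Catalog=SSISConfig;Integrated Security=SSPI;" /M
    rem note: services such as SQL Server Agent only see new machine-level variables after a restart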
I prefer to use server aliases in the SQL Client Configuration. That way, when you decide to point the package at another SQL Server, it is as simple as editing the alias to point to the new server; no editing is necessary in the SSIS package. When moving the package to a live server, you just need to add the aliases there, and it works.
This also helps when you have a really painful naming convention for servers, since the alias can be a more descriptive name than the actual machine name.
I didn't completely understand your question, but I store my connection settings in configuration files, usually one for each environment (dev, production, etc.). The packages read the connection settings from the config files when they are run.
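A hedged sketch of what one of those .dtsConfig files contains (the connection manager name and connection string are placeholders):

    <?xml version="1.0"?>
    <DTSConfiguration>
      <Configuration ConfiguredType="Property"
                     Path="\Package.Connections[MyDatabase].Properties[ConnectionString]"
                     ValueType="String">
        <!-- placeholder connection string for the dev environment -->
        <ConfiguredValue>Data Source=DEVSQL01;Initial Catalog=MyDB;Integrated Security=SSPI;</ConfiguredValue>
      </Configuration>
    </DTSConfiguration>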
When you're creating a job to call the SSIS package and you're setting up the step, there is a tabbed area. The default tab is where you set the package name, and the next tab over is where you can set the configuration file. Have a config file for each package, and change it for each server (dev, test, prod). The config files can be put directly on the dev, test, and prod servers, and then pointed to when setting up the job.
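That job step is essentially doing the equivalent of the following dtexec invocation (the file paths are placeholders):

    dtexec /File "D:\Packages\LoadSales.dtsx" /ConfigFile "D:\Configs\prod.dtsConfig"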
If you are using SQL Server package configuration, then all the properties of the package will come from the SQL Server table - please check that.
SSIS security the way it stands is terrible. No one will be able to support things when I am out of the office. The job never reads from the configuration file... I give up. It only works when I edit the string in the Data Sources tab, but the password gets lost if you happen to go into the job a second time. Terrible design, absolutely horrible. You would think that when you specify an XML file in the job step it would read the connection string defined there, but it does not. Does this really work for anyone else?
Go to the package properties and set deployment to True. This should work for what you have done.
I had the identical question and got the same answer, i.e. you cannot edit the connection string used for package configurations hosted in SQL Server, except by specifying that the SQL Server connection string should come from an environment variable.
This unfortunately does not work in my dev setup, where two environments are hosted on the same machine. I ended up following Scott Coleman's approach as detailed on SQL Server Central (free sign-up, and a good site). The trick is that you create a view to store your configuration settings on one central server, and then use the machine that connects to it to determine which environment is active.
I used that approach, but also used the user connecting to the environment to make the determination, because my test and dev setups run on the same SSIS instance but under different user names. Scott suggests in the comments that the application name should be set, but this cannot be changed in the package execution job step, so it was not an option.
One other caveat I found was that I had to add INSTEAD OF triggers to my view to handle the inserts, updates, and deletes for configuration variables.
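A minimal sketch of the idea (the table, view, and login names are all assumptions; the column list matches the standard SSIS configurations table):

    -- all environments' settings in one table, keyed by environment
    CREATE TABLE dbo.AllConfigurations (
        Environment         nvarchar(10)  NOT NULL,   -- 'DEV', 'TEST', 'PROD'
        ConfigurationFilter nvarchar(255) NOT NULL,
        ConfiguredValue     nvarchar(255) NULL,
        PackagePath         nvarchar(255) NOT NULL,
        ConfiguredValueType nvarchar(20)  NOT NULL
    );
    GO
    -- the view the packages point at; it picks the environment from the
    -- login running the package (hypothetical login names)
    CREATE VIEW dbo.[SSIS Configurations] AS
    SELECT ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType
    FROM dbo.AllConfigurations
    WHERE Environment = CASE SUSER_SNAME()
                            WHEN 'DOMAIN\ssis_dev'  THEN 'DEV'
                            WHEN 'DOMAIN\ssis_test' THEN 'TEST'
                            ELSE 'PROD'
                        END;
    GO
    -- the designer writes configurations through the view, hence the trigger
    CREATE TRIGGER dbo.tr_SSISConfigurations_Insert
    ON dbo.[SSIS Configurations] INSTEAD OF INSERT AS
    INSERT dbo.AllConfigurations
        (Environment, ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType)
    SELECT 'DEV', ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType
    FROM inserted;  -- assumption: inserts from the designer target DEV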
We want to keep our package configs in a database table; we know it gets backed up with our other data, and we know where to find it. Just a preference.
I have found that to get this to work, I can use an environment variable configuration to set the connection string of the connection manager that I am reading my package config from. (Although I had to restart the SQL Server Agent before it could find the new environment variable. Not ideal when I deploy this to production.)
It looks like when you run an SSIS package as a step in a scheduled job, it works in this order:
Load each of the package configs in the order they appear in the Package Configurations Organiser
Set the connection strings from the Data Sources tab in the job step properties of the scheduled job
Start running the package.
I would have expected the first two to be the other way around, so that I could set the data source for my package config from the scheduled job. That is where I would expect other people to look for it when maintaining the package.