We have been trying to set up a CI process for a SQL Server database, and have successfully imported the remote (DEV) db into our local database as a VS project. So we are able to make changes to the local db, run unit tests and then commit/push the changes to the remote repo. The CI server (TFS) triggers the process on the remote DEV server, building the project and running the unit tests.
Everything is nearly working, apart from the connection string that has to be set before running the tests. We are not able to differentiate the app.config file based on the running environment: it is fixed and (it seems) can contain only one reference (local or remote), so the connection to the database can only work locally or remotely.
Is there any possible way to adjust the connection string in the app.config on the fly, so that it points to the local db (on the user's laptop) while working locally and to the remote DEV database when the CI process is triggered by TFS?
Is it good practice to use two different files (i.e. local.app.config and remote.app.config)? If yes, how?
Or would you rather change its content based on the running environment? Or maybe there is an easier way in VS which I don't know about!
We really appreciate any advice/suggestion.
Thanks!
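One possible pattern, offered only as a sketch with placeholder names: keep the local connection string in app.config and let the TFS build agent override it through an environment variable, so the same code uses the local db on a laptop and the remote DEV database on the CI server. The setting name DbConnection and the variable name CI_DB_CONNECTION below are assumptions for illustration, not anything TFS defines for you.

Imports System.Configuration

Public Module ConnectionStringProvider

    ' Sketch: prefer a CI-supplied environment variable, fall back to app.config.
    Public Function GetConnectionString() As String
        ' On the TFS build agent this variable would be set to the remote DEV database.
        Dim fromCi As String = Environment.GetEnvironmentVariable("CI_DB_CONNECTION")
        If Not String.IsNullOrEmpty(fromCi) Then
            Return fromCi
        End If

        ' Locally the variable is absent, so the (local) app.config value is used.
        Return ConfigurationManager.ConnectionStrings("DbConnection").ConnectionString
    End Function

End Module

The unit tests would call GetConnectionString() instead of reading app.config directly, so nothing in the config file has to change between environments.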
So, not sure if this is stupid to ask, but I'm running a neo4j database server (using Apollo server) from my React application. Currently, I run it using node in a separate terminal (and I can navigate to it on localhost), then run npm start in a different terminal to get my application going. How can I get the database just up and running always, so that if customers use the product they can always access the database? Or, if this isn't good practice, how can I establish the database connection while I run my client code?
Technologies being used: ReactJS, Neo4j Database, GraphQL + urql
I tried moving the Apollo server code into the App.tsx file of my application to run it from there directly when my app is launched, but this was giving me errors. I'm not sure if this is the proper way to do it, as I think it should be abstracted out of the client code?
If you want to run your server in the cloud so that customers can access your React application, you need two things:
One server/service to run your database, e.g. Neo4j AuraDB (Free/Pro) or one of the cloud marketplace deployments: https://neo4j.com/docs/operations-manual/current/cloud-deployments/
A service to run your React application, e.g. Netlify, Vercel or one of the cloud providers (GCP, AWS, Azure), which you then have to configure with the server URL + credentials of your Neo4j server
You can run neo4j-admin dump --to database.dump on your local instance to create a copy of your database content and upload it to the cloud service. For 5.x the syntax is different, I think neo4j-admin database dump --path folder.
I had created a webpage with DB access which works perfectly well on the machine I developed it on. However, when I copy everything across to another machine, it fails with an error message - cannot open database xxx requested by the login. I did copy everything, including the database created in the development environment, and put it in the same place as on the development machine. What else have I done wrong? Please help. In addition, when I open the solution on the other machine, I cannot see the database in SQL Server Object Explorer like I could on the development machine.
Problem solved! I needed to copy the database and the log file across and then use SQL Server Management Studio to attach the database to the right server. After that, everything just works!
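For anyone who prefers to script that attach step rather than use the GUI, a rough VB.Net sketch follows; the instance name, database name and file paths are placeholders and need to be adjusted to the target machine.

Imports System.Data.SqlClient

Module AttachDatabase

    Sub Main()
        ' Connect to master on the target server first (".\SQLEXPRESS" is a placeholder instance).
        Using conn As New SqlConnection("Server=.\SQLEXPRESS;Database=master;Trusted_Connection=True")
            conn.Open()
            ' Attach the copied data and log files (placeholder paths and database name).
            Dim sql As String =
                "CREATE DATABASE [MyWebAppDb] " &
                "ON (FILENAME = 'C:\Data\MyWebAppDb.mdf'), " &
                "(FILENAME = 'C:\Data\MyWebAppDb_log.ldf') " &
                "FOR ATTACH"
            Using cmd As New SqlCommand(sql, conn)
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub

End Module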
I have written a VB.Net application that uses an SQL Express DB file containing a single table and a handful of stored procedures.
I have successfully built and exported the application to my VPS.
The problem comes in knowing what to do with the database file; there is a wealth of stuff online, but nothing that specifically suits my needs.
I plan to use LocalDB on the VPS, but since it is command-line only it is hard to know whether the scripts I have run have been successful after creating an instance, starting it, etc.
I want to keep installation requirements to an absolute minimum on my VPS machine (and, in time, on other end users' machines)... hence using LocalDB and not SQL Express.
So, what do I have to do on the VPS to enable my application to connect to the database? This was simple when it was Access: supply the MDB file and run the AccessDatabaseEngine redistributable, job done.
The connection on my development machine runs as expected.
The connection string in my code is:
Const strSQLConnection As String = "Data Source= (localdb)\v11.0;Database=SoccerTrader;Trusted_Connection=True"
Can anyone help please? This is driving me around the bend; surely it can't be that difficult?
===========================
I have found the following in an MSDN blog which says:
Database as a File: LocalDB connection strings support AttachDbFileName property that allows attaching a database file during the connection process. This lets developers work directly with databases instead of the database server. Assuming a database file (*.MDF file with the corresponding *.LDF file) is stored at “C:\MyData\Database1.mdf” the developer can start working with it by simply using the following connection string: “Data Source=(localdb)\v11.0;Integrated Security=true;AttachDbFileName=C:\MyData\Database1.mdf”.
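To make that concrete, here is a minimal VB.Net sketch of opening a LocalDB connection with AttachDbFileName; the file path assumes the C:\BESTBETSoftware\SoccerBot folder mentioned later in this question, and the query is only a sanity check, so treat it as an illustration rather than final code.

Imports System.Data.SqlClient

Module LocalDbCheck

    Sub Main()
        ' Attach the MDF directly via LocalDB (path assumed from the folder used on the VPS).
        Const strSQLConnection As String = "Data Source=(localdb)\v11.0;Integrated Security=true;AttachDbFileName=C:\BESTBETSoftware\SoccerBot\SoccerTrader.mdf"

        Using conn As New SqlConnection(strSQLConnection)
            conn.Open()
            ' Simple check that the attached database is reachable.
            Using cmd As New SqlCommand("SELECT DB_NAME()", conn)
                Console.WriteLine("Connected to: " & CStr(cmd.ExecuteScalar()))
            End Using
        End Using
    End Sub

End Module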
================ ADDED 12th June =====================
OK, this is really bugging me now... I have read around this till it is coming out of my ears and nothing specifically seems to target what I am trying to do. All the blogs I read refer to installing / running SQL Server and changing permissions etc.
As I have mentioned I am using a VPS and propose to use LocalDB on the VPS to access a simple/small database file for a VB.Net application I am writing.
This is the story so far.
1) I have built a working prototype on my development PC and connected using SQL Express to a database file SoccerTrader.mdf - no problem.
In the Visual Studio project properties I have added a prerequisite to the project that checks for SQL Server and, if it is missing, installs it.
2) I install the project on the VPS and, as expected, SQL Server 2012 LocalDB is installed... see here.
3) I have copied the SoccerTrader.MDF and SoccerTrader.LDF files into "C:\BESTBETSoftware\SoccerBot" on the VPS
4) For practical reasons, given the problems I am having getting this to work, I have implemented an InputBox so I can specify the connection string when the application runs. The connection strings I have used give the following:
(Screenshot of the results: http://i.stack.imgur.com/i2tro.png)
I have not changed any file permissions on the development PC and the database state is NOT read only....
So, the question is: where do I go from here? What have I missed? Why is it not working?
I have managed to sort the problem.
Seemingly, the connection string I was using was OK; it was my error handling that wasn't 'clean' enough. It transpired the connection was being made on my VPS, but when the application attempted to update the table, the directory I had created and put the MDF file into would not permit write access.
I moved the MDF into the C:\Users\Public\Documents folder and all works as it should.
You have to specify the full path of the DB file, including the folder name/IP address.
I developed an Access 2003 application that is connected to SQL Server.
My problem is that I developed the software on my server, and the application runs on the client network on a different (identical) server.
As a result, my executable file (i.e. the .ADE file) does not open on the client's computer because of a bad SQL Server connection.
My solution so far has been to open the application file (.ADP) on the client's computer, change the connection path from there and then create the executable file there.
Now my client has only the Access runtime environment, so I cannot do that.
I wonder if there is a way to set the connection in an ADE file in this situation.
(I know I can change it through VBA, but when the connection is wrong to begin with, I don't even get to the VBA code stage.)
In the interest of keeping things simple, I'll say you need to set up a testing environment you control that mimics your client's environment. For instance, if they have a SQL 2008 server named "SQL1", then you should install SQL 2008 Express on your machine and rename your machine to "SQL1" so you can test. You'd also need to copy the schema of their database tables and put that same schema in your own test database, and fill it with test data that is similar to theirs. And you'll want to create duplicate logins as well.
With all that in place, I wouldn't think you'd need to update anything. Just copy the ADE file over to your client when you're done making changes. You could try to code your way through this scenario, but I've been there and done that. Having a test environment that apes your client's takes a lot of headaches out of the equation.
I have an SSIS package that sets some variable data from a SQL Server Package Configuration table (selecting the "Specify configuration settings directly" option).
This works well when I'm using the database connection that I specified when developing the package. However, when I run it on a server (64-bit) in the testing environment (either as an Agent job or running the package directly) and I specify the new connection string in the Connection Managers, the package still reads the settings from the DB server that I specified in development.
All the other Connections take up the correct connection strings, it only seems to be the Package Configuration that reads from the wrong place.
Any ideas or am I doing something really wrong?
The only way I was able to do this was to use Windows Environment Variables. You can specify things like connection strings and user preferences in environment variables, and then pick up those environment variables from your SSIS Task.
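As an illustration, a Script Task written in VB.Net can copy such a variable onto a connection manager when the package runs; the variable name DEV_DB_CONN and the connection manager name "MyDatabase" are made-up names, so substitute your own. Note that a Script Task runs after package configurations have already been applied, so for the configuration connection itself the indirect environment variable configuration described further down is the usual route; this sketch covers connections used by the tasks themselves.

' Inside the Script Task's generated ScriptMain class.
Public Sub Main()
    ' Read the connection string from a Windows environment variable (placeholder name).
    Dim connString As String = Environment.GetEnvironmentVariable("DEV_DB_CONN")

    If Not String.IsNullOrEmpty(connString) Then
        ' Point the named connection manager at whatever the environment variable says.
        Dts.Connections("MyDatabase").ConnectionString = connString
    End If

    Dts.TaskResult = ScriptResults.Success
End Sub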
I prefer to use Server Aliases in the SQL Client Configuration. That way, when you decide to point the package to another SQL Server it is as simple as editing the alias to point to the new server, no editing necessary in the SSIS package. When moving the package to a live server, you need to add the aliases, and it works.
This also helps when you have a real painful naming convention for servers, the alias can be a more descriptive name than the actual machine name.
I didn't completely understand your question, but I store my connection settings in configuration files, usually one for each environment (dev, production, etc.). The packages read the connection settings from the config files when they are run.
When you're creating a job to call the SSIS package and you're setting up the step, there is a tabbed area. The default tab is where you set the package name, and the next tab over is where you can set the configuration file. Have a config file for each package, and change it for each server (dev, test, prod). The config files can be put directly on the dev, test and prod servers, and you then point to them when setting up the job.
If you are using a SQL Server package configuration then all the properties of the packages will come from the SQL Server table; please check that.
SSIS security the way it stands is terrible. No one will be able to support things when I am out of the office. The job never reads from the configuration file... I give up. It only works when I edit the string in the Data Sources tab. However, the password gets lost if you happen to go into the job a second time. Terrible design, absolutely horrible. You would think that when you specify an XML file in the job step it would read the connection string defined there, but it does not. Does this really work for anyone else?
Go to the package properties and set deployment to True. This should work for what you have done.
I had the identical question, and got the same answer, i.e. you cannot edit the connection string used for package configurations hosted in SQL Server, except if you specify that the SQL Server connection string should be in an environment variable.
This unfortunately does not work in my dev setup, where two environments are hosted on the same machine. I ended up following Scott Coleman's approach as detailed on SQL Server Central [Free sign-up and a good site]. The trick is that you create a view to store your configuration settings on one central server, and then use the machine that connects to it to determine which environment is active.
I used that approach, but also used the User connecting to the environment to make a determination, because my test and dev setups run on the same SSIS instance, but as different user names. Scott suggests in the comments that the application name should be set, but this cannot be changed in the package execution job step, so it was not an option.
One other caveat I found was that I had to add INSTEAD OF triggers to my view to handle the inserts, updates and deletes for configuration variables.
We want to keep our package configs in a database table; we know it gets backed up with our other data and we know where to find it. Just a preference.
I have found that to get this to work I can use an environment variable configuration to set the connection string of the connection manager that I am reading my package config from. (Although I had to restart the SQL Server Agent before it could find the new environment variable; not ideal when I deploy this to Production.)
Looks like when you run an SSIS package as a step in a scheduled job it works in this order:
Load each of the package configs in the order they appear in the Package Configurations Organiser
Set the connection strings from the Data Sources tab in the Job Step properties of the scheduled job
Start running the package.
I would have expected the first two to be the other way around, so that I could set the data source for my package config from the scheduled job. That is where I would expect other people to look for it when maintaining the package.