Can connectionstring cross over to other sites on the same server? - connection-string

I ran across a new problem in the last week. Due to the nature of my project and the available budget, the small intranet web application I've been working on runs on a single machine that serves as both the test and live server, hosts the pages, and runs SQL Server. This will last at least until the project is out of the major development cycle. Now that the project has real users while I continue development, I duplicated the database so I have a safe copy to experiment with that won't wreak havoc on live business data, along with a development copy of the website.
All was well until I discovered an anomaly on the test copy of the site: anything that uses a SqlDataSource was correctly pulling its data from the test database, but anything that gets its data from a stored procedure triggered in the code-behind was pulling its data from the live database.
My confusion comes from the fact that all stored procedures and SqlDataSources ultimately point back to the same connectionString setting in web.config to know where to connect. I just change the database name depending on whether I'm uploading the latest changes to the test or the live site.
My question comes down to this: with one connection string in each site, why does my test site get its data from one database when it accesses data one way, and from the other database when it accesses data the other way?
Here's the connection string they all point back to; names and passwords have of course been changed, but the structure is intact.
<add name="db_Connection" connectionString="Data Source=SERVERNAME;Initial Catalog=DATABASE_live;Persist Security Info=False;User ID=USERID;Password=password" providerName="System.Data.SqlClient"/>
I added a key to appSettings holding the name of the database connection so I can easily change it if need be without having to edit dozens of pages of code-behind stored procedure calls.
<add key="defaultDB" value="db_Connection" />
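To illustrate the intent (a simplified C# sketch rather than my exact code, since the project itself is VB, and the stored procedure name is just a placeholder), the key lets the code-behind look up which connection string to use by name:

string connName = System.Configuration.ConfigurationManager.AppSettings["defaultDB"];
string connStr = System.Configuration.ConfigurationManager.ConnectionStrings[connName].ConnectionString;

using (System.Data.SqlClient.SqlConnection conn = new System.Data.SqlClient.SqlConnection(connStr))
using (System.Data.SqlClient.SqlCommand cmd = new System.Data.SqlClient.SqlCommand("dbo.SomeReport", conn))
{
    // "dbo.SomeReport" is a placeholder; each page calls its own stored procedure.
    cmd.CommandType = System.Data.CommandType.StoredProcedure;
    conn.Open();
    // ... execute and bind the results
}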
Am I violating some rule I'm unaware of or is there something else going on that I need to be aware of and change so I can have a true test environment as I continue to develop an active site?
EDIT: This project is in ASP.NET 2.0 (VB); fixed the code display.
Solution found: I have tracked down the solution; thanks for the pointers, they got me looking elsewhere. When I copied the site to a different location for testing, I forgot to update my appSettings key for the site's location, which apparently caused the following part of the stored procedure call to pull its data via the live site's web.config:
System.Web.Configuration.WebConfigurationManager.OpenWebConfiguration(pubvar_webConfig)
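To make the failure mode concrete, here is a C# sketch of the idea (the real code is VB and the virtual directory names below are made up): OpenWebConfiguration resolves whatever virtual path it is given, so a stale path quietly hands back the live site's connection string, while the SqlDataSource controls keep resolving against the local web.config.

// pubvar_webConfig still held the live site's virtual path after the copy.
string pubvar_webConfig = "/IntranetLive";   // should have been "/IntranetTest"

System.Configuration.Configuration config =
    System.Web.Configuration.WebConfigurationManager.OpenWebConfiguration(pubvar_webConfig);

// This reads db_Connection from the web.config found at that path -- the live
// site's file -- so the stored procedure calls connected to DATABASE_live.
string connStr =
    config.ConnectionStrings.ConnectionStrings["db_Connection"].ConnectionString;

// The SqlDataSource controls, by contrast, resolve the usual
// <%$ ConnectionStrings:db_Connection %> expression against the current
// application's own web.config, which is why they correctly hit the test database.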

Change the username and password on the dev database. If your problem persists then you might have a connection string set somewhere else that you don't know about.

I would search all of the files in your solution to make sure you don't have one of the database names hard coded some place. Maybe in the designer files?

It may be worth running the two applications in different app pools in IIS (if you aren't already, of course!). This should eliminate any concurrency issues between the test and production sites at the application level.
IMHO, with a shared test/production environment, separate app pools are good practice at any time.
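For what it's worth, on IIS 7 or later the split can also be scripted with the Microsoft.Web.Administration assembly; a rough sketch (the site, application, and pool names below are placeholders, and on IIS 6 you would do the same thing through the IIS Manager UI):

using (Microsoft.Web.Administration.ServerManager manager = new Microsoft.Web.Administration.ServerManager())
{
    // Create a dedicated pool for the test copy if it doesn't exist yet.
    if (manager.ApplicationPools["IntranetTestPool"] == null)
        manager.ApplicationPools.Add("IntranetTestPool");

    // Move only the test application into that pool; the live app keeps its own.
    Microsoft.Web.Administration.Site site = manager.Sites["Default Web Site"];
    site.Applications["/IntranetTest"].ApplicationPoolName = "IntranetTestPool";

    manager.CommitChanges();
}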

Related

Can a MS-Access front-end be opened simultaneously from different computers if stored on a shared network drive?

I have a quick question: if both my MS-SQL back-end and MS-Access front-end are installed on a shared network drive, can the front-end be opened from different computers and the data still be synced correctly on the server? I am asking because we are getting more employees located in different buildings, and if this works it would be easier to manage than walking everywhere to install the front-end.
Two people opened the front-end today and put in clinical cases simultaneously. When they opened it again, everything seemed to have synced, but could something wrong be happening? I am not sure.
if ... my ... MS-Access front-end [is] installed on a shared network drive, can the front-end be opened from different computers and the data still be synced correctly on the server?
What you seem to be describing is more than one concurrent user directly opening the same copy of a front-end file, e.g., one stored in a shared folder like
\\servername\sharename\path\to\frontend.accdb
If so, then that is extremely bad practice. Every user must have their own private copy of the front-end file on their local hard drive. That is what the other answer meant when they said
you only need to have them copy the front-end file to their computer - there's no installation required
In your question you said two users apparently opened the same copy of the front-end and made changes to the data, and nothing bad seems to have happened. They were lucky. Sometimes it works and other times it doesn't, but in my experience it's not a matter of if it will cause problems, it's a matter of when.
Yes it should work OK. If the network connection between the front end and back end database will work from the user's individual computer, then you only need to have them copy the front-end file to their computer - there's no installation required.

changing URL in joomla

I have looked through the existing answers and I am not sure which will work for me, so I am writing the question again, hoping for a response that might work.
Here : http://norstore-trd-bio0.hpc.ntnu.no/tfcp/
I have a database on server and I am using Joomla to create the website and manage database.
This was a test version; now I want to keep the database where it is but use the new URL for the website, http://tfcheckpoint.org/, bought from godaddy.com.
Following GoDaddy's instructions, I first tried masking the website, which is kind of a redirect from tfcheckpoint.org to the original site, as explained here:
http://support.godaddy.com/help/article/422/forwarding-or-masking-your-domain-name
BUT this caused some links to be broken and no longer accessible, since the external URL stays fixed.
Secondly, I looked for other solutions: I took a MySQL dump, replaced the old URL with the new one everywhere, and imported the SQL file again, but nothing changed, probably because the new site is masked and I would have to unmask it. Or should I install Joomla again and start from scratch with the new URL?
Is there a proper way to do this? I don't think moving the database to the new hosting server is the solution, as many people keep their data on local servers while using these fancy domain names.
Please guide me if there is a standard procedure to do so.
As far as I can tell from your question, you are merely trying to move a Joomla site from one host's server to another? You may even be moving between hosting companies... my answer still stands either way.
Firstly, I have heard of many issues with GoDaddy for Joomla sites, but that is probably an answer for a different question.
In terms of moving your test site and "turning" it into a live site on a new server, this can be done with a 3rd party tool like Akeeba Backup.
For Joomla's settings in the new location, you will need to change the details to match the new host's database and file locations. This is done through the site restoration process (made very easy in Akeeba). There is also a LiveSite setting you may have set, which would need to be changed manually in configuration.php under the new location of your Joomla files.
In terms of DNS settings, your new domain should point to the new hoster's server and a 301 redirect against the old domain pointing to the new one should cover all the bases.
I hope that answers your question. If not, I suggest rewriting the question to be clearer; you will probably get more answers.

How to determine at runtime if I am connected to production database?

OK, so I did the dumb thing and released production code (C#, VS2010) that targeted our development database (SQL Server 2008 R2). Luckily we are not using the production database yet so I didn't have the pain of trying to recover and synchronize everything...
But, I want to prevent this from happening again when it could be much more painful. My idea is to add a table I can query at startup and determine what database I am connected to by the value returned. Production would return "PROD" and dev and test would return other values, for example.
If it makes any difference, the application talks to a WCF service to access the database so I have endpoints in the config file, not actual connection strings.
Does this make sense? How have others addressed this problem?
Thanks,
Dave
The easiest way to solve this is to not have access to production accounts. Those are stored in the machine.config file for our .NET applications. In non-.NET applications this is easily duplicated by having a config file in a common location, or (dare I say) a registry entry which holds the account information.
Most of our servers are accessed through aliases too, so no one really needs to change the connection string from environment to environment. Just grab the user from the config, and the server alias in the hosts file points you to the correct server. This also removes the headache of having to update all our config files when we switch DB instances (change hardware, etc.).
So even with the ClickOnce deployment and the endpoints, you can publish a new endpoint URI in a machine.config on the end user's desktop (I'm assuming this is an internal application) and then reference that in the code.
If you absolutely can't do this, because it might be a lot of work (the last place I worked had 2000 call center people, so this push was a lot more difficult, but still possible), you can always set up an automated build server which modifies the app.config file for you as the last step of building the application. You then ALWAYS publish the compiled code from the automated build server. Never make the app.config change for something like this a manual step in the developer's process; that will always lead to problems at some point.
Now if none of this works, your final option (I've done this one too), which I hated but which worked, is to look up the value off a mapped drive. Essentially, everyone in the company has a drive mapped to, say, R:. This is where you keep your production configuration files, etc. The production account people map that drive to one location with the production values, and the devs etc. map to another location with the development values. I hate this option compared to the others, but it works, and it can save you in a pinch when the others become tedious and difficult (due to, say, office politics or setting up a build server).
I'm assuming your production server has a different name than your development server, so you could simply SELECT @@SERVERNAME AS ServerName.
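A minimal guard along those lines might look like this (a sketch, not drop-in code: the "PROD-SQL01" name and the decision to throw are assumptions, and since the client only sees WCF endpoints the check would live in the service):

using System;
using System.Data.SqlClient;

static class EnvironmentGuard
{
    // Fail fast at service startup if we are pointed at the production server.
    public static void AssertNotProduction(string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("SELECT @@SERVERNAME", conn))
        {
            conn.Open();
            string serverName = (string)cmd.ExecuteScalar();

            if (string.Equals(serverName, "PROD-SQL01", StringComparison.OrdinalIgnoreCase))
                throw new InvalidOperationException(
                    "This build is talking to the production database server: " + serverName);
        }
    }
}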
Not sure if this answer helps you in an assumed .NET environment, but within a *nix/PHP environment, this is how I handle the same situation.
OK, so I did the dumb thing and released production code
There are times when some app behavior is environment dependent, as you alluded to. In order to provide the ability to check between development and production environments, I added the following line to the global /etc/profile.d/custom.sh config (CentOS):
SERVICE_ENV=dev
And in code I have a wrapper method which grabs an environment variable by name and localizes its value, making it accessible to my application code. Below is a snippet demonstrating how to check the current environment and react accordingly (in PHP):
public function __call($method, $params)
{
    // Reduce chatter on production envs:
    // only display debug messages if an override told us to.
    if (($method === 'debug') &&
        (CoreLib_Api_Environment_Package::getValue(CoreLib_Api_Environment::VAR_LABEL_SERVICE) === CoreLib_Api_Environment::PROD) &&
        (!in_array(CoreLib_Api_Log::DEBUG_ON_PROD_OVERRIDE, $params))) {
        return;
    }
}
Remember, you don't want to pepper your application logic with environment checks, save for a few extreme use cases like the one demonstrated in the snippet. Rather, you should control access to your production databases using DNS. For example, within your development environment the database hostname mydatabase-db would resolve to a local server instead of your actual production server, and when you push your code to the production environment DNS will resolve the hostname to the production server, so your code should "just work" without any environment checks.
After hours of wading through textbooks and tutorials on MSBuild and app.config manipulation, I stumbled across something called SlowCheetah - XML Transforms (http://visualstudiogallery.msdn.microsoft.com/69023d00-a4f9-4a34-a6cd-7e854ba318b5) that did what I needed it to do less than an hour after I first found it. Definitely recommended! From the article:
This package enables you to transform your app.config or any other XML file based on the build configuration. It also adds additional tooling to help you create XML transforms.
This package is created by Sayed Ibrahim Hashimi, Chuck England and Bill Heibert, the same Hashimi who authored THE book on MSBuild. If you're looking for a simple, ubiquitous way to transform your app.config, web.config or any other XML file based on the build configuration, look no further -- this VS package will do the job.
Yeah I know I answered my own question but I already gave points to the answer that eventually pointed me to the real answer. Now I need to go back and edit the question based on my new understanding of the problem...
Dave
I'm assuming your production server has a different IP address. You can simply use
SELECT CONNECTIONPROPERTY('local_net_address') AS local_net_address

SQLite vs. SQLCE Deployment

I am in the process of writing an offline-capable smart client that will have the ability to sync back to the main backend when a connection can be made. As a side note, I considered the Microsoft Sync Framework, but since I'm really only going one way I didn't feel it would buy me enough to justify it.
The question I have relates to SQLite vs. SQLCE and ClickOnce deployments. I've dealt with SQLite before (impressive little tool) and I've dealt with ClickOnce, but never together. If I set up an installer for my app via ClickOnce, how do I ensure the local database doesn't get wiped out during upgrades? Is it possible to upgrade the database (table structure, etc., if necessary) as part of the installer? Or is it better to use SQLCE for something like this? I definitely don't want to go the route of installing SQL Express or anything like that, as the overhead would be far too high for what I am doing.
I can't speak about SQLite, having never deployed it, but I do have some info about SQLCE.
First, you don't have to deploy it as a prerequisite. You can just include the DLLs in your project; you can check this article, which explains how. This gives you fine-grained control over what version is being used, and you don't have to deal with installing it per se.
Second, I don't recommend that you deploy the database as a data file and let ClickOnce manage it. When you change that file, ClickOnce will publish it again and put it in the data directory. Then it will take the previous one and put it in the \pre subfolder, and if you have no code to handle that, your user will lose his data. So if you open the database file to look at the table structure, you might be unpleasantly surprised to get a phone call from your user about the data being gone.
If you need to retain the data between updates, I recommend you move the database to the [LocalApplicationData] folder the first time the application runs, and reference it there. Then if you need to do any updates to the structure, you can do them programmatically and control when they happen. This article explains how to do this and why.
The other advantage to putting the data in LocalApplicationData is that if the user has a problem and has to uninstall and reinstall the application, his data is retained.
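A bare-bones sketch of that first-run move (the folder and file names are placeholders, and it assumes the .sdf was deployed alongside the executable rather than as a ClickOnce data file):

using System;
using System.IO;

static class LocalDatabase
{
    // Copy the deployed database into LocalApplicationData on first run and
    // always open it from there, so ClickOnce updates never touch the data.
    public static string GetPath()
    {
        string folder = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "MySmartClient");
        Directory.CreateDirectory(folder);   // no-op if it already exists

        string target = Path.Combine(folder, "offline.sdf");
        if (!File.Exists(target))
        {
            // The copy ClickOnce installed next to the executable.
            string deployed = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "offline.sdf");
            File.Copy(deployed, target);
        }

        return target;   // use this path in the SqlCeConnection's Data Source
    }
}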
Regardless of the embedded database you choose, your database file (.sqlite or .sdf) will be part of your project, so you will be able to use the "Build Action" and "Copy to Output Directory" properties of that file to control what happens with it during an install/update.
If you choose "Do not copy", the database file will not be copied, and if you choose "Copy if newer", it will only be copied when you have a new version of the database file.
You will need to experiment a little but by using these two properties you can have full control of how and when your database file is deployed/updated...

JSP website pre-database configuration

I'm working on a website in JSP (in GWT really, but on the server side, it's really just JSP), and I need to configure my database.
I know HOW to code the database connection and so on, but I'm wondering how/where the database config should be saved.
To clarify my doubt, let me give an example: in PHP, a website usually has a config.php where the user configures the database, user, etc. (or an install.php generates it).
However, since JSP is compiled to bytecode, I can't code this info into my site and have the user modify it, nor can I modify it analogously to an install.php.
How should I handle this? What's the best/most common practice? I've found NO examples of this. Mainly, where should the config file be stored?
There are several possibilities to do this, what I've seen done include:
Having the database credentials in a dedicated file, usually db.properties or some simple XML file, that contains the required information (driver, URL, username, password, any ORM parameters if needed). The properties file would be placed under WEB-INF or WEB-INF/classes; the downside of this approach is that the user has to modify the file inside the WAR before deploying it to the application server.
Acquiring the database connection via JNDI and expecting it to be provided by the application server. This seems to be the most common way of doing it; on the upside, your WAR doesn't have to be changed. However, the downside is that configuring a JNDI data source is different for every application server and may be confusing if your system administrators are not experienced with Java technology.