NodeJS mssql multiple connections from different users - sql-server

I'm developing an express-based application that makes queries to different (different per user!) SQL Server 2008 and 2014 databases. They are different because each user belongs to a different company, and each company has its own SQL Server. My app uses its own SQL Server to manage the companies and their SQL Server connection strings (my app has access to their database servers). I'm using the mssql module.
I've not found a best practice regarding "should I use one SQL Server connection per user session or one connection for each user request".
Coming from a .NET world we had a rule: "one query/function - one connection".
First, the app queries its own database to get the connection string for the database of the user's company. The user can then retrieve data from their company's SQL Server (within my app) - for example getAccounts(). Each of these functions (each function - not each request within that function!) opens a new connection and closes it after the query completes:
let connection = new mssql.Connection(conStr, (err) => {
    if (err) return reject(err);        // connection itself failed
    let request = new mssql.Request(connection);
    request.query(queryString, (err, result) => {
        connection.close();             // close on both the error and success path
        if (err) return reject(new Error('...'));
        resolve(result);
    });
});
As far as I understand, it should make no (negative) difference whether 100 users open and close one connection per request (assuming just one request per user at a time) or 100 users hold 100 open connections (one per user) for the whole session. At first glance my approach seems less resource hungry, since connections are only opened when they are needed (i.e., for a few seconds per request).
Am I missing something? What if 200 users access my app at the same time - will I get in trouble somehow?
Thanks in advance!
[EDIT]
As far as I understand,
let connection = new mssql.Connection(...)
will create a new connection pool which will open a new connection when I use something like
connection.connect()
and close all active connections with:
connection.close()
So I'm guessing that best practice in my scenario would be to create one connection pool (new mssql.Connection(..)) per active user, save it in some kind of session store and then reuse it throughout the lifetime of the session.
Is this a good approach?
I just want to avoid one thing: a user gets an error because a connection can't be created.
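For what it's worth, the pool-per-tenant idea can be sketched as a small cache keyed by connection string, so all users of the same company share one pool rather than each user (or each query) creating their own. This is only a sketch: `getPool` and the injected `createPool` factory are hypothetical names, not part of the mssql API; with the mssql module the factory would be something like `conStr => new mssql.Connection(conStr)`.

```javascript
// Hypothetical per-tenant pool cache: one pool per connection string,
// created lazily on first use and reused for every later request.
const pools = new Map();

// `createPool` is injected so the cache itself is driver-agnostic and testable;
// in a real app it would wrap the mssql module, e.g.
//   conStr => new mssql.Connection(conStr)
function getPool(conStr, createPool) {
  if (!pools.has(conStr)) {
    pools.set(conStr, createPool(conStr)); // first caller for this tenant creates the pool
  }
  return pools.get(conStr);               // everyone else reuses it
}
```

On shutdown you would still want to iterate the map and close each pool; the sketch leaves that out.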

Related

How To Access logs of the HTTP Requests in IIS or Query logs sent from Entity Framework to SQL Server?

I created a simple website with a login page that authenticates users with only an email address.
I did not store any login logs, nor did I enable any logging functionality in IIS.
Now, people are asking for the list of email addresses that logged in during the past two days.
I used EntityFramework to connect to the database. This is the authentication method:
public static bool Authenticate(string email)
{
    using (var db = new DatabaseEntities())
    {
        var user = db.Users.Where(x => x.Email.ToLower() == email.ToLower()).FirstOrDefault();
        if (user != null)
            return true;
    }
    return false;
}
I have tried this in SQL, but nothing relevant shows up:
SELECT t.[text]
FROM sys.dm_exec_cached_plans AS p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) AS t
Is there a way to access the logs in SQL for every time this line of code was executed?
var user = db.Users.Where(x => x.Email.ToLower() == email.ToLower()).FirstOrDefault();
Perhaps there is a way to access the logs for every post request sent to the IIS server?
If you didn't already enable a logging feature in IIS or SQL Server, then unfortunately there is nothing you can do retroactively.
Moving forward, you can fairly easily enable out-of-the-box SQL Server features to log user connections, via either SQL Server Audit or a logon trigger that stores the log in a table.
This article lists a few other methodologies (in addition to what I mentioned above) such as using a Trace.
Unfortunately, the queries sent by the application to the database cannot be viewed in IIS. These query records can only be viewed through SQL Server.
The logging module in IIS records the communication between the client and the server, which includes the URL, status code, and time.
Even the Failed Request Tracing module cannot capture SQL Server query records - only the communication with SQL Server, such as request/response time and success or failure.

SQL Server 2014 - Service Broker: maximum Services

I am using SQL Server 2014 using FireDAC in Delphi XE7 to connect to the database.
We need an event that automatically opens a form when data changes in a particular table. For that we found the TFDEventAlerter, which we used to create a queue and a service for each user.
UserEvent.Names.Add('QUEUE=qUserEvent');
UserEvent.Names.Add('SERVICE=s' + Username);
UserEvent.Names.Add('CHANGE1=usr;SELECT ID FROM dbo.MsgBox WHERE Status = ''A''');
So we have one queue and a lot of services listening on that queue. In general this setup is working fine.
But when a lot of users (550 in my case) connect to the database and add new services to the queue, we run into bad performance caused by thread-pool starvation, as each service blocks a worker thread from time to time.
So does anybody know why there is a limitation on the number of services for Service Broker in SQL Server 2014?
Is there another way to use the TFDEventAlerter with 500 users without creating 500 services? It seems to me that we are not using the TFDEventAlerter as it is meant to be used.

Multi connection to SQL Server database

I have been facing an issue since yesterday. I have a SQL Server instance hosted at a remote provider. The provider claims there can be at most 5 connections at the same time.
I also have my own app that I developed. Until now there was only one user using the application, and there was no problem connecting to the database.
Now we have an additional user who will be working with the app from another location. The problem is that when the first user is logged in and using the program, the second user gets a message on the login form saying the app cannot connect to SQL Server - and only after the application hangs for about 30 seconds.
It seems there can be only one connection at a time, not 5. Can you advise anything, or is there a test I can do to verify this and find out where I stand?
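One way to test where the cap actually sits is to fire several connection attempts in parallel and count how many succeed. A rough sketch (in Node.js for illustration; `tryConnect` is a hypothetical stand-in for whatever your client uses to open and hold a connection - the point is only the counting logic):

```javascript
// Probe a suspected connection cap: launch n connection attempts in parallel
// and return how many succeeded. `tryConnect` should open AND HOLD a
// connection (resolving on success, rejecting on failure) so the attempts
// really overlap; it is a placeholder, not a real driver call.
async function probeConnectionLimit(n, tryConnect) {
  const attempts = [];
  for (let i = 0; i < n; i++) {
    // Map success to true and failure to false so Promise.all never rejects.
    attempts.push(tryConnect().then(() => true).catch(() => false));
  }
  const results = await Promise.all(attempts);
  return results.filter(ok => ok).length;
}
```

If the provider's claim is right, probing with, say, 8 parallel attempts should report 5 successes; a result of 1 would confirm the single-connection suspicion.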

Codeigniter Multi Tenant Takes long time in loading tenant database

I am developing a multi-tenant app in CodeIgniter, where every tenant has its own database. At run time I find the tenant name and then load its DB info from my master database. A function in My_Model establishes the connection to the tenant (slave) database:
function getDbConFig() {
    // Note: the DSN separator before the host must be '@', not '#'
    $dsn = 'mysql://'.$this->dbs_user.':'.$this->dbs_pwd.'@'.$this->dbs_dbhost.'/'.$this->dbs_dbnam;
    if (!empty($this->dbs_user) && !empty($this->dbs_dbhost) && !empty($this->dbs_dbnam)) {
        $this->db_slave = $this->load->database($dsn, TRUE);
    }
}
Everything is working fine, but the problem is that it takes a very long time to establish the slave database connection.
Any help will be appreciated.
IMHO, rather than using getDbConFig at the model level, you should consider using it at the data-access-layer level. Did you check whether the connection is coming from the connection pool? You could consider increasing the connection-pool recycling frequency. Also, you should not be holding two connections open at any point in time.
Post your details here for a more detailed discussion.

Automatic failover with SQL mirroring and connection strings

I have 3 servers set up for SQL mirroring and automatic failover using a witness server. This works as expected.
Now my application that connects to the database seems to have a problem when a failover occurs - I need to intervene manually and change connection strings for it to connect again.
The best solution I've found so far involves the Failover Partner parameter of the connection string, but it's neither intuitive nor complete: Data Source="Mirror";Failover Partner="Principal" (found here).
From the example in the blog above (scenario #3) when the first failover occurs, and principal (failover partner) is unavailable, data source is used instead (which is the new principal). If it fails again (and I only tried within a limited period), it then comes up with an error message. This happens because the connection string is cached, so until this is refreshed, it will keep coming out with an error (it seems connection string refreshes ~5 mins after it encounters an error). If after failover I swap data source and failover partner, I will have one more silent failover again.
Is there a way to achieve fully automatic failover for applications that use mirroring databases too (without ever seeing the error)?
I can see potential workarounds using custom scripts that would poll currently active database node name and adjust connection string accordingly, however it seems like an overkill at the moment.
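If you did go the polling route, the bookkeeping involved is actually small. A minimal sketch (Node.js for illustration; `makeFailoverWatcher` and the principal-name polling are hypothetical - in a real setup the poll would query each node's mirroring role, e.g. from sys.database_mirroring, in whatever language the app uses):

```javascript
// Hypothetical failover watcher: remembers which node is currently principal
// and rebuilds the connection string when a poll reports a different one.
function makeFailoverWatcher(initialPrincipal, buildConnStr) {
  let current = initialPrincipal;
  return {
    // Connection string for the node currently believed to be principal.
    connStr: () => buildConnStr(current),
    // Feed this the freshly polled principal name (e.g. from a setInterval
    // loop). Returns true when a failover was detected, so the caller knows
    // to clear pools / reconnect.
    update(principalName) {
      const changed = principalName !== current;
      if (changed) current = principalName;
      return changed;
    },
  };
}
```

The `update` return value is the hook for the "clear the connection pool after failover" step mentioned below; the sketch deliberately leaves the actual polling and reconnection to the caller.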
Read the blog post here
http://blogs.msdn.com/b/spike/archive/2010/12/15/running-a-database-mirror-setup-with-the-sqlbrowser-service-off-may-produce-unexpected-results.aspx
It explains what is happening: the failover partner is actually being read from SQL Server, not from your config. Run the query in that post to find out what is actually being used as the failover server. It will probably be a machine name that is not resolvable from where your client is running.
You can clear the application's connection pools when a failover has happened. Not very nice, I know ;-)
// ClearAllPools resets (or empties) the connection pool.
// If there are connections in use at the time of the call,
// they are marked appropriately and will be discarded
// (instead of being returned to the pool) when Close is called on them.
System.Data.SqlClient.SqlConnection.ClearAllPools();
We use it when we change an underlying server via SQL Server alias, to enforce a "refresh" of the server name.
http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlconnection.clearallpools.aspx
One solution is to turn connection pooling off with Pooling=false in the connection string.
While this has minimal impact on small applications, I haven't tested it with applications that receive hundreds of requests per minute (or more), and I'm not sure what the implications are. Anyone care to comment?
Try this connectionString:
connectionString="Data Source=[MSSQLPrincipalServerIP,MSSQLPORT];Failover Partner=[MSSQLMirrorServerIP,MSSQLPORT];Initial Catalog=DatabaseName;Persist Security Info=True;User Id=userName;Password=userPassword;Connection Timeout=15"
If you are developing in .NET, you can try ObjAdoDBLib, or PigSQLSrvLib and PigSQLSrvCoreLib, which simplify the code.
Example code:
Create the connection object.
With ObjAdoDBLib:
Me.ConnSQLSrv = New ConnSQLSrv(Me.DBSrv, Me.MirrDBSrv, Me.CurrDB, Me.DBUser, Me.DBPwd, Me.ProviderSQLSrv)
With PigSQLSrvLib or PigSQLSrvCoreLib:
Me.ConnSQLSrv = New ConnSQLSrv(Me.DBSrv, Me.MirrDBSrv, Me.CurrDB, Me.DBUser, Me.DBPwd)
Then execute this method to automatically connect to the online database after the mirror database fails over:
Me.ConnSQLSrv.OpenOrKeepActive
For more information, see the relevant links.
https://www.nuget.org/packages/ObjAdoDBLib/
https://www.nuget.org/packages/PigSQLSrvLib/
https://www.nuget.org/packages/PigSQLSrvCoreLib/
