I want to make dynamic connections with Mongoose, using useDb to cache connections and avoid opening a new connection every time, but @nestjs/mongoose doesn't support this.
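What I have in mind is something like the following sketch in plain Mongoose (it relies on Mongoose's useDb() with its useCache option; the NestJS wiring is only hinted at in the comments):

const mongoose = require('mongoose');

// one base connection; its underlying socket pool is shared by every useDb() call
const baseConn = mongoose.createConnection('mongodb://localhost:27017/admin');

function getTenantDb(tenantName) {
  // useCache: true makes Mongoose return the same Connection object for
  // repeated calls with the same db name instead of creating a new one
  return baseConn.useDb(tenantName, { useCache: true });
}

// In NestJS I would wrap this in a provider and inject the base connection
// with @InjectConnection(), but @nestjs/mongoose has no built-in support for
// switching databases this way.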
We have a Dropwizard application that uses the default configuration provided by dropwizard-jdbi to connect to the database.
We use the following to get the SQL connection object:
Connection dbConnection = handle.getConnection();
I did a code walk-through and verified that the connections that are opened are closed.
But when I check v$session, I can see some inactive sessions still present that are not released for a long time.
I am using the default connection pool provided by Dropwizard.
Please let me know how to get the inactive sessions released.
What are your settings in the configuration file for Dropwizard?
If you have a look at http://www.dropwizard.io/1.3.0/docs/manual/configuration.html#database and then at the database section of the service configuration, there is an option for the number of connections to keep open:
# the minimum number of connections to keep open
minSize: 10
But most of the time you do want to keep some connections open, since this speeds up your application: it doesn't have to validate and connect to the database again and again for every call. That's one of the purposes of a connection pool.
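For example, the relevant pool settings look roughly like this (a sketch against the 1.x DataSourceFactory; the URL and credentials are placeholders, and you should check the docs for your exact version):

database:
  driverClass: oracle.jdbc.OracleDriver
  url: jdbc:oracle:thin:@db.example.com:1521/APPDB   # placeholder
  user: app
  password: secret
  # the minimum number of connections to keep open
  minSize: 10
  # the maximum number of connections to keep open
  maxSize: 32
  # periodically check idle connections and evict those idle longer than minIdleTime
  checkConnectionWhileIdle: true
  evictionInterval: 10s
  minIdleTime: 1 minute

Connections above minSize that sit idle longer than minIdleTime are evicted; the ones you see lingering in v$session are typically the minSize connections the pool keeps alive on purpose.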
I'm developing an Express-based application that queries different SQL Server 2008 and 2014 databases (different per user: each user belongs to a different company, and each company has its own SQL Server). My app uses its own SQL Server to manage the companies and their connection strings (my app has access to their database servers). I'm using the mssql module.
I've not found a best practice regarding "should I use one SQL Server connection per user session or one connection for each user request".
Coming from a .NET world we had a rule: "one query/function - one connection".
First, the app queries its own database to get the connection string for the user's company's SQL Server. The user can then retrieve data from their company's SQL Server (through my app), e.g. getAccounts(). Each of these functions (each function, not each request within that function!) opens a new connection and closes it after the query completes:
// inside new Promise((resolve, reject) => { ... })
let connection = new mssql.Connection(conStr, (err) => {
    if (err) return reject(err);
    // a Request has to be created from the connection
    let request = new mssql.Request(connection);
    request.query(queryString, (err, result) => {
        connection.close();
        if (err) return reject(err);
        resolve(result);
    });
});
As far as I understand, it should make no (negative) difference whether 100 users open and close a connection per request (assuming just one request per user at a time) or whether 100 users keep 100 connections open (one per user) for the whole session. At first glance my approach seems less resource-hungry, since connections are only open when they are needed (i.e., for a few seconds per request).
Am I missing something? What if 200 users access my app at the same time - will I get in trouble somehow?
Thanks in advance!
[EDIT]
As far as I understand,
let connection = new mssql.Connection(...)
will create a new connection pool which will open a new connection when I use something like
connection.connect()
and close all active connections with:
connection.close()
So I'm guessing that best practice in my scenario would be to create one connection pool (new mssql.Connection(..)) per active user, save it in some kind of session store and then reuse it throughout the lifetime of the session.
Is this a good approach?
I just want to avoid one thing: a user gets an error because a connection can't be created.
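To make the idea concrete, here is a sketch of the pooling approach described above, keyed by connection string rather than by user (users of the same company would then share one pool); it assumes the mssql v3-style API used in the snippets above, and getPool/getAccounts are just placeholder names:

const mssql = require('mssql');
const pools = new Map(); // connection string -> Promise<mssql.Connection>

function getPool(conStr) {
  if (!pools.has(conStr)) {
    const connection = new mssql.Connection(conStr);
    // store the connect() promise so concurrent callers share the same pool
    pools.set(conStr, connection.connect().then(() => connection));
  }
  return pools.get(conStr);
}

function getAccounts(conStr) {
  return getPool(conStr).then((connection) => {
    const request = new mssql.Request(connection);
    return request.query('SELECT * FROM Accounts'); // placeholder query
  });
}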
Can anyone help me understand how to keep a connection alive on Heroku using Sequelize? Currently I am following the instructions on the Sequelize site to establish a connection to the postgres DB on Heroku.
Here is the link that shows the current model/index.js file I am following.
http://sequelizejs.com/heroku
In this example from the site, every time we call global.db.User we have to re-establish a connection each time we want to access the DB. It would be better optimized if there were a way to keep a single connection alive during each client session.
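What I have in mind is something like this: a single Sequelize instance created once at startup with an explicit pool, so the models reuse its connections instead of reconnecting (a sketch; the pool numbers and the DATABASE_URL handling are my assumptions, not taken from the linked guide):

const Sequelize = require('sequelize');

// created once at startup and exported; Sequelize keeps an internal pool,
// so model calls reuse these connections instead of opening new ones
const sequelize = new Sequelize(process.env.DATABASE_URL, {
  dialect: 'postgres',
  pool: {
    max: 5,      // at most 5 connections in the pool
    min: 1,      // keep at least one connection open
    idle: 10000  // release a connection after 10s of inactivity
  }
});

module.exports = sequelize;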
I'm trying to understand the right way of using Session. If creating a session with
session = Session()
creates a new connection to the DB each time, then I should try to reuse my session for several transactions; otherwise, I can create sessions as frequently as I like.
Can you help me with this?
SQLAlchemy has built-in connection pooling for the engine that you make (a connection is reused if already available).
The "session" itself is bound to an engine:
# create a configured "Session" class
Session = sessionmaker(bind=some_engine)
# create a Session
session = Session()
Therefore, the session will use the default connection pooling automatically; you don't need to do anything special to gain the benefits of this feature.
You can read more about how it works (and, if you wish, how to tune the connection pool to modify values such as connection timeout and pool size) in the documentation.
If you have a class that services requests from other classes for database data, when should you hold onto the database connection, and when should you close it and reopen it on the next request?
What if it's a service that responds to connections from external applications? (Web service, Ajax, rpc)
Is it a good idea to hold a singleton connection to the database which is always open, and just reopen it on failure? Or should you open a new database connection for every request?
If maintaining a singleton database object with an always-open connection to the database is a bad idea, are there any circumstances where it's a good idea? I've often seen this cited as a justification for the Singleton pattern.
I'm not talking about a new connection per database query; that would be silly.
You probably want to take a look at connection pooling.
In this scenario, N connections are opened and made available to clients. When you 'close' the connection, the connection itself is not closed, but returned to the pool for use by another client.
Apache DBCP is a useful library for managing this.