I have a WinForms project consisting of 10 forms. The forms open one another, and each form works with the database
(I'm using LINQ to SQL). What is the best way to create the database connection? Currently I set up the connection when the main form loads:
string path = @"Data Source=|DataDirectory|\Database.sdf";
Database db = new Database(path);
and then pass the db object into each window's constructor as it is loaded, and that window works with the database:
Window1 win1 = new Window1(db);
Is there a better way?
Try using a static class for your db operations.
Well for one, you should abstract your database logic and connections into their own class or assembly. The connection string shouldn't be hard coded; it should come from the app.config file.
I don't think you should be passing the connection object around.
As I stated above, abstract out your database code and wrap the context in a using statement.
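A minimal sketch of that idea (the Database context, the Customers table, and the MainDb connection string name are placeholders for whatever your LINQ to SQL model actually defines):

// app.config
// <connectionStrings>
//   <add name="MainDb" connectionString="Data Source=|DataDirectory|\Database.sdf" />
// </connectionStrings>

using System.Configuration;
using System.Linq;

public static class DataAccess
{
    private static readonly string ConnectionString =
        ConfigurationManager.ConnectionStrings["MainDb"].ConnectionString;

    public static Customer GetCustomer(int id)
    {
        // each call creates and disposes its own context instead of sharing one across forms
        using (var db = new Database(ConnectionString))
        {
            return db.Customers.SingleOrDefault(c => c.Id == id);
        }
    }
}

The forms then call DataAccess methods directly, so no connection or context needs to be passed through constructors.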
Our team's application development involves using the Effort testing tool to mock our Entity Framework DbContext. However, it seems that Effort needs to see the actual SQL Server database the application uses in order to mock our DbContext, which appears to go against proper unit testing principles.
The reason being that in order to unit test our application code by mocking anything related to database connectivity (for example Entity Framework's DbContext), we should never need a database to be up and running.
How would I configure Effort to mock Entity Framework's DbContext without the actual SQL Server database up and running?
Update:
@gert-arnold We are using the Entity Framework Model First approach to implement the back-end model and database.
The following excerpt is from the test code:
connection = Effort.EntityConnectionFactory.CreateTransient("name=NorthwindModel");
jsAudtMppngPrvdr = new BlahBlahAuditMappingProvider();
fctry = new BlahBlahDataContext(jsAudtMppngPrvdr, connection, false);
qryCtxt = new BlahBlahDataContext(connection, false);
audtCtxt = new BlahBlahAuditContext(connection, false);
mockedReptryCtxt = new BlahBlahDataContext(connection, false);
_repository = fctry.CreateRepository<Account>(mockedReptryCtxt, null);
_repositoryAccountRoleMaps = fctry.CreateRepository<AccountRoleMap>(null, _repository);
The "name=NorthwindModel" pertains to our edmx file which contains information about our Database tables
and their corresponding relationships.
If I remove the "name=NorthwindModel" by making the connection like the following line of code, I get an error stating that it expects an argument:
connection = Effort.EntityConnectionFactory.CreateTransient(); // throws error
Could you please explain how the aforementioned code should be rewritten?
You only need that connection string because Effort needs to know where the EDMX file is.
The EDMX file contains all the information required to create an in-memory store with a schema identical to the one in your database. You only have to specify a connection string because I thought it would be convenient if the user didn't have to mess with EDMX paths.
If you check the implementation of the CreateTransient method, you will see that it merely uses the connection string to get its metadata part.
public static EntityConnection CreateTransient(string entityConnectionString, IDataLoader dataLoader)
{
    var metadata = GetEffortCompatibleMetadataWorkspace(ref entityConnectionString);
    var connection = DbConnectionFactory.CreateTransient(dataLoader);
    return CreateEntityConnection(metadata, connection);
}

private static MetadataWorkspace GetEffortCompatibleMetadataWorkspace(ref string entityConnectionString)
{
    entityConnectionString = GetFullEntityConnectionString(entityConnectionString);
    var connectionStringBuilder = new EntityConnectionStringBuilder(entityConnectionString);
    return MetadataWorkspaceStore.GetMetadataWorkspace(
        connectionStringBuilder.Metadata,
        metadata => MetadataWorkspaceHelper.Rewrite(
            metadata,
            EffortProviderConfiguration.ProviderInvariantName,
            EffortProviderManifestTokens.Version1));
}
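In other words, only the metadata part of the connection string matters. As an alternative to referencing a named connection string from the config file, a metadata-only entity connection string can be passed directly (a sketch, assuming the EDMX artifacts are embedded in the assembly under the default NorthwindModel resource names):

// assumption: the csdl/ssdl/msl resources are embedded as NorthwindModel.*
string entityConnectionString =
    "metadata=res://*/NorthwindModel.csdl|res://*/NorthwindModel.ssdl|res://*/NorthwindModel.msl;" +
    "provider=System.Data.SqlClient;provider connection string=\"\"";

connection = Effort.EntityConnectionFactory.CreateTransient(entityConnectionString);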
I am having issues attempting to connect to two different databases in one Qt application. I have my information database that stores all the data collected by the application, and a new log database that lets me track all the changes that occur in the application (button presses, screen loads, etc.) for easy debugging after its release. Separately, the databases work perfectly, but when I try to use both of them, only one will work.

I read that this could be because I wasn't naming the connections, since only the most recently added database can use the default connection. However, when I give the databases names they won't work at all: isOpen() returns true on both, but as soon as they attempt to execute a query I get the errors
"QSqlQuery::prepare: database not open"
"QSqlError(-1, "Driver not loaded", "Driver not loaded")"
My two database declarations are:
database_location = filepath.append("/logger.sqlite");
logDB = QSqlDatabase::addDatabase("QSQLITE", "LoggerDatabaseConnection");
logDB.setHostName("localhost");
logDB.setDatabaseName(database_location);
for the Logger Database connection and :
database_location = filepath.append("/db.sqlite");
db = QSqlDatabase::addDatabase("QSQLITE", "NormalDB");
db.setHostName("localhost");
db.setDatabaseName(database_location);
Also, when running the first query on the databases to check whether their tables exist, I am using
QSqlQuery query("LoggerDatabaseConnection");
and likewise for the normal database, but I am still getting connection issues even after specifying which database connection the query should run on.
The application's main database is declared as a static QSqlDatabase in a namespace so that it is effectively global and everything can access it (that was a previous programmer's design), and I created the log database as a singleton with a private database connection. Like I said, both versions of the code work separately, but when they are together they fight each other. I know there is a huge debate over the proper design of singleton vs. dependency injection, but again, the code works separately, so I am happy with how it is designed for now. If there is any missing information or if you have any ideas, please let me know. Thank you.
QSqlQuery query("LoggerDatabaseConnection");
The first parameter of the constructor is the query, not the connection name. It will use the default connection since you specified no database object.
Try something like this:
QSqlQuery query1("YourFirstQuery", db);
QSqlQuery query2("YourSecondQuery", logDB);
Important: Also do not forget to open and close the database before / after using it by calls to QSqlDatabase::open() and QSqlDatabase::close().
The correct way to work with multiple databases is not to hold on to the object returned from the static addDatabase method; instead, use the connectionName argument (https://doc.qt.io/qt-5/qsqldatabase.html#addDatabase-1) both during initialization and when running queries.
example:
void MyClass::initDb(QString dbPath, QString connName)
{
    // initial db usage, etc
    QSqlDatabase db = QSqlDatabase::addDatabase(YOUR_DRIVER, connName);
    db.setDatabaseName(dbPath);
    // open it, etc
}

void MyClass::updateThing(QString val, QString name, QString connName)
{
    // note: string values must be quoted here; bound values via prepare()/bindValue() would be safer
    QString q = QString("UPDATE THINGS SET val='%1' WHERE name='%2'").arg(val, name);
    // add the reference to your database via the connection name
    QSqlDatabase db = QSqlDatabase::database(connName);
    QSqlQuery query(db);
    query.exec(q);
    // handle the query normally, etc
}
I'm creating a rptlibrary to share with all the reports in my company.
The library has an ODA data source created and shared with all reports. We want to run some queries against the database from ReportEventAdapter.initialize() to fetch information. I can access the data source in the library this way:
ReportDesignHandle rdh = (ReportDesignHandle)reportContext.getReportRunnable().getDesignHandle();
DesignSessionImpl ds = rdh.getModule().getSession();
String rsf = ds.getResourceFolder( );
LibraryHandle libhan = ds.openLibrary(rsf + "/my.rptlibrary" ).handle( );
DataSourceHandle datasource = libhan.findDataSource("myDS");
But once I have the data source, there seems to be no way to get a connection to the database from it. Is the only option to create a classic JDBC connection to the database using the data from the data source? Is there a more elegant way to connect to the database from the Java event handler, for example using pooling or reusing the connection?
Thanks.
We can iterate over dataset values in a report script event, so if a dataset is defined with a JNDI URL, queries can take advantage of a connection pool.
However, it is quite complicated. There is a full example in this topic: the script defined in the "getDefaultValueList" event of the report parameter can be moved anywhere in the report to initialize a global variable. In particular, we could move it to the "initialize" event or to the "beforeFactory" event (in your case "beforeFactory" is probably what you want).
Is there a way to dump the SQL generated by Dapper to the Debug log or something similar? I'm using it in a WinForms solution, so the MiniProfiler idea won't work for me.
I had the same issue and, after searching around and finding nothing ready to use, implemented some code myself. There is a package on NuGet, MiniProfiler.Integrations, that I would like to share.
Update V2: it supports other database servers as well; for MySQL it requires MiniProfiler.Integrations.MySql.
Below are the steps to make it work with SQL Server:
1. Instantiate the connection:
var factory = new SqlServerDbConnectionFactory(_connectionString);
using (var connection = ProfiledDbConnectionFactory.New(factory, CustomDbProfiler.Current))
{
// your code
}
2. After all the work is done, write all commands to a file if you want:
File.WriteAllText("SqlScripts.txt", CustomDbProfiler.Current.ProfilerContext.BuildCommands());
Dapper does not currently have an instrumentation point here. This is perhaps due, as you note, to the fact that we (as the authors) use mini-profiler to handle this. However, if it helps, the core parts of mini-profiler are actually designed to be architecture neutral, and I know of other people using it with winforms, wpf, wcf, etc - which would give you access to the profiling / tracing connection wrapper.
In theory, it would be perfectly possible to add some blanket capture-point, but I'm concerned about two things:
(primarily) security: since dapper doesn't have a concept of a context, it would be really really easy for malign code to attach quietly to sniff all sql traffic that goes via dapper; I really don't like the sound of that (this isn't an issue with the "decorator" approach, as the caller owns the connection, hence the logging context)
(secondary) performance: but... in truth, it is hard to say that a simple delegate-check (which would presumably be null in most cases) would have much impact
Of course, the other thing you could do is: steal the connection wrapper code from mini-profiler, and replace the profiler-context stuff with just: Debug.WriteLine etc.
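If full connection wrapping is more than you need, a lighter-weight variant of that idea is a thin extension method that writes the SQL to the Debug output before delegating to Dapper (a sketch; QueryLogged is not part of Dapper, it is just a name made up here):

using System.Collections.Generic;
using System.Data;
using System.Diagnostics;
using Dapper;

public static class DapperDebugExtensions
{
    // logs the SQL text, then runs the normal Dapper call
    public static IEnumerable<T> QueryLogged<T>(this IDbConnection connection, string sql, object param = null)
    {
        Debug.WriteLine("Dapper SQL: " + sql);
        return connection.Query<T>(sql, param);
    }
}

The obvious downside is that you have to remember to call QueryLogged instead of Query, which is exactly the kind of capture point the decorator approach avoids.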
You should consider using SQL Server Profiler, found in SQL Server Management Studio under Tools → SQL Server Profiler (no Dapper extensions needed; the same idea works with other RDBMS that ship their own profiling tools).
Then, start a new session.
You'll get something like this, for example (you can see all parameters and the complete SQL string):
exec sp_executesql N'SELECT * FROM Updates WHERE CAST(Product_ID as VARCHAR(50)) = @appId AND (Blocked IS NULL OR Blocked = 0)
AND (Beta IS NULL OR Beta = 0 OR @includeBeta = 1) AND (LangCode IS NULL OR LangCode IN (SELECT * FROM STRING_SPLIT(@langCode, '','')))',N'@appId nvarchar(4000),@includeBeta bit,@langCode nvarchar(4000)',@appId=N'fea5b0a7-1da6-4394-b8c8-05e7cb979161',@includeBeta=0,@langCode=N'de'
Try Dapper.Logging.
You can get it from NuGet. The way it works is you pass your code that creates your actual database connection into a factory that creates wrapped connections. Whenever a wrapped connection is opened or closed or you run a query against it, it will be logged. You can configure the logging message templates and other settings like whether SQL parameters are saved. Elapsed time is also saved.
In my opinion, the only downside is that the documentation is sparse, but I think that's just because it's a new project (as of this writing). I had to dig through the repo for a bit to understand it and to get it configured to my liking, but now it's working great.
From the documentation:
The tool consists of simple decorators for the DbConnection and
DbCommand which track the execution time and write messages to the
ILogger<T>. The ILogger<T> can be handled by any logging framework
(e.g. Serilog). The result is similar to the default EF Core logging
behavior.
The lib declares a helper method for registering the
IDbConnectionFactory in the IoC container. The connection factory is
SQL Provider agnostic. That's why you have to specify the real factory
method:
services.AddDbConnectionFactory(prv => new SqlConnection(conStr));
After registration, the IDbConnectionFactory can be injected into
classes that need a SQL connection.
private readonly IDbConnectionFactory _connectionFactory;
public GetProductsHandler(IDbConnectionFactory connectionFactory)
{
_connectionFactory = connectionFactory;
}
The IDbConnectionFactory.CreateConnection will return a decorated
version that logs the activity.
using (DbConnection db = _connectionFactory.CreateConnection())
{
//...
}
This is not exhaustive and is essentially a bit of a hack, but if you have your SQL and want to initialize its parameters, it's useful for basic debugging. Set up this extension method, then call it wherever desired.
public static class DapperExtensions
{
    public static string ArgsAsSql(this DynamicParameters args)
    {
        if (args is null) throw new ArgumentNullException(nameof(args));
        var sb = new StringBuilder();
        foreach (var name in args.ParameterNames)
        {
            var pValue = args.Get<dynamic>(name);
            var type = pValue.GetType();
            if (type == typeof(DateTime))
                sb.AppendFormat("DECLARE @{0} DATETIME ='{1}'\n", name, pValue.ToString("yyyy-MM-dd HH:mm:ss.fff"));
            else if (type == typeof(bool))
                sb.AppendFormat("DECLARE @{0} BIT = {1}\n", name, (bool)pValue ? 1 : 0);
            else if (type == typeof(int))
                sb.AppendFormat("DECLARE @{0} INT = {1}\n", name, pValue);
            else if (type == typeof(List<int>))
                sb.AppendFormat("-- REPLACE @{0} IN SQL: ({1})\n", name, string.Join(",", (List<int>)pValue));
            else
                sb.AppendFormat("DECLARE @{0} NVARCHAR(MAX) = '{1}'\n", name, pValue.ToString());
        }
        return sb.ToString();
    }
}
You can then just use this in the immediate or watch windows to grab the SQL.
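For example (the parameter names and query here are hypothetical), combining the output with the query text gives a script you can paste straight into SSMS:

var args = new DynamicParameters();
args.Add("appId", "some-app-id");
args.Add("includeBeta", false);

string sql = "SELECT * FROM Updates WHERE Product_ID = @appId AND (Beta = 0 OR @includeBeta = 1)";  // hypothetical query
Debug.WriteLine(args.ArgsAsSql() + sql);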
Just to add an update here, since I see this question still gets quite a few hits: these days I use either Glimpse (seems it's dead now) or Stackify Prefix, both of which have SQL command tracing capabilities.
It's not exactly what I was looking for when I asked the original question, but they solve the same problem.
I've searched Stack Overflow for a long time and didn't find a solution that fits my situation, so I'm asking here.
I have a single ASP.NET website, and the web app needs to access a different SQL Server database depending on the subdomain of the request.
The subdomain of the incoming URL determines which database is used:
prj1.test.com ---> use prj1_DB
prj2.test.com ---> use prj2_DB
I couldn't find a good practice for solving this.
My intuitive solution:
when the URL request comes in, get the subdomain, look up that subdomain's DB connection string stored in the main DB, and pass the connection string down to the DAL to get the data.
Index.aspx.cs
DataTable dt = ProjectObject.GetProjectIndexNotice(new object[] { 0, CurrentProject.DbConnectionString });
ProjectObject.cs
public static DataTable GetProjectIndexNotice(object[] param)
{
ProjectDLC obj = new ProjectDLC();
return obj.GetProjectIndexNotice(param);
}
ProjectDAL.cs
public DataTable GetProjectIndexNotice(object[] param)
{
return base.GetDataTableFromDatabase(param, "NEMP_GetProjectIndexNotice");
}
DALBase.cs
Database db = new Microsoft.Practices.EnterpriseLibrary.Data.Sql.SqlDatabase(CurrentProject.DbConnectionString);
I want to find a better way to solve this problem.
The solution I am using above is:
get the DbConnectionString from the main DB,
then pass it through the Index page -> business object layer -> DAL layer.
Passing the DB connection string from the UI page down to the DAL layer feels very wrong.
Any ideas?
Update 1:
What I really want is to avoid passing the db connection string from the UI to the DAL layer.
I want a solution that doesn't hand the db connection string down from the UI through every layer to the DAL.
Is there some pattern in ASP.NET for sharing a variable between the UI layer and the DAL layer?
Update 2:
If I store the project db info in an XML file or in the main DB, it looks like this:
It's a key-value pair per project. Here is the question: the values all live in the main DB or an XML file, but how do I get the key when I need to access the DB in the DAL layer?
In the DAL layer, how do I get the correct key for the current URL request?
That brings me back to the problem above: passing the key from the UI to the DAL, which is what I want to avoid.
The real problem is that I can get the key from the URL request in the UI layer, and I can get the value for that key in the DAL layer, but there is a gap between the two layers. How do I bridge this gap?
If you can compute your connection string from a base connection string, then you could do something like this:
store the base connection string in your web.config
<connectionStrings>
<add name="BaseConnString"
connectionString="server=MyServer;database=master;Integrated Security=SSPI;" />
</connectionStrings>
load the base connection string into a SqlConnectionStringBuilder in your code:
string baseConnStr = WebConfigurationManager.ConnectionStrings["BaseConnString"].ConnectionString;
SqlConnectionStringBuilder scsBuilder =
new SqlConnectionStringBuilder(baseConnStr);
now, just define the database you want to connect to, e.g. based on something in your URL
scsBuilder.InitialCatalog = "ProjectDatabase" + ........ ;
use the resulting complete connection string for your SqlConnection:
using(SqlConnection _con = new SqlConnection(scsBuilder.ConnectionString))
{
// do something
}
Check out the MSDN docs on SqlConnectionStringBuilder.
With this approach, you'd store a single "base" connection string in your web.config and this wouldn't be changing, and using SqlConnectionStringBuilder, you can safely and efficiently define and "compute" your real, "dynamic" connection strings at runtime.
How about adding the connection strings to web.config, named per subdomain, as:
Subdomain_connectionString
Now read the subdomain from the Request.
Then read the connection string from web.config in your DAL:
ConfigurationManager.ConnectionStrings[Subdomain_connectionString].ConnectionString
Update:
You can also use xml files to store connection string values:
<ROOT>
<Project_1>
<IPAddress></IPAddress>
<DBName></DBName>
...
</Project_1>
<Project_2>
....
</ROOT>
Anytime a new project is added/removed this xml file would be updated. Use XPath expressions to parse the xml file.
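A minimal sketch of reading that file with XPath (the file location ~/App_Data/projects.xml is an assumption; the element names follow the layout above):

using System.Web;
using System.Xml;

// load the per-project settings file (hypothetical location)
XmlDocument doc = new XmlDocument();
doc.Load(HttpContext.Current.Server.MapPath("~/App_Data/projects.xml"));

// pick the node for the current project with an XPath expression
XmlNode project = doc.SelectSingleNode("/ROOT/Project_1");
string ipAddress = project.SelectSingleNode("IPAddress").InnerText;
string dbName = project.SelectSingleNode("DBName").InnerText;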
Regarding passing the connection string from UI to DAL: just add a reference to System.Web in the DAL layer. That gives the DAL access to the current Request, so you can get the subdomain and build the connection string in the DAL itself. Not sure whether this is the right approach, but it might work in your case.
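A minimal sketch of that idea (assuming the DAL references System.Web and web.config has entries named <subdomain>_connectionString, e.g. prj1_connectionString; those names are placeholders):

using System.Configuration;
using System.Web;

public static class ConnectionStringResolver
{
    public static string GetCurrentConnectionString()
    {
        // e.g. "prj1.test.com" -> "prj1"
        string host = HttpContext.Current.Request.Url.Host;
        string subdomain = host.Split('.')[0];

        var entry = ConfigurationManager.ConnectionStrings[subdomain + "_connectionString"];
        if (entry == null)
            throw new ConfigurationErrorsException("No connection string configured for subdomain: " + subdomain);
        return entry.ConnectionString;
    }
}

The DALBase code could then call ConnectionStringResolver.GetCurrentConnectionString() instead of receiving CurrentProject.DbConnectionString from the page.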