Create Schema using EF Core failing for npgsql

I'm trying to create a schema using:
context.Database.ExecuteSqlCommand("CREATE SCHEMA @p0", <schemaNameParameter>)
It is giving the below error:
42601: Syntax error at or near $1
I tried other solutions given for similar questions about DML, but they did not work in my case.

I found a solution; it may not be very elegant, but it works:
var con = _context.Database.GetDbConnection();
con.Open();
try
{
    using (var cmd = con.CreateCommand())
    {
        // Identifiers cannot be parameterized, so the (validated) schema name is quoted inline.
        cmd.CommandText = $"CREATE SCHEMA IF NOT EXISTS \"{schemaName}\"";
        await cmd.ExecuteNonQueryAsync();
    }
}
finally
{
    con.Close();
}
PS: Still trying to find a solution which takes a parameter rather than an inline value for the schema name. (Note: I'm using regex validation before executing this command to ensure there is no injection attempt.)
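For what it's worth, a parameter can never work here: PostgreSQL only binds parameters where a value is allowed, never in place of an identifier, which is exactly why the server reports a syntax error at $1. Short of building the DDL server-side (e.g. in a procedure using format('%I', ...)), the practical option is to quote the identifier yourself in addition to the regex validation. A minimal sketch of that idea; EscapeIdentifier is a hypothetical helper, not an Npgsql or EF Core API:

// PostgreSQL identifier quoting: wrap in double quotes and double any embedded double quote.
static string EscapeIdentifier(string name) =>
    "\"" + name.Replace("\"", "\"\"") + "\"";

// Used with the same raw-command approach as above:
cmd.CommandText = $"CREATE SCHEMA IF NOT EXISTS {EscapeIdentifier(schemaName)}";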

Related

Executing a non-query requires a transaction

I migrated my code from Web API 2 to .NET 5 and now I have a problem when executing a non-query. In the old code I had:
public void CallSp()
{
    var connection = dataContext.GetDatabase().Connection;
    var initialState = connection.State;
    try
    {
        if (initialState == ConnectionState.Closed)
            connection.Open();
        connection.Execute("mysp", commandType: CommandType.StoredProcedure);
    }
    catch
    {
        throw;
    }
    finally
    {
        if (initialState == ConnectionState.Closed)
            connection.Close();
    }
}
This was working fine. After I migrated the code, I'm getting the following exception:
BeginExecuteNonQuery requires the command to have a transaction when the connection assigned to the command is in a pending local transaction. The Transaction property of the command has not been initialized.
So, just before calling Execute I added:
var ct = dataContext.GetDatabase().CurrentTransaction;
var tr = ct.UnderlyingTransaction;
And passed the transaction to Execute. Alas, CurrentTransaction is null, so the above change can't be used.
So then I tried to create a new transaction by doing:
using var tr = dataContext.GetDatabase().BeginTransaction();
And this second change throws a different exception complaining that SqlConnection cannot use parallel transactions.
So now I'm stuck: the code that originally worked without any transaction now demands one, yet there is no existing transaction I can pass to Execute and I can't create a new one either.
How can I make Dapper happy again?
Dapper has no opinion here whatsoever; what is unhappy is your data provider. It sounds like somewhere, somehow, your dataContext has an ADO.NET transaction active on the connection. I can't tell you where, how, or why. But: while a transaction is active on a connection, ADO.NET providers tend to be pretty fussy about having that same transaction explicitly specified on all commands that are executed on the connection. This could be because you are somehow sharing the same connection between multiple threads, or it could simply be that something with the dataContext has an incomplete transaction somewhere.
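Once you track down where that transaction is opened, the usual fix is to flow it into Dapper explicitly via Execute's transaction parameter instead of letting the command run outside it. A rough sketch of that shape, assuming GetDatabase() returns an EF Core DatabaseFacade (the fact that CurrentTransaction is null in your case suggests something else, e.g. a raw BeginTransaction on the connection or an ambient TransactionScope, started the transaction instead):

// using Dapper;
// using Microsoft.EntityFrameworkCore.Storage;  // GetDbTransaction()

var database = dataContext.GetDatabase();        // the question's own wrapper
var connection = database.GetDbConnection();
var transaction = database.CurrentTransaction?.GetDbTransaction(); // null when EF has no open transaction
connection.Execute("mysp",
                   commandType: CommandType.StoredProcedure,
                   transaction: transaction);    // enlist the command in that transaction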

How can you run a report from the ReportServer database without building subscriptions?

I'd like to build a back end system that allows me to run each report every night and then query the execution log to see if anything failed. I know you can build out subscriptions for these reports and define parameters, etc., but is there a way to execute each report from the ReportServer database using T-SQL without building out each subscription?
I understand that your overall goal is that you want to automate this and not have to write a subscription for every report. You say you want to do it in T-SQL, but is that required to meet your overall goal?
If you can accept, say .Net, then you can use the System.Data.SqlClient.SqlConnection and related classes to query your report server catalog and fetch a listing of all your reports.
Then you can use System.Net.WebClient or similar tool to attempt to download a pdf of your report. From there you can either read your execution log, or catch the error in the .Net Code.
EDIT
Well, since you accepted the answer, and it seems you may go this route, I'll mention that if you're not familiar with .NET, it may be a long path for you. Here are a few things to get you started.
Below is a c# function utilizing .Net that will query the report catalog. If safeImmediate is set to true, it will only capture reports that can be run immediately, as in there are no parameters or the defaults cover the parameters.
// Requires: using System.Collections.Generic; using System.Data.SqlClient;
IEnumerable<string> GetReportPaths(
    string conStr,
    bool safeImmediate // as in, you can execute the report right away without parameters
) {
    using (var con = new SqlConnection(conStr))
    using (var cmd = new SqlCommand()) {
        cmd.Connection = con;
        cmd.CommandText = @"select path from catalog where type=2";
        con.Open();
        if (safeImmediate)
            cmd.CommandText = @"
                select path
                from catalog
                cross apply (select
                    params = convert(xml, Parameter).value('count(Parameters/Parameter)', 'int'),
                    defaults = convert(xml, Parameter).value('count(Parameters/Parameter/DefaultValues/Value)', 'int')
                ) counts
                where type = 2
                and params = defaults
                and path not like '%subreport%' -- this is not standard. Just works for my conventions
            ";
        using (var rdr = cmd.ExecuteReader())
            while (rdr.Read())
                yield return rdr["path"].ToString();
    }
}
The next function will download a report given proper paths passed to it:
byte[] DownloadReport(
    WebClient wc,
    string coreUrl,
    string fullReportPath,
    string parameters = "" // you won't use this but may come in handy for other uses
) {
    var pathToViewer = "ReportServer/Pages/ReportViewer.aspx"; // for typical ssrs installs
    var renderOptions = "&rs:Format=pdf&rs:Command=Render"; // return as pdf
    var url = $@"{coreUrl}/{pathToViewer}?{fullReportPath}{parameters}{renderOptions}";
    url = Uri.EscapeUriString(url); // URLs don't like certain characters, fix it
    return wc.DownloadData(url);
}
And this utilizes the functions above to find what's succeeding and what's not:
var sqlCon = "Server=yourReportServer; Database=ReportServer; Integrated Security=yes"; // or whatever
var ssrsSite = "http://www.yourSite.org";
using (var wc = new WebClient()) {
    wc.UseDefaultCredentials = true; // or whatever
    int loops = 3; // get rid of this when you're ready for prime-time
    foreach (var path in GetReportPaths(sqlCon, true)) {
        try {
            DownloadReport(wc, ssrsSite, path);
            Debug.WriteLine($"Success with: {path}");
        }
        catch (Exception ex) { // you might want to get more specific
            Debug.WriteLine($"Failed with: {path}");
        }
        if (loops-- == 0)
            break;
    }
}
Lots to learn, but it can be very beneficial. Good luck.

Run ddl script from file in Groovy

I am working with Spock and Groovy to test my application. I need to run a DDL script before running every test.
To execute the script from Groovy I am using the following code:
def scriptToExecute = './src/test/groovy/com/sql/createTable.sql'
def sqlScriptToExecuteString = new File(scriptToExecute).text
sql.execute(sqlScriptToExecuteString)
The createTable.sql is a complex script that does several drop and create operations (of course it is multiline). When I try to execute it I get the following exception:
java.sql.SQLSyntaxErrorException: ORA-00911: invalid character
Note that the DDL itself is correct, since it has been checked by running it directly against the same DB that I am connecting to from Groovy.
Any Idea how to resolve the problem?
I think JDBC does not support this, but there are tools/libraries that could help, see this answer for Java.
In Groovy, using this JDBC script runner would be something like:
Connection con = ....
def runner = new ScriptRunner(con, [booleanAutoCommit], [booleanStopOnError])
def scriptFile = new File("createTable.ddl")
scriptFile.withReader { reader ->
    runner.runScript(reader)
}
Or, if your script is "simple enough" (i.e. no comments, no semicolons other than the ones separating statements...), you can load the text, split it around ; and execute using sql.withBatch, something like this:
def scriptText = new File("createTable.ddl").text
sql.withBatch { stmt ->
    scriptText.split(';').each { order ->
        stmt.addBatch order.trim()
    }
}
If you can't get it done in JDBC (See Hugues' answer), consider executing sqlplus from your Groovy program.
["sqlplus", CREDENTIALS, "#"+scriptToExecute].execute()

What is my embedded database location

Hello, I am having problems with testing whether my embedded database exists.
I created a database like follows:
try {
    SQLiteConnection.CreateFile("AttendanceDatabase.sqlite");
} catch (SQLiteException ex) {
}
And then I create tables and insert data into them, and everything works fine. When I'm saving data to the database I'm using the connection string as follows:
conn = new SQLiteConnection("Data Source=AttendanceDatabase.sqlite;Version=3;");
Now my problem is that every time I run my program it creates the database over again, and I would like to test whether the database already exists so that it is not re-created.
I see the recommended way to do it is using the following statement:
if (File.Exists())
{
}
and I have tried using it as follows:
if (File.Exists("Data Source=AttendanceDatabase.sqlite;Version=3;")){
MessageBox.Show("File Exists");
}
but it never goes into the if block and displays "File Exists".
So I would like to know what the path should be for my embedded database, if that is indeed where my problem lies?
Thanx in advance!
I don't have a ton of context but if you update your check:
var basePath = "C:/<path to file>/";
if (File.Exists(basePath + "AttendanceDatabase.sqlite")) {
    MessageBox.Show("File Exists");
}
You might have more luck. If you give me more context about how you are running this, I can help you use services to look up the file path. You can look it up based on assemblies, the app root, etc.
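The key point is that File.Exists takes a file path, not a connection string. A minimal sketch of the usual pattern, assuming System.Data.SQLite: pin the file to one known folder (here the application base directory) and use that same full path for both the existence check and the connection string, so there is no ambiguity about which file is meant.

// using System.IO;
var dbPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "AttendanceDatabase.sqlite");

if (!File.Exists(dbPath))
    SQLiteConnection.CreateFile(dbPath);   // only create it the first time

var conn = new SQLiteConnection($"Data Source={dbPath};Version=3;");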

Is there any way to trace\log the sql using Dapper?

Is there a way to dump the generated sql to the Debug log or something? I'm using it in a winforms solution so the mini-profiler idea won't work for me.
I had the same issue; after some searching and finding nothing ready to use, I implemented some code myself. There is a package on NuGet, MiniProfiler.Integrations, that I would like to share.
Update V2: it supports other database servers; for MySQL it requires MiniProfiler.Integrations.MySql.
Below are steps to work with SQL Server:
1. Instantiate the connection
var factory = new SqlServerDbConnectionFactory(_connectionString);
using (var connection = ProfiledDbConnectionFactory.New(factory, CustomDbProfiler.Current))
{
    // your code
}
2. After all work is done, write all commands to a file if you want
File.WriteAllText("SqlScripts.txt", CustomDbProfiler.Current.ProfilerContext.BuildCommands());
Dapper does not currently have an instrumentation point here. This is perhaps due, as you note, to the fact that we (as the authors) use mini-profiler to handle this. However, if it helps, the core parts of mini-profiler are actually designed to be architecture neutral, and I know of other people using it with winforms, wpf, wcf, etc - which would give you access to the profiling / tracing connection wrapper.
In theory, it would be perfectly possible to add some blanket capture-point, but I'm concerned about two things:
(primarily) security: since dapper doesn't have a concept of a context, it would be really really easy for malign code to attach quietly to sniff all sql traffic that goes via dapper; I really don't like the sound of that (this isn't an issue with the "decorator" approach, as the caller owns the connection, hence the logging context)
(secondary) performance: but... in truth, it is hard to say that a simple delegate-check (which would presumably be null in most cases) would have much impact
Of course, the other thing you could do is: steal the connection wrapper code from mini-profiler, and replace the profiler-context stuff with just: Debug.WriteLine etc.
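To make that last option concrete, the wrapper approach boils down to decorating the connection before Dapper ever sees it. A rough sketch, assuming the MiniProfiler NuGet package (ProfiledDbConnection lives in StackExchange.Profiling.Data) and that a profiler session has been started, e.g. via MiniProfiler.StartNew:

// using System.Data.Common;
// using StackExchange.Profiling;       // MiniProfiler
// using StackExchange.Profiling.Data;  // ProfiledDbConnection

// Wrap the real connection; every command Dapper runs through it is recorded by the profiler.
DbConnection Profiled(DbConnection real) =>
    new ProfiledDbConnection(real, MiniProfiler.Current);

Swapping Debug.WriteLine into a copy of that wrapper, as suggested above, gives you plain trace output with no profiler at all.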
You should consider using SQL Server Profiler, found in SQL Server Management Studio under Tools → SQL Server Profiler (no Dapper extensions needed; this may also work with other RDBMSs when they have a SQL profiler tool).
Then, start a new session.
You'll get something like this for example (you see all parameters and the complete SQL string):
exec sp_executesql N'SELECT * FROM Updates WHERE CAST(Product_ID as VARCHAR(50)) = @appId AND (Blocked IS NULL OR Blocked = 0)
AND (Beta IS NULL OR Beta = 0 OR @includeBeta = 1) AND (LangCode IS NULL OR LangCode IN (SELECT * FROM STRING_SPLIT(@langCode, '','')))',N'@appId nvarchar(4000),@includeBeta bit,@langCode nvarchar(4000)',@appId=N'fea5b0a7-1da6-4394-b8c8-05e7cb979161',@includeBeta=0,@langCode=N'de'
Try Dapper.Logging.
You can get it from NuGet. The way it works is you pass your code that creates your actual database connection into a factory that creates wrapped connections. Whenever a wrapped connection is opened or closed or you run a query against it, it will be logged. You can configure the logging message templates and other settings like whether SQL parameters are saved. Elapsed time is also saved.
In my opinion, the only downside is that the documentation is sparse, but I think that's just because it's a new project (as of this writing). I had to dig through the repo for a bit to understand it and to get it configured to my liking, but now it's working great.
From the documentation:
The tool consists of simple decorators for the DbConnection and DbCommand which track the execution time and write messages to the ILogger<T>. The ILogger<T> can be handled by any logging framework (e.g. Serilog). The result is similar to the default EF Core logging behavior.
The lib declares a helper method for registering the IDbConnectionFactory in the IoC container. The connection factory is SQL Provider agnostic. That's why you have to specify the real factory method:
services.AddDbConnectionFactory(prv => new SqlConnection(conStr));
After registration, the IDbConnectionFactory can be injected into classes that need a SQL connection.
private readonly IDbConnectionFactory _connectionFactory;

public GetProductsHandler(IDbConnectionFactory connectionFactory)
{
    _connectionFactory = connectionFactory;
}
The IDbConnectionFactory.CreateConnection will return a decorated version that logs the activity.
using (DbConnection db = _connectionFactory.CreateConnection())
{
    //...
}
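For instance, a handler body running Dapper on that wrapped connection could look like the sketch below; the query, the Product type and categoryId are made-up names for illustration, and everything issued through the connection ends up in the logger along with its elapsed time:

// using Dapper;
using (DbConnection db = _connectionFactory.CreateConnection())
{
    var products = db.Query<Product>(
        "SELECT Id, Name FROM Products WHERE CategoryId = @categoryId",
        new { categoryId });
    // ... map / return the results
}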
This is not exhaustive and is essentially a bit of a hack, but if you have your SQL and you want to initialize your parameters, it's useful for basic debugging. Set up this extension method, then call it anywhere as desired.
public static class DapperExtensions
{
    public static string ArgsAsSql(this DynamicParameters args)
    {
        if (args is null) throw new ArgumentNullException(nameof(args));
        var sb = new StringBuilder();
        foreach (var name in args.ParameterNames)
        {
            var pValue = args.Get<dynamic>(name);
            var type = pValue.GetType();
            if (type == typeof(DateTime))
                sb.AppendFormat("DECLARE @{0} DATETIME ='{1}'\n", name, pValue.ToString("yyyy-MM-dd HH:mm:ss.fff"));
            else if (type == typeof(bool))
                sb.AppendFormat("DECLARE @{0} BIT = {1}\n", name, (bool)pValue ? 1 : 0);
            else if (type == typeof(int))
                sb.AppendFormat("DECLARE @{0} INT = {1}\n", name, pValue);
            else if (type == typeof(List<int>))
                sb.AppendFormat("-- REPLACE @{0} IN SQL: ({1})\n", name, string.Join(",", (List<int>)pValue));
            else
                sb.AppendFormat("DECLARE @{0} NVARCHAR(MAX) = '{1}'\n", name, pValue.ToString());
        }
        return sb.ToString();
    }
}
You can then just use this in the immediate or watch windows to grab the SQL.
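As a quick usage sketch (the table and parameter names here are made up), pairing the dumped DECLAREs with the statement you are about to run gives you something you can paste straight into SSMS:

var args = new DynamicParameters();
args.Add("CustomerId", 42);
args.Add("CreatedAfter", new DateTime(2024, 1, 1));

// Prints:
// DECLARE @CustomerId INT = 42
// DECLARE @CreatedAfter DATETIME ='2024-01-01 00:00:00.000'
// SELECT * FROM Orders WHERE CustomerId = @CustomerId AND Created > @CreatedAfter
Debug.WriteLine(args.ArgsAsSql() +
    "SELECT * FROM Orders WHERE CustomerId = @CustomerId AND Created > @CreatedAfter");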
Just to add an update here since I see this question still gets quite a few hits: these days I use either Glimpse (it seems to be dead now) or Stackify Prefix, which both have SQL command trace capabilities.
It's not exactly what I was looking for when I asked the original question, but they solve the same problem.
