I have a contained database user. Since it is contained in the database, it is not allowed to connect to any other database, including master. Unfortunately, Entity Framework seems to connect to the master database anyway.
I've created a new console app with the latest Entity Framework NuGet package (6.2.0) to make sure nothing else connects to the master database:
static void Main(string[] args)
{
var connectionString = "Server=sql-azure.database.windows.net;Database='Database';User ID=Username;Password=password;Trusted_Connection=False;";
using (var dbContext = new DbContext(connectionString))
{
dbContext.Database.CommandTimeout = 10 * 60;
dbContext.Database.ExecuteSqlCommand("EXEC cleanup @Date", new SqlParameter("@Date", DateTime.UtcNow.AddMonths(-3)));
}
}
How do I force Entity Framework not to connect to the master database? I get failures in the audit logs on the master database, which causes Azure threat detection to go off.
After researching some more, I disabled the database initializer before the using statement like this:
Database.SetInitializer<DbContext>(null);
With this line of code, the console app doesn't connect to the master database any more. More info about Database.SetInitializer(null).
Full example:
static void Main(string[] args)
{
var connectionString = "Server=sql-azure.database.windows.net;Database='Database';User ID=Username;Password=password;Trusted_Connection=False;";
Database.SetInitializer<DbContext>(null);
using (var dbContext = new DbContext(connectionString))
{
dbContext.Database.CommandTimeout = 10 * 60;
dbContext.Database.ExecuteSqlCommand("EXEC cleanup @Date", new SqlParameter("@Date", DateTime.UtcNow.AddMonths(-3)));
}
}
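One detail worth adding (not from the original post): Database.SetInitializer<TContext>(null) disables the initializer only for that exact context type. If you use your own DbContext subclass instead of DbContext directly, disable the initializer for that subclass. A minimal sketch, where CleanupContext is a hypothetical context name:
using System;
using System.Data.Entity;
using System.Data.SqlClient;

// Hypothetical derived context; the initializer has to be disabled for this type.
public class CleanupContext : DbContext
{
    public CleanupContext(string connectionString) : base(connectionString) { }
}

class Program
{
    static void Main()
    {
        // Register the null initializer for the derived type, not for the DbContext base class.
        Database.SetInitializer<CleanupContext>(null);

        using (var dbContext = new CleanupContext("your connection string here"))
        {
            dbContext.Database.CommandTimeout = 10 * 60;
            dbContext.Database.ExecuteSqlCommand(
                "EXEC cleanup @Date",
                new SqlParameter("@Date", DateTime.UtcNow.AddMonths(-3)));
        }
    }
}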
I can't repro that:
Contained user:
//create user joe with password ='xxxxxx'
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Linq;
namespace Ef6Test
{
class Program
{
static void Main(string[] args)
{
var connectionString = "server=xxxxxxxxx.database.windows.net;database=adventureworks;uid=joe;pwd=xxxx";
using (var dbContext = new DbContext(connectionString))
{
dbContext.Database.CommandTimeout = 10 * 60;
Console.WriteLine(dbContext.Database.SqlQuery<string>("select db_name() dbname;").Single());
}
Console.WriteLine("Hit any key to exit");
Console.ReadKey();
}
}
}
The script below works for SQL Server on-premises. On Azure SQL Database the USE statement cannot switch databases, so run the first batch while connected to master and the second batch while connected to DemoDB:
use master;
CREATE LOGIN DemoUser WITH PASSWORD = 'DemoPassword';
GO
CREATE USER DemoUser FOR LOGIN DemoUser
GO
use DemoDB;
CREATE USER DemoUser FOR LOGIN DemoUser WITH DEFAULT_SCHEMA=[dbo]
GO
ALTER ROLE db_datareader ADD MEMBER DemoUser;
ALTER ROLE db_datawriter ADD MEMBER DemoUser;
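To tie this back to the contained-user scenario in the question: on Azure SQL Database (or a contained database on-premises) you can also create the user directly in the user database with a password and no server login at all. A minimal sketch of running that DDL from C#; the user name, password, and connection string are placeholders, and the connection must point at the user database, not master:
using System.Data.SqlClient;

class CreateContainedUser
{
    static void Main()
    {
        // Placeholder admin connection string pointing at the user database (not master).
        var adminConnectionString =
            "Server=tcp:your-server.database.windows.net,1433;Database=DemoDB;User ID=adminuser;Password=adminpassword;";

        using (var connection = new SqlConnection(adminConnectionString))
        {
            connection.Open();
            // CREATE USER ... WITH PASSWORD creates a contained database user.
            // DDL cannot take parameters, so the statement is written out literally here.
            var sql = @"
CREATE USER DemoContainedUser WITH PASSWORD = 'Str0ngPassword!';
ALTER ROLE db_datareader ADD MEMBER DemoContainedUser;
ALTER ROLE db_datawriter ADD MEMBER DemoContainedUser;";
            using (var command = new SqlCommand(sql, connection))
            {
                command.ExecuteNonQuery();
            }
        }
    }
}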
Related
I've set up a unit test project in Visual Studio for my SQL Server project.
The tests themselves work, and the setup includes deploying the database. My problem is that I want a "clean slate" for each test, but every time I run a test the data accumulates.
I tried manually calling DROP DATABASE from SqlDatabaseSetup.cs, but it seems that I don't have a DataConnection at that point.
[TestClass()]
public class SqlDatabaseSetup
{
[AssemblyInitialize()]
public static void InitializeAssembly(TestContext ctx)
{
var q = ctx.DataConnection.CreateCommand();
q.CommandText = "DROP DATABASE MyDb;";
q.ExecuteNonQuery();
// Setup the test database based on setting in the
// configuration file
SqlDatabaseTestClass.TestService.DeployDatabaseProject();
SqlDatabaseTestClass.TestService.GenerateData();
}
}
Is there any way to indicate that the DB should be flushed first (without manually calling DELETE FROM xx for every table), or is there a way I can pass through a command to do that for me?
The solution was as follows:
[AssemblyInitialize()]
public static void InitializeAssembly(TestContext ctx)
{
// Setup the test database based on setting in the
// configuration file
try
{
var conn = SqlDatabaseTestClass.TestService.OpenExecutionContext();
var cmd = conn.Connection.CreateCommand();
cmd.CommandText = "use master; ALTER DATABASE [MyTestDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;DROP DATABASE MyTestDb;";
cmd.ExecuteNonQuery();
}
catch
{
}
SqlDatabaseTestClass.TestService.DeployDatabaseProject();
SqlDatabaseTestClass.TestService.GenerateData();
}
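A variation on the same idea that avoids the empty catch block: guard the drop with DB_ID so the batch is a no-op when the database does not exist yet. This is just a sketch of that alternative, not part of the original answer:
[AssemblyInitialize()]
public static void InitializeAssembly(TestContext ctx)
{
    var conn = SqlDatabaseTestClass.TestService.OpenExecutionContext();
    var cmd = conn.Connection.CreateCommand();
    // Only drop the test database if it actually exists, so no try/catch is needed.
    cmd.CommandText = @"
USE master;
IF DB_ID(N'MyTestDb') IS NOT NULL
BEGIN
    ALTER DATABASE [MyTestDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE [MyTestDb];
END";
    cmd.ExecuteNonQuery();

    SqlDatabaseTestClass.TestService.DeployDatabaseProject();
    SqlDatabaseTestClass.TestService.GenerateData();
}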
I want to access a database from Xamarin, and it looks like a good approach is to create an Azure SQL database (free for a while).
So I created an account in Azure, created a SQL Server and a SQL Database, set the admin user and password, and opened the firewall as part of the process.
I then created a project with an Azure Function in VS 2019, created a function that just returns a string in the OkObjectResult, and it works (both visiting the local URL and the public one, after publishing).
Then I installed the System.Data.SqlClient NuGet package and tried to connect to the database using the admin user and password like this:
// I did not use newlines in the actual code; they are included here for readability.
using (SqlConnection conn =
new SqlConnection("
Server=tcp:javrsserver.database.windows.net,
1433;
Initial Catalog=JaviRS;
Persist Security Info=False;
User ID={javirs};Password={-The one set in the azure portal-};
MultipleActiveResultSets=False;
Encrypt=True;
TrustServerCertificate=False;
Connection Timeout=30;"))
{
try{
conn.Open();
}catch (Exception exc)
{
//here I collect "Login failed for user {javirs}"
//evntID : 18456
}
}
In theory it's just a wrong password, but I'm pretty sure the password is OK. So I guess it's more of some weird Windows thing about user account control...
Any clue?
PS: If there is a simpler way to get simple SQL into my Xamarin app, I'm also interested. This Azure thing is over-engineering for what I need, and it is not free forever.
EDIT:
I tried connecting using the SQL Server Object Explorer in Visual Studio, entered the same credentials I entered in the connection string, and it does let me in.
I have done it with the following steps:
Azure SQL script:
CREATE TABLE AzureSqlTable(
[Id] [int] PRIMARY KEY IDENTITY(1,1) NOT NULL,
[FirstName] [nvarchar](max) NULL,
[LastName] [nvarchar](max) NULL,
[Email] [nvarchar](max) NULL
)
GO
Function Class:
public class AzureFunctionV2SqlTableClass
{
public int Id { get; set; }
public string FirstName { get; set; }
public string LastName { get; set; }
public string Email { get; set; }
public string DbOperationType { get; set; }
}
Azure Function Body:
[FunctionName("FunctionV2SqlConnectionExample")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
//Read Request Body
var content = await new StreamReader(req.Body).ReadToEndAsync();
//Extract Request Body and Parse To Class
AzureFunctionV2SqlTableClass objFuncV2Sql = JsonConvert.DeserializeObject<AzureFunctionV2SqlTableClass>(content);
// variable for global message.
dynamic validationMessage;
// Validate the request parameters here.
if (string.IsNullOrEmpty(objFuncV2Sql.FirstName))
{
validationMessage = new OkObjectResult("First Name is required!");
return (IActionResult)validationMessage;
}
if (string.IsNullOrEmpty(objFuncV2Sql.LastName))
{
validationMessage = new OkObjectResult("Last Name is required!");
return (IActionResult)validationMessage;
}
if (string.IsNullOrEmpty(objFuncV2Sql.Email))
{
validationMessage = new OkObjectResult("Email is required!");
return (IActionResult)validationMessage;
}
//Read database Connection
var sqlConnection = "Data Source =tcp:sqlserverInstancenNameFromAzurePortal.database.windows.net,1433;Initial Catalog=YouDbName;Persist Security Info=False;User ID=ServerUserName;Password=ServerPass;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;";
//SQL execution message variable
dynamic sqlExecutionMessage;
//Define Db operation Type
if (objFuncV2Sql.DbOperationType.ToUpper() == "INSERT")
{
using (SqlConnection conn = new SqlConnection(sqlConnection))
{
conn.Open();
var text = "INSERT INTO AzureSqlTable VALUES ('" + objFuncV2Sql.FirstName + "', '" + objFuncV2Sql.LastName + "', '" + objFuncV2Sql.Email + "') ";
using (SqlCommand cmd = new SqlCommand(text, conn))
{
sqlExecutionMessage = await cmd.ExecuteNonQueryAsync();
}
conn.Close();
}
validationMessage = new OkObjectResult(sqlExecutionMessage + " ROW INSERTED");
return (IActionResult)validationMessage;
}
//As we have to return an IActionResult, convert using OkObjectResult (we could even use OkResult)
var result = new OkObjectResult("Operation Failed! No Relevant Command Found!");
return result;
}
Required using directives:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using System.Net.Http;
using System.Data.SqlClient;
NuGet package I used:
System.Data.SqlClient (4.6.1)
Download it from the NuGet package manager.
Postman sample (request body):
{
"FirstName": "Kiron New Sql FunctionV2",
"LastName": "Kiron New Local Sql",
"Email": "KironTest@microsoft.com",
"DbOperationType":"INSERT"
}
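For reference, the same request body can also be sent from C# instead of Postman. A small sketch using HttpClient; the function URL is a placeholder you would replace with your own function app host:
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class CallFunctionSample
{
    static async Task Main()
    {
        // Placeholder URL; substitute your own function app name.
        var url = "https://your-function-app.azurewebsites.net/api/FunctionV2SqlConnectionExample";

        var json = "{\"FirstName\":\"Kiron New Sql FunctionV2\"," +
                   "\"LastName\":\"Kiron New Local Sql\"," +
                   "\"Email\":\"KironTest@microsoft.com\"," +
                   "\"DbOperationType\":\"INSERT\"}";

        using (var client = new HttpClient())
        using (var content = new StringContent(json, Encoding.UTF8, "application/json"))
        {
            var response = await client.PostAsync(url, content);
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}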
Points to remember:
- Follow exactly what I demonstrated here before making it run; no extra engineering is needed.
- Just update the connection string with your Azure SQL Server credentials.
- Get rid of the {} around the password, since I did not use them in my example. When you copy the connection string from the portal it contains {password}; just omit the {}.
- If your function gets an error from the Azure SQL database about your client IP address, add your client IP in the portal (the original post showed this in two screenshots).
Note: I have only shown the INSERT operation here. Hope it will work accordingly.
The error says: "Login failed for user {javirs}"
This hints that you included the {} in your user name (and probably in your password). Removing the { and the } will do the trick.
It works now.
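One way to avoid this kind of placeholder mistake in the future is to build the connection string with SqlConnectionStringBuilder instead of hand-editing the template copied from the portal. A minimal sketch, reusing the server and database names from the question:
using System.Data.SqlClient;

class ConnectionStringSample
{
    static void Main()
    {
        // Building the string from parts means no {} placeholders can slip into
        // the user name or password, and quoting is handled for you.
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = "tcp:javrsserver.database.windows.net,1433",
            InitialCatalog = "JaviRS",
            UserID = "javirs",
            Password = "the password set in the Azure portal",
            Encrypt = true,
            TrustServerCertificate = false,
            ConnectTimeout = 30
        };

        using (var conn = new SqlConnection(builder.ConnectionString))
        {
            conn.Open();
        }
    }
}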
I'm implementing NUnit Integration Tests of our controller REST endpoints in a .NET Web API 2 project. We use an Entity Framework code-first from database approach to create our controllers and models.
I've got the myProjectIntegrationTests project set up, with NUnit installed and a reference to myProject.
From my research, the next step is to create a TestSetup script which, on each test run, creates an integration test database in LocalDB. This allows us to integration-test our API calls without affecting the master dev database.
This TestSetup script should do several things each time a test runs:
- check if a connection is currently open in Integration Test Db - if so, close it.
- check if an existing Integration Test db exists - if so, tear it down.
- run a migration from my master dev database to my Integration Test Db to load it with real data.
- create a new instance of Integration Test Db
- integration tests run...
- close Integration Test Db connections
- teardown Integration Test Db
Creating this TestSetup class is what's giving me trouble. I've found tutorials on how to do this for .NET MVC, .NET Core and also Entity Framework, but none of these seem to use just .NET Web API, so some of the libraries and code being referenced aren't working for me. Can someone provide an example script or tutorial link that might work in .NET Web API 2?
Here's an example of someone doing this for EntityFramework, using I believe .Net Core. This is part of a great PluralSight tutorial on integration testing in Entity Framework by Michael Perry, found here:
using Globalmantics.DAL;
using Globalmantics.DAL.Migrations;
using NUnit.Framework;
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.SqlClient;
using System.IO;
using System.Linq;
using System.Reflection;
namespace Globalmantics.IntegrationTests
{
[SetUpFixture]
public class TestSetup
{
[OneTimeSetUp]
public void SetUpDatabase()
{
DestroyDatabase();
CreateDatabase();
}
[OneTimeTearDown]
public void TearDownDatabase()
{
DestroyDatabase();
}
private static void CreateDatabase()
{
ExecuteSqlCommand(Master, $#"
CREATE DATABASE [Globalmantics]
ON (NAME = 'Globalmantics',
FILENAME = '{Filename}')");
var migration = new MigrateDatabaseToLatestVersion<
GlobalmanticsContext, GlobalmanticsConfiguration>();
migration.InitializeDatabase(new GlobalmanticsContext());
}
private static void DestroyDatabase()
{
var fileNames = ExecuteSqlQuery(Master, #"
SELECT [physical_name] FROM [sys].[master_files]
WHERE [database_id] = DB_ID('Globalmantics')",
row => (string)row["physical_name"]);
if (fileNames.Any())
{
ExecuteSqlCommand(Master, #"
ALTER DATABASE [Globalmantics] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
EXEC sp_detach_db 'Globalmantics'");
fileNames.ForEach(File.Delete);
}
}
private static void ExecuteSqlCommand(
SqlConnectionStringBuilder connectionStringBuilder,
string commandText)
{
using (var connection = new SqlConnection(connectionStringBuilder.ConnectionString))
{
connection.Open();
using (var command = connection.CreateCommand())
{
command.CommandText = commandText;
command.ExecuteNonQuery();
}
}
}
private static List<T> ExecuteSqlQuery<T>(
SqlConnectionStringBuilder connectionStringBuilder,
string queryText,
Func<SqlDataReader, T> read)
{
var result = new List<T>();
using (var connection = new SqlConnection(connectionStringBuilder.ConnectionString))
{
connection.Open();
using (var command = connection.CreateCommand())
{
command.CommandText = queryText;
using (var reader = command.ExecuteReader())
{
while (reader.Read())
{
result.Add(read(reader));
}
}
}
}
return result;
}
private static SqlConnectionStringBuilder Master =>
new SqlConnectionStringBuilder
{
DataSource = #"(LocalDB)\MSSQLLocalDB",
InitialCatalog = "master",
IntegratedSecurity = true
};
private static string Filename => Path.Combine(
Path.GetDirectoryName(
Assembly.GetExecutingAssembly().Location),
"Globalmantics.mdf");
}
}
And here's an older example of someone doing this for .NET MVC:
using System;
using System.Data.Entity;
using NUnit.Framework;
namespace BankingSite.IntegrationTests
{
[SetUpFixture]
public class TestFixtureLifecycle
{
public TestFixtureLifecycle()
{
EnsureDataDirectoryConnectionStringPlaceholderIsSet();
EnsureNoExistingDatabaseFiles();
}
private static void EnsureDataDirectoryConnectionStringPlaceholderIsSet()
{
// When not running inside MVC application the |DataDirectory| placeholder
// is null in a connection string, e.g AttachDBFilename=|DataDirectory|\TestBankingSiteDb.mdf
AppDomain.CurrentDomain.SetData("DataDirectory", NUnit.Framework.TestContext.CurrentContext.TestDirectory);
}
private void EnsureNoExistingDatabaseFiles()
{
const string connectionString = "name=DefaultConnection";
if (Database.Exists(connectionString))
{
Database.Delete(connectionString);
}
}
}
}
Probably not the answer you are looking for, but I have had success recently using the SQL Server Docker image with Docker Compose.
You can fire up a database instance and delete the data volumes when the image shuts down. Using the --rm switch on the docker run command will do that automatically for you.
If you are using .NET Core you can set up another container to run your Entity Framework migrations and tests.
If you are using .NET Framework you may be able to run Windows Docker images, however they tend to be a bit slower to start up.
This approach works best if you launch everything from a PowerShell script. Launching the infrastructure from code, as you are looking to do, could be tricky and perhaps more complex than it needs to be.
Going line-by-line through the kind of SQL commands you'll need for these operations is just going to be painful. You would be better off developing a stored procedure that does the tear-down/build-up steps. You appear to already have a start on that, as I see you writing code around the statements. Then your integration test code would just need to call this procedure (see the sketch below) and wait for the setup to complete. Remember, you don't have to do everything in code.
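As a sketch of that suggestion: the test setup would then reduce to a single stored-procedure call. The procedure name (ResetTestDatabase) and connection string are hypothetical, not something from the question:
using System.Data;
using System.Data.SqlClient;

public static class TestDatabaseReset
{
    // Calls a hypothetical dbo.ResetTestDatabase procedure that performs the
    // tear-down/build-up steps on the server side.
    public static void Reset(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.ResetTestDatabase", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.CommandTimeout = 10 * 60; // rebuilding a test database can take a while
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}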
My team develops an application which deploys an MSSQL database at the customer's site. We encountered a problem with using migrations to update the customer's database structure.
We can't use automatic migrations because more than one instance of the app can run against the same database, so if one of the instances gets updated and therefore changes the model (and thus the structure of the database), the others change it back, so neither of them can work with the database.
We can't use non-automatic migrations because we have no access to the customer's database to run the Update-Database command.
The question is: what's the best approach to keep the database and the model always in sync at the level of code?
You have to use the migrate-to-latest-version strategy. This approach allows you to automatically update the database when the model is changed:
Database.SetInitializer(new MigrateDatabaseToLatestVersion<MyDbContext, MyMagicDatabaseConfiguration>());
public class MyMagicDatabaseConfiguration : DbMigrationsConfiguration<MyDbContext>
{
public MyMagicDatabaseConfiguration()
{
this.AutomaticMigrationsEnabled = true;
this.AutomaticMigrationDataLossAllowed = true;
}
}
// !!Force the initialization. this will execute the update!!
MyDb.Context.Database.Initialize(true);
This works fine if you are using MS SQL Server.
The problem: if you are using MySQL, you have to do everything yourself:
var migrator = new DbMigrator(new DbMigrationsConfiguration());
migrator.Update();
// In the Migrations directory:
public partial class MyMigration : DbMigration
{
public override void Up()
{
AddColumn("dbo.MyTable", "AnyName", c => c.Boolean(nullable: false));
}
public override void Down()
{
DropColumn("dbo.MyTable", "AnyName");
}
}
This is not easy work and I do not recommend doing it. Just use the SQL Server approach; it will save you time.
One more note: automatic ("magic") migrations sometimes do not work (for example, complex changes involving keys) when the changes cannot be handled automatically.
You can also use migration to a target version:
var configuration = new DbMigrationsConfiguration();
var migrator = new DbMigrator(configuration);
migrator.Update("HereMigrationId");
var scriptor = new MigratorScriptingDecorator(migrator);
var migrationScript = scriptor.ScriptUpdate(sourceMigration: null, targetMigration: "HereMigrationId");
In your case you need migration to a target version.
With the decorator you can modify the migration script during the migration.
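Since the question mentions having no direct access to the customer's database, one practical use of the generated script is to write it to a file that the customer (or an installer) can run. A small sketch continuing the snippet above; the file name is arbitrary:
using System.IO;

// migrationScript comes from scriptor.ScriptUpdate(...) above.
// Writing it to disk produces a plain SQL file that can be reviewed and executed
// on the customer's server without running Update-Database there.
File.WriteAllText("MigrateToTargetVersion.sql", migrationScript);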
I'd quite like to use ADO.NET to generate a CREATE TABLE script to create an exact copy of a given table.
The reason for this is persistence testing. I would like to know whether my application will persist to a particular database. I would like to be able to point the app to the database and table in question, and then the app will generate a new database with an exact copy of the specified table. Thus, persistence testing can take place against the cloned table without touching the original database, and when I'm done the new database can simply be dropped.
Before I embark on this ambitious project, I would like to know if anything already exists. I've tried Google, but all I can find are ways to get schema generation SQL through the SSMS UI, not through code.
You can use SQL Server Management Objects (SMO) for this.
Example (C#)
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.SqlServer.Management.Smo;
namespace ConsoleApplication1
{
class Program
{
static void Main(string[] args)
{
Server srv = new Server(#".\SQLEXPRESS");
Database db = srv.Databases["MyDB"];
Scripter scrp = new Scripter(srv);
scrp.Options.ScriptDrops = false;
scrp.Options.WithDependencies = true;
//Iterate through the tables in database and script each one. Display the script.
//Note that the StringCollection type needs the System.Collections.Specialized namespace to be included.
Microsoft.SqlServer.Management.Sdk.Sfc.Urn[] smoObjects = new Microsoft.SqlServer.Management.Sdk.Sfc.Urn[1];
foreach (Table tb in db.Tables)
{
smoObjects[0] = tb.Urn;
if (tb.IsSystemObject == false)
{
System.Collections.Specialized.StringCollection sc;
sc = scrp.Script(smoObjects);
foreach (string st in sc)
Console.WriteLine(st);
}
}
Console.ReadKey();
}
}
}
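Since the question is about cloning one specific table rather than the whole database, here is a slightly adjusted sketch that scripts a single named table and asks SMO to include its indexes and constraints. The database and table names are placeholders:
using System;
using Microsoft.SqlServer.Management.Smo;

class ScriptSingleTable
{
    static void Main()
    {
        Server srv = new Server(@".\SQLEXPRESS");
        Database db = srv.Databases["MyDB"];

        // Look the table up by name and schema, then script it together with its
        // indexes and all declarative referential integrity (keys and constraints).
        Table table = db.Tables["MyTable", "dbo"];
        var options = new ScriptingOptions
        {
            Indexes = true,
            DriAll = true,
            SchemaQualify = true
        };

        foreach (string statement in table.Script(options))
            Console.WriteLine(statement);
    }
}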