My query is about initializing the variables DefaultDataPath and DefaultLogPath without hard-coded file locations. Before adopting Database Projects as our standard deployment and database management tool and migrating our existing scripts to them, we used a set of CREATE and INITIALIZE scripts to set up the database. We use the following SQL to create the database with the file location:
DECLARE @data_path NVARCHAR(256), @mdb_file NVARCHAR(300), @ldf_file NVARCHAR(300),
        @cfdata SYSNAME, @cflog SYSNAME, @sql NVARCHAR(1000);
-- Directory containing master.mdf (sys.sysaltfiles is deprecated but still works on 2005/2008)
SET @data_path = (SELECT SUBSTRING(filename, 1, CHARINDEX(N'master.mdf', LOWER(filename)) - 1)
                  FROM sys.sysaltfiles WHERE dbid = 1 AND fileid = 1);
SET @mdb_file = @data_path + N'CF_DB.mdf';
SET @ldf_file = @data_path + N'CF_DB_log.ldf';
SET @cfdata = N'CF_DB_Data';
SET @cflog = N'CF_DB_Log';
-- QUOTENAME(..., '''') wraps the paths in single quotes, which FILENAME requires;
-- note QUOTENAME's input is capped at 128 characters
SET @sql = 'CREATE DATABASE [CF_DB] ON (NAME = ' + QUOTENAME(@cfdata)
         + ', FILENAME = ' + QUOTENAME(@mdb_file, '''')
         + ', SIZE = 53, FILEGROWTH = 10%) LOG ON (NAME = ' + QUOTENAME(@cflog)
         + ', FILENAME = ' + QUOTENAME(@ldf_file, '''')
         + ', SIZE = 31, FILEGROWTH = 10%) COLLATE SQL_Latin1_General_CP1_CI_AS';
EXEC (@sql);
Here we determine the location of the master database's MDF file and use the same location in CREATE DATABASE.
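For reference, the same directory can be derived from sys.master_files, the supported successor to sys.sysaltfiles; a minimal sketch, assuming master is database_id 1 / file_id 1 (which holds by convention):
-- Sketch: directory that holds master.mdf, via sys.master_files (SQL Server 2005+)
DECLARE @data_path NVARCHAR(260);
SELECT @data_path = LEFT(physical_name,
                         LEN(physical_name) - CHARINDEX('\', REVERSE(physical_name)) + 1)
FROM sys.master_files
WHERE database_id = 1 AND file_id = 1;
SELECT @data_path AS master_data_directory;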
Problem: in the scripts generated by the Deploy action, there are auto-generated SQLCMD variables, initialized either with a default (hard-coded) path or with empty strings (which fall back to the default data-file path used by SQL Server 2008 or 2005).
:setvar DatabaseName "CF"
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
We need it to work like our existing system: find the path of the master database's data and log files and use that path to initialize DefaultDataPath and DefaultLogPath. We can't use pre-deployment scripts, because the database settings are applied by the Database Project's generated script before the pre-deployment script is embedded in the final deploy script.
The next big thing: developers need to switch to SQLCMD mode in SQL Server Management Studio to run the scripts generated by the database project. Our implementation team's requirement is NOT to use SQLCMD mode to set up the database. To avoid that step, I need to modify the generated SQL file and use SQL variables instead of SQLCMD variables. Can we generate clean SQL statements while keeping the automated script generation intact? I know these two issues are correlated, so the solution to one is going to fix the other.
Thanks for any suggestions or help on the above.
Regards
Sumeet
Not sure how best to handle your file path, though I suspect you'll want to avoid the Default File Path setting and instead use a new file path that you can control through a variable.
It sounds to me like you're trying to have the developers update their local machines easily. My recommendation would be to build out some batch files that do the following:
Set the PATH to include the location of MSBuild.exe
Get the location of your master database
Pass that location in to a variable
Run the MSBuild command to set your path variables to the local master path and publish the database/changes
Overall, that sounds like more trouble than it's really worth. I'd recommend that you send out a SQL Script to all of the developers getting them to properly set up their Default Data/Log paths and just use the defaults.
I do recommend that you look into setting up some batch files to run the MSBuild commands. You'll have a much easier time getting the database builds to your developers without them generating scripts and running them locally. Alternatively, you could have them change their SSMS defaults to turn SQLCMD mode on for their connections. SSDT made this a little nicer because it won't run at all without SQLCMD mode turned on, which eliminated a lot of the messiness from the VS2008/VS2010 DB projects.
I used something like the following to build and deploy when we were on DB Projects:
msbuild /m .\MyDB\MyDB.dbproj /t:build
msbuild /m .\MyDB\MyDB.dbproj /t:deploy /p:TargetConnectionString="Data Source=localhost;Integrated Security=True;Pooling=False;" /p:TargetDatabase="MyDB"
When SSDT/VS generates the SQL file, it actually runs some queries against the server you've told it to connect to. The settings you're getting here for example...
:setvar DatabaseName "CF"
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
...are obtained from the server settings of the target database connection you specified in your publish file/profile.
On the server that you are using to generate your scripts, open regedit.exe and search for the keys "DefaultLog" and "DefaultData" under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft - they should be in the same location. See if they match the settings your scripts are generating.
You can change these on the server/your PC (wherever you are pointing to), and the locations you enter will appear in your generated SQL scripts. Be cautious, naturally, around a server you do not own or that is in use for production, as this changes a server setting that tells SQL Server where to place new databases. This seems to be a different setting from the one you enter in SQL Server properties -> Database Settings.
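If you'd rather check those values from T-SQL than regedit, the undocumented xp_instance_regread procedure reads the same instance-specific keys; a sketch, with the usual caveat that undocumented procedures can change between versions:
-- Sketch: read the instance's DefaultData/DefaultLog registry values from T-SQL.
-- xp_instance_regread resolves the instance-specific key path for you.
DECLARE @DefaultData NVARCHAR(260), @DefaultLog NVARCHAR(260);
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultData',
    @DefaultData OUTPUT;
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultLog',
    @DefaultLog OUTPUT;
-- Either value comes back NULL if it was never explicitly set, in which case
-- the engine falls back to its installation data root.
SELECT @DefaultData AS DefaultData, @DefaultLog AS DefaultLog;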
Hope that helps!
I am having a VERY difficult time publishing a pre-existing SQL Server project to a Docker hosted instance of SQL Server.
What I am attempting to do is make a clean pipeline for a Docker hosted instance to use in testing a SQL Server project, which of course starts with doing it first by hand to understand all the steps involved. The SQL Server project itself has been around for many years, and has no problems deploying to SQL Server instances hosted on Windows boxes.
As near as I can tell, the issue comes up while SSDT is generating the deployment script itself. In a normal deployment to a Windows-hosted SQL Server, the generated script starts out with some :setvar commands, including:
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\"
However, when publishing to a Docker-hosted instance of SQL Server with the same deployment process, the SQL script has:
:setvar DefaultDataPath ""
:setvar DefaultLogPath ""
The first thing this deployment does is alter the database to add an additional data file, e.g.:
ALTER DATABASE [$(DatabaseName)]
ADD FILE (NAME = [ARCHIVE_274A259D], FILENAME = N'$(DefaultDataPath)$(DefaultFilePrefix)_ARCHIVE_274A259D.mdf') TO FILEGROUP [ARCHIVE];
The Docker-based deployment then craps itself because the file path is (obviously) invalid.
In researching this problem, I've seen MANY solutions which hand-edit the generated deployment SQL script and manually set the "proper" values for DefaultDataPath and DefaultLogPath ... and even one solution that ran the generated SQL through some sort of post-processor to make that same edit programmatically with string replacement. This does work, but is less than optimal (especially in an automated build/test/deploy pipeline).
I've checked in the Docker instance itself, and its mssql.conf file does have defaults defined:
$ cat /var/opt/mssql/mssql.conf
[sqlagent]
enabled = false
[filelocation]
defaultdatadir = /var/opt/mssql/data/
defaultlogdir = /var/opt/mssql/log/
Can anybody shed light on why these are not being picked up by the SSDT process of generating the deploy script?
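For what it's worth, SSDT appears to populate those :setvar lines from what the instance reports as its defaults; a quick sanity check you can run against any instance, including the Docker one (SERVERPROPERTY has exposed these since SQL Server 2012):
-- Sketch: what does the target instance report as its default file locations?
-- If these come back NULL or empty, SSDT has nothing to put into the :setvar lines.
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS InstanceDefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath')  AS InstanceDefaultLogPath;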
I spent a few days trying various workarounds to the problem ...
Defined the DATA and LOG directories in the Docker "run" command, but this had no effect on the generated SQL deploy script, e.g.: -e 'MSSQL_DATA_DIR=/var/opt/mssql/data/' -e 'MSSQL_LOG_DIR=/var/opt/mssql/log/'
Configured the SQL project with SQLCMD variables. This method could not override DefaultDataPath or DefaultLogPath. I could add new variables, but those would not affect the file path of the ALTER DATABASE command above.
Tried a pre-deployment script specifically tailored to override the values of DefaultDataPath and DefaultLogPath. While this technically CAN override the default values, the pre-deployment script is included in the generated SQL deployment script after the ALTER DATABASE commands that add data files. It would effectively work for the rest of the script, just not the specific portion that was throwing the error on the initial deployment of the database.
At this point I feel there is either a SQL Server configuration option that I am simply unaware of, or possibly a flaw in SSDT which is preventing it from gathering the default path values from the Docker SQL Server instance. Any ideas?
I have a backup application, and some of my customers want their SQL Server databases backed up. I need SQL Server to give me compressed files with a fixed file name (without a timestamp). I have tried using the command 'SqlCmd -E -S...', but it does not compress the database (that needs a configuration change in SQL Server, which customers are not comfortable with). Due to a lack of free space on the hard disk, we also need the backup compressed. With SQL Server Management Studio, the backup always has a timestamp in the name. I need the backup file name fixed, e.g. ABC.BAK.
Do you necessarily have to use this backup application you mentioned? I'd just whip the T-SQL at it:
BACKUP DATABASE [CustomerDB] TO DISK = N'D:\ABC.BAK' WITH COMPRESSION
GO
That'll let you define the file names as well as override the server's default compression for that backup.
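One caveat worth adding: BACKUP appends to an existing backup set by default, so a fixed name like ABC.BAK will grow with every run. If the intent is one current backup per file, add INIT to overwrite the existing set; a sketch:
-- Sketch: overwrite the backup set in ABC.BAK on each run instead of appending.
-- (FORMAT would additionally rewrite the media header; INIT is enough here.)
BACKUP DATABASE [CustomerDB]
TO DISK = N'D:\ABC.BAK'
WITH COMPRESSION, INIT;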
You could use sqlcmd to both compress the backup and give the backup file whatever name you want. sqlcmd can take an input file to execute against a SQL Server instance. Here is an example of the sqlcmd you would run:
sqlcmd -S localhost -E -i ./SQLQuery1.sql
Where I'm connecting to my localhost installation of SQL Server (default instance name) with a trusted connection. You know this bit already, of course. The new thing is the command-line option at the end:
-i ./SQLQuery1.sql
Here is an example SQLQuery1.sql file (you can, of course, call it whatever you like):
USE [master]
GO
-- NVARCHAR needs an explicit length here; without one, DECLARE defaults to NVARCHAR(1)
DECLARE @Path NVARCHAR(260) = N'C:\Program Files\Microsoft SQL Server\MSSQL15.MSSQLSERVER\MSSQL\Backup\Example.bak';
BACKUP DATABASE [Example] TO DISK = @Path
WITH NAME = N'Example-Full Database Backup'
,COMPRESSION;
GO
Change the @Path variable to point wherever you want the backup to be written, and change the file name at the end of @Path to whatever you want to call the file.
I have an update tool for my program. The tool updates the SQL Server database with this code (VB and SQL):
Dim sql As Process = Process.Start("sqlcmd.exe", Param + " -i update.sql -o log.txt")
Param contains, among other things, the name of the .bak file, myprogram_update.bak.
update.sql starts with:
RESTORE DATABASE [myprogram_tmp]
FROM DISK = N'$(db_src)'
WITH FILE = 1,
MOVE N'myprogram_tmp' TO @mdf,
MOVE N'myprogram_tmp_log' TO @ldf,
NOUNLOAD, STATS = 5
@mdf and @ldf are standard paths read from the SQL Server settings.
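The post doesn't show how @mdf and @ldf get populated; on SQL Server 2012 and later, one way to derive them from the instance defaults might look roughly like this (a sketch; the file names are placeholders matching the RESTORE above):
-- Sketch: build target file paths from the instance's default data/log locations.
DECLARE @mdf NVARCHAR(260) = CAST(SERVERPROPERTY('InstanceDefaultDataPath') AS NVARCHAR(260))
                             + N'myprogram_tmp.mdf';
DECLARE @ldf NVARCHAR(260) = CAST(SERVERPROPERTY('InstanceDefaultLogPath') AS NVARCHAR(260))
                             + N'myprogram_tmp_log.ldf';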
The update tool is shipped to customers, and the problem is that SQL Server usually isn't allowed to read from users' Windows directories like Desktop or Downloads. But many customers unzip the archive to these directories, and then they get this error:
Could not access myprogram_update.bak / access denied.
I can't change the settings on every customer's server, so is there any way to make it work for these directories? One idea of mine was to run a setup first and unzip it to Program Files, but maybe there is a smarter solution.
Edit: the tool runs in administrator mode.
You should ask for elevation and run under an administrator account; maybe this will help:
http://www.downloadinformer.com/how-to-make-a-vb-net-application-always-run-in-administrator-mode/
I've now put my program into a setup which installs under an ordinary folder, like C:\Program Files (x86)\MyProgram. There, SQL Server has access.
I am using VS2012 and I have a database created:
(localdb)\v11.0 (SQL Server 11.0.2100 - T61\Alan)
How can I find out the physical location of this database? How can I back it up? Can I just make a copy of the files, move them to another location, and start the database again?
Here is my connection string:
<add name="DB1Context" connectionString="Data Source=(LocalDb)\v11.0;Initial Catalog=DB1;Integrated Security=SSPI;" providerName="System.Data.SqlClient" />
It is quite confusing for people who touch Entity Framework for the first time.
If you use Code First, an .mdf file is generated in %USERPROFILE% (e.g. C:\Users\<username>).
If you use Database First and create a database under SQL Server Object Explorer (not Server Explorer!), an .mdf file will be generated in %LOCALAPPDATA%\Microsoft\Microsoft SQL Server Local DB\Instances\MSSQLLocalDB.
By default, LocalDB creates *.mdf files in the C:\Users\<username> directory.
Link ref: https://docs.asp.net/en/latest/tutorials/first-mvc-app/working-with-sql.html
Are you saying you can see it listed in SQL Server Management Studio? Right-clicking on the database -> Properties -> Files will tell you where it lives on your hard disk. If you back up the .mdf, be sure to back up the .ldf too.
Alternatively, you can right-click on the DB and choose Tasks -> Backup. This will make a single .bak file for you, and you don't need to worry about the .mdf/.ldf.
http://technet.microsoft.com/en-us/library/hh510202.aspx
The system database files for the database are stored in the user's local AppData path, which is normally hidden. For example, C:\Users\--user--\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\LocalDBApp1. User database files are stored where the user designates, typically somewhere in the C:\Users\\Documents\ folder.
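Equivalently, you can ask the instance itself where a database's files live; a minimal query, assuming the database is named DB1 as in the connection string above:
-- Sketch: physical file locations for DB1, as reported by the engine.
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID(N'DB1');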
Try this one -
DECLARE
      @SQL NVARCHAR(1000)
    , @DB_NAME NVARCHAR(100) = 'AdventureWorks2008R2'
SELECT TOP 1 @SQL = '
BACKUP DATABASE [' + @DB_NAME + ']
TO DISK = ''' + REPLACE(mf.physical_name, '.mdf', '.bak') + ''''
FROM sys.master_files mf
WHERE mf.[type] = 0
AND mf.database_id = DB_ID(@DB_NAME)
PRINT @SQL
EXEC sys.sp_executesql @SQL
Output -
BACKUP DATABASE [AdventureWorks2008R2]
TO DISK = 'D:\DATABASE\SQL2012\AdventureWorks2008R2.bak'
Open Windows registry editor and navigate to key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server Local DB\Installed Versions. Look beneath the version key of the SQL Server instance being targeted e.g. 11.0 for SQL 2012, and see its InstanceAPIPath value for file system location of the localdb's.
Note that a full list of SQL Server versions mapped to release name and year can be found here.
This PowerShell script will give you the default location for LocalDB .mdf files:
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
(New-Object Microsoft.SqlServer.Management.Smo.Server("(localdb)\$instancename")).DefaultFile
where $instancename is the name of the LocalDB instance you want to check. You can get a list of LocalDB instances by running:
sqllocaldb i
I tried everything here and could not find them anywhere. I finally found them by searching for *.mdf in File Explorer. They were in C:\Users\user\source\repos\CallNote\App_Data.
I came back to a machine to work on it today and noticed that my batch files can no longer back up a copy of my database using the BACKUP DATABASE command.
The server is 2008 R2 for both Windows Server and SQL Server.
No more than a month ago this same batch file would back up my database. Now, when I manually run my SQL script, I get an "Operating System Error 3 - cannot find file specified".
Here's my script:
DECLARE @FileName varchar(50), @Date varchar(20)
SET @Date = REPLACE((CONVERT(VARCHAR(10), GETDATE(), 101)), '/', '-')
SET @FileName = ('C:\mybkfolder\BackupSQLData\db_dnt_' + @Date + '.bak')
BACKUP DATABASE db_dnt TO DISK = @FileName
GO
I have verified that SQL Server is running under the SYSTEM account. I can manually make a backup from SQL Server Management Studio to the folder in question. But if I run my script as a query in Management Studio, I get this error.
I went to the folder and set the security permissions so that the SYSTEM account has full access. Then I went to Task Scheduler and made sure my scheduled task (which runs a batch file; the batch calls a .sql file containing the script above) was configured to run under SYSTEM. Still nothing is backing up.
Other than Windows patches, nobody has really touched the machine. What causes this to just stop working? This is an Express edition of SQL Server, and since you can't create Agent jobs on Express like you can on the full product, I have to use a two-batch-file system where one batch backs up and the other purges old backups to keep the folder under control.
This turned out to be an issue with a path that no longer existed. My task called a batch file, and the batch called a .sql script. I had forgotten to update the path in the SQL script, even though I remembered to do so in the batch file.