SQL Server Project Publish to a Docker Hosted Instance - SSDT and DefaultDataPath

I am having a VERY difficult time publishing a pre-existing SQL Server project to a Docker hosted instance of SQL Server.
What I am attempting to do is make a clean pipeline for a Docker hosted instance to use in testing a SQL Server project, which of course starts with doing it first by hand to understand all the steps involved. The SQL Server project itself has been around for many years, and has no problems deploying to SQL Server instances hosted on Windows boxes.
As near as I can tell, the issue comes while SSDT is generating the SQL Server deployment script itself. In a normal deployment to a Windows hosted SQL Server, the generated script starts out with some :setvar commands, including:
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\"
However, when publishing to a Docker hosted instance of SQL Server with the same deployment process, the SQL script has:
:setvar DefaultDataPath ""
:setvar DefaultLogPath ""
The first thing this deployment does is alter the database by adding an additional data file, e.g.:
ALTER DATABASE [$(DatabaseName)]
ADD FILE (NAME = [ARCHIVE_274A259D], FILENAME = N'$(DefaultDataPath)$(DefaultFilePrefix)_ARCHIVE_274A259D.mdf') TO FILEGROUP [ARCHIVE];
The Docker based deployment then craps itself because the file path is (obviously) invalid.
In researching this problem, I've seen MANY solutions which hand-edit the generated deployment SQL script, and manually set the "proper" values for DefaultDataPath and DefaultLogPath ... and even one solution that ran the generated Sql through some sort of post-processor to make that same edit in a programmatic way with string replacement. This does work, but is less than optimal (especially in an automated build/test/deploy pipeline).
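For what it's worth, that post-processing step can be as small as a string replacement over the generated script. A minimal sketch (the script path and the replacement paths are assumptions; they should match your pipeline and the mssql.conf defaults shown below) might look like:

# Minimal sketch of the post-processing workaround; the script path and the
# replacement paths are assumptions and should match your pipeline and mssql.conf.
$scriptPath = "C:\build\output\MyDatabase.publish.sql"
(Get-Content $scriptPath) `
    -replace ':setvar DefaultDataPath ""', ':setvar DefaultDataPath "/var/opt/mssql/data/"' `
    -replace ':setvar DefaultLogPath ""',  ':setvar DefaultLogPath "/var/opt/mssql/log/"' |
    Set-Content $scriptPath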
I've checked in the Docker instance itself, and its mssql.conf file does have defaults defined:
$ cat /var/opt/mssql/mssql.conf
[sqlagent]
enabled = false
[filelocation]
defaultdatadir = /var/opt/mssql/data/
defaultlogdir = /var/opt/mssql/log/
Can anybody shed light on why these are not being picked up by the SSDT process of generating the deploy script?
I spent a few days trying various workarounds to the problem ...
Defined the DATA and LOG directories in the Docker "run" command, but this had no effect on the generated SQL deploy script, e.g.: -e 'MSSQL_DATA_DIR=/var/opt/mssql/data/' -e 'MSSQL_LOG_DIR=/var/opt/mssql/log/' (a fuller docker run sketch is shown after these workarounds).
Configured the SQL project with SQLCMD variables. This method could not override DefaultDataPath or DefaultLogPath. I could add new variables, but those would not affect the file path of the ALTER DATABASE command above.
Tried a Pre-Deployment script specifically tailored to override the values of DefaultDataPath and DefaultLogPath. While this technically CAN override the default values, the Pre-Deployment script is included in the generated Sql deployment script after the ALTER DATABASE commands to add data files. It would effectively work for the rest of the script, just not the specific portion that was throwing the error on initial deployment of the database.
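For reference, the docker run attempt from the first workaround looked roughly like the sketch below; the image tag, container name and SA password are placeholders, not the exact values I used.

# Rough sketch of the docker run command from the first workaround; the image tag,
# container name and SA password are placeholders.
docker run -d --name sql-test -p 1433:1433 `
    -e "ACCEPT_EULA=Y" `
    -e "MSSQL_SA_PASSWORD=SomeStrong!Passw0rd" `
    -e "MSSQL_DATA_DIR=/var/opt/mssql/data/" `
    -e "MSSQL_LOG_DIR=/var/opt/mssql/log/" `
    mcr.microsoft.com/mssql/server:2019-latest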
At this point I feel there is either a SQL Server configuration option that I am simply unaware of, or possibly a flaw in SSDT which is preventing it from gathering the default path values from the Docker SQL Server instance. Any ideas?

Related

Bitbucket and Database Development

I have a Windows server with MS SQL Server running on it.
On the SQL Server developers have created stored procedures, views, tables, triggers.
On the Windows server developers created shell scripts.
I would like to start versioning the code described above in a BitBucket repository. I have a repository created in BitBucket.
How should the branches be organized in this repository? i.e. "SQL Server\Database\..." and "Windows Server\shell_script\..."
Can I connect BitBucket to SQL Server and Windows Server and specify which code needs to be versioned?
Are both options 1 and 2 above possible?
I just need to version control the changes to the code and have the ability to mark under which project the code change was made.
I am new to BitBucket. I am using the web front end of it. I do not know how to configure command line access, so please try not to reference Bitbucket commands. Sorry if I sound confusing.
Please help.
I know this is an old question but anyway, in principle I'd recommend:
Put all the server shell scripts into one place and make that a git repo linked to your bitbucket repo
Add a server shell script to export what you want version controlled from the SQL db (a sketch of such a script is shown after this answer)
The export from the SQL db should be to text files so they are easily 'diffable'
You might as well make the export to a sub-directory within the shell scripts repo so that everything is in one place and can't get out of sync
So you only have one branch, not a separate one for server shell scripts and db
Make sure people run the export script and then commit everything when they make a change
You ideally have a test server, which means you'd want a way to push changes from the repo into the SQL db. I presume you can do this with a script, by deleting the server setup and re-creating it from the text files.
So basically, you can't connect an SQL db to bitbucket directly. You need scripts to read and write to the db from a repo.
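As a starting point for the export script mentioned above, here is a rough sketch using the SqlServer PowerShell module's SMO objects; the server name, database name and output folder are assumptions:

# Rough sketch of an export script: write each table, view and stored procedure
# out as a .sql file so the schema can be diffed and committed to the repo.
# Assumes the SqlServer PowerShell module is installed; names and paths are placeholders.
Import-Module SqlServer

$server = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost"
$db     = $server.Databases["MyDatabase"]
$outDir = Join-Path $PSScriptRoot "db-schema"
New-Item -ItemType Directory -Force -Path $outDir | Out-Null

$objects = @($db.Tables) + @($db.Views) + @($db.StoredProcedures) |
    Where-Object { -not $_.IsSystemObject }

foreach ($obj in $objects) {
    $file = Join-Path $outDir ("{0}.{1}.sql" -f $obj.Schema, $obj.Name)
    # Script() returns the CREATE statement(s) for the object
    $obj.Script() | Out-File -FilePath $file -Encoding utf8
}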

Automating import of data-tier application (SQL database) from Azure with a Master Key

When I extracted a data-tier application from a Microsoft Azure SQL database that has a Master Key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However the steps provided as the answer did not work on my installation.
Steps are
1. Disable auditing on the server (or database)
2. Drop the database master key with DROP MASTER KEY command.
Microsoft Tech Support verified this solution did not work on my installation of SQL Server and, after actually taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the Master Key from the bacpac file. I have a Powershell script to remove the Master Key from the BACPAC file but it requires extracting, renaming files and running scripts from Windows Powershell to get the db imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others to save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows Powershell script (also cobbled from multiple sources) to extract a data-tier application (database) from Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code that is in my scripts from other blogs etc. I am not able to provide the credit due to those folks as I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to give you the credit for your work.
There may be configuration settings on your PC and your local SQL server that need adjustments as this entire solution requires pretty much full access to your computer. If you run into trouble with compatibility, let me know and I will do the best I can to let you know how my system is configured in case it will help you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The Powershell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are as follows (steps 1 and 2 are optional, as I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB):
Back up the current DB as MyLocalDB.bak.
Restore that backup from step 1 to a new DB with the previous day stamped at the end of the DB name (e.g., MyLocalDB20171231)
Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on)
Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name)
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC MASTER..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
The Powershell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing to do is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This will need to be removed from the files prior to importing the bacpac file into your local DB or it will fail (this may not be required in SQL 2017 according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone. It won't remove anything if it isn't there. It would just add a little overhead to the program.
Opens the today.bacpac file (a zip file) and removes the MasterKey node from the Origin.xml file.
Modifies the Model.xml file to update the SHA hash length. This is required so the file does not appear to have been tampered with when SQL Server opens the bacpac file.
Re-zips the files back into a new file, today-patched.bacpac.
Runs this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa; Password=MySAPassword; Initial Catalog=MyLocalDB; Integrated Security=false;"
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!
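For anyone who just wants the gist of the Azure extract step without reading the GitHub files, it boils down to a SqlPackage Export call along the lines of the sketch below; the server, database, credentials and paths are placeholders rather than the exact values from my scripts.

# Sketch of the "extract from Azure into today.bacpac" step; all names, paths and
# credentials below are placeholders.
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe"
& $sqlPackage /Action:Export `
    /TargetFile:"C:\Git\GetUpdatedAzureDB\today.bacpac" `
    /SourceServerName:"myserver.database.windows.net" `
    /SourceDatabaseName:"MyAzureDB" `
    /SourceUser:"MyAzureLogin" `
    /SourcePassword:"MyAzurePassword"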

Deploying Dacpacs to an Availability Group in a locked-down production

My DBA and I are trying to work out how to effectively use Microsoft's Database projects and the Dacpacs they generate to simplify our production deployment system.
Ideally, I would be able to build and/or publish the .sqlproj, generating a .dacpac file, which can then be uploaded to the production server and used to upgrade the database from whatever version it was to the latest version. This is similar to how we're doing website deployments, where I publish to a package, and then that package is uploaded to the server and imported into IIS.
However, we can't work out how to make this work. The DBA has already created the database and added it to our Availability Groups. And every time we try to apply the Dacpac, it tries to adjust settings which it can't because of the AGs.
Nothing I've been able to do has managed to create a .dacpac file which doesn't try to impose settings on the database. The closest option I've found will exclude them when publishing, but as best as I can tell you can't publish to an inaccessible database, and only the DBA has access to the production server.
Can I actually use dacpacs this way?
There are two parts to this. Firstly, how do you stop deploying settings you don't want to deploy - can you give an example of one of the settings that doesn't apply?
For the second part where you do not have access to the SQL Server there are a few different ways to handle this:
Use an offline copy to generate the deploy script
Get the DBA to generate the deploy script
Get the DBA to deploy using the dacpac
Get read only access to the database
Option 1: "Use an offline copy to generate the deploy script"
You need to compare the dacpac to something and if you do not have a TDS connection (default instance default port tcp:1433) then you can use a version of the database that matches production either through:
Use log shipping to restore a copy of production somewhere you can access it
Get a development db and production in sync, then every release goes to the dev and prod databases, ensuring that they stay in sync
The log shipped copy is the easiest; if it is to a development server you can normally have server permissions to give you access, or you can create the correct permissions at the database level but not at the production server level.
If the data is sensitive then the log shipped copy might not be appropriate, so you could try to keep a development and production database in sync, but this is difficult and requires that the DBA be "well trained" into not running anything that isn't first run against the dev database as well.
Once you have access to a database that has exactly the same schema as the production database you can use sqlpackage.exe /action:script to generate a deploy script, in fact because it isn't the production database you can generate the script as part of your CI process :).
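As a rough sketch (the SqlPackage path, dacpac path, server and database names below are placeholders), generating the script rather than deploying looks like:

# Sketch: generate the deploy script from the dacpac against the offline / log-shipped copy;
# all paths and names are placeholders.
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Script `
    /SourceFile:"C:\build\MyDb.dacpac" `
    /TargetServerName:"DevSqlServer" `
    /TargetDatabaseName:"MyDb" `
    /OutputPath:"C:\build\MyDb-deploy.sql"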
Option 2: "Get the DBA to generate the deploy script"
This is to get the DBA to copy the dacpac to the production server and to use sqlpackage.exe, which will be in the "Program Files (x86)\Microsoft Sql Server\Version\DAC\bin" folder, to compare the dacpac to the database and generate a script that he can review before deploying.
Option 3: "Get the DBA to deploy using the dacpac"
This is similar to option 2, but instead of generating a script that he deploys in SSMS, he just uses sqlpackage.exe /Action:Publish to deploy the changes directly.
Option 4: "Get read only access to the database"
This is actually my preferred as it means that you always build scripts against what is guaranteed to be the state of production (as it is production). In your case you would need to get the tcp port between your machine or ideally your build machine and the SQL Server and then you will need these permissions:
https://the.agilesql.club/Blogs/Ed-Elliott/What-Permissions-Do-I-Need-To-Generate-A-Deploy-Script-With-SSDT
As I said option 4 is always my preferred but I understand that it isn't always possible.
Options 2 and 3 are fraught with worry as you will be running scripts that haven't been tested anywhere; with options 1 and 4 you can generate the scripts and then deploy them to a test / QA database, as long as those themselves have the same schema as production. The scripts can also go through a code review process.
If you do option 2 / 3 then I would create a batch file or powershell script that drives sqlpackage.exe, and if they deploy from a different server that doesn't have sqlpackage.exe then you can copy the DAC folder to that machine and run sqlpackage from there; you do not have to actually install it (you may need to also copy in the Microsoft.SqlServer.TransactSql.ScriptDom.dll from the "Program Files (x86)\Microsoft Sql Server\Version\SDK\Assemblies" folder).
I hope this helps, if you have any more questions feel free to post here or ping me :)
ed

How do I change "Database default locations" for LocalDB in SQL Server Management Studio?

Connect to LocalDB in SSMS
Open Server Properties -> Database Settings
Change Data/Log/Backup locations -> click OK
When I click OK I get this error:
I found a blog post and changed this in regedit, but it didn't help.
Anyone got any other ideas I could try?
I do not believe that these default paths for SQL Server LocalDB are changeable. This is quite unfortunate due to what appears to be a bug with SQL Server Express 2017 LocalDB (fixed as of CU 6 for SQL Server 2017), as per this question (and my answer to it) on DBA.StackExchange:
LocalDB v14 creates wrong path for mdf files
HOWEVER, you do not need to use the default paths. Those are used when you create a database without specifying the physical locations. If you specify the physical location, then you should be able to create the files in any folder / directory that you have read / write access to.
After making that change in the registry, try restarting the SQL instance.
Also, I would make sure that the account running SQL Server has the ability to write to that folder.
For an easy test you could go to the folder properties -> Security, add the account 'Everyone', and give it Full Control, then try making that change. If it works, it was a permissions issue for that account. Accounts generally don't have access to other users' accounts without some level of admin rights.
After 10 years this is still an issue for the current version (15.0) of Microsoft SQL Server Express.
After a bit of investigation I discovered there is an issue with permissions inside the registry. The process sqlservr.exe cannot create entries in HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL15E.LOCALDB\MSSQLServer.
On my computer this process is running under my account, so I opened regedit and gave myself Full Control permission to this key. And it worked like a charm. I hope this will help you as well.
Changing these paths in RegEdit or SSMS doesn't work; SQL LocalDB won't respect these values for existing databases. One has to move the databases manually. Here is a reliable way to change a database location for any LocalDB instance.
First, make sure you work with a correct instance of SQL Server LocalDB. In command prompt enter:
sqllocaldb info
It will show the LocalDB instances you have on your machine. Let's assume that the instance name is MSSQLLocalDB.
Next, execute this script on your database (let's call it TestApp), using SqlCmd tool or SSMS:
alter database TestApp
modify file (name = TestApp, filename = 'C:\NewDataLocation\TestApp.mdf');
go
alter database TestApp
modify file (name = TestApp_log, filename = 'C:\NewDataLocation\TestApp_log.ldf');
go
Now, stop the SQL LocalDB instance, in command prompt:
sqllocaldb stop MSSQLLocalDB
Move the database files to the new location that you specified in the script. From %UserProfile%\TestApp.mdf (which is where they are located) to C:\NewDataLocation\TestApp.mdf, same for LDF file.
Start the SQL LocalDB instance again:
sqllocaldb start MSSQLLocalDB
Now your database is working from a new location. Repeat the steps for any other databases.
Paths Cannot Be Changed in SQL Server LocalDB "Automatic Instance" Types
In case anyone in 2023 finds out they cannot change their default database file storage paths, this article is for you!
This error applies to Microsoft SQL Server not being able to allow you to change the default file folder location on your PC where the SQL Server Database Files are saved (database and logs files, .mdf and .ldf).
Most developers need control over where local database files are saved. Most prefer to store them in a central location, another drive, or simply the main SQL Server database repository inside C:\Program Files\Microsoft SQL Server\{sql version name}\MSSQL\DATA, since that is where system data storage goes. One example of the problem of not being able to customize database file storage is Entity Framework Core, which runs "migration" scripts that create databases in SQL Server. When it does so, where those scripted databases get stored is heavily dependent on SQL Server's default file path settings. When the location of those EF code-first database files using LocalDB is locked down, developers are stuck with SQL files in multiple locations on their PCs.
THE PROBLEM
Apparently, when Microsoft installs SQL Server / SQL Express on your device, it attempts to install a default instance of the server as a specialized type called a "LocalDB Automatic Instance". They do this to get the user up and running fast with a "LocalDB" SQL Server instance: a one-time, "light", custom-created server running as a public instance, complete with default settings customized for the user (or developer) so he can get up and running fast. The automatic type has the advantage that it is granted permissions to the user as administrator in SQL, as well as granting all applications on the user's device public access to the server instance. (You will notice that IIS Express works this way using application pools as dummy Windows user accounts, creating default accounts next to your user account in Windows to run app pools in IIS.) These SQL Server LocalDB binaries do not run as a service but on demand. Only one of the "automatic" types may be installed per version per device. The other SQL Server LocalDB type is the named instance, which is not as restricted as the automatic one, apparently.
The problem is, when they create this special LocalDB automatic instance, it locks down certain settings and applies certain permissions and settings that are unique just for this instance. This then limits what the user can do as far as customizations, one of which is the "Database default locations" in the Properties dialog box that appears when you right-click your sql server instance and choose properties.
Anyone using the full SQL Server version, or who has created a new instance of LocalDB, deleting the old one, will not experience this issue, so most of those people are scratching their heads.
But for local developers, what this means is your Sql Server LocalDB databases running under this instance of the server will typically store their databases under a locked down path...either the path you chose on install or default to the user-friendly account paths under C:\Users\{YourName}.
When users attempt to change the path in the properties box for the instance, many users online over the past 5-6 years have noticed a nasty "RegCreateKeyEx() returned error 5, Access is denied" error that would appear when saving a default path. Microsoft doesn't bother to tell you, but that is intentional. They don't expect to allow you to save paths to the registry for the instance, and assume everyone is OK with the default path.
You can fix the key error by going into your registry and changing permission on the Microsoft SQL Server registry keys, assigning the "Everyone" group account to the registry node managing these keys. In the Registry, add Everyone group account to this node below then try and save a new default path in the properties box for your sql server localdb instance:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server
The default database file path keys (2019) for an instance of the LocalDB server are located here in the Windows registry:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL15E.LOCALDB\MSSQLServer
You are then able to save the new default paths in SQL, and the error goes away. Saving your default path in the Properties box works now, and the new values appear in the registry.
Even though you can change these paths, they will not stick; they reset back to the user account path by default. Even if you save a new default SQL path for your databases, when you create a new database it still reverts to the old path. Again, this applies ONLY to users who are running under the default "Automatic" LocalDB instance created on install of SQL Express.
So even after restarting SQL, restarting your PC, or restarting the SQL Service, those registry values will still not pull the registry keys into the SQL Server instance settings for Default file paths.
As proof, run these two scripts below in your SQL Server LocalDB instance. The first one returns the actual LocalDB default file paths SQL Server stores internally. The second script returns what is stored in your registry for the LocalDB default file path. If you saved new default path registry keys, they should be the same and shown in SQL Server instance properties, but they are different! That means Microsoft has decided not to allow you to change them for those running the "automatic" instance type of LocalDB on install. Below is the T-SQL to run to test this:
-- GETS THE PATH STORED IN SQL SERVER FOR "DefaultData" path
SELECT
[Value] = 'DefaultData',
[Data] = SERVERPROPERTY('InstanceDefaultDataPath')
-- DefaultData C:\Users\YourAccountName\
-- GETS WHATS IN THE REGISTRY FOR "DefaultData" path
EXECUTE [master].dbo.xp_instance_regread
N'HKEY_LOCAL_MACHINE',
N'SOFTWARE\Microsoft\Microsoft SQL Server\MSSQLServer',
N'DefaultData'
-- DefaultData C:\Program Files\Microsoft SQL Server\MSSQL15.MSSQLSERVER\MSSQL\DATA
-- Note: If the second one returns `NULL` it just means you
-- have not yet tried or succeeded in saving a new file path
-- to your registry.
Why isn't SQL Server LocalDB pulling in the registry values?
What this means, again, is: sorry, you can't change these default paths. Your best bet is to simply "detach" your databases, copy the .mdf and .ldf files to your new preferred folder, then reattach them (a rough sketch of this is below). When you create new databases, the console allows you to change the database file path there as well. There are also some elaborate SQL scripts you can run to set paths before saving files.
But just know this is by design.
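A rough sketch of that detach / move / re-attach route, assuming the default MSSQLLocalDB instance, placeholder paths, and that the SqlServer module's Invoke-Sqlcmd is available:

# Sketch only: detach the database, move its files, then re-attach from the new folder.
# Instance name, database name and paths are assumptions; the target folder must already exist.
$instance = "(localdb)\MSSQLLocalDB"
$newDir   = "C:\NewDataLocation"

Invoke-Sqlcmd -ServerInstance $instance -Query "EXEC sp_detach_db @dbname = N'TestApp';"

Move-Item "$env:USERPROFILE\TestApp.mdf"     "$newDir\TestApp.mdf"
Move-Item "$env:USERPROFILE\TestApp_log.ldf" "$newDir\TestApp_log.ldf"

Invoke-Sqlcmd -ServerInstance $instance -Query @"
CREATE DATABASE [TestApp]
ON (FILENAME = N'$newDir\TestApp.mdf'),
   (FILENAME = N'$newDir\TestApp_log.ldf')
FOR ATTACH;
"@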
I think one of the purposes of LocalDB is that it is very convenient for bundling a demo database along with the source files of an application. The database file and its log, of course, are somewhere in the source file directory.
Take a Visual Studio solution for example, in web.config or app.config, you can see something like this:
<connectionStrings>
  <add name="DefaultConnection" connectionString="Data Source=(LocalDb)\v11.0;Initial Catalog=aspnet-XXXXXX-20140609153630;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|\aspnet-XXXXXX-20140609153630.mdf" providerName="System.Data.SqlClient" />
</connectionStrings>
Now that the location of every LocalDB is specified in the config file, I don't think "default location" makes much sense.

Setting DefaultDataPath and DefaultLogPath programmatically (Using SQL Statements to initialize the path)

My query is about initializing the variables DefaultDataPath and DefaultLogPath without hard-coded file locations. Before adopting Database Projects as our standard deployment and database management tool, and migrating our existing scripts to database projects, we had been using a set of CREATE and INITIALIZE scripts to set up the database. We use the following SQL query to CREATE the database with the file location:
DECLARE @data_path nvarchar(256), @mdb_file nvarchar(512), @ldf_file nvarchar(512),
        @cfdata sysname, @cflog sysname;

SET @data_path = (SELECT SUBSTRING(filename, 1, CHARINDEX(N'master.mdf', LOWER(filename)) - 1)
                  FROM sys.sysaltfiles WHERE dbid = 1 AND fileid = 1);
SET @mdb_file = @data_path + 'CF_DB.mdf';
SET @cfdata = 'CF_DB_Data';
SET @cflog = 'CF_DB_Log';
SET @ldf_file = @data_path + 'CF_DB_log.ldf';

DECLARE @sql nvarchar(500);
-- quotename with '''' wraps the file paths in single quotes, as FILENAME expects a string literal
SET @sql = 'CREATE DATABASE [CF_DB] ON (NAME = ' + quotename(@cfdata) + ', FILENAME = ' + quotename(@mdb_file, '''') + ', SIZE = 53, FILEGROWTH = 10%) LOG ON (NAME = ' + quotename(@cflog) + ', FILENAME = ' + quotename(@ldf_file, '''') + ', SIZE = 31, FILEGROWTH = 10%) COLLATE SQL_Latin1_General_CP1_CI_AS';
EXEC (@sql);
Here we are trying to figure out the location of the MDF file for the master DB and use the same location in CREATE DATABASE.
Problem: in the scripts generated (after the Deploy action), there are auto-generated SQLCMD variables, initialized either with some hardcoded default path or with empty strings (which fall back to the default data file path used by SQL Server 2005 or 2008).
:setvar DatabaseName "CF"
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
We need to make it work like our existing system: find the path of the master DB data and log files and use that same path to initialize DefaultDataPath and DefaultLogPath. We can't go with pre-deployment scripts because the database settings are applied by the Database Project generated script before the pre-deployment script is embedded in the final deploy script.
NEXT big thing: developers need to switch to SQLCMD mode in SQL Server Management Studio to run the scripts generated by the database project. Our implementation team's requirement is NOT to use SQLCMD mode to set up the database. To overcome this step, I need to modify the generated SQL file and use SQL variables instead of SQLCMD variables. Can we generate clean SQL statements while keeping the automated script generation intact? I know both of these issues are correlated, so the solution for one is going to fix the other.
Thanks for any good suggestions or help with the above.
Regards
Sumeet
Not sure how best to handle your file path, though I suspect you will want to not use the Default File Path setting and instead use a new file path that you can control through a variable.
It sounds to me like you're trying to have the developers update their local machines easily. My recommendation would be to build out some batch files that do the following:
Set the PATH to include the location for MSBuild.exe
Get the location for your master database
Pass that location in to a variable
Run the MSBuild command to set your path variables to the local master path and publish the database/changes.
Overall, that sounds like more trouble than it's really worth. I'd recommend that you send out a SQL Script to all of the developers getting them to properly set up their Default Data/Log paths and just use the defaults.
I do recommend that you look into setting up some batch files to run the MSBuild commands. You'll have a lot easier time getting the database builds to your developers without them generating scripts and running them locally. Alternatively, you could have them change their SSMS defaults to set SQLCMD mode on for their connections. SSDT made this a little nicer because it won't run at all without SQLCMD mode turned on - eliminated a lot of the messiness from VS2008/VS2010 DBProjects.
I used something like the following to build and deploy when we were on DB Projects:
msbuild /m .\MyDB\MyDB.dbproj /t:build
msbuild /m .\MyDB\MyDB.dbproj /t:deploy /p:TargetConnectionString="Data Source=localhost;Integrated Security=True;Pooling=False;" /p:TargetDatabase="MyDB"
When SSDT/VS generates the SQL file, it actually runs some queries against the server you've told it to connect to. The settings you're getting here for example...
:setvar DatabaseName "CF"
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
are obtained from the server settings of the target database connection you specified in your publish file/profile.
On the server that you are using to generate your scripts, open regedit.exe and search for the keys "DefaultLog" and "DefaultData" under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft - they should be in the same location. See if they match the settings your scripts are generating.
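If you'd rather not hunt through regedit by hand, a quick sketch to list whatever DefaultData / DefaultLog values exist (the instance key names vary by version and install) is:

# Sketch: list DefaultData / DefaultLog registry values for every "MSSQLServer" key
# found under the Microsoft SQL Server hive; key names vary by version/instance.
Get-ChildItem "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server" -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.PSChildName -eq "MSSQLServer" } |
    ForEach-Object {
        $props = Get-ItemProperty $_.PSPath
        if ($props.DefaultData -or $props.DefaultLog) {
            [pscustomobject]@{
                Key         = $_.Name
                DefaultData = $props.DefaultData
                DefaultLog  = $props.DefaultLog
            }
        }
    }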
You can change these on the server / your PC (wherever you are pointing to) and those locations will then appear in your generated SQL scripts. Be cautious, naturally, around a server you do not own or that is in use for production etc., as this will change a setting on the server which tells SQL Server where to place new databases. This seems to be a different setting than the one you enter in SQL Server properties -> Database Settings.
Hope that helps!
