I'm working on integrating SQL Server databases into our in-house version control/deployment utility, which is built with PowerShell and uses GitHub as a repository.
Using the excellent sqlpackage.exe utility, I have been able to add a process whereby a developer can extract their current changes into a dacpac and store it in GitHub, then do the reverse when they want to get the latest version. However, because the .dacpac is a binary file, it's not possible to see differences in git. I have mitigated this somewhat by unzipping the dacpac before storing it in source control, so the contained XML files are added instead. However, even though these files are text-based, they are still not easy to look through to find differences.
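(For reference, the unzip step is simple, since a dacpac is just a zip archive. A minimal sketch, with my own file names; the copy to .zip is only there because Expand-Archive expects a .zip extension:)
# A dacpac is just a zip archive; rename so Expand-Archive accepts it
Copy-Item .\your.dacpac .\your.zip
Expand-Archive .\your.zip -DestinationPath .\dacpac-contents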
What I would like to do is convert the dacpac into a folder structure similar to what you would see in SSMS (with all the database objects such as triggers, sprocs etc. in their respective folders), store that in GitHub, and then convert it back into a dacpac when a client checks out the code. However, there doesn't seem to be any function in sqlpackage.exe for this, and I can't find any documentation. Is there any command line tool I can use to do this through PowerShell?
Using the public APIs for DacFx you can load the dacpac, iterate over all objects, and script each one out. If you're willing to write your own code, you could write each one to its own file based on the object type. The basic process is covered in the model filtering samples in the DacExtensions GitHub project. Specifically, you'll want to do something like the ModelFilterer code that loads a dacpac, queries all objects, and scripts them out - see the CreateFilteredModel method. I've put a sample that should mostly work below. Once you have this, you can easily do a compare on a per-object basis.
using (TSqlModel model = new TSqlModel(dacpacPath))
{
    // Query every object in the model, regardless of scope
    IEnumerable<TSqlObject> allObjects = model.GetObjects(DacQueryScopes.All);
    foreach (TSqlObject tsqlObject in allObjects)
    {
        string script;
        if (tsqlObject.TryGetScript(out script))
        {
            // Some objects such as the DatabaseOptions can't be scripted out.
            // Write to disk by object type
            string objectTypeName = tsqlObject.ObjectType.Name;

            // pseudo-code as I didn't bother writing.
            // basically just create the folder and write a file
            this.MkdirIfNotExists(objectTypeName);
            this.WriteToFile(objectTypeName, tsqlObject.Name + ".sql", script);
        }
    }
}
This can be converted into a PowerShell cmdlet fairly easily. The DacFx libraries are on NuGet at https://www.nuget.org/packages/Microsoft.SqlServer.DacFx.x64/ so you should be able to install them in PS and then use the code without too much trouble.
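For instance, here's a rough sketch of pulling the package down and loading the DLL in PowerShell; the NuGet flat-download URL and the package's internal layout are assumptions, so adjust for your version:
# Download and unpack the DacFx NuGet package (a .nupkg is a zip)
Invoke-WebRequest 'https://www.nuget.org/api/v2/package/Microsoft.SqlServer.DacFx.x64' -OutFile dacfx.zip
Expand-Archive dacfx.zip -DestinationPath .\dacfx
# Load the extensions assembly from wherever it landed inside the package
$dll = Get-ChildItem .\dacfx -Recurse -Filter 'Microsoft.SqlServer.Dac.Extensions.dll' | Select-Object -First 1
Add-Type -Path $dll.FullName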
Based on the other post I was able to get a script working. Caveat is you'll have to experiment with the types till you get what you want... The way it is now, it tries to put the full http or https value for some of the objects.
param($dacpacPath = 'c:\somepath\Debug', $dacpac = 'your.dacpac')

# Load the DacFx assemblies (adjust the paths to your SQL Server version)
Add-Type -Path 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll'
Add-Type -Path 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.Extensions.dll'

cd $dacpacPath
# Load the model from the dacpac and query every object in it
$model = [Microsoft.SqlServer.Dac.Model.TSqlModel]::new((Get-Item ".\$dacpac").FullName)
$returnObjects = $model.GetObjects([Microsoft.SqlServer.Dac.Model.DacQueryScopes]::All)
$s = ''
foreach ($r in $returnObjects)
{
    if ($r.TryGetScript([ref]$s))
    {
        # One folder per object type, one .sql file per object
        $objectTypeName = $r.ObjectType.Name
        $d = "c:\temp\db\$objectTypeName"
        if (!(Test-Path $d))
        {
            New-Item $d -ItemType Directory | Out-Null
        }
        $filename = "$d\$($r.Name.Parts -join '.').sql"
        $s | Out-File $filename -Force
        Write-Output $filename
    }
}
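Usage (assuming you save it as, say, Export-DacpacObjects.ps1 - the script name is mine):
.\Export-DacpacObjects.ps1 -dacpacPath 'C:\YourProject\bin\Debug' -dacpac 'your.dacpac'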
I am trying to script out the replication objects via PowerShell using Microsoft.SqlServer.Rmo.dll. The replication type is transactional with push subscriptions.
I have been able to script out publications, articles, and PALs, but I am not able to script out the publisher-side subscriptions.
Reference
[Reflection.Assembly]::LoadFrom("C:\sql\Microsoft.SqlServer.Rmo.dll") | Out-Null
ScriptOptions
# IncludePublisherSideSubscriptions is one way I tried to get the subscriptions
$scriptargs = [Microsoft.SqlServer.Replication.ScriptOptions]::Creation `
    -bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeCreateLogreaderAgent `
    -bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeCreateMergeAgent `
    -bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeCreateQueuereaderAgent `
    -bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludePublicationAccesses `
    -bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeArticles `
    -bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludePublisherSideSubscriptions `
    -bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeGo
foreach ($replicateddatabase in $repsvr.ReplicationDatabases)
{
    if ($replicateddatabase.TransPublications.Count -gt 0)
    {
        foreach ($tranpub in $replicateddatabase.TransPublications)
        {
            [string]$myscript = $tranpub.Script($scriptargs) # Errors out here
            writetofile $myscript $filename 0
        }
    }
}
The other way I tried was to exclude IncludePublisherSideSubscriptions from the ScriptOptions and script the subscriptions out directly using the following statement:
foreach ($replicateddatabase in $repsvr.ReplicationDatabases)
{
    if ($replicateddatabase.TransPublications.Count -gt 0)
    {
        foreach ($tranpub in $replicateddatabase.TransPublications)
        {
            [string]$subs = $tranpub.TransSubscriptions.Script($scriptargs) # another way, but same error
            writetofile $subs $filename 0
        }
    }
}
The third way I tried:
$repsvr.ReplicationDatabases.TransPublications.TransSubscriptions.Script($scriptargs)
Of all the resources, I found the following link to be very helpful; my code is mostly based on it, but I just got stuck on scripting out the publisher-side subscriptions. I appreciate your help.
Microsoft.SqlServer.Replication has been deprecated since SQL Server 2012 (I think), and as of SQL Server 2017, it's no longer available. Follow the link and try switching the version to 2017 or 2019.
If you're using SQL Server 2016 or below, you may just need to load the RMO assembly from the GAC (Global Assembly Cache); this will do that for you:
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Rmo')
If, for whatever reason, you don't want to or can't use the GAC, do this:
Add-Type -Path 'C:\path\to\dll\Microsoft.SqlServer.Rmo.dll'
I'll dig and see if there's a different way to go about doing what you're trying to do.
Edit: Not sure if I'm misunderstanding the docs about RMO, or if they're just out of date.
If you take this link, it shows you how to configure publishing & distribution using RMO on SQL Server 2019. Step 2 references this, which shows the ReplicationServer class. The problem is, that's for SQL Server 2016. I'm not entirely sure if this is supported past 2016 or not.
I generate DACPAC files from automated builds with a generated version number. While that's useful during sqlpackage operations, I need to be able to determine the version number of a DACPAC before doing anything with the file.
What tooling can I use (automated of course) to query the DACPAC file for its version number and description?
Hey, I know you found a solution, but I have an alternative method that may help someone else.
By referencing Microsoft.SqlServer.Management.Dac.dll and using the DacType class:
using System.IO;
using Microsoft.SqlServer.Management.Dac;
(Not entirely sure which using statements are needed - I have copied this from a larger DAC helper file.)
using (Stream dacPackFileStream = File.Open(this.dacPackFileName, FileMode.Open))
{
    // DacType.Load reads the package metadata, including its version
    var dacType = DacType.Load(dacPackFileStream);
    return dacType.Version;
}
DACPAC files are actually zip files. Extract the zip and query the DacMetadata file, which is an XML file. Use the XPath /DacType/Version.
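A minimal PowerShell sketch of that approach, assuming the metadata entry is named DacMetadata.xml at the root of the package (check your dacpac if it differs):
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::OpenRead('C:\builds\your.dacpac')
try
{
    # Read the metadata entry and pull out /DacType/Version
    $entry = $zip.GetEntry('DacMetadata.xml')
    $reader = New-Object System.IO.StreamReader($entry.Open())
    [xml]$meta = $reader.ReadToEnd()
    $reader.Dispose()
    $meta.DacType.Version
}
finally
{
    $zip.Dispose()
}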
I have inherited a SQL CLR project as part of a code maintenance project that I'm working on for a client. I'm pretty new to SQL CLR, admittedly, so I'm trying to figure out how it works.
I noticed that the database connection string is stored in the project's Properties, so I know how to change it if I need to. The one question I have though is this: is it possible to set multiple connection strings for deployment to multiple SQL Server instances? In my case I have a local dev machine, a staging server, and a production server (with a separate copy of the target database on each server). I'd like to be able to deploy the SQL CLR assembly to all 3 without having to change the connection string and re-build for each one.
You should not deploy to anywhere but development via Visual Studio, hence the connection string in the Project should always point to your dev environment.
Once you have the code tested in the development server, you can script out the Assembly in SSMS by right-clicking on the Assembly in question and choosing "Script Assembly As...", then "Create To...", and then "New Query Window". This will give you the basic script that should be used to deploy to QA, Staging, and Production.
The general format is:
USE [DBName]
GO
CREATE ASSEMBLY [AssemblyName]
AUTHORIZATION [dbo]
FROM 0x0000...
WITH PERMISSION_SET = SAFE
You do not really need to propagate the Assembly Files to the other environments, though if you want to it does not hurt.
If you want to automate that, once you have that basic script you can always grab the updated Assembly code (what is noted as 0x0000 above) via:
SELECT Content FROM sys.assembly_files WHERE name = 'AssemblyName'
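If you want to automate that end-to-end, here is a rough PowerShell sketch of the idea - query the assembly bits from the dev server and emit a CREATE ASSEMBLY script. Server, database, and assembly names are placeholders:
$conn = New-Object System.Data.SqlClient.SqlConnection('Server=DevServer;Database=DBName;Integrated Security=True')
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT content FROM sys.assembly_files WHERE name = 'AssemblyName'"
[byte[]]$bytes = $cmd.ExecuteScalar()
$conn.Close()
# Turn the binary content into the 0x... literal that CREATE ASSEMBLY expects
$hex = '0x' + (($bytes | ForEach-Object { $_.ToString('X2') }) -join '')
@"
CREATE ASSEMBLY [AssemblyName]
AUTHORIZATION [dbo]
FROM $hex
WITH PERMISSION_SET = SAFE;
"@ | Out-File 'DeployAssembly.sql'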
Edit:
For the sake of completeness, as Jeremy mentioned in a comment below, the above info only describes deployment of the Assembly itself, not of the wrapper objects that access the code within the Assembly. A full deployment process would (see the sketch after this list):
Drop existing wrapper objects (Stored Procs, Functions, Triggers, Types, and Aggregates)
Drop the Assembly
Create the new Assembly
Create the wrapper objects
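A hypothetical sketch of those four steps with Invoke-Sqlcmd; every object name below is a placeholder, and DROP ... IF EXISTS needs SQL Server 2016+:
$steps = @(
    'DROP PROCEDURE IF EXISTS dbo.MyClrProc;',   # 1. drop wrapper objects
    'DROP ASSEMBLY IF EXISTS [AssemblyName];',   # 2. drop the assembly
    (Get-Content .\DeployAssembly.sql -Raw),     # 3. the scripted CREATE ASSEMBLY from above
    'CREATE PROCEDURE dbo.MyClrProc AS EXTERNAL NAME [AssemblyName].[Namespace.ClassName].[MethodName];' # 4. recreate wrappers
)
foreach ($sql in $steps)
{
    Invoke-Sqlcmd -ServerInstance 'StagingServer' -Database 'DBName' -Query $sql
}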
When you deploy the code to your development server, Visual Studio creates a .sql file in the bin/Release folder.
This can be useful for deployment, but it requires some cleaning.
Here is a Perl script I'm using to get a deployment script from the script created by VS.
It's closely tied to my needs and the file format (I'm using VS 2010 SP1, SQL 2008 R2, and Perl within Cygwin), so consider this an example; it may not work automagically for everyone.
use strict;
use warnings;
use Text::Unidecode 'unidecode'; # http://search.cpan.org/dist/Text-Unidecode/

# Filter one GO-delimited block of the VS deployment script,
# dropping the noise and adding existence checks and GRANTs.
sub ProcessBlock($)
{
    my $lines = $_[0];
    if ($lines =~ "Deployment script for") { return 0; }
    if ($lines =~ "^SET ") { return 0; }
    if ($lines =~ "^:") { return 0; }
    if ($lines =~ "^USE ") { return 0; }
    if ($lines =~ "^BEGIN TRANSACTION") { return 0; }
    if ($lines =~ "extendedproperty") { return 0; }
    if ($lines =~ "^PRINT ") { return 0; }
    if ($lines =~ "#tmpErrors") { return 0; }
    if ($lines =~ "^IF \@\@TRANCOUNT") { return 0; }

    # Guard DROP statements with existence checks
    my $drop = $lines;
    if ($drop =~ m/^DROP (FUNCTION|PROCEDURE) ([^ ]+);/m)
    {
        printf("if OBJECT_ID('$2') IS NOT NULL\n");
    }
    elsif ($drop =~ m/^DROP ASSEMBLY \[([^ ]+)\];/m)
    {
        printf("IF EXISTS (SELECT 1 FROM sys.assemblies WHERE name = '$1')\n");
    }
    printf($lines);
    printf("GO\n");

    # Grant access to whatever was just created
    my $create = $lines;
    if ($create =~ m/^CREATE PROCEDURE (\[[^]]+\])\.(\[[^]]+\])/m)
    {
        printf("GRANT EXECUTE ON $1.$2 TO PUBLIC\nGO\n");
    }
    elsif ($create =~ m/^CREATE FUNCTION (\[[^]]+\])\.(\[[^]]+\]).*RETURNS .* TABLE /ms)
    {
        printf("GRANT SELECT ON $1.$2 TO PUBLIC\nGO\n");
    }
    elsif ($create =~ m/^CREATE FUNCTION (\[[^]]+\])\.(\[[^]]+\])/m)
    {
        printf("GRANT EXECUTE ON $1.$2 TO PUBLIC\nGO\n");
    }
}

# Accumulate lines until a GO, then process the block
my $block = "";
while (<>)
{
    my $line = $_;
    $line = unidecode($line);
    if ($line =~ "^GO")
    {
        ProcessBlock($block);
        $block = "";
    }
    else
    {
        $block .= $line;
    }
}
Usage:
perl FixDeploy.pl < YourAssembly.sql > YourAssembly.Deploy.sql
Look here: The difference between the connection strings in SQLCLR. I think you should use the context connection if possible. That way you don't have to reconfigure anything.
If you need different credentials or something, you can query a settings table that holds those settings: use the context connection to connect, query the settings table to get the login details, and then use them to connect again.
Also: the connection string is in the properties, but as I understand it, settings.xml does not get deployed, so you'd always get the default values hardcoded into the settings class.
I should preface by saying my experience with scripting or programming in OOP languages is limited.
I'm working on a method for programmatically creating and executing SSIS packages using PowerShell. Unfortunately, most of the resources available for PowerShell and SSIS are for calling PS from SSIS, not the other way around.
I have, however, found a number of resources for VB/C# for creating SSIS packages.
Example resource here.
I've succeeded in converting most of the code by calling the DTS/SSIS assemblies, but it's failing now on converting the TaskHost object to a MainPipe.
Sample code:
[Void][Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ManagedDTS')
[Void][Reflection.Assembly]::LoadWithPartialName('Microsoft.Sqlserver.DTSPipelineWrap')
# Create the Package and application, set its generic attributes
$Package = New-Object Microsoft.SqlServer.Dts.Runtime.Package
$Package.CreatorName = $CreatorName
$App = New-Object Microsoft.SqlServer.Dts.Runtime.Application
# Set connection info for our package
$SourceConn = $package.Connections.Add("OLEDB")
$SourceConn.Name = "Source Connection"
$SourceConn.set_ConnectionString("Data Source=$SourceServer;Integrated Security=True")
$TargetConn = $package.Connections.Add("OLEDB")
$TargetConn.Name = "Target Connection"
$TargetConn.set_ConnectionString("Data Source=$TargetServer;Integrated Security=True")
# Build the tasks
# Data Flow Task - actually move the table
[Microsoft.SQLServer.DTS.Runtime.Executable]$XferTask = $Package.Executables.Add("STOCK:PipelineTask")
$XferTaskTH = [Microsoft.SqlServer.Dts.Runtime.TaskHost]$XferTask
$XferTaskTH.Name = "DataFlow"
$XferTaskTH.Description = "Dataflow Task Host"
$DataPipe = [Microsoft.SQLServer.DTS.pipeline.Wrapper.MainPipeClass]($XferTaskTH.InnerObject)
Everything works fine till the last line, when I get the error:
Cannot convert the "System.__ComObject" value of type "System.__ComObject#{}" to type "Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipeClass"
Any assistance or ideas are welcome!
Microsoft.SqlServer.DTSPipelineWrap makes heavy use of COM instances.
This forum post suggested using the CreateWrapperOfType method:
http://social.technet.microsoft.com/Forums/en-US/ITCG/thread/0f493a31-fbf0-46ac-a6a5-8a10af8822cf/
You could try this:
$DataPipe = [System.Runtime.InteropServices.Marshal]::CreateWrapperOfType($XferTaskTH.InnerObject, [Microsoft.SQLServer.DTS.pipeline.Wrapper.MainPipeClass])
It doesn't error out, and it produces an object - I'm just not sure of what type.
You could always just compile the working .NET version you referenced above into an exe, and allow it to accept parameters as needed in order to create the SSIS packages. Then use PowerShell to call the executable with the parameters as needed.
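The call from PowerShell is then a one-liner; the executable name and parameters here are purely hypothetical:
& .\CreateSsisPackage.exe -SourceServer 'SourceSql' -TargetServer 'TargetSql' -PackagePath 'C:\packages\transfer.dtsx'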
Is it possible to create a simple way to back up the event log, such as with a batch file or a simple app?
I need to make it work on a customer's site, where the reference person is a non-expert user.
Thanks
If you're using Windows 2008, use the built-in wevtutil command. Example:
wevtutil epl Application c:\temp\foo.evtx
Otherwise, get dumpel.exe from the resource kit, or psloglist from http://technet.microsoft.com/en-us/sysinternals/bb897544.aspx
With PowerShell and Export-Clixml it's a one-liner:
get-eventlog -list | %{ get-eventlog $_.Log | export-clixml -path ($_.Log + ".xml") }
The Microsoft Script Center has some sample code for Backing Up and Clearing Event Logs using VBScript and WMI.
Frank-Peter Schultze's Scripting Site has some code to clear an event log (http://www.fpschultze.de/uploads/clrevt.vbs.txt) that you can modify to back up, or back up and then clear.
If you have access to the server, you can back up from the Event Viewer by right-clicking on a log and using the "Save Log File As..." command. You can save to a binary, tab-delimited, or comma-delimited file.
Finally, I made a little Windows app using this method I found on the internet:
public void DoBackup(string sLogName)
{
    string sBackup = sLogName; // could be for example "Application"
    EventLog log = new EventLog();
    log.Source = sBackup;

    string sBackupName = sBackup + "Log";

    // Dump every entry, newest first, into a simple XML document
    var xml = new XDocument(
        new XElement(sBackupName,
            from EventLogEntry entry in log.Entries
            orderby entry.TimeGenerated descending
            select new XElement("Log",
                new XElement("Message", entry.Message),
                new XElement("TimeGenerated", entry.TimeGenerated),
                new XElement("Source", entry.Source),
                new XElement("EntryType", entry.EntryType.ToString())
            )
        )
    );

    // Timestamped file name, e.g. ApplicationLog_20240101_120000.xml
    string sToday = DateTime.Now.ToString("yyyyMMdd_HHmmss");
    string path = String.Format("{0}_{1}.xml", sBackupName, sToday);
    xml.Save(Path.Combine(Environment.CurrentDirectory, path));
}
This is the source link:
It simply works great!