Automate turning Named Pipes and TCP/IP on - SQL Server

I am working on the install of a new product that requires modifications to SQL Server.
Specifically, I need to enable TCP/IP and turn on Named Pipes. I know how to do it manually. What I want is a way to automate this for a new customer through SQL or with C# code.
I would love any suggestions to point me in the right direction.

You can use C# and Server Management Objects (SMO) to do it. The classes you need are in the Microsoft.SqlServer.Smo and Microsoft.SqlServer.WmiManagement libraries.
Here's a PowerShell snippet I've used that relies on the same objects. Hopefully, it will point you down the right path.
$smo = 'Microsoft.SqlServer.Management.Smo.'
$wmi = new-object ($smo + 'Wmi.ManagedComputer')
# List the object properties, including the instance names.
$Wmi
# Enable the TCP protocol on the default instance.
$uri = "ManagedComputer[#Name='<computer_name>']/ ServerInstance[#Name='MSSQLSERVER']/ServerProtocol[#Name='Tcp']"
$Tcp = $wmi.GetSmoObject($uri)
$Tcp.IsEnabled = $true
$Tcp.Alter()
$Tcp
# Enable the named pipes protocol for the default instance.
$uri = "ManagedComputer[#Name='<computer_name>']/ ServerInstance[#Name='MSSQLSERVER']/ServerProtocol[#Name='Np']"
$Np = $wmi.GetSmoObject($uri)
$Np.IsEnabled = $true
$Np.Alter()
$Np
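Note that the change only takes effect after the SQL Server service is restarted. A minimal way to do that from the same session (this assumes the default instance; a named instance's service is MSSQL$<InstanceName>):
# Restart the default instance so the newly enabled protocols take effect
Restart-Service -Name 'MSSQLSERVER' -Force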

Related

How do you manage DSNs on Windows with Python 3?

I prefer to connect to databases with DSNs. I'm not a fan of putting user names and passwords in code or config files, nor do I appreciate the trusted connection approach.
When I google how to MANAGE DSNs with Python, I wind up with some variation of the code below.
import ctypes
ODBC_ADD_DSN = 1         # Add data source
ODBC_CONFIG_DSN = 2      # Configure (edit) data source
ODBC_REMOVE_DSN = 3      # Remove data source
ODBC_ADD_SYS_DSN = 4     # Add a system DSN
ODBC_CONFIG_SYS_DSN = 5  # Configure a system DSN
ODBC_REMOVE_SYS_DSN = 6  # Remove a system DSN

def create_sys_dsn(driver, **kw):
    """Create a system DSN
    Parameters:
        driver - ODBC driver name
        kw - Driver attributes
    Returns:
        0 - DSN not created
        1 - DSN created
    """
    nul = chr(32)
    attributes = []
    for attr in kw.keys():
        attributes.append("%s=%s" % (attr, kw[attr]))
    if (ctypes.windll.ODBCCP32.SQLConfigDataSource(0, ODBC_ADD_SYS_DSN, driver, nul.join(attributes))):
        return True
    else:
        print(ctypes.windll.ODBCCP32.SQLInstallerError)
        return False

if __name__ == "__main__":
    if create_sys_dsn("SQL Server", SERVER="server name", DESCRIPTION="SQL Server DSN", DSN="SQL SERVER DSN", Database="ODS", Trusted_Connection="Yes"):
        print("DSN created")
    else:
        print("DSN not created")
When I run this, I wind up with this as output:
<_FuncPtr object at 0x0000028E274C8930>
I have two issues.
I'm not used to working with the OS through an API, and I can't find much documentation or usage examples that aren't in C++. That said, the code runs; it just never returns True.
I can't figure out how to get it to kick out error information so I can diagnose the problem. This code leaves no footprint in the logs, so nothing shows up in Event Viewer to indicate something is wrong.
How can I troubleshoot this? Am I even doing the right thing by taking this route? What exactly IS best practice for connecting to databases with Python?
There are actually TWO answers to this question.
1. You can do it by calling PowerShell scripts from Python (see the sketch below).
2. You don't. You use trusted connections in your connection string.
I went with option 2.
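For anyone who prefers option 1, the PowerShell side can be as simple as the built-in Add-OdbcDsn cmdlet (available on Windows 8 / Server 2012 and later); a minimal sketch, reusing the placeholder server and database values from the question:
# Create a system DSN for the SQL Server driver; adjust the property values to your environment
Add-OdbcDsn -Name 'SQL SERVER DSN' -DriverName 'SQL Server' -DsnType 'System' -SetPropertyValue @('Server=server name', 'Database=ODS', 'Trusted_Connection=Yes', 'Description=SQL Server DSN')
Calling a script like this from Python (for example via subprocess) is what option 1 amounts to.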

Issue with scripting out subscriptions using Windows PowerShell

I am trying to script out the replication objects via PowerShell using Microsoft.SqlServer.Rmo.dll. The replication type is transactional with push subscriptions.
I have been able to script out the publication, articles, and PALs, but I am not able to script out the publisher-side subscriptions.
Reference
[reflection.assembly]::LoadFrom("c:\sql\Microsoft.SqlServer.Rmo.dll") | out-null
ScriptOptions
$scriptargs =[Microsoft.SqlServer.Replication.ScriptOptions]::Creation `
-bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeCreateLogreaderAgent `
-bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeCreateMergeAgent `
-bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeCreateQueuereaderAgent `
-bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludePublicationAccesses `
-bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeArticles `
-bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludePublisherSideSubscriptions <# one way I tried to get the subscriptions #> `
-bor [Microsoft.SqlServer.Replication.ScriptOptions]::IncludeGo
foreach($replicateddatabase in $repsvr.ReplicationDatabases)
{
if ($replicateddatabase.TransPublications.Count -gt 0)
{
foreach($tranpub in $replicateddatabase.TransPublications)
{
[string] $myscript = $tranpub.Script($scriptargs) # errors out here
writetofile $myscript $filename 0
}
}
}
The other way I tried was to exclude IncludePublisherSideSubscriptions from the ScriptOptions and script the subscriptions out directly using the following statement:
foreach($replicateddatabase in $repsvr.ReplicationDatabases)
{
if ($replicateddatabase.TransPublications.Count -gt 0)
{
foreach($tranpub in $replicateddatabase.TransPublications)
{
[string] $subs=$tranpub.TransSubscriptions.script($scriptargs) #another way but same error
writetofile $subs $filename 0
}
}
}
The third way I tried:
$repsvr.ReplicationDatabases.TransPublications.TransSubscriptions.Script($scriptargs)
Of all the resources, I found the following link to be very helpful; my code is mostly based on it, but I just got stuck on scripting out the publisher-side subscriptions. I appreciate your help.
Microsoft.SqlServer.Replication has been deprecated since SQL Server 2012 (I think), and as of SQL Server 2017, it's no longer available. Follow the link and try switching the version to 2017 or 2019.
If you're using SQL Server 2016 or below, you may just need to load the RMO assembly from the GAC (Global Assembly Cache); this will do that for you:
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Rmo')
If, for whatever reason, you don't want to or can't use the GAC, do this:
Add-Type -Path 'C:\path\to\dll\Microsoft.SqlServer.Rmo.dll'
I'll dig, and see if there's a different way to go about doing what you're trying to do.
Edit: Not sure if I'm misunderstanding the docs about RMO, or if they're just out of date.
If you take this link, it shows you how to configure publishing & distribution using RMO on SQL Server 2019. Step 2 references this, which shows the ReplicationServer class. The problem is, that's for SQL Server 2016. I'm not entirely sure whether this is supported past 2016 or not.
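In the meantime, if the assembly does load for your version, one thing worth trying is to call Script on each subscription object rather than on the TransSubscriptions collection. This is a sketch only, reusing the variables and the writetofile helper from your question, and it assumes TransSubscription exposes a Script(ScriptOptions) method like the publication class does:
foreach ($replicateddatabase in $repsvr.ReplicationDatabases)
{
    foreach ($tranpub in $replicateddatabase.TransPublications)
    {
        # Script each subscription individually instead of calling Script() on the collection
        foreach ($sub in $tranpub.TransSubscriptions)
        {
            [string] $subscript = $sub.Script($scriptargs)
            writetofile $subscript $filename 0
        }
    }
}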

Convert dacpac into folder structure of database objects with powershell

I'm working on integrating SQL Server databases into our in-house version control/deployment utility, which is built with PowerShell and uses GitHub as a repository.
Using the excellent sqlpackage.exe utility, I have been able to add a process whereby a developer can extract their current changes into a dacpac and store it in GitHub, then do the reverse when they want to get the latest version. However, because the .dacpac is a binary file, it's not possible to see differences in git. I have mitigated this somewhat by unzipping the dacpac before storing it in source control, so the contained XML files are added instead. However, even though these files are text-based, they are still not easy to look through to find differences.
What I would like to do is convert the dacpac into a folder structure similar to what you would see in SSMS (with all the database objects such as triggers, sprocs etc. in their respective folders), store that in GitHub, and then convert it back into a dacpac when a client checks out the code. However, there doesn't seem to be any function in sqlpackage.exe for this, and I can't find any documentation. Is there any command line tool I can use to do this through PowerShell?
Using the public APIs for DacFx you can load the dacpac, iterate over all objects, and script each one out. If you're willing to write your own code, you could write each one to its own file based on the object type. The basic process is covered in the model filtering samples in the DacExtensions GitHub project. Specifically, you'll want to do something like the ModelFilterer code that loads a dacpac, queries all objects, and scripts them out; see the CreateFilteredModel method. I've put a sample that should mostly work below. Once you have this, you can easily do comparisons on a per-object basis.
using (TSqlModel model = new TSqlModel(dacpacPath))
{
    IEnumerable<TSqlObject> allObjects = model.GetObjects(DacQueryScopes.All);
    foreach (TSqlObject tsqlObject in allObjects)
    {
        string script;
        if (tsqlObject.TryGetScript(out script))
        {
            // Some objects such as the DatabaseOptions can't be scripted out.
            // Write to disk by object type
            string objectTypeName = tsqlObject.ObjectType.Name;
            // pseudo-code as I didn't bother writing it out:
            // basically just create the folder and write a file
            this.MkdirIfNotExists(objectTypeName);
            this.WriteToFile(objectTypeName, tsqlObject.Name + ".sql", script);
        }
    }
}
This can be converted into a PowerShell cmdlet fairly easily. The DacFx libraries are on NuGet at https://www.nuget.org/packages/Microsoft.SqlServer.DacFx.x64/ so you should be able to install them in PS and then use the code without too much trouble.
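For example, one way to pull the package down and load it (a sketch only; it assumes nuget.exe is on the PATH, and the version/framework folder names are placeholders for whatever the install actually produces):
nuget install Microsoft.SqlServer.DacFx.x64 -OutputDirectory .\packages
# Adjust <version> and <framework> to match the folders nuget created
Add-Type -Path '.\packages\Microsoft.SqlServer.DacFx.x64.<version>\lib\<framework>\Microsoft.SqlServer.Dac.Extensions.dll'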
Based on the other post, I was able to get a script working. The caveat is that you'll have to experiment with the types until you get what you want... The way it is now, it tries to put the full http or https value for some of the objects.
param($dacpacPath = 'c:\somepath\Debug', $dacpac = 'your.dacpac')
Add-Type -Path 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll'
add-type -path 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.Extensions.dll'
cd $dacpacPath
$dacPack = [Microsoft.SqlServer.Dac.DacPackage]::Load(((get-item ".\$dacpac").fullname))
$model =[Microsoft.SqlServer.Dac.Model.TSqlModel]::new(((get-item ".\$dacpac").fullname))
$queryScopes = [Microsoft.SqlServer.Dac.Model.DacQueryScopes]::All
$return = [System.Collections.Generic.IEnumerable[string]]
$returnObjects = $model.GetObjects([Microsoft.SqlServer.Dac.Model.DacQueryScopes]::All)
$s = ''
foreach($r in $returnObjects)
{
    if ($r.TryGetScript([ref]$s))
    {
        $objectTypeName = $r.ObjectType.Name;
        $d = "c:\temp\db\$objectTypeName"
        if (!(test-path $d))
        {
            new-item $d -ItemType Directory
        }
        $filename = "$d\$($r.Name.Parts).sql"
        if (!(test-path $filename))
        {
            new-item $filename -ItemType File
        }
        $s | out-file $filename -Force
        write-output $filename
    }
}
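If you save this as, say, Convert-Dacpac.ps1 (the file name is just for illustration), a run against a particular build output looks like:
.\Convert-Dacpac.ps1 -dacpacPath 'C:\MyDb\bin\Debug' -dacpac 'MyDb.dacpac'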

Configure Slick with Sql Server

I have a project that is currently using MySQL that I would like to migrate to SQL Server (running on Azure). I have tried a lot of combinations of configurations but always get the same generic error message:
Cannot connect to database [default]
Here is my latest configuration attempt:
slick.dbs.default.driver = "com.typesafe.slick.driver.ms.SQLServerDriver"
slick.dbs.default.db.driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
slick.dbs.default.db.url = "jdbc:sqlserver://my_host.database.windows.net:1433;database=my_db"
slick.dbs.default.db.user = "username"
slick.dbs.default.db.password = "password"
slick.dbs.default.db.connectionTimeout="10 seconds"
I have the sqljdbc4.jar in my lib/ folder.
And have added the following to my build.sbt
libraryDependencies += "com.typesafe.slick" %% "slick-extensions" % "3.0.0"
resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/maven-releases/"
Edit: I can connect from this machine using a GUI app, so the issue is not with any of the network settings.
Edit: 5/30/2017
After the release of Slick 3.2, the driver is now in the core suite. These are examples of configs with 3.2:
oracle = {
driver = "slick.jdbc.OracleProfile$"
db {
host = ${?ORACLE_HOST}
port = ${?ORACLE_PORT}
sid = ${?ORACLE_SID}
url = "jdbc:oracle:thin:#//"${oracle.db.host}":"${oracle.db.port}"/"${oracle.db.sid}
user = ${?ORACLE_USERNAME}
password = ${?ORACLE_PASSWORD}
}
}
sqlserver = {
driver = "slick.jdbc.SQLServerProfile$"
db {
host = ${?SQLSERVER_HOST}
port = ${?SQLSERVER_PORT}
databaseName = ${?SQLSERVER_DB_NAME}
url = "jdbc:sqlserver://"${sqlserver.db.host}":"${sqlserver.db.port}";databaseName="${sqlserver.db.databaseName}
user = ${?SQLSERVER_USERNAME}
password = ${?SQLSERVER_PASSWORD}
}
}
End Edit
I only have experience with the Oracle config, but I believe it is fairly similar. You are missing the critical $ at the end of the default driver. Also, you will need to make sure your SBT project recognizes the lib.
This first code snippet should be in application.conf, or whatever file you are using for your configuration:
oracle = {
driver = "com.typesafe.slick.driver.oracle.OracleDriver$"
db {
host = ""
port = ""
sid = ""
url = "jdbc:oracle:thin:#//"${oracle.db.host}":"${oracle.db.port}"/"${oracle.db.sid}
user = ${?USERNAME}
password = ${?PASSWORD}
driver = oracle.jdbc.driver.OracleDriver
}
}
This second section is in my build.sbt. I put my Oracle driver in the /.lib folder under the base directory, although there may be a better way.
unmanagedBase := baseDirectory.value / ".lib"
Finally, to make sure the config is loading properly: the Slick default seems to misbehave, so hopefully you get a right answer rather than a "what works for me" answer. Utilizing my config above, I can then load it using the last snippet. I found this in an example of a cake-pattern implementation, and it has worked very well in multiple projects.
val dbConfig: DatabaseConfig[JdbcProfile] = DatabaseConfig.forConfig("oracle")
implicit val profile: JdbcProfile = dbConfig.driver
implicit val db: JdbcProfile#Backend#Database = dbConfig.db
This allows you to use the database and the driver for imports, and it will fail at compile time if your configuration is wrong. Hope this helps.
Edit: I finished and realized you were working with Azure, so make sure that you can fully connect with the same settings from the same machine using a client of your choice, to confirm that all firewall and user settings are correct and that the problem truly lies in your code and not in your system configuration.
Edit 2: I wanted to make sure I didn't give you bad advice, since mine was an Oracle config, so I set it up against an AWS SQL Server instance. I used the sqljdbc42.jar that Microsoft provides with their JDBC install, put that in .lib, and then had a configuration like the following. As in the example above, you could use environment variables instead, but this was just a quick proof of concept. Here is a Microsoft SQL Server config I have now tested and confirmed works:
sqlserver = {
driver = "com.typesafe.slick.driver.ms.SQLServerDriver$"
db {
driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
host = ""
port = ""
databaseName = ""
url = "jdbc:sqlserver://"${sqlserver.db.host}":"${sqlserver.db.port}";databaseName="${sqlserver.db.databaseName}
user = ""
password = ""
}
}

Assembling SSIS Packages in PowerShell

I should preface by saying my experience with scripting or programming in OOP languages is limited.
I'm working on a method for programmatically creating and executing SSIS packages using PowerShell. Unfortunately, most of the resources available for PowerShell and SSIS are for calling PS from SSIS, not the other way around.
I have, however, found a number of resources for VB/C# for creating SSIS packages.
Example resource here.
I've succeeded in converting most of the code by calling the DTS/SSIS assemblies, but it's failing now on converting the TaskHost object to a mainpipe.
Sample code:
[Void][Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ManagedDTS')
[Void][Reflection.Assembly]::LoadWithPartialName('Microsoft.Sqlserver.DTSPipelineWrap')
# Create the Package and application, set its generic attributes
$Package = New-Object Microsoft.SqlServer.Dts.Runtime.Package
$Package.CreatorName = $CreatorName
$App = New-Object Microsoft.SqlServer.Dts.Runtime.Application
# Set connection info for our package
$SourceConn = $package.Connections.Add("OLEDB")
$SourceConn.Name = "Source Connection"
$SourceConn.set_ConnectionString("Data Source=$SourceServer;Integrated Security=True")
$TargetConn = $package.Connections.Add("OLEDB")
$TargetConn.Name = "Target Connection"
$TargetConn.set_ConnectionString("Data Source=$TargetServer;Integrated Security=True")
# Build the tasks
# Data Flow Task - actually move the table
[Microsoft.SQLServer.DTS.Runtime.Executable]$XferTask = $Package.Executables.Add("STOCK:PipelineTask")
$XferTaskTH = [Microsoft.SqlServer.Dts.Runtime.TaskHost]$XferTask
$XferTaskTH.Name = "DataFlow"
$XferTaskTH.Description = "Dataflow Task Host"
$DataPipe = [Microsoft.SQLServer.DTS.pipeline.Wrapper.MainPipeClass]($XferTaskTH.InnerObject)
Everything works fine til the last line, when I get the error:
Cannot convert the "System.__ComObject" value of type "System.__ComObject#{}" to type "Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipeClass"
Any assistance or ideas are welcome!
Microsoft.SqlServer.DTSPipelineWrap makes heavy use of COM instances.
This forum post suggested using the CreateWrapperOfType method:
http://social.technet.microsoft.com/Forums/en-US/ITCG/thread/0f493a31-fbf0-46ac-a6a5-8a10af8822cf/
You could try this:
$DataPipe = [System.Runtime.InteropServices.Marshal]::CreateWrapperOfType($XferTaskTH.InnerObject, [Microsoft.SQLServer.DTS.pipeline.Wrapper.MainPipeClass])
It doesn't error out and produces an object; I'm not sure of what type.
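One quick way to see what you actually got back (just a diagnostic suggestion, not from the original forum post):
# Inspect the wrapper's runtime type and the members it exposes
$DataPipe.GetType().FullName
$DataPipe | Get-Member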
You could always just compile the working .NET version you referenced above into an exe, and allow it to accept parameters as needed in order to create the SSIS packages. Then, use PowerShell to call the executable with those parameters.
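For example (the executable name and parameter names below are hypothetical; use whatever your compiled tool actually accepts):
# Call the compiled package builder and check its exit code
& 'C:\Tools\BuildSsisPackage.exe' -SourceServer 'SRC01' -TargetServer 'TGT01' -OutputPath 'C:\Packages\Transfer.dtsx'
if ($LASTEXITCODE -ne 0) { throw "Package generation failed with exit code $LASTEXITCODE" }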
