I am building a web application deployment script in PowerShell, and for one task I am trying to create and restore a database from a SQL Server backup file.
The files end up on the user's desktop, so when I instruct SQL Server to restore one, it complains with an 'Access is denied.' error when reading the backup.
RESTORE DATABASE [Acme] FROM DISK = 'C:\Users\matthew\Desktop\my-database.bak' WITH REPLACE
This responds with:
Msg 3201, Level 16, State 2, Line 2
Cannot open backup device 'C:\Users\matthew\Desktop\my-database.bak'. Operating system error 5(Access is denied.).
Moving the file to a publicly accessible area like C:\Temp works, as indicated in the following answer: Why can't I read from .BAK files on my Desktop using SQL Express in Windows Authentication Mode
However, C:\Temp is not a standard Windows temp directory. Since I am using PowerShell, I am leveraging .NET libraries, such as GetTempPath. This ends up pointing to
C:\Users\matthew\AppData\Local\Temp
which still has the same permission problem.
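For reference, this is the call I am using to resolve that path:
# Resolves to the calling user's %TEMP%, e.g. C:\Users\matthew\AppData\Local\Temp
$tempDir = [System.IO.Path]::GetTempPath()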
Is there a standard way to get a temporary directory that any local user can access?
EDIT: to clarify, the user matthew and the user that is restoring the backup are different.
It's not uncommon to create a folder C:\Temp as a system-wide temp directory. For a backup/restore scenario you just need a folder that's accessible by both the backup and the restore user, be it a custom folder, a built-in public folder like C:\Users\Public\Documents, or a user profile with adjusted permissions.
However, from a security point of view it's probably a good idea to create a dedicated folder (e.g. C:\backup) to which only the required users have access, e.g. like this:
$backupDir   = 'C:\backup'
$backupUser  = 'DOMAIN\userA'
$restoreUser = 'DOMAIN\userB'

# Helper that builds an inheritable allow ACE for a given user and right.
function New-Ace {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)]
        [string]$User,
        [Parameter(Mandatory=$true)]
        [string]$Access
    )
    New-Object Security.AccessControl.FileSystemAccessRule ($User, $Access,
        'ObjectInherit, ContainerInherit', 'None', 'Allow')
}

$dir = New-Item -Type Directory $backupDir
$acl = Get-Acl -Path $dir
# Disable inheritance and discard inherited ACEs so only the rules below apply.
$acl.SetAccessRuleProtection($true, $false)
$acl.AddAccessRule((New-Ace -User $backupUser -Access 'Modify'))   # backup user can write the .bak
$acl.AddAccessRule((New-Ace -User $restoreUser -Access 'Read'))    # restore user (e.g. the SQL Server service account) only needs to read
$acl | Set-Acl -Path $dir
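For the backup/restore flow itself, the usage then looks something like this (instance name and paths are examples; Invoke-Sqlcmd ships with the SQL Server PowerShell tools):
# Copy the backup out of the user profile into the shared folder
Copy-Item 'C:\Users\matthew\Desktop\my-database.bak' 'C:\backup\my-database.bak'
# Restore from the location the restore user / SQL Server service account can read
Invoke-Sqlcmd -ServerInstance '.\SQLEXPRESS' -Query "RESTORE DATABASE [Acme] FROM DISK = 'C:\backup\my-database.bak' WITH REPLACE"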
Related
I've created an Azure RM VM image of Windows 2008 R2 with SQL Server 2014 installed. The image was created with a data disk where I placed the SQL Server data directory (the location for the system databases, error logs, etc.). The image was sysprepped and then generalized, all successfully.
I created a new VM from the above image, pointing to the OS and data disk URIs. The VM gets created, but I have to go into Computer Management > Disk Management and provision the drive from the presented volume. Since SQL Server's startup process looks for the error logs, system databases, etc., which do not exist there, it's basically a failed install.
Is there a way to preserve the data on the data disk, then provision that into Windows, programmatically?
Is there a way to preserve the data on the data disk, then provision that into Windows, programmatically?
Yes, you can. You can use Azure PowerShell to create an image of a generalized Azure VM and then use that image to create another VM. The image includes the OS disk and the data disks that are attached to the virtual machine. I have tested this in my lab and it works for me.
# Stop and deallocate the VM, then mark it as generalized
Stop-AzureRmVM -ResourceGroupName shuitest1 -Name shui -Force
Set-AzureRmVm -ResourceGroupName shuitest1 -Name shui -Generalized
# Check that the VM now reports the "Generalized" status
$vm = Get-AzureRmVM -ResourceGroupName shuitest1 -Name shui -Status
$vm.Statuses
# Capture the image (OS disk and data disks) and save the deployment template locally
Save-AzureRmVMImage -ResourceGroupName shuitest1 -Name shui -DestinationContainerName "shuitest" -VHDNamePrefix "shuitest" -Path "D:\Filename.json"
For more information about how to capture a VM image from a generalized Azure VM, please refer to this link.
You can use the image (it contains the OS disk and data disks, but no virtual network) to deploy your VM. For more information about how to create a VM from a generalized managed VM image, please refer to this link.
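If you go with a managed image, the linked article boils down to something like the following sketch (image name, VM name, size and an existing NIC in $nic are placeholders/assumptions, not values from your setup):
# Build a VM configuration from a managed image and an existing NIC
$image = Get-AzureRmImage -ResourceGroupName shuitest1 -ImageName shuiImage
$cred  = Get-Credential
$vmCfg = New-AzureRmVMConfig -VMName shui2 -VMSize Standard_DS1_v2
$vmCfg = Set-AzureRmVMSourceImage -VM $vmCfg -Id $image.Id
$vmCfg = Set-AzureRmVMOperatingSystem -VM $vmCfg -Windows -ComputerName shui2 -Credential $cred
$vmCfg = Add-AzureRmVMNetworkInterface -VM $vmCfg -Id $nic.Id
New-AzureRmVM -ResourceGroupName shuitest1 -Location "West US" -VM $vmCfg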
Also, you could use the local JSON file to deploy your VM; you need to create a NIC in the Azure Portal first. If you deploy the VM this way, it does not get a public IP, so you need to add one manually. I tested this in my lab and it works for me. If possible, I suggest you use the local JSON file to redeploy your VM. The following is my cmdlet.
New-AzureRmResourceGroupDeployment -Name ExampleDeployment -ResourceGroupName shuitest1 -TemplateFile "D:\Filename.json"
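To add the public IP manually afterwards, something along these lines works (resource names are placeholders):
# Create a public IP and attach it to the existing NIC
$pip = New-AzureRmPublicIpAddress -Name shuiPip -ResourceGroupName shuitest1 -Location "West US" -AllocationMethod Dynamic
$nic = Get-AzureRmNetworkInterface -Name shuiNic -ResourceGroupName shuitest1
$nic.IpConfigurations[0].PublicIpAddress = $pip
Set-AzureRmNetworkInterface -NetworkInterface $nic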
I've been using the following backup script for a while to create .bacpac files for MSSQL databases. The script works OK in general but fails for some databases. Those databases are not much different from the others, maybe a bit bigger. As this is used only for dev databases, the average size is not big; the bacpac file size is ~200 MB.
The weird thing is that the script runs successfully when executed from PowerShell ISE but fails when run from the command line. Please note that the script only fails for some databases and works for others. The error message is not really helpful:
WARNING: Exception occurred: Exception calling "ExportBacpac" with "2" argument(s): "Could not export schema and data from database."
We are using MSSQL 2014, and the database can be exported from SQL Server Management Studio without problems.
The script:
Param(
    # Database name to backup, e.g. 'MYDB'
    $databaseName,
    # Database connection string "server=server ip;Integrated Security = True;User ID=user;Password=pass"
    $connectionString,
    # Path to the directory where the backup file should be created
    $backupDirectory
)

Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll"

try {
    $dacService = New-Object Microsoft.SqlServer.Dac.DacServices $connectionString
    # build backup filename
    $backupFileName = $backupDirectory + "\" + $databaseName + [DateTime]::Now.ToString("yyyyMMdd-HHmmss") + ".bacpac"
    # perform backup
    $dacService.ExportBacpac($backupFileName, $databaseName)
    Write-Output "Database backup file created $backupFileName"
} catch {
    Write-Warning "Exception occurred: $_"
    throw "Database backup hasn't been created, execution aborted."
}
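For reference, the script is invoked from the command line roughly like this (script name, connection string and paths are examples):
powershell.exe -File .\Backup-Database.ps1 -databaseName "MYDB" -connectionString "server=localhost;Integrated Security=True" -backupDirectory "D:\backups"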
Has anyone come across this issue? I know PowerShell and PowerShell ISE are a bit different, but I don't understand why the script execution produces different results.
[EDIT]
Tried to add an event listener for DacServices and print its output to get more debug information:
register-objectevent -in $dacService -eventname Message -source "msg" -action { out-host -in $Event.SourceArgs[1].Message.Message } | Out-Null
The output is
Dac Assembly loaded.
Extracting schema (Start)
Gathering database options
Gathering users
WARNING: Exception occurred: Exception calling "ExportBacpac" with "2" argument(s): "Could not export schema and data
from database."
Gathering roles
Gathering application roles
Gathering role memberships
Gathering filegroups
Gathering full-text catalogs
Gathering assemblies
Gathering certificates
....
Processing Table '[dbo].[file_storage_entity]'. 99.74 % done.
Processing Table '[dbo].[file_storage_entity]'. 100.00 % done.
Exporting data (Failed)
[EDIT 2]
As a workaround, I've created a C# program to do the same. It runs fine in any environment. The code is virtually the same as in the PowerShell script.
I am trying to run a PowerShell script from a Windows batch file. This is a SharePoint-related script that uses Import-SPData.
This works without any issue when using USERA's login. However, if I try to run the same batch file from USERB's login, I get the error below:
c:\PS>ExecMyPowershellScript.bat
c:\PS>C:\Windows\system32\WindowsPowerShell\v1.0\powershell.exe -psconsolefile "
C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\CONFIG\P
OWERSHELL\Registration\psconsole.psc1" -command "c:\ps\MyPSScript.ps1"
The local farm is not accessible. Cmdlets with FeatureDependencyId are
not registered.
Import-SPData : Cannot access the local farm. Verify that the local
farm is properly configured, currently available, and that you have
the appropriate permissions to access the database before trying
again.
At C:\ps\Run_MyPSScript.ps1:5 char:18
USERB has permissions to run the bat and the ps1 files.
You are assuming the error is related to permissions on either the bat or the PowerShell file.
The error you get comes from a SharePoint cmdlet, so you have successfully opened the bat file and successfully run the PowerShell script, which then throws an error. UserB does not have the appropriate rights to the farm. Hence the error:
...and that you have the appropriate permissions to access the
database before trying again.
Compare the permissions from UserA and UserB on the farm and the database.
Or you could use a sledgehammer: log in as UserA and run the following PowerShell script:
# Grant UserB shell access to the farm configuration database
$db = Get-SPDatabase | Where {$_.Name -eq "SharePoint_ConfigDB"}
Add-SPShellAdmin "domain\UserB" -database $db
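Afterwards you can verify the grant from the same SharePoint Management Shell:
# Should now list domain\UserB among the shell admins for the config database
Get-SPShellAdmin -database $db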
I managed to grasp the concept of foreach, programming against SSMS, and reading from an Excel file through this question:
Adding Servers to SQL Management Studio
This is the code that is working for me now:
Import-Csv C:\sl.csv | ForEach-Object { New-Item $(Encode-Sqlname $_.Name) -ItemType Registration -Value ("server=$($_.Name);integrated security=true") }
But I was having a problem with getting the user name and password to be auto-configured, which means changing the above code to this:
Import-Csv C:\sl.csv | ForEach-Object { New-Item $(Encode-Sqlname $_.Name) -ItemType Registration -Value ("server=$($_.Name);integrated security=false") }
But that is OK; that is how the people I am delivering the script to prefer it, for security purposes (even though I would still like to know how to get it done :)).
Now for further enhancement, there are quite a number of mirrored servers, like
server1/instance2a
server2/instance2b
so the thing is, I want the registered servers window to show which are the mirrored and which are the principal servers. That means when a server is registered I want the name to appear like server1/instance2a (mirror), so when the user wants to log in he easily knows which is the mirror and which is the principal server. The SQL query to determine this is:
select mirroring_role_desc from sys.database_mirroring where database_id > 4 and mirroring_state is NOT NULL
The output of this gives me:
mirroring_role_desc
PRINCIPAL
PRINCIPAL
PRINCIPAL
PRINCIPAL
PRINCIPAL
This query will first determine whether it is a mirrored instance and, if it is, display the number of principal databases in the instance. From here I want to take the output and set the registered server name according to the specification I mentioned above.
BUT when there is a failover, the registered server name still shows as server1/instance2a even though it is now the mirror. So, as you can understand by now, I am trying to make this script dynamic so the user can run it whenever he wants, or it runs every fortnight or something like that (not to worry about the schedule for now).
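Here is a rough sketch of what I have in mind, assuming Invoke-Sqlcmd is available; the suffix logic and the reuse of my CSV layout are just placeholders, not a working solution:
Import-Csv C:\sl.csv | ForEach-Object {
    # Ask the instance for its current mirroring role using the query above
    $role = Invoke-Sqlcmd -ServerInstance $_.Name -Query "select distinct mirroring_role_desc from sys.database_mirroring where database_id > 4 and mirroring_state is not null"
    $suffix = ''
    if ($role.mirroring_role_desc -contains 'MIRROR') { $suffix = ' (mirror)' }
    elseif ($role.mirroring_role_desc -contains 'PRINCIPAL') { $suffix = ' (principal)' }
    # Register under a name that reflects the current role
    New-Item $(Encode-Sqlname ($_.Name + $suffix)) -ItemType Registration -Value ("server=$($_.Name);integrated security=true")
}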
Given a detached SQL Server Primary Data File (.mdf) how can you get the list of data and log files that exist inside that file? The goal is to use the Server.AttachDatabase SMO method to attach the database. But the database may have multiple data and/or log files so I need to get the list in order to add them to the StringCollection parameter of the method.
What I need is the equivalent of select * from sys.files for a detached mdf. SQL Server Management Studio does this when you use it to manually attach a database so I know it can be done.
If it's a single MDF file, that's straightforward; if there are other data files, I don't think the MDF file can tell you that directly.
And you can always try to attach an MDF file without a log file by using the CREATE DATABASE ... FOR ATTACH_REBUILD_LOG option.
In SMO you can do this using the AttachDatabase method specifying RebuildLog for AttachOptions.
Of course this all assumes the .mdf file is healthy - it will only be usable if it was cleanly detached.
Management Studio probably has some proprietary way of reading the file headers, but these aren't documented and you're not going to be able to see what SSMS is doing using Profiler or the like.
If you are typically creating .mdf files for distribution of some kind, I really strongly recommend using backup/restore instead. You can learn a lot more about a .BAK file and the data/log files its database represents, using documented and public methods such as RESTORE FILELISTONLY.
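For example, something like this lists the logical and physical files recorded in a backup (instance name and path are placeholders):
# Returns LogicalName, PhysicalName, Type, size, etc. for every file in the backup set
Invoke-Sqlcmd -ServerInstance '.\SQLEXPRESS' -Query "RESTORE FILELISTONLY FROM DISK = 'C:\backup\mydb.bak'" |
    Select-Object LogicalName, PhysicalName, Type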
Finally figured this one out. The undocumented command DBCC checkprimaryfile(N'blah.mdf',3) gives the info needed.
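From PowerShell that can be run like this (instance name is a placeholder; the .mdf path is the one used in the script below):
# Per the answer above, option 3 returns the data/log file list stored in the primary file
Invoke-Sqlcmd -ServerInstance '.\SQLEXPRESS' -Query "DBCC checkprimaryfile(N'S:\DATA\mydb.mdf', 3)"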
It took me a while to find it, but in SMO, you use EnumDetachedDatabaseFiles and EnumDetachedLogFiles.
# PowerShell
$servername = "sqlserver\instance"
$mdf = "S:\DATA\mydb.mdf"
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
$filestructure = New-Object System.Collections.Specialized.StringCollection
$server = New-Object Microsoft.SqlServer.Management.Smo.Server $servername
# Here you can automatically determine the database name, or set it manually
$dbname = ($server.DetachedDatabaseInfo($mdf) | Where { $_.Property -eq "Database name" }).Value
foreach ($file in $server.EnumDetachedDatabaseFiles($mdf)) {
    $null = $filestructure.Add($file)
}

foreach ($file in $server.EnumDetachedLogFiles($mdf)) {
    $null = $filestructure.Add($file)
}
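With the StringCollection populated, attaching is then a single call (a sketch using the variables above):
# Attach the database using the file list gathered above
$server.AttachDatabase($dbname, $filestructure)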
Enumerate detached database file structures
SMO Attach/Detach Recipes