MS SQL Server Docker data location - sql-server

I am using the mcr.microsoft.com/mssql/server:2019-latest container and want to mount its data directory so that data does not get lost if the server goes down.
Where is the data directory located inside the container? The documentation does not mention this at all.

For SQL Server on Linux, the files are by default located in /var/opt/mssql. Unsurprisingly, the data files are in the data directory and the log files in the log directory.
This is also covered in the documentation, under Change the default data or log directory location:
The filelocation.defaultdatadir and filelocation.defaultlogdir settings change the location where the new database and log files are created. By default, this location is /var/opt/mssql/data.
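To persist the data, mount a volume (or a host directory) over /var/opt/mssql. A minimal sketch, assuming a named volume called sqldata and a placeholder SA password:

# run SQL Server 2019 with its data root on a named volume
docker run -d --name sql2019 \
  -e "ACCEPT_EULA=Y" \
  -e "MSSQL_SA_PASSWORD=<YourStrong!Passw0rd>" \
  -p 1433:1433 \
  -v sqldata:/var/opt/mssql \
  mcr.microsoft.com/mssql/server:2019-latest

The databases and logs then live in the sqldata volume and survive removing and recreating the container.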

It depends on the OS platform and on whether persistent storage is mounted.
If it is Linux, then it is as described in Larnu's answer.
For Windows, it is still C:\Program Files\Microsoft SQL Server...
However, in both cases the data only has the lifetime of the container: once the container is removed and recreated, all changes are gone.
With mounted volumes, the location is determined by the volume and the data is persistent, so it survives recreation of the container.

Related

SSIS Project - Catalog Deployment - Environment Variable (for location) accessing file server mapped to a local Z drive - SQL Server Agent issue

I have an SSIS package that reads a number of files using a For Each Loop Container. The package has a number of parameters, and in the Integration Services Catalog in SSMS I have created an environment with many variables for this project/package. One of those environment variables holds the Source Location.
In my DEV setting, I was able to pass the Source Location environment variable as:
C:\Data Repository\Files (on a local machine).
Everything was fine: the package ran perfectly, and the For Each Loop Container read the files.
However, in the PROD setting, I have to use a file server, mapped to a Z drive.
For example:
This PC > Data Repository (\\tordfs) (Z:) > Data Repository > X
becomes
Z:\Data Repository\X
when I copy the path.
Inside the SSIS package, I am able to set the parameter value for Source Location to Z:\Data Repository\X, and the For Each Loop Container works fine from SSDT/Visual Studio.
After the SSIS package/project is deployed to the catalog, when I feed Z:\Data Repository\X in as the value of the Source Location environment variable and execute the package manually from the Catalog, it also works fine.
However, when I use SQL Server Agent for the above process, I get the following error:
For Each Loop Container:Warning: The For Each File enumerator is empty.
The For Each File enumerator did not find any files that
matched the file pattern, or the specified directory was
empty.
Is there anything I need to do in the For Each Loop Container or the SSIS Catalog to eliminate the above error during execution from the Catalog using SQL Server Agent?
Let me know.
In Windows, mapped drives are user-specific, so you would have to map the drive for the account running the package. Instead, use a UNC path in both cases rather than a drive letter.
So something like:
\\tordfs\Data Repository\Files
The account running the package will still need permissions to the share, and permissions to the folder, but won't need a drive letter mount.
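If you prefer to update the existing environment variable to the UNC path in T-SQL rather than through the SSMS dialogs, the SSIS catalog exposes a stored procedure for that. A minimal sketch, with a hypothetical catalog folder MyFolder, environment PROD, and variable name SourceLocation:

USE SSISDB;
EXEC catalog.set_environment_variable_value
    @folder_name      = N'MyFolder',        -- hypothetical catalog folder
    @environment_name = N'PROD',            -- hypothetical environment
    @variable_name    = N'SourceLocation',  -- hypothetical name of the Source Location variable
    @value            = N'\\tordfs\Data Repository\Files';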
I have two suggestions:
1. Try giving read/write permissions to the SQL Server Database Engine service account NT SERVICE\MSSQL$<Instance Name> (where <Instance Name> should be replaced by the installed instance name), as described in Configure File System Permissions for Database Engine Access (see the sketch after this list).
2. Try to map the Z:\ network drive within SQL Server, as described in Make Network Path Visible For SQL Server Backup and Restore in SSMS.
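For suggestion 1, the permissions can be granted from an elevated command prompt with the built-in icacls tool. A minimal sketch, assuming a hypothetical instance named SQL2019 and a hypothetical local folder:

:: give the Database Engine service account read/execute on the folder tree
icacls "C:\Data Repository\Files" /grant "NT SERVICE\MSSQL$SQL2019:(OI)(CI)RX"

Note that NT SERVICE accounts are local to the machine, so for a remote share the grant would instead go to the account that actually accesses the share over the network.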
Thanks a lot guys. Appreciate it.
I think I have fixed the issue:
In the environment variable, you cannot have Z:\Data Repository\X.
The variable must have a value such as this:
\\tordfs\Data Repository\Data Repository\X
While manual execution from the SSMS Integration Services Catalog can accept Z:\Data Repository\X as the value of an environment variable, the SQL Server Agent needs \\tordfs\Data Repository\Data Repository\X.
If the SQL Server Agent reads Z:\Data Repository\X from the environment variable in the Catalog, I get the For Each Loop Container warning posted above!
That said, I am using a proxy for the SQL Server Agent to resolve other access issues, such as moving a file into a folder using the File System Task.
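For anyone setting up a similar proxy, it can be created in T-SQL. A minimal sketch, assuming a hypothetical Windows account DOMAIN\ssis_runner that has rights to the share:

USE master;
-- store the Windows account the Agent job steps will run under
CREATE CREDENTIAL SsisFileAccess
    WITH IDENTITY = 'DOMAIN\ssis_runner',   -- hypothetical account
         SECRET   = '<password>';

USE msdb;
-- wrap the credential in an Agent proxy and allow it for SSIS job steps
EXEC dbo.sp_add_proxy
    @proxy_name      = N'SsisFileProxy',
    @credential_name = N'SsisFileAccess';
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name   = N'SsisFileProxy',
    @subsystem_id = 11;                     -- 11 = SSIS package execution

The job step that runs the package can then select SsisFileProxy in its Run as dropdown.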

Changing default data directory for SQL Server after installation

I mistakenly put my SQL data directory in the wrong folder, so the system DBs are located in the wrong directory and I want to move them somewhere else. I am going to move the data and log files for all my system DBs, but I would like to know whether this will move all the folders under MSSQL, or whether I need to perform some other steps as well. Please find below the folders that I can see under MSSQL.
In order to move the system databases, you should not only move the files; before that, you should change their paths in the system tables by using
ALTER DATABASE..MODIFY FILE
When moving the master database you should also change the startup parameters in Configuration Manager, putting the new location of the master files there.
All of this is described in the Move System Databases article.
Note that if one day you need to rebuild your master database, it will be put back in the old location, so you cannot just get rid of the DATA folder. And after rebuilding, you will need to move your system databases again.
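For example, a minimal sketch for msdb (the target folder D:\SQLData is hypothetical; repeat for every data and log file of each system database, then stop the instance and move the physical files):

-- repoint msdb's files; the change takes effect at the next service restart
ALTER DATABASE msdb MODIFY FILE (NAME = MSDBData, FILENAME = 'D:\SQLData\MSDBData.mdf');
ALTER DATABASE msdb MODIFY FILE (NAME = MSDBLog,  FILENAME = 'D:\SQLData\MSDBLog.ldf');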
This is to let you know the solution to my problem. I successfully changed the data root directory by moving the data and log files for all the system DBs to the desired location. I also had to change the registry value for the SQLDataRoot directory to point at the new location, and to modify the location of the server diagnostics file as per my cluster requirement. After all this, the folder was successfully moved to the desired location. Thanks for all the help, everyone.

How do I change the path to a Dropbox folder

I want to move my .mdf and .ldf into my dropbox folder.
I ran this script command:
ALTER DATABASE MyDatabase1 MODIFY FILE
(
Name = matrix,
Filename = 'C:\Users\mycomputer\Dropbox\MyDatabase1.mdf'
);
But I get this error:
The path specified by 'C:\Users\mycomputer\Dropbox\MyDatabase1.mdf' is
not in a valid directory.
I'm pretty sure it's just a permissions issue, where the SQL Server service running my script doesn't have the correct permissions. But I have no clue which object to grant permissions on my Dropbox folder to. I tried mycomputer\users, but that didn't work. Can someone help, please?
As far as I know, Dropbox does not take snapshots of the files it copies.
This means the files could (and most probably would) be written to during the copy, so they would arrive in an inconsistent state, rendering them unusable.
I believe you want to use Log Shipping instead. This is a feature of SQL Server that allows transaction logs to be incrementally backed up and sent to another server (possibly by means of Dropbox), where they can be restored. This gives you a snapshot of the database on another server.
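Under the hood, log shipping is built on plain log backup and restore. A minimal sketch of those primitives, with hypothetical paths and the database name from the question:

-- on the primary: back up the transaction log to a file (e.g. in a Dropbox-synced folder)
BACKUP LOG MyDatabase1
    TO DISK = 'C:\Users\mycomputer\Dropbox\MyDatabase1_log.trn';

-- on the secondary: restore it, leaving the database ready for further log restores
RESTORE LOG MyDatabase1
    FROM DISK = 'C:\Users\othermachine\Dropbox\MyDatabase1_log.trn'
    WITH NORECOVERY;

A full backup must first be restored WITH NORECOVERY on the secondary; the built-in log shipping feature then automates this backup/copy/restore cycle on a schedule.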

Recover PostgreSQL databases from raw physical files

I have the following problem and I need to know if there's a way to fix it.
I have a client who was cheap enough to decline buying a backup plan for his PostgreSQL databases on the main system that runs his company, and, as I thought would happen some day, some OS files crashed during a blackout and the OS needs to be reinstalled.
This client didn't have any backups of the databases, but I managed to save the PostgreSQL main directory. I read that the databases are somehow stored inside the data directory of the Postgres main folder.
My question is: is there any way to recover the databases from the data folder only? I am working in a Windows environment (XP Service Pack 2) with PostgreSQL 8.2, and I need to reinstall PostgreSQL on a new server. I would need to recreate the databases in the new environment and somehow attach the old files to the new database instances. I know that's possible in SQL Server because of the way that engine stores the databases, but I have no clue for Postgres.
Any ideas? They would be much appreciated.
If you have the whole data folder, you have everything you need (as long as the architecture is the same). Just try restoring it on another machine before wiping this one, in case you didn't copy something.
Just save the data directory to disk. When launching Postgres, set the parameter telling it where the data directory is (see: wiki.postgresql.org). Or remove the original data directory of the fresh installation and put the copy in its place.
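A minimal sketch, assuming the saved directory was copied to C:\pg_old_data (a hypothetical path):

:: start Postgres against the copied data directory; -D points at the data folder
pg_ctl -D "C:\pg_old_data" start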
This is possible; you just need to copy the "data" folder (inside the Postgres installation folder) from the old computer to the new one, but there are a few things to keep in mind.
First, before you copy the files, you must stop the Postgres server service: Control Panel -> Administrative Tools -> Services, find the Postgres service and stop it. When you're done copying the files and setting permissions, start it again.
Second, you need to set the permissions on the data files. Because the Postgres server actually runs under another user account, it will not be able to access the files if you just copy them into the data folder, since it will not have permission to do so. You need to change the ownership of the files to the postgres user. I had to use subinacl for this; install it first, and then run it from a command prompt like this (first navigate to the folder where you installed it):
subinacl /subdirectories "C:\Program Files\PostgreSQL\8.2\data\*" /setowner=postgres
(Changing ownership should also be possible to do from the explorer: first you must disable "Use simple file sharing" in Folder options, then a "Security" tab will appear in the folder Properties dialog, and there are options there to set permissions and change ownership, but I wasn't able to do it that way.)
Now, if the server service can't start after you start it manually again, you can usually see the reason in the Event viewer (Administrative tools->Event viewer). Postgres will throw an error event, and inspecting it will give you a clue about what the problem is (sometimes it will complain about a postmaster.pid file, just remove it, etc.).
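On newer Windows versions, the built-in icacls tool can change the ownership without installing anything extra; a sketch equivalent to the subinacl call above:

:: recursively hand ownership of the data directory to the postgres account
icacls "C:\Program Files\PostgreSQL\8.2\data" /setowner postgres /T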
The question is very old, but I want to share an effective method that I found.
If you do not have a backup made with pg_dump and all you have is the old data folder, try the following steps.
In the Postgres database, add a record to the pg_database table, either with a manager program or with INSERT INTO.
Make the necessary checks, adapt the following INSERT query to your system, and run it.
The query will return an OID once it has run. Create a folder named with that number, copy your old data into it, and the database is ready for use.
/*
------------------------------------------
*** Recover From Folder ***
------------------------------------------
Check this table on your own system
and adjust the values below to match.
*/
INSERT INTO
pg_catalog.pg_database(
datname, datdba, encoding, datcollate, datctype, datistemplate, datallowconn,
datconnlimit, datlastsysoid, datfrozenxid, datminmxid, dattablespace, datacl)
VALUES(
-- Write Your collation
'NewDBname', 10, 6, 'Turkish_Turkey.1254', 'Turkish_Turkey.1254',
False, True, -1, 12400, '536', '1', 1663, Null);
/*
Create a folder in the data directory named with the New OID returned above.
Copy all the old files from the old directory data\base\<Old OID> into the
new directory data\base\<New OID>. The database is now ready for use.
*/
select oid from pg_database a where a.datname = 'NewDBname';
As shown in "move database to another hard drive", all we need to do is modify the registry and the file permissions. By modifying the registry (shown in a screenshot in the original post), the PostgreSQL server knows the new location of the data.
If you have issues with permissions, or with things like icacls when installing into the old data folder, then try my solution on the sister site:
https://superuser.com/a/1611934/1254226
I did so, but the trickiest part was changing the owner permissions:
1. Go to Services from Administrative Tools.
2. Find the Postgres service and double-click it.
3. On the Log On tab, change the account to Local System.
4. Then restart the service.

Create a SQL Server database on a remote machine mdf file path issue

I'm trying to create an MS SQL Server database on a database instance running on a remote machine. When doing so, I need to be able to specify the path to the database (.mdf) file. If I try to create a database in a folder that doesn't exist, SQL Server just fails (wouldn't it be nice if it created the folder structure automatically?).
Is there any way to create the folder path on the target machine in SQL before I try to create the database, or at least to determine the default folder for new databases, in which I could safely create the new database file?
If you have appropriate permissions, and xp_cmdshell is enabled, you can:
EXEC xp_cmdshell 'md "<path>"';
--...repeat for each node in the path
If xp_cmdshell is disabled then, again assuming appropriate permissions, you can enable it temporarily using sp_configure:
Ancient article removed
Don't forget to set it back!
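A minimal sketch of that enable/run/disable cycle (the target path is hypothetical):

-- allow advanced options to be changed
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- enable xp_cmdshell temporarily
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

EXEC xp_cmdshell 'md "D:\SQLData\NewDb"';  -- hypothetical target folder

-- set it back
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;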
Otherwise, why can't you let the engine place the database files in their default location? If you are using a drive other than C:, you'll also need to verify that the drive you specify even exists. And shouldn't you check with the user that it is okay to put these files elsewhere? If you choose some arbitrary location, they might not know to check there for active SQL Server files.
You can check the default path by using xp_regread (undocumented, unsupported)... these are in the registry as keys DefaultDataDir and DefaultLogDir for the default instance under:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\MSSQLServer
If it's not the default instance, check this article:
http://foxtricks.blogspot.com/2009/06/how-to-determine-default-database-path.html
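On SQL Server 2012 and later there is also a supported alternative to the registry read; a minimal sketch:

-- returns the instance's default data and log paths (NULL on pre-2012 versions)
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;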
Are you doing this just so that you can name your MDF/LDF files the way you want, instead of dbname-data and dbname-log? If so, why? Have you written scripts that depend on the physical file names? I am really curious about the motivation behind this.
