Create a SQL Server database on a remote machine: .mdf file path issue

I'm trying to create an MS SQL Server database on a database instance running on a remote machine. When I do so, I need to be able to specify the path to the database (.mdf) file. If I try to create a database in a folder that doesn't exist, SQL Server just fails (wouldn't it be nice if it created the folder structure automatically?).
Is there any way I can create the folder path on the target machine from SQL before I try to create the database, or at least determine the default folder for new databases, in which I could safely create the new database file?

If you have appropriate permissions, and xp_cmdshell is enabled, you can:
EXEC xp_cmdshell 'md "<path>"';
--...repeat for each node in the path
If xp_cmdshell is disabled, again assuming appropriate permissions, you can enable it temporarily using sp_configure:
Ancient article removed
Don't forget to set it back!
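For reference, a minimal sketch of that temporary enable/disable sequence (the path passed to md is just a placeholder):
-- Temporarily enable xp_cmdshell (requires sysadmin / ALTER SETTINGS permission)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
EXEC xp_cmdshell 'md "D:\SQLData\NewFolder"';
-- Turn it back off when done
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;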
Otherwise, why not let the engine place the database files in their default location? If you're using a drive other than C:, you'll also need to verify that the drive you specify even exists, and shouldn't you check with the user that it's okay to put these files elsewhere? If you choose some arbitrary location, they might not know to look there for active SQL Server files.
You can check the default path using xp_regread (undocumented, unsupported); these are stored in the registry as DefaultDataDir and DefaultLogDir for the default instance under:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\MSSQLServer
If it's not the default instance, check this article:
http://foxtricks.blogspot.com/2009/06/how-to-determine-default-database-path.html
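As an aside, on SQL Server 2012 and later there is a documented alternative to reading the registry: SERVERPROPERTY exposes the instance defaults directly. A minimal query:
-- Default data and log folders for this instance (SQL Server 2012+)
SELECT
    SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
    SERVERPROPERTY('InstanceDefaultLogPath') AS DefaultLogPath;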
Are you doing this just so that you can name your MDF/LDF files the way you want to, instead of dbname-data, dbname-log? If so, why? Have you written scripts that depend on the physical name of the file? Really curious as to the motivation behind this.

Related

How do I change "Database default locations" for LocalDB in SQL Server Management Studio?

Connect to LocalDB in SSMS
Open Server Properties -> Database Settings
Change Data/Log/Backup locations -> click OK
When I click OK I get this error:
I found a blog post and changed this in regedit, but it didn't help.
Anyone got any other ideas I could try?
I do not believe that these default paths for SQL Server LocalDB are changeable. This is quite unfortunate due to what appears to be a bug with SQL Server Express 2017 LocalDB (fixed as of CU6 for SQL Server 2017), as per this question (and my answer to it) on DBA.StackExchange:
LocalDB v14 creates wrong path for mdf files
HOWEVER, you do not need to use the default paths. Those are only used when you create a database without specifying the physical locations. If you specify the physical location, you should be able to create the files in any folder/directory you have read/write access to.
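For example, a minimal CREATE DATABASE sketch with explicit file locations (the database name and the C:\DbFiles path are placeholders; the folder must already exist and be writable by the LocalDB process):
-- Create the data and log files in an explicit, pre-existing folder
CREATE DATABASE DemoDb
ON PRIMARY (NAME = DemoDb, FILENAME = 'C:\DbFiles\DemoDb.mdf')
LOG ON (NAME = DemoDb_log, FILENAME = 'C:\DbFiles\DemoDb_log.ldf');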
After making that change in the registry, try restarting the SQL Server instance.
Also, I would make sure that the account running SQL Server has permission to write to that folder.
For an easy test, go to the folder's Properties -> Security, add the 'Everyone' account, and give it Full Control, then try making the change again. If it works, it was a permissions issue with that account. Accounts generally don't have access to other users' folders without some level of admin rights.
After 10 years this is still an issue in the current version (15.0) of Microsoft SQL Server Express.
After a bit of investigation I discovered there is a permissions issue inside the registry: the sqlservr.exe process cannot create entries in HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL15E.LOCALDB\MSSQLServer.
On my computer this process runs under my own account, so I opened regedit and gave myself Full Control permission on this key, and it worked like a charm. I hope this helps you as well.
Changing these paths in RegEdit or SSMS doesn't work; SQL Server LocalDB won't respect these values for existing databases. You have to move the databases manually. Here is a reliable way to change a database's location for any LocalDB instance.
First, make sure you work with a correct instance of SQL Server LocalDB. In command prompt enter:
sqllocaldb info
It will show the LocalDB instances you have on your machine. Let's assume that the instance name is MSSQLLocalDB.
Next, execute this script on your database (let's call it TestApp), using SqlCmd tool or SSMS:
alter database TestApp
modify file (name = TestApp, filename = 'C:\NewDataLocation\TestApp.mdf');
go
alter database TestApp
modify file (name = TestApp_log, filename = 'C:\NewDataLocation\TestApp_log.ldf');
go
Now, stop the SQL LocalDB instance, in command prompt:
sqllocaldb stop MSSQLLocalDB
Move the database files to the new location that you specified in the script: from %UserProfile%\TestApp.mdf (which is where they are located by default) to C:\NewDataLocation\TestApp.mdf, and do the same for the LDF file.
Start the SQL LocalDB instance again:
sqllocaldb start MSSQLLocalDB
Now your database is working from a new location. Repeat the steps for any other databases.
Paths Cannot Be Changed in SQL Server LocalDB "Automatic Instance" Types
In case anyone in 2023 finds out they cannot change their default database file storage paths, this article is for you!
This applies when Microsoft SQL Server will not let you change the default folder on your PC where the database files are saved (data and log files, .mdf and .ldf).
Developers often need control over where local database files are saved. Many prefer to store them in a central location, on another drive, or simply in the main SQL Server data folder under C:\Program Files\Microsoft SQL Server\{sql version name}\MSSQL\DATA, since that is where system databases go. One example of the problem is Entity Framework Core, which runs "migration" scripts that create databases in SQL Server; where those scripted databases end up depends heavily on SQL Server's default file path settings. When the location of those EF code-first database files under LocalDB is locked down, developers are stuck with SQL files scattered across multiple locations on their PCs.
THE PROBLEM
Apparently, when Microsoft installs SQL Server / SQL Express on your device, it attempts to install a default instance of the server as a specialized type called a "LocalDB automatic instance". They do this to get the user up and running fast with a "light", custom-created LocalDB server running as a public instance, complete with default settings customized for the user (or developer). The automatic type has the advantage that it grants the user administrator permissions in SQL Server, and grants all applications on the user's device public access to the server instance. (IIS Express works in a similar way, using application pools as dummy Windows user accounts created next to your own user account.) These SQL Server LocalDB binaries do not run as a service but on demand, and only one "automatic" instance may be installed per version per device. The other SQL Server LocalDB type is the named instance, which apparently is not as restricted as the automatic one.
The problem is that when this special LocalDB automatic instance is created, it locks down certain settings and applies permissions that are unique to this instance. That limits what the user can customize, including the "Database default locations" in the Properties dialog that appears when you right-click the SQL Server instance and choose Properties.
Anyone using the full SQL Server edition, or who has deleted the old LocalDB instance and created a new one, will not experience this issue, which is why most of those people are scratching their heads.
For local developers, what this means is that SQL Server LocalDB databases running under this instance will typically be stored under a locked-down path: either the path you chose at install time or, by default, the user account path under C:\Users\{YourName}.
When users attempt to change the path in the Properties box for the instance, many have run into a nasty "RegCreateKeyEx() returned error 5, Access is denied" message over the past 5-6 years when saving a default path. Microsoft doesn't bother to tell you, but that is intentional: they don't expect you to save paths to the registry for this instance, and assume everyone is okay with the default path.
You can get rid of the key error by going into your registry and changing permissions on the Microsoft SQL Server registry keys, assigning the "Everyone" group to the registry node that manages them. In the Registry, add the Everyone group to the node below, then try to save a new default path in the Properties box for your SQL Server LocalDB instance:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server
The default database file path values (SQL Server 2019) for a LocalDB instance are located in the Windows Registry here:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL15E.LOCALDB\MSSQLServer
You are then able to save the new default paths, and the error goes away: saving the path in the Properties box works, and the new values appear in the registry.
Even though you can change these paths, however, they will not stick, and they reset back to the user account path. Even if you save a new default SQL path for your databases, when you create a new database it still reverts to the old path. Again, this applies ONLY to users running under the default "automatic" LocalDB instance created when SQL Express is installed.
So even after restarting SQL Server, restarting your PC, or restarting the SQL service, those registry values will still not be pulled into the SQL Server instance settings for default file paths.
As proof, run the two scripts below in your SQL Server LocalDB instance. The first returns the actual LocalDB default file path SQL Server stores internally. The second returns what is stored in your registry for the LocalDB default file path. If you saved new default path registry values, they should be the same and shown in the SQL Server instance properties, but they are different! That means Microsoft has decided not to let you change them if you are running the "automatic" LocalDB instance type created at install. Here is the T-SQL to test this:
-- GETS THE PATH STORED IN SQL SERVER FOR "DefaultData" path
SELECT
[Value] = 'DefaultData',
[Data] = SERVERPROPERTY('InstanceDefaultDataPath')
-- DefaultData C:\Users\YourAccountName\
-- GETS WHAT'S IN THE REGISTRY FOR "DefaultData" path
EXECUTE [master].dbo.xp_instance_regread
N'HKEY_LOCAL_MACHINE',
N'SOFTWARE\Microsoft\Microsoft SQL Server\MSSQLServer',
N'DefaultData'
-- DefaultData C:\Program Files\Microsoft SQL Server\MSSQL15.MSSQLSERVER\MSSQL\DATA
-- Note: If the second one returns `NULL` it just means you
-- have not yet tried or succeeded in saving a new file path
-- to your registry.
Why isn't SQL Server LocalDB pulling in the registry values?
What this means, again, is that sorry, you can't change these default paths. Your best bet is to simply detach your databases, copy the .mdf and .ldf files to your new preferred folder, and then reattach them. When you create new databases, the console also lets you change the database file path there. There are also some elaborate SQL scripts you can run to set paths before saving files.
But just know this is by design.
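For the detach / copy / re-attach route, a rough sketch (the database name and target folder are hypothetical):
USE master;
GO
-- Detach the database so its files can be moved
EXEC sp_detach_db @dbname = N'DemoDb';
GO
-- Copy DemoDb.mdf and DemoDb_log.ldf to the new folder outside of SQL Server, then re-attach:
CREATE DATABASE DemoDb
ON (FILENAME = 'D:\SqlData\DemoDb.mdf'),
(FILENAME = 'D:\SqlData\DemoDb_log.ldf')
FOR ATTACH;
GO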
I think one purpose of LocalDB is that it is very convenient for bundling a demo database along with the source files of an application. The database file and its log, of course, live somewhere in the source file directory.
Take a Visual Studio solution for example, in web.config or app.config, you can see something like this:
<connectionStrings>
<add name="DefaultConnection" connectionString="Data Source=(LocalDb)\v11.0;Initial Catalog=aspnet-XXXXXX-20140609153630;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|\aspnet-XXXXXX-20140609153630.mdf" providerName="System.Data.SqlClient" />
Now that the location of every LocalDB database is specified in the config file, I don't think "default location" makes much sense.

How do I change the path to a Dropbox folder?

I want to move my .mdf and .ldf into my dropbox folder.
I ran this script command:
ALTER DATABASE MyDatabase1 MODIFY FILE
(
Name = matrix,
Filename = 'C:\Users\mycomputer\Dropbox\MyDatabase1.mdf'
);
But I get this error:
The path specified by 'C:\Users\mycomputer\Dropbox\MyDatabase1.mdf' is
not in a valid directory.
I'm pretty sure it's just a permissions issue, where the SQL Server service running my script doesn't have the right permissions. But I have no clue which account to grant permissions on my Dropbox folder to. I tried mycomputer\users, but that didn't work. Can someone help, please?
As far as I know, Dropbox does not make snapshots of the files it copies.
This means the files could (and most probably would) be written to during the copy, and they would arrive in an inconsistent state, rendering them unusable.
I believe you would want to use log shipping instead. This is a SQL Server feature that incrementally backs up transaction logs and sends them to another server (possibly via Dropbox), where they can be restored. This gives you a snapshot of the database on another server.

Recover postgreSQL databases from raw physical files

I have the following problem and I need to know if there's a way to fix it.
I have a client who was cheap enough to decline buying a backup plan for his PostgreSQL databases on the main system that runs his company, and, as I thought would happen some day, some OS files crashed during a blackout and the OS needs to be reinstalled.
This client didn't have any backups of the databases, but I managed to save the PostgreSQL main directory. I read that the databases are stored somehow inside the data directory of the main Postgres folder.
My question is: is there any way to recover the databases from the data folder alone? I am working in a Windows environment (XP Service Pack 2) with PostgreSQL 8.2, and I need to reinstall PostgreSQL on a new server. I would need to recreate the databases in the new environment and somehow attach the old files to the new database instances. I know that's possible in SQL Server because of the way that engine stores its databases, but I have no clue how to do it in Postgres.
Any ideas? They would be much appreciated.
If you have the whole data folder, you have everything you need (as long as the architecture is the same). Just try restoring it on another machine before wiping this one, in case you didn't copy something.
Just save the data directory to disk. When launching Postgres, set the parameter telling it where the data directory is (see wiki.postgresql.org), or remove the original data directory of the fresh installation and put the copy in its place.
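Once the copied cluster is up, a quick sanity check from any SQL client confirms which data directory the server is actually using:
-- Shows the data directory the running PostgreSQL server was started with
SHOW data_directory;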
This is possible, you just need to copy the "data" folder (inside the Postgres installation folder) from the old computer to the new one, but there are a few things to keep in mind.
First, before you copy the files, you must stop the Postgres server service. So, Control Panel->Administrative tools->Services, find Postgres service and stop it. When you're done copying the files and setting permissions, start it again.
Second, you need to set the permissions on the data files. Because the Postgres server actually runs under another user account, it will not be able to access the files if you just copy them into the data folder, since it won't have permission to do so. You need to change the ownership of the files to the "postgres" user. I had to use subinacl for this; install it first, then run it from the command prompt like this (after navigating to the folder where you installed it):
subinacl /subdirectories "C:\Program Files\PostgreSQL\8.2\data\*" /setowner=postgres
(Changing ownership should also be possible to do from the explorer: first you must disable "Use simple file sharing" in Folder options, then a "Security" tab will appear in the folder Properties dialog, and there are options there to set permissions and change ownership, but I wasn't able to do it that way.)
Now, if the server service can't start after you start it manually again, you can usually see the reason in the Event viewer (Administrative tools->Event viewer). Postgres will throw an error event, and inspecting it will give you a clue about what the problem is (sometimes it will complain about a postmaster.pid file, just remove it, etc.).
The question is very old, but I want to share an effective method I found.
If you do not have a backup made with pg_dump but you still have your old data folder, try the following steps.
In the Postgres database, add a record to the "pg_database" table, either with a manager program or with an INSERT.
Make the necessary checks, adjust the INSERT query below, and run it.
Once the insert has worked, look up the new database's OID (see the SELECT at the end). Create a folder named after this number, copy your old data into it, and the database is ready for use.
/*
------------------------------------------
*** Recover From Folder ***
------------------------------------------
Check this table on your own system.
Change the differences below.
*/
INSERT INTO
pg_catalog.pg_database(
datname, datdba, encoding, datcollate, datctype, datistemplate, datallowconn,
datconnlimit, datlastsysoid, datfrozenxid, datminmxid, dattablespace, datacl)
VALUES(
-- Write Your collation
'NewDBname', 10, 6, 'Turkish_Turkey.1254', 'Turkish_Turkey.1254',
False, True, -1, 12400, '536', '1', 1663, Null);
/*
Create a folder in the Data directory, under base, named after the new OID below.
Copy all old files from the "data\base\<old OID>" directory into the directory named with the new OID.
The database is now ready for use.
*/
select oid from pg_database a where a.datname = 'NewDBname';
As shown in "move database to another hard drive", all we need to do is modify the registry and the file permissions. By modifying the registry (see the "modify registry" screenshot), the PostgreSQL server knows the new location of the data.
If you have issues with permissions, or with things like icacls, during installation to an old data folder, try my solution on the sister site:
https://superuser.com/a/1611934/1254226
I did so, but the trickiest part was changing the owner permissions:
go to Services from Administrative Tools
find the Postgres service and double-click it
on the Log On tab, change to Local System
then restart

Attach .mdf file located on desktop?

In SQL Server 2008 I can attach databases located only in its predefined folder (C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA). On many occasions, especially when I'm reading a book, I need to attach a test database from the desktop rather than copy each database into that folder every time I need it, but SQL Server does not allow me to access the desktop.
Is there any workaround for this issue?
It's probably a matter of granting the account running the SQL Server service appropriate permissions to your desktop folder (C:\Documents and Settings\YourLogin\Desktop). But rather than use a location like the desktop, which is specific to your login and possibly inaccessible to the account running the SQL service, why not use a common holding location for these files? Something like C:\AdHocDBs or whatever you want to call it.
When a database file (data or log) is first created, it is (of course) located in a specific drive and folder. When a backup is created, this information is stored as part of the backup. A RESTORE command will assume the database is to be restored to exactly the same location unless instructed otherwise. To change that, you must include the MOVE option in the WITH clause of the RESTORE command. It looks something like this:
RESTORE DATABASE <dbName>
FROM DISK = '<backupFile>'
WITH
    MOVE '<logicalFileName>' TO '<physicalFileName>',
    MOVE '<logicalLogFileName>' TO '<physicalLogFileName>';
One MOVE must be included for each file to be relocated, so you usually end up with at least two of these clauses. The tricky part is that you must know the database files' logical names. These can be found via sp_helpfile on an attached database, or via
RESTORE FILELISTONLY
FROM DISK = '<backupFile>'
on an existing backup.
(I'm sure all this can be done somehow with the SSMS backup/restore GUIs. I switched over to TSQL-based scripts years ago, to provide quick and flexible access to all the features wrapped in the backup and restore commands.)
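Putting it together, a worked sketch (the backup path, logical names, and C:\AdHocDBs folder are hypothetical; take the logical names from the FILELISTONLY output):
-- 1) Find the logical file names stored in the backup
RESTORE FILELISTONLY FROM DISK = 'C:\Backups\TestDb.bak';
-- 2) Restore, relocating both files to a folder the SQL Server service account can write to
RESTORE DATABASE TestDb
FROM DISK = 'C:\Backups\TestDb.bak'
WITH
    MOVE 'TestDb' TO 'C:\AdHocDBs\TestDb.mdf',
    MOVE 'TestDb_log' TO 'C:\AdHocDBs\TestDb_log.ldf';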

Oracle: changing the DB_RECOVERY_FILE_DEST without pre-existing folder?

I'm working on a web interface for modifying Oracle database backup settings. One of the options I want to give users is where to set the flash recovery area. As far as I know, the only way to change this is by executing something like:
ALTER SYSTEM SET DB_RECOVERY_FILE_DEST='C:\file\path' SCOPE=BOTH SID='*';
The problem is that if the file path doesn't already exist on the system, it isn't created automatically and this statement fails. Does anyone know if there is a way to instruct Oracle to make that directory for me, or if there is a PL/SQL script I can use to create a directory on the physical disk (i.e., not a CREATE DIRECTORY call)?
If you really want to do this, write a Java stored procedure (stored as an Oracle object) that calls the mkdir function on a java.io.File object. You'll need to use dbms_java.grant_permission to grant java.io.FilePermission privileges.
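A rough sketch of that approach, assuming the Java VM is installed in the database; the schema name APPUSER, the object names, and the path are all hypothetical, and the exact java.io.FilePermission target may need adjusting:
-- Grant the schema permission to write to the target path (run as a privileged user)
BEGIN
  DBMS_JAVA.GRANT_PERMISSION('APPUSER', 'SYS:java.io.FilePermission', 'C:\file\path', 'read,write');
END;
/
-- Java source that wraps java.io.File.mkdirs()
CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED dir_maker AS
public class DirMaker {
    public static int makeDir(String path) {
        return new java.io.File(path).mkdirs() ? 1 : 0;
    }
}
/
-- PL/SQL call specification exposing it to SQL
CREATE OR REPLACE FUNCTION make_dir(p_path IN VARCHAR2) RETURN NUMBER
AS LANGUAGE JAVA NAME 'DirMaker.makeDir(java.lang.String) return int';
/
A hypothetical call would then be SELECT make_dir('C:\file\path') FROM dual; it returns 1 if the directory chain was created.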
