How do I change the path to a Dropbox folder - sql-server

I want to move my .mdf and .ldf files into my Dropbox folder.
I ran this script:
ALTER DATABASE MyDatabase1 MODIFY FILE
(
Name = matrix,
Filename = 'C:\Users\mycomputer\Dropbox\MyDatabase1.mdf'
);
But I get this error:
The path specified by 'C:\Users\mycomputer\Dropbox\MyDatabase1.mdf' is
not in a valid directory.
I'm pretty sure it's just a permissions issue where the SQL Server service running my script doesn't have the correct permissions. But I have no clue which account I should grant permissions on my Dropbox folder to. I tried mycomputer\users, but that didn't work. Can someone help please?

As far as I know, Dropbox does not make snapshots of the files it copies.
This means the files could (and most probably would) be written to during the copy, so they'll arrive in an inconsistent state, rendering them unusable.
I believe you would want to use Log Shipping instead. This is a feature of SQL Server which allows transaction logs to be incrementally backed up and sent to another server (possibly by means of Dropbox), where they can be restored. This would allow you to have a snapshot of the database on another server.
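For illustration, here is a minimal sketch of what "manual" log shipping could look like, using the database name and Dropbox path from the question on the primary side and a hypothetical path on the secondary side; the built-in Log Shipping feature automates these same steps:
-- On the primary server: take one full backup, then periodic log backups
-- into a folder that Dropbox synchronizes.
BACKUP DATABASE MyDatabase1
    TO DISK = 'C:\Users\mycomputer\Dropbox\MyDatabase1_full.bak';

BACKUP LOG MyDatabase1
    TO DISK = 'C:\Users\mycomputer\Dropbox\MyDatabase1_log_001.trn';

-- On the secondary server: restore the full backup without recovery,
-- then apply each shipped log backup in order.
RESTORE DATABASE MyDatabase1
    FROM DISK = 'C:\Users\otheruser\Dropbox\MyDatabase1_full.bak'
    WITH NORECOVERY;

RESTORE LOG MyDatabase1
    FROM DISK = 'C:\Users\otheruser\Dropbox\MyDatabase1_log_001.trn'
    WITH NORECOVERY; -- or WITH STANDBY to keep the copy readable between restores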

Related

How do I restart the PostgreSQL service after putting back the original Data folder?

I'm writing some batch scripts for doing incremental backups of a PostgreSQL cluster on a Windows Server.
I copied the Data folder to a different folder, ran my backup scripts, stopped the service, deleted the Data folder, and tried recovering the database from the WAL files and such.
This didn't work because I copied the wrong log files, and I couldn't get the service started again, so I tried copying the original Data folder back in, but I still can't start the service.
The first script I ran called:
pg_basebackup -Fp -D %BACKUPDIR%\full_%CURRENTDATE%
This was the only line that actually ended up interacting with the data, but not with the original Data folder, which I had copied beforehand.
When trying to start the service again i get the following error message:
The postgresql-x64-10 - PostgreSQL Server 10 service on Local Computer started and then stopped.
Some services stop automatically if they are not in use by other services or programs.
I have gotten this before, when making a typo in the conf file, so I'm guessing that's just the standard error message for when something is missing.
I found out that I had to redo the folder permissions.
This is done the following way:
5. Change permissions for the new data directory
For the new data-directory folder: right-click on it and click Properties. Under the Security tab click “Edit...” and then “Add...”. Type “Network Service” and then click “Check Names”; make sure it has Modify and Full Control permissions and then click OK. Equally important, PostgreSQL needs to be able to “see” the data directory (see my ServerFault.StackEx question), i.e. it needs to have read access to the parent directories above it. So right-click on the pg_db folder and, under the Security permissions, add Network Service again, but this time it only needs Read & Execute as well as List folder contents permissions.
The full post is a nice checklist to go through, for anyone else facing similar issues:
https://radumas.info/blog/tutorial/2016/08/08/Migrating-PostgreSQL-Data-Directory-Windows.html

Restore SQL Server database from bak file to different partition (Permission Denied)

On Ubuntu 17.04, I have a .bak file in /var/opt/mssql/backup/ that I am trying to restore to a separate partition because the partition I have SQL Server installed on does not have enough room for the database to be restored to.
I am getting an error like the following: The operating system returned the error '5(Access is denied.)' while attempting 'RestoreContainer::ValidateTargetForCreation' on '/media/<my-user-name>/<some-folder>/<mdf-file>.mdf'.
I've tried to use chmod and chown to change the permissions of that folder on the second partition, but I'm not getting it quite right because I still get the error.
What user is trying to write to that folder in the second partition?
How do I give that user account the permissions needed to successfully restore the database to that folder?
I had this second hard drive connected via a caddy and was able to perform this task with no problem. But as soon as I installed the SSD internally, Ubuntu has not allowed access for whatever user account the SQL Server CLI is using this time.
Thanks!
Update
I changed the owner of the second partition/SSD to mssql and now I have permission to restore the database to this location. I would assume that if the owner of the whole SSD is mssql, I might run into other permission issues down the road when using this SSD for other things. Is there a way to configure this so that both my personal user account and mssql have sufficient permissions to this folder? I don't think two different accounts can own a folder, but is there a way to grant multiple accounts enough access to perform these actions?
I won't pretend to be knowledgeable about this, but I had a permissions issue while trying to restore a .bak that was on a network VM to my local device. It worked when I added it to a .zip with 7-Zip, then copied it to the location I wanted and extracted it.
I had the permissions issue when I tried to move it without zipping, and as far as I remember I still had the issue when I used Send to > Compressed (zipped) folder. I'm not sure why; maybe someone else can elaborate.
I solved the problem by deleting the old database, creating a new one and restoring the backup into the new one.
My problem was probably caused by the fact that I had created the database in an evaluation edition of MS SQL Server and I wanted to overwrite it with a backup in a new installation of the developer edition.
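For anyone in the same spot, here is a hedged sketch of restoring to a different partition on Linux once the target folder is writable by the mssql account; the database name, logical file names and target folder below are hypothetical:
-- List the logical file names stored in the backup
RESTORE FILELISTONLY
FROM DISK = N'/var/opt/mssql/backup/MyDatabase.bak';

-- Restore, relocating the data and log files to the second partition
-- (hypothetical logical names and target folder)
RESTORE DATABASE MyDatabase
FROM DISK = N'/var/opt/mssql/backup/MyDatabase.bak'
WITH
    MOVE 'MyDatabase'     TO N'/media/myuser/sqldata/MyDatabase.mdf',
    MOVE 'MyDatabase_log' TO N'/media/myuser/sqldata/MyDatabase_log.ldf';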

Recover postgreSQL databases from raw physical files

I have the following problem and I need to know if there's a way to fix it.
I have a client who was cheap enough to decline buying a backup plan for his PostgreSQL databases on the main system that runs his company, and, as I thought would happen some day, some OS files crashed during a blackout and the OS needs to be reinstalled.
This client didn't have any backups of the databases but I managed to save the PostgreSQL main directory. I read that the databases are stored somehow inside the data directory of the postgres main folder.
My question is: is there any way to recover the databases from the data folder alone? I am working in a Windows environment (XP Service Pack 2) with PostgreSQL 8.2 and I need to reinstall PostgreSQL on a new server. I would need to recreate the databases in the new environment and somehow attach the old files to the new database instances. I know that's possible in SQL Server because of the way that engine stores its databases, but I have no clue how to do it in Postgres.
Any ideas? They would be much appreciated.
If you have the whole data folder, you have everything you need (as long as the architecture is the same). Just try restoring it on another machine before wiping this one out, in case you didn't copy something.
Just save the data directory to disk. When launching Postgres, set the parameter telling it where the data directory is (see: wiki.postgresql.org). Or remove the original data directory of the fresh installation and put the copy in its place.
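Once the new server is running, you can double-check which data directory it is actually using; for example, from psql (this is a standard setting, nothing specific to this setup):
-- Shows the data directory the running server is using
SHOW data_directory;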
This is possible: you just need to copy the "data" folder (inside the Postgres installation folder) from the old computer to the new one, but there are a few things to keep in mind.
First, before you copy the files, you must stop the Postgres server service. So go to Control Panel -> Administrative Tools -> Services, find the Postgres service and stop it. When you're done copying the files and setting permissions, start it again.
Second, you need to set the permissions for the data files. Because the Postgres server actually runs under another user account, it will not be able to access the files if you just copy them into the data folder, because it will not have permission to do so. So you need to change the ownership of the files to the "postgres" user. I had to use subinacl for this; install it first, and then use it from the command prompt like this (first navigate to the folder where you installed it):
subinacl /subdirectories "C:\Program Files\PostgreSQL\8.2\data\*" /setowner=postgres
(Changing ownership should also be possible to do from the explorer: first you must disable "Use simple file sharing" in Folder options, then a "Security" tab will appear in the folder Properties dialog, and there are options there to set permissions and change ownership, but I wasn't able to do it that way.)
Now, if the server service can't start after you start it manually again, you can usually see the reason in the Event viewer (Administrative tools->Event viewer). Postgres will throw an error event, and inspecting it will give you a clue about what the problem is (sometimes it will complain about a postmaster.pid file, just remove it, etc.).
The question is very old, but I want to share an effective method that I found.
If you don't have a backup made with pg_dump but you still have the old data folder, try the following steps.
In the Postgres instance, add a record to the "pg_database" catalog table, either with a manager program or with an INSERT INTO statement.
Check the values against your own system, adjust the INSERT query below accordingly, and run it.
Once it has run, the new database will have an OID (see the SELECT at the end). Create a folder named after that number, copy your old data into it, and the database is ready to use.
/*
------------------------------------------
*** Recover From Folder ***
------------------------------------------
Check this table on your own system.
Change the differences below.
*/
INSERT INTO
    pg_catalog.pg_database(
        datname, datdba, encoding, datcollate, datctype, datistemplate, datallowconn,
        datconnlimit, datlastsysoid, datfrozenxid, datminmxid, dattablespace, datacl)
VALUES(
    -- Write your own collation here
    'NewDBname', 10, 6, 'Turkish_Turkey.1254', 'Turkish_Turkey.1254',
    False, True, -1, 12400, '536', '1', 1663, Null);
/*
Create a folder in the data directory, under data\base, named after the new OID returned below.
Copy all the old database files from "data\base\Old OID" into that new folder.
The database is now ready for use.
*/
select oid from pg_database a where a.datname = 'NewDBname';
As shown in "move database to another hard drive", all we need to do is modify the registry and the file permissions. By modifying the registry (shown in the screenshot referenced below), the PostgreSQL server knows the new location of the data.
[Screenshot: modify registry]
If you have issues with permissions, or with things like icacls, when installing against the old data folder, then try my solution from the sister website:
https://superuser.com/a/1611934/1254226
I did so, but the trickiest part was changing the owner permission:
Go to Services from Administrative Tools.
Find the Postgres service and double-click on it.
On the Log On tab, change it to Local System.
Then restart the service.

Attach .mdf file located on desktop?

In SQL Server 2008 I can attach databases located only in its predefined folder (C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA). On many occasions, especially when I'm reading a book, I need to attach a test database from the desktop rather than copy each database every time I need it, but SQL Server does not allow me to access the desktop.
Any workaround to solve this issue?
It's probably a matter of granting the account running the SQL service appropriate permissions to your desktop folder (C:\Documents and Settings\YourLogin\Desktop). But, rather than use a location like Desktop that is specific to your login and possibly inaccessible to the account running the SQL service, why not use a common holding location for these files? Something like C:\AdHocDBs or whatever you want to call it.
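For example, once the account running the SQL Server service has been granted access to such a folder, attaching could look like this (hypothetical database and file names):
-- Attach an existing data/log file pair from the common folder
CREATE DATABASE TestBookDB
ON (FILENAME = 'C:\AdHocDBs\TestBookDB.mdf'),
   (FILENAME = 'C:\AdHocDBs\TestBookDB_log.ldf')
FOR ATTACH;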
When a database file (data or log) is first created, it is (of course) located in a specific drive and folder. When a backup is created, this information is stored as part of the backup. A database RESTORE command will assume that the database is to be restored in the exact same location, unless instructed otherwise. To do this, in the RESTORE command under the "with" option, you must include the "move" option. It looks something like this:
RESTORE ...
WITH
    MOVE '<logicalFileName>' TO '<physicalFileName>',
    MOVE '<logicalLogFileName>' TO '<physicalLogFileName>'
One move must be included for each file to be so moved, so you usually end up with at least two of these clauses. The tricky part is that you must know the database files' logical names. These can be found via sp_helpFile on an attached database, and
RESTORE FILELISTONLY
from disk = '<backupFile>'
on an existing backup.
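Putting it together, a sketch with hypothetical names and paths (the logical names come from sp_helpfile or the FILELISTONLY output above):
-- Logical file names of an already-attached database
EXEC sp_helpfile;

-- Restore a backup while relocating both files
RESTORE DATABASE TestBookDB
FROM DISK = 'C:\AdHocDBs\TestBookDB.bak'
WITH
    MOVE 'TestBookDB'     TO 'C:\AdHocDBs\TestBookDB.mdf',
    MOVE 'TestBookDB_log' TO 'C:\AdHocDBs\TestBookDB_log.ldf';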
(I'm sure all this can be done somehow with the SSMS backup/restore GUIs. I switched over to TSQL-based scripts years ago, to provide quick and flexible access to all the features wrapped in the backup and restore commands.)

Create a SQL Server database on a remote machine mdf file path issue

I'm trying to create a MS SQL Server database on a database instance running on a remote machine. When I'm doing so, I need to be able to specify the path to the database (.mdf) file. If I try to create a database in a folder which doesn't exist, SQL Server will just fail (wouldn't it be nice if it created the folder structure automatically).
Is there any way that I can create the folder path on the target machine in SQL before I try to create the database, or at least to determine what the default folder is for new databases in which I could safely create the new database file?
If you have appropriate permissions, and xp_cmdshell is enabled, you can:
EXEC xp_cmdshell 'md "<path>"';
--...repeat for each node in the path
If xp_cmdshell is disabled, again assuming appropriate permissions, you can enable it temporarily using sp_configure.
Don't forget to set it back when you're done!
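A minimal sketch of the enable/disable cycle, assuming sysadmin rights (these are the standard sp_configure option names):
-- Temporarily enable xp_cmdshell
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

-- ... run the xp_cmdshell commands here ...

-- Turn it back off when finished
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;
EXEC sp_configure 'show advanced options', 0;
RECONFIGURE;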
Otherwise why can't you let the engine place the database files in their default location? If you are using a drive other than C:, you'll also need to verify that the drive you specify even exists, and shouldn't you check with the user that it is okay for you to put these files elsewhere? If you choose some arbitrary location they might not know to check there for active SQL Server files.
You can check the default path by using xp_regread (undocumented, unsupported)... these are in the registry as keys DefaultDataDir and DefaultLogDir for the default instance under:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\MSSQLServer
If it's not the default instance, check this article:
http://foxtricks.blogspot.com/2009/06/how-to-determine-default-database-path.html
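On newer versions (SQL Server 2012 and later), a simpler alternative that avoids the registry altogether is to query SERVERPROPERTY:
-- Default data and log paths for the current instance (SQL Server 2012+)
SELECT
    SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
    SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;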
Are you doing this just so that you can name your MDF/LDF files the way you want to, instead of dbname-data, dbname-log? If so, why? Have you written scripts that depend on the physical name of the file? Really curious as to the motivation behind this.
