I came back to a machine to work on it today and noticed that my batch files can no longer back up a copy of my database using the BACKUP DATABASE command.
The server runs Windows Server 2008 R2 and SQL Server 2008 R2.
No more than a month ago this same batch file backed up my database without issue. Now when I manually run my SQL script I get an "Operating System Error 3 - cannot find file specified".
Here's my script:
DECLARE @FileName varchar(50), @Date varchar(20)
SET @Date = REPLACE((CONVERT(VARCHAR(10), GETDATE(), 101)), '/', '-')
SET @FileName = ('C:\mybkfolder\BackupSQLData\db_dnt_' + @Date + '.bak')
BACKUP DATABASE db_dnt TO DISK = @FileName
GO
I have verified that SQL Server is running under the SYSTEM account. I can manually make a backup from SQL Server Management Studio to the folder in question, but if I run my script as a query in Management Studio I get this error.
I went to this folder and set the security permissions so that the SYSTEM account has full access. Then I went to Task Scheduler and made sure my scheduled task (it runs a batch file, which calls a .sql file containing the script above) was configured to run under SYSTEM. Still nothing is backing up.
Other than Windows patches, nobody has really touched the machine. What would cause this to just stop working? This is SQL Server Express, and since Express doesn't let you create SQL Server Agent jobs like the full edition does, I have to use a two-batch-file system where one batch backs up and the other purges old files to keep the folder under control.
This turned out to be an issue with a path that no longer existed. My task called a batch file, and the batch file called a .sql script. I had forgotten to update the path in the SQL script even though I remembered to do so in my batch file.
Related
System Error: Cannot bulk load because the file "XYZ.txt" could not be opened. Operating system error code 1311(There are currently no logon servers available to service the logon request.)
I have a stored procedure in SQL Server 2008 R2 which uses the BULK INSERT command to load data from txt files into SQL Server tables. These files are in a shared folder on a drive located in a different domain. I have full access to the drive: I tried copying files to a different directory on that drive, moving files, and deleting files, and everything works.
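For context, the statement inside the procedure is essentially a plain BULK INSERT against the UNC path. A minimal sketch of what it looks like is below; the table name, share path and WITH options are placeholders, not the actual ones:

-- Hypothetical example of the kind of statement the procedure runs;
-- the table name, share path and format options are placeholders.
BULK INSERT dbo.StagingTable
FROM '\\otherdomainserver\share\XYZ.txt'
WITH (
    FIELDTERMINATOR = '\t',  -- assumed tab-delimited file
    ROWTERMINATOR = '\n',
    FIRSTROW = 2             -- assumed header row to skip
);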
When I execute the stored procedure from an SSMS session on my local computer it works like a champ: it opens the files on the shared drive, reads them, and loads the data into the SQL Server tables without any issue. When I call the stored procedure from a SQL Server Agent job, it throws this error.
SQL Server Agent runs under an account that is very powerful, with a lot more permissions than mine, but the job still fails.
To find a workaround I created an SSIS package which calls the stored procedure from an Execute SQL Task. It uses Windows authentication to connect to the database. I tried executing the package and it ran successfully; it is able to load the data from the txt files into the tables.
So then I created a credential with my account details and used it to create a proxy for the SSIS subsystem. I then scheduled the job to execute the step under the newly created proxy to see if it could load the data, but it failed with the same error.
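For reference, the credential and proxy setup being described corresponds roughly to the following; every name and password here is a placeholder, and 'Dts' is the msdb subsystem name for SSIS package execution:

-- Sketch of the credential + SSIS proxy setup described above (names are placeholders).
USE master;
CREATE CREDENTIAL MyWindowsCredential
    WITH IDENTITY = 'DOMAIN\MyAccount', SECRET = 'MyPassword';
GO
USE msdb;
GO
EXEC dbo.sp_add_proxy
    @proxy_name = N'SsisProxy',
    @credential_name = N'MyWindowsCredential';
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SsisProxy',
    @subsystem_name = N'Dts';  -- the SSIS package execution subsystem
GO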
I am confused about what I am doing wrong. I even added myself to the bulkadmin role and ran the job again with no success.
I'd appreciate it if someone can help.
Thanks.
Just out of curiosity I tried replacing the BULK INSERT command with BCP. For some reason BCP worked: it is able to open the files on the network drive and read through them to insert the data into the SQL Server tables. I can even call the same stored proc from the SQL Agent job and it works perfectly fine. I didn't need to use an SSIS package to solve this.
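Since the files are read from within a stored procedure, the BCP call presumably goes through xp_cmdshell. A minimal sketch, with placeholder database, table, share and server names, and assuming xp_cmdshell is enabled:

-- Sketch of the BCP-based replacement; names and paths are placeholders.
DECLARE @cmd varchar(1000);
SET @cmd = 'bcp MyDatabase.dbo.StagingTable in "\\otherdomainserver\share\XYZ.txt" '
         + '-S myServer\instanceName -T -c';   -- -T = trusted connection, -c = character data
EXEC master.dbo.xp_cmdshell @cmd;   -- requires xp_cmdshell to be enabled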
We are taking full disk backups of our servers weekly. How do I set up SQL Server 2016 to back up all databases automatically to a local folder every Friday at 6 PM? I'd like to do that to add an extra level of protection against database corruption.
I recommend using Ola Hallengren's scripts for database backups. https://ola.hallengren.com/sql-server-backup.html
Remember to run DBCC CHECKDB before taking a backup, to validate the backup media, and to test your restore scenarios frequently!
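For example, a minimal sketch of both checks (the database name and backup path are placeholders):

-- Integrity check before taking the backup
DBCC CHECKDB (YourDBName) WITH NO_INFOMSGS;

-- Validate the backup media without actually restoring it
RESTORE VERIFYONLY FROM DISK = N'C:\Backup\YourDBName.bak';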
Here are the steps from the FAQ:
Download MaintenanceSolution.sql.
In the script, find this line:
SET @BackupDirectory = N'C:\Backup'
and replace C:\Backup with the path to your backup directory.
In the script, find this line:
SET @CleanupTime = NULL
and replace NULL with your cleanup time. The cleanup time is the number of hours after which the backup files are deleted.
Execute MaintenanceSolution.sql. This script creates all the objects and jobs that you need.
Go into [SQL Server Agent] / [Jobs] and start the jobs that have been created. Verify that these jobs are completing successfully. Verify that the backup files are being created. Check the output files in the error log directory.
Schedule the jobs.
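To match the requirement in the question, you would schedule the FULL backup job for Friday at 6 PM. For reference, that job ends up calling the DatabaseBackup procedure the script created, roughly like this; the directory and retention values below are just examples:

-- Roughly what the generated FULL backup job executes (directory and retention are examples)
EXECUTE dbo.DatabaseBackup
    @Databases = 'USER_DATABASES',
    @Directory = N'C:\Backup',
    @BackupType = 'FULL',
    @Verify = 'Y',
    @CheckSum = 'Y',
    @CleanupTime = 168;  -- delete backup files older than 168 hours (one week)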
My Windows Server 2003 installation got corrupted and I'm trying to repair it, but before doing that I want to create a backup of my SQL Server databases.
Can anyone please tell me which files I need to copy from the Windows command line, as I'm not familiar with SQL Server? I'm looking for the database files from which I can restore the data.
It's an old server but the data is important.
Also, if I repair Windows Server 2003 using the repair disk, will it affect the SQL Server files?
When I start the server I get this error: http://postimg.org/image/5jsstbqmd/
You can use this SQL command (adapt to your specific case):
--Back up the full database:
BACKUP DATABASE YourDBName
TO DISK = 'Z:\SQLServerBackups\BackupFileName.bck';
GO
See Backup in Transact-SQL for more details.
To run a SQL script from the command line:
sqlcmd -S myServer\instanceName -i C:\myScript.sql
Before messing with anything, you could take a complete image of your hard drive using a tool such as Clonezilla.
I would get to the root of your disk and run
dir /a /s *.mdf
.mdf is the file extension SQL Server uses for its data files, and that command will tell you where they are located. The log files (.ldf) are usually in the same directory.
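If the SQL Server instance will still start, you can also ask it directly where every data and log file lives; this assumes SQL Server 2005 or later (on SQL Server 2000 the equivalent information is in master.dbo.sysaltfiles):

-- Lists the physical location of every data and log file on the instance
SELECT DB_NAME(database_id) AS database_name,
       type_desc,
       physical_name
FROM sys.master_files
ORDER BY database_name;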
As for your second question, the disk repair will only affect your database files if they are part of the corruption that is happening, which is quite likely if you were running a high-I/O database when it crashed. I would definitely try to copy those files off before running a disk check.
My question is about initializing the variables DefaultDataPath and DefaultLogPath without hard-coded file locations. Before adopting Database Projects as our standard deployment and database management tool and migrating our existing scripts to Database Projects, we used a set of CREATE and INITIALIZE scripts to set up the database. We use the following SQL query to create the database with the file locations:
DECLARE @data_path nvarchar(256), @mdb_file nvarchar(256), @ldf_file nvarchar(256)
DECLARE @cfdata sysname, @cflog sysname

SET @data_path = (SELECT SUBSTRING(filename, 1, CHARINDEX(N'master.mdf', LOWER(filename)) - 1)
                  FROM sys.sysaltfiles WHERE dbid = 1 AND fileid = 1);
SET @mdb_file = @data_path + 'CF_DB.mdf'
SET @cfdata = 'CF_DB_Data'
SET @cflog = 'CF_DB_Log'
SET @ldf_file = @data_path + 'CF_DB_log.ldf'

DECLARE @sql nvarchar(500)
SET @sql = 'CREATE DATABASE [CF_DB] ON (NAME = ' + quotename(@cfdata) + ', FILENAME = ' + quotename(@mdb_file, '''') + ', SIZE = 53, FILEGROWTH = 10%) LOG ON (NAME = ' + quotename(@cflog) + ', FILENAME = ' + quotename(@ldf_file, '''') + ', SIZE = 31, FILEGROWTH = 10%) COLLATE SQL_Latin1_General_CP1_CI_AS'
EXEC (@sql)
Here we are figuring out the location of the master database's MDF file and using the same location for CREATE DATABASE.
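As a side note, on SQL Server 2012 and later the same default location can be read directly instead of parsing the master file name (on 2005/2008 these properties simply return NULL, so the sysaltfiles approach above is still needed there):

-- Available from SQL Server 2012 onward; returns NULL on earlier versions
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath') AS DefaultLogPath;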
Problem: in the scripts generated by the Deploy action, there are auto-generated SQLCMD variables initialized with either a hard-coded default path or empty strings (which fall back to the default data file path used by SQL Server 2008 or 2005).
:setvar DatabaseName "CF"
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
We need to make it work like our existing system: determine the path of the master database's data and log files and use that path to initialize DefaultDataPath and DefaultLogPath. We can't use pre-deployment scripts because the database settings are applied by the script the Database Project generates before the pre-deployment script is embedded in the final deploy script.
The next big thing: developers need to switch to SQLCMD mode in SQL Server Management Studio to run the scripts generated by the DB Project. Our implementation team's requirement is NOT to use SQLCMD mode to set up the database. To get around this step, I need to modify the generated SQL file to use SQL variables instead of SQLCMD variables. Can we generate clean SQL statements while keeping the automated script generation intact? I know both of these issues are correlated, so the solution for one will fix the other.
Thanks for any suggestions or help on the above.
Regards
Sumeet
Not sure how best to handle your file path, though I suspect you will not want to use the default file path setting, and will instead want a new file path that you can control through a variable.
It sounds to me like you're trying to have the developers update their local machines easily. My recommendation would be to build out some batch files that do the following:
Set the PATH to include the location for MSBuild.exe
Get the location for your master database
Pass that location in to a variable
Run the MSBuild command to set your path variables to the local master path and publish the database/changes.
Overall, that sounds like more trouble than it's really worth. I'd recommend that you send out a SQL script to all of the developers that gets them to set up their default data/log paths properly, and just use the defaults.
I do recommend that you look into setting up some batch files to run the MSBuild commands. You'll have a much easier time getting the database builds to your developers without them generating scripts and running them locally. Alternatively, you could have them change their SSMS defaults to turn SQLCMD mode on for their connections. SSDT made this a little nicer because it won't run at all without SQLCMD mode turned on, which eliminated a lot of the messiness from the VS2008/VS2010 DB Projects.
I used something like the following to build and deploy when we were on DB Projects:
msbuild /m .\MyDB\MyDB.dbproj /t:build
msbuild /m .\MyDB\MyDB.dbproj /t:deploy /p:TargetConnectionString="Data Source=localhost;Integrated Security=True;Pooling=False;" /p:TargetDatabase="MyDB"
When SSDT/VS generates the SQL file, it actually runs some queries against the server you've told it to connect to. The settings you're getting here for example...
:setvar DatabaseName "CF"
:setvar DefaultDataPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
:setvar DefaultLogPath "C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER2008\MSSQL\DATA\"
are obtained from the server settings of the target database connection you specified in your publish file/profile.
On the server that you are using to generate your scripts, open regedit.exe and search for the keys "DefaultLog" and "DefaultData" under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft - they should be in the same location. See if they match the settings your scripts are generating.
You can change these on the server/your PC (wherever you are pointing to) and the locations you enter will appear in your generated SQL scripts. Naturally, be cautious around a server you do not own or one that is in use for production, as this changes a setting on the server that tells SQL Server where to place new databases. This appears to be a different setting than the one you enter under SQL Server properties -> Database Settings.
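If you prefer not to open regedit, the same values can be read from inside SQL Server with xp_instance_regread (undocumented but long-standing); note that these registry values only exist if the defaults have been set explicitly:

-- Reads the instance's DefaultData / DefaultLog registry values
-- (xp_instance_regread is undocumented; returns NULL if the value was never set)
DECLARE @DefaultData nvarchar(512), @DefaultLog nvarchar(512);
EXEC master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultData', @DefaultData OUTPUT;
EXEC master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultLog', @DefaultLog OUTPUT;
SELECT @DefaultData AS DefaultData, @DefaultLog AS DefaultLog;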
Hope that helps!
This has been driving me crazy. I have tried all suggestions and no go.
This absolutely does not work. The files are all still there.
The job runs successfully, but the files do not delete.
I recently ran into the same problem, and it was due to folder permissions. Easy enough to check:
Check the properties of a recent bak or trn file, security, and find out who the owner is.
Now check the properties of the backup FOLDER, security and see if the FILE owner from step 1 has enough effective permissions to delete files. The account might only have enough to create and modify, but not to remove files.
The peculiar part is that the plan always ran "successfully", even though it failed miserably. This is why teachers shouldn't let students grade their own tests. (grin).
What account is this running under? Domain Admin, service, etc?
I've always found it easier to create a batch job and use Windows Task Scheduler to clean up .bak files older than x weeks. Can you look at the job history and see whether the task failed or succeeded? It may be worth looking at the Event Viewer on the server as well.
The only solution I could find was to take the SQL that the cleanup task generated and run it in a stored procedure instead, because guess what? The SQL that this plan generates runs perfectly!
This is the code I am now running, and it works.
ALTER PROCEDURE spUtility_delete_OldBackups
AS
DECLARE @date varchar(28)
SET @date = CONVERT(varchar(28), DATEADD(DAY, -5, GETDATE()))
EXECUTE master.dbo.xp_delete_file 0, 'D:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Backup\EEIDW\', 'bak', @date, 1
EXECUTE master.dbo.xp_delete_file 0, 'D:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Backup\EEIDW\', 'diff', @date, 1
EXECUTE master.dbo.xp_delete_file 0, 'D:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Backup\EEIDW\', 'trn', @date, 1
I have a similar job that runs with no problems. What account does SQL Server Agent run under? These maintenance plans execute as SQL Server Agent, and if the security context that SQL Server Agent runs under does not have adequate permissions, this job will fail. There should be some job history that will tell you more, though.
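On SQL Server 2008 R2 SP1 and later you can check the service accounts straight from T-SQL, for example:

-- Shows which accounts the SQL Server and SQL Server Agent services run under
-- (sys.dm_server_services requires SQL Server 2008 R2 SP1 or later)
SELECT servicename, service_account, status_desc
FROM sys.dm_server_services;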