I am using SQL Server 2012 (SP2) on Windows Server 2012 R2. I created a maintenance plan that backs up my databases and deletes report files and backup files that are older than 2 days.
My problem is that the maintenance plan doesn't delete files that are older than 2 days; these files are also not deleted after 5 or 6 days. The backups are verified and compressed.
I have rights on the directory (the agent is running under LocalSystem), and in the task properties under "file extension" I tried to set the name without and with a dot, e.g. ".bak", "bak" and "*.bak", but nothing works. I changed the order of the tasks, and nothing works.
But I always get success messages. So what is wrong here?
Just to clarify, are you trying to run a script (i.e., a .bat) to handle some of this through SQLCMD, and calling that script through Task Scheduler? From your question, it sounds like you've set the Maintenance Plan up in SQL Server, but are trying to set up a "Task" to call it (Task Scheduler) rather than a "Job" (SQL Agent); my apologies if this is not the case.
It will be a lot tidier to do this through the SQL Agent (the SQL Server internal equivalent of Task Scheduler). Once you have a Maintenance Plan you can test through the Management Studio interface, it's a breeze to add the Agent Job.
If you are running a Task Scheduler task to call a script, you're likely getting "success" because the task started your script, but the script isn't set up to handle the return codes from the SQLCMD call properly.
I tried to solve this in different ways without success; the quickest workaround I found is to use a .bat that deletes the files and a scheduled Windows task that runs it every night.
This batch file deletes everything older than 3 days:
REM Recursively delete files older than 3 days under C:\Directorio
FORFILES /P "C:\Directorio" /S /D -3 /C "CMD /C DEL /Q @PATH"
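If you want to create the nightly Windows task from the command line as well, something like this sketch should work (the task name, script path, and start time are placeholders to adapt):

REM Run the cleanup batch file every night at 02:00 under the SYSTEM account
schtasks /Create /TN "NightlyBackupCleanup" /TR "C:\Scripts\cleanup.bat" /SC DAILY /ST 02:00 /RU SYSTEM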
I have created a batch file startservice.bat to start a Windows service, and I am scheduling this batch file using Task Scheduler on Windows Server 2012 R2.
startservice.bat
NET START myservice
But when the scheduled task runs, a cmd pop-up appears and the service is not actually started.
If I run this batch file manually, then the service starts.
On my 2012 R2 server, I attempted to reproduce the problem you describe. The batch file always starts the service when I run it via a scheduled task, and I never get a cmd pop-up.
Your question did not specify scheduled task settings. You may need to enable the "Run with highest privileges" option on the General tab of your scheduled task.
Your action should call the batch file directly or you can call CMD. Either way, make sure you call the full path of the batch file.
cmd /c C:\startservice.bat
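If you prefer to create the task from the command line, a sketch along these lines should do it (the task name and trigger are placeholders; /RL HIGHEST corresponds to the "Run with highest privileges" checkbox):

REM Start the service script at boot, elevated, under the SYSTEM account
schtasks /Create /TN "StartMyService" /TR "cmd /c C:\startservice.bat" /SC ONSTART /RL HIGHEST /RU SYSTEM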
Hopefully this question is unique enough not to be a duplicate. I have a PowerShell script which does two things.
Inserts records into a SQL Server table
Writes text to a text file
For the purpose of this post, I have simplified the script. On my computer, the script is located at C:\Temp\ssis.ps1. Following is the contents of the script.
# Run the SSIS package
DTEXEC.EXE /F "C:\Temp\ssisjob.dtsx"
# Log when the script last ran
$date = Get-Date
Write-Output "This PowerShell script file was last run on $date" >> C:\Temp\test.txt
When I manually run this PowerShell script, records are inserted into the SQL Server table, and a line of text is written to the test.txt file. If I schedule this script to run using Windows Task Scheduler, a new line of text is written to the text file, but the records are not inserted into the SQL Server table. This tells me that Windows Task Scheduler is able to run the PowerShell script; however, for some unknown reason, Windows Task Scheduler seems not to run the SSIS job (DTEXEC.EXE) part of the script. Event Viewer confirms there is an issue with the SSIS job. I am running Microsoft SQL Server 2014, Developer Edition.
In my task, on the Actions tab, the Add arguments field has the following reference: C:\Temp\ssis.ps1. Task Scheduler is configured to run with the highest privileges.
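For reference, with powershell.exe in the Program/script box and the .ps1 path in Add arguments, the action usually ends up equivalent to a command line like this (the -NoProfile and -ExecutionPolicy switches are optional additions, not necessarily part of your setup):

powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Temp\ssis.ps1"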
I have tried all of the following Execution Policies in PowerShell. Regardless of the Execution Policy I select, my experience does not change.
Bypass
Unrestricted
RemoteSigned
The History tab in Task Scheduler has information events, but no error events.
I do not have the permission to view the SQL Server logs (this is a production server).
I have been debugging this issue for a few weeks, and I have read numerous posts here on Stack Overflow, yet I still cannot seem to find the answer, so hopefully I have done my due diligence before making a new post. I could add some additional observations, but I do not want this post to get excessively long. If anyone has any hints, tips, or insight that might lead me down the right path, it would be greatly appreciated.
Here is the solution I came up with. Instead of exporting the file to Excel, I exported to a flat file (txt file). Also, following Nick McDermaid's excellent recommendations, instead of using PowerShell in Task Scheduler, I started the dtexec.exe file in Task Scheduler.
Task Scheduler Actions tab
Keep the action as Start a program
In Program/script, type dtexec.exe
In Add arguments, type /f "C:\path\to\example.dtsx"
Leave the Start In box empty
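The same task could also be created from the command line; here is a rough sketch (the task name, schedule, and package path are placeholders, and the inner quotes are escaped with backslashes as schtasks expects):

REM Run the SSIS package every night at 01:00 with highest privileges
schtasks /Create /TN "RunSsisPackage" /TR "dtexec.exe /f \"C:\path\to\example.dtsx\"" /SC DAILY /ST 01:00 /RL HIGHEST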
I have SQL Server 2012 on my local machine and I am hosting the production version of my application/database with Arvixe.
To initially set up the database on the remote server with Arvixe, I just uploaded a .bak file of my local DB. Not a big deal since it was just getting things set up, but, as you know, this also pushes all of my test data to the database on my production server.
My question is this: how should I go about pushing database changes (new tables, columns, keys, etc.) from my local development environment to the production environment on Arvixe's server? A simple backup won't work now; I can't overwrite my production data and replace it with dev data.
Is there a program that I can use for this? Is there something in SQL Server 2012 that I'm just missing? All I can find is the suggestion to upload a backup version of my local DB.
Thanks for your help.
The way to push database changes from Development to Production has little to nothing to do with where the Production instance is located.
All changes should be migrated to other environments via rollout scripts:
You should be creating scripts for all changes as you make those changes.
The scripts should be placed in a folder for a particular release.
The scripts should be numbered to retain the same chronological order in which those changes happened (this should eliminate -- mostly -- any dependency issues).
The script numbering should be consistent with 2 or 3 digits for proper sorting (i.e. 01, 02, ... OR 001, 002, ...).
The scripts should all be re-runnable (i.e. they should first check to see if the intended change has already happened and if so, skip to the next script).
When migrating to a higher environment, just run all of the scripts.
If any script fails, the entire process should terminate since all changes need to occur in the same order in which they happened in Development. If using SQLCMD.EXE to run your scripts, use the -b (i.e. "On error batch abort") command line switch, as it will terminate the process upon any error.
Here is a simple CMD script (you could name it DeploySqlScripts.cmd) that handles one folder at a time, to a single Server/Instance, and assumes that you have a USE [DatabaseName]; line at the top of each script:
@ECHO OFF
SETLOCAL ENABLEDELAYEDEXPANSION
IF "%~1"=="" (
   ECHO Usage: DeploySqlScripts "full\path\to\SQL\scripts" Server[\Instance]
   GOTO :EOF
)
REM Run the scripts in name order; "delims=" keeps filenames with spaces intact.
FOR /F "delims=" %%B IN ('DIR /B /O:N /A-D "%~1\*.sql"') DO (
   REM DIR /B returns bare filenames, so prefix the folder back on for SQLCMD.
   SQLCMD -b -E -S "%~2" -i "%~1\%%B"
   REM With -b, SQLCMD returns a nonzero exit code upon any error.
   IF !ERRORLEVEL! NEQ 0 (
      ECHO.
      ECHO Error in release script %%B ...
      ECHO.
      EXIT /B !ERRORLEVEL!
   )
   ECHO.
)
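For example, a hypothetical run against a release folder (the folder and instance names are made up):

REM Deploy all scripts for release 042 to the Production instance
DeploySqlScripts.cmd "C:\Releases\042" PRODSRV01\SQL2012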
Also:
If you are migrating from Dev to Prod, then you are missing at least one environment, if not 2 or even 3. You should not push changes directly from Development to Production since you might be developing things that are not ready to release. All changes that you feel are ready for release should first go to a QA environment that provides a better testing ground since it doesn't have any other changes that might invalidate certain tests.
You really, really should have your database object CREATE scripts in some sort of source code control system (a.k.a. version control system): Subversion (SVN), Git, TFS, etc. There are several options, each with their pros and cons (as well as true believers and haters). So do some research on a few of them, pick one that suits your needs, and just use it. And yes, the release scripts should also be part of that repository.
There is also a tool from Redgate, SQL Source Control, which is not free; I have not used it, but it aims to help in this area.
For simple / small projects (single database) Microsoft has a free tool that can script out differences between a source (i.e. Development) and target (i.e. QA or Production). It is called SqlPackage.exe and is part of SQL Server Data Tools (SSDT).
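As a rough sketch of how that might look (server, database, and file names are placeholders, and SqlPackage.exe must be on the PATH or called by its full path):

REM Extract a .dacpac describing the Dev database's schema...
SqlPackage.exe /Action:Extract /SourceServerName:DevServer /SourceDatabaseName:MyDb /TargetFile:MyDb.dacpac
REM ...then generate the T-SQL needed to bring Production in line with it
SqlPackage.exe /Action:Script /SourceFile:MyDb.dacpac /TargetServerName:ProdServer /TargetDatabaseName:MyDb /OutputPath:Upgrade.sql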
You can set up triggers on the database objects you're interested in pushing and set up the production server as a linked server.
As an aside, pushing production data to the development environment is fine, but going the other way involves a lot of risk.
I am running a SQL Server 2008 instance. I am using remote backup to back up the server and have two batch files to trigger the backup process. The first batch file contains:
@echo off
CALL "" "C:\Scripts\SQL Maintainence\Nightly Maintenance 2008 SSE.bat"
The second batch file (Nightly Maintenance 2008 SSE) looks like this:
@echo off
osql -S %SERVER% -d msdb /U (username) /P (password) -i "Nightly Maintenance 2008 SSE.sql" -o "Nightly Maintenance 2008 SSE.txt"
For some reason, the first batch file is not calling the second batch file to run; the script works when run manually.
I am very new to writing batch files and have done quite a bit of research up to this point. Any help or maybe an article that can help with my issue would be greatly appreciated.
I think you need to delete the "" immediately after the CALL; with it there, CALL treats the empty string as the command to run, and your second batch file is only passed as an argument. Like this:
CALL "C:\Scripts\SQL Maintainence\Nightly Maintenance 2008 SSE.bat"
I restored my development database from production, and the stored procedures I need in my development environment don't exist in my production database. Is there a command I can use to import the development stored procedures back into SQL Server? There are about 88 files, as each procedure is in a different text file.
TIA!
Chris
Oops, you generated the scripts the painful way. You should have created a single script for all procedures by right clicking on the database in SSMS and choosing Tasks -> Generate Scripts.
However, if you don't want to go through that process again, open up a cmd shell in the folder and remember those old batch file days:
for %f in (*.sql) do sqlcmd -i "%f"
This should do the trick!
You could add other parameters to sqlcmd if required (e.g. login, password, server name, ...). To see the list of switches, just run sqlcmd -?.
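For example, pointing the same loop at a specific server and database (the server name, database, and credentials below are hypothetical):

for %f in (*.sql) do sqlcmd -S MYSERVER\SQL2012 -d MyDatabase -U deploy_user -P s3cret -i "%f"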
For SQL 2K & 2K5, you want this tool.
I asked a similar question a while ago and got this advice from Mike L (give him votes here).
Right click on the database from where you want to transfer the data
Select Data Transfer
Select Tables or Stored Procedures (whatever you want to transfer)
Select the location where you want to transfer the data (either on server or localhost or any file)
Right click on the development database, hit Generate SQL Scripts, and then only select stored procedures. If you need additional filtering, you can even filter out the stored procedures you don't want.
Then just run that script on development.
I don't know if there's a command-line way to do it, but if you have them all in text files, it shouldn't be difficult at all to write a quick, down-and-dirty app that just loops through all the files and runs the create statements on your production server, using whatever language you choose.
If, like me, you have to deal with a bunch of SQL files in a hierarchy of folders, this one-liner will combine them into a single file called out.sql, which you can easily execute in SQL Server Management Studio. It will only include files that end in .sql, and will ignore files such as *.sqlsettings.
Run it from the root folder of the hierarchy of .sql files. Be sure you have nothing of value in out.sql, as it will be replaced.
del out.sql && for /f "delims=" %f in ('dir /b /s ^| findstr /E \.sql') do type "%f" >> out.sql
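If you would rather skip Management Studio, the combined file can also be run with sqlcmd (the server and database names here are placeholders):

sqlcmd -E -S MYSERVER\SQL2012 -d MyDatabase -i out.sql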