All data erased when rerunning or editing the program - sql-server

Each time the program is run, or when it is edited and rerun, the SQL Server database (.mdf file) is cleared and only the data entered afterwards is visible. All the old data is gone.
I made this program in Visual Studio 2010. How do I fix this? How can the data be permanently saved to the SQL Server database?

Have you verified that you are looking at the correct MDF file? When you run your program, the file is copied to a folder under [ProjectFolder]\bin - if you are running the Debug configuration it will be [ProjectFolder]\bin\Debug\yourFile.mdf, and if you are running the Release configuration it will be [ProjectFolder]\bin\Release\yourFile.mdf. These are separate from the MDF file that most likely sits in the project root ([ProjectFolder]). If you make changes while the program is running, they are saved (assuming your code is correct) to one of the MDF files under the [ProjectFolder]\bin folder.
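If you want the program itself to confirm which copy it is writing to, a small check like the following can help (a minimal sketch for a .NET Framework console app; the connection string name "MyDb" is a placeholder for whatever your project uses):

using System;
using System.Configuration;
using System.Data.SqlClient;

class WhereIsMyMdf
{
    static void Main()
    {
        // |DataDirectory| resolves to this value at runtime; if it is null,
        // SqlClient falls back to the application's base directory
        // (bin\Debug or bin\Release for a typical build).
        Console.WriteLine(AppDomain.CurrentDomain.GetData("DataDirectory"));
        Console.WriteLine(AppDomain.CurrentDomain.BaseDirectory);

        // Requires a reference to System.Configuration.
        string cs = ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;
        using (var conn = new SqlConnection(cs))
        {
            conn.Open();
            using (var cmd = new SqlCommand("SELECT physical_name FROM sys.database_files", conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader[0]); // full path of the attached MDF/LDF files
            }
        }
    }
}

If the path printed at the end points into bin\Debug or bin\Release, you are looking at the copied file, not the one in the project root.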

Related

How do I restart the PostgreSQL service after putting back the original Data folder?

I'm writing some batch scripts for doing incremental backups of a PostgreSQL cluster on a Windows Server.
I copied the Data folder to a different folder, ran my backup scripts, stopped the service, deleted the Data folder, and tried recovering the database from the WAL files and such.
This didn't work, because I had copied the wrong log files, and I couldn't get the service started again, so I tried copying the original Data folder back in, but I still can't start the service.
The first script I ran called:
pg_basebackup -Fp -D %BACKUPDIR%\full_%CURRENTDATE%
This was the only line that actually ended up interacting with the data, though not with the original Data folder, which I had copied beforehand.
When trying to start the service again I get the following error message:
The postgresql-x64-10 - PostgreSQL Server 10 service on Local Computer started and then stopped.
Some services stop automatically if they are not in use by other services or programs.
I have gotten this before, after making a typo in the conf file, so I'm guessing that's just the standard error message for when something is missing.
It turned out that I had to redo the folder permissions.
This is done the following way:
5. Change permissions for the new data directory
For the new data-directory folder: right-click on it and click Properties. Under the Security tab click “Edit...” and then “Add...”. Type “Network Service” and then click “Check Names”; make sure it has Modify and Full Control permissions and then click OK. Equally important, PostgreSQL needs to be able to “see” the data directory (see my ServerFault.StackEx question), i.e. it needs read access to the parent directories above it. So right-click on the pg_db folder and under the security permissions add Network Service again, but this time it only needs Read & Execute as well as List folder contents permissions.
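If you prefer to script those two grants rather than click through the Security tab, a rough C# equivalent is below (a sketch only; the paths are hypothetical, "NETWORK SERVICE" assumes the default service account, and it must be run elevated):

using System.IO;
using System.Security.AccessControl;

class GrantPostgresFolderAccess
{
    static void Main()
    {
        // Data directory itself: Modify + Full Control (hypothetical path).
        Grant(@"C:\pg_db\data", FileSystemRights.Modify | FileSystemRights.FullControl);

        // Parent folder: roughly "Read & Execute" plus "List folder contents".
        Grant(@"C:\pg_db", FileSystemRights.ReadAndExecute | FileSystemRights.ListDirectory);
    }

    static void Grant(string path, FileSystemRights rights)
    {
        var dir = new DirectoryInfo(path);
        DirectorySecurity acl = dir.GetAccessControl();
        acl.AddAccessRule(new FileSystemAccessRule(
            "NETWORK SERVICE",
            rights,
            InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
            PropagationFlags.None,
            AccessControlType.Allow));
        dir.SetAccessControl(acl);
    }
}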
The full post is a nice checklist to go through, for anyone else facing similar issues:
https://radumas.info/blog/tutorial/2016/08/08/Migrating-PostgreSQL-Data-Directory-Windows.html

SSIS deployed package fails to map drive letter to network shared folder

Modifying this post as a friend has helped me figure it out.
The culprit was that SQL Server was not able to map a drive letter to the network shared folder, so the deployed SSIS package was not able to write. The execution report showed all green and success, which confused me as a beginner. See also the comments below.
Backup original post below:
SSIS package text file write works in Visual Studio but not when deployed on SQL Server
I have narrowed down the issue to the text-file-writing action in a script task (C#); the experiment simply writes the current timestamp into a text file.
It works in Visual Studio 2015, both with (F5) and without (Ctrl+F5) the debugger. The project is in package deployment mode. When deployed to a database server running SQL Server 2016 and manually triggered with an Administrator login, the write never happens, although the execution report shows all success, and the Windows system log shows no clue to me either.
I am a beginner with SSIS, so hints and tips will be highly appreciated.
Yes, SSIS doesn't like a shared folder mapped to a drive letter. Thanks @Nick.McDermaid.
Never use mapped drives. Use UNC paths instead, i.e. \\server\share\folder –
Nick.McDermaid Nov 8 '18 at 23:40
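For anyone landing here, a rough sketch of what the UNC-based write looks like inside the script task's generated Main() (the template already provides Dts, ScriptResults and the usual using directives; the share and file names below are placeholders, and the account running the deployed package still needs rights on the share):

public void Main()
{
    // Mapped drive letters (e.g. Z:\) belong to an interactive logon session and are
    // not visible to the service account a deployed package runs under; a UNC path is.
    string target = @"\\fileserver\ssis_out\timestamp.txt";

    System.IO.File.AppendAllText(target, DateTime.Now.ToString("o") + Environment.NewLine);

    Dts.TaskResult = (int)ScriptResults.Success;
}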

SSIS File System task didn't copy files from the source server location when scheduled

I'm new to SSIS and stuck with a problem, and I hope some of you have already been through something like this.
Task:
Copying files from a remote server to a local machine folder using a File System task and a Foreach Loop container.
Problem:
The job executes, i.e. files get copied successfully, when I execute it from the SSIS designer, but when the project is deployed on the SQL Server instance it isn't copying any files; in fact the target folder is totally empty.
I don't understand this strange behavior. Any input would be of great help!
Regards-
Santosh G.
The Foreach Loop will not error out if it doesn't find any files.
The SQL Agent account may not have access to read the directory contents.
Check whether your path is a variable - is it being set by a configuration or a /SET statement?
Can you log the path before starting the loop? (See the sketch below.)
Can you copy a dummy file there and check whether SSIS can see it?
How are you running the job? cmd_exec() can give spurious results with file I/O tasks.
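One quick way to do that logging is a small script task placed just before the loop; it pushes the resolved path into the package's execution messages, which show up in the execution report even when SQL Agent runs the job. A sketch (User::SourceFolder is a placeholder variable name and must be listed in the task's ReadOnlyVariables; the generated ScriptMain template provides Dts and ScriptResults):

public void Main()
{
    bool fireAgain = true;

    // Placeholder variable name - use whatever variable holds your source folder.
    string path = Dts.Variables["User::SourceFolder"].Value.ToString();

    Dts.Events.FireInformation(0, "PathCheck",
        "Foreach loop will enumerate: " + path, string.Empty, 0, ref fireAgain);

    Dts.TaskResult = (int)ScriptResults.Success;
}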
The issue was related to the authorization of the SQL Server Agent service account.
When I execute the job from SQL Server it uses the Agent service, and that Agent service needs to run as a user who has access rights to the desired file path.

Where to place database files in Visual Studio

If this question has been asked before please point me in the right direction.
I am working with an MDF file which I attach to localdb in Visual Studio.
However, when I run my application it is copied to the Debug folder (I know this is because the "Copy always" option is set).
This works fine because my connection string is:
Data Source=(localdb)\v11.0;AttachDbFilename=|DataDirectory|\Invoicing.mdf;Integrated Security=True
which means my application will look for the database in the Debug folder.
My question is, where should I place the database file? Because:
1. The file I am attaching to localdb is under the project folder
2. Meanwhile, my application looks for the database file in the Debug folder
I would appreciate any guidelines.
I found this article:
http://msdn.microsoft.com/en-us/library/ms246989.aspx
I guess I will have to attach the one in the bin folder and choose the "Copy if newer" option.
EDIT:
I have done the above, and in addition I have created two connections: one for the MDF in the project folder and the other to view data in the database in the bin folder.
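A related option, if you want to avoid having two copies of the database at all (a sketch only, assuming a desktop .NET Framework application): override |DataDirectory| at startup so it points at the project-folder MDF, and the copy in bin is never used.

using System;
using System.IO;

static class DataDirectorySetup
{
    // Call once at application startup, before the first connection that uses |DataDirectory|.
    public static void UseProjectDatabase()
    {
        // Walks up from bin\Debug (or bin\Release) to the project folder;
        // adjust the ".." segments if your layout differs.
        string projectDir = Path.GetFullPath(
            Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"..\.."));

        AppDomain.CurrentDomain.SetData("DataDirectory", projectDir);
    }
}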

Windows 2008 R2 - Kernel (System Process PID=4) is locking files and folders

Windows 2008 R2 - Kernel (System Process PID=4) is locking files and folders for a long time.
For example when deleting a file, the file may remain locked for 1 minute or more and only after that be deleted.
On other occasions I encountered files or folders I could not delete. ProcMon showed that the System process was holding a handle to those resources for a couple of minutes and then released them.
None of the resources I mentioned were system resources, only files and folders installed by me and handled by my applications.
As Dani has already mentioned in the comment:
It's a bug in Windows 7 and likely in Windows Server 2008 (possibly 64-bit versions only). It surfaces when you disable the Application Experience service.
Re-enabling this service fixed the problem for me.
A bit more info here as to why it causes a problem.
List of other SO questions which seem to be related:
Visual Studio output file permissions?
Under which circumstances does the System process (PID 4) retain an open file handle?
Files accessed through a share will be locked by the System process (PID 4).
Try opening compmgmt.msc -> System Tools -> Shared Folders -> Open Files to see if the locked file is listed there.
See also the Sysinternals forum for a way to replicate this.
Not all applications lock files when they are opened; Excel, however, does...
In my case, it was fixed by a simple command in the command line:
net session /delete
I hope that helps.
Hope this helps others.
Open the Windows Run dialog and launch mmc.exe.
File -> Add or Remove Snap-ins -> Shared Folders -> Local computer.
Select Open Files, scroll down to the directory or file, and right-click to close it.
You can also get the username that holds the lock, go to Sessions, and right-click -> Close Session.
In my case it was macOS 10.13 holding file locks open...
https://support.apple.com/en-us/HT208209
I had this issue when trying to rename a folder. I had to stop the Server service while performing the rename. Just restarting it didn't help, as the System process re-locked the folder as soon as the Server service restarted.
Do this to resolve the problem:
Go to Services and enable the Application Experience service.
Tried all of these...
Even copying the file, deleting the original, and renaming the copy to the original name (all on the server) would immediately tell me the user had it locked.
In the end I:
Used Unlocker to clear the file locks.
Copied the file OFF THE SERVER to a desktop.
Deleted the original file off the server.
Changed the filename of the copy on the desktop.
Renamed it back to the original name on the desktop.
Put the file back into the original location ON THE SERVER.
HTH, YMMV... :)
Had this issue just now whilst trying to replicate data to a new file server (both source & destination servers running Windows 2008 R2).
PID 4 was found locking the file (using procexp as above), but Application Experience has never been installed on either server & the file was not shown in the list of open files.
Fortunately we use scheduled shadow copies on this server (to enable users to self-serve most file recoveries). I just used the Previous Versions option (available through Properties of the containing folder), selected the most recent copy of the file & copied it to somewhere else, then deleted and replaced the problem file.
You might need to delete the containing folder to delete the file - which could obviously be a problem if lots of files are in use (this wasn't an issue for me, given this was the only file in the folder).
For a one-off issue like I had (single locked file for the whole server drive), this worked without any disruption to the server or users.
Given you are talking about a server, and that Shadow Copies use VSS, you should be able to recover the locked file from your backups (presumably you have these) if you don't use Shadow Copies. Otherwise there are some useful utilities like ShadowSpawn (https://github.com/candera/shadowspawn) around that might help.
