Copy the content of multiple DBF files into other DBF files

I have 200 DBF files (belonging to a Windows program) whose content I need to transfer into different DBF files that have the same names; i.e. the content of file1.dbf should be transferred into file1.dbf in a different folder, and I need to do that for all 200 files. I have only found a manual solution, but it is slow; I need to do them all at once. The reason I need to do it this way is that the files have some kind of protection I don't understand: if I simply replace them, the program gives a database-related error. Any easier solution is welcome.
What really happened:
The files are used by accounting software. I created the backups with the software's own backup function, which ran daily and automatically. The restore function is blocked in the free version, so I can only restore the files manually, and as a form of protection the backup files only work on the exact software version that created them. The thing is, the software is 100% free as long as you don't update it; I didn't know that until I selected the update menu option. Now the software is updated and asks for a license key, which is very expensive. I have the backup files, but since the restore function is blocked I can't restore them with the program. I can only replace the files myself from the backup, which is a directory that was backed up. The problem is that the files themselves carry some protection and are not recognized, except if I edit them manually with the DBF Manager software: if I open a default file from the software and paste in the content from the backup file, it works, but I have many files. Now I'm stuck with the company's documents locked, unable to access them.
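A scripted version of the manual copy should be possible, because the trick that works in DBF Manager (keep the program's own file, replace only its records) can be done byte-for-byte: a DBF file is a small header describing the fields, followed by fixed-length records. Below is a minimal Python sketch under these assumptions: the backup and live files share an identical field structure, there are no memo (.dbt/.fpt) companion files involved, and SRC_DIR/DST_DIR are hypothetical paths. Test it on copies first.

import os
import shutil
import struct

SRC_DIR = r"C:\backup\dbf"    # hypothetical: folder holding the backup DBFs
DST_DIR = r"C:\program\data"  # hypothetical: the live program's data folder

def splice_records(src_path, dst_path):
    """Keep dst's own header, replace its record area with src's records
    (the byte-level equivalent of the manual paste in DBF Manager)."""
    with open(src_path, "rb") as f:
        src = f.read()
    with open(dst_path, "rb") as f:
        dst = f.read()
    # DBF header: bytes 4-7 = record count (uint32 LE),
    # bytes 8-9 = header length, bytes 10-11 = record length (uint16 LE)
    s_count, s_hdr, s_rlen = struct.unpack_from("<IHH", src, 4)
    d_count, d_hdr, d_rlen = struct.unpack_from("<IHH", dst, 4)
    if (s_hdr, s_rlen) != (d_hdr, d_rlen):
        raise ValueError("field structure differs: " + src_path)
    header = bytearray(dst[:d_hdr])
    struct.pack_into("<I", header, 4, s_count)  # take record count from backup
    records = src[s_hdr:s_hdr + s_count * s_rlen]
    with open(dst_path, "wb") as f:
        f.write(bytes(header) + records + b"\x1a")  # 0x1A = DBF end-of-file marker

for name in os.listdir(SRC_DIR):
    if name.lower().endswith(".dbf") and os.path.exists(os.path.join(DST_DIR, name)):
        target = os.path.join(DST_DIR, name)
        shutil.copy2(target, target + ".bak")  # safety copy of the live file first
        splice_records(os.path.join(SRC_DIR, name), target)

Any file where the structure check fails is one whose schema changed, and would still need the manual treatment.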
Thank you

Related

SSIS File System Task Doesn't Work but Reports Success

I created an SSIS package that extracts two files from a .zip file, imports data from them and then attempts to delete the files that were extracted.
The package works, data is imported and all tasks report success. However the File System Tasks that attempt to delete the files don't delete them, even though they report success.
I use the CozyRoc Zip Task to extract the files. When I remove the Zip Task, the File System Tasks actually do delete the files. I'm not certain that CozyRoc is causing the problem, but the existence of that task may be causing other issues with the package.
Can anyone help me figure out how to reliably delete the files?
Do I need to put in some sort of pause after the Data Flow Tasks to allow them to release whatever locks they might have on the files?
Is there a way to view the DOS commands that the File System tasks use at run time, to verify that they are actually attempting to delete the correct files?
Thank You,
Robbie
Details:
Visual Studio 2019 v16.11.3
File names are from Flat File Connection Managers.
Flat File Connection Managers use Expressions to set their connection strings.
The same connection managers are used to import the data, so I presume that they accurately refer to the correct files and their correct locations.

Changing the default data directory for SQL Server after installation

I mistakenly put my SQL data directory in the wrong folder, so the system DBs are located in the wrong directory, and I want to move them somewhere else. I am going to move the data and log files of all my system DBs, but I would like to know whether that moves everything under MSSQL or whether I need to perform other steps as well. Please find below the folders that I can see under MSSQL.
In order to move the system databases, it is not enough to move the files: before moving them you must change their paths in the system catalogs using ALTER DATABASE ... MODIFY FILE, one statement per file, for example (the logical NAME values come from sys.master_files; the path here is only an example):
ALTER DATABASE msdb
    MODIFY FILE (NAME = MSDBData, FILENAME = N'D:\NewLocation\MSDBData.mdf');
When moving the master database you must also change the startup parameters in Configuration Manager, pointing them at the new location of the master files.
All of this is described in the Move System Databases article.
Note that if you ever need to rebuild master, it will be recreated in the old location, so you cannot simply get rid of the old DATA folder; after rebuilding, you will need to move your system databases again.
This is to let you know the solution to my problem. I successfully changed the data root directory by moving the data and log files of all the system DBs to the desired location. I also had to change the registry value for the SQLDataRoot directory to the desired location and modify the location of the server diagnostics file as per my cluster requirements. After all this, I was able to successfully move the folder to the desired location. Thanks for all the help, everyone.

ZIP files not working in SSIS (server level issue?)

I'm posting this rather odd issue here in the remote chance that someone has come across this before, or possibly just has an idea or two about what I could try or check next because I'm stumped.
Summary: SQL 2008 SSIS package tasks that attempt to create files with .zip extension fail with
"Access to the path is denied"
Detail: This first occurred in a test environment with a package that works fine in Dev (and Prod). The part that makes this problem odd is that it is all about the file extension, not security. I mention this now to curb replies about checking security (SSIS account, directory-level permissions, etc.): it's not that, 100%.
So, I've built an SSIS package as a proof of behavior, that takes 3 files (a.txt, b.txt, c.txt) and respectively for
(a) uses CozyRoc Zip to Create a Zip,
(b) uses a script task to create a .zip (using GZipStream - I know this creates a GZIP not a ZIP but bear with me...) and
(c) native SSIS File System Task copies the file from c.txt to c.zip (yes, creating a .zip file that is not really a zip file).
All three fail with the above message; the .ZIP files are created for (a) and (b) but remain at 0 length (for (c) there is just the error message).
Now, if I edit the SSIS package and change the extensions of the destinations (to .ZOP or .ZIP2 or .GZ or .ANYTHING), all three work perfectly. This is obviously how I know it's the .ZIP extension and not a "normal" security issue.
I initially assumed this was a one-off on this test server because it was the only place it happened, but I've since found another box (build rehearsal) with exactly the same problem. I've tried associating .ZIP with various different programs (Windows Explorer, WinZip, 7-Zip, WinRAR and "no program") and nothing works, and I've googled the problem to death with no luck yet.
I've tried creating .ZIP files with the various installed archive programs using their GUIs, and they all work fine. Existing .ZIP files can be unzipped using CozyRoc. Existing .GZ (GZIP) files renamed to .ZIP can be unzipped using the script's GZipStream decompress. And I can rename files to and from .ZIP using SSIS or Explorer/CMD. It is specifically SSIS creating a file with the extension .ZIP that throws this error.
I'm starting to suspect it might have something to do with SSIS treating .ZIP as an archive "folder" rather than a ".ZIP file", but I don't know where to take this idea, whether proving it or fixing it.
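One more isolation step that might be worth trying: create a .zip-named file in the same output directory from outside SSIS, with a plain byte-level write (the archiver GUIs may create a temp file and rename it, which could sidestep whatever intercepts the write). A tiny sketch, with a hypothetical path:

# Probe: does merely creating/writing a file named *.zip fail on this box?
# The path is hypothetical - point it at the directory the package writes to.
path = r"D:\SSIS\Output\probe.zip"
with open(path, "wb") as f:
    f.write(b"test")  # any bytes; we only care whether the write is denied
print("created", path)

If that fails too, the problem sits below SSIS, in the filesystem or filter-driver layer.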
Any ideas at all? - at my wits end!!
Thanks in advance
P.S. The "obvious" answer of using .ZIP2 and renaming is not an option, there are (literally) hundreds of packages running in production that create .ZIP files and packages need to move from Test to Prod without modification. I really need a solution, not a workaround, in this instance if there is one.
This turned out to be a Red Gate tool (HyperBac) having a file association with .ZIP extension files (amongst others). HyperBac's monitoring of .ZIP files appears to have clashed with SSIS's attempts to write to the .ZIP file: ProcMon reported sharing violations on the file, causing a spurious ACCESS DENIED error to be reported by the package.
Since use of the tool is necessary on our environments, I was able to solve the problem by deleting the .ZIP association using the GUI ("Hyperbac Configuration Manager" > "Extensions" > Ext=.ZIP, Delete)

Windows 2008 R2 - Kernel (System Process PID=4) is locking files and folders

Windows 2008 R2 - Kernel (System Process PID=4) is locking files and folders for a long time.
For example when deleting a file, the file may remain locked for 1 minute or more and only after that be deleted.
On other occasions I encountered files or folders I could not delete. ProcMon showed that the System process was holding a handle to those resources for a couple of minutes before releasing them.
None of the resources I mentioned were system resources, only files and folders installed by me and handled by my applications.
As Dani has already mentioned in the comment:
It's a bug in Windows 7 and likely in Windows Server 2008 (possibly 64bit versions only). It surfaces when you disable Application Experience service.
Re-enabling this service has fixed this problem for me.
A bit more info here as to why it's causing a problem.
List of other SO questions which seem to be related:
Visual Studio output file permissions?
Under which circumstances does the System process (PID 4) retain an open file handle?
Files accessed through a share will be locked by the system process (PID 4).
Try opening compmgmt.msc -> System Tools -> Shared Folders -> Open Files to see if the locked file is listed there
See also the sysinternals forum for a way to replicate this.
Not all applications lock files when they are opened; Excel, however, does...
In my case, it was fixed by a simple command in the command line:
net session /delete
I hope that helps.
Hope this helps others.
Open the Windows Run dialog and launch mmc.exe.
File -> Add or Remove Snap-ins -> Shared Folders -> Local Computer.
Select Open Files, scroll down to the directory or file, and right-click to close it.
You can also see the username that holds the lock, go to Sessions, and right-click -> Close Session.
In my case it was MacOS 10.13 holding file locks open...
https://support.apple.com/en-us/HT208209
I had this issue when trying to rename a folder. I had to stop the server service while performing the rename. Just restarting didn't help, as the system process re-locked the folder as soon as the server service restarted.
Do this to resolve the problem:
Go to Services and enable the Application Experience service.
Tried all these...
Even copying the file, deleting the original, and renaming the copy to the original name (all on the server) would immediately tell me the user had it locked.
In the end -
used Unlocker to clear the file locks.
Copied the file OFF THE SERVER to a desktop.
Deleted the original file off the server.
Changed the filename of the copy on the desktop.
Renamed it back to the original name on the desktop.
Put the file back into the original location ON THE SERVER.
HTH, YMMV... :)
Had this issue just now whilst trying to replicate data to a new file server (both source & destination servers running Windows 2008 R2).
PID 4 was found locking the file (using procexp as above), but Application Experience has never been installed on either server, and the file was not shown in the list of open files.
Fortunately we use scheduled shadow copies on this server (to enable users to self-serve most file recoveries). I just used the Previous Versions option (available through Properties of the containing folder), selected the most recent copy of the file & copied it to somewhere else, then deleted and replaced the problem file.
You might need to delete the containing folder to delete the file, which could obviously be a problem if lots of files are in use (it wasn't an issue for me, given this was the only file in the folder).
For a one-off issue like I had (single locked file for the whole server drive), this worked without any disruption to the server or users.
Given that you are talking about a server, and that Shadow Copies use VSS, you should be able to recover the locked file from your backups (presumably you have these) if you don't use Shadow Copies. Otherwise there are some useful utilities, like ShadowSpawn (https://github.com/candera/shadowspawn), that might help.

Recover PostgreSQL databases from raw physical files

I have the following problem and I need to know if there's a way to fix it.
I have a client who was cheap enough to decline buying a backup plan for his PostgreSQL databases on the main system that runs his company, and, as I expected would happen some day, some OS files crashed during a blackout and the OS needs to be reinstalled.
This client didn't have any backups of the databases, but I managed to save the PostgreSQL main directory. I read that the databases are somehow stored inside the data directory of the Postgres main folder.
My question is: is there any way to recover the databases from the data folder only? I am working in a Windows environment (XP Service Pack 2) with PostgreSQL 8.2, and I need to reinstall PostgreSQL on a new server. I would need to recreate the databases in the new environment and somehow attach the old files to the new database instances. I know that's possible in SQL Server because of the way that engine stores its databases, but I have no clue for Postgres.
Any ideas? They would be much appreciated.
If you have the whole data folder, you have everything you need (as long as the architecture is the same). Just try restoring it on another machine before wiping this one, in case you didn't copy something.
Just save the data directory to disk. When launching Postgres, set the parameter that tells it where the data directory is (see wiki.postgresql.org), or remove the fresh installation's original data directory and put the copy in its place.
This is possible: you just need to copy the "data" folder (inside the Postgres installation folder) from the old computer to the new one, but there are a few things to keep in mind.
First, before you copy the files, you must stop the Postgres server service: Control Panel -> Administrative Tools -> Services, find the Postgres service and stop it. When you're done copying the files and setting permissions, start it again.
Second, you need to set the permissions on the data files. Because the Postgres server actually runs under another user account, it will not be able to access the files if you just copy them into the data folder, as it will not have permission to do so. So you need to change the ownership of the files to the "postgres" user. I had to use subinacl for this: install it first, then run it from a command prompt like this (first navigate to the folder where you installed it):
subinacl /subdirectories "C:\Program Files\PostgreSQL\8.2\data\*" /setowner=postgres
(Changing ownership should also be possible from Explorer: first disable "Use simple file sharing" in Folder Options; a "Security" tab then appears in the folder's Properties dialog, with options to set permissions and change ownership. I wasn't able to do it that way, though.)
Now, if the server service can't start after you start it manually again, you can usually see the reason in the Event Viewer (Administrative Tools -> Event Viewer). Postgres will log an error event, and inspecting it will give you a clue about what the problem is (sometimes it will complain about a postmaster.pid file; just remove it, etc.).
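For what it's worth, the whole procedure is scriptable. Here is a minimal sketch of the same steps, where the service name and all paths are assumptions to be adjusted for your install; run it as administrator, and keep a copy of the old data somewhere else first:

import shutil
import subprocess

SERVICE = "pgsql-8.2"                                 # hypothetical service name
OLD_DATA = r"D:\rescued\data"                         # copy of the old data folder
NEW_DATA = r"C:\Program Files\PostgreSQL\8.2\data"    # fresh install's data folder

subprocess.run(["net", "stop", SERVICE], check=True)  # stop the Postgres service
shutil.rmtree(NEW_DATA)                               # discard the fresh data folder
shutil.copytree(OLD_DATA, NEW_DATA)                   # put the old one in its place
# hand ownership to the postgres account, as with subinacl above
subprocess.run(["subinacl", "/subdirectories", NEW_DATA + r"\*",
                "/setowner=postgres"], check=True)
subprocess.run(["net", "start", SERVICE], check=True)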
The question is very old, but I want to share an effective method that I found.
If you do not have a backup made with pg_dump and all you have is the old data folder, try the following steps.
In the postgres database, add a record to the pg_database table, either with a manager program or with INSERT INTO.
Make the necessary checks, adjust the insert query below for your system, and run it.
The SELECT at the end then returns the new database's OID. Create a folder named after that number, copy your old data into it, and the database is ready for use.
/*
------------------------------------------
*** Recover from folder ***
------------------------------------------
Check the pg_database table on your own
system and adjust the values below to match it.
*/
INSERT INTO
  pg_catalog.pg_database(
    datname, datdba, encoding, datcollate, datctype, datistemplate, datallowconn,
    datconnlimit, datlastsysoid, datfrozenxid, datminmxid, dattablespace, datacl)
VALUES(
    'NewDBname', 10, 6,
    'Turkish_Turkey.1254', 'Turkish_Turkey.1254',  -- write your own collation here
    False, True, -1, 12400, '536', '1', 1663, Null);
/*
The SELECT below returns the new database's OID. Create a folder with
that number as its name under data\base, and copy all the old files from
data\base\<old OID> into it. The database is then ready for use.
*/
select oid from pg_database a where a.datname = 'NewDBname';
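The folder copy at the end can be scripted as well; a small sketch, where the paths and both OIDs are placeholders to substitute with your own:

import shutil

OLD_BASE = r"D:\rescued\data\base"                       # the old, rescued data folder
NEW_BASE = r"C:\Program Files\PostgreSQL\8.2\data\base"  # the live data folder
old_oid = "16384"  # folder name of the old database (hypothetical)
new_oid = "16420"  # OID returned by the SELECT above (hypothetical)
shutil.copytree(OLD_BASE + "\\" + old_oid, NEW_BASE + "\\" + new_oid)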
As shown in "move database to another hard drive", all we need to do is modify the registry and the file permissions. By modifying the registry, the PostgreSQL server knows the new location of the data.
If you have issues with permissions, or with things like icacls during installation pointing at the old data folder, then try my solution from the sister website:
https://superuser.com/a/1611934/1254226
I did so, but the trickiest part was changing the owner permission:
go to Services from Administrative Tools
find the Postgres service and double-click it
on the Log On tab, change it to Local System
then restart
