I made a React project using npx create-react-app inside my OneDrive folder on Windows 10.
OneDrive complained about one of the folders being named '~' (it was a folder with Node config files made automatically by create-react-app).
This was honestly a huge nightmare:
I could not rename, move, or delete the folder myself; Windows Explorer didn't allow it because there was a 'sync error'.
OneDrive would just completely stop syncing until the issue was fixed. I could not do anything manually, so my only option was to use the 'rename' button in the error message (which is supposed to automatically rename the file and fix the error). This did not work. I tried again and again over the span of a few days and it did nothing; the error persisted.
Ultimately I copied my project outside OneDrive, but then I wasn't able to delete my old folders. I tried everything I could think of: pausing sync, trying to delete them with Windows in Safe Mode, and finally uninstalling OneDrive. I managed to delete most of the contents (with a lot of effort), but there were still some Node directories that would not be deleted. I was getting a 'reparse point buffer' error, which I solved by running chkdsk /f /x.
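(For reference, that last fix was just the following, run from an elevated Command Prompt. C: is an assumption for whichever drive holds your OneDrive folder; on the system drive the check gets scheduled for the next reboot.)
chkdsk C: /f /x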
Partly, I'm posting this hoping that my experience will help someone who has similar issues with OneDrive, but I also want to know how to keep React projects in my OneDrive.
I like having everything on my laptop in my OneDrive folder so it is synced: I want to be able to continue my work when I move to my other computer.
I'm having the exact same problem. I figured out that if I moved the React folder into another folder and then deleted that outer folder, it worked; try it. My OneDrive still kept trying to sync something for a while, but it finally quit and now everything is okay. That is, until I do another React project and OneDrive gets messed up again for sure.
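In other words, something like this from a Command Prompt (all paths here are hypothetical placeholders; substitute your own):
rem Move the stuck React project into a throwaway folder, then delete that folder
mkdir C:\trash
move "C:\Users\me\OneDrive\my-react-app" C:\trash
rmdir /s /q C:\trash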
After deleting the folder, press Windows + R and run this command to reset OneDrive:
%localappdata%\Microsoft\OneDrive\onedrive.exe /reset
I work on my WPF project daily in Visual Studio 2022 and back up my project folder in the evening. I can't help but notice the number of files in there just keeps growing (by hundreds every day), even when I'm not adding anything new to the project. It's making backups, and transferring my files between workstations, take longer and longer.
What could be generating these extra files and is there a way to minimize it?
I've had a look around the net for answers, but to no avail.
Thanks for any and all advice you can give me.
The best way to solve your problem is to use a Git repository (e.g. on GitHub).
In Visual Studio, go to the Git menu, select Create Git Repository, push your solution to the new repository, and commit changes whenever you need to.
In addition to just backing up, this will let you easily and conveniently compare different versions.
If for some reason this is inconvenient for you, then pay attention to the .gitignore file in the repository: the templated files and folders listed there do not need to be backed up.
Here is a link to the ignore file from my Repository: https://github.com/EldHasp/CyberForumMyCycleRepos/blob/master/.gitignore
In the most minimal variant, all folders whose names begin with a dot, plus the bin, obj, and packages folders, do not need to be saved.
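For example, a minimal .gitignore covering just the folders mentioned above might look like this (a sketch, not a complete list; real templates like the one linked cover much more):
# Visual Studio working folder and build output
.vs/
bin/
obj/
packages/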
Try the following:
Try compiling in release mode instead of debug mode.
Try clearing the obj and bin folders.
Turn off the automatic generation of XML files (see the sketch below).
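If those are XML documentation files (a guess; generated XML can come from other sources too), they can be turned off in the project properties, or directly in an SDK-style .csproj:
<!-- Stop emitting the XML documentation file alongside the build output -->
<PropertyGroup>
  <GenerateDocumentationFile>false</GenerateDocumentationFile>
</PropertyGroup>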
Update:
My guess is that you are running your program in debug mode and your program is generating the extra files.
There are the following methods to clean up the obj folder:
Right-click the project and select Clean, then rebuild.
Add the following command to the pre-build event so that the obj folder is cleaned up before each build (note that this deletes the previous obj folder):
rmdir /s /q "$(ProjectDir)obj"
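One caveat: rmdir fails, and therefore fails the build, if obj doesn't exist yet (e.g. on a fresh checkout). A guarded variant using the same $(ProjectDir) macro avoids that:
if exist "$(ProjectDir)obj" rmdir /s /q "$(ProjectDir)obj"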
I am trying to download the data of all of my users before we close out our G Suite account. I have created the export. I installed the Google Cloud SDK Shell and authenticated to it. I run gsutil cp -r gs://takeout-export-xxxxxxxxxxxx C:\GExport and it downloads all of the folders that come before 'R', but when it hits the first "Resource: -xxx" folder, it fails with
OSError: The directory name is invalid.
These folders don't seem to have overly useful data, so I even tried deleting them (from the website interface), but they always fail to delete.
What gives? How can I go about downloading all of the user folders without doing so one at a time, manually?
Edit:
I tried selecting each folder (minus the problem folders) in the website GUI and choosing download, which gave me commands to download those folders. I tried copying/pasting those commands into the GCloud SDK Shell, but it doesn't seem to work: it fails when it hits the second line (the first folder to download). Not sure of the proper syntax for downloading many folders, apparently (and Google's suggested commands are not correct).
Ended up giving up on doing it the "right" way; I modified the command into many lines, grabbing each folder one at a time, copy/pasted the commands into the GCloud SDK Shell, and let them download one at a time. Notepad++ is so useful.
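For what it's worth, the error is most likely down to the folder name itself: "Resource: -xxx" contains a colon, which is not a legal character in a Windows directory name, so gsutil cannot create the local folder. The working multi-line version looked roughly like this (bucket ID as in the question; the folder names are placeholders for the real user folders):
gsutil cp -r "gs://takeout-export-xxxxxxxxxxxx/user1" C:\GExport
gsutil cp -r "gs://takeout-export-xxxxxxxxxxxx/user2" C:\GExport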
TreeSize Free (https://www.jam-software.com/treesize_free) is a file/folder analysis tool that quickly scans a PC and sorts folders and files by size to show what is using up disk space.
It used to work just fine, but sometime in the last few months (I've got a new PC, so it might just be since having this one) I've noticed it has stopped working for OneDrive folders. We use OneDrive for Business at work, and all my docs/downloads/desktop etc. are backed up in OneDrive. These folders are all marked to keep offline ("Always keep on this device"), so they are saved locally.
However, TreeSize doesn't show these files; apparently I only have 4GB in OneDrive.
If I right-click the OneDrive folder and go to Properties, I can see that it is about 60GB.
Any ideas what's going wrong, or how I could analyse disk space that is used by OneDrive?
I have the latest version of TreeSize and have also tried an older version.
I've tried starting TreeSize as admin and as a standard user.
Weirdly, if I select an individual folder within OneDrive, it will scan fine; just not the whole thing. So then I tried scanning the full C: drive, going to the OneDrive folder, and choosing "Update this branch", and it finished scanning it just fine, and updates correctly after that.
It's a bit annoying though, and doesn't explain why it's doing this in the first place.
I have a TFS build process that drops outputs on sandbox, which is another server in the same network. In other words, the build agent and sandbox are separate machines. After the outputs are created, a batch script defined within the build template does the following:
Rename the existing deployment folder to some prefix + a timestamp (at this point IIS can no longer find the app when users try to access it)
Move the newly-created outputs to the deployment location (both steps are sketched below)
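In batch terms, the two steps are roughly this (a hypothetical sketch; the paths and the BUILDSTAMP variable are placeholders, with BUILDSTAMP coming from the build, e.g. the TFS build number):
rem Step 1: rename the live folder out of the way (IIS loses the app here)
ren "D:\Sites\MyApp" "MyApp_%BUILDSTAMP%"
rem Step 2: move the freshly built outputs into the deployment location
move "D:\Drops\MyApp_latest" "D:\Sites\MyApp"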
The reason I wanted to rename and move files instead of copy/delete/overwrite is that the latter takes a lot of time because we have so many files (over 5,500). I'm trying to find a way to complete builds in the shortest amount of time possible to increase developer productivity. I plan to create a Windows service to delete dump folders and drop-folder artifacts periodically so sandbox doesn't fill up.
The problem I'm facing is that IIS maintains a handle to the original deployment folder, so the batch script cannot rename it. I used Process Explorer to see what process is using the folder. It's w3wp.exe, which is a worker process for the application pool my app sits in. I tried killing all w3wp.exe instances before renaming the folder, but this did not work. I then decided to stop the application pool, rename the folder, and start it again. This did not work either.
In either case, Process Explorer showed that there were still unreleased handles to my outputs, except this time the owner wasn't w3wp.exe but something along the lines of "unidentified process". At one point I saw that the owner was System, but killing System's process tree shuts down the server.
Is there any way to properly remove all handles to my deployment folder so the batch script can safely rename it?
Use the Windows Sysinternals tool called Handle v4.0:
https://technet.microsoft.com/en-us/sysinternals/bb896655.aspx
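Roughly how it's used (the folder path is a placeholder, and the handle value and PID come from the first command's output):
rem List the processes holding handles under the deployment folder
handle.exe D:\Sites\MyApp
rem Forcibly close one of the listed handles; -y skips the confirmation prompt
handle.exe -c 3C4 -p 1234 -y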
Tools like Process Explorer can find and forcibly close file handles; however, the state and behaviour of the application (both yours and, in this case, IIS) after doing this is undefined. Some won't care, some will error, and others will crash hard.
The correct solution is to allow IIS to cleanly release locks and clean up after itself to preserve server stability. If this is not possible, you can either create another site on the same box, or set up a new box with the new content, and move the domain name/IP across to "promote" the new content to production.
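One way to do the "promote" variant without ever renaming the locked folder is to repoint the site at the new build with appcmd (a sketch; the site name, app pool name, and paths are assumptions):
rem Point the existing site's root at the newly dropped build
%windir%\system32\inetsrv\appcmd set vdir "MySite/" -physicalPath:"D:\Drops\MyApp_latest"
rem Recycle the app pool so the workers pick up the new path
%windir%\system32\inetsrv\appcmd recycle apppool /apppool.name:"MyAppPool"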
So, straight to the point: I'm trying to clean my host entirely (databases too), and after I delete the last two folders, wp-content and wp-includes (700MB of files), they get restored instantly. This may be a simple question, but to me it's very odd and I don't get it. Besides the file manager, I used FileZilla too, and the same thing happens (my hosting company, as it su#%$, failed to give me a reply after 48h).
I have recorded a short video of my problem to help you better understand my issue.
https://www.youtube.com/watch?v=wqL35R0-vvw&feature=youtu.be
Hope you'll be able to help me. Thank you!
I'm working on this website for an NGO after it was hacked, and for now I want to wipe every single file from the server and rebuild it, but the files which have infected pages (PHP scripts) inside won't get deleted.
Chances are very good that some of those files are owned by the webserver, especially if you were compromised via a WordPress vulnerability. Since they're owned by the webserver and not your user, you're unable to delete them.
If you have root/sudo access, you can use it on the command line to remove them. If you don't, you'll need your host's help.
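If you do have shell access with sudo, something along these lines would remove them (the document-root path is an assumption; double-check it against your host's layout first):
# Remove the webserver-owned WordPress folders from the document root
sudo rm -rf /var/www/html/wp-content /var/www/html/wp-includes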