Trying to download export of GSuite. Receive 'OSError: The directory name is invalid.' when it hits folders named "Resource: ..."

I am trying to download the data of all of my users before we close out our Google Suite account. I created the export, installed the Google Cloud SDK Shell, and authenticated to it. I run gsutil cp -r gs://takeout-export-xxxxxxxxxxxx C:\GExport and it downloads all of the folders that come before 'R', but when it hits the first "Resource: -xxx" folder, it fails with
OSError: The directory name is invalid.
These folders don't seem to have overly useful data, so I even tried deleting them (from the website interface), but they always fail to delete.
What gives? How can I go about downloading all of the user folders without doing so one at a time, manually?
Edit:
I tried selecting each folder (minus the problem folders) in the website GUI and choosing Download, which gave me commands to download those folders. I copied and pasted those commands into the Google Cloud SDK Shell, but it doesn't work: it fails when it hits the second line (the first folder to download). Apparently I don't know the proper syntax to download many folders at once (and Google's suggested commands are not correct).

Ended up giving up on doing it the "right" way: I modified the command into many lines, grabbing each folder one at a time, copy/pasted the commands into the Google Cloud SDK Shell, and let them download one at a time. Notepad++ is so useful.
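For what it's worth, the likely culprit is that Windows does not allow ':' in directory names, so a "Resource: -xxx" folder simply cannot be created on an NTFS path. Below is a minimal PowerShell sketch of the per-folder approach, assuming gsutil is on PATH; the bucket name and destination are placeholders, and the character substitution is my own addition, not part of gsutil:

Code:
$bucket = "gs://takeout-export-xxxxxxxxxxxx"
gsutil ls $bucket | ForEach-Object {
    $name = $_.Substring($bucket.Length + 1).TrimEnd('/')  # e.g. "Resource: -xxx"
    $safe = $name -replace '[:*?"<>|]', '_'                # replace characters Windows forbids
    $dest = Join-Path 'C:\GExport' $safe
    New-Item -ItemType Directory -Force -Path $dest | Out-Null
    gsutil cp -r ($_ + '*') $dest                          # copy the folder's contents
}

For the record, gsutil cp also accepts several source URLs before the destination (gsutil cp -r gs://bucket/FolderA gs://bucket/FolderB C:\GExport), which may be what Google's generated commands were attempting.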

Related

Sync problems developing React projects on OneDrive

I made a React project using npx create-react-app inside my OneDrive folder on Windows 10.
OneDrive complained about one of the folders being named '~' (it was a folder with Node config files made automatically by create-react-app).
This was honestly a huge nightmare:
I could not rename, move, or delete the folder myself; Windows Explorer didn't allow it because there was a 'sync error'.
OneDrive would just completely stop syncing until the issue was fixed. I could not do anything manually, so my only option was their 'rename' button in the error message (which is supposed to automatically rename the file and fix the error). This did not work: I tried again and again over the span of a few days and it did nothing; the error persisted.
Ultimately I copied my project outside OneDrive, but then I wasn't able to delete my old folders. I tried everything I could think of: pausing sync, deleting them with Windows in Safe Mode, and finally uninstalling OneDrive. I managed to delete most of the contents (with a lot of effort), but there were still some Node directories that would not be deleted. I was getting a 'reparse point buffer' error, which I solved by running chkdsk /f /x.
Partly I'm posting this hoping that my experience will help someone who has similar issues with OneDrive, but I also want to know how to keep React projects in my OneDrive.
I like having everything on my laptop in my OneDrive folder so it is synced: I want to be able to continue my work when I move to my other computer.
I'm having the exact same problem. I figured out that if I moved the React folder into another folder and then deleted that other folder, it worked; try it. My OneDrive still kept trying to sync something for a while, but it finally quit and now everything is okay. That is, until I do another React project and OneDrive gets messed up again for sure.
After deleting the folder, press Win + R and run this command to reset OneDrive:
%localappdata%\Microsoft\OneDrive\onedrive.exe /reset
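Putting those steps together, a rough PowerShell sketch of the move-then-delete-then-reset sequence described above (the project path is a placeholder):

Code:
$project = "$env:USERPROFILE\OneDrive\my-react-app"   # placeholder: the stuck project
$wrapper = "$env:USERPROFILE\OneDrive\to-delete"
New-Item -ItemType Directory -Force -Path $wrapper | Out-Null
Move-Item $project $wrapper           # move the React folder into a wrapper folder
Remove-Item $wrapper -Recurse -Force  # delete the wrapper and everything inside it
& "$env:LOCALAPPDATA\Microsoft\OneDrive\onedrive.exe" /reset   # then reset OneDrive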

TreeSize not scanning OneDrive local files

TreeSize Free (https://www.jam-software.com/treesize_free) is a file/folder analysis tool that quickly scans a PC and sorts folders and files by size to show what is using up disk space.
It used to work just fine, but sometime in the last few months (I've got a new PC, so it might just be since having this one) I've noticed it has stopped working for OneDrive folders. We use OneDrive for Business at work and all my docs/downloads/desktop etc. are backed up to OneDrive, and these folders are all marked to keep offline ("Always keep on this device"), so they are saved locally.
However, TreeSize doesn't show these files; apparently I only have 4 GB in OneDrive.
If I right-click the OneDrive folder and go to Properties, I can see that it is about 60 GB.
Any ideas what's going wrong, or how I could analyse disk space that is used by OneDrive?
I have the latest version of TreeSize and have also tried an older version.
I've tried starting TreeSize as admin and as a standard user.
Weird, but if I select an individual folder within OneDrive, it scans fine; just not the whole thing. So then I tried scanning the full C: drive, going to the OneDrive folder, and choosing "Update this branch", and it finished scanning it just fine and updates correctly after that.
Bit annoying though. It doesn't explain why it's doing this in the first place.
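As a sanity check independent of TreeSize, PowerShell can total the folder itself. A sketch with a placeholder path (with OneDrive Files On-Demand, online-only placeholders still report their full Length, but "Always keep on this device" content is local anyway):

Code:
$onedrive = "$env:USERPROFILE\OneDrive"   # placeholder; business folders are usually "OneDrive - Company"
$bytes = (Get-ChildItem $onedrive -Recurse -File -Force | Measure-Object -Property Length -Sum).Sum
'{0:N1} GB' -f ($bytes / 1GB)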

Auto check-in of files uploaded to SharePoint using a batch file

Could you please help me upload files to SharePoint and check them in automatically, using a batch script run from a Windows server?
Whenever I try to upload files to SharePoint through batch files, the uploaded file is automatically checked out and is invisible to all users except me, so I have to check it in manually every time.
Use PowerShell with the SharePoint Client Object Model (CSOM). Ask your favorite search engine; there are plenty of examples out there.
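For illustration, a minimal sketch of that approach, assuming SharePoint Online with the CSOM DLLs installed locally; the site URL, library name, and file path are placeholders:

Code:
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll'
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll'

$ctx  = New-Object Microsoft.SharePoint.Client.ClientContext('https://contoso.sharepoint.com/sites/team')
$cred = Get-Credential
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($cred.UserName, $cred.Password)

# Upload the file to the document library
$local = 'C:\data\report.xlsx'                      # placeholder file
$info  = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$info.Overwrite = $true
$info.Content   = [System.IO.File]::ReadAllBytes($local)
$info.Url       = [System.IO.Path]::GetFileName($local)
$file = $ctx.Web.Lists.GetByTitle('Documents').RootFolder.Files.Add($info)
$ctx.Load($file)
$ctx.ExecuteQuery()

# The step the batch upload was missing: check the file in so other users can see it
if ($file.CheckOutType -ne [Microsoft.SharePoint.Client.CheckOutType]::None) {
    $file.CheckIn('Uploaded by script', [Microsoft.SharePoint.Client.CheckinType]::MajorCheckIn)
    $ctx.ExecuteQuery()
}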

Merging a folder in SVN records only the folder in the log, and not the files inside it

The scenario is as follows.
We're running a CI server which scans a repository for any .sql changes, then executes them against a target database.
Currently it's failing because SVN is not recording file changes within a folder that has been merged from a branch. The merge info was committed too.
Example:
Developer branches "/Trunk" to "/Branches/CR1"
Developer adds a new folder "CR1/Scripts"
Developer adds two new files "Scripts/Script1.sql" and "Scripts/Script2.sql"
Developer commits the folder and files together
Developer merges from CR1 to Trunk, commit dialog displays status "Normal"
CI server detects no changes
Developer examines the log and sees no mention of Script1.sql or Script2.sql
All this is displayed via TortoiseSVN on Windows, the CI Server is using SharpSvn .NET library.
Any help figuring out how to get the *.sql files to show up would be very much appreciated.
It's nearing a year, and during this time we've used a workaround to find the missing files: using the CLI command svn log -v, we scanned for any directory with the COPY-FROM-PATH text and listed the contents of that directory from disk rather than from SVN.
Whilst this does provide us with a full list of files in that folder, we should really be able to get this info remotely, without checking out a copy of the repository. When a co-worker also encountered this issue recently, they found the answer courtesy of the IRC channel #svn on freenode.
Using the CLI command svn diff <url>@<old rev> <url>@<new rev> --summarize, you get a difference between the two revisions which, thanks to the --summarize flag, lists all the changed files; that finally answered the original question.
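For example (repository URL and revision numbers are placeholders):

Code:
svn diff --summarize https://svn.example.com/repo/Trunk@100 https://svn.example.com/repo/Trunk@105

Each changed path is printed with a status letter (A added, M modified, D deleted), so Script1.sql and Script2.sql show up individually even though the merge commit only recorded the copied folder.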

Installation of Joomla 3.0 does not finish

I was trying to work with Joomla 3.0. I have done everything that is necessary, but the page shown in the screenshot has been displayed for a long time. The database tables are being created, but it does not move on to the next step. Can anyone help me out? TIA
With the very little information you provide, I can tell you this:
Be sure you are selecting mysql and not mysqli, and that the username has all permissions granted on the database you provided.
In your server's home directory (which is the lowest directory for shared hosting users), make a new file called "phprc" inside the .php folder. If the folder doesn't exist yet, create it. The period in front of the folder name means it is hidden, so make sure you can see hidden files in your FTP client, or use the command "ls -a" to list all files on the command line.
Add the following lines to the 'phprc' file:
Code:
max_execution_time = 3000
memory_limit = 128M
Then save it.
Normally, if it is a shared host, it can take some minutes for the change to be reflected, but try again after 5 or 10 minutes and you should see it work.
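To confirm the change took effect, one option is a temporary page in the web root that calls phpinfo() and shows the active values (delete it afterwards). From a shell, the following also works, assuming the host applies phprc to CLI PHP as well, which is not guaranteed on shared hosting:

Code:
php -i | grep -E 'max_execution_time|memory_limit'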
