Restore overwritten files and directories in Dropbox

Does anyone know how to restore previous versions of files/directories that have been accidentally overwritten (copied over) in Dropbox? I'm happy to restore the entire directories from the previous day. I'm pretty desperate.
Thanks in advance.

Dropbox does offer an API you can use for listing and restoring versions of files, among other operations. You can find everything you need to get started with the Dropbox API, including documentation, tutorials, and SDKs here:
https://www.dropbox.com/developers
Specifically, to list files, use:
https://www.dropbox.com/developers/documentation/http/documentation#files-list_folder
https://www.dropbox.com/developers/documentation/http/documentation#files-list_folder-continue
And to list the different versions of a file, you would use:
https://www.dropbox.com/developers/documentation/http/documentation#files-list_revisions
And to restore a file to a particular version:
https://www.dropbox.com/developers/documentation/http/documentation#files-restore
Those are links to the documentation for the HTTPS endpoints themselves, but we recommend using one of the official SDKs if possible:
https://www.dropbox.com/developers/documentation
Those have corresponding native methods for the HTTPS endpoints.
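For example, here is a minimal sketch using the official Python SDK; the access token, the folder path, and the choice to restore the oldest listed revision are all placeholders for illustration, not a definitive recovery script:

    import dropbox

    # Placeholder token; generate one for your app in the Dropbox App Console.
    dbx = dropbox.Dropbox("YOUR_ACCESS_TOKEN")

    # Walk the affected folder, following the cursor until everything is listed.
    result = dbx.files_list_folder("/affected/folder", recursive=True)
    entries = list(result.entries)
    while result.has_more:
        result = dbx.files_list_folder_continue(result.cursor)
        entries.extend(result.entries)

    for entry in entries:
        if isinstance(entry, dropbox.files.FileMetadata):
            # List this file's revisions (most recent first)...
            revs = dbx.files_list_revisions(entry.path_lower, limit=10)
            # ...and restore the oldest one listed. In practice you would
            # pick a revision by its server_modified timestamp instead.
            dbx.files_restore(entry.path_lower, revs.entries[-1].rev)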

Related

Where do I find CodenameOne's version of iKVM?

I need to do some research on if/how to use backend code from an already available Java web service in a newly created Windows 10 UWP app. The Java code deals with parsing special binary data and depends on things like configuration files and some additional 3rd-party libs like Apache Commons*. The current ideas are either providing a native DLL to be bundled with the UWP app, or providing a stand-alone service that publishes some high-level web services which the UWP app consumes.
I came across iKVM and CodenameOne and read that iKVM itself is not maintained anymore, but that CodenameOne forked a version for their own purposes. In various places authors say that this version of iKVM is managed in the official GitHub repo of CodenameOne, but I'm unable to find it there. The only things I find are some helper implementations, formerly committed DLLs in the repo history, and such, but nothing that looks like the complete forked project.
Any idea where I can find this? Obviously I'm missing something...
I would simply like to have a look at what CodenameOne needed to change, how much effort they put into keeping up with Java 8, what of those efforts went back to the original project etc.
Thanks!
Sorry about that. I was under the wrong impression that the code resided in the Ports/UWP directory but apparently it isn't there. I'm probably the person who wrote that in those places...
We added a link to the actual repo there for reference. It's here: https://github.com/shannah/cn1-ikvm-uwp

Does Google Cloud Debugger require that source files be placed in specific locations in the repository?

I'm trying to set up Cloud Debugging for a Python App Engine module, without success. See this question for the specific issue I am having.
I am wondering if the reason for my issue is the location of the source files in my repository.
My source files are (for various reasons) in rather idiosyncratic locations, and I have a "build" step that copies the files into a staging directory where everything is laid out as App Engine expects. It also generates some files (including the app.yaml) based on configuration settings.
I then run appcfg.py update from this staging directory.
Given all this moving around of files, I am wondering how the Cloud Debugger can identify which source file in the module I uploaded corresponds to which file in the repository. Is it designed to look in specific locations (which would explain my problem), or is it somehow more robust than that?
I don't think the issue is related to location of source files in the repository.
The Python Cloud Debugger loops through all the loaded modules and tries to find the best match. The actual location of the module only matters if there are multiple modules with the same name. In this case, the debugger will try to find the best match given the relative path of the files in the repository. You can see the implementation here.
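As a rough illustration of that idea (this is not the debugger's actual code, just a sketch of "longest matching path suffix wins" across the loaded modules):

    import sys

    def find_module_for_source(repo_relative_path):
        # Illustrative sketch only: pick the loaded module whose __file__
        # shares the longest trailing path with the repository-relative path.
        best, best_len = None, 0
        target = repo_relative_path.replace("\\", "/").split("/")
        for module in list(sys.modules.values()):
            path = getattr(module, "__file__", None)
            if not path:
                continue
            parts = path.replace("\\", "/").split("/")
            # Count how many trailing path components agree.
            n = 0
            while (n < len(parts) and n < len(target)
                   and parts[-1 - n] == target[-1 - n]):
                n += 1
            if n > best_len:
                best, best_len = module, n
        return best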

How does GitHub store your repository files?

I'm feeling stupid, but I want to know how GitHub and Dropbox store user files, because I have a similar problem and I need to store users' project files.
Is it just a matter of storing the project files somewhere on the server and referring to the location in a database field, or are there better methods?
Thanks.
GitHub uses Git to store repositories and accesses those repos from their Ruby application. They used to do this with Grit, a Ruby library written to implement Git in Ruby, but it has since been replaced with rugged. There are Git reimplementations in other languages, like JGit for Java and Dulwich for Python. This presentation gives some details about how GitHub has changed over the years and is worth watching (or browsing the slides).
If you wanted to store Git repositories, what you'd want to do is store them on a filesystem (or a cluster thereof) and then have a pointer in your database to point to where the filesystem is located, then use a library like Rugged or JGit or Dulwich to read stuff from the Git repository.
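As a minimal sketch of that approach with Dulwich (the repository path here is hypothetical; in a real application it would come from your database lookup):

    from dulwich.repo import Repo

    # Hypothetical path; in practice this comes from your database record.
    repo_path = "/srv/repos/alice/project.git"
    repo = Repo(repo_path)

    # Resolve HEAD to a commit and walk the files in its top-level tree.
    commit = repo[repo.head()]
    tree = repo[commit.tree]
    for entry in tree.items():
        print(entry.path.decode(), entry.sha.decode())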
Dropbox stores files on Amazon's S3 service and then implements some wrappers around that for security and so on. This paper describes the protocol that Dropbox uses.
The actual question you've asked is how to store user files. The simple answer is... on the filesystem. There are plugins for a lot of popular web frameworks for doing user file uploads and file management. Django has Django-Filer, for instance. The difficulty you'll encounter in rolling your own file upload management system is building a sensible way to do permissions (so users can only download the files they are entitled to download), so it is worth looking into how the various framework plugins do it.
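As a bare-bones sketch of such a permission check (the storage root, the shape of the file record, and the function name are all assumptions for illustration):

    import os

    FILE_ROOT = "/srv/uploads"  # hypothetical storage root

    def authorized_file_path(user_id, file_record):
        # file_record is assumed to be a DB row: {"owner_id": ..., "path": ...}.
        if file_record["owner_id"] != user_id:
            raise PermissionError("user may not download this file")
        # Normalize the path and confirm it stays inside FILE_ROOT
        # (guards against traversal via a stored name like "../../etc/passwd").
        full = os.path.realpath(os.path.join(FILE_ROOT, file_record["path"]))
        if not full.startswith(os.path.realpath(FILE_ROOT) + os.sep):
            raise PermissionError("path escapes the storage root")
        return full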

Version control strategy with Google Cloud Endpoints

When working with Google Cloud Endpoints in an App Engine project (Eclipse-based), some files that describe the API are automatically generated every time the endpoint is edited, and for every version.
The files are *-v1.api, *-v1-rest.discovery and *-v1-rpc.discovery (the version number may change) and are placed in WEB-INF.
Should these files be committed to source control?
My impression is that if the files are automatically generated, they will always be available and there is no need to track them.
Even if I add more versions of the endpoint in the future, I need to keep all those versions for backwards compatibility so all .api and .discovery files will also be generated for all the versions.
Personally, I don't version control (or even worry about backing up) any generated files. I only worry about source and published binaries. And in theory you don't need the binary either because you should be able to recreate the binary from those source files.
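If you take that route, one way to keep the generated files out of the repository (assuming Git, and assuming the default Eclipse war layout; adjust the paths if your WEB-INF lives elsewhere) is a couple of ignore patterns:

    # Generated by the Endpoints tooling; recreated on every build.
    war/WEB-INF/*.api
    war/WEB-INF/*.discovery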

Drupal 7.17 - Do I have to remove these files after installation?

A site I'm currently managing has Drupal 7.17 on it. I'm noticing the following files in the root of the website:
install.php
CHANGELOG.txt
INSTALL.txt
INSTALL.mysql.txt
INSTALL.pgsql.txt
LICENSE.txt
MAINTAINERS.txt
UPGRADE.txt
My research tells me that as of Drupal 7.16, they fixed a security issue in install.php that allowed arbitrary code to run and could let someone re-install Drupal. But basically, I am wondering if any of these files (if left in the server root) could cause problems in Drupal 7.17. Do I have to remove these files for security reasons, or is this no longer a security risk whatsoever in Drupal 7.17?
I understand that we shouldn't remove the upgrade.php file, but just curious on the rest of these files.
Thanks, and this is probably a dumb question, but I just felt the need to ask anyway. Usually I remove these files when I install software on websites, but I'm not sure how Drupal uses and/or misuses these files.
Based on the document “Finalize the upgrade” (which applies to both upgrades and installations) from the Drupal handbook:
The last step in an upgrade is to delete or move the following files from your site:
install.php
CHANGELOG.txt
INSTALL.txt
INSTALL.mysql.txt
INSTALL.pgsql.txt
INSTALL.sqlite.txt
LICENSE.txt
MAINTAINERS.txt
UPGRADE.txt
Just to make that clear: the only PHP file to delete is "install.php" (i.e. be sure to leave "upgrade.php" and all other PHP files). And when you delete the TXT files, be sure to keep "robots.txt" since it is used by search engines.
You shouldn't need to delete any files. If you really want to, you can delete the various .txt files. A better solution, if you are worried about security, is to not let the files be accessed through the web server. Drupal only uses the index.php file for serving content.
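For example, a minimal sketch of blocking those files at the web server, assuming Apache and its 2.2-era syntax in the site's .htaccess (on Apache 2.4 you would use "Require all denied" instead):

    # Deny direct web access to the installer and the bundled documentation.
    <FilesMatch "^(install\.php|CHANGELOG\.txt|INSTALL\.(mysql\.|pgsql\.|sqlite\.)?txt|LICENSE\.txt|MAINTAINERS\.txt|UPGRADE\.txt)$">
      Order allow,deny
      Deny from all
    </FilesMatch>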
I would love to hear an update and more recent thoughts on this question, and here is why...
I was just working with a site newly updated to Drupal 8.9.20, running the Open Social distribution, as a logged-in user with no admin privileges. This is my ACTIVE PRODUCTION site!
I deleted a node (a News Article) I was trying to embed from a website that refuses to connect for some of its metadata (i.e. image #1), and after submitting on the delete link, the browser switched to install.php, which as you know sits in the document root.
I was of course shocked to see this, and considering that even an innocent response might lead a user to reinstall THEIR app, this could be very dangerous.
So, since the last reply on this references Drupal versions from 2014, I was just wondering your thoughts in this day and age on what the latest recommendations are!
Thanks
