Google Realtime API - creating and removing shortcut file in Google Drive

Looking at the Realtime API quickstart example, a shortcut file is used to store the realtime document model. I'm assuming this is a file that holds the realtime document's state.
Question: do I need to create and clean up this shortcut file for each collaboration session?
Note: Eventually I want to persist data to my database, not Google Drive.

From an API perspective, realtime documents are designed to be persistent storage. The files are long-lived, and there is no need to ever recreate them or store the data elsewhere.
If you want to copy the data elsewhere, how and when to do so is a design decision you need to make based on whatever makes sense for your app.
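If you do decide to mirror the realtime model into your own database, a minimal sketch of the export step could look like this, assuming Drive API v2's realtime.get export endpoint and an OAuth2 access token you've already obtained (the function name and storage step are illustrative, not part of the quickstart):

```python
# Sketch: export a realtime document's model as JSON so it can be
# persisted in your own database. Assumes Drive API v2's realtime.get
# endpoint and a valid OAuth2 access token with a Drive scope.
import requests

def export_realtime_model(file_id, access_token):
    resp = requests.get(
        'https://www.googleapis.com/drive/v2/files/%s/realtime' % file_id,
        headers={'Authorization': 'Bearer ' + access_token})
    resp.raise_for_status()
    return resp.json()  # the full document model; store it wherever you like
```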

Related

What is the best way to store data from my repository?

I'm building a website with a "repository" where users can download and upload files of various sizes. My idea is to store metadata in my database (extension, some tags, and the path where the file is stored), so users can access the data or search the application. My problem is that I don't know the best way to store the files themselves; I've thought of using Google Drive, S3, or Dropbox through their APIs, but I'd like to know other ways to make this possible. I know it's a complex subject, but I'd appreciate pointers on where to study.
For storing metadata you can use MongoDB, synced with the file id, and for storing large files you can use the Google Drive API or some kind of file hosting.
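As a rough illustration of that split (a sketch only; the database, collection, and field names are made up), the binary lives in Drive/S3/etc. and MongoDB keeps only a small descriptor:

```python
# Sketch of the metadata-plus-external-storage pattern: the file itself
# lives in Drive/S3, MongoDB stores a descriptor keyed by the storage id.
from pymongo import MongoClient

client = MongoClient('mongodb://localhost:27017')
files = client.repository.files  # placeholder db/collection names

def register_file(name, extension, tags, storage_id):
    # storage_id: the Drive file id / S3 object key returned by the upload
    return files.insert_one({
        'name': name,
        'extension': extension,
        'tags': tags,
        'storage_id': storage_id,
    }).inserted_id

def search_by_tag(tag):
    return list(files.find({'tags': tag}))
```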

What are my options for private image hosting for a private website <100 users

I have created an operations journal website where users can write and read what happened during their shifts and report (ongoing) incidents. I was looking to expand the solution by letting users attach images as well. The images should not be hosted publicly, so what are the common options?
Technologies that I've used to build the app are primarily: React, NextJS, Next-auth, MongoDB, and it's all hosted on Azure.
Should I just host the images with MongoDB?
Is it possible to use say OneDrive or Workplace, which we already pay for, for image hosting?
Or is there some other practice that is highly recommended?
The natural storage option for things like images (or any other kind of media file) is Azure Blob Storage: it's cost-effective and well suited to what you usually want to do with them. Putting binary data into any kind of database is usually a waste of resources.
Using Azure Blob Storage to store the images as blobs is the standard approach. If you don't want the images to be publicly available, use a shared access signature (SAS):
https://learn.microsoft.com/en-us/azure/storage/blobs/sas-service-create?tabs=dotnet
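For example, a server could hand out short-lived read-only links roughly like this (a sketch using the azure-storage-blob Python package; the account name, key, and blob names are placeholders):

```python
# Sketch: generate a short-lived, read-only SAS URL for a private blob.
from datetime import datetime, timedelta
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

ACCOUNT = 'myaccount'   # placeholder storage account name
ACCOUNT_KEY = '...'     # placeholder account key

def private_image_url(container, blob_name, minutes=15):
    sas = generate_blob_sas(
        account_name=ACCOUNT,
        container_name=container,
        blob_name=blob_name,
        account_key=ACCOUNT_KEY,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(minutes=minutes))
    # the URL is only valid until the expiry above
    return 'https://%s.blob.core.windows.net/%s/%s?%s' % (
        ACCOUNT, container, blob_name, sas)
```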
But given the stack you are using, MongoDB can be sufficient: create a separate collection and save the blobs there, since blob storage is essentially a NoSQL store. Are you using Azure's NoSQL database, Cosmos DB, or MongoDB itself?
Do you want to trigger code from database events, such as running something automatically after an image is uploaded?

Is there a possibility to save an mp3 file on the firebase realtime database in React Native?

I need to save the sounds my React Native app uses in a database. I'm not really familiar with databases, so I chose what seems to be the easiest option: Firebase. Is it possible to save an mp3 file in this database using React Native? If not, what database supports this?
What you usually do in such cases is store the actual file on a cloud storage service, such as AWS S3, and save the path to that file in your database.
When you need to retrieve the file, you load the path from the database, and with that information you can download the file from the storage.
If you want to stay in the Firebase ecosystem, take a look at Cloud Storage for Firebase: https://firebase.google.com/docs/storage
You'd be better off storing the files in a storage service such as Cloud Storage for Firebase, AWS Simple Storage Service (S3), or any other file storage, and storing the URL of the file in your database.
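To make the pattern concrete, here is a server-side sketch using the Firebase Admin SDK for Python (the bucket name, database URL, and paths are placeholders; a React Native client would do the equivalent with the Firebase JS SDK):

```python
# Sketch: upload an mp3 to Cloud Storage for Firebase and record its URL
# in the Realtime Database. All names and paths below are placeholders.
import firebase_admin
from firebase_admin import credentials, db, storage

firebase_admin.initialize_app(
    credentials.Certificate('service-account.json'),
    {'storageBucket': 'my-app.appspot.com',
     'databaseURL': 'https://my-app.firebaseio.com'})

def save_sound(local_path, name):
    blob = storage.bucket().blob('sounds/%s.mp3' % name)
    blob.upload_from_filename(local_path, content_type='audio/mpeg')
    blob.make_public()  # or hand out signed URLs instead
    # the database stores only the URL, never the file itself
    db.reference('sounds/%s' % name).set({'url': blob.public_url})
```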

Move database from local datastore to another local datastore

My friend and I are working on a GWT / Google App Engine project, using TortoiseSVN and Google Code to synchronize the code.
We also synchronize the local_db.bin file in the appengine-generated folder, but we can't get it to work: after syncing the db file, our local datastore is not updated as we expected.
That is a pain, and I'm worried about the future, when our database gets bigger and more complicated.
Can anyone give me advice? What should I do to synchronize our local datastores?
I have two suggestions:
1) Use the remote API (https://developers.google.com/appengine/articles/remote_api) to share a GAE-hosted datastore locally.
2) Maybe you can use Google Drive to sync the folders.
This is a really bad idea. Even if you weren't having trouble making both ends read from the same datastore file, the local datastore is in a binary format, and thus you won't both be able to work on the app at the same time, or you'll get merge conflicts you will be unable to resolve.
Instead, both for collaboration purposes and for testing and deployment, you should provide a set of test data you can easily load into the datastore. Store the test data in version control, and load it in using bulkloader or your own code.
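For instance, a small fixture loader along these lines could push test data into the dev server over remote_api (a sketch against the old Python SDK; the Note model and the fixtures/notes.json file are illustrative placeholders):

```python
# Sketch: load JSON fixtures into the local dev datastore via remote_api.
import json
from google.appengine.ext import db
from google.appengine.ext.remote_api import remote_api_stub

class Note(db.Model):  # assumed example model
    title = db.StringProperty()
    body = db.TextProperty()

# the dev server accepts blank credentials on /_ah/remote_api
remote_api_stub.ConfigureRemoteApi(
    None, '/_ah/remote_api', lambda: ('', ''), 'localhost:8080')

with open('fixtures/notes.json') as f:
    for row in json.load(f):
        Note(title=row['title'], body=row['body']).put()
```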

Replicating data from GAE data store

We have an application that we're deploying on GAE. I've been tasked with coming up with options for replicating the data we're storing in the GAE data store to a system running in Amazon's cloud.
Ideally we could do this without having to transfer the entire data store on every sync. The replication does not need to be in anything close to real time, so something like a once or twice a day sync would work just fine.
Can anyone with some experience with GAE help me out here with what the options might be? So far I've come up with:
Use the Google provided bulkloader.py to export the data to CSV and somehow transfer the CSV to Amazon and process there
Create a Java app that runs on GAE, reads the data from the data store and sends the data to another Java app running on Amazon.
Do those options work? What would be the gotchas with those? What other options are there?
You could use logic similar to what the App Engine HRD migration and backup tools do:
1) Mark modified entities with a child entity marker
2) Run a MapperPipeline using the App Engine mapreduce library, iterating over those entities with a Datastore Input Reader
3) In your map function, fetch the parent entity, serialize it to Google Storage using a File Output Writer, and remove the marker
4) Ping the remote host to import those entities from the Google Storage URL
As an alternative to 3 and 4, you could make multiple urlfetch(POST) calls to send each serialized entity to the remote host directly (see the sketch below), but this is more fragile, as a single failure could compromise the integrity of your data import.
You could look at the datastore admin source code for inspiration.
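For the urlfetch alternative, the per-entity push could look roughly like this (a sketch against the old Python SDK; the import endpoint on the Amazon side is assumed to accept the serialized protobuf):

```python
# Sketch: push one changed entity to a remote importer as a serialized
# protobuf. The receiving endpoint and its contract are assumptions.
from google.appengine.api import urlfetch
from google.appengine.ext import db

def push_entity(entity, import_url):
    payload = db.model_to_protobuf(entity).Encode()
    result = urlfetch.fetch(
        import_url,
        payload=payload,
        method=urlfetch.POST,
        headers={'Content-Type': 'application/octet-stream'})
    if result.status_code != 200:
        # a single failed POST is why this path is the fragile one
        raise RuntimeError('import failed: %d' % result.status_code)
```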
