I have built this website using React and Firebase. I would like to know: if someone wants to contribute to this project, how should they get the data that is stored in my Firebase account for development of the website? https://www.codingspace.codes/
I know they need to create a new Firebase project, but how do they get the exact data that is on my website?
I don't have any idea; can anyone please help me with this?
This is a very open-ended question. Typically, though, if it's for local development, I'd just give them access to the Firebase project as a collaborating member.
Alternatively, you could look into the import/export features of Firebase.
Use the firestore export command to export all the documents in your database, replacing [BUCKET_NAME] with the name of your Cloud Storage bucket. Add the --async flag to prevent the gcloud tool from waiting for the operation to complete.
gcloud firestore export gs://[BUCKET_NAME]
They can then import it into their own project in a similar manner:
Use the firestore import command to import documents from a previous export operation.
gcloud firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/
where [BUCKET_NAME] and [EXPORT_PREFIX] point to the location of your export files.
See the documentation for further details https://firebase.google.com/docs/firestore/manage-data/export-import
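As a rough sketch of the full cross-project flow (the bucket name and project IDs are placeholders; I'm also assuming the importing project uses its App Engine default service account for Firestore imports, which is why that account needs access to the bucket):

# in your project: export Firestore into a bucket you own
gcloud config set project your-project-id
gcloud firestore export gs://[BUCKET_NAME]

# give the contributor's project default service account access to the bucket
gsutil iam ch serviceAccount:contributor-project-id@appspot.gserviceaccount.com:roles/storage.admin gs://[BUCKET_NAME]

# in the contributor's project: import the export into their own Firestore
gcloud config set project contributor-project-id
gcloud firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/

Note that managed Firestore exports/imports require billing (the Blaze plan) on the project running the operation.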
So I am at the point where I lost all of my Strapi data after moving from local to a remote host and deploying Strapi through herokuapp.com.
I am using Strapi in my personal NEXT.JS project.
Luckily my Strapi database wasn't huge and all content types were kept, so I was able to recreate the database quite quickly. Also, this was just a personal project.
I am wondering, though: if I decide to use Strapi in a more professional context and in a real project, how do I move from local development to deployment without losing all the data?
Is there a way to export everything before deployment and then import it into the deployed CMS, or how does this work?
Also, what if I want to do it the other way around? I will keep working on my app using my Strapi on Heroku, but at some point I will want to replicate the CMS locally. Where is the data stored, and how do I get access to it?
From docs: "Strapi does not currently provide any tools for migrating or deploying your data changes between different environments (ie. from development to production). With the exception being the Content-Manager settings"
And there is no content export/import for now.
To move your data, for example from the local environment to production, you have to handle:
content types - Strapi stores these in files, so version control will help
database data - you have to make a database backup manually and then import the data on the production server
static files - if you use Strapi to handle static files, you will probably have to copy them manually to the production server, or use version control for them (a bad option). They are stored in app/public/uploads
I haven't tried this myself, but it looks like a pretty tough task. A rough sketch of the manual part follows below.
Conclusion: if it's OK for you to migrate only your content types, just put your Strapi folder under git.
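Here is that sketch, assuming PostgreSQL on both ends, SSH access to the production server, and the default upload folder; every name and path below is a placeholder:

# dump the local database and restore it into the production database
pg_dump -Fc my_local_strapi_db > strapi-backup.dump
pg_restore --clean --no-owner -h production-db-host -U strapi_user -d strapi_production strapi-backup.dump

# copy the uploaded files kept by the default upload provider
rsync -av ./public/uploads/ deploy@production-host:/srv/strapi/public/uploads/

If your local database is SQLite (Strapi's default quickstart setup) and production is PostgreSQL, a plain dump/restore won't work; you'd need a conversion step or to re-enter the data.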
You can do it via the CLI now, new as of version 4.6.
Strapi supports export, import, and transfer.
To create a tar of your data:
npm run strapi export -- --file my-strapi-export
To import data into your project:
npm run strapi import -- -f export_file.tar.gz
There are more options in the docs:
https://docs.strapi.io/developer-docs/latest/developer-resources/data-management.html#export-data-using-the-cli-tool
With the new data export & import system, you can back up & restore your Strapi project data without working manually and directly in the database.
More info: https://strapi.io/blog/announcing-strapi-v4.6
Keep in mind Heroku is in the middle of shutting down its free tier, so using another provider like railway.app or render.com might be a good idea.
Anyway: as Eugene already mentioned in his answer, there are 3 types of data that might have to be migrated (content types, the actual database, and files).
After your first deployment to Heroku you should end up with all content types being there, but with an empty database and no files.
Following this guide you will create your own database while setting up the project. You can then either export your database from your local development environment and import it manually, or put in new data by hand. Sometimes the latter is even better, since development environments tend to include a lot of "Lorem ipsum" content for testing purposes.
Future deployments should not reset your database, though; they keep the data of that environment.
Finally, there are the files, which I would recommend storing on Cloudinary since it's free and Strapi offers an easy-to-use plugin for it. Just create a free account on Cloudinary, install the plugin in your Strapi project, and set your ENV variables for your production environment within Heroku.
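A minimal configuration sketch, assuming the Strapi v4 Cloudinary provider and ENV variable names of your own choosing (set the same variables as config vars in Heroku):

// config/plugins.js
module.exports = ({ env }) => ({
  upload: {
    config: {
      provider: 'cloudinary',
      providerOptions: {
        cloud_name: env('CLOUDINARY_NAME'),
        api_key: env('CLOUDINARY_KEY'),
        api_secret: env('CLOUDINARY_SECRET'),
      },
    },
  },
});

The provider package itself is installed with npm install @strapi/provider-upload-cloudinary.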
I found this
Apparently they recently did a plugin tutorial that had to do with this issue. There is a plugin called strapi-plugin-import-export-content on GitHub that might help with your issue.
GitHub link
For example, I am using the react-native-firebase library.
I want to grow my app installs, and I created a campaign.
The question is: how can I know whether my app was installed by clicking an ad or downloaded directly from the Play Store?
Simply put, I want to know where my app was downloaded from. I have already implemented the Firebase Analytics configuration.
It seems like you could follow the advice given in this question: https://stackoverflow.com/a/47893179/4147687
Essentially, if all you want is a way to track how your app was installed, you can simply pass UTM parameters along with your URL.
For example:
https://link-to-my-app.com/?utm_source=business_card&utm_medium=email&utm_campaign=sign_up_offer
Google Play has its own URL builder to make this step easier here. If you are not using any specific ad network, just set it as "Custom".
Firebase will automatically scrape the source, campaign and medium parameters there and present them to you in the first_open conversion event. Documentation for how this works is here.
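For a Play Store install link specifically, that builder produces a URL where the UTM parameters are URL-encoded inside the referrer parameter; it looks roughly like this (the package name is a placeholder):
https://play.google.com/store/apps/details?id=com.example.myapp&referrer=utm_source%3Dbusiness_card%26utm_medium%3Demail%26utm_campaign%3Dsign_up_offer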
The easiest way to do this is by using this npm package
https://www.npmjs.com/package/react-native-install-referrer
You should invoke the API only once during the first execution after install.
You can do that by using AsyncStorage to check whether the app is freshly installed or not, and then call logEvent with that UTM URL to track the users; a sketch follows below.
After creating the event you can segment users by creating Audiences in the Firebase console.
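A rough sketch of the "only once after install" logic (I haven't checked the exact API of react-native-install-referrer, so getInstallReferrer below is a hypothetical stand-in for whatever that package exposes; the event and storage key names are arbitrary):

import AsyncStorage from '@react-native-async-storage/async-storage';
import analytics from '@react-native-firebase/analytics';
// hypothetical helper standing in for the react-native-install-referrer API
import { getInstallReferrer } from 'react-native-install-referrer';

export async function logInstallReferrerOnce() {
  // only run on the very first launch after install
  const alreadyLogged = await AsyncStorage.getItem('referrer_logged');
  if (alreadyLogged) return;

  // e.g. "utm_source=google-ads&utm_medium=cpc&utm_campaign=spring_sale"
  const referrer = await getInstallReferrer();
  await analytics().logEvent('install_referrer', { referrer });

  await AsyncStorage.setItem('referrer_logged', 'true');
}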
I've set up a React Amplify project. I have successfully got Auth working using Cognito User Pools, but I can't seem to figure out DataStore/API.
I currently use DataStore to save data locally, but it doesn't seem to update the backend. I do have the aws_appsync_graphqlEndpoint in aws-exports.js.
I'm not sure how to enable the sync engine in this guide.
You must import Amplify from @aws-amplify/core and NOT from aws-amplify.
The following code example worked well in App.js, at the BOTTOM of all your imports:
import Amplify from "@aws-amplify/core";
import { DataStore, Predicates } from "@aws-amplify/datastore";
import { Post, PostStatus } from "./models";
//Use next two lines only if syncing with the cloud
import awsconfig from "./aws-exports";
Amplify.configure(awsconfig);
Source: https://docs.amplify.aws/lib/datastore/examples/q/platform/js
You can use DataStore.query to query the data and DataStore.save to save a record, etc.
The example below should be a good starting point:
https://github.com/dabit3/amplify-datastore-example
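As a quick illustration of the DataStore.save / DataStore.query calls mentioned above (the Post model and its title/status fields are assumptions based on a typical Amplify example schema, not necessarily your models):

import { DataStore } from "@aws-amplify/datastore";
import { Post, PostStatus } from "./models";

async function demo() {
  // save a record; once the sync engine is configured this is pushed to AppSync
  await DataStore.save(
    new Post({ title: "My first post", status: PostStatus.ACTIVE })
  );

  // query all records of the model
  const posts = await DataStore.query(Post);
  console.log(posts);
}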
If I understand the question correctly, everything is set up with DataStore and working locally; the problem is that it doesn't sync to the cloud. One of the answers suggested using Apollo. You don't need Apollo. By default, and as the suggested development mode of DataStore, it is local only. When you are ready to deploy to the cloud you use the command
amplify push
which provisions all of your cloud resources for the app. For more details refer to DataStore docs.
I was trying to find a step-by-step guide to migrate an existing project working with Datastore to the new Firestore in Datastore mode. If you do not want to wait for the automatic migration to Firestore, here is my experience with the migration.
Documentation page "Exporting and Importing Entities" has the steps, but I struggled a bit figuring things out. So I thought I'd share my experience.
Some notes to consider:
Beware of cost/time if you have a huge database!
I have yet to figure out the challenges of moving the application from Python 2.7 to Python 3 (not entirely a Datastore issue; this doc page may help!).
From Google Console, create a new project
gcloud config set project <your-old-project-id>
Create a new bucket on Google Storage (on your old project)
Export your database to the newly created bucket gcloud datastore export gs://<your-new-bucket-name> (on your old project)
Waiting for
[projects/[your-old-project-id]/operations/AvcsdafSDFasdfI3MDQJGnRsdWFmZWQHEmVwb3J1Z
S1zYm9qLW5asdfcsopEg] to finish...done.
From Google Console, go to the IAM page (still on your old project)
Add the new project's service account and grant it the "Cloud Datastore Import Export Admin" & "Storage Admin" permissions (NOTE: for some reason, when adding only the "Storage Viewer" permission I got the below error, so I used "Storage Admin" instead!)
"details: [new-project-service-account]@appspot.gserviceaccount.com does not have storage.buckets.get access to [new-project-id].appspot.com."
Move to your new project gcloud config set project <your-NEW-project-id>
Import the data from your storage bucket: gcloud datastore import gs://<your-bucket-path>/[FILENAME].overall_export_metadata. When done you should see a message similar to the export one:
Waiting for
[projects/[your-old-project-id]/operations/AaredafSDFa2otbmltZGEQCigSFmZWQHEmVwb3J1Z
S1z2otbmltZGEQCigS] to finish...done.
Optional: feel free to remove the new project's service account permission from the old project's IAM page
If you browse to your new project's Datastore page, you should see your migrated entities
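If you prefer to do the permission step from the command line instead of the IAM page, something like the following should be equivalent (the role IDs are my best guess at the CLI names of the roles mentioned above; the service-account address follows the same pattern as in the error message):

# run these against the OLD project
gcloud projects add-iam-policy-binding your-old-project-id \
  --member="serviceAccount:your-new-project-id@appspot.gserviceaccount.com" \
  --role="roles/datastore.importExportAdmin"

gsutil iam ch \
  serviceAccount:your-new-project-id@appspot.gserviceaccount.com:roles/storage.admin \
  gs://your-new-bucket-name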
I've been looking at how to create multiple Firestore instances in Firebase; I need different DBs for prod, staging and development. I read the documentation, and it seems that I only need to modify the "google-services.json" file in the applications. What I don't get is what I should modify and how that would look in the console.
Am I going to see different instances in the same project?
Do I need to create different projects for every environment and modify those values in the file?
If I need to test something that requires testing in all environments, and all of them require Blaze to run that test, do I have to pay triple?
Thanks in advance
Firebase doesn't support the use of multiple Firestore instances in a single project.
The Firebase team strongly recommends creating different projects for each of your environments to keep them separate. This probably means you will have to build different apps that each point to different instances, or you will have to somehow configure your app at runtime to select which project you want to work with.
There is no obligation to add billing to any of the projects you don't want to pay for, as long as you accept the limitations of the Spark plan.
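For example, in a React app the runtime selection could look roughly like this, using the web v8-style SDK (the env variable name and config values are placeholders; you'd have one Firebase config per project):

// firebase.js - pick the Firebase project based on the build environment
import firebase from "firebase/app";
import "firebase/firestore";

const configs = {
  development: { projectId: "myapp-dev", apiKey: "...", appId: "..." },
  staging: { projectId: "myapp-staging", apiKey: "...", appId: "..." },
  production: { projectId: "myapp-prod", apiKey: "...", appId: "..." },
};

firebase.initializeApp(configs[process.env.REACT_APP_ENV || "development"]);
export const db = firebase.firestore();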
Yes, Firebase doesn't support multiple instances in a single project. However, in my case I created 3 different documents at the Firestore root and set up each environment to refer to its document accordingly. Each document will have the same collections, sub-collections and documents.
For example,
dev  -> users -> details
     -> others_collection -> details
stag -> users -> details
     -> others_collection -> details
prod -> users -> details
     -> others_collection -> details
On the client side, each environment will get the collection like this :
db.collection(`${env}/users`)
I am doing this on a small project and am pretty happy with what Firestore provides. In a single project I can create many iOS/Android apps according to the environment, and each environment has its own document and collections in Firestore.
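One caveat: a Firestore collection path needs an odd number of segments, so for db.collection(`${env}/users`) to work, env has to resolve to a collection/document pair rather than a bare name. A sketch of one way to lay it out (the top-level environments collection name is my assumption, using the web v8-style API):

// env could come from your build configuration, e.g. "dev", "stag" or "prod"
const env = "dev";

// users sub-collection under the document for this environment
const users = db.collection("environments").doc(env).collection("users");

// equivalent single-path form (3 segments: collection / document / collection)
const sameUsers = db.collection(`environments/${env}/users`);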
The safest way is to create a new Google Cloud project. You can export your current Firestore database to your new project if needed.
Step by step:
Back up your current database by exporting all your documents into a bucket:
https://firebase.google.com/docs/firestore/manage-data/export-import
gcloud beta firestore export gs://[BUCKET_NAME]
Create a new project -> https://cloud.google.com/resource-manager/docs/creating-managing-projects
In the dashboard, select Firestore in your new project and create an empty database
In the dashboard, go to your BACKUP_BUCKET and add your new project's service account in the permissions tab
Switch project to your new project
gcloud config set project 'new-project'
Then import your data:
gcloud beta firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]/ where [EXPORT_PREFIX] is the name of the folder created in your bucket.
I use this method to have clearly separated 'preprod' and 'production' environments (the team can access and synchronize with production without having access to the production environment codebase and configuration).