I'm planning to build an app that uses a NoSQL database, and RethinkDB sounds good, but there's not enough info about how to connect a Flutter app to RethinkDB, since many resources and videos favor Firebase.
So if it's possible to build a database for my flutter app using RethinkDB, how can I go about doing that?
Yes, it is possible to use RethinkDB with a Flutter application.
There is a rethinkdb_dart package on pub.dev.
There you can also find an example of how to use the package:
To include this driver in your own project add the package to your pubspec.yaml file:
dependencies:
  rethinkdb_dart: '^2.3.2+6'
Then import the package into your project:
import 'package:rethinkdb_dart/rethinkdb_dart.dart';
Connect to the database:
var connection = await r.connect(db: "test", host: "localhost", port: 28015);
Create a table:
await r.db('test').tableCreate('tv_shows').run(connection);
Insert some data:
await r.table('tv_shows').insert([
{'name': 'Star Trek TNG', 'episodes': 178},
{'name': 'Battlestar Galactica', 'episodes': 75}
]).run(connection);
And work with the data:
var count = await r.table('tv_shows').count().run(connection);
print("count: $count");
Be aware that you will have to set up the database server yourself.
How to do so is explained in the RethinkDB docs.
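If you just need a local instance for development, one option (assuming you have Docker installed) is to run the official RethinkDB image; 28015 is the client driver port used in the connect example above, and 8080 serves the admin UI:
docker run -d --name rethinkdb -p 8080:8080 -p 28015:28015 rethinkdb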
How to deploy a docker image from Artifactory on Google App Engine?
What I am trying to achieve is deploying my Docker image that is stored in a JFrog Artifactory to Google App Engine. All the examples I find push the image to Artifact Registry, which is redundant, as I only want to store the artifact on JFrog. Has anyone tried to do this before?
Here is how far I could get using Cloud Build:
- name: 'gcr.io/cloud-builders/docker'
  dir: /workspace/app
  args: [ 'pull', 'myjfrogurl.jfrog.io/$PROJECT_ID:$BRANCH_NAME' ]
Then I use terraform later to deploy:
resource "google_app_engine_flexible_app_version" "app_deploy" {
version_id = "v1"
service = var.service_name
runtime = "nodejs"
...
deployment {
container {
# Here is the problem as it needs to be a google URI
image = "myjfrogurl.jfrog.io/${var.project_id}:${var.branch_name}"
}
}
Maybe there is a way of doing this; it doesn't need to be via Terraform or Cloud Build.
Edit
With the following code it is possible to pull the image from JFrog and push it to Container Registry, where it will be visible to App Engine or Cloud Run, though, as the answer says, it is not possible to keep the image stored in only one place:
# Pull from external repository
- name: 'gcr.io/cloud-builders/docker'
  args: [ 'pull', 'myjfrogurl.jfrog.io/$PROJECT_ID:$BRANCH_NAME' ]
# Do a fast build using --cache-from
- name: 'gcr.io/cloud-builders/docker'
  args: [ 'build',
          '-t', 'gcr.io/$PROJECT_ID/appName:$BRANCH_NAME',
          '--cache-from', 'gcr.io/$PROJECT_ID/appName:$BRANCH_NAME',
          '.' ]
# Tag the pulled image for Container Registry (docker tag needs source and target)
- name: 'gcr.io/cloud-builders/docker'
  args: ['tag',
         'myjfrogurl.jfrog.io/$PROJECT_ID:$BRANCH_NAME',
         'gcr.io/$PROJECT_ID/appName:$BRANCH_NAME']
# Push to the Container Registry
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/$PROJECT_ID/appName:$BRANCH_NAME']
Posting Guillaume Blaquiere's comment as a Community Wiki answer for better visibility for the community:
This is not possible for App Engine, and there is the same limitation with Cloud Run.
To deploy an image to App Engine, you have to push the image to Google Container Registry. Under the hood, Container Registry is a GCP bucket called eu.artifacts.projectId.appspot.com or artifacts.projectId.appspot.com (depending on the region). Artifact Registry is a service alongside Container Registry that helps manage the images.
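For reference, once the image is in Container Registry (as in the Cloud Build steps above), an App Engine flexible deployment can point at it with the --image-url flag; a minimal sketch, where PROJECT_ID, appName and TAG are placeholders matching the build steps above and app.yaml declares env: flex:
gcloud app deploy app.yaml --image-url=gcr.io/PROJECT_ID/appName:TAG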
I am fairly new to web development (currently enrolled in a bootcamp) and have struggled finding the needed resources to incorporate uploading to Amazon S3 in my project. I apologize for the vagueness ahead of time.
I currently have a React app that is pulling images from my Amazon S3 account, but I intend to give the user the ability to upload to my bucket and use/view the images on my website.
I have tried watching tutorials and looking at various GitHub repos to identify what I am missing, but have been unable to locate a tutorial that involves React, JSX, and JavaScript (I've seen jQuery, PHP, etc.). Ultimately, I know this task is difficult and I am willing to put in the work, but I felt the need to ask if anyone knows of a useful resource that can help me.
I've tried using the 'aws-nodejs-sample' repo and the 'themetoerchef/uploading-with-react' repo, watched a YouTube tutorial, looked into FineUploader, and read the react-S3-uploader npm files, but am unable to connect the dots. Additionally, I've included my AWS access keys in my .env file and tried making query strings to access the S3 bucket.
Is there a better way to go about this or are there other ways to upload with react that may be useful outside of S3?
To upload to S3 from the browser you need to get a signed URL from an AWS SDK, which is how AWS verifies your identity. In my last application I used the SDK for Node.js to generate the signed URL and passed it to my front-end application to use when pushing files to S3. You don't have to go that route; there is an SDK that can be used by JavaScript within the browser.
Check this AWS link for more.
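For example, a minimal Node.js sketch of the signed-URL approach (the bucket name, key, and region here are placeholders; this assumes the aws-sdk v2 package on the backend):
const AWS = require('aws-sdk');
const s3 = new AWS.S3({ region: 'us-east-1' });

// Generate a short-lived URL that allows a single PUT of this key
const url = s3.getSignedUrl('putObject', {
  Bucket: 'my-bucket',
  Key: 'uploads/test-file.jpg',
  Expires: 60, // seconds
  ContentType: 'image/jpeg'
});

// Hand `url` to the React app, which can then upload the file directly:
// fetch(url, { method: 'PUT', body: file, headers: { 'Content-Type': 'image/jpeg' } })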
Go to your project directory and run
npm install --save react-aws-s3
https://www.npmjs.com/package/react-aws-s3
Then add the code to your component as per the npm documentation:
import S3 from 'react-aws-s3';
const config = {
bucketName: 'myBucket',
dirName: 'media', /* optional */
region: 'eu-west-1',
accessKeyId: 'JAJHAFJFHJDFJSDHFSDHFJKDSF',
secretAccessKey: 'jhsdf99845fd98qwed42ebdyeqwd-3r98f373f=qwrq3rfr3rf',
s3Url: 'https://your-custom-s3-url.com/', /* optional */
}
const ReactS3Client = new S3(config);
/* Notice that if you don't provide a dirName, the file will be automatically uploaded to the root of your bucket */
/* This is optional */
const newFileName = 'test-file';
ReactS3Client
.uploadFile(file, newFileName)
.then(data => console.log(data))
.catch(err => console.error(err))
/**
* {
* Response: {
* bucket: "myBucket",
* key: "image/test-image.jpg",
* location: "https://myBucket.s3.amazonaws.com/media/test-file.jpg"
* }
* }
*/
Now that everything is done, make sure to load your keys and secrets from process.env.
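For example, assuming Create React App (which only exposes variables prefixed with REACT_APP_ to the browser bundle; the variable names here are placeholders), the config above could read:
const config = {
  bucketName: process.env.REACT_APP_BUCKET_NAME,
  region: process.env.REACT_APP_REGION,
  accessKeyId: process.env.REACT_APP_ACCESS_KEY_ID,
  secretAccessKey: process.env.REACT_APP_SECRET_ACCESS_KEY,
}
Keep in mind that anything bundled into client-side code is still visible to users, which is why the signed-URL approach from the other answer is often preferred.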
NOTE: Please don't forget to add a CORS policy on the AWS bucket if you see a CORS error; see here for a detailed example.
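A minimal CORS configuration in the JSON form accepted by the S3 console (Permissions > CORS) might look like the following; tighten AllowedOrigins to your own domain:
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "PUT", "POST"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": []
  }
]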
Is there a way to run Firestore locally (e.g. for testing purposes)?
What would be the approach to writing tests against the DB (apart from using mocks)?
Update 2020:
There's now also a Firebase Emulator Suite.
Update Nov 2018:
Local emulation, at least for the purpose of testing Firestore rules, was demoed at Firebase Summit 2018 using @firebase/testing and documented under Test your Cloud Firestore Security Rules.
It looks like it's along the lines of:
const firebase = require('@firebase/testing')
const app = firebase.initializeTestApp({
projectId: 'my-project',
auth: { uid: '123', email: 'name@domain.com' }
})
const attempt = app.firestore()
.collection('colId').doc('docId').get()
firebase.assertFails(attempt)
firebase.assertSucceeds(attempt)
It seems early-on, as it's not been noted in the release-notes, but I'm sure it's coming along.
There is not currently, but stay tuned as it's something we want to provide.
In the meantime we suggest using a separate testing project to cover this. The daily free tier per project helps with this too.
You can run the Firestore emulator by running:
gcloud beta emulators firestore start
and then set the FIRESTORE_EMULATOR_HOST environment variable as per the console output (e.g. run export FIRESTORE_EMULATOR_HOST=::1:8505).
This requires the Google Cloud SDK and a Java 8+ JRE installed and on your system PATH.
For Firestore testing, write a JS example, e.g. test.js.
You can test a write with this example format:
var data = {
value: {createTime: new Date(),
updateTime: new Date(),
fields:{
name:{stringValue:'new value data'},
age:{integerValue:50}
}
},
oldValue: {createTime: new Date(), // old create time
updateTime: new Date(), // old update time
fields:{
name:{stringValue:'old value data'},
age:{integerValue:50}
}
}
};
testFireStoreEvent(data);
To run, execute:
firebase experimental:functions:shell < test.js
UPDATE!!!! VALID FOR WRITE AND UPDATE EVENTS
var data = {
before: {
//your before data
},
after: {
//your after data
}
};
testFireStoreEvent(data);
There are two libraries which attempt to facilitate mocking of the Firebase SDK.
1) https://github.com/soumak77/firebase-mock
2) https://github.com/mikkopaderes/mock-cloud-firestore
I currently use the first one, since it seems to have a bit more of the SDK implemented.
They're not perfect, but they're currently sufficient for my needs, and are preferable to the other approaches since they're entirely in-process.
Note that firebase-mock (#1) does cause a webpack error if used as-is from Webpack/web code. To resolve, you can use option #2 (mock-cloud-firestore), or use the workaround mentioned here (until a fix gets merged): https://github.com/soumak77/firebase-mock/issues/157#issuecomment-545387665
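For reference, a minimal in-process sketch of firebase-mock (#1), based on its README (the exact API surface may vary between versions):
const firebasemock = require('firebase-mock');

const mockFirestore = new firebasemock.MockFirestore();
mockFirestore.autoFlush(); // resolve queued operations immediately

mockFirestore.collection('users').doc('alice').set({ name: 'Alice' });
mockFirestore.collection('users').doc('alice').get()
  .then(snapshot => console.log(snapshot.data()));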
Other options:
3) Firestore emulator: needs the google-cloud-sdk, and relies on a separate process
4) Separate test project: relies on connection to the internet, which also means possible quota limitations/costs
5) firebase-server: Only supports the realtime-database api, not Firestore
Firestore can be set up locally using gcloud.
Start the Firestore emulator by running gcloud beta emulators firestore start --host-port=localhost:8081; if it starts successfully, you will see Dev App Server is now running.
If you are using @google-cloud/firestore, create the Firestore instance in this way:
// Firestore instance only for test env
const { Firestore } = require('@google-cloud/firestore')
const instance = new Firestore({ projectId: 'Your project id', host: 'localhost', port: 8081 })
Now you have the option to work with the local Firestore emulator by setting the local host:
var db = firebaseApp.firestore();
if (location.hostname === "localhost") {
db.settings({
host: "localhost:8080",
ssl: false
});
}
https://firebase.google.com/docs/emulator-suite/connect_and_prototype#instrument_your_app_to_talk_to_the_emulators
Could someone help me access Big Query from an App Engine application?
I have completed the following steps -
Created an App Engine project.
Installed google-api-client, oauth2client dependencies (etc) into /lib.
Enabled the Big Query API for the App Engine project via the cloud console.
Created some 'Application Default Credentials' (a 'Service Account Key') [JSON] and saved it/them to the root of the App Engine application.
Created a 'Big Query Service Resource' as per the following -
def get_bigquery_service():
    from googleapiclient.discovery import build
    from oauth2client.client import GoogleCredentials
    credentials = GoogleCredentials.get_application_default()
    bigquery_service = build('bigquery', 'v2', credentials=credentials)
    return bigquery_service
Verified that the resource exists -
<googleapiclient.discovery.Resource object at 0x7fe758496090>
Tried to query the resource with the following (ProjectId is the short name of the App Engine application) -
bigquery=get_bigquery_service()
bigquery.tables().list(projectId=#{ProjectId},
datasetId=#{DatasetId}).execute()
Returns the following -
<HttpError 401 when requesting https://www.googleapis.com/bigquery/v2/projects/#{ProjectId}/datasets/#{DatasetId}/tables?alt=json returned "Invalid Credentials">
Any ideas as to steps I might have wrong or be missing here? The whole auth process seems a nightmare, quite at odds with the App Engine/PaaS ease-of-use ethos :-(
Thank you.
OK so despite being a Google Cloud fan in general, this is definitely the worst thing I have been unfortunate enough to have to work on in a while. Poor/inconsistent/nonexistent documentation, complexity, bugs etc. Avoid if you can!
1) Ensure your App Engine 'Default Service Account' exists
https://console.cloud.google.com/apis/dashboard?project=XXX&duration=PTH1
You get the option to create the Default Service Account only if it doesn't already exist. If you've deleted it by accident you will need a new project; you can't recreate it.
How to recover Google App Engine's "default service account"
You should probably create the default set of JSON credentials, but you won't need to include them as part of your project.
You shouldn't need to create any other Service Accounts, for Big Query or otherwise.
2) Install google-api-python-client and apply fix
pip install -t lib google-api-python-client
Assuming this installs oauth2client 3.0.x, then on testing you'll get the following complaint:
File "~/oauth2client/client.py", line 1392, in _get_well_known_file
default_config_dir = os.path.join(os.path.expanduser('~'),
File "/usr/lib/python2.7/posixpath.py", line 268, in expanduser
import pwd
File "~/google_appengine-1.9.40/google/appengine/tools/devappserver2/python/sandbox.py", line 963, in load_module
raise ImportError('No module named %s' % fullname)
ImportError: No module named pwd
which you can fix by changing ~/oauth2client/client.py [line 1392] from:
os.path.expanduser('~')
to:
os.getenv("HOME")
and adding the following to app.yaml:
env_variables:
  HOME: '/tmp'
Ugly but works.
3) Download GCloud SDK and login from console
https://cloud.google.com/sdk/
gcloud auth login
The issue here is that App Engine's dev_appserver.py doesn't include any Big Query replication (natch); so when you're interacting with Big Query tables it's the production data you're playing with, and you need to log in to get access.
Obvious in retrospect, but poorly documented.
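Note that get_application_default() reads Application Default Credentials, which on a local machine are usually populated by a separate command; if the 401 persists after gcloud auth login, it may be worth running:
gcloud auth application-default login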
4) Enable Big Query API in App Engine console; create a Big Query ProjectID
https://console.cloud.google.com/apis/dashboard?project=XXX&duration=PTH1
https://bigquery.cloud.google.com/welcome/XXX
5) Test
from oauth2client.client import GoogleCredentials
credentials=GoogleCredentials.get_application_default()
from googleapiclient.discovery import build
bigquery=build('bigquery', 'v2', credentials=credentials)
print bigquery.datasets().list(projectId=#{ProjectId}).execute()
[or similar]
Good luck!
When I change my database setting according to the official guide to
DATABASES['default'] = dj_database_url.config()
It has
NameError: name 'DATABASES' is not defined
when building.
When I change the syntax of database settings to
DATABASES = {
'default': dj_database_url.config()
},
it has
settings.DATABASES is improperly configured. Please supply the ENGINE value. Check settings documentation for more details. when opening the app locally.
and it has
Internal Server Error: The server encountered an unexpected internal server error (generated by waitress)
when launching from heroku.
Note, this way worked once. But when I merged my code with my friend's, it had problems again. I roughly located it as a database problem, so I deleted the database on Heroku and wanted to sync again. But when I sync the database, it gives Import error: No module named events.
When I change the setting back to the original way:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(PROJECT_PATH, 'db.sqlite3'),
}
}
It can work locally (of course), but not on Heroku, which gives the same Import error: No module named events when syncing the database.
PS:
1. I made sure that Heroku installed all the requirements I need to run the app; in particular, I triple-checked all the files: models, views, urls, etc.
2. I use waitress as the server instead of gunicorn, which is recommended by the Heroku official guide.
How can I fix it?
You need a DATABASE_URL environment variable that dj_database_url will read.
To set it, run heroku config:set DATABASE_URL=<your database url> from your terminal.
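For reference, a typical settings.py pattern with dj-database-url (the sqlite URL is just a local fallback for when DATABASE_URL is unset):
import dj_database_url

DATABASES = {
    'default': dj_database_url.config(default='sqlite:///db.sqlite3'),
}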