How to change the Firebase Realtime Database URL in React

How can I change the database URL (Firebase Realtime Database) in code? I would like to give the user the option to choose which database to use (just different URLs).
I have a firebaseConfig.ts file containing my firebaseConfig variable, which holds the apiKey, authDomain, databaseURL, ...

https://firebase.google.com/docs/projects/multiprojects#web
You can have multiple Firebase configs in an object, like:
var configs = { "option1": { /* config */ }, "option2": { /* config */ } };
and then initialise them all under different app names, like:
var databases = {};
Object.keys(configs).forEach((name) => {
  const app = firebase.initializeApp(configs[name], name);
  databases[name] = app;
});
Then, based on the choice, you can grab the corresponding app, access its database, and change values etc.:
const app = databases["option1"];
const db = app.database(); // Realtime Database bound to that app's databaseURL
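For example, a read against whichever database was chosen might look like this (a sketch assuming the namespaced v8 API; "option2" and the "items" path are placeholder names):
// Sketch: read from the database the user selected (v8 namespaced API)
databases["option2"].database()
  .ref("items")
  .once("value")
  .then((snapshot) => console.log(snapshot.val()));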

Related

Sort Data on react-firebase-hooks/database

So in React, I'm reading a Firebase Realtime Database using the "react-firebase-hooks/database" package.
import { useList } from "react-firebase-hooks/database";
import { db, auth } from "../../firebase";

function GameHistory() {
  var dbRef = db.ref("/" + user.uid);
  const [snapshots, loading, error] = useList(dbRef);
So basically the snapshots variable contains all the Firebase Realtime Database data.
Then later in my code I simply map each element of the snapshot array into a component.
Here is the problem: I want to sort my snapshots data by the .timestamp Firebase property in my data, and I'm not sure how to do this.
I tried to sort the snapshot data when I map it:
snapshots
  .sort((a, b) => {
    return a.val().timestamp > b.val().timestamp;
  })
  .map((game, index) => (MORE CODE
But that doesn't work because timestamp is a firebase object, and JavaScript doesn't know what to do with it.
Just to establish more context on the timestamp variable, I defined it as such:
timestamp: firebase.firestore.FieldValue.serverTimestamp()
So is there any way to sort my snapshot data? Or should I use another package? If I should use something else, please show code for how to read and sort the Realtime Database.
You're mixing up two different databases here:
- The import { useList } from "react-firebase-hooks/database" and other code are for the Realtime Database.
- The timestamp: firebase.firestore.FieldValue.serverTimestamp() is for Cloud Firestore.
While both databases are part of Firebase, they are completely separate and each has its own API.
To write a server-side timestamp to Realtime Database, use:
timestamp: firebase.database.ServerValue.TIMESTAMP
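For example, a push that stamps the record server-side might look like this (a sketch against the question's db ref; the score field is a placeholder):
// Sketch: the server fills in timestamp when the write lands (v8 API)
db.ref("/" + user.uid).push({
  score: 100, // placeholder payload field
  timestamp: firebase.database.ServerValue.TIMESTAMP,
});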
I'd actually also let the database server handle the sorting, instead of doing it in your code. That's done with:
var dbRef = db.ref("/" + user.uid);
const dbQuery = dbRef.orderByChild("timestamp");
const [snapshots, loading, error] = useList(dbQuery);
...
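As an aside, if you do sort client-side, remember that a JavaScript sort comparator must return a number, not a boolean, so the comparison from the question would become:
// Numeric comparator: negative, zero, or positive, as Array.prototype.sort expects
snapshots.sort((a, b) => a.val().timestamp - b.val().timestamp);
A boolean comparator like a.val().timestamp > b.val().timestamp leaves the resulting order effectively undefined.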

Upload images to Azure blob from front end (React)

The front end enables people to upload their photos, so I was initially sending the base64 to the server and working with it, but there are problems with a firewall that blocks requests containing base64. As an alternative solution, I tried uploading the image to an Azure blob, getting the file name, and then sending that to the server for processing, where I generate a SAS token for the blob validation and processing.
This works perfectly fine when I work locally; the front end connects with @azure/storage-blob
and uploadBrowserData() when I send the arrayBuffer as the param:
// accountName, sasString, and containerName are defined elsewhere in the app
import { BlobServiceClient, AnonymousCredential } from '@azure/storage-blob';

const anonymousCredential = new AnonymousCredential();

export const uploadSelfieToBlob = async arrayBuffer => {
  try {
    const blobURL = `https://${accountName}.blob.core.windows.net${sasString}`;
    const blobServiceClient = new BlobServiceClient(blobURL, anonymousCredential);
    const containerClient = blobServiceClient.getContainerClient(containerName);

    // Random, time-stamped name so uploads don't collide
    let randomString = Math.random().toString(36).substring(7);
    const blobName = `${randomString}_${new Date().getTime()}.jpg`;
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);

    const uploadBlobResponse = await blockBlobClient.uploadBrowserData(arrayBuffer);
    return { blobName, blobId: uploadBlobResponse.requestId };
  } catch (error) {
    console.log('error when uploading to blob', error);
    throw new Error('Error Uploading the selfie to blob');
  }
};
When I deploy, this is not working. The front end is deployed in the EastUS2 location, and my local development location is different.
I thought the sasString generated for anonymous access had the timezone option, so I generated two different ones: one for local and one for the hosted server, with the same location selected.
Failed to send request to https://xxxx.blob.core.windows.net/contanainer-name/26pcie_1582087489288.jpg?sv=2019-02-02&ss=b&srt=c&sp=rwdlac&se=2023-09-11T07:57:29Z&st=2020-02-18T00:57:29Z&spr=https&sig=9IWhXo5i%2B951%2F8%2BTDqIY5MRXbumQasOnY4%2Bju%2BqF3gw%3D
What am I missing? Any lead would be helpful. Thanks.
First, as mentioned in the comments, there was an issue with the CORS settings, which is why you were getting the initial error.
AuthorizationResourceTypeMismatch: This request is not authorized to perform this operation using this resource type.
RequestId:7ec96c83-101e-0001-4ef1-e63864000000
Time:2020-02-19T06:57:31.2867563Z
I looked up this error code here and then closely looked at your SAS URL.
One thing I noticed in your SAS URL is that you have set the signed resource type (srt) as c (container) and are trying to upload a blob. If you look at the description of the kinds of operations you can do using srt=c here, you will notice that blob-related operations are not supported.
In order to perform blob-related operations (like blob upload), you would need to set the signed resource type value to o (for object).
Please regenerate your SAS Token and include signed resource type as object (you can also include container and/or service in there as well) and then your request should work. So essentially your srt in your SAS URL should be something like srt=o or srt=co or srt=sco.
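For reference, here's a server-side sketch (Node, @azure/storage-blob; ACCOUNT_NAME and ACCOUNT_KEY are placeholders for your own values) of generating an account SAS whose resource types include object:
// Sketch: account SAS with srt=sco so blob (object) operations are allowed
const {
  StorageSharedKeyCredential,
  generateAccountSASQueryParameters,
  AccountSASPermissions,
  AccountSASServices,
  AccountSASResourceTypes,
  SASProtocol,
} = require('@azure/storage-blob');

const credential = new StorageSharedKeyCredential(ACCOUNT_NAME, ACCOUNT_KEY);

const sas = generateAccountSASQueryParameters({
  permissions: AccountSASPermissions.parse('rwdlac'),
  services: AccountSASServices.parse('b').toString(),
  resourceTypes: AccountSASResourceTypes.parse('sco').toString(), // service + container + object
  protocol: SASProtocol.Https,
  expiresOn: new Date(Date.now() + 60 * 60 * 1000), // one hour
}, credential).toString();

const sasString = `?${sas}`; // append to https://<account>.blob.core.windows.net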
I couldn't spot anything wrong with the code you mentioned above, but I have been using a different method to upload files to Azure Blob Storage using React; the method is exactly the same as in this blog article, which works perfectly for me.
https://medium.com/@stuarttottle/upload-to-azure-blob-storage-with-react-34f37805fdfc

Access Alexa session persistence from another service

I have an Alexa-hosted skill. I am trying to find a way to access persistenceAdapter.S3PersistenceAdapter() from somewhere else (e.g. an Angular 2.x app). Can this be done, or do I have to replace S3 with another database? If that is the case, which database is recommended?
I use some sample code to access S3 from the Alexa skill. I have no idea how attributesManager works; I just copied and pasted.
.withPersistenceAdapter(
  new persistenceAdapter.S3PersistenceAdapter({ bucketName: process.env.S3_PERSISTENCE_BUCKET })
)
and
const attributesManager = handlerInput.attributesManager;
const sessionAttributes = await attributesManager.getPersistentAttributes() || {};
const temperature = sessionAttributes.hasOwnProperty('temperature') ? sessionAttributes.temperature : 0;
S3 isn't a database - S3 is object storage. If a database is what you need, you could use DynamoDB instead. It sounds like it is - you wouldn't normally use S3 for this.
Anyway, you won't be able to use the ASK SDK in an Angular or other non-Alexa-skill project. But you could connect to S3 (or DynamoDB) using the AWS SDK.
https://github.com/aws/aws-sdk-js
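For example, reading the skill's persisted attributes from another service might look like this (a sketch with aws-sdk v2, assuming S3PersistenceAdapter's default key scheme, where the object key is the Alexa user id; bucket and userId are placeholders):
// Sketch: fetch and parse the JSON attributes object the skill persisted
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

async function readPersistentAttributes(bucket, userId) {
  const obj = await s3.getObject({ Bucket: bucket, Key: userId }).promise();
  return JSON.parse(obj.Body.toString('utf-8')); // e.g. { temperature: 10 }
}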
You have three types of attributes for Alexa: request, session, and persistent. I noticed your variable is named sessionAttributes, but you're calling getPersistentAttributes.
Here's an example of how you'd use withPersistenceAdapter: https://www.talkingtocomputers.com/alexa-skills-kit-ask-sdk-v2#data-persistence
But here's an example if you use DynamoDB. It's simpler IMO:
const Alexa = require('ask-sdk');

module.exports.handler = Alexa.SkillBuilders.standard()
  .addRequestHandlers(/* your handlers */)
  .withTableName(/* your table name (string) */)
  .withDynamoDbClient()
  .lambda();
Then you could do something like (in an async function):
const att = await attributesManager.getPersistentAttributes()
const temperature = att.temperature ? att.temperature : 0
But of course, you need to save the attribute there first, if you want to access it. For example (in an async function):
const att = await attributesManager.getPersistentAttributes()
await attributesManager.setPersistentAttributes( { ...att, temperature: 10 }) // set the value
await attributesManager.savePersistentAttributes() // save it
You can use S3 buckets too; the syntax looks like this. They have get, set, and save like the persistent attributes in the example above; for more, you can go to the Amazon docs at the bottom.
const { S3PersistenceAdapter } = require('ask-sdk-s3-persistence-adapter');
const s3PersistenceAdapter = new S3PersistenceAdapter({ bucketName: 'FooBucket' });
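To wire that adapter into a skill, the builder call looks like this (a sketch with ask-sdk-core; the handlers are elided):
// Sketch: plug the S3 adapter into the custom skill builder
const Alexa = require('ask-sdk-core');

exports.handler = Alexa.SkillBuilders.custom()
  .withPersistenceAdapter(s3PersistenceAdapter)
  .addRequestHandlers(/* your handlers */)
  .lambda();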

Proper location for public/private keys in my React development app?

In the dev environment for my React app, I have a set of public/private keys that I need to access an API. Ideally, I'd like to put these keys into their own file for gitignore purposes, but I'm not having luck with my code as shown below.
My helpers.jsx file is where the API data is called via a lightweight AJAX add-on, and I have the actual keys in the require declarations area:
var API_KEY = require('./keys.jsx');
var PRIV_KEY = require('./keys.jsx');
Summarily, my keys.jsx file (stored in the same subfolder as the helpers.jsx) consists of the following:
module.exports = {
  API_KEY: 'myactualpublickey',
  PRIV_KEY: 'myactualprivatekey'
};
However, my app does not like this setup, as I get a "Failed to load resource: the server responded with a status of 401 (Unauthorized)" error message, and the API call isn't successful because the necessary keys are not included.
When I replace the require('./keys.jsx') calls in the helpers.jsx file with the actual keys, the API call works fine.
Any help or guidance would be most appreciated. Thanks.
You're exporting an object with properties called API_KEY and PRIV_KEY, so try this:
var API_KEY = require('./keys.jsx').API_KEY;
var PRIV_KEY = require('./keys.jsx').PRIV_KEY;
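Or, equivalently, pull both values out with destructuring in a single statement:
var { API_KEY, PRIV_KEY } = require('./keys.jsx');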

Hosting a NodeJS app with Firebase

So I have this web app using AngularJS and NodeJS. I don't want to just use localhost to demo my project, because it doesn't look cool at all when I type "node server.js" and then go to localhost...
Since I intend to use Firebase for the data, I have noticed that Firebase provides hosting. I tried it, but it seems to only host the index.html, not run it through/using server.js. I have customized files for the server to use/update. So, how can I tell Firebase Hosting to use my server and related files when hosting?
Is it possible to tell Firebase: hey, run "node server.js" to host my index.html?
I'm guessing by the way you are wording the question you want to see this site from "the internet".
Two routes you could go here.
a) Serve your index through Firebase Hosting. Firebase only hosts assets. If your Angular app is being served through Node, then you will need to change your architecture to be more SPA-ish.
SPA-ish would be like an index bootstrap that interacts with the backend purely through APIs.
You would host the API server on something more appropriate, like Nodejitsu.
b) Serve the whole thing through something like Nodejitsu (a hosting platform), or your very own VM managed by a different kind of hosting company, like BuyVM.net.
Another idea: if your NodeJS app is independent of the AngularJS app (though they share data and perform operations on that data model), you could separate the two and connect them only via Firebase.
Firebase Hosting -> index.html and the necessary AngularJS files.
Locally (your PC) -> server.js, which just connects to Firebase and triggers on changed data.
I have done this for a few projects and it's a handy way to access the outside world (the internet) while maintaining some semblance of security by not opening ports blindly.
I was able to do this to control a Chromecast at my house while at a friend's house.
Here's an example from my most recent project (I'm trying to make a DVR).
https://github.com/onaclov2000/webdvr/blob/master/app.js
var FB_URL = '';
var Firebase = require('firebase');
var os = require('os');

var myRootRef = new Firebase(FB_URL);

// Collect this machine's external IPv4 addresses
var interfaces = os.networkInterfaces();
var addresses = [];
for (var k in interfaces) {
  for (var k2 in interfaces[k]) {
    var address = interfaces[k][k2];
    if (address.family == 'IPv4' && !address.internal) {
      addresses.push(address.address);
    }
  }
}

// Push my IP to firebase
// Perhaps a common "devices" location would be handy
var ipRef = myRootRef.push({
  "type": "local",
  "ip": addresses[0]
});

myRootRef.on('child_changed', function(childSnapshot, prevChildName) {
  // code to handle child data changes.
  var data = childSnapshot.val();
  var localref = childSnapshot.ref();
  if (data["commanded"] == "new") {
    console.log("New Schedule Added");
    var schedule = require('node-schedule');
    var date = new Date(data["year"], data["month"], data["day"], data["hh"], data["mm"], 0);
    console.log(date);
    var j = schedule.scheduleJob(date, function(channel, program, length) {
      console.log("Recording Channel " + channel + " and program " + program + " for " + length + "ms");
    }.bind(null, data["channel"], data["program"], data["length"]));
    localref.update({ "commanded": "waiting" });
  }
});
When I change my "commanded" data at the FB_URL to "new" (which can be accomplished very simply from AngularJS, using an ng-click operation for example), it'll schedule a recording for a particular date and time (not all of it is actually functional at the moment).
I might be late, but since 3 years have passed, there is a solution available now from Firebase in the form of Cloud Functions.
It's not straightforward, but it looks promising if one can refactor their code a bit.
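A minimal sketch of that approach, assuming the server.js logic can be reworked into an Express app (the route below is a placeholder):
// Sketch: serving dynamic content through Cloud Functions for Firebase
const functions = require('firebase-functions');
const express = require('express');

const app = express();
app.get('/api/hello', (req, res) => res.send('hello')); // placeholder route

// Exposed as an HTTPS function; Hosting can rewrite requests to it
exports.app = functions.https.onRequest(app);
In firebase.json, a hosting rewrite like { "source": "**", "function": "app" } then routes page requests to the function, while static assets are still served by Firebase Hosting.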
