I recently started learning SvelteKit and I adapted a small CRUD application that I made with Express and PostgreSQL, but I am not able to correctly pass the database data to the frontend. This is the code I have written so far:
var query_output = undefined;

export const load = () => {
  return { query_output };
};

export const actions = {
  default: async (event) => {
    let form_input = await event.request.formData();
    let query = {
      text: "select * from ux_return_shipments($1, $2, $3, $4)",
      values: [form_input.get('beg-dep-date') || null, form_input.get('end-dep-date') || null, form_input.get('beg-arr-date') || null, form_input.get('end-arr-date') || null]
    };
    event.locals.pool.query(query)
      .then(result => {
        query_output = result.rows;
      });
  }
};
The problem is that the data sent to the frontend is not the same as the data extracted from the database. I suspect this is because the second export takes time while the first one runs immediately, so the 'query_output' variable sent to the frontend is always stale.
How can I sync the two steps? Currently I am using two different exports because the second one reads the data the user entered in a form and also accesses the database connection through the locals attribute.
How can I sync the two steps?
You don't. load is for loading page data without any user input; form actions are for responding to form submissions from user interactions. They are completely separate and do not share state.
The action should return its own data, which is passed to the page component as the form property. Data from load is passed to the data property.
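For example, a minimal sketch of the same action awaiting the query and returning the rows itself (the query and form field names are taken from the question):

export const actions = {
  default: async (event) => {
    const form_input = await event.request.formData();
    const query = {
      text: "select * from ux_return_shipments($1, $2, $3, $4)",
      values: [
        form_input.get('beg-dep-date') || null,
        form_input.get('end-dep-date') || null,
        form_input.get('beg-arr-date') || null,
        form_input.get('end-arr-date') || null
      ]
    };
    // Await the query so the rows exist before the action returns
    const result = await event.locals.pool.query(query);
    return { query_output: result.rows };
  }
};

In the page component, declare export let form; and render form?.query_output once a submission has been handled.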
I am using Express to create my Firebase functions, and I understand how to create regular callable functions. I am lost, however, on exactly how to implement background trigger functions (i.e. onCreate, onDelete, onUpdate, onWrite), as well as how React on the frontend is supposed to receive the data.
The scenario I have is a generic chat system that uses React, Firebase functions with Express, and the Realtime Database. I am generally confused about the process of using triggers so that, when someone sends a message, another user's frontend data is updated.
I have had a hard time finding a tutorial or documentation on the combination of these questions. Any links or a basic programmatic example of the life cycle would be wonderful.
The part I do understand is how to write a trigger function:
exports.makeUppercase = functions.database.ref('/messages/{pushId}/original')
  .onWrite((change, context) => {
    // Only edit data when it is first created.
    if (change.before.exists()) {
      return null;
    }
    // Exit when the data is deleted.
    if (!change.after.exists()) {
      return null;
    }
    // Grab the current value of what was written to the Realtime Database.
    const original = change.after.val();
    console.log('Uppercasing', context.params.pushId, original);
    const uppercase = original.toUpperCase();
    // You must return a Promise when performing asynchronous tasks inside a Function,
    // such as writing to the Firebase Realtime Database.
    // Setting an "uppercase" sibling in the Realtime Database returns a Promise.
    return change.after.ref.parent.child('uppercase').set(uppercase);
  });
But I don't understand how this is being called or how the data from this reaches frontend code.
Background functions cannot return anything to the client. They run in response to a certain event, onWrite() in this case. If you want to propagate the data at /messages/{pushId}/original to other users, then you'll have to use the Firebase Client SDK to listen to that path:
import { getDatabase, ref, onValue } from "firebase/database";

const db = getDatabase();
const msgRef = ref(db, `/messages/${pushId}/original`);
onValue(msgRef, (snapshot) => {
  const data = snapshot.val();
  console.log(data);
});
You can also listen to /messages/${pushId} with onChildAdded() to get notified about any new node under that path.
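A minimal sketch of that, assuming pushId is already known on the client as in the snippet above:

import { getDatabase, ref, onChildAdded } from "firebase/database";

const db = getDatabase();
const msgRef = ref(db, `/messages/${pushId}`);
onChildAdded(msgRef, (snapshot) => {
  // Fires once for each existing child and again whenever a new child
  // (for example the "uppercase" node written by the trigger) appears
  console.log(snapshot.key, snapshot.val());
});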
I'm developing an app using React Native that allows you to create your own checklists and add items to them.
For example you'd have "Create Checklist", and inside that you'll have the option to "Add Item", "Delete Item" "Edit Item", basic CRUD methods etc.
It's going to be completely offline but I'm wondering what the best approach to storing this data locally would be.
Should I be using a DB such as Firebase? I have read that it is overkill and that I should use something like Redux, but I'm not sure the latter will accomplish everything I need. As long as it stores data which can be edited, and saves it on the user's device (with minimal effort), it sounds good to me.
Would appreciate some input on this, thanks!
You could use AsyncStorage for persisting data locally on the user's phone. It is a simple persistent key-value storage.
Each checklist is most likely an array of JS objects. The documentation provides an example of how to store objects.
import AsyncStorage from '@react-native-async-storage/async-storage';

const storeData = async (value) => {
  try {
    const jsonValue = JSON.stringify(value);
    await AsyncStorage.setItem('@storage_Key', jsonValue);
  } catch (e) {
    // saving error
  }
};
The value parameter is any JS object. We use JSON.stringify to create a JSON string, and AsyncStorage.setItem to persist the data. The string '@storage_Key' is the key for the object; it could be any string.
We retrieve a persisted object as follows.
const getData = async () => {
  try {
    const jsonValue = await AsyncStorage.getItem('@storage_Key');
    return jsonValue != null ? JSON.parse(jsonValue) : null;
  } catch (e) {
    // error reading value
  }
};
Both examples are taken from the official documentation.
Keep in mind that this functionality should be used for persistence only. While the application is running, you should load the complete list, or parts of it if it is very large, into some sort of application cache. How to implement that depends heavily on what your current code looks like. If you have a plain view, you could read from local storage in an effect and store the result in local state.
import React, { useState, useEffect } from 'react';

function MySuperList() {
  const [list, setList] = useState([]);

  useEffect(() => {
    // Retrieve the persisted list with getData (above) and put it into state
    getData().then((value) => setList(value ?? []));
  }, []);

  // render list
  return (...);
}
I would implement some sort of save button for this list. If it is pressed, then we persist the data in the local storage of the phone.
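A minimal sketch of that, reusing storeData from above (Button comes from react-native; onSave is just a placeholder name):

import { Button } from 'react-native';

// inside MySuperList, next to the state declared above
const onSave = () => storeData(list);

// and somewhere in the rendered output
<Button title="Save checklist" onPress={onSave} />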
I am making a project - gift recommendation using React. On the result page, I am trying to save the values to the Firebase Realtime Database, from which I'll be fetching them later to show users their past recommendations. The problem is that Firebase writes the same values twice. What is the issue? Where am I going wrong? Just to be clear, I did a console log; the function runs just once, yet the value gets written twice.
Code:
const firebasesent = () => {
  if (user !== null) {
    const username = String(user.email).split("@")[0];
    const userRef = firebase.database().ref("Users/" + username);
    const recommendation = {
      name,
      age,
      gender,
      relation,
      ocassion,
      interest1,
      interest2,
      budget,
      title,
    };
    userRef.push(recommendation);
    setSentToFirebase(true);
  }
};
sentToFirebase is a state variable which I'm using to make sure that the function runs just once.
Firebase - (screenshot of the Firebase console showing the recommendation written twice)
A single call to userRef.push(recommendation); will result in only one write to the database. If you see the data showing up multiple times, the most likely reason is that firebasesent is being called multiple times. I recommend setting a breakpoint in the method, running in a debugger, and seeing where (else) it's called from.
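A minimal sketch of both checks, assuming sentToFirebase/setSentToFirebase come from a useState(false) hook as the question implies:

const firebasesent = () => {
  // Print a stack trace on every invocation to see where the extra call comes from
  console.trace("firebasesent called");

  // Skip repeat writes using the existing state flag
  if (user === null || sentToFirebase) return;

  const username = String(user.email).split("@")[0];
  const userRef = firebase.database().ref("Users/" + username);
  userRef.push(recommendation); // recommendation built exactly as in the question
  setSentToFirebase(true);
};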
I'm confused as to the appropriate way to access a bunch of images stored in Firebase Storage from a React/Redux/Firebase web app. In short, I'd love a walkthrough of, once a photo has been uploaded to Firebase Storage, how you'd go about linking it to a Firebase database (i.e. what exactly from the returned snapshot you'd store), how you'd access it (if it's not just <img src={data.downloadURL} />), and also how you'd handle (if necessary) updating that link when the photo gets overwritten. If you can answer that, feel free to skip the rest of this...
Two options I came across are to either
1. store the full URL in my Firebase DB, or
2. store something less, like the path within the bucket, then call downloadURL() for every photo... which seems like a lot of unnecessary traffic, no? (Rough sketch below.)
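For reference, option 2 would look roughly like this with the namespaced SDK (a sketch of what I mean; storedPath stands for whatever I'd keep in the DB instead of the full URL):

// given a stored path such as "projects/<someProjectId>/photo1.jpg"
firebase.storage().ref(storedPath).getDownloadURL()
  .then((url) => {
    // use the url, e.g. <img src={url} />
  });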
My db structure at the moment is like so:
{
  <someProjectId>: {
    imgs: {
      <someAutoGenId>: {
        "name": "photo1.jpg",
        "url": "https://<bucket, path, etc>token=<token>"
      },
      ...
    },
    <otherProjectDetails>: "",
    ...
  },
  ...
}
Going forward with that structure and the first idea listed, I ran into trouble when a photo was overwritten: I would need to go through the list of images and remove the db record that matches the name (or find it and update its URL). I could do this (at most, there would be two refs with the old token that I would need to replace), but then I saw people doing it via option 2, though not necessarily in my exact situation.
The last thing I did see a few times were similar questions with generic responses pointing to Cloud Functions, which I will look into right after posting, but I wasn't sure if that was overcomplicating things in my case, so I figured it couldn't hurt too much to ask. I initially saw/read about Cloud Functions and the fact that Firebase's db is "live", but wasn't sure if that played well in a React/Redux environment. Regardless, I'd appreciate any insight, and thank you.
In researching Cloud Functions, I realized that using Cloud Functions wasn't an entirely separate option, but rather a way to accomplish the first option I listed above (and probably the second as well). I really tried to make this clear, but I'm pretty confident I failed... so my apologies. Here's my (2-part) working solution to syncing references in the Firebase DB to Firebase Storage URLs (in a React/Redux web app, though I think Part One should be applicable regardless):
PART ONE
Follow along here https://firebase.google.com/docs/functions/get-started to get cloud functions enabled.
The part of my database with the info I was storing relating to the images was at /projects/detail/{projectKey}/imgs and had this structure:
{
  <autoGenKey1>: {
    name: 'image1.jpg',
    url: <longURLWithToken>
  },
  <moreAutoGenKeys>: {
    ...
  },
  ...
}
My cloud function looked like this:
exports.updateURLToken = functions.database.ref(`/projects/detail/{projectKey}/imgs`)
  .onWrite(event => {
    const projectKey = event.params.projectKey
    const newObjectSet = event.data.val()
    const newKeys = Object.keys(newObjectSet)
    const oldObjectSet = event.data.previous.val()
    const oldKeys = Object.keys(oldObjectSet)
    let newObjectKey = null

    // If something was removed, none of this is necessary - return
    if (oldKeys.length > newKeys.length) {
      return null
    }

    // Looking for the new object -> it will be missing in oldObjectSet
    for (let i = 0; i < newKeys.length; ++i) {
      const key = newKeys[i]
      if (oldKeys.indexOf(key) === -1) {
        // Found the new object
        newObjectKey = key
        break
      }
    }

    // Checking if the new object overwrote an existing object (same name)
    if (newObjectKey !== null) {
      const newObject = newObjectSet[newObjectKey]
      let duplicateKey = null
      for (let i = 0; i < oldKeys.length; ++i) {
        const oldObject = oldObjectSet[oldKeys[i]]
        if (newObject.name === oldObject.name) {
          // Duplicate found
          duplicateKey = oldKeys[i]
          break
        }
      }
      // Remove the duplicate
      if (duplicateKey !== null) {
        return event.data.ref.child(duplicateKey).remove((error) => error ? 'Error removing duplicate project detail image' : true)
      }
    }
    return null
  })
After loading this function, it would run every time anything changed at that location (projects/detail/{projectKey}/imgs). So I uploaded the images, added a new object to my db with the name and url, then this would find the new object that was created, and if it had a duplicate name, that old object with the same name was removed from the db.
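For context, the client-side step described above looks roughly like this (a sketch using the namespaced SDK of the same era; the storage path and function name are just illustrative):

// Upload an image, then record its name and url under projects/detail/{projectKey}/imgs
function uploadProjectImage (projectKey, file) {
  const storageRef = firebase.storage().ref(`projects/detail/${projectKey}/${file.name}`)
  return storageRef.put(file)
    .then(() => storageRef.getDownloadURL())
    .then((url) => firebase.database()
      .ref(`projects/detail/${projectKey}/imgs`)
      .push({ name: file.name, url }))
}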
PART TWO
So now my database had the correct info, but unless I refreshed the page every time images were uploaded, adding the new object to my database still left me (locally) with all the duplicate refs, and this is where the realtime database came into play.
Inside my container, I have:
function mapDispatchToProps (dispatch) {
  syncProjectDetailImages(dispatch) // the relevant line -> imported from api.js
  return bindActionCreators({
    ...projectsContentActionCreators,
    ...themeActionCreators,
    ...userActionCreators,
  }, dispatch)
}
Then my api.js holds that syncProjectDetailImages function:
const SAVING_PROJECT_SUCCESS = 'SAVING_PROJECT_SUCCESS'

export function syncProjectDetailImages (dispatch) {
  ref.child(`projects/detail`).on('child_changed', (snapshot) => {
    dispatch(projectDetailImagesUpdated(snapshot.key, snapshot.val()))
  })
}

function projectDetailImagesUpdated (key, updatedProject) {
  return {
    type: SAVING_PROJECT_SUCCESS,
    group: 'detail',
    key,
    updatedProject
  }
}
And finally, the dispatched action is handled in my modules folder (I used the same function I would when saving any part of an updated project with Redux - no new code was necessary).
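For anyone curious, that existing handler is roughly of this shape (a hypothetical sketch; my real state shape isn't shown here):

function projects (state = { detail: {} }, action) {
  switch (action.type) {
    case SAVING_PROJECT_SUCCESS:
      // Merge the updated project into the matching group ('detail' here)
      return {
        ...state,
        [action.group]: {
          ...state[action.group],
          [action.key]: action.updatedProject
        }
      }
    default:
      return state
  }
}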
Okay. I'm kinda new to React and I'm having a #1 major issue. Can't really find any solution out there.
I've built an app that renders a list of objects. The list comes from my mock API for now. The list of objects is stored inside a store. The store action to fetch the objects is done by the components.
My issue is with showing these objects. When a user clicks show, it renders a page with details on the object. Store-wise this means firing a getSpecific function that retrieves the object from the store based on an ID.
This is all fine, the store still has the objects. Until I reload the page. That is when the store gets wiped, a new instance is created (this is my guess). The store is now empty, and getting that specific object is now impossible (in my current implementation).
So, I read somewhere that this is by design. Are the solutions to:
Save the store in local storage, to keep the data?
Make the API call again and get all the objects once again?
And in case 2, when/where is this supposed to happen?
How should a store make sure it always has the expected data?
Any hints?
Some of the implementation:
//List.js
componentDidMount() {
  //The fetchOffers function will trigger a change event
  //which will trigger the listener in componentWillMount
  OfferActions.fetchOffers();
}

componentWillMount() {
  //Listen for changes in the store
  offerStore.addChangeListener(this.retrieveOffers);
}

retrieveOffers() {
  this.setState({
    offers: offerStore.getAll()
  });
}
.
//OfferActions.js
fetchOffers() {
  let url = 'http://localhost:3001/offers';
  axios.get(url).then(function (data) {
    dispatch({
      actionType: OfferConstants.RECIVE_OFFERS,
      payload: data.data
    });
  });
}
.
//OfferStore.js
var _offers = [];

receiveOffers(payload) {
  _offers = payload || [];
  this.emitChange();
}

handleActions(action) {
  switch (action.actionType) {
    case OfferConstants.RECIVE_OFFERS: {
      this.receiveOffers(action.payload);
    }
  }
}

getAll() {
  return _offers;
}

getOffer(requested_id) {
  var result = this.getAll().filter(function (offer) {
    return offer.id == requested_id;
  });
  return result[0];
}
.
//Show.js
componentWillMount() {
  this.state = {
    offer: offerStore.getOffer(this.props.params.id)
  };
}
That is correct: Redux stores, like any other JavaScript objects, do not survive a refresh. During a refresh you are resetting the memory of the browser window.
Both of your approaches would work, however I would suggest the following:
Save to local storage only information that is semi persistent such as authentication token, user first name/last name, ui settings, etc.
During app start (or component load), load any auxiliary information such as sales figures, message feeds, and offers. This information generally changes quickly and it makes little sense to cache it in local storage.
For 1. you can utilize the redux-persist library. It lets you save to and retrieve from your browser's local storage during app start. (This is just one of many ways to accomplish this.)
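A minimal sketch of that wiring (the reducer names here are placeholders; adapt it to your own root reducer):

import { createStore, combineReducers } from 'redux';
import { persistStore, persistReducer } from 'redux-persist';
import storage from 'redux-persist/lib/storage'; // localStorage on the web

// Placeholder reducers - substitute your own (auth token, user info, ui settings, ...)
const rootReducer = combineReducers({
  auth: (state = {}, action) => state,
  ui: (state = {}, action) => state,
});

const persistConfig = { key: 'root', storage };

export const store = createStore(persistReducer(persistConfig, rootReducer));
export const persistor = persistStore(store);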
For 2. your approach makes sense. Load the required data on componentWillMount asynchronously.
Furthermore, regarding being "up-to-date" with data: this entirely depends on your application needs. A few ideas to help you get started exploring your problem domain:
With each request to get offers, also send or save a time stamp. Have the application decide when a time stamp is "too old" and request again (see the sketch after this list).
Implement real time communication, for example socket.io which pushes the data to the client instead of the client requesting it.
Request the data at an interval suitable to your application. You could pass along the last time you requested the information and the server could decide if there is new data available or return an empty response in which case you display the existing data.
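A minimal sketch of the time-stamp idea from the first bullet (MAX_AGE_MS, lastFetchedAt and shouldRefetchOffers are made-up names; hook this into wherever the offers are received):

const MAX_AGE_MS = 5 * 60 * 1000; // treat offers older than five minutes as stale

let lastFetchedAt = null; // set this to Date.now() whenever new offers are stored

function shouldRefetchOffers () {
  return lastFetchedAt === null || Date.now() - lastFetchedAt > MAX_AGE_MS;
}

// e.g. in componentDidMount:
// if (shouldRefetchOffers()) { OfferActions.fetchOffers(); }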