I am using Ext JS 3.2.0 and I have an Ext.grid.EditorGridPanel backed by an Ext.data.Store. The store has the restful flag enabled and uses Ext.data.JsonReader and Ext.data.JsonWriter. It works great for retrieving data and populating the grid. However, when I add or update a record, the JSON produced for the POST/PUT has the data nested under a root field, which does not match what the service I am calling expects: it wants a flat object. For example, when I add or update a record, the JSON produced looks something like this:
{
  "data": {
    "name": "TEST",
    "id": "-1"
  }
}
But I need it to be
{
  "name": "TEST",
  "id": "-1"
}
Any ideas?
Thanks,
John
I don't know if it's the best approach, but I ended up creating my own Ext.data.Connection object and making the request where I needed it, for example on delete. Not the solution I was hoping for, but it works.
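As an alternative to a hand-rolled Connection: in Ext JS 3.x the commonly cited fix is to override Ext.data.JsonWriter's render hook so that params.jsonData is set to the record data itself instead of an object keyed by the writer's root (treat that override as an assumption to verify against your Ext version). The plain-JavaScript sketch below only illustrates the payload difference, without depending on Ext:

```javascript
// Plain-JS illustration only: the stock JsonWriter nests the record
// under its configured root ("data" by default), while a flat body
// posts the record itself.
function nestedBody(root, record) {
  const body = {};
  body[root] = record; // what the default writer produces
  return JSON.stringify(body);
}

function flatBody(record) {
  return JSON.stringify(record); // what the service expects
}

const record = { name: "TEST", id: "-1" };
```
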
I'm building a Zapier app for a platform that has dynamic fields specific to the individual user.
My API returns this response to a GET request:
[
  {
    "title": "003 Best Selling Book",
    "id": "d86cbdf41be958336f1221a2211c3f65",
    "date": "03/25/2021"
  },
  {
    "id": "b844eaa3da5c476ab7337e83775777e0",
    "title": "002 Best Selling Book",
    "date": "03/26/2021"
  }
]
The response is received by Zapier successfully,
[Screenshot: Response received by Zapier]
but it only shows the first item in the JSON array.
[Screenshot: Only one object in my array shown]
When I go to test my trigger, it only shows me the one object in my array and gives me a MISSING VALUE! error.
[Screenshot: Missing Value Error in Zapier]
Does anyone know how to fix this?
I'm trying to set up a Dropdown Type that is Dynamic and uses a Trigger to get the JSON object that populates the dropdown.
[Screenshot: The settings for my dropdown in the Form Editor of the Zapier Platform Input Designer]
I tried looking for example code in the Zapier GitHub, on Stack Overflow, and elsewhere on the web that showed example JSON responses for Zapier Actions, Triggers, and Dynamic Dropdowns, but couldn't find any.
Per the docs, each object in your returned JSON needs id and name properties.
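As a sketch of what that means for the response above (plain JavaScript; deriving the name label from title is an assumption about this particular API, since Zapier only dictates the id/name output shape):

```javascript
// Reshape the API response so every object exposes the id and name
// keys a Zapier dynamic dropdown expects. The sample objects come
// from the question; mapping name from title is an assumption.
const response = [
  { title: "003 Best Selling Book", id: "d86cbdf41be958336f1221a2211c3f65", date: "03/25/2021" },
  { id: "b844eaa3da5c476ab7337e83775777e0", title: "002 Best Selling Book", date: "03/26/2021" }
];

const dropdownItems = response.map((item) => ({
  id: item.id,
  name: item.title // label shown to the user in the dropdown
}));
```
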
I have a workflow in Azure Logic Apps that executes an API call to get custom field values from a Trello card. Then I use a 'For each' step to get the definition (field name) of every custom field by calling another Trello API method. I want to create a JSON object based on the values I got from these API calls. The format of the JSON is as simple as:
{
  "name1": "value1",
  "name2": "value2",
  ...
}
Edit:
I have an initial object with this schema:
{
  "Cel": "",
  "City": "",
  "CreatedOn": "",
  "Lead": "",
  "Mail": "",
  "Brand": "",
  "Name": "",
  "Salesperson": ""
}
It's a simple JSON object with key:value pairs.
I want to set the object's properties inside a 'For each' loop. I got the field values from the Trello API (it returns an array like this):
[{"id":"5fbdb5da1626b7499d690ebf","value":{"date":"2020-11-09T15:00:00.000Z"},"idCustomField":"5fbdb5cee93a775e4a9405e5","idModel":"5fa53d647b35d5744fd8b856","modelType":"card"},{"id":"5fa946e31680220774095ed9","idValue":"5fa9464bd260145aacba03ed","idCustomField":"5fa9462327f2c331a184a483","idModel":"5fa53d647b35d5744fd8b856","modelType":"card"},{"id":"5fa946e0dafc5b7ead25411a","value":{"text":"Lilia Noemi Cabral"},"idCustomField":"5fa9454303ee497b7b99772a","idModel":"5fa53d647b35d5744fd8b856","modelType":"card"},{"id":"5fa946bf8697d18d58b73ac1","value":{"text":"Tania"},"idCustomField":"5fa94518f410418c57662fd0","idModel":"5fa53d647b35d5744fd8b856","modelType":"card"},{"id":"5fa946c3a2ee2771d26ba30e","value":{"text":"tania#gmail.com"},"idCustomField":"5fa945088fc819708d157c5b","idModel":"5fa53d647b35d5744fd8b856","modelType":"card"},{"id":"5fa946bc38b469081857ea9a","value":{"text":"Asuncion"},"idCustomField":"5fa944fe9d7af62b3806c8b6","idModel":"5fa53d647b35d5744fd8b856","modelType":"card"},{"id":"5fa946b82bbaf938c4c3c341","value":{"number":"098234567"},"idCustomField":"5fa944e59527342399f460e2","idModel":"5fa53d647b35d5744fd8b856","modelType":"card"},{"id":"5fa946b32aad4d323ba0648e","value":{"text":"Tania Cardozo Olivera"},"idCustomField":"5fa944d779089c6ca5cf534b","idModel":"5fa53d647b35d5744fd8b856","modelType":"card"}]
I added a 'For each' step to iterate over and parse each field value. I got the field name with another call to the Trello API, which returns this response:
{"id":"5fa94518f410418c57662fd0","idModel":"5f988b8cb225cc7ac34deae9","modelType":"board","fieldGroup":"53eb88d2e4cde013cb9f03e7500ad1a00a4c82f153f9165f7b1ec554f4cc2d74","display":{"cardFront":true},"name":"Name","pos":81920,"type":"text","limits":{"customFieldOptions":{"perField":{"status":"ok","disableAt":50,"warnAt":45}}},"isSuggestedField":false}
So I matched on idCustomField to get the field name and field value. Now I want to set the property in the initially declared object (each property name is equal to the custom field name in Trello).
I've tried using a compose step with:
setProperty(variables('Object'),body('Parse_JSON')?['name'],body('Parse_JSON2')?['text'])
And later set the variable to the compose output, but it doesn't work correctly: the resulting object doesn't have all the properties set. My idea is to set all the object's properties inside the 'For each' loop.
What could I be doing wrong?
Is there a way to achieve that in Azure Logic Apps?
Thanks in advance!
You should use
setProperty(variables('object'), body('Parse_JSON')?['name'], body('Parse_JSON_2')?['value']?['text'])
instead of
setProperty(variables('Object'),body('Parse_JSON')?['name'],body('Parse_JSON2')?['text'])
Update:
Your problem may be caused by the 'For each' loop running in parallel, so please try the steps below. (Note: after making this change, the designer sometimes shows an error when you click 'Save' on your logic app if you try to change it back, so please create another logic app for testing or make a backup copy first.)
Click the "Settings" of your "For each" loop.
Then enable Concurrency Control and set Degree of Parallelism as 1.
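To illustrate why sequential execution matters, here is a plain-JavaScript sketch of what the loop does: each iteration reads the variable's previous value and writes one more property, which mirrors the setProperty call on variables('object'). The field names and IDs below are taken from the question's sample payloads:

```javascript
// Plain-JS sketch of the loop: each iteration matches a field value
// to its definition by idCustomField and writes one property onto
// the accumulating object, mirroring setProperty(variables('object'),
// name, value).
const values = [
  { idCustomField: "5fa94518f410418c57662fd0", value: { text: "Tania" } },
  { idCustomField: "5fa945088fc819708d157c5b", value: { text: "tania#gmail.com" } }
];
const definitions = {
  "5fa94518f410418c57662fd0": "Name",
  "5fa945088fc819708d157c5b": "Mail"
};

let result = { Name: "", Mail: "" };
for (const v of values) {
  const name = definitions[v.idCustomField];
  // Each step reads the previous value of `result`; if iterations ran
  // in parallel against a shared variable, updates could be lost,
  // hence Degree of Parallelism = 1.
  result = { ...result, [name]: v.value.text };
}
```
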
Intro
I have a Firestore database similar to a social media DB, with 3 collections: Users, Events, and EventUpdates. My goal is to create a feed of eventUpdates created by me and my friends, so I have to extend my database with friendship connections. But I'm struggling with 3 problems, and hopefully somebody here can push me in the right direction to solve them.
Problem/Question 1:
I added the username and user image to the EventUpdate model so it's easier to query. I've heard denormalizing is the way to go in a NoSQL database. But if a user updates his user image, I have to update all eventUpdates created by that user. Sounds like something you don't want to do. But is there a better way to do this?
Problem/Question 2:
How can I create a data structure that is optimized for the following query: get eventUpdates from me and my friends, ordered by date?
Problem/Question 3:
How do I store likes? I can keep a counter in an eventUpdate, but this becomes a problem when I denormalize eventUpdates (see my current solution under EDIT).
Data structure example:
{
  "users": {
    "1": { "name": "Jack", "imageUrl": "http://lorempixel.nl" }
  },
  "events": {
    "A": {
      "name": "BeerFestival",
      "date": "2018/09/05",
      "creatorId": "1"
    }
  },
  "eventUpdates": {
    "1": {
      "timestamp": "13243543",
      "creatorId": "1",
      "creatorName": "Jack",
      "creatorImageUrl": "http://lorempixel.nl",
      "eventId": "A",
      "message": "Lorem ipsum"
    }
  }
}
EDIT
OK, after some trial and error I ended up with the following structure. This structure seems to work, but my problem with this solution is that I need to make a lot of write calls to update a single eventUpdate because of all the copies in each feed (1,000 followers means 1,000 copies). And it looks like I need to do that a lot.
I would like, for example, to add a like button to each event update. That triggers an update on all EventUpdate copies. To me it looks like Firebase is not suited to my project and I'm thinking of replacing it with a SQL DB, or can anyone here change my mind with a better solution?
{
  "users": {
    "user1": {
      "name": "Jack",
      "imageUrl": "http://lorempixel.nl",
      "followers": ["user1"]
    }
  },
  "feeds": {
    "user1": {
      "eventUpdates": {
        "1": {
          "timestamp": "13243543",
          "creatorId": "1",
          "eventId": "A",
          "message": "Lorem ipsum"
        }
      },
      "following": {
        "user1": {
          "name": "Jack",
          "imageUrl": "http://lorempixel.nl",
          "followers": ["user1"]
        }
      }
    }
  },
  "events": {
    "A": {
      "name": "BeerFestival",
      "date": "2018/09/05",
      "creatorId": "1"
    }
  }
}
I added the username and user image to the EventUpdate model so it's easier to query. I've heard denormalizing is the way to go in a NoSQL database.
That's right, denormalization is a common practice when it comes to Firebase. If you are new to NoSQL databases, I recommend you watch the video Denormalization is normal with the Firebase Database for a better understanding. It is about the Firebase Realtime Database, but the same rules apply to Cloud Firestore.
But if a user updates his user image, I have to update all eventUpdates created by that user. Sounds like something you don't want to do. But is there a better way to do this?
Yes, that's also correct. You need to update every place where that image exists. Because you have chosen google-cloud-firestore as a tag, I recommend you see my answer from this post, because with many write operations Firestore might be a little costly. Please also see the Firestore pricing plans.
Regarding Firestore, instead of holding an entire object you can hold only a reference to the picture. In that case there is nothing you need to update. It's always a trade-off between these two techniques, and unfortunately there is nothing in between: you either hold objects or only references to objects. For that, please see my answer from this post.
How can I create a data structure that is optimized for the following query: get eventUpdates from me and my friends, ordered by date?
As far as I can tell, your schema looks more like a Firebase Realtime Database schema than a Cloud Firestore one. And to answer your question: yes, you can. Speaking of Firestore, you can create a collection named eventUpdates that holds eventUpdate objects, and to order it by timestamp you need a query like this:
FirebaseFirestore rootRef = FirebaseFirestore.getInstance();
CollectionReference eventUpdatesRef = rootRef.collection("eventUpdates");
Query query = eventUpdatesRef.orderBy("timestamp", Query.Direction.ASCENDING);
But please note that the timestamp field should be of type Date and not long. Please also take a look at my answer from this post to see how you can add a date property in a Cloud Firestore database.
How do I store likes? I can keep a counter in an eventUpdate, but this becomes a problem when I denormalize eventUpdates (see my current solution under EDIT).
You can simply add likes, but I recommend you see the last part of my answer from this post. You might consider keeping that count in the Firebase Realtime Database rather than in Cloud Firestore; the two databases work very well together.
This structure seems to work, but my problem with this solution is that I need to make a lot of write calls to update a single eventUpdate because of all the copies in each feed (1,000 followers means 1,000 copies). And it looks like I need to do that a lot.
You might also take a look at my answer from this post.
To me it looks like Firebase is not suited to my project and I'm thinking of replacing it with a SQL DB, or can anyone here change my mind with a better solution?
I don't see it that way. There are many apps out there with the exact same mechanism as yours that are working very well.
If you want your feed items to stay in sync with the real user data (for example, a new profile image when the user changes it), you can simply store the user ID in the eventUpdate document. That way you don't have to keep anything in sync manually: every time you display an item in the feed, you fetch the user data, and you can easily query many eventUpdates on the userId and created_at fields (assuming you have them).
To implement likes in your feed, the right solution depends on a number of things, such as traffic volume.
The simplest way is to update a likes field in a transaction, but Firestore caps the update frequency on a single document at roughly once per second. Plus, a transaction can easily fail if more than 5 transactions are trying to update the same document concurrently.
To implement a more solid likes system take a look at this page from the official Firebase docs.
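The distributed-counter idea from that docs page can be sketched without the SDK: spread the writes across several shard documents and sum them on read. The shard count below is illustrative, and this is a plain-JavaScript stand-in rather than Firestore code:

```javascript
// Plain-JS stand-in for a sharded counter: likes are written to a
// random shard (in Firestore, one increment per shard document),
// and the total is the sum of all shards. NUM_SHARDS is illustrative.
const NUM_SHARDS = 10;
const shards = new Array(NUM_SHARDS).fill(0);

function like() {
  const i = Math.floor(Math.random() * NUM_SHARDS);
  shards[i] += 1; // spreads write load across documents
}

function likeCount() {
  return shards.reduce((sum, n) => sum + n, 0); // read = sum of shards
}

for (let k = 0; k < 25; k++) like();
```
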
Firestore takes a different approach to the NoSQL world. Once you know the data you will use (as you already do), there are some very important decisions about how the data should be structured. A lot depends on how the data grows, what kinds of queries you will need, and how often you will run them. In some cases you can create a root collection that aggregates data, which can make queries easier.
There is a great video from the Firebase channel that might help. Check it out!
How to Structure Your Data | Get to Know Cloud Firestore #5
https://www.youtube.com/watch?v=haMOUb3KVSo
[UPDATED] December 26th
Other videos that might help you model and query your data:
How to Connect Firebase Users to their Data - 3 Methods
https://www.youtube.com/watch?v=jm66TSlVtcc
How to NOT get a 30K Firebase Bill
https://www.youtube.com/watch?v=Lb-Pnytoi-8
Model Relational Data in Firestore NoSQL
https://www.youtube.com/watch?v=jm66TSlVtcc
I am setting up my site to use the Coinbase iframe for accepting payments.
I am testing using the sandbox.
Sometimes when I make a payment the callback to my server takes the form:
{
  "order": {
    "id": "YDWALXJW",
    "uuid": "2a6de442-be7b-5517-9b49-f00908460115",
    "resource_path": "/v2/orders/2a6de442-be7b-5517-9b49-f00908460115",
    "metadata": null,
    "created_at": "2015-12-06T16:58:02-08:00",
    "status": "completed",
    ...
and other times it looks like this:
{
  "id": "f08d1f11-27f9-5be2-87fd-e086d1b67cab",
  "type": "wallet:orders:paid",
  "data": {
    "resource": {
      "id": "309a20df-a8e6-532d-9a2b-3ce5ea754d6d",
      "code": "52N6TG58",
      "type": "order",
      ...
I realize this is probably just API v1 vs v2, but I don't understand why it seems to switch back and forth at random. Any ideas on how to make it use just v2?
Thanks.
Most likely you've entered the same URL as both a Notifications (v2) and Callback (v1) URL.
This is easy to do, given that there are 3 different places in the UI where you can provide either or both of the callback/notifications URLs:
Merchant Settings Page
Your API Key's Edit form
The Merchant Tools Generator
You'll receive a POST message for each place you've entered this URL. (I was able to get 5 unique POSTs in my testing!)
The right place to include the URL depends on your situation:
If you just want merchant notifications (paid orders, mispaid orders and payouts), put it in the Merchant settings page.
If you are building an app with functionality beyond merchant tools, and want a broader set of wallet notifications, put it in your API Key's Edit form.
For merchants, I would generally not recommend entering the URL for a button generated via option 3. Based on the title of your question, I'm guessing this is your situation.
You won't be able to view or edit this setting in the future. If you're re-using a static button you previously generated and think you included a URL there that you'd like removed, you'll need to replace the button by generating a new one.
I hope that helps!
This question may be too broad/conceptual for the SO community, but I'll give it a shot.
Quick Project Overview:
I have a project that consists of a front-end application requesting data from a database via Angular $http requests. Each request is pretty much mapped one-to-one to a controller that visualizes the data specified by that request.
For example, I can specify keywords over a certain timeframe with:
get/A/kwords/?year=2013&month=9
and receive:
[
  {"kword": "a", "count": 100},
  {"kword": "b", "count": 200},
  ...
]
which I then plug in to a d3 directive.
The Problem:
I've reached the point in the project where I'm forced to give extra work either to the people developing the backend or to those on the frontend. As the app currently stands, the database sends large chunks of JSON that the front end then has to transform into the formats required by the different d3 directives. For example, some responses include excess data that the front end needs logic to standardize before it enters the directives.
This is logic that I don't think the front end should be forced to handle. In my mind, the front end should only interact with the request parameters, not the format of the actual data coming in. I think it makes more sense for the backend to serve data in consistent formats depending on the URL params.
For example, instead of the backend serving up data formatted as such:
/get/B/kwords/?year=2013&month=9&limit=6
[
  {
    "kword": "a",
    "data": [{"impressions": 100, "clicks": 150, "conversions": 200}]
  },
  {
    "kword": "b",
    "data": [{"impressions": 50, "clicks": 60, "conversions": 70}]
  },
  ...
]
and forcing the front end to pick apart this array-of-objects-of-arrays, I should be able to specify a data=impressions parameter in the request:
/get/B/kwords/?year=2013&month=9&limit=6&data=impressions
[
  {
    "kword": "a",
    "data": 100
  },
  {
    "kword": "b",
    "data": 50
  },
  ...
]
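The proposed projection is easy to sketch on the backend side (plain JavaScript; the function and parameter names are made up for illustration, and the sample rows come from the example above):

```javascript
// Sketch of the proposed server-side projection: given the nested
// response shape from the question and a "data" query parameter,
// return the flat form. This is illustrative, not a real endpoint.
const rows = [
  { kword: "a", data: [{ impressions: 100, clicks: 150, conversions: 200 }] },
  { kword: "b", data: [{ impressions: 50, clicks: 60, conversions: 70 }] }
];

function project(rows, field) {
  return rows.map((row) => ({
    kword: row.kword,
    data: row.data[0][field] // collapse the inner array to one metric
  }));
}

const flat = project(rows, "impressions");
```
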
Does this make sense/is this a reasonable request?
I was in a similar situation, and I initially went the route where the backend handles the filtering and the front end just binds data to d3.
The problem is that this was very slow. Each $http request took 1-3 seconds, so the filtering experience was poor: you had to click a filter and then wait to see a response.
It's actually much easier to send as much data as possible to the front end and do the filtering there. While the initial page load takes a bit longer, filtering is then instant. I ended up rewriting both the backend and the front end to do the work on the front end. I made the initial data sent from the backend as flat as possible, then iterated through that array and pushed the relevant data into properties on a JavaScript object to transform it quickly.
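The flatten-then-transform step described above can be sketched like this (plain JavaScript; the row shape is illustrative, not the actual payload):

```javascript
// Sketch of the flat-then-transform approach: the backend sends one
// flat array, and the client groups it into a lookup keyed by keyword
// so later filtering is instant.
const flatRows = [
  { kword: "a", metric: "impressions", count: 100 },
  { kword: "a", metric: "clicks", count: 150 },
  { kword: "b", metric: "impressions", count: 50 }
];

const byKword = {};
for (const row of flatRows) {
  if (!byKword[row.kword]) byKword[row.kword] = {};
  byKword[row.kword][row.metric] = row.count; // one pass over the array
}
```
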
If I were to do this project again, I would try exploring the libraries dc.js and crossfilter to avoid writing some of my filtering logic.
These are examples of just how fast filtering can be on the client side:
http://dc-js.github.io/dc.js/
http://square.github.io/crossfilter/