CDAP Pub/Sub Realtime Pipeline MAP Datatype - google-cloud-pubsub

I'm trying to pull messages from a Pub/Sub subscription using a CDAP realtime pipeline.
I can connect Pub/Sub up, but the attributes column is coming through as a MAP datatype and I seem unable to do anything with it (I need the data in it).
The idea is to take that message and place it in a database for further processing.
Is there any way to take the MAP data type and convert it to something useful?

You can convert the column using the JavaScript transform plugin:
output.attributes = JSON.stringify(output.attributes);
This will convert the map type to a string.
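For context, a full transform script might look roughly like the sketch below. It assumes the plugin's standard transform(input, emitter, context) entry point and an output schema in which attributes is declared as a string; the other field names are only illustrative and should be matched to your source's actual output schema.

function transform(input, emitter, context) {
  var output = {};
  // Copy the fields you want to keep from the incoming Pub/Sub record
  // (field names here are placeholders - use your source's output schema).
  output.message = input.message;
  output.id = input.id;
  output.timestamp = input.timestamp;
  // Flatten the MAP of attributes into a JSON string so it can be
  // written to an ordinary string column in the database sink.
  output.attributes = JSON.stringify(input.attributes);
  emitter.emit(output);
}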

Related

TanStack Table (React-Table) | How to convert the column filter state into the format that the backend is expecting?

I'm trying to implement server-side column filtering for a table in my React app, but I'm having trouble converting the "ColumnFiltersState" object that I'm using on the frontend into the format that the backend (which uses the "nestjs-paginate" library) is expecting. The "ColumnFiltersState" object contains a "value" field that can be of type "unknown" and I'm not sure how to handle it. I have a couple of possible solutions in mind:
One solution could be to use the "filterFn" property for each column, and pass in the filter operator that the backend is expecting (e.g. '$eq', '$gt', etc.) along with the value.
Another approach would be to define a separate mapping function that maps the "ColumnFiltersState" object into the format that the backend is expecting, using the appropriate filter operator and value for each column, but then how would we know which operator to use? Maybe add a custom meta prop to the columnDef.
Can anyone provide some guidance on how to correctly map the column filters for the backend, and which solution would be the best approach? I would really appreciate any feedback or advice, and even better if there is example code to help me understand the solution.
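For what it's worth, a rough (untyped) sketch of the second approach, with the operator kept in a custom meta prop on each columnDef, could look like the following. buildFilterParams and meta.filterOperator are made-up names for illustration, not part of TanStack Table or nestjs-paginate; nestjs-paginate reads filters from query params of the form filter.<column>=<operator>:<value>.

// Map TanStack's ColumnFiltersState ({ id, value } entries) into
// nestjs-paginate style query params, e.g. { 'filter.age': '$gte:30' }.
function buildFilterParams(columnFilters, columns) {
  const params = {};
  for (const { id, value } of columnFilters) {
    if (value === undefined || value === null || value === '') continue;
    const column = columns.find((c) => c.id === id || c.accessorKey === id);
    // Fall back to equality when a column does not declare an operator.
    const operator = (column && column.meta && column.meta.filterOperator) || '$eq';
    params['filter.' + id] = operator + ':' + value;
  }
  return params;
}

// Example column definition carrying the operator in the custom meta prop:
// { accessorKey: 'age', header: 'Age', meta: { filterOperator: '$gte' } }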

How to parse response from custom connector

I have an action call on my custom connector that returns JSON with data:
{
  "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#users(mail,displayName,department)/$entity",
  "mail": "mail@company.com",
  "displayName": "First Last",
  "department": "DPE-DES-Platform Services"
}
I want to parse this response and store 'department' into a variable that I can use for another action call as a parameter. How would I do this? Sorry if this is elementary, I'm very new to PowerApps.
I have like 10 calls I need to make with the Graph API and I only want to display the end result, so I need some way to store information in variables. If there was some way I could interact with the information through code that would be great, because I also need to do things like create data structures and modify variables if possible.
There is currently no way to directly parse a JSON response in Power Apps. You can use Power Apps with Power Automate to parse the JSON response by using the "Parse JSON" action.
Reference: Solved: Parse JSON string in Power APPS - Power Platform Community (microsoft.com)
You can also upvote the feature request here:
Parse JSON in PowerApps - Power Platform Community (microsoft.com)
For more details on Power Apps: Microsoft Power Apps documentation - Power Apps | Microsoft Docs
You might try using dot (.) notation on the end of your Custom Connector call. Depending on the shape of the actual response, you can parse out a value that way.
Something like:
Set(varDept,
ccCall.GetDept({whatever:criteria}).department
)

Is Firestore a good database for storing many large objects in?

I have been using React/Leaflet to create a choropleth map that can color any country on the map. What I am trying to do is develop a save/load function that saves the colored countries and can later import them from the database. When this object is saved and loaded, it should bring back exactly the same countries that were colored. I have been using Firebase/Firestore, but I haven't had any luck.
This is what my object of map data looks like:
Is Firestore the right database for this? Or should I use another database? I need a database that can store multiple objects like the one in the picture above.
If you can convert that file into a JSON document smaller than 1 MB, which is the Firestore document size limit, it is possible. In the case you are proposing I would use the following structure, based on the information you shared, but feel free to adapt it as you see fit:
Map Collection
  field1
  ...
  fieldN
  CountriesOptions Subcollection
    optionObject: {}
Where each option object is a separate document in the CountriesOptions subcollection, converted to JSON using JSON.stringify(obj).
For more information on how to structure your Firestore with subcollections you can check this link to the documentation.
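As an illustration, saving and loading that structure with the Firebase v9 modular SDK could look roughly like this; the maps / countriesOptions names simply follow the structure above, and db is your initialized Firestore instance:

import { collection, doc, getDocs, setDoc } from 'firebase/firestore';

// Save: one document per colored country, stringified so each document
// stays well under the 1 MB limit.
async function saveMap(db, mapId, countriesOptions) {
  for (const [countryCode, option] of Object.entries(countriesOptions)) {
    await setDoc(doc(db, 'maps', mapId, 'countriesOptions', countryCode), {
      optionObject: JSON.stringify(option),
    });
  }
}

// Load: read the subcollection back and parse each document.
async function loadMap(db, mapId) {
  const snapshot = await getDocs(collection(db, 'maps', mapId, 'countriesOptions'));
  const countriesOptions = {};
  snapshot.forEach((d) => {
    countriesOptions[d.id] = JSON.parse(d.data().optionObject);
  });
  return countriesOptions;
}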

Selenium Webdriver- Which is the better way to get data from an external data file

I am just trying to log in to a web application, fill out the input criteria (10 text fields), and click submit. I am getting the data from XML.
My doubt here is that we can get input data from Excel, XML, JSON, etc., but which is better, more efficient, and lightweight? Please suggest.
You can use different approaches. For example:
If you want a representation of an object (you have an object structure), you may store the information in XML/JSON, load the data into an entity, and pass the data into the page using this entity (see the sketch after this list).
If you just want to load data and don't want to use objects, or your data is unstructured (it can't be represented as an object), you may use txt/csv/Excel.
Something else (depends on your situation).
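If it helps, here is a rough sketch of the JSON variant, assuming a Node.js selenium-webdriver setup; the file name, field names, and locators are placeholders for illustration:

const fs = require('fs');
const { Builder, By } = require('selenium-webdriver');

// Test data kept outside the test, e.g. login-data.json:
// { "url": "https://example.com/login", "fields": { "username": "user1", "password": "secret" } }
const testData = JSON.parse(fs.readFileSync('login-data.json', 'utf8'));

async function fillAndSubmitForm() {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get(testData.url);
    // Fill each text field named in the data file.
    for (const [fieldName, value] of Object.entries(testData.fields)) {
      await driver.findElement(By.name(fieldName)).sendKeys(value);
    }
    await driver.findElement(By.id('submit')).click();
  } finally {
    await driver.quit();
  }
}

fillAndSubmitForm();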

Serialize Old App Engine 'db' Queryset into JSON

I'm just working on a rather dated project on App Engine that still uses the old 'db' model format instead of 'ndb'.
What would be the simplest way to serialize a 'db' query into JSON?
For example:
sections = Section.all() >>> JSON
All of the methods that I found from a Google search use the to_list method of 'ndb' models.
Thanks!!!
A quick read of the docs (you have done that?) turns up to_dict (https://cloud.google.com/appengine/docs/python/datastore/functions?hl=en#to_dict), which allows you to transform a model entity into a dictionary. Dictionaries can be transformed to JSON (unless they contain Decimal values and a few other types, but I am sure you can work around that). Then just iterate over the query result, producing a list of dicts, which you can pass to json.dump(thelist).
However, if you have a lot of entities you will have to take some additional steps, but you can read the docs to work that out.
