Need to wait for heroku-connect SFID - angularjs

I'm facing a challenge using Heroku Connect with Salesforce. I insert a record into a parent object (Order) in the Postgres database and get the Postgres id back from the insert, but then I have to insert the children (order lines), and Heroku Connect hasn't yet pushed the insert to Salesforce, so I don't have the SFID I need to insert them.
What would you recommend I do? Do I re-query the field that tells me whether it's synced and refresh the $digest in Angular? Or do I do it on the API backend with an interval? I'm a little lost on which route to take.
I'm using the streaming API, but I still can't get the SFID from the callback when I do the insert:
rows: [ { id: 85, sfid: null } ],
EDIT
Got this from Heroku support; it works great:
https://devcenter.heroku.com/articles/herokuconnect#relationships-between-objects
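The approach in that article relates child rows to the parent through an external ID rather than waiting for the sfid. A rough sketch with node-postgres, assuming an external ID field has been added to the mappings; the column names external_id__c and order__external_id__c are placeholders that depend on how the mapping is configured:

const { Pool } = require('pg');
const { randomUUID } = require('crypto'); // Node 14.17+

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function createOrderWithLines(order, lines) {
  // Client-generated external ID; Heroku Connect pushes it to Salesforce
  // and fills in the sfid column later on its own schedule.
  const externalId = randomUUID();

  await pool.query(
    'INSERT INTO salesforce.order (name, external_id__c) VALUES ($1, $2)',
    [order.name, externalId]
  );

  // The order lines reference the parent by external ID instead of sfid,
  // so they can be inserted immediately, before the sync completes.
  for (const line of lines) {
    await pool.query(
      'INSERT INTO salesforce.order_line (name, order__external_id__c) VALUES ($1, $2)',
      [line.name, externalId]
    );
  }
}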

We ran into this issue when we were evaluating Heroku Connect, and we ended up writing our own sync due to several limitations of our data model.
In your case, before saving the record into the Postgres database, I would suggest sending a REST API call to SFDC to get the SFDC Id, and only then persisting the record into Postgres. This keeps the data model consistent on both the SFDC and Postgres sides. At the same time, API call limit utilization will be relatively low, since you will only be using the API for record creation.
As you are using Heroku Connect (not a standalone sync app), I would not recommend putting a backend scheduler into your web app to control the refresh of the populated Ids. The logic would be tightly coupled and might make the system painful to support in the future.
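If you go that route, a minimal sketch could look like the following, assuming jsforce for the REST call and node-postgres for persistence; the Order object, table, and field names are placeholders:

const jsforce = require('jsforce');
const { Pool } = require('pg');

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const conn = new jsforce.Connection({ loginUrl: 'https://login.salesforce.com' });

async function createOrder(order) {
  // Password + security token, as Salesforce expects for username/password login.
  await conn.login(process.env.SF_USER, process.env.SF_PASS + process.env.SF_TOKEN);

  // One API call per created record, so API limit utilization stays low.
  const result = await conn.sobject('Order').create({ Name: order.name });

  // Persist locally only after Salesforce has assigned the Id, keeping the
  // Postgres and Salesforce sides consistent.
  await pool.query(
    'INSERT INTO orders (name, sfid) VALUES ($1, $2)',
    [order.name, result.id]
  );
  return result.id;
}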

Related

Sequelize returning old data from SQL Server

Problem: what I'm sometimes seeing is that very old data is being returned from my various databases when I do SELECT operations (findAll).
I have a Vue.js app in which I'm using Axios to call my backend Express app, which uses Sequelize to connect to several (4) SQL Server databases. I'm also using the Vue.js developer tools, so I can see the variables and the Vuex store change in real time on the frontend, and console.log on the backend so I can see some things there too.
In searching for solutions, I have found that:
I'm not using transactions; all of my queries are single queries that return results to the application, so I don't (shouldn't) have issues with commit timing; i.e. I UPDATE or ADD a row and only after it returns a result do I then make a SELECT
Sequelize keeps connections open, and so old connections show as "Sleeping" in the DB; I see these do get reused over time
I'm using the default isolation level in SQL Server which is READ COMMITTED and which should return committed data; since I'm waiting for the first query to return a result before launching the second query, shouldn't it be committed and ready to get me the latest values?
I see that SQL Server CAN store old copies of rows, sometimes making SNAPSHOTS, but I think they require higher isolation levels - but I'm not sure about that; maybe it IS keeping old versions?
I've been unsuccessful in figuring out how to close and reopen Sequelize connections (see the sketch after this question). I'd like to close the connections at logout and reconnect when the user logs back in, since the app is still running in the tab (assuming this would solve my old-data problem), so that all of the DB connections are available again after a re-login. I can't figure it out from the available docs.
I think Sequelize is using old connections and somehow it is that which is causing SELECT results to be stale - this is even more likely if SQL Server is keeping old versions of rows
For some reason, if I log out and do a hard refresh of my app (CTRL+F5), all data is fresh; I don't understand why this would be at all. What could the browser be holding on to? Note that the console logs on the server app are always consistent with whatever the frontend shows: if the backend console log is stale, the frontend is stale; if the backend is fresh, the frontend is too.
I am unable to duplicate the getting of stale data using SSMS or Postman
Question: So what are the possible ways I can be getting stale data when I'm using Axios on the frontend and Sequelize on the backend?
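Regarding the part about closing and reopening Sequelize connections, here is a minimal sketch (connection settings are placeholders). Once close() has been called an instance cannot be reused, so a fresh one has to be created:

const { Sequelize } = require('sequelize');

function createConnection() {
  return new Sequelize(process.env.DB_NAME, process.env.DB_USER, process.env.DB_PASS, {
    host: process.env.DB_HOST,
    dialect: 'mssql',
    pool: { max: 5, min: 0, idle: 10000 },
    logging: console.log // log the exact SQL sent, to rule out caching in the app layer
  });
}

let sequelize = createConnection();

async function resetConnections() {
  // close() drains and ends the pool; a closed instance cannot be reopened,
  // so build a fresh instance (and re-define/re-import models against it).
  await sequelize.close();
  sequelize = createConnection();
}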

Which is the best option to fetch data from Mongodb database?

Sorry for a general question. My situation is this: I have a MongoDB database and two ReactJS pages. On each page I want to fetch different information from the database. Based on your practice, what is the best way to fetch data from MongoDB in a ReactJS component?
I would recommend reading up on the MERN stack - there are tons of guides available online via Google and YouTube. The gist is that a typical web application consists of a few key components. In this case:
1 - (React) The client page rendered to the user
2 - (Node + Express) The server, which processes data and exposes endpoints for making changes to your application. These endpoints make the necessary database queries; you can use a database client to write those queries as JavaScript within your Node.js endpoints.
3 - (MongoDB) Your database.
So for instance a typical CRUD app allows you to create, read, update, and delete. Let's say you are looking at making a standard TODO list app.
You would need to make requests to these endpoints to perform these operations.
You could have a POST to /todo which would then insert a new document into your database.
You would need a way to read the information from the page... say a GET request to /todos to read all items. Or also a GET request to /todo/:id to get a specific item.
You would need a way to update an existing item... say a PUT request to /todo/:id with the updates you want to take place.
Finally you would need a way to delete an item... a DELETE request to /todo/:id which would delete the item.
Each of these endpoints would make a request to insert / read / update / delete items from the database, and return content to the client React code --> which then displays it to the user.
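A minimal sketch of two of those endpoints with Express and Mongoose (the Todo model, routes, and port are assumptions for illustration); the React page would then call them with, for example, fetch('/todos') or a POST to /todo:

const express = require('express');
const mongoose = require('mongoose');

const Todo = mongoose.model('Todo', new mongoose.Schema({ text: String, done: Boolean }));

const app = express();
app.use(express.json());

// POST /todo -- create a new document from the request body
app.post('/todo', async (req, res) => {
  const todo = await Todo.create({ text: req.body.text, done: false });
  res.status(201).json(todo);
});

// GET /todos -- read all documents for the React page to render
app.get('/todos', async (req, res) => {
  res.json(await Todo.find());
});

mongoose.connect(process.env.MONGODB_URI).then(() => app.listen(3000));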
On the frontend side, call the API from React using the fetch() method; on the backend, connect with your MongoDB URI string. If you want the data in chunks, use the limit() and skip() functions for pagination.
Follow the MVC pattern, where your frontend only calls a controller API and the controller calls DAO methods for MongoDB. You can use MongoDB Stitch for a serverless app, so data leaks on the frontend side can be avoided. MongoDB has a connection pool (max 100 connections), so on each client request a cached connection object is handed out from the pool, which further speeds up your connection time.
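A rough sketch of the connection-reuse part of that advice, using the Node MongoDB driver; the pool size and database name here are just illustrative:

const { MongoClient } = require('mongodb');

// Create one client for the whole process and reuse its connection pool,
// instead of connecting on every request.
const client = new MongoClient(process.env.MONGODB_URI, { maxPoolSize: 100 });
const clientReady = client.connect();

async function getDb() {
  await clientReady;
  return client.db('app');
}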

Best approach for real time process information / Server + JS Client

I have a C# Web API project on server side and on front-end I have ExtJS 4.2.1 (Javascript framework client).
There is a section in my app where I request to start a long running process (about 5 minutes) and I want to show the user the status of the process being executed.
Basically, the process will run a special calculation for every employee in the database (about 800), so I want to let the user know which Employee is being processed in that moment.
So I was thinking of two ways of doing this, and I don't know whether having both is OK:
Use SignalR to show the information of the process in Real Time.
Write all the process log to a database table (every employee that is being processed).
If I use the first approach, when the user closes the browser he will lose all the information about the process, and if he logs into the app again he will only see the current status.
If I use the second approach, if he logs into the app again he can see all the information, and using a timer on the client side the data could be refreshed every 5 seconds.
Does anyone have implemented something like this? Any advice is appreciated.
You should use a combination of the two. When you have calculated an employee, save the state to the database and publish the change on a service bus.
Let SignalR pick these messages up and forward them to the client. This way the user will see the old state when he connects and the new state as updates arrive over SignalR. I have created an event aggregator proxy that makes this very easy.
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/wiki
Follow the wiki to set it up; here is a demo project:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/tree/master/SignalR.EventAggregatorProxy.Demo.MVC4
Live demo
http://malmgrens.org/Signalr/
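For the browser side, a hedged sketch with the classic SignalR 2.x JavaScript client; the hub name progressHub, its method names, and updateProgressUi are assumptions for illustration, not part of the linked proxy library:

var hub = $.connection.progressHub;

// The server pushes a message as each employee is processed.
hub.client.employeeProcessed = function (employeeName, processed, total) {
    updateProgressUi(employeeName, processed, total); // e.g. update an ExtJS progress bar
};

$.connection.hub.start().done(function () {
    // On (re)connect, ask for the last state persisted in the database so a
    // user who closed the browser still sees where the job currently is.
    hub.server.getCurrentStatus();
});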

How to update backbone model from database

I am creating a webpage to show a listing from my database table.
Suppose my database table is updated every 10 seconds by my PHP cron; I want to sync my Backbone model/collection whenever my database table gets updated by that cron.
Example: I have a table stock_exchange in which I store stock rates for various companies, and my cron job updates the stock rates every 10 seconds. To show this in the UI I am creating a Backbone application, but my problem is that whenever my stock_exchange table is updated I want to sync my Backbone model/collection.
Please help,
Thanks in advance
You can either poll the server by calling stockModel.fetch() every 10 seconds in the browser (via setInterval, perhaps), or you can use something like WebSockets (via socket.io, perhaps) to allow the server to push the latest data to the browser, which you can then apply with stockModel.set(dataFromServer). Try something and post some code, as Stack Overflow is intended for specific problems, not tutorials.
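A minimal sketch of the polling option, assuming a collection of stocks; the URL is a placeholder:

var StockCollection = Backbone.Collection.extend({
  url: '/api/stocks'
});

var stocks = new StockCollection();

// Re-fetch every 10 seconds to match the cron interval; fetch() merges the
// response into the collection and fires add/change/remove events that the
// views can listen to and re-render from.
setInterval(function () {
  stocks.fetch();
}, 10000);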

Load data when webservices starts

I have a scenario where I have to load data from SQL Server when I start running a web service. Later I have to use this data in my application instead of accessing the database every time. In addition, this data should be refreshed every hour without affecting website operation on the back end. If any of you have come across such a scenario, please let me know the solution. By the way, I am using ASP.NET web services, a SQL Server database, and DNN for my front end. Thanks in advance.
In the Global.asax Application_Start event you can load all of your data into a DataSet. And by using SQL cache dependency you can refresh the data every hour. But loading the entire data set is not advisable; if you do, your memory will fill up and performance will degrade.
http://www.codeproject.com/Articles/14976/ASP-NET-Caching-Dependencies
Pre-loading all of your data is not good practice because the database then loses its purpose. It's probably OK for data that is very rarely updated but needed very frequently, but most definitely not for all the data you have in the database.
As for the loading of data you can use app start event as others have already suggested.
Regarding caching – use the Application object to make this data available to all parts of the application and add a property to it that keeps the time of the last update. Then just create a separate service that checks the last update time every X minutes and refreshes the data when the time comes.
