We have an on-premises Oracle database installed on a server. We have to create some charts/dashboards with Tableau CRM on that on-premises data. Note that Tableau CRM is not Tableau Online; it is the Tableau version for the Salesforce ecosystem.
Tableau CRM has APIs, so we can push data to it or upload CSV files to it programmatically.
So, what can be done is:
Run a Node.js app on the on-premises server, pull data from the Oracle DB, and then push it to Tableau CRM via the TCRM API.
Run a Node.js app on the on-premises server, pull data from the Oracle DB, create a CSV, and push the CSV via the TCRM API.
I have tested the 2nd option and it is working fine.
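For reference, a rough sketch of what that second pipeline looks like on the Node.js side, using the node-oracledb driver (the ORDERS table, its columns, and the connect string are placeholders for my real schema):

```js
const oracledb = require('oracledb');
const fs = require('fs');

async function exportTableToCsv() {
  const conn = await oracledb.getConnection({
    user: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    connectString: 'dbhost:1521/ORCLPDB1',   // placeholder connect string
  });

  // Full-table query -- this is the inefficient part I want to avoid.
  const result = await conn.execute(
    'SELECT ID, CUSTOMER_NAME, AMOUNT, UPDATED_AT FROM ORDERS',
    [],
    { outFormat: oracledb.OUT_FORMAT_OBJECT }
  );

  // Build a simple CSV (no escaping of commas/quotes -- sketch only).
  const header = 'Id,CustomerName,Amount,UpdatedAt';
  const lines = result.rows.map(r =>
    [r.ID, r.CUSTOMER_NAME, r.AMOUNT, r.UPDATED_AT.toISOString()].join(',')
  );
  fs.writeFileSync('orders.csv', [header, ...lines].join('\n'));

  await conn.close();
}

exportTableToCsv().catch(console.error);
```

The resulting CSV is then uploaded to Tableau CRM via its external data API.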
But, as you all know, it is not efficient, because I have to run a cron job and schedule the process multiple times a day, and I have to query the full table every time.
I am looking for a better approach. Do you know of any other tools or technologies that would give a smoother sync process?
Thanks
The second method you described in the question is a good solution. However, you can optimize it a bit.
I have to query the full table all the time.
This can be avoided. If you take a look at the documentation of the InsightsExternalData SObject, you can see that it has a field named Operation which takes one of these values: Append, Delete, Overwrite, Upsert.
What you have to do is use the Append operation when you push data to Tableau CRM and push only the records that don't already exist in TCRM. That way you only query the delta records from your database. This reduces the size of the CSV you have to push, and since the file is smaller, it takes less time to upload into TCRM.
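As a minimal sketch of what that push could look like with the jsforce library (the dataset alias delta_orders, the login details, and the single-part upload are assumptions; larger files need to be split across multiple parts):

```js
const fs = require('fs');
const jsforce = require('jsforce');

async function pushCsvAppend(csvPath) {
  const conn = new jsforce.Connection({ loginUrl: 'https://login.salesforce.com' });
  await conn.login(process.env.SF_USERNAME, process.env.SF_PASSWORD + process.env.SF_TOKEN);

  // 1. Create the header record with Operation = 'Append'.
  const header = await conn.sobject('InsightsExternalData').create({
    Format: 'Csv',
    EdgemartAlias: 'delta_orders',   // placeholder dataset alias
    Operation: 'Append',
    Action: 'None',
  });

  // 2. Upload the CSV content as a base64-encoded data part.
  await conn.sobject('InsightsExternalDataPart').create({
    InsightsExternalDataId: header.id,
    PartNumber: 1,
    DataFile: fs.readFileSync(csvPath).toString('base64'),
  });

  // 3. Set Action to 'Process' so TCRM starts loading the data.
  await conn.sobject('InsightsExternalData').update({
    Id: header.id,
    Action: 'Process',
  });
}
```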
However, to implement this solution you need two things on the database side:
A unique identifier that uniquely identifies every record in the database
A DateTime field
Once you have these two, write a query that sorts all the records in ascending order of the DateTime field and takes only the records that come after the last record you pushed into TCRM (tracked by its unique ID or timestamp). That way your result set contains only the delta records that don't yet exist in TCRM. After that, you can use the same pipeline you built to push the data.
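Assuming the table has an UPDATED_AT column and that you persist the timestamp (or unique ID) of the last successful push somewhere, the delta query side could look roughly like this sketch (placeholder table and column names):

```js
const oracledb = require('oracledb');

// Fetch only the records modified after the last successful push.
async function fetchDelta(conn, lastSyncTime) {
  const result = await conn.execute(
    `SELECT ID, CUSTOMER_NAME, AMOUNT, UPDATED_AT
       FROM ORDERS
      WHERE UPDATED_AT > :lastSyncTime
      ORDER BY UPDATED_AT ASC`,
    { lastSyncTime },                          // bind variable, not string concatenation
    { outFormat: oracledb.OUT_FORMAT_OBJECT }
  );
  return result.rows;                          // delta records only
}
```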
I am working on integrating real-time data into my Forge model. I use a Postgres database with the TimescaleDB extension. I want to import the data for each element, based on its ID, in real time, as shown in red in the screenshot.
To sum up, two tasks:
- connect the Forge model to my database and import the data for each element
- update each element's data in real time
The properties highlighted in red are the ones that should be connected to my database.
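Not a complete answer, but one common pattern is to expose the TimescaleDB data through a small REST endpoint keyed by element ID and have the viewer page poll it. A rough sketch, where the element_readings table, its columns, and the mapping from model elements to IDs are all assumptions about the schema:

```js
const express = require('express');
const { Pool } = require('pg');

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const app = express();

// Latest reading for one element (table/column names are placeholders).
app.get('/api/elements/:id/latest', async (req, res) => {
  const { rows } = await pool.query(
    `SELECT temperature, pressure, recorded_at
       FROM element_readings
      WHERE element_id = $1
      ORDER BY recorded_at DESC
      LIMIT 1`,
    [req.params.id]
  );
  res.json(rows[0] || {});
});

app.listen(3000);

// On the viewer page, poll this endpoint and feed the result into whatever
// Forge Viewer extension displays the element's properties, e.g.:
//   setInterval(async () => {
//     const data = await (await fetch(`/api/elements/${elementId}/latest`)).json();
//     // update the panel / overlay with `data`
//   }, 5000);
```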
I have an existing Oracle database with production data. I am planning to move to MongoDB, so I want to migrate my existing data from the Oracle database to MongoDB. The data models of the data stored in Oracle and in MongoDB will be different.
I am planning to export all the data from the Oracle database as JSON using https://blogs.oracle.com/jsondb/generating-json-data. Once I have the JSON file with all the data, I will import it into MongoDB. In case the data extracted from Oracle is not in the shape I require, I will create a utility to convert it into multiple JSON files, one per collection.
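As a rough sketch of the import and reshaping step, assuming the Oracle export produces one JSON array per table (the file name, database name, and field mapping below are placeholders):

```js
const fs = require('fs');
const { MongoClient } = require('mongodb');

async function importCustomers() {
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const db = client.db('migrated');

  // customers.json: JSON array exported from Oracle (placeholder file name).
  const rows = JSON.parse(fs.readFileSync('customers.json', 'utf8'));

  // Reshape each relational row into the target MongoDB document model,
  // e.g. nesting what used to be flat columns.
  const docs = rows.map(r => ({
    _id: r.CUSTOMER_ID,
    name: r.NAME,
    address: { street: r.STREET, city: r.CITY },
  }));

  await db.collection('customers').insertMany(docs);
  await client.close();
}

importCustomers().catch(console.error);
```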
I need some suggestions: are there better ways to do this, and is my solution the right approach for the problem?
Is it possible to have a choice of where the database is stored?
E.g. on Dropbox/Google Drive or some other place, and update it in real time.
Can a database be selected from a list of databases to use?
Can MongoDB be stored locally or on a server elsewhere?
I want to store the database query logs of a Laravel 5.6 application in a database table. Is there any possible way to do it?