Is it possible to choose where the database is stored? For example, on Dropbox/Google Drive or another location, with real-time updates?
Can a database be selected from a list of databases to use?
Can MongoDB be stored locally or on a server elsewhere?
We have an on-premises Oracle database installed on a server. We have to create some charts/dashboards with Tableau CRM on that on-premises data. Note that Tableau CRM is not Tableau Online; it is the Tableau version for the Salesforce ecosystem.
Tableau CRM has APIs, so we can push data to it or upload CSV files to it programmatically.
So, the options are:
Run a Node.js app on the on-premises server, pull data from the Oracle DB, and then push it to Tableau CRM via the TCRM API.
Run a Node.js app on the on-premises server, pull data from the Oracle DB, create a CSV, and push the CSV via the TCRM API.
I have tested the second option and it works fine.
But, as you all know, it is not efficient, because I have to run a cron job and schedule the process multiple times a day, and I have to query the full table every time.
I am looking for a better approach. Do you know any other tools or technologies that would make for a smoother sync process?
Thanks
The second method you described in the question is a good solution. However, you can optimize it a bit.
"I have to query the full table all the time."
This can be avoided. If you take a look at the documentation for the InsightsExternalData sObject, you can see that it has a field named Operation which takes one of these values: Append, Delete, Overwrite, Upsert.
What you have to do is use the Append operation when you push data to Tableau CRM, and push only the records that don't exist in TCRM yet. That way you only query the delta records from your database. This reduces the size of the CSV you have to push, and since it is smaller, it takes less time to upload into TCRM.
However, to implement this solution you need two things on the database side:
A unique identifier for every record in the database
A DateTime field
Once you have these two, you have to write a query that sorts the records in ascending order of the DateTime field and takes only the records that come after the last one you pushed into TCRM (identified by its unique identifier). That way your result set contains only the delta records that TCRM doesn't have yet; a sketch of such a query follows. After that, you can use the same pipeline you built to push the data.
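As an illustration, the delta query could look something like this (a sketch only; the SALES table, its columns, and the :last_pushed bind variable are hypothetical stand-ins for your own schema and for the watermark you save after each run):

-- Hypothetical example: SALE_ID is the unique identifier, LAST_MODIFIED the DateTime field.
-- :last_pushed is the LAST_MODIFIED value of the last record successfully pushed to TCRM.
SELECT sale_id, customer_name, amount, last_modified
FROM sales
WHERE last_modified > :last_pushed
ORDER BY last_modified ASC;

The result set then goes into the CSV that is uploaded with Operation = Append.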
I am working on a desktop application that uses a local installation of MySQL to store data across multiple schemata. My goal is to use SymmetricDS to transfer those schemata to an Oracle database on a different machine.
So far, I have managed to set up a slave node residing on the desktop computer and a master node residing on a server. Using a .properties file in the engine directory, I have also successfully transferred data from a single schema and table to the Oracle DB.
The problem I am now facing is that my application will create and possibly delete schemata on the fly.
Does that mean I will have to maintain a .properties file for each schema and somehow implement a wrapper for the symadmin command to register the corresponding engines?
Or is there maybe a better way?
You should be able to adjust the configuration on the fly. The sym_trigger table has a schema reference for each table. If the database user that SymmetricDS connects with has access to the newly created schemas (databases), then SymmetricDS should be able to create triggers dynamically in those new schemas. No restart needed.
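For illustration, registering a table from a newly created schema could look like this (a sketch; it assumes an existing channel 'default' and an existing router 'desktop_2_oracle', and the schema and table names are placeholders):

insert into sym_trigger
(trigger_id, source_schema_name, source_table_name, channel_id, last_update_time, create_time)
values ('new_schema_my_table', 'new_schema', 'my_table', 'default', current_timestamp, current_timestamp);

insert into sym_trigger_router
(trigger_id, router_id, initial_load_order, last_update_time, create_time)
values ('new_schema_my_table', 'desktop_2_oracle', 100, current_timestamp, current_timestamp);

Your application could run statements like these whenever it creates a schema, instead of maintaining a .properties file per schema.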
I have an existing Oracle database with production data. I am planning to move to MongoDB, so I want to migrate my existing data from the Oracle database to MongoDB. The data models in the Oracle database and in MongoDB will be different.
I am planning to get all the data out of the Oracle database as JSON, using https://blogs.oracle.com/jsondb/generating-json-data. Once I have the JSON file with all the data, I will import it into MongoDB. In case the data extracted from the Oracle database is not in the shape I need, I will create a utility to convert the data into multiple JSON files, one per collection.
I need some suggestions: are there better ways to do this, and is my solution the right approach for the problem?
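For reference, the kind of JSON generation I have in mind would look roughly like this (a sketch; the ORDERS table and its columns are hypothetical placeholders, and JSON_OBJECT requires Oracle 12.2 or later):

-- Hypothetical example: emit one JSON document per row, ready for import into a collection.
SELECT JSON_OBJECT(
  'orderId'  VALUE o.order_id,
  'customer' VALUE o.customer_name,
  'total'    VALUE o.total_amount
)
FROM orders o;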
We had an intern who was given written instructions for deleting old data from a database based on dates (from within our ERP system). They were fascinated by the results and just kept deleting instead of stopping at the required date. There are now 4 years of missing records in the production database. I have these records in my development database, which is in a different instance on a different server. Is there a way to transfer just those 4 years' worth of data from my development database to my production database, checking, of course, to make sure there are no duplicates (there is a unique index on transaction number)?
I haven't tried anything yet because I'm not sure where to start. I do have a test database on the same instance as the production database that I could use to test the transfer with.
There are several ways to do this. Assuming that this is on a different machine, you will want to create a Linked Server on your dev machine to link to the target server (Or, technically, a link from the production server to your dev machine could be used as well). Then, perform an insert of the selected records from the source to the target.
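For example, the linked server could be set up like this (a sketch; the server name, provider, and data source are placeholders for your environment):

EXEC sp_addlinkedserver
  @server = N'DEVSRV',
  @srvproduct = N'',
  @provider = N'MSOLEDBSQL',
  @datasrc = N'DevMachine\DevInstance';
EXEC sp_addlinkedsrvlogin
  @rmtsrvname = N'DEVSRV',
  @useself = 'TRUE';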
More efficiently, you can use the Export Data functionality. Right click on the database (Not the server / instance, but the database) and select Tasks / Export Data from the popup menu. This will pop up the SQL Server Import and Export Wizard. Use your query above to select the data for export.
If security considerations interfere with this, create a duplicate of the table(s) with alternate names (e.g. MyInvRecords) in a new database, and export the data into those tables. Back up that DB, transfer it to someplace accessible from the target server, restore that DB, then transfer the rows back into the original DB.
I haven't had to use anything but these methods before, so one of them should work for you.
A basic insert will work just fine:
INSERT INTO ProdDB.schema.YourTable ([Columns])
SELECT [Columns]
FROM TestDB.schema.YourTable
WHERE <your date-range predicates here>
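Since there is a unique index on transaction number, you can also make the insert skip rows that already exist in production. A sketch, assuming hypothetical TransactionNumber and TransactionDate columns (the date window is a placeholder):

INSERT INTO ProdDB.schema.YourTable ([Columns])
SELECT s.[Columns]
FROM TestDB.schema.YourTable AS s
WHERE s.TransactionDate >= '20190101' AND s.TransactionDate < '20230101' -- placeholder 4-year window
  AND NOT EXISTS (SELECT 1
                  FROM ProdDB.schema.YourTable AS p
                  WHERE p.TransactionNumber = s.TransactionNumber);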
I need a multi-database setup for my project. When a new user is created, a new database should be created for that user automatically, in real time. After logging in against the master database, the user should be switched to their own database. How do I do this? I am using Laravel 5.4.
1. How do I create a database in real time for a new user?
2. How do I switch a user from the master database to their own database after login?
You can create schemas and migrate them in real time; I've done exactly this before in a project. Use the following
DB::connection('mydb')->statement('CREATE DATABASE ...');
to execute raw queries for creating new databases. You'll need your master connection to execute these queries.
Then you can use Artisan::call('migrate', ['--database' => 'newdb']) to migrate and optionally seed it.
Finally, you will need to either have a customers entry in your database config file that determines which schema the user connects to, or set it at runtime with something like Config::set('database.connections.customers.database', $databaseName).
My strategy for creating unique databases for customers was to use a GUID as the database name and keep a record of it in my master database. When a user logs in, their database name is read from the master database and the application switches their connection on the fly. My company has hundreds of small customer databases with names like 123151-2135151-5132123-545-231231 that keep them unique.
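For illustration, the master lookup can be as simple as this (a sketch; the table and column names are placeholders):

CREATE TABLE customer_databases (
  user_id INT NOT NULL PRIMARY KEY,
  database_name CHAR(36) NOT NULL UNIQUE -- the GUID used as the customer's database name
);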
It's quite involved and there are a lot of moving pieces but that will get you started.