Does Dynamics 365 support a service similar to the Salesforce Bulk API, which can download millions of records in minutes? We are using Java, so this won't work. Maybe using batch operations in Dynamics could solve this?
What's your exact requirement? Do you want to load your Dynamics database into a BI tool? Maybe the Data Export Service could help: https://community.dynamics.com/365/b/dataexportserviceblog/archive/2018/07/11/how-to-configure-data-export-service-with-an-azure-sql-database-using-azure-resource-manager
Or do you need to replicate a lot of data between Dynamics and another (ERP) system? Maybe change tracking is your friend: you do one full synchronization, and afterwards you can retrieve only the data that has changed since the last sync.
https://learn.microsoft.com/en-us/dynamics365/customer-engagement/admin/enable-change-tracking-control-data-synchronization
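For illustration, here is a minimal Python sketch of that flow against the Dataverse Web API, assuming change tracking is enabled on the entity; the org URL, token acquisition, entity set and columns are placeholders you would replace:

```python
import requests

BASE = "https://yourorg.crm.dynamics.com/api/data/v9.2"  # placeholder org URL
HEADERS = {
    "Authorization": "Bearer <access-token>",  # acquire via OAuth/MSAL beforehand
    "Accept": "application/json",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Prefer": "odata.track-changes",  # ask the Web API to return a delta link
}

def full_sync(entity_set="accounts", select="name,accountnumber"):
    """Initial full synchronization: page through every record, keep the delta link."""
    rows, delta_link = [], None
    url = f"{BASE}/{entity_set}?$select={select}"
    while url:
        page = requests.get(url, headers=HEADERS).json()
        rows.extend(page["value"])
        delta_link = page.get("@odata.deltaLink", delta_link)  # returned on the last page
        url = page.get("@odata.nextLink")  # follow server-side paging
    return rows, delta_link

def incremental_sync(delta_link):
    """Later runs: fetch only what changed since the stored delta link."""
    page = requests.get(delta_link, headers=HEADERS).json()
    return page["value"], page.get("@odata.deltaLink")
```

You would persist the delta link between runs (in a file or database) so that each scheduled sync only transfers the changed rows.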
Maybe using batch operations in Dynamics could solve this?
Yes, you could use a batch request with multiple retrieves. This is tricky because you need some sharding logic; a rough sketch is below.
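To make that concrete, here is a rough Python sketch of an OData $batch call that bundles several GET retrieves into one request. The org URL, token and the date-range sharding filters are placeholders, and a real implementation would still need to follow @odata.nextLink paging inside each shard and parse the multipart response properly:

```python
import uuid
import requests

BASE = "https://yourorg.crm.dynamics.com/api/data/v9.2"  # placeholder org URL
TOKEN = "<access-token>"                                  # acquired separately

def batch_retrieve(queries):
    """Send several GET retrieves in a single $batch request."""
    boundary = f"batch_{uuid.uuid4()}"
    parts = []
    for query in queries:
        parts.append(
            f"--{boundary}\r\n"
            "Content-Type: application/http\r\n"
            "Content-Transfer-Encoding: binary\r\n"
            "\r\n"
            f"GET {BASE}/{query} HTTP/1.1\r\n"
            "Accept: application/json\r\n"
            "\r\n"
        )
    body = "".join(parts) + f"--{boundary}--\r\n"
    response = requests.post(
        f"{BASE}/$batch",
        data=body.encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": f"multipart/mixed; boundary={boundary}",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
        },
    )
    return response.text  # one multipart part (with a JSON body) per retrieve

# Placeholder sharding: split the retrieves by creation date so that no single
# query has to return the whole table. Filters are URL-encoded by hand here.
shards = [
    "accounts?$select=name&$filter=createdon%20lt%202023-07-01T00:00:00Z",
    "accounts?$select=name&$filter=createdon%20ge%202023-07-01T00:00:00Z",
]
print(batch_retrieve(shards))
```

How you shard (by date ranges, key ranges, owning business unit, etc.) depends on how your data is distributed; the goal is simply to keep each retrieve small enough to complete quickly.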
Firstly, I'm new to development, and I currently have a problem with server storage filling up rapidly. I'm looking at solutions such as watcher programs to detect when the server is reaching its storage limit, but I wanted to know whether cloud solutions could help in this regard. I also wanted to know whether companies such as Snowflake can help handle fast-growing data, how a developer would use it, and whether this approach would be too costly from an enterprise point of view.
I have tried to look up Snowflake's documentation, but I am unable to reach any conclusion as to whether it can help me. I could only find articles about storage saying that data is stored compressed, but I wanted more clarity on this solution.
Snowflake stores its data in cloud storage services (AWS S3, Google Cloud Storage, or Microsoft Azure), so under normal conditions you can't run out of server storage (I've never heard of S3 being full in any region).
Check the pricing page to see if it will be costly for you (or not):
https://www.snowflake.com/pricing/
I have a website on Shopify, linked to an Airtable database, and my transactions are sent to Firmhouse to be finalized.
I want to create a new database and was wondering which software is best (most compatible) for linking to both Shopify and Firmhouse. There are no sources out there that discuss this, and I would like to know, from your experience, which is most optimal and easiest to manage.
Thank you for your continued help and support.
I tried to use MS SQL Server instead of Airtable to connect to Shopify, but couldn't find a way to sync data from Firmhouse back to MS SQL after transactions. I faced the same problem with Airtable, where I would have to update it manually every time a transaction is concluded or edited.
Optimally, I would like database software that can connect to Shopify and at the same time sync with Firmhouse. I don't mind if there is a "middle layer" piece of software that helps with these processes.
Has anyone here tried pulling data from MaestroQA to Snowflake?
There is a way to get data from MaestroQA to Snowflake, but I'm wondering if there's a way in the other direction, with Snowflake pulling MaestroQA data, without using any APIs.
In addition, I'm trying to find a way to automate this.
I tried looking for documentation and threads online, but couldn't find any.
Below are the documents/links I have seen so far, but they describe MaestroQA pushing data to Snowflake.
https://help.maestroqa.com/en/articles/1982484-data-warehouse-table-overview
https://help.maestroqa.com/en/articles/1557390-push-qa-data-to-your-data-warehouse
Snowflake can only load data from its internal/external stages; it has no capability to pull data from arbitrary external systems.
You'll either need to use a tool with ETL capabilities or write your own process in, for example, Python, as sketched below.
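As a minimal sketch of the do-it-yourself route, assuming you have already produced a local export file (however you get the data out of MaestroQA) and are using the snowflake-connector-python package, you could stage and load it like this; the connection details, file path and table name are placeholders:

```python
import snowflake.connector

# Placeholder credentials and context; in practice read these from config/secrets.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Upload the local export file into the table's internal stage...
    cur.execute("PUT file:///tmp/maestroqa_export.csv @%QA_SCORES AUTO_COMPRESS=TRUE")
    # ...then load the staged file into the target table.
    cur.execute(
        "COPY INTO QA_SCORES "
        "FROM @%QA_SCORES "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
finally:
    cur.close()
    conn.close()
```

For the automation part, you would wrap this in a small script and schedule it with cron, Windows Task Scheduler, or an orchestrator such as Airflow.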
I have around 500 MB of data in Firebase and I want to move it to Amazon Redshift on a daily basis. What is the best way to do this?
Thanks in advance.
What is "the best way" depends on your criteria, and often highly subjective. But a few pointers may help you get started:
don't download the entire data with a single ref.once('value'. Loading that much data will take time and all your regular users will be blocked while your read is being fulfilled.
do consider using Firebase's private backups. These are coming out of a different data stream, so will not interfere with your regular users. But the downside is that you'll need to a paid app to be able to use this feature.
do consider how you can make your backup process streaming, instead of daily. Firebase is a real-time database, and typically works best when you consider the data flow to be real-time too.
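As a rough illustration of the streaming idea, here is a sketch using the firebase-admin Python SDK against a hypothetical /orders node; the service-account file, database URL and output path are placeholders, and shipping the file to S3 and COPYing it into Redshift is left out:

```python
import json
import firebase_admin
from firebase_admin import credentials, db

# Placeholder service account and database URL.
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred, {"databaseURL": "https://your-project.firebaseio.com"})

def on_change(event):
    # event.event_type is 'put' or 'patch'; event.path identifies the changed child.
    record = {"type": event.event_type, "path": event.path, "data": event.data}
    with open("/var/backups/firebase_changes.jsonl", "a") as f:
        f.write(json.dumps(record) + "\n")

# Attach a listener to the node you want to replicate; this streams change events
# over a long-lived connection instead of downloading everything in one read.
db.reference("/orders").listen(on_change)
```

From there, a periodic job can push the JSON-lines file to S3 and run a Redshift COPY, so Redshift catches up continuously rather than in one large daily load.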
I'm hoping you can help.
I'm looking for a zero-config, multi-user database that my WinForms application can easily upload to a web server folder (together with one or two classic ASP pages), and I am looking for some suggestions/recommendations.
The idea is that the database will be used to collect feedback entered by people filling in the ASP pages. The pages will write to the database using JavaScript.
The database will subsequently be downloaded again for processing once the responses are in.
In Summary:
It will mostly run in MS Windows environments.
I have a modest budget for this and do not mind paying for such a database.
No runtime licensing costs.
Should be xcopy-deployable: once uploaded to a website folder, it should be operational.
It should not have a .NET CLR dependency.
It should support a reasonable level of concurrent access. The average respondent count would be around 20-30, but one never knows.
Should be a reasonable size so that uploads/downloads to and from the site will be reasonably fast.
Would appreciate your suggestions/comments
Many thanks
Abz
To clarify - this is a desktop commercial application for feedback management in a vertical market. It uses SQL Server as the backing store.
The application currently provides feedback management for email and paper feedback. I now want to add web feedback capability. Getting users to make their SQL Servers accessible to a website is not an option at this time, as I want to make getting up and running as painless as possible.
I intend to release a web-based implementation of the software in the near future, but for now I am looking at the above as a pragmatic way to provide web-based feedback collection.
SQLite comes to mind. It meets all of your stated requirements, is open source, and has a liberal license (public domain).
http://sqlite.org/
I would use a 'normal' database (say MySQL, PostgreSQL, Firebird, etc.) on the server. Instead of copying files to the server, your WinForms application would create custom tables (or even custom databases). After collecting the data, you could just pull it back into your application using plain old SQL.
Why reinvent the wheel? If you want to collect feedback from users of your app and they are connected to the internet, it might be a better idea, and cheaper in the long term, to use a service like Wufoo. We recently switched from a homegrown setup to Wufoo and are very pleased. Check it out.
Otherwise you might want to take a look at SQLite or Firebird. Both of them are very robust and have ADO.NET providers. Firebird scales from a single user to a full-blown client-server system and has no .NET dependency.
If you really don't want a DB/SQL solution, you could try simple text files: FTP/xcopy the files down and parse them into the back-office server as needed. ASP/VBScript or ASP.NET can create the files to store the basic feedback comments. You need to consider security, of course!