I am making an application in Flutter that needs to save the daily data users input in the app. The user has 10 daily actions to complete.
The app needs to:
Save the daily data that the user inputs for those 10 actions.
Produce daily, weekly, monthly, quarterly, and yearly summaries of the data, broken down by action.
At the start of development I was thinking of using Firebase, but I feel it is not the best option.
If your app only needs to save data offline, with no internet connectivity required, you could use one of these:
Isar (a NoSQL/non-relational local/offline database)
ObjectBox (a NoSQL/non-relational local/offline database)
Hive (a NoSQL/non-relational local/offline database)
Sqflite (a SQL/relational local/offline database)
Otherwise you can use:
Firebase (a NoSQL/non-relational online database)
Parse Server (an online backend that can run on a relational (PostgreSQL) or non-relational (MongoDB) database)
Appwrite (a NoSQL/non-relational online database)
Supabase (a SQL/relational online database built on PostgreSQL)
Based on the information you provided, I think Firebase (for an online DB) or Isar (for a local DB) would be a good option.
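If you go the relational route (Sqflite), the daily/weekly/monthly/quarterly/yearly summaries map directly onto SQL aggregate queries. Here is a minimal sketch using Python's built-in sqlite3 module (sqflite wraps the same SQLite engine, so the schema and queries translate directly to Dart); the table and column names are illustrative assumptions, not from your app:

```python
import sqlite3

# Hypothetical schema: one row per action per day.
conn = sqlite3.connect("daily_actions.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS action_entries (
        entry_date TEXT    NOT NULL,  -- ISO date, e.g. '2024-05-17'
        action_id  INTEGER NOT NULL,  -- 1..10
        value      REAL    NOT NULL,
        PRIMARY KEY (entry_date, action_id)
    )
""")

# Save (or overwrite) today's input for one action.
conn.execute(
    "INSERT OR REPLACE INTO action_entries VALUES (?, ?, ?)",
    ("2024-05-17", 3, 42.0),
)
conn.commit()

# Weekly summary per action. Monthly and yearly work the same way by
# swapping the strftime() format ('%Y-%m' and '%Y'); quarters need a
# small expression such as (strftime('%m', entry_date) + 2) / 3.
rows = conn.execute("""
    SELECT strftime('%Y-%W', entry_date) AS week,
           action_id,
           SUM(value) AS total,
           AVG(value) AS average
    FROM action_entries
    GROUP BY week, action_id
    ORDER BY week, action_id
""").fetchall()
```

With Isar or Hive you would instead store one object per day and compute the same aggregates in Dart, since a NoSQL store does not give you GROUP BY for free.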
Related
I need to design a scalable database architecture to store all the data coming from flat files (CSV, HTML, etc.). These files come from Elasticsearch, and most of the scripts are written in Python. The architecture should automate most of the daily manual processing currently done with Excel, CSV, and HTML files, and all data will be retrieved from this database instead of being kept in CSV and HTML files.
Database requirements:
The database must perform well for day-to-day data retrieval, and it will be queried by multiple teams.
An ER model and schema will be developed for the data, with logical relationships.
The database can be hosted in the cloud.
The database must be highly available and able to retrieve data quickly.
This database will be used to build multiple dashboards.
The ETL jobs will be responsible for storing data in the database.
There will be many reads and multiple writes each day, with large volumes of data coming from Elasticsearch and some cloud tools.
I am considering RDS, Azure SQL, DynamoDB, Postgres, or Google Cloud. I would like to know which database engine is the better fit for these requirements. I also want to know how the ETL process should be designed: Lambda or Kappa architecture.
To store relational data such as CSV and Excel files, you can use a relational database. For flat files like HTML, which don't need to be queried, you can simply use a storage account with any cloud service provider, for example Azure.
Azure SQL Database is a fully managed platform-as-a-service (PaaS) database engine that handles most database management functions, such as upgrading, patching, backups, and monitoring, without user involvement. It always runs on the latest stable version of the SQL Server database engine on a patched OS, with 99.99% availability, and you can restore the database to any point in time. This should be the best choice for storing relational data and running SQL queries.
Azure Blob Storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Your HTML files can be stored here.
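Since your scripts are already in Python, pushing the HTML files into Blob Storage is a few lines with the azure-storage-blob package. A minimal sketch, where the connection string, container name, and blob path are assumptions for illustration:

```python
from azure.storage.blob import BlobServiceClient

# The connection string comes from the storage account's access keys;
# the placeholder below is not a real value.
service = BlobServiceClient.from_connection_string("<your-connection-string>")
container = service.get_container_client("flat-files")  # hypothetical container

# Upload one day's HTML report; overwrite if it already exists.
with open("daily_report.html", "rb") as data:
    container.upload_blob(
        name="reports/2024-05-17/daily_report.html",
        data=data,
        overwrite=True,
    )
```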
The ETL jobs can be performed with Azure Data Factory (ADF). It lets you connect to almost any data source (including sources outside Azure), transform the stored dataset, and load it into the desired destination. ADF's data flow transformations can handle all the ETL-related tasks.
Our client won't let us make any calls into their SQL database (not even to create a replica, etc.). The best solution we have come up with so far is to stand up a Google Cloud SQL instance so we can ask the customer to push their data once a day/week (using the server's public IP), and we then push the data on into Google BigQuery.
I have been reading many threads on the web, and my tentative solution is asking the client to run a weekly ETL -> Cloud SQL -> BigQuery. Is that a good approach?
To sum up, I am looking for recommendations on best/cheapest practices and possible ways to let the client load data into GCP without exposing their data or my infrastructure.
My cloud provider is Google Cloud and my client uses SQL Server.
We are open to new or similar options (even other providers such as Amazon and Azure).
Constraints:
The client will send data periodically (daily or weekly ingestion)
The data should ultimately be sent to and stored in BigQuery
Running a Cloud SQL instance in Google is costly, given that we don't need the allocated CPU/memory and public IP available 24/7 (only a few times a month, e.g. 4 times a month)
The question is missing many details, but how about:
Have the customer create a weekly .csv.
Send the .csv with the new data to GCS.
Load the .csv into BigQuery (a sketch follows).
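A minimal sketch of the load step with the google-cloud-bigquery Python client; the bucket, dataset, and table names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Append the weekly CSV drop to an existing table, inferring the schema.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # header row
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://client-weekly-drop/export.csv",    # hypothetical bucket/object
    "my-project.client_data.weekly_export",  # hypothetical table
    job_config=job_config,
)
load_job.result()  # block until the load finishes
```

For the upload step, GCS signed URLs let the customer PUT the file without having any access to your project, which also addresses the "without exposing my infrastructure" constraint, and it avoids paying for an always-on Cloud SQL instance.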
I would like to provide my users with a session or workspace in Azure SQL DB where they can take a snapshot of the database, work up their changes, analyze the results, and then submit the final changes so that all other users can see them.
Are temporary tables in SQL Server the answer here?
Is there middleware on the market that can sit on top of SQL Server to create and manage sessions and post the session data back to the master DB, with proper reconciliation between the version created by the user and the current master version of the DB?
I have seen Esri's ArcSDE middleware do this for complex geodatabases, but I am struggling to find similar middleware for a plain Azure SQL RDBMS.
I am currently working on a new web application. I really love the idea behind Firebase (AngularFire) for realtime data sync, but I can't figure out how to organize the data so that each customer (enterprise) has its own data and no data is shared between enterprises.
On a regular MySQL server, I could create a database per enterprise (the best implementation for speed and security) or simply add an Enterprise table and a Customer table with an enterprise_id. Which is the best approach in a Firebase DB?
Assume I want to develop a Windows 8/10 universal app, e.g. a calendar, and the user has two devices: a tablet and a phone.
How can my calendar share a local SQL database between them?
I don't want to maintain or administer an Azure service or any other remote DB service (AWS, a VPS running SQL Express, etc.), which is overkill for such a basic scenario.
I have considered SQLite and dumping the DB file in the user's OneDrive folder, but as we know this could result in sync/lock issues.
So, using a purely self-contained and free model, how can my two devices share this basic SQL data?
If the database is small you can save it in the RoamingFolder and let Windows take care of copying it to the other system.
See Quickstart: Roaming app data
Other than that, storing the data in a cloud service such as an Azure Mobile App Service is the easiest and cleanest solution. Azure Mobile Services supports offline sync to SQLite if you need to support both online and offline scenarios.
Saving app data on the user's OneDrive or dropbox and then explicitly downloading it to use is possible but more difficult and a bit dirtier since the data will be visible to the user and could be accidentally deleted, moved, etc. For a personal app it's not bad but I wouldn't do this in production.
If you're copying the database between systems (either via roaming or via a data storage site) you'll need to devise a way to handle conflicts. This is simpler if everything is kept in a single cloud database.
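For the copy-the-file approaches, one simple conflict policy is row-level last-write-wins by timestamp. A minimal sketch (in Python for brevity; the logic ports directly to C#), assuming each row carries a unique id and a modified-at column, both of which are hypothetical:

```python
import sqlite3

def merge_last_write_wins(local_path: str, incoming_path: str) -> None:
    """Merge rows from the roamed/downloaded copy into the local copy.

    The newer row (by modified_at) wins; deletions are not handled here.
    Table and column names are hypothetical.
    """
    local = sqlite3.connect(local_path)
    incoming = sqlite3.connect(incoming_path)

    query = "SELECT id, title, starts_at, modified_at FROM events"
    for row in incoming.execute(query):
        existing = local.execute(
            "SELECT modified_at FROM events WHERE id = ?", (row[0],)
        ).fetchone()
        if existing is None or existing[0] < row[3]:
            local.execute(
                "INSERT OR REPLACE INTO events VALUES (?, ?, ?, ?)",
                row,
            )
    local.commit()
```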