I have a website on Shopify, linked to an Airtable database, and my transactions are sent to Firmhouse to be finalized.
I want to create a new database and was wondering which is the best (most compatible) software I can use to link to Shopify and Firmhouse? There are no sources out there that discuss this, so I would like to know, from your experience, which is the most optimal and easiest to manage.
Thank you for your continued help and support.
I tried to use MS SQL Server instead of Airtable to connect to Shopify, but couldn't find a way to sync data from Firmhouse back to MS SQL after transactions. This was the same problem I faced with Airtable, where I would have to update it manually every time a transaction is concluded or edited.
Optimally, I would like a database that can connect to Shopify and, at the same time, sync with Firmhouse. I don't mind if there is a "middle layer" of software that helps with these processes.
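One way such a "middle layer" could look is a small webhook receiver: Shopify (and Firmhouse, if it offers webhooks for your events) POSTs each transaction event to the service, which writes it into whatever database you choose, so nothing has to be updated by hand. Below is a minimal sketch, assuming an ASP.NET Core minimal API; the `/webhooks/orders` route and the `OrderEvents` table are invented for illustration, though `X-Shopify-Webhook-Id` is a real Shopify webhook header:

```csharp
// Minimal webhook receiver sketch (ASP.NET Core minimal API).
// Shopify POSTs order events here; the handler upserts them into a
// local SQL Server table so no manual copying is needed.
using Microsoft.Data.SqlClient;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Assumed connection string name; adjust to your environment.
string connStr = builder.Configuration.GetConnectionString("TransactionsDb");

app.MapPost("/webhooks/orders", async (HttpRequest request) =>
{
    using var reader = new StreamReader(request.Body);
    string payload = await reader.ReadToEndAsync();

    // Key the raw payload by Shopify's webhook id header so retries
    // of the same webhook don't create duplicate rows.
    string webhookId = request.Headers["X-Shopify-Webhook-Id"].ToString();

    using var conn = new SqlConnection(connStr);
    await conn.OpenAsync();
    using var cmd = new SqlCommand(
        @"MERGE dbo.OrderEvents AS t
          USING (SELECT @id AS WebhookId) AS s ON t.WebhookId = s.WebhookId
          WHEN MATCHED THEN UPDATE SET Payload = @payload, ReceivedAt = SYSUTCDATETIME()
          WHEN NOT MATCHED THEN INSERT (WebhookId, Payload, ReceivedAt)
              VALUES (@id, @payload, SYSUTCDATETIME());", conn);
    cmd.Parameters.AddWithValue("@id", webhookId);
    cmd.Parameters.AddWithValue("@payload", payload);
    await cmd.ExecuteNonQueryAsync();

    return Results.Ok();
});

app.Run();
```

In production you would also verify Shopify's `X-Shopify-Hmac-Sha256` signature header before trusting the payload.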
Related
Are there any resources available that can guide someone on how to "think" about the various components of a hosted/cloud solution before starting to build a hosted application? In other words, are there any books or websites offering guidance on what needs to be considered when making a cloud application?
I am attempting to make a hosted CRM-style software application that will serve many hundreds of customers. The application is powered by a SQL Server database with many tables and a ColdFusion, HTML5, CSS, JavaScript front-end. If I were installing this application and its components at each client site, then each installation would be unique to that customer. But somehow I have to replicate this uniqueness in the cloud, which is baffling me.
Only two things have come to mind so far:
The need for a unique database per customer in SQL server
The need to change DB connection strings per customer in the web application
My thought process has hit a block when I try to envisage how to design the application to serve so many different customers. Even though the application that all customers use is the same (same DB tables, same front-end), the data they store and retrieve will be specific to them. So I was thinking that surely each customer needs a separate database created for them? Is it feasible to create a replica database for each customer? If I need to update some tables or add a new table, how would I do this for hundreds of different databases?
From the front-end, I guess each unique customer log-in would change the DB connection string so that they can only access their own database. Other than this, I can't think of anything else that needs to change on a per-customer basis.
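One way that connection-string switch is commonly implemented is with a small central "catalog" database that maps each customer to their database. A C# sketch purely for illustration (the question's front-end is ColdFusion, but the pattern is language-agnostic; the Tenants table and its columns are invented):

```csharp
using System;
using Microsoft.Data.SqlClient;

// Resolves the database connection string for a tenant at login time,
// using a central catalog database that maps each customer to their own DB.
public class TenantCatalog
{
    private readonly string _catalogConnStr;

    public TenantCatalog(string catalogConnStr) => _catalogConnStr = catalogConnStr;

    public string GetConnectionString(string tenantId)
    {
        using var conn = new SqlConnection(_catalogConnStr);
        conn.Open();
        using var cmd = new SqlCommand(
            "SELECT DbServer, DbName FROM dbo.Tenants WHERE TenantId = @id", conn);
        cmd.Parameters.AddWithValue("@id", tenantId);
        using var reader = cmd.ExecuteReader();
        if (!reader.Read())
            throw new InvalidOperationException($"Unknown tenant: {tenantId}");

        // Build the per-customer connection string from the catalog row.
        return new SqlConnectionStringBuilder
        {
            DataSource = reader.GetString(0),
            InitialCatalog = reader.GetString(1),
            IntegratedSecurity = true
        }.ConnectionString;
    }
}
```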
When a new customer wants to sign up, it needs to be clear to me what I need to create for them to have access to the application. I guess this is ultimately what I need to think of but I'm stuck.
If anyone can suggest some things to think of or if there is a book or website on this kind of thing that someone could point me to I'd really be very thankful.
EDIT:
I was looking at an article about Salesforce.com, and it says:
"In order to ensure privacy of data for each user and give an effect of each having their own database, the data from different users are securely isolated from one another."
Anyone know how this is achieved or how it may be done?
Found some great information here. It is called multi-tenant database design and seems to be a common topic. Once I get the database designed, the application can sit nicely on top.
https://dba.stackexchange.com/questions/1043/what-problems-will-i-get-creating-a-database-per-customer
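Besides a database per customer, the other common answer to the isolation question is a shared schema where every row carries a tenant key and the application filters on it everywhere; Salesforce's "effect of each having their own database" is generally achieved with this kind of row-level isolation rather than a literal database per customer. A sketch of the idea using EF Core's global query filters (the entity and property names are invented, and EF Core is used purely as a modern illustration of the pattern):

```csharp
using System;
using Microsoft.EntityFrameworkCore;

// Shared-schema multi-tenancy sketch: every row carries a TenantId and a
// global query filter keeps each tenant's data isolated automatically.
public class Contact
{
    public int Id { get; set; }
    public Guid TenantId { get; set; }
    public string Name { get; set; } = "";
}

public class CrmContext : DbContext
{
    private readonly Guid _tenantId;

    public CrmContext(DbContextOptions<CrmContext> options, Guid tenantId)
        : base(options) => _tenantId = tenantId;

    public DbSet<Contact> Contacts => Set<Contact>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Every query on Contact is transparently restricted to the
        // current tenant, which gives each customer the "effect" of
        // having their own database.
        modelBuilder.Entity<Contact>()
            .HasQueryFilter(c => c.TenantId == _tenantId);
    }
}
```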
I am new to iPad development. I have to develop an app for a client whose employees use iPads. The app should take the data that they have and store it in the main SQL Server database on their server. While researching, I came across the approach where people keep their data on the iPad and later sync it with their server. I have used SQLite for Android before, but that was a school project, basically CRUD operations. Since I have a little knowledge of SQLite, I want to pursue the app this way. My question is: can I write an app that will sync temporary SQLite data with the server once they sync? I have more questions...
Thanks.
It is certainly possible to synchronize data between multiple databases.
Generally speaking, you have to record all changes made since the last synchronization (usually done with serial numbers or timestamps), and apply those changes to the other database.
If the same data has been modified by multiple users, you have to resolve this conflict somehow.
If multiple users can add data, you have to prevent duplicates of primary keys.
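To make that concrete, here is a minimal sketch of the timestamp approach (the Note record and the last-writer-wins conflict rule are just illustrative choices; GUID primary keys are one common way to avoid key collisions between offline clients):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Each row carries a LastModified stamp; clients remember when they last
// synced and exchange only rows changed since then.
public record Note(Guid Id, string Text, DateTime LastModified);

public class SyncClient
{
    public DateTime LastSync { get; private set; } = DateTime.MinValue;

    // Pull: everything the other side changed since our last sync.
    public IEnumerable<Note> ChangesSince(IEnumerable<Note> otherSideRows) =>
        otherSideRows.Where(n => n.LastModified > LastSync);

    // One possible conflict rule: last writer wins.
    public Note Resolve(Note local, Note remote) =>
        local.LastModified >= remote.LastModified ? local : remote;

    public void MarkSynced() => LastSync = DateTime.UtcNow;
}
```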
See these Wikipedia articles for explanations of some related concepts:
Data synchronization
Replication
Change data capture
This may solve the problem, but it only supports Xamarin (iOS or Android).
http://forums.xamarin.com/discussion/5719/sync-sqlite-with-sql-server-merge-replication
First, please excuse my grammar mistakes.
Ok, this is what I already know :-),
I want to use EF and MVC 4, with an AngularJS UI. I need a database per user/group of users,
and the application may grow to 5000+ users. They also all share a single database
as a common resource; when a user searches for something, the results should come
both from the shared resource and from the user's own database.
Performance is extremely important.
In my research I found that EF can connect to different databases, but I couldn't find any proper way of doing so without writing tons of code.
Scenarios (a sketch of the first one follows this list):
A new user registers, and the system builds a new database for him.
A user logs in, and the system returns data from his database and from the shared database.
A user logs in, BUT the system database has been upgraded, so his database should be upgraded too.
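For scenario 1, the usual Entity Framework approach looks roughly like this; a sketch only, assuming EF 6-style database initializers, with the context, entity, and connection-string pattern all invented for illustration:

```csharp
using System.Data.Entity; // EF 6-era API, matching the question's stack

// Hypothetical per-user context; the entity is a placeholder.
public class UserContext : DbContext
{
    public UserContext(string connectionString) : base(connectionString) { }
    public DbSet<Item> Items { get; set; }
}

public class Item
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Registration
{
    // Scenario 1: a new user registers, so build a database for them.
    public static void CreateUserDatabase(string userName)
    {
        // Assumed naming convention: one database per user.
        string connStr =
            $@"Server=.\SQLEXPRESS;Database=App_{userName};Integrated Security=True";

        using (var ctx = new UserContext(connStr))
        {
            // Creates the database from the model if it doesn't exist yet.
            // With Code First Migrations enabled, the
            // MigrateDatabaseToLatestVersion initializer also covers
            // scenario 3 (upgrading each user's database on login).
            ctx.Database.CreateIfNotExists();
        }
    }
}
```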
Now I know that there is no easy method to achieve all of my goals,
but can you please direct me to what suits me best?
Again sorry for my English!
Thank you! :-)
For what it's worth, we have worked on several SaaS applications that use a shared database (a central repository) containing all the common user (tenant) data, plus a separate application database for every user group (tenant-based).
This works with ease in EF, with no performance overhead. You should not use cross-database queries; instead, focus on optimizing the EF code in your data access layer, and have separate services handle the task of merging the user data from the shared and per-user databases.
Maybe you should also analyze the application, find the data that is updated infrequently, and cache it, rendering it from a distributed cache like AppFabric.
With respect to synchronizing the user database and the centralized database: in the service tier, you can get this done by wrapping these calls in a .NET TransactionScope to preserve atomicity.
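A minimal sketch of that suggestion, with hypothetical repository types standing in for the EF data access code:

```csharp
using System;
using System.Transactions;

public record UserData(Guid Id, string Payload);

// Hypothetical repositories standing in for your EF data access layer.
public class TenantRepository { public void Save(UserData d) { /* tenant DB write */ } }
public class SharedRepository { public void UpdateIndex(UserData d) { /* shared DB write */ } }

public static class UserDataService
{
    // The tenant-database write and the shared-database write
    // succeed or fail together.
    public static void SaveUserData(TenantRepository tenantRepo,
                                    SharedRepository sharedRepo, UserData data)
    {
        using var scope = new TransactionScope();

        tenantRepo.Save(data);        // write to the user's own database
        sharedRepo.UpdateIndex(data); // update the shared/central repository

        // If either write throws, Complete() is never reached and both
        // changes roll back when the scope is disposed.
        scope.Complete();
    }
}
```

Note that a TransactionScope spanning two databases will typically escalate to a distributed transaction (MSDTC), which is worth weighing against the performance requirement.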
Please post your thoughts and any further questions in a reply.
I've built a django/satchmo ecommerce site which is starting to get some traffic, and I am having a problem because I do not have a smart way to deal with database changes. When I develop the site on my local system, I make changes to the layout and the DB, which manages the product attributes.
When I want to push new developments to the server, I have to overwrite the server database which has information about recent shoppers and purchases.
What I want to do is "merge" the two databases together so that new purchases still stay recorded in the database, but which also allow me to push local changes to the server.
I'd appreciate any advice. Thanks.
Are you using South? (If not, you should.)
In particular, have a look at data migrations.
My current development project has two aspects to it. First, there is a public website where external users can submit and update information for various purposes. This information is then saved to a local SQL Server at the colo facility.
The second aspect is an internal application which employees use to manage those same records (conceptually) and provide status updates, approvals, etc. This application is hosted within the corporate firewall with its own local SQL Server database.
The two networks are connected by a hardware VPN solution, which is decent, but obviously not the speediest thing in the world.
The two databases are similar, and share many of the same tables, but they are not 100% the same. Many of the tables on both sides are very specific to either the internal or external application.
So the question is: when a user updates their information or submits a record on the public website, how do you transfer that data to the internal application's database so it can be managed by the internal staff? And vice versa... how do you push updates made by the staff back out to the website?
It is worth mentioning that the more "real time" these updates occur, the better. Not that it has to be instant, just reasonably quick.
So far, I have thought about using the following types of approaches:
Bi-directional replication
Web service interfaces on both sides with code to sync the changes as they are made (in real time).
Web service interfaces on both sides with code to asynchronously sync the changes (using a queueing mechanism).
Any advice? Has anyone run into this problem before? Did you come up with a solution that worked well for you?
This is a pretty common integration scenario, I believe. Personally, I think an asynchronous messaging solution using a queue is ideal.
You should be able to achieve near real time synchronization without the overhead or complexity of something like replication.
Synchronous web services are not ideal because your code will have to be very sophisticated to handle failure scenarios. What happens when one system is restarted while the other continues to publish changes? Does the sending system get timeouts? What does it do with those? Unless you are prepared to lose data, you'll want some sort of transactional queue (like MSMQ) to receive the change notices and take care of making sure they get to the other system. If either system is down, the changes (passed as messages) will just accumulate and as soon as a connection can be established the re-starting server will process all the queued messages and catch up, making system integrity much, much easier to achieve.
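To illustrate the transactional-queue idea with MSMQ's classic System.Messaging API (the queue path and the plain-string payload are just for illustration):

```csharp
using System.Messaging; // classic .NET Framework MSMQ API

// Minimal sketch: the sender enqueues a change notice; if the other
// system is down, the message just waits on the queue.
public static class ChangeNotifier
{
    private const string QueuePath = @".\private$\record-changes";

    private static void EnsureQueue()
    {
        // The queue must be created as transactional for this to work.
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath, transactional: true);
    }

    public static void SendChange(string recordJson)
    {
        EnsureQueue();
        using var queue = new MessageQueue(QueuePath);
        using var tx = new MessageQueueTransaction();
        tx.Begin();
        queue.Send(recordJson, tx); // durably enqueued, survives restarts
        tx.Commit();
    }

    public static string ReceiveChange()
    {
        EnsureQueue();
        using var queue = new MessageQueue(QueuePath);
        queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
        using var tx = new MessageQueueTransaction();
        tx.Begin();
        var message = queue.Receive(tx); // blocks until a message arrives
        tx.Commit();
        return (string)message.Body;
    }
}
```

Because both the send and the receive are transactional, a crash mid-processing leaves the message on the queue instead of losing it.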
There are some open source tools that can really make this easy for you if you are using .NET (especially if you want to use MSMQ).
nServiceBus by Udi Dahan
Mass Transit by Dru Sellers and Chris Patterson
There are commercial products also, and if you are considering a commercial option see here for a list of options on .NET. Of course, WCF can do async messaging using MSMQ bindings, but a tool like nServiceBus or MassTransit will give you a very simple Send/Receive or Pub/Sub API that will make your requirement a very straightforward job.
If you're using Java, there are any number of open source service bus implementations that will make this kind of bi-directional, asynchronous messaging a snap, like Mule or maybe just ActiveMQ.
You may also want to consider reading Udi Dahan's blog and listening to some of his podcasts. Here are some more good resources to get you started.
I'm mid-way through a similar project except I have multiple sites that need to keep in sync over slow connections (dial-up in some cases).
Firstly, you need to track changes. If you can use SQL Server 2008 (even the Express edition is enough if the 2 GB limit isn't a problem), this will ease the pain greatly: just turn on Change Tracking on the database and on each table. We're using SQL Server 2008 at the head office with the extended schema, and SQL Server Express 2008 at each site with a subset of the data and a limited schema.
Secondly, you need to move those changes around; Sync Services does the trick nicely and supports using a WCF gateway into the main database. In this case you will need to use the "Sync using SQL Express Client" sample as a starting point; note that it's based on SQL Server 2005, so you'll need to update it to take advantage of the Change Tracking features in 2008. By default, Sync Services uses SQL Server CE on the clients, which I'm sure isn't enough in your case. You'll need a service that runs on your web server and periodically (as often as every 10 seconds, if you want) runs the Synchronize() method. This will tell your main database about changes made locally and then ask the server for all changes made there. You can set up the get-and-apply SQL code to call stored procedures, and you can add event handlers to handle conflicts (e.g. client update vs. server update) and resolve them accordingly at each end.
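The periodic service can be as simple as a loop around SyncAgent.Synchronize(); a rough sketch, assuming the agent's local and remote providers have already been wired up as in the sample mentioned above:

```csharp
using System;
using System.Threading;
using Microsoft.Synchronization.Data; // Sync Services for ADO.NET

public class PeriodicSync
{
    private readonly SyncAgent _agent;

    // The agent's LocalProvider/RemoteProvider wiring (SQL Express client,
    // WCF gateway to the main database) comes from the sample and is elided.
    public PeriodicSync(SyncAgent configuredAgent) => _agent = configuredAgent;

    public void Run()
    {
        while (true)
        {
            // Pushes local changes up, then pulls server changes down.
            SyncStatistics stats = _agent.Synchronize();
            Console.WriteLine(
                "Uploaded {0}, downloaded {1} changes.",
                stats.TotalChangesUploaded, stats.TotalChangesDownloaded);

            Thread.Sleep(TimeSpan.FromSeconds(10)); // "as often as every 10 seconds"
        }
    }
}
```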
We have a shop as a client, with three stores connected to the same VPN.
Two of the shops have a computer running as a "server" for that shop, and the third one has the "master database".
To synchronize everything to the master we don't have the best solution, but it works: there is a dedicated PC running an application that checks the timestamp of every record in every table of the two stores, and if it is different from the last time it synchronized, it copies the record.
Note that this works both ways. I.e. if you update a product in the master database, this change will propagate to the other two shops. If you have a new order in one of the shops, it will be transmitted to the "master".
With some optimizations you can have all the shops synchronized in around 20 minutes.
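A sketch of what that checking application might do for one table; the column names, the GUID key, and the MERGE-based upsert are all assumptions, not the original implementation:

```csharp
using System;
using Microsoft.Data.SqlClient;

// Pull rows whose LastUpdated is newer than the previous sync pass
// and copy them to the master database.
public static class StoreSync
{
    public static void CopyNewerRows(string storeConnStr, string masterConnStr,
                                     string table, DateTime lastRun)
    {
        using var source = new SqlConnection(storeConnStr);
        using var target = new SqlConnection(masterConnStr);
        source.Open();
        target.Open();

        var select = new SqlCommand(
            $"SELECT Id, Payload, LastUpdated FROM {table} WHERE LastUpdated > @since",
            source);
        select.Parameters.AddWithValue("@since", lastRun);

        using var reader = select.ExecuteReader();
        while (reader.Read())
        {
            // Upsert each changed row into the master copy of the table.
            var upsert = new SqlCommand(
                $@"MERGE {table} AS t USING (SELECT @id AS Id) AS s ON t.Id = s.Id
                   WHEN MATCHED THEN UPDATE SET Payload = @p, LastUpdated = @ts
                   WHEN NOT MATCHED THEN INSERT (Id, Payload, LastUpdated)
                       VALUES (@id, @p, @ts);", target);
            upsert.Parameters.AddWithValue("@id", reader.GetGuid(0));
            upsert.Parameters.AddWithValue("@p", reader.GetString(1));
            upsert.Parameters.AddWithValue("@ts", reader.GetDateTime(2));
            upsert.ExecuteNonQuery();
        }
    }
}
```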
Recently I have had a lot of success with SQL Server Service Broker which offers reliable, persisted asynchronous messaging out of the box with very little implementation pain.
It is quick to set up and as you learn more you can use some of the more advanced features.
Unknown to most, it is also part of the desktop editions, so it can be used as a workstation messaging system.
If you have existing T-SQL skills, they can be leveraged, as all the code to read and write messages is done in SQL.
It is blindingly fast.
It is a vastly under-hyped part of SQL Server and well worth a look.
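To give a flavour of what that SQL looks like when driven from application code, here is a hedged sketch; the message type, contract, service, and queue names are all invented and would have to be created beforehand with the usual CREATE MESSAGE TYPE / CREATE CONTRACT / CREATE QUEUE / CREATE SERVICE statements:

```csharp
using Microsoft.Data.SqlClient;

// Illustrative only: object names like //Demo/ChangeMessage are invented,
// and a real implementation would also END CONVERSATION when finished.
public static class BrokerMessaging
{
    public static void Send(SqlConnection conn, string body)
    {
        var cmd = new SqlCommand(@"
            DECLARE @h UNIQUEIDENTIFIER;
            BEGIN DIALOG CONVERSATION @h
                FROM SERVICE [//Demo/SenderService]
                TO SERVICE   '//Demo/ReceiverService'
                ON CONTRACT  [//Demo/ChangeContract]
                WITH ENCRYPTION = OFF;
            SEND ON CONVERSATION @h
                MESSAGE TYPE [//Demo/ChangeMessage] (@body);", conn);
        cmd.Parameters.AddWithValue("@body", body);
        cmd.ExecuteNonQuery();
    }

    public static string Receive(SqlConnection conn)
    {
        // Blocks for up to 5 seconds waiting for a message to arrive.
        var cmd = new SqlCommand(@"
            WAITFOR (
                RECEIVE TOP (1) CAST(message_body AS NVARCHAR(MAX))
                FROM dbo.ReceiverQueue
            ), TIMEOUT 5000;", conn);
        return cmd.ExecuteScalar() as string;
    }
}
```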
I'd say just have a job that copies the data from the public database's input table into a pending table in the private database. Then, once you update the data on the private side, have it replicated to the public side. If none of the replicated data is updated on the public side, it should be a fairly easy transactional replication solution.