How to connect to a distributed database in a web application?

I plan to use a distributed database, but my question is whether a distributed database schema can actually be used from a web application, specifically with Java (Servlets, JSP).
Here is the situation: I plan to build a small management system for a cinema with multiple branches, where users can buy tickets for a showing. I want the system to use a distributed database, but I don't know whether the idea is feasible.

One option is to simply use a centralized database. Why is this not an option?
Another option is to use a database at each location and have them synchronize to a central database (master-slave) or with each other (master-master). Look into replication for your database of choice, or database sync tools like SymmetricDS.
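If you do start with the centralized approach, the Java web tier simply needs a JDBC connection to that one database. Below is a minimal sketch for the cinema scenario, assuming a MySQL central database and MySQL Connector/J on the classpath; the host, credentials, and table/column names are placeholders, not a prescribed schema.

    // Minimal sketch: a servlet listing showings for one branch from a single central database.
    // The JDBC URL, credentials, and table/column names are illustrative placeholders.
    import java.io.IOException;
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class ShowingsServlet extends HttpServlet {
        // One central database reachable from every branch.
        private static final String URL = "jdbc:mysql://central-db.example.com:3306/cinema";
        private static final String USER = "app";
        private static final String PASS = "secret";

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String branchId = req.getParameter("branch");
            resp.setContentType("text/plain");
            try (Connection con = DriverManager.getConnection(URL, USER, PASS);
                 PreparedStatement ps = con.prepareStatement(
                         "SELECT title, starts_at FROM showing WHERE branch_id = ?")) {
                ps.setString(1, branchId);
                try (ResultSet rs = ps.executeQuery()) {
                    PrintWriter out = resp.getWriter();
                    while (rs.next()) {
                        out.println(rs.getString("title") + " @ " + rs.getTimestamp("starts_at"));
                    }
                }
            } catch (SQLException e) {
                throw new ServletException("Could not load showings", e);
            }
        }
    }

In a real deployment you would normally obtain connections from a container-managed pool (a JNDI DataSource) rather than DriverManager, but the data-access code looks the same whether the database behind it is centralized or replicated per branch.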

Related

postgresql database server and many users

I'm relatively new to databases - I've used postgresql in the past to create databases stored on my computer and accessed only by myself.
I'm currently designing a database that will be used and edited by multiple people (10-15 max) living in different parts of the world. What is the best way to ensure we will all have access to the most current version of the database? Is it best to continue storing the database on my individual computer? Should I host the database on a cloud server? I've read that it is dangerous to store databases on Dropbox.
We are social science researchers organizing our data into a single database.
Based on your comment about your computer not always being on and connected, it seems to me that a cloud service is the way to go for you. There are two approaches there: just rent a machine (e.g. AWS EC2) and install and manage the database software yourself, or use a cloud provider's managed database service (e.g. AWS RDS). The names are just by way of concrete examples; there are other providers of each type of service.
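By way of illustration only, connecting to such a hosted PostgreSQL instance from a program looks the same as connecting to a local one, just with a remote host and SSL turned on. A minimal Java sketch, assuming the org.postgresql JDBC driver; the hostname, database name, and credentials are placeholders:

    // Minimal sketch: every collaborator connects to one shared, cloud-hosted PostgreSQL
    // instance rather than a copy on someone's laptop. Host, database, and credentials
    // below are placeholders.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    public class SharedDbCheck {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.setProperty("user", "researcher");
            props.setProperty("password", "secret");
            props.setProperty("sslmode", "require"); // encrypt traffic to the hosted server

            String url = "jdbc:postgresql://mydb.example-rds.amazonaws.com:5432/fieldwork";
            try (Connection con = DriverManager.getConnection(url, props);
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT current_database(), version()")) {
                if (rs.next()) {
                    System.out.println("Connected to " + rs.getString(1) + " running " + rs.getString(2));
                }
            }
        }
    }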

Can I store any custom tables in SharePoint system database?

Can I store any custom tables in SharePoint's own database?
Is this supported behavior or not?
(I mean tables in MS SQL database, not SharePoint lists.)
If I can, how well does this play with backup/restore functionality?
What are possible caveats?
For anyone wondering why I'm asking: there's an app which is bound to SharePoint server and needs to store some purely relational internal information that doesn't make sense apart from that SharePoint instance. I would like to narrow down data storage to one place but I'm not sure if SharePoint likes its database being used for other purposes.
I'm using SharePoint 2007.
Is it possible? Sure. Should you? Nope.
The SharePoint content/configuration databases are subject to change with any update Microsoft releases; any changes you make will very likely be destroyed and, if your farm depends on them, leave it non-functional.
If you want to store purely relational data in a set of tables, just create another database. There's nothing stopping you from using the same SQL Server instance that houses your SharePoint content and/or configuration databases to store other relational databases as well.
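As a rough sketch of the "separate database on the same instance" idea (purely illustrative: the server name, login, database, and table below are made up, and it assumes the Microsoft JDBC driver), an application can bootstrap its own database next to the SharePoint ones without ever touching them:

    // Minimal sketch: create and use an application-owned database alongside (not inside)
    // the SharePoint content/configuration databases on the same SQL Server instance.
    // Assumes the mssql-jdbc driver; host, credentials, and names are placeholders.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class CreateAppDatabase {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:sqlserver://sharepoint-sql.example.com:1433;"
                    + "databaseName=master;encrypt=true;trustServerCertificate=true";
            try (Connection con = DriverManager.getConnection(url, "appAdmin", "secret")) {
                try (Statement st = con.createStatement()) {
                    st.executeUpdate("IF DB_ID('AppInternalData') IS NULL CREATE DATABASE AppInternalData");
                }
                con.setCatalog("AppInternalData"); // switch to the new, application-owned database
                try (Statement st = con.createStatement()) {
                    st.executeUpdate("IF OBJECT_ID('dbo.AppSetting') IS NULL "
                            + "CREATE TABLE dbo.AppSetting (Name NVARCHAR(100) PRIMARY KEY, Value NVARCHAR(400))");
                }
            }
        }
    }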
Not a good idea: Support for changes to the databases used by Windows SharePoint Services
...
Making any modification to the database schema
Adding tables to any of the databases
...
If an unsupported database modification is discovered during a support call, the customer must perform one of the following procedures at a minimum:
Perform a database restoration from the last known good backup that did not include the database modifications
Roll back all the database modifications
It is even worse than the above. It is likely that future upgrades will notice your changes to the content database schema and refuse to upgrade the database, period.

Export from a standalone database to an embedded database

I have a two-part application, where there is a central database that is edited, and then at certain times, the data is released and distributed as its own application. I would like to use a standalone database for the central database (MySQL, Postgres, Oracle, SQL Server, etc.) and then have a reliable export to an embedded database (probably SQLite) for distribution.
What tools/processes are available for such an export, or is it a practice to be avoided?
EDIT: A couple of additional pieces of information. The distributed application should be able to run without having to connect to another server (e.g. your spellchecker still works even if you don't have internet), and I don't want to install a full DB server for read-only access to the data.
If you really only want your clients to have read-only access to the offline data, it should not be that difficult to update your client data manually.
A good practice would be to use the same product for the server database and the client database. You wouldn't have to write SQL statements twice, since both use the same SQL dialect and the same features.
Firebird, for example, offers a server version and an embedded version. Microsoft also offers SQL Server in a mobile version (Compact Edition), and there are synchronization services provided by Microsoft (a good blog post describing Sync Services in Visual Studio: http://keithelder.net/blog/archive/2007/09/23/Sync-Services-for-SQL-Server-Compact-Edition-3.5-in-Visual.aspx).
MySQL has a product called "MySQLMobile", but I have never actually used it.
I can also recommend SQLite as an embedded database since it is very easy to use.
Depending on your bandwidth and data volume, you could even download the whole database and delete the old one (in Firebird, for example, you only copy the database files, and this also works with the mobile version). Very easy, but you have to know whether it will work for your scenario. If you have more data, you will need something more flexible and sophisticated that only updates the data that has really changed.
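As a concrete illustration of the server-to-SQLite export, here is a minimal sketch that copies one table from the central database into a fresh SQLite file using plain JDBC. It assumes the Xerial sqlite-jdbc driver plus a driver for whatever the source database is; the connection URLs, credentials, and the product table are placeholders, not a recommended schema.

    // Minimal sketch of a one-way export: read a table from the central server database
    // and write it into a fresh SQLite file that ships with the read-only client.
    // URLs, credentials, and table/columns below are illustrative placeholders.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ExportToSqlite {
        public static void main(String[] args) throws Exception {
            try (Connection src = DriverManager.getConnection(
                         "jdbc:postgresql://central.example.com:5432/catalog", "exporter", "secret");
                 Connection dst = DriverManager.getConnection("jdbc:sqlite:catalog-release.db")) {

                try (Statement st = dst.createStatement()) {
                    st.executeUpdate("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, price REAL)");
                }

                dst.setAutoCommit(false); // one transaction for all inserts keeps SQLite fast
                try (Statement read = src.createStatement();
                     ResultSet rs = read.executeQuery("SELECT id, name, price FROM product");
                     PreparedStatement insert = dst.prepareStatement(
                             "INSERT INTO product (id, name, price) VALUES (?, ?, ?)")) {
                    while (rs.next()) {
                        insert.setLong(1, rs.getLong("id"));
                        insert.setString(2, rs.getString("name"));
                        insert.setDouble(3, rs.getDouble("price"));
                        insert.addBatch();
                    }
                    insert.executeBatch();
                }
                dst.commit();
            }
        }
    }

In practice you would generate the table list and column mappings from the source catalog rather than hard-coding them, but the shape of the job stays the same: one read connection, one write connection, and a fresh SQLite file per release.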

What is the best way to configure a SQL Server for 50 developers?

If I am running an organization that has 50 .net developers and all are using SQL Server, what is the best way to make a single SQL Server available to them?
Here are some of the concerns I want to be careful about:
Should I configure database users per project or per user? or both?
Should I provide single SQL Server instance?
Edit:
How can I track changes done by each user in database?
There are some more concerns but I think getting answer of these two will be a good starting point.
You should definitely configure a database per project, as only project-specific items should be in that database. A database per project is also a good idea for backup and restore purposes.
How you configure databases for your developers depends on how many of them will actually develop against the database itself (creating tables, views, etc.). Database developers should probably have some sort of test copy of the database they can use to develop their end of things, while the 'regular' developers work against a published copy of that database.
So a setup could be: two databases per project, one for database development and one for other development.
This way, changes to the database schema can first be developed and tested before being pushed out to the rest of the developers.
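One way to make that split concrete is to keep the connection details per project and per stage in configuration, so a developer points at either the database-development copy or the published copy without code changes. A minimal Java sketch; the file name, property keys, and URLs are purely illustrative:

    // Minimal sketch: resolve a project's "dev" or "published" database from a properties file.
    // Example entries (placeholders):
    //   billing.dev.url=jdbc:sqlserver://sql01;databaseName=Billing_Dev
    //   billing.published.url=jdbc:sqlserver://sql01;databaseName=Billing
    import java.io.FileInputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    public class ProjectDb {
        public static Connection open(String project, String stage) throws Exception {
            Properties cfg = new Properties();
            try (FileInputStream in = new FileInputStream("databases.properties")) {
                cfg.load(in);
            }
            String prefix = project + "." + stage + ".";
            return DriverManager.getConnection(
                    cfg.getProperty(prefix + "url"),
                    cfg.getProperty(prefix + "user"),
                    cfg.getProperty(prefix + "password"));
        }
    }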

How to update a database remotely?

I'm looking for a strategy to allow automatic updates for a number of databases at customer sites through a publish-subscribe kind of mechanism. Right now there is a datacenter which holds all the master data, fed through extractions from hundreds of databases out there. The problem is that whenever I need to create a new view in the remote customer databases, I have to manually roll out an installation patch and ask the users to run it (their sites are behind firewalls, so I can't do that remotely from my end). Ideally, I would like to have a "DDL image" of the customer database schema at the datacenter, and whenever any change happens to it, all the subscribing customer databases would update their table and view code. The target databases are mostly SQL Server 2005 and Oracle.
I heard that MS SQL replication services can do such a thing? What about Oracle? Has anybody had experience with this?
Thanks!
Not sure about existing solutions, but how about writing your own auto-update mechanism that would run on a timer on the client machines and pull the latest schemas and views from some service table in your master database? Your change wouldn't get propagated straight away to all sites and some sites would update before others, but they would all eventually see the changes.
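As a rough illustration of that timer-based pull approach (everything here is an assumption: the schema_change and applied_schema_change tables, their columns, and the connection details are invented for the sketch), a client-side job could look roughly like this in Java:

    // Minimal sketch: each customer site periodically pulls pending DDL from a
    // (hypothetical) schema_change table at the datacenter and applies whatever it
    // has not run yet against the local database. Tables, columns, and URLs are invented.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Timer;
    import java.util.TimerTask;

    public class SchemaUpdater {
        public static void main(String[] args) throws Exception {
            Timer timer = new Timer("schema-updater", true);
            timer.schedule(new TimerTask() {
                @Override public void run() { applyPendingChanges(); }
            }, 0, 60 * 60 * 1000L); // check once an hour
            Thread.sleep(Long.MAX_VALUE); // keep the demo process alive
        }

        static void applyPendingChanges() {
            try (Connection master = DriverManager.getConnection(
                         "jdbc:sqlserver://datacenter.example.com;databaseName=MasterData", "svc", "secret");
                 Connection local = DriverManager.getConnection(
                         "jdbc:sqlserver://localhost;databaseName=CustomerDb", "svc", "secret");
                 PreparedStatement pending = master.prepareStatement(
                         "SELECT version, ddl_text FROM schema_change WHERE version > ? ORDER BY version")) {

                pending.setLong(1, lastAppliedVersion(local));
                try (ResultSet rs = pending.executeQuery(); Statement apply = local.createStatement()) {
                    while (rs.next()) {
                        apply.executeUpdate(rs.getString("ddl_text")); // e.g. CREATE VIEW ...
                        recordVersion(local, rs.getLong("version"));   // remember what has been run
                    }
                }
            } catch (Exception e) {
                e.printStackTrace(); // the site pulls on its own schedule, so just log and retry next cycle
            }
        }

        static long lastAppliedVersion(Connection local) throws Exception {
            try (Statement st = local.createStatement();
                 ResultSet rs = st.executeQuery("SELECT COALESCE(MAX(version), 0) FROM applied_schema_change")) {
                return rs.next() ? rs.getLong(1) : 0L;
            }
        }

        static void recordVersion(Connection local, long version) throws Exception {
            try (PreparedStatement ps = local.prepareStatement(
                    "INSERT INTO applied_schema_change (version) VALUES (?)")) {
                ps.setLong(1, version);
                ps.executeUpdate();
            }
        }
    }

Since the targets are a mix of SQL Server 2005 and Oracle, the stored DDL would need to be kept per platform (or limited to syntax both accept), which is one reason the vendor replication features mentioned above may end up being less work.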
Oracle GoldenGate might fit your needs.
