I have created a blogging system with PHP + PostgreSQL.
Now I want to add a web chat (in REAL TIME, for millions of users simultaneously) where every message is saved in the database.
I am thinking of using Erlang + Mnesia on a different web server for this.
The messages table will look like this (a rough sketch follows below):
message_id, user_id, message, date
user_id should be related to the users table in the PostgreSQL database on another web server.
How can I do that without losing performance?
If you have any other creative solutions, please tell me ;).
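For concreteness, here is roughly what I mean in PHP/PostgreSQL terms (just a sketch; the DSN, credentials and column types are only illustrative, and the real chat side would live in Erlang):

    <?php
    // Rough sketch only: DSN, credentials and column types are illustrative.
    // This just shows the table shape and how a message would be persisted.
    $pdo = new PDO('pgsql:host=localhost;dbname=blog', 'chat_user', 'secret', [
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ]);

    // One-time schema, matching the columns above:
    $pdo->exec('
        CREATE TABLE IF NOT EXISTS messages (
            message_id BIGSERIAL   PRIMARY KEY,
            user_id    BIGINT      NOT NULL,  -- points at users.id, possibly on another server
            message    TEXT        NOT NULL,
            date       TIMESTAMPTZ NOT NULL DEFAULT now()
        )
    ');

    // Saving a single message:
    $stmt = $pdo->prepare('INSERT INTO messages (user_id, message) VALUES (:uid, :msg)');
    $stmt->execute([':uid' => 42, ':msg' => 'hello world']);

Since the users table lives in PostgreSQL on another server, I guess user_id can't be a real foreign key here and would have to be validated in application code (or with something like dblink).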
I'm not sure why you want to save every single message in a database, but Mnesia doesn't sound like a particularly good choice for doing that. Mnesia is more of a distributed key-value store that you can use to keep the state of your application, when you need to store "tabular data" and query it in a simple to medium-complex fashion.
For large amounts of text, I've heard Lucene is supposed to be good; it has full-text search features that are said to be efficient. You might want to look into it:
Apache Lucene Project page
Other than that, using Erlang as the chat server and Mnesia to hold all the other state sounds like a good idea. You could write a JavaScript client that uses something like JSONP (to overcome the cross-domain issue) and MochiWeb on the Erlang side to do the web server part.
Writing the rest of the core chat system should be fairly simple; the fun part, so to speak :)
Mnesia can certainly do what you suggest, but if you've already got Postgres set up, is there some reason you don't want to use that? It might be simpler than creating a whole separate service, and if you want Erlang to run the chat service, it has Postgres drivers.
This project is using PostgreSQL with great success:
http://zotonic.com/
You may want to reuse its code for PostgreSQL database access.
I'm beginning to pursue my first online project, which I expect will need to scale, so I have opted for a NoSQL DB. After some reading and some modeling of what my queries would look like, there are two databases I am considering: Cassandra seems like the right choice for item lookups by keyword, but MongoDB sounds like the right choice for initially entering the data, as it can retain the account structure in document form.
This split decision has left me wondering: are there any major companies that use multiple database types to store different kinds of items, for example using both Cassandra and MongoDB together?
I would think scaling up would be more difficult, but are the added benefits (if there are any) worth the trouble? I'm not the expert on this; I'm hoping you are. Thanks in advance for sharing your experience.
Cassandra can handle both use cases, so you can use the same database for both purposes.
Stargate (https://stargate.io/) is an open-source API platform which provides a data gateway to Cassandra with REST API, GraphQL API, Document API and even native CQL access.
The Document API lets you save and search schemaless JSON documents to/from Cassandra directly from your app.
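For example, from a PHP app a document write could look roughly like this. The base URL, the /v2/namespaces/{ns}/collections/{coll} path and the X-Cassandra-Token header are assumptions drawn from the Stargate docs, so double-check them against the documentation linked below before relying on this sketch:

    <?php
    // Rough sketch: writing one JSON document through the Stargate Document API.
    // Base URL, path and X-Cassandra-Token header are assumptions based on the
    // Stargate documentation; verify them against the official docs.
    $baseUrl = 'http://localhost:8082';              // assumed local Stargate endpoint
    $token   = getenv('ASTRA_DB_APPLICATION_TOKEN'); // assumed auth token variable

    $account = ['name' => 'Acme', 'items' => [['sku' => 'A1', 'keyword' => 'widget']]];

    $ch = curl_init("$baseUrl/v2/namespaces/store/collections/accounts");
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($account),
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            "X-Cassandra-Token: $token",
        ],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    echo curl_exec($ch); // Stargate responds with the generated document id as JSON
    curl_close($ch);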
You can try it out for free on Astra with no credit card required. In just a few clicks, you'll be able to launch a Cassandra cluster with Stargate pre-configured, so you can use the Document API straight out of the box and build a proof-of-concept app immediately, without having to worry about downloading, installing, or configuring a Cassandra cluster.
There are even sample apps you can access straight from the Astra dashboard so you can see Stargate in action. For more info, see Using the Document API on Astra. Cheers!
Using multiple database technologies in the same project is fairly common nowadays; it is called "polyglot persistence".
Many people use this approach to take advantage of several systems at once: as you mentioned, Cassandra is right for some things and something else (maybe MongoDB) is best for others, so a combination can give you the advantages of both worlds.
Scaling, replication and support can be more costly when you use multiple technologies, because you need expertise in both to support them.
So if you really have use cases where Cassandra won't be a good choice, and you have some primary use cases where Cassandra is the best choice, then yes, going with two databases can be the best option, provided you are ready to take on the trouble of supporting two systems.
I am working on a web app for a client that has a cPanel virtual server, and it appears that I can only use MySQL, but I want to store the data using a JSON-like structure so that I can more easily use Angular.js on the frontend.
I've looked into installing a NoSQL database, and I can't find anything viable (if you know of a way to do that, it would be my preferred solution), so I'm thinking of storing the data as JSON strings in a series of text files on the server that I would write to with PHP.
I'd like to hear some opinions, and whether there are any better solutions I'm not thinking of.
Go look at Firebase and thank me afterwards.
In short, Firebase is a cloud real-time JSON data store. Everything on the backend is done for you, and all you need to do is build the frontend. Their servers are backed by CDNs, which means it will be great if you're looking to serve the entire world. All you need to do is configure your data structure and use it!
It also provides sockets, which is great for real-time data (used for games, chat, etc.).
There is a free tier. The only downside is that it gets a little expensive if you want to scale it; nevertheless, if your app really gets to that stage, I'm sure you'll have the money to hire some people to develop a similar backend for yourself.
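For a flavour of what writing data looks like from PHP, here is a hedged sketch against the Realtime Database REST interface (the database URL is a placeholder and authentication is left out; the exact rules and tokens depend on how the Firebase project is configured):

    <?php
    // Hedged sketch: pushing one record into a Firebase Realtime Database via
    // its REST interface (a database path with ".json" appended). The URL is a
    // placeholder and auth is omitted; configure security rules/tokens first.
    $databaseUrl = 'https://your-app.firebaseio.com'; // placeholder project URL

    $record = ['title' => 'Hello', 'body' => 'First entry', 'created' => date('c')];

    $ch = curl_init("$databaseUrl/records.json");     // POST appends under /records
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($record),
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    echo curl_exec($ch); // Firebase replies with the generated push key, e.g. {"name":"-N..."}
    curl_close($ch);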
We are looking for a system that is able to do the following:
Browse through records containing plain-text data (names/IDs/suppliers/show links, etc.), but also be able to store the PDFs that go along with this information.
The system should preferably be server/web based so that it is accessible over the intranet.
Any ideas?
Thanks
There are usually two opposite ways to solve IT tasks: take something that already fits your requirements, or build it yourself. Your task does not sound very complicated, so I think one solution could be to set it up yourself (or ask somebody to do so).
I would prefer Perl as the programming language, the Dancer web framework, and maybe MongoDB to store your data. You can easily run the app with Starman and use Apache in front of it as a proxy.
This environment is easy to expand to your needs and would fit your requirements exactly, nothing more, nothing less.
For my new project I'm planning to use JSON data in a text file rather than fetching data from the database. My concept is to save a JSON file on the server whenever the admin creates a new entry in the database (see the sketch below).
As there is no security concern here, will this approach make data access faster for users, or should I go with the usual database queries?
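Something like this is what I have in mind (just a rough sketch; the table, columns, DSN and file path are made up for illustration):

    <?php
    // Rough sketch of the idea: after the admin saves an entry, re-export the
    // table to a static JSON file that the frontend can fetch directly.
    // Table name, columns, DSN and file path are placeholders.
    function rebuildJsonCache(PDO $pdo, string $cacheFile): void
    {
        $rows = $pdo->query('SELECT id, title, body, created_at FROM entries ORDER BY created_at DESC')
                    ->fetchAll(PDO::FETCH_ASSOC);

        // LOCK_EX stops two writers interleaving; for a fully atomic publish you
        // would write to a temp file and rename() it over the old one.
        file_put_contents($cacheFile, json_encode($rows, JSON_PRETTY_PRINT), LOCK_EX);
    }

    $pdo = new PDO('mysql:host=localhost;dbname=site', 'admin', 'secret');
    // ... the admin's INSERT/UPDATE runs here ...
    rebuildJsonCache($pdo, __DIR__ . '/cache/entries.json');

I realise this means keeping the file in sync with the database by hand whenever an entry changes.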
JSON is typically used as a way to format the data for the purpose of transporting it somewhere. Databases are typically used for storing data.
What you've described may be perfectly sensible, but you really need to say a little bit more about your project before the community can comment on your approach.
What's the pattern of access? Is it always read-only for the user and editable only by the site administrator, for example?
You shouldn't worry about performance early on. Worry more about ease of development, maintenance and reliability; you can always optimise afterwards.
You may want to look at http://www.mongodb.org/. MongoDB is a document-centric store that uses a JSON-like format (BSON) for storage.
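For example, with the official PHP library (the composer package mongodb/mongodb on top of the mongodb extension; the database and collection names below are placeholders), storing and reading a document looks roughly like this:

    <?php
    // Hedged sketch with the official PHP library (composer: mongodb/mongodb,
    // which requires the mongodb extension). Names are placeholders.
    require 'vendor/autoload.php';

    $client     = new MongoDB\Client('mongodb://localhost:27017');
    $collection = $client->site->entries;

    // Stored as BSON on disk, but written and queried in a JSON-like shape:
    $collection->insertOne(['title' => 'Hello', 'body' => 'First entry']);

    foreach ($collection->find(['title' => 'Hello']) as $doc) {
        echo $doc['body'], PHP_EOL;
    }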
JSON in combination with jQuery is a great option for fast, smooth web page updates, but ultimately it still comes down to the same database query.
Just make sure your query is efficient. Use a stored procedure.
JSON is just the way the data is sent from the server (a web controller in MVC, or code-behind in standard C#) to the client (jQuery or JavaScript).
Ultimately the database will be queried in the same way.
You should stick with the classic method (database), because you'll face many problems with concurrency and with having too many files to handle.
I think you should go with the usual database queries.
If you use JSON files you'll have to keep them in sync with the DB (which means extra work) and you'll face I/O problems if your site gets really busy.
I've been struggling with this issue for a while. Our company servers lack any sort of database, i.e. no MySQL, MongoDB, etc. in sight.
Since we can't install any for reasons beyond the scope of this question, I was wondering if there is any alternative I could use to save data from a form. (We collect prospect data through a form on our site, which then sends this data in the form of an email and is fed into our internal database through email2DB...)
You could use a library like SQLite
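For instance, a minimal sketch assuming PHP with the PDO SQLite driver is available on the server (the file path and columns are placeholders):

    <?php
    // Minimal sketch: saving form submissions into a single SQLite file, so no
    // database server is needed. Assumes PHP's PDO SQLite driver is enabled;
    // the file path and columns are placeholders.
    $pdo = new PDO('sqlite:' . __DIR__ . '/data/prospects.sqlite');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $pdo->exec('
        CREATE TABLE IF NOT EXISTS prospects (
            id      INTEGER PRIMARY KEY AUTOINCREMENT,
            name    TEXT NOT NULL,
            email   TEXT NOT NULL,
            created TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
        )
    ');

    $stmt = $pdo->prepare('INSERT INTO prospects (name, email) VALUES (:name, :email)');
    $stmt->execute([
        ':name'  => $_POST['name']  ?? '',
        ':email' => $_POST['email'] ?? '',
    ]);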
You could also use indexed files like GDBM.
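A rough sketch of that idea with PHP's dba functions; whether the "gdbm" handler is compiled into the server's PHP build is an assumption you would need to check, and the path and key scheme are placeholders:

    <?php
    // Rough sketch of the indexed-file idea using PHP's dba functions.
    // The "gdbm" handler may or may not be compiled into this PHP build.
    $db = dba_open(__DIR__ . '/data/prospects.gdbm', 'c', 'gdbm'); // 'c' = create if missing

    $key    = 'prospect:' . time() . ':' . bin2hex(random_bytes(4));
    $record = json_encode(['name' => $_POST['name'] ?? '', 'email' => $_POST['email'] ?? '']);

    dba_insert($key, $record, $db);          // store the serialized record
    echo dba_fetch($key, $db), PHP_EOL;      // read it back

    dba_close($db);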
However, you should think about backup strategies. Serialization should perhaps also be a concern (using textual or portable data formats like XDR, ASN.1, JSON, YAML, ...).
But you might also try discussing with your managers installing e.g. a MySQL server on a machine. You don't need dedicated hardware for that; it can run (at least for development and testing) on a machine used for other things.
A text file? :)
or perhaps TinySQL?
You can save it to a flat file. Flat files work great when you are just saving things like logs or the output from a web form. They quickly start to fail if you have any *-to-many relationships.
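A sketch of that approach, assuming PHP is available: append one JSON record per line, with an exclusive lock so concurrent submissions don't interleave (path and field names are placeholders).

    <?php
    // Sketch of the flat-file approach: append one JSON record per line.
    // FILE_APPEND plus LOCK_EX keeps concurrent form submissions from
    // interleaving. The path and field names are placeholders.
    $record = [
        'name'    => $_POST['name']  ?? '',
        'email'   => $_POST['email'] ?? '',
        'created' => date('c'),
    ];

    file_put_contents(
        __DIR__ . '/data/prospects.log',
        json_encode($record) . PHP_EOL,
        FILE_APPEND | LOCK_EX
    );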
Do you have access to PHP?