I recently came across MEAN.JS. I'm still a beginner in web development, but everything has worked really well so far. Except for one thing.
Unfortunately, all requests seem to take a huge amount of time: 300-4000(!) ms for a single call (have a look at the screenshot). I'm developing locally on a state-of-the-art computer and wonder where the bottleneck might be. Does anyone have the same issues? Could you give me a hint on how to attack this problem?
I've had a look at this and similar posts, but couldn't find a way to tackle it.
What are the ways to find bottlenecks in a web application?
The framework uses MongoDB, ExpressJS, AngularJS, and Node.js. Could you give me a hint on how to track down the source of those latencies in a JavaScript-based application? (Maybe a tool, plugin, or best-practice approach in development?) Have you experienced similar issues?
Greetings,
Tea
It's hard to guess what's wrong, as that latency can originate from many sources. However, if we put aside computer and network problems/configurations, and assuming you don't have any other processes running that could affect your app's performance, the first thing I would check is the Express configuration, i.e., the order in which the middleware is loaded. A misplaced middleware can indeed hurt the app's performance.
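For example, a common mistake is to register logging or static-file serving after the routes, so every asset request walks the whole middleware stack. Below is a minimal sketch of a sensible ordering - assuming a stock Express 4 app with morgan and compression installed, so adapt it to whatever MEAN.JS generated for you:

    var express = require('express');
    var morgan = require('morgan');         // request logger; 'dev' format prints response time
    var compression = require('compression');

    var app = express();

    app.use(morgan('dev'));                 // first, so every request gets timed
    app.use(compression());                 // compress responses
    app.use(express.static('public'));      // serve assets early; skips the rest
                                            // of the stack for css/js/images

    // session, body parsing, auth, etc. belong here, after static files

    app.get('/api/items', function (req, res) {
      res.json([]);                         // application routes come last
    });

    app.listen(3000);

morgan's 'dev' format logs the response time of every request, which at least tells you whether the time is spent inside Node or somewhere else (MongoDB queries, DNS, the browser).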
Assuming you want to launch a social app (which means many interactions) with the ambition of acquiring several thousand users: for those who have already done so, what are the pitfalls that you know of and would absolutely avoid, in terms of code and server architecture?
I have the feeling that you can easily feel alone when you're trying to answer this kind of question, which is clearly out of the scope of all those SaaS products or landing pages that maybe (and I insist on this word) don't have this scaling problem. Or maybe there is just no real pitfall, and the best approach is 'problem' --> 'solution' as those problems come up.
I don't think this is an opinion-based question, because I/O-intensive databases, queue systems, server-side computation, etc. clearly involve technical considerations in that kind of configuration.
And to give you some examples of problems that large-scale social apps can encounter: there are Facebook's engineers with their early latency problems, or Twitter's engineers with their Bieber problem.
I was able to avoid the first pitfall, one that Netflix couldn't: not using the cloud and trying to build your own server infrastructure at this scale.
The CAP theorem is vital for all large distributed systems. It is explained here.
If you go to my Heroku-hosted to-do list program, you can put test data in, but it's gone pretty soon. This is because, I learned, Heroku has an "ephemeral" filesystem and disposes of any data that users write to it via POST. I don't know how to set up a PostgreSQL database or any other kind of database (although maybe I soon will, as I'm working through Hartl's Rails tutorial). I'm just using a humble YAML file. It works fine in my local environment.
Any suggestions for beginners to work around this problem, short of just learning how to host a database? Is there another free service I might use that would work without further setup? Any advice greatly welcome.
I fully understand that I can't do what I'm trying to do with Heroku (see e.g. questions like this one). I just want to understand my options better.
UPDATE: Looks like this and this might have some ideas about using Dropbox to host (read/write) flat files.
The answer is no. But I'll take a minute to explain why.
I realize that you aren't yet familiar with building web applications, databases, and all that stuff. And that's OK! This is an excellent question.
What you need to know, however, is that doing what you're asking is a really bad idea when you're trying to build scalable websites. And Heroku is a platform company that SPECIFICALLY tries to help developers building scalable websites. That's really what the platform excels at.
While Heroku is really easy to learn and use, it isn't targeted at beginners. It's meant for experienced developers. This is really clear if you take a look at what Heroku's principles are, and what policies they enforce on their platform.
Heroku goes out of their way to make building scalable websites really easy, and makes it VERY difficult to do things that would make building scalable websites harder.
So, let's talk for a second about why Heroku has an ephemeral file system in the first place!
This design decision forces you (the developer of the application) to store files that your application needs in a safer, faster, dedicated file storage service (like Amazon S3). This practice results in a lot of scalability benefits:
If your webservers don't need to write to disk, they can be deployed many, many times without worrying about storage constraints.
No disks need to be shared across webservers. Sharing disks typically causes IO contention and can adversely affect performance.
It makes it easy to scale your web application horizontally across commodity servers, since disk resources aren't required.
So, the reason why you cannot store flat files on Heroku is because doing this causes scalability and performance problems, and would make it nearly impossible for Heroku to help you scale your application easily (which is their main goal).
That is why it is recommended to use a file storage service to store files (like Amazon S3), or a database for storing data (like Postgres).
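To make that concrete, here's a minimal sketch of writing to a file storage service from Node (the earlier MEAN question makes that the natural language for this thread; a Ruby version would look analogous). The aws-sdk package, the credentials in the environment, and the bucket name are all assumptions on my part:

    var AWS = require('aws-sdk');           // assumes AWS credentials in the environment
    var s3 = new AWS.S3();

    // Write the data to S3 instead of the dyno's ephemeral disk.
    s3.putObject({
      Bucket: 'my-app-uploads',             // hypothetical bucket name
      Key: 'todos.yml',
      Body: 'some file contents'
    }, function (err) {
      if (err) return console.error('upload failed:', err);
      console.log('stored durably; survives dyno restarts and deploys');
    });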
What I'd recommend (personally) is using Heroku Postgres. You mentioned you're using Rails, and Rails has excellent Postgres support built in. It has what's called an ORM that lets you talk to the database using some very simple Ruby objects, removing almost all of the database background knowledge you'd otherwise need to get going. It's really fun / easy once you give it a try!
Finally: Heroku Postgres also has a great free plan, which means you can store the data for your todo app in it for no cost at all.
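Your stack is Rails, where the ORM is ActiveRecord, so take this only as an illustration of what "talking to the database through simple objects" means; it's a rough sketch of the same idea in Node, using Sequelize as a stand-in ORM (the model and table are made up):

    var Sequelize = require('sequelize');

    // Heroku Postgres exposes its connection string as DATABASE_URL.
    var sequelize = new Sequelize(process.env.DATABASE_URL, { dialect: 'postgres' });

    // One object definition replaces hand-written CREATE TABLE / INSERT / SELECT.
    var Todo = sequelize.define('todo', {
      title: Sequelize.STRING,
      done: Sequelize.BOOLEAN
    });

    sequelize.sync()                                  // create the table if needed
      .then(function () {
        return Todo.create({ title: 'buy milk', done: false });
      })
      .then(function () {
        return Todo.findAll({ where: { done: false } });
      })
      .then(function (todos) {
        console.log('open todos:', todos.length);     // no SQL written by hand
      });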
Hope this helps!
I have been asked by my professor at university to demonstrate how load balancing happens on the cloud. Like, for example, what kind of tools are used to do it.
I have looked at some, like nginx, Pen, etc., but running them on my laptop to demonstrate load balancing seems a little far-fetched. I have also looked at Google App Engine, which seems feasible, but it load-balances automatically.
Is there another way I can demonstrate this?
EDIT: I'm looking at open-source software only; I can't pay a fee to use it.
Have you considered Amazon?
There's also Rackspace's cloud load balancer: http://www.rackspace.com/cloud/cloud_hosting_products/loadbalancers/
Does this help? http://askville.amazon.com/open-source-load-balancer-software-run-Linux/AnswerViewer.do?requestId=7650176
Some pointers to think about:
TCP mode vs. HTTP mode (some tools support both) - you will want to know what the difference is.
Load balancing versus load distribution.
Load balancing strategies (e.g. hashing/consistent hashing, cookies, round robin, and more) - see the round-robin sketch below.
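If nginx or Pen feel heavyweight for a classroom demo, round robin itself is simple enough to show in plain Node - balancer and backends can all run on one laptop. A toy sketch, with the hosts and ports as assumptions (start two trivial servers on 9001 and 9002 that each reply with their own port):

    var http = require('http');

    var backends = [
      { host: '127.0.0.1', port: 9001 },
      { host: '127.0.0.1', port: 9002 }
    ];
    var next = 0;

    http.createServer(function (clientReq, clientRes) {
      var backend = backends[next];
      next = (next + 1) % backends.length;  // round robin: rotate through backends

      var proxyReq = http.request({
        host: backend.host,
        port: backend.port,
        path: clientReq.url,
        method: clientReq.method,
        headers: clientReq.headers
      }, function (proxyRes) {
        clientRes.writeHead(proxyRes.statusCode, proxyRes.headers);
        proxyRes.pipe(clientRes);           // stream the backend's answer back
      });

      clientReq.pipe(proxyReq);             // stream the client's request through
    }).listen(8000);

Hit http://localhost:8000/ a few times and the responses alternate between the two backends - that's the whole demonstration.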
Good luck!
I know there are many options out there for eCommerce and I know there are many opinions on these two frameworks that differ beyond belief.
I am looking for thoughts on which framework would be easier to use to create a site that will be used to sell prints of photos.
The prints will be sold in various sizes, each size being added to the cart at a set price that is the same for every picture.
Shouldn't be too complex, just don't want to jump into one and find out that I missed a much easier journey using the other.
Thanks in advance!
(If you're going to give me stats about the difference in speed between the two, keep in mind that it will just be hosted on shared hosting and millisecond differences make no difference to me).
Go with whatever you feel most comfortable with.
Cake has a steeper learning curve, because it wants a lot of things done its way. I've found myself looking at the source code quite a few times - the documentation is sometimes unclear or missing. But it takes care of a lot of things for you, and the cake console application is nice. It even generates code for you.
With CodeIgniter, on the other hand, you can start developing right away if you are familiar with PHP; and it's a bit faster than Cake, too. The manual is clear and concise - I really appreciate that. But it has less functionality than Cake.
So it depends on you. I'd try both for a couple days and make a decision.
Here in the office we use ZenCart for commerce and CakePHP for everything else. I haven't had much to do with Zen and it seems to have organically developed into a nightmarish beast.
I'd love to develop a solution with Cake and I think Cake would be well suited to it. The biggest headache - as I see it - would be solid secure payment handling (which is why we stick with Zen).
I'm new to production-level web development, so sorry if this is obvious. My site could see a sudden surge of (permanent) users, and I'm wondering what happens if too many users sign up in a short period of time, causing the site to run slowly. Since development takes time, would it just be a case of adding more server boxes, or does the site have to be taken down for code improvements?
Thanks
Don't worry - even very popular sites go through this. Coding well is always a plus, but sometimes even that is not enough. Twitter is an ideal example: they started their messaging on Ruby but had to move to Scala as they became more and more popular.
Since you say you are new, can I suggest getting familiar with caching queries and caching static content? Learning about good indexing practices on SQL Server should also help in dealing with a large influx of users.
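To illustrate the query-caching part, here's a minimal in-memory cache with a time-to-live - in JavaScript purely for the sake of example; the function names are placeholders for whatever data layer you actually use:

    var cache = {};               // key -> { value, expires }
    var TTL_MS = 30 * 1000;       // serve cached results for 30 seconds

    function cachedQuery(key, runQuery, callback) {
      var hit = cache[key];
      if (hit && hit.expires > Date.now()) {
        return callback(null, hit.value);   // cache hit: the database never sees this
      }
      runQuery(function (err, value) {      // cache miss: run the real, expensive query
        if (err) return callback(err);
        cache[key] = { value: value, expires: Date.now() + TTL_MS };
        callback(null, value);
      });
    }

The same idea applies to static content: long-lived Cache-Control headers (or a CDN) keep repeat visitors from re-requesting assets at all.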
Both, but code improvement would be the first thing to target. Writing code that will scale will help you the most. You can throw more servers at it behind the scenes, but you'll have to do that less often with well-architected code that was designed for scalability.
It depends on the technologies you're using and how your code is written.
Since you tagged sql-server: when it comes to databases in general, you are often limited by your locking strategies and your replication architecture. How you design your database and put it into production has a big impact. Anything that has to happen in a serial manner is a bottleneck. Check your execution plans, watch and manage your indexes, and replicate and distribute your systems if you can.
The best way to understand your scalability limitations is through load testing and proper QA.
If you don't do it right, your users are sure to be unhappy when you start 503ing or timing out. :-)
If the site is developed in such a fashion that you can have multiple servers/data-access layers, then scalability should not be an issue.
Create the app so that you can shed load as required, and keep the code as flexible as possible.
But from past experience: performance-tune only once it is required. Write easily understandable and maintainable code, and fix performance issues as they occur.
The best advice I can give is to test your app and server before you go live, then you can see when you are likely to get problems and how bad they could be.
It is one thing to say 'it will go slow', but once you get past a certain point your app may crash or randomly give users error 500 pages.
Test with automated scripting tools to stress the site and simulate sign-ups and random users visiting random pages.
If you use SSL, make sure your tools simulate lots of different SSL connections rather than just different HTTP requests (SSL handshakes take extra resources).
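To show the idea (not a replacement for real tools like ab, siege, or JMeter), here's a tiny Node sketch that keeps a number of concurrent request loops running against a URL; with agent: false, each request opens a fresh socket, so each one also pays the SSL handshake. Host, port, and path are placeholders:

    var https = require('https');

    var CONCURRENCY = 20;                   // simultaneous simulated users
    var DURATION_MS = 10 * 1000;
    var deadline = Date.now() + DURATION_MS;
    var done = 0, failed = 0;

    function loop() {
      if (Date.now() > deadline) return;
      https.get({
        host: 'localhost',                  // placeholder target
        port: 443,
        path: '/signup',
        rejectUnauthorized: false,          // accept a self-signed test certificate
        agent: false                        // new socket (and SSL handshake) per request
      }, function (res) {
        res.resume();                       // drain the body
        done++;
        loop();                             // fire the next request immediately
      }).on('error', function () {
        failed++;
        loop();
      });
    }

    for (var i = 0; i < CONCURRENCY; i++) loop();

    setTimeout(function () {
      console.log('completed:', done, 'failed:', failed);
    }, DURATION_MS + 1000);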