These posts are being treated as listings for a directory, but they are not all loading at the same time.
I think it may be a database issue; I noticed my wp_postmeta table is over 20 MB.
I have already ruled out JS and CSS issues.
Thanks!
You might consider using the Query Monitor plugin to help you track the slow database queries…
Maybe a plug-in is causing your site to slow down. You could deactivate them one at a time just to be sure. You can always check your database as well, but the issue there is sometimes much smaller than you'd think. Also, use a plug-in like WP Super Cache to generate static HTML files, which makes your site slightly faster. Caching is always good in my opinion.
That said, run a db check.
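If you want to see quickly whether wp_postmeta is really the heavy table, you can list table sizes straight from information_schema (phpMyAdmin works just as well); here is one way to script it, using PyMySQL, with placeholder credentials and database name:

    # List table sizes for the WordPress database, biggest first.
    # Uses PyMySQL (pip install pymysql); host, user, password and database are placeholders.
    import pymysql

    conn = pymysql.connect(host="localhost", user="wp_user", password="changeme", database="wp_db")
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT table_name,
                   ROUND((data_length + index_length) / 1024 / 1024, 2) AS size_mb,
                   table_rows
            FROM information_schema.tables
            WHERE table_schema = %s
            ORDER BY (data_length + index_length) DESC
            """,
            ("wp_db",),
        )
        for name, size_mb, rows in cur.fetchall():
            print(f"{name}: {size_mb} MB, ~{rows} rows")
    conn.close()

For what it's worth, 20 MB of postmeta is not huge on its own; what matters more is whether the listing queries hit it with unindexed meta lookups, which is exactly what Query Monitor will show you.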
My website has started to get the following error: OperationalError: (1203, "User xxxxx already has more than 'max_user_connections' active connections")
From what I understand, this is because there are too many requests to the database at one time and the database cannot cope. Ideally I need to set up caching for the database access, and I know this is pretty easy to do with Django, but the question is: which cache solution is best?
My hosting is on the MediaTemple GridServer platform, if that helps. As far as I am aware I can use any of the solutions that Django provides: http://www.djangobook.com/en/beta/chapter14/
Is there a good way to figure out what the best option should be? I don't generally have much traffic, but sometimes there can be a spike, and the content is pretty much static, except for the odd blog post, which doesn't have to be too 'fresh'.
Read a cache solution comparison here. I guess django-staticgenerator would be what you are looking for.
And you can take a look at johnny-cache.
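That said, since the content is mostly static, Django's built-in cache framework (covered in the chapter you linked) may be enough on its own. A minimal sketch, assuming a file-based backend (memcached is often unavailable on shared hosting like GridServer) and a hypothetical blog_index view; the cache directory path is an assumption, and older Django versions spell the setting CACHE_BACKEND rather than CACHES:

    # settings.py -- file-based cache; path and timeout are assumptions.
    CACHES = {
        "default": {
            "BACKEND": "django.core.cache.backends.filebased.FileBasedCache",
            "LOCATION": "/home/xxxxx/django_cache",
            "TIMEOUT": 60 * 15,  # content doesn't need to be fresh, so 15 minutes is fine
        }
    }

    # views.py -- cache whole pages so a traffic spike rarely opens a DB connection.
    from django.views.decorators.cache import cache_page

    @cache_page(60 * 15)
    def blog_index(request):
        ...

Per-view caching like this directly attacks the max_user_connections error, because cached hits never touch the database at all.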
I have to develop something to hold blobs. I have various options but am having a hard time choosing. What is needed is for the user to upload documents, mostly in the form of images. I need to make it secure. I would like some speed, though it's not a prerequisite. The size of the uploaded images/documents will "probably" not be too much of an issue.
The options I have found so far are:
SQL FileStream
Azure Storage
HDD
Other cloud services like Amazon etc.
I am not too fond of the cloud since the data will be somewhat sensitive, and I don't like the pricing model. Not saying it's bad; it's just a personal preference.
When it comes to FileStream I have no idea; I have been given recommendations both for and against it. Does it really work?
HDD: well, yes, it's meant to store things, but then I need to code against both the database and the HDD, and I am not overly fond of that tight coupling.
Are there any other suggestions on how to store stuff? Can I get some comments on the options I mentioned?
FileStream just ends up going to the HDD. The only thing that gets stored in the DB is a reference to the file. You can replace the file with different content, and SQL server will never know.
I would personally just use the HDD; that's your best bet if you want full control over the files.
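The coupling doesn't have to be tight, either: write the bytes to disk, store only a generated name/hash plus some metadata in the database, and hide both behind one small class. A rough sketch of the shape (Python purely for illustration; the storage root, table, and column names are all assumptions):

    # "Files on disk, reference in the DB": store the blob on the filesystem and
    # record only its hash, path and original name. All names here are made up.
    import hashlib
    import sqlite3
    from pathlib import Path

    STORAGE_ROOT = Path("/var/app/blob-store")

    def save_document(conn: sqlite3.Connection, original_name: str, data: bytes) -> int:
        """Write the blob to disk and record a reference to it; returns the new row id."""
        digest = hashlib.sha256(data).hexdigest()
        # Fan out into sub-directories so one folder never holds millions of files.
        target = STORAGE_ROOT / digest[:2] / digest[2:4] / digest
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_bytes(data)
        cur = conn.execute(
            "INSERT INTO documents (original_name, sha256, path) VALUES (?, ?, ?)",
            (original_name, digest, str(target)),
        )
        conn.commit()
        return cur.lastrowid

Because callers only ever see save_document (and a matching load), you can later swap the filesystem for FileStream or a cloud bucket without touching the rest of the application.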
I would use either Azure or Amazon.
I'm running a database-backed web site on shared hosting that occasionally gets swarmed after a mention on a link sharing site.
Because of how much load the first couple of traffic surges put on the database, I have implemented file-based caching.
When a query runs, I just serialize the resultset object and save it to a file. I have a sub-directory structure in the cache directory that keeps thousands of files from ending up in the same directory. Next time I have to run the same query, I just pull the object out of the file instead.
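In code, the approach amounts to roughly this (a simplified sketch rather than the real implementation; the directory layout, TTL, and serialization format here are only illustrative):

    # Sketch of the file-based result cache: key on a hash of the query,
    # expire stale entries, and write atomically so readers never see partial files.
    import hashlib, json, os, tempfile, time
    from pathlib import Path

    CACHE_ROOT = Path("cache")
    TTL_SECONDS = 300

    def cache_path(sql: str, params: tuple) -> Path:
        key = hashlib.sha1(repr((sql, params)).encode()).hexdigest()
        return CACHE_ROOT / key[:2] / key[2:4] / key   # fan out into sub-directories

    def cached_query(sql, params, run_query):
        path = cache_path(sql, params)
        if path.exists() and time.time() - path.stat().st_mtime < TTL_SECONDS:
            return json.loads(path.read_text())
        rows = run_query(sql, params)                  # hit the database only on a miss
        path.parent.mkdir(parents=True, exist_ok=True)
        fd, tmp = tempfile.mkstemp(dir=path.parent)
        with os.fdopen(fd, "w") as f:
            json.dump(rows, f)
        os.replace(tmp, path)                          # atomic rename
        return rows

The atomic rename is the part I care most about: during a surge two requests may rebuild the same entry, which wastes a little work, but nobody ever reads a half-written file.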
It's been working pretty well so far. But I'm worried that I am overlooking something, and possibly asking for trouble if there is a higher level of traffic than I've previously enjoyed. Or maybe there's just an easier way to do this?
Please poke some holes in this for me? Thanks!
Ideally, cache in memory to remove the disk access. Have a look at something like memcached.
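Assuming your host lets you run it, that is; memcached is often unavailable on shared plans, so check first. A minimal read-through sketch using the pymemcache client (the server address, key scheme, and TTL are assumptions):

    # Read-through cache in memcached via pymemcache (pip install pymemcache).
    import hashlib, json
    from pymemcache.client.base import Client

    mc = Client(("127.0.0.1", 11211))   # assumed local memcached instance

    def cached_query(sql, params, run_query, ttl=300):
        key = "q:" + hashlib.sha1(repr((sql, params)).encode()).hexdigest()
        hit = mc.get(key)
        if hit is not None:
            return json.loads(hit)                     # served from RAM, no disk, no DB
        rows = run_query(sql, params)
        mc.set(key, json.dumps(rows), expire=ttl)
        return rows

The structure is the same as your file cache, so switching between the two is mostly a question of what the host supports.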
Since you're on shared hosting, you should do some throttling (google "Throttling your web server (Oct 00)" for ideas).
A related interesting read (which also mentions Stonehenge::Throttle) is Building a Large-Scale E-commerce site with Apache and mod_perl: http://perl.apache.org/docs/tutorials/apps/scale_etoys/etoys.html
How often do you refresh your development database from the production database? Since there are many types of projects (targeting different domains), I would like to know how it is being done and at what intervals (days/months/years).
Thanks
While working at Callaway Golf we had an automated build that would completely refresh the database from a baseline. This baseline would be updated (from production) almost daily. We had a set of scripts (DTS) that would do this for us, so if there was some new and interesting information we could easily do it a couple of times a day, once a week, etc. The key here is automating the task. If it is easy to do, then how often it is done really depends only on how much the refresh impacts the load on the production database and the network, and on how long it takes to complete. It could of course be set up as a scheduled task to run at off-peak hours, before the dev team gets in in the morning.
The key things in refreshing your development database are:
(1) Automate the refresh through a script.
(2) Obfuscate the data in your development database, since you do not want your developers to see the real data; alternatively, work from a sample of your production database. (A small sketch follows this list.)
(3) Decide the frequency of the refresh -- I usually do it once a week.
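For example, step (2) can be as small as this. The snippet uses sqlite3 only so it runs on its own; a real dev database would use its own driver (pymysql, psycopg2, pyodbc, ...), and the customers table and its columns are made-up names:

    # Hypothetical post-refresh scrub: overwrite PII with deterministic fake values
    # derived from the row id, so joins and uniqueness constraints still hold.
    import sqlite3

    conn = sqlite3.connect("dev_copy.db")
    conn.execute(
        """
        UPDATE customers
        SET name  = 'Customer ' || id,
            email = 'customer' || id || '@example.invalid'
        """
    )
    conn.commit()
    conn.close()

Running the scrub as the last step of the automated refresh means nobody ever has to remember to do it by hand.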
Depends on what kind of work you're doing. If you're debugging issues that are closely related to the data, then frequent updates are good.
If you're doing data quality assurance (which often involves writing code to detect and repair bad data, code that you have to develop and test away from the production server), then you need extremely fresh data. The bad data that is the most valuable to fix is the data that was just inserted or updated today.
If you are writing client code, then infrequent updates are good. Often when I'm writing C# UI code, I couldn't care less what the data is; I just care whether it shows up in the right box on the screen.
If you have data with any security issues, you should stop using production data -- i.e. never update from production -- and get a good sample data generator. Writing a good sample data generator is hard, so 3rd-party products are the way to go. MS Data Dude comes to mind, and I recommend Redgate's SQL Data Generator.
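If a commercial generator isn't an option, even a small script goes a long way. A minimal sketch using the third-party Python faker package (the customers schema here is invented purely for illustration):

    # Generate sample data instead of copying production (pip install faker).
    import sqlite3
    from faker import Faker

    fake = Faker()
    conn = sqlite3.connect("dev_sample.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
    )
    conn.executemany(
        "INSERT INTO customers (name, email) VALUES (?, ?)",
        [(fake.name(), fake.email()) for _ in range(1000)],
    )
    conn.commit()
    conn.close()

The hard part a real product solves for you is making the generated data statistically plausible and referentially consistent across many tables, which a quick script like this does not attempt.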
And finally, how hard is it to get a copy of the production data? If it is cheap and automatable, just get a new copy every night. If it is expensive (requires the attention of a very busy DBA), then resource constraints might answer the question for you regardless of these other concerns.
We tend to refresh every couple of days, or perhaps once a week or so if things are "normal," though if we're investigating something amiss we may do so much more often.
Our production database is about 1GB, so it's not a trivial thing to copy around. Also, for us there's generally no burning need to get current data from production into the dev systems.
The "how" is simply a MySQL "backup" and "restore"
In a lot of cases, refreshing the dev database really isn't that important. Often production systems have far more data than is required for development, and working with such a large dataset can be a hassle for several reasons. One example is work on the interface, where it's more important to have some data than any specific data. In such cases it's more customary to thin the production database out to a smaller subset of real data. After you do this once, keeping it updated isn't really that important, as long as all schema changes are pushed through to the dev database(s).
On the other hand, performance bugs may often require production-sized databases to be able to reproduce and identify bottlenecks, so in this scenario it is extremely useful to have an almost-realtime database. Many issues may only present themselves with the exact data used in production.
We tend to always go back to an on-demand schedule. We have many different databases that are used in a suite of applications. We stay away from automatic DEV database refreshes because many of our code changes involve database changes, and I don't want anything overwritten.
Not at all; the dev databases (one per developer) get set up by a script or similar a couple of times a day, possibly a couple hundred times when running DB tests locally. This includes a couple of records for playing around with.
Of course we still need a database with at least parts of production in it, for integration and scalability tests. We are aiming for a daily automated refresh. But we aren't there yet.