I have a website built with CakePHP 1.3.10. The site seems to get slower every time new folders/pages are added to it (which happens pretty often).
I believe I read somewhere that the mod_rewrite rules in the three .htaccess files may have something to do with it.
Is that true?
I'm trying to get the site to work without the .htaccess files, but all my links are messed up. Is there any way to avoid having to edit every link on the site? Right now it seems I have to add /app/webroot/ before every file I link (CSS, JS, etc.) and /index.php before every internal link.
Is this the only way?
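For what it's worth, CakePHP 1.3 ships with a commented-out setting in app/config/core.php for running without mod_rewrite. A minimal sketch, assuming your links and asset paths are generated through the Router/HtmlHelper rather than hard-coded:

    // app/config/core.php
    // Uncommenting this tells CakePHP to build URLs without relying on mod_rewrite.
    // Generated links should then include index.php automatically, and
    // helper-generated asset paths should point at app/webroot/.
    Configure::write('App.baseUrl', env('SCRIPT_NAME'));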
Have you measured how much time is spent in mod_rewrite and how much in PHP? In my experience, the most likely culprit is the time Cake spends looking for files in the file system, which gets progressively worse as you add files and directories.
You can use Xdebug to profile the application, or just add calls to print the time in appropriate places in the framework to see how much time has passed since the beginning of the request.
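As a rough sketch (plain PHP, nothing Cake-specific), a quick timing probe dropped into app/webroot/index.php or a suspect controller could look like this:

    // Crude timing probe: record the start of the request...
    $start = microtime(true);

    // ...run the code you suspect is slow, then print the elapsed time.
    $elapsed = microtime(true) - $start;
    error_log(sprintf('Elapsed: %.4f seconds', $elapsed));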
This doesn't sound like a mod_rewrite issue. The time it takes to transform a URL like http://example.com/wiki/Page_title into something like http://example.com/wiki/index.php?title=Page_title is more or less constant, and it doesn't grow with the number of files/directories, since they are irrelevant to the rewriting process.
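For illustration, a rewrite of that kind is typically a single regular-expression match, roughly like this (a sketch, not the exact rule any particular wiki uses):

    # Map /wiki/Page_title to index.php?title=Page_title
    RewriteEngine On
    RewriteRule ^wiki/(.*)$ index.php?title=$1 [QSA,L]

Apache evaluates a rule like this once per request, regardless of how many files exist on disk.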
I'm working on an AngularJS app that uses webpack for bundling its resources. Currently we create a single app.js file that contains the CSS as well. The size of app.js is around 6 MB. If we break app.js into multiple chunks, does that improve page performance? My colleagues are trying to convince me that if we break the single JS file into two or three files, the page will load two or three times faster. Is that really true? I remember reading somewhere that having a single file is better than multiple files, but I don't really remember the reasons now. Do I really need to break up the app.js file for page performance, or what other options can I apply here?
A single file is better because it requires fewer connections (and therefore less overhead), but this is really negligible when talking about fewer than 5 files. When you split the file, you gain the ability to cache the parts separately, which is often a great win. Therefore I'd recommend splitting the file into logically cacheable sections (like vendor code and custom code), as sketched below.
Also note that if the client and server support HTTP/2, the fewer-connections argument goes away as well, since HTTP/2 supports connection re-use.
Note that there is no real difference for the initial load time, since in that case all files will need to be downloaded anyway.
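As a concrete illustration of the vendor/custom split, here is a minimal sketch of a webpack configuration. It assumes webpack 4+ and an entry point at ./src/main.js, both of which are guesses about your setup:

    // webpack.config.js (sketch)
    const path = require('path');

    module.exports = {
      entry: { app: './src/main.js' },      // hypothetical entry point
      output: {
        filename: '[name].[chunkhash].js',  // hashed names so each chunk caches independently
        path: path.resolve(__dirname, 'dist'),
      },
      optimization: {
        splitChunks: {
          cacheGroups: {
            // everything pulled from node_modules goes into a separate vendor bundle
            vendor: {
              test: /[\\/]node_modules[\\/]/,
              name: 'vendor',
              chunks: 'all',
            },
          },
        },
      },
    };

The vendor bundle changes far less often than your own code, so returning visitors can usually load it from cache.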
A single file will usually mean better performance. You should also make sure that the file is properly cached (on the browser side) and gzipped when served by your web server.
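If the site happens to be served by Apache (an assumption on my part), gzipping and long-lived caching can be enabled with something along these lines:

    # Compress JS/CSS on the fly (mod_deflate)
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE application/javascript text/css
    </IfModule>

    # Let browsers cache static assets for a long time (mod_expires)
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType application/javascript "access plus 1 year"
        ExpiresByType text/css "access plus 1 year"
    </IfModule>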
I did a practical test in Chrome (Mac, 54.0.2840.98, 64-bit) to check whether there is really a performance gain in breaking a huge JS file into many. I created a 10 MB JS file, made three copies of it, and concatenated the three copies into a single 30 MB file. I measured the time it took to load the single file, referenced with a normal script tag at the bottom of the page, and it was around 1 minute. Then I referenced the three 10 MB script files one after the other, and it took nearly 20 seconds to load everything. So there really is a performance gain in breaking a huge JS file into several. But there is a limit to the number of files the browser can download in parallel.
I have a website and its mobile version (m.mydomainname.com). The problem is that I'm in a bit of a dilemma with the mobile site: it has dependency files such as JavaScript, PHP, etc., and I'm not sure whether I should load these files from the main domain or from their own folder on the mobile subdomain. I do know it would be easier to update them if they were located in only one folder, but I don't know whether that will reduce the site's speed or not. So the question basically is: will it reduce the speed or not?
I see no reason (apart from some very esoteric ones) why putting static files in different folders should affect the speed of access, hence I'd say the answer to your question is almost certainly no.
However, you can always simply try both options and measure the difference in throughput. If you find any, I'd be interested to hear about it.
I'm trying to fuzz some tools, but I need a huge number of .zip or .jpg files for that. I've tried crawlers like Webripper, but it's not very effective (or I'm doing it wrong). Is there a better way to get lots of different files?
OK, on the off chance that someone else needs something like this:
In the end I used Webripper, and instead of generating links to Google/Bing results with the "filetype" parameter, I just set some upload/freeware pages as the targeted rip job with the maximum link depth.
Webripper might crash sometimes and it takes quite a while, but it works well enough.
A better solution would probably be to use the Google search API (e.g. c#SearchAPI). Then extract the clean links from the results and download them asynchronously. Using the direct result links most likely won't work, because Google will block them after a few files ("unusual data transfer").
It took me forever to figure out how to customize my Dreamweaver setup so that it would recognize the .ctp file extension (if you're trying to figure it out, there's a second Extensions.txt file under /Users/yourname/Library/ ...)
Now I'm trying to set up Dynamically-Related Files in Dreamweaver with CakePHP, and it won't work. I'm assuming this is because Cake's .htaccess files prevent Dreamweaver from seeing the directory structure it is expecting.
Has anyone done this / can anyone help? All of the other q's have been left unanswered!
Thanks
"Not seeing"? I guess you're talking about mod rewrite. Also .htaccess files are nothing weird but something pretty normal in the web. Also see the CakePHP book page about mod rewrite and .htaccess. So either disable it or configure it properly.
I have no clue how Dreamweaver is affected by this anyway. Does it come with a web server?
However, the best advice I can give you is simply to throw Dreamweaver away. Even after it became somewhat capable of handling non-table-based layouts, it is still a lot faster to write the HTML/CSS manually, or faster still with an editor that simply provides code/tag completion instead of the crutch that Dreamweaver is.
A serious PHP + HTML/CSS editor is, in my opinion, PhpStorm. If you want something free... uhm... try Aptana, Eclipse PDT, or PSPad.
I've just updated most of my static files, but it seems the old versions of those files are still being served. How long does it usually take for the new versions to be served? Is there any way to speed that up?
Are you talking about the production server?
In my project, updates usually take effect immediately. Sometimes, due to the caching framework, old static files keep being served. I'm using Django-nonrel.
If you are using Google Chrome, you can use Inspect Element to see whether the file has a Cache-Control header or not.
Also, this link will help you change default_expiration on App Engine.
Maybe that gives you some clues.
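For reference, default_expiration is a top-level setting in app.yaml. A minimal fragment (the handler path is made up; adjust it to your app):

    # app.yaml (fragment)
    default_expiration: "4d 5h"   # applies to all static handlers unless overridden

    handlers:
    - url: /static
      static_dir: static
      expiration: "10m"           # per-handler override, re-checked more often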
I've found that it's usually immediate, but it sometimes takes about 15 minutes or so. For CSS/JS, many people append a build number to the filenames to break the cache.
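For example, the cache-busting trick is just a changing query string or filename (the paths and version number here are made up):

    <!-- bump the number (or use a hash) on every deploy so browsers re-fetch -->
    <script src="/static/app.js?v=142"></script>
    <link rel="stylesheet" href="/static/style.css?v=142">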