Chrome (WebKit) WebSQL database maximum file size?

I'm trying to find information on the maximum size a WebSQL (SQLite) database can be in Google Chrome. I've read conflicting information, such as that the max size is 5MB and that the user is prompted when the DB reaches 10, 50, 100MB, and so on.
I've tried creating DBs of various sizes, and they open fine at 500MB and 5,000MB, but I've yet to try adding data up to these large sizes.
Does anyone have first-hand experience with large WebSQL DBs, or can anyone point me at relevant information?

Here's a link to the most solid article on the subject I've found so far: http://html5doctor.com/introducing-web-sql-databases/. It helped me a lot.

I'm not sure about the max size for a database on a normal webpage, but if you need a lot of storage, you can make an extension that requests the "unlimitedStorage" permission. See http://code.google.com/chrome/extensions/manifest.html.
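For reference, here's a minimal manifest sketch requesting that permission; the name and version are placeholders, and manifest_version 2 is assumed for the era of that documentation link:

    {
      "name": "My Storage Extension",
      "version": "1.0",
      "manifest_version": 2,
      "permissions": ["unlimitedStorage"]
    }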

If you try to create a database larger than the default quota of 5MB, Chrome shows a popup asking whether you want to grant the database permission to grow to the next size tier, and it asks again at each subsequent tier. In short, the size is effectively unlimited, but only as long as the user keeps granting permission.
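To see the prompting behaviour for yourself, here is a minimal TypeScript sketch, assuming a WebKit browser that still exposes the deprecated, non-standard WebSQL API (the ambient typings are declared inline because modern TypeScript no longer ships them):

    // Minimal ambient typings for the deprecated WebSQL API.
    interface SQLTransaction {
      executeSql(sql: string, args?: unknown[]): void;
    }
    interface SQLError { code: number; message: string; }
    interface Database {
      transaction(
        body: (tx: SQLTransaction) => void,
        onError?: (err: SQLError) => void
      ): void;
    }
    declare function openDatabase(
      name: string, version: string, displayName: string, estimatedSize: number
    ): Database;

    // Ask for 50MB up front; Chrome prompts the user once the database
    // actually outgrows its current quota, not at open time.
    const db = openDatabase("quota_test", "1.0", "Quota test", 50 * 1024 * 1024);

    db.transaction(
      (tx) => {
        tx.executeSql("CREATE TABLE IF NOT EXISTS blobs (data TEXT)");
        // Insert ~1MB per call; run repeatedly to push past the quota.
        tx.executeSql("INSERT INTO blobs (data) VALUES (?)", ["x".repeat(1024 * 1024)]);
      },
      (err) => {
        // If the user denies the enlargement prompt, the transaction fails
        // with a quota error (SQLError code 4, QUOTA_ERR).
        console.log("Transaction failed:", err.code, err.message);
      }
    );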

Related

Picture in phpMyAdmin

I would like to put photos in my phpMyAdmin database, but I can't find a way to do it. I created an image column as a varchar, but I can't find the path that would let me insert an image. Thanks in advance for helping a young beginner ❤️
The best solution is to store the image as a file on disk and put the path to the file in the database (as a varchar or some other text type); your application then references the file instead of loading BLOB data directly from the database. The performance reasons are well documented elsewhere, in more detail than I can give here, but in short: the database slows down and the disk space it takes up balloons as you store images in it.
If you decide to defy that advice and store the image directly in the database, you shouldn't be using phpMyAdmin as your main interface. Don't get me wrong, it's fully capable of uploading the image, but presumably you'll have a custom application interface that you should be using instead, one that also handles application-level logic. But that's not what you asked, so to do this in phpMyAdmin: there should be a "Browse" button near the field on the phpMyAdmin Insert page. This requires that your column be an appropriate binary type, such as BLOB.
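As a rough sketch of the recommended path-on-disk approach from above, here it is in Node/TypeScript; the upload directory is hypothetical and the db client interface is a stand-in for whatever MySQL client your application uses (mysql2's pool.execute has this shape):

    import { promises as fs } from "fs";
    import * as path from "path";

    // Stand-in for your MySQL client.
    interface Db { execute(sql: string, args: unknown[]): Promise<unknown>; }

    // Write the bytes to disk and store only the path in the database.
    async function saveImage(db: Db, imageBytes: Buffer, fileName: string) {
      const diskPath = path.join("/var/www/uploads", fileName); // hypothetical dir
      await fs.writeFile(diskPath, imageBytes);
      await db.execute("INSERT INTO images (file_path) VALUES (?)", [diskPath]);
    }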

How can I limit the size of the debug_kit.sqlite file in CakePHP 3.x?

The debug_kit.sqlite file in the tmp directory grows with every request by approx. 1.5 MB. If I don't remember to delete it, I run out of disk space.
How can I limit its growth? I don't use the history panel, so I don't need the historic data. (Side question: why does it keep all historic requests anyway? The history panel only shows the last 10 requests, so why keep more than 10 requests in the DB at all?)
I found out that DebugKit has garbage collection. However, it is not effective at freeing disk space on its own, because SQLite needs to rebuild the database with the VACUUM command before the space is actually released. I created a PR to add vacuuming to the garbage collection: https://github.com/cakephp/debug_kit/pull/702
UPDATE: The PR has been accepted. You can now solve the problem by updating debug_kit to 3.20.3 (or higher): https://github.com/cakephp/debug_kit/releases/tag/3.20.3
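If you can't update yet, you can reclaim the space manually with the sqlite3 command-line tool; a sketch, assuming Debug Kit's default location in your app's tmp directory:

    sqlite3 tmp/debug_kit.sqlite "VACUUM;"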
Well, there is one main purpose for DebugKit: it provides a debugging toolbar and enhanced debugging tools for CakePHP applications. It lets you quickly see configuration data, log messages, SQL queries, and timing data for your application. The simple answer is: it's just for debugging. Even though only the last 10 requests are shown, you can still query the database to get the full history of panels such as:
Cache
Environment
History
Include
Log
Packages
Mail
Request
Session
Sql Logs
Timer
Variables
Deprecations
It's safe to delete debug_kit.sqlite; it will simply be generated again. Alternatively, you can disable DebugKit so the file isn't created at all, or do what I did and run a cron job that deletes it every day (a sketch follows below).
By the way, you should not enable DebugKit in staging or production. Hope this helps.
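For reference, a hypothetical crontab entry for the daily deletion; adjust the path to your application's tmp directory:

    # Delete the Debug Kit database every night at 03:00.
    0 3 * * * rm -f /var/www/myapp/tmp/debug_kit.sqlite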

Does CKAN have a size limit for uploaded data?

I have set up CKAN and it is running fine, but I have two questions.
Both problems below happen only when uploading a file. If I add a new resource by URL, everything runs fine.
1) I can upload small files (around 4 KB) to a given dataset, but when trying bigger files (65 KB) I get "Error 500: An Internal Server Error Occurred". So is there a size limit for uploading files? What can I do to be able to upload bigger files?
2) I get another error for the small files that did upload: when clicking "Go to Resource" to download the data, I get "Connection to localhost refused", and I can't visualize the data either. What am I doing wrong?
I appreciate any help. If you need me to provide more info on anything, I'll happily do so.
Many thanks.
CKAN has a default upload size limit of 10MB for resources. You can raise it in your ini file with ckan.max_resource_size = XX, for example ckan.max_resource_size = 100 (which means 100MB).
As for question 2): have you set ckan.site_url correctly in your ini?
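Putting both together, the relevant lines in your CKAN ini file would look something like this (both option names come from the answers above; the file path varies by install, /etc/ckan/default/production.ini being a common location):

    ckan.max_resource_size = 100
    ckan.site_url = http://your-ckan-host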
As far as I'm aware, CKAN can easily cope with terabytes of data (it is used for millions of medical records in hospitals, among other things), so there shouldn't be an issue with your file size. It could be an issue on their end while receiving your data.

Storing Serialized Video files to SQL Server

I currently need to host 20 small video files for my website. I know I could just host them with my project in a folder, but I came across this article:
http://www.kindblad.com/2008/04/how-to-store-files-in-ms-sql-server.html
The thought of storing the files in the DB had not occurred to me. My question is: would there be a performance increase or decrease from storing the files as binary data in the DB versus just streaming them from disk? I like the idea of having the data in the DB for portability and for control over who gets access to the videos. Thanks in advance.
Unless you have a pressing need to store them in a database, I wouldn't, personally. You can still control who gets access to which files by using a handler to validate access to the file. One big problem with the method in that article is that it doesn't support reading a byte range, so if someone wants to seek to the middle of a video, for example, they would have to wait for the whole thing to download. You'd want it to support the Range header, as described in this question.
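To illustrate what range support looks like, here is a minimal Node/TypeScript sketch of a handler that honours the Range header so players can seek; this is a generic HTTP illustration, not the ASP.NET handler from the linked article, and the file path is hypothetical:

    import * as fs from "fs";
    import * as http from "http";

    const VIDEO = "/srv/videos/clip.mp4"; // hypothetical location on disk

    http.createServer((req, res) => {
      const { size } = fs.statSync(VIDEO);
      const range = req.headers.range; // e.g. "bytes=1000-"
      if (range) {
        // Serve only the requested slice with a 206 Partial Content response.
        const [startStr, endStr] = range.replace(/bytes=/, "").split("-");
        const start = parseInt(startStr, 10);
        const end = endStr ? parseInt(endStr, 10) : size - 1;
        res.writeHead(206, {
          "Content-Range": `bytes ${start}-${end}/${size}`,
          "Accept-Ranges": "bytes",
          "Content-Length": end - start + 1,
          "Content-Type": "video/mp4",
        });
        fs.createReadStream(VIDEO, { start, end }).pipe(res);
      } else {
        // No Range header: stream the whole file.
        res.writeHead(200, { "Content-Length": size, "Content-Type": "video/mp4" });
        fs.createReadStream(VIDEO).pipe(res);
      }
    }).listen(8080);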

Elegant way to determine total size of website?

Is there an elegant way to determine the size of data downloaded from a website, bearing in mind that not all requests will go to the same domain you originally visited, and that the browser may be polling other sites in the background at the same time? Ideally I'd like to look at the size of each individual page, or, for a Flash site, the total downloaded over time.
I'm looking for some kind of browser plug-in or Fiddler script. I'm not sure Fiddler would work due to the issues pointed out above.
I want to compare sites similar to mine for total filesize - and keep track of my own site also.
Firebug and HttpFox are two Firefox plugins that can be used to determine the size of data downloaded from a website for a single page. While Firebug is a great general-purpose tool for any web developer, HttpFox is a more specialized plugin for analyzing HTTP requests and responses (with their sizes).
You can install both and try them out; just be sure to disable one while the other is enabled.
If you need a website wide measurement:
If the website is made of plain HTML and assets (CSS, images, Flash, ...), you can check how big the folder containing the website is on the server (this assumes you can log in to the server)
You can mirror the website locally using wget, curl, or a GUI-based application like SiteSucker, and check how big the folder containing the mirror is (see the wget sketch after this list)
If you know the website is huge but you don't know how much, you can estimate its size. E.g., www.mygallery.com has 1,000 galleries; each gallery has an average of 20 images; every image is stored in 2 different sizes (thumbnail and full size) at an average of n KB per image; and so on
Keep in mind that if you download or estimate a dynamic website, you are dealing with what the website produces, not with the real size of the website on the server. A small PHP script can produce tons of HTML.
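A sketch of the mirror-and-measure approach with standard wget flags (substitute your own URL):

    # Mirror the site with its page requisites (CSS, images, scripts),
    # then measure the resulting folder.
    wget --mirror --page-requisites --convert-links http://example.com/
    du -sh example.com/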
Have you tried Firebug for Firefox?
The "Net" panel in Firebug will tell you the size and fetch time of each fetched file, along with the totals.
You can download the entire site and then you will know for sure!
https://www.httrack.com/
