I am currently working on a C project that contains an SQLite3 database with WAL enabled. We have an HTTP web interface over which users should be able to download an online backup of the database. Currently, the database file itself is reachable over HTTP, which is bad in many ways. My task now is to implement a proper backup mechanism.
The SQLite Online Backup API seems like a good fit: you open two database connections and copy one database to the other. However, in my setup I can't be sure that there is enough space to copy the entire database, since it may contain a lot of statistics and multimedia files. The ideal solution for me would be an SQLite connection attached directly to stdout, so that I could stream the backup out through CGI.
However, I haven't found a way in the SQLite3 API to open a database connection on special files like stdout. What would be best practice for backing up the database? How do you perform online backups of your SQLite3 databases?
Thanks in advance!
If you need a special target interface for the backup, you can implement a custom VFS that does what you need. See the parameters of sqlite3_open_v2(), which let you pass in the name of a VFS.
(See https://www.sqlite.org/c3ref/vfs.html for details about VFS and the OS interface used by SQLite.)
Basically, every sqlite3_backup_step() call writes a batch of pages, and your VFS would need to transfer those to your backup target in some way.
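To illustrate the stepwise copy (though the question is about the C API), Python's stdlib `sqlite3` module wraps the same online-backup machinery; this is a minimal sketch of the chunked `sqlite3_backup_step()`-style loop, with all file names illustrative:

```python
import sqlite3

def chunked_backup(src_path, dest_path, pages_per_step=5):
    """Copy a live database in small batches of pages,
    mirroring repeated sqlite3_backup_step() calls in the C API."""
    src = sqlite3.connect(src_path)

    def progress(status, remaining, total):
        # Invoked after each batch; a custom VFS on the target
        # would observe the corresponding writes.
        print(f"{total - remaining} of {total} pages copied")

    with sqlite3.connect(dest_path) as dest:
        src.backup(dest, pages=pages_per_step, progress=progress)
    dest.close()
    src.close()
```

A call like `chunked_backup("app.db", "backup.db")` then copies the database a few pages at a time while other connections keep using it.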
Related
I've never worked with databases before, but I know you can create them with MySQL. The thing is, I want to be able to read and write a database locally, without sending or receiving any data online, as that would be too slow for what I need.
Is there a way to create a stand-alone database (possibly on MySQL, then downloading it) and read/write it using Lua?
Thanks for any information; as I say, I've never touched databases.
Yes, it is.
You can download SQLite3, which is a simple relational SQL database. A database of this type is basically just a file (commonly with a .db extension), so you can keep it locally on your PC and easily exchange it between machines.
From a Lua standpoint, what you need is a library to access your DB, and I think LuaSQLite3 is the best option to go with. Check this SO post for a basic example.
IMO, using Lua in conjunction with SQLite3 is one of the best choices when you want a light, SQL-based local database without the added "complexity" of a more commercially oriented DB (e.g. PostgreSQL).
I want to build a program which needs a database. Is it possible to use a database without any pre-installed software or internet access on the client computer?
The only way to use a database without requiring internet access is if the database is on the same computer.
You can bundle a database with your application but then you wouldn't have a central database for everyone to update. If that fits your requirements, great. Otherwise you are out of luck.
Regardless, you will need some program to provide persistent storage. While XML is one option, if you want any database-like behavior, just search the internet for open source databases you could use if you don't want to pay for Oracle or SQL Server.
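As a concrete sketch of the "bundled database" option: an embedded engine like SQLite needs no server process and no network at all, the whole database is one local file shipped with the application. A minimal example using Python's bundled `sqlite3` module (the file name is illustrative):

```python
import sqlite3

# The entire database is this one local file; no server, no internet.
con = sqlite3.connect("local_app.db")
con.execute(
    "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)"
)
con.execute("INSERT INTO notes (body) VALUES (?)", ("works offline",))
con.commit()

# Reading back is an ordinary SQL query against the same file.
rows = con.execute("SELECT body FROM notes").fetchall()
con.close()
```

The trade-off from the answer above applies: every installed copy has its own file, so there is no shared central database.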
I have an in-memory SQLite database which I want to serialize and send to another computer. Is this possible without writing the database out to disk and reading the file from there?
You could use the online backup API to transfer the in-memory database to a file-based database created in shared memory (on Linux, in /dev/shm for instance), avoiding disk operations. This pseudo-file is then transferred to the remote host (again placed in /dev/shm), and the backup API is used in the other direction to load the file-based database into your target in-memory database.
See:
http://www.sqlite.org/backup.html
http://www.sqlite.org/c3ref/backup_finish.html
AFAIK, there is no API to perform an online backup/load without an intermediate database.
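The round trip described above can be sketched with Python's stdlib `sqlite3`, which wraps the same backup API; the function name is illustrative, and a temporary file path stands in for a /dev/shm location:

```python
import sqlite3

def roundtrip_via_file(mem_con, shm_path):
    """Dump an in-memory DB into a file-backed DB (e.g. a path under
    /dev/shm), then load it into a fresh in-memory connection."""
    # Step 1: in-memory -> file (the "online backup" direction).
    file_con = sqlite3.connect(shm_path)
    mem_con.backup(file_con)
    file_con.close()

    # ... the pseudo-file would be transferred to the remote host here ...

    # Step 2: file -> in-memory (the "online load" direction is the
    # same API with source and target swapped).
    remote_file = sqlite3.connect(shm_path)
    remote_mem = sqlite3.connect(":memory:")
    remote_file.backup(remote_mem)
    remote_file.close()
    return remote_mem
```

Note how "load" is nothing special: it is the backup API run with the file-based database as the source.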
The sqlite3 shell program contains a .dump command that "dumps the database in an SQL text format." You can use the source code for .dump (it is public domain) to create your own serializer.
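For a sense of what such a serializer produces, Python's stdlib `sqlite3` exposes the same idea as `Connection.iterdump()`; this sketch (function names are illustrative) serializes a database to SQL text and rebuilds it:

```python
import sqlite3

def dump_to_sql(con):
    """Serialize a database to SQL text, like the shell's .dump command."""
    return "\n".join(con.iterdump())

def load_from_sql(sql_text):
    """Rebuild an in-memory database from the dumped SQL text."""
    con = sqlite3.connect(":memory:")
    con.executescript(sql_text)
    return con
```

The resulting text is plain CREATE/INSERT statements, so it can be sent over any channel and replayed on the other side.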
I have a standard WinForms application that connects to a SQL Server. The application allows users to upload documents which are currently stored in the database, in a table using an image column.
I need to change this approach so the documents are stored as files and a link to the file is stored in the database table.
With the current approach, users are shielded from how an uploaded document is stored: since they have a connection to the database, they don't need to know anything about where the files are kept, and no special directory permissions are required. If I set up a network share for the documents, I want to avoid IT issues such as users needing access to that directory to upload or read existing documents.
What are the options available to do this? I thought of having a temporary database that documents are uploaded to in the same way as now, with a process running on the server that moves them to the file store. This database could then be deleted and recreated to reclaim space. Are there any better approaches?
ADDITIONAL INFO: There is no web server element to my application so I do not think a WCF service is possible
Is there a reason why you want to get the files out of the database in the first place?
How about still saving them in SQL Server, but using a FILESTREAM column instead of IMAGE?
Quote from the link:
FILESTREAM enables SQL Server-based applications to store unstructured data, such as documents and images, on the file system. Applications can leverage the rich streaming APIs and performance of the file system and at the same time maintain transactional consistency between the unstructured data and corresponding structured data.
FILESTREAM integrates the SQL Server Database Engine with an NTFS file system by storing varbinary(max) binary large object (BLOB) data as files on the file system. Transact-SQL statements can insert, update, query, search, and back up FILESTREAM data. Win32 file system interfaces provide streaming access to the data.
FILESTREAM uses the NT system cache for caching file data. This helps reduce any effect that FILESTREAM data might have on Database Engine performance. The SQL Server buffer pool is not used; therefore, this memory is available for query processing.
So you would get the best out of both worlds:
The files would be stored as files on the hard disk (probably faster than storing them in the database), but you don't have to care about file shares, permissions, etc.
Note that you need at least SQL Server 2008 to use FILESTREAM.
I can tell you how I implemented this task: I wrote a WCF service that sends archived files. So if I were you, I would create such a service, able both to save files and to send them back. This is easy, but you must also make sure that the account under which the WCF service runs has permission to read and write the files.
You could just have your application pass the object to a procedure (a CLR procedure, maybe) in the database, which then writes the data out to a location of your choosing without storing the file contents. That way you still have a layer of abstraction between the file store and the application, but you don't need a process that cleans up after you.
Alternatively, a WCF/web service could be created that the application connects to. A web method could accept the file contents, write them to the correct place, and return the path to the file or some file identifier.
I am looking to set up an FTP server without connecting it to a file system.
I want to use a database to store which of the many large files on my site each user will have access to. Because of the number and size of the files involved, they cannot all be stored on a single server, so a link-based setup is not useful.
I am imagining an FTP server that will act as a pass-through for a backend CDN that stores all the files and checks a remote database for which files to present.
Does a system like this exist? If it doesn't exist, Which open source FTP server would be easiest to modify to suit my needs?
Have you looked at JScape?
It costs money, but it has this capability.
Since you are on Stack Overflow, I assume you are ready for coding. In that case, you can use the FTP/FTPS server component included in FTPSBlackbox, part of our SecureBlackbox product. It lets you handle all operations yourself, so you are not bound to the file system in any way.
We also have an SFTP server (SSH File Transfer Protocol) with a similar design.