How to connect a database with files

What would be the best way to insert metadata into a database so that it is logically connected to files that are stored locally on a web server?

In general, databases control their own storage, and the proper procedure is to load data into tables in the database. This is important because the engine manages its own storage and memory: in a typical configuration, you don't want to be accessing files that are being updated by another application, and you typically don't want to be storing database data over the network.
The general answer to the question, then, is that you want to load the data into the database.
That said, many database engines allow you to remotely access data in other databases or through a technology such as ODBC. You can get drivers for flat files, even ones stored remotely on the network; however, this is not an optimal setup for querying. Alternatively, a database can be used to manage metadata for files stored elsewhere, such as image files on disk. The purpose is to allow searches through the metadata which, in essence, return file names that are then resolved (either on the client side or the server side, depending on the architecture).
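For illustration, here is a minimal sketch of such a metadata table; the table and column names are hypothetical:

    -- Hypothetical metadata table: the database holds searchable attributes
    -- plus a path pointing at the file on the web server's disk.
    CREATE TABLE file_metadata (
        file_id      INT PRIMARY KEY,
        file_path    VARCHAR(500) NOT NULL,  -- resolved client- or server-side
        title        VARCHAR(200),
        content_type VARCHAR(100),
        uploaded_at  DATETIME
    );

    -- A metadata search returns paths, not file contents:
    SELECT file_path
    FROM file_metadata
    WHERE content_type = 'image/png'
      AND uploaded_at >= '2015-01-01';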
You should, perhaps, ask another question with a lot more detail about what you are trying to accomplish and about which database you are using.

Related

Is it safe to store server-side information as ordinary files with no database

I usually use PHP and MariaDB on the server side with web applications (a LAMP stack). Typically, if I needed to store information, say a username, a hashed/salted password, or JSON/text data, I would simply use the server database to store the information in a table using SQL statements.
Recently, however, I have had some small amounts of simple data and it just seems like overkill to use a database to store a handful of strings. Sometimes I don't really need any of the database functions or the ability to query the data.
My question is:
Can't I just make a folder with PHP, change its permissions to disallow public access (like 600 or 640), and just store a few text files in that folder? Is this safe from a security standpoint?

Linked Server vs. Ad Hoc (OpenRowset / OpenDatasource) Distributed Queries

I have an application which needs to grab data from different clients' databases that reside at the client's location. So alongside their normal details such as company name, address etc, I also store the name of their DB server and the name of the database I need to interrogate.
The number of clients is currently zero. However, I expect this to grow to around 200+ within a year.
I am a bit confused with which option to go with to run distributed queries:
Creating linked servers for every single client (up to 200+!)
Using an OpenDataSource() or OpenRowset() ad hoc query within an SP that feeds in the DB server and DB name dynamically from the client account table
Option 2 sounds the easiest to manage, because if a client moves their server, they just need to update their details in their account once and everything should keep working.
But the reason I'm confused is because of this statement on Microsoft's site:
OPENDATASOURCE should be used only to reference OLE DB data sources that are accessed infrequently. For any data sources that will be accessed more than several times, define a linked server.
The external DBs will get accessed quite frequently, with the majority of transactions being SELECT statements.
I'm also unsure about the security implications, and which option is tighter from a security standpoint. Does anyone have experience in this area and could give me some tips please?
I have used both. Linked servers are typically used if a database has lookup data and is connected to often. There is nothing wrong with using dynamic OPENDATASOURCE to connect to your clients' machines. Be very security-aware as to where, and how, you store your clients' credentials. You probably should read up on encrypting passwords and usernames.
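As a rough sketch of the two options (the server, database, and table names below are hypothetical):

    -- Option 1: one linked server per client.
    EXEC sp_addlinkedserver
        @server     = N'CLIENT_ACME',
        @srvproduct = N'',
        @provider   = N'SQLNCLI',
        @datasrc    = N'acme-db.example.com';

    SELECT OrderId, Amount
    FROM CLIENT_ACME.SalesDb.dbo.Orders;

    -- Option 2: ad hoc OPENDATASOURCE. Note that OPENDATASOURCE does not
    -- accept variables, so a server name read from the account table has to
    -- be spliced into dynamic SQL, and 'Ad Hoc Distributed Queries' must be
    -- enabled on the instance.
    DECLARE @srv NVARCHAR(128) = N'acme-db.example.com';  -- from the account table
    DECLARE @sql NVARCHAR(MAX) = N'
        SELECT OrderId, Amount
        FROM OPENDATASOURCE(''SQLNCLI'',
             ''Data Source=' + @srv + N';Integrated Security=SSPI'').SalesDb.dbo.Orders;';
    EXEC sp_executesql @sql;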

How to use a database without a pre-required program

I want to build a program which needs a database to be used in it. Is it possible to use a database without a pre-installed program and internet access on the client computer?
The only way to use a database without requiring internet access is if the database is on the same computer.
You can bundle a database with your application but then you wouldn't have a central database for everyone to update. If that fits your requirements, great. Otherwise you are out of luck.
Regardless, you will need some program to perform persistent storage. While XML is one option, if you want any database-like behavior, look for an open-source database (embedded engines such as SQLite are a common choice) if you don't want to pay for Oracle or SQL Server.
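For example, an embedded engine such as SQLite keeps the whole database in a single local file and is queried with ordinary SQL, so no server process or internet connection is required; the schema below is purely illustrative:

    -- Runs against a local SQLite file shipped with the application;
    -- no server process or network access is involved.
    CREATE TABLE IF NOT EXISTS settings (
        key   TEXT PRIMARY KEY,
        value TEXT
    );

    INSERT OR REPLACE INTO settings (key, value) VALUES ('theme', 'dark');

    SELECT value FROM settings WHERE key = 'theme';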

Data warehousing using local DB - Beginner

I want to get an idea of how to achieve this:
I have an application that runs at five different geographical locations, e.g. Texas, NY, California, Boston, Washington.
This application saves data to a local database, which is located at that location.
I want to do data warehousing. So is it a must to have just one database (where all five applications save their data to a single database, without local DBs)?
Or is it possible to keep the five local databases and do the warehousing by pulling data from those local DBs into a central DB?
Please give me your thoughts and references.
You have three options for this:
you use a single, centrally hosted database server. Typical relational database servers can be accessed directly over the network these days: MySQL, PostgreSQL, Oracle, ... This means you can implement an application that opens a network connection to the database server and uses that remote server to store and retrieve data as required. Multiple connections are possible at the same time.
you use a single, central database server but put a wrapper around it: a small application layer acting as a broker. This way you can address that central instance over the network, but via standard protocols such as HTTP.
you use a decentralized approach and install a database instance at each location. Then you need some additional tool to perform synchronization; for most modern database servers (see above) such tools exist, but the setup is not trivial. A sketch of this approach follows below.
If in doubt, and if the load is not that high, go with the first alternative.
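As a rough sketch of the decentralized approach, a central warehouse could periodically pull new rows from each site, here over a linked server; all names are hypothetical:

    -- Incremental pull from the Texas site into the central warehouse.
    -- TEXAS_DB is a linked server pointing at that location's local database.
    INSERT INTO dw.dbo.SalesFact (SiteId, OrderId, Amount, OrderDate)
    SELECT 1, o.OrderId, o.Amount, o.OrderDate
    FROM TEXAS_DB.AppDb.dbo.Orders AS o
    WHERE o.OrderDate > (SELECT COALESCE(MAX(OrderDate), '1900-01-01')
                         FROM dw.dbo.SalesFact
                         WHERE SiteId = 1);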

WinForms application design - moving documents from SQL Server to file storage

I have a standard WinForms application that connects to a SQL Server. The application allows users to upload documents which are currently stored in the database, in a table using an image column.
I need to change this approach so the documents are stored as files and a link to the file is stored in the database table.
Using the current approach, when a user uploads a document they are shielded from how it is stored: they have a connection to the database, so they do not need to know anything about where the files are stored, and no special directory permissions etc. are required. If I set up a network share for the documents, I want to avoid any IT issues, such as the users having to have access to this directory to upload or access existing documents.
What are the options available to do this? I thought of having a temporary database where the documents are uploaded to in the same way as the current approach and then a process running on the server to save these to the file store. This database could then be deleted and recreated to reclaim any space. Are there any better approaches?
ADDITIONAL INFO: There is no web server element to my application so I do not think a WCF service is possible
Is there a reason why you want to get the files out of the database in the first place?
How about still saving them in SQL Server, but using a FILESTREAM column instead of IMAGE?
Quote from the link:
FILESTREAM enables SQL Server-based applications to store unstructured data, such as documents and images, on the file system. Applications can leverage the rich streaming APIs and performance of the file system and at the same time maintain transactional consistency between the unstructured data and corresponding structured data.
FILESTREAM integrates the SQL Server Database Engine with an NTFS file system by storing varbinary(max) binary large object (BLOB) data as files on the file system. Transact-SQL statements can insert, update, query, search, and back up FILESTREAM data. Win32 file system interfaces provide streaming access to the data.
FILESTREAM uses the NT system cache for caching file data. This helps reduce any effect that FILESTREAM data might have on Database Engine performance. The SQL Server buffer pool is not used; therefore, this memory is available for query processing.
So you would get the best of both worlds:
The files would be stored as files on the hard disk (probably faster than storing them in the database), but you don't have to care about file shares, permissions, etc.
Note that you need at least SQL Server 2008 to use FILESTREAM.
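A minimal sketch of what such a table could look like, assuming FILESTREAM has been enabled on the instance and the database has a FILESTREAM filegroup (the names are illustrative):

    -- FILESTREAM requires a uniqueidentifier ROWGUIDCOL column with a UNIQUE
    -- constraint; the Content column's data lives as files on NTFS.
    CREATE TABLE dbo.Documents (
        DocumentId UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
        FileName   NVARCHAR(260) NOT NULL,
        Content    VARBINARY(MAX) FILESTREAM NULL
    );

    -- The application keeps using plain INSERT/SELECT, just as with IMAGE:
    INSERT INTO dbo.Documents (FileName, Content)
    VALUES (N'manual.pdf', 0x255044462D312E34);  -- sample bytes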
I can tell you how I implemented this task. I wrote a WCF service which is used to send archived files. So, if I were you, I would create such a service that is able to save files and send them back. This is easy, but you must also make sure that the account under whose context the WCF service runs has permission to read and write the files.
You could just have your application pass the object to a procedure (CLR, maybe) in the database, which then writes the data out to a location of your choosing without storing the file contents in the database. That way you still have a layer of abstraction between the file store and the application, but you don't need a process that cleans up after you.
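The T-SQL side of that could look like the following; dbo.WriteDocumentToStore is a hypothetical CLR procedure that writes the bytes to disk and returns the resulting path:

    -- Hypothetical CLR procedure call: the application hands over the bytes
    -- once, the procedure writes the file, and only the path is stored.
    DECLARE @path NVARCHAR(500);
    EXEC dbo.WriteDocumentToStore
         @content  = 0x255044462D312E34,  -- document bytes from the app
         @fileName = N'manual.pdf',
         @path     = @path OUTPUT;

    INSERT INTO dbo.DocumentLinks (FileName, FilePath)
    VALUES (N'manual.pdf', @path);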
Alternatively, a WCF/web service could be created which the application connects to. A web method could accept the file contents and write them to the correct place; it could return the path to the file or some file identifier.
