Storing Database Locally in File/Folder

I'm in the middle of doing a personal project and would like to create a system of three components.
A simple form application that would allow the user to input data into a database.
A database of multiple tables.
An excel spreadsheet that queries the database.
At this point in the project, the database needs to be stored on the PC of the person working on the project, and all three components need to be able to be zipped up in a folder and emailed around. I know how to code well enough to query databases from applications and Excel, but how can I go about creating a database that can be stored in a specific folder so it can be emailed around?
Thanks!

Look into SQL Server Compact Edition:
http://en.wikipedia.org/wiki/SQL_Server_Compact
http://xldennis.wordpress.com/2010/08/30/using-sql-server-compact-edition-database-with-excel/
MS Access might also be an option here.
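With SQL Server Compact, the whole database is a single .sdf file that can sit in your project folder, be zipped, and be emailed. The file itself is created from Visual Studio (Add New Item > Local Database) or programmatically via the SqlCeEngine API; after that, ordinary DDL works against it. A minimal sketch, with hypothetical table names:

    -- Hypothetical schema; SQL Server Compact supports a subset of T-SQL DDL.
    CREATE TABLE Products (
        ProductId INT IDENTITY(1,1) PRIMARY KEY,
        Name NVARCHAR(100) NOT NULL,
        Price MONEY NULL
    );

    CREATE TABLE Orders (
        OrderId INT IDENTITY(1,1) PRIMARY KEY,
        ProductId INT NOT NULL,
        Quantity INT NOT NULL,
        CONSTRAINT FK_Orders_Products
            FOREIGN KEY (ProductId) REFERENCES Products(ProductId)
    );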

Related

SQL Server Partitioning and Configuration Management

I work in a team of developers and we currently manage our SQL Server database schema (tables, stored procedures, user defined types, etc) through TFS and Visual Studio using a database project. We keep our local development copies of the database in sync using the Schema Compare tools in Visual Studio.
I'm currently setting up partitioning on a couple of huge tables of data, which has resulted in 500+ filegroups and files, based partly on which data we want on SSDs versus spinning HDDs.
My question is, does anyone have a suggestion or experience on how to manage the database schema in TFS such that each developer doesn't have to setup the 500+ FileGroups/Files on each development machine?
The reason I want to avoid this is because:
On our development machines we will only have a small amount of data loaded, based on disk space.
We plan to have a maintenance job on our production server to move data from SSD partitions to HDD partitions based on age. This means our production partitioning function won't match our development machines for very long anyway.
First, for your situation you can try using a server workspace. When you need to modify files or projects, you just check them out.
When you use a server workspace, Visual Studio keeps only one copy of each file. This can significantly reduce disk space usage and improve performance when you have a lot of items. We recommend that you use a server workspace if:
Your workspace contains more than 100,000 items.
You want to use Visual Studio 2010 or earlier versions to work with the workspace.
You need to use the Enable get latest on check-out option.
I'm not sure how you handled the database project. Putting an existing database under source control consists of the following steps:
You create a database project.
You connect to an existing database.
You import the database schema from the existing database into the database project.
You review the results that are shown in the database project.
You put the database project and its contents under version control.
You can also use an SSDT Database Project for your data warehouse. Below is an example of how you could structure your database project (only a few tables and views were shown in the original screenshots, for brevity). You don't have to structure it this way, but in this project it's sorted first by schema, then by object type (table, view, etc.), then by object (table name and its DDL, etc.).
For more detail, please refer to this blog post: Why You Should Use a SSDT Database Project For Your Data Warehouse
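On the original filegroup problem itself, one possible workaround - a hedged suggestion, not something from the blog above - is to keep the partition function identical everywhere but give development machines a partition scheme that maps every partition to PRIMARY, so nobody has to create the 500+ filegroups and files locally. A minimal sketch with hypothetical names and boundary values:

    -- Shared in source control: the partition function (boundaries are illustrative).
    CREATE PARTITION FUNCTION pfByMonth (date)
    AS RANGE RIGHT FOR VALUES ('2013-01-01', '2013-02-01', '2013-03-01');

    -- Development machines: map all partitions to PRIMARY, no extra filegroups/files.
    CREATE PARTITION SCHEME psByMonth
    AS PARTITION pfByMonth ALL TO ([PRIMARY]);

    -- Production only: map partitions to the real SSD/HDD filegroups instead, e.g.
    -- CREATE PARTITION SCHEME psByMonth
    -- AS PARTITION pfByMonth TO (FG_SSD_2013_01, FG_HDD_2013_02, FG_HDD_2013_03, FG_HDD_Rest);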

Database project connect to database instance - how to manage a subset of the actual database in the project

We have an existing SQL Server database, and I'd like to create a Visual Studio database project for it and put some of the scripts for the stored procedures in that database in source control. So, I thought I would create a new database project. Is there a way I can wire up this database project to the database? I thought it might be "Add Database Reference", but that only gives me options to use some other database project in the current solution, not set up the current project to be connected to an actual database.
Ideally, I'd like to be able to right-click on my project and do a "Publish" and have all my database info prefilled. I realize that I might be thinking of this wrong, but searching around on the web is of little help. It is surprising how poorly these concepts are documented.
EDIT: After the first answer, maybe my question really is: How can I have only a part of a database managed in a database project? I had assumed this was an ordinary thing that people did all the time with database projects, but maybe not. In my case, I would like to have only some of the stored procedures in source control.
After further research, it seems the answer is, "no, there is no way" to have only a part of a database managed in a database project.
I'm using VS 2013 but I think this is valid back to VS 2008.
Right-click the database project.
Then choose Import -> Database.
Set up a connection to your target database.
When you import, select the import setting Folder structure: Schema\Object type.
And run the wizard.
This builds a folder structure containing SQL scripts for your schema objects.
Alternatively, if you have SQL Server Data Tools, you could run a schema compare against a blank project.
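For illustration, here is a hedged example of the kind of per-object script file the import wizard generates under that folder structure (the schema, procedure, and table names are hypothetical):

    -- e.g. dbo\Stored Procedures\usp_GetCustomer.sql
    CREATE PROCEDURE [dbo].[usp_GetCustomer]
        @CustomerId INT
    AS
    BEGIN
        SELECT CustomerId, Name
        FROM dbo.Customer
        WHERE CustomerId = @CustomerId;
    END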

How to build database reports using multiple remote databases

Does anyone have experience building database reports - it doesn't matter which database, I just want design ideas - for a system that is made up of many separate, but identical, databases?
I cannot "combine" all databases into one. They must be separate.
But the structure is identical across all databases...
I need to build a web interface that will allow a user to get a "global" report that will query all databases and build one combined report.
Do you have any comments on what the model should look like, or anything you think I need to beware of?
Thanks.
I don't have first-hand experience with cross-database reports; my experience comes from a product the company I work for sells, which can create reports from multiple databases. From your description, I believe you need something of the "combine tables" kind. In that case, I recommend detecting the tables used in the query, unifying them in a single temporary intermediary database - for example Access, SQL Server CE, or SQLite - and then running the query against this temporary database or table.
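If the identical databases all live on the same SQL Server instance, the consolidation step can be plain T-SQL using three-part names; a hedged sketch where all database, table, and column names are illustrative (databases on separate servers would need linked servers and four-part names):

    -- Pull the same table from each per-site database into one staging table.
    INSERT INTO Staging.dbo.AllSales (SourceDb, OrderId, Amount)
    SELECT 'BranchA', OrderId, Amount FROM BranchA.dbo.Sales
    UNION ALL
    SELECT 'BranchB', OrderId, Amount FROM BranchB.dbo.Sales
    UNION ALL
    SELECT 'BranchC', OrderId, Amount FROM BranchC.dbo.Sales;

    -- The "global" report then queries Staging.dbo.AllSales like any single table.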
If your databases are Microsoft SQL Server, then using SQL Server Reporting Services seems like a good solution. The software for the report generation / display is bundled along with the database software.
It gives you a web interface, where you can configure 'data sources' from any number of remote databases, and combine data from these sources into reports. It is user friendly and you can do all the report design / configuration through the web interface without having to write any code.
Some references:
Building report using SQL Server stored procedure
http://blog.hoegaerden.be/2009/11/10/reporting-on-data-from-stored-procedures-part-1/

Automatically deploying changes to a web application

What's the best way to automatically deploy changes to a database driven web application? Is there a single product out there that can modify the following...
Website (dlls, aspx, css files etc)
Database Schema (add tables, columns, etc)
Database data (modify table contents)
Reporting Services reports
I've seen various separate products, but not one that does everything.
Yes, there is a PowerShell method posted at http://www.codeproject.com/KB/install/DeploySite.aspx
Note: in addition, for the schema stuff you might need to upload a schema.version file and then have a process on that server detect that a new schema file was uploaded and apply it (a sketch of that idea follows). For new database rows you could maybe do something similar. Another idea is that you could expose the SQL database as a web service and talk to it directly with your PowerShell script.
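As a hedged sketch of that schema.version idea (the version table and numbers here are made up, not part of the linked article), each uploaded schema script can guard itself so the server-side process can apply it safely more than once:

    -- Apply schema change 42 only if it hasn't been applied yet.
    IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE Version = 42)
    BEGIN
        ALTER TABLE dbo.Orders ADD ShippedDate DATETIME NULL;
        INSERT INTO dbo.SchemaVersion (Version, AppliedAt) VALUES (42, GETDATE());
    END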

Updating database on website from another data store

I have a client who owns a business with a handful of employees. He has a product website that has several hundred static product pages that are updated periodically via FTP.
We want to change this to a data-driven website, but the database (which will be hosted at an ISP) will have to be updated from data on my client's servers.
How best to do this on a shoestring? Can the database be hot-swapped via FTP, or do we need to build a web service we can push changes to?
Ask the ISP about the options. Some ISPs allow you to FTP-upload the .mdf (database file); some will allow you to connect with SQL Server Management Studio; some will allow both. You have to ask the ISP.
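If the ISP does take an .mdf upload, the attach step on their end typically looks something like this (a sketch only; the path and database name are illustrative):

    -- Attach an uploaded data file; FOR ATTACH_REBUILD_LOG can be used
    -- if the log file wasn't uploaded alongside it.
    CREATE DATABASE ShopDb
    ON (FILENAME = 'D:\Data\ShopDb.mdf')
    FOR ATTACH;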
Last time I did this, we created XML documents that were FTP'd to the website. We had an admin page that would clear out the old data by running some stored procs to truncate the tables, then import the XML docs into the SQL tables.
Since we didn't have the whole server to ourselves, there was no access to SQL Server DTS to schedule this stuff.
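A hedged sketch of what one of those truncate-and-import procs could look like using SQL Server's xml data type (all names are hypothetical; this is an illustration, not the original code):

    -- Reload dbo.Product wholesale from an uploaded XML document.
    CREATE PROCEDURE dbo.usp_ImportProducts
        @doc XML
    AS
    BEGIN
        TRUNCATE TABLE dbo.Product;

        INSERT INTO dbo.Product (ProductId, Name, Price)
        SELECT p.value('(Id)[1]',    'int'),
               p.value('(Name)[1]',  'nvarchar(100)'),
               p.value('(Price)[1]', 'money')
        FROM @doc.nodes('/Products/Product') AS T(p);
    END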
There is a Database Publishing Wizard from MS which will take all your data and create a SQL file that can then be run on the ISP. It will also, though I've never tried it, go directly to an ISP database. There is an option button on one of the wizard screens that does it.
It does require the user to have a little training, and it's still a manual process, so maybe not what you're after, but I think it will do the job.
Long-term, building a service to upload the data is probably the cleanest solution, as the app can then control its import procedures. You could go grossly simple with this and just have the local copy dump some sort of XML that the app could read, making it not much harder than uploading the file while still in the automatable category. Having this import procedure would also help with development, as you now have an automated and repeatable way to sync data.
This is what I usually do:
You could use a tool like Red Gate's SQL Data Compare to do this. The tool compares data between two catalogs (on the same or different servers) and generates a script for syncing them.
