I am very new to SQL Server. For practice, I created a database where I imported some sample datasets (mostly Excel files) via SSIS. The database consists of 6 tables. I was wondering if there is a way I can send the database I created via email or share it some other way. Just as AdventureWorks was available for download and for attaching to SQL Server, is there a way I can do the same for my practice database?
Thank you,
Regards,
Sourav
You can back up the database you created to a file and send it to anyone. Right-click your database, then Tasks -> Back Up...
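If you'd rather script it, the T-SQL equivalent is roughly this (the database name and path are just examples):

    BACKUP DATABASE PracticeDB
    TO DISK = 'C:\Backups\PracticeDB.bak'
    WITH INIT, NAME = 'PracticeDB full backup';

Send the resulting .bak file; on the other machine it can be restored via Tasks -> Restore or a RESTORE DATABASE statement.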
Related
I would like to know whether a database created in phpMyAdmin is erased on a daily basis
Yesterday I created a database by clicking the "new" option under the server "phpmyadmin demo-MYSQL(root)". Today that database doesn't exist. Any suggestions, please, and let me know whether I missed any steps before creating the database.
You were using a demo.phpmyadmin.net server which is not intended for production and which removes databases created by testers on a daily basis.
I am very new to SQL and I want to update a table on a new site of mine with the costing table from an older live DB. I have read that I need the .mdf and .ldf files from the live DB, but I have no idea how to create or get them. Please advise on this problem.
regards
If you're using SQL Server, then what you need to know is how to back up and restore a database.
The following tutorials will explain how each is done.
Backup Tutorial
Restore Tutorial
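In T-SQL terms, the two steps look roughly like this (file names, logical names and paths are examples; check the real logical names with RESTORE FILELISTONLY):

    -- On the live server
    BACKUP DATABASE LiveDb TO DISK = 'C:\Backups\LiveDb.bak' WITH INIT;

    -- On the new server: check the logical file names first
    RESTORE FILELISTONLY FROM DISK = 'C:\Backups\LiveDb.bak';

    RESTORE DATABASE LiveDb
    FROM DISK = 'C:\Backups\LiveDb.bak'
    WITH MOVE 'LiveDb' TO 'D:\Data\LiveDb.mdf',
         MOVE 'LiveDb_log' TO 'D:\Data\LiveDb_log.ldf';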
I have a few databases that I want to migrate to another server. These are production databases; what is the best way to migrate them?
1) Take a full backup of the current database and then restore it onto the other server.
or
2) Detach the database, copy the .mdf/.ldf files to the destination server, and then attach the files there (rough sketch below).
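For reference, my understanding of option 2 is roughly this (database name and paths are just examples):

    USE master;
    ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    EXEC sp_detach_db @dbname = N'MyDb';
    -- copy MyDb.mdf and MyDb_log.ldf to the destination server, then run there:
    CREATE DATABASE MyDb
    ON (FILENAME = N'D:\Data\MyDb.mdf'),
       (FILENAME = N'D:\Data\MyDb_log.ldf')
    FOR ATTACH;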
I know that after migrating, SQL logins and SQL Agent jobs will have to be created manually. Are there any other risks that come to mind?
Any help would be appreciated.
Thanks,
Ben
For login transfer, use "sp_help_revlogin"; you can get the script here:
https://support.microsoft.com/en-us/kb/918992
This stored procedure lists out all the instance-level logins, not the logins of a particular database. One special thing about this stored procedure is that there is no need to fix orphaned users afterwards. Just migrate the logins and check. It works.
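Assuming you have already created sp_hexadecimal and sp_help_revlogin from the script in that KB article on the source instance, usage is simply:

    -- On the source instance, after running the script from KB 918992:
    EXEC sp_help_revlogin;
    -- The output is a set of CREATE LOGIN statements (with hashed passwords and
    -- original SIDs); run that generated script on the destination instance.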
I have a database with data that I don't want anyone to copy around.
Now, how can I prevent other users from having access on the local machine, while still being able to work with it myself whenever I want?
And another issue I'm thinking about: how can I lock the DB so that if someone copies the .mdf file and tries to attach it on another machine, he/she couldn't see a thing?
There are many different kinds of encryption options available in SQL Server. If you don't want to have to re-write your application then the new feature in SQL Server 2008 called Transparent Data Encryption (TDE) is for you. It means you can encrypt the data files to prevent unauthorized users getting at the data from a backup or by copying the data files.
This MSDN link should be enough to get you started.
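To give a rough idea, enabling TDE looks something like this (database and certificate names are placeholders):

    USE master;
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
    CREATE CERTIFICATE TdeCert WITH SUBJECT = 'TDE certificate';
    GO
    USE MyDatabase;
    CREATE DATABASE ENCRYPTION KEY
        WITH ALGORITHM = AES_256
        ENCRYPTION BY SERVER CERTIFICATE TdeCert;
    ALTER DATABASE MyDatabase SET ENCRYPTION ON;
    -- Back up TdeCert and its private key; without them you cannot restore
    -- the encrypted database (or its backups) on another server.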
You probably want to use database encryption to achieve this. Then the .mdf file will be useless.
Some links
http://blog.sqlauthority.com/2009/04/28/sql-server-introduction-to-sql-server-encryption-and-symmetric-key-encryption-tutorial-with-script/
http://msdn.microsoft.com/en-us/library/cc278098.aspx
http://technet.microsoft.com/en-us/library/bb510663.aspx
http://support.microsoft.com/kb/316898
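If you instead go for cell-level encryption with a symmetric key, the pattern is roughly this (table and column names are made up for illustration; the encrypted column needs to be varbinary):

    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password here>';
    CREATE CERTIFICATE ColCert WITH SUBJECT = 'Column encryption';
    CREATE SYMMETRIC KEY ColKey
        WITH ALGORITHM = AES_256
        ENCRYPTION BY CERTIFICATE ColCert;

    OPEN SYMMETRIC KEY ColKey DECRYPTION BY CERTIFICATE ColCert;

    -- Encrypt on write (CardNumberEnc is a varbinary column)
    UPDATE dbo.Customers
    SET CardNumberEnc = ENCRYPTBYKEY(KEY_GUID('ColKey'), CardNumber);

    -- Decrypt on read
    SELECT CONVERT(varchar(30), DECRYPTBYKEY(CardNumberEnc)) AS CardNumber
    FROM dbo.Customers;

    CLOSE SYMMETRIC KEY ColKey;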
I have a client who owns a business with a handful of employees. He has a product website that has several hundred static product pages that are updated periodically via FTP.
We want to change this to a data-driven website, but the database (which will be hosted at an ISP) will have to be updated from data on my client's servers.
How best to do this on a shoestring? Can the database be hot-swapped via FTP, or do we need to build a web service we can push changes to?
Ask the ISP about the options. Some ISPs allow you to upload the .mdf (database file) via FTP.
Some will allow you to connect with SQL Server Management Studio.
Some will allow both.
You have to ask the ISP.
Last time I did this, we created XML documents that were FTP'd to the website. We had an admin page that would clear out the old data by running some stored procs to truncate the tables, then import the XML docs into the SQL tables.
Since we didn't have the whole server to ourselves, there was no access to SQL Server DTS to schedule this stuff.
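A stripped-down sketch of that admin-page logic, with made-up table/column names and file path, would be something like:

    -- Clear out the old data
    TRUNCATE TABLE dbo.Products;

    -- Load the uploaded XML file (path is an example; SQL Server must be able to read it)
    DECLARE @xml xml;
    SELECT @xml = CONVERT(xml, BulkColumn)
    FROM OPENROWSET(BULK 'D:\ftp\products.xml', SINGLE_BLOB) AS src;

    -- Shred the XML into the table
    INSERT INTO dbo.Products (ProductId, Name, Price)
    SELECT p.value('(Id)[1]', 'int'),
           p.value('(Name)[1]', 'nvarchar(100)'),
           p.value('(Price)[1]', 'decimal(10,2)')
    FROM @xml.nodes('/Products/Product') AS t(p);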
There is a Database Publishing Wizard from MS which will take all your data and create a SQL file that can then be run on the ISP. It will also, though I've never tried it, go directly to an ISP database. There is an option button on one of the wizard screens that does it.
It does require the user to have a little training and it's still a manual process, so maybe not what you're after, but I think it will do the job.
Long-term, building a service to upload the data is probably the cleanest solution, as the app can then control its import procedures. You could go grossly simple with this and just have the local copy dump some sort of XML that the app could read, making it not much harder than uploading the file while still in the automatable category. Having this import procedure would also help with development, as you now have an automated and repeatable way to sync data.
This is what I usually do:
You could use a tool like Red Gate's SQL Data Compare to do this. The tool compares data between two catalogs (on the same or different servers) and generates a script for syncing them.