Export MSSQL DB, Import in shared environment - sql-server

We are in the process of migrating from a VPS to a shared environment. The VPS is running Studio Express 2005, which is quite limited in its export functionality.
I have managed to export a database in .bak format and upload (Restore) it to the shared environment.
However, here comes the problem: the schema has come along with the database, which causes problems when connecting via ASP.
The table name structure is as follows [SCHEMA].[TABLE_NAME].
The shared environment does not allow changing schemas or many other advanced features. (It's running myLittleAdmin.)
So I guess the schema changes would have to be done on the database, which would then be exported and imported again.
PS: I'm new to MSSQL and more experienced with MySQL.

OK, so I have found a solution to this.
Export the schema from Studio Express using Right Click > Tools > Generate Script.
Open the generated script and find-and-replace the old user/schema with your new one (an ALTER SCHEMA alternative is sketched below).
Execute this script on the server.
Use a tool such as SQL Dumper (http://sqldumper.ruizata.com/) to export the DB to .SQL.
Run the same find-and-replace on this file, swapping the old user for the new one.
Copy this SQL and execute it on the server.
Job done!
Joe
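For anyone who prefers to fix the schema on the source database before exporting (as hinted at in the question), here is a minimal T-SQL sketch, assuming the objects currently sit in a schema called OLDUSER and should end up under dbo (both names are placeholders):

-- Move a single table from the old schema to dbo
ALTER SCHEMA dbo TRANSFER [OLDUSER].[TABLE_NAME];

-- Or generate a transfer statement for every table in the old schema
SELECT 'ALTER SCHEMA dbo TRANSFER ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ';'
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE s.name = 'OLDUSER';

Run the generated statements, then export as usual; the tables then become addressable as [dbo].[TABLE_NAME] on the shared host.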

Related

Make SQL Server LocalDb portable

I'm setting up a new machine and, to my surprise, it is really difficult to get LocalDB working on it. Everything in my Solution Explorer is populated, but nothing shows up in my (localdb) project after I fetch from source (using Git).
What is the best way to get my localdb project from my old machine to this new one?
In an ideal world, what I'd like to do is have it set up so that when someone pulls the SSDT project down from our source, they can hit deploy and completely populate the localdb project on their machine. Has anyone done this before, or know how to do this?
--Disclaimer--
I feel like when talking about localdb it's important to make some clarifications, because it seems that when other people post questions like this they get a lot of responses that apply to SQL Server databases.
What a localdb project is:
A localdb SQL Server project is a special server instance that runs only when it's connected to, acts partly like a SQL Server, and contains special instances of databases that only run when a connection is made to them. A localdb can be used for certain types of production, but it is most often used as a means to test other databases.
What a localdb project is not:
A localdb project is NOT a SQL Server database.
I'm well aware that I could back up every DB in my project and manually recover them on the new machine, but that is not what I'm looking to do.
You can achieve this with a detach-and-attach process.
Step 1: Find the location of the localDB database.
Right-click on the database name and select Properties.
The Data File property shows the database's current location.
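If you prefer a query to the Properties dialog, the same information can be pulled from the catalog views (database name taken from this example):

-- List the physical .mdf/.ldf locations for the database
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID(N'aspnet-IdentityApplication-E2BBF1E6-123-4567-8910-07BC0413419B');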
Step 2: Detach the current database
EXEC sp_detach_db 'aspnet-IdentityApplication-E2BBF1E6-123-4567-8910-07BC0413419B', 'true';
Step 3: Copy the localDB database files (.mdf and .ldf) to the different location (the location where you want to keep the localDB database)
Step 4: Attach the database with the new location
CREATE DATABASE [aspnet-IdentityApplication-E2BBF1E6-123-4567-8910-07BC0413419B]
ON (FILENAME = 'D:\Test\aspnet-IdentityApplication-E2BBF1E6-123-4567-8910-07BC0413419B.mdf'),
(FILENAME = 'D:\Test\aspnet-IdentityApplication-E2BBF1E6-123-4567-8910-07BC0413419B_log.ldf')
FOR ATTACH;
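As a quick sanity check after the attach (run against the same LocalDB instance), the database should be online and pointing at the new files; a sketch:

-- Confirm the attached database is online and uses the new file paths
SELECT d.name, d.state_desc, f.physical_name
FROM sys.databases AS d
JOIN sys.master_files AS f ON f.database_id = d.database_id
WHERE d.name = N'aspnet-IdentityApplication-E2BBF1E6-123-4567-8910-07BC0413419B';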

Bitbucket and Database Development

I have a Windows server with MS SQL Server running on it.
On the SQL Server developers have created stored procedures, views, tables, triggers.
On the Windows server developers created shell scripts.
I would like to start versioning the code described above in a BitBucket repository. I have a repository created in BitBucket.
1. How should the branches be organized in this repository? e.g. "SQL Server\Database\..." and "Windows Server\shell_script\..."
2. Can I connect BitBucket to SQL Server and Windows Server and specify which code needs to be versioned?
3. Are both options 1 and 2 above possible?
I just need to version control the changes to the code and have the ability to mark under which project the code change was made.
I am new to BitBucket. I am using the web front end of it. I do not know how to configure command line access, so please try not to reference Bitbucket commands. Sorry if I sound confusing.
Please help.
I know this is an old question but anyway, in principle I'd recommend:
Put all the server shell scripts into one place and make that a git repo linked to your bitbucket repo
Add a server shell script to export what you want version controlled from the SQL db
The export from the SQL db should be to text files so they are easily 'diffable' (see the query sketch below)
You might as well make the export to a sub-directory within the shell scripts repo so that everything is in one place and can't get out of sync
So you only have one branch, not a separate one for server shell scripts and db
Make sure people run the export script and then commit everything when they make a change
You ideally have a test server, which means you'd want a way to push changes from the repo into the SQL db. I presume you can do this with a script by deleting the server setup and re-creating it from the text files.
So basically, you can't connect an SQL db to bitbucket directly. You need scripts to read and write to the db from a repo.
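To make the "export to text files" idea concrete, here is a sketch of one way to pull object definitions out as plain text with T-SQL; the shell script would run a query like this and write each definition to its own .sql file (the column aliases are only illustrative):

-- Dump the definition of every stored procedure, view, function and trigger
SELECT
    OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
    OBJECT_NAME(m.object_id)        AS object_name,
    o.type_desc,
    m.definition
FROM sys.sql_modules AS m
JOIN sys.objects AS o ON o.object_id = m.object_id
ORDER BY schema_name, object_name;

Tables and indexes are not covered by sys.sql_modules, so for a full schema export a scripting tool is still the usual route.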

TeamCity Database migration

We have a TeamCity installation as well as an external MSSQL database on a Microsoft SQL server. We've had to migrate the database to a new instance and now have to configure TeamCity to point to the new database.
I've looked through this guide (https://confluence.jetbrains.com/display/TCD10/Manual+Backup+and+Restore) among others, but they all seem needlessly complicated and seem to imply a complete relocation of the entire TeamCity installation, whereas we simply want to point an existing TeamCity installation to a new database.
A simple search reveals a config file with a connection string hidden in teamcity/serverdata/config. It would seem like we could simply change the config file and be done with it. Are we missing something?
We're using TeamCity Professional 2017.1 (build 46533)
If you're only migrating to a new server, then changing the configuration in the <TeamCity Data Directory>\config\database.properties file is all you have to do.
I assume that you'll make a backup and migrate the data to the new database first, right? After that you can safely change the values in that file and restart TeamCity. It probably makes sense to check the connection to the database from the TeamCity server first as well.
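For reference, a sketch of what the MS SQL part of <TeamCity Data Directory>\config\database.properties typically looks like; the host, port, database name and credentials below are placeholders for the new instance:

connectionUrl=jdbc:sqlserver://NEW-DB-HOST:1433;databaseName=TeamCity
connectionProperties.user=teamcity_user
connectionProperties.password=teamcity_password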

Automating import of data-tier application (SQL database) from Azure with a Master Key

When I extracted a data-tier application from a Microsoft Azure SQL database that has a Master Key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However the steps provided as the answer did not work on my installation.
Steps are
1. Disable auditing on the server (or database)
2. Drop the database master key with DROP MASTER KEY command.
Microsoft Tech Support verified this solution did not work on my installation of SQL Server and, after actually taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the Master Key from the bacpac file. I have a Powershell script to remove the Master Key from the BACPAC file but it requires extracting, renaming files and running scripts from Windows Powershell to get the db imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others to save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows Powershell script (also cobbled from multiple sources) to extract a data-tier application (database) from Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code that is in my scripts from other blogs etc. I am not able to provide the credit due to those folks as I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to give you the credit for your work.
There may be configuration settings on your PC and your local SQL server that need adjustments as this entire solution requires pretty much full access to your computer. If you run into trouble with compatibility, let me know and I will do the best I can to let you know how my system is configured in case it will help you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The Powershell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are accomplished as follows (steps 1 and 2 are optional, as I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB):
Back up the current DB as MyLocalDB.bak.
Restore that backup from step 1 to a new DB with the previous day stamped at the end of the DB name (e.g., MyLocalDB20171231)
Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on)
Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name)
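A minimal T-SQL sketch of steps 1-3 (the paths, the date suffix and the logical file names MyLocalDB/MyLocalDB_log are assumptions; the real script builds these dynamically):

-- Step 1: back up the current local copy
BACKUP DATABASE MyLocalDB TO DISK = N'C:\Backups\MyLocalDB.bak' WITH INIT;

-- Step 2: restore that backup under a date-stamped name
RESTORE DATABASE MyLocalDB20171231
FROM DISK = N'C:\Backups\MyLocalDB.bak'
WITH MOVE 'MyLocalDB'     TO N'C:\Data\MyLocalDB20171231.mdf',
     MOVE 'MyLocalDB_log' TO N'C:\Data\MyLocalDB20171231_log.ldf';

-- Step 3: drop the original so the restored Azure copy can reuse the name
DROP DATABASE MyLocalDB;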
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC MASTER..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
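Note that xp_cmdshell is disabled by default; if the call above fails with a configuration error, it can be enabled like this (only if you are comfortable with the security implications):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;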
The Powershell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing to do is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This will need to be removed from the files prior to importing the bacpac file into your local DB or it will fail (this may not be required in SQL 2017 according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone. It won't remove anything if it isn't there. It would just add a little overhead to the program.
Open the today.bacpac file (zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash length. This is required so the file does not appear to have been tampered with when SQL Server opens the bacpac file.
Re-zip the files back into a new file, today-patched.bacpac.
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa;Password=MySAPassword;Initial Catalog=MyLocalDB;Integrated Security=false;"
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!

Publishing a VB.NET Application with SQL Express DB (using LocalDB)

I have written a VB.Net application that uses an SQL Express DB file containing a single table and a handful of stored procedures.
I have successfully built and exported the application to my VPS.
The problem comes in knowing what to do concerning the database file; there is a wealth of stuff online, but nothing that specifically suits my needs.
I plan to use LocalDB on the VPS, but since it is command-line only, it is hard to know whether the scripts I have run have been successful after creating an instance, starting it, etc.
I want to keep installation requirements to an absolute minimum on my VPS machine (and, in time, other end users' machines), hence using LocalDB and not SQL Express.
So, what do I have to do on the VPS to enable my application to connect to the database? This was simple when it was Access: supply the MDB file and run the AccessDatabaseEngine redistributable, job done.
The connection on my devt. machine runs as expected.
The connection string in my code is:
Const strSQLConnection As String = "Data Source= (localdb)\v11.0;Database=SoccerTrader;Trusted_Connection=True"
Can anyone help please? This is driving me around the bend; surely it can't be that difficult?
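(Side note for anyone stuck at the same point: one minimal way to check from the command line that a LocalDB instance is up and your scripts actually created what you expected is to list its databases, e.g. with sqlcmd; the instance name below is the one used in this question.)

-- List databases on the LocalDB instance (run e.g. via: sqlcmd -S "(localdb)\v11.0")
SELECT name, state_desc, create_date FROM sys.databases;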
===========================
I have found the following in an MSDN blog which says:
Database as a File: LocalDB connection strings support AttachDbFileName property that allows attaching a database file during the connection process. This lets developers work directly with databases instead of the database server. Assuming a database file (*.MDF file with the corresponding *.LDF file) is stored at “C:\MyData\Database1.mdf” the developer can start working with it by simply using the following connection string: “Data Source=(localdb)\v11.0;Integrated Security=true;AttachDbFileName=C:\MyData\Database1.mdf”.
================ ADDED 12th June =====================
OK, this is really bugging me now... I have read around this till it is coming out of my ears and nothing specifically seems to target what I am trying to do. All the blogs I read refer to installing / running SQL Server and changing permissions etc.
As I have mentioned I am using a VPS and propose to use LocalDB on the VPS to access a simple/small database file for a VB.Net application I am writing.
This is the story so far.
1) I have built a working prototype on my development PC and connected using SQL Express to a database file SoccerTrader.mdf - no problem.
In the Visual Studio project properties I have added a requirement to the project that checks for SQL Server and, if it is missing, installs it.
2) I install the project on the VPS and, as expected, SQL Server 2012 LocalDB is installed (see here).
3) I have copied the SoccerTrader.MDF and SoccerTrader.LDF files into "C:\BESTBETSoftware\SoccerBot" on the VPS
4) For practical reasons, given the problems I am having getting this to work, I have implemented an InputBox so I can specify the connection string when the application runs. The connection strings I have tried give the following results: http://i.stack.imgur.com/i2tro.png
I have not changed any file permissions on the development PC and the database state is NOT read only....
So, the question is where do I go from here...? What have I missed.. why is it not working..?
I have managed to sort the problem.
Seemingly, the connection string I was using was OK; it was my error handling that wasn't 'clean' enough. It transpired the connection was being made on my VPS, but when the application attempted to update the table, the directory I had created and put the MDF file into would not permit write access.
I moved the MDF into the C:\Users\Public\Documents folder and all works as it should.
You have to specify the full path of the DB file, including the folder name / IP address.
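Putting the two answers together, a connection string along these lines worked in this scenario (a sketch; the instance name and path are the ones mentioned above, so adjust for your own machine):

Const strSQLConnection As String = "Data Source=(localdb)\v11.0;AttachDbFileName=C:\Users\Public\Documents\SoccerTrader.mdf;Integrated Security=True"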
