Azure SQL and Visio 2019 Plan 2 Database Reverse engineering wizard Fails - sql-server

I am using Visio 2019, trying to reverse engineer an Azure SQL server. I have successfully created multiple user data sources to use in the wizard, using both our DB admin user and my Azure Active Directory admin login. The database credentials are verified successfully, and the tables/views I want to reverse engineer are about to load when the Visio reverse engineer database wizard raises this error:
"Error! Cannot extract column definition for the table/view . The definition is not
available or you may not have sufficient privileges."
with a text box that says:
"Could not find server "database name"* in sys.servers. Verify that the correct server name was specified. If necessary, execute the stored procedure sp_addlinkedserver to add the server to sys.servers."
I can neither find sys.servers nor run sp_addlinkedserver, as neither exists; sys.sysservers does exist, though.
I starred "database name" because, instead of showing the target database "DB_2.0", the message shows "DB_2", which is not the full name of the database.
As I mentioned above, I believe I have sufficient privileges, since I have tried both the admin username and password and my Azure Active Directory admin login. So could it have something to do with not having the sys.servers view?
Is there a way to create sys.servers or the stored procedure sp_addlinkedserver, as Visio is requesting? Is that advisable in Azure SQL, and could the naming convention of our database have anything to do with the error?
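For what it's worth, the compatibility view that is visible can at least be queried to see which server name the database reports; a minimal check (nothing beyond the view named above is assumed):

SELECT srvname, datasource
FROM sys.sysservers;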

I worked with Microsoft support, and they had me rename my database from DB_2.0 to DB_2. I created a new DSN and it worked perfectly afterwards. Apparently Visio doesn't like the ".0" in a connection string. It would be nice if I could have edited the connection string in a text file so I did not have to rename my entire DB.
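For reference, the rename itself amounts to a single statement; a minimal sketch, run while connected to the master database of the logical server and assuming no other connections are using the database:

ALTER DATABASE [DB_2.0] MODIFY NAME = [DB_2];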

Related

Can't export SQL Azure database when stored procedure is encrypted

I want to export my SQL Azure database to a file test.bacpac, but I failed:
One or more unsupported elements were found in the schema used as part of a data package.
Error SQL71564: Error validating element [dbo].[IsMyUserExisted]: The element [dbo].[IsMyUserExisted] cannot be deployed as the script body is encrypted.
The question is, why can't I back up my database like in SQL Server 2008, 2017, etc. (just BACKUP DATABASE and then RESTORE DATABASE)? I tried to exclude the stored procedures from the export:
"C:\Program Files (x86)\Microsoft Visual Studio\2019\Preview\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\150\SqlPackage.exe" /a:Export /ssn:"servername" /sdn:"databasename" /su:"username" /sp:"passwordhere" /tf:"myfile.bacpac" ExcludeObjectsTypes=StoredProcedures
but the property ExcludeObjectsTypes=StoredProcedures is invalid
I also tried "/p:ExcludeObjectsTypes=StoredProcedures" but still get an error.
Azure SQL Database does not support the WITH ENCRYPTION option for migrating objects such as stored procedures, user defined functions, triggers, or views. Therefore, migrating objects compiled with that option is not possible. You will need to remove the WITH ENCRYPTION option.
This means that Azure SQL doesn't support exporting/migrating a database that contains such encrypted objects; you will always get an error like the one above.
You must decrypt the procedure, then back up (export) the database. After the database is restored, find the stored procedure and encrypt it again.
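As an illustration only (the parameter and body below are made up; the real definition has to come from your own source, since the encrypted body cannot be read back from the database):

-- Recreate the procedure from your saved source without WITH ENCRYPTION
ALTER PROCEDURE [dbo].[IsMyUserExisted]
    @UserName nvarchar(128)            -- hypothetical parameter
AS
BEGIN
    SELECT COUNT(*) FROM dbo.Users     -- hypothetical body
    WHERE UserName = @UserName;
END
GO
-- After exporting and restoring, re-apply encryption the same way,
-- adding WITH ENCRYPTION back before AS.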
Please ref this blog: https://thomaslarock.com/2013/05/migrate-encrypted-procedures-azure-sql-database/
HTH.

Automating import of data-tier application (SQL database) from Azure with a Master Key

When I extracted a data-tier application from a Microsoft Azure SQL database that has a Master Key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However, the steps provided as the answer did not work on my installation.
The steps are:
1. Disable auditing on the server (or database)
2. Drop the database master key with DROP MASTER KEY command.
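Step 2, for reference, is a single statement run while connected to the Azure database itself; it only succeeds if nothing (certificates, symmetric keys, credentials) still depends on the key:

DROP MASTER KEY;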
Microsoft Tech Support verified that this solution did not work on my installation of SQL Server, and after taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the Master Key from the bacpac file. I have a PowerShell script to remove the Master Key from the BACPAC file, but it requires extracting and renaming files and running scripts from Windows PowerShell to get the DB imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others to save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows PowerShell script (also cobbled together from multiple sources) to extract a data-tier application (database) from a Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code that is in my scripts on other blogs, etc. I am not able to give due credit to those folks as I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to give you the credit for your work.
There may be configuration settings on your PC and your local SQL server that need adjustments as this entire solution requires pretty much full access to your computer. If you run into trouble with compatibility, let me know and I will do the best I can to let you know how my system is configured in case it will help you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The Powershell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are accomplished as follows (steps 1 and 2 are optional, as I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB); a rough T-SQL sketch of steps 1-3 appears after the list:
1. Back up the current DB as MyLocalDB.bak.
2. Restore that backup from step 1 to a new DB with the previous day stamped at the end of the DB name (e.g., MyLocalDB20171231).
3. Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on).
4. Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name).
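Roughly, steps 1-3 in T-SQL (the backup path and the logical file names MyLocalDB / MyLocalDB_log are assumptions; adjust them to your own layout):

BACKUP DATABASE MyLocalDB
    TO DISK = N'C:\Backups\MyLocalDB.bak' WITH INIT;

RESTORE DATABASE MyLocalDB20171231
    FROM DISK = N'C:\Backups\MyLocalDB.bak'
    WITH MOVE 'MyLocalDB'     TO N'C:\Data\MyLocalDB20171231.mdf',
         MOVE 'MyLocalDB_log' TO N'C:\Data\MyLocalDB20171231_log.ldf';

DROP DATABASE MyLocalDB;   -- frees the name for the copy imported in step 4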
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC MASTER..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
The Powershell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing to do is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This will need to be removed from the files prior to importing the bacpac file into your local DB or it will fail (this may not be required in SQL 2017 according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone. It won't remove anything if it isn't there. It would just add a little overhead to the program.
Open the today.bacpac file (zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash length. This is required so that the file does not appear to have been tampered with when SQL opens the bacpac file.
Re-zip the files back into a new file, today-patched.bacpac.
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa; Password=MySAPassword; Initial Catalog=MyLocalDB; Integrated Security=false;"
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!

CONTROL DATABASE issue when migrating SQL database to Azure

I'm following this tutorial on Microsoft Docs. I've reached the part where I use the "Data Migration Assistant", but after selecting the target Azure database and clicking "Next", I get the following error:
An unexpected error occurred.
Current principal does not have CONTROL permission on securable AzureDatabaseName of class DATABASE.
I'm using the only user of the Azure SQL server - the server admin, which should have all permissions. I've verified that the user is 'db_owner' by using IS_ROLEMEMBER.
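As a quick sanity check, the effective permissions can also be inspected directly; a minimal sketch, run while connected to the target Azure database:

SELECT IS_ROLEMEMBER('db_owner')                           AS is_db_owner;
SELECT HAS_PERMS_BY_NAME(DB_NAME(), 'DATABASE', 'CONTROL') AS has_control;
SELECT * FROM fn_my_permissions(NULL, 'DATABASE');         -- full permission list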
Am I missing something?
I had the same issue. This seems to be a bug in Azure SQL databases. If you have dots in the database name it does not work. I replaced the dots with dashes and it worked for me.
You do not need to recreate the database. A rename worked fine for me:
You have to make sure that no one else is using the database!
Connect to the master database and execute the following script on the Azure SQL Server:
USE master;
GO
ALTER DATABASE [my.database]
MODIFY NAME = [my-database];
GO
Here is a link on how to rename Azure SQL databases:
https://learn.microsoft.com/en-us/sql/relational-databases/databases/rename-a-database
Also make sure to create a firewall rule for your incoming connection. This error can be a bit of a red herring.
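If the firewall turns out to be the problem, a server-level rule can also be added with T-SQL; a small sketch, run in the master database of the logical server (the rule name and IP addresses are placeholders):

EXECUTE sp_set_firewall_rule
    @name = N'AllowMyWorkstation',
    @start_ip_address = '203.0.113.5',
    @end_ip_address   = '203.0.113.5';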
I deleted everything - the database, the sql server, and the resource group. Then I recreated everything using the same names, except the database name - which previously contained dots - and this time the migration tool worked. I guess I just encountered some bug.
If you have dots in the target database name, you have to remove or replace them, e.g. 'demo.customerdb' becomes 'demo-customerdb'.
You can use SQL Server Management Studio to rename the DB:
connect to the target database server
select the target database
press the "F2" key, or right-click the target database and select "Rename"
remove the dots (.) from the database name and that's it! :)
After that, you can try the migration process again from the start.

Troubleshooting the Re-Creation of ReportServerTempDB

RESOLVED SEE EDITS:
Like a total noob I deleted our ReportServerTempDB by accident (I have a backup of ReportServer but not ReportServerTemp, live and learn). (Using SQL Server 2008 R2)
To recreate the database, I followed several online guides that gave these steps:
created a new database with the name ReportServerTempDB, and with the same collation as ReportServer (collation was key)
made a new Database Role called RSExecRole with same users as my ReportServer (also key to make sure this role has the correct permissions to the tables)
ran the CatalogTempDB script which ran without a hitch (the version of CatalogTempDB was not sufficient to recreate all of the objects necessary)
Used Reporting Services Config Manager to Change Database and picked ReportServer
Just for good measure turned off and on the SQL Server Reporting Services a few times
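A minimal T-SQL sketch of the first two steps (the collation shown and the service-account name are placeholders; use whatever DATABASEPROPERTYEX returns for ReportServer and your actual RS service login):

-- Find the collation ReportServer uses
SELECT DATABASEPROPERTYEX('ReportServer', 'Collation');

-- Create the temp DB with that collation
CREATE DATABASE ReportServerTempDB
    COLLATE Latin1_General_CI_AS_KS_WS;   -- replace with the value returned above
GO

USE ReportServerTempDB;
GO
CREATE ROLE RSExecRole;
-- Add the same members the role has in ReportServer, e.g. the RS service account
CREATE USER [DOMAIN\RSServiceAccount] FOR LOGIN [DOMAIN\RSServiceAccount];
EXEC sp_addrolemember 'RSExecRole', 'DOMAIN\RSServiceAccount';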
But I am still getting an error when I try to load my Reporting Services Home page:
An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database. (rsReportServerDatabaseError) For more information about this error navigate to the report server on the local server machine, or enable remote errors
What am I forgetting? As an alternative can I simply "create a new report server database" and import a back-up of my original ReportServer? TIA
EDIT: I reviewed the RSExecRole and made sure that it had permission to edit tables and execute stored procedures (online sources did not spell this out very clearly) and after restarting the Reporting Services my error has changed to "An error occurred within the report server database. This may be due to a... Invalid object name 'ReportServerTempDB.dbo.TempCatalog'. Could not use view or function 'ExtendedCatalog' because of binding errors. "
Further reading suggests that the name of the temp Report Server DB is hardcoded into many stored procedures in ReportServer, but my new temp report server has the same name: ReportServerTempDB. Where is the disconnect?
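Those hardcoded references can be listed directly; a small check, run in the ReportServer database:

SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
       OBJECT_NAME(object_id)        AS object_name
FROM sys.sql_modules
WHERE definition LIKE '%ReportServerTempDB%';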
EDIT2: So the script I used, CatalogTempDB, did not create all of the tables necessary to rebuild my temporary Report Server DB. I created a new Report Server and ReportServerTempDB (with an altered name) and compared the objects in my ReportServerTempDB built using CatalogTempDB to the ones the SQL wizard created. Then I used the import wizard to add in the missing tables and restarted the Report Service with my original. Voila.
Happy to provide more details about any of these steps.
To recreate the database, I followed several online guides that gave these steps:
created a new database with the name ReportServerTempDB, and with the same collation as ReportServer (collation was key, and you need to assign it when you are creating the DB)
made a new Database Role called RSExecRole with the same users as my ReportServer (also key to make sure this role has the correct permissions to the tables and stored procedures)
ran the CatalogTempDB script, which ran without a hitch (the version of CatalogTempDB was not sufficient to recreate all of the objects necessary; several tables were missing)
To replace the missing tables, I created a second ReportServer instance (using Reporting Services Configuration Manager) and compared its temporary DB to my rebuilt temporary DB and filled in the holes.
Moral of the story: keep a backup of BOTH ReportServer and ReportServerTempDB.

Expose Microsoft Access database over SQL Server using linked server

We have one .exe application that uses one .mdb Microsoft Access database.
I need to access the data inside the Access file through Microsoft SQL Server. We have SQL Server 2008 R2 Enterprise with a linked server pointed at this Access file, and I can run SELECT/UPDATE queries using a SQL statement:
SELECT * FROM [LinkedServerAccessDB]...[SomeTable]
How can I configure this linked server so that my Access database is published directly as a "database" when an application connects to my SQL Server using the instance name, username, and password? Which "database name" should I use to work directly with the linked server?
Thank you
It sounds like you want your MS Access Linked Server object available as a database (i.e. available in the 'Databases' folder in SSMS). This isn't possible, directly.
I suggest you create a new SQL Server database that mimics the name of that Access database, map a user to the login you've got above, and allow that user to run queries against the linked server.
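A rough sketch of that suggestion (the database, login, and user names here are placeholders, not from the original post; unsecured .mdb files usually accept the 'Admin' remote user with a blank password):

CREATE DATABASE AccessShimDB;                 -- mimics the Access database's name
GO
CREATE LOGIN AccessAppLogin WITH PASSWORD = 'UseAStrongPasswordHere1!';
GO
USE AccessShimDB;
GO
CREATE USER AccessAppUser FOR LOGIN AccessAppLogin;
GO
-- Map the local login to the linked server so it can query the Access file
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = 'LinkedServerAccessDB',
    @useself     = 'false',
    @locallogin  = 'AccessAppLogin',
    @rmtuser     = 'Admin',
    @rmtpassword = NULL;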
You can use CREATE SYNONYM like so.
USE ASQLServerDB
GO
CREATE SYNONYM Sometable FOR LinkedServerAccessDB...SomeTable
Once this is done you can write SELECT [...] from SomeTable as though it was a member of the database ASQLServerDB
I was only able to get it to work at the object level so you'll need to do this for each object you want to expose. You could create an empty database that just contained these Synonyms if you wanted to get that "published as a database" feel.
--This doesn't work
CREATE SYNONYM Sometable FOR LinkedServerAccessDB...
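Putting the "empty database of synonyms" idea together, the shim database would just contain one synonym per Access object you want to expose (the object names below are illustrative, and AccessShimDB is the hypothetical database from the earlier sketch):

USE AccessShimDB;
GO
CREATE SYNONYM dbo.SomeTable      FOR LinkedServerAccessDB...SomeTable;
CREATE SYNONYM dbo.SomeOtherTable FOR LinkedServerAccessDB...SomeOtherTable;
GO
-- Applications connected to AccessShimDB can then query the Access data directly:
SELECT TOP (10) * FROM dbo.SomeTable;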
