SQL Refactor Rename Not Publishing Correctly - sql-server

I'm trying to rename a few tables in one of my database projects. I right-click and choose "Refactor", then "Rename". The Rename process appears to be working great! All references to the table are updated correctly, and the refactorlog file is updated with an appropriate "Rename Refactor" operation.
However, when I generate a script to publish changes, the script simply creates a new table rather than going through the process of creating a new table, copying the old table's data over, and finally (presumably) dropping the old table.
I've also tried just renaming a column on the table, which results in a new column being added and the old one dropped. The data should instead be carried over to the new column (via a table rebuild/identity insert/rename).
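For reference, when the refactorlog is picked up, the publish script preserves data by renaming rather than dropping and recreating; for simple cases it typically emits something along these lines (object names here are only placeholders):

    -- Table rename recorded in the refactorlog
    EXEC sp_rename N'[dbo].[OldTableName]', N'NewTableName';
    -- Column rename recorded in the refactorlog
    EXEC sp_rename N'[dbo].[MyTable].[OldColumn]', N'NewColumn', 'COLUMN';
    -- The publish script also records the operation in [dbo].[__RefactorLog]
    -- so the same rename is not replayed on the next publish.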
I've run a repair of SSDT just to be sure and had no success. Any advice is welcome!
-- Update --
I've not yet resolved this issue, but it should be noted that the original DB project was created with an earlier version of Visual Studio (2010) than we are currently using (2013 Ultimate). Refactoring in the project was working in our current version of Visual Studio until recently.

After completely recreating my DB project, I had some success, but it was inconsistent. After a few publish tests, some refactors would take while others would drop the original table and then create a new one with the new definition.
It turns out I had multiple versions of SSDT (SQL Server Data Tools) installed (2012 and 2013). I uninstalled 2012 and then ran a repair of 2013. Voila! Refactoring is now working again.

Related

How to solve Acumatica SQL Error After Upgrade

I'm trying to update my client's Acumatica ERP to the latest version. I cloned the current instance to test drive the update procedure and make sure everything runs smoothly. They are currently using version 2019 R2 and want to update to 2020 R2.
Using the test instance, I updated it to the latest build of 2020 R2 and everything seems to be working except for one report. When I try to generate the Report I'm getting the following error.
I imagine this has to do with a change in the database. However, I can't find an object with that name either in the new database or in the current database. I'm not sure if it's a table, stored procedure, view, etc. I'm not very familiar with SQL.
I loaded the report in the report designer and tried looking at the schema, but couldn't find any reference to that particular table.
Any help would be greatly appreciated.
Regards.
CES
The SOAdjust table must exist in the database.
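If you want to confirm whether the object exists, and what kind of object it is (table, view, stored procedure), a quick check from SQL Server Management Studio could look like this (the database name is just a placeholder for your Acumatica database):

    -- Search for any object named SOAdjust in the current database
    USE AcumaticaDB;
    SELECT name, type_desc, create_date
    FROM sys.objects
    WHERE name = N'SOAdjust';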
Please try again with the following steps:
1. Create a snapshot of the client system.
2. Create a new system on the same version.
3. Download and restore the snapshot created in step 1.
4. Download and install the Acumatica 2020 R2 ERP Configuration.
5. Open the Acumatica ERP Configuration.
6. Select the system.
7. For the upgrade procedure:
7.1 Click Update Only Database.
7.2 Click Update Only Website.
In Acumatica 2019 R2, the SOAdjust table is in two different namespaces:
PX.Objects.SO.SOOrderEntry.SOAdjust
PX.Objects.SO.SOAdjust
In Acumatica 2020 R2, the SOAdjust table is in only one of them:
PX.Objects.SO.SOAdjust
I think you should update the SOAdjust table reference in the report.
"view the namespaces in SQL Management Studio" - You don't. Namespaces come from .NET and have to do with how the code is organized (a crude description, but close enough for understanding). At the SQL level, the Acumatica structure is quite flat: just tables in the database (very few fancy SQL tricks or SQL-level organization). All the "real" logic tends to be in the business objects (graphs, for the most part, though some interesting logic is within the DACs (data access classes)).

Tables and stored procedures not getting included in the generated script

I have created a SQL Server database project using Visual Studio 2015. The underlying database is SQL Server 2016. I created the project by importing an existing database, and I have the structure ready. When I click Publish and choose to generate a script, the tables and stored procedures aren't included in the generated script; I can see one odd script added. Am I missing something?
Please see the screenshot of my project
The issue has now been resolved. The mistake I was making was importing the database objects from the database and then trying to generate the script against that same database. It was not generating the scripts because the objects already existed there. Pointing to another database helped me generate the scripts.

SSDT implementation: Alter table instead of Create

We are just trying to implement SSDT in our project.
We have lots of clients for one of our products, which is built on a single database (DBDB) containing only tables and stored procedures.
We created one SSDT project for database DBDB (using VS 2012 > SQL Server Object Browser > right-click on the project > New Project).
Once we build that project, it creates one .sql file.
Problem: if we run that file on a client's DBDB, it creates all the tables again and deletes all the records in them [this fulfills the requirements but deletes the existing records :-( ].
What we need: only the changes that are not already present in the client's DBDB should be applied.
Note: we have no direct access to the client's DBDB database for comparing with our latest DBDB. We can only send them some magic script file which will update their DBDB to the latest state.
The only way to update the client's DB is to compare the DB schemas and then apply the delta. Whichever way you do it, you will need some way to get hold of the schema that's running at the client:
If you ship a versioned product, it is easiest to deploy version N-1 of it to your development server and compare that to the version N you are going to ship. This way, SSDT can generate the migration script you need to ship to the client to pull their DB up to the current schema.
If you don't have a versioned product, or your client might have altered the schema, you will need to find a way to extract the schema on site (maybe using SSDT there) and then let SSDT create the delta.
Option: You can skip using the compare feature of SSDT altogether, but then you need to write your migration script yourself. For each modification to the schema, you write the DDL statements yourself and wrap them in IF clauses that check for the old state, so the changes will only be made once and only if the old state exists. This way, it doesn't really matter from which state to which state you are going, as the script determines for each step whether and what to do.
The last is the most flexible, but requires thorough testing of its own and of course should have been started way before the situation you are in now, where you no longer know what the changes have been. But it can help for next time.
This only applies to schema changes on the tables, because you can always fall back to just dropping and recreating ALL stored procedures, since nothing is lost by dropping them.
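A minimal sketch of the hand-written, IF-guarded migration approach described above, using hypothetical table, column, and procedure names:

    -- Only add the column if it is not there yet, so the script can be re-run safely
    IF COL_LENGTH('dbo.Customer', 'LoyaltyLevel') IS NULL
    BEGIN
        ALTER TABLE dbo.Customer ADD LoyaltyLevel INT NULL;
    END
    GO
    -- Stored procedures can simply be dropped and recreated every time
    IF OBJECT_ID('dbo.GetCustomer', 'P') IS NOT NULL
        DROP PROCEDURE dbo.GetCustomer;
    GO
    CREATE PROCEDURE dbo.GetCustomer
        @CustomerId INT
    AS
        SELECT * FROM dbo.Customer WHERE CustomerId = @CustomerId;
    GO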
It sounds like you may not be pushing the changes correctly. You have a couple of options if you've built a SQL Project.
Give them the dacpac and have them use SQLPackage to update their own database.
Generate an update script against your customer's "current" version and give that to them.
In any case, it sounds like your publish option might be set to drop and recreate the database each time. I've written quite a few articles on SSDT SQL Projects and getting started that might be helpful here: http://schottsql.blogspot.com/2013/10/all-ssdt-articles.html
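For the first option, the client can apply the dacpac themselves with SqlPackage; a sketch of the call, with server, database, and file names as placeholders:

    SqlPackage.exe /Action:Publish ^
        /SourceFile:"DBDB.dacpac" ^
        /TargetServerName:"ClientSqlServer" ^
        /TargetDatabaseName:"DBDB" ^
        /p:BlockOnPossibleDataLoss=True ^
        /p:DropObjectsNotInSource=False

Using /Action:Script together with /OutputPath instead produces an upgrade .sql file rather than applying the changes directly, which covers the second option.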

VS2010 database project deploy rebuilds every table

I have recently created a database project in VS2010 for an existing SQL Server 2008 R2 DB. I have updated 1 table out of 11 by adding 3 new columns to the end. I then updated 4 views that referred to that table.
I then tried a Build/Deploy with it only generating a script.
I have inspected the script and for every single table in the DB, it has generated code that will create a temp version of each table, copy the data from the existing table, drop the original and rename the copy.
I saw the post on here where deployment insisted on rebuilding the table for dropped columns, and I tried setting IgnoreColumnOrder, but it didn't make any difference. It didn't seem relevant to my situation anyway, so I wasn't surprised.
I created my DB project by getting the DBA to give me a fully scripted version of Production, building that DB on the SQL Server instance on my PC, and then creating my initial project from that. I don't think that would make any difference, and I have compared the project definition of the tables to the target Dev DB and they are the same.
I have "Always recreate database" unticked and "Block incremental deployment if data loss might occur" ticked. Don't suppose they have anything to do with my issue?
Any ideas?
I found a backup of the database and, as per Peter's suggestion, ran a Schema Compare. The difference turned out to be that the target DB had PAGE compression on most of the tables, but that was not in the project definition.
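For anyone hitting the same thing, a query like this shows which tables carry compression (and would therefore differ from a project definition that doesn't specify it):

    -- Data compression per table; index_id 0/1 covers the heap or clustered index
    SELECT t.name AS TableName, p.data_compression_desc
    FROM sys.partitions AS p
    JOIN sys.tables AS t ON t.object_id = p.object_id
    WHERE p.index_id IN (0, 1);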

Can't update with schema compare

Up until today, I was able to use the Schema Compare feature in Visual Studio 2012 to update a database from a database project. But now, for one project I can do the compare, but the update button is greyed out.
I am able to use other projects to update other databases, but from this project I can't update any databases. I do not get any errors; the functionality is just unavailable.
Using publish still works. Also updating the project from the database works, just not the other way around.
Does anyone know why I wouldn't be able to update a database via Schema Compare?
Check the bottom of the screen after a compare; it shows status messages there.
I've seen this issue if there is a compile error in the database project. Once the error is resolved close and reopen the compare dialog. Rerun your compare and the Update button should be available again.
You must check the database users and database schemas. Often, if database users aren't correctly replicated in the DB project, Schema Compare doesn't work.
For me, the Error List pane and Output pane weren't showing anything in Visual Studio 2015. Only after building the database project that I was targeting was I able to see the errors in the Output pane (but still not in the Error List pane). After fixing these errors, the Update button was no longer greyed out.
Ran into the same problem myself. As mentioned above, the normal Visual Studio Error List will show errors that block the update... but there will also be warnings. One of the options enabled by default is that possible data loss blocks the update. That's the problem: even though it's only a warning condition, any possible data loss is functionally an error unless you change this flag.
imho, this is a pretty severe UI failure on MS's part, but what are you gonna do?
What worked for me was including the schema.
I was selecting to include only certain tables/procs, etc.
If the schema containing the tables and procedures is not also ticked, the import does not include those elements.
You need to ensure that all of your SQLCMD variables have default values.
Right-click on the project within the Solution Explorer and select Properties.
On the tab to the left, go to SQLCMD Variables and enter the default value(s) into the column provided.
After running your schema compare another time, the update button should now be available.
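For context, a SQLCMD variable defined on that tab (here called $(DeployEnv), a made-up name) is typically referenced in the project's pre- or post-deployment scripts, and the compare needs a default value for each such variable before it can build the update:

    -- Hypothetical post-deployment script using a project-level SQLCMD variable
    PRINT 'Deploying for environment: $(DeployEnv)';
    IF '$(DeployEnv)' = 'Dev'
        INSERT INTO dbo.AppSetting (Name, Value) VALUES ('Diagnostics', 'Verbose');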
In my case, the issue was that I had installed a newer version of SQL Server and SSMS (2016). You must always make sure you have the version of SQL Server Data Tools installed that matches the version you are comparing against. Here is the link to SSDT for SQL Server 2005-2017 that I verified working with Visual Studio 2017:
https://learn.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017
For me, I changed the order of the tables being added. If there is any relationship between two tables, you have to add the parent table and then the dependent one to the database.
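In other words, the referenced (parent) table has to exist before the table that points at it; a small hypothetical example:

    -- Parent table first
    CREATE TABLE dbo.Customer (
        CustomerId INT NOT NULL PRIMARY KEY
    );
    -- Dependent table second, because its foreign key needs dbo.Customer to exist
    CREATE TABLE dbo.SalesOrder (
        OrderId INT NOT NULL PRIMARY KEY,
        CustomerId INT NOT NULL
            CONSTRAINT FK_SalesOrder_Customer REFERENCES dbo.Customer (CustomerId)
    );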
Not only does the schema have to be in your database project, it must be kept up to date if it changes on the database. Updates quit working after the DBA granted exec permission to a new SQL login in the database on a schema that was in my project. After multiple failures to get any stored procedure changes applied, I updated the project, selecting only the schema that had changed. After updating the schema in the database project, the Update started working again. I now include the schema in all updates. Hope this helps.
Within Schema Compare, go to Options -> General -> check "Ignore authorizer".
This issue usually occurs if there was a change to the tables in one of the two databases since you clicked "Compare", regardless of whether the change happened on a table that is being updated or not.
