SQL Server View showing outdated / wrong data after changing underlying dependencies

We have a view (call it X) that is the base view called by two other views (call them Y and Z).
Today we made a change to view X; after that, views Y and Z started bringing back incorrect data. When we were in Management Studio and ran SELECT * FROM Y (which is exactly how the view is called in code), it returned incorrect data. However, when we ran the actual SQL that the view contained, it was fine. We tried a number of things until a colleague suggested adding a space to views X and Z and then running ALTER, which worked. Everything returned to normal and ran fine.
My question is: Does SQL Server cache its views? And if so, how do you force it not to, or force the views to recompile?
Also, any additional reading about this would be helpful.

See the sp_refreshview command.
Updates the metadata for the specified non-schema-bound view.
Persistent metadata for a view can become outdated because of changes
to the underlying objects upon which the view depends.
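For example, after altering view X, refreshing the dependent views (assuming they live in the dbo schema) would look like this:

EXEC sp_refreshview N'dbo.Y';
EXEC sp_refreshview N'dbo.Z';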

SQL Server does not cache view data (at least, not in the way you are referring to).
If a view definition contains SELECT *, then the actual column list is fixed when the view is created, i.e. the SELECT * is replaced by the column list that exists at the time you create the view. That means that if you later add columns to the underlying tables referenced by the view, those columns won't appear in the view.
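A minimal illustration of this behavior (object names are made up):

-- SELECT * is expanded to the columns that exist at creation time.
CREATE TABLE dbo.T (a int, b int);
GO
CREATE VIEW dbo.V AS SELECT * FROM dbo.T;
GO
ALTER TABLE dbo.T ADD c int;
GO
SELECT * FROM dbo.V;          -- still returns only a and b
EXEC sp_refreshview N'dbo.V'; -- re-expands the column list
SELECT * FROM dbo.V;          -- now returns a, b and c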

Is it possible to use MS Access to insert data through an external linked View in SQL Server? [duplicate]

I'm using MS Access 2016 and SQL Server 2012.
I have a single table with a schema-bound view over it. Our design was to allow our users to insert data through the view so we can do some id lookups for them and audit who inserted the record using an INSTEAD OF INSERT trigger.
No matter how I configure it, Access doesn't want to let me add records to the view. If I use the table, it works as expected and it lets me insert new records.
I've tried the following:
Opening the linked table and trying to add a new record at the bottom - no luck
Creating an "Update" query - no luck
Creating an "Append" query - no luck
Creating a form, setting the view as a source, and setting Data Entry = Yes - no luck
Is what I'm trying to do even possible? I come from a SQL/C# background, and have only basic MSAccess skills. Any help is appreciated!
EDIT: I found out the view had a GROUP BY when it shouldn't have. I corrected that, but now it is complaining that it is not updatable because there are multiple underlying tables.
EDIT2: I've made progress but it's not ideal.
More info:
This view represents a data warehouse fact, and joins to two dimension tables to get business keys.
I was able to get this to work by creating a "dummy" table with all the same fields, then creating a view over that. I was able to insert, but the data comes back as "deleted" immediately because my INSTEAD OF trigger fires.
Data went to the right place, but obviously the view doesn't show any data since it's not actually querying the correct tables anymore.
Clearly the view is irrelevant now, and I'll likely write the trigger over the dummy table.
It's not ideal, but I think my solution will be to have them insert through this dummy table, and then have a second view that does the joins so they can view the results.
I appreciate the detailed answer that was given, but unfortunately it didn't help this situation.
If the view is updatable from SSMS, then it will be updatable in Access. So, before you try anything in Access, first ensure that the view is updatable in SSMS; not all SQL views are updatable.
It is well worth testing this in SSMS before going any further.
Once you have done this, link the view from Access (and if you have since modified the view, delete the linked view in Access and re-link; you MUST delete the Access link and re-link).
The linking process needs extra care. When you link, you are asked to choose a primary key for the view. Because a view can expose key columns from several tables, Access can't know or guess which to use; worse, a SQL view does not in fact have a defined PK, and there is no command or means Access can use to determine one. So you are prompted during the link process. I note this issue because if you are using some VBA re-link code and you re-link with a changed database (or server), the PK setting you had will be lost: you ONLY get this prompt when linking a new table, not on a re-link. Keep this important detail in mind.
After linking the view, you can execute the following command in Access to set which column is to be used as the PK:
I in fact use this routine:
Sub createPK(strTable As String, strPK As String)
    ' Tell Access which column to treat as the PK of a linked view.
    ' No index is actually created on the server side.
    CurrentDb.Execute "CREATE UNIQUE INDEX " & strPK & _
        " ON " & strTable & " (" & strPK & ") WITH PRIMARY"
End Sub
So, to set a PK for a linked view, then I use this:
Call createPK("dbo_tblHotels","ID")
As an FYI:
The above command does NOT create an index in Access for the linked view; it is only a means of telling Access which column to use as the PK. In this context, the CREATE INDEX command is not creating an index but is the mechanism by which you tell Access which column is to be used as the PK of the view. As noted, you only need to do this if you are using code to re-link (or create) a table (a view in this case).
If you are using the Access UI and you link to a view, Access will prompt you to choose a column for the PK. You can, as noted, use the above routine after linking to set the PK column if you missed the prompt, or if you are using VBA code to re-link.
A re-link (refresh) with the Access UI to the SAME database will preserve the PK setting, but if you change the connection string (database or server name), the PK setting will be lost.
First, this is a really good answer to a similar question. Try adding an INSTEAD OF trigger to the view in SQL Server. In order to insert into a view, the key columns must all be present, and each table must be UNION'ed together. An INSTEAD OF trigger can be made to do exactly what you wish.
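As a minimal sketch of that approach, under assumed names (a view dbo.vFactSales over dbo.FactSales, with a lookup against dbo.DimProduct), the trigger might look like this:

-- All object and column names here are hypothetical.
CREATE TRIGGER dbo.trg_vFactSales_Insert
ON dbo.vFactSales
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- Translate the business key to a surrogate id and audit who inserted.
    INSERT INTO dbo.FactSales (ProductId, Quantity, InsertedBy)
    SELECT p.ProductId, i.Quantity, SUSER_SNAME()
    FROM inserted AS i
    JOIN dbo.DimProduct AS p
        ON p.BusinessKey = i.ProductBusinessKey;
END;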

sp_help - table is referenced by an 'empty string' schema bound view

I'm trying to systematically determine the differences in schema between a local database and a remote database administered by someone else. I've had the remote administrator run a script that executed sp_help and sp_helptext on a variety of objects.
There is one difference I don't know how to account for. On my local system, sp_help on one table produces a line of message output: No views with schema binding reference table 'dbo.tbl'.
On the remote system, the output was 'Table is referenced by views' followed by a blank line. The query was run with output as text, so this indicates that on the remote machine a one-row result set was produced with an empty string (or NULL?) value.
How can that happen? If I create a schema bound view locally I get the 'Table is referenced by views' output followed by the name of the view plainly displayed. What scenario on the remote machine could be producing this result set without any view name recorded?
Answering your question requires knowledge of the internal workings of sp_help. You are welcome to descend into the depths of the procedures that sp_help calls. If you do, you will see that it uses sysdepends - which is both not "dependable" and obsolete. Ref: https://learn.microsoft.com/en-us/sql/relational-databases/system-compatibility-views/sys-sysdepends-transact-sql . The short answer is that something has been altered or renamed in a way that is not captured in sysdepends.
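If you want a more dependable listing than sp_help can give, one option (a suggestion on my part, not something sp_help itself uses) is the newer dependency function:

-- List the objects (including schema-bound views) that reference dbo.tbl.
SELECT referencing_schema_name, referencing_entity_name
FROM sys.dm_sql_referencing_entities('dbo.tbl', 'OBJECT');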

Epicor asking for password after making a table change

Epicor - what a beastly creature!
Epicor is asking for a password after making a table change - any idea why?
We removed the relationship from the (part table) and set up a criteria instead. Now it is asking for a password, which should not be happening.
The login prompt happens when I try to run the report. I am trying to figure out what I did to aggravate Epicor. The table was already there; I removed the relationship (part table) and added a criteria instead, otherwise that is exactly what I would have done. The only reason I did not add a table to the report data definition, as I originally wanted to, is that the parts table could only be added once - which is why I removed the relationship and added a criteria instead.
From your description, it sounds like the problem is related to the xml generated by Epicor for a non-BAQ-based report data definition. Crystal and SSRS reports ask for login information when more than one datasource is referenced in the report, or when improper relationships are defined.
Note:
If you are not a report developer and you have modified this in an attempt to change the end data, I recommend you contact the report developer responsible for maintaining these before proceeding. Otherwise, read on.
Based on my experience, I would say if you are confident in the new relationship structure you have in the report data definition, the solution to this problem is likely within the report itself. Generate an xml file by running a test report, then open the .rpt (or .rdl) associated with this report and set its datasource to the new xml file. This updates the xml schema used as the datasource. Even if none of the fields were changed in the data definition, the datasource schema definition stored in these files defines exactly the data formatting that the report expects to receive when it is opened by Epicor.
If that doesn't solve the problem and you are using Crystal, the xml relationships may be defined in a way that affects how the data is displayed; this can be adjusted via Database Expert -> Links tab in Crystal. You should reconnect all of the links to match the report data definition within Epicor.
If none of that works, open up and view the xml file.
It is not unheard of for report data definitions in Epicor to break behind the scenes when altering relationships, and the xml file generated by the test report may not be a well-formed xml file. I have seen many xml files with unclosed elements, etc., that will cause various problems when attempting to run the report. In this case, my recommendation is to create a completely new report data definition (do not copy), and re-enter all of the parameters that existed in the former definition. Repeat the refreshing of the report datasource as described above, and this problem should be fixed.

How to display results from Stored Procedure WinForms

I have a stored procedure in SQL Server that returns different structures based on the parameters.
In other words, it might return a result set with three fields, or 15, and the column names are going to be different.
How can I display these results in a WinForm app?
I currently use the Entity Framework for accessing data, but obviously that is not going to work in this situation.
The data will be read-only, i.e. no need to edit - just display it.
I am guessing that I need to skip EF and just call the SP directly, and populate a DataGridView with autocolumns.
Greg
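For illustration of the scenario (procedure and table names are invented), a procedure whose result shape depends on a parameter might look like this:

-- The column list returned depends entirely on @Mode.
CREATE PROCEDURE dbo.GetReport
    @Mode int
AS
BEGIN
    SET NOCOUNT ON;
    IF @Mode = 1
        SELECT CustomerId, CustomerName, Balance
        FROM dbo.Customers;   -- three columns
    ELSE
        SELECT OrderId, CustomerId, OrderDate, ShipDate, Total
        FROM dbo.Orders;      -- a different, wider shape
END;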
Since you're retrieving a dynamic data structure, you'll need to make your grid columns dynamic too. What I would do, after retrieving the result set successfully, is clear the Columns collection and recreate the columns from scratch every time you call the SP and the data needs to be refreshed on-screen. Don't use AutoGenerateColumns, as it provides little chance to customize the columns' look; instead, define each and every column yourself so that you can choose what to display and how.
This is the best answer I found: How to use a DataAdapter with stored procedure and parameter
In short, use a DataAdapter and a DataTable. Even with AutoGenerateColumns, the column headers look great and it works no matter the table structure that comes back.
Greg

SSAS cube processing error about column binding

This is an error message I get after processing an SSAS cube:
Errors in the back-end database access module. The size specified for a binding was too small, resulting in one or more column values being truncated.
However, it gives me no indication of what column binding is too small.
How do I debug this?
This error message drove me crazy for hours. I had already found which column had increased its length and updated the data table in the source, which was now showing the right length. But the error just kept popping up. It turns out that field was used in a fact-to-dimension link on the Dimension Usage tab of the cube, and when you refresh the source, the binding created for that link does not refresh. The fix is to remove the link (change the relationship type to 'No Relationship') and re-create it.
Upd: Since this answer still seems to be relevant, I thought I'd add a screenshot showing the area where you can encounter this problem. If for whatever reason you are using a string column for a dimension-to-fact link, it can be affected by the increased size, and the solution is described above. This is in addition to the problem with the Key, Name, and Value Columns on the dimension attribute.
Esc is correct. Install BIDS Helper from CodePlex. Right-click on the Dimensions folder and run the Data Discrepancy Check.
Dimension Data Type Discrepancy Check
This fixed my issue.
Open your SSAS database using SQL Server Data Tools.
Open the Data Source View of the SSAS database.
Right-click an empty space and click Refresh.
A window will open and show all changes to the underlying data model.
Documentation
Alternate Fix #1 - SQL Server 2008 R2 (haven't tried on 2012 but assume this will work).
Update / refresh your DSV. Note any changed columns so you can review.
Open each dimension that uses the changed columns. Find the related attribute and expand the properties KeyColumns, NameColumn and ValueColumn.
Review the DataSize properties for each and if these do not match the value from the DSV, edit accordingly.
Alternate Fix #2
Open the affected *.dim file and search for your column name / binding.
Change the Data Size element: <DataSize>100</DataSize>
As Esc noted, column size updates can affect the Dimension Usage in the cube itself. You can either do as Esc suggests, or edit the *.cube file directly - search for the updated attribute and related Data Size element: <DataSize>100</DataSize>
I've tried both fixes when a column size changed, and they both work.
In my case the problem was caused by working on the cube on the live server.
If you work on the cube live, connected to the server, this error message pops up.
But when you work on the cube as a solution saved on your computer, you do not get the error message.
So work on the cube locally and deploy after making changes.
In my particular case, the issue was because my query was reading from Oracle, and a hard-coded column had a trailing space (my mistake).
I removed the trailing space and, for good measure, cast the hard-coded value: CAST('MasterSystem' AS VarChar2(100)) AS SOURCE.
This solved my particular issue.
I encountered this problem too. It was resolved by removing leading and trailing spaces with the RTRIM and LTRIM functions.
I encountered the same problem, and refreshing the data source did not work. I had a materialized referenced dimension for the fact partition that was giving me the error. In my DEV environment I unchecked Materialize and processed the partition without the error.
Oddly, now I can enable Materialization for the same relationship and it will still process without issue.
Simple thing to try first - I've had this happen several times over the years.
Go to data source view and refresh (it may not look like anything happens, but it's good practice)
Edit dimension. Delete the problem attribute, then drag it over again from the data source view listing.
Re-process full.
As others have mentioned, data with trailing spaces can be the cause as well. Check for them: SELECT col FROM tbl WHERE col LIKE '% '
I ran into the same problem, and the answer from Esc can be the solution here too. The cause is much more 'hidden', and the more obvious fixes, 'Refresh' and the 'Data type discrepancy check', did no good in my case.
I did not find a proper way to "debug" this problem.
