How to order project solution content in SQL Server Management Studio

In SQL Server Management Studio (SSMS) running against SQL Server 2005, I have a solution which contains a number of views.
These views are not sorted alphabetically.
Can anyone provide either an explanation of why, or a solution to order them alphabetically?

I just came across this forum post. It doesn't get any simpler: just edit the .ssmssqlproj file.
The file for my project (SQL Main) is located in "My Documents\SQL Server Management Studio\Projects\SQL Main\SQL Main\SQL Main.ssmssqlproj". It's just an XML file. Change the following line
<LogicalFolder Name="Queries" Type="0" Sorted="true">
to
<LogicalFolder Name="Queries" Type="0" Sorted="false">
It will revert back to true, so you need to repeat this if you make changes. There is probably a better way.

There is a tool that you can install to sort the contents of a SQL Server Solution project.
See the following reference.
http://web.archive.org/web/20121019155526/http://www.sqldbatips.com/showarticle.asp?ID=78
Please ensure you save your work before attempting a sort.

When you add new items to the project, they are added to the end of the list. They are kept in the order in which they were added, because this order is preserved in the corresponding *.ssmssqlproj file. To change the order, close the project/solution, then locate the *.ssmssqlproj file and edit it with Notepad or your favorite XML editor (always make a backup first!). Reorder the FileNode elements along with their children to change how the items appear in Solution Explorer.
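For orientation, the relevant part of a .ssmssqlproj file looks roughly like the sketch below; the file names are invented and the element names other than LogicalFolder and FileNode are from memory, so treat it as an approximation rather than the exact schema. The order of the FileNode elements is the order shown in Solution Explorer:
<LogicalFolder Name="Queries" Type="0" Sorted="false">
  <Items>
    <FileNode Name="Create Views.sql">
      <!-- connection and file path child elements omitted -->
    </FileNode>
    <FileNode Name="Create Procedures.sql">
      <!-- connection and file path child elements omitted -->
    </FileNode>
  </Items>
</LogicalFolder>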

Here's the approach I took. Upvotes for the recommendation of the source code; however, I don't have .NET installed here, so I had to go for a manual approach.
Finish working on any project or solution files you have checked out, and check those edits in.
Get the latest version of everything in the solution.
Check everything out.
Back up the whole folder structure.
Open the .ssmssqlproj file in Notepad.
Maximise Notepad full screen and turn word wrapping off.
Edit the .ssmssqlproj file, reordering the XML nodes into the required order.
Save the .ssmssqlproj file.
Check everything back in.
That seems to have fixed my issue.

The stored procedure listed in the following post can do it as well:
http://beyondrelational.com/blogs/jacob/archive/2009/08/22/xquery-lab-48-sorting-query-files-in-sql-server-management-studio-ssms-solution-project.aspx

Please note that the sorting is case-sensitive.
So "B..sql" will come before "a...sql".
Remember to start all your scripts with the same casing (be it lower or upper).

Another, easier (I think) way to edit the .ssmssqlproj file is to do so with MS XML Notepad 2007. With that I could drag the nodes around to order them. Make a copy first, of course. 2007 appears to be the latest version and is available at ...
http://www.microsoft.com/en-us/download/details.aspx?id=7973

Another way to handle this is to start with a new query from Management Studio (not by selecting 'New Query' in the solution).
Save the query file into the solution folder with a good name.
Then 'Add Existing Item' into the solution.
It is added with your chosen name, sorted correctly, instead of the file initially being created as 'SQLQuery1'.
Maybe not much better, but it is another option besides editing the project file and reopening it.

Building on the previous answer: you can add it as a new query in the solution, and when you are done, remove it from the solution, then use 'Add Existing Item' and select your new query. Doing it this way creates the new query file in the correct solution folder for you before you remove it and add it back in.

And yet another way... while one of the ways above worked for me several times, sometimes it becomes stubborn. Find the location where the sorting is broken and take a screenshot of the file names.
Select all the file names you need to remove before the glitch.
In other words, if it sorts a-f and then sorts b-z,
remove all the a-f items (don't delete the files) that occur before the b-z ones, then save the project.
Now add them back in and save the project again. Presto.
So far this has worked very well for me and is fairly easy to do.

Related

Logic app expression for path to File not working

I tried to find documentation on the subject but have fallen short so far.
I am trying to use Logic Apps in order to update a table when a trigger occurs.
Adding some context:
I have many separate Excel Online files located in different areas of SharePoint, with one table in each of those files. Any time the SQL table is updated, I get the following elements:
Name
Age
path_to_doc
doc_id
Name and Age are the elements I wish to add to those Excel files.
path_to_doc is the path to the Excel file that needs to be updated.
doc_id is the id of the Excel file that needs to be updated.
In the "Add row to a table" action, those are the elements that need to be filled:
Site (Manual no problem, this doesn't change) Document Library
(Manual no problem, this doesn't change)
File (this is where I have a first problem: when I do not click
manually, and try to put either the "path_to_doc" or the "doc_id"
instead, it doesn't work.
Table (It seems that I can force it to be Table1), which is fine
because all my Excel files have the table called Table1
Arguments (that is Azure understands the Table and is componnents and
asks you to fill the ones you need to fill, those elements disappear
when you change from a manual input to an input "path_to_doc" or
"doc_id").
It throws me an error:
ERROR 400
NOTE: When I do it manually, it works.
Has anyone experienced this and found a solution?
Thank you
You don't need to use an expression.
For example, if we want to get the tables of the modified Excel file, we can do it like this:
A similar flow in SharePoint:
Finally found the answer.
I needed to go to the code view and add my dynamic details there for the body.
Thank you for your help.
Here is the solution. I hope it helps others :)
In the designer view, create an "Add a row into a table" action and use the dynamic path that brings you to the Excel file you need to update. It will show an error, and you will not be able to add the body arguments.
In the code view, you can now manually add the body of the request to include the elements you wish to update in the table of the Excel file.
That's it!

SQL Management Studio: split a query's results into multiple files by group

This may sound like a bizarre question, so let me clarify.
I am currently exporting a bunch of rows from an MS SQL database to a file. The total is approximately 5M records with 10 fields.
The result file is huge and the target software struggles to handle it.
What I'd like to do is split this query so that I get multiple smaller files instead of one huge file, grouped by one of the 10 fields, let's say by region.
Is that something SQL Studio can do? Otherwise, is there any solution to my problem?
I have never worked with SQL functions; maybe they could help as well?
Thanks in advance for your help & have a great day!
Vincent
You can handle this in SQL, but if you have already produced your intended file and only need to split it, you can split the file using a separate tool.
See this question for how to do it on Windows using command line:
Batch file to split .csv file
If it is a CSV file, as the tags of this question suggest, you will have to copy the first line and add it to all files but the first one, because the first line is the header of the CSV file and I assume your application will need it in every part file.
The other solution would be to write a SQL statement that filters the results.
Say you want to filter on the regions field; you can write something like (table name and region value are placeholders):
SELECT * FROM your_table WHERE regions = 'some_region'
This, however, is very simplistic and you might need to do more work to get the intended result.
The number of distinct region values might not match the number of parts you intend to produce, so you will need to figure out how to split across many region values. You could also implement some SQL partitioning of the result set, but I would say the file-processing solution should be easier for you to apply.
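As a sketch of the filter-per-group idea (all object names and the region value below are placeholders, not from the original question):
-- First, see which groups exist:
SELECT DISTINCT region FROM dbo.ExportSource;
-- Then run one filtered query per region and send each result set to its own file,
-- for example via SSMS "Results to File" or a bcp ... queryout command per region:
SELECT *
FROM dbo.ExportSource
WHERE region = 'North';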

TYPO3: store SQL table definitions in separate files

I'm taking my first steps with TYPO3 now.
I have an extension with some tables in ...typo3conf\ext\my_extension\ext_tables.sql and would like to put each table definition in a separate file, because the file gets very long.
Is it possible?
The best approach is still to put everything into the ext_tables.sql file, as many checks happen against this file: if you add new fields, remove fields or add tables, the DB compare in the Install Tool can handle that.
Have a look at an example in CMS 7:
/typo3/sysext/install/Classes/Controller/Action/Tool/UpgradeWizard.php::silentCacheFrameworkTableSchemaMigration()
where a given SQL file is used to perform an update, e.g.:
/typo3/sysext/core/Resources/Private/Sql/Cache/Backend/Typo3DatabaseBackendCache.sql
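For reference, a table definition in ext_tables.sql is just plain SQL; here is a minimal, made-up sketch of one such definition (the extension and field names are hypothetical):
-- A made-up example of the kind of definition that lives in ext_tables.sql;
-- the Install Tool's DB compare reads these plain CREATE TABLE statements.
CREATE TABLE tx_myextension_domain_model_item (
  uid int(11) NOT NULL auto_increment,
  pid int(11) DEFAULT '0' NOT NULL,
  title varchar(255) DEFAULT '' NOT NULL,
  PRIMARY KEY (uid)
);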

SSAS cube processing error about column binding

This is an error message I get after processing an SSAS cube:
Errors in the back-end database access module. The size specified for a binding was too small, resulting in one or more column values being truncated.
However, it gives me no indication of what column binding is too small.
How do I debug this?
This error message had been driving me crazy for hours. I had already found which column had increased its length and updated the data table in the source, which was now showing the right length. But the error just kept popping up. It turns out that field was used in a fact-to-dimension link on the Dimension Usage tab of the cube, and when you refresh the source, the binding created for that link does not refresh. The fix is to remove the link (change the relationship type to 'No Relationship') and re-create it.
Update: since this answer still seems to be relevant, I thought I'd add a screenshot showing the area where you can encounter this problem. If, for whatever reason, you are using a string for the dimension-to-fact link, it can be affected by the increased size, and the solution is described above. This is in addition to the problem with the Key, Name and Value Columns on the dimension attribute.
ESC is correct. Install the BIDS Helper from CodePlex. Right click on the Dimensions folder and run the Data Discrepancy Check.
Dimension Data Type Discrepancy Check
This fixed my issue.
Open your SSAS database using SQL Server Data Tools.
Open the Data Source View of the SSAS database.
Right click an empty space and click Refresh
A window will open and show all changes to the underlying data model.
Documentation
Alternate Fix #1 - SQL Server 2008 R2 (haven't tried on 2012 but assume this will work).
Update / refresh your DSV. Note any changed columns so you can review.
Open each dimension that uses the changed columns. Find the related attribute and expand the properties KeyColumns, NameColumn and ValueColumn.
Review the DataSize properties for each and if these do not match the value from the DSV, edit accordingly.
Alternate Fix #2
Open the affected *.dim file and search for your column name / binding.
Change the Data Size element: <DataSize>100</DataSize>
As Esc noted, column size updates can affect the Dimension Usage in the cube itself. You can either do as Esc suggests, or edit the *.cube file directly - search for the updated attribute and related Data Size element: <DataSize>100</DataSize>
I've tried both fixes when a column size changed, and they both work.
In my case, the problem was working on the cube on the live server.
If you are working on the cube live, connected to the server, this error message pops up.
But when you are working on the cube as a solution saved on your computer, you do not get the error message.
So work on the cube locally and deploy after making changes.
In my particular case, the issue was that my query was reading from Oracle and a hard-coded column had a trailing space (my mistake).
I removed the trailing space and, for good measure, cast the hard-coded value: CAST('MasterSystem' AS VarChar2(100)) AS SOURCE.
This solved my particular issue.
I encountered this problem as well. The issue was resolved by removing leading and trailing spaces with the RTRIM and LTRIM functions.
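If the spaces come from the relational source, here is a minimal T-SQL sketch of that kind of clean-up in the source query (the table and column names are placeholders):
-- Trim leading and trailing spaces before the value reaches the dimension attribute
-- (dbo.SourceTable and AttributeName are placeholder names)
SELECT LTRIM(RTRIM(AttributeName)) AS AttributeName
FROM dbo.SourceTable;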
I encountered the same problem, refreshing the data source did not work. I had a Materialized Referenced Dimension for the Fact Partition that was giving me the error. In my DEV environment I unchecked Materialize and processed the partition without the error.
Oddly, now I can enable Materialization for the same relationship and it will still process without issue.
Simple thing to try first - I've had this happen several times over the years.
Go to data source view and refresh (it may not look like anything happens, but it's good practice)
Edit dimension. Delete the problem attribute, then drag it over again from the data source view listing.
Re-process full.
As others have mentioned, data with trailing spaces can be the cause as well. Check for them: SELECT col FROM tbl WHERE col LIKE '% '
I ran into the same problem, and the answer from Esc can be a solution too. The cause is much more 'hidden', and the more obvious solutions ('Refresh' and the 'Data type discrepancy check') did no good in my case.
I did not find a proper way to 'debug' this problem.

When updating Entity Framework model from database, updatable views get deleted

I have an .edmx file with updatable views. I made these views by following an example here, where I delete the name and the type and leave just the dbo schema; however, every time I pick "Update Model from Database", these views and their entire definition, including associations and such, get removed from the file.
To solve this problem, I end up doing a manual merge with the previous version; however, this is a really long and painful process.
Anyone know what I'm doing wrong?
Example of my declared update-able view:
<EntitySet Name="vw_MeterEmisHist" EntityType="Model.Store.vw_MeterEmisHist" Schema="dbo" />
I have had the same thing happen when adding nodes to allow for mapping stored procedures to entities. The reason for this is that the XML-formatted EDMX file is always completely auto-generated when the model is updated (or created) from the database.
The easiest workaround I have found is to keep a text file within my solution with the changes I have made, so that they can easily be re-applied. To speed things up, it's possible to create a find/replace macro within Visual Studio to automate the process.
If anyone ever gets really bored, that sort of functionality would make a great add-in. (Or a great fix in VS. MS, are you listening?)
