My SQL CLR procedure depends on log4net, which in turn depends on System.Web.dll.
When I upload System.Web.dll:
create assembly [System.Web]
from 'C:\Windows\Microsoft.NET\Framework64\v4.0.30319\System.Web.dll'
with permission_set = unsafe;
go
I see that about 20 other assemblies get added as well.
Ok.
How can I script these assemblies and upload them from binary in a script?
Good News (mostly)
If you have already loaded these assemblies into SQL Server, then you have two options. First, if you are using SSMS, you might be able to script out the assemblies. Just:
right-click on the database name in Object Explorer, go to "Tasks ⏵", and select "Generate Scripts...".
Choose "Select specific database objects" and select the desired assemblies to script under "SQL Assemblies".
Click the "Next >" button
Select "Save as script file" and enter in the "File name:"
Click the "Next >" button
Click the "Next >" button
The scripted out objects might be in dependency order, in which case the CREATE ASSEMBLY statements don't need to be reordered.
If you get an error, you might need to save the report if you want the actual details. It could be:
SmoException: Cyclic dependencies found.
which is what I ran into when scripting out some .NET Framework libraries that I had loaded. Sooo....
The second option is to simply construct the CREATE ASSEMBLY statements by SELECTing their contents from the system catalog views:
SELECT N'CREATE ASSEMBLY ' + QUOTENAME(asm.[name]) + NCHAR(0x0A)
       + N' FROM ' + CONVERT(NVARCHAR(MAX), [content], 1) + NCHAR(0x0A)
       + N' WITH PERMISSION_SET = '
       + CASE asm.[permission_set]
           WHEN 3 THEN N'UNSAFE'
           ELSE asm.[permission_set_desc] COLLATE DATABASE_DEFAULT
         END
FROM   sys.assemblies asm
INNER JOIN sys.assembly_files asf
        ON asf.[assembly_id] = asm.[assembly_id]
WHERE  asf.[file_id] = 1
AND    asm.is_user_defined = 1
--AND    asm.[name] = N'{{assembly_name}}'
FOR XML AUTO, ELEMENTS;
Warning: the returned XML value will appear as a link; do NOT left-click it. Left-clicking opens a new tab with what should be a formatted XML document, but the value will most likely be too large for SSMS to handle (it certainly was on my system using SSMS 18.10): it will just hang forever and you will need to forcibly kill the process (and that was with a single assembly of only 4 MB). You likely have several assemblies that got loaded as dependencies, and those sizes can add up quickly.
You will need to save the results by right-clicking anywhere in the "Results" tab and selecting "Save Results As...". I'm not sure if the "Save as type" matters as they are all text and there is only one value, but I usually select "All files (*.*)".
If the file is too big to open in a text editor or SSMS, you can uncomment the WHERE predicate filtering on [name] and script one assembly at a time, or make it an IN list and script several at a time.
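For example, the commented-out predicate could be adapted into an IN list like this (the assembly names here are purely illustrative):

```sql
-- Script only a handful of assemblies per run to keep the output small
AND asm.[name] IN (N'System.Web', N'System.Drawing', N'System.Configuration')
```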
Also keep in mind that:
You will need to remove the <asm> and </asm> tags.
The values are not necessarily returned in an order that can be loaded, so you will have to test and might need to adjust the order of the CREATE ASSEMBLY statements.
Not-so-good News (potentially)
SQL Server checks the GAC for assemblies of the same name as those being referenced in SQL Server. If it finds any, it compares version numbers, and they must be the same (i.e. between the assembly in SQL Server and the one in the GAC); you will get an error if the version within SQL Server is different. This means that if any of the .NET Framework assemblies you load into SQL Server get updated via Windows Update (or even a manual update of .NET), you will probably need to drop these assemblies and reload them with what will then be the current version(s).
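You can see the exact version recorded inside SQL Server via the clr_name column, which is what gets compared against the GAC copy (a quick check, using only the catalog views mentioned above):

```sql
-- clr_name includes the version, culture, and public key token that
-- SQL Server expects to match the GAC-resident copy (if one exists)
SELECT [name], clr_name
FROM   sys.assemblies
WHERE  is_user_defined = 1;
```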
Related
When creating a UserDefinedType in C# code for the sake of SQLCLR integration, you are required to decorate the class or struct with a SqlUserDefinedType attribute, as here:
[SqlUserDefinedType(
Name = "g.Options",
// ...
)]
public struct Options : INullable {
// ...
}
Notice that in the "Name" parameter, I attempt to set a schema in addition to the object name. But, when I generate the script in the publish stage of a Visual Studio Database Project, I get:
CREATE TYPE [dbo].[g.Options]
There is no "schema" parameter for SqlUserDefinedType.
I do believe I can write the T-SQL script to create the type from the assembly directly, but I would like to avoid that, as I plan on putting most of my types in different schemas and wouldn't be happy having to register each one via explicit T-SQL.
EDIT:
As Solomon Rutzky points out, you can set the Default Schema in the project properties. It is certainly no substitute for something akin to a 'schema' parameter in SqlUserDefinedType, particularly if you want to work with multiple schemas, but it certainly gets the job done for many people's needs.
A post-deployment script will technically get the job done, but unfortunately, the comparison engine doesn't know about the post-deployment logic and so will perpetually register the schema difference as something that needs to be changed. So all your affected objects will be dropped and re-created on every publish regardless of whether you changed them or not.
The schema name is specified in a single location per project, not per object.
You can set it in Visual Studio via:
"Project" (menu) -> "{project_name} Properties..." (menu option) -> "Project Settings" (tab)
On the right side, in the "General" section, there is a text field for "Default schema:"
OR:
you can manually edit your {project_name}.sqlproj file, and in one of the top <PropertyGroup> elements (one that does not have a "Condition" attribute; the first such element is typically used), you can create (or update if it already exists) the following element:
<DefaultSchema>dbo</DefaultSchema>
HOWEVER, if you are wanting to set one object (such as a UDT) to a different Schema name than the rest of the objects are using, that would have to be done manually in a Post Release SQL script. You can add a SQL script to your project, then in the Solution Explorer on the right side, select the SQL script, go to its Properties, and for "BuildAction", select "PostDeploy". In that post-deploy script, issue an ALTER SCHEMA statement:
ALTER SCHEMA [g] TRANSFER TYPE::dbo.Options;
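Since a post-deploy script runs on every publish, it can help to make the transfer idempotent; a minimal sketch, assuming the schema and type names from the example above:

```sql
-- Only transfer if the type is still in dbo; TYPE_ID returns NULL
-- once dbo.Options no longer exists (i.e. it was already moved to [g])
IF TYPE_ID(N'dbo.Options') IS NOT NULL
    ALTER SCHEMA [g] TRANSFER TYPE::dbo.Options;
```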
EDIT: Judging from the comments, I have been unclear in what I am trying to achieve. I'll try from another angle.
I have been developing sprocs for a number of years. I increasingly feel that choosing between SQL embedded in C# code and sprocs are both bad choices. I know that many people will disagree and that's fine.
I didn't elaborate on this in the question to avoid having the discussion be about sprocs or not :-) .
As an experiment, I have tried embedding .sql files in my project in Visual Studio. Literally, in my project tree I have files with the .sql extension and the Build Action set to Embedded Resource.
That way I can edit the SQL query code from within VS and even run/execute it from within VS, without running the project. I love that. The actual code that I am working on right now is akin to a "product list" with paging and multiple ordering and filtering options. The SQL query has parameters like @skip, @search, etc. This means that if I try to run/execute it from within VS (specifically by pressing CTRL + SHIFT + E or selecting "Execute" from the "SQL" menu in VS), these parameters are missing (of course), since they are meant to be provided as SqlParameter at runtime. I understand why this happens and by no means intend to imply that either VS or SSMS is bugged. I am merely looking for a way to "tell" VS that "when I execute this query from within VS, I intend for @skip to be 10". I (perhaps mistakenly) assumed that the reason VS has IntelliSense support and Execute support for .sql files was to support a scenario akin to what I am trying to do.
I was just hoping someone else was doing the same and had a clever way/addon/trick to support it.
EDIT END
In a project in Visual Studio 2017, I have a number of .sql files which are embedded into the application when built.
While editing the files in VS, I can conveniently connect to a SQL server and execute the query.
However, if the query has parameters like this:
SELECT * FROM Employee WHERE ID=@ID;
The execution, from within Visual Studio, fails with
Must declare the scalar variable "@ID".
This can be "fixed" by adding a line at the top of the script/file, so it looks like:
DECLARE @ID int = 123;
SELECT * FROM Employee WHERE ID=@ID;
However, now it doesn't work when called from within the code like this (using Dapper, it's not relevant to the question, but explains the syntax):
var emp = conn.Query<Employee>( sql, new { ID=123 } );
I was hoping that either I could specify the value of ID somewhere not in the file, or specify some part of the file which VS would execute, but would be ignored later when calling from code.
EDIT: To be clear, I can craft the SQL so that it works flawlessly from within Visual Studio or at runtime, but I cannot have both. I was hoping there was some neat hack, a VS addon or just some VS feature that I was just missing. I have considered adding a simple pre-processor so I can do something like this:
--if DEBUG
DECLARE @ID int = 123;
--endif
SELECT * FROM Employee WHERE ID=@ID;
And then I could add it to my function that loads the embedded resource.
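A minimal sketch of such a pre-processor, applied when loading the embedded resource (the `--if DEBUG` marker syntax and the helper name are my own invention, not an existing VS feature):

```csharp
using System.Text;

static class SqlResourceLoader
{
    // Strips every "--if DEBUG ... --endif" block from an embedded
    // SQL script, so the DECLAREs used for in-VS testing never reach
    // the server at runtime.
    public static string StripDebugBlocks(string sql)
    {
        var result = new StringBuilder();
        bool inDebugBlock = false;

        foreach (var raw in sql.Split('\n'))
        {
            var line = raw.TrimEnd('\r');
            if (line.Trim() == "--if DEBUG") { inDebugBlock = true; continue; }
            if (line.Trim() == "--endif") { inDebugBlock = false; continue; }
            if (!inDebugBlock) result.AppendLine(line);
        }
        return result.ToString();
    }
}
```

At runtime you would run the loaded resource through StripDebugBlocks before handing the text to Dapper; in VS you execute the file as-is, DECLAREs included.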
Pardon me if I misunderstood the question. Could it be you're looking for something like this? The query approach doesn't really matter; the issue seems to be that you declare @ID but never add the parameter value to the command itself:
con.Open();
SqlCommand cmd = con.CreateCommand();
cmd.CommandText = "SELECT * FROM Employee WHERE ID=@ID;";
// This is why declaring alone doesn't work: the parameter value
// must also be added to the command before executing it.
cmd.Parameters.AddWithValue("@ID", actualID);
SqlDataReader rdr = cmd.ExecuteReader();
Five years have passed and the original post remains without an answer; I stumbled upon it because I am trying to do exactly the same as Thomas.
One way to do it, which is far from perfect, is to declare the parameters with test values at the top of the SQL file (e.g. DECLARE @p1 INT = 10). They need to be removed later, of course.
What is needed, in my view, is a user interface in VS like the one SQL Server Management Studio shows when a stored procedure with parameters is run: a pop-up form where parameter values can be set before the SP runs.
We can create (update, populate) tables etc by either SQL commands or by simply clicking the options/command with a mouse/keyboard in SQL Server Management Studio (Express) 2008.
Is there a way I can record all the commands that occur when I do it with a mouse?
Are there macro recording features within SSMS?
Are there 3rd-party add-ons to support macro recording in SSMS?
No, there is nothing built into SSMS to record keystrokes / mouse clicks. There are plenty of 3rd party programs available, though.
What you should do is write the commands you want to run using T-SQL or DDL. Those are very easy to save to a file, store in source control, review, run again later, etc.
In newer versions of SSMS, almost all tasks you perform through dialogs have the ability to script out the change you are about to perform (or just generate a script instead of clicking OK).
EDIT for the use case you presented in a comment, where you create a table, add constraints, add some dummy data. Here is what you can do to generate a script that you can use later:
Open Object Explorer
Right-click your database and choose Tasks > Generate Scripts
Click Next, Click Next
On the "Choose Script Options" page, scroll down and set "Script Data" to True
Click Next
Check Tables and click Next
Check the table you want and click Next
Choose whether to script to a file, to the clipboard, or a new query editor window.
You'll end up with something like this:
USE [your_database]
GO
/**** set options, object name, script date, etc ****/
CREATE TABLE [dbo].[foo](
bar INT,
blat INT,
CONSTRAINT PK_foo PRIMARY KEY CLUSTERED(bar ASC)
) -- with/on etc.
GO
INSERT [dbo].[foo]([bar],[blat]) VALUES(1,2);
INSERT [dbo].[foo]([bar],[blat]) VALUES(3,4);
GO
ALTER TABLE [dbo].[foo] WITH NOCHECK ADD CONSTRAINT [chk_blat]
CHECK ((blat > bar))
GO
ALTER TABLE [dbo].[foo] CHECK CONSTRAINT [chk_blat]
GO
I know this is more steps than you want, but Management Studio's job is not to save every single command you execute and store it somewhere so you can retrieve it later.
Here are two options that might help.
The SSMS Tools Pack has a query history:
http://www.ssmstoolspack.com/Images/Features/QEH1.png
I think this will work as you expect because your interaction with the GUI will translate into T-SQL commands, which should be recorded by this tool.
You can download this here:
http://www.ssmstoolspack.com/Features
If you start getting familiar with the actual commands you can use SQL Tab Magic, which auto-saves your SSMS query windows for future retrieval:
http://www.red-gate.com/products/sql-development/sql-prompt/entrypage/sqltabmagic
I am running 2008 R2
I have an entire set of reports that I have to load into RS. These were developed and sent to our company by a third party. They connect to a shared data source. I uploaded the reports into folders that separated them by function. The problem is that they don't see the data source that the reports are written to use. I created a shared data source and pointed the report to it. The connection tested successfully but when I ran the report I continued to receive the following error:
"The report server cannot process the report or shared dataset. The shared data source 'dsMyObject' for the report server or SharePoint is not valid. Browse to the server or site and select a shared data source. (rsInvalidDataReference)"
When I used the exact same credentials in an embedded data source the report runs and after I get it to run I can change it back to the shared data source I created and it runs just fine.
My problem is that this process is too inefficient: I have too many reports, and many more servers to roll this out to, to manually switch the data source back and forth just to make each one work.
I noticed that when I look through the list of data sources, the ones I created show the name of the DS on the first line and the location on the next (http://[Server]/reports/[Data Source Folder]), while the DSs created by the vendor show the name of the DS on both lines. This leads me to believe there is a way to load the DS into SSRS with the same name the vendor used, in such a way that the reports automatically use it. I can't find anything that tells me how to do this, so I am asking here: how do I get prewritten reports to see a shared data source on the server I deploy the reports to?
Thanks in advance,
I know this is a really old question but I came across this as I was facing a similar issue over the past two days. However, in my experience the solution was simpler - so I'll add it in case someone else searches for this in the future.
You need to create your data source before you deploy your reports into RS. Here's what happened for me:
1) I first deployed them before adding a data source, and they didn't work: neither from the application that I'm working on, nor from RS
2) I then created a data source, but that didn't do anything for existing reports. I was able to configure some of them and point them to the new data source, and that worked. If you want to do this in bulk, you could probably use the RS Utility as suggested above; however, I haven't tried it.
3) I tried redeploying the reports into RS; however, apparently RS doesn't overwrite items that haven't changed, as the redeployed reports still didn't pick up the data source
4) I then deleted existing reports and redeployed - this fixed it and they picked up the data source straight away.
This is a big shortcoming of SSRS. When reports and data sources are created in SSRS, they are assigned unique identifiers in the ReportServer database. When reports are uploaded, they do not recognize an existing data source with the same name the report was built with.
One way around this is to use the rs.exe utility that comes with SSRS. It can be used for bulk tasks, including setting data source metadata on reports. See the link below for more technical information.
RS Utility
I found the easiest way to resolve this was to simply create a new report that DOES use the shared data source. In SSRS2012/VS2010, the first page of the wizard asks whether to use a "Shared data source" or to create a "New data source". Taking the first option (and continuing on to create the dummy report and save it) results in an RDL containing a <DataSources> section, something like this:
<DataSources>
<DataSource Name="shared_datasource_name">
<DataSourceReference>shared_datasource_name</DataSourceReference>
<rd:SecurityType>None</rd:SecurityType>
<rd:DataSourceID>f00b5045-1a8c-44be-952b-cca1ce9c57d6</rd:DataSourceID>
</DataSource>
</DataSources>
I just copy-pasted this section into my intended report RDL in the same location. It happily uses that datasource during preview and when re-deployed.
Update one report manually with the shared data source. In the [ReportServer].[dbo].[DataSource] table, note the "Link" value for that report, then simply update the rest that have that data source in the Name column:
update [ReportServer].[dbo].[DataSource]
set Link = '01B758F2-B597-43BE-A90F-416DF3CDA396'
where Name = 'PNLDS'
and Link is NULL
In my case I had to click the drop-down arrow at the right of the report in SSRS, select "Manage", then "Data Sources", then browse to the location of the Shared Data Source the report needs to use.
Appreciate this is an old thread, but I was trying to find out if SSRS can report which shared data source it's using, so I could add a warning to an SSRS report in case it was still pointing at the wrong data source when deployed. The solution I came up with admittedly only works if you're using different servers for Dev/UAT/Prod, but you can stick @@SERVERNAME in your T-SQL script.
I've made it into a case statement that I can bung on my reports
select case @@SERVERNAME when 'Prodserver' then ''
else 'WARNING: This report is not looking at production data!' end as ServerWarning
Works well enough for me!
Definitely a big shortcoming of SSRS.
Many good answers posted here which suit different scenarios, skillsets and tools.
I'm adding another one using the New-WebServiceProxy cmdlet in PowerShell:
# Create WebServiceproxy object
[string] $WebServiceUri="http://localhost/ReportServer/ReportService2010.asmx?wsdl"
$proxy=New-WebServiceProxy -Uri $WebServiceUri -UseDefaultCredential
# Define required data types (DataSource and DataSourceReference) and instantiate
$typeds=($proxy.gettype().namespace) + '.DataSource'
$ds=new-object -TypeName $typeds
$typedsref=($proxy.gettype().namespace) + '.DataSourceReference'
$reference=new-object -TypeName $typedsref
# Set data source object with reference and name
$reference.Reference="/Data Sources/DStest" # path/name of the data source
$ds.Item=$reference
$ds.Name="DStest" # name of the data source on the report
# Repoint data source on report
$ReportPath="/Reports/TestReport" # path/name of the report
$proxy.SetItemDataSources($ReportPath, $ds)
Explanation:
The main idea is to use the SetItemDataSources(x, y) method of the web service proxy to point the data source named y.Name on report x to the data source reference y.Item.
In our case the report is called TestReport, located in the folder "/Reports", and it references the data source "/Data Sources/DStest" by the name "DStest".
In order to do the repoint, we need to create a reference to the data source ($ds), for which we first needed to define the data types DataSource and DataSourceReference by deriving them from the namespace of the WebServiceProxy object ($proxy).
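Building on that, the repoint can be done in bulk; a sketch (untested, assuming the same $proxy and $ds from the script above, and that every report under the folder should use the same shared data source):

```powershell
# Walk every catalog item under /Reports (recursively) and repoint
# each report to the shared data source built above
$items = $proxy.ListChildren("/Reports", $true)
foreach ($item in $items)
{
    if ($item.TypeName -eq "Report")
    {
        $proxy.SetItemDataSources($item.Path, $ds)
        Write-Host "Repointed: $($item.Path)"
    }
}
```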
I was able to bulk update and fix the problem by updating the data source link using the T-SQL below.
begin tran
update dbo.DataSource
set Link = c.ItemID
from dbo.DataSource ds
inner join dbo.Catalog c
on ds.Name = c.Name
and c.Type = 5
where ds.link is null
select ds.Link oldlink, c.ItemID
from dbo.DataSource ds
inner join dbo.Catalog c
on ds.Name = c.Name
and c.Type = 5
--where ds.link is null
select *
from dbo.DataSource ds
where ds.link is null
rollback tran
-- commit tran
http://tsqlblog.blogspot.co.uk/2011/05/rsinvaliddatasourcereference-on.html
SQL Server Mgmt Studio is driving me crazy.
If I create a view and SELECT '*' from a table, it's all OK and I can save the view.
Looking at the SQL for the view (eg.by scripting a CREATE) reveals that the 'SELECT *' really is saved to the view's SQL.
But as soon as I reopen the view using the GUI (right click > modify), SELECT * is replaced with a column list of all the columns in the table.
How can I stop Management Studio from doing this ? I want my 'SELECT *' to remain just that.
Perhaps it's just the difficulty of googling 'SELECT *' that prevented me from finding anything remotely relevant to this (I did put it in double quotes).
Please, I am highly experienced in Transact-SQL, so please DON'T give me a lecture on why I shouldn't be using SELECT *. I know all the pros and cons and I do use it at times. It's a language feature, and like all language features can be used for good or evil (I emphatically do NOT agree that it is never appropriate to use it).
Edit: I'm giving Marc the answer, since it seems it is not possible to turn this behaviour off. Problem is considered closed. I note that Enterprise Manager did no similar thing.
The workaround is to either edit SQL as text, or go to a product other than Managment Studio. Or constantly edit out the column list and replace the * every time you edit a view. Sigh.
When SQL Server Mgmt Studio creates a view, I assume they're expanding the * to the complete list of columns that are present in the underlying table(s) at that particular time exactly for this reason: what if one of the underlying tables changes? Do you want the new columns to just simply show up in every view that references that table?? Seriously???
I think Microsoft tries to implement the "principle of least surprise" here: your view will contain those columns that are present at the time the view gets created, and it stays that way unless you explicitly and knowingly change it.
I for one wouldn't want to have my views suddenly having more columns than before, when an underlying table changes..... do you??
And I don't think there's any setting in Mgmt Studio to turn this behavior off, sorry.
Don't use the GUI editor.
Instead use the T-SQL editor. You get this by selecting "Script View As" -> "ALTER to" -> "New Query Window" from the right-click menu.
Try either of these: they are alternatives to using the GUI and can be setup as snippets with keyboard shortcuts:
select view_definition
from information_schema.views
where table_name = 'viewname'
or
exec sp_helptext 'viewname'
The results will retain the "select *". (Tested)