I am running SQL Server 2008 R2.
I have an entire set of reports that I have to load into RS. They were developed and sent to our company by a third party, and they connect to a shared data source. I uploaded the reports into folders that separate them by function. The problem is that they don't see the data source the reports were written to use. I created a shared data source and pointed a report at it. The connection tested successfully, but when I ran the report I still received the following error:
"The report server cannot process the report or shared dataset. The shared data source 'dsMyObject' for the report server or SharePoint is not valid. Browse to the server or site and select a shared data source. (rsInvalidDataReference)"
When I used the exact same credentials in an embedded data source, the report ran. And once I had gotten it to run, I could change it back to the shared data source I created and it ran just fine.
My problem is that this process is too inefficient: I have too many reports, and many more servers to roll this out to, to manually switch the data source back and forth just to make each report work.
I noticed that when I look through the list of data sources, the ones I created show the name of the DS on the first line and its location on the next (http://[Server]/reports/[Data Source Folder]), while the DSs created by the vendor show the name of the DS on both lines. This leads me to believe there is a way to load the DS into SSRS under the same name the vendor used, in such a way that the reports automatically use it. I can't find anything that tells me how to do this, so I am asking here: how do I get prewritten reports to see a shared data source on the server I deploy them to?
Thanks in advance,
I know this is a really old question, but I came across it while facing a similar issue over the past two days. In my experience the solution was simpler, so I'll add it in case someone else searches for this in the future.
You need to create your data source before you deploy your reports into RS. Here's what happened for me:
1) I first deployed them before adding a data source, and they didn't work: neither from the application I'm working on, nor from RS.
2) I then created a data source, but that didn't do anything for existing reports. I was able to configure some of them and point them to the new data source, and that worked. If you want to do this in bulk, you could probably use the RS Utility as suggested above, or something like this; however, I haven't tried these.
3) I tried redeploying the reports into RS, but apparently RS doesn't overwrite items that haven't changed, as the redeployed reports still didn't pick up the data source.
4) I then deleted the existing reports and redeployed - this fixed it, and they picked up the data source straight away.
This is a big shortcoming of SSRS. When reports and data sources are created in SSRS, they are assigned unique identifiers in the ReportServer database. When reports are uploaded, they do not recognize an existing data source just because it has the same name as the one the report was built with.
One way around this is to use the rs.exe utility that comes with SSRS. It can be used for bulk tasks, including setting data source metadata on reports. See the link below for more technical information.
RS Utility
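For illustration, here is a minimal C# sketch of the same fix done programmatically. It assumes a proxy class generated from the ReportService2010.asmx endpoint (the same API that rs.exe scripts and the PowerShell answer below use); the proxy namespace, report path, folder and data source names are placeholders.

using System.Net;
using ReportService2010; // hypothetical namespace of the generated proxy

class RepointDataSource
{
    static void Main()
    {
        var rs = new ReportingService2010();
        rs.Url = "http://localhost/ReportServer/ReportService2010.asmx";
        rs.Credentials = CredentialCache.DefaultCredentials;

        // Point the report's data source reference at the shared data source.
        var reference = new DataSourceReference { Reference = "/Data Sources/dsMyObject" };
        var ds = new DataSource { Name = "dsMyObject", Item = reference };

        rs.SetItemDataSources("/MyFolder/MyReport", new[] { ds });
    }
}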
I found the easiest way to resolve this was to simply create a new report that DOES use the shared data source. In SSRS2012/VS2010, the first page of the wizard asks whether to use a "Shared data source" or to create a "New data source". Taking the first option (and continuing on to create the dummy report and save it) results in an RDL containing a <DataSources> section, something like this:
<DataSources>
<DataSource Name="shared_datasource_name">
<DataSourceReference>shared_datasource_name</DataSourceReference>
<rd:SecurityType>None</rd:SecurityType>
<rd:DataSourceID>f00b5045-1a8c-44be-952b-cca1ce9c57d6</rd:DataSourceID>
</DataSource>
</DataSources>
I just copy-pasted this section into the RDL of my intended report, in the same location. It happily uses that data source during preview and when re-deployed.
Update one report manually to use the shared data source. In the [ReportServer].[dbo].[DataSource] table, note the "Link" value for that report, then simply update the rest that have that data source's name in the Name column:
update [ReportServer].[dbo].[DataSource]
set Link = '01B758F2-B597-43BE-A90F-416DF3CDA396'
where Name = 'PNLDS'
and Link is NULL
In my case I had to click the drop-down arrow at the right of the report in SSRS, select "Manage", then "Data Sources", then browse to the location of the Shared Data Source the report needs to use.
Appreciate this is an old thread, but I was trying to find out whether SSRS can report which shared data source it's using, so I could add a warning to an SSRS report in case it was still pointing at the wrong data source when deployed. The solution I came up with admittedly only works if you're using different servers for Dev/UAT/Prod, but you can stick @@SERVERNAME in your T-SQL script.
I've made it into a case statement that I can bung on my reports:
select case @@SERVERNAME when 'Prodserver' then ''
       else 'WARNING: This report is not looking at production data!' end as ServerWarning
Works well enough for me!
Definitely a big shortcoming of SSRS.
Many good answers posted here which suit different scenarios, skillsets and tools.
I'm adding another one using PowerShell's New-WebServiceProxy cmdlet:
# Create WebServiceproxy object
[string] $WebServiceUri="http://localhost/ReportServer/ReportService2010.asmx?wsdl"
$proxy=New-WebServiceProxy -Uri $WebServiceUri -UseDefaultCredential
# Define required data types (DataSource and DataSourceReference) and instantiate
$typeds=($proxy.gettype().namespace) + '.DataSource'
$ds=new-object -TypeName $typeds
$typedsref=($proxy.gettype().namespace) + '.DataSourceReference'
$reference=new-object -TypeName $typedsref
# Set data source object with reference and name
$reference.Reference="/Data Sources/DStest" # path/name of the data source
$ds.Item=$reference
$ds.Name="DStest" # name of the data source on the report
# Repoint data source on report
$ReportPath="/Reports/TestReport" # path/name of the report
$proxy.SetItemDataSources($ReportPath, $ds)
Explanation:
The main idea is to use the method SetItemDataSources(x, y) on the web service proxy to set the data source named y.Name on report x to point to the data source reference y.Item.
In our case the report is called TestReport, located in folder "/Reports", and it references the data source "/Data Sources/DStest" by the name "DStest".
To do the repoint we need to create a reference to the data source ($ds), for which we first needed to define the data types DataSource and DataSourceReference by deriving them from the namespace of the WebServiceProxy object ($proxy).
I was able to bulk update and fix the problem by updating the data source link using the T-SQL below.
begin tran

-- Point every orphaned data source reference at the catalog item
-- with the same name (Type = 5 is a data source in dbo.Catalog)
update dbo.DataSource
set Link = c.ItemID
from dbo.DataSource ds
inner join dbo.Catalog c
on ds.Name = c.Name
and c.Type = 5
where ds.link is null

-- Verify the links that were set
select ds.Link oldlink, c.ItemID
from dbo.DataSource ds
inner join dbo.Catalog c
on ds.Name = c.Name
and c.Type = 5
--where ds.link is null

-- Check for any references still orphaned
select *
from dbo.DataSource ds
where ds.link is null

rollback tran
-- commit tran
http://tsqlblog.blogspot.co.uk/2011/05/rsinvaliddatasourcereference-on.html
I want to display all audit history data as per the MS CRM format.
I have imported all records from the AuditBase table from CRM into a table on another database server.
I want to query this table's records with SQL and present them in the Dynamics CRM format (as per the above image).
What I have done so far:
select
    AB.CreatedOn as [Created On], SUB.FullName [Changed By],
    Value as Event, AB.AttributeMask [Changed Field],
    AB.ChangeData [Old Value], '' [New Value]
from AuditBase AB
inner join StringMap SM on SM.AttributeValue = AB.Action and SM.AttributeName = 'action'
inner join SystemUserBase SUB on SUB.SystemUserId = AB.UserId
--inner join MetadataSchema.Attribute ar on AB.AttributeMask = ar.ColumnNumber
--inner join MetadataSchema.Entity en on ar.EntityId = en.EntityId and en.ObjectTypeCode = AB.ObjectTypeCode
--inner join Contact C on C.ContactId = AB.ObjectId
where ObjectId = '00000000-0000-0000-000-000000000000'
order by AB.CreatedOn desc
My problem is that AttributeMask is a comma-separated value that I need to compare with the MetadataSchema.Attribute table's ColumnNumber field. And how do I get the New Value from that entity?
I have already checked this link: Sql query to get data from audit history for opportunity entity, but it's not giving me the [New Value].
NOTE: I cannot use "RetrieveRecordChangeHistoryResponse", because I need to show this data in an external webpage from a SQL table (not the CRM database).
Well, basically Dynamics CRM does not create this audit view (the way you see it in CRM) using a SQL query, so if you succeed in doing it, Microsoft will probably buy it from you, as it would be much faster than the way it's currently handled :)
But really, the way it works currently, SQL is used only for obtaining all the relevant audit view records (without any matching against attribute metadata or anything else); all the parsing and matching with metadata is done in the .NET application. The logic is quite complex, and there are so many different cases to handle, that I believe recreating this in SQL would require not just a simple "select" query but a really complex procedure - and even that might not be enough, because not everything in CRM is kept in the database; some things are simply compiled into the application's libraries. It could take one person weeks or even months to accomplish (of course that's my opinion; maybe some T-SQL guru will prove me wrong).
So I would do it differently: use RetrieveRecordChangeHistoryRequest (already mentioned in some answers) to get all the audit details (already parsed and ready to use) from some kind of .NET application (probably running periodically, or maybe triggered by a plugin in CRM, etc.) and put them in a database in a user-friendly format. You can then consume this database with whatever external application you want.
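A minimal sketch of that .NET piece, assuming the CRM SDK assemblies (Microsoft.Xrm.Sdk, Microsoft.Crm.Sdk.Proxy) and an already-established IOrganizationService connection; the entity name is a placeholder, and the console output stands in for the insert into your own reporting table:

using System;
using System.Linq;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;

class AuditExporter
{
    // Pull the parsed audit history for one record, ready to be written
    // to a user-friendly reporting table.
    static void ExportHistory(IOrganizationService service, Guid recordId)
    {
        var request = new RetrieveRecordChangeHistoryRequest
        {
            Target = new EntityReference("opportunity", recordId)
        };
        var response = (RetrieveRecordChangeHistoryResponse)service.Execute(request);

        foreach (var detail in response.AuditDetailCollection.AuditDetails
                                       .OfType<AttributeAuditDetail>())
        {
            if (detail.NewValue == null) continue; // e.g. delete operations

            foreach (var changed in detail.NewValue.Attributes)
            {
                object oldValue = detail.OldValue != null
                                  && detail.OldValue.Contains(changed.Key)
                    ? detail.OldValue[changed.Key]
                    : null;
                // Replace with an INSERT into your reporting table.
                Console.WriteLine("{0}: {1} -> {2}", changed.Key, oldValue, changed.Value);
            }
        }
    }
}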
Also, I don't understand your comment:
I can not use "RetrieveRecordChangeHistoryResponse", because i need to
show these data in external webpage from sql table(Not CRM database)
What kind of application cannot call an external service (you can create a custom service; you don't have to use the CRM service) to get some data, but can access an external database? You should not read from the DB directly; a better approach would be to prepare a web service that returns the audit you want (using the CRM SDK under the hood) and to call this service from the external application. Unless, of course, your external app is only capable of reading databases, not calling any custom web services...
It is not possible to reconstruct a complete audit history from the AuditBase tables alone. For the current values you still need the tables that are being audited.
The queries you would need to construct are complex, and writing them can be avoided if the RetrieveRecordChangeHistoryRequest is a suitable option.
(See also How to get audit record details using FetchXML on SO.)
NOTE
This answer was submitted before the original question was extended stating that the RetrieveRecordChangeHistoryRequest cannot be used.
As I said in the comments, the audit table will have the old value & the new value, but not the current value. The current value is pushed in as a "new value" only when the next update happens.
In your OP query, AB.AttributeMask will return comma (",") separated values and AB.ChangeData will return tilde ("~") separated values. Read more
I assume you are fine with the "~" separated values in the Old Value column and want to show the current values of the fields in the New Value column. This is not going to work well when multiple fields are enabled for audit. You have to split the AttributeMask field value into CRM fields using ColumnNumber from the attribute view, and assemble the required result.
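A minimal sketch of that splitting step in C# (the answer below mentions doing it in SQL or C#); the names are hypothetical, and it assumes you have already loaded the ColumnNumber-to-attribute-name map from MetadataSchema.Attribute, and that the i-th mask entry pairs with the i-th "~" separated old value:

using System;
using System.Collections.Generic;

static class AuditMaskParser
{
    // AttributeMask looks like ",2,10,17," and ChangeData like "old1~old2~old3".
    public static IEnumerable<KeyValuePair<string, string>> Parse(
        string attributeMask, string changeData,
        IReadOnlyDictionary<int, string> columnNumberToName)
    {
        var columns = attributeMask.Split(
            new[] { ',' }, StringSplitOptions.RemoveEmptyEntries);
        var oldValues = changeData.Split('~');

        for (int i = 0; i < columns.Length; i++)
        {
            string name;
            if (!columnNumberToName.TryGetValue(int.Parse(columns[i]), out name))
                name = columns[i]; // no metadata match; keep the raw column number

            string oldValue = i < oldValues.Length ? oldValues[i] : string.Empty;
            yield return new KeyValuePair<string, string>(name, oldValue);
        }
    }
}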
I would recommend the reference blog below to start with; once you get the expected result, you can pull the current field value with an extra query, either in SQL or in C# on the front end. But you should concatenate the values again with "~" to maintain the format.
https://marcuscrast.wordpress.com/2012/01/14/dynamics-crm-2011-audit-report-in-ssrs/
Update:
From the above blog, you can tweak the stored procedure query with your fields, then convert the last select statement into a 'select into' to create a new table for your storage.
Modify the stored procedure to fetch the delta based on the last run. Configure a SQL job and schedule it to run every day or so, to populate the table.
Then select & display the data the way you want. I did the same in Power BI in under 3 days.
Pros/cons: obviously this requirement is for reporting purposes. Generally, reporting requirements are met by mirroring the database through replication or other means, without interrupting Prod users & the Async server by injecting plugins or ad hoc on-demand service calls. Moreover, you have access to the database & not CRM Online. Better not to reinvent the wheel & to take the available solution forward. This is my humble opinion, based on a Microsoft internal project implementation.
I am using the RevoScaleR package from Revolution Analytics, and
I find it quite odd that the functions available for SQL Server objects are very limited.
For example:
RxSqlServerData does not support querying from a view.
I have a view which I have created from multiple tables, and I intend to use this view as my source of data, but I could not find anything in RevoScaleR that solves my purpose.
I could very well create another table (which I don't want to, for many reasons), but I am looking for solutions in the RevoScaleR space.
I figured this out, so if someone else stumbles upon this, I guess it would help them (although it's highly unlikely; the documentation, if consulted, should make this pretty trivial).
Like RxOdbcData, RxSqlServerData accepts both a table parameter and a sqlQuery parameter.
If you want to use a view or a stored procedure, you can omit the table parameter and use the sqlQuery parameter instead.
Keep in mind that the two cannot be used with each other.
As @Bg1850 suggested, searching through the Microsoft/RevoScaleR docs is not very easy, but I think I found something here under the section "Create New Data Sources".
Basically, you have to set up a channel to your DB, and you can use this as the basis to define different data sources (a complete table, a view, or the result of a SQL query).
Epicor - what a beastly creature!
Epicor is asking for a password after a table change - any idea why?
We removed the relationship from the Part table and set up a criteria instead. Now it is asking for a password, which should not be happening.
The login happens when I try to run the report, and I am trying to figure out what I did to aggravate Epicor. The table was already there; I removed the relationship on the Part table and added a criteria instead. The only reason I did not add a table to the report data definition, as I originally wanted to, is that the Part table could only be added once - which is why I removed the relationship and added a criteria instead.
From your description, it sounds like the problem is related to the XML generated by Epicor for a non-BAQ-based report data definition. Crystal and SSRS reports ask for login information when either more than one data source is referenced in the report, or improper relationships are defined.
Note:
If you are not a report developer and you have modified this in an attempt to change the end data, I recommend you contact the report developer responsible for maintaining these before proceeding. Otherwise, read on.
Based on my experience, I would say that if you are confident in the new relationship structure in your report data definition, the solution to this problem likely lies within the report itself. Generate an XML file by running a test report, then open the .rpt (or .rdl) associated with this report and set its datasource to the new XML file. This updates the XML schema used as the datasource. Even if none of the fields changed in the data definition, the datasource schema definition stored in these files defines exactly the data formatting that the report expects to receive when it is opened by Epicor.
If that doesn't solve the problem and you are using Crystal, the XML relationships may be defined in a way that affects how the data is displayed; this can be adjusted via Database Expert -> Links tab in Crystal. You should reconnect all of the links to match the report data definition within Epicor.
If none of that works, open up and view the xml file.
It is not unheard of for report data definitions in Epicor to break behind the scenes when altering relationships, and the XML file generated by the test report may not even be well-formed. I have seen many XML files with unclosed elements and similar faults that cause various problems when attempting to run the report. In this case, my recommendation is to create a completely new report data definition (do not copy the old one) and re-enter all of the parameters that existed in the former definition. Repeat the refreshing of the report datasource as described above, and this problem should be fixed.
I am looking for a short example of how to generate dbf files.
I saw the following link:
Data File Header Structure for the dBASE Version 7 Table File
I am writing my program in C#.
For example, I want to produce the following table (as a binary .dbf):
Field Name Type MaxLength
-------------------------------------------
DSK_ID Character 100
DSK_ADRS Numeric 2
Are you trying to create the table within FoxPro (Visual FoxPro) itself, in dBASE, or with a .NET/Java language? Your tags are unclear as to what you are really getting into, and just creating the table via low-level binary writes is not the way to go.
I can modify this answer more, but I suggest you edit your question to provide more detail.
The basic syntax, if using Visual FoxPro, would be something like:
create table SomeTableName ( DSK_ID C(100), DSK_ADRS N(2,0) )
But again, I would need more on the environment you plan on working with.
Knowing you want to do this via C#, I would start by downloading Microsoft's VFP OLE DB provider.
Then, you can look at the many other links for connecting, querying (always parameterize), and executing what you need. Here is a short example to get a connection and create the table you want. Then it is up to you for querying, inserting, updating as needed.
using System.Data.OleDb;

// The Data Source points at the FOLDER where the .dbf tables live (or will live).
OleDbConnection oConn = new OleDbConnection("Provider=VFPOLEDB.1;Data Source=C:\\SomePath");
OleDbCommand oCmd = new OleDbCommand();
oCmd.Connection = oConn;
oCmd.Connection.Open();
// C(100) = character field of length 100; N(2,0) = numeric, 2 digits, 0 decimals.
oCmd.CommandText = "create table SomeTableName ( DSK_ID C(100), DSK_ADRS N(2,0) )";
oCmd.ExecuteNonQuery();
oConn.Close();
Now, note the "Connection" string has a Data Source. This should point to a PATH location where you WANT TO CREATE and/or QUERY the tables. You can have one connection pointing at a folder that has 100+ tables, and you can eventually query from any of them. But again, those are going to be other questions that you can find LOTS of answers to; for example, just search on
VFP OleDB C# and you will get plenty of hits
How are you going to handle memo files? Compound index files?
Just use the ODBC or Ole DB providers via COM InterOp and issue a CREATE TABLE.
Hey StackOverflow community,
My question is as follows:
I have a table, say USER_ADDR, with a bunch of columns in one database, say DB001.
I need to copy the contents of this table (based on a criteria) to a similar table USER_ADDR (same name, yes) in another database, DB002, with a different user ID and password.
I need to do this in a stored procedure that will be executed from a .NET application.
I tried this:
INSERT INTO "DB002".USER_ADDR (--column names--)
SELECT *
FROM "DB001".USER_ADDR
WHERE ID = "APPLICATION_NO_IN";
I get:
0: Error occurred: [IBM][DB2/NT64] SQL0204N "DB002.USER_ADDR" is an undefined name. LINE NUMBER=15. SQLSTATE=42704 : -204: IBM.Data.DB2: 42704
What am I doing wrong?
Thanks in advance
Vashist
I'm deleting my other answer after seeing the additional info about your use case. LOAD is mainly for bulk loads of large numbers of records.
In this case I'd recommend you do something like this: open connection1 in .NET to your data source, select the data, and hold it in a .NET DataTable. If required, you can do that select in a stored proc that returns either individual column values for a single row, or a cursor (rowset) that contains all the columns (and rows). Then, in .NET, open connection2 and insert the data from the DataTable into your destination. Again, that can be done with a stored proc.
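A minimal sketch of that two-connection approach, assuming IBM's .NET data provider (IBM.Data.DB2); the connection strings, servers and schema are placeholders, and the bulk copy could be replaced with per-row inserts through a stored proc as described above:

using System.Data;
using IBM.Data.DB2;

class CopyUserAddr
{
    static void Copy(string applicationNo)
    {
        var rows = new DataTable();

        // Connection 1: read the matching rows from DB001.
        using (var source = new DB2Connection(
            "Server=host1:50000;Database=DB001;UID=user1;PWD=pwd1;"))
        using (var cmd = new DB2Command(
            "SELECT * FROM USER_ADDR WHERE ID = @id", source))
        {
            cmd.Parameters.Add("@id", DB2Type.VarChar).Value = applicationNo;
            source.Open();
            new DB2DataAdapter(cmd).Fill(rows);
        }

        // Connection 2: write the held rows into DB002 under its own credentials.
        using (var target = new DB2Connection(
            "Server=host2:50000;Database=DB002;UID=user2;PWD=pwd2;"))
        {
            target.Open();
            using (var bulk = new DB2BulkCopy(target))
            {
                bulk.DestinationTableName = "USER_ADDR";
                bulk.WriteToServer(rows);
            }
        }
    }
}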
Another approach is to use an external script that connects to both databases.
Doing it from just one database is not possible unless you use, as already mentioned, information integration (federation), or you export the data and then load it.