msxml4 parseError undefined?

I have a Classic ASP website that uses a DLL to create a Msxml2.DOMDocument object and return it to the browser (called from a jQuery AJAX call). I run the website on two computers. Both have the same Classic ASP code, and both have the same version of the DLL file registered.
Under one environment the returned XML object has a "parseError" property and an "xml" property that contains a string representation of the XML in the object. The object type of the XML returned by the server is IXMLDOMDocument2 (as viewed in the browser's debugging tools).
But in my second environment the returned object seems to be of type "XMLDocument", and it doesn't have the "parseError" or "xml" properties.
In both environments, Windows shows that MSXML4 SP2 is installed in "Programs and Features", but only versions 3.0 and 6.0 are listed in this registry key:
HKEY_CLASSES_ROOT\CLSID\{2933BF90-7B36-11D2-B20E-00C04F983E60}\VersionList
Both environments have identical ASP code, and I've registered the same DLL file, which generates the XML to return, in both environments.
What might cause one environment to return a different XML object under the same conditions?
I'm not sure where to begin to solve this.

I figured out the issue; it was silly. The website in question is very old, and most of its pages don't work unless IE's Compatibility Mode is enabled. Compatibility Mode was enabled in IE for the environment that worked as I expected (it returned an IXMLDOMDocument2 object), but it was not enabled in the other environment.

Which version of the Excel API should be referenced in the manifest file when submitting an add-in to the Office Store?

When submitting an Excel add-in to the Office Store, which version of the Excel API should be referenced in the manifest file?
We have experienced being rejected because we didn't refer to the newest version of the Excel API.
But if our Excel add-in supports an older version of the API, shouldn't we be referencing that?
There are several aspects to the versioning of the Office.js library.
First, there is versioning of the actual JavaScript source files. Fortunately, this part is pretty simple: you always want the latest and greatest of the production Office.js, which is conveniently obtainable through our CDN: https://appsforoffice.microsoft.com/lib/1/hosted/Office.js. The files are also shipped as a NuGet package to allow corporate firewalled development, but the NuGet may lag a few weeks behind the CDN -- and in any case, any Store-bound add-in is required to reference the CDN location. So, in terms of Office.js versions, there isn't really versioning: there is simply the one and only evergreen, frequently-updated, always-back-compatible, Store-required CDN version.
(While on the subject of CDNs: we also have a beta CDN, available at https://appsforoffice.microsoft.com/lib/beta/hosted/Office.js. This one is great for testing newly-implemented-but-not-yet-officially-stamped-as-done features, which you'll find in our Open Specs: http://dev.office.com/reference/add-ins/openspec. However, any new APIs therein should be considered strictly "beta", and they may well be renamed, re-grouped, or postponed -- so your app should not rely on them, as we explicitly reserve the right to break back-compat on the beta branch. An API is not "done" until it is listed in IntelliSense and documentation as complete, until it's available on the production CDN, and until its isSetSupported check returns true -- more on that momentarily.)
The more interesting bit of versioning is the actual API capabilities offered by each host. The Office.js library will have the latest JS code to be able to run them, but older hosts might not be able to support some of the functionality. For example, if you look at https://dev.office.com/reference/add-ins/requirement-sets/excel-api-requirement-sets, you will see that the 2016 wave of Excel APIs -- grouped under the "ExcelApi" requirement set -- has had three versions to date: 1.1, 1.2, and 1.3. ExcelApi 1.1 was what shipped with Office 2016 RTM in September 2015; 1.2 shipped in early March 2016; and 1.3 shipped just recently, and is in the process of being rolled out to the CDN. Each API set version has a corresponding Office host version that supports it (for most APIs, there have to be both JS and host-side changes; it's fairly rare that something can be a purely-JS addition). The version numbers are listed in the table, and there are links below the table to find a mapping from build numbers to dates.
Each of the API set versions contains a number of fairly large features, as well as incremental improvements to existing features. The topic for each requirement set, such as the link above, provides a detailed listing of those features. And as you're programming, if you are using the JavaScript or TypeScript IntelliSense, you should be able to see the API versions for each of your APIs displayed as part of the IntelliSense.
You can use the requirement set in one of two ways. You can declare in the manifest that "I need API set ExcelApi 1.2, or else my add-in doesn't work at all" -- and that's fine, but then of course you aren't able to service older hosts, and so your add-in won't even show up there. Alternatively, if your add-in could mostly work in a 1.1 environment, but you want to light up additional functionality on newer hosts that support it, you can use the manifest to declare only the minimal API sets that you need (e.g., ExcelApi 1.1), and then do runtime checks for higher versions via the isSetSupported API. The question "Neat ways to get environment (i.e. Office version)" has a very detailed explanation of this isSetSupported API checking.
Hope this helps!

Getting IIS Folder physical path inside a Windows Forms Application

This question is strictly related to Windows Forms, as my task is to do this inside a SAP Business One add-on using C#. My requirement is to alter some configuration values stored in the Web.config file of a related WCF service hosted in IIS. I need to get the IIS folder path inside the SAP B1 form (think of it as inside a Windows Forms application); even the default path, such as "C:\inetpub\wwwroot", would do, but I'm looking for a way to get it without hard-coding it.
I've tried out the suggestion posted in the Getting IIS Application filesystem path thread.
string apPath = System.Web.Hosting.HostingEnvironment.ApplicationPhysicalPath;
Even if I add the System.Web reference to the project, it gives me a null value, and I can't add the System.Web reference just to address this issue.
I've also found the Environment.SpecialFolder enum on MSDN, but it doesn't list the IIS physical folder either.
Environment.GetFolderPath(Environment.SpecialFolder.System)
Can someone suggest a workaround for this scenario? Even getting this value from the system registry would be ok.
You are trying to access the web application configuration information from IIS. That means you will need a library such as Microsoft.Web.Administration from Microsoft (part of IIS),
https://www.iis.net/learn/manage/scripting/how-to-use-microsoftwebadministration
or its open source equivalent from Jexus Manager,
https://www.nuget.org/packages/Microsoft.Web.Administration.Jexus
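For example, here is a minimal sketch using ServerManager from Microsoft.Web.Administration (the site name and application path are placeholders; reading the IIS configuration this way normally requires the process to run with administrative rights):
using System;
using Microsoft.Web.Administration;   // reference Microsoft.Web.Administration.dll or the Jexus Manager NuGet package

public static class IisPaths
{
    // Returns the physical path of an IIS application, expanding %SystemDrive%-style variables.
    public static string GetAppPhysicalPath(string siteName, string appPath)
    {
        using (ServerManager serverManager = new ServerManager())
        {
            Site site = serverManager.Sites[siteName];                       // e.g. "Default Web Site"
            Application app = site.Applications[appPath];                    // e.g. "/MyWcfService" (placeholder)
            string physicalPath = app.VirtualDirectories["/"].PhysicalPath;
            return Environment.ExpandEnvironmentVariables(physicalPath);
        }
    }
}
Once you have the physical path, the service's configuration file is simply System.IO.Path.Combine(path, "Web.config").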

BIRT and iServer, dev/qa/production environments

I'm trying to set up my BIRT reports, and the iServer they sit on, such that the database the Data Sources connect to is determined by the environment. Our current setup is just one iServer instance and many environments running a Tomcat webapp that hit it (this may be the problem...).
Essentially the ideal is that the report connects differently in these places:
Local development, which runs a local Tomcat instance of the application that talks to the iPortal/iServer. This should use a local database, but be easy to switch to other databases for debugging, etc.
QA deploy, qa database
Production deploy, production database
I've seen two options for how to fix this:
First option is to bind the Data Source to a configuration file in resources somewhere. The problem here is that if you have only one iServer, its resources are local to the server it is on, not to wherever the webapp runs. So, if I understand it correctly, this does not provide the flexibility I'm looking for.
Second option is to pass in all the connection info as report parameters and have the application determine the correct parameters to send. This way the application could pull from a local configuration file. This option would work, but I'm wary of the security (or lack thereof) in passing around connection info/credentials.
Does anyone have a better option? Or have people just run local iServer instances for development? I can see that running an iServer for each environment may simplify this issue and allow reports released to production to be updated and tested in a QA environment without disrupting production, so maybe that is the solution.
One possible approach would be to set each of the connection properties conditionally in the Property Binding section of the Edit Data Source dialog, based on the value of a hidden parameter indicating which environment is to be accessed.
An example of this approach can be found here.
You mention that you are looking for an option for development, including the possibility of a local iServer. I think this would be overkill. Do your dev and initial testing in BIRT; you do not need an iServer to run the report. If you need resources on the iServer to run and test the report, you can reference those through the Server Explorer in BIRT Pro. Once you are ready to deploy, I would follow Mark's strategy above using property bindings on the data source itself. That is as close to a best practice for this migration requirement as exists in BIRT.

Consuming Data Services from Windows Phone, versions of data schema: IExtensibleDataObject alternative?

I'm accessing Azure storage (table) from my Windows Phone Azure app, using the System.Data.Services.Client DLL, via DataServiceContext.
My problem is that in my data classes I can't use IExtensibleDataObject as it's not supported in Silverlight.
Applying the XmlSerializerFormat attribute doesn't affect it either; it seems to be ignored when using data services (Fiddler shows that the data is not really in XML format).
Is there a way I can prevent my app from crashing each time a new field is added to the table?
The type IExtensibleDataObject isn't available in the Silverlight/Windows Phone version of WCF. If you have an error regarding this type, you should be able to regenerate the proxy class.
You can regenerate your proxy class by using the "Add Service Reference" option on the WP project in Visual Studio.
You can also use the Windows Phone service utility. Mine was found here:
C:\Program Files (x86)\Microsoft SDKs\Windows Phone\v7.1\Tools\SlSvcUtil.exe
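Usage is similar to the desktop SvcUtil tool; a rough sketch (the service URL below is a placeholder, and the exact switches can be checked with SlSvcUtil.exe /?):
SlSvcUtil.exe http://yourservice.example.com/MyDataService.svc
The regenerated proxy class then replaces the old one in the phone project.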

Is it OK to load .NET DLLs into SQL Server as UNSAFE?

When creating a SQL Server CLR stored procedure, I noticed that I couldn't reference anything in the .NET Framework as I normally would. After some reading around, I realised that assemblies need to be loaded into the database first.
Therefore, I loaded in the ones I need, but due to P/Invoke I had to use the UNSAFE permission set. I can now reference them in my stored procedure code and everything works fine.
However, I'm a little concerned about having to set them to UNSAFE when I don't really know what they are doing. So my question is this:
Is it OK to load the .NET Framework in as UNSAFE without knowing exactly what it's doing?
And how would doing so compromise the security/robustness/scalability of SQL Server (as Microsoft warns it could)?
Many thanks.
It could change the registry, restart services, reboot the server, etc. Nothing too important ;-) A simple chart with the differences
See this question too (no answers, though): SQL Server 2008: How crash-safe is a CLR Stored Procedure that loads unmanaged libraries
Of course, what are you doing that requires UNSAFE access?
When you use the SQL database engine on a server that hosts many public websites you know nothing about as the server administrator (or DBA, or whoever is responsible), you should restrict their access, and that restriction really matters. The same goes for a DBA working in a restricted area of a big company where the data matters most.
In my view, you should give your application access only to what it needs to see, nothing more. If you don't need to touch the registry, for example, why give the assembly unrestricted access? You have no idea how dangerous it could be if somebody injects code into your application and hijacks the database (with unrestricted access, no less!).
Hope it helps
This question is specific to loading .NET Framework assemblies that are not in the set of Supported .NET Framework Libraries, so I will focus on the context being Microsoft-supplied DLLs rather than any random DLL.
The difference between the assemblies in the "Supported" list and those not in the list comes down to the fact that the supported ones "have been tested to ensure that they meet reliability and security standards for interaction with SQL Server" (as noted in the "Supported Libraries" page linked above). The main issue is more so the "reliability" than the "security". The assemblies in the supported list have been verified to behave consistently as expected and without any bugs or odd side-effects. The functionality has been tested to work with various languages and collations, etc.
Some .NET Framework assemblies that are not in the supported list can be loaded with PERMISSION_SET set to SAFE. This, however, does not guarantee desired behavior. And some can only be loaded as UNSAFE, without that necessarily indicating that there will be a problem.
As an example of not guaranteeing behavior: I have loaded System.Drawing in order to do some simple image manipulation. I tested manipulations when the image was supplied directly via byte[] / VARBINARY(MAX) as well as when it was supplied by a filepath and read from disk. Everything worked as expected. I sent that to someone in Germany whose Windows and SQL Server were both set to "German" as the language. He was able to get the expected results when supplying the image directly. But when he supplied a filepath it didn't work.
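(For context only, not the author's actual code: a rough sketch of the kind of SQLCLR function involved, taking the image directly as VARBINARY(MAX); all names here are illustrative.)
using System.Data.SqlTypes;
using System.Drawing;
using System.IO;
using Microsoft.SqlServer.Server;

public class ImageFunctions
{
    // Returns "WxH" for an image passed in as VARBINARY(MAX).
    [SqlFunction]
    public static SqlString GetImageDimensions(SqlBytes imageData)
    {
        if (imageData == null || imageData.IsNull)
            return SqlString.Null;

        using (MemoryStream stream = new MemoryStream(imageData.Value))
        using (Image image = Image.FromStream(stream))
        {
            return new SqlString(image.Width + "x" + image.Height);
        }
    }
}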
And regarding undesired behavior, SQL Server will display the reasons why the assembly can't load as either SAFE or EXTERNAL_ACCESS when you try to do it. For example:
CREATE ASSEMBLY [System.Drawing]
AUTHORIZATION [dbo]
FROM 'C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Drawing.dll'
WITH PERMISSION_SET = SAFE;
Results in:
Warning: The Microsoft .NET Framework assembly 'system.drawing, version=4.0.0.0, culture=neutral, publickeytoken=b03f5f7f11d50a3a, processorarchitecture=msil.' you are registering is not fully tested in the SQL Server hosted environment and is not supported. In the future, if you upgrade or service this assembly or the .NET Framework, your CLR integration routine may stop working. Please refer SQL Server Books Online for more details.
Msg 6218, Level 16, State 2, Line 1
CREATE ASSEMBLY for assembly 'System.Drawing' failed because assembly 'System.Drawing' failed verification. Check if the referenced assemblies are up-to-date and trusted (for external_access or unsafe) to execute in the database. CLR Verifier error messages if any will follow this message
[ : System.Drawing.BufferedGraphicsContext::bFillColorTable][mdToken=0x600013c][offset 0x00000053][found address of Byte] Expected numeric type on the stack.
[ : System.Drawing.BufferedGraphicsContext::bFillColorTable][mdToken=0x600013c][offset 0x00000043][found Native Int][expected address of Byte] Unexpected type on the stack.
[ : System.Drawing.BufferedGraphicsContext::bFillColorTable][mdToken=0x600013c][offset 0x00000027][found Native Int][expected address of Byte] Unexpected type on the stack.
[ : System.Drawing.Icon::ToBitmap][mdToken=0x6000349][offset 0x00000084][found unmanaged pointer][expected unmanaged pointer] Unexpected type on the stack.
[ : System.Drawing.Icon::ToBitmap][mdToken=0x6000349][offset 0x000000E4] Unmanaged pointers are not a verifiable type.
[ : System.Drawing.Icon::GetShort][mdToken=0x6000356][offset 0x00000002] Unmanaged pointers are not a verifiable type.
...
If you are not going to use any of those methods or types, then you likely will not have any issues. There is just no way to separate out the "safe" stuff from the "unsafe" items.
Another example of guilt-by-association, but even farther removed, is:
CREATE ASSEMBLY [System.Web]
AUTHORIZATION [dbo]
FROM 'C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Web.dll'
WITH PERMISSION_SET = SAFE;
Results in:
Warning: The Microsoft .NET Framework assembly 'system.web, version=4.0.0.0, culture=neutral, publickeytoken=b03f5f7f11d50a3a, processorarchitecture=x86.' you are registering is not fully tested in the SQL Server hosted environment and is not supported. In the future, if you upgrade or service this assembly or the .NET Framework, your CLR integration routine may stop working. Please refer SQL Server Books Online for more details.
Warning: The Microsoft .NET Framework assembly 'microsoft.build.framework, version=4.0.0.0, culture=neutral, publickeytoken=b03f5f7f11d50a3a, processorarchitecture=msil.' you are registering is not fully tested in the SQL Server hosted environment and is not supported. In the future, if you upgrade or service this assembly or the .NET Framework, your CLR integration routine may stop working. Please refer SQL Server Books Online for more details.
Warning: The Microsoft .NET Framework assembly 'system.xaml, version=4.0.0.0, culture=neutral, publickeytoken=b77a5c561934e089, processorarchitecture=msil.' you are registering is not fully tested in the SQL Server hosted environment and is not supported. In the future, if you upgrade or service this assembly or the .NET Framework, your CLR integration routine may stop working. Please refer SQL Server Books Online for more details.
Msg 6212, Level 16, State 1, Line 1
CREATE ASSEMBLY failed because method 'TypeDescriptorRefreshed' on type 'System.Windows.Markup.ValueSerializer' in safe assembly 'System.Xaml' is storing to a static field. Storing to a static field is not allowed in safe assemblies.
As you can see, System.Web is actually, by itself, fine for SAFE, but it has dependent assemblies and those are being auto-loaded. The first dependent assembly, microsoft.build.framework, also has no issues (at least none that can be verified, though it is possible that something disallowed in SAFE is there but can only be caught at run-time). But the second dependent assembly does have an issue that can be verified upon loading the assembly: it is "storing to a static field".
This is a problem for reliability more than security, because classes are instantiated one time (well, per App Domain, meaning: per-database, per-owner) and shared across the SPIDs that use them (which is why only static methods are accessible in SQLCLR). Hence, static class-level variables are technically sharing information between sessions (i.e. SPIDs), and that can very easily cause unexpected behavior. But at the same time, if you only want to use HtmlString.ToHtmlString(), then you probably aren't making use of System.Xaml. So why doesn't it just load System.Web as SAFE and System.Xaml as UNSAFE? Probably because code in SAFE assemblies is not allowed to call code in UNSAFE assemblies (at least not in SQLCLR).
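To make the static-field restriction concrete, here is a minimal sketch of user code (all names are made up): a writable static field fails CREATE ASSEMBLY under SAFE with the same Msg 6212 error, while a static readonly field is allowed.
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class StaticFieldDemo
{
    // Writable static state is shared across all sessions in the App Domain,
    // so a method that stores to it is rejected in a SAFE assembly (Msg 6212).
    private static int _callCount;

    // A static readonly field is fine in a SAFE assembly.
    private static readonly string Greeting = "hello";

    [SqlFunction]
    public static SqlInt32 NextCallNumber()
    {
        _callCount++;                     // this store triggers the verification error
        return new SqlInt32(_callCount);
    }

    [SqlFunction]
    public static SqlString SayHello()
    {
        return new SqlString(Greeting);   // reading static state is allowed
    }
}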
CONCLUSION
So is it OK to load UNSAFE .NET Framework assemblies? That really should come down to testing. Lots of testing (and not just a single thread on your dev box, but real testing). If everything behaves as expected, then you should be fine. But if something does not behave as expected, then it is not a bug that can be reported to Microsoft, because it has already been declared as unsupported.
EDIT:
And here is a more official answer to this question, which lists a few situations where problems could occur: Support policy for untested .NET Framework assemblies in the SQL Server CLR-hosted environment
