I have an Office 365 add-in that works as expected on Click-to-Run versions of Outlook; however, a customer with an MSI installation of Outlook 2016 experiences inconsistencies.
I would like to obtain a copy of the Outlook 2016 MSI so that I can do some testing locally.
It may be an issue with centralized deployment, but I would like to do some validation.
I've done web searches and contacted Microsoft support. Everything points to MSI versions no longer being available.
Maybe the following will help you.
Workaround:
To get an MSI version, you can either remote into the customer's machine or get an MSI copy from the customer (which may not be a feasible solution).
Understanding API requirement sets for Outlook.
I believe that when the add-in was developed, you were aware of the API requirement set it targets. The following applies to your case:
All Outlook APIs belong to the Mailbox requirement set. The Mailbox
requirement set has versions, and each new set of APIs that are
released belongs to a higher version of the set. Not all Outlook
clients will support the newest set of APIs when they are released,
but if an Outlook client declares support for a requirement set, it
will support all the APIs in that requirement set.
For example:
if you specify requirement set version 1.3, the add-in will not show up in any Outlook client that doesn't support a minimum version of 1.3.
You can use the isSetSupported method, as in the following code, to check whether the required set is supported:
if (Office.context.requirements.isSetSupported('Mailbox', '1.3')) {
    // Perform actions.
} else {
    // Provide alternate flow/logic.
}
official reference:
https://learn.microsoft.com/en-us/office/dev/add-ins/reference/requirement-sets/outlook-api-requirement-sets
From the table in that reference you can easily figure out which API set version works in which Outlook client.
Hope this gives you some idea of how to solve your problem.
I am trying to figure out whether automated deployments for Archer GRC are possible for the on-prem version.
Currently it is deployed manually.
The latest versions of Archer (v6.8, v6.9) provide a limited API to allow package deployments, BUT last time I checked it doesn't allow mapping and partial installs (I could be wrong, so double-check).
The API is there, but the functionality is limited to the point that I don't see how package installation can be automated via the provided API. I hope that in the next Archer versions it will be extended to replicate the functionality available with manual package deployment (mapping, partial installs, and other options).
Technically, if you like complex and time-consuming tasks, you can decode/parse the package installation page. Then you can write an application that replays the HTTP requests sent to the Archer server, simulating the package installation.
I'm not aware of any company doing something like this as of today.
If you write a product to implement proper Code/Configuration Version Control for RSA Archer, then you may be able to sell it as well :)
Good luck!
In short, I have Windows Server 2012 R2, AEM Forms (6.2), SQL Server (2014), and Workbench (6.2) on the same server. At first, when I installed and configured all of them, I could check my applications out and in from Workbench successfully. However, after my software team executed some scripts against the database, we cannot check in/out from Workbench anymore. The worst thing is that when I click check out, Workbench gives no error and no log entry, neither in the event log nor in the server application; it does nothing and doesn't perform my transaction. I saw on forums that some people have the same issue, but nobody has posted a solution.
Please, if anyone knows the solution, share it with us. What's wrong with my Workbench? What can I do to fix this issue?
The query that your software team ran turns off security on every single LiveCycle service and makes them run as the system user. This includes the services used by Workbench and is very bad. Some of the services rely on knowing who is logged in to operate correctly. In particular, how can LiveCycle know who has checked in/out a resource if the service always runs as system?
Your best bet is to restore the LiveCycle database, or at least the tb_sc_service_configuration table, to the state it was in before you ran the script.
If you need to remove security on individual services, you should do it through the admin console, but only do it for your own processes. Never do it for system services unless the Adobe documentation says it is OK.
As JeremyP pointed out, modifying the Adobe database directly is a bad idea. The database should be treated as a black box that is only manipulated by Adobe code (either by doing things in the Adobe tools or making calls to Adobe APIs).
You can either make security changes manually through the adminui (as he indicates, which is the most common way of doing it) or programmatically using the Adobe client APIs. See the following links for sample code that uses the APIs:
Removing Security - http://help.adobe.com/en_US/livecycle/10.0/ProgramLC/WS624e3cba99b79e12e69a9941333732bac8-7f35.html
Setting the runAs user - http://help.adobe.com/en_US/livecycle/10.0/ProgramLC/WS624e3cba99b79e12e69a9941333732bac8-7f38.html
My company, 4Point, offers AEM Forms consulting services. We have an in-house Apache Ant library that wraps the code above to automate this (and other) common tasks that are typically required when deploying (and redeploying) AEM Forms solutions. It can be included as part of a consulting engagement.
When submitting an Excel add-in to the Office Store, which version of the Excel API should be referenced in the manifest file?
We have experienced being rejected because we didn't refer to the newest version of the Excel API.
But if our Excel add-in supports an older version of the API, shouldn't we be referencing that?
There are several aspects to the versioning of the Office.js library.
First, there is versioning of the actual JavaScript source files. Fortunately, this part is pretty simple: you always want the latest and greatest of the production Office.js, which is conveniently obtainable through our CDN: https://appsforoffice.microsoft.com/lib/1/hosted/Office.js. The files are also shipped as a NuGet package to allow corporate firewalled development, but the NuGet may lag a few weeks behind the CDN -- and in any case, any Store-bound add-in is required to reference the CDN location. So, in terms of Office.js versions, there isn't really versioning there: there is simply the one and only evergreen, frequently-updated, always-back-compatible, Store-required CDN version.
(While on the subject of CDNs: we also have a beta CDN, available at https://appsforoffice.microsoft.com/lib/beta/hosted/Office.js. This one is great for testing newly-implemented-but-not-yet-officially-stamped-as-done features, which you'll find in our Open Specs: http://dev.office.com/reference/add-ins/openspec. However, any new APIs therein should be considered strictly "beta", and they may well be renamed, re-grouped, or postponed -- so your app should not rely on them, as we explicitly reserve the right to break back-compat on the beta branch. An API is not "done" until it is listed in IntelliSense and Documentation as complete, until it's available on the production CDN, and until its isSetSupported returns true -- more on that momentarily.)
The more interesting bit for versioning is the actual API capabilities that are offered by each host. The Office.js library will have the latest JS code to be able to run them, but older hosts might not be able to support some of the functionality. For example, if you look at https://dev.office.com/reference/add-ins/requirement-sets/excel-api-requirement-sets, you will see that the 2016 wave of Excel APIs -- grouped under the "ExcelApi" requirement set -- has had three versions to date: 1.1, 1.2, and 1.3. ExcelApi 1.1 was what shipped with Office 2016 RTM in September 2015; 1.2 shipped in early March 2016; and 1.3 shipped just recently, and is in the process of being rolled out to the CDN. Each API set version has a corresponding Office host version that supports it (for most APIs, there have to be both JS and host-side changes; it's fairly rare that something can be a purely-JS addition). The version numbers are listed in the table, and there are links below the table to find a mapping from build numbers to dates.
Each of the API set versions contains a number of fairly large features, as well as incremental improvements to existing features. The topic for each requirement set, such as the link above, provides a detailed listing of each of those features. And as you're programming, if you are using the JavaScript or TypeScript IntelliSense, you should be able to see the API versions for each of your APIs displayed as part of the IntelliSense.
You can use the requirement set in one of two ways. You can declare in the manifest that "I need API set ExcelApi 1.2, or else my add-in doesn't work at all" -- and that's fine, but then of course you aren't able to service older hosts, and so your add-in won't even show up there. Alternatively, if your add-in could mostly work in a 1.1 environment, but you want to light up additional functionality on newer hosts that support it, you can use the manifest to declare only the minimal API sets that you need (e.g., ExcelApi 1.1), and then do runtime checks for higher version numbers via the isSetSupported API. The question "Neat ways to get environment (i.e. Office version)" has a very detailed explanation of this isSetSupported API-checking.
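To make that concrete, here is a minimal sketch of the runtime check (the set name and version numbers are just the ExcelApi examples from above; substitute whatever your add-in actually requires):
if (Office.context.requirements.isSetSupported('ExcelApi', '1.2')) {
    // The host supports ExcelApi 1.2: light up the additional functionality.
} else {
    // Fall back to the ExcelApi 1.1 behavior declared in the manifest.
}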
Hope this helps!
By the end of last week our central IT department had introduced SCCM and applied it to a bunch of clients in our division. My colleagues and I work as so-called "IT partners", providing 1st-level support for a few hundred colleagues. Now we're facing some problems with our new SCCM system (installed packages do not work, etc.), and we'd like to "reset" applications so the SCCM agent will reinstall them. I've read something about detection methods, but unfortunately I don't really know how they work, nor do I know where those methods are stored. I want to analyse those methods so I know which file to modify/delete so that the agent will reinstall the application.
By the way, how much time does SCCM take from "assigning" a package to applying it on the client?
Assuming you only have the client and no access to the SCCM console, the detection methods can be found using WMI. They are stored in the namespace root\ccm\CIModels, in the class Local_Detect_Synclet.
The format is XML in one column, and it is designed so that all kinds of detection methods can basically be represented in the same style, so it's not very readable, but you should be able to get some basic understanding of the detection method used.
Keep in mind this is only true if the software was deployed in the "new" application format (introduced in SCCM 2012) and not in the "old" package/program format.
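If it helps, here is a minimal WSH/JScript sketch of reading that class on the client. It uses only the namespace and class named above; whether it needs elevation depends on your client's WMI security settings. Run it with: cscript //nologo detect.js
// detect.js -- dump the detection-method instances stored by the SCCM client.
var services = GetObject("winmgmts:\\\\.\\root\\ccm\\CIModels");
var synclets = services.ExecQuery("SELECT * FROM Local_Detect_Synclet");
var e = new Enumerator(synclets);
for (; !e.atEnd(); e.moveNext()) {
    // GetObjectText_() prints every property of the instance, including the
    // XML column that describes the detection method.
    WScript.Echo(e.item().GetObjectText_());
}
The output is the raw text of each instance; the XML property inside it is what you inspect to understand the detection rule for a given application.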
If you want more detail: I once tried to automate the process of triggering a reinstall for any given application, but ultimately failed due to problems with the cache/distribution point. I posted all my findings here.
So, from an application point of view: when you deploy an app, a detection method is set up in SCCM to determine whether or not the application installed successfully. This detection method can be configured in a variety of ways. For example, it could check whether the MSI product code is installed to determine success, check the .exe and compare it to a specific version, or even check for the existence of a registry key. In order to change/modify these detection methods you need to be an SCCM admin and be able to log in to the console. From there you would select the specific application or package you want to analyze and click through the properties of the deployment.
Is it possible to customize DNN 8 modules and skins? Is it possible to configure DNN 8 and use it in VS 2010 with .NET Framework 4.0? If it is, let me know the steps, because I have configured the DNN 8 site in IIS 7 and it works well from there, but when I try to load it into VS 2010 and build it, I get different errors.
Errors:
i) Unknown server tag 'dnn:DnnCssIncludes' - which was resolved by adding one line for the dnn tag in the same file.
ii) After resolving the previous error, the next error was about ckFinder, and it was resolved by adding the ckFinder.dll file to the bin folder.
iii) After resolving the previous issues, a new error is generated for ckEditor. It shows me the following error message:
The type or namespace name 'Ventrian' could not be found (are you missing a using directive or an assembly reference?)
I have tried to resolve it and searched for a solution, but failed to find one. Will anyone let me know the fixes for this?
Yes, this is possible. You will want to do a couple of things:
Set up your environment
Open the project for whatever you are modifying; this typically involves installing the SOURCE package of the extension you want to modify.
Don't change "core", meaning don't change "DNN" itself. You can, it's open source, but once you do, you are forked, and upgrading to new releases of DNN is very difficult if you aren't careful.
Setting up your environment
From http://www.christoc.com/Tutorials/All-Tutorials/aid/1
Setting up your development environment can vary based on what your end goal is. If you are doing module development for your own use, and within your own DNN environments, you can ignore a few of the settings below. If you are doing module development with the idea that you might turn around and give the modules away, or sell them, then you will likely want to follow the guidelines set forth below to support the widest array of DNN installation environments.
I recommend that each developer have their own local development environment, with a local IIS website running DotNetNuke, and a SQL Server 2008/2012 (not express, though you can use it) database for the website. Having an individual development environment makes group module development far easier than if you share environments/databases.
Choosing a DotNetNuke Version
Choosing a version of DotNetNuke is important when you start your development, for a couple of reasons. For modules that you are developing for yourself, you need to ask: what is the minimum version of DotNetNuke that you have in production? Are you running DNN 5.6.1? Are you running 6.2.6, 7.0.0, 7.0.6? Based on the answer, you can determine what version of DNN you should set up as your development environment. You shouldn't be developing on a newer version of DNN than what you have running in production. As with everything there are ways around this, but I am not going to go into the details on that in this tutorial.
As a developer working to create modules and release them, you might have production sites that are running on the latest and greatest version of DNN, but what about your customers? Or your potential customers? You have to ask yourself: do you want to provide support for really old versions of DotNetNuke? From a development perspective you will probably say no, but from a business perspective you might say yes, and here's why. Not everyone upgrades DotNetNuke websites as they should, and oftentimes you will find that some people never upgrade. While I don't advise taking that approach to managing a DotNetNuke website, it is a fact of life that people don't always upgrade, and there are thousands of people, if not tens of thousands, that have sites that aren't running on the latest version of DNN. You should take that into account when you are doing your module development: if you compile your module against an older version of DNN, then your module should run on newer versions as well. For example, if you compile your module against DotNetNuke 6.2.6, it will likely run on every version of DNN released since then. There are edge cases where this won't always work; DNN strives to maintain backwards compatibility, but this isn't always possible.
You might also want to use features that are only available starting with a specific version of DotNetNuke, such as the workflow functionality found starting in DNN 5.1, in that case you may choose not to support older versions of the platform out of necessity. This will minimize the market in which you can sell your modules, but also can make for less support and an easier development cycle due to the features that DNN provides.
Choosing a Package
Now here’s one that may baffle you a bit. I’m going to recommend that you use the INSTALL package for whatever version of DotNetNuke that you download. What? The INSTALL package? What about the SOURCE package? Well you can use the source, but you don’t need it. The module development that I’m setting you up for doesn't require the DNN source, and using the INSTALL package makes your development environment cleaner. We aren't going to be opening the DotNetNuke project when we do our module development, so why have the files sitting around for nothing? Also, if you've ever tried to use the SOURCE package for anything, you'll know it isn't easy.
The steps for setting up your development environment will apply to both the Community and Professional editions of DotNetNuke.
Installation Configuration
Once you have the version selection out of the way you can go through the installation process. While I’m not going to walk you through the minutest of details of each step of installing DotNetNuke in this post, I will at least try to point you in the right direction for each step.
Download the INSTALL package of the version of DotNetNuke you want to use in your development environment.
Extract the files in the INSTALL package to a location of your choosing; this location is where you will point IIS (the web server) when we configure the website. In my environment I typically use c:\websites\dnndev.me\ (One item of note: you may need to right-click on the ZIP file and choose Properties before extracting; on the Properties window, if you have an UNBLOCK option, click that. Some versions of Windows have started blocking files within the DotNetNuke ZIP files, which will cause you problems later during the actual install.)
Setup IIS
IIS is the web server that comes with Windows. DNN 7 requires IIS 7 or later (7, 7.5, 8.0), so you will need at least Windows Vista, Windows 7, Windows 8, Windows Server 2008 R2, or Windows Server 2012.
In IIS you should create a new website (Note: If you use an existing website in IIS be sure to add the HOST binding for DNNDEV.ME), and point to the folder where you extracted the INSTALL package.
Note: With DotNetNuke 7.0+, .NET Framework 4.0 is required, so be sure that your application pool is configured to run under 4.0, and not 2.0.
Set File Permissions
Setting up the file permissions for your DNN install is often the step that causes the most trouble. You should right-click on the FOLDER into which you extracted DNN (c:\websites\dnndev.me) and choose Properties. Choose the Security tab. You need to add permissions for the account under which your website's application pool is running. You will want to set up the permissions to give the account Full or Modify permissions for the DNNDEV.ME folder. Which account you will use varies based on your version of IIS; here's a simple list of some of the default accounts based on the version of IIS.
IIS Version | Operating System | Account
IIS 7 | Windows Vista, Windows Server 2008 | localmachine\Network Service
IIS 7.5 | Windows 2008 R2, Windows 7 | IIS AppPool\APPPOOLNAME
IIS 8 | Windows 2012, Windows 8 | IIS AppPool\APPPOOLNAME
Note: If you are using IIS 7.5/8.0 you'll notice in the above table that we have APPPOOLNAME in the identity; this is because when you set up a new website in IIS, a new application pool is created. In place of APPPOOLNAME you should type in the name of the application pool that was created. You can also bypass this and configure your application pool to use the Network Service account instead of a dynamic account if you would like.
Database Configuration
In SQL Server you should go through and create a new database. I always create a database with the same name as the website, so in this case DNNDEV.ME. Once you have created the database, create a user that can access that database. I always use SQL authentication, turn off the enforce password requirements, and give the user DB Owner and Public access to the DNNDEV.ME database. Remember the username and password you create here as you will need them when you walk through the Installation screen for DotNetNuke.
DotNetNuke Installation Screen
Populate the installation screen with the standard DNN information, Host username, password, etc. For the Database option, choose Custom and configure your database connection, providing the Server IP/Name, the Database name (dnndev.me). For the database authentication you'll want to choose the option that allows you to enter the username/password for the database user that you created previously.
Now there are two additional options you can configure. Normally I would tell you not to modify these, but from a development environment perspective I do recommend that you change the objectQualifier setting. It should be blank by default; you should type in "dnn" (without quotes), which will prepend "dnn_" to all of the objects that get created by DNN, such as tables and stored procedures. This is not something I recommend from a production standpoint, but if you are developing modules for sale, then supporting objectQualifier in your development is recommended. It will save you time down the road if you have a customer who has an objectQualifier defined on their production databases.
DotNetNuke Module Development
To get started with your DNN module development, be sure to read our tutorial on how to install our Module Development Templates.
Next, set up the Visual Studio templates (you'll want to use VS 2015) and create a project.
You can find the templates here https://visualstudiogallery.msdn.microsoft.com/bdd506ef-d5c3-4274-bf1d-9e673fb23484
Download that and run the VSIX package installer, or search through the online templates for DotNetNuke. Then watch this video: https://www.youtube.com/watch?v=kOoQJDeTlJ0&list=PLFpEtny5sIbb9jGxJ7RBM5hIizodOCtoj&index=1