We have an automated test suite, using Borland Silk Test 2008 R2 to carry out regression tests of a new in-house product.
The test script consistently refers to controls by their index:
Form.Control3 ...
We've made a "minor" change to the main form of the application, and now the control that used to have index 3 has index 4.
The easy, but tedious, fix is to edit the scripts to reference Control4 instead of Control3, but this remains pretty brittle.
How do we identify the controls by name instead, so that rather than referencing Control3 we specify "the control named ribbon"?
(We believe that referencing things by name will be significantly less brittle.)
We've tried the obvious:
Form.ribbon
which doesn't execute at all.
The primitive intellisense in the editor doesn't show much of use - no Controls property, no GetXX or FindXX methods.
Our application is written using C# on .NET 3.5, and does make use of third party controls.
SilkTest usually stores the information to locate the controls in your application in an .inc file. The
Form.Control3 ...
reference you mentioned points to the structure in that .inc file. When your application changes, you should be able to adapt your test scripts by simply updating the entry in the .inc file.
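As a sketch of what that .inc entry might look like (all names here are hypothetical, and the tag syntax assumes SilkTest exposes the .NET control's Name property as its window ID, which is worth verifying against your recorded declarations):

```
// Hypothetical .inc declaration: the window and tag names are assumptions.
// The $-prefixed tag assumes SilkTest exposes the .NET control's Name
// property as its window ID.
window MainWin Form
    tag "My App Caption"

    CustomWin ribbon
        tag "$ribbon"    // name-based tag instead of the index tag "#4"
```

With a declaration like this, scripts can reference Form.ribbon, and index shuffles in the form no longer break them.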
I am in the process of migrating an existing Adobe Analytics implementation on s_code version 27.5 to DTM. The first step of the migration, and what is in scope of the project, is a pick-up-and-shift job of the current s_code into Adobe DTM.
The site has multiple JS files containing functions that need the 's object' to be initialised to work; however, s is only initialised inside the s_code contents after most of these functions have run, so they throw 's is not defined' errors. It is not being initialised globally as it would be in a standard implementation.
Is there a way I can initialise 's' globally in the DTM satellite library? I have tried adding var s = {}; in a page load rule under the third party/custom tags area, but I am only having intermittent luck with it; sometimes errors are still thrown.
Any support/insight into this issue would be most appreciated.
Thanks!
Step 1: Change the Code Configuration to Custom
Note: If you migrated your legacy H code to DTM as a tool, then you should already be familiar with this step and have already done it, since DTM does not allow you to specify legacy H code for the "Managed by Adobe" option.
In the Library Management section of the Adobe Analytics tool, change the Code Configuration option to Custom, and Code Hosted to In DTM.
If you are using the legacy H code library, then you must also check the "Set report suites using custom code below" option. If part of your migration to DTM was to move to AppMeasurement library, checking this option is optional, depending on how you want to handle report suite routing.
Then, click the Open Editor button to open the editor. You should see the Adobe Analytics library in the code box. If you are migrating legacy H code, then remove everything currently in the box and add your legacy H code library (which you should have already done based on the question).
Step 2: Instantiate the s object
If you are using the legacy H code, then add the following line to the top of the code box, above the library:
window.s = s_gi("[report suite id(s)]");
You will need to replace [report suite id(s)] with the report suite id(s) you want to send the data to. s_gi() requires a value to be passed to it, which is why you must check the checkbox above.
If you are using AppMeasurement library, then add the following line to the top of the code box, above the library:
window.s = new AppMeasurement("[report suite id(s)]");
If you checked the "Set report suites using custom code below" checkbox, then specify the report suite(s). If you did not check it, then do not pass anything to AppMeasurement(). Alternatively, you can pass nothing but also add the following underneath it:
s.account="[report suite id(s)]";
Note however in step 3 you will be setting it in doPlugins anyway so you don't really need this here (I just added this sidenote for other readers who may be migrating AppMeasurement s_code.js to DTM).
Note: Aside from the window.s part, you should already be familiar with this line of code, and already have logic for populating report suite(s), coming from a legacy implementation. Specifically, you may be using the dynamicAccountXXX variables. If you are upgrading to AppMeasurement library, then you will need to write your own logic to simulate that, since AppMeasurement (for reasons unclear to anybody) does not have this functionality.
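As a rough sketch of what that hand-rolled logic could look like (the hostnames and report suite ids below are placeholders, not values from the question):

```javascript
// Hedged sketch: pick report suite(s) by hostname, simulating the legacy
// dynamicAccountList behavior that AppMeasurement lacks.
// The hostnames and report suite ids are placeholders.
function chooseReportSuites(hostname) {
  var rules = [
    { match: /(^|\.)example\.com$/, suites: "myprodsuite" },
    { match: /./, suites: "mydevsuite" } // fallback for everything else
  ];
  for (var i = 0; i < rules.length; i++) {
    if (rules[i].match.test(hostname)) {
      return rules[i].suites;
    }
  }
}

// e.g. window.s = new AppMeasurement(chooseReportSuites(location.hostname));
```

The first matching rule wins, so the catch-all fallback goes last, mirroring how dynamicAccountList fell through to the default suite.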
Step 3: Setting report suite(s) after page load
One of the many caveats about implementing Adobe Analytics as a tool is that DTM (for reasons unclear to anybody) creates a new s object whenever an event based or direct call rule is triggered and AA is set to trigger. In practice, this means almost all of the variables you set within the custom code boxes in the tool config will not be carried over to subsequent AA calls on a page - report suite(s) being one of them.
What DTM does for report suite is set it to the specified Production Report Suite(s) if DTM is in production mode, or Staging Report Suite(s) if in staging mode. Even if you enabled the "Set report suites using custom code below" option!
To get around this, you will need to include doPlugins function (and usePlugins) in one of the tool's custom code boxes if you don't already have it included (you almost certainly do, coming from a legacy implementation), and you will need to also assign the report suite(s) within it (doPlugins and usePlugins do get carried over now).
For legacy H library, within doPlugins, add the following:
s.sa("[report suite id(s)]");
Note: setting dynamicAccountXXX variables within doPlugins will not work. You will need to write your own logic for passing the report suite(s) to s.sa().
For AppMeasurement library, within doPlugins, add the following:
s.account="[report suite id(s)]";
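Putting step 3 together, a minimal sketch of the doPlugins wiring might look like this (the report suite id is a placeholder; in the DTM custom code box the s object already exists, so the stub on the first line only keeps the sketch self-contained):

```javascript
// Stub so the sketch runs standalone; inside DTM's custom code box,
// the s object already exists and this line leaves it untouched.
var s = s || {};

s.usePlugins = true;
s.doPlugins = function (s) {
  // AppMeasurement library: re-assert the report suite(s) on every call,
  // because DTM re-creates the s object for event/direct call rules.
  s.account = "myreportsuite"; // placeholder id
  // Legacy H library would instead use: s.sa("myreportsuite");
};
```

Because doPlugins runs on every tracking call, the report suite assignment survives DTM's re-created s objects, which is exactly the caveat described above.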
General Notes:
In the Library Management section, setting Load library at Page Top will load the library synchronously at the position where you put your DTM Header tag, which is the earliest you can trigger it through DTM. However, this is no guarantee the library will be loaded before other scripts that reference it are executed (e.g., references to the s object in some other script tag above the DTM Header script will still give you a reference error).

If you are indeed still using the legacy H library, then I would recommend your highest priority be to migrate to the AppMeasurement library. Even higher priority than migrating the code to DTM, IMO.

While I echo Mark's sentiments about implementing AA code as a 3rd party tag in general, the sad truth is that in practice it may still be your best option at the moment, depending on your exact scenario. DTM currently has too many caveats, shortcomings, and outright bugs that can make it impossible to implement AA as a tool in DTM, depending on your exact implementation requirements. Particularly when it comes to making AA integrate with certain common 3rd party tools, and even some of Adobe's other tools!
You will be better off if you migrate completely to DTM for analytics deployment rather than trying to reference the s object from legacy H page code.
If migrating completely from H-code to DTM is an option, I would do the following:
Remove all H page code and any references to s_code
Remove all calls to s.t or s.tl on links or pages
Deploy DTM Header / Footer code on all pages
Within DTM, Add the Adobe Analytics Tool
Within DTM, Add the Adobe Marketing Cloud ID Service
Within DTM and the "Custom Page Code" of Adobe Analytics tool, create the "do_plugins" section and add any custom plugins from the H-code.
Following these steps will allow the s object to be created within DTM and allow for all other rules to use it correctly.
What I would not do:
Deploy H-code (s_code) as a third-party script and try and reference the s object outside of the Adobe Analytics tool. This is not efficient and doesn't allow you to get the best practices from DTM, IMO.
Mark
One of the issues noticed using DTM to implement Adobe Analytics was the s object being undefined.
The reasons are very much unclear. There is a workaround that I used: reminding DTM to set the s object again, in cases where DTM does not recognize what needs to be done.
var s = _satellite.getToolsByType('sc')[0].getS();
For my implementation, we used a third-party JavaScript tag set within a direct call rule, and the above code was placed within it.
The solution worked great.
Let's say that I have a simple WPF or Winforms solution. To that solution I add a new project (based on a class library template , which I then reference in the main project) which is intended to be a data layer containing an entity framework data model. When I create the data model in the new project the connection string that it uses gets added to the app.config file of the main project in the solution.
Now let us say that I want to add two more projects to the solution (both of which will again be based on class libraries) to contain details of WCF services that I wish to use. In each case I add the WCF service by using the ADD Service Reference option from the right click context menu of the projects.
Unlike the data model project, though, the bindings for the service model get added to the local project's app.config file as opposed to the app.config file of the main start-up project.
Should I simply copy those bindings to the start-up project's app.config file, or should I copy and then delete them, or should I in fact be doing something completely different? Thus far, trying combinations of the first two approaches, I get error messages connected with endpoint configuration; however, my knowledge of WCF is not really good enough to fully understand the MSDN articles that the error list points me to.
Note that if the service references are added to the main project I get no errors whatsoever, so I figure this must be a configuration problem of some description.
Would anyone be able to provide the correct procedure for adding projects that essentially contain no more than a WCF service reference to an existing Visual Studio solution?
Edit
The screenshot below shows my main app.config file after having copied over the bindings configurations from the two service contracts. I'm not sure whether I should have commented out the part that I did; I had thought that by doing so I might get rid of the blue squiggly underlines telling me the following (which I must admit to not understanding):
Warning The 'contract' attribute is invalid - The value 'ErsLiveService.IERSAPIService' is invalid according to its datatype 'clientContractType' - The Enumeration constraint failed.
You're likely getting the blue squigglies because the namespace ErsTestService is defined within the project in which you created the service reference. If the root namespace of that project is MyServiceReferenceProject then try changing the namespace to MyServiceReferenceProject.ErsTestService.IERSAPIService.
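For example (all names here are hypothetical, adapted from the warning above), the contract attribute in the start-up project's app.config would need the fully-qualified type name:

```xml
<system.serviceModel>
  <client>
    <!-- Hypothetical names: the contract must be the fully-qualified type,
         including the root namespace of the project holding the reference -->
    <endpoint address="http://example.com/ERSAPIService.svc"
              binding="basicHttpBinding"
              bindingConfiguration="BasicHttpBinding_IERSAPIService"
              contract="MyServiceReferenceProject.ErsLiveService.IERSAPIService"
              name="BasicHttpBinding_IERSAPIService" />
  </client>
</system.serviceModel>
```

The address, binding name, and endpoint name should match whatever the service reference generated in the class library's own app.config; only the contract typically needs the namespace qualification.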
I have a Silverlight solution that has multiple silverlight projects (Views) that all compile to their own .Xap file.
There is one "master" project that handles the dynamic downloading of the Xap files, which works pretty well.
But now I need to make sure that all the references are set to CopyLocal=false in all the View Projects. Only the "master" project can have CopyLocal=true.
This means that the Xap files generated by the Views stay rather small.
What I would like to do is check, during or after the build process, whether any of the View projects have a reference with CopyLocal=true.
What would be a smart way of doing this? Using an external tool in the Post Build event? Or perhaps an addin for Visual Studio ? Or creating a macro in Visual Studio for this?
I have looked at using .extmap with assembly caching, but since you have to specify the assemblies in that, this does not solve my problem. I just need to know if there is a reference with the wrong setting and report that. Fixing it is not the question, that will still be done manually. It's just the notification I need.
The solution has 35 projects now, so I don't want to check them all by hand every time.
I found a question similar to this one, but it lists msbuild as a possible solution. I would like to know if there is a way to do this using "code" (be it prebuilt in a tool/addin or otherwise)
I have chosen to go the Addin path. I created an addin that listens to : BuildEvents.OnBuildBegin
Whenever that event fires I create a list of all projects in the current solution. Doing a bit of recursive searching since there are also Solution folders that make life in DTE world a bit harder.
Then I loop through all the projects and cast them to a VSProject so I can loop through all the references.
Any time I come across a reference that is wrong, I create an ErrorTask where I set the Document property to the full solution path of the reference. To do this I build the path for the project this reference is in, all the way up to the root of the solution.
The ErrorTask is then sent to an ErrorListHelper class I created, that handles the ErrorTasks and also performs navigation.
If I'm done with all the projects and I found any errors, I cancel the current build and show the Error List window, where my ErrorListHelper holds all the Reference Errors I created.
Whenever I want to navigate to the Reference in question, I activate the Solution Explorer window and get the root of it using an UIHierarchy.
Then I walk the path from the root on down, step by step, using the UIHierarchy to get to the UIHierarchyItems and expand them. Until I get to the deepest level (the reference) and I Select that.
Since I only need it for a certain solution and within that solution for certain projects (.Views.* and .ViewModels.*) I also have some checking for those in place during buildup of the Error List.
It works like a charm; it already found 12 "wrong" references in 35 projects where I thought all was well.
I am using a different path now to do this. I have a base class that I can use to write unit tests that have access to the DTE2 object. This way I don't need an addin. This also works for Silverlight projects, since the test class does not actually need access to the Silverlight projects; just being in the solution is enough to be able to iterate through the projects and check the references.
I am new to both WPF and WCF, and have a WPF app that has a service reference to a WCF one. I have all sorts of files created under Service References/MyService. I am not so sure which need to go into source control and which don't.
I have a .disco, a .datasource, a .wsdl, 3 .xsds, 2 configuration.svcinfos, a Reference.cs, and a Reference.svcmap.
I assume most are generated, yet I don't know which belong to source control and which do not.
Put all of them under source control, why not?
It's part of your code and it's needed to compile the project. If you use an automated build system, then you don't want that script to generate this code again, right?
As a bonus you'll get a history of changes to your service interface, could be useful too.
All of those files are source files, so they all belong under source control.
How about adding all of them to the source control in the first instance and then remove those that never change later?
Within a Silverlight library, I need to validate incoming XML against a schema. The schema is composed of 5 interdependent .xsd files; the main file uses "xs:import" to reference all of the others, and there are other references among them.
Assuming that the .xsd files need to be distributed with the library (i.e. not hosted on some well-known external URL), how should I structure my project to include them?
I have found that I can embed them in the library project with build type "Resource" and then load them (individually) using Application.GetResourceStream() and a relative URI with the ";content" flag in it. But if I take this approach, can I validate against the interdependent set of 5 files? What happens when the schema parser tries to resolve the interdependencies?
Or should I embed them with build type "Content" and access the main one with some other sort of URL?
Or???
To summarize: how should I use these 5 .xsd files in my project so that I will be able to validate XML against them?
EDIT: It's not clear whether it's even possible to validate in Silverlight. I spun off a related question.
I cannot say much about Silverlight limitations with respect to validation, but the question itself is more generic - one might want to store .xsd files as resources in a desktop .NET application, for example - so I will answer that part.
You can have full control over resolution of URIs in xs:import by means of the XmlSchemaSet.XmlResolver property. Just create your own subclass of XmlResolver, override the GetEntity() method, and implement it using GetResourceStream(), or GetManifestResourceStream(), or whichever other way you prefer.
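A minimal sketch of such a resolver, assuming the five .xsd files are embedded resources under a hypothetical "MyLib.Schemas" naming convention:

```csharp
using System;
using System.IO;
using System.Net;
using System.Reflection;
using System.Xml;

// Hedged sketch: serve xs:import references from embedded resources.
// "MyLib.Schemas" and the resource naming convention are assumptions.
class EmbeddedSchemaResolver : XmlResolver
{
    public override ICredentials Credentials
    {
        set { } // not needed for embedded resources
    }

    public override object GetEntity(Uri absoluteUri, string role, Type ofObjectToReturn)
    {
        // Map the file name of the requested schema onto an embedded resource.
        string name = Path.GetFileName(absoluteUri.LocalPath);
        return Assembly.GetExecutingAssembly()
                       .GetManifestResourceStream("MyLib.Schemas." + name);
    }
}
```

Wire it up before adding the main schema, e.g. set schemaSet.XmlResolver = new EmbeddedSchemaResolver(); and then call schemaSet.Add(null, XmlReader.Create(mainSchemaStream)); the imports among the five files are then resolved through GetEntity().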