Omniture: capture a traffic variable without associated visit or page metrics

I need to track a single traffic variable (a "prop") on a 3rd party site that is embedding our content.
We don't want the Omniture call to increment a visit, page view, or any other metric that is normally captured by the s_code library.
The 3rd party site on which we're capturing the prop variable will need to track this in the client, so using the data insertion API is not an option.
How can I achieve this? Would an s.tl() call do this, i.e. not count a visit, etc.?

Using an s.tl() call would not work. Even though that call does not increment a page view, any "first" call made will register a visitor and a visit. What I mean is that even if a standard visit begins with an s.tl() call rather than an s.t() call, whichever one happens first will start the visitor and visit counts.
What you are looking for is called a Light Server Call. This allows you to fire a call and capture some variables without registering a page view, visit, or visitor. Plus, Light Server Calls cost much less than standard server calls. You will need to work with Client Care to get it set up and implemented.

I was able to get this working by making an image request that looks something like:
<img src="http://somersid.122.2o7.net/b/ss/somersid/1/H.XX.X--NS/0?c[some_prop]=data" height="1" width="1" alt="" />
This is the first I've heard of Light Server Calls. Can't find anything about them in the documentation. Thanks.
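For reference, here is a minimal client-side sketch of firing that same kind of 1x1 image request from JavaScript. The report suite ID ("somersid"), namespace, and code version in the URL are copied from the example above, and the prop parameter ("c25") is only an illustration; swap in your own values.

    // Fire a bare image request that carries only a prop, with no s_code involved.
    // Report suite, namespace and code version are placeholders from the URL above.
    function firePropOnlyBeacon(propParam, value) {
      var cacheBuster = new Date().getTime(); // avoid browser/proxy caching
      var url = 'http://somersid.122.2o7.net/b/ss/somersid/1/H.XX.X--NS/' + cacheBuster +
                '?' + encodeURIComponent(propParam) + '=' + encodeURIComponent(value);
      var img = new Image(1, 1);              // 1x1 pixel, never added to the DOM
      img.src = url;
    }

    firePropOnlyBeacon('c25', 'partner-embed'); // e.g. prop25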

Related

Pagination for the List Secrets action in Logic Apps

I am using the List Secrets action to get all the secrets from Key Vault. I am only able to get the first few values, as pagination is not working for this action. Is there any other way I can get all the secret values from the Logic App? Right now I can only get the first page of values, and as per Microsoft there is a limitation of a maximum of 25 items.
I've managed to recreate the problem in my own tenant and yes, it is indeed an issue. There should be a paging option in the settings but there's not.
To get around this, I suggest calling the REST API's directly. The only consideration is how you authenticate and if it were me, I'd be using a managed identity to do so.
I've mocked up a small example for you ...
The steps are ...
Create a variable that stores the nextLink property. Initialise it with the URL for the first call to the REST API; it looks something like this ... https://my-test-kv.vault.azure.net/secrets?maxresults=25&api-version=7.3 ... and is straight out of the doco ... https://learn.microsoft.com/en-us/rest/api/keyvault/secrets/get-secrets/get-secrets?tabs=HTTP
In the HTTP call, use the Next Link variable, given that it will contain the URL. As for authentication, my suggestion is to use a managed identity. If you're unsure how to do that, sorry, but it's a whole other question. In simple terms, go to the Identity tab on the Logic App and switch the system-assigned status to on. You'll then need to assign it access in the Key Vault itself (Key Vault Secrets User or Officer will do the job).
Next, create an Until action and set the left-hand side to the Next Link variable, with the right-hand side being the expression string(''), which checks for a blank string (that's how I like to do it).
Finally, inside the loop, set the Next Link variable to the nextLink property in the response from the last call; the expression is ... body('HTTP')?['nextLink']
From here, you can choose what you do with the output. I'd suggest creating an array and appending all of the entries to it so you can process them later. I haven't taken the answer that far, given I don't know exactly how you want to process the results.
That should get you across the line.
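If it helps to see the same nextLink loop outside the designer, here is a rough Node.js sketch of the pattern (Node 18+ for the built-in fetch). It assumes you already have a bearer token for the vault, e.g. from a managed identity, and only illustrates the follow-the-nextLink part; the vault name is the one from the example URL above.

    // Follow the Key Vault "get secrets" pages until nextLink comes back empty.
    async function listAllSecrets(vaultName, token) {
      let nextLink = `https://${vaultName}.vault.azure.net/secrets?maxresults=25&api-version=7.3`;
      const secrets = [];
      while (nextLink) {
        const res = await fetch(nextLink, {
          headers: { Authorization: `Bearer ${token}` }
        });
        const body = await res.json();
        secrets.push(...(body.value || []));   // accumulate this page
        nextLink = body.nextLink || '';        // empty string ends the loop, same as the Until condition
      }
      return secrets;
    }

    listAllSecrets('my-test-kv', process.env.KEYVAULT_TOKEN)
      .then(all => console.log(`Fetched ${all.length} secret items`));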

How to make JSON load faster with large data (over HTTP or in a web page)

Requesting the page (over HTTP or in a web page) is very slow, or even crashes, unless I load my JSON with less data. I really need to solve this since sooner or later I will be using large amounts of data frequently. Here is my JSON data --->>>
Notes:
1. The JSON contains only strings and integers.
2. I view my JSON as a tree using the JSONView plugin for Google Chrome.
I am using Angular and Node.js. Thanks.
A quick summary of all the things that come to mind:
I had a similar issue once. My solutions may require changes to the UI.
Pagination
I doubt you can display that much data at one time, so the strategy should be to divide your data into small chunks and only load more when the client asks for it.
This way, the whole dataset is no longer held in RAM as it is currently. This is how forums work (only 20 topics at a time).
Just imagine if StackOverflow made you load the whole history of questions on the main page; how many GB would your browser need just for that?
You can use pagination in the classic way (buttons with page numbers, like Google) or with infinite scroll, whichever you prefer.
For that you need to adapt your API and keep track, in your front end, of which pages you have already loaded at any moment. There are plenty of examples of this in AngularJS.
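As a rough illustration of that idea (the endpoint and field names are made up), the server returns one slice of the data per request and tells the client whether there is more, so the front end only asks for the next page when the user wants it:

    // Server side (Express): return one page of the data per request.
    const express = require('express');
    const app = express();
    const allItems = require('./data.json');      // the big JSON you currently send whole

    app.get('/api/items', (req, res) => {
      const page  = parseInt(req.query.page, 10)  || 0;
      const limit = parseInt(req.query.limit, 10) || 20;
      const start = page * limit;
      res.json({
        items: allItems.slice(start, start + limit),
        hasMore: start + limit < allItems.length   // tells the client whether to ask again
      });
    });

    app.listen(3000);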
Only show the beginning of the data
When you look at Facebook comments, you may see a "show more" button. In their case, maybe it's there to not break the UI, but it can also be used to avoid loading too much data at once.
You can display only the main lines of your data (titles or similar) and add a button so the user can load more details if they want.
In your data model, the cost seems to be in the second level of "C". Just load the data up to that second level, and download the remaining part (for a given object) only if the user asks for it.
Once again, there's no need to overload things: your client's RAM will be thankful, and so will their mobile 3G connection.
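Continuing the Express sketch above (the "id", "title" and "C" field names are assumptions based on the question), you can serve only the summary fields up front and fetch the heavy second level on demand:

    // A light "summary" endpoint plus a per-item detail endpoint.
    app.get('/api/items/summary', (req, res) => {
      res.json(allItems.map(item => ({ id: item.id, title: item.title })));  // list-view fields only
    });

    app.get('/api/items/:id/details', (req, res) => {
      const item = allItems.find(x => String(x.id) === req.params.id);
      res.json(item ? item.C : null);              // the heavy nested part, one item at a time
    });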
Optimize your data structure
If this is still not enough:
As StefanArya said in a comment, remove the "I" attribute, which is redundant with the JSON key; you can use Object.keys() to get the key name instead.
You also may not need that much precision on your floats.
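To make that concrete, a tiny before/after sketch (the field names are only an illustration):

    // Before: each value repeats its own key in an "I" attribute.
    var before = { "temp": { "I": "temp", "V": 21.4837262 } };

    // After: drop "I" and round the float; the key is still available via Object.keys().
    var after = { "temp": { "V": 21.48 } };
    Object.keys(after).forEach(function (key) {
      console.log(key, after[key].V);              // "temp 21.48" -- no redundant "I" needed
    });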
If I think of any other ideas, I'll edit this post later.

How to cut down API requests in AngularJS app

My problem is I'm making too many API requests, which I want to cut down if possible. Below I'll describe the situation:
I have three pages, all linked using ngRoute. Like this:
Page A: Teams (list of teams)
URL: "/teams"
Page B: Team Details (list of players)
URL: "/teams/team-details"
Page C: Player Details (list of player stats)
URL: "/teams/team-details/player-details"
Page A is populated by pulling an array of the teams from an API very easily using a simple $resource.query() request, and using ng-repeat to iterate through them.
Page B is populated by calling an HTML template and populating specific fields with values from a separate API request to the /team-details endpoint, taking the team_id value from the clicked element on Page A.
Page C (as with page B) takes a player_id from the clicked player on Page B and calls the /player-details endpoint using that value. This is yet another separate request.
This all works fine, but as you can imagine, a single user could quite easily rack up in excess of 100 API requests within an hour.
I have a request limit of 1000/hour, so if a mere 10 users are online at the same time, it could easily exceed my limit and shut down my API.
If I could access the API as one single master endpoint that outputted all data and subdata in one set, then that would solve my problem, but since I need to request separate endpoints I can't see how to do this.
Is there a better way to approach this? Or are these excessive API requests the only way?
Any help would be appreciated.
As far as I can see, your model looks suitable for the application and matches how an API-driven application should work...
However, one potential cut-down you could make is to cache some of the results locally, i.e. store a local version of some of the data that is unlikely to change within a session. For example, if the list of teams is unlikely to change, then store the result of one API request locally and use that instead of re-requesting the data from your API.
Following on from this, you could choose to only update certain data after a certain time period. So, if a user has looked at some team details, don't re-fetch that data for the next 10-20 minutes. However, this does again depend on how time-sensitive your data is.
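As a rough AngularJS sketch of that (the module, service, and endpoint names are invented), you can wrap the teams call in a factory that remembers the last result and only hits the API again once the cached copy is older than some threshold:

    // Cache the /teams response in memory and reuse it for 15 minutes.
    angular.module('myApp').factory('TeamsCache', ['$http', '$q', function ($http, $q) {
      var cached = null;
      var fetchedAt = 0;
      var MAX_AGE_MS = 15 * 60 * 1000;             // tune to how time-sensitive the data is

      return {
        getTeams: function () {
          var fresh = cached && (Date.now() - fetchedAt) < MAX_AGE_MS;
          if (fresh) {
            return $q.when(cached);                // no API request at all
          }
          return $http.get('/api/teams').then(function (res) {
            cached = res.data;
            fetchedAt = Date.now();
            return cached;
          });
        }
      };
    }]);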

Create many passes from the app - iPhone Passbook

SITUATION:
I have an application where I have to issue a gift-coupon kind of thing when the user reaches a certain score, say 'x'.
I want to create a coupon with a unique QR code at the time the user reaches the score 'x', so that they can download it on their iPhone and use it. Once it is used, the coupon should be invalidated. This applies to any user of the application: a coupon is created once the score is reached and deleted or invalidated once it is used.
ISSUE:
I'm not able to figure out how to create a coupon every time any user reaches the score. Of course, I did go through a lot of documentation and links like http://www.raywenderlich.com/20734/beginning-passbook-part-1. I also tried using pass-source, but a valid account requires you to pay a minimum of about $8.
As suggested in the raywenderlich tutorial, I can create passes, but they're not created through the application.
Also, I didn't see any method by which we can be notified when a user uses their issued coupon so that we can invalidate it.
Am I missing something here?
"Using" a QR code on a coupon means it is scanned by something else. That something else has to take responsibility to report the activity back to you, so you could then update the pass with an "Expired" flag in your database, re-sign and rebuild the pass, issue the push notification so that it would eventually update on the device. You'd also probably want that scanner-thingie to check with you to see that the code is valid before accepting it. So, yeah, not Apple's problem.

Foursquare Updating

No, not another question that asks, "How can I make my messages flow like on Foursquare???"
What I want to know is how they are getting their messages in the right order and timeframe.
Here's my situation. I have a proc that gets messages for a given day and returns the selected result set to the web, and on the front end my code shows them and slides new ones on top. However, these "new" ones aren't actually new; they are just the ones in the set that didn't initially fit on the page, although they "look new". Now what happens when I get to the end and the set is finally empty... I make another call, right?
Well, this call is going to return, yes, some messages they didn't see, but also all the ones they already saw.
What's a work around for this?
Thanks.
If you only want to show messages once, then persist the Id of the last message and use that as input into the proc on the second call, basically asking for any messages that came in since the last call.
re: Foursquare, I assume you are referring to the "recent activity" on their main page. They seem to call for 30 activities, then just cycle through them showing 11 at a time. They loop through a static list of 30. No second call that I can see.
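A minimal client-side sketch of that "since the last Id" approach (the endpoint and parameter names are assumptions):

    // Remember the newest message Id we've shown and only ask for newer ones.
    var lastSeenId = 0;

    function fetchNewMessages() {
      return fetch('/api/messages?sinceId=' + lastSeenId)
        .then(function (res) { return res.json(); })
        .then(function (messages) {
          if (messages.length > 0) {
            lastSeenId = messages[messages.length - 1].id;  // assumes ascending order by id
          }
          return messages;                                  // slide these onto the page
        });
    }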
