Piwik (Matomo): export count of page visits for a specific URL

I create artificial page views on my website with JavaScript, because I need to track them later:
piwikTracker.trackPageView('/lid/902095/website');
I need to be able to export how many times the above code was executed, i.e. how many times /lid/902095/website was visited, so that I can show some stats on my website. However, I'm not sure what my query to the API should be. Here is my attempt:
?module=API&method=Actions.getExitPageUrls&idSite=2&period=day&date=today&format=JSON&token_auth=anonymous&segment=pageUrl=#%2Flid%2F902095%2Fwebsite
At the moment it returns an error: "Error: The report 'VisitsSummary_CoreMetrics' was requested but it is not available at this stage. You may also disable the related plugin to avoid this error."
I don't see a plugin with that name in the plugins section; my active plugins are "Actions, Dashboard, DoNotTrack, ExampleAPI, ExamplePlugin, Live, Login, Widgetize".
I'm not sure whether this is the right query to send to the API; any help is appreciated.

If you want to get the visit count, use this API call:
?module=API&method=VisitsSummary.getVisits&idSite=2&period=day&date=today&format=json&token_auth=anonymous
The VisitsSummary plugin is provided by default.
Piwik Web Analytics Essentials is a good book for reference.
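Note that VisitsSummary.getVisits returns the visit count for the whole site. If you need the count for a single URL such as /lid/902095/website, the Actions.getPageUrl method of the Reporting API is meant for exactly that. Here is a minimal sketch; the instance URL piwik.example.com, the site ID, and the anonymous token are assumptions to be replaced with your own values:

    // Query the Piwik/Matomo Reporting API for the metrics of one page URL.
    // Host, idSite, and token_auth below are placeholders for your setup.
    var params = new URLSearchParams({
      module: 'API',
      method: 'Actions.getPageUrl',   // per-URL report
      pageUrl: '/lid/902095/website',
      idSite: '2',
      period: 'day',
      date: 'today',
      format: 'JSON',
      token_auth: 'anonymous'
    });

    fetch('https://piwik.example.com/index.php?' + params.toString())
      .then(function (res) { return res.json(); })
      .then(function (rows) {
        // Each row carries metrics such as nb_hits (page views) and nb_visits.
        var views = rows.length ? rows[0].nb_hits : 0;
        console.log('Views of /lid/902095/website:', views);
      });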

Related

Fetching data every time page is reloaded

I am creating an eCommerce-type website with React and Laravel, connected through a REST API. Every time a user enters the page www.mywebsite.com/products, a REST API call fetches all the products, and it takes about a second to load them. If the user leaves the page and enters it again, the same request is made and the same products are loaded.
My question is: what's the best approach for this situation? Maybe I should fetch the products only once, store them in localStorage, and read them through Context? On the other hand, most of the websites I visit seem to load products instantly, so maybe a REST API is even the wrong approach for this kind of website?
I would suggest checking your DB queries and investigating why this request takes so long. Have a look at the Laravel Telescope package: https://laravel.com/docs/9.x/telescope
In the requests tab you can follow your requests and see which queries are the most expensive. My guess is that you're lazy-loading some relations on each product.
Next, I would probably cache the articles (except maybe the stock information), using Laravel's built-in cache or a third-party package.
Only then would I go for client-side optimizations. Here you could consider a state-management library such as Akita.
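If you do end up caching on the client, a simple time-boxed localStorage cache keeps repeat visits instant without serving stale data forever. A minimal sketch, assuming a /api/products endpoint and a five-minute TTL (both placeholders):

    // Fetch the product list once per TTL window; serve it from localStorage
    // on repeat visits. '/api/products' and the TTL are assumptions.
    const CACHE_KEY = 'products';
    const TTL_MS = 5 * 60 * 1000;

    async function getProducts() {
      const cached = localStorage.getItem(CACHE_KEY);
      if (cached) {
        const { savedAt, data } = JSON.parse(cached);
        if (Date.now() - savedAt < TTL_MS) return data; // still fresh
      }
      const res = await fetch('/api/products');
      const data = await res.json();
      localStorage.setItem(CACHE_KEY, JSON.stringify({ savedAt: Date.now(), data }));
      return data;
    }

The result of getProducts() can then be put into a React Context so every page reads the same copy.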

SPA DTM analytics pagename issue

I am using an Angular SPA with DTM. Using custom event-based rules, I am able to capture all my data, including pageName, v41, and v42, correctly. Inside the Adobe editor, I am storing the page name in s.pageName and a hard-coded value in s.server. I have verified with the Omnibug tool that server, pageName, v41, and v42 are all populating correctly.
The problem shows up in Omniture reporting: the server and page data are not coming through. The page-name data shows only the SPA homepage for all page visits, and the server comes through as the default from s_code rather than the one I am passing in s.server. The eVars/props are all coming through fine. Even if I set prop40=s.pageName and prop41=s.server, I see the correct data populating in prop40/prop41 in Omniture reporting, but not under Page and Server. And I can't use prop40/prop41 for page name/server, since that is not the correct approach and page visits are zero in that case.
Any help on how to get data into Page/Server in Omniture for an SPA, or is anything wrong in my implementation? Thanks in advance!
If you really do see the correct values in Omnibug (or, more specifically, in the network request to the Adobe collection server), then the issue is not in the code.
Check against another AA hit debugger; possibly Omnibug is somehow bugging out. There are a ton of alternatives out there: Adobe Experience Cloud Debugger, ObservePoint, Charles Proxy, Fiddler, or just the browser dev tools' network tab (what I usually use as a backup).
Make sure you are looking in the correct report suite. Perhaps your data is being sent to a dev report suite and you are looking at the prod report suite, or vice versa?
Check to see if you have any Processing Rules that are overriding your values.
Contact your Adobe Rep to check if there are any VISTA Rules present for the report suite, that are overriding your values.
If you have verified that none of the above is the case, then sorry, but it sounds like the issue really must be in your code, and there is a problem with your QA method (e.g. maybe you are looking at the wrong AA request).
Update:
Based on your comment:
Earlier I was making an s.tl() call, but replacing it with an s.t() call resolved my problem: pageName/server/page-views were not populating in Omniture, and now they are. But the current problem is that we need pageName on all SPA clicks (which can be achieved with an s.t() call), yet page views are not wanted on all clicks. So it's essentially link tracking only, but with pageName data. I am struggling either to suppress page views on an s.t() call or, vice versa, to get pageName populated on an s.tl() call. Again, Omnibug shows all requests just fine, but the issue appears in the Omniture reports.
When Adobe processes a hit, it wipes pageName for s.tl() calls; that's how it determines whether to count the request as a page view or not. If you want to see the page name even for s.tl() calls, the common practice is to dupe the pageName value into a prop or eVar, send it with the s.tl() call, and look at that report. In fact, most clients I work with don't even use the native Pages report, and instead use that (usually eVar) report.
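For example, with standard AppMeasurement syntax (eVar40 and the link name here are placeholders, not anything prescribed by Adobe):

    // Custom-link hit that carries the page name without counting a page view.
    s.eVar40 = s.pageName;          // dupe the page name into an eVar
    s.linkTrackVars = 'eVar40';     // expose the variable on the s.tl() hit
    s.linkTrackEvents = 'None';
    s.tl(true, 'o', 'spa click');   // 'o' = custom link: no page view is counted

Page views then come only from your s.t() calls, while the eVar40 report gives you page-name breakdowns for clicks.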

gtag.js tracking on multiple domains but no linking

My company has an Angular component library whose usage we want to track. This means that multiple developers will use it on multiple domains. I have included a single tracking ID and am dynamically loading the gtag.js library when components are loaded. This works on the main domain, but the tracking isn't logged in the dashboard. Does anyone know of the settings required to make this possible? Also, I am not trying to link any pages or sessions, just to have unique tracking per application.
It turns out that I had upgraded one gtag emit call, but a second one was still in the old ga() syntax and wasn't registering in the dashboard.
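In other words, a leftover call like the commented line below does nothing once only gtag.js is loaded; it has to be migrated to the gtag() form. The category/action/label values here are placeholders:

    // Legacy analytics.js call -- silently ignored when ga() is not defined:
    // ga('send', 'event', 'component', 'load', 'date-picker');

    // gtag.js equivalent that actually reaches the dashboard:
    gtag('event', 'load', {
      event_category: 'component',
      event_label: 'date-picker'
    });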

AngularJS sitemap SEO

I don't see any updated answers on similar topics (hopefully something has changed with the latest crawler releases), which is why I'm asking a specific question.
I have an AngularJS website that lists products which can be added or removed (the links are updated accordingly). URLs have the following format:
http://example.com/#/product/564b9fd3010000bf091e0bf7/published
http://example.com/#/product/6937219vfeg9920gd903bg03/published
The product's ID (6937219vfeg9920gd903bg03) is retrieved by our back-end.
My problem is that Google doesn't list them, probably because I don't have a sitemap.xml file on my server.
On any given day a page can be added (and therefore a new URL) or removed.
How can I manage this?
Do I have to manually (or by batch) edit the file each time?
Is there a smart way to tell Google: "Hey my friend, look at this page"?!
Generally you can create a JavaScript/AngularJS sitemap, and according to this guidance from Google:
https://webmasters.googleblog.com/2015/10/deprecating-our-ajax-crawling-scheme.html
they will crawl it.
You can also use Fetch as Google to validate that the pages render correctly.
There is also a study on Googlebot's execution of JavaScript:
http://searchengineland.com/tested-googlebot-crawls-javascript-heres-learned-220157
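As for editing the file each time: rather than maintaining sitemap.xml by hand, you can regenerate it from the same data your back-end already has whenever products change. A minimal Node.js sketch, where fetchProductIds() is a hypothetical stand-in for your own DB query:

    // Rebuild sitemap.xml from the current product IDs on every change
    // (or on a schedule). fetchProductIds() is hypothetical.
    const fs = require('fs');

    async function writeSitemap() {
      const ids = await fetchProductIds(); // e.g. ['564b9fd3010000bf091e0bf7', ...]
      const urls = ids
        .map(id => `  <url><loc>http://example.com/#/product/${id}/published</loc></url>`)
        .join('\n');
      fs.writeFileSync(
        'sitemap.xml',
        '<?xml version="1.0" encoding="UTF-8"?>\n' +
        `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`
      );
    }

You can then resubmit the sitemap in Search Console whenever it changes. Be aware that fragment (#/) URLs are generally not treated as separate pages by crawlers, so HTML5-mode URLs without the hash would serve you better here.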

Why doesn't Google index my single-page app properly?

Some time ago Google officially deprecated the AJAX crawling scheme they came up with back in 2009, so I decided to stop generating snapshots in PhantomJS for _escaped_fragment_ and to rely on Google rendering my single-page app like a modern browser and discovering its content. They describe it here. But now I seem to have a problem.
Google has indexed my page (at least I can see in Webmaster Tools that it has), but when I look at Google Index --> Content Keywords in Webmaster Tools, it shows only the unprocessed content of my AngularJS templates and the names of my bound variables, e.g. {{totalnewprivatemessagescount}}, etc. The keywords do not contain the words that should be generated by AJAX calls when the JavaScript executes; e.g. "fighter" is not even in there, and it should be all over the place.
Now, when I use Crawl --> Fetch as Google --> Fetch and Render, the snapshot of what Googlebot sees is very much the same as what a user sees and is clearly generated using JavaScript. The Fetch HTML tab, though, shows only the source without JS processing, which I'm guessing is fine.
Now my question is: why didn't Google index my website properly? Is there anything I implemented incorrectly somewhere? The website is live at https://www.fightersconnect.com, and thanks for any advice.
