Getting pages indexed in Kik browser

I'm having trouble getting pages to show up in the NEW tab and in the Optimized for Kik search results.
All my pages have the required title, meta description, canonical tag, and script tag, served whenever the user-agent contains the string "kik".
Here is an example of a page that isn't being indexed.
http://playcanv.as/p/MW862amA
The pages have been correctly set up for around a week and still aren't showing up. Any ideas why?

Currently, the Kik browser shows a loading screen on top of your website until the window.onload event has fired. If the website takes too long to load, the user is presented with an error screen.
Testing locally, http://playcanv.as/p/MW862amA downloaded roughly 5MB before window.onload and took roughly 30 seconds to get there. I'm betting the search index isn't letting it in because of this.
So the fix is simply to defer expensive network requests until after window.onload. The easiest solution is to wrap your network calls in kik.ready(function(){}), as sketched below.
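Here's a minimal sketch of that deferral, assuming kik.js is already included on the page as the question describes (loadHeavyAssets() is a hypothetical stand-in for the expensive requests):

// Hypothetical stand-in for the ~5MB of requests currently
// happening before window.onload (game bundles, textures, audio...)
function loadHeavyAssets() {
  // kick off the big downloads here
}

// kik.ready() fires after the page has loaded, so the Kik browser's
// loading screen is dismissed before the expensive downloads begin.
kik.ready(function () {
  loadHeavyAssets();
});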

Related

AngularJS application taking around 100% CPU while resizing

My website is built with AngularJS v1.7.2 (using the Laravel PHP framework as the backend).
The current issue is that the website freezes after resizing the window several times (2-3 or more). I checked the Google Chrome Task Manager and it shows around 100% CPU usage at the time of the freeze. The same issue also occurs when testing the site on mobile by changing orientation several times. I have already tried solutions like one-way binding, adding track by to ng-repeat, and using tools like Batarang, but no luck yet. Is there a proper solution for this issue? Please let me know.
UPDATE
The website is almost finished; only the freeze issue is currently blocking us. The site mainly contains songs, albums, playlists, and artists.
On the home page there is a banner (4 images) using iosSlider, followed by 8 categories with 5 albums each, and a song list in the right sidebar. There are other pages such as category, artist, album, song, and user profile pages. There is also a customised player using the angular-soundmanager2 API. Viewing on desktop there is normally no issue so far. But if I open the Device Toolbar (Ctrl+Shift+M), switch to other pages, and rotate 2-3 times, it just freezes; I can't even reload or refresh the tab. The Chrome Task Manager (Shift+Esc) shows around 100% CPU usage at that time. Notably, the issue only occurs if I route to another page (forward, backward, or both) and then rotate 2-3 times. I am using AngularJS Batarang to track down the issue, but still have no clue.
SOLVED
The issue was with iosSlider. After removing iosSlider, there are no more crashes.
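For anyone who needs to keep the slider: a common mitigation (not from the original post) when a plugin redoes heavy layout work on every resize event is to debounce the handler. A minimal sketch, where rebuildSlider() is hypothetical:

// Hypothetical stand-in for the slider's expensive re-layout work
function rebuildSlider() {
  // re-measure and re-render the slider here
}

var resizeTimer;
window.addEventListener('resize', function () {
  clearTimeout(resizeTimer);
  resizeTimer = setTimeout(function () {
    rebuildSlider(); // run the heavy work once, after resizing settles
  }, 250);
});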

ReactJS takes too much time for Content Download in Chrome?

Scenario:
I have a page which loads additional data when scrolled to the end of the page. Most of the time the Content Download phase is slow, and sometimes it is fast, even though the response data is only a few kB. This issue is only seen in Chrome (Version 67.0.3396.99 (Official Build) (64-bit)).
What might be the solution here?
In Chrome, Content Download sometimes takes a shorter and sometimes a longer time for the same request.
The requests are to the same API.
As suggested here, the frontend might be doing a lot of work,
so I checked the profiler, but that is not the issue.
Why is there so much idle time in the second call? How can I reduce the Content Download time?
Edit
When a click event is triggered in the browser during loading, the content downloads faster.
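For reference, a minimal sketch of the scroll-triggered loading the scenario describes (the endpoint and names are illustrative, not from the original app):

var nextPage = 2;    // page 1 was rendered with the initial load
var loading = false; // guard against duplicate requests while scrolling

window.addEventListener('scroll', function () {
  var nearBottom =
    window.innerHeight + window.scrollY >= document.body.offsetHeight - 100;
  if (!nearBottom || loading) return;
  loading = true;
  fetch('/api/items?page=' + nextPage++) // hypothetical endpoint
    .then(function (res) { return res.json(); })
    .then(function (items) {
      console.log('loaded', items.length, 'items'); // rendering omitted
      loading = false;
    });
});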

How to dynamically set the title and description of an Angular single-page application for Googlebot?

I want Googlebot to recognize the titles and descriptions of my pages; the titles and descriptions come from the database.
I used
document.title = $scope.dataFetchedFromDB.title;
and
document.querySelector("meta[name='description']").content = $scope.dataFetchedFromDB.description;
This does change the title and description in the browser, but not in the snippets fetched by Google, Facebook, or Slack: the old title and description remain.
I know about the ng-meta npm package, but my pages are not on static routes; each route is determined by the page ID (every page has its own ID, description, and title).
I also read:
Remember that while Google says that they use JavaScript to crawl pages, Facebook, Twitter, etc., do not. You can test Google's render of your page from the links here.
But Google takes a while to index these changes in their snippets. I would recommend creating a Google Search Console account and having it fetch-and-render the pages you want it to re-index. Even then, public results may take days or weeks to reflect your changes.
Also, it seems that the Googlebot with Javascript doesn't have a lot of patience. Try to make sure you are changing your Title and Description within mere moments of the page loading, and not at the end. In little tests, it appears that the Googlebot renderer may time-out after a few seconds, and only capture the original Title and Description.
In order to get other sites like Facebook/Twitter to render the proper metadata, you'll need to server-side render these pieces of data. Whatever appears when you say "View Source..." will be seen by these simplified crawlers. Consider updating to Angular (from AngularJS) and try server-side rendering for your metadata.
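Here is one way that early-update advice could look in AngularJS, using a route resolve so the data is available before the view renders. The '/api/pages/:id' endpoint and route names are hypothetical; the querySelector line mirrors the question's own code:

var app = angular.module('app', ['ngRoute']);

app.config(function ($routeProvider) {
  $routeProvider.when('/page/:id', {
    templateUrl: 'page.html',
    controller: 'PageCtrl',
    resolve: {
      // fetch {title, description} by page ID before the view renders
      pageData: function ($http, $route) {
        return $http.get('/api/pages/' + $route.current.params.id)
          .then(function (res) { return res.data; });
      }
    }
  });
});

app.controller('PageCtrl', function ($scope, pageData) {
  // runs as the view appears, so the tags change moments after load
  document.title = pageData.title;
  document.querySelector("meta[name='description']").content = pageData.description;
});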

Facebook Debugger needs several refreshes before returning proper og:image value

I've been using the Facebook Debugger to work around a recurring problem with the og:image tags on our AngularJS site. My content editor has to clear Facebook's cache several times before the correct meta comes through. Here is our setup:
We are using a PhantomJS look-aside (with disk cache enabled) for all user agents matching Facebook, so that FB requests are served our static HTML markup. We have verified (via curl 'http://localhost:9000/path/to/my/page' | grep og:image) that the proper og:image tag is present before trying to share or present a new object to FB Open Graph.
We have to "Fetch new scrape information" 3-4 times before the FB Debugger pulls the proper image. The debugger returns scrapes in the following way:
-- First fetch: Default og tags before Angular bindings hit.
It's hard to say why this happens, since we haven't tried to share the page previously. We've passed the page through the PhantomJS process and seen the proper og tags in the returns (in order to cache the result before sharing or heading to FB).
-- Second fetch: Proper og tags filled in with desired image but with the OG Image warning
og:image was not defined, could not be downloaded or was not big enough. Please define a chosen image using the og:image metatag, and use an image that's at least 200x200px and is accessible from Facebook. Image 'XXXXXXX' will be used instead. Consult http://developers.facebook.com/docs/sharing/webmasters/crawler for more troubleshooting tips.
The desired image is a 600x337 PNG (no transparency), so size isn't the issue (and it eventually shows). The fallback image being used instead is the default og:image left over from scrape #1.
-- Third fetch:
OG Image warning is gone and all additional fetches return the proper meta. Sharing works and we can move on.
So while this works, it is a little heavy-handed. Clearly we have an issue with FB seeing our default meta, caching it, and needing us to clear things out. Before we implement any cache warming in our PhantomJS process, and possibly a POST to the FB API to get the proper scrape markup into Open Graph, can someone explain why the second fetch produces the og:image warning and why it then goes away? If the proper og:image exists and is correctly sized, why the error?
We looked at this answer; a comment there says to clear the browser cache when using the debugger. We also tried the suggestion to use multiple browsers, to no avail. We've tried cache-less POSTs using Postman to test this theory (since it may be how we end up warming the cache), but still see the need for the additional refreshes.
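For what it's worth, here is a sketch of the "POST to the FB API" idea mentioned above: the Graph API can force a fresh scrape of a URL with scrape=true. The token and URL are placeholders you would supply yourself:

var ACCESS_TOKEN = 'YOUR_APP_ACCESS_TOKEN';           // placeholder
var pageUrl = 'https://example.com/path/to/my/page';  // placeholder

// Ask Facebook to re-scrape the page and return the fresh Open Graph data
fetch('https://graph.facebook.com/', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: 'id=' + encodeURIComponent(pageUrl) +
        '&scrape=true&access_token=' + ACCESS_TOKEN
})
  .then(function (res) { return res.json(); })
  .then(function (data) { console.log('fresh scrape:', data); });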

Progressive rendering of a webpage in Internet Explorer 7

I'm trying to improve the user perception of page load time of some web pages. These web pages take about 5 seconds to complete loading and rendering. The overall time is fine; but on clicking a link to load a page, nothing happens for about 4.5 seconds and then the whole page appears in one shot. This spoils the user experience, since the user is left wondering if anything is happening or not after clicking the link.
As I understand it, browsers are supposed to progressively render web pages as the resources needed to render portions of the page become available. One thing I've seen recommended (by YSlow, for example) is to put the CSS in the head and the JavaScript near the closing body tag, or as near the end of the page as possible. I've done this, but I don't see the initial part of the page rendering and then pausing for the JavaScript to load. The theory, as I understand it, is that the page will begin rendering progressively once all the CSS is loaded. I also understand that the page will pause rendering while any JavaScript is being executed or downloaded.
What else can affect progressive rendering on IE, especially on IE7?
I found out that JavaScript (specifically, some jQuery selectors) was slowing things down and preventing the page from rendering. We first optimized the jQuery code by removing code that repeatedly selected the same elements, then moved the code into $(document).ready so that it executes after the page has loaded.
Overall, this has given us a 2-second boost in page load times, as well as allowing more pages to load progressively.
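A minimal sketch of those two fixes (the selector and class names are illustrative, not from the original page):

$(document).ready(function () {
  var $rows = $('#results tr');        // select once instead of re-querying
  $rows.addClass('loaded');
  $rows.filter(':odd').addClass('alt'); // reuse the cached selection
});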
A first step may be to understand what's going on on the network side, a tool like Fiddler will help you. In your case, Timeline display should be a good starting point.
Why not show the user a notification, when a link is clicked, that the page is currently loading?
You can do this:
window.onbeforeunload = function (e) { document.body.innerHTML = 'loading...'; /* or even better content */ };
I'm having the same load problems because of Flash videos on a page. Will somebody tell me why, oh why, IE can't just load pages as nicely as Firefox does?
If IE went out of business today, all the hours and days and nights I've wasted would be over.
I think it's about time IE got with the demands of web masters in 2009 and changed the way it loads pages.
If JavaScript is used, people with JavaScript disabled will see blank spaces.
Check for unclosed tags.
Check that all images have width and height attributes.
