Should I sanitize this in angular1? - angularjs

We are running a legacy CMS system that stores some content as raw HTML. This content is now fetched using $http from my Angular 1.5 application and displayed on the page. Should I sanitize this HTML before adding it to the page? If yes, how? If not, why not?

This depends on who can enter the HTML. If only authorized personnel can enter content, you could sanitize it so it gets displayed on the page; however, this may cause errors on the page when the markup syntax is incorrect, and I wouldn't suggest it if avoidable.
If the content can come from anyone, you definitely shouldn't insert it into your page as-is: it opens your site up to XSS attacks!
See this video for how an attack can take over your site: https://www.youtube.com/watch?v=U4e0Remq1WQ
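For the common case where you do want to show the CMS markup, AngularJS ships a sanitizer (the ngSanitize module). Below is a minimal sketch; the module, controller, and endpoint names are made up for illustration, and it assumes angular-sanitize.js is loaded next to angular.js:

    // Fetch the legacy CMS content and expose it to the template.
    angular.module('cmsApp', ['ngSanitize'])
      .controller('CmsController', ['$http', function ($http) {
        var vm = this;
        vm.cmsHtml = '';
        $http.get('/api/cms/content').then(function (response) {
          vm.cmsHtml = response.data; // raw HTML string from the CMS
        });
      }]);

    // In the template, ng-bind-html runs the value through $sanitize
    // before inserting it into the DOM:
    //   <div ng-controller="CmsController as cms" ng-bind-html="cms.cmsHtml"></div>

With ngSanitize loaded, ng-bind-html strips script tags, inline event handlers, and other dangerous markup before rendering; only bypass this with $sce.trustAsHtml if the authors are fully trusted.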

Related

How to display my AngularJS correctly for GoogleBot and Optimizely?

I have a website called VoteCircle (www.votecircle.com), but I noticed that it doesn't display well for Google Bot/Optimizely (used for A/B tests). It shows only the content that ISN'T in ng-view; nothing inside ng-view is displayed.
The site was made in AngularJS, and the content in ng-view isn't displayed for the bots/previews I mentioned.
What's the best way to fix that?
Please see the attached screenshot.
Thanks.
There is a pretty easy fix for this. In your URL bar, click on the small key icon and enable mixed content. The browser blocks mixed content (HTTPS and HTTP resources combined) in the editor by default; enabling it lets the rest of the page load in the editor.

Angular.js: Is there any disadvantage of hash in url with respect to SEO?

I am making a website using AngularJS, and I am curious to know whether there is any disadvantage to a hash in the URL with respect to SEO.
e.g. http://www.website.com/#about-us
I'd appreciate any contribution.
Thanks
If we go back to basics, the hash (#) refers to an element ID in your HTML (a fragment identifier), and, more to the point, Google ignores anything after the hash.
For example, the page www.mydomain.com is treated the same as www.mydomain.com/#about-us.
This is an advanced technique some marketers use to track their campaigns without using parameters like UTMs, to avoid content duplication.
To make sure your page loads without any errors, try disabling JS in your browser using the "Web Developer Tool" and then load your page; I think you will get a white page without content, and this is the way Google and most other search engines see your pages.
There is also another way to test it: go to Search Console ("Webmaster Tools") and use Fetch as Google; there you will see exactly how Google views your page.
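If the hash itself is the worry, AngularJS can also generate regular paths instead of fragments. A small sketch, assuming an illustrative module name 'app' and a server that rewrites deep links back to index.html:

    angular.module('app').config(['$locationProvider', function ($locationProvider) {
      // Serve /about-us instead of /#about-us. This needs a <base href="/">
      // tag in index.html and server-side rewrites so deep links still
      // return the application shell.
      $locationProvider.html5Mode(true);
    }]);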

Facebook Debugger needs several refreshes before returning proper og:image value

I've been using the Facebook Debugger to repeatedly work around a problem with og:image tags on an AngularJS site. My content editor has to clear caches in FB several times before the correct meta comes through. Here is our setup:
We are using a PhantomJS look-aside (disk cache enabled) for all user agents matching Facebook, so that FB requests are served our static HTML markup. We have verified (via curl 'http://localhost:9000/path/to/my/page' | grep og:image) that the proper og:image tag is present before trying to share or present a new object to the FB Open Graph.
We have to consistently "Fetch new scrape information" 3 - 4 times before FB Debugger pulls the proper image. The debugger returns scrapes in the following way:
-- First fetch: Default og tags before Angular bindings hit.
It's hard to say why this is happening since we haven't tried to share the page previously. We've passed the page into the PhantomJS process and seen the proper og in the returns (in order to cache the return before sharing or heading to FB).
-- Second fetch: Proper og tags filled in with desired image but with the OG Image warning
og:image was not defined, could not be downloaded or was not big enough. Please define a chosen image using the og:image metatag, and use an image that's at least 200x200px and is accessible from Facebook. Image 'XXXXXXX' will be used instead. Consult http://developers.facebook.com/docs/sharing/webmasters/crawler for more troubleshooting tips.
The desired image is a 600x337 PNG (no transparency), so size isn't the issue (and it eventually shows). The fallback image being used instead is the default og:image left over from scrape #1.
-- Third fetch:
OG Image warning is gone and all additional fetches return the proper meta. Sharing works and we can move on.
So while this works, it is a little heavy-handed. Clearly we have an issue with FB seeing our default meta, caching it, and needing us to clear things out. Before we implement any cache warming in our PhantomJS process, and possibly a POST to the FB API to get the proper scrape markup into Open Graph, can someone explain why the second fetch produces the og:image warning and why it then goes away? If the proper og:image exists and is correctly sized, why the error?
We looked at this answer, and the comment there says to clear our browser cache when using the debugger. We've also tried the suggestion to use multiple browsers, to no avail. We've tried cache-less POSTs using Postman to test this theory, since that may be how we end up cache warming, but we still see the need for the additional refreshes.
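For reference, the "POST to the FB API" mentioned above is the Graph API re-scrape call. A hedged sketch in Node.js, assuming a valid access token in the FB_TOKEN environment variable and a placeholder page URL; it only asks Facebook to refresh its cached scrape and does not by itself explain the warning on the second fetch:

    var https = require('https');
    var querystring = require('querystring');

    // Ask Facebook to re-scrape the page so the cached Open Graph object
    // is refreshed before anyone shares it.
    var body = querystring.stringify({
      id: 'https://www.example.com/path/to/my/page', // placeholder URL
      scrape: 'true',
      access_token: process.env.FB_TOKEN
    });

    var req = https.request({
      hostname: 'graph.facebook.com',
      path: '/',
      method: 'POST',
      headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Content-Length': Buffer.byteLength(body)
      }
    }, function (res) {
      res.pipe(process.stdout); // prints the refreshed scrape result
    });
    req.write(body);
    req.end();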

Setting up Google Website Optimizer where test and conversion pages are the same

I have a site where both the landing page and the thank-you page are index.php, with different content loaded dynamically. As I generate the JavaScript and try to validate it, it gives me an error saying that the JS is not installed on the thank-you page, which makes sense, because that content is not loaded yet. I was wondering how I can circumvent this issue? Any suggestions?
Thanks!
Luka
You will need to do offline validation of the conversion page; follow these instructions.

Progressive rendering of a webpage in Internet Explorer 7

I'm trying to improve the user perception of page load time of some web pages. These web pages take about 5 seconds to complete loading and rendering. The overall time is fine; but on clicking a link to load a page, nothing happens for about 4.5 seconds and then the whole page appears in one shot. This spoils the user experience, since the user is left wondering if anything is happening or not after clicking the link.
As I understand it, browsers are supposed to progressively render web pages as the resources needed to render portions of the page become available. One thing I've seen recommended (by YSlow, for example) is to put the CSS in the head and the JavaScript near the closing body tag, or as near the end of the page as possible. I've done this, but I don't see the initial part of the page rendering and then pausing for the JavaScript to load. The theory, as I understand it, is that the page will begin rendering progressively once all the CSS is loaded. I also understand that the page will pause rendering while any JavaScript is being downloaded or executed.
What else can affect progressive rendering on IE, especially on IE7?
I found out that JavaScript (specifically, some jQuery selectors) was slowing things down and preventing the page from rendering. We first optimized the jQuery code by removing code that repeatedly selected the same elements, then moved the code into $(document).ready so that it executes after the page has loaded (see the sketch below).
Overall, this has given us a 2 second boost in page load times as well as allowing more pages to load progressively.
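A small illustration of the two changes described above (the selector and class names are made up): cache the repeated selection once and defer the work until the DOM is ready, so the browser can paint the page first.

    $(document).ready(function () {
      // Select the rows once instead of re-querying for every operation.
      var $rows = $('#report tr');
      $rows.addClass('zebra');
      $rows.filter(':odd').addClass('alt');
    });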
A first step may be to understand what's going on on the network side; a tool like Fiddler will help you with that. In your case, the Timeline display should be a good starting point.
Why not show the user a notification that the page is currently loading when a link is clicked?
You can do this:
window.onbeforeunload = function (e) { document.body.innerHTML = 'Loading...'; /* or even better content */ };
I'm having the same load problems because of Flash videos on a page. Will somebody tell me why, oh my God why, IE can't just load pages as nicely as Firefox does?
If IE went out of business today, all the hours and days and nights I've wasted would be over.
I think it's about time IE got with the demands of web masters in 2009 and changed the way it loads pages.
If JavaScript is used, people with JavaScript disabled will see blank spaces.
Check for unclosed tags.
Check all images have width and height attributes.
