I'm using Silverlight 4 hosted in an ASP.NET MVC page, e.g. http://test.example.com/main. I make a call to the server from Silverlight using WCF and get some values back. One of these values I write as a cookie using:
HtmlPage.Document.SetProperty("cookie", newCookie);
I can then view the cookie text using:
MessageBox.Show(HtmlPage.Document.Cookies);
I can see various cookies, including the one I just created, so it looks like it was created OK.
From within the SL app, I display some hyperlinks. When the user clicks one, the link opens in a new browser window. The links go to the same domain, e.g. http://test.example.com/viewdoc?1233
The new cookie that was created is not being passed in the request, while the cookies that were originally there are. I don't see how it's a cross-domain policy issue, since the links go to the same domain. It doesn't matter which browser I use (Safari, Firefox, IE8, IE6); they all exhibit the same problem, so it doesn't seem to be the IE8-specific issue I saw in other similar questions.
So where is my cookie going?
Verify that the path property of the cookie is not restricted to a specific page:
The path parameter is potentially the most useful of the four optional cookie settings. It sets the URL path the cookie is valid within. Pages outside of that path cannot read or use the cookie. If path is not set explicitly, then it defaults to the URL path of the document creating the cookie.
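If the Silverlight host page lives under a sub-path, the cookie's default path may not cover /viewdoc. Forcing a site-wide path when writing it should rule this out (a minimal sketch; the cookie name and value are illustrative):

// requires: using System.Windows.Browser;
// Append an explicit site-wide path so the cookie is valid for the whole
// domain rather than only the path of the page that created it.
string newCookie = "docId=1233; path=/";
HtmlPage.Document.SetProperty("cookie", newCookie);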
Related
I am building my first ever Chrome browser extension and I am struggling to find the right solution for handling authentication. There is a requirement that the extension stay logged in as long as possible, to reduce the need for the user to log in often. This means we would need to use Refresh Tokens. I would very much like to handle all authentication on the background script but this is no longer persistent in MV3 nor does it have access to the DOM.
This being the case, I see these options:
use the Auth0 React SDK in the content scripts - this means all my authentication logic will run in a somewhat less secure environment, but the token will be handled by the library and I will be able to access it in all my content and popup scripts (if I need persistence across page refreshes, I would still need to use localStorage, I believe). It also means the background script will not have access to the token and will need one of the other scripts to retrieve it and send it through a message
implement the Authorization Code Flow with PKCE following the steps in this tutorial on the background script - this means all my auth logic runs in a more secure environment, but I don't have a way of storing the token other than chrome.storage (see the sketch after this list). It's also a bit tricky to silently retrieve the token (or check if the user is still logged in) from the background script. It can be done using an injected iframe and the web_message response type, or with chrome.identity, but there are still issues with the redirect_uri, which needs to be listed in the Allowed Origins config of the Auth0 app - so you can only easily do this on the pages of the extension.
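For the second option, a minimal sketch of what the chrome.storage part could look like in the MV3 background service worker (the accessToken key is illustrative; chrome.storage.session would keep the token in memory only, while chrome.storage.local persists it to disk):

// Hypothetical sketch: persist the token obtained from the PKCE code exchange.
function saveToken(token) {
  chrome.storage.local.set({ accessToken: token });
}

// Later, e.g. when a content or popup script asks for it via a message:
function loadToken(callback) {
  chrome.storage.local.get("accessToken", ({ accessToken }) => callback(accessToken));
}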
I know that the recommended solution for an SPA is using the SDK, but I would like to know if this is also the right solution for a browser extension. Based on this article on Token Storage, localStorage is dangerous, especially due to third-party scripts. Seeing that the MV3 manifest has removed the ability to execute remote code, is localStorage an acceptable way to store tokens?
I have implemented both options using the docs provided, but I am unsure which is the best solution given the changes introduced by MV3.
Thank you
I have a React SPA hosted on GitHub Pages with a custom domain.
It is indexed in Google Search, and I have description <meta> tags in my code.
Everything works fine on a normal machine.
However, when I try to open the website from behind a corporate firewall, I get this error:
Request denied based on content categorization: "Uncategorized URLs"
I might never be able to open the site behind this firewall, because I cannot update the firewall policies, but the reason shown by the blocking proxy is something I want to fix.
Question
My site is shown as "Uncategorized URLs".
As this is an SPA, no content is loaded until the JavaScript runs, so the proxy probably cannot analyze the content. Is there a way to resolve this "Uncategorized URLs" issue by setting some <meta> tags, or any quick fix other than SSR?
We have two ASP.NET MVC applications installed on the same backend server, using the same port. Opening both applications in the same browser causes their sessions to interfere.
I'm wondering how this could be avoided without touching the server infrastructure?
I was thinking about clearing all sessions at startup of both applications, but I'm not sure how to do this or whether it would be a good idea?
If the sites are on the same subdomain/domain then they'll share cookies by default. This is because cookies are domain-bound. There's no way around that, but you can customize the cookie name each application uses, so even though they'll both receive the same cookies, they'll only mess with the one that belongs to them. For sessions, this can be achieved by adding the cookieName attribute to your <sessionState> element in the site's Web.config:
<sessionState ... cookieName="Website1_SessionId" />
You'll need to similarly customize the auth cookie name, if you're handling authentication on both sites as well.
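For forms authentication, that's the name attribute on the <forms> element (again, the value is just an example):

<authentication mode="Forms">
  <forms name="Website1_Auth" ... />
</authentication>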
I'm using cookies so that mobile users can visit my site as desktop users. To do this, I give them a cookie - mob_yes.
Then, in a module, I use a Drupal hook to see if the cookie is set.
I can see that the cookie IS getting set, but in my module isset($_COOKIE["mob_yes"]) always returns false when using Varnish.
In /etc/varnish/default.vcl I have the following:
if (req.http.Cookie) {
  set req.http.Cookie = regsuball(req.http.Cookie, ";(mob_yes)=", "; \1=");
}
I'm really not sure what's going on here; I can only presume Varnish is stripping that cookie somewhere. Does anyone have any idea what's going wrong?
Thanks,
What do you mean by
I can see that the cookie IS getting set
Do you mean that you can see it in the headers in Firebug (client side), and that you also see it on the server side with tcpdump / varnishlog / the application?
The code snippet from your VCL is probably part of a commonly used way of preserving important cookies: a space is put in front of the cookies to keep, every cookie that doesn't have the "; " combination is deleted, and the leftover separators are removed at the end.
The result is used later on to generate the hash for a specific url+cookies request.
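In full, that commonly used recipe looks roughly like this (a sketch in Varnish 2/3 VCL, keeping only mob_yes; any cookies your backend actually needs, such as Drupal's session cookie, would have to be added to the same pattern):

if (req.http.Cookie) {
  # Prefix every cookie with ";", then mark the ones to keep with "; ".
  set req.http.Cookie = ";" + req.http.Cookie;
  set req.http.Cookie = regsuball(req.http.Cookie, "; +", ";");
  set req.http.Cookie = regsuball(req.http.Cookie, ";(mob_yes)=", "; \1=");
  # Delete every cookie that was not marked, then trim the separators.
  set req.http.Cookie = regsuball(req.http.Cookie, ";[^ ][^;]*", "");
  set req.http.Cookie = regsuball(req.http.Cookie, "^[; ]+|[; ]+$", "");
  if (req.http.Cookie == "") {
    remove req.http.Cookie;
  }
}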
I also think you should check whether your VCL is removing cookies when the user is not logged in - it's a common practice to increase the hit rate.
In a VCL for Drupal it's usually the part that checks for DRUPAL_UID.
The company I work for has proxies/WAN accelerators between our international sites to cache Intranet web content. I have a Silverlight application being hosted on a server at one location, but being accessed by clients in another location. When the users access the web page hosting the Silverlight app, they get the stale xap file being cached by the proxy and not the latest version from the server. Local users always get the latest xap as their requests are not going through a proxy.
I've tried the various header/metadata techniques mentioned elsewhere to prevent caching, and while the containing web page itself is being served up fresh, I still get the old .xap file. Short of getting our IT admin to disable proxy caching for my site, is there anything I can do to make sure the latest xap file gets retrieved from the server instead of the proxy? The containing page is ASP.NET.
What I do is just add a querystring to the end of the path to the xap file. Then, when you change the querystring value, the proxies etc. should see it as a request for a new file. So far this has worked fine for me.
So basically, when embedding an .xap in a straight-up HTML file, you would do this:
<param name="source" value="ClientBin/SilverlightApplication1.xap?cachepreventer=whatevervalue"/>
And then when you deploy a new version, just change "whatevervalue" to something else.
EDIT
If you need to use this technique in many places in your app I would read the querystring value from config and just write it to the page using asp.net. That way you only need to update it in one place when you deploy.
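A minimal sketch of that, assuming an appSettings key named XapVersion that you bump on each deployment:

<!-- Web.config -->
<appSettings>
  <add key="XapVersion" value="1.0.0.5" />
</appSettings>

<!-- Hosting .aspx page (single-quoted attribute so the inner double quotes survive) -->
<param name="source" value='ClientBin/SilverlightApplication1.xap?cachepreventer=<%= System.Configuration.ConfigurationManager.AppSettings["XapVersion"] %>' />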
If you want to make sure the xap file is retrieved every time and you don't want to worry about it, just use:
<param name="source" value="ClientBin/YourSilverlightapp.xap?<%=Guid.NewGuid().ToString() %>"/>
Of course, this lends itself to a heavier load, since the xap is re-downloaded on every visit. I do like the config-based approach above, though, if you only want changes to be propagated to the client.