CakePHP - Get my application root address from model

I am looking to get http://www.mysite.com/app_directory in the model. Note that my app is not in the site root. I need it to perform a cURL request, and I don't want to hard-code it because the app's location will change. I also need the check to return http://www.mysite.com if the app moves to the site root.
Regarding it being a bad idea for the model to know about its environment: I want to check whether a URL is an external web page or a URL within the current website.

This would do it:
env('HTTP_HOST')
It's the Cake way of reading:
$_SERVER['HTTP_HOST']
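To get the full application base URL (including the app directory, or just the host once the app moves to the site root), you can combine this with the Router. A minimal CakePHP 2.x sketch; the model and method names are illustrative, not from the question:

App::uses('Router', 'Routing');

class Article extends AppModel {
    // Returns e.g. http://www.mysite.com/app_directory/
    // or http://www.mysite.com/ if the app sits in the site root.
    public function baseUrl() {
        return Router::url('/', true); // true = full URL, host included
    }

    // True when $url points inside the current application.
    public function isInternal($url) {
        return strpos($url, $this->baseUrl()) === 0;
    }
}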

Related

Microsoft Teams Action messaging extension with task module and URL not working

I am trying to build an action messaging extension with a task module implementation that uses a URL attribute to load the page. Attached is a screenshot of the task module code, which was generated by the Yeoman Teams generator.
The popup comes up blank, which means it isn't loading the HTML file path; but if I open Chrome and load the URL there, it works fine.
Also, if I use an Adaptive Card instead of the URL, it works fine. Only the URL part doesn't load in the popup. Attached is another screenshot of the popup inside Teams.
What could be wrong with the code?
The other answer is correct in that your URL needs to be reflected 100% correctly in your manifest. However, there are a few things you need to be clear on:
It's not per se the address of the bot that's important, but rather the address of the web page itself, which needs to be listed in the safe domains list in your manifest (see the manifest sketch below). In your case they're hosted at the same endpoint, but they might not be in your final solution, depending on how you end up hosting this.
While you're developing locally, use App Studio instead. That way you don't need to fiddle with the zip file every time - you can just change the URL in App Studio and immediately redeploy.
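For reference, the safe domains list lives in the manifest's validDomains array. A minimal sketch with a placeholder host (your actual domain will differ):

{
  "validDomains": [
    "contoso.ngrok.io"
  ]
}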
Every time you compile and run the project, a new hostname is generated, because the yo teams scaffolding uses the free ngrok tier; this leaves the installed app referencing the old URL.
You need to uninstall the app from the Teams app store under your organization and upload the new app from the package folder's .zip (only after gulp ngrok-serve).
If it still does not work, check the following:
Unzip the package file and verify that the manifest points to the right hostname for the action HTML page.
Go to http://localhost:4040 to inspect the ngrok tunnel traffic, which should give more info on the routed requests.

CDN serving private images / videos

I would like to know how CDNs serve private data such as images and videos. I came across this Stack Overflow answer, but it seems to be an Amazon CloudFront-specific answer.
As a popular example, let's say the problem in question is serving content inside Facebook. There is access-controlled material at the individual-user level and at the group level, and there is also some publicly accessible data.
All the logic of what can be served to whom resides on the server!
The first request to the CDN goes to the application server and gets validated for access rights. But there is a catch - keep this in mind:
Assume that first request is successful; after that, anyone will be able to access the image with that CDN URL. I tested this with a restricted user-uploaded Facebook image, and it was accessible via the CDN URL by others too, even after I logged out. So the image stays accessible until the CDN cache expires.
I believe this should work: all requests first come to the main application server, and after it determines whether access is allowed, it either redirects to the CDN server or shows an access-denied error.
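A common way to limit that exposure window is an expiring signed URL: the application server validates access, then hands out a URL the CDN edge can verify without calling back. A minimal sketch in PHP; the host, path, and parameter names are illustrative, not any specific CDN's API:

// Build a short-lived URL the CDN can verify with a shared secret.
function signedUrl($path, $secret, $ttl = 300) {
    $expires = time() + $ttl;
    $sig = hash_hmac('sha256', $path . $expires, $secret);
    return 'https://cdn.example.com' . $path
        . '?expires=' . $expires . '&sig=' . $sig;
}

echo signedUrl('/private/photo123.jpg', 'shared-secret');

// The edge recomputes the HMAC and rejects expired or tampered
// requests, so a leaked URL only works until $expires.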
Each CDN works differently, so unless you specify which CDN you are looking at, it's hard to tell.

Why can't I see any text at "http://crawlservice.appspot.com/?key=123456&url=http://mydomain.com#!article"?

OK, I found this link https://code.google.com/p/gwt-platform/wiki/CrawlerSupport#Using_gwtp-crawler-service that explains how you can make your GWTP app crawlable.
I have some GWTP experience, but I know nothing about App Engine.
Google said its "crawlservice.appspot.com" can parse any Ajax page. Now I have a page "http://mydomain.com#!article" containing an article that was pulled from the database. Say that page has the text "this is my article". Now I open this link:
crawlservice.appspot.com/?key=123456&url=http://mydomain.com#!article, and I can see all the JavaScript, but I couldn't find the text "this is my article".
Why?
Now let's check with a real-life example.
Open this link https://groups.google.com/forum/#!topic/google-web-toolkit/Syi04ArKl4k and you will see the text "If i open that url in IE".
Now open http://crawlservice.appspot.com/?key=123456&url=https://groups.google.com/forum/#!topic/google-web-toolkit/Syi04ArKl4k - you can see all the JavaScript, but there is no text "If i open that url in IE".
Why is that?
So if I use http://crawlservice.appspot.com/?key=123456&url=mydomain#!article, will the Google crawler be able to see the text in mydomain#!article?
Also, why key=123456? Does it mean everyone can use this service? Do we have our own key? Does Google limit the number of calls to the service?
Could you explain all these things?
Extra Info:
Christopher suggested I use this example:
https://github.com/ArcBees/GWTP-Samples/tree/master/gwtp-samples/gwtp-sample-crawler-service
However, I ran into another problem. My app is pure GWTP; it doesn't have appengine-web.xml in WEB-INF. I have no idea what App Engine (GAE) is, or what Maven is.
Do I need to register for App Engine?
My app may have a lot of traffic. Also, I am using a GoDaddy VPS. I don't want to sign up for App Engine, since I would have to pay Google for the extra traffic.
Everything in my GWTP app is OK right now except the crawler function.
So if I don't use Google App Engine, how can I build the crawler function for GWTP?
I tried to use HTMLUnit for my app, but HTMLUnit doesn't work for GWTP (see details here: Why HTMLUnit always shows the HostPage no matter what url I type in (Crawlable GWT APP)?)
I believe you are not allowed to crawl Google Groups. They are probably actively trying to prevent this, which is why you don't see the expected content.
There are a couple of points I want to elaborate on:
The Google Code documentation is no longer maintained. You should look on GitHub instead: https://github.com/ArcBees/GWTP/wiki/Crawler-Support
You shouldn't use http://crawlservice.appspot.com. This isn't a Google service; it's out of date, and we may decide to delete it down the road. It only serves as a public example. You should create your own application on App Engine (https://appengine.google.com/).
There is a sample here (https://github.com/ArcBees/GWTP-Samples/tree/master/gwtp-samples/gwtp-sample-crawler-service) using GWTP's Crawler Service. You can basically copy-paste it. Just make sure you update the <application> tag in appengine-web.xml to the name of your application and use your own service key in CrawlerModule.
Finally, if your client uses GWTP and you followed the documentation, it will work. If you want to try it manually, you must encode the query parameters.
For example, http://crawlservice.appspot.com/?key=123456&url=http://www.arcbees.com#!service will not work, because the hash (everything from # onwards) is not sent to the server.
On the other hand, http://crawlservice.appspot.com/?key=123456&url=http%3A%2F%2Fwww.arcbees.com%2F%23!service will work.
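If you're building such links programmatically, percent-encoding the whole url parameter is all it takes. A small PHP illustration (PHP chosen only because this page already uses it; any language's URL-encoding function works):

$url = 'http://www.arcbees.com/#!service';
echo 'http://crawlservice.appspot.com/?key=123456&url=' . rawurlencode($url);
// Prints: http://crawlservice.appspot.com/?key=123456&url=http%3A%2F%2Fwww.arcbees.com%2F%23%21service
// ('!' gets encoded as %21 too, which is equally valid.)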

CakePHP hosted on a server with multiple domains

I have a server with many domains/applications on it. I need to host a CakePHP application on that server. When I uploaded it, I got errors with respect to URLs.
For example, www.xyz.com/aboutus works; there is a controller called Aboutus.
But the URL www.xyz.com/aboutus/add, which should go to the add method in the Aboutus controller, works on my local system; live, however, it shows an error saying the 'add' controller is missing.
Locally, I changed the document root in Apache, but I can't do this on the live server because it hosts multiple sites.
You need to make sure that the ROOT, APP_DIR, and CAKE_CORE_INCLUDE_PATH variables in each site's webroot/index.php have been updated to point to the right paths (see below, where I list my settings). Other than that, just make sure your host has mod_rewrite on (stock rules below) and you should be good to go.
According to the CakePHP book for 2.0.x, it's easier to just change the include_path, but I haven't tried that yet: http://book.cakephp.org/2.0/en/deployment.html#multiple-cakephp-applications-using-the-same-core
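For reference, that include_path variant amounts to something like this near the top of webroot/index.php (a sketch only - as said, I haven't tried it; adjust the path to wherever your shared core lives):

ini_set('include_path', '/home/myusername/cakephp/cakephp_2_0_5/lib' . PATH_SEPARATOR . ini_get('include_path'));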
The file-structure I use:
/cakephp
/cakephp_1_3
/cakephp_2_0_5
/public_html
/mysite1.com
/mysite2.com
/mysite3.com
// webroot/index.php (of one of my sites). These are just the three
// lines that set the variables - they're not three consecutive lines
// in the actual file.
define('ROOT', DS.'home'.DS.'myusername'.DS.'public_html');
define('APP_DIR', DS.'mysite1.com');
define('CAKE_CORE_INCLUDE_PATH', DS.'home'.DS.'myusername'.DS.'cakephp'.DS.'cakephp_2_0_5'.DS.'lib');
Don't forget to make sure your database settings are still correct in app/Config/database.php.
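As for mod_rewrite, the stock CakePHP 2.x app/webroot/.htaccess looks like this; if clean URLs like /aboutus/add 404 in production, compare it against what's actually deployed:

<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ index.php [L]
</IfModule>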

A subdomain is preventing my search results from rising as they should in page rank

My problem is that I have a site which requires a dedicated page for every city I choose to support. Early on, I decided to use subdomains rather than a directory after my domain (i.e. I used la.truxmap.com rather than truxmap.com/la). I realize now that this was a major mistake, because Google seems to treat la.truxmap.com as a completely different site from ny.truxmap.com. So, for instance, if I search "la food truck map" my site will be near the top; however, if I search "nyc food truck map" I'm nowhere in sight, because ny.truxmap.com wouldn't rank very high by itself, and it doesn't get the boost it ought to be getting from the better-known la.truxmap.com.
So a mistake I made a year ago is now haunting my page rank. I'd like to know the most painless way of resolving my dilemma. I have received so much press at la.truxmap.com that I can't just kill the site, but could I redirect all requests at la.truxmap.com to truxmap.com/la, and do the same for all supported cities, without trashing the current, satisfactory page rank results I'm getting from la.truxmap.com?
EDIT
I left out some critical information. I am using Google Apps to manage my domain (that is, to add the subdomains) and Google App Engine to host my site. Google Apps provides a simple mechanism to mask truxmap.appspot.com (the App Engine domain) as la.truxmap.com, but I don't see how I can mask it as truxmap.com/la. If I can get this done, then I can just 301-redirect la.truxmap.com to truxmap.com/la, as suggested below.
Thanks so much!
You could send a "301 Moved Permanently" redirect to cause the Google crawler to update its references to your site, no?
See this article on 301 redirects and SEO.
You'll need to modify your app as follows:
Add www.truxmap.com as an alias for the app (you can't serve naked domains in App Engine, so just truxmap.com won't work)
Add support to your app for handling URLs of the form www.truxmap.com/something/, routing to the same handlers as the subdomain. You'll need to make sure you've debugged any relative path issues well before continuing.
Modify your app to serve 301 redirects for every URL under something.truxmap.com/whatever to www.truxmap.com/something/whatever (a permanent 301 rather than a temporary 302, so the rank transfers; sketched below).
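A minimal sketch of that redirect logic, written in PHP purely for illustration (the real handler would live inside your App Engine app; the hostnames are the ones from the question):

// Turn la.truxmap.com/whatever into www.truxmap.com/la/whatever.
$host = $_SERVER['HTTP_HOST'];                 // e.g. "la.truxmap.com"
$city = substr($host, 0, strpos($host, '.'));  // "la"
header('HTTP/1.1 301 Moved Permanently');
header('Location: http://www.truxmap.com/' . $city . $_SERVER['REQUEST_URI']);
exit;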
