I am trying to use HttpSocket to perform an operation I usually do with cURL: passing a .pem certificate and a .key file along with my request.
I know you can specify the path of your CA file using ssl_cafile, but I couldn't find a way to pass the .key file. Is this even possible?
cURL way:
curl_setopt($ch, CURLOPT_SSLCERT, $pemfile);
curl_setopt($ch, CURLOPT_SSLCERTTYPE, 'PEM');
curl_setopt($ch, CURLOPT_SSLKEY, $keyfile);
HttpSocket way to pass the .pem file:
App::uses('HttpSocket', 'Network/Http');
$Socket = new HttpSocket(array(
    'ssl_cafile' => '/file.pem',
));
Thanks
So far this isn't possible with HttpSocket; you have to use your own cURL implementation.
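For example, here is a minimal sketch of doing the request through PHP's cURL extension directly, assuming the paths and URL below are placeholders for your own certificate, key, and endpoint:
<?php
// Placeholder paths and URL - substitute your own
$pemfile = '/path/to/client-cert.pem';
$keyfile = '/path/to/client-key.key';
$url     = 'https://api.example.com/endpoint';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Client certificate and private key, as in the cURL snippet above
curl_setopt($ch, CURLOPT_SSLCERT, $pemfile);
curl_setopt($ch, CURLOPT_SSLCERTTYPE, 'PEM');
curl_setopt($ch, CURLOPT_SSLKEY, $keyfile);
// If you also need a CA bundle for peer verification:
// curl_setopt($ch, CURLOPT_CAINFO, '/path/to/ca.pem');
$response = curl_exec($ch);
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);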
I want to upgrade my SuiteCRM installation to the latest generation of SuiteCRM. My existing SuiteCRM uses REST API v4.1, and since I learned that the latest generation requires API v8, I installed SuiteCRM 7.11.3 with dummy data on our demo server, running over HTTPS with PHP 7.1.17.
After this, I generated a "client_id" and "client_secret" for the "Client Credentials" grant type by navigating to "Admin" > "OAuth2 Clients and Tokens" > "New Client Credentials Client".
Now I am testing the CRM API authentication with client credentials and trying to obtain a session using the code below, but I get no array or session back, and no error either.
$ch = curl_init();
$header = array(
    'Content-type: application/vnd.api+json',
    'Accept: application/vnd.api+json',
);
$postStr = json_encode(array(
    'grant_type' => 'client_credentials',
    'client_id' => 'xxxxxxxxxx',
    'client_secret' => 'xxxxxxxxxx'
));
$url = 'https://url/Api/access_token';
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'POST');
curl_setopt($ch, CURLOPT_POSTFIELDS, $postStr);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
$output = curl_exec($ch);
$tab = json_decode($output);
print_r($tab);
curl_close($ch);
I have tried adding "'scope' => ''" to $postStr as suggested in another post, but it did not return anything either.
Can anyone please guide me on this?
Make sure you generate your private and public keys first or it won't work: https://docs.suitecrm.com/developer/api/version-8/json-api/#_before_you_start_calling_endpoints
Also, your POST variables should be in the request body.
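As a side note on the "no error" symptom: when curl_exec() fails it returns false, json_decode() on that yields null, and print_r(null) prints nothing, so failures are silent. A small sketch of making them visible:
$output = curl_exec($ch);
if ($output === false) {
    // Transport-level failure: DNS, TLS, connection refused, ...
    echo 'cURL error: ' . curl_error($ch);
} else {
    // Otherwise inspect the HTTP status and the raw response body
    echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    echo $output . "\n";
    print_r(json_decode($output, true));
}
curl_close($ch);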
Our URL is a remote URL that I access through its IP address. It throws the error "file_get_contents failed to open stream: Connection refused". I have used this code:
$html = file_get_contents('http://xx.xxx.xx.xx:xxxx/apps/index.php');
print_r($html);
var_dump($html);
What does this error mean?
A lot of hosts prevent you from loading files from remote URLs for security reasons (the allow_url_fopen setting in php.ini). It's better to use cURL to download the contents of the file.
<?php
$url = 'http://xx.xxx.xx.xx:xxxx/apps/index.php';
// Fetch the page with cURL instead of file_get_contents()
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $url);
// Return the body as a string rather than printing it directly
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
// Exclude the response headers from the returned body
curl_setopt($curl, CURLOPT_HEADER, false);
$data = curl_exec($curl);
curl_close($curl);
// $data now holds the page contents, or false on failure
var_dump($data);
Ref: Get file content via PHP cURL
HTH :)
Getting a remote page with file_get_contents() requires it to open an HTTP stream; you can pass a stream context to control that request.
Something like this:
<?php
// Create a stream
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "Accept-language: en\r\n" .
                    "Cookie: foo=bar\r\n"
    )
);
$context = stream_context_create($opts);
// Open the file using the HTTP headers set above
$file = file_get_contents('http://www.example.com/', false, $context);
?>
For reference please check PHP.net: http://php.net/manual/en/function.file-get-contents.php
I'm trying to perform a POST request with digest authentication. With libcurl, I set the options:
curl_easy_setopt(curl_handle, CURLOPT_HTTPAUTH, CURLAUTH_DIGEST);
curl_easy_setopt(curl_handle, CURLOPT_USERPWD, "username:password");
before setting all the other options (POST, URL and everything else). The server closes my connection, and I think no digest is being sent. I just don't know how to get the automatic challenge-response behaviour of digest authentication. If I set HTTPAUTH to CURLAUTH_BASIC it encodes the credentials, and with the VERBOSE option I can see the header containing Authorization: Basic. With digest, no such header appears.
Do you know how I can do this, or can you give me an example? I have really searched everywhere.
For a basic POST request you should do:
curl_easy_setopt(hnd, CURLOPT_USERPWD, "user:pwd");
curl_easy_setopt(hnd, CURLOPT_HTTPAUTH, (long)CURLAUTH_DIGEST);
curl_easy_setopt(hnd, CURLOPT_CUSTOMREQUEST, "POST");
For a multipart POST (a.k.a multipart/form-data):
struct curl_httppost *post;
struct curl_httppost *postend;
/* setup your POST body with `curl_formadd(&post, &postend, ...)` */
curl_easy_setopt(hnd, CURLOPT_USERPWD, "user:pwd");
curl_easy_setopt(hnd, CURLOPT_HTTPPOST, post);
curl_easy_setopt(hnd, CURLOPT_HTTPAUTH, (long)CURLAUTH_DIGEST);
curl_easy_setopt(hnd, CURLOPT_CUSTOMREQUEST, "POST");
Pro tip: use the curl command-line tool with --libcurl request.c; it writes into that C file the list of options used to perform the corresponding request. For example, curl --digest -u user:pwd -d 'a=b' --libcurl request.c http://example.com/ produces a request.c showing the equivalent curl_easy_setopt() calls.
I have created a method that I want to expose to the outside world. My application is controlled by ACL, so to consume the RESTful service you send the data via GET or POST to that method's URL:
http://mysite.com?action=this&data=this etc.
But I think I need to send the username and password with it too, don't I? And if so, where do I add them?
What I have tried:
<?php
$ch = curl_init("http://test.local/sites/loginData");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// Send the username and password via HTTP basic auth
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, 'user:password');
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Sample Code');
curl_setopt($ch, CURLINFO_HEADER_OUT, true);
$output = curl_exec($ch);
print_r($output);
curl_close($ch);
echo '<br><br>';
echo $output;
?>
But it doesn't show me any response.
If you are using HTTP basic auth, you have to configure the Auth component in CakePHP to use that authentication mechanism.
See http://book.cakephp.org/2.0/en/core-libraries/components/authentication.html#authentication for authentication
and http://book.cakephp.org/2.0/en/core-libraries/components/authentication.html#configuring-authorization-handlers for authorization via ACL.
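As a rough sketch of that configuration, assuming CakePHP 2.x and that your controllers/actions are already set up in the ACL tables, the Auth component can accept HTTP basic credentials and authorize against the ACL like this:
// app/Controller/AppController.php (CakePHP 2.x) - a sketch, adapt to your app
class AppController extends Controller {
    public $components = array(
        'Acl',
        'Auth' => array(
            // Read credentials from the HTTP Basic Authorization header
            'authenticate' => array('Basic'),
            // Check the authenticated user against the ACL actions tree
            'authorize' => array(
                'Actions' => array('actionPath' => 'controllers')
            )
        )
    );
}
With that in place, the cURL request above only needs CURLOPT_HTTPAUTH set to CURLAUTH_BASIC and CURLOPT_USERPWD set to a valid user's credentials.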
I wanted to use published Google Docs documents and Twitter tweets as the data source of a Silverlight application, but ran into clientaccesspolicy issues.
I read many articles like this and this about how difficult it is to get around the clientaccesspolicy issue.
So I wrote this cURL script, put it on my PHP site, and now I can get the text of any Google Docs document or Twitter feed into my Silverlight application:
<?php
$url = filter_input(INPUT_GET, 'url', FILTER_SANITIZE_STRING);
$validUrls[] = "http://docs.google.com";
$validUrls[] = "http://twitter.com/statuses/user_timeline";
if (beginsWithOneOfThese($url, $validUrls)) {
    $user_agent = 'Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)';
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_COOKIEJAR, "/tmp/cookie");
    curl_setopt($ch, CURLOPT_COOKIEFILE, "/tmp/cookie");
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FAILONERROR, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    curl_setopt($ch, CURLOPT_USERAGENT, $user_agent);
    curl_setopt($ch, CURLOPT_VERBOSE, 0);
    echo curl_exec($ch);
    curl_close($ch);
} else {
    echo "invalid url";
}
function beginsWithOneOfThese($main, $prefixes) {
    foreach ($prefixes as $prefix) {
        if (beginsWith($main, $prefix))
            return true;
    }
    return false;
}
function beginsWith($main, $prefix) {
    return strpos($main, $prefix) === 0;
}
?>
So it makes me wonder:
Why is there so much discussion about whether or not URLs support clientaccesspolicy, when you can just write a simple proxy script and get the information through it?
Why aren't there services, like URL-shortening services, that supply this functionality?
What are the security implications of having a script like this?
While you might think that a proxy gives you the same capabilities as having the client make the request, it doesn't. More specifically, you won't have the client's cookies/credentials for the target site, and in some cases, a client can reach the target site but your proxy can't (e.g. Intranet).
http://blogs.msdn.com/ieinternals/archive/2009/08/28/Explaining-Same-Origin-Policy-Part-1-Deny-Read.aspx explains Same Origin Policy at some length.
In terms of the security implications for your proxy: that depends on whether you have access control on it. If not, a bad guy could use your proxy to hide his tracks as he hacks sites or downloads illegal content.
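For example, one simple (though not bulletproof, since the secret ships with the client) form of access control is to require a shared key on every request and reject anything else; the parameter name and value below are made up for illustration:
<?php
// Hypothetical shared secret known only to your Silverlight client
$expectedKey = 'replace-with-a-long-random-string';
$key = filter_input(INPUT_GET, 'key', FILTER_SANITIZE_STRING);
if ($key !== $expectedKey) {
    header('HTTP/1.1 403 Forbidden');
    exit('forbidden');
}
// ...continue with the URL whitelist check and the cURL fetch from the script above...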