I am using libcurl for a small C application. The project uses HTTPS and requires validation of both server and client certificates. I cannot use an option to suppress the verification, since I work in an insecure environment.
I am currently trying to get the server-side certificate validated. My first attempts gave me an expected error:
Peer certificate cannot be authenticated with given CA certificates
As said, an expected error; I understand what the message means. I dug into the documentation of libcurl and found that it supports "certificate bundles", and that newer versions do not come with a bundle at all. All the options I found (and also all the explanations) refer to certificate files read at runtime, and obviously suggest adding the required CA certificate to the local bundle.
Instead, I would prefer to include a single certificate inline in the application, i.e. compiled in. This makes sense for this special case, since the application only ever accesses a single, hard-coded URL, and therefore a single server. I accept that I'd have to replace all deployed copies of the application if the server certificate changes. However, I cannot find any option for that in the documentation. I would prefer this strategy because it allows a much more compact deployment of the application: a single file instead of a directory structure plus runtime configuration.
So my question is: does libcurl offer a way to include a CA certificate at compile time that can be used at runtime without having to rely on an external bundle?
In libcurl, certificate verification is handled by the SSL backend, typically OpenSSL. With an OpenSSL build you can hook into the SSL_CTX that libcurl creates via CURLOPT_SSL_CTX_FUNCTION and add your certificate to its verification store at runtime, or use SSL_CTX_set_verify to replace the verification function with your own.
Check curlx.c for an example.
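For the compile-time approach the question asks about, here is a minimal sketch, assuming libcurl is built against OpenSSL; the PEM contents and the URL are placeholders. The CA certificate is embedded as a string constant and added to the connection's trust store from a CURLOPT_SSL_CTX_FUNCTION callback:

#include <stdio.h>
#include <curl/curl.h>
#include <openssl/ssl.h>
#include <openssl/x509.h>
#include <openssl/pem.h>

/* CA certificate compiled into the binary; replace with the real PEM data. */
static const char embedded_ca_pem[] =
    "-----BEGIN CERTIFICATE-----\n"
    "...base64-encoded certificate data...\n"
    "-----END CERTIFICATE-----\n";

/* Called by libcurl before the TLS handshake: parses the embedded PEM and
   adds it to the verification store of this connection's SSL_CTX. */
static CURLcode add_embedded_ca(CURL *curl, void *sslctx, void *userptr)
{
    (void)curl; (void)userptr;
    CURLcode result = CURLE_ABORTED_BY_CALLBACK;

    BIO *bio = BIO_new_mem_buf(embedded_ca_pem, -1);
    if (bio) {
        X509 *cert = PEM_read_bio_X509(bio, NULL, NULL, NULL);
        if (cert) {
            X509_STORE *store = SSL_CTX_get_cert_store((SSL_CTX *)sslctx);
            if (X509_STORE_add_cert(store, cert) == 1)
                result = CURLE_OK;
            X509_free(cert);
        }
        BIO_free(bio);
    }
    return result;
}

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl)
        return 1;

    /* placeholder for the single hard-coded URL the question mentions */
    curl_easy_setopt(curl, CURLOPT_URL, "https://server.example.com/");
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 1L);
    curl_easy_setopt(curl, CURLOPT_SSL_CTX_FUNCTION, add_embedded_ca);

    CURLcode res = curl_easy_perform(curl);
    if (res != CURLE_OK)
        fprintf(stderr, "request failed: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return (res == CURLE_OK) ? 0 : 1;
}

The curl source also ships a similar in-memory CA example (cacertinmem.c) in its docs/examples directory, which is worth comparing against this sketch.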
Related
I'm creating a VS Code extension with a webview that contains a React application. In the React code, I'm making a GET request to a REST API, but it keeps failing due to the following error:
Failed to load resource: net::ERR_CERT_AUTHORITY_INVALID
Any ideas on why this may be happening or a workaround? Maybe this is a restriction of webviews?
If I make the call in the extension code, it works fine.
I upgraded my browser to the latest version and it worked for me.
See the link below for how to update your browser version:
https://www.computerhope.com/issues/ch001388.htm
Assuming the error is about the certificate of the remote side (the one serving the REST API), it occurs because of one of the following:
the authority that signed the certificate is not recognized on the client side (i.e. the authority's certificate is not installed on your PC)
the certificate has expired
your PC has a wrong date
You can correct the above, or, depending on your tools, work around it by explicitly ignoring the untrusted remote certificate. But this workaround should remain for test purposes only, as it is a security hole.
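As an aside, since several other questions on this page use libcurl: in that library the equivalent test-only "ignore the certificate" switch looks like the sketch below. The same caveat applies; never ship it.

#include <curl/curl.h>

/* TEST ONLY: disables peer certificate and host name verification,
   so untrusted or mismatched certificates are silently accepted. */
static void disable_tls_verification(CURL *curl)
{
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 0L);
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYHOST, 0L);
}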
I am trying to run my IoT client on ThreadX OS, which doesn't have a file system or a certificate trust store the way Linux does. When I look at the traffic in Wireshark, the client closes the connection with a fatal "Bad certificate" error. I tried all the options suggested in different forums, but none of them solved my problem. The solution I tried is shown below.
I used the API below to add only the Baltimore root certificate available in cert.c:
IoTHubDeviceClient_LL_SetOption(device_ll_handle, OPTION_TRUSTED_CERT, certificates);
It's not working for me, because we don't have a trust store like Linux has.
#ifdef SET_TRUSTED_CERT_IN_SAMPLES
    // Setting the Trusted Certificate. This is only necessary on systems without
    // built-in certificate stores.
    IoTHubDeviceClient_LL_SetOption(device_ll_handle, OPTION_TRUSTED_CERT, certificates);
#endif // SET_TRUSTED_CERT_IN_SAMPLES
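For context, on a platform without an OS certificate store this pattern boils down to something like the following sketch; the PEM contents are placeholders, and in the Azure IoT C SDK samples the certificates string normally comes from certs.c:

#include "iothub_device_client_ll.h"
#include "iothub_client_options.h"

/* Placeholder PEM bundle; in the SDK samples this string lives in certs.c and
   contains the root CA(s) that the TLS layer should trust. */
static const char certificates[] =
    "-----BEGIN CERTIFICATE-----\n"
    "...base64-encoded certificate data...\n"
    "-----END CERTIFICATE-----\n";

static void set_trusted_cert(IOTHUB_DEVICE_CLIENT_LL_HANDLE device_ll_handle)
{
    /* With no system trust store (as on ThreadX), the trusted root has to be
       handed to the TLS layer explicitly through this option. */
    (void)IoTHubDeviceClient_LL_SetOption(device_ll_handle, OPTION_TRUSTED_CERT, certificates);
}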
I need answers for two important questions.
1) Do I need to add the entire certificate string from cert.c to my client, or only the first (Baltimore) root as the CA root?
2) Without a trust store, how can the client tell the Azure cloud that it has the trusted root?
Any help would be appreciated.
First, a little background:
I have already managed to connect to a Microsoft web service using C#. To use this web service, I have to supply a username and a password in the C# code. I also have to install a certificate (in .cer format) into the "Root Certificate Authorities" section of the system's certificates.
(By the way, the C# class I use to connect to the service was automatically generated for me with the command line tool "svcutil.exe https://address.of.service")
Here is my question:
How can I connect to this web service using Axis2/C? The example in the documentation is of a completely different nature -- it asks for a certificate, key file, and a passphrase. In my case, it is username, password, and a .cer file.
So I'm not sure where to even begin. I don't know where my .cer file, username and password should go exactly. Any ideas?
If at all possible, stay away from Axis2; perhaps use gSOAP instead.
That said, figure out which of the HTTP libraries you are building Axis2 with (I believe it can use a number of different ones depending on which OS you are building on, etc.).
Also, you might want to update your question with a link to the sample program you are talking about and the relevant excerpt from the C# client, for reference.
Does anybody have any experience with libcurl (C/C++) and Kerberos authentication?
I am able to set everything up and post data; however, now that we have switched on SSO (via SPNEGO on a JBoss server), I am unable to authenticate properly, with authorization being downgraded to Basic.
At this stage I am using curl_easy_setopt(curl, CURLOPT_HTTPAUTH, CURLAUTH_GSSNEGOTIATE) and have also set CURLOPT_KRBLEVEL to "private".
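For reference, the setup described here, written out as a self-contained sketch; the URL and POST body are placeholders, and the empty CURLOPT_USERPWD (":") is the usual way to activate Negotiate so the ticket from the Kerberos credential cache is used:

#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl)
        return 1;

    /* placeholder URL standing in for the JBoss service endpoint */
    curl_easy_setopt(curl, CURLOPT_URL, "https://jboss.example.com/service");

    /* ask for SPNEGO/Negotiate only, so curl cannot fall back to Basic */
    curl_easy_setopt(curl, CURLOPT_HTTPAUTH, (long)CURLAUTH_GSSNEGOTIATE);

    /* an empty user:password pair activates Negotiate; the actual ticket
       comes from the Kerberos credential cache, not from this string */
    curl_easy_setopt(curl, CURLOPT_USERPWD, ":");

    /* example POST body */
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "key=value");

    CURLcode res = curl_easy_perform(curl);
    if (res != CURLE_OK)
        fprintf(stderr, "curl_easy_perform() failed: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return (res == CURLE_OK) ? 0 : 1;
}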
Looking at the headers in Wireshark, the response from curl doesn't provide any credentials, which is throwing an EncryptionKey exception on the JBoss server. I am able to authenticate on the server via .NET using cached credentials and an HttpRequest object. The only problem is we can't use .NET on this project.
Thanks in advance for any help.
As an additional point: I have just seen that under libcurl, Kerberos is only supported for FTP. I am trawling the source to try to confirm whether this is the case. Does anybody know of any other libraries that we might be able to use to perform our POST?
OK, for anybody that reaches this point...
I downloaded the Win32 generic build from http://curl.haxx.se/download.html.
It is compiled with support for SSH, NTLM, Kerberos, etc. The download includes the relevant DLLs you will need to create apps for any environment that requires cached credentials.
If you want specific code pertaining to callbacks, chunking, etc., drop me a line and I can forward it to you. Given the flexibility of the curl library, it would be a bit much to just drop a ton of code here. Once you have curl.exe up and running, there is a command line switch (--libcurl) that can output the equivalent code to a text file (although it doesn't include information regarding any write callback functions etc., just the easy setup options required). Thanks to n.m. for your help, much appreciated. G
I'm doing HTTPS web requests in Silverlight using the "WebRequest"/"WebResponse" framework classes.
The problem is: I make a request to a URL like https://12.34.56.78
I receive back a VeriSign-signed certificate whose subject is a domain name like www.mydomain.com.
Hence this results in a remote certificate mismatch error.
First question: Can I somehow accept the invalid certificate and get the WebResponse content? (Even if it involves using other libraries, I'm open to it.)
Additional details (for those interested in why I need this scenario):
I'm trying to give a client access to a Silverlight app deployed on a test server.
The client accesses the Silverlight app at www.mydomain.com/app
Then I make some REST requests to https://xx.mydomain.com
The problem is I don't want to make requests to https://xx.mydomain.com, since that is our production server. For this reason I use https://12.34.56.78 instead of https://xx.mydomain.com.
The client has some firewalls/proxies, and if I simply change his hosts file and map xx.mydomain.com to 12.34.56.78, web requests don't resolve to the mapped IP.
I say this because on his network web requests fail if I try that, whereas on my network the hosts-file change works without problems.
UPDATE: I fixed the problem by deploying test releases to an alternative address, https://yy.domain.com, and allowing the user to configure, for test purposes, the base URL to which I make requests to be https://yy.domain.com.
Using a certificate that contained the IP in the subject or in a subject alternative name would probably have worked too, but it would have cost some money to get one issued by a certified provider, and it would not be so good because IPs might change.
After doing more research, it looks like Microsoft won't add this feature any time soon, unless there's a scenario for non-testing/debugging uses.
See: http://connect.microsoft.com/VisualStudio/feedback/details/368047/add-system-net-servicepointmanager-servercertificatevalidationcallback-property