Non-HTTPS URL in IdentityServer4 Discovery Document - identityserver4

I hosted IdentityServer4 on IIS. The endpoint URLs in the discovery document use HTTP instead of HTTPS.
I already tried the forwarded headers method, but it doesn't seem to have any effect. We have SSL offloading; is that the reason? Is there a different solution for this?
app.UseForwardedHeaders(new ForwardedHeadersOptions
{
    ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
});

In older versions (< 4.0) it was possible to set IdentityServerOptions.PublicOrigin, including a fixed scheme (since some people seem to have trouble getting the forwarded headers working; I used this myself in an older project). As for SSL offloading: one common reason UseForwardedHeaders appears to have no effect is that the middleware only trusts loopback proxies by default, so you may need to configure (or clear) ForwardedHeadersOptions.KnownProxies and KnownNetworks for your load balancer.
For the newer versions (4.x+):
If you are sure it will be HTTPS and you are sure what the public domain is, you can follow this issue on GitHub, where leastprivilege offers:
app.Use(async (ctx, next) =>
{
    ctx.SetIdentityServerOrigin("https://yourdomain.com");
    //ctx.Request.Scheme = "https"; // direct approach
    await next();
});
According to the information given in the issue, this should do the same.

In the new version of the library, Duende IdentityServer 6.2.3, the method SetIdentityServerOrigin has been marked obsolete (as have most of the extension methods in Duende.IdentityServer.Extensions.HttpContextExtensions). It now recommends using IServerUrls.Origin, which I've implemented as such:
app.Use(async (ctx, next) =>
{
    if (ctx != null)
    {
        ctx.RequestServices.GetRequiredService<IServerUrls>().Origin = "[NEW URL]";
    }
    await next();
});

Related

Why does a POST request become a GET request in Microsoft Edge?

I'm using Axios and React in my frontend app. When I try to send a POST request over HTTPS with Axios (xhr, fetch), I run into a strange issue: my POST request turns into a GET in the Edge dev tools.
Here is my request:
const response = await axios.post(
    config.local + "/api/login/credentials",
    {
        login,
        password
    }
);
Then I tried to dig deeper: I created a simple HTTPS server and tried to send a POST request from the client.
const https = require('https');
const fs = require('fs');

const options = {
    key: fs.readFileSync('server.key'),
    cert: fs.readFileSync('server.crt')
};
const PORT = 8188;

function handleRequest(req, res){
    console.log(req.method);
    res.end(); // end the response so the client doesn't hang
}

const server = https.createServer(options, handleRequest);
server.listen(PORT, function(){
    console.log("Server listening on: https://localhost:" + PORT);
});
And, as far as I can tell, the request never reaches the server.
Here are some links:
Issue link 1
Issue link 2
Is there any error in the console? You could use Fiddler to trace the network traffic and see the details. As mentioned in the first link you provided, you could also try the two solutions from your GitHub link:
Solution 1:
My issue was caused by HTTPS: the backend required HTTPS while I was sending an HTTP POST from the front end. I fixed it by changing both to HTTPS.
Or Solution 2:
I solved it by passing the data in a different format using the URLSearchParams class.
I had the same problem with:
Microsoft Edge 44.18362.267.0
Microsoft EdgeHTML 18.18362
Windows 10
I think the problem is that Edge only supports certain data types in POST requests. If you want to use the content type 'application/x-www-form-urlencoded', then use URLSearchParams to make it work in Edge as well as in browsers like Firefox and Chrome. Passing a query string does not seem to work in Edge even though it does in other browsers.
Modifying the original post source code, the result would be:
import Axios from 'axios'
import Promise from 'es6-promise'

Promise.polyfill()

const URL = 'http://192.168.0.112/account/login/username'

// Use URLSearchParams instead:
const dataParams = new URLSearchParams()
dataParams.append('username', 'admin')
dataParams.append('password', 'admin')

Axios.post(URL, dataParams, {
    // if you still have problems try more specific options like:
    // withCredentials: true,
    // crossdomain: true,
    // ...
})
    .then(res => {
        console.log(res)
    })
    .catch(error => {
        console.log(error)
    })
Aside from that, the issue in your question is usually caused by CORS. If you use CORS and the request comes from an untrusted origin, Microsoft Edge will only send the GET request, causing the other requests to fail. You could also refer to this thread to understand why CORS requests fail in Microsoft Edge but work in other browsers. Microsoft Edge also uses Enhanced Protected Mode; the outcome is that if the site is trusted, it makes two requests, OPTIONS and GET, but if it's not trusted, it only makes the GET request, which causes it to fail.
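If CORS is the suspect, one way to rule it out is to make sure the server answers the OPTIONS preflight explicitly. A minimal sketch of how that decision could be factored out (the function name, origin list, and allowed methods here are made-up examples, not anything from the original posts):

```javascript
// Decide which CORS headers (if any) to send for a given request origin.
// Returns null when the origin is not in the allowed list.
function corsHeaders(origin, allowedOrigins) {
    if (!allowedOrigins.includes(origin)) return null;
    return {
        "Access-Control-Allow-Origin": origin,
        "Access-Control-Allow-Methods": "POST, GET, OPTIONS",
        "Access-Control-Allow-Headers": "Content-Type",
    };
}

// Inside a Node request handler, a preflight would then be answered like:
// if (req.method === "OPTIONS") {
//     const headers = corsHeaders(req.headers.origin, ["https://app.example.com"]);
//     res.writeHead(headers ? 204 : 403, headers || {});
//     res.end();
// }
```

With the preflight answered cleanly, a surviving POST-turns-into-GET symptom would point away from CORS and toward one of the other causes above.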
In my case the problem was caused by a self-signed certificate. As soon as I started using a certificate from a trusted CA, everything began to work.

How do you create an API/IdentityServer/Blazor(server-side) application?

I attempted to build this application myself but have hit several stumbling blocks along the way. I am thinking that it may be best to step back and take a larger look at what I am trying to create. There doesn't seem to be any documentation on how to make what I am looking for (unless someone can point me to the right place I might have missed).
Ultimately, what I would like is to have a Blazor (server-side) application make API calls to use data in the app, and then have IdentityServer4 encapsulate the authentication. I need to have Azure as well as ASP.NET Identity as the possible authentication methods.
I have tried and was able to create an IdentityServer4 instance that also has a local API. I can make calls to it from Postman to get tokens and such. But when it comes to tying a Blazor (server-side) application to the IdentityServer4, I am befuddled.
I have tried to ask this question in specifics but haven't gotten any results at all. I am hoping this larger look at it might be helpful.
It seems like oidc-client.js is the way to get the data from the IdentityServer4 callback, but that doesn't seem to tie in nicely with the .NET authorization in Blazor (server-side). How do I get these to work together?
IMPORTANT: There are better sources now than my answer. Follow the links provided in the last part of this answer.
I've got a similar setup with API / IdentityServer4 / Blazor(server-side). I'll show you some of the code I used, maybe you can make some use of it.
Using the NuGet package Microsoft.AspNetCore.Authentication.OpenIdConnect, I've got this code in the ConfigureServices method of the Startup class:
services.AddAuthentication(options =>
{
    options.DefaultScheme = "Cookies";
    options.DefaultChallengeScheme = "oidc";
})
.AddCookie("Cookies")
.AddOpenIdConnect("oidc", options =>
{
    options.Authority = "https://localhost:5001";
    options.ClientId = "myClient";
    options.ClientSecret = "mySecret";
    options.ResponseType = "code id_token";
    options.SaveTokens = true;
    options.GetClaimsFromUserInfoEndpoint = true;
    options.Scope.Add("MyApi");
    options.Scope.Add("offline_access");
    options.ClaimActions.MapJsonKey("website", "website");
});
and in the Configure method app.UseAuthentication();
Then in App.razor I used the CascadingAuthenticationState component:
<CascadingAuthenticationState>
    <Router AppAssembly="typeof(Startup).Assembly" />
</CascadingAuthenticationState>
And using the NuGet package Microsoft.AspNetCore.Authorization in my main page Index.razor:
@using Microsoft.AspNetCore.Authorization
@attribute [Authorize]
Now it should say "Not authenticated" when you open the main page, but there's still no redirection to IdentityServer4. For this you've got to add MVC in the Startup too, as I learned from this Stack Overflow question:
services.AddMvcCore(options =>
{
    var policy = new AuthorizationPolicyBuilder()
        .RequireAuthenticatedUser()
        .Build();
    options.Filters.Add(new AuthorizeFilter(policy));
});
Now you should be redirected to IdentityServer4 to log in after starting the application. In my case I've got an ApiClient that describes the methods of my API. I use DI to inject the ApiClient and add the access token:
services.AddHttpClient<IApiClient, ApiClient>(async (serviceProvider, client) =>
{
    var httpContextAccessor = serviceProvider.GetService<IHttpContextAccessor>();
    var accessToken = await httpContextAccessor.HttpContext.GetTokenAsync("access_token");
    client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
    client.BaseAddress = new Uri("http://localhost:55578");
});
Like you said, there is not much documentation on this topic except some answers here on Stack Overflow. It took me a long time to set this up, so I hope I can help someone else with this post.
UPDATE: Logout process
Logging out with this setup requires a detour through a Razor page because the HttpContext is inaccessible after the Blazor component is loaded.
Create a new Razor Page in the Pages folder and add the following code to the newly created Logout.cshtml.cs:
public class LogoutModel : PageModel
{
    // async Task (not async void) so the framework awaits the handler
    public async Task OnGetAsync()
    {
        await HttpContext.SignOutAsync("Cookies");
        var prop = new AuthenticationProperties()
        {
            RedirectUri = "http://localhost:62909"
        };
        await HttpContext.SignOutAsync("oidc", prop);
    }
}
Add a logout button somewhere which calls UriHelper.NavigateTo("/Logout"), relying on @inject IUriHelper UriHelper. Done!
UPDATE: Login Workaround
The previously described login process worked locally, but after publishing to the test server I had the problem that the IHttpContextAccessor was always null inside the AddHttpClient method. So I ended up using the same workaround as with the logout process: I let IdentityServer redirect to a Razor page (which always has an HttpContext), save the access token in the user's claims, and redirect to the index page. In the AddHttpClient method I just get the token from the user claim and put it into the Authorization header.
UPDATE: Open issues
I still struggle to get this setup working on our server. I opened this issue and this feature request on the AspNetCore GitHub, but both got closed without a proper answer. For the time being, here are some blogs that give a good overview of the current state of the topic:
https://mcguirev10.com/2019/12/15/blazor-authentication-with-openid-connect.html
https://wellsb.com/csharp/aspnet/blazor-consume-identityserver4-protected-api/
Try this
Blazor Consume IdentityServer4 Protected API

Protect a single api resource with multiple IDServers

So I have a .NET Core Web API, let's call it "CMS", and it's currently protected by an IdentityServer4 server as an API resource. I have configured the ID4 server to have the IDP claim of MyIDP.
For business reasons, I need to give a client their own IdentityServer, but they would also like their users to access the same API, "CMS".
Is this possible?
In the Startup.cs of my CMS API it currently looks like this:
services.AddAuthentication("Bearer")
    .AddIdentityServerAuthentication(options =>
    {
        options.Authority = "http://www.idserver1.com";
        options.RequireHttpsMetadata = true;
        options.ApiName = "cmsapi";
    });
So to add protection for another ID server, I assume I could just duplicate the AddAuthentication call but change the scheme name from Bearer to something else, but that seems wrong?
The reason I think this should be possible is that I have been able to add multiple external providers to my web application in this manner. But that was for a sign-in flow and not for an API.
If this is possible, how do I go about it?
This can be achieved quite simply. Suppose you want to issue a separate subdomain for each of your clients, auth0.yourdomain.com and auth1.yourdomain.com, and you want an API resource to respect tokens from either of those identity providers.
Assuming that the signing key is the same, you can configure a shared issuer URI on the identity server side in Startup.cs -> ConfigureServices(...):
var builder = services.AddIdentityServer(options =>
{
    options.IssuerUri = "auth.yourdomain.com";
})
...
And then on the api side you can respect the single issuer uri without having to duplicate authentication schemes:
services.AddAuthentication("Bearer")
    .AddIdentityServerAuthentication(options =>
    {
        options.Authority = "auth.yourdomain.com";
        options.RequireHttpsMetadata = true;
        options.ApiName = "cmsapi";
    });
One thing I can't remember is whether the request scheme (http/https) is inferred for the issuer URI or not, so you might need to specify it as well (https://auth.yourdomain.com). Other than that, this sort of implementation should be quite seamless as far as your clients are concerned.
I think I may have figured out the solution, based on another problem that was happening to me over here:
Using Client Credentials flow on identityserver4 and custom AuthorizationHandler User.Identity.isAuthenticated = false
It turns out you can use multiple authentication schemes to protect an API, and you choose which things to protect with which scheme using the AuthenticationSchemes property of the Authorize attribute.
So you would just need a way to map the incoming bearer token to the correct authentication scheme.

Get compute metadata from app engine, got 404 page not found error

I'm trying to get project-wide metadata from App Engine; the URL looks like this:
http://metadata.google.internal/computeMetadata/v1/project/attributes/IT_EBOOKS_API
Stackdriver Logging gives me this error:
message: '404 - "404 page not found\\n"',
But I can get the metadata from Compute Engine. Here is the output:
novaline_dulin@test:~$ curl http://metadata.google.internal/computeMetadata/v1/project/attributes/IT_EBOOKS_API -H "Metadata-Flavor: Google"
http://it-ebooks-api.info/v1novaline_dulin@test:~$
And here is my code for getting the custom project-wide metadata:
const request = require('request-promise');

async function getMetaData(attr) {
    const url = `http://metadata.google.internal/computeMetadata/v1/project/attributes/${attr}`;
    const options = {
        headers: {
            'Metadata-Flavor': 'Google'
        }
    };
    console.log('url:', url);

    return request(url, options)
        .then((response) => {
            console.info(`Retrieve meta data successfully. meta data: ${response}`);
            return response;
        })
        .catch((err) => {
            console.error('Retrieve meta data failed.', err);
            return '';
        });
}
Is there something wrong? Thanks.
update
I can get the project ID from the metadata server correctly. Here is the code:
const METADATA_PROJECT_ID_URL = 'http://metadata.google.internal/computeMetadata/v1/project/project-id';

async function getProjectId() {
    const options = {
        headers: {
            'Metadata-Flavor': 'Google'
        }
    };

    return request(METADATA_PROJECT_ID_URL, options)
        .then((response) => {
            console.log('response: ', response);
            return response;
        })
        .catch((err) => {
            if (err && err.statusCode !== 200) {
                console.log('Error while talking to metadata server.');
                return 'Unknown_Project_ID';
            }
            return Promise.reject(err);
        });
}
A while ago this wasn't possible at all in the standard environment; see Is there a way to access the Google Cloud metadata service from AppEngine Standard for runtime configuration?
But things appear to be changing.
There is a mention of the metadata service in the (1st generation) standard environment documentation, but:
only for the Java sandbox
potentially limited scope: only a subset of the endpoints is mentioned, so maybe the user-configured ones aren't, indeed, covered. But it may be a matter of interpretation (emphasis mine):
The following table lists the endpoints where you can make HTTP requests for specific metadata.
read-only:
Note: Metadata access is currently read only: you cannot write your own metadata for an instance.
This means the DNS limitation that made it impossible a while ago has been eliminated. Since you can get the data in the flexible environment, the data exists, and you're not actually trying to write it, so what you experience isn't related to the read-only limitation either.
It seems that the service feature/endpoint you seek is indeed more likely not available/functional, at least for the Go sandbox (if not for all of them), rather than just an accidental documentation omission (which one might suspect/hope).
Finally I found the reason: it was a metadata API version issue.
Instead of using
http://metadata.google.internal/computeMetadata/v1beta/project/attributes/${attr}
use
http://metadata.google.internal/computeMetadata/v1/project/attributes/${attr}
Now I can get the metadata from the App Engine flexible environment:
{"IT_EBOOKS_API":"http://it-ebooks-api.info/v1","PROJECT_ID":"just-aloe-212502","API_KEY":"12j28flsrbapznq"}
But for the GAE standard environment and GCF, I still get 404 page not found.
So I think, but am not sure, that GCF and the GAE standard environment do not run on Compute Engine.
The GAE flexible environment uses Compute Engine as its infrastructure; that's why it can get the metadata from Compute Engine.
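Since the fix is just the version segment of the path, it can be isolated in a small helper. This is only an illustrative sketch (the function name is made up; only the v1 path worked in the scenario above, while v1beta returned 404):

```javascript
// Build a metadata-server URL for a project-wide custom attribute.
// version defaults to "v1", which is the path that worked above.
function metadataAttributeUrl(attr, version = "v1") {
    const base = "http://metadata.google.internal/computeMetadata";
    return `${base}/${version}/project/attributes/${encodeURIComponent(attr)}`;
}
```

Remember that every request to these URLs still needs the Metadata-Flavor: Google header.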
I had a similar error using the Python APIs from a Compute Engine VM. Refreshing the key file and using the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to it explicitly seemed to work. Even when that account was the only account configured with gcloud auth, I still had the error. I'm probably missing something in my understanding of where the Python libraries get their credentials from, but at least I have a workaround for the moment.

Make an API endpoint private

I'm developing a little web app with AngularJS, Express, Node.js, and Passport.js. I have been creating endpoints on my API on demand, but now I have a problem.
I have an endpoint which should not be callable by users whenever they want. The API call is made when a user performs a specific action in the app, so they earn points.
The endpoint is something like /api/users/updatePoints, and I don't want users with the developer tools resending the call and earning points they don't deserve.
How could I accomplish this? I have been thinking about it for a day, but I can't come up with anything reasonable.
Thank you in advance :)
--EDIT--
In the end I just deleted that endpoint and write to the database directly on the server side. Not the solution I wanted, but good enough. Thank you!!
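The idea in that edit (awarding points server-side as a side effect of the action, with no client-callable points endpoint) could look roughly like this minimal sketch; the in-memory store, function name, and point value are all made-up placeholders for the real database logic:

```javascript
// In-memory stand-in for the real database (illustration only).
const pointsByUser = new Map();

// The server-side action handler awards the points itself;
// there is no separate /updatePoints endpoint for clients to replay.
function performAction(userId) {
    // ... perform the actual action here ...
    const current = pointsByUser.get(userId) || 0;
    pointsByUser.set(userId, current + 10); // 10 is an arbitrary example value
    return pointsByUser.get(userId);
}
```

Because the points update only happens inside the server's own action handler, a user resending requests from the dev tools can at worst repeat the action, not mint points directly.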
It's already too late to answer, but I think this could help someone.
To privatize an endpoint, allow only your whitelisted origins by setting the respective response headers, and for all other users send a 403 status code (which means forbidden; a 401 would invite the client to try again with credentials).
Here is an example of middleware that privatizes endpoints.
module.exports = function (allowedOrigins) {
    const whitelistedOrigins = Array.isArray(allowedOrigins) ? allowedOrigins : [allowedOrigins];

    return function (req, res, next) {
        const origin = req.headers.origin;
        if (whitelistedOrigins.indexOf(origin) > -1) {
            res.setHeader("Access-Control-Allow-Origin", origin);
            next();
        } else {
            res.status(403).json({
                msg: "This is a private Endpoint, Please contact the Admin",
            });
        }
    };
};
Here is an example of usage.
const privatizeEndpoint = require("../customMiddlewares/privatizeEndpoint");

router.post("/someEndpoint", privatizeEndpoint(["http://myprivate.domain.com"]), (req, res) => {
    console.log("Inside Private Endpoint");
});
However, your endpoint will still be exposed, but at least it will be served only to your whitelisted domains.
Notice that even requests without an Origin header will be blocked (e.g. curl commands or requests through Postman).
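To see that behavior concretely, the middleware can be exercised with mock req/res objects. The factory is repeated here so the sketch is self-contained, and the mock is a minimal stand-in for the Express surface the middleware touches:

```javascript
// Same whitelist middleware as above, repeated for a self-contained demo.
function privatizeEndpoint(allowedOrigins) {
    const whitelist = Array.isArray(allowedOrigins) ? allowedOrigins : [allowedOrigins];
    return function (req, res, next) {
        const origin = req.headers.origin;
        if (whitelist.indexOf(origin) > -1) {
            res.setHeader("Access-Control-Allow-Origin", origin);
            next();
        } else {
            res.status(403).json({ msg: "This is a private Endpoint, Please contact the Admin" });
        }
    };
}

// Minimal mock of the Express req/res surface the middleware uses.
function run(origin) {
    const result = { nextCalled: false, statusCode: null, headers: {} };
    const req = { headers: origin ? { origin } : {} };
    const res = {
        setHeader: (k, v) => { result.headers[k] = v; },
        status: (code) => { result.statusCode = code; return res; },
        json: () => result,
    };
    privatizeEndpoint(["http://myprivate.domain.com"])(req, res, () => { result.nextCalled = true; });
    return result;
}
```

A whitelisted origin passes through to the handler, while a missing or foreign Origin header gets the 403, matching the note above about curl and Postman.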
