Where does Identity Server 4 store the client's code verifier?

I have set up Identity Server 4 with EF and an Angular client (OIDC JS library) using the Authorization Code with PKCE grant. I can see the code challenge and method being passed in the /authorize URL. Per the specification, the auth server stores the code challenge passed by the client, to be verified later when the client sends the code verifier.
Where does Identity Server 4 store the code challenge in the database?

It depends on how it is implemented; one common way is to store them in a database table called PersistedGrants. Authorization codes are also stored in this table, but they are deleted as soon as they are used.
The code challenge is stored together with the authorization code: IdentityServer packs them together and stores the result as an encrypted blob in the database. You can see the code that does this in AuthorizeResponseGenerator.cs (look at the end of the file).
The actual payload stored for the authorization code is encrypted and protected, and it looks like this:
{
"PersistentGrantDataContainerVersion": 1,
"DataProtected": true,
"Payload": "CfDJ8OFLAj3iVVVHvhgvjcKB19Z7-Hms4IIQobGgGl7VnJQCtKiB-Inr3h-mcWCxxD8dJ4QNTbuVeywbT6ROsaf13EpaIQDWtLgbnSPvCDTLQeWTO_vP0UtDwJ7TTCc5aTvKEp_9hX9S1b3l685bmBMlTIcZFqGGM2VfK0qasWCqKSQcTxeN6cgJygZEQNMgAG4ipqr..."
}
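
If you want to look at these rows yourself, here is a minimal sketch of inspecting the table, assuming a SQL Server grant store created by the IdentityServer4 EF package and the pyodbc driver (the server and database names are placeholders):

import pyodbc

# Connection details are placeholders; point them at your grant store.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=IdentityServer;Trusted_Connection=yes;"
)
rows = conn.execute(
    "SELECT [Key], [Type], [Expiration], [Data] "
    "FROM PersistedGrants WHERE [Type] = 'authorization_code'"
)
for key, grant_type, expiration, data in rows:
    # Data is the protected blob shown above; the code challenge is packed
    # inside it and is not readable without the data protection keys.
    print(key, grant_type, expiration, data[:60], "...")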

Related

Azure Data Factory API config basic

Hi, I am trying to connect to an HR system to simply pull down some data and copy it into ASDB.
I have managed to get it to work in Excel using the code below in the Advanced Editor, but I don't even know where to start in terms of where to put this information in Azure Data Factory:
let
    url = "https://api.peoplehr.net/Query",
    body = "{""APIKey"": ""ENTER API KEY HERE"",""Action"": ""GetQueryResultByQueryName"",""QueryName"":""ENTER QUERY NAME HERE""}",
    Source = Json.Document(Web.Contents(url, [
        Headers = [#"Content-Type" = "application/json"],
        Content = Text.ToBinary(body)
    ]))
in
    Source
My understanding is that this is a REST API, and I have an API key. I am very new to ADF and to APIs in general; I have spent days on Google trying to find a solution that works.
There is a guide here, https://help.peoplehr.com/en/articles/2492019-people-queries-and-excel-power-bi, that describes how to do it in Excel, which I basically want to replicate in Azure Data Factory and then create a table in a SQL DB.
Thanks in advance.
I have used the ingest wizard in ADF with the "Copy" function and the source set to REST. The options on the source only allow me to pass auth headers, not additional headers and a body (hope that makes sense).
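
For reference, the request the M code above makes is a plain JSON POST with the API key in the body rather than in an auth header, which is why the REST source's auth-header option alone doesn't cover it. A minimal Python sketch of the same call (the URL and body keys come from the question; everything else is an assumption):

import json
import urllib.request

url = "https://api.peoplehr.net/Query"
body = {
    "APIKey": "ENTER API KEY HERE",
    "Action": "GetQueryResultByQueryName",
    "QueryName": "ENTER QUERY NAME HERE",
}
# The API key travels in the JSON body, not in an Authorization header.
req = urllib.request.Request(
    url,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))  # the query result as JSON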

How should I push and pull data from a server using a REST API, and later generate reports from it?

I am new to REST APIs. My basic understanding of a REST API is that you need to call it each time to get updated data.
In my case, I need to use the data received from the REST API to generate reports in Power BI. In addition, I should be able to "read" the data coming from the server and "write" to it as well.
I did find the option of getting data from the Web in Power BI to connect directly to the REST API, but if I do that I can only "read".
Can you help me with the different options for doing both "read/pull" and "write/push" against the server when I have its REST API? I am not sure whether a cloud service has to sit in between.

Account linking fails if I re-enable immediately after disabling the skill

I am developing an Alexa skill that requires account linking. Account linking succeeds the first two times (enable the skill, disable it, and re-enable it). It fails only when I re-enable immediately after disabling the skill. I use the Code Grant auth type. The data (state/code/etc. in the query string) is successfully redirected back to Amazon's redirect/return URL, but Amazon ends the account linking process with a message stating that account linking failed at this time. Does anyone have any idea? Your help is much appreciated.
Answer: I finally figured out the issue. The authorization server runs on two machines (instances) and uses a ConcurrentDictionary to store the access tokens; a ConcurrentDictionary uses local (in-process) memory. During authentication, Amazon connects to one of the auth servers, and that server stores the access code in its own memory store. When Amazon then tries to get the access token from my authorization server using the code value that was returned to it previously, the second authorization server gets hit. The second auth server does not have the access token for the code Amazon provided, so it rejects the request as invalid. The solution is to use a shared store (out-of-process memory such as a Redis cache) for the access codes, so that both authorization servers can serve the request by referring to the same store.
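
A minimal sketch of the shared-store idea, assuming the redis-py client and a Redis server of at least 6.2 (for GETDEL); the key names and TTL are illustrative, not what the auth server above actually uses:

import redis

# Both auth server instances point at the same Redis host, so a code
# issued by one instance can be redeemed on the other.
store = redis.Redis(host="shared-redis", port=6379)

def save_authorization_code(code, access_token):
    # Codes are single-use and short-lived; expire them after 10 minutes.
    store.setex("authcode:" + code, 600, access_token)

def redeem_authorization_code(code):
    # GETDEL reads and removes the key atomically, so a code
    # cannot be redeemed twice.
    token = store.getdel("authcode:" + code)
    return token.decode() if token else None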

API to Database?

Please presume that I do not know anything about any of the things I will be mentioning, because I really do not.
Most open data sites offer the possibility of exporting the presented data in, for example, .csv or .json format (Example). They also always have an API tab (Example API).
I presume using the API would mean that if the data is updated you would receive the change, whereas exporting it as .csv would mean the content will not be changed anymore.
My question is: how does one use this API to display the same table one would get when exporting a .csv file?
Would you use a database to extract this information? What kind of database, and how do you link the API to the database?
I presume using the API would mean that if the data is updated you would receive the change, whereas exporting it as .csv would mean the content will not be changed anymore.
You are correct in the sense that, if you download the csv to your computer, that csv file won't be updated any more.
An API is something you would call - in this case, you can call the API, saying "Hey, do you have the latest data on xxx?", and you will be given back the latest information about what you have asked. This does not mean though, that this site will notify you when there's a new update - you will have to keep calling the API (every hour, every day etc) to see if there are any changes.
My question is: how does one use this API to display the same table one would get when exporting a .csv file?
You would:
1. Call the API from server code, or a cloud service
2. Let the server code or cloud service decipher (or "parse") the response
3. Use the deciphered response to create a table made out of HTML, or to place it into a database
Would you use a database to extract this information? What kind of database, and how do you link the API to the database?
You wouldn't necessarily need a database to extract information, although a database would be nice to place the final data inside.
You would first need some sort of way to "call the REST API". There are many ways to do this - using Shell Script, using Python, using Excel VBA etc.
I understand this is hard to visualize, so here is an example of step 1, where you retrieve information.
Try pasting the URL below (taken from the site you showed us) into the address bar of your Chrome browser, and hit Enter:
http://opendata.brussels.be/api/records/1.0/search/?dataset=associations-clubs-sportifs
See how it gives back a lot of text with many brackets and commas? You've basically asked the site to give you some data, and this is the response they gave back (different browsers work differently - IE asks you to download the response as a .json file). You've basically called an API.
To see this data more cleanly, open the developer tools of your Chrome browser and enter the following JavaScript code:
var url = 'http://opendata.brussels.be/api/records/1.0/search/?dataset=associations-clubs-sportifs';
var xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
xhr.onload = function() {
    if (xhr.status === 200) {
        // success: print the parsed JSON response
        console.log(JSON.parse(xhr.responseText));
    } else {
        // error: the body may still be JSON describing what went wrong
        console.log(JSON.parse(xhr.responseText));
    }
};
xhr.send();
When you hit enter, a response will come back, stating "Object". If you click through the arrows, you can see this is a cleaner version of the data we just saw - more human readable.
In this case, I used JavaScript to retrieve the data, but you can use whatever code you want. You could proceed to use JavaScript to decipher the data, manipulate it, and push it into a database.
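
As a concrete sketch of that last step, here is the same request made from Python instead of browser JavaScript, with the parsed records placed into a local SQLite database. It assumes the "records"/"recordid"/"fields" layout this particular API returns; the table layout is just an example:

import json
import sqlite3
import urllib.request

url = ("http://opendata.brussels.be/api/records/1.0/search/"
       "?dataset=associations-clubs-sportifs")
with urllib.request.urlopen(url) as resp:
    records = json.load(resp)["records"]

# Store each record's id plus its fields (as a JSON string) in SQLite.
db = sqlite3.connect("opendata.db")
db.execute("CREATE TABLE IF NOT EXISTS associations (recordid TEXT, fields TEXT)")
db.executemany(
    "INSERT INTO associations VALUES (?, ?)",
    [(r["recordid"], json.dumps(r["fields"])) for r in records],
)
db.commit()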
kintone is an online cloud database that you can customize with JavaScript code to store the data in its database, so you'll have the data stored online. This is just one example of a database you can use.
There are other cloud services which allow you to connect API end points of different services with each other, like IFTTT and Zapier, but I'm not sure if they connect with open data.
The page you linked to shows that the API returns values as a JSON object. To access the data you can just send an appropriate HTTP request, and the response will be the requested data as JSON. You can send requests like that from your browser if you want to.
Most languages allow JSON objects to be manipulated programmatically if you need to do work on the data.
RESTful APIs follow a "request and response" publishing model: when you request data via an API endpoint, you receive response strings as JSON objects, CSV tables, or XML.
The publisher, in this case opendata.brussels.be, would update their database on a regular basis and publish the results via an API endpoint.
If you want to download the table as a relational data table in a CSV file, you need to parse the JSON objects into relational tables. This can be tricky, since each JSON response string can vary in its paths.
There are several ways to do it. You can either write scripts to flatten the JSON objects, or use a tool that parses and flattens the objects for you.
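
For the script route, here is a minimal sketch that flattens the records from the API used earlier into a CSV file; it assumes the same records/fields response layout, and other APIs will need different paths:

import csv
import json
import urllib.request

url = ("http://opendata.brussels.be/api/records/1.0/search/"
       "?dataset=associations-clubs-sportifs")
with urllib.request.urlopen(url) as resp:
    rows = [r["fields"] for r in json.load(resp)["records"]]

# Take the union of keys, since not every record carries every field.
fieldnames = sorted({key for row in rows for key in row})

with open("associations.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)  # missing keys stay blank
    writer.writeheader()
    writer.writerows(rows)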
I use a tool called Acho to turn API endpoints into CSV files. It can parse almost all API endpoints through their parameters, and can even be configured for multiple requests, such as iterative and recursive requests.

Inserting data from SQL server into a RESTful API via SSIS

I am investigating ways to move data from SQL Server into a system exposed via a RESTful HTTP API.
If I were to use SSIS would I have to write a custom connector to push the data to the HTTP API after the transform step, or is there a built in feature that supports pushing to an HTTP API?
If you only want to move a very small amount of data, you could use the Web Service Task
...but note that pushing data out of SQL Server is not what this task is intended for...
The Web Service task executes a Web service method. You can use the Web Service task for the following purposes:
Writing to a variable the values that a Web service method returns. For example, you could obtain the highest temperature of the day from a Web service method, and then use that value to update a variable that is used in an expression that sets a column value.
Writing to a file the values that a Web service method returns. For example, a list of potential customers can be written to a file and the file then used as a data source in a package that cleans the data before it is written to a database.
For more control, you'll want to look at using the Script Component in a data flow; it gives you much more flexibility and control.
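
Whatever ends up inside the Script Component boils down to reading rows and POSTing them to the API. A minimal sketch of that pattern outside SSIS (in Python, with a hypothetical table and endpoint) shows how little the push itself involves:

import json
import urllib.request

import pyodbc

# Hypothetical source table and connection string.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Source;Trusted_Connection=yes;"
)
rows = conn.execute("SELECT Id, Name FROM dbo.Customers")
batch = [{"id": r.Id, "name": r.Name} for r in rows]

# Hypothetical endpoint; in SSIS, the same POST would live in the
# Script Component's row-processing logic.
req = urllib.request.Request(
    "https://example.com/api/customers",
    data=json.dumps(batch).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)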
