What is best practice for testing fulfilled Chainlink oracle requests with ethers/hardhat?

I am using Hardhat with ethers on Rinkeby to test a smart contract that makes a GET request to a local Chainlink node. I can observe on the node dashboard that the request is fulfilled.
I am struggling to write a test that waits for the second fulfillment transaction to be confirmed.
I see similar tests in the SmartContractKit/chainlink repo tests:
it("logs the data given to it by the oracle", async () => {
  const tx = await oc
    .connect(roles.oracleNode)
    .fulfillOracleRequest(...convertFufillParams(request, response));
  const receipt = await tx.wait();
  assert.equal(2, receipt?.logs?.length);
  const log = receipt?.logs?.[1];
  assert.equal(log?.topics[2], response);
});
I fail to see how this waits for the fulfillment transaction at all. In the Consumer.sol this function calls, there is a RequestFulfilled event that is emitted, but this test doesn't seem to listen for it.
Another example I found, the Ocean Protocol request test, accomplishes this by creating a mapping of request IDs, an accessor, and a while loop in the test that polls until the request ID is found:
it("create a request and send to Chainlink", async () => {
  let tx = await ocean.createRequest(jobId, url, path, times);
  request = h.decodeRunRequest(tx.receipt.rawLogs[3]);
  console.log("request has been sent. request id := " + request.id);
  let data = 0;
  let timer = 0;
  while (data == 0) {
    data = await ocean.getRequestResult(request.id);
    if (data != 0) {
      console.log("Request is fulfilled. data := " + data);
    }
    wait(1000);
    timer = timer + 1;
    console.log("waiting for " + timer + " second");
  }
});
This makes sense, and I see how it works. However, I would like to avoid creating a mapping and an accessor, since I imagine there has to be a more optimal way.

You'd want to look at the hardhat-starter-kit to see examples of working with Chainlink/oracle API responses.
For unit tests, you'd want to just mock the API responses from the Chainlink node.
For integration tests (for example, on a testnet), you'd add some wait parameter for a return. The sample hardhat-starter-kit just waits x number of seconds, but you could also code your tests to listen for events to know when the oracle has responded. This does use events to get the requestId; however, you don't have to create the event yourself, as the Chainlink core code already emits it.
it('Should successfully make an external API request and get a result', async () => {
  const transaction = await apiConsumer.requestVolumeData()
  const tx_receipt = await transaction.wait()
  const requestId = tx_receipt.events[0].topics[1]
  // wait 30 secs for oracle to callback
  await new Promise(resolve => setTimeout(resolve, 30000))
  // Now check the result
  const result = await apiConsumer.volume()
  console.log("API Consumer Volume: ", new web3.utils.BN(result._hex).toString())
  expect(new web3.utils.BN(result._hex)).to.be.a.bignumber.that.is.greaterThan(new web3.utils.BN(0))
})
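If you'd rather listen for the fulfillment event than sleep a fixed 30 seconds, a small helper can wrap the listener in a Promise. This is a sketch: it assumes an ethers.js contract object and an event named DataFulfilled, which is hypothetical — substitute whatever event your consumer contract actually emits on fulfillment.

```javascript
// Sketch: resolve once the consumer contract emits its fulfillment event,
// or reject after a timeout. `contract.once` is the ethers.js one-shot
// event subscription.
function waitForEvent(contract, eventName, timeoutMs = 120000) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`Timed out waiting for ${eventName}`)),
      timeoutMs
    );
    contract.once(eventName, (...args) => {
      clearTimeout(timer);
      resolve(args); // the event's decoded arguments
    });
  });
}

// Usage inside the test (event name is an assumption):
// const tx = await apiConsumer.requestVolumeData();
// await tx.wait();
// await waitForEvent(apiConsumer, "DataFulfilled");
// expect(await apiConsumer.volume()).to.be.gt(0);
```

This keeps the test event-driven: it finishes as soon as the oracle responds instead of always paying the full wait.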

Related

Cloud Tasks are stuck in queue and are not executed

I am using Cloud Functions to put tasks into a Cloud Tasks queue and invoke a service (worker) function. Both the task generator and the task handler functions are deployed to Cloud Functions.
This is my createTask.js:
const {CloudTasksClient} = require('@google-cloud/tasks');
const client = new CloudTasksClient();

exports.createTask = async (req, res) => {
  const location = 'us-central1';
  const project = 'project-id';
  const queue = 'queueid';
  const payload = 'Hello, World!';
  const parent = client.queuePath(project, location, queue);
  const task = {
    appEngineHttpRequest: {
      httpMethod: 'POST',
      relativeUri: '/log_payload',
    },
  };
  if (payload) {
    task.appEngineHttpRequest.body = Buffer.from(payload).toString('base64');
  }
  let inSeconds = 0;
  if (inSeconds) {
    // The time when the task is scheduled to be attempted.
    task.scheduleTime = {
      seconds: inSeconds + Date.now() / 1000,
    };
  }
  console.log('Sending task:');
  console.log(task);
  // Send create task request.
  const request = {parent: parent, task: task};
  const [response] = await client.createTask(request);
  const name = response.name;
  console.log(`Created task ${name}`);
  res.send({message: 'Ok'});
};
server.js
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
app.enable('trust proxy');
app.use(bodyParser.raw({type: 'application/octet-stream'}));

app.get('/', (req, res) => {
  // Basic index to verify app is serving
  res.send('Hello, World!').end();
});

app.post('/log_payload', (req, res) => {
  // Log the request payload
  console.log('Received task with payload: %s', req.body);
  res.send(`Printed task payload: ${req.body}`).end();
});

app.get('*', (req, res) => {
  res.send('OK').end();
});

app.listen(3000, () => {
  console.log('App listening on port 3000');
  console.log('Press Ctrl+C to quit.');
});
When I trigger the task generator function via an HTTP request in Postman, the task is added to the queue, but it stays there forever.
The logs of the handler function show it was never triggered: the task in the queue cannot reach its handler. The task fails and remains in the queue.
I have tried to reproduce the issue by following the docs, and the tasks execute successfully. I assume you also followed the Cloud Tasks quickstart and the GitHub code samples for set-up. The quickstart sets up the following components:
a) Create Task (~ createTask.js) - this can either be run locally or deployed as a Cloud Function. In your case, it has been created as a Cloud Function.
b) Task queue creation - the creation of a Cloud Tasks queue.
c) Task target / handler (~ server.js) - the quickstart assumes this component is deployed as an App Engine worker instance. This can also be seen in the corresponding task-creation script (~ createTask.js).
Based on the description, I assume you are deploying the task target / handler as a Cloud Function as well. If that assumption is correct, you need to follow this public doc to create an HTTP Target task, which uses the "httpRequest" construct instead of "appEngineHttpRequest". There is also a tutorial that you may find helpful.
The assumption that you are using Cloud Functions instead of App Engine as the target is also supported by the "404 - Not Found" error in the screenshots you provided. This error signifies that the target App Engine endpoint (~ /log_payload) is not found, which is also why the task never executes.
I suggest you try out the steps above. If those do not help, you may raise a support case here, as your issue seems to require more in-depth analysis of your project logs to see why the task queue is not being triggered.
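As a sketch of the HTTP Target shape (the handler URL and field values are hypothetical, taken from your example; only the task object changes versus the App Engine version):

```javascript
// Sketch: build an HTTP Target task for a Cloud Function handler.
// Note `httpRequest` instead of `appEngineHttpRequest`.
function buildHttpTask(url, payload) {
  return {
    httpRequest: {
      httpMethod: 'POST',
      // e.g. https://us-central1-<project>.cloudfunctions.net/log_payload
      url,
      headers: {'Content-Type': 'text/plain'},
      // Cloud Tasks expects the body base64-encoded
      body: Buffer.from(payload).toString('base64'),
    },
  };
}

// Sending it is unchanged; only the task shape differs:
// const [response] = await client.createTask({
//   parent: client.queuePath(project, location, queue),
//   task: buildHttpTask(handlerUrl, 'Hello, World!'),
// });
```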

Where does one access events that are emitted from a solidity contract, either via a node or a mirror?

How can I get emitted events from a Solidity smart contract on the Hedera network? My best guess is via ContractFunctionResult.
You have a few options:
Use hethers.js, so something like:
// Setup a filter and event listener to know when an address receives/sends tokens
const filter = contract.filters.Transfer(walletAddress, null);
contract.once(filter, (from, to, amount, event) => {
  console.log(`\n- Event: ${from} sent ${amount} tokens to ${to}`);
});
More on hethers.js events here: https://docs.hedera.com/hethers/application-programming-interface/contract-interaction/contract#events
You can use ethers.js or web3.js with the Hedera SDKs to parse event logs, either from transaction records or mirror node API results. So, to get event data in a readable fashion, you would use the contract's ABI, the log data, and ethers.js/web3.js.
Here's some sample JS code using ethers.js and a mirror node (you can do something similar with info from the tx record):
async function getEventsFromMirror(contractId) {
  const url = `https://testnet.mirrornode.hedera.com/api/v1/contracts/${contractId.toString()}/results/logs?order=asc`;
  axios.get(url)
    .then(function (response) {
      const jsonResponse = response.data;
      jsonResponse.logs.forEach(log => {
        // create an object to specify log parsing requirements
        let logRequest = {};
        logRequest.data = log.data;
        logRequest.topics = log.topics;
        // parse the logs
        let event = abiInterface.parseLog(logRequest);
        // output the from address and message stored in the event
        console.log(`Mirror event(s): from '${AccountId.fromSolidityAddress(event.args.from).toString()}' update to '${event.args.message}'`);
      });
    })
    .catch(function (err) {
      console.error(err);
    });
}
Get the logs and events directly from a mirror node (https://hips.hedera.com/hip/hip-226 and https://hips.hedera.com/hip/hip-227) and use your own library, if applicable. The first two options probably make more sense for most folks.
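If you do parse mirror node logs by hand (the third option), note that indexed address parameters arrive as 32-byte hex topics, with the 20-byte EVM address right-padded to the left with zeros. A tiny helper (no library needed) recovers the address:

```javascript
// Sketch: extract a 20-byte EVM address from a 32-byte log topic.
// A topic is "0x" + 64 hex chars; the address is the last 40 hex chars.
function topicToAddress(topic) {
  return '0x' + topic.slice(-40);
}
```

From there you can feed the result to AccountId.fromSolidityAddress() as in the sample above.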

RxJS Observable with multiple ajax responses

Brand-new noob at this; I have gotten one POST request working but am hoping to chain several and process the responses. I gather the way to do this is forkJoin(); however, I am not getting the responses (although I see the requests and responses in the Network tab) and don't really get how to do the composition. I think I may need to subscribe to them?
const requests: Array<Observable<AjaxResponse>> = [];
fields.forEach((field: string) => {
  const request: AjaxRequest = generateRequest(field);
  requests.push(Observable.ajax(request));
});

Observable.forkJoin(requests).map(
  responses => { // never stops here
    responses.map((res, idx) => { // or here
    })
  });
Found it about 10 mins later:
const forkJoin = Observable.forkJoin(requests);
forkJoin.subscribe(ajaxResponses => {
});

React Query handling response status codes

This is related to this question:
Handling unauthorized request in react-query
I understand the point that React Query doesn't care about response codes because there is no error. So, for example, if the server responds with a 400 "Bad Request", do I have to check for this on the data returned by the mutate function?
const handleFormSubmit = async (credentials) => {
  const data = await mutateLogin(credentials);
  // Do I have to check this data if, for example, I want to show an
  // "Invalid Credentials" error message?
};
I need to save the user on the cache.
const useMutateLogin = () => {
  return useMutation(doLogin, {
    throwOnError: true,
    onSuccess: data => // Do I have to check here again if I receive the user or a 400 code?
  })
}
Thanks.
react-query does not take care of the requests, and it is completely agnostic of what you use to make them as long as you have a Promise. From the documentation, we have the following specification for the query function:
Must return a promise that will either resolves data or throws an error.
So if you need to fail on specific status codes, you should handle that in the query function.
The confusion comes because popular libraries usually take care of that for you. For example, axios and jQuery.ajax() will throw an error/reject if the HTTP status code falls out of the range of 2xx. If you use the Fetch API (like the discussion in the link you posted), the API won't reject on HTTP error status.
Your first code snippet:
const handleFormSubmit = async (credentials) => {
const data = await mutateLogin(credentials);
};
The content of data depends on the mutateLogin function implementation. If you are using axios, the promise will reject for any HTTP status code that falls outside the range of 2xx. If you use the Fetch API, you need to check the status and throw an error yourself, or react-query will cache the whole response as received.
Your second code snippet:
const useMutateLogin = () => {
return useMutation(doLogin, {
throwOnError: true,
onSuccess: data => // Do i have to check here again if i receive the user or 400 code
})
}
Here we have the same case as before. It depends on doLogin implementation.
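For illustration, a fetch-based doLogin sketch that rejects on non-2xx statuses, so react-query's throwOnError/onError actually fire (the /api/login endpoint is a placeholder):

```javascript
// Sketch: fetch does NOT reject on 400/500, so throw explicitly
// to turn HTTP error statuses into Promise rejections.
async function doLogin(credentials) {
  const res = await fetch('/api/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(credentials),
  });
  if (!res.ok) {
    throw new Error(`Login failed with status ${res.status}`);
  }
  return res.json(); // only reached on 2xx
}
```

With this shape, onSuccess only ever sees a real user payload, and a 400 lands in onError instead of being cached as data.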

How can I make 1 million+ requests to an API in a React application?

I'm trying to make a large number of POST requests to an API to create "service tickets".
I have 3 types of services, and I want to create packages of 50, 100, 500, 1k, 10k, and 1 million depending on what the user selects (I am already storing that in React state).
When I click a button, I would like to send all the requests for the tickets at once, so at the end we can have:
service 1 = 1,000,000 tickets
service 2 = 500 tickets
service 3 = 1,000,000 tickets
total of Tickets = 2,000,500
I was wondering what the best way is to handle that amount of POST requests in a single click? Any help will be very useful. Thanks!
This is the function I'm passing to my onClick event; it creates a single ticket:
const addTicket = async () => {
  try {
    const token = await getTokenSilently(); // from auth0 authentication
    let url = '.../endpoint'
    const post = {
      createdFor: hostId, // here I put the id of the user I want to add the tickets to
      service: typeOfService // here I select one of the 3 types of services
    }
    const response = await fetch(url, {
      method: 'POST',
      body: JSON.stringify(post),
      headers: {
        "Authorization": "Bearer " + token
      }
    });
    const responseData = await response.json();
  } catch (error) {
    console.error(error);
  }
};
I'm using:
- "react": "^16.13.1"
- a backend running on AWS Lambda
A million requests are not required: you can create an API endpoint that takes a single request with information about how many tickets you want, and then handle the number of tickets on the server side.
If you make 1 million requests, you will need a much better server and more resources, and you will be wasting a lot of unnecessary money.
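As a sketch of that idea (the orders/quantity field names are hypothetical; your Lambda would need to loop over them server-side):

```javascript
// Sketch: collapse the per-ticket POSTs into one payload that tells
// the backend how many tickets to create per service.
function buildTicketOrder(hostId, selections) {
  // selections comes from React state, e.g. { service1: 1000000, service2: 500 }
  return {
    createdFor: hostId,
    orders: Object.entries(selections).map(([service, quantity]) => ({
      service,
      quantity,
    })),
  };
}

// One fetch instead of 2,000,500:
// await fetch(url, {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json', Authorization: 'Bearer ' + token },
//   body: JSON.stringify(buildTicketOrder(hostId, selections)),
// });
```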
