I wrote some ML code and ran it 5 times;
each time I calculated the Area Under the Curve (AUC).
Now I want to calculate the average of these five AUC values.
How can I do it? Just summing and dividing by 5?
Also, is there any way to draw the averaged ROC curve (in Python)?
Thanks
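A minimal sketch of both steps, assuming each run's true labels and predicted scores were kept (the five runs below are synthetic stand-ins, not real data): the average AUC is just the arithmetic mean of the five values, and an average ROC curve can be drawn by interpolating each run's TPR onto a common FPR grid and averaging pointwise.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)

# Synthetic stand-ins for the 5 runs: (true labels, predicted scores) per run.
runs = []
for _ in range(5):
    y_true = rng.integers(0, 2, 200)
    y_score = 0.5 * y_true + 0.8 * rng.random(200)  # noisy scores
    runs.append((y_true, y_score))

mean_fpr = np.linspace(0, 1, 101)  # common FPR grid for curve averaging
tprs, aucs = [], []
for y_true, y_score in runs:
    fpr, tpr, _ = roc_curve(y_true, y_score)
    aucs.append(auc(fpr, tpr))
    tprs.append(np.interp(mean_fpr, fpr, tpr))  # resample TPR onto the grid

mean_auc = np.mean(aucs)           # yes: sum the five values and divide by 5
mean_tpr = np.mean(tprs, axis=0)   # pointwise-averaged ROC curve
mean_tpr[0], mean_tpr[-1] = 0.0, 1.0
# draw it with e.g. matplotlib: plt.plot(mean_fpr, mean_tpr)
```

Since AUC is a bounded score, the plain mean is the standard summary; reporting the standard deviation of the five values alongside it is also common.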
Related
I fit a ML algorithm (binary classification) and calculated every metric I could find,
and I wrote down the 10 most important features in each run (I ran it 5 times).
The problem now:
I need to calculate the AUC for each of the 5 runs.
Is there any way to calculate the AUC from my previous runs,
without redoing all the calculations I did before?
I hope there is a way to do this.
Help me, please.
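If each run's true labels and predicted scores (or probabilities) were saved, the AUC can be computed from those saved arrays alone, with no retraining. A minimal sketch; the `.npy` file names in the comment are hypothetical, and the small arrays below are illustrative:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# If you saved each run's outputs, reload them instead of retraining, e.g.:
#   y_true = np.load("run1_labels.npy")    # hypothetical file names
#   y_score = np.load("run1_scores.npy")
y_true = np.array([0, 1, 1, 0, 1])
y_score = np.array([0.2, 0.8, 0.6, 0.3, 0.9])

run_auc = roc_auc_score(y_true, y_score)  # every positive outscores every negative here
```

If only hard 0/1 predictions were saved (no scores), the resulting "AUC" degenerates to a single-threshold estimate, so saving scores per run is worth it.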
Say I have a dataset of intensity at each wavelength for the curves in the picture, and I performed curve fitting with a Gaussian function for each curve to extract its parameters, like peak amplitude, center, and FWHM.
I want to apply an AI method to predict the Imax (maximum intensity), center, and FWHM for another curve at different temperatures. Say I want to predict Imax, center, and FWHM at 1 Celsius; is it possible to do so? Which method should I try, and how should I format the input features and their labels? Thank you.
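One common way to frame this is as a multi-output regression problem: the input feature is temperature, and the labels are the fitted Gaussian parameters [Imax, center, FWHM] from each measured curve. A minimal sketch; all numbers below are made-up illustrations, not values from the question:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: one row per fitted curve.
temps = np.array([[10.0], [20.0], [30.0], [40.0]])   # temperature (°C) as the input feature
params = np.array([                                  # labels: [Imax, center, FWHM] per fit
    [1.00, 500.0, 20.0],
    [0.90, 502.0, 22.0],
    [0.80, 504.0, 24.0],
    [0.70, 506.0, 26.0],
])

model = LinearRegression().fit(temps, params)        # multi-output regression
imax, center, fwhm = model.predict([[1.0]])[0]       # predict all three at 1 °C
```

With only a handful of temperatures, a simple linear or polynomial fit per parameter is usually more defensible than a large model; also note that predicting at 1 °C from data measured at 10-40 °C is extrapolation, so treat the output cautiously.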
I want to create a set of N random convex disjoint polygons on a plane where each polygon must have at most V vertices, where N and V are parameters for my function, and I'd like to obtain a distribution as close as possible to uniform (every possible set being equally probable). Also I need to randomly select two points on the plane that either match with one of the vertices in the scene or are in empty space (not inside a polygon).
I have already implemented, for other reasons and in the same programming language, an AABB tree and Separating Axis Theorem-based collision detection between convex polygons, and I can generate a random convex polygon with an arbitrary number of vertices inside a circle of given radius. My best bet thus far:
1. Generate a random polygon using the function I have available.
2. Query the AABB tree to test for intersection with existing polygons.
3. If the AABB tree query returns an empty set, I push the generated polygon into it; otherwise I test with SAT against all the other polygons whose AABB overlaps with the generated one's. If SAT returns "no intersection" I push the polygon, otherwise I discard it.
4. Repeat from 1 until N polygons are generated.
5. Generate a random number in {0,1}.
6. If the generated number is 1, I pick a random polygon and a random vertex on it as a point.
7. If the generated number is 0, I generate a random position (x,y) and test whether it falls within some polygon (I might create a tiny AABB around it and exploit the AABB tree to reduce the required number of point-in-polygon tests). If it's not inside any polygon, I accept it as a valid point; otherwise I repeat from 5.
8. Repeat from 5 once more to get the second point.
I think this solution could work, but unfortunately there's no way to guarantee that I can generate N such polygons for very large N, or find two good points in acceptable time, and I'm programming in React, where long operations run on the main thread, blocking the UI until they end. I could circumvent the issue by ejecting from create-react-app and learning Web Workers, which would probably require more time than it's worth for me.
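The accept/reject loop described above can be sketched as follows; this Python stand-in replaces the AABB/SAT machinery with fixed-radius bounding circles (an assumption purely for illustration), since the structure of the loop, including the attempt cap that addresses the non-termination worry, is the point here, not the geometry:

```python
import math
import random

random.seed(0)

def place_disjoint(n, radius, max_attempts=10000):
    """Rejection sampling: keep proposing centers in the unit square and
    accept only those whose bounding circle overlaps no previously accepted
    one (a stand-in for the AABB-tree query plus SAT intersection test)."""
    placed = []
    for _ in range(max_attempts):
        c = (random.random(), random.random())
        if all(math.dist(c, p) >= 2 * radius for p in placed):
            placed.append(c)
        if len(placed) == n:
            break
    return placed

centers = place_disjoint(n=10, radius=0.05)
```

Capping `max_attempts` and reporting a partial result (or chunking the loop across animation frames) is one way to keep a single-threaded UI responsive without Web Workers.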
This is definitely not a uniform distribution, but perhaps you could begin by generating N points in the plane and then computing the Voronoi diagram of those points. The Voronoi diagram can be computed in O(n log n) time with Fortune's algorithm. The cells of the Voronoi diagram are convex, so you can then construct a random polygon with the desired number of vertices that lies within each cell of the diagram.
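A rough sketch of this idea using SciPy's Voronoi wrapper (`scipy.spatial.Voronoi`, which is backed by Qhull rather than Fortune's algorithm); unbounded boundary cells are simply skipped, and each kept cell is shrunk toward its centroid so neighbouring polygons stay disjoint:

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(1)
points = rng.random((30, 2))        # N seed points in the unit square
vor = Voronoi(points)

polygons = []
for region_index in vor.point_region:
    region = vor.regions[region_index]
    if not region or -1 in region:  # -1 marks a vertex at infinity: skip unbounded cells
        continue
    cell = vor.vertices[region]     # convex cell as an array of (x, y) vertices
    centroid = cell.mean(axis=0)
    polygons.append(centroid + 0.9 * (cell - centroid))  # shrink to keep cells disjoint
```

Trimming each cell down to at most V vertices (e.g. by keeping a subset of its vertices in order, which preserves convexity) is the remaining step.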
[Figure: example Voronoi diagram. Image by Balu Ertl, own work, CC BY-SA 4.0.]
OK, here is another proposal. I have little knowledge of JS, but could cook something up in Python.
1. Use Poisson disk sampling with distance parameter d to generate N samples of the centers.
2. For a given center, make a circle with R ≤ d/2 (so neighbouring circles cannot overlap).
3. Generate V angles using the Dirichlet distribution such that the sum of the angles equals 2π. Sort them.
4. Place vertices on the circle at the angles generated in step 3 and connect them. This would be your polygon.
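Steps 3 and 4 can be sketched like this: sampling the angular *increments* from a symmetric Dirichlet and taking the cumulative sum yields V sorted angles spanning 2π, and vertices placed on a circle in angular order always form a convex polygon.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_convex_polygon(center, radius, V):
    # Dirichlet gives V positive fractions summing to 1; scale to 2*pi and
    # cumulative-sum to get V sorted angles around the circle.
    angles = np.cumsum(rng.dirichlet(np.ones(V)) * 2 * np.pi)
    return np.asarray(center) + radius * np.column_stack((np.cos(angles), np.sin(angles)))

poly = random_convex_polygon(center=(0.5, 0.5), radius=0.2, V=6)
```

The Dirichlet concentration parameter (here all ones, i.e. uniform over the simplex) controls how irregular the polygons look: larger values give more nearly regular polygons.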
UPDATE
Instead of using Poisson disk sampling for step 1, one could use Sobol quasi-random sequences. For N points sampled in the 1x1 box (well, you have to scale it afterwards), the least distance between points would be
d = 0.5 * sqrt(D) / N,
where D is the dimension of the problem, 2 in your case. So the radius of the circle for step 2 would be 0.25 * sqrt(2) / N. This ties N and d together nicely.
https://www.sciencedirect.com/science/article/abs/pii/S0378475406002382
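A sketch of this variant with SciPy's quasi-Monte Carlo module (`scipy.stats.qmc`); the formula for d is taken from the answer above and not re-derived here:

```python
import numpy as np
from scipy.stats import qmc

N = 64                        # number of polygon centers (a power of 2 suits Sobol)
D = 2                         # dimension of the problem
d = 0.5 * np.sqrt(D) / N      # claimed least distance between Sobol points
radius = d / 2                # circle radius for step 2

sampler = qmc.Sobol(d=D, scramble=True, seed=0)
centers = sampler.random(N)   # N quasi-random points in the unit square
```

`qmc.scale` can then map the unit-square points to the target bounding box, with d and the radius scaled by the same factor.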
I'm working on a geographic project:
there is a flight, and if it comes within 10 miles of the coastline, an LED on it will be triggered; otherwise it stays off.
We have generated 500 sets of coordinates (longitude and latitude) as discrete numbers.
My first thought was to create a grid where each block is 3 miles x 3 miles, with every land block set to 1 and everything else to 0. Then we calculate the sum of the nearest 24 blocks; if it is larger than 1, the LED would not trigger.
Then my manager suggested we could find the centroid of the polygon and calculate the distance each time.
Can anyone explain the algorithm here?
Sorry, I haven't touched any programming in a long time; I'm a mechanical engineer, and my manager thought this would be something fun to work on.
Thanks in advance.
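One direct way to implement the distance-threshold check, assuming the coastline is available as a list of (lat, lon) points (the names and coordinates below are illustrative, not from the question): compute the haversine distance from the flight to every coastline point and trigger the LED when the minimum is within 10 miles.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points in degrees."""
    r = 3958.8  # mean Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def led_on(flight, coastline, threshold_miles=10.0):
    # Brute-force nearest coastline point; fine for a few hundred samples.
    return min(haversine_miles(flight[0], flight[1], lat, lon)
               for lat, lon in coastline) <= threshold_miles

coast = [(0.0, 0.0), (0.0, 1.0)]  # illustrative coastline sample points
near = led_on((0.05, 0.0), coast)  # about 3.5 miles from the first coast point
```

The grid idea trades this exact distance for a precomputed lookup, which only pays off when the coastline has far more points than the 500 positions being checked.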
You can take a look at the K-means algorithm.
I've got a 1700 x 3 array with latitude, longitude, and metres above sea level, like this:
51.2551649606487 7.15089717516404 153.110000000000
51.2552453948075 7.15086528446721 150.160000000000
51.2552903318980 7.15086348124900 150.200000000000
I want to calculate the distance between successive lat/lon coordinates using the Haversine formula, since I have no access to the MATLAB Mapping Toolbox (https://de.mathworks.com/matlabcentral/fileexchange/38812-latlon-distance).
My question is: how should I change the given code to read all 1700 coordinates directly from my array?
I read/checked the following link: MATLAB function to calculate distance between two coordinates (latitude and longitude). But it doesn't tell me how to process all 1700 coordinates from my array at once.
Thank you in advance to everybody who is willing to help me!
Best regards
If I understand the question correctly, you should be able to split the data into three 1700 x 1 arrays: latitude, longitude, and elevation, then apply the haversine formula to those arrays (although the typical haversine function doesn't account for elevation):
coordinates;                          % 1700 x 3 array: [lat, lon, elevation]
latitudes  = coordinates(:,1);
longitudes = coordinates(:,2);
elevations = coordinates(:,3);        % unused: haversine ignores elevation
lat1  = 0;                            % reference coordinate to compare against
long1 = 0;
earthRadius = 6371000;                % meters
a = sind((latitudes-lat1)./2).^2 + cosd(latitudes).*cosd(lat1).*sind((longitudes-long1)./2).^2;
c = 2*atan2(sqrt(a), sqrt(1-a));      % note the factor of 2 in the haversine formula
distances = c*earthRadius;            % 1700 x 1 vector of distances in meters
% For distances between successive points, compare shifted copies instead:
% lat1 = latitudes(1:end-1); lat2 = latitudes(2:end); (and likewise for longitude)