I read a tutorial on how to implement the Seek steering behavior. The link is here, and this is the graph that illustrates the algorithm:
I know that velocity, force, and acceleration are all vectors. But how does "steering" in the formula "steering = desired_velocity - current_velocity" become a force rather than a velocity in this article? Why does this make sense? Does it mean that we can mix them in one calculation? Does adding or subtracting one velocity vector from another produce a force vector? If not, why is the result called a "force"?
I know how steering behaviors work in AI. The key point is that we can sum up all the different steering forces to get a total force. This total force can be used in the formula "a = F/m" to get the acceleration. After that, we can use this acceleration to calculate the new position and velocity of the object in the game loop update.
In my view, the "F" should be the steering force, but I'm stuck on understanding how to calculate it.
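To make the bookkeeping concrete, here is a minimal sketch of how that quantity is typically used: the velocity difference is simply treated as a force by clamping it to a maximum magnitude and pushing it through a = F/m. The names max_force, max_speed, and the simple Euler integration are my own assumptions, not from the tutorial.

import math

class Agent:
    def __init__(self, x, y, mass=1.0, max_speed=4.0, max_force=0.1):
        self.pos = [x, y]
        self.vel = [0.0, 0.0]
        self.mass = mass
        self.max_speed = max_speed
        self.max_force = max_force

def truncate(v, max_len):
    # clamp a 2D vector to a maximum length
    length = math.hypot(v[0], v[1])
    if length > max_len > 0:
        return [v[0] / length * max_len, v[1] / length * max_len]
    return list(v)

def seek_update(agent, target, dt=1.0):
    # desired velocity: straight toward the target at max speed
    to_target = [target[0] - agent.pos[0], target[1] - agent.pos[1]]
    dist = math.hypot(to_target[0], to_target[1])
    desired = ([to_target[0] / dist * agent.max_speed,
                to_target[1] / dist * agent.max_speed] if dist > 0 else [0.0, 0.0])

    # "steering = desired_velocity - current_velocity", then treated as a force
    steering = truncate([desired[0] - agent.vel[0], desired[1] - agent.vel[1]],
                        agent.max_force)

    # a = F / m, then integrate velocity and position
    accel = [steering[0] / agent.mass, steering[1] / agent.mass]
    agent.vel = truncate([agent.vel[0] + accel[0] * dt,
                          agent.vel[1] + accel[1] * dt], agent.max_speed)
    agent.pos = [agent.pos[0] + agent.vel[0] * dt,
                 agent.pos[1] + agent.vel[1] * dt]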
Now I am trying to implement a particle filter. I am given a map of the walls, and I try to localize a robot in this map. Based on the particle filter method, I initialize 1000 random particles, and in each step I move these 1000 particles according to a certain movement instruction, i.e. an angle-odometry pair. After a move, I calculate the likelihood of the measurements compared to the sensed distance to the wall, and then resample the particles based on their likelihoods. I think this is the basic process for a particle filter. What confuses me now is how I should deal with situations where some of the particles hit a wall while moving forward?
I think it is too late for you, but it may help other people. The particle filter is a probabilistic approach, where particles can be sampled everywhere based on the motion and prior distributions.
In your case, you can sample on the wall without any worry. Afterwards, the likelihood step will return a very low probability for that particle, and it will automatically be resampled into another one with higher probability.
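To make that concrete, here is a minimal sketch of one filter step under this approach; the helpers move(), in_wall(), expected_distance(), and gaussian() are hypothetical placeholders for your motion model, map lookup, ray cast to the nearest wall, and sensor noise model.

import random

def particle_filter_step(particles, control, measurement, sensor_sigma=0.2):
    # 1. motion update: move every particle according to the odometry, with noise
    moved = [move(p, control) for p in particles]

    # 2. measurement update: weight each particle by the likelihood of the reading
    weights = []
    for p in moved:
        if in_wall(p):                 # particle ended up inside a wall
            weights.append(1e-12)      # effectively zero, so it dies at resampling
        else:
            expected = expected_distance(p)
            weights.append(gaussian(measurement - expected, sensor_sigma))

    # 3. normalize and resample (simple multinomial resampling)
    total = sum(weights)
    weights = [w / total for w in weights]
    return random.choices(moved, weights=weights, k=len(moved))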
I want to develop an app that detects how far the user/device is from points on a map.
Calculating the distance is easy, but when you get within about 30 meters I need it to be as precise as possible.
Basically I want some lights on the UI to get brighter the closer you get to the target/point.
How do I achieve this if the GPS position sometimes bounces around by 5-10 meters or more?
Any ideas on how to approach this?
Thanks!
In general there is inaccuracy in the position, and indeed it is on the order of meters, so the bouncing will be there and it is rather impossible to get rid of it. One suggestion would be to collect the last few locations (3-10, up to you and your logic) and calculate the average of them. With fast movement your position would lag behind, of course, but with slow movement the displayed position should be more stable. You could also add logic to determine the movement direction and accept location changes toward that direction faster.
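A minimal sketch of that averaging idea (the window size and the fix format are assumptions):

from collections import deque

class LocationSmoother:
    def __init__(self, window=5):
        self.fixes = deque(maxlen=window)   # keeps only the last `window` fixes

    def add_fix(self, lat, lon):
        self.fixes.append((lat, lon))

    def smoothed(self):
        # simple arithmetic mean; stable when moving slowly, lags when moving fast
        n = len(self.fixes)
        lat = sum(f[0] for f in self.fixes) / n
        lon = sum(f[1] for f in self.fixes) / n
        return lat, lon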
You will not get better precision than about 3 m to the target.
At low speed, like walking, you will not get better than 8-10 m.
Count the distance since the last used fix. If it exceeds 12 m, use the fix and mark it as the last used one.
This is a simple filter which works well at walking speeds.
At higher speeds (> 10 km/h), switch off the filter.
GPS should not jump at that speed.
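A minimal sketch of that filter (the 12 m threshold and the ~10 km/h cutoff follow the numbers above; the haversine helper and field names are assumptions):

import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    # great-circle distance between two fixes, in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class WalkingFilter:
    def __init__(self, min_jump_m=12.0, bypass_speed_ms=2.8):  # 2.8 m/s is about 10 km/h
        self.last_used = None
        self.min_jump_m = min_jump_m
        self.bypass_speed_ms = bypass_speed_ms

    def accept(self, lat, lon, speed_ms):
        # at higher speeds the position no longer jumps much, so pass fixes through
        if self.last_used is None or speed_ms > self.bypass_speed_ms:
            self.last_used = (lat, lon)
            return True
        # otherwise accept a fix only if it moved far enough from the last used one
        if haversine_m(self.last_used[0], self.last_used[1], lat, lon) > self.min_jump_m:
            self.last_used = (lat, lon)
            return True
        return False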
I would like to produce a realistic 3D demonstration of a ball rolling down a Conical Helix path. The reference that has helped me get close to a solution can be found here. [I am creating my solution in Actionscript 3, using Stage3D, but would be happy to have any suggested coding solutions in other languages, using other 3D frameworks, with which you may be more familiar.]
As I entered the title for my posting, the system pointed me to a wealth of "Questions that may already have your answer", and that was helpful, and I did check each of them out. Without wanting to hijack an existing thread, I should say that this one includes a good deal of very helpful commentary about the general subject, but does not get to the specific challenges I have been unable to resolve.
Using the cited reference, I am happy with this code snippet that traces the path I would like the ball to follow. [N.B. My reference, and most other math-based references, treat Z as being up-down; my usage, however, follows the more usual 3D-graphics convention of Y for up-down.]
This code is executed for each frame.
ft += 0.01; // Where ft is a global Number.
var n:Number = Math.pow (0.5, (0.15 * ft));
// Where s is a constant used to scale the overall path.
obj.moveTo (
(s * n * Math.cos (2.0 * ft)),
(s * n),
(s * n * Math.sin (2.0 * ft))
);
The ball follows a nice path, and owing to the lighting and other shader code, a very decent effect is viewed in the scene.
What is not good about my current implementation is that the ball does not appear to be rolling along that path as it moves from point to point. I am not using any physics engine, and am not seeking any solution dealing with collisions, but I would like the ball to correctly demonstrate what would be happening if the movement were due to the ball rolling down a track.
So, to make the challenge a little clearer, let's say that the ball is a billiard ball with the stripe and label for #15. In that case, the visual result should be that the number 15 turns head over heels; but, as you can probably surmise from the name of my obj.moveTo() function, that only results in changes to the position of the 3D object, not its orientation.
That, finally, brings me to the specific question/request. I have been unable to discover what rotation changes must be synchronized with each positional change in order to correctly demonstrate the way the billiard ball would appear if it rolled from point 1 to point 2 along the path.
Part of the solution appears to be:
obj.setRotation ((Math.atan2 (Math.sin (ft), Math.cos (ft))), Vector3D.Y_AXIS);
but that is still not correct. I hope there is some well-known formula that I can add to my render code.
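One relation that may help is standard rolling without slipping: each frame, rotate the ball about the axis perpendicular to both its movement direction and the surface normal, by angle = distance_moved / radius. Here is a minimal sketch, assuming the global Y axis as the surface normal and a hypothetical ball radius; depending on your framework's handedness you may need to flip the sign of the angle.

import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def normalize(v):
    l = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0]/l, v[1]/l, v[2]/l) if l > 0 else (0.0, 0.0, 0.0)

def rolling_rotation(prev_pos, new_pos, radius, up=(0.0, 1.0, 0.0)):
    # returns (axis, angle_in_radians) to apply on top of the current orientation
    move = (new_pos[0] - prev_pos[0],
            new_pos[1] - prev_pos[1],
            new_pos[2] - prev_pos[2])
    distance = math.sqrt(move[0]**2 + move[1]**2 + move[2]**2)
    if distance == 0:
        return up, 0.0
    axis = normalize(cross(up, move))   # perpendicular to the movement, in the ground plane
    angle = distance / radius           # arc length rolled equals distance travelled
    return axis, angle

The key point is that this is an incremental rotation: append it to the ball's current orientation each frame (e.g. with Matrix3D.appendRotation, which takes degrees), rather than setting an absolute rotation the way setRotation does.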
I have an assignment to make a Tron game with AI. My team and I have almost made it, but we're trying to find a good heuristic. We thought about Voronoi, but it's kind of slow:
for yloop = 0 to height-1
    for xloop = 0 to width-1
        // Start with a maximal value
        closest_distance = width * height
        for point = 0 to number_of_points-1
            // calls function to calc distance
            point_distance = distance(point, xloop, yloop)
            if point_distance < closest_distance
                closest_distance = point_distance
                closest_point = point
            end if
        next
        // place result in array of point types
        points[xloop, yloop] = closest_point
    next
next
We have 5 seconds to make a move and this algorithm doesn't sound too good! I don't need code ... we just need an idea!
Thank you!
Later edit: Should we try Delaunay triangulations?
Have a look at the postmortem of Google's AI Challenge about this.
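In case it helps: the per-cell loop in the question is O(width * height * number_of_points). One common alternative (an assumption on my part, not a summary of the postmortem) is to compute the Voronoi partition of the free cells with a single multi-source BFS from both heads, so each cell is visited once:

from collections import deque

def voronoi_counts(grid, players):
    # grid[y][x] is True for walls/trails; players is a list of (x, y) head positions.
    # Returns how many free cells each player reaches first.
    height, width = len(grid), len(grid[0])
    owner = [[-1] * width for _ in range(height)]
    queue = deque()
    for i, (px, py) in enumerate(players):
        owner[py][px] = i
        queue.append((px, py, i))

    counts = [0] * len(players)
    while queue:
        x, y, i = queue.popleft()
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < width and 0 <= ny < height \
                    and not grid[ny][nx] and owner[ny][nx] == -1:
                owner[ny][nx] = i
                counts[i] += 1
                queue.append((nx, ny, i))
    return counts

Note that this sketch gives tied cells to whichever player happens to be expanded first; a stricter heuristic would treat cells at equal distance from both heads as neutral.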
Well, I am considering redesigning my old Wurmeler game (AI included), so I stumbled on your question while searching for new ideas. Here is some insight from my old AI.
Wurmeler is similar to Tron, but much slower, and the worms turn smoothly
the game space is a 2D bitmap
each AI is very simple ... stupid ...
but they navigate better than me
unless they are boxed in by another player
or crash into a local min/max
but they are still fun
OK, now the AI algorithm on every decision move:
cast a few rays from the worm
one in the movement direction
a few turned to the left by some angle (a 5 degree step is fine)
a few turned to the right
evaluate each ray's length
from the worm until it hits the border
or another worm's path curve
use the max rule to change the heading (a minimal sketch of this follows below)
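A minimal sketch of that ray-based steering (the grid representation, ray count, and angular step are assumptions):

import math

def ray_length(grid, x, y, angle, max_len=200):
    # walk along the ray one cell at a time until a filled cell or the border
    dx, dy = math.cos(angle), math.sin(angle)
    for step in range(1, max_len):
        cx, cy = int(x + dx * step), int(y + dy * step)
        if not (0 <= cx < len(grid[0]) and 0 <= cy < len(grid)) or grid[cy][cx]:
            return step
    return max_len

def pick_heading(grid, x, y, heading, spread_deg=30, step_deg=5):
    # evaluate headings from -spread_deg to +spread_deg around the current one
    best_angle, best_len = heading, -1
    for offset in range(-spread_deg, spread_deg + 1, step_deg):
        angle = heading + math.radians(offset)
        length = ray_length(grid, x, y, angle)
        if length > best_len:
            best_len, best_angle = length, angle
    return best_angle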
This old AI handles only navigation, but I want to implement more (this is not done yet):
divide the map into square sections
each section holds the average density of already-filled space
so, if possible, the AI will choose a less-filled area
add strategies
navigate (already done)
flee (move away from a nearby player if it is too close and behind)
attack (if on a roughly parallel course, too close, and in front)
maybe a conversion from raster to vector
that should speed up the ray tracing and collision detection
but with growing path length it may be slower ... I have to try it and see
possibly use field algorithms
I have created a very simple numerical simulation that models an object being thrown off a building at some angle, and when the object hits the ground, the simulation stops. Now I want to add in collision detection. How would I go about doing this?
I know I need to find the exact time that the object (a ball) hits the ground, as well as the velocity in the x and y direction, and position of the object when it hits the ground, and I have to add in parameters that say how much the ball will bounce on impact. But I don't know how to go about doing this. I know that there are various ways of detecting collision but since I am new to this, the most comprehensible method would be best.
Make a coordinate system with the ground at y = 0. Track the coordinates of the ball as it flies and check when it reaches y <= 0; that's where it hits the ground. You can also keep track of the x and y velocity as the ball moves.
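Because the simulation advances in discrete steps, you will usually overshoot the ground rather than land exactly on y = 0. A common fix is to interpolate within the step that crosses y = 0 to recover the impact time, then reflect the vertical velocity with a restitution factor. A minimal sketch (the timestep, gravity constant, and restitution value are assumptions):

G = 9.81            # gravity, m/s^2
RESTITUTION = 0.7   # fraction of vertical speed kept after a bounce

def step(x, y, vx, vy, dt=0.01):
    new_x = x + vx * dt
    new_y = y + vy * dt - 0.5 * G * dt * dt
    new_vy = vy - G * dt

    if new_y <= 0.0 and y > 0.0:
        # fraction of the step at which the ball actually reached the ground
        t_hit = y / (y - new_y) * dt      # linear interpolation, fine for small dt
        x_hit = x + vx * t_hit
        vy_hit = vy - G * t_hit
        # bounce: keep the horizontal velocity, reflect and damp the vertical one
        return x_hit, 0.0, vx, -vy_hit * RESTITUTION
    return new_x, new_y, vx, new_vy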
Use Physics skillz. This is a good tutorial. If you have it, I recommend Fundamentals of Physics by Halliday, Resnick and Walker. They have a very good chapter on this.
If you are just looking for the math that you could write C code for, I found this one helpful: Math Models.
Collision detection simply involves determining the distance between 2 objects.
If you are only interested in collisions between objects and the ground, you can use:
if(object.y <= ground.y) {
//collision occurred
}
To do collisions between objects, you can loop through all objects and compare them to each other in the same way.
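For example, if the objects are treated as circles, a minimal sketch of that pairwise check (the field names and the circle shape are assumptions):

import math

def collides(a, b):
    # two circles overlap when the distance between centers is at most the radius sum
    dx, dy = a["x"] - b["x"], a["y"] - b["y"]
    return math.hypot(dx, dy) <= a["radius"] + b["radius"]

def find_collisions(objects):
    hits = []
    for i in range(len(objects)):
        for j in range(i + 1, len(objects)):   # each pair is checked once
            if collides(objects[i], objects[j]):
                hits.append((i, j))
    return hits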