Get Local Position for Object in React-three-fiber - reactjs

Briefing: I'm attempting to parent a "drei Text" element to a point on the outside of a sphere, near the pins, in a react-three-fiber scene, so that when the sphere is rotated, or the camera rotates around the sphere, the Text's position stays centered on the outside of the sphere.
An example: three.js text alignment
Questions:
How do I find the local- and world-space positions of an object, or of parts/points of an object?
How do I parent that object's position to a child object, so that when the parent moves, the child moves with it?
Does a scene have a world position that is relative to the origin at [0,0,0], and a local position that is relative to an object?
My Code Sandbox: earth with locations

Check out Object3D, the base class for many objects in three.js, including the scene itself. It has properties and methods that can help you here, such as .position and .getWorldPosition().
Note that the position of any object in the scene is relative to its parent. You can add children to an object with .add(child), but a better way to group objects is often with a Group. If you make the Text a child of the sphere (or put both in a Group), its position is expressed in the sphere's local space, so it follows the sphere's rotation automatically.

Related

Render Visual 3D Container on top in Viewport3D

Using HelixToolkit, you have set up a scene (viewport), added the camera, default lights, grids, etc. You have also added a SortingVisual3D in which you place various box elements, for example. They are rendered as they are placed in the view. Everything is fine so far.
What I would now like to achieve is a new container for 3D objects where my moving gizmo would be placed (every object gets one). If I add the gizmo to the sorting container, it might not be visible (a box could overlap the gizmo), so I need a separate container that is rendered on top of everything.
How do I set a container (content) to be rendered on top of everything, regardless of its physical location, while still keeping it in the correct 3D space when rotating the camera? Something like 3ds Max does (example).
Thanks
OK, I found the solution myself. What you want to do is make an overlay, transform each Point3D to a 2D Point, and place the objects there (on a Canvas, for example).
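A rough sketch of that approach in plain WPF (nothing HelixToolkit-specific; the names PlaceGizmo, overlay, and gizmo are illustrative): project the Point3D to viewport coordinates with Visual3D.TransformToAncestor and position the element on a Canvas that sits on top of the Viewport3D.

using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Media.Media3D;

// Place a 2D gizmo element on an overlay Canvas at the screen position of a
// 3D anchor point that belongs to a ModelVisual3D.
static void PlaceGizmo(ModelVisual3D target, Point3D anchor3D,
                       Canvas overlay, UIElement gizmo)
{
    // Walk up from the Visual3D to the Viewport3DVisual that hosts the scene.
    DependencyObject node = target;
    while (node != null && !(node is Viewport3DVisual))
        node = VisualTreeHelper.GetParent(node);
    if (!(node is Viewport3DVisual viewportVisual))
        return;

    // Project the 3D point into 2D viewport coordinates.
    GeneralTransform3DTo2D toScreen = target.TransformToAncestor(viewportVisual);
    if (toScreen != null && toScreen.TryTransform(anchor3D, out Point p))
    {
        if (!overlay.Children.Contains(gizmo))
            overlay.Children.Add(gizmo);
        Canvas.SetLeft(gizmo, p.X);
        Canvas.SetTop(gizmo, p.Y);
    }
}

Because the projection depends on the camera, you would re-run this whenever the camera or the model moves (for example from a CompositionTarget.Rendering handler). Putting the overlay Canvas in the same Grid cell as the Viewport3D, after it, keeps it drawn on top while the gizmo still tracks the correct 3D location.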

3D Object Translation in Viewport of WPF

I have translated the object by applying a transform to the 3D object. It translates the object correctly, but the rotation of that object is disturbed (it rotates about the wrong point). I want to rotate the 3D object about its own center, not about the Viewport3D's center.
Make sure you apply the translation and rotation in the correct order.
Move the Object so that its center is at the origin (0,0,0)
Rotate the Object
Translate the Object anywhere you want
If you do this using matrices, multiply the matrices in reverse order!
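A minimal sketch of these steps using WPF transform objects (the center, angle, and final offset are made-up illustrative values). A Transform3DGroup applies its children in the order they were added, so the three steps can be written down directly:

using System.Windows.Media.Media3D;

var center = new Point3D(1.0, 2.0, 0.5);   // the object's own center (illustrative)

var transform = new Transform3DGroup();
// 1. Move the object so its center sits at the origin.
transform.Children.Add(new TranslateTransform3D(-center.X, -center.Y, -center.Z));
// 2. Rotate about the origin, which is now the object's own center.
transform.Children.Add(new RotateTransform3D(
    new AxisAngleRotation3D(new Vector3D(0, 1, 0), 45)));
// 3. Translate the object to its final position.
transform.Children.Add(new TranslateTransform3D(5, 0, 0));

myModelVisual.Transform = transform;       // myModelVisual is your ModelVisual3D

RotateTransform3D also has CenterX, CenterY, and CenterZ properties, which rotate about a given point directly and can replace the explicit move-to-origin step.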

Leaflet polygon object attributes

Within a Leaflet polygon object there are two arrays, "_originalPoints" and "_parts"; I was wondering if anyone knew what the purpose of these two arrays is.
Thanks,
I'm the Leaflet author.
_originalPoints is an array of projected geographical points (the screen coordinates of the polygon's points), and _parts is an array of arrays of points that eventually gets rendered after clipping and simplification. In the case of polygons, the different "parts" are needed to render holes; in the case of polylines, you often get several paths out of one after clipping (cutting off points outside a certain rectangular area for performance).

Getting Relative Position of a Rotating Camera

I have a Viewport3D with a 3D model composed of multiple smaller components centered at the origin. I'm animating the PerspectiveCamera to rotate about the Y-axis, using an AnimationClock created from a DoubleAnimation, to create a rotating effect on the model. In addition, I have another RotateTransform3D assigned to the camera's Transform3DGroup to let the user orbit around and zoom in and out of the model using the mouse.
I want to be able to translate each component, as it is selected, so that it moves in front of the rotating camera. However, I don't know how to get the position of the camera relative to the 3D model, because the camera's coordinate system is being animated and transformed by the user's input.
Is there a way to get the offset between two coordinate systems?
Any help or suggestions would be appreciated.
Thanks,

WPF 3D - Positioning Visual3D elements in a 3D scene using nested Model3DGroup transforms?

I have a 3D scene where my 3D models are being loaded in the code behind from XAML files.
Each model is composed of a tree of nested Model3DGroups, each of which has various transformations applied to it to position and orient the next subcomponent of the model in the tree. This model is then used as the content of a ModelVisual3D so that it can be displayed on screen.
I want to attach a child ModelVisual3D to a 'parent' ModelVisual3D. The child ModelVisual3D needs to use all of the nested transformations of the parent ModelVisual3D.Content to position and orient itself correctly in the virtual space. For example, the first ModelVisual3D is a robot arm with various movable joints, and I want to attach a tool, another ModelVisual3D, to the end of this arm. How can I access this composite transform from the parent ModelVisual3D's Content property so I can position the next ModelVisual3D correctly relative to its parent?
As you have no doubt observed, when you group Model3Ds in a Model3DGroup the Transform properties of the children combine with those of the parent.
It sounds like you are asking how to compute the net transform down to a particular Model3D within a tree of Model3Ds that make up what you are calling your "model". To do this you need to know (or be able to scan and discover) the path from your root Model3DGroup down to the Model3D you want to find the transform for.
Once you have this path, all that is required is to combine the Transform properties at each level. To do this, simply construct a Transform3DGroup and add the individual transforms to it.
For example, if your robot arm has Model3D components named "UpperArm", "LowerArm", and "Hand", and you wanted to find out the position and angle of the hand you might do:
var combined = new Transform3DGroup();
combined.Children.Add(UpperArm.Transform);
combined.Children.Add(LowerArm.Transform);
combined.Children.Add(Hand.Transform);
Now you can find the (0,0,0) location on the hand as follows:
Point3D handOrigin = combined.Transform(new Point3D(0, 0, 0));
Similarly you can find other points and use them to position your other ModelVisual3D.
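For instance, continuing the sketch above (toolModel and viewport are illustrative names), you can apply the same combined transform to the attached ModelVisual3D so the tool follows the arm's nested transforms:

var tool = new ModelVisual3D { Content = toolModel };
tool.Transform = combined;
viewport.Children.Add(tool);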
