I am trying to create a particle system in SceneKit. I use the
particleSystem.birthLocation = .volume
option on a SceneKit node. This works fine for built-in SceneKit shapes like SCNBox or SCNCylinder, but now I need to build more complex meshes in Blender for use as the emitter. As a simple test I tried exporting the default cube you get in Blender. However, when I use the exported .dae file I get the following error in the console:
[SceneKit] Error: Cannot use volume generation on a generic mesh. fallbacking on surface
Even the surface emitter it falls back to is incorrect: it consists of only 3 triangles.
I've tried various options in the Blender export but I can't get this to work. Anyone have any ideas?
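For reference, the setup that works with a built-in shape looks something like this (a minimal Swift sketch; `makeVolumeEmitter` is just an illustrative helper name, not SceneKit API):

```swift
import SceneKit

// Hypothetical helper: configure a particle system that emits from
// inside a geometry's volume. Built-in shapes such as SCNBox work;
// a generic mesh imported from a .dae triggers the fallback to .surface.
func makeVolumeEmitter(shape: SCNGeometry) -> SCNParticleSystem {
    let ps = SCNParticleSystem()
    ps.birthLocation = .volume   // emit from points inside the shape
    ps.emitterShape = shape      // the geometry that defines the volume
    ps.birthRate = 200
    return ps
}
```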
I'm trying to create joints on top of an animated Alembic mesh in order to use them later as a skin deformer. The current issue is that the joints are following the geometry.
Do I need to create a proper character rig for that? Is there a simpler way to connect them directly to the original geometry, without having to use the "duplicated geometry/blendshape/follicle" setup?
Many thanks,
I am rendering 3D furniture models in Autodesk Maya for an AR app. To show a shadow under a model in the scene file, I need to export a transparent PNG from Maya that I can use as the diffuse image on a shadow plane. I have attached an example of the kind of file needed (a shadow file generated for a chair model), but I don't know how to generate the same thing in Maya. Please help.
If you want to generate a UV texture containing shadows in Maya for ARKit / SceneKit models, follow these steps (I should say it's not easy if you're a beginner in Maya):
Create objects with appropriate shaders
Create UV map for your models in UV Editor
Activate Maya Software Renderer
Create lights with Raytraced Shadows
Don't forget to turn Raytracing on in the Render Settings
Duplicate all 3D objects and lights
Select an object with cast shadows and a corresponding light
Go to Rendering module and choose Lighting/Shading - Transfer Maps... menu.
Set up all the properties for the Shaded Output Map and its output location, then press Bake.
Also you can output four additional passes in Maya Software Renderer:
Normal Map
Displacement Map
Diffuse Map
Alpha Map
If you want just the shadows, without a diffuse texture, use a white-coloured Lambert shader in Maya.
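Once the transparent shadow PNG is baked, it can be applied as the diffuse map of a ground plane in SceneKit. A minimal Swift sketch, assuming a hypothetical helper name (`makeShadowPlane`) and that you pass in the baked image yourself:

```swift
import SceneKit

// Hypothetical helper: build a floor plane that displays a baked,
// transparent shadow PNG under a model.
func makeShadowPlane(width: CGFloat, length: CGFloat, image: Any?) -> SCNNode {
    let plane = SCNPlane(width: width, height: length)
    let material = SCNMaterial()
    material.diffuse.contents = image      // the transparent PNG baked in Maya
    material.writesToDepthBuffer = false   // avoid z-fighting with the real floor
    plane.firstMaterial = material
    let node = SCNNode(geometry: plane)
    node.eulerAngles.x = -.pi / 2          // lay the plane flat on the ground
    return node
}
```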
I'm trying to carefully word this question because the problem doesn't appear to be that the animation isn't being imported. I can see the animation as <untitled animation> in the scene graph, and I can play the animation in Xcode; the issue is then locating the animation as an attribute to any of the objects. This is what I can see in the scene graph:
But, when I try and locate the animation here:
... nothing appears. So, it seems that it isn't importing the animation as a CAAnimation object with a key.
I have tried to programmatically enumerate through the child nodes to find any CAAnimation objects, but it doesn't find any. Other scenes work perfectly, however.
What do I need to add to the DAE file to get it to build the CAAnimation object properly?
Worth pointing out that the 3D model and animation were exported from 3ds Max using OpenCOLLADA. I don't know what the best practices are for exporting COLLADA to SceneKit, and if anyone has any useful information that would be great. Apple doesn't seem to have anything.
The SceneKit editor in Xcode does not display the root node in the scene graph view, and it might be the one that holds the animation.
You should be able to retrieve the root node's animation programmatically.
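One way to check programmatically is to query the SCNSceneSource for animation entries, which lists animations regardless of which node (including the hidden root node) they are attached to. A sketch, where `animationIdentifiers(inSceneNamed:)` is an illustrative helper:

```swift
import SceneKit

// Hypothetical helper: list every CAAnimation stored in a .dae file in
// the main bundle, by asking the scene source for its animation entries.
func animationIdentifiers(inSceneNamed name: String) -> [String] {
    guard let url = Bundle.main.url(forResource: name, withExtension: "dae"),
          let source = SCNSceneSource(url: url, options: nil) else { return [] }
    // Identifiers for every CAAnimation entry in the file; each one can
    // then be loaded with entryWithIdentifier(_:withClass:).
    return source.identifiersOfEntries(withClass: CAAnimation.self)
}
```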
I have created a box in X3D but I want to curve the top right and left corners of the box.
Any ideas on how? At the moment I have the code for the box only.
I cannot find much on the net about simple shape manipulation.
Thanks
You can't do it manually, directly in X3D.
You have to use software such as Blender or 3DS Max to create non-primitive shapes (shapes more complex than cubes, spheres, cylinders and so on). There you have all kinds of functions for modifying your object, and you can then export it in several formats, including X3D.
I want to create a 3D scene with the SceneKit modeler and then read it into my Metal app. I see there is SceneKit and ModelIO API to do this but I am unclear on how the pieces fit together.
So, what I need is a path from .scn file -> MDLMesh -> geometry + texture. I am also unclear on how I would sync my Metal shaders with materials created in the SceneKit modeler.
There are two main parts to what you're asking here: getting SceneKit data into ModelIO, and rendering ModelIO data with Metal.
To get SceneKit scenes into ModelIO, first use the SceneKit API (SCNScene or SCNSceneSource) to load the .scn file, then use the ModelIO API to get the objects you want as meshes. You can create an MDLAsset from the entire scene using assetWithSCNScene:bufferAllocator: and then walk the asset's object hierarchy in ModelIO to find the mesh you want, or walk the node hierarchy in SceneKit to find the SCNNode or SCNGeometry you want and then bring it into ModelIO using objectWithSCNNode:bufferAllocator: or meshWithSCNGeometry:bufferAllocator:.
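In Swift, the scene-to-ModelIO bridge described above might look like this (a sketch; `firstMesh(inSceneNamed:)` is a hypothetical helper name):

```swift
import SceneKit
import SceneKit.ModelIO
import ModelIO

// Hypothetical helper: load a .scn file and pull its first mesh into ModelIO.
func firstMesh(inSceneNamed name: String) -> MDLMesh? {
    guard let scene = SCNScene(named: name) else { return nil }
    // Bridge the whole SceneKit scene into a ModelIO asset...
    let asset = MDLAsset(scnScene: scene)
    // ...then walk the asset's object hierarchy for mesh objects.
    // (Alternatively, find an SCNGeometry yourself and use MDLMesh(scnGeometry:).)
    return asset.childObjects(of: MDLMesh.self).first as? MDLMesh
}
```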
As for using ModelIO meshes in a Metal app, Apple has a sample code project that shows how to use ModelIO to load an OBJ mesh, use MetalKit to get the mesh data into Metal GPU buffers, and hook up the material information you get from ModelIO to shader variables for use in your own renderer.
You should be able to put the two together: where the sample code loads an OBJ to get an MDLAsset or MDLMesh, use the methods described above to get an asset or mesh out of a SceneKit file instead.
The SceneKit material model is of course much more complicated than the simple Phong shader used in the sample code. But the sample code does show how to iterate through an MDLMaterial's properties and set corresponding arguments in a Metal shader; if you create a more complicated shader, follow the same steps to map the material properties to whatever your shader's inputs are.
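Putting it together, the MetalKit conversion and a material property lookup might be sketched like this (`loadMeshes` and `baseColor` are illustrative helper names, and the white fallback colour is an assumption):

```swift
import Metal
import MetalKit
import ModelIO

// Hypothetical helper: re-allocate an asset's vertex data into Metal
// buffers and return the resulting MetalKit meshes.
func loadMeshes(from asset: MDLAsset, device: MTLDevice) throws -> [MTKMesh] {
    let (_, metalKitMeshes) = try MTKMesh.newMeshes(asset: asset, device: device)
    return metalKitMeshes
}

// Hypothetical helper: read one ModelIO material property so it can be
// fed to a shader uniform; falls back to white when it's missing.
func baseColor(of material: MDLMaterial?) -> SIMD3<Float> {
    guard let property = material?.property(with: .baseColor),
          property.type == .float3 else { return SIMD3<Float>(repeating: 1) }
    return property.float3Value
}
```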