Maya: Merging FBX animation on top of HIK rig with namespace

I have an HIK character rig that is referenced into the scene with the namespace "rig:". I have a bunch of animation files for the same skeleton, but without an HIK rig or a namespace.
I can import and merge the animation onto my referenced rig via Maya's Import window. In the Namespace Options, I select the HIK rig's namespace, then "Use selected namespace...add new string", and I set my animation to "Update". This works great.
However, I can't get the equivalent MEL command to work. The command I get for the steps above via the Script Editor's Echo All Commands is:
file -import -type "FBX" -ignoreVersion -ra true -mergeNamespacesOnClash false -namespace "testNs" -pr -importTimeRange "combine" "C:/myFile.fbx";
But for some reason this doesn't work. The command seems to omit the referencing info from the options window I mentioned above, namely the selected referencing parent. I didn't notice anything in the FBX MEL commands documentation either. Is Maya running another command that isn't echoed by the Script Editor's Echo All Commands?
Any help would be appreciated.

I needed to set the current namespace to the HIK character's namespace before importing. I also needed to set the type of FBX import I wanted (in this case, merging only onto bones whose names it can find). I guess this is effectively what Maya's file import options box is doing.
namespace -set "hik_rig_namespace";
FBXImportMode -v "exmerge";
file -import -type "FBX" -ignoreVersion -ra true -mergeNamespacesOnClash true -namespace "this_doesnt_matter" -pr -importTimeRange "combine" "C:/myFile.fbx";
namespace -set ":";
The namespace passed to the file command's -namespace flag doesn't seem to actually matter. After the file is imported, I set the current namespace back to the root.
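For anyone driving this from Python instead of MEL, the same sequence might look like the sketch below. This is untested; the rig namespace and file path are placeholders, and it assumes the fbxmaya plugin is already loaded so that FBXImportMode is available.
import maya.cmds as cmds
import maya.mel as mel

def merge_fbx_onto_rig(fbx_path, rig_namespace):
    # Make the rig's namespace current so the merge targets its bones.
    cmds.namespace(setNamespace=':' + rig_namespace)
    try:
        # "exmerge" merges animation onto existing matching nodes only.
        mel.eval('FBXImportMode -v "exmerge";')
        cmds.file(fbx_path, i=True, type='FBX', ignoreVersion=True,
                  renameAll=True, mergeNamespacesOnClash=True,
                  namespace='this_doesnt_matter',
                  preserveReferences=True, importTimeRange='combine')
    finally:
        # Restore the root namespace whatever happens.
        cmds.namespace(setNamespace=':')

merge_fbx_onto_rig('C:/myFile.fbx', 'rig')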

Export multiple rasters from GRASS GIS

I used r.tile (http://grass.osgeo.org/grass71/manuals/r.tile.html) to create tiles from a huge GeoTIFF. Now I want to export each tile to a file but haven't found an easy way:
Using r.out.gdal I can only export one tile at a time, which makes it unusable as I have 100 tiles...
I haven't found any other solution... anyone have any ideas?
You just call r.out.gdal in a for loop in some scripting language.
Here is an example for Bash. It is a simple but complete script. It expects the raster map names as command-line parameters, and if no parameters are present it fails with a usage message.
#!/usr/bin/env bash
if [ $# -eq 0 ]
then
    >&2 echo "No arguments supplied"
    >&2 echo "Usage: $0 raster1 raster2 ..."
    exit 1
fi
for RASTER in "$@"
do
    r.out.gdal input="$RASTER" output="$RASTER.tiff" format=GTiff
done
To run it you have to make it executable first (chmod u+x export.sh), then you can run it from the command line:
./export.sh tile1 tile2
Here is an example for Python, which is much more convenient for bigger tasks. The code does exactly the same as the Bash script. Additionally, it uses the GRASS Python Scripting Library to call the r.out.gdal module, and the print function to be forward compatible with Python 3.
#!/usr/bin/env python
from __future__ import print_function
import sys
import grass.script as gscript

if len(sys.argv) == 1:
    print("No arguments supplied", file=sys.stderr)
    print("Usage: {} raster1 raster2 ...".format(sys.argv[0]), file=sys.stderr)
    sys.exit(1)

for raster in sys.argv[1:]:
    gscript.run_command('r.out.gdal', input=raster,
                        output=raster + '.tiff', format='GTiff')
Again, to run it you have to make it executable first (chmod u+x export.py), then you can run it from the command line:
./export.py tile1 tile2
Both examples assume you are using Linux, Mac OS X or something similar and are running the script from the system command line in a GRASS GIS session. On MS Windows, you should probably use the Python version and run it, for example, from the GRASS GIS GUI through the Launch script item in the File menu of the Layer Manager.
The above answers require you to input the name of each raster layer. If you want to automate the whole process, you can use the following Python script. It first creates a list of all rasters available to GRASS, then exports each one individually as a TIFF file. To run it, either copy and paste the code into the GRASS Python shell (accessible as a tab on the GRASS GUI Layer Manager window) or save it as a .py file and, again from the Python shell, run it with execfile("foo.py").
import grass.script as grass

# create list of all rasters
rastlist = grass.read_command("g.list", type="rast")
rastlist1 = rastlist.split(':', 1)[1]
rastlist2 = rastlist1.split('-', 1)[0]
RastListClean = rastlist2.split()

# export all rasters
for raster in RastListClean:
    grass.run_command('r.out.gdal', input=raster,
                      output=raster + '.tiff', format='GTiff')
Working with GRASS 8.2 on macOS, this version works.
g.list seems to return a single newline-separated string.
split() breaks it into a Python list that can be iterated.
Note that this only works because GRASS does not allow spaces in object names.
#!/usr/bin/env python3
import grass.script as grass

rastlist = grass.read_command("g.list", type="rast")
rastlistClean = rastlist.split()
for raster in rastlistClean:
    grass.run_command('r.out.gdal', input=raster, output=raster + '.tif',
                      format='GTiff', overwrite=True)
Credit to Beeman's answer which is the basis for this one.
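If the mapset contains rasters besides the tiles, a small variation of the scripts above can restrict the export to the tile names, since g.list accepts a wildcard pattern. A sketch, where the mytile prefix is a placeholder for whatever output name you gave r.tile:
import grass.script as gscript

# List only the rasters whose names match the tile prefix produced by r.tile.
tiles = gscript.read_command("g.list", type="raster", pattern="mytile*").split()

# Export each matching tile, as in the scripts above.
for tile in tiles:
    gscript.run_command('r.out.gdal', input=tile,
                        output=tile + '.tif', format='GTiff', overwrite=True)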

Clearcase: "No such file or Directory" error during labeling

I am trying to label all the elements within a branch of a view, but some of the elements do not get labeled and instead give me a "No such file or Directory" error. I can see the files that generated the error in my command window, but they are highlighted in red. It seems like these files are not there and are thus generating the error. How can I remove these "files" from the view so that the labeling can continue and not generate errors?
Say that words marked with an asterisk represent red highlighting. This is what I see:
file1* file2 directory1* directory2
Here is how my code is structured in my shell script:
cleartool mkview -tag $VIEWNAME ... (etc.)
cleartool setcs -tag $VIEWNAME configSpec.txt
cd /projectDirectory
labelname=`date "+%b-%d-%y"`
cleartool mklbtype -nc $labelname
cleartool mklabel -recurse $labelname /projectDirectory
The script starts recursing through the file tree from the projectDirectory. When it encounters file1 or directory1, I get the "No such file or directory" error. Otherwise, for file2 and directory2, the labeling occurs properly.
So, my question is this: How can I use the mklabel command or some other method to label all the files that are not highlighted in red?
You must first know the exact status of the "files in red".
For that, go in a shell to their parent folder and type:
cleartool ls
That will give you their status (eclipsed? private? something else?), which will explain why the labeling cannot proceed.
Possible causes:
the files are symlinks
the files have spaces in their names (that shouldn't be the case here: cleartool mklabel should support that case)
the files are in the lost+found folder (solution: exclude that folder from your view with a -none selection rule, e.g. a config spec line like element /projectDirectory/lost+found/... -none)
Note: if an element isn't selected (no version selected and Rule: -none), then a recursive mklabel is expected to generate that error message, but that won't prevent the label from being set on the other element versions.
So that error message can be safely ignored.
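To scan a whole tree for such elements before labeling, a wrapper around cleartool ls could look like the sketch below. It is untested and relies on two assumptions: cleartool is on the PATH, and the usual ls output conventions hold (selected elements carry a Rule: annotation, eclipsed ones are flagged [eclipsed]).
import subprocess

def find_label_blockers(directory):
    # Print cleartool ls lines that look view-private or eclipsed,
    # i.e. the entries that typically break mklabel -recurse.
    out = subprocess.run(["cleartool", "ls", "-recurse", directory],
                         capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        if "[eclipsed]" in line or "Rule:" not in line:
            print("suspect:", line)

find_label_blockers("/projectDirectory")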

How to query Maya in script for supported file translator plugins?

I'm trying to specify an FBX file in MEL using the command
file -f -pmt 0 -options "v=0;" -typ "FBX" -o
On one computer this works great. On another, it fails, but DOES work if I use
-typ "Fbx"
I think I'd like to query for the supported translators in my script, then either select the correct one or report an error. Is this possible? Am I mis-diagnosing the problem?
MEL has a command called pluginInfo. You could write a simple function that returns the proper spelling based on that. pluginInfo -query -version "fbxmaya"; will provide the version of the fbx plugin. I haven't used MEL in a while so I'm not gonna try to make this perfect, but maybe something like:
string $ver = `pluginInfo -query -version "fbxmaya"`;
string $fbxType = ($ver != "") ? "FBX" : "Fbx";
Then just plug that variable into file -f -pmt 0 -options "v=0;" -typ $fbxType -o.
It might be a different version of fbx. You'd have to add another line which determines the version of fbx on that particular machine and pipes in the correct spelling.
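To actually query the registered translators rather than guessing the spelling, the translator command should help; from Python it might look like the sketch below (untested; fbx_type_name is just an illustrative helper name):
import maya.cmds as cmds

def fbx_type_name():
    # translator -l returns the names of all registered file translators.
    names = cmds.translator(l=True) or []
    for candidate in ('FBX', 'Fbx', 'fbx'):
        if candidate in names:
            return candidate
    raise RuntimeError('No FBX translator found; is the fbxmaya plugin loaded?')

# Plug the result into the file command in place of a hard-coded type string.
print(fbx_type_name())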

Recursive checkin using Clearcase

I want to check in a directory and all its subdirectories into ClearCase.
Is there a specific command to achieve this?
Currently I am going into each directory and manually checking in each file.
I would recommend this question:
Now the problem is to check in everything that has changed.
That is problematic since often not everything has changed, and ClearCase will trigger an error message when trying to check in an identical file. That means you will need two commands:
ct lsco -r -cvi -fmt "ci -nc \"%n\"\n" | ct
ct lsco -r -cvi -fmt "unco -rm %n\n" | ct
(with 'ct' being 'cleartool': type 'doskey ct=cleartool $*' on Windows to set that alias). The -fmt option formats each checkout as a ci (or unco) subcommand, and piping the result back into cleartool executes those subcommands one by one.
But if by "checkin" you mean:
"enter into source control for the first time"
"updating a large number of files which may have changed on an existing versionned directory"
I would recommend creating a dynamic view and using clearfsimport to import your snapshot tree (with the new files) into the dynamic view.
See this question or this question.
the clearfsimport script is better equipped to import the same set of files multiple times, and will automatically:
add new files,
make new versions of existing files previously imported (but modified in the source set of files being re-imported),
remove files already imported but no longer present in the source set of files.
make a clear log of all operations made during the import process.
For example:
clearfsimport -preview -rec -nset c:\sourceDir\* m:\MyView\MyVob\MyDestinationDirectory
Did you use the -recurse option of the clearfsimport command?
Example: clearfsimport -recurse source_dir .
This should help.
If you're using the Windows client, right-click on the parent folder, select Search, leave the file name field empty, click Search, select all the files in the result window (Ctrl-A), right-click on them and select ClearCase -> Add to Source Control.
If you are on Windows, you may try:
for /f "usebackq" %i in (`cleartool lsco -cview -me -r -s`) do cleartool ci -nc %i

Creating a new subdirectory structure in ClearCase?

I'm a ClearCase newbie and up until now have been used to SVN. Therefore, I'm a bit confused about the steps I need to take to add a new directory structure containing multiple files to ClearCase.
So, say for example there is an existing directory structure within ClearCase as follows:
\ParentDirectory
    \ChildDirectory1
        \File1
        \File2
    \ChildDirectory2
    \ChildDirectory3
        \File1
    \ChildDirectory4
If I want to add a new subdirectory to this structure, ChildDirectory5, which will contain a number of other files, how do I go about this? From what I have been reading, I will need to first of all check out the parent directory and then use the mkelem command to make each subdirectory and file.
However, I have already created the necessary files and directories on my local machine so I just need to check them into ClearCase somehow. With SVN, all I would've needed to do was copy the parent folder into a checked out repo and do an add & commit command sequence on it.
As explained in How can I use ClearCase to "add to source control …" recursively?, you have to use clearfsimport, which does what you are describing (checkout of the parent directories, mkelem for the elements):
clearfsimport -preview -rec -nset c:\sourceDir\ChildDirectory5 m:\MyView\MyVob\ParentDirectory
Note the:
-preview option: it allows you to check what would happen without actually doing anything.
'*' wildcard: used only in a Windows environment, in order to import the content of a directory.
-nset option (see my previous answer about nset).
I would recommend a dynamic view for those initialization phases where you need to import a lot of data: you can quickly see what your view looks like without doing any update (that is, "without updating your workspace"):
ClearCase allows you to access the data in two ways:
snapshot view (like an SVN workspace, except all the .svn metadata is externalized in a view storage outside the workspace)
dynamic view: all your files are visible through the network (instant access/update)
I use a variant of this script (I call it "ctadd"):
#!/usr/bin/perl
use strict;
use Getopt::Attribute;

(our $nodo : Getopt(nodo));
(our $exclude_pat : Getopt(exclude_pat=s));

for my $i (@ARGV) {
    if ($i =~ /\s/) {
        warn "skipping file with spaces ($i)\n";
        next;
    }
    chomp(my @files = `find $i -type f`);
    @files = grep !/~$/, @files;   # emacs backup files
    @files = grep !/^\#/, @files;  # emacs autosave files
    if (defined($exclude_pat)) {
        @files = grep !/$exclude_pat/, @files;
    }
    foreach (@files) {
        warn "skipping file with spaces ($_)\n" if /\s/;
    }
    @files = grep !/\s/, @files;
    foreach (@files) {
        my $cmd = "cleartool mkelem -nc -mkp \"$_\"";
        print STDERR "$cmd\n";
        system($cmd) unless $nodo;
    }
}
The -mkpath option of cleartool mkelem will automatically create and/or check out any needed directories.
For this script, -nodo will have it simply print the commands without running them, and -exclude_pat lets you specify a pattern: any file matching it will be excluded.
Note that Getopt::Attribute is not part of the standard Perl distribution, but it is available on a CPAN mirror near you.
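Assuming the script above is saved as ctadd, made executable, and placed on your PATH, a dry run over a new directory might look like this (the directory name and pattern are just examples):
ctadd -nodo -exclude_pat '\.bak$' ChildDirectory5
Dropping -nodo then actually performs the mkelem calls.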
You have to import your local directory structure. The command is clearfsimport.
