I am researching ways to improve the efficiency of caching in Maya. As far as I know, Alembic cache is a good choice for broad use, so I am trying to use it as a replacement for Geometry Cache (.mcx). But Alembic cache requires the names of the cached objects to match the targets in the scene. Is there any workaround that would let me use Alembic cache in place of Geometry Cache?
I just answered this in a previously asked question.
To overcome the non-matching name target problem:
When you import the .abc file into Maya, it creates an AlembicNode in the scene. If the cached object's name doesn't match your scene object's name, you can connect it manually.
The way you do this is as follows.
The Alembic node has a number of array output plugs, such as outPolyMesh and outNSurface, which carry the cached outputs. If your render object is a mesh, you will find the corresponding output inside the outPolyMesh array. In the Connection Editor, just connect the matching outPolyMesh[i] plug to the inMesh plug of your render model's shape node.
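For example, a minimal Python sketch (the node and shape names here, AlembicNode1 and pSphereShape1, are placeholders for whatever your scene actually contains):

import maya.cmds as cmds
# connect the first cached mesh output to the render mesh's input
cmds.connectAttr('AlembicNode1.outPolyMesh[0]', 'pSphereShape1.inMesh', force=True)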
Hope that was useful.
I have a PostGIS table of about 200 entries for which I have to enter lat/lon coordinates. I use QGIS to process and display them. Is there a way of clicking on the QGIS map and thereby entering the coordinates into the PostGIS table?
I can find the coordinates on the map, copy them, and enter them into the table, but that is a lot of work and error prone. I browsed through all the plugins and found the digitizing tools promising, but they don't do what I want. I googled this specific question but didn't find a clue.
Is what I want possible at all?
I am rather a noob at GIS and QGIS, so it took me some time to find the right keywords to search with. The right keyword for what I want to do is apparently "digitize". What I want is partially possible. Links that helped me are:
https://www.igismap.com/digitization-in-qgis-exploring-tools-for-digitizing/
https://www.qgistutorials.com/en/docs/digitizing_basics.html
https://gis.stackexchange.com/questions/41799/adding-shapefiles-to-postgis-database
The gist of it is:
Enable the Advanced Digitizing Toolbar: View > Toolbars > Advanced Digitizing Toolbar
Create a new shapefile layer: Layer > Create New Shapefile Layer. Don't forget the fields you need, in my case the name of the location. Also try to remember where you store the file (it will be a shapefile); you will need that later to import the shapefile into PostGIS
Toggle editing on the new layer and choose the type of feature you want to add. In my case they were points
Click on each location; a dialog pops up with the attributes you should add
Finally, import the shapefile into PostGIS. You need the third link for that, together with your memory of where you stored that #$#%$& file; a command sketch follows below
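For that last step, a minimal command sketch (the file name locations.shp, table name public.locations, SRID 4326, and database mydb are placeholders for your own setup):

# load the shapefile into PostGIS and build a spatial index (-I)
shp2pgsql -s 4326 -I locations.shp public.locations | psql -d mydb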
I'm using Solr 6.5 to index files from multiple FTP sources into multiple cores (one core for each type of document: audio files, images, software, video, and documents).
The situation is that I'm doing this to populate an app whose front end has a social-networking approach, in which every user can add new tags or modify other metadata without restriction.
So when I execute the data import handler again to add new files to my application, it erases the index entries that users previously modified and resets them to the data-config defaults.
My question: is there a way to tell DIH that, if an id already exists in the index, it should skip that document and only add the files whose id is not in the index yet?
If this is not possible, can I do something similar in a different way?
Thanks for everything!
Sounds like you are doing a full import with default settings. One of them is clean, which defaults to true and deletes the whole index before the import.
Try setting it to false and also look at preImportDeleteQuery and postImportDeleteQuery for even more precision.
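A sketch of what that might look like (the core name mycore, entity name files, and delete queries here are placeholders; yours will differ). Passing clean=false on the import request:

http://localhost:8983/solr/mycore/dataimport?command=full-import&clean=false&commit=true

Or, for finer-grained deletes, as attributes on the root entity in data-config.xml:

<entity name="files" query="..." preImportDeleteQuery="doctype:audio" postImportDeleteQuery="doctype:obsolete">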
I am using the Drupal 7 Migrate module to create a series of nodes from JPG and EPS files. I can get them to import just fine, but I notice that once the import is done, none of the attached filefield and thumbnail files contain filename information.
Upon inspecting the file_managed table, I see that both the filename and filemime fields are empty for ONLY the files attached via the Migrate module. This also creates an issue with downloading the files.
Now I think the problem has to do with the fact that I am using "file_link" instead of "file_copy" as the file operation. The problem is that I am importing around 2 TB (that's terabytes) of image files. We had to put in a special request with Rackspace just to get access to that much disk space on our server, so I can't copy from one directory to the next because of space constraints. "file_link" therefore seems like the obvious choice.
Now you probably want to see how I am doing this exactly, so here is the code snippet:
$jpg_arguments = MigrateFileFieldHandler::arguments(
  NULL,                                     // no base path (as in the beer.inc example)
  'file_link',                              // link to the file in place instead of copying it
  FILE_EXISTS_RENAME,
  'en',                                     // language
  array('source_field' => 'jpg_name'),      // description
  array('source_field' => 'jpg_filename'),  // title
  array('source_field' => 'jpg_filename')   // alt
);
$this->addFieldMapping('field_image', 'jpg_uri')
  ->arguments($jpg_arguments);
As you can see, I am specifying no base path (just like the beer.inc example file does). I have set file_link, the language, and the source fields for the description, title, and alt.
It is able to generate thumbnails from the JPGs, but those columns of data are still missing in the db table. I traced through the functions as best I could, but I don't see what is causing this. I tried running the URI from the table through the functions that generate the filename and the filemime, and they output just fine. It is as if something is removing just those segments of data.
Does anyone have any idea what this could be? I am using the Drupal 7 Migrate module version 2.2. It is running on Drupal 7.8.
Thanks,
Patrick
OK, so I have found the answer to yet another question of mine. This is actually an issue with the Migrate module itself; the issue is documented here. I will be withdrawing this bounty (as soon as I figure out how).
Does OpenOffice or LibreOffice support any MIME types that allow direct pasting or drag-and-drop of tabular data? I have implemented CSV drag and drop, but since my source data is already tabular, I'd like my users not to have to navigate the import screen that comes up for CSV.
I had exactly the same problem.
The solution is really stupid, and it cost me hours.
Instead of formatting your CSV table as:
One\tTwo\tThree\n
Four\tFive\tSix\n
use the \r character instead of \n:
One\tTwo\tThree\r
Four\tFive\tSix\r
The MIME type you have to use is "text/plain".
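For illustration, a minimal sketch of building such a payload (Python here only as an example; any language works the same way):

# rows joined with tabs, terminated with \r, offered as "text/plain"
rows = [["One", "Two", "Three"], ["Four", "Five", "Six"]]
payload = "".join("\t".join(cells) + "\r" for cells in rows)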
I tried dragging some cells from one OOo Calc window to another, and it maintains the tabular structure of my data, which suggests it does allow such things (but doesn't prove it: it could be doing something special behind the scenes).
(I thought there used to be a program to list the mime-types that a drag contained, but I can't find one today.)
On a whim, I tried dragging a simple <table> from my web browser to OOo Calc, and it appeared there as a table, with no import screen. Based on this, I think OOo sees a single <table> in a text/html data drop as something it knows how to put into cells.
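So a drop payload as simple as this sketch (untested beyond the browser drag just described) seems to go straight into cells when offered as text/html:

<table>
  <tr><td>One</td><td>Two</td><td>Three</td></tr>
  <tr><td>Four</td><td>Five</td><td>Six</td></tr>
</table>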
I don't know if that's the best way but it seems to work!
My goal is to display various shapes (polygons, points, linestrings) on Google Maps using data entered into a PostGIS database dynamically (by which I mean modifications show up on the map in real time).
I was looking for a way to do this that uses the spatial structure already provided by PostGIS (which already designates whether a shape is a linestring, polygon, etc.) instead of parsing out the coordinates and then re-entering the spatial structure in Google Maps. I saw that the Google Maps API is now compatible with the KML data format, and then I read that I have to convert the PostGIS data to KML.
I've done some reading in the forums about the actual process of converting PostGIS data to KML via FWTools, but didn't see anything that would help me. I'm new to KML but am familiar with PostGIS, Perl, and PHP. Is there a tutorial for converting PostGIS data to KML? Where can I get started? Thanks for any help.
You can use PostGIS to convert to KML directly:
SELECT ST_AsKML(geometry) FROM MyTable;
ST_AsKML is one of several output functions; others produce WKT, GML, GeoJSON, etc.
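One caveat worth noting: KML coordinates are always WGS84 (EPSG:4326), so if your table is stored in a different SRID you will likely need a transform first (assuming a geometry column named geometry):

SELECT ST_AsKML(ST_Transform(geometry, 4326)) FROM MyTable;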
To show dynamic data in Google Earth, a common pattern is to use KML with a NetworkLink element. Set the link's viewRefreshMode to onStop and Google Earth will make requests (to a URL served by PHP, presumably) with bounding-box parameters attached. Use the bounding box to query features from the PostGIS database, and return the results as KML. This is great if you have lots and lots of features but only want to retrieve those in the region the user is looking at.
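A minimal NetworkLink sketch (the href is a placeholder for your PHP endpoint):

<NetworkLink>
  <name>PostGIS features</name>
  <Link>
    <href>http://example.com/features.php</href>
    <viewRefreshMode>onStop</viewRefreshMode>
    <viewRefreshTime>1</viewRefreshTime>
  </Link>
</NetworkLink>

With the default viewFormat, Google Earth appends a BBOX=west,south,east,north parameter to each request, which is what you feed into your PostGIS query.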
Depending on the complexity of your application, you may also want to look at GeoDjango. (Familiarity with PostGIS is a big head start!)
You can get a textual representation of the spatial data from a Postgres DB using a text conversion function, like:
SELECT ST_AsText(MyGeometry) FROM MyTable;
Then you parse the string, create your objects using the various API functions (depending on the PostGIS geometry type), and append these objects to the main GE plugin object in a DOM-like way.
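For instance, a sketch for a single parsed point (JavaScript against the Earth API; ge is your plugin instance, and the coordinates would come from your parsed WKT string):

var placemark = ge.createPlacemark('feature-42'); // give it a unique ID, see below
var point = ge.createPoint('');
point.setLatitude(48.85);
point.setLongitude(2.35);
placemark.setGeometry(point);
ge.getFeatures().appendChild(placemark);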
If you are familiar with JavaScript and have a fundamental knowledge of XML, a good start is http://code.google.com/apis/earth/documentation/reference/
Don't forget to assign unique IDs to your objects so you can find them later to drop or modify them.
Maybe you can get some inspiration here: display the linked "locator.js" file and look at the function PaintSubField(Coord). This is another way, a bit crude but effective, and it avoids messing around with too many individual parent/child objects and structures.
You may also want to consult the sample applications and use the Code Playground for "rapid prototyping".
re "realtime" you need at least an event that you can link your generation/redraw routines to.
Good luck
MikeD