Default Values for Dates in ArcGIS

In Microsoft SQL Server I can use the GETDATE() function as the default value for a DATETIME field. I'd like to be able to do the same kind of thing for a date field in an ArcGIS geodatabase. Is this possible, or am I limited to literal values?
My geodatabase is using ArcSDE 9.1. The Feature Class with the defining attributes is versioned.
Thanks,
Camel

ArcGIS generally leverages an external database engine, so unless you are talking about an individual shapefile, your data is being stored in Access, SQL Server, or Oracle. Unless you have ArcSDE, it is probably Access. You can define data directly in the database and assign defaults there and then link to the tables from your map authoring tool.
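For example, in SQL Server a GETDATE() default can be attached directly to the column (a minimal sketch; the table, constraint, and column names are placeholders):

    -- Give an existing DATETIME column a GETDATE() default
    -- (table, constraint, and column names are placeholders)
    ALTER TABLE MyFeatureClass
        ADD CONSTRAINT DF_MyFeatureClass_CreatedDate
        DEFAULT GETDATE() FOR CREATED_DATE;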
EDIT: After your last comment I consulted one of my more GIS-savvy friends, and she had the following to offer:
They will have to define the table and its defaults in the database and then join the table to the feature class via a common field. It is important not to join the date field to the feature class; in that case, the feature class would hold onto the values set up in the feature class and ignore the table value.
Hope that is of some help.

I ended up talking to Esri support about this issue. They confirmed that versioned tables do not inherit the default values of the original table (well, in SQL Server anyway).
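If you want to confirm that on your own database, the column defaults are easy to query (a minimal sketch; CREATED_DATE is a placeholder for your date field's name):

    -- Show which tables actually carry a default on the date column
    -- (column name is a placeholder)
    SELECT TABLE_NAME, COLUMN_NAME, COLUMN_DEFAULT
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE COLUMN_NAME = 'CREATED_DATE'
      AND COLUMN_DEFAULT IS NOT NULL;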
With regards to creating a join between a table and the feature class:
The data is exported to a shapefile and copied to a PocketPC device
Data entry is via an ArcPad application
The shapefile is synchronised and re-imported into the SDE
So basically, the DATETIME default would have to survive the export/import process. I didn't test whether this is possible. In the end, I inserted the default value programmatically on the PocketPC.

Related

Export Outlook Emails Into SQL (Via ACCESS?)

I have an email folder in Outlook that contains hundreds of emails recording my discussions with the developer of some bespoke software. I want to import these into SQL to create a knowledge base that can be searched to extract all the decisions we have made during the course of the two-year project.
Having searched the net, I found that it is very easy to dump the contents of an email folder into Access using the import data functionality. In fact I have linked the table, and so believe (never used Access before!!) that I now have an Access table that is connected in 'real-time' to the Outlook folder. This is exactly what I want, BUT in SQL, as that is something I am very familiar with using.
So I have tried to import the Access database into SQL (which also appears to be relatively easy) but keep getting the message that 'The source database ...contains no visible tables or views'. Checking SQL permissions, I am owner of this new database.
Two questions, please. First, I can't believe that going through Access is the simplest way to do this, and I presume that I will lose the 'real-time' link - am I right? Second, given that I can see my Access database has a visible table, why am I getting the error?
The easiest and quickest way is to create a VBA macro that populates your SQL database from Outlook emails. You can build the table structure according to your needs and extract the required information from Outlook using VBA. I'd suggest processing emails in chunks using the Find/FindNext or Restrict methods of the Items class, so that you don't hit the reference counter limit. The MailItem properties are described on MSDN.
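As for the table structure, something like the following is a reasonable starting point (a sketch only; the table and column names are illustrative, not prescribed):

    -- A minimal knowledge-base table for the extracted emails
    -- (names and sizes are illustrative)
    CREATE TABLE EmailKnowledgeBase (
        EmailID      INT IDENTITY(1,1) PRIMARY KEY,
        ReceivedTime DATETIME,
        SenderName   NVARCHAR(255),
        Subject      NVARCHAR(255),
        Body         NVARCHAR(MAX)  -- searchable via LIKE or a full-text index
    );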
BTW, the internal store (if you use cached mode) in Outlook acts like a database. So why do you need to introduce yet another database?

MS SQL database changes are not writing back to Umbraco CMS backend

I'm using the Umbraco v4.11.10 CMS and have made bulk changes to member expiry dates via scripts run directly against the MS SQL Server database. However, when I go back to the Umbraco backend, the changes aren't being shown. The changes were made to the cmsContentXml table in MS SQL.
Is there any way to force or sync the Umbraco CMS backend to show the values in the database? I understand Umbraco writes the data to XML, so I deleted the umbraco.config file, but that doesn't help.
The ContentXml table is a kind of half-way house for Umbraco's content and shouldn't be edited directly, as it gets overwritten as soon as you make any change to the object in question and then save it.
The underlying data for Umbraco content (including media, content, and members) is actually stored across several other tables using a very flexible object-graph structure, and it is not easily editable via straightforward SQL scripts.
Under the hood, it goes something like this:
A user edits an item (Member, Content, Media, etc.) and saves it.
[If it's a Content item] Umbraco creates a new version by copying the current version.
For each property of the item, Umbraco creates a new record in the cmsPropertyData table according to the datatype's storage type (int, date, nvarchar, ntext), linked to the item's record via the propertyId and itemId (see the query sketch after this list).
Once the data has been saved, an XML representation of the object is generated and deposited into the cmsContentXml table, overwriting any previous versions for that item.
If it is a Content item, and the item is published, Umbraco will then write that same XML representation to the Umbraco.config cache file.
This is a fairly simplistic representation, but it should suffice to give you an idea of the complexity involved in saving and modifying the data.
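If you just want to inspect what is actually stored for a member (rather than edit it), a read-only query against the property tables is safe enough. A rough sketch against the v4.x schema (table and column names from memory, and the node id is a placeholder, so verify against your own database):

    -- Look at the raw property values behind one member/content node
    -- (Umbraco 4.x schema from memory -- verify names before relying on this)
    SELECT pt.Alias,
           pd.dataInt, pd.dataDate, pd.dataNvarchar, pd.dataNtext
    FROM cmsPropertyData pd
    JOIN cmsPropertyType pt ON pt.id = pd.propertytypeid
    WHERE pd.contentNodeId = 1234;  -- placeholder node id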
I propose that you take advantage of Umbraco's extensive API to make bulk changes to records by writing a plugin. There may also be plugins available that do something similar; they are quite commonly open source, so you may be able to leverage one directly or obtain its code and base yours on it.

Microsoft Dynamics NAV 2009 How to remove objects outside of license?

During an upgrade from 2009 to 2016 I'm trying to remove objects relating to an old, discontinued product. The objects are not within the range of our license and consist of Forms, Tables, and Reports. When deleting, I'm faced with the well-known error:
"You do not have permission to delete the '[object name]' Table."
I've tried with my developer's license and the customer's license with no luck. Since the product no longer exists, there is no point keeping these objects around, and I need them gone for the upgrade process.
What is the best approach or technique for deleting objects that are not in the license?
UPDATE: How this issue was resolved
I got in contact with the product owner and explained my problem. They sent me a neat PowerShell script to run, which worked like a charm. Reading through the script, I can see that it uses the SQL cmdlets to select and delete the relevant data from the following SQL tables:
Objects, Object Metadata, Object Metadata Snapshot, Object Tracking, Object Translation, Permission.
This was the preferred method of the product owner, who used to develop this product. It should be applicable to all NAV objects. I have not yet successfully tried any of the answers below (more attempts to come). Hopefully this new information will give someone enough knowledge to provide a good answer.
One way, which several people have used successfully but which certainly cannot be recommended for a production system, is to simply delete these objects via SQL from the Object table and its supplemental tables. In the case of tables, you would also need to manually drop the underlying SQL table itself as well as its VSIFT views.
A somewhat better (probably) way is to change the number of the object via SQL and then delete the object via NAV.
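For what it's worth, both of those approaches boil down to statements like these (a rough sketch only: the 14,000,000 range and the new ID are placeholders, the column names are from memory, and you should back up the database first):

    -- NOT for production: delete discontinued objects directly via SQL
    -- (ID range is a placeholder; back up the database first)
    DELETE FROM [Object]          WHERE [ID] BETWEEN 14000000 AND 14099999;
    DELETE FROM [Object Metadata] WHERE [Object ID] BETWEEN 14000000 AND 14099999;

    -- Or: renumber one object into your licensed range, then delete it in NAV
    -- (Type 1 = Table in the Object table)
    UPDATE [Object] SET [ID] = 50099 WHERE [Type] = 1 AND [ID] = 14000001;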
The best way is to use the "killer objects" technique, which allows you to delete objects via a FOB import:
http://navisionary.com/2011/11/how-to-delete-bsolete-dynamics-nav-objects/
If you can find a partner who can provide you with such killer objects (they need a license that can create objects in the needed range), it solves your problem in a "clean" way.
If not, you may want to consider creating empty objects in the 50000 range in a test DB, changing their numbers to the obsolete range via SQL, exporting them as a FOB, and then importing them into your target DB with the "Delete" option.
Create a new empty database, export only the needed objects from the old database, and import them into the new one.
In NAV 2016 the application database can be separated from the database containing the data, so (I assume) you could just dismount the data database from the one with the old objects and mount it against a new application database. Not sure, to be honest.
It is due to the range of the license: for example, your development license may cover table IDs 7,000,000-7,000,200. If you try to delete a table with ID 20,000,000, you get that error.
The best solution is to leave these objects out when you do the upgrade: export all objects except the ones you want to delete.

How do you make your model's database portable in CakePHP?

I'm not very familiar with Cake, so here are my questions: we're developing an app on MySQL, but it may eventually need to be deployed to MSSQL or Oracle. How do we make sure that we won't have strange problems with our primary keys? In MySQL they are AUTO_INCREMENT columns, but IIRC in Oracle you need to use sequences... is there a way to make this a transparent change? Am I overthinking it?
Does anyone have experience with switching database vendors on a CakePHP app? Any pointers or things to keep an eye out for?
In the Cake database config file you choose your driver (see http://book.cakephp.org/view/40/Database-Configuration). Then, if you name your PK (which will also be your auto-increment column if using MySQL) id, Cake will automatically handle the auto-increment insertion. I would presume (NB: haven't tried Cake with anything else) that Cake will take care of auto-increment columns in something like Oracle as well.
Cake uses its own DB abstraction layer -- but the included abstractions cover quite a bit, and it will perform as specified (i.e. it'll handle your auto increment stuff for you).
In short, you're probably overthinking it. That said, I would mock up a little Cake app, then try switching databases behind it (change your db config and your app should automagically switch over).
HTH,
Travis
The following practices work great for me
I use Cake schemas (I tend to set up one schema file for each group of models, e.g. User, Role, and Profile might all be in one UsersSchema file).
Also take a look at the debuggable.com FixturesShell - it allows you to import test-case fixtures into the live database. Great for setting up that initial group of users and roles from the schema file.
Also, if you set your 'id' field to VARCHAR(36) instead of INT, Cake will automatically use UUID-style IDs (as sketched below). This means you have a far lower chance of ID collisions if you need to move the data to another application or server.
The fixtures shell also has a command-line tool for generating UUIDs (so you can add them to your $records variable in the fixture for insertion, etc.).
In summary: use the CakeSchema shell, the fixtures shell from debuggable.com, and UUID values for your IDs, and you get a portable structure-creation tool, a portable data-insertion tool, and a portable ID format.
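In MySQL terms, the id column for the UUID approach looks something like this (a sketch; 'users' is just an example table):

    -- A Cake-convention table whose primary key holds UUIDs
    -- (MySQL syntax; the table is illustrative)
    CREATE TABLE users (
        id       VARCHAR(36) NOT NULL,  -- Cake fills this with a UUID on save
        name     VARCHAR(255),
        created  DATETIME,
        modified DATETIME,
        PRIMARY KEY (id)
    );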
http://github.com/felixge/debuggable-scraps/tree/fd0e5ad625cb21f5ba16e6b186821a5774089ac7/cakephp/shells/fixtures
http://api.cakephp.org/class/schema-shell
You need to be using "cake schema" to manage your DB. This will handle all the DB-specific stuff when you create your database.
http://book.cakephp.org/view/735/Generating-and-using-Schema-files

Can you export packaging information (ERD or other data model) from Cognos 8.3?

I was wondering if there's a way to export package information from Cognos 8 from a regular user level or from the framework level.
For instance, I want the field names Cognos points to in the database, the data type, the description Cognos shows when you right-click a data element, etc.
Any suggestions?
(Unfortunately I'm not at my work computer right now.) Cognos saves everything in .xml files. I have an XML pretty-printer that I use on model.xml before and after edits, so that I can use windiff to see exactly what changes in the model. I have also used an XML editor on model.xml on several occasions for global search and replace.
Having said that, I'm not sure how much of the database schema you can infer directly from model.xml, but I suspect if you had a script that could read and walk model.xml, and connect to the database to describe the objects, you could get what you need.
The answer appears to be yes, for anything that supports CWM (the Common Warehouse Metamodel), but as for how...
One suggestion: ask IBM.
It appears that PowerDesigner 15 imports from XMI models.
