I want to get the logical data model from an XMI file I have. Is it possible through Sybase PowerDesigner?
Thanks & Regards,
Yashu Vyas
This reply may be dated, as it's from 2006.
Unfortunately, PowerDesigner doesn't really have a "Logical Data Model". What it does have is a Physical Data Model with a target. So, to reverse engineer from a DBMS-specific PDM to a logical PDM, you generate from PDM to PDM and select Logical as the target.
-- Mike Nicewarner [TeamSybase]
In addition, here is a link to the PowerDesigner docs on logical data models.
Related
I have a Conceptual Model in PowerDesigner with 150+ tables and a few thousand columns/fields in total. The columns have been documented separately outside of PowerDesigner, and I would like to import/insert these comments into each column/field. Is there a way to do this with a script rather than manually adding field-level comments/notes?
The second step will be to produce a data model report that lists tables/columns and includes these comments. I believe I have seen an answer to this here on StackOverflow, but if there are any newer tips, I welcome them.
I have dabbled with PowerDesigner scripts but haven't done any advanced scripting yet. I'm hoping to be able to read a CSV-like or XML file and map field names to internal table/field combinations to add the comments.
I have a report template which generates a report of all tables, but column comments are empty.
EDIT: The XML value is saved in an XML column in SQL Server along with the entire transaction.
I have a general question I suppose regarding the integrity of XML values stored in a SQL Server database.
We are working with very important data elements in regards to healthcare. We currently utilize a BizTalk server that parses very complex looped and segmented files for eligibility; BizTalk parses the file, pushes out an XML "value", does some validation, and then pushes it to the data tables.
I have a request from a Director of mine to create a report off of those XML values.
I have trouble doing this for a couple of reasons:
1) I would like to understand what exactly the XML has. Does this data retain its integrity regardless of whether we store the value in a table or store it in the XML?
2) Consistency - Will this data be consistent? Or does the fact that we are looking at XML values over and over, using XML values to join the existing table to the XML "table", make consistency an issue?
3) Accuracy - I would like this data to be accurate and consistent. I guess I'm having a hard time trusting that this data is available in the same form as the data in a table...
Am I being overcautious here? Or are there valid reasons why creating reports for external clients off these values would not be a good idea?
Let me know if I can provide anything else. I'm looking for high-level comments; code should be somewhat irrelevant, other than that we have to use a value in the XML to render other values in the XML for linking purposes.
Off the bat, I can think that this may not be consistent in that it's not set up like a DB table: no primary key, no duplicate checks, no indexing, etc. Is this true also?
Thanks in advance!
I think this article will answer your concerns: http://msdn.microsoft.com/en-us/library/hh403385.aspx
If you are treating a row with an XML column as your grain, the database will keep it transactionally consistent. With the XML type, you can use XML indexes to speed up your queries, which would be an advantage over storing this as varchar(max). Does this answer your question?
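For illustration, here is a minimal sketch of both points; the table, column, and XPath names are assumptions, not from the original question:

-- Hypothetical table: the XML column holds the parsed transaction.
CREATE TABLE dbo.EligibilityTransaction (
    TransactionId INT NOT NULL PRIMARY KEY,
    Payload XML NOT NULL
);

-- A primary XML index speeds up XQuery-based reads of the column.
CREATE PRIMARY XML INDEX IX_EligibilityTransaction_Payload
    ON dbo.EligibilityTransaction (Payload);

-- Shred a single value out of the XML for reporting.
SELECT t.TransactionId,
       t.Payload.value('(/Eligibility/Member/Id)[1]', 'varchar(20)') AS MemberId
FROM dbo.EligibilityTransaction AS t;

Rows written this way participate in the same transactions as any other column, which is what keeps the stored XML consistent with the rest of the row.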
Hey, can someone tell me what the Field, File, and Index .ddf files do in Pervasive? Do they have to be changed or updated when a table definition changes? Any insight would be GREATLY appreciated.
Cheers.
FILE.DDF links the underlying Btrieve Data files to a logical table name.
FIELD.DDF uses the File Id from FILE.DDF to define all of the fields including offsets, data types, etc for each table.
INDEX.DDF defines the indexes on the fields in FIELD.DDF.
They are the field information metadata used by PSQL to access the data files through a relational access method (ODBC, OLEDB, ADO.NET, etc.).
They do have to be changed if the underlying data file is changed through Btrieve. If the table definition changes through SQL (like ALTER TABLE statements), the Pervasive Control Center, DTI (Distributed Tuning Interface), DTO (Distributed Tuning Object), PDAC, ActiveX, or DDF Builder, then the DDFs are updated automatically.
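As a minimal sketch (the table and column names are hypothetical), a change made through the relational engine keeps the DDFs in sync without any manual editing:

-- Issued via ODBC or the Pervasive Control Center; the engine updates the
-- FILE.DDF, FIELD.DDF, and INDEX.DDF entries for this table automatically.
ALTER TABLE Billing ADD MemberCode CHAR(10);

A change made directly to the Btrieve file bypasses this, which is why the DDFs then have to be corrected by hand (or with DDF Builder).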
I am new to Hibernate. Now I have a problem. I have 2 tables (Timetable and Timetable_backup) with a similar structure, because the Timetable_backup table is just a backup of the Timetable table, which contains current data. Now I do not know how to get all data from the past up to now. In Hibernate, we cannot use UNION like in SQL to query. So I tried to map the 2 tables to 1 entity using inheritance and @MappedSuperclass, but it does not work for me. Please help me with this. If the context is not clear, please tell me.
Kind Regards
Nathan
Probably what you want is something like Envers, a plugin for Hibernate that takes care of versioning records in a table. You just use a couple of annotations in your classes, and it provides an interface to look for past records, among other things.
You cannot do it.
The typical workaround is to map the entity to the main table, and use native SQL queries to access the backup table.
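For example, a hedged sketch of such a native query (table and column names are assumptions, not from the question); in Hibernate you would run it through something like session.createNativeQuery:

-- Combine current and backup rows in one result set,
-- since HQL itself has no UNION:
SELECT id, start_time, end_time FROM timetable
UNION ALL
SELECT id, start_time, end_time FROM timetable_backup;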
By this time you might have found answer or workaround to the problem you have posted. If possible, can you please post it here so that it will help others.
Anyway, I found the following link, which explains how to create tables using a single POJO: Mapping same POJO to more than one table in Hibernate.
As Hibernate does not support UNION, I extracted results from 2 queries (the main table as well as the backup table) and used listTimeTable.addAll(listbackTimeTable); this gives the same result as a UNION ALL operation.
Once again, please post your implementation for the benefit of this community...
Thanks,
Shirish
This answer on a question on SO says
... you can read a LONG from a remote database, but you can't read a CLOB
I did not find anything about this on the internet. Is it true? Any documentation or citations for this would be helpful.
The answer is correct in a certain context. For simple select statements over a DB link, you'll get this error:
ORA-22992: cannot use LOB locators selected from remote tables.
From the errors manual:
Cause: A remote LOB column cannot be referenced.
Action: Remove references to LOBs in remote tables.
I also had trouble finding definitive documentation on this... but we just ran into the same issue in our data warehouse. However, there are several workarounds available, pulling the data over or creating a view, for example.
@Peter Ilfrich: Doesn't that throw an exception when trying to access any CLOBs over 4000 bytes?
This is a little more convoluted, but it means you can safely pull back small CLOBs (< 4000 bytes) over a dblink.
select dbms_lob.substr@<link>((select <columnName> from dual@<link>), 4000, 1)
from <table>@<link>
where dbms_lob.getlength@<link>((select <columnName> from dual@<link>)) <= 4000;
Reading a CLOB (or a BLOB) over a dblink is possible with this PL/SQL package:
https://github.com/HowdPrescott/Lob_Over_DBLink
If both DB schemas are in the same Oracle instance, you can use the following workaround:
select (select <columnName> from dual) <columnName> from <table>@<link>
This will return the same result as if you were accessing a local LOB column.
Oracle 12.2 finally added support for distributed LOBs. We can now read data types like CLOB and XMLType over database links without any workarounds.
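On 12.2 and later, a plain query like this sketch (names are illustrative) works directly:

-- No workaround needed on Oracle 12.2+; the CLOB comes across the link.
select id, clob_col from mytable@remote_link;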
I had the same trouble yesterday. This is my solution: create a remote view on the remote table, and when it comes to the CLOB columns, use to_char(), such as to_char(col2). Then you can select data from the view. It may not be a good solution, but it works.
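A minimal sketch of that workaround, assuming illustrative names and CLOB values short enough to fit in a VARCHAR2 (to_char() raises an error on longer values):

-- On the remote database: expose the CLOB as plain character data.
create or replace view v_mytable as
select id, to_char(col2) as col2_text
from mytable;

-- On the local database: no LOB locator crosses the link.
select id, col2_text from v_mytable@remote_link;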
You can create a materialized view from the remote table and then use that for your needs:
https://webcache.googleusercontent.com/search?q=cache:LG2eG1gThV8J:https://community.oracle.com/thread/471047%3Fstart%3D0%26tstart%3D0+&cd=2&hl=en&ct=clnk&gl=ir
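A hedged sketch of the materialized-view approach, with illustrative names:

-- Pull the remote rows, including the LOB, into a local snapshot:
create materialized view mv_mytable as
select id, clob_col from mytable@remote_link;

-- Query the local copy; ORA-22992 no longer applies.
select id, clob_col from mv_mytable;

The snapshot has to be refreshed to pick up new remote data, so this fits reporting better than live lookups.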
When the table behind the dblink is an Oracle Big Data external table (in my case, an external table over a Hive table), you need to create a materialized view over it and use that materialized view with the above-mentioned PL/SQL package: https://github.com/HowdPrescott/Lob_Over_DBLink
Works for CLOBs > 4000 bytes. Tested only with CLOBs!