Database Driven Model Mapping in Java

So, I have a project where I pull data from a dozen different sources: some are database objects, but most often the data comes in various JSON or XML formats. I need to take this disparate data and pull it into one single, clean, managed object that we control.
I have seen dozens of posts on various object-to-object mapping tools, Orika being one of them. The problem is that Orika, like many of these tools, still needs solid classes defined to do the mapping. If the mapping changes, I have to change my class, re-commit it, then build and deploy new code; testing would also have to be done, like for any code change. So maybe these tools aren't a great solution for me.
Then I started looking at some sort of database-driven mapping, where I store a source, a field, and the new field or function I want to map it to. With a database-driven tool, I could modify the fields in the database and everything would keep working as it should. I could always create a front-end to manage these mappings.
So, with that, I am asking: is there any database-driven tool where I can do field-to-field, or fields-to-function, mapping? Drools was my first thought, but I don't know if it is my best choice; maybe it's overkill for my needs. I am looking for advice on what might be the best tool to do my mapping.
Please let me know if you need any more information from me, and thanks for all the help!

Actually, Orika can handle dynamic data sources like that; there is even an example of how to convert from an XML Element (DOM API) or even a JsonObject.
You can use an XML parser to convert your data into an Element object, or Jackson to get a JSON object.
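For the parsing step, something like this works (a minimal sketch with placeholder file names; note that Jackson's tree type is JsonNode rather than JsonObject):

```java
// Producing the dynamic source objects: a DOM Element from XML,
// and a Jackson tree node (JsonNode) from JSON.
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class PayloadParsers {

    public static Element parseXml(File xml) throws Exception {
        return DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(xml)
                .getDocumentElement();
    }

    public static JsonNode parseJson(File json) throws Exception {
        return new ObjectMapper().readTree(json);
    }
}
```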
Then define your class map between your "canonical" Java class and these dynamic "classes".
See http://orika-mapper.github.io/orika-docs/advanced-mappings.html, in particular the section on customizing the PropertyResolverStrategy.
Here is an example of Orika mapping a MongoDB DBObject to a Java bean:
https://gist.github.com/elaatifi/ade7321a1405c61ff8a9
However, converting JSON is more straightforward than XML (the semantics of attributes, children, and custom tags do not match JavaBeans).
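To get the database-driven part the question asks for, the class map itself can be built from rows in a mapping table. Here is a rough sketch of that idea, assuming a hypothetical field_mapping table and hypothetical SourceRecord/CanonicalRecord classes (none of these names come from the original post):

```java
// Builds an Orika mapper whose field pairs come from a database table,
// so a mapping change is a row update rather than a code change.
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import ma.glasnost.orika.MapperFacade;
import ma.glasnost.orika.MapperFactory;
import ma.glasnost.orika.impl.DefaultMapperFactory;
import ma.glasnost.orika.metadata.ClassMapBuilder;

public class DbDrivenMapper {

    public static MapperFacade buildMapper(Connection db) throws SQLException {
        MapperFactory factory = new DefaultMapperFactory.Builder().build();
        ClassMapBuilder<SourceRecord, CanonicalRecord> builder =
                factory.classMap(SourceRecord.class, CanonicalRecord.class);

        // Hypothetical table: field_mapping(source_field, target_field)
        try (Statement st = db.createStatement();
             ResultSet rs = st.executeQuery(
                     "SELECT source_field, target_field FROM field_mapping")) {
            while (rs.next()) {
                builder.field(rs.getString("source_field"),
                              rs.getString("target_field"));
            }
        }
        builder.register();
        return factory.getMapperFacade();
    }
}
```

Rebuilding the MapperFacade when the table changes picks up new mappings at runtime with no redeploy; only a brand-new target field would still require a class change.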

Related

How to store data with version control and make it easy to edit and use in applications

I want to store a set of data (like drop tables for a game) that can be edited and "forked" like an open source project (it's just data, so if I stop updating it, someone can continue with it). I also want that data to be easy to use in code (for example, the same way you can query a database for your values), for people who make companion apps for said game.
What type of data storage would be the best for this scenario?
EDIT: By type of data storage I mean something like XML or JSON, or a database like Access or SQL, as well as NoSQL.
It is a very general question, but I'm getting the feeling that you're looking for something like GitHub. If you don't know what that is, then you should probably look into it. GitHub supports Subversion clients as well, lets you edit your data quite easily, and lets you look back at previous versions. Hope this helps!
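If the data lives as plain JSON files in such a repository, companion apps can consume it with any JSON library. A minimal sketch with Jackson (the DropEntry shape and file name are made up for illustration):

```java
import java.io.File;
import java.util.List;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

public class DropTableLoader {

    // Hypothetical shape of one entry in a versioned drop-table file.
    public static class DropEntry {
        public String item;
        public double chance;
    }

    public static void main(String[] args) throws Exception {
        // The JSON file lives in the Git repo, so every change is a commit
        // that can be reviewed, reverted, or forked like any other code.
        List<DropEntry> drops = new ObjectMapper().readValue(
                new File("drop-tables/goblin.json"),
                new TypeReference<List<DropEntry>>() {});
        drops.forEach(d -> System.out.println(d.item + " @ " + d.chance));
    }
}
```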

create.js, CreatePHP and CakePHP

We are trying to implement in-page editing for our CakePHP app. We would like to use create.js for the frontend and CreatePHP to handle the connection between create.js and CakePHP. I have been doing a lot of research on RDFa, and I am generally baffled by how all of this links together.
What I have:
Editable interface
Endpoints via actions in cakephp
What I need:
A way to convert the data sent by create.js to my database structure and a way to send data to create.js for rendering.
I have gotten to the point in CreatePHP where you are supposed to create your own mapper. I don't know what the mapper should contain. The documentation mentions that it has built-in mappers (Midgard\CreatePHP\Mapper?) but I don't know how to load those either.
I have read the documentation but it doesn't give details on how to accomplish these tasks.
Thank you for your help with the two following questions.
1. How can I convert my data from create.js to Cake and then back again for the views? (CreatePHP is a possible solution, but it doesn't have to be.)
2. How do I create a mapper for CreatePHP, or where can I find information to learn how to create a mapper for my setup?
Yeah, the documentation is unfortunately rather sparse. I will create an issue on CreatePHP linking to this post, to give some hints on how the documentation could be improved.
Let me try to explain how things work:
To convert the data from the REST call to your model, you indeed need an RdfMapper instance. See the setup section of the tutorial for how to bootstrap that. The bundle comes with mappers for Doctrine, which you can read for inspiration if you do not use Doctrine. I recommend extending AbstractRdfMapper in that case.
To render the RDFa, you need to configure which fields of your class should be which RDF type. You can either use the array mapper as in the tutorial, use the XML mapping, or write your own RdfDriver.
The whole process is working fine in the Symfony2 CreateBundle.

Salesforce: script to create custom object and fields

Is there a way to create custom objects and fields by using a script or an IDE?
Salesforce is very easy to use; however, it's very time-consuming to create so many fields through the web interface. So I'm just wondering if there's a way to use a script or an IDE to create objects and fields in Salesforce.
You're looking for the Metadata API, or already-developed tools that use the Metadata API.
http://www.salesforce.com/us/developer/docs/api_meta/Content/meta_intro.htm
http://www.salesforce.com/us/developer/docs/api_meta/index.htm
Using it directly will still require some development, though, which may not save you much time: you get the metadata as XML, but you would still need to process it into whatever you want to achieve.
It also depends somewhat on the nature of what you want to do. I, for instance, had a requirement today for 150 custom labels based on an input file. It was much faster to generate the metadata XML than it would ever have been in the web interface. I deployed the metadata using the Force.com IDE.
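As a rough illustration of that kind of generator (the input file format is made up, and real code should XML-escape the values; the element names follow the CustomLabels metadata type):

```java
// Generates a CustomLabels.labels metadata file from a simple
// "apiName,labelText" CSV, one label per line (hypothetical input).
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Paths;

public class LabelGenerator {

    public static void main(String[] args) throws Exception {
        try (PrintWriter out = new PrintWriter("CustomLabels.labels", "UTF-8")) {
            out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
            out.println("<CustomLabels xmlns=\"http://soap.sforce.com/2006/04/metadata\">");
            for (String line : Files.readAllLines(Paths.get("labels.csv"))) {
                String[] parts = line.split(",", 2);
                out.println("  <labels>");
                out.println("    <fullName>" + parts[0] + "</fullName>");
                out.println("    <language>en_US</language>");
                out.println("    <protected>false</protected>");
                out.println("    <shortDescription>" + parts[0] + "</shortDescription>");
                out.println("    <value>" + parts[1] + "</value>");
                out.println("  </labels>");
            }
            out.println("</CustomLabels>");
        }
    }
}
```

The resulting file can then be deployed with the Force.com IDE or the Ant migration tool.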

How can I import Landslide CRM data into Salesforce?

I'm importing some customer information from Landslide CRM into Salesforce.
Does anyone have advice on the best methodology for doing the import? It seems like the Apex Data Loader is the best way to go, but I don't know if there are any issues with handling the objects in question, or if there might be a specific tool or script to perform this migration. Any experience with this import specifically, or with importing data into Salesforce in general, would be appreciated.
Importing data into Salesforce can be achieved in multiple ways, depending on the type of data and the requirements you have.
The first thing to do is get your data into CSV files, so you'll need to find a way to export the data first. For UTF-8 encoded data, don't use Excel; use something like OpenOffice (only required if you have UTF-8 characters).
If it's account and contact data, for example, there is an import wizard available under Setup > Administration Setup > Import Business Accounts/Contacts.
The next option, as you say, is to use the Apex Data Loader. This is probably the best approach.
The first thing, and this is critical for big migrations, is to create a field on your Account object which will serve as a unique field for reference purposes. When creating this field, set it as an External ID field and populate it with a unique reference for each of your accounts; the same goes for anything else that will be a parent. (You'll see why shortly.)
Next, use the Insert option in the Data Loader to load the data, mapping all the fields, especially the External ID.
Now, when you upload child objects, use the Upsert option and map your Account ID via the External ID created earlier. This will match the accounts using your unique ID instead of you having to look up the Salesforce ID, which saves a lot of time.
Repeat the same for other objects and you should be good to go.
Apologies for the lack of structure here... doing this while at work and don't have a lot of time, but I hope this helps.
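For reference, the same external-ID upsert can also be done programmatically through the SOAP Partner API. A minimal sketch using the WSC client library, assuming a hypothetical Landslide_ID__c external ID field on both Account and Contact:

```java
// Upserts a parent Account and a child Contact keyed on an external ID,
// so no Salesforce IDs are needed anywhere in the import.
import com.sforce.soap.partner.Connector;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.soap.partner.UpsertResult;
import com.sforce.soap.partner.sobject.SObject;
import com.sforce.ws.ConnectorConfig;

public class LandslideImport {

    public static void main(String[] args) throws Exception {
        ConnectorConfig config = new ConnectorConfig();
        config.setUsername("user@example.com");
        config.setPassword("passwordPlusSecurityToken");
        PartnerConnection conn = Connector.newConnection(config);

        SObject account = new SObject();
        account.setType("Account");
        account.setField("Name", "Acme Corp");
        account.setField("Landslide_ID__c", "LS-0001");

        // The child points at its parent through the same external ID.
        SObject parentRef = new SObject();
        parentRef.setType("Account");
        parentRef.setField("Landslide_ID__c", "LS-0001");

        SObject contact = new SObject();
        contact.setType("Contact");
        contact.setField("LastName", "Smith");
        contact.setField("Landslide_ID__c", "LS-0002");
        contact.setField("Account", parentRef);

        UpsertResult[] results =
                conn.upsert("Landslide_ID__c", new SObject[] { account, contact });
        for (UpsertResult r : results) {
            System.out.println(r.isSuccess()
                    ? "ok, id=" + r.getId()
                    : r.getErrors()[0].getMessage());
        }
    }
}
```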
The data loader works great for most types of imports. The one suggestion I would give you is to create a new custom field on your target objects (presumably Account and Contact) called "Landslide ID" or similar, identify it as an external ID field, and then import the primary keys from your source system into this field (along with the "real" data).
Doing this achieves a couple of things. First, you have an easy, unique link back to the source data for troubleshooting or tracing back to the source system. Second, if you find yourself in a situation where you need to import more fields or related data from the original source system, you'll be able to do so in an easy and correct way. It's just a good standard practice to adopt when doing data migrations: it's almost no additional effort and can save you many hours in the future.

Pulling in Dynamic DBF Columns

I have been asked to pull in columns from a DBF file for use in a web app. I am using ASP.NET and C#, and I was using a DataReader to populate the class variables. The problem is that the DBF file can change: columns are sometimes added or deleted, so my class would have to change every time the data source file changes in order to represent the columns. Is there a way around this?
There are lots of ways of addressing this issue; your problem is handled by a whole class of solutions known as Object Relational Mapping, or ORM. The best known of these is Hibernate in the Java world and NHibernate in .NET. This does mean a rebuild with each DB change, though; I use code generation to solve that problem, building the classes and mapping files directly from the DB. Then you get into TDD and CI, to make sure you haven't broken anything, and then...
However, if you want something quick and dirty, you could create a dictionary within your classes and store any extra columns in there. It's completely flexible, but the extra columns aren't defined within the class itself.
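A minimal sketch of that dictionary idea, shown in Java for illustration (the same pattern translates directly to C# with a Dictionary<string, object>):

```java
// Reads whatever columns the source happens to have today, using the
// result set's metadata instead of a hard-coded class definition.
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DynamicRecord {

    private final Map<String, Object> columns = new LinkedHashMap<>();

    public Object get(String column) {
        return columns.get(column);
    }

    public static List<DynamicRecord> readAll(ResultSet rs) throws SQLException {
        ResultSetMetaData meta = rs.getMetaData();
        List<DynamicRecord> records = new ArrayList<>();
        while (rs.next()) {
            DynamicRecord rec = new DynamicRecord();
            for (int i = 1; i <= meta.getColumnCount(); i++) {
                rec.columns.put(meta.getColumnName(i), rs.getObject(i));
            }
            records.add(rec);
        }
        return records;
    }
}
```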
I just used a few try/catch blocks to solve this.
