I have a requirement to export to Microsoft Project from my company's program. As far as I've seen there are a few options:
Use one of the interchange formats, e.g. xml, mpx, mpd
Use the COM object model and automation to write the file
Buy a library that can write the files
The interchange formats have the problem that they bring up an import dialog when you open them; if you want to save in a different format you need to use Save As, and you have to select the file format before opening them. In other words, it's not a smooth experience for the customer.
Automation requires everyone who exports from our program to have MS Project installed, which is not acceptable.
The only library I could find was Aspose.Tasks, which only writes to the Project XML format.
Does anyone know of any library that can write native mpp files? I've seen a post from Microsoft that they have no intention of documenting the file format, but there are some Project Viewers out there so someone must have done something with it? (Although reading from it can be done with an OleDB provider now that I think about it).
Anyone? Write MPP files?
The .NET Framework has the Microsoft.Office.Interop.MSProject set of classes. You could use them directly if your app is .NET, or write a DLL in .NET and then reference that from your app.
I have never had reason to import or export Project files in non-native format but for kicks I just created and exported a simple Project in .XML format. Sure enough, when I open that file using the user interface I have to decide if I want to:
a) Open as a new file,
b) Append the data to the active project, or
c) Merge the data into the active project.
But, if I open the file using a VBA statement:
FileOpenEx ("Project1.xml")
I'm not troubled by the multiple-choice exam. If the default option provided by FileOpenEx is appropriate you could concoct a very simple procedure to which you could direct your users. I'm not sure if this meets your need?
"In traditional file processing, the structure of data files is embedded in the application programs, so any changes to the structure of a file may require changing all programs that access that file. By contrast, DBMS access programs do not require such changes in most cases. The structure of data files is stored in the DBMS catalog separately from the access programs. We call this property program-data independence."
The text above is taken from the book Fundamentals of Database Systems. I didn't get the part about traditional file processing; can somebody please explain (an example would be appreciated)?
I'll give you a simple example.
Microsoft Excel used to save its files in a proprietary binary format. In practical terms, this meant that you could only work on those files using Excel.
But now, Excel supports an open document format in XML that is text-based, which allows other programs, like the OpenOffice SDK, to interact with it. So you no longer need to rely on Excel to work with open document format Excel files.
Our current database system is a Clipper DOS application. The database inside its folder is fragmented/divided into many parts. I want to decrypt the database so that I will have only one database in all and avoid reshuffling of data. I've attached a screenshot of the file folder. The database is in .DBF format.
Screenshot of files
Often you can decompile the CLIPPER exe file to source code and work from the .prg. I've done it many times. The program to use is called WALKYRIE.
In Clipper and FoxPro for DOS, a .dbf file is a simple table file.
If you want to use them as a database with many tables in one unit, you can import these tables into an MS SQL database and/or an MS Access database.
I see that you got several answers. Most are partially right. Let's address these one at a time:
All those files essentially comprise the "database" for the application you're using. They could be used by other applications as well. Besides having a lot of files, what is the problem you're trying to solve?
People mentioned indexes. You can generally ignore these. They are there primarily to make access to the data files faster. Any properly written Clipper application will recreate these if they're missing or corrupted. You could test this by renaming one, running the app, and seeing what happens. If it doesn't recreate it you can rename it back. Not replacing missing index files would be unusual behavior.
The DBF file format is binary, but barely. Most of what's in a DBF is text and is readable with an editor. But there's no reason to do so - I'm sure there are several free DBF utilities out there to read DBF files. Getting the structure of the files could be very helpful.
Getting the data out of the files would also be fairly simple with a utility. If you look up the DBF format you could even write one fairly easily in Clipper, any other language that uses DBF files, or in something like Python. Any language that can open and write files, really. It's not hard - any competent developer could do this in a matter of hours. Much less if you're using Clipper or another language that natively reads DBF files.
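To illustrate, here is a minimal sketch in C (any language with file I/O would do) that dumps a DBF's structure. It assumes the common dBase III/Clipper header layout - record count at offset 4, header and record sizes at offsets 8 and 10, and 32-byte field descriptors terminated by a 0x0D byte - and is only a starting point, not a full DBF reader.

/* dbfdump.c - print record count and field layout of a .dbf file.
   Assumes a dBase III/Clipper-style header; memo (.dbt) fields are
   reported but not read. */
#include <stdio.h>
#include <string.h>

int main(int argc, char *argv[])
{
    unsigned char hdr[32], fld[32];
    unsigned long records;
    unsigned int header_size, record_size;
    FILE *f;

    if (argc < 2) { fprintf(stderr, "usage: dbfdump file.dbf\n"); return 1; }
    f = fopen(argv[1], "rb");
    if (!f) { perror(argv[1]); return 1; }

    if (fread(hdr, 1, 32, f) != 32) { fprintf(stderr, "short header\n"); return 1; }

    /* All multi-byte header values are little-endian. */
    records     = hdr[4] | ((unsigned long)hdr[5] << 8) |
                  ((unsigned long)hdr[6] << 16) | ((unsigned long)hdr[7] << 24);
    header_size = hdr[8] | (hdr[9] << 8);
    record_size = hdr[10] | (hdr[11] << 8);

    printf("records: %lu, header: %u bytes, record: %u bytes\n",
           records, header_size, record_size);

    /* Field descriptors are 32 bytes each; the list ends with 0x0D. */
    while (fread(fld, 1, 32, f) == 32 && fld[0] != 0x0D) {
        char name[12];
        memcpy(name, fld, 11);
        name[11] = '\0';
        printf("%-11s type=%c length=%d decimals=%d\n",
               name, fld[11], fld[16], fld[17]);
    }
    fclose(f);
    return 0;
}

With the structure in hand, writing the actual record data out (each record is record_size bytes of mostly plain text) is the easy part.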
Most people create dBase/Clipper programs with relational data, like SQL Server. Where SQL Server has tables that relate to each other, dBase/Clipper has a file for each "table." This isn't a requirement, but it was almost certainly done this way.
Given that, if you get the table structures through a utility or by reading the headers in an editor (don't save them from an editor!) you could quite likely recreate the database schema (i.e. the map of the data). Once you have that it's fairly trivial to get the data into another type of database (SQL Server, Access, or whatever you like to use). If none of the files are too large it's conceivable to put all the files into Excel sheets. It really depends on what you want to do with it.
As others have said, you may be able to get the code back using Valkyrie. Some people have used it very successfully. I don't know where you get it and I've never used it. Why do you not have the code? If this is a commercial application you likely should not have it. If it's a custom app, whoever wrote it or whoever paid to have it written should have the code.
Again, it's not clear to me what problem you're trying to solve. But there are many options for doing something with those DBF files. Fortunately they are one of the easier to read data formats you could be working with.
Let me know if you have any questions. Apologies for the typos that are no doubt scattered throughout this reply.
You can sort of get an idea of how they relate to each other by opening the index files they use (.NTX files). If you have the DBU utility (executable) around, you can open the DBF and load the index (NTX). LibreOffice Calc is also able to open DBFs (haven't tested .NTX).
If you open the .NTX on a text editor you will see the indexes in the beginning.
I open them with Access, but I can save the data using a PrintFill program.
Let me explain my problem: I have recently started using CortexDB, a NoSQL database for data analysis. I have read the (poor) documentation at https://docs.cortex-ag.com/en/CortexDB/CortexDB/ and obtained a free license to evaluate the program. As the documentation is unclear, I have some questions:
1) How do I create a database?
2) How can I import a database contained in an Excel file (.csv)?
3) How do I create charts or analyses of the data entered?
Thanks
Because the question is very old, I hope I can still help you.
First of all: you should download the latest release of the free version (simple registration and download).
1) If you downloaded the free version, you got the server and two databases. A server process handles one database; for a second database you have to start a second server (on a different port, of course). If you start the free version you should have an empty database (or the filled and configured demo DB). If you want to create a completely new one without any predefined configuration, you have to start the server process from the command line with the parameter -n (ctxserver64 -n). If you did that, you have to configure everything by hand with the tool 'remote admin'.
2) This question is not clear to me. Do you mean how to import a CSV file into a CortexDB, or how to get the database content into an Excel file?
If you want to import the CSV file into a CortexDB, the easiest way is to use the tool CortexImplex. It's completely explained in the online docs (https://docs.cortex-ag.com/en/CortexImplex/CortexImplex-Basics/).
If you want to export datasets as a CSV file, the only thing you have to do is configure a list in the CortexUniplex as a view for your datasets and export them as CSV (you'll find the export function in the list menu).
3) I would do the charting with d3.js. For this you can use the so-called 'DataService' of the CortexUniplex. It's a kind of API for posting requests and getting JSON objects. If you have a completely configured UniPlex, you can use all of your configuration as JSON objects for other apps (for example charts or an individual application).
The full version has a simple dashboard inside the CortexUniplex. Maybe the vendor offers it in the free version.
By the way: it's always good to write an email to the info address. Because this database is not so famous and well known, the guys are very helpful. Or contact them via Twitter or other channels (see the bottom of the cortex-ag.com webpage).
I am helping out an organization which is planning to change its membership system. Right now their system is developed in Plone and all their data is in a Data.fs file.
Their system is down for the moment and it would take some time and effort to get it up and running.
Is there a way to get the data out of the database into a standard format such as CSV files or SQL? Or do they need to get the system up and running first and export the files from "within" Plone?
Thanks for your help and ideas!
Kind regards,
Samuel
The Data.fs file is an object-oriented database file, written by a framework called the ZODB. The data within it represents Python instances, laid out in a tree structure.
You could open this database from a python script, but in order for you to make sense of the contained structures, you'll need access to the original class definitions that make up the stored instances. Without those class definitions all you'll get is placeholder objects (Broken objects) that are of no use at all.
As such, it's probably easier to just get the Plone instance back up and running, as it'll be easier to export the exact data you want out if you have things like the catalog (basically a specialized database index) to build your export.
It could be that this site is down because of something trivial, something we can help you with here on Stack Overflow, on the Plone users mailing lists, or in the #plone IRC channel. If you do get it up and running and have some details on what you are trying to export, we certainly can help.
You'll need to get the system up and running to export data. Data in the data.fs file is stored as Python pickles and is not intelligible to "outside" systems.
As the others have pointed out before, your best course would be to get Plone running again. After doing so, try csvreplicata to export existing data to CSV format. And for user accounts, try atreal.usersinout.
If you need professional help, you can search for available providers from http://plone.org/support/providers
For free support, post specific problems here.
Recently I managed to export a Plone 4 site to SQLite using SQLExporter: http://plone.org/products/proteon.sqlexporter. But you need to get your Plone instance working first to use it.
A call to all Win32 developers... I'm developing an application in C using plain Win32. I wanted to ask about Windows development standards regarding these things:
Is there a standard Windows error log API? For example, if my client uses my app and it crashes, I would like them to send me the error log, and I would prefer it to be in a standard location so they can maybe access it with a standard Windows log utility.
My app needs to store settings information. I think the registry is the standard utility for this task. Is that right?
My app needs to store and retrieve files that it downloaded from the internet - images, executables etc. Is Application Data/myapp the standard location to store this type of information?
My app needs a very straightforward database - I'm using CSV for this. I basically need to store and retrieve this type of data, so I'm just serializing a .csv file from Application Data/myapp. Is there a better Windows standard way of doing this?
That's all for now :). Thanks!
Is there a standard Windows error log API?
There is the Windows Event Log, but I don't think you want a typical user having to go into it to extract your logged information.
You probably don't want to log by default, unless you're shipping questionable pre-release code. When a user is experiencing problems, then you have them turn logging on. In this case, I recommend placing the file somewhere that typical users have experience with, like My Documents.
By the way, if you're writing a standalone application and want the best possible information in the event of a crash, look into minidumps. Here is a Codeproject sample.
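In case it's useful, here's a minimal sketch of that approach: install an unhandled-exception filter and write the dump with MiniDumpWriteDump from dbghelp (link against Dbghelp.lib). The file name crash.dmp is just a placeholder; in practice you'd build a path under the user's profile or App Data.

/* Sketch: write a minidump when the process crashes. */
#include <windows.h>
#include <dbghelp.h>

static LONG WINAPI CrashFilter(EXCEPTION_POINTERS *ep)
{
    HANDLE hFile = CreateFileA("crash.dmp", GENERIC_WRITE, 0, NULL,
                               CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile != INVALID_HANDLE_VALUE)
    {
        MINIDUMP_EXCEPTION_INFORMATION mei;
        mei.ThreadId = GetCurrentThreadId();
        mei.ExceptionPointers = ep;
        mei.ClientPointers = FALSE;

        MiniDumpWriteDump(GetCurrentProcess(), GetCurrentProcessId(),
                          hFile, MiniDumpNormal, &mei, NULL, NULL);
        CloseHandle(hFile);
    }
    return EXCEPTION_EXECUTE_HANDLER;
}

/* Call this once, early in WinMain. */
void InstallCrashHandler(void)
{
    SetUnhandledExceptionFilter(CrashFilter);
}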
My app needs to store settings information
Yep, registry.
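For what it's worth, a minimal sketch of writing a setting under HKEY_CURRENT_USER; the key Software\MyApp and the value name WindowWidth are placeholder names, not anything standard.

/* Sketch: store a simple DWORD setting under HKCU. */
#include <windows.h>

void SaveWindowWidth(DWORD width)
{
    HKEY hKey;
    if (RegCreateKeyExA(HKEY_CURRENT_USER, "Software\\MyApp", 0, NULL,
                        REG_OPTION_NON_VOLATILE, KEY_SET_VALUE, NULL,
                        &hKey, NULL) == ERROR_SUCCESS)
    {
        RegSetValueExA(hKey, "WindowWidth", 0, REG_DWORD,
                       (const BYTE *)&width, sizeof(width));
        RegCloseKey(hKey);
    }
}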
My app needs to store and retrieve files
Yes, App Data. Just be sure to use SHGetFolderPath and CSIDL_APPDATA.
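Roughly like this, for example. MyApp is a placeholder folder name, and SHGetFolderPathA expects a buffer of at least MAX_PATH characters.

/* Sketch: build <App Data>\MyApp and make sure it exists. */
#include <windows.h>
#include <shlobj.h>

BOOL GetAppDataDir(char *path, size_t size)
{
    if (size < MAX_PATH) return FALSE;
    if (FAILED(SHGetFolderPathA(NULL, CSIDL_APPDATA, NULL,
                                SHGFP_TYPE_CURRENT, path)))
        return FALSE;
    lstrcatA(path, "\\MyApp");          /* placeholder app folder name */
    CreateDirectoryA(path, NULL);       /* no-op if it already exists */
    return TRUE;
}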
My app needs a very straight-forward database
There's nothing wrong with CSV for simple data. You could store the data in XML and use MSXML to process it, if you prefer. I've used SQLite in the past when I needed fast, lightweight storage of more complicated data.
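If you stay with CSV, plain stdio is usually enough. A rough sketch, with made-up file and field names, and no handling of quoting or commas inside fields:

/* Sketch: append a record to a CSV "database" and read it back.
   "downloads.csv" and the columns are placeholders. */
#include <stdio.h>
#include <string.h>

void AppendRecord(const char *csvPath, const char *name, const char *url)
{
    FILE *f = fopen(csvPath, "a");
    if (f) {
        fprintf(f, "%s,%s\n", name, url);
        fclose(f);
    }
}

void DumpRecords(const char *csvPath)
{
    char line[1024];
    FILE *f = fopen(csvPath, "r");
    if (!f) return;
    while (fgets(line, sizeof(line), f)) {
        char *comma = strchr(line, ',');
        if (comma) {
            *comma = '\0';
            printf("name=%s url=%s", line, comma + 1);
        }
    }
    fclose(f);
}

You'd typically combine this with the App Data path from the previous snippet so the file ends up in a per-user location rather than next to the executable.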