Miva Select Orders From Database

I have a client that is looking to create a batch order system that works with Miva.
I am wondering if anyone has created a system that exports the orders created that day. The system we need to implement has to create a CSV document and send the file to an FTP server.
If anyone has any ideas or examples, I would greatly appreciate them, as I have not yet worked with that part of Miva's system.
Edit:
After doing some research, I found that Miva's system saves the export file to the server, where it can be collected via FTP. I can run the automation and file conversion from our localhost, but
I need to find or create a module that can produce that export daily. Does anyone know of an existing module?

Miva doesn't have a native implementation of cURL.
What it does have is a "Reports" area, where you can generate CSV reports of orders placed within the last 24 hours.
It would be up to you to find a way to get the exported data to the FTP server.
Depending on your setup, you can create a cron job to do this for you and use another language to FTP the Miva-generated CSV file to the proper server (that is, if it isn't the same server the Miva store is running on).
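For example, a minimal sketch of such a cron-driven upload in Python. The export path, FTP host, and credentials below are assumptions; adjust them to your setup.

    # upload_miva_export.py -- run daily from cron, e.g.:
    #   0 6 * * * /usr/bin/python3 /opt/scripts/upload_miva_export.py
    import ftplib
    from datetime import date
    from pathlib import Path

    EXPORT_FILE = Path("/var/www/miva/exports/orders.csv")  # where Miva writes the report (assumed)
    FTP_HOST = "ftp.example.com"                            # partner's FTP server (assumed)
    FTP_USER = "orders"
    FTP_PASS = "secret"

    def main():
        # Date-stamp the remote name so yesterday's file is never overwritten.
        remote_name = f"orders_{date.today():%Y%m%d}.csv"
        with ftplib.FTP(FTP_HOST, FTP_USER, FTP_PASS) as ftp:
            with EXPORT_FILE.open("rb") as fh:
                ftp.storbinary(f"STOR {remote_name}", fh)

    if __name__ == "__main__":
        main()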

Related

Create and analyze a CortexDB database

Here is my problem: I have recently started using CortexDB, a NoSQL database for data analysis. I have read the (poor) documentation at https://docs.cortex-ag.com/en/CortexDB/CortexDB/ and obtained a free license to evaluate the program. As the documentation is unclear, I have some questions:
1) How do I create a database?
2) How can I import a database contained in an Excel file (.csv)?
3) How do I create charts or analyses of the data entered?
Thanks
Because the question is quite old, I hope I can still help you.
First of all: you should download the latest release of the free version (simple registration and download).
If you downloaded the free version, you got the server and two databases. A server process handles one database; for a second database you have to start a second server (on a different port, of course). When you start the free version you should have an empty database (or the filled and configured demo DB). If you want to create a completely new one without any predefined configuration, you have to start the server process from the command line with the parameter -n (ctxserver64 -n). Having done that, you have to configure everything by hand with the ‘remote admin’ tool.
This question is not clear to me. Do you mean how to import a CSV file into a CortexDB, or how to export the database content to an Excel file?
If you want to import the CSV file into a CortexDB, the easiest way is to use the CortexImplex tool. It's completely explained in the online docs (https://docs.cortex-ag.com/en/CortexImplex/CortexImplex-Basics/).
If you want to export datasets as a CSV file, all you have to do is configure a list in the CortexUniplex as a view for your datasets and export them as CSV (you'll find the export function in the list menu).
I would do the charting with d3.js. For this you can use the so-called ‘DataService’ of the CortexUniplex. It's a kind of API for posting requests and getting back JSON objects. If you have a completely configured UniPlex, you can use all of your configuration as JSON objects for other apps (for example charts or an individual application).
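Since the DataService is only loosely described here, the following Python sketch is entirely hypothetical as to URL and payload; it only illustrates the post-a-request, get-JSON-back pattern described above. Check the CortexDB docs for the real API.

    import json
    import urllib.request

    SERVICE_URL = "http://localhost:9701/dataservice"  # hypothetical endpoint

    query = {"list": "orders", "format": "json"}       # hypothetical request body
    req = urllib.request.Request(
        SERVICE_URL,
        data=json.dumps(query).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        datasets = json.load(resp)

    # `datasets` can now be handed to d3.js (or any charting layer) as-is.
    print(json.dumps(datasets, indent=2))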
The full version has a simple dashboard inside the CortexUniplex. Maybe the vendor offers it in the free version as well.
By the way: it's always good to write an email to the info address. Because this database is not widely known, the team is very helpful. Or contact them via Twitter or other channels (see the bottom of the cortex-ag.com webpage).

View Application Background Tasks

I have a .NET application, fully built for commercial purposes. The application uses a SQL Server database to store all of its information. If you'd like specifics: it is a Forex trading application that stores all the price-range changes during the day. So all the data is stored in SQL Server.
What I would like to know is how I can find out where the application's database is located and what it is pointing to.
The application also has a feature that exports the tables I am viewing to an Excel file. The way it creates the Excel files is really "beautiful", and I would like to know what VBA commands it is using. I suspect a batch file might be run when we trigger the export option.
The reason I want to know this is:
I want to make this automated, so that every day at 06:00 the export runs automatically and the exported file carries today's date... I can do all of the tasks mentioned above; the thing I am having trouble with is finding where the DB is and what the VBA commands are.
If my memory serves me right, there is a program that gives you insight into everything your computer does.
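For what it's worth, a minimal sketch of such a dated export, assuming the database location has been found (for a .NET application it is typically in the connection string in the app's .config file). Server, database, and table names below are placeholders.

    import csv
    from datetime import date

    import pyodbc  # third-party: pip install pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=ForexDB;Trusted_Connection=yes;"  # placeholder
    )
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM PriceRanges")  # placeholder table name

    out_name = f"prices_{date.today():%Y-%m-%d}.csv"  # file carries today's date
    with open(out_name, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow([col[0] for col in cursor.description])  # header row
        writer.writerows(cursor.fetchall())

    # Schedule this for 06:00 daily with Windows Task Scheduler or cron.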

Batch Processing Design Patterns

A partner who cannot support a real-time web service interface must SFTP CSV files to my Linux environment.
The file is zipped and encrypted. The SFTP server is a different virtual server from the one that will process the CSV data into my application's database.
I don't need help with the technical steps (bash script, etc.), but I'm looking for file management conventions that satisfy the following requirements:
Good auditability
Non-destructive
Recoverable
Basically, I'm trying to figure out when it makes sense to make copies of the file, when to rename a file to indicate that some processing step has been completed, and so on. (For example, do I keep the zip files, or do I delete them once unzipped?)
There is going to be personal preference in the responses, but that's what I'm looking for: to learn from someone who has more experience working with this type of interface. That seems better than inventing something myself.
If the files are encrypted on the network and within the file's settings, then they cannot be transmitted successfully unless the file is parsed within another file. You could try to make the SFTP server forward the file onto a separate machine, but this would only cause more issues because of the encryption applied to the files.
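One widely used convention (a matter of preference, as the question notes) is a set of staged directories with date-stamped, non-destructive moves: nothing is edited in place, originals including the zip are archived for audit, and failures are quarantined rather than deleted. A minimal Python sketch, with all paths assumed:

    import shutil
    from datetime import datetime
    from pathlib import Path

    BASE = Path("/data/partner_feed")  # assumed base directory
    STAGES = ("incoming", "processing", "archive", "error")

    def stage_file(src: Path, stage: str) -> Path:
        """Move a file to the given stage, date-stamping it so reruns never collide."""
        stamp = datetime.now().strftime("%Y%m%dT%H%M%S")
        dest = BASE / stage / f"{stamp}_{src.name}"
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.move(str(src), str(dest))
        return dest

    def process(zip_path: Path) -> None:
        working = stage_file(zip_path, "processing")
        try:
            # ... decrypt, unzip, load the CSV into the database here ...
            stage_file(working, "archive")   # keep the original zip for audit
        except Exception:
            stage_file(working, "error")     # quarantine, don't delete
            raise

    if __name__ == "__main__":
        for f in (BASE / "incoming").glob("*.zip"):
            process(f)

The date-stamped names double as an audit trail, and because every step is a move rather than a delete, any run can be replayed from the archive.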

Convert Plone database to CSV or SQL

I am helping out an organization that is planning to change its membership system. Right now their system is built on Plone, and all their data is in a Data.fs file.
Their system is down at the moment, and it would take some time and effort to get it up and running.
Is there a way to get the data out of the database into a standard format such as CSV files or SQL? Or do they need to get the system up and running first and export the files from "within" Plone?
Thanks for your help and ideas!
Kind regards,
Samuel
The Data.fs file is an object-oriented database file written by a framework called the ZODB. The data within it represents Python instances, laid out in a tree structure.
You could open this database from a python script, but in order for you to make sense of the contained structures, you'll need access to the original class definitions that make up the stored instances. Without those class definitions all you'll get is placeholder objects (Broken objects) that are of no use at all.
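For reference, opening a Data.fs directly looks roughly like this (a minimal sketch using the ZODB package; without the original Plone and product class definitions on the Python path, most of what you traverse will come back as Broken objects, as described above):

    from ZODB import DB
    from ZODB.FileStorage import FileStorage

    storage = FileStorage("Data.fs")   # path to a copy of the Data.fs
    db = DB(storage)
    connection = db.open()
    root = connection.root()

    print(list(root.keys()))           # typically shows the 'Application' key

    connection.close()
    db.close()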
As such, it's probably easier to just get the Plone instance back up and running, as it'll be easier to export exactly the data you want once you have things like the catalog (basically a specialized database index) to build your export.
It could be that this site is down because of something trivial, something we can help you with here on Stack Overflow, on the Plone users mailing lists, or in the #plone IRC channel. If you do get it up and running and have some details on what you are trying to export, we can certainly help.
You'll need to get the system up and running to export data. Data in the Data.fs file is stored as Python pickles and is not intelligible to "outside" systems.
As the others have pointed out, your best course would be to get Plone running again. After doing so, try csvreplicata to export the existing data to CSV format. And for user accounts, try atreal.usersinout.
If you need professional help, you can search for available providers at http://plone.org/support/providers
For free support, post specific problems here.
Recently I managed to export a Plone 4 site to SQLite using SQLExporter: http://plone.org/products/proteon.sqlexporter. But you need to get your Plone instance working first to use it.

Using Excel with Silverlight app not writing new columns

I have a project as follows:
A user uploads an Excel file to the server, and the server returns it with two new columns. The user wants us to check the prices being charged, and we have a file that holds the average standard pricing.
In the desktop application just finished, I use Microsoft.Office.Interop.Excel
for manipulating the Excel file.
But this is not available in Silverlight. Reading is not the issue;
the issue is adding the two new columns. The program reads the Excel file using OLE DB, which is very lightweight and available on the web.
But for creating the two new columns, I use the Microsoft.Office.Interop.Excel library that Microsoft provides.
This is not available on the web.
I need to work out how we can do this.
One possibility is to have a program on the server wait for a file, process it, and email the result back to the user.
I just want to see if there is another way; I don't like this approach, as it doesn't seem best.
You have a few options for doing this with Silverlight. First, you can use the Excel XML format for the files, which means adding a column is just an XML exercise. Second, if that doesn't work, you can upload the file to the server and run the same code you have in your desktop app to update the file. Once it is updated, you can prompt the user to save the file back to their hard drive.
If you go the Excel XML route, you will need to create a web service to get the price data from your database out to the Silverlight client. OLE DB won't work, since you don't want to expose your database via OLE DB on the Internet.
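To illustrate the server-side route, here is a minimal sketch in Python using openpyxl rather than Silverlight/.NET code, assuming the uploads are .xlsx (which is zipped XML, so no Excel installation is needed on the server). The column layout and pricing lookup are placeholders.

    from openpyxl import load_workbook

    AVERAGE_PRICES = {"SKU-1": 9.99, "SKU-2": 4.50}  # placeholder standard pricing

    wb = load_workbook("uploaded.xlsx")
    ws = wb.active

    ws.cell(row=1, column=3, value="Standard Price")   # two new header cells
    ws.cell(row=1, column=4, value="Difference")

    for row in range(2, ws.max_row + 1):
        sku = ws.cell(row=row, column=1).value
        charged = ws.cell(row=row, column=2).value or 0
        standard = AVERAGE_PRICES.get(sku, 0)
        ws.cell(row=row, column=3, value=standard)
        ws.cell(row=row, column=4, value=charged - standard)

    wb.save("checked.xlsx")  # send this back for the user to save locally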
