Export data into SPSS file (*.sav) - export

Is there any solution to export data into SPSS (*.sav) files?
I have a web service with surveys, and the results need to be exported to different formats.
I can't find any solution for SPSS.
(Any language, free or non-free product, but it needs to run on the server!)

I found a good library: https://github.com/tiamo/spss.
It has been updated and works like a charm.
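If running a Python process on the server is also an option, pyreadstat (a different library, just a suggestion) can write .sav files from a pandas DataFrame. A minimal sketch, assuming the survey results can be put into a DataFrame:

import pandas as pd
import pyreadstat

# Hypothetical survey results -- in practice, build the DataFrame from your web service's data
df = pd.DataFrame({'respondent_id': [1, 2, 3], 'q1': [4, 5, 3], 'q2': [1, 2, 2]})

# Write the DataFrame out as an SPSS .sav file
pyreadstat.write_sav(df, 'survey_results.sav')

The resulting file opens directly in SPSS; variable and value labels can also be supplied if the survey metadata needs them.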

Related

Parse Database PDF Export

I was wondering if there is any way to export data from the Parse.com database (based on conditions) to a PDF format. There does not seem to be any built-in functionality for this but I may be missing something.
The purpose of this is to create a monthly report of new entries into the database.
The only solution I can find is to pull out ParseObjects using a query against a condition (in this case, creation date) and then manually extract the fields and construct a PDF document using a third-party library.
Although I cannot find any solutions, I feel that this sort of functionality would be commonly required and perhaps I am missing something.
Any help would be appreciated! Thank you.
There is no built-in function. You could try to use a JavaScript library in a Cloud Code Background Job to write the file and schedule the job to run once a month, or, as you already said, query the data using the API and write the file on your own server/client. That's pretty much your only option at the moment.
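For the second option, a rough Python sketch of a server-side job that queries the Parse REST API with requests and writes the report with reportlab; the class name "Entry", the hosted api.parse.com endpoint, the date and the keys are placeholders you'd have to adapt:

import requests
from reportlab.pdfgen import canvas

# Fetch objects created since the start of the month from the Parse REST API
# ("Entry", the endpoint and the credentials below are placeholders)
headers = {
    'X-Parse-Application-Id': 'YOUR_APP_ID',
    'X-Parse-REST-API-Key': 'YOUR_REST_API_KEY',
}
params = {
    'where': '{"createdAt": {"$gte": {"__type": "Date", "iso": "2015-06-01T00:00:00.000Z"}}}'
}
results = requests.get('https://api.parse.com/1/classes/Entry',
                       headers=headers, params=params).json()['results']

# Write one line per object into a simple PDF report
pdf = canvas.Canvas('monthly_report.pdf')
y = 750
for obj in results:
    pdf.drawString(40, y, '%s  %s' % (obj['objectId'], obj['createdAt']))
    y -= 15
pdf.save()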

Export content from an ecommerce site without using the Backend

I have a site that I'm looking to transfer to Volusion. Importing tabled content into Volusion is a breeze; it's getting it into tables that's the issue. The old site has no real ability to export, nor do I know how to get at its database. I'm thinking there must be some sort of script I can write to take the content from the frontend and download it as some sort of list that I can put into a CSV and import into Volusion.
www.twincitygreetings.com
Any suggestions? I'm hoping to get into the image directory as well and download all of the images for upload to the new site.
You are going to need, at the very least, a file with product code, product name, weight and price.
Looking at the URL you provided, it doesn't appear that the products there follow any orderly structure that would let you target the images folder or the products based on a known piece of information like a product code. Unless the back-end has some type of product export function, you may have no choice but to recreate it from scratch.
I don't know if you have solved this yet or not, but I would suggest scraping the data, provided the information is still on the old site. This can be done easily using VBScript and Excel, or, if you aren't very savvy at coding, you could look at a piece of software called Mozenda. There are a whole variety of methods that can be used to scrape data, all of them pretty easy to learn with a bit of research. Basically, you write a script that crawls the DOM and extracts the data (to XML works best in my experience).
Hope this helps.
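If you'd rather script it yourself than use a tool like Mozenda, a minimal Python sketch with requests and BeautifulSoup would look something like this; the category URL and the CSS selectors are placeholders that have to be matched to the old site's actual markup:

import csv
import requests
from bs4 import BeautifulSoup

# Placeholder category page -- repeat for each category of the old site
html = requests.get('http://www.twincitygreetings.com/some-category.html').text
soup = BeautifulSoup(html, 'html.parser')

with open('products.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['name', 'price', 'image'])
    for item in soup.select('.product'):  # hypothetical selector
        name = item.select_one('.product-name').get_text(strip=True)
        price = item.select_one('.price').get_text(strip=True)
        image = item.select_one('img')['src']
        writer.writerow([name, price, image])

The image URLs collected this way can then be downloaded in a second pass for upload to the new site.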

How can I export data from a c-tree program?

I'm in the process of consolidating our old stuff into TFS. One of the old things is a bug report repository. I believe it's using c-tree to store data, because it has .idx and .dat files. I'd like to export all of this data to a txt/csv file so someone else can sort through what is still relevant, and then import the good stuff into TFS.
The problem is I'm not sure how to go about exporting the data from the c-tree files. Any ideas?
Thanks,
Makolyte
If you have access to any FairCom tools or APIs, http://www.faircom.com/ace/support_doc_t.php would be a good place to start.
Otherwise, you can get an express edition of c-treeACE here: http://www.faircom.com/ace/download_t.php
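If the edition you install exposes the data through its SQL/ODBC layer (an assumption -- it depends on how the original application defined its files), dumping a table to CSV from Python could look like this; the DSN, credentials and table name are placeholders:

import csv
import pyodbc

# Placeholder DSN/credentials/table -- adjust to the actual c-tree setup
conn = pyodbc.connect('DSN=CTREE_BUGS;UID=admin;PWD=ADMIN')
cursor = conn.cursor()
cursor.execute('SELECT * FROM bug_reports')

with open('bug_reports.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cursor.description])  # column names as header
    writer.writerows(cursor.fetchall())

conn.close()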

Convert plone database to csv or SQL

I am helping out an organization that is planning to change its membership system. Right now the system is developed in Plone and all of the data is in a Data.fs file.
The system is down at the moment and it would take some time and effort to get it up and running.
Is there a way to get the data out of the database into a standard format such as CSV files or SQL? Or do they need to get the system up and running first and export the files from "within" Plone?
Thanks for your help and ideas!
Kind regards,
Samuel
The Data.fs file is an object-oriented database file written by a framework called the ZODB. The data within it represents Python instances, laid out in a tree structure.
You could open this database from a Python script, but in order to make sense of the contained structures, you'll need access to the original class definitions that make up the stored instances. Without those class definitions, all you'll get is placeholder objects (Broken objects) that are of no use at all.
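For illustration, opening the file directly looks roughly like this (a sketch assuming the ZODB package is installed; work on a copy of Data.fs):

from ZODB.FileStorage import FileStorage
from ZODB.DB import DB

storage = FileStorage('Data.fs')   # use a copy of the file, not the live one
db = DB(storage)
connection = db.open()
root = connection.root()           # top of the object tree
app = root.get('Application')      # the Zope application object, if present
# Without the original Plone/Zope class definitions on the Python path,
# traversing further from here only yields Broken placeholder objects.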
As such, it's probably easier to just get the Plone instance back up and running, as it'll be easier to export the exact data you want out if you have things like the catalog (basically a specialized database index) to build your export.
It could be that this site is down because of something trivial, something we can help you with here on Stack Overflow, on the Plone users mailing list, or in the #plone IRC channel. If you do get it up and running and have some details on what you are trying to export, we can certainly help.
You'll need to get the system up and running to export the data. Data in the Data.fs file is stored as Python pickles and is not intelligible to "outside" systems.
As others have pointed out, your best course would be to get Plone running again. After doing so, try csvreplicata to export the existing data to CSV format. For user accounts, try atreal.usersinout.
If you need professional help, you can search for available providers from http://plone.org/support/providers
For free support, post specific problems here.
Recently I managed to export a Plone 4 site to SQLite using SQLExporter: http://plone.org/products/proteon.sqlexporter. But you need to get your Plone instance working first to use it.

I need a sql db that has all the stock symbols on the US market. ANYONE?

That is pretty much it. I need a .sql file with all the stock symbols and company names matched up for an autocomplete function I'm writing. ANYONE?
Well, I don't know about a downloadable .sql file, but there are numerous free and paid APIs you could use to get the data for import into your DB. Check out this similar question for some options: Stock ticker symbol lookup API
I think I would go this route and maybe run a background process that pulls an update from the API every now and then, so that you always have all the symbols up to date.
http://www.nasdaq.com/screening/company-list.aspx. This page contains the company lists from NASDAQ, AMEX and NYSE. You can then read in the CSV data and store it as SQL.
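As a sketch of that last step, assuming the downloaded file is saved as companylist.csv and has "Symbol" and "Name" columns (adjust to the actual header):

import sqlite3
import pandas as pd

# companylist.csv: the CSV downloaded from the screening page (filename/columns assumed)
df = pd.read_csv('companylist.csv')[['Symbol', 'Name']]

conn = sqlite3.connect('symbols.db')
df.to_sql('symbols', conn, if_exists='replace', index=False)  # one row per ticker
conn.close()

A plain .sql script can then be produced from the SQLite database with sqlite3 symbols.db .dump if that's what the autocomplete needs.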
I use the Python package pytickersymbols. The package offers an offline collection of stocks with metadata such as Google and Yahoo symbols.
from pytickersymbols import PyTickerSymbols

stock_data = PyTickerSymbols()
# each call returns the stocks of an index as records with name, symbol and further metadata
nasdaq_stocks = list(stock_data.get_stocks_by_index('NASDAQ 100'))
sp500_stocks = list(stock_data.get_stocks_by_index('S&P 500'))
sp100_stocks = list(stock_data.get_stocks_by_index('S&P 100'))
In the repository, you can also find a YAML file which may be useful for creating a SQL file.
Take a look at the Company Fundamentals API at http://www.mergent.com/servius - it should be pretty easy to extract the list from there.
