I want to upload a CSV file, validate it, and then import it into an existing model. However, I am using ATK 4.2.1 and finding that the example code I have googled is either missing some steps or not relevant to version 4.2.1.
To this end, as a first step I have tried mixing and matching code in an attempt to get a filestore up and running, with no working results so far.
Is there a step-by-step tutorial or guide that anyone can point me to? The reference on the agiletoolkit.org site does not have any examples for the atk4-addons that I can find. It seems to be more involved than just adding an object to the page.
Grateful for any assistance.
You can find some installation instructions here:
http://agiletoolkit.org/doc/filestore
1. Import this SQL file into your database: atk4-addons/misc/docs/dbupdates/filestore-001.sql
2. Create a new page class, page_fileadmin, which extends filestore/Page_FileAdmin.
3. Create a directory called 'upload' and make it writable.
4. In your model, instead of addField() use $this->add('filestore/File'); (see the sketch below).
5. If you need images, use filestore/Image instead.
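Putting steps 4 and 5 together, a model might look roughly like this (a minimal sketch; the Model_Document class, table and field names are made up for illustration, only the filestore/File and filestore/Image calls come from the steps above):

class Model_Document extends Model_Table {
    public $table = 'document';
    function init() {
        parent::init();
        $this->addField('title');
        // instead of a plain addField(), attach the filestore field:
        $this->add('filestore/File', 'attachment');
        // or, for images:
        // $this->add('filestore/Image', 'photo');
    }
}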
Please join https://groups.google.com/forum/?fromgroups#!forum/agile-toolkit-devel for updates on new documentation.
Playing with Yahoo's vespa.ai, I'm now at a point where I have a search definition I am happy with, but I still have a bunch of garbage test documents stored.
Is there an easy way to delete/purge/drop all of them at once, à la SQL's DROP TABLE or DELETE FROM X?
The only place I have found so far where deleting documents is clearly mentioned is the Document JSON format page. As far as I understand, it requires deleting documents one by one, which is fine, but gets a bit cumbersome when one is just playing around.
I tried deleting the application via the Deploy API using the default tenant, but the data is still there when issuing search requests.
Did I miss something, or is this by design?
There's no API available to do this, but the vespa-remove-index command-line tool could help you out. For example, to drop everything:
$ vespa-stop-services
$ vespa-remove-index
$ vespa-start-services
You could also play around with using garbage collection for this, but I wouldn't go down this path unless you are unable to use vespa-remove-index.
I am trying to create an .xlsx file with data I retrieve from MySQL on a Node.js server that serves an AngularJS project, but after hours of trying to find something via npm or Google I have almost given up!
The two main problems I have are:
My data is in Hebrew (i.e. RTL styling + a different character set).
The Excel file that I export needs to be styled in a specific way, and it is a pain trying to style an Excel file programmatically.
And then I had an idea!
What if I could create a Google Sheets doc in my Google Drive as a template, including the styling, and then, when the user clicks to create a new doc, just duplicate this template and change the values to the new data?
But just trying to understand the Google API is a headache on its own; apparently there are three different APIs: Drive, Sheets and auth.
So my question is as follows.
Is my idea valid? Does anyone think it could work?
Where would I start? Is there some guide or npm package that would help?
Please don't just comment to look in the docs; I am having a hard time understanding where to start from there.
I would suggest creating the template file locally instead of opting for Google Sheets.
There is a decent module I used some time back which does styling pretty well. It's called exceljs.
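For example, something along these lines gives you a right-to-left, styled sheet (a rough sketch; the sheet name, columns and the rows variable holding your MySQL results are placeholders):

const ExcelJS = require('exceljs');

const workbook = new ExcelJS.Workbook();
// rightToLeft flips the sheet direction for Hebrew
const sheet = workbook.addWorksheet('Report', { views: [{ rightToLeft: true }] });

sheet.columns = [
  { header: 'שם', key: 'name', width: 25 },
  { header: 'סכום', key: 'amount', width: 12 },
];
sheet.getRow(1).font = { bold: true, size: 12 }; // style the header row

// rows = the records you pulled from MySQL
rows.forEach(r => sheet.addRow({ name: r.name, amount: r.amount }));

workbook.xlsx.writeFile('report.xlsx');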
There is also the xlsx module, which is very powerful but more difficult to use.
Also, if you end up using Google Sheets, I would suggest giving the node-google-spreadsheet module a look.
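If you do want to try the template idea from the question, the flow with Google's official googleapis package would roughly be: copy the styled template with the Drive API, then write the new values into the copy with the Sheets API. A hedged sketch (the credentials file, template id, range and data are all placeholders):

const { google } = require('googleapis');

const auth = new google.auth.GoogleAuth({
  keyFile: 'service-account.json', // placeholder credentials file
  scopes: [
    'https://www.googleapis.com/auth/drive',
    'https://www.googleapis.com/auth/spreadsheets',
  ],
});
const drive = google.drive({ version: 'v3', auth });
const sheets = google.sheets({ version: 'v4', auth });

async function createFromTemplate(templateId, rows) {
  // 1. duplicate the styled template
  const copy = await drive.files.copy({
    fileId: templateId,
    requestBody: { name: 'Report copy' },
  });
  // 2. fill the copy with the new data, e.g. rows = [['שם', 123], ...]
  await sheets.spreadsheets.values.update({
    spreadsheetId: copy.data.id,
    range: 'Sheet1!A2',
    valueInputOption: 'USER_ENTERED',
    requestBody: { values: rows },
  });
  return copy.data.id;
}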
I've got quite an issue that I can't seem to figure out how to solve cleanly after hours of scouring the internet, and I'm hoping you all can help me out.
We have an AngularJS web app that is used to create an object in a database according to a large set of variables. After the object is put into the database, we want to put some of its fields into a macro-enabled Excel doc containing some necessary VBA macros, which is then downloaded to the user's computer.
From what I've found, Angular can output the data to a new Excel doc, but we need to have the data put into the pre-existing Excel document. The closest tool I've found is SheetJS, from a different Stack Overflow question/answer, but that answer doesn't actually implement the functionality to edit the existing Excel file, only to write a new one.
Any and all suggestions on fixing my problem are appreciated!
I may have found an answer, but it's not quite available yet. I left an issue on the SheetJS GitHub page which got answered today. It sounds like macro-persistence functionality is being added in the release this upcoming Friday. From there, add bookVBA: true to the options of your readFile call, e.g. XLSX.readFile("filename", {bookVBA: true, /* ... other options */});, and then on your writeFile call add the option bookType: "xlsm".
This is of course speculative, as the change isn't live yet. I'll make sure to update this answer at that time.
EDIT/UPDATE: SheetJS now works for copying macros over to a new workbook, but there's currently a bug. If you have any code in the ThisWorkbook object to do anything on startup, sheet changes, etc., it won't work after writing back out, as SheetJS creates a new ThisWorkbook1 object. I've already opened an issue on it, and that should be fixed relatively quickly.
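With those options the round trip looks roughly like this (a sketch against the current SheetJS API; the file names and cell values are placeholders):

const XLSX = require('xlsx');

// read the macro-enabled template, keeping the VBA blob
const wb = XLSX.readFile('template.xlsm', { bookVBA: true });

// drop the generated values into the first sheet
const ws = wb.Sheets[wb.SheetNames[0]];
XLSX.utils.sheet_add_aoa(ws, [['Order', 'Total'], ['A-1001', 42]], { origin: 'A1' });

// write back out as .xlsm so the macros survive
XLSX.writeFile(wb, 'filled.xlsm', { bookType: 'xlsm' });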
I've been following this tutorial http://www.magentocommerce.com/knowledge-base/entry/tutorial-creating-a-magento-widget-part-1 to create a Magento widget as part of an extension I'm working on.
While the widget was created successfully and worked as I wanted, I changed the code and started getting the following error:
Warning: Invalid argument supplied for foreach() in app/code/core/Mage/Widget/Model/Widget/Instance.php on line 502
When I changed the code back, the error was still present. However, when I copied my module to a fresh Magento install, the error didn't appear.
Although my widget does not explicitly use the database, does anyone know whether the act of installing and uninstalling a Magento widget makes changes to the core database tables and, if it does, which tables are altered?
Thanks
The core_resource table contains a list of all modules, so adding a new module will cause a new row to be created.
If you have anything in your module's sql folder, that code will be run depending on your module's version.
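For example, to see which setup version Magento has recorded for your module (the foobar_setup resource name is just an illustration; it depends on what your config.xml declares):

SELECT code, version, data_version FROM core_resource WHERE code = 'foobar_setup';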
Without knowing exactly what code was run and changed, it's hard to know what your specific problem is.
http://www.magentocommerce.com/knowledge-base/entry/magento-for-dev-part-6-magento-setup-resources
So if you followed the tutorial you linked to, I do not think it changed any database settings.
You can tell whether a module will add tables or modify table columns using the following method.
Assume the module is called Foo_Bar and it is installed in the "community" code pool (as opposed to core or local).
Navigate to app/code/community/Foo/Bar. You will at minimum usually see etc and Block directories there.
If you see a sql directory, then the module makes DB changes. You also need to understand that a module is versioned and may initially create a certain table and then modify it in later versions.
By way of example, you can go to any core module and look for the same. I am running Enterprise 1.12 and went to:
app/code/core/Mage/Sendfriend/sql/sendfriend_setup
I see:
mysql4-upgrade-1.5.9.9-1.6.0.0.php
mysql4-upgrade-0.7.3-0.7.4.php
mysql4-upgrade-0.7.2-0.7.3.php
mysql4-upgrade-0.7.1-0.7.2.php
mysql4-install-0.7.0.php
install-1.6.0.0.php
Note the upgrade x-y nomenclature. That is what core_resource keeps track of.
If you are wondering where your new module's settings are saved, that is actually in core_config_data. Try this:
SELECT * FROM core_config_data where path like '%foo%';
Assuming you have some setting in the admin you named "foo".
Now back to your problem. That is a common PHP error: you are running a foreach on something which cannot be iterated. The code right before that line is probably not returning an array or a collection.
Ideally you should always wrap the foreach in a check that the item you are iterating over is actually iterable.
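For example (the $items variable and the getData() key are illustrative stand-ins for whatever value feeds the failing foreach):

$items = $instance->getData('page_groups'); // may come back null or false
if (is_array($items) || $items instanceof Traversable) {
    foreach ($items as $item) {
        // ... work with $item ...
    }
}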
You can also turn off displaying errors, or suppress that particular error using the @ operator, but both are bad practice...
I'd like to know your approach/experiences when it's time to initially populate the Grails DB that will hold your app data. Assuming you have CSVs with data, is it "safer" to create a script (with whatever tool fits you) that:
1. Generates the BootStrap commands with the domain classes, runs it in a test or dev environment, and then uses the native DB commands to export the data to prod?
2. Creates the DB insert script directly, assuming GORM's version = 0 and manually incrementing the soon-to-be auto-generated IDs?
My fear is that the second approach may lead to inconsistencies, since Hibernate will be responsible for ID generation, and there may be something else I'm missing.
Thanks in advance.
Take a look at this link. It allows you to run Groovy scripts in the normal Grails context, giving you access to all Grails features, including GORM. I'm currently importing data from a legacy database and have found that writing a Groovy script that uses the Groovy SQL interface to pull out the data and then puts that data into domain objects appears to be the easiest approach. Once you have the data imported, you just use the commands specific to your database system to move it to the production database.
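A rough sketch of such a script (the connection details, table, columns and the Product domain class are made up for illustration; run it in the Grails context as described in the link so GORM is bootstrapped):

import groovy.sql.Sql

def sql = Sql.newInstance('jdbc:mysql://localhost/legacy', 'user', 'secret', 'com.mysql.jdbc.Driver')

sql.eachRow('SELECT name, price FROM legacy_product') { row ->
    // let GORM/Hibernate assign the ids instead of inserting them by hand
    new Product(name: row.name, price: row.price).save(failOnError: true)
}
sql.close()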
Update:
Apparently the entry referenced from the blog post I linked to no longer exists. I was able to get this working using the code at the following link, which is also referenced in the comments.
http://pastie.org/180868
Finally, it seems that the simplest solution is to take into account that GORM, as of the current release (1.2), uses a single sequence for all auto-generated IDs. Considering this when creating whatever scripts you need (in the language of your preference) should suffice. I understand it's planned for the 1.3 release that every table will have its own sequence.