How can I script WordPress for non-GUI automated Pages/Posts import?

This is probably a "can't see the forest for the trees" situation,
but how do I create a script that imports Posts/Pages automatically, without hooking into the WP website GUI (e.g. in the theme's functions.php)? It should be standalone, triggerable by calling the script's name via the webserver,
presumably using the API call wp_insert_post().

You want to use the WordPress API (http://codex.wordpress.org/XML-RPC_wp) to connect. You can do this with almost any scripting language, but since you mention running it on the webserver, and WordPress is written in PHP, we'll go with that language for now.
Check out this tutorial:
http://life.mysiteonline.org/archives/161-Automatic-Post-Creation-with-Wordpress,-PHP,-and-XML-RPC.html
He shows an example of how to create a script that will insert a post into your WordPress blog. The script can be given execute permissions and run via the command line or a cron job.
You will have to code the logic to get the post from wherever your data is stored, though.
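If you would rather skip XML-RPC and call wp_insert_post() directly, as the question suggests, a standalone script can bootstrap WordPress itself by loading wp-load.php. A minimal sketch, assuming the script sits in the WordPress root and that a simple query-string key is acceptable protection for your setup:

<?php
// import-posts.php -- standalone importer; call it via the webserver or from cron.
// Assumption: this file lives in the WordPress root next to wp-load.php.
require_once __DIR__ . '/wp-load.php';

// Crude protection so random visitors cannot trigger the import
// (the 'key' parameter and its value are placeholders).
if (!isset($_GET['key']) || $_GET['key'] !== 'change-me') {
    http_response_code(403);
    exit('Forbidden');
}

// Example data -- in practice you would read this from a CSV, a feed, etc.
$items = array(
    array('title' => 'Imported post', 'content' => 'Hello from the importer.'),
);

foreach ($items as $item) {
    $id = wp_insert_post(array(
        'post_title'   => $item['title'],
        'post_content' => $item['content'],
        'post_status'  => 'publish',
        'post_type'    => 'post',   // or 'page' for Pages
    ), true);

    echo is_wp_error($id) ? 'Error: ' . $id->get_error_message() : "Created post $id";
    echo "\n";
}

Calling http://example.com/import-posts.php?key=change-me (or fetching that URL with wget/curl from a cron job) then runs the import without touching the theme's functions.php.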

Related

Liferay document checkin issue

I'm still new to Liferay and am using Liferay 6.2.
What I'm doing:
I am trying to add a document manually into my database using an insert statement.
I inserted into dlfileentry, dlfileversion and AssetEntry.
I also created a folder with the valid name and file.
The issue:
Upon entering the Documents and Media portlet, I can see the document name there, but when I click on checkout it prompts an error saying that Documents and Media is temporarily unavailable. However, I am still able to download the valid document.
Am I doing something wrong? Personally, I feel that I am missing one more table in the database, but I'm not sure.
Thanks!
Yes, you're doing something wrong: You should never write to Liferay's database with SQL, as there might be more data required than what's directly visible to you. Obviously, you're running into exactly such an issue.
Liferay has an API which you can use locally, from within the same application server, or remotely as JSON or SOAP service. You should exclusively use this for write access to the database.
Alternatively, you might consider WebDAV access to your document repository as the way to add more documents to the document library.
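As a rough illustration of the remote JSON route: Liferay 6.2 exposes its services under /api/jsonws, and the Documents and Media service should be reachable there. The method name, parameter names, IDs and credentials below are assumptions; verify them against the /api/jsonws browser on your own instance before relying on this.

curl -u test@example.com:password \
  -F "repositoryId=10184" \
  -F "folderId=0" \
  -F "sourceFileName=report.docx" \
  -F "mimeType=application/vnd.openxmlformats-officedocument.wordprocessingml.document" \
  -F "title=report.docx" \
  -F "description=" \
  -F "changeLog=" \
  -F "file=@report.docx" \
  "http://localhost:8080/api/jsonws/dlapp/add-file-entry"

Going through a call like this, Liferay creates the DLFileEntry, DLFileVersion and AssetEntry rows (and whatever else it needs) itself, which is exactly what the manual SQL insert is missing.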

Implementing the Remote API Shell (Python) in Google App Engine

What I'm trying to implement: I have to access the App Engine datastore remotely using remote_api_shell.py. The problem I'm facing is that I'm able to log in but can't access the entity modules in my app. The steps of the procedure aren't clearly documented anywhere, so I'm not able to proceed further.
I referred the articles
https://developers.google.com/appengine/docs/python/tools/remoteapi and
https://developers.google.com/appengine/articles/remote_api
They have used a command like
python $GAE_SDK_ROOT/remote_api_shell.py -s your_app_id.appspot.com
I don't know where to type it. I used the command prompt, for which I modified the above as
c:\program files(x86)\google\google_appengine\python remote_api_shell.py -s your_app_id.appspot.com
It logs in, and I'm able to save some entities in my datastore, but I'm unable to access my modules. I think there is some kind of directory I need to specify, or steps I have to follow before this, which I might have missed. So I'm looking for some help to achieve this successfully.
Thanks.
First off, cd to your application directory.
Then run the remote shell as per the docs:
python $GAE_SDK_ROOT/remote_api_shell.py -s your_app_id.appspot.com
If you use appengine_config.py to set up all your paths, manually import that into the shell.
Otherwise you should be able to import any modules etc. that are defined at the root level of your application directory.
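A typical session might look roughly like this (the paths, app ID and module name are placeholders; the last line assumes ndb models defined in a models.py at the application root):

cd C:\path\to\your_app
python "C:\Program Files (x86)\Google\google_appengine\remote_api_shell.py" -s your_app_id.appspot.com

Then, inside the interactive shell:

import appengine_config   # only if you rely on it to set up sys.path
import models             # whatever module at your app root defines the entities
models.MyEntity.query().fetch(5)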

How do I make a cron job use the export feature of phpMyAdmin to export a DB

I want to export some tables in my DB to an Excel/Spreadsheet every month.
In phpMyAdmin there is a direct option to export the result of a query to the desired file type. How do I make use of this export feature from a monthly cron job, without writing another script?
Basically, in cPanel (the DB is hosted on the web) we just have to give the path of the script to be executed via a cron job. But phpMyAdmin offers no such option; the export is a built-in feature of phpMyAdmin that we normally trigger manually by clicking. So how do I do it in cPanel?
Do you have SSH access to the box? Personally I'd implement this outside of phpMyAdmin, as phpMyAdmin is just intended for manual operations via the interface. Why not write a simple script to export the DB?
Something like mysqldump database table.
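For example, a crontab entry on the server could run mysqldump once a month (database name, tables, credentials, output path and schedule below are all placeholders; note that % has to be escaped in crontab):

0 2 1 * * mysqldump -u dbuser -p'secret' mydb table1 table2 > /home/user/exports/mydb-$(date +\%Y\%m).sql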
Being a web app, the export function is a POST request. In the demo application the URL is http://demo.phpmyadmin.net/STABLE/export.php, and the POST data contains all the required parameters, for example (you can use Fiddler/Chrome dev tools to view it):
token:3162d3b849cf652c2577a45f90022df7
export_type:server
export_method:quick
quick_or_custom:custom
output_format:sendit
filename_template:#SERVER#
remember_template:on
charset_of_file:utf-8
compression:none
what:excel
codegen_structure_or_data:data
codegen_format:0
csv_separator:,
csv_enclosed:"
.....
The one tricky bit is the authentication token, but I believe this can also be overcome using some configuration and/or extra parameters (like the 'direct login' in http://demo.phpmyadmin.net/).
See here
How to send data using curl from Linux command line?
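A hedged sketch of such a curl call (the session cookie and token have to be captured from your own logged-in phpMyAdmin session, and the exact field names can differ between phpMyAdmin versions):

curl 'http://demo.phpmyadmin.net/STABLE/export.php' \
  --cookie 'phpMyAdmin=<session-cookie>' \
  --data 'token=<token-from-the-same-session>' \
  --data 'export_type=server' \
  --data 'export_method=quick' \
  --data 'what=excel' \
  --data 'filename_template=%23SERVER%23' \
  --data 'charset_of_file=utf-8' \
  --data 'compression=none' \
  -o export.xls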
If you want to avoid all this, there are many other web-automation tools that can record the scenario and play it back.
Just write a simple PHP script to connect to your database and use the answer here: How to output MySQL query results in CSV format?
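A minimal sketch of that approach (host, credentials, database, table, query and output path are all placeholders), using mysqli and fputcsv:

<?php
// export-tables.php -- dump a query to CSV; schedule this script from cPanel's cron.
$db = new mysqli('localhost', 'dbuser', 'secret', 'mydb');
if ($db->connect_error) {
    exit('Connection failed: ' . $db->connect_error);
}

$result = $db->query('SELECT * FROM my_table');
$out = fopen('/home/user/exports/my_table-' . date('Y-m') . '.csv', 'w');

// Header row from the column names.
$header = array();
foreach ($result->fetch_fields() as $field) {
    $header[] = $field->name;
}
fputcsv($out, $header);

// Data rows.
while ($row = $result->fetch_row()) {
    fputcsv($out, $row);
}

fclose($out);
$db->close();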

Open and Save Word files through internet

I have a situation that is beyond my knowledge. Here is the situation:
A simple web-based system stores Word files. Users create them locally, then upload them to the server. After that, another user can download, edit and upload them again. All of that is okay, but the repeated download/upload steps cause trouble, for example when a user forgets to upload after making changes. The prerequisite is that they want to use only Word, so I can't use any web editors like CKEditor or Google Documents.
So, the question: is there a way to let users open/save those DOC files with Word without setting up a VPN?
The server is Windows 2008, and the language is ASP.NET / classic ASP. Users access the system via browsers.
I think you can embed a plugin called aceoffix in your web system, so that customers do not have to download, upload and save back to the server. With aceoffix they can edit online and save back to the server directly. It has exactly the same interface as MS Office. Hope this is helpful.
How about a tiny app (on the clients) to act as a synchronizer (using FTP)?
I think an embedded Word viewer would be something quite complex to pull off - especially if they require the native, proper and exact Word look/menus.
One alternative is to provide a plugin to your users, with which they can access/sync documents directly from/to the server. But then you aren't using a web site but a local plugin, which comes with its own headaches, of course.
Creating a Word plugin is a nice way to make it seem like something "in the Office program" when you have actually created it yourself, so that your users don't feel like they are using another program. My idea is that you could create a way for users to load a Word file from the server, make changes to it, and then upload it back to the server automatically.

CakePHP: running a controller action as a cron job not working

I am trying to run a controller action as a cron job, but it gives me the message "could not open input file".
To do the above, I used this link: http://bakery.cakephp.org/articles/view/calling-controller-actions-from-cron-and-the-command-line....
But it's not working for me. I also tried placing the cron dispatcher in /app/webroot, but that still doesn't work.
Thanks...
PHP cannot load your cron_dispatcher.php. Make sure your cron entry is correct, i.e. points to the correct full path of your PHP file. Also make sure that you have your access rights correctly. It's possible that the file exists but that when PHP runs under cron, it is not allowed to read the file.
PS: Have you considered using CakePHP's Shell functionality instead of this dispatcher? It was designed for CLI use.
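A minimal sketch of such a shell, assuming CakePHP 2.x (the shell name and the Post model are placeholders; CakePHP 1.x uses cake/console/cake and a slightly different layout):

<?php
// app/Console/Command/ReportShell.php
App::uses('AppShell', 'Console/Command');

class ReportShell extends AppShell {

    // Models this shell needs; 'Post' is a placeholder.
    public $uses = array('Post');

    public function main() {
        // Put the logic from your controller action here.
        $count = $this->Post->find('count');
        $this->out("There are {$count} posts.");
    }
}

It can then be run from the app directory as Console/cake report, which is also the command you would point the cron entry at.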
Alternatively, call your action directly:
wget http://www.example.com/homes/my
Write your code in that action,
and set the wget command up as a cron job in cPanel.
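For example, a cron entry like this (URL and schedule are placeholders) would hit the action every night at 03:00:

0 3 * * * wget -q -O /dev/null "http://www.example.com/homes/my"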
