Custom table doesn't exist in Moodle database

I am building a Moodle block plugin. For the plugin I have created three tables:
'block_learning_strategizer': contains just one field, id
'ls_basic': contains 3 fields: id, lp_name, description
'ls_path_details': contains 9 fields.
The definition is done through install.xml (under blocks/learning_strategizer/db).
The XML is as below:
<?xml version="1.0" encoding="UTF-8" ?>
<XMLDB PATH="blocks/learning_strategizer/db" VERSION="20120122" COMMENT="XMLDB file for Moodle blocks/learning_strategizer"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:noNamespaceSchemaLocation="../../../lib/xmldb/xmldb.xsd"
>
  <TABLES>
    <TABLE NAME="block_learning_strategizer" COMMENT="Default for block_learning_strategizer" NEXT="ls_basic">
      <FIELDS>
        <FIELD NAME="id" TYPE="int" LENGTH="10" NOTNULL="true" SEQUENCE="true"/>
      </FIELDS>
      <KEYS>
        <KEY NAME="primary" TYPE="primary" FIELDS="id"/>
      </KEYS>
    </TABLE>
    <TABLE NAME="ls_basic" COMMENT="Table contains name and description of learning paths" PREVIOUS="block_learning_strategizer" NEXT="ls_path_details">
      <FIELDS>
        <FIELD NAME="id" TYPE="int" LENGTH="10" NOTNULL="true" SEQUENCE="true" NEXT="lp_name"/>
        <FIELD NAME="lp_name" TYPE="char" LENGTH="255" NOTNULL="true" SEQUENCE="false" PREVIOUS="id" NEXT="description"/>
        <FIELD NAME="description" TYPE="text" NOTNULL="false" SEQUENCE="false" PREVIOUS="lp_name"/>
      </FIELDS>
      <KEYS>
        <KEY NAME="primary" TYPE="primary" FIELDS="id"/>
      </KEYS>
    </TABLE>
    <TABLE NAME="ls_path_details" COMMENT="Table contains details of created learning paths" PREVIOUS="ls_basic">
      <FIELDS>
        <FIELD NAME="id" TYPE="int" LENGTH="10" NOTNULL="true" SEQUENCE="true" NEXT="lp_id"/>
        <FIELD NAME="lp_id" TYPE="int" LENGTH="10" NOTNULL="true" SEQUENCE="false" PREVIOUS="id" NEXT="course"/>
        <FIELD NAME="course" TYPE="int" LENGTH="10" NOTNULL="true" SEQUENCE="false" PREVIOUS="lp_id" NEXT="section"/>
        <FIELD NAME="section" TYPE="int" LENGTH="10" NOTNULL="true" SEQUENCE="false" PREVIOUS="course" NEXT="req"/>
        <FIELD NAME="req" TYPE="int" LENGTH="2" NOTNULL="true" SEQUENCE="false" PREVIOUS="section" NEXT="inc"/>
        <FIELD NAME="inc" TYPE="int" LENGTH="2" NOTNULL="true" SEQUENCE="false" PREVIOUS="req" NEXT="modid"/>
        <FIELD NAME="modid" TYPE="int" LENGTH="3" NOTNULL="true" SEQUENCE="false" PREVIOUS="inc" NEXT="seqno"/>
        <FIELD NAME="seqno" TYPE="int" LENGTH="10" NOTNULL="true" SEQUENCE="false" PREVIOUS="modid" NEXT="filename"/>
        <FIELD NAME="filename" TYPE="text" NOTNULL="true" SEQUENCE="false" PREVIOUS="seqno"/>
      </FIELDS>
      <KEYS>
        <KEY NAME="primary" TYPE="primary" FIELDS="id" NEXT="lp_id"/>
        <KEY NAME="lp_id" TYPE="foreign" FIELDS="lp_id" REFTABLE="ls_basic" REFFIELDS="id" PREVIOUS="primary"/>
      </KEYS>
    </TABLE>
  </TABLES>
</XMLDB>
However, when I try to insert records I get an error: "Table "ls_basic" does not exist".
I checked the XMLDB editor under Site administration and could see that the tables had been created.
I haven't included an upgrade.php, but as far as I know that file is optional.
It would be really helpful if someone could point out why I get this error.

When a plugin is first installed, Moodle parses the install.xml file and uses this to create the database tables required for your plugin.
After the first installation of the plugin, Moodle does not look at the install.xml file again. Instead it relies on checking the upgrade.php file, at the point where your plugin's version number (in version.php) changes, in order to find out how to transform the previous database structure to match the new structure.
If your plugin is still under local development, you can get Moodle to re-parse the install.xml file by using the 'uninstall' feature in the 'Plugins' part of the 'Site administration' area. This will remove all the data for the plugin, then, if the code for the plugin still exists on the server, will immediately offer to re-install the plugin (which will create all the tables in install.xml).
If your plugin is already in use, or you do not want to lose any existing data, then you will need to use the XMLDB editor to generate the relevant lines of code to go in your upgrade.php file (and increase your plugin's version number to match).
See https://docs.moodle.org/dev/Upgrade_API for more details.
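For illustration, here is a minimal upgrade.php sketch of the kind the XMLDB editor can generate for you (the version number 2012012300 and the table_exists() guard are illustrative assumptions, not code from your plugin):

<?php
// blocks/learning_strategizer/db/upgrade.php -- a sketch only; generate the
// real table-creation code with the XMLDB editor.
defined('MOODLE_INTERNAL') || die();

function xmldb_block_learning_strategizer_upgrade($oldversion) {
    global $DB;
    $dbman = $DB->get_manager();

    if ($oldversion < 2012012300) { // Illustrative version; match your version.php bump.
        // Define and conditionally create the table ls_basic.
        $table = new xmldb_table('ls_basic');
        $table->add_field('id', XMLDB_TYPE_INTEGER, '10', null, XMLDB_NOTNULL, XMLDB_SEQUENCE, null);
        $table->add_field('lp_name', XMLDB_TYPE_CHAR, '255', null, XMLDB_NOTNULL, null, null);
        $table->add_field('description', XMLDB_TYPE_TEXT, null, null, null, null, null);
        $table->add_key('primary', XMLDB_KEY_PRIMARY, array('id'));

        if (!$dbman->table_exists($table)) {
            $dbman->create_table($table);
        }

        // Record that this upgrade step has run.
        upgrade_block_savepoint(true, 2012012300, 'learning_strategizer');
    }

    return true;
}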
I would also suggest that this is a good time to fix your database tables to match the Moodle coding guidelines:
Variable names should not have _ characters in them - this applies to database field names as well (although _ in database table names are fine).
Plugin database tables should all start with the name of the plugin ('block_learning_strategizer' in this case) - if you end up using Travis CI to automatically check your plugins, it will complain about your database table names.
Plugin names are strongly discouraged from having _ in them (other than between the plugin type and the rest of the name) - there have been a number of bugs over the years caused by Moodle core getting stuck on names that break this rule. It may be a good idea to rename your plugin 'block_learningstrategizer' now, before you hit any problems.
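For example, a cleaned-up table definition following those rules might look like this (the shortened plugin and field names here are hypothetical, chosen to stay within XMLDB's table-name length limit):

<TABLE NAME="block_lstrategizer_basic" COMMENT="Name and description of learning paths">
  <FIELDS>
    <FIELD NAME="id" TYPE="int" LENGTH="10" NOTNULL="true" SEQUENCE="true"/>
    <FIELD NAME="lpname" TYPE="char" LENGTH="255" NOTNULL="true" SEQUENCE="false"/>
    <FIELD NAME="description" TYPE="text" NOTNULL="false" SEQUENCE="false"/>
  </FIELDS>
  <KEYS>
    <KEY NAME="primary" TYPE="primary" FIELDS="id"/>
  </KEYS>
</TABLE>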

Related

Solr: How to clear the baseDir folder after the DIH import

Solr Version :: 6.6.1
I am able to import the PDF files into Solr using the DIH, and the indexing performs as expected. But I wish to clear the folder C:/solr-6.6.1/server/solr/core_K2_Depot/Depot after the indexing process finishes successfully.
Please suggest if there is a way to delete all the files from the folder via the DIH data-config.xml, or another, easier way.
<!-- Local filesystem -->
<dataConfig>
  <dataSource type="BinFileDataSource"/>
  <document>
    <entity name="K2FileEntity" processor="FileListEntityProcessor" dataSource="null"
            recursive="true"
            baseDir="C:/solr-6.6.1/server/solr/core_K2_Depot/Depot" fileName=".*pdf" rootEntity="false">
      <field column="file" name="id"/>
      <field column="fileLastModified" name="lastmodified"/>
      <entity name="pdf" processor="TikaEntityProcessor" onError="skip"
              url="${K2FileEntity.fileAbsolutePath}" format="text">
        <field column="title" name="title" meta="true"/>
        <field column="dc:format" name="format" meta="true"/>
        <field column="text" name="text"/>
      </entity>
    </entity>
  </document>
</dataConfig>
Usually, in production, you want to run the DIH process via shell scripts which first copy the needed files from FTP, HTTP, S3, etc., then trigger a full-import or delta-import, and then track the progress of the indexing via the status command; as soon as it ends successfully, you just execute an rm command:
while true; do
  # poll the DIH status (host, port and core name assumed from the question)
  status=$(curl -s "http://localhost:8983/solr/core_K2_Depot/dataimport?command=status")
  echo "$status" | grep -q '"idle"' && break   # crude check: import finished
  sleep 10
done
rm -rf "C:/solr-6.6.1/server/solr/core_K2_Depot/Depot"/*   # remove the files no longer needed
There is no support for deleting external source files in Solr itself.

Apache Solr: How to access and index files from another server

Solr version :: 6.6.1
I am new to Apache Solr and currently exploring how to use it to search PDF files.
https://lucene.apache.org/solr/guide/6_6/uploading-structured-data-store-data-with-the-data-import-handler.html#the-tikaentityprocessor
I am able to index PDF files using the "BinFileDataSource" when they reside on the same server, as shown in the example below.
Now I want to know if there is a way to change the baseDir to point to a folder on a different server.
Please suggest an example of accessing the PDF files from another server. How should I write the path in the baseDir attribute?
<dataConfig>
  <dataSource type="BinFileDataSource"/> <!-- Local filesystem -->
  <document>
    <entity name="K2FileEntity" processor="FileListEntityProcessor" dataSource="null"
            recursive="true"
            baseDir="C:/solr-6.6.1/server/solr/core_K2_Depot/Depot" fileName=".*pdf" rootEntity="false">
      <field column="file" name="id"/>
      <field column="fileLastModified" name="lastmodified"/>
      <entity name="pdf" processor="TikaEntityProcessor" onError="skip"
              url="${K2FileEntity.fileAbsolutePath}" format="text">
        <field column="title" name="title" meta="true"/>
        <field column="dc:format" name="format" meta="true"/>
        <field column="text" name="text"/>
      </entity>
    </entity>
  </document>
</dataConfig>
I finally found the answer on the solr-user mailing list.
Just change the baseDir to the folder on the other server (SMB/UNC paths work directly):
baseDir="\\CLDServer2\RemoteK2Depot"

How to get the file name using FileListEntityProcessor and LineEntityProcessor

This is my data-config.xml. I can't use TikaEntityProcessor. Is there any way I can do it with LineEntityProcessor?
I am using Solr 4.4 to index millions of documents. I want the file names and modified times to be indexed as well, but could not find a way to do it.
In data-config.xml I fetch files using FileListEntityProcessor and then parse each line using LineEntityProcessor.
<dataConfig>
  <dataSource encoding="UTF-8" type="FileDataSource" name="fds"/>
  <document>
    <entity name="files"
            dataSource="null"
            rootEntity="false"
            processor="FileListEntityProcessor"
            baseDir="C:/Softwares/PlafFiles/"
            fileName=".*\.PLF"
            recursive="true">
      <field column="fileLastModified" name="last_modified"/>
      <entity name="na_04"
              processor="LineEntityProcessor"
              dataSource="fds"
              url="${files.fileAbsolutePath}"
              transformer="script:parseRow23">
        <field column="url" name="Plaf_filename"/>
        <field column="source"/>
        <field column="pict_id" name="pict_id"/>
        <field column="pict_type" name="pict_type"/>
        <field column="hierarchy_id" name="hierarchy_id"/>
        <field column="book_id" name="book_id"/>
        <field column="ciscode" name="ciscode"/>
        <field column="plaf_line"/>
      </entity>
    </entity>
  </document>
</dataConfig>
From the documentation of FileListEntityProcessor:
The implicit fields generated by the FileListEntityProcessor are fileDir, file, fileAbsolutePath, fileSize, fileLastModified and these are available for use within the entity [..].
You can move these values into differently named fields by referencing them:
<field column="file" name="filenamefield" />
<field column="fileLastModified" name="last_modified" />
This will require that you have a schema.xml that actually allows those two names.
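That could mean declarations along these lines in schema.xml (the field types here are assumptions; use whatever types your schema already defines):

<field name="filenamefield" type="string" indexed="true" stored="true"/>
<field name="last_modified" type="date" indexed="true" stored="true"/>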
If you need to use them in another string, or manipulate them further before indexing:
You're already using files.fileAbsolutePath, so by using ${files.file} and ${files.fileLastModified} you should be able to extract the values you want.
You can modify these values and insert them into a specific field by using the TemplateTransformer and referencing the generated fields:
<field column="filename" template="file:///${files.file}" />

How to include another XML file from within a Solr schema.xml?

I've got a number of different cores, each with its own schema, but they all share the same field types. I'd like to remove the duplication of the field type declarations and do something like this in my schema.xml files:
<?xml version="1.0" encoding="UTF-8" ?>
<schema name="foo" version="1.5">
  <fields>
    <field name="_version_" ...
    <field name="id" ...
    ...
  </fields>
  <uniqueKey>id</uniqueKey>
  <include "/path/to/field_types.xml">
</schema>
I don't see any mechanism in the docs to accomplish this, however. I found one post referring to this:
<xi:include href="/path/to/field_types.xml" />
But that gives me a launch error: The prefix "xi" for element "xi:include" is not bound.
Anybody have an idea how to perform this type of raw include?
From this past Solr issue - SOLR-3087 - it looks like <xi:include> is the correct syntax; you just need to include the xi namespace reference inline.
<?xml version="1.0" encoding="UTF-8" ?>
<schema name="foo" version="1.5">
  <fields>
    <field name="_version_" ...
    <field name="id" ...
    ...
  </fields>
  <uniqueKey>id</uniqueKey>
  <xi:include href="/path/to/field_types.xml" xmlns:xi="http://www.w3.org/2001/XInclude"/>
</schema>
Another clean solution to this problem is to add the resources as external entities:
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE schema [
  <!ENTITY schemafieldtypes SYSTEM "schemafieldtypes.xml">
]>
Then, in the XML, you can add this anywhere:
&schemafieldtypes;
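Putting the two pieces together, a complete schema.xml using external entities would look something like this (the entity and file names follow the snippet above):

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE schema [
  <!ENTITY schemafieldtypes SYSTEM "schemafieldtypes.xml">
]>
<schema name="foo" version="1.5">
  <fields>
    ...
  </fields>
  <uniqueKey>id</uniqueKey>
  <!-- expands to the contents of schemafieldtypes.xml at parse time -->
  &schemafieldtypes;
</schema>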

Unsupported type exception when importing documents from a database with Solr 4.0

I looked up the information provided in a related question to set up an import of all documents stored within a MySQL database.
You can find the original question here.
Thanks to the steps provided, I was able to make it work with a MySQL DB. My config looks identical to the one mentioned at the above link.
<dataConfig>
  <dataSource name="db"
              jndiName="java:jboss/datasources/somename"
              type="JdbcDataSource"
              convertType="false"/>
  <dataSource name="dastream" type="FieldStreamDataSource"/>
  <dataSource name="dareader" type="FieldReaderDataSource"/>
  <document name="docs">
    <entity name="doc" query="select * from document" dataSource="db">
      <field name="id" column="id"/>
      <field name="name" column="descShort"/>
      <entity name="comment"
              transformer="HTMLStripTransformer" dataSource="db"
              query="select id, body, subject from comment where iddoc='${doc.id}'">
        <field name="idComm" column="id"/>
        <field name="detail" column="body" stripHTML="true"/>
        <field name="subject" column="subject"/>
      </entity>
      <entity name="attachments"
              query="select id, attName, attContent, attContentType from Attachment where iddoc='${doc.id}'"
              dataSource="db">
        <field name="attachment_name" column="attName"/>
        <field name="idAttachment" column="id"/>
        <field name="attContentType" column="attContentType"/>
        <entity name="attachment"
                dataSource="dastream"
                processor="TikaEntityProcessor"
                url="attContent"
                dataField="attachments.attContent"
                format="text"
                onError="continue">
          <field column="text" name="attachment_detail"/>
        </entity>
      </entity>
    </entity>
  </document>
</dataConfig>
I have a variety of attachments in the DB, such as JPEG, PDF, Excel, Word and plain text. Everything works great for most of the binary data (JPEG, PDF, DOC and such), but the import fails for certain files. It appears that the data source is set up to throw an exception when it encounters a String instead of an InputStream. I set the onError="continue" flag on the "attachment" entity to ensure that the data import went through despite this error, and noticed that this problem happens for a number of files. The exception is given below. Ideas?
Exception in entity : attachment:java.lang.RuntimeException: unsupported type : class java.lang.String
at org.apache.solr.handler.dataimport.FieldStreamDataSource.getData(FieldStreamDataSource.java:89)
at org.apache.solr.handler.dataimport.FieldStreamDataSource.getData(FieldStreamDataSource.java:48)
at org.apache.solr.handler.dataimport.TikaEntityProcessor.nextRow(TikaEntityProcessor.java:103)
at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:243)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:465)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:491)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:491)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:404)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:319)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:227)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:422)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:487)
at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:468)
I know this is an outdated question, but it appears to me that this exception is thrown when the BLOB (I work with Oracle) is null. When I add a where clause like "blob_column is not null", the problem disappears for me (Solr 4.10.1).
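Applied to the config in the question, that would mean filtering the rows in the parent entity's query, along these lines (a standard SQL null check; the column names are taken from the question):

<entity name="attachments"
        dataSource="db"
        query="select id, attName, attContent, attContentType
               from Attachment
               where iddoc='${doc.id}' and attContent is not null">
  <!-- nested TikaEntityProcessor entity unchanged -->
</entity>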
