Add dynamic field based on file path - Solr

I'm using DIH and Tika to index documents in different languages.
There's a folder for each language (e.g. /de/file001.pdf), and I want to extract the language from the path and then dynamically add a language-specific Solr field (e.g. text_de).
Here's my attempted solution:
<dataConfig>
  <script><![CDATA[
    function addField(row) {
      row.put('text_' + row.get('lang'), row.get('text'));
      return row;
    }
  ]]></script>
  <dataSource type="BinFileDataSource" />
  <document>
    <entity name="files" dataSource="null" rootEntity="false"
            processor="FileListEntityProcessor"
            baseDir="/tmp/documents" fileName=".*\.(doc)|(pdf)|(docx)"
            onError="skip"
            recursive="true"
            transformer="RegexTransformer" query="select * from files">
      <field column="fileAbsolutePath" name="id" />
      <field column="lang" regex=".*/(\w*)/.*" sourceColName="fileAbsolutePath"/>
      <entity name="documentImport"
              processor="TikaEntityProcessor"
              url="${files.fileAbsolutePath}"
              format="text"
              transformer="script:addField">
        <field column="date" name="date" meta="true"/>
        <field column="title" name="title" meta="true"/>
      </entity>
    </entity>
  </document>
</dataConfig>
This doesn't work because row contains the 'text' field but not the 'lang' field.

The approach is correct; the problem is that the row you receive is scoped to the current entity only.
To access the parent row, use the context variable that the script function receives as its second parameter. The Context is a ContextImpl instance, and on each invocation Solr's ScriptTransformer passes the same Context instance as the second argument (see transformRow).
The following script extracts the field value from the parent row and should address your problem:
<dataConfig>
  <script><![CDATA[
    function addField(row, context) {
      var lang = context.getParentContext().resolve('files.lang');
      row.put('text_' + lang, row.get('text'));
      return row;
    }
  ]]></script>
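Note that the dynamically named field also has to be known to the schema, or Solr will reject it at index time. A minimal schema.xml sketch, assuming per-language field types such as text_de and text_en are defined in the schema (field and type names here are illustrative):
<field name="text_de" type="text_de" indexed="true" stored="true"/>
<field name="text_en" type="text_en" indexed="true" stored="true"/>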

Related

Solr DataImport set field to specific value [duplicate]

I'm making an index in Solr from a database in the following way:
<document name="Index">
<entity name="c" query="SELECT * FROM C">
<field column="Name" name="name"/>
</entity>
<entity name="p" query="SELECT * FROM P">
<field column="Name" name="name"/>
</entity>
</document>
Is it possible to have a static field, set for each row, that signifies which type is being returned to the client, so that one can query the right database table based on that information in the JSON result? That is, a field that has no corresponding column in the table:
<field name="id" value="1"/>
Or is there another way to solve this?
<document name="Index">
<entity name="c" transformer="TemplateTransformer" query="SELECT * FROM C">
<field column="Name" name="name"/>
<field column="id" template="1"/>
</entity>
<entity name="p" transformer="TemplateTransformer" query="SELECT * FROM P">
<field column="Name" name="name"/>
<field column="id" template="1"/>
</entity>
</document>
You can add a column to your SQL query that contains static data like this:
<document name="Index">
<entity name="c" query="SELECT *, 'foo' as NameFromC FROM C">
<field column="NameFromC" name="name"/>
</entity>
<entity name="p" query="SELECT *, 'bar' as NameFromP FROM P">
<field column="NameFromP" name="name"/>
</entity>
</document>
If you try to add a field with only name and template attributes, Solr will throw an error saying Field must have a column attribute.
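If the goal is specifically to tell the client which table each document came from, a dedicated discriminator field may be clearer than reusing id. A sketch using the same TemplateTransformer approach, where source_table is a hypothetical field that would also need to be declared in schema.xml:
<document name="Index">
  <entity name="c" transformer="TemplateTransformer" query="SELECT * FROM C">
    <field column="Name" name="name"/>
    <field column="source_table" template="c"/>
  </entity>
  <entity name="p" transformer="TemplateTransformer" query="SELECT * FROM P">
    <field column="Name" name="name"/>
    <field column="source_table" template="p"/>
  </entity>
</document>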

Solr dataimport change dataSource dynamically

I have set up the following data import configuration for about 20 mdb files using UCanAccess:
<?xml version="1.0" encoding="UTF-8" ?>
<dataConfig>
  <dataSource name="a" driver="net.ucanaccess.jdbc.UcanaccessDriver" type="JdbcDataSource" url="jdbc:ucanaccess://E:/feqh/main.mdb;memory=false" />
  <dataSource name="a1" driver="net.ucanaccess.jdbc.UcanaccessDriver" type="JdbcDataSource" url="jdbc:ucanaccess://E:/feqh/A/1.mdb;memory=false" />
  <dataSource name="a2" driver="net.ucanaccess.jdbc.UcanaccessDriver" type="JdbcDataSource" url="jdbc:ucanaccess://E:/feqh/A/2.mdb;memory=false" />
  <dataSource name="a3" driver="net.ucanaccess.jdbc.UcanaccessDriver" type="JdbcDataSource" url="jdbc:ucanaccess://E:/feqh/A/3.mdb;memory=false" />
  <dataSource name="a4" driver="net.ucanaccess.jdbc.UcanaccessDriver" type="JdbcDataSource" url="jdbc:ucanaccess://E:/feqh/A/4.mdb;memory=false" />
  <!-- and so on -->
  <document>
    <entity name="Book" dataSource="a"
            query="select bkid AS id, bkid AS BookID, bk AS BookTitle, betaka AS BookInfo, cat as cat from 0bok">
      <field column="id" name="id"/>
      <field column="BookID" name="BookID"/>
      <field column="BookTitle" name="BookTitle"/>
      <field column="cat" name="cat"/>
      <entity name="Category" dataSource="a"
              query="select name as CatName, catord as CatWeight, Lvl as CatLevel from 0cat where id = ${Book.CAT}">
        <field column="CatName" name="CatName"/>
        <field column="CatWeight" name="CatWeight"/>
        <field column="CatLevel" name="CatLevel"/>
      </entity>
      <entity name="Pages" dataSource="a5" onError="continue"
              query="SELECT nass AS PageContent, page AS pageNum FROM b${Book.ID} ORDER BY page">
        <field column="PageContent" name="PageContent" />
        <field column="PageNum" name="PageNum" />
        <entity name="Titles" dataSource="a5" onError="continue"
                query="SELECT * FROM t${Book.ID} WHERE id = ${Pages.PAGE} ORDER BY sub">
          <field column="ID" name="TitleID"/>
          <field column="TIT" name="PageTitle"/>
          <field column="SUB" name="TitleWeight"/>
          <field column="LVL" name="TitleLevel"/>
        </entity>
      </entity>
    </entity>
  </document>
</dataConfig>
Every time I wanted to import from a different dataSource, I had to change the dataSource attribute manually for both the Pages and Titles entities, then run the data import without clean. With more than 600 mdb files, that is no longer a workable option. Is there any way to loop inside the config? In other words: there is a main mdb file that holds all book titles and categories, and every book has its own mdb file named after its id, for example 245.mdb for the book with id 245. So I need to change the dataSource for Pages and Titles dynamically.
You cannot create dataSources in a loop, but I believe you can pass dataSource information in a request parameter. So, perhaps, you can loop over your collection outside of Solr and trigger DIH with the correct source as a parameter.
Just make sure to run DIH in sync mode so that different calls do not step on each other (I think the parameter is syncMode).
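DIH can resolve request parameters inside the config through ${dataimporter.request.<name>}, so one way to follow this suggestion is to replace the hundreds of per-book dataSources with a single parameterised one. A sketch, assuming the placeholder is resolved in the dataSource url as well (the bookId parameter name is illustrative):
<dataSource name="book" driver="net.ucanaccess.jdbc.UcanaccessDriver" type="JdbcDataSource"
            url="jdbc:ucanaccess://E:/feqh/A/${dataimporter.request.bookId}.mdb;memory=false" />
The Pages and Titles entities would then use dataSource="book", and an external script would loop over the book ids and call the handler once per file, e.g. /dataimport?command=full-import&clean=false&bookId=245.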

Extract file name (without extension) while indexing using Data Import Handler in Solr

I'm successfully able to index pdf, doc, ppt, etc. files using the Data Import Handler in Solr 4.3.0.
My data-config.xml looks like this:
<dataConfig>
  <dataSource name="bin" type="BinFileDataSource" />
  <document>
    <entity name="f" dataSource="null" rootEntity="false"
            processor="FileListEntityProcessor"
            baseDir="C:\Users\aroraarc\Desktop\Impdo"
            fileName=".*\.(DOC)|(PDF)|(pdf)|(doc)|(docx)|(ppt)|(pptx)|(xls)|(xlsx)|(txt)" onError="skip"
            recursive="true">
      <field column="fileAbsolutePath" name="path" />
      <field column="fileSize" name="size" />
      <field column="fileLastModified" name="lastmodified" />
      <field column="file" name="fileName"/>
      <entity name="tika-test" dataSource="bin" processor="TikaEntityProcessor"
              url="${f.fileAbsolutePath}" format="text" onError="skip">
        <field column="Author" name="author" meta="true"/>
        <field column="title" name="title" meta="true"/>
        <field column="text" name="content"/>
      </entity>
    </entity>
  </document>
</dataConfig>
However, in the fileName field I want to insert the pure file name without the extension. E.g. instead of 'HelloWorld.txt' I want only 'HelloWorld' to be inserted in the fileName field. How do I achieve this?
Thanks in advance!
Check the ScriptTransformer to replace or change the value before it is indexed.
Example -
Data config - add the transformer script -
<script><![CDATA[
  function changeFileName(row) {
    var fileName = row.get('fileName');
    // Strip the extension, i.e. everything from the last '.' onwards
    var dotIndex = fileName.lastIndexOf('.');
    if (dotIndex > 0) {
      row.put('fileName', fileName.substring(0, dotIndex));
    }
    return row;
  }
]]></script>
Entity mapping -
<entity name="f" transformer="script:changeFileName" ....>
......
</entity>

How to show filenames in search results using Solr's FileListEntityProcessor

I am trying to scan all pdf/doc files in a directory. This works fine and I am able to scan all documents.
The next thing I'm trying to do is also receive the filename of the file in the search results. The filename, however, never shows up. I tried a couple of things, but the documentation is not very helpful about how to do this.
I am using the Solr configuration found in the Solr distribution: apache-solr-3.1.0/example/example-DIH/solr/tika/conf
This is my dataConfig:
<dataConfig>
  <dataSource type="BinFileDataSource" name="bin"/>
  <document>
    <entity name="f" processor="FileListEntityProcessor" recursive="true"
            rootEntity="false" dataSource="null" baseDir="C:/solrtestsmall"
            fileName=".*\.(DOC)|(PDF)|(pdf)|(doc)" onError="skip">
      <entity name="tika-test" processor="TikaEntityProcessor"
              url="${f.fileAbsolutePath}" format="text" dataSource="bin"
              onError="skip">
        <field column="Author" name="author" meta="true"/>
        <field column="title" name="title" meta="true"/>
        <field column="text" name="text"/>
      </entity>
      <field column="fileName" name="fileName"/>
    </entity>
  </document>
</dataConfig>
I am interested in how to configure this correctly, and also in any other places where I can find specific documentation.
You should use file instead of fileName in the column attribute:
<field column="file" name="fileName"/>
Don't forget to add the 'fileName' to the schema.xml in the fields section.
<field name="fileName" type="string" indexed="true" stored="true" />

Solr DIH - a problem with delta-imports

There is a problem when I use Solr 1.3 delta-imports to update the index. I have added a "last_modified" column to the table. After I use the "full-import" command to index the database data, the "dataimport.properties" file contains nothing, and when I use the "delta-import" command to update the index, Solr lists all the data in the database, not just the latest data. My db-data-config.xml:
<?xml version="1.0" encoding="UTF-8" ?>
<dataConfig>
  <dataSource driver="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost:3306/funguide" user="root" password="root"/>
  <document name="shopinfo">
    <entity name="shop" pk="shop_id"
            query="select shop_id,title,description,tel,address,longitude,latitude from shop"
            deltaQuery="select shop_id from shop where last_modified > '${dataimporter.last_index_time}'">
      <field column="shop_id" name="id" />
      <field column="title" name="title" />
      <field column="description" name="description" />
      <field column="tel" name="tel" />
      <field column="address" name="address" />
      <field column="longitude" name="longitude" />
      <field column="latitude" name="latitude" />
    </entity>
  </document>
</dataConfig>
Anybody know how to solve the problem? Thanks!
I also would recommend upgrading to Solr 1.4 RC, as there have been quite a few improvements made to delta-imports with DataImportHandler. Please see the "DataImportHandler - Using delta-import command" wiki page for specifics.
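For reference, a delta-capable entity normally defines both a deltaQuery (which returns the primary keys of changed rows) and a deltaImportQuery (which fetches those rows by key). A sketch based on the config above, not verified against this schema:
<entity name="shop" pk="shop_id"
        query="select shop_id,title,description,tel,address,longitude,latitude from shop"
        deltaQuery="select shop_id from shop where last_modified > '${dataimporter.last_index_time}'"
        deltaImportQuery="select shop_id,title,description,tel,address,longitude,latitude from shop where shop_id = '${dataimporter.delta.shop_id}'">
  <!-- field mappings as in the config above -->
</entity>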
