Multiple db queries with RABL and Mongoid/Moped

I have a simple app that uses RABL and Mongoid 3.1.0, with this action:
# index
@products = current_shop.products
The RABL code looks like this:
# index.json.rabl
collection @products
extends 'api/products/show'

# show.json.rabl
object @product
attributes :id
When I hit it, the logs show 4 queries related to @products:
MOPED: 127.0.0.1:27017 QUERY database=bamboo_development collection=products selector={"$query"=>{"shop_id"=>"511c7866fd896b1908000002"}, "$orderby"=>{:_id=>1}} flags=[:slave_ok] limit=-1 skip=0 batch_size=nil fields=nil (0.5682ms)
MOPED: 127.0.0.1:27017 QUERY database=bamboo_development collection=products selector={"$query"=>{"shop_id"=>"511c7866fd896b1908000002"}, "$orderby"=>{:_id=>1}} flags=[:slave_ok] limit=-1 skip=0 batch_size=nil fields=nil (1.1139ms)
MOPED: 127.0.0.1:27017 QUERY database=bamboo_development collection=products selector={"$query"=>{"shop_id"=>"511c7866fd896b1908000002"}, "$orderby"=>{:_id=>1}} flags=[:slave_ok] limit=0 skip=0 batch_size=nil fields=nil (0.4492ms)
MOPED: 127.0.0.1:27017 QUERY database=bamboo_development collection=products selector={"$query"=>{"shop_id"=>"511c7866fd896b1908000002"}, "$orderby"=>{:_id=>1}} flags=[:slave_ok] limit=0 skip=0 batch_size=nil fields=nil (0.4332ms)
When I do this instead:
@products = current_shop.products.search(params[:query])
render text: @products.map(&:name).join(",")
I get only one query related to @products:
MOPED: 127.0.0.1:27017 QUERY database=bamboo_development collection=products selector={"shop_id"=>"511c7866fd896b1908000002"} flags=[:slave_ok] limit=0 skip=0 batch_size=nil fields=nil (0.4611ms)
The duplicated queries look strange; does anyone know why this happens? No child objects are being loaded, just slightly different queries against the same collection, from what I can see.

Lots of people are hitting the same problem, most likely because of a RABL configuration issue. As recommended by Aymeric above, just do:
@products = current_shop.products.to_a
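For reference, a minimal sketch of the controller action with this fix (assuming a standard Rails controller; current_shop comes from the question):
# app/controllers/api/products_controller.rb (hypothetical path)
def index
  # to_a runs the Mongoid criteria once and holds the documents in an Array,
  # so RABL's collection/extends rendering reuses the loaded array instead of
  # re-running the query on each template pass.
  @products = current_shop.products.to_a
end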

Related

Mail merge result not displaying after connecting it to a database

I have a mail merge that I am currently developing. I connected the mail merge to a database (a particular query in the database) as the data source, but I see nothing when I press Alt+F9. I understand that Ctrl+A followed by F9 refreshes the data, but it is still not working. Strangely, it was showing before.
Below is the mail merge field code when I toggle back:
DATABASE \d "X:\\PM\\PM-O\\PM-OQ\\PM-OQRA\\PROCESSES\\02_Cosmetics\\98_ Database\\99_Release\\Dossier_Developer.accdb" \c "Provider=Microsoft.ACE.OLEDB.12.0;User ID=Admin;Data Source=X:\\PM\\PM-O\\PM-OQ\\PM-OQRA\\PROCESSES\\02_Cosmetics\\98_ Database\\99_Release\\Dossier_Developer.accdb;Mode=Read;Extended Properties=\"\";Jet OLEDB:System database=\"\";Jet OLEDB:Registry Path=\"\";Jet OLEDB:Engine Type=6;Jet OLEDB:Database Locking Mode=1;Jet OLEDB:Global Partial Bulk Ops=2;Jet OLEDB:Global Bulk Transactions=1;Jet OLEDB:New Database Password=\"\";Jet OLEDB:Create System Database=False;Jet OLEDB:Encrypt Database=False;Jet OLEDB:Don't Copy Locale on Compact=False;Jet OLEDB:Compact Without Replica Repair=False;Jet OLEDB:SFP=False;Jet OLEDB:Support Complex Data=False;Jet OLEDB:Bypass UserInfo Validation=False;Jet OLEDB:Limited DB Caching=False;Jet OLEDB:Bypass ChoiceField Validation=False" \s "SELECT * FROM Q_DOSSIER_LeapingBunny_S_C01_ConnectArticleNo" \l "16" \b "191" \h

NiFi connection to SqlServer for ExecuteSQL

I'm trying to import some data from different SqlServer databases using ExecuteSQL in NiFi, but it's returning an error. I've already imported a lot of other tables from MySQL databases without any problem, and I'm trying to use the same workflow structure for the SqlServer DBs.
The structure is as follows:
There's a .txt file with the list of tables to be imported.
This file is fetched, split, and updated, so there's a FlowFile for each table of each DB that has to be imported.
These FlowFiles are passed into ExecuteSQL, which executes their contents.
For example:
file.txt
table1
table2
table3
is turned into 3 different FlowFiles:
FlowFile1
SELECT * FROM table1
FlowFile2
SELECT * FROM table2
FlowFile3
SELECT * FROM table3
which are passed to ExecuteSQL.
Here is the configuration of ExecuteSQL (identical for SqlServer tables and MySQL ones):
[screenshot of the ExecuteSQL processor configuration]
As the only difference from the MySQL import is in the connectors, this is how a generic MySQL connector has been configured:
SETTINGS / PROPERTIES
Database Connection URL: jdbc:mysql://00.00.00.00/DataBase?zeroDateTimeBehavior=convertToNull&autoReconnect=true
Database Driver Class Name: com.mysql.jdbc.Driver
Database Driver Location(s): file:///path/mysql-connector-java-5.1.47-bin.jar
Database User: user
Password: Sensitive value set
Max Wait Time: 500 millis
Max Total Connections: 8
Validation query: No value set
And this is how a SqlServer connector has been configured:
SETTINGS / PROPERTIES
Database Connection URL: jdbc:jtds:sqlserver://00.00.00.00/DataBase;useNTLMv2=true;integratedSecurity=true;
Database Driver Class Name: net.sourceforge.jtds.jdbc.Driver
Database Driver Location(s): /path/connectors/jtds-1.3.1.jar
Database User: user
Password: Sensitive value set
Max Wait Time: -1
Max Total Connections: 8
Validation query: No value set
It has to be noted that one (and only one!) SqlServer connector works, and the ExecuteSQL processor imports the data without any problem. The even stranger thing is that the database reached via this connector is located in the same place as two others (the connection URL and user/password are identical), but only the first one works.
Note that I've also tried appending ?zeroDateTimeBehavior=convertToNull&autoReconnect=true to the SqlServer connections, supposing it was a problem with date types, but it didn't make any difference.
Here is the error that is being returned:
12:02:46 CEST ERROR f1553b83-a173-1c0f-93cb-1c32f0f46d1d
00.00.00.00:0000 ExecuteSQL[id=****] ExecuteSQL[id=****] failed to process session due to null; Processor Administratively Yielded for 1 sec: java.lang.AbstractMethodError
Error retrieved from logs:
ERROR [Timer-Driven Process Thread-49] o.a.nifi.processors.standard.ExecuteSQL ExecuteSQL[id=****] ExecuteSQL[id=****] failed to process session due to java.lang.AbstractMethodError; Processor Administratively Yielded for 1 sec: java.lang.AbstractMethodError
java.lang.AbstractMethodError: null
at net.sourceforge.jtds.jdbc.JtdsConnection.isValid(JtdsConnection.java:2833)
at org.apache.commons.dbcp2.DelegatingConnection.isValid(DelegatingConnection.java:874)
at org.apache.commons.dbcp2.PoolableConnection.validate(PoolableConnection.java:270)
at org.apache.commons.dbcp2.PoolableConnectionFactory.validateConnection(PoolableConnectionFactory.java:389)
at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:2398)
at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:2381)
at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:2110)
at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1563)
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:305)
at org.apache.nifi.dbcp.DBCPService.getConnection(DBCPService.java:49)
at sun.reflect.GeneratedMethodAccessor1696.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:84)
at com.sun.proxy.$Proxy449.getConnection(Unknown Source)
at org.apache.nifi.processors.standard.AbstractExecuteSQL.onTrigger(AbstractExecuteSQL.java:195)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

SOLR failed to create index for Cassandra 2.1.1 list which has a user-defined data type

I'm trying to integrate Cassandra 2.1.1 with SOLR, but SOLR fails to create the index, with the following error message:
16767 [qtp297774990-12] INFO org.apache.solr.handler.dataimport.DataImporter – Loading DIH Configuration: dataconfigCassandra.xml
16779 [qtp297774990-12] INFO org.apache.solr.handler.dataimport.DataImporter – Data Configuration loaded successfully
16788 [Thread-15] INFO org.apache.solr.handler.dataimport.DataImporter – Starting Full Import
16789 [qtp297774990-12] INFO org.apache.solr.core.SolrCore – [Entity_dev] webapp=/solr path=/dataimport params={optimize=false&indent=true&clean=true&commit=true&verbose=false&command=full-import&debug=false&wt=json} status=0 QTime=27
16810 [qtp297774990-12] INFO org.apache.solr.core.SolrCore – [Entity_dev] webapp=/solr path=/dataimport params={indent=true&command=status&_=1416042006354&wt=json} status=0 QTime=0
16831 [Thread-15] INFO org.apache.solr.handler.dataimport.SimplePropertiesWriter – Read dataimport.properties
16917 [Thread-15] INFO org.apache.solr.search.SolrIndexSearcher – Opening Searcher#6214b0dc[Entity_dev] realtime
16945 [Thread-15] INFO org.apache.solr.handler.dataimport.JdbcDataSource – Creating a connection for entity Entity with URL: jdbc:cassandra://10.234.31.153:9160/galaxy_dev
17082 [Thread-15] INFO org.apache.solr.handler.dataimport.JdbcDataSource – Time taken for getConnection(): 136
17429 [Thread-15] ERROR org.apache.solr.handler.dataimport.DocBuilder – Exception while processing: Entity document : SolrInputDocument(fields: []):org.apache.solr.handler.dataimport.DataImportHandlerException: Unable to execute query: select * from entity Processing Document # 1
at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:71)
at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.<init>(JdbcDataSource.java:283)
at org.apache.solr.handler.dataimport.JdbcDataSource.getData(JdbcDataSource.java:240)
at org.apache.solr.handler.dataimport.JdbcDataSource.getData(JdbcDataSource.java:44)
at org.apache.solr.handler.dataimport.SqlEntityProcessor.initQuery(SqlEntityProcessor.java:59)
at org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:73)
at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:243)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:476)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:415)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:330)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:232)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:416)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:480)
at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:461)
Caused by: java.lang.NullPointerException
at org.apache.cassandra.cql.jdbc.ListMaker.compose(ListMaker.java:61)
at org.apache.cassandra.cql.jdbc.TypedColumn.<init>(TypedColumn.java:68)
at org.apache.cassandra.cql.jdbc.CassandraResultSet.createColumn(CassandraResultSet.java:1174)
at org.apache.cassandra.cql.jdbc.CassandraResultSet.populateColumns(CassandraResultSet.java:240)
at org.apache.cassandra.cql.jdbc.CassandraResultSet.<init>(CassandraResultSet.java:200)
at org.apache.cassandra.cql.jdbc.CassandraStatement.doExecute(CassandraStatement.java:169)
at org.apache.cassandra.cql.jdbc.CassandraStatement.execute(CassandraStatement.java:205)
at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.<init>(JdbcDataSource.java:276)
... 12 more
CREATE TABLE dev.entity (
id uuid PRIMARY KEY,
begining int,
domain text,
domain_type text,
template_name text,
field_values list<frozen<fieldmap>>
)
User-defined type:
CREATE TYPE galaxy_dev.fieldmap (
key text,
value text );
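For context, a row in this table carries a list of fieldmap values; a hypothetical insert (keyspace and column names follow the listings above) would look like:
-- hypothetical example row; uuid() generates the primary key
INSERT INTO dev.entity (id, begining, domain, domain_type, template_name, field_values)
VALUES (uuid(), 0, 'example.com', 'demo', 'default',
        [{key: 'color', value: 'red'}, {key: 'size', value: 'L'}]);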
Please let me know what driver or JAR to use to create a SOLR index for the latest Cassandra.

SQL Server - query XML node DTS:ConnectionManager DTS:Name in T-SQL

I'm trying to query out the contents of both "fields" with DTS:Name="ConnectionString" (specifically, the text that begins with ""). There can be multiple; in this example, there are 2.
I can't figure out how to query it; between the colons and the repeated dts: prefixes, I'm stumped.
Any help appreciated.
<DTS:Executable xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:ExecutableType="SSIS.Package.2">
  <DTS:Property DTS:Name="SuppressConfigurationWarnings">0</DTS:Property>
  <DTS:ConnectionManager>
    <DTS:Property DTS:Name="DelayValidation">0</DTS:Property>
    <DTS:ObjectData>
      <DTS:ConnectionManager>
        <DTS:Property DTS:Name="Retain">0</DTS:Property>
        <DTS:Property DTS:Name="ConnectionString">Data Source=myserver;Initial Catalog=mydbname;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=false;Application Name=blah;</DTS:Property>
      </DTS:ConnectionManager>
    </DTS:ObjectData>
  </DTS:ConnectionManager>
  <DTS:ConnectionManager>
    <DTS:ObjectData>
      <DTS:ConnectionManager>
        <DTS:Property DTS:Name="Retain">0</DTS:Property>
        <DTS:Property DTS:Name="ConnectionString">Data Source=myserver2;Initial Catalog=mydb2;Provider=SQLNCLI10;Integrated Security=SSPI;Auto Translate=false;</DTS:Property>
      </DTS:ConnectionManager>
    </DTS:ObjectData>
  </DTS:ConnectionManager>
</DTS:Executable>
It's not totally clear, but I'm assuming you want the connection strings themselves. So, let's imagine the document is in an xml-typed column called XmlColumn in a table called #XmlTable; then you could do this:
;WITH XMLNAMESPACES ('www.microsoft.com/SqlServer/Dts' as dts)
SELECT Con.Str.value('.', 'varchar(400)')
FROM #XmlTable
CROSS APPLY XmlColumn.nodes('//dts:Property[@dts:Name="ConnectionString"]') as Con(Str)
Note that we need to handle the XML namespace using the WITH XMLNAMESPACES clause, and the semicolon at the start is not a mistake. Then we pass an XPath expression to the nodes() method of the xml type in order to retrieve the items you require.
See it in action here at SQL Fiddle.
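To try it end to end, a minimal, hypothetical setup (temp table plus a trimmed version of the package XML) could look like this:
-- Hypothetical test setup; the XML is a trimmed version of the sample above
CREATE TABLE #XmlTable (XmlColumn xml);
INSERT INTO #XmlTable (XmlColumn) VALUES
(N'<DTS:Executable xmlns:DTS="www.microsoft.com/SqlServer/Dts">
  <DTS:Property DTS:Name="ConnectionString">Data Source=myserver;Initial Catalog=mydbname;</DTS:Property>
</DTS:Executable>');
Running the query above against this table returns the single connection string.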

Execute stored procedure to select data - Mule 3.3.0

I'm trying to execute a stored procedure to SELECT data from a SQL Server 2008 database using Mule 3.3.0.
In the Mule docs there is info about doing this with Oracle. I'm not sure if this is possible with SQL Server.
This is my Mule endpoint config:
<jdbc:outbound-endpoint exchange-pattern="request-response" queryTimeout="-1" connector-ref="sqlServerConnector" queryKey="selectCoupons" doc:name="Database">
<jdbc:query key="selectCoupons" value="call sp_get_coupons()"/>
</jdbc:outbound-endpoint>
This is the output:
Root Exception stack trace:
java.sql.SQLException: The executeUpdate method must not return a result set.
at net.sourceforge.jtds.jdbc.JtdsStatement.processResults(JtdsStatement.java:603)
at net.sourceforge.jtds.jdbc.JtdsStatement.executeSQL(JtdsStatement.java:546)
at net.sourceforge.jtds.jdbc.JtdsPreparedStatement.executeUpdate(JtdsPreparedStatement.java:506)
+ 3 more (set debug level logging or '-Dmule.verbose.exceptions=true' for everything)
I'm using the jTDS driver. Testing the stored procedure with a JDBC client, I get the expected ResultSet.
Any suggestions?
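One answer shows the same kind of call made through the db:stored-procedure element (the newer Database connector namespace, configured here against Oracle rather than SQL Server), declaring the IN/OUT parameters explicitly; the JSON below is the payload it returns: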
<db:stored-procedure config-ref="Oracle_Configuration1" doc:name="Database">
<db:parameterized-query><![CDATA[{call apps.create_sales_Order(:p_header_rec_oper,:P_order_number,:P_ordered_date,:P_line_id,:p_flow_Status_code,:P_return_status)}]]></db:parameterized-query>
<db:in-param name="p_header_rec_oper" value="CREATE"/>
<db:out-param name="P_order_number" type="INTEGER"/>
<db:out-param name="P_ordered_date" type="DATE"/>
<db:out-param name="P_line_id" type="VARCHAR"/>
<db:out-param name="p_flow_Status_code" type="VARCHAR"/>
<db:out-param name="P_return_status" type="VARCHAR"/>
</db:stored-procedure>
{
  P_return_status: "S",
  P_line_id: "684229",
  p_flow_Status_code: "ENTERED",
  P_ordered_date: "2015-05-22",
  P_order_number: 69393
}
