MyBatis - returning empty value when IN operator is used - ibatis

I have a query like the one below:
<select id="getTableName" resultType="java.lang.String" parameterType="java.lang.String">
Select Distinct TableName From TABLE_COLUMN_MAPPING Where COLUMNNAME IN (#{columnNames})
</select>
I am calling the service method, passing the values as below:
sampleService.getTableName("'BANKRM','APPLICANTID','APPLICANTTYPE','APPLICANTTYPE','TOTAL','SCQUES9'");
but it returns an empty list: [ ]
Why am I getting an empty List instead of the expected values?
P.S.: I checked the log, and the statements are created without any issues:
2013-03-22 19:16:27,521 DEBUG [main] org.apache.ibatis.datasource.pooled.PooledDataSource (debug:27) - Created connection 23845098.
2013-03-22 19:16:27,590 DEBUG [main] java.sql.Connection (debug:27) - ooo Connection Opened
2013-03-22 19:16:27,670 DEBUG [main] java.sql.PreparedStatement (debug:27) - ==> Executing: Select Distinct TableName From Config_FieldDetails Where COLUMNNAME IN (?)
2013-03-22 19:16:27,670 DEBUG [main] java.sql.PreparedStatement (debug:27) - ==> Parameters: 'BANKRM','APPLICANTID','APPLICANTTYPE','APPLICANTTYPE','TOTAL','SCQUES9'(String)
2013-03-22 19:16:27,954 DEBUG [main] java.sql.Connection (debug:27) - xxx Connection Closed
2013-03-22 19:16:27,955 DEBUG [main] org.apache.ibatis.datasource.pooled.PooledDataSource (debug:27) - Returned connection 23845098 to pool.
2013-03-22 19:16:27,955 INFO [main] com.hcl.cob.mybatis.bpm.service.impl.COBBPMCommonServiceImpl (getTableNameFromConfigFieldDetails:41) - getTableNameFromConfigFieldDetails(String columnName) -- End

Put the list of column names in one List parameter and then pass that to the query:
List<String> columnNames = Arrays.asList("BANKRM", "APPLICANTID", "APPLICANTTYPE", "APPLICANTTYPE", "TOTAL", "SCQUES9");
sampleService.getTableName(columnNames);
Then update the query to take a list and use the foreach tag to iterate over the list values and build the IN clause. (Note: for collection="columnNames" to resolve, annotate the mapper method parameter with @Param("columnNames"); a bare List parameter is otherwise exposed under the name list.)
<select id="getTableName" resultType="java.lang.String" parameterType="java.util.List">
Select Distinct TableName From TABLE_COLUMN_MAPPING Where COLUMNNAME IN
<foreach item="columnName" index="index" collection="columnNames" open="(" separator="," close=")">
#{columnName}
</foreach>
</select>

If you instead want the value substituted directly into the SQL text when the PreparedStatement is created (string substitution rather than a bound parameter), use the ${} notation, keeping in mind that this is vulnerable to SQL injection:
${columnNames}
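If the caller currently holds the whole list as one quoted, comma-joined string, it can be converted into a proper List first. A minimal sketch (the helper name toColumnList is hypothetical, not part of MyBatis):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class Main {
    // Hypothetical helper: turn "'A','B','C'" into ["A", "B", "C"]
    static List<String> toColumnList(String quotedCsv) {
        return Arrays.stream(quotedCsv.split(","))
                .map(s -> s.trim().replaceAll("^'|'$", ""))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> cols = toColumnList("'BANKRM','APPLICANTID','TOTAL'");
        System.out.println(cols); // prints [BANKRM, APPLICANTID, TOTAL]
    }
}
```

The resulting List can then be passed to the mapper method directly.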


How to create SQL language anonymous block in Snowflake with dynamic DDL statements

I have over two dozen tasks in our Snowflake database, all named in a similar pattern ending with a number (example: TSK_x, where x = 1, 2, ..., 27).
I am trying to write a procedure or anonymous block in Snowflake (without using a JavaScript stored procedure) to generate the task statements in descending number order and execute them from inside the procedure, like:
ALTER TASK TSK_27 RESUME;
ALTER TASK TSK_26 RESUME;
...
ALTER TASK TSK_1 RESUME;
The task (TSK_1) is the parent task and needs to be enabled last.
As background, this script will be included in Jenkins as part of our build. Our Jenkins setup does not allow multiple SQL statements in one file, so I am thinking of a stored procedure like the one mentioned above.
Any help/suggestions would be much appreciated. I am new to Snowflake.
To generate the statements in descending task-number order, first execute:
show tasks;
created_on                     name     state
2022-06-02 12:53:23.662 -0700  T1       started
2022-06-13 20:11:11.032 -0700  TASK_1   started
2022-06-13 20:24:20.211 -0700  TASK_10  started
2022-06-13 20:11:17.883 -0700  TASK_2   started
2022-06-13 20:24:10.871 -0700  TASK_2A  suspended
2022-06-13 20:11:22.769 -0700  TASK_3   started
2022-06-13 20:11:26.497 -0700  TASK_4   started
2022-06-13 20:11:30.725 -0700  TASK_5   started
2022-06-13 20:11:34.765 -0700  TASK_6   started
2022-06-13 20:11:38.313 -0700  TASK_7   started
Then run this query (change the order clause as needed; add desc at the end for descending order):
select "name" as name,"state" from table(result_scan(LAST_QUERY_ID()))
where regexp_like("name",'TASK_[[:digit:]]+$')
order by substr("name",1,4), to_number(substr("name",6));
NAME     state
TASK_1   started
TASK_2   started
TASK_3   started
TASK_4   started
TASK_5   started
TASK_6   started
TASK_7   started
TASK_10  started
Anonymous block to resume the tasks (the desc ordering means the parent task, TASK_1, is resumed last):
show tasks;
EXECUTE IMMEDIATE $$
DECLARE
  p_tsk string;
  c1 CURSOR FOR select "name" as name from table(result_scan(LAST_QUERY_ID()))
    where regexp_like("name",'TASK_[[:digit:]]+$')
    order by substr("name",1,4), to_number(substr("name",6)) desc;
BEGIN
  for record in c1 do
    p_tsk := record.name;
    execute immediate 'alter task ' || :p_tsk || ' resume';
  end for;
  RETURN p_tsk;
END;
$$
;
To recursively resume all dependent tasks tied to a root task in a simple tree of tasks, query the SYSTEM$TASK_DEPENDENTS_ENABLE function rather than enabling each task individually (using ALTER TASK … RESUME).
Example:
select system$task_dependents_enable('mydb.myschema.mytask');
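The descending statement generation itself is plain string work, and can be sketched outside Snowflake as well. A small Java sketch (the TSK_ prefix and the count of 27 come from the question) that emits the parent task's statement last:

```java
import java.util.ArrayList;
import java.util.List;

public class Main {
    // Generate ALTER TASK ... RESUME statements in descending number order,
    // so the parent task (TSK_1) is resumed last.
    static List<String> resumeStatements(String prefix, int count) {
        List<String> stmts = new ArrayList<>();
        for (int i = count; i >= 1; i--) {
            stmts.add("ALTER TASK " + prefix + i + " RESUME;");
        }
        return stmts;
    }

    public static void main(String[] args) {
        resumeStatements("TSK_", 27).forEach(System.out::println);
    }
}
```

Such a generator could feed the statements one at a time to whatever executes them, keeping each file to a single SQL statement as the Jenkins setup requires.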

No sample data added to Apache Atlas Server: running apache quick_start.py

I have installed Apache Atlas with embedded HBase and Solr on RHEL. I am able to access http://localhost:21000, but when I run /apache-atlas-sources-2.0.0/distro/target/apache-atlas-2.0.0/bin/quick_start.py it throws the error below.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.PatternLayout.
log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.PatternLayout.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.PatternLayout.
log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.PatternLayout.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.PatternLayout.
log4j:WARN No such property [maxFileSize] in org.apache.log4j.PatternLayout.
log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.PatternLayout.
Enter username for atlas :- atlas
Enter password for atlas :-
Creating sample types:
Exception in thread "main" org.apache.atlas.AtlasServiceException: Metadata service API org.apache.atlas.AtlasClientV2$API_V2#2507d7cd failed with status 401 (Unauthorized) Response Body ()
at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:427)
at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:353)
at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:229)
at org.apache.atlas.AtlasClientV2.createAtlasTypeDefs(AtlasClientV2.java:232)
at org.apache.atlas.examples.QuickStartV2.createTypes(QuickStartV2.java:213)
at org.apache.atlas.examples.QuickStartV2.runQuickstart(QuickStartV2.java:163)
at org.apache.atlas.examples.QuickStartV2.main(QuickStartV2.java:147)
No sample data added to Apache Atlas Server.
Below is the quick_start.log
2020-07-11 23:03:25,336 INFO - [main:] ~ Looking for atlas-application.properties in classpath (ApplicationProperties:110)
2020-07-11 23:03:25,341 INFO - [main:] ~ Loading atlas-application.properties from file:/apache-atlas-sources-2.0.0/distro/target/apache-atlas-2.0.0/conf/atlas-application.properties (ApplicationProperties:123)
2020-07-11 23:03:25,387 INFO - [main:] ~ Using graphdb backend 'janus' (ApplicationProperties:273)
2020-07-11 23:03:25,399 INFO - [main:] ~ Using storage backend 'hbase2' (ApplicationProperties:284)
2020-07-11 23:03:25,399 INFO - [main:] ~ Using index backend 'solr' (ApplicationProperties:295)
2020-07-11 23:03:25,399 INFO - [main:] ~ Setting solr-wait-searcher property 'true' (ApplicationProperties:301)
2020-07-11 23:03:25,400 INFO - [main:] ~ Setting index.search.map-name property 'false' (ApplicationProperties:305)
2020-07-11 23:03:25,400 INFO - [main:] ~ Property (set to default) atlas.graph.cache.db-cache = true (ApplicationProperties:318)
2020-07-11 23:03:25,400 INFO - [main:] ~ Property (set to default) atlas.graph.cache.db-cache-clean-wait = 20 (ApplicationProperties:318)
2020-07-11 23:03:25,400 INFO - [main:] ~ Property (set to default) atlas.graph.cache.db-cache-size = 0.5 (ApplicationProperties:318)
2020-07-11 23:03:25,400 INFO - [main:] ~ Property (set to default) atlas.graph.cache.tx-cache-size = 15000 (ApplicationProperties:318)
2020-07-11 23:03:25,400 INFO - [main:] ~ Property (set to default) atlas.graph.cache.tx-dirty-size = 120 (ApplicationProperties:318)
2020-07-11 23:03:37,405 INFO - [main:] ~ Client has only one service URL, will use that for all actions: http://localhost:21000 (AtlasBaseClient:321)
2020-07-11 23:03:37,930 INFO - [main:] ~ method=POST path=api/atlas/v2/types/typedefs/ contentType=application/json; charset=UTF-8 accept=application/json status=401 (AtlasBaseClient:387)
Any help is appreciated
The default username/password for Apache Atlas is admin/admin.
This is configurable from the conf/users-credentials.properties file when using file-based authentication.
Steps to get the admin password when authentication is file-based:
Step 1: Go to the Atlas conf directory:
$ATLAS_HOME/conf
Step 2: Find the users-credentials.properties file and get the sha256-password value. In the example below, the stored hash is 8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918:
#username=group::sha256-password
admin=ROLE_ADMIN::8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918
Step 3: Look up the hash with an online SHA-256 lookup tool (SHA-256 is a one-way hash, so these tools match against known values rather than decrypting):
https://md5decrypt.net/en/Sha256/
The password is admin.
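You can also verify a candidate password locally by computing its SHA-256 and comparing it to the stored hash. A minimal Java sketch using only the standard library:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class Main {
    // Hash a candidate password with SHA-256 and render it as lowercase hex,
    // the same format used in users-credentials.properties.
    static String sha256Hex(String input) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Matches the hash stored for the admin user
        System.out.println(sha256Hex("admin"));
    }
}
```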
quick_start.py asks for a username and password; the default for both is admin:
Enter username for atlas :- admin
Enter password for atlas :- admin
Then, you should see the following log in your terminal:
Creating sample types:
Created type [DB]
Created type [Table]
Created type [StorageDesc]
Created type [Column]
Created type [LoadProcess]
Created type [LoadProcessExecution]
Created type [View]
Created type [JdbcAccess]
Created type [ETL]
Created type [Metric]
Created type [PII]
Created type [Fact]
Created type [Dimension]
Created type [Log Data]
Created type [Table_DB]
Created type [View_DB]
Created type [View_Tables]
Created type [Table_Columns]
Created type [Table_StorageDesc]
Creating sample entities:
Created entity of type [DB], guid: f0ff2e1a-bc50-40e0-84ac-926ca74258b7
Created entity of type [DB], guid: 6f7768af-604b-4628-ac3f-f40b3caed850
Created entity of type [DB], guid: 0cba0440-5ccb-43f3-b3fe-bd756afefeb1
Created entity of type [Table], guid: 64db3739-4a97-47f1-86e1-9882b4cefa33
Created entity of type [Table], guid: 199bc872-2eeb-4ec6-9f60-19b303edb06f
Created entity of type [Table], guid: c2ff0389-cb52-46c2-bb2a-d3aa712282f9
Created entity of type [Table], guid: 48b088de-ca96-43d9-bc13-8038c73409f8
Created entity of type [Table], guid: 023a8a69-ce0f-4c3c-bacb-f38c3d8ff725
Created entity of type [Table], guid: cac58477-fc1e-4b78-b3fa-7537b3a5d603
Created entity of type [Table], guid: 7a2863cf-3f2e-47e0-a33e-75cb906403c5
Created entity of type [Table], guid: bee4250d-c4bd-47b4-a7f1-b102cc440fb4
Created entity of type [View], guid: 2eeb8ffe-1fe2-499c-8b55-3e1b8c572d0b
Created entity of type [View], guid: da787577-ae9b-4d9b-b4b4-935fe92a8fe0
Created entity of type [LoadProcess], guid: c785a52a-69fc-4d8f-b1b0-704f25a583df
Created entity of type [LoadProcessExecution], guid: 169778ed-7ef1-4e28-b792-897f9a6d2a70
Created entity of type [LoadProcessExecution], guid: b8832398-812c-45ed-9223-ac3503263926
Created entity of type [LoadProcess], guid: 6a679027-f8af-4466-b620-768cfd3cd7eb
Created entity of type [LoadProcessExecution], guid: 5214dcea-37c4-48af-b8dd-46b9230d00cd
Created entity of type [LoadProcessExecution], guid: 9c4d95b0-cb3f-4d8a-8c90-14ab7fa7b545
Created entity of type [LoadProcess], guid: 334bcd6c-636d-4f16-9136-05b91020b055
Created entity of type [LoadProcessExecution], guid: c84f21c2-528e-4415-91cd-4fdecc7a3742
Created entity of type [LoadProcessExecution], guid: 3af888c1-4f56-48b3-b411-9ae25ebb21a8
Sample DSL Queries:
query [from DB] returned [3] rows.
query [DB] returned [3] rows.
query [DB where name=%22Reporting%22] returned [1] rows.
query [DB where name=%22encode_db_name%22] returned [ 0 ] rows.
query [Table where name=%2522sales_fact%2522] returned [1] rows.
query [DB where name="Reporting"] returned [1] rows.
query [DB where DB.name="Reporting"] returned [1] rows.
query [DB name = "Reporting"] returned [1] rows.
query [DB DB.name = "Reporting"] returned [1] rows.
query [DB where name="Reporting" select name, owner] returned [1] rows.
query [DB where DB.name="Reporting" select name, owner] returned [1] rows.
query [DB has name] returned [3] rows.
query [DB where DB has name] returned [3] rows.
query [DB is JdbcAccess] returned [ 0 ] rows.
query [from Table] returned [8] rows.
query [Table] returned [8] rows.
query [Table is Dimension] returned [5] rows.
query [Column where Column isa PII] returned [3] rows.
query [View is Dimension] returned [2] rows.
query [Column select Column.name] returned [10] rows.
query [Column select name] returned [9] rows.
query [Column where Column.name="customer_id"] returned [1] rows.
query [from Table select Table.name] returned [8] rows.
query [DB where (name = "Reporting")] returned [1] rows.
query [DB where DB is JdbcAccess] returned [ 0 ] rows.
query [DB where DB has name] returned [3] rows.
query [DB as db1 Table where (db1.name = "Reporting")] returned [ 0 ] rows.
query [Dimension] returned [9] rows.
query [JdbcAccess] returned [2] rows.
query [ETL] returned [10] rows.
query [Metric] returned [4] rows.
query [PII] returned [3] rows.
query [`Log Data`] returned [4] rows.
query [Table where name="sales_fact", columns] returned [4] rows.
query [Table where name="sales_fact", columns as column select column.name, column.dataType, column.comment] returned [4] rows.
query [from DataSet] returned [10] rows.
query [from Process] returned [3] rows.
Sample Lineage Info:
time_dim(Table) -> loadSalesDaily(LoadProcess)
loadSalesMonthly(LoadProcess) -> sales_fact_monthly_mv(Table)
sales_fact(Table) -> loadSalesDaily(LoadProcess)
sales_fact_daily_mv(Table) -> loadSalesMonthly(LoadProcess)
loadSalesDaily(LoadProcess) -> sales_fact_daily_mv(Table)
Sample data added to Apache Atlas Server.

Liquibase lock error AFTER database creation

I'm currently running a DDL script using the Liquibase Java API. The whole script and the corresponding changeSet are executed successfully. However, after this execution Liquibase throws a LockException.
The error log is as follows:
21713 [main] DEBUG liquibase.ext.mssql.database.MSSQLDatabase - Executing Statement: ALTER
TABLE [dbo].[VALIDATIONEXECUTORS] CHECK CONSTRAINT [FK_MSTAPPTYPE_VLDTNEXCUTORS]
21713 [main] INFO liquibase.executor.jvm.JdbcExecutor - ALTER TABLE [dbo].[VALIDATIONEXECUTORS]
CHECK CONSTRAINT [FK_MSTAPPTYPE_VLDTNEXCUTORS]
21715 [main] DEBUG liquibase.executor.jvm.JdbcExecutor - 0 row(s) affected
21715 [main] DEBUG liquibase.ext.mssql.database.MSSQLDatabase - Executing Statement: COMMIT
21715 [main] INFO liquibase.executor.jvm.JdbcExecutor - COMMIT
21735 [main] DEBUG liquibase.executor.jvm.JdbcExecutor - -1 row(s) affected
21735 [main] INFO liquibase.changelog.ChangeSet - SQL in file
E:\\LQBASE\\LiquibaseDemo\\src\\main\\resources\\db\\changelog\\ddl\\DBSchema.sql executed
21737 [main] INFO liquibase.changelog.ChangeSet - ChangeSet
src/main/resources/db/changelog/ddl_changelog.xml::Create_DB::skini ran successfully in 18064ms
21738 [main] INFO liquibase.executor.jvm.JdbcExecutor - select schema_name()
21739 [main] INFO liquibase.executor.jvm.JdbcExecutor - SELECT MAX(ORDEREXECUTED) FROM
IND_DEV.DATABASECHANGELOG
21742 [main] INFO liquibase.executor.jvm.JdbcExecutor - select schema_name()
21744 [main] DEBUG liquibase.executor.jvm.JdbcExecutor - Release Database Lock
21745 [main] INFO liquibase.executor.jvm.JdbcExecutor - select schema_name()
21747 [main] DEBUG liquibase.executor.jvm.JdbcExecutor - UPDATE IND_DEV.DATABASECHANGELOGLOCK
SET LOCKED = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE ID = 1
21749 [main] INFO liquibase.executor.jvm.JdbcExecutor - select schema_name()
21751 [main] INFO liquibase.lockservice.StandardLockService - Successfully released change log
lock
21752 [main] ERROR liquibase.Liquibase - Could not release lock
liquibase.exception.LockException: liquibase.exception.DatabaseException: Error executing SQL
UPDATE IND_DEV.DATABASECHANGELOGLOCK SET LOCKED = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE
ID = 1: Invalid object name 'IND_DEV.DATABASECHANGELOGLOCK'.
at liquibase.lockservice.StandardLockService.releaseLock(StandardLockService.java:357)
at liquibase.Liquibase.update(Liquibase.java:206)
at liquibase.Liquibase.update(Liquibase.java:179)
at liquibase.Liquibase.update(Liquibase.java:175)
at liquibase.Liquibase.update(Liquibase.java:168)
at
com.sk.liquibase.LiquibaseDemo.LiquibaseConfig.createManageIDDatabase(LiquibaseConfig.java:34)
at com.sk.liquibase.LiquibaseDemo.App.main(App.java:12)
Caused by: liquibase.exception.DatabaseException: Error executing SQL UPDATE
IND_DEV.DATABASECHANGELOGLOCK SET LOCKED = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE ID = 1:
Invalid object name 'IND_DEV.DATABASECHANGELOGLOCK'.
According to the error, IND_DEV (which is the DB username) is somehow being prefixed to the DATABASECHANGELOGLOCK table name as a schema. Does anyone have any idea what the issue could be?
Sometimes, if the update application is stopped abruptly, the lock remains stuck, possibly because a killed Liquibase process never released it.
In that case, running
UPDATE DATABASECHANGELOGLOCK SET LOCKED=0, LOCKGRANTED=null, LOCKEDBY=null;
against the database helps.
Alternatively, you can simply drop the DATABASECHANGELOGLOCK table (or whatever changelog-lock table name you have configured); it will be recreated.
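For completeness, the manual unlock can also be run from Java via plain JDBC. This is only a sketch: the lock table name and connection details are assumptions, and the actual execution (which needs a live database) is left commented out:

```java
public class Main {
    // Build the unlock statement for a given changelog-lock table name
    static String unlockSql(String lockTable) {
        return "UPDATE " + lockTable
                + " SET LOCKED = 0, LOCKGRANTED = NULL, LOCKEDBY = NULL";
    }

    public static void main(String[] args) {
        String sql = unlockSql("DATABASECHANGELOGLOCK");
        System.out.println(sql);
        // With a live connection (jdbcUrl/user/pass are placeholders):
        // try (java.sql.Connection c = java.sql.DriverManager.getConnection(jdbcUrl, user, pass);
        //      java.sql.Statement s = c.createStatement()) {
        //     s.executeUpdate(sql);
        // }
    }
}
```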

How do I insert timeseries data into Solr?

I have an existing Solr collection named chronix.
I have also configured Chronix, but when I issue a query from the Chronix JavaFX client it throws an error like this:
2018-06-29 09:30:16.611 INFO (qtp761960786-20) [ x:chronix] o.a.s.c.S.Request [chronix] webapp=/solr path=/select params={q=*:*&fl=%2Bdata&start=0&rows=200&wt=javabin&version=2} status=500 QTime=1
2018-06-29 09:30:16.611 ERROR (qtp761960786-20) [ x:chronix] o.a.s.s.HttpSolrCall null:java.lang.NumberFormatException: For input string: "+"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
What am I missing?
You have a + before the value in your fl parameter. The fl parameter should contain the list of fields you want to retrieve, and having a + here is not a valid field name.
To get the data field back, &fl=data is what you want, not &fl=+data (%2B is the URL-encoded version of +).
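The %2B in the logged request is simply the URL-encoded form of +, which standard URL encoding makes easy to see:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class Main {
    public static void main(String[] args) {
        // "+data" URL-encodes to "%2Bdata" -- the value Solr received
        // and then failed to parse as a field name.
        System.out.println(URLEncoder.encode("+data", StandardCharsets.UTF_8));
        // A plain field name is unchanged by encoding:
        System.out.println(URLEncoder.encode("data", StandardCharsets.UTF_8));
    }
}
```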

How can I declare a date type parameter in JDBC request in Jmeter

On executing my stored procedure in JMeter as below:
QueryType:Callable Statement
Query:execute [QC].[usp_GetCallCounts] ?,?,?,?,?,?
Parameter Values: 33,'12-01-2016','12-15-2016',74861,0,Evaluator
Parameter Types: INTEGER,DATE,DATE,INTEGER,BIT,VARCHAR
I get this Response message: java.lang.IllegalArgumentException
Log file details are as follow:
2018-01-12 18:50:04,887 INFO o.a.j.e.StandardJMeterEngine: Running the test!
2018-01-12 18:50:04,888 INFO o.a.j.s.SampleEvent: List of sample_variables: []
2018-01-12 18:50:04,890 INFO o.a.j.g.u.JMeterMenuBar: setRunning(true, local)
2018-01-12 18:50:05,395 INFO o.a.j.e.StandardJMeterEngine: Starting ThreadGroup: 1 : Thread Group
2018-01-12 18:50:05,395 INFO o.a.j.e.StandardJMeterEngine: Starting 1 threads for group Thread Group.
2018-01-12 18:50:05,395 INFO o.a.j.e.StandardJMeterEngine: Thread will continue on error
2018-01-12 18:50:05,395 INFO o.a.j.t.ThreadGroup: Starting thread group... number=1 threads=1 ramp-up=1 perThread=1000.0 delayedStart=false
2018-01-12 18:50:05,396 INFO o.a.j.t.ThreadGroup: Started thread group number 1
2018-01-12 18:50:05,396 INFO o.a.j.e.StandardJMeterEngine: All thread groups have been started
2018-01-12 18:50:05,396 INFO o.a.j.t.JMeterThread: Thread started: Thread Group 1-1
2018-01-12 18:50:20,055 INFO o.a.j.t.JMeterThread: Thread is done: Thread Group 1-1
2018-01-12 18:50:20,055 INFO o.a.j.t.JMeterThread: Thread finished: Thread Group 1-1
2018-01-12 18:50:20,055 INFO o.a.j.e.StandardJMeterEngine: Notifying test listeners of end of test
2018-01-12 18:50:20,056 INFO o.a.j.g.u.JMeterMenuBar: setRunning(false, local)
I encountered the same problem today, and after debugging and removing each parameter and value one by one, we found that the problem is with the DATE parameter type.
My advice would be to replace DATE with TIMESTAMP, since that is how we solved it and it seems to work:
Parameter Types: INTEGER,TIMESTAMP,TIMESTAMP,INTEGER,BIT,VARCHAR
As per the reference doc, you need to use the upper-case type names listed at:
https://docs.oracle.com/javase/8/docs/api/java/sql/Types.html
So it would be :
DATE
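As an aside, one plausible source of the IllegalArgumentException is the date literal format itself: java.sql.Date.valueOf (which JDBC-based tools commonly use to parse DATE parameters) only accepts the yyyy-[m]m-[d]d escape format, so a value like '12-01-2016' (MM-dd-yyyy) fails to parse. A quick check:

```java
public class Main {
    public static void main(String[] args) {
        // The JDBC date escape format yyyy-[m]m-[d]d parses fine
        System.out.println(java.sql.Date.valueOf("2016-12-01")); // prints 2016-12-01
        // The MM-dd-yyyy value from the Parameter Values line does not
        try {
            java.sql.Date.valueOf("12-01-2016");
        } catch (IllegalArgumentException e) {
            System.out.println("IllegalArgumentException for 12-01-2016");
        }
    }
}
```

If that is the cause, rewriting the parameter values as 2016-12-01 and 2016-12-15 may work even with the DATE type.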
