'do_replace()' not working? - atk4

While trying ATK4 I've found a problem:
$this->api->db->dsql()->table('person')->set('id', 1)->set('name', 'Test user')->do_replace();
This is not working. Then I looked a little deeper into the ATK4 source and found in /opt/ipism/www/atk4/lib/DB/dsql.php the lines
public $sql_templates=array(
'select'=>"select [options] [field] [from] [table] [join] [where] [group] [having] [order] [limit]",
'insert'=>"insert [options_insert] into [table_noalias] ([set_fields]) values ([set_values])",
'replace'=>"replace [options_replace] into [table_noalias] ([set_fields]) values ([set_values])",
'update'=>"update [table_noalias] set [set] [where]",
'delete'=>"delete from [table_noalias] [where]",
'truncate'=>'truncate table [table_noalias]',
'describe'=>'desc [table_noalias]',
);
After changing the 'replace' line to
'replace'=>"replace into [table_noalias] ([set_fields]) values ([set_values])",
it worked for me (removing options_replace and appending an 's' to set_value). I'm using the latest version from git with a MySQL database connection.
But I'm not sure if I'm using do_replace() the wrong way?
ByE...
By the way: is there a way to send fixes without creating an account on GitHub or somewhere?
Edit: Here is the output if the options_replace isn't removed from the template:
replace [options_replace] into `person` (`id`,`name`) values ("1","John Doe") [:a_2, :a]
Application Error: Database Query Failed
Exception_DB, code: 0
Additional information:
pdo_error: SQLSTATE[42000]: Syntax error or access violation: 1064 You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '[options_replace] into `person` (`id`,`name`) values ('1' at line 1
mode: replace
params: :a: 1 :a_2: John Doe
query: replace [options_replace] into `person` (`id`,`name`) values (:a,:a_2)
template: replace [options_replace] into [table_noalias] ([set_fields]) values ([set_values])
/opt/ipism/www/atk4/lib/DB/dsql.php:1519
Stack trace:
File                                             Object Name                        Stack Trace
/opt/ipism/www/atk4/lib/BaseException.php:63     Exception_DB                       Exception_DB->collectBasicData(Null)
/opt/ipism/www/atk4/lib/AbstractObject.php:545   Exception_DB                       Exception_DB->__construct("Database Query Failed", Null)
/opt/ipism/www/atk4/lib/DB/dsql.php:1519         sample_project_db_db_dsql_mysql    DB_dsql_mysql->exception("Database Query Failed")
/opt/ipism/www/atk4/lib/DB/dsql.php:1586         sample_project_db_db_dsql_mysql    DB_dsql_mysql->execute()
/opt/ipism/www/atk4/lib/DB/dsql.php:1624         sample_project_db_db_dsql_mysql    DB_dsql_mysql->replace()
/opt/ipism/www/page/test.php:40                  sample_project_db_db_dsql_mysql    DB_dsql_mysql->do_replace()
/opt/ipism/www/atk4/lib/AbstractObject.php:306   sample_project_test                page_test->init()
/opt/ipism/www/atk4/lib/ApiFrontend.php:130      sample_project                     Frontend->add("page_test", "test", "Content")
/opt/ipism/www/atk4/lib/ApiWeb.php:428           sample_project                     Frontend->layout_Content()
/opt/ipism/www/atk4/lib/ApiFrontend.php:39       sample_project                     Frontend->addLayout("Content")
/opt/ipism/www/atk4/lib/ApiWeb.php:275           sample_project                     Frontend->initLayout()
/opt/ipism/www/index.php:15                      sample_project                     Frontend->main()
Note: To hide this information from your users, add $config['logger']['web_output']=false to your config.php file. Refer to documentation on 'Logger' for alternative logging options

Replace is similar to "insert" by its nature, but instead of failing when the primary key is duplicated, it replaces the existing value.
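For illustration, this is the difference in plain MySQL terms, using the person table from the question (a minimal sketch; the column types are assumed):
-- assumed schema for the person table used in the question
CREATE TABLE person (id INT PRIMARY KEY, name VARCHAR(255));
INSERT INTO person (id, name) VALUES (1, 'Test user');
-- a second insert with the same primary key fails with a duplicate-key error (1062)
INSERT INTO person (id, name) VALUES (1, 'Other user');
-- replace removes the conflicting row and inserts the new one instead
REPLACE INTO person (id, name) VALUES (1, 'Other user');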
Please add ->debug() to your line before do_replace and give me the output, which would help me understand why that parameter needs removing.
set_value seems to be a typo; I have changed it and committed the fix to master: https://github.com/atk4/atk4/commit/24b20865b9e3345a8e7504dfb68b7ef96335009e
The best way to submit changes is by creating a pull request; the best way to report issues is currently through "Issues" on GitHub.

Related

Bypassing an SQL Error 264 - column name is specified more than once in the SET clause

I've been installing one of our products on a new test server and it's not working due to an error 264 - the column name 'kod_novy' is specified more than once in the SET clause.
I know where the problem is and have reported it to development for a fix. But we have the same application deployed and it works just fine.
In the code you can see that 'kod_novy' is used twice in the insert. My question is: does anyone know how our customer managed to ignore this error and successfully complete the T-SQL? Is that a server setting?
Thanks,
Z
insert into [server].db.dbo.prenos_c_banky (
id_prenos,
kod,
kod_novy,
ext_kod,
iud_job,
kod_banky,
kod_novy,
nazev,
znacka)
select
cast('2B06FB0A-2664-4714-91F6-A6D39BDE5B5F' as UNIQUEIDENTIFIER),
kod,
kod_novy,
ext_kod,
iud_job,
kod_banky,
kod_novy,
nazev,
znacka
from #c_banky

How to use laravel DB::setDatabaseName?

I use DB::setDatabaseName(<database name>) to reset the database name, then I use DB::table(<table name>)->get() to retrieve data. However, Laravel does not change to the new database.
This is my error message:
Illuminate/Database/QueryException with message 'SQLSTATE[42P01]:
Undefined table: 7 ERROR: relation "t" does not exist LINE 1: select
* from "t" ^ (SQL: select * from "t")'
The table t is in another database. I thought that after using DB::setDatabaseName(<database name>) it would work.
Thank you for your help!
I don't know the details of your database, but this will help you check whether the database has changed.
// current database is 'db_1'
echo DB::getDatabaseName(); // return db_1
// Set database to 'db_2'
DB::setDatabaseName('db_2');
// If success, should return 'db_2' now.
echo DB::getDatabaseName();
// Check database tables.
DB::select('show tables');
I was facing a similar issue.
But changing the database alone might not always work.
You could use config()->set() like so:
config()->set('database.connections.mysql.database', $database_name);
But in my case I had to reconnect the database to change it dynamically.
So maybe this one works for the OP.
\DB::disconnect();
config()->set('database.connections.pgsql.database', $database_name); // pgsql = Postgres
\DB::reconnect();
You'll find more info here: Laravel 6 config()->get('database.connections.mysql') not matching DB:connection()
Hope it helps

Mule:Database update failing although query works in Oracle

Maybe it is a case of me looking at this for too long, but I have an Oracle update query I am trying to run. I have verified that the query works with hardcoded values in SQL Developer, however when I run it from my Mule flow it fails. Can anybody tell me what I am doing wrong?
Here is the query:
<db:update config-ref="DBConf" doc:name="abcd">
<db:dynamic-query><![CDATA[UPDATE myTable
SET TYPE= 'Entry',
ENTERED_DATE=SYSDATE,
ENTRY_BY= 2345,
ENTRY_DATE=TO_DATE('#[flowVars.entryDate]','YYYY-MM-DD')
WHERE ID = 'abcd1234']]>
</db:dynamic-query>
</db:update>
the flowVars.entryDate value is '2017-05-10'
This throws the following Error:
Message : ORA-01841: (full) year must be between -4713 and +9999, and not be 0
(java.sql.SQLDataException). Message payload is of type: Integer
As I said, the same query works in SQL Developer but not in Mule. Can anybody provide any input?
You can find the same problem answer in the following link:
Oracle: year must be between -4713 and +9999, and not be 0
Try this once: TO_DATE('2012-05-12','yyyy-mm-dd').
Delete the quotes around #[flowVars.entryDate].
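As a rough sanity check (a sketch run directly against the database rather than through Mule), ORA-01841 comes from the year Oracle ends up parsing, so it is worth confirming what #[flowVars.entryDate] actually evaluates to at runtime:
-- the literal from the question parses cleanly with the given mask
SELECT TO_DATE('2017-05-10', 'YYYY-MM-DD') FROM dual;
-- ORA-01841 is raised when the parsed year is 0 or out of range, for example:
SELECT TO_DATE('0000-05-10', 'YYYY-MM-DD') FROM dual;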

Dapper parameters not working?

Dapper 1.34 (on earlier Dapper versions like 1.1x this worked fine).
db.Query(#"Select [whatever] from #TableName Where [PREFIX]='#Prefix' order by [something] desc",
new { TableName = tableName, Prefix = prefix })
Error: System.Data.SqlClient.SqlException (0x80131904): Must declare the table variable "#TableName".
I get the same error when trying to define DynamicParameters and passing those.
=======
I am currently doing string substitution ({%1}), but that does not seem acceptable.
Can I please get a sample? Also, looking at the test class for Dapper I can't see it running; maybe something is wrong with my project setup?
No, that has never worked fine, due to how SQL works:
table names cannot be parameterized, and dapper has never offered this; that could work if you pass in a DataTable as a table-valued-parameter, though - which dapper does support - but I don't think this is what you are after
'@Prefix' is a string literal; you just mean @Prefix (no single quotes)
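To make the second point concrete, this is roughly the SQL the server sees in each case (a sketch; SomeTable stands in for the table name, which has to be written into the string since it cannot be a parameter):
-- with the quotes, the WHERE clause compares against the literal string '@Prefix'
select [whatever] from SomeTable where [PREFIX] = '@Prefix' order by [something] desc
-- without the quotes, @Prefix is a placeholder that Dapper binds to the supplied value
select [whatever] from SomeTable where [PREFIX] = @Prefix order by [something] desc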

Hive Serde errors with Array<Struct<>> org.json.JSONArray cannot be cast to [Ljava.lang.Object;

I have created a table:
add jar /../xlibs/hive-json-serde-0.2.jar;
CREATE EXTERNAL TABLE SerdeTest
(Unique_ID STRING
,MemberID STRING
,Data ARRAY<STRUCT<SerialNo:INT, VariableName:STRING, VariableValue:STRING>>
)
PARTITIONED BY (Pyear INT, Pmonth INT)
ROW FORMAT SERDE "org.apache.hadoop.hive.contrib.serde2.JsonSerde";
ALTER TABLE SerdeTest ADD
PARTITION (Pyear = 2014, Pmonth =03) LOCATION '../Test2';
The data in the file :
{"Unique_ID":"ABC6800650654751","MemberID":"KHH966375835","Data":[{"SerialNo":1,"VariableName":"Var1","VariableValue":"A_49"},{"SerialNo":2,"VariableName":"Var2","VariableValue":"B_89"},{""SerialNo":3,"VariableName":"Var3","VariableValue":"A_99"}]}
Select query that I am using:
select Data[0].SerialNo from SerdeTest where Unique_ID = 'ABC6800650654751';
however, when I run this query I get the following error:
java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ClassCastException: org.json.JSONArray cannot be cast to [Ljava.lang.Object;
at org.apache.hadoop.hive.serde2.objectinspector.StandardListObjectInspector.getList(StandardListObjectInspector.java:98)
at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:330)
at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:386)
at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:237)
at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:223)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:539)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:157)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:418)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349)
at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
at org.apache.hadoop.mapred.Child.main(Child.java:264)
]
Can anyone please suggest what I am doing wrong?
A few suggestions:
Make sure that all the packages of Hive and hive-json-serde-0.2.jar have execute permission for the hadoop user.
Hive creates a file called derby.log and a metastore_db directory in the hive directory. The user invoking the Hive query should be allowed to create files and directories there.
The location for the data should have a / at the end, e.g. LOCATION '../Test2/';
In short, the working JAR is json-serde-1.3-jar-with-dependencies.jar, which can be found here. This one works with STRUCT and can even ignore some malformed JSON. When creating the table, include the following code:
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES ("ignore.malformed.json" = "true")
LOCATION ...
If needed, it is possible to recompile it from here or here. I tried the first repository and it compiled fine for me after adding the necessary libs. The repository has also been updated recently.
Check for more details here.
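Putting that together for the table from the question, the DDL would look roughly like this (a sketch; it assumes the JAR sits next to the original one and the struct field types are guessed from the sample data):
-- assumed path for the new serde JAR
add jar /../xlibs/json-serde-1.3-jar-with-dependencies.jar;
CREATE EXTERNAL TABLE SerdeTest
(Unique_ID STRING
,MemberID STRING
,Data ARRAY<STRUCT<SerialNo:INT, VariableName:STRING, VariableValue:STRING>>
)
PARTITIONED BY (Pyear INT, Pmonth INT)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES ("ignore.malformed.json" = "true");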
