I am working on Magento 1.7. I have data in an array, which I encode as JSON using json_encode() and then insert into the database, but I am getting an error like
connection was reset.
If I insert a normal value, it works fine.
In the database the field type is longtext. I have tried mysql_real_escape_string(), base64_encode(), and serialize(), but did not succeed.
I am using the following code:
$table = Mage::getSingleton('core/resource')->getTableName('checkout_prescription_details');
$write = Mage::getSingleton('core/resource')->getConnection('core_write');
$custom = json_encode($customoptions);
$query = "insert into {$table} set `data`='$custom';
$write->query($query);
When I echo $query it shows the encoded data, and when I run that statement from phpMyAdmin it inserts into the database, but $write->query($query); does not insert anything.
Please suggest what to do.
See Magento’s Core JSON Encoding and Decoding Functions
Encode an array
Mage::helper('core')->jsonEncode($array);
Decode an array
Mage::helper('core')->jsonDecode($jsonData);
Mage::getModel('checkout/prescription_details')
->setData('data', Mage::helper('core')->jsonEncode($customoptions))
->save();
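To read the value back later, load the row and decode it with the matching helper. A quick sketch, assuming you know the $id of the saved record:

// Sketch: load a saved row (assumed id in $id) and decode the JSON column back into an array
$prescription  = Mage::getModel('checkout/prescription_details')->load($id);
$customoptions = Mage::helper('core')->jsonDecode($prescription->getData('data'));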
You did not escape the query parameter, and (SQL injection aside) JSON-encoded data can easily contain quotes that break the query.
Also, you should not have done this with a raw SQL query in the first place; let Magento do the work for you. Assuming there is a model for this table with the alias checkout/prescription_details:
Mage::getModel('checkout/prescription_details')
->setData('data', json_encode($customoptions))
->save();
If there is no model, go ahead and create one. You should not have database tables without a corresponding model.
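For completeness, if you do keep the raw core_write connection from the question for now, the adapter's insert() method binds the values for you, so quotes inside the JSON no longer break the statement. A minimal sketch, reusing the variables from the question:

$resource = Mage::getSingleton('core/resource');
$table    = $resource->getTableName('checkout_prescription_details');
$write    = $resource->getConnection('core_write');

// insert() quotes and binds the value, unlike the hand-built INSERT string
$write->insert($table, array(
    'data' => Mage::helper('core')->jsonEncode($customoptions),
));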
Related
I've created a stored procedure that returns JSON, but it does not return the complete JSON, only a truncated part of it. I used FOR JSON AUTO after the SELECT statement. Is there any solution to get the full JSON?
If you are using any CAST or CONVERT operations, use VARCHAR(MAX) instead of VARCHAR(n).
I have seen this issue in such cases.
Also, if you are printing the output, the text might get truncated; use a SELECT or an OUTPUT parameter instead.
Make the output parameter of type NVARCHAR(MAX), or check this link, which might help:
Format Query Results as JSON with FOR JSON (SQL Server)
I found my solution here: https://learn.microsoft.com/en-us/sql/relational-databases/json/format-query-results-as-json-with-for-json-sql-server?view=sql-server-ver15#output-of-the-for-json-clause (same as above, but a specific section)
The issue was that I thought SQL Server was returning a single row/cell of JSON data because that is how SSMS displayed it. The truth is that it chops it into multiple rows.
I was retrieving the data in .NET using ExecuteScalar(), but I needed to use ExecuteReader(), and concatenate all rows together. Once I did that, I could deserialize the JSON without issue.
I am using Piwik, and after inspecting the database I see a table: piwik_archive_blob__
This table has a column called value with type: mediumblob
The values appear to be jumbled characters, so I assume there is an encode/decode process.
Can anyone help me decode this column? I think there is good data here, but I need to be able to read it.
Thanks
The value column stores serialized and gzcompressed DataTable objects, so there is no easy way to read it.
Just a quick example how you could uncompress and deserialize it using PHP:
Download the blob as a .bin file using a tool like phpMyAdmin.
Load the file into PHP, then uncompress and unserialize it using the following:
<?php
$sBlobFile = file_get_contents( 'piwik_archive_blob_2017_03-value.bin' );
$sBlobFile = unserialize( gzuncompress ( $sBlobFile ) );
var_dump( $sBlobFile );
Of course, you can also just retrieve the blob using MySQL and access it directly in PHP, as opposed to downloading it as a file first.
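For example, a minimal sketch of that direct approach using PDO (the DSN, credentials, table name, and WHERE values below are placeholders you would adjust to your installation):

<?php
// Connect to the Piwik database (adjust DSN and credentials)
$pdo = new PDO('mysql:host=localhost;dbname=piwik;charset=utf8', 'user', 'password');

// Fetch one blob row; the archive table name and WHERE values are just examples
$stmt = $pdo->prepare('SELECT value FROM piwik_archive_blob_2017_03 WHERE idarchive = ? AND name = ?');
$stmt->execute(array(1, 'Actions_actions_url'));
$blob = $stmt->fetchColumn();

// Same decoding steps as above: gzuncompress, then unserialize
$dataTable = unserialize(gzuncompress($blob));
var_dump($dataTable);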
I needed to insert an array field into a database, and I was pleased to notice that PostgreSQL has that functionality. But now I am not able to insert the data using the table's active record.
I have tried the calls below with no success:
$active_record->array_column = $_array_of_values;
which gives me the exception
Exception Raised:CDbCommand failed to execute the SQL statement: SQLSTATE[22P02]: Invalid text representation: 7 ERROR: array value must start with "{" or dimension information
I have also tried this using
foreach($_array_of_values as $value){
$active_record->array_column[] = $value;
}
which tells me
Indirect modification of overloaded property FeatureRaw::$colors_names has no effect
Can anyone help me with this?
Thanks!
Data must be inserted in the form (text representation of an ARRAY):
INSERT INTO tbl (arr_col) VALUES ('{23,45}')
Or:
INSERT INTO tbl (arr_col) VALUES ('{foo,"bar, with comma"}')
So you need to enclose your array values in '{}' and separate them with commas (,). Use double quotes ("") around text values that include a comma.
I listed more syntax variants to insert arrays in a related answer.
For those who have the same problem:
I didn't check the Yii1 behavior, but in Yii2 you can simply insert the array as a properly formed string, as Erwin Brandstetter mentioned above:
$activeRecord->arrayField = '{' . implode(',',$array_values) . '}';
Of course you need to make additional effort when $array_values contains strings with commas, etc., and you still need to convert the value back into an array after you load the ActiveRecord.
You can make these conversions in the ActiveRecord's beforeSave() and afterFind() hooks so you will not need to convert the values manually.
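A minimal sketch of that idea, using the FeatureRaw class and colors_names column from the question (beforeSave() and afterFind() are standard Yii2 ActiveRecord hooks; the table name is assumed, and the quoting below is deliberately simple and skips edge cases such as NULL elements):

class FeatureRaw extends \yii\db\ActiveRecord
{
    public static function tableName()
    {
        return 'feature_raw'; // assumed table name
    }

    public function beforeSave($insert)
    {
        if (!parent::beforeSave($insert)) {
            return false;
        }
        // Build the PostgreSQL array literal, double-quoting each element so
        // commas, quotes and backslashes inside values do not break the syntax
        $quoted = array_map(function ($value) {
            return '"' . addcslashes($value, "\"\\") . '"';
        }, (array) $this->colors_names);
        $this->colors_names = '{' . implode(',', $quoted) . '}';
        return true;
    }

    public function afterFind()
    {
        parent::afterFind();
        // Naive reverse conversion: strip the braces, split on commas, and remove
        // the surrounding quotes; values containing commas need real parsing
        $this->colors_names = array_map(function ($value) {
            return trim(stripcslashes($value), '"');
        }, explode(',', trim($this->colors_names, '{}')));
    }
}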
UPD. Recently I made a simple behavior for Yii2 to use array fields with ActiveRecord without manual field building: kossmoss/yii2-postgresql-array-field. It is a more generalized way to solve the problem and I hope it will help. For those who use Yii1: you can investigate the package code and create your own solution compatible with your framework.
I am using an Access database for one system and SQL Server for another; data gets synced between the two systems.
The problem is that one of the fields in a table in the Access database is a Memo field which contains double-byte text. When I read this data into a DataGridView on a Windows form, the text is displayed as ???.
Also, when data from this field is inserted into an nvarchar(max) field in the SQL Server database, the non-English characters are inserted as ???.
How can I fetch the data from the Memo field and convert its encoding to Unicode so that it appears correctly in the SQL Server database as well?
Please help!
I have no direct experience with datagrid controls, but I have noticed that some database values are not displayed correctly through MS Access controls. Uniqueidentifiers, for example, are shown as '?????' when displayed on a form. You can try this in the debug window, where the "myIdField" control is bound to the "myIdField" field of the underlying recordset (a uniqueidentifier field):
? screen.activeForm.recordset.fields("myIdField")
{F0E3C822-BEE9-474F-8A4D-445A33F363EE}
? screen.activeForm.controls("myIdField")
????
Here is what the Access Help says on this issue:
The Microsoft Jet database engine stores GUIDs as arrays of type Byte. However, Microsoft Access can't return Byte data from a control on a form or report. In order to return the value of a GUID from a control, you must convert it to a string. To convert a GUID to a string, use the StringFromGUID function. To convert a string back to a GUID, use the GUIDFromString function.
So if you are extracting values from controls to update a table (either directly or through a recordset), you might face similar issues.
One solution would be to update the data directly from the recordset's original value. Another option would be to open the original recordset with a query containing the necessary conversion instructions, so that the field is displayed correctly through the control.
What I usually do in similar situations, where I have to manipulate uniqueidentifier fields from multiple data sources (MS Access and SQL Server, for example), is to 'standardize' these fields as text in the recordsets. The recordsets are then built with queries such as:
SQL Server
"SELECT convert(nvarchar(36),myIdField) as myIdField, .... FROM .... "
MS-Access
"SELECT stringFromGUID(myIdField) as myIdField, .... FROM .... "
I solved this issue by converting the encoding as follows:
//Define Windows 1252, Big5 and Unicode encodings
System.Text.Encoding enc1252 = System.Text.Encoding.GetEncoding(1252);
System.Text.Encoding encBig5 = System.Text.Encoding.GetEncoding(950);
System.Text.Encoding encUTF16 = System.Text.Encoding.Unicode;
byte[] arrByte1 = enc1252.GetBytes(note); //string to be converted
byte[] arrByte2 = System.Text.Encoding.Convert(encBig5, encUTF16, arrByte1);
string convertedText = encUTF16.GetString(arrByte2);
return convertedText;
Thank you all for pitching in!
I have a database with a bunch of stuff in it, and right now I'm reading in data, doing some processing on it, and then sticking it in a new database. My code generates this string:
query_string = "INSERT INTO OrgPhrase (EXACT_PHRASE,Org_ID) VALUES (HELLO,123)"
Then it's used this way:
Dim InsertCmd = New System.Data.OleDb.OleDbCommand(query_string, connection)
InsertCmd.ExecuteNonQuery()
The associated database (OLEdb connection) exists and opens fine, with all the tables and columns it's trying to work with already existing. The error message I get is "No value given for one or more required parameters"
Am I missing something? Did I spell something wrong? I don't have a ton of experience with database work, but I've never had this trouble inserting before.
I believe the query should be
query_string = "INSERT INTO OrgPhrase (EXACT_PHRASE,Org_ID) VALUES ('HELLO',123)"
Without the quotes, Access treats HELLO as a column or parameter name rather than a string literal, which is why it complains about a missing parameter.
Also, it may be that the table has more than two columns that are NOT NULL, so values for them are required as well.
Consider parameterizing the query string. There are a couple of reasons for this. First, you can pass in the values without having to worry about whether or not you need single quotes. Second, you prevent SQL injection.
query_string = "INSERT INTO OrgPhrase (EXACT_PHRASE,Org_ID) VALUES (#ExactPhrase,#OrgId)"
You then create parameters based on the parameter names in the string. Unless, of course, your query string always uses the same values, but that sounds a bit too hardcoded to be good.