Piwik database - piwik_archive_blob value column

I am using Piwik and after inspecting the database I see a table: piwik_archive_blob__
This table has a column called value with type mediumblob.
The values appear to be jumbled characters, so I assume that there is an encode/decode process.
Can anyone help me decode this column? I think there is good data here, but I need to be able to read it.
Thanks

The value column stores serialized and gzcompressed DataTable objects, so there is no easy way to read it.

Just a quick example of how you could uncompress and deserialize it using PHP:
Download the blob as a .bin file using a tool like phpMyAdmin.
Load the file into PHP, then uncompress and unserialize it using the following:
<?php
// Read the exported blob, gzuncompress() it, then unserialize() the result
// back into the original data structure.
$sBlobFile = file_get_contents('piwik_archive_blob_2017_03-value.bin');
$sBlobFile = unserialize(gzuncompress($sBlobFile));
var_dump($sBlobFile);
Of course, you can also retrieve the blob using MySQL and access it directly in PHP, as opposed to downloading it as a file first.
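For instance, a minimal sketch of that direct route using PDO (the connection details are placeholders, and the table name is the one from the example above):
<?php
// Sketch: fetch blobs straight from MySQL and decode them in one pass.
// Host, database, and credentials are assumptions; adjust to your install.
$pdo = new PDO('mysql:host=localhost;dbname=piwik;charset=utf8', 'user', 'password');
$rows = $pdo->query('SELECT name, value FROM piwik_archive_blob_2017_03 LIMIT 10');
foreach ($rows as $row) {
    // Same decode chain as before: gzuncompress, then unserialize.
    var_dump($row['name'], unserialize(gzuncompress($row['value'])));
}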

Related

How to extract .owl and save to MySQL

I have a file ontobible.owl. How do I extract that file and then save the data to MySQL (because I want to display data from ontobible.owl on a website)? Can anyone help me?
Edited:
Here is my ontobible.owl file (https://teamtrainit.com/ontobible.owl).
I've tried opening ontobible.owl with Sublime Text 3, and it contains entries like this:
<Verse rdf:about="http://www.semanticweb.org/budsus/ontologies/2021/7/ontobible#HOS5_2">
<verseID>HOS5_2</verseID>
<verse_text>And the revolters are profound to make slaughter, though I have been a rebuker of them all.</verse_text>
</Verse>
<Verse rdf:about="http://www.semanticweb.org/budsus/ontologies/2021/7/ontobible#2CH2_1">
<hasPerson rdf:resource="http://semanticbible.org/ns/2006/NTNames#god_1324"/>
<hasPerson rdf:resource="http://www.co-ode.org/roberts/family-tree.owl#solomon_2762"/>
<verseID>2CH2_1</verseID>
<verse_text>And Solomon determined to build an house for the name of the LORD, and an house for his kingdom.</verse_text>
</Verse>
How can I convert those XML tags to an array or JSON so I can save the data to a MySQL database?
You have several options for extracting data from OWL:
1. Use the OWL API and write Java code (I think the OWL API is accessible from other languages too) to extract the data and pack it in the format you need. You can also use SPARQL queries to extract data via the Jena API.
2. Install Protégé, open your file in Protégé, and save it in the JSON-LD format. This format is very similar to regular JSON, and you can easily transform it for your needs.
3. Install a Fuseki server, add your file, and extract the data from there using SPARQL queries.
I think the second option is the easiest to start with if you don't want to write queries or code, and it won't take long.
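If you do end up parsing the RDF/XML yourself in PHP, here is a minimal sketch with SimpleXML that pulls the Verse elements shown above into an array you can json_encode or insert into MySQL (the verse table and the connection details are hypothetical, and each Verse is assumed to carry both child elements):
<?php
// Sketch: extract verseID/verse_text pairs from ontobible.owl.
// local-name() is used so the XPath works regardless of the default namespace.
$xml  = new SimpleXMLElement(file_get_contents('ontobible.owl'));
$rows = array();
foreach ($xml->xpath('//*[local-name()="Verse"]') as $verse) {
    $rows[] = array(
        'verseID'    => (string) $verse->xpath('*[local-name()="verseID"]')[0],
        'verse_text' => (string) $verse->xpath('*[local-name()="verse_text"]')[0],
    );
}
echo json_encode($rows); // the JSON form
// Or store the rows in MySQL (hypothetical table `verse`):
$pdo  = new PDO('mysql:host=localhost;dbname=ontobible;charset=utf8mb4', 'user', 'password');
$stmt = $pdo->prepare('INSERT INTO verse (verse_id, verse_text) VALUES (?, ?)');
foreach ($rows as $row) {
    $stmt->execute(array($row['verseID'], $row['verse_text']));
}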

Is there a way to find out details of a data type error in Snowflake?

I am pretty new to the Snowflake cloud offering and was just trying to load a simple .csv file from an AWS S3 staging area to a table in Snowflake using the COPY command.
Here is what I used as the command:
copy into "database name"."schema"."table name"
from @S3_ACCESS
file_format = (format_name = format name);
When I run the above code, I get the following error: Numeric value '63' is not recognized
Please see the attached image. I'm not sure what this error means, and I'm not able to find any lead in the Snowflake UI itself to find out what could be wrong with the value.
Thanks in advance!
The error says it was expecting a numeric value, but it got '63', and that value could not be converted to a number.
From the image you shared, I can see that there are some weird characters around the 6 and the 3. There could be an issue with the file encoding, or the data is corrupted.
Please check the encoding option for the file format:
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html#format-type-options-formattypeoptions
By the way, I recommend always using UTF-8.
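For example, a sketch of pinning the encoding explicitly when (re)creating the file format (the format name here is hypothetical; TYPE and ENCODING are options documented on the page linked above):
-- Recreate the file format with an explicit UTF-8 encoding.
create or replace file format my_csv_format
  type = 'CSV'
  encoding = 'UTF8';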

How to use a Large Object in PostgreSQL to yield an image field?

How does creating a large object work? Does there need to be a client? All I am hoping to do is have an image as one column.
I am typing the following commands after creating my table, but I just get an error about the path not being correct for the image (even though I have it starting right from the C drive).
CREATE TABLE image (name text,
raster oid);
INSERT INTO image (name, raster)
VALUES ('beautiful image', lo_import('C:Documents/etc/motd'));
I am not running any C code. Am I supposed to, or does this automatically create the Large Object?
If I am supposed to run some C code, where would I do it with respect to PostgreSQL?
Can I do what I want with PostgreSQL syntax alone? Is there another way to approach including images as a field?
Any help will be greatly appreciated.
According to the PostgreSQL documentation, there are two ways to handle large objects (considering Java JDBC):
To use the bytea data type you should simply use the getBytes(), setBytes(), getBinaryStream(), or setBinaryStream() methods.
and
the LargeObject API.
Also, you can convert your image to a base64 string and then insert it directly using, for instance, pgAdmin:
CREATE TABLE image_table (name varchar(255), DATA bytea);
INSERT INTO image_table
VALUES ('my_image.jpg',
decode('paste your byte array string here', 'base64'));
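If you are inserting from application code rather than pgAdmin, a minimal PHP/PDO sketch of the bytea route could look like this (connection details are placeholders; the table is the one created above):
<?php
// Sketch: store an image file in the bytea column via a bound LOB parameter,
// which avoids hand-escaping the binary data.
$pdo  = new PDO('pgsql:host=localhost;dbname=mydb', 'user', 'password');
$stmt = $pdo->prepare('INSERT INTO image_table (name, data) VALUES (:name, :data)');
$stmt->bindValue(':name', 'my_image.jpg');
$stmt->bindValue(':data', file_get_contents('my_image.jpg'), PDO::PARAM_LOB);
$stmt->execute();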

Using pg_read_file to read a file on the desktop in PostgreSQL

I wanted to know how to read a file on my desktop using pg_read_file in PostgreSQL:
pg_read_file(filename text [, offset bigint, length bigint])
My query:
select pg_read_file('/root/desktop/new.txt' , 0 , 1000000);
Error:
ERROR: absolute path not allowed
UPDATE
pg_read_file can read files only from the data directory path. If you would like to know your data directory path, use:
SHOW data_directory;
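So, as a sketch, you would copy the file into that directory and then read it with a relative path (the file name is the one from the question):
-- After copying new.txt into the data directory:
select pg_read_file('new.txt', 0, 1000000);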
If you're using psql you can use \lo_import to create a large object from a local file.
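For example (a sketch; the path is the one from the question):
\lo_import '/root/desktop/new.txt'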
The pg_read_file function only allows reads from server-side files.
To read the content of a file into PostgreSQL you can use this:
CREATE TABLE demo(t text);
COPY demo FROM '[FILENAME]';
SELECT * FROM demo;
Each line of text becomes one SQL row. Useful for temporary transfers.
lo_import(file path) will generate an oid. This may solve your problem. You can import any type of file this way (even images).

How to save json_encode data in the database in Magento?

I am working on Magento version 1.7. I have data in an array. I encoded it as JSON using json_encode and inserted it into the database, but I am getting an error like:
connection was reset.
If I insert a normal value then it works fine.
In the database I have a field of type longtext. I have tried mysql_real_escape_string(), base64_encode(), and serialize(), but without success.
I am using the following code:
$table = Mage::getSingleton('core/resource')->getTableName('checkout_prescription_details');
$write = Mage::getSingleton('core/resource')->getConnection('core_write');
$custom = json_encode($customoptions);
$query = "insert into {$table} set `data`='$custom';
$write->query($query);
When I echo $query, it shows the encoded data, and when I run that insert from phpMyAdmin it inserts into the database, but using $write->query($query); nothing is inserted.
Please suggest a fix.
See Magento’s Core JSON Encoding and Decoding Functions.
Encode an array:
Mage::helper('core')->jsonEncode($array);
Decode JSON data:
Mage::helper('core')->jsonDecode($jsonData);
Then save it through the model:
Mage::getModel('checkout/prescription_details')
->setData('data', Mage::helper('core')->jsonEncode($customoptions))
->save();
You did not escape the query parameter, and (SQL injection aside) JSON data usually contains quote characters, which break the query.
You should not have done this with a raw SQL query in the first place; let Magento do the work for you. Assuming there is a model for this table with the alias checkout/prescription_details:
Mage::getModel('checkout/prescription_details')
->setData('data', json_encode($customoptions))
->save();
If there is no model, go ahead and create one. You should not have database tables without a corresponding model.
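If you must stay on the raw connection for some reason, a sketch of the safe variant is the adapter's insert() method, which quotes and binds the value for you (table and data as in the question):
$table = Mage::getSingleton('core/resource')->getTableName('checkout_prescription_details');
$write = Mage::getSingleton('core/resource')->getConnection('core_write');
// insert() binds the JSON string, so embedded quotes cannot break the query.
$write->insert($table, array(
    'data' => Mage::helper('core')->jsonEncode($customoptions),
));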
