My Solr server shows old data every second day

I am using Solr with my application, specifically for a certain search module.
A few days back there was a failure that caused the entity audit table to go out of sync.
After rectifying that issue, whenever I add some data it disappears the next day, and Solr shows the data that was there before the failure.
Say I add 500 users to my app on day 1, go to the Solr admin UI and run a query, and it gives me everything (old data + the new 500). However, when I log in again the next day and look at the Solr data, it shows only that old data, not the newly added 500.
After a delta-sync import it starts to show the correct data, but it is back to square one the next day.
I suspect some stale restoration is happening, but I am seeking help as this has been haunting me for a while now.
I tried a Solr sync, but it only works for the moment and shows stale data again the next day.

Related

Is there a way to query and get max folder names in Azure Synapse from Data Lake? (without doing full file scans?)

FYI, I am using Azure Synapse Analytics (serverless SQL) and a Gen2 Data Lake for all this work.
Here is my issue: I am working with the Azure cost data export feature, which creates a new folder for each month and a new folder for each day (this is in an Azure Gen2 Data Lake). In that folder it puts a CSV with all the cost data for that month. Ultimately I only care about the last folder in each month. Data continues to trickle in, so usually the last folder date is several days after the end of the month.
I have my data partitioned, so if I could pass back the latest month and dates it would greatly reduce the amount of data being scanned. I have been able to test this by hardcoding some values in a WHERE clause, and it works great.
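To illustrate, here is roughly what that hardcoded test looks like; the two folder values in the WHERE clause are just example placeholders from my results, and the filepath() filter is what cuts the scan down to the smaller size:
select *
from openrowset(bulk 'azurecosts/azuredailyexport/*/*/*/*.csv'
    , data_source = 'mydatalake_dfs_core_windows_net'
    , format = 'CSV'
    , parser_version = '2.0'
    , string_delimiter = '"'
    , header_row = true
    ) as rows
-- hardcoded partition values (placeholders); filtering on filepath() here lets
-- the serverless pool skip the folders that do not match
where rows.filepath(1) = '20230201-20230228'
and rows.filepath(2) = '202302101419'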
My problem is that I can't figure out how to get the max folder without actually scanning all the files in the first place.
Below is some example code that works; it gives me back a result I can use, though it does a full scan of the data lake. This pulls back nearly 6 GB of data, versus about 400 MB with the hardcoded values. The current size isn't a problem, but I know it is only going to continue to grow.
with usethesefiles as (
    select distinct
        --rows.*
        files.filepath(1) as file_path_one
        , files.filepath(2) as file_path_two
    from openrowset(bulk 'azurecosts/azuredailyexport/*/*/*/*.csv'
        , data_source = 'mydatalake_dfs_core_windows_net'
        , format = 'CSV'
        , parser_version = '2.0'
        , string_delimiter = '"'
        , header_row = true
        ) as files
)
select file_path_one
    , max(file_path_two) as file_path_two
from usethesefiles
group by file_path_one
This returns the following data:
file_path_one        file_path_two
20230201-20230228    202302101419
20230101-20230131    202302051419

PrestaShop: invoice number in database stays at 0

PrestaShop version: 1.6.1
Hosting: ionos
PHP version: 5.6
MySQL version: MySQL 5.7
Hi,
I recently saved new settings in the Orders > Invoices tab but left the invoice number at 0 so that it would continue to be auto-incremented. My problem is that since then no invoices are generated, and when I go to my DB, in the ps_order_invoice table, I see that every new order invoice has the number 0. When I manually change the invoice number in the DB, an invoice is generated, but that's not really a long-term solution.
(I did ask on the PrestaShop forum, but it's like a ghost town there; nobody answered.)
I tried changing the invoice_start_number row in the ps_configuration table to set it to the next number the invoice should have, but it didn't change anything: all new invoices are still numbered 0. I also tried deleting that row so that it would be recreated when I set it from the BO (and somehow solve everything), but that didn't change anything either. And I can't set the "number" column to auto-increment, since id_order_invoice is already the auto-incremented column.
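For reference, this is roughly the kind of change I tried directly in MySQL; the configuration name and the numbers are examples from my own install, so treat them as placeholders:
-- set the start number I expect the next invoice to use (example value)
UPDATE ps_configuration
SET value = '1042'
WHERE name = 'PS_INVOICE_START_NUMBER';

-- the manual workaround that does make an invoice appear, one order at a time (example id)
UPDATE ps_order_invoice
SET number = 1043
WHERE id_order_invoice = 57;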
Does anyone have an idea how to solve this? I really don't know which files are in charge of this, nor what I should change in them.

Create empty folder in S3 when there is no data while unloading from Snowflake

I have scheduled an unload of data from Snowflake to S3 every hour. Data gets uploaded to this path: My_bucket/year=2021/month= /day= /hour= /data.csv
year, month, day and hour get dynamically updated in the path on every hourly run.
Data is not necessarily there every hour, and in that case no folder or path gets created.
I need to have a folder for every hour in S3 irrespective of whether data is flowing in,
like hour=1, hour=2, hour=3 and so on for all 24 hours, every time the query runs.
There should be a CSV file if data is present in the table, and even if no data is present the path for that hour should still be there with an empty file.
So how should I modify my SQL query?
Hi, you can achieve this with one workaround, since Snowflake will not upload any file if the query returns zero rows.
This workaround will copy a file containing only the header row when no data is available:
copy into <s3_path>
from (
    -- this row acts as the header whether data is present or not
    select 'customer_id', 'store_id', 'metro_id', 'message_type'
    UNION
    select to_char(customer_id), to_char(store_id), to_char(metro_id), message_type
    from myTable
)
OVERWRITE = TRUE
FILE_FORMAT = (type = csv)

How to create a report for user activity in Drupal 7

I have a content type, say named material, with fields and all, and users can create, modify and delete all nodes of content type material.
The task I need to do is to produce a report on the activity of each user, on a per-month or per-year basis. For example: user 1 created the following nodes in January, updated these nodes, and each node was updated this many times by that particular user.
Is there a module, or a view, that could help me do something like this?
Or should I go to MySQL and run a query there?
And how should that query look?
(I am still on version 7 of Drupal.)
For example, the following will select all nodes created within the past hour (3600 seconds).
$result = db_query("SELECT nid, title FROM {node} WHERE created > :created", array(
':created' => REQUEST_TIME - 3600,
));
But what I need is also all updates done by a certain user, for a given period of time.
Any example would be highly appreciated.
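As a starting point, this is the kind of MySQL query I imagine; it assumes revisions are enabled for the material content type (so every update saves a row in node_revision), and I have not verified it against real data:
-- per user and per month, count how many revisions (updates) were saved
-- for nodes of type 'material'
SELECT nr.uid,
       FROM_UNIXTIME(nr.timestamp, '%Y-%m') AS activity_month,
       COUNT(*) AS revisions_saved
FROM node_revision nr
JOIN node n ON n.nid = nr.nid
WHERE n.type = 'material'
GROUP BY nr.uid, FROM_UNIXTIME(nr.timestamp, '%Y-%m')
ORDER BY nr.uid, activity_month;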

Data Collection Report

I set up Data Collection under SSMS\Object Explorer\Management\Data Collection two days ago.
But each time I open a report from Data Collection, there is no data available to display. For reference, the report in question is the Server Activity History report from the data collection reports.
This query will confirm whether a snapshot was actually taken on that date (change DATA_MANAGEMENT_WAREHOUSE to the name of your management data warehouse database):
SELECT [snapshot_time_id]
,[snapshot_time]
FROM [DATA_MANAGEMENT_WAREHOUSE].[core].[snapshot_timetable_internal]
WHERE CAST(snapshot_time AS DATE) = CAST('2014-06-19' AS DATE)
If there is no result set, run it without the filter to confirm when the snapshots first started and what the collection interval is.
SELECT [snapshot_time_id]
,[snapshot_time]
FROM [DATA_MANAGEMENT_WAREHOUSE].[core].[snapshot_timetable_internal]
Then, if you still have no results, check whether the SQL Server Agent service is running. I believe the problem is related to one of these things.
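For example, one quick way to check the Agent service state from T-SQL is the sys.dm_server_services DMV; this is a generic check rather than anything specific to the data collector:
-- shows the state and startup type of the SQL Server and SQL Server Agent services
SELECT [servicename]
      ,[status_desc]
      ,[startup_type_desc]
FROM sys.dm_server_services;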
EDIT: added another query to check for wait stats.
SELECT [wait_type]
,[waiting_tasks_count]
,[wait_time_ms]
,[signal_wait_time_ms]
,[collection_time]
,[snapshot_id]
FROM [DATA_MANAGEMENT_WAREHOUSE].[snapshots].[os_wait_stats]
