Order a group of transactions with multiple levels using FIFO? - sql-server

I have a group of transactions with multiple levels that need to be ordered using FIFO. I don't have a typical parent-child record set either. Transaction numbers are not sequenced, but each transaction has an "other transaction #" that relates it to any transaction transferred between accounts.
A transfer is movement of units between accounts in the same fund.
A switch is a switch of units between funds on the same account.
My transactions, in order of level, are below:
The result I am looking for is the same result set, plus an additional "Order" column. The ordering is FIFO: the first transaction is number 1, and the numbering follows transfers and switches in and out appropriately.
The challenge is that the ordering needs to follow the transfers and switches down the whole hierarchy before moving on to the next transaction at the same level.
Sample record set
CREATE TABLE #data (
[Level] INT NOT NULL,
FundId INT NOT NULL,
AccountId INT NOT NULL,
TransId VARCHAR(20) NOT NULL PRIMARY KEY CLUSTERED,
OtherTransId VARCHAR(20) NULL,
TransType VARCHAR(20) NOT NULL,
TransDate DATE NOT NULL,
AgeDays INT NULL,
Units INT NOT NULL
);
INSERT #data VALUES
(1,200,5000,'00000015','00000035','Switch In','2019-01-01',NULL,500),
(2,200,5000,'00000035','00000015','Switch Out','2019-01-01',NULL,500),
(2,200,5000,'00000070',NULL,'Buy','2018-09-09',452,700),
(2,200,5000,'00000046',NULL,'Sell','2018-09-12',449,200),
(1,100,5000,'00000001',NULL,'Buy','2019-06-30',159,100),
(1,100,5000,'00000002',NULL,'Sell','2019-07-15',NULL,20),
(1,100,5000,'00000003',NULL,'Buy','2019-07-31',128,50),
(1,100,5000,'00000004','00000011','Transfer Out','2019-08-15',NULL,45),
(2,100,6000,'00000005','00000020','Transfer In','2019-08-17',NULL,200),
(2,100,6000,'00000020','00000005','Transfer Out','2019-08-17',NULL,200),
(2,100,6000,'00000044',NULL,'Buy','2019-06-11',177,70),
(2,100,6000,'00000050','00000088','Transfer In','2019-06-10',NULL,130),
(3,100,7000,'00000088','00000050','Transfer Out','2019-06-10',NULL,130),
(3,100,7000,'00000079',NULL,'Buy','2019-06-01',187,130);
+-------+--------+-----------+---------------+---------------------+------------------+-----------------+------------+-------+
| Level | Fund # | Account # | Transaction # | Other Transaction # | Transaction Type | Date | Age (Days) | Units |
+-------+--------+-----------+---------------+---------------------+------------------+-----------------+------------+-------+
| 1 | 200 | 5000 | 00000015 | 00000035 | Switch In | January 1, 2019 | | 500 |
| 2 | 200 | 5000 | 00000035 | 00000015 | Switch Out | January 1, 2019 | | 500 |
| 2 | 200 | 5000 | 00000070 | | Buy | Sept 9, 2018 | 452 | 700 |
| 2 | 200 | 5000 | 00000046 | | Sell | Sept 12, 2018 | 449 | 200 |
| 1 | 100 | 5000 | 00000001 | | Buy | June 30, 2019 | 159 | 100 |
| 1 | 100 | 5000 | 00000002 | | Sell | July 15, 2019 | | 20 |
| 1 | 100 | 5000 | 00000003 | | Buy | July 31, 2019 | 128 | 50 |
| 1 | 100 | 5000 | 00000004 | 00000011 | Transfer Out | August 15, 2019 | | 45 |
| 2 | 100 | 6000 | 00000005 | 00000020 | Transfer In | August 17, 2019 | | 200 |
| 2 | 100 | 6000 | 00000020 | 00000005 | Transfer Out | August 17, 2019 | | 200 |
| 2 | 100 | 6000 | 00000044 | | Buy | June 11, 2019 | 177 | 70 |
| 2 | 100 | 6000 | 00000050 | 00000088 | Transfer In | June 10, 2019 | | 130 |
| 3 | 100 | 7000 | 00000088 | 00000050 | Transfer Out | June 10, 2019 | | 130 |
| 3 | 100 | 7000 | 00000079 | | Buy | June 1, 2019 | 187 | 130 |
+-------+--------+-----------+---------------+---------------------+------------------+-----------------+------------+-------+
Desired output (with the added Order column):
+-------+-------+--------+-----------+---------------+---------------------+------------------+-----------------+------------+-------+
| Order | Level | Fund # | Account # | Transaction # | Other Transaction # | Transaction Type | Date | Age (Days) | Units |
+-------+-------+--------+-----------+---------------+---------------------+------------------+-----------------+------------+-------+
| 1 | 1 | 200 | 5000 | 00000015 | 00000035 | Switch In | January 1, 2019 | | 500 |
| 2 | 2 | 200 | 5000 | 00000035 | 00000015 | Switch Out | January 1, 2019 | | 500 |
| 3 | 2 | 200 | 5000 | 00000070 | | Buy | Sept 9, 2018 | 452 | 700 |
| 4 | 2 | 200 | 5000 | 00000046 | | Sell | Sept 12, 2018 | 449 | 200 |
| 5 | 1 | 100 | 5000 | 00000001 | | Buy | June 30, 2019 | 159 | 100 |
| 6 | 1 | 100 | 5000 | 00000002 | | Sell | July 15, 2019 | | 20 |
| 7 | 1 | 100 | 5000 | 00000003 | | Buy | July 31, 2019 | 128 | 50 |
| 8 | 1 | 100 | 5000 | 00000004 | 00000011 | Transfer Out | August 15, 2019 | | 45 |
| 9 | 2 | 100 | 6000 | 00000005 | 00000020 | Transfer In | August 17, 2019 | | 200 |
| 10 | 2 | 100 | 6000 | 00000020 | 00000005 | Transfer Out | August 17, 2019 | | 200 |
| 11 | 2 | 100 | 6000 | 00000044 | | Buy | June 11, 2019 | 177 | 70 |
| 12 | 2 | 100 | 6000 | 00000050 | 00000088 | Transfer In | June 10, 2019 | | 130 |
| 13 | 3 | 100 | 7000 | 00000088 | 00000050 | Transfer Out | June 10, 2019 | | 130 |
| 14 | 3 | 100 | 7000 | 00000079 | | Buy | June 1, 2019 | 187 | 130 |
+-------+-------+--------+-----------+---------------+---------------------+------------------+-----------------+------------+-------+
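No accepted solution is shown here, but one possible direction (a sketch only, not a verified answer) is a recursive CTE that walks the Transfer/Switch links depth-first, building a sort path. The seed ordering below (FundId descending, then date, then transaction #) is an assumption taken from the desired output, and this only sequences the linked chains; the unlinked Buy/Sell rows within each account would still need to be folded into the path.

```sql
-- Sketch: depth-first walk of the Transfer/Switch links in #data.
WITH walk AS (
    -- Seed: level-1 transactions, numbered in an assumed FIFO seed order
    SELECT d.TransId, d.[Level],
           CAST(RIGHT('000' + CAST(ROW_NUMBER() OVER
                (ORDER BY d.FundId DESC, d.TransDate, d.TransId) AS VARCHAR(3)), 3)
                AS VARCHAR(400)) AS SortPath
    FROM #data AS d
    WHERE d.[Level] = 1
    UNION ALL
    -- Follow the "other transaction #" link one hop at a time
    SELECT c.TransId, c.[Level],
           CAST(w.SortPath + '/' + c.TransId AS VARCHAR(400))
    FROM walk AS w
    JOIN #data AS c
      ON c.OtherTransId = w.TransId
     AND c.[Level] = w.[Level] + 1   -- only descend, to avoid cycling on paired rows
)
SELECT ROW_NUMBER() OVER (ORDER BY w.SortPath) AS [Order], d.*
FROM walk AS w
JOIN #data AS d ON d.TransId = w.TransId;
```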

Related

Grouping Data by Changing Status Over Time

I am trying to assign a group number to distinct groups of rows in a dataset that changes over time. The changing fields in my example are tran_seq, prog_id, deg_id, cur_id, and enroll_status. When any of those fields differ from the previous row, I need a new group number; when they are the same as the prior row, the group number should stay the same. When I try ROW_NUMBER(), RANK(), or DENSE_RANK(), I get increasing values for the same group (e.g. the first two rows in the example). I feel I need to ORDER BY start_date, as it is temporal data.
+----+----------+---------+--------+--------+---------------+------------+------------+---------+
| | tran_seq | prog_id | deg_id | cur_id | enroll_status | start_date | end_date | desired |
+----+----------+---------+--------+--------+---------------+------------+------------+---------+
| 1 | 1 | 6 | 9 | 3 | ENRL | 2004-08-22 | 2004-12-11 | 1 |
| 2 | 1 | 6 | 9 | 3 | ENRL | 2006-01-10 | 2006-05-06 | 1 |
| 3 | 1 | 6 | 9 | 59 | ENRL | 2006-08-29 | 2006-12-16 | 2 |
| 4 | 2 | 12 | 23 | 45 | ENRL | 2014-01-21 | 2014-05-16 | 3 |
| 5 | 2 | 12 | 23 | 45 | ENRL | 2014-08-18 | 2014-12-05 | 3 |
| 6 | 2 | 12 | 23 | 45 | LOAP | 2015-01-20 | 2015-05-15 | 4 |
| 7 | 2 | 12 | 23 | 45 | ENRL | 2015-08-25 | 2015-12-11 | 5 |
| 8 | 2 | 12 | 23 | 45 | LOAP | 2016-01-12 | 2016-05-06 | 6 |
| 9 | 2 | 12 | 23 | 45 | ENRL | 2016-05-16 | 2016-08-05 | 7 |
| 10 | 2 | 12 | 23 | 45 | LOAJ | 2016-08-23 | 2016-12-02 | 8 |
| 11 | 2 | 12 | 23 | 45 | ENRL | 2017-01-18 | 2017-05-05 | 9 |
| 12 | 2 | 12 | 23 | 45 | ENRL | 2018-01-17 | 2018-05-11 | 9 |
+----+----------+---------+--------+--------+---------------+------------+------------+---------+
Once I have grouping numbers, I think I can group by those to get what I'm ultimately after: a timeline of different statuses with start dates and end dates. For the example data above, that would be:
+---+----------+---------+--------+--------+---------------+------------+------------+
| | tran_seq | prog_id | deg_id | cur_id | enroll_status | start_date | end_date |
+---+----------+---------+--------+--------+---------------+------------+------------+
| 1 | 1 | 6 | 9 | 3 | ENRL | 2004-08-22 | 2006-05-06 |
| 2 | 1 | 6 | 9 | 59 | ENRL | 2006-08-29 | 2006-12-16 |
| 3 | 2 | 12 | 23 | 45 | ENRL | 2014-01-21 | 2014-12-05 |
| 4 | 2 | 12 | 23 | 45 | LOAP | 2015-01-20 | 2015-05-15 |
| 5 | 2 | 12 | 23 | 45 | ENRL | 2015-08-25 | 2015-12-11 |
| 6 | 2 | 12 | 23 | 45 | LOAP | 2016-01-12 | 2016-05-06 |
| 7 | 2 | 12 | 23 | 45 | ENRL | 2016-05-16 | 2016-08-05 |
| 8 | 2 | 12 | 23 | 45 | LOAJ | 2016-08-23 | 2016-12-02 |
| 9 | 2 | 12 | 23 | 45 | ENRL | 2017-01-18 | 2018-05-11 |
+---+----------+---------+--------+--------+---------------+------------+------------+
This is a classic XY problem, in that you are asking about an intermediate step rather than about the solution itself.
Since you included your overall end goal as an addendum, however, here is how you can reach it without that intermediate step:
declare @t table(tran_seq int, prog_id int, deg_id int, cur_id int, enroll_status varchar(4), start_date date, end_date date, desired int);
insert into @t values
 (1,6,9,3  ,'ENRL','2004-08-22','2004-12-11',1)
,(1,6,9,3  ,'ENRL','2006-01-10','2006-05-06',1)
,(1,6,9,59 ,'ENRL','2006-08-29','2006-12-16',2)
,(2,12,23,45,'ENRL','2014-01-21','2014-05-16',3)
,(2,12,23,45,'ENRL','2014-08-18','2014-12-05',3)
,(2,12,23,45,'LOAP','2015-01-20','2015-05-15',4)
,(2,12,23,45,'ENRL','2015-08-25','2015-12-11',5)
,(2,12,23,45,'LOAP','2016-01-12','2016-05-06',6)
,(2,12,23,45,'ENRL','2016-05-16','2016-08-05',7)
,(2,12,23,45,'LOAJ','2016-08-23','2016-12-02',8)
,(2,12,23,45,'ENRL','2017-01-18','2017-05-05',9)
,(2,12,23,45,'ENRL','2018-01-17','2018-05-11',9)
;
select tran_seq
      ,prog_id
      ,deg_id
      ,cur_id
      ,enroll_status
      ,min(start_date) as start_date
      ,max(end_date) as end_date
from (select *
             -- gaps-and-islands: the difference of these two row numbers is
             -- constant within each unbroken run of identical attribute values
            ,row_number() over (order by end_date)
             - row_number() over (partition by tran_seq, prog_id, deg_id, cur_id, enroll_status order by end_date) as grp
      from @t
     ) as g
group by tran_seq
        ,prog_id
        ,deg_id
        ,cur_id
        ,enroll_status
        ,grp
order by start_date;
Output
+----------+---------+--------+--------+---------------+------------+------------+
| tran_seq | prog_id | deg_id | cur_id | enroll_status | start_date | end_date |
+----------+---------+--------+--------+---------------+------------+------------+
| 1 | 6 | 9 | 3 | ENRL | 2004-08-22 | 2006-05-06 |
| 1 | 6 | 9 | 59 | ENRL | 2006-08-29 | 2006-12-16 |
| 2 | 12 | 23 | 45 | ENRL | 2014-01-21 | 2014-12-05 |
| 2 | 12 | 23 | 45 | LOAP | 2015-01-20 | 2015-05-15 |
| 2 | 12 | 23 | 45 | ENRL | 2015-08-25 | 2015-12-11 |
| 2 | 12 | 23 | 45 | LOAP | 2016-01-12 | 2016-05-06 |
| 2 | 12 | 23 | 45 | ENRL | 2016-05-16 | 2016-08-05 |
| 2 | 12 | 23 | 45 | LOAJ | 2016-08-23 | 2016-12-02 |
| 2 | 12 | 23 | 45 | ENRL | 2017-01-18 | 2018-05-11 |
+----------+---------+--------+--------+---------------+------------+------------+

Program/Currency lookup table

I need to create a lookup table so that I can determine, by program and currency, whether the current day of a particular month is a holiday. I thought about building a calendar for each, but that is going to get too big to deal with, as programs may come and go.
A sample set of data would be something like:
Program | Currency | January | February | March | April | May | June | July | August | September | October | November | December
--------|----------|---------|----------|-------|-------|-----|------|------|--------|-----------|---------|----------|---------
Default | AUD | 1, 27 | - | - | 10, 13 | - | 8 | - | - | - | 5 | - | 25, 28
Default | CAD | 1 | 17 | - | 10 | 18 | - | 1 | 3 | 7 | 12 | 11 | 25, 28
Default | CHF | 1, 2 | - | - | 10, 13 | 1, 21 | 1 | - | - | - | - | - | 25
Default | DKK | 1 | - | - | 9, 10, 13 | 8, 21, 22 | 1, 5 | - | - | - | - | - | 24, 25, 31
Default | EUR | 1 | - | - | 10, 13 | 1 | - | - | - | - | - | - | 25
Default | GBP | 1 | - | - | 10, 13 | 8, 25 | - | - | 31 | - | - | - | 25, 28
I am not sure how to define this table.
I'd suggest something like:
Create Table Holiday_List
(
ID Int
, Program Varchar(50)
, Currency Char(3)
, Holiday_Date Date
)
Of course, you'll need an interface to maintain that table, but that should be pretty simple.
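With that shape, the lookup itself is a simple existence check; a sketch, using the table and column names suggested above (the sample program and currency values are just placeholders):

```sql
-- Is today a holiday for a given program/currency? This checks only the exact
-- program given; any fallback to the 'Default' program is left out here.
SELECT CASE WHEN EXISTS (
           SELECT 1
           FROM Holiday_List
           WHERE Program = 'Default'
             AND Currency = 'AUD'
             AND Holiday_Date = CAST(GETDATE() AS DATE)
       ) THEN 1 ELSE 0 END AS Is_Holiday;
```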

Sum, Group by and Null

I'm dipping my toes into SQL. I have the following table
+------+----+------+------+-------+
| Type | ID | QTY | Rate | Name |
+------+----+------+------+-------+
| B | 1 | 1000 | 21 | Jack |
| B | 2 | 2000 | 12 | Kevin |
| B | 1 | 3000 | 24 | Jack |
| B | 1 | 1000 | 23 | Jack |
| B | 3 | 200 | 13 | Mary |
| B | 2 | 3000 | 12 | Kevin |
| B | 4 | 4000 | 44 | Chris |
| B | 4 | 5000 | 43 | Chris |
| B | 3 | 1000 | 26 | Mary |
+------+----+------+------+-------+
I don't know how I would leverage Sum and Group by to achieve the following result.
+------+----+------+------+-------+------------+
| Type | ID | QTY | Rate | Name | Sum of QTY |
+------+----+------+------+-------+------------+
| B | 1 | 1000 | 21 | Jack | 5000 |
| B | 1 | 3000 | 24 | Jack | Null |
| B | 1 | 1000 | 23 | Jack | Null |
| B | 2 | 2000 | 12 | Kevin | 5000 |
| B | 2 | 3000 | 12 | Kevin | Null |
| B | 3 | 200 | 13 | Mary | 1200 |
| B | 3 | 1000 | 26 | Mary | Null |
| B | 4 | 4000 | 44 | Chris | 9000 |
| B | 4 | 5000 | 43 | Chris | Null |
+------+----+------+------+-------+------------+
Any help is appreciated!
You can use a window function:
select t.*,
       -- attach the per-(type, id) total to one row and NULL elsewhere
       (case when row_number() over (partition by type, id order by name) = 1
             then sum(qty) over (partition by type, id)
        end) as Sum_of_QTY
from yourtable t;   -- substitute your actual table name

SQL Server : How to subtract values throughout the rows by using values in another column?

I have two tables, stock and sales, as below:
Stock Table :
+--------+----------+---------+
| Stk_ID | Stk_Name | Stk_Qty |
+--------+----------+---------+
| 1001 | A | 20 |
| 1002 | B | 50 |
+--------+----------+---------+
Sales Table :
+----------+------------+------------+-----------+
| Sales_ID | Sales_Date | Sales_Item | Sales_Qty |
+----------+------------+------------+-----------+
| 2001 | 2016-07-15 | A | 5 |
| 2002 | 2016-07-20 | B | 7 |
| 2003 | 2016-07-23 | A | 4 |
| 2004 | 2016-07-29 | A | 2 |
| 2005 | 2016-08-03 | B | 15 |
| 2006 | 2016-08-07 | B | 10 |
| 2007 | 2016-08-10 | A | 5 |
+----------+------------+------------+-----------+
With the tables above, how can I find the available stock Ava_Stk for each stock item after every sale?
Ava_Stk is expected to be Stk_Qty minus the Sales_Qty of every sale so far.
+----------+------------+------------+-----------+---------+
| Sales_ID | Sales_Date | Sales_Item | Sales_Qty | Ava_Stk |
+----------+------------+------------+-----------+---------+
| 2001 | 2016-07-15 | A | 5 | 15 |
| 2002 | 2016-07-20 | B | 7 | 43 |
| 2003 | 2016-07-23 | A | 4 | 11 |
| 2004 | 2016-07-29 | A | 2 | 9 |
| 2005 | 2016-08-03 | B | 15 | 28 |
| 2006 | 2016-08-07 | B | 10 | 18 |
| 2007 | 2016-08-10 | A | 5 | 4 |
+----------+------------+------------+-----------+---------+
Thank you!
You want a cumulative sum, subtracted from the stock quantity. In SQL Server 2012+:
select s.*,
(st.stk_qty -
sum(s.sales_qty) over (partition by s.sales_item order by sales_date)
) as ava_stk
from sales s join
stock st
on s.sales_item = st.stk_name;
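For versions before SQL Server 2012, where the windowed running SUM is unavailable, a correlated subquery is one possible sketch of the same idea (assuming Sales_Date orders the sales within each item):

```sql
-- Running total per item via a correlated subquery instead of SUM() OVER
SELECT s.*,
       st.Stk_Qty - (SELECT SUM(s2.Sales_Qty)
                     FROM sales AS s2
                     WHERE s2.Sales_Item = s.Sales_Item
                       AND s2.Sales_Date <= s.Sales_Date) AS Ava_Stk
FROM sales AS s
JOIN stock AS st
  ON s.Sales_Item = st.Stk_Name;
```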

Magento database: Invoice items database table?

Does anyone know where invoice data is stored in the Magento database?
For example, I've found that order data is stored in sales_order, sales_flat_order, and sales_flat_order_item.
I've also found that the main invoice data is stored in sales_order_entity, sales_order_entity_decimal, and sales_order_entity_int. Through those I can change the subtotal and totals of an invoice in the system.
But I can't find the invoice items data. For orders, that data is in sales_flat_order_item, but my sales_flat_invoice_item table is empty!
http://img809.imageshack.us/img809/1921/invoicey.jpg
I will tell you what I know for 1.4.0.1, which is the version I currently develop for; it may or may not be the same for whatever version you are using.
Also, why are you in the database anyway? Magento provides models precisely so that you don't have to work in the database directly. Regardless, here is how I find whatever attribute I'm looking for.
To start, I'm assuming you're already logged in to the database via a MySQL client. Run
SELECT `entity_type_id`,`entity_type_code`,`entity_table` FROM `eav_entity_type`
which will get you something like ...
+----------------+----------------------+----------------------------------+
| entity_type_id | entity_type_code | entity_table |
+----------------+----------------------+----------------------------------+
| 1 | customer | customer/entity |
| 2 | customer_address | customer/address_entity |
| 3 | catalog_category | catalog/category |
| 4 | catalog_product | catalog/product |
| 5 | quote | sales/quote |
| 6 | quote_item | sales/quote_item |
| 7 | quote_address | sales/quote_address |
| 8 | quote_address_item | sales/quote_entity |
| 9 | quote_address_rate | sales/quote_entity |
| 10 | quote_payment | sales/quote_entity |
| 11 | order | sales/order |
| 12 | order_address | sales/order_entity |
| 13 | order_item | sales/order_entity |
| 14 | order_payment | sales/order_entity |
| 15 | order_status_history | sales/order_entity |
| 16 | invoice | sales/order_entity |
| 17 | invoice_item | sales/order_entity |
| 18 | invoice_comment | sales/order_entity |
| 19 | shipment | sales/order_entity |
| 20 | shipment_item | sales/order_entity |
| 21 | shipment_comment | sales/order_entity |
| 22 | shipment_track | sales/order_entity |
| 23 | creditmemo | sales/order_entity |
| 24 | creditmemo_item | sales/order_entity |
| 25 | creditmemo_comment | sales/order_entity |
+----------------+----------------------+----------------------------------+
We want to know more about the "invoice_item" entity, so let's see what attributes it has. Run
SELECT `attribute_id`,`entity_type_id`,`attribute_code`,`backend_type` FROM `eav_attribute` WHERE `entity_type_id`=17;
and you'll get something like ...
+--------------+----------------+----------------------------------+--------------+
| attribute_id | entity_type_id | attribute_code | backend_type |
+--------------+----------------+----------------------------------+--------------+
| 349 | 17 | additional_data | text |
| 340 | 17 | base_cost | decimal |
| 346 | 17 | base_discount_amount | decimal |
| 345 | 17 | base_price | decimal |
| 679 | 17 | base_price_incl_tax | decimal |
| 348 | 17 | base_row_total | decimal |
| 681 | 17 | base_row_total_incl_tax | decimal |
| 347 | 17 | base_tax_amount | decimal |
| 567 | 17 | base_weee_tax_applied_amount | decimal |
| 568 | 17 | base_weee_tax_applied_row_amount | decimal |
| 579 | 17 | base_weee_tax_disposition | decimal |
| 580 | 17 | base_weee_tax_row_disposition | decimal |
| 337 | 17 | description | text |
| 342 | 17 | discount_amount | decimal |
| 336 | 17 | name | varchar |
| 334 | 17 | order_item_id | int |
| 333 | 17 | parent_id | static |
| 341 | 17 | price | decimal |
| 678 | 17 | price_incl_tax | decimal |
| 335 | 17 | product_id | int |
| 339 | 17 | qty | decimal |
| 344 | 17 | row_total | decimal |
| 680 | 17 | row_total_incl_tax | decimal |
| 338 | 17 | sku | varchar |
| 343 | 17 | tax_amount | decimal |
| 571 | 17 | weee_tax_applied | text |
| 569 | 17 | weee_tax_applied_amount | decimal |
| 570 | 17 | weee_tax_applied_row_amount | decimal |
| 577 | 17 | weee_tax_disposition | decimal |
| 578 | 17 | weee_tax_row_disposition | decimal |
+--------------+----------------+----------------------------------+--------------+
The last column (backend_type), combined with the table for the entity (entity_table), tells you where the values for that attribute are stored: attribute "additional_data" should be in sales_order_entity_text with an attribute_id of 349.
Armed with this information, we just need to find an invoice; I'll use an example from a test install of mine. Let's look for the "base_price" of an invoice item.
First, let's find all the items associated with the invoice (in my case, invoice entity_id 1303954):
SELECT * FROM `sales_order_entity` WHERE `entity_type_id`=17 AND `parent_id`=1303954;
which gives 2 items
+-----------+----------------+------------------+--------------+-----------+----------+---------------------+---------------------+-----------+
| entity_id | entity_type_id | attribute_set_id | increment_id | parent_id | store_id | created_at | updated_at | is_active |
+-----------+----------------+------------------+--------------+-----------+----------+---------------------+---------------------+-----------+
| 1303955 | 17 | 0 | | 1303954 | NULL | 2011-06-01 14:10:48 | 2011-06-01 14:10:48 | 1 |
| 1303956 | 17 | 0 | | 1303954 | NULL | 2011-06-01 14:10:48 | 2011-06-01 14:10:48 | 1 |
+-----------+----------------+------------------+--------------+-----------+----------+---------------------+---------------------+-----------+
Let's choose the first one and find its base_price:
SELECT * FROM `sales_order_entity_decimal` WHERE `attribute_id`=345 AND `entity_id`=1303955;
which gives us:
+----------+----------------+--------------+-----------+---------+
| value_id | entity_type_id | attribute_id | entity_id | value |
+----------+----------------+--------------+-----------+---------+
| 7361390 | 17 | 345 | 1303955 | 31.2500 |
+----------+----------------+--------------+-----------+---------+
From there it's just a simple UPDATE to change the value.
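For completeness, that update would look something like the following (the value_id is the one returned by the SELECT above; the new price is a made-up example, and this bypasses all Magento logic, so back up the row first):

```sql
-- Directly overwrite one invoice-item attribute value (hypothetical new price)
UPDATE `sales_order_entity_decimal`
SET `value` = 29.9900
WHERE `value_id` = 7361390;
```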
Again, if you can do it via a Magento model, I would highly suggest you do it that way; but if manual is the only way to go, then I hope this helped :)