Background:
An item is manufactured by three or more machines in a production line. The item has no serial number, QR code, or any other identifier. We only have boxes (with QR codes) in which these items are stored and transported from one machine to another.
Each machine adds some attributes to the item, and at the end we would like to have a list of all attributes for each item.
Because we can't store the attributes on each individual item, we have to store them on the box. This means one box can have multiple attributes of the same type but with different values. An example would be the serial number of the print cartridge, which could change while the items in one box are being processed: the first x items were printed with cartridge serial_1 and the remaining items with cartridge serial_2.
So at the end, when we take an item out of the box, we have two serial numbers for the print-cartridge attribute. This is fine and not a problem.
How should these attributes be tracked/stored in a database? Or is there another way to track this kind of information?
My current solution uses 3 DB tables:
Machines
to keep track of where/who performed an action
id, name, ip, location, etc
Boxes
the only thing that has a QR code for identification; each box is labeled
id, name, serial
Operations
history of events with generated data (item_printed, serial of cartridge)
id, machine_id, box_id, type_of_operation, timestamp, data
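For concreteness, a minimal DDL sketch of these three tables could look like the following (written in T-SQL flavour; the column types and the free-form data column are my assumptions, not requirements):

-- Sketch only: column names follow the lists above, types are assumptions.
CREATE TABLE Machines (
    id        INT IDENTITY PRIMARY KEY,
    name      VARCHAR(100) NOT NULL,
    ip        VARCHAR(45),
    location  VARCHAR(100)
);

CREATE TABLE Boxes (
    id      INT IDENTITY PRIMARY KEY,
    name    VARCHAR(100),
    serial  VARCHAR(100) NOT NULL UNIQUE   -- the value encoded in the QR code
);

CREATE TABLE Operations (
    id                 BIGINT IDENTITY PRIMARY KEY,
    machine_id         INT NOT NULL REFERENCES Machines (id),
    box_id             INT NOT NULL REFERENCES Boxes (id),
    type_of_operation  VARCHAR(50) NOT NULL,    -- e.g. 'created', 'item_printed'
    [timestamp]        DATETIME2 NOT NULL,
    data               NVARCHAR(MAX)            -- free-form payload, e.g. cartridge serial
);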
My concern is that there will be thousands/millions of operations for every single box, and this could impact the performance/scalability of the production line, because at the end, when we retrieve an item, we have to go back in the history and collect all data from the previous events up to a specific operation of type "created".
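Concretely, the history walk described above would look roughly like this (a sketch that assumes the schema sketched earlier; @box_id is a placeholder, and the index is only a suggestion):

-- An index so the per-box history walk does not scan the whole table.
CREATE INDEX IX_Operations_Box_Time ON Operations (box_id, [timestamp]);

-- Collect everything for a box back to (and including) its latest 'created' operation.
SELECT o.*
FROM   Operations o
WHERE  o.box_id = @box_id
  AND  o.[timestamp] >= (SELECT MAX([timestamp])
                         FROM   Operations
                         WHERE  box_id = @box_id
                           AND  type_of_operation = 'created')
ORDER BY o.[timestamp];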
Any suggestions on how to improve this, or any other ideas for this problem?
Related
I am trying to replace our outdated shipping label program with a SQL Server report, and I am very green when it comes to doing this. I am testing using an ODBC connection with some sample data. I have created a DataSource which is a simple "SELECT * FROM LABELLIST" statement. Each row in this table contains ONE label.
In the DataSet I have each column that is needed (To, From, Carrier, PO, etc.) listed. I have dragged the fields onto a blank report, and the expression is set to "=Fields!FROMADDRESS.Value". As mentioned earlier, each ROW is a new label, but when previewing, I only get the first record as a label. What do I have to use (tablix, matrix, list, grouping?) to accomplish this, and how? I'm not sure how to search for this answer online and was hoping to get a reference page to read on how to do it. Everything I've found pertaining to labels or "row to page mapping" shows how to print multiple labels/rows on one page, not each row on a single page.
Edit: to clarify, each label is being sent to a Zebra thermal printer and follows a similar format to a UPS or FedEx shipping label. Each row in the table will be one shipping label.
The key thing for you to understand is how SSRS handles page breaks. I have a similar answer here.
Whatever formatting you have for your labels should be placed inside a Rectangle.
Place this rectangle into a table with one cell that is grouped by label ID.
Set the group to page break between instances.
This makes the report repeat one instance of the label on each page. It can be a little tricky to understand at first, but it is a very useful trick. I have used this for reports like invoices where we needed one on each page.
I need to use SSRS to create many different reports, and I have been trying to find the best way to easily create them as needed, and for users to navigate them and use them for their needs.
To give you an idea of the two sets of data I am dealing with:
EDI file from our customer
Raw data output from hardware configuration
Now the EDI data is fairly consistent, so these columns are static.
The hardware data is usually a massive list of different configurations. I receive it in different flat file formats, and using SSIS or other tools I get the data into key/value pairs. Now, in a report, I use a matrix to keep the EDI columns static; it matches with the hardware on serial number, and the hardware data pivots.
So that the report does not break, and so I don't give the user too much information, it matches up against another table where I specify which keys I want to become columns.
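Roughly, the dataset feeding the matrix looks like the sketch below (all table and column names here are illustrative placeholders, not my actual schema):

-- Key/value hardware data joined to the static EDI rows, limited to the keys
-- flagged as report columns.
SELECT e.SerialNumber, e.EdiColumn1, e.EdiColumn2,
       h.KeyName, h.KeyValue
FROM   EdiData       AS e
JOIN   HardwareKV    AS h  ON h.SerialNumber = e.SerialNumber
JOIN   ReportColumns AS rc ON rc.KeyName     = h.KeyName;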
Here is a small example of one of my reports:
The green columns are EDI, while the orange is the hardware.
My question is: is there a better way for me to be doing this? Some reports can get complicated, like needing totals for certain hardware (counting hard drive space, total RAM, etc.), which is difficult to do dynamically.
I have tried creating reports in this fashion, with these parameters:
This way I can create the key columns per project, and the user can select which report they want to run. The default is All Data.
Is there a better way for me to create these reports? SSRS really doesn't seem to play well with dynamic pivots.
Is there a better tool that will handle these reports dynamically, or let users pick and choose what they want to see in a report?
I can't visualise your data but if I understand correctly, you could have a dropdown list showing all the unique values that are in the column you are using in the column group. Set this to be multi-value and then simply have the WHERE clause read something like
SELECT * FROM myTable WHERE myColumnGroupField IN (@myColumnChoiceParameter)
This way the user could select whichever columns they would like.
You could extend this by adding another parameter that has some preset groups of columns (I think you might have one of these already if I understand correctly) that would set the default value of the main @myColumnChoiceParameter parameter.
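For example, the preset parameter could feed the default values of @myColumnChoiceParameter through a small dataset along these lines (the ReportColumnGroups table and @myPresetGroupParameter are invented names for the sketch):

-- One row per column-group value that the chosen preset should pre-select.
SELECT KeyColumn
FROM   ReportColumnGroups
WHERE  GroupName = @myPresetGroupParameter;

Point the "Default Values" of @myColumnChoiceParameter at this dataset so that picking a preset pre-ticks the matching columns.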
If you want something more flexible then you might want to look at Power BI but depending on how you intend to deploy that might not be a simple option.
You cannot dynamically create columns in SSRS but you can control the visibility of the columns.
1) Create a list in a table that contains the names of all the columns that you want to toggle, and include an 'All' entry.
2) Create a parameter that is based on this table and make sure multi-select is turned on.
3) Right click on every column that you want to toggle, select visibility and then create a condition that checks if the user either selected All or selected the column from the parameter list.
4) Train users that by selecting and deselecting from the dropdown they control what's visible.
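As a rough illustration of steps 1 and 2, the dataset behind the parameter could simply be the column-name table with the extra 'All' entry (names here are placeholders, not a real schema):

-- Available values for the multi-select parameter that drives column visibility.
SELECT 'All' AS ColumnName
UNION ALL
SELECT ColumnName
FROM   ReportToggleColumns;   -- assumed table listing the toggleable columns

Each column's visibility expression then only needs to check whether 'All' or that column's own name is among the selected parameter values.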
I've created a Table Control which is used for entering data. Data will flow from the table control to an internal table and then to the database. Subroutines will be used to read the data into the internal table. Currently I'm stuck building the Table Control logic. There are 8 rows displayed initially in the TC; after the 1st row is filled and the user presses Enter, the data inside the TC is gone. I tried debugging the standard SAP programs to check their logic, but the naming convention is confusing me a lot. Another issue is blank rows: suppose the user enters a value in the 4th row (keeping the 3rd row blank); the row should move up automatically after the user presses Enter.
Any solution for the above problem, or any useful link which I can refer to?
Assuming the internal table for the table control is declared as a global variable, ensure you populate the data stored in this internal table to your table control at PBO time.
Please provide some code so we can get a clue about what exactly you are doing.
I'm currently struggling with Oracle Apex.
I'm trying to create an application which enables customers to place their orders. Therefore I created a report which lists the available products. Furthermore, the report contains a column (the SQL query for that is simply '0' as "Quantity") which displays a text box. In this text box the customer should be able to enter the required quantity.
I've created a screenshot to make it easier to follow me:
After the customer has filled out the form, the "Place Order" button will then order the requested items.
My question now is: how is it possible to read out which text boxes the user filled a number into, and which product each of them belongs to?
An easier solution would be to recreate the region, but choose Form Region and then Tabular Form Region; the wizard will then help take care of the DML for you. However, you need to use specific table columns for this to work.
To answer your question more directly - the input items defined in reports that are posted to the server can be accessed in PL/SQL as a set of "Global Arrays". These are defined as PL/SQL tables in the package apex_application with the names g_f01 through g_f50.
To be sure which of these arrays to use for the quantity text box you can look at the html of the page for the name attribute of the input tag. If it is f01 then you would be able to process the results by accessing each position or element in apex_application.g_f01.
To link the input with the table you would need some sort of key. If you use the wizard to build a Tabular Form all this headache is taken care of for you though.
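As a rough, untested sketch: assuming the report query renders a hidden product id and the quantity text box via APEX_ITEM so that they arrive as f01 and f02, an on-submit PL/SQL process could walk the arrays like this (the order_items table and the f01/f02 positions are assumptions you would need to verify against your page):

-- Hypothetical on-submit process: f01 = hidden product id, f02 = quantity text box.
BEGIN
  FOR i IN 1 .. apex_application.g_f01.COUNT LOOP
    -- only rows where the customer actually typed a quantity
    IF apex_application.g_f02(i) IS NOT NULL
       AND TO_NUMBER(apex_application.g_f02(i)) > 0 THEN
      INSERT INTO order_items (product_id, quantity)   -- assumed target table
      VALUES (TO_NUMBER(apex_application.g_f01(i)),
              TO_NUMBER(apex_application.g_f02(i)));
    END IF;
  END LOOP;
END;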
I have a table containing user input which needs to be optimized.
I have some ideas about how to solve this, but I would really appreciate your input. The table that needs optimization is called Value in the structure below.
All tables mentioned below have integer primary keys called Id.
Specs: MS SQL Server 2008, LINQ to SQL, ASP.NET website, C#.
The current structure looks as follows:
Page -> Field -> FieldControl -> ValueGroup -> Value
Page
A page is a container for one or more Fields.
Field
A field is a container for one or more FieldControls such as a textbox or dropdown-options.
Relationships: PageId
FieldControl
If a Field is of the type 'TextBox' then a single FieldControl is created for the Field.
If a Field is of the type 'DropDown' then one FieldControl per dropdown option is created for the Field containing the option text.
Relationships: FieldId
ValueGroup
Each time a user fills in Fields within a Page and saves it, a new ValueGroup (Id) is created to keep track of the user input belonging to that save. When a user wants to look at a previously filled-in form, the ValueGroup is used to load the Values into the FieldControls of that previously filled-in instance.
Relationships: None
Value
The actual input of a FieldControl. If the user typed 'Hello' in a TextBox, then 'Hello' would be stored as a row in this table, together with a reference back to the FieldControl that 'Hello' was entered into. A ValueGroup is linked to Values in order to group them and keep track of which save/instance they belong to, as described under ValueGroup.
Relationships: ValueGroupId, FieldControlId
The problem
If 100,000 Pages are fully filled in, each containing 10 TextBoxes, then we get 100,000 * 10 records in the Values table, meaning we quickly reach one million records, which makes it really slow as it is now. The user can create as many different Pages with as many different Fields as he/she likes, and all these values are stored in the Values table. The way I use this data is either by displaying a gridview with pagination showing all records for a single Page type, or by looking at a specific Page instance (Values grouped by ValueGroupId).
Some ideas that I have:
Good indexing should be very important when optimizing the Values table.
Should I perhaps add a foreign key directly back to Page from Value, ending up with indexing by (Id, PageId, ValueGroup), allowing the gridview to retrieve values that are only relevant to one Page? (A sketch of what this could look like follows this list.)
Should I look into partitioning the table, and if so, how would you recommend I do it? I was thinking that partitioning by Page, hence getting chunks of values that are only relevant to a certain Page, would be wise in this case, right? How would the script/schema look for something like that, where Pages can be created/removed at any time by the users?
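Purely as a sketch of the first idea (the foreign key back to Page), with column and index names invented by me:

-- Carry the PageId on each Value row so the gridview can filter directly,
-- then index the access path described above.
ALTER TABLE [Value] ADD PageId INT NULL;

ALTER TABLE [Value] ADD CONSTRAINT FK_Value_Page
    FOREIGN KEY (PageId) REFERENCES Page (Id);

CREATE INDEX IX_Value_Page_ValueGroup ON [Value] (PageId, ValueGroupId);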
PS: There should be a badge on this forum for all people who finished reading this long post, and I hope I've made myself clear :)
Just to close this post: correct indexing solved all the performance problems.
This may be slightly off-topic, but why? Is this data that you need to access in real-time, or is it for some later processing? Could you perhaps pack the data into a single row and then unpack it later?
Generic
You say it is slow now, and there can be many reasons for that other than the database, such as low memory, high CPU, disk fragmentation, network load, socket problems, etc.
This should show up on a system monitor
Try, for instance, the Sysinternals (now MS) tool: http://live.sysinternals.com/procexp.exe
But if that is all under control then back to the database.
Database index
One million records is not "that much" and should not be a problem.
An index should do the trick if you don't have any indexes right now.
You should probably set indexes on all tables if you haven't done so already.
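For example (a sketch only; it assumes the foreign-key column names from the question and invents the index names):

-- Index the foreign keys that the lookups described in the question go through.
CREATE INDEX IX_Field_PageId         ON Field        (PageId);
CREATE INDEX IX_FieldControl_FieldId ON FieldControl (FieldId);
CREATE INDEX IX_Value_FieldControlId ON [Value]      (FieldControlId);
CREATE INDEX IX_Value_ValueGroupId   ON [Value]      (ValueGroupId);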
I tried to do a database model; is this right:
http://www.freeimagehosting.net/image.php?a39cf99ae5.png
Table structure (?)
Page -> Field -> FieldControl -> ValueGroup -> Value
The table structure looks like it may not be the optimal one, but it is hard to say exactly since I don't know how the application works.
Do all tables have the foreign keys of the table above?
Is this somewhat similar to your code?
Pseudo code:
1. Get the page info. Gives the key "page-id".
2. Get all Fields marked with that "page-id". Gives the keys "field-id" & "fieldcontrol-id".
3. Loop through all field-ids and get the FieldControl for each one.
4. Loop through all field-ids and get all ValueGroups. Gives a list of "valuegroup-id" keys.
5. Loop through all ValueGroups and get all fields.
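If the loops above match what the code does, the whole instance could instead be fetched in one round trip; a sketch, assuming the tables and foreign keys as described in the question (@pageId and @valueGroupId are placeholders, and the column holding the actual user input would need to be added to the select list):

-- Load one saved instance (one ValueGroup) of one Page in a single query.
SELECT f.Id  AS FieldId,
       fc.Id AS FieldControlId,
       v.Id  AS ValueId
FROM   Field        f
JOIN   FieldControl fc ON fc.FieldId       = f.Id
LEFT JOIN [Value]   v  ON v.FieldControlId = fc.Id
                      AND v.ValueGroupId   = @valueGroupId
WHERE  f.PageId = @pageId;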