Select all in current view in FileMaker Pro 12 - database

I have a FileMaker Pro 12 database in which I can sort records and export selections using checkboxes. I have a script for "Select none" that removes all the ticked items, but I would also like to be able to search and then check all the results.
I have a button on the checkbox image that performs a Set Field with the following calculation:
Case (
ValueCount ( FilterValues ( Table::Checkbox ; Table::ID ) ) > 0;
Substitute ( Table::Checkbox ; Table::ID & ¶ ; "" ) ;
Table::Checkbox & Table::ID & ¶
)
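For reference, the toggle that this calculation performs, remove the ID from the list if it is present and append it otherwise, can be sketched in Python (the record IDs are made up):

```python
def toggle(checkbox: str, record_id: str) -> str:
    # FilterValues: is the ID already in the return-delimited list?
    values = [v for v in checkbox.split("\r") if v]
    if record_id in values:
        # Substitute ( Table::Checkbox ; Table::ID & ¶ ; "" )
        return checkbox.replace(record_id + "\r", "")
    # Table::Checkbox & Table::ID & ¶
    return checkbox + record_id + "\r"

box = ""
box = toggle(box, "A101")   # check A101
box = toggle(box, "A102")   # check A102
box = toggle(box, "A101")   # uncheck A101 again
print(box)
```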
Conditional formatting of the checkbox is:
not ValueCount ( FilterValues ( Table::Checkbox ; Table::ID ) ) > 0
The script for "Select none" is:
Set Field [Table::Checkbox; ""]
So what would the "Select all" script need to be?

There are quite a few methods to collect values across a found set into a return-delimited list. Looping is fine if your found set is fairly small; otherwise it may prove too slow.
Since your target is a global field anyway, you could use a simple:
Replace Field Contents [ No dialog; Table::gCheckbox; List ( Table::gCheckbox ; Table::ID ) ]
This will append the current found set's values to the existing list. To start anew, begin your script with:
Set Field [ Table::gCheckbox; "" ]
Note:
In version 13, you can use the new summary field option of "List".
Caveat:
Make sure you have a backup while you experiment with Replace Field Contents[]; there is no undo.

You could write a script that walks through the current found set and collects all of the IDs:
Set Variable [$currentRecord ; Get(RecordNumber)]
#
Go to Record [First]
Loop
    Set Variable [$ids ; Table::ID & ¶ & $ids]
    Go to Record [Next ; Exit after Last]
End Loop
#
Go to Record [By Calculation ; $currentRecord]
#
Set Field [Table::Checkbox ; $ids]
This method saves your current position, walks through the current found set, compiles the IDs into a variable, returns you to your position, and sets the checkbox field.
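The compile step of that loop can be sketched in Python (the found set and IDs are made up). Note that prepending reverses the record order, which doesn't matter for the checkbox logic:

```python
# Hypothetical stand-in for a FileMaker found set: a list of record IDs.
found_set_ids = ["A101", "A102", "A103"]

# Mirror Set Variable [$ids ; Table::ID & ¶ & $ids] on each pass.
ids = ""
for record_id in found_set_ids:
    ids = record_id + "\r" + ids  # FileMaker's ¶ is a carriage return

# The checkbox field now holds every ID in the found set, so the
# conditional formatting marks every record as checked.
print(ids.split("\r"))
```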

Why does my AppleScript loop fail almost all the time? Oddly, it works sometimes

The code below is used to run a sample text in TextEdit. There are about 14 iterations, and the loop can fail in any one of them. Below is an updated, simplified version of the script. It's still giving me the same problem.
Compare the code in the new image with the dialog result
set AppleScript's text item delimiters to ":"
tell application "TextEdit"
    activate
    tell the front document
        set nameList to the first paragraph
        set remindersList to {}
        set lineCount to count the paragraphs
        repeat with i from 2 to lineCount
            set reminderName to paragraph i
            set end of remindersList to reminderName
        end repeat
    end tell
end tell
tell application "Reminders"
    set newList to make new list
    set name of newList to nameList
    set reminderDate to date
    set listAccount to account
    lineCount = lineCount - 2
    repeat with i from 1 to lineCount
        tell list nameList
            set newremin to make new reminder
            set reminderName to item i of remindersList
            set {remName, remBody1, remBody2} to {text item 1, text item 2, text item 3} of reminderName
            set name of newremin to remName
            set body of newremin to remBody1 & return & remBody2
            set due date of newremin to (current date) + (1 * days)
        end tell
    end repeat
end tell
set AppleScript's text item delimiters to ""
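For reference, the colon-splitting the script relies on can be sketched in Python (the sample text is made up). The three-way unpack fails for any paragraph that doesn't contain exactly two colons, which is one plausible reason an AppleScript loop like this succeeds on some documents and fails on others:

```python
# Hypothetical document text: the first paragraph is the list name,
# remaining paragraphs are colon-delimited reminders, mirroring
# AppleScript's text item delimiters set to ":".
document = "Shopping\nMilk:2 litres:semi-skimmed\nBread:1 loaf:wholemeal"

paragraphs = document.split("\n")
list_name = paragraphs[0]

reminders = []
for line in paragraphs[1:]:
    # {text item 1, text item 2, text item 3} of the paragraph;
    # raises ValueError unless the line has exactly two colons.
    name, body1, body2 = line.split(":")
    reminders.append((name, body1 + "\n" + body2))

print(list_name, reminders)
```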

Calculate Sum and Insert as Row

Using SSIS I am bringing in raw text files that contain this in the output:
I use this data later to report on. The Key columns get pivoted. However, I don't want to show all those columns individually; I only want to show the total.
To accomplish this, my idea was to calculate the sum on insert using a trigger and then insert the sum as a new row into the data.
The output would look something like:
Is what I'm trying to do possible? Is there a better way to do this dynamically during the pivot? To be clear, I'm not just pivoting these rows for a report; there are other rows that don't need the sum calculated.
Using a Derived Column and a Script Component
You can achieve this by following these steps:
Add a Derived Column (name: intValue) with the following expression:
(DT_I4)(RIGHT([Value],2) == "GB" ? SUBSTRING([Value],1,FINDSTRING( [Value], " ", 1)) : "0")
So if the value ends with GB, the number is taken; otherwise the result is 0.
After that, add a Script Component. In the Input and Output Properties, click the output and set the SynchronousInputID property to None.
Add two output columns: outKey and outValue.
In the script editor, write the following script (VB.NET):
Private SumValues As Integer = 0

Public Overrides Sub PostExecute()
    MyBase.PostExecute()
    Output0Buffer.AddRow()
    Output0Buffer.outKey = ""
    Output0Buffer.outValue = SumValues.ToString & " GB"
End Sub

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    Output0Buffer.AddRow()
    Output0Buffer.outKey = Row.Key
    Output0Buffer.outValue = Row.Value
    SumValues += Row.intValue
End Sub
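The flow of that script component, passing each row through while accumulating a running sum, then emitting one extra total row at the end, can be sketched in Python (the sample rows are made up):

```python
# Each tuple stands for an input row: (Key, Value, intValue).
rows = [
    ("Memory Device 1", "8 GB", 8),
    ("Memory Device 2", "16 GB", 16),
]

output = []
sum_values = 0
for key, value, int_value in rows:       # Input0_ProcessInputRow
    output.append((key, value))          # pass the row through
    sum_values += int_value              # accumulate the parsed size

output.append(("", f"{sum_values} GB"))  # PostExecute adds the total row

print(output)
```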
I am going to show you a way, but I don't recommend appending the total to the end of the detail data. If you are going to report on it, show it as a total in the report.
After the source, add a Script Component (transformation):
C#
Add two columns to your data flow: Size (int) and Type (string).
Select Value as read-only.
Here is the code:
string[] splits = Row.value.ToString().Split(' '); // Make sure to use single quotes for char
int goodValue;
if (Int32.TryParse(splits[0], out goodValue))
{
    Row.Size = goodValue;
    Row.Type = "GB";
}
else
{
    Row.Size = 0;
    Row.Type = "None";
}
Now you have the data with the proper data types to do arithmetic in your table.
If you really want the data in your format, add a Multicast and an Aggregate (SUM(Size)), and then merge back into your original flow.
I was able to solve my problem in another way using a trigger.
I used this code:
INSERT INTO [Table] (
[Filename]
, [Type]
, [DeviceSN]
, [Property]
, [Value]
)
SELECT ms.[Filename],
ms.[Type],
ms.[DeviceSN],
'Memory Device.Total' AS [Key],
CAST(SUM(CAST(left(ms.[Value], 2) as INT)) AS VARCHAR) + ' GB' as 'Value'
FROM [Table] ms
JOIN inserted i ON i.Row# = ms.Row#
WHERE ms.[Value] like '%GB'
GROUP BY ms.[filename],
ms.[type],
ms.[devicesn]
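Outside the database, the trigger's aggregation can be sketched in Python (the sample rows are made up; the column layout follows the INSERT above):

```python
# Each tuple stands for a row: (Filename, Type, DeviceSN, Key, Value).
rows = [
    ("dump1.txt", "Server", "SN123", "Memory Device 1", "8 GB"),
    ("dump1.txt", "Server", "SN123", "Memory Device 2", "16 GB"),
    ("dump1.txt", "Server", "SN123", "BIOS Version", "2.1"),
]

# left(Value, 2) cast to INT, summed over rows where Value like '%GB'.
total = sum(int(value[:2]) for *_, value in rows if value.endswith("GB"))

# The extra row the INSERT...SELECT produces.
total_row = ("dump1.txt", "Server", "SN123", "Memory Device.Total",
             str(total) + " GB")
print(total_row)
```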

Filtering a list based on "User!UserID"

Goal:
To filter the table list with a WHERE clause based on the SSRS built-in field "User!UserID".
Problem:
I need to apply the output value of this expression
REPLACE
(
    MID
    (
        User!UserID,
        InStr(User!UserID,"\")+1,
        Len(User!UserID)
    ),
    ".",
    " "
)
in the dataset, inside the WHERE clause, but I get an error.
I also tried to put the expression in a variable and use it in the dataset, but the query designer complains that the variable does not exist. I'm using MDX.
What should I do?
WHERE
(
    FILTER
    (
        xxxxxxxx.xxxxxxxx.ALLMEMBERS AS c,
        c.Current.Name =
        REPLACE
        (
            MID
            (
                User!UserID,
                InStr(User!UserID,"\")+1,
                Len(User!UserID)
            ),
            ".",
            " "
        )
    )
)
1. Apply the expression
REPLACE
(
    MID
    (
        User!UserID,
        InStr(User!UserID,"\")+1,
        Len(User!UserID)
    ),
    ".",
    " "
)
as the default value of a new report parameter.
2. Select the dataset, go to the query designer, and add a new query parameter. Map the new parameter to the dimension attribute that holds the matching first and last names.
3. Add the parameter to the query in the query designer, in MDX.
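For reference, the string manipulation the expression performs, taking the part of DOMAIN\first.last after the backslash and replacing dots with spaces, can be sketched in Python (the sample user ID is made up):

```python
# Hypothetical SSRS User!UserID value.
user_id = r"CONTOSO\john.smith"

# Mid(User!UserID, InStr(User!UserID, "\") + 1, Len(User!UserID))
name = user_id[user_id.index("\\") + 1:]

# Replace(..., ".", " ")
name = name.replace(".", " ")

print(name)  # prints "john smith"
```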

mysql2sqlite.sh script is not working as required

I am using the mysql2sqlite.sh script from GitHub to convert my MySQL database to SQLite. The problem I am getting is that in my table the value 'E-001' gets changed to 'E?001'.
I have no idea how to modify the script to get the required result. Please help me.
The script is:
#!/bin/sh
# Converts a mysqldump file into a Sqlite 3 compatible file. It also extracts the MySQL `KEY xxxxx` from the
# CREATE block and create them in separate commands _after_ all the INSERTs.
# Awk is chosen because it's fast and portable. You can use gawk, original awk or even the lightning fast mawk.
# The mysqldump file is traversed only once.
# Usage: $ ./mysql2sqlite mysqldump-opts db-name | sqlite3 database.sqlite
# Example: $ ./mysql2sqlite --no-data -u root -pMySecretPassWord myDbase | sqlite3 database.sqlite
# Thanks to and #artemyk and #gkuenning for their nice tweaks.
mysqldump --compatible=ansi --skip-extended-insert --compact "$@" | \
awk '
BEGIN {
FS=",$"
print "PRAGMA synchronous = OFF;"
print "PRAGMA journal_mode = MEMORY;"
print "BEGIN TRANSACTION;"
}
# CREATE TRIGGER statements have funny commenting. Remember we are in trigger.
/^\/\*.*CREATE.*TRIGGER/ {
gsub( /^.*TRIGGER/, "CREATE TRIGGER" )
print
inTrigger = 1
next
}
# The end of CREATE TRIGGER has a stray comment terminator
/END \*\/;;/ { gsub( /\*\//, "" ); print; inTrigger = 0; next }
# The rest of triggers just get passed through
inTrigger != 0 { print; next }
# Skip other comments
/^\/\*/ { next }
# Print all `INSERT` lines. The single quotes are protected by another single quote.
/INSERT/ {
gsub( /\\\047/, "\047\047" )
gsub(/\\n/, "\n")
gsub(/\\r/, "\r")
gsub(/\\"/, "\"")
gsub(/\\\\/, "\\")
gsub(/\\\032/, "\032")
print
next
}
# Print the `CREATE` line as is and capture the table name.
/^CREATE/ {
print
if ( match( $0, /\"[^\"]+/ ) ) tableName = substr( $0, RSTART+1, RLENGTH-1 )
}
# Replace `FULLTEXT KEY` or any other `XXXXX KEY` except PRIMARY by `KEY`
/^ [^"]+KEY/ && !/^ PRIMARY KEY/ { gsub( /.+KEY/, " KEY" ) }
# Get rid of field lengths in KEY lines
/ KEY/ { gsub(/\([0-9]+\)/, "") }
# Print all fields definition lines except the `KEY` lines.
/^ / && !/^( KEY|\);)/ {
gsub( /AUTO_INCREMENT|auto_increment/, "" )
gsub( /(CHARACTER SET|character set) [^ ]+ /, "" )
gsub( /DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP|default current_timestamp on update current_timestamp/, "" )
gsub( /(COLLATE|collate) [^ ]+ /, "" )
gsub(/(ENUM|enum)[^)]+\)/, "text ")
gsub(/(SET|set)\([^)]+\)/, "text ")
gsub(/UNSIGNED|unsigned/, "")
if (prev) print prev ","
prev = $1
}
# `KEY` lines are extracted from the `CREATE` block and stored in array for later print
# in a separate `CREATE KEY` command. The index name is prefixed by the table name to
# avoid a sqlite error for duplicate index name.
/^( KEY|\);)/ {
if (prev) print prev
prev=""
if ($0 == ");"){
print
} else {
if ( match( $0, /\"[^"]+/ ) ) indexName = substr( $0, RSTART+1, RLENGTH-1 )
if ( match( $0, /\([^()]+/ ) ) indexKey = substr( $0, RSTART+1, RLENGTH-1 )
key[tableName]=key[tableName] "CREATE INDEX \"" tableName "_" indexName "\" ON \"" tableName "\" (" indexKey ");\n"
}
}
# Print all `KEY` creation lines.
END {
for (table in key) printf key[table]
print "END TRANSACTION;"
}
'
exit 0
I can't give a guaranteed solution, but here's a simple technique I've been using successfully to handle similar issues (see "Notes", below). I've been wrestling with this script for the last few days, and figured this is worth sharing in case there are others who need to tweak it but are stymied by the awk learning curve.
The basic idea is to have the script output to a text file, edit the file, then import into sqlite (More detailed instructions below).
You might have to experiment a bit, but at least you won't have to learn awk (though I've been trying and it's pretty fun...).
HOW TO
Run the script, exporting to a file (instead of passing directly
to sqlite3):
./mysql2sqlite -u root -pMySecretPassWord myDbase > sqliteimport.sql
Use your preferred text editing technique to clean up whatever mess
you've run into. For example, search/replace in Sublime Text. (See the last note, below, for a tip.)
Import the cleaned up script into sqlite:
sqlite3 database.sqlite < sqliteimport.sql
NOTES:
I suspect what you're dealing with is an encoding problem -- that '-' represents a character that isn't recognized by, or means something different to, either your shell, the script (awk), or your sqlite database. Depending on your situation, you may not be able to finesse the problem (see the next note).
Be forewarned that this is most likely only going to work if the offending characters are embedded in text data (not just as text, but actual text content stored in a text field). If they're in a machine name (foreign key field, entity id, e.g.), binary data stored as text, or text data stored in a binary field (blob, eg), be careful. You could try it, but don't get your hopes up, and even if it seems to work be sure to test the heck out of it.
If in fact that '-' represents some unusual character, you probably won't be able to just type a hyphen into the 'search' field of your search/replace tool. Copy it from the source data (eg., open the file, highlight and copy to clipboard) then paste into the tool.
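As a sketch of that clean-up step, assuming the '?' stands in for a non-ASCII dash (an en dash, U+2013, is used here purely as a hypothetical), a small Python pass can normalize it to an ASCII hyphen before the dump reaches sqlite3:

```python
# Hypothetical dump line: 'E-001' was written with an en dash, which a
# mismatched encoding somewhere in the pipeline renders as '?'.
raw = "INSERT INTO parts VALUES ('E\u2013001');"

# Normalize the suspect character to a plain ASCII hyphen.
cleaned = raw.replace("\u2013", "-")

print(cleaned)
```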
Hope this helps!
To convert MySQL to SQLite3 you can use Navicat Premium.

Lua - SQLite3 isn't inserting rows into its database

I'm trying to build an expense app for Android phones, and I need a way to display the expenses. The plan (for my current step) is to allow the user to view their expenses. I want to show a calendar-like screen, and if there is at least one expense for a day, then use a different color for the button.
My problem is in inserting information to the sqlite3 table. Here is my code:
require "sqlite3"

-- create path
local path = system.pathForFile( "expenses.sqlite", system.DocumentsDirectory )
file = io.open( path, "r" )

if ( file == nil ) then
    -- Doesn't already exist, so copy it in from the resource directory
    pathSource = system.pathForFile( "expenses.sqlite", system.ResourceDirectory )
    fileSource = io.open( pathSource, "r" )
    contentsSource = fileSource:read( "*a" )
    -- Write destination file in documents directory
    pathDest = system.pathForFile( "expenses.sqlite", system.DocumentsDirectory )
    fileDest = io.open( pathDest, "w" )
    fileDest:write( contentsSource )
    -- Done
    io.close( fileSource )
    io.close( fileDest )
end

db = sqlite3.open( path )

-- set up the table if it doesn't exist
local tableSetup = [[CREATE TABLE IF NOT EXISTS expenses (id INTEGER PRIMARY KEY, amount, description, year, month, day);]]
db:exec( tableSetup )

local tableFill = [[INSERT INTO expenses VALUES (NULL,']] .. 15 .. [[',']] .. "Groceries" .. [[',']] .. 2013 .. [[',']] .. 4 .. [[',']] .. 8 ..[[');]]
db:exec( tableFill )

for row in db:nrows("SELECT * FROM expenses") do
    print("hi")
    if row.year == dateTable[i].year and row.month == dateTable[i].month and row.day == dateTable[i].day then
        flag = dateTabel[i].day
    end
end
I have looked everywhere to check whether I've used the wrong sqlite3 commands, since I'm not very familiar with it, but I tried everything I found and nothing worked. The print("hi") line doesn't execute, which tells me that there are no rows in the table.
Also, if I say db:nrows("SELECT year, month, day FROM expenses"), sqlite3 gives me an error saying there is no year column. My overall guess is that I'm not inserting the information into the table properly, but I've tried everything I can think of. Can anyone help?
I figured out that there was an issue between the current version of sqlite3 and the one I have on my computer. Anyway, I changed a couple of lines, the SELECT statement and the for loop, and now it works flawlessly.
--sqlite statement
local check = "SELECT DISTINCT year, month, day FROM expenses WHERE year = '"..dateTable[i].year.."' AND month = '"..dateTable[i].month.."' AND day = '"..dateTable[i].day.."'"
--check if there is at least one expense for a given day
--if there isn't one, the for loop won't execute
for row in db:nrows(check) do
    flag = row.day
end
Then I go on to create a button with a different color if the day number is equal to the flag variable.
This is all inside another for loop, which creates each dateTable[i].
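For comparison, the same insert-and-query round trip can be sketched with Python's sqlite3 module using parameter binding, which avoids the quoting pitfalls of concatenated SQL strings (the schema follows the question; the in-memory database is just for illustration):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE IF NOT EXISTS expenses
              (id INTEGER PRIMARY KEY, amount, description, year, month, day)""")

# Placeholders keep numbers as numbers and quote text safely.
db.execute("INSERT INTO expenses VALUES (NULL, ?, ?, ?, ?, ?)",
           (15, "Groceries", 2013, 4, 8))
db.commit()

# The day-check query, bound the same way.
rows = db.execute(
    "SELECT * FROM expenses WHERE year = ? AND month = ? AND day = ?",
    (2013, 4, 8)).fetchall()
print(rows)
```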