feature

    open_file_sample
        local
            l_file: UNIX_FILE_INFO
            l_path: STRING
        do
            make
            l_path := "/var/log/syslog"
            l_file.update (l_path)
            if l_file.parent_directory.exists and then l_file.parent_directory.is_writtable then
                create l_file.make
            end
            -- As the above statement doesn't exist!
            check
                syslog_file_exists_and_is_readable: l_file.exists and then l_file.is_readable
            end
        end
Is this the proper way to check for file existence in Eiffel?
I was wondering if there is a way to avoid creating 2 objects. I'll complete my check with the following steps:
define the path: l_file_path := "/some/path/with_file.log"
check whether the parent directory exists and whether I have the rights to write into it
create the log file
The problem when accessing the file system is that the property of a file or directory may have changed between the time you query it and the time you want to use it (even if it's only a small fraction of a second). Because of that, assertions in Eiffel of the form:
f (a_file: RAW_FILE)
    require
        a_file.is_writable
    do
        a_file.open_write
may be violated. In the Gobo Eiffel libraries, instead of checking whether a file can be opened in write mode before actually opening it, the reverse approach was chosen: try to open the file, and check whether it was opened successfully.
f (a_pathname: STRING)
    local
        l_file: KL_TEXT_OUTPUT_FILE
    do
        create l_file.make (a_pathname)
        l_file.recursive_open_write
        if l_file.is_open_write then
            -- Write to the file.
            l_file.close
        else
            -- Report the problem.
        end
    end
Note that it uses recursive_open_write and not just open_write so that missing directories in the path get created as well.
You can use
{FILE_UTILITIES}.file_exists (the_file_name)
or
(create {RAW_FILE}.make_with_name (the_file_name)).exists
do
    if not l_file.exists then
        print ("error: '" + l_path + "' does not exist%N")
    else
        ...
You can do something similar to this.
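For example, a minimal sketch using FILE_UTILITIES; the feature name report_if_missing and its a_path argument are made up for illustration:
report_if_missing (a_path: STRING)
        -- Print an error if `a_path' does not name an existing file.
    local
        l_fu: FILE_UTILITIES
    do
        create l_fu
        if not l_fu.file_exists (a_path) then
            print ("error: '" + a_path + "' does not exist%N")
        end
    end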
My final solution is the following, and is open to criticism. I personally find it very complicated in comparison to more low-level languages and libraries (bash, for example).
log_file_path: detachable PATH
        -- Attached if the log file can be created.
    local
        l_file: UNIX_FILE_INFO
        l_path, l_parent_dir: PATH
        l_fu: FILE_UTILITIES
    do
        create l_fu
            -- Parent directory check
        create l_path.make_from_string ({APP_CONFIGURATION}.application_log_file_path)
        l_parent_dir := l_path.parent
        if not l_fu.directory_exists (l_parent_dir.out) then
            l_fu.create_directory_path (l_parent_dir)
        end
        create l_file.make
        l_file.update (l_parent_dir.out)
        if not l_file.exists or not l_file.is_access_writable then
            io.putstring ("Error: " + log_file_path_string + " parent directory is not writable and cannot be created")
            check
                parent_dir_exists_and_is_writable: False
            end
        else
            Result := l_path
        end
    ensure
        file_name_could_be_created: Result /= Void
    end
Below is a Tcl script I am not sure about; could someone please help me solve the issue?
I am getting one error: should be "proc name args body"
proc {
puts "########### Trying to find the Id's ###########"
mql start transaction
set Id {mql temp query bus 'AIRBUS_E_Document_ElectricalDiagram' * * where 'attribute[clau*].value==FALSE' select id;}
set error[Catch {proc $Id} sResult]
If {$error == 0}{
puts "$Id"
}else{
puts "Error -$sResult"
mql abort transaction
}
puts "######## Finding Id's are Completed #########"
}
Please let me know if changes are required here.
proc documentation -- missing procname and arglist
if documentation
Tcl is a word-oriented language, so it is vital that arguments to commands are separated by whitespace
If {$error == 0}{ ==> if {$error == 0} {
}else{ ==> } else {
set error[Catch {proc $Id} sResult]
again, missing space after "error"
I don't know what you want to do here.
See also the rules of Tcl syntax -- there are only 12 of them, so spend some time reading that.
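Putting those fixes together, here is a sketch of how the procedure might look. It assumes, as your script implies, that mql is available as a Tcl command in your environment, and that the query string is what you actually intend to run; findIds is a made-up name:
proc findIds {} {
    puts "########### Trying to find the Id's ###########"
    mql start transaction
    # catch returns non-zero when the command inside it raises an error
    if {[catch {mql temp query bus 'AIRBUS_E_Document_ElectricalDiagram' * * where 'attribute\[clau*\].value==FALSE' select id} result]} {
        puts "Error - $result"
        mql abort transaction
    } else {
        puts $result
    }
    puts "######## Finding Id's are Completed #########"
}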
I am trying to write a policy which governs administrator username compatibility, which consists of three rules: alphanumeric value, not part of disallowed names (admin, administrator etc.), and longer than 5 characters.
I have found that when using OPA as part of a CI pipeline (which is my use case), the most comfortable solution is to create an object (dictionary) which contains policy results so that I can query them directly. My line inside the CI pipeline will look like this:
for file in rego_directory:
opa eval -i file -d data.json "package_name.policy"
Which doesn't print all of the variables and temporary resources I use inside the rego file (which saves a lot of logs and outputs). In order to make this "policy" object, I have inside my rego file the following pattern:
policy[policy_name] = result {
policy_name :=
...
computations...
...
result := <logical condition>
}
Now, my questions are as follows: This doesn't seem like a best practice to me. Is there another way to simply omit the variables from the output? Previously I've worked with separate values (i.e. something like this):
default policy_1 = false
default policy_2 = false
policy_1 = {
<logical condition>
}
policy_2 = {
<logical condition>
}
Second question: How do I create an output dictionary (since JSON output of a dictionary is a nice format to work with, after all) which can satisfy multiple conditions? Looking back at my example, I can't write something like this:
policy[policy_name] = result {
policy_name :=
...
computations...
...
result := <logical condition>
result := <logical condition 2>
}
That's a double assignment, which is invalid. Even if I use = instead of :=, it creates conflicts when one term is true and the other is false, and errors are not what I'm looking for (I need boolean answers). How do I create a complex rule whose output I can put inside this dictionary?
Thanks in advance.
TLDR; to answer your questions explicitly:
Now, my questions are as follows: This doesn't seem like a best practice to me. Is there another way to simply omit the variables from the output?
There's nothing wrong with having your software query for the value generated by some rule. In fact, rules are the fundamental way of defining abstractions in your policies so that you can decouple policy decision-making (i.e., the logic that produces the decision) from policy enforcement (i.e., the logic/actions taken based on the policy decision.)
The only real alternative is to query for a set of rules inside one or more packages like you showed.
Second question: How do I create an output dictionary (since JSON output of a dictionary is a nice format to work with, after all) which can satisfy multiple conditions? Looking back at my example, I can't write something like this:
You're asking how to express Logical OR. In this case, you would create multiple definitions (we call them "incremental definitions" in some places) of the policy rule:
policy[policy_name] = result {
policy_name :=
...
computations...
...
result := <logical condition>
}
policy[policy_name2] = result2 {
policy_name2 :=
...
some other computations...
...
result2 := <some other logical condition>
}
This snippet defines an object (named policy) that maps policy_name to result and policy_name2 to result2. We call this kind of rule a Partial Object. You can also define Partial Sets. When you define a Partial Object, you need to ensure that each key maps to at-most-one value (otherwise you'll get a runtime error.)
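For illustration, here is a minimal sketch of that pattern with two concrete keys; the package name, key names, and conditions below are made up:
package example

policy["username_is_long_enough"] = result {
    result := count(input.username) > 5
}

policy["username_is_alphanumeric"] = result {
    result := re_match("^[a-zA-Z0-9]+$", input.username)
}
Querying data.example.policy then yields one object with one key per rule, e.g. {"username_is_alphanumeric": true, "username_is_long_enough": false}.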
Another way of structuring your policy would be to (essentially) define a deny-list using partial sets:
package usernames
deny["username must use alphanumeric characters"] {
    not re_match("^[a-zA-Z0-9]+$", input.username)
}
deny["username must be at least 5 characters long"] {
    count(input.username) < 5
}
deny["username is reserved"] {
    reserved_usernames[input.username]
}
reserved_usernames := {"admin", "root"}
Then your CI pipeline can simply query for deny:
opa eval -i input.json 'data.usernames.deny'
The result will contain reasons why the username should not be allowed.
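For example, with an input.json of {"username": "admin"}, the deny set evaluates to ["username is reserved"]; an empty result means the username passes all three rules.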
Delete all files except the files in array
files = {"init.lua", "client.lua", "config.htm", "server.lua", "update.lua"}
As mentioned in the documentation, you can get a list of all the files on the filesystem using file.list().
This returns a map file name => file size; so the filename is the table key, which you can make use of.
Keep in mind that:
Table indexing in Lua is acceptably fast
Key-Value pairs can be removed by setting the value to nil
So we can do something like this:
local whitelist = {"init.lua", "client.lua", "config.htm", "server.lua", "update.lua"}
local files = file.list()
-- Remove files found in the whitelist
for _, filename in ipairs(whitelist) do
    files[filename] = nil
end
-- Delete the remaining files
for filename in pairs(files) do
    file.remove(filename)
end
Firstly, we can create 3 array-like tables: one for the protected files (the one you provided), another to list all files in the directory and the last to select the files we will delete.
local protected = {"init.lua", "client.lua", "config.htm", "server.lua", "update.lua"}
local found = {}
local delete = {}
Next, we will gather the directory contents:
for name in io.popen([[dir "" /b]]):lines() do
    table.insert(found, name)
end
It is worth noting that it will list folders too, but that doesn't matter here, since we won't be able to delete those anyway.
Now, with a simple function and a for-loop, we copy the values from found into delete, except for the ones in protected:
local function contains(t, v)
    for index, value in ipairs(t) do
        if value == v then
            return true
        end
    end
    return false
end

for _, value in ipairs(found) do
    if not contains(protected, value) then
        table.insert(delete, value)
    end
end
Now we delete the files in delete:
for _, filename in ipairs(delete) do
    print("Deleting "..filename.." (unless it is a folder)")
    os.remove(filename) -- this function is simply unable to delete folders
end
And this is it. I must admit, I'm running it on Windows, so I used the dir command. Where you want to run it, the listing command may need to be changed to ls (for example from io.popen([[dir "" /b]]):lines() to io.popen([[ls -1]]):lines()), since ls does not understand the /b switch.
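If you need the same script to run on both systems, one possible sketch (assuming io.popen is available, as in the snippet above) is to pick the listing command based on Lua's path separator:
-- package.config's first character is "\" on Windows and "/" elsewhere
local is_windows = package.config:sub(1, 1) == "\\"
local list_cmd = is_windows and [[dir "" /b]] or [[ls -1]]
for name in io.popen(list_cmd):lines() do
    table.insert(found, name)
end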
Hopefully an easy problem for an experienced SQL person. I have an application which uses SQL Server, and I cannot perform this query in the application, so I'm hoping to back-door it, but I need help.
I have a table with a large list of emails and all its metadata. I'm trying to find email that is only between parties of this one company and flag them.
What I did was search where companyName.com is in To and From and marked a TagField as 1 (I did this through my application's front end).
Now what I need to do is search for any other possible values, ignoring companyName.com, in To and From on the rows I've already flagged as 1 in TagField. From will usually have just one value, but To could have multiple values, all formatted differently but all separated by a semicolon (I will probably have to apply this same search to the CC and BCC columns, too).
Any thoughts?
Replace the ; with the empty string. Then check to see if the length changed. If there's one email address, there shouldn't be a ';'. You could also use the same technique to replace the company name with the empty string. Anything left would be the other companies.
select email_id, to_email
from yourtable
where TagField = 1 and len(to_email) <> len(replace(to_email,';',''))
This solution is based on the following thread
Number of times a particular character appears in a string
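The same replace trick can be used for the second idea: strip the company domain and check whether any other address remains. Here '@companyname.com' is a placeholder for your actual domain, and the table and column names follow the example above:
select email_id, to_email
from yourtable
where TagField = 1
  and replace(lower(to_email), '@companyname.com', '') like '%@%'
If any '@' is left after stripping the company domain, at least one recipient belongs to another domain.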
So I went an entirely different route, exported my data to a CSV, and used Python to get where I needed. Here's the code I used in case anybody needs it. What this returned for me was a list of DocIDs (unique identifiers that were in the CSV) wherever there was an email address in the To field that wasn't from one specific domain. I went into the original CSV and made sure all instances of this domain name were in lowercase, too.
import csv
import tkinter as tk
from tkinter import filedialog

root = tk.Tk()
root.withdraw()
file_path = filedialog.askopenfilename()

sub = "domainname"

def findMultipleTo(reader):
    # For every row, keep only the To addresses that are not from the given domain,
    # then append the DocID and those addresses to the output CSV.
    for row in reader:
        if row['To'].find(";") != -1:
            toArray = row['To'].split(';')
            newTo = [s for s in toArray if sub not in s]
            row['To'] = newTo
        else:
            row['To'] = 'empty'

        with open('location\\newCSV-BCCFieldSearch.csv', 'a') as f:
            if row['To'] != "empty" and row['To'] != []:
                print(row['DocID'], row['To'], file=f)
            else:
                pass

with open(file_path) as csvfile:
    reader = csv.DictReader(csvfile)
    findMultipleTo(reader)
How can I create a synonym in HyperFileSQL?
I have a table named USER and I cannot access it via ODBC. I can't rename it, so I want to create a synonym for it. How do I do this?
To create a synonym, use the function HAlias().
// Create an alias for the ORDERS file
// (Syntax available from version 19)
Orders2013 is Data Source <description=Orders>
IF HAlias(Orders, Orders2013) = True THEN
    // ORDERS2013 can now be used in the processes
    // It behaves the same way as
    // the ORDERS file described in the analysis.
    // Modify the directory
    HChangeDir(Orders2013, "D:\SalesMgt\Archive2013")
    // Modify the name
    HChangeName(Orders2013, "Orders")
    HOpen(Orders2013)
    ...
    // Processes on the Orders2013 file
    ...
END
// Cancel the alias
HCancelAlias(Orders)
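Applied to the USER file from the question, a sketch along the same lines might look like this. User2 and the new physical name APP_USER are made-up names, and whether this is enough to make the table reachable through your ODBC driver depends on your setup:
// Create an alias for the USER file
User2 is Data Source <description=USER>
IF HAlias(USER, User2) = True THEN
    // Give the alias a different physical name that is not a reserved word
    HChangeName(User2, "APP_USER")
    HOpen(User2)
    ...
    // Processes on the User2 file
    ...
END
// Cancel the alias
HCancelAlias(USER)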