I'm working in SPSS with this Kaplan-Meier command:
KM data BY sample
/STATUS=status(0)
/PRINT TABLE MEAN
/PLOT SURVIVAL HAZARD
/TEST LOGRANK BRESLOW TARONE
/COMPARE OVERALL POOLED.
This is no problem, but there's a lot of data I have to process, and I'm trying to put this together in a syntax file. Can I loop over several Kaplan-Meier commands, with data running through a set of variables such as {time0 time1 time2} and sample running through a set such as {sample0 sample1 sample2}?
I tried with DO REPEAT - END REPEAT, but I couldn't get it to work.
DO REPEAT applies to transformation commands. Procedures cannot be placed inside loops. However, if you install the Python Essentials from the SPSS Community site (www.ibm.com/developerworks/spssdevcentral), this is easy to do. If you can provide more details on what you want to loop over, we can explain how to do this.
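For example, with the variable sets you mention, a rough sketch using Python programmability might look like this (the pairing of time0 with sample0 and so on is my assumption from your question; the KM subcommands are copied from your syntax):

BEGIN PROGRAM.
import spss
# Hypothetical pairing of time and sample variables.
pairs = [("time0", "sample0"), ("time1", "sample1"), ("time2", "sample2")]
for timevar, samplevar in pairs:
    # Submit one KM command per pair of variables.
    spss.Submit("""
KM %s BY %s
  /STATUS=status(0)
  /PRINT TABLE MEAN
  /PLOT SURVIVAL HAZARD
  /TEST LOGRANK BRESLOW TARONE
  /COMPARE OVERALL POOLED.
""" % (timevar, samplevar))
END PROGRAM.

Each pass submits an ordinary KM command, so the output is the same as if you had run each one by hand.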
I am trying to loop through term years. The start_term is a static term, 2010. Basically I just want to know how to translate the Oracle way of doing a FOR loop into Pentaho.
Below is what it originally looks like, but I'm not sure how to create it in Pentaho.
FOR I IN start_term..End_term
LOOP
    term_parm := to_char(I);
    -- ... rest of the loop body ...
END LOOP;
Thank you
1) What settings in the ColdFusion Administrator should be turned off/on?
2) What ColdFusion code should you use to properly benchmark execution time (e.g. getTickCount())?
3) What other system information should you provide, such as CF engine, version, Standard/Enterprise, DB, etc.?
What we do is:
In Application.cfc's onRequestStart() -> set tick count value, add to REQUEST scope.
In Application.cfc's onRequestEnd() -> set tick count value, subtract first value from it to get total processing time in ms
We then have a set threshold (say 200ms), and if that threshold is reached we'll log a record in a database table.
Typically we'll log the URL query string, the script name, the server name, etc.
This can give very useful information over time on how particular pages are performing. This can also be easily graphed so you can see if a page suddenly started taking 5000ms where before it was taking 300ms, and then you can check SVN to see what change did it :)
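A minimal sketch of that pattern, assuming a script-style Application.cfc (the 200ms threshold and the request.startTicks name are just placeholders, and the actual logging query is left out):

// Application.cfc - hypothetical sketch of the request-timing approach described above
component {
    public boolean function onRequestStart(required string targetPage) {
        request.startTicks = getTickCount();   // record the start time in ms
        return true;
    }
    public void function onRequestEnd(required string targetPage) {
        var elapsed = getTickCount() - request.startTicks;   // total processing time in ms
        if (elapsed GT 200) {
            // threshold reached: log the URL query string, script name, server name and elapsed ms
            // e.g. insert a row into your slow-request table here
        }
    }
}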
Hope that helps!
1) In CF Administrator, under the debugging settings, you can turn on Enable Request Debugging Output, which outputs runtime and other debugging information at the bottom of every page. This can be useful if you want to see queries as well. If you want to use timers, you must also select Timer Information in the debugging settings (got hung up on that for a hot minute).
2) You can use timers to get custom benchmarks of execution times. There are four types: inline, outline, comment, or debug, each corresponding to where the output will appear. With inline, it will draw a little box around your code (if it's a .cfm) and print the total runtime; see the snippet after this list. The others print in the bottom debugging output that you turned on in CF Administrator.
3) I don't really know what you should provide. Wish I could help more. In my opinion the more information the better, so that's what I would say :P
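For example, a hedged sketch of timing one block with cftimer:

<!--- type="inline" draws the elapsed time next to this block on the page --->
<cftimer label="Report section" type="inline">
    <!--- code to benchmark goes here --->
</cftimer>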
With respect to mbseid's answer, request debugging adds a significant amount of processing time to any request, especially if you use CFCs. I would recommend you turn request debugging off and use getTickCount() at the top and bottom of the page, then take the difference to get the time to render that page. This will give you a much closer reflection of how the code will perform in production.
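For instance, a minimal sketch of that on a single .cfm page (the variable names are placeholders):

<cfset tickBegin = getTickCount()>
<!--- ... page code ... --->
<cfset tickEnd = getTickCount()>
<cfoutput>Page rendered in #tickEnd - tickBegin# ms</cfoutput>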
My co-worker is being unsafe with his code and is allowing a user to upload an SQL file to be run on the server.
He strips out any keywords in the file, such as "EXEC", "DROP", "UPDATE", "INSERT", "TRUNC".
I want to show him the error of his ways by exploiting his EXEC (@sql).
My first attempt will be with 'EXEXECEC (N''SELECT ''You DRDROPOPped the ball Bob!'')'
But he might filter that all out in a loop.
Is there a way I can exploit my co-worker's code? Or is filtering out the key words enough?
Edit: I got him to check in his code. If the code contains a keyword he does not execute it. I'm still trying to figure out how to exploit this using the binary conversion.
Tell your co-worker he's a moron.
Do an obfuscated SQL query, something like:
select @sql = 0x44524f5020426f627350616e7473
This will need some tweaking depending on what the rest of the code looks like, but the idea is to encode your code in hex and execute it (or rather, let it be executed). There are other ways to obfuscate code to be injected.
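To see what that constant hides, here is a hedged illustration (assuming the vulnerable code ends in something like EXEC (@sql) on the uploaded text):

-- Hypothetical demonstration: the uploaded text never contains the word DROP,
-- but the hex constant decodes to it at run time.
DECLARE @sql varchar(200);
SELECT @sql = CONVERT(varchar(200), 0x44524f5020426f627350616e7473);
SELECT @sql;   -- prints: DROP BobsPants
-- Handing @sql to EXEC() runs it as dynamic SQL, long after the keyword filter has already passed it.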
You've got a huge security hole there. And the funny part is, this is not even something that needs to be reinvented. The proper way to stop such things from happening is to create and use an account with the correct permissions (e.g. one that can only perform SELECT queries on tables x, y, and z).
Have a look at ASCII-encoded/binary attacks ... that should convince your friend he is doomed ;)
And here is some help on how to encode the strings:
Converting a String to HEX in SQL
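In short, one quick way to get the hex form of a string in SQL Server (a sketch; the linked question covers more options):

-- The result is displayed as a 0x... literal you can paste into the payload.
SELECT CONVERT(varbinary(max), 'DROP BobsPants');   -- 0x44524F5020426F627350616E7473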
I'm trying to create an SSIS package to import some dataset files. However, given that I seem to be hitting a brick wall every time I achieve a small part of the task, I need to take a step back and perform a sanity check on what I'm trying to achieve. If you good people can advise whether SSIS is the way to go about this, I would appreciate it.
These are my questions from this morning :-
debugging SSIS packages - debug.writeline
Changing an SSIS dts variables
What I'm trying to do is have a For..Each container enumerate over the files in a share on the SQL Server. For each file it finds, a script task runs to check various attributes of the filename, such as looking for a three-letter code, a date in CCYYMM format, the name of the data contained therein, and optionally some comments. For example:
ABC_201007_SalesData_[optional comment goes here].csv
I'm looking to parse the name using a regular expression and put the values of 'ABC', '201007', and 'SalesData' in variables.
I then want to move the file to an error folder if it doesn't meet certain criteria :-
Three character code
Six character date
Dataset name (e.g. SalesData, in this example)
CSV extension
I then want to look up the character code, the date (or part thereof), and the dataset name against a lookup table to mark off a 'checklist' of received files from each client.
Then, based on the entry in the checklist, I want to kick off another SSIS package.
So, for example I may have a table called 'Checklist' with these columns :-
Client code   Dataset     SSIS_Package
ABC           SalesData   NorthSalesData.dtsx
DEF           SalesData   SouthSalesData.dtsx
If anyone has a better way of achieving this I am interested in hearing about it.
Thanks in advance
That's an interesting scenario, and should be relatively easy to handle.
First, your choice of the Foreach Loop is a good one. You'll be using the Foreach File Enumerator. You can restrict the files you iterate over to be just CSVs so that you don't have to "filter" for those later.
The Foreach File Enumerator puts the filename (full path or just file name) into a variable - let's call that "FileName". There are (at least) two ways you can parse that - expressions or a Script Task. It depends which one you're more comfortable with. Either way, you'll need to create three variables to hold the "parts" of the filename - I'll call them "FileCode", "FileDate", and "FileDataset".
To do this with expressions, you need to set the EvaluateAsExpression property on FileCode, FileDate, and FileDataset to true. Then in the expressions, you need to use FINDSTRING and SUBSTRING to carve up FileName as you see fit. Expressions don't have Regex capability.
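For instance, hypothetical expressions for the first two variables might look roughly like this (assuming FileName holds just the file name rather than the full path):

FileCode: SUBSTRING(@[User::FileName], 1, 3)
FileDate: SUBSTRING(@[User::FileName], FINDSTRING(@[User::FileName], "_", 1) + 1, 6)

FileDataset would need a similar nested FINDSTRING to locate the second and third underscores, which is where this approach starts to get ugly.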
To do this in a Script Task, pass the FileName variable in as a ReadOnly variable, and the other three as ReadWrite. You can use the Regex capabilities of .Net, or just manually use IndexOf and Substring to get what you need.
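A rough Script Task sketch (the regex and the exact variable names are assumptions based on the ABC_201007_SalesData_[optional comment].csv pattern above):

// Inside the Script Task's Main() method - hypothetical sketch
string fileName = System.IO.Path.GetFileName(Dts.Variables["FileName"].Value.ToString());

// three-letter code, six-digit CCYYMM date, dataset name, optional comment, .csv extension
System.Text.RegularExpressions.Match m = System.Text.RegularExpressions.Regex.Match(
    fileName, @"^([A-Za-z]{3})_(\d{6})_([A-Za-z]+)(_.*)?\.csv$");

if (m.Success)
{
    Dts.Variables["FileCode"].Value = m.Groups[1].Value;      // e.g. ABC
    Dts.Variables["FileDate"].Value = m.Groups[2].Value;      // e.g. 201007
    Dts.Variables["FileDataset"].Value = m.Groups[3].Value;   // e.g. SalesData
    Dts.TaskResult = (int)ScriptResults.Success;
}
else
{
    // Name doesn't match the convention - fail (or set a flag) so the package
    // can route the file to the error folder.
    Dts.TaskResult = (int)ScriptResults.Failure;
}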
Unfortunately, you have just missed the SQLLunch livemeeting on the ForEach loop: http://www.bidn.com/blogs/BradSchacht/ssis/812/sql-lunch-tomorrow
They are recording the session, however.
I have a select query producing a big output and I want to execute it in SQL Developer, and get all the results into a file.
SQL Developer does not allow a result bigger than 5,000 lines, and I have 100,000 lines to fetch...
I know I could use SQL*Plus, but let's assume I want to do this in SQL Developer.
Instead of using Run Script (F5), use Run Statement (Ctrl+Enter). Run Statement fetches 50 records at a time and displays them as you scroll through the results...but you can save the entire output to a file by right-clicking over the results and selecting Export Data -> csv/html/etc.
I'm a newbie SQL Developer user, so if there is a better way, please let me know.
This question is really old, but posting this so it might help someone with a similar issue.
You can store your query in a query.sql file and run it as a script. Here is a sample query.sql:
spool "C:\path\query_result.txt";
select * from my_table;
spool off;
In Oracle SQL Developer you can run this script as shown below, and you should get the result in your query_result.txt file.
@"C:\Path\to\script.sql"
Yes, you can increase the number of rows by changing the setting under Tools -> Preferences -> Database -> Worksheet -> "Max rows to print in a script" (set it as high as you need).
Mike G's answer will work if you only want the output of a single statement.
However, if you want the output of a whole SQL script with several statements, SQL*Plus reports, and some other output formats, you can use the spool command the same way it is used in SQL*Plus.