Spyder Kernel Dying - sql-server

Let me start by saying that this post is related to many on this site; I am posting it for posterity and to add evidence as to why this particular error occurs.
So I have a script that pulls data into a data frame from SQL Server, does some operations, and writes the output back to the server. I first began experiencing the random "Kernel Died" error when I enabled executemany() in my SQLAlchemy setup for writing to the server with pandas' to_sql command. I could not figure this out for the life of me.
Continuing down the road a bit, I began writing large amounts of data back to the server without executemany() and received an error that I had duplicate primary keys when writing to the server. I have known historically that there is a known issue when converting pandas frames to h2o frames: the h2o frame will sometimes create a duplicate record. When this converts back to pandas the duplicate remains, and when you write that back to the server it persists. One of the columns being written back to my server is a primary key, so it cannot have a duplicate entry, and this causes an error.
All of that being said: when I turn off executemany(), I get the error that there is a duplicate in the primary key and the operation stops. When I turn on executemany(), I get the error that the "Kernel has Died".
So with that evidence, I am proposing that the "Kernel has Died" error is actually an error in another package/entity that is not being propagated back to the kernel, and that this is what causes the kernel-died message.
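To illustrate the proposed mechanism, here is a minimal, hypothetical reproduction using Python's stdlib sqlite3 module (not the actual SQL Server/pyodbc stack, where details will differ): the duplicate-primary-key violation is raised inside cursor/connection executemany() itself, and if that exception is lost somewhere between the driver and Spyder, all you would see is a dead kernel.

```python
# Hypothetical sketch with sqlite3 (a stand-in for the SQL Server stack):
# inserting rows with a duplicate primary key via executemany() raises
# an IntegrityError inside the driver.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")

rows = [(1, "a"), (2, "b"), (1, "c")]  # id 1 appears twice, like the h2o duplicate
try:
    conn.executemany("INSERT INTO t VALUES (?, ?)", rows)
except sqlite3.IntegrityError as e:
    # With a working error path this surfaces to the user; if it is
    # swallowed somewhere, the session can simply appear to die.
    print("duplicate key error:", e)
```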
The question is: where are we on these two issues? The posts I have seen are ~11 months old.
Also if this post isn't allowed here where can I post information like this in order to help the developers understand the common problem?
EDIT: Follow on...
Just ran a test in Jupyter with the same script. I received the appropriate duplicate-primary-key error with executemany(). Just to be clear: in Spyder I would have received "Kernel Died".
However, I also received this traceback:
C:\Anaconda3\lib\site-packages\sqlalchemy\engine\base.py in _execute_context(self, dialect, constructor, statement, parameters, *args)
   1169                         parameters,
-> 1170                         context)
   1171             elif not parameters and context.no_parameters:

C:\Anaconda3\lib\site-packages\sqlalchemy\engine\default.py in do_executemany(self, cursor, statement, parameters, context)
    503     def do_executemany(self, cursor, statement, parameters, context=None):
--> 504         cursor.executemany(statement, parameters)
    505

I just found a novice solution: copy all your project files to a new directory, reset Spyder to its factory defaults, and create a new project in the new directory. The dead kernel revives!

Something similar happened to me when using TensorFlow. I fixed it by adding the following at the start of my script:
import os
os.environ['KMP_DUPLICATE_LIB_OK'] = 'True'
It may have something to do with the kernel crashing when it tries to load the same library more than once.

Related

Refresh sheet not generating error

I have inherited a spreadsheet with a macro which isn't working from someone who's left the company.
I didn't design it, but am trying to work out why it appears to be not working (in terms of not generating the correct outputs).
I noticed that there is a section which uses an OleDb connection to run a T-SQL query and update a particular sheet, beginning with the line:
With ActiveWorkbook.Connections("Daily_Production").OLEDBConnection
and ending with the line:
ActiveWorkbook.Connections("Daily_Production").Refresh
The thing is, there is no worksheet in the book (including in hidden sheets) called "Daily_Production". However, it does not appear to generate an error on the "refresh" line.
I'm surprised that this didn't generate an error. Surely if there is no sheet with that name, it must generate an error?
Or am I missing something? I don't have much experience with OleDb connections - is it possible that it fails to generate an error and simply doesn't bring anything through?
Option 1:
The name of the connection is "Daily_Production"; it is not a sheet name. Simply write "Daily_ProductionALEALEALE" in your code and see if there is an error. If there is one, then Option 1 is correct :)
Option 2:
You have On Error Resume Next written somewhere.

Obfuscation of code in commercial product

Yesterday, my manager asked me to find and remove all references to 'previous incarnation of company' that appear in the binaries we produce for a product that we're launching in a few weeks. This got me wondering why, in a compiled stand-alone binary, there's so much human-readable content, and whether there's a simple way to obfuscate it so that the program's internals aren't hanging out in the open, so to speak (at least to anyone who opens it with a text editor or greps the file contents). Here are some examples of what I mean:
"WGL_3DFX_multisample À # ð>Unknown OpenGL error
GL_INVALID_FRAMEBUFFER_OPERATION"
" Unable to close due to unfinalised statements not an error SQL logic error or missing database access permission denied callback requested query abort database is locked database table is locked out of memory attempt to write a readonly database interrupted disk I/O error database disk image is malformed database or disk is full unable to open database file table contains no data database schema has changed String or BLOB exceeded size limit constraint failed datatype mismatch"
"flowChartDelay flowChartDisplay flowChartDocument flowChartExtract flowChartInputOutput flowChartInternalStorage flowChartMagneticDisk"
The majority of the file is human incomprehensible stuff like this, which is more what I'd expect from a binary:
"âÀÿ? ‰•þÿÿÇ…”þÿÿ ë‹…”þÿÿƒÀ‰…”þÿÿ‹”þÿÿ;Mà}`‹U‹‚¨ ‹”þÿÿ¶ƒúuF‹E‹ˆ° ‹•”þÿÿ·Q¯…ŒþÿÿÁ艅Œþÿÿ‹M"
I figured out I could simply do a search-and-replace for 'string that we don't want' and replace it with random text of the same length, and the program would run fine. That is possibly easier than making 500 edits to our source to bring it up to date with the current status of the company as a legal entity (there are a tonne of functions called name_of_previous_company_foo()), and also easier than trying to integrate some exotic obfuscation utility into our complex and proprietary build system. But it's not an especially elegant solution, and I'd still like to know if there's a way to make our binaries into something more like a black box, where someone can't just open them with a text editor and see our function and class names.
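The length-preserving search-and-replace described above can be sketched in a few lines (a hedged example, not actual build tooling; the company string below is made up):

```python
# Overwrite every occurrence of a byte string in a binary file with
# random ASCII letters of the same length, in place. Because the
# replacement is exactly the same length, offsets and size fields
# elsewhere in the binary are unaffected.
import re
import secrets
import string

def scrub(path: str, needle: bytes) -> None:
    data = open(path, "rb").read()

    def filler(match) -> bytes:
        # Random letters, same length as the matched string.
        return "".join(
            secrets.choice(string.ascii_letters)
            for _ in range(len(match.group()))
        ).encode()

    open(path, "wb").write(re.sub(re.escape(needle), filler, data))

# Usage (hypothetical name): scrub("product.exe", b"OldCompany Ltd")
```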
People build source-code obfuscator tools for commercial software that can scramble strings and identifiers so they aren't easily read out of the binary, but are still usable as the strings they are intended to be.
Such obfuscators tend to be language specific, because they have to handle the fine detail of the language structure.
Google "Source Code Obfuscators" and you'll find many.

Cross Referencing Data Written to a Text File with an Existing Database in Delphi?

I'm trying to cross-reference data written to a text file with an existing database, i.e. check whether the data written to the text file already exists in the database.
I have already created the program that writes the user's login data (name and password) to a text file, and I have started to write an algorithm to read the data back, but I am a bit stuck. I have the name stored on the first line of the text file and the password (string values only) stored on the next line.
I have no idea how you would check whether this data already exists in the database. Would you need to extract the contents of the database first, or could you cross-reference it directly with the database? I have already created the database (UserData.accdb) but I have not yet linked it up to the form. This is what I have so far:
procedure TForm1.btnclickClick(Sender: TObject);
var
  tRegister: TextFile;
  SName, SPword: String;
begin
  AssignFile(tRegister, 'register.txt');
  try
    Reset(tRegister);
  except
    ShowMessage('File register.txt does not exist');
    Exit;
  end;
  while not EOF(tRegister) do
  begin
    ReadLn(tRegister, SName);   // name on the first line
    ReadLn(tRegister, SPword);  // password on the next line
    // This is where I want to add code
  end;
  CloseFile(tRegister);
end;
Please don't be too harsh; I am still new to Delphi :)
I understand from your question that you're currently stuck trying to check if a particular record exists in your database. I'll answer that very briefly because there are plenty similar questions on this site that should help you flesh out the detail.
However the title of your question asks about "Cross Referencing Data Written to a text file with a existing Database". From the description it sounds as if you're trying to reconcile data from two sources and figure what matches and what doesn't. I'll spend a bit more time answering this because I think there'll be more valuable information.
To check for data in a database, you need:
A connection component which you configure to point to your database.
A query component linked to the connection component.
The query text will use a SQL statement to select rows from a particular table in your database.
I suggest your query be parametrised to select specifically the row you're looking for (I'll explain why later.)
NOTE: You could use a table component instead of a query component and this will change how you check for existing rows. It has the advantage you won't need to write SQL code. But well written SQL will be more scalable.
The options above vary by what database and what components you're using. But as I said, there are many similar questions already. With a bit of research you should be able to figure it out.
If you get stuck, you can ask a more specific question with details about what you've tried and what's not working. (Remember this is not a free "do your work for you service", and you will get backlash if it looks like that's what you're expecting.)
Reconciling data between text file and database:
There are a few different approaches. The one you have chosen is quite acceptable. Basically it boils down to:
for each Entry in TheFile
.. if the Entry exists in TheDatabase
.. .. do something with Entry
.. .. otherwise do something else with Entry
The above steps are easy to understand, so it's easy to be confident the algorithm is correct. It doesn't matter if there aren't one-liners in Delphi to implement those steps. As a programmer, you have the power to create any additional functions/procedures you need.
It is just important that the structure of the routine be kept simple.
Any of the above steps that cannot be very trivially implemented, you then want to break down into smaller steps: 2.a. 2.b. ; 3.a. 3.b. 3.c. ; etc. (This is what is meant by top-down design.)
TIP: You want to convert all the different breakdowns into their own functions and procedures. This will make maintaining your program and reusing routines you've already written much easier.
I'm going to focus on breaking down step 2. How you do this can be quite important if your database and text files grow quite large. For example you could implement so that: every time you call the function to check "if Entry exists", it looks at every single record in your database. This would be very bad because if you have m entries in your file and n entries in your database you would be doing m x n checks.
Remember I said I'd explain why I suggest a parametrised query?
Databases are designed and written to manage data. Storing and retrieving data is their primary function, so let the database do the work of finding out whether the entry you're looking for exists. If, for example, you wrote your query to fetch all entries into your Delphi app and searched there, you would:
Increase the memory requirements of your application.
But more importantly, without extra work, expose yourself to the m x n problem mentioned above.
With a parametrised query, each time EntryExists(...) is called you can change the parameter values and effectively ask the database to look for the record. The database does the work and gives you an answer. So you might, for example, write your function as follows:
function TForm1.EntryExists(const AName: string): Boolean;
begin
  qryFindEntry.Close;
  qryFindEntry.Parameters.ParamByName('EntryName').Value := AName;
  qryFindEntry.Open;
  Result := qryFindEntry.RecordCount > 0;
end;
TIP: It will be very important that you define an index on the appropriate columns in your database, otherwise every time you open the query, it will also search every record.
NOTE: Another option that is very similar would be to write a stored procedure on your database, and use a stored procedure component to call the database.
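For illustration only, here is the same parametrised "does this entry exist?" idea sketched with Python's stdlib sqlite3 module rather than Delphi components (the table, column, and index names are made up, not the asker's UserData.accdb schema):

```python
# Parametrised existence check: the database does the searching and
# just answers yes or no, avoiding the m x n problem described above.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Users (Name TEXT, Password TEXT)")
# Index the lookup column so the check doesn't scan every record.
conn.execute("CREATE INDEX idx_users_name ON Users (Name)")
conn.execute("INSERT INTO Users VALUES (?, ?)", ("alice", "secret"))

def entry_exists(conn, name):
    cur = conn.execute("SELECT 1 FROM Users WHERE Name = ? LIMIT 1", (name,))
    return cur.fetchone() is not None

print(entry_exists(conn, "alice"))  # True
print(entry_exists(conn, "bob"))    # False
```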
Additional comments:
Your routine to process the file is hard-coded to use register.txt.
This makes it not reusable in its current form. Rather, move the code into a separate method: procedure ProcessFile(AFileName: string);. Then, in your button click event handler, call ProcessFile('register.txt');.
TIP: In fact it is usually a good idea to move the bulk your code out of event handlers into methods with appropriate parameters. Change your event handler to call these methods. Doing this will make your code easier to maintain, test and reuse.
Your exception handling is wrong
This is an extremely bad way to do exception handling.
First, you don't want to ever write unnecessary exception handling. It just bloats your code making it more difficult to read and maintain. When an exception is raised:
The program starts unwinding to the innermost finally/except block. (So an exception would already exit your routine, as your added code does.)
By default, an unhandled exception (meaning one you haven't swallowed somewhere) will be handled by the application exception handler. By default this will simply show an error dialog. (As you have added code to do.)
The only change your code makes is to show a different message to the one actually raised. The problem is that you've made an incorrect assumption: a missing file is not the only possible reason Reset(tRegister); might raise an exception:
The file may exist, but be exclusively locked.
The file may exist, but you don't have permission to access it.
There may be a resource error meaning the file is there but can't be opened.
So the only thing all your exception handling code has done is introduce a bug because it now has the ability to hide the real reason for the exception. Which can make troubleshooting much more difficult.
If you want to provide more information about the exception, the following is a better approach:
try
  Reset(tRegister);
except
  on E: Exception do
  begin
    // Note that the message doesn't make any assumptions about the cause of the error.
    E.Message := 'Unable to open file "' + AFileName + '": ' + E.Message;
    // Re-raise the same exception, but with extra, potentially useful information.
    raise;
  end;
end;
The second problem is that even though you told the user about the error, you've hidden this fact from the rest of the program. Let's suppose you've found more uses for your ProcessFile method. You now have a routine that:
Receives files via email messages.
Calls ProcessFile.
Then deletes the file and the email message.
If an exception is raised in ProcessFile and you swallow (handle) it, then the above routine would delete a file that was not processed. This would obviously be bad. If you hadn't swallowed the exception, the above routine would skip the delete step because the program is looking for the next finally/except block. At least this way you still have record of the file for troubleshooting and reprocessing once the problem is resolved.
The third problem is that your exception handler is making the assumption your routine will always have a user to interact with. This limits reusability because if you now call ProcessFile in a server-side application, a dialog will pop up with no one to close it.
Leaving unresolved exceptions to be handled by the application exception handler means that you only need to change the default application exception handler in the server application, and all exceptions can be logged to file - without popping up a dialog.

ReleaseHandleFailed with SQLite + Entity Framework

I'm receiving an error message in VS2010 after I execute the following code to get values from a SQLite database via an automatically generated ADO.Net Entity Data Model.
using (Data.DbEntities ent = new Data.DbEntities())
{
    var r = from tt in ent.Template_DB select tt;
    r.First(); // Required to cause error
}
The SQLite database table being accessed is called 'Template' (which was renamed to Template_DB for the model) with a few columns holding strings, longs and bits. All queries I've tried return exactly what's expected.
The message I receive is:
ReleaseHandleFailed was detected
A SafeHandle or CriticalHandle of type
'Microsoft.Win32.SafeHandles.SafeCapiHashHandle' failed to properly
release the handle with value 0x0D0DDCF0. This usually indicates that
the handle was released incorrectly via another means (such as
extracting the handle using DangerousGetHandle and closing it directly
or building another SafeHandle around it.)
This message comes up perhaps 60% of the time, up to 8 seconds after the code has completed. As far as I'm aware, the database is not encrypted and has no password. Until recently, I've been using similar MS-SQL databases with Entity Framework models and never seen an error like this.
Help!
EDIT:
I downloaded/installed "sqlite-netFx40-setup-bundle-x86-2010-1.0.81.0.exe" to install SQLite, from here. This included the System.Data.SQLite 1.0.81.0 (3.7.12.1) package (not 3.7.13 as stated in the comment below)

Process dimension fails with message "A FileStore error from WriteFile occurred"?

I am trying to process a dimension using SQL Server 2005 Analysis Services. This has worked in the past without problems but recently fails.
The dimension is hierarchical using 4 columns from a single table (the entire cube uses a single table).
The error message received (regardless if I process the entire cube or the dimension, whether I "Process full" or not) is this:
File system error: A FileStore error from WriteFile occurred. Physical file:
\\?\L:\Microsoft SQL Server\MSSQL.3\OLAP\Data\MSMDCacheRowset_xxx.tmp.
Logical file: . .
My guess is that this is related to the amount of growing data (currently 15 million rows in the specific table).
It has worked before (no changes have been made)
The processing reads 11 million rows before displaying the error
Physical memory on the server runs out at the time the error is displayed
Googling the error message results in a few hits indicating column size as a problem.
Could anyone point me in the right direction? I guess that one way out could be to try using smaller columns (varchar(x) instead of varchar(y)) but it feels like going around the problem instead of solving the issue.
Best regards
Erik Larsson
Check the dimension property "ProcessingGroup" for all the relevant dimensions. If it's set to 'ByTable', try setting it to 'ByAttribute'.
The reason this can cause processing issues with large dimensions (# of members, # of attributes, etc.) is because when using the ByTable setting, it will try to put the entire dimension into memory.
Another reason for this error can be a limitation on the size of the .asstore file. If the file is around 4 GB in size and the dimension is regularly processed with ProcessUpdate, you have to process it with ProcessFull to cure the issue.
I found this solution here
Doing a full reprocessing of the dimension throwing the error worked for me.