Isolated Storage Exception in Silverlight

I'm using isolated storage in my Silverlight application to store some information for specific users.
On every login, I check the storage space by calling
IsolatedStorageFile.GetUserStoreForApplication()
After that, I store some information in local variables and then clear all storage and get it again with these lines:
IsolatedStorageFile.GetUserStoreForApplication().Remove();
IsolatedStorageFile.GetUserStoreForApplication();
Sometimes I get an error from IsolatedStorageFile.GetUserStoreForApplication(). The error detail is:
System.IO.IsolatedStorage.IsolatedStorageException was caught
Message=Initialization failed.
StackTrace:
    at System.IO.IsolatedStorage.IsolatedStorageFile.FetchOrCreateStore(String groupName, String storeName, IsolatedStorageFile isf)
    at System.IO.IsolatedStorage.IsolatedStorageFile.GetUserStore(String group, String id)
    at System.IO.IsolatedStorage.IsolatedStorageFile.GetUserStoreForApplication()
The error occurs randomly, but when it happens, I lose all my data in storage. I don't know the reason for this error and haven't found any helpful article. I also found many related questions, but my problem is still there.
Edit: I just found out the reason for this behavior. According to this:
If any of the directories or files in the store are in use, the removal attempt for the store fails. Any subsequent attempts to modify the store throw an IsolatedStorageException exception. In this case, you must ensure that the files or directories are explicitly deleted.
But I didn't find any method to explicitly delete the whole store. Can anyone suggest a solution?
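For illustration, a minimal, untested C# sketch of that documented guidance: walk the store and explicitly delete every file and directory before calling Remove(). The DeleteRecursive helper is my own name, not a framework method, and any open streams on the store would still need to be disposed first.

    using System.IO.IsolatedStorage;

    public static void ClearStore()
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            // Explicitly delete all contents first, per the documentation,
            // so that Remove() does not fail on in-use files.
            DeleteRecursive(store, "");
            store.Remove();
        }
    }

    private static void DeleteRecursive(IsolatedStorageFile store, string dir)
    {
        string pattern = dir.Length == 0 ? "*" : dir + "/*";
        foreach (string file in store.GetFileNames(pattern))
            store.DeleteFile(dir.Length == 0 ? file : dir + "/" + file);
        foreach (string sub in store.GetDirectoryNames(pattern))
        {
            string path = dir.Length == 0 ? sub : dir + "/" + sub;
            DeleteRecursive(store, path);
            store.DeleteDirectory(path);
        }
    }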

Related

Flink reference data advice/best practice

Looking for some advice on where to store/access Flink reference data. The use case here is really simple: I have a single-column text file with a list of countries. I am streaming Twitter data and then matching the countries from the text file against the (parsed) Location field of each tweet.

In the IDE (Eclipse) it's all good, as I have a static ArrayList populated when the routine fires up, via a static Build method in my Flink mapper (i.e. it implements Flink's MapFunction). This class is now inner static, as it gets shirty about serialization otherwise. The point is that when the overridden map function is invoked at runtime from within the stream, the static array of country data is there waiting, fully populated and ready to be matched against. Works a charm. BUT, when deployed into a Flink cluster (and it took me to hell and back last week to actually get the code to FIND the text file), the array is only populated as part of the Build method. By the time it comes to be used, the data has mysteriously disappeared and I am left with an array of size 0, so not a lot of matches get found.

Thus, two questions: why does it work in Eclipse and not on deploy (which renders a lot of Eclipse unit tests pointless as well)? And, perhaps more generally, what is the right way to cross-reference this kind of static, fixed reference data within Flink, in a way that works both in Eclipse and in the cluster?
The standard way to handle static reference data is to load the data in the open method of a RichMapFunction or RichFlatMapFunction. Rich functions have open and close methods that are useful for creating and finalizing local state, and can access the runtime context.
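A minimal Java sketch of that pattern (the file path and the matching logic are assumptions; the file must be readable from every task manager):

    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.HashSet;
    import java.util.Set;

    public class CountryMatcher extends RichMapFunction<String, Boolean> {

        // transient: rebuilt in open() on each task, never shipped with the job graph
        private transient Set<String> countries;

        @Override
        public void open(Configuration parameters) throws Exception {
            // Runs once per parallel task instance, on the cluster node itself,
            // so the data is populated where map() will actually execute.
            countries = new HashSet<>(Files.readAllLines(Paths.get("/data/countries.txt")));
        }

        @Override
        public Boolean map(String location) {
            return countries.contains(location);
        }
    }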

Could "Could not find relationship target File[]" be triggered by a cron resource?

Is it possible that the cron resource type uses the file resource type underneath? I'm asking because I'm trying to debug this error:
could not find relationship target File[]
and the manifests that apply to this particular node contain no File resource that I haven't already verified works on other nodes. However, they do contain a questionable cron resource which, when commented out, makes the error go away.
Similar to what was specified by ncracker, it would seem that there is a file resource being referenced somewhere. If it isn't directly part of the cron that you commented out, you could look for a Cron { require => File[$somevar] } call elsewhere in your code.
That is a pattern sometimes used to tie together default resource behavior.
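For illustration, a resource default along these lines (the variable name is hypothetical) reproduces the error whenever $somevar is undefined, because the relationship target collapses to the empty File[]:

    # Resource default: every cron resource on this node now requires
    # File[$somevar]. If $somevar is undefined, Puppet looks for the
    # relationship target File[] and fails with the error above.
    Cron {
      require => File[$somevar],
    }

    cron { 'nightly_cleanup':
      command => '/usr/local/bin/cleanup.sh',
      hour    => 0,
      minute  => 0,
    }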

Can I save changes to objects to another transport request besides the one they are locked in?

When I try to switch to edit mode for a Report source, a popup comes up telling me
"A new task will be created for the following request of user XXX".
A transport request is also being suggested.
However, I don't want to save my changes in this request, but in another existing one. I am not aware of any versioning system being implemented in my system, and I don't know how to check that.
Is what I'm trying to achieve possible? And if so, how?
No, this is not possible. There are very good reasons for this being an exclusive lock -- reasons that you should know about before you attempt to change anything. Briefly speaking:
The CTS only notes that an object was touched, not what change was made.
When the transport is released, the entire object in its current state is exported - there is no delta/diff logic involved.
Therefore, you can't transport changes to the same development object separately. Furthermore, even if you serialize this manually, the second transport will always include the changes of the first one.
Things get slightly more complicated with partial objects - you can have LIMU METH objects (methods of a class) in different transports, but as soon as you try to lock the R3TR CLAS main class, you'll have to resolve that.

SSRS Code Shared Variables and Simultaneous Report Execution

We have some SSRS reports that are failing when two of them are executed very close together.
I've found out that if two instances of an SSRS report run at the same time, any Code variables declared at the class level (not inside a function) can collide. I suspect this may be the cause of our report failures and I'm working up a potential fix.
The reason we're using the Code portion of SSRS at all is for things like custom group and page header calculation. The code is called from expressions in TextBoxes and returns what the current label should be. The code needs to maintain state to remember what the last header value was, in order to return it when unknown, or to store the new header value for reuse.
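For example, a header TextBox calls into the Code module with an expression along these lines (the function name is hypothetical):

    =Code.CurrentGroupLabel(Fields!GroupName.Value)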
Note: here are my resources for the variable collision problem:
The MSDN SSRS Forum:
Because this uses static variables, if two people run the report at the exact same
moment, there's a slim chance one will smash the other's variable state (In SQL 2000,
this could occasionally happen due to two users paginating through the same report at
the same time, not just due to exactly simultaneous executions). If you need to be 100%
certain to avoid this, you can make each of the shared variables a hash table based on
user ID (Globals!UserID).
Embedded Code in Reporting Services:
... if multiple users are executing the report with this code at the same time, both
reports will be changing the same Count field (that is why it is a shared field). You
don’t want to debug these sorts of interactions – stick to shared functions using only
local variables (variables passed ByVal or declared in the function body).
I guess the idea is that on the report generation server, the report is loaded and the Code module is a static class. If a second client asks for the same report quickly enough, it connects to the same instance of that static class. (You're welcome to correct my description if I'm getting this wrong.)
So, I was proceeding with the idea of using a hash table to keep things isolated. I was planning on the hash key being an internal report parameter called InstanceID with default =Guid.NewGuid().ToString().
Part way through my research into this, though, I found that it is even more complicated because Hashtables aren't thread-safe, according to Maintaining State in Reporting Services.
That writer has code similar to what I was developing, only the whole thread-safe thing is completely outside my experience. It's going to take me hours to research all this and put together sensible code that I can be confident of and that performs well.
So before I go too much farther, I'm wondering if anyone else has already been down this path and could give me some advice. Here's the code I have so far:
Private Shared Data As New System.Collections.Hashtable()

Public Shared Function Initialize() As String
    If Not Data.ContainsKey(Parameters!InstanceID.Value) Then
        Data.Add(Parameters!InstanceID.Value, New System.Collections.Hashtable())
    End If
    LetValue("SomethingCount", 0)
    Return ""
End Function

Private Shared Function GetValue(ByVal Name As String) As Object
    Return Data.Item(Parameters!InstanceID.Value).Item(Name)
End Function

Private Shared Sub LetValue(ByVal Name As String, ByVal Value As Object)
    Dim V As System.Collections.Hashtable = Data.Item(Parameters!InstanceID.Value)
    If Not V.ContainsKey(Name) Then
        V.Add(Name, Value)
    Else
        V.Item(Name) = Value
    End If
End Sub

Public Shared Function SomethingCount() As Long
    SomethingCount = GetValue("SomethingCount") + 1
    LetValue("SomethingCount", SomethingCount)
End Function
My biggest concern here is thread safety. I might be able to figure out the rest of the questions below, but I am not experienced with this and I know it is an area where it is EASY to go wrong. The link above uses Dim _sht As System.Collections.Hashtable = System.Collections.Hashtable.Synchronized(_hashtable). Is that best? What about a Mutex? A Semaphore? I have no experience with these.
I think the namespace System.Collections for Hashtable is correct, but I'm having trouble adding System.Collections as a reference in my report to try to cure my current error of "Could not load file or assembly 'System.Collections'". When I browse to add the reference, it's not an available component to select.
I just confirmed that I can call code from a parameter's default value expression, so I'll put my Initialize code there. I also just found out about the OnInit procedure, but this has its own gotchas to research and work around: the Parameters collection may not be referenced from the OnInit method during parameter initialization.
I'm unsure about declaring the Data variable with New; perhaps it should only be instantiated in the initializer if not already done (but I worry about race conditions because of the delay between the check that it's empty and its instantiation).
I also have a question about the Shared keyword. Is it necessary in all cases? I get errors if I leave it off function declarations, but it appears to work when I leave it off the variable declaration. Testing multiple simultaneous report executions is difficult... Could someone explain what Shared means specifically in the context of SSRS Code?
Is there a better way to initialize variables? Should I provide a second parameter to the GetValue function which is the default value to use if it finds that the variable doesn't exist in the hashtable yet?
Is it better to have nested Hashtables as I chose in my implementation, or to concatenate my InstanceID with the variable name to have a flat hashtable?
I'd really appreciate guidance, ideas and/or critiques on any aspect of what I've presented here.
Thank you!
Erik
Your code looks fine. For thread safety, only the root (shared) hashtable Data needs to be synchronized. If you want to avoid using your InstanceID, you could use Globals.ExecutionTime and User.UserID concatenated.
Basically, I think you just want to change the initialization to something like this:
Private Shared Data As System.Collections.Hashtable

' Inside Initialize(), create the synchronized wrapper on first use
If Data Is Nothing Then
    Data = System.Collections.Hashtable.Synchronized(New System.Collections.Hashtable())
End If
The contained hashtables should only be used by one thread at a time anyway, but if in doubt, you could synchronize them too.
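If you want belt and braces, here is a minimal sketch of making the check-then-add atomic with SyncLock (the names and the InstanceID argument are assumptions; passing the value in as an argument also avoids referencing the Parameters collection from inside the Code module):

    Private Shared ReadOnly LockObject As New Object()
    Private Shared Data As System.Collections.Hashtable

    Public Shared Function Initialize(ByVal InstanceID As String) As String
        ' SyncLock serializes the check-and-create, so two simultaneous
        ' executions cannot race between ContainsKey and Add.
        SyncLock LockObject
            If Data Is Nothing Then
                Data = New System.Collections.Hashtable()
            End If
            If Not Data.ContainsKey(InstanceID) Then
                Data.Add(InstanceID, New System.Collections.Hashtable())
            End If
        End SyncLock
        Return ""
    End Function

A TextBox or parameter expression would then call =Code.Initialize(Parameters!InstanceID.Value).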

What LDAP query returns the user objects now removed from Active Directory?

Is there an LDAP query that will return or list user objects that have been removed from Active Directory? Or must you track all the user objects currently in Active Directory and maintain a "last seen" stamp in order to tell when a user object has been removed?
I really don't believe that this information is obtainable. In the next version of the OS (Windows Server 2008 R2), Microsoft is introducing the ability to do this by implementing recycle-bin-like functionality for Active Directory objects. Please see these write-ups for more info:
http://blogs.technet.com/niraj_kumar/archive/2009/02/03/new-feature-active-directory-recycle-bin-in-windows-2008-r2.aspx
http://technet.microsoft.com/en-us/library/dd392261.aspx
But you may see that this is in reference to the deletion of the object itself, and that it doesn't provide any information about when a property of the object changes. You can look at the last-modified property, but even then you have no way of knowing which property changed (more than likely it will be the last logon), so again you're left with no help. If you're trying to track an issue that you can recreate, I recommend creating some sort of script/code which records the properties of a specific user at a given interval, and then just keep running it as you move from one step to the next in the recreation of the problem.
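As an illustration of that suggestion, a minimal C# sketch that snapshots one user's properties so successive runs can be diffed (the LDAP path and account name are assumptions):

    using System;
    using System.DirectoryServices;

    class UserSnapshot
    {
        static void Main()
        {
            using (var root = new DirectoryEntry("LDAP://DC=example,DC=com"))
            using (var searcher = new DirectorySearcher(root))
            {
                searcher.Filter = "(&(objectClass=user)(sAMAccountName=jdoe))";
                SearchResult result = searcher.FindOne();
                if (result == null)
                {
                    // The user object is gone (or outside this search base).
                    Console.WriteLine("User not found.");
                    return;
                }
                Console.WriteLine("Snapshot at {0:u}", DateTime.UtcNow);
                foreach (string name in result.Properties.PropertyNames)
                    foreach (object value in result.Properties[name])
                        Console.WriteLine("{0} = {1}", name, value);
            }
        }
    }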
