I'm having a problem that is really driving me nuts. The setup here is a little complicated, so please bear with me:
I have a DLL, written in C#.NET, that contains a class, let's call it Manager, whose job is to search a directory, find other DLLs, and search those for classes that implement a certain interface, let's call it ICalc. It uses Assembly.LoadFrom to do this. It maintains a list of all the classes it finds that implement ICalc (in an ArrayList of Type objects) and returns information about those classes (name and version) to the client program. Manager also has a method, let's call it GetCalc, that will instantiate and return an object of a requested type that implements ICalc. So far so good, and this seems to work okay.
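To make the setup concrete, this is roughly what Manager does. The names and details are simplified (the real code differs), but the pattern is the same:

    using System;
    using System.Collections;
    using System.IO;
    using System.Reflection;

    public class Manager
    {
        // Types found so far that implement ICalc (ICalc itself lives in a shared DLL).
        private readonly ArrayList calcTypes = new ArrayList();

        public void ScanDirectory(string path)
        {
            foreach (string file in Directory.GetFiles(path, "*.dll"))
            {
                Assembly asm = Assembly.LoadFrom(file);
                foreach (Type t in asm.GetTypes())
                {
                    if (typeof(ICalc).IsAssignableFrom(t) && !t.IsAbstract)
                        calcTypes.Add(t);
                }
            }
        }

        // Instantiate and return one of the discovered ICalc implementations by name.
        public ICalc GetCalc(string typeName)
        {
            foreach (Type t in calcTypes)
            {
                if (t.FullName == typeName)
                    return (ICalc)Activator.CreateInstance(t);
            }
            return null;
        }
    }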
Now I have another DLL that contains a class, let's call it MyCalc, that implements ICalc. When MyCalc is instantiated it must instantiate a few other objects related to its function. Among these is a class that contains some settings, let's call it MySettings. These settings are serialized to disk and deserialized again by MyCalc using a static method in MySettings, something like:
public static MySettings LoadSettings(string filename)
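In full, the loader looks roughly like this (simplified; BinaryFormatter is shown purely for illustration, the exact formatter type isn't the point):

    using System;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    [Serializable]
    public class MySettings
    {
        // ... settings fields ...

        public static MySettings LoadSettings(string filename)
        {
            using (FileStream stream = File.OpenRead(filename))
            {
                BinaryFormatter formatter = new BinaryFormatter();
                return (MySettings)formatter.Deserialize(stream);
            }
        }
    }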
Hopefully you are still following this. I created a simple C# console application to test this out and everything works fine. But here's the problem: when I try to use LabView as the client for the Manager DLL, I run into trouble. For some reason the deserialization no longer works and chokes with an "Invalid Cast" exception. Basically, the line:
MySettings setting = (MySettings)MyFormatter.Deserialize(MyStream);
Doesn't work.
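If it helps with diagnosis, here's the kind of check I can wrap around the failing line to see which loaded assembly each side of the cast comes from (raw is just a local name for illustration; these diagnostic lines aren't in my real code):

    object raw = MyFormatter.Deserialize(MyStream);
    // Hypothetical diagnostic output, not in the actual code:
    Console.WriteLine(raw.GetType().AssemblyQualifiedName);
    Console.WriteLine(typeof(MySettings).AssemblyQualifiedName);
    MySettings setting = (MySettings)raw;   // throws the "Invalid Cast" exception when hosted in LabView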
Does anybody have any clue why this might be? I'm really confused by this. What is LabView doing differently from a simple console application that is causing this to fail?
Thanks