UNICODE Compilation

I hope I'm posting this in the right place; I didn't see any boards with IMAQ listed.

Anyway, my hardware is the IMAQ 1409 card, and I'm using Visual C++ with several of the IMAQ/Vision API calls:

imgInterfaceOpen
imgSessionOpen
imgShowError

I am not trying to create a UNICODE compilation of my application, and I am getting these errors:

error C2664: 'imgInterfaceOpen' : cannot convert parameter 1 from 'unsigned short *' to 'char *'
        Types pointed to are unrelated; conversion requires reinterpret_cast, C-style cast or function-style cast

etc. for each call.
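To illustrate, here is roughly what the compiler is seeing (the imgInterfaceOpen prototype below is paraphrased from the error text, not copied from the NI header, and INTERFACE_ID is a placeholder typedef):

    #include <tchar.h>

    typedef unsigned long INTERFACE_ID;                    // placeholder, not the NI typedef
    int imgInterfaceOpen(char* name, INTERFACE_ID* ifid);  // narrow-string (char*) API

    void repro()
    {
        INTERFACE_ID ifid;
        TCHAR name[32] = _T("img0");    // TCHAR is wchar_t in a UNICODE build
                                        // (older Visual C++ spells wchar_t as
                                        // unsigned short, hence the error text)
        imgInterfaceOpen(name, &ifid);  // C2664 here: wchar_t* does not convert to char*
    }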

What is the solution? I don't see any reference to UNICODE API calls in the documentation.

Thanks
Paul Hemmer
Message 1 of 7
Paul,
 
    It may be easier to understand what's going on if you send me a few lines of your code that include the functions giving the errors.  I'm not sure what UNICODE is referring to in your sentence, but based on the error text it looks like a type mismatch.  Is the first parameter you pass to imgInterfaceOpen a char *?  If I am misunderstanding some part of your post, let me know.  Thank you!
 
-Allison
Message 2 of 7
Thank you for taking the time to respond, but if you aren't familiar with what UNICODE refers to, it's not likely you can help. The code was functioning just fine when compiling the project as _MBCS (multi-byte character set), but I am not trying to make a UNICODE build, at which point the existing and working calls to the imaq API fail.

UNICODE is kinda like ASCII - a standard for character encoding.
Message 3 of 7
Paul,
 
    I apologize -- it appears that I misstated my question.  I was not asking what UNICODE is, I am very familiar with it.  I was more asking what the relevance of UNICODE is to your question.  You have stated several times that you are not using UNICODE, and yet you mention it as if it were causing the issue -- but I don't see it mentioned in the error text.  The error text is referring to an incorrect type in a parameter, which is unrelated to UNICODE as far as I know.  Could you please explain further what exact calls you are making?  Thank you.
 
-Allison S.
Applications Engineering
Message 4 of 7
Hi Allison - forgive the typo in my last post; I meant to say "I am NOW using Unicode..."

In fact, when I changed the compilation parameter from _MBCS to _UNICODE in my project settings, I had several hundred errors in a similar vein throughout my code, all regarding character conversions and the Unicode-compatible macros and redefinitions needed. I resolved them all using the _T() macro and the appropriate variants of the string functions I was using. The only remaining errors are the imaq API calls, as in my first post. Those errors only occur when compiling a UNICODE build; the same calls have compiled and worked perfectly as-is for a few years now under _MBCS. From what I gather, most third-party APIs that support UNICODE have separate interfaces implemented for UNICODE, but I don't see any such thing in the imaq api. Which is what brings me here 🙂
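The changes were along these lines (a sketch, not actual project code, with "img0" as an example name):

    #include <tchar.h>

    // Before: char buf[32]; strcpy(buf, "img0");
    // After: generic-text types and _T(), so the same line builds as
    // either _MBCS (char) or _UNICODE (wchar_t)
    TCHAR buf[32];
    _tcscpy_s(buf, 32, _T("img0"));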

I am doing this:

dwError = imgInterfaceOpen(CConfigManager::Instance()->GetInterfaceName(), &m_InterfaceID);

GetInterfaceName is defined like this:

    TCHAR m_lpszInterfaceName[32];   // TCHAR: char under _MBCS, wchar_t under _UNICODE
    LPTSTR GetInterfaceName() { return m_lpszInterfaceName; }   // LPTSTR: wchar_t* under _UNICODE

Message 5 of 7
For what it's worth, I am largely unfamiliar with UNICODE as it relates to implementing it in a C++ project. I'm basically taking a legacy app to UNICODE and, save for the imaq calls, have been successful with it, so I could certainly be misunderstanding something 🙂

Message 6 of 7

Paul,

    I apologize for the delay.  I have found out that, unfortunately, our driver does not support UNICODE.  However, I believe Windows provides conversion functions that you can use to convert from UNICODE to normal char* strings.
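For example, something along these lines should work against the code you posted (a sketch only -- WideCharToMultiByte is the Win32 conversion routine, and this assumes a _UNICODE build where GetInterfaceName() returns wchar_t*, with a plain-ASCII interface name such as "img0"):

    #include <windows.h>

    // Narrow the wide TCHAR name into a char buffer before the IMAQ call.
    char narrowName[32];
    WideCharToMultiByte(CP_ACP, 0,
                        CConfigManager::Instance()->GetInterfaceName(), -1,
                        narrowName, sizeof(narrowName), NULL, NULL);
    dwError = imgInterfaceOpen(narrowName, &m_InterfaceID);

Thanks, and let me know if you have any more questions!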

-Allison S.
Applications Engineering
Message 7 of 7