
Actor Framework Discussions

Message sent to caller in Stop Core handled after Last Ack?

I have a DAQ actor that does some post-processing in its Stop Core override and sends the resulting data to its caller, where it is saved to a data file. 

 

When the caller receives the Last Ack from the DAQ actor, it closes the data file. 

 

The problem is that the caller receives the Last Ack message before the final data message sent in Stop Core, so it closes the data file prematurely. 

 

Looking at Actor.vi, it seems that the message sent in Stop Core should be in the caller's queue before the Last Ack message is sent, but the Last Ack message is being handled first... Is this expected behavior?
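
To show what I mean, here is a minimal sketch (plain Python rather than LabVIEW, with made-up names) of why a priority message queue behaves this way: a critical-priority message is dequeued ahead of a normal-priority message that was enqueued earlier.

    import heapq
    import itertools

    NORMAL, CRITICAL = 1, 0          # lower number = dequeued first
    order = itertools.count()        # preserves FIFO order within a priority
    queue = []

    def enqueue(priority, label):
        heapq.heappush(queue, (priority, next(order), label))

    # Stop Core sends its final data message first...
    enqueue(NORMAL, "final data from Stop Core")
    # ...then the framework sends Last Ack at critical priority.
    enqueue(CRITICAL, "Last Ack")

    while queue:
        _, _, label = heapq.heappop(queue)
        print("handled:", label)
    # prints "Last Ack" first, then "final data from Stop Core"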

Message 1 of 8

Last Ack is sent at critical priority, so yes, your caller might handle it before any message it got from Stop Core (or elsewhere), even though the message from Stop Core was sent first.

 

Why not just bundle your post-processing calculations into your nested actor's attributes? Your caller can retrieve the results from the nested actor object in Handle Last Ack, write them to file, and then close the file.
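
A rough Python analogue of that suggestion (not LabVIEW, and with hypothetical class and field names): the nested actor keeps its post-processed results inside itself, and the caller harvests them from the actor object carried by the Last Ack before closing the file.

    from dataclasses import dataclass, field

    @dataclass
    class DaqActor:
        results: list = field(default_factory=list)   # the actor's "attributes"

        def stop_core(self):
            # post-processing happens here; results stay inside the actor object
            self.results = [1.0, 2.0, 3.0]
            return LastAck(final_actor_state=self)     # Last Ack carries the actor back

    @dataclass
    class LastAck:
        final_actor_state: DaqActor

    def handle_last_ack(ack, data_file):
        # caller reads the results from the returned actor, then closes the file
        data_file.extend(ack.final_actor_state.results)
        print("wrote", len(ack.final_actor_state.results), "points; closing file")

    handle_last_ack(DaqActor().stop_core(), data_file=[])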

Message 2 of 8

Last Ack is sent at critical priority, so yes, your caller might handle it before any message it got from Stop Core (or elsewhere), even though the message from Stop Core was sent first.


Shouldn't "Last Ack" be the last message handled? It's got "last" in its name, after all. One of the reasons I don't like "priority" in a message queue is that it gets harder to reason about, but I would say that "Last Ack" should be the lowest priority, so it is handled last.

Message 3 of 8

Why not just bundle your post-processing calculations into your nested actor's attributes? Your caller can retrieve the results from the nested actor object in Handle Last Ack, write them to file, and then close the file.


This is what I had gone with as a solution until I could understand what was going on...but I agree with drjdpowell that it is a little counter-intuitive that the Last Ack skips to the front of the queue. 

Message 4 of 8

It is counterintuitive. It's not a conscious decision of the AF, but it is a required result of other conscious decisions. Curiously, this is the first time I recall it ever coming up. That's surprising to me now that you highlight the issue. I guess the canonical AF behavior should be "if you are an actor who will return a result to caller and then terminate, return the result within yourself, not in a separate message, and let your caller harvest it from the Last Ack."

 

That actually has a nice symmetry to it -- caller creates a nested actor object, populates it with the problem instructions, then launches the nested actor. The caller then gets that same actor object back in the Last Ack message.

 

But I agree that the arrival of a message after Last Ack is counterintuitive. The reason Last Ack gets critical priority is to stop the caller from continuing to send messages to a dead address, rather than assuming they will learn about it by checking for errors on the enqueuer. But it probably isn't as clean as it should be.

 

Put it on the list of things to consider for AF 2.0.

Message 5 of 8

The reason Last Ack gets critical priority is to stop the caller from continuing to send messages to a dead address, rather than assuming they will learn about it by checking for errors on the enqueuer. 

Priority won't reliably help you here, as you can get:

1) Caller sends Stop to Nested

2) Caller starts handling the next message

3) Nested stops and sends Last Ack

4) Caller, continuing execution of the message, tries to send a message to Nested and gets an error (see the sketch below).
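
To illustrate the race (a minimal sketch in plain Python, not LabVIEW; all names are made up): the caller is already executing an earlier message when the Last Ack arrives, so critical priority cannot stop that in-flight handler from sending to the now-stopped nested actor.

    nested_alive = True
    caller_queue = ["do more acquisition"]   # dequeued before Stop was sent

    def send_to_nested(message):
        if not nested_alive:
            raise RuntimeError(f"error: sent {message!r} to a stopped actor")
        print("nested received:", message)

    # 1) Caller sends Stop to Nested
    send_to_nested("Stop")
    # 2) Caller starts handling the next message it had already dequeued
    in_flight = caller_queue.pop(0)
    # 3) Nested stops and sends Last Ack (critical priority, front of the queue)
    nested_alive = False
    caller_queue.insert(0, "Last Ack")
    # 4) The in-flight handler still tries to message Nested and gets an error
    try:
        send_to_nested(f"request triggered by {in_flight!r}")
    except RuntimeError as e:
        print(e)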

 

The reliable choices are:

A) Handle no other messages until the Last Ack has been handled (i.e., Synchronous Request-Reply)

or

B) Record somehow that Nested's address is no longer to be used, even if you haven't yet received the Last Ack (see the sketch below).
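
A minimal sketch of option B in plain Python (not LabVIEW; the names are illustrative): the caller marks the nested actor's enqueuer as dead the moment it sends Stop, and drops anything it would have sent afterwards, without waiting for the Last Ack.

    class Caller:
        def __init__(self):
            self.dead_enqueuers = set()

        def send(self, enqueuer, message):
            # refuse to use an enqueuer we have already told to stop
            if enqueuer in self.dead_enqueuers:
                print(f"dropped {message!r}: {enqueuer!r} was already told to stop")
                return
            print(f"sent {message!r} to {enqueuer!r}")

        def send_stop(self, enqueuer):
            self.send(enqueuer, "Stop")
            self.dead_enqueuers.add(enqueuer)   # record it now, not on Last Ack

    caller = Caller()
    caller.send_stop("nested DAQ actor")
    caller.send("nested DAQ actor", "new acquisition request")   # safely dropped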

Message 6 of 8

The reliable choices are:

A) Handle no other messages until the Last Ack has been handled (i.e., Synchronous Request-Reply)

or

B) Record somehow that Nested's address is no longer to be used, even if you haven't yet received the Last Ack.


Even without leaving the "only send messages to callees and caller and self" paradigm, I don't think either A or B will be sufficient. Imagine:

 

  1. Caller/Callee sends a Stop message to the Actor under consideration (AuC). In the case of the Callee, this could be due to an error propagating past Handle Error.
  2. Despite that end of the line (Caller/Callee) knowing about the Stop instruction and being held off from doing more by either A or B, the opposite end (Callee/Caller) has no knowledge that AuC will soon receive a Stop message. That actor continues to send new messages.
  3. AuC receives Stop, stops, and sends the Last Ack to its Caller. In the case of automatically stopping nested actors, the callee would receive a message instructing it to stop, but that doesn't have to be the case (of course, if the stop isn't being handled somewhere, there are probably bigger problems...).
  4. Callee/Caller sends a message to AuC, despite AuC having already sent the Last Ack and stopped. One of these two (and there could be many callees, of course) doesn't know that a Stop message was sent by the other, so it can't preemptively wait for a Last Ack (in the case of the Caller) or automagically block messages to that enqueuer (in either case, I guess).

Message 7 of 8

In my particular case, I realized that my caller actor would need to send a message to itself from Handle Last Ack to close the data file, so that the close request sits at the back of the queue behind any data points still waiting to be saved to the file. 
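
For reference, a rough sketch of that workaround in plain Python (not LabVIEW; names are illustrative): Handle Last Ack enqueues a normal-priority "close file" message to the caller itself, so it lands behind any data messages already waiting in the caller's own queue.

    from collections import deque

    # data messages that arrived before the Last Ack was handled
    caller_queue = deque(["save data point 41", "save data point 42"])

    def handle_last_ack():
        # don't close the file here; self-message at normal priority instead
        caller_queue.append("close data file")

    handle_last_ack()
    while caller_queue:
        print("handling:", caller_queue.popleft())
    # "close data file" is handled last, after the pending data points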

Message 8 of 8