12-10-2021 10:44 AM
@James_W wrote:
@JÞB wrote:
I've upped tmo to 5ms for testing on the PC that was causing the issue to see if I can resolve it.
that begs a whole new bunch of questions about what is going on with that system.
- Slow?
- Crappy processor
- User downloading **bleep**
- Bad memory sections
- ......
You know, the basic stuff that bites you in the "cereal out" port.
Relatively slow for this application - it certainly doesn't meet the minimum S/W specs laid down 3 years ago of 50 cores and 60 GB+ RAM, but I'm still expected to support it 😂
(I've said before this is a beast of an application!)
The laptop the error popped up on must be well over 4 years old.
James
Does a queue require a contiguous block of data?
Maybe the memory isn't the (only) problem. Allocating one contiguous block of data is much harder than allocating separate blocks.
Queues do not allocate all their elements at Obtain Queue (I checked). If the data needs to be contiguous, at some point heavy, heavy memory-move operations will be needed. These could easily fail, even if there's plenty of memory on paper.
If the data comes in in chunks, enqueueing the chunks might do the trick. With a queue of 1D string arrays you get fewer elements in your queue, and each small array would be much easier to allocate. If the data doesn't come in chunks, this could still be handled in a queue manager class, of course. A DIY memory manager is starting to look better and better.
If the queue data (e.g. the pointers to the strings) isn't contiguous, there will be severe memory overhead in managing it. So if it is contiguous, there might be a solution; if not, that might be the problem.
Just a thought...
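The chunking idea above can be sketched outside LabVIEW with Python's `queue` module. This is a hypothetical analogy, not the application's actual code; the chunk size and string data below are assumed for illustration. The point is that enqueueing fixed-size chunks instead of individual items means far fewer queue elements, each a small, independently allocatable block:

```python
import queue

# Hypothetical: enqueue chunks (lists of strings) instead of individual
# strings, so the queue holds far fewer elements and each allocation is
# a small, independent block rather than one huge contiguous one.
CHUNK_SIZE = 64  # assumed chunk size; tune for the real data rate

def enqueue_chunked(q, items, chunk_size=CHUNK_SIZE):
    """Group incoming items into chunks before enqueueing."""
    for i in range(0, len(items), chunk_size):
        q.put(items[i:i + chunk_size])  # one queue element per chunk

q = queue.Queue()
data = [f"packet-{n}" for n in range(200)]
enqueue_chunked(q, data)

# 200 items at 64 per chunk -> 4 queue elements instead of 200
print(q.qsize())  # -> 4
```

A consumer then dequeues one chunk at a time and iterates over it, which also amortizes the per-element queue overhead the post mentions.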
12-10-2021 11:04 AM - edited 12-10-2021 11:14 AM
@James_W wrote:
@JÞB wrote:
I've upped tmo to 5ms for testing on the PC that was causing the issue to see if I can resolve it.
that begs a whole new bunch of questions about what is going on with that system.
- Slow?
- Crappy processor
- User downloading **bleep**
- Bad memory sections
- ......
You know, the basic stuff that bites you in the "cereal out" port.
Relatively slow for this application - it certainly doesn't meet the minimum S/W specs laid down 3 years ago of 50 cores and 60 GB+ RAM, but I'm still expected to support it 😂
(I've said before this is a beast of an application!)
The laptop the error popped up on must be well over 4 years old.
James
OH, SO IT IS A HARDWARE PROBLEM! You're being silly Jimmy my man! You really need a new application with support for the non-compliant hardware. You shouldn't be fixing what isn't broken.
Then, if by some chance you CAN find a full featured solution, consider porting a release upgrade to the existing systems (you know, with full IQ/OQ tracking and that kind of stuff.)
Frustrated smiley. If I had a dollar for every hour I spent shoehorning code... wait... I charge by the hour, so that would be super cheap.
12-15-2021 02:17 AM
@JÞB wrote:
OH, SO IT IS A HARDWARE PROBLEM! You're being silly Jimmy my man! You really need a new application with support for the non-compliant hardware. You shouldn't be fixing what isn't broken.
Then, if by some chance you CAN find a full featured solution, consider porting a release upgrade to the existing systems (you know, with full IQ/OQ tracking and that kind of stuff.)
Frustrated smiley. If I had a dollar for every hour I spent shoehorning code... wait... I charge by the hour, so that would be super cheap.
Not sure yet; I started running a test yesterday, so I should know in the next few days. Unfortunately I can't change the H/W it's running on; I can only refactor the program as I make changes to make it better (easier to maintain, faster, with fewer memory leaks).
I'm re-working the architecture bit by bit, but it's old and easier to refactor than re-write from scratch.
Adding new features can cause issues as I'm constantly pushing the performance envelope.
Please no use of superpowers @Jay.
I will mark the solution and all the correct helpful bits when I know what's going on. 😉
James
12-15-2021 03:24 AM
BTW, how do you even specify a queue size of more than 2 GB?
The input is an I32, and 2 GB is the maximum value.
12-15-2021 04:32 AM
wiebe@CARYA wrote:
BTW, how do you even specify a queue size of more than 2 GB?
The input is an I32, and 2 GB is the maximum value.
Simple...
This is a ~30GB queue (create and watch your RAM disappear!) 😉
James
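The snippet James refers to doesn't survive as text here, but the distinction it illustrates can be shown with back-of-envelope arithmetic. The numbers below are assumed examples, not James's actual figures: the I32 "size" input bounds the element *count*, while the memory footprint is element count × element size, which can far exceed 2 GB with a perfectly modest count:

```python
# Hypothetical numbers: the I32 queue-size input caps the ELEMENT COUNT,
# not the total bytes held by the queue.
I32_MAX = 2**31 - 1            # max element count you can request
element_size = 8 * 1024**2     # assume 8 MB per enqueued string
elements = 3840                # assumed element count for a ~30 GB queue

total_bytes = elements * element_size
print(total_bytes / 1024**3)   # -> 30.0 (GB of queue data from a tiny count)
assert elements < I32_MAX
```

So a "~30 GB queue" needs only a few thousand elements if each element is large, which is exactly the ambiguity wiebe points out below.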
12-15-2021 04:59 AM
Right...
In summary, after 1 day of testing on the PC that was causing the issue:
The issue previously popped up after 30 minutes, then again after 1 hour.
With a timeout of 5 ms the issue has disappeared and the S/W has been running for 24 hours, so it's looking like a H/W issue occasionally hitting max capacity.
I've gone through and marked the 3 posts that I believe most accurately point to the issue as the solution (I believe I'm allowed 3 solutions - and yes, I've been naughty and marked one of my own).
Many thanks to all who helped.
(I didn't make a mistake, as the max queue size was always ~60 GB plus a few header packets. The data packets were 8 MB+ when hitting that sort of queue size, so the 8-byte header was negligible when it came to the maths, and I never enqueued empty strings to the array; all data was a packet of a pre-defined size = known string length.)
James
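James's negligibility claim checks out numerically. Here is the arithmetic using the figures quoted in the post above (~60 GB of queued data, 8 MB data packets, 8-byte headers):

```python
# Check that 8-byte headers are negligible against 8 MB data packets
# at a ~60 GB total queue size (figures from the post above).
queue_bytes = 60 * 1024**3        # ~60 GB of queued data
packet_bytes = 8 * 1024**2        # 8 MB per data packet
header_bytes = 8                  # 8-byte header per packet

packets = queue_bytes // packet_bytes
overhead = packets * header_bytes
print(packets)                    # -> 7680 packets
print(overhead)                   # -> 61440 bytes (~60 KB) of header total
print(overhead / queue_bytes)     # roughly 1e-6, i.e. ~0.0001% overhead
```

At under a millionth of the total, the headers really do vanish in the maths.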
12-15-2021 08:10 AM
@James_W wrote:
wiebe@CARYA wrote:
BTW, how do you even specify a queue size of more than 2 GB?
The input is an I32, and 2 GB is the maximum value.
Simple...
This is a ~30GB queue (create and watch your RAM disappear!) 😉
James
Ok, I'd call that the queue data's size, not the queue size. But I probably could have gotten that from the context.
12-15-2021 11:43 AM
@James_W wrote: Please no use of superpowers @Jay.
I will mark the solution and all the correct helpful bits when I know what's going on. 😉
James
I actually believe that I am in the lowest quartile of superpower users.
For others reading this: yes, I admitted on another thread that some users can manipulate marked solutions. In that thread I ran into an OP that actually had a problem with File Buffering; I don't recall seeing that anywhere else on the forums, so I mentioned that I would go back and clean it up later.
And James, with a few additional posts you could do that too (only 9,324 posts to go!)