I'm having a problem with my CAL program.
My program imports time-series data for 2700 unique elements, each with 2 values and 75 data points per value. This data is then passed to my 4870 as two [2700 x 75] arrays. The GPU computes the correlation coefficient of every element against every other element, and returns the results as two [2700 x 2700] tables. Everything performs flawlessly... until I start shutting down CAL, at which point the program will either seg fault, or glibc will abort it because of a double free.
I'm on Ubuntu 8.10 x64, using Stream SDK 1.3. The kernel has 4 inputs (2 int, 2 float), 1 constant (int), and 4 outputs (2 int, 2 float).
The first cause, which always triggers the glibc abort rather than a seg fault, is calResFree(). It only occurs when freeing the second two inputs; if I set up CAL to free only inputs 0 and 1, there's no problem. Freeing the outputs and the constant buffer causes no problem either.
The second cause, which happens no matter what, is calShutdown(). Most of the time this causes a seg fault; occasionally (about 1 time in 6) I get the glibc error instead, followed by a backtrace.
I've included the code for my shutdown routine and both backtraces. Any ideas on what's causing this?
Edit: I'd also like to clarify that this is not my first Stream program. I have a few others that perform other functions and don't have this problem; however, this is the first one with this much output data. The code here is based on the code of those other, working programs.