1 Reply Latest reply on May 12, 2017 12:52 AM by marty1885

    What is the fastest way to copy a buffer from one OpenCL context to another when both reside on the same graphics card?

    tugrul_512bit

      In other words, does OpenCL support sharing data between contexts, similar to OpenGL interop or clEnqueueMigrateMemObjects, but without event dependencies: a pure copy, not a move of the buffer?

       

      I'm working on a project that has multiple contexts for multiple GPUs, and I want to make it dynamic so it can move stages of a pipeline from one device to another, effectively using the same device (but without touching PCIe, since all the buffers are on the same graphics card).

       

      If I restructure the core of the project to run all pipelines in a single OpenCL context, it will violate the open-closed principle I've been applying. Maybe it was my mistake not to think everything through from the beginning? But wouldn't that have been premature optimization, and hence the root of all evil?

      If it's impossible to do pure-GPU data movement between multiple OpenCL contexts on a single graphics card, will it be added in the future?

       

      Thank you for your time.

        • Re: What is the fastest way to copy a buffer from one OpenCL context to another when both reside on the same graphics card?
          marty1885

          First of all, if you are doing HPC, then no, premature optimization is not the root of all evil. Your program needs hours, days, or even months of wall-clock (actual) time to compute, so any optimization is valuable. Also, I don't think this is an optimization at all (at least in the HPC world); it looks like a design/feature issue. As for the open-closed principle: I believe you should have a central OpenCL manager that handles the lifetime of contexts, and then create devices and buffers from the context the manager creates.
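          A minimal sketch of that manager idea, assuming a single platform with one or more GPUs (the names `ClManager` and `cl_manager_init` are illustrative, not from the thread). The key point is that one context spans all the devices, so buffers created from it are visible to every per-device queue, and clEnqueueCopyBuffer or clEnqueueMigrateMemObjects can move data between devices without an explicit host round trip:

          ```c
          /* Sketch: a central manager owning one context shared by all GPUs
           * on the first platform, with one command queue per device.
           * Error handling is reduced to early returns for brevity. */
          #include <CL/cl.h>

          #define MAX_DEVICES 8

          typedef struct {
              cl_context       context;
              cl_uint          num_devices;
              cl_device_id     devices[MAX_DEVICES];
              cl_command_queue queues[MAX_DEVICES];
          } ClManager;

          /* Returns 0 on success, -1 if no platform/GPU is available. */
          int cl_manager_init(ClManager *m)
          {
              cl_platform_id platform;
              if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS)
                  return -1;

              if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, MAX_DEVICES,
                                 m->devices, &m->num_devices) != CL_SUCCESS)
                  return -1;

              /* One context shared by every device on the platform. */
              m->context = clCreateContext(NULL, m->num_devices, m->devices,
                                           NULL, NULL, NULL);
              if (m->context == NULL)
                  return -1;

              for (cl_uint i = 0; i < m->num_devices; ++i)
                  m->queues[i] = clCreateCommandQueue(m->context,
                                                      m->devices[i], 0, NULL);
              return 0;
          }
          ```

          The rest of the program then asks the manager for the context and queues instead of creating its own, which keeps the open-closed structure: adding a device changes only the manager.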

           

          Back to the topic: I don't think copying a buffer between contexts without going through the CPU is possible. All of OpenCL's memory-copy mechanisms work only within a single context.
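          For completeness, the only portable route between two contexts is to stage through host memory, which is exactly the round trip the question hopes to avoid. A minimal sketch, assuming `queueA`/`bufA` belong to one context and `queueB`/`bufB` to another (error checking omitted):

          ```c
          /* Sketch: copy nbytes from a buffer in context A to a buffer in
           * context B by staging through host memory. Both enqueues are
           * blocking, so the staging buffer can be freed immediately. */
          #include <CL/cl.h>
          #include <stdlib.h>

          void copy_across_contexts(cl_command_queue queueA, cl_mem bufA,
                                    cl_command_queue queueB, cl_mem bufB,
                                    size_t nbytes)
          {
              void *staging = malloc(nbytes);

              /* Device A -> host (blocking read). */
              clEnqueueReadBuffer(queueA, bufA, CL_TRUE, 0, nbytes, staging,
                                  0, NULL, NULL);

              /* Host -> device B (blocking write). */
              clEnqueueWriteBuffer(queueB, bufB, CL_TRUE, 0, nbytes, staging,
                                   0, NULL, NULL);

              free(staging);
          }
          ```

          Pinned host memory (a buffer created with CL_MEM_ALLOC_HOST_PTR and mapped) would typically make this staging faster than pageable malloc memory, but it cannot eliminate the host hop.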
