
highborn
Adept I

bsc ai thesis / gpu computing

TL;DR:

Can I GPU-compute an AI workload with Vulkan?

Hello.

I'm a uni student working toward my BSc degree, and my thesis will be about AI; more specifically, I want it to be about genetic algorithms. With that said, I'm very much a beginner in AI (and in everything else at the BSc level, obviously).

So when I signed up for my thesis course (if that's the right word), I didn't really have any specific idea, so my professor/lecturer pitched me one. I like it, but there's a "slight" problem with it for me.

He said it would be very good if we could replicate the results found here:

https://blog.openai.com/evolution-strategies/

I really like the idea, but there's a massive hardware limitation for me: I can't throw hundreds of cores, or even 32, plus days of compute time at the problem. I'm an avid gamer and my rig is by no means a slouch, but obviously it's not a server farm. I sat on this dilemma for a while and came up with something, though I have no idea whether it can even be done or is worth the time and effort.

My solution would be to try to GPU-compute it with the Vulkan API. My GPU is a good compute card as far as I know, and if this kind of workload could be parallelized on it, that would be amazing. Note that their original code was written in Python; what I know about Python is that it's easy to code in, but terribly slow. I was thinking about rewriting it in C++ and trying to GPU-compute the workload.
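To make it concrete, here's roughly what I understand the core loop from the blog post to look like, in plain NumPy. The fitness function, dimensions, and hyperparameters here are just placeholders of mine, not their actual setup; the per-individual evaluations in the inner loop are the part I'd want to parallelize on the GPU:

```python
# Minimal sketch of the evolution-strategies loop, as I understand it
# from the OpenAI post. fitness() and all numbers are placeholders.
import numpy as np

def fitness(theta):
    # Stand-in objective: get theta close to a target vector.
    return -np.sum((theta - 0.5) ** 2)

dim, npop, sigma, alpha = 100, 50, 0.1, 0.01
theta = np.zeros(dim)

for generation in range(300):
    # Sample a population of random parameter perturbations.
    eps = np.random.randn(npop, dim)
    # Evaluate each perturbed parameter vector. This is the
    # embarrassingly parallel part: one evaluation per worker.
    rewards = np.array([fitness(theta + sigma * e) for e in eps])
    # Normalize rewards and step theta along the reward-weighted average.
    rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
    theta += alpha / (npop * sigma) * eps.T @ rewards
```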

My question is whether this is a feasible option, or whether I need to think about another topic.

Also, if it is, any material that would help me learn Vulkan would be appreciated.

P.S.:
I've also posted this on Reddit but got no "good enough" answer, so I came here; maybe you guys can help.

My computer:
Ryzen R7 1700X
Vega 64 (Strix)
16 GB of RAM

2 Replies
nobodygamer
Adept II

Hi,

In general, evolutionary algorithms (EAs) are quite different from neural networks (NNs).

The main aspects that make NNs a good fit for GPU programming are:

1. They use simple math (addition, subtraction, and multiplication), and

2. They rely on matrix operations, something GPUs excel at (see the toy sketch below).
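To illustrate point 2 (this is my own toy example, not from the post): a dense NN layer is just a matrix multiply plus a few elementwise operations, which is exactly the workload GPUs are built for.

```python
# Toy illustration: one dense layer's forward pass is a matmul
# plus elementwise ops. All shapes here are arbitrary.
import numpy as np

batch, n_in, n_out = 64, 128, 32
x = np.random.randn(batch, n_in)          # a batch of inputs
W = np.random.randn(n_in, n_out) * 0.01   # layer weights
b = np.zeros(n_out)                       # layer biases

# matmul + broadcast add + ReLU: each step is trivially parallel.
h = np.maximum(0.0, x @ W + b)
```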

EAs, though, despite being parallelizable (each individual can be evaluated and mutated in parallel), have an aspect that makes them difficult to run on a GPU: an individual is not always a simple mathematical expression that can be mapped to GPU operations.

You should also consider that you will need to translate the representation of an individual into a structure that GPU computations can be applied to.
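A minimal sketch of what I mean, assuming a fixed-length real-valued genome (your representation may well differ): instead of a list of individual objects, pack the whole population into one contiguous array that can be uploaded to the GPU as a single buffer.

```python
# Sketch: object-style population vs. a GPU-friendly flat buffer.
# Genome length, population size, and dtype are arbitrary assumptions.
import numpy as np

pop_size, genome_len = 1024, 256

# Easy to reason about, but bad for GPUs: a list of separate arrays.
population = [np.random.randn(genome_len) for _ in range(pop_size)]

# GPU-friendly: one contiguous (pop_size, genome_len) float32 buffer.
pop_buffer = np.stack(population).astype(np.float32)

# Evolutionary operators then become whole-array operations, e.g. a
# simple per-gene mutation with 1% probability:
noise = (np.random.rand(pop_size, genome_len) < 0.01) * np.random.randn(pop_size, genome_len)
pop_buffer += noise.astype(np.float32)
```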

On the question of which language to use: yes, C++ is the go-to option. But I would also suggest looking into Python-based approaches; I know of PyCUDA and Numba (which can JIT-compile NumPy-style code). Also, from a quick search, it seems there are Vulkan bindings for Python!
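As a small taste of Numba, here's my own toy sketch of parallel population evaluation. Note this uses Numba's CPU-parallel mode (which runs on any machine) rather than a GPU backend, and the fitness function is a placeholder; it's only meant to show the shape of "evaluate every individual in parallel".

```python
# Toy sketch: evaluate each individual's fitness in parallel across
# CPU cores with Numba. The fitness function is a stand-in.
import numpy as np
from numba import njit, prange

@njit(parallel=True)
def evaluate_population(pop):
    n = pop.shape[0]
    scores = np.empty(n)
    for i in prange(n):  # iterations are distributed across cores
        scores[i] = -np.sum((pop[i] - 0.5) ** 2)  # placeholder fitness
    return scores

scores = evaluate_population(np.random.randn(1024, 256))
```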

For me, the most important part of this project is to model your EA's individuals and evolutionary operators in a way that can run on a GPU.

Best,

Christos

Anonymous
Not applicable

Hello,

I would maybe first develop your project to run on your own machine as a test.
Then look into cloud computing platforms like Azure, Amazon, or Google to run your AI project at a larger scale.
Cloud platforms usually offer a free trial for a month or so, if I remember correctly.
