Monday, 11 April 2016

Changing priorities in order to keep moving forward

As we did not have much success with either compute shaders or VBOs, we had a hard time moving forward with altering the particle system in a way that would fit our goals. We therefore decided to put this problem on hold for a while and instead work on other essential parts of the project.

Something worth noting here is that we will most likely have to keep investigating how to store the Particle System data in the GPU's VRAM. One way to handle this is to keep investigating the VBO approach and possibly get it up and running to give us the desired effect. An alternative would be to find another solution to the problem. We have read a very interesting article in which a texture is used to encode particle data. This could be a quick way for us to store the particle data and easily access it in the next frame. The blog post can be found here: http://nullprogram.com/blog/2014/06/29/

Since we stopped working on the particle system for a while, we decided to start working on handling multitouch input in the shader instead. We realized that this had to be done at some point anyway.

The problem with passing data to a shader in Unity is that we are very limited in the number and types of uniforms we can send. For example, Unity cannot send an array of data to a GL shader program. So we had to get inventive.
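
To illustrate the limitation, here is a small hypothetical sketch of what the Material API gives us (the uniform names are made up for the example):

    using UnityEngine;

    public class UniformLimitation : MonoBehaviour
    {
        public Material material;

        void Update()
        {
            // Single floats and vectors are straightforward to pass...
            material.SetFloat("_Radius", 0.1f);
            material.SetVector("_HandPos", new Vector4(0.5f, 0.5f, 0f, 0f));
            // ...but there is no equivalent call for passing a whole array
            // of hand positions, which is what we would need here.
        }
    }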

We realized after a while that a texture is basically a big matrix that can store data fairly efficiently, if we are smart about it. First, we defined a maximum number of hands allowed to multitouch the screen. For now we set that number to 20, but it could be increased in the future. The shader needs to know a couple of things about each hand to be able to do its calculations. The main thing is of course the current position, but the initial position of the touch is also relevant: that way we can track how much the user has moved the hand without lifting it off the screen.

This meant we needed a 20 by 2 texture to store our data. Since each pixel in the texture consists of a vec3 (or a vec4 if using RGBA), we can store quite a lot of data in a small texture. The first position in the texture, [i, 0], contains the x and y coordinates of the i:th hand, and the second position, [i, 1], contains the initial position of that hand. In both rows, the red component of the pixel holds the x value and the green component holds the y value.
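
A rough sketch of how the CPU side could look (the class, field, and uniform names are our own placeholders, and we assume hand positions normalized to [0, 1] screen coordinates):

    using UnityEngine;

    public class HandTexture : MonoBehaviour
    {
        const int MaxHands = 20;
        public Material material;
        Texture2D handTex;
        Color[] pixels = new Color[MaxHands * 2];

        void Start()
        {
            // RGBAFloat keeps full float precision in each channel.
            handTex = new Texture2D(MaxHands, 2, TextureFormat.RGBAFloat, false);
            handTex.filterMode = FilterMode.Point; // no blending between neighbouring hands
            material.SetTexture("_HandTex", handTex);
        }

        public void UploadHands(Vector2[] current, Vector2[] initial, int count)
        {
            for (int i = 0; i < MaxHands; i++)
            {
                Vector2 cur = i < count ? current[i] : Vector2.zero;
                Vector2 init = i < count ? initial[i] : Vector2.zero;
                pixels[i] = new Color(cur.x, cur.y, 0f, 0f);              // row 0: current position
                pixels[MaxHands + i] = new Color(init.x, init.y, 0f, 0f); // row 1: initial position
            }
            handTex.SetPixels(pixels);
            handTex.Apply(); // upload to the GPU
        }
    }

Point filtering matters here: with bilinear filtering, sampling near a texel border would blend the data of two different hands.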

We realize we could have used an even smaller texture (20 by 1) by encoding an RGBA value as x, y, xOld, yOld. However, the way we have done it now reserves some space in case we want to send other data at a later stage. This is also an optimization that is probably completely unnecessary, as the amount of data we are passing is already small. Still, we will consider moving to the 20 by 1 texture at a later stage if we find no use for the currently zero-valued blue and alpha components of the RGBA pixels.
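
If we do go down that route, the change would be small. Something like this, reusing the names from the sketch above (with the texture created as 20 by 1 and pixels holding MaxHands entries):

    public void UploadHandsPacked(Vector2[] current, Vector2[] initial, int count)
    {
        for (int i = 0; i < MaxHands; i++)
        {
            Vector2 cur = i < count ? current[i] : Vector2.zero;
            Vector2 init = i < count ? initial[i] : Vector2.zero;
            // R,G = current position; B,A = initial ("old") position.
            pixels[i] = new Color(cur.x, cur.y, init.x, init.y);
        }
        handTex.SetPixels(pixels);
        handTex.Apply();
    }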

We are now working on decoding the hand position data inside the shader. This is difficult, however, as debugging a shader is nigh impossible. A sudden realization we had while writing this blog post is that the reason we cannot decode the multitouch data at the moment might be that the texture expects values from 0 to 255 rather than floating-point values between 0 and 1... We have to explore this...
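
For reference, this is roughly how we picture the decoding on the shader side, in GLSL (handTex and MAX_HANDS are our placeholder names, and we assume the 20 by 2 layout described above). Note that if the texture is stored in an 8-bit format, the sampled values come back quantized to 256 levels in [0, 1], which would fit the suspicion above; a float texture format avoids that.

    uniform sampler2D handTex;
    const float MAX_HANDS = 20.0;

    // Current position of hand i: sample the centre of texel (i, 0).
    vec2 handPosition(float i)
    {
        vec2 uv = vec2((i + 0.5) / MAX_HANDS, 0.25); // row 0 of 2 -> v = 0.25
        return texture2D(handTex, uv).rg;            // red = x, green = y
    }

    // Initial position of hand i: same column, row 1.
    vec2 handInitialPosition(float i)
    {
        vec2 uv = vec2((i + 0.5) / MAX_HANDS, 0.75); // row 1 of 2 -> v = 0.75
        return texture2D(handTex, uv).rg;
    }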
