
How the hell would foveated streaming even work? It seems physically impossible: tracking where your eye is looking, sending that information to a server, having the server process it, and then streaming the result back before the eye has moved on.


The data you're sending out is just the position and motion vectors of the pupils, and you probably only need about 16 bits for each of those numbers. At minimum that's a single x, y gaze position, i.e. two 16-bit numbers or 32 bits; even with velocity and both eyes it's only a handful of bytes per update. Any lag can be compensated for by extrapolating along the motion vectors.

It actually makes a lot of sense!
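
A minimal sketch of what that packet and the lag compensation could look like (the wire format, field scales, and sample values here are my own assumptions, not any headset's actual protocol):

    import struct

    # Per-eye gaze position and velocity packed as 16-bit fixed point.
    # 2 eyes x 4 values (x, y position + x, y velocity) = 8 int16s = 16 bytes.
    GAZE_FMT = "<8h"          # 8 signed 16-bit ints, little-endian
    POS_SCALE = 32767.0       # positions normalized to [-1, 1]
    VEL_SCALE = 3276.7        # velocities assumed within [-10, 10] units/s

    def pack_gaze(lpos, lvel, rpos, rvel):
        vals = [lpos[0] * POS_SCALE, lpos[1] * POS_SCALE,
                lvel[0] * VEL_SCALE, lvel[1] * VEL_SCALE,
                rpos[0] * POS_SCALE, rpos[1] * POS_SCALE,
                rvel[0] * VEL_SCALE, rvel[1] * VEL_SCALE]
        return struct.pack(GAZE_FMT, *(int(v) for v in vals))

    def extrapolate(pos, vel, latency_s):
        # Server side: project the gaze point forward along its motion
        # vector to cover the network round trip (dead reckoning).
        return (pos[0] + vel[0] * latency_s, pos[1] + vel[1] * latency_s)

    pkt = pack_gaze((0.10, -0.05), (1.5, 0.0), (0.11, -0.05), (1.5, 0.0))
    print(len(pkt), "bytes per update")   # -> 16 bytes per update
    # With 30 ms of lag, shift the foveated region ahead of the eye:
    print(extrapolate((0.10, -0.05), (1.5, 0.0), 0.030))  # -> ~(0.145, -0.05)

Sixteen bytes a few hundred times a second is noise next to the video stream going the other way.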


Eye tracking hardware and software specifically focus on low latency, e.g. an FPGA sitting right next to the sensor. The packets they send are also ridiculously small (e.g. two numbers for the x, y position of each pupil), so... I can see that happening.

Sure, eyes move very, VERY fast, but if you do a relatively small amount of compute on dedicated hardware, it can also be quite fast while remaining affordable.
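
Back-of-envelope on how fast it has to be (the numbers are illustrative, not from any specific tracker or product):

    # How far can the eye move during one end-to-end update?
    SACCADE_SPEED_DEG_S = 500.0   # peak saccade velocity, roughly 400-700 deg/s
    TRACKER_HZ = 200.0            # a fast eye tracker's sample rate
    NETWORK_RTT_S = 0.020         # 20 ms round trip on a good connection

    latency_s = 1.0 / TRACKER_HZ + NETWORK_RTT_S
    drift_deg = SACCADE_SPEED_DEG_S * latency_s
    print(f"worst-case gaze drift: {drift_deg:.1f} degrees")  # -> 12.5 degrees

    # So the full-resolution region needs roughly that much margin around
    # the last reported gaze point (the fovea itself is only ~2 degrees).

Saccadic suppression helps too: perception is largely blanked while the eye is mid-saccade, so the stream gets some extra time to catch up.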


It just needs to be less impossible than not doing it, i.e. streaming every frame at full resolution for the entire field of view must be an even more impossible problem.
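
That's easy to sanity-check with made-up but plausible resolutions (these specific numbers are my assumptions, not any real headset's specs):

    # Pixels per frame at full detail vs. with foveation.
    FULL_W, FULL_H = 4000, 4000        # per-eye panel, high-end VR class
    FOVEA_W, FOVEA_H = 800, 800        # high-res inset following the gaze
    PERIPH_SCALE = 4                   # periphery downsampled 4x per axis

    full = FULL_W * FULL_H
    foveated = (FOVEA_W * FOVEA_H
                + (FULL_W // PERIPH_SCALE) * (FULL_H // PERIPH_SCALE))
    print(f"full: {full/1e6:.1f} MP, foveated: {foveated/1e6:.1f} MP, "
          f"ratio: {full/foveated:.1f}x")
    # -> full: 16.0 MP, foveated: 1.6 MP, ratio: 9.8x

Even with crude numbers the foveated stream is an order of magnitude cheaper, which is the whole bet.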



