I’m on PSVR v2. I still see the same amount of view drift.
I think what Sony’s PSVR implementation might be doing is tracking the headset over a longer period of time, looking for a long-term bias in the values, and then doing a least-squares curve fit on that to minimize the drift. The assumption here is that you are playing a sit-down game, not a room-scale 360° game; in that case, Trinus could assume that you are facing forward most of the time. Would you be willing to implement that? I think if you sample the direction every, e.g., tenth of a second and average the samples every 10 seconds using a rolling window of 1 minute, there should statistically be enough data to create a very good fit. This would require storing about 14 kB of data in a circular buffer (600 samples * 8 bytes * 3 axes / 1024).
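Just to show where the 14 kB figure comes from (assuming one 64-bit float per axis per sample):

```python
samples = 600          # 1 minute of samples at 10 Hz
bytes_per_value = 8    # one 64-bit float per axis
axes = 3
print(samples * bytes_per_value * axes / 1024)  # -> 14.0625 (kB)
```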
Here is, technically, what I am thinking of. I tried to include a lot of detail; you most likely know all of this already, I just included it to preclude ambiguity.
Make a buffer that is 600 samples long; call it the “samples buffer”. Each sample is a vector with all 3 axes. Then:

1. Initialize the buffer with the current looking direction.
2. Record a new sample every 0.1 seconds. Once you reach the end of the buffer, start recording from the start again, but don’t zero out the buffer; leave the old values in place and just overwrite them as you get to them.
3. Every 10 seconds, average the whole buffer; that’s your “average looking direction”. Add this new “average looking direction” to a buffer called “averages”.
4. After 5 minutes, take those averages and split them into X, Y, and Z. For X, compute a linear least-squares fit (one axis is X, the other is time); the gradient of the fitted line is your drift. Do the same for Y and Z, and add the newly found drift factors to the drift correction.
5. Empty the “averages” buffer and the “samples” buffer and start over.
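The steps above could be sketched roughly like this (Python with NumPy; all the names, and the use of `np.polyfit` for the linear fit, are my own choices for illustration, not anything Trinus actually exposes):

```python
import numpy as np

SAMPLE_DT = 0.1   # seconds between samples (10 Hz)
SAMPLES = 600     # 1-minute circular "samples buffer"
AVG_EVERY = 100   # average the buffer every 10 s (every 100 samples)
AVG_WINDOW = 30   # 5 minutes' worth of 10-second averages

class DriftEstimator:
    def __init__(self, initial_dir):
        # "samples buffer", pre-filled with the current looking direction
        self.samples = np.tile(np.asarray(initial_dir, dtype=float), (SAMPLES, 1))
        self.idx = 0
        self.count = 0
        self.averages = []  # "averages buffer": (time, mean xyz) pairs

    def add_sample(self, t, direction):
        """Feed one looking-direction sample; returns a per-axis drift
        estimate once 5 minutes of averages have accumulated, else None."""
        self.samples[self.idx] = direction      # overwrite the oldest entry
        self.idx = (self.idx + 1) % SAMPLES
        self.count += 1
        if self.count % AVG_EVERY == 0:
            self.averages.append((t, self.samples.mean(axis=0)))
        if len(self.averages) >= AVG_WINDOW:
            drift = self._fit_drift()
            self.averages.clear()               # empty both buffers, start over
            self.samples[:] = direction
            return drift                        # drift rate per axis, units/s
        return None

    def _fit_drift(self):
        times = np.array([t for t, _ in self.averages])
        dirs = np.array([d for _, d in self.averages])
        # slope of a linear least-squares fit per axis = long-term drift rate
        return np.array([np.polyfit(times, dirs[:, a], 1)[0] for a in range(3)])
```

One caveat: the first few 10-second averages still contain the pre-filled initial values, so the fitted slope will slightly underestimate a true constant drift over the first cycle; later cycles don’t have that bias.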