Max 1080p Bitrate without Moonlight?


This topic contains 3 replies, has 2 voices, and was last updated by  JustCallMeDanTheMan 2 weeks, 5 days ago.

Viewing 4 posts - 1 through 4 (of 4 total)
#7203

    JustCallMeDanTheMan
    Participant

I’m upgrading our networking so I can make best use of Trinus VR. It’s fun now, but over-compressed to make it playable over an old WiFi network (802.11n, 2.4GHz-only, i.e. a theoretical max of 72Mbps and a real-life test bitrate of ~28Mbps), with the kinds of artefacts you’d expect from that.

So, with a view to upgrading the house progressively, I’m looking at starting with a dongle for my PC that should manage ~200Mbps real-world speed – e.g. 802.11ac in AC600 mode on 5GHz, with a theoretical maximum of 433Mbps, which matches the best WiFi mode of any smartphone in our house. It also needs to be capable of running a hotspot, so I can avoid upgrading the nasty cheap ISP wireless hub, which would be the most expensive item to replace.
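As a rough sanity check on that ~200Mbps figure (assuming real-world throughput stays at roughly the same fraction of the link rate as on my old network – a big assumption, since efficiency varies with hardware and interference):

```python
# Back-of-envelope: estimate real-world AC600 throughput from my measured
# 802.11n efficiency. The "same efficiency ratio" premise is my own guess,
# not anything from a spec.
link_rate_n = 72       # Mbps, 802.11n 2.4GHz single-stream theoretical max
measured_n = 28        # Mbps, my real-life test on the old network
link_rate_ac600 = 433  # Mbps, 802.11ac AC600 mode theoretical max on 5GHz

efficiency = measured_n / link_rate_n          # ~39% in my case
est_ac600 = link_rate_ac600 * efficiency       # ~168 Mbps

print(f"Measured 802.11n efficiency: {efficiency:.0%}")
print(f"Estimated AC600 real-world:  {est_ac600:.0f} Mbps")
```

So if the old network’s ~39% efficiency carried over, I’d land nearer ~170Mbps than 200Mbps – but 5GHz with clear air should do rather better than congested 2.4GHz.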

But before I buy, given I’ll want to play at 1080p to match the smartphone screen with minimal compression – avoiding as much software heavy lifting for Trinus as possible (I have an AMD card, so no Moonlight for me) and maxing out the quality – what is the maximum bitrate that Trinus VR will try to transmit at 1920×1080, at say the cap of 70fps?

There is a range of bitrates estimated for different encoders at 1080p 60fps (https://toolstud.io/video/bitrate.php?imagewidth=1920&imageheight=1080&colordepth=24&framerate=60), but I’m guessing at the bit depth and don’t know which encoder is used, so it’s quite a wide ballpark: from as low as 130Mbps for H.264 (which scales to ~152Mbps at 70fps) – a rate I should achieve easily with the proposed setup – up to 393Mbps for the highest rate given by a codec I recognise (JPEG2000 at 250Mbps for 2Kp24, compression 7.6:1 or 13.2%, which scales to ~459Mbps at 70fps).
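For anyone checking my numbers, the 70fps figures are just linear scaling of the 60fps estimates – assuming bitrate grows roughly proportionally with frame rate at fixed quality, which is an assumption on my part, not a codec guarantee:

```python
# Scale the 1080p60 encoder estimates (from the toolstud.io calculator)
# up to 70fps, assuming bitrate is roughly proportional to frame rate.
rates_60fps_mbps = {"H.264": 130, "JPEG2000 (7.6:1)": 393}

for codec, mbps in rates_60fps_mbps.items():
    at_70 = mbps * 70 / 60
    print(f"{codec}: {mbps} Mbps @60fps -> {at_70:.1f} Mbps @70fps")
```

That gives ~151.7Mbps for H.264 and ~458.5Mbps for JPEG2000 at 70fps – in practice inter-frame codecs like H.264 reuse data between frames, so the real increase could be smaller.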

    If it is higher than I can manage with ~200Mbps, I’ll consider buying a more powerful dongle now so I can get to higher speeds later on when I next upgrade my phone. Trickier though, as for some reason the current batch of e.g. AC1200 dongles all seem to have issues making hotspots under Windows… I can’t find a single one that is confirmed to work!

    #7204
loxai
    Keymaster

I usually use USB to get the best performance. With an i7 6700HQ and a Galaxy S7, I get 50fps at 1080p, tops.
    When I was testing through wireless (using a high quality Linksys EA6700 router), I could get similar performance, but not as good (don’t remember the numbers).
    At such resolution, the network can be a bottleneck, but even if it manages, the decoding on the phone is also quite resource intensive.
    One option that can be helpful is Motion boost. It won’t noticeably increase frame rate, but it will improve latency when rotating.
That aside, you can also experiment with Trinus AIOVR. It should give you similar results, but if you get your hands on an All-In-One with HDMI input (like the Magicsee M1 Pro) you’ll be able to get the best results without additional setup.

    #7208

    JustCallMeDanTheMan
    Participant

    Yeah, with a Sony Xperia XZ (my wife’s phone) and a Wileyfox Swift 2 X, on USB I get 220-250 Mbps effective download speed, after finally noticing the NDIS driver issue in the sticky post… actually, to be honest, rediscovering the solution myself online and then wishing I’d looked here first 🙂 Now the Delivery rate never really drops below the Capture rate with the available resolutions at Ultra and minimum compression.

    By the way, in case a future Google search leads here, the tool I used to measure the real world network speeds is here: WiFi Speed Test – it’s a helpful utility if you’re familiar with networking and can work through the trials of getting it to work for a USB tether.

So USB is fine for me when I’m using my regular mouse and keyboard but just moving the screen IN YO FACE – which is the best way I have heard anyone explain the effect ^_^. But it’s not really workable for immersing myself in the game, standing up in a clear space with my Steam controller and a wireless gyroscopic mouse velcroed to my head – the Wileyfox sadly has no gyroscope, and my own older phones have terribly shaky ones, so this is an AWESOME precise alternative.

    I have the PROBOX2 Remote+, which only screws up at severe vertical angles, e.g. looking at the sky or your feet – it then fails to translate movements into exactly the right direction, e.g. up/down might be closer to a diagonal movement until the angle is less extreme again. That’s a shame given I play VERY 3D games – in fact the ONLY game I’m playing with Trinus so far is Tribes Ascend, which is all skiing, jetpacks and projectiles flying all over the place while you’re moving at a stupidly fast speed – the vertigo feeling I got when I trialled it with Tridef 3D is incredible! I love it! But other than the extreme up/down problem, the general precision from the Remote+ is brilliant, and correcting the view is no major issue, as my Steam controller has the mouse pad for easy continual ‘correcting’ of orientation mid-play.

By the way, both of these wireless devices are SEVERE problems for the 2.4GHz spectrum available for WiFi – their competition kills my WiFi stone dead. I literally turn on the mouse (which inevitably is immediately transmitting constant orientation->mouse position data, as opposed to controller buttons, which only transmit data when pressed) and watch my frame rate plummet to unplayable, or even get booted off the WiFi connection altogether… hence the inherent need for 5GHz! WiFi Analyzer on my wife’s Sony Xperia XZ shows clear air on 5GHz, too… I can practically breathe the freedom… ^_^

Also, I’ve witnessed the phone decoding resource intensity. One phone literally overheated at high res, crashing all over the place 😀 – mind you, I was trying on a VERY hot August day…

Capture is my current bottleneck: it’s only maxing 70fps (so when it inevitably drops briefly, it’s still playably high and feels like no interruption) with the game res set as low as 800 x 600 – though when it’s IN YO FACE, that’s not such an issue at all. 40fps – and unpleasantly low when it briefly drops – is too slow for e.g. 1280 x 1024, which is a shame given that odd aspect ratio is as close to square as I can get for fake 3D mode.

    Next thing for me to try is moving back from General mode to Game mode, which earlier in my random testing used to trigger crashes for me, though that could have been due to the old game, old graphics card or other interfering overlay programs like Steam Big Picture and f.lux.

    Also, what’s the deal with the aspect ratio? Fake 3D takes my 4:3 and delivers what seems like 1:1 to each eye… is there a way of keeping it 4:3? I’d be happy with unused pixels at the top & bottom, my headset FOV is nice and wide and the game would appear more natural that way… Am I missing a setting somewhere?

    Ah, I see, that’s part of lens calibration – I’m on it… 🙂

    #7221

    JustCallMeDanTheMan
    Participant

    Lenses nicely calibrated, so time to adjust the FOV of my game too 🙂

Game mode worked great (I disabled f.lux and didn’t use Steam Big Picture): it was perfectly stable and had higher frame rates at higher resolutions. Definitely more playable than General mode… but the latency became noticeable at higher resolutions instead. I guess it takes time to capture, encode and decode all those extra pixels! At 800×600 the latency was imperceptible, and that’s more important to me in a fast-paced game – it still looked great, which is very surprising given my previous monitor experiences. Must be the IN YO FACE effect ^_^

