MixCast Frequently Asked Questions

What Hardware do I need to get started with MixCast?

At a minimum, you’ll need a fairly powerful PC, a compatible camera (see here), and either the Oculus Rift or HTC Vive headset. To take it to the next level, we recommend setting up a green screen; adding a third controller or Vive Tracker also lets you move the camera while broadcasting.

What kinds of cameras are compatible with MixCast?

MixCast currently supports DirectShow-compatible USB devices, which include almost all webcams as well as some digital cameras and capture cards. 

If your device doesn't appear in the dropdown list, or appears but doesn't output an image when selected, let us know! In the meantime, you can try the workaround described here.
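
If you'd like to check independently whether Windows can open your camera through DirectShow at all, a quick test outside MixCast can help narrow down whether the issue is the device or the application. The snippet below is only an illustrative Python/OpenCV sketch (not part of MixCast); the device index 0 is an assumption and may need to be changed for your setup.

```python
# Illustrative sketch only (not part of MixCast): probe a camera through the
# DirectShow backend with OpenCV to confirm the device can deliver frames.
import cv2

DEVICE_INDEX = 0  # assumption: first camera; try 1, 2, ... for other devices

cap = cv2.VideoCapture(DEVICE_INDEX, cv2.CAP_DSHOW)  # force the DirectShow backend
if not cap.isOpened():
    print("DirectShow could not open the device at index", DEVICE_INDEX)
else:
    ok, frame = cap.read()
    if ok:
        h, w = frame.shape[:2]
        print(f"Received a {w}x{h} frame - the device is outputting via DirectShow.")
    else:
        print("Device opened but produced no frame - check drivers or try another index.")
    cap.release()
```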

Which games support MixCast?

A growing number of games support MixCast broadcasting via the MixCast SDK (look for the MixCast logo in the game’s desktop UI). If your favorite game doesn’t support MixCast yet, let the developer know you’re interested in using MixCast with their product. If you’re working on a title yourself, you can download the free SDK right here!

For many more titles you can use MixCast Capture. For a full list of supported and confirmed MixCast-ready titles, see our App Compatibility page!

Are there keyboard controls for any MixCast Studio functions?

Yes! See the list here.

How do I align the virtual image and camera feed?

This setup step is called Camera Alignment, and is performed in MixCast Studio while wearing your VR headset.

  1. Find the Camera object floating in the room.
  2. Point your controller at the middle button below the Camera, labelled Quick Setup, and squeeze the trigger.
  3. Bring the ring of the Vive controller you used to select the button as close to the physical location of the camera as possible. The camera's output is shown in VR; align the ring so it encompasses the center crosshairs on screen as fully as possible, center it on screen, and press the Grip button to capture the camera's position.
  4. Take 1-2 steps back and repeat the following step 4 times:
  5. Hold a controller so its ring is centered in the crosshair presented on-screen. The scale is not important here, just the position. Press the Grip button while in this position.

    You should now have a fully aligned camera feed. Move your controllers around and test that the physical controllers remain overlapping their virtual counterparts as much as possible. If there is a delay between your physical controllers and your virtual controllers aligning, you may want to enable or adjust Buffered Mode.
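
For the curious, the idea behind this procedure can be illustrated with a little pinhole-camera math. The sketch below is only a simplified illustration of the kind of calculation involved, not MixCast's actual solver: the step-3 sample fixes the camera's position, and the later samples, which were held on the on-screen crosshair (the image center), constrain the camera's viewing direction. All numbers are made up.

```python
# Simplified sketch (assumed pinhole model, not MixCast's actual solver):
# estimate the camera's viewing direction from tracked controller samples
# that were held on the image-center crosshair.
import numpy as np

camera_pos = np.array([0.0, 1.5, -2.0])          # from step 3: ring held at the physical camera

# From step 5: hypothetical controller positions captured while centered on the crosshair.
center_samples = np.array([
    [0.05, 1.40, -0.50],
    [-0.03, 1.45, 0.10],
    [0.02, 1.35, 0.60],
    [0.00, 1.50, 1.10],
])

# Each sample should lie on the camera's optical axis, so the normalized
# directions from the camera to the samples should all agree.
dirs = center_samples - camera_pos
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
forward = dirs.mean(axis=0)
forward /= np.linalg.norm(forward)

# Residual angle tells you how consistent the samples were (smaller is better).
residual_deg = np.degrees(np.arccos(np.clip(dirs @ forward, -1.0, 1.0)))
print("Estimated viewing direction:", np.round(forward, 3))
print("Per-sample error (degrees):", np.round(residual_deg, 2))
```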

How do I know when I have a good Camera Alignment result?

You can test your values by standing in the center of the viewable area, facing the Output Display, and holding your controllers in various positions: straight forward, straight out, and straight up (if necessary). If your virtual controllers and real controllers stay overlapping throughout, your alignment settings are correct. NOTE: Your camera’s Field of View value will also affect results.
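
If you want to put a number on "staying overlapping", one way to think about it is reprojection error: project the controller's tracked position into the camera image using the aligned pose and FOV, and compare it with where the physical controller appears in the feed. The code below is a rough pinhole-model illustration with hypothetical numbers, not a MixCast feature.

```python
# Rough illustration (assumed pinhole model): project a tracked controller
# position into the camera image and compare it with where the physical
# controller appears in the camera feed.  A small pixel error across several
# poses indicates a good alignment.
import numpy as np

def project(point_world, cam_pos, cam_rot, fov_deg, width, height):
    """Project a world-space point to pixels for a pinhole camera.
    cam_rot is a 3x3 matrix whose rows are the camera's right/up/forward axes."""
    p = cam_rot @ (point_world - cam_pos)               # transform into camera space
    f = (width / 2) / np.tan(np.radians(fov_deg) / 2)   # focal length in pixels (horizontal FOV)
    x = width / 2 + f * p[0] / p[2]
    y = height / 2 - f * p[1] / p[2]
    return np.array([x, y])

# Hypothetical numbers for illustration only.
cam_pos = np.array([0.0, 1.5, -2.0])
cam_rot = np.eye(3)                                     # camera looking straight down +Z
controller_world = np.array([0.2, 1.3, 0.5])

predicted_px = project(controller_world, cam_pos, cam_rot, fov_deg=60, width=1920, height=1080)
observed_px = np.array([1110.0, 660.0])                 # where the controller shows up in the feed
print("Reprojection error (pixels):", np.linalg.norm(predicted_px - observed_px))
```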

What can I do if the camera image seems delayed relative to the virtual objects (virtual controllers moving before hands)?

Your camera or capture card has “latency”, which can be compensated for by using the Buffered Output mode. The output mode can be set from the Compositing screen in MixCast Studio. Once set, experiment with the value of the Game Delay field (hit Enter after typing in a value) to find one that causes your hands and controllers to move in sync.

Once Buffered mode is enabled, you can also adjust this value with the keyboard in any application using the MixCast SDK while MixCast is active: press the [ key to decrease the delay of the virtual world, and the ] key to increase it.
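
Conceptually, Buffered mode holds the virtual frames back a little so they line up with camera frames that arrive late. The sketch below is only an illustration of that idea (a simple fixed-length frame buffer), not MixCast's actual implementation; the 3-frame delay is an arbitrary example value.

```python
# Illustration only: delay the virtual frames by a fixed number of frames so
# they composite in sync with a camera feed that arrives late.
from collections import deque

GAME_DELAY_FRAMES = 3                      # arbitrary example; tune like the Game Delay field

virtual_buffer = deque(maxlen=GAME_DELAY_FRAMES + 1)

def composite(camera_frame, virtual_frame):
    """Return a (camera, virtual) pair where the virtual frame is delayed."""
    virtual_buffer.append(virtual_frame)
    if len(virtual_buffer) <= GAME_DELAY_FRAMES:
        return None                        # still filling the buffer at startup
    delayed_virtual = virtual_buffer[0]    # oldest buffered virtual frame
    return camera_frame, delayed_virtual

# Example: frame labels stand in for real images.
for i in range(6):
    print(i, composite(f"camera_{i}", f"virtual_{i}"))
```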

How do I determine the Field of View of my camera?

For Mixed Reality cameras: Quick Setup is now the fastest, easiest, and generally most accurate method of determining your camera's Field of View. Put on your VR headset, grab your controllers, and select the "Quick Setup" button below the floating camera object to run through the process.

For Virtual Cameras: Select whichever Field of View gives you your desired results! Values can range from 1-179 degrees.
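
If you prefer to work it out from your camera's specifications, the horizontal field of view of a (distortion-free) lens follows from the focal length and sensor width. The numbers below are example values for illustration, not MixCast settings.

```python
# Example calculation (not a MixCast setting): horizontal FOV from lens specs.
import math

sensor_width_mm = 6.17      # example: a typical 1/2.3" sensor
focal_length_mm = 4.0       # example focal length

h_fov_deg = 2 * math.degrees(math.atan(sensor_width_mm / (2 * focal_length_mm)))
print(f"Horizontal FOV: {h_fov_deg:.1f} degrees")

# Converting between horizontal and vertical FOV for a 16:9 image:
aspect = 16 / 9
v_fov_deg = 2 * math.degrees(math.atan(math.tan(math.radians(h_fov_deg) / 2) / aspect))
print(f"Vertical FOV:   {v_fov_deg:.1f} degrees")
```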

How do I record my MixCast experience to play back or upload later?

Recording application output to video can be handled by whichever screen-capture application you prefer. Some of our favorites are OBS Studio and GeForce Experience.

How do I stream to Facebook Live, Twitch, etc with MixCast?

Streaming to your favorite platform can be handled by whichever screen-capture/streaming application you prefer. Some of our favorites are OBS Studio and GeForce Experience.

What can I do if my 3rd controller is being treated as one of my hands in a VR application?

Try these steps in order until you find success:

  • Hold the Steam button on your hand controllers and select Turn Off Controller, then re-enable the controller by pressing the Steam button again and waving it around until it is detected
  • Unplug the 3rd controller, wait until your hand controllers appear, and then plug it in again
  • Unplug the 3rd controller, close your VR application and SteamVR, open SteamVR, turn on your hand controllers, then plug in your 3rd controller.

What can I do to improve the quality of static subtraction keying?

Your environment is the biggest factor in the quality of static subtraction keying. Here are some suggestions to try (a rough sketch of the underlying idea follows the list):

  • Disable your device's automatic image controls (auto-focus, auto-exposure, auto-white-balance, etc.) in its driver or software. These features generally interfere with isolation mechanisms, although auto-focus can usually stay enabled when chroma keying
  • Check that objects behind you don’t have colors similar to your clothing or skin
  • Ensure that the lighting in the room is unchanging and that your movement in the room doesn’t significantly affect the lighting of the room
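
To give a sense of why these factors matter: static subtraction compares each live frame against a reference shot of the empty room, so anything that changes the background pixels (auto-exposure, shifting light, background colors close to your own) erodes the mask. Below is a rough numpy illustration of that idea; the threshold and image values are placeholders, not MixCast's actual algorithm.

```python
# Rough illustration of static subtraction keying (not MixCast's implementation):
# pixels that differ enough from a reference background frame are kept as "subject".
import numpy as np

THRESHOLD = 30                     # placeholder value; per-channel difference in 8-bit units

def subtraction_mask(live_frame, background_frame, threshold=THRESHOLD):
    """Return a boolean mask that is True where the live frame differs from the background."""
    diff = np.abs(live_frame.astype(np.int16) - background_frame.astype(np.int16))
    return diff.max(axis=-1) > threshold    # compare the largest per-channel difference

# Tiny synthetic example: a 2x2 image where one pixel changed (the "player").
background = np.array([[[120, 130, 125], [118, 131, 126]],
                       [[119, 129, 124], [121, 130, 125]]], dtype=np.uint8)
live = background.copy()
live[0, 1] = [200, 90, 60]         # a pixel now covered by the subject
print(subtraction_mask(live, background))
```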

What do I need to enable camera movement during filming?

Tracked camera movement involves dedicating one controller (or the Vive Tracker) to the camera you want to move. The controller and camera should be fastened together so that the camera's view is unobstructed and the controller's tracking sensors aren't covered. A rigid bond (such as a 3D-printed mount or a mini-tripod clamp) is very important to the accuracy of the tracking over the session. If using a 3rd controller, it needs to be connected via USB cable. To activate tracked movement in MixCast Studio, AFTER Camera Alignment, open the Camera Placement menu and click the “Take Closest” button. Moving that controller should now update the virtual view.

If your tracking is producing jitter in the position or rotation of the camera, you can enable motion smoothing in the UI. Smoothing can also be used with purely virtual cameras (input device set to NONE) to steady your hand motion.
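
For intuition, camera smoothing of this kind is commonly implemented as an exponential moving average on position (with an equivalent blend on rotation). The snippet below is a generic illustration of position smoothing, not MixCast's code; the smoothing factor is an arbitrary example.

```python
# Generic illustration of positional smoothing (not MixCast's implementation):
# an exponential moving average damps frame-to-frame jitter at the cost of a
# little lag behind the true motion.
import numpy as np

SMOOTHING = 0.15                    # arbitrary example: fraction of the gap closed per frame

class SmoothedPosition:
    def __init__(self, initial):
        self.value = np.asarray(initial, dtype=float)

    def update(self, raw):
        """Move a fraction of the way toward the latest raw tracker sample."""
        raw = np.asarray(raw, dtype=float)
        self.value += SMOOTHING * (raw - self.value)
        return self.value

# Jittery tracker samples around a slowly moving camera.
cam = SmoothedPosition([0.0, 1.5, 0.0])
for raw in ([0.02, 1.51, 0.0], [-0.01, 1.49, 0.05], [0.03, 1.52, 0.10]):
    print(np.round(cam.update(raw), 3))
```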

What does Enhanced BG Removal do and how do I use it?

Player cropping allows you to exclude large areas of the real-world camera feed from the mixed reality output without any additional hardware over your existing setup. This allows for greater flexibility in cinematic shots, especially those in motion. It also increases the quality of your other background removal methods by providing additional removal logic.

The player cropping feature can be configured under the Subject menu in MixCast Studio. Player cropping operates by calculating a box around the player using their head, hand, and (approximated) foot positions. In addition to enabling the feature, you can configure in Studio how much padding to add around your subject's head, hands, and feet.
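
As a rough illustration of that idea, the box can be thought of as the axis-aligned bounds of the tracked head, hand, and approximated foot positions, expanded by the configured padding. The sketch below uses made-up positions and padding values; it is not MixCast's actual logic.

```python
# Rough illustration of player cropping (not MixCast's actual logic): build a
# padded axis-aligned box around the tracked head, hands, and approximated feet.
import numpy as np

# Hypothetical tracked positions in metres (x, y, z).
head  = np.array([0.0, 1.70, 0.0])
hands = np.array([[0.45, 1.20, 0.10], [-0.40, 1.35, -0.05]])
feet  = np.array([[0.15, 0.05, 0.0], [-0.15, 0.05, 0.0]])   # approximated below the head

PADDING = 0.25                       # example padding in metres around the subject

points = np.vstack([head[None, :], hands, feet])
box_min = points.min(axis=0) - PADDING
box_max = points.max(axis=0) + PADDING

print("Crop everything outside the box:")
print("  min corner:", np.round(box_min, 2))
print("  max corner:", np.round(box_max, 2))
```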

What does Player Lighting do and how do I use it?

Player Lighting, a feature of MixCast introduced in release 1.4, allows for virtual in-game lights to affect the color of the real player in the mixed reality view. This effect is quite compelling when implemented by developers, and helps blur the seam between virtual and real content. 

To enable Player Lighting, open the Lighting menu in MixCast Studio and set the "Take Lighting" slider to a value higher than 0; this value blends the player from ignoring lighting entirely to fully lit. You can then adjust the "Base Amount" and "Light Power" values to create the desired amount of contrast and brightness on the player. Try picking up the Torch object in the scene to test how a small, powerful light affects the player.
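
Numerically, you can think of the result as the camera pixel being multiplied by a lighting term that blends between "fully unlit" and "fully lit" according to Take Lighting, with Base Amount setting the ambient floor and Light Power scaling the virtual lights' contribution. The formula below is an illustrative approximation of that behaviour, not MixCast's published shader.

```python
# Illustrative approximation only (not MixCast's shader): blend virtual lighting
# onto the camera pixel according to Take Lighting / Base Amount / Light Power.
import numpy as np

def light_player_pixel(camera_rgb, light_rgb, take_lighting, base_amount, light_power):
    """camera_rgb and light_rgb are 0-1 colors; the sliders are 0-1 (light_power may exceed 1)."""
    lit = np.clip(base_amount + np.asarray(light_rgb) * light_power, 0.0, 1.0)
    lighting = (1.0 - take_lighting) * 1.0 + take_lighting * lit   # blend unlit -> fully lit
    return np.asarray(camera_rgb) * lighting

# Example: a warm torch-like light hitting a mid-grey camera pixel.
pixel = light_player_pixel(camera_rgb=[0.5, 0.5, 0.5],
                           light_rgb=[1.0, 0.6, 0.3],
                           take_lighting=0.8, base_amount=0.2, light_power=1.0)
print(np.round(pixel, 3))
```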

Once you've configured your values, just jump into your application of choice. The player in the mixed reality output should now be receiving lights from the application. If your mixed reality output doesn't appear correct when Player Lighting is active in the application, but does in MixCast Studio, first see if the developer has posted any information about the issue. If not, let them know! They may not be aware of the issue or could already be working on it.

What resolution can MixCast output at?

There is no limit to MixCast’s output resolution! Without the need to use split-screen layering, MixCast can output at the highest resolution available to your monitor or video capture device for streaming.

