MixCast SDK Frequently Asked Questions

Can I customize the look of the In-VR MixCast Display in my application?

Yes, you are free to override the MixCastCamera prefab or its contents to change its appearance, as long as the main display texture is left untouched for consistency. Be aware that SDK updates may overwrite the prefab, so we advise creating a copy and assigning that copy as the prefab used by the MixCastCameras object.

Can the MixCast output include Image Effects/Post Processing?

Yes, the MixCast output can be enhanced with Image Effects, Post Processing, etc.

Simply add the desired effect components to the Camera component in the MixCast Camera prefab (located at MixCast/Prefabs/MixCast Camera), as you would for any camera. 

Currently, Immediate mode supports all effects, while Buffered mode supports all effects except those requiring access to the depth buffer (Depth of Field, SSAO, etc).

Do I need to change my build pipeline to support MixCast?

The only way that MixCast modifies your project settings is by adding project-wide script defines to enable further capabilities depending on your project's other enabled plugins (SteamVR, Oculus SDK, etc). This process is generally applied automatically on import, but for custom build pipelines, you may need to call ScriptDefineManager.EnforceAppropriateScriptDefines() if you've reset the active Script Defines.
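For a custom build pipeline, that call can be hooked into Unity's standard build preprocessing. A minimal sketch, assuming ScriptDefineManager is reachable from editor code (check the SDK source for its exact namespace):

```csharp
// Editor-only sketch: re-apply MixCast's script defines before a custom build.
#if UNITY_EDITOR
using UnityEditor.Build;
using UnityEditor.Build.Reporting;

class MixCastBuildPreprocessor : IPreprocessBuildWithReport
{
    public int callbackOrder => 0;

    public void OnPreprocessBuild(BuildReport report)
    {
        // Restores the MixCast-related defines if your pipeline has reset
        // the active Script Defines.
        ScriptDefineManager.EnforceAppropriateScriptDefines();
    }
}
#endif
```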

How can I achieve accurate transparency in my application's Mixed Reality output?

You may have noticed some or all of your transparent objects appearing differently in mixed reality when compositing in Buffered Mode. When graphical elements such as meshes or particles sit between the camera and the subject (e.g. the player), they may disappear, lose their original form, or even display black where they should be clear.

Transparency has previously been a challenge for mixed reality, but MixCast provides a fully accurate transparency solution in MixCast 1.4 and up, so long as you can alter your shaders slightly (in a way invisible to the player). Note that this entire scenario applies to Buffered mode, as Immediate mode has already supported full transparency since 1.0!

To achieve full transparency support in Buffered mode, there are 2 steps to follow: 

Step 1
Upgrade your shaders' alpha channel support to Premultiplied Alpha. We've provided replacement shaders for the main particle shaders to get you started:

"Particles/Additive" becomes "Particles/Additive (WriteAlpha PMA)"
"Particles/Alpha Blend" becomes "Particles/AlphaBlend (WriteAlpha PMA)"

These provided shaders also demonstrate the changes (2-3 lines) required in your custom shaders:
If you open one of these shaders (found in Assets/MixCast/Shaders), you'll see that 2 lines at the start of the CGPROGRAM and 1 line at the end of the fragment function are marked with comments. These are the lines to copy into your own shaders. Ensure your shaders take the correct Blend mode from the examples, as it differs between the Additive and AlphaBlend shaders.
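The conversion described above is standard premultiplied-alpha blending. As a hypothetical illustration (not copied from the bundled shaders; open those in Assets/MixCast/Shaders for the exact lines), a conventional alpha-blended shader changes roughly like this:

```shaderlab
// Hypothetical sketch of a premultiplied-alpha conversion; the bundled
// "(WriteAlpha PMA)" shaders contain the exact lines to copy.

// Blend mode: conventional alpha blending...
//   Blend SrcAlpha OneMinusSrcAlpha
// ...becomes premultiplied alpha:
Blend One OneMinusSrcAlpha

// At the end of the fragment function, premultiply color by alpha so the
// output matches the new blend equation and a meaningful alpha is written:
fixed4 frag (v2f i) : SV_Target
{
    fixed4 col = tex2D(_MainTex, i.uv) * i.color;
    col.rgb *= col.a;   // premultiply before output
    return col;
}
```

Additive shaders use a different Blend mode, so copy from whichever bundled variant matches your own shader.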

Step 2
Find the BufferedMixCastCamera component in the MixCast Camera prefab (at MixCast Camera/State/Mode/Buffered) and ensure that the "Premultiplied Alpha" field is checked ON in the Inspector. 

Once these 2 steps have been completed, Buffered Mode should no longer exhibit the visual issues previously described, and you can enjoy accurate virtual rendering in your application's mixed reality feed!

How can I customize the look of MixCast’s output?

The visual appearance of MixCast’s mixed reality output can be customized in several places:

• In the desktop UI: MixCast outputs through a standard desktop uGUI texture. Additional overlay UI elements can be added to the UI to enhance the experience (for example, a level name label), but in general it’s best to avoid cluttering the output.

• On the game camera: The Unity Camera component that renders the application can be found under MixCast/Prefabs/MixCastCamera.prefab/GameCamera/Camera. Here you can adjust the rendering settings or add image effects/post-processing.

• On the feed material: The other aspect of MixCast which can be customized is the treatment of the external feed (the player), by overriding the Material or Shader properties. The material can be found at MixCast/Materials/Camera Feed.mat.

How do I control the Output resolution of MixCast?

MixCast outputs to your application's existing desktop window, so output resolution is simply controlled by the size of that window. We recommend enabling the Resizable Window option in Unity's Player Settings so users can adjust it.
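If you prefer to enforce this from editor code rather than through the Player Settings UI, a minimal sketch (the menu path below is invented for illustration):

```csharp
#if UNITY_EDITOR
using UnityEditor;

static class ResizableWindowSetup
{
    // Hypothetical menu item; the resizableWindow Player Setting is what
    // lets users scale the desktop window (and thus MixCast's output).
    [MenuItem("Tools/MixCast/Enable Resizable Window")]
    static void Enable()
    {
        PlayerSettings.resizableWindow = true;
    }
}
#endif
```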

How does the Player Lighting feature integrate with my application?

By default, the MixCast camera object is equipped with the component SetLightsForInputFeed which passes the Lighting data from the scene to the feed material/shader. This shader, if configured to support lighting (as the provided shaders are), will take that lighting data and alter the camera feed color based on the amount of light that falls on it. Unity's existing Light component is used to represent a light for the player.

You can configure which Unity layer the player is represented on (for filtering which lights affect the player), and how much Point lights and Directional lights each affect the player, via the aforementioned component in the MixCast Camera prefab. Remember also that players have the ability to dampen or disable the lighting effects completely.

The Player lighting feature supports Point lights and Directional lights at this time.
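As a purely hypothetical sketch of how a feed shader can consume that lighting data (the actual contract between SetLightsForInputFeed and the provided shaders is defined by the SDK; the property name below is invented for illustration):

```shaderlab
// Illustrative only: tint the camera feed by the light reaching the player.
// _PlayerLightColor is a hypothetical property standing in for whatever
// accumulated Point/Directional light data SetLightsForInputFeed uploads.
fixed4 _PlayerLightColor;

fixed4 frag (v2f i) : SV_Target
{
    fixed4 feed = tex2D(_MainTex, i.uv);  // external camera feed
    feed.rgb *= _PlayerLightColor.rgb;    // darker/tinted in dim lighting
    return feed;
}
```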

Should the player's Controllers appear in the MixCast output?

We recommend disabling the controller models for the mixed reality view if they mostly maintain the real controller's form factor. This is because the viewer can already see controllers: the physical ones in the player's hands! Disabling the controllers for MixCast also reduces the impact of discrepancies in camera alignment or tracking, since there are fewer reference points for the viewer.

To enable this behaviour, where the controllers are visible to the player but not the mixed reality display, do one of the following:
a) If using SteamVR's SteamVR_RenderModel component to generate the controller mesh(es): Add a SetRenderingControllerForMixCast component next to the RenderModel component and save the prefab/scene.

b) If using standard Unity renderers: Add a SetRenderingForMixCast component on a GameObject which represents the group of Renderers to be controlled. The "Targets" field should automatically populate with the Renderers.

c) If using custom methods: Register for the MixCast.GameRenderStarted and MixCast.GameRenderEnded events and enable/disable the rendering of your custom elements accordingly.

These approaches work for any objects, not just your controllers. Consider whether your application has any other objects only meant for the player (or an independent camera) to see!
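For option (c), a minimal sketch of an event-driven toggle, assuming MixCast.GameRenderStarted and MixCast.GameRenderEnded are parameterless static events (adjust the handler signatures to match the SDK's actual declarations):

```csharp
// Sketch: hide a group of Renderers while MixCast renders the game view,
// so they stay visible to the player but not in the mixed reality output.
using UnityEngine;

public class HideForMixCast : MonoBehaviour
{
    Renderer[] targets;

    void Awake()
    {
        targets = GetComponentsInChildren<Renderer>();
    }

    void OnEnable()
    {
        MixCast.GameRenderStarted += HideTargets;
        MixCast.GameRenderEnded   += ShowTargets;
    }

    void OnDisable()
    {
        MixCast.GameRenderStarted -= HideTargets;
        MixCast.GameRenderEnded   -= ShowTargets;
    }

    void HideTargets() { SetVisible(false); }
    void ShowTargets() { SetVisible(true); }

    void SetVisible(bool visible)
    {
        foreach (var r in targets) r.enabled = visible;
    }
}
```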

Revision #1
Created 1 year ago by Sulli
Updated 1 year ago by Sulli