multiverse documentation: see commit body

- better docstrings for pyblish UI
- updated images
- re-organized Maya content in website by splitting plugin docs and
  sidebar menu (existing links are not broken since the same entry point
  doc exists)
This commit is contained in:
pberto 2022-05-31 14:30:38 +09:00
parent 46ab3b8e3a
commit 8f26e68357
13 changed files with 530 additions and 392 deletions


@ -7,7 +7,32 @@ from openpype.hosts.maya.api.lib import maintained_selection
class ExtractMultiverseLook(openpype.api.Extractor):
"""Extractor for Multiverse USD look data into a Maya Scene."""
"""Extractor for Multiverse USD look data.
This will extract:
- the shading networks that are assigned in MEOW as Maya material overrides
to a Multiverse Compound
- settings for a Multiverse Write Override operation.
Relevant settings are visible in the Maya set node created by a Multiverse
USD Look instance creator.
The input data contained in the set is:
- a single Multiverse Compound node with any number of Maya material
overrides (typically set in MEOW)
Upon publish two files will be written:
- a .usda override file containing material assignment information
- a .ma file containing shading networks
Note: when layering the material assignment override on a loaded Compound,
remember to set a matching attribute override with the namespace of
the loaded compound in order for the material assignment to resolve.
"""
label = "Extract Multiverse USD Look"
hosts = ["maya"]


@ -8,7 +8,21 @@ from openpype.hosts.maya.api.lib import maintained_selection
class ExtractMultiverseUsd(openpype.api.Extractor):
"""Extractor for Multiverse USD asset data."""
"""Extractor for Multiverse USD Asset data.
This will extract settings for a Multiverse Write Asset operation:
they are visible in the Maya set node created by a Multiverse USD
Asset instance creator.
The input data contained in the set is:
- a single hierarchy of Maya nodes. Multiverse supports a variety of Maya
nodes such as transforms, mesh, curves, particles, instances, particle
instancers, pfx, MASH, lights, cameras, joints, connected materials,
shading networks etc. including many of their attributes.
Upon publish a .usd (or .usdz) asset file will be typically written.
"""
label = "Extract Multiverse USD Asset"
hosts = ["maya"]


@ -7,7 +7,21 @@ from openpype.hosts.maya.api.lib import maintained_selection
class ExtractMultiverseUsdComposition(openpype.api.Extractor):
"""Extractor of Multiverse USD Composition."""
"""Extractor of Multiverse USD Composition data.
This will extract settings for a Multiverse Write Composition operation:
they are visible in the Maya set node created by a Multiverse USD
Composition instance creator.
The input data contained in the set is either:
- a single hierarchy consisting of several Multiverse Compound nodes, with
any number of layers, and Maya transform nodes
- a single Compound node with more than one layer (in this case the "Write
as Compound Layers" option should be set).
Upon publish a .usda composition file will be written.
"""
label = "Extract Multiverse USD Composition"
hosts = ["maya"]


@ -7,7 +7,19 @@ from maya import cmds
class ExtractMultiverseUsdOverride(openpype.api.Extractor):
"""Extractor for Multiverse USD Override."""
"""Extractor for Multiverse USD Override data.
This will extract settings for a Multiverse Write Override operation:
they are visible in the Maya set node created by a Multiverse USD
Override instance creator.
The input data contained in the set is:
- a single Multiverse Compound node with any number of overrides (typically
set in MEOW)
Upon publish a .usda override file will be written.
"""
label = "Extract Multiverse USD Override"
hosts = ["maya"]


@ -597,390 +597,3 @@ about customizing review process refer to [admin section](project_settings/setti
If you don't move `modelMain` into `reviewMain`, review will be generated but it will
be published as separate entity.
## Working with Multiverse in OpenPype
OpenPype supports creating, publishing and loading of [Multiverse | USD](
https://multi-verse.io) data. The minimum Multiverse version supported is v6.7,
and version 7.0 is recommended.
In a nutshell it is possible to:
- Create USD Assets, USD compositions, USD Overrides.
This _creates_ OpenPype instances as Maya set nodes that contain information
for published USD data.
- Create Multiverse Looks.
This _creates_ OpenPype instances as Maya set nodes that contain information
for published Maya shading networks data.
- Publish USD Assets, USD compositions and USD Overrides.
This _writes_ USD files to disk and _publishes_ information to the OpenPype
database.
- Publish Multiverse Looks.
This _writes_ Maya files to disk and _publishes_ information to the OpenPype
database.
- Load any USD data into Multiverse "Compound" shape nodes.
This _reads_ USD files (and also Alembic files) into Maya by _streaming_ them
to the viewport.
- Render USD data procedurally with 3Delight<sup>NSI</sup>, Arnold, Redshift,
RenderMan and VRay.
This _reads_ USD files by _streaming_ them procedurally to the renderer at
render time.
USD files written by Multiverse are 100% native USD data; they can be exchanged
with any other DCC application able to interchange USD. Likewise, Multiverse
can read native USD data created by other applications. The following USD extensions are
supported: `.usd` (binary), `.usda` (ASCII) and `.usdz` (zipped, optionally with
textures). Sequences of USD files can also be read via "USD clips".
It is also possible to load Alembic data (`.abc`) in Multiverse Compounds,
further compose and override it in other USD files, and render it procedurally.
Alembic data is always converted on the fly (in memory) to USD data. USD clips
from Alembic data are also supported.
### Configuration
To configure Multiverse in OpenPype, a user with admin privileges needs to set up a new
OpenPype tool in the OpenPype Project Settings, using a configuration similar to
the one depicted here:
![Maya - Multiverse Setup](assets/maya-multiverse_setup.png)
For more information about setting up Multiverse, please refer to the relevant page
of the [Multiverse official documentation](https://multi-verse.io/docs).
### Understanding Assets, Compounds, Compositions, Overrides and Layering
In Multiverse we use some terminology that relates to USD I/O: terms like
"Assets", "Compounds", "Compositions", "Overrides" and "Layering".
Please refer to the [Multiverse Introduction](
https://j-cube.jp/solutions/multiverse/docs/usage/introduction) page of the
official documentation to understand them before reading the next sections.
### Creators
It is possible to create OpenPype "instances" (resulting in Maya set containers)
for publishing Multiverse USD Assets, Compositions, Overrides and Looks.
When creating OpenPype instances for Multiverse USD Asset, Composition,
Override and Look, the creator plug-in will put the relevant selected data into a
Maya set node, which holds the properties used by the Multiverse data writer for
publishing.
You can choose the USD file format in the Creators' set nodes:
- Assets: `.usd` or `.usda` or `.usdz`
- Compositions: `.usd` or `.usda`
- Overrides: `.usd` or `.usda`
- Looks: `.ma`
![Maya - Multiverse Asset Creator](assets/maya-multiverse_openpype_asset_creator.png)
![Maya - Multiverse Composition Creator](assets/maya-multiverse_openpype_composition_creator.png)
![Maya - Multiverse Override Creator](assets/maya-multiverse_openpype_override_creator.png)
![Maya - Multiverse Look Creator](assets/maya-multiverse_openpype_look_creator.png)
### Publishers
Publishers for Multiverse USD Asset, Composition, Override and Look
are available. The first three write USD files to disk, while the Look publisher writes a Maya
file. All of them communicate publish info to the OpenPype database.
![Maya - Multiverse Publisher](assets/maya-multiverse_openpype_publishers.png)
### Loader
The loader creates a Multiverse "Compound" shape node reading the USD file of
choice. All data is _streamed_ to the viewport and not contained in Maya. Thanks
to the various viewport draw options, the user can strategically decide how to
minimize the cost of viewport drawing and effectively load any data; this makes it
possible to bring scenes of virtually unlimited complexity into Maya.
![Maya - Multiverse Loader](assets/maya-multiverse_openpype_loader.png)
:::tip Note
When using the Loader, Multiverse, by design, never "imports" USD data into the
Maya scene as Maya data. Instead, when desired, Multiverse lets you selectively import
specific USD primitives into the Maya scene as Maya data from MEOW.
It also tracks what is being imported, so upon modification it is possible to
write (create & publish) the modified data as a USD file to be layered on
top of its originating Compound. See the [Multiverse Importer](
https://j-cube.jp/solutions/multiverse/docs/usage/importer) documentation.
:::
### Look
In OpenPype a Multiverse Look is a Maya file that contains shading networks that
are assigned to the items of a Multiverse Compound.
Multiverse Looks are typically Maya-referenced in the lighting and shot scenes.
Materials are assigned to the USD items in the Compound via the "material
assignment" information that is output in the lookdev stage by a Multiverse
Override. Once published, the override can be layered on the Compound so that
materials are assigned to items. Finally, an attribute Override on the root
item of the Compound is used to define the `namespace` with which the shading
networks were referenced in Maya. At this point the renderer knows which
material to assign to which item and it is possible to render and edit the
materials as usual. Because the material exists in Maya you can perform IPR and
tune the materials as you please.
As of Multiverse 7 it is also possible to write shading networks inside USD
files: that is achieved by using either the Asset writer (if materials are
defined in the modeling stage) or the Override writer (if materials are
defined in the lookdev or a later stage). Shading networks in USD can then be
rendered in 3Delight<sup>NSI</sup> or used for interchange with DCC apps.
Some interesting consequences of USD shading networks in Multiverse:
1. they can be overridden by a shading network in Maya by assigning in MEOW a
Maya material as an override
2. they are available for assignment in MEOW, so you can assign a USD material
to an item as an override
3. from Hypershade you can use the Multiverse USD shading network write (**File >
Export**) option to write USD shading network libraries, which you can then layer
on an asset and use as in point 2.
### Rendering
Multiverse offers procedural rendering with all the major production renderers:
- 3Delight<sup>NSI</sup>
- Arnold
- Redshift
- RenderMan
- VRay
Procedural rendering effectively means that data is _streamed_ to the renderer
at render time, without the need to store the data in the Maya scene (this
effectively means small `.ma`/`.mb` files that load fast) nor in the renderer's
native scene description file (this effectively means tiny `.nsi` / `.ass`
/ `.vrscene` / `.rib` files that load fast).
This is completely transparent to the user: Multiverse Compound nodes present in
the scene, once a render is launched, will stream data to the renderer in a
procedural fashion.
### Example Multiverse Pipeline and API
An example diagram of the data flow in a Maya pipeline using Multiverse is
available; see the [Multiverse Pipeline](
https://j-cube.jp/solutions/multiverse/docs/pipeline) documentation.
A very easy-to-use Python API for automating any task is available; the API is
user friendly and does not require any knowledge of the vast and complex USD
APIs. See the [Multiverse Python API](
https://j-cube.jp/solutions/multiverse/docs/dev/python-api.html) documentation.
## Working with Yeti in OpenPype
OpenPype can work with [Yeti](https://peregrinelabs.com/yeti/) in two data modes.
It can handle Yeti caches and Yeti rigs.
### Creating and publishing Yeti caches
Let's start by creating a simple Yeti setup: just one object and a Yeti node. Open a new
empty scene in Maya and create a sphere. Then select the sphere and go to **Yeti → Create Yeti Node on Mesh**.
Open the Yeti node graph - **Yeti → Open Graph Editor** - and create a setup like this:
![Maya - Yeti Basic Graph](assets/maya-yeti_basic_setup.jpg)
It doesn't matter what settings you use for now; just select the proper shape in the first
*Import* node. Select your Yeti node and create a *Yeti Cache instance* - **OpenPype → Create...**
and select **Yeti Cache**. Leave `Use selection` checked. You should end up with this setup:
![Maya - Yeti Basic Setup](assets/maya-yeti_basic_setup_outline.jpg)
You can see there is a `yeticacheDefault` set. Instead of *Default* it will carry
whatever name you've entered in the `subset` field during instance creation.
We are almost ready to publish the cache. You can check the basic settings by selecting the
Yeti cache set and opening *Extra attributes* in the Maya **Attribute Editor**.
![Maya - Yeti Basic Setup](assets/maya-yeti_cache_attributes.jpg)
These attributes are mostly self-explanatory:
- `Preroll` - the number of frames the simulation will run before cache frames are stored.
This is useful to "steady" the simulation, for example.
- `Frame Start` - the frame from which we start storing cache files
- `Frame End` - the frame up to which we store cache files
- `Fps` - the fps of the cache
- `Samples` - how many time samples are taken during caching
You can now publish the Yeti cache like any other type - **OpenPype → Publish**. It will
create a sequence of `.fur` files and a `.fursettings` metadata file with the Yeti node
settings.
### Loading Yeti caches
You can load a Yeti cache via **OpenPype → Load ...**. Select your cache, right-click on
it and select **Load Yeti cache**. This will create a Yeti node in the scene and set its
cache path to point to your published cache files. Note that this Yeti node will
have the same name as the one you used to publish the cache. Also notice that
when you open the graph on this Yeti node, all nodes are as they were in the published node.
### Creating and publishing Yeti Rig
Yeti Rigs work in a similar way to caches, but are more complex and deal with
other data used by Yeti, like geometry and textures.
Let's start by [loading](artist_hosts_maya.md#loading-model) some model into a new scene.
I've loaded my Buddha model.
Select the model mesh and create a Yeti node - **Yeti → Create Yeti Node on Mesh** - then
set up a Yeti graph similar to the cache example above.
Then select this Yeti node (mine has the default name `pgYetiMaya1`) and
create a *Yeti Rig instance* - **OpenPype → Create...** and select **Yeti Rig**.
Leave `Use selection` checked.
The last step is to add our model geometry to the rig instance, so middle-drag its
geometry to the `input_SET` under the `yetiRigDefault` set representing the rig instance.
Note that its name can differ and is based on your subset name.
![Maya - Yeti Rig Setup](assets/maya-yeti_rig.jpg)
Save your scene and you are ready to publish our new simple Yeti Rig!
Go to **OpenPype → Publish** and run it. This will publish the rig with its geometry
as a `.ma` scene, save the Yeti node settings and export one frame of Yeti cache from
the beginning of your timeline. It will also collect all textures used in the Yeti
node, copy them to the `resource` directory of the publish folder and set the *Image search path*
of the published node to this location.
:::note Collect Yeti Cache failure
If you encounter a **Collect Yeti Cache** failure during the collecting phase, with an error like
```fix
No object matches name: pgYetiMaya1Shape.cbId
```
then it is probably caused by the scene not being saved before publishing.
:::
### Loading Yeti Rig
You can load published Yeti Rigs like anything else in OpenPype - **OpenPype → Load ...**,
select your Yeti rig and right-click on it. In the context menu you should see
**Load Yeti Cache** and **Load Yeti Rig** items (among others). The first one will
load that single-frame cache. The other one will load the whole rig.
Notice that although we put only geometry into the `input_SET`, the whole hierarchy was
pulled in as well. This allows you to store complex scene elements along with the Yeti
node.
:::tip auto-connecting rig mesh to existing one
If you select some objects before loading the rig, it will try to find shapes
under the selected hierarchies and match them with the shapes loaded with the rig (published
under `input_SET`). This mechanism uses the *cbId* attribute on those shapes.
If a match is found, the shapes are connected via their `outMesh`/`inMesh` attributes. Thus you can easily connect existing animation to the loaded rig.
:::
## Working with Xgen in OpenPype
OpenPype supports publishing and loading of Xgen interactive grooms. You can publish
them as mayaAscii files with scalps that can be loaded into another Maya scene, or as
Alembic caches.
### Publishing Xgen Grooms
To prepare Xgen for publishing, just select all the descriptions that should be published together and then create an Xgen subset in the scene using **OpenPype → Create...** and select **Xgen Interactive**. Leave `Use selection` checked.
To actually publish your groom, go to **OpenPype → Publish** and press ▶. This will export a `.ma` file containing your grooms, with any geometries they are attached to, and also a baked cache in `.abc` format.
:::tip adding more descriptions
You can add multiple Xgen descriptions to the subset you are about to publish, simply by
adding them to the Maya set that was created for you. Please make sure that only Xgen description nodes are present inside the set and not the scalp geometry.
:::
### Loading Xgen
You can use published Xgens by loading them with the OpenPype Loader. You can choose to reference or import the Xgen. We don't have any automatic mesh linking at the moment; it is expected that the groom is published with a scalp, which can then be manually attached to your animated mesh, for example.
The Alembic representation can be loaded too; it contains the groom converted to curves. Keep in mind that the density of the Alembic directly depends on your viewport Xgen density at the point of export.
## Using Redshift Proxies
OpenPype supports working with Redshift Proxy files. You can create a Redshift Proxy from almost
any hierarchy in Maya and everything in it will be included in the proxy. Redshift can export an
animated proxy as one file per frame.
### Creating Redshift Proxy
To mark data for publishing as a Redshift Proxy, select it in Maya, go to **OpenPype → Create ...** and
then select **Redshift Proxy**. You can name your subset and hit the **Create** button.
You can enable animation in the Attribute Editor:
![Maya - Redshift Proxy Creation](assets/maya-create_rs_proxy.jpg)
### Publishing Redshift Proxies
Once data is marked as a Redshift Proxy instance, it can be published - **OpenPype → Publish ...**
### Using Redshift Proxies
Published proxy files can be loaded with the OpenPype Loader. It will create a mesh and attach Redshift Proxy
parameters to it; Redshift will then represent the proxy with a bounding box.
## Using VRay Proxies
OpenPype supports publishing, loading and using VRay Proxies in look management. Their underlying format
can be either `vrmesh` or Alembic.
:::warning vrmesh or alembic and look management
Be aware that **vrmesh** cannot be used with looks as it doesn't retain IDs necessary to map shaders to geometry.
:::
### Creating VRay Proxy
To create a VRay Proxy, select the geometry you want, go to **OpenPype → Create ...** and select **VRay Proxy**. Name your
subset as you like and press the **Create** button.
This will create a `vrayproxy` set for your subset. You can set some options in the Attribute Editor, mainly whether you want
to export animation instead of a single frame.
![Maya - VRay Proxy Creation](assets/maya-vray_proxy.jpg)
### Publishing VRay Proxies
VRay Proxies can be published via **OpenPype → Publish ...**. This will publish the data in VRay's `vrmesh` format and as
an Alembic file.
### Loading VRay Proxies
You can load a VRay Proxy using the loader - **OpenPype → Load ...**
![Maya - VRay Proxy Loader](assets/maya-vray_proxy-loader.jpg)
Select your subset and right-click. Select **Import VRay Proxy (vrmesh)** to import it.
:::note
Note that even though the description states `vrmesh`, if the loader finds an Alembic published alongside (the default behavior), it will
use the `.abc` file instead of the `vrmesh`, as it is more flexible and looks don't work without it.
:::


@ -0,0 +1,237 @@
---
id: artist_hosts_maya_multiverse
title: Multiverse for Maya
sidebar_label: Multiverse USD
---
## Working with Multiverse in OpenPype
OpenPype supports creating, publishing and loading of [Multiverse | USD](
https://multi-verse.io) data. The minimum Multiverse version supported is v6.7,
and version 7.0 is recommended.
In a nutshell it is possible to:
- Create USD Assets, USD compositions, USD Overrides.
This _creates_ OpenPype instances as Maya set nodes that contain information
for published USD data.
- Create Multiverse Looks.
This _creates_ OpenPype instances as Maya set nodes that contain information
for published Maya shading networks data and USD material assignment data.
- Publish USD Assets, USD compositions and USD Overrides.
This _writes_ USD files to disk and _publishes_ information to the OpenPype
database.
- Publish Multiverse Looks.
This _writes_ a Maya file containing shading networks (to import in Maya) and a
USD override file containing material assignment information (to layer in a
Multiverse Compound), copies original & mip-mapped textures to disk, and
_publishes_ information to the OpenPype database.
- Load any USD data into Multiverse "Compound" shape nodes.
This _reads_ USD files (and also Alembic files) into Maya by _streaming_ them
to the viewport.
- Render USD data procedurally with 3Delight<sup>NSI</sup>, Arnold, Redshift,
RenderMan and VRay.
This _reads_ USD files by _streaming_ them procedurally to the renderer at
render time.
USD files written by Multiverse are 100% native USD data; they can be exchanged
with any other DCC application able to interchange USD. Likewise, Multiverse
can read native USD data created by other applications. The following USD extensions are
supported: `.usd` (binary), `.usda` (ASCII) and `.usdz` (zipped, optionally with
textures). Sequences of USD files can also be read via "USD clips".
It is also possible to load Alembic data (`.abc`) in Multiverse Compounds,
further compose and override it in other USD files, and render it procedurally.
Alembic data is always converted on the fly (in memory) to USD data. USD clips
from Alembic data are also supported.
### Configuration
To configure Multiverse in OpenPype, a user with admin privileges needs to set up a new
OpenPype tool in the OpenPype Project Settings, using a configuration similar to
the one depicted here:
![Maya - Multiverse Setup](assets/maya-multiverse_setup.png)
For more information about setting up Multiverse, please refer to the relevant page
of the [Multiverse official documentation](https://multi-verse.io/docs).
### Understanding Assets, Compounds, Compositions, Overrides and Layering
In Multiverse we use some terminology that relates to USD I/O: terms like
"Assets", "Compounds", "Compositions", "Overrides" and "Layering".
Please refer to the [Multiverse Introduction](
https://j-cube.jp/solutions/multiverse/docs/usage/introduction) page of the
official documentation to understand them before reading the next sections.
### Creators
It is possible to create OpenPype "instances" (resulting in Maya set containers)
for publishing Multiverse USD Assets, Compositions, Overrides and Looks.
When creating OpenPype instances for Multiverse USD Asset, Composition,
Override and Look, the creator plug-in will put the relevant selected data into a
Maya set node, which holds the properties used by the Multiverse data writer for
publishing.
You can choose the USD file format in the Creators' set nodes:
- Assets: `.usd` (default) or `.usda` or `.usdz`
- Compositions: `.usda` (default) or `.usd`
- Overrides: `.usda` (default) or `.usd`
- Looks: `.ma`
![Maya - Multiverse Asset Creator](assets/maya-multiverse_openpype_asset_creator.png)
![Maya - Multiverse Composition Creator](assets/maya-multiverse_openpype_composition_creator.png)
![Maya - Multiverse Override Creator](assets/maya-multiverse_openpype_override_creator.png)
![Maya - Multiverse Look Creator](assets/maya-multiverse_openpype_look_creator.png)
### Publishers
Publishers for Multiverse USD Asset, Composition, Override and Look
are available. The first three write USD files to disk, while the Look publisher writes a Maya
file along with the mip-mapped textures. All of them communicate publish info to the
OpenPype database.
![Maya - Multiverse Publisher](assets/maya-multiverse_openpype_publishers.png)
### Loader
The loader creates a Multiverse "Compound" shape node reading the USD file of
choice. All data is _streamed_ to the viewport and not contained in Maya. Thanks
to the various viewport draw options, the user can strategically decide how to
minimize the cost of viewport drawing and effectively load any data; this makes it
possible to bring scenes of virtually unlimited complexity into Maya.
![Maya - Multiverse Loader](assets/maya-multiverse_openpype_loader.png)
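If you need to inspect from a script what the Loader brought in, a small `maya.cmds` query is usually enough. A minimal sketch, assuming the Compound shape node type is called `mvUsdCompoundShape` (the exact type name may differ between Multiverse versions):
```python
from maya import cmds

def list_compounds(compound_type="mvUsdCompoundShape"):
    """Return the transforms of all Multiverse Compound shapes in the scene."""
    # The node type name is an assumption; adjust it to your Multiverse version.
    shapes = cmds.ls(type=compound_type, long=True) or []
    if not shapes:
        return []
    return cmds.listRelatives(shapes, parent=True, fullPath=True) or []

for compound in list_compounds():
    print(compound)
```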
:::tip Note
When using the Loader, Multiverse, by design, never "imports" USD data into the
Maya scene as Maya data. Instead, when desired, Multiverse lets you selectively import
specific USD primitives, or entire hierarchies, into the Maya scene as Maya data
from MEOW. It also tracks what is being imported with a "live connection", so
upon modification it is possible to write (create & publish) the modified data
as a USD file to be layered on top of its originating Compound.
See the [Multiverse Importer](
https://j-cube.jp/solutions/multiverse/docs/usage/importer) documentation.
:::
### Look
In OpenPype a Multiverse Look is the combination of:
- a Maya file that contains the shading networks that were assigned to the items
of a Multiverse Compound.
- a Multiverse USD Override file that contains the material assignment
information (which Maya material was assigned to which USD item)
- mip-mapped textures
Multiverse Look shading networks are typically Maya-referenced in the lighting
and shot scenes.
Materials are assigned to the USD items in the Compound via the "material
assignment" information that is output in the lookdev stage by a Multiverse
Override. Once published, the override can be layered on the Compound so that
materials are assigned to items. Finally, an attribute Override on the root
item of the Compound is used to define the `namespace` with which the shading
networks were referenced in Maya. At this point the renderer knows which
material to assign to which item and it is possible to render and edit the
materials as usual. Because the material exists in Maya you can perform IPR and
tune the materials as you please.
The Multiverse Look will also publish textures in an optimized mip-map format,
currently supporting the `.tdl` (Texture Delight) mip-map format of the
3Delight<sup>NSI</sup> renderer. Mip-maps are required when the corresponding
option is checked and you are publishing Multiverse Looks with the `final` or `-`
subset, while they are not required with the `WIP` or `test` subsets. Mip-maps
are found automatically as long as they exist alongside the original textures.
Their generation can be automatic when using 3Delight for Maya, or manual by using the `tdlmake`
binary utility.
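If you prefer to pre-generate the `.tdl` files yourself, a small batch script around `tdlmake` does the job. A minimal sketch, assuming `tdlmake` is on the `PATH` and takes an input texture and an output `.tdl` path (check the 3Delight documentation for the exact options of your version):
```python
import os
import subprocess

TEXTURE_EXTENSIONS = (".tif", ".tiff", ".exr", ".png", ".jpg")

def make_tdl_mipmaps(texture_dir):
    """Run tdlmake on every texture in a folder, writing .tdl files alongside."""
    for name in os.listdir(texture_dir):
        if not name.lower().endswith(TEXTURE_EXTENSIONS):
            continue
        source = os.path.join(texture_dir, name)
        target = os.path.splitext(source)[0] + ".tdl"
        # tdlmake <input> <output>; flags vary per 3Delight version.
        subprocess.check_call(["tdlmake", source, target])

# make_tdl_mipmaps("/path/to/published/textures")
```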
### About embedding shading networks in USD
As an alternative, or a complement, to the Multiverse Look, as of Multiverse
7 it is also possible to write shading networks _inside_ USD files: that is
achieved by using either the Asset writer (if materials are defined in the
modeling stage) or the Override writer (if materials are defined in the lookdev
or a later stage).
Some interesting consequences of USD shading networks in Multiverse:
1. they can be overridden by a shading network in Maya by assigning in MEOW a
Maya material as an override
2. they are available for assignment in MEOW, so you can assign a USD material
to an item as an override
3. from Hypershade you can use the Multiverse USD shading network write (**File >
Export**) option to write USD shading network libraries, which you can then layer
on an asset and use as in point 2.
Note that:
- shading networks in USD can currently be rendered with
3Delight<sup>NSI</sup>
- shading networks in USD can be used for interchange with DCC apps; Multiverse
shading networks are written natively with the USD Shade schema
- usdPreviewSurface shading networks are also considered embedded shading
networks, though they are classified separately from non-preview / final-quality
shading networks
- USDZ files use usdPreviewSurface shading networks, and can therefore also be
rendered (with 3Delight<sup>NSI</sup>)
- in case both usdPreviewSurface and final-quality shading networks are present,
the latter will be used for rendering (while the former can be previewed in the viewport)
- it is possible to disable rendering of any embedded shading network via the
corresponding option in the Compound Attribute Editor.
### Rendering
Multiverse offers procedural rendering with all the major production renderers:
- 3Delight<sup>NSI</sup>
- Arnold
- Redshift
- RenderMan
- VRay
Procedural rendering effectively means that data is _streamed_ to the renderer
at render time, without the need to store the data in the Maya scene (this
effectively means small `.ma`/`.mb` files that load fast) nor in the renderer's
native scene description file (this effectively means tiny `.nsi` / `.ass`
/ `.vrscene` / `.rib` files that load fast).
This is completely transparent to the user: Multiverse Compound nodes present in
the scene, once a render is launched, will stream data to the renderer in a
procedural fashion.
### Example Multiverse Pipeline and API
An example diagram of the data flow in a Maya pipeline using Multiverse is
available; see the [Multiverse Pipeline](
https://j-cube.jp/solutions/multiverse/docs/pipeline) documentation.
A very easy-to-use Python API for automating any task is available; the API is
user friendly and does not require any knowledge of the vast and complex USD
APIs. See the [Multiverse Python API](
https://j-cube.jp/solutions/multiverse/docs/dev/python-api.html) documentation.
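As an illustration of what such automation looks like, the sketch below writes a USD Asset from a Maya hierarchy, modelled on how the OpenPype extractor plug-ins drive the API. Treat the class and function names (`TimeOptions`, `AssetWriteOptions`, `WriteAsset`) as assumptions to verify against the API documentation of your Multiverse version:
```python
from maya import cmds
import multiverse  # available inside a Maya session with Multiverse loaded

def write_asset(file_path, root_nodes):
    """Write the given Maya hierarchy to a USD asset file (single frame)."""
    time_opts = multiverse.TimeOptions()                 # default time settings
    write_opts = multiverse.AssetWriteOptions(time_opts)
    multiverse.WriteAsset(file_path, root_nodes, write_opts)

# Example: write the currently selected hierarchy.
# write_asset("/tmp/myAsset.usd", cmds.ls(selection=True, long=True))
```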


@ -0,0 +1,31 @@
---
id: artist_hosts_maya_redshift
title: Redshift for Maya
sidebar_label: Redshift
---
## Working with Redshift in OpenPype
### Using Redshift Proxies
OpenPype supports working with Redshift Proxy files. You can create a Redshift Proxy from almost
any hierarchy in Maya and everything in it will be included in the proxy. Redshift can export an
animated proxy as one file per frame.
### Creating Redshift Proxy
To mark data for publishing as a Redshift Proxy, select it in Maya, go to **OpenPype → Create ...** and
then select **Redshift Proxy**. You can name your subset and hit the **Create** button.
You can enable animation in the Attribute Editor:
![Maya - Redshift Proxy Creation](assets/maya-create_rs_proxy.jpg)
### Publishing Redshift Proxies
Once data is marked as a Redshift Proxy instance, it can be published - **OpenPype → Publish ...**
### Using Redshift Proxies
Published proxy files can be loaded with the OpenPype Loader. It will create a mesh and attach Redshift Proxy
parameters to it; Redshift will then represent the proxy with a bounding box.
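For reference, this is roughly what such a setup looks like when wired by hand with `maya.cmds`. The `RedshiftProxyMesh` node type and its `fileName`/`outMesh` attributes are assumptions based on how Redshift proxies are commonly hooked up in Maya, so verify them against your Redshift version:
```python
from maya import cmds

def attach_rs_proxy(proxy_path, name="rsProxy1"):
    """Create a placeholder mesh driven by a Redshift proxy file (sketch only)."""
    proxy_node = cmds.createNode("RedshiftProxyMesh", name=name + "_rsProxy")
    cmds.setAttr(proxy_node + ".fileName", proxy_path, type="string")

    transform = cmds.createNode("transform", name=name)
    mesh = cmds.createNode("mesh", name=name + "Shape", parent=transform)
    cmds.connectAttr(proxy_node + ".outMesh", mesh + ".inMesh")
    return transform

# attach_rs_proxy("/path/to/published/proxy.rs")
```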


@ -0,0 +1,44 @@
---
id: artist_hosts_maya_vray
title: VRay for Maya
sidebar_label: VRay
---
## Working with VRay in OpenPype
### Using VRay Proxies
OpenPype supports publishing, loading and using VRay Proxies in look management. Their underlying format
can be either `vrmesh` or Alembic.
:::warning vrmesh or alembic and look management
Be aware that **vrmesh** cannot be used with looks as it doesn't retain IDs necessary to map shaders to geometry.
:::
### Creating VRay Proxy
To create a VRay Proxy, select the geometry you want, go to **OpenPype → Create ...** and select **VRay Proxy**. Name your
subset as you like and press the **Create** button.
This will create a `vrayproxy` set for your subset. You can set some options in the Attribute Editor, mainly whether you want
to export animation instead of a single frame.
![Maya - VRay Proxy Creation](assets/maya-vray_proxy.jpg)
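The same options can also be toggled from a script by editing the extra attributes on the instance set. The set name `vrayproxyMain` and the attribute name used below are hypothetical placeholders; list the user-defined attributes first to find the real ones in your OpenPype version:
```python
from maya import cmds

instance_set = "vrayproxyMain"  # hypothetical set name; use your subset's set

# Inspect which extra attributes the creator added to the set.
print(cmds.listAttr(instance_set, userDefined=True))

# Then toggle the animation option (attribute name is an assumption):
# cmds.setAttr(instance_set + ".animation", True)
```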
### Publishing VRay Proxies
VRay Proxies can be published via **OpenPype → Publish ...**. This will publish the data in VRay's `vrmesh` format and as
an Alembic file.
### Loading VRay Proxies
You can load a VRay Proxy using the loader - **OpenPype → Load ...**
![Maya - VRay Proxy Loader](assets/maya-vray_proxy-loader.jpg)
Select your subset and right-click. Select **Import VRay Proxy (vrmesh)** to import it.
:::note
Note that even though the description states `vrmesh`, if the loader finds an Alembic published alongside (the default behavior), it will
use the `.abc` file instead of the `vrmesh`, as it is more flexible and looks don't work without it.
:::


@ -0,0 +1,29 @@
---
id: artist_hosts_maya_xgen
title: Xgen for Maya
sidebar_label: Xgen
---
## Working with Xgen in OpenPype
OpenPype supports publishing and loading of Xgen interactive grooms. You can publish
them as mayaAscii files with scalps that can be loaded into another Maya scene, or as
Alembic caches.
### Publishing Xgen Grooms
To prepare Xgen for publishing, just select all the descriptions that should be published together and then create an Xgen subset in the scene using **OpenPype → Create...** and select **Xgen Interactive**. Leave `Use selection` checked.
To actually publish your groom, go to **OpenPype → Publish** and press ▶. This will export a `.ma` file containing your grooms, with any geometries they are attached to, and also a baked cache in `.abc` format.
:::tip adding more descriptions
You can add multiple Xgen descriptions to the subset you are about to publish, simply by
adding them to the Maya set that was created for you. Please make sure that only Xgen description nodes are present inside the set and not the scalp geometry (a quick scripted check is sketched right after this tip).
:::
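A quick scripted sanity check of the set contents is sketched below. The set name `xgenMain` and the interactive-groom node type `xgmSplineDescription` are assumptions to adapt to your scene:
```python
from maya import cmds

def check_xgen_set(instance_set="xgenMain"):
    """Print set members that do not look like interactive groom descriptions."""
    members = cmds.sets(instance_set, query=True) or []
    for member in members:
        shapes = cmds.listRelatives(member, shapes=True, fullPath=True) or [member]
        if not any(cmds.nodeType(shape) == "xgmSplineDescription" for shape in shapes):
            print("Not an Xgen description: {}".format(member))

# check_xgen_set()
```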
### Loading Xgen
You can use published Xgens by loading them with the OpenPype Loader. You can choose to reference or import the Xgen. We don't have any automatic mesh linking at the moment; it is expected that the groom is published with a scalp, which can then be manually attached to your animated mesh, for example.
The Alembic representation can be loaded too; it contains the groom converted to curves. Keep in mind that the density of the Alembic directly depends on your viewport Xgen density at the point of export.


@ -0,0 +1,108 @@
---
id: artist_hosts_maya_yeti
title: Yeti for Maya
sidebar_label: Yeti
---
## Working with Yeti in OpenPype
OpenPype can work with [Yeti](https://peregrinelabs.com/yeti/) in two data modes.
It can handle Yeti caches and Yeti rigs.
### Creating and publishing Yeti caches
Let's start by creating a simple Yeti setup: just one object and a Yeti node. Open a new
empty scene in Maya and create a sphere. Then select the sphere and go to **Yeti → Create Yeti Node on Mesh**.
Open the Yeti node graph - **Yeti → Open Graph Editor** - and create a setup like this:
![Maya - Yeti Basic Graph](assets/maya-yeti_basic_setup.jpg)
It doesn't matter what settings you use for now; just select the proper shape in the first
*Import* node. Select your Yeti node and create a *Yeti Cache instance* - **OpenPype → Create...**
and select **Yeti Cache**. Leave `Use selection` checked. You should end up with this setup:
![Maya - Yeti Basic Setup](assets/maya-yeti_basic_setup_outline.jpg)
You can see there is a `yeticacheDefault` set. Instead of *Default* it will carry
whatever name you've entered in the `subset` field during instance creation.
We are almost ready to publish the cache. You can check the basic settings by selecting the
Yeti cache set and opening *Extra attributes* in the Maya **Attribute Editor**.
![Maya - Yeti Basic Setup](assets/maya-yeti_cache_attributes.jpg)
These attributes are mostly self-explanatory:
- `Preroll` - the number of frames the simulation will run before cache frames are stored.
This is useful to "steady" the simulation, for example.
- `Frame Start` - the frame from which we start storing cache files
- `Frame End` - the frame up to which we store cache files
- `Fps` - the fps of the cache
- `Samples` - how many time samples are taken during caching
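These values can also be set from a script before publishing. A minimal sketch, assuming the instance set is named `yeticacheDefault` and the extra attributes use camel-case versions of the names above (`frameStart`, `frameEnd`, ...), which may differ in your OpenPype version:
```python
from maya import cmds

instance_set = "yeticacheDefault"

# List the extra attributes the creator actually added to the set.
print(cmds.listAttr(instance_set, userDefined=True))

# Attribute names below are assumptions; adjust them to the list printed above.
# cmds.setAttr(instance_set + ".frameStart", 1001)
# cmds.setAttr(instance_set + ".frameEnd", 1100)
# cmds.setAttr(instance_set + ".samples", 3)
```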
You can now publish the Yeti cache like any other type - **OpenPype → Publish**. It will
create a sequence of `.fur` files and a `.fursettings` metadata file with the Yeti node
settings.
### Loading Yeti caches
You can load a Yeti cache via **OpenPype → Load ...**. Select your cache, right-click on
it and select **Load Yeti cache**. This will create a Yeti node in the scene and set its
cache path to point to your published cache files. Note that this Yeti node will
have the same name as the one you used to publish the cache. Also notice that
when you open the graph on this Yeti node, all nodes are as they were in the published node.
### Creating and publishing Yeti Rig
Yeti Rigs work in a similar way to caches, but are more complex and deal with
other data used by Yeti, like geometry and textures.
Let's start by [loading](artist_hosts_maya.md#loading-model) some model into a new scene.
I've loaded my Buddha model.
Select the model mesh and create a Yeti node - **Yeti → Create Yeti Node on Mesh** - then
set up a Yeti graph similar to the cache example above.
Then select this Yeti node (mine has the default name `pgYetiMaya1`) and
create a *Yeti Rig instance* - **OpenPype → Create...** and select **Yeti Rig**.
Leave `Use selection` checked.
The last step is to add our model geometry to the rig instance, so middle-drag its
geometry to the `input_SET` under the `yetiRigDefault` set representing the rig instance.
Note that its name can differ and is based on your subset name.
![Maya - Yeti Rig Setup](assets/maya-yeti_rig.jpg)
Save your scene and you are ready to publish our new simple Yeti Rig!
Go to **OpenPype → Publish** and run it. This will publish the rig with its geometry
as a `.ma` scene, save the Yeti node settings and export one frame of Yeti cache from
the beginning of your timeline. It will also collect all textures used in the Yeti
node, copy them to the `resource` directory of the publish folder and set the *Image search path*
of the published node to this location.
:::note Collect Yeti Cache failure
If you encounter a **Collect Yeti Cache** failure during the collecting phase, with an error like
```fix
No object matches name: pgYetiMaya1Shape.cbId
```
then it is probably caused by the scene not being saved before publishing.
:::
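A quick way to catch this before publishing is to check whether your Yeti shapes already carry the `cbId` attribute (the note above suggests it is added once the scene is saved). A minimal sketch, assuming the Yeti shape node type is `pgYetiMaya`:
```python
from maya import cmds

def yeti_shapes_missing_cbid():
    """Return Yeti shape nodes that do not have a cbId attribute yet."""
    missing = []
    for shape in cmds.ls(type="pgYetiMaya", long=True) or []:
        if not cmds.attributeQuery("cbId", node=shape, exists=True):
            missing.append(shape)
    return missing

problem_shapes = yeti_shapes_missing_cbid()
if problem_shapes:
    print("Save the scene before publishing; missing cbId on:", problem_shapes)
```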
### Loading Yeti Rig
You can load published Yeti Rigs like anything else in OpenPype - **OpenPype → Load ...**,
select your Yeti rig and right-click on it. In the context menu you should see
**Load Yeti Cache** and **Load Yeti Rig** items (among others). The first one will
load that single-frame cache. The other one will load the whole rig.
Notice that although we put only geometry into the `input_SET`, the whole hierarchy was
pulled in as well. This allows you to store complex scene elements along with the Yeti
node.
:::tip auto-connecting rig mesh to existing one
If you select some objects before loading the rig, it will try to find shapes
under the selected hierarchies and match them with the shapes loaded with the rig (published
under `input_SET`). This mechanism uses the *cbId* attribute on those shapes.
If a match is found, the shapes are connected via their `outMesh`/`inMesh` attributes. Thus you can easily connect existing animation to the loaded rig.
:::
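Doing the same connection by hand is straightforward once you know which two shapes match. A minimal sketch of the underlying idea (matching by `cbId`, then connecting `outMesh` to `inMesh`); the actual loader logic lives in OpenPype and may differ in details:
```python
from maya import cmds

def get_cbid(shape):
    """Return the cbId value of a shape, or None if the attribute is missing."""
    if cmds.attributeQuery("cbId", node=shape, exists=True):
        return cmds.getAttr(shape + ".cbId")
    return None

def connect_matching(source_shape, target_shape):
    """Drive target_shape's geometry with source_shape's output mesh."""
    source_id = get_cbid(source_shape)
    if source_id is not None and source_id == get_cbid(target_shape):
        cmds.connectAttr(source_shape + ".outMesh",
                         target_shape + ".inMesh", force=True)
```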

Binary file not shown (image updated: 26 KiB → 123 KiB).

Binary file not shown (image updated: 178 KiB → 302 KiB).


@ -19,7 +19,18 @@ module.exports = {
items: [
"artist_hosts_hiero",
"artist_hosts_nuke_tut",
"artist_hosts_maya",
{
type: "category",
label: "Maya",
items: [
"artist_hosts_maya",
"artist_hosts_maya_multiverse",
"artist_hosts_maya_yeti",
"artist_hosts_maya_xgen",
"artist_hosts_maya_vray",
"artist_hosts_maya_redshift",
],
},
"artist_hosts_blender",
"artist_hosts_harmony",
"artist_hosts_houdini",