Merge develop

This commit is contained in:
Roy Nieterau 2023-03-29 14:46:41 +02:00
parent 6405700ed9
commit 669a2256ef
248 changed files with 13531 additions and 8058 deletions

View file

@ -27,4 +27,4 @@ import TabItem from '@theme/TabItem';
- for more details on how to use it go [here](admin_use#check-for-mongodb-database-connection)
## OPENPYPE_USERNAME
- if set it overrides system created username

View file

@ -6,13 +6,13 @@ sidebar_label: Maya
## Publish Plugins
### Render Settings Validator
`ValidateRenderSettings`
Render Settings Validator is here to make sure artists will submit renders
with the correct settings. Some of these settings are needed by OpenPype but some
can be defined by the admin using [OpenPype Settings UI](admin_settings.md).
OpenPype enforced settings include:
@ -36,10 +36,9 @@ For **Renderman**:
For **Arnold**:
- there shouldn't be `<renderpass>` token when merge AOVs option is turned on
Additional check can be added via Settings - **Project Settings > Maya > Publish plugin > ValidateRenderSettings**.
You can add as many options as you want for every supported renderer. In the first field put the node type and attribute,
and in the second the required value. You can add multiple values for an attribute, but when repairing, the first value in the list is the one that gets selected.
![Settings example](assets/maya-admin_render_settings_validator.png)
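The validation and repair behaviour described above can be sketched in plain Python (hypothetical helper names — this is not the plugin's actual code):

```python
# Hypothetical sketch of the per-attribute check described above;
# ValidateRenderSettings' real implementation lives in OpenPype itself.

def is_attribute_valid(current_value, required_values):
    """A scene value passes if it matches any of the configured values."""
    return current_value in required_values


def repair_value(required_values):
    """On repair, the first value in the list is the one applied."""
    return required_values[0]
```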
@ -51,7 +50,11 @@ just one instance of this node type but if that is not so, validator will go thr
instances and check the value there. Node type for **VRay** settings is `VRaySettingsNode`, for **Renderman**
it is `rmanGlobals`, for **Redshift** it is `RedshiftOptions`.
:::info getting attribute values
If you do not know what an attribute's value is supposed to be, for example for a dropdown menu (enum), try changing the attribute and check the Script Editor, which should log what the attribute was set to.
:::
### Model Name Validator
`ValidateRenderSettings`
@ -95,7 +98,7 @@ You can set various aspects of scene submission to farm with per-project setting
- **Optional** will mark submission plugin optional
- **Active** will enable/disable plugin
- **Tile Assembler Plugin** will set what should be used to assemble tiles on Deadline. Either **Open Image IO** will be used
or Deadline's **Draft Tile Assembler**.
- **Use Published scene** enables rendering from the published scene instead of the scene in the work area. Rendering from published files is much safer.
- **Use Asset dependencies** will mark job pending on farm until asset dependencies are fulfilled - for example Deadline will wait for scene file to be synced to cloud, etc.
@ -107,6 +110,41 @@ or Deadlines **Draft Tile Assembler**.
This is useful for fixing specific renderer glitches and advanced hacking of Maya scene files. `Patch name` is a label for the patch for easier orientation.
`Patch regex` is the regex used to find a line in the file, after which the `Patch line` string is inserted. Note that you need to add the line ending.
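Conceptually the patching works like this (a minimal sketch with a hypothetical helper name, not OpenPype's actual implementation):

```python
import re


def apply_scene_patch(scene_text, patch_regex, patch_line):
    """Insert patch_line after the first line matching patch_regex.

    Hypothetical sketch of the described behaviour; note that patch_line
    must include its own line ending.
    """
    lines = scene_text.splitlines(keepends=True)
    pattern = re.compile(patch_regex)
    for index, line in enumerate(lines):
        if pattern.search(line):
            lines.insert(index + 1, patch_line)
            break
    return "".join(lines)
```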
### Extract Playblast Settings (review)
These settings provide granular control over how the playblasts or reviews are produced in Maya.
Some of these settings are also available on the instance itself, in which case these settings will become the default value when creating the review instance.
![Extract Playblast Settings](assets/maya-admin_extract_playblast_settings.png)
- **Compression type** which file encoding to use.
- **Data format** what format is the file encoding.
- **Quality** lets you control the compression value for the output. Results can vary depending on the compression you selected. Quality values can range from 0 to 100, with a default value of 95.
- **Background Color** the viewport's background color.
- **Background Bottom** the viewport's bottom background color.
- **Background Top** the viewport's top background color.
- **Override display options** override the viewport's display options with what is set in the settings.
- **Isolate view** isolates the view to what is in the review instance. If only a camera is present in the review instance, all nodes are displayed in view.
- **Off Screen** records the playblast hidden from the user.
- **2D Pan/Zoom** enables the 2D Pan/Zoom functionality of the camera.
- **Renderer name** which renderer to use for playblasting.
- **Width** width of the output resolution. If this value is `0`, the asset's width is used.
- **Height** height of the output resolution. If this value is `0`, the asset's height is used.
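The fallback behaviour of `Width` and `Height` can be sketched as follows (hypothetical helper name):

```python
def resolve_review_resolution(width, height, asset_width, asset_height):
    # A value of 0 means "use the asset's resolution" for that dimension.
    return (width or asset_width, height or asset_height)
```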
#### Viewport Options
Most settings to override in the viewport are self-explanatory and can be found in Maya.
![Extract Playblast Settings](assets/maya-admin_extract_playblast_settings_viewport_options.png)
- **Override Viewport Options** enable to use the settings below for the viewport when publishing the review.
#### Camera Options
These options are set on the camera shape when publishing the review. They correspond to attributes on the Maya camera shape node.
![Extract Playblast Settings](assets/maya-admin_extract_playblast_settings_camera_options.png)
## Custom Menu
You can add your custom tools menu into Maya by extending definitions in **Maya -> Scripts Menu Definition**.
![Custom menu definition](assets/maya-admin_scriptsmenu.png)
@ -142,7 +180,7 @@ Fill in the necessary fields (the optional fields are regex filters)
![new place holder](assets/maya-placeholder_new.png)
- Builder type: Whether the placeholder should load current asset representations or linked assets representations
- Representation: Representation that will be loaded (ex: ma, abc, png, etc...)
@ -169,5 +207,3 @@ Fill in the necessary fields (the optional fields are regex filters)
- Build your workfile
![maya build template](assets/maya-build_workfile_from_template.png)

View file

@ -7,12 +7,15 @@ sidebar_label: Working with settings
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
OpenPype stores all of its settings and configuration in the mongo database. To make the configuration as easy as possible we provide a robust GUI where you can access and change everything that is configurable
**Settings** GUI can be started from the tray menu *Admin -> Studio Settings*.
:::important Studio Settings versus Local Settings
Please keep in mind that these settings are set up for the full studio and not per-individual. If you're looking for individual artist settings, you can head to
[Local Settings](admin_settings_local.md) section in the documentation.
:::
## Categories
@ -76,7 +79,7 @@ You can also reset any settings to OpenPype default by doing `right click` and `
Many settings are useful to be adjusted on a per-project basis. To identify project
overrides, they are marked with **orange edge** and **orange labels** in the settings GUI.
The process of setting project overrides is similar to setting the Studio defaults. The key difference is selecting the particular project you want to configure. Those projects can be found on the left-hand side of the Project Settings tab.
In the image below you can see all three overrides at the same time.
1. Deadline has **no changes to the OpenPype defaults** at all — **grey** colour of left bar.

View file

@ -68,7 +68,7 @@ Add `--headless` to run OpenPype without graphical UI (useful on server or on au
`--verbose` `<level>` - change log verbose level of OpenPype loggers.
Level value can be integer in range `0-50` or one of enum strings `"notset" (0)`, `"debug" (10)`, `"info" (20)`, `"warning" (30)`, `"error" (40)`, `"critical" (50)`. Value is stored to `OPENPYPE_LOG_LEVEL` environment variable for next processes.
```shell
openpype_console --verbose debug
```

View file

@ -47,7 +47,7 @@ This is the core functional area for you as a user. Most of your actions will ta
![Menu OpenPype](assets/3dsmax_menu_first_OP.png)
:::note OpenPype Menu
Users should use this menu exclusively for **Opening/Saving** when dealing with work files, not the standard ```File Menu```. Users can still perform file operations via the standard menu, but preferably only quick saves during a work session, not saving actual workfile versions.
:::
## Working With Scene Files
@ -73,7 +73,7 @@ OpenPype correctly names it and add version to the workfile. This basically happ
etc.
Basically, the user is freed from guessing the correct naming and other necessities needed to keep everything in order and managed.
> Note: user still has also other options for naming like ```Subversion```, ```Artist's Note``` but we won't dive into those now.

View file

@ -34,7 +34,7 @@ a correct name. You should use it instead of standard file saving dialog.
In AfterEffects you'll find the tools in the `OpenPype` extension:
![Extension](assets/photoshop_extension.png) <!-- same menu as in PS -->
You can show the extension panel by going to `Window` > `Extensions` > `OpenPype`.
@ -104,7 +104,7 @@ There are currently 2 options of `render` item:
When you want to load existing published work, you can use the `Loader` tool. You can reach it in the extension's panel.
![Loader](assets/photoshop_loader.png) <!-- picture needs to be changed -->
The supported families for loading into AfterEffects are:
@ -128,7 +128,7 @@ Now that we have some content loaded, you can manage which version is loaded. Th
Loaded images have to stay as smart layers in order to be updated. If you rasterize the layer, you can no longer update it to a different version using OpenPype tools.
:::
![Loader](assets/photoshop_manage.png)
You can switch to a previous version of the image or update to the latest.

View file

@ -44,7 +44,7 @@ Because the saving to the network location happens in the background, be careful
`OpenPype > Create`
![Creator](assets/harmony_creator.png)
These are the families supported in Harmony:

View file

@ -231,14 +231,14 @@ All published instances that will replace the place holder must contain unique i
![Create menu](assets/nuke_publishedinstance.png)
The information about these objects is provided by the user by filling in the extra attributes of the Place Holder
![Create menu](assets/nuke_fillingExtraAttributes.png)
### Update Place Holder
This tool allows the user to change the information provided in the extra attributes of the selected Place Holder.
![Create menu](assets/nuke_updatePlaceHolder.png)
@ -250,7 +250,7 @@ This tool imports the template used and replaces the existed PlaceHolders with t
![Create menu](assets/nuke_buildWorfileFromTemplate.png)
#### Result
- Replace `PLACEHOLDER` node in the template with the published instance corresponding to the information provided in extra attributes of the Place Holder
![Create menu](assets/nuke_buildworkfile.png)

View file

@ -230,8 +230,8 @@ Maya settings concerning framerate, resolution and frame range are handled by
OpenPype. If set correctly in Ftrack, Maya will validate that you have the correct fps on
scene save and publish, offering to fix it for you.
For resolution and frame range, use **OpenPype → Set Frame Range** and
**OpenPype → Set Resolution**
## Creating rigs with OpenPype
@ -386,7 +386,7 @@ Lets start with empty scene. First I'll pull in my favorite Buddha model.
there just click on **Reference (abc)**.
Next, I want to be sure that I have same frame range as is set on shot I am working
on. To do this just **OpenPype → Set Frame Range**. This should set Maya timeline to same
values as they are set on shot in *Ftrack* for example.
I have my time set, so lets create some animation. We'll turn Buddha model around for
@ -500,7 +500,7 @@ and for vray:
```
maya/<Layer>/<Layer>
```
Doing **OpenPype → Set Resolution** will set correct resolution on camera.
Scene is now ready for submission and should publish without errors.
@ -516,6 +516,22 @@ In the scene from where you want to publish your model create *Render subset*. P
model subset (Maya set node) under corresponding `LAYER_` set under *Render instance*. During publish, it will submit this render to farm and
after it is rendered, it will be attached to your model subset.
### Tile Rendering
:::note Deadline
This feature is only supported when using Deadline. See [here](module_deadline#openpypetileassembler-plugin) for setup.
:::
On the render instance objectset you'll find:
* `Tile Rendering` - for enabling tile rendering.
* `Tile X` - number of tiles in the X axis.
* `Tile Y` - number of tiles in the Y axis.
When submitting to Deadline, you'll get:
- a tile rendering job for each frame, rendering the tiles from Maya.
- a tile assembly job for each frame, assembling the rendered tiles.
- a job to publish the assembled frames.
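To illustrate, the pixel regions implied by the `Tile X`/`Tile Y` settings can be sketched like this (hypothetical helper, not the actual submission code):

```python
def tile_regions(width, height, tiles_x, tiles_y):
    """Return (left, top, right, bottom) pixel regions, one per tile."""
    regions = []
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            left = tx * width // tiles_x
            right = (tx + 1) * width // tiles_x
            top = ty * height // tiles_y
            bottom = (ty + 1) * height // tiles_y
            regions.append((left, top, right, bottom))
    return regions
```

Each frame then gets one render job per region plus a single assembly job stitching the regions back together.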
## Render Setups
### Publishing Render Setups

View file

@ -6,7 +6,7 @@ sidebar_label: Arnold
## Arnold Scene Source (.ass)
Arnold Scene Source can be published as a single file or a sequence of files, determined by the frame range.
When creating the instance, two objectsets are created; `content` and `proxy`. Meshes in the `proxy` objectset will be the viewport representation when loading as `standin`.
### Arnold Scene Source Proxy Workflow
In order to utilize operators and proxies, the content and proxy nodes need to share the same names (including the shape names). This is done by parenting the content and proxy nodes into separate groups. For example:
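The example hierarchy itself was not captured in this diff, but it presumably looks along these lines (hypothetical group and mesh names — note the shared `model_GEO` name under both groups):

```
|-- content_GRP
|   `-- model_GEO
`-- proxy_GRP
    `-- model_GEO
```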

View file

@ -9,7 +9,9 @@ sidebar_label: Yeti
OpenPype can work with [Yeti](https://peregrinelabs.com/yeti/) in two data modes.
It can handle Yeti caches and Yeti rigs.
## Yeti Caches
### Creating and publishing
Let's start by creating a simple Yeti setup: just one object and a Yeti node. Open a new
empty scene in Maya and create a sphere. Then select the sphere and go **Yeti → Create Yeti Node on Mesh**
@ -44,7 +46,15 @@ You can now publish Yeti cache as any other types. **OpenPype → Publish**. It
create sequence of `.fur` files and `.fursettings` metadata file with Yeti node
setting.
:::note Collect Yeti Cache failure
If you encounter **Collect Yeti Cache** failure during collecting phase, and the error is like
```fix
No object matches name: pgYetiMaya1Shape.cbId
```
```
then it is probably caused by the scene not being saved before publishing.
:::
### Loading
You can load Yeti cache by **OpenPype → Load ...**. Select your cache, right+click on
it and select **Load Yeti cache**. This will create Yeti node in scene and set its
@ -52,26 +62,39 @@ cache path to point to your published cache files. Note that this Yeti node will
be named with same name as the one you've used to publish cache. Also notice that
when you open graph on this Yeti node, all nodes are as they were in publishing node.
## Yeti Rigs
Yeti Rigs work in a similar way to caches, but are more complex and deal with
other data used by Yeti, like geometry and textures.
### Creating and publishing
Yeti Rigs are designed to connect to published models or animation rigs. The workflow gives the Yeti Rig full control over that geometry to do additional things on top of whatever input comes in, e.g. deleting faces, pushing faces in/out, subdividing, etc.
Let's start with a [model](artist_hosts_maya.md#loading-model) or [rig](artist_hosts_maya.md#loading-rigs) loaded into the scene. Here we are using a simple rig.
![Maya - Yeti Simple Rig](assets/maya-yeti_simple_rig.png)
We'll need to prepare the scene a bit. We want some Yeti hair on the ball geometry, so we duplicate the geometry, add the Yeti hair and group everything together.
![Maya - Yeti Hair Setup](assets/maya-yeti_hair_setup.png)
:::note yeti nodes and types
You can use any number of Yeti nodes and types, but they have to have unique names.
:::
Now we need to connect the Yeti Rig with the animation rig. Yeti Rigs work by publishing the attribute connections from their input nodes and reconnecting them later in the pipeline. This means we can only use attribute connections from outside of the Yeti Rig hierarchy. Internal to the Yeti Rig hierarchy, we can use any complexity of node connections. We'll connect the Yeti Rig geometry to the animation rig with the transform and mesh attributes.
![Maya - Yeti Rig Setup](assets/maya-yeti_rig_setup.png)
Now we are ready for publishing. Select the Yeti Rig group (`rig_GRP`) and
create *Yeti Rig instance* - **OpenPype → Create...** and select **Yeti Rig**.
Leave `Use selection` checked.
The last step is to add our geometry to the rig instance, so middle+drag its
geometry to `input_SET` under the `yetiRigMain` set representing the rig instance.
Note that its name can differ, as it is based on your subset name.
![Maya - Yeti Publish Setup](assets/maya-yeti_publish_setup.png)
You can have any number of nodes in the Yeti Rig, but only nodes with incoming attribute connections from outside of the Yeti Rig hierarchy are needed in the `input_SET`.
Save your scene and we're ready to publish our new simple Yeti Rig!
@ -81,28 +104,14 @@ the beginning of your timeline. It will also collect all textures used in Yeti
node, copy them to publish folder `resource` directory and set *Image search path*
of published node to this location.
### Loading
You can load published Yeti Rigs in OpenPype with **OpenPype → Load ...**,
select your Yeti rig and right+click on it. In the context menu you should see
**Load Yeti Rig** item (among others).
To connect the Yeti Rig with published animation, we'll load in the animation and use the Inventory to establish the connections.
![Maya - Yeti Publish Setup](assets/maya-yeti_load_connections.png)
The Yeti Rig should now be following the animation. :tada:

View file

@ -75,7 +75,7 @@ enabled instances, you could see more information after clicking on `Details` ta
![Image instances creates](assets/photoshop_publish_validations.png)
In this dialog you can see publishable instances in the left column, triggered plugins in the middle and logs in the right column.
In the left column you can see that a `review` instance was created automatically. This instance flattens all publishable instances (or all visible layers if no publishable instances were created) into a single image, which can serve as a single reviewable element (for example in Ftrack).

View file

@ -2,7 +2,7 @@
id: artist_tools_sync_queue
title: Sync Queue
sidebar_label: Sync Queue
description: Track sites synchronization progress.
---
# Sync Queue

View file

@ -24,8 +24,8 @@ It's up to the Loaders to read these values and apply the correct expected color
### Keys
- **colorspace** - string value used in other publish plugins and loaders
- **config** - stores two versions of the path:
- **path** - formatted path with the platform root baked in. It is used in case we need to find out where the color config was sourced from during publishing.
- **template** - unformatted template resolved from settings. It is used by other plugins targeted at remote publishing, which could be processed on a different platform.
### Example
{
@ -63,7 +63,7 @@ It's up to the Loaders to read these values and apply the correct expected color
- set the `OCIO` environment variable before launching the host via a prelaunch hook
- or (if the host allows) to set the workfile OCIO config path using the host's API
3. Each Extractor exporting pixel data (e.g. image or video) has to inherit from the mixin class `openpype.pipeline.publish.publish_plugins.ColormanagedPyblishPluginMixin` and use `self.set_representation_colorspace` on the representations to be integrated.
The **set_representation_colorspace** method adds `colorspaceData` to the representation. If the `colorspace` passed is not `None` then it is added directly to the representation with resolved config path otherwise a color space is assumed using the configured file rules. If no file rule matches the `colorspaceData` is **not** added to the representation.
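As an illustration of the pattern, here is a self-contained sketch with stub classes standing in for the real OpenPype mixin (the behaviour shown — storing `colorspaceData` when an explicit colorspace is passed — follows the description above; everything else is simplified):

```python
# Stub mirroring the described mixin pattern; the real class lives in
# openpype.pipeline.publish.publish_plugins.
class ColormanagedPyblishPluginMixin:
    def set_representation_colorspace(self, representation, context,
                                      colorspace=None):
        # An explicitly passed colorspace is stored directly on the
        # representation; the real implementation would otherwise consult
        # the configured file rules (omitted in this sketch).
        if colorspace is not None:
            representation["colorspaceData"] = {
                "colorspace": colorspace,
                "config": context.get("config", {}),
            }


class ExtractImageSketch(ColormanagedPyblishPluginMixin):
    """Hypothetical extractor using the mixin on its representation."""

    def process(self, context):
        representation = {"name": "exr", "files": "render.exr"}
        self.set_representation_colorspace(
            representation, context, colorspace="ACEScg")
        return representation
```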

View file

@ -45,10 +45,10 @@ openpype/hosts/{host name}
```
### Launch Hooks
Launch hooks are not directly connected to the host implementation, but they can be used to modify the launch of a process, which may be crucial for the integration. Launch hooks are plugins called when a DCC is launched. They are processed in sequence before and after launch. Prelaunch hooks can change how the DCC process is launched, e.g. change subprocess flags, modify environments or modify launch arguments. If a prelaunch hook crashes, the application is not launched at all. Postlaunch hooks are triggered after the subprocess is launched. They can be used to change statuses in your project tracker, start a timer, etc. Crashed postlaunch hooks have no effect on the rest of the postlaunch hooks or the launched process. Hooks can be filtered by platform, host and application, and their order is defined by an integer value. Hooks inside a host are loaded automatically (one reason why the folder name should match the host name) or can be defined in modules. Hook execution shares the same launch context, where data can be stored and used across multiple hooks (please be very specific in stored keys, e.g. 'project' vs. 'project_name'). For more detailed information look into `openpype/lib/applications.py`.
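A hypothetical prelaunch hook might look like this (the real base classes live in `openpype/lib/applications.py`; the names and attributes here are illustrative only):

```python
# Illustrative sketch only — attribute names approximate the described
# behaviour (integer ordering, filtering, shared launch context).
class PreLaunchHookSketch:
    order = 0  # hooks run in ascending order

    def execute(self, launch_context):
        raise NotImplementedError


class AddDebugEnv(PreLaunchHookSketch):
    order = 10

    def execute(self, launch_context):
        # Hooks share one launch context, so use specific keys
        # (e.g. "project_name" rather than "project").
        launch_context.setdefault("env", {})["MY_STUDIO_DEBUG"] = "1"
```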
### Public interface
The public interface is currently related to launching the DCC. At this moment the only option is to modify environment variables before launch by implementing the function `add_implementation_envs` (it must be available in `openpype/hosts/{host name}/__init__.py`). The function is called after prelaunch hooks, as the last step before the subprocess launch, to be able to set environment variables crucial for proper integration. It is also a good place for functions that are used in prelaunch hooks and in the in-DCC integration. A future plan is to be able to get workfile extensions from here. Right now workfile extensions are hardcoded in `openpype/pipeline/constants.py` under `HOST_WORKFILE_EXTENSIONS`; we would like to handle hosts as addons similar to OpenPype modules, along with more improvements which are now hardcoded.
### Integration
We've prepared a base class `HostBase` in `openpype/host/host.py` to define minimum requirements and provide some default method implementations. The minimum requirement for a host is a `name` attribute; such a host would not be able to do much, but it is valid. To extend functionality we've prepared interfaces that help identify what a host is capable of and whether certain tools can be used with it. For those cases we defined an interface for each workflow. The `IWorkfileHost` interface adds the requirement to implement workfile-related methods, which makes the host usable in combination with the Workfiles tool. The `ILoadHost` interface adds the requirement to be able to load, update, switch or remove referenced representations, which adds support for the Loader and Scene Inventory tools. The `INewPublisher` interface is required to be able to use the host with the new OpenPype publish workflow. This is what must or can be implemented to allow certain functionality. `HostBase` will take on more responsibility, which will be moved out of global variables in the future. This process won't happen at once, but will be gradual to keep backwards compatibility for some time.
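A minimal sketch of what a host declaration could look like under this design (stub interfaces standing in for the real ones in `openpype/host`; the host name is hypothetical):

```python
# Stub interfaces approximating the described design; not OpenPype's code.
class HostBase:
    name = None  # the single hard requirement


class IWorkfileHost:
    """Hosts implementing this can be used with the Workfiles tool."""

    def open_workfile(self, filepath):
        raise NotImplementedError

    def save_workfile(self, filepath):
        raise NotImplementedError


class ExampleHost(HostBase, IWorkfileHost):
    name = "examplehost"

    def open_workfile(self, filepath):
        return "opened {}".format(filepath)

    def save_workfile(self, filepath):
        return "saved {}".format(filepath)
```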

View file

@ -415,7 +415,7 @@ class CreateRender(Creator):
# - 'asset' - asset name
# - 'task' - task name
# - 'variant' - variant
# - 'family' - instance family
# Check if should use selection or not
if pre_create_data.get("use_selection"):

View file

@ -355,7 +355,7 @@ These inputs wraps another inputs into {key: value} relation
{
"type": "text",
"key": "command",
"label": "Command"
}
]
},
@ -420,7 +420,7 @@ How output of the schema could look like on save:
- number input, can be used for both integer and float
- key `"decimal"` defines how many decimal places will be used, 0 is for integer input (Default: `0`)
- key `"minimum"` as minimum allowed number to enter (Default: `-99999`)
- key `"maximum"` as maximum allowed number to enter (Default: `99999`)
- key `"steps"` will change single step value of UI inputs (using arrows and wheel scroll)
- for UI it is possible to show a slider; to enable this option set `show_slider` to `true`
```javascript
@ -602,7 +602,7 @@ How output of the schema could look like on save:
- there are 2 possible ways to set the type:
1.) dictionary with item modifiers (`number` input has `minimum`, `maximum` and `decimals`) in that case item type must be set as value of `"type"` (example below)
2.) item type name as string without modifiers (e.g. [text](#text))
3.) enhancement of 1.) there is also support of `template` type but be careful about endless loop of templates
- the goal of using `template` is to easily change the same item definitions in multiple lists
1.) with item modifiers


@ -57,7 +57,7 @@ Content:
Contains end-to-end testing in a DCC. Currently it is set up to start the DCC application with a prepared workfile, run the publish process and compare results in the DB and file system automatically.
This approach is implemented as it should work in any DCC application and should cover the most common use cases. Not all hosts allow "real headless" publishing, but all hosts should allow triggering the
publish process programmatically when the UI of the host is actually running.
Eventually there will also be the possibility to build a workfile and publish it programmatically; this would work only in DCCs that support it (Maya, Nuke).


@ -4,7 +4,7 @@ title: Ftrack
sidebar_label: Project Manager
---
Ftrack is currently the main project management option for OpenPype. This documentation assumes that you are familiar with Ftrack and its basic principles. If you're new to Ftrack, we recommend having a thorough look at [Ftrack Official Documentation](https://help.ftrack.com/en/).
## Project management
Setting project attributes is the key to a properly working pipeline.


@ -45,6 +45,10 @@ executable. It is recommended to use the `openpype_console` executable as it pro
![Configure plugin](assets/deadline_configure_plugin.png)
### OpenPypeTileAssembler Plugin
To set up tile rendering, copy the `OpenPypeTileAssembler` plugin to the repository:
`[OpenPype]\openpype\modules\deadline\repository\custom\plugins\OpenPypeTileAssembler` > `[DeadlineRepository]\custom\plugins\OpenPypeTileAssembler`
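The copy step can be scripted if you redeploy repositories often; a minimal sketch, assuming only that both root paths exist (the helper name is ours, not part of OpenPype):

```python
import shutil
from pathlib import Path


def install_tile_assembler(openpype_root, deadline_repository):
    """Copy the OpenPypeTileAssembler plugin into a Deadline repository."""
    src = Path(openpype_root).joinpath(
        "openpype", "modules", "deadline", "repository",
        "custom", "plugins", "OpenPypeTileAssembler")
    dst = Path(deadline_repository).joinpath(
        "custom", "plugins", "OpenPypeTileAssembler")
    # Overwrite an existing install in place (requires Python 3.8+)
    shutil.copytree(src, dst, dirs_exist_ok=True)
    return dst
```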
### Pools
The main pools can be configured at `project_settings/deadline/publish/CollectDeadlinePools/primary_pool`, which is applied to the rendering jobs.


@ -8,7 +8,7 @@ import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
Ftrack is currently the main project management option for OpenPype. This documentation assumes that you are familiar with Ftrack and its basic principles. If you're new to Ftrack, we recommend having a thorough look at [Ftrack Official Documentation](http://ftrack.rtd.ftrack.com/en/stable/).
## Prepare Ftrack for OpenPype


@ -7,7 +7,7 @@ sidebar_label: Kitsu
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
Kitsu is a great open source production tracker and can be used for project management instead of Ftrack. This documentation assumes that you are familiar with Kitsu and its basic principles. If you're new to Kitsu, we recommend having a thorough look at [Kitsu Official Documentation](https://kitsu.cg-wire.com/).
## Prepare Kitsu for OpenPype
@ -38,7 +38,20 @@ This functionality cannot deal with all cases and is not error proof, some inter
openpype_console module kitsu push-to-zou -l me@domain.ext -p my_password
```
## Integrate Kitsu Note
Task status can be automatically set during publish thanks to `Integrate Kitsu Note`. This feature can be configured in:
`Admin -> Studio Settings -> Project Settings -> Kitsu -> Integrate Kitsu Note`.
There are four settings available:
- `Set status on note` -> Turns on and off this integrator.
- `Note shortname` -> Which status shortname should be set automatically (Case sensitive).
- `Status change conditions - Status conditions` -> Conditions that need to be met for the Kitsu status to be changed. You can add as many conditions as you like. There are two fields for each condition: `Condition` (whether the current status should be equal or not equal to the condition status) and `Short name` (Kitsu short name of the condition status).
- `Status change conditions - Family requirements` -> With this option you can add requirements about which families must, or must not, be published in order to have the task status set by this integrator. There are two fields for each requirement: `Condition` (same as above) and `Family` (name of the family concerned by this requirement). For instance, adding one item set to `Not equal` and `workfile` means the task status would change if a subset from a family other than workfile is published (workfile can still be included), but not if you publish the workfile subset only.
![Integrate Kitsu Note project settings](assets/integrate_kitsu_note_settings.png)
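How these conditions combine can be sketched in a few lines (illustrative logic only, not the actual integrator code):

```python
def should_set_status(current_status, published_families,
                      status_conditions, family_requirements):
    """Return True when every configured condition is met.

    Each condition is a (condition, value) tuple where condition is
    "equal" or "not_equal".
    """
    for condition, short_name in status_conditions:
        matches = current_status == short_name
        # "equal" requires a match, "not_equal" requires a mismatch
        if (condition == "equal") != matches:
            return False
    for condition, family in family_requirements:
        if condition == "equal" and family not in published_families:
            return False
        # "not_equal" only fails when the named family is the ONLY one published
        if condition == "not_equal" and all(f == family for f in published_families):
            return False
    return True
```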
## Q&A
### Is it safe to rename an entity from Kitsu?
Absolutely! Entities are linked by their unique IDs between the two databases.
But renaming from the OP's Project Manager won't apply the change to Kitsu, it'll be overridden during the next synchronization.


@ -89,7 +89,7 @@ all share the same provider).
Handles files stored on disk storage.
Local drive provider is the most basic one that is used for accessing all standard hard disk storage scenarios. It will work with any storage that can be mounted on your system in a standard way. This could correspond to a physical external hard drive, network mounted storage, internal drive or even VPN connected network drive. It doesn't care about how the drive is mounted, but you must be able to point to it with a simple directory path.
Default sites `local` and `studio` both use local drive provider.



@ -10,7 +10,7 @@ import TabItem from '@theme/TabItem';
Project settings can have project specific values. Each new project uses the studio values defined in the **default** project, but these values can be modified or overridden per project.
:::warning Default studio values
Projects always use default project values unless they have [project override](../admin_settings#project-overrides) (orange colour). Any changes in default project may affect all existing projects.
:::
## Color Management (ImageIO)
@ -39,14 +39,14 @@ Procedure of resolving path (from above example) will look first into path 1st a
### Using File rules
File rules are inspired by [OCIO v2 configuration](https://opencolorio.readthedocs.io/en/latest/guides/authoring/rules.html). Each rule has a unique name which can be overridden by host-specific _File rules_ (example: `project_settings/nuke/imageio/file_rules/rules`).
The _input pattern_ matching uses REGEX syntax (try [regexr.com](https://regexr.com/)). The matching rules are intended to be used during publishing or loading of a representation. Since the publishing procedure runs before the integrator formats the publish template path, make sure the pattern also works for the work render path.
:::warning Colorspace name input
The **colorspace name** value is a raw string input and no validation is run after saving project settings. We recommend opening the specified `config.ocio` file and copy-pasting the exact colorspace names.
:::
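Because the input pattern is plain REGEX, a rule can be sanity-checked outside OpenPype before saving it; the pattern, path and colorspace below are made-up examples:

```python
import re

# Hypothetical file rule: EXR renders whose name contains "beauty" map to ACEScg
file_rule = {
    "pattern": r".*beauty.*\.exr$",
    "colorspace": "ACEScg",
}


def match_colorspace(filepath, rules):
    """Return the colorspace of the first rule whose pattern matches the path."""
    for rule in rules:
        if re.match(rule["pattern"], filepath):
            return rule["colorspace"]
    return None
```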
### Extract OIIO Transcode
OIIOTools transcoder plugin with configurable output presets. Any incoming representation with `colorspaceData` is convertible to single or multiple representations with different target colorspaces or display and viewer names found in linked **config.ocio** file.
`oiiotool` is used for transcoding, so `oiiotool` must be present in `vendor/bin/oiio`, or the environment variable `OPENPYPE_OIIO_PATHS` must be provided for a custom oiio installation.
@ -82,8 +82,8 @@ All context filters are lists which may contain strings or Regular expressions (
- **`tasks`** - Currently processed task. `["modeling", "animation"]`
:::important Filtering
Filters are optional. In case when multiple profiles match current context, profile with higher number of matched filters has higher priority than profile without filters.
(The order of the profiles in settings doesn't matter, only the precision of matching does.)
:::
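The precision-over-order rule can be modelled as counting matched filters (a simplified sketch, not the actual resolution code):

```python
def select_profile(profiles, **context):
    """Pick the profile with the most matched non-empty filters.

    An empty filter matches everything but adds no precision; a non-empty
    filter must contain the context value, otherwise the profile is skipped.
    """
    best, best_score = None, -1
    for profile in profiles:
        score = 0
        for key, value in context.items():
            allowed = profile.get(key) or []
            if not allowed:
                continue          # empty filter: matches, zero precision
            if value not in allowed:
                break             # hard mismatch: skip this profile
            score += 1            # precise match: higher priority
        else:
            if score > best_score:
                best, best_score = profile, score
    return best
```

For example, a profile filtered to `hosts: ["nuke"]` and `families: ["render"]` beats an unfiltered catch-all profile when publishing a render from Nuke, regardless of the order in which they are listed.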
## Publish plugins
@ -94,7 +94,7 @@ Publish plugins used across all integrations.
### Extract Review
Plugin responsible for automatic FFmpeg conversion to a variety of formats.
Extract review uses [profile filtering](#profile-filters) to render different outputs for different situations.
Applicable context filters:
**`hosts`** - Host from which publishing was triggered. `["maya", "nuke"]`
@ -104,7 +104,7 @@ Applicable context filters:
**Output Definitions**
A profile may generate multiple outputs from a single input. Each output must define unique name and output extension (use the extension without a dot e.g. **mp4**). All other settings of output definition are optional.
![global_extract_review_output_defs](assets/global_extract_review_output_defs.png)
- **`Tags`**
@ -118,7 +118,7 @@ Profile may generate multiple outputs from a single input. Each output must defi
- **Output arguments** other FFmpeg output arguments like codec definition.
- **`Output width`** and **`Output height`**
- It is possible to rescale output to specified resolution and keep aspect ratio.
- If value is set to 0, source resolution will be used.
- **`Overscan crop`**
@ -230,10 +230,10 @@ Applicable context filters:
## Tools
Settings for OpenPype tools.
### Creator
Settings related to [Creator tool](artist_tools_creator).
#### Subset name profiles
![global_tools_creator_subset_template](assets/global_tools_creator_subset_template.png)
The subset name helps to identify published content. A more specific name helps with organization and avoids mixing of published content. The subset name is defined using one of the templates defined in **Subset name profiles settings**. The template is filled with context information at the time of creation.
@ -263,10 +263,31 @@ Template may look like `"{family}{Task}{Variant}"`.
Some creators may have other keys as their context may require more information or more specific values. Make sure you've read the documentation of the host you're using.
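Filling such a template boils down to a string format over the context data; a simplified sketch, assuming (as the `{Task}` spelling suggests) that a capitalized key capitalizes the filled value:

```python
def fill_subset_name(template, **data):
    """Fill a subset name template, adding a Capitalized variant of each key."""
    fill_data = dict(data)
    for key, value in data.items():
        fill_data[key.capitalize()] = value.capitalize()
    return template.format(**fill_data)
```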
### Publish
#### Custom Staging Directory Profiles
With this feature, users can specify a custom data folder path based on presets, which can be used during the creation and publishing stages.
![global_tools_custom_staging_dir](assets/global_tools_custom_staging_dir.png)
Staging directories are used as a destination for intermediate files (such as renders) before they are renamed and copied to the proper location during the integration phase. They can be created completely dynamically in the temp folder or, for some DCCs, in the `work` area.
An example could be Nuke, where an artist might want to temporarily render pictures into the `work` area to check them before they get published with the "Use existing frames" choice on the write node.
One of the key advantages of this feature is that it allows users to choose the folder for writing such intermediate files to take advantage of faster storage for rendering, which can help improve workflow efficiency. Additionally, this feature allows users to keep their intermediate extracted data persistent, and use their own infrastructure for regular cleaning.
In some cases these DCCs (Nuke, Houdini, Maya) automatically add a rendering path during the creation stage, which is then used in publishing. Creators and extractors of such DCCs need to use these profiles to fill paths in the DCC's nodes to use this functionality.
The custom staging folder uses a path template configured in `project_anatomy/templates/others`, with `transient` being a default example path that could be used. The template requires a `folder` key for it to be usable as a custom staging folder.
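For example, a `transient` template in `project_anatomy/templates/others` could look like the fragment below (the folder path itself is only an illustration; any template with a `folder` key works):

```javascript
{
    "others": {
        "transient": {
            "folder": "{root[work]}/{project[name]}/fast_storage/{asset}/{task[name]}"
        }
    }
}
```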
##### Known issues
- Any DCC that uses prefilled paths and stores them inside workfile nodes needs to implement resolving these paths with the configured profiles.
- If the studio uses Site Sync, remote artists need to have access to the configured custom staging folder!
- Each node on the rendering farm must have access to the configured custom staging folder!
### Workfiles
All settings related to Workfile tool.
#### Open last workfile at launch
This feature allows you to define a rule for each task/host, or toggle the feature globally for all tasks, as visible in the picture.
![global_tools_workfile_open_last_version](assets/global_tools_workfile_open_last_version.png)


@ -10,7 +10,7 @@ import TabItem from '@theme/TabItem';
Project settings can have project specific values. Each new project uses the studio values defined in the **default** project, but these values can be modified or overridden per project.
:::warning Default studio values
Projects always use default project values unless they have [project override](../admin_settings#project-overrides) (orange colour). Any changes in default project may affect all existing projects.
:::
## Workfile Builder


@ -10,7 +10,7 @@ import TabItem from '@theme/TabItem';
Project settings can have project specific values. Each new project uses the studio values defined in the **default** project, but these values can be modified or overridden per project.
:::warning Default studio values
Projects always use default project values unless they have [project override](../admin_settings#project-overrides) (orange colour). Any changes in default project may affect all existing projects.
:::
## Creator Plugins


@ -8,7 +8,7 @@ import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
Ftrack is currently the main project management option for Pype. This documentation assumes that you are familiar with Ftrack and its basic principles. If you're new to Ftrack, we recommend having a thorough look at [Ftrack Official Documentation](http://ftrack.rtd.ftrack.com/en/stable/).
## Prepare Ftrack for Pype


@ -36,7 +36,7 @@ All context filters are lists which may contain strings or Regular expressions (
- **families** - Main family of processed instance. `["plate", "model"]`
:::important Filtering
Filters are optional and may not be set. In case when multiple profiles match current context, profile with filters has higher priority than profile without filters.
:::
#### Profile outputs
@ -293,6 +293,7 @@ If source representation has suffix **"h264"** and burnin suffix is **"client"**
- It is allowed to use [Anatomy templates](admin_config#anatomy) themselves in burnins if they can be filled with available data.
- Additional keys in burnins:
| Burnin key | Description |
| --- | --- |
| frame_start | First frame number. |
@ -303,6 +304,7 @@ If source representation has suffix **"h264"** and burnin suffix is **"client"**
| resolution_height | Resolution height. |
| fps | Fps of an output. |
| timecode | Timecode by frame start and fps. |
| focalLength | **Only available in Maya**<br /><br />Camera focal length per frame. Use syntax `{focalLength:.2f}` for decimal truncating. Eg. `35.234985` with `{focalLength:.2f}` would produce `35.23`, whereas `{focalLength:.0f}` would produce `35`. |
:::warning
`timecode` is specific key that can be **only at the end of content**. (`"BOTTOM_RIGHT": "TC: {timecode}"`)


@ -15,9 +15,9 @@ various usage scenarios.
## Studio Preparation
You can find a detailed breakdown of technical requirements [here](dev_requirements), but in general OpenPype should be able
to operate in most studios fairly quickly. The main obstacles are usually related to workflows and habits, that
might not be fully compatible with what OpenPype is expecting or enforcing. It is recommended to go through artists [key concepts](artist_concepts) to get comfortable with the basics.
Keep in mind that if you run into any workflows that are not supported, it's usually just because we haven't hit
that particular case and it can most likely be added upon request.
