Merge branch 'develop' into feature/OP-3879_Change-extractor-usage-in-maya
CHANGELOG.md
|
|
@ -1,23 +1,42 @@
|
|||
# Changelog
|
||||
|
||||
## [3.14.2-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
|
||||
## [3.14.2-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
|
||||
|
||||
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.1...HEAD)
|
||||
|
||||
**🆕 New features**
|
||||
|
||||
- Nuke: Build workfile by template [\#3763](https://github.com/pypeclub/OpenPype/pull/3763)
|
||||
- Houdini: Publishing workfiles [\#3697](https://github.com/pypeclub/OpenPype/pull/3697)
|
||||
|
||||
**🚀 Enhancements**
|
||||
|
||||
- SyncServer: Added cli commands for sync server [\#3765](https://github.com/pypeclub/OpenPype/pull/3765)
|
||||
- Maya: move set render settings menu entry [\#3669](https://github.com/pypeclub/OpenPype/pull/3669)
|
||||
- Scene Inventory: Maya add actions to select from or to scene [\#3659](https://github.com/pypeclub/OpenPype/pull/3659)
|
||||
|
||||
**🐛 Bug fixes**
|
||||
|
||||
- Fix - changed format of version string in pyproject.toml [\#3777](https://github.com/pypeclub/OpenPype/pull/3777)
|
||||
- Ftrack status fix typo prgoress -\> progress [\#3761](https://github.com/pypeclub/OpenPype/pull/3761)
|
||||
- Fix version resolution [\#3757](https://github.com/pypeclub/OpenPype/pull/3757)
|
||||
- Maya: `containerise` dont skip empty values [\#3674](https://github.com/pypeclub/OpenPype/pull/3674)
|
||||
|
||||
**🔀 Refactored code**
|
||||
|
||||
- AfterEffects: Use new Extractor location [\#3784](https://github.com/pypeclub/OpenPype/pull/3784)
|
||||
- General: Remove unused teshost [\#3773](https://github.com/pypeclub/OpenPype/pull/3773)
|
||||
- General: Copied 'Extractor' plugin to publish pipeline [\#3771](https://github.com/pypeclub/OpenPype/pull/3771)
|
||||
- General: Create project function moved to client code [\#3766](https://github.com/pypeclub/OpenPype/pull/3766)
|
||||
- General: Move hostdirname functionality into host [\#3749](https://github.com/pypeclub/OpenPype/pull/3749)
|
||||
- Webpublisher: Webpublisher is used as addon [\#3740](https://github.com/pypeclub/OpenPype/pull/3740)
|
||||
- General: Move publish utils to pipeline [\#3745](https://github.com/pypeclub/OpenPype/pull/3745)
|
||||
- Houdini: Define houdini as addon [\#3735](https://github.com/pypeclub/OpenPype/pull/3735)
|
||||
- Flame: Defined flame as addon [\#3732](https://github.com/pypeclub/OpenPype/pull/3732)
|
||||
- Resolve: Define resolve as addon [\#3727](https://github.com/pypeclub/OpenPype/pull/3727)
|
||||
|
||||
**Merged pull requests:**
|
||||
|
||||
- Standalone Publisher: Ignore empty labels, then still use name like other asset models [\#3779](https://github.com/pypeclub/OpenPype/pull/3779)
|
||||
|
||||
## [3.14.1](https://github.com/pypeclub/OpenPype/tree/3.14.1) (2022-08-30)
|
||||
|
||||
|
|
@ -63,6 +82,7 @@
|
|||
|
||||
- General: Move delivery logic to pipeline [\#3751](https://github.com/pypeclub/OpenPype/pull/3751)
|
||||
- General: Host addons cleanup [\#3744](https://github.com/pypeclub/OpenPype/pull/3744)
|
||||
- Webpublisher: Webpublisher is used as addon [\#3740](https://github.com/pypeclub/OpenPype/pull/3740)
|
||||
- Photoshop: Defined photoshop as addon [\#3736](https://github.com/pypeclub/OpenPype/pull/3736)
|
||||
- Harmony: Defined harmony as addon [\#3734](https://github.com/pypeclub/OpenPype/pull/3734)
|
||||
- General: Module interfaces cleanup [\#3731](https://github.com/pypeclub/OpenPype/pull/3731)
|
||||
|
|
@ -93,6 +113,7 @@
|
|||
**🚀 Enhancements**
|
||||
|
||||
- Ftrack: Additional component metadata [\#3685](https://github.com/pypeclub/OpenPype/pull/3685)
|
||||
- Ftrack: Set task status on farm publishing [\#3680](https://github.com/pypeclub/OpenPype/pull/3680)
|
||||
- Ftrack: Set task status on task creation in integrate hierarchy [\#3675](https://github.com/pypeclub/OpenPype/pull/3675)
|
||||
- Maya: Disable rendering of all lights for render instances submitted through Deadline. [\#3661](https://github.com/pypeclub/OpenPype/pull/3661)
|
||||
- General: Optimized OCIO configs [\#3650](https://github.com/pypeclub/OpenPype/pull/3650)
|
||||
|
|
@ -103,43 +124,22 @@
|
|||
- General: Fix finding of last version [\#3656](https://github.com/pypeclub/OpenPype/pull/3656)
|
||||
- General: Extract Review can scale with pixel aspect ratio [\#3644](https://github.com/pypeclub/OpenPype/pull/3644)
|
||||
- Maya: Refactor moved usage of CreateRender settings [\#3643](https://github.com/pypeclub/OpenPype/pull/3643)
|
||||
- General: Hero version representations have full context [\#3638](https://github.com/pypeclub/OpenPype/pull/3638)
|
||||
- Nuke: color settings for render write node is working now [\#3632](https://github.com/pypeclub/OpenPype/pull/3632)
|
||||
- Maya: FBX support for update in reference loader [\#3631](https://github.com/pypeclub/OpenPype/pull/3631)
|
||||
|
||||
**🔀 Refactored code**
|
||||
|
||||
- General: Use client projects getter [\#3673](https://github.com/pypeclub/OpenPype/pull/3673)
|
||||
- Resolve: Match folder structure to other hosts [\#3653](https://github.com/pypeclub/OpenPype/pull/3653)
|
||||
- Maya: Hosts as modules [\#3647](https://github.com/pypeclub/OpenPype/pull/3647)
|
||||
- TimersManager: Plugins are in timers manager module [\#3639](https://github.com/pypeclub/OpenPype/pull/3639)
|
||||
- General: Move workfiles functions into pipeline [\#3637](https://github.com/pypeclub/OpenPype/pull/3637)
|
||||
|
||||
**Merged pull requests:**
|
||||
|
||||
- Deadline: Global job pre load is not Pype 2 compatible [\#3666](https://github.com/pypeclub/OpenPype/pull/3666)
|
||||
- Maya: Remove unused get current renderer logic [\#3645](https://github.com/pypeclub/OpenPype/pull/3645)
|
||||
- Kitsu|Fix: Movie project type fails & first loop children names [\#3636](https://github.com/pypeclub/OpenPype/pull/3636)
|
||||
- fix the bug of failing to extract look when UDIMs format used in AiImage [\#3628](https://github.com/pypeclub/OpenPype/pull/3628)
|
||||
|
||||
## [3.13.0](https://github.com/pypeclub/OpenPype/tree/3.13.0) (2022-08-09)
|
||||
|
||||
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.13.0-nightly.1...3.13.0)
|
||||
|
||||
**🚀 Enhancements**
|
||||
|
||||
- Editorial: Mix audio use side file for ffmpeg filters [\#3630](https://github.com/pypeclub/OpenPype/pull/3630)
|
||||
|
||||
**🐛 Bug fixes**
|
||||
|
||||
- Maya: fix aov separator in Redshift [\#3625](https://github.com/pypeclub/OpenPype/pull/3625)
|
||||
- Fix for multi-version build on Mac [\#3622](https://github.com/pypeclub/OpenPype/pull/3622)
|
||||
- Ftrack: Sync hierarchical attributes can handle new created entities [\#3621](https://github.com/pypeclub/OpenPype/pull/3621)
|
||||
|
||||
**🔀 Refactored code**
|
||||
|
||||
- General: Plugin settings handled by plugins [\#3623](https://github.com/pypeclub/OpenPype/pull/3623)
|
||||
|
||||
## [3.12.2](https://github.com/pypeclub/OpenPype/tree/3.12.2) (2022-07-27)
|
||||
|
||||
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.2-nightly.4...3.12.2)
|
||||
|
|
|
|||
|
|
@ -63,7 +63,7 @@ class OpenPypeVersion(semver.VersionInfo):
|
|||
"""
|
||||
staging = False
|
||||
path = None
|
||||
_VERSION_REGEX = re.compile(r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$") # noqa: E501
|
||||
_VERSION_REGEX = re.compile(r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?") # noqa: E501
|
||||
_installed_version = None
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
|
|
|
|||
|
|
@ -58,7 +58,7 @@ def get_projects(active=True, inactive=False, fields=None):
|
|||
yield project_doc
|
||||
|
||||
|
||||
def get_project(project_name, active=True, inactive=False, fields=None):
|
||||
def get_project(project_name, active=True, inactive=True, fields=None):
|
||||
# Skip if both are disabled
|
||||
if not active and not inactive:
|
||||
return None
|
||||
|
|
|
|||
|
|
@ -2,14 +2,18 @@ import os
|
|||
import sys
|
||||
import six
|
||||
|
||||
import openpype.api
|
||||
from openpype.lib import (
|
||||
get_ffmpeg_tool_path,
|
||||
run_subprocess,
|
||||
)
|
||||
from openpype.pipeline import publish
|
||||
from openpype.hosts.aftereffects.api import get_stub
|
||||
|
||||
|
||||
class ExtractLocalRender(openpype.api.Extractor):
|
||||
class ExtractLocalRender(publish.Extractor):
|
||||
"""Render RenderQueue locally."""
|
||||
|
||||
order = openpype.api.Extractor.order - 0.47
|
||||
order = publish.Extractor.order - 0.47
|
||||
label = "Extract Local Render"
|
||||
hosts = ["aftereffects"]
|
||||
families = ["renderLocal", "render.local"]
|
||||
|
|
@ -53,7 +57,7 @@ class ExtractLocalRender(openpype.api.Extractor):
|
|||
|
||||
instance.data["representations"] = [repre_data]
|
||||
|
||||
ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
|
||||
ffmpeg_path = get_ffmpeg_tool_path("ffmpeg")
|
||||
# Generate thumbnail.
|
||||
thumbnail_path = os.path.join(staging_dir, "thumbnail.jpg")
|
||||
|
||||
|
|
@ -66,7 +70,7 @@ class ExtractLocalRender(openpype.api.Extractor):
|
|||
]
|
||||
self.log.debug("Thumbnail args:: {}".format(args))
|
||||
try:
|
||||
output = openpype.lib.run_subprocess(args)
|
||||
output = run_subprocess(args)
|
||||
except TypeError:
|
||||
self.log.warning("Error in creating thumbnail")
|
||||
six.reraise(*sys.exc_info())
|
||||
|
|
|
|||
|
|
@ -1,13 +1,13 @@
|
|||
import pyblish.api
|
||||
|
||||
import openpype.api
|
||||
from openpype.pipeline import publish
|
||||
from openpype.hosts.aftereffects.api import get_stub
|
||||
|
||||
|
||||
class ExtractSaveScene(pyblish.api.ContextPlugin):
|
||||
"""Save scene before extraction."""
|
||||
|
||||
order = openpype.api.Extractor.order - 0.48
|
||||
order = publish.Extractor.order - 0.48
|
||||
label = "Extract Save Scene"
|
||||
hosts = ["aftereffects"]
|
||||
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
import openpype.api
|
||||
from openpype.pipeline import publish
|
||||
from openpype.hosts.aftereffects.api import get_stub
|
||||
|
||||
|
||||
class RemovePublishHighlight(openpype.api.Extractor):
|
||||
class RemovePublishHighlight(publish.Extractor):
|
||||
"""Clean utf characters which are not working in DL
|
||||
|
||||
Published compositions are marked with unicode icon which causes
|
||||
|
|
@ -10,7 +10,7 @@ class RemovePublishHighlight(openpype.api.Extractor):
|
|||
rendering, add it later back to avoid confusion.
|
||||
"""
|
||||
|
||||
order = openpype.api.Extractor.order - 0.49 # just before save
|
||||
order = publish.Extractor.order - 0.49 # just before save
|
||||
label = "Clean render comp"
|
||||
hosts = ["aftereffects"]
|
||||
families = ["render.farm"]
|
||||
|
|
|
|||
|
|
@ -104,13 +104,6 @@ def install():
|
|||
|
||||
cmds.menuItem(divider=True)
|
||||
|
||||
cmds.menuItem(
|
||||
"Set Render Settings",
|
||||
command=lambda *args: lib_rendersettings.RenderSettings().set_default_renderer_settings() # noqa
|
||||
)
|
||||
|
||||
cmds.menuItem(divider=True)
|
||||
|
||||
cmds.menuItem(
|
||||
"Work Files...",
|
||||
command=lambda *args: host_tools.show_workfiles(
|
||||
|
|
@ -132,6 +125,12 @@ def install():
|
|||
"Set Colorspace",
|
||||
command=lambda *args: lib.set_colorspace(),
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
"Set Render Settings",
|
||||
command=lambda *args: lib_rendersettings.RenderSettings().set_default_renderer_settings() # noqa
|
||||
)
|
||||
|
||||
cmds.menuItem(divider=True, parent=MENU_NAME)
|
||||
cmds.menuItem(
|
||||
"Build First Workfile",
|
||||
|
|
|
|||
|
|
@ -349,21 +349,13 @@ def containerise(name,
|
|||
("id", AVALON_CONTAINER_ID),
|
||||
("name", name),
|
||||
("namespace", namespace),
|
||||
("loader", str(loader)),
|
||||
("loader", loader),
|
||||
("representation", context["representation"]["_id"]),
|
||||
]
|
||||
|
||||
for key, value in data:
|
||||
if not value:
|
||||
continue
|
||||
|
||||
if isinstance(value, (int, float)):
|
||||
cmds.addAttr(container, longName=key, attributeType="short")
|
||||
cmds.setAttr(container + "." + key, value)
|
||||
|
||||
else:
|
||||
cmds.addAttr(container, longName=key, dataType="string")
|
||||
cmds.setAttr(container + "." + key, value, type="string")
|
||||
cmds.addAttr(container, longName=key, dataType="string")
|
||||
cmds.setAttr(container + "." + key, str(value), type="string")
|
||||
|
||||
main_container = cmds.ls(AVALON_CONTAINERS, type="objectSet")
|
||||
if not main_container:
|
||||
|
|
|
|||
openpype/hosts/maya/plugins/inventory/select_containers.py (new file, 46 lines)
|
|
@ -0,0 +1,46 @@
|
|||
from maya import cmds
|
||||
|
||||
from openpype.pipeline import InventoryAction, registered_host
|
||||
from openpype.hosts.maya.api.lib import get_container_members
|
||||
|
||||
|
||||
class SelectInScene(InventoryAction):
|
||||
"""Select nodes in the scene from selected containers in scene inventory"""
|
||||
|
||||
label = "Select in scene"
|
||||
icon = "search"
|
||||
color = "#888888"
|
||||
order = 99
|
||||
|
||||
def process(self, containers):
|
||||
|
||||
all_members = []
|
||||
for container in containers:
|
||||
members = get_container_members(container)
|
||||
all_members.extend(members)
|
||||
cmds.select(all_members, replace=True, noExpand=True)
|
||||
|
||||
|
||||
class HighlightBySceneSelection(InventoryAction):
|
||||
"""Select containers in scene inventory from the current scene selection"""
|
||||
|
||||
label = "Highlight by scene selection"
|
||||
icon = "search"
|
||||
color = "#888888"
|
||||
order = 100
|
||||
|
||||
def process(self, containers):
|
||||
|
||||
selection = set(cmds.ls(selection=True, long=True, objectsOnly=True))
|
||||
host = registered_host()
|
||||
|
||||
to_select = []
|
||||
for container in host.get_containers():
|
||||
members = get_container_members(container)
|
||||
if any(member in selection for member in members):
|
||||
to_select.append(container["objectName"])
|
||||
|
||||
return {
|
||||
"objectNames": to_select,
|
||||
"options": {"clear": True}
|
||||
}
|
||||
|
|
@ -78,6 +78,23 @@ class Context:
|
|||
_project_doc = None
|
||||
|
||||
|
||||
def get_main_window():
|
||||
"""Acquire Nuke's main window"""
|
||||
if Context.main_window is None:
|
||||
from Qt import QtWidgets
|
||||
|
||||
top_widgets = QtWidgets.QApplication.topLevelWidgets()
|
||||
name = "Foundry::UI::DockMainWindow"
|
||||
for widget in top_widgets:
|
||||
if (
|
||||
widget.inherits("QMainWindow")
|
||||
and widget.metaObject().className() == name
|
||||
):
|
||||
Context.main_window = widget
|
||||
break
|
||||
return Context.main_window
|
||||
|
||||
|
||||
class Knobby(object):
|
||||
"""For creating knob which it's type isn't mapped in `create_knobs`
|
||||
|
||||
|
|
@ -2706,32 +2723,25 @@ class DirmapCache:
|
|||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def _duplicate_node_temp():
|
||||
def node_tempfile():
|
||||
"""Create a temp file where node is pasted during duplication.
|
||||
|
||||
This is to avoid using clipboard for node duplication.
|
||||
"""
|
||||
|
||||
duplicate_node_temp_path = os.path.join(
|
||||
tempfile.gettempdir(),
|
||||
"openpype_nuke_duplicate_temp_{}".format(os.getpid())
|
||||
tmp_file = tempfile.NamedTemporaryFile(
|
||||
mode="w", prefix="openpype_nuke_temp_", suffix=".nk", delete=False
|
||||
)
|
||||
|
||||
# This can happen only if 'duplicate_node' would be
|
||||
if os.path.exists(duplicate_node_temp_path):
|
||||
log.warning((
|
||||
"Temp file for node duplication already exists."
|
||||
" Trying to remove {}"
|
||||
).format(duplicate_node_temp_path))
|
||||
os.remove(duplicate_node_temp_path)
|
||||
tmp_file.close()
|
||||
node_tempfile_path = tmp_file.name
|
||||
|
||||
try:
|
||||
# Yield the path where node can be copied
|
||||
yield duplicate_node_temp_path
|
||||
yield node_tempfile_path
|
||||
|
||||
finally:
|
||||
# Remove the file at the end
|
||||
os.remove(duplicate_node_temp_path)
|
||||
os.remove(node_tempfile_path)
|
||||
|
||||
|
||||
def duplicate_node(node):
|
||||
|
|
@ -2740,7 +2750,7 @@ def duplicate_node(node):
|
|||
# select required node for duplication
|
||||
node.setSelected(True)
|
||||
|
||||
with _duplicate_node_temp() as filepath:
|
||||
with node_tempfile() as filepath:
|
||||
# copy selected to temp filepath
|
||||
nuke.nodeCopy(filepath)
|
||||
|
||||
|
|
@ -2815,3 +2825,100 @@ def ls_img_sequence(path):
|
|||
}
|
||||
|
||||
return False
|
||||
|
||||
|
||||
def get_group_io_nodes(nodes):
|
||||
"""Get the input and the output of a group of nodes."""
|
||||
|
||||
if not nodes:
|
||||
raise ValueError("there is no nodes in the list")
|
||||
|
||||
input_node = None
|
||||
output_node = None
|
||||
|
||||
if len(nodes) == 1:
|
||||
input_node = output_node = nodes[0]
|
||||
|
||||
else:
|
||||
for node in nodes:
|
||||
if "Input" in node.name():
|
||||
input_node = node
|
||||
|
||||
if "Output" in node.name():
|
||||
output_node = node
|
||||
|
||||
if input_node is not None and output_node is not None:
|
||||
break
|
||||
|
||||
if input_node is None:
|
||||
raise ValueError("No Input found")
|
||||
|
||||
if output_node is None:
|
||||
raise ValueError("No Output found")
|
||||
return input_node, output_node
|
||||
|
||||
|
||||
def get_extreme_positions(nodes):
|
||||
"""Get the 4 numbers that represent the box of a group of nodes."""
|
||||
|
||||
if not nodes:
|
||||
raise ValueError("there is no nodes in the list")
|
||||
|
||||
nodes_xpos = [n.xpos() for n in nodes] + \
|
||||
[n.xpos() + n.screenWidth() for n in nodes]
|
||||
|
||||
nodes_ypos = [n.ypos() for n in nodes] + \
|
||||
[n.ypos() + n.screenHeight() for n in nodes]
|
||||
|
||||
min_x, min_y = (min(nodes_xpos), min(nodes_ypos))
|
||||
max_x, max_y = (max(nodes_xpos), max(nodes_ypos))
|
||||
return min_x, min_y, max_x, max_y
|
||||
|
||||
|
||||
def refresh_node(node):
|
||||
"""Correct a bug caused by the multi-threading of nuke.
|
||||
|
||||
Refresh the node to make sure that it takes the desired attributes.
|
||||
"""
|
||||
|
||||
x = node.xpos()
|
||||
y = node.ypos()
|
||||
nuke.autoplaceSnap(node)
|
||||
node.setXYpos(x, y)
|
||||
|
||||
|
||||
def refresh_nodes(nodes):
|
||||
for node in nodes:
|
||||
refresh_node(node)
|
||||
|
||||
|
||||
def get_names_from_nodes(nodes):
|
||||
"""Get list of nodes names.
|
||||
|
||||
Args:
|
||||
nodes(List[nuke.Node]): List of nodes to convert into names.
|
||||
|
||||
Returns:
|
||||
List[str]: Name of passed nodes.
|
||||
"""
|
||||
|
||||
return [
|
||||
node.name()
|
||||
for node in nodes
|
||||
]
|
||||
|
||||
|
||||
def get_nodes_by_names(names):
|
||||
"""Get list of nuke nodes based on their names.
|
||||
|
||||
Args:
|
||||
names (List[str]): List of node names to be found.
|
||||
|
||||
Returns:
|
||||
List[nuke.Node]: List of nodes found by name.
|
||||
"""
|
||||
|
||||
return [
|
||||
nuke.toNode(name)
|
||||
for name in names
|
||||
]
|
||||
|
|
|
|||
openpype/hosts/nuke/api/lib_template_builder.py (new file, 220 lines)
|
|
@ -0,0 +1,220 @@
|
|||
from collections import OrderedDict
|
||||
|
||||
import qargparse
|
||||
|
||||
import nuke
|
||||
|
||||
from openpype.tools.utils.widgets import OptionDialog
|
||||
|
||||
from .lib import imprint, get_main_window
|
||||
|
||||
|
||||
# To change as enum
|
||||
build_types = ["context_asset", "linked_asset", "all_assets"]
|
||||
|
||||
|
||||
def get_placeholder_attributes(node, enumerate=False):
|
||||
list_atts = {
|
||||
"builder_type",
|
||||
"family",
|
||||
"representation",
|
||||
"loader",
|
||||
"loader_args",
|
||||
"order",
|
||||
"asset",
|
||||
"subset",
|
||||
"hierarchy",
|
||||
"siblings",
|
||||
"last_loaded"
|
||||
}
|
||||
attributes = {}
|
||||
for attr in node.knobs().keys():
|
||||
if attr in list_atts:
|
||||
if enumerate:
|
||||
try:
|
||||
attributes[attr] = node.knob(attr).values()
|
||||
except AttributeError:
|
||||
attributes[attr] = node.knob(attr).getValue()
|
||||
else:
|
||||
attributes[attr] = node.knob(attr).getValue()
|
||||
|
||||
return attributes
|
||||
|
||||
|
||||
def delete_placeholder_attributes(node):
|
||||
"""Delete all extra placeholder attributes."""
|
||||
|
||||
extra_attributes = get_placeholder_attributes(node)
|
||||
for attribute in extra_attributes.keys():
|
||||
try:
|
||||
node.removeKnob(node.knob(attribute))
|
||||
except ValueError:
|
||||
continue
|
||||
|
||||
|
||||
def hide_placeholder_attributes(node):
|
||||
"""Hide all extra placeholder attributes."""
|
||||
|
||||
extra_attributes = get_placeholder_attributes(node)
|
||||
for attribute in extra_attributes.keys():
|
||||
try:
|
||||
node.knob(attribute).setVisible(False)
|
||||
except ValueError:
|
||||
continue
|
||||
|
||||
|
||||
def create_placeholder():
|
||||
args = placeholder_window()
|
||||
if not args:
|
||||
# operation canceled, no locator created
|
||||
return
|
||||
|
||||
placeholder = nuke.nodes.NoOp()
|
||||
placeholder.setName("PLACEHOLDER")
|
||||
placeholder.knob("tile_color").setValue(4278190335)
|
||||
|
||||
# custom arg parse to force empty data query
|
||||
# and still imprint them on placeholder
|
||||
# and getting items when arg is of type Enumerator
|
||||
options = OrderedDict()
|
||||
for arg in args:
|
||||
if not type(arg) == qargparse.Separator:
|
||||
options[str(arg)] = arg._data.get("items") or arg.read()
|
||||
imprint(placeholder, options)
|
||||
imprint(placeholder, {"is_placeholder": True})
|
||||
placeholder.knob("is_placeholder").setVisible(False)
|
||||
|
||||
|
||||
def update_placeholder():
|
||||
placeholder = nuke.selectedNodes()
|
||||
if not placeholder:
|
||||
raise ValueError("No node selected")
|
||||
if len(placeholder) > 1:
|
||||
raise ValueError("Too many selected nodes")
|
||||
placeholder = placeholder[0]
|
||||
|
||||
args = placeholder_window(get_placeholder_attributes(placeholder))
|
||||
if not args:
|
||||
return # operation canceled
|
||||
# delete placeholder attributes
|
||||
delete_placeholder_attributes(placeholder)
|
||||
|
||||
options = OrderedDict()
|
||||
for arg in args:
|
||||
if not type(arg) == qargparse.Separator:
|
||||
options[str(arg)] = arg._data.get("items") or arg.read()
|
||||
imprint(placeholder, options)
|
||||
|
||||
|
||||
def imprint_enum(placeholder, args):
|
||||
"""
|
||||
Imprint method doesn't act properly with enums.
|
||||
    Replacing the functionality with this for now
|
||||
"""
|
||||
|
||||
enum_values = {
|
||||
str(arg): arg.read()
|
||||
for arg in args
|
||||
if arg._data.get("items")
|
||||
}
|
||||
string_to_value_enum_table = {
|
||||
build: idx
|
||||
for idx, build in enumerate(build_types)
|
||||
}
|
||||
attrs = {}
|
||||
for key, value in enum_values.items():
|
||||
attrs[key] = string_to_value_enum_table[value]
|
||||
|
||||
|
||||
def placeholder_window(options=None):
|
||||
options = options or dict()
|
||||
dialog = OptionDialog(parent=get_main_window())
|
||||
dialog.setWindowTitle("Create Placeholder")
|
||||
|
||||
args = [
|
||||
qargparse.Separator("Main attributes"),
|
||||
qargparse.Enum(
|
||||
"builder_type",
|
||||
label="Asset Builder Type",
|
||||
default=options.get("builder_type", 0),
|
||||
items=build_types,
|
||||
help="""Asset Builder Type
|
||||
                    Builder type describes what the template loader will look for.
|
||||
|
||||
context_asset : Template loader will look for subsets of
|
||||
current context asset (Asset bob will find asset)
|
||||
|
||||
linked_asset : Template loader will look for assets linked
|
||||
to current context asset.
|
||||
                    Linked assets are looked up in the OpenPype database under the field "inputLinks"
|
||||
"""
|
||||
),
|
||||
qargparse.String(
|
||||
"family",
|
||||
default=options.get("family", ""),
|
||||
label="OpenPype Family",
|
||||
placeholder="ex: image, plate ..."),
|
||||
qargparse.String(
|
||||
"representation",
|
||||
default=options.get("representation", ""),
|
||||
label="OpenPype Representation",
|
||||
placeholder="ex: mov, png ..."),
|
||||
qargparse.String(
|
||||
"loader",
|
||||
default=options.get("loader", ""),
|
||||
label="Loader",
|
||||
placeholder="ex: LoadClip, LoadImage ...",
|
||||
help="""Loader
|
||||
|
||||
Defines what openpype loader will be used to load assets.
|
||||
                    Usable loaders depend on the current host's loader list.
|
||||
Field is case sensitive.
|
||||
"""),
|
||||
qargparse.String(
|
||||
"loader_args",
|
||||
default=options.get("loader_args", ""),
|
||||
label="Loader Arguments",
|
||||
placeholder='ex: {"camera":"persp", "lights":True}',
|
||||
help="""Loader
|
||||
|
||||
                    Defines a dictionary of arguments used to load assets.
|
||||
                    Usable arguments depend on the current placeholder Loader.
|
||||
Field should be a valid python dict. Anything else will be ignored.
|
||||
"""),
|
||||
qargparse.Integer(
|
||||
"order",
|
||||
default=options.get("order", 0),
|
||||
min=0,
|
||||
max=999,
|
||||
label="Order",
|
||||
placeholder="ex: 0, 100 ... (smallest order loaded first)",
|
||||
help="""Order
|
||||
|
||||
Order defines asset loading priority (0 to 999)
|
||||
Priority rule is : "lowest is first to load"."""),
|
||||
qargparse.Separator(
|
||||
"Optional attributes "),
|
||||
qargparse.String(
|
||||
"asset",
|
||||
default=options.get("asset", ""),
|
||||
label="Asset filter",
|
||||
placeholder="regex filtering by asset name",
|
||||
help="Filtering assets by matching field regex to asset's name"),
|
||||
qargparse.String(
|
||||
"subset",
|
||||
default=options.get("subset", ""),
|
||||
label="Subset filter",
|
||||
placeholder="regex filtering by subset name",
|
||||
help="Filtering assets by matching field regex to subset's name"),
|
||||
qargparse.String(
|
||||
"hierarchy",
|
||||
default=options.get("hierarchy", ""),
|
||||
label="Hierarchy filter",
|
||||
placeholder="regex filtering by asset's hierarchy",
|
||||
help="Filtering assets by matching field asset's hierarchy")
|
||||
]
|
||||
dialog.create(args)
|
||||
if not dialog.exec_():
|
||||
return None
|
||||
|
||||
return args
|
||||
|
|
@ -22,10 +22,16 @@ from openpype.pipeline import (
|
|||
AVALON_CONTAINER_ID,
|
||||
)
|
||||
from openpype.pipeline.workfile import BuildWorkfile
|
||||
from openpype.pipeline.workfile.build_template import (
|
||||
build_workfile_template,
|
||||
update_workfile_template
|
||||
)
|
||||
from openpype.tools.utils import host_tools
|
||||
|
||||
from .command import viewer_update_and_undo_stop
|
||||
from .lib import (
|
||||
Context,
|
||||
get_main_window,
|
||||
add_publish_knob,
|
||||
WorkfileSettings,
|
||||
process_workfile_builder,
|
||||
|
|
@ -33,7 +39,9 @@ from .lib import (
|
|||
check_inventory_versions,
|
||||
set_avalon_knob_data,
|
||||
read_avalon_data,
|
||||
Context
|
||||
)
|
||||
from .lib_template_builder import (
|
||||
create_placeholder, update_placeholder
|
||||
)
|
||||
|
||||
log = Logger.get_logger(__name__)
|
||||
|
|
@ -53,23 +61,6 @@ if os.getenv("PYBLISH_GUI", None):
|
|||
pyblish.api.register_gui(os.getenv("PYBLISH_GUI", None))
|
||||
|
||||
|
||||
def get_main_window():
|
||||
"""Acquire Nuke's main window"""
|
||||
if Context.main_window is None:
|
||||
from Qt import QtWidgets
|
||||
|
||||
top_widgets = QtWidgets.QApplication.topLevelWidgets()
|
||||
name = "Foundry::UI::DockMainWindow"
|
||||
for widget in top_widgets:
|
||||
if (
|
||||
widget.inherits("QMainWindow")
|
||||
and widget.metaObject().className() == name
|
||||
):
|
||||
Context.main_window = widget
|
||||
break
|
||||
return Context.main_window
|
||||
|
||||
|
||||
def reload_config():
|
||||
"""Attempt to reload pipeline at run-time.
|
||||
|
||||
|
|
@ -219,6 +210,24 @@ def _install_menu():
|
|||
lambda: BuildWorkfile().process()
|
||||
)
|
||||
|
||||
menu_template = menu.addMenu("Template Builder") # creating template menu
|
||||
menu_template.addCommand(
|
||||
"Build Workfile from template",
|
||||
lambda: build_workfile_template()
|
||||
)
|
||||
menu_template.addCommand(
|
||||
"Update Workfile",
|
||||
lambda: update_workfile_template()
|
||||
)
|
||||
menu_template.addSeparator()
|
||||
menu_template.addCommand(
|
||||
"Create Place Holder",
|
||||
lambda: create_placeholder()
|
||||
)
|
||||
menu_template.addCommand(
|
||||
"Update Place Holder",
|
||||
lambda: update_placeholder()
|
||||
)
|
||||
menu.addSeparator()
|
||||
menu.addCommand(
|
||||
"Experimental tools...",
|
||||
|
|
|
|||
openpype/hosts/nuke/api/template_loader.py (new file, 639 lines)
|
|
@ -0,0 +1,639 @@
|
|||
import re
|
||||
import collections
|
||||
|
||||
import nuke
|
||||
|
||||
from openpype.client import get_representations
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype.pipeline.workfile.abstract_template_loader import (
|
||||
AbstractPlaceholder,
|
||||
AbstractTemplateLoader,
|
||||
)
|
||||
|
||||
from .lib import (
|
||||
find_free_space_to_paste_nodes,
|
||||
get_extreme_positions,
|
||||
get_group_io_nodes,
|
||||
imprint,
|
||||
refresh_node,
|
||||
refresh_nodes,
|
||||
reset_selection,
|
||||
get_names_from_nodes,
|
||||
get_nodes_by_names,
|
||||
select_nodes,
|
||||
duplicate_node,
|
||||
node_tempfile,
|
||||
)
|
||||
|
||||
from .lib_template_builder import (
|
||||
delete_placeholder_attributes,
|
||||
get_placeholder_attributes,
|
||||
hide_placeholder_attributes
|
||||
)
|
||||
|
||||
PLACEHOLDER_SET = "PLACEHOLDERS_SET"
|
||||
|
||||
|
||||
class NukeTemplateLoader(AbstractTemplateLoader):
|
||||
"""Concrete implementation of AbstractTemplateLoader for Nuke
|
||||
|
||||
"""
|
||||
|
||||
def import_template(self, path):
|
||||
"""Import template into current scene.
|
||||
Block if a template is already loaded.
|
||||
|
||||
Args:
|
||||
path (str): A path to current template (usually given by
|
||||
get_template_path implementation)
|
||||
|
||||
Returns:
|
||||
            bool: Whether the template was successfully imported or not
|
||||
"""
|
||||
|
||||
# TODO check if the template is already imported
|
||||
|
||||
nuke.nodePaste(path)
|
||||
reset_selection()
|
||||
|
||||
return True
|
||||
|
||||
def preload(self, placeholder, loaders_by_name, last_representation):
|
||||
placeholder.data["nodes_init"] = nuke.allNodes()
|
||||
placeholder.data["last_repre_id"] = str(last_representation["_id"])
|
||||
|
||||
def populate_template(self, ignored_ids=None):
|
||||
processed_key = "_node_processed"
|
||||
|
||||
processed_nodes = []
|
||||
nodes = self.get_template_nodes()
|
||||
while nodes:
|
||||
# Mark nodes as processed so they're not re-executed
|
||||
# - that can happen if processing of placeholder node fails
|
||||
for node in nodes:
|
||||
imprint(node, {processed_key: True})
|
||||
processed_nodes.append(node)
|
||||
|
||||
super(NukeTemplateLoader, self).populate_template(ignored_ids)
|
||||
|
||||
# Recollect nodes to repopulate
|
||||
nodes = []
|
||||
for node in self.get_template_nodes():
|
||||
# Skip already processed nodes
|
||||
if (
|
||||
processed_key in node.knobs()
|
||||
and node.knob(processed_key).value()
|
||||
):
|
||||
continue
|
||||
nodes.append(node)
|
||||
|
||||
for node in processed_nodes:
|
||||
knob = node.knob(processed_key)
|
||||
if knob is not None:
|
||||
node.removeKnob(knob)
|
||||
|
||||
@staticmethod
|
||||
def get_template_nodes():
|
||||
placeholders = []
|
||||
all_groups = collections.deque()
|
||||
all_groups.append(nuke.thisGroup())
|
||||
while all_groups:
|
||||
group = all_groups.popleft()
|
||||
for node in group.nodes():
|
||||
if isinstance(node, nuke.Group):
|
||||
all_groups.append(node)
|
||||
|
||||
node_knobs = node.knobs()
|
||||
if (
|
||||
"builder_type" not in node_knobs
|
||||
or "is_placeholder" not in node_knobs
|
||||
or not node.knob("is_placeholder").value()
|
||||
):
|
||||
continue
|
||||
|
||||
if "empty" in node_knobs and node.knob("empty").value():
|
||||
continue
|
||||
|
||||
placeholders.append(node)
|
||||
|
||||
return placeholders
|
||||
|
||||
def update_missing_containers(self):
|
||||
nodes_by_id = collections.defaultdict(list)
|
||||
|
||||
for node in nuke.allNodes():
|
||||
node_knobs = node.knobs().keys()
|
||||
if "repre_id" in node_knobs:
|
||||
repre_id = node.knob("repre_id").getValue()
|
||||
nodes_by_id[repre_id].append(node.name())
|
||||
|
||||
if "empty" in node_knobs:
|
||||
node.removeKnob(node.knob("empty"))
|
||||
imprint(node, {"empty": False})
|
||||
|
||||
for node_names in nodes_by_id.values():
|
||||
node = None
|
||||
for node_name in node_names:
|
||||
node_by_name = nuke.toNode(node_name)
|
||||
if "builder_type" in node_by_name.knobs().keys():
|
||||
node = node_by_name
|
||||
break
|
||||
|
||||
if node is None:
|
||||
continue
|
||||
|
||||
placeholder = nuke.nodes.NoOp()
|
||||
placeholder.setName("PLACEHOLDER")
|
||||
placeholder.knob("tile_color").setValue(4278190335)
|
||||
attributes = get_placeholder_attributes(node, enumerate=True)
|
||||
imprint(placeholder, attributes)
|
||||
pos_x = int(node.knob("x").getValue())
|
||||
pos_y = int(node.knob("y").getValue())
|
||||
placeholder.setXYpos(pos_x, pos_y)
|
||||
imprint(placeholder, {"nb_children": 1})
|
||||
refresh_node(placeholder)
|
||||
|
||||
self.populate_template(self.get_loaded_containers_by_id())
|
||||
|
||||
def get_loaded_containers_by_id(self):
|
||||
repre_ids = set()
|
||||
for node in nuke.allNodes():
|
||||
if "repre_id" in node.knobs():
|
||||
repre_ids.add(node.knob("repre_id").getValue())
|
||||
|
||||
        # Convert the set of ids (duplicates already removed) to a list
|
||||
return list(repre_ids)
|
||||
|
||||
def delete_placeholder(self, placeholder):
|
||||
placeholder_node = placeholder.data["node"]
|
||||
last_loaded = placeholder.data["last_loaded"]
|
||||
if not placeholder.data["delete"]:
|
||||
if "empty" in placeholder_node.knobs().keys():
|
||||
placeholder_node.removeKnob(placeholder_node.knob("empty"))
|
||||
imprint(placeholder_node, {"empty": True})
|
||||
return
|
||||
|
||||
if not last_loaded:
|
||||
nuke.delete(placeholder_node)
|
||||
return
|
||||
|
||||
if "last_loaded" in placeholder_node.knobs().keys():
|
||||
for node_name in placeholder_node.knob("last_loaded").values():
|
||||
node = nuke.toNode(node_name)
|
||||
try:
|
||||
delete_placeholder_attributes(node)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
last_loaded_names = [
|
||||
loaded_node.name()
|
||||
for loaded_node in last_loaded
|
||||
]
|
||||
imprint(placeholder_node, {"last_loaded": last_loaded_names})
|
||||
|
||||
for node in last_loaded:
|
||||
refresh_node(node)
|
||||
refresh_node(placeholder_node)
|
||||
if "builder_type" not in node.knobs().keys():
|
||||
attributes = get_placeholder_attributes(placeholder_node, True)
|
||||
imprint(node, attributes)
|
||||
imprint(node, {"is_placeholder": False})
|
||||
hide_placeholder_attributes(node)
|
||||
node.knob("is_placeholder").setVisible(False)
|
||||
imprint(
|
||||
node,
|
||||
{
|
||||
"x": placeholder_node.xpos(),
|
||||
"y": placeholder_node.ypos()
|
||||
}
|
||||
)
|
||||
node.knob("x").setVisible(False)
|
||||
node.knob("y").setVisible(False)
|
||||
nuke.delete(placeholder_node)
|
||||
|
||||
|
||||
class NukePlaceholder(AbstractPlaceholder):
|
||||
"""Concrete implementation of AbstractPlaceholder for Nuke"""
|
||||
|
||||
optional_keys = {"asset", "subset", "hierarchy"}
|
||||
|
||||
def get_data(self, node):
|
||||
user_data = dict()
|
||||
node_knobs = node.knobs()
|
||||
for attr in self.required_keys.union(self.optional_keys):
|
||||
if attr in node_knobs:
|
||||
user_data[attr] = node_knobs[attr].getValue()
|
||||
user_data["node"] = node
|
||||
|
||||
nb_children = 0
|
||||
if "nb_children" in node_knobs:
|
||||
nb_children = int(node_knobs["nb_children"].getValue())
|
||||
user_data["nb_children"] = nb_children
|
||||
|
||||
siblings = []
|
||||
if "siblings" in node_knobs:
|
||||
siblings = node_knobs["siblings"].values()
|
||||
user_data["siblings"] = siblings
|
||||
|
||||
node_full_name = node.fullName()
|
||||
user_data["group_name"] = node_full_name.rpartition(".")[0]
|
||||
user_data["last_loaded"] = []
|
||||
user_data["delete"] = False
|
||||
self.data = user_data
|
||||
|
||||
def parent_in_hierarchy(self, containers):
|
||||
return
|
||||
|
||||
def create_sib_copies(self):
|
||||
""" creating copies of the palce_holder siblings (the ones who were
|
||||
        loaded with it) for the newly added nodes
|
||||
|
||||
Returns :
|
||||
            copies (dict): original node names mapped to their copies
|
||||
"""
|
||||
|
||||
copies = {}
|
||||
siblings = get_nodes_by_names(self.data["siblings"])
|
||||
for node in siblings:
|
||||
new_node = duplicate_node(node)
|
||||
|
||||
x_init = int(new_node.knob("x_init").getValue())
|
||||
y_init = int(new_node.knob("y_init").getValue())
|
||||
new_node.setXYpos(x_init, y_init)
|
||||
if isinstance(new_node, nuke.BackdropNode):
|
||||
w_init = new_node.knob("w_init").getValue()
|
||||
h_init = new_node.knob("h_init").getValue()
|
||||
new_node.knob("bdwidth").setValue(w_init)
|
||||
new_node.knob("bdheight").setValue(h_init)
|
||||
refresh_node(node)
|
||||
|
||||
if "repre_id" in node.knobs().keys():
|
||||
node.removeKnob(node.knob("repre_id"))
|
||||
copies[node.name()] = new_node
|
||||
return copies
|
||||
|
||||
def fix_z_order(self):
|
||||
"""Fix the problem of z_order when a backdrop is loaded."""
|
||||
|
||||
nodes_loaded = self.data["last_loaded"]
|
||||
loaded_backdrops = []
|
||||
bd_orders = set()
|
||||
for node in nodes_loaded:
|
||||
if isinstance(node, nuke.BackdropNode):
|
||||
loaded_backdrops.append(node)
|
||||
bd_orders.add(node.knob("z_order").getValue())
|
||||
|
||||
if not bd_orders:
|
||||
return
|
||||
|
||||
sib_orders = set()
|
||||
for node_name in self.data["siblings"]:
|
||||
node = nuke.toNode(node_name)
|
||||
if isinstance(node, nuke.BackdropNode):
|
||||
sib_orders.add(node.knob("z_order").getValue())
|
||||
|
||||
if not sib_orders:
|
||||
return
|
||||
|
||||
min_order = min(bd_orders)
|
||||
max_order = max(sib_orders)
|
||||
for backdrop_node in loaded_backdrops:
|
||||
z_order = backdrop_node.knob("z_order").getValue()
|
||||
backdrop_node.knob("z_order").setValue(
|
||||
z_order + max_order - min_order + 1)
|
||||
|
||||
def update_nodes(self, nodes, considered_nodes, offset_y=None):
|
||||
"""Adjust backdrop nodes dimensions and positions.
|
||||
|
||||
Considering some nodes sizes.
|
||||
|
||||
Args:
|
||||
nodes (list): list of nodes to update
|
||||
considered_nodes (list): list of nodes to consider while updating
|
||||
positions and dimensions
|
||||
            offset_y (int): distance between copies
|
||||
"""
|
||||
|
||||
placeholder_node = self.data["node"]
|
||||
|
||||
min_x, min_y, max_x, max_y = get_extreme_positions(considered_nodes)
|
||||
|
||||
diff_x = diff_y = 0
|
||||
contained_nodes = [] # for backdrops
|
||||
|
||||
if offset_y is None:
|
||||
width_ph = placeholder_node.screenWidth()
|
||||
height_ph = placeholder_node.screenHeight()
|
||||
diff_y = max_y - min_y - height_ph
|
||||
diff_x = max_x - min_x - width_ph
|
||||
contained_nodes = [placeholder_node]
|
||||
min_x = placeholder_node.xpos()
|
||||
min_y = placeholder_node.ypos()
|
||||
else:
|
||||
siblings = get_nodes_by_names(self.data["siblings"])
|
||||
minX, _, maxX, _ = get_extreme_positions(siblings)
|
||||
diff_y = max_y - min_y + 20
|
||||
diff_x = abs(max_x - min_x - maxX + minX)
|
||||
contained_nodes = considered_nodes
|
||||
|
||||
if diff_y <= 0 and diff_x <= 0:
|
||||
return
|
||||
|
||||
for node in nodes:
|
||||
refresh_node(node)
|
||||
|
||||
if (
|
||||
node == placeholder_node
|
||||
or node in considered_nodes
|
||||
):
|
||||
continue
|
||||
|
||||
if (
|
||||
not isinstance(node, nuke.BackdropNode)
|
||||
or (
|
||||
isinstance(node, nuke.BackdropNode)
|
||||
and not set(contained_nodes) <= set(node.getNodes())
|
||||
)
|
||||
):
|
||||
if offset_y is None and node.xpos() >= min_x:
|
||||
node.setXpos(node.xpos() + diff_x)
|
||||
|
||||
if node.ypos() >= min_y:
|
||||
node.setYpos(node.ypos() + diff_y)
|
||||
|
||||
else:
|
||||
width = node.screenWidth()
|
||||
height = node.screenHeight()
|
||||
node.knob("bdwidth").setValue(width + diff_x)
|
||||
node.knob("bdheight").setValue(height + diff_y)
|
||||
|
||||
refresh_node(node)
|
||||
|
||||
def imprint_inits(self):
|
||||
"""Add initial positions and dimensions to the attributes"""
|
||||
|
||||
for node in nuke.allNodes():
|
||||
refresh_node(node)
|
||||
imprint(node, {"x_init": node.xpos(), "y_init": node.ypos()})
|
||||
node.knob("x_init").setVisible(False)
|
||||
node.knob("y_init").setVisible(False)
|
||||
width = node.screenWidth()
|
||||
height = node.screenHeight()
|
||||
if "bdwidth" in node.knobs():
|
||||
imprint(node, {"w_init": width, "h_init": height})
|
||||
node.knob("w_init").setVisible(False)
|
||||
node.knob("h_init").setVisible(False)
|
||||
refresh_node(node)
|
||||
|
||||
def imprint_siblings(self):
|
||||
"""
|
||||
- add siblings names to placeholder attributes (nodes loaded with it)
|
||||
- add Id to the attributes of all the other nodes
|
||||
"""
|
||||
|
||||
loaded_nodes = self.data["last_loaded"]
|
||||
loaded_nodes_set = set(loaded_nodes)
|
||||
data = {"repre_id": str(self.data["last_repre_id"])}
|
||||
|
||||
for node in loaded_nodes:
|
||||
node_knobs = node.knobs()
|
||||
if "builder_type" not in node_knobs:
|
||||
# save the id of representation for all imported nodes
|
||||
imprint(node, data)
|
||||
node.knob("repre_id").setVisible(False)
|
||||
refresh_node(node)
|
||||
continue
|
||||
|
||||
if (
|
||||
"is_placeholder" not in node_knobs
|
||||
or (
|
||||
"is_placeholder" in node_knobs
|
||||
and node.knob("is_placeholder").value()
|
||||
)
|
||||
):
|
||||
siblings = list(loaded_nodes_set - {node})
|
||||
siblings_name = get_names_from_nodes(siblings)
|
||||
siblings = {"siblings": siblings_name}
|
||||
imprint(node, siblings)
|
||||
|
||||
def set_loaded_connections(self):
|
||||
"""
|
||||
set inputs and outputs of loaded nodes"""
|
||||
|
||||
placeholder_node = self.data["node"]
|
||||
input_node, output_node = get_group_io_nodes(self.data["last_loaded"])
|
||||
for node in placeholder_node.dependent():
|
||||
for idx in range(node.inputs()):
|
||||
if node.input(idx) == placeholder_node:
|
||||
node.setInput(idx, output_node)
|
||||
|
||||
for node in placeholder_node.dependencies():
|
||||
for idx in range(placeholder_node.inputs()):
|
||||
if placeholder_node.input(idx) == node:
|
||||
input_node.setInput(0, node)
|
||||
|
||||
def set_copies_connections(self, copies):
|
||||
"""Set inputs and outputs of the copies.
|
||||
|
||||
Args:
|
||||
copies (dict): Copied nodes by their names.
|
||||
"""
|
||||
|
||||
last_input, last_output = get_group_io_nodes(self.data["last_loaded"])
|
||||
siblings = get_nodes_by_names(self.data["siblings"])
|
||||
siblings_input, siblings_output = get_group_io_nodes(siblings)
|
||||
copy_input = copies[siblings_input.name()]
|
||||
copy_output = copies[siblings_output.name()]
|
||||
|
||||
for node_init in siblings:
|
||||
if node_init == siblings_output:
|
||||
continue
|
||||
|
||||
node_copy = copies[node_init.name()]
|
||||
for node in node_init.dependent():
|
||||
for idx in range(node.inputs()):
|
||||
if node.input(idx) != node_init:
|
||||
continue
|
||||
|
||||
if node in siblings:
|
||||
copies[node.name()].setInput(idx, node_copy)
|
||||
else:
|
||||
last_input.setInput(0, node_copy)
|
||||
|
||||
for node in node_init.dependencies():
|
||||
for idx in range(node_init.inputs()):
|
||||
if node_init.input(idx) != node:
|
||||
continue
|
||||
|
||||
if node_init == siblings_input:
|
||||
copy_input.setInput(idx, node)
|
||||
elif node in siblings:
|
||||
node_copy.setInput(idx, copies[node.name()])
|
||||
else:
|
||||
node_copy.setInput(idx, last_output)
|
||||
|
||||
siblings_input.setInput(0, copy_output)
|
||||
|
||||
def move_to_placeholder_group(self, nodes_loaded):
|
||||
"""
|
||||
        Open the placeholder's group and copy the loaded nodes into it.
|
||||
|
||||
Returns :
|
||||
nodes_loaded (list): the new list of pasted nodes
|
||||
"""
|
||||
|
||||
groups_name = self.data["group_name"]
|
||||
reset_selection()
|
||||
select_nodes(nodes_loaded)
|
||||
if groups_name:
|
||||
with node_tempfile() as filepath:
|
||||
nuke.nodeCopy(filepath)
|
||||
for node in nuke.selectedNodes():
|
||||
nuke.delete(node)
|
||||
group = nuke.toNode(groups_name)
|
||||
group.begin()
|
||||
nuke.nodePaste(filepath)
|
||||
nodes_loaded = nuke.selectedNodes()
|
||||
return nodes_loaded
|
||||
|
||||
def clean(self):
|
||||
# deselect all selected nodes
|
||||
placeholder_node = self.data["node"]
|
||||
|
||||
# getting the latest nodes added
|
||||
nodes_init = self.data["nodes_init"]
|
||||
nodes_loaded = list(set(nuke.allNodes()) - set(nodes_init))
|
||||
self.log.debug("Loaded nodes: {}".format(nodes_loaded))
|
||||
if not nodes_loaded:
|
||||
return
|
||||
|
||||
self.data["delete"] = True
|
||||
|
||||
nodes_loaded = self.move_to_placeholder_group(nodes_loaded)
|
||||
self.data["last_loaded"] = nodes_loaded
|
||||
refresh_nodes(nodes_loaded)
|
||||
|
||||
# positioning of the loaded nodes
|
||||
min_x, min_y, _, _ = get_extreme_positions(nodes_loaded)
|
||||
for node in nodes_loaded:
|
||||
xpos = (node.xpos() - min_x) + placeholder_node.xpos()
|
||||
ypos = (node.ypos() - min_y) + placeholder_node.ypos()
|
||||
node.setXYpos(xpos, ypos)
|
||||
refresh_nodes(nodes_loaded)
|
||||
|
||||
self.fix_z_order() # fix the problem of z_order for backdrops
|
||||
self.imprint_siblings()
|
||||
|
||||
if self.data["nb_children"] == 0:
|
||||
            # save initial node positions and dimensions, update them
|
||||
# and set inputs and outputs of loaded nodes
|
||||
|
||||
self.imprint_inits()
|
||||
self.update_nodes(nuke.allNodes(), nodes_loaded)
|
||||
self.set_loaded_connections()
|
||||
|
||||
elif self.data["siblings"]:
|
||||
# create copies of placeholder siblings for the new loaded nodes,
|
||||
            # set their inputs and outputs and update all node positions and
|
||||
# dimensions and siblings names
|
||||
|
||||
siblings = get_nodes_by_names(self.data["siblings"])
|
||||
refresh_nodes(siblings)
|
||||
copies = self.create_sib_copies()
|
||||
new_nodes = list(copies.values()) # copies nodes
|
||||
self.update_nodes(new_nodes, nodes_loaded)
|
||||
placeholder_node.removeKnob(placeholder_node.knob("siblings"))
|
||||
new_nodes_name = get_names_from_nodes(new_nodes)
|
||||
imprint(placeholder_node, {"siblings": new_nodes_name})
|
||||
self.set_copies_connections(copies)
|
||||
|
||||
self.update_nodes(
|
||||
nuke.allNodes(),
|
||||
new_nodes + nodes_loaded,
|
||||
20
|
||||
)
|
||||
|
||||
new_siblings = get_names_from_nodes(new_nodes)
|
||||
self.data["siblings"] = new_siblings
|
||||
|
||||
else:
|
||||
# if the placeholder doesn't have siblings, the loaded
|
||||
# nodes will be placed in a free space
|
||||
|
||||
xpointer, ypointer = find_free_space_to_paste_nodes(
|
||||
nodes_loaded, direction="bottom", offset=200
|
||||
)
|
||||
node = nuke.createNode("NoOp")
|
||||
reset_selection()
|
||||
nuke.delete(node)
|
||||
for node in nodes_loaded:
|
||||
xpos = (node.xpos() - min_x) + xpointer
|
||||
ypos = (node.ypos() - min_y) + ypointer
|
||||
node.setXYpos(xpos, ypos)
|
||||
|
||||
self.data["nb_children"] += 1
|
||||
reset_selection()
|
||||
# go back to root group
|
||||
nuke.root().begin()
|
||||
|
||||
def get_representations(self, current_asset_doc, linked_asset_docs):
|
||||
project_name = legacy_io.active_project()
|
||||
|
||||
builder_type = self.data["builder_type"]
|
||||
if builder_type == "context_asset":
|
||||
context_filters = {
|
||||
"asset": [re.compile(self.data["asset"])],
|
||||
"subset": [re.compile(self.data["subset"])],
|
||||
"hierarchy": [re.compile(self.data["hierarchy"])],
|
||||
"representations": [self.data["representation"]],
|
||||
"family": [self.data["family"]]
|
||||
}
|
||||
|
||||
elif builder_type != "linked_asset":
|
||||
context_filters = {
|
||||
"asset": [
|
||||
current_asset_doc["name"],
|
||||
re.compile(self.data["asset"])
|
||||
],
|
||||
"subset": [re.compile(self.data["subset"])],
|
||||
"hierarchy": [re.compile(self.data["hierarchy"])],
|
||||
"representation": [self.data["representation"]],
|
||||
"family": [self.data["family"]]
|
||||
}
|
||||
|
||||
else:
|
||||
asset_regex = re.compile(self.data["asset"])
|
||||
linked_asset_names = []
|
||||
for asset_doc in linked_asset_docs:
|
||||
asset_name = asset_doc["name"]
|
||||
if asset_regex.match(asset_name):
|
||||
linked_asset_names.append(asset_name)
|
||||
|
||||
if not linked_asset_names:
|
||||
return []
|
||||
|
||||
context_filters = {
|
||||
"asset": linked_asset_names,
|
||||
"subset": [re.compile(self.data["subset"])],
|
||||
"hierarchy": [re.compile(self.data["hierarchy"])],
|
||||
"representation": [self.data["representation"]],
|
||||
"family": [self.data["family"]],
|
||||
}
|
||||
|
||||
return list(get_representations(
|
||||
project_name,
|
||||
context_filters=context_filters
|
||||
))
|
||||
|
||||
def err_message(self):
|
||||
return (
|
||||
"Error while trying to load a representation.\n"
|
||||
"Either the subset wasn't published or the template is malformed."
|
||||
"\n\n"
|
||||
"Builder was looking for:\n{attributes}".format(
|
||||
attributes="\n".join([
|
||||
"{}: {}".format(key.title(), value)
|
||||
for key, value in self.data.items()]
|
||||
)
|
||||
)
|
||||
)
|
||||
|
|
@ -5,6 +5,7 @@ from typing import Tuple
|
|||
import gazu
|
||||
|
||||
from openpype.lib.local_settings import OpenPypeSecureRegistry
|
||||
from openpype.lib import emit_event
|
||||
|
||||
|
||||
def validate_credentials(
|
||||
|
|
@ -32,6 +33,8 @@ def validate_credentials(
|
|||
except gazu.exception.AuthFailedException:
|
||||
return False
|
||||
|
||||
emit_event("kitsu.user.logged", data={"username": login}, source="kitsu")
|
||||
|
||||
return True
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -177,7 +177,7 @@ class AbstractTemplateLoader:
|
|||
build_info["profiles"],
|
||||
{
|
||||
"task_types": task_type,
|
||||
"tasks": task_name
|
||||
"task_names": task_name
|
||||
}
|
||||
)
|
||||
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
import os
|
||||
from importlib import import_module
|
||||
from openpype.lib import classes_from_module
|
||||
from openpype.host import HostBase
|
||||
|
|
@ -30,7 +31,7 @@ def build_workfile_template(*args):
|
|||
template_loader.populate_template()
|
||||
|
||||
|
||||
def update_workfile_template(args):
|
||||
def update_workfile_template(*args):
|
||||
template_loader = build_template_loader()
|
||||
template_loader.update_missing_containers()
|
||||
|
||||
|
|
@ -42,7 +43,10 @@ def build_template_loader():
|
|||
if isinstance(host, HostBase):
|
||||
host_name = host.name
|
||||
else:
|
||||
host_name = host.__name__.partition('.')[2]
|
||||
host_name = os.environ.get("AVALON_APP")
|
||||
if not host_name:
|
||||
host_name = host.__name__.split(".")[-2]
|
||||
|
||||
module_path = _module_path_format.format(host=host_name)
|
||||
module = import_module(module_path)
|
||||
if not module:
|
||||
|
|
|
|||
|
|
@ -980,4 +980,4 @@
|
|||
"ValidateNoAnimation": false
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
@ -325,5 +325,8 @@
|
|||
}
|
||||
]
|
||||
},
|
||||
"templated_workfile_build": {
|
||||
"profiles": []
|
||||
},
|
||||
"filters": {}
|
||||
}
|
||||
|
|
@ -308,6 +308,10 @@
|
|||
"type": "schema_template",
|
||||
"name": "template_workfile_options"
|
||||
},
|
||||
{
|
||||
"type": "schema",
|
||||
"name": "schema_templated_workfile_build"
|
||||
},
|
||||
{
|
||||
"type": "schema",
|
||||
"name": "schema_publish_gui_filter"
|
||||
|
|
|
|||
|
|
@ -17,7 +17,7 @@
|
|||
"type": "task-types-enum"
|
||||
},
|
||||
{
|
||||
"key": "tasks",
|
||||
"key": "task_names",
|
||||
"label": "Task names",
|
||||
"type": "list",
|
||||
"object_type": "text"
|
||||
|
|
|
|||
|
|
@ -81,7 +81,7 @@ class AssetModel(TreeModel):
|
|||
for asset in current_assets:
|
||||
# get label from data, otherwise use name
|
||||
data = asset.get("data", {})
|
||||
label = data.get("label", asset["name"])
|
||||
label = data.get("label") or asset["name"]
|
||||
tags = data.get("tags", [])
|
||||
|
||||
# store for the asset for optimization
|
||||
|
|
|
|||
|
|
@ -1,3 +1,3 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Package declaring Pype version."""
|
||||
__version__ = "3.14.2-nightly.1"
|
||||
__version__ = "3.14.2-nightly.2"
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
[tool.poetry]
|
||||
name = "OpenPype"
|
||||
version = "3.14.2-nightly.1" # OpenPype
|
||||
version = "3.14.2-nightly.2" # OpenPype
|
||||
description = "Open VFX and Animation pipeline with support."
|
||||
authors = ["OpenPype Team <info@openpype.io>"]
|
||||
license = "MIT License"
|
||||
|
|
|
|||
|
|
@ -101,7 +101,7 @@ def bump_file_versions(version):
|
|||
|
||||
# bump pyproject.toml
|
||||
filename = "pyproject.toml"
|
||||
regex = "version = \"(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(-((0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(\.(0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(\+([0-9a-zA-Z-]+(\.[0-9a-zA-Z-]+)*))?\" # OpenPype"
|
||||
regex = "version = \"(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(\+((0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(\.(0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(\+([0-9a-zA-Z-]+(\.[0-9a-zA-Z-]+)*))?\" # OpenPype"
|
||||
pyproject_version = f"version = \"{version}\" # OpenPype"
|
||||
file_regex_replace(filename, regex, pyproject_version)
|
||||
|
||||
|
|
|
|||
|
|
@ -68,6 +68,7 @@ function Install-Poetry() {
|
|||
}
|
||||
|
||||
$env:POETRY_HOME="$openpype_root\.poetry"
|
||||
$env:POETRY_VERSION="1.1.15"
|
||||
(Invoke-WebRequest -Uri https://install.python-poetry.org/ -UseBasicParsing).Content | & $($python) -
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -109,6 +109,7 @@ detect_python () {
|
|||
install_poetry () {
|
||||
echo -e "${BIGreen}>>>${RST} Installing Poetry ..."
|
||||
export POETRY_HOME="$openpype_root/.poetry"
|
||||
export POETRY_VERSION="1.1.15"
|
||||
command -v curl >/dev/null 2>&1 || { echo -e "${BIRed}!!!${RST}${BIYellow} Missing ${RST}${BIBlue}curl${BIYellow} command.${RST}"; return 1; }
|
||||
curl -sSL https://install.python-poetry.org/ | python -
|
||||
}
|
||||
|
|
|
|||
|
|
@ -96,6 +96,7 @@ Write-Color -Text ">>> ", "Cleaning cache files ... " -Color Green, Gray -NoNewl
|
|||
Get-ChildItem $openpype_root -Filter "__pycache__" -Force -Recurse| Where-Object {( $_.FullName -inotmatch '\\build\\' ) -and ( $_.FullName -inotmatch '\\.venv' )} | Remove-Item -Force -Recurse
|
||||
Get-ChildItem $openpype_root -Filter "*.pyc" -Force -Recurse | Where-Object {( $_.FullName -inotmatch '\\build\\' ) -and ( $_.FullName -inotmatch '\\.venv' )} | Remove-Item -Force
|
||||
Get-ChildItem $openpype_root -Filter "*.pyo" -Force -Recurse | Where-Object {( $_.FullName -inotmatch '\\build\\' ) -and ( $_.FullName -inotmatch '\\.venv' )} | Remove-Item -Force
|
||||
Write-Color -Text "OK" -Color Green
|
||||
|
||||
Write-Color -Text ">>> ", "Generating zip from current sources ..." -Color Green, Gray
|
||||
$env:PYTHONPATH="$($openpype_root);$($env:PYTHONPATH)"
|
||||
|
|
|
|||
|
|
@ -20,17 +20,17 @@ It defines:
|
|||
- Colour Management
|
||||
- File Formats
|
||||
|
||||
Anatomy is the only configuration that is always saved as project override. This is to make sure, that any updates to OpenPype or Studio default values, don't affect currently running productions.
|
||||
Anatomy is the only configuration that is always saved as a project override. This is to make sure that any updates to OpenPype or Studio default values don't affect currently running productions.
|
||||
|
||||

|
||||
|
||||
## Roots
|
||||
|
||||
Roots define where files are stored with path to shared folder. It is required to set root path for each platform you are using in studio. All paths must point to same folder!
|
||||
Roots define where files are stored, as paths to a shared folder. It is required to set the root path for each platform you are using in the studio. All paths must point to the same folder!
|
||||
|
||||

|
||||
|
||||
It is possible to set multiple roots when necessary. That may be handy when you need to store specific type of data on another disk.
|
||||
It is possible to set multiple roots when necessary. That may be handy when you need to store a specific type of data on another disk.
|
||||

|
||||
|
||||
|
||||
|
|
@ -40,7 +40,7 @@ Note how multiple roots are used here, to push different types of files to diffe
|
|||
|
||||
## Templates
|
||||
|
||||
Templates define project's folder structure and filenames.
|
||||
Templates define the project's folder structure and filenames.
|
||||
|
||||
We have a few required anatomy templates for OpenPype to work properly, however we keep adding more when needed.
@ -100,14 +100,36 @@ We have a few required anatomy templates for OpenPype to work properly, however
</div>
</div>

### Anatomy reference keys

Anatomy templates can use "referenced keys". The best example is `path` in the publish or work templates, which just contains references to `folder` and `file` (`{@folder}/{@file}`). Any change in the folder or file template is propagated to the path template. Another example is the simplified formatting of versions and frames with padding: the keys `{@version}` and `{@frame}` used in the default templates reference `Anatomy` -> `Templates` -> `Version` or `Frame`, which handle version and frame formatting with padding.

So if you set `project_anatomy/templates/defaults/version_padding` to `5`, the `{@version}` key is automatically transformed to `v{version:0>5}` and version numbers in paths get 5 digits -> `v00001`.
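As a plain-Python illustration of the padding and reference behaviour described above (the folder and file templates below are invented for the example, not the shipped defaults):

```python
# Plain str.format() illustration; the folder and file templates here are
# made up for the example, not OpenPype defaults.
print("v{version:0>5}".format(version=1))  # -> v00001

# "{@folder}/{@file}" conceptually expands to the folder and file templates
# before formatting:
folder_template = "{root}/{project}/{asset}"
file_template = "{asset}_v{version:0>5}.{ext}"
path_template = folder_template + "/" + file_template

print(path_template.format(
    root="/mnt/share", project="demo", asset="sh010", version=3, ext="ma"
))
# -> /mnt/share/demo/sh010/sh010_v00003.ma
```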
### Optional keys

In some cases not all keys are available during template formatting and the missing ones should simply be ignored. For example `{frame}` should be used only for sequences, but there is a single publish template. To handle these cases it is possible to mark a segment of the template that should be ignored when it can't be filled because of missing keys. To mark such segments use `<` and `>`.

The template `{project[code]}_{asset}_{subset}<_{output}><.{@frame}>.{ext}` can handle all 4 possible situations where the `output` and `frame` keys are or are not available. The optional segments can contain additional text, like the dot (`.`) for frame and the underscore (`_`) for output in the example; these are also dropped when the keys are not available. Optional segments without formatting keys are kept untouched: `<br/>` stays as `<br/>`. It is possible to nest optional segments inside optional segments, e.g. `<{asset}<.{@frame}><br/>>`, which may result in an empty string if the `asset` key is not available.
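Purely as a sketch of the behaviour described above (with simplified, flat key names; this is not the actual OpenPype formatting code), optional segments could be resolved roughly like this:

```python
# Rough sketch of optional-segment resolution; the real OpenPype formatting
# code is more involved (for example it keeps keyless segments such as "<br/>"
# untouched, which this sketch does not reproduce).
import re

OPTIONAL = re.compile(r"<([^<>]*)>")  # innermost <...> segments


def _try_fill(segment, data):
    """Return the formatted segment, or None when a key is missing."""
    try:
        return segment.format(**data)
    except KeyError:
        return None


def format_template(template, data):
    # Resolve innermost optional segments first so that nesting works.
    while True:
        match = OPTIONAL.search(template)
        if not match:
            break
        filled = _try_fill(match.group(1), data)
        replacement = filled if filled is not None else ""
        template = template[:match.start()] + replacement + template[match.end():]
    return template.format(**data)


print(format_template(
    "{code}_{asset}<_{output}><.{frame}>.{ext}",
    {"code": "prj", "asset": "sh010", "frame": "0001", "ext": "exr"},
))
# -> prj_sh010.0001.exr  (the "_{output}" segment was dropped, the frame kept)
```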
## Attributes

Project attributes are used as default values for new assets created under the project, except `Applications` and `Active project`, which are project specific. Values of attributes that are **not** project specific are always taken from the assets. So if `tools` are not loading as expected, it is because the assets have different values.



**Most attributes don't need a detailed explanation.**

| Attribute | Description |
| --- | --- |
| `Applications` | List of applications that can be used in the project. At the moment used only as a possible filter of applications. |
| `Tools` | List of application tools. This value can be overridden per asset. |
| `Active project` | Project won't be visible in tools if disabled.<br/> - To revert, check the `Show Inactive projects` checkbox in project settings. |

## Task Types

Current state of default Task descriptors.
Available task types on a project. Each task on an asset references a task type on the project, which gives access to additional task type attributes. At the moment only `short_name` is available (it can be used in templates as `{task[short_name]}`).

@ -202,3 +202,67 @@ This video shows a way to publish shot look as effect from Hiero to Nuke.

### Assembling edit from published shot versions

<iframe width="512px" height="288px" src="https://www.youtube.com/embed/5Wd6X-71vbg" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>


# Nuke Build Workfile

This tool initialises the Node Graph from a pre-created template.

### Add a profile

The path to the template used for the initialisation must be added as a profile in the Project Settings.



### Create Place Holder



This tool creates a Place Holder, a node that will be replaced by published instances.



#### Result

- Creates a red node called `PLACEHOLDER` which can be used and manipulated in the Node Graph as needed.



:::note
All published instances that will replace the place holder must contain unique input and output nodes if they are not imported as a single node.
:::


The information about these objects is provided by the user by filling in the extra attributes of the Place Holder.



### Update Place Holder

This tool allows the user to change the information provided in the extra attributes of the selected Place Holder.



### Build Workfile from template

This tool imports the template and replaces the existing Place Holders with the corresponding published objects (which can contain Place Holders too). If there are no published items matching the given description, the place holder remains in the node graph.



#### Result

- Replaces the `PLACEHOLDER` node in the template with the published instance corresponding to the information provided in the extra attributes of the Place Holder.



:::note
If the instance that replaces Place Holder **A** contains another Place Holder **B** that points to several published elements, all the nodes imported with **A** except **B** are duplicated for each element that replaces **B**.
:::

### Update Workfile

This tool checks whether new instances were published after the last build and imports them.



:::note
Imported instances must not be deleted: they contain extra attributes that are used to update the workfile, since the place holder has been deleted.
:::
BIN  website/docs/assets/nuke_addProfile.png  (Normal file, 24 KiB)
BIN  website/docs/assets/nuke_buildWorfileFromTemplate.png  (Normal file, 29 KiB)
BIN  website/docs/assets/nuke_buildworkfile.png  (Normal file, 35 KiB)
BIN  website/docs/assets/nuke_createPlaceHolder.png  (Normal file, 30 KiB)
BIN  website/docs/assets/nuke_fillingExtraAttributes.png  (Normal file, 30 KiB)
BIN  website/docs/assets/nuke_placeHolderNode.png  (Normal file, 3.9 KiB)
BIN  website/docs/assets/nuke_placeholder.png  (Normal file, 12 KiB)
BIN  website/docs/assets/nuke_publishedinstance.png  (Normal file, 20 KiB)
BIN  website/docs/assets/nuke_updatePlaceHolder.png  (Normal file, 30 KiB)
BIN  website/docs/assets/nuke_updateWorkfile.png  (Normal file, 30 KiB)
BIN  website/docs/assets/settings/anatomy_attributes.png  (Normal file, 14 KiB)