Mirror of https://github.com/ynput/ayon-core.git

Commit fbd6e08cf9

Merge branch 'release/3.15.x' into enchancement/OP-2630_acescg_maya

# Conflicts:
#	openpype/plugins/publish/integrate.py
#	poetry.lock
#	pyproject.toml

345 changed files with 15401 additions and 6885 deletions
.pre-commit-config.yaml (new file, +16)

@@ -0,0 +1,16 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
  rev: v4.4.0
  hooks:
  - id: trailing-whitespace
  - id: end-of-file-fixer
  - id: check-yaml
  - id: check-added-large-files
  - id: no-commit-to-branch
    args: [ '--pattern', '^(?!((enhancement|feature|bugfix|documentation|tests|local|chore)\/[a-zA-Z0-9\-]+)$).*' ]
- repo: https://github.com/psf/black
  rev: 22.12.0
  hooks:
  - id: black
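The `no-commit-to-branch` hook above inverts the usual check: the negative-lookahead pattern matches every branch name that does not follow the `type/name` convention, so commits are blocked everywhere except on conforming branches. A quick sketch of how that regex behaves, using plain Python `re` outside of pre-commit (branch names here are made up for illustration):

    import re

    # Pattern copied from the hook above: it matches branch names that do
    # NOT follow the "<type>/<kebab-case-name>" convention.
    PATTERN = re.compile(
        r"^(?!((enhancement|feature|bugfix|documentation|tests|local|chore)"
        r"\/[a-zA-Z0-9\-]+)$).*"
    )

    for branch in ("feature/workfile-loader",
                   "bugfix/image-plane-load",
                   "main",
                   "release/3.15.x"):
        blocked = bool(PATTERN.match(branch))
        print("{}: {}".format(branch, "blocked" if blocked else "allowed"))
    # "main" and "release/3.15.x" are blocked; the two branches that follow
    # the convention are allowed.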
CHANGELOG.md (+53)

@@ -1,5 +1,58 @@
# Changelog

## [3.14.10](https://github.com/ynput/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/ynput/OpenPype/compare/3.14.9...HEAD)

**🆕 New features**

- Global | Nuke: Creator placeholders in workfile template builder [\#4266](https://github.com/ynput/OpenPype/pull/4266)
- Slack: Added dynamic message [\#4265](https://github.com/ynput/OpenPype/pull/4265)
- Blender: Workfile Loader [\#4234](https://github.com/ynput/OpenPype/pull/4234)
- Unreal: Publishing and Loading for UAssets [\#4198](https://github.com/ynput/OpenPype/pull/4198)
- Publish: register publishes without copying them [\#4157](https://github.com/ynput/OpenPype/pull/4157)

**🚀 Enhancements**

- General: Added install method with docstring to HostBase [\#4298](https://github.com/ynput/OpenPype/pull/4298)
- Traypublisher: simple editorial multiple edl [\#4248](https://github.com/ynput/OpenPype/pull/4248)
- General: Extend 'IPluginPaths' to have more available methods [\#4214](https://github.com/ynput/OpenPype/pull/4214)
- Refactorization of folder coloring [\#4211](https://github.com/ynput/OpenPype/pull/4211)
- Flame - loading multilayer with controlled layer names [\#4204](https://github.com/ynput/OpenPype/pull/4204)

**🐛 Bug fixes**

- Unreal: fix missing `maintained_selection` call [\#4300](https://github.com/ynput/OpenPype/pull/4300)
- Ftrack: Fix receive of host ip on MacOs [\#4288](https://github.com/ynput/OpenPype/pull/4288)
- SiteSync: sftp connection failing when shouldnt be tested [\#4278](https://github.com/ynput/OpenPype/pull/4278)
- Deadline: fix default value for passing mongo url [\#4275](https://github.com/ynput/OpenPype/pull/4275)
- Scene Manager: Fix variable name [\#4268](https://github.com/ynput/OpenPype/pull/4268)
- Slack: notification fails because of missing published path [\#4264](https://github.com/ynput/OpenPype/pull/4264)
- hiero: creator gui with min max [\#4257](https://github.com/ynput/OpenPype/pull/4257)
- NiceCheckbox: Fix checker positioning in Python 2 [\#4253](https://github.com/ynput/OpenPype/pull/4253)
- Publisher: Fix 'CreatorType' not equal for Python 2 DCCs [\#4249](https://github.com/ynput/OpenPype/pull/4249)
- Deadline: fix dependencies [\#4242](https://github.com/ynput/OpenPype/pull/4242)
- Houdini: hotfix instance data access [\#4236](https://github.com/ynput/OpenPype/pull/4236)
- bugfix/image plane load error [\#4222](https://github.com/ynput/OpenPype/pull/4222)
- Hiero: thumbnail from multilayer exr [\#4209](https://github.com/ynput/OpenPype/pull/4209)

**🔀 Refactored code**

- Resolve: Use qtpy in Resolve [\#4254](https://github.com/ynput/OpenPype/pull/4254)
- Houdini: Use qtpy in Houdini [\#4252](https://github.com/ynput/OpenPype/pull/4252)
- Max: Use qtpy in Max [\#4251](https://github.com/ynput/OpenPype/pull/4251)
- Maya: Use qtpy in Maya [\#4250](https://github.com/ynput/OpenPype/pull/4250)
- Hiero: Use qtpy in Hiero [\#4240](https://github.com/ynput/OpenPype/pull/4240)
- Nuke: Use qtpy in Nuke [\#4239](https://github.com/ynput/OpenPype/pull/4239)
- Flame: Use qtpy in flame [\#4238](https://github.com/ynput/OpenPype/pull/4238)
- General: Legacy io not used in global plugins [\#4134](https://github.com/ynput/OpenPype/pull/4134)

**Merged pull requests:**

- Bump json5 from 1.0.1 to 1.0.2 in /website [\#4292](https://github.com/ynput/OpenPype/pull/4292)
- Maya: Fix validate frame range repair + fix create render with deadline disabled [\#4279](https://github.com/ynput/OpenPype/pull/4279)

## [3.14.9](https://github.com/pypeclub/OpenPype/tree/3.14.9)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.8...3.14.9)
HISTORY.md (+52)

@@ -1,5 +1,57 @@
# Changelog

## [3.14.10](https://github.com/ynput/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/ynput/OpenPype/compare/3.14.9...3.14.10)

**🆕 New features**

- Global | Nuke: Creator placeholders in workfile template builder [\#4266](https://github.com/ynput/OpenPype/pull/4266)
- Slack: Added dynamic message [\#4265](https://github.com/ynput/OpenPype/pull/4265)
- Blender: Workfile Loader [\#4234](https://github.com/ynput/OpenPype/pull/4234)
- Unreal: Publishing and Loading for UAssets [\#4198](https://github.com/ynput/OpenPype/pull/4198)
- Publish: register publishes without copying them [\#4157](https://github.com/ynput/OpenPype/pull/4157)

**🚀 Enhancements**

- General: Added install method with docstring to HostBase [\#4298](https://github.com/ynput/OpenPype/pull/4298)
- Traypublisher: simple editorial multiple edl [\#4248](https://github.com/ynput/OpenPype/pull/4248)
- General: Extend 'IPluginPaths' to have more available methods [\#4214](https://github.com/ynput/OpenPype/pull/4214)
- Refactorization of folder coloring [\#4211](https://github.com/ynput/OpenPype/pull/4211)
- Flame - loading multilayer with controlled layer names [\#4204](https://github.com/ynput/OpenPype/pull/4204)

**🐛 Bug fixes**

- Unreal: fix missing `maintained_selection` call [\#4300](https://github.com/ynput/OpenPype/pull/4300)
- Ftrack: Fix receive of host ip on MacOs [\#4288](https://github.com/ynput/OpenPype/pull/4288)
- SiteSync: sftp connection failing when shouldnt be tested [\#4278](https://github.com/ynput/OpenPype/pull/4278)
- Deadline: fix default value for passing mongo url [\#4275](https://github.com/ynput/OpenPype/pull/4275)
- Scene Manager: Fix variable name [\#4268](https://github.com/ynput/OpenPype/pull/4268)
- Slack: notification fails because of missing published path [\#4264](https://github.com/ynput/OpenPype/pull/4264)
- hiero: creator gui with min max [\#4257](https://github.com/ynput/OpenPype/pull/4257)
- NiceCheckbox: Fix checker positioning in Python 2 [\#4253](https://github.com/ynput/OpenPype/pull/4253)
- Publisher: Fix 'CreatorType' not equal for Python 2 DCCs [\#4249](https://github.com/ynput/OpenPype/pull/4249)
- Deadline: fix dependencies [\#4242](https://github.com/ynput/OpenPype/pull/4242)
- Houdini: hotfix instance data access [\#4236](https://github.com/ynput/OpenPype/pull/4236)
- bugfix/image plane load error [\#4222](https://github.com/ynput/OpenPype/pull/4222)
- Hiero: thumbnail from multilayer exr [\#4209](https://github.com/ynput/OpenPype/pull/4209)

**🔀 Refactored code**

- Resolve: Use qtpy in Resolve [\#4254](https://github.com/ynput/OpenPype/pull/4254)
- Houdini: Use qtpy in Houdini [\#4252](https://github.com/ynput/OpenPype/pull/4252)
- Max: Use qtpy in Max [\#4251](https://github.com/ynput/OpenPype/pull/4251)
- Maya: Use qtpy in Maya [\#4250](https://github.com/ynput/OpenPype/pull/4250)
- Hiero: Use qtpy in Hiero [\#4240](https://github.com/ynput/OpenPype/pull/4240)
- Nuke: Use qtpy in Nuke [\#4239](https://github.com/ynput/OpenPype/pull/4239)
- Flame: Use qtpy in flame [\#4238](https://github.com/ynput/OpenPype/pull/4238)
- General: Legacy io not used in global plugins [\#4134](https://github.com/ynput/OpenPype/pull/4134)

**Merged pull requests:**

- Bump json5 from 1.0.1 to 1.0.2 in /website [\#4292](https://github.com/ynput/OpenPype/pull/4292)
- Maya: Fix validate frame range repair + fix create render with deadline disabled [\#4279](https://github.com/ynput/OpenPype/pull/4279)

## [3.14.9](https://github.com/pypeclub/OpenPype/tree/3.14.9)
@@ -76,6 +76,18 @@ class HostBase(object):
        pass

    def install(self):
        """Install host specific functionality.

        This is where a menu with tools should be added, callbacks
        registered and other host integration initialized.

        It is called automatically when 'openpype.pipeline.install_host'
        is triggered.
        """

        pass

    @property
    def log(self):
        if self._log is None:
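For context, a minimal sketch of how a host integration would use this new hook. The `MyHost` class and its body are hypothetical; `install_host` is the real entry point named in the docstring above:

    from openpype.host import HostBase
    from openpype.pipeline import install_host


    class MyHost(HostBase):
        # Hypothetical host integration, used only for illustration.
        name = "myhost"

        def install(self):
            # Called automatically by install_host(); build menus,
            # register callbacks and run other startup logic here.
            print("Building menu and registering callbacks...")


    # install_host() triggers MyHost.install() as part of host registration.
    install_host(MyHost())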
@@ -40,6 +40,7 @@ class Server(threading.Thread):
        # Create a TCP/IP socket
        self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)

        # Bind the socket to the port
        server_address = ("127.0.0.1", port)

@@ -91,7 +92,13 @@
            self.log.info("wait ttt")
            # Receive the data in small chunks and retransmit it
            request = None
            header = self.connection.recv(10)
            try:
                header = self.connection.recv(10)
            except OSError:
                # could happen on MacOS
                self.log.info("")
                break

            if len(header) == 0:
                # null data received, socket is closing.
                self.log.info(f"[{self.timestamp()}] Connection closing.")
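Setting `SO_REUSEADDR` before `bind()` lets the server restart immediately on the same port instead of failing with "Address already in use" while the previous socket lingers in TIME_WAIT. A standalone sketch of the same stdlib calls (port number is arbitrary here):

    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Without this flag a quick restart can raise
    # OSError: [Errno 98] Address already in use.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("127.0.0.1", 8079))
    sock.listen(1)
    sock.close()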
@@ -44,3 +44,6 @@ class CreateAnimation(plugin.Creator):
        # Default to not send to farm.
        self.data["farm"] = False
        self.data["priority"] = 50

        # Default to write normals.
        self.data["writeNormals"] = True
@@ -6,7 +6,7 @@ class CreateMultiverseUsd(plugin.Creator):

    name = "mvUsdMain"
    label = "Multiverse USD Asset"
    family = "mvUsd"
    family = "usd"
    icon = "cubes"

    def __init__(self, *args, **kwargs):
@@ -1,5 +1,7 @@
# -*- coding: utf-8 -*-
import maya.cmds as cmds
from maya import mel
import os

from openpype.pipeline import (
    load,

@@ -11,12 +13,13 @@ from openpype.hosts.maya.api.lib import (
    unique_namespace
)
from openpype.hosts.maya.api.pipeline import containerise
from openpype.client import get_representation_by_id


class MultiverseUsdLoader(load.LoaderPlugin):
    """Read USD data in a Multiverse Compound"""

    families = ["model", "mvUsd", "mvUsdComposition", "mvUsdOverride",
    families = ["model", "usd", "mvUsdComposition", "mvUsdOverride",
                "pointcache", "animation"]
    representations = ["usd", "usda", "usdc", "usdz", "abc"]

@@ -26,7 +29,6 @@ class MultiverseUsdLoader(load.LoaderPlugin):
    color = "orange"

    def load(self, context, name=None, namespace=None, options=None):

        asset = context['asset']['name']
        namespace = namespace or unique_namespace(
            asset + "_",

@@ -34,22 +36,20 @@
            suffix="_",
        )

        # Create the shape
        # Make sure we can load the plugin
        cmds.loadPlugin("MultiverseForMaya", quiet=True)
        import multiverse

        # Create the shape
        shape = None
        transform = None
        with maintained_selection():
            cmds.namespace(addNamespace=namespace)
            with namespaced(namespace, new=False):
                import multiverse
                shape = multiverse.CreateUsdCompound(self.fname)
                transform = cmds.listRelatives(
                    shape, parent=True, fullPath=True)[0]

        # Lock the shape node so the user cannot delete it.
        cmds.lockNode(shape, lock=True)

        nodes = [transform, shape]
        self[:] = nodes

@@ -70,15 +70,34 @@
        shapes = cmds.ls(members, type="mvUsdCompoundShape")
        assert shapes, "Cannot find mvUsdCompoundShape in container"

        path = get_representation_path(representation)
        project_name = representation["context"]["project"]["name"]
        prev_representation_id = cmds.getAttr("{}.representation".format(node))
        prev_representation = get_representation_by_id(project_name,
                                                       prev_representation_id)
        prev_path = os.path.normpath(prev_representation["data"]["path"])

        # Make sure we can load the plugin
        cmds.loadPlugin("MultiverseForMaya", quiet=True)
        import multiverse

        for shape in shapes:
            multiverse.SetUsdCompoundAssetPaths(shape, [path])

            asset_paths = multiverse.GetUsdCompoundAssetPaths(shape)
            asset_paths = [os.path.normpath(p) for p in asset_paths]

            assert asset_paths.count(prev_path) == 1, \
                "Couldn't find matching path (or too many)"
            prev_path_idx = asset_paths.index(prev_path)

            path = get_representation_path(representation)
            asset_paths[prev_path_idx] = path

            multiverse.SetUsdCompoundAssetPaths(shape, asset_paths)

        cmds.setAttr("{}.representation".format(node),
                     str(representation["_id"]),
                     type="string")
        mel.eval('refreshEditorTemplates;')

    def switch(self, container, representation):
        self.update(container, representation)
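The core of the new `update()` is a pure list operation: find the previously loaded path among the compound's asset paths and swap in the new one. Isolated from Maya and Multiverse, the logic looks like this (an illustrative sketch; the paths are made up):

    import os

    def swap_asset_path(asset_paths, prev_path, new_path):
        """Replace the previously loaded path in a compound's path list."""
        normalized = [os.path.normpath(p) for p in asset_paths]
        prev_path = os.path.normpath(prev_path)
        # The update only makes sense when exactly one entry matches.
        assert normalized.count(prev_path) == 1, \
            "Couldn't find matching path (or too many)"
        normalized[normalized.index(prev_path)] = new_path
        return normalized

    paths = swap_asset_path(
        ["/proj/usd/asset_v001.usd", "/proj/usd/layer.usda"],
        "/proj/usd/asset_v001.usd",
        "/proj/usd/asset_v002.usd")
    print(paths)  # ['/proj/usd/asset_v002.usd', '/proj/usd/layer.usda']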
openpype/hosts/maya/plugins/load/load_multiverse_usd_over.py (new file, +132)

@@ -0,0 +1,132 @@
# -*- coding: utf-8 -*-
import maya.cmds as cmds
from maya import mel
import os

import qargparse

from openpype.pipeline import (
    load,
    get_representation_path
)
from openpype.hosts.maya.api.lib import (
    maintained_selection
)
from openpype.hosts.maya.api.pipeline import containerise
from openpype.client import get_representation_by_id


class MultiverseUsdOverLoader(load.LoaderPlugin):
    """Reference file"""

    families = ["mvUsdOverride"]
    representations = ["usda", "usd", "udsz"]

    label = "Load Usd Override into Compound"
    order = -10
    icon = "code-fork"
    color = "orange"

    options = [
        qargparse.String(
            "Which Compound",
            label="Compound",
            help="Select which compound to add this as a layer to."
        )
    ]

    def load(self, context, name=None, namespace=None, options=None):
        current_usd = cmds.ls(selection=True,
                              type="mvUsdCompoundShape",
                              dag=True,
                              long=True)
        if len(current_usd) != 1:
            self.log.error("Current selection invalid: '{}', "
                           "must contain exactly 1 mvUsdCompoundShape."
                           "".format(current_usd))
            return

        # Make sure we can load the plugin
        cmds.loadPlugin("MultiverseForMaya", quiet=True)
        import multiverse

        nodes = current_usd
        with maintained_selection():
            multiverse.AddUsdCompoundAssetPath(current_usd[0], self.fname)

        namespace = current_usd[0].split("|")[1].split(":")[0]

        container = containerise(
            name=name,
            namespace=namespace,
            nodes=nodes,
            context=context,
            loader=self.__class__.__name__)

        cmds.addAttr(container, longName="mvUsdCompoundShape",
                     niceName="mvUsdCompoundShape", dataType="string")
        cmds.setAttr(container + ".mvUsdCompoundShape",
                     current_usd[0], type="string")

        return container

    def update(self, container, representation):
        # type: (dict, dict) -> None
        """Update container with specified representation."""

        cmds.loadPlugin("MultiverseForMaya", quiet=True)
        import multiverse

        node = container['objectName']
        assert cmds.objExists(node), "Missing container"

        members = cmds.sets(node, query=True) or []
        shapes = cmds.ls(members, type="mvUsdCompoundShape")
        assert shapes, "Cannot find mvUsdCompoundShape in container"

        mvShape = container['mvUsdCompoundShape']
        assert mvShape, "Missing mv source"

        project_name = representation["context"]["project"]["name"]
        prev_representation_id = cmds.getAttr("{}.representation".format(node))
        prev_representation = get_representation_by_id(project_name,
                                                       prev_representation_id)
        prev_path = os.path.normpath(prev_representation["data"]["path"])

        path = get_representation_path(representation)

        for shape in shapes:
            asset_paths = multiverse.GetUsdCompoundAssetPaths(shape)
            asset_paths = [os.path.normpath(p) for p in asset_paths]

            assert asset_paths.count(prev_path) == 1, \
                "Couldn't find matching path (or too many)"
            prev_path_idx = asset_paths.index(prev_path)
            asset_paths[prev_path_idx] = path
            multiverse.SetUsdCompoundAssetPaths(shape, asset_paths)

        cmds.setAttr("{}.representation".format(node),
                     str(representation["_id"]),
                     type="string")
        mel.eval('refreshEditorTemplates;')

    def switch(self, container, representation):
        self.update(container, representation)

    def remove(self, container):
        # type: (dict) -> None
        """Remove loaded container."""
        # Delete container and its contents
        if cmds.objExists(container['objectName']):
            members = cmds.sets(container['objectName'], query=True) or []
            cmds.delete([container['objectName']] + members)

        # Remove the namespace, if empty
        namespace = container['namespace']
        if cmds.namespace(exists=namespace):
            members = cmds.namespaceInfo(namespace, listNamespace=True)
            if not members:
                cmds.namespace(removeNamespace=namespace)
            else:
                self.log.warning("Namespace not deleted because it "
                                 "still has members: %s", namespace)
@@ -26,7 +26,8 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
                "rig",
                "camerarig",
                "xgen",
                "staticMesh"]
                "staticMesh",
                "mvLook"]

    representations = ["ma", "abc", "fbx", "mb"]

    label = "Reference"
@@ -74,13 +74,6 @@ class CollectInstances(pyblish.api.ContextPlugin):
        objectset = cmds.ls("*.id", long=True, type="objectSet",
                            recursive=True, objectsOnly=True)

        ctx_frame_start = context.data['frameStart']
        ctx_frame_end = context.data['frameEnd']
        ctx_handle_start = context.data['handleStart']
        ctx_handle_end = context.data['handleEnd']
        ctx_frame_start_handle = context.data['frameStartHandle']
        ctx_frame_end_handle = context.data['frameEndHandle']

        context.data['objectsets'] = objectset
        for objset in objectset:

@@ -156,31 +149,20 @@
            # Append start frame and end frame to label if present
            if "frameStart" and "frameEnd" in data:

                # if frame range on maya set is the same as full shot range
                # adjust the values to match the asset data
                if (ctx_frame_start_handle == data["frameStart"]
                        and ctx_frame_end_handle == data["frameEnd"]):  # noqa: W503, E501
                    data["frameStartHandle"] = ctx_frame_start_handle
                    data["frameEndHandle"] = ctx_frame_end_handle
                    data["frameStart"] = ctx_frame_start
                    data["frameEnd"] = ctx_frame_end
                    data["handleStart"] = ctx_handle_start
                    data["handleEnd"] = ctx_handle_end

                # if there are user values on start and end frame not matching
                # the asset, use them

                else:
                    if "handles" in data:
                        data["handleStart"] = data["handles"]
                        data["handleEnd"] = data["handles"]

                    data["frameStartHandle"] = data["frameStart"] - data["handleStart"]  # noqa: E501
                    data["frameEndHandle"] = data["frameEnd"] + data["handleEnd"]  # noqa: E501

                # Backwards compatibility for 'handles' data
                if "handles" in data:
                    data["handleStart"] = data["handles"]
                    data["handleEnd"] = data["handles"]
                    data.pop('handles')

                # Take handles from context if not set locally on the instance
                for key in ["handleStart", "handleEnd"]:
                    if key not in data:
                        data[key] = context.data[key]

                data["frameStartHandle"] = data["frameStart"] - data["handleStart"]  # noqa: E501
                data["frameEndHandle"] = data["frameEnd"] + data["handleEnd"]  # noqa: E501

                label += " [{0}-{1}]".format(int(data["frameStartHandle"]),
                                             int(data["frameEndHandle"]))
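The replacement logic above is easier to see in isolation: legacy 'handles' data is migrated first, missing handle keys fall back to the context, and the handle-inclusive range is derived last. A minimal sketch with plain dicts (the values are hypothetical):

    def resolve_frame_range(data, context_data):
        """Sketch of the collector's handle/frame-range resolution."""
        # Backwards compatibility for 'handles' data
        if "handles" in data:
            data["handleStart"] = data["handles"]
            data["handleEnd"] = data["handles"]
            data.pop("handles")

        # Take handles from context if not set locally on the instance
        for key in ("handleStart", "handleEnd"):
            if key not in data:
                data[key] = context_data[key]

        data["frameStartHandle"] = data["frameStart"] - data["handleStart"]
        data["frameEndHandle"] = data["frameEnd"] + data["handleEnd"]
        return data

    print(resolve_frame_range(
        {"frameStart": 1001, "frameEnd": 1100, "handles": 5},
        {"handleStart": 10, "handleEnd": 10}))
    # handleStart/handleEnd become 5 (from the legacy key), so the result
    # has frameStartHandle 996 and frameEndHandle 1105.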
@@ -440,7 +440,8 @@ class CollectLook(pyblish.api.InstancePlugin):
            for res in self.collect_resources(n):
                instance.data["resources"].append(res)

        self.log.info("Collected resources: {}".format(instance.data["resources"]))
        self.log.info("Collected resources: {}".format(
            instance.data["resources"]))

        # Log warning when no relevant sets were retrieved for the look.
        if (

@@ -548,6 +549,11 @@
            if not cmds.attributeQuery(attr, node=node, exists=True):
                continue
            attribute = "{}.{}".format(node, attr)
            # We don't support mixed-type attributes yet.
            if cmds.attributeQuery(attr, node=node, multi=True):
                self.log.warning("Attribute '{}' is mixed-type and is "
                                 "not supported yet.".format(attribute))
                continue
            if cmds.getAttr(attribute, type=True) == "message":
                continue
            node_attributes[attr] = cmds.getAttr(attribute)
@@ -21,37 +21,68 @@ COLOUR_SPACES = ['sRGB', 'linear', 'auto']
MIPMAP_EXTENSIONS = ['tdl']


def get_look_attrs(node):
    """Returns attributes of a node that are important for the look.
class _NodeTypeAttrib(object):
    """docstring for _NodeType"""

    These are the "changed" attributes (those that have edits applied
    in the current scene).
    def __init__(self, name, fname, computed_fname=None, colour_space=None):
        self.name = name
        self.fname = fname
        self.computed_fname = computed_fname or fname
        self.colour_space = colour_space or "colorSpace"

    Returns:
        list: Attribute names to extract
    def get_fname(self, node):
        return "{}.{}".format(node, self.fname)

    def get_computed_fname(self, node):
        return "{}.{}".format(node, self.computed_fname)

    def get_colour_space(self, node):
        return "{}.{}".format(node, self.colour_space)

    def __str__(self):
        return ("_NodeTypeAttrib(name={}, fname={}, "
                "computed_fname={}, colour_space={})".format(
                    self.name, self.fname, self.computed_fname,
                    self.colour_space))


NODETYPES = {
    "file": [_NodeTypeAttrib("file", "fileTextureName",
                             "computedFileTextureNamePattern")],
    "aiImage": [_NodeTypeAttrib("aiImage", "filename")],
    "RedshiftNormalMap": [_NodeTypeAttrib("RedshiftNormalMap", "tex0")],
    "dlTexture": [_NodeTypeAttrib("dlTexture", "textureFile",
                                  None, "textureFile_meta_colorspace")],
    "dlTriplanar": [_NodeTypeAttrib("dlTriplanar", "colorTexture",
                                    None, "colorTexture_meta_colorspace"),
                    _NodeTypeAttrib("dlTriplanar", "floatTexture",
                                    None, "floatTexture_meta_colorspace"),
                    _NodeTypeAttrib("dlTriplanar", "heightTexture",
                                    None, "heightTexture_meta_colorspace")]
}


def get_file_paths_for_node(node):
    """Gets all the file paths in this node.

    Returns all filepaths that this node references. Some node types only
    reference one, but others, like dlTriplanar, can reference 3.

    Args:
        node (str): Name of the Maya node

    Returns:
        list(str): A list with all evaluated maya attributes for filepaths.
    """
    # When referenced get only attributes that are "changed since file open"
    # which includes any reference edits, otherwise take *all* user defined
    # attributes
    is_referenced = cmds.referenceQuery(node, isNodeReferenced=True)
    result = cmds.listAttr(node, userDefined=True,
                           changedSinceFileOpen=is_referenced) or []

    # `cbId` is added when a scene is saved, ignore by default
    if "cbId" in result:
        result.remove("cbId")
    node_type = cmds.nodeType(node)
    if node_type not in NODETYPES:
        return []

    # For shapes allow render stat changes
    if cmds.objectType(node, isAType="shape"):
        attrs = cmds.listAttr(node, changedSinceFileOpen=True) or []
        for attr in attrs:
            if attr in SHAPE_ATTRS:
                result.append(attr)
            elif attr.startswith('ai'):
                result.append(attr)

    return result
    paths = []
    for node_type_attr in NODETYPES[node_type]:
        fname = cmds.getAttr("{}.{}".format(node, node_type_attr.fname))
        paths.append(fname)
    return paths


def node_uses_image_sequence(node):
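The `_NodeTypeAttrib` entries are just attribute-name templates: given a node name they produce the plug strings that are later read with `cmds.getAttr`. A Maya-free sketch of how the mapping is consumed, reusing the `_NodeTypeAttrib` class and `NODETYPES` dict defined above (the node name is hypothetical):

    node = "wood_file1"      # hypothetical file node
    node_type = "file"       # would normally come from cmds.nodeType(node)

    for node_type_attr in NODETYPES[node_type]:
        print(node_type_attr.get_fname(node))
        # -> wood_file1.fileTextureName
        print(node_type_attr.get_computed_fname(node))
        # -> wood_file1.computedFileTextureNamePattern
        print(node_type_attr.get_colour_space(node))
        # -> wood_file1.colorSpace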
@@ -69,13 +100,29 @@ def node_uses_image_sequence(node):
    """

    # useFrameExtension indicates an explicit image sequence
    node_path = get_file_node_path(node).lower()
    paths = get_file_node_paths(node)
    paths = [path.lower() for path in paths]

    # The following tokens imply a sequence
    patterns = ["<udim>", "<tile>", "<uvtile>", "u<u>_v<v>", "<frame0"]

    return (cmds.getAttr('%s.useFrameExtension' % node) or
            any(pattern in node_path for pattern in patterns))
    def pattern_in_paths(patterns, paths):
        """Helper function for checking to see if a pattern is contained
        in the list of paths"""
        for pattern in patterns:
            for path in paths:
                if pattern in path:
                    return True
        return False

    node_type = cmds.nodeType(node)
    if node_type == 'dlTexture':
        return (cmds.getAttr('{}.useImageSequence'.format(node)) or
                pattern_in_paths(patterns, paths))
    elif node_type == "file":
        return (cmds.getAttr('{}.useFrameExtension'.format(node)) or
                pattern_in_paths(patterns, paths))
    return False


def seq_to_glob(path):
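`pattern_in_paths` is a plain double loop, so it can be exercised without Maya. With the sequence tokens listed above, a `<udim>` path is detected and a plain path is not (the texture paths are made up):

    def pattern_in_paths(patterns, paths):
        """Return True if any pattern occurs in any of the paths."""
        for pattern in patterns:
            for path in paths:
                if pattern in path:
                    return True
        return False

    patterns = ["<udim>", "<tile>", "<uvtile>", "u<u>_v<v>", "<frame0"]
    print(pattern_in_paths(patterns, ["/tex/wood_<udim>.exr".lower()]))  # True
    print(pattern_in_paths(patterns, ["/tex/wood_1001.exr"]))            # False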
@@ -132,7 +179,7 @@ def seq_to_glob(path):
        return path


def get_file_node_path(node):
def get_file_node_paths(node):
    """Get the file path used by a Maya file node.

    Args:

@@ -158,15 +205,9 @@
                    "<uvtile>"]
        lower = texture_pattern.lower()
        if any(pattern in lower for pattern in patterns):
            return texture_pattern
            return [texture_pattern]

    if cmds.nodeType(node) == 'aiImage':
        return cmds.getAttr('{0}.filename'.format(node))
    if cmds.nodeType(node) == 'RedshiftNormalMap':
        return cmds.getAttr('{}.tex0'.format(node))

    # otherwise use fileTextureName
    return cmds.getAttr('{0}.fileTextureName'.format(node))
    return get_file_paths_for_node(node)


def get_file_node_files(node):
@@ -181,15 +222,15 @@ def get_file_node_files(node):

    """

    path = get_file_node_path(node)
    path = cmds.workspace(expandName=path)
    paths = get_file_node_paths(node)
    paths = [cmds.workspace(expandName=path) for path in paths]
    if node_uses_image_sequence(node):
        glob_pattern = seq_to_glob(path)
        return glob.glob(glob_pattern)
    elif os.path.exists(path):
        return [path]
        globs = []
        for path in paths:
            globs += glob.glob(seq_to_glob(path))
        return globs
    else:
        return []
        return list(filter(lambda x: os.path.exists(x), paths))
@@ -211,6 +252,11 @@ def is_mipmap(fname):
class CollectMultiverseLookData(pyblish.api.InstancePlugin):
    """Collect Multiverse Look

    Searches through the overrides finding all material overrides. From there
    it extracts the shading group and then finds all texture files in the
    shading group network. It also checks for mipmap versions of texture files
    and adds them to the resources to get published.

    """

    order = pyblish.api.CollectorOrder + 0.2
@@ -258,12 +304,20 @@ class CollectMultiverseLookData(pyblish.api.InstancePlugin):
                        shadingGroup), "members": list()}

                    # The SG may reference files, add those too!
                    history = cmds.listHistory(shadingGroup)
                    files = cmds.ls(history, type="file", long=True)
                    history = cmds.listHistory(
                        shadingGroup, allConnections=True)

                    # We need to iterate over node_types since `cmds.ls` may
                    # error out if we don't have the appropriate plugin loaded.
                    files = []
                    for node_type in NODETYPES.keys():
                        files += cmds.ls(history,
                                         type=node_type,
                                         long=True)

                    for f in files:
                        resources = self.collect_resource(f, publishMipMap)
                        instance.data["resources"].append(resources)
                        instance.data["resources"] += resources

                elif isinstance(matOver, multiverse.MaterialSourceUsdPath):
                    # TODO: Handle this later.
@@ -284,69 +338,63 @@ class CollectMultiverseLookData(pyblish.api.InstancePlugin):
            dict
        """

        self.log.debug("processing: {}".format(node))
        if cmds.nodeType(node) not in ["file", "aiImage", "RedshiftNormalMap"]:
            self.log.error(
                "Unsupported file node: {}".format(cmds.nodeType(node)))
        node_type = cmds.nodeType(node)
        self.log.debug("processing: {}/{}".format(node, node_type))

        if node_type not in NODETYPES:
            self.log.error("Unsupported file node: {}".format(node_type))
            raise AssertionError("Unsupported file node")

        if cmds.nodeType(node) == 'file':
            self.log.debug(" - file node")
            attribute = "{}.fileTextureName".format(node)
            computed_attribute = "{}.computedFileTextureNamePattern".format(
                node)
        elif cmds.nodeType(node) == 'aiImage':
            self.log.debug("aiImage node")
            attribute = "{}.filename".format(node)
            computed_attribute = attribute
        elif cmds.nodeType(node) == 'RedshiftNormalMap':
            self.log.debug("RedshiftNormalMap node")
            attribute = "{}.tex0".format(node)
            computed_attribute = attribute
        resources = []
        for node_type_attr in NODETYPES[node_type]:
            fname_attrib = node_type_attr.get_fname(node)
            computed_fname_attrib = node_type_attr.get_computed_fname(node)
            colour_space_attrib = node_type_attr.get_colour_space(node)

        source = cmds.getAttr(attribute)
        self.log.info(" - file source: {}".format(source))
        color_space_attr = "{}.colorSpace".format(node)
        try:
            color_space = cmds.getAttr(color_space_attr)
        except ValueError:
            # node doesn't have colorspace attribute
            source = cmds.getAttr(fname_attrib)
            color_space = "Raw"
        # Compare with the computed file path, e.g. the one with the <UDIM>
        # pattern in it, to generate some logging information about this
        # difference
        # computed_attribute = "{}.computedFileTextureNamePattern".format(node)
        computed_source = cmds.getAttr(computed_attribute)
        if source != computed_source:
            self.log.debug("Detected computed file pattern difference "
                           "from original pattern: {0} "
                           "({1} -> {2})".format(node,
                                                 source,
                                                 computed_source))
            try:
                color_space = cmds.getAttr(colour_space_attrib)
            except ValueError:
                # node doesn't have colorspace attribute, use "Raw" from before
                pass
            # Compare with the computed file path, e.g. the one with the <UDIM>
            # pattern in it, to generate some logging information about this
            # difference
            # computed_attribute = "{}.computedFileTextureNamePattern".format(node)  # noqa
            computed_source = cmds.getAttr(computed_fname_attrib)
            if source != computed_source:
                self.log.debug("Detected computed file pattern difference "
                               "from original pattern: {0} "
                               "({1} -> {2})".format(node,
                                                     source,
                                                     computed_source))

        # We replace backslashes with forward slashes because V-Ray
        # can't handle the UDIM files with the backslashes in the
        # paths as the computed patterns
        source = source.replace("\\", "/")
            # We replace backslashes with forward slashes because V-Ray
            # can't handle the UDIM files with the backslashes in the
            # paths as the computed patterns
            source = source.replace("\\", "/")

        files = get_file_node_files(node)
        files = self.handle_files(files, publishMipMap)
        if len(files) == 0:
            self.log.error("No valid files found from node `%s`" % node)
            files = get_file_node_files(node)
            files = self.handle_files(files, publishMipMap)
            if len(files) == 0:
                self.log.error("No valid files found from node `%s`" % node)

        self.log.info("collection of resource done:")
        self.log.info(" - node: {}".format(node))
        self.log.info(" - attribute: {}".format(attribute))
        self.log.info(" - source: {}".format(source))
        self.log.info(" - file: {}".format(files))
        self.log.info(" - color space: {}".format(color_space))
            self.log.info("collection of resource done:")
            self.log.info(" - node: {}".format(node))
            self.log.info(" - attribute: {}".format(fname_attrib))
            self.log.info(" - source: {}".format(source))
            self.log.info(" - file: {}".format(files))
            self.log.info(" - color space: {}".format(color_space))

        # Define the resource
        return {"node": node,
                "attribute": attribute,
                "source": source,  # required for resources
                "files": files,
                "color_space": color_space}  # required for resources
            # Define the resource
            resource = {"node": node,
                        "attribute": fname_attrib,
                        "source": source,  # required for resources
                        "files": files,
                        "color_space": color_space}  # required for resources
            resources.append(resource)
        return resources

    def handle_files(self, files, publishMipMap):
        """This will go through all the files and make sure that they are
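The net effect of the rewrite: `collect_resource` now returns a list of resource dicts, one per `_NodeTypeAttrib`, instead of a single dict, which is why the caller switched from `append` to `+=`. A sketch of the returned shape (all values hypothetical):

    # One entry per file attribute on the node; a dlTriplanar node would
    # yield up to three of these.
    resources = [
        {
            "node": "wood_file1",                      # hypothetical node
            "attribute": "wood_file1.fileTextureName",
            "source": "/tex/wood_<udim>.exr",          # required for resources
            "files": ["/tex/wood_1001.exr", "/tex/wood_1002.exr"],
            "color_space": "sRGB",                     # required for resources
        },
    ]

    instance_data = {"resources": []}
    instance_data["resources"] += resources  # list concat, not append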
openpype/hosts/maya/plugins/publish/extract_import_reference.py (new file, +152)

@@ -0,0 +1,152 @@
import os
import sys

from maya import cmds

import pyblish.api
import tempfile

from openpype.lib import run_subprocess
from openpype.pipeline import publish
from openpype.hosts.maya.api import lib


class ExtractImportReference(publish.Extractor):
    """
    Extract the scene with imported reference.
    The temp scene with imported reference is
    published for rendering if this extractor is activated

    """

    label = "Extract Import Reference"
    order = pyblish.api.ExtractorOrder - 0.48
    hosts = ["maya"]
    families = ["renderlayer", "workfile"]
    optional = True
    tmp_format = "_tmp"

    @classmethod
    def apply_settings(cls, project_setting, system_settings):
        cls.active = project_setting["deadline"]["publish"]["MayaSubmitDeadline"]["import_reference"]  # noqa

    def process(self, instance):
        ext_mapping = (
            instance.context.data["project_settings"]["maya"]["ext_mapping"]
        )
        if ext_mapping:
            self.log.info("Looking in settings for scene type ...")
            # use extension mapping for first family found
            for family in self.families:
                try:
                    self.scene_type = ext_mapping[family]
                    self.log.info(
                        "Using {} as scene type".format(self.scene_type))
                    break

                except KeyError:
                    # set scene type to ma
                    self.scene_type = "ma"

        _scene_type = ("mayaAscii"
                       if self.scene_type == "ma"
                       else "mayaBinary")

        dir_path = self.staging_dir(instance)
        # named the file with imported reference
        if instance.name == "Main":
            return
        tmp_name = instance.name + self.tmp_format
        current_name = cmds.file(query=True, sceneName=True)
        ref_scene_name = "{0}.{1}".format(tmp_name, self.scene_type)

        reference_path = os.path.join(dir_path, ref_scene_name)
        tmp_path = os.path.dirname(current_name) + "/" + ref_scene_name

        self.log.info("Performing extraction..")

        # This generates script for mayapy to take care of reference
        # importing outside current session. It is passing current scene
        # name and destination scene name.
        script = ("""
# -*- coding: utf-8 -*-
'''Script to import references to given scene.'''
import maya.standalone
maya.standalone.initialize()
# scene names filled by caller
current_name = "{current_name}"
ref_scene_name = "{ref_scene_name}"
print(">>> Opening {{}} ...".format(current_name))
cmds.file(current_name, open=True, force=True)
print(">>> Processing references")
all_reference = cmds.file(q=True, reference=True) or []
for ref in all_reference:
    if cmds.referenceQuery(ref, il=True):
        cmds.file(ref, importReference=True)

        nested_ref = cmds.file(q=True, reference=True)
        if nested_ref:
            for new_ref in nested_ref:
                if new_ref not in all_reference:
                    all_reference.append(new_ref)

print(">>> Finish importing references")
print(">>> Saving scene as {{}}".format(ref_scene_name))

cmds.file(rename=ref_scene_name)
cmds.file(save=True, force=True)
print("*** Done")
        """).format(current_name=current_name, ref_scene_name=tmp_path)
        mayapy_exe = os.path.join(os.getenv("MAYA_LOCATION"), "bin", "mayapy")
        if sys.platform == "windows":
            mayapy_exe += ".exe"
        mayapy_exe = os.path.normpath(mayapy_exe)
        # can't use TemporaryNamedFile as that can't be opened in another
        # process until handles are closed by context manager.
        with tempfile.TemporaryDirectory() as tmp_dir_name:
            tmp_script_path = os.path.join(tmp_dir_name, "import_ref.py")
            self.log.info("Using script file: {}".format(tmp_script_path))
            with open(tmp_script_path, "wt") as tmp:
                tmp.write(script)

            try:
                run_subprocess([mayapy_exe, tmp_script_path])
            except Exception:
                self.log.error("Import reference failed", exc_info=True)
                raise

        with lib.maintained_selection():
            cmds.select(all=True, noExpand=True)
            cmds.file(reference_path,
                      force=True,
                      typ=_scene_type,
                      exportSelected=True,
                      channels=True,
                      constraints=True,
                      shader=True,
                      expressions=True,
                      constructionHistory=True)

        instance.context.data["currentFile"] = tmp_path

        if "files" not in instance.data:
            instance.data["files"] = []
        instance.data["files"].append(ref_scene_name)

        if instance.data.get("representations") is None:
            instance.data["representations"] = []

        ref_representation = {
            "name": self.scene_type,
            "ext": self.scene_type,
            "files": ref_scene_name,
            "stagingDir": os.path.dirname(current_name),
            "outputName": "imported"
        }
        self.log.info("%s" % ref_representation)

        instance.data["representations"].append(ref_representation)

        self.log.info("Extracted instance '%s' to : '%s'" % (ref_scene_name,
                                                             reference_path))
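The pattern used here (write a throwaway script into a `TemporaryDirectory`, run it with `mayapy`, let the context manager clean up) is reusable on its own. A stripped-down sketch with the stdlib's `subprocess` standing in for `openpype.lib.run_subprocess`; the script body and paths are placeholders:

    import os
    import subprocess
    import tempfile

    script = 'print("hello from mayapy")'  # placeholder batch script

    # NamedTemporaryFile can't be reopened by a child process on Windows
    # while still open here, hence a directory plus a plain file.
    with tempfile.TemporaryDirectory() as tmp_dir_name:
        tmp_script_path = os.path.join(tmp_dir_name, "import_ref.py")
        with open(tmp_script_path, "wt") as tmp:
            tmp.write(script)

        mayapy_exe = os.path.join(
            os.getenv("MAYA_LOCATION", ""), "bin", "mayapy")
        subprocess.check_call([mayapy_exe, tmp_script_path])
    # The directory (and the script in it) are removed when the block exits.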
@@ -73,12 +73,12 @@ class ExtractMultiverseLook(publish.Extractor):
            "writeAll": False,
            "writeTransforms": False,
            "writeVisibility": False,
            "writeAttributes": False,
            "writeAttributes": True,
            "writeMaterials": True,
            "writeVariants": False,
            "writeVariantsDefinition": False,
            "writeActiveState": False,
            "writeNamespaces": False,
            "writeNamespaces": True,
            "numTimeSamples": 1,
            "timeSamplesSpan": 0.0
        }
@@ -2,7 +2,9 @@ import os
import six

from maya import cmds
from maya import mel

import pyblish.api
from openpype.pipeline import publish
from openpype.hosts.maya.api.lib import maintained_selection

@@ -26,7 +28,7 @@ class ExtractMultiverseUsd(publish.Extractor):

    label = "Extract Multiverse USD Asset"
    hosts = ["maya"]
    families = ["mvUsd"]
    families = ["usd"]
    scene_type = "usd"
    file_formats = ["usd", "usda", "usdz"]
@@ -87,7 +89,7 @@ class ExtractMultiverseUsd(publish.Extractor):
        return {
            "stripNamespaces": False,
            "mergeTransformAndShape": False,
            "writeAncestors": True,
            "writeAncestors": False,
            "flattenParentXforms": False,
            "writeSparseOverrides": False,
            "useMetaPrimPath": False,
@@ -147,7 +149,15 @@ class ExtractMultiverseUsd(publish.Extractor):

        return options

    def get_default_options(self):
        self.log.info("ExtractMultiverseUsd get_default_options")
        return self.default_options

    def filter_members(self, members):
        return members

    def process(self, instance):

        # Load plugin first
        cmds.loadPlugin("MultiverseForMaya", quiet=True)

@@ -161,7 +171,7 @@
        file_path = file_path.replace('\\', '/')

        # Parse export options
        options = self.default_options
        options = self.get_default_options()
        options = self.parse_overrides(instance, options)
        self.log.info("Export options: {0}".format(options))
@@ -170,27 +180,35 @@ class ExtractMultiverseUsd(publish.Extractor):

        with maintained_selection():
            members = instance.data("setMembers")
            self.log.info('Collected object {}'.format(members))
            self.log.info('Collected objects: {}'.format(members))
            members = self.filter_members(members)
            if not members:
                self.log.error('No members!')
                return
            self.log.info(' - filtered: {}'.format(members))

            import multiverse

            time_opts = None
            frame_start = instance.data['frameStart']
            frame_end = instance.data['frameEnd']
            handle_start = instance.data['handleStart']
            handle_end = instance.data['handleEnd']
            step = instance.data['step']
            fps = instance.data['fps']
            if frame_end != frame_start:
                time_opts = multiverse.TimeOptions()

                time_opts.writeTimeRange = True

                handle_start = instance.data['handleStart']
                handle_end = instance.data['handleEnd']

                time_opts.frameRange = (
                    frame_start - handle_start, frame_end + handle_end)
                time_opts.frameIncrement = step
                time_opts.numTimeSamples = instance.data["numTimeSamples"]
                time_opts.timeSamplesSpan = instance.data["timeSamplesSpan"]
                time_opts.framePerSecond = fps
                time_opts.frameIncrement = instance.data['step']
                time_opts.numTimeSamples = instance.data.get(
                    'numTimeSamples', options['numTimeSamples'])
                time_opts.timeSamplesSpan = instance.data.get(
                    'timeSamplesSpan', options['timeSamplesSpan'])
                time_opts.framePerSecond = instance.data.get(
                    'fps', mel.eval('currentTimeUnitToFPS()'))

            asset_write_opts = multiverse.AssetWriteOptions(time_opts)
            options_discard_keys = {
@@ -203,11 +221,15 @@ class ExtractMultiverseUsd(publish.Extractor):
                'step',
                'fps'
            }
            self.log.debug("Write Options:")
            for key, value in options.items():
                if key in options_discard_keys:
                    continue

                self.log.debug(" - {}={}".format(key, value))
                setattr(asset_write_opts, key, value)

            self.log.info('WriteAsset: {} / {}'.format(file_path, members))
            multiverse.WriteAsset(file_path, members, asset_write_opts)

        if "representations" not in instance.data:
@@ -223,3 +245,33 @@ class ExtractMultiverseUsd(publish.Extractor):

        self.log.info("Extracted instance {} to {}".format(
            instance.name, file_path))


class ExtractMultiverseUsdAnim(ExtractMultiverseUsd):
    """Extractor for Multiverse USD Animation Sparse Cache data.

    This will extract the sparse cache data from the scene and generate a
    USD file with all the animation data.

    Upon publish a .usd sparse cache will be written.
    """
    label = "Extract Multiverse USD Animation Sparse Cache"
    families = ["animation", "usd"]
    match = pyblish.api.Subset

    def get_default_options(self):
        anim_options = self.default_options
        anim_options["writeSparseOverrides"] = True
        anim_options["writeUsdAttributes"] = True
        anim_options["stripNamespaces"] = True
        return anim_options

    def filter_members(self, members):
        out_set = next((i for i in members if i.endswith("out_SET")), None)

        if out_set is None:
            self.log.warning("Expecting out_SET")
            return None

        members = cmds.ls(cmds.sets(out_set, query=True), long=True)
        return members
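The subclass hooks (`get_default_options`, `filter_members`) let the animation extractor reuse the base `process()` unchanged. The `out_SET` lookup itself is plain Python and can be tried standalone (the set names below are hypothetical):

    def find_out_set(members):
        """Return the first member whose name ends with 'out_SET', else None."""
        return next((i for i in members if i.endswith("out_SET")), None)

    print(find_out_set(["|char_rig|controls_SET", "|char_rig|out_SET"]))
    # -> |char_rig|out_SET
    print(find_out_set(["|char_rig|controls_SET"]))
    # -> None (the extractor then warns "Expecting out_SET")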
@@ -93,12 +93,12 @@ class ValidateAssemblyModelTransforms(pyblish.api.InstancePlugin):
        from openpype.hosts.maya.api import lib

        # Store namespace in variable, cosmetics thingy
        messagebox = QtWidgets.QMessageBox
        mode = messagebox.StandardButton.Ok | messagebox.StandardButton.Cancel
        choice = messagebox.warning(None,
                                    "Matrix reset",
                                    cls.prompt_message,
                                    mode)
        choice = QtWidgets.QMessageBox.warning(
            None,
            "Matrix reset",
            cls.prompt_message,
            QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel
        )

        invalid = cls.get_invalid(instance)
        if not invalid:
@@ -80,13 +80,14 @@ class ValidateMvLookContents(pyblish.api.InstancePlugin):
    def is_or_has_mipmap(self, fname, files):
        ext = os.path.splitext(fname)[1][1:]
        if ext in MIPMAP_EXTENSIONS:
            self.log.debug("Is a mipmap '{}'".format(fname))
            self.log.debug(" - Is a mipmap '{}'".format(fname))
            return True

        for colour_space in COLOUR_SPACES:
            for mipmap_ext in MIPMAP_EXTENSIONS:
                mipmap_fname = '.'.join([fname, colour_space, mipmap_ext])
                if mipmap_fname in files:
                    self.log.debug("Has a mipmap '{}'".format(fname))
                    self.log.debug(
                        " - Has a mipmap '{}'".format(mipmap_fname))
                    return True
        return False
@@ -21,6 +21,7 @@ class ValidateTransformNamingSuffix(pyblish.api.InstancePlugin):
        - nurbsSurface: _NRB
        - locator: _LOC
        - null/group: _GRP
    Suffixes can also be overridden by project settings.

    .. warning::
        This grabs the first child shape as a reference and doesn't use the

@@ -44,6 +45,13 @@

    ALLOW_IF_NOT_IN_SUFFIX_TABLE = True

    @classmethod
    def get_table_for_invalid(cls):
        ss = []
        for k, v in cls.SUFFIX_NAMING_TABLE.items():
            ss.append(" - {}: {}".format(k, ", ".join(v)))
        return "\n".join(ss)

    @staticmethod
    def is_valid_name(node_name, shape_type,
                      SUFFIX_NAMING_TABLE, ALLOW_IF_NOT_IN_SUFFIX_TABLE):

@@ -106,5 +114,7 @@
        """
        invalid = self.get_invalid(instance)
        if invalid:
            valid = self.get_table_for_invalid()
            raise ValueError("Incorrectly named geometry "
                             "transforms: {0}".format(invalid))
                             "transforms: {0}, accepted suffixes are: "
                             "\n{1}".format(invalid, valid))
@@ -6,18 +6,26 @@ from .workio import (
    current_file,
    work_root,
)

from .command import (
    viewer_update_and_undo_stop
)

from .plugin import OpenPypeCreator
from .plugin import (
    NukeCreator,
    NukeWriteCreator,
    NukeCreatorError,
    OpenPypeCreator,
    get_instance_group_node_childs,
    get_colorspace_from_node
)
from .pipeline import (
    install,
    uninstall,
    NukeHost,

    ls,

    list_instances,
    remove_instance,
    select_instance,

    containerise,
    parse_container,
    update_container,

@@ -25,13 +33,19 @@ from .pipeline import (
    get_workfile_build_placeholder_plugins,
)
from .lib import (
    INSTANCE_DATA_KNOB,
    ROOT_DATA_KNOB,
    maintained_selection,
    reset_selection,
    select_nodes,
    get_view_process_node,
    duplicate_node,
    convert_knob_value_to_correct_type
    convert_knob_value_to_correct_type,
    get_node_data,
    set_node_data,
    update_node_data,
    create_write_node
)

from .utils import (
    colorspace_exists_on_node,
    get_colorspace_list

@@ -47,23 +61,38 @@ __all__ = (

    "viewer_update_and_undo_stop",

    "NukeCreator",
    "NukeWriteCreator",
    "NukeCreatorError",
    "OpenPypeCreator",
    "install",
    "uninstall",
    "NukeHost",
    "get_instance_group_node_childs",
    "get_colorspace_from_node",

    "ls",

    "list_instances",
    "remove_instance",
    "select_instance",

    "containerise",
    "parse_container",
    "update_container",

    "get_workfile_build_placeholder_plugins",

    "INSTANCE_DATA_KNOB",
    "ROOT_DATA_KNOB",
    "maintained_selection",
    "reset_selection",
    "select_nodes",
    "get_view_process_node",
    "duplicate_node",
    "convert_knob_value_to_correct_type",
    "get_node_data",
    "set_node_data",
    "update_node_data",
    "create_write_node",

    "colorspace_exists_on_node",
    "get_colorspace_list"
@@ -1,14 +1,15 @@
import os
from pprint import pformat
import re
import json
import six
import functools
import warnings
import platform
import tempfile
import contextlib
from collections import OrderedDict

import clique

import nuke
from qtpy import QtCore, QtWidgets
@@ -64,6 +65,54 @@ EXCLUDED_KNOB_TYPE_ON_READ = (
    26,  # Text Knob (But for backward compatibility, still be read
         # if value is not an empty string.)
)
JSON_PREFIX = "JSON:::"
ROOT_DATA_KNOB = "publish_context"
INSTANCE_DATA_KNOB = "publish_instance"


class DeprecatedWarning(DeprecationWarning):
    pass


def deprecated(new_destination):
    """Mark functions as deprecated.

    It will result in a warning being emitted when the function is used.
    """

    func = None
    if callable(new_destination):
        func = new_destination
        new_destination = None

    def _decorator(decorated_func):
        if new_destination is None:
            warning_message = (
                " Please check content of deprecated function to figure out"
                " possible replacement."
            )
        else:
            warning_message = " Please replace your usage with '{}'.".format(
                new_destination
            )

        @functools.wraps(decorated_func)
        def wrapper(*args, **kwargs):
            warnings.simplefilter("always", DeprecatedWarning)
            warnings.warn(
                (
                    "Call to deprecated function '{}'"
                    "\nFunction was moved or removed.{}"
                ).format(decorated_func.__name__, warning_message),
                category=DeprecatedWarning,
                stacklevel=4
            )
            return decorated_func(*args, **kwargs)
        return wrapper

    if func is None:
        return _decorator
    return _decorator(func)


class Context:
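The decorator supports both bare and parametrized use, which is what the `callable(new_destination)` check handles. A quick illustration (the decorated function names are hypothetical):

    @deprecated  # bare form: generic "check the function body" message
    def old_helper():
        return 1

    @deprecated("openpype.hosts.nuke.api.lib.get_node_data")  # parametrized
    def get_avalon_data():
        return 2

    old_helper()       # emits DeprecatedWarning with the generic message
    get_avalon_data()  # emits DeprecatedWarning pointing at the replacement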
@@ -94,8 +143,78 @@ def get_main_window():
    return Context.main_window


def set_node_data(node, knobname, data):
    """Write data to node invisible knob

    Will create a new knob in case it doesn't exist,
    or update the one already created.

    Args:
        node (nuke.Node): node object
        knobname (str): knob name
        data (dict): data to be stored in knob
    """
    # if exists then update data
    if knobname in node.knobs():
        log.debug("Updating knobname `{}` on node `{}`".format(
            knobname, node.name()
        ))
        update_node_data(node, knobname, data)
        return

    log.debug("Creating knobname `{}` on node `{}`".format(
        knobname, node.name()
    ))
    # else create new
    knob_value = JSON_PREFIX + json.dumps(data)
    knob = nuke.String_Knob(knobname)
    knob.setValue(knob_value)
    knob.setFlag(nuke.INVISIBLE)
    node.addKnob(knob)


def get_node_data(node, knobname):
    """Read data from node.

    Args:
        node (nuke.Node): node object
        knobname (str): knob name

    Returns:
        dict: data stored in knob
    """
    if knobname not in node.knobs():
        return

    rawdata = node[knobname].getValue()
    if (
        isinstance(rawdata, six.string_types)
        and rawdata.startswith(JSON_PREFIX)
    ):
        try:
            return json.loads(rawdata[len(JSON_PREFIX):])
        except json.JSONDecodeError:
            return


def update_node_data(node, knobname, data):
    """Update already present data.

    Args:
        node (nuke.Node): node object
        knobname (str): knob name
        data (dict): data to update knob value
    """
    knob = node[knobname]
    node_data = get_node_data(node, knobname) or {}
    node_data.update(data)
    knob_value = JSON_PREFIX + json.dumps(node_data)
    knob.setValue(knob_value)


class Knobby(object):
    """For creating a knob whose type isn't mapped in `create_knobs`
    """[DEPRECATED] For creating a knob whose type isn't
    mapped in `create_knobs`

    Args:
        type (string): Nuke knob type name
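The storage format is just a JSON payload behind a `JSON:::` sentinel, so the encode/decode round-trip can be shown without Nuke (the knob is replaced by a plain string here):

    import json

    JSON_PREFIX = "JSON:::"

    def encode(data):
        """Serialize data the way set_node_data() stores it in a knob."""
        return JSON_PREFIX + json.dumps(data)

    def decode(rawdata):
        """Parse a knob value the way get_node_data() reads it."""
        if isinstance(rawdata, str) and rawdata.startswith(JSON_PREFIX):
            try:
                return json.loads(rawdata[len(JSON_PREFIX):])
            except json.JSONDecodeError:
                return None

    value = encode({"subset": "renderMain", "active": True})
    print(value)          # JSON:::{"subset": "renderMain", "active": true}
    print(decode(value))  # {'subset': 'renderMain', 'active': True}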
@@ -120,9 +239,15 @@ class Knobby(object):
        knob.setFlag(flag)
        return knob

    @staticmethod
    def nice_naming(key):
        """Convert camelCase name into UI Display Name"""
        words = re.findall('[A-Z][^A-Z]*', key[0].upper() + key[1:])
        return " ".join(words)


def create_knobs(data, tab=None):
    """Create knobs by data
    """[DEPRECATED] Create knobs by data

    Depending on the type of each dict value, creates the correct Knob.
@ -216,7 +341,7 @@ def create_knobs(data, tab=None):
|
|||
|
||||
|
||||
def imprint(node, data, tab=None):
|
||||
"""Store attributes with value on node
|
||||
"""[DEPRICATED] Store attributes with value on node
|
||||
|
||||
Parse user data into Node knobs.
|
||||
Use `collections.OrderedDict` to ensure knob order.
|
||||
|
|
@ -272,7 +397,7 @@ def imprint(node, data, tab=None):
|
|||
|
||||
|
||||
def add_publish_knob(node):
|
||||
"""Add Publish knob to node
|
||||
"""[DEPRICATED] Add Publish knob to node
|
||||
|
||||
Arguments:
|
||||
node (nuke.Node): nuke node to be processed
|
||||
|
|
@ -290,7 +415,7 @@ def add_publish_knob(node):
|
|||
|
||||
|
||||
def set_avalon_knob_data(node, data=None, prefix="avalon:"):
|
||||
""" Sets data into nodes's avalon knob
|
||||
"""[DEPRICATED] Sets data into nodes's avalon knob
|
||||
|
||||
Arguments:
|
||||
node (nuke.Node): Nuke node to imprint with data,
|
||||
|
|
@ -351,8 +476,8 @@ def set_avalon_knob_data(node, data=None, prefix="avalon:"):
|
|||
return node
|
||||
|
||||
|
||||
def get_avalon_knob_data(node, prefix="avalon:"):
|
||||
""" Gets a data from nodes's avalon knob
|
||||
def get_avalon_knob_data(node, prefix="avalon:", create=True):
|
||||
"""[DEPRICATED] Gets a data from nodes's avalon knob
|
||||
|
||||
Arguments:
|
||||
node (obj): Nuke node to search for data,
|
||||
|
|
@ -380,8 +505,11 @@ def get_avalon_knob_data(node, prefix="avalon:"):
|
|||
except NameError as e:
|
||||
# if it doesn't then create it
|
||||
log.debug("Creating avalon knob: `{}`".format(e))
|
||||
node = set_avalon_knob_data(node)
|
||||
return get_avalon_knob_data(node)
|
||||
if create:
|
||||
node = set_avalon_knob_data(node)
|
||||
return get_avalon_knob_data(node)
|
||||
else:
|
||||
return {}
|
||||
|
||||
# get data from filtered knobs
|
||||
data.update({k.replace(p, ''): node[k].value()
|
||||
|
|
@ -392,7 +520,7 @@ def get_avalon_knob_data(node, prefix="avalon:"):
|
|||
|
||||
|
||||
def fix_data_for_node_create(data):
|
||||
"""Fixing data to be used for nuke knobs
|
||||
"""[DEPRICATED] Fixing data to be used for nuke knobs
|
||||
"""
|
||||
for k, v in data.items():
|
||||
if isinstance(v, six.text_type):
|
||||
|
|
@ -403,7 +531,7 @@ def fix_data_for_node_create(data):
|
|||
|
||||
|
||||
def add_write_node_legacy(name, **kwarg):
|
||||
"""Adding nuke write node
|
||||
"""[DEPRICATED] Adding nuke write node
|
||||
Arguments:
|
||||
name (str): nuke node name
|
||||
kwarg (attrs): data for nuke knobs
|
||||
|
|
@ -566,15 +694,16 @@ def get_nuke_imageio_settings():
|
|||
Context.project_name)["nuke"]["imageio"]
|
||||
|
||||
# backward compatibility for project started before 3.10
|
||||
# those are still having `__legacy__` knob types
|
||||
# those are still having `__legacy__` knob types.
|
||||
if not project_imageio["enabled"]:
|
||||
return get_anatomy_settings(Context.project_name)["imageio"]["nuke"]
|
||||
|
||||
return get_project_settings(Context.project_name)["nuke"]["imageio"]
|
||||
|
||||
|
||||
@deprecated("openpype.hosts.nuke.api.lib.get_nuke_imageio_settings")
|
||||
def get_created_node_imageio_setting_legacy(nodeclass, creator, subset):
|
||||
''' Get preset data for dataflow (fileType, compression, bitDepth)
|
||||
'''[DEPRICATED] Get preset data for dataflow (fileType, compression, bitDepth)
|
||||
'''
|
||||
|
||||
assert any([creator, nodeclass]), nuke.message(
|
||||
|
|
@ -764,15 +893,33 @@ def get_imageio_input_colorspace(filename):
|
|||
def get_view_process_node():
|
||||
reset_selection()
|
||||
|
||||
ipn_orig = None
|
||||
for v in nuke.allNodes(filter="Viewer"):
|
||||
ipn = v['input_process_node'].getValue()
|
||||
if "VIEWER_INPUT" not in ipn:
|
||||
ipn_orig = nuke.toNode(ipn)
|
||||
ipn_orig.setSelected(True)
|
||||
ipn_node = None
|
||||
for v_ in nuke.allNodes(filter="Viewer"):
|
||||
ipn = v_['input_process_node'].getValue()
|
||||
ipn_node = nuke.toNode(ipn)
|
||||
|
||||
if ipn_orig:
|
||||
return duplicate_node(ipn_orig)
|
||||
# skip if no input node is set
|
||||
if not ipn:
|
||||
continue
|
||||
|
||||
if ipn == "VIEWER_INPUT" and not ipn_node:
|
||||
# since it is set by default we can ignore it
|
||||
# nobody usually use this but use it if
|
||||
# it exists in nodes
|
||||
continue
|
||||
|
||||
if not ipn_node:
|
||||
# in case a Viewer node is transfered from
|
||||
# different workfile with old values
|
||||
raise NameError((
|
||||
"Input process node name '{}' set in "
|
||||
"Viewer '{}' is does't exists in nodes"
|
||||
).format(ipn, v_.name()))
|
||||
|
||||
ipn_node.setSelected(True)
|
||||
|
||||
if ipn_node:
|
||||
return duplicate_node(ipn_node)
|
||||
|
||||
|
||||
def on_script_load():
|
||||
|
|
@ -971,27 +1118,14 @@ def format_anatomy(data):
|
|||
Return:
|
||||
path (str)
|
||||
'''
|
||||
# TODO: perhaps should be nonPublic
|
||||
|
||||
anatomy = Anatomy()
|
||||
log.debug("__ anatomy.templates: {}".format(anatomy.templates))
|
||||
|
||||
try:
|
||||
# TODO: bck compatibility with old anatomy template
|
||||
padding = int(
|
||||
anatomy.templates["render"].get(
|
||||
"frame_padding",
|
||||
anatomy.templates["render"].get("padding")
|
||||
)
|
||||
padding = int(
|
||||
anatomy.templates["render"].get(
|
||||
"frame_padding"
|
||||
)
|
||||
except KeyError as e:
|
||||
msg = ("`padding` key is not in `render` "
|
||||
"or `frame_padding` on is not available in "
|
||||
"Anatomy template. Please, add it there and restart "
|
||||
"the pipeline (padding: \"4\"): `{}`").format(e)
|
||||
|
||||
log.error(msg)
|
||||
nuke.message(msg)
|
||||
)
|
||||
|
||||
version = data.get("version", None)
|
||||
if not version:
|
||||
|
|
@ -999,16 +1133,16 @@ def format_anatomy(data):
|
|||
data["version"] = get_version_from_path(file)
|
||||
|
||||
project_name = anatomy.project_name
|
||||
asset_name = data["avalon"]["asset"]
|
||||
task_name = os.environ["AVALON_TASK"]
|
||||
asset_name = data["asset"]
|
||||
task_name = data["task"]
|
||||
host_name = os.environ["AVALON_APP"]
|
||||
context_data = get_template_data_with_names(
|
||||
project_name, asset_name, task_name, host_name
|
||||
)
|
||||
data.update(context_data)
|
||||
data.update({
|
||||
"subset": data["avalon"]["subset"],
|
||||
"family": data["avalon"]["family"],
|
||||
"subset": data["subset"],
|
||||
"family": data["family"],
|
||||
"frame": "#" * padding,
|
||||
})
|
||||
return anatomy.format(data)

@@ -1100,8 +1234,6 @@ def create_write_node(
        data,
        input=None,
        prenodes=None,
        review=True,
        farm=True,
        linked_knobs=None,
        **kwargs
):

@@ -1143,35 +1275,26 @@ def create_write_node(
    '''
    prenodes = prenodes or {}

    # group node knob overrides
    knob_overrides = data.pop("knobs", [])

    # filtering variables
    plugin_name = data["creator"]
    subset = data["subset"]

    # get knob settings for write node
    imageio_writes = get_imageio_node_setting(
        node_class=data["nodeclass"],
        node_class="Write",
        plugin_name=plugin_name,
        subset=subset
    )

    for knob in imageio_writes["knobs"]:
        if knob["name"] == "file_type":
            representation = knob["value"]
            ext = knob["value"]

    try:
        data.update({
            "imageio_writes": imageio_writes,
            "representation": representation,
        })
        anatomy_filled = format_anatomy(data)

    except Exception as e:
        msg = "problem with resolving anatomy template: {}".format(e)
        log.error(msg)
        nuke.message(msg)
    data.update({
        "imageio_writes": imageio_writes,
        "ext": ext
    })
    anatomy_filled = format_anatomy(data)

    # build file path to workfiles
    fdir = str(anatomy_filled["work"]["folder"]).replace("\\", "/")

@@ -1180,7 +1303,7 @@ def create_write_node(
        version=data["version"],
        subset=data["subset"],
        frame=data["frame"],
        ext=representation
        ext=ext
    )

    # create directory

@@ -1234,14 +1357,6 @@ def create_write_node(
            # connect to previous node
            now_node.setInput(0, prev_node)

    # imprinting group node
    set_avalon_knob_data(GN, data["avalon"])
    add_publish_knob(GN)
    add_rendering_knobs(GN, farm)

    if review:
        add_review_knob(GN)

    # add divider
    GN.addKnob(nuke.Text_Knob('', 'Rendering'))

@@ -1287,11 +1402,7 @@ def create_write_node(
    # adding write to read button
    add_button_clear_rendered(GN, os.path.dirname(fpath))

    # Deadline tab.
    add_deadline_tab(GN)

    # open our Tab by default
    GN[_NODE_TAB_NAME].setFlag(0)
    GN.addKnob(nuke.Text_Knob('', ''))

    # set tile color
    tile_color = next(

@@ -1303,12 +1414,10 @@ def create_write_node(
    GN["tile_color"].setValue(
        color_gui_to_int(tile_color))

    # finally add knob overrides
    set_node_knobs_from_settings(GN, knob_overrides, **kwargs)

    return GN


@deprecated("openpype.hosts.nuke.api.lib.create_write_node")
def create_write_node_legacy(
    name, data, input=None, prenodes=None,
    review=True, linked_knobs=None, farm=True

@@ -1599,6 +1708,13 @@ def set_node_knobs_from_settings(node, knob_settings, **kwargs):
        if knob_name not in node.knobs():
            continue

        if knob_type == "expression":
            knob_expression = knob["expression"]
            node[knob_name].setExpression(
                knob_expression
            )
            continue

        # first deal with formatable knob settings
        if knob_type == "formatable":
            template = knob["template"]

@@ -1607,7 +1723,6 @@ def set_node_knobs_from_settings(node, knob_settings, **kwargs):
                _knob_value = template.format(
                    **kwargs
                )
                log.debug("__ knob_value0: {}".format(_knob_value))
            except KeyError as msg:
                log.warning("__ msg: {}".format(msg))
                raise KeyError(msg)

@@ -1661,6 +1776,7 @@ def color_gui_to_int(color_gui):
    return int(hex_value, 16)


@deprecated
def add_rendering_knobs(node, farm=True):
    ''' Adds additional rendering knobs to given node

@@ -1681,6 +1797,7 @@ def add_rendering_knobs(node, farm=True):
    return node


@deprecated
def add_review_knob(node):
    ''' Adds additional review knob to given node

@@ -1697,7 +1814,9 @@ def add_review_knob(node):
    return node


@deprecated
def add_deadline_tab(node):
    # TODO: remove this as it is only linked to legacy create
    node.addKnob(nuke.Tab_Knob("Deadline"))

    knob = nuke.Int_Knob("deadlinePriority", "Priority")

@@ -1723,7 +1842,10 @@ def add_deadline_tab(node):
    node.addKnob(knob)


@deprecated
def get_deadline_knob_names():
    # TODO: remove this as it is only linked to legacy
    # validate_write_deadline_tab
    return [
        "Deadline",
        "deadlineChunkSize",

@@ -2137,7 +2259,8 @@ class WorkfileSettings(object):

        range = '{0}-{1}'.format(
            int(data["frameStart"]),
            int(data["frameEnd"]))
            int(data["frameEnd"])
        )

        for node in nuke.allNodes(filter="Viewer"):
            node['frame_range'].setValue(range)

@@ -2145,12 +2268,14 @@ class WorkfileSettings(object):
            node['frame_range'].setValue(range)
            node['frame_range_lock'].setValue(True)

        # adding handle_start/end to root avalon knob
        if not set_avalon_knob_data(self._root_node, {
            "handleStart": int(handle_start),
            "handleEnd": int(handle_end)
        }):
            log.warning("Cannot set Avalon knob to Root node!")
        set_node_data(
            self._root_node,
            INSTANCE_DATA_KNOB,
            {
                "handleStart": int(handle_start),
                "handleEnd": int(handle_end)
            }
        )

    def reset_resolution(self):
        """Set resolution to project resolution."""

@@ -2264,29 +2389,25 @@ def get_write_node_template_attr(node):
    ''' Gets all defined data from presets

    '''

    # TODO: add identifiers to settings and rename settings key
    plugin_names_mapping = {
        "create_write_image": "CreateWriteImage",
        "create_write_prerender": "CreateWritePrerender",
        "create_write_render": "CreateWriteRender"
    }
    # get avalon data from node
    avalon_knob_data = read_avalon_data(node)
    # get template data
    nuke_imageio_writes = get_imageio_node_setting(
        node_class=avalon_knob_data["families"],
        plugin_name=avalon_knob_data["creator"],
        subset=avalon_knob_data["subset"]
    node_data = get_node_data(node, INSTANCE_DATA_KNOB)
    identifier = node_data["creator_identifier"]

    # return template data
    return get_imageio_node_setting(
        node_class="Write",
        plugin_name=plugin_names_mapping[identifier],
        subset=node_data["subset"]
    )


    # collecting correct data
    correct_data = OrderedDict()

    # adding imageio knob presets
    for k, v in nuke_imageio_writes.items():
        if k in ["_id", "_previous"]:
            continue
        correct_data[k] = v

    # fix badly encoded data
    return fix_data_for_node_create(correct_data)


def get_dependent_nodes(nodes):
    """Get all dependent nodes connected to the list of nodes.

@@ -2325,10 +2446,11 @@ def get_dependent_nodes(nodes):


def find_free_space_to_paste_nodes(
        nodes,
        group=nuke.root(),
        direction="right",
        offset=300):
        nodes,
        group=nuke.root(),
        direction="right",
        offset=300
):
    """
    For getting coordinates in DAG (node graph) for placing new nodes

@@ -2554,6 +2676,7 @@ def process_workfile_builder():
        open_file(last_workfile_path)


@deprecated
def recreate_instance(origin_node, avalon_data=None):
    """Recreate input instance to different data

@@ -2619,6 +2742,32 @@ def recreate_instance(origin_node, avalon_data=None):
    return new_node


def add_scripts_menu():
    try:
        from scriptsmenu import launchfornuke
    except ImportError:
        log.warning(
            "Skipping studio.menu install, because "
            "'scriptsmenu' module seems unavailable."
        )
        return

    # load configuration of custom menu
    project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
    config = project_settings["nuke"]["scriptsmenu"]["definition"]
    _menu = project_settings["nuke"]["scriptsmenu"]["name"]

    if not config:
        log.warning("Skipping studio menu, no definition found.")
        return

    # run the menu launcher for Nuke
    studio_menu = launchfornuke.main(title=_menu.title())

    # apply configuration
    studio_menu.build_from_configuration(studio_menu, config)


def add_scripts_gizmo():

    # load configuration of custom menu

@@ -2799,48 +2948,6 @@ def dirmap_file_name_filter(file_name):
    return file_name


# ------------------------------------
# This function seems to be deprecated
# ------------------------------------
def ls_img_sequence(path):
    """List all available coherent image sequences from path

    Arguments:
        path (str): path to a file of the image sequence

    Returns:
        data (dict): with Nuke-formatted path and frame ranges
    """
    file = os.path.basename(path)
    dirpath = os.path.dirname(path)
    base, ext = os.path.splitext(file)
    name, padding = os.path.splitext(base)

    # populate list of files
    files = [
        f for f in os.listdir(dirpath)
        if name in f
        if ext in f
    ]

    # create collection from list of files
    collections, reminder = clique.assemble(files)

    if len(collections) > 0:
        head = collections[0].format("{head}")
        padding = collections[0].format("{padding}") % 1
        padding = "#" * len(padding)
        tail = collections[0].format("{tail}")
        file = head + padding + tail

        return {
            "path": os.path.join(dirpath, file).replace("\\", "/"),
            "frames": collections[0].format("[{ranges}]")
        }

    return False


def get_group_io_nodes(nodes):
    """Get the input and the output of a group of nodes."""


@@ -1,21 +1,24 @@
import nuke

import os
import importlib
from collections import OrderedDict

import nuke

import pyblish.api

import openpype
from openpype.host import (
    HostBase,
    IWorkfileHost,
    ILoadHost,
    IPublishHost
)
from openpype.settings import get_current_project_settings
from openpype.lib import register_event_callback, Logger
from openpype.pipeline import (
    register_loader_plugin_path,
    register_creator_plugin_path,
    register_inventory_action_path,
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    deregister_inventory_action_path,
    AVALON_CONTAINER_ID,
)
from openpype.pipeline.workfile import BuildWorkfile

@@ -24,6 +27,8 @@ from openpype.tools.utils import host_tools
from .command import viewer_update_and_undo_stop
from .lib import (
    Context,
    ROOT_DATA_KNOB,
    INSTANCE_DATA_KNOB,
    get_main_window,
    add_publish_knob,
    WorkfileSettings,

@@ -32,6 +37,12 @@ from .lib import (
    check_inventory_versions,
    set_avalon_knob_data,
    read_avalon_data,
    on_script_load,
    dirmap_file_name_filter,
    add_scripts_menu,
    add_scripts_gizmo,
    get_node_data,
    set_node_data
)
from .workfile_template_builder import (
    NukePlaceholderLoadPlugin,

@@ -41,6 +52,14 @@ from .workfile_template_builder import (
    create_placeholder,
    update_placeholder,
)
from .workio import (
    open_file,
    save_file,
    file_extensions,
    has_unsaved_changes,
    work_root,
    current_file
)

log = Logger.get_logger(__name__)

@@ -59,6 +78,95 @@ if os.getenv("PYBLISH_GUI", None):
    pyblish.api.register_gui(os.getenv("PYBLISH_GUI", None))


class NukeHost(
    HostBase, IWorkfileHost, ILoadHost, IPublishHost
):
    name = "nuke"

    def open_workfile(self, filepath):
        return open_file(filepath)

    def save_workfile(self, filepath=None):
        return save_file(filepath)

    def work_root(self, session):
        return work_root(session)

    def get_current_workfile(self):
        return current_file()

    def workfile_has_unsaved_changes(self):
        return has_unsaved_changes()

    def get_workfile_extensions(self):
        return file_extensions()

    def get_containers(self):
        return ls()

    def install(self):
        ''' Installing all requirements for Nuke host
        '''

        pyblish.api.register_host("nuke")

        self.log.info("Registering Nuke plug-ins..")
        pyblish.api.register_plugin_path(PUBLISH_PATH)
        register_loader_plugin_path(LOAD_PATH)
        register_creator_plugin_path(CREATE_PATH)
        register_inventory_action_path(INVENTORY_PATH)

        # Register Avalon event for workfiles loading.
        register_event_callback("workio.open_file", check_inventory_versions)
        register_event_callback("taskChanged", change_context_label)

        pyblish.api.register_callback(
            "instanceToggled", on_pyblish_instance_toggled)

        _install_menu()

        # add script menu
        add_scripts_menu()
        add_scripts_gizmo()

        add_nuke_callbacks()

        launch_workfiles_app()

    def get_context_data(self):
        root_node = nuke.root()
        return get_node_data(root_node, ROOT_DATA_KNOB)

    def update_context_data(self, data, changes):
        root_node = nuke.root()
        set_node_data(root_node, ROOT_DATA_KNOB, data)
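# NOTE: an editor sketch (not part of the commit) of the IPublishHost context
# round-trip above; the key/value pair is an arbitrary placeholder.
#
# host = NukeHost()
# host.update_context_data({"publish_comment": "wip"}, changes={})
# host.get_context_data()
# # -> {"publish_comment": "wip"}  (stored on the root node's ROOT_DATA_KNOB)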


def add_nuke_callbacks():
    """ Adding all available nuke callbacks
    """
    workfile_settings = WorkfileSettings()
    # Set context settings.
    nuke.addOnCreate(
        workfile_settings.set_context_settings, nodeClass="Root")
    nuke.addOnCreate(workfile_settings.set_favorites, nodeClass="Root")
    nuke.addOnCreate(process_workfile_builder, nodeClass="Root")

    # fix ffmpeg settings on script
    nuke.addOnScriptLoad(on_script_load)

    # set checker for last versions on loaded containers
    nuke.addOnScriptLoad(check_inventory_versions)
    nuke.addOnScriptSave(check_inventory_versions)

    # # set apply all workfile settings on script load and save
    nuke.addOnScriptLoad(WorkfileSettings().set_context_settings)

    nuke.addFilenameFilter(dirmap_file_name_filter)

    log.info("Added Nuke callbacks ...")


def reload_config():
    """Attempt to reload pipeline at run-time.

@@ -84,52 +192,6 @@ def reload_config():
        reload(module)


def install():
    ''' Installing all requirements for Nuke host
    '''

    pyblish.api.register_host("nuke")

    log.info("Registering Nuke plug-ins..")
    pyblish.api.register_plugin_path(PUBLISH_PATH)
    register_loader_plugin_path(LOAD_PATH)
    register_creator_plugin_path(CREATE_PATH)
    register_inventory_action_path(INVENTORY_PATH)

    # Register Avalon event for workfiles loading.
    register_event_callback("workio.open_file", check_inventory_versions)
    register_event_callback("taskChanged", change_context_label)

    pyblish.api.register_callback(
        "instanceToggled", on_pyblish_instance_toggled)
    workfile_settings = WorkfileSettings()

    # Set context settings.
    nuke.addOnCreate(workfile_settings.set_context_settings, nodeClass="Root")
    nuke.addOnCreate(workfile_settings.set_favorites, nodeClass="Root")
    nuke.addOnCreate(process_workfile_builder, nodeClass="Root")

    _install_menu()
    launch_workfiles_app()


def uninstall():
    '''Uninstalling host's integration
    '''
    log.info("Deregistering Nuke plug-ins..")
    pyblish.deregister_host("nuke")
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
    deregister_loader_plugin_path(LOAD_PATH)
    deregister_creator_plugin_path(CREATE_PATH)
    deregister_inventory_action_path(INVENTORY_PATH)

    pyblish.api.deregister_callback(
        "instanceToggled", on_pyblish_instance_toggled)

    reload_config()
    _uninstall_menu()


def _show_workfiles():
    # Make sure parent is not set
    # - this makes Workfiles tool as separated window which

@@ -167,7 +229,15 @@ def _install_menu():
    menu.addSeparator()
    menu.addCommand(
        "Create...",
        lambda: host_tools.show_creator(parent=main_window)
        lambda: host_tools.show_publisher(
            tab="create"
        )
    )
    menu.addCommand(
        "Publish...",
        lambda: host_tools.show_publisher(
            tab="publish"
        )
    )
    menu.addCommand(
        "Load...",

@@ -176,14 +246,11 @@ def _install_menu():
            use_context=True
        )
    )
    menu.addCommand(
        "Publish...",
        lambda: host_tools.show_publish(parent=main_window)
    )
    menu.addCommand(
        "Manage...",
        lambda: host_tools.show_scene_inventory(parent=main_window)
    )
    menu.addSeparator()
    menu.addCommand(
        "Library...",
        lambda: host_tools.show_library_loader(

@@ -233,7 +300,7 @@ def _install_menu():
        "Experimental tools...",
        lambda: host_tools.show_experimental_tools_dialog(parent=main_window)
    )

    menu.addSeparator()
    # add reload pipeline only in debug mode
    if bool(os.getenv("NUKE_DEBUG")):
        menu.addSeparator()

@@ -243,15 +310,6 @@ def _install_menu():
    add_shortcuts_from_presets()


def _uninstall_menu():
    menubar = nuke.menu("Nuke")
    menu = menubar.findItem(MENU_LABEL)

    for item in menu.items():
        log.info("Removing menu item: {}".format(item.name()))
        menu.removeItem(item.name())


def change_context_label():
    menubar = nuke.menu("Nuke")
    menu = menubar.findItem(MENU_LABEL)

@@ -283,8 +341,8 @@ def add_shortcuts_from_presets():

    if nuke_presets.get("menu"):
        menu_label_mapping = {
            "manage": "Manage...",
            "create": "Create...",
            "manage": "Manage...",
            "load": "Load...",
            "build_workfile": "Build Workfile",
            "publish": "Publish..."

@@ -302,7 +360,7 @@ def add_shortcuts_from_presets():
                item_label = menu_label_mapping[command_name]
                menuitem = menu.findItem(item_label)
                menuitem.setShortcut(shortcut_str)
            except AttributeError as e:
            except (AttributeError, KeyError) as e:
                log.error(e)

@@ -434,11 +492,72 @@ def ls():
    """
    all_nodes = nuke.allNodes(recurseGroups=False)

    # TODO: add readgeo, readcamera, readimage
    nodes = [n for n in all_nodes]

    for n in nodes:
        log.debug("name: `{}`".format(n.name()))
        container = parse_container(n)
        if container:
            yield container


def list_instances(creator_id=None):
    """List all created instances to publish from current workfile.

    For SubsetManager

    Returns:
        (list) of dictionaries matching instances format
    """
    listed_instances = []
    for node in nuke.allNodes(recurseGroups=True):

        if node.Class() in ["Viewer", "Dot"]:
            continue

        try:
            if node["disable"].value():
                continue
        except NameError:
            # pass if disable knob doesn't exist
            pass

        # get data from avalon knob
        instance_data = get_node_data(
            node, INSTANCE_DATA_KNOB)

        if not instance_data:
            continue

        if instance_data["id"] != "pyblish.avalon.instance":
            continue

        if creator_id and instance_data["creator_identifier"] != creator_id:
            continue

        listed_instances.append((node, instance_data))

    return listed_instances
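# NOTE: an editor sketch (not part of the commit); 'create_write_render' is
# one of the creator identifiers introduced later in this diff, and each
# returned item is a (node, instance_data) pair.
#
# for node, instance_data in list_instances(creator_id="create_write_render"):
#     print(node.name(), instance_data["subset"])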


def remove_instance(instance):
    """Remove instance from current workfile metadata.

    For SubsetManager

    Args:
        instance (dict): instance representation from subsetmanager model
    """
    instance_node = instance.transient_data["node"]
    instance_knob = instance_node.knobs()[INSTANCE_DATA_KNOB]
    instance_node.removeKnob(instance_knob)


def select_instance(instance):
    """
    Select instance in Node View

    Args:
        instance (dict): instance representation from subsetmanager model
    """
    instance_node = instance.transient_data["node"]
    instance_node["selected"].setValue(True)


@@ -1,27 +1,383 @@
import nuke
import re
import os
import sys
import six
import random
import string
from collections import OrderedDict
from collections import OrderedDict, defaultdict
from abc import abstractmethod

import nuke

from openpype.settings import get_current_project_settings
from openpype.lib import (
    BoolDef,
    EnumDef
)
from openpype.pipeline import (
    LegacyCreator,
    LoaderPlugin,
    CreatorError,
    Creator as NewCreator,
    CreatedInstance,
    legacy_io
)
from .lib import (
    INSTANCE_DATA_KNOB,
    Knobby,
    check_subsetname_exists,
    maintained_selection,
    get_avalon_knob_data,
    set_avalon_knob_data,
    add_publish_knob,
    get_nuke_imageio_settings,
    set_node_knobs_from_settings,
    set_node_data,
    get_node_data,
    get_view_process_node,
    get_viewer_config_from_string
    get_viewer_config_from_string,
    deprecated
)
from .pipeline import (
    list_instances,
    remove_instance
)


def _collect_and_cache_nodes(creator):
    key = "openpype.nuke.nodes"
    if key not in creator.collection_shared_data:
        instances_by_identifier = defaultdict(list)
        for item in list_instances():
            _, instance_data = item
            identifier = instance_data["creator_identifier"]
            instances_by_identifier[identifier].append(item)
        creator.collection_shared_data[key] = instances_by_identifier
    return creator.collection_shared_data[key]


class NukeCreatorError(CreatorError):
    pass


class NukeCreator(NewCreator):
    selected_nodes = []

    def pass_pre_attributes_to_instance(
        self,
        instance_data,
        pre_create_data,
        keys=None
    ):
        if not keys:
            keys = pre_create_data.keys()

        creator_attrs = instance_data["creator_attributes"] = {}
        for pass_key in keys:
            creator_attrs[pass_key] = pre_create_data[pass_key]

    def add_info_knob(self, node):
        if "OP_info" in node.knobs().keys():
            return

        # add info text
        info_knob = nuke.Text_Knob("OP_info", "")
        info_knob.setValue("""
<span style=\"color:#fc0303\">
    <p>This node is maintained by <b>OpenPype Publisher</b>.</p>
    <p>To remove it use Publisher gui.</p>
</span>
""")
        node.addKnob(info_knob)

    def check_existing_subset(self, subset_name):
        """Make sure subset name is unique.

        It searches within all nodes recursively
        and checks if the subset name is found in
        any node having an instance data knob.

        Arguments:
            subset_name (str): Subset name
        """

        for node in nuke.allNodes(recurseGroups=True):
            # make sure testing node is having instance knob
            if INSTANCE_DATA_KNOB not in node.knobs().keys():
                continue
            node_data = get_node_data(node, INSTANCE_DATA_KNOB)

            if not node_data:
                # a node has no instance data
                continue

            # test if subset name is matching
            if node_data.get("subset") == subset_name:
                raise NukeCreatorError(
                    (
                        "A publish instance for '{}' already exists "
                        "in nodes! Please change the variant "
                        "name to ensure unique output."
                    ).format(subset_name)
                )

    def create_instance_node(
        self,
        node_name,
        knobs=None,
        parent=None,
        node_type=None
    ):
        """Create node representing instance.

        Arguments:
            node_name (str): Name of the new node.
            knobs (OrderedDict): node knobs name and values
            parent (str): Name of the parent node.
            node_type (str, optional): Nuke node Class.

        Returns:
            nuke.Node: Newly created instance node.

        """
        node_type = node_type or "NoOp"

        node_knobs = knobs or {}

        # set parent node
        parent_node = nuke.root()
        if parent:
            parent_node = nuke.toNode(parent)

        try:
            with parent_node:
                created_node = nuke.createNode(node_type)
                created_node["name"].setValue(node_name)

                self.add_info_knob(created_node)

                for key, values in node_knobs.items():
                    if key in created_node.knobs():
                        created_node[key].setValue(values)
        except Exception as _err:
            raise NukeCreatorError("Creating node failed: {}".format(_err))

        return created_node

    def set_selected_nodes(self, pre_create_data):
        if pre_create_data.get("use_selection"):
            self.selected_nodes = nuke.selectedNodes()
            if self.selected_nodes == []:
                raise NukeCreatorError("Creator error: No active selection")
        else:
            self.selected_nodes = []

    def create(self, subset_name, instance_data, pre_create_data):

        # make sure selected nodes are added
        self.set_selected_nodes(pre_create_data)

        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        try:
            instance_node = self.create_instance_node(
                subset_name,
                node_type=instance_data.pop("node_type", None)
            )
            instance = CreatedInstance(
                self.family,
                subset_name,
                instance_data,
                self
            )

            instance.transient_data["node"] = instance_node

            self._add_instance_to_context(instance)

            set_node_data(
                instance_node, INSTANCE_DATA_KNOB, instance.data_to_store())

            return instance

        except Exception as er:
            six.reraise(
                NukeCreatorError,
                NukeCreatorError("Creator error: {}".format(er)),
                sys.exc_info()[2])

    def collect_instances(self):
        cached_instances = _collect_and_cache_nodes(self)
        for (node, data) in cached_instances[self.identifier]:
            created_instance = CreatedInstance.from_existing(
                data, self
            )
            created_instance.transient_data["node"] = node
            self._add_instance_to_context(created_instance)

    def update_instances(self, update_list):
        for created_inst, _changes in update_list:
            instance_node = created_inst.transient_data["node"]

            # in case node is not existing anymore (user erased it manually)
            try:
                instance_node.fullName()
            except ValueError:
                self.remove_instances([created_inst])
                continue

            set_node_data(
                instance_node,
                INSTANCE_DATA_KNOB,
                created_inst.data_to_store()
            )

    def remove_instances(self, instances):
        for instance in instances:
            remove_instance(instance)
            self._remove_instance_from_context(instance)

    def get_pre_create_attr_defs(self):
        return [
            BoolDef("use_selection", label="Use selection")
        ]

    def get_creator_settings(self, project_settings, settings_key=None):
        if not settings_key:
            settings_key = self.__class__.__name__
        return project_settings["nuke"]["create"][settings_key]
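# NOTE (editor): with the fallback above, a creator subclass that does not
# pass `settings_key` is looked up under its own class name, e.g.
# project_settings["nuke"]["create"]["CreateWriteRender"].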


class NukeWriteCreator(NukeCreator):
    """Add Publishable Write node"""

    identifier = "create_write"
    label = "Create Write"
    family = "write"
    icon = "sign-out"

    def integrate_links(self, node, outputs=True):
        # skip if no selection
        if not self.selected_node:
            return

        # collect dependencies
        input_nodes = [self.selected_node]
        dependent_nodes = self.selected_node.dependent() if outputs else []

        # relinking to collected connections
        for i, input in enumerate(input_nodes):
            node.setInput(i, input)

        # make it nicer in graph
        node.autoplace()

        # relink also dependent nodes
        for dep_nodes in dependent_nodes:
            dep_nodes.setInput(0, node)

    def set_selected_nodes(self, pre_create_data):
        if pre_create_data.get("use_selection"):
            selected_nodes = nuke.selectedNodes()
            if selected_nodes == []:
                raise NukeCreatorError("Creator error: No active selection")
            elif len(selected_nodes) > 1:
                raise NukeCreatorError(
                    "Creator error: Select only one camera node")
            self.selected_node = selected_nodes[0]
        else:
            self.selected_node = None

    def get_pre_create_attr_defs(self):
        attr_defs = [
            BoolDef("use_selection", label="Use selection"),
            self._get_render_target_enum()
        ]
        return attr_defs

    def get_instance_attr_defs(self):
        attr_defs = [
            self._get_render_target_enum(),
            self._get_reviewable_bool()
        ]
        return attr_defs

    def _get_render_target_enum(self):
        rendering_targets = {
            "local": "Local machine rendering",
            "frames": "Use existing frames"
        }
        if ("farm_rendering" in self.instance_attributes):
            rendering_targets["farm"] = "Farm rendering"

        return EnumDef(
            "render_target",
            items=rendering_targets,
            label="Render target"
        )

    def _get_reviewable_bool(self):
        return BoolDef(
            "review",
            default=("reviewable" in self.instance_attributes),
            label="Review"
        )

    def create(self, subset_name, instance_data, pre_create_data):
        # make sure selected nodes are added
        self.set_selected_nodes(pre_create_data)

        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        instance_node = self.create_instance_node(
            subset_name,
            instance_data
        )

        try:
            instance = CreatedInstance(
                self.family,
                subset_name,
                instance_data,
                self
            )

            instance.transient_data["node"] = instance_node

            self._add_instance_to_context(instance)

            set_node_data(
                instance_node, INSTANCE_DATA_KNOB, instance.data_to_store())

            return instance

        except Exception as er:
            six.reraise(
                NukeCreatorError,
                NukeCreatorError("Creator error: {}".format(er)),
                sys.exc_info()[2]
            )

    def apply_settings(
        self,
        project_settings,
        system_settings
    ):
        """Method called on initialization of plugin to apply settings."""

        # plugin settings
        plugin_settings = self.get_creator_settings(project_settings)

        # individual attributes
        self.instance_attributes = plugin_settings.get(
            "instance_attributes") or self.instance_attributes
        self.prenodes = plugin_settings["prenodes"]
        self.default_variants = plugin_settings.get(
            "default_variants") or self.default_variants
        self.temp_rendering_path_template = (
            plugin_settings.get("temp_rendering_path_template")
            or self.temp_rendering_path_template
        )


class OpenPypeCreator(LegacyCreator):

@@ -72,6 +428,41 @@ class OpenPypeCreator(LegacyCreator):
        return instance


def get_instance_group_node_childs(instance):
    """Return list of instance group node children

    Args:
        instance (pyblish.Instance): pyblish instance

    Returns:
        list: [nuke.Node]
    """
    node = instance.data["transientData"]["node"]

    if node.Class() != "Group":
        return

    # collect child nodes
    child_nodes = []
    # iterate all nodes
    for node in nuke.allNodes(group=node):
        # add contained nodes to instance's node list
        child_nodes.append(node)

    return child_nodes


def get_colorspace_from_node(node):
    # Add version data to instance
    colorspace = node["colorspace"].value()

    # remove default part of the string
    if "default (" in colorspace:
        colorspace = re.sub(r"default.\(|\)", "", colorspace)

    return colorspace
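# NOTE: an editor illustration (not part of the commit) of the stripping
# above; plain colorspace values pass through unchanged.
#
# re.sub(r"default.\(|\)", "", "default (linear)")
# # -> 'linear'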


def get_review_presets_config():
    settings = get_current_project_settings()
    review_profiles = (

@@ -173,7 +564,6 @@ class ExporterReview(object):

    def get_file_info(self):
        if self.collection:
            self.log.debug("Collection: `{}`".format(self.collection))
            # get path
            self.fname = os.path.basename(self.collection.format(
                "{head}{padding}{tail}"))

@@ -308,7 +698,6 @@ class ExporterReviewLut(ExporterReview):
        # connect
        self._temp_nodes.append(cms_node)
        self.previous_node = cms_node
        self.log.debug("CMSTestPattern... `{}`".format(self._temp_nodes))

        if bake_viewer_process:
            # Node View Process

@@ -341,8 +730,6 @@ class ExporterReviewLut(ExporterReview):
        # connect
        gen_lut_node.setInput(0, self.previous_node)
        self._temp_nodes.append(gen_lut_node)
        self.log.debug("GenerateLUT... `{}`".format(self._temp_nodes))

        # ---------- end nodes creation

        # Export lut file

@@ -356,8 +743,6 @@ class ExporterReviewLut(ExporterReview):
        # ---------- generate representation data
        self.get_representation_data()

        self.log.debug("Representation... `{}`".format(self.data))

        # ---------- Clean up
        self.clean_nodes()

@@ -583,6 +968,7 @@ class ExporterReviewMov(ExporterReview):
        return self.data


@deprecated("openpype.hosts.nuke.api.plugin.NukeWriteCreator")
class AbstractWriteRender(OpenPypeCreator):
    """Abstract creator to gather similar implementation for Write creators"""
    name = ""

@@ -609,7 +995,6 @@ class AbstractWriteRender(OpenPypeCreator):

        self.data = data
        self.nodes = nuke.selectedNodes()
        self.log.debug("_ self.data: '{}'".format(self.data))

    def process(self):

@@ -734,3 +1119,149 @@ class AbstractWriteRender(OpenPypeCreator):
            node (nuke.Node): group node with data as Knobs
        """
        pass


def convert_to_valid_instaces():
    """ Check and convert to latest publisher instances

    Also save as new minor version of workfile.
    """
    def family_to_identifier(family):
        mapping = {
            "render": "create_write_render",
            "prerender": "create_write_prerender",
            "still": "create_write_image",
            "model": "create_model",
            "camera": "create_camera",
            "nukenodes": "create_backdrop",
            "gizmo": "create_gizmo",
            "source": "create_source"
        }
        return mapping[family]

    from openpype.hosts.nuke.api import workio

    task_name = legacy_io.Session["AVALON_TASK"]

    # save into new workfile
    current_file = workio.current_file()

    # add file suffix if not present
    if "_publisherConvert" not in current_file:
        new_workfile = (
            current_file[:-3]
            + "_publisherConvert"
            + current_file[-3:]
        )
    else:
        new_workfile = current_file

    path = new_workfile.replace("\\", "/")
    nuke.scriptSaveAs(new_workfile, overwrite=1)
    nuke.Root()["name"].setValue(path)
    nuke.Root()["project_directory"].setValue(os.path.dirname(path))
    nuke.Root().setModified(False)

    _remove_old_knobs(nuke.Root())

    # loop all nodes and convert
    for node in nuke.allNodes(recurseGroups=True):
        transfer_data = {
            "creator_attributes": {}
        }
        creator_attr = transfer_data["creator_attributes"]

        if node.Class() in ["Viewer", "Dot"]:
            continue

        if get_node_data(node, INSTANCE_DATA_KNOB):
            continue

        # get data from avalon knob
        avalon_knob_data = get_avalon_knob_data(
            node, ["avalon:", "ak:"])

        if not avalon_knob_data:
            continue

        if avalon_knob_data["id"] != "pyblish.avalon.instance":
            continue

        transfer_data.update({
            k: v for k, v in avalon_knob_data.items()
            if k not in ["families", "creator"]
        })

        transfer_data["task"] = task_name

        family = avalon_knob_data["family"]
        # establish families
        families_ak = avalon_knob_data.get("families", [])

        if "suspend_publish" in node.knobs():
            creator_attr["suspended_publish"] = (
                node["suspend_publish"].value())

        # get review knob value
        if "review" in node.knobs():
            creator_attr["review"] = (
                node["review"].value())

        if "publish" in node.knobs():
            transfer_data["active"] = (
                node["publish"].value())

        # add identifier
        transfer_data["creator_identifier"] = family_to_identifier(family)

        # Add all nodes in group instances.
        if node.Class() == "Group":
            # only alter families for render family
            if families_ak and "write" in families_ak.lower():
                target = node["render"].value()
                if target == "Use existing frames":
                    creator_attr["render_target"] = "frames"
                elif target == "Local":
                    # Local rendering
                    creator_attr["render_target"] = "local"
                elif target == "On farm":
                    # Farm rendering
                    creator_attr["render_target"] = "farm"

                if "deadlinePriority" in node.knobs():
                    transfer_data["farm_priority"] = (
                        node["deadlinePriority"].value())
                if "deadlineChunkSize" in node.knobs():
                    creator_attr["farm_chunk"] = (
                        node["deadlineChunkSize"].value())
                if "deadlineConcurrentTasks" in node.knobs():
                    creator_attr["farm_concurency"] = (
                        node["deadlineConcurrentTasks"].value())

        _remove_old_knobs(node)

        # add new instance knob with transfer data
        set_node_data(
            node, INSTANCE_DATA_KNOB, transfer_data)

    nuke.scriptSave()


def _remove_old_knobs(node):
    remove_knobs = [
        "review", "publish", "render", "suspend_publish", "warn", "divd",
        "OpenpypeDataGroup", "OpenpypeDataGroup_End", "deadlinePriority",
        "deadlineChunkSize", "deadlineConcurrentTasks", "Deadline"
    ]
    print(node.name())

    # remove all old knobs
    for knob in node.allKnobs():
        try:
            if knob.name() in remove_knobs:
                node.removeKnob(knob)
            elif "avalon" in knob.name():
                node.removeKnob(knob)
        except ValueError:
            pass

49
openpype/hosts/nuke/plugins/create/convert_legacy.py
Normal file

@@ -0,0 +1,49 @@
from openpype.pipeline.create.creator_plugins import SubsetConvertorPlugin
from openpype.hosts.nuke.api.lib import (
    INSTANCE_DATA_KNOB,
    get_node_data,
    get_avalon_knob_data
)
from openpype.hosts.nuke.api.plugin import convert_to_valid_instaces

import nuke


class LegacyConverted(SubsetConvertorPlugin):
    identifier = "legacy.converter"

    def find_instances(self):

        legacy_found = False
        # search for first available legacy item
        for node in nuke.allNodes(recurseGroups=True):

            if node.Class() in ["Viewer", "Dot"]:
                continue

            if get_node_data(node, INSTANCE_DATA_KNOB):
                continue

            # get data from avalon knob
            avalon_knob_data = get_avalon_knob_data(
                node, ["avalon:", "ak:"], create=False)

            if not avalon_knob_data:
                continue

            if avalon_knob_data["id"] != "pyblish.avalon.instance":
                continue

            # catch and break
            legacy_found = True
            break

        if legacy_found:
            # add the convertor item only when a legacy instance was found
            self.add_convertor_item("Convert legacy instances")

    def convert(self):
        # loop all instances and convert them
        convert_to_valid_instaces()
        # remove legacy item if all is fine
        self.remove_convertor_item()

@@ -1,56 +1,55 @@
import nuke
from openpype.hosts.nuke.api import plugin
from openpype.hosts.nuke.api.lib import (
    select_nodes,
    set_avalon_knob_data
from nukescripts import autoBackdrop

from openpype.hosts.nuke.api import (
    NukeCreator,
    NukeCreatorError,
    maintained_selection,
    select_nodes
)


class CreateBackdrop(plugin.OpenPypeCreator):
class CreateBackdrop(NukeCreator):
    """Add Publishable Backdrop"""

    name = "nukenodes"
    label = "Create Backdrop"
    identifier = "create_backdrop"
    label = "Nukenodes (backdrop)"
    family = "nukenodes"
    icon = "file-archive-o"
    defaults = ["Main"]
    maintain_selection = True

    def __init__(self, *args, **kwargs):
        super(CreateBackdrop, self).__init__(*args, **kwargs)
        self.nodes = nuke.selectedNodes()
        self.node_color = "0xdfea5dff"
        return
    # plugin attributes
    node_color = "0xdfea5dff"

    def process(self):
        from nukescripts import autoBackdrop
        nodes = list()
        if (self.options or {}).get("useSelection"):
            nodes = self.nodes
    def create_instance_node(
        self,
        node_name,
        knobs=None,
        parent=None,
        node_type=None
    ):
        with maintained_selection():
            if len(self.selected_nodes) >= 1:
                select_nodes(self.selected_nodes)

            if len(nodes) >= 1:
                select_nodes(nodes)
            created_node = autoBackdrop()
            created_node["name"].setValue(node_name)
            created_node["tile_color"].setValue(int(self.node_color, 16))
            created_node["note_font_size"].setValue(24)
            created_node["label"].setValue("[{}]".format(node_name))

                bckd_node = autoBackdrop()
                bckd_node["name"].setValue("{}_BDN".format(self.name))
                bckd_node["tile_color"].setValue(int(self.node_color, 16))
                bckd_node["note_font_size"].setValue(24)
                bckd_node["label"].setValue("[{}]".format(self.name))
                # add avalon knobs
                instance = set_avalon_knob_data(bckd_node, self.data)

                return instance
            else:
                msg = str("Please select nodes you "
                          "wish to add to a container")
                self.log.error(msg)
                nuke.message(msg)
                return
        else:
            bckd_node = autoBackdrop()
            bckd_node["name"].setValue("{}_BDN".format(self.name))
            bckd_node["tile_color"].setValue(int(self.node_color, 16))
            bckd_node["note_font_size"].setValue(24)
            bckd_node["label"].setValue("[{}]".format(self.name))
            # add avalon knobs
            instance = set_avalon_knob_data(bckd_node, self.data)
            self.add_info_knob(created_node)

            return instance
            return created_node

    def create(self, subset_name, instance_data, pre_create_data):
        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        instance = super(CreateBackdrop, self).create(
            subset_name,
            instance_data,
            pre_create_data
        )

        return instance

@@ -1,55 +1,68 @@
import nuke
from openpype.hosts.nuke.api import plugin
from openpype.hosts.nuke.api.lib import (
    set_avalon_knob_data
from openpype.hosts.nuke.api import (
    NukeCreator,
    NukeCreatorError,
    maintained_selection
)


class CreateCamera(plugin.OpenPypeCreator):
    """Add Publishable Backdrop"""
class CreateCamera(NukeCreator):
    """Add Publishable Camera"""

    name = "camera"
    label = "Create 3d Camera"
    identifier = "create_camera"
    label = "Camera (3d)"
    family = "camera"
    icon = "camera"
    defaults = ["Main"]

    def __init__(self, *args, **kwargs):
        super(CreateCamera, self).__init__(*args, **kwargs)
        self.nodes = nuke.selectedNodes()
        self.node_color = "0xff9100ff"
        return
    # plugin attributes
    node_color = "0xff9100ff"

    def process(self):
        nodes = list()
        if (self.options or {}).get("useSelection"):
            nodes = self.nodes

            if len(nodes) >= 1:
                # loop selected nodes
                for n in nodes:
                    data = self.data.copy()
                    if len(nodes) > 1:
                        # rename subset name only if more
                        # then one node are selected
                        subset = self.family + n["name"].value().capitalize()
                        data["subset"] = subset

                    # change node color
                    n["tile_color"].setValue(int(self.node_color, 16))
                    # add avalon knobs
                    set_avalon_knob_data(n, data)
                return True
    def create_instance_node(
        self,
        node_name,
        knobs=None,
        parent=None,
        node_type=None
    ):
        with maintained_selection():
            if self.selected_nodes:
                node = self.selected_nodes[0]
                if node.Class() != "Camera3":
                    raise NukeCreatorError(
                        "Creator error: Select only camera node type")
                created_node = self.selected_nodes[0]
            else:
                msg = str("Please select nodes you "
                          "wish to add to a container")
                self.log.error(msg)
                nuke.message(msg)
                return
                created_node = nuke.createNode("Camera2")

            created_node["tile_color"].setValue(
                int(self.node_color, 16))

            created_node["name"].setValue(node_name)

            self.add_info_knob(created_node)

            return created_node

    def create(self, subset_name, instance_data, pre_create_data):
        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        instance = super(CreateCamera, self).create(
            subset_name,
            instance_data,
            pre_create_data
        )

        return instance

    def set_selected_nodes(self, pre_create_data):
        if pre_create_data.get("use_selection"):
            self.selected_nodes = nuke.selectedNodes()
            if self.selected_nodes == []:
                raise NukeCreatorError(
                    "Creator error: No active selection")
            elif len(self.selected_nodes) > 1:
                raise NukeCreatorError(
                    "Creator error: Select only one camera node")
        else:
            # if selected is off then create one node
            camera_node = nuke.createNode("Camera2")
            camera_node["tile_color"].setValue(int(self.node_color, 16))
            # add avalon knobs
            instance = set_avalon_knob_data(camera_node, self.data)
            return instance
            self.selected_nodes = []

@@ -1,87 +1,67 @@
import nuke

from openpype.hosts.nuke.api import plugin
from openpype.hosts.nuke.api.lib import (
    maintained_selection,
    select_nodes,
    set_avalon_knob_data
from openpype.hosts.nuke.api import (
    NukeCreator,
    NukeCreatorError,
    maintained_selection
)


class CreateGizmo(plugin.OpenPypeCreator):
    """Add Publishable "gizmo" group
class CreateGizmo(NukeCreator):
    """Add Publishable Group as gizmo"""

    The name is symbolically gizmo as presumably
    it is something familiar to nuke users as group of nodes
    distributed downstream in workflow
    """

    name = "gizmo"
    label = "Gizmo"
    identifier = "create_gizmo"
    label = "Gizmo (group)"
    family = "gizmo"
    icon = "file-archive-o"
    defaults = ["ViewerInput", "Lut", "Effect"]
    default_variants = ["ViewerInput", "Lut", "Effect"]

    def __init__(self, *args, **kwargs):
        super(CreateGizmo, self).__init__(*args, **kwargs)
        self.nodes = nuke.selectedNodes()
        self.node_color = "0x7533c1ff"
        return

    def process(self):
        if (self.options or {}).get("useSelection"):
            nodes = self.nodes
            self.log.info(len(nodes))
            if len(nodes) == 1:
                select_nodes(nodes)
                node = nodes[-1]
                # check if Group node
                if node.Class() in "Group":
                    node["name"].setValue("{}_GZM".format(self.name))
                    node["tile_color"].setValue(int(self.node_color, 16))
                    return set_avalon_knob_data(node, self.data)
                else:
                    msg = ("Please select a group node "
                           "you wish to publish as the gizmo")
                    self.log.error(msg)
                    nuke.message(msg)

            if len(nodes) >= 2:
                select_nodes(nodes)
                nuke.makeGroup()
                gizmo_node = nuke.selectedNode()
                gizmo_node["name"].setValue("{}_GZM".format(self.name))
                gizmo_node["tile_color"].setValue(int(self.node_color, 16))

                # add sticky node with guide
                with gizmo_node:
                    sticky = nuke.createNode("StickyNote")
                    sticky["label"].setValue(
                        "Add following:\n- set Input"
                        " nodes\n- set one Output1\n"
                        "- create User knobs on the group")

                # add avalon knobs
                return set_avalon_knob_data(gizmo_node, self.data)
    # plugin attributes
    node_color = "0x7533c1ff"

    def create_instance_node(
        self,
        node_name,
        knobs=None,
        parent=None,
        node_type=None
    ):
        with maintained_selection():
            if self.selected_nodes:
                node = self.selected_nodes[0]
                if node.Class() != "Group":
                    raise NukeCreatorError(
                        "Creator error: Select only 'Group' node type")
                created_node = node
            else:
                msg = "Please select nodes you wish to add to the gizmo"
                self.log.error(msg)
                nuke.message(msg)
                return
                created_node = nuke.collapseToGroup()

            created_node["tile_color"].setValue(
                int(self.node_color, 16))

            created_node["name"].setValue(node_name)

            self.add_info_knob(created_node)

            return created_node

    def create(self, subset_name, instance_data, pre_create_data):
        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        instance = super(CreateGizmo, self).create(
            subset_name,
            instance_data,
            pre_create_data
        )

        return instance

    def set_selected_nodes(self, pre_create_data):
        if pre_create_data.get("use_selection"):
            self.selected_nodes = nuke.selectedNodes()
            if self.selected_nodes == []:
                raise NukeCreatorError("Creator error: No active selection")
            elif len(self.selected_nodes) > 1:
                raise NukeCreatorError(
                    "Creator error: Select only one 'Group' node")
        else:
            with maintained_selection():
                gizmo_node = nuke.createNode("Group")
                gizmo_node["name"].setValue("{}_GZM".format(self.name))
                gizmo_node["tile_color"].setValue(int(self.node_color, 16))

                # add sticky node with guide
                with gizmo_node:
                    sticky = nuke.createNode("StickyNote")
                    sticky["label"].setValue(
                        "Add following:\n- add Input"
                        " nodes\n- add one Output1\n"
                        "- create User knobs on the group")

                # add avalon knobs
                return set_avalon_knob_data(gizmo_node, self.data)
            self.selected_nodes = []
|
||||
|
|
|
|||
|
|
@@ -1,87 +1,67 @@
import nuke
from openpype.hosts.nuke.api import plugin
from openpype.hosts.nuke.api.lib import (
    set_avalon_knob_data
from openpype.hosts.nuke.api import (
    NukeCreator,
    NukeCreatorError,
    maintained_selection
)


class CreateModel(plugin.OpenPypeCreator):
    """Add Publishable Model Geometry"""
class CreateModel(NukeCreator):
    """Add Publishable Model"""

    name = "model"
    label = "Create 3d Model"
    identifier = "create_model"
    label = "Model (3d)"
    family = "model"
    icon = "cube"
    defaults = ["Main"]
    default_variants = ["Main"]

    def __init__(self, *args, **kwargs):
        super(CreateModel, self).__init__(*args, **kwargs)
        self.nodes = nuke.selectedNodes()
        self.node_color = "0xff3200ff"
        return
    # plugin attributes
    node_color = "0xff3200ff"

    def process(self):
        nodes = list()
        if (self.options or {}).get("useSelection"):
            nodes = self.nodes
            for n in nodes:
                n['selected'].setValue(0)
            end_nodes = list()

            # get the latest nodes in tree for selection
            for n in nodes:
                x = n
                end = 0
                while end == 0:
                    try:
                        x = x.dependent()[0]
                    except:
                        end_node = x
                        end = 1
                end_nodes.append(end_node)

            # set end_nodes
            end_nodes = list(set(end_nodes))

            # check if nodes are 3d nodes
            for n in end_nodes:
                n['selected'].setValue(1)
                sn = nuke.createNode("Scene")
                if not sn.input(0):
                    end_nodes.remove(n)
                nuke.delete(sn)

            # loop over end nodes
            for n in end_nodes:
                n['selected'].setValue(1)

            self.nodes = nuke.selectedNodes()
        nodes = self.nodes
        if len(nodes) >= 1:
            # loop selected nodes
            for n in nodes:
                data = self.data.copy()
                if len(nodes) > 1:
                    # rename subset name only if more
                    # than one node is selected
                    subset = self.family + n["name"].value().capitalize()
                    data["subset"] = subset

                # change node color
                n["tile_color"].setValue(int(self.node_color, 16))
                # add avalon knobs
                set_avalon_knob_data(n, data)
            return True
    def create_instance_node(
        self,
        node_name,
        knobs=None,
        parent=None,
        node_type=None
    ):
        with maintained_selection():
            if self.selected_nodes:
                node = self.selected_nodes[0]
                if node.Class() != "Scene":
                    raise NukeCreatorError(
                        "Creator error: Select only 'Scene' node type")
                created_node = node
            else:
                msg = str("Please select nodes you "
                          "wish to add to a container")
                self.log.error(msg)
                nuke.message(msg)
                return
                created_node = nuke.createNode("Scene")

            created_node["tile_color"].setValue(
                int(self.node_color, 16))

            created_node["name"].setValue(node_name)

            self.add_info_knob(created_node)

            return created_node

    def create(self, subset_name, instance_data, pre_create_data):
        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        instance = super(CreateModel, self).create(
            subset_name,
            instance_data,
            pre_create_data
        )

        return instance

    def set_selected_nodes(self, pre_create_data):
        if pre_create_data.get("use_selection"):
            self.selected_nodes = nuke.selectedNodes()
            if self.selected_nodes == []:
                raise NukeCreatorError("Creator error: No active selection")
            elif len(self.selected_nodes) > 1:
                raise NukeCreatorError(
                    "Creator error: Select only one 'Scene' node")
        else:
            # if selected is off then create one node
            model_node = nuke.createNode("WriteGeo")
            model_node["tile_color"].setValue(int(self.node_color, 16))
            # add avalon knobs
            instance = set_avalon_knob_data(model_node, self.data)
            return instance
            self.selected_nodes = []
@@ -1,57 +0,0 @@
from collections import OrderedDict

import nuke

from openpype.hosts.nuke.api import plugin
from openpype.hosts.nuke.api.lib import (
    set_avalon_knob_data
)


class CrateRead(plugin.OpenPypeCreator):
    # change this to template preset
    name = "ReadCopy"
    label = "Create Read Copy"
    hosts = ["nuke"]
    family = "source"
    families = family
    icon = "film"
    defaults = ["Effect", "Backplate", "Fire", "Smoke"]

    def __init__(self, *args, **kwargs):
        super(CrateRead, self).__init__(*args, **kwargs)
        self.nodes = nuke.selectedNodes()
        data = OrderedDict()
        data['family'] = self.family
        data['families'] = self.families

        for k, v in self.data.items():
            if k not in data.keys():
                data.update({k: v})

        self.data = data

    def process(self):
        self.name = self.data["subset"]
        nodes = self.nodes

        if not nodes or len(nodes) == 0:
            msg = "Please select Read node"
            self.log.error(msg)
            nuke.message(msg)
        else:
            count_reads = 0
            for node in nodes:
                if node.Class() != 'Read':
                    continue
                avalon_data = self.data
                avalon_data['subset'] = "{}".format(self.name)
                set_avalon_knob_data(node, avalon_data)
                node['tile_color'].setValue(16744935)
                count_reads += 1

            if count_reads < 1:
                msg = "Please select Read node"
                self.log.error(msg)
                nuke.message(msg)
                return
88 openpype/hosts/nuke/plugins/create/create_source.py Normal file
@@ -0,0 +1,88 @@
import nuke
import six
import sys
from openpype.hosts.nuke.api import (
    INSTANCE_DATA_KNOB,
    NukeCreator,
    NukeCreatorError,
    set_node_data
)
from openpype.pipeline import (
    CreatedInstance
)


class CreateSource(NukeCreator):
    """Add Publishable Read with source"""

    identifier = "create_source"
    label = "Source (read)"
    family = "source"
    icon = "film"
    default_variants = ["Effect", "Backplate", "Fire", "Smoke"]

    # plugin attributes
    node_color = "0xff9100ff"

    def create_instance_node(
        self,
        node_name,
        read_node
    ):
        read_node["tile_color"].setValue(
            int(self.node_color, 16))
        read_node["name"].setValue(node_name)
        self.add_info_knob(read_node)
        return read_node

    def create(self, subset_name, instance_data, pre_create_data):

        # make sure selected nodes are added
        self.set_selected_nodes(pre_create_data)

        try:
            for read_node in self.selected_nodes:
                if read_node.Class() != 'Read':
                    continue

                node_name = read_node.name()
                _subset_name = subset_name + node_name

                # make sure subset name is unique
                self.check_existing_subset(_subset_name)

                instance_node = self.create_instance_node(
                    _subset_name,
                    read_node
                )
                instance = CreatedInstance(
                    self.family,
                    _subset_name,
                    instance_data,
                    self
                )

                instance.transient_data["node"] = instance_node

                self._add_instance_to_context(instance)

                set_node_data(
                    instance_node,
                    INSTANCE_DATA_KNOB,
                    instance.data_to_store()
                )

        except Exception as er:
            six.reraise(
                NukeCreatorError,
                NukeCreatorError("Creator error: {}".format(er)),
                sys.exc_info()[2])

    def set_selected_nodes(self, pre_create_data):
        if pre_create_data.get("use_selection"):
            self.selected_nodes = nuke.selectedNodes()
            if self.selected_nodes == []:
                raise NukeCreatorError("Creator error: No active selection")
        else:
            raise NukeCreatorError(
                "Creator error: only supported with active selection")
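`CreateSource.create` above wraps any failure into `NukeCreatorError` with `six.reraise`, which keeps the original traceback on both Python 2 and 3. The idiom in isolation:

    import sys

    import six

    class NukeCreatorError(Exception):
        pass

    def risky():
        raise ValueError("bad knob value")

    try:
        risky()
    except Exception as er:
        # re-raise as NukeCreatorError but keep the original traceback
        six.reraise(
            NukeCreatorError,
            NukeCreatorError("Creator error: {}".format(er)),
            sys.exc_info()[2])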
173 openpype/hosts/nuke/plugins/create/create_write_image.py Normal file
@@ -0,0 +1,173 @@
import nuke
import sys
import six

from openpype.pipeline import (
    CreatedInstance
)
from openpype.lib import (
    BoolDef,
    NumberDef,
    UISeparatorDef,
    EnumDef
)
from openpype.hosts.nuke import api as napi


class CreateWriteImage(napi.NukeWriteCreator):
    identifier = "create_write_image"
    label = "Image (write)"
    family = "image"
    icon = "sign-out"

    instance_attributes = [
        "use_range_limit"
    ]
    default_variants = [
        "StillFrame",
        "MPFrame",
        "LayoutFrame"
    ]
    temp_rendering_path_template = (
        "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}")

    def get_pre_create_attr_defs(self):
        attr_defs = [
            BoolDef(
                "use_selection",
                default=True,
                label="Use selection"
            ),
            self._get_render_target_enum(),
            UISeparatorDef(),
            self._get_frame_source_number()
        ]
        return attr_defs

    def _get_render_target_enum(self):
        rendering_targets = {
            "local": "Local machine rendering",
            "frames": "Use existing frames"
        }

        return EnumDef(
            "render_target",
            items=rendering_targets,
            label="Render target"
        )

    def _get_frame_source_number(self):
        return NumberDef(
            "active_frame",
            label="Active frame",
            default=nuke.frame()
        )

    def get_instance_attr_defs(self):
        attr_defs = [
            self._get_render_target_enum(),
            self._get_reviewable_bool()
        ]
        return attr_defs

    def create_instance_node(self, subset_name, instance_data):
        linked_knobs_ = []
        if "use_range_limit" in self.instance_attributes:
            linked_knobs_ = ["channels", "___", "first", "last", "use_limit"]

        # add fpath_template
        write_data = {
            "creator": self.__class__.__name__,
            "subset": subset_name,
            "fpath_template": self.temp_rendering_path_template
        }
        write_data.update(instance_data)

        created_node = napi.create_write_node(
            subset_name,
            write_data,
            input=self.selected_node,
            prenodes=self.prenodes,
            linked_knobs=linked_knobs_,
            **{
                "frame": nuke.frame()
            }
        )
        self.add_info_knob(created_node)

        self._add_frame_range_limit(created_node, instance_data)

        self.integrate_links(created_node, outputs=True)

        return created_node

    def create(self, subset_name, instance_data, pre_create_data):
        subset_name = subset_name.format(**pre_create_data)

        # pass values from precreate to instance
        self.pass_pre_attributes_to_instance(
            instance_data,
            pre_create_data,
            [
                "active_frame",
                "render_target"
            ]
        )

        # make sure selected nodes are added
        self.set_selected_nodes(pre_create_data)

        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        instance_node = self.create_instance_node(
            subset_name,
            instance_data,
        )

        try:
            instance = CreatedInstance(
                self.family,
                subset_name,
                instance_data,
                self
            )

            instance.transient_data["node"] = instance_node

            self._add_instance_to_context(instance)

            napi.set_node_data(
                instance_node,
                napi.INSTANCE_DATA_KNOB,
                instance.data_to_store()
            )

            return instance

        except Exception as er:
            six.reraise(
                napi.NukeCreatorError,
                napi.NukeCreatorError("Creator error: {}".format(er)),
                sys.exc_info()[2]
            )

    def _add_frame_range_limit(self, write_node, instance_data):
        if "use_range_limit" not in self.instance_attributes:
            return

        active_frame = (
            instance_data["creator_attributes"].get("active_frame"))

        write_node.begin()
        for n in nuke.allNodes():
            # get write node
            if n.Class() in "Write":
                w_node = n
        write_node.end()

        w_node["use_limit"].setValue(True)
        w_node["first"].setValue(active_frame or nuke.frame())
        w_node["last"].setExpression("first")

        return write_node
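`_add_frame_range_limit` above pins the inner Write node to a single frame: `use_limit` is switched on, `first` gets the active frame, and `last` is tied to `first` with an expression so both knobs always agree. The same knob setup on any Write node (the node name here is hypothetical):

    import nuke

    w_node = nuke.toNode("Write1")  # hypothetical node name
    w_node["use_limit"].setValue(True)
    w_node["first"].setValue(1001)
    # "last" now follows "first", so exactly one frame is rendered
    w_node["last"].setExpression("first")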
@@ -1,56 +1,176 @@
import nuke
import sys
import six

from openpype.hosts.nuke.api import plugin
from openpype.hosts.nuke.api.lib import (
    create_write_node, create_write_node_legacy)
from openpype.pipeline import (
    CreatedInstance
)
from openpype.lib import (
    BoolDef,
    NumberDef,
    UISeparatorDef,
    UILabelDef
)
from openpype.hosts.nuke import api as napi


class CreateWritePrerender(plugin.AbstractWriteRender):
    # change this to template preset
    name = "WritePrerender"
    label = "Create Write Prerender"
    hosts = ["nuke"]
    n_class = "Write"
class CreateWritePrerender(napi.NukeWriteCreator):
    identifier = "create_write_prerender"
    label = "Prerender (write)"
    family = "prerender"
    icon = "sign-out"

    # settings
    fpath_template = "{work}/render/nuke/{subset}/{subset}.{frame}.{ext}"
    defaults = ["Key01", "Bg01", "Fg01", "Branch01", "Part01"]
    reviewable = False
    use_range_limit = True
    instance_attributes = [
        "use_range_limit"
    ]
    default_variants = [
        "Key01",
        "Bg01",
        "Fg01",
        "Branch01",
        "Part01"
    ]
    temp_rendering_path_template = (
        "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}")

    def __init__(self, *args, **kwargs):
        super(CreateWritePrerender, self).__init__(*args, **kwargs)
    def get_pre_create_attr_defs(self):
        attr_defs = [
            BoolDef(
                "use_selection",
                default=True,
                label="Use selection"
            ),
            self._get_render_target_enum()
        ]
        return attr_defs

    def get_instance_attr_defs(self):
        attr_defs = [
            self._get_render_target_enum(),
            self._get_reviewable_bool()
        ]
        if "farm_rendering" in self.instance_attributes:
            attr_defs.extend([
                UISeparatorDef(),
                UILabelDef("Farm rendering attributes"),
                BoolDef("suspended_publish", label="Suspended publishing"),
                NumberDef(
                    "farm_priority",
                    label="Priority",
                    minimum=1,
                    maximum=99,
                    default=50
                ),
                NumberDef(
                    "farm_chunk",
                    label="Chunk size",
                    minimum=1,
                    maximum=99,
                    default=10
                ),
                NumberDef(
                    "farm_concurency",
                    label="Concurrent tasks",
                    minimum=1,
                    maximum=10,
                    default=1
                )
            ])
        return attr_defs

    def create_instance_node(self, subset_name, instance_data):
        linked_knobs_ = []
        if "use_range_limit" in self.instance_attributes:
            linked_knobs_ = ["channels", "___", "first", "last", "use_limit"]

    def _create_write_node(self, selected_node, inputs, outputs, write_data):
        # add fpath_template
        write_data["fpath_template"] = self.fpath_template
        write_data["use_range_limit"] = self.use_range_limit
        write_data["frame_range"] = (
            nuke.root()["first_frame"].value(),
            nuke.root()["last_frame"].value()
        write_data = {
            "creator": self.__class__.__name__,
            "subset": subset_name,
            "fpath_template": self.temp_rendering_path_template
        }

        write_data.update(instance_data)

        # get width and height
        if self.selected_node:
            width, height = (
                self.selected_node.width(), self.selected_node.height())
        else:
            actual_format = nuke.root().knob('format').value()
            width, height = (actual_format.width(), actual_format.height())

        created_node = napi.create_write_node(
            subset_name,
            write_data,
            input=self.selected_node,
            prenodes=self.prenodes,
            linked_knobs=linked_knobs_,
            **{
                "width": width,
                "height": height
            }
        )
        self.add_info_knob(created_node)

        self._add_frame_range_limit(created_node)

        self.integrate_links(created_node, outputs=True)

        return created_node

    def create(self, subset_name, instance_data, pre_create_data):
        # pass values from precreate to instance
        self.pass_pre_attributes_to_instance(
            instance_data,
            pre_create_data,
            [
                "render_target"
            ]
        )

        if not self.is_legacy():
            return create_write_node(
                self.data["subset"],
                write_data,
                input=selected_node,
                review=self.reviewable,
                linked_knobs=["channels", "___", "first", "last", "use_limit"]
            )
        else:
            return create_write_node_legacy(
                self.data["subset"],
                write_data,
                input=selected_node,
                review=self.reviewable,
                linked_knobs=["channels", "___", "first", "last", "use_limit"]
        # make sure selected nodes are added
        self.set_selected_nodes(pre_create_data)

        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        instance_node = self.create_instance_node(
            subset_name,
            instance_data
        )

        try:
            instance = CreatedInstance(
                self.family,
                subset_name,
                instance_data,
                self
            )

    def _modify_write_node(self, write_node):
        # open group node
            instance.transient_data["node"] = instance_node

            self._add_instance_to_context(instance)

            napi.set_node_data(
                instance_node,
                napi.INSTANCE_DATA_KNOB,
                instance.data_to_store()
            )

            return instance

        except Exception as er:
            six.reraise(
                napi.NukeCreatorError,
                napi.NukeCreatorError("Creator error: {}".format(er)),
                sys.exc_info()[2]
            )

    def _add_frame_range_limit(self, write_node):
        if "use_range_limit" not in self.instance_attributes:
            return

        write_node.begin()
        for n in nuke.allNodes():
            # get write node

@@ -58,9 +178,8 @@ class CreateWritePrerender(plugin.AbstractWriteRender):
            w_node = n
        write_node.end()

        if self.use_range_limit:
            w_node["use_limit"].setValue(True)
            w_node["first"].setValue(nuke.root()["first_frame"].value())
            w_node["last"].setValue(nuke.root()["last_frame"].value())
        w_node["use_limit"].setValue(True)
        w_node["first"].setValue(nuke.root()["first_frame"].value())
        w_node["last"].setValue(nuke.root()["last_frame"].value())

        return write_node
@@ -1,86 +1,157 @@
import nuke
import sys
import six

from openpype.hosts.nuke.api import plugin
from openpype.hosts.nuke.api.lib import (
    create_write_node, create_write_node_legacy)
from openpype.pipeline import (
    CreatedInstance
)
from openpype.lib import (
    BoolDef,
    NumberDef,
    UISeparatorDef,
    UILabelDef
)
from openpype.hosts.nuke import api as napi


class CreateWriteRender(plugin.AbstractWriteRender):
    # change this to template preset
    name = "WriteRender"
    label = "Create Write Render"
    hosts = ["nuke"]
    n_class = "Write"
class CreateWriteRender(napi.NukeWriteCreator):
    identifier = "create_write_render"
    label = "Render (write)"
    family = "render"
    icon = "sign-out"

    # settings
    fpath_template = "{work}/render/nuke/{subset}/{subset}.{frame}.{ext}"
    defaults = ["Main", "Mask"]
    prenodes = {
        "Reformat01": {
            "nodeclass": "Reformat",
            "dependent": None,
            "knobs": [
                {
                    "type": "text",
                    "name": "resize",
                    "value": "none"
                },
                {
                    "type": "bool",
                    "name": "black_outside",
                    "value": True
                }
            ]
        }
    }
    instance_attributes = [
        "reviewable"
    ]
    default_variants = [
        "Main",
        "Mask"
    ]
    temp_rendering_path_template = (
        "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}")

    def __init__(self, *args, **kwargs):
        super(CreateWriteRender, self).__init__(*args, **kwargs)
    def get_pre_create_attr_defs(self):
        attr_defs = [
            BoolDef(
                "use_selection",
                default=True,
                label="Use selection"
            ),
            self._get_render_target_enum()
        ]
        return attr_defs

    def _create_write_node(self, selected_node, inputs, outputs, write_data):
    def get_instance_attr_defs(self):
        attr_defs = [
            self._get_render_target_enum(),
            self._get_reviewable_bool()
        ]
        if "farm_rendering" in self.instance_attributes:
            attr_defs.extend([
                UISeparatorDef(),
                UILabelDef("Farm rendering attributes"),
                BoolDef("suspended_publish", label="Suspended publishing"),
                NumberDef(
                    "farm_priority",
                    label="Priority",
                    minimum=1,
                    maximum=99,
                    default=50
                ),
                NumberDef(
                    "farm_chunk",
                    label="Chunk size",
                    minimum=1,
                    maximum=99,
                    default=10
                ),
                NumberDef(
                    "farm_concurency",
                    label="Concurrent tasks",
                    minimum=1,
                    maximum=10,
                    default=1
                )
            ])
        return attr_defs

    def create_instance_node(self, subset_name, instance_data):
        # add fpath_template
        write_data["fpath_template"] = self.fpath_template
        write_data = {
            "creator": self.__class__.__name__,
            "subset": subset_name,
            "fpath_template": self.temp_rendering_path_template
        }

        write_data.update(instance_data)

        # add reformat node to cut off all outside of format bounding box
        # get width and height
        try:
            width, height = (selected_node.width(), selected_node.height())
        except AttributeError:
        if self.selected_node:
            width, height = (
                self.selected_node.width(), self.selected_node.height())
        else:
            actual_format = nuke.root().knob('format').value()
            width, height = (actual_format.width(), actual_format.height())

        if not self.is_legacy():
            return create_write_node(
                self.data["subset"],
                write_data,
                input=selected_node,
                prenodes=self.prenodes,
                **{
                    "width": width,
                    "height": height
                }
            )
        else:
            _prenodes = [
                {
                    "name": "Reformat01",
                    "class": "Reformat",
                    "knobs": [
                        ("resize", 0),
                        ("black_outside", 1),
                    ],
                    "dependent": None
                }
        created_node = napi.create_write_node(
            subset_name,
            write_data,
            input=self.selected_node,
            prenodes=self.prenodes,
            **{
                "width": width,
                "height": height
            }
        )
        self.add_info_knob(created_node)

        self.integrate_links(created_node, outputs=False)

        return created_node

    def create(self, subset_name, instance_data, pre_create_data):
        # pass values from precreate to instance
        self.pass_pre_attributes_to_instance(
            instance_data,
            pre_create_data,
            [
                "render_target"
            ]
        )
        # make sure selected nodes are added
        self.set_selected_nodes(pre_create_data)

            return create_write_node_legacy(
                self.data["subset"],
                write_data,
                input=selected_node,
                prenodes=_prenodes
        # make sure subset name is unique
        self.check_existing_subset(subset_name)

        instance_node = self.create_instance_node(
            subset_name,
            instance_data
        )

        try:
            instance = CreatedInstance(
                self.family,
                subset_name,
                instance_data,
                self
            )

    def _modify_write_node(self, write_node):
        return write_node
            instance.transient_data["node"] = instance_node

            self._add_instance_to_context(instance)

            napi.set_node_data(
                instance_node,
                napi.INSTANCE_DATA_KNOB,
                instance.data_to_store()
            )

            return instance

        except Exception as er:
            six.reraise(
                napi.NukeCreatorError,
                napi.NukeCreatorError("Creator error: {}".format(er)),
                sys.exc_info()[2]
            )
@@ -1,105 +0,0 @@
import nuke

from openpype.hosts.nuke.api import plugin
from openpype.hosts.nuke.api.lib import (
    create_write_node,
    create_write_node_legacy,
    get_created_node_imageio_setting_legacy
)

# HACK: just to disable still image on projects which
# are not having anatomy imageio preset for CreateWriteStill
# TODO: remove this code as soon as it will be obsolete
imageio_writes = get_created_node_imageio_setting_legacy(
    "Write",
    "CreateWriteStill",
    "stillMain"
)
print(imageio_writes["knobs"])


class CreateWriteStill(plugin.AbstractWriteRender):
    # change this to template preset
    name = "WriteStillFrame"
    label = "Create Write Still Image"
    hosts = ["nuke"]
    n_class = "Write"
    family = "still"
    icon = "image"

    # settings
    fpath_template = "{work}/render/nuke/{subset}/{subset}.{ext}"
    defaults = [
        "ImageFrame",
        "MPFrame",
        "LayoutFrame"
    ]
    prenodes = {
        "FrameHold01": {
            "nodeclass": "FrameHold",
            "dependent": None,
            "knobs": [
                {
                    "type": "formatable",
                    "name": "first_frame",
                    "template": "{frame}",
                    "to_type": "number"
                }
            ]
        }
    }

    def __init__(self, *args, **kwargs):
        super(CreateWriteStill, self).__init__(*args, **kwargs)

    def _create_write_node(self, selected_node, inputs, outputs, write_data):
        # add fpath_template
        write_data["fpath_template"] = self.fpath_template

        if not self.is_legacy():
            return create_write_node(
                self.name,
                write_data,
                input=selected_node,
                review=False,
                prenodes=self.prenodes,
                farm=False,
                linked_knobs=["channels", "___", "first", "last", "use_limit"],
                **{
                    "frame": nuke.frame()
                }
            )
        else:
            _prenodes = [
                {
                    "name": "FrameHold01",
                    "class": "FrameHold",
                    "knobs": [
                        ("first_frame", nuke.frame())
                    ],
                    "dependent": None
                }
            ]
            return create_write_node_legacy(
                self.name,
                write_data,
                input=selected_node,
                review=False,
                prenodes=_prenodes,
                farm=False,
                linked_knobs=["channels", "___", "first", "last", "use_limit"]
            )

    def _modify_write_node(self, write_node):
        write_node.begin()
        for n in nuke.allNodes():
            # get write node
            if n.Class() in "Write":
                w_node = n
        write_node.end()

        w_node["use_limit"].setValue(True)
        w_node["first"].setValue(nuke.frame())
        w_node["last"].setValue(nuke.frame())

        return write_node
69 openpype/hosts/nuke/plugins/create/workfile_creator.py Normal file
@@ -0,0 +1,69 @@
import openpype.hosts.nuke.api as api
from openpype.client import get_asset_by_name
from openpype.pipeline import (
    AutoCreator,
    CreatedInstance,
    legacy_io,
)
from openpype.hosts.nuke.api import (
    INSTANCE_DATA_KNOB,
    set_node_data
)
import nuke


class WorkfileCreator(AutoCreator):
    identifier = "workfile"
    family = "workfile"

    default_variant = "Main"

    def get_instance_attr_defs(self):
        return []

    def collect_instances(self):
        root_node = nuke.root()
        instance_data = api.get_node_data(
            root_node, api.INSTANCE_DATA_KNOB
        )

        project_name = legacy_io.Session["AVALON_PROJECT"]
        asset_name = legacy_io.Session["AVALON_ASSET"]
        task_name = legacy_io.Session["AVALON_TASK"]
        host_name = legacy_io.Session["AVALON_APP"]

        asset_doc = get_asset_by_name(project_name, asset_name)
        subset_name = self.get_subset_name(
            self.default_variant, task_name, asset_doc,
            project_name, host_name
        )
        instance_data.update({
            "asset": asset_name,
            "task": task_name,
            "variant": self.default_variant
        })
        instance_data.update(self.get_dynamic_data(
            self.default_variant, task_name, asset_doc,
            project_name, host_name, instance_data
        ))

        instance = CreatedInstance(
            self.family, subset_name, instance_data, self
        )
        instance.transient_data["node"] = root_node
        self._add_instance_to_context(instance)

    def update_instances(self, update_list):
        for created_inst, _changes in update_list:
            instance_node = created_inst.transient_data["node"]

            set_node_data(
                instance_node,
                INSTANCE_DATA_KNOB,
                created_inst.data_to_store()
            )

    def create(self, options=None):
        # no need to create if it is created
        # in `collect_instances`
        pass
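`get_node_data`/`set_node_data` persist the instance dictionary on a knob of the root node so it survives with the script. A minimal sketch of such a JSON-in-a-knob round trip (the knob name and plain-JSON encoding are assumptions; the real helpers may encode differently):

    import json

    import nuke

    INSTANCE_DATA_KNOB = "publish_instance"  # assumed knob name

    def set_node_data(node, knob_name, data):
        # create the text knob on first use, then store serialized data
        if knob_name not in node.knobs():
            node.addKnob(nuke.String_Knob(knob_name, knob_name))
        node[knob_name].setValue(json.dumps(data))

    def get_node_data(node, knob_name):
        if knob_name not in node.knobs():
            return {}
        return json.loads(node[knob_name].getValue())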
@@ -1,9 +1,9 @@
from pprint import pformat
import pyblish.api
from openpype.hosts.nuke.api import lib as pnlib
import nuke


@pyblish.api.log
class CollectBackdrops(pyblish.api.InstancePlugin):
    """Collect Backdrop node instance and its content
    """

@@ -14,8 +14,9 @@ class CollectBackdrops(pyblish.api.InstancePlugin):
    families = ["nukenodes"]

    def process(self, instance):
        self.log.debug(pformat(instance.data))

        bckn = instance[0]
        bckn = instance.data["transientData"]["node"]

        # define size of the backdrop
        left = bckn.xpos()

@@ -23,6 +24,7 @@ class CollectBackdrops(pyblish.api.InstancePlugin):
        right = left + bckn['bdwidth'].value()
        bottom = top + bckn['bdheight'].value()

        instance.data["transientData"]["childNodes"] = []
        # iterate all nodes
        for node in nuke.allNodes():

@@ -37,17 +39,17 @@ class CollectBackdrops(pyblish.api.InstancePlugin):
                and (node.ypos() + node.screenHeight() < bottom):

                # add contained nodes to instance's node list
                instance.append(node)
                instance.data["transientData"]["childNodes"].append(node)

        # get all connections from outside of backdrop
        nodes = instance[1:]
        nodes = instance.data["transientData"]["childNodes"]
        connections_in, connections_out = pnlib.get_dependent_nodes(nodes)
        instance.data["nodeConnectionsIn"] = connections_in
        instance.data["nodeConnectionsOut"] = connections_out
        instance.data["transientData"]["nodeConnectionsIn"] = connections_in
        instance.data["transientData"]["nodeConnectionsOut"] = connections_out

        # make label nicer
        instance.data["label"] = "{0} ({1} nodes)".format(
            bckn.name(), len(instance) - 1)
            bckn.name(), len(instance.data["transientData"]["childNodes"]))

        instance.data["families"].append(instance.data["family"])

@@ -83,5 +85,4 @@ class CollectBackdrops(pyblish.api.InstancePlugin):
            "frameStart": first_frame,
            "frameEnd": last_frame
        })
        self.log.info("Backdrop content collected: `{}`".format(instance[:]))
        self.log.info("Backdrop instance collected: `{}`".format(instance))
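The backdrop collector keeps a node only when its whole on-screen rectangle sits inside the backdrop bounds. The containment test with plain numbers:

    def node_inside(left, top, right, bottom, x, y, w, h):
        # the node rectangle (x, y, w, h) must fit fully inside the backdrop
        return (
            x > left and y > top
            and x + w < right and y + h < bottom
        )

    print(node_inside(0, 0, 200, 100, 10, 10, 50, 20))   # True
    print(node_inside(0, 0, 200, 100, 180, 10, 50, 20))  # False, spills out right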
69 openpype/hosts/nuke/plugins/publish/collect_context_data.py Normal file
@@ -0,0 +1,69 @@
import os
import nuke
import pyblish.api
import openpype.api as api
import openpype.hosts.nuke.api as napi
from openpype.pipeline import KnownPublishError


class CollectContextData(pyblish.api.ContextPlugin):
    """Collect current context publish."""

    order = pyblish.api.CollectorOrder - 0.499
    label = "Collect context data"
    hosts = ['nuke']

    def process(self, context):  # sourcery skip: avoid-builtin-shadow
        root_node = nuke.root()

        current_file = os.path.normpath(root_node.name())

        if current_file.lower() == "root":
            raise KnownPublishError(
                "Workfile does not have a correct file name.\n"
                "Use workfile tool to manage the name correctly."
            )

        # Get frame range
        first_frame = int(root_node["first_frame"].getValue())
        last_frame = int(root_node["last_frame"].getValue())

        # get instance data from root
        root_instance_context = napi.get_node_data(
            root_node, napi.INSTANCE_DATA_KNOB
        )

        handle_start = root_instance_context["handleStart"]
        handle_end = root_instance_context["handleEnd"]

        # Get format
        format = root_node['format'].value()
        resolution_width = format.width()
        resolution_height = format.height()
        pixel_aspect = format.pixelAspect()

        script_data = {
            "frameStart": first_frame + handle_start,
            "frameEnd": last_frame - handle_end,
            "resolutionWidth": resolution_width,
            "resolutionHeight": resolution_height,
            "pixelAspect": pixel_aspect,

            # backward compatibility handles
            "handles": handle_start,
            "handleStart": handle_start,
            "handleEnd": handle_end,
            "step": 1,
            "fps": root_node['fps'].value(),

            "currentFile": current_file,
            "version": int(api.get_version_from_path(current_file)),

            "host": pyblish.api.current_host(),
            "hostVersion": nuke.NUKE_VERSION_STRING
        }

        context.data["scriptData"] = script_data
        context.data.update(script_data)

        self.log.info('Context from Nuke script collected')
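Note the handle arithmetic in `script_data`: the script's frame range includes handles, so the published range is pulled inward on both sides. For example, with a script range of 991-1110 and 10-frame handles, `frameStart = 991 + 10 = 1001` and `frameEnd = 1110 - 10 = 1100`.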
@@ -2,25 +2,23 @@ import pyblish.api
import nuke


@pyblish.api.log
class CollectGizmo(pyblish.api.InstancePlugin):
    """Collect Gizmo (group) node instance and its content
    """

    order = pyblish.api.CollectorOrder + 0.22
    label = "Collect Gizmo (Group)"
    label = "Collect Gizmo (group)"
    hosts = ["nuke"]
    families = ["gizmo"]

    def process(self, instance):

        grpn = instance[0]
        gizmo_node = instance.data["transientData"]["node"]

        # add family to families
        instance.data["families"].insert(0, instance.data["family"])
        # make label nicer
        instance.data["label"] = "{0} ({1} nodes)".format(
            grpn.name(), len(instance) - 1)
        instance.data["label"] = gizmo_node.name()

        # Get frame range
        handle_start = instance.context.data["handleStart"]

@@ -46,5 +44,4 @@ class CollectGizmo(pyblish.api.InstancePlugin):
            "frameStart": first_frame,
            "frameEnd": last_frame
        })
        self.log.info("Gizmo content collected: `{}`".format(instance[:]))
        self.log.info("Gizmo instance collected: `{}`".format(instance))
44 openpype/hosts/nuke/plugins/publish/collect_instance_data.py Normal file
@@ -0,0 +1,44 @@
import nuke
import pyblish.api


class CollectInstanceData(pyblish.api.InstancePlugin):
    """Collect all nodes with Avalon knob."""

    order = pyblish.api.CollectorOrder - 0.49
    label = "Collect Instance Data"
    hosts = ["nuke", "nukeassist"]

    # presets
    sync_workfile_version_on_families = []

    def process(self, instance):
        family = instance.data["family"]

        # Get format
        root = nuke.root()
        format_ = root['format'].value()
        resolution_width = format_.width()
        resolution_height = format_.height()
        pixel_aspect = format_.pixelAspect()

        # sync workfile version
        if family in self.sync_workfile_version_on_families:
            self.log.debug(
                "Syncing version with workfile for '{}'".format(
                    family
                )
            )
            # get version to instance for integration
            instance.data['version'] = instance.context.data['version']

        instance.data.update({
            "step": 1,
            "fps": root['fps'].value(),
            "resolutionWidth": resolution_width,
            "resolutionHeight": resolution_height,
            "pixelAspect": pixel_aspect

        })
        self.log.debug("Collected instance: {}".format(
            instance.data))
@@ -2,7 +2,6 @@ import pyblish.api
import nuke


@pyblish.api.log
class CollectModel(pyblish.api.InstancePlugin):
    """Collect Model node instance and its content
    """

@@ -14,12 +13,12 @@ class CollectModel(pyblish.api.InstancePlugin):

    def process(self, instance):

        grpn = instance[0]
        geo_node = instance.data["transientData"]["node"]

        # add family to families
        instance.data["families"].insert(0, instance.data["family"])
        # make label nicer
        instance.data["label"] = grpn.name()
        instance.data["label"] = geo_node.name()

        # Get frame range
        handle_start = instance.context.data["handleStart"]

@@ -45,5 +44,4 @@ class CollectModel(pyblish.api.InstancePlugin):
            "frameStart": first_frame,
            "frameEnd": last_frame
        })
        self.log.info("Model content collected: `{}`".format(instance[:]))
        self.log.info("Model instance collected: `{}`".format(instance))
@@ -2,12 +2,10 @@ import os
import re
import nuke
import pyblish.api

from openpype.client import get_asset_by_name
from openpype.pipeline import legacy_io


@pyblish.api.log
class CollectNukeReads(pyblish.api.InstancePlugin):
    """Collect all read nodes."""

@@ -17,6 +15,8 @@ class CollectNukeReads(pyblish.api.InstancePlugin):
    families = ["source"]

    def process(self, instance):
        node = instance.data["transientData"]["node"]

        project_name = legacy_io.active_project()
        asset_name = legacy_io.Session["AVALON_ASSET"]
        asset_doc = get_asset_by_name(project_name, asset_name)

@@ -25,7 +25,6 @@ class CollectNukeReads(pyblish.api.InstancePlugin):

        self.log.debug("checking instance: {}".format(instance))

        node = instance[0]
        if node.Class() != "Read":
            return

@@ -99,10 +98,7 @@ class CollectNukeReads(pyblish.api.InstancePlugin):
        }
        instance.data["representations"].append(representation)

        transfer = False
        if "publish" in node.knobs():
            transfer = node["publish"]

        transfer = node["publish"] if "publish" in node.knobs() else False
        instance.data['transfer'] = transfer

        # Add version data to instance
@@ -8,10 +8,10 @@ class CollectSlate(pyblish.api.InstancePlugin):
    order = pyblish.api.CollectorOrder + 0.09
    label = "Collect Slate Node"
    hosts = ["nuke"]
    families = ["render", "render.local", "render.farm"]
    families = ["render"]

    def process(self, instance):
        node = instance[0]
        node = instance.data["transientData"]["node"]

        slate = next((n for n in nuke.allNodes()
                      if "slate" in n.name().lower()

@@ -35,7 +35,6 @@ class CollectSlate(pyblish.api.InstancePlugin):
            instance.data["slateNode"] = slate_node
            instance.data["slate"] = True
            instance.data["families"].append("slate")
            instance.data["versionData"]["families"].append("slate")
            self.log.info(
                "Slate node is in node graph: `{}`".format(slate.name()))
            self.log.debug(
40 openpype/hosts/nuke/plugins/publish/collect_workfile.py Normal file
@@ -0,0 +1,40 @@
import os
import nuke
import pyblish.api


class CollectWorkfile(pyblish.api.InstancePlugin):
    """Collect current script for publish."""

    order = pyblish.api.CollectorOrder
    label = "Collect Workfile"
    hosts = ['nuke']
    families = ["workfile"]

    def process(self, instance):  # sourcery skip: avoid-builtin-shadow

        script_data = instance.context.data["scriptData"]
        current_file = os.path.normpath(nuke.root().name())

        # creating instances per write node
        staging_dir = os.path.dirname(current_file)
        base_name = os.path.basename(current_file)

        # creating representation
        representation = {
            'name': 'nk',
            'ext': 'nk',
            'files': base_name,
            "stagingDir": staging_dir,
        }

        # creating instance data
        instance.data.update({
            "name": base_name,
            "representations": [representation]
        })

        # adding basic script data
        instance.data.update(script_data)

        self.log.info("Collect script version")
186 openpype/hosts/nuke/plugins/publish/collect_writes.py Normal file
@@ -0,0 +1,186 @@
import os
from pprint import pformat
import nuke
import pyblish.api
from openpype.hosts.nuke import api as napi


class CollectNukeWrites(pyblish.api.InstancePlugin):
    """Collect all write nodes."""

    order = pyblish.api.CollectorOrder - 0.48
    label = "Collect Writes"
    hosts = ["nuke", "nukeassist"]
    families = ["render", "prerender", "image"]

    def process(self, instance):
        self.log.debug(pformat(instance.data))
        creator_attributes = instance.data["creator_attributes"]
        instance.data.update(creator_attributes)

        group_node = instance.data["transientData"]["node"]
        render_target = instance.data["render_target"]
        family = instance.data["family"]
        families = instance.data["families"]

        # add targeted family to families
        instance.data["families"].append(
            "{}.{}".format(family, render_target)
        )
        if instance.data.get("review"):
            instance.data["families"].append("review")

        child_nodes = napi.get_instance_group_node_childs(instance)
        instance.data["transientData"]["childNodes"] = child_nodes

        write_node = None
        for x in child_nodes:
            if x.Class() == "Write":
                write_node = x

        if write_node is None:
            self.log.warning(
                "Created node '{}' is missing write node!".format(
                    group_node.name()
                )
            )
            return

        instance.data["writeNode"] = write_node
        self.log.debug("checking instance: {}".format(instance))

        # Determine defined file type
        ext = write_node["file_type"].value()

        # Get frame range
        handle_start = instance.context.data["handleStart"]
        handle_end = instance.context.data["handleEnd"]
        first_frame = int(nuke.root()["first_frame"].getValue())
        last_frame = int(nuke.root()["last_frame"].getValue())
        frame_length = int(last_frame - first_frame + 1)

        if write_node["use_limit"].getValue():
            first_frame = int(write_node["first"].getValue())
            last_frame = int(write_node["last"].getValue())

        write_file_path = nuke.filename(write_node)
        output_dir = os.path.dirname(write_file_path)

        self.log.debug('output dir: {}'.format(output_dir))

        if render_target == "frames":
            representation = {
                'name': ext,
                'ext': ext,
                "stagingDir": output_dir,
                "tags": []
            }

            # get file path knob
            node_file_knob = write_node["file"]
            # list file paths based on input frames
            expected_paths = list(sorted({
                node_file_knob.evaluate(frame)
                for frame in range(first_frame, last_frame + 1)
            }))

            # convert only to base names
            expected_filenames = [
                os.path.basename(filepath)
                for filepath in expected_paths
            ]

            # make sure files are existing at folder
            collected_frames = [
                filename
                for filename in os.listdir(output_dir)
                if filename in expected_filenames
            ]

            if collected_frames:
                collected_frames_len = len(collected_frames)
                frame_start_str = "%0{}d".format(
                    len(str(last_frame))) % first_frame
                representation['frameStart'] = frame_start_str

                # in case slate is expected and not yet rendered
                self.log.debug("_ frame_length: {}".format(frame_length))
                self.log.debug("_ collected_frames_len: {}".format(
                    collected_frames_len))

                # this will only run if slate frame is not already
                # rendered from previous publishes
                if (
                    "slate" in families
                    and frame_length == collected_frames_len
                    and family == "render"
                ):
                    frame_slate_str = (
                        "{{:0{}d}}".format(len(str(last_frame)))
                    ).format(first_frame - 1)

                    slate_frame = collected_frames[0].replace(
                        frame_start_str, frame_slate_str)
                    collected_frames.insert(0, slate_frame)

                if collected_frames_len == 1:
                    representation['files'] = collected_frames.pop()
                else:
                    representation['files'] = collected_frames

            instance.data["representations"].append(representation)
            self.log.info("Publishing rendered frames ...")

        elif render_target == "farm":
            farm_priority = creator_attributes.get("farm_priority")
            farm_chunk = creator_attributes.get("farm_chunk")
            farm_concurency = creator_attributes.get("farm_concurency")
            instance.data.update({
                "deadlineChunkSize": farm_chunk or 1,
                "deadlinePriority": farm_priority or 50,
                "deadlineConcurrentTasks": farm_concurency or 0
            })
            # Farm rendering
            instance.data["transfer"] = False
            instance.data["farm"] = True
            self.log.info("Farm rendering ON ...")

        # get colorspace and add to version data
        colorspace = napi.get_colorspace_from_node(write_node)
        version_data = {
            "colorspace": colorspace
        }

        instance.data.update({
            "versionData": version_data,
            "path": write_file_path,
            "outputDir": output_dir,
            "ext": ext,
            "colorspace": colorspace
        })

        if family == "render":
            instance.data.update({
                "handleStart": handle_start,
                "handleEnd": handle_end,
                "frameStart": first_frame + handle_start,
                "frameEnd": last_frame - handle_end,
                "frameStartHandle": first_frame,
                "frameEndHandle": last_frame,
            })
        else:
            instance.data.update({
                "handleStart": 0,
                "handleEnd": 0,
                "frameStart": first_frame,
                "frameEnd": last_frame,
                "frameStartHandle": first_frame,
                "frameEndHandle": last_frame,
            })

        # make sure rendered sequence on farm will
        # be used for extract review
        if not instance.data["review"]:
            instance.data["useSequenceForReview"] = False

        self.log.debug("instance.data: {}".format(pformat(instance.data)))
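The `frame_start_str` computation above pads the start frame to the digit count of the last frame number. The same expression in isolation:

    first_frame = 1001
    last_frame = 1100
    frame_start_str = "%0{}d".format(len(str(last_frame))) % first_frame
    print(frame_start_str)  # -> "1001", padded to 4 digits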
@@ -26,8 +26,14 @@ class ExtractBackdropNode(publish.Extractor):
    families = ["nukenodes"]

    def process(self, instance):
        tmp_nodes = list()
        nodes = instance[1:]
        tmp_nodes = []
        child_nodes = instance.data["transientData"]["childNodes"]
        # all connections outside of backdrop
        connections_in = instance.data["transientData"]["nodeConnectionsIn"]
        connections_out = instance.data["transientData"]["nodeConnectionsOut"]
        self.log.debug("_ connections_in: `{}`".format(connections_in))
        self.log.debug("_ connections_out: `{}`".format(connections_out))

        # Define extract output file path
        stagingdir = self.staging_dir(instance)
        filename = "{0}.nk".format(instance.name)

@@ -35,20 +41,14 @@ class ExtractBackdropNode(publish.Extractor):

        # maintain selection
        with maintained_selection():
            # all connections outside of backdrop
            connections_in = instance.data["nodeConnectionsIn"]
            connections_out = instance.data["nodeConnectionsOut"]
            self.log.debug("_ connections_in: `{}`".format(connections_in))
            self.log.debug("_ connections_out: `{}`".format(connections_out))

            # create input nodes and name them as passing node (*_INP)
            # create input child_nodes and name them as passing node (*_INP)
            for n, inputs in connections_in.items():
                for i, input in inputs:
                    inpn = nuke.createNode("Input")
                    inpn["name"].setValue("{}_{}_INP".format(n.name(), i))
                    n.setInput(i, inpn)
                    inpn.setXYpos(input.xpos(), input.ypos())
                    nodes.append(inpn)
                    child_nodes.append(inpn)
                    tmp_nodes.append(inpn)

            reset_selection()

@@ -63,13 +63,13 @@ class ExtractBackdropNode(publish.Extractor):
                    if d.name() in n.name()), 0), opn)
                opn.setInput(0, n)
                opn.autoplace()
                nodes.append(opn)
                child_nodes.append(opn)
                tmp_nodes.append(opn)
            reset_selection()

            # select nodes to copy
            # select child_nodes to copy
            reset_selection()
            select_nodes(nodes)
            select_nodes(child_nodes)
            # create tmp nk file
            # save file to the path
            nuke.nodeCopy(path)

@@ -104,6 +104,3 @@ class ExtractBackdropNode(publish.Extractor):

        self.log.info("Extracted instance '{}' to: {}".format(
            instance.name, path))

        self.log.info("Data {}".format(
            instance.data))
@@ -28,6 +28,7 @@ class ExtractCamera(publish.Extractor):
    ]

    def process(self, instance):
        camera_node = instance.data["transientData"]["node"]
        handle_start = instance.context.data["handleStart"]
        handle_end = instance.context.data["handleEnd"]
        first_frame = int(nuke.root()["first_frame"].getValue())

@@ -38,7 +39,7 @@ class ExtractCamera(publish.Extractor):
        self.log.info("instance.data: `{}`".format(
            pformat(instance.data)))

        rm_nodes = list()
        rm_nodes = []
        self.log.info("Creating additional nodes")
        subset = instance.data["subset"]
        staging_dir = self.staging_dir(instance)

@@ -58,7 +59,7 @@ class ExtractCamera(publish.Extractor):
        with maintained_selection():
            # bake camera with axes onto world coordinate XYZ
            rm_n = bakeCameraWithAxeses(
                nuke.toNode(instance.data["name"]), output_range)
                camera_node, output_range)
            rm_nodes.append(rm_n)

            # create scene node
@@ -19,13 +19,14 @@ class ExtractGizmo(publish.Extractor):
    """

    order = pyblish.api.ExtractorOrder
    label = "Extract Gizmo (Group)"
    label = "Extract Gizmo (group)"
    hosts = ["nuke"]
    families = ["gizmo"]

    def process(self, instance):
        tmp_nodes = list()
        orig_grpn = instance[0]
        tmp_nodes = []
        orig_grpn = instance.data["transientData"]["node"]

        # Define extract output file path
        stagingdir = self.staging_dir(instance)
        filename = "{0}.nk".format(instance.name)

@@ -54,15 +55,6 @@ class ExtractGizmo(publish.Extractor):
            # convert gizmos to groups
            pnutils.bake_gizmos_recursively(copy_grpn)

            # remove avalon knobs
            knobs = copy_grpn.knobs()
            avalon_knobs = [k for k in knobs.keys()
                            for ak in ["avalon:", "ak:"]
                            if ak in k]
            avalon_knobs.append("publish")
            for ak in avalon_knobs:
                copy_grpn.removeKnob(knobs[ak])

            # add to temporary nodes
            tmp_nodes.append(copy_grpn)
@@ -36,8 +36,9 @@ class ExtractModel(publish.Extractor):
        self.log.info("instance.data: `{}`".format(
            pformat(instance.data)))

        rm_nodes = list()
        model_node = instance[0]
        rm_nodes = []
        model_node = instance.data["transientData"]["node"]

        self.log.info("Creating additional nodes")
        subset = instance.data["subset"]
        staging_dir = self.staging_dir(instance)
@@ -16,13 +16,17 @@ class CreateOutputNode(pyblish.api.ContextPlugin):
def process(self, context):
# capture selection state
with maintained_selection():
active_node = [node for inst in context
for node in inst
if "ak:family" in node.knobs()]

active_node = [
inst.data.get("transientData", {}).get("node")
for inst in context
if inst.data.get("transientData", {}).get("node")
if inst.data.get(
"transientData", {}).get("node").Class() != "Root"
]

if active_node:
self.log.info(active_node)
active_node = active_node[0]
active_node = active_node.pop()
self.log.info(active_node)
active_node['selected'].setValue(True)
@@ -22,18 +22,20 @@ class NukeRenderLocal(publish.ExtractorColormanaged):

def process(self, instance):
families = instance.data["families"]
child_nodes = (
instance.data.get("transientData", {}).get("childNodes")
or instance
)

node = None
for x in instance:
for x in child_nodes:
if x.Class() == "Write":
node = x

self.log.debug("instance collected: {}".format(instance.data))

first_frame = instance.data.get("frameStartHandle", None)

last_frame = instance.data.get("frameEndHandle", None)
node_subset_name = instance.data.get("name", None)
node_subset_name = instance.data["subset"]

self.log.info("Starting render")
self.log.info("Start frame: {}".format(first_frame))

@@ -60,7 +62,7 @@ class NukeRenderLocal(publish.ExtractorColormanaged):

# Render frames
nuke.execute(
node_subset_name,
str(node_subset_name),
int(first_frame),
int(last_frame)
)
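The `child_nodes` lookup above keeps backward compatibility: new-style instances carry their member nodes in `transientData["childNodes"]`, while legacy instances are themselves iterable. A hedged sketch of that fallback; the function name is an assumption for illustration:

```python
def iter_instance_child_nodes(instance):
    """Yield member nodes, preferring transientData over the legacy
    behaviour where the pyblish instance itself holds the nodes."""
    child_nodes = (
        instance.data.get("transientData", {}).get("childNodes")
        or instance
    )
    for node in child_nodes:
        yield node
```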
@@ -4,11 +4,7 @@ import nuke
import pyblish.api

from openpype.pipeline import publish
from openpype.hosts.nuke.api import (
maintained_selection,
get_view_process_node
)

from openpype.hosts.nuke import api as napi

if sys.version_info[0] >= 3:
unicode = str

@@ -38,7 +34,7 @@ class ExtractThumbnail(publish.Extractor):
if "render.farm" in instance.data["families"]:
return

with maintained_selection():
with napi.maintained_selection():
self.log.debug("instance: {}".format(instance))
self.log.debug("instance.data[families]: {}".format(
instance.data["families"]))

@@ -69,7 +65,7 @@ class ExtractThumbnail(publish.Extractor):
bake_viewer_input_process_node = kwargs[
"bake_viewer_input_process"]

node = instance[0]  # group node
node = instance.data["transientData"]["node"]  # group node
self.log.info("Creating staging dir...")

if "representations" not in instance.data:

@@ -144,7 +140,7 @@ class ExtractThumbnail(publish.Extractor):
if bake_viewer_process:
if bake_viewer_input_process_node:
# get input process and connect it to baking
ipn = get_view_process_node()
ipn = napi.get_view_process_node()
if ipn is not None:
ipn.setInput(0, previous_node)
previous_node = ipn
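Several plugins here also swap per-name imports for one namespaced module import, so call sites read `napi.maintained_selection()` and `napi.get_view_process_node()`. A minimal sketch of that convention under the module path shown in the diff; the function body is illustrative, not the extractor's actual code:

```python
from openpype.hosts.nuke import api as napi

def bake_input_process(previous_node):
    # A single namespaced import keeps call sites explicit about
    # where helpers come from and avoids name clashes.
    with napi.maintained_selection():
        ipn = napi.get_view_process_node()
        if ipn is not None:
            ipn.setInput(0, previous_node)
            previous_node = ipn
    return previous_node
```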
@@ -1,7 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<root>
<error id="main">
<title>Shot/Asset mame</title>
<title>Shot/Asset name</title>
<description>
## Invalid Shot/Asset name in subset
@@ -3,16 +3,30 @@
<error id="main">
<title>Knobs values</title>
<description>
## Invalid node's knobs values
## Invalid node's knobs values

Following write node knobs need to be repaired:
Following write node knobs need to be repaired:

{xml_msg}
{xml_msg}

### How to repair?
### How to repair?

1. Use Repair button.
2. Hit Reload button on the publisher.
1. Use Repair button.
2. Hit Reload button on the publisher.
</description>
</error>
<error id="legacy">
<title>Legacy knob types</title>
<description>
## Knobs are in obsolete configuration

Settings need to be fixed.

### How to repair?

Contact your supervisor or fix it in project settings at
'project_settings/nuke/imageio/nodes/requiredNodes' at knobs.
Each '__legacy__' type has to be defined according to its type.
</description>
</error>
</root>
@@ -1,158 +0,0 @@
import nuke
import pyblish.api

from openpype.hosts.nuke.api.lib import (
add_publish_knob,
get_avalon_knob_data
)


@pyblish.api.log
class PreCollectNukeInstances(pyblish.api.ContextPlugin):
"""Collect all nodes with Avalon knob."""

order = pyblish.api.CollectorOrder - 0.49
label = "Pre-collect Instances"
hosts = ["nuke", "nukeassist"]

# presets
sync_workfile_version_on_families = []

def process(self, context):
instances = []

root = nuke.root()

self.log.debug("nuke.allNodes(): {}".format(nuke.allNodes()))
for node in nuke.allNodes():

if node.Class() in ["Viewer", "Dot"]:
continue

try:
if node["disable"].value():
continue
except Exception as E:
self.log.warning(E)

# get data from avalon knob
avalon_knob_data = get_avalon_knob_data(
node, ["avalon:", "ak:"])

self.log.debug("avalon_knob_data: {}".format(avalon_knob_data))

if not avalon_knob_data:
continue

if avalon_knob_data["id"] != "pyblish.avalon.instance":
continue

# establish families
family = avalon_knob_data["family"]
families_ak = avalon_knob_data.get("families", [])
families = []

# except disabled nodes but exclude backdrops in test
if ("nukenodes" not in family) and (node["disable"].value()):
continue

subset = avalon_knob_data.get(
"subset", None) or node["name"].value()

# Create instance
instance = context.create_instance(subset)
instance.append(node)

suspend_publish = False
if "suspend_publish" in node.knobs():
suspend_publish = node["suspend_publish"].value()
instance.data["suspend_publish"] = suspend_publish

# get review knob value
review = False
if "review" in node.knobs():
review = node["review"].value()

if review:
families.append("review")

# Add all nodes in group instances.
if node.Class() == "Group":
# only alter families for render family
if families_ak and "write" in families_ak.lower():
target = node["render"].value()
if target == "Use existing frames":
# Local rendering
self.log.info("flagged for no render")
families.append(families_ak.lower())
elif target == "Local":
# Local rendering
self.log.info("flagged for local render")
families.append("{}.local".format(family))
family = families_ak.lower()
elif target == "On farm":
# Farm rendering
self.log.info("flagged for farm render")
instance.data["transfer"] = False
instance.data["farm"] = True
families.append("{}.farm".format(family))
family = families_ak.lower()

node.begin()
for i in nuke.allNodes():
instance.append(i)
node.end()

if not families and families_ak and family not in [
"render", "prerender"]:
families.append(families_ak.lower())

self.log.debug("__ family: `{}`".format(family))
self.log.debug("__ families: `{}`".format(families))

# Get format
format_ = root['format'].value()
resolution_width = format_.width()
resolution_height = format_.height()
pixel_aspect = format_.pixelAspect()

# get publish knob value
if "publish" not in node.knobs():
add_publish_knob(node)

# sync workfile version
_families_test = [family] + families
self.log.debug("__ _families_test: `{}`".format(_families_test))
for family_test in _families_test:
if family_test in self.sync_workfile_version_on_families:
self.log.debug(
"Syncing version with workfile for '{}'".format(
family_test
)
)
# get version to instance for integration
instance.data['version'] = instance.context.data['version']

instance.data.update({
"subset": subset,
"asset": avalon_knob_data["asset"],
"label": node.name(),
"name": node.name(),
"subset": subset,
"family": family,
"families": families,
"avalonKnob": avalon_knob_data,
"step": 1,
"publish": node.knob('publish').value(),
"fps": nuke.root()['fps'].value(),
"resolutionWidth": resolution_width,
"resolutionHeight": resolution_height,
"pixelAspect": pixel_aspect,
"review": review,
"representations": []

})
self.log.info("collected instance: {}".format(instance.data))
instances.append(instance)

self.log.debug("context: {}".format(context))
@@ -1,107 +0,0 @@
import os

import nuke

import pyblish.api

from openpype.lib import get_version_from_path
from openpype.hosts.nuke.api.lib import (
add_publish_knob,
get_avalon_knob_data
)
from openpype.pipeline import KnownPublishError


class CollectWorkfile(pyblish.api.ContextPlugin):
"""Collect current script for publish."""

order = pyblish.api.CollectorOrder - 0.50
label = "Pre-collect Workfile"
hosts = ['nuke']

def process(self, context):  # sourcery skip: avoid-builtin-shadow
root = nuke.root()

current_file = os.path.normpath(nuke.root().name())

if current_file.lower() == "root":
raise KnownPublishError(
"Workfile is not correct file name. \n"
"Use workfile tool to manage the name correctly."
)

knob_data = get_avalon_knob_data(root)

add_publish_knob(root)

family = "workfile"
task = os.getenv("AVALON_TASK", None)
# creating instances per write node
staging_dir = os.path.dirname(current_file)
base_name = os.path.basename(current_file)
subset = family + task.capitalize()

# Get frame range
first_frame = int(root["first_frame"].getValue())
last_frame = int(root["last_frame"].getValue())

handle_start = int(knob_data.get("handleStart", 0))
handle_end = int(knob_data.get("handleEnd", 0))

# Get format
format = root['format'].value()
resolution_width = format.width()
resolution_height = format.height()
pixel_aspect = format.pixelAspect()

# Create instance
instance = context.create_instance(subset)
instance.add(root)

script_data = {
"asset": os.getenv("AVALON_ASSET", None),
"frameStart": first_frame + handle_start,
"frameEnd": last_frame - handle_end,
"resolutionWidth": resolution_width,
"resolutionHeight": resolution_height,
"pixelAspect": pixel_aspect,

# backward compatibility
"handles": handle_start,

"handleStart": handle_start,
"handleEnd": handle_end,
"step": 1,
"fps": root['fps'].value(),

"currentFile": current_file,
"version": int(get_version_from_path(current_file)),

"host": pyblish.api.current_host(),
"hostVersion": nuke.NUKE_VERSION_STRING
}
context.data.update(script_data)

# creating representation
representation = {
'name': 'nk',
'ext': 'nk',
'files': base_name,
"stagingDir": staging_dir,
}

# creating instance data
instance.data.update({
"subset": subset,
"label": base_name,
"name": base_name,
"publish": root.knob('publish').value(),
"family": family,
"families": [family],
"representations": [representation]
})

# adding basic script data
instance.data.update(script_data)

self.log.info('Publishing script version')
@@ -1,207 +0,0 @@
import os
import re
from pprint import pformat
import nuke
import pyblish.api

from openpype.client import (
get_last_version_by_subset_name,
get_representations,
)
from openpype.pipeline import (
legacy_io,
get_representation_path,
)


@pyblish.api.log
class CollectNukeWrites(pyblish.api.InstancePlugin):
"""Collect all write nodes."""

order = pyblish.api.CollectorOrder - 0.48
label = "Pre-collect Writes"
hosts = ["nuke", "nukeassist"]
families = ["write"]

def process(self, instance):
_families_test = [instance.data["family"]] + instance.data["families"]
self.log.debug("_families_test: {}".format(_families_test))

node = None
for x in instance:
if x.Class() == "Write":
node = x

if node is None:
return

instance.data["writeNode"] = node
self.log.debug("checking instance: {}".format(instance))

# Determine defined file type
ext = node["file_type"].value()

# Determine output type
output_type = "img"
if ext == "mov":
output_type = "mov"

# Get frame range
handle_start = instance.context.data["handleStart"]
handle_end = instance.context.data["handleEnd"]
first_frame = int(nuke.root()["first_frame"].getValue())
last_frame = int(nuke.root()["last_frame"].getValue())
frame_length = int(last_frame - first_frame + 1)

if node["use_limit"].getValue():
first_frame = int(node["first"].getValue())
last_frame = int(node["last"].getValue())

# Prepare expected output paths by evaluating each frame of write node
# - paths are first collected to set to avoid duplicated paths, then
#   sorted and converted to list
node_file = node["file"]
expected_paths = list(sorted({
node_file.evaluate(frame)
for frame in range(first_frame, last_frame + 1)
}))
expected_filenames = [
os.path.basename(filepath)
for filepath in expected_paths
]
path = nuke.filename(node)
output_dir = os.path.dirname(path)

self.log.debug('output dir: {}'.format(output_dir))

# create label
name = node.name()
# Include start and end render frame in label
label = "{0} ({1}-{2})".format(
name,
int(first_frame),
int(last_frame)
)

if [fm for fm in _families_test
if fm in ["render", "prerender", "still"]]:
if "representations" not in instance.data:
instance.data["representations"] = list()

representation = {
'name': ext,
'ext': ext,
"stagingDir": output_dir,
"tags": list()
}

try:
collected_frames = [
filename
for filename in os.listdir(output_dir)
if filename in expected_filenames
]
if collected_frames:
collected_frames_len = len(collected_frames)
frame_start_str = "%0{}d".format(
len(str(last_frame))) % first_frame
representation['frameStart'] = frame_start_str

# in case slate is expected and not yet rendered
self.log.debug("_ frame_length: {}".format(frame_length))
self.log.debug(
"_ collected_frames_len: {}".format(
collected_frames_len))
# this will only run if slate frame is not already
# rendered from previous publishes
if "slate" in _families_test \
and (frame_length == collected_frames_len) \
and ("prerender" not in _families_test):
frame_slate_str = "%0{}d".format(
len(str(last_frame))) % (first_frame - 1)
slate_frame = collected_frames[0].replace(
frame_start_str, frame_slate_str)
collected_frames.insert(0, slate_frame)

if collected_frames_len == 1:
representation['files'] = collected_frames.pop()
if "still" in _families_test:
instance.data['family'] = 'image'
instance.data["families"].remove('still')
else:
representation['files'] = collected_frames
instance.data["representations"].append(representation)
except Exception:
instance.data["representations"].append(representation)
self.log.debug("couldn't collect frames: {}".format(label))

# Add version data to instance
colorspace = node["colorspace"].value()

# remove default part of the string
if "default (" in colorspace:
colorspace = re.sub(r"default.\(|\)", "", colorspace)
self.log.debug("colorspace: `{}`".format(colorspace))

version_data = {
"families": [
_f.replace(".local", "").replace(".farm", "")
for _f in _families_test if "write" != _f
],
"colorspace": colorspace
}

group_node = [x for x in instance if x.Class() == "Group"][0]
dl_chunk_size = 1
if "deadlineChunkSize" in group_node.knobs():
dl_chunk_size = group_node["deadlineChunkSize"].value()

dl_priority = 50
if "deadlinePriority" in group_node.knobs():
dl_priority = group_node["deadlinePriority"].value()

dl_concurrent_tasks = 0
if "deadlineConcurrentTasks" in group_node.knobs():
dl_concurrent_tasks = group_node["deadlineConcurrentTasks"].value()

instance.data.update({
"versionData": version_data,
"path": path,
"outputDir": output_dir,
"ext": ext,
"label": label,
"outputType": output_type,
"colorspace": colorspace,
"deadlineChunkSize": dl_chunk_size,
"deadlinePriority": dl_priority,
"deadlineConcurrentTasks": dl_concurrent_tasks
})

if self.is_prerender(_families_test):
instance.data.update({
"handleStart": 0,
"handleEnd": 0,
"frameStart": first_frame,
"frameEnd": last_frame,
"frameStartHandle": first_frame,
"frameEndHandle": last_frame,
})
else:
instance.data.update({
"handleStart": handle_start,
"handleEnd": handle_end,
"frameStart": first_frame + handle_start,
"frameEnd": last_frame - handle_end,
"frameStartHandle": first_frame,
"frameEndHandle": last_frame,
})

# make sure rendered sequence on farm will
# be used for extract review
if not instance.data["review"]:
instance.data["useSequenceForReview"] = False

self.log.debug("instance.data: {}".format(pformat(instance.data)))

def is_prerender(self, families):
return next((f for f in families if "prerender" in f), None)
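The removed collector above built its expected file list by evaluating the write node's file knob for every frame in range, de-duplicating via a set. A condensed sketch of that idea (assumes a live Nuke session and a Write node; the helper name is made up for illustration):

```python
import os

def expected_output_filenames(node, first_frame, last_frame):
    """Evaluate a write node's file knob per frame and return the
    sorted, de-duplicated basenames it is expected to produce."""
    node_file = node["file"]
    expected_paths = sorted({
        node_file.evaluate(frame)
        for frame in range(first_frame, last_frame + 1)
    })
    return [os.path.basename(path) for path in expected_paths]
```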
@@ -2,11 +2,10 @@
"""Validate if instance asset is the same as context asset."""
from __future__ import absolute_import

import nuke
import pyblish.api

import openpype.hosts.nuke.api.lib as nlib
import openpype.hosts.nuke.api as nuke_api

from openpype.pipeline.publish import (
ValidateContentsOrder,
PublishXmlValidationError,

@@ -51,9 +50,10 @@ class SelectInvalidInstances(pyblish.api.Action):
self.deselect()

def select(self, instances):
nlib.select_nodes(
[nuke.toNode(str(x)) for x in instances]
)
for inst in instances:
if inst.data.get("transientData", {}).get("node"):
select_node = inst.data["transientData"]["node"]
select_node["selected"].setValue(True)

def deselect(self):
nlib.reset_selection()

@@ -82,13 +82,14 @@ class RepairSelectInvalidInstances(pyblish.api.Action):

# Apply pyblish.logic to get the instances for the plug-in
instances = pyblish.api.instances_by_plugin(failed, plugin)
self.log.debug(instances)

context_asset = context.data["assetEntity"]["name"]
for instance in instances:
origin_node = instance[0]
nuke_api.lib.recreate_instance(
origin_node, avalon_data={"asset": context_asset}
)
node = instance.data["transientData"]["node"]
node_data = nlib.get_node_data(node, nlib.INSTANCE_DATA_KNOB)
node_data["asset"] = context_asset
nlib.set_node_data(node, nlib.INSTANCE_DATA_KNOB, node_data)


class ValidateCorrectAssetName(pyblish.api.InstancePlugin):

@@ -112,6 +113,7 @@ class ValidateCorrectAssetName(pyblish.api.InstancePlugin):
def process(self, instance):
asset = instance.data.get("asset")
context_asset = instance.context.data["assetEntity"]["name"]
node = instance.data["transientData"]["node"]

msg = (
"Instance `{}` has wrong shot/asset name:\n"

@@ -123,7 +125,7 @@ class ValidateCorrectAssetName(pyblish.api.InstancePlugin):
if asset != context_asset:
raise PublishXmlValidationError(
self, msg, formatting_data={
"node_name": instance[0]["name"].value(),
"node_name": node.name(),
"wrong_name": asset,
"correct_name": context_asset
}
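The repair action now rewrites the instance knob in place instead of recreating the node: read the serialized node data, patch the asset, write it back. A sketch of that round trip, with names taken directly from the diff (`nlib` is `openpype.hosts.nuke.api.lib`):

```python
import openpype.hosts.nuke.api.lib as nlib

def repair_asset(instance, context_asset):
    """Point the instance's node at the context asset without
    deleting and recreating the node."""
    node = instance.data["transientData"]["node"]
    node_data = nlib.get_node_data(node, nlib.INSTANCE_DATA_KNOB)
    node_data["asset"] = context_asset
    nlib.set_node_data(node, nlib.INSTANCE_DATA_KNOB, node_data)
```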
@@ -1,6 +1,6 @@
import nuke
import pyblish
from openpype.hosts.nuke.api.lib import maintained_selection
from openpype.hosts.nuke import api as napi
from openpype.pipeline import PublishXmlValidationError


@@ -25,14 +25,14 @@ class SelectCenterInNodeGraph(pyblish.api.Action):
# Apply pyblish.logic to get the instances for the plug-in
instances = pyblish.api.instances_by_plugin(failed, plugin)

all_xC = list()
all_yC = list()
all_xC = []
all_yC = []

# maintain selection
with maintained_selection():
with napi.maintained_selection():
# collect all failed nodes xpos and ypos
for instance in instances:
bdn = instance[0]
bdn = instance.data["transientData"]["node"]
xC = bdn.xpos() + bdn.screenWidth() / 2
yC = bdn.ypos() + bdn.screenHeight() / 2

@@ -46,7 +46,6 @@ class SelectCenterInNodeGraph(pyblish.api.Action):
nuke.zoom(2, [min(all_xC), min(all_yC)])


@pyblish.api.log
class ValidateBackdrop(pyblish.api.InstancePlugin):
""" Validate amount of nodes on backdrop node in case the user
forgot to add nodes above the publishing backdrop node.

@@ -60,7 +59,8 @@ class ValidateBackdrop(pyblish.api.InstancePlugin):
actions = [SelectCenterInNodeGraph]

def process(self, instance):
connections_out = instance.data["nodeConnectionsOut"]
child_nodes = instance.data["transientData"]["childNodes"]
connections_out = instance.data["transientData"]["nodeConnectionsOut"]

msg_multiple_outputs = (
"Only one outgoing connection from "

@@ -78,10 +78,10 @@ class ValidateBackdrop(pyblish.api.InstancePlugin):

self.log.debug(
"Amount of nodes on instance: {}".format(
len(instance))
len(child_nodes))
)

if len(instance) == 1:
if child_nodes == []:
raise PublishXmlValidationError(
self,
msg_no_nodes,
@@ -1,6 +1,6 @@
import pyblish
from openpype.pipeline import PublishXmlValidationError
from openpype.hosts.nuke.api import maintained_selection
from openpype.hosts.nuke import api as napi
import nuke


@@ -26,45 +26,44 @@ class OpenFailedGroupNode(pyblish.api.Action):
instances = pyblish.api.instances_by_plugin(failed, plugin)

# maintain selection
with maintained_selection():
with napi.maintained_selection():
# collect all failed nodes xpos and ypos
for instance in instances:
grpn = instance[0]
grpn = instance.data["transientData"]["node"]
nuke.showDag(grpn)


@pyblish.api.log
class ValidateGizmo(pyblish.api.InstancePlugin):
"""Validate amount of output nodes in gizmo (group) node"""

order = pyblish.api.ValidatorOrder
optional = True
families = ["gizmo"]
label = "Validate Gizmo (Group)"
label = "Validate Gizmo (group)"
hosts = ["nuke"]
actions = [OpenFailedGroupNode]

def process(self, instance):
grpn = instance[0]
grpn = instance.data["transientData"]["node"]

with grpn:
connections_out = nuke.allNodes('Output')
msg_multiple_outputs = (
"Only one outgoing connection from "
"\"{}\" is allowed").format(instance.data["name"])

if len(connections_out) > 1:
msg_multiple_outputs = (
"Only one outgoing connection from "
"\"{}\" is allowed").format(instance.data["name"])

raise PublishXmlValidationError(
self, msg_multiple_outputs, "multiple_outputs",
{"node_name": grpn["name"].value()}
)

connections_in = nuke.allNodes('Input')
msg_missing_inputs = (
"At least one Input node has to be inside Group: "
"\"{}\"").format(instance.data["name"])

if len(connections_in) == 0:
msg_missing_inputs = (
"At least one Input node has to be inside Group: "
"\"{}\"").format(instance.data["name"])

raise PublishXmlValidationError(
self, msg_missing_inputs, "no_inputs",
{"node_name": grpn["name"].value()}
@@ -61,17 +61,11 @@ class ValidateKnobs(pyblish.api.ContextPlugin):
invalid_knobs = []

for instance in context:
# Filter publishable instances.
if not instance.data["publish"]:
continue

# Filter families.
families = [instance.data["family"]]
families += instance.data.get("families", [])

if not families:
continue

# Get all knobs to validate.
knobs = {}
for family in families:
@@ -1,12 +1,19 @@
import pyblish.api

from openpype.hosts.nuke.api import maintained_selection
from openpype.pipeline import PublishXmlValidationError
from openpype.hosts.nuke import api as napi
from openpype.pipeline.publish import RepairAction
from openpype.pipeline import (
PublishXmlValidationError,
OptionalPyblishPluginMixin
)

import nuke


class ValidateOutputResolution(pyblish.api.InstancePlugin):
class ValidateOutputResolution(
OptionalPyblishPluginMixin,
pyblish.api.InstancePlugin
):
"""Validates Output Resolution.

It is making sure the resolution of write's input is the same as

@@ -15,8 +22,8 @@ class ValidateOutputResolution(pyblish.api.InstancePlugin):

order = pyblish.api.ValidatorOrder
optional = True
families = ["render", "render.local", "render.farm"]
label = "Write Resolution"
families = ["render"]
label = "Write resolution"
hosts = ["nuke"]
actions = [RepairAction]

@@ -24,14 +31,22 @@ class ValidateOutputResolution(pyblish.api.InstancePlugin):
resolution_msg = "Reformat is set to wrong format"

def process(self, instance):
if not self.is_active(instance.data):
return

invalid = self.get_invalid(instance)
if invalid:
raise PublishXmlValidationError(self, invalid)

@classmethod
def get_reformat(cls, instance):
child_nodes = (
instance.data.get("transientData", {}).get("childNodes")
or instance
)

reformat = None
for inode in instance:
for inode in child_nodes:
if inode.Class() != "Reformat":
continue
reformat = inode

@@ -64,21 +79,26 @@ class ValidateOutputResolution(pyblish.api.InstancePlugin):

@classmethod
def repair(cls, instance):
child_nodes = (
instance.data.get("transientData", {}).get("childNodes")
or instance
)

invalid = cls.get_invalid(instance)
grp_node = instance[0]
grp_node = instance.data["transientData"]["node"]

if cls.missing_msg == invalid:
# make sure we are inside of the group node
with grp_node:
# find input node and select it
_input = None
for inode in instance:
for inode in child_nodes:
if inode.Class() != "Input":
continue
_input = inode

# add reformat node under it
with maintained_selection():
with napi.maintained_selection():
_input['selected'].setValue(True)
_rfn = nuke.createNode("Reformat", "name Reformat01")
_rfn["resize"].setValue(0)
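ValidateOutputResolution and the other validators touched in this commit gain `OptionalPyblishPluginMixin`, which lets artists toggle the plugin per instance; `process()` bails out early when the check is inactive. A minimal sketch of the pattern (the class name here is hypothetical, the guard is the one shown in the diff):

```python
import pyblish.api
from openpype.pipeline import OptionalPyblishPluginMixin

class ValidateSomething(
    OptionalPyblishPluginMixin,
    pyblish.api.InstancePlugin
):
    order = pyblish.api.ValidatorOrder
    optional = True

    def process(self, instance):
        # Skip work when the artist disabled this optional check.
        if not self.is_active(instance.data):
            return
        # ... actual validation would go here ...
```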
@@ -17,7 +17,6 @@ class FixProxyMode(pyblish.api.Action):
rootNode["proxy"].setValue(False)


@pyblish.api.log
class ValidateProxyMode(pyblish.api.ContextPlugin):
"""Validate active proxy mode"""
@@ -1,87 +0,0 @@
import os

import nuke

import toml
import pyblish.api
from bson.objectid import ObjectId

from openpype.pipeline import (
discover_loader_plugins,
load_container,
)


class RepairReadLegacyAction(pyblish.api.Action):

label = "Repair"
icon = "wrench"
on = "failed"

def process(self, context, plugin):

# Get the errored instances
failed = []
for result in context.data["results"]:
if (result["error"] is not None and result["instance"] is not None
and result["instance"] not in failed):
failed.append(result["instance"])

# Apply pyblish.logic to get the instances for the plug-in
instances = pyblish.api.instances_by_plugin(failed, plugin)

for instance in instances:

data = toml.loads(instance[0]["avalon"].value())
data["name"] = instance[0].name()
data["xpos"] = instance[0].xpos()
data["ypos"] = instance[0].ypos()
data["extension"] = os.path.splitext(
instance[0]["file"].value()
)[1][1:]

data["connections"] = []
for d in instance[0].dependent():
for i in range(d.inputs()):
if d.input(i) == instance[0]:
data["connections"].append([i, d])

nuke.delete(instance[0])

loader_name = "LoadSequence"
if data["extension"] == "mov":
loader_name = "LoadMov"

loader_plugin = None
for Loader in discover_loader_plugins():
if Loader.__name__ != loader_name:
continue

loader_plugin = Loader

load_container(
Loader=loader_plugin,
representation=ObjectId(data["representation"])
)

node = nuke.toNode(data["name"])
for connection in data["connections"]:
connection[1].setInput(connection[0], node)

node.setXYpos(data["xpos"], data["ypos"])


class ValidateReadLegacy(pyblish.api.InstancePlugin):
"""Validate legacy read nodes."""

order = pyblish.api.ValidatorOrder
optional = True
families = ["read.legacy"]
label = "Read Legacy"
hosts = ["nuke"]
actions = [RepairReadLegacyAction]

def process(self, instance):

msg = "Clean up legacy read node \"{}\"".format(instance)
assert False, msg
@@ -4,7 +4,6 @@ import clique
from openpype.pipeline import PublishXmlValidationError


@pyblish.api.log
class RepairActionBase(pyblish.api.Action):
on = "failed"
icon = "wrench"

@@ -23,6 +22,7 @@ class RepairActionBase(pyblish.api.Action):

def repair_knob(self, instances, state):
for instance in instances:
node = instance.data["transientData"]["node"]
files_remove = [os.path.join(instance.data["outputDir"], f)
for r in instance.data.get("representations", [])
for f in r.get("files", [])

@@ -31,7 +31,7 @@ class RepairActionBase(pyblish.api.Action):
for f in files_remove:
os.remove(f)
self.log.debug("removing file: {}".format(f))
instance[0]["render"].setValue(state)
node["render"].setValue(state)
self.log.info("Rendering toggled to `{}`".format(state))


@@ -62,9 +62,10 @@ class ValidateRenderedFrames(pyblish.api.InstancePlugin):
actions = [RepairCollectionActionToLocal, RepairCollectionActionToFarm]

def process(self, instance):
node = instance.data["transientData"]["node"]

f_data = {
"node_name": instance[0]["name"].value()
"node_name": node.name()
}

for repre in instance.data["representations"]:
@@ -1,17 +1,19 @@
from pprint import pformat
from copy import deepcopy
import pyblish.api

from openpype.pipeline import PublishXmlValidationError
from openpype.pipeline import (
PublishXmlValidationError,
OptionalPyblishPluginMixin
)
from openpype.pipeline.publish import RepairAction
from openpype.hosts.nuke.api.lib import (
get_avalon_knob_data,
WorkfileSettings
)
import nuke


@pyblish.api.log
class ValidateScriptAttributes(pyblish.api.InstancePlugin):
class ValidateScriptAttributes(
OptionalPyblishPluginMixin,
pyblish.api.InstancePlugin
):
""" Validates file output. """

order = pyblish.api.ValidatorOrder + 0.1

@@ -22,14 +24,12 @@ class ValidateScriptAttributes(pyblish.api.InstancePlugin):
actions = [RepairAction]

def process(self, instance):
root = nuke.root()
knob_data = get_avalon_knob_data(root)
if not self.is_active(instance.data):
return

script_data = deepcopy(instance.context.data["scriptData"])

asset = instance.data["assetEntity"]
# get asset data frame values
frame_start = asset["data"]["frameStart"]
frame_end = asset["data"]["frameEnd"]
handle_start = asset["data"]["handleStart"]
handle_end = asset["data"]["handleEnd"]

# These attributes will be checked
attributes = [

@@ -48,37 +48,11 @@ class ValidateScriptAttributes(pyblish.api.InstancePlugin):
for attr in attributes
if attr in asset["data"]
}
# fix float to max 4 digits (only for evaluating)
fps_data = float("{0:.4f}".format(
asset_attributes["fps"]))
# fix frame values to include handles
asset_attributes.update({
"frameStart": frame_start - handle_start,
"frameEnd": frame_end + handle_end,
"fps": fps_data
})

self.log.debug(pformat(
asset_attributes
))

# Get format
_format = root["format"].value()

# Get values from nukescript
script_attributes = {
"handleStart": int(knob_data["handleStart"]),
"handleEnd": int(knob_data["handleEnd"]),
"fps": float("{0:.4f}".format(root['fps'].value())),
"frameStart": int(root["first_frame"].getValue()),
"frameEnd": int(root["last_frame"].getValue()),
"resolutionWidth": _format.width(),
"resolutionHeight": _format.height(),
"pixelAspect": _format.pixelAspect()
}
self.log.debug(pformat(
script_attributes
))
asset_attributes["fps"] = float("{0:.4f}".format(
asset_attributes["fps"]))
script_data["fps"] = float("{0:.4f}".format(
script_data["fps"]))

# Compare asset's values Nukescript X Database
not_matching = []

@@ -87,14 +61,14 @@ class ValidateScriptAttributes(pyblish.api.InstancePlugin):
"Asset vs Script attribute \"{}\": {}, {}".format(
attr,
asset_attributes[attr],
script_attributes[attr]
script_data[attr]
)
)
if asset_attributes[attr] != script_attributes[attr]:
if asset_attributes[attr] != script_data[attr]:
not_matching.append({
"name": attr,
"expected": asset_attributes[attr],
"actual": script_attributes[attr]
"actual": script_data[attr]
})

# Raise error if not matching
@@ -1,53 +0,0 @@
import pyblish.api
import openpype.hosts.nuke.lib


class RepairNukeWriteDeadlineTab(pyblish.api.Action):

label = "Repair"
icon = "wrench"
on = "failed"

def process(self, context, plugin):

# Get the errored instances
failed = []
for result in context.data["results"]:
if (result["error"] is not None and result["instance"] is not None
and result["instance"] not in failed):
failed.append(result["instance"])

# Apply pyblish.logic to get the instances for the plug-in
instances = pyblish.api.instances_by_plugin(failed, plugin)

for instance in instances:
group_node = [x for x in instance if x.Class() == "Group"][0]

# Remove existing knobs.
knob_names = openpype.hosts.nuke.lib.get_deadline_knob_names()
for name, knob in group_node.knobs().items():
if name in knob_names:
group_node.removeKnob(knob)

openpype.hosts.nuke.lib.add_deadline_tab(group_node)


class ValidateNukeWriteDeadlineTab(pyblish.api.InstancePlugin):
"""Ensure Deadline tab is present and current."""

order = pyblish.api.ValidatorOrder
label = "Deadline Tab"
hosts = ["nuke"]
optional = True
families = ["render"]
actions = [RepairNukeWriteDeadlineTab]

def process(self, instance):
group_node = [x for x in instance if x.Class() == "Group"][0]

knob_names = openpype.hosts.nuke.lib.get_deadline_knob_names()
missing_knobs = []
for name in knob_names:
if name not in group_node.knobs().keys():
missing_knobs.append(name)
assert not missing_knobs, "Missing knobs: {}".format(missing_knobs)
@@ -1,108 +0,0 @@
import toml

import nuke

import pyblish.api

from openpype.pipeline import discover_creator_plugins
from openpype.pipeline.publish import RepairAction
from openpype.hosts.nuke.api.lib import get_avalon_knob_data


class ValidateWriteLegacy(pyblish.api.InstancePlugin):
"""Validate legacy write nodes."""

order = pyblish.api.ValidatorOrder
optional = True
families = ["write"]
label = "Validate Write Legacy"
hosts = ["nuke"]
actions = [RepairAction]

def process(self, instance):
node = instance[0]
msg = "Clean up legacy write node \"{}\"".format(instance)

if node.Class() not in ["Group", "Write"]:
return

# test avalon knobs
family_knobs = ["ak:family", "avalon:family"]
family_test = [k for k in node.knobs().keys() if k in family_knobs]
self.log.debug("_ family_test: {}".format(family_test))

# test if render in family test knob
# and only one item should be available
assert len(family_test) == 1, msg + " > More avalon attributes"
assert "render" in node[family_test[0]].value() \
or "still" in node[family_test[0]].value(), msg + \
" > Not correct family"
# test if `file` knob in node, this way old
# non-group-node write could be detected
assert "file" not in node.knobs(), msg + \
" > file knob should not be present"

# check if write node is having old render targeting
assert "render_farm" not in node.knobs(), msg + \
" > old way of setting render target"

@classmethod
def repair(cls, instance):
node = instance[0]

if "Write" in node.Class():
data = toml.loads(node["avalon"].value())
else:
data = get_avalon_knob_data(node)

# collect reusable data
data["XYpos"] = (node.xpos(), node.ypos())
data["input"] = node.input(0)
data["publish"] = node["publish"].value()
data["render"] = node["render"].value()
data["render_farm"] = node["render_farm"].value()
data["review"] = node["review"].value()
data["use_limit"] = node["use_limit"].value()
data["first"] = node["first"].value()
data["last"] = node["last"].value()

family = data["family"]
cls.log.debug("_ orig node family: {}".format(family))

# define what family of write node should be recreated
if family == "render":
Create_name = "CreateWriteRender"
elif family == "prerender":
Create_name = "CreateWritePrerender"
elif family == "still":
Create_name = "CreateWriteStill"

# get appropriate plugin class
creator_plugin = None
for Creator in discover_creator_plugins():
if Creator.__name__ != Create_name:
continue

creator_plugin = Creator

# delete the legacy write node
nuke.delete(node)

# create write node with creator
new_node_name = data["subset"]
creator_plugin(new_node_name, data["asset"]).process()

node = nuke.toNode(new_node_name)
node.setXYpos(*data["XYpos"])
node.setInput(0, data["input"])
node["publish"].setValue(data["publish"])
node["review"].setValue(data["review"])
node["use_limit"].setValue(data["use_limit"])
node["first"].setValue(data["first"])
node["last"].setValue(data["last"])

# recreate render targets
if data["render"]:
node["render"].setValue("Local")
if data["render_farm"]:
node["render"].setValue("On farm")
@@ -5,10 +5,13 @@ from openpype.hosts.nuke.api.lib import (
set_node_knobs_from_settings,
color_gui_to_int
)
from openpype.pipeline import PublishXmlValidationError

from openpype.pipeline.publish import (
PublishXmlValidationError,
OptionalPyblishPluginMixin,
)


@pyblish.api.log
class RepairNukeWriteNodeAction(pyblish.api.Action):
label = "Repair"
on = "failed"

@@ -18,10 +21,15 @@ class RepairNukeWriteNodeAction(pyblish.api.Action):
instances = get_errored_instances_from_context(context)

for instance in instances:
write_group_node = instance[0]
child_nodes = (
instance.data.get("transientData", {}).get("childNodes")
or instance
)

write_group_node = instance.data["transientData"]["node"]
# get write node from inside of group
write_node = None
for x in instance:
for x in child_nodes:
if x.Class() == "Write":
write_node = x

@@ -32,7 +40,10 @@ class RepairNukeWriteNodeAction(pyblish.api.Action):
self.log.info("Node attributes were fixed")


class ValidateNukeWriteNode(pyblish.api.InstancePlugin):
class ValidateNukeWriteNode(
OptionalPyblishPluginMixin,
pyblish.api.InstancePlugin
):
""" Validate Write node's knobs.

Compare knobs on write node inside the render group

@@ -42,16 +53,24 @@ class ValidateNukeWriteNode(pyblish.api.InstancePlugin):
order = pyblish.api.ValidatorOrder
optional = True
families = ["render"]
label = "Write Node"
label = "Validate write node"
actions = [RepairNukeWriteNodeAction]
hosts = ["nuke"]

def process(self, instance):
write_group_node = instance[0]
if not self.is_active(instance.data):
return

child_nodes = (
instance.data.get("transientData", {}).get("childNodes")
or instance
)

write_group_node = instance.data["transientData"]["node"]

# get write node from inside of group
write_node = None
for x in instance:
for x in child_nodes:
if x.Class() == "Write":
write_node = x

@@ -60,17 +79,31 @@ class ValidateNukeWriteNode(pyblish.api.InstancePlugin):

correct_data = get_write_node_template_attr(write_group_node)

if correct_data:
check_knobs = correct_data["knobs"]
else:
return

check = []
self.log.debug("__ write_node: {}".format(
write_node
))
self.log.debug("__ correct_data: {}".format(
correct_data
))

for knob_data in correct_data["knobs"]:
knob_type = knob_data["type"]
self.log.debug("__ knob_type: {}".format(
knob_type
))

if (
knob_type == "__legacy__"
):
raise PublishXmlValidationError(
self, (
"Please update data in settings 'project_settings"
"/nuke/imageio/nodes/requiredNodes'"
),
key="legacy"
)

for knob_data in check_knobs:
key = knob_data["name"]
value = knob_data["value"]
node_value = write_node[key].value()
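The write-node validator above refuses to compare knob values while any required knob is still typed `__legacy__` in the project settings, raising the dedicated "legacy" XML error instead. A condensed sketch of that guard; `correct_data` mirrors the settings payload from the diff, and the plain `ValueError` stands in for `PublishXmlValidationError` to keep the sketch self-contained:

```python
def assert_no_legacy_knobs(correct_data):
    """Fail fast when settings still contain '__legacy__' knob types,
    since their values cannot be compared against the write node."""
    for knob_data in correct_data["knobs"]:
        if knob_data["type"] == "__legacy__":
            raise ValueError(
                "Please update data in settings "
                "'project_settings/nuke/imageio/nodes/requiredNodes'"
            )
```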
@@ -1,64 +1,5 @@
import nuke
import os

from openpype.lib import Logger
from openpype.pipeline import install_host
from openpype.hosts.nuke import api
from openpype.hosts.nuke.api.lib import (
on_script_load,
check_inventory_versions,
WorkfileSettings,
dirmap_file_name_filter,
add_scripts_gizmo
)
from openpype.settings import get_project_settings
from openpype.hosts.nuke.api import NukeHost

log = Logger.get_logger(__name__)


install_host(api)

# fix ffmpeg settings on script
nuke.addOnScriptLoad(on_script_load)

# set checker for last versions on loaded containers
nuke.addOnScriptLoad(check_inventory_versions)
nuke.addOnScriptSave(check_inventory_versions)

# # set apply all workfile settings on script load and save
nuke.addOnScriptLoad(WorkfileSettings().set_context_settings)

nuke.addFilenameFilter(dirmap_file_name_filter)

log.info('Automatic syncing of write file knob to script version')


def add_scripts_menu():
try:
from scriptsmenu import launchfornuke
except ImportError:
log.warning(
"Skipping studio.menu install, because "
"'scriptsmenu' module seems unavailable."
)
return

# load configuration of custom menu
project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
config = project_settings["nuke"]["scriptsmenu"]["definition"]
_menu = project_settings["nuke"]["scriptsmenu"]["name"]

if not config:
log.warning("Skipping studio menu, no definition found.")
return

# run the launcher for the Nuke menu
studio_menu = launchfornuke.main(title=_menu.title())

# apply configuration
studio_menu.build_from_configuration(studio_menu, config)


add_scripts_menu()

add_scripts_gizmo()
host = NukeHost()
install_host(host)
@@ -1,7 +1,7 @@
import os
import sys

from Qt import QtWidgets, QtCore
from qtpy import QtWidgets, QtCore

from openpype.tools.utils import host_tools
@@ -2,7 +2,7 @@ import re
import uuid

import qargparse
from Qt import QtWidgets, QtCore
from qtpy import QtWidgets, QtCore

from openpype.settings import get_current_project_settings
from openpype.pipeline.context_tools import get_current_project_asset
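Both UI modules above move from the Qt.py shim to qtpy; the import line is the only change, since qtpy exposes the same `QtWidgets`/`QtCore` namespaces and resolves the installed binding at import time. The swap, in isolation:

```python
# Before: the Qt.py binding shim
# from Qt import QtWidgets, QtCore

# After: qtpy, which resolves to PySide2/PyQt5 etc. at import time
from qtpy import QtWidgets, QtCore
```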
@@ -171,7 +171,6 @@ class ShotMetadataSolver:
_index == 0
and parents[-1]["entity_name"] == parent_name
):
self.log.debug(f" skipping : {parent_name}")
continue

# in case first parent is project then start parents from start

@@ -179,7 +178,6 @@ class ShotMetadataSolver:
_index == 0
and parent_token_type == "Project"
):
self.log.debug("rebuilding parents from scratch")
project_parent = parents[0]
parents = [project_parent]
continue

@@ -189,8 +187,6 @@ class ShotMetadataSolver:
"entity_name": parent_name
})

self.log.debug(f"__ parents: {parents}")

return parents

def _create_hierarchy_path(self, parents):

@@ -297,7 +293,6 @@ class ShotMetadataSolver:
Returns:
(str, dict): shot name and hierarchy data
"""
self.log.info(f"_ source_data: {source_data}")

tasks = {}
asset_doc = source_data["selected_asset_doc"]
@@ -1,6 +1,5 @@
import os
from copy import deepcopy
from pprint import pformat
import opentimelineio as otio
from openpype.client import (
get_asset_by_name,

@@ -13,9 +12,7 @@ from openpype.hosts.traypublisher.api.plugin import (
from openpype.hosts.traypublisher.api.editorial import (
ShotMetadataSolver
)

from openpype.pipeline import CreatedInstance

from openpype.lib import (
get_ffprobe_data,
convert_ffprobe_fps_value,

@@ -33,14 +30,14 @@ from openpype.lib import (
CLIP_ATTR_DEFS = [
EnumDef(
"fps",
items={
"from_selection": "From selection",
23.997: "23.976",
24: "24",
25: "25",
29.97: "29.97",
30: "30"
},
items=[
{"value": "from_selection", "label": "From selection"},
{"value": 23.997, "label": "23.976"},
{"value": 24, "label": "24"},
{"value": 25, "label": "25"},
{"value": 29.97, "label": "29.97"},
{"value": 30, "label": "30"}
],
label="FPS"
),
NumberDef(
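EnumDef items change here from a mapping to an ordered list of value/label dicts, which makes the item order explicit and lets non-string values (floats for frame rates) carry display labels. A sketch of the new shape, trimmed to just the FPS attribute from the diff (the import path is an assumption based on the surrounding module):

```python
from openpype.lib import EnumDef

fps_attr = EnumDef(
    "fps",
    items=[
        {"value": "from_selection", "label": "From selection"},
        {"value": 23.997, "label": "23.976"},
        {"value": 24, "label": "24"},
        {"value": 25, "label": "25"},
        {"value": 29.97, "label": "29.97"},
        {"value": 30, "label": "30"},
    ],
    label="FPS"
)
```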
@@ -70,14 +67,12 @@ class EditorialClipInstanceCreatorBase(HiddenTrayPublishCreator):
host_name = "traypublisher"

def create(self, instance_data, source_data=None):
self.log.info(f"instance_data: {instance_data}")
subset_name = instance_data["subset"]

# Create new instance
new_instance = CreatedInstance(
self.family, subset_name, instance_data, self
)
self.log.info(f"instance_data: {pformat(new_instance.data)}")

self._store_new_instance(new_instance)
@@ -223,8 +218,6 @@ or updating already created. Publishing will create OTIO file.
asset_name = instance_data["asset"]
asset_doc = get_asset_by_name(self.project_name, asset_name)

self.log.info(pre_create_data["fps"])

if pre_create_data["fps"] == "from_selection":
# get asset doc data attributes
fps = asset_doc["data"]["fps"]

@@ -239,34 +232,43 @@ or updating already created. Publishing will create OTIO file.
sequence_path_data = pre_create_data["sequence_filepath_data"]
media_path_data = pre_create_data["media_filepaths_data"]

sequence_path = self._get_path_from_file_data(sequence_path_data)
sequence_paths = self._get_path_from_file_data(
sequence_path_data, multi=True)
media_path = self._get_path_from_file_data(media_path_data)

# get otio timeline
otio_timeline = self._create_otio_timeline(
sequence_path, fps)
first_otio_timeline = None
for seq_path in sequence_paths:
# get otio timeline
otio_timeline = self._create_otio_timeline(
seq_path, fps)

# Create all clip instances
clip_instance_properties.update({
"fps": fps,
"parent_asset_name": asset_name,
"variant": instance_data["variant"]
})
# Create all clip instances
clip_instance_properties.update({
"fps": fps,
"parent_asset_name": asset_name,
"variant": instance_data["variant"]
})

# create clip instances
self._get_clip_instances(
otio_timeline,
media_path,
clip_instance_properties,
family_presets=allowed_family_presets
# create clip instances
self._get_clip_instances(
otio_timeline,
media_path,
clip_instance_properties,
allowed_family_presets,
os.path.basename(seq_path),
first_otio_timeline
)

)
if not first_otio_timeline:
# assign otio timeline for multi file to layer
first_otio_timeline = otio_timeline

# create otio editorial instance
self._create_otio_instance(
subset_name, instance_data,
sequence_path, media_path,
otio_timeline
subset_name,
instance_data,
seq_path, media_path,
first_otio_timeline
)

def _create_otio_instance(

@@ -317,14 +319,14 @@ or updating already created. Publishing will create OTIO file.
kwargs["rate"] = fps
kwargs["ignore_timecode_mismatch"] = True

self.log.info(f"kwargs: {kwargs}")
return otio.adapters.read_from_file(sequence_path, **kwargs)

def _get_path_from_file_data(self, file_path_data):
def _get_path_from_file_data(self, file_path_data, multi=False):
"""Converting creator path data to single path string

Args:
file_path_data (FileDefItem): creator path data inputs
multi (bool): switch to multiple files mode

Raises:
FileExistsError: in case nothing had been set

@@ -332,23 +334,29 @@ or updating already created. Publishing will create OTIO file.
Returns:
str: path string
"""
# TODO: just temporarily solving only one media file
if isinstance(file_path_data, list):
file_path_data = file_path_data.pop()
return_path_list = []

if len(file_path_data["filenames"]) == 0:

if isinstance(file_path_data, list):
return_path_list = [
os.path.join(f["directory"], f["filenames"][0])
for f in file_path_data
]

if not return_path_list:
raise FileExistsError(
f"File path was not added: {file_path_data}")

return os.path.join(
file_path_data["directory"], file_path_data["filenames"][0])
return return_path_list if multi else return_path_list[0]

def _get_clip_instances(
self,
otio_timeline,
media_path,
instance_data,
family_presets
family_presets,
sequence_file_name,
first_otio_timeline=None
):
"""Helper function for creating clip instance
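`_get_path_from_file_data` now accepts `multi=True` and returns every selected file instead of only the first, which is what lets the creator loop over several EDL sequences above. A standalone sketch of the reworked helper under the same FileDefItem-style input (a list of dicts with "directory" and "filenames" keys), outside its original class for clarity:

```python
import os

def get_paths_from_file_data(file_path_data, multi=False):
    """Convert creator file data into absolute path(s).

    Returns a list when multi=True, otherwise the first path only.
    """
    return_path_list = []
    if isinstance(file_path_data, list):
        return_path_list = [
            os.path.join(item["directory"], item["filenames"][0])
            for item in file_path_data
        ]
    if not return_path_list:
        raise FileExistsError(
            "File path was not added: {}".format(file_path_data))
    return return_path_list if multi else return_path_list[0]
```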
@@ -368,17 +376,15 @@ or updating already created. Publishing will create OTIO file.
         media_data = self._get_media_source_metadata(media_path)
 
         for track in tracks:
-            self.log.debug(f"track.name: {track.name}")
+            track.name = f"{sequence_file_name} - {otio_timeline.name}"
             try:
                 track_start_frame = (
                     abs(track.source_range.start_time.value)
                 )
-                self.log.debug(f"track_start_frame: {track_start_frame}")
+                track_start_frame -= self.timeline_frame_start
             except AttributeError:
                 track_start_frame = 0
 
-            self.log.debug(f"track_start_frame: {track_start_frame}")
-
             for clip in track.each_child():
                 if not self._validate_clip_for_processing(clip):
@@ -400,10 +406,6 @@ or updating already created. Publishing will create OTIO file.
             "instance_label": None,
             "instance_id": None
         }
-        self.log.info((
-            "Creating subsets from presets: \n"
-            f"{pformat(family_presets)}"
-        ))
 
         for _fpreset in family_presets:
             # exclude audio family if no audio stream
@@ -419,7 +421,10 @@ or updating already created. Publishing will create OTIO file.
                 deepcopy(base_instance_data),
                 parenting_data
             )
             self.log.debug(f"{pformat(dict(instance.data))}")
 
+        # add track to first otioTimeline if it is in input args
+        if first_otio_timeline:
+            first_otio_timeline.tracks.append(deepcopy(track))
+
     def _restore_otio_source_range(self, otio_clip):
         """Infusing source range.
@@ -460,7 +465,6 @@ or updating already created. Publishing will create OTIO file.
             target_url=media_path,
             available_range=available_range
         )
-
         otio_clip.media_reference = media_reference
 
     def _get_media_source_metadata(self, path):
@@ -481,7 +485,6 @@ or updating already created. Publishing will create OTIO file.
         media_data = get_ffprobe_data(
             path, self.log
         )
-        self.log.debug(f"__ media_data: {pformat(media_data)}")
 
         # get video stream data
         video_stream = media_data["streams"][0]
@@ -589,9 +592,6 @@ or updating already created. Publishing will create OTIO file.
         # get variant name from preset or from inheritance
         _variant_name = preset.get("variant") or variant_name
 
-        self.log.debug(f"__ family: {family}")
-        self.log.debug(f"__ preset: {preset}")
-
         # subset name
         subset_name = "{}{}".format(
             family, _variant_name.capitalize()
@@ -722,17 +722,13 @@ or updating already created. Publishing will create OTIO file.
         clip_in += track_start_frame
         clip_out = otio_clip.range_in_parent().end_time_inclusive().value
         clip_out += track_start_frame
-        self.log.info(f"clip_in: {clip_in} | clip_out: {clip_out}")
 
         # add offset in case there is any
-        self.log.debug(f"__ timeline_offset: {timeline_offset}")
         if timeline_offset:
             clip_in += timeline_offset
             clip_out += timeline_offset
 
         clip_duration = otio_clip.duration().value
-        self.log.info(f"clip duration: {clip_duration}")
-
         source_in = otio_clip.trimmed_range().start_time.value
         source_out = source_in + clip_duration
 
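The clip-range math above is easier to follow with numbers plugged in. A worked sketch with made-up values standing in for the otio queries:

```python
# Hypothetical values replacing the otio_clip/track queries above.
track_start_frame = 100   # abs(track.source_range.start_time.value)
timeline_offset = 10      # optional global offset

clip_in = 0 + track_start_frame     # range_in_parent().start_time.value
clip_out = 47 + track_start_frame   # ...end_time_inclusive().value

if timeline_offset:
    clip_in += timeline_offset
    clip_out += timeline_offset

clip_duration = 48                  # otio_clip.duration().value
source_in = 12                      # trimmed_range().start_time.value
source_out = source_in + clip_duration

print(clip_in, clip_out)      # 110 157
print(source_in, source_out)  # 12 60
```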
@@ -762,7 +758,6 @@ or updating already created. Publishing will create OTIO file.
         Returns:
             list: list of dicts with preset items
         """
-        self.log.debug(f"__ pre_create_data: {pre_create_data}")
         return [
             {"family": "shot"},
             *[
@@ -833,7 +828,7 @@ or updating already created. Publishing will create OTIO file.
                     ".fcpxml"
                 ],
                 allow_sequences=False,
-                single_item=True,
+                single_item=False,
                 label="Sequence file",
             ),
             FileDef(

@@ -7,8 +7,8 @@ exists under selected asset.
 """
 from pathlib import Path
 
-from openpype.client import get_subset_by_name, get_asset_by_name
-from openpype.lib.attribute_definitions import FileDef
+# from openpype.client import get_subset_by_name, get_asset_by_name
+from openpype.lib.attribute_definitions import FileDef, BoolDef
 from openpype.pipeline import (
     CreatedInstance,
     CreatorError
@@ -23,7 +23,8 @@ class OnlineCreator(TrayPublishCreator):
     label = "Online"
     family = "online"
     description = "Publish file retaining its original file name"
-    extensions = [".mov", ".mp4", ".mxf", ".m4v", ".mpg"]
+    extensions = [".mov", ".mp4", ".mxf", ".m4v", ".mpg", ".exr",
+                  ".dpx", ".tif", ".png", ".jpg"]
 
     def get_detail_description(self):
         return """# Create file retaining its original file name.
@@ -49,13 +50,17 @@ class OnlineCreator(TrayPublishCreator):
 
         origin_basename = Path(files[0]).stem
 
+        # disable check for existing subset with the same name
+        """
         asset = get_asset_by_name(
             self.project_name, instance_data["asset"], fields=["_id"])
 
         if get_subset_by_name(
                 self.project_name, origin_basename, asset["_id"],
                 fields=["_id"]):
             raise CreatorError(f"subset with {origin_basename} already "
                                "exists in selected asset")
+        """
+
         instance_data["originalBasename"] = origin_basename
         subset_name = origin_basename
@@ -69,15 +74,29 @@ class OnlineCreator(TrayPublishCreator):
                                         instance_data, self)
         self._store_new_instance(new_instance)
 
+    def get_instance_attr_defs(self):
+        return [
+            BoolDef(
+                "add_review_family",
+                default=True,
+                label="Review"
+            )
+        ]
+
     def get_pre_create_attr_defs(self):
         return [
             FileDef(
                 "representation_file",
                 folders=False,
                 extensions=self.extensions,
-                allow_sequences=False,
+                allow_sequences=True,
                 single_item=True,
                 label="Representation",
             ),
+            BoolDef(
+                "add_review_family",
+                default=True,
+                label="Review"
+            )
         ]

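Roughly, the new Review toggle travels from the creator's BoolDef into the instance's creator_attributes, where the collector below picks it up. A simplified sketch of that hand-off (dict literals stand in for the real instance; the file name is made up):

```python
instance_data = {
    "creator_attributes": {
        "path": "/renders/sh010_online_v001.mov",  # hypothetical path
        "add_review_family": True,
    },
}

review = instance_data["creator_attributes"]["add_review_family"]
representation = {
    "name": "mov",
    "ext": "mov",
    "files": "sh010_online_v001.mov",
    "stagingDir": "/renders",
    "tags": ["review"] if review else [],
}
print(representation["tags"])  # ['review']
```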
@@ -12,12 +12,18 @@ class CollectOnlineFile(pyblish.api.InstancePlugin):
 
     def process(self, instance):
         file = Path(instance.data["creator_attributes"]["path"])
+        review = instance.data["creator_attributes"]["add_review_family"]
+        instance.data["review"] = review
+        if "review" not in instance.data["families"]:
+            instance.data["families"].append("review")
+        self.log.info(f"Adding review: {review}")
 
         instance.data["representations"].append(
             {
                 "name": file.suffix.lstrip("."),
                 "ext": file.suffix.lstrip("."),
                 "files": file.name,
-                "stagingDir": file.parent.as_posix()
+                "stagingDir": file.parent.as_posix(),
+                "tags": ["review"] if review else []
             }
         )

@@ -33,8 +33,6 @@ class CollectShotInstance(pyblish.api.InstancePlugin):
     ]
 
     def process(self, instance):
-        self.log.debug(pformat(instance.data))
-
         creator_identifier = instance.data["creator_identifier"]
         if "editorial" not in creator_identifier:
             return
@@ -82,7 +80,6 @@ class CollectShotInstance(pyblish.api.InstancePlugin):
         ]
 
         otio_clip = clips.pop()
-        self.log.debug(f"__ otioclip.parent: {otio_clip.parent}")
 
         return otio_clip
 
@@ -172,7 +169,6 @@ class CollectShotInstance(pyblish.api.InstancePlugin):
         }
 
         parents = instance.data.get('parents', [])
-        self.log.debug(f"parents: {pformat(parents)}")
 
         actual = {name: in_info}
 
@@ -190,7 +186,6 @@ class CollectShotInstance(pyblish.api.InstancePlugin):
 
         # adding hierarchy context to instance
         context.data["hierarchyContext"] = final_context
-        self.log.debug(pformat(final_context))
 
     def _update_dict(self, ex_dict, new_dict):
         """ Recursion function

@@ -20,6 +20,8 @@ class ValidateOnlineFile(OptionalPyblishPluginMixin,
     optional = True
 
     def process(self, instance):
+        if not self.is_active(instance.data):
+            return
         project_name = instance.context.data["projectName"]
         asset_id = instance.data["assetEntity"]["_id"]
         subset = get_subset_by_name(

@@ -302,8 +302,9 @@ private:
     std::string websocket_url;
     // Should be avalon plugin available?
    // - this may change during processing if websocket url is not set or server is down
-    bool use_avalon;
+    bool server_available;
 public:
+    Communicator(std::string url);
     Communicator();
     websocket_endpoint endpoint;
     bool is_connected();
@@ -314,43 +315,45 @@ public:
     void call_notification(std::string method_name, nlohmann::json params);
 };
 
-Communicator::Communicator() {
+Communicator::Communicator(std::string url) {
     // URL to websocket server
-    websocket_url = std::getenv("WEBSOCKET_URL");
+    websocket_url = url;
     // Should be avalon plugin available?
     // - this may change during processing if websocket url is not set or server is down
-    if (websocket_url == "") {
-        use_avalon = false;
+    if (url == "") {
+        server_available = false;
     } else {
-        use_avalon = true;
+        server_available = true;
     }
 }
 
 
 bool Communicator::is_connected(){
     return endpoint.connected();
 }
 
 bool Communicator::is_usable(){
-    return use_avalon;
+    return server_available;
 }
 
 void Communicator::connect()
 {
-    if (!use_avalon) {
+    if (!server_available) {
         return;
     }
     int con_result;
     con_result = endpoint.connect(websocket_url);
     if (con_result == -1)
     {
-        use_avalon = false;
+        server_available = false;
     } else {
-        use_avalon = true;
+        server_available = true;
     }
 }
 
 void Communicator::call_notification(std::string method_name, nlohmann::json params) {
-    if (!use_avalon || !is_connected()) {return;}
+    if (!server_available || !is_connected()) {return;}
 
     jsonrpcpp::Notification notification = {method_name, params};
     endpoint.send_notification(&notification);
@@ -358,7 +361,7 @@ void Communicator::call_notification(std::string method_name, nlohmann::json params)
 
 jsonrpcpp::Response Communicator::call_method(std::string method_name, nlohmann::json params) {
     jsonrpcpp::Response response;
-    if (!use_avalon || !is_connected())
+    if (!server_available || !is_connected())
     {
         return response;
     }
@@ -382,7 +385,7 @@ jsonrpcpp::Response Communicator::call_method(std::string method_name, nlohmann::json params)
 }
 
 void Communicator::process_requests() {
-    if (!use_avalon || !is_connected() || Data.messages.empty()) {return;}
+    if (!server_available || !is_connected() || Data.messages.empty()) {return;}
 
     std::string msg = Data.messages.front();
     Data.messages.pop();
@@ -458,7 +461,7 @@ void register_callbacks(){
     parser.register_request_callback("execute_george", execute_george);
 }
 
-Communicator communication;
+Communicator* communication = nullptr;
 
 ////////////////////////////////////////////////////////////////////////////////////////
 
@@ -484,7 +487,7 @@ static char* GetLocalString( PIFilter* iFilter, int iNum, char* iDefault )
 // in the localized file (or the localized file doesn't exist).
 std::string label_from_evn()
 {
-    std::string _plugin_label = "Avalon";
+    std::string _plugin_label = "OpenPype";
     if (std::getenv("AVALON_LABEL") && std::getenv("AVALON_LABEL") != "")
     {
         _plugin_label = std::getenv("AVALON_LABEL");
@@ -540,9 +543,12 @@ int FAR PASCAL PI_Open( PIFilter* iFilter )
     {
         PI_Parameters( iFilter, NULL ); // NULL as iArg means "open the requester"
     }
 
-    communication.connect();
-    register_callbacks();
+    char *env_value = std::getenv("WEBSOCKET_URL");
+    if (env_value != NULL) {
+        communication = new Communicator(env_value);
+        communication->connect();
+        register_callbacks();
+    }
     return 1; // OK
 }
 
@@ -560,7 +566,10 @@ void FAR PASCAL PI_Close( PIFilter* iFilter )
     {
         TVCloseReq( iFilter, Data.mReq );
     }
-    communication.endpoint.close_connection();
+    if (communication != nullptr) {
+        communication->endpoint.close_connection();
+        delete communication;
+    }
 }
 
 
@@ -709,7 +718,7 @@ int FAR PASCAL PI_Msg( PIFilter* iFilter, INTPTR iEvent, INTPTR iReq, INTPTR* iA
             if (Data.menuItemsById.contains(button_up_item_id_str))
             {
                 std::string callback_name = Data.menuItemsById[button_up_item_id_str].get<std::string>();
-                communication.call_method(callback_name, nlohmann::json::array());
+                communication->call_method(callback_name, nlohmann::json::array());
             }
             TVExecute( iFilter );
             break;
@@ -737,7 +746,9 @@
         {
             newMenuItemsProcess(iFilter);
         }
-        communication.process_requests();
+        if (communication != nullptr) {
+            communication->process_requests();
+        }
     }
 
     return 1;

Binary file not shown.
Binary file not shown.

@@ -18,6 +18,7 @@ from .pipeline import (
     show_tools_popup,
     instantiate,
     UnrealHost,
+    maintained_selection
 )
 
 __all__ = [
@@ -36,4 +37,5 @@ __all__ = [
     "show_tools_popup",
     "instantiate",
     "UnrealHost",
+    "maintained_selection"
 ]

@@ -2,6 +2,7 @@
 import os
 import logging
 from typing import List
+from contextlib import contextmanager
 import semver
 
 import pyblish.api
@@ -447,3 +448,16 @@ def get_subsequences(sequence: unreal.LevelSequence):
     if subscene_track is not None and subscene_track.get_sections():
         return subscene_track.get_sections()
     return []
+
+
+@contextmanager
+def maintained_selection():
+    """Stub to be either implemented or replaced.
+
+    This is needed for old publisher implementation, but
+    it is not supported (yet) in UE.
+    """
+    try:
+        yield
+    finally:
+        pass

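The stub is a no-op context manager, so legacy publish code can keep wrapping operations in it without Unreal-specific selection handling. A minimal illustration of the same pattern:

```python
from contextlib import contextmanager

@contextmanager
def maintained_selection():
    # No-op stub: selection handling is not (yet) supported in UE.
    try:
        yield
    finally:
        pass

with maintained_selection():
    print("publish steps run unchanged inside the stub")
```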
61
openpype/hosts/unreal/plugins/create/create_uasset.py
Normal file
@@ -0,0 +1,61 @@
"""Create UAsset."""
from pathlib import Path

import unreal

from openpype.hosts.unreal.api import pipeline
from openpype.pipeline import LegacyCreator


class CreateUAsset(LegacyCreator):
    """UAsset."""

    name = "UAsset"
    label = "UAsset"
    family = "uasset"
    icon = "cube"

    root = "/Game/OpenPype"
    suffix = "_INS"

    def __init__(self, *args, **kwargs):
        super(CreateUAsset, self).__init__(*args, **kwargs)

    def process(self):
        ar = unreal.AssetRegistryHelpers.get_asset_registry()

        subset = self.data["subset"]
        path = f"{self.root}/PublishInstances/"

        unreal.EditorAssetLibrary.make_directory(path)

        selection = []
        if (self.options or {}).get("useSelection"):
            sel_objects = unreal.EditorUtilityLibrary.get_selected_assets()
            selection = [a.get_path_name() for a in sel_objects]

        if len(selection) != 1:
            raise RuntimeError("Please select only one object.")

        obj = selection[0]

        asset = ar.get_asset_by_object_path(obj).get_asset()
        sys_path = unreal.SystemLibrary.get_system_path(asset)

        if not sys_path:
            raise RuntimeError(
                f"{Path(obj).name} is not on the disk. Likely it needs to"
                " be saved first.")

        if Path(sys_path).suffix != ".uasset":
            raise RuntimeError(f"{Path(sys_path).name} is not a UAsset.")

        unreal.log("selection: {}".format(selection))
        container_name = f"{subset}{self.suffix}"
        pipeline.create_publish_instance(
            instance=container_name, path=path)

        data = self.data.copy()
        data["members"] = selection

        pipeline.imprint(f"{path}/{container_name}", data)
145
openpype/hosts/unreal/plugins/load/load_uasset.py
Normal file
@@ -0,0 +1,145 @@
# -*- coding: utf-8 -*-
"""Load UAsset."""
from pathlib import Path
import shutil

from openpype.pipeline import (
    get_representation_path,
    AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal  # noqa


class UAssetLoader(plugin.Loader):
    """Load UAsset."""

    families = ["uasset"]
    label = "Load UAsset"
    representations = ["uasset"]
    icon = "cube"
    color = "orange"

    def load(self, context, name, namespace, options):
        """Load and containerise representation into Content Browser.

        Args:
            context (dict): application context
            name (str): subset name
            namespace (str): in Unreal this is basically path to container.
                This is not passed here, so namespace is set
                by `containerise()` because only then we know
                real path.
            options (dict): Those would be data to be imprinted. This is not
                used now, data are imprinted by `containerise()`.

        Returns:
            list(str): list of container content
        """

        # Create directory for asset and OpenPype container
        root = "/Game/OpenPype/Assets"
        asset = context.get('asset').get('name')
        suffix = "_CON"
        if asset:
            asset_name = "{}_{}".format(asset, name)
        else:
            asset_name = "{}".format(name)

        tools = unreal.AssetToolsHelpers().get_asset_tools()
        asset_dir, container_name = tools.create_unique_asset_name(
            "{}/{}/{}".format(root, asset, name), suffix="")

        container_name += suffix

        unreal.EditorAssetLibrary.make_directory(asset_dir)

        destination_path = asset_dir.replace(
            "/Game",
            Path(unreal.Paths.project_content_dir()).as_posix(),
            1)

        shutil.copy(self.fname, f"{destination_path}/{name}.uasset")

        # Create Asset Container
        unreal_pipeline.create_container(
            container=container_name, path=asset_dir)

        data = {
            "schema": "openpype:container-2.0",
            "id": AVALON_CONTAINER_ID,
            "asset": asset,
            "namespace": asset_dir,
            "container_name": container_name,
            "asset_name": asset_name,
            "loader": str(self.__class__.__name__),
            "representation": context["representation"]["_id"],
            "parent": context["representation"]["parent"],
            "family": context["representation"]["context"]["family"]
        }
        unreal_pipeline.imprint(
            "{}/{}".format(asset_dir, container_name), data)

        asset_content = unreal.EditorAssetLibrary.list_assets(
            asset_dir, recursive=True, include_folder=True
        )

        for a in asset_content:
            unreal.EditorAssetLibrary.save_asset(a)

        return asset_content

    def update(self, container, representation):
        ar = unreal.AssetRegistryHelpers.get_asset_registry()

        asset_dir = container["namespace"]
        name = representation["context"]["subset"]

        destination_path = asset_dir.replace(
            "/Game",
            Path(unreal.Paths.project_content_dir()).as_posix(),
            1)

        asset_content = unreal.EditorAssetLibrary.list_assets(
            asset_dir, recursive=False, include_folder=True
        )

        for asset in asset_content:
            obj = ar.get_asset_by_object_path(asset).get_asset()
            if not obj.get_class().get_name() == 'AssetContainer':
                unreal.EditorAssetLibrary.delete_asset(asset)

        update_filepath = get_representation_path(representation)

        shutil.copy(update_filepath, f"{destination_path}/{name}.uasset")

        container_path = "{}/{}".format(container["namespace"],
                                        container["objectName"])
        # update metadata
        unreal_pipeline.imprint(
            container_path,
            {
                "representation": str(representation["_id"]),
                "parent": str(representation["parent"])
            })

        asset_content = unreal.EditorAssetLibrary.list_assets(
            asset_dir, recursive=True, include_folder=True
        )

        for a in asset_content:
            unreal.EditorAssetLibrary.save_asset(a)

    def remove(self, container):
        path = container["namespace"]
        parent_path = Path(path).parent.as_posix()

        unreal.EditorAssetLibrary.delete_directory(path)

        asset_content = unreal.EditorAssetLibrary.list_assets(
            parent_path, recursive=False
        )

        if len(asset_content) == 0:
            unreal.EditorAssetLibrary.delete_directory(parent_path)
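The loader's mapping from an Unreal `/Game` path to a location on disk is a single prefix replacement. A standalone sketch with a made-up project content directory (normally returned by `unreal.Paths.project_content_dir()`):

```python
from pathlib import Path

project_content_dir = "C:/Projects/MyGame/Content/"  # hypothetical value

asset_dir = "/Game/OpenPype/Assets/hero/heroModel"
destination_path = asset_dir.replace(
    "/Game",
    Path(project_content_dir).as_posix(),
    1)
print(destination_path)
# C:/Projects/MyGame/Content/OpenPype/Assets/hero/heroModel
```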
@@ -25,9 +25,13 @@ class CollectInstances(pyblish.api.ContextPlugin):
     def process(self, context):
 
         ar = unreal.AssetRegistryHelpers.get_asset_registry()
-        class_name = ["/Script/OpenPype",
-                      "AssetContainer"] if UNREAL_VERSION.major == 5 and \
-            UNREAL_VERSION.minor > 0 else "OpenPypePublishInstance"  # noqa
+        class_name = [
+            "/Script/OpenPype",
+            "OpenPypePublishInstance"
+        ] if (
+            UNREAL_VERSION.major == 5
+            and UNREAL_VERSION.minor > 0
+        ) else "OpenPypePublishInstance"  # noqa
         instance_containers = ar.get_assets_by_class(class_name, True)
 
         for container_data in instance_containers:

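The restructured condition only switches to the class-path list form on Unreal 5.1 and newer. A quick sketch of what it evaluates to, using a namedtuple as a stand-in for the semver-based UNREAL_VERSION:

```python
from collections import namedtuple

Version = namedtuple("Version", ["major", "minor"])  # stand-in for UNREAL_VERSION

def resolve_class_name(v):
    return [
        "/Script/OpenPype",
        "OpenPypePublishInstance"
    ] if (
        v.major == 5
        and v.minor > 0
    ) else "OpenPypePublishInstance"

print(resolve_class_name(Version(4, 27)))  # OpenPypePublishInstance
print(resolve_class_name(Version(5, 0)))   # OpenPypePublishInstance
print(resolve_class_name(Version(5, 1)))   # ['/Script/OpenPype', 'OpenPypePublishInstance']
```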
42
openpype/hosts/unreal/plugins/publish/extract_uasset.py
Normal file
@@ -0,0 +1,42 @@
from pathlib import Path
import shutil

import unreal

from openpype.pipeline import publish


class ExtractUAsset(publish.Extractor):
    """Extract a UAsset."""

    label = "Extract UAsset"
    hosts = ["unreal"]
    families = ["uasset"]
    optional = True

    def process(self, instance):
        ar = unreal.AssetRegistryHelpers.get_asset_registry()

        self.log.info("Performing extraction..")

        staging_dir = self.staging_dir(instance)
        filename = "{}.uasset".format(instance.name)

        obj = instance[0]

        asset = ar.get_asset_by_object_path(obj).get_asset()
        sys_path = unreal.SystemLibrary.get_system_path(asset)
        filename = Path(sys_path).name

        shutil.copy(sys_path, staging_dir)

        if "representations" not in instance.data:
            instance.data["representations"] = []

        representation = {
            'name': 'uasset',
            'ext': 'uasset',
            'files': filename,
            "stagingDir": staging_dir,
        }
        instance.data["representations"].append(representation)

@@ -0,0 +1,41 @@
import unreal

import pyblish.api


class ValidateNoDependencies(pyblish.api.InstancePlugin):
    """Ensure that the uasset has no dependencies

    The uasset is checked for dependencies. If there are any, the instance
    cannot be published.
    """

    order = pyblish.api.ValidatorOrder
    label = "Check no dependencies"
    families = ["uasset"]
    hosts = ["unreal"]
    optional = True

    def process(self, instance):
        ar = unreal.AssetRegistryHelpers.get_asset_registry()
        all_dependencies = []

        for obj in instance[:]:
            asset = ar.get_asset_by_object_path(obj)
            dependencies = ar.get_dependencies(
                asset.package_name,
                unreal.AssetRegistryDependencyOptions(
                    include_soft_package_references=False,
                    include_hard_package_references=True,
                    include_searchable_names=False,
                    include_soft_management_references=False,
                    include_hard_management_references=False
                ))
            if dependencies:
                for dep in dependencies:
                    if str(dep).startswith("/Game/"):
                        all_dependencies.append(str(dep))

        if all_dependencies:
            raise RuntimeError(
                f"Dependencies found: {all_dependencies}")

@@ -908,24 +908,25 @@ class ApplicationLaunchContext:
         self.launch_args.extend(self.data.pop("app_args"))
 
         # Handle launch environments
-        env = self.data.pop("env", None)
-        if env is not None and not isinstance(env, dict):
+        src_env = self.data.pop("env", None)
+        if src_env is not None and not isinstance(src_env, dict):
             self.log.warning((
                 "Passed `env` kwarg has invalid type: {}. Expected: `dict`."
                 " Using `os.environ` instead."
-            ).format(str(type(env))))
-            env = None
+            ).format(str(type(src_env))))
+            src_env = None
 
-        if env is None:
-            env = os.environ
+        if src_env is None:
+            src_env = os.environ
 
-        # subprocess.Popen keyword arguments
-        self.kwargs = {
-            "env": {
-                key: str(value)
-                for key, value in env.items()
-            }
+        ignored_env = {"QT_API", }
+        env = {
+            key: str(value)
+            for key, value in src_env.items()
+            if key not in ignored_env
         }
+        # subprocess.Popen keyword arguments
+        self.kwargs = {"env": env}
 
         if platform.system().lower() == "windows":
             # Detach new process from currently running process on Windows

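The new environment preparation is a plain dict comprehension that stringifies values and drops ignored keys. A sketch with a tiny fake environment:

```python
ignored_env = {"QT_API", }
src_env = {"PATH": "/usr/bin", "QT_API": "pyside2", "FOO": 1}

env = {
    key: str(value)
    for key, value in src_env.items()
    if key not in ignored_env
}
print(env)  # {'PATH': '/usr/bin', 'FOO': '1'} -- QT_API dropped, values stringified
```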
@@ -3,6 +3,7 @@ import re
 import collections
 import uuid
 import json
+import copy
 from abc import ABCMeta, abstractmethod, abstractproperty
 
 import six
@@ -418,9 +419,8 @@ class EnumDef(AbtractAttrDef):
     """Enumeration of single item from items.
 
     Args:
-        items: Items definition that can be coverted to
-            `collections.OrderedDict`. Dictionary represent {value: label}
-            relation.
+        items: Items definition that can be converted using
+            'prepare_enum_items'.
         default: Default value. Must be one key(value) from passed items.
     """
 
@@ -433,38 +433,95 @@ class EnumDef(AbtractAttrDef):
                 " defined values on initialization."
             ).format(self.__class__.__name__))
 
-        items = collections.OrderedDict(items)
-        if default not in items:
-            for _key in items.keys():
-                default = _key
+        items = self.prepare_enum_items(items)
+        item_values = [item["value"] for item in items]
+        if default not in item_values:
+            for value in item_values:
+                default = value
                 break
 
         super(EnumDef, self).__init__(key, default=default, **kwargs)
 
         self.items = items
+        self._item_values = set(item_values)
 
     def __eq__(self, other):
         if not super(EnumDef, self).__eq__(other):
             return False
 
-        if set(self.items.keys()) != set(other.items.keys()):
-            return False
-
-        for key, label in self.items.items():
-            if other.items[key] != label:
-                return False
-        return True
+        return self.items == other.items
 
     def convert_value(self, value):
-        if value in self.items:
+        if value in self._item_values:
             return value
         return self.default
 
     def serialize(self):
         data = super(TextDef, self).serialize()
-        data["items"] = list(self.items)
+        data["items"] = copy.deepcopy(self.items)
         return data
 
+    @staticmethod
+    def prepare_enum_items(items):
+        """Convert items to unified structure.
+
+        Output is a list where each item is dictionary with 'value'
+        and 'label'.
+
+        ```python
+        # Example output
+        [
+            {"label": "Option 1", "value": 1},
+            {"label": "Option 2", "value": 2},
+            {"label": "Option 3", "value": 3}
+        ]
+        ```
+
+        Args:
+            items (Union[Dict[str, Any], List[Any], List[Dict[str, Any]]): The
+                items to convert.
+
+        Returns:
+            List[Dict[str, Any]]: Unified structure of items.
+        """
+
+        output = []
+        if isinstance(items, dict):
+            for value, label in items.items():
+                output.append({"label": label, "value": value})
+
+        elif isinstance(items, (tuple, list, set)):
+            for item in items:
+                if isinstance(item, dict):
+                    # Validate if 'value' is available
+                    if "value" not in item:
+                        raise KeyError("Item does not contain 'value' key.")
+
+                    if "label" not in item:
+                        item["label"] = str(item["value"])
+                elif isinstance(item, (list, tuple)):
+                    if len(item) == 2:
+                        value, label = item
+                    elif len(item) == 1:
+                        value = item[0]
+                        label = str(value)
+                    else:
+                        raise ValueError((
+                            "Invalid items count {}."
+                            " Expected 1 or 2. Value: {}"
+                        ).format(len(item), str(item)))
+
+                    item = {"label": label, "value": value}
+                else:
+                    item = {"label": str(item), "value": item}
+                output.append(item)
+
+        else:
+            raise TypeError(
+                "Unknown type for enum items '{}'".format(type(items))
+            )
+
+        return output
+
 
 class BoolDef(AbtractAttrDef):
     """Boolean representation.

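Assuming the method behaves as defined above, all three accepted input shapes normalize to the same list of dicts:

```python
from openpype.lib.attribute_definitions import EnumDef

print(EnumDef.prepare_enum_items({"a": "Option A"}))
# [{'label': 'Option A', 'value': 'a'}]

print(EnumDef.prepare_enum_items(["a", "b"]))
# [{'label': 'a', 'value': 'a'}, {'label': 'b', 'value': 'b'}]

print(EnumDef.prepare_enum_items([("a", "Option A"), ("b", "Option B")]))
# [{'label': 'Option A', 'value': 'a'}, {'label': 'Option B', 'value': 'b'}]
```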
@@ -92,7 +92,9 @@ class FileTransaction(object):
     def process(self):
         # Backup any existing files
         for dst, (src, _) in self._transfers.items():
-            if dst == src or not os.path.exists(dst):
+            self.log.debug("Checking file ... {} -> {}".format(src, dst))
+            path_same = self._same_paths(src, dst)
+            if path_same or not os.path.exists(dst):
                 continue
 
             # Backup original file
@@ -105,7 +107,8 @@ class FileTransaction(object):
 
         # Copy the files to transfer
         for dst, (src, opts) in self._transfers.items():
-            if dst == src:
+            path_same = self._same_paths(src, dst)
+            if path_same:
                 self.log.debug(
                     "Source and destination are same files {} -> {}".format(
                         src, dst))
@@ -182,3 +185,10 @@ class FileTransaction(object):
         else:
             self.log.critical("An unexpected error occurred.")
             six.reraise(*sys.exc_info())
+
+    def _same_paths(self, src, dst):
+        # handles same paths but with C:/project vs c:/project
+        if os.path.exists(src) and os.path.exists(dst):
+            return os.path.samefile(src, dst)
+
+        return src == dst

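`os.path.samefile` resolves case and link differences for paths that exist on disk, while the string comparison covers not-yet-created targets. A small demonstration of the same helper outside the class:

```python
import os
import tempfile

def same_paths(src, dst):
    # Handles equal files behind differing spellings (C:/project vs c:/project).
    if os.path.exists(src) and os.path.exists(dst):
        return os.path.samefile(src, dst)
    return src == dst

with tempfile.NamedTemporaryFile() as f:
    print(same_paths(f.name, f.name))          # True via samefile
print(same_paths("/no/such/a", "/no/such/a"))  # True via string equality
print(same_paths("/no/such/a", "/no/such/b"))  # False
```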
@@ -400,6 +400,7 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin):
 
     label = "Submit to Deadline"
     order = pyblish.api.IntegratorOrder + 0.1
+    import_reference = False
     use_published = True
     asset_dependencies = False
 
@@ -424,7 +425,11 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin):
 
         file_path = None
         if self.use_published:
-            file_path = self.from_published_scene()
+            if not self.import_reference:
+                file_path = self.from_published_scene()
+            else:
+                self.log.info("use the scene with imported reference for rendering")  # noqa
+                file_path = context.data["currentFile"]
 
         # fallback if nothing was set
         if not file_path:
@@ -516,7 +521,6 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin):
         published.
 
         """
-
         instance = self._instance
         workfile_instance = self._get_workfile_instance(instance.context)
         if workfile_instance is None:
@@ -524,7 +528,7 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin):
 
         # determine published path from Anatomy.
         template_data = workfile_instance.data.get("anatomyData")
-        rep = workfile_instance.data.get("representations")[0]
+        rep = workfile_instance.data["representations"][0]
         template_data["representation"] = rep.get("name")
         template_data["ext"] = rep.get("ext")
         template_data["comment"] = None

@@ -42,7 +42,7 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
         instance.data["toBeRenderedOn"] = "deadline"
         families = instance.data["families"]
 
-        node = instance[0]
+        node = instance.data["transientData"]["node"]
         context = instance.context
 
         # get default deadline webservice url from deadline module

@@ -973,7 +973,7 @@ class SyncToAvalonEvent(BaseEvent):
         except Exception:
-            # TODO logging
-            self.process_session.rolback()
+            # TODO report
+            self.process_session.rollback()
             ent_path_items = [self.cur_project["full_name"]]
             ent_path_items.extend([
                 par for par in avalon_entity["data"]["parents"]
@@ -1016,7 +1016,7 @@ class SyncToAvalonEvent(BaseEvent):
         except Exception:
-            # TODO logging
-            self.process_session.rolback()
+            # TODO report
+            self.process_session.rollback()
             error_msg = (
                 "Couldn't update custom attributes after recreation"
                 " of entity in Ftrack"
@@ -1338,7 +1338,7 @@ class SyncToAvalonEvent(BaseEvent):
             try:
                 self.process_session.commit()
             except Exception:
-                self.process_session.rolback()
-                # TODO logging
+                self.process_session.rollback()
+                # TODO report
                 error_msg = (

@@ -129,8 +129,8 @@ class IntegrateFtrackNote(pyblish.api.InstancePlugin):
             if not note_text.solved:
                 self.log.warning((
                     "Note template requires more keys than can be provided."
-                    "\nTemplate: {}\nData: {}"
-                ).format(template, format_data))
+                    "\nTemplate: {}\nMissing values for keys:{}\nData: {}"
+                ).format(template, note_text.missing_keys, format_data))
                 continue
 
             if not note_text:

@@ -11,7 +11,7 @@ class SearchComboBox(QtWidgets.QComboBox):
         super(SearchComboBox, self).__init__(parent)
 
         self.setEditable(True)
-        self.setInsertPolicy(self.NoInsert)
+        self.setInsertPolicy(QtWidgets.QComboBox.NoInsert)
         self.lineEdit().setPlaceholderText(placeholder)
 
         # Apply completer settings

@@ -39,6 +39,7 @@ class IntegrateSlackAPI(pyblish.api.InstancePlugin):
         token = instance.data["slack_token"]
         if additional_message:
             message = "{} \n".format(additional_message)
+        users = groups = None
         for message_profile in instance.data["slack_channel_message_profiles"]:
             message += self._get_filled_message(message_profile["message"],
                                                 instance,
@@ -60,8 +61,18 @@ class IntegrateSlackAPI(pyblish.api.InstancePlugin):
             else:
                 client = SlackPython3Operations(token, self.log)
 
-            users, groups = client.get_users_and_groups()
-            message = self._translate_users(message, users, groups)
+            if "@" in message:
+                cache_key = "__cache_slack_ids"
+                slack_ids = instance.context.data.get(cache_key, None)
+                if slack_ids is None:
+                    users, groups = client.get_users_and_groups()
+                    instance.context.data[cache_key] = {}
+                    instance.context.data[cache_key]["users"] = users
+                    instance.context.data[cache_key]["groups"] = groups
+                else:
+                    users = slack_ids["users"]
+                    groups = slack_ids["groups"]
+                message = self._translate_users(message, users, groups)
 
             msg_id, file_ids = client.send_message(channel,
                                                    message,
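Caching the user and group lookup on the publish context means the Slack API is hit once per publish instead of once per instance. A stripped-down sketch of the pattern (the fake client is a hypothetical stand-in for the Slack operations classes):

```python
def get_slack_ids(context_data, client):
    cache_key = "__cache_slack_ids"
    slack_ids = context_data.get(cache_key, None)
    if slack_ids is None:
        users, groups = client.get_users_and_groups()
        context_data[cache_key] = {"users": users, "groups": groups}
        return users, groups
    return slack_ids["users"], slack_ids["groups"]

class FakeClient:
    calls = 0
    def get_users_and_groups(self):
        FakeClient.calls += 1
        return ["@john"], ["@comp-team"]

context_data = {}
client = FakeClient()
get_slack_ids(context_data, client)
get_slack_ids(context_data, client)
print(FakeClient.calls)  # 1 -- the second call reuses the cached ids
```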
@@ -212,7 +223,7 @@ class IntegrateSlackAPI(pyblish.api.InstancePlugin):
 
     def _translate_users(self, message, users, groups):
         """Replace all occurrences of @mentions with proper <@name> format."""
-        matches = re.findall(r"(?<!<)@[^ ]+", message)
+        matches = re.findall(r"(?<!<)@\S+", message)
         in_quotes = re.findall(r"(?<!<)(['\"])(@[^'\"]+)", message)
         for item in in_quotes:
             matches.append(item[1])

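The regex change matters because `[^ ]` matches newlines and tabs, so the old pattern could swallow text after a mention; `\S` stops at any whitespace. A quick comparison on a sample message:

```python
import re

message = "review please @john\nthanks @comp-team\tnow"

print(re.findall(r"(?<!<)@[^ ]+", message))
# ['@john\nthanks', '@comp-team\tnow'] -- newline/tab leak into the match

print(re.findall(r"(?<!<)@\S+", message))
# ['@john', '@comp-team'] -- stops at any whitespace
```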
@@ -328,10 +328,6 @@ class GDriveHandler(AbstractProvider):
         last_tick = status = response = None
         status_val = 0
         while response is None:
-            if server.is_representation_paused(representation['_id'],
-                                               check_parents=True,
-                                               project_name=project_name):
-                raise ValueError("Paused during process, please redo.")
             if status:
                 status_val = float(status.progress())
             if not last_tick or \
@@ -346,6 +342,13 @@ class GDriveHandler(AbstractProvider):
                     site=site,
                     progress=status_val
                 )
+            if server.is_representation_paused(
+                project_name,
+                representation['_id'],
+                site,
+                check_parents=True
+            ):
+                raise ValueError("Paused during process, please redo.")
             status, response = request.next_chunk()
 
         except errors.HttpError as ex:
@@ -415,10 +418,6 @@ class GDriveHandler(AbstractProvider):
         last_tick = status = response = None
         status_val = 0
         while response is None:
-            if server.is_representation_paused(representation['_id'],
-                                               check_parents=True,
-                                               project_name=project_name):
-                raise ValueError("Paused during process, please redo.")
             if status:
                 status_val = float(status.progress())
             if not last_tick or \
@@ -433,6 +432,13 @@ class GDriveHandler(AbstractProvider):
                     site=site,
                     progress=status_val
                 )
+            if server.is_representation_paused(
+                project_name,
+                representation['_id'],
+                site,
+                check_parents=True
+            ):
+                raise ValueError("Paused during process, please redo.")
             status, response = downloader.next_chunk()
 
         return target_name

@@ -284,9 +284,6 @@ class SyncServerThread(threading.Thread):
                     # building folder tree structure in memory
                     # call only if needed, eg. DO_UPLOAD or DO_DOWNLOAD
                     for sync in sync_repres:
-                        if self.module.\
-                                is_representation_paused(sync['_id']):
-                            continue
                         if limit <= 0:
                             continue
                         files = sync.get("files") or []

@@ -124,7 +124,6 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
         self.action_show_widget = None
         self._paused = False
         self._paused_projects = set()
-        self._paused_representations = set()
         self._anatomies = {}
 
         self._connection = None
@@ -475,7 +474,6 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             site_name (string): 'gdrive', 'studio' etc.
         """
         self.log.info("Pausing SyncServer for {}".format(representation_id))
-        self._paused_representations.add(representation_id)
         self.reset_site_on_representation(project_name, representation_id,
                                           site_name=site_name, pause=True)
 
@@ -492,35 +490,47 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             site_name (string): 'gdrive', 'studio' etc.
         """
         self.log.info("Unpausing SyncServer for {}".format(representation_id))
-        try:
-            self._paused_representations.remove(representation_id)
-        except KeyError:
-            pass
-
+        # self.paused_representations is not persistent
         self.reset_site_on_representation(project_name, representation_id,
                                           site_name=site_name, pause=False)
 
-    def is_representation_paused(self, representation_id,
-                                 check_parents=False, project_name=None):
+    def is_representation_paused(self, project_name, representation_id,
+                                 site_name, check_parents=False):
         """
-        Returns if 'representation_id' is paused or not.
+        Returns if 'representation_id' is paused or not for site.
 
         Args:
-            representation_id (string): MongoDB objectId value
+            project_name (str): project to check if paused
+            representation_id (str): MongoDB objectId value
+            site (str): site to check representation is paused for
             check_parents (bool): check if parent project or server itself
                 are not paused
-            project_name (string): project to check if paused
 
-                if 'check_parents', 'project_name' should be set too
         Returns:
             (bool)
         """
-        condition = representation_id in self._paused_representations
-        if check_parents and project_name:
-            condition = condition or \
-                self.is_project_paused(project_name) or \
-                self.is_paused()
-        return condition
+        # Check parents are paused
+        if check_parents and (
+            self.is_project_paused(project_name)
+            or self.is_paused()
+        ):
+            return True
+
+        # Get representation
+        representation = get_representation_by_id(project_name,
+                                                  representation_id,
+                                                  fields=["files.sites"])
+        if not representation:
+            return False
+
+        # Check if representation is paused
+        for file_info in representation.get("files", []):
+            for site in file_info.get("sites", []):
+                if site["name"] != site_name:
+                    continue
+
+                return site.get("paused", False)
+
+        return False
 
     def pause_project(self, project_name):
         """
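With the per-representation pause set gone, the paused state is read from the site entries on the representation's files. A sketch of that lookup over a minimal document shaped like the `files.sites` projection above:

```python
def is_paused_for_site(representation, site_name):
    for file_info in representation.get("files", []):
        for site in file_info.get("sites", []):
            if site["name"] != site_name:
                continue
            return site.get("paused", False)
    return False

representation = {
    "files": [
        {"sites": [{"name": "studio"}, {"name": "gdrive", "paused": True}]}
    ]
}
print(is_paused_for_site(representation, "gdrive"))  # True
print(is_paused_for_site(representation, "studio"))  # False
```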
@@ -1484,7 +1494,8 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
                     "$elemMatch": {
                         "name": {"$in": [remote_site]},
                         "created_dt": {"$exists": False},
-                        "tries": {"$in": retries_arr}
+                        "tries": {"$in": retries_arr},
+                        "paused": {"$exists": False}
                     }
                 }
             }]},
@@ -1494,7 +1505,8 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
                     "$elemMatch": {
                         "name": active_site,
                         "created_dt": {"$exists": False},
-                        "tries": {"$in": retries_arr}
+                        "tries": {"$in": retries_arr},
+                        "paused": {"$exists": False}
                     }
                 }}, {
                     "files.sites": {

@@ -156,8 +156,10 @@ class SyncProjectListWidget(QtWidgets.QWidget):
         if selected_index and \
                 selected_index.isValid() and \
                 not self._selection_changed:
-            mode = QtCore.QItemSelectionModel.Select | \
-                QtCore.QItemSelectionModel.Rows
+            mode = (
+                QtCore.QItemSelectionModel.Select
+                | QtCore.QItemSelectionModel.Rows
+            )
             self.project_list.selectionModel().select(selected_index, mode)
 
         if self.current_project:
@@ -271,8 +273,10 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
         for selected_id in self._selected_ids:
             index = self.model.get_index(selected_id)
             if index and index.isValid():
-                mode = QtCore.QItemSelectionModel.Select | \
-                    QtCore.QItemSelectionModel.Rows
+                mode = (
+                    QtCore.QItemSelectionModel.Select
+                    | QtCore.QItemSelectionModel.Rows
+                )
                 self.selection_model.select(index, mode)
                 existing_ids.add(selected_id)
 
Some files were not shown because too many files have changed in this diff