mirror of https://github.com/ynput/ayon-core.git
synced 2025-12-25 05:14:40 +01:00

Merge branch 'develop' into feature/maya-multiple-deadline-servers

commit 6dae396b0e
41 changed files with 895 additions and 367 deletions

CHANGELOG.md (29 changes)
@@ -1,12 +1,19 @@
 # Changelog

-## [3.3.0-nightly.6](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.3.0-nightly.8](https://github.com/pypeclub/OpenPype/tree/HEAD)

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.2.0...HEAD)

 **🚀 Enhancements**

 - Feature AE local render [\#1901](https://github.com/pypeclub/OpenPype/pull/1901)
 - Ftrack: Where I run action enhancement [\#1900](https://github.com/pypeclub/OpenPype/pull/1900)
 - Ftrack: Private project server actions [\#1899](https://github.com/pypeclub/OpenPype/pull/1899)
 - Support nested studio plugins paths. [\#1898](https://github.com/pypeclub/OpenPype/pull/1898)
 - Settings: global validators with options [\#1892](https://github.com/pypeclub/OpenPype/pull/1892)
 - Settings: Conditional dict enum positioning [\#1891](https://github.com/pypeclub/OpenPype/pull/1891)
 - Expose stop timer through rest api. [\#1886](https://github.com/pypeclub/OpenPype/pull/1886)
 - TVPaint: Increment workfile [\#1885](https://github.com/pypeclub/OpenPype/pull/1885)
 - Allow Multiple Notes to run on tasks. [\#1882](https://github.com/pypeclub/OpenPype/pull/1882)
 - Prepare for pyside2 [\#1869](https://github.com/pypeclub/OpenPype/pull/1869)
 - Filter hosts in settings host-enum [\#1868](https://github.com/pypeclub/OpenPype/pull/1868)

@@ -22,25 +29,31 @@
 - nuke: settings create missing default subsets [\#1829](https://github.com/pypeclub/OpenPype/pull/1829)
 - Update poetry lock [\#1823](https://github.com/pypeclub/OpenPype/pull/1823)
 - Settings: settings for plugins [\#1819](https://github.com/pypeclub/OpenPype/pull/1819)
 - Maya: Deadline custom settings [\#1797](https://github.com/pypeclub/OpenPype/pull/1797)
 - Maya: Shader name validation [\#1762](https://github.com/pypeclub/OpenPype/pull/1762)

 **🐛 Bug fixes**

 - Pyblish UI: Fix collecting stage processing [\#1903](https://github.com/pypeclub/OpenPype/pull/1903)
 - Burnins: Use input's bitrate in h624 [\#1902](https://github.com/pypeclub/OpenPype/pull/1902)
 - Bug: fixed python detection [\#1893](https://github.com/pypeclub/OpenPype/pull/1893)
 - global: integrate name missing default template [\#1890](https://github.com/pypeclub/OpenPype/pull/1890)
 - publisher: editorial plugins fixes [\#1889](https://github.com/pypeclub/OpenPype/pull/1889)
 - Normalize path returned from Workfiles. [\#1880](https://github.com/pypeclub/OpenPype/pull/1880)
 - Workfiles tool event arguments fix [\#1862](https://github.com/pypeclub/OpenPype/pull/1862)
 - imageio: fix grouping [\#1856](https://github.com/pypeclub/OpenPype/pull/1856)
 - publisher: missing version in subset prop [\#1849](https://github.com/pypeclub/OpenPype/pull/1849)
 - Ftrack type error fix in sync to avalon event handler [\#1845](https://github.com/pypeclub/OpenPype/pull/1845)
 - Nuke: updating effects subset fail [\#1841](https://github.com/pypeclub/OpenPype/pull/1841)
 - Fix - Standalone Publish better handling of loading multiple versions… [\#1837](https://github.com/pypeclub/OpenPype/pull/1837)
 - nuke: write render node skipped with crop [\#1836](https://github.com/pypeclub/OpenPype/pull/1836)
 - Project folder structure overrides [\#1813](https://github.com/pypeclub/OpenPype/pull/1813)
 - Maya: fix yeti settings path in extractor [\#1809](https://github.com/pypeclub/OpenPype/pull/1809)
 - Failsafe for cross project containers. [\#1806](https://github.com/pypeclub/OpenPype/pull/1806)
 - Houdini colector formatting keys fix [\#1802](https://github.com/pypeclub/OpenPype/pull/1802)
 - Settings error dialog on show [\#1798](https://github.com/pypeclub/OpenPype/pull/1798)

 **Merged pull requests:**

 - Maya: add support for `RedshiftNormalMap` node, fix `tx` linear space 🚀 [\#1863](https://github.com/pypeclub/OpenPype/pull/1863)
 - Add support for pyenv-win on windows [\#1822](https://github.com/pypeclub/OpenPype/pull/1822)
 - PS, AE - send actual context when another webserver is running [\#1811](https://github.com/pypeclub/OpenPype/pull/1811)

@@ -52,6 +65,7 @@
 - Nuke: ftrack family plugin settings preset [\#1805](https://github.com/pypeclub/OpenPype/pull/1805)
 - Standalone publisher last project [\#1799](https://github.com/pypeclub/OpenPype/pull/1799)
 - Maya: Deadline custom settings [\#1797](https://github.com/pypeclub/OpenPype/pull/1797)
 - Ftrack Multiple notes as server action [\#1795](https://github.com/pypeclub/OpenPype/pull/1795)
 - Settings conditional dict [\#1777](https://github.com/pypeclub/OpenPype/pull/1777)
 - Settings application use python 2 only where needed [\#1776](https://github.com/pypeclub/OpenPype/pull/1776)

@@ -61,9 +75,6 @@
 - Application executables with environment variables [\#1757](https://github.com/pypeclub/OpenPype/pull/1757)
 - Deadline: Nuke submission additional attributes [\#1756](https://github.com/pypeclub/OpenPype/pull/1756)
 - Settings schema without prefill [\#1753](https://github.com/pypeclub/OpenPype/pull/1753)
 - Settings Hosts enum [\#1739](https://github.com/pypeclub/OpenPype/pull/1739)
 - Validate containers settings [\#1736](https://github.com/pypeclub/OpenPype/pull/1736)
 - PS - added loader from sequence [\#1726](https://github.com/pypeclub/OpenPype/pull/1726)

 **🐛 Bug fixes**

@@ -80,12 +91,6 @@
 - Standalone publisher thumbnail extractor fix [\#1761](https://github.com/pypeclub/OpenPype/pull/1761)
 - Anatomy others templates don't cause crash [\#1758](https://github.com/pypeclub/OpenPype/pull/1758)
 - Backend acre module commit update [\#1745](https://github.com/pypeclub/OpenPype/pull/1745)
 - hiero: precollect instances failing when audio selected [\#1743](https://github.com/pypeclub/OpenPype/pull/1743)
 - Hiero: creator instance error [\#1742](https://github.com/pypeclub/OpenPype/pull/1742)
 - Nuke: fixing render creator for no selection format failing [\#1741](https://github.com/pypeclub/OpenPype/pull/1741)
 - StandalonePublisher: failing collector for editorial [\#1738](https://github.com/pypeclub/OpenPype/pull/1738)
 - Local settings UI crash on missing defaults [\#1737](https://github.com/pypeclub/OpenPype/pull/1737)
 - TVPaint white background on thumbnail [\#1735](https://github.com/pypeclub/OpenPype/pull/1735)

 **Merged pull requests:**

@@ -98,6 +98,11 @@ def install():
             .get(platform_name)
         ) or []
         for path in project_plugins:
+            try:
+                path = str(path.format(**os.environ))
+            except KeyError:
+                pass
+
             if not path or not os.path.exists(path):
                 continue

@@ -0,0 +1,17 @@
+from openpype.hosts.aftereffects.plugins.create import create_render
+
+import logging
+
+log = logging.getLogger(__name__)
+
+
+class CreateLocalRender(create_render.CreateRender):
+    """ Creator to render locally.
+
+    Created only after default render on farm. So family 'render.local' is
+    used for backward compatibility.
+    """
+
+    name = "renderDefault"
+    label = "Render Locally"
+    family = "renderLocal"

@@ -1,10 +1,14 @@
-from openpype.lib import abstract_collect_render
-from openpype.lib.abstract_collect_render import RenderInstance
-import pyblish.api
-import attr
+import os
+import re
+import attr
+import tempfile

 from avalon import aftereffects
+import pyblish.api
+
+from openpype.settings import get_project_settings
+from openpype.lib import abstract_collect_render
+from openpype.lib.abstract_collect_render import RenderInstance


 @attr.s

@@ -13,6 +17,8 @@ class AERenderInstance(RenderInstance):
     comp_name = attr.ib(default=None)
     comp_id = attr.ib(default=None)
     fps = attr.ib(default=None)
+    projectEntity = attr.ib(default=None)
+    stagingDir = attr.ib(default=None)


 class CollectAERender(abstract_collect_render.AbstractCollectRender):

@@ -21,6 +27,11 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):

     label = "Collect After Effects Render Layers"
     hosts = ["aftereffects"]

+    # internal
+    family_remapping = {
+        "render": ("render.farm", "farm"),  # (family, label)
+        "renderLocal": ("render", "local")
+    }
     padding_width = 6
     rendered_extension = 'png'

@@ -62,14 +73,16 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
             fps = work_area_info.frameRate
             # TODO add resolution when supported by extension

-            if inst["family"] == "render" and inst["active"]:
+            if inst["family"] in self.family_remapping.keys() \
+                    and inst["active"]:
+                remapped_family = self.family_remapping[inst["family"]]
                 instance = AERenderInstance(
-                    family="render.farm",  # other way integrate would catch it
-                    families=["render.farm"],
+                    family=remapped_family[0],
+                    families=[remapped_family[0]],
                     version=version,
                     time="",
                     source=current_file,
-                    label="{} - farm".format(inst["subset"]),
+                    label="{} - {}".format(inst["subset"], remapped_family[1]),
                     subset=inst["subset"],
                     asset=context.data["assetEntity"]["name"],
                     attachTo=False,

@@ -105,6 +118,30 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):

                 instance.outputDir = self._get_output_dir(instance)

+                settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+                reviewable_subset_filter = \
+                    (settings["deadline"]
+                             ["publish"]
+                             ["ProcessSubmittedJobOnFarm"]
+                             ["aov_filter"])
+
+                if inst["family"] == "renderLocal":
+                    # for local renders
+                    instance.anatomyData["version"] = instance.version
+                    instance.anatomyData["subset"] = instance.subset
+                    instance.stagingDir = tempfile.mkdtemp()
+                    instance.projectEntity = project_entity
+
+                    if self.hosts[0] in reviewable_subset_filter.keys():
+                        for aov_pattern in \
+                                reviewable_subset_filter[self.hosts[0]]:
+                            if re.match(aov_pattern, instance.subset):
+                                instance.families.append("review")
+                                instance.review = True
+                                break
+
                 self.log.info("New instance:: {}".format(instance))

                 instances.append(instance)

         return instances

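The `family_remapping` table introduced in this hunk drives both the publish family and the instance label suffix. A minimal standalone sketch of that lookup (the dict literal is copied from the diff; the `make_label` helper is hypothetical, added only for illustration):

```python
# Copied from the CollectAERender diff: maps the creator family to
# (publish family, label suffix).
family_remapping = {
    "render": ("render.farm", "farm"),
    "renderLocal": ("render", "local"),
}


def make_label(subset, family):
    # Hypothetical helper mirroring how the collector builds
    # family and label for an instance.
    publish_family, suffix = family_remapping[family]
    return publish_family, "{} - {}".format(subset, suffix)
```

A farm instance keeps the `render.farm` family so integration skips it, while a local render publishes directly as `render`.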
@@ -0,0 +1,82 @@
+import os
+import six
+import sys
+
+import openpype.api
+from avalon import aftereffects
+
+
+class ExtractLocalRender(openpype.api.Extractor):
+    """Render RenderQueue locally."""
+
+    order = openpype.api.Extractor.order - 0.47
+    label = "Extract Local Render"
+    hosts = ["aftereffects"]
+    families = ["render"]
+
+    def process(self, instance):
+        stub = aftereffects.stub()
+        staging_dir = instance.data["stagingDir"]
+        self.log.info("staging_dir::{}".format(staging_dir))
+
+        stub.render(staging_dir)
+
+        # pull file name from Render Queue Output module
+        render_q = stub.get_render_info()
+        if not render_q:
+            raise ValueError("No file extension set in Render Queue")
+        _, ext = os.path.splitext(os.path.basename(render_q.file_name))
+        ext = ext[1:]
+
+        first_file_path = None
+        files = []
+        self.log.info("files::{}".format(os.listdir(staging_dir)))
+        for file_name in os.listdir(staging_dir):
+            files.append(file_name)
+            if first_file_path is None:
+                first_file_path = os.path.join(staging_dir, file_name)
+
+        resulting_files = files
+        if len(files) == 1:
+            resulting_files = files[0]
+
+        repre_data = {
+            "frameStart": instance.data["frameStart"],
+            "frameEnd": instance.data["frameEnd"],
+            "name": ext,
+            "ext": ext,
+            "files": resulting_files,
+            "stagingDir": staging_dir
+        }
+        if instance.data["review"]:
+            repre_data["tags"] = ["review"]
+
+        instance.data["representations"] = [repre_data]
+
+        ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
+        # Generate thumbnail.
+        thumbnail_path = os.path.join(staging_dir, "thumbnail.jpg")
+
+        args = [
+            ffmpeg_path, "-y",
+            "-i", first_file_path,
+            "-vf", "scale=300:-1",
+            "-vframes", "1",
+            thumbnail_path
+        ]
+        self.log.debug("Thumbnail args:: {}".format(args))
+        try:
+            output = openpype.lib.run_subprocess(args)
+        except TypeError:
+            self.log.warning("Error in creating thumbnail")
+            six.reraise(*sys.exc_info())
+
+        instance.data["representations"].append({
+            "name": "thumbnail",
+            "ext": "jpg",
+            "files": os.path.basename(thumbnail_path),
+            "stagingDir": staging_dir,
+            "tags": ["thumbnail"]
+        })

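ExtractLocalRender stores the representation's `files` as a list for an image sequence but as a plain string for a single rendered file, which is the shape downstream integrators expect. That collapse can be sketched in isolation (file names are made up):

```python
def collapse_files(files):
    # Mirrors the extractor above: a single rendered file is stored
    # as a plain string, a sequence stays a list.
    if len(files) == 1:
        return files[0]
    return files
```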
@@ -53,7 +53,7 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):

     order = pyblish.api.ValidatorOrder
     label = "Validate Scene Settings"
-    families = ["render.farm"]
+    families = ["render.farm", "render"]
     hosts = ["aftereffects"]
     optional = True

@@ -54,6 +54,10 @@ class LoadClip(phiero.SequenceLoader):
         object_name = self.clip_name_template.format(
            **context["representation"]["context"])

+        # set colorspace
+        if colorspace:
+            track_item.source().setSourceMediaColourTransform(colorspace)
+
         # add additional metadata from the version to imprint Avalon knob
         add_keys = [
             "frameStart", "frameEnd", "source", "author",

@@ -109,9 +113,14 @@ class LoadClip(phiero.SequenceLoader):
         colorspace = version_data.get("colorspace", None)
         object_name = "{}_{}".format(name, namespace)
         file = api.get_representation_path(representation).replace("\\", "/")
+        clip = track_item.source()

         # reconnect media to new path
-        track_item.source().reconnectMedia(file)
+        clip.reconnectMedia(file)
+
+        # set colorspace
+        if colorspace:
+            clip.setSourceMediaColourTransform(colorspace)

         # add additional metadata from the version to imprint Avalon knob
         add_keys = [

@@ -160,6 +169,7 @@ class LoadClip(phiero.SequenceLoader):
     @classmethod
     def set_item_color(cls, track_item, version):

+        clip = track_item.source()
         # define version name
         version_name = version.get("name", None)
         # get all versions in list

@@ -172,6 +182,6 @@ class LoadClip(phiero.SequenceLoader):

         # set clip colour
         if version_name == max_version:
-            track_item.source().binItem().setColor(cls.clip_color_last)
+            clip.binItem().setColor(cls.clip_color_last)
         else:
-            track_item.source().binItem().setColor(cls.clip_color)
+            clip.binItem().setColor(cls.clip_color)

@@ -120,6 +120,13 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
             # create instance
             instance = context.create_instance(**data)

+            # add colorspace data
+            instance.data.update({
+                "versionData": {
+                    "colorspace": track_item.sourceMediaColourTransform(),
+                }
+            })
+
             # create shot instance for shot attributes create/update
             self.create_shot_instance(context, **data)

@@ -133,13 +140,6 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
             # create audio subset instance
             self.create_audio_instance(context, **data)

-            # add colorspace data
-            instance.data.update({
-                "versionData": {
-                    "colorspace": track_item.sourceMediaColourTransform(),
-                }
-            })
-
             # add audioReview attribute to plate instance data
             # if reviewTrack is on
             if tag_data.get("reviewTrack") is not None:

@@ -167,6 +167,8 @@ def get_file_node_path(node):

     if cmds.nodeType(node) == 'aiImage':
         return cmds.getAttr('{0}.filename'.format(node))
+    if cmds.nodeType(node) == 'RedshiftNormalMap':
+        return cmds.getAttr('{}.tex0'.format(node))

     # otherwise use fileTextureName
     return cmds.getAttr('{0}.fileTextureName'.format(node))

@@ -357,6 +359,7 @@ class CollectLook(pyblish.api.InstancePlugin):

         files = cmds.ls(history, type="file", long=True)
         files.extend(cmds.ls(history, type="aiImage", long=True))
+        files.extend(cmds.ls(history, type="RedshiftNormalMap", long=True))

         self.log.info("Collected file nodes:\n{}".format(files))
         # Collect textures if any file nodes are found

@@ -487,7 +490,7 @@ class CollectLook(pyblish.api.InstancePlugin):
         """

         self.log.debug("processing: {}".format(node))
-        if cmds.nodeType(node) not in ["file", "aiImage"]:
+        if cmds.nodeType(node) not in ["file", "aiImage", "RedshiftNormalMap"]:
             self.log.error(
                 "Unsupported file node: {}".format(cmds.nodeType(node)))
             raise AssertionError("Unsupported file node")

@@ -500,11 +503,19 @@ class CollectLook(pyblish.api.InstancePlugin):
             self.log.debug("aiImage node")
             attribute = "{}.filename".format(node)
             computed_attribute = attribute
+        elif cmds.nodeType(node) == 'RedshiftNormalMap':
+            self.log.debug("RedshiftNormalMap node")
+            attribute = "{}.tex0".format(node)
+            computed_attribute = attribute

         source = cmds.getAttr(attribute)
         self.log.info("  - file source: {}".format(source))
         color_space_attr = "{}.colorSpace".format(node)
-        color_space = cmds.getAttr(color_space_attr)
+        try:
+            color_space = cmds.getAttr(color_space_attr)
+        except ValueError:
+            # node doesn't have colorspace attribute
+            color_space = "raw"
         # Compare with the computed file path, e.g. the one with the <UDIM>
         # pattern in it, to generate some logging information about this
         # difference

@@ -233,11 +233,14 @@ class ExtractLook(openpype.api.Extractor):
         for filepath in files_metadata:

             linearize = False
-            if do_maketx and files_metadata[filepath]["color_space"] == "sRGB":  # noqa: E501
+            if do_maketx and files_metadata[filepath]["color_space"].lower() == "srgb":  # noqa: E501
                 linearize = True
                 # set its file node to 'raw' as tx will be linearized
                 files_metadata[filepath]["color_space"] = "raw"

+            if do_maketx:
+                color_space = "raw"
+
             source, mode, texture_hash = self._process_texture(
                 filepath,
                 do_maketx,

@@ -280,14 +283,19 @@ class ExtractLook(openpype.api.Extractor):
             # This will also trigger in the same order at end of context to
             # ensure after context it's still the original value.
             color_space_attr = resource["node"] + ".colorSpace"
-            color_space = cmds.getAttr(color_space_attr)
-            if files_metadata[source]["color_space"] == "raw":
-                # set color space to raw if we linearized it
-                color_space = "Raw"
+            try:
+                color_space = cmds.getAttr(color_space_attr)
+            except ValueError:
+                # node doesn't have color space attribute
+                color_space = "raw"
+            else:
+                if files_metadata[source]["color_space"] == "raw":
+                    # set color space to raw if we linearized it
+                    color_space = "raw"
             # Remap file node filename to destination
+            remap[color_space_attr] = color_space
             attr = resource["attribute"]
             remap[attr] = destinations[source]
-            remap[color_space_attr] = color_space

         self.log.info("Finished remapping destinations ...")

@@ -259,7 +259,8 @@ class LoadMov(api.Loader):
             read_node["last"].setValue(last)
             read_node['frame_mode'].setValue("start at")

-            if int(self.first_frame) == int(read_node['frame'].value()):
+            if int(float(self.first_frame)) == int(
+                    float(read_node['frame'].value())):
                 # start at workfile start
                 read_node['frame'].setValue(str(self.first_frame))
             else:

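The LoadMov change guards against frame values that arrive serialized as floats or float strings: `int("25.0")` raises `ValueError`, while `int(float("25.0"))` does not. A quick illustration of the comparison the fix enables:

```python
def frames_equal(first_frame, node_frame):
    # Compare frame values that may arrive as 25, "25", 25.0 or "25.0";
    # going through float() first avoids ValueError on "25.0".
    return int(float(first_frame)) == int(float(node_frame))
```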
@@ -270,6 +270,7 @@ class CollectTextures(pyblish.api.ContextPlugin):
                 # store origin
                 if family == 'workfile':
                     families = self.workfile_families
+                    families.append("texture_batch_workfile")

                     new_instance.data["source"] = "standalone publisher"
                 else:

@@ -8,7 +8,7 @@ class ValidateTextureBatch(pyblish.api.InstancePlugin):

     label = "Validate Texture Presence"
     hosts = ["standalonepublisher"]
     order = openpype.api.ValidateContentsOrder
-    families = ["workfile"]
+    families = ["texture_batch_workfile"]
     optional = False

     def process(self, instance):

@@ -8,7 +8,7 @@ class ValidateTextureBatchNaming(pyblish.api.InstancePlugin):

     label = "Validate Texture Batch Naming"
     hosts = ["standalonepublisher"]
     order = openpype.api.ValidateContentsOrder
-    families = ["workfile", "textures"]
+    families = ["texture_batch_workfile", "textures"]
     optional = False

     def process(self, instance):

@@ -11,7 +11,7 @@ class ValidateTextureBatchWorkfiles(pyblish.api.InstancePlugin):

     label = "Validate Texture Workfile Has Resources"
     hosts = ["standalonepublisher"]
     order = openpype.api.ValidateContentsOrder
-    families = ["workfile"]
+    families = ["texture_batch_workfile"]
     optional = True

     # from presets

@@ -1138,7 +1138,8 @@ def prepare_host_environments(data, implementation_envs=True):
         # Merge dictionaries
         env_values = _merge_env(tool_env, env_values)

-    loaded_env = _merge_env(acre.compute(env_values), data["env"])
+    merged_env = _merge_env(env_values, data["env"])
+    loaded_env = acre.compute(merged_env, cleanup=False)

     final_env = None
     # Add host specific environments

@@ -1189,7 +1190,10 @@ def apply_project_environments_value(project_name, env, project_settings=None):

     env_value = project_settings["global"]["project_environments"]
     if env_value:
-        env.update(_merge_env(acre.parse(env_value), env))
+        env.update(acre.compute(
+            _merge_env(acre.parse(env_value), env),
+            cleanup=False
+        ))
     return env

@@ -0,0 +1,61 @@
+from openpype.modules.ftrack.lib import ServerAction
+
+
+class PrivateProjectDetectionAction(ServerAction):
+    """Action helps to identify if does not have access to project."""
+
+    identifier = "server.missing.perm.private.project"
+    label = "Missing permissions"
+    description = (
+        "Main ftrack event server does not have access to this project."
+    )
+
+    def _discover(self, event):
+        """Show action only if there is a selection in event data."""
+        entities = self._translate_event(event)
+        if entities:
+            return None
+
+        selection = event["data"].get("selection")
+        if not selection:
+            return None
+
+        return {
+            "items": [{
+                "label": self.label,
+                "variant": self.variant,
+                "description": self.description,
+                "actionIdentifier": self.discover_identifier,
+                "icon": self.icon,
+            }]
+        }
+
+    def _launch(self, event):
+        # Ignore if there are values in event data
+        # - somebody clicked on submit button
+        values = event["data"].get("values")
+        if values:
+            return None
+
+        title = "# Private project (missing permissions) #"
+        msg = (
+            "User ({}) or API Key used on Ftrack event server"
+            " does not have permissions to access this private project."
+        ).format(self.session.api_user)
+        return {
+            "type": "form",
+            "title": "Missing permissions",
+            "items": [
+                {"type": "label", "value": title},
+                {"type": "label", "value": msg},
+                # Add hidden to be able detect if was clicked on submit
+                {"type": "hidden", "value": "1", "name": "hidden"}
+            ],
+            "submit_button_label": "Got it"
+        }
+
+
+def register(session):
+    '''Register plugin. Called when used as an plugin.'''
+
+    PrivateProjectDetectionAction(session).register()

@@ -1,33 +1,98 @@
+import platform
+import socket
+import getpass
+
 from openpype.modules.ftrack.lib import BaseAction, statics_icon


-class ActionAskWhereIRun(BaseAction):
-    """ Sometimes user forget where pipeline with his credentials is running.
-    - this action triggers `ActionShowWhereIRun`
-    """
-    ignore_me = True
-    identifier = 'ask.where.i.run'
-    label = 'Ask where I run'
-    description = 'Triggers PC info where user have running OpenPype'
-    icon = statics_icon("ftrack", "action_icons", "ActionAskWhereIRun.svg")
+class ActionWhereIRun(BaseAction):
+    """Show where same user has running OpenPype instances."""

-    def discover(self, session, entities, event):
-        """ Hide by default - Should be enabled only if you want to run.
-        - best practise is to create another action that triggers this one
-        """
+    identifier = "ask.where.i.run"
+    show_identifier = "show.where.i.run"
+    label = "OpenPype Admin"
+    variant = "- Where I run"
+    description = "Show PC info where user have running OpenPype"

-        return True
+    def _discover(self, _event):
+        return {
+            "items": [{
+                "label": self.label,
+                "variant": self.variant,
+                "description": self.description,
+                "actionIdentifier": self.discover_identifier,
+                "icon": self.icon,
+            }]
+        }

-    def launch(self, session, entities, event):
-        more_data = {"event_hub_id": session.event_hub.id}
-        self.trigger_action(
-            "show.where.i.run", event, additional_event_data=more_data
-        )
+    def _launch(self, event):
+        self.trigger_action(self.show_identifier, event)

+    def register(self):
+        # Register default action callbacks
+        super(ActionWhereIRun, self).register()
+
+        # Add show identifier
+        show_subscription = (
+            "topic=ftrack.action.launch"
+            " and data.actionIdentifier={}"
+            " and source.user.username={}"
+        ).format(
+            self.show_identifier,
+            self.session.api_user
+        )
+        self.session.event_hub.subscribe(
+            show_subscription,
+            self._show_info
+        )

-        return True
+    def _show_info(self, event):
+        title = "Where Do I Run?"
+        msgs = {}
+        all_keys = ["Hostname", "IP", "Username", "System name", "PC name"]
+        try:
+            host_name = socket.gethostname()
+            msgs["Hostname"] = host_name
+            host_ip = socket.gethostbyname(host_name)
+            msgs["IP"] = host_ip
+        except Exception:
+            pass
+
+        try:
+            system_name, pc_name, *_ = platform.uname()
+            msgs["System name"] = system_name
+            msgs["PC name"] = pc_name
+        except Exception:
+            pass
+
+        try:
+            msgs["Username"] = getpass.getuser()
+        except Exception:
+            pass
+
+        for key in all_keys:
+            if not msgs.get(key):
+                msgs[key] = "-Undefined-"
+
+        items = []
+        first = True
+        separator = {"type": "label", "value": "---"}
+        for key, value in msgs.items():
+            if first:
+                first = False
+            else:
+                items.append(separator)
+            self.log.debug("{}: {}".format(key, value))
+
+            subtitle = {"type": "label", "value": "<h3>{}</h3>".format(key)}
+            items.append(subtitle)
+            message = {"type": "label", "value": "<p>{}</p>".format(value)}
+            items.append(message)
+
+        self.show_interface(items, title, event=event)


 def register(session):
     '''Register plugin. Called when used as an plugin.'''

-    ActionAskWhereIRun(session).register()
+    ActionWhereIRun(session).register()

@ -1,86 +0,0 @@
|
|||
import platform
|
||||
import socket
|
||||
import getpass
|
||||
from openpype.modules.ftrack.lib import BaseAction
|
||||
|
||||
|
||||
class ActionShowWhereIRun(BaseAction):
|
||||
""" Sometimes user forget where pipeline with his credentials is running.
|
||||
- this action shows on which PC, Username and IP is running
|
||||
- requirement action MUST be registered where we want to locate the PC:
|
||||
- - can't be used retrospectively...
|
||||
"""
|
||||
#: Action identifier.
|
||||
identifier = 'show.where.i.run'
|
||||
#: Action label.
|
||||
label = 'Show where I run'
|
||||
#: Action description.
|
||||
description = 'Shows PC info where user have running OpenPype'
|
||||
|
||||
def discover(self, session, entities, event):
|
||||
""" Hide by default - Should be enabled only if you want to run.
|
||||
- best practise is to create another action that triggers this one
|
||||
"""
|
||||
|
||||
return False
|
||||
|
||||
@property
|
||||
def launch_identifier(self):
|
||||
return self.identifier
|
||||
|
||||
def launch(self, session, entities, event):
|
||||
# Don't show info when was launch from this session
|
||||
if session.event_hub.id == event.get("data", {}).get("event_hub_id"):
|
||||
return True
|
||||
|
||||
title = "Where Do I Run?"
|
||||
msgs = {}
|
||||
all_keys = ["Hostname", "IP", "Username", "System name", "PC name"]
|
||||
try:
|
||||
host_name = socket.gethostname()
|
||||
msgs["Hostname"] = host_name
|
||||
host_ip = socket.gethostbyname(host_name)
|
||||
msgs["IP"] = host_ip
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
try:
|
||||
system_name, pc_name, *_ = platform.uname()
|
||||
msgs["System name"] = system_name
|
||||
msgs["PC name"] = pc_name
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
try:
|
||||
msgs["Username"] = getpass.getuser()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
for key in all_keys:
|
||||
if not msgs.get(key):
|
||||
msgs[key] = "-Undefined-"
|
||||
|
||||
items = []
|
||||
first = True
|
||||
splitter = {'type': 'label', 'value': '---'}
|
||||
for key, value in msgs.items():
|
||||
if first:
|
||||
first = False
|
||||
else:
|
||||
items.append(splitter)
|
||||
self.log.debug("{}: {}".format(key, value))
|
||||
|
||||
subtitle = {'type': 'label', 'value': '<h3>{}</h3>'.format(key)}
|
||||
items.append(subtitle)
|
||||
message = {'type': 'label', 'value': '<p>{}</p>'.format(value)}
|
||||
items.append(message)
|
||||
|
||||
self.show_interface(items, title, event=event)
|
||||
|
||||
return True
|
||||
|
||||
|
||||
def register(session):
|
||||
'''Register plugin. Called when used as an plugin.'''
|
||||
|
||||
ActionShowWhereIRun(session).register()
|
||||
|
|
@@ -191,15 +191,22 @@ class BaseHandler(object):
        if session is None:
            session = self.session

-        _entities = event['data'].get('entities_object', None)
+        _entities = event["data"].get("entities_object", None)
+        if _entities is not None and not _entities:
+            return _entities
+
        if (
-            _entities is None or
-            _entities[0].get(
-                'link', None
+            _entities is None
+            or _entities[0].get(
+                "link", None
            ) == ftrack_api.symbol.NOT_SET
        ):
-            _entities = self._get_entities(event)
-            event['data']['entities_object'] = _entities
+            _entities = [
+                item
+                for item in self._get_entities(event)
+                if item is not None
+            ]
+            event["data"]["entities_object"] = _entities

        return _entities

@ -44,7 +44,8 @@ class ExtractBurnin(openpype.api.Extractor):
|
|||
"harmony",
|
||||
"fusion",
|
||||
"aftereffects",
|
||||
"tvpaint"
|
||||
"tvpaint",
|
||||
"aftereffects"
|
||||
# "resolve"
|
||||
]
|
||||
optional = True
|
||||
|
|
|
|||
|
|
@ -44,7 +44,8 @@ class ExtractReview(pyblish.api.InstancePlugin):
|
|||
"standalonepublisher",
|
||||
"fusion",
|
||||
"tvpaint",
|
||||
"resolve"
|
||||
"resolve",
|
||||
"aftereffects"
|
||||
]
|
||||
|
||||
# Supported extensions
|
||||
|
|
@@ -113,6 +113,10 @@ def _h264_codec_args(ffprobe_data):

     output.extend(["-codec:v", "h264"])

+    bit_rate = ffprobe_data.get("bit_rate")
+    if bit_rate:
+        output.extend(["-b:v", bit_rate])
+
     pix_fmt = ffprobe_data.get("pix_fmt")
     if pix_fmt:
         output.extend(["-pix_fmt", pix_fmt])
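The hunk above copies the source bit rate into the ffmpeg argument list only when ffprobe actually reported one, next to the existing pixel-format handling. A minimal standalone sketch of the resulting argument builder (function name illustrative):

```python
def h264_codec_args(ffprobe_data):
    """Build ffmpeg output arguments from ffprobe stream data.

    Bit rate and pixel format are forwarded to the encoder only when the
    probe reported them, so missing metadata falls back to ffmpeg defaults.
    """
    output = ["-codec:v", "h264"]

    bit_rate = ffprobe_data.get("bit_rate")
    if bit_rate:
        output.extend(["-b:v", bit_rate])

    pix_fmt = ffprobe_data.get("pix_fmt")
    if pix_fmt:
        output.extend(["-pix_fmt", pix_fmt])
    return output


# Example: a stream probed as 8 Mb/s yuv420p
# h264_codec_args({"bit_rate": "8000000", "pix_fmt": "yuv420p"})
```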
@@ -12,6 +12,30 @@
             "deadline"
         ]
     },
+    "ProcessSubmittedJobOnFarm": {
+        "enabled": true,
+        "deadline_department": "",
+        "deadline_pool": "",
+        "deadline_group": "",
+        "deadline_chunk_size": 1,
+        "deadline_priority": 50,
+        "publishing_script": "",
+        "skip_integration_repre_list": [],
+        "aov_filter": {
+            "maya": [
+                ".+(?:\\.|_)([Bb]eauty)(?:\\.|_).*"
+            ],
+            "nuke": [
+                ".*"
+            ],
+            "aftereffects": [
+                ".*"
+            ],
+            "celaction": [
+                ".*"
+            ]
+        }
+    },
     "MayaSubmitDeadline": {
         "enabled": true,
         "optional": false,


@@ -298,6 +298,17 @@
                 "add_ftrack_family": true
             }
         ]
     },
+    {
+        "hosts": [
+            "aftereffects"
+        ],
+        "families": [
+            "render"
+        ],
+        "tasks": [],
+        "add_ftrack_family": true,
+        "advanced_filtering": []
+    }
 ]
 },


@@ -173,28 +173,6 @@
             }
         ]
     },
-    "ProcessSubmittedJobOnFarm": {
-        "enabled": true,
-        "deadline_department": "",
-        "deadline_pool": "",
-        "deadline_group": "",
-        "deadline_chunk_size": 1,
-        "deadline_priority": 50,
-        "publishing_script": "",
-        "skip_integration_repre_list": [],
-        "aov_filter": {
-            "maya": [
-                ".+(?:\\.|_)([Bb]eauty)(?:\\.|_).*"
-            ],
-            "nuke": [],
-            "aftereffects": [
-                ".*"
-            ],
-            "celaction": [
-                ".*"
-            ]
-        }
-    },
     "CleanUp": {
         "paterns": [],
         "remove_temp_renders": false


@@ -257,6 +235,16 @@
             ],
             "tasks": [],
             "template": "{family}{Task}"
         },
+        {
+            "families": [
+                "renderLocal"
+            ],
+            "hosts": [
+                "aftereffects"
+            ],
+            "tasks": [],
+            "template": "render{Task}{Variant}"
+        }
     ]
 },
@@ -3,6 +3,7 @@ import re
 import json
 import copy
+import inspect
 import contextlib

 from .exceptions import (
     SchemaTemplateMissingKeys,


@@ -111,6 +112,10 @@ class SchemasHub:
         self._loaded_templates = {}
         self._loaded_schemas = {}

+        # Store validating and validated dynamic template or schemas
+        self._validating_dynamic = set()
+        self._validated_dynamic = set()
+
         # It doesn't make sence to reload types on each reset as they can't be
         # changed
         self._load_types()


@@ -126,6 +131,60 @@ class SchemasHub:
     def gui_types(self):
         return self._gui_types

+    def get_template_name(self, item_def, default=None):
+        """Get template name from passed item definition.
+
+        Args:
+            item_def(dict): Definition of item with "type".
+            default(object): Default return value.
+        """
+        output = default
+        if not item_def or not isinstance(item_def, dict):
+            return output
+
+        item_type = item_def.get("type")
+        if item_type in ("template", "schema_template"):
+            output = item_def["name"]
+        return output
+
+    def is_dynamic_template_validating(self, template_name):
+        """Is template validating using different entity.
+
+        Returns:
+            bool: Is template validating.
+        """
+        if template_name in self._validating_dynamic:
+            return True
+        return False
+
+    def is_dynamic_template_validated(self, template_name):
+        """Is template already validated.
+
+        Returns:
+            bool: Is template validated.
+        """
+
+        if template_name in self._validated_dynamic:
+            return True
+        return False
+
+    @contextlib.contextmanager
+    def validating_dynamic(self, template_name):
+        """Template name is validating and validated.
+
+        Context manager that cares about storing template name validations of
+        template.
+
+        This is to avoid infinite loop of dynamic children validation.
+        """
+        self._validating_dynamic.add(template_name)
+        try:
+            yield
+            self._validated_dynamic.add(template_name)
+
+        finally:
+            self._validating_dynamic.remove(template_name)
+
     def get_schema(self, schema_name):
         """Get schema definition data by it's name.
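The validating_dynamic context manager above is a re-entrancy guard: a template is marked as validated only when its block finishes without raising, while the in-progress flag is always cleared in the `finally` clause. The same pattern in isolation (class name illustrative, detached from the schema hub):

```python
import contextlib


class TemplateGuard:
    """Track which template names are being / have been validated.

    Mirrors the validating_dynamic context manager above: entering marks a
    name as "in progress", a clean exit marks it "done", and the in-progress
    flag is removed even when the body raises.
    """

    def __init__(self):
        self._validating = set()
        self._validated = set()

    def is_validating(self, name):
        return name in self._validating

    def is_validated(self, name):
        return name in self._validated

    @contextlib.contextmanager
    def validating(self, name):
        self._validating.add(name)
        try:
            yield
            # Only a successful validation run marks the template validated
            self._validated.add(name)
        finally:
            self._validating.remove(name)
```

Checking `is_validating` before recursing into dynamic children is what breaks the infinite loop for self-referencing templates such as the infinite-hierarchy example later in this commit.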
@@ -141,7 +141,20 @@ class ListEntity(EndpointEntity):
         item_schema = self.schema_data["object_type"]
         if not isinstance(item_schema, dict):
             item_schema = {"type": item_schema}
-        self.item_schema = item_schema
+
+        obj_template_name = self.schema_hub.get_template_name(item_schema)
+        _item_schemas = self.schema_hub.resolve_schema_data(item_schema)
+        if len(_item_schemas) == 1:
+            self.item_schema = _item_schemas[0]
+            if self.item_schema != item_schema:
+                if "label" in self.item_schema:
+                    self.item_schema.pop("label")
+                self.item_schema["use_label_wrap"] = False
+        else:
+            self.item_schema = _item_schemas
+
+        # Store if was used template or schema
+        self._obj_template_name = obj_template_name

         if self.group_item is None:
             self.is_group = True


@@ -150,6 +163,12 @@ class ListEntity(EndpointEntity):
         self.initial_value = []

     def schema_validations(self):
+        if isinstance(self.item_schema, list):
+            reason = (
+                "`ListWidget` has multiple items as object type."
+            )
+            raise EntitySchemaError(self, reason)
+
         super(ListEntity, self).schema_validations()

         if self.is_dynamic_item and self.use_label_wrap:


@@ -167,18 +186,36 @@ class ListEntity(EndpointEntity):
             raise EntitySchemaError(self, reason)

         # Validate object type schema
-        child_validated = False
+        validate_children = True
         for child_entity in self.children:
             child_entity.schema_validations()
-            child_validated = True
+            validate_children = False
             break

-        if not child_validated:
+        if validate_children and self._obj_template_name:
+            _validated = self.schema_hub.is_dynamic_template_validated(
+                self._obj_template_name
+            )
+            _validating = self.schema_hub.is_dynamic_template_validating(
+                self._obj_template_name
+            )
+            validate_children = not _validated and not _validating
+
+        if not validate_children:
+            return
+
+        def _validate():
             idx = 0
             tmp_child = self._add_new_item(idx)
             tmp_child.schema_validations()
             self.children.pop(idx)

+        if self._obj_template_name:
+            with self.schema_hub.validating_dynamic(self._obj_template_name):
+                _validate()
+        else:
+            _validate()
+
     def get_child_path(self, child_obj):
         result_idx = None
         for idx, _child_obj in enumerate(self.children):
@@ -417,6 +417,8 @@ How output of the schema could look like on save:
 - there are 2 possible ways how to set the type:
   1.) dictionary with item modifiers (`number` input has `minimum`, `maximum` and `decimals`) in that case item type must be set as value of `"type"` (example below)
   2.) item type name as string without modifiers (e.g. `text`)
+  3.) enhancement of 1.) there is also support of `template` type but be carefull about endless loop of templates
+    - goal of using `template` is to easily change same item definitions in multiple lists

 1.) with item modifiers
 ```


@@ -442,6 +444,65 @@ How output of the schema could look like on save:
 }
 ```

+3.) with template definition
+```
+# Schema of list item where template is used
+{
+    "type": "list",
+    "key": "menu_items",
+    "label": "Menu Items",
+    "object_type": {
+        "type": "template",
+        "name": "template_object_example"
+    }
+}
+
+# WARNING:
+# In this example the template use itself inside which will work in `list`
+# but may cause an issue in other entity types (e.g. `dict`).
+
+'template_object_example.json' :
+[
+    {
+        "type": "dict-conditional",
+        "use_label_wrap": true,
+        "collapsible": true,
+        "key": "menu_items",
+        "label": "Menu items",
+        "enum_key": "type",
+        "enum_label": "Type",
+        "enum_children": [
+            {
+                "key": "action",
+                "label": "Action",
+                "children": [
+                    {
+                        "type": "text",
+                        "key": "key",
+                        "label": "Key"
+                    }
+                ]
+            },
+            {
+                "key": "menu",
+                "label": "Menu",
+                "children": [
+                    {
+                        "key": "children",
+                        "label": "Children",
+                        "type": "list",
+                        "object_type": {
+                            "type": "template",
+                            "name": "template_object_example"
+                        }
+                    }
+                ]
+            }
+        ]
+    }
+]
+```
+
 ### dict-modifiable
 - one of dictionary inputs, this is only used as value input
 - items in this input can be removed and added same way as in `list` input
@@ -58,6 +58,101 @@
             }
         ]
     },
+    {
+        "type": "dict",
+        "collapsible": true,
+        "key": "ProcessSubmittedJobOnFarm",
+        "label": "ProcessSubmittedJobOnFarm",
+        "checkbox_key": "enabled",
+        "is_group": true,
+        "children": [
+            {
+                "type": "boolean",
+                "key": "enabled",
+                "label": "Enabled"
+            },
+            {
+                "type": "text",
+                "key": "deadline_department",
+                "label": "Deadline department"
+            },
+            {
+                "type": "text",
+                "key": "deadline_pool",
+                "label": "Deadline Pool"
+            },
+            {
+                "type": "text",
+                "key": "deadline_group",
+                "label": "Deadline Group"
+            },
+            {
+                "type": "number",
+                "key": "deadline_chunk_size",
+                "label": "Deadline Chunk Size"
+            },
+            {
+                "type": "number",
+                "key": "deadline_priority",
+                "label": "Deadline Priotity"
+            },
+            {
+                "type": "splitter"
+            },
+            {
+                "type": "text",
+                "key": "publishing_script",
+                "label": "Publishing script path"
+            },
+            {
+                "type": "list",
+                "key": "skip_integration_repre_list",
+                "label": "Skip integration of representation with ext",
+                "object_type": {
+                    "type": "text"
+                }
+            },
+            {
+                "type": "dict",
+                "key": "aov_filter",
+                "label": "Reviewable subsets filter",
+                "children": [
+                    {
+                        "type": "list",
+                        "key": "maya",
+                        "label": "Maya",
+                        "object_type": {
+                            "type": "text"
+                        }
+                    },
+                    {
+                        "type": "list",
+                        "key": "nuke",
+                        "label": "Nuke",
+                        "object_type": {
+                            "type": "text"
+                        }
+                    },
+                    {
+                        "type": "list",
+                        "key": "aftereffects",
+                        "label": "After Effects",
+                        "object_type": {
+                            "type": "text"
+                        }
+                    },
+                    {
+                        "type": "list",
+                        "key": "celaction",
+                        "label": "Celaction",
+                        "object_type": {
+                            "type": "text"
+                        }
+                    }
+                ]
+            }
+        ]
+    },
     {
         "type": "dict",
         "collapsible": true,


@@ -556,101 +556,6 @@
             }
         ]
     },
-    {
-        "type": "dict",
-        "collapsible": true,
-        "key": "ProcessSubmittedJobOnFarm",
-        "label": "ProcessSubmittedJobOnFarm",
-        "checkbox_key": "enabled",
-        "is_group": true,
-        "children": [
-            {
-                "type": "boolean",
-                "key": "enabled",
-                "label": "Enabled"
-            },
-            {
-                "type": "text",
-                "key": "deadline_department",
-                "label": "Deadline department"
-            },
-            {
-                "type": "text",
-                "key": "deadline_pool",
-                "label": "Deadline Pool"
-            },
-            {
-                "type": "text",
-                "key": "deadline_group",
-                "label": "Deadline Group"
-            },
-            {
-                "type": "number",
-                "key": "deadline_chunk_size",
-                "label": "Deadline Chunk Size"
-            },
-            {
-                "type": "number",
-                "key": "deadline_priority",
-                "label": "Deadline Priotity"
-            },
-            {
-                "type": "splitter"
-            },
-            {
-                "type": "text",
-                "key": "publishing_script",
-                "label": "Publishing script path"
-            },
-            {
-                "type": "list",
-                "key": "skip_integration_repre_list",
-                "label": "Skip integration of representation with ext",
-                "object_type": {
-                    "type": "text"
-                }
-            },
-            {
-                "type": "dict",
-                "key": "aov_filter",
-                "label": "Reviewable subsets filter",
-                "children": [
-                    {
-                        "type": "list",
-                        "key": "maya",
-                        "label": "Maya",
-                        "object_type": {
-                            "type": "text"
-                        }
-                    },
-                    {
-                        "type": "list",
-                        "key": "nuke",
-                        "label": "Nuke",
-                        "object_type": {
-                            "type": "text"
-                        }
-                    },
-                    {
-                        "type": "list",
-                        "key": "aftereffects",
-                        "label": "After Effects",
-                        "object_type": {
-                            "type": "text"
-                        }
-                    },
-                    {
-                        "type": "list",
-                        "key": "celaction",
-                        "label": "Celaction",
-                        "object_type": {
-                            "type": "text"
-                        }
-                    }
-                ]
-            }
-        ]
-    },
     {
         "type": "dict",
         "collapsible": true,
@@ -0,0 +1,58 @@
+[
+    {
+        "type": "dict-conditional",
+        "use_label_wrap": true,
+        "collapsible": true,
+        "key": "menu_items",
+        "label": "Menu items",
+        "enum_key": "type",
+        "enum_label": "Type",
+        "enum_children": [
+            {
+                "key": "action",
+                "label": "Action",
+                "children": [
+                    {
+                        "type": "text",
+                        "key": "key",
+                        "label": "Key"
+                    },
+                    {
+                        "type": "text",
+                        "key": "label",
+                        "label": "Label"
+                    },
+                    {
+                        "type": "text",
+                        "key": "command",
+                        "label": "Comand"
+                    }
+                ]
+            },
+            {
+                "key": "menu",
+                "label": "Menu",
+                "children": [
+                    {
+                        "type": "text",
+                        "key": "label",
+                        "label": "Label"
+                    },
+                    {
+                        "key": "children",
+                        "label": "Children",
+                        "type": "list",
+                        "object_type": {
+                            "type": "template",
+                            "name": "example_infinite_hierarchy"
+                        }
+                    }
+                ]
+            },
+            {
+                "key": "separator",
+                "label": "Separator"
+            }
+        ]
+    }
+]


@@ -82,6 +82,17 @@
             }
         ]
     },
+    {
+        "type": "list",
+        "use_label_wrap": true,
+        "collapsible": true,
+        "key": "infinite_hierarchy",
+        "label": "Infinite list template hierarchy",
+        "object_type": {
+            "type": "template",
+            "name": "example_infinite_hierarchy"
+        }
+    },
     {
         "type": "dict",
         "key": "schema_template_exaples",
@@ -316,6 +316,7 @@ class Controller(QtCore.QObject):
                 self.was_skipped.emit(plugin)
                 continue

+            in_collect_stage = self.collect_state == 0
             if plugin.__instanceEnabled__:
                 instances = pyblish.logic.instances_by_plugin(
                     self.context, plugin


@@ -325,7 +326,10 @@ class Controller(QtCore.QObject):
                     continue

                 for instance in instances:
-                    if instance.data.get("publish") is False:
+                    if (
+                        not in_collect_stage
+                        and instance.data.get("publish") is False
+                    ):
                         pyblish.logic.log.debug(
                             "%s was inactive, skipping.." % instance
                         )


@@ -338,7 +342,7 @@ class Controller(QtCore.QObject):
                     yield (plugin, instance)
             else:
                 families = util.collect_families_from_instances(
-                    self.context, only_active=True
+                    self.context, only_active=not in_collect_stage
                 )
                 plugins = pyblish.logic.plugins_by_families(
                     [plugin], families
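The change above relaxes instance filtering while the controller is still in its collect stage (`collect_state == 0`), so freshly collected instances are not dropped for having `publish` set to `False` before an artist could toggle them; later stages keep skipping inactive instances. The gating logic reduced to plain data (names are illustrative; plain dicts stand in for pyblish instances):

```python
def iter_publish_pairs(plugins_instances, in_collect_stage):
    """Yield (plugin, instance) pairs to process.

    During the collect stage every instance is processed; in later stages
    instances whose "publish" data is explicitly False are skipped.
    """
    for plugin, instances in plugins_instances:
        for instance in instances:
            if not in_collect_stage and instance.get("publish") is False:
                continue  # inactive instance, skipped outside collect stage
            yield plugin, instance
```

The same flag also drives `only_active=not in_collect_stage` in the family collection branch, keeping both code paths consistent.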
@@ -498,6 +498,9 @@ class PluginModel(QtGui.QStandardItemModel):
         ):
             new_flag_states[PluginStates.HasError] = True

+        if not publish_states & PluginStates.IsCompatible:
+            new_flag_states[PluginStates.IsCompatible] = True
+
         item.setData(new_flag_states, Roles.PublishFlagsRole)

         records = item.data(Roles.LogRecordsRole) or []


@@ -117,6 +117,9 @@ class ListItem(QtWidgets.QWidget):

         self.spacer_widget = spacer_widget

+        self._row = -1
+        self._is_last = False
+
     @property
     def category_widget(self):
         return self.entity_widget.category_widget


@@ -136,28 +139,40 @@ class ListItem(QtWidgets.QWidget):
     def add_widget_to_layout(self, widget, label=None):
         self.content_layout.addWidget(widget, 1)

+    def set_row(self, row, is_last):
+        if row == self._row and is_last == self._is_last:
+            return
+
+        trigger_order_changed = (
+            row != self._row
+            or is_last != self._is_last
+        )
+        self._row = row
+        self._is_last = is_last
+
+        if trigger_order_changed:
+            self.order_changed()
+
     @property
     def row(self):
-        return self.entity_widget.input_fields.index(self)
+        return self._row

     def parent_rows_count(self):
         return len(self.entity_widget.input_fields)

     def _on_add_clicked(self):
-        self.entity_widget.add_new_item(row=self.row() + 1)
+        self.entity_widget.add_new_item(row=self.row + 1)

     def _on_remove_clicked(self):
         self.entity_widget.remove_row(self)

     def _on_up_clicked(self):
-        row = self.row()
-        self.entity_widget.swap_rows(row - 1, row)
+        self.entity_widget.swap_rows(self.row - 1, self.row)

     def _on_down_clicked(self):
-        row = self.row()
-        self.entity_widget.swap_rows(row, row + 1)
+        self.entity_widget.swap_rows(self.row, self.row + 1)

     def order_changed(self):
-        row = self.row()
         parent_row_count = self.parent_rows_count()
         if parent_row_count == 1:
             self.up_btn.setVisible(False)


@@ -168,11 +183,11 @@ class ListItem(QtWidgets.QWidget):
         self.up_btn.setVisible(True)
         self.down_btn.setVisible(True)

-        if row == 0:
+        if self.row == 0:
             self.up_btn.setEnabled(False)
             self.down_btn.setEnabled(True)

-        elif row == parent_row_count - 1:
+        elif self.row == parent_row_count - 1:
             self.up_btn.setEnabled(True)
             self.down_btn.setEnabled(False)
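`set_row` above caches the row index and the is-last flag on the item, so the `row` property no longer searches the parent's `input_fields` list and `order_changed` fires only when either value actually changes. The caching pattern without Qt (class and callback names are illustrative; a plain callback stands in for the button updates):

```python
class RowCache:
    """Cache row index and last-row flag; notify only on real changes.

    Mirrors ListItem.set_row above: repeated calls with unchanged values
    return early, so the (potentially expensive) change handler runs only
    when the row actually moved.
    """

    def __init__(self, on_change):
        self._row = -1
        self._is_last = False
        self._on_change = on_change

    @property
    def row(self):
        return self._row

    def set_row(self, row, is_last):
        # Nothing changed -> no notification
        if row == self._row and is_last == self._is_last:
            return
        self._row = row
        self._is_last = is_last
        self._on_change(row, is_last)
```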
@@ -191,6 +206,7 @@ class ListWidget(InputWidget):
     def create_ui(self):
         self._child_style_state = ""
         self.input_fields = []
+        self._input_fields_by_entity_id = {}

         main_layout = QtWidgets.QHBoxLayout(self)
         main_layout.setContentsMargins(0, 0, 0, 0)


@@ -243,8 +259,7 @@ class ListWidget(InputWidget):
         self.entity_widget.add_widget_to_layout(self, entity_label)

     def set_entity_value(self):
-        for input_field in tuple(self.input_fields):
-            self.remove_row(input_field)
+        self.remove_all_rows()

         for entity in self.entity.children:
             self.add_row(entity)


@@ -262,39 +277,60 @@ class ListWidget(InputWidget):

     def _on_entity_change(self):
         # TODO do less inefficient
-        input_field_last_idx = len(self.input_fields) - 1
-        child_len = len(self.entity)
+        childen_order = []
+        new_children = []
         for idx, child_entity in enumerate(self.entity):
-            if idx > input_field_last_idx:
-                self.add_row(child_entity, idx)
-                input_field_last_idx += 1
+            input_field = self._input_fields_by_entity_id.get(child_entity.id)
+            if input_field is not None:
+                childen_order.append(input_field)
+            else:
+                new_children.append((idx, child_entity))

-            if self.input_fields[idx].entity is child_entity:
-                continue
+        order_changed = False
+        for idx, input_field in enumerate(childen_order):
+            current_field = self.input_fields[idx]
+            if current_field is input_field:
+                continue
+            order_changed = True
+            old_idx = self.input_fields.index(input_field)
+            self.input_fields[old_idx], self.input_fields[idx] = (
+                current_field, input_field
+            )
+            self.content_layout.insertWidget(idx + 1, input_field)

-            input_field_idx = None
-            for _input_field_idx, input_field in enumerate(self.input_fields):
-                if input_field.entity is child_entity:
-                    input_field_idx = _input_field_idx
-                    break
+        kept_len = len(childen_order)
+        fields_len = len(self.input_fields)
+        if fields_len > kept_len:
+            order_changed = True
+            for row in reversed(range(kept_len, fields_len)):
+                self.remove_row(row=row)

-            if input_field_idx is None:
-                self.add_row(child_entity, idx)
-                input_field_last_idx += 1
-                continue
+        for idx, child_entity in new_children:
+            order_changed = False
+            self.add_row(child_entity, idx)

-            input_field = self.input_fields.pop(input_field_idx)
-            self.input_fields.insert(idx, input_field)
-            self.content_layout.insertWidget(idx, input_field)
+        if not order_changed:
+            return

-        new_input_field_len = len(self.input_fields)
-        if child_len != new_input_field_len:
-            for _idx in range(child_len, new_input_field_len):
-                # Remove row at the same index
-                self.remove_row(self.input_fields[child_len])
-        input_field_len = self.count()
-        self.empty_row.setVisible(input_field_len == 0)
+        self._on_order_change()

-        self.empty_row.setVisible(self.count() == 0)
+    def _on_order_change(self):
+        last_idx = self.count() - 1
+        previous_input = None
+        for idx, input_field in enumerate(self.input_fields):
+            input_field.set_row(idx, idx == last_idx)
+            next_input = input_field.input_field.focusProxy()
+            if previous_input is not None:
+                self.setTabOrder(previous_input, next_input)
+            else:
+                self.setTabOrder(self, next_input)
+            previous_input = next_input
+
+        if previous_input is not None:
+            self.setTabOrder(previous_input, self)

     def count(self):
         return len(self.input_fields)


@@ -307,32 +343,20 @@ class ListWidget(InputWidget):

     def add_new_item(self, row=None):
         new_entity = self.entity.add_new_item(row)
-        for input_field in self.input_fields:
-            if input_field.entity is new_entity:
-                input_field.input_field.setFocus(True)
-                break
+        input_field = self._input_fields_by_entity_id.get(new_entity.id)
+        if input_field is not None:
+            input_field.input_field.setFocus(True)
        return new_entity

     def add_row(self, child_entity, row=None):
         # Create new item
         item_widget = ListItem(child_entity, self)
-
-        previous_field = None
-        next_field = None
+        self._input_fields_by_entity_id[child_entity.id] = item_widget

         if row is None:
-            if self.input_fields:
-                previous_field = self.input_fields[-1]
             self.content_layout.addWidget(item_widget)
             self.input_fields.append(item_widget)
         else:
-            if row > 0:
-                previous_field = self.input_fields[row - 1]
-
-            max_index = self.count()
-            if row < max_index:
-                next_field = self.input_fields[row]
-
             self.content_layout.insertWidget(row + 1, item_widget)
             self.input_fields.insert(row, item_widget)


@@ -342,49 +366,53 @@ class ListWidget(InputWidget):
         # added as widget here which won't because is not in input_fields
         item_widget.input_field.set_entity_value()

-        if previous_field:
-            previous_field.order_changed()
-
-        if next_field:
-            next_field.order_changed()
-
-        item_widget.order_changed()
-
-        previous_input = None
-        for input_field in self.input_fields:
-            if previous_input is not None:
-                self.setTabOrder(
-                    previous_input, input_field.input_field.focusProxy()
-                )
-            previous_input = input_field.input_field.focusProxy()
+        self._on_order_change()

         input_field_len = self.count()
         self.empty_row.setVisible(input_field_len == 0)

         self.updateGeometry()

-    def remove_row(self, item_widget):
-        row = self.input_fields.index(item_widget)
-        previous_field = None
-        next_field = None
-        if row > 0:
-            previous_field = self.input_fields[row - 1]
+    def remove_all_rows(self):
+        self._input_fields_by_entity_id = {}
+        while self.input_fields:
+            item_widget = self.input_fields.pop(0)
+            self.content_layout.removeWidget(item_widget)
+            item_widget.setParent(None)
+            item_widget.deleteLater()

-        if row != len(self.input_fields) - 1:
-            next_field = self.input_fields[row + 1]
+        self.empty_row.setVisible(True)
+
+        self.updateGeometry()
+
+    def remove_row(self, item_widget=None, row=None):
+        if item_widget is None:
+            item_widget = self.input_fields[row]
+        elif row is None:
+            row = self.input_fields.index(item_widget)

         self.content_layout.removeWidget(item_widget)
         self.input_fields.pop(row)
+        self._input_fields_by_entity_id.pop(item_widget.entity.id)
         item_widget.setParent(None)
         item_widget.deleteLater()

         if item_widget.entity in self.entity:
             self.entity.remove(item_widget.entity)

-        if previous_field:
-            previous_field.order_changed()
+        rows = self.count()
+        any_item = rows == 0
+        if any_item:
+            start_row = 0
+            if row > 0:
+                start_row = row - 1

-        if next_field:
-            next_field.order_changed()
+            last_row = rows - 1
+            _enum = enumerate(self.input_fields[start_row:rows])
+            for idx, _item_widget in _enum:
+                _item_widget.set_row(idx, idx == last_row)

-        self.empty_row.setVisible(self.count() == 0)
+        self.empty_row.setVisible(any_item)

         self.updateGeometry()
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-
 """Package declaring Pype version."""
-__version__ = "3.3.0-nightly.6"
+__version__ = "3.3.0-nightly.8"


poetry.lock (generated)
@@ -11,7 +11,7 @@ develop = false
 type = "git"
 url = "https://github.com/pypeclub/acre.git"
 reference = "master"
-resolved_reference = "5a812c6dcfd3aada87adb49be98c548c894d6566"
+resolved_reference = "55a7c331e6dc5f81639af50ca4a8cc9d73e9273d"

 [[package]]
 name = "aiohttp"


@@ -1 +1 @@
-Subproject commit cfd4191e364b47de7364096f45d9d9d9a901692a
+Subproject commit e5c8a15fde77708c924eab3018bda255f17b5390


start.py
@@ -221,10 +221,14 @@ def set_openpype_global_environments() -> None:
     all_env = get_environments()
     general_env = all_env["global"]

-    env = acre.merge(
+    merged_env = acre.merge(
         acre.parse(general_env),
         dict(os.environ)
     )
+    env = acre.compute(
+        merged_env,
+        cleanup=False
+    )
     os.environ.clear()
     os.environ.update(env)


@@ -22,7 +22,7 @@ Drag extension.zxp and drop it to Anastasyi's Extension Manager. The extension w
 ## Implemented functionality

 AfterEffects implementation currently allows you to import and add various media to composition (image plates, renders, audio files, video files etc.)
-and send prepared composition for rendering to Deadline.
+and send prepared composition for rendering to Deadline or render locally.

 ## Usage


@@ -53,6 +53,12 @@ will be changed.

 ### Publish

+#### RenderQueue
+
+AE's Render Queue is required for publishing locally or on a farm. The artist needs to configure the expected result format (extension, resolution) in an Output module of the Render Queue. Currently only a single render item with a single output module is expected in the Render Queue.
+
+AE might throw some warning windows during local publishing, so please pay attention to them in case publishing seems to be stuck in `Extract Local Render`.
+
 When you are ready to share your work, you will need to publish it. This is done by opening the `Publish` by clicking the corresponding button in the OpenPype Panel.

 