Merge remote-tracking branch 'origin/develop' into feature/database-file-data-for-validation

This commit is contained in:
Ondřej Samohel 2021-07-19 15:04:35 +02:00
commit e66432d7ed
No known key found for this signature in database
GPG key ID: 02376E18990A97C6
97 changed files with 3725 additions and 1070 deletions


@@ -1,57 +1,100 @@
# Changelog
## [3.2.0-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.3.0-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/2.18.3...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.2.0...HEAD)
**🚀 Enhancements**
- nuke: settings create missing default subsets [\#1829](https://github.com/pypeclub/OpenPype/pull/1829)
- Settings: settings for plugins [\#1819](https://github.com/pypeclub/OpenPype/pull/1819)
- Maya: Deadline custom settings [\#1797](https://github.com/pypeclub/OpenPype/pull/1797)
**🐛 Bug fixes**
- Project folder structure overrides [\#1813](https://github.com/pypeclub/OpenPype/pull/1813)
- Maya: fix yeti settings path in extractor [\#1809](https://github.com/pypeclub/OpenPype/pull/1809)
- Houdini collector formatting keys fix [\#1802](https://github.com/pypeclub/OpenPype/pull/1802)
## [3.2.0](https://github.com/pypeclub/OpenPype/tree/3.2.0) (2021-07-13)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.2.0-nightly.7...3.2.0)
**🚀 Enhancements**
- Nuke: ftrack family plugin settings preset [\#1805](https://github.com/pypeclub/OpenPype/pull/1805)
- Standalone publisher last project [\#1799](https://github.com/pypeclub/OpenPype/pull/1799)
- Ftrack Multiple notes as server action [\#1795](https://github.com/pypeclub/OpenPype/pull/1795)
- Settings conditional dict [\#1777](https://github.com/pypeclub/OpenPype/pull/1777)
- Settings application use python 2 only where needed [\#1776](https://github.com/pypeclub/OpenPype/pull/1776)
- Settings UI copy/paste [\#1769](https://github.com/pypeclub/OpenPype/pull/1769)
- Workfile tool widths [\#1766](https://github.com/pypeclub/OpenPype/pull/1766)
- Push hierarchical attributes care about task parent changes [\#1763](https://github.com/pypeclub/OpenPype/pull/1763)
- Application executables with environment variables [\#1757](https://github.com/pypeclub/OpenPype/pull/1757)
- Deadline: Nuke submission additional attributes [\#1756](https://github.com/pypeclub/OpenPype/pull/1756)
- Settings schema without prefill [\#1753](https://github.com/pypeclub/OpenPype/pull/1753)
- Settings Hosts enum [\#1739](https://github.com/pypeclub/OpenPype/pull/1739)
- Validate containers settings [\#1736](https://github.com/pypeclub/OpenPype/pull/1736)
- Autoupdate launcher [\#1725](https://github.com/pypeclub/OpenPype/pull/1725)
- Subset template and TVPaint subset template docs [\#1717](https://github.com/pypeclub/OpenPype/pull/1717)
- Overscan color extract review [\#1701](https://github.com/pypeclub/OpenPype/pull/1701)
- Nuke: Prerender Frame Range by default [\#1699](https://github.com/pypeclub/OpenPype/pull/1699)
- Smoother edges of color triangle [\#1695](https://github.com/pypeclub/OpenPype/pull/1695)
- PS - added loader from sequence [\#1726](https://github.com/pypeclub/OpenPype/pull/1726)
- Toggle Ftrack upload in StandalonePublisher [\#1708](https://github.com/pypeclub/OpenPype/pull/1708)
**🐛 Bug fixes**
- nuke: fixing wrong name of family folder when `used existing frames` [\#1803](https://github.com/pypeclub/OpenPype/pull/1803)
- Collect ftrack family bugs [\#1801](https://github.com/pypeclub/OpenPype/pull/1801)
- Invitee email can be None which break the Ftrack commit. [\#1788](https://github.com/pypeclub/OpenPype/pull/1788)
- Fix: staging and `--use-version` option [\#1786](https://github.com/pypeclub/OpenPype/pull/1786)
- Otio unrelated error on import [\#1782](https://github.com/pypeclub/OpenPype/pull/1782)
- FFprobe streams order [\#1775](https://github.com/pypeclub/OpenPype/pull/1775)
- Fix - single file files are str only, cast it to list to count properly [\#1772](https://github.com/pypeclub/OpenPype/pull/1772)
- Environments in app executable for MacOS [\#1768](https://github.com/pypeclub/OpenPype/pull/1768)
- Project specific environments [\#1767](https://github.com/pypeclub/OpenPype/pull/1767)
- Settings UI with refresh button [\#1764](https://github.com/pypeclub/OpenPype/pull/1764)
- Standalone publisher thumbnail extractor fix [\#1761](https://github.com/pypeclub/OpenPype/pull/1761)
- Anatomy others templates don't cause crash [\#1758](https://github.com/pypeclub/OpenPype/pull/1758)
- Backend acre module commit update [\#1745](https://github.com/pypeclub/OpenPype/pull/1745)
- hiero: precollect instances failing when audio selected [\#1743](https://github.com/pypeclub/OpenPype/pull/1743)
- Hiero: creator instance error [\#1742](https://github.com/pypeclub/OpenPype/pull/1742)
- Nuke: fixing render creator for no selection format failing [\#1741](https://github.com/pypeclub/OpenPype/pull/1741)
- StandalonePublisher: failing collector for editorial [\#1738](https://github.com/pypeclub/OpenPype/pull/1738)
- Local settings UI crash on missing defaults [\#1737](https://github.com/pypeclub/OpenPype/pull/1737)
- Ftrack missing custom attribute message [\#1734](https://github.com/pypeclub/OpenPype/pull/1734)
- Launcher project changes [\#1733](https://github.com/pypeclub/OpenPype/pull/1733)
- TVPaint use layer name for default variant [\#1724](https://github.com/pypeclub/OpenPype/pull/1724)
- Default subset template for TVPaint review and workfile families [\#1716](https://github.com/pypeclub/OpenPype/pull/1716)
- Maya: Extract review hotfix [\#1714](https://github.com/pypeclub/OpenPype/pull/1714)
- Settings: Imageio improving granularity [\#1711](https://github.com/pypeclub/OpenPype/pull/1711)
- Application without executables [\#1679](https://github.com/pypeclub/OpenPype/pull/1679)
## [2.18.3](https://github.com/pypeclub/OpenPype/tree/2.18.3) (2021-06-18)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/2.18.2...2.18.3)
**🐛 Bug fixes**
- Nuke: broken publishing rendered frames [\#1707](https://github.com/pypeclub/OpenPype/pull/1707)
- Remove publish highlight icon in AE [\#1664](https://github.com/pypeclub/OpenPype/pull/1664)
- TVPaint white background on thumbnail [\#1735](https://github.com/pypeclub/OpenPype/pull/1735)
**Merged pull requests:**
- Sync main 2.x back to 2.x develop [\#1715](https://github.com/pypeclub/OpenPype/pull/1715)
- Build: don't add Poetry to `PATH` [\#1808](https://github.com/pypeclub/OpenPype/pull/1808)
- Bump prismjs from 1.23.0 to 1.24.0 in /website [\#1773](https://github.com/pypeclub/OpenPype/pull/1773)
- Bc/fix/docs [\#1771](https://github.com/pypeclub/OpenPype/pull/1771)
- TVPaint ftrack family [\#1755](https://github.com/pypeclub/OpenPype/pull/1755)
## [2.18.4](https://github.com/pypeclub/OpenPype/tree/2.18.4) (2021-06-24)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/2.18.3...2.18.4)
**Merged pull requests:**
- celaction fixes [\#1754](https://github.com/pypeclub/OpenPype/pull/1754)
- celaction: audio subset changed data structure [\#1750](https://github.com/pypeclub/OpenPype/pull/1750)
## [2.18.3](https://github.com/pypeclub/OpenPype/tree/2.18.3) (2021-06-23)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.2.0-nightly.2...2.18.3)
**🐛 Bug fixes**
- Tools names forwards compatibility [\#1727](https://github.com/pypeclub/OpenPype/pull/1727)
**⚠️ Deprecations**
- global: removing obsolete ftrack validator plugin [\#1710](https://github.com/pypeclub/OpenPype/pull/1710)
- \#683 - Validate frame range in Standalone Publisher [\#1680](https://github.com/pypeclub/OpenPype/pull/1680)
- Maya: Split model content validator [\#1654](https://github.com/pypeclub/OpenPype/pull/1654)
## [2.18.2](https://github.com/pypeclub/OpenPype/tree/2.18.2) (2021-06-16)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.1.0...2.18.2)
**🚀 Enhancements**
- StandalonePublisher: adding exception for adding `delete` tag to repre [\#1650](https://github.com/pypeclub/OpenPype/pull/1650)
**🐛 Bug fixes**
- Maya: Extract review hotfix - 2.x backport [\#1713](https://github.com/pypeclub/OpenPype/pull/1713)
- StandalonePublisher: instance data attribute `keepSequence` [\#1668](https://github.com/pypeclub/OpenPype/pull/1668)
**Merged pull requests:**
@@ -69,28 +112,13 @@
- Sort applications and tools alphabetically in Settings UI [\#1689](https://github.com/pypeclub/OpenPype/pull/1689)
- \#683 - Validate Frame Range in Standalone Publisher [\#1683](https://github.com/pypeclub/OpenPype/pull/1683)
- Hiero: old container versions identify with red color [\#1682](https://github.com/pypeclub/OpenPype/pull/1682)
- Project Manager: Default name column width [\#1669](https://github.com/pypeclub/OpenPype/pull/1669)
- Remove outline in stylesheet [\#1667](https://github.com/pypeclub/OpenPype/pull/1667)
- TVPaint: Creator take layer name as default value for subset variant [\#1663](https://github.com/pypeclub/OpenPype/pull/1663)
- TVPaint custom subset template [\#1662](https://github.com/pypeclub/OpenPype/pull/1662)
- Editorial: conform assets validator [\#1659](https://github.com/pypeclub/OpenPype/pull/1659)
- Feature Slack integration [\#1657](https://github.com/pypeclub/OpenPype/pull/1657)
- Nuke - Publish simplification [\#1653](https://github.com/pypeclub/OpenPype/pull/1653)
- \#1333 - added tooltip hints to Pyblish buttons [\#1649](https://github.com/pypeclub/OpenPype/pull/1649)
**🐛 Bug fixes**
- Nuke: broken publishing rendered frames [\#1707](https://github.com/pypeclub/OpenPype/pull/1707)
- Standalone publisher Thumbnail export args [\#1705](https://github.com/pypeclub/OpenPype/pull/1705)
- Bad zip can break OpenPype start [\#1691](https://github.com/pypeclub/OpenPype/pull/1691)
- Hiero: published whole edit mov [\#1687](https://github.com/pypeclub/OpenPype/pull/1687)
- Ftrack subprocess handle of stdout/stderr [\#1675](https://github.com/pypeclub/OpenPype/pull/1675)
- Settings list race condition and mutable dict list conversion [\#1671](https://github.com/pypeclub/OpenPype/pull/1671)
- Mac launch arguments fix [\#1660](https://github.com/pypeclub/OpenPype/pull/1660)
- Fix missing dbm python module [\#1652](https://github.com/pypeclub/OpenPype/pull/1652)
- Transparent branches in view on Mac [\#1648](https://github.com/pypeclub/OpenPype/pull/1648)
- Add asset on task item [\#1646](https://github.com/pypeclub/OpenPype/pull/1646)
- Project manager save and queue [\#1645](https://github.com/pypeclub/OpenPype/pull/1645)
- New project anatomy values [\#1644](https://github.com/pypeclub/OpenPype/pull/1644)
**Merged pull requests:**


@@ -657,7 +657,7 @@ class BootstrapRepos:
]
# remove duplicates
openpype_versions = list(set(openpype_versions))
openpype_versions = sorted(list(set(openpype_versions)))
return openpype_versions
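The change above wraps the deduplicated list in `sorted()` so versions come back in a stable order. A minimal sketch of the pattern, using plain strings for illustration (the real code sorts comparable `OpenPypeVersion` objects):

```python
# Deduplicate with set(), then sort so versions come back in ascending order.
versions = ["3.2.0", "3.1.0", "3.2.0", "2.18.3"]
unique_sorted = sorted(set(versions))
print(unique_sorted)  # ['2.18.3', '3.1.0', '3.2.0']
```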


@@ -15,6 +15,9 @@ from .pype_commands import PypeCommands
expose_value=False, help="use specified version")
@click.option("--use-staging", is_flag=True,
expose_value=False, help="use staging variants")
@click.option("--list-versions", is_flag=True, expose_value=False,
help=("list all detected versions. Use with `--use-staging` "
"to list staging versions."))
def main(ctx):
"""Pype is main command serving as entry point to pipeline system.


@@ -56,7 +56,7 @@ class CollectInstances(pyblish.api.ContextPlugin):
# Create nice name if the instance has a frame range.
label = data.get("name", node.name())
if "frameStart" in data and "frameEnd" in data:
frames = "[{startFrame} - {endFrame}]".format(**data)
frames = "[{frameStart} - {frameEnd}]".format(**data)
label = "{} {}".format(label, frames)
instance = context.create_instance(label)
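The fix above simply aligns the format placeholders with the keys actually present in `data` (`frameStart`/`frameEnd`); `str.format(**data)` raises `KeyError` for a placeholder with no matching key. A small standalone illustration:

```python
# Extra keys in the mapping are fine; missing placeholder keys raise KeyError.
data = {"frameStart": 1001, "frameEnd": 1100, "name": "renderMain"}
frames = "[{frameStart} - {frameEnd}]".format(**data)
label = "{} {}".format(data["name"], frames)
print(label)  # renderMain [1001 - 1100]
```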


@@ -133,10 +133,10 @@ class ExtractYetiRig(openpype.api.Extractor):
image_search_path = resources_dir = instance.data["resourcesDir"]
settings = instance.data.get("rigsettings", None)
if settings:
settings["imageSearchPath"] = image_search_path
with open(settings_path, "w") as fp:
json.dump(settings, fp, ensure_ascii=False)
assert settings, "Yeti rig settings were not collected."
settings["imageSearchPath"] = image_search_path
with open(settings_path, "w") as fp:
json.dump(settings, fp, ensure_ascii=False)
# add textures to transfers
if 'transfers' not in instance.data:
@@ -192,12 +192,12 @@ class ExtractYetiRig(openpype.api.Extractor):
'stagingDir': dirname
}
)
self.log.info("settings file: {}".format(settings))
self.log.info("settings file: {}".format(settings_path))
instance.data["representations"].append(
{
'name': 'rigsettings',
'ext': 'rigsettings',
'files': os.path.basename(settings),
'files': os.path.basename(settings_path),
'stagingDir': dirname
}
)


@@ -70,8 +70,9 @@ class PreCollectNukeInstances(pyblish.api.ContextPlugin):
review = False
if "review" in node.knobs():
review = node["review"].value()
if review:
families.append("review")
families.append("ftrack")
# Add all nodes in group instances.
if node.Class() == "Group":
@@ -81,18 +82,18 @@ class PreCollectNukeInstances(pyblish.api.ContextPlugin):
if target == "Use existing frames":
# Local rendering
self.log.info("flagged for no render")
families.append(family)
families.append(families_ak.lower())
elif target == "Local":
# Local rendering
self.log.info("flagged for local render")
families.append("{}.local".format(family))
family = families_ak.lower()
elif target == "On farm":
# Farm rendering
self.log.info("flagged for farm render")
instance.data["transfer"] = False
families.append("{}.farm".format(family))
family = families_ak.lower()
family = families_ak.lower()
node.begin()
for i in nuke.allNodes():


@@ -2,7 +2,7 @@
Optional:
presets -> extensions (
example of use:
[".mov", ".mp4"]
["mov", "mp4"]
)
presets -> source_dir (
example of use:
@@ -11,6 +11,7 @@ Optional:
"{root[work]}/{project[name]}/inputs"
"./input"
"../input"
""
)
"""
@@ -48,7 +49,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
actions = []
# presets
extensions = [".mov", ".mp4"]
extensions = ["mov", "mp4"]
source_dir = None
def process(self, instance):
@@ -72,7 +73,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
video_path = None
basename = os.path.splitext(os.path.basename(file_path))[0]
if self.source_dir:
if self.source_dir != "":
source_dir = self.source_dir.replace("\\", "/")
if ("./" in source_dir) or ("../" in source_dir):
# get current working dir
@@ -98,7 +99,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
if os.path.splitext(f)[0] not in basename:
continue
# filter out by respected extensions
if os.path.splitext(f)[1] not in self.extensions:
if os.path.splitext(f)[1][1:] not in self.extensions:
continue
video_path = os.path.join(
staging_dir, f
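The `[1:]` slice in the change above exists because `os.path.splitext` keeps the leading dot in the extension, while the presets switched to dot-less extensions (`["mov", "mp4"]`). A standalone sketch of the comparison:

```python
import os

filename = "shot010_main.mov"
ext_with_dot = os.path.splitext(filename)[1]  # ".mov"
ext = ext_with_dot[1:]                        # "mov"
extensions = ["mov", "mp4"]                   # dot-less, as in the new presets
print(ext in extensions)  # True
```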


@@ -8,7 +8,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
"""Collect instances from editorial's OTIO sequence"""
order = pyblish.api.CollectorOrder + 0.01
label = "Collect Instances"
label = "Collect Editorial Instances"
hosts = ["standalonepublisher"]
families = ["editorial"]
@@ -17,16 +17,12 @@ class CollectInstances(pyblish.api.InstancePlugin):
"referenceMain": {
"family": "review",
"families": ["clip"],
"extensions": [".mp4"]
"extensions": ["mp4"]
},
"audioMain": {
"family": "audio",
"families": ["clip"],
"extensions": [".wav"],
},
"shotMain": {
"family": "shot",
"families": []
"extensions": ["wav"],
}
}
timeline_frame_start = 900000 # standard EDL default (10:00:00:00)
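The 900000 default corresponds to a 10:00:00:00 timecode assuming 25 fps (the actual fps is read from the asset data later in the plugin):

```python
# 10 hours * 3600 seconds/hour * 25 frames/second
fps = 25
hours = 10
timeline_frame_start = hours * 3600 * fps
print(timeline_frame_start)  # 900000
```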
@@ -55,7 +51,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
fps = plib.get_asset()["data"]["fps"]
tracks = timeline.each_child(
descended_from_type=otio.schema.track.Track
descended_from_type=otio.schema.Track
)
# get data from avalon
@@ -84,6 +80,9 @@ class CollectInstances(pyblish.api.InstancePlugin):
if clip.name is None:
continue
if isinstance(clip, otio.schema.Gap):
continue
# skip all generators like black empty
if isinstance(
clip.media_reference,
@@ -92,7 +91,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
# Transitions are ignored, because Clips have the full frame
# range.
if isinstance(clip, otio.schema.transition.Transition):
if isinstance(clip, otio.schema.Transition):
continue
# basic unique asset name
@@ -175,7 +174,16 @@ class CollectInstances(pyblish.api.InstancePlugin):
data_key: instance.data.get(data_key)})
# adding subsets to context as instances
self.subsets.update({
"shotMain": {
"family": "shot",
"families": []
}
})
for subset, properities in self.subsets.items():
if properities["version"] == 0:
properities.pop("version")
# adding Review-able instance
subset_instance_data = instance_data.copy()
subset_instance_data.update(properities)


@@ -11,7 +11,7 @@ class CollectInstanceResources(pyblish.api.InstancePlugin):
# must be after `CollectInstances`
order = pyblish.api.CollectorOrder + 0.011
label = "Collect Instance Resources"
label = "Collect Editorial Resources"
hosts = ["standalonepublisher"]
families = ["clip"]
@@ -177,19 +177,23 @@ class CollectInstanceResources(pyblish.api.InstancePlugin):
collection_head_name = None
# loop through collections and create representations
for _collection in collections:
ext = _collection.tail
ext = _collection.tail[1:]
collection_head_name = _collection.head
frame_start = list(_collection.indexes)[0]
frame_end = list(_collection.indexes)[-1]
repre_data = {
"frameStart": frame_start,
"frameEnd": frame_end,
"name": ext[1:],
"ext": ext[1:],
"name": ext,
"ext": ext,
"files": [item for item in _collection],
"stagingDir": staging_dir
}
if instance_data.get("keepSequence"):
repre_data_keep = deepcopy(repre_data)
instance_data["representations"].append(repre_data_keep)
if "review" in instance_data["families"]:
repre_data.update({
"thumbnail": True,
@@ -208,20 +212,20 @@ class CollectInstanceResources(pyblish.api.InstancePlugin):
# loop through remainder and create representations
for _reminding_file in remainder:
ext = os.path.splitext(_reminding_file)[-1]
ext = os.path.splitext(_reminding_file)[-1][1:]
if ext not in instance_data["extensions"]:
continue
if collection_head_name and (
(collection_head_name + ext[1:]) not in _reminding_file
) and (ext in [".mp4", ".mov"]):
(collection_head_name + ext) not in _reminding_file
) and (ext in ["mp4", "mov"]):
self.log.info(f"Skipping file: {_reminding_file}")
continue
frame_start = 1
frame_end = 1
repre_data = {
"name": ext[1:],
"ext": ext[1:],
"name": ext,
"ext": ext,
"files": _reminding_file,
"stagingDir": staging_dir
}


@@ -131,20 +131,21 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
tasks_to_add = dict()
project_tasks = io.find_one({"type": "project"})["config"]["tasks"]
for task_name, task_data in self.shot_add_tasks.items():
try:
if task_data["type"] in project_tasks.keys():
tasks_to_add.update({task_name: task_data})
else:
raise KeyError(
"Wrong FtrackTaskType `{}` for `{}` is not"
" existing in `{}``".format(
task_data["type"],
task_name,
list(project_tasks.keys())))
except KeyError as error:
_task_data = deepcopy(task_data)
# fixing enumerator from settings
_task_data["type"] = task_data["type"][0]
# check if task type in project task types
if _task_data["type"] in project_tasks.keys():
tasks_to_add.update({task_name: _task_data})
else:
raise KeyError(
"Wrong presets: `{0}`".format(error)
)
"Wrong FtrackTaskType `{}` for `{}` is not"
" existing in `{}``".format(
_task_data["type"],
task_name,
list(project_tasks.keys())))
instance.data["tasks"] = tasks_to_add
else:


@@ -66,7 +66,6 @@ class ExtractThumbnailSP(pyblish.api.InstancePlugin):
else:
# Convert to jpeg if not yet
full_input_path = os.path.join(thumbnail_repre["stagingDir"], file)
full_input_path = '"{}"'.format(full_input_path)
self.log.info("input {}".format(full_input_path))
full_thumbnail_path = tempfile.mkstemp(suffix=".jpg")[1]


@@ -43,7 +43,10 @@ class ValidateFrameRange(pyblish.api.InstancePlugin):
self.log.warning("Cannot check for extension {}".format(ext))
return
frames = len(instance.data.get("representations", [None])[0]["files"])
files = instance.data.get("representations", [None])[0]["files"]
if isinstance(files, str):
files = [files]
frames = len(files)
err_msg = "Frame duration from DB:'{}' ". format(int(duration)) +\
" doesn't match number of files:'{}'".format(frames) +\
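The fix above guards against representations whose `files` value is a single string rather than a list; `len()` on a string counts characters, not files. The normalization pattern, sketched standalone (helper name is illustrative):

```python
def count_files(files):
    # A single-file representation stores a plain string; wrap it in a
    # list so len() counts files instead of characters.
    if isinstance(files, str):
        files = [files]
    return len(files)

print(count_files("shot010_main.mov"))            # 1
print(count_files(["f.1001.exr", "f.1002.exr"]))  # 2
```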


@@ -103,8 +103,6 @@ class CollectInstances(pyblish.api.ContextPlugin):
instance.data["layers"] = copy.deepcopy(
context.data["layersData"]
)
# Add ftrack family
instance.data["families"].append("ftrack")
elif family == "renderLayer":
instance = self.create_render_layer_instance(
@@ -186,9 +184,6 @@ class CollectInstances(pyblish.api.ContextPlugin):
instance_data["layers"] = group_layers
# Add ftrack family
instance_data["families"].append("ftrack")
return context.create_instance(**instance_data)
def create_render_pass_instance(self, context, instance_data):


@@ -0,0 +1,9 @@
## Unreal Integration
The supported Unreal Engine version is 4.26+ (mainly because of the major Python changes introduced there).
### Project naming
Unreal doesn't support project names starting with a non-alphabetic character, so names like `123_myProject` are
invalid. If OpenPype detects such a name it automatically prepends the letter **P** to make it valid, so `123_myProject`
becomes `P123_myProject`. There is also a soft limit on project name length: it should be shorter than 20 characters.
Longer names will trigger a warning in the Unreal Editor about possible side effects.
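A minimal sketch of the renaming rule described above (the helper name is illustrative, not the actual OpenPype function):

```python
def sanitize_project_name(name: str) -> str:
    # Unreal rejects names that don't start with a letter; prepend "P".
    if not name[:1].isalpha():
        name = "P" + name
    # Soft limit: warn on names of 20 characters or more.
    if len(name) >= 20:
        print("warning: project name longer than 20 characters")
    return name

print(sanitize_project_name("123_myProject"))  # P123_myProject
```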


@@ -1,38 +1,51 @@
# -*- coding: utf-8 -*-
"""Unreal launching and project tools."""
import sys
import os
import platform
import json
from distutils import dir_util
import subprocess
import re
from pathlib import Path
from collections import OrderedDict
from openpype.api import get_project_settings
def get_engine_versions():
"""
def get_engine_versions(env=None):
"""Detect Unreal Engine versions.
This will try to detect location and versions of installed Unreal Engine.
Location can be overridden by `UNREAL_ENGINE_LOCATION` environment
variable.
Returns:
Args:
env (dict, optional): Environment to use.
dict: dictionary with version as a key and dir as value.
Returns:
OrderedDict: dictionary with version as a key and dir as value.
so the highest version is first.
Example:
>>> get_engine_version()
>>> get_engine_versions()
{
"4.23": "C:/Epic Games/UE_4.23",
"4.24": "C:/Epic Games/UE_4.24"
}
"""
try:
engine_locations = {}
root, dirs, files = next(os.walk(os.environ["UNREAL_ENGINE_LOCATION"]))
for dir in dirs:
if dir.startswith("UE_"):
ver = dir.split("_")[1]
engine_locations[ver] = os.path.join(root, dir)
"""
env = env or os.environ
engine_locations = {}
try:
root, dirs, _ = next(os.walk(env["UNREAL_ENGINE_LOCATION"]))
for directory in dirs:
if directory.startswith("UE"):
try:
ver = re.split(r"[-_]", directory)[1]
except IndexError:
continue
engine_locations[ver] = os.path.join(root, directory)
except KeyError:
# environment variable not set
pass
@@ -40,32 +53,52 @@ def get_engine_versions():
# specified directory doesn't exist
pass
# if we've got something, terminate autodetection process
# if we've got something, terminate auto-detection process
if engine_locations:
return engine_locations
return OrderedDict(sorted(engine_locations.items()))
# else kick in platform specific detection
if platform.system().lower() == "windows":
return _win_get_engine_versions()
elif platform.system().lower() == "linux":
return OrderedDict(sorted(_win_get_engine_versions().items()))
if platform.system().lower() == "linux":
# on linux, there is no installation and getting Unreal Engine involves
# git clone. So we'll probably depend on `UNREAL_ENGINE_LOCATION`.
pass
elif platform.system().lower() == "darwin":
return _darwin_get_engine_version()
if platform.system().lower() == "darwin":
return OrderedDict(sorted(_darwin_get_engine_version().items()))
return {}
return OrderedDict()
def get_editor_executable_path(engine_path: Path) -> Path:
"""Get UE4 Editor executable path."""
ue4_path = engine_path / "Engine/Binaries"
if platform.system().lower() == "windows":
ue4_path /= "Win64/UE4Editor.exe"
elif platform.system().lower() == "linux":
ue4_path /= "Linux/UE4Editor"
elif platform.system().lower() == "darwin":
ue4_path /= "Mac/UE4Editor"
return ue4_path
def _win_get_engine_versions():
"""
"""Get Unreal Engine versions on Windows.
If engines are installed via Epic Games Launcher then there is:
`%PROGRAMDATA%/Epic/UnrealEngineLauncher/LauncherInstalled.dat`
This is a JSON file listing installed items; Unreal Engine entries
are marked with `"AppName" = "UE_X.XX"` like `UE_4.24`.
Returns:
dict: version as a key and path as a value.
"""
install_json_path = os.path.join(
os.environ.get("PROGRAMDATA"),
os.getenv("PROGRAMDATA"),
"Epic",
"UnrealEngineLauncher",
"LauncherInstalled.dat",
@@ -75,11 +108,19 @@ def _win_get_engine_versions():
def _darwin_get_engine_version() -> dict:
"""
"""Get Unreal Engine versions on MacOS.
It works the same as on Windows, just JSON file location is different.
Returns:
dict: version as a key and path as a value.
See Also:
:func:`_win_get_engine_versions`.
"""
install_json_path = os.path.join(
os.environ.get("HOME"),
os.getenv("HOME"),
"Library",
"Application Support",
"Epic",
@@ -91,25 +132,26 @@ def _darwin_get_engine_version() -> dict:
def _parse_launcher_locations(install_json_path: str) -> dict:
"""
This will parse locations from json file.
"""This will parse locations from json file.
Args:
install_json_path (str): Path to `LauncherInstalled.dat`.
Returns:
dict: with unreal engine versions as keys and
paths to those engine installations as value.
:param install_json_path: path to `LauncherInstalled.dat`
:type install_json_path: str
:returns: returns dict with unreal engine versions as keys and
paths to those engine installations as value.
:rtype: dict
"""
engine_locations = {}
if os.path.isfile(install_json_path):
with open(install_json_path, "r") as ilf:
try:
install_data = json.load(ilf)
except json.JSONDecodeError:
except json.JSONDecodeError as e:
raise Exception(
"Invalid `LauncherInstalled.dat` file. "
"Cannot determine Unreal Engine location."
)
) from e
for installation in install_data.get("InstallationList", []):
if installation.get("AppName").startswith("UE_"):
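The `from e` added above chains the original `JSONDecodeError` onto the new exception as `__cause__`, so the parse failure stays visible in tracebacks. A self-contained sketch of the pattern:

```python
import json

def load_launcher_data(text):
    try:
        return json.loads(text)
    except json.JSONDecodeError as e:
        # Re-raise with the decode error preserved as the cause.
        raise Exception("Invalid `LauncherInstalled.dat` file. "
                        "Cannot determine Unreal Engine location.") from e

try:
    load_launcher_data("{not valid json")
except Exception as exc:
    cause_name = type(exc.__cause__).__name__

print(cause_name)  # JSONDecodeError
```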
@@ -121,55 +163,91 @@ def _parse_launcher_locations(install_json_path: str) -> dict:
def create_unreal_project(project_name: str,
ue_version: str,
pr_dir: str,
engine_path: str,
dev_mode: bool = False) -> None:
"""
This will create `.uproject` file at specified location. As there is no
way I know to create project via command line, this is easiest option.
Unreal project file is basically JSON file. If we find
pr_dir: Path,
engine_path: Path,
dev_mode: bool = False,
env: dict = None) -> None:
"""This will create `.uproject` file at specified location.
As there is no way I know to create project via command line, this is
easiest option. Unreal project file is basically JSON file. If we find
`AVALON_UNREAL_PLUGIN` environment variable we assume this is location
of Avalon Integration Plugin and we copy its content to project folder
and enable this plugin.
:param project_name: project name
:type project_name: str
:param ue_version: unreal engine version (like 4.23)
:type ue_version: str
:param pr_dir: path to directory where project will be created
:type pr_dir: str
:param engine_path: Path to Unreal Engine installation
:type engine_path: str
:param dev_mode: Flag to trigger C++ style Unreal project needing
Visual Studio and other tools to compile plugins from
sources. This will trigger automatically if `Binaries`
directory is not found in plugin folders as this indicates
this is only source distribution of the plugin. Dev mode
is also set by preset file `unreal/project_setup.json` in
**OPENPYPE_CONFIG**.
:type dev_mode: bool
:returns: None
"""
preset = get_project_settings(project_name)["unreal"]["project_setup"]
Args:
project_name (str): Name of the project.
ue_version (str): Unreal engine version (like 4.23).
pr_dir (Path): Path to directory where project will be created.
engine_path (Path): Path to Unreal Engine installation.
dev_mode (bool, optional): Flag to trigger C++ style Unreal project
needing Visual Studio and other tools to compile plugins from
sources. This will trigger automatically if `Binaries`
directory is not found in plugin folders as this indicates
this is only source distribution of the plugin. Dev mode
is also set by preset file `unreal/project_setup.json` in
**OPENPYPE_CONFIG**.
env (dict, optional): Environment to use. If not set, `os.environ`.
if os.path.isdir(os.environ.get("AVALON_UNREAL_PLUGIN", "")):
Throws:
NotImplementedError: For unsupported platforms.
Returns:
None
"""
env = env or os.environ
preset = get_project_settings(project_name)["unreal"]["project_setup"]
ue_id = ".".join(ue_version.split(".")[:2])
# get unreal engine identifier
# -------------------------------------------------------------------------
# FIXME (antirotor): As of 4.26 this is problem with UE4 built from
# sources. In that case Engine ID is calculated per machine/user and not
# from Engine files as this code then reads. This then prevents UE4
# to directly open project as it will complain about project being
# created in different UE4 version. When user convert such project
# to his UE4 version, Engine ID is replaced in uproject file. If some
# other user tries to open it, it will present him with similar error.
ue4_modules = Path()
if platform.system().lower() == "windows":
ue4_modules = Path(os.path.join(engine_path, "Engine", "Binaries",
"Win64", "UE4Editor.modules"))
if platform.system().lower() == "linux":
ue4_modules = Path(os.path.join(engine_path, "Engine", "Binaries",
"Linux", "UE4Editor.modules"))
if platform.system().lower() == "darwin":
ue4_modules = Path(os.path.join(engine_path, "Engine", "Binaries",
"Mac", "UE4Editor.modules"))
if ue4_modules.exists():
print("--- Loading Engine ID from modules file ...")
with open(ue4_modules, "r") as mp:
loaded_modules = json.load(mp)
if loaded_modules.get("BuildId"):
ue_id = "{" + loaded_modules.get("BuildId") + "}"
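The Engine ID lookup above reads `BuildId` from the `UE4Editor.modules` JSON and wraps it in braces. A self-contained sketch of that step, using a temporary file with a made-up `BuildId` in place of a real engine install:

```python
import json
import tempfile

# Hypothetical stand-in for Engine/Binaries/<Platform>/UE4Editor.modules
with tempfile.NamedTemporaryFile("w", suffix=".modules", delete=False) as mp:
    json.dump({"BuildId": "3041570", "Modules": {}}, mp)
    modules_path = mp.name

with open(modules_path, "r") as fp:
    loaded_modules = json.load(fp)

ue_id = None
if loaded_modules.get("BuildId"):
    ue_id = "{" + loaded_modules.get("BuildId") + "}"
print(ue_id)  # {3041570}
```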
plugins_path = None
if os.path.isdir(env.get("AVALON_UNREAL_PLUGIN", "")):
# copy plugin to correct path under project
plugins_path = os.path.join(pr_dir, "Plugins")
avalon_plugin_path = os.path.join(plugins_path, "Avalon")
if not os.path.isdir(avalon_plugin_path):
os.makedirs(avalon_plugin_path, exist_ok=True)
plugins_path = pr_dir / "Plugins"
avalon_plugin_path = plugins_path / "Avalon"
if not avalon_plugin_path.is_dir():
avalon_plugin_path.mkdir(parents=True, exist_ok=True)
dir_util._path_created = {}
dir_util.copy_tree(os.environ.get("AVALON_UNREAL_PLUGIN"),
avalon_plugin_path)
avalon_plugin_path.as_posix())
if (not os.path.isdir(os.path.join(avalon_plugin_path, "Binaries"))
or not os.path.join(avalon_plugin_path, "Intermediate")):
if not (avalon_plugin_path / "Binaries").is_dir() \
or not (avalon_plugin_path / "Intermediate").is_dir():
dev_mode = True
# data for project file
data = {
"FileVersion": 3,
"EngineAssociation": ue_version,
"EngineAssociation": ue_id,
"Category": "",
"Description": "",
"Plugins": [
@@ -179,35 +257,6 @@ def create_unreal_project(project_name: str,
]
}
if preset["install_unreal_python_engine"]:
# If `OPENPYPE_UNREAL_ENGINE_PYTHON_PLUGIN` is set, copy it from there
# to support offline installation.
# Otherwise clone UnrealEnginePython to Plugins directory
# https://github.com/20tab/UnrealEnginePython.git
uep_path = os.path.join(plugins_path, "UnrealEnginePython")
if os.environ.get("OPENPYPE_UNREAL_ENGINE_PYTHON_PLUGIN"):
os.makedirs(uep_path, exist_ok=True)
dir_util._path_created = {}
dir_util.copy_tree(
os.environ.get("OPENPYPE_UNREAL_ENGINE_PYTHON_PLUGIN"),
uep_path)
else:
# WARNING: this will trigger dev_mode, because we need to compile
# this plugin.
dev_mode = True
import git
git.Repo.clone_from(
"https://github.com/20tab/UnrealEnginePython.git",
uep_path)
data["Plugins"].append(
{"Name": "UnrealEnginePython", "Enabled": True})
if (not os.path.isdir(os.path.join(uep_path, "Binaries"))
or not os.path.join(uep_path, "Intermediate")):
dev_mode = True
if dev_mode or preset["dev_mode"]:
# this will add project module and necessary source file to make it
# C++ project and to (hopefully) make Unreal Editor to compile all
@ -220,51 +269,39 @@ def create_unreal_project(project_name: str,
"AdditionalDependencies": ["Engine"],
}]
if preset["install_unreal_python_engine"]:
# now we need to fix python path in:
# `UnrealEnginePython.Build.cs`
# to point to our python
with open(os.path.join(
uep_path, "Source",
"UnrealEnginePython",
"UnrealEnginePython.Build.cs"), mode="r") as f:
build_file = f.read()
fix = build_file.replace(
'private string pythonHome = "";',
'private string pythonHome = "{}";'.format(
sys.base_prefix.replace("\\", "/")))
with open(os.path.join(
uep_path, "Source",
"UnrealEnginePython",
"UnrealEnginePython.Build.cs"), mode="w") as f:
f.write(fix)
# write project file
project_file = pr_dir / f"{project_name}.uproject"
with open(project_file, mode="w") as pf:
json.dump(data, pf, indent=4)
# UE < 4.26 has Python 2 by default, so we need PySide,
# but we will not need it in 4.26 and up
if int(ue_version.split(".")[1]) < 26:
# ensure we have PySide2 installed in engine
python_path = None
if platform.system().lower() == "windows":
python_path = engine_path / ("Engine/Binaries/ThirdParty/"
"Python3/Win64/pythonw.exe")
if platform.system().lower() == "linux":
python_path = engine_path / ("Engine/Binaries/ThirdParty/"
"Python3/Linux/bin/python3")
if platform.system().lower() == "darwin":
python_path = engine_path / ("Engine/Binaries/ThirdParty/"
"Python3/Mac/bin/python3")
if not python_path:
raise NotImplementedError("Unsupported platform")
if not python_path.exists():
raise RuntimeError(f"Unreal Python not found at {python_path}")
subprocess.run(
[python_path.as_posix(), "-m", "pip", "install", "pyside2"])
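The per-platform branching above can be factored into a small helper; a sketch with a hypothetical name `engine_python_path`, assuming the UE Python3 layout used in the code above:

```python
from pathlib import Path


def engine_python_path(engine_path, system):
    """Return the engine-bundled Python interpreter for a platform.

    Hypothetical helper mirroring the branching above; raises for
    unsupported platforms instead of leaving the path unset.
    """
    suffixes = {
        "windows": "Engine/Binaries/ThirdParty/Python3/Win64/pythonw.exe",
        "linux": "Engine/Binaries/ThirdParty/Python3/Linux/bin/python3",
        "darwin": "Engine/Binaries/ThirdParty/Python3/Mac/bin/python3",
    }
    try:
        return engine_path / suffixes[system.lower()]
    except KeyError:
        raise NotImplementedError("Unsupported platform")
```

Keeping the table in one place makes the "unsupported platform" failure a single code path instead of a `None` check after three `if` branches.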
if dev_mode or preset["dev_mode"]:
_prepare_cpp_project(project_file, engine_path)
def _prepare_cpp_project(project_file: Path, engine_path: Path) -> None:
"""Prepare CPP Unreal Project.
This function will add the source files needed for the project to be
rebuilt along with the Avalon integration plugin.
@ -273,19 +310,19 @@ def _prepare_cpp_project(project_file: str, engine_path: str) -> None:
by some generator. This needs more research as manually writing
those files is rather hackish. :skull_and_crossbones:
Args:
project_file (Path): Path to .uproject file.
engine_path (Path): Path to Unreal Engine associated with project.
"""
project_name = project_file.stem
project_dir = project_file.parent
targets_dir = project_dir / "Source"
sources_dir = targets_dir / project_name
sources_dir.mkdir(parents=True, exist_ok=True)
(project_dir / "Content").mkdir(parents=True, exist_ok=True)
module_target = '''
using UnrealBuildTool;
@ -360,59 +397,59 @@ class {1}_API A{0}GameModeBase : public AGameModeBase
}};
'''.format(project_name, project_name.upper())
with open(targets_dir / f"{project_name}.Target.cs", mode="w") as f:
f.write(module_target)
with open(targets_dir / f"{project_name}Editor.Target.cs", mode="w") as f:
f.write(editor_module_target)
with open(sources_dir / f"{project_name}.Build.cs", mode="w") as f:
f.write(module_build)
with open(sources_dir / f"{project_name}.cpp", mode="w") as f:
f.write(module_cpp)
with open(sources_dir / f"{project_name}.h", mode="w") as f:
f.write(module_header)
with open(sources_dir / f"{project_name}GameModeBase.cpp", mode="w") as f:
f.write(game_mode_cpp)
with open(sources_dir / f"{project_name}GameModeBase.h", mode="w") as f:
f.write(game_mode_h)
u_build_tool = Path(
engine_path / "Engine/Binaries/DotNET/UnrealBuildTool.exe")
u_header_tool = None
arch = "Win64"
if platform.system().lower() == "windows":
u_header_tool = Path(
engine_path / "Engine/Binaries/Win64/UnrealHeaderTool.exe")
elif platform.system().lower() == "linux":
arch = "Linux"
u_header_tool = Path(
engine_path / "Engine/Binaries/Linux/UnrealHeaderTool")
elif platform.system().lower() == "darwin":
# we need to test this out
arch = "Mac"
u_header_tool = Path(
engine_path / "Engine/Binaries/Mac/UnrealHeaderTool")
if not u_header_tool:
raise NotImplementedError("Unsupported platform")
command1 = [u_build_tool.as_posix(), "-projectfiles",
f"-project={project_file}", "-progress"]
subprocess.run(command1)
command2 = [u_build_tool.as_posix(),
f"-ModuleWithSuffix={project_name},3555", arch,
"Development", "-TargetType=Editor",
f'-Project={project_file}',
f'{project_file}',
"-IgnoreJunk"]
subprocess.run(command2)


@ -1,31 +1,49 @@
# -*- coding: utf-8 -*-
"""Hook to launch Unreal and prepare projects."""
import os
from pathlib import Path
from openpype.lib import (
PreLaunchHook,
ApplicationLaunchFailed,
ApplicationNotFound
)
from openpype.hosts.unreal.api import lib as unreal_lib
class UnrealPrelaunchHook(PreLaunchHook):
"""Hook to handle launching Unreal.
This hook will check if the current workfile path has an Unreal
project inside. If not, it initializes one and finally passes the
project path to the Unreal launcher shell script via an
environment variable.
"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.signature = "( {} )".format(self.__class__.__name__)
def execute(self):
"""Hook entry method."""
asset_name = self.data["asset_name"]
task_name = self.data["task_name"]
workdir = self.launch_context.env["AVALON_WORKDIR"]
engine_version = self.app_name.split("/")[-1].replace("-", ".")
unreal_project_name = f"{asset_name}_{task_name}"
try:
major, minor = engine_version.split(".")[:2]
if (int(major), int(minor)) < (4, 26):
raise ApplicationLaunchFailed((
f"{self.signature} Old unsupported version of UE4 "
f"detected - {engine_version}"))
except ValueError:
# there can be a string in the minor version and in that case
# the int cast fails. This probably happens only with
# early access versions and is of no concern for this check,
# so let's keep it quiet.
...
# Unreal is sensitive about project names longer than 20 characters
if len(unreal_project_name) > 20:
@ -45,19 +63,21 @@ class UnrealPrelaunchHook(PreLaunchHook):
))
unreal_project_name = f"P{unreal_project_name}"
project_path = Path(os.path.join(workdir, unreal_project_name))
self.log.info((
f"{self.signature} requested UE4 version: "
f"[ {engine_version} ]"
))
detected = unreal_lib.get_engine_versions(self.launch_context.env)
detected_str = ', '.join(detected.keys()) or 'none'
self.log.info((
f"{self.signature} detected UE4 versions: "
f"[ {detected_str} ]"
))
if not detected:
raise ApplicationNotFound("No Unreal Engines are found.")
engine_version = ".".join(engine_version.split(".")[:2])
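For illustration, this normalization keeps only the major.minor pair, so a full build number still matches a detected engine key:

```python
# Hypothetical full version string as it may arrive from the app name.
engine_version = "4.26.2"
# Keep only "major.minor" so it can be compared against detected keys.
engine_version = ".".join(engine_version.split(".")[:2])
```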
if engine_version not in detected.keys():
@ -66,13 +86,14 @@ class UnrealPrelaunchHook(PreLaunchHook):
f"detected [ {engine_version} ]"
))
ue4_path = unreal_lib.get_editor_executable_path(
Path(detected[engine_version]))
self.launch_context.launch_args.append(ue4_path.as_posix())
project_path.mkdir(parents=True, exist_ok=True)
project_file = project_path / f"{unreal_project_name}.uproject"
if not project_file.is_file():
engine_path = detected[engine_version]
self.log.info((
f"{self.signature} creating unreal "
@ -88,8 +109,9 @@ class UnrealPrelaunchHook(PreLaunchHook):
unreal_project_name,
engine_version,
project_path,
engine_path=Path(engine_path)
)
# Append project file to launch arguments
self.launch_context.launch_args.append(
f"\"{project_file.as_posix()}\"")


@ -733,6 +733,9 @@ class Templates:
continue
default_key_values[key] = templates.pop(key)
# Pop "others" key before expected keys are processed
other_templates = templates.pop("others") or {}
keys_by_subkey = {}
for sub_key, sub_value in templates.items():
key_values = {}
@ -740,7 +743,6 @@ class Templates:
key_values.update(sub_value)
keys_by_subkey[sub_key] = cls.prepare_inner_keys(key_values)
for sub_key, sub_value in other_templates.items():
if sub_key in keys_by_subkey:
log.warning((


@ -179,7 +179,7 @@ class Application:
if group.enabled:
enabled = data.get("enabled", True)
self.enabled = enabled
self.use_python_2 = data.get("use_python_2", False)
self.label = data.get("variant_label") or name
self.full_name = "/".join((group.name, name))
@ -449,6 +449,12 @@ class ApplicationExecutable:
"""Representation of executable loaded from settings."""
def __init__(self, executable):
# Try to format executable with environments
try:
executable = executable.format(**os.environ)
except Exception:
pass
# On MacOS check if exists path to executable when ends with `.app`
# - it is common that path will lead to "/Applications/Blender" but
# real path is "/Applications/Blender.app"
@ -1153,6 +1159,9 @@ def prepare_host_environments(data, implementation_envs=True):
def apply_project_environments_value(project_name, env, project_settings=None):
"""Apply project specific environments on passed environments.
The environments are applied to the passed `env` argument in place, so
it is not required to apply changes back.
Args:
project_name (str): Name of project for which environments should be
received.
@ -1161,6 +1170,9 @@ def apply_project_environments_value(project_name, env, project_settings=None):
project_settings (dict): Project settings for passed project name.
Optional if project settings are already prepared.
Returns:
dict: Passed env values with applied project environments.
Raises:
KeyError: If project settings do not contain keys for project specific
environments.
@ -1171,10 +1183,9 @@ def apply_project_environments_value(project_name, env, project_settings=None):
project_settings = get_project_settings(project_name)
env_value = project_settings["global"]["project_environments"]
if env_value:
env.update(_merge_env(acre.parse(env_value), env))
return env
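The in-place contract described in the docstring can be sketched without acre (the `_merge_env` stand-in here is a plain dict update and is hypothetical):

```python
def apply_project_env(project_env, env):
    # Mutate `env` in place and also return it, so callers may
    # safely ignore the return value, as prepare_context_environments does.
    if project_env:
        env.update(project_env)
    return env


env = {"PATH": "/usr/bin"}
result = apply_project_env({"PROJECT_VAR": "1"}, env)
```

Because the same dict object is mutated and returned, both call styles in the surrounding code are equivalent.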
def prepare_context_environments(data):
@ -1203,9 +1214,8 @@ def prepare_context_environments(data):
# Load project specific environments
project_name = project_doc["name"]
# Apply project specific environments on current env value
apply_project_environments_value(project_name, data["env"])
app = data["app"]
workdir_data = get_workdir_data(


@ -7,6 +7,8 @@ try:
import opentimelineio as otio
from opentimelineio import opentime as _ot
except ImportError:
if not os.environ.get("AVALON_APP"):
raise
otio = discover_host_vendor_module("opentimelineio")
_ot = discover_host_vendor_module("opentimelineio.opentime")


@ -89,8 +89,13 @@ def ffprobe_streams(path_to_file, logger=None):
popen_stdout, popen_stderr = popen.communicate()
if popen_stdout:
logger.debug("FFprobe stdout:\n{}".format(
popen_stdout.decode("utf-8")
))
if popen_stderr:
logger.warning("FFprobe stderr:\n{}".format(
popen_stderr.decode("utf-8")
))
return json.loads(popen_stdout)["streams"]
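The decode step matters because `communicate()` returns bytes; a minimal standalone emulation of the ffprobe call (using a Python child process instead of real ffprobe):

```python
import json
import subprocess
import sys

# Emulate ffprobe: the child prints JSON to stdout as *bytes*, which must
# be decoded for logging and parsed to reach the "streams" key.
payload = '{"streams": [{"codec_type": "video"}]}'
popen = subprocess.Popen(
    [sys.executable, "-c", "print('" + payload + "')"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
popen_stdout, popen_stderr = popen.communicate()
streams = json.loads(popen_stdout)["streams"]
```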


@ -271,6 +271,22 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
["DEADLINE_REST_URL"]
)
self._job_info = (
context.data["project_settings"].get(
"deadline", {}).get(
"publish", {}).get(
"MayaSubmitDeadline", {}).get(
"jobInfo", {})
)
self._plugin_info = (
context.data["project_settings"].get(
"deadline", {}).get(
"publish", {}).get(
"MayaSubmitDeadline", {}).get(
"pluginInfo", {})
)
assert self._deadline_url, "Requires DEADLINE_REST_URL"
context = instance.context
@ -407,7 +423,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
self.payload_skeleton["JobInfo"]["Priority"] = \
self._instance.data.get("priority", 50)
if self.group != "none" and self.group:
self.payload_skeleton["JobInfo"]["Group"] = self.group
if self.limit_groups:
@ -536,6 +552,10 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
self.preflight_check(instance)
# add jobInfo and pluginInfo variables from Settings
payload["JobInfo"].update(self._job_info)
payload["PluginInfo"].update(self._plugin_info)
# Prepare tiles data ------------------------------------------------
if instance.data.get("tileRendering"):
# if we have sequence of files, we need to create tile job for


@ -32,6 +32,8 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
department = ""
limit_groups = {}
use_gpu = False
env_allowed_keys = []
env_search_replace_values = {}
def process(self, instance):
instance.data["toBeRenderedOn"] = "deadline"
@ -242,19 +244,19 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
"PYBLISHPLUGINPATH",
"NUKE_PATH",
"TOOL_ENV",
"OPENPYPE_DEV",
"FOUNDRY_LICENSE"
]
# add allowed keys from preset if any
if self.env_allowed_keys:
keys += self.env_allowed_keys
environment = dict({key: os.environ[key] for key in keys
if key in os.environ}, **api.Session)
# self.log.debug("enviro: {}".format(pprint(environment)))
for _path in os.environ:
if _path.lower().startswith('openpype_'):
environment[_path] = os.environ[_path]
clean_environment = {}
for key, value in environment.items():
@ -285,6 +287,13 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
environment = clean_environment
# to recognize job from PYPE for turning Event On/Off
environment["OPENPYPE_RENDER_JOB"] = "1"
# finally search replace in values of any key
if self.env_search_replace_values:
for key, value in environment.items():
for _k, _v in self.env_search_replace_values.items():
value = value.replace(_k, _v)
environment[key] = value
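A minimal, standalone run of this search-and-replace over environment values (hypothetical example paths; note the replacements accumulate on `value` so several pairs can apply to one key):

```python
environment = {"NUKE_PATH": "/mnt/old/nuke", "TOOL_ENV": "/mnt/old/tools"}
env_search_replace_values = {"/mnt/old": "/mnt/new"}

# Rewrite only the values; keys are left untouched.
for key, value in environment.items():
    for _k, _v in env_search_replace_values.items():
        value = value.replace(_k, _v)
    environment[key] = value
```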
payload["JobInfo"].update({
"EnvironmentKeyValue%d" % index: "{key}={value}".format(
key=key,


@ -16,11 +16,13 @@ def clone_review_session(session, entity):
# Add all invitees.
for invitee in entity["review_session_invitees"]:
# Make sure email is not None but string
email = invitee["email"] or ""
session.create(
"ReviewSessionInvitee",
{
"name": invitee["name"],
"email": email,
"review_session": review_session
}
)


@ -0,0 +1,167 @@
from openpype.modules.ftrack.lib import ServerAction
class MultipleNotesServer(ServerAction):
"""Action adds the same note to multiple AssetVersions.
The note is added to the selected AssetVersions and is created as the
user who triggered the action. It is possible to define the note's
category.
"""
identifier = "multiple.notes.server"
label = "Multiple Notes (Server)"
description = "Add same note to multiple Asset Versions"
_none_category = "__NONE__"
def discover(self, session, entities, event):
"""Show action only on AssetVersions."""
if not entities:
return False
for entity in entities:
if entity.entity_type.lower() != "assetversion":
return False
return True
def interface(self, session, entities, event):
event_source = event["source"]
user_info = event_source.get("user") or {}
user_id = user_info.get("id")
if not user_id:
return None
values = event["data"].get("values")
if values:
return None
note_label = {
"type": "label",
"value": "# Enter note: #"
}
note_value = {
"name": "note",
"type": "textarea"
}
category_label = {
"type": "label",
"value": "## Category: ##"
}
category_data = []
category_data.append({
"label": "- None -",
"value": self._none_category
})
all_categories = session.query(
"select id, name from NoteCategory"
).all()
for cat in all_categories:
category_data.append({
"label": cat["name"],
"value": cat["id"]
})
category_value = {
"type": "enumerator",
"name": "category",
"data": category_data,
"value": self._none_category
}
splitter = {
"type": "label",
"value": "---"
}
return [
note_label,
note_value,
splitter,
category_label,
category_value
]
def launch(self, session, entities, event):
if "values" not in event["data"]:
return None
values = event["data"]["values"]
if len(values) <= 0 or "note" not in values:
return False
# Get Note text
note_value = values["note"]
if note_value.lower().strip() == "":
return {
"success": True,
"message": "Note was not entered. Skipping"
}
# Get User
event_source = event["source"]
user_info = event_source.get("user") or {}
user_id = user_info.get("id")
user = None
if user_id:
user = session.query(
'User where id is "{}"'.format(user_id)
).first()
if not user:
return {
"success": False,
"message": "Couldn't get user information."
}
# Logging message preparation
# - username
username = user.get("username") or "N/A"
# - AssetVersion ids
asset_version_ids_str = ",".join([entity["id"] for entity in entities])
# Base note data
note_data = {
"content": note_value,
"author": user
}
# Get category
category_id = values["category"]
if category_id == self._none_category:
category_id = None
category_name = None
if category_id is not None:
category = session.query(
"select id, name from NoteCategory where id is \"{}\"".format(
category_id
)
).first()
if category:
note_data["category"] = category
category_name = category["name"]
category_msg = ""
if category_name:
category_msg = " with category: \"{}\"".format(category_name)
self.log.warning((
"Creating note{} as User \"{}\" on "
"AssetVersions: {} with value \"{}\""
).format(category_msg, username, asset_version_ids_str, note_value))
# Create notes for entities
for entity in entities:
new_note = session.create("Note", note_data)
entity["notes"].append(new_note)
session.commit()
return True
def register(session):
'''Register plugin. Called when used as a plugin.'''
MultipleNotesServer(session).register()


@ -2,7 +2,10 @@ import collections
import datetime
import ftrack_api
from openpype.modules.ftrack.lib import BaseEvent
from openpype.modules.ftrack.lib import (
BaseEvent,
query_custom_attributes
)
class PushFrameValuesToTaskEvent(BaseEvent):
@ -55,10 +58,6 @@ class PushFrameValuesToTaskEvent(BaseEvent):
if entity_info.get("entityType") != "task":
continue
# Care only about changes of status
changes = entity_info.get("changes")
if not changes:
@ -74,6 +73,14 @@ class PushFrameValuesToTaskEvent(BaseEvent):
if project_id is None:
continue
# Skip `Task` entity type if parent didn't change
if entity_info["entity_type"].lower() == "task":
if (
"parent_id" not in changes
or changes["parent_id"]["new"] is None
):
continue
if project_id not in entities_info_by_project_id:
entities_info_by_project_id[project_id] = []
entities_info_by_project_id[project_id].append(entity_info)
@ -117,11 +124,24 @@ class PushFrameValuesToTaskEvent(BaseEvent):
))
return
interest_attributes = set(interest_attributes)
interest_entity_types = set(interest_entity_types)
# Separate value changes and task parent changes
_entities_info = []
task_parent_changes = []
for entity_info in entities_info:
if entity_info["entity_type"].lower() == "task":
task_parent_changes.append(entity_info)
else:
_entities_info.append(entity_info)
entities_info = _entities_info
# Filter entities info with changes
interesting_data, changed_keys_by_object_id = self.filter_changes(
session, event, entities_info, interest_attributes
)
if not interesting_data:
if not interesting_data and not task_parent_changes:
return
# Prepare object types
@ -131,6 +151,289 @@ class PushFrameValuesToTaskEvent(BaseEvent):
name_low = object_type["name"].lower()
object_types_by_name[name_low] = object_type
# NOTE it would be nice to check that `interesting_data` does not
# contain value changes of tasks that were created or moved
# - but finding that out is complex
if interesting_data:
self.process_attribute_changes(
session, object_types_by_name,
interesting_data, changed_keys_by_object_id,
interest_entity_types, interest_attributes
)
if task_parent_changes:
self.process_task_parent_change(
session, object_types_by_name, task_parent_changes,
interest_entity_types, interest_attributes
)
def process_task_parent_change(
self, session, object_types_by_name, task_parent_changes,
interest_entity_types, interest_attributes
):
"""Push custom attribute values if task parent has changed.
Parent is changed if a task is created or moved under a different
entity. We don't care about all task changes, only about those whose
parent is in the interest entity types (from settings).
The task's hierarchical value should be unset or set based on the
parent's real hierarchical value, and the non-hierarchical custom
attribute value should be set to the hierarchical value.
"""
# Store task ids which were created or moved under parent with entity
# type defined in settings (interest_entity_types).
task_ids = set()
# Store parent ids of matching task ids
matching_parent_ids = set()
# Store all entity ids of all entities to be able query hierarchical
# values.
whole_hierarchy_ids = set()
# Store parent id of each entity id
parent_id_by_entity_id = {}
for entity_info in task_parent_changes:
# Ignore entities with fewer than 2 parents
# NOTE entity itself is also part of "parents" value
parents = entity_info.get("parents") or []
if len(parents) < 2:
continue
parent_info = parents[1]
# Check if parent has entity type we care about.
if parent_info["entity_type"] not in interest_entity_types:
continue
task_ids.add(entity_info["entityId"])
matching_parent_ids.add(parent_info["entityId"])
# Store whole hierarchy of the task entity
prev_id = None
for item in parents:
item_id = item["entityId"]
whole_hierarchy_ids.add(item_id)
if prev_id is None:
prev_id = item_id
continue
parent_id_by_entity_id[prev_id] = item_id
if item["entityType"] == "show":
break
prev_id = item_id
# Just skip if nothing is interesting for our settings
if not matching_parent_ids:
return
# Query object type ids of parent ids for custom attribute
# definitions query
entities = session.query(
"select object_type_id from TypedContext where id in ({})".format(
self.join_query_keys(matching_parent_ids)
)
)
# Prepare task object id
task_object_id = object_types_by_name["task"]["id"]
# All object ids for which we're querying custom attribute definitions
object_type_ids = set()
object_type_ids.add(task_object_id)
for entity in entities:
object_type_ids.add(entity["object_type_id"])
attrs_by_obj_id, hier_attrs = self.attrs_configurations(
session, object_type_ids, interest_attributes
)
# Skip if all task attributes are not available
task_attrs = attrs_by_obj_id.get(task_object_id)
if not task_attrs:
return
# Skip attributes that are not in both hierarchical and nonhierarchical
# TODO be able to push values if hierarchical is available
for key in interest_attributes:
if key not in hier_attrs:
task_attrs.pop(key, None)
elif key not in task_attrs:
hier_attrs.pop(key)
# Skip if nothing remained
if not task_attrs:
return
# Do some preparations for custom attribute values query
attr_key_by_id = {}
nonhier_id_by_key = {}
hier_attr_ids = []
for key, attr_id in hier_attrs.items():
attr_key_by_id[attr_id] = key
hier_attr_ids.append(attr_id)
conf_ids = list(hier_attr_ids)
for key, attr_id in task_attrs.items():
attr_key_by_id[attr_id] = key
nonhier_id_by_key[key] = attr_id
conf_ids.append(attr_id)
# Query custom attribute values
# - result does not contain values for all entities, only what the
#   query call to the ftrack server returned
result = query_custom_attributes(
session, conf_ids, whole_hierarchy_ids
)
# Prepare variables where result will be stored
# - hierarchical values should not contain attributes with a value by
#   default
hier_values_by_entity_id = {
entity_id: {}
for entity_id in whole_hierarchy_ids
}
# - real values of custom attributes
values_by_entity_id = {
entity_id: {
attr_id: None
for attr_id in conf_ids
}
for entity_id in whole_hierarchy_ids
}
for item in result:
attr_id = item["configuration_id"]
entity_id = item["entity_id"]
value = item["value"]
values_by_entity_id[entity_id][attr_id] = value
if attr_id in hier_attr_ids and value is not None:
hier_values_by_entity_id[entity_id][attr_id] = value
# Prepare values for all task entities
# - going through all parents and storing the first set value
# - store None for those that are already known not to have any value
#   set at all
for task_id in tuple(task_ids):
for attr_id in hier_attr_ids:
entity_ids = []
value = None
entity_id = task_id
while value is None:
entity_value = hier_values_by_entity_id[entity_id]
if attr_id in entity_value:
value = entity_value[attr_id]
if value is None:
break
if value is None:
entity_ids.append(entity_id)
entity_id = parent_id_by_entity_id.get(entity_id)
if entity_id is None:
break
for entity_id in entity_ids:
hier_values_by_entity_id[entity_id][attr_id] = value
# Prepare changes to commit
changes = []
for task_id in tuple(task_ids):
parent_id = parent_id_by_entity_id[task_id]
for attr_id in hier_attr_ids:
attr_key = attr_key_by_id[attr_id]
nonhier_id = nonhier_id_by_key[attr_key]
# Real value of hierarchical attribute on parent
# - If is none then should be unset
real_parent_value = values_by_entity_id[parent_id][attr_id]
# Current hierarchical value of a task
# - Will be compared to real parent value
hier_value = hier_values_by_entity_id[task_id][attr_id]
# Parent value that can be inherited from its parent entity
parent_value = hier_values_by_entity_id[parent_id][attr_id]
# Task value of nonhierarchical custom attribute
nonhier_value = values_by_entity_id[task_id][nonhier_id]
if real_parent_value != hier_value:
changes.append({
"new_value": real_parent_value,
"attr_id": attr_id,
"entity_id": task_id,
"attr_key": attr_key
})
if parent_value != nonhier_value:
changes.append({
"new_value": parent_value,
"attr_id": nonhier_id,
"entity_id": task_id,
"attr_key": attr_key
})
self._commit_changes(session, changes)
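The parent-walk that resolves inherited hierarchical values above can be reduced to a standalone sketch (hypothetical entity ids and a single "fps" attribute instead of configuration ids):

```python
# Toy hierarchy: task -> shot -> seq; only "seq" has the value set.
parent_of = {"task": "shot", "shot": "seq", "seq": None}
values = {"task": {}, "shot": {}, "seq": {"fps": 25}}


def resolve(entity_id):
    # Walk up until an entity has the value set, then cache the found
    # value on every entity passed on the way (mirroring the loop above).
    chain = []
    current = entity_id
    value = None
    while current is not None:
        if "fps" in values[current]:
            value = values[current]["fps"]
            break
        chain.append(current)
        current = parent_of[current]
    for eid in chain:
        values[eid]["fps"] = value
    return value
```

Caching on the way up is what keeps the event handler from re-querying the same parents for every task under one branch.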
def _commit_changes(self, session, changes):
uncommited_changes = False
for idx, item in enumerate(changes):
new_value = item["new_value"]
attr_id = item["attr_id"]
entity_id = item["entity_id"]
attr_key = item["attr_key"]
entity_key = collections.OrderedDict()
entity_key["configuration_id"] = attr_id
entity_key["entity_id"] = entity_id
self._cached_changes.append({
"attr_key": attr_key,
"entity_id": entity_id,
"value": new_value,
"time": datetime.datetime.now()
})
if new_value is None:
op = ftrack_api.operation.DeleteEntityOperation(
"CustomAttributeValue",
entity_key
)
else:
op = ftrack_api.operation.UpdateEntityOperation(
"ContextCustomAttributeValue",
entity_key,
"value",
ftrack_api.symbol.NOT_SET,
new_value
)
session.recorded_operations.push(op)
self.log.info((
"Changing Custom Attribute \"{}\" to value"
" \"{}\" on entity: {}"
).format(attr_key, new_value, entity_id))
if (idx + 1) % 20 == 0:
uncommited_changes = False
try:
session.commit()
except Exception:
session.rollback()
self.log.warning(
"Changing of values failed.", exc_info=True
)
else:
uncommited_changes = True
if uncommited_changes:
try:
session.commit()
except Exception:
session.rollback()
self.log.warning("Changing of values failed.", exc_info=True)
def process_attribute_changes(
self, session, object_types_by_name,
interesting_data, changed_keys_by_object_id,
interest_entity_types, interest_attributes
):
# Prepare task object id
task_object_id = object_types_by_name["task"]["id"]
@ -216,13 +519,13 @@ class PushFrameValuesToTaskEvent(BaseEvent):
task_entity_ids.add(task_id)
parent_id_by_task_id[task_id] = task_entity["parent_id"]
self.finalize(
self.finalize_attribute_changes(
session, interesting_data,
changed_keys, attrs_by_obj_id, hier_attrs,
task_entity_ids, parent_id_by_task_id
)
def finalize(
def finalize_attribute_changes(
self, session, interesting_data,
changed_keys, attrs_by_obj_id, hier_attrs,
task_entity_ids, parent_id_by_task_id
@ -248,6 +551,7 @@ class PushFrameValuesToTaskEvent(BaseEvent):
session, attr_ids, entity_ids, task_entity_ids, hier_attrs
)
changes = []
for entity_id, current_values in current_values_by_id.items():
parent_id = parent_id_by_task_id.get(entity_id)
if not parent_id:
@ -272,39 +576,13 @@ class PushFrameValuesToTaskEvent(BaseEvent):
if new_value == old_value:
continue
changes.append({
"new_value": new_value,
"attr_id": attr_id,
"entity_id": entity_id,
"attr_key": attr_key
})
self._commit_changes(session, changes)
def filter_changes(
self, session, event, entities_info, interest_attributes


@ -1,5 +1,6 @@
import os
import re
import json
from openpype.modules.ftrack.lib import BaseAction, statics_icon
from openpype.api import Anatomy, get_project_settings
@ -84,6 +85,9 @@ class CreateProjectFolders(BaseAction):
}
try:
if isinstance(project_folder_structure, str):
project_folder_structure = json.loads(project_folder_structure)
# Get paths based on presets
basic_paths = self.get_path_items(project_folder_structure)
self.create_folders(basic_paths, project_entity)


@ -13,7 +13,8 @@ from .custom_attributes import (
default_custom_attributes_definition,
app_definitions_from_app_manager,
tool_definitions_from_app_manager,
get_openpype_attr
get_openpype_attr,
query_custom_attributes
)
from . import avalon_sync
@ -37,6 +38,7 @@ __all__ = (
"app_definitions_from_app_manager",
"tool_definitions_from_app_manager",
"get_openpype_attr",
"query_custom_attributes",
"avalon_sync",

View file

@ -81,3 +81,60 @@ def get_openpype_attr(session, split_hierarchical=True, query_keys=None):
return custom_attributes, hier_custom_attributes
return custom_attributes
def join_query_keys(keys):
"""Helper to join keys to query."""
return ",".join(["\"{}\"".format(key) for key in keys])
def query_custom_attributes(session, conf_ids, entity_ids, table_name=None):
"""Query custom attribute values from ftrack database.
The result of the ftrack call method may differ based on the used table
name and the version of the ftrack server.
Args:
session(ftrack_api.Session): Connected ftrack session.
conf_ids(list, set, tuple): Configuration (attribute) ids which are
queried.
entity_ids(list, set, tuple): Entity ids for which values are queried.
table_name(str): Table name from which values are queried. Not
recommended to change unless you know what it means.
"""
output = []
# Just skip
if not conf_ids or not entity_ids:
return output
if table_name is None:
table_name = "ContextCustomAttributeValue"
# Prepare values to query
attributes_joined = join_query_keys(conf_ids)
attributes_len = len(conf_ids)
# Query values in chunks
chunk_size = int(5000 / attributes_len)
# Make sure entity_ids is `list` for chunk selection
entity_ids = list(entity_ids)
for idx in range(0, len(entity_ids), chunk_size):
entity_ids_joined = join_query_keys(
entity_ids[idx:idx + chunk_size]
)
call_expr = [{
"action": "query",
"expression": (
"select value, entity_id from {}"
" where entity_id in ({}) and configuration_id in ({})"
).format(table_name, entity_ids_joined, attributes_joined)
}]
if hasattr(session, "call"):
[result] = session.call(call_expr)
else:
[result] = session._call(call_expr)
for item in result["data"]:
output.append(item)
return output
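The chunking in `query_custom_attributes` keeps each call's expression size bounded: with `chunk_size = int(5000 / attributes_len)`, one query never references more than roughly 5000 id values in total (entity ids times attribute ids). That pattern can be isolated as a small sketch; `join_query_keys` matches the helper above, while `iter_id_chunks` is our illustrative name:

```python
def join_query_keys(keys):
    """Join keys into a quoted, comma-separated list for a query."""
    return ",".join(['"{}"'.format(key) for key in keys])


def iter_id_chunks(entity_ids, conf_ids, limit=5000):
    """Yield joined entity id chunks sized so that a single query
    references at most roughly `limit` values in total."""
    chunk_size = int(limit / len(conf_ids))
    # Make sure entity_ids is a list so slicing works on any iterable
    entity_ids = list(entity_ids)
    for idx in range(0, len(entity_ids), chunk_size):
        yield join_query_keys(entity_ids[idx:idx + chunk_size])
```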

View file

@ -51,7 +51,7 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):
families = instance.data.get("families")
add_ftrack_family = profile["add_ftrack_family"]
additional_filters = profile.get("additional_filters")
additional_filters = profile.get("advanced_filtering")
if additional_filters:
add_ftrack_family = self._get_add_ftrack_f_from_addit_filters(
additional_filters,

View file

@ -975,11 +975,31 @@ class ExtractReview(pyblish.api.InstancePlugin):
# NOTE Skipped using instance's resolution
full_input_path_single_file = temp_data["full_input_path_single_file"]
input_data = ffprobe_streams(
full_input_path_single_file, self.log
)[0]
input_width = int(input_data["width"])
input_height = int(input_data["height"])
try:
streams = ffprobe_streams(
full_input_path_single_file, self.log
)
except Exception:
raise AssertionError((
"FFprobe couldn't read information about input file: \"{}\""
).format(full_input_path_single_file))
# Try to find first stream with defined 'width' and 'height'
# - this is to avoid order of streams where audio can be as first
# - there may be a better way (checking `codec_type`?)
input_width = None
input_height = None
for stream in streams:
if "width" in stream and "height" in stream:
input_width = int(stream["width"])
input_height = int(stream["height"])
break
# Raise exception if no stream defined input resolution
if input_width is None:
raise AssertionError((
"FFprobe couldn't read resolution from input file: \"{}\""
).format(full_input_path_single_file))
# NOTE Setting only one of `width` or `height` is not allowed
# - settings value can't be None but has value of 0
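The stream search introduced above (take the first ffprobe stream that defines both `width` and `height`, so a leading audio stream does not break resolution detection) can be factored into a helper. The stream dictionaries below are simplified stand-ins for real ffprobe output:

```python
def first_defined_resolution(streams):
    """Return (width, height) from the first stream defining both keys.

    Avoids trusting stream order, where audio may come before video;
    returns None when no stream carries a resolution.
    """
    for stream in streams:
        if "width" in stream and "height" in stream:
            return int(stream["width"]), int(stream["height"])
    return None
```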

View file

@ -26,9 +26,23 @@ class ExtractReviewSlate(openpype.api.Extractor):
slate_path = inst_data.get("slateFrame")
ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
slate_stream = openpype.lib.ffprobe_streams(slate_path, self.log)[0]
slate_width = slate_stream["width"]
slate_height = slate_stream["height"]
slate_streams = openpype.lib.ffprobe_streams(slate_path, self.log)
# Try to find first stream with defined 'width' and 'height'
# - this is to avoid order of streams where audio can be as first
# - there may be a better way (checking `codec_type`?)
slate_width = None
slate_height = None
for slate_stream in slate_streams:
if "width" in slate_stream and "height" in slate_stream:
slate_width = int(slate_stream["width"])
slate_height = int(slate_stream["height"])
break
# Raise exception if no stream defined input resolution
if slate_width is None:
raise AssertionError((
"FFprobe couldn't read resolution from input file: \"{}\""
).format(slate_path))
if "reviewToWidth" in inst_data:
use_legacy_code = True
@ -309,16 +323,29 @@ class ExtractReviewSlate(openpype.api.Extractor):
)
return codec_args
codec_name = streams[0].get("codec_name")
# Try to find first stream that is not an audio
no_audio_stream = None
for stream in streams:
if stream.get("codec_type") != "audio":
no_audio_stream = stream
break
if no_audio_stream is None:
self.log.warning((
"Couldn't find stream that is not an audio from file \"{}\""
).format(full_input_path))
return codec_args
codec_name = no_audio_stream.get("codec_name")
if codec_name:
codec_args.append("-codec:v {}".format(codec_name))
profile_name = streams[0].get("profile")
profile_name = no_audio_stream.get("profile")
if profile_name:
profile_name = profile_name.replace(" ", "_").lower()
codec_args.append("-profile:v {}".format(profile_name))
pix_fmt = streams[0].get("pix_fmt")
pix_fmt = no_audio_stream.get("pix_fmt")
if pix_fmt:
codec_args.append("-pix_fmt {}".format(pix_fmt))
return codec_args

View file

@ -3,9 +3,13 @@
"ValidateExpectedFiles": {
"enabled": true,
"active": true,
"families": ["render"],
"targets": ["deadline"],
"allow_user_override": true
"allow_user_override": true,
"families": [
"render"
],
"targets": [
"deadline"
]
},
"MayaSubmitDeadline": {
"enabled": true,
@ -15,7 +19,9 @@
"use_published": true,
"asset_dependencies": true,
"group": "none",
"limit": []
"limit": [],
"jobInfo": {},
"pluginInfo": {}
},
"NukeSubmitDeadline": {
"enabled": true,
@ -29,6 +35,8 @@
"group": "",
"department": "",
"use_gpu": true,
"env_allowed_keys": [],
"env_search_replace_values": {},
"limit_groups": {}
},
"HarmonySubmitDeadline": {

View file

@ -229,7 +229,6 @@
"standalonepublisher"
],
"families": [
"review",
"plate"
],
"tasks": [],
@ -259,6 +258,45 @@
"tasks": [],
"add_ftrack_family": true,
"advanced_filtering": []
},
{
"hosts": [
"tvpaint"
],
"families": [
"renderPass"
],
"tasks": [],
"add_ftrack_family": false,
"advanced_filtering": []
},
{
"hosts": [
"tvpaint"
],
"families": [],
"tasks": [],
"add_ftrack_family": true,
"advanced_filtering": []
},
{
"hosts": [
"nuke"
],
"families": [
"write",
"render"
],
"tasks": [],
"add_ftrack_family": false,
"advanced_filtering": [
{
"families": [
"review"
],
"add_ftrack_family": true
}
]
}
]
},
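The new nuke profile above shows how `advanced_filtering` interacts with `add_ftrack_family`: write/render instances default to `false`, but a filter item whose families all match the instance flips the value. A simplified reading of that resolution follows; the actual logic lives in the collector's `_get_add_ftrack_f_from_addit_filters`, which this sketch only approximates:

```python
def resolve_add_ftrack_family(profile, instance_families):
    """Resolve 'add_ftrack_family', letting 'advanced_filtering' items
    override the profile default when all their families match."""
    add_family = profile["add_ftrack_family"]
    for item in profile.get("advanced_filtering") or []:
        # An item applies when every family it lists is on the instance
        if set(item["families"]).issubset(instance_families):
            add_family = item["add_ftrack_family"]
    return add_family
```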

View file

@ -172,6 +172,8 @@
"deadline_group": "",
"deadline_chunk_size": 1,
"deadline_priority": 50,
"publishing_script": "",
"skip_integration_repre_list": [],
"aov_filter": {
"maya": [
".+(?:\\.|_)([Bb]eauty)(?:\\.|_).*"
@ -184,6 +186,10 @@
".*"
]
}
},
"CleanUp": {
"paterns": [],
"remove_temp_renders": false
}
},
"tools": {
@ -271,28 +277,7 @@
}
}
},
"project_folder_structure": {
"__project_root__": {
"prod": {},
"resources": {
"footage": {
"plates": {},
"offline": {}
},
"audio": {},
"art_dept": {}
},
"editorial": {},
"assets[ftrack.Library]": {
"characters[ftrack]": {},
"locations[ftrack]": {}
},
"shots[ftrack.Sequence]": {
"scripts": {},
"editorial[ftrack.Folder]": {}
}
}
},
"project_folder_structure": "{\"__project_root__\": {\"prod\": {}, \"resources\": {\"footage\": {\"plates\": {}, \"offline\": {}}, \"audio\": {}, \"art_dept\": {}}, \"editorial\": {}, \"assets[ftrack.Library]\": {\"characters[ftrack]\": {}, \"locations[ftrack]\": {}}, \"shots[ftrack.Sequence]\": {\"scripts\": {}, \"editorial[ftrack.Folder]\": {}}}}",
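With `project_folder_structure` now stored as a JSON string rather than a nested dict, consumers must decode it before walking the tree, which is exactly what the `json.loads` guard added to `CreateProjectFolders` does. A minimal sketch of that normalization:

```python
import json


def load_folder_structure(project_folder_structure):
    """Return the folder structure as a dict, decoding it when the
    settings stored it as a JSON string."""
    if isinstance(project_folder_structure, str):
        return json.loads(project_folder_structure)
    return project_folder_structure
```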
"sync_server": {
"enabled": true,
"config": {

View file

@ -10,11 +10,22 @@
},
"create": {
"CreateWriteRender": {
"fpath_template": "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}"
"fpath_template": "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}",
"defaults": [
"Main",
"Mask"
]
},
"CreateWritePrerender": {
"fpath_template": "{work}/prerenders/nuke/{subset}/{subset}.{frame}.{ext}",
"use_range_limit": true
"use_range_limit": true,
"defaults": [
"Key01",
"Bg01",
"Fg01",
"Branch01",
"Part01"
]
}
},
"publish": {

View file

@ -165,6 +165,58 @@
],
"output": []
}
},
"CollectEditorial": {
"source_dir": "",
"extensions": [
"mov",
"mp4"
]
},
"CollectHierarchyInstance": {
"shot_rename_template": "{project[code]}_{_sequence_}_{_shot_}",
"shot_rename_search_patterns": {
"_sequence_": "(\\d{4})(?=_\\d{4})",
"_shot_": "(\\d{4})(?!_\\d{4})"
},
"shot_add_hierarchy": {
"parents_path": "{project}/{folder}/{sequence}",
"parents": {
"project": "{project[name]}",
"sequence": "{_sequence_}",
"folder": "shots"
}
},
"shot_add_tasks": {}
},
"shot_add_tasks": {
"custom_start_frame": 0,
"timeline_frame_start": 900000,
"timeline_frame_offset": 0,
"subsets": {
"referenceMain": {
"family": "review",
"families": [
"clip"
],
"extensions": [
"mp4"
],
"version": 0,
"keepSequence": false
},
"audioMain": {
"family": "audio",
"families": [
"clip"
],
"extensions": [
"wav"
],
"version": 0,
"keepSequence": false
}
}
}
}
}

View file

@ -807,7 +807,6 @@
"environment": {},
"variants": {
"2-83": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Blender Foundation\\Blender 2.83\\blender.exe"
@ -829,7 +828,6 @@
"environment": {}
},
"2-90": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Blender Foundation\\Blender 2.90\\blender.exe"
@ -851,7 +849,6 @@
"environment": {}
},
"2-91": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Blender Foundation\\Blender 2.91\\blender.exe"
@ -891,7 +888,6 @@
"20": {
"enabled": true,
"variant_label": "20",
"use_python_2": false,
"executables": {
"windows": [],
"darwin": [],
@ -907,7 +903,6 @@
"17": {
"enabled": true,
"variant_label": "17",
"use_python_2": false,
"executables": {
"windows": [],
"darwin": [
@ -932,7 +927,6 @@
"environment": {},
"variants": {
"animation_11-64bits": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\TVPaint Developpement\\TVPaint Animation 11 (64bits)\\TVPaint Animation 11 (64bits).exe"
@ -948,7 +942,6 @@
"environment": {}
},
"animation_11-32bits": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files (x86)\\TVPaint Developpement\\TVPaint Animation 11 (32bits)\\TVPaint Animation 11 (32bits).exe"
@ -982,7 +975,6 @@
"2020": {
"enabled": true,
"variant_label": "2020",
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Adobe\\Adobe Photoshop 2020\\Photoshop.exe"
@ -1000,7 +992,6 @@
"2021": {
"enabled": true,
"variant_label": "2021",
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Adobe\\Adobe Photoshop 2021\\Photoshop.exe"
@ -1030,7 +1021,6 @@
"2020": {
"enabled": true,
"variant_label": "2020",
"use_python_2": false,
"executables": {
"windows": [
""
@ -1048,7 +1038,6 @@
"2021": {
"enabled": true,
"variant_label": "2021",
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Adobe\\Adobe After Effects 2021\\Support Files\\AfterFX.exe"
@ -1095,7 +1084,7 @@
"unreal": {
"enabled": true,
"label": "Unreal Editor",
"icon": "{}/app_icons/ue4.png'",
"icon": "{}/app_icons/ue4.png",
"host_name": "unreal",
"environment": {},
"variants": {

View file

@ -111,6 +111,7 @@ from .enum_entity import (
from .list_entity import ListEntity
from .dict_immutable_keys_entity import DictImmutableKeysEntity
from .dict_mutable_keys_entity import DictMutableKeysEntity
from .dict_conditional import DictConditionalEntity
from .anatomy_entities import AnatomyEntity
@ -166,5 +167,7 @@ __all__ = (
"DictMutableKeysEntity",
"DictConditionalEntity",
"AnatomyEntity"
)

View file

@ -136,6 +136,7 @@ class BaseItemEntity(BaseEntity):
# Override state defines which values are used, saved and how.
# TODO convert to private attribute
self._override_state = OverrideState.NOT_DEFINED
self._ignore_missing_defaults = None
# These attributes may change values during existence of an object
# Default value, studio override values and project override values
@ -279,8 +280,13 @@ class BaseItemEntity(BaseEntity):
self, "Dynamic entity can't require restart."
)
@abstractproperty
def root_key(self):
"""Root is represented as this dictionary key."""
pass
@abstractmethod
def set_override_state(self, state):
def set_override_state(self, state, ignore_missing_defaults):
"""Set override state and trigger it on children.
Method discards all changes in hierarchy and uses values, metadata
@ -290,8 +296,15 @@ class BaseItemEntity(BaseEntity):
Should start on the root entity; when triggered, it must be called on
all entities in the hierarchy.
Argument `ignore_missing_defaults` should be used when the entity has
children that are not saved or used all the time, but the override state
must be changed and children must have some default value.
Args:
state (OverrideState): State to which should be data changed.
ignore_missing_defaults (bool): Ignore missing default values.
Entity won't raise `DefaultsNotDefined` and
`StudioDefaultsNotDefined`.
"""
pass
@ -866,6 +879,10 @@ class ItemEntity(BaseItemEntity):
"""Call save on root item."""
self.root_item.save()
@property
def root_key(self):
return self.root_item.root_key
def schema_validations(self):
if not self.label and self.use_label_wrap:
reason = (
@ -885,7 +902,11 @@ class ItemEntity(BaseItemEntity):
def create_schema_object(self, *args, **kwargs):
"""Reference method for creation of entities defined in RootEntity."""
return self.root_item.create_schema_object(*args, **kwargs)
return self.schema_hub.create_schema_object(*args, **kwargs)
@property
def schema_hub(self):
return self.root_item.schema_hub
def get_entity_from_path(self, path):
return self.root_item.get_entity_from_path(path)

View file

@ -0,0 +1,707 @@
import copy
from .lib import (
OverrideState,
NOT_SET
)
from openpype.settings.constants import (
METADATA_KEYS,
M_OVERRIDEN_KEY,
KEY_REGEX
)
from . import (
BaseItemEntity,
ItemEntity,
GUIEntity
)
from .exceptions import (
SchemaDuplicatedKeys,
EntitySchemaError,
InvalidKeySymbols
)
class DictConditionalEntity(ItemEntity):
"""Entity represents a dictionary with only one persistent key definition.
The persistent key is an enumerator which defines the rest of the children
under the dictionary. There is no possibility of shared children.
Entity's keys can't be removed or added, but they may change based on
the persistent key. If you change values manually (key by key), make sure
you change the value of the persistent key first. It is recommended to
use the `set` method, which handles this for you.
It is possible to use the entity in a similar way as a `dict` object.
Returned values are not real settings values but entities representing
the value.
"""
schema_types = ["dict-conditional"]
_default_label_wrap = {
"use_label_wrap": False,
"collapsible": False,
"collapsed": True
}
def __getitem__(self, key):
"""Return entity under key."""
if key == self.enum_key:
return self.enum_entity
return self.non_gui_children[self.current_enum][key]
def __setitem__(self, key, value):
"""Set value of item under key."""
if key == self.enum_key:
child_obj = self.enum_entity
else:
child_obj = self.non_gui_children[self.current_enum][key]
child_obj.set(value)
def __iter__(self):
"""Iter through keys."""
for key in self.keys():
yield key
def __contains__(self, key):
"""Check if key is available."""
if key == self.enum_key:
return True
return key in self.non_gui_children[self.current_enum]
def get(self, key, default=None):
"""Safe entity getter by key."""
if key == self.enum_key:
return self.enum_entity
return self.non_gui_children[self.current_enum].get(key, default)
def keys(self):
"""Entity's keys."""
keys = list(self.non_gui_children[self.current_enum].keys())
keys.insert(0, self.enum_key)
return keys
def values(self):
"""Children entities."""
values = [
self.enum_entity
]
for child_entity in self.non_gui_children[self.current_enum].values():
values.append(child_entity)
return values
def items(self):
"""Children entities paired with their key (key, value)."""
items = [
(self.enum_key, self.enum_entity)
]
for key, value in self.non_gui_children[self.current_enum].items():
items.append((key, value))
return items
def set(self, value):
"""Set value."""
new_value = self.convert_to_valid_type(value)
# First change value of enum key if available
if self.enum_key in new_value:
self.enum_entity.set(new_value.pop(self.enum_key))
for _key, _value in new_value.items():
self.non_gui_children[self.current_enum][_key].set(_value)
def _item_initalization(self):
self._default_metadata = NOT_SET
self._studio_override_metadata = NOT_SET
self._project_override_metadata = NOT_SET
self._ignore_child_changes = False
# `current_metadata` are not set when the schema is loaded
# - the only metadata stored with a dict item are group overrides in
# M_OVERRIDEN_KEY
self._current_metadata = {}
self._metadata_are_modified = False
# Entity must be group or in group
if (
self.group_item is None
and not self.is_dynamic_item
and not self.is_in_dynamic_item
):
self.is_group = True
# Children are stored by key as keys are immutable and are defined by
# schema
self.valid_value_types = (dict, )
self.children = {}
self.non_gui_children = {}
self.gui_layout = {}
if self.is_dynamic_item:
self.require_key = False
self.enum_key = self.schema_data.get("enum_key")
self.enum_label = self.schema_data.get("enum_label")
self.enum_children = self.schema_data.get("enum_children")
self.enum_entity = None
self.highlight_content = self.schema_data.get(
"highlight_content", False
)
self.show_borders = self.schema_data.get("show_borders", True)
self._add_children()
@property
def current_enum(self):
"""Current value of enum entity.
This value defines which children are used.
"""
if self.enum_entity is None:
return None
return self.enum_entity.value
def schema_validations(self):
"""Validation of schema data."""
# Enum key must be defined
if self.enum_key is None:
raise EntitySchemaError(self, "Key 'enum_key' is not set.")
# Validate type of enum children
if not isinstance(self.enum_children, list):
raise EntitySchemaError(
self, "Key 'enum_children' must be a list. Got: {}".format(
str(type(self.enum_children))
)
)
# Without defined enum children entity has nothing to do
if not self.enum_children:
raise EntitySchemaError(self, (
"Key 'enum_children' has an empty value. Entity can't work"
" without children definitions."
))
children_def_keys = []
for children_def in self.enum_children:
if not isinstance(children_def, dict):
raise EntitySchemaError((
"Children definition under key 'enum_children' must"
" be a dictionary."
))
if "key" not in children_def:
raise EntitySchemaError((
"Children definition under key 'enum_children' is missing"
" the 'key' definition."
))
# We don't validate regex of these keys because they will be stored
# as value at the end.
key = children_def["key"]
if key in children_def_keys:
# TODO this should probably be a different exception?
raise SchemaDuplicatedKeys(self, key)
children_def_keys.append(key)
# Validate key duplications per each enum item
for children in self.children.values():
children_keys = set()
children_keys.add(self.enum_key)
for child_entity in children:
if not isinstance(child_entity, BaseItemEntity):
continue
elif child_entity.key not in children_keys:
children_keys.add(child_entity.key)
else:
raise SchemaDuplicatedKeys(self, child_entity.key)
# Enum key must match key regex
if not KEY_REGEX.match(self.enum_key):
raise InvalidKeySymbols(self.path, self.enum_key)
# Validate all remaining keys with key regex
for children_by_key in self.non_gui_children.values():
for key in children_by_key.keys():
if not KEY_REGEX.match(key):
raise InvalidKeySymbols(self.path, key)
super(DictConditionalEntity, self).schema_validations()
# Trigger schema validation on children entities
for children in self.children.values():
for child_obj in children:
child_obj.schema_validations()
def on_change(self):
"""Update metadata on change and pass change to parent."""
self._update_current_metadata()
for callback in self.on_change_callbacks:
callback()
self.parent.on_child_change(self)
def on_child_change(self, child_obj):
"""Trigger on change callback if child changes are not ignored."""
if self._ignore_child_changes:
return
if (
child_obj is self.enum_entity
or child_obj in self.children[self.current_enum]
):
self.on_change()
def _add_children(self):
"""Add children from schema data and prepare enum items.
Each enum item must have its children defined. None are shared across
all enum items.
Nice to have: the ability to share keys across all enum items.
All children are stored by their enum item.
"""
# Skip if they are not defined
# - schema validations should raise an exception
if not self.enum_children or not self.enum_key:
return
valid_enum_items = []
for item in self.enum_children:
if isinstance(item, dict) and "key" in item:
valid_enum_items.append(item)
enum_items = []
for item in valid_enum_items:
item_key = item["key"]
item_label = item.get("label") or item_key
enum_items.append({item_key: item_label})
if not enum_items:
return
# Create Enum child first
enum_key = self.enum_key or "invalid"
enum_schema = {
"type": "enum",
"multiselection": False,
"enum_items": enum_items,
"key": enum_key,
"label": self.enum_label or enum_key
}
enum_entity = self.create_schema_object(enum_schema, self)
self.enum_entity = enum_entity
# Create children per each enum item
for item in valid_enum_items:
item_key = item["key"]
# Make sure all keys have a value set in these variables
# - key 'children' is optional
self.non_gui_children[item_key] = {}
self.children[item_key] = []
self.gui_layout[item_key] = []
children = item.get("children") or []
for children_schema in children:
child_obj = self.create_schema_object(children_schema, self)
self.children[item_key].append(child_obj)
self.gui_layout[item_key].append(child_obj)
if isinstance(child_obj, GUIEntity):
continue
self.non_gui_children[item_key][child_obj.key] = child_obj
def get_child_path(self, child_obj):
"""Get hierarchical path of child entity.
Child must be one of the entity's direct children. This must be possible
for any child, even if not from the current enum value.
"""
if child_obj is self.enum_entity:
return "/".join([self.path, self.enum_key])
result_key = None
for children in self.non_gui_children.values():
for key, _child_obj in children.items():
if _child_obj is child_obj:
result_key = key
break
if result_key is None:
raise ValueError("Didn't find child {}".format(child_obj))
return "/".join([self.path, result_key])
def _update_current_metadata(self):
current_metadata = {}
for key, child_obj in self.non_gui_children[self.current_enum].items():
if self._override_state is OverrideState.DEFAULTS:
break
if not child_obj.is_group:
continue
if (
self._override_state is OverrideState.STUDIO
and not child_obj.has_studio_override
):
continue
if (
self._override_state is OverrideState.PROJECT
and not child_obj.has_project_override
):
continue
if M_OVERRIDEN_KEY not in current_metadata:
current_metadata[M_OVERRIDEN_KEY] = []
current_metadata[M_OVERRIDEN_KEY].append(key)
# Define if current metadata are available for current override state
metadata = NOT_SET
if self._override_state is OverrideState.STUDIO:
metadata = self._studio_override_metadata
elif self._override_state is OverrideState.PROJECT:
metadata = self._project_override_metadata
if metadata is NOT_SET:
metadata = {}
self._metadata_are_modified = current_metadata != metadata
self._current_metadata = current_metadata
def set_override_state(self, state, ignore_missing_defaults):
# Trigger override state change of root if is not same
if self.root_item.override_state is not state:
self.root_item.set_override_state(state)
return
# Change has/had override states
self._override_state = state
self._ignore_missing_defaults = ignore_missing_defaults
# Set override state on enum entity first
self.enum_entity.set_override_state(state, ignore_missing_defaults)
# Set override state on other enum children
# - these must not raise exception about missing defaults
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.set_override_state(state, True)
self._update_current_metadata()
@property
def value(self):
output = {
self.enum_key: self.enum_entity.value
}
for key, child_obj in self.non_gui_children[self.current_enum].items():
output[key] = child_obj.value
return output
@property
def has_unsaved_changes(self):
if self._metadata_are_modified:
return True
return self._child_has_unsaved_changes
@property
def _child_has_unsaved_changes(self):
if self.enum_entity.has_unsaved_changes:
return True
for child_obj in self.non_gui_children[self.current_enum].values():
if child_obj.has_unsaved_changes:
return True
return False
@property
def has_studio_override(self):
return self._child_has_studio_override
@property
def _child_has_studio_override(self):
if self._override_state >= OverrideState.STUDIO:
if self.enum_entity.has_studio_override:
return True
for child_obj in self.non_gui_children[self.current_enum].values():
if child_obj.has_studio_override:
return True
return False
@property
def has_project_override(self):
return self._child_has_project_override
@property
def _child_has_project_override(self):
if self._override_state >= OverrideState.PROJECT:
if self.enum_entity.has_project_override:
return True
for child_obj in self.non_gui_children[self.current_enum].values():
if child_obj.has_project_override:
return True
return False
def settings_value(self):
if self._override_state is OverrideState.NOT_DEFINED:
return NOT_SET
if self._override_state is OverrideState.DEFAULTS:
children_items = [
(self.enum_key, self.enum_entity)
]
for item in self.non_gui_children[self.current_enum].items():
children_items.append(item)
output = {}
for key, child_obj in children_items:
child_value = child_obj.settings_value()
if not child_obj.is_file and not child_obj.file_item:
for _key, _value in child_value.items():
new_key = "/".join([key, _key])
output[new_key] = _value
else:
output[key] = child_value
return output
if self.is_group:
if self._override_state is OverrideState.STUDIO:
if not self.has_studio_override:
return NOT_SET
elif self._override_state is OverrideState.PROJECT:
if not self.has_project_override:
return NOT_SET
output = {}
children_items = [
(self.enum_key, self.enum_entity)
]
for item in self.non_gui_children[self.current_enum].items():
children_items.append(item)
for key, child_obj in children_items:
value = child_obj.settings_value()
if value is not NOT_SET:
output[key] = value
if not output:
return NOT_SET
output.update(self._current_metadata)
return output
def _prepare_value(self, value):
if value is NOT_SET or self.enum_key not in value:
return NOT_SET, NOT_SET
enum_value = value.get(self.enum_key)
if enum_value not in self.non_gui_children:
return NOT_SET, NOT_SET
# Create copy of value before popping values
value = copy.deepcopy(value)
metadata = {}
for key in METADATA_KEYS:
if key in value:
metadata[key] = value.pop(key)
enum_value = value.get(self.enum_key)
old_metadata = metadata.get(M_OVERRIDEN_KEY)
if old_metadata:
old_metadata_set = set(old_metadata)
new_metadata = []
non_gui_children = self.non_gui_children[enum_value]
for key in non_gui_children.keys():
if key in old_metadata:
new_metadata.append(key)
old_metadata_set.remove(key)
for key in old_metadata_set:
new_metadata.append(key)
metadata[M_OVERRIDEN_KEY] = new_metadata
return value, metadata
def update_default_value(self, value):
"""Update default values.
Not an api method, should be called by parent.
"""
value = self._check_update_value(value, "default")
self.has_default_value = value is not NOT_SET
# TODO add value validation
value, metadata = self._prepare_value(value)
self._default_metadata = metadata
if value is NOT_SET:
self.enum_entity.update_default_value(value)
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.update_default_value(value)
return
value_keys = set(value.keys())
enum_value = value[self.enum_key]
expected_keys = set(self.non_gui_children[enum_value].keys())
expected_keys.add(self.enum_key)
unknown_keys = value_keys - expected_keys
if unknown_keys:
self.log.warning(
"{} Unknown keys in default values: {}".format(
self.path,
", ".join("\"{}\"".format(key) for key in unknown_keys)
)
)
self.enum_entity.update_default_value(enum_value)
for children_by_key in self.non_gui_children.values():
for key, child_obj in children_by_key.items():
child_value = value.get(key, NOT_SET)
child_obj.update_default_value(child_value)
def update_studio_value(self, value):
"""Update studio override values.
Not an api method, should be called by parent.
"""
value = self._check_update_value(value, "studio override")
value, metadata = self._prepare_value(value)
self._studio_override_metadata = metadata
self.had_studio_override = metadata is not NOT_SET
if value is NOT_SET:
self.enum_entity.update_studio_value(value)
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.update_studio_value(value)
return
value_keys = set(value.keys())
enum_value = value[self.enum_key]
expected_keys = set(self.non_gui_children[enum_value])
expected_keys.add(self.enum_key)
unknown_keys = value_keys - expected_keys
if unknown_keys:
self.log.warning(
"{} Unknown keys in studio overrides: {}".format(
self.path,
", ".join("\"{}\"".format(key) for key in unknown_keys)
)
)
self.enum_entity.update_studio_value(enum_value)
for children_by_key in self.non_gui_children.values():
for key, child_obj in children_by_key.items():
child_value = value.get(key, NOT_SET)
child_obj.update_studio_value(child_value)
def update_project_value(self, value):
"""Update project override values.
Not an api method, should be called by parent.
"""
value = self._check_update_value(value, "project override")
value, metadata = self._prepare_value(value)
self._project_override_metadata = metadata
self.had_project_override = metadata is not NOT_SET
if value is NOT_SET:
self.enum_entity.update_project_value(value)
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.update_project_value(value)
return
value_keys = set(value.keys())
enum_value = value[self.enum_key]
expected_keys = set(self.non_gui_children[enum_value])
expected_keys.add(self.enum_key)
unknown_keys = value_keys - expected_keys
if unknown_keys:
self.log.warning(
"{} Unknown keys in project overrides: {}".format(
self.path,
", ".join("\"{}\"".format(key) for key in unknown_keys)
)
)
self.enum_entity.update_project_value(enum_value)
for children_by_key in self.non_gui_children.values():
for key, child_obj in children_by_key.items():
child_value = value.get(key, NOT_SET)
child_obj.update_project_value(child_value)
def _discard_changes(self, on_change_trigger):
self._ignore_child_changes = True
self.enum_entity.discard_changes(on_change_trigger)
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.discard_changes(on_change_trigger)
self._ignore_child_changes = False
def _add_to_studio_default(self, on_change_trigger):
self._ignore_child_changes = True
self.enum_entity.add_to_studio_default(on_change_trigger)
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.add_to_studio_default(on_change_trigger)
self._ignore_child_changes = False
self._update_current_metadata()
self.parent.on_child_change(self)
def _remove_from_studio_default(self, on_change_trigger):
self._ignore_child_changes = True
self.enum_entity.remove_from_studio_default(on_change_trigger)
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.remove_from_studio_default(on_change_trigger)
self._ignore_child_changes = False
def _add_to_project_override(self, on_change_trigger):
self._ignore_child_changes = True
self.enum_entity.add_to_project_override(on_change_trigger)
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.add_to_project_override(on_change_trigger)
self._ignore_child_changes = False
self._update_current_metadata()
self.parent.on_child_change(self)
def _remove_from_project_override(self, on_change_trigger):
if self._override_state is not OverrideState.PROJECT:
return
self._ignore_child_changes = True
self.enum_entity.remove_from_project_override(on_change_trigger)
for children_by_key in self.non_gui_children.values():
for child_obj in children_by_key.values():
child_obj.remove_from_project_override(on_change_trigger)
self._ignore_child_changes = False
def reset_callbacks(self):
"""Reset registered callbacks on entity and children."""
super(DictConditionalEntity, self).reset_callbacks()
for children in self.children.values():
for child_entity in children:
child_entity.reset_callbacks()


@ -1,4 +1,5 @@
import copy
import collections
from .lib import (
WRAPPER_TYPES,
@ -138,7 +139,16 @@ class DictImmutableKeysEntity(ItemEntity):
method when handling gui wrappers.
"""
added_children = []
for children_schema in schema_data["children"]:
children_deque = collections.deque()
for _children_schema in schema_data["children"]:
children_schemas = self.schema_hub.resolve_schema_data(
_children_schema
)
for children_schema in children_schemas:
children_deque.append(children_schema)
while children_deque:
children_schema = children_deque.popleft()
if children_schema["type"] in WRAPPER_TYPES:
_children_schema = copy.deepcopy(children_schema)
wrapper_children = self._add_children(
@ -248,7 +258,7 @@ class DictImmutableKeysEntity(ItemEntity):
self._metadata_are_modified = current_metadata != metadata
self._current_metadata = current_metadata
def set_override_state(self, state):
def set_override_state(self, state, ignore_missing_defaults):
# Trigger override state change of root if is not same
if self.root_item.override_state is not state:
self.root_item.set_override_state(state)
@ -256,9 +266,10 @@ class DictImmutableKeysEntity(ItemEntity):
# Change has/had override states
self._override_state = state
self._ignore_missing_defaults = ignore_missing_defaults
for child_obj in self.non_gui_children.values():
child_obj.set_override_state(state)
child_obj.set_override_state(state, ignore_missing_defaults)
self._update_current_metadata()


@ -154,7 +154,9 @@ class DictMutableKeysEntity(EndpointEntity):
def add_key(self, key):
new_child = self._add_key(key)
new_child.set_override_state(self._override_state)
new_child.set_override_state(
self._override_state, self._ignore_missing_defaults
)
self.on_change()
return new_child
@ -320,7 +322,7 @@ class DictMutableKeysEntity(EndpointEntity):
def _metadata_for_current_state(self):
return self._get_metadata_for_state(self._override_state)
def set_override_state(self, state):
def set_override_state(self, state, ignore_missing_defaults):
# Trigger override state change of root if is not same
if self.root_item.override_state is not state:
self.root_item.set_override_state(state)
@ -328,14 +330,22 @@ class DictMutableKeysEntity(EndpointEntity):
# TODO change metadata
self._override_state = state
self._ignore_missing_defaults = ignore_missing_defaults
# Ignore if is dynamic item and use default in that case
if not self.is_dynamic_item and not self.is_in_dynamic_item:
if state > OverrideState.DEFAULTS:
if not self.has_default_value:
if (
not self.has_default_value
and not ignore_missing_defaults
):
raise DefaultsNotDefined(self)
elif state > OverrideState.STUDIO:
if not self.had_studio_override:
if (
not self.had_studio_override
and not ignore_missing_defaults
):
raise StudioDefaultsNotDefined(self)
if state is OverrideState.STUDIO:
@ -426,7 +436,7 @@ class DictMutableKeysEntity(EndpointEntity):
if label:
children_label_by_id[child_entity.id] = label
child_entity.set_override_state(state)
child_entity.set_override_state(state, ignore_missing_defaults)
self.children_label_by_id = children_label_by_id
@ -610,7 +620,9 @@ class DictMutableKeysEntity(EndpointEntity):
if not self._can_discard_changes:
return
self.set_override_state(self._override_state)
self.set_override_state(
self._override_state, self._ignore_missing_defaults
)
on_change_trigger.append(self.on_change)
def _add_to_studio_default(self, _on_change_trigger):
@ -645,7 +657,9 @@ class DictMutableKeysEntity(EndpointEntity):
if label:
children_label_by_id[child_entity.id] = label
child_entity.set_override_state(self._override_state)
child_entity.set_override_state(
self._override_state, self._ignore_missing_defaults
)
self.children_label_by_id = children_label_by_id
@ -694,7 +708,9 @@ class DictMutableKeysEntity(EndpointEntity):
if label:
children_label_by_id[child_entity.id] = label
child_entity.set_override_state(self._override_state)
child_entity.set_override_state(
self._override_state, self._ignore_missing_defaults
)
self.children_label_by_id = children_label_by_id


@ -139,7 +139,8 @@ class HostsEnumEntity(BaseEnumEntity):
"photoshop",
"resolve",
"tvpaint",
"unreal"
"unreal",
"standalonepublisher"
]
if self.use_empty_value:
host_names.insert(0, "")


@ -1,5 +1,6 @@
import re
import copy
import json
from abc import abstractmethod
from .base_entity import ItemEntity
@ -217,21 +218,28 @@ class InputEntity(EndpointEntity):
return True
return False
def set_override_state(self, state):
def set_override_state(self, state, ignore_missing_defaults):
# Trigger override state change of root if is not same
if self.root_item.override_state is not state:
self.root_item.set_override_state(state)
return
self._override_state = state
self._ignore_missing_defaults = ignore_missing_defaults
# Ignore if is dynamic item and use default in that case
if not self.is_dynamic_item and not self.is_in_dynamic_item:
if state > OverrideState.DEFAULTS:
if not self.has_default_value:
if (
not self.has_default_value
and not ignore_missing_defaults
):
raise DefaultsNotDefined(self)
elif state > OverrideState.STUDIO:
if not self.had_studio_override:
if (
not self.had_studio_override
and not ignore_missing_defaults
):
raise StudioDefaultsNotDefined(self)
if state is OverrideState.STUDIO:
@ -433,6 +441,7 @@ class RawJsonEntity(InputEntity):
def _item_initalization(self):
# Schema must define if valid value is dict or list
store_as_string = self.schema_data.get("store_as_string", False)
is_list = self.schema_data.get("is_list", False)
if is_list:
valid_value_types = (list, )
@ -441,6 +450,8 @@ class RawJsonEntity(InputEntity):
valid_value_types = (dict, )
value_on_not_set = {}
self.store_as_string = store_as_string
self._is_list = is_list
self.valid_value_types = valid_value_types
self.value_on_not_set = value_on_not_set
@ -484,6 +495,23 @@ class RawJsonEntity(InputEntity):
result = self.metadata != self._metadata_for_current_state()
return result
def schema_validations(self):
if self.store_as_string and self.is_env_group:
reason = (
"RawJson entity can't store environment group metadata"
" as string."
)
raise EntitySchemaError(self, reason)
super(RawJsonEntity, self).schema_validations()
def _convert_to_valid_type(self, value):
if isinstance(value, STRING_TYPE):
try:
return json.loads(value)
except Exception:
pass
return super(RawJsonEntity, self)._convert_to_valid_type(value)
def _metadata_for_current_state(self):
if (
self._override_state is OverrideState.PROJECT
@ -503,6 +531,9 @@ class RawJsonEntity(InputEntity):
value = super(RawJsonEntity, self)._settings_value()
if self.is_env_group and isinstance(value, dict):
value.update(self.metadata)
if self.store_as_string:
return json.dumps(value)
return value
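With `store_as_string` enabled, the value is dumped to a JSON string on output and parsed back on input; a simplified round-trip sketch (helper names hypothetical, not the entity API):

```python
import json

def to_settings_value(value, store_as_string):
    # Serialize the dict/list to a JSON string when requested
    return json.dumps(value) if store_as_string else value

def from_settings_value(value):
    # Accept either the raw container or its JSON string form
    if isinstance(value, str):
        try:
            return json.loads(value)
        except ValueError:
            return value
    return value

stored = to_settings_value({"key": [1, 2]}, store_as_string=True)
restored = from_settings_value(stored)
```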
def _prepare_value(self, value):


@ -150,14 +150,15 @@ class PathEntity(ItemEntity):
def value(self):
return self.child_obj.value
def set_override_state(self, state):
def set_override_state(self, state, ignore_missing_defaults):
# Trigger override state change of root if is not same
if self.root_item.override_state is not state:
self.root_item.set_override_state(state)
return
self._override_state = state
self.child_obj.set_override_state(state)
self._ignore_missing_defaults = ignore_missing_defaults
self.child_obj.set_override_state(state, ignore_missing_defaults)
def update_default_value(self, value):
self.child_obj.update_default_value(value)
@ -344,25 +345,32 @@ class ListStrictEntity(ItemEntity):
return True
return False
def set_override_state(self, state):
def set_override_state(self, state, ignore_missing_defaults):
# Trigger override state change of root if is not same
if self.root_item.override_state is not state:
self.root_item.set_override_state(state)
return
self._override_state = state
self._ignore_missing_defaults = ignore_missing_defaults
# Ignore if is dynamic item and use default in that case
if not self.is_dynamic_item and not self.is_in_dynamic_item:
if state > OverrideState.DEFAULTS:
if not self.has_default_value:
if (
not self.has_default_value
and not ignore_missing_defaults
):
raise DefaultsNotDefined(self)
elif state > OverrideState.STUDIO:
if not self.had_studio_override:
if (
not self.had_studio_override
and not ignore_missing_defaults
):
raise StudioDefaultsNotDefined(self)
for child_entity in self.children:
child_entity.set_override_state(state)
child_entity.set_override_state(state, ignore_missing_defaults)
self.initial_value = self.settings_value()


@ -2,6 +2,7 @@ import os
import re
import json
import copy
import inspect
from .exceptions import (
SchemaTemplateMissingKeys,
@ -25,335 +26,6 @@ TEMPLATE_METADATA_KEYS = (
template_key_pattern = re.compile(r"(\{.*?[^{0]*\})")
def _pop_metadata_item(template):
found_idx = None
for idx, item in enumerate(template):
if not isinstance(item, dict):
continue
for key in TEMPLATE_METADATA_KEYS:
if key in item:
found_idx = idx
break
if found_idx is not None:
break
metadata_item = {}
if found_idx is not None:
metadata_item = template.pop(found_idx)
return metadata_item
def _fill_schema_template_data(
template, template_data, skip_paths, required_keys=None, missing_keys=None
):
first = False
if required_keys is None:
first = True
if "skip_paths" in template_data:
skip_paths = template_data["skip_paths"]
if not isinstance(skip_paths, list):
skip_paths = [skip_paths]
# Cleanup skip paths (skip empty values)
skip_paths = [path for path in skip_paths if path]
required_keys = set()
missing_keys = set()
# Copy template data as content may change
template = copy.deepcopy(template)
# Get metadata item from template
metadata_item = _pop_metadata_item(template)
# Check for default values for template data
default_values = metadata_item.get(DEFAULT_VALUES_KEY) or {}
for key, value in default_values.items():
if key not in template_data:
template_data[key] = value
if not template:
output = template
elif isinstance(template, list):
# Store paths by the first part of the path
# - a None value says that the whole key should be skipped
skip_paths_by_first_key = {}
for path in skip_paths:
parts = path.split("/")
key = parts.pop(0)
if key not in skip_paths_by_first_key:
skip_paths_by_first_key[key] = []
value = "/".join(parts)
skip_paths_by_first_key[key].append(value or None)
output = []
for item in template:
# Get skip paths for children item
_skip_paths = []
if not isinstance(item, dict):
pass
elif item.get("type") in WRAPPER_TYPES:
_skip_paths = copy.deepcopy(skip_paths)
elif skip_paths_by_first_key:
# Check if this item should be skipped
key = item.get("key")
if key and key in skip_paths_by_first_key:
_skip_paths = skip_paths_by_first_key[key]
# Skip whole item if None is in skip paths value
if None in _skip_paths:
continue
output_item = _fill_schema_template_data(
item, template_data, _skip_paths, required_keys, missing_keys
)
if output_item:
output.append(output_item)
elif isinstance(template, dict):
output = {}
for key, value in template.items():
output[key] = _fill_schema_template_data(
value, template_data, skip_paths, required_keys, missing_keys
)
if output.get("type") in WRAPPER_TYPES and not output.get("children"):
return {}
elif isinstance(template, STRING_TYPE):
# TODO find a much better way to handle filling template data
template = template.replace("{{", "__dbcb__").replace("}}", "__decb__")
for replacement_string in template_key_pattern.findall(template):
key = str(replacement_string[1:-1])
required_keys.add(key)
if key not in template_data:
missing_keys.add(key)
continue
value = template_data[key]
if replacement_string == template:
# Replace the value with the value from template data
# - this makes it possible to set a value of a different type
template = value
else:
# Only replace the key in string
template = template.replace(replacement_string, value)
output = template.replace("__dbcb__", "{").replace("__decb__", "}")
else:
output = template
if first and missing_keys:
raise SchemaTemplateMissingKeys(missing_keys, required_keys)
return output
def _fill_schema_template(child_data, schema_collection, schema_templates):
template_name = child_data["name"]
template = schema_templates.get(template_name)
if template is None:
if template_name in schema_collection:
raise KeyError((
"Schema \"{}\" is used as `schema_template`"
).format(template_name))
raise KeyError("Schema template \"{}\" was not found".format(
template_name
))
# Default value must be a dictionary (NOT a list)
# - an empty list would not add any item if `template_data` is not filled
template_data = child_data.get("template_data") or {}
if isinstance(template_data, dict):
template_data = [template_data]
skip_paths = child_data.get("skip_paths") or []
if isinstance(skip_paths, STRING_TYPE):
skip_paths = [skip_paths]
output = []
for single_template_data in template_data:
try:
filled_child = _fill_schema_template_data(
template, single_template_data, skip_paths
)
except SchemaTemplateMissingKeys as exc:
raise SchemaTemplateMissingKeys(
exc.missing_keys, exc.required_keys, template_name
)
for item in filled_child:
filled_item = _fill_inner_schemas(
item, schema_collection, schema_templates
)
if filled_item["type"] == "schema_template":
output.extend(_fill_schema_template(
filled_item, schema_collection, schema_templates
))
else:
output.append(filled_item)
return output
def _fill_inner_schemas(schema_data, schema_collection, schema_templates):
if schema_data["type"] == "schema":
raise ValueError("First item in schema data can't be schema.")
children_key = "children"
object_type_key = "object_type"
for item_key in (children_key, object_type_key):
children = schema_data.get(item_key)
if not children:
continue
if object_type_key == item_key:
if not isinstance(children, dict):
continue
children = [children]
new_children = []
for child in children:
child_type = child["type"]
if child_type == "schema":
schema_name = child["name"]
if schema_name not in schema_collection:
if schema_name in schema_templates:
raise KeyError((
"Schema template \"{}\" is used as `schema`"
).format(schema_name))
raise KeyError(
"Schema \"{}\" was not found".format(schema_name)
)
filled_child = _fill_inner_schemas(
schema_collection[schema_name],
schema_collection,
schema_templates
)
elif child_type in ("template", "schema_template"):
for filled_child in _fill_schema_template(
child, schema_collection, schema_templates
):
new_children.append(filled_child)
continue
else:
filled_child = _fill_inner_schemas(
child, schema_collection, schema_templates
)
new_children.append(filled_child)
if item_key == object_type_key:
if len(new_children) != 1:
raise KeyError((
"Failed to fill object type with type: {} | name {}"
).format(
child_type, str(child.get("name"))
))
new_children = new_children[0]
schema_data[item_key] = new_children
return schema_data
# TODO reimplement logic inside entities
def validate_environment_groups_uniquenes(
schema_data, env_groups=None, keys=None
):
is_first = False
if env_groups is None:
is_first = True
env_groups = {}
keys = []
my_keys = copy.deepcopy(keys)
key = schema_data.get("key")
if key:
my_keys.append(key)
env_group_key = schema_data.get("env_group_key")
if env_group_key:
if env_group_key not in env_groups:
env_groups[env_group_key] = []
env_groups[env_group_key].append("/".join(my_keys))
children = schema_data.get("children")
if not children:
return
for child in children:
validate_environment_groups_uniquenes(
child, env_groups, copy.deepcopy(my_keys)
)
if is_first:
invalid = {}
for env_group_key, key_paths in env_groups.items():
if len(key_paths) > 1:
invalid[env_group_key] = key_paths
if invalid:
raise SchemaDuplicatedEnvGroupKeys(invalid)
def validate_schema(schema_data):
validate_environment_groups_uniquenes(schema_data)
def get_gui_schema(subfolder, main_schema_name):
dirpath = os.path.join(
os.path.dirname(__file__),
"schemas",
subfolder
)
loaded_schemas = {}
loaded_schema_templates = {}
for root, _, filenames in os.walk(dirpath):
for filename in filenames:
basename, ext = os.path.splitext(filename)
if ext != ".json":
continue
filepath = os.path.join(root, filename)
with open(filepath, "r") as json_stream:
try:
schema_data = json.load(json_stream)
except Exception as exc:
raise ValueError((
"Unable to parse JSON file {}\n{}"
).format(filepath, str(exc)))
if isinstance(schema_data, list):
loaded_schema_templates[basename] = schema_data
else:
loaded_schemas[basename] = schema_data
main_schema = _fill_inner_schemas(
loaded_schemas[main_schema_name],
loaded_schemas,
loaded_schema_templates
)
validate_schema(main_schema)
return main_schema
def get_studio_settings_schema():
return get_gui_schema("system_schema", "schema_main")
def get_project_settings_schema():
return get_gui_schema("projects_schema", "schema_main")
class OverrideStateItem:
"""Object used as item for `OverrideState` enum.
@ -426,3 +98,506 @@ class OverrideState:
DEFAULTS = OverrideStateItem(0, "Defaults")
STUDIO = OverrideStateItem(1, "Studio overrides")
PROJECT = OverrideStateItem(2, "Project Overrides")
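Comparisons such as `state > OverrideState.DEFAULTS` and `state is OverrideState.STUDIO` work because the state items are ordered by their integer value; a minimal stand-in sketch:

```python
import functools

@functools.total_ordering
class StateItem:
    """Simplified stand-in for OverrideStateItem: ordered by value."""
    def __init__(self, value, name):
        self.value = value
        self.name = name

    def __eq__(self, other):
        return isinstance(other, StateItem) and self.value == other.value

    def __lt__(self, other):
        return self.value < other.value

DEFAULTS = StateItem(0, "Defaults")
STUDIO = StateItem(1, "Studio overrides")
PROJECT = StateItem(2, "Project Overrides")
```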
class SchemasHub:
def __init__(self, schema_subfolder, reset=True):
self._schema_subfolder = schema_subfolder
self._loaded_types = {}
self._gui_types = tuple()
self._crashed_on_load = {}
self._loaded_templates = {}
self._loaded_schemas = {}
# It doesn't make sense to reload types on each reset as they can't be
# changed
self._load_types()
# Trigger reset
if reset:
self.reset()
def reset(self):
self._load_schemas()
@property
def gui_types(self):
return self._gui_types
def get_schema(self, schema_name):
"""Get schema definition data by its name.
Returns:
dict: Copy of schema loaded from json files.
Raises:
KeyError: When the schema name refers to a loaded template, when
its json file could not be parsed, or when the schema name was
not found.
"""
if schema_name not in self._loaded_schemas:
if schema_name in self._loaded_templates:
raise KeyError((
"Template \"{}\" is used as `schema`"
).format(schema_name))
elif schema_name in self._crashed_on_load:
crashed_item = self._crashed_on_load[schema_name]
raise KeyError(
"Unable to parse schema file \"{}\". {}".format(
crashed_item["filepath"], crashed_item["message"]
)
)
raise KeyError(
"Schema \"{}\" was not found".format(schema_name)
)
return copy.deepcopy(self._loaded_schemas[schema_name])
def get_template(self, template_name):
"""Get template definition data by its name.
Returns:
list: Copy of template items loaded from json files.
Raises:
KeyError: When the template name refers to a loaded schema, when
its json file could not be parsed, or when the template name was
not found.
"""
if template_name not in self._loaded_templates:
if template_name in self._loaded_schemas:
raise KeyError((
"Schema \"{}\" is used as `template`"
).format(template_name))
elif template_name in self._crashed_on_load:
crashed_item = self._crashed_on_load[template_name]
raise KeyError(
"Unable to parse template file \"{}\". {}".format(
crashed_item["filepath"], crashed_item["message"]
)
)
raise KeyError(
"Template \"{}\" was not found".format(template_name)
)
return copy.deepcopy(self._loaded_templates[template_name])
def resolve_schema_data(self, schema_data):
"""Resolve single item schema data as few types can be expanded.
This is mainly for 'schema' and 'template' types. Type 'schema' does
not have an entity representation and 'template' may expand to more
than one output schema.
In all other cases the passed schema item is returned wrapped in a list.
The goal is to keep schema and template resolving in one place.
Returns:
list: Resolved schema data.
"""
schema_type = schema_data["type"]
if schema_type not in ("schema", "template", "schema_template"):
return [schema_data]
if schema_type == "schema":
return self.resolve_schema_data(
self.get_schema(schema_data["name"])
)
template_name = schema_data["name"]
template_def = self.get_template(template_name)
filled_template = self._fill_template(
schema_data, template_def
)
return filled_template
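A reduced model of this resolution (an in-memory registry standing in for the loaded JSON files): plain items pass through as a one-item list, `schema` references are looked up and resolved recursively.

```python
def resolve(item, registry):
    """Resolve a schema item to a flat list of concrete items."""
    if item.get("type") != "schema":
        return [item]
    # Look up the referenced definition and resolve it in turn
    return resolve(registry[item["name"]], registry)

registry = {
    "sub": {"type": "text", "key": "host"},
}
resolved = resolve({"type": "schema", "name": "sub"}, registry)
```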
def create_schema_object(self, schema_data, *args, **kwargs):
"""Create entity for passed schema data.
Args:
schema_data(dict): Schema definition of settings entity.
Returns:
ItemEntity: Created entity for passed schema data item.
Raises:
ValueError: When 'schema', 'template' or any of wrapper types are
passed.
KeyError: When type of passed schema is not known.
"""
schema_type = schema_data["type"]
if schema_type in ("schema", "template", "schema_template"):
raise ValueError(
"Got unresolved schema data of type \"{}\"".format(schema_type)
)
if schema_type in WRAPPER_TYPES:
raise ValueError((
"Function `create_schema_object` can't create entities"
" of any wrapper type. Got type: \"{}\""
).format(schema_type))
klass = self._loaded_types.get(schema_type)
if not klass:
raise KeyError("Unknown type \"{}\"".format(schema_type))
return klass(schema_data, *args, **kwargs)
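Entity creation is a guarded type-to-class dispatch; a cut-down sketch with a hypothetical entity class:

```python
class TextEntity:
    """Hypothetical minimal entity standing in for a real entity class."""
    def __init__(self, schema_data):
        self.key = schema_data["key"]

LOADED_TYPES = {"text": TextEntity}
UNRESOLVED_TYPES = ("schema", "template", "schema_template")

def create_entity(schema_data):
    schema_type = schema_data["type"]
    # Unresolved references must be expanded before creation
    if schema_type in UNRESOLVED_TYPES:
        raise ValueError(
            "Got unresolved schema data of type \"{}\"".format(schema_type)
        )
    klass = LOADED_TYPES.get(schema_type)
    if klass is None:
        raise KeyError("Unknown type \"{}\"".format(schema_type))
    return klass(schema_data)

entity = create_entity({"type": "text", "key": "project_name"})
```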
def _load_types(self):
"""Prepare entity types for creation of their objects.
Currently all classes in `openpype.settings.entities` that inherit
from `BaseEntity` are stored as loaded types. GUI types are kept in a
separate attribute so they do not interfere with API access to entities.
TODOs:
Add a more dynamic way to register custom types from anywhere and
better handling of abstract classes; skipping them is dangerous.
"""
from openpype.settings import entities
# Define known abstract classes
known_abstract_classes = (
entities.BaseEntity,
entities.BaseItemEntity,
entities.ItemEntity,
entities.EndpointEntity,
entities.InputEntity,
entities.BaseEnumEntity
)
self._loaded_types = {}
_gui_types = []
for attr in dir(entities):
item = getattr(entities, attr)
# Filter classes
if not inspect.isclass(item):
continue
# Skip classes that do not inherit from BaseEntity
if not issubclass(item, entities.BaseEntity):
continue
# Skip class that is abstract by design
if item in known_abstract_classes:
continue
if inspect.isabstract(item):
# Instantiate to trigger the crash and get a useful traceback
item()
# Backwards compatibility
# Single entity may have multiple schema types
for schema_type in item.schema_types:
self._loaded_types[schema_type] = item
if item.gui_type:
_gui_types.append(item)
self._gui_types = tuple(_gui_types)
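The discovery pass is a standard `inspect` scan: keep concrete subclasses of the base class and register each of their schema type names. A self-contained sketch over a throwaway namespace (classes hypothetical):

```python
import inspect
import types

# Hypothetical stand-ins for the entities module content
class Base(object):
    schema_types = []

class Concrete(Base):
    schema_types = ["text", "path-input"]

module = types.SimpleNamespace(Base=Base, Concrete=Concrete, helper=len)

def collect_types(module, base, abstract=()):
    loaded = {}
    for attr in dir(module):
        item = getattr(module, attr)
        # Keep only concrete classes inheriting from the base
        if not inspect.isclass(item):
            continue
        if not issubclass(item, base) or item in abstract:
            continue
        # One class may register several schema type names
        for schema_type in item.schema_types:
            loaded[schema_type] = item
    return loaded

loaded = collect_types(module, Base, abstract=(Base,))
```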
def _load_schemas(self):
"""Load schema definitions from json files."""
# Refresh all affecting variables
self._crashed_on_load = {}
self._loaded_templates = {}
self._loaded_schemas = {}
dirpath = os.path.join(
os.path.dirname(os.path.abspath(__file__)),
"schemas",
self._schema_subfolder
)
loaded_schemas = {}
loaded_templates = {}
for root, _, filenames in os.walk(dirpath):
for filename in filenames:
basename, ext = os.path.splitext(filename)
if ext != ".json":
continue
filepath = os.path.join(root, filename)
with open(filepath, "r") as json_stream:
try:
schema_data = json.load(json_stream)
except Exception as exc:
msg = str(exc)
print("Unable to parse JSON file {}\n{}".format(
filepath, msg
))
self._crashed_on_load[basename] = {
"filepath": filepath,
"message": msg
}
continue
if basename in self._crashed_on_load:
crashed_item = self._crashed_on_load[basename]
raise KeyError((
"Duplicated filename \"{}\"."
" One of them crashed on load \"{}\" {}"
).format(
filename,
crashed_item["filepath"],
crashed_item["message"]
))
if isinstance(schema_data, list):
if basename in loaded_templates:
raise KeyError(
"Duplicated template filename \"{}\"".format(
filename
)
)
loaded_templates[basename] = schema_data
else:
if basename in loaded_schemas:
raise KeyError(
"Duplicated schema filename \"{}\"".format(
filename
)
)
loaded_schemas[basename] = schema_data
self._loaded_templates = loaded_templates
self._loaded_schemas = loaded_schemas
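Whether a JSON file holds a schema or a template is decided purely by its root type: a list is a template, a dict is a schema. A sketch of that dispatch with duplicate detection (in-memory input instead of `os.walk`):

```python
def sort_definitions(files):
    """Split parsed JSON files into schemas (dict root) and
    templates (list root), refusing duplicated basenames."""
    schemas, templates = {}, {}
    for basename, data in files:
        target = templates if isinstance(data, list) else schemas
        if basename in target:
            raise KeyError("Duplicated filename \"{}\"".format(basename))
        target[basename] = data
    return schemas, templates

schemas, templates = sort_definitions([
    ("schema_main", {"type": "dict", "children": []}),
    ("template_host", [{"type": "text", "key": "host"}]),
])
```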
def _fill_template(self, child_data, template_def):
"""Fill template based on schema definition and template definition.
The `template_def` is modified based on `child_data` and the result
is returned.
A template definition may declare keys to fill, which are taken from
the child data.
Child data may produce more than one output definition of a template.
Child data can define paths to skip. A path is the full path of an
item which won't be returned.
TODO:
Be able to handle wrapper items here.
Args:
child_data(dict): Schema data of template item.
template_def(dict): Template definition that will be filled with
child_data.
Returns:
list: Resolved template always returns list of schemas.
"""
template_name = child_data["name"]
# Default value must be a dictionary (NOT a list)
# - an empty list would not add any item if `template_data` is not filled
template_data = child_data.get("template_data") or {}
if isinstance(template_data, dict):
template_data = [template_data]
skip_paths = child_data.get("skip_paths") or []
if isinstance(skip_paths, STRING_TYPE):
skip_paths = [skip_paths]
output = []
for single_template_data in template_data:
try:
output.extend(self._fill_template_data(
template_def, single_template_data, skip_paths
))
except SchemaTemplateMissingKeys as exc:
raise SchemaTemplateMissingKeys(
exc.missing_keys, exc.required_keys, template_name
)
return output
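Because `template_data` may be a single dict or a list of dicts, it is normalized to a list first and each entry yields one filled copy of the template; a simplified sketch using plain `str.format` in place of the full fill logic:

```python
def expand_template(template_def, template_data):
    # A single dict means exactly one output; normalize to a list
    if isinstance(template_data, dict):
        template_data = [template_data]
    output = []
    for data in template_data:
        for item in template_def:
            # Shallow fill: format every string value with the data
            output.append({
                key: (value.format(**data) if isinstance(value, str)
                      else value)
                for key, value in item.items()
            })
    return output

items = expand_template(
    [{"type": "text", "key": "{name}_path"}],
    [{"name": "maya"}, {"name": "nuke"}],
)
```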
def _fill_template_data(
self,
template,
template_data,
skip_paths,
required_keys=None,
missing_keys=None
):
"""Fill template values with data from schema data.
Templates have more abilities than schemas. A template is expected to
be used in multiple places (though it may not be). A schema represents
exactly one entity and its children, while a template may represent
multiple entities.
A template can declare "keys to fill" in its definition. Some keys may
be required and some optional, because the template defines default
values for them.
A template also has the ability to "skip paths", which means to skip
entities from its content. A template can be used across multiple
places with different requirements.
Raises:
SchemaTemplateMissingKeys: When fill data do not contain all
required keys for template.
"""
first = False
if required_keys is None:
first = True
if "skip_paths" in template_data:
skip_paths = template_data["skip_paths"]
if not isinstance(skip_paths, list):
skip_paths = [skip_paths]
# Cleanup skip paths (skip empty values)
skip_paths = [path for path in skip_paths if path]
required_keys = set()
missing_keys = set()
# Copy template data as content may change
template = copy.deepcopy(template)
# Get metadata item from template
metadata_item = self._pop_metadata_item(template)
# Check for default values for template data
default_values = metadata_item.get(DEFAULT_VALUES_KEY) or {}
for key, value in default_values.items():
if key not in template_data:
template_data[key] = value
if not template:
output = template
elif isinstance(template, list):
# Store paths by the first part of the path
# - a None value says that the whole key should be skipped
skip_paths_by_first_key = {}
for path in skip_paths:
parts = path.split("/")
key = parts.pop(0)
if key not in skip_paths_by_first_key:
skip_paths_by_first_key[key] = []
value = "/".join(parts)
skip_paths_by_first_key[key].append(value or None)
output = []
for item in template:
# Get skip paths for children item
_skip_paths = []
if not isinstance(item, dict):
pass
elif item.get("type") in WRAPPER_TYPES:
_skip_paths = copy.deepcopy(skip_paths)
elif skip_paths_by_first_key:
# Check if this item should be skipped
key = item.get("key")
if key and key in skip_paths_by_first_key:
_skip_paths = skip_paths_by_first_key[key]
# Skip whole item if None is in skip paths value
if None in _skip_paths:
continue
output_item = self._fill_template_data(
item,
template_data,
_skip_paths,
required_keys,
missing_keys
)
if output_item:
output.append(output_item)
elif isinstance(template, dict):
output = {}
for key, value in template.items():
output[key] = self._fill_template_data(
value,
template_data,
skip_paths,
required_keys,
missing_keys
)
if (
output.get("type") in WRAPPER_TYPES
and not output.get("children")
):
return {}
elif isinstance(template, STRING_TYPE):
# TODO find a much better way to handle filling template data
template = (
template
.replace("{{", "__dbcb__")
.replace("}}", "__decb__")
)
full_replacement = False
for replacement_string in template_key_pattern.findall(template):
key = str(replacement_string[1:-1])
required_keys.add(key)
if key not in template_data:
missing_keys.add(key)
continue
value = template_data[key]
if replacement_string == template:
# Replace the value with the value from template data
# - this makes it possible to set a value of a different type
template = value
full_replacement = True
else:
# Only replace the key in string
template = template.replace(replacement_string, value)
if not full_replacement:
output = (
template
.replace("__dbcb__", "{")
.replace("__decb__", "}")
)
else:
output = template
else:
output = template
if first and missing_keys:
raise SchemaTemplateMissingKeys(missing_keys, required_keys)
return output
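The double-brace escape above works by swapping `{{`/`}}` for temporary tokens, substituting each `{key}` occurrence, then restoring the literal braces; a standalone sketch using the same regex:

```python
import re

key_pattern = re.compile(r"(\{.*?[^{0]*\})")

def fill_string(template, data):
    # Protect literal braces written as {{ and }}
    template = template.replace("{{", "__dbcb__").replace("}}", "__decb__")
    for placeholder in key_pattern.findall(template):
        key = placeholder[1:-1]
        if key in data:
            template = template.replace(placeholder, data[key])
    # Restore the protected literal braces
    return template.replace("__dbcb__", "{").replace("__decb__", "}")

filled = fill_string("{host}/{{literal}}", {"host": "maya"})
```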
def _pop_metadata_item(self, template_def):
"""Pop template metadata from template definition.
Template metadata may define default values used when they are not
passed from schema data.
"""
found_idx = None
for idx, item in enumerate(template_def):
if not isinstance(item, dict):
continue
for key in TEMPLATE_METADATA_KEYS:
if key in item:
found_idx = idx
break
if found_idx is not None:
break
metadata_item = {}
if found_idx is not None:
metadata_item = template_def.pop(found_idx)
return metadata_item
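Popping the metadata item removes the first dict in the template list that carries a metadata key and leaves the rest untouched; a sketch (the metadata key name here is an assumption):

```python
# Assumed metadata key name for illustration only
TEMPLATE_METADATA_KEYS = ("__default_values__",)

def pop_metadata(template_def):
    """Pop and return the first metadata dict, or {} if none exists."""
    for idx, item in enumerate(template_def):
        if isinstance(item, dict) and any(
            key in item for key in TEMPLATE_METADATA_KEYS
        ):
            return template_def.pop(idx)
    return {}

template = [
    {"__default_values__": {"host": "maya"}},
    {"type": "text", "key": "{host}_path"},
]
meta = pop_metadata(template)
```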


@ -102,7 +102,9 @@ class ListEntity(EndpointEntity):
def add_new_item(self, idx=None, trigger_change=True):
child_obj = self._add_new_item(idx)
child_obj.set_override_state(self._override_state)
child_obj.set_override_state(
self._override_state, self._ignore_missing_defaults
)
if trigger_change:
self.on_child_change(child_obj)
@ -205,13 +207,14 @@ class ListEntity(EndpointEntity):
self._has_project_override = True
self.on_change()
def set_override_state(self, state):
def set_override_state(self, state, ignore_missing_defaults):
# Trigger override state change of root if is not same
if self.root_item.override_state is not state:
self.root_item.set_override_state(state)
return
self._override_state = state
self._ignore_missing_defaults = ignore_missing_defaults
while self.children:
self.children.pop(0)
@ -219,11 +222,17 @@ class ListEntity(EndpointEntity):
# Ignore if is dynamic item and use default in that case
if not self.is_dynamic_item and not self.is_in_dynamic_item:
if state > OverrideState.DEFAULTS:
if not self.has_default_value:
if (
not self.has_default_value
and not ignore_missing_defaults
):
raise DefaultsNotDefined(self)
elif state > OverrideState.STUDIO:
if not self.had_studio_override:
if (
not self.had_studio_override
and not ignore_missing_defaults
):
raise StudioDefaultsNotDefined(self)
value = NOT_SET
@ -257,7 +266,9 @@ class ListEntity(EndpointEntity):
child_obj.update_studio_value(item)
for child_obj in self.children:
child_obj.set_override_state(self._override_state)
child_obj.set_override_state(
self._override_state, ignore_missing_defaults
)
self.initial_value = self.settings_value()
@ -395,7 +406,9 @@ class ListEntity(EndpointEntity):
if self.had_studio_override:
child_obj.update_studio_value(item)
child_obj.set_override_state(self._override_state)
child_obj.set_override_state(
self._override_state, self._ignore_missing_defaults
)
if self._override_state >= OverrideState.PROJECT:
self._has_project_override = self.had_project_override
@ -427,7 +440,9 @@ class ListEntity(EndpointEntity):
for item in value:
child_obj = self._add_new_item()
child_obj.update_default_value(item)
child_obj.set_override_state(self._override_state)
child_obj.set_override_state(
self._override_state, self._ignore_missing_defaults
)
self._ignore_child_changes = False
@ -460,7 +475,10 @@ class ListEntity(EndpointEntity):
child_obj.update_default_value(item)
if self._has_studio_override:
child_obj.update_studio_value(item)
child_obj.set_override_state(self._override_state)
child_obj.set_override_state(
self._override_state,
self._ignore_missing_defaults
)
self._ignore_child_changes = False

View file

@ -1,7 +1,7 @@
import os
import json
import copy
import inspect
import collections
from abc import abstractmethod
@ -10,8 +10,7 @@ from .lib import (
NOT_SET,
WRAPPER_TYPES,
OverrideState,
get_studio_settings_schema,
get_project_settings_schema
SchemasHub
)
from .exceptions import (
SchemaError,
@ -53,7 +52,12 @@ class RootEntity(BaseItemEntity):
"""
schema_types = ["root"]
def __init__(self, schema_data, reset):
def __init__(self, schema_hub, reset, main_schema_name=None):
self.schema_hub = schema_hub
if not main_schema_name:
main_schema_name = "schema_main"
schema_data = schema_hub.get_schema(main_schema_name)
super(RootEntity, self).__init__(schema_data)
self._require_restart_callbacks = []
self._item_ids_require_restart = set()
@ -130,7 +134,17 @@ class RootEntity(BaseItemEntity):
def _add_children(self, schema_data, first=True):
added_children = []
for children_schema in schema_data["children"]:
children_deque = collections.deque()
for _children_schema in schema_data["children"]:
children_schemas = self.schema_hub.resolve_schema_data(
_children_schema
)
for children_schema in children_schemas:
children_deque.append(children_schema)
while children_deque:
children_schema = children_deque.popleft()
if children_schema["type"] in WRAPPER_TYPES:
_children_schema = copy.deepcopy(children_schema)
wrapper_children = self._add_children(
@ -143,11 +157,13 @@ class RootEntity(BaseItemEntity):
child_obj = self.create_schema_object(children_schema, self)
self.children.append(child_obj)
added_children.append(child_obj)
if isinstance(child_obj, self._gui_types):
if isinstance(child_obj, self.schema_hub.gui_types):
continue
if child_obj.key in self.non_gui_children:
raise KeyError("Duplicated key \"{}\"".format(child_obj.key))
raise KeyError(
"Duplicated key \"{}\"".format(child_obj.key)
)
self.non_gui_children[child_obj.key] = child_obj
if not first:
@ -160,9 +176,6 @@ class RootEntity(BaseItemEntity):
# Store `self` to `root_item` for children entities
self.root_item = self
self._loaded_types = None
self._gui_types = None
# Children are stored by key as keys are immutable and are defined by
# schema
self.valid_value_types = (dict, )
@ -189,11 +202,10 @@ class RootEntity(BaseItemEntity):
if not KEY_REGEX.match(key):
raise InvalidKeySymbols(self.path, key)
@abstractmethod
def get_entity_from_path(self, path):
"""Return system settings entity."""
raise NotImplementedError((
"Method `get_entity_from_path` not available for \"{}\""
).format(self.__class__.__name__))
"""Return entity matching passed path."""
pass
def create_schema_object(self, schema_data, *args, **kwargs):
"""Create entity by entered schema data.
@ -201,56 +213,11 @@ class RootEntity(BaseItemEntity):
Available entities are loaded on first run. Children entities can call
this method.
"""
if self._loaded_types is None:
# Load available entities
from openpype.settings import entities
return self.schema_hub.create_schema_object(
schema_data, *args, **kwargs
)
# Define known abstract classes
known_abstract_classes = (
entities.BaseEntity,
entities.BaseItemEntity,
entities.ItemEntity,
entities.EndpointEntity,
entities.InputEntity,
entities.BaseEnumEntity
)
self._loaded_types = {}
_gui_types = []
for attr in dir(entities):
item = getattr(entities, attr)
# Filter classes
if not inspect.isclass(item):
continue
# Skip classes that do not inherit from BaseEntity
if not issubclass(item, entities.BaseEntity):
continue
# Skip class that is abstract by design
if item in known_abstract_classes:
continue
if inspect.isabstract(item):
# Create an object to get crash and get traceback
item()
# Backwards compatibility
# Single entity may have multiple schema types
for schema_type in item.schema_types:
self._loaded_types[schema_type] = item
if item.gui_type:
_gui_types.append(item)
self._gui_types = tuple(_gui_types)
klass = self._loaded_types.get(schema_data["type"])
if not klass:
raise KeyError("Unknown type \"{}\"".format(schema_data["type"]))
return klass(schema_data, *args, **kwargs)
def set_override_state(self, state):
def set_override_state(self, state, ignore_missing_defaults=None):
"""Set override state and trigger it on children.
Method will discard all changes in hierarchy and use values, metadata
@ -259,9 +226,12 @@ class RootEntity(BaseItemEntity):
Args:
state (OverrideState): State to which should be data changed.
"""
if not ignore_missing_defaults:
ignore_missing_defaults = False
self._override_state = state
for child_obj in self.non_gui_children.values():
child_obj.set_override_state(state)
child_obj.set_override_state(state, ignore_missing_defaults)
def on_change(self):
"""Trigger callbacks on change."""
@ -491,18 +461,32 @@ class SystemSettings(RootEntity):
schema_data (dict): Pass schema data to entity. This is for development
and debugging purposes.
"""
def __init__(
self, set_studio_state=True, reset=True, schema_data=None
):
if schema_data is None:
# Load system schemas
schema_data = get_studio_settings_schema()
root_key = SYSTEM_SETTINGS_KEY
super(SystemSettings, self).__init__(schema_data, reset)
def __init__(
self, set_studio_state=True, reset=True, schema_hub=None
):
if schema_hub is None:
# Load system schemas
schema_hub = SchemasHub("system_schema")
super(SystemSettings, self).__init__(schema_hub, reset)
if set_studio_state:
self.set_studio_state()
def get_entity_from_path(self, path):
"""Return system settings entity."""
path_parts = path.split("/")
first_part = path_parts[0]
output = self
if first_part == self.root_key:
path_parts.pop(0)
for path_part in path_parts:
output = output[path_part]
return output
def _reset_values(self):
default_value = get_default_settings()[SYSTEM_SETTINGS_KEY]
for key, child_obj in self.non_gui_children.items():
@ -600,22 +584,24 @@ class ProjectSettings(RootEntity):
schema_data (dict): Pass schema data to entity. This is for development
and debugging purposes.
"""
root_key = PROJECT_SETTINGS_KEY
def __init__(
self,
project_name=None,
change_state=True,
reset=True,
schema_data=None
schema_hub=None
):
self._project_name = project_name
self._system_settings_entity = None
if schema_data is None:
if schema_hub is None:
# Load project schemas
schema_data = get_project_settings_schema()
schema_hub = SchemasHub("projects_schema")
super(ProjectSettings, self).__init__(schema_data, reset)
super(ProjectSettings, self).__init__(schema_hub, reset)
if change_state:
if self.project_name is None:

View file

@ -181,6 +181,103 @@
}
```
## dict-conditional
- similar to `dict`, but with a single child entity that is always available
- that entity is an enumerator of possible values, and the selected value determines which other children entities are defined and used
- each enumerator value has its own set of children
- there is no way to share entities across multiple enum items
- the value of the enumerator is stored next to the other values
- the key under which the enum value is stored is defined with `enum_key`
- `enum_key` must match the key regex, and no enum item may have a child with the same key
- `enum_label` is the label of the entity for UI purposes
- enum items are defined with `enum_children`
- it is a list where each item represents one enum item
- every item in `enum_children` must have at least a `key` key, whose value is stored under `enum_key`
- items can define `label` for UI purposes
- most importantly, an item can define a `children` key with the definitions of its children (the `children` value works the same way as in `dict`)
- the entity must have `"label"` defined if it is not used as a widget
- it is set as a group if none of its parents is a group
- the entered `"label"` is shown in the GUI
- an item with a label can be collapsible
- that can be set with key `"collapsible"` as `True`/`False` (default: `True`)
- with key `"collapsed"` as `True`/`False` it can be set whether the item is collapsed when the GUI is opened (default: `False`)
- it is possible to add a darker background with `"highlight_content"` (default: `False`)
- the darker background has its limits: after 3-4 levels of nested highlighted items there is no visible difference in color
- output is a dictionary of the children values, with the enum value stored under `enum_key`
```
# Example
{
"type": "dict-conditional",
"key": "my_key",
"label": "My Key",
"enum_key": "type",
"enum_label": "label",
"enum_children": [
# Each item must be a dictionary with 'key'
{
"key": "action",
"label": "Action",
"children": [
{
"type": "text",
"key": "key",
"label": "Key"
},
{
"type": "text",
"key": "label",
"label": "Label"
},
{
"type": "text",
"key": "command",
"label": "Command"
}
]
},
{
"key": "menu",
"label": "Menu",
"children": [
{
"key": "children",
"label": "Children",
"type": "list",
"object_type": "text"
}
]
},
{
# Separator does not have children as "separator" value is enough
"key": "separator",
"label": "Separator"
}
]
}
```
How the output of the schema could look on save:
```
{
"type": "separator"
}
{
"type": "action",
"key": "action_1",
"label": "Action 1",
"command": "run command -arg"
}
{
"type": "menu",
"children": [
"child_1",
"child_2"
]
}
```
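The saved values above carry their enum value under `enum_key` (`"type"` in the example), so consuming code typically dispatches on it. A minimal, hypothetical sketch — `build_menu_item` and its return values are illustrative, not part of OpenPype:

```python
# Hypothetical consumer of "dict-conditional" output: dispatch on the
# stored enum key ("type" in the example above) to build a menu item.
def build_menu_item(item_data):
    item_type = item_data["type"]
    if item_type == "separator":
        # A separator carries no other data
        return "---"
    if item_type == "action":
        # "key", "label" and "command" come from the "action" children
        return "{label} -> {command}".format(**item_data)
    if item_type == "menu":
        # "children" comes from the "menu" children definition
        return {child: None for child in item_data["children"]}
    raise ValueError("Unknown menu item type: {}".format(item_type))
```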
## Inputs for setting any kind of value (`Pure` inputs)
- all these inputs must have `"key"` defined, under which the value will be stored, and `"label"`, which will be shown next to the input
- unless they are used "as widgets" inside other input types (described later); in that case `"key"` and `"label"` are not required as there is no place to set them
@ -240,6 +337,11 @@
- schema also defines valid value type
- by default it is dictionary
- to be able to use a list it is required to set `is_list` to `true`
- output can be stored as a string
- this allows arbitrary keys in the dictionary
- set key `store_as_string` to `true`
- code using that setting must expect the value to be a string and use the json module to convert it to python types
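As a sketch of the last point, a reader of such a setting might decode it like this — `read_raw_json_setting` is a hypothetical helper, not OpenPype API:

```python
import json

# Hypothetical reader of a "raw-json" setting saved with
# "store_as_string": true - the stored value is a JSON string,
# so consuming code has to decode it before use.
def read_raw_json_setting(stored_value):
    if isinstance(stored_value, str):
        return json.loads(stored_value)
    # Values saved before the option was enabled may still be
    # real dicts/lists; pass them through unchanged.
    return stored_value
```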
```
{
"type": "raw-json",

View file

@ -108,6 +108,16 @@
"key": "limit",
"label": "Limit Groups",
"object_type": "text"
},
{
"type": "raw-json",
"key": "jobInfo",
"label": "Additional JobInfo data"
},
{
"type": "raw-json",
"key": "pluginInfo",
"label": "Additional PluginInfo data"
}
]
},
@ -173,6 +183,20 @@
"key": "use_gpu",
"label": "Use GPU"
},
{
"type": "list",
"key": "env_allowed_keys",
"object_type": "text",
"label": "Allowed environment keys"
},
{
"type": "dict-modifiable",
"key": "env_search_replace_values",
"label": "Search & replace in environment values",
"object_type": {
"type": "text"
}
},
{
"type": "dict-modifiable",
"key": "limit_groups",

View file

@ -17,7 +17,8 @@
"type": "raw-json",
"label": "Project Folder Structure",
"key": "project_folder_structure",
"use_label_wrap": true
"use_label_wrap": true,
"store_as_string": true
},
{
"type": "schema",

View file

@ -63,6 +63,14 @@
"type": "text",
"key": "fpath_template",
"label": "Path template"
},
{
"type": "list",
"key": "defaults",
"label": "Subset name defaults",
"object_type": {
"type": "text"
}
}
]
},
@ -82,6 +90,14 @@
"type": "boolean",
"key": "use_range_limit",
"label": "Use Frame range limit by default"
},
{
"type": "list",
"key": "defaults",
"label": "Subset name defaults",
"object_type": {
"type": "text"
}
}
]
}

View file

@ -130,6 +130,165 @@
]
}
]
},
{
"type": "dict",
"collapsible": true,
"key": "CollectEditorial",
"label": "Collect Editorial",
"is_group": true,
"children": [
{
"type": "text",
"key": "source_dir",
"label": "Editorial resources pointer"
},
{
"type": "list",
"key": "extensions",
"label": "Accepted extensions",
"object_type": "text"
}
]
},
{
"type": "dict",
"collapsible": true,
"key": "CollectHierarchyInstance",
"label": "Collect Instance Hierarchy",
"is_group": true,
"children": [
{
"type": "text",
"key": "shot_rename_template",
"label": "Shot rename template"
},
{
"key": "shot_rename_search_patterns",
"label": "Shot renaming patterns search",
"type": "dict-modifiable",
"highlight_content": true,
"object_type": {
"type": "text"
}
},
{
"type": "dict",
"key": "shot_add_hierarchy",
"label": "Shot hierarchy",
"children": [
{
"type": "text",
"key": "parents_path",
"label": "Parents path template"
},
{
"key": "parents",
"label": "Parents",
"type": "dict-modifiable",
"highlight_content": true,
"object_type": {
"type": "text"
}
}
]
},
{
"key": "shot_add_tasks",
"label": "Add tasks to shot",
"type": "dict-modifiable",
"highlight_content": true,
"object_type": {
"type": "dict",
"children": [
{
"type": "task-types-enum",
"key": "type",
"label": "Task type"
}
]
}
}
]
},
{
"type": "dict",
"collapsible": true,
"key": "shot_add_tasks",
"label": "Collect Clip Instances",
"is_group": true,
"children": [
{
"type": "number",
"key": "custom_start_frame",
"label": "Custom start frame",
"default": 0,
"minimum": 1,
"maximum": 100000
},
{
"type": "number",
"key": "timeline_frame_start",
"label": "Timeline start frame",
"default": 900000,
"minimum": 1,
"maximum": 10000000
},
{
"type": "number",
"key": "timeline_frame_offset",
"label": "Timeline frame offset",
"default": 0,
"minimum": -1000000,
"maximum": 1000000
},
{
"key": "subsets",
"label": "Subsets",
"type": "dict-modifiable",
"highlight_content": true,
"object_type": {
"type": "dict",
"children": [
{
"type": "text",
"key": "family",
"label": "Family"
},
{
"type": "list",
"key": "families",
"label": "Families",
"object_type": "text"
},
{
"type": "splitter"
},
{
"type": "list",
"key": "extensions",
"label": "Extensions",
"object_type": "text"
},
{
"key": "version",
"label": "Version lock",
"type": "number",
"default": 0,
"minimum": 0,
"maximum": 10
}
,
{
"type": "boolean",
"key": "keepSequence",
"label": "Keep sequence if used for review",
"default": false
}
]
}
}
]
}
]
}

View file

@ -15,11 +15,6 @@
"type": "boolean",
"key": "dev_mode",
"label": "Dev mode"
},
{
"type": "boolean",
"key": "install_unreal_python_engine",
"label": "Install unreal python engine"
}
]
}

View file

@ -554,6 +554,22 @@
"key": "deadline_priority",
"label": "Deadline Priority"
},
{
"type": "splitter"
},
{
"type": "text",
"key": "publishing_script",
"label": "Publishing script path"
},
{
"type": "list",
"key": "skip_integration_repre_list",
"label": "Skip integration of representation with ext",
"object_type": {
"type": "text"
}
},
{
"type": "dict",
"key": "aov_filter",
@ -594,6 +610,30 @@
]
}
]
},
{
"type": "dict",
"collapsible": true,
"key": "CleanUp",
"label": "Clean Up",
"is_group": true,
"children": [
{
"type": "list",
"key": "paterns",
"label": "Patterns (regex)",
"object_type": {
"type": "text"
}
},
{
"type": "boolean",
"key": "remove_temp_renders",
"label": "Remove Temp renders",
"default": false
}
]
}
]
}

View file

@ -9,6 +9,54 @@
"label": "Color input",
"type": "color"
},
{
"type": "dict-conditional",
"use_label_wrap": true,
"collapsible": true,
"key": "menu_items",
"label": "Menu items",
"enum_key": "type",
"enum_label": "Type",
"enum_children": [
{
"key": "action",
"label": "Action",
"children": [
{
"type": "text",
"key": "key",
"label": "Key"
},
{
"type": "text",
"key": "label",
"label": "Label"
},
{
"type": "text",
"key": "command",
"label": "Command"
}
]
},
{
"key": "menu",
"label": "Menu",
"children": [
{
"key": "children",
"label": "Children",
"type": "list",
"object_type": "text"
}
]
},
{
"key": "separator",
"label": "Separator"
}
]
},
{
"type": "dict",
"key": "schema_template_exaples",

View file

@ -29,11 +29,13 @@
"template_data": [
{
"app_variant_label": "2020",
"app_variant": "2020"
"app_variant": "2020",
"variant_skip_paths": ["use_python_2"]
},
{
"app_variant_label": "2021",
"app_variant": "2021"
"app_variant": "2021",
"variant_skip_paths": ["use_python_2"]
}
]
}

View file

@ -30,7 +30,8 @@
"children": [
{
"type": "schema_template",
"name": "template_host_variant_items"
"name": "template_host_variant_items",
"skip_paths": ["use_python_2"]
}
]
}

View file

@ -29,11 +29,13 @@
"template_data": [
{
"app_variant_label": "20",
"app_variant": "20"
"app_variant": "20",
"variant_skip_paths": ["use_python_2"]
},
{
"app_variant_label": "17",
"app_variant": "17"
"app_variant": "17",
"variant_skip_paths": ["use_python_2"]
}
]
}

View file

@ -29,11 +29,13 @@
"template_data": [
{
"app_variant_label": "2020",
"app_variant": "2020"
"app_variant": "2020",
"variant_skip_paths": ["use_python_2"]
},
{
"app_variant_label": "2021",
"app_variant": "2021"
"app_variant": "2021",
"variant_skip_paths": ["use_python_2"]
}
]
}

View file

@ -30,7 +30,8 @@
"children": [
{
"type": "schema_template",
"name": "template_host_variant_items"
"name": "template_host_variant_items",
"skip_paths": ["use_python_2"]
}
]
}

View file

@ -1,4 +1,9 @@
[
{
"__default_values__": {
"variant_skip_paths": null
}
},
{
"type": "dict",
"key": "{app_variant}",
@ -19,7 +24,8 @@
},
{
"type": "schema_template",
"name": "template_host_variant_items"
"name": "template_host_variant_items",
"skip_paths": "{variant_skip_paths}"
}
]
}

View file

@ -1,3 +1,5 @@
import json
from Qt import QtWidgets, QtGui, QtCore
from openpype.tools.settings import CHILD_OFFSET
from .widgets import ExpandingWidget
@ -125,6 +127,117 @@ class BaseWidget(QtWidgets.QWidget):
actions_mapping[action] = remove_from_project_override
menu.addAction(action)
def _copy_value_actions(self, menu):
def copy_value():
mime_data = QtCore.QMimeData()
if self.entity.is_dynamic_item or self.entity.is_in_dynamic_item:
entity_path = None
else:
entity_path = "/".join(
[self.entity.root_key, self.entity.path]
)
value = self.entity.value
# Copy for settings tool
settings_data = {
"root_key": self.entity.root_key,
"value": value,
"path": entity_path
}
settings_encoded_data = QtCore.QByteArray()
settings_stream = QtCore.QDataStream(
settings_encoded_data, QtCore.QIODevice.WriteOnly
)
settings_stream.writeQString(json.dumps(settings_data))
mime_data.setData(
"application/copy_settings_value", settings_encoded_data
)
# Copy as json
json_encoded_data = None
if isinstance(value, (dict, list)):
json_encoded_data = QtCore.QByteArray()
json_stream = QtCore.QDataStream(
json_encoded_data, QtCore.QIODevice.WriteOnly
)
json_stream.writeQString(json.dumps(value))
mime_data.setData("application/json", json_encoded_data)
# Copy as text
if json_encoded_data is None:
# Store value as string
mime_data.setText(str(value))
else:
# Store data as json string
mime_data.setText(json.dumps(value, indent=4))
QtWidgets.QApplication.clipboard().setMimeData(mime_data)
action = QtWidgets.QAction("Copy", menu)
return [(action, copy_value)]
def _paste_value_actions(self, menu):
output = []
# Allow pasting of a value only if it was copied from this UI
mime_data = QtWidgets.QApplication.clipboard().mimeData()
mime_value = mime_data.data("application/copy_settings_value")
# Skip if there is nothing to do
if not mime_value:
return output
settings_stream = QtCore.QDataStream(
mime_value, QtCore.QIODevice.ReadOnly
)
mime_data_value_str = settings_stream.readQString()
mime_data_value = json.loads(mime_data_value_str)
value = mime_data_value["value"]
path = mime_data_value["path"]
root_key = mime_data_value["root_key"]
# Try to find a matching entity to be able to paste values to the same spot
# - the entity can't be dynamic or in a dynamic item
# - it must be in the same root entity as the copy source
# Can't copy between system settings <-> project settings
matching_entity = None
if path and root_key == self.entity.root_key:
try:
matching_entity = self.entity.get_entity_from_path(path)
except Exception:
pass
def _set_entity_value(_entity, _value):
try:
_entity.set(_value)
except Exception:
dialog = QtWidgets.QMessageBox(self)
dialog.setWindowTitle("Value does not match settings schema")
dialog.setIcon(QtWidgets.QMessageBox.Warning)
dialog.setText((
"Pasted value does not seem to match schema of destination"
" settings entity."
))
dialog.exec_()
# Simple paste value method
def paste_value():
_set_entity_value(self.entity, value)
action = QtWidgets.QAction("Paste", menu)
output.append((action, paste_value))
# Paste value to matching entity
def paste_value_to_path():
_set_entity_value(matching_entity, value)
if matching_entity is not None:
action = QtWidgets.QAction("Paste to same place", menu)
output.append((action, paste_value_to_path))
return output
def show_actions_menu(self, event=None):
if event and event.button() != QtCore.Qt.RightButton:
return
@ -144,6 +257,15 @@ class BaseWidget(QtWidgets.QWidget):
self._add_to_project_override_action(menu, actions_mapping)
self._remove_from_project_override_action(menu, actions_mapping)
ui_actions = []
ui_actions.extend(self._copy_value_actions(menu))
ui_actions.extend(self._paste_value_actions(menu))
if ui_actions:
menu.addSeparator()
for action, callback in ui_actions:
menu.addAction(action)
actions_mapping[action] = callback
if not actions_mapping:
action = QtWidgets.QAction("< No action >")
actions_mapping[action] = None

View file

@ -11,6 +11,7 @@ from openpype.settings.entities import (
GUIEntity,
DictImmutableKeysEntity,
DictMutableKeysEntity,
DictConditionalEntity,
ListEntity,
PathEntity,
ListStrictEntity,
@ -35,6 +36,7 @@ from .base import GUIWidget
from .list_item_widget import ListWidget
from .list_strict_widget import ListStrictWidget
from .dict_mutable_widget import DictMutableKeysWidget
from .dict_conditional import DictConditionalWidget
from .item_widgets import (
BoolWidget,
DictImmutableKeysWidget,
@ -100,6 +102,9 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
if isinstance(entity, GUIEntity):
return GUIWidget(*args)
elif isinstance(entity, DictConditionalEntity):
return DictConditionalWidget(*args)
elif isinstance(entity, DictImmutableKeysEntity):
return DictImmutableKeysWidget(*args)
@ -183,6 +188,12 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
footer_widget = QtWidgets.QWidget(configurations_widget)
footer_layout = QtWidgets.QHBoxLayout(footer_widget)
refresh_icon = qtawesome.icon("fa.refresh", color="white")
refresh_btn = QtWidgets.QPushButton(footer_widget)
refresh_btn.setIcon(refresh_icon)
footer_layout.addWidget(refresh_btn, 0)
if self.user_role == "developer":
self._add_developer_ui(footer_layout)
@ -205,8 +216,10 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
main_layout.addWidget(configurations_widget, 1)
save_btn.clicked.connect(self._save)
refresh_btn.clicked.connect(self._on_refresh)
self.save_btn = save_btn
self.refresh_btn = refresh_btn
self.require_restart_label = require_restart_label
self.scroll_widget = scroll_widget
self.content_layout = content_layout
@ -220,10 +233,6 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
return
def _add_developer_ui(self, footer_layout):
refresh_icon = qtawesome.icon("fa.refresh", color="white")
refresh_button = QtWidgets.QPushButton()
refresh_button.setIcon(refresh_icon)
modify_defaults_widget = QtWidgets.QWidget()
modify_defaults_checkbox = QtWidgets.QCheckBox(modify_defaults_widget)
modify_defaults_checkbox.setChecked(self._hide_studio_overrides)
@ -235,10 +244,8 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
modify_defaults_layout.addWidget(label_widget)
modify_defaults_layout.addWidget(modify_defaults_checkbox)
footer_layout.addWidget(refresh_button, 0)
footer_layout.addWidget(modify_defaults_widget, 0)
refresh_button.clicked.connect(self._on_refresh)
modify_defaults_checkbox.stateChanged.connect(
self._on_modify_defaults
)

View file

@ -0,0 +1,304 @@
from Qt import QtWidgets
from .widgets import (
ExpandingWidget,
GridLabelWidget
)
from .wrapper_widgets import (
WrapperWidget,
CollapsibleWrapper,
FormWrapper
)
from .base import BaseWidget
from openpype.tools.settings import CHILD_OFFSET
class DictConditionalWidget(BaseWidget):
def create_ui(self):
self.input_fields = []
self._content_by_enum_value = {}
self._last_enum_value = None
self.label_widget = None
self.body_widget = None
self.content_widget = None
self.content_layout = None
label = None
if self.entity.is_dynamic_item:
self._ui_as_dynamic_item()
elif self.entity.use_label_wrap:
self._ui_label_wrap()
else:
self._ui_item_base()
label = self.entity.label
self._parent_widget_by_entity_id = {}
self._enum_key_by_wrapper_id = {}
self._added_wrapper_ids = set()
self.content_layout.setColumnStretch(0, 0)
self.content_layout.setColumnStretch(1, 1)
# Add enum entity to layout mapping
enum_entity = self.entity.enum_entity
self._parent_widget_by_entity_id[enum_entity.id] = self.content_widget
# Add rest of entities to wrapper mappings
for enum_key, children in self.entity.gui_layout.items():
parent_widget_by_entity_id = {}
content_widget = QtWidgets.QWidget(self.content_widget)
content_layout = QtWidgets.QGridLayout(content_widget)
content_layout.setColumnStretch(0, 0)
content_layout.setColumnStretch(1, 1)
content_layout.setContentsMargins(0, 0, 0, 0)
content_layout.setSpacing(5)
self._content_by_enum_value[enum_key] = {
"widget": content_widget,
"layout": content_layout
}
self._prepare_entity_layouts(
children,
content_widget,
parent_widget_by_entity_id
)
for item_id in parent_widget_by_entity_id.keys():
self._enum_key_by_wrapper_id[item_id] = enum_key
self._parent_widget_by_entity_id.update(parent_widget_by_entity_id)
enum_input_field = self.create_ui_for_entity(
self.category_widget, self.entity.enum_entity, self
)
self.enum_input_field = enum_input_field
self.input_fields.append(enum_input_field)
for item_key, children in self.entity.children.items():
content_widget = self._content_by_enum_value[item_key]["widget"]
row = self.content_layout.rowCount()
self.content_layout.addWidget(content_widget, row, 0, 1, 2)
for child_obj in children:
self.input_fields.append(
self.create_ui_for_entity(
self.category_widget, child_obj, self
)
)
if self.entity.use_label_wrap and self.content_layout.count() == 0:
self.body_widget.hide_toolbox(True)
self.entity_widget.add_widget_to_layout(self, label)
def _prepare_entity_layouts(
self, gui_layout, widget, parent_widget_by_entity_id
):
for child in gui_layout:
if not isinstance(child, dict):
parent_widget_by_entity_id[child.id] = widget
continue
if child["type"] == "collapsible-wrap":
wrapper = CollapsibleWrapper(child, widget)
elif child["type"] == "form":
wrapper = FormWrapper(child, widget)
else:
raise KeyError(
"Unknown Wrapper type \"{}\"".format(child["type"])
)
parent_widget_by_entity_id[wrapper.id] = widget
self._prepare_entity_layouts(
child["children"], wrapper, parent_widget_by_entity_id
)
def _ui_item_base(self):
self.setObjectName("DictInvisible")
self.content_widget = self
self.content_layout = QtWidgets.QGridLayout(self)
self.content_layout.setContentsMargins(0, 0, 0, 0)
self.content_layout.setSpacing(5)
def _ui_as_dynamic_item(self):
content_widget = QtWidgets.QWidget(self)
content_widget.setObjectName("DictAsWidgetBody")
show_borders = str(int(self.entity.show_borders))
content_widget.setProperty("show_borders", show_borders)
label_widget = QtWidgets.QLabel(self.entity.label)
content_layout = QtWidgets.QGridLayout(content_widget)
content_layout.setContentsMargins(5, 5, 5, 5)
main_layout = QtWidgets.QHBoxLayout(self)
main_layout.setContentsMargins(0, 0, 0, 0)
main_layout.setSpacing(5)
main_layout.addWidget(content_widget)
self.label_widget = label_widget
self.content_widget = content_widget
self.content_layout = content_layout
def _ui_label_wrap(self):
content_widget = QtWidgets.QWidget(self)
content_widget.setObjectName("ContentWidget")
if self.entity.highlight_content:
content_state = "hightlighted"
bottom_margin = 5
else:
content_state = ""
bottom_margin = 0
content_widget.setProperty("content_state", content_state)
content_layout_margins = (CHILD_OFFSET, 5, 0, bottom_margin)
body_widget = ExpandingWidget(self.entity.label, self)
label_widget = body_widget.label_widget
body_widget.set_content_widget(content_widget)
content_layout = QtWidgets.QGridLayout(content_widget)
content_layout.setContentsMargins(*content_layout_margins)
main_layout = QtWidgets.QHBoxLayout(self)
main_layout.setContentsMargins(0, 0, 0, 0)
main_layout.setSpacing(0)
main_layout.addWidget(body_widget)
self.label_widget = label_widget
self.body_widget = body_widget
self.content_widget = content_widget
self.content_layout = content_layout
if self.entity.collapsible:
if not self.entity.collapsed:
body_widget.toggle_content()
else:
body_widget.hide_toolbox(hide_content=False)
def add_widget_to_layout(self, widget, label=None):
if not widget.entity:
map_id = widget.id
else:
map_id = widget.entity.id
content_widget = self.content_widget
content_layout = self.content_layout
if map_id != self.entity.enum_entity.id:
enum_value = self._enum_key_by_wrapper_id[map_id]
content_widget = self._content_by_enum_value[enum_value]["widget"]
content_layout = self._content_by_enum_value[enum_value]["layout"]
wrapper = self._parent_widget_by_entity_id[map_id]
if wrapper is not content_widget:
wrapper.add_widget_to_layout(widget, label)
if wrapper.id not in self._added_wrapper_ids:
self.add_widget_to_layout(wrapper)
self._added_wrapper_ids.add(wrapper.id)
return
row = content_layout.rowCount()
if not label or isinstance(widget, WrapperWidget):
content_layout.addWidget(widget, row, 0, 1, 2)
else:
label_widget = GridLabelWidget(label, widget)
label_widget.input_field = widget
widget.label_widget = label_widget
content_layout.addWidget(label_widget, row, 0, 1, 1)
content_layout.addWidget(widget, row, 1, 1, 1)
def set_entity_value(self):
for input_field in self.input_fields:
input_field.set_entity_value()
self._on_entity_change()
def hierarchical_style_update(self):
self.update_style()
for input_field in self.input_fields:
input_field.hierarchical_style_update()
def update_style(self):
if not self.body_widget and not self.label_widget:
return
if self.entity.group_item:
group_item = self.entity.group_item
has_unsaved_changes = group_item.has_unsaved_changes
has_project_override = group_item.has_project_override
has_studio_override = group_item.has_studio_override
else:
has_unsaved_changes = self.entity.has_unsaved_changes
has_project_override = self.entity.has_project_override
has_studio_override = self.entity.has_studio_override
style_state = self.get_style_state(
self.is_invalid,
has_unsaved_changes,
has_project_override,
has_studio_override
)
if self._style_state == style_state:
return
self._style_state = style_state
if self.body_widget:
if style_state:
child_style_state = "child-{}".format(style_state)
else:
child_style_state = ""
self.body_widget.side_line_widget.setProperty(
"state", child_style_state
)
self.body_widget.side_line_widget.style().polish(
self.body_widget.side_line_widget
)
# There is nothing to update if there is no label
if not self.label_widget:
return
# Don't change the label if it is not a group or under a group item
if not self.entity.is_group and not self.entity.group_item:
return
self.label_widget.setProperty("state", style_state)
self.label_widget.style().polish(self.label_widget)
def _on_entity_change(self):
enum_value = self.enum_input_field.entity.value
if enum_value == self._last_enum_value:
return
self._last_enum_value = enum_value
for item_key, content in self._content_by_enum_value.items():
widget = content["widget"]
widget.setVisible(item_key == enum_value)
@property
def is_invalid(self):
return self._is_invalid or self._child_invalid
@property
def _child_invalid(self):
for input_field in self.input_fields:
if input_field.is_invalid:
return True
return False
def get_invalid(self):
invalid = []
for input_field in self.input_fields:
invalid.extend(input_field.get_invalid())
return invalid

View file

@ -145,7 +145,7 @@ class DictImmutableKeysWidget(BaseWidget):
self.content_widget = content_widget
self.content_layout = content_layout
if len(self.input_fields) == 1 and self.checkbox_widget:
if len(self.input_fields) == 1 and self.checkbox_child:
body_widget.hide_toolbox(hide_content=True)
elif self.entity.collapsible:


@ -34,6 +34,13 @@ class Window(QtWidgets.QDialog):
self._db = AvalonMongoDB()
self._db.install()
try:
settings = QtCore.QSettings("pypeclub", "StandalonePublisher")
except Exception:
settings = None
self._settings = settings
self.pyblish_paths = pyblish_paths
self.setWindowTitle("Standalone Publish")
@ -44,7 +51,7 @@ class Window(QtWidgets.QDialog):
self.valid_parent = False
# assets widget
widget_assets = AssetWidget(dbcon=self._db, parent=self)
widget_assets = AssetWidget(self._db, settings, self)
# family widget
widget_family = FamilyWidget(dbcon=self._db, parent=self)


@ -127,11 +127,12 @@ class AssetWidget(QtWidgets.QWidget):
current_changed = QtCore.Signal() # on view current index change
task_changed = QtCore.Signal()
def __init__(self, dbcon, parent=None):
def __init__(self, dbcon, settings, parent=None):
super(AssetWidget, self).__init__(parent=parent)
self.setContentsMargins(0, 0, 0, 0)
self.dbcon = dbcon
self._settings = settings
layout = QtWidgets.QVBoxLayout()
layout.setContentsMargins(0, 0, 0, 0)
@ -139,6 +140,10 @@ class AssetWidget(QtWidgets.QWidget):
# Project
self.combo_projects = QtWidgets.QComboBox()
# Change delegate so stylesheets are applied
project_delegate = QtWidgets.QStyledItemDelegate(self.combo_projects)
self.combo_projects.setItemDelegate(project_delegate)
self._set_projects()
self.combo_projects.currentTextChanged.connect(self.on_project_change)
# Tree View
@ -198,6 +203,7 @@ class AssetWidget(QtWidgets.QWidget):
self.selection_changed.connect(self._refresh_tasks)
self.project_delegate = project_delegate
self.task_view = task_view
self.task_model = task_model
self.refreshButton = refresh
@ -237,15 +243,59 @@ class AssetWidget(QtWidgets.QWidget):
output.extend(self.get_parents(parent))
return output
def _get_last_projects(self):
if not self._settings:
return []
project_names = []
for project_name in self._settings.value("projects", "").split("|"):
if project_name:
project_names.append(project_name)
return project_names
def _add_last_project(self, project_name):
if not self._settings:
return
last_projects = []
for _project_name in self._settings.value("projects", "").split("|"):
if _project_name:
last_projects.append(_project_name)
if project_name in last_projects:
last_projects.remove(project_name)
last_projects.insert(0, project_name)
while len(last_projects) > 5:
last_projects.pop(-1)
self._settings.setValue("projects", "|".join(last_projects))
def _set_projects(self):
projects = list()
project_names = list()
for project in self.dbcon.projects():
projects.append(project['name'])
project_name = project.get("name")
if project_name:
project_names.append(project_name)
self.combo_projects.clear()
if len(projects) > 0:
self.combo_projects.addItems(projects)
self.dbcon.Session["AVALON_PROJECT"] = projects[0]
if not project_names:
return
sorted_project_names = list(sorted(project_names))
self.combo_projects.addItems(sorted_project_names)
last_project = sorted_project_names[0]
for project_name in self._get_last_projects():
if project_name in sorted_project_names:
last_project = project_name
break
index = sorted_project_names.index(last_project)
self.combo_projects.setCurrentIndex(index)
self.dbcon.Session["AVALON_PROJECT"] = last_project
def on_project_change(self):
projects = list()
@ -254,6 +304,7 @@ class AssetWidget(QtWidgets.QWidget):
project_name = self.combo_projects.currentText()
if project_name in projects:
self.dbcon.Session["AVALON_PROJECT"] = project_name
self._add_last_project(project_name)
self.project_changed.emit(project_name)
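The `_add_last_project` / `_get_last_projects` pair above maintains a most-recently-used list of project names, capped at five entries and serialized as a pipe-separated string. A minimal sketch of that behavior with a plain string in place of the QSettings store (function name hypothetical):

```python
def add_last_project(stored, project_name, limit=5):
    """Move project_name to the front of a pipe-separated MRU string."""
    last_projects = [name for name in stored.split("|") if name]
    if project_name in last_projects:
        last_projects.remove(project_name)
    last_projects.insert(0, project_name)
    # Keep only the most recent `limit` entries.
    del last_projects[limit:]
    return "|".join(last_projects)
```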


@ -944,10 +944,8 @@ class Window(QtWidgets.QMainWindow):
split_widget.addWidget(tasks_widget)
split_widget.addWidget(files_widget)
split_widget.addWidget(side_panel)
split_widget.setStretchFactor(0, 1)
split_widget.setStretchFactor(1, 1)
split_widget.setStretchFactor(2, 3)
split_widget.setStretchFactor(3, 1)
split_widget.setSizes([255, 160, 455, 175])
body_layout.addWidget(split_widget)
# Add top margin for tasks to align it visually with files as
@ -976,7 +974,7 @@ class Window(QtWidgets.QMainWindow):
# Force focus on the open button by default, required for Houdini.
files_widget.btn_open.setFocus()
self.resize(1000, 600)
self.resize(1200, 600)
def keyPressEvent(self, event):
"""Custom keyPressEvent.


@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.2.0-nightly.2"
__version__ = "3.3.0-nightly.2"

@ -1 +1 @@
Subproject commit efde026e5aad72dac0e69848005419e2c4f067f2
Subproject commit d8be0bdb37961e32243f1de0eb9696e86acf7443

start.py

@ -135,18 +135,36 @@ if sys.__stdout__:
def _print(message: str):
if message.startswith("!!! "):
print("{}{}".format(term.orangered2("!!! "), message[4:]))
return
if message.startswith(">>> "):
print("{}{}".format(term.aquamarine3(">>> "), message[4:]))
return
if message.startswith("--- "):
print("{}{}".format(term.darkolivegreen3("--- "), message[4:]))
if message.startswith(" "):
print("{}{}".format(term.darkseagreen3(" "), message[4:]))
return
if message.startswith("*** "):
print("{}{}".format(term.gold("*** "), message[4:]))
return
if message.startswith(" - "):
print("{}{}".format(term.wheat(" - "), message[4:]))
return
if message.startswith(" . "):
print("{}{}".format(term.tan(" . "), message[4:]))
return
if message.startswith(" - "):
print("{}{}".format(term.seagreen3(" - "), message[7:]))
return
if message.startswith(" ! "):
print("{}{}".format(term.goldenrod(" ! "), message[7:]))
return
if message.startswith(" * "):
print("{}{}".format(term.aquamarine1(" * "), message[7:]))
return
if message.startswith(" "):
print("{}{}".format(term.darkseagreen3(" "), message[4:]))
return
print(message)
else:
def _print(message: str):
print(message)
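The chain of `startswith` checks above maps message prefixes to terminal colors. The same dispatch can be sketched as a lookup table, which also derives the slice offset from the prefix length instead of hard-coding it (names and color labels are hypothetical, standing in for the `blessed` terminal calls):

```python
# Hypothetical color names standing in for the terminal color functions.
PREFIX_STYLES = {
    "!!! ": "orangered2",
    ">>> ": "aquamarine3",
    "--- ": "darkolivegreen3",
    "*** ": "gold",
}

def classify(message):
    """Split a message into (style, body) based on its prefix."""
    for prefix, style in PREFIX_STYLES.items():
        if message.startswith(prefix):
            # Deriving the offset from the prefix avoids hard-coded slices.
            return style, message[len(prefix):]
    return None, message
```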
@ -175,6 +193,17 @@ silent_commands = ["run", "igniter", "standalonepublisher",
"extractenvironments"]
def list_versions(openpype_versions: list, local_version=None) -> None:
"""Print list of detected versions."""
_print(" - Detected versions:")
for v in sorted(openpype_versions):
_print(f" - {v}: {v.path}")
if not openpype_versions:
_print(" ! none in repository detected")
if local_version:
_print(f" * local version {local_version}")
def set_openpype_global_environments() -> None:
"""Set global OpenPype's environments."""
import acre
@ -303,22 +332,37 @@ def _process_arguments() -> tuple:
# check for `--use-version=3.0.0` argument and `--use-staging`
use_version = None
use_staging = False
print_versions = False
for arg in sys.argv:
if arg == "--use-version":
_print("!!! Please use option --use-version like:")
_print(" --use-version=3.0.0")
sys.exit(1)
m = re.search(
r"--use-version=(?P<version>\d+\.\d+\.\d+(?:\S*)?)", arg)
if m and m.group('version'):
use_version = m.group('version')
sys.argv.remove(arg)
break
if arg.startswith("--use-version="):
m = re.search(
r"--use-version=(?P<version>\d+\.\d+\.\d+(?:\S*)?)", arg)
if m and m.group('version'):
use_version = m.group('version')
_print(">>> Requested version [ {} ]".format(use_version))
sys.argv.remove(arg)
if "+staging" in use_version:
use_staging = True
break
else:
_print("!!! Requested version isn't in correct format.")
_print((" Use --list-versions to find out"
" proper version string."))
sys.exit(1)
if "--use-staging" in sys.argv:
use_staging = True
sys.argv.remove("--use-staging")
if "--list-versions" in sys.argv:
print_versions = True
sys.argv.remove("--list-versions")
# handle igniter
# this is helper to run igniter before anything else
if "igniter" in sys.argv:
@ -334,7 +378,7 @@ def _process_arguments() -> tuple:
sys.argv.pop(idx)
sys.argv.insert(idx, "tray")
return use_version, use_staging
return use_version, use_staging, print_versions
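The argument loop above recognizes `--use-version=<semver>` and flips `use_staging` when the version string carries a `+staging` suffix. A small sketch of just that parsing step, using the same regex (helper name hypothetical):

```python
import re

# Same pattern the argument loop uses for --use-version=<semver>.
VERSION_ARG_RE = re.compile(r"--use-version=(?P<version>\d+\.\d+\.\d+(?:\S*)?)")

def parse_use_version(arg):
    """Return (version, use_staging) from a --use-version argument,
    or (None, False) when it does not match."""
    m = VERSION_ARG_RE.search(arg)
    if not m:
        return None, False
    version = m.group("version")
    return version, "+staging" in version
```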
def _determine_mongodb() -> str:
@ -487,7 +531,7 @@ def _find_frozen_openpype(use_version: str = None,
openpype_version = openpype_versions[-1]
except IndexError:
_print(("!!! Something is wrong and we didn't "
"find it again."))
"find it again."))
sys.exit(1)
elif return_code != 2:
_print(f" . finished ({return_code})")
@ -500,7 +544,7 @@ def _find_frozen_openpype(use_version: str = None,
_print("*** Still no luck finding OpenPype.")
_print(("*** We'll try to use the one coming "
"with OpenPype installation."))
version_path = _bootstrap_from_code(use_version)
version_path = _bootstrap_from_code(use_version, use_staging)
openpype_version = OpenPypeVersion(
version=BootstrapRepos.get_version(version_path),
path=version_path)
@ -519,13 +563,8 @@ def _find_frozen_openpype(use_version: str = None,
if found:
openpype_version = sorted(found)[-1]
if not openpype_version:
_print(f"!!! requested version {use_version} was not found.")
if openpype_versions:
_print(" - found: ")
for v in sorted(openpype_versions):
_print(f" - {v}: {v.path}")
_print(f" - local version {local_version}")
_print(f"!!! Requested version {use_version} was not found.")
list_versions(openpype_versions, local_version)
sys.exit(1)
# test if latest detected is installed (in user data dir)
@ -560,7 +599,7 @@ def _find_frozen_openpype(use_version: str = None,
return openpype_version.path
def _bootstrap_from_code(use_version):
def _bootstrap_from_code(use_version, use_staging):
"""Bootstrap live code (or the one coming with frozen OpenPype).
Args:
@ -575,15 +614,33 @@ def _bootstrap_from_code(use_version):
_openpype_root = OPENPYPE_ROOT
if getattr(sys, 'frozen', False):
local_version = bootstrap.get_version(Path(_openpype_root))
_print(f" - running version: {local_version}")
switch_str = f" - will switch to {use_version}" if use_version else ""
_print(f" - booting version: {local_version}{switch_str}")
assert local_version
else:
# get current version of OpenPype
local_version = bootstrap.get_local_live_version()
if use_version and use_version != local_version:
version_to_use = None
openpype_versions = bootstrap.find_openpype(include_zips=True)
version_to_use = None
openpype_versions = bootstrap.find_openpype(
include_zips=True, staging=use_staging)
if use_staging and not use_version:
try:
version_to_use = openpype_versions[-1]
except IndexError:
_print("!!! No staging versions are found.")
list_versions(openpype_versions, local_version)
sys.exit(1)
if version_to_use.path.is_file():
version_to_use.path = bootstrap.extract_openpype(
version_to_use)
bootstrap.add_paths_from_directory(version_to_use.path)
os.environ["OPENPYPE_VERSION"] = str(version_to_use)
version_path = version_to_use.path
os.environ["OPENPYPE_REPOS_ROOT"] = (version_path / "openpype").as_posix() # noqa: E501
_openpype_root = version_to_use.path.as_posix()
elif use_version and use_version != local_version:
v: OpenPypeVersion
found = [v for v in openpype_versions if str(v) == use_version]
if found:
@ -600,13 +657,8 @@ def _bootstrap_from_code(use_version):
os.environ["OPENPYPE_REPOS_ROOT"] = (version_path / "openpype").as_posix() # noqa: E501
_openpype_root = version_to_use.path.as_posix()
else:
_print(f"!!! requested version {use_version} was not found.")
if openpype_versions:
_print(" - found: ")
for v in sorted(openpype_versions):
_print(f" - {v}: {v.path}")
_print(f" - local version {local_version}")
_print(f"!!! Requested version {use_version} was not found.")
list_versions(openpype_versions, local_version)
sys.exit(1)
else:
os.environ["OPENPYPE_VERSION"] = local_version
@ -675,11 +727,16 @@ def boot():
# Process arguments
# ------------------------------------------------------------------------
use_version, use_staging = _process_arguments()
use_version, use_staging, print_versions = _process_arguments()
if os.getenv("OPENPYPE_VERSION"):
use_staging = "staging" in os.getenv("OPENPYPE_VERSION")
use_version = os.getenv("OPENPYPE_VERSION")
if use_version:
_print(("*** environment variable OPENPYPE_VERSION"
" is overridden by command line argument."))
else:
_print(">>> version set by environment variable")
use_staging = "staging" in os.getenv("OPENPYPE_VERSION")
use_version = os.getenv("OPENPYPE_VERSION")
# ------------------------------------------------------------------------
# Determine mongodb connection
@ -704,6 +761,24 @@ def boot():
if not os.getenv("OPENPYPE_PATH") and openpype_path:
os.environ["OPENPYPE_PATH"] = openpype_path
if print_versions:
if not use_staging:
_print("--- This will list only non-staging versions detected.")
_print(" To see staging versions, use the --use-staging argument.")
else:
_print("--- This will list only staging versions detected.")
_print(" To see other versions, omit the --use-staging argument.")
_openpype_root = OPENPYPE_ROOT
openpype_versions = bootstrap.find_openpype(include_zips=True,
staging=use_staging)
if getattr(sys, 'frozen', False):
local_version = bootstrap.get_version(Path(_openpype_root))
else:
local_version = bootstrap.get_local_live_version()
list_versions(openpype_versions, local_version)
sys.exit(1)
# ------------------------------------------------------------------------
# Find OpenPype versions
# ------------------------------------------------------------------------
@ -718,7 +793,7 @@ def boot():
_print(f"!!! {e}")
sys.exit(1)
else:
version_path = _bootstrap_from_code(use_version)
version_path = _bootstrap_from_code(use_version, use_staging)
# set this to point either to `python` from venv in case of live code
# or to `openpype` or `openpype_console` in case of frozen code
@ -754,7 +829,7 @@ def boot():
from openpype.version import __version__
assert version_path, "Version path not defined."
info = get_info()
info = get_info(use_staging)
info.insert(0, f">>> Using OpenPype from [ {version_path} ]")
t_width = 20
@ -781,7 +856,7 @@ def boot():
sys.exit(1)
def get_info() -> list:
def get_info(use_staging=None) -> list:
"""Print additional information to console."""
from openpype.lib.mongo import get_default_components
from openpype.lib.log import PypeLogger
@ -789,7 +864,7 @@ def get_info() -> list:
components = get_default_components()
inf = []
if not getattr(sys, 'frozen', False):
if use_staging:
inf.append(("OpenPype variant", "staging"))
else:
inf.append(("OpenPype variant", "production"))


@ -330,8 +330,8 @@ def test_find_openpype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
assert result[-1].path == expected_path, ("not the latest version of "
"OpenPype 3")
printer("testing finding OpenPype in OPENPYPE_PATH ...")
monkeypatch.setenv("OPENPYPE_PATH", e_path.as_posix())
result = fix_bootstrap.find_openpype(include_zips=True)
# we should have results as file were created
assert result is not None, "no OpenPype version found"
@ -349,6 +349,8 @@ def test_find_openpype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
monkeypatch.delenv("OPENPYPE_PATH", raising=False)
printer("testing finding OpenPype in user data dir ...")
# mock appdirs user_data_dir
def mock_user_data_dir(*args, **kwargs):
"""Mock local app data dir."""
@ -373,18 +375,7 @@ def test_find_openpype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
assert result[-1].path == expected_path, ("not the latest version of "
"OpenPype 2")
result = fix_bootstrap.find_openpype(e_path, include_zips=True)
assert result is not None, "no OpenPype version found"
expected_path = Path(
e_path / "{}{}{}".format(
test_versions_1[5].prefix,
test_versions_1[5].version,
test_versions_1[5].suffix
)
)
assert result[-1].path == expected_path, ("not the latest version of "
"OpenPype 1")
printer("testing finding OpenPype zip/dir precedence ...")
result = fix_bootstrap.find_openpype(dir_path, include_zips=True)
assert result is not None, "no OpenPype versions found"
expected_path = Path(


@ -83,7 +83,8 @@ function Show-PSWarning() {
function Install-Poetry() {
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Installing Poetry ... "
(Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py -UseBasicParsing).Content | python -
$env:POETRY_HOME="$openpype_root\.poetry"
(Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py -UseBasicParsing).Content | python -
}
$art = @"
@ -115,11 +116,9 @@ $openpype_root = (Get-Item $script_dir).parent.FullName
$env:_INSIDE_OPENPYPE_TOOL = "1"
# make sure Poetry is in PATH
if (-not (Test-Path 'env:POETRY_HOME')) {
$env:POETRY_HOME = "$openpype_root\.poetry"
}
$env:PATH = "$($env:PATH);$($env:POETRY_HOME)\bin"
Set-Location -Path $openpype_root
@ -164,7 +163,7 @@ Write-Host " ]" -ForegroundColor white
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Write-Host "*** " -NoNewline -ForegroundColor Yellow
Write-Host "We need to install Poetry to create virtual env first ..."
@ -184,10 +183,10 @@ Write-Host ">>> " -NoNewline -ForegroundColor green
Write-Host "Building OpenPype ..."
$startTime = [int][double]::Parse((Get-Date -UFormat %s))
$out = & poetry run python setup.py build 2>&1
$out = & "$($env:POETRY_HOME)\bin\poetry" run python setup.py build 2>&1
Set-Content -Path "$($openpype_root)\build\build.log" -Value $out
if ($LASTEXITCODE -ne 0)
{
Set-Content -Path "$($openpype_root)\build\build.log" -Value $out
Write-Host "!!! " -NoNewLine -ForegroundColor Red
Write-Host "Build failed. Check the log: " -NoNewline
Write-Host ".\build\build.log" -ForegroundColor Yellow
@ -195,7 +194,7 @@ if ($LASTEXITCODE -ne 0)
}
Set-Content -Path "$($openpype_root)\build\build.log" -Value $out
& poetry run python "$($openpype_root)\tools\build_dependencies.py"
& "$($env:POETRY_HOME)\bin\poetry" run python "$($openpype_root)\tools\build_dependencies.py"
Write-Host ">>> " -NoNewline -ForegroundColor green
Write-Host "restoring current directory"


@ -140,21 +140,6 @@ realpath () {
echo $(cd $(dirname "$1") || return; pwd)/$(basename "$1")
}
##############################################################################
# Install Poetry when needed
# Globals:
# PATH
# Arguments:
# None
# Returns:
# None
###############################################################################
install_poetry () {
echo -e "${BIGreen}>>>${RST} Installing Poetry ..."
command -v curl >/dev/null 2>&1 || { echo -e "${BIRed}!!!${RST}${BIYellow} Missing ${RST}${BIBlue}curl${BIYellow} command.${RST}"; return 1; }
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
}
# Main
main () {
echo -e "${BGreen}"
@ -171,11 +156,9 @@ main () {
_inside_openpype_tool="1"
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
echo -e "${BIYellow}---${RST} Cleaning build directory ..."
rm -rf "$openpype_root/build" && mkdir "$openpype_root/build" > /dev/null
@ -201,11 +184,11 @@ if [ "$disable_submodule_update" == 1 ]; then
fi
echo -e "${BIGreen}>>>${RST} Building ..."
if [[ "$OSTYPE" == "linux-gnu"* ]]; then
poetry run python "$openpype_root/setup.py" build > "$openpype_root/build/build.log" || { echo -e "${BIRed}!!!${RST} Build failed, see the build log."; return; }
"$POETRY_HOME/bin/poetry" run python "$openpype_root/setup.py" build > "$openpype_root/build/build.log" || { echo -e "${BIRed}!!!${RST} Build failed, see the build log."; return; }
elif [[ "$OSTYPE" == "darwin"* ]]; then
poetry run python "$openpype_root/setup.py" bdist_mac > "$openpype_root/build/build.log" || { echo -e "${BIRed}!!!${RST} Build failed, see the build log."; return; }
"$POETRY_HOME/bin/poetry" run python "$openpype_root/setup.py" bdist_mac > "$openpype_root/build/build.log" || { echo -e "${BIRed}!!!${RST} Build failed, see the build log."; return; }
fi
poetry run python "$openpype_root/tools/build_dependencies.py"
"$POETRY_HOME/bin/poetry" run python "$openpype_root/tools/build_dependencies.py"
if [[ "$OSTYPE" == "darwin"* ]]; then
# fix code signing issue


@ -64,14 +64,6 @@ function Show-PSWarning() {
}
}
function Install-Poetry() {
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Installing Poetry ... "
(Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py -UseBasicParsing).Content | python -
# add it to PATH
$env:PATH = "$($env:PATH);$($env:USERPROFILE)\.poetry\bin"
}
$art = @"
. . .. . ..


@ -49,9 +49,7 @@ function Install-Poetry() {
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Installing Poetry ... "
$env:POETRY_HOME="$openpype_root\.poetry"
(Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py -UseBasicParsing).Content | python -
# add it to PATH
$env:PATH = "$($env:PATH);$openpype_root\.poetry\bin"
(Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py -UseBasicParsing).Content | python -
}
@ -94,11 +92,10 @@ $current_dir = Get-Location
$script_dir = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
$openpype_root = (Get-Item $script_dir).parent.FullName
# make sure Poetry is in PATH
if (-not (Test-Path 'env:POETRY_HOME')) {
$env:POETRY_HOME = "$openpype_root\.poetry"
}
$env:PATH = "$($env:PATH);$($env:POETRY_HOME)\bin"
Set-Location -Path $openpype_root
@ -145,7 +142,7 @@ Test-Python
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Install-Poetry
Write-Host "INSTALLED" -ForegroundColor Cyan
@ -160,7 +157,7 @@ if (-not (Test-Path -PathType Leaf -Path "$($openpype_root)\poetry.lock")) {
Write-Host ">>> " -NoNewline -ForegroundColor green
Write-Host "Installing virtual environment from lock."
}
& poetry install --no-root $poetry_verbosity
& "$env:POETRY_HOME\bin\poetry" install --no-root $poetry_verbosity --ansi
if ($LASTEXITCODE -ne 0) {
Write-Host "!!! " -ForegroundColor yellow -NoNewline
Write-Host "Poetry command failed."


@ -109,8 +109,7 @@ install_poetry () {
echo -e "${BIGreen}>>>${RST} Installing Poetry ..."
export POETRY_HOME="$openpype_root/.poetry"
command -v curl >/dev/null 2>&1 || { echo -e "${BIRed}!!!${RST}${BIYellow} Missing ${RST}${BIBlue}curl${BIYellow} command.${RST}"; return 1; }
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
export PATH="$PATH:$POETRY_HOME/bin"
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py | python -
}
##############################################################################
@ -154,11 +153,10 @@ main () {
# Directories
openpype_root=$(realpath $(dirname $(dirname "${BASH_SOURCE[0]}")))
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
pushd "$openpype_root" > /dev/null || return > /dev/null
@ -177,7 +175,7 @@ main () {
echo -e "${BIGreen}>>>${RST} Installing dependencies ..."
fi
poetry install --no-root $poetry_verbosity || { echo -e "${BIRed}!!!${RST} Poetry environment installation failed"; return; }
"$POETRY_HOME/bin/poetry" install --no-root $poetry_verbosity || { echo -e "${BIRed}!!!${RST} Poetry environment installation failed"; return; }
echo -e "${BIGreen}>>>${RST} Cleaning cache files ..."
clean_pyc
@ -186,10 +184,10 @@ main () {
# cx_freeze will crash on missing __pychache__ on these but
# reinstalling them solves the problem.
echo -e "${BIGreen}>>>${RST} Fixing pycache bug ..."
poetry run python -m pip install --force-reinstall pip
poetry run pip install --force-reinstall setuptools
poetry run pip install --force-reinstall wheel
poetry run python -m pip install --force-reinstall pip
"$POETRY_HOME/bin/poetry" run python -m pip install --force-reinstall pip
"$POETRY_HOME/bin/poetry" run pip install --force-reinstall setuptools
"$POETRY_HOME/bin/poetry" run pip install --force-reinstall wheel
"$POETRY_HOME/bin/poetry" run python -m pip install --force-reinstall pip
}
main -3


@ -45,11 +45,9 @@ $openpype_root = (Get-Item $script_dir).parent.FullName
$env:_INSIDE_OPENPYPE_TOOL = "1"
# make sure Poetry is in PATH
if (-not (Test-Path 'env:POETRY_HOME')) {
$env:POETRY_HOME = "$openpype_root\.poetry"
}
$env:PATH = "$($env:PATH);$($env:POETRY_HOME)\bin"
Set-Location -Path $openpype_root
@ -87,7 +85,7 @@ if (-not $openpype_version) {
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Write-Host "*** " -NoNewline -ForegroundColor Yellow
Write-Host "We need to install Poetry to create virtual env first ..."
@ -107,5 +105,5 @@ Write-Host ">>> " -NoNewline -ForegroundColor green
Write-Host "Generating zip from current sources ..."
$env:PYTHONPATH="$($openpype_root);$($env:PYTHONPATH)"
$env:OPENPYPE_ROOT="$($openpype_root)"
& poetry run python "$($openpype_root)\tools\create_zip.py" $ARGS
& "$($env:POETRY_HOME)\bin\poetry" run python "$($openpype_root)\tools\create_zip.py" $ARGS
Set-Location -Path $current_dir


@ -114,11 +114,9 @@ main () {
_inside_openpype_tool="1"
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
pushd "$openpype_root" > /dev/null || return > /dev/null
@ -134,7 +132,7 @@ main () {
echo -e "${BIGreen}>>>${RST} Generating zip from current sources ..."
PYTHONPATH="$openpype_root:$PYTHONPATH"
OPENPYPE_ROOT="$openpype_root"
poetry run python3 "$openpype_root/tools/create_zip.py" "$@"
"$POETRY_HOME/bin/poetry" run python3 "$openpype_root/tools/create_zip.py" "$@"
}
main "$@"


@ -17,18 +17,15 @@ $openpype_root = (Get-Item $script_dir).parent.FullName
$env:_INSIDE_OPENPYPE_TOOL = "1"
# make sure Poetry is in PATH
if (-not (Test-Path 'env:POETRY_HOME')) {
$env:POETRY_HOME = "$openpype_root\.poetry"
}
$env:PATH = "$($env:PATH);$($env:POETRY_HOME)\bin"
Set-Location -Path $openpype_root
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Write-Host "*** " -NoNewline -ForegroundColor Yellow
Write-Host "We need to install Poetry to create virtual env first ..."
@ -37,5 +34,5 @@ if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
Write-Host "OK" -ForegroundColor Green
}
& poetry run python "$($openpype_root)\tools\fetch_thirdparty_libs.py"
& "$($env:POETRY_HOME)\bin\poetry" run python "$($openpype_root)\tools\fetch_thirdparty_libs.py"
Set-Location -Path $current_dir


@ -1,8 +1,5 @@
#!/usr/bin/env bash
# Run Pype Tray
art () {
cat <<-EOF
@ -82,11 +79,9 @@ main () {
_inside_openpype_tool="1"
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
echo -e "${BIGreen}>>>${RST} Reading Poetry ... \c"
if [ -f "$POETRY_HOME/bin/poetry" ]; then
@ -100,7 +95,7 @@ main () {
pushd "$openpype_root" > /dev/null || return > /dev/null
echo -e "${BIGreen}>>>${RST} Running Pype tool ..."
poetry run python "$openpype_root/tools/fetch_thirdparty_libs.py"
"$POETRY_HOME/bin/poetry" run python "$openpype_root/tools/fetch_thirdparty_libs.py"
}
main


@ -19,11 +19,9 @@ $openpype_root = (Get-Item $script_dir).parent.FullName
$env:_INSIDE_OPENPYPE_TOOL = "1"
# make sure Poetry is in PATH
if (-not (Test-Path 'env:POETRY_HOME')) {
$env:POETRY_HOME = "$openpype_root\.poetry"
}
$env:PATH = "$($env:PATH);$($env:POETRY_HOME)\bin"
Set-Location -Path $openpype_root
@ -50,7 +48,7 @@ Write-Host $art -ForegroundColor DarkGreen
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Write-Host "*** " -NoNewline -ForegroundColor Yellow
Write-Host "We need to install Poetry to create virtual env first ..."
@ -63,10 +61,10 @@ Write-Host "This will not overwrite existing source rst files, only scan and add
Set-Location -Path $openpype_root
Write-Host ">>> " -NoNewline -ForegroundColor green
Write-Host "Running apidoc ..."
& poetry run sphinx-apidoc -M -e -d 10 --ext-intersphinx --ext-todo --ext-coverage --ext-viewcode -o "$($openpype_root)\docs\source" igniter
& poetry run sphinx-apidoc.exe -M -e -d 10 --ext-intersphinx --ext-todo --ext-coverage --ext-viewcode -o "$($openpype_root)\docs\source" openpype vendor, openpype\vendor
& "$env:POETRY_HOME\bin\poetry" run sphinx-apidoc -M -e -d 10 --ext-intersphinx --ext-todo --ext-coverage --ext-viewcode -o "$($openpype_root)\docs\source" igniter
& "$env:POETRY_HOME\bin\poetry" run sphinx-apidoc.exe -M -e -d 10 --ext-intersphinx --ext-todo --ext-coverage --ext-viewcode -o "$($openpype_root)\docs\source" openpype vendor, openpype\vendor
Write-Host ">>> " -NoNewline -ForegroundColor green
Write-Host "Building html ..."
& poetry run python "$($openpype_root)\setup.py" build_sphinx
& "$env:POETRY_HOME\bin\poetry" run python "$($openpype_root)\setup.py" build_sphinx
Set-Location -Path $current_dir


@ -83,11 +83,9 @@ main () {
_inside_openpype_tool="1"
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
echo -e "${BIGreen}>>>${RST} Reading Poetry ... \c"
if [ -f "$POETRY_HOME/bin/poetry" ]; then
@ -101,11 +99,11 @@ main () {
pushd "$openpype_root" > /dev/null || return > /dev/null
echo -e "${BIGreen}>>>${RST} Running apidoc ..."
poetry run sphinx-apidoc -M -e -d 10 --ext-intersphinx --ext-todo --ext-coverage --ext-viewcode -o "$openpype_root/docs/source" igniter
poetry run sphinx-apidoc -M -e -d 10 --ext-intersphinx --ext-todo --ext-coverage --ext-viewcode -o "$openpype_root/docs/source" openpype vendor, openpype\vendor
"$POETRY_HOME/bin/poetry" run sphinx-apidoc -M -e -d 10 --ext-intersphinx --ext-todo --ext-coverage --ext-viewcode -o "$openpype_root/docs/source" igniter
"$POETRY_HOME/bin/poetry" run sphinx-apidoc -M -e -d 10 --ext-intersphinx --ext-todo --ext-coverage --ext-viewcode -o "$openpype_root/docs/source" openpype vendor, openpype\vendor
echo -e "${BIGreen}>>>${RST} Building html ..."
poetry run python3 "$openpype_root/setup.py" build_sphinx
"$POETRY_HOME/bin/poetry" run python3 "$openpype_root/setup.py" build_sphinx
}
main


@ -47,7 +47,7 @@ Set-Location -Path $openpype_root
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Write-Host "*** " -NoNewline -ForegroundColor Yellow
Write-Host "We need to install Poetry to create virtual env first ..."
@ -56,5 +56,5 @@ if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
Write-Host "OK" -ForegroundColor Green
}
& poetry run python "$($openpype_root)\start.py" projectmanager
& "$env:POETRY_HOME\bin\poetry" run python "$($openpype_root)\start.py" projectmanager
Set-Location -Path $current_dir


@ -79,11 +79,9 @@ main () {
_inside_openpype_tool="1"
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
pushd "$openpype_root" > /dev/null || return > /dev/null
@ -97,7 +95,7 @@ main () {
fi
echo -e "${BIGreen}>>>${RST} Generating zip from current sources ..."
poetry run python "$openpype_root/start.py" projectmanager
"$POETRY_HOME/bin/poetry" run python "$openpype_root/start.py" projectmanager
}
main


@ -27,7 +27,7 @@ Set-Location -Path $openpype_root
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Write-Host "*** " -NoNewline -ForegroundColor Yellow
Write-Host "We need to install Poetry and create virtual env first ..."
@ -36,5 +36,5 @@ if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
Write-Host "OK" -ForegroundColor Green
}
& poetry run python "$($openpype_root)\start.py" settings --dev
& "$env:POETRY_HOME\bin\poetry" run python "$($openpype_root)\start.py" settings --dev
Set-Location -Path $current_dir


@ -79,11 +79,9 @@ main () {
_inside_openpype_tool="1"
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
pushd "$openpype_root" > /dev/null || return > /dev/null
@ -97,7 +95,7 @@ main () {
fi
echo -e "${BIGreen}>>>${RST} Generating zip from current sources ..."
poetry run python3 "$openpype_root/start.py" settings --dev
"$POETRY_HOME/bin/poetry" run python3 "$openpype_root/start.py" settings --dev
}
main


@ -59,11 +59,9 @@ $openpype_root = (Get-Item $script_dir).parent.FullName
$env:_INSIDE_OPENPYPE_TOOL = "1"
# make sure Poetry is in PATH
if (-not (Test-Path 'env:POETRY_HOME')) {
$env:POETRY_HOME = "$openpype_root\.poetry"
}
$env:PATH = "$($env:PATH);$($env:POETRY_HOME)\bin"
Set-Location -Path $openpype_root
@ -83,7 +81,7 @@ Write-Host " ] ..." -ForegroundColor white
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Write-Host "*** " -NoNewline -ForegroundColor Yellow
Write-Host "We need to install Poetry and create virtual env first ..."
@ -102,7 +100,7 @@ Write-Host ">>> " -NoNewline -ForegroundColor green
Write-Host "Testing OpenPype ..."
$original_pythonpath = $env:PYTHONPATH
$env:PYTHONPATH="$($openpype_root);$($env:PYTHONPATH)"
& poetry run pytest -x --capture=sys --print -W ignore::DeprecationWarning "$($openpype_root)/tests"
& "$env:POETRY_HOME\bin\poetry" run pytest -x --capture=sys --print -W ignore::DeprecationWarning "$($openpype_root)/tests"
$env:PYTHONPATH = $original_pythonpath
Write-Host ">>> " -NoNewline -ForegroundColor green


@ -98,11 +98,9 @@ main () {
_inside_openpype_tool="1"
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
echo -e "${BIGreen}>>>${RST} Reading Poetry ... \c"
if [ -f "$POETRY_HOME/bin/poetry" ]; then
@ -118,7 +116,7 @@ main () {
echo -e "${BIGreen}>>>${RST} Testing OpenPype ..."
original_pythonpath=$PYTHONPATH
export PYTHONPATH="$openpype_root:$PYTHONPATH"
poetry run pytest -x --capture=sys --print -W ignore::DeprecationWarning "$openpype_root/tests"
"$POETRY_HOME/bin/poetry" run pytest -x --capture=sys --print -W ignore::DeprecationWarning "$openpype_root/tests"
PYTHONPATH=$original_pythonpath
}


@ -26,7 +26,7 @@ Set-Location -Path $openpype_root
Write-Host ">>> " -NoNewline -ForegroundColor Green
Write-Host "Reading Poetry ... " -NoNewline
if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Host "NOT FOUND" -ForegroundColor Yellow
Write-Host "*** " -NoNewline -ForegroundColor Yellow
Write-Host "We need to install Poetry and create virtual env first ..."
@ -35,5 +35,5 @@ if (-not (Test-Path -PathType Container -Path "$openpype_root\.poetry\bin")) {
Write-Host "OK" -ForegroundColor Green
}
& poetry run python "$($openpype_root)\start.py" tray --debug
& "$($env:POETRY_HOME)\bin\poetry" run python "$($openpype_root)\start.py" tray --debug
Set-Location -Path $current_dir


@ -56,11 +56,9 @@ main () {
_inside_openpype_tool="1"
# make sure Poetry is in PATH
if [[ -z $POETRY_HOME ]]; then
export POETRY_HOME="$openpype_root/.poetry"
fi
export PATH="$POETRY_HOME/bin:$PATH"
echo -e "${BIGreen}>>>${RST} Reading Poetry ... \c"
if [ -f "$POETRY_HOME/bin/poetry" ]; then
@ -74,7 +72,7 @@ main () {
pushd "$openpype_root" > /dev/null || return > /dev/null
echo -e "${BIGreen}>>>${RST} Running OpenPype Tray with debug option ..."
poetry run python3 "$openpype_root/start.py" tray --debug
"$POETRY_HOME/bin/poetry" run python3 "$openpype_root/start.py" tray --debug
}
main


@ -21,6 +21,8 @@ openpype_console --use-version=3.0.0-foo+bar
`--use-staging` - to use staging versions of OpenPype.
`--list-versions [--use-staging]` - to list available versions.
For more information [see here](admin_use#run-openpype).
## Commands


@ -46,6 +46,16 @@ openpype_console --use-version=3.0.1
`--use-staging` - to specify you prefer staging version. In that case it will be used
(if found) instead of production one.
:::tip List available versions
To list all available versions, use:
```shell
openpype_console --list-versions
```
You can add `--use-staging` to list staging versions.
:::
### Details
When you run OpenPype from the executable, a few checks are made:


@ -137,12 +137,12 @@ $ pyenv install -v 3.7.10
$ cd /path/to/pype-3
# set local python version
$ pyenv local 3.7.9
$ pyenv local 3.7.10
```
:::note Install build requirements for **Ubuntu**
```shell
sudo apt-get update; sudo apt-get install --no-install-recommends make build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev xz-utils tk-dev libxml2-dev libxmlsec1-dev libffi-dev liblzma-dev git
sudo apt-get update; sudo apt-get install --no-install-recommends make build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev xz-utils tk-dev libxml2-dev libxmlsec1-dev libffi-dev liblzma-dev git patchelf
```
In case you run in error about `xcb` when running Pype,


@ -34,7 +34,7 @@ To prepare Ftrack for working with OpenPype you'll need to run [OpenPype Admin -
Ftrack Event Server is the key to automation of many tasks like _status change_, _thumbnail update_, _automatic synchronization to Avalon database_ and many more. Event server should run at all times to perform the required processing as it is not possible to catch some of them retrospectively with enough certainty.
### Running event server
There are specific launch arguments for event server. With `openpype eventserver` you can launch event server but without prior preparation it will terminate immediately. The reason is that event server requires 3 pieces of information: _Ftrack server url_, _paths to events_ and _credentials (Username and API key)_. Ftrack server URL and Event path are set from OpenPype's environments by default, but the credentials must be set separately for security reasons.
There are specific launch arguments for event server. With `openpype_console eventserver` you can launch event server but without prior preparation it will terminate immediately. The reason is that event server requires 3 pieces of information: _Ftrack server url_, _paths to events_ and _credentials (Username and API key)_. Ftrack server URL and Event path are set from OpenPype's environments by default, but the credentials must be set separately for security reasons.
@ -56,7 +56,7 @@ There are specific launch arguments for event server. With `openpype eventserver
- `--ftrack-url "https://yourdomain.ftrackapp.com/"` : Ftrack server URL _(it is not needed to enter if you have set `FTRACK_SERVER` in OpenPype' environments)_
- `--ftrack-events-path "//Paths/To/Events/"` : Paths to events folder. May contain multiple paths separated by `;`. _(it is not needed to enter if you have set `FTRACK_EVENTS_PATH` in OpenPype' environments)_
So if you want to use OpenPype's environments then you can launch event server for the first time with these arguments `openpype.exe eventserver --ftrack-user "my.username" --ftrack-api-key "00000aaa-11bb-22cc-33dd-444444eeeee" --store-credentials`. After that, if everything was entered correctly, you can launch event server with `openpype.exe eventserver`.
So if you want to use OpenPype's environments then you can launch event server for the first time with these arguments `openpype_console.exe eventserver --ftrack-user "my.username" --ftrack-api-key "00000aaa-11bb-22cc-33dd-444444eeeee" --store-credentials`. After that, if everything was entered correctly, you can launch event server with `openpype_console.exe eventserver`.
</TabItem>
<TabItem value="env">
@ -100,14 +100,17 @@ Event server should **not** run more than once! It may cause major issues.
<TabItem value="linux">
- create file:
`sudo vi /opt/OpenPype/run_event_server.sh`
`sudo vi /opt/openpype/run_event_server.sh`
- add content to the file:
```sh
#!\usr\bin\env
#!/usr/bin/env
export OPENPYPE_DEBUG=3
pushd /mnt/pipeline/prod/openpype-setup
. openpype eventserver --ftrack-user <openpype-admin-user> --ftrack-api-key <api-key>
. openpype_console eventserver --ftrack-user <openpype-admin-user> --ftrack-api-key <api-key>
```
- change file permission:
`sudo chmod 0755 /opt/openpype/run_event_server.sh`
- create service file:
`sudo vi /etc/systemd/system/openpype-ftrack-event-server.service`
- add content to the service file
@ -145,7 +148,7 @@ WantedBy=multi-user.target
@echo off
set OPENPYPE_DEBUG=3
pushd \\path\to\file\
call openpype.bat eventserver --ftrack-user <openpype-admin-user> --ftrack-api-key <api-key>
openpype_console.exe eventserver --ftrack-user <openpype-admin-user> --ftrack-api-key <api-key>
```
- download and install `nssm.cc`
- create Windows service according to nssm.cc manual


@ -2667,15 +2667,6 @@ cli-boxes@^2.2.1:
resolved "https://registry.yarnpkg.com/cli-boxes/-/cli-boxes-2.2.1.tgz#ddd5035d25094fce220e9cab40a45840a440318f"
integrity sha512-y4coMcylgSCdVinjiDBuR8PCC2bLjyGTwEmPb9NHR/QaNU6EUOXcTY/s6VjGMD6ENSEaeQYHCY0GNGS5jfMwPw==
clipboard@^2.0.0:
version "2.0.8"
resolved "https://registry.yarnpkg.com/clipboard/-/clipboard-2.0.8.tgz#ffc6c103dd2967a83005f3f61976aa4655a4cdba"
integrity sha512-Y6WO0unAIQp5bLmk1zdThRhgJt/x3ks6f30s3oE3H1mgIEU33XyQjEf8gsf6DxC7NPX8Y1SsNWjUjL/ywLnnbQ==
dependencies:
good-listener "^1.2.2"
select "^1.1.2"
tiny-emitter "^2.0.0"
cliui@^5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/cliui/-/cliui-5.0.0.tgz#deefcfdb2e800784aa34f46fa08e06851c7bbbc5"
@ -3310,11 +3301,6 @@ del@^6.0.0:
rimraf "^3.0.2"
slash "^3.0.0"
delegate@^3.1.2:
version "3.2.0"
resolved "https://registry.yarnpkg.com/delegate/-/delegate-3.2.0.tgz#b66b71c3158522e8ab5744f720d8ca0c2af59166"
integrity sha512-IofjkYBZaZivn0V8nnsMJGBr4jVLxHDheKSW88PyxS5QC4Vo9ZbZVvhzlSxY87fVq3STR6r+4cGepyHkcWOQSw==
depd@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/depd/-/depd-1.1.2.tgz#9bcd52e14c097763e749b274c4346ed2e560b5a9"
@ -4224,13 +4210,6 @@ globby@^6.1.0:
pify "^2.0.0"
pinkie-promise "^2.0.0"
good-listener@^1.2.2:
version "1.2.2"
resolved "https://registry.yarnpkg.com/good-listener/-/good-listener-1.2.2.tgz#d53b30cdf9313dffb7dc9a0d477096aa6d145c50"
integrity sha1-1TswzfkxPf+33JoNR3CWqm0UXFA=
dependencies:
delegate "^3.1.2"
got@^9.6.0:
version "9.6.0"
resolved "https://registry.yarnpkg.com/got/-/got-9.6.0.tgz#edf45e7d67f99545705de1f7bbeeeb121765ed85"
@ -6615,11 +6594,9 @@ prism-react-renderer@^1.1.1:
integrity sha512-GHqzxLYImx1iKN1jJURcuRoA/0ygCcNhfGw1IT8nPIMzarmKQ3Nc+JcG0gi8JXQzuh0C5ShE4npMIoqNin40hg==
prismjs@^1.23.0:
version "1.23.0"
resolved "https://registry.yarnpkg.com/prismjs/-/prismjs-1.23.0.tgz#d3b3967f7d72440690497652a9d40ff046067f33"
integrity sha512-c29LVsqOaLbBHuIbsTxaKENh1N2EQBOHaWv7gkHN4dgRbxSREqDnDbtFJYdpPauS4YCplMSNCABQ6Eeor69bAA==
optionalDependencies:
clipboard "^2.0.0"
version "1.24.0"
resolved "https://registry.yarnpkg.com/prismjs/-/prismjs-1.24.0.tgz#0409c30068a6c52c89ef7f1089b3ca4de56be2ac"
integrity sha512-SqV5GRsNqnzCL8k5dfAjCNhUrF3pR0A9lTDSCUZeh/LIshheXJEaP0hwLz2t4XHivd2J/v2HR+gRnigzeKe3cQ==
process-nextick-args@~2.0.0:
version "2.0.1"
@ -7390,11 +7367,6 @@ select-hose@^2.0.0:
resolved "https://registry.yarnpkg.com/select-hose/-/select-hose-2.0.0.tgz#625d8658f865af43ec962bfc376a37359a4994ca"
integrity sha1-Yl2GWPhlr0Psliv8N2o3NZpJlMo=
select@^1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/select/-/select-1.1.2.tgz#0e7350acdec80b1108528786ec1d4418d11b396d"
integrity sha1-DnNQrN7ICxEIUoeG7B1EGNEbOW0=
selfsigned@^1.10.8:
version "1.10.8"
resolved "https://registry.yarnpkg.com/selfsigned/-/selfsigned-1.10.8.tgz#0d17208b7d12c33f8eac85c41835f27fc3d81a30"
@ -8016,11 +7988,6 @@ timsort@^0.3.0:
resolved "https://registry.yarnpkg.com/timsort/-/timsort-0.3.0.tgz#405411a8e7e6339fe64db9a234de11dc31e02bd4"
integrity sha1-QFQRqOfmM5/mTbmiNN4R3DHgK9Q=
tiny-emitter@^2.0.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/tiny-emitter/-/tiny-emitter-2.1.0.tgz#1d1a56edfc51c43e863cbb5382a72330e3555423"
integrity sha512-NB6Dk1A9xgQPMoGqC5CVXn123gWyte215ONT5Pp5a0yt4nlEoO1ZWeCwpncaekPHXO60i47ihFnZPiRPjRMq4Q==
tiny-invariant@^1.0.2:
version "1.1.0"
resolved "https://registry.yarnpkg.com/tiny-invariant/-/tiny-invariant-1.1.0.tgz#634c5f8efdc27714b7f386c35e6760991d230875"