diff --git a/CHANGELOG.md b/CHANGELOG.md index f3c7820d8f..f767bc71d5 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,8 +1,63 @@ # Changelog -## [3.9.1](https://github.com/pypeclub/OpenPype/tree/3.9.1) (2022-03-17) +## [3.9.2-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD) -[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.0...3.9.1) +[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.1...HEAD) + +### 📖 Documentation + +- Docs: Added MongoDB requirements [\#2951](https://github.com/pypeclub/OpenPype/pull/2951) + +**🆕 New features** + +- Multiverse: First PR [\#2908](https://github.com/pypeclub/OpenPype/pull/2908) + +**🚀 Enhancements** + +- Slack: Added configurable maximum file size of review upload to Slack [\#2945](https://github.com/pypeclub/OpenPype/pull/2945) +- NewPublisher: Prepared implementation of optional pyblish plugin [\#2943](https://github.com/pypeclub/OpenPype/pull/2943) +- Workfiles: Open published workfiles [\#2925](https://github.com/pypeclub/OpenPype/pull/2925) +- General: Default modules loaded dynamically [\#2923](https://github.com/pypeclub/OpenPype/pull/2923) +- CI: change the version bump logic [\#2919](https://github.com/pypeclub/OpenPype/pull/2919) +- Deadline: Add headless argument [\#2916](https://github.com/pypeclub/OpenPype/pull/2916) +- Nuke: Add no-audio Tag [\#2911](https://github.com/pypeclub/OpenPype/pull/2911) +- Ftrack: Fill workfile in custom attribute [\#2906](https://github.com/pypeclub/OpenPype/pull/2906) +- Nuke: improving readability [\#2903](https://github.com/pypeclub/OpenPype/pull/2903) +- Settings UI: Add simple tooltips for settings entities [\#2901](https://github.com/pypeclub/OpenPype/pull/2901) + +**🐛 Bug fixes** + +- Slack: Added default for review\_upload\_limit for Slack [\#2965](https://github.com/pypeclub/OpenPype/pull/2965) +- Settings: Conditional dictionary avoid invalid logs [\#2956](https://github.com/pypeclub/OpenPype/pull/2956) +- LogViewer: Don't refresh on 
initialization [\#2949](https://github.com/pypeclub/OpenPype/pull/2949) +- nuke: python3 compatibility issue with `iteritems` [\#2948](https://github.com/pypeclub/OpenPype/pull/2948) +- General: anatomy data with correct task short key [\#2947](https://github.com/pypeclub/OpenPype/pull/2947) +- SceneInventory: Fix imports in UI [\#2944](https://github.com/pypeclub/OpenPype/pull/2944) +- Slack: add generic exception [\#2941](https://github.com/pypeclub/OpenPype/pull/2941) +- General: Python specific vendor paths on env injection [\#2939](https://github.com/pypeclub/OpenPype/pull/2939) +- General: More fail safe delete old versions [\#2936](https://github.com/pypeclub/OpenPype/pull/2936) +- Settings UI: Collapsed of collapsible wrapper works as expected [\#2934](https://github.com/pypeclub/OpenPype/pull/2934) +- Maya: Do not pass `set` to maya commands \(fixes support for older maya versions\) [\#2932](https://github.com/pypeclub/OpenPype/pull/2932) +- General: Don't print log record on OSError [\#2926](https://github.com/pypeclub/OpenPype/pull/2926) +- Hiero: Fix import of 'register\_event\_callback' [\#2924](https://github.com/pypeclub/OpenPype/pull/2924) +- Ftrack: Missing Ftrack id after editorial publish [\#2905](https://github.com/pypeclub/OpenPype/pull/2905) +- AfterEffects: Fix rendering for single frame in DL [\#2875](https://github.com/pypeclub/OpenPype/pull/2875) + +**🔀 Refactored code** + +- General: Move Attribute Definitions from pipeline [\#2931](https://github.com/pypeclub/OpenPype/pull/2931) +- General: Removed silo references and terminal splash [\#2927](https://github.com/pypeclub/OpenPype/pull/2927) +- General: Move pipeline constants to OpenPype [\#2918](https://github.com/pypeclub/OpenPype/pull/2918) +- General: Move formatting and workfile functions [\#2914](https://github.com/pypeclub/OpenPype/pull/2914) +- General: Move remaining plugins from avalon [\#2912](https://github.com/pypeclub/OpenPype/pull/2912) + +**Merged pull requests:** + +- 
Maya - added transparency into review creator [\#2952](https://github.com/pypeclub/OpenPype/pull/2952) + +## [3.9.1](https://github.com/pypeclub/OpenPype/tree/3.9.1) (2022-03-18) + +[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.1-nightly.3...3.9.1) **🚀 Enhancements** @@ -51,10 +106,6 @@ - Maya: add loaded containers to published instance [\#2837](https://github.com/pypeclub/OpenPype/pull/2837) - Ftrack: Can sync fps as string [\#2836](https://github.com/pypeclub/OpenPype/pull/2836) - General: Custom function for find executable [\#2822](https://github.com/pypeclub/OpenPype/pull/2822) -- General: Color dialog UI fixes [\#2817](https://github.com/pypeclub/OpenPype/pull/2817) -- global: letter box calculated on output as last process [\#2812](https://github.com/pypeclub/OpenPype/pull/2812) -- Nuke: adding Reformat to baking mov plugin [\#2811](https://github.com/pypeclub/OpenPype/pull/2811) -- Manager: Update all to latest button [\#2805](https://github.com/pypeclub/OpenPype/pull/2805) **🐛 Bug fixes** @@ -81,7 +132,6 @@ - Settings UI: Fix "Apply from" action [\#2820](https://github.com/pypeclub/OpenPype/pull/2820) - Ftrack: Job killer with missing user [\#2819](https://github.com/pypeclub/OpenPype/pull/2819) - Nuke: Use AVALON\_APP to get value for "app" key [\#2818](https://github.com/pypeclub/OpenPype/pull/2818) -- StandalonePublisher: use dynamic groups in subset names [\#2816](https://github.com/pypeclub/OpenPype/pull/2816) **🔀 Refactored code** @@ -92,7 +142,6 @@ - General: Move change context functions [\#2839](https://github.com/pypeclub/OpenPype/pull/2839) - Tools: Don't use avalon tools code [\#2829](https://github.com/pypeclub/OpenPype/pull/2829) - Move Unreal Implementation to OpenPype [\#2823](https://github.com/pypeclub/OpenPype/pull/2823) -- General: Extract template formatting from anatomy [\#2766](https://github.com/pypeclub/OpenPype/pull/2766) ## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07) diff --git 
a/openpype/__init__.py b/openpype/__init__.py index 99629a4257..8b94b2dc3f 100644 --- a/openpype/__init__.py +++ b/openpype/__init__.py @@ -78,6 +78,7 @@ def install(): from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, + register_inventory_action, ) from avalon import pipeline @@ -124,7 +125,7 @@ def install(): pyblish.register_plugin_path(path) register_loader_plugin_path(path) avalon.register_plugin_path(LegacyCreator, path) - avalon.register_plugin_path(avalon.InventoryAction, path) + register_inventory_action(path) # apply monkey patched discover to original one log.info("Patching discovery") diff --git a/openpype/hooks/pre_python_2_prelaunch.py b/openpype/hooks/pre_python_2_prelaunch.py deleted file mode 100644 index 84272d2e5d..0000000000 --- a/openpype/hooks/pre_python_2_prelaunch.py +++ /dev/null @@ -1,35 +0,0 @@ -import os -from openpype.lib import PreLaunchHook - - -class PrePython2Vendor(PreLaunchHook): - """Prepend python 2 dependencies for py2 hosts.""" - order = 10 - - def execute(self): - if not self.application.use_python_2: - return - - # Prepare vendor dir path - self.log.info("adding global python 2 vendor") - pype_root = os.getenv("OPENPYPE_REPOS_ROOT") - python_2_vendor = os.path.join( - pype_root, - "openpype", - "vendor", - "python", - "python_2" - ) - - # Add Python 2 modules - python_paths = [ - python_2_vendor - ] - - # Load PYTHONPATH from current launch context - python_path = self.launch_context.env.get("PYTHONPATH") - if python_path: - python_paths.append(python_path) - - # Set new PYTHONPATH to launch context environments - self.launch_context.env["PYTHONPATH"] = os.pathsep.join(python_paths) diff --git a/openpype/hosts/aftereffects/api/pipeline.py b/openpype/hosts/aftereffects/api/pipeline.py index 681f1c51a7..bb9affc9b6 100644 --- a/openpype/hosts/aftereffects/api/pipeline.py +++ b/openpype/hosts/aftereffects/api/pipeline.py @@ -2,10 +2,11 @@ import os import sys from Qt import QtWidgets +from bson.objectid 
import ObjectId import pyblish.api import avalon.api -from avalon import io, pipeline +from avalon import io from openpype import lib from openpype.api import Logger @@ -13,6 +14,7 @@ from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) import openpype.hosts.aftereffects from openpype.lib import register_event_callback @@ -29,7 +31,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins") PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish") LOAD_PATH = os.path.join(PLUGINS_DIR, "load") CREATE_PATH = os.path.join(PLUGINS_DIR, "create") -INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory") def check_inventory(): @@ -42,7 +43,7 @@ def check_inventory(): representation = container['representation'] representation_doc = io.find_one( { - "_id": io.ObjectId(representation), + "_id": ObjectId(representation), "type": "representation" }, projection={"parent": True} @@ -149,7 +150,7 @@ def containerise(name, """ data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "name": name, "namespace": namespace, "loader": str(loader), diff --git a/openpype/hosts/aftereffects/api/workio.py b/openpype/hosts/aftereffects/api/workio.py index 04c7834d8f..5a8f86ead5 100644 --- a/openpype/hosts/aftereffects/api/workio.py +++ b/openpype/hosts/aftereffects/api/workio.py @@ -1,8 +1,8 @@ """Host API required Work Files tool""" import os +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS from .launch_logic import get_stub -from avalon import api def _active_document(): @@ -14,7 +14,7 @@ def _active_document(): def file_extensions(): - return api.HOST_WORKFILE_EXTENSIONS["aftereffects"] + return HOST_WORKFILE_EXTENSIONS["aftereffects"] def has_unsaved_changes(): diff --git a/openpype/hosts/blender/api/ops.py b/openpype/hosts/blender/api/ops.py index 3069c3e1c9..29d6d356c8 100644 --- a/openpype/hosts/blender/api/ops.py +++ b/openpype/hosts/blender/api/ops.py @@ 
-328,7 +328,6 @@ class LaunchWorkFiles(LaunchQtApp): result = super().execute(context) self._window.set_context({ "asset": avalon.api.Session["AVALON_ASSET"], - "silo": avalon.api.Session["AVALON_SILO"], "task": avalon.api.Session["AVALON_TASK"] }) return result diff --git a/openpype/hosts/blender/api/pipeline.py b/openpype/hosts/blender/api/pipeline.py index 07a7509dd7..8c580cf214 100644 --- a/openpype/hosts/blender/api/pipeline.py +++ b/openpype/hosts/blender/api/pipeline.py @@ -12,12 +12,12 @@ from . import ops import pyblish.api import avalon.api from avalon import io, schema -from avalon.pipeline import AVALON_CONTAINER_ID from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) from openpype.api import Logger from openpype.lib import ( @@ -31,7 +31,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins") PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish") LOAD_PATH = os.path.join(PLUGINS_DIR, "load") CREATE_PATH = os.path.join(PLUGINS_DIR, "create") -INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory") ORIGINAL_EXCEPTHOOK = sys.excepthook diff --git a/openpype/hosts/blender/api/workio.py b/openpype/hosts/blender/api/workio.py index fd68761982..5eb9f82999 100644 --- a/openpype/hosts/blender/api/workio.py +++ b/openpype/hosts/blender/api/workio.py @@ -4,7 +4,8 @@ from pathlib import Path from typing import List, Optional import bpy -from avalon import api + +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS class OpenFileCacher: @@ -77,7 +78,7 @@ def has_unsaved_changes() -> bool: def file_extensions() -> List[str]: """Return the supported file extensions for Blender scene files.""" - return api.HOST_WORKFILE_EXTENSIONS["blender"] + return HOST_WORKFILE_EXTENSIONS["blender"] def work_root(session: dict) -> str: diff --git a/openpype/hosts/blender/plugins/load/load_abc.py b/openpype/hosts/blender/plugins/load/load_abc.py index 3daaeceffe..1b2e800769 100644 --- 
a/openpype/hosts/blender/plugins/load/load_abc.py +++ b/openpype/hosts/blender/plugins/load/load_abc.py @@ -6,11 +6,14 @@ from typing import Dict, List, Optional import bpy -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID, +) + from openpype.hosts.blender.api.pipeline import ( AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) from openpype.hosts.blender.api import plugin, lib diff --git a/openpype/hosts/blender/plugins/load/load_audio.py b/openpype/hosts/blender/plugins/load/load_audio.py index b95c5db270..3f4fcc17de 100644 --- a/openpype/hosts/blender/plugins/load/load_audio.py +++ b/openpype/hosts/blender/plugins/load/load_audio.py @@ -6,12 +6,14 @@ from typing import Dict, List, Optional import bpy -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID, +) from openpype.hosts.blender.api import plugin from openpype.hosts.blender.api.pipeline import ( AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) diff --git a/openpype/hosts/blender/plugins/load/load_camera_blend.py b/openpype/hosts/blender/plugins/load/load_camera_blend.py index 6ed2e8a575..f00027f0b4 100644 --- a/openpype/hosts/blender/plugins/load/load_camera_blend.py +++ b/openpype/hosts/blender/plugins/load/load_camera_blend.py @@ -7,12 +7,14 @@ from typing import Dict, List, Optional import bpy -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID, +) from openpype.hosts.blender.api import plugin from openpype.hosts.blender.api.pipeline import ( AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) logger = logging.getLogger("openpype").getChild( diff --git a/openpype/hosts/blender/plugins/load/load_camera_fbx.py b/openpype/hosts/blender/plugins/load/load_camera_fbx.py index 626ed44f08..97f844e610 100644 --- 
a/openpype/hosts/blender/plugins/load/load_camera_fbx.py +++ b/openpype/hosts/blender/plugins/load/load_camera_fbx.py @@ -6,12 +6,14 @@ from typing import Dict, List, Optional import bpy -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID, +) from openpype.hosts.blender.api import plugin, lib from openpype.hosts.blender.api.pipeline import ( AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) diff --git a/openpype/hosts/blender/plugins/load/load_fbx.py b/openpype/hosts/blender/plugins/load/load_fbx.py index 2d249ef647..ee2e7d175c 100644 --- a/openpype/hosts/blender/plugins/load/load_fbx.py +++ b/openpype/hosts/blender/plugins/load/load_fbx.py @@ -6,12 +6,14 @@ from typing import Dict, List, Optional import bpy -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID, +) from openpype.hosts.blender.api import plugin, lib from openpype.hosts.blender.api.pipeline import ( AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) diff --git a/openpype/hosts/blender/plugins/load/load_layout_blend.py b/openpype/hosts/blender/plugins/load/load_layout_blend.py index d87df3c010..cf8e89ed1f 100644 --- a/openpype/hosts/blender/plugins/load/load_layout_blend.py +++ b/openpype/hosts/blender/plugins/load/load_layout_blend.py @@ -10,12 +10,12 @@ from openpype import lib from openpype.pipeline import ( legacy_create, get_representation_path, + AVALON_CONTAINER_ID, ) from openpype.hosts.blender.api import plugin from openpype.hosts.blender.api.pipeline import ( AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) diff --git a/openpype/hosts/blender/plugins/load/load_layout_json.py b/openpype/hosts/blender/plugins/load/load_layout_json.py index 0693937fec..a0580af4a0 100644 --- a/openpype/hosts/blender/plugins/load/load_layout_json.py +++ b/openpype/hosts/blender/plugins/load/load_layout_json.py @@ 
-13,12 +13,12 @@ from openpype.pipeline import ( load_container, get_representation_path, loaders_from_representation, + AVALON_CONTAINER_ID, ) from openpype.hosts.blender.api.pipeline import ( AVALON_INSTANCES, AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) from openpype.hosts.blender.api import plugin diff --git a/openpype/hosts/blender/plugins/load/load_model.py b/openpype/hosts/blender/plugins/load/load_model.py index 18d01dcb29..0a5d98ffa0 100644 --- a/openpype/hosts/blender/plugins/load/load_model.py +++ b/openpype/hosts/blender/plugins/load/load_model.py @@ -6,12 +6,14 @@ from typing import Dict, List, Optional import bpy -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID, +) from openpype.hosts.blender.api import plugin from openpype.hosts.blender.api.pipeline import ( AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) diff --git a/openpype/hosts/blender/plugins/load/load_rig.py b/openpype/hosts/blender/plugins/load/load_rig.py index cec088076c..4dfa96167f 100644 --- a/openpype/hosts/blender/plugins/load/load_rig.py +++ b/openpype/hosts/blender/plugins/load/load_rig.py @@ -10,6 +10,7 @@ from openpype import lib from openpype.pipeline import ( legacy_create, get_representation_path, + AVALON_CONTAINER_ID, ) from openpype.hosts.blender.api import ( plugin, @@ -18,7 +19,6 @@ from openpype.hosts.blender.api import ( from openpype.hosts.blender.api.pipeline import ( AVALON_CONTAINERS, AVALON_PROPERTY, - AVALON_CONTAINER_ID ) diff --git a/openpype/hosts/blender/plugins/publish/extract_layout.py b/openpype/hosts/blender/plugins/publish/extract_layout.py index cc7c90f4c8..b78a193d81 100644 --- a/openpype/hosts/blender/plugins/publish/extract_layout.py +++ b/openpype/hosts/blender/plugins/publish/extract_layout.py @@ -1,6 +1,8 @@ import os import json +from bson.objectid import ObjectId + import bpy import bpy_extras import bpy_extras.anim_utils @@ -140,7 
+142,7 @@ class ExtractLayout(openpype.api.Extractor): blend = io.find_one( { "type": "representation", - "parent": io.ObjectId(parent), + "parent": ObjectId(parent), "name": "blend" }, projection={"_id": True}) @@ -151,7 +153,7 @@ class ExtractLayout(openpype.api.Extractor): fbx = io.find_one( { "type": "representation", - "parent": io.ObjectId(parent), + "parent": ObjectId(parent), "name": "fbx" }, projection={"_id": True}) @@ -162,7 +164,7 @@ class ExtractLayout(openpype.api.Extractor): abc = io.find_one( { "type": "representation", - "parent": io.ObjectId(parent), + "parent": ObjectId(parent), "name": "abc" }, projection={"_id": True}) diff --git a/openpype/hosts/flame/api/lib.py b/openpype/hosts/flame/api/lib.py index 74d9e7607a..aa2cfcb96d 100644 --- a/openpype/hosts/flame/api/lib.py +++ b/openpype/hosts/flame/api/lib.py @@ -18,6 +18,7 @@ log = Logger.get_logger(__name__) FRAME_PATTERN = re.compile(r"[\._](\d+)[\.]") + class CTX: # singleton used for passing data between api modules app_framework = None @@ -538,9 +539,17 @@ def get_segment_attributes(segment): # head and tail with forward compatibility if segment.head: - clip_data["segment_head"] = int(segment.head) + # `infinite` can also be returned + if isinstance(segment.head, str): + clip_data["segment_head"] = 0 + else: + clip_data["segment_head"] = int(segment.head) if segment.tail: - clip_data["segment_tail"] = int(segment.tail) + # `infinite` can also be returned + if isinstance(segment.tail, str): + clip_data["segment_tail"] = 0 + else: + clip_data["segment_tail"] = int(segment.tail) # add all available shot tokens shot_tokens = _get_shot_tokens_values(segment, [ diff --git a/openpype/hosts/flame/api/pipeline.py b/openpype/hosts/flame/api/pipeline.py index 930c6abe29..ca3f38c1bc 100644 --- a/openpype/hosts/flame/api/pipeline.py +++ b/openpype/hosts/flame/api/pipeline.py @@ -4,13 +4,14 @@ Basic avalon integration import os import contextlib from avalon import api as avalon -from avalon.pipeline
import AVALON_CONTAINER_ID from pyblish import api as pyblish + from openpype.api import Logger from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) from .lib import ( set_segment_data_marker, @@ -26,7 +27,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins") PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish") LOAD_PATH = os.path.join(PLUGINS_DIR, "load") CREATE_PATH = os.path.join(PLUGINS_DIR, "create") -INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory") AVALON_CONTAINERS = "AVALON_CONTAINERS" @@ -34,12 +34,10 @@ log = Logger.get_logger(__name__) def install(): - pyblish.register_host("flame") pyblish.register_plugin_path(PUBLISH_PATH) register_loader_plugin_path(LOAD_PATH) avalon.register_plugin_path(LegacyCreator, CREATE_PATH) - avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH) log.info("OpenPype Flame plug-ins registered ...") # register callback for switching publishable @@ -47,6 +45,7 @@ log.info("OpenPype Flame host installed ...") + def uninstall(): pyblish.deregister_host("flame") @@ -54,7 +53,6 @@ pyblish.deregister_plugin_path(PUBLISH_PATH) deregister_loader_plugin_path(LOAD_PATH) avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH) - avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH) # register callback for switching publishable pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled) diff --git a/openpype/hosts/flame/api/scripts/wiretap_com.py b/openpype/hosts/flame/api/scripts/wiretap_com.py index ee906c2608..54993d34eb 100644 --- a/openpype/hosts/flame/api/scripts/wiretap_com.py +++ b/openpype/hosts/flame/api/scripts/wiretap_com.py @@ -422,7 +422,13 @@ class WireTapCom(object): color_policy = color_policy or "Legacy" # check if the colour policy in custom dir - if not os.path.exists(color_policy): + if "/" in color_policy: + # in the unlikely case a full path was used, strip the redundant prefix + 
color_policy = color_policy.replace("/syncolor/policies/", "") + # expected input is `Shared/NameOfPolicy` + color_policy = "/syncolor/policies/{}".format( + color_policy) + else: color_policy = "/syncolor/policies/Autodesk/{}".format( color_policy) diff --git a/openpype/hosts/flame/plugins/publish/collect_timeline_instances.py b/openpype/hosts/flame/plugins/publish/collect_timeline_instances.py index 70340ad7a2..2482abd9c7 100644 --- a/openpype/hosts/flame/plugins/publish/collect_timeline_instances.py +++ b/openpype/hosts/flame/plugins/publish/collect_timeline_instances.py @@ -34,119 +34,125 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin): def process(self, context): project = context.data["flameProject"] sequence = context.data["flameSequence"] + selected_segments = context.data["flameSelectedSegments"] + self.log.debug("__ selected_segments: {}".format(selected_segments)) + self.otio_timeline = context.data["otioTimeline"] self.clips_in_reels = opfapi.get_clips_in_reels(project) self.fps = context.data["fps"] # process all selected - with opfapi.maintained_segment_selection(sequence) as segments: - for segment in segments: - comment_attributes = self._get_comment_attributes(segment) - self.log.debug("_ comment_attributes: {}".format( - pformat(comment_attributes))) + for segment in selected_segments: + # get openpype tag data + marker_data = opfapi.get_segment_data_marker(segment) + self.log.debug("__ marker_data: {}".format( + pformat(marker_data))) - clip_data = opfapi.get_segment_attributes(segment) - clip_name = clip_data["segment_name"] - self.log.debug("clip_name: {}".format(clip_name)) + if not marker_data: + continue - # get openpype tag data - marker_data = opfapi.get_segment_data_marker(segment) - self.log.debug("__ marker_data: {}".format( + pformat(marker_data))) - if not marker_data: - continue + if marker_data.get("id") != "pyblish.avalon.instance": + continue - self.log.debug("__ segment.name: {}".format( + segment.name + )) - if
marker_data.get("id") != "pyblish.avalon.instance": - continue + comment_attributes = self._get_comment_attributes(segment) - # get file path - file_path = clip_data["fpath"] + self.log.debug("_ comment_attributes: {}".format( + pformat(comment_attributes))) - # get source clip - source_clip = self._get_reel_clip(file_path) + clip_data = opfapi.get_segment_attributes(segment) + clip_name = clip_data["segment_name"] + self.log.debug("clip_name: {}".format(clip_name)) - first_frame = opfapi.get_frame_from_filename(file_path) or 0 + # get file path + file_path = clip_data["fpath"] - head, tail = self._get_head_tail(clip_data, first_frame) + # get source clip + source_clip = self._get_reel_clip(file_path) - # solve handles length - marker_data["handleStart"] = min( - marker_data["handleStart"], head) - marker_data["handleEnd"] = min( - marker_data["handleEnd"], tail) + first_frame = opfapi.get_frame_from_filename(file_path) or 0 - with_audio = bool(marker_data.pop("audio")) + head, tail = self._get_head_tail(clip_data, first_frame) - # add marker data to instance data - inst_data = dict(marker_data.items()) + # solve handles length + marker_data["handleStart"] = min( + marker_data["handleStart"], head) + marker_data["handleEnd"] = min( + marker_data["handleEnd"], tail) - asset = marker_data["asset"] - subset = marker_data["subset"] + with_audio = bool(marker_data.pop("audio")) - # insert family into families - family = marker_data["family"] - families = [str(f) for f in marker_data["families"]] - families.insert(0, str(family)) + # add marker data to instance data + inst_data = dict(marker_data.items()) - # form label - label = asset - if asset != clip_name: - label += " ({})".format(clip_name) - label += " {}".format(subset) - label += " {}".format("[" + ", ".join(families) + "]") + asset = marker_data["asset"] + subset = marker_data["subset"] - inst_data.update({ - "name": "{}_{}".format(asset, subset), - "label": label, - "asset": asset, - "item": segment, - 
"families": families, - "publish": marker_data["publish"], - "fps": self.fps, - "flameSourceClip": source_clip, - "sourceFirstFrame": int(first_frame), - "path": file_path - }) + # insert family into families + family = marker_data["family"] + families = [str(f) for f in marker_data["families"]] + families.insert(0, str(family)) - # get otio clip data - otio_data = self._get_otio_clip_instance_data(clip_data) or {} - self.log.debug("__ otio_data: {}".format(pformat(otio_data))) + # form label + label = asset + if asset != clip_name: + label += " ({})".format(clip_name) + label += " {} [{}]".format(subset, ", ".join(families)) - # add to instance data - inst_data.update(otio_data) - self.log.debug("__ inst_data: {}".format(pformat(inst_data))) + inst_data.update({ + "name": "{}_{}".format(asset, subset), + "label": label, + "asset": asset, + "item": segment, + "families": families, + "publish": marker_data["publish"], + "fps": self.fps, + "flameSourceClip": source_clip, + "sourceFirstFrame": int(first_frame), + "path": file_path + }) - # add resolution - self._get_resolution_to_data(inst_data, context) + # get otio clip data + otio_data = self._get_otio_clip_instance_data(clip_data) or {} + self.log.debug("__ otio_data: {}".format(pformat(otio_data))) - # add comment attributes if any - inst_data.update(comment_attributes) + # add to instance data + inst_data.update(otio_data) + self.log.debug("__ inst_data: {}".format(pformat(inst_data))) - # create instance - instance = context.create_instance(**inst_data) + # add resolution + self._get_resolution_to_data(inst_data, context) - # add colorspace data - instance.data.update({ - "versionData": { - "colorspace": clip_data["colour_space"], - } - }) + # add comment attributes if any + inst_data.update(comment_attributes) - # create shot instance for shot attributes create/update - self._create_shot_instance(context, clip_name, **inst_data) + # create instance + instance = context.create_instance(**inst_data) - 
self.log.info("Creating instance: {}".format(instance)) - self.log.info( - "_ instance.data: {}".format(pformat(instance.data))) + # add colorspace data + instance.data.update({ + "versionData": { + "colorspace": clip_data["colour_space"], + } + }) - if not with_audio: - continue + # create shot instance for shot attributes create/update + self._create_shot_instance(context, clip_name, **inst_data) - # add audioReview attribute to plate instance data - # if reviewTrack is on - if marker_data.get("reviewTrack") is not None: - instance.data["reviewAudio"] = True + self.log.info("Creating instance: {}".format(instance)) + self.log.info( + "_ instance.data: {}".format(pformat(instance.data))) + + if not with_audio: + continue + + # add audioReview attribute to plate instance data + # if reviewTrack is on + if marker_data.get("reviewTrack") is not None: + instance.data["reviewAudio"] = True def _get_comment_attributes(self, segment): comment = segment.comment.get_value() @@ -188,7 +194,7 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin): # get pattern defined by type pattern = TXT_PATERN - if a_type in ("number" , "float"): + if a_type in ("number", "float"): pattern = NUM_PATERN res_goup = pattern.findall(value) diff --git a/openpype/hosts/flame/plugins/publish/collect_timeline_otio.py b/openpype/hosts/flame/plugins/publish/collect_timeline_otio.py index faa5be9d68..c6aeae7730 100644 --- a/openpype/hosts/flame/plugins/publish/collect_timeline_otio.py +++ b/openpype/hosts/flame/plugins/publish/collect_timeline_otio.py @@ -31,27 +31,28 @@ class CollecTimelineOTIO(pyblish.api.ContextPlugin): ) # adding otio timeline to context - with opfapi.maintained_segment_selection(sequence): + with opfapi.maintained_segment_selection(sequence) as selected_seg: otio_timeline = flame_export.create_otio_timeline(sequence) - instance_data = { - "name": subset_name, - "asset": asset_doc["name"], - "subset": subset_name, - "family": "workfile" - } + instance_data = { + "name": 
subset_name, + "asset": asset_doc["name"], + "subset": subset_name, + "family": "workfile" + } - # create instance with workfile - instance = context.create_instance(**instance_data) - self.log.info("Creating instance: {}".format(instance)) + # create instance with workfile + instance = context.create_instance(**instance_data) + self.log.info("Creating instance: {}".format(instance)) - # update context with main project attributes - context.data.update({ - "flameProject": project, - "flameSequence": sequence, - "otioTimeline": otio_timeline, - "currentFile": "Flame/{}/{}".format( - project.name, sequence.name - ), - "fps": float(str(sequence.frame_rate)[:-4]) - }) + # update context with main project attributes + context.data.update({ + "flameProject": project, + "flameSequence": sequence, + "otioTimeline": otio_timeline, + "currentFile": "Flame/{}/{}".format( + project.name, sequence.name + ), + "flameSelectedSegments": selected_seg, + "fps": float(str(sequence.frame_rate)[:-4]) + }) diff --git a/openpype/hosts/fusion/api/lib.py b/openpype/hosts/fusion/api/lib.py index 2bb5ea8aae..f7a2360bfa 100644 --- a/openpype/hosts/fusion/api/lib.py +++ b/openpype/hosts/fusion/api/lib.py @@ -3,6 +3,7 @@ import sys import re import contextlib +from bson.objectid import ObjectId from Qt import QtGui from avalon import io @@ -92,7 +93,7 @@ def switch_item(container, # Collect any of current asset, subset and representation if not provided # so we can use the original name from those. 
if any(not x for x in [asset_name, subset_name, representation_name]): - _id = io.ObjectId(container["representation"]) + _id = ObjectId(container["representation"]) representation = io.find_one({"type": "representation", "_id": _id}) version, subset, asset, project = io.parenthood(representation) diff --git a/openpype/hosts/fusion/api/pipeline.py b/openpype/hosts/fusion/api/pipeline.py index 92e54ad6f5..c9cd76770a 100644 --- a/openpype/hosts/fusion/api/pipeline.py +++ b/openpype/hosts/fusion/api/pipeline.py @@ -8,13 +8,15 @@ import contextlib import pyblish.api import avalon.api -from avalon.pipeline import AVALON_CONTAINER_ID from openpype.api import Logger from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + register_inventory_action_path, + deregister_inventory_action_path, + AVALON_CONTAINER_ID, ) import openpype.hosts.fusion @@ -69,7 +71,7 @@ def install(): register_loader_plugin_path(LOAD_PATH) avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH) - avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH) + register_inventory_action_path(INVENTORY_PATH) pyblish.api.register_callback( "instanceToggled", on_pyblish_instance_toggled @@ -93,9 +95,7 @@ def uninstall(): deregister_loader_plugin_path(LOAD_PATH) avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH) - avalon.api.deregister_plugin_path( - avalon.api.InventoryAction, INVENTORY_PATH - ) + deregister_inventory_action_path(INVENTORY_PATH) pyblish.api.deregister_callback( "instanceToggled", on_pyblish_instance_toggled diff --git a/openpype/hosts/fusion/api/workio.py b/openpype/hosts/fusion/api/workio.py index ec9ac7481a..a1710c6e3a 100644 --- a/openpype/hosts/fusion/api/workio.py +++ b/openpype/hosts/fusion/api/workio.py @@ -1,12 +1,14 @@ """Host API required Work Files tool""" import sys import os -from avalon import api + +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS + from .pipeline import get_current_comp 
 def file_extensions():
-    return api.HOST_WORKFILE_EXTENSIONS["fusion"]
+    return HOST_WORKFILE_EXTENSIONS["fusion"]
 
 
 def has_unsaved_changes():
diff --git a/openpype/hosts/fusion/plugins/inventory/select_containers.py b/openpype/hosts/fusion/plugins/inventory/select_containers.py
index 294c134505..d554b73a5b 100644
--- a/openpype/hosts/fusion/plugins/inventory/select_containers.py
+++ b/openpype/hosts/fusion/plugins/inventory/select_containers.py
@@ -1,7 +1,7 @@
-from avalon import api
+from openpype.pipeline import InventoryAction
 
 
-class FusionSelectContainers(api.InventoryAction):
+class FusionSelectContainers(InventoryAction):
 
     label = "Select Containers"
     icon = "mouse-pointer"
diff --git a/openpype/hosts/fusion/plugins/inventory/set_tool_color.py b/openpype/hosts/fusion/plugins/inventory/set_tool_color.py
index 2f5ae4d241..c7530ce674 100644
--- a/openpype/hosts/fusion/plugins/inventory/set_tool_color.py
+++ b/openpype/hosts/fusion/plugins/inventory/set_tool_color.py
@@ -1,6 +1,6 @@
-from avalon import api
 from Qt import QtGui, QtWidgets
+from openpype.pipeline import InventoryAction
 from openpype import style
 from openpype.hosts.fusion.api import (
     get_current_comp,
@@ -8,7 +8,7 @@ from openpype.hosts.fusion.api import (
 )
 
 
-class FusionSetToolColor(api.InventoryAction):
+class FusionSetToolColor(InventoryAction):
     """Update the color of the selected tools"""
 
     label = "Set Tool Color"
diff --git a/openpype/hosts/harmony/api/pipeline.py b/openpype/hosts/harmony/api/pipeline.py
index f967da15ca..420e9720db 100644
--- a/openpype/hosts/harmony/api/pipeline.py
+++ b/openpype/hosts/harmony/api/pipeline.py
@@ -2,11 +2,11 @@ import os
 from pathlib import Path
 import logging
+from bson.objectid import ObjectId
 
 import pyblish.api
 from avalon import io
 import avalon.api
-from avalon.pipeline import AVALON_CONTAINER_ID
 
 from openpype import lib
 from openpype.lib import register_event_callback
@@ -14,6 +14,7 @@ from openpype.pipeline import (
     LegacyCreator,
register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) import openpype.hosts.harmony import openpype.hosts.harmony.api as harmony @@ -113,7 +114,7 @@ def check_inventory(): representation = container['representation'] representation_doc = io.find_one( { - "_id": io.ObjectId(representation), + "_id": ObjectId(representation), "type": "representation" }, projection={"parent": True} diff --git a/openpype/hosts/harmony/api/workio.py b/openpype/hosts/harmony/api/workio.py index 38a00ae414..ab1cb9b1a9 100644 --- a/openpype/hosts/harmony/api/workio.py +++ b/openpype/hosts/harmony/api/workio.py @@ -2,20 +2,21 @@ import os import shutil +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS + from .lib import ( ProcessContext, get_local_harmony_path, zip_and_move, launch_zip_file ) -from avalon import api # used to lock saving until previous save is done. save_disabled = False def file_extensions(): - return api.HOST_WORKFILE_EXTENSIONS["harmony"] + return HOST_WORKFILE_EXTENSIONS["harmony"] def has_unsaved_changes(): diff --git a/openpype/hosts/hiero/api/events.py b/openpype/hosts/hiero/api/events.py index 9439199933..7fab3edfc8 100644 --- a/openpype/hosts/hiero/api/events.py +++ b/openpype/hosts/hiero/api/events.py @@ -1,12 +1,12 @@ import os import hiero.core.events from openpype.api import Logger +from openpype.lib import register_event_callback from .lib import ( sync_avalon_data_to_workfile, launch_workfiles_app, selection_changed_timeline, before_project_save, - register_event_callback ) from .tags import add_tags_to_workfile from .menu import update_menu_task_label diff --git a/openpype/hosts/hiero/api/lib.py b/openpype/hosts/hiero/api/lib.py index a9467ae5a4..df3b24ff2c 100644 --- a/openpype/hosts/hiero/api/lib.py +++ b/openpype/hosts/hiero/api/lib.py @@ -8,7 +8,10 @@ import platform import ast import shutil import hiero + from Qt import QtWidgets +from bson.objectid import ObjectId + import avalon.api as avalon import avalon.io from 
openpype.api import (Logger, Anatomy, get_anatomy_settings) @@ -1006,7 +1009,7 @@ def check_inventory_versions(): # get representation from io representation = io.find_one({ "type": "representation", - "_id": io.ObjectId(container["representation"]) + "_id": ObjectId(container["representation"]) }) # Get start frame from version data diff --git a/openpype/hosts/hiero/api/pipeline.py b/openpype/hosts/hiero/api/pipeline.py index eff126c0b6..0d3c8914ce 100644 --- a/openpype/hosts/hiero/api/pipeline.py +++ b/openpype/hosts/hiero/api/pipeline.py @@ -4,7 +4,7 @@ Basic avalon integration import os import contextlib from collections import OrderedDict -from avalon.pipeline import AVALON_CONTAINER_ID + from avalon import api as avalon from avalon import schema from pyblish import api as pyblish @@ -13,6 +13,7 @@ from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) from openpype.tools.utils import host_tools from . import lib, menu, events @@ -28,7 +29,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins") PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish").replace("\\", "/") LOAD_PATH = os.path.join(PLUGINS_DIR, "load").replace("\\", "/") CREATE_PATH = os.path.join(PLUGINS_DIR, "create").replace("\\", "/") -INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory").replace("\\", "/") AVALON_CONTAINERS = ":AVALON_CONTAINERS" @@ -51,7 +51,6 @@ def install(): pyblish.register_plugin_path(PUBLISH_PATH) register_loader_plugin_path(LOAD_PATH) avalon.register_plugin_path(LegacyCreator, CREATE_PATH) - avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH) # register callback for switching publishable pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled) diff --git a/openpype/hosts/hiero/api/workio.py b/openpype/hosts/hiero/api/workio.py index dacb11624f..394cb5e2ab 100644 --- a/openpype/hosts/hiero/api/workio.py +++ b/openpype/hosts/hiero/api/workio.py @@ -1,14 +1,14 @@ import 
os import hiero -from avalon import api + from openpype.api import Logger +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS - -log = Logger().get_logger(__name__) +log = Logger.get_logger(__name__) def file_extensions(): - return api.HOST_WORKFILE_EXTENSIONS["hiero"] + return HOST_WORKFILE_EXTENSIONS["hiero"] def has_unsaved_changes(): diff --git a/openpype/hosts/houdini/api/pipeline.py b/openpype/hosts/houdini/api/pipeline.py index 7d4e58efb7..d079c9ea81 100644 --- a/openpype/hosts/houdini/api/pipeline.py +++ b/openpype/hosts/houdini/api/pipeline.py @@ -8,12 +8,12 @@ import hdefereval import pyblish.api import avalon.api -from avalon.pipeline import AVALON_CONTAINER_ID from avalon.lib import find_submodule from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, + AVALON_CONTAINER_ID, ) import openpype.hosts.houdini from openpype.hosts.houdini.api import lib diff --git a/openpype/hosts/houdini/api/workio.py b/openpype/hosts/houdini/api/workio.py index e7310163ea..e0213023fd 100644 --- a/openpype/hosts/houdini/api/workio.py +++ b/openpype/hosts/houdini/api/workio.py @@ -2,11 +2,11 @@ import os import hou -from avalon import api +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS def file_extensions(): - return api.HOST_WORKFILE_EXTENSIONS["houdini"] + return HOST_WORKFILE_EXTENSIONS["houdini"] def has_unsaved_changes(): diff --git a/openpype/hosts/houdini/plugins/load/load_image.py b/openpype/hosts/houdini/plugins/load/load_image.py index bd9ea3eee3..671f08f18f 100644 --- a/openpype/hosts/houdini/plugins/load/load_image.py +++ b/openpype/hosts/houdini/plugins/load/load_image.py @@ -3,6 +3,7 @@ import os from openpype.pipeline import ( load, get_representation_path, + AVALON_CONTAINER_ID, ) from openpype.hosts.houdini.api import lib, pipeline @@ -73,7 +74,7 @@ class ImageLoader(load.LoaderPlugin): # Imprint it manually data = { "schema": "avalon-core:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, 
"name": node_name, "namespace": namespace, "loader": str(self.__class__.__name__), diff --git a/openpype/hosts/houdini/plugins/load/load_usd_layer.py b/openpype/hosts/houdini/plugins/load/load_usd_layer.py index d803e6abfe..48580fc3aa 100644 --- a/openpype/hosts/houdini/plugins/load/load_usd_layer.py +++ b/openpype/hosts/houdini/plugins/load/load_usd_layer.py @@ -1,8 +1,9 @@ from openpype.pipeline import ( load, get_representation_path, + AVALON_CONTAINER_ID, ) -from openpype.hosts.houdini.api import lib, pipeline +from openpype.hosts.houdini.api import lib class USDSublayerLoader(load.LoaderPlugin): @@ -43,7 +44,7 @@ class USDSublayerLoader(load.LoaderPlugin): # Imprint it manually data = { "schema": "avalon-core:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "name": node_name, "namespace": namespace, "loader": str(self.__class__.__name__), diff --git a/openpype/hosts/houdini/plugins/load/load_usd_reference.py b/openpype/hosts/houdini/plugins/load/load_usd_reference.py index fdb443f4cf..6851c77e6d 100644 --- a/openpype/hosts/houdini/plugins/load/load_usd_reference.py +++ b/openpype/hosts/houdini/plugins/load/load_usd_reference.py @@ -1,8 +1,9 @@ from openpype.pipeline import ( load, get_representation_path, + AVALON_CONTAINER_ID, ) -from openpype.hosts.houdini.api import lib, pipeline +from openpype.hosts.houdini.api import lib class USDReferenceLoader(load.LoaderPlugin): @@ -43,7 +44,7 @@ class USDReferenceLoader(load.LoaderPlugin): # Imprint it manually data = { "schema": "avalon-core:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "name": node_name, "namespace": namespace, "loader": str(self.__class__.__name__), diff --git a/openpype/hosts/houdini/vendor/husdoutputprocessors/avalon_uri_processor.py b/openpype/hosts/houdini/vendor/husdoutputprocessors/avalon_uri_processor.py index 4071eb3e0c..499b733570 100644 --- a/openpype/hosts/houdini/vendor/husdoutputprocessors/avalon_uri_processor.py 
+++ b/openpype/hosts/houdini/vendor/husdoutputprocessors/avalon_uri_processor.py @@ -145,7 +145,6 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase): path = self._template.format(**{ "root": root, "project": PROJECT, - "silo": asset_doc["silo"], "asset": asset_doc["name"], "subset": subset, "representation": ext, @@ -165,4 +164,3 @@ output_processor = AvalonURIOutputProcessor() def usdOutputProcessor(): return output_processor - diff --git a/openpype/hosts/maya/api/lib.py b/openpype/hosts/maya/api/lib.py index 376c033d46..92fc5133a9 100644 --- a/openpype/hosts/maya/api/lib.py +++ b/openpype/hosts/maya/api/lib.py @@ -1511,7 +1511,7 @@ def get_container_members(container): members = cmds.sets(container, query=True) or [] members = cmds.ls(members, long=True, objectsOnly=True) or [] - members = set(members) + all_members = set(members) # Include any referenced nodes from any reference in the container # This is required since we've removed adding ALL nodes of a reference @@ -1530,9 +1530,9 @@ def get_container_members(container): reference_members = cmds.ls(reference_members, long=True, objectsOnly=True) - members.update(reference_members) + all_members.update(reference_members) - return members + return list(all_members) # region LOOKDEV diff --git a/openpype/hosts/maya/api/pipeline.py b/openpype/hosts/maya/api/pipeline.py index 5cdc3ff4fd..bb61128178 100644 --- a/openpype/hosts/maya/api/pipeline.py +++ b/openpype/hosts/maya/api/pipeline.py @@ -10,7 +10,6 @@ import pyblish.api import avalon.api from avalon.lib import find_submodule -from avalon.pipeline import AVALON_CONTAINER_ID import openpype.hosts.maya from openpype.tools.utils import host_tools @@ -23,7 +22,10 @@ from openpype.lib.path_tools import HostDirmap from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, + register_inventory_action_path, deregister_loader_plugin_path, + deregister_inventory_action_path, + AVALON_CONTAINER_ID, ) from openpype.hosts.maya.lib import 
copy_workspace_mel from . import menu, lib @@ -59,7 +61,7 @@ def install(): register_loader_plugin_path(LOAD_PATH) avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH) - avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH) + register_inventory_action_path(INVENTORY_PATH) log.info(PUBLISH_PATH) log.info("Installing callbacks ... ") @@ -188,9 +190,7 @@ def uninstall(): deregister_loader_plugin_path(LOAD_PATH) avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH) - avalon.api.deregister_plugin_path( - avalon.api.InventoryAction, INVENTORY_PATH - ) + deregister_inventory_action_path(INVENTORY_PATH) menu.uninstall() diff --git a/openpype/hosts/maya/api/plugin.py b/openpype/hosts/maya/api/plugin.py index 84379bc145..3721868823 100644 --- a/openpype/hosts/maya/api/plugin.py +++ b/openpype/hosts/maya/api/plugin.py @@ -4,11 +4,11 @@ from maya import cmds import qargparse -from avalon.pipeline import AVALON_CONTAINER_ID from openpype.pipeline import ( LegacyCreator, LoaderPlugin, get_representation_path, + AVALON_CONTAINER_ID, ) from .pipeline import containerise diff --git a/openpype/hosts/maya/api/setdress.py b/openpype/hosts/maya/api/setdress.py index 96a9700b88..0b60564e5e 100644 --- a/openpype/hosts/maya/api/setdress.py +++ b/openpype/hosts/maya/api/setdress.py @@ -6,6 +6,8 @@ import contextlib import copy import six +from bson.objectid import ObjectId + from maya import cmds from avalon import io @@ -282,7 +284,7 @@ def update_package_version(container, version): # Versioning (from `core.maya.pipeline`) current_representation = io.find_one({ - "_id": io.ObjectId(container["representation"]) + "_id": ObjectId(container["representation"]) }) assert current_representation is not None, "This is a bug" @@ -327,7 +329,7 @@ def update_package(set_container, representation): # Load the original package data current_representation = io.find_one({ - "_id": io.ObjectId(set_container['representation']), + "_id": 
ObjectId(set_container['representation']), "type": "representation" }) @@ -478,10 +480,10 @@ def update_scene(set_container, containers, current_data, new_data, new_file): # They *must* use the same asset, subset and Loader for # `update_container` to make sense. old = io.find_one({ - "_id": io.ObjectId(representation_current) + "_id": ObjectId(representation_current) }) new = io.find_one({ - "_id": io.ObjectId(representation_new) + "_id": ObjectId(representation_new) }) is_valid = compare_representations(old=old, new=new) if not is_valid: diff --git a/openpype/hosts/maya/api/workio.py b/openpype/hosts/maya/api/workio.py index 698c48e81e..fd4961c4bf 100644 --- a/openpype/hosts/maya/api/workio.py +++ b/openpype/hosts/maya/api/workio.py @@ -1,11 +1,12 @@ """Host API required Work Files tool""" import os from maya import cmds -from avalon import api + +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS def file_extensions(): - return api.HOST_WORKFILE_EXTENSIONS["maya"] + return HOST_WORKFILE_EXTENSIONS["maya"] def has_unsaved_changes(): diff --git a/openpype/hosts/maya/plugins/create/create_multiverse_usd.py b/openpype/hosts/maya/plugins/create/create_multiverse_usd.py new file mode 100644 index 0000000000..b2266e5a57 --- /dev/null +++ b/openpype/hosts/maya/plugins/create/create_multiverse_usd.py @@ -0,0 +1,51 @@ +from openpype.hosts.maya.api import plugin, lib + + +class CreateMultiverseUsd(plugin.Creator): + """Multiverse USD data""" + + name = "usdMain" + label = "Multiverse USD" + family = "usd" + icon = "cubes" + + def __init__(self, *args, **kwargs): + super(CreateMultiverseUsd, self).__init__(*args, **kwargs) + + # Add animation data first, since it maintains order. 
+        self.data.update(lib.collect_animation_data(True))
+
+        self.data["stripNamespaces"] = False
+        self.data["mergeTransformAndShape"] = False
+        self.data["writeAncestors"] = True
+        self.data["flattenParentXforms"] = False
+        self.data["writeSparseOverrides"] = False
+        self.data["useMetaPrimPath"] = False
+        self.data["customRootPath"] = ''
+        self.data["customAttributes"] = ''
+        self.data["nodeTypesToIgnore"] = ''
+        self.data["writeMeshes"] = True
+        self.data["writeCurves"] = True
+        self.data["writeParticles"] = True
+        self.data["writeCameras"] = False
+        self.data["writeLights"] = False
+        self.data["writeJoints"] = False
+        self.data["writeCollections"] = False
+        self.data["writePositions"] = True
+        self.data["writeNormals"] = True
+        self.data["writeUVs"] = True
+        self.data["writeColorSets"] = False
+        self.data["writeTangents"] = False
+        self.data["writeRefPositions"] = False
+        self.data["writeBlendShapes"] = False
+        self.data["writeDisplayColor"] = False
+        self.data["writeSkinWeights"] = False
+        self.data["writeMaterialAssignment"] = False
+        self.data["writeHardwareShader"] = False
+        self.data["writeShadingNetworks"] = False
+        self.data["writeTransformMatrix"] = True
+        self.data["writeUsdAttributes"] = False
+        self.data["timeVaryingTopology"] = False
+        self.data["customMaterialNamespace"] = ''
+        self.data["numTimeSamples"] = 1
+        self.data["timeSamplesSpan"] = 0.0
diff --git a/openpype/hosts/maya/plugins/create/create_multiverse_usd_comp.py b/openpype/hosts/maya/plugins/create/create_multiverse_usd_comp.py
new file mode 100644
index 0000000000..77b808c459
--- /dev/null
+++ b/openpype/hosts/maya/plugins/create/create_multiverse_usd_comp.py
@@ -0,0 +1,23 @@
+from openpype.hosts.maya.api import plugin, lib
+
+
+class CreateMultiverseUsdComp(plugin.Creator):
+    """Create Multiverse USD Composition"""
+
+    name = "usdCompositionMain"
+    label = "Multiverse USD Composition"
+    family = "usdComposition"
+    icon = "cubes"
+
+    def __init__(self, *args, **kwargs):
+        super(CreateMultiverseUsdComp, self).__init__(*args, **kwargs)
+
+        # Add animation data first, since it maintains order.
+        self.data.update(lib.collect_animation_data(True))
+
+        self.data["stripNamespaces"] = False
+        self.data["mergeTransformAndShape"] = False
+        self.data["flattenContent"] = False
+        self.data["writePendingOverrides"] = False
+        self.data["numTimeSamples"] = 1
+        self.data["timeSamplesSpan"] = 0.0
diff --git a/openpype/hosts/maya/plugins/create/create_multiverse_usd_over.py b/openpype/hosts/maya/plugins/create/create_multiverse_usd_over.py
new file mode 100644
index 0000000000..bb82ab2039
--- /dev/null
+++ b/openpype/hosts/maya/plugins/create/create_multiverse_usd_over.py
@@ -0,0 +1,28 @@
+from openpype.hosts.maya.api import plugin, lib
+
+
+class CreateMultiverseUsdOver(plugin.Creator):
+    """Multiverse USD data"""
+
+    name = "usdOverrideMain"
+    label = "Multiverse USD Override"
+    family = "usdOverride"
+    icon = "cubes"
+
+    def __init__(self, *args, **kwargs):
+        super(CreateMultiverseUsdOver, self).__init__(*args, **kwargs)
+
+        # Add animation data first, since it maintains order.
+        self.data.update(lib.collect_animation_data(True))
+
+        self.data["writeAll"] = False
+        self.data["writeTransforms"] = True
+        self.data["writeVisibility"] = True
+        self.data["writeAttributes"] = True
+        self.data["writeMaterials"] = True
+        self.data["writeVariants"] = True
+        self.data["writeVariantsDefinition"] = True
+        self.data["writeActiveState"] = True
+        self.data["writeNamespaces"] = False
+        self.data["numTimeSamples"] = 1
+        self.data["timeSamplesSpan"] = 0.0
diff --git a/openpype/hosts/maya/plugins/create/create_review.py b/openpype/hosts/maya/plugins/create/create_review.py
index 14a21d28ca..fbf3399f61 100644
--- a/openpype/hosts/maya/plugins/create/create_review.py
+++ b/openpype/hosts/maya/plugins/create/create_review.py
@@ -15,6 +15,14 @@ class CreateReview(plugin.Creator):
     keepImages = False
     isolate = False
     imagePlane = True
+    transparency = [
+        "preset",
+        "simple",
+        "object sorting",
+        "weighted average",
+        "depth peeling",
+        "alpha cut"
+    ]
 
     def __init__(self, *args, **kwargs):
         super(CreateReview, self).__init__(*args, **kwargs)
@@ -28,5 +36,6 @@ class CreateReview(plugin.Creator):
         data["isolate"] = self.isolate
         data["keepImages"] = self.keepImages
         data["imagePlane"] = self.imagePlane
+        data["transparency"] = self.transparency
 
         self.data = data
diff --git a/openpype/hosts/maya/plugins/inventory/import_modelrender.py b/openpype/hosts/maya/plugins/inventory/import_modelrender.py
index c5d3d0c8f4..d9bb256fac 100644
--- a/openpype/hosts/maya/plugins/inventory/import_modelrender.py
+++ b/openpype/hosts/maya/plugins/inventory/import_modelrender.py
@@ -1,6 +1,8 @@
 import json
-from avalon import api, io
+from avalon import io
+from bson.objectid import ObjectId
 from openpype.pipeline import (
+    InventoryAction,
     get_representation_context,
     get_representation_path_from_context,
 )
@@ -10,7 +12,7 @@ from openpype.hosts.maya.api.lib import (
 )
 
 
-class ImportModelRender(api.InventoryAction):
+class ImportModelRender(InventoryAction):
     label = "Import Model Render Sets"
     icon = "industry"
@@ -39,7 +41,7 @@ class ImportModelRender(api.InventoryAction):
             nodes.append(n)
 
         repr_doc = io.find_one({
-            "_id": io.ObjectId(container["representation"]),
+            "_id": ObjectId(container["representation"]),
        })
         version_id = repr_doc["parent"]
diff --git a/openpype/hosts/maya/plugins/inventory/import_reference.py b/openpype/hosts/maya/plugins/inventory/import_reference.py
index 2fa132a867..afb1e0e17f 100644
--- a/openpype/hosts/maya/plugins/inventory/import_reference.py
+++ b/openpype/hosts/maya/plugins/inventory/import_reference.py
@@ -1,11 +1,10 @@
 from maya import cmds
 
-from avalon import api
-
+from openpype.pipeline import InventoryAction
 from openpype.hosts.maya.api.plugin import get_reference_node
 
 
-class ImportReference(api.InventoryAction):
+class ImportReference(InventoryAction):
     """Imports selected reference to inside of the file."""
 
     label = "Import Reference"
diff --git a/openpype/hosts/maya/plugins/load/load_multiverse_usd.py b/openpype/hosts/maya/plugins/load/load_multiverse_usd.py
new file mode 100644
index 0000000000..c03f2c5d92
--- /dev/null
+++ b/openpype/hosts/maya/plugins/load/load_multiverse_usd.py
@@ -0,0 +1,102 @@
+# -*- coding: utf-8 -*-
+import maya.cmds as cmds
+
+from openpype.pipeline import (
+    load,
+    get_representation_path
+)
+from openpype.hosts.maya.api.lib import (
+    maintained_selection,
+    namespaced,
+    unique_namespace
+)
+from openpype.hosts.maya.api.pipeline import containerise
+
+
+class MultiverseUsdLoader(load.LoaderPlugin):
+    """Load the USD by Multiverse"""
+
+    families = ["model", "usd", "usdComposition", "usdOverride",
+                "pointcache", "animation"]
+    representations = ["usd", "usda", "usdc", "usdz", "abc"]
+
+    label = "Read USD by Multiverse"
+    order = -10
+    icon = "code-fork"
+    color = "orange"
+
+    def load(self, context, name=None, namespace=None, options=None):
+
+        asset = context['asset']['name']
+        namespace = namespace or unique_namespace(
+            asset + "_",
+            prefix="_" if asset[0].isdigit() else "",
+ suffix="_", + ) + + # Create the shape + cmds.loadPlugin("MultiverseForMaya", quiet=True) + + shape = None + transform = None + with maintained_selection(): + cmds.namespace(addNamespace=namespace) + with namespaced(namespace, new=False): + import multiverse + shape = multiverse.CreateUsdCompound(self.fname) + transform = cmds.listRelatives( + shape, parent=True, fullPath=True)[0] + + # Lock the shape node so the user cannot delete it. + cmds.lockNode(shape, lock=True) + + nodes = [transform, shape] + self[:] = nodes + + return containerise( + name=name, + namespace=namespace, + nodes=nodes, + context=context, + loader=self.__class__.__name__) + + def update(self, container, representation): + # type: (dict, dict) -> None + """Update container with specified representation.""" + node = container['objectName'] + assert cmds.objExists(node), "Missing container" + + members = cmds.sets(node, query=True) or [] + shapes = cmds.ls(members, type="mvUsdCompoundShape") + assert shapes, "Cannot find mvUsdCompoundShape in container" + + path = get_representation_path(representation) + + import multiverse + for shape in shapes: + multiverse.SetUsdCompoundAssetPaths(shape, [path]) + + cmds.setAttr("{}.representation".format(node), + str(representation["_id"]), + type="string") + + def switch(self, container, representation): + self.update(container, representation) + + def remove(self, container): + # type: (dict) -> None + """Remove loaded container.""" + # Delete container and its contents + if cmds.objExists(container['objectName']): + members = cmds.sets(container['objectName'], query=True) or [] + cmds.delete([container['objectName']] + members) + + # Remove the namespace, if empty + namespace = container['namespace'] + if cmds.namespace(exists=namespace): + members = cmds.namespaceInfo(namespace, listNamespace=True) + if not members: + cmds.namespace(removeNamespace=namespace) + else: + self.log.warning("Namespace not deleted because it " + "still has members: %s", 
namespace) diff --git a/openpype/hosts/maya/plugins/load/load_vrayproxy.py b/openpype/hosts/maya/plugins/load/load_vrayproxy.py index 5b79b1efb3..69d54df62b 100644 --- a/openpype/hosts/maya/plugins/load/load_vrayproxy.py +++ b/openpype/hosts/maya/plugins/load/load_vrayproxy.py @@ -7,6 +7,8 @@ loader will use them instead of native vray vrmesh format. """ import os +from bson.objectid import ObjectId + import maya.cmds as cmds from avalon import io @@ -186,7 +188,7 @@ class VRayProxyLoader(load.LoaderPlugin): abc_rep = io.find_one( { "type": "representation", - "parent": io.ObjectId(version_id), + "parent": ObjectId(version_id), "name": "abc" }) diff --git a/openpype/hosts/maya/plugins/publish/extract_maya_scene_raw.py b/openpype/hosts/maya/plugins/publish/extract_maya_scene_raw.py index 389995d30c..3a47cdadb5 100644 --- a/openpype/hosts/maya/plugins/publish/extract_maya_scene_raw.py +++ b/openpype/hosts/maya/plugins/publish/extract_maya_scene_raw.py @@ -6,7 +6,7 @@ from maya import cmds import openpype.api from openpype.hosts.maya.api.lib import maintained_selection -from avalon.pipeline import AVALON_CONTAINER_ID +from openpype.pipeline import AVALON_CONTAINER_ID class ExtractMayaSceneRaw(openpype.api.Extractor): diff --git a/openpype/hosts/maya/plugins/publish/extract_multiverse_usd.py b/openpype/hosts/maya/plugins/publish/extract_multiverse_usd.py new file mode 100644 index 0000000000..4e4efdc32c --- /dev/null +++ b/openpype/hosts/maya/plugins/publish/extract_multiverse_usd.py @@ -0,0 +1,210 @@ +import os +import six + +from maya import cmds + +import openpype.api +from openpype.hosts.maya.api.lib import maintained_selection + + +class ExtractMultiverseUsd(openpype.api.Extractor): + """Extractor for USD by Multiverse.""" + + label = "Extract Multiverse USD" + hosts = ["maya"] + families = ["usd"] + + @property + def options(self): + """Overridable options for Multiverse USD Export + + Given in the following format + - {NAME: EXPECTED TYPE} + + If the overridden 
option's type does not match, + the option is not included and a warning is logged. + + """ + + return { + "stripNamespaces": bool, + "mergeTransformAndShape": bool, + "writeAncestors": bool, + "flattenParentXforms": bool, + "writeSparseOverrides": bool, + "useMetaPrimPath": bool, + "customRootPath": str, + "customAttributes": str, + "nodeTypesToIgnore": str, + "writeMeshes": bool, + "writeCurves": bool, + "writeParticles": bool, + "writeCameras": bool, + "writeLights": bool, + "writeJoints": bool, + "writeCollections": bool, + "writePositions": bool, + "writeNormals": bool, + "writeUVs": bool, + "writeColorSets": bool, + "writeTangents": bool, + "writeRefPositions": bool, + "writeBlendShapes": bool, + "writeDisplayColor": bool, + "writeSkinWeights": bool, + "writeMaterialAssignment": bool, + "writeHardwareShader": bool, + "writeShadingNetworks": bool, + "writeTransformMatrix": bool, + "writeUsdAttributes": bool, + "timeVaryingTopology": bool, + "customMaterialNamespace": str, + "numTimeSamples": int, + "timeSamplesSpan": float + } + + @property + def default_options(self): + """The default options for Multiverse USD extraction.""" + + return { + "stripNamespaces": False, + "mergeTransformAndShape": False, + "writeAncestors": True, + "flattenParentXforms": False, + "writeSparseOverrides": False, + "useMetaPrimPath": False, + "customRootPath": str(), + "customAttributes": str(), + "nodeTypesToIgnore": str(), + "writeMeshes": True, + "writeCurves": True, + "writeParticles": True, + "writeCameras": False, + "writeLights": False, + "writeJoints": False, + "writeCollections": False, + "writePositions": True, + "writeNormals": True, + "writeUVs": True, + "writeColorSets": False, + "writeTangents": False, + "writeRefPositions": False, + "writeBlendShapes": False, + "writeDisplayColor": False, + "writeSkinWeights": False, + "writeMaterialAssignment": False, + "writeHardwareShader": False, + "writeShadingNetworks": False, + "writeTransformMatrix": True, + 
"writeUsdAttributes": False, + "timeVaryingTopology": False, + "customMaterialNamespace": str(), + "numTimeSamples": 1, + "timeSamplesSpan": 0.0 + } + + def parse_overrides(self, instance, options): + """Inspect data of instance to determine overridden options""" + + for key in instance.data: + if key not in self.options: + continue + + # Ensure the data is of correct type + value = instance.data[key] + if isinstance(value, six.text_type): + value = str(value) + if not isinstance(value, self.options[key]): + self.log.warning( + "Overridden attribute {key} was of " + "the wrong type: {invalid_type} " + "- should have been {valid_type}".format( + key=key, + invalid_type=type(value).__name__, + valid_type=self.options[key].__name__)) + continue + + options[key] = value + + return options + + def process(self, instance): + # Load plugin firstly + cmds.loadPlugin("MultiverseForMaya", quiet=True) + + # Define output file path + staging_dir = self.staging_dir(instance) + file_name = "{}.usd".format(instance.name) + file_path = os.path.join(staging_dir, file_name) + file_path = file_path.replace('\\', '/') + + # Parse export options + options = self.default_options + options = self.parse_overrides(instance, options) + self.log.info("Export options: {0}".format(options)) + + # Perform extraction + self.log.info("Performing extraction ...") + + with maintained_selection(): + members = instance.data("setMembers") + members = cmds.ls(members, + dag=True, + shapes=True, + type=("mesh"), + noIntermediate=True, + long=True) + self.log.info('Collected object {}'.format(members)) + + import multiverse + + time_opts = None + frame_start = instance.data['frameStart'] + frame_end = instance.data['frameEnd'] + handle_start = instance.data['handleStart'] + handle_end = instance.data['handleEnd'] + step = instance.data['step'] + fps = instance.data['fps'] + if frame_end != frame_start: + time_opts = multiverse.TimeOptions() + + time_opts.writeTimeRange = True + time_opts.frameRange = ( + 
frame_start - handle_start, frame_end + handle_end) + time_opts.frameIncrement = step + time_opts.numTimeSamples = instance.data["numTimeSamples"] + time_opts.timeSamplesSpan = instance.data["timeSamplesSpan"] + time_opts.framePerSecond = fps + + asset_write_opts = multiverse.AssetWriteOptions(time_opts) + options_discard_keys = { + 'numTimeSamples', + 'timeSamplesSpan', + 'frameStart', + 'frameEnd', + 'handleStart', + 'handleEnd', + 'step', + 'fps' + } + for key, value in options.items(): + if key in options_discard_keys: + continue + setattr(asset_write_opts, key, value) + + multiverse.WriteAsset(file_path, members, asset_write_opts) + + if "representations" not in instance.data: + instance.data["representations"] = [] + + representation = { + 'name': 'usd', + 'ext': 'usd', + 'files': file_name, + "stagingDir": staging_dir + } + instance.data["representations"].append(representation) + + self.log.info("Extracted instance {} to {}".format( + instance.name, file_path)) diff --git a/openpype/hosts/maya/plugins/publish/extract_multiverse_usd_comp.py b/openpype/hosts/maya/plugins/publish/extract_multiverse_usd_comp.py new file mode 100644 index 0000000000..8fccc412e6 --- /dev/null +++ b/openpype/hosts/maya/plugins/publish/extract_multiverse_usd_comp.py @@ -0,0 +1,151 @@ +import os + +from maya import cmds + +import openpype.api +from openpype.hosts.maya.api.lib import maintained_selection + + +class ExtractMultiverseUsdComposition(openpype.api.Extractor): + """Extractor of Multiverse USD Composition.""" + + label = "Extract Multiverse USD Composition" + hosts = ["maya"] + families = ["usdComposition"] + + @property + def options(self): + """Overridable options for Multiverse USD Export + + Given in the following format + - {NAME: EXPECTED TYPE} + + If the overridden option's type does not match, + the option is not included and a warning is logged. 
+ + """ + + return { + "stripNamespaces": bool, + "mergeTransformAndShape": bool, + "flattenContent": bool, + "writePendingOverrides": bool, + "numTimeSamples": int, + "timeSamplesSpan": float + } + + @property + def default_options(self): + """The default options for Multiverse USD extraction.""" + + return { + "stripNamespaces": True, + "mergeTransformAndShape": False, + "flattenContent": False, + "writePendingOverrides": False, + "numTimeSamples": 1, + "timeSamplesSpan": 0.0 + } + + def parse_overrides(self, instance, options): + """Inspect data of instance to determine overridden options""" + + for key in instance.data: + if key not in self.options: + continue + + # Ensure the data is of correct type + value = instance.data[key] + if not isinstance(value, self.options[key]): + self.log.warning( + "Overridden attribute {key} was of " + "the wrong type: {invalid_type} " + "- should have been {valid_type}".format( + key=key, + invalid_type=type(value).__name__, + valid_type=self.options[key].__name__)) + continue + + options[key] = value + + return options + + def process(self, instance): + # Load plugin firstly + cmds.loadPlugin("MultiverseForMaya", quiet=True) + + # Define output file path + staging_dir = self.staging_dir(instance) + file_name = "{}.usd".format(instance.name) + file_path = os.path.join(staging_dir, file_name) + file_path = file_path.replace('\\', '/') + + # Parse export options + options = self.default_options + options = self.parse_overrides(instance, options) + self.log.info("Export options: {0}".format(options)) + + # Perform extraction + self.log.info("Performing extraction ...") + + with maintained_selection(): + members = instance.data("setMembers") + members = cmds.ls(members, + dag=True, + shapes=True, + type="mvUsdCompoundShape", + noIntermediate=True, + long=True) + self.log.info('Collected object {}'.format(members)) + + import multiverse + + time_opts = None + frame_start = instance.data['frameStart'] + frame_end = 
instance.data['frameEnd'] + handle_start = instance.data['handleStart'] + handle_end = instance.data['handleEnd'] + step = instance.data['step'] + fps = instance.data['fps'] + if frame_end != frame_start: + time_opts = multiverse.TimeOptions() + + time_opts.writeTimeRange = True + time_opts.frameRange = ( + frame_start - handle_start, frame_end + handle_end) + time_opts.frameIncrement = step + time_opts.numTimeSamples = instance.data["numTimeSamples"] + time_opts.timeSamplesSpan = instance.data["timeSamplesSpan"] + time_opts.framePerSecond = fps + + comp_write_opts = multiverse.CompositionWriteOptions() + options_discard_keys = { + 'numTimeSamples', + 'timeSamplesSpan', + 'frameStart', + 'frameEnd', + 'handleStart', + 'handleEnd', + 'step', + 'fps' + } + for key, value in options.items(): + if key in options_discard_keys: + continue + setattr(comp_write_opts, key, value) + + multiverse.WriteComposition(file_path, members, comp_write_opts) + + if "representations" not in instance.data: + instance.data["representations"] = [] + + representation = { + 'name': 'usd', + 'ext': 'usd', + 'files': file_name, + "stagingDir": staging_dir + } + instance.data["representations"].append(representation) + + self.log.info("Extracted instance {} to {}".format( + instance.name, file_path)) diff --git a/openpype/hosts/maya/plugins/publish/extract_multiverse_usd_over.py b/openpype/hosts/maya/plugins/publish/extract_multiverse_usd_over.py new file mode 100644 index 0000000000..ce0e8a392a --- /dev/null +++ b/openpype/hosts/maya/plugins/publish/extract_multiverse_usd_over.py @@ -0,0 +1,139 @@ +import os + +import openpype.api +from openpype.hosts.maya.api.lib import maintained_selection + +from maya import cmds + + +class ExtractMultiverseUsdOverride(openpype.api.Extractor): + """Extractor for USD Override by Multiverse.""" + + label = "Extract Multiverse USD Override" + hosts = ["maya"] + families = ["usdOverride"] + + @property + def options(self): + """Overridable options for 
Multiverse USD Export + + Given in the following format + - {NAME: EXPECTED TYPE} + + If the overridden option's type does not match, + the option is not included and a warning is logged. + + """ + + return { + "writeAll": bool, + "writeTransforms": bool, + "writeVisibility": bool, + "writeAttributes": bool, + "writeMaterials": bool, + "writeVariants": bool, + "writeVariantsDefinition": bool, + "writeActiveState": bool, + "writeNamespaces": bool, + "numTimeSamples": int, + "timeSamplesSpan": float + } + + @property + def default_options(self): + """The default options for Multiverse USD extraction.""" + + return { + "writeAll": False, + "writeTransforms": True, + "writeVisibility": True, + "writeAttributes": True, + "writeMaterials": True, + "writeVariants": True, + "writeVariantsDefinition": True, + "writeActiveState": True, + "writeNamespaces": False, + "numTimeSamples": 1, + "timeSamplesSpan": 0.0 + } + + def process(self, instance): + # Load plugin first + cmds.loadPlugin("MultiverseForMaya", quiet=True) + + # Define output file path + staging_dir = self.staging_dir(instance) + file_name = "{}.usda".format(instance.name) + file_path = os.path.join(staging_dir, file_name) + file_path = file_path.replace("\\", "/") + + # Parse export options + options = self.default_options + self.log.info("Export options: {0}".format(options)) + + # Perform extraction + self.log.info("Performing extraction ...") + + with maintained_selection(): + members = instance.data("setMembers") + members = cmds.ls(members, + dag=True, + shapes=True, + type="mvUsdCompoundShape", + noIntermediate=True, + long=True) + self.log.info("Collected object {}".format(members)) + + # TODO: Deal with asset, composition, override with options.
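The typed-override merge these extractors describe in their `options` docstrings ("if the overridden option's type does not match, the option is not included and a warning is logged") can be sketched in isolation like this. The option names and values below are illustrative stand-ins, not the plugin's full option set, and the warning is reduced to a comment:

```python
# Sketch of the extractors' typed-override pattern: a value from the
# instance data may override an export option only when its type matches
# the declared expected type; otherwise the default is kept.

OPTION_TYPES = {
    "writeAll": bool,
    "numTimeSamples": int,
    "timeSamplesSpan": float,
}

DEFAULT_OPTIONS = {
    "writeAll": False,
    "numTimeSamples": 1,
    "timeSamplesSpan": 0.0,
}


def parse_overrides(instance_data, options):
    """Copy type-correct overrides from instance data into options."""
    for key, value in instance_data.items():
        if key not in OPTION_TYPES:
            # Not an export option at all; ignore.
            continue
        if not isinstance(value, OPTION_TYPES[key]):
            # Wrong type: skip the override, keep the default
            # (the real plugin logs a warning here).
            continue
        options[key] = value
    return options


options = parse_overrides(
    {"numTimeSamples": 5, "timeSamplesSpan": "oops", "unrelated": 1},
    dict(DEFAULT_OPTIONS),
)
# numTimeSamples is overridden; the mistyped timeSamplesSpan is ignored.
```

Note that starting from a copy of the defaults (`dict(DEFAULT_OPTIONS)`) keeps the default mapping itself unmodified between instances.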
+ import multiverse + + time_opts = None + frame_start = instance.data["frameStart"] + frame_end = instance.data["frameEnd"] + handle_start = instance.data["handleStart"] + handle_end = instance.data["handleEnd"] + step = instance.data["step"] + fps = instance.data["fps"] + if frame_end != frame_start: + time_opts = multiverse.TimeOptions() + + time_opts.writeTimeRange = True + time_opts.frameRange = ( + frame_start - handle_start, frame_end + handle_end) + time_opts.frameIncrement = step + time_opts.numTimeSamples = instance.data["numTimeSamples"] + time_opts.timeSamplesSpan = instance.data["timeSamplesSpan"] + time_opts.framePerSecond = fps + + over_write_opts = multiverse.OverridesWriteOptions(time_opts) + options_discard_keys = { + "numTimeSamples", + "timeSamplesSpan", + "frameStart", + "frameEnd", + "handleStart", + "handleEnd", + "step", + "fps" + } + for key, value in options.items(): + if key in options_discard_keys: + continue + setattr(over_write_opts, key, value) + + for member in members: + multiverse.WriteOverrides(file_path, member, over_write_opts) + + if "representations" not in instance.data: + instance.data["representations"] = [] + + representation = { + "name": "usd", + "ext": "usd", + "files": file_name, + "stagingDir": staging_dir + } + instance.data["representations"].append(representation) + + self.log.info("Extracted instance {} to {}".format( + instance.name, file_path)) diff --git a/openpype/hosts/maya/plugins/publish/extract_playblast.py b/openpype/hosts/maya/plugins/publish/extract_playblast.py index b233a57453..bb1ecf279d 100644 --- a/openpype/hosts/maya/plugins/publish/extract_playblast.py +++ b/openpype/hosts/maya/plugins/publish/extract_playblast.py @@ -73,6 +73,11 @@ class ExtractPlayblast(openpype.api.Extractor): pm.currentTime(refreshFrameInt - 1, edit=True) pm.currentTime(refreshFrameInt, edit=True) + # Override transparency if requested. 
+ transparency = instance.data.get("transparency", 0) + if transparency != 0: + preset["viewport2_options"]["transparencyAlgorithm"] = transparency + # Isolate view is requested by having objects in the set besides a # camera. if preset.pop("isolate_view", False) and instance.data.get("isolate"): diff --git a/openpype/hosts/nuke/api/command.py b/openpype/hosts/nuke/api/command.py index 212d4757c6..6f74c08e97 100644 --- a/openpype/hosts/nuke/api/command.py +++ b/openpype/hosts/nuke/api/command.py @@ -1,6 +1,7 @@ import logging import contextlib import nuke +from bson.objectid import ObjectId from avalon import api, io @@ -70,10 +71,10 @@ def get_handles(asset): if "visualParent" in data: vp = data["visualParent"] if vp is not None: - parent_asset = io.find_one({"_id": io.ObjectId(vp)}) + parent_asset = io.find_one({"_id": ObjectId(vp)}) if parent_asset is None: - parent_asset = io.find_one({"_id": io.ObjectId(asset["parent"])}) + parent_asset = io.find_one({"_id": ObjectId(asset["parent"])}) if parent_asset is not None: return get_handles(parent_asset) diff --git a/openpype/hosts/nuke/api/lib.py b/openpype/hosts/nuke/api/lib.py index dba7ec1b85..3c8ba3e77c 100644 --- a/openpype/hosts/nuke/api/lib.py +++ b/openpype/hosts/nuke/api/lib.py @@ -6,10 +6,11 @@ import contextlib from collections import OrderedDict import clique +from bson.objectid import ObjectId import nuke -from avalon import api, io, lib +from avalon import api, io from openpype.api import ( Logger, @@ -20,7 +21,6 @@ from openpype.api import ( get_workdir_data, get_asset, get_current_project_settings, - ApplicationManager ) from openpype.tools.utils import host_tools from openpype.lib.path_tools import HostDirmap @@ -570,7 +570,7 @@ def check_inventory_versions(): # get representation from io representation = io.find_one({ "type": "representation", - "_id": io.ObjectId(avalon_knob_data["representation"]) + "_id": ObjectId(avalon_knob_data["representation"]) }) # Failsafe for not finding the 
representation. diff --git a/openpype/hosts/nuke/api/pipeline.py b/openpype/hosts/nuke/api/pipeline.py index fd2e16b8d3..1d110cb94a 100644 --- a/openpype/hosts/nuke/api/pipeline.py +++ b/openpype/hosts/nuke/api/pipeline.py @@ -6,7 +6,6 @@ import nuke import pyblish.api import avalon.api -from avalon import pipeline import openpype from openpype.api import ( @@ -18,7 +17,10 @@ from openpype.lib import register_event_callback from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, + register_inventory_action_path, deregister_loader_plugin_path, + deregister_inventory_action_path, + AVALON_CONTAINER_ID, ) from openpype.tools.utils import host_tools @@ -105,7 +107,7 @@ def install(): pyblish.api.register_plugin_path(PUBLISH_PATH) register_loader_plugin_path(LOAD_PATH) avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH) - avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH) + register_inventory_action_path(INVENTORY_PATH) # Register Avalon event for workfiles loading. 
register_event_callback("workio.open_file", check_inventory_versions) @@ -131,6 +133,7 @@ def uninstall(): pyblish.api.deregister_plugin_path(PUBLISH_PATH) deregister_loader_plugin_path(LOAD_PATH) avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH) + deregister_inventory_action_path(INVENTORY_PATH) pyblish.api.deregister_callback( "instanceToggled", on_pyblish_instance_toggled) @@ -330,7 +333,7 @@ def containerise(node, data = OrderedDict( [ ("schema", "openpype:container-2.0"), - ("id", pipeline.AVALON_CONTAINER_ID), + ("id", AVALON_CONTAINER_ID), ("name", name), ("namespace", namespace), ("loader", str(loader)), diff --git a/openpype/hosts/nuke/api/workio.py b/openpype/hosts/nuke/api/workio.py index dbc24fdc9b..68fcb0927f 100644 --- a/openpype/hosts/nuke/api/workio.py +++ b/openpype/hosts/nuke/api/workio.py @@ -1,11 +1,12 @@ """Host API required Work Files tool""" import os import nuke -import avalon.api + +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS def file_extensions(): - return avalon.api.HOST_WORKFILE_EXTENSIONS["nuke"] + return HOST_WORKFILE_EXTENSIONS["nuke"] def has_unsaved_changes(): diff --git a/openpype/hosts/nuke/plugins/inventory/repair_old_loaders.py b/openpype/hosts/nuke/plugins/inventory/repair_old_loaders.py index 5f834be557..c04c939a8d 100644 --- a/openpype/hosts/nuke/plugins/inventory/repair_old_loaders.py +++ b/openpype/hosts/nuke/plugins/inventory/repair_old_loaders.py @@ -1,9 +1,9 @@ -from avalon import api from openpype.api import Logger +from openpype.pipeline import InventoryAction from openpype.hosts.nuke.api.lib import set_avalon_knob_data -class RepairOldLoaders(api.InventoryAction): +class RepairOldLoaders(InventoryAction): label = "Repair Old Loaders" icon = "gears" diff --git a/openpype/hosts/nuke/plugins/inventory/select_containers.py b/openpype/hosts/nuke/plugins/inventory/select_containers.py index 3f174b3562..d7d5f00b87 100644 --- a/openpype/hosts/nuke/plugins/inventory/select_containers.py +++ 
b/openpype/hosts/nuke/plugins/inventory/select_containers.py @@ -1,8 +1,8 @@ -from avalon import api +from openpype.pipeline import InventoryAction from openpype.hosts.nuke.api.commands import viewer_update_and_undo_stop -class SelectContainers(api.InventoryAction): +class SelectContainers(InventoryAction): label = "Select Containers" icon = "mouse-pointer" diff --git a/openpype/hosts/nuke/plugins/load/load_effects.py b/openpype/hosts/nuke/plugins/load/load_effects.py index 68c3952942..1ed32996e1 100644 --- a/openpype/hosts/nuke/plugins/load/load_effects.py +++ b/openpype/hosts/nuke/plugins/load/load_effects.py @@ -72,7 +72,7 @@ class LoadEffects(load.LoaderPlugin): # getting data from json file with unicode conversion with open(file, "r") as f: json_f = {self.byteify(key): self.byteify(value) - for key, value in json.load(f).iteritems()} + for key, value in json.load(f).items()} # get correct order of nodes by positions on track and subtrack nodes_order = self.reorder_nodes(json_f) @@ -188,7 +188,7 @@ class LoadEffects(load.LoaderPlugin): # getting data from json file with unicode conversion with open(file, "r") as f: json_f = {self.byteify(key): self.byteify(value) - for key, value in json.load(f).iteritems()} + for key, value in json.load(f).items()} # get correct order of nodes by positions on track and subtrack nodes_order = self.reorder_nodes(json_f) @@ -330,11 +330,11 @@ class LoadEffects(load.LoaderPlugin): if isinstance(input, dict): return {self.byteify(key): self.byteify(value) - for key, value in input.iteritems()} + for key, value in input.items()} elif isinstance(input, list): return [self.byteify(element) for element in input] - elif isinstance(input, unicode): - return input.encode('utf-8') + elif isinstance(input, str): + return str(input) else: return input diff --git a/openpype/hosts/nuke/plugins/load/load_effects_ip.py b/openpype/hosts/nuke/plugins/load/load_effects_ip.py index 9c4fd4c2c6..383776111f 100644 --- 
a/openpype/hosts/nuke/plugins/load/load_effects_ip.py +++ b/openpype/hosts/nuke/plugins/load/load_effects_ip.py @@ -74,7 +74,7 @@ class LoadEffectsInputProcess(load.LoaderPlugin): # getting data from json file with unicode conversion with open(file, "r") as f: json_f = {self.byteify(key): self.byteify(value) - for key, value in json.load(f).iteritems()} + for key, value in json.load(f).items()} # get correct order of nodes by positions on track and subtrack nodes_order = self.reorder_nodes(json_f) @@ -194,7 +194,7 @@ class LoadEffectsInputProcess(load.LoaderPlugin): # getting data from json file with unicode conversion with open(file, "r") as f: json_f = {self.byteify(key): self.byteify(value) - for key, value in json.load(f).iteritems()} + for key, value in json.load(f).items()} # get correct order of nodes by positions on track and subtrack nodes_order = self.reorder_nodes(json_f) @@ -350,11 +350,11 @@ class LoadEffectsInputProcess(load.LoaderPlugin): if isinstance(input, dict): return {self.byteify(key): self.byteify(value) - for key, value in input.iteritems()} + for key, value in input.items()} elif isinstance(input, list): return [self.byteify(element) for element in input] - elif isinstance(input, unicode): - return input.encode('utf-8') + elif isinstance(input, str): + return str(input) else: return input diff --git a/openpype/hosts/nuke/plugins/load/load_gizmo_ip.py b/openpype/hosts/nuke/plugins/load/load_gizmo_ip.py index 87bebce15b..df52a22364 100644 --- a/openpype/hosts/nuke/plugins/load/load_gizmo_ip.py +++ b/openpype/hosts/nuke/plugins/load/load_gizmo_ip.py @@ -240,7 +240,7 @@ class LoadGizmoInputProcess(load.LoaderPlugin): if isinstance(input, dict): return {self.byteify(key): self.byteify(value) - for key, value in input.iteritems()} + for key, value in input.items()} elif isinstance(input, list): return [self.byteify(element) for element in input] elif isinstance(input, unicode): diff --git 
a/openpype/hosts/nuke/plugins/publish/extract_review_data_mov.py b/openpype/hosts/nuke/plugins/publish/extract_review_data_mov.py index 544b9e04da..31a8ff18ee 100644 --- a/openpype/hosts/nuke/plugins/publish/extract_review_data_mov.py +++ b/openpype/hosts/nuke/plugins/publish/extract_review_data_mov.py @@ -24,7 +24,11 @@ class ExtractReviewDataMov(openpype.api.Extractor): outputs = {} def process(self, instance): - families = instance.data["families"] + families = set(instance.data["families"]) + + # add main family to make sure all families are compared + families.add(instance.data["family"]) + task_type = instance.context.data["taskType"] subset = instance.data["subset"] self.log.info("Creating staging dir...") @@ -50,51 +54,31 @@ class ExtractReviewDataMov(openpype.api.Extractor): f_task_types = o_data["filter"]["task_types"] f_subsets = o_data["filter"]["sebsets"] + self.log.debug( + "f_families `{}` > families: {}".format( + f_families, families)) + + self.log.debug( + "f_task_types `{}` > task_type: {}".format( + f_task_types, task_type)) + + self.log.debug( + "f_subsets `{}` > subset: {}".format( + f_subsets, subset)) + # test if family found in context - test_families = any([ - # first if exact family set is matching - # make sure only interesetion of list is correct - bool(set(families).intersection(f_families)), - # and if famiies are set at all - # if not then return True because we want this preset - # to be active if nothig is set - bool(not f_families) - ]) + # using intersection to make sure all defined + # families are present in combination + if f_families and not families.intersection(f_families): + continue # test task types from filter - test_task_types = any([ - # check if actual task type is defined in task types - # set in preset's filter - bool(task_type in f_task_types), - # and if taskTypes are defined in preset filter - # if not then return True, because we want this filter - # to be active if no taskType is set - bool(not f_task_types) - 
]) + if f_task_types and task_type not in f_task_types: + continue # test subsets from filter - test_subsets = any([ - # check if any of subset filter inputs - # converted to regex patern is not found in subset - # we keep strict case sensitivity - bool(next(( - s for s in f_subsets - if re.search(re.compile(s), subset) - ), None)), - # but if no subsets were set then make this acuntable too - bool(not f_subsets) - ]) - - # we need all filters to be positive for this - # preset to be activated - test_all = all([ - test_families, - test_task_types, - test_subsets - ]) - - # if it is not positive then skip this preset - if not test_all: + if f_subsets and not any( + re.search(s, subset) for s in f_subsets): continue self.log.info( diff --git a/openpype/hosts/nuke/plugins/publish/validate_write_deadline_tab.py b/openpype/hosts/nuke/plugins/publish/validate_write_deadline_tab.py index 5ee93403d0..907577a97d 100644 --- a/openpype/hosts/nuke/plugins/publish/validate_write_deadline_tab.py +++ b/openpype/hosts/nuke/plugins/publish/validate_write_deadline_tab.py @@ -25,7 +25,7 @@ class RepairNukeWriteDeadlineTab(pyblish.api.Action): # Remove existing knobs. 
knob_names = openpype.hosts.nuke.lib.get_deadline_knob_names() - for name, knob in group_node.knobs().iteritems(): + for name, knob in group_node.knobs().items(): if name in knob_names: group_node.removeKnob(knob) diff --git a/openpype/hosts/photoshop/api/pipeline.py b/openpype/hosts/photoshop/api/pipeline.py index e814e1ca4d..c2ad0ac7b0 100644 --- a/openpype/hosts/photoshop/api/pipeline.py +++ b/openpype/hosts/photoshop/api/pipeline.py @@ -1,9 +1,10 @@ import os from Qt import QtWidgets +from bson.objectid import ObjectId import pyblish.api import avalon.api -from avalon import pipeline, io +from avalon import io from openpype.api import Logger from openpype.lib import register_event_callback @@ -11,6 +12,7 @@ from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) import openpype.hosts.photoshop @@ -36,7 +38,7 @@ def check_inventory(): representation = container['representation'] representation_doc = io.find_one( { - "_id": io.ObjectId(representation), + "_id": ObjectId(representation), "type": "representation" }, projection={"parent": True} @@ -221,7 +223,7 @@ def containerise( data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "name": name, "namespace": namespace, "loader": str(loader), diff --git a/openpype/hosts/photoshop/api/workio.py b/openpype/hosts/photoshop/api/workio.py index 0bf3ed2bd9..951c5dbfff 100644 --- a/openpype/hosts/photoshop/api/workio.py +++ b/openpype/hosts/photoshop/api/workio.py @@ -1,8 +1,7 @@ """Host API required Work Files tool""" import os -import avalon.api - +from openpype.pipeline import HOST_WORKFILE_EXTENSIONS from . 
import lib @@ -15,7 +14,7 @@ def _active_document(): def file_extensions(): - return avalon.api.HOST_WORKFILE_EXTENSIONS["photoshop"] + return HOST_WORKFILE_EXTENSIONS["photoshop"] def has_unsaved_changes(): diff --git a/openpype/hosts/resolve/api/pipeline.py b/openpype/hosts/resolve/api/pipeline.py index fa309e3503..e8b017ead5 100644 --- a/openpype/hosts/resolve/api/pipeline.py +++ b/openpype/hosts/resolve/api/pipeline.py @@ -6,13 +6,13 @@ import contextlib from collections import OrderedDict from avalon import api as avalon from avalon import schema -from avalon.pipeline import AVALON_CONTAINER_ID from pyblish import api as pyblish from openpype.api import Logger from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) from . import lib from . import PLUGINS_DIR @@ -22,7 +22,6 @@ log = Logger().get_logger(__name__) PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish") LOAD_PATH = os.path.join(PLUGINS_DIR, "load") CREATE_PATH = os.path.join(PLUGINS_DIR, "create") -INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory") AVALON_CONTAINERS = ":AVALON_CONTAINERS" @@ -48,7 +47,6 @@ def install(): register_loader_plugin_path(LOAD_PATH) avalon.register_plugin_path(LegacyCreator, CREATE_PATH) - avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH) # register callback for switching publishable pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled) @@ -73,7 +71,6 @@ def uninstall(): deregister_loader_plugin_path(LOAD_PATH) avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH) - avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH) # register callback for switching publishable pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled) diff --git a/openpype/hosts/testhost/plugins/create/auto_creator.py b/openpype/hosts/testhost/plugins/create/auto_creator.py index 45c573e487..d5935602a0 100644 --- 
a/openpype/hosts/testhost/plugins/create/auto_creator.py +++ b/openpype/hosts/testhost/plugins/create/auto_creator.py @@ -1,10 +1,10 @@ +from avalon import io +from openpype.lib import NumberDef from openpype.hosts.testhost.api import pipeline from openpype.pipeline import ( AutoCreator, CreatedInstance, - lib ) -from avalon import io class MyAutoCreator(AutoCreator): @@ -13,7 +13,7 @@ class MyAutoCreator(AutoCreator): def get_instance_attr_defs(self): output = [ - lib.NumberDef("number_key", label="Number") + NumberDef("number_key", label="Number") ] return output diff --git a/openpype/hosts/testhost/plugins/create/test_creator_1.py b/openpype/hosts/testhost/plugins/create/test_creator_1.py index 45c30e8a27..7664276fa2 100644 --- a/openpype/hosts/testhost/plugins/create/test_creator_1.py +++ b/openpype/hosts/testhost/plugins/create/test_creator_1.py @@ -1,10 +1,16 @@ import json from openpype import resources from openpype.hosts.testhost.api import pipeline +from openpype.lib import ( + UISeparatorDef, + UILabelDef, + BoolDef, + NumberDef, + FileDef, +) from openpype.pipeline import ( Creator, CreatedInstance, - lib ) @@ -54,17 +60,17 @@ class TestCreatorOne(Creator): def get_instance_attr_defs(self): output = [ - lib.NumberDef("number_key", label="Number"), + NumberDef("number_key", label="Number"), ] return output def get_pre_create_attr_defs(self): output = [ - lib.BoolDef("use_selection", label="Use selection"), - lib.UISeparatorDef(), - lib.UILabelDef("Testing label"), - lib.FileDef("filepath", folders=True, label="Filepath"), - lib.FileDef( + BoolDef("use_selection", label="Use selection"), + UISeparatorDef(), + UILabelDef("Testing label"), + FileDef("filepath", folders=True, label="Filepath"), + FileDef( "filepath_2", multipath=True, folders=True, label="Filepath 2" ) ] diff --git a/openpype/hosts/testhost/plugins/create/test_creator_2.py b/openpype/hosts/testhost/plugins/create/test_creator_2.py index e66304a038..f54adee8a2 100644 --- 
a/openpype/hosts/testhost/plugins/create/test_creator_2.py +++ b/openpype/hosts/testhost/plugins/create/test_creator_2.py @@ -1,8 +1,8 @@ +from openpype.lib import NumberDef, TextDef from openpype.hosts.testhost.api import pipeline from openpype.pipeline import ( Creator, CreatedInstance, - lib ) @@ -40,8 +40,8 @@ class TestCreatorTwo(Creator): def get_instance_attr_defs(self): output = [ - lib.NumberDef("number_key"), - lib.TextDef("text_key") + NumberDef("number_key"), + TextDef("text_key") ] return output diff --git a/openpype/hosts/testhost/plugins/publish/collect_instance_1.py b/openpype/hosts/testhost/plugins/publish/collect_instance_1.py index 3c035eccb6..c7241a15a8 100644 --- a/openpype/hosts/testhost/plugins/publish/collect_instance_1.py +++ b/openpype/hosts/testhost/plugins/publish/collect_instance_1.py @@ -1,10 +1,8 @@ import json import pyblish.api -from openpype.pipeline import ( - OpenPypePyblishPluginMixin, - attribute_definitions -) +from openpype.lib import attribute_definitions +from openpype.pipeline import OpenPypePyblishPluginMixin class CollectInstanceOneTestHost( diff --git a/openpype/hosts/tvpaint/api/pipeline.py b/openpype/hosts/tvpaint/api/pipeline.py index 46c9d3a1dd..ec880a1abc 100644 --- a/openpype/hosts/tvpaint/api/pipeline.py +++ b/openpype/hosts/tvpaint/api/pipeline.py @@ -10,7 +10,6 @@ import pyblish.api import avalon.api from avalon import io -from avalon.pipeline import AVALON_CONTAINER_ID from openpype.hosts import tvpaint from openpype.api import get_current_project_settings @@ -19,6 +18,7 @@ from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) from .lib import ( diff --git a/openpype/hosts/tvpaint/api/workio.py b/openpype/hosts/tvpaint/api/workio.py index c513bec6cf..88bdd7117e 100644 --- a/openpype/hosts/tvpaint/api/workio.py +++ b/openpype/hosts/tvpaint/api/workio.py @@ -4,6 +4,7 @@ """ from avalon import api +from openpype.pipeline import 
HOST_WORKFILE_EXTENSIONS from .lib import ( execute_george, execute_george_through_file @@ -47,7 +48,7 @@ def has_unsaved_changes(): def file_extensions(): """Return the supported file extensions for Blender scene files.""" - return api.HOST_WORKFILE_EXTENSIONS["tvpaint"] + return HOST_WORKFILE_EXTENSIONS["tvpaint"] def work_root(session): diff --git a/openpype/hosts/unreal/api/pipeline.py b/openpype/hosts/unreal/api/pipeline.py index 9ec11b942d..713c588976 100644 --- a/openpype/hosts/unreal/api/pipeline.py +++ b/openpype/hosts/unreal/api/pipeline.py @@ -4,13 +4,13 @@ import logging from typing import List import pyblish.api -from avalon.pipeline import AVALON_CONTAINER_ID from avalon import api from openpype.pipeline import ( LegacyCreator, register_loader_plugin_path, deregister_loader_plugin_path, + AVALON_CONTAINER_ID, ) from openpype.tools.utils import host_tools import openpype.hosts.unreal diff --git a/openpype/hosts/unreal/plugins/load/load_alembic_geometrycache.py b/openpype/hosts/unreal/plugins/load/load_alembic_geometrycache.py index 3508fe5ed7..6ac3531b40 100644 --- a/openpype/hosts/unreal/plugins/load/load_alembic_geometrycache.py +++ b/openpype/hosts/unreal/plugins/load/load_alembic_geometrycache.py @@ -2,8 +2,10 @@ """Loader for published alembics.""" import os -from avalon import pipeline -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID +) from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline @@ -117,7 +119,7 @@ class PointCacheAlembicLoader(plugin.Loader): data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "asset": asset, "namespace": asset_dir, "container_name": container_name, diff --git a/openpype/hosts/unreal/plugins/load/load_alembic_skeletalmesh.py b/openpype/hosts/unreal/plugins/load/load_alembic_skeletalmesh.py index 
180942de51..b2c3889f68 100644 --- a/openpype/hosts/unreal/plugins/load/load_alembic_skeletalmesh.py +++ b/openpype/hosts/unreal/plugins/load/load_alembic_skeletalmesh.py @@ -2,8 +2,10 @@ """Load Skeletal Mesh alembics.""" import os -from avalon import pipeline -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID +) from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline import unreal # noqa @@ -81,7 +83,7 @@ class SkeletalMeshAlembicLoader(plugin.Loader): data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "asset": asset, "namespace": asset_dir, "container_name": container_name, diff --git a/openpype/hosts/unreal/plugins/load/load_alembic_staticmesh.py b/openpype/hosts/unreal/plugins/load/load_alembic_staticmesh.py index 4e00af1d97..5a73c72c64 100644 --- a/openpype/hosts/unreal/plugins/load/load_alembic_staticmesh.py +++ b/openpype/hosts/unreal/plugins/load/load_alembic_staticmesh.py @@ -2,8 +2,10 @@ """Loader for Static Mesh alembics.""" import os -from avalon import pipeline -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID +) from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline import unreal # noqa @@ -100,7 +102,7 @@ class StaticMeshAlembicLoader(plugin.Loader): data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "asset": asset, "namespace": asset_dir, "container_name": container_name, diff --git a/openpype/hosts/unreal/plugins/load/load_animation.py b/openpype/hosts/unreal/plugins/load/load_animation.py index 8ef81f7851..c9a1633031 100644 --- a/openpype/hosts/unreal/plugins/load/load_animation.py +++ b/openpype/hosts/unreal/plugins/load/load_animation.py @@ -3,8 
+3,10 @@ import os import json -from avalon import pipeline -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID +) from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline import unreal # noqa @@ -135,7 +137,7 @@ class AnimationFBXLoader(plugin.Loader): data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "asset": asset, "namespace": asset_dir, "container_name": container_name, diff --git a/openpype/hosts/unreal/plugins/load/load_camera.py b/openpype/hosts/unreal/plugins/load/load_camera.py index 0de9470ef9..40bca0b0c7 100644 --- a/openpype/hosts/unreal/plugins/load/load_camera.py +++ b/openpype/hosts/unreal/plugins/load/load_camera.py @@ -2,7 +2,8 @@ """Load camera from FBX.""" import os -from avalon import io, pipeline +from avalon import io +from openpype.pipeline import AVALON_CONTAINER_ID from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline import unreal # noqa @@ -116,7 +117,7 @@ class CameraLoader(plugin.Loader): data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "asset": asset, "namespace": asset_dir, "container_name": container_name, diff --git a/openpype/hosts/unreal/plugins/load/load_layout.py b/openpype/hosts/unreal/plugins/load/load_layout.py index 19ee179d20..7f6ce7d822 100644 --- a/openpype/hosts/unreal/plugins/load/load_layout.py +++ b/openpype/hosts/unreal/plugins/load/load_layout.py @@ -11,12 +11,12 @@ from unreal import AssetToolsHelpers from unreal import FBXImportType from unreal import MathLibrary as umath -from avalon.pipeline import AVALON_CONTAINER_ID from openpype.pipeline import ( discover_loader_plugins, loaders_from_representation, load_container, get_representation_path, + AVALON_CONTAINER_ID, ) from 
openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline diff --git a/openpype/hosts/unreal/plugins/load/load_rig.py b/openpype/hosts/unreal/plugins/load/load_rig.py index 3d5616364c..ff844a5e94 100644 --- a/openpype/hosts/unreal/plugins/load/load_rig.py +++ b/openpype/hosts/unreal/plugins/load/load_rig.py @@ -2,8 +2,10 @@ """Load Skeletal Meshes form FBX.""" import os -from avalon import pipeline -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID +) from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline import unreal # noqa @@ -101,7 +103,7 @@ class SkeletalMeshFBXLoader(plugin.Loader): data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "asset": asset, "namespace": asset_dir, "container_name": container_name, diff --git a/openpype/hosts/unreal/plugins/load/load_staticmeshfbx.py b/openpype/hosts/unreal/plugins/load/load_staticmeshfbx.py index 587fc83a77..282d249947 100644 --- a/openpype/hosts/unreal/plugins/load/load_staticmeshfbx.py +++ b/openpype/hosts/unreal/plugins/load/load_staticmeshfbx.py @@ -2,8 +2,10 @@ """Load Static meshes form FBX.""" import os -from avalon import pipeline -from openpype.pipeline import get_representation_path +from openpype.pipeline import ( + get_representation_path, + AVALON_CONTAINER_ID +) from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline import unreal # noqa @@ -95,7 +97,7 @@ class StaticMeshFBXLoader(plugin.Loader): data = { "schema": "openpype:container-2.0", - "id": pipeline.AVALON_CONTAINER_ID, + "id": AVALON_CONTAINER_ID, "asset": asset, "namespace": asset_dir, "container_name": container_name, diff --git a/openpype/hosts/unreal/plugins/publish/extract_layout.py b/openpype/hosts/unreal/plugins/publish/extract_layout.py 
index 2d09b0e7bd..f34a47b89f 100644 --- a/openpype/hosts/unreal/plugins/publish/extract_layout.py +++ b/openpype/hosts/unreal/plugins/publish/extract_layout.py @@ -3,6 +3,8 @@ import os import json import math +from bson.objectid import ObjectId + import unreal from unreal import EditorLevelLibrary as ell from unreal import EditorAssetLibrary as eal @@ -62,7 +64,7 @@ class ExtractLayout(openpype.api.Extractor): blend = io.find_one( { "type": "representation", - "parent": io.ObjectId(parent), + "parent": ObjectId(parent), "name": "blend" }, projection={"_id": True}) diff --git a/openpype/lib/__init__.py b/openpype/lib/__init__.py index 1ebafbb2d2..e8b6d18f4e 100644 --- a/openpype/lib/__init__.py +++ b/openpype/lib/__init__.py @@ -29,6 +29,21 @@ from .vendor_bin_utils import ( is_oiio_supported ) +from .attribute_definitions import ( + AbtractAttrDef, + + UIDef, + UISeparatorDef, + UILabelDef, + + UnknownDef, + NumberDef, + TextDef, + EnumDef, + BoolDef, + FileDef, +) + from .env_tools import ( env_value_to_bool, get_paths_from_environ, @@ -233,6 +248,19 @@ __all__ = [ "get_ffmpeg_tool_path", "is_oiio_supported", + "AbtractAttrDef", + + "UIDef", + "UISeparatorDef", + "UILabelDef", + + "UnknownDef", + "NumberDef", + "TextDef", + "EnumDef", + "BoolDef", + "FileDef", + "import_filepath", "modules_from_path", "recursive_bases_from_class", diff --git a/openpype/lib/applications.py b/openpype/lib/applications.py index 557c016d74..ad59ae0dbc 100644 --- a/openpype/lib/applications.py +++ b/openpype/lib/applications.py @@ -1319,6 +1319,41 @@ def _merge_env(env, current_env): return result +def _add_python_version_paths(app, env, logger): + """Add vendor packages specific for a Python version.""" + + # Skip adding if host name is not set + if not app.host_name: + return + + # Add Python 2/3 modules + openpype_root = os.getenv("OPENPYPE_REPOS_ROOT") + python_vendor_dir = os.path.join( + openpype_root, + "openpype", + "vendor", + "python" + ) + if app.use_python_2: + pythonpath 
= os.path.join(python_vendor_dir, "python_2") + else: + pythonpath = os.path.join(python_vendor_dir, "python_3") + + if not os.path.exists(pythonpath): + return + + logger.debug("Adding Python version specific paths to PYTHONPATH") + python_paths = [pythonpath] + + # Load PYTHONPATH from current launch context + python_path = env.get("PYTHONPATH") + if python_path: + python_paths.append(python_path) + + # Set new PYTHONPATH to launch context environments + env["PYTHONPATH"] = os.pathsep.join(python_paths) + + def prepare_app_environments(data, env_group=None, implementation_envs=True): """Modify launch environments based on launched app and context. @@ -1331,6 +1366,8 @@ def prepare_app_environments(data, env_group=None, implementation_envs=True): app = data["app"] log = data["log"] + _add_python_version_paths(app, data["env"], log) + # `added_env_keys` has debug purpose added_env_keys = {app.group.name, app.name} # Environments for application @@ -1545,6 +1582,7 @@ def _prepare_last_workfile(data, workdir): workdir (str): Path to folder where workfiles should be stored. 
""" import avalon.api + from openpype.pipeline import HOST_WORKFILE_EXTENSIONS log = data["log"] @@ -1593,7 +1631,7 @@ def _prepare_last_workfile(data, workdir): # Last workfile path last_workfile_path = data.get("last_workfile_path") or "" if not last_workfile_path: - extensions = avalon.api.HOST_WORKFILE_EXTENSIONS.get(app.host_name) + extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name) if extensions: anatomy = data["anatomy"] project_settings = data["project_settings"] diff --git a/openpype/pipeline/lib/attribute_definitions.py b/openpype/lib/attribute_definitions.py similarity index 100% rename from openpype/pipeline/lib/attribute_definitions.py rename to openpype/lib/attribute_definitions.py diff --git a/openpype/lib/avalon_context.py b/openpype/lib/avalon_context.py index 8e9fff5f67..b4e6abb72d 100644 --- a/openpype/lib/avalon_context.py +++ b/openpype/lib/avalon_context.py @@ -9,6 +9,8 @@ import collections import functools import getpass +from bson.objectid import ObjectId + from openpype.settings import ( get_project_settings, get_system_settings @@ -169,7 +171,7 @@ def any_outdated(): representation_doc = avalon.io.find_one( { - "_id": avalon.io.ObjectId(representation), + "_id": ObjectId(representation), "type": "representation" }, projection={"parent": True} @@ -1703,7 +1705,7 @@ def _get_task_context_data_for_anatomy( "task": { "name": task_name, "type": task_type, - "short_name": project_task_type_data["short_name"] + "short": project_task_type_data["short_name"] } } diff --git a/openpype/lib/log.py b/openpype/lib/log.py index 98a3bae8e6..f33385e0ba 100644 --- a/openpype/lib/log.py +++ b/openpype/lib/log.py @@ -98,6 +98,10 @@ class PypeStreamHandler(logging.StreamHandler): self.flush() except (KeyboardInterrupt, SystemExit): raise + + except OSError: + self.handleError(record) + except Exception: print(repr(record)) self.handleError(record) diff --git a/openpype/lib/splash.txt b/openpype/lib/splash.txt deleted file mode 100644 index 
833bcd4b9c..0000000000 --- a/openpype/lib/splash.txt +++ /dev/null @@ -1,413 +0,0 @@ [413 deleted lines of ASCII-art terminal splash animation frames omitted] diff --git a/openpype/lib/terminal_splash.py b/openpype/lib/terminal_splash.py deleted file mode 100644 index 0ba2706a27..0000000000 --- a/openpype/lib/terminal_splash.py +++ /dev/null @@ -1,43 +0,0 @@ -# -*- coding: utf-8 -*- -"""Pype terminal animation.""" -import blessed -from pathlib import Path -from time import sleep - -NO_TERMINAL = False - -try: - term = blessed.Terminal() -except AttributeError: - # this happens when blessed cannot find proper terminal. - # If so, skip printing ascii art animation.
- NO_TERMINAL = True - - -def play_animation(): - """Play ASCII art Pype animation.""" - if NO_TERMINAL: - return - print(term.home + term.clear) - frame_size = 7 - splash_file = Path(__file__).parent / "splash.txt" - with splash_file.open("r") as sf: - animation = sf.readlines() - - animation_length = int(len(animation) / frame_size) - current_frame = 0 - for _ in range(animation_length): - frame = "".join( - scanline - for y, scanline in enumerate( - animation[current_frame: current_frame + frame_size] - ) - ) - - with term.location(0, 0): - # term.aquamarine3_bold(frame) - print(f"{term.bold}{term.aquamarine3}{frame}{term.normal}") - - sleep(0.02) - current_frame += frame_size - print(term.move_y(7)) diff --git a/openpype/lib/usdlib.py b/openpype/lib/usdlib.py index 3ae7430c7b..89021156b4 100644 --- a/openpype/lib/usdlib.py +++ b/openpype/lib/usdlib.py @@ -315,7 +315,7 @@ def get_usd_master_path(asset, subset, representation): ) template = project["config"]["template"]["publish"] - if isinstance(asset, dict) and "silo" in asset and "name" in asset: + if isinstance(asset, dict) and "name" in asset: # Allow explicitly passing asset document asset_doc = asset else: @@ -325,7 +325,6 @@ def get_usd_master_path(asset, subset, representation): **{ "root": api.registered_root(), "project": api.Session["AVALON_PROJECT"], - "silo": asset_doc["silo"], "asset": asset_doc["name"], "subset": subset, "representation": representation, diff --git a/openpype/modules/base.py b/openpype/modules/base.py index 175957ae39..5cdeb86087 100644 --- a/openpype/modules/base.py +++ b/openpype/modules/base.py @@ -28,26 +28,15 @@ from openpype.settings.lib import ( ) from openpype.lib import PypeLogger - -DEFAULT_OPENPYPE_MODULES = ( - "avalon_apps", - "clockify", - "log_viewer", - "deadline", - "muster", - "royalrender", - "python_console_interpreter", - "ftrack", - "slack", - "webserver", - "launcher_action", - "project_manager_action", - "settings_action", - "standalonepublish_action", - 
"traypublish_action", - "job_queue", - "timers_manager", - "sync_server", +# Files that will be always ignored on modules import +IGNORED_FILENAMES = ( + "__pycache__", +) +# Files ignored on modules import from "./openpype/modules" +IGNORED_DEFAULT_FILENAMES = ( + "__init__.py", + "base.py", + "interfaces.py", ) @@ -146,9 +135,16 @@ class _LoadCache: def get_default_modules_dir(): """Path to default OpenPype modules.""" + current_dir = os.path.abspath(os.path.dirname(__file__)) - return os.path.join(current_dir, "default_modules") + output = [] + for folder_name in ("default_modules", ): + path = os.path.join(current_dir, folder_name) + if os.path.exists(path) and os.path.isdir(path): + output.append(path) + + return output def get_dynamic_modules_dirs(): @@ -186,7 +182,7 @@ def get_dynamic_modules_dirs(): def get_module_dirs(): """List of paths where OpenPype modules can be found.""" _dirpaths = [] - _dirpaths.append(get_default_modules_dir()) + _dirpaths.extend(get_default_modules_dir()) _dirpaths.extend(get_dynamic_modules_dirs()) dirpaths = [] @@ -292,25 +288,45 @@ def _load_modules(): log = PypeLogger.get_logger("ModulesLoader") + current_dir = os.path.abspath(os.path.dirname(__file__)) + processed_paths = set() + processed_paths.add(current_dir) # Import default modules imported from 'openpype.modules' - for default_module_name in DEFAULT_OPENPYPE_MODULES: + for filename in os.listdir(current_dir): + # Ignore filenames + if ( + filename in IGNORED_FILENAMES + or filename in IGNORED_DEFAULT_FILENAMES + ): + continue + + fullpath = os.path.join(current_dir, filename) + basename, ext = os.path.splitext(filename) + + if not os.path.isdir(fullpath) and ext not in (".py", ): + continue + try: - import_str = "openpype.modules.{}".format(default_module_name) - new_import_str = "{}.{}".format(modules_key, default_module_name) + import_str = "openpype.modules.{}".format(basename) + new_import_str = "{}.{}".format(modules_key, basename) default_module = 
__import__(import_str, fromlist=("", )) sys.modules[new_import_str] = default_module - setattr(openpype_modules, default_module_name, default_module) + setattr(openpype_modules, basename, default_module) except Exception: msg = ( "Failed to import default module '{}'." - ).format(default_module_name) + ).format(basename) log.error(msg, exc_info=True) # Look for OpenPype modules in paths defined with `get_module_dirs` # - dynamically imported OpenPype modules and addons - dirpaths = get_module_dirs() - for dirpath in dirpaths: + for dirpath in get_module_dirs(): + # Skip already processed paths + if dirpath in processed_paths: + continue + processed_paths.add(dirpath) + if not os.path.exists(dirpath): log.warning(( "Could not find path when loading OpenPype modules \"{}\"" @@ -319,12 +335,15 @@ def _load_modules(): for filename in os.listdir(dirpath): # Ignore filenames - if filename in ("__pycache__", ): + if filename in IGNORED_FILENAMES: continue fullpath = os.path.join(dirpath, filename) basename, ext = os.path.splitext(filename) + if not os.path.isdir(fullpath) and ext not in (".py", ): + continue + # TODO add more logic how to define if folder is module or not # - check manifest and content of manifest try: diff --git a/openpype/modules/ftrack/event_handlers_user/action_delete_old_versions.py b/openpype/modules/ftrack/event_handlers_user/action_delete_old_versions.py index 1b694e25f1..5871646b20 100644 --- a/openpype/modules/ftrack/event_handlers_user/action_delete_old_versions.py +++ b/openpype/modules/ftrack/event_handlers_user/action_delete_old_versions.py @@ -492,7 +492,8 @@ class DeleteOldVersions(BaseAction): os.remove(file_path) self.log.debug("Removed file: {}".format(file_path)) - remainders.remove(file_path_base) + if file_path_base in remainders: + remainders.remove(file_path_base) continue seq_path_base = os.path.split(seq_path)[1] diff --git a/openpype/modules/log_viewer/tray/app.py b/openpype/modules/log_viewer/tray/app.py index 
1e8d6483cd..71827fcac9 100644 --- a/openpype/modules/log_viewer/tray/app.py +++ b/openpype/modules/log_viewer/tray/app.py @@ -26,3 +26,12 @@ class LogsWindow(QtWidgets.QWidget): self.log_detail = log_detail self.setStyleSheet(style.load_stylesheet()) + + self._frist_show = True + + def showEvent(self, event): + super(LogsWindow, self).showEvent(event) + + if self._frist_show: + self._frist_show = False + self.logs_widget.refresh() diff --git a/openpype/modules/log_viewer/tray/widgets.py b/openpype/modules/log_viewer/tray/widgets.py index ff77405de5..ed08e62109 100644 --- a/openpype/modules/log_viewer/tray/widgets.py +++ b/openpype/modules/log_viewer/tray/widgets.py @@ -155,6 +155,11 @@ class LogsWidget(QtWidgets.QWidget): QtCore.Qt.DescendingOrder ) + refresh_triggered_timer = QtCore.QTimer() + refresh_triggered_timer.setSingleShot(True) + refresh_triggered_timer.setInterval(200) + + refresh_triggered_timer.timeout.connect(self._on_refresh_timeout) view.selectionModel().selectionChanged.connect(self._on_index_change) refresh_btn.clicked.connect(self._on_refresh_clicked) @@ -169,10 +174,12 @@ class LogsWidget(QtWidgets.QWidget): self.detail_widget = detail_widget self.refresh_btn = refresh_btn - # prepare - self.refresh() + self._refresh_triggered_timer = refresh_triggered_timer def refresh(self): + self._refresh_triggered_timer.start() + + def _on_refresh_timeout(self): self.model.refresh() self.detail_widget.refresh() diff --git a/openpype/modules/slack/plugins/publish/collect_slack_family.py b/openpype/modules/slack/plugins/publish/collect_slack_family.py index 6c965b04cd..7475bdc89e 100644 --- a/openpype/modules/slack/plugins/publish/collect_slack_family.py +++ b/openpype/modules/slack/plugins/publish/collect_slack_family.py @@ -35,20 +35,25 @@ class CollectSlackFamilies(pyblish.api.InstancePlugin): return # make slack publishable - if profile: - self.log.info("Found profile: {}".format(profile)) - if instance.data.get('families'): - 
instance.data['families'].append('slack') - else: - instance.data['families'] = ['slack'] + if not profile: + return - instance.data["slack_channel_message_profiles"] = \ - profile["channel_messages"] + self.log.info("Found profile: {}".format(profile)) + if instance.data.get('families'): + instance.data['families'].append('slack') + else: + instance.data['families'] = ['slack'] - slack_token = (instance.context.data["project_settings"] - ["slack"] - ["token"]) - instance.data["slack_token"] = slack_token + selected_profiles = profile["channel_messages"] + for prof in selected_profiles: + prof["review_upload_limit"] = profile.get("review_upload_limit", + 50) + instance.data["slack_channel_message_profiles"] = selected_profiles + + slack_token = (instance.context.data["project_settings"] + ["slack"] + ["token"]) + instance.data["slack_token"] = slack_token def main_family_from_instance(self, instance): # TODO yank from integrate """Returns main family of entered instance.""" diff --git a/openpype/modules/slack/plugins/publish/integrate_slack_api.py b/openpype/modules/slack/plugins/publish/integrate_slack_api.py index 018a7594bb..10bde7d4c0 100644 --- a/openpype/modules/slack/plugins/publish/integrate_slack_api.py +++ b/openpype/modules/slack/plugins/publish/integrate_slack_api.py @@ -35,7 +35,7 @@ class IntegrateSlackAPI(pyblish.api.InstancePlugin): message = self._get_filled_message(message_profile["message"], instance, review_path) - self.log.info("message:: {}".format(message)) + self.log.debug("message:: {}".format(message)) if not message: return @@ -43,7 +43,8 @@ class IntegrateSlackAPI(pyblish.api.InstancePlugin): publish_files.add(thumbnail_path) if message_profile["upload_review"] and review_path: - publish_files.add(review_path) + message, publish_files = self._handle_review_upload( + message, message_profile, publish_files, review_path) project = instance.context.data["anatomyData"]["project"]["code"] for channel in message_profile["channels"]: @@ -75,6 
+76,19 @@ class IntegrateSlackAPI(pyblish.api.InstancePlugin): dbcon = mongo_client[database_name]["notification_messages"] dbcon.insert_one(msg) + def _handle_review_upload(self, message, message_profile, publish_files, + review_path): + """Check if uploaded file is not too large""" + review_file_size_MB = os.path.getsize(review_path) / 1024 / 1024 + file_limit = message_profile.get("review_upload_limit", 50) + if review_file_size_MB > file_limit: + message += "\nReview upload omitted because of file size." + if review_path not in message: + message += "\nFile located at: {}".format(review_path) + else: + publish_files.add(review_path) + return message, publish_files + def _get_filled_message(self, message_templ, instance, review_path=None): """Use message_templ and data from instance to get message content. @@ -210,6 +224,9 @@ class IntegrateSlackAPI(pyblish.api.InstancePlugin): # You will get a SlackApiError if "ok" is False error_str = self._enrich_error(str(e.response["error"]), channel) self.log.warning("Error happened {}".format(error_str)) + except Exception as e: + error_str = self._enrich_error(str(e), channel) + self.log.warning("Not SlackAPI error", exc_info=True) return None, [] diff --git a/openpype/pipeline/__init__.py b/openpype/pipeline/__init__.py index 26970e4edc..511e4c7b94 100644 --- a/openpype/pipeline/__init__.py +++ b/openpype/pipeline/__init__.py @@ -1,4 +1,7 @@ -from .lib import attribute_definitions +from .constants import ( + AVALON_CONTAINER_ID, + HOST_WORKFILE_EXTENSIONS, +) from .create import ( BaseCreator, @@ -38,11 +41,31 @@ from .publish import ( PublishValidationError, PublishXmlValidationError, KnownPublishError, - OpenPypePyblishPluginMixin + OpenPypePyblishPluginMixin, + OptionalPyblishPluginMixin, +) + +from .actions import ( + LauncherAction, + + InventoryAction, + + discover_launcher_actions, + register_launcher_action, + register_launcher_action_path, + + discover_inventory_actions, + register_inventory_action, + 
register_inventory_action_path, + deregister_inventory_action, + deregister_inventory_action_path, ) __all__ = ( + "AVALON_CONTAINER_ID", + "HOST_WORKFILE_EXTENSIONS", + "attribute_definitions", # --- Create --- @@ -82,5 +105,20 @@ __all__ = ( "PublishValidationError", "PublishXmlValidationError", "KnownPublishError", - "OpenPypePyblishPluginMixin" + "OpenPypePyblishPluginMixin", + "OptionalPyblishPluginMixin", + + # --- Actions --- + "LauncherAction", + "InventoryAction", + + "discover_launcher_actions", + "register_launcher_action", + "register_launcher_action_path", + + "discover_inventory_actions", + "register_inventory_action", + "register_inventory_action_path", + "deregister_inventory_action", + "deregister_inventory_action_path", ) diff --git a/openpype/pipeline/actions.py b/openpype/pipeline/actions.py new file mode 100644 index 0000000000..141e277db3 --- /dev/null +++ b/openpype/pipeline/actions.py @@ -0,0 +1,144 @@ +import logging + + +class LauncherAction(object): + """A custom action available""" + name = None + label = None + icon = None + color = None + order = 0 + + log = logging.getLogger("LauncherAction") + log.propagate = True + + def is_compatible(self, session): + """Return whether the class is compatible with the Session.""" + return True + + def process(self, session, **kwargs): + pass + + +class InventoryAction(object): + """A custom action for the scene inventory tool + + If registered the action will be visible in the Right Mouse Button menu + under the submenu "Actions". + + """ + + label = None + icon = None + color = None + order = 0 + + log = logging.getLogger("InventoryAction") + log.propagate = True + + @staticmethod + def is_compatible(container): + """Override function in a custom class + + This method is specifically used to ensure the action can operate on + the container. 
+ + Args: + container(dict): the data of a loaded asset, see host.ls() + + Returns: + bool + """ + return bool(container.get("objectName")) + + def process(self, containers): + """Override function in a custom class + + This method will receive all containers even those which are + incompatible. It is advised to create a small filter along the lines + of this example: + + valid_containers = filter(self.is_compatible, containers) + + The return value will need to be a True-ish value to trigger + the data_changed signal in order to refresh the view. + + You can return a list of container names to trigger GUI to select + treeview items. + + You can return a dict to carry extra GUI options. For example: + { + "objectNames": [container names...], + "options": {"mode": "toggle", + "clear": False} + } + Currently workable GUI options are: + - clear (bool): Clear current selection before selecting by action. + Default `True`. + - mode (str): selection mode, use one of these: + "select", "deselect", "toggle". Default is "select".
+ + Args: + containers (list): list of dictionaries + + Return: + bool, list or dict + + """ + return True + + +# Launcher action +def discover_launcher_actions(): + import avalon.api + + return avalon.api.discover(LauncherAction) + + +def register_launcher_action(plugin): + import avalon.api + + return avalon.api.register_plugin(LauncherAction, plugin) + + +def register_launcher_action_path(path): + import avalon.api + + return avalon.api.register_plugin_path(LauncherAction, path) + + +# Inventory action +def discover_inventory_actions(): + import avalon.api + + actions = avalon.api.discover(InventoryAction) + filtered_actions = [] + for action in actions: + if action is not InventoryAction: + filtered_actions.append(action) + + return filtered_actions + + +def register_inventory_action(plugin): + import avalon.api + + return avalon.api.register_plugin(InventoryAction, plugin) + + +def deregister_inventory_action(plugin): + import avalon.api + + avalon.api.deregister_plugin(InventoryAction, plugin) + + +def register_inventory_action_path(path): + import avalon.api + + return avalon.api.register_plugin_path(InventoryAction, path) + + +def deregister_inventory_action_path(path): + import avalon.api + + return avalon.api.deregister_plugin_path(InventoryAction, path) diff --git a/openpype/pipeline/constants.py b/openpype/pipeline/constants.py new file mode 100644 index 0000000000..e6496cbf95 --- /dev/null +++ b/openpype/pipeline/constants.py @@ -0,0 +1,19 @@ +# Metadata ID of loaded container into scene +AVALON_CONTAINER_ID = "pyblish.avalon.container" + +# TODO get extensions from host implementations +HOST_WORKFILE_EXTENSIONS = { + "blender": [".blend"], + "celaction": [".scn"], + "tvpaint": [".tvpp"], + "fusion": [".comp"], + "harmony": [".zip"], + "houdini": [".hip", ".hiplc", ".hipnc"], + "maya": [".ma", ".mb"], + "nuke": [".nk"], + "hiero": [".hrox"], + "photoshop": [".psd", ".psb"], + "premiere": [".prproj"], + "resolve": [".drp"], + "aftereffects": [".aep"] +} 
diff --git a/openpype/pipeline/create/context.py b/openpype/pipeline/create/context.py index c2757a4502..eeb08a6294 100644 --- a/openpype/pipeline/create/context.py +++ b/openpype/pipeline/create/context.py @@ -6,7 +6,6 @@ import inspect from uuid import uuid4 from contextlib import contextmanager -from ..lib import UnknownDef from .creator_plugins import ( BaseCreator, Creator, @@ -87,6 +86,8 @@ class AttributeValues: origin_data(dict): Values loaded from host before conversion. """ def __init__(self, attr_defs, values, origin_data=None): + from openpype.lib.attribute_definitions import UnknownDef + if origin_data is None: origin_data = copy.deepcopy(values) self._origin_data = origin_data diff --git a/openpype/pipeline/lib/__init__.py b/openpype/pipeline/lib/__init__.py deleted file mode 100644 index f762c4205d..0000000000 --- a/openpype/pipeline/lib/__init__.py +++ /dev/null @@ -1,30 +0,0 @@ -from .attribute_definitions import ( - AbtractAttrDef, - - UIDef, - UISeparatorDef, - UILabelDef, - - UnknownDef, - NumberDef, - TextDef, - EnumDef, - BoolDef, - FileDef, -) - - -__all__ = ( - "AbtractAttrDef", - - "UIDef", - "UISeparatorDef", - "UILabelDef", - - "UnknownDef", - "NumberDef", - "TextDef", - "EnumDef", - "BoolDef", - "FileDef", -) diff --git a/openpype/pipeline/load/plugins.py b/openpype/pipeline/load/plugins.py index 601ad3b258..9b2b6bb084 100644 --- a/openpype/pipeline/load/plugins.py +++ b/openpype/pipeline/load/plugins.py @@ -127,4 +127,5 @@ def register_loader_plugin_path(path): def deregister_loader_plugin(plugin): import avalon.api + avalon.api.deregister_plugin(LoaderPlugin, plugin) diff --git a/openpype/pipeline/load/utils.py b/openpype/pipeline/load/utils.py index 6d32c11cd7..53ac6b626d 100644 --- a/openpype/pipeline/load/utils.py +++ b/openpype/pipeline/load/utils.py @@ -7,6 +7,7 @@ import inspect import numbers import six +from bson.objectid import ObjectId from avalon import io, schema from avalon.api import Session, registered_root @@ -67,7 
+68,7 @@ def get_repres_contexts(representation_ids, dbcon=None): _representation_ids = [] for repre_id in representation_ids: if isinstance(repre_id, six.string_types): - repre_id = io.ObjectId(repre_id) + repre_id = ObjectId(repre_id) _representation_ids.append(repre_id) repre_docs = dbcon.find({ @@ -174,7 +175,7 @@ def get_subset_contexts(subset_ids, dbcon=None): _subset_ids = set() for subset_id in subset_ids: if isinstance(subset_id, six.string_types): - subset_id = io.ObjectId(subset_id) + subset_id = ObjectId(subset_id) _subset_ids.add(subset_id) subset_docs = dbcon.find({ @@ -217,7 +218,7 @@ def get_representation_context(representation): """Return parenthood context for representation. Args: - representation (str or io.ObjectId or dict): The representation id + representation (str or ObjectId or dict): The representation id or full representation as returned by the database. Returns: @@ -227,9 +228,9 @@ def get_representation_context(representation): assert representation is not None, "This is a bug" - if isinstance(representation, (six.string_types, io.ObjectId)): + if isinstance(representation, (six.string_types, ObjectId)): representation = io.find_one( - {"_id": io.ObjectId(str(representation))}) + {"_id": ObjectId(str(representation))}) version, subset, asset, project = io.parenthood(representation) @@ -340,7 +341,7 @@ def load_container( Args: Loader (Loader): The loader class to trigger. - representation (str or io.ObjectId or dict): The representation id + representation (str or ObjectId or dict): The representation id or full representation as returned by the database. namespace (str, Optional): The namespace to assign. Defaults to None. name (str, Optional): The name to assign. Defaults to subset name. 
@@ -404,7 +405,7 @@ def update_container(container, version=-1): # Compute the different version from 'representation' current_representation = io.find_one({ - "_id": io.ObjectId(container["representation"]) + "_id": ObjectId(container["representation"]) }) assert current_representation is not None, "This is a bug" @@ -593,7 +594,6 @@ def get_representation_path(representation, root=None, dbcon=None): "code": project.get("data", {}).get("code") }, "asset": asset["name"], - "silo": asset.get("silo"), "hierarchy": hierarchy, "subset": subset["name"], "version": version_["name"], diff --git a/openpype/pipeline/publish/__init__.py b/openpype/pipeline/publish/__init__.py index c2729a46ce..af5d7c4a91 100644 --- a/openpype/pipeline/publish/__init__.py +++ b/openpype/pipeline/publish/__init__.py @@ -3,6 +3,7 @@ from .publish_plugins import ( PublishXmlValidationError, KnownPublishError, OpenPypePyblishPluginMixin, + OptionalPyblishPluginMixin, ) from .lib import ( @@ -18,6 +19,7 @@ __all__ = ( "PublishXmlValidationError", "KnownPublishError", "OpenPypePyblishPluginMixin", + "OptionalPyblishPluginMixin", "DiscoverResult", "publish_plugins_discover", diff --git a/openpype/pipeline/publish/publish_plugins.py b/openpype/pipeline/publish/publish_plugins.py index bce64ec709..2402a005c2 100644 --- a/openpype/pipeline/publish/publish_plugins.py +++ b/openpype/pipeline/publish/publish_plugins.py @@ -1,3 +1,4 @@ +from openpype.lib import BoolDef from .lib import load_help_content_from_plugin @@ -108,3 +109,64 @@ class OpenPypePyblishPluginMixin: plugin_values[key] ) return attribute_values + + def get_attr_values_from_data(self, data): + """Get attribute values for attribute definitions from data. + + Args: + data(dict): Data from instance or context. + """ + return ( + data + .get("publish_attributes", {}) + .get(self.__class__.__name__, {}) + ) + + +class OptionalPyblishPluginMixin(OpenPypePyblishPluginMixin): + """Prepare mixin for optional plugins. 
+ + Defines an 'active' attribute definition prepared for publishing and + provides a method which checks whether the plugin is active. + + ``` + class ValidateScene( + pyblish.api.InstancePlugin, OptionalPyblishPluginMixin + ): + def process(self, instance): + # Skip the instance if it is not active by data on the instance + if not self.is_active(instance.data): + return + ``` + """ + + @classmethod + def get_attribute_defs(cls): + """Attribute definitions based on plugin's optional attribute.""" + + # Empty list if plugin is not optional + if not getattr(cls, "optional", None): + return [] + + # Get active value from class as default value + active = getattr(cls, "active", True) + # Return boolean stored under 'active' key with label of the class name + label = cls.label or cls.__name__ + return [ + BoolDef("active", default=active, label=label) + ] + + def is_active(self, data): + """Check if plugin is active for instance/context based on its data. + + Args: + data(dict): Data from instance or context. 
+ """ + # Skip if is not optional and return True + if not getattr(self, "optional", None): + return True + attr_values = self.get_attr_values_from_data(data) + active = attr_values.get("active") + if active is None: + active = getattr(self, "active", True) + return active diff --git a/openpype/pipeline/thumbnail.py b/openpype/pipeline/thumbnail.py new file mode 100644 index 0000000000..12bab83be6 --- /dev/null +++ b/openpype/pipeline/thumbnail.py @@ -0,0 +1,147 @@ +import os +import copy +import logging + +log = logging.getLogger(__name__) + + +def get_thumbnail_binary(thumbnail_entity, thumbnail_type, dbcon=None): + if not thumbnail_entity: + return + + resolvers = discover_thumbnail_resolvers() + resolvers = sorted(resolvers, key=lambda cls: cls.priority) + if dbcon is None: + from avalon import io + dbcon = io + + for Resolver in resolvers: + available_types = Resolver.thumbnail_types + if ( + thumbnail_type not in available_types + and "*" not in available_types + and ( + isinstance(available_types, (list, tuple)) + and len(available_types) == 0 + ) + ): + continue + try: + instance = Resolver(dbcon) + result = instance.process(thumbnail_entity, thumbnail_type) + if result: + return result + + except Exception: + log.warning("Resolver {0} failed durring process.".format( + Resolver.__class__.__name__, exc_info=True + )) + + +class ThumbnailResolver(object): + """Determine how to get data from thumbnail entity. + + "priority" - determines the order of processing in `get_thumbnail_binary`, + lower number is processed earlier. + "thumbnail_types" - it is expected that thumbnails will be used in more + more than one level, there is only ["thumbnail"] type at the moment + of creating this docstring but it is expected to add "ico" and "full" + in future. 
+ """ + + priority = 100 + thumbnail_types = ["*"] + + def __init__(self, dbcon): + self._log = None + self.dbcon = dbcon + + @property + def log(self): + if self._log is None: + self._log = logging.getLogger(self.__class__.__name__) + return self._log + + def process(self, thumbnail_entity, thumbnail_type): + pass + + +class TemplateResolver(ThumbnailResolver): + + priority = 90 + + def process(self, thumbnail_entity, thumbnail_type): + + if not os.environ.get("AVALON_THUMBNAIL_ROOT"): + return + + template = thumbnail_entity["data"].get("template") + if not template: + self.log.debug("Thumbnail entity does not have set template") + return + + project = self.dbcon.find_one( + {"type": "project"}, + { + "name": True, + "data.code": True + } + ) + + template_data = copy.deepcopy( + thumbnail_entity["data"].get("template_data") or {} + ) + template_data.update({ + "_id": str(thumbnail_entity["_id"]), + "thumbnail_type": thumbnail_type, + "thumbnail_root": os.environ.get("AVALON_THUMBNAIL_ROOT"), + "project": { + "name": project["name"], + "code": project["data"].get("code") + } + }) + + try: + filepath = os.path.normpath(template.format(**template_data)) + except KeyError: + self.log.warning(( + "Missing template data keys for template <{0}> || Data: {1}" + ).format(template, str(template_data))) + return + + if not os.path.exists(filepath): + self.log.warning("File does not exist \"{0}\"".format(filepath)) + return + + with open(filepath, "rb") as _file: + content = _file.read() + + return content + + +class BinaryThumbnail(ThumbnailResolver): + def process(self, thumbnail_entity, thumbnail_type): + return thumbnail_entity["data"].get("binary_data") + + +# Thumbnail resolvers +def discover_thumbnail_resolvers(): + import avalon.api + + return avalon.api.discover(ThumbnailResolver) + + +def register_thumbnail_resolver(plugin): + import avalon.api + + return avalon.api.register_plugin(ThumbnailResolver, plugin) + + +def register_thumbnail_resolver_path(path): + import 
avalon.api + + return avalon.api.register_plugin_path(ThumbnailResolver, path) + + +register_thumbnail_resolver(TemplateResolver) +register_thumbnail_resolver(BinaryThumbnail) diff --git a/openpype/plugins/load/delete_old_versions.py b/openpype/plugins/load/delete_old_versions.py index 692acdec02..2789f4ea23 100644 --- a/openpype/plugins/load/delete_old_versions.py +++ b/openpype/plugins/load/delete_old_versions.py @@ -126,7 +126,8 @@ class DeleteOldVersions(load.SubsetLoaderPlugin): os.remove(file_path) self.log.debug("Removed file: {}".format(file_path)) - remainders.remove(file_path_base) + if file_path_base in remainders: + remainders.remove(file_path_base) continue seq_path_base = os.path.split(seq_path)[1] @@ -333,6 +334,8 @@ class DeleteOldVersions(load.SubsetLoaderPlugin): def main(self, data, remove_publish_folder): # Size of files. size = 0 + if not data: + return size if remove_publish_folder: size = self.delete_whole_dir_paths(data["dir_paths"].values()) @@ -418,6 +421,8 @@ class DeleteOldVersions(load.SubsetLoaderPlugin): ) data = self.get_data(context, versions_to_keep) + if not data: + continue size += self.main(data, remove_publish_folder) print("Progressing {}/{}".format(count + 1, len(contexts))) diff --git a/openpype/plugins/publish/collect_scene_loaded_versions.py b/openpype/plugins/publish/collect_scene_loaded_versions.py index d8119846c6..6746757e5f 100644 --- a/openpype/plugins/publish/collect_scene_loaded_versions.py +++ b/openpype/plugins/publish/collect_scene_loaded_versions.py @@ -1,3 +1,4 @@ +from bson.objectid import ObjectId import pyblish.api from avalon import api, io @@ -35,7 +36,7 @@ class CollectSceneLoadedVersions(pyblish.api.ContextPlugin): loaded_versions = [] _containers = list(host.ls()) - _repr_ids = [io.ObjectId(c["representation"]) for c in _containers] + _repr_ids = [ObjectId(c["representation"]) for c in _containers] version_by_repr = { str(doc["_id"]): doc["parent"] for doc in io.find({"_id": {"$in": _repr_ids}}, 
projection={"parent": 1}) @@ -46,7 +47,7 @@ class CollectSceneLoadedVersions(pyblish.api.ContextPlugin): # may have more then one representation that are same version version = { "subsetName": con["name"], - "representation": io.ObjectId(con["representation"]), + "representation": ObjectId(con["representation"]), "version": version_by_repr[con["representation"]], # _id } loaded_versions.append(version) diff --git a/openpype/plugins/publish/extract_hierarchy_avalon.py b/openpype/plugins/publish/extract_hierarchy_avalon.py index e263edd931..b062a9c4b5 100644 --- a/openpype/plugins/publish/extract_hierarchy_avalon.py +++ b/openpype/plugins/publish/extract_hierarchy_avalon.py @@ -64,7 +64,7 @@ class ExtractHierarchyToAvalon(pyblish.api.ContextPlugin): data["tasks"] = tasks parents = [] visualParent = None - # do not store project"s id as visualParent (silo asset) + # do not store project"s id as visualParent if self.project is not None: if self.project["_id"] != parent["_id"]: visualParent = parent["_id"] diff --git a/openpype/plugins/publish/integrate_hero_version.py b/openpype/plugins/publish/integrate_hero_version.py index 60245314f4..466606d08b 100644 --- a/openpype/plugins/publish/integrate_hero_version.py +++ b/openpype/plugins/publish/integrate_hero_version.py @@ -4,6 +4,7 @@ import clique import errno import shutil +from bson.objectid import ObjectId from pymongo import InsertOne, ReplaceOne import pyblish.api from avalon import api, io, schema @@ -161,7 +162,7 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin): if old_version: new_version_id = old_version["_id"] else: - new_version_id = io.ObjectId() + new_version_id = ObjectId() new_hero_version = { "_id": new_version_id, @@ -384,7 +385,7 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin): # Create representation else: - repre["_id"] = io.ObjectId() + repre["_id"] = ObjectId() bulk_writes.append( InsertOne(repre) ) @@ -420,7 +421,7 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin): else: 
repre["old_id"] = repre["_id"] - repre["_id"] = io.ObjectId() + repre["_id"] = ObjectId() repre["type"] = "archived_representation" bulk_writes.append( InsertOne(repre) diff --git a/openpype/plugins/publish/integrate_inputlinks.py b/openpype/plugins/publish/integrate_inputlinks.py index f973dfc963..11cffc4638 100644 --- a/openpype/plugins/publish/integrate_inputlinks.py +++ b/openpype/plugins/publish/integrate_inputlinks.py @@ -1,8 +1,10 @@ - from collections import OrderedDict -from avalon import io + +from bson.objectid import ObjectId import pyblish.api +from avalon import io + class IntegrateInputLinks(pyblish.api.ContextPlugin): """Connecting version level dependency links""" @@ -104,7 +106,7 @@ class IntegrateInputLinks(pyblish.api.ContextPlugin): # future. link = OrderedDict() link["type"] = link_type - link["id"] = io.ObjectId(input_id) + link["id"] = ObjectId(input_id) link["linkedBy"] = "publish" if "inputLinks" not in version_doc["data"]: diff --git a/openpype/plugins/publish/integrate_new.py b/openpype/plugins/publish/integrate_new.py index 6ca6125cb2..2304f98713 100644 --- a/openpype/plugins/publish/integrate_new.py +++ b/openpype/plugins/publish/integrate_new.py @@ -9,6 +9,7 @@ import six import re import shutil +from bson.objectid import ObjectId from pymongo import DeleteOne, InsertOne import pyblish.api from avalon import io @@ -104,7 +105,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): "effect", "xgen", "hda", - "usd" + "usd", + "usdComposition", + "usdOverride" ] exclude_families = ["clip"] db_representation_context_keys = [ @@ -294,7 +297,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): bulk_writes.append(DeleteOne({"_id": repre_id})) repre["orig_id"] = repre_id - repre["_id"] = io.ObjectId() + repre["_id"] = ObjectId() repre["type"] = "archived_representation" bulk_writes.append(InsertOne(repre)) @@ -573,7 +576,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): # Create new id if existing representations does not match 
if repre_id is None: - repre_id = io.ObjectId() + repre_id = ObjectId() data = repre.get("data") or {} data.update({'path': dst, 'template': template}) @@ -782,7 +785,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): families = [instance.data["family"]] families.extend(instance.data.get("families", [])) io.update_many( - {"type": "subset", "_id": io.ObjectId(subset["_id"])}, + {"type": "subset", "_id": ObjectId(subset["_id"])}, {"$set": {"data.families": families}} ) @@ -807,7 +810,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): if subset_group: io.update_many({ 'type': 'subset', - '_id': io.ObjectId(subset_id) + '_id': ObjectId(subset_id) }, {'$set': {'data.subsetGroup': subset_group}}) def _get_subset_group(self, instance): @@ -1054,7 +1057,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): sync_project_presets = None rec = { - "_id": io.ObjectId(), + "_id": ObjectId(), "path": path } if size: diff --git a/openpype/settings/constants.py b/openpype/settings/constants.py index 8b8acf5714..19ff953eb4 100644 --- a/openpype/settings/constants.py +++ b/openpype/settings/constants.py @@ -8,11 +8,11 @@ M_ENVIRONMENT_KEY = "__environment_keys__" # Metadata key for storing dynamic created labels M_DYNAMIC_KEY_LABEL = "__dynamic_keys_labels__" -METADATA_KEYS = ( +METADATA_KEYS = frozenset([ M_OVERRIDDEN_KEY, M_ENVIRONMENT_KEY, M_DYNAMIC_KEY_LABEL -) +]) # Keys where studio's system overrides are stored GLOBAL_SETTINGS_KEY = "global_settings" diff --git a/openpype/settings/defaults/project_settings/slack.json b/openpype/settings/defaults/project_settings/slack.json index d77b8c2208..c156fed08e 100644 --- a/openpype/settings/defaults/project_settings/slack.json +++ b/openpype/settings/defaults/project_settings/slack.json @@ -11,6 +11,7 @@ "task_types": [], "tasks": [], "subsets": [], + "review_upload_limit": 50.0, "channel_messages": [] } ] diff --git a/openpype/settings/entities/base_entity.py b/openpype/settings/entities/base_entity.py index 
76700d605d..21ee44ae77 100644 --- a/openpype/settings/entities/base_entity.py +++ b/openpype/settings/entities/base_entity.py @@ -173,6 +173,10 @@ class BaseItemEntity(BaseEntity): # Entity has set `_project_override_value` (is not NOT_SET) self.had_project_override = False + self._default_log_invalid_types = True + self._studio_log_invalid_types = True + self._project_log_invalid_types = True + # Callbacks that are called on change. # - main current purspose is to register GUI callbacks self.on_change_callbacks = [] @@ -419,7 +423,7 @@ class BaseItemEntity(BaseEntity): raise InvalidValueType(self.valid_value_types, type(value), self.path) # TODO convert to private method - def _check_update_value(self, value, value_source): + def _check_update_value(self, value, value_source, log_invalid_types=True): """Validation of value on update methods. Update methods update data from currently saved settings so it is @@ -447,16 +451,17 @@ class BaseItemEntity(BaseEntity): if new_value is not NOT_SET: return new_value - # Warning log about invalid value type. - self.log.warning( - ( - "{} Got invalid value type for {} values." - " Expected types: {} | Got Type: {} | Value: \"{}\"" - ).format( - self.path, value_source, - self.valid_value_types, type(value), str(value) + if log_invalid_types: + # Warning log about invalid value type. + self.log.warning( + ( + "{} Got invalid value type for {} values." + " Expected types: {} | Got Type: {} | Value: \"{}\"" + ).format( + self.path, value_source, + self.valid_value_types, type(value), str(value) + ) ) - ) return NOT_SET def available_for_role(self, role_name=None): @@ -985,7 +990,7 @@ class ItemEntity(BaseItemEntity): return self.root_item.get_entity_from_path(path) @abstractmethod - def update_default_value(self, parent_values): + def update_default_value(self, parent_values, log_invalid_types=True): """Fill default values on startup or on refresh. 
Default values stored in `openpype` repository should update all items @@ -995,11 +1000,13 @@ class ItemEntity(BaseItemEntity): Args: parent_values (dict): Values of parent's item. But in case item is used as widget, `parent_values` contain value for item. + log_invalid_types (bool): Log invalid type of value. Used when + entity can have children with same keys and different types. """ pass @abstractmethod - def update_studio_value(self, parent_values): + def update_studio_value(self, parent_values, log_invalid_types=True): """Fill studio override values on startup or on refresh. Set studio value if is not set to NOT_SET, in that case studio @@ -1008,11 +1015,13 @@ class ItemEntity(BaseItemEntity): Args: parent_values (dict): Values of parent's item. But in case item is used as widget, `parent_values` contain value for item. + log_invalid_types (bool): Log invalid type of value. Used when + entity can have children with same keys and different types. """ pass @abstractmethod - def update_project_value(self, parent_values): + def update_project_value(self, parent_values, log_invalid_types=True): """Fill project override values on startup, refresh or project change. Set project value if is not set to NOT_SET, in that case project @@ -1021,5 +1030,7 @@ class ItemEntity(BaseItemEntity): Args: parent_values (dict): Values of parent's item. But in case item is used as widget, `parent_values` contain value for item. + log_invalid_types (bool): Log invalid type of value. Used when + entity can have children with same keys and different types. 
""" pass diff --git a/openpype/settings/entities/dict_conditional.py b/openpype/settings/entities/dict_conditional.py index 19f326aea7..88d2dc8296 100644 --- a/openpype/settings/entities/dict_conditional.py +++ b/openpype/settings/entities/dict_conditional.py @@ -518,12 +518,18 @@ class DictConditionalEntity(ItemEntity): output.update(self._current_metadata) return output - def _prepare_value(self, value): + def _prepare_value(self, value, log_invalid_types): if value is NOT_SET or self.enum_key not in value: return NOT_SET, NOT_SET enum_value = value.get(self.enum_key) if enum_value not in self.non_gui_children: + if log_invalid_types: + self.log.warning( + "{} Unknown enum key in default values: {}".format( + self.path, enum_value + ) + ) return NOT_SET, NOT_SET # Create copy of value before poping values @@ -551,22 +557,25 @@ class DictConditionalEntity(ItemEntity): return value, metadata - def update_default_value(self, value): + def update_default_value(self, value, log_invalid_types=True): """Update default values. Not an api method, should be called by parent. 
""" - value = self._check_update_value(value, "default") + self._default_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "default", log_invalid_types + ) self.has_default_value = value is not NOT_SET # TODO add value validation - value, metadata = self._prepare_value(value) + value, metadata = self._prepare_value(value, log_invalid_types) self._default_metadata = metadata if value is NOT_SET: - self.enum_entity.update_default_value(value) + self.enum_entity.update_default_value(value, log_invalid_types) for children_by_key in self.non_gui_children.values(): for child_obj in children_by_key.values(): - child_obj.update_default_value(value) + child_obj.update_default_value(value, log_invalid_types) return value_keys = set(value.keys()) @@ -574,7 +583,7 @@ class DictConditionalEntity(ItemEntity): expected_keys = set(self.non_gui_children[enum_value].keys()) expected_keys.add(self.enum_key) unknown_keys = value_keys - expected_keys - if unknown_keys: + if unknown_keys and log_invalid_types: self.log.warning( "{} Unknown keys in default values: {}".format( self.path, @@ -582,28 +591,37 @@ class DictConditionalEntity(ItemEntity): ) ) - self.enum_entity.update_default_value(enum_value) - for children_by_key in self.non_gui_children.values(): + self.enum_entity.update_default_value(enum_value, log_invalid_types) + + for enum_key, children_by_key in self.non_gui_children.items(): + _log_invalid_types = log_invalid_types + if _log_invalid_types: + _log_invalid_types = enum_key == enum_value + value_copy = copy.deepcopy(value) for key, child_obj in children_by_key.items(): child_value = value_copy.get(key, NOT_SET) - child_obj.update_default_value(child_value) + child_obj.update_default_value(child_value, _log_invalid_types) - def update_studio_value(self, value): + def update_studio_value(self, value, log_invalid_types=True): """Update studio override values. Not an api method, should be called by parent. 
""" - value = self._check_update_value(value, "studio override") - value, metadata = self._prepare_value(value) + + self._studio_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "studio override", log_invalid_types + ) + value, metadata = self._prepare_value(value, log_invalid_types) self._studio_override_metadata = metadata self.had_studio_override = metadata is not NOT_SET if value is NOT_SET: - self.enum_entity.update_studio_value(value) + self.enum_entity.update_studio_value(value, log_invalid_types) for children_by_key in self.non_gui_children.values(): for child_obj in children_by_key.values(): - child_obj.update_studio_value(value) + child_obj.update_studio_value(value, log_invalid_types) return value_keys = set(value.keys()) @@ -611,7 +629,7 @@ class DictConditionalEntity(ItemEntity): expected_keys = set(self.non_gui_children[enum_value]) expected_keys.add(self.enum_key) unknown_keys = value_keys - expected_keys - if unknown_keys: + if unknown_keys and log_invalid_types: self.log.warning( "{} Unknown keys in studio overrides: {}".format( self.path, @@ -619,28 +637,36 @@ class DictConditionalEntity(ItemEntity): ) ) - self.enum_entity.update_studio_value(enum_value) - for children_by_key in self.non_gui_children.values(): + self.enum_entity.update_studio_value(enum_value, log_invalid_types) + for enum_key, children_by_key in self.non_gui_children.items(): + _log_invalid_types = log_invalid_types + if _log_invalid_types: + _log_invalid_types = enum_key == enum_value + value_copy = copy.deepcopy(value) for key, child_obj in children_by_key.items(): child_value = value_copy.get(key, NOT_SET) - child_obj.update_studio_value(child_value) + child_obj.update_studio_value(child_value, _log_invalid_types) - def update_project_value(self, value): + def update_project_value(self, value, log_invalid_types=True): """Update project override values. Not an api method, should be called by parent. 
""" - value = self._check_update_value(value, "project override") - value, metadata = self._prepare_value(value) + + self._project_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "project override", log_invalid_types + ) + value, metadata = self._prepare_value(value, log_invalid_types) self._project_override_metadata = metadata self.had_project_override = metadata is not NOT_SET if value is NOT_SET: - self.enum_entity.update_project_value(value) + self.enum_entity.update_project_value(value, log_invalid_types) for children_by_key in self.non_gui_children.values(): for child_obj in children_by_key.values(): - child_obj.update_project_value(value) + child_obj.update_project_value(value, log_invalid_types) return value_keys = set(value.keys()) @@ -648,7 +674,7 @@ class DictConditionalEntity(ItemEntity): expected_keys = set(self.non_gui_children[enum_value]) expected_keys.add(self.enum_key) unknown_keys = value_keys - expected_keys - if unknown_keys: + if unknown_keys and log_invalid_types: self.log.warning( "{} Unknown keys in project overrides: {}".format( self.path, @@ -656,12 +682,16 @@ class DictConditionalEntity(ItemEntity): ) ) - self.enum_entity.update_project_value(enum_value) - for children_by_key in self.non_gui_children.values(): + self.enum_entity.update_project_value(enum_value, log_invalid_types) + for enum_key, children_by_key in self.non_gui_children.items(): + _log_invalid_types = log_invalid_types + if _log_invalid_types: + _log_invalid_types = enum_key == enum_value + value_copy = copy.deepcopy(value) for key, child_obj in children_by_key.items(): child_value = value_copy.get(key, NOT_SET) - child_obj.update_project_value(child_value) + child_obj.update_project_value(child_value, _log_invalid_types) def _discard_changes(self, on_change_trigger): self._ignore_child_changes = True diff --git a/openpype/settings/entities/dict_immutable_keys_entity.py b/openpype/settings/entities/dict_immutable_keys_entity.py index 
060f8d522e..0209681e95 100644 --- a/openpype/settings/entities/dict_immutable_keys_entity.py +++ b/openpype/settings/entities/dict_immutable_keys_entity.py @@ -414,12 +414,16 @@ class DictImmutableKeysEntity(ItemEntity): return value, metadata - def update_default_value(self, value): + def update_default_value(self, value, log_invalid_types=True): """Update default values. Not an api method, should be called by parent. """ - value = self._check_update_value(value, "default") + + self._default_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "default", log_invalid_types + ) self.has_default_value = value is not NOT_SET # TODO add value validation value, metadata = self._prepare_value(value) @@ -427,13 +431,13 @@ class DictImmutableKeysEntity(ItemEntity): if value is NOT_SET: for child_obj in self.non_gui_children.values(): - child_obj.update_default_value(value) + child_obj.update_default_value(value, log_invalid_types) return value_keys = set(value.keys()) expected_keys = set(self.non_gui_children) unknown_keys = value_keys - expected_keys - if unknown_keys: + if unknown_keys and log_invalid_types: self.log.warning( "{} Unknown keys in default values: {}".format( self.path, @@ -443,27 +447,31 @@ class DictImmutableKeysEntity(ItemEntity): for key, child_obj in self.non_gui_children.items(): child_value = value.get(key, NOT_SET) - child_obj.update_default_value(child_value) + child_obj.update_default_value(child_value, log_invalid_types) - def update_studio_value(self, value): + def update_studio_value(self, value, log_invalid_types=True): """Update studio override values. Not an api method, should be called by parent. 
""" - value = self._check_update_value(value, "studio override") + + self._studio_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "studio override", log_invalid_types + ) value, metadata = self._prepare_value(value) self._studio_override_metadata = metadata self.had_studio_override = metadata is not NOT_SET if value is NOT_SET: for child_obj in self.non_gui_children.values(): - child_obj.update_studio_value(value) + child_obj.update_studio_value(value, log_invalid_types) return value_keys = set(value.keys()) expected_keys = set(self.non_gui_children) unknown_keys = value_keys - expected_keys - if unknown_keys: + if unknown_keys and log_invalid_types: self.log.warning( "{} Unknown keys in studio overrides: {}".format( self.path, @@ -472,27 +480,31 @@ class DictImmutableKeysEntity(ItemEntity): ) for key, child_obj in self.non_gui_children.items(): child_value = value.get(key, NOT_SET) - child_obj.update_studio_value(child_value) + child_obj.update_studio_value(child_value, log_invalid_types) - def update_project_value(self, value): + def update_project_value(self, value, log_invalid_types=True): """Update project override values. Not an api method, should be called by parent. 
""" - value = self._check_update_value(value, "project override") + + self._project_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "project override", log_invalid_types + ) value, metadata = self._prepare_value(value) self._project_override_metadata = metadata self.had_project_override = metadata is not NOT_SET if value is NOT_SET: for child_obj in self.non_gui_children.values(): - child_obj.update_project_value(value) + child_obj.update_project_value(value, log_invalid_types) return value_keys = set(value.keys()) expected_keys = set(self.non_gui_children) unknown_keys = value_keys - expected_keys - if unknown_keys: + if unknown_keys and log_invalid_types: self.log.warning( "{} Unknown keys in project overrides: {}".format( self.path, @@ -502,7 +514,7 @@ class DictImmutableKeysEntity(ItemEntity): for key, child_obj in self.non_gui_children.items(): child_value = value.get(key, NOT_SET) - child_obj.update_project_value(child_value) + child_obj.update_project_value(child_value, log_invalid_types) def _discard_changes(self, on_change_trigger): self._ignore_child_changes = True @@ -694,37 +706,48 @@ class RootsDictEntity(DictImmutableKeysEntity): self._metadata_are_modified = False self._current_metadata = {} - def update_default_value(self, value): + def update_default_value(self, value, log_invalid_types=True): """Update default values. Not an api method, should be called by parent. """ - value = self._check_update_value(value, "default") + + self._default_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "default", log_invalid_types + ) value, _ = self._prepare_value(value) self._default_value = value self._default_metadata = {} self.has_default_value = value is not NOT_SET - def update_studio_value(self, value): + def update_studio_value(self, value, log_invalid_types=True): """Update studio override values. Not an api method, should be called by parent. 
""" - value = self._check_update_value(value, "studio override") + + self._studio_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "studio override", log_invalid_types + ) value, _ = self._prepare_value(value) self._studio_value = value self._studio_override_metadata = {} self.had_studio_override = value is not NOT_SET - def update_project_value(self, value): + def update_project_value(self, value, log_invalid_types=True): """Update project override values. Not an api method, should be called by parent. """ - value = self._check_update_value(value, "project override") + self._project_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "project override", log_invalid_types + ) value, _metadata = self._prepare_value(value) self._project_value = value @@ -886,37 +909,48 @@ class SyncServerSites(DictImmutableKeysEntity): self._metadata_are_modified = False self._current_metadata = {} - def update_default_value(self, value): + def update_default_value(self, value, log_invalid_types=True): """Update default values. Not an api method, should be called by parent. """ - value = self._check_update_value(value, "default") + + self._default_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "default", log_invalid_types + ) value, _ = self._prepare_value(value) self._default_value = value self._default_metadata = {} self.has_default_value = value is not NOT_SET - def update_studio_value(self, value): + def update_studio_value(self, value, log_invalid_types=True): """Update studio override values. Not an api method, should be called by parent. 
""" - value = self._check_update_value(value, "studio override") + + self._studio_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "studio override", log_invalid_types + ) value, _ = self._prepare_value(value) self._studio_value = value self._studio_override_metadata = {} self.had_studio_override = value is not NOT_SET - def update_project_value(self, value): + def update_project_value(self, value, log_invalid_types=True): """Update project override values. Not an api method, should be called by parent. """ - value = self._check_update_value(value, "project override") + self._project_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "project override", log_invalid_types + ) value, _metadata = self._prepare_value(value) self._project_value = value diff --git a/openpype/settings/entities/dict_mutable_keys_entity.py b/openpype/settings/entities/dict_mutable_keys_entity.py index 6b9c0bc7ed..a0c93b97a7 100644 --- a/openpype/settings/entities/dict_mutable_keys_entity.py +++ b/openpype/settings/entities/dict_mutable_keys_entity.py @@ -393,11 +393,15 @@ class DictMutableKeysEntity(EndpointEntity): value = self.value_on_not_set using_values_from_state = False + log_invalid_types = True if state is OverrideState.PROJECT: + log_invalid_types = self._project_log_invalid_types using_values_from_state = using_project_overrides elif state is OverrideState.STUDIO: + log_invalid_types = self._studio_log_invalid_types using_values_from_state = using_studio_overrides elif state is OverrideState.DEFAULTS: + log_invalid_types = self._default_log_invalid_types using_values_from_state = using_default_values new_value = copy.deepcopy(value) @@ -437,11 +441,11 @@ class DictMutableKeysEntity(EndpointEntity): if not label: label = metadata_labels.get(new_key) - child_entity.update_default_value(_value) + child_entity.update_default_value(_value, log_invalid_types) if using_project_overrides: - 
child_entity.update_project_value(_value) + child_entity.update_project_value(_value, log_invalid_types) elif using_studio_overrides: - child_entity.update_studio_value(_value) + child_entity.update_studio_value(_value, log_invalid_types) if label: children_label_by_id[child_entity.id] = label @@ -598,8 +602,11 @@ class DictMutableKeysEntity(EndpointEntity): metadata[key] = value.pop(key) return value, metadata - def update_default_value(self, value): - value = self._check_update_value(value, "default") + def update_default_value(self, value, log_invalid_types=True): + self._default_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "default", log_invalid_types + ) has_default_value = value is not NOT_SET if has_default_value: for required_key in self.required_keys: @@ -611,15 +618,21 @@ class DictMutableKeysEntity(EndpointEntity): self._default_value = value self._default_metadata = metadata - def update_studio_value(self, value): - value = self._check_update_value(value, "studio override") + def update_studio_value(self, value, log_invalid_types=True): + self._studio_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "studio override", log_invalid_types + ) value, metadata = self._prepare_value(value) self._studio_override_value = value self._studio_override_metadata = metadata self.had_studio_override = value is not NOT_SET - def update_project_value(self, value): - value = self._check_update_value(value, "project override") + def update_project_value(self, value, log_invalid_types=True): + self._project_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "project override", log_invalid_types + ) value, metadata = self._prepare_value(value) self._project_override_value = value self._project_override_metadata = metadata @@ -686,9 +699,12 @@ class DictMutableKeysEntity(EndpointEntity): if not self._can_remove_from_project_override: return + log_invalid_types = True if 
self._has_studio_override: + log_invalid_types = self._studio_log_invalid_types value = self._studio_override_value elif self.has_default_value: + log_invalid_types = self._default_log_invalid_types value = self._default_value else: value = self.value_on_not_set @@ -709,9 +725,9 @@ class DictMutableKeysEntity(EndpointEntity): for _key, _value in new_value.items(): new_key = self._convert_to_regex_valid_key(_key) child_entity = self._add_key(new_key) - child_entity.update_default_value(_value) + child_entity.update_default_value(_value, log_invalid_types) if self._has_studio_override: - child_entity.update_studio_value(_value) + child_entity.update_studio_value(_value, log_invalid_types) label = metadata_labels.get(_key) if label: diff --git a/openpype/settings/entities/input_entities.py b/openpype/settings/entities/input_entities.py index 7512d7bfcc..3dcd238672 100644 --- a/openpype/settings/entities/input_entities.py +++ b/openpype/settings/entities/input_entities.py @@ -90,18 +90,27 @@ class EndpointEntity(ItemEntity): def require_restart(self): return self.has_unsaved_changes - def update_default_value(self, value): - value = self._check_update_value(value, "default") + def update_default_value(self, value, log_invalid_types=True): + self._default_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "default", log_invalid_types + ) self._default_value = value self.has_default_value = value is not NOT_SET - def update_studio_value(self, value): - value = self._check_update_value(value, "studio override") + def update_studio_value(self, value, log_invalid_types=True): + self._studio_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "studio override", log_invalid_types + ) self._studio_override_value = value self.had_studio_override = bool(value is not NOT_SET) - def update_project_value(self, value): - value = self._check_update_value(value, "project override") + def update_project_value(self, value, 
log_invalid_types=True): + self._project_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "project override", log_invalid_types + ) self._project_override_value = value self.had_project_override = bool(value is not NOT_SET) @@ -590,22 +599,26 @@ class RawJsonEntity(InputEntity): metadata[key] = value.pop(key) return value, metadata - def update_default_value(self, value): - value = self._check_update_value(value, "default") + def update_default_value(self, value, log_invalid_types=True): + value = self._check_update_value(value, "default", log_invalid_types) self.has_default_value = value is not NOT_SET value, metadata = self._prepare_value(value) self._default_value = value self.default_metadata = metadata - def update_studio_value(self, value): - value = self._check_update_value(value, "studio override") + def update_studio_value(self, value, log_invalid_types=True): + value = self._check_update_value( + value, "studio override", log_invalid_types + ) self.had_studio_override = value is not NOT_SET value, metadata = self._prepare_value(value) self._studio_override_value = value self.studio_override_metadata = metadata - def update_project_value(self, value): - value = self._check_update_value(value, "project override") + def update_project_value(self, value, log_invalid_types=True): + value = self._check_update_value( + value, "project override", log_invalid_types + ) self.had_project_override = value is not NOT_SET value, metadata = self._prepare_value(value) self._project_override_value = value diff --git a/openpype/settings/entities/item_entities.py b/openpype/settings/entities/item_entities.py index 9c6f428b97..3b756e4ede 100644 --- a/openpype/settings/entities/item_entities.py +++ b/openpype/settings/entities/item_entities.py @@ -173,14 +173,17 @@ class PathEntity(ItemEntity): self._ignore_missing_defaults = ignore_missing_defaults self.child_obj.set_override_state(state, ignore_missing_defaults) - def 
update_default_value(self, value): - self.child_obj.update_default_value(value) + def update_default_value(self, value, log_invalid_types=True): + self._default_log_invalid_types = log_invalid_types + self.child_obj.update_default_value(value, log_invalid_types) - def update_project_value(self, value): - self.child_obj.update_project_value(value) + def update_project_value(self, value, log_invalid_types=True): + self._project_log_invalid_types = log_invalid_types + self.child_obj.update_project_value(value, log_invalid_types) - def update_studio_value(self, value): - self.child_obj.update_studio_value(value) + def update_studio_value(self, value, log_invalid_types=True): + self._studio_log_invalid_types = log_invalid_types + self.child_obj.update_studio_value(value, log_invalid_types) def _discard_changes(self, *args, **kwargs): self.child_obj.discard_changes(*args, **kwargs) @@ -472,9 +475,9 @@ class ListStrictEntity(ItemEntity): self._has_project_override = False - def _check_update_value(self, value, value_type): + def _check_update_value(self, value, value_type, log_invalid_types=True): value = super(ListStrictEntity, self)._check_update_value( - value, value_type + value, value_type, log_invalid_types ) if value is NOT_SET: return value @@ -484,15 +487,16 @@ class ListStrictEntity(ItemEntity): if value_len == child_len: return value - self.log.warning( - ( - "{} Amount of strict list items in {} values is" - " not same as expected. Expected {} items. Got {} items. {}" - ).format( - self.path, value_type, - child_len, value_len, str(value) + if log_invalid_types: + self.log.warning( + ( + "{} Amount of strict list items in {} values is not same" + " as expected. Expected {} items. Got {} items. 
{}" + ).format( + self.path, value_type, + child_len, value_len, str(value) + ) ) - ) if value_len < child_len: # Fill missing values with NOT_SET @@ -504,36 +508,51 @@ class ListStrictEntity(ItemEntity): value.pop(child_len) return value - def update_default_value(self, value): - value = self._check_update_value(value, "default") + def update_default_value(self, value, log_invalid_types=True): + self._default_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "default", log_invalid_types + ) self.has_default_value = value is not NOT_SET if value is NOT_SET: for child_obj in self.children: - child_obj.update_default_value(value) + child_obj.update_default_value(value, log_invalid_types) else: for idx, item_value in enumerate(value): - self.children[idx].update_default_value(item_value) + self.children[idx].update_default_value( + item_value, log_invalid_types + ) - def update_studio_value(self, value): - value = self._check_update_value(value, "studio override") + def update_studio_value(self, value, log_invalid_types=True): + self._studio_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "studio override", log_invalid_types + ) if value is NOT_SET: for child_obj in self.children: - child_obj.update_studio_value(value) + child_obj.update_studio_value(value, log_invalid_types) else: for idx, item_value in enumerate(value): - self.children[idx].update_studio_value(item_value) + self.children[idx].update_studio_value( + item_value, log_invalid_types + ) - def update_project_value(self, value): - value = self._check_update_value(value, "project override") + def update_project_value(self, value, log_invalid_types=True): + self._project_log_invalid_types = log_invalid_types + value = self._check_update_value( + value, "project override", log_invalid_types + ) if value is NOT_SET: for child_obj in self.children: - child_obj.update_project_value(value) + child_obj.update_project_value(value, 
log_invalid_types) else: for idx, item_value in enumerate(value): - self.children[idx].update_project_value(item_value) + self.children[idx].update_project_value( + item_value, log_invalid_types + ) def reset_callbacks(self): super(ListStrictEntity, self).reset_callbacks() diff --git a/openpype/settings/entities/list_entity.py b/openpype/settings/entities/list_entity.py index 0268c208bb..5d6a64b3ea 100644 --- a/openpype/settings/entities/list_entity.py +++ b/openpype/settings/entities/list_entity.py @@ -325,16 +325,24 @@ class ListEntity(EndpointEntity): for item in value: child_obj = self._add_new_item() - child_obj.update_default_value(item) + child_obj.update_default_value( + item, self._default_log_invalid_types + ) if self._override_state is OverrideState.PROJECT: if self.had_project_override: - child_obj.update_project_value(item) + child_obj.update_project_value( + item, self._project_log_invalid_types + ) elif self.had_studio_override: - child_obj.update_studio_value(item) + child_obj.update_studio_value( + item, self._studio_log_invalid_types + ) elif self._override_state is OverrideState.STUDIO: if self.had_studio_override: - child_obj.update_studio_value(item) + child_obj.update_studio_value( + item, self._studio_log_invalid_types + ) for child_obj in self.children: child_obj.set_override_state( @@ -466,16 +474,24 @@ class ListEntity(EndpointEntity): for item in value: child_obj = self._add_new_item() - child_obj.update_default_value(item) + child_obj.update_default_value( + item, self._default_log_invalid_types + ) if self._override_state is OverrideState.PROJECT: if self.had_project_override: - child_obj.update_project_value(item) + child_obj.update_project_value( + item, self._project_log_invalid_types + ) elif self.had_studio_override: - child_obj.update_studio_value(item) + child_obj.update_studio_value( + item, self._studio_log_invalid_types + ) elif self._override_state is OverrideState.STUDIO: if self.had_studio_override: - 
child_obj.update_studio_value(item) + child_obj.update_studio_value( + item, self._studio_log_invalid_types + ) child_obj.set_override_state( self._override_state, self._ignore_missing_defaults diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_slack.json b/openpype/settings/entities/schemas/projects_schema/schema_project_slack.json index 14814d8b01..1a9804cd4f 100644 --- a/openpype/settings/entities/schemas/projects_schema/schema_project_slack.json +++ b/openpype/settings/entities/schemas/projects_schema/schema_project_slack.json @@ -75,6 +75,15 @@ "type": "list", "object_type": "text" }, + { + "type": "number", + "key": "review_upload_limit", + "label": "Upload review maximum file size (MB)", + "decimal": 2, + "default": 50, + "minimum": 0, + "maximum": 1000000 + }, { "type": "separator" }, diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_representation_tags.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_representation_tags.json index 7607e1a8c1..484fbf9d07 100644 --- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_representation_tags.json +++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_representation_tags.json @@ -24,6 +24,9 @@ }, { "sequence": "Output as image sequence" + }, + { + "no-audio": "Do not add audio" } ] } diff --git a/openpype/tools/context_dialog/window.py b/openpype/tools/context_dialog/window.py index c8464faa3e..9e030853bf 100644 --- a/openpype/tools/context_dialog/window.py +++ b/openpype/tools/context_dialog/window.py @@ -308,7 +308,6 @@ class ContextDialog(QtWidgets.QDialog): self._validate_strict() def _set_asset_to_tasks_widget(self): - # filter None docs they are silo asset_id = self._assets_widget.get_selected_asset_id() self._tasks_widget.set_asset_id(asset_id) diff --git a/openpype/tools/launcher/actions.py b/openpype/tools/launcher/actions.py index fbaef05261..546bda1c34 100644 --- 
a/openpype/tools/launcher/actions.py +++ b/openpype/tools/launcher/actions.py @@ -1,6 +1,7 @@ import os -from avalon import api +from Qt import QtWidgets, QtGui + from openpype import PLUGINS_DIR from openpype import style from openpype.api import Logger, resources @@ -8,7 +9,10 @@ from openpype.lib import ( ApplictionExecutableNotFound, ApplicationLaunchFailed ) -from Qt import QtWidgets, QtGui +from openpype.pipeline import ( + LauncherAction, + register_launcher_action_path, +) def register_actions_from_paths(paths): @@ -29,14 +33,15 @@ def register_actions_from_paths(paths): print("Path was not found: {}".format(path)) continue - api.register_plugin_path(api.Action, path) + register_launcher_action_path(path) def register_config_actions(): """Register actions from the configuration for Launcher""" actions_dir = os.path.join(PLUGINS_DIR, "actions") - register_actions_from_paths([actions_dir]) + if os.path.exists(actions_dir): + register_actions_from_paths([actions_dir]) def register_environment_actions(): @@ -46,7 +51,9 @@ def register_environment_actions(): register_actions_from_paths(paths_str.split(os.pathsep)) -class ApplicationAction(api.Action): +# TODO move to 'openpype.pipeline.actions' +# - remove Qt related stuff and implement exceptions to show error in launcher +class ApplicationAction(LauncherAction): """Pype's application launcher Application action based on pype's ApplicationManager system. 
@@ -74,7 +81,7 @@ class ApplicationAction(api.Action): @property def log(self): if self._log is None: - self._log = Logger().get_logger(self.__class__.__name__) + self._log = Logger.get_logger(self.__class__.__name__) return self._log def is_compatible(self, session): diff --git a/openpype/tools/launcher/lib.py b/openpype/tools/launcher/lib.py index 68c759f295..c1392b7b8f 100644 --- a/openpype/tools/launcher/lib.py +++ b/openpype/tools/launcher/lib.py @@ -1,19 +1,3 @@ -"""Utility script for updating database with configuration files - -Until assets are created entirely in the database, this script -provides a bridge between the file-based project inventory and configuration. - -- Migrating an old project: - $ python -m avalon.inventory --extract --silo-parent=f02_prod - $ python -m avalon.inventory --upload - -- Managing an existing project: - 1. Run `python -m avalon.inventory --load` - 2. Update the .inventory.toml or .config.toml - 3. Run `python -m avalon.inventory --save` - -""" - import os from Qt import QtGui import qtawesome diff --git a/openpype/tools/launcher/models.py b/openpype/tools/launcher/models.py index 85d553fca4..13567e7916 100644 --- a/openpype/tools/launcher/models.py +++ b/openpype/tools/launcher/models.py @@ -8,12 +8,13 @@ import time import appdirs from Qt import QtCore, QtGui import qtawesome -from avalon import api + from openpype.lib import JSONSettingRegistry from openpype.lib.applications import ( CUSTOM_LAUNCH_APP_GROUPS, ApplicationManager ) +from openpype.pipeline import discover_launcher_actions from openpype.tools.utils.lib import ( DynamicQThread, get_project_icon, @@ -68,7 +69,7 @@ class ActionModel(QtGui.QStandardItemModel): def discover(self): """Set up Actions cache. 
Run this for each new project.""" # Discover all registered actions - actions = api.discover(api.Action) + actions = discover_launcher_actions() # Get available project actions and the application actions app_actions = self.get_application_actions() diff --git a/openpype/tools/libraryloader/app.py b/openpype/tools/libraryloader/app.py index 9f8845f30f..b73b415128 100644 --- a/openpype/tools/libraryloader/app.py +++ b/openpype/tools/libraryloader/app.py @@ -9,14 +9,14 @@ from openpype.tools.loader.widgets import ( ThumbnailWidget, VersionWidget, FamilyListView, - RepresentationWidget + RepresentationWidget, + SubsetWidget ) from openpype.tools.utils.assets_widget import MultiSelectAssetsWidget from openpype.modules import ModulesManager from . import lib -from .widgets import LibrarySubsetWidget module = sys.modules[__name__] module.window = None @@ -92,7 +92,7 @@ class LibraryLoaderWindow(QtWidgets.QDialog): # --- Middle part --- # Subsets widget - subsets_widget = LibrarySubsetWidget( + subsets_widget = SubsetWidget( dbcon, self.groups_config, self.family_config_cache, @@ -448,10 +448,7 @@ class LibraryLoaderWindow(QtWidgets.QDialog): def _set_context(self, context, refresh=True): """Set the selection in the interface using a context. The context must contain `asset` data by name. - Note: Prior to setting context ensure `refresh` is triggered so that - the "silos" are listed correctly, aside from that setting the - context will force a refresh further down because it changes - the active silo and asset. + Args: context (dict): The context to apply. Returns: @@ -463,12 +460,6 @@ class LibraryLoaderWindow(QtWidgets.QDialog): return if refresh: - # Workaround: - # Force a direct (non-scheduled) refresh prior to setting the - # asset widget's silo and asset selection to ensure it's correctly - # displaying the silo tabs. 
Calling `window.refresh()` and directly - # `window.set_context()` the `set_context()` seems to override the - # scheduled refresh and the silo tabs are not shown. self._refresh_assets() self._assets_widget.select_asset_by_name(asset_name) diff --git a/openpype/tools/libraryloader/lib.py b/openpype/tools/libraryloader/lib.py index 6a497a6a16..182b48893a 100644 --- a/openpype/tools/libraryloader/lib.py +++ b/openpype/tools/libraryloader/lib.py @@ -1,7 +1,6 @@ import os import importlib import logging -from openpype.api import Anatomy log = logging.getLogger(__name__) @@ -20,14 +19,3 @@ def find_config(): log.info("Found %s, loading.." % config) return importlib.import_module(config) - - -class RegisteredRoots: - roots_per_project = {} - - @classmethod - def registered_root(cls, project_name): - if project_name not in cls.roots_per_project: - cls.roots_per_project[project_name] = Anatomy(project_name).roots - - return cls.roots_per_project[project_name] diff --git a/openpype/tools/libraryloader/widgets.py b/openpype/tools/libraryloader/widgets.py deleted file mode 100644 index 45f9ea2048..0000000000 --- a/openpype/tools/libraryloader/widgets.py +++ /dev/null @@ -1,18 +0,0 @@ -from Qt import QtWidgets - -from .lib import RegisteredRoots -from openpype.tools.loader.widgets import SubsetWidget - - -class LibrarySubsetWidget(SubsetWidget): - def on_copy_source(self): - """Copy formatted source path to clipboard""" - source = self.data.get("source", None) - if not source: - return - - project_name = self.dbcon.Session["AVALON_PROJECT"] - root = RegisteredRoots.registered_root(project_name) - path = source.format(root=root) - clipboard = QtWidgets.QApplication.clipboard() - clipboard.setText(path) diff --git a/openpype/tools/loader/app.py b/openpype/tools/loader/app.py index d73a977ac6..923a1fabdb 100644 --- a/openpype/tools/loader/app.py +++ b/openpype/tools/loader/app.py @@ -290,7 +290,6 @@ class LoaderWindow(QtWidgets.QDialog): subsets_model.clear() 
self.clear_assets_underlines() - # filter None docs they are silo asset_ids = self._assets_widget.get_selected_asset_ids() # Start loading subsets_widget.set_loading_state( @@ -381,17 +380,9 @@ class LoaderWindow(QtWidgets.QDialog): The context must contain `asset` data by name. - Note: Prior to setting context ensure `refresh` is triggered so that - the "silos" are listed correctly, aside from that setting the - context will force a refresh further down because it changes - the active silo and asset. - Args: context (dict): The context to apply. - - Returns: - None - + refresh (bool): Trigger refresh on context set. """ asset = context.get("asset", None) @@ -399,12 +390,6 @@ return if refresh: - # Workaround: - # Force a direct (non-scheduled) refresh prior to setting the - # asset widget's silo and asset selection to ensure it's correctly - # displaying the silo tabs. Calling `window.refresh()` and directly - # `window.set_context()` the `set_context()` seems to override the - # scheduled refresh and the silo tabs are not shown. self._refresh() self._assets_widget.select_asset_by_name(asset) diff --git a/openpype/tools/loader/widgets.py b/openpype/tools/loader/widgets.py index b14bdd0e93..42fb62b632 100644 --- a/openpype/tools/loader/widgets.py +++ b/openpype/tools/loader/widgets.py @@ -7,9 +7,9 @@ import collections from Qt import QtWidgets, QtCore, QtGui -from avalon import api, pipeline - +from openpype.api import Anatomy from openpype.pipeline import HeroVersionType +from openpype.pipeline.thumbnail import get_thumbnail_binary from openpype.pipeline.load import ( discover_loader_plugins, SubsetLoaderPlugin, @@ -640,6 +640,7 @@ class VersionTextEdit(QtWidgets.QTextEdit): "source": None, "raw": None } + self._anatomy = None # Reset self.set_version(None) @@ -730,20 +731,20 @@ class VersionTextEdit(QtWidgets.QTextEdit): # Add additional actions when any text so we can assume # the version is set. 
if self.toPlainText().strip(): - menu.addSeparator() - action = QtWidgets.QAction("Copy source path to clipboard", - menu) + action = QtWidgets.QAction( + "Copy source path to clipboard", menu + ) action.triggered.connect(self.on_copy_source) menu.addAction(action) - action = QtWidgets.QAction("Copy raw data to clipboard", - menu) + action = QtWidgets.QAction( + "Copy raw data to clipboard", menu + ) action.triggered.connect(self.on_copy_raw) menu.addAction(action) menu.exec_(event.globalPos()) - del menu def on_copy_source(self): """Copy formatted source path to clipboard""" @@ -751,7 +752,11 @@ class VersionTextEdit(QtWidgets.QTextEdit): if not source: return - path = source.format(root=api.registered_root()) + project_name = self.dbcon.Session["AVALON_PROJECT"] + if self._anatomy is None or self._anatomy.project_name != project_name: + self._anatomy = Anatomy(project_name) + + path = source.format(root=self._anatomy.roots) clipboard = QtWidgets.QApplication.clipboard() clipboard.setText(path) @@ -771,7 +776,6 @@ class VersionTextEdit(QtWidgets.QTextEdit): class ThumbnailWidget(QtWidgets.QLabel): - aspect_ratio = (16, 9) max_width = 300 @@ -863,7 +867,7 @@ class ThumbnailWidget(QtWidgets.QLabel): if not thumbnail_ent: return - thumbnail_bin = pipeline.get_thumbnail_binary( + thumbnail_bin = get_thumbnail_binary( thumbnail_ent, "thumbnail", self.dbcon ) if not thumbnail_bin: diff --git a/openpype/tools/mayalookassigner/commands.py b/openpype/tools/mayalookassigner/commands.py index df72e41354..78fd51c7a3 100644 --- a/openpype/tools/mayalookassigner/commands.py +++ b/openpype/tools/mayalookassigner/commands.py @@ -2,6 +2,7 @@ from collections import defaultdict import logging import os +from bson.objectid import ObjectId import maya.cmds as cmds from avalon import io, api @@ -157,7 +158,7 @@ def create_items_from_nodes(nodes): return asset_view_items for _id, id_nodes in id_hashes.items(): - asset = io.find_one({"_id": io.ObjectId(_id)}, + asset = 
io.find_one({"_id": ObjectId(_id)}, projection={"name": True}) # Skip if asset id is not found diff --git a/openpype/tools/mayalookassigner/vray_proxies.py b/openpype/tools/mayalookassigner/vray_proxies.py index 6a9347449a..25621fc652 100644 --- a/openpype/tools/mayalookassigner/vray_proxies.py +++ b/openpype/tools/mayalookassigner/vray_proxies.py @@ -6,6 +6,7 @@ import logging import json import six +from bson.objectid import ObjectId import alembic.Abc from maya import cmds @@ -231,7 +232,7 @@ def get_latest_version(asset_id, subset): """ subset = io.find_one({"name": subset, - "parent": io.ObjectId(asset_id), + "parent": ObjectId(asset_id), "type": "subset"}) if not subset: raise RuntimeError("Subset does not exist: %s" % subset) diff --git a/openpype/tools/sceneinventory/model.py b/openpype/tools/sceneinventory/model.py index 7173ae751e..091d6ca925 100644 --- a/openpype/tools/sceneinventory/model.py +++ b/openpype/tools/sceneinventory/model.py @@ -5,6 +5,7 @@ from collections import defaultdict from Qt import QtCore, QtGui import qtawesome +from bson.objectid import ObjectId from avalon import api, io, schema from openpype.pipeline import HeroVersionType @@ -299,7 +300,7 @@ class InventoryModel(TreeModel): for repre_id, group_dict in sorted(grouped.items()): group_items = group_dict["items"] # Get parenthood per group - representation = io.find_one({"_id": io.ObjectId(repre_id)}) + representation = io.find_one({"_id": ObjectId(repre_id)}) if not representation: not_found["representation"].append(group_items) not_found_ids.append(repre_id) diff --git a/openpype/tools/sceneinventory/switch_dialog.py b/openpype/tools/sceneinventory/switch_dialog.py index 0e7b1b759a..bb3e2615ac 100644 --- a/openpype/tools/sceneinventory/switch_dialog.py +++ b/openpype/tools/sceneinventory/switch_dialog.py @@ -2,12 +2,14 @@ import collections import logging from Qt import QtWidgets, QtCore import qtawesome +from bson.objectid import ObjectId -from avalon import io, pipeline -from 
openpype.pipeline import (
+from avalon import io
+from openpype.pipeline.load import (
     discover_loader_plugins,
     switch_container,
     get_repres_contexts,
+    loaders_from_repre_context,
 )

 from .widgets import (
@@ -146,7 +148,7 @@ class SwitchAssetDialog(QtWidgets.QDialog):
         repre_ids = set()
         content_loaders = set()
         for item in self._items:
-            repre_ids.add(io.ObjectId(item["representation"]))
+            repre_ids.add(ObjectId(item["representation"]))
             content_loaders.add(item["loader"])

         repres = list(io.find({
@@ -369,7 +371,7 @@ class SwitchAssetDialog(QtWidgets.QDialog):

         loaders = None
         for repre_context in repre_contexts.values():
-            _loaders = set(pipeline.loaders_from_repre_context(
+            _loaders = set(loaders_from_repre_context(
                 available_loaders, repre_context
             ))
             if loaders is None:
@@ -1306,7 +1308,7 @@ class SwitchAssetDialog(QtWidgets.QDialog):

                 repre_docs_by_parent_id_by_name[parent_id][name] = repre_doc

         for container in self._items:
-            container_repre_id = io.ObjectId(container["representation"])
+            container_repre_id = ObjectId(container["representation"])
             container_repre = self.content_repres[container_repre_id]
             container_repre_name = container_repre["name"]
diff --git a/openpype/tools/sceneinventory/view.py b/openpype/tools/sceneinventory/view.py
index c38390c614..2df6d00406 100644
--- a/openpype/tools/sceneinventory/view.py
+++ b/openpype/tools/sceneinventory/view.py
@@ -4,14 +4,16 @@ from functools import partial

 from Qt import QtWidgets, QtCore
 import qtawesome
+from bson.objectid import ObjectId

-from avalon import io, api
+from avalon import io

 from openpype import style
 from openpype.pipeline import (
     HeroVersionType,
     update_container,
     remove_container,
+    discover_inventory_actions,
 )
 from openpype.modules import ModulesManager
 from openpype.tools.utils.lib import (
@@ -78,7 +80,7 @@ class SceneInventoryView(QtWidgets.QTreeView):

         repre_ids = []
         for item in items:
-            item_id = io.ObjectId(item["representation"])
+            item_id = ObjectId(item["representation"])
             if item_id not in repre_ids:
                 repre_ids.append(item_id)
@@ -145,7 +147,7 @@ class SceneInventoryView(QtWidgets.QTreeView):

         def _on_switch_to_versioned(items):
             repre_ids = []
             for item in items:
-                item_id = io.ObjectId(item["representation"])
+                item_id = ObjectId(item["representation"])
                 if item_id not in repre_ids:
                     repre_ids.append(item_id)
@@ -195,7 +197,7 @@ class SceneInventoryView(QtWidgets.QTreeView):
                 version_doc["name"]

             for item in items:
-                repre_id = io.ObjectId(item["representation"])
+                repre_id = ObjectId(item["representation"])
                 version_id = version_id_by_repre_id.get(repre_id)
                 version_name = version_name_by_id.get(version_id)
                 if version_name is not None:
@@ -487,7 +489,7 @@ class SceneInventoryView(QtWidgets.QTreeView):
         containers = containers or [dict()]

         # Check which action will be available in the menu
-        Plugins = api.discover(api.InventoryAction)
+        Plugins = discover_inventory_actions()
         compatible = [p() for p in Plugins
                       if any(p.is_compatible(c) for c in containers)]
@@ -658,7 +660,7 @@ class SceneInventoryView(QtWidgets.QTreeView):
         active = items[-1]

         # Get available versions for active representation
-        representation_id = io.ObjectId(active["representation"])
+        representation_id = ObjectId(active["representation"])
         representation = io.find_one({"_id": representation_id})
         version = io.find_one({
             "_id": representation["parent"]
diff --git a/openpype/tools/settings/settings/wrapper_widgets.py b/openpype/tools/settings/settings/wrapper_widgets.py
index 7370fcf945..b14a226912 100644
--- a/openpype/tools/settings/settings/wrapper_widgets.py
+++ b/openpype/tools/settings/settings/wrapper_widgets.py
@@ -92,7 +92,8 @@ class CollapsibleWrapper(WrapperWidget):
         self.content_layout = content_layout

         if self.collapsible:
-            body_widget.toggle_content(self.collapsed)
+            if not self.collapsed:
+                body_widget.toggle_content()
         else:
             body_widget.hide_toolbox(hide_content=False)

diff --git a/openpype/tools/standalonepublish/widgets/model_asset.py b/openpype/tools/standalonepublish/widgets/model_asset.py
index a7316a2aa7..02e9073555 100644
--- a/openpype/tools/standalonepublish/widgets/model_asset.py
+++ b/openpype/tools/standalonepublish/widgets/model_asset.py
@@ -35,7 +35,7 @@ def _iter_model_rows(model,


 class AssetModel(TreeModel):
-    """A model listing assets in the silo in the active project.
+    """A model listing assets in the active project.

     The assets are displayed in a treeview, they are visually parented by
     a `visualParent` field in the database containing an `_id` to a parent
@@ -64,7 +64,7 @@ class AssetModel(TreeModel):

         self.refresh()

-    def _add_hierarchy(self, assets, parent=None, silos=None):
+    def _add_hierarchy(self, assets, parent=None):
         """Add the assets that are related to the parent as children items.

         This method does *not* query the database. These instead are queried
@@ -72,27 +72,8 @@ class AssetModel(TreeModel):
         queries. Resulting in up to 10x speed increase.

         Args:
-            assets (dict): All assets in the currently active silo stored
-                by key/value
-
-        Returns:
-            None
-
+            assets (dict): All assets from current project.
         """
-        if silos:
-            # WARNING: Silo item "_id" is set to silo value
-            # mainly because GUI issue with preserve selection and expanded row
-            # and because of easier hierarchy parenting (in "assets")
-            for silo in silos:
-                node = Node({
-                    "_id": silo,
-                    "name": silo,
-                    "label": silo,
-                    "type": "silo"
-                })
-                self.add_child(node, parent=parent)
-                self._add_hierarchy(assets, parent=node)
-
         parent_id = parent["_id"] if parent else None
         current_assets = assets.get(parent_id, list())
@@ -132,27 +113,19 @@ class AssetModel(TreeModel):

         self.beginResetModel()

-        # Get all assets in current silo sorted by name
+        # Get all assets in current project sorted by name
         db_assets = self.dbcon.find({"type": "asset"}).sort("name", 1)
-        silos = db_assets.distinct("silo") or None
-        # if any silo is set to None then it's expected it should not be used
-        if silos and None in silos:
-            silos = None

         # Group the assets by their visual parent's id
         assets_by_parent = collections.defaultdict(list)
         for asset in db_assets:
-            parent_id = (
-                asset.get("data", {}).get("visualParent") or
-                asset.get("silo")
-            )
+            parent_id = asset.get("data", {}).get("visualParent")
             assets_by_parent[parent_id].append(asset)

         # Build the hierarchical tree items recursively
         self._add_hierarchy(
             assets_by_parent,
-            parent=None,
-            silos=silos
+            parent=None
         )

         self.endResetModel()
@@ -174,8 +147,6 @@ class AssetModel(TreeModel):
         # Allow a custom icon and custom icon color to be defined
         data = node.get("_document", {}).get("data", {})
         icon = data.get("icon", None)
-        if icon is None and node.get("type") == "silo":
-            icon = "database"
         color = data.get("color", self._default_asset_icon_color)

         if icon is None:
diff --git a/openpype/tools/standalonepublish/widgets/widget_asset.py b/openpype/tools/standalonepublish/widgets/widget_asset.py
index e6b74f8f82..8b43cd7cf8 100644
--- a/openpype/tools/standalonepublish/widgets/widget_asset.py
+++ b/openpype/tools/standalonepublish/widgets/widget_asset.py
@@ -229,7 +229,6 @@ class AssetWidget(QtWidgets.QWidget):
         data = {
             'project': project['name'],
             'asset': asset['name'],
-            'silo': asset.get("silo"),
             'parents': self.get_parents(asset),
             'task': task
         }
diff --git a/openpype/tools/texture_copy/app.py b/openpype/tools/texture_copy/app.py
index ceca98a082..0c3c260e51 100644
--- a/openpype/tools/texture_copy/app.py
+++ b/openpype/tools/texture_copy/app.py
@@ -57,7 +57,6 @@ class TextureCopy:
                 "name": project_name,
                 "code": project['data']['code']
             },
-            "silo": asset.get('silo'),
             "asset": asset['name'],
             "family": 'texture',
             "subset": 'Main',
@@ -155,7 +154,6 @@ def texture_copy(asset, project, path):
     t.echo(">>> Initializing avalon session ...")
     os.environ["AVALON_PROJECT"] = project
     os.environ["AVALON_ASSET"] = asset
-    os.environ["AVALON_SILO"] = ""

     TextureCopy().process(asset, project, path)
diff --git a/openpype/tools/workfiles/app.py b/openpype/tools/workfiles/app.py
index ccf80ee98b..f0e7900cf5 100644
--- a/openpype/tools/workfiles/app.py
+++ b/openpype/tools/workfiles/app.py
@@ -61,7 +61,6 @@ def show(root=None, debug=False, parent=None, use_context=True, save=True):
     if use_context:
         context = {
             "asset": api.Session["AVALON_ASSET"],
-            "silo": api.Session["AVALON_SILO"],
             "task": api.Session["AVALON_TASK"]
         }
         window.set_context(context)
diff --git a/openpype/version.py b/openpype/version.py
index 1ef25e3f48..6d55672aca 100644
--- a/openpype/version.py
+++ b/openpype/version.py
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-
 """Package declaring Pype version."""
-__version__ = "3.9.1"
+__version__ = "3.9.2-nightly.3"
diff --git a/openpype/widgets/attribute_defs/widgets.py b/openpype/widgets/attribute_defs/widgets.py
index a6f1b8d6c9..23f025967d 100644
--- a/openpype/widgets/attribute_defs/widgets.py
+++ b/openpype/widgets/attribute_defs/widgets.py
@@ -2,7 +2,7 @@ import uuid

 from Qt import QtWidgets, QtCore

-from openpype.pipeline.lib import (
+from openpype.lib.attribute_definitions import (
     AbtractAttrDef,
     UnknownDef,
     NumberDef,
diff --git a/poetry.lock b/poetry.lock
index ee7b839b8d..ed2b0dd3c2 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -11,7 +11,7 @@ develop = false
 type = "git"
 url = "https://github.com/pypeclub/acre.git"
 reference = "master"
-resolved_reference = "55a7c331e6dc5f81639af50ca4a8cc9d73e9273d"
+resolved_reference = "126f7a188cfe36718f707f42ebbc597e86aa86c3"

 [[package]]
 name = "aiohttp"
diff --git a/pyproject.toml b/pyproject.toml
index 7c09495a99..479cd731fe 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "OpenPype"
-version = "3.9.1" # OpenPype
+version = "3.9.2-nightly.3" # OpenPype
 description = "Open VFX and Animation pipeline with support."
 authors = ["OpenPype Team "]
 license = "MIT License"
diff --git a/repos/avalon-core b/repos/avalon-core
index 64491fbbcf..2fa14cea6f 160000
--- a/repos/avalon-core
+++ b/repos/avalon-core
@@ -1 +1 @@
-Subproject commit 64491fbbcf89ba2a0b3a20d67d7486c6142232b3
+Subproject commit 2fa14cea6f6a9d86eec70bbb96860cbe4c75c8eb
diff --git a/website/docs/dev_requirements.md b/website/docs/dev_requirements.md
index bbf3b1fb5b..6c87054ba0 100644
--- a/website/docs/dev_requirements.md
+++ b/website/docs/dev_requirements.md
@@ -33,6 +33,8 @@ It can be built and ran on all common platforms. We develop and test on the foll

 ## Database

+Database version should be at least **MongoDB 4.4**.
+
 Pype needs site-wide installation of **MongoDB**. It should be installed on reliable server, that all workstations (and possibly render nodes) can connect. This server holds **Avalon** database that is at the core of everything