Merge remote-tracking branch 'upstream/develop' into fusion_integration_v2

Roy Nieterau 2022-03-29 19:43:02 +02:00
commit 9bb5626224
187 changed files with 5438 additions and 2572 deletions


@ -1,19 +1,78 @@
# Changelog
## [3.9.1-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.9.2-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.0...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.1...HEAD)
### 📖 Documentation
- Docs: Added MongoDB requirements [\#2951](https://github.com/pypeclub/OpenPype/pull/2951)
**🚀 Enhancements**
- Slack: Added configurable maximum file size of review upload to Slack [\#2945](https://github.com/pypeclub/OpenPype/pull/2945)
- NewPublisher: Prepared implementation of optional pyblish plugin [\#2943](https://github.com/pypeclub/OpenPype/pull/2943)
- Workfiles: Open published workfiles [\#2925](https://github.com/pypeclub/OpenPype/pull/2925)
- CI: change the version bump logic [\#2919](https://github.com/pypeclub/OpenPype/pull/2919)
- Deadline: Add headless argument [\#2916](https://github.com/pypeclub/OpenPype/pull/2916)
- Nuke: Add no-audio Tag [\#2911](https://github.com/pypeclub/OpenPype/pull/2911)
- Ftrack: Fill workfile in custom attribute [\#2906](https://github.com/pypeclub/OpenPype/pull/2906)
- Nuke: improving readability [\#2903](https://github.com/pypeclub/OpenPype/pull/2903)
- Settings UI: Add simple tooltips for settings entities [\#2901](https://github.com/pypeclub/OpenPype/pull/2901)
**🐛 Bug fixes**
- LogViewer: Don't refresh on initialization [\#2949](https://github.com/pypeclub/OpenPype/pull/2949)
- General: anatomy data with correct task short key [\#2947](https://github.com/pypeclub/OpenPype/pull/2947)
- SceneInventory: Fix imports in UI [\#2944](https://github.com/pypeclub/OpenPype/pull/2944)
- Slack: add generic exception [\#2941](https://github.com/pypeclub/OpenPype/pull/2941)
- General: Python specific vendor paths on env injection [\#2939](https://github.com/pypeclub/OpenPype/pull/2939)
- General: More fail safe delete old versions [\#2936](https://github.com/pypeclub/OpenPype/pull/2936)
- Settings UI: Collapsed of collapsible wrapper works as expected [\#2934](https://github.com/pypeclub/OpenPype/pull/2934)
- General: Don't print log record on OSError [\#2926](https://github.com/pypeclub/OpenPype/pull/2926)
- Hiero: Fix import of 'register\_event\_callback' [\#2924](https://github.com/pypeclub/OpenPype/pull/2924)
- Ftrack: Missing Ftrack id after editorial publish [\#2905](https://github.com/pypeclub/OpenPype/pull/2905)
**🔀 Refactored code**
- General: Move Attribute Definitions from pipeline [\#2931](https://github.com/pypeclub/OpenPype/pull/2931)
- General: Removed silo references and terminal splash [\#2927](https://github.com/pypeclub/OpenPype/pull/2927)
- General: Move pipeline constants to OpenPype [\#2918](https://github.com/pypeclub/OpenPype/pull/2918)
- General: Move formatting and workfile functions [\#2914](https://github.com/pypeclub/OpenPype/pull/2914)
- General: Move remaining plugins from avalon [\#2912](https://github.com/pypeclub/OpenPype/pull/2912)
**Merged pull requests:**
- Maya: Do not pass `set` to maya commands \(fixes support for older maya versions\) [\#2932](https://github.com/pypeclub/OpenPype/pull/2932)
## [3.9.1](https://github.com/pypeclub/OpenPype/tree/3.9.1) (2022-03-18)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.1-nightly.3...3.9.1)
**🚀 Enhancements**
- General: Change how OPENPYPE\_DEBUG value is handled [\#2907](https://github.com/pypeclub/OpenPype/pull/2907)
- nuke: imageio adding ocio config version 1.2 [\#2897](https://github.com/pypeclub/OpenPype/pull/2897)
- Flame: support for comment with xml attribute overrides [\#2892](https://github.com/pypeclub/OpenPype/pull/2892)
- Nuke: ExtractReviewSlate can handle more codes and profiles [\#2879](https://github.com/pypeclub/OpenPype/pull/2879)
- Flame: sequence used for reference video [\#2869](https://github.com/pypeclub/OpenPype/pull/2869)
**🐛 Bug fixes**
- General: Fix use of Anatomy roots [\#2904](https://github.com/pypeclub/OpenPype/pull/2904)
- Fixing gap detection in extract review [\#2902](https://github.com/pypeclub/OpenPype/pull/2902)
- Pyblish Pype - ensure current state is correct when entering new group order [\#2899](https://github.com/pypeclub/OpenPype/pull/2899)
- SceneInventory: Fix import of load function [\#2894](https://github.com/pypeclub/OpenPype/pull/2894)
- Harmony - fixed creator issue [\#2891](https://github.com/pypeclub/OpenPype/pull/2891)
- General: Remove forgotten use of avalon Creator [\#2885](https://github.com/pypeclub/OpenPype/pull/2885)
- General: Avoid circular import [\#2884](https://github.com/pypeclub/OpenPype/pull/2884)
- Fixes for attaching loaded containers \(\#2837\) [\#2874](https://github.com/pypeclub/OpenPype/pull/2874)
- Maya: Deformer node ids validation plugin [\#2826](https://github.com/pypeclub/OpenPype/pull/2826)
**🔀 Refactored code**
- General: Reduce style usage to OpenPype repository [\#2889](https://github.com/pypeclub/OpenPype/pull/2889)
- General: Move loader logic from avalon to openpype [\#2886](https://github.com/pypeclub/OpenPype/pull/2886)
## [3.9.0](https://github.com/pypeclub/OpenPype/tree/3.9.0) (2022-03-14)
@ -23,6 +82,10 @@
- AssetCreator: Remove the tool [\#2845](https://github.com/pypeclub/OpenPype/pull/2845)
### 📖 Documentation
- Documentation: Change Photoshop & AfterEffects plugin path [\#2878](https://github.com/pypeclub/OpenPype/pull/2878)
**🚀 Enhancements**
- General: Subset name filtering in ExtractReview outputs [\#2872](https://github.com/pypeclub/OpenPype/pull/2872)
@ -36,11 +99,11 @@
- General: Color dialog UI fixes [\#2817](https://github.com/pypeclub/OpenPype/pull/2817)
- global: letter box calculated on output as last process [\#2812](https://github.com/pypeclub/OpenPype/pull/2812)
- Nuke: adding Reformat to baking mov plugin [\#2811](https://github.com/pypeclub/OpenPype/pull/2811)
- Manager: Update all to latest button [\#2805](https://github.com/pypeclub/OpenPype/pull/2805)
**🐛 Bug fixes**
- General: Missing time function [\#2877](https://github.com/pypeclub/OpenPype/pull/2877)
- AfterEffects: Fix rendering for single frame in DL [\#2875](https://github.com/pypeclub/OpenPype/pull/2875)
- Deadline: Fix plugin name for tile assemble [\#2868](https://github.com/pypeclub/OpenPype/pull/2868)
- Nuke: gizmo precollect fix [\#2866](https://github.com/pypeclub/OpenPype/pull/2866)
- General: Fix hardlink for windows [\#2864](https://github.com/pypeclub/OpenPype/pull/2864)
@ -74,7 +137,6 @@
- General: Move change context functions [\#2839](https://github.com/pypeclub/OpenPype/pull/2839)
- Tools: Don't use avalon tools code [\#2829](https://github.com/pypeclub/OpenPype/pull/2829)
- Move Unreal Implementation to OpenPype [\#2823](https://github.com/pypeclub/OpenPype/pull/2823)
- General: Extract template formatting from anatomy [\#2766](https://github.com/pypeclub/OpenPype/pull/2766)
## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)


@ -78,6 +78,7 @@ def install():
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
register_inventory_action,
)
from avalon import pipeline
@ -124,7 +125,7 @@ def install():
pyblish.register_plugin_path(path)
register_loader_plugin_path(path)
avalon.register_plugin_path(LegacyCreator, path)
avalon.register_plugin_path(avalon.InventoryAction, path)
register_inventory_action(path)
# apply monkey patched discover to original one
log.info("Patching discovery")


@ -101,7 +101,7 @@ def eventserver(debug,
on linux and window service).
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = "3"
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands().launch_eventservercli(
ftrack_url,
@ -128,7 +128,7 @@ def webpublisherwebserver(debug, executable, upload_dir, host=None, port=None):
Expect "pype.club" user created on Ftrack.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = "3"
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands().launch_webpublisher_webservercli(
upload_dir=upload_dir,
@ -176,7 +176,7 @@ def publish(debug, paths, targets, gui):
More than one path is allowed.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands.publish(list(paths), targets, gui)
@ -195,7 +195,7 @@ def remotepublishfromapp(debug, project, path, host, user=None, targets=None):
More than one path is allowed.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands.remotepublishfromapp(
project, path, host, user, targets=targets
)
@ -215,7 +215,7 @@ def remotepublish(debug, project, path, user=None, targets=None):
More than one path is allowed.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands.remotepublish(project, path, user, targets=targets)
@ -240,7 +240,7 @@ def texturecopy(debug, project, asset, path):
Nothing is written to database.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands().texture_copy(project, asset, path)
@ -409,7 +409,7 @@ def syncserver(debug, active_site):
var OPENPYPE_LOCAL_ID set to 'active_site'.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands().syncserver(active_site)

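Every hunk above normalizes the debug flag from the single-quoted `'3'` to the double-quoted `"1"`, matching PR #2907 in the changelog ("Change how OPENPYPE\_DEBUG value is handled"). A minimal sketch of how such a flag might be consumed — the helper name and the truthiness convention are assumptions for illustration, not OpenPype's actual API:

```python
import os


def is_debug_enabled(env=None):
    """Return True when OPENPYPE_DEBUG holds a truthy value.

    Hypothetical helper: the commands above simply set the variable to "1",
    and downstream code is assumed to treat any non-empty, non-"0" value as
    enabled.
    """
    if env is None:
        env = os.environ
    value = env.get("OPENPYPE_DEBUG", "").strip()
    return value not in ("", "0")


# The pattern repeated in every CLI command above, in isolation:
env = {}
env["OPENPYPE_DEBUG"] = "1"
```

Standardizing on `"1"` keeps the variable a simple on/off switch instead of a magic level number.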

@ -1,35 +0,0 @@
import os
from openpype.lib import PreLaunchHook
class PrePython2Vendor(PreLaunchHook):
"""Prepend python 2 dependencies for py2 hosts."""
order = 10
def execute(self):
if not self.application.use_python_2:
return
# Prepare vendor dir path
self.log.info("adding global python 2 vendor")
pype_root = os.getenv("OPENPYPE_REPOS_ROOT")
python_2_vendor = os.path.join(
pype_root,
"openpype",
"vendor",
"python",
"python_2"
)
# Add Python 2 modules
python_paths = [
python_2_vendor
]
# Load PYTHONPATH from current launch context
python_path = self.launch_context.env.get("PYTHONPATH")
if python_path:
python_paths.append(python_path)
# Set new PYTHONPATH to launch context environments
self.launch_context.env["PYTHONPATH"] = os.pathsep.join(python_paths)

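The deleted hook above prepends a Python 2 vendor directory to `PYTHONPATH` in the launch environment. The core pattern it used, sketched standalone (the paths here are illustrative):

```python
import os


def prepend_pythonpath(env, *new_paths):
    """Prepend paths to env["PYTHONPATH"], keeping any existing value last."""
    paths = list(new_paths)
    existing = env.get("PYTHONPATH")
    if existing:
        paths.append(existing)
    # New entries come first so vendored modules win import resolution.
    env["PYTHONPATH"] = os.pathsep.join(paths)


launch_env = {"PYTHONPATH": "/studio/site"}
prepend_pythonpath(launch_env, "/repos/openpype/vendor/python/python_2")
```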

@ -2,10 +2,11 @@ import os
import sys
from Qt import QtWidgets
from bson.objectid import ObjectId
import pyblish.api
import avalon.api
from avalon import io, pipeline
from avalon import io
from openpype import lib
from openpype.api import Logger
@ -13,6 +14,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.aftereffects
from openpype.lib import register_event_callback
@ -29,7 +31,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
def check_inventory():
@ -42,7 +43,7 @@ def check_inventory():
representation = container['representation']
representation_doc = io.find_one(
{
"_id": io.ObjectId(representation),
"_id": ObjectId(representation),
"type": "representation"
},
projection={"parent": True}
@ -149,7 +150,7 @@ def containerise(name,
"""
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": name,
"namespace": namespace,
"loader": str(loader),

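The `containerise` hunk above swaps `pipeline.AVALON_CONTAINER_ID` for the constant now exported from `openpype.pipeline`. A sketch of the metadata dict it writes — the constant's value here is an assumption carried over from avalon-core, not confirmed by this diff:

```python
# Assumed value; in this codebase the constant is imported from
# openpype.pipeline rather than defined inline.
AVALON_CONTAINER_ID = "pyblish.avalon.container"


def build_container_data(name, namespace, loader, representation):
    """Sketch of the container metadata dict produced by containerise()."""
    return {
        "schema": "openpype:container-2.0",
        "id": AVALON_CONTAINER_ID,
        "name": name,
        "namespace": namespace,
        "loader": str(loader),
        "representation": representation,
    }


data = build_container_data("renderMain", "ns01", "FileLoader", "62a1f0")
```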

@ -1,8 +1,8 @@
"""Host API required Work Files tool"""
import os
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .launch_logic import get_stub
from avalon import api
def _active_document():
@ -14,7 +14,7 @@ def _active_document():
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["aftereffects"]
return HOST_WORKFILE_EXTENSIONS["aftereffects"]
def has_unsaved_changes():


@ -328,7 +328,6 @@ class LaunchWorkFiles(LaunchQtApp):
result = super().execute(context)
self._window.set_context({
"asset": avalon.api.Session["AVALON_ASSET"],
"silo": avalon.api.Session["AVALON_SILO"],
"task": avalon.api.Session["AVALON_TASK"]
})
return result


@ -12,12 +12,12 @@ from . import ops
import pyblish.api
import avalon.api
from avalon import io, schema
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from openpype.api import Logger
from openpype.lib import (
@ -31,7 +31,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
ORIGINAL_EXCEPTHOOK = sys.excepthook


@ -4,7 +4,8 @@ from pathlib import Path
from typing import List, Optional
import bpy
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
class OpenFileCacher:
@ -77,7 +78,7 @@ def has_unsaved_changes() -> bool:
def file_extensions() -> List[str]:
"""Return the supported file extensions for Blender scene files."""
return api.HOST_WORKFILE_EXTENSIONS["blender"]
return HOST_WORKFILE_EXTENSIONS["blender"]
def work_root(session: dict) -> str:


@ -6,11 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)
from openpype.hosts.blender.api import plugin, lib


@ -6,12 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)


@ -7,12 +7,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)
logger = logging.getLogger("openpype").getChild(


@ -6,12 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin, lib
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)


@ -6,12 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin, lib
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)


@ -10,12 +10,12 @@ from openpype import lib
from openpype.pipeline import (
legacy_create,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)


@ -13,12 +13,12 @@ from openpype.pipeline import (
load_container,
get_representation_path,
loaders_from_representation,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api.pipeline import (
AVALON_INSTANCES,
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)
from openpype.hosts.blender.api import plugin


@ -6,12 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)


@ -10,6 +10,7 @@ from openpype import lib
from openpype.pipeline import (
legacy_create,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import (
plugin,
@ -18,7 +19,6 @@ from openpype.hosts.blender.api import (
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)


@ -1,6 +1,8 @@
import os
import json
from bson.objectid import ObjectId
import bpy
import bpy_extras
import bpy_extras.anim_utils
@ -140,7 +142,7 @@ class ExtractLayout(openpype.api.Extractor):
blend = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(parent),
"parent": ObjectId(parent),
"name": "blend"
},
projection={"_id": True})
@ -151,7 +153,7 @@ class ExtractLayout(openpype.api.Extractor):
fbx = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(parent),
"parent": ObjectId(parent),
"name": "fbx"
},
projection={"_id": True})
@ -162,7 +164,7 @@ class ExtractLayout(openpype.api.Extractor):
abc = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(parent),
"parent": ObjectId(parent),
"name": "abc"
},
projection={"_id": True})

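The hunks above replace `io.ObjectId(parent)` with `bson.objectid.ObjectId` imported directly. A sketch of the query construction, with the id factory injectable so the example carries no MongoDB dependency (in the real code the stored string id is wrapped with `ObjectId`):

```python
def representation_query(parent_id, name, id_factory=str):
    """Build the find_one filter used in the hunks above (sketch).

    id_factory stands in for bson.objectid.ObjectId so this example runs
    without pymongo installed.
    """
    return {
        "type": "representation",
        "parent": id_factory(parent_id),
        "name": name,
    }


query = representation_query("624a0b2f9c1e", "blend")
```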

@ -4,13 +4,14 @@ Basic avalon integration
import os
import contextlib
from avalon import api as avalon
from avalon.pipeline import AVALON_CONTAINER_ID
from pyblish import api as pyblish
from openpype.api import Logger
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from .lib import (
set_segment_data_marker,
@ -26,7 +27,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
AVALON_CONTAINERS = "AVALON_CONTAINERS"
@ -34,12 +34,10 @@ log = Logger.get_logger(__name__)
def install():
pyblish.register_host("flame")
pyblish.register_plugin_path(PUBLISH_PATH)
register_loader_plugin_path(LOAD_PATH)
avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
log.info("OpenPype Flame plug-ins registered ...")
# register callback for switching publishable
@ -47,6 +45,7 @@ def install():
log.info("OpenPype Flame host installed ...")
def uninstall():
pyblish.deregister_host("flame")
@ -54,7 +53,6 @@ def uninstall():
pyblish.deregister_plugin_path(PUBLISH_PATH)
deregister_loader_plugin_path(LOAD_PATH)
avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
# register callback for switching publishable
pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)


@ -1,3 +1,4 @@
import re
import pyblish
import openpype
import openpype.hosts.flame.api as opfapi
@ -6,6 +7,10 @@ from openpype.hosts.flame.otio import flame_export
# # developer reload modules
from pprint import pformat
# constants
NUM_PATERN = re.compile(r"([0-9\.]+)")
TXT_PATERN = re.compile(r"([a-zA-Z]+)")
class CollectTimelineInstances(pyblish.api.ContextPlugin):
"""Collect all Timeline segment selection."""
@ -16,6 +21,16 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
audio_track_items = []
# TODO: add to settings
# settings
xml_preset_attrs_from_comments = {
"width": "number",
"height": "number",
"pixelRatio": "float",
"resizeType": "string",
"resizeFilter": "string"
}
def process(self, context):
project = context.data["flameProject"]
sequence = context.data["flameSequence"]
@ -26,6 +41,10 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
# process all selected
with opfapi.maintained_segment_selection(sequence) as segments:
for segment in segments:
comment_attributes = self._get_comment_attributes(segment)
self.log.debug("_ comment_attributes: {}".format(
pformat(comment_attributes)))
clip_data = opfapi.get_segment_attributes(segment)
clip_name = clip_data["segment_name"]
self.log.debug("clip_name: {}".format(clip_name))
@ -101,6 +120,9 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
# add resolution
self._get_resolution_to_data(inst_data, context)
# add comment attributes if any
inst_data.update(comment_attributes)
# create instance
instance = context.create_instance(**inst_data)
@ -126,6 +148,94 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
if marker_data.get("reviewTrack") is not None:
instance.data["reviewAudio"] = True
def _get_comment_attributes(self, segment):
comment = segment.comment.get_value()
# try to find attributes
attributes = {
"xml_overrides": {
"pixelRatio": 1.00}
}
# search for `:`
for split in self._split_comments(comment):
# make sure we ignore if not `:` in key
if ":" not in split:
continue
self._get_xml_preset_attrs(
attributes, split)
# add xml overrides resolution to instance data
xml_overrides = attributes["xml_overrides"]
if xml_overrides.get("width"):
attributes.update({
"resolutionWidth": xml_overrides["width"],
"resolutionHeight": xml_overrides["height"],
"pixelAspect": xml_overrides["pixelRatio"]
})
return attributes
def _get_xml_preset_attrs(self, attributes, split):
# split to key and value
key, value = split.split(":")
for a_name, a_type in self.xml_preset_attrs_from_comments.items():
# exclude all not related attributes
if a_name.lower() not in key.lower():
continue
# get pattern defined by type
pattern = TXT_PATERN
if a_type in ("number", "float"):
pattern = NUM_PATERN
res_goup = pattern.findall(value)
# raise if nothing is found as it is not correctly defined
if not res_goup:
raise ValueError((
"Value for `{}` attribute is not "
"set correctly: `{}`").format(a_name, split))
if "string" in a_type:
_value = res_goup[0]
if "float" in a_type:
_value = float(res_goup[0])
if "number" in a_type:
_value = int(res_goup[0])
attributes["xml_overrides"][a_name] = _value
# condition for resolution in key
if "resolution" in key.lower():
res_goup = NUM_PATERN.findall(value)
# check if aspect was also defined
# 1920x1080x1.5
aspect = res_goup[2] if len(res_goup) > 2 else 1
width = int(res_goup[0])
height = int(res_goup[1])
pixel_ratio = float(aspect)
attributes["xml_overrides"].update({
"width": width,
"height": height,
"pixelRatio": pixel_ratio
})
def _split_comments(self, comment_string):
# first split comment by comma
split_comments = []
if "," in comment_string:
split_comments.extend(comment_string.split(","))
elif ";" in comment_string:
split_comments.extend(comment_string.split(";"))
else:
split_comments.append(comment_string)
return split_comments
def _get_head_tail(self, clip_data, first_frame):
# calculate head and tail with forward compatibility
head = clip_data.get("segment_head")

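The `_get_comment_attributes` branch above parses a segment comment like `resolution: 1920x1080x1.5` into width, height, and pixel aspect. A simplified, self-contained variant of just that `resolution` branch (not the full plugin logic, and it assumes a well-formed comment with at least two numbers):

```python
import re

NUM_PATTERN = re.compile(r"([0-9\.]+)")


def parse_resolution_comment(comment):
    """Parse a comment such as "resolution: 1920x1080x1.5" (sketch).

    Mirrors the resolution branch above: the third numeric group is an
    optional pixel-aspect component, defaulting to 1 when absent.
    """
    key, _, value = comment.partition(":")
    if "resolution" not in key.lower():
        return {}
    groups = NUM_PATTERN.findall(value)
    aspect = groups[2] if len(groups) > 2 else 1
    return {
        "width": int(groups[0]),
        "height": int(groups[1]),
        "pixelRatio": float(aspect),
    }


parse_resolution_comment("resolution: 1920x1080x1.5")
# {"width": 1920, "height": 1080, "pixelRatio": 1.5}
```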

@ -1,6 +1,7 @@
import os
from pprint import pformat
from copy import deepcopy
import pyblish.api
import openpype.api
from openpype.hosts.flame import api as opfapi
@ -23,6 +24,7 @@ class ExtractSubsetResources(openpype.api.Extractor):
"xml_preset_file": "Jpeg (8-bit).xml",
"xml_preset_dir": "",
"export_type": "File Sequence",
"ignore_comment_attrs": True,
"colorspace_out": "Output - sRGB",
"representation_add_range": False,
"representation_tags": ["thumbnail"]
@ -32,6 +34,7 @@ class ExtractSubsetResources(openpype.api.Extractor):
"xml_preset_file": "Apple iPad (1920x1080).xml",
"xml_preset_dir": "",
"export_type": "Movie",
"ignore_comment_attrs": True,
"colorspace_out": "Output - Rec.709",
"representation_add_range": True,
"representation_tags": [
@ -102,6 +105,7 @@ class ExtractSubsetResources(openpype.api.Extractor):
preset_dir = preset_config["xml_preset_dir"]
export_type = preset_config["export_type"]
repre_tags = preset_config["representation_tags"]
ignore_comment_attrs = preset_config["ignore_comment_attrs"]
color_out = preset_config["colorspace_out"]
# get frame range with handles for representation range
@ -131,6 +135,14 @@ class ExtractSubsetResources(openpype.api.Extractor):
"startFrame": frame_start
})
if not ignore_comment_attrs:
# add any xml overrides collected from segment.comment
modify_xml_data.update(instance.data["xml_overrides"])
self.log.debug("__ modify_xml_data: {}".format(pformat(
modify_xml_data
)))
# with maintained duplication loop all presets
with opfapi.maintained_object_duplication(
exporting_clip) as duplclip:


@ -3,6 +3,7 @@ import sys
import re
import contextlib
from bson.objectid import ObjectId
from Qt import QtGui
from avalon import io
@ -117,7 +118,7 @@ def switch_item(container,
# Collect any of current asset, subset and representation if not provided
# so we can use the original name from those.
if any(not x for x in [asset_name, subset_name, representation_name]):
_id = io.ObjectId(container["representation"])
_id = ObjectId(container["representation"])
representation = io.find_one({"type": "representation", "_id": _id})
version, subset, asset, project = io.parenthood(representation)


@ -8,13 +8,15 @@ import contextlib
import pyblish.api
import avalon.api
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.api import Logger
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
register_inventory_action_path,
deregister_inventory_action_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.fusion
@ -69,7 +71,7 @@ def install():
register_loader_plugin_path(LOAD_PATH)
avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
register_inventory_action_path(INVENTORY_PATH)
pyblish.api.register_callback(
"instanceToggled", on_pyblish_instance_toggled
@ -93,9 +95,7 @@ def uninstall():
deregister_loader_plugin_path(LOAD_PATH)
avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.deregister_plugin_path(
avalon.api.InventoryAction, INVENTORY_PATH
)
deregister_inventory_action_path(INVENTORY_PATH)
pyblish.api.deregister_callback(
"instanceToggled", on_pyblish_instance_toggled


@ -1,12 +1,14 @@
"""Host API required Work Files tool"""
import sys
import os
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .pipeline import get_current_comp
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["fusion"]
return HOST_WORKFILE_EXTENSIONS["fusion"]
def has_unsaved_changes():


@ -1,7 +1,7 @@
from avalon import api
from openpype.pipeline import InventoryAction
class FusionSelectContainers(api.InventoryAction):
class FusionSelectContainers(InventoryAction):
label = "Select Containers"
icon = "mouse-pointer"


@ -1,6 +1,6 @@
from avalon import api
from Qt import QtGui, QtWidgets
from openpype.pipeline import InventoryAction
from openpype import style
from openpype.hosts.fusion.api import (
get_current_comp,
@ -8,7 +8,7 @@ from openpype.hosts.fusion.api import (
)
class FusionSetToolColor(api.InventoryAction):
class FusionSetToolColor(InventoryAction):
"""Update the color of the selected tools"""
label = "Set Tool Color"


@ -2,11 +2,11 @@ import os
from pathlib import Path
import logging
from bson.objectid import ObjectId
import pyblish.api
from avalon import io
import avalon.api
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype import lib
from openpype.lib import register_event_callback
@ -14,6 +14,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.harmony
import openpype.hosts.harmony.api as harmony
@ -113,7 +114,7 @@ def check_inventory():
representation = container['representation']
representation_doc = io.find_one(
{
"_id": io.ObjectId(representation),
"_id": ObjectId(representation),
"type": "representation"
},
projection={"parent": True}


@ -2,20 +2,21 @@
import os
import shutil
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .lib import (
ProcessContext,
get_local_harmony_path,
zip_and_move,
launch_zip_file
)
from avalon import api
# used to lock saving until previous save is done.
save_disabled = False
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["harmony"]
return HOST_WORKFILE_EXTENSIONS["harmony"]
def has_unsaved_changes():


@ -1,12 +1,12 @@
import os
import hiero.core.events
from openpype.api import Logger
from openpype.lib import register_event_callback
from .lib import (
sync_avalon_data_to_workfile,
launch_workfiles_app,
selection_changed_timeline,
before_project_save,
register_event_callback
)
from .tags import add_tags_to_workfile
from .menu import update_menu_task_label


@ -8,7 +8,10 @@ import platform
import ast
import shutil
import hiero
from Qt import QtWidgets
from bson.objectid import ObjectId
import avalon.api as avalon
import avalon.io
from openpype.api import (Logger, Anatomy, get_anatomy_settings)
@ -1006,7 +1009,7 @@ def check_inventory_versions():
# get representation from io
representation = io.find_one({
"type": "representation",
"_id": io.ObjectId(container["representation"])
"_id": ObjectId(container["representation"])
})
# Get start frame from version data


@ -4,7 +4,7 @@ Basic avalon integration
import os
import contextlib
from collections import OrderedDict
from avalon.pipeline import AVALON_CONTAINER_ID
from avalon import api as avalon
from avalon import schema
from pyblish import api as pyblish
@ -13,6 +13,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from openpype.tools.utils import host_tools
from . import lib, menu, events
@ -28,7 +29,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish").replace("\\", "/")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load").replace("\\", "/")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create").replace("\\", "/")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory").replace("\\", "/")
AVALON_CONTAINERS = ":AVALON_CONTAINERS"
@ -51,7 +51,6 @@ def install():
pyblish.register_plugin_path(PUBLISH_PATH)
register_loader_plugin_path(LOAD_PATH)
avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
# register callback for switching publishable
pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)


@ -1,14 +1,14 @@
import os
import hiero
from avalon import api
from openpype.api import Logger
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["hiero"]
return HOST_WORKFILE_EXTENSIONS["hiero"]
def has_unsaved_changes():


@ -8,12 +8,12 @@ import hdefereval
import pyblish.api
import avalon.api
from avalon.pipeline import AVALON_CONTAINER_ID
from avalon.lib import find_submodule
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.houdini
from openpype.hosts.houdini.api import lib


@ -2,11 +2,11 @@
import os
import hou
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["houdini"]
return HOST_WORKFILE_EXTENSIONS["houdini"]
def has_unsaved_changes():

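Across the workio hunks above, every host's `file_extensions()` switches from `avalon.api.HOST_WORKFILE_EXTENSIONS` to the mapping now exported by `openpype.pipeline`. A sketch of that lookup — the extension lists below are an illustrative assumption, and the real mapping covers every supported host:

```python
# Illustrative subset; real values live in openpype.pipeline.
HOST_WORKFILE_EXTENSIONS = {
    "aftereffects": [".aep"],
    "blender": [".blend"],
    "fusion": [".comp"],
    "hiero": [".hrox"],
    "houdini": [".hip", ".hiplc", ".hipnc"],
}


def file_extensions(host):
    """What each host's file_extensions() resolves to after the refactor."""
    return HOST_WORKFILE_EXTENSIONS[host]
```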

@ -3,6 +3,7 @@ import os
from openpype.pipeline import (
load,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.houdini.api import lib, pipeline
@ -73,7 +74,7 @@ class ImageLoader(load.LoaderPlugin):
# Imprint it manually
data = {
"schema": "avalon-core:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": node_name,
"namespace": namespace,
"loader": str(self.__class__.__name__),


@ -1,8 +1,9 @@
from openpype.pipeline import (
load,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.houdini.api import lib, pipeline
from openpype.hosts.houdini.api import lib
class USDSublayerLoader(load.LoaderPlugin):
@@ -43,7 +44,7 @@ class USDSublayerLoader(load.LoaderPlugin):
# Imprint it manually
data = {
"schema": "avalon-core:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": node_name,
"namespace": namespace,
"loader": str(self.__class__.__name__),


@@ -1,8 +1,9 @@
from openpype.pipeline import (
load,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.houdini.api import lib, pipeline
from openpype.hosts.houdini.api import lib
class USDReferenceLoader(load.LoaderPlugin):
@@ -43,7 +44,7 @@ class USDReferenceLoader(load.LoaderPlugin):
# Imprint it manually
data = {
"schema": "avalon-core:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": node_name,
"namespace": namespace,
"loader": str(self.__class__.__name__),


@@ -145,7 +145,6 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase):
path = self._template.format(**{
"root": root,
"project": PROJECT,
"silo": asset_doc["silo"],
"asset": asset_doc["name"],
"subset": subset,
"representation": ext,
@@ -165,4 +164,3 @@ output_processor = AvalonURIOutputProcessor()
def usdOutputProcessor():
return output_processor


@@ -1511,7 +1511,7 @@ def get_container_members(container):
members = cmds.sets(container, query=True) or []
members = cmds.ls(members, long=True, objectsOnly=True) or []
members = set(members)
all_members = set(members)
# Include any referenced nodes from any reference in the container
# This is required since we've removed adding ALL nodes of a reference
@@ -1530,9 +1530,9 @@ def get_container_members(container):
reference_members = cmds.ls(reference_members,
long=True,
objectsOnly=True)
members.update(reference_members)
all_members.update(reference_members)
return members
return list(all_members)
# region LOOKDEV
@@ -1937,18 +1937,26 @@ def remove_other_uv_sets(mesh):
cmds.removeMultiInstance(attr, b=True)
def get_id_from_history(node):
def get_id_from_sibling(node, history_only=True):
"""Return first node id in the history chain that matches this node.
The nodes in history must be of the exact same node type and must be
parented under the same parent.
Optionally, if no matching node is found from the history, all the
siblings of the node that are of the same type are checked.
Additionally to having the same parent, the sibling must be marked as
'intermediate object'.
Args:
node (str): node to retrieve the
node (str): node to retrieve the history from
history_only (bool): if False and nothing is found in the history,
also look for an 'intermediate object' among the node's siblings
of the same type
Returns:
str or None: The id from the node in history or None when no id found
on any valid nodes in the history.
str or None: The id from the sibling node or None when no id found
on any valid nodes in the history or siblings.
"""
@@ -1977,6 +1985,45 @@ def get_id_from_history(node):
if _id:
return _id
if not history_only:
# Get siblings of same type
similar_nodes = cmds.listRelatives(parent,
type=node_type,
fullPath=True)
similar_nodes = cmds.ls(similar_nodes, exactType=node_type, long=True)
# Exclude itself
similar_nodes = [x for x in similar_nodes if x != node]
# Get all unique ids from siblings in order since
# we consistently take the first one found
sibling_ids = OrderedDict()
for similar_node in similar_nodes:
# Check if "intermediate object"
if not cmds.getAttr(similar_node + ".intermediateObject"):
continue
_id = get_id(similar_node)
if not _id:
continue
if _id in sibling_ids:
sibling_ids[_id].append(similar_node)
else:
sibling_ids[_id] = [similar_node]
if sibling_ids:
first_id, found_nodes = next(iter(sibling_ids.items()))
# Log a warning if we've found multiple unique ids
if len(sibling_ids) > 1:
log.warning(("Found more than 1 intermediate shape with"
" unique id for '{}'. Using id of first"
" found: '{}'".format(node, found_nodes[0])))
return first_id
# Project settings
def set_scene_fps(fps, update=True):

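The new `get_id_from_sibling` above falls back to same-type sibling shapes, collecting their ids in encounter order and consistently using the first one found. A pure-Python sketch of that selection step; `get_id` and `is_intermediate` are injected stand-ins for the Maya queries (`cmds.getAttr` on `intermediateObject`, the cbId lookup), and all names here are illustrative:

```python
from collections import OrderedDict

def first_sibling_id(node, siblings, get_id, is_intermediate):
    # Collect ids of intermediate-object siblings, preserving order so
    # the first id found is taken consistently, as in the Maya version.
    sibling_ids = OrderedDict()
    for sibling in siblings:
        if sibling == node or not is_intermediate(sibling):
            continue
        _id = get_id(sibling)
        if not _id:
            continue
        sibling_ids.setdefault(_id, []).append(sibling)
    if sibling_ids:
        first_id, _found_nodes = next(iter(sibling_ids.items()))
        return first_id
    return None
```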

@@ -10,7 +10,6 @@ import pyblish.api
import avalon.api
from avalon.lib import find_submodule
from avalon.pipeline import AVALON_CONTAINER_ID
import openpype.hosts.maya
from openpype.tools.utils import host_tools
@@ -23,7 +22,10 @@ from openpype.lib.path_tools import HostDirmap
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
register_inventory_action_path,
deregister_loader_plugin_path,
deregister_inventory_action_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.maya.lib import copy_workspace_mel
from . import menu, lib
@@ -59,7 +61,7 @@ def install():
register_loader_plugin_path(LOAD_PATH)
avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
register_inventory_action_path(INVENTORY_PATH)
log.info(PUBLISH_PATH)
log.info("Installing callbacks ... ")
@@ -188,9 +190,7 @@ def uninstall():
deregister_loader_plugin_path(LOAD_PATH)
avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.deregister_plugin_path(
avalon.api.InventoryAction, INVENTORY_PATH
)
deregister_inventory_action_path(INVENTORY_PATH)
menu.uninstall()


@@ -4,11 +4,11 @@ from maya import cmds
import qargparse
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.pipeline import (
LegacyCreator,
LoaderPlugin,
get_representation_path,
AVALON_CONTAINER_ID,
)
from .pipeline import containerise


@@ -6,6 +6,8 @@ import contextlib
import copy
import six
from bson.objectid import ObjectId
from maya import cmds
from avalon import io
@@ -282,7 +284,7 @@ def update_package_version(container, version):
# Versioning (from `core.maya.pipeline`)
current_representation = io.find_one({
"_id": io.ObjectId(container["representation"])
"_id": ObjectId(container["representation"])
})
assert current_representation is not None, "This is a bug"
@@ -327,7 +329,7 @@ def update_package(set_container, representation):
# Load the original package data
current_representation = io.find_one({
"_id": io.ObjectId(set_container['representation']),
"_id": ObjectId(set_container['representation']),
"type": "representation"
})
@@ -478,10 +480,10 @@ def update_scene(set_container, containers, current_data, new_data, new_file):
# They *must* use the same asset, subset and Loader for
# `update_container` to make sense.
old = io.find_one({
"_id": io.ObjectId(representation_current)
"_id": ObjectId(representation_current)
})
new = io.find_one({
"_id": io.ObjectId(representation_new)
"_id": ObjectId(representation_new)
})
is_valid = compare_representations(old=old, new=new)
if not is_valid:


@@ -1,11 +1,12 @@
"""Host API required Work Files tool"""
import os
from maya import cmds
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["maya"]
return HOST_WORKFILE_EXTENSIONS["maya"]
def has_unsaved_changes():


@@ -0,0 +1,51 @@
from openpype.hosts.maya.api import plugin, lib
class CreateMultiverseUsd(plugin.Creator):
"""Multiverse USD data"""
name = "usdMain"
label = "Multiverse USD"
family = "usd"
icon = "cubes"
def __init__(self, *args, **kwargs):
super(CreateMultiverseUsd, self).__init__(*args, **kwargs)
# Add animation data first, since it maintains order.
self.data.update(lib.collect_animation_data(True))
self.data["stripNamespaces"] = False
self.data["mergeTransformAndShape"] = False
self.data["writeAncestors"] = True
self.data["flattenParentXforms"] = False
self.data["writeSparseOverrides"] = False
self.data["useMetaPrimPath"] = False
self.data["customRootPath"] = ''
self.data["customAttributes"] = ''
self.data["nodeTypesToIgnore"] = ''
self.data["writeMeshes"] = True
self.data["writeCurves"] = True
self.data["writeParticles"] = True
self.data["writeCameras"] = False
self.data["writeLights"] = False
self.data["writeJoints"] = False
self.data["writeCollections"] = False
self.data["writePositions"] = True
self.data["writeNormals"] = True
self.data["writeUVs"] = True
self.data["writeColorSets"] = False
self.data["writeTangents"] = False
self.data["writeRefPositions"] = False
self.data["writeBlendShapes"] = False
self.data["writeDisplayColor"] = False
self.data["writeSkinWeights"] = False
self.data["writeMaterialAssignment"] = False
self.data["writeHardwareShader"] = False
self.data["writeShadingNetworks"] = False
self.data["writeTransformMatrix"] = True
self.data["writeUsdAttributes"] = False
self.data["timeVaryingTopology"] = False
self.data["customMaterialNamespace"] = ''
self.data["numTimeSamples"] = 1
self.data["timeSamplesSpan"] = 0.0


@@ -0,0 +1,23 @@
from openpype.hosts.maya.api import plugin, lib
class CreateMultiverseUsdComp(plugin.Creator):
"""Create Multiverse USD Composition"""
name = "usdCompositionMain"
label = "Multiverse USD Composition"
family = "usdComposition"
icon = "cubes"
def __init__(self, *args, **kwargs):
super(CreateMultiverseUsdComp, self).__init__(*args, **kwargs)
# Add animation data first, since it maintains order.
self.data.update(lib.collect_animation_data(True))
self.data["stripNamespaces"] = False
self.data["mergeTransformAndShape"] = False
self.data["flattenContent"] = False
self.data["writePendingOverrides"] = False
self.data["numTimeSamples"] = 1
self.data["timeSamplesSpan"] = 0.0


@@ -0,0 +1,28 @@
from openpype.hosts.maya.api import plugin, lib
class CreateMultiverseUsdOver(plugin.Creator):
"""Multiverse USD data"""
name = "usdOverrideMain"
label = "Multiverse USD Override"
family = "usdOverride"
icon = "cubes"
def __init__(self, *args, **kwargs):
super(CreateMultiverseUsdOver, self).__init__(*args, **kwargs)
# Add animation data first, since it maintains order.
self.data.update(lib.collect_animation_data(True))
self.data["writeAll"] = False
self.data["writeTransforms"] = True
self.data["writeVisibility"] = True
self.data["writeAttributes"] = True
self.data["writeMaterials"] = True
self.data["writeVariants"] = True
self.data["writeVariantsDefinition"] = True
self.data["writeActiveState"] = True
self.data["writeNamespaces"] = False
self.data["numTimeSamples"] = 1
self.data["timeSamplesSpan"] = 0.0


@@ -15,6 +15,14 @@ class CreateReview(plugin.Creator):
keepImages = False
isolate = False
imagePlane = True
transparency = [
"preset",
"simple",
"object sorting",
"weighted average",
"depth peeling",
"alpha cut"
]
def __init__(self, *args, **kwargs):
super(CreateReview, self).__init__(*args, **kwargs)
@@ -28,5 +36,6 @@ class CreateReview(plugin.Creator):
data["isolate"] = self.isolate
data["keepImages"] = self.keepImages
data["imagePlane"] = self.imagePlane
data["transparency"] = self.transparency
self.data = data


@@ -1,6 +1,8 @@
import json
from avalon import api, io
from avalon import io
from bson.objectid import ObjectId
from openpype.pipeline import (
InventoryAction,
get_representation_context,
get_representation_path_from_context,
)
@@ -10,7 +12,7 @@ from openpype.hosts.maya.api.lib import (
)
class ImportModelRender(api.InventoryAction):
class ImportModelRender(InventoryAction):
label = "Import Model Render Sets"
icon = "industry"
@@ -39,7 +41,7 @@ class ImportModelRender(api.InventoryAction):
nodes.append(n)
repr_doc = io.find_one({
"_id": io.ObjectId(container["representation"]),
"_id": ObjectId(container["representation"]),
})
version_id = repr_doc["parent"]


@@ -1,11 +1,10 @@
from maya import cmds
from avalon import api
from openpype.pipeline import InventoryAction
from openpype.hosts.maya.api.plugin import get_reference_node
class ImportReference(api.InventoryAction):
class ImportReference(InventoryAction):
"""Imports selected reference to inside of the file."""
label = "Import Reference"


@@ -0,0 +1,102 @@
# -*- coding: utf-8 -*-
import maya.cmds as cmds
from openpype.pipeline import (
load,
get_representation_path
)
from openpype.hosts.maya.api.lib import (
maintained_selection,
namespaced,
unique_namespace
)
from openpype.hosts.maya.api.pipeline import containerise
class MultiverseUsdLoader(load.LoaderPlugin):
"""Load the USD by Multiverse"""
families = ["model", "usd", "usdComposition", "usdOverride",
"pointcache", "animation"]
representations = ["usd", "usda", "usdc", "usdz", "abc"]
label = "Read USD by Multiverse"
order = -10
icon = "code-fork"
color = "orange"
def load(self, context, name=None, namespace=None, options=None):
asset = context['asset']['name']
namespace = namespace or unique_namespace(
asset + "_",
prefix="_" if asset[0].isdigit() else "",
suffix="_",
)
# Create the shape
cmds.loadPlugin("MultiverseForMaya", quiet=True)
shape = None
transform = None
with maintained_selection():
cmds.namespace(addNamespace=namespace)
with namespaced(namespace, new=False):
import multiverse
shape = multiverse.CreateUsdCompound(self.fname)
transform = cmds.listRelatives(
shape, parent=True, fullPath=True)[0]
# Lock the shape node so the user cannot delete it.
cmds.lockNode(shape, lock=True)
nodes = [transform, shape]
self[:] = nodes
return containerise(
name=name,
namespace=namespace,
nodes=nodes,
context=context,
loader=self.__class__.__name__)
def update(self, container, representation):
# type: (dict, dict) -> None
"""Update container with specified representation."""
node = container['objectName']
assert cmds.objExists(node), "Missing container"
members = cmds.sets(node, query=True) or []
shapes = cmds.ls(members, type="mvUsdCompoundShape")
assert shapes, "Cannot find mvUsdCompoundShape in container"
path = get_representation_path(representation)
import multiverse
for shape in shapes:
multiverse.SetUsdCompoundAssetPaths(shape, [path])
cmds.setAttr("{}.representation".format(node),
str(representation["_id"]),
type="string")
def switch(self, container, representation):
self.update(container, representation)
def remove(self, container):
# type: (dict) -> None
"""Remove loaded container."""
# Delete container and its contents
if cmds.objExists(container['objectName']):
members = cmds.sets(container['objectName'], query=True) or []
cmds.delete([container['objectName']] + members)
# Remove the namespace, if empty
namespace = container['namespace']
if cmds.namespace(exists=namespace):
members = cmds.namespaceInfo(namespace, listNamespace=True)
if not members:
cmds.namespace(removeNamespace=namespace)
else:
self.log.warning("Namespace not deleted because it "
"still has members: %s", namespace)

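`MultiverseUsdLoader.load` above builds its namespace with `unique_namespace`, adding a `_` prefix when the asset name starts with a digit, since Maya identifiers may not begin with one. A sketch under the assumption that `unique_namespace` simply concatenates prefix, base and suffix; the real helper in `openpype.hosts.maya.api.lib` also guarantees uniqueness in the scene:

```python
def unique_namespace(base, prefix="", suffix=""):
    # Stand-in for openpype.hosts.maya.api.lib.unique_namespace, which
    # additionally ensures the namespace does not already exist in Maya.
    return prefix + base + suffix

def make_namespace(asset):
    # Maya node and namespace names cannot start with a digit.
    return unique_namespace(
        asset + "_",
        prefix="_" if asset[0].isdigit() else "",
        suffix="_",
    )
```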

@@ -7,6 +7,8 @@ loader will use them instead of native vray vrmesh format.
"""
import os
from bson.objectid import ObjectId
import maya.cmds as cmds
from avalon import io
@@ -186,7 +188,7 @@ class VRayProxyLoader(load.LoaderPlugin):
abc_rep = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(version_id),
"parent": ObjectId(version_id),
"name": "abc"
})


@@ -6,7 +6,7 @@ from maya import cmds
import openpype.api
from openpype.hosts.maya.api.lib import maintained_selection
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.pipeline import AVALON_CONTAINER_ID
class ExtractMayaSceneRaw(openpype.api.Extractor):


@@ -0,0 +1,210 @@
import os
import six
from maya import cmds
import openpype.api
from openpype.hosts.maya.api.lib import maintained_selection
class ExtractMultiverseUsd(openpype.api.Extractor):
"""Extractor for USD by Multiverse."""
label = "Extract Multiverse USD"
hosts = ["maya"]
families = ["usd"]
@property
def options(self):
"""Overridable options for Multiverse USD Export
Given in the following format
- {NAME: EXPECTED TYPE}
If the overridden option's type does not match,
the option is not included and a warning is logged.
"""
return {
"stripNamespaces": bool,
"mergeTransformAndShape": bool,
"writeAncestors": bool,
"flattenParentXforms": bool,
"writeSparseOverrides": bool,
"useMetaPrimPath": bool,
"customRootPath": str,
"customAttributes": str,
"nodeTypesToIgnore": str,
"writeMeshes": bool,
"writeCurves": bool,
"writeParticles": bool,
"writeCameras": bool,
"writeLights": bool,
"writeJoints": bool,
"writeCollections": bool,
"writePositions": bool,
"writeNormals": bool,
"writeUVs": bool,
"writeColorSets": bool,
"writeTangents": bool,
"writeRefPositions": bool,
"writeBlendShapes": bool,
"writeDisplayColor": bool,
"writeSkinWeights": bool,
"writeMaterialAssignment": bool,
"writeHardwareShader": bool,
"writeShadingNetworks": bool,
"writeTransformMatrix": bool,
"writeUsdAttributes": bool,
"timeVaryingTopology": bool,
"customMaterialNamespace": str,
"numTimeSamples": int,
"timeSamplesSpan": float
}
@property
def default_options(self):
"""The default options for Multiverse USD extraction."""
return {
"stripNamespaces": False,
"mergeTransformAndShape": False,
"writeAncestors": True,
"flattenParentXforms": False,
"writeSparseOverrides": False,
"useMetaPrimPath": False,
"customRootPath": str(),
"customAttributes": str(),
"nodeTypesToIgnore": str(),
"writeMeshes": True,
"writeCurves": True,
"writeParticles": True,
"writeCameras": False,
"writeLights": False,
"writeJoints": False,
"writeCollections": False,
"writePositions": True,
"writeNormals": True,
"writeUVs": True,
"writeColorSets": False,
"writeTangents": False,
"writeRefPositions": False,
"writeBlendShapes": False,
"writeDisplayColor": False,
"writeSkinWeights": False,
"writeMaterialAssignment": False,
"writeHardwareShader": False,
"writeShadingNetworks": False,
"writeTransformMatrix": True,
"writeUsdAttributes": False,
"timeVaryingTopology": False,
"customMaterialNamespace": str(),
"numTimeSamples": 1,
"timeSamplesSpan": 0.0
}
def parse_overrides(self, instance, options):
"""Inspect data of instance to determine overridden options"""
for key in instance.data:
if key not in self.options:
continue
# Ensure the data is of correct type
value = instance.data[key]
if isinstance(value, six.text_type):
value = str(value)
if not isinstance(value, self.options[key]):
self.log.warning(
"Overridden attribute {key} was of "
"the wrong type: {invalid_type} "
"- should have been {valid_type}".format(
key=key,
invalid_type=type(value).__name__,
valid_type=self.options[key].__name__))
continue
options[key] = value
return options
def process(self, instance):
# Load the plugin first
cmds.loadPlugin("MultiverseForMaya", quiet=True)
# Define output file path
staging_dir = self.staging_dir(instance)
file_name = "{}.usd".format(instance.name)
file_path = os.path.join(staging_dir, file_name)
file_path = file_path.replace('\\', '/')
# Parse export options
options = self.default_options
options = self.parse_overrides(instance, options)
self.log.info("Export options: {0}".format(options))
# Perform extraction
self.log.info("Performing extraction ...")
with maintained_selection():
members = instance.data("setMembers")
members = cmds.ls(members,
dag=True,
shapes=True,
type=("mesh"),
noIntermediate=True,
long=True)
self.log.info('Collected object {}'.format(members))
import multiverse
time_opts = None
frame_start = instance.data['frameStart']
frame_end = instance.data['frameEnd']
handle_start = instance.data['handleStart']
handle_end = instance.data['handleEnd']
step = instance.data['step']
fps = instance.data['fps']
if frame_end != frame_start:
time_opts = multiverse.TimeOptions()
time_opts.writeTimeRange = True
time_opts.frameRange = (
frame_start - handle_start, frame_end + handle_end)
time_opts.frameIncrement = step
time_opts.numTimeSamples = instance.data["numTimeSamples"]
time_opts.timeSamplesSpan = instance.data["timeSamplesSpan"]
time_opts.framePerSecond = fps
asset_write_opts = multiverse.AssetWriteOptions(time_opts)
options_discard_keys = {
'numTimeSamples',
'timeSamplesSpan',
'frameStart',
'frameEnd',
'handleStart',
'handleEnd',
'step',
'fps'
}
for key, value in options.items():
if key in options_discard_keys:
continue
setattr(asset_write_opts, key, value)
multiverse.WriteAsset(file_path, members, asset_write_opts)
if "representations" not in instance.data:
instance.data["representations"] = []
representation = {
'name': 'usd',
'ext': 'usd',
'files': file_name,
"stagingDir": staging_dir
}
instance.data["representations"].append(representation)
self.log.info("Extracted instance {} to {}".format(
instance.name, file_path))

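`parse_overrides` above merges instance data into the default export options only when a value's type matches the declared one. A self-contained sketch of that merge (stripped of the `six` text-type handling and the pyblish logger; all names here are illustrative):

```python
def parse_overrides(data, option_types, defaults, warn=print):
    # Copy the defaults, then accept an override only when its type
    # matches the declared expected type; otherwise warn and keep it.
    options = dict(defaults)
    for key, value in data.items():
        if key not in option_types:
            continue
        if not isinstance(value, option_types[key]):
            warn("Overridden attribute {0} was of the wrong type: {1} "
                 "- should have been {2}".format(
                     key, type(value).__name__,
                     option_types[key].__name__))
            continue
        options[key] = value
    return options
```

Note that `isinstance(True, int)` holds in Python, so a bool would pass an `int` check; the real extractor inherits the same behaviour.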

@@ -0,0 +1,151 @@
import os
from maya import cmds
import openpype.api
from openpype.hosts.maya.api.lib import maintained_selection
class ExtractMultiverseUsdComposition(openpype.api.Extractor):
"""Extractor of Multiverse USD Composition."""
label = "Extract Multiverse USD Composition"
hosts = ["maya"]
families = ["usdComposition"]
@property
def options(self):
"""Overridable options for Multiverse USD Export
Given in the following format
- {NAME: EXPECTED TYPE}
If the overridden option's type does not match,
the option is not included and a warning is logged.
"""
return {
"stripNamespaces": bool,
"mergeTransformAndShape": bool,
"flattenContent": bool,
"writePendingOverrides": bool,
"numTimeSamples": int,
"timeSamplesSpan": float
}
@property
def default_options(self):
"""The default options for Multiverse USD extraction."""
return {
"stripNamespaces": True,
"mergeTransformAndShape": False,
"flattenContent": False,
"writePendingOverrides": False,
"numTimeSamples": 1,
"timeSamplesSpan": 0.0
}
def parse_overrides(self, instance, options):
"""Inspect data of instance to determine overridden options"""
for key in instance.data:
if key not in self.options:
continue
# Ensure the data is of correct type
value = instance.data[key]
if not isinstance(value, self.options[key]):
self.log.warning(
"Overridden attribute {key} was of "
"the wrong type: {invalid_type} "
"- should have been {valid_type}".format(
key=key,
invalid_type=type(value).__name__,
valid_type=self.options[key].__name__))
continue
options[key] = value
return options
def process(self, instance):
# Load the plugin first
cmds.loadPlugin("MultiverseForMaya", quiet=True)
# Define output file path
staging_dir = self.staging_dir(instance)
file_name = "{}.usd".format(instance.name)
file_path = os.path.join(staging_dir, file_name)
file_path = file_path.replace('\\', '/')
# Parse export options
options = self.default_options
options = self.parse_overrides(instance, options)
self.log.info("Export options: {0}".format(options))
# Perform extraction
self.log.info("Performing extraction ...")
with maintained_selection():
members = instance.data("setMembers")
members = cmds.ls(members,
dag=True,
shapes=True,
type="mvUsdCompoundShape",
noIntermediate=True,
long=True)
self.log.info('Collected object {}'.format(members))
import multiverse
time_opts = None
frame_start = instance.data['frameStart']
frame_end = instance.data['frameEnd']
handle_start = instance.data['handleStart']
handle_end = instance.data['handleEnd']
step = instance.data['step']
fps = instance.data['fps']
if frame_end != frame_start:
time_opts = multiverse.TimeOptions()
time_opts.writeTimeRange = True
time_opts.frameRange = (
frame_start - handle_start, frame_end + handle_end)
time_opts.frameIncrement = step
time_opts.numTimeSamples = instance.data["numTimeSamples"]
time_opts.timeSamplesSpan = instance.data["timeSamplesSpan"]
time_opts.framePerSecond = fps
comp_write_opts = multiverse.CompositionWriteOptions()
options_discard_keys = {
'numTimeSamples',
'timeSamplesSpan',
'frameStart',
'frameEnd',
'handleStart',
'handleEnd',
'step',
'fps'
}
for key, value in options.items():
if key in options_discard_keys:
continue
setattr(comp_write_opts, key, value)
multiverse.WriteComposition(file_path, members, comp_write_opts)
if "representations" not in instance.data:
instance.data["representations"] = []
representation = {
'name': 'usd',
'ext': 'usd',
'files': file_name,
"stagingDir": staging_dir
}
instance.data["representations"].append(representation)
self.log.info("Extracted instance {} to {}".format(
instance.name, file_path))

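The Multiverse extractors compute the exported time range the same way: widen the instance frame range by its handles, and skip the time options entirely for single-frame instances. A sketch of just that computation, using the instance-data keys shown above:

```python
def padded_frame_range(instance_data):
    # A single-frame instance exports without time options (None here).
    if instance_data["frameEnd"] == instance_data["frameStart"]:
        return None
    return (instance_data["frameStart"] - instance_data["handleStart"],
            instance_data["frameEnd"] + instance_data["handleEnd"])
```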

@@ -0,0 +1,139 @@
import os
import openpype.api
from openpype.hosts.maya.api.lib import maintained_selection
from maya import cmds
class ExtractMultiverseUsdOverride(openpype.api.Extractor):
"""Extractor for USD Override by Multiverse."""
label = "Extract Multiverse USD Override"
hosts = ["maya"]
families = ["usdOverride"]
@property
def options(self):
"""Overridable options for Multiverse USD Export
Given in the following format
- {NAME: EXPECTED TYPE}
If the overridden option's type does not match,
the option is not included and a warning is logged.
"""
return {
"writeAll": bool,
"writeTransforms": bool,
"writeVisibility": bool,
"writeAttributes": bool,
"writeMaterials": bool,
"writeVariants": bool,
"writeVariantsDefinition": bool,
"writeActiveState": bool,
"writeNamespaces": bool,
"numTimeSamples": int,
"timeSamplesSpan": float
}
@property
def default_options(self):
"""The default options for Multiverse USD extraction."""
return {
"writeAll": False,
"writeTransforms": True,
"writeVisibility": True,
"writeAttributes": True,
"writeMaterials": True,
"writeVariants": True,
"writeVariantsDefinition": True,
"writeActiveState": True,
"writeNamespaces": False,
"numTimeSamples": 1,
"timeSamplesSpan": 0.0
}
def process(self, instance):
# Load the plugin first
cmds.loadPlugin("MultiverseForMaya", quiet=True)
# Define output file path
staging_dir = self.staging_dir(instance)
file_name = "{}.usda".format(instance.name)
file_path = os.path.join(staging_dir, file_name)
file_path = file_path.replace("\\", "/")
# Parse export options
options = self.default_options
self.log.info("Export options: {0}".format(options))
# Perform extraction
self.log.info("Performing extraction ...")
with maintained_selection():
members = instance.data("setMembers")
members = cmds.ls(members,
dag=True,
shapes=True,
type="mvUsdCompoundShape",
noIntermediate=True,
long=True)
self.log.info("Collected object {}".format(members))
# TODO: Deal with asset, composition, override with options.
import multiverse
time_opts = None
frame_start = instance.data["frameStart"]
frame_end = instance.data["frameEnd"]
handle_start = instance.data["handleStart"]
handle_end = instance.data["handleEnd"]
step = instance.data["step"]
fps = instance.data["fps"]
if frame_end != frame_start:
time_opts = multiverse.TimeOptions()
time_opts.writeTimeRange = True
time_opts.frameRange = (
frame_start - handle_start, frame_end + handle_end)
time_opts.frameIncrement = step
time_opts.numTimeSamples = instance.data["numTimeSamples"]
time_opts.timeSamplesSpan = instance.data["timeSamplesSpan"]
time_opts.framePerSecond = fps
over_write_opts = multiverse.OverridesWriteOptions(time_opts)
options_discard_keys = {
"numTimeSamples",
"timeSamplesSpan",
"frameStart",
"frameEnd",
"handleStart",
"handleEnd",
"step",
"fps"
}
for key, value in options.items():
if key in options_discard_keys:
continue
setattr(over_write_opts, key, value)
for member in members:
multiverse.WriteOverrides(file_path, member, over_write_opts)
if "representations" not in instance.data:
instance.data["representations"] = []
representation = {
"name": "usd",
"ext": "usd",
"files": file_name,
"stagingDir": staging_dir
}
instance.data["representations"].append(representation)
self.log.info("Extracted instance {} to {}".format(
instance.name, file_path))


@@ -73,6 +73,11 @@ class ExtractPlayblast(openpype.api.Extractor):
pm.currentTime(refreshFrameInt - 1, edit=True)
pm.currentTime(refreshFrameInt, edit=True)
# Override transparency if requested.
transparency = instance.data.get("transparency", 0)
if transparency != 0:
preset["viewport2_options"]["transparencyAlgorithm"] = transparency
# Isolate view is requested by having objects in the set besides a
# camera.
if preset.pop("isolate_view", False) and instance.data.get("isolate"):

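The `ExtractPlayblast` hunk above overrides the viewport preset's transparency algorithm when the instance carries a non-default value. A sketch of that override on a plain preset dict; `setdefault` is added here so the sketch runs without a prebuilt preset, whereas the real code assumes `viewport2_options` already exists:

```python
def apply_transparency(preset, instance_data):
    # 0 (or an absent key) means "keep whatever the preset already has".
    transparency = instance_data.get("transparency", 0)
    if transparency != 0:
        preset.setdefault("viewport2_options", {})[
            "transparencyAlgorithm"] = transparency
    return preset
```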

@@ -32,8 +32,8 @@ class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin):
# if a deformer has been created on the shape
invalid = self.get_invalid(instance)
if invalid:
raise RuntimeError("Nodes found with non-related "
"asset IDs: {0}".format(invalid))
raise RuntimeError("Nodes found with mismatching "
"IDs: {0}".format(invalid))
@classmethod
def get_invalid(cls, instance):
@@ -65,7 +65,7 @@ class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin):
invalid.append(node)
continue
history_id = lib.get_id_from_history(node)
history_id = lib.get_id_from_sibling(node)
if history_id is not None and node_id != history_id:
invalid.append(node)
@@ -76,7 +76,7 @@ class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin):
for node in cls.get_invalid(instance):
# Get the original id from history
history_id = lib.get_id_from_history(node)
history_id = lib.get_id_from_sibling(node)
if not history_id:
cls.log.error("Could not find ID in history for '%s'", node)
continue


@@ -48,7 +48,7 @@ class ValidateNodeIdsDeformedShape(pyblish.api.InstancePlugin):
invalid = []
for shape in shapes:
history_id = lib.get_id_from_history(shape)
history_id = lib.get_id_from_sibling(shape)
if history_id:
current_id = lib.get_id(shape)
if current_id != history_id:
@@ -61,7 +61,7 @@ class ValidateNodeIdsDeformedShape(pyblish.api.InstancePlugin):
for node in cls.get_invalid(instance):
# Get the original id from history
history_id = lib.get_id_from_history(node)
history_id = lib.get_id_from_sibling(node)
if not history_id:
cls.log.error("Could not find ID in history for '%s'", node)
continue


@@ -24,6 +24,7 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin):
openpype.hosts.maya.api.action.SelectInvalidAction,
openpype.api.RepairAction
]
allow_history_only = False
def process(self, instance):
"""Process all meshes"""
@@ -32,8 +33,8 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin):
# if a deformer has been created on the shape
invalid = self.get_invalid(instance)
if invalid:
raise RuntimeError("Nodes found with non-related "
"asset IDs: {0}".format(invalid))
raise RuntimeError("Nodes found with mismatching "
"IDs: {0}".format(invalid))
@classmethod
def get_invalid(cls, instance):
@@ -51,10 +52,13 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin):
noIntermediate=True)
for shape in shapes:
history_id = lib.get_id_from_history(shape)
if history_id:
sibling_id = lib.get_id_from_sibling(
shape,
history_only=cls.allow_history_only
)
if sibling_id:
current_id = lib.get_id(shape)
if current_id != history_id:
if current_id != sibling_id:
invalid.append(shape)
return invalid
@@ -63,10 +67,13 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin):
def repair(cls, instance):
for node in cls.get_invalid(instance):
# Get the original id from history
history_id = lib.get_id_from_history(node)
if not history_id:
cls.log.error("Could not find ID in history for '%s'", node)
# Get the original id from sibling
sibling_id = lib.get_id_from_sibling(
node,
history_only=cls.allow_history_only
)
if not sibling_id:
cls.log.error("Could not find ID in siblings for '%s'", node)
continue
lib.set_id(node, history_id, overwrite=True)
lib.set_id(node, sibling_id, overwrite=True)


@@ -1,6 +1,7 @@
import logging
import contextlib
import nuke
from bson.objectid import ObjectId
from avalon import api, io
@@ -70,10 +71,10 @@ def get_handles(asset):
if "visualParent" in data:
vp = data["visualParent"]
if vp is not None:
parent_asset = io.find_one({"_id": io.ObjectId(vp)})
parent_asset = io.find_one({"_id": ObjectId(vp)})
if parent_asset is None:
parent_asset = io.find_one({"_id": io.ObjectId(asset["parent"])})
parent_asset = io.find_one({"_id": ObjectId(asset["parent"])})
if parent_asset is not None:
return get_handles(parent_asset)


@@ -6,10 +6,11 @@ import contextlib
from collections import OrderedDict
import clique
from bson.objectid import ObjectId
import nuke
from avalon import api, io, lib
from avalon import api, io
from openpype.api import (
Logger,
@@ -20,7 +21,6 @@ from openpype.api import (
get_workdir_data,
get_asset,
get_current_project_settings,
ApplicationManager
)
from openpype.tools.utils import host_tools
from openpype.lib.path_tools import HostDirmap
@@ -570,7 +570,7 @@ def check_inventory_versions():
# get representation from io
representation = io.find_one({
"type": "representation",
"_id": io.ObjectId(avalon_knob_data["representation"])
"_id": ObjectId(avalon_knob_data["representation"])
})
# Failsafe for not finding the representation.

View file

@ -6,7 +6,6 @@ import nuke
import pyblish.api
import avalon.api
from avalon import pipeline
import openpype
from openpype.api import (
@ -18,7 +17,10 @@ from openpype.lib import register_event_callback
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
register_inventory_action_path,
deregister_loader_plugin_path,
deregister_inventory_action_path,
AVALON_CONTAINER_ID,
)
from openpype.tools.utils import host_tools
@ -105,7 +107,7 @@ def install():
pyblish.api.register_plugin_path(PUBLISH_PATH)
register_loader_plugin_path(LOAD_PATH)
avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
register_inventory_action_path(INVENTORY_PATH)
# Register Avalon event for workfiles loading.
register_event_callback("workio.open_file", check_inventory_versions)
@ -131,6 +133,7 @@ def uninstall():
pyblish.api.deregister_plugin_path(PUBLISH_PATH)
deregister_loader_plugin_path(LOAD_PATH)
avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
deregister_inventory_action_path(INVENTORY_PATH)
pyblish.api.deregister_callback(
"instanceToggled", on_pyblish_instance_toggled)
@ -330,7 +333,7 @@ def containerise(node,
data = OrderedDict(
[
("schema", "openpype:container-2.0"),
("id", pipeline.AVALON_CONTAINER_ID),
("id", AVALON_CONTAINER_ID),
("name", name),
("namespace", namespace),
("loader", str(loader)),

View file

@ -1,11 +1,12 @@
"""Host API required Work Files tool"""
import os
import nuke
import avalon.api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
def file_extensions():
return avalon.api.HOST_WORKFILE_EXTENSIONS["nuke"]
return HOST_WORKFILE_EXTENSIONS["nuke"]
def has_unsaved_changes():

View file

@ -1,9 +1,9 @@
from avalon import api
from openpype.api import Logger
from openpype.pipeline import InventoryAction
from openpype.hosts.nuke.api.lib import set_avalon_knob_data
class RepairOldLoaders(api.InventoryAction):
class RepairOldLoaders(InventoryAction):
label = "Repair Old Loaders"
icon = "gears"

View file

@ -1,8 +1,8 @@
from avalon import api
from openpype.pipeline import InventoryAction
from openpype.hosts.nuke.api.commands import viewer_update_and_undo_stop
class SelectContainers(api.InventoryAction):
class SelectContainers(InventoryAction):
label = "Select Containers"
icon = "mouse-pointer"

View file

@ -101,7 +101,7 @@ class LoadClip(plugin.NukeLoader):
last += self.handle_end
if not is_sequence:
duration = last - first + 1
duration = last - first
first = 1
last = first + duration
elif "#" not in file:
@ -216,7 +216,7 @@ class LoadClip(plugin.NukeLoader):
last += self.handle_end
if not is_sequence:
duration = last - first + 1
duration = last - first
first = 1
last = first + duration
elif "#" not in file:

View file
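The one-frame change in the `LoadClip` hunks above matters for movie (non-sequence) clips: an inclusive range `first..last` has `last - first + 1` frames, so when the range is remapped to start at 1 the offset added to the new `first` must be the span `last - first`, not the frame count. A quick standalone check of the corrected arithmetic:

```python
def remap_clip_range(first, last):
    """Remap an inclusive frame range to start at 1, preserving length."""
    duration = last - first          # span between endpoints, not frame count
    first = 1
    last = first + duration
    return first, last

orig_first, orig_last = 1001, 1100   # 100 frames inclusive
new_first, new_last = remap_clip_range(orig_first, orig_last)
print(new_first, new_last)                                     # → 1 100
print(new_last - new_first + 1 == orig_last - orig_first + 1)  # → True
```

With the old `duration = last - first + 1`, the remapped range would have gained one extra frame.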

@ -72,7 +72,7 @@ class LoadEffects(load.LoaderPlugin):
# getting data from json file with unicode conversion
with open(file, "r") as f:
json_f = {self.byteify(key): self.byteify(value)
for key, value in json.load(f).iteritems()}
for key, value in json.load(f).items()}
# get correct order of nodes by positions on track and subtrack
nodes_order = self.reorder_nodes(json_f)
@ -188,7 +188,7 @@ class LoadEffects(load.LoaderPlugin):
# getting data from json file with unicode conversion
with open(file, "r") as f:
json_f = {self.byteify(key): self.byteify(value)
for key, value in json.load(f).iteritems()}
for key, value in json.load(f).items()}
# get correct order of nodes by positions on track and subtrack
nodes_order = self.reorder_nodes(json_f)
@ -330,11 +330,11 @@ class LoadEffects(load.LoaderPlugin):
if isinstance(input, dict):
return {self.byteify(key): self.byteify(value)
for key, value in input.iteritems()}
for key, value in input.items()}
elif isinstance(input, list):
return [self.byteify(element) for element in input]
elif isinstance(input, unicode):
return input.encode('utf-8')
elif isinstance(input, str):
return str(input)
else:
return input
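The `byteify` changes above are part of a Python 2→3 port: `dict.iteritems()` and the `unicode` type no longer exist in Python 3, and `str` is already text. A sketch of the ported helper, which under Python 3 is close to an identity function and mainly preserves the recursion over nested containers:

```python
def byteify(value):
    """Recursively normalize dict keys/values, list items and strings.

    Under Python 3, str is already unicode text, so strings pass
    through unchanged; the recursion is kept for nested structures.
    """
    if isinstance(value, dict):
        return {byteify(key): byteify(val) for key, val in value.items()}
    if isinstance(value, list):
        return [byteify(element) for element in value]
    if isinstance(value, str):
        return str(value)
    return value

print(byteify({"a": ["x", 1], "b": {"c": "d"}}))
# → {'a': ['x', 1], 'b': {'c': 'd'}}
```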

View file

@ -74,7 +74,7 @@ class LoadEffectsInputProcess(load.LoaderPlugin):
# getting data from json file with unicode conversion
with open(file, "r") as f:
json_f = {self.byteify(key): self.byteify(value)
for key, value in json.load(f).iteritems()}
for key, value in json.load(f).items()}
# get correct order of nodes by positions on track and subtrack
nodes_order = self.reorder_nodes(json_f)
@ -194,7 +194,7 @@ class LoadEffectsInputProcess(load.LoaderPlugin):
# getting data from json file with unicode conversion
with open(file, "r") as f:
json_f = {self.byteify(key): self.byteify(value)
for key, value in json.load(f).iteritems()}
for key, value in json.load(f).items()}
# get correct order of nodes by positions on track and subtrack
nodes_order = self.reorder_nodes(json_f)
@ -350,11 +350,11 @@ class LoadEffectsInputProcess(load.LoaderPlugin):
if isinstance(input, dict):
return {self.byteify(key): self.byteify(value)
for key, value in input.iteritems()}
for key, value in input.items()}
elif isinstance(input, list):
return [self.byteify(element) for element in input]
elif isinstance(input, unicode):
return input.encode('utf-8')
elif isinstance(input, str):
return str(input)
else:
return input

View file

@ -240,7 +240,7 @@ class LoadGizmoInputProcess(load.LoaderPlugin):
if isinstance(input, dict):
return {self.byteify(key): self.byteify(value)
for key, value in input.iteritems()}
for key, value in input.items()}
elif isinstance(input, list):
return [self.byteify(element) for element in input]
elif isinstance(input, unicode):

View file

@ -24,7 +24,11 @@ class ExtractReviewDataMov(openpype.api.Extractor):
outputs = {}
def process(self, instance):
families = instance.data["families"]
families = set(instance.data["families"])
# add main family to make sure all families are compared
families.add(instance.data["family"])
task_type = instance.context.data["taskType"]
subset = instance.data["subset"]
self.log.info("Creating staging dir...")
@ -50,51 +54,31 @@ class ExtractReviewDataMov(openpype.api.Extractor):
f_task_types = o_data["filter"]["task_types"]
f_subsets = o_data["filter"]["sebsets"]
self.log.debug(
"f_families `{}` > families: {}".format(
f_families, families))
self.log.debug(
"f_task_types `{}` > task_type: {}".format(
f_task_types, task_type))
self.log.debug(
"f_subsets `{}` > subset: {}".format(
f_subsets, subset))
# test if family found in context
test_families = any([
# first if exact family set is matching
# make sure only interesetion of list is correct
bool(set(families).intersection(f_families)),
# and if famiies are set at all
# if not then return True because we want this preset
# to be active if nothig is set
bool(not f_families)
])
# skip preset when a families filter is set
# but none of its families are present
if f_families and not families.intersection(f_families):
continue
# test task types from filter
test_task_types = any([
# check if actual task type is defined in task types
# set in preset's filter
bool(task_type in f_task_types),
# and if taskTypes are defined in preset filter
# if not then return True, because we want this filter
# to be active if no taskType is set
bool(not f_task_types)
])
if f_task_types and task_type not in f_task_types:
continue
# test subsets from filter
test_subsets = any([
# check if any of subset filter inputs
# converted to regex patern is not found in subset
# we keep strict case sensitivity
bool(next((
s for s in f_subsets
if re.search(re.compile(s), subset)
), None)),
# but if no subsets were set then make this acuntable too
bool(not f_subsets)
])
# we need all filters to be positive for this
# preset to be activated
test_all = all([
test_families,
test_task_types,
test_subsets
])
# if it is not positive then skip this preset
if not test_all:
if f_subsets and not any(
re.search(s, subset) for s in f_subsets):
continue
self.log.info(

View file
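The rewritten preset filter in `ExtractReviewDataMov` above replaces three `any([...])` blocks with early `continue`s: an empty filter list always passes, otherwise at least one entry must match. A standalone sketch of that logic; the function name and the filter-dict shape are illustrative, not the real settings schema:

```python
import re

def preset_matches(preset_filter, families, task_type, subset):
    """Return True when a preset's filter accepts the instance.

    Empty filter lists mean "match everything", mirroring the
    simplified checks in the hunk above.
    """
    f_families = preset_filter.get("families", [])
    f_task_types = preset_filter.get("task_types", [])
    f_subsets = preset_filter.get("subsets", [])

    # At least one defined family must be present in the instance
    if f_families and not set(families).intersection(f_families):
        return False
    if f_task_types and task_type not in f_task_types:
        return False
    # Subset filters are regex patterns, case-sensitive
    if f_subsets and not any(re.search(s, subset) for s in f_subsets):
        return False
    return True

print(preset_matches(
    {"families": ["render"], "subsets": ["Main$"]},
    families={"render", "review"},
    task_type="Compositing",
    subset="renderMain",
))  # → True
```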

@ -25,7 +25,7 @@ class RepairNukeWriteDeadlineTab(pyblish.api.Action):
# Remove existing knobs.
knob_names = openpype.hosts.nuke.lib.get_deadline_knob_names()
for name, knob in group_node.knobs().iteritems():
for name, knob in group_node.knobs().items():
if name in knob_names:
group_node.removeKnob(knob)

View file

@ -1,9 +1,10 @@
import os
from Qt import QtWidgets
from bson.objectid import ObjectId
import pyblish.api
import avalon.api
from avalon import pipeline, io
from avalon import io
from openpype.api import Logger
from openpype.lib import register_event_callback
@ -11,6 +12,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.photoshop
@ -36,7 +38,7 @@ def check_inventory():
representation = container['representation']
representation_doc = io.find_one(
{
"_id": io.ObjectId(representation),
"_id": ObjectId(representation),
"type": "representation"
},
projection={"parent": True}
@ -221,7 +223,7 @@ def containerise(
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": name,
"namespace": namespace,
"loader": str(loader),

View file

@ -1,8 +1,7 @@
"""Host API required Work Files tool"""
import os
import avalon.api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from . import lib
@ -15,7 +14,7 @@ def _active_document():
def file_extensions():
return avalon.api.HOST_WORKFILE_EXTENSIONS["photoshop"]
return HOST_WORKFILE_EXTENSIONS["photoshop"]
def has_unsaved_changes():

View file

@ -6,13 +6,13 @@ import contextlib
from collections import OrderedDict
from avalon import api as avalon
from avalon import schema
from avalon.pipeline import AVALON_CONTAINER_ID
from pyblish import api as pyblish
from openpype.api import Logger
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from . import lib
from . import PLUGINS_DIR
@ -22,7 +22,6 @@ log = Logger().get_logger(__name__)
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
AVALON_CONTAINERS = ":AVALON_CONTAINERS"
@ -48,7 +47,6 @@ def install():
register_loader_plugin_path(LOAD_PATH)
avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
# register callback for switching publishable
pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)
@ -73,7 +71,6 @@ def uninstall():
deregister_loader_plugin_path(LOAD_PATH)
avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
# register callback for switching publishable
pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)

View file

@ -3,9 +3,10 @@ import re
import pyblish.api
import json
from avalon.api import format_template_with_optional_keys
from openpype.lib import prepare_template_data
from openpype.lib import (
prepare_template_data,
StringTemplate,
)
class CollectTextures(pyblish.api.ContextPlugin):
@ -110,8 +111,9 @@ class CollectTextures(pyblish.api.ContextPlugin):
formatting_data.update(explicit_data)
fill_pairs = prepare_template_data(formatting_data)
workfile_subset = format_template_with_optional_keys(
fill_pairs, self.workfile_subset_template)
workfile_subset = StringTemplate.format_strict_template(
self.workfile_subset_template, fill_pairs
)
asset_build = self._get_asset_build(
repre_file,
@ -201,8 +203,9 @@ class CollectTextures(pyblish.api.ContextPlugin):
formatting_data.update(explicit_data)
fill_pairs = prepare_template_data(formatting_data)
subset = format_template_with_optional_keys(
fill_pairs, self.texture_subset_template)
subset = StringTemplate.format_strict_template(
self.texture_subset_template, fill_pairs
)
asset_build = self._get_asset_build(
repre_file,

View file

@ -1,10 +1,10 @@
from avalon import io
from openpype.lib import NumberDef
from openpype.hosts.testhost.api import pipeline
from openpype.pipeline import (
AutoCreator,
CreatedInstance,
lib
)
from avalon import io
class MyAutoCreator(AutoCreator):
@ -13,7 +13,7 @@ class MyAutoCreator(AutoCreator):
def get_instance_attr_defs(self):
output = [
lib.NumberDef("number_key", label="Number")
NumberDef("number_key", label="Number")
]
return output

View file

@ -1,10 +1,16 @@
import json
from openpype import resources
from openpype.hosts.testhost.api import pipeline
from openpype.lib import (
UISeparatorDef,
UILabelDef,
BoolDef,
NumberDef,
FileDef,
)
from openpype.pipeline import (
Creator,
CreatedInstance,
lib
)
@ -54,17 +60,17 @@ class TestCreatorOne(Creator):
def get_instance_attr_defs(self):
output = [
lib.NumberDef("number_key", label="Number"),
NumberDef("number_key", label="Number"),
]
return output
def get_pre_create_attr_defs(self):
output = [
lib.BoolDef("use_selection", label="Use selection"),
lib.UISeparatorDef(),
lib.UILabelDef("Testing label"),
lib.FileDef("filepath", folders=True, label="Filepath"),
lib.FileDef(
BoolDef("use_selection", label="Use selection"),
UISeparatorDef(),
UILabelDef("Testing label"),
FileDef("filepath", folders=True, label="Filepath"),
FileDef(
"filepath_2", multipath=True, folders=True, label="Filepath 2"
)
]

View file

@ -1,8 +1,8 @@
from openpype.lib import NumberDef, TextDef
from openpype.hosts.testhost.api import pipeline
from openpype.pipeline import (
Creator,
CreatedInstance,
lib
)
@ -40,8 +40,8 @@ class TestCreatorTwo(Creator):
def get_instance_attr_defs(self):
output = [
lib.NumberDef("number_key"),
lib.TextDef("text_key")
NumberDef("number_key"),
TextDef("text_key")
]
return output

View file

@ -1,10 +1,8 @@
import json
import pyblish.api
from openpype.pipeline import (
OpenPypePyblishPluginMixin,
attribute_definitions
)
from openpype.lib import attribute_definitions
from openpype.pipeline import OpenPypePyblishPluginMixin
class CollectInstanceOneTestHost(

View file

@ -10,7 +10,6 @@ import pyblish.api
import avalon.api
from avalon import io
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.hosts import tvpaint
from openpype.api import get_current_project_settings
@ -19,6 +18,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from .lib import (

View file

@ -4,6 +4,7 @@
"""
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .lib import (
execute_george,
execute_george_through_file
@ -47,7 +48,7 @@ def has_unsaved_changes():
def file_extensions():
"""Return the supported file extensions for Blender scene files."""
return api.HOST_WORKFILE_EXTENSIONS["tvpaint"]
return HOST_WORKFILE_EXTENSIONS["tvpaint"]
def work_root(session):

View file

@ -1,10 +1,11 @@
import getpass
import os
from avalon import api, io
from openpype.lib import (
StringTemplate,
get_workfile_template_key_from_context,
get_workdir_data
get_workdir_data,
get_last_workfile_with_version,
)
from openpype.api import Anatomy
from openpype.hosts.tvpaint.api import lib, pipeline, plugin
@ -67,9 +68,8 @@ class LoadWorkfile(plugin.Loader):
data = get_workdir_data(project_doc, asset_doc, task_name, host_name)
data["root"] = anatomy.roots
data["user"] = getpass.getuser()
template = anatomy.templates[template_key]["file"]
file_template = anatomy.templates[template_key]["file"]
# Define saving file extension
if current_file:
@ -81,11 +81,12 @@ class LoadWorkfile(plugin.Loader):
data["ext"] = extension
work_root = api.format_template_with_optional_keys(
data, anatomy.templates[template_key]["folder"]
folder_template = anatomy.templates[template_key]["folder"]
work_root = StringTemplate.format_strict_template(
folder_template, data
)
version = api.last_workfile_with_version(
work_root, template, data, host.file_extensions()
version = get_last_workfile_with_version(
work_root, file_template, data, host.file_extensions()
)[1]
if version is None:
@ -95,8 +96,8 @@ class LoadWorkfile(plugin.Loader):
data["version"] = version
path = os.path.join(
work_root,
api.format_template_with_optional_keys(data, template)
filename = StringTemplate.format_strict_template(
file_template, data
)
path = os.path.join(work_root, filename)
host.save_file(path)
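Both the folder and file templates above are now resolved with `StringTemplate.format_strict_template`, which (unlike the removed optional-keys formatter) raises when a key is missing instead of silently dropping it. A stand-in illustrating that strict behavior with plain `str.format`; the real class lives in `openpype.lib.path_templates` and raises `TemplateUnsolved` rather than `KeyError`:

```python
def format_strict_template(template, data):
    """Fill template; raise KeyError if any placeholder is unresolved.

    Stand-in for StringTemplate.format_strict_template, which raises
    TemplateUnsolved in the same situation.
    """
    return template.format(**data)

print(format_strict_template(
    "{asset}_v{version:0>3}", {"asset": "sh010", "version": 7}
))  # → sh010_v007

try:
    format_strict_template("{asset}_{comment}", {"asset": "sh010"})
except KeyError as exc:
    print("unresolved key:", exc)  # → unresolved key: 'comment'
```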

View file

@ -4,13 +4,13 @@ import logging
from typing import List
import pyblish.api
from avalon.pipeline import AVALON_CONTAINER_ID
from avalon import api
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from openpype.tools.utils import host_tools
import openpype.hosts.unreal

View file

@ -2,8 +2,10 @@
"""Loader for published alembics."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
@ -117,7 +119,7 @@ class PointCacheAlembicLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -2,8 +2,10 @@
"""Load Skeletal Mesh alembics."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -81,7 +83,7 @@ class SkeletalMeshAlembicLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -2,8 +2,10 @@
"""Loader for Static Mesh alembics."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -100,7 +102,7 @@ class StaticMeshAlembicLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -3,8 +3,10 @@
import os
import json
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -135,7 +137,7 @@ class AnimationFBXLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -2,7 +2,8 @@
"""Load camera from FBX."""
import os
from avalon import io, pipeline
from avalon import io
from openpype.pipeline import AVALON_CONTAINER_ID
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -116,7 +117,7 @@ class CameraLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -11,12 +11,12 @@ from unreal import AssetToolsHelpers
from unreal import FBXImportType
from unreal import MathLibrary as umath
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.pipeline import (
discover_loader_plugins,
loaders_from_representation,
load_container,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline

View file

@ -2,8 +2,10 @@
"""Load Skeletal Meshes form FBX."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -101,7 +103,7 @@ class SkeletalMeshFBXLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -2,8 +2,10 @@
"""Load Static meshes form FBX."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -95,7 +97,7 @@ class StaticMeshFBXLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -3,6 +3,8 @@ import os
import json
import math
from bson.objectid import ObjectId
import unreal
from unreal import EditorLevelLibrary as ell
from unreal import EditorAssetLibrary as eal
@ -62,7 +64,7 @@ class ExtractLayout(openpype.api.Extractor):
blend = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(parent),
"parent": ObjectId(parent),
"name": "blend"
},
projection={"_id": True})

View file

@ -29,6 +29,21 @@ from .vendor_bin_utils import (
is_oiio_supported
)
from .attribute_definitions import (
AbtractAttrDef,
UIDef,
UISeparatorDef,
UILabelDef,
UnknownDef,
NumberDef,
TextDef,
EnumDef,
BoolDef,
FileDef,
)
from .env_tools import (
env_value_to_bool,
get_paths_from_environ,
@ -114,6 +129,8 @@ from .avalon_context import (
get_workdir_data,
get_workdir,
get_workdir_with_workdir_data,
get_last_workfile_with_version,
get_last_workfile,
create_workfile_doc,
save_workfile_data_to_doc,
@ -231,6 +248,19 @@ __all__ = [
"get_ffmpeg_tool_path",
"is_oiio_supported",
"AbtractAttrDef",
"UIDef",
"UISeparatorDef",
"UILabelDef",
"UnknownDef",
"NumberDef",
"TextDef",
"EnumDef",
"BoolDef",
"FileDef",
"import_filepath",
"modules_from_path",
"recursive_bases_from_class",
@ -263,6 +293,8 @@ __all__ = [
"get_workdir_data",
"get_workdir",
"get_workdir_with_workdir_data",
"get_last_workfile_with_version",
"get_last_workfile",
"create_workfile_doc",
"save_workfile_data_to_doc",

View file

@ -28,7 +28,8 @@ from .local_settings import get_openpype_username
from .avalon_context import (
get_workdir_data,
get_workdir_with_workdir_data,
get_workfile_template_key
get_workfile_template_key,
get_last_workfile
)
from .python_module_tools import (
@ -1318,6 +1319,41 @@ def _merge_env(env, current_env):
return result
def _add_python_version_paths(app, env, logger):
"""Add vendor packages specific for a Python version."""
# Skip adding if host name is not set
if not app.host_name:
return
# Add Python 2/3 modules
openpype_root = os.getenv("OPENPYPE_REPOS_ROOT")
python_vendor_dir = os.path.join(
openpype_root,
"openpype",
"vendor",
"python"
)
if app.use_python_2:
pythonpath = os.path.join(python_vendor_dir, "python_2")
else:
pythonpath = os.path.join(python_vendor_dir, "python_3")
if not os.path.exists(pythonpath):
return
logger.debug("Adding Python version specific paths to PYTHONPATH")
python_paths = [pythonpath]
# Load PYTHONPATH from current launch context
python_path = env.get("PYTHONPATH")
if python_path:
python_paths.append(python_path)
# Set new PYTHONPATH to launch context environments
env["PYTHONPATH"] = os.pathsep.join(python_paths)
def prepare_app_environments(data, env_group=None, implementation_envs=True):
"""Modify launch environments based on launched app and context.
@ -1330,6 +1366,8 @@ def prepare_app_environments(data, env_group=None, implementation_envs=True):
app = data["app"]
log = data["log"]
_add_python_version_paths(app, data["env"], log)
# `added_env_keys` has debug purpose
added_env_keys = {app.group.name, app.name}
# Environments for application
@ -1544,6 +1582,7 @@ def _prepare_last_workfile(data, workdir):
workdir (str): Path to folder where workfiles should be stored.
"""
import avalon.api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
log = data["log"]
@ -1592,7 +1631,7 @@ def _prepare_last_workfile(data, workdir):
# Last workfile path
last_workfile_path = data.get("last_workfile_path") or ""
if not last_workfile_path:
extensions = avalon.api.HOST_WORKFILE_EXTENSIONS.get(app.host_name)
extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name)
if extensions:
anatomy = data["anatomy"]
project_settings = data["project_settings"]
@ -1609,7 +1648,7 @@ def _prepare_last_workfile(data, workdir):
"ext": extensions[0]
})
last_workfile_path = avalon.api.last_workfile(
last_workfile_path = get_last_workfile(
workdir, file_template, workdir_data, extensions, True
)

View file

@ -9,6 +9,8 @@ import collections
import functools
import getpass
from bson.objectid import ObjectId
from openpype.settings import (
get_project_settings,
get_system_settings
@ -16,6 +18,7 @@ from openpype.settings import (
from .anatomy import Anatomy
from .profiles_filtering import filter_profiles
from .events import emit_event
from .path_templates import StringTemplate
# avalon module is not imported at the top
# - may not be in path at the time of pype.lib initialization
@ -168,7 +171,7 @@ def any_outdated():
representation_doc = avalon.io.find_one(
{
"_id": avalon.io.ObjectId(representation),
"_id": ObjectId(representation),
"type": "representation"
},
projection={"parent": True}
@ -1702,7 +1705,7 @@ def _get_task_context_data_for_anatomy(
"task": {
"name": task_name,
"type": task_type,
"short_name": project_task_type_data["short_name"]
"short": project_task_type_data["short_name"]
}
}
@ -1735,8 +1738,6 @@ def get_custom_workfile_template_by_context(
context. (Existence of formatted path is not validated.)
"""
from openpype.lib import filter_profiles
if anatomy is None:
anatomy = Anatomy(project_doc["name"])
@ -1759,7 +1760,9 @@ def get_custom_workfile_template_by_context(
# there are some anatomy template strings
if matching_item:
template = matching_item["path"][platform.system().lower()]
return template.format(**anatomy_context_data)
return StringTemplate.format_strict_template(
template, anatomy_context_data
)
return None
@ -1847,3 +1850,124 @@ def get_custom_workfile_template(template_profiles):
io.Session["AVALON_TASK"],
io
)
def get_last_workfile_with_version(
workdir, file_template, fill_data, extensions
):
"""Return last workfile version.
Args:
workdir(str): Path to dir where workfiles are stored.
file_template(str): Template of file name.
fill_data(dict): Data for filling template.
extensions(list, tuple): All allowed file extensions of workfile.
Returns:
tuple: Last workfile<str> with version<int> if there is any, otherwise
returns (None, None).
"""
if not os.path.exists(workdir):
return None, None
# Fast match on extension
filenames = [
filename
for filename in os.listdir(workdir)
if os.path.splitext(filename)[1] in extensions
]
# Build template without optionals, version to digits only regex
# and comment to any definable value.
_ext = []
for ext in extensions:
if not ext.startswith("."):
ext = "." + ext
# Escape dot for regex
ext = "\\" + ext
_ext.append(ext)
ext_expression = "(?:" + "|".join(_ext) + ")"
# Replace `.{ext}`/`{ext}` with the extensions regex so no stray dot remains
file_template = re.sub(r"\.?{ext}", ext_expression, file_template)
# Replace optional keys with optional content regex
file_template = re.sub(r"<.*?>", r".*?", file_template)
# Replace `{version}` with group regex
file_template = re.sub(r"{version.*?}", r"([0-9]+)", file_template)
file_template = re.sub(r"{comment.*?}", r".+?", file_template)
file_template = StringTemplate.format_strict_template(
file_template, fill_data
)
# Match with ignore case on Windows because its filesystems are
# not case-sensitive; otherwise an existing file with different
# upper/lower-casing would fail to match.
kwargs = {}
if platform.system().lower() == "windows":
kwargs["flags"] = re.IGNORECASE
# Get highest version among existing matching files
version = None
output_filenames = []
for filename in sorted(filenames):
match = re.match(file_template, filename, **kwargs)
if not match:
continue
file_version = int(match.group(1))
if version is None or file_version > version:
output_filenames[:] = []
version = file_version
if file_version == version:
output_filenames.append(filename)
output_filename = None
if output_filenames:
if len(output_filenames) == 1:
output_filename = output_filenames[0]
else:
last_time = None
for _output_filename in output_filenames:
full_path = os.path.join(workdir, _output_filename)
mod_time = os.path.getmtime(full_path)
if last_time is None or last_time < mod_time:
output_filename = _output_filename
last_time = mod_time
return output_filename, version
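The core of `get_last_workfile_with_version` above is turning a filename template into a regex in which `{version}` becomes a capture group. A reduced sketch of that transformation, using plain `str.format` in place of OpenPype's `StringTemplate`; the template and fill data are only examples:

```python
import re

def template_to_regex(file_template, fill_data, extensions):
    """Turn a workfile name template into a version-capturing regex."""
    exts = []
    for ext in extensions:
        if not ext.startswith("."):
            ext = "." + ext
        exts.append("\\" + ext)          # escape the dot for regex
    ext_expression = "(?:" + "|".join(exts) + ")"
    # `.{ext}`/`{ext}` -> extension alternation, no stray dot
    pattern = re.sub(r"\.?\{ext\}", ext_expression, file_template)
    # Optional `<...>` sections and `{comment}` match loosely
    pattern = re.sub(r"<.*?>", r".*?", pattern)
    pattern = re.sub(r"\{version.*?\}", r"([0-9]+)", pattern)
    pattern = re.sub(r"\{comment.*?\}", r".+?", pattern)
    # Fill the remaining known keys (strict formatting in the real code)
    return pattern.format(**fill_data)

pattern = template_to_regex(
    "{asset}_{task}_v{version}.{ext}",
    {"asset": "sh010", "task": "comp"},
    [".nk"],
)
match = re.match(pattern, "sh010_comp_v012.nk")
print(int(match.group(1)))  # → 12
```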
def get_last_workfile(
workdir, file_template, fill_data, extensions, full_path=False
):
"""Return last workfile filename.
Returns file with version 1 if there is no workfile yet.
Args:
workdir(str): Path to dir where workfiles are stored.
file_template(str): Template of file name.
fill_data(dict): Data for filling template.
extensions(list, tuple): All allowed file extensions of workfile.
full_path(bool): Full path to file is returned if set to True.
Returns:
str: Last or first workfile as filename or full path to filename.
"""
filename, version = get_last_workfile_with_version(
workdir, file_template, fill_data, extensions
)
if filename is None:
data = copy.deepcopy(fill_data)
data["version"] = 1
data.pop("comment", None)
if not data.get("ext"):
data["ext"] = extensions[0]
filename = StringTemplate.format_strict_template(file_template, data)
if full_path:
return os.path.normpath(os.path.join(workdir, filename))
return filename
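`get_last_workfile` above falls back to a version-1 filename when no existing workfile matches. That fallback can be sketched independently of the regex matching, again with plain `str.format` standing in for `StringTemplate`:

```python
import copy

def first_workfile_name(file_template, fill_data, extensions):
    """Build the version-1 filename used when no workfile exists yet."""
    data = copy.deepcopy(fill_data)
    data["version"] = 1
    data.pop("comment", None)          # drop the optional comment key
    if not data.get("ext"):
        data["ext"] = extensions[0]    # default to first allowed extension
    return file_template.format(**data)

print(first_workfile_name(
    "{asset}_v{version:0>3}{ext}",
    {"asset": "sh010", "comment": "wip"},
    [".nk", ".nknc"],
))  # → sh010_v001.nk
```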

View file

@ -5,19 +5,30 @@ import glob
import clique
import collections
from .path_templates import (
StringTemplate,
TemplateUnsolved,
)
def collect_frames(files):
"""
Returns dict of source path and its frame, if from sequence
Uses clique as most precise solution
Uses clique as the most precise solution; used when the anatomy template
that created the files is not known.
Assumes frames are separated by '.'; negative frames are not allowed.
Args:
files(list): list of source paths
files(list or set): source paths
Returns:
(dict): {'/asset/subset_v001.0001.png': '0001', ....}
"""
collections, remainder = clique.assemble(files, minimum_items=1)
patterns = [clique.PATTERNS["frames"]]
collections, remainder = clique.assemble(files, minimum_items=1,
patterns=patterns)
sources_and_frames = {}
if collections:
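The `collect_frames` change above restricts clique to its `frames` pattern, so only dot-separated numeric tokens are treated as frame numbers. A dependency-free approximation of the mapping it produces, assuming (per the docstring) the frame is the dot-separated token right before the extension; real clique behavior around single files and padding may differ:

```python
import os
import re

# head "." digits, e.g. "/asset/subset_v001.0001" -> frame "0001"
FRAME_RE = re.compile(r"^(?P<head>.+)\.(?P<frame>\d+)$")

def collect_frames(files):
    """Map each source path to its frame string, or None if no frame."""
    sources_and_frames = {}
    for path in files:
        stem, _ext = os.path.splitext(path)
        match = FRAME_RE.match(stem)
        sources_and_frames[path] = match.group("frame") if match else None
    return sources_and_frames

print(collect_frames([
    "/asset/subset_v001.0001.png",
    "/asset/subset_v001.mov",
]))
# → {'/asset/subset_v001.0001.png': '0001', '/asset/subset_v001.mov': None}
```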
@ -46,8 +57,6 @@ def sizeof_fmt(num, suffix='B'):
def path_from_representation(representation, anatomy):
from avalon import pipeline # safer importing
try:
template = representation["data"]["template"]
@ -57,12 +66,10 @@ def path_from_representation(representation, anatomy):
try:
context = representation["context"]
context["root"] = anatomy.roots
path = pipeline.format_template_with_optional_keys(
context, template
)
path = os.path.normpath(path.replace("/", "\\"))
path = StringTemplate.format_strict_template(template, context)
return os.path.normpath(path)
except KeyError:
except TemplateUnsolved:
# Template references unavailable data
return None

View file

@ -98,6 +98,10 @@ class PypeStreamHandler(logging.StreamHandler):
self.flush()
except (KeyboardInterrupt, SystemExit):
raise
except OSError:
self.handleError(record)
except Exception:
print(repr(record))
self.handleError(record)
@@ -227,7 +231,7 @@ class PypeLogger:
         logger = logging.getLogger(name or "__main__")

-        if cls.pype_debug > 1:
+        if cls.pype_debug > 0:
             logger.setLevel(logging.DEBUG)
         else:
             logger.setLevel(logging.INFO)
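The threshold change above means any non-zero debug value now enables DEBUG output (previously only values above one did). A small sketch of the selection logic, with `get_logger` as an illustrative name:

```python
import logging

def get_logger(name, pype_debug):
    # Any pype_debug value greater than zero enables DEBUG-level output.
    logger = logging.getLogger(name or "__main__")
    logger.setLevel(logging.DEBUG if pype_debug > 0 else logging.INFO)
    return logger

print(get_logger("demo", 1).level == logging.DEBUG)  # -> True
print(get_logger("demo", 0).level == logging.INFO)   # -> True
```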


@@ -1,413 +0,0 @@
[deleted file: a 413-line ASCII-art animation that draws the "PYPE Club" logo frame by frame, ending on the finished logo captioned ". PYPE Club ."]

Some files were not shown because too many files have changed in this diff.