Merge branch 'develop' of github.com:pypeclub/OpenPype into feature/OP-2766_PS-to-new-publisher

Petr Kalis 2022-03-23 12:29:36 +01:00
commit 268082ef7b
125 changed files with 1717 additions and 896 deletions

@@ -1,23 +1,47 @@
# Changelog
## [3.9.1-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.9.2-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.0...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.1...HEAD)
**🚀 Enhancements**
- CI: change the version bump logic [\#2919](https://github.com/pypeclub/OpenPype/pull/2919)
- Deadline: Add headless argument [\#2916](https://github.com/pypeclub/OpenPype/pull/2916)
- Ftrack: Fill workfile in custom attribute [\#2906](https://github.com/pypeclub/OpenPype/pull/2906)
- Settings UI: Add simple tooltips for settings entities [\#2901](https://github.com/pypeclub/OpenPype/pull/2901)
**🐛 Bug fixes**
- Ftrack: Missing Ftrack id after editorial publish [\#2905](https://github.com/pypeclub/OpenPype/pull/2905)
- AfterEffects: Fix rendering for single frame in DL [\#2875](https://github.com/pypeclub/OpenPype/pull/2875)
**🔀 Refactored code**
- General: Move formatting and workfile functions [\#2914](https://github.com/pypeclub/OpenPype/pull/2914)
## [3.9.1](https://github.com/pypeclub/OpenPype/tree/3.9.1) (2022-03-18)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.1-nightly.3...3.9.1)
**🚀 Enhancements**
- General: Change how OPENPYPE\_DEBUG value is handled [\#2907](https://github.com/pypeclub/OpenPype/pull/2907)
- nuke: imageio adding ocio config version 1.2 [\#2897](https://github.com/pypeclub/OpenPype/pull/2897)
- Flame: support for comment with xml attribute overrides [\#2892](https://github.com/pypeclub/OpenPype/pull/2892)
- Nuke: ExtractReviewSlate can handle more codes and profiles [\#2879](https://github.com/pypeclub/OpenPype/pull/2879)
- Flame: sequence used for reference video [\#2869](https://github.com/pypeclub/OpenPype/pull/2869)
**🐛 Bug fixes**
- General: Fix use of Anatomy roots [\#2904](https://github.com/pypeclub/OpenPype/pull/2904)
- Fixing gap detection in extract review [\#2902](https://github.com/pypeclub/OpenPype/pull/2902)
- Pyblish Pype - ensure current state is correct when entering new group order [\#2899](https://github.com/pypeclub/OpenPype/pull/2899)
- SceneInventory: Fix import of load function [\#2894](https://github.com/pypeclub/OpenPype/pull/2894)
- Harmony - fixed creator issue [\#2891](https://github.com/pypeclub/OpenPype/pull/2891)
- General: Remove forgotten use of avalon Creator [\#2885](https://github.com/pypeclub/OpenPype/pull/2885)
- General: Avoid circular import [\#2884](https://github.com/pypeclub/OpenPype/pull/2884)
- Fixes for attaching loaded containers \(\#2837\) [\#2874](https://github.com/pypeclub/OpenPype/pull/2874)
- Maya: Deformer node ids validation plugin [\#2826](https://github.com/pypeclub/OpenPype/pull/2826)
**🔀 Refactored code**
@@ -32,6 +56,10 @@
- AssetCreator: Remove the tool [\#2845](https://github.com/pypeclub/OpenPype/pull/2845)
### 📖 Documentation
- Documentation: Change Photoshop & AfterEffects plugin path [\#2878](https://github.com/pypeclub/OpenPype/pull/2878)
**🚀 Enhancements**
- General: Subset name filtering in ExtractReview outputs [\#2872](https://github.com/pypeclub/OpenPype/pull/2872)
@@ -66,6 +94,7 @@
- Maya: Stop creation of reviews for Cryptomattes [\#2832](https://github.com/pypeclub/OpenPype/pull/2832)
- Deadline: Remove recreated event [\#2828](https://github.com/pypeclub/OpenPype/pull/2828)
- Deadline: Added missing events folder [\#2827](https://github.com/pypeclub/OpenPype/pull/2827)
- Maya: Deformer node ids validation plugin [\#2826](https://github.com/pypeclub/OpenPype/pull/2826)
- Settings: Missing document with OP versions may break start of OpenPype [\#2825](https://github.com/pypeclub/OpenPype/pull/2825)
- Deadline: more detailed temp file name for environment json [\#2824](https://github.com/pypeclub/OpenPype/pull/2824)
- General: Host name was formed from obsolete code [\#2821](https://github.com/pypeclub/OpenPype/pull/2821)
@@ -83,7 +112,6 @@
- General: Move change context functions [\#2839](https://github.com/pypeclub/OpenPype/pull/2839)
- Tools: Don't use avalon tools code [\#2829](https://github.com/pypeclub/OpenPype/pull/2829)
- Move Unreal Implementation to OpenPype [\#2823](https://github.com/pypeclub/OpenPype/pull/2823)
- General: Extract template formatting from anatomy [\#2766](https://github.com/pypeclub/OpenPype/pull/2766)
## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)

@@ -78,6 +78,7 @@ def install():
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
register_inventory_action,
)
from avalon import pipeline
@@ -124,7 +125,7 @@ def install():
pyblish.register_plugin_path(path)
register_loader_plugin_path(path)
avalon.register_plugin_path(LegacyCreator, path)
avalon.register_plugin_path(avalon.InventoryAction, path)
register_inventory_action(path)
# apply monkey patched discover to original one
log.info("Patching discovery")

@@ -101,7 +101,7 @@ def eventserver(debug,
on linux and window service).
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = "3"
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands().launch_eventservercli(
ftrack_url,
@@ -128,7 +128,7 @@ def webpublisherwebserver(debug, executable, upload_dir, host=None, port=None):
Expect "pype.club" user created on Ftrack.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = "3"
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands().launch_webpublisher_webservercli(
upload_dir=upload_dir,
@@ -176,7 +176,7 @@ def publish(debug, paths, targets, gui):
More than one path is allowed.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands.publish(list(paths), targets, gui)
@@ -195,7 +195,7 @@ def remotepublishfromapp(debug, project, path, host, user=None, targets=None):
More than one path is allowed.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands.remotepublishfromapp(
project, path, host, user, targets=targets
)
@@ -215,7 +215,7 @@ def remotepublish(debug, project, path, user=None, targets=None):
More than one path is allowed.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands.remotepublish(project, path, user, targets=targets)
@@ -240,7 +240,7 @@ def texturecopy(debug, project, asset, path):
Nothing is written to database.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands().texture_copy(project, asset, path)
@@ -409,7 +409,7 @@ def syncserver(debug, active_site):
var OPENPYPE_LOCAL_ID set to 'active_site'.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
os.environ["OPENPYPE_DEBUG"] = "1"
PypeCommands().syncserver(active_site)
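Every CLI command in this file repeats the same two-line pattern; the commit only normalizes it (double quotes, value `"1"` instead of `"3"`). Factored into a helper it would look like this (the helper name is hypothetical, not part of the commit):

```python
import os

def apply_debug_flag(debug):
    # Hypothetical helper; the commit itself keeps the inline pattern,
    # just standardized to os.environ["OPENPYPE_DEBUG"] = "1".
    if debug:
        os.environ["OPENPYPE_DEBUG"] = "1"

apply_debug_flag(True)
print(os.environ["OPENPYPE_DEBUG"])  # 1
```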

@@ -2,10 +2,11 @@ import os
import sys
from Qt import QtWidgets
from bson.objectid import ObjectId
import pyblish.api
import avalon.api
from avalon import io, pipeline
from avalon import io
from openpype import lib
from openpype.api import Logger
@@ -13,6 +14,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.aftereffects
from openpype.lib import register_event_callback
@@ -29,7 +31,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
def check_inventory():
@@ -42,7 +43,7 @@ def check_inventory():
representation = container['representation']
representation_doc = io.find_one(
{
"_id": io.ObjectId(representation),
"_id": ObjectId(representation),
"type": "representation"
},
projection={"parent": True}
@@ -149,7 +150,7 @@ def containerise(name,
"""
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": name,
"namespace": namespace,
"loader": str(loader),

@@ -1,8 +1,8 @@
"""Host API required Work Files tool"""
import os
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .launch_logic import get_stub
from avalon import api
def _active_document():
@@ -14,7 +14,7 @@ def _active_document():
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["aftereffects"]
return HOST_WORKFILE_EXTENSIONS["aftereffects"]
def has_unsaved_changes():
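Several hosts in this commit make the same move: `file_extensions()` now reads `HOST_WORKFILE_EXTENSIONS` from `openpype.pipeline` instead of `avalon.api`. A sketch of the mapping being consulted (the extension values here are illustrative examples, not the authoritative list):

```python
# Illustrative stand-in for openpype.pipeline.HOST_WORKFILE_EXTENSIONS;
# the keys are host names, the values lists of workfile extensions.
# The concrete extensions below are assumptions for demonstration.
HOST_WORKFILE_EXTENSIONS = {
    "aftereffects": [".aep"],
    "maya": [".ma", ".mb"],
    "nuke": [".nk"],
}

def file_extensions(host):
    """Return the workfile extensions registered for a host."""
    return HOST_WORKFILE_EXTENSIONS[host]

print(file_extensions("aftereffects"))
```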

@@ -328,7 +328,6 @@ class LaunchWorkFiles(LaunchQtApp):
result = super().execute(context)
self._window.set_context({
"asset": avalon.api.Session["AVALON_ASSET"],
"silo": avalon.api.Session["AVALON_SILO"],
"task": avalon.api.Session["AVALON_TASK"]
})
return result

@@ -12,12 +12,12 @@ from . import ops
import pyblish.api
import avalon.api
from avalon import io, schema
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from openpype.api import Logger
from openpype.lib import (
@ -31,7 +31,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
ORIGINAL_EXCEPTHOOK = sys.excepthook

@@ -4,7 +4,8 @@ from pathlib import Path
from typing import List, Optional
import bpy
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
class OpenFileCacher:
@@ -77,7 +78,7 @@ def has_unsaved_changes() -> bool:
def file_extensions() -> List[str]:
"""Return the supported file extensions for Blender scene files."""
return api.HOST_WORKFILE_EXTENSIONS["blender"]
return HOST_WORKFILE_EXTENSIONS["blender"]
def work_root(session: dict) -> str:

@@ -6,11 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)
from openpype.hosts.blender.api import plugin, lib

@@ -6,12 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)

@@ -7,12 +7,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)
logger = logging.getLogger("openpype").getChild(

@@ -6,12 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin, lib
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)

@@ -6,12 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin, lib
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)

@@ -10,12 +10,12 @@ from openpype import lib
from openpype.pipeline import (
legacy_create,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)

@@ -13,12 +13,12 @@ from openpype.pipeline import (
load_container,
get_representation_path,
loaders_from_representation,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api.pipeline import (
AVALON_INSTANCES,
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)
from openpype.hosts.blender.api import plugin

@@ -6,12 +6,14 @@ from typing import Dict, List, Optional
import bpy
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)

@@ -10,6 +10,7 @@ from openpype import lib
from openpype.pipeline import (
legacy_create,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.blender.api import (
plugin,
@@ -18,7 +19,6 @@ from openpype.hosts.blender.api import (
from openpype.hosts.blender.api.pipeline import (
AVALON_CONTAINERS,
AVALON_PROPERTY,
AVALON_CONTAINER_ID
)

@@ -1,6 +1,8 @@
import os
import json
from bson.objectid import ObjectId
import bpy
import bpy_extras
import bpy_extras.anim_utils
@@ -140,7 +142,7 @@ class ExtractLayout(openpype.api.Extractor):
blend = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(parent),
"parent": ObjectId(parent),
"name": "blend"
},
projection={"_id": True})
@@ -151,7 +153,7 @@ class ExtractLayout(openpype.api.Extractor):
fbx = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(parent),
"parent": ObjectId(parent),
"name": "fbx"
},
projection={"_id": True})
@@ -162,7 +164,7 @@ class ExtractLayout(openpype.api.Extractor):
abc = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(parent),
"parent": ObjectId(parent),
"name": "abc"
},
projection={"_id": True})

@@ -4,13 +4,14 @@ Basic avalon integration
import os
import contextlib
from avalon import api as avalon
from avalon.pipeline import AVALON_CONTAINER_ID
from pyblish import api as pyblish
from openpype.api import Logger
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from .lib import (
set_segment_data_marker,
@@ -26,7 +27,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
AVALON_CONTAINERS = "AVALON_CONTAINERS"
@@ -34,12 +34,10 @@ log = Logger.get_logger(__name__)
def install():
pyblish.register_host("flame")
pyblish.register_plugin_path(PUBLISH_PATH)
register_loader_plugin_path(LOAD_PATH)
avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
log.info("OpenPype Flame plug-ins registred ...")
# register callback for switching publishable
@@ -47,6 +45,7 @@ def install():
log.info("OpenPype Flame host installed ...")
def uninstall():
pyblish.deregister_host("flame")
@@ -54,7 +53,6 @@ def uninstall():
pyblish.deregister_plugin_path(PUBLISH_PATH)
deregister_loader_plugin_path(LOAD_PATH)
avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
# register callback for switching publishable
pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)

@@ -3,6 +3,7 @@ import sys
import re
import contextlib
from bson.objectid import ObjectId
from Qt import QtGui
from avalon import io
@@ -92,7 +93,7 @@ def switch_item(container,
# Collect any of current asset, subset and representation if not provided
# so we can use the original name from those.
if any(not x for x in [asset_name, subset_name, representation_name]):
_id = io.ObjectId(container["representation"])
_id = ObjectId(container["representation"])
representation = io.find_one({"type": "representation", "_id": _id})
version, subset, asset, project = io.parenthood(representation)

@@ -8,13 +8,15 @@ import contextlib
import pyblish.api
import avalon.api
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.api import Logger
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
register_inventory_action_path,
deregister_inventory_action_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.fusion
@@ -69,7 +71,7 @@ def install():
register_loader_plugin_path(LOAD_PATH)
avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
register_inventory_action_path(INVENTORY_PATH)
pyblish.api.register_callback(
"instanceToggled", on_pyblish_instance_toggled
@@ -93,9 +95,7 @@ def uninstall():
deregister_loader_plugin_path(LOAD_PATH)
avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.deregister_plugin_path(
avalon.api.InventoryAction, INVENTORY_PATH
)
deregister_inventory_action_path(INVENTORY_PATH)
pyblish.api.deregister_callback(
"instanceToggled", on_pyblish_instance_toggled

@@ -1,12 +1,14 @@
"""Host API required Work Files tool"""
import sys
import os
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .pipeline import get_current_comp
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["fusion"]
return HOST_WORKFILE_EXTENSIONS["fusion"]
def has_unsaved_changes():

@@ -1,7 +1,7 @@
from avalon import api
from openpype.pipeline import InventoryAction
class FusionSelectContainers(api.InventoryAction):
class FusionSelectContainers(InventoryAction):
label = "Select Containers"
icon = "mouse-pointer"
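The diff above retargets the base class: inventory actions now subclass `openpype.pipeline.InventoryAction` rather than `avalon.api.InventoryAction`. A self-contained sketch of the pattern, with a stand-in base class so the snippet runs without OpenPype installed:

```python
# Stand-in for openpype.pipeline.InventoryAction (assumed shape, for
# illustration only): a label, an icon, and a process() hook over the
# selected containers.
class InventoryAction:
    label = None
    icon = None

    def process(self, containers):
        raise NotImplementedError

class FusionSelectContainers(InventoryAction):
    label = "Select Containers"
    icon = "mouse-pointer"

    def process(self, containers):
        # Collect the tool names of the selected containers.
        return [c["objectName"] for c in containers]

print(FusionSelectContainers().process([{"objectName": "Loader1"}]))
```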

@@ -1,6 +1,6 @@
from avalon import api
from Qt import QtGui, QtWidgets
from openpype.pipeline import InventoryAction
from openpype import style
from openpype.hosts.fusion.api import (
get_current_comp,
@@ -8,7 +8,7 @@ from openpype.hosts.fusion.api import (
)
class FusionSetToolColor(api.InventoryAction):
class FusionSetToolColor(InventoryAction):
"""Update the color of the selected tools"""
label = "Set Tool Color"

@@ -2,11 +2,11 @@ import os
from pathlib import Path
import logging
from bson.objectid import ObjectId
import pyblish.api
from avalon import io
import avalon.api
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype import lib
from openpype.lib import register_event_callback
@@ -14,6 +14,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.harmony
import openpype.hosts.harmony.api as harmony
@@ -113,7 +114,7 @@ def check_inventory():
representation = container['representation']
representation_doc = io.find_one(
{
"_id": io.ObjectId(representation),
"_id": ObjectId(representation),
"type": "representation"
},
projection={"parent": True}

@@ -2,20 +2,21 @@
import os
import shutil
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .lib import (
ProcessContext,
get_local_harmony_path,
zip_and_move,
launch_zip_file
)
from avalon import api
# used to lock saving until previous save is done.
save_disabled = False
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["harmony"]
return HOST_WORKFILE_EXTENSIONS["harmony"]
def has_unsaved_changes():

@@ -1,12 +1,12 @@
import os
import hiero.core.events
from openpype.api import Logger
from openpype.lib import register_event_callback
from .lib import (
sync_avalon_data_to_workfile,
launch_workfiles_app,
selection_changed_timeline,
before_project_save,
register_event_callback
)
from .tags import add_tags_to_workfile
from .menu import update_menu_task_label

@@ -8,7 +8,10 @@ import platform
import ast
import shutil
import hiero
from Qt import QtWidgets
from bson.objectid import ObjectId
import avalon.api as avalon
import avalon.io
from openpype.api import (Logger, Anatomy, get_anatomy_settings)
@@ -1006,7 +1009,7 @@ def check_inventory_versions():
# get representation from io
representation = io.find_one({
"type": "representation",
"_id": io.ObjectId(container["representation"])
"_id": ObjectId(container["representation"])
})
# Get start frame from version data

@@ -4,7 +4,7 @@ Basic avalon integration
import os
import contextlib
from collections import OrderedDict
from avalon.pipeline import AVALON_CONTAINER_ID
from avalon import api as avalon
from avalon import schema
from pyblish import api as pyblish
@@ -13,6 +13,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from openpype.tools.utils import host_tools
from . import lib, menu, events
@@ -28,7 +29,6 @@ PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish").replace("\\", "/")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load").replace("\\", "/")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create").replace("\\", "/")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory").replace("\\", "/")
AVALON_CONTAINERS = ":AVALON_CONTAINERS"
@@ -51,7 +51,6 @@ def install():
pyblish.register_plugin_path(PUBLISH_PATH)
register_loader_plugin_path(LOAD_PATH)
avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
# register callback for switching publishable
pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)

@@ -1,14 +1,14 @@
import os
import hiero
from avalon import api
from openpype.api import Logger
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["hiero"]
return HOST_WORKFILE_EXTENSIONS["hiero"]
def has_unsaved_changes():

@@ -8,12 +8,12 @@ import hdefereval
import pyblish.api
import avalon.api
from avalon.pipeline import AVALON_CONTAINER_ID
from avalon.lib import find_submodule
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.houdini
from openpype.hosts.houdini.api import lib

@@ -2,11 +2,11 @@
import os
import hou
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["houdini"]
return HOST_WORKFILE_EXTENSIONS["houdini"]
def has_unsaved_changes():

@@ -3,6 +3,7 @@ import os
from openpype.pipeline import (
load,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.houdini.api import lib, pipeline
@@ -73,7 +74,7 @@ class ImageLoader(load.LoaderPlugin):
# Imprint it manually
data = {
"schema": "avalon-core:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": node_name,
"namespace": namespace,
"loader": str(self.__class__.__name__),

@@ -1,8 +1,9 @@
from openpype.pipeline import (
load,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.houdini.api import lib, pipeline
from openpype.hosts.houdini.api import lib
class USDSublayerLoader(load.LoaderPlugin):
@@ -43,7 +44,7 @@ class USDSublayerLoader(load.LoaderPlugin):
# Imprint it manually
data = {
"schema": "avalon-core:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": node_name,
"namespace": namespace,
"loader": str(self.__class__.__name__),

@@ -1,8 +1,9 @@
from openpype.pipeline import (
load,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.houdini.api import lib, pipeline
from openpype.hosts.houdini.api import lib
class USDReferenceLoader(load.LoaderPlugin):
@@ -43,7 +44,7 @@ class USDReferenceLoader(load.LoaderPlugin):
# Imprint it manually
data = {
"schema": "avalon-core:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": node_name,
"namespace": namespace,
"loader": str(self.__class__.__name__),

@@ -145,7 +145,6 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase):
path = self._template.format(**{
"root": root,
"project": PROJECT,
"silo": asset_doc["silo"],
"asset": asset_doc["name"],
"subset": subset,
"representation": ext,
@@ -165,4 +164,3 @@ output_processor = AvalonURIOutputProcessor()
def usdOutputProcessor():
return output_processor

@@ -10,7 +10,6 @@ import pyblish.api
import avalon.api
from avalon.lib import find_submodule
from avalon.pipeline import AVALON_CONTAINER_ID
import openpype.hosts.maya
from openpype.tools.utils import host_tools
@@ -23,7 +22,10 @@ from openpype.lib.path_tools import HostDirmap
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
register_inventory_action_path,
deregister_loader_plugin_path,
deregister_inventory_action_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.maya.lib import copy_workspace_mel
from . import menu, lib
@@ -59,7 +61,7 @@ def install():
register_loader_plugin_path(LOAD_PATH)
avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
register_inventory_action_path(INVENTORY_PATH)
log.info(PUBLISH_PATH)
log.info("Installing callbacks ... ")
@@ -188,9 +190,7 @@ def uninstall():
deregister_loader_plugin_path(LOAD_PATH)
avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.deregister_plugin_path(
avalon.api.InventoryAction, INVENTORY_PATH
)
deregister_inventory_action_path(INVENTORY_PATH)
menu.uninstall()

@@ -4,11 +4,11 @@ from maya import cmds
import qargparse
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.pipeline import (
LegacyCreator,
LoaderPlugin,
get_representation_path,
AVALON_CONTAINER_ID,
)
from .pipeline import containerise

@@ -6,6 +6,8 @@ import contextlib
import copy
import six
from bson.objectid import ObjectId
from maya import cmds
from avalon import io
@@ -282,7 +284,7 @@ def update_package_version(container, version):
# Versioning (from `core.maya.pipeline`)
current_representation = io.find_one({
"_id": io.ObjectId(container["representation"])
"_id": ObjectId(container["representation"])
})
assert current_representation is not None, "This is a bug"
@@ -327,7 +329,7 @@ def update_package(set_container, representation):
# Load the original package data
current_representation = io.find_one({
"_id": io.ObjectId(set_container['representation']),
"_id": ObjectId(set_container['representation']),
"type": "representation"
})
@@ -478,10 +480,10 @@ def update_scene(set_container, containers, current_data, new_data, new_file):
# They *must* use the same asset, subset and Loader for
# `update_container` to make sense.
old = io.find_one({
"_id": io.ObjectId(representation_current)
"_id": ObjectId(representation_current)
})
new = io.find_one({
"_id": io.ObjectId(representation_new)
"_id": ObjectId(representation_new)
})
is_valid = compare_representations(old=old, new=new)
if not is_valid:

@@ -1,11 +1,12 @@
"""Host API required Work Files tool"""
import os
from maya import cmds
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
def file_extensions():
return api.HOST_WORKFILE_EXTENSIONS["maya"]
return HOST_WORKFILE_EXTENSIONS["maya"]
def has_unsaved_changes():

@@ -1,6 +1,8 @@
import json
from avalon import api, io
from avalon import io
from bson.objectid import ObjectId
from openpype.pipeline import (
InventoryAction,
get_representation_context,
get_representation_path_from_context,
)
@@ -10,7 +12,7 @@ from openpype.hosts.maya.api.lib import (
)
class ImportModelRender(api.InventoryAction):
class ImportModelRender(InventoryAction):
label = "Import Model Render Sets"
icon = "industry"
@@ -39,7 +41,7 @@ class ImportModelRender(api.InventoryAction):
nodes.append(n)
repr_doc = io.find_one({
"_id": io.ObjectId(container["representation"]),
"_id": ObjectId(container["representation"]),
})
version_id = repr_doc["parent"]

@@ -1,11 +1,10 @@
from maya import cmds
from avalon import api
from openpype.pipeline import InventoryAction
from openpype.hosts.maya.api.plugin import get_reference_node
class ImportReference(api.InventoryAction):
class ImportReference(InventoryAction):
"""Imports selected reference to inside of the file."""
label = "Import Reference"

@@ -7,6 +7,8 @@ loader will use them instead of native vray vrmesh format.
"""
import os
from bson.objectid import ObjectId
import maya.cmds as cmds
from avalon import io
@@ -186,7 +188,7 @@ class VRayProxyLoader(load.LoaderPlugin):
abc_rep = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(version_id),
"parent": ObjectId(version_id),
"name": "abc"
})

@@ -6,7 +6,7 @@ from maya import cmds
import openpype.api
from openpype.hosts.maya.api.lib import maintained_selection
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.pipeline import AVALON_CONTAINER_ID
class ExtractMayaSceneRaw(openpype.api.Extractor):

@@ -1,6 +1,7 @@
import logging
import contextlib
import nuke
from bson.objectid import ObjectId
from avalon import api, io
@@ -70,10 +71,10 @@ def get_handles(asset):
if "visualParent" in data:
vp = data["visualParent"]
if vp is not None:
parent_asset = io.find_one({"_id": io.ObjectId(vp)})
parent_asset = io.find_one({"_id": ObjectId(vp)})
if parent_asset is None:
parent_asset = io.find_one({"_id": io.ObjectId(asset["parent"])})
parent_asset = io.find_one({"_id": ObjectId(asset["parent"])})
if parent_asset is not None:
return get_handles(parent_asset)
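The fragment above shows `get_handles` recursing up the asset hierarchy via `visualParent` (falling back to the plain parent) when an asset carries no handle data of its own. A simplified stand-in for that walk, with plain dicts replacing the MongoDB documents `io.find_one` would return; the handle-field names are assumptions for illustration:

```python
# Simplified sketch of the recursive parent lookup in get_handles.
# ASSETS stands in for the asset collection; "handleStart"/"handleEnd"
# are assumed field names, not confirmed by the diff.
ASSETS = {
    "shot010": {"data": {"visualParent": "seq01"}, "parent": None},
    "seq01": {"data": {"handleStart": 10, "handleEnd": 10}, "parent": None},
}

def get_handles(asset):
    data = asset["data"]
    if "handleStart" in data and "handleEnd" in data:
        return data["handleStart"], data["handleEnd"]
    # No handles here: retry on the visual parent, else the plain parent.
    parent_id = data.get("visualParent") or asset.get("parent")
    if parent_id is not None:
        return get_handles(ASSETS[parent_id])
    return None

print(get_handles(ASSETS["shot010"]))  # (10, 10)
```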

@@ -6,10 +6,11 @@ import contextlib
from collections import OrderedDict
import clique
from bson.objectid import ObjectId
import nuke
from avalon import api, io, lib
from avalon import api, io
from openpype.api import (
Logger,
@@ -20,7 +21,6 @@ from openpype.api import (
get_workdir_data,
get_asset,
get_current_project_settings,
ApplicationManager
)
from openpype.tools.utils import host_tools
from openpype.lib.path_tools import HostDirmap
@@ -570,7 +570,7 @@ def check_inventory_versions():
# get representation from io
representation = io.find_one({
"type": "representation",
"_id": io.ObjectId(avalon_knob_data["representation"])
"_id": ObjectId(avalon_knob_data["representation"])
})
# Failsafe for not finding the representation.

@@ -6,7 +6,6 @@ import nuke
import pyblish.api
import avalon.api
from avalon import pipeline
import openpype
from openpype.api import (
@@ -18,7 +17,10 @@ from openpype.lib import register_event_callback
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
register_inventory_action_path,
deregister_loader_plugin_path,
deregister_inventory_action_path,
AVALON_CONTAINER_ID,
)
from openpype.tools.utils import host_tools
@@ -105,7 +107,7 @@ def install():
pyblish.api.register_plugin_path(PUBLISH_PATH)
register_loader_plugin_path(LOAD_PATH)
avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
register_inventory_action_path(INVENTORY_PATH)
# Register Avalon event for workfiles loading.
register_event_callback("workio.open_file", check_inventory_versions)
@@ -131,6 +133,7 @@ def uninstall():
pyblish.api.deregister_plugin_path(PUBLISH_PATH)
deregister_loader_plugin_path(LOAD_PATH)
avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
deregister_inventory_action_path(INVENTORY_PATH)
pyblish.api.deregister_callback(
"instanceToggled", on_pyblish_instance_toggled)
@@ -330,7 +333,7 @@ def containerise(node,
data = OrderedDict(
[
("schema", "openpype:container-2.0"),
("id", pipeline.AVALON_CONTAINER_ID),
("id", AVALON_CONTAINER_ID),
("name", name),
("namespace", namespace),
("loader", str(loader)),
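The hunk above only changes where the `id` constant comes from: `openpype.pipeline.AVALON_CONTAINER_ID` instead of `avalon.pipeline`. A self-contained sketch of the container metadata that `containerise` writes; the constant's value shown here is the one avalon historically used and should be treated as an assumption:

```python
from collections import OrderedDict

# Assumed value of the constant, for illustration; in the codebase it is
# imported from openpype.pipeline after this commit.
AVALON_CONTAINER_ID = "pyblish.avalon.container"

# Example metadata, mirroring the OrderedDict built in containerise();
# the name/namespace/loader values are made up for the demonstration.
data = OrderedDict([
    ("schema", "openpype:container-2.0"),
    ("id", AVALON_CONTAINER_ID),
    ("name", "renderMain"),
    ("namespace", "shot010"),
    ("loader", "LoadClip"),
])
print(data["id"])
```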

@@ -1,11 +1,12 @@
"""Host API required Work Files tool"""
import os
import nuke
import avalon.api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
def file_extensions():
return avalon.api.HOST_WORKFILE_EXTENSIONS["nuke"]
return HOST_WORKFILE_EXTENSIONS["nuke"]
def has_unsaved_changes():

View file

@ -1,9 +1,9 @@
from avalon import api
from openpype.api import Logger
from openpype.pipeline import InventoryAction
from openpype.hosts.nuke.api.lib import set_avalon_knob_data
class RepairOldLoaders(api.InventoryAction):
class RepairOldLoaders(InventoryAction):
label = "Repair Old Loaders"
icon = "gears"

View file

@ -1,8 +1,8 @@
from avalon import api
from openpype.pipeline import InventoryAction
from openpype.hosts.nuke.api.commands import viewer_update_and_undo_stop
class SelectContainers(api.InventoryAction):
class SelectContainers(InventoryAction):
label = "Select Containers"
icon = "mouse-pointer"

View file

@ -101,7 +101,7 @@ class LoadClip(plugin.NukeLoader):
last += self.handle_end
if not is_sequence:
duration = last - first + 1
duration = last - first
first = 1
last = first + duration
elif "#" not in file:
@ -216,7 +216,7 @@ class LoadClip(plugin.NukeLoader):
last += self.handle_end
if not is_sequence:
duration = last - first + 1
duration = last - first
first = 1
last = first + duration
elif "#" not in file:

View file

@ -1,9 +1,10 @@
import os
from Qt import QtWidgets
from bson.objectid import ObjectId
import pyblish.api
import avalon.api
from avalon import pipeline, io
from avalon import io
from openpype.api import Logger
from openpype.lib import register_event_callback
@ -12,6 +13,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.photoshop
@ -37,7 +39,7 @@ def check_inventory():
representation = container['representation']
representation_doc = io.find_one(
{
"_id": io.ObjectId(representation),
"_id": ObjectId(representation),
"type": "representation"
},
projection={"parent": True}
@ -226,7 +228,7 @@ def containerise(
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"name": name,
"namespace": namespace,
"loader": str(loader),

View file

@ -1,8 +1,7 @@
"""Host API required Work Files tool"""
import os
import avalon.api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from . import lib
@ -15,7 +14,7 @@ def _active_document():
def file_extensions():
return avalon.api.HOST_WORKFILE_EXTENSIONS["photoshop"]
return HOST_WORKFILE_EXTENSIONS["photoshop"]
def has_unsaved_changes():

View file

@ -6,13 +6,13 @@ import contextlib
from collections import OrderedDict
from avalon import api as avalon
from avalon import schema
from avalon.pipeline import AVALON_CONTAINER_ID
from pyblish import api as pyblish
from openpype.api import Logger
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from . import lib
from . import PLUGINS_DIR
@ -22,7 +22,6 @@ log = Logger().get_logger(__name__)
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
AVALON_CONTAINERS = ":AVALON_CONTAINERS"
@ -48,7 +47,6 @@ def install():
register_loader_plugin_path(LOAD_PATH)
avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
# register callback for switching publishable
pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)
@ -73,7 +71,6 @@ def uninstall():
deregister_loader_plugin_path(LOAD_PATH)
avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
# register callback for switching publishable
pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)

View file

@ -3,9 +3,10 @@ import re
import pyblish.api
import json
from avalon.api import format_template_with_optional_keys
from openpype.lib import prepare_template_data
from openpype.lib import (
prepare_template_data,
StringTemplate,
)
class CollectTextures(pyblish.api.ContextPlugin):
@ -110,8 +111,9 @@ class CollectTextures(pyblish.api.ContextPlugin):
formatting_data.update(explicit_data)
fill_pairs = prepare_template_data(formatting_data)
workfile_subset = format_template_with_optional_keys(
fill_pairs, self.workfile_subset_template)
workfile_subset = StringTemplate.format_strict_template(
self.workfile_subset_template, fill_pairs
)
asset_build = self._get_asset_build(
repre_file,
@ -201,8 +203,9 @@ class CollectTextures(pyblish.api.ContextPlugin):
formatting_data.update(explicit_data)
fill_pairs = prepare_template_data(formatting_data)
subset = format_template_with_optional_keys(
fill_pairs, self.texture_subset_template)
subset = StringTemplate.format_strict_template(
self.texture_subset_template, fill_pairs
)
asset_build = self._get_asset_build(
repre_file,

View file

@ -10,7 +10,6 @@ import pyblish.api
import avalon.api
from avalon import io
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.hosts import tvpaint
from openpype.api import get_current_project_settings
@ -19,6 +18,7 @@ from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from .lib import (

View file

@ -4,6 +4,7 @@
"""
from avalon import api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .lib import (
execute_george,
execute_george_through_file
@ -47,7 +48,7 @@ def has_unsaved_changes():
def file_extensions():
"""Return the supported file extensions for Blender scene files."""
return api.HOST_WORKFILE_EXTENSIONS["tvpaint"]
return HOST_WORKFILE_EXTENSIONS["tvpaint"]
def work_root(session):

View file

@ -1,10 +1,11 @@
import getpass
import os
from avalon import api, io
from openpype.lib import (
StringTemplate,
get_workfile_template_key_from_context,
get_workdir_data
get_workdir_data,
get_last_workfile_with_version,
)
from openpype.api import Anatomy
from openpype.hosts.tvpaint.api import lib, pipeline, plugin
@ -67,9 +68,8 @@ class LoadWorkfile(plugin.Loader):
data = get_workdir_data(project_doc, asset_doc, task_name, host_name)
data["root"] = anatomy.roots
data["user"] = getpass.getuser()
template = anatomy.templates[template_key]["file"]
file_template = anatomy.templates[template_key]["file"]
# Define saving file extension
if current_file:
@ -81,11 +81,12 @@ class LoadWorkfile(plugin.Loader):
data["ext"] = extension
work_root = api.format_template_with_optional_keys(
data, anatomy.templates[template_key]["folder"]
folder_template = anatomy.templates[template_key]["folder"]
work_root = StringTemplate.format_strict_template(
folder_template, data
)
version = api.last_workfile_with_version(
work_root, template, data, host.file_extensions()
version = get_last_workfile_with_version(
work_root, file_template, data, host.file_extensions()
)[1]
if version is None:
@ -95,8 +96,8 @@ class LoadWorkfile(plugin.Loader):
data["version"] = version
path = os.path.join(
work_root,
api.format_template_with_optional_keys(data, template)
filename = StringTemplate.format_strict_template(
file_template, data
)
path = os.path.join(work_root, filename)
host.save_file(path)

View file

@ -4,13 +4,13 @@ import logging
from typing import List
import pyblish.api
from avalon.pipeline import AVALON_CONTAINER_ID
from avalon import api
from openpype.pipeline import (
LegacyCreator,
register_loader_plugin_path,
deregister_loader_plugin_path,
AVALON_CONTAINER_ID,
)
from openpype.tools.utils import host_tools
import openpype.hosts.unreal

View file

@ -2,8 +2,10 @@
"""Loader for published alembics."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
@ -117,7 +119,7 @@ class PointCacheAlembicLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -2,8 +2,10 @@
"""Load Skeletal Mesh alembics."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -81,7 +83,7 @@ class SkeletalMeshAlembicLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -2,8 +2,10 @@
"""Loader for Static Mesh alembics."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -100,7 +102,7 @@ class StaticMeshAlembicLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -3,8 +3,10 @@
import os
import json
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -135,7 +137,7 @@ class AnimationFBXLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -2,7 +2,8 @@
"""Load camera from FBX."""
import os
from avalon import io, pipeline
from avalon import io
from openpype.pipeline import AVALON_CONTAINER_ID
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -116,7 +117,7 @@ class CameraLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -11,12 +11,12 @@ from unreal import AssetToolsHelpers
from unreal import FBXImportType
from unreal import MathLibrary as umath
from avalon.pipeline import AVALON_CONTAINER_ID
from openpype.pipeline import (
discover_loader_plugins,
loaders_from_representation,
load_container,
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline

View file

@ -2,8 +2,10 @@
"""Load Skeletal Meshes form FBX."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -101,7 +103,7 @@ class SkeletalMeshFBXLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -2,8 +2,10 @@
"""Load Static meshes form FBX."""
import os
from avalon import pipeline
from openpype.pipeline import get_representation_path
from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
import unreal # noqa
@ -95,7 +97,7 @@ class StaticMeshFBXLoader(plugin.Loader):
data = {
"schema": "openpype:container-2.0",
"id": pipeline.AVALON_CONTAINER_ID,
"id": AVALON_CONTAINER_ID,
"asset": asset,
"namespace": asset_dir,
"container_name": container_name,

View file

@ -3,6 +3,8 @@ import os
import json
import math
from bson.objectid import ObjectId
import unreal
from unreal import EditorLevelLibrary as ell
from unreal import EditorAssetLibrary as eal
@ -62,7 +64,7 @@ class ExtractLayout(openpype.api.Extractor):
blend = io.find_one(
{
"type": "representation",
"parent": io.ObjectId(parent),
"parent": ObjectId(parent),
"name": "blend"
},
projection={"_id": True})

View file

@ -114,6 +114,8 @@ from .avalon_context import (
get_workdir_data,
get_workdir,
get_workdir_with_workdir_data,
get_last_workfile_with_version,
get_last_workfile,
create_workfile_doc,
save_workfile_data_to_doc,
@ -263,6 +265,8 @@ __all__ = [
"get_workdir_data",
"get_workdir",
"get_workdir_with_workdir_data",
"get_last_workfile_with_version",
"get_last_workfile",
"create_workfile_doc",
"save_workfile_data_to_doc",

View file

@ -28,7 +28,8 @@ from .local_settings import get_openpype_username
from .avalon_context import (
get_workdir_data,
get_workdir_with_workdir_data,
get_workfile_template_key
get_workfile_template_key,
get_last_workfile
)
from .python_module_tools import (
@ -1544,6 +1545,7 @@ def _prepare_last_workfile(data, workdir):
workdir (str): Path to folder where workfiles should be stored.
"""
import avalon.api
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
log = data["log"]
@ -1592,7 +1594,7 @@ def _prepare_last_workfile(data, workdir):
# Last workfile path
last_workfile_path = data.get("last_workfile_path") or ""
if not last_workfile_path:
extensions = avalon.api.HOST_WORKFILE_EXTENSIONS.get(app.host_name)
extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name)
if extensions:
anatomy = data["anatomy"]
project_settings = data["project_settings"]
@ -1609,7 +1611,7 @@ def _prepare_last_workfile(data, workdir):
"ext": extensions[0]
})
last_workfile_path = avalon.api.last_workfile(
last_workfile_path = get_last_workfile(
workdir, file_template, workdir_data, extensions, True
)

View file

@ -9,6 +9,8 @@ import collections
import functools
import getpass
from bson.objectid import ObjectId
from openpype.settings import (
get_project_settings,
get_system_settings
@ -16,6 +18,7 @@ from openpype.settings import (
from .anatomy import Anatomy
from .profiles_filtering import filter_profiles
from .events import emit_event
from .path_templates import StringTemplate
# avalon module is not imported at the top
# - may not be in path at the time of pype.lib initialization
@ -168,7 +171,7 @@ def any_outdated():
representation_doc = avalon.io.find_one(
{
"_id": avalon.io.ObjectId(representation),
"_id": ObjectId(representation),
"type": "representation"
},
projection={"parent": True}
@ -1735,8 +1738,6 @@ def get_custom_workfile_template_by_context(
context. (Existence of formatted path is not validated.)
"""
from openpype.lib import filter_profiles
if anatomy is None:
anatomy = Anatomy(project_doc["name"])
@ -1759,7 +1760,9 @@ def get_custom_workfile_template_by_context(
# there are some anatomy template strings
if matching_item:
template = matching_item["path"][platform.system().lower()]
return template.format(**anatomy_context_data)
return StringTemplate.format_strict_template(
template, anatomy_context_data
)
return None
@ -1847,3 +1850,124 @@ def get_custom_workfile_template(template_profiles):
io.Session["AVALON_TASK"],
io
)
def get_last_workfile_with_version(
workdir, file_template, fill_data, extensions
):
"""Return last workfile version.
Args:
workdir(str): Path to dir where workfiles are stored.
file_template(str): Template of file name.
fill_data(dict): Data for filling template.
extensions(list, tuple): All allowed file extensions of workfile.
Returns:
tuple: Last workfile<str> with version<int> if there is any, otherwise
returns (None, None).
"""
if not os.path.exists(workdir):
return None, None
# Fast match on extension
filenames = [
filename
for filename in os.listdir(workdir)
if os.path.splitext(filename)[1] in extensions
]
# Build template without optionals, version to digits only regex
# and comment to any definable value.
_ext = []
for ext in extensions:
if not ext.startswith("."):
ext = "." + ext
# Escape dot for regex
ext = "\\" + ext
_ext.append(ext)
ext_expression = "(?:" + "|".join(_ext) + ")"
# Replace `.{ext}` with `{ext}` so we are sure there is no dot at the end
file_template = re.sub(r"\.?{ext}", ext_expression, file_template)
# Replace optional keys with optional content regex
file_template = re.sub(r"<.*?>", r".*?", file_template)
# Replace `{version}` with group regex
file_template = re.sub(r"{version.*?}", r"([0-9]+)", file_template)
file_template = re.sub(r"{comment.*?}", r".+?", file_template)
file_template = StringTemplate.format_strict_template(
file_template, fill_data
)
# Match with ignore case on Windows due to the Windows
# OS not being case-sensitive. This avoids later running
# into the error that the file did exist if it existed
# with a different upper/lower-case.
kwargs = {}
if platform.system().lower() == "windows":
kwargs["flags"] = re.IGNORECASE
# Get highest version among existing matching files
version = None
output_filenames = []
for filename in sorted(filenames):
match = re.match(file_template, filename, **kwargs)
if not match:
continue
file_version = int(match.group(1))
if version is None or file_version > version:
output_filenames[:] = []
version = file_version
if file_version == version:
output_filenames.append(filename)
output_filename = None
if output_filenames:
if len(output_filenames) == 1:
output_filename = output_filenames[0]
else:
last_time = None
for _output_filename in output_filenames:
full_path = os.path.join(workdir, _output_filename)
mod_time = os.path.getmtime(full_path)
if last_time is None or last_time < mod_time:
output_filename = _output_filename
last_time = mod_time
return output_filename, version
def get_last_workfile(
workdir, file_template, fill_data, extensions, full_path=False
):
"""Return last workfile filename.
Returns file with version 1 if there is no workfile yet.
Args:
workdir(str): Path to dir where workfiles are stored.
file_template(str): Template of file name.
fill_data(dict): Data for filling template.
extensions(list, tuple): All allowed file extensions of workfile.
full_path(bool): Full path to file is returned if set to True.
Returns:
str: Last or first workfile as filename or full path to filename.
"""
filename, version = get_last_workfile_with_version(
workdir, file_template, fill_data, extensions
)
if filename is None:
data = copy.deepcopy(fill_data)
data["version"] = 1
data.pop("comment", None)
if not data.get("ext"):
data["ext"] = extensions[0]
filename = StringTemplate.format_strict_template(file_template, data)
if full_path:
return os.path.normpath(os.path.join(workdir, filename))
return filename
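The two helpers above locate the highest-versioned workfile by turning the file template into a regex with a capturing group for `{version}`. A minimal, self-contained sketch of that idea (the `last_version` name and the simplified single-key template are illustrative, not part of the OpenPype API, which goes through `StringTemplate` and supports optional keys):

```python
import re


def last_version(filenames, template="workfile_v{version}.ma"):
    """Return (filename, version) of the highest version, or (None, None).

    Sketch only: assumes "{version}" is the sole dynamic key in the
    template, unlike the full implementation above.
    """
    # Escape the template for regex matching, then swap the version key
    # for a capturing digit group.
    pattern = re.escape(template).replace(re.escape("{version}"), r"([0-9]+)")
    best_name, best_version = None, None
    for filename in sorted(filenames):
        match = re.match(pattern + "$", filename)
        if not match:
            continue
        file_version = int(match.group(1))
        if best_version is None or file_version > best_version:
            best_name, best_version = filename, file_version
    return best_name, best_version
```

As in the full function, ties on version would then be broken by modification time; the sketch simply keeps the last sorted match.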

View file

@ -5,19 +5,30 @@ import glob
import clique
import collections
from .path_templates import (
StringTemplate,
TemplateUnsolved,
)
def collect_frames(files):
"""
Returns dict of source path and its frame, if from sequence
Uses clique as most precise solution
Uses clique as the most precise solution; used when the anatomy
template that created the files is not known.
It is assumed that frames are separated by '.'; negative frames are
not allowed.
Args:
files(list): list of source paths
files(list or set with a single value): list of source paths
Returns:
(dict): {'/asset/subset_v001.0001.png': '0001', ....}
"""
collections, remainder = clique.assemble(files, minimum_items=1)
patterns = [clique.PATTERNS["frames"]]
collections, remainder = clique.assemble(files, minimum_items=1,
patterns=patterns)
sources_and_frames = {}
if collections:
@ -46,8 +57,6 @@ def sizeof_fmt(num, suffix='B'):
def path_from_representation(representation, anatomy):
from avalon import pipeline # safer importing
try:
template = representation["data"]["template"]
@ -57,12 +66,10 @@ def path_from_representation(representation, anatomy):
try:
context = representation["context"]
context["root"] = anatomy.roots
path = pipeline.format_template_with_optional_keys(
context, template
)
path = os.path.normpath(path.replace("/", "\\"))
path = StringTemplate.format_strict_template(template, context)
return os.path.normpath(path)
except KeyError:
except TemplateUnsolved:
# Template references unavailable data
return None
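`collect_frames` maps each source path to its frame string when the file is part of a sequence. A simplified, clique-free sketch of the behavior the docstring describes, assuming the `name.<digits>.<ext>` layout (`collect_frames_sketch` is a hypothetical name; the real function relies on `clique.assemble` with the `"frames"` pattern):

```python
import re


def collect_frames_sketch(files):
    """Map each source path to its frame string, or None when the
    file name carries no ".<digits>.<ext>" frame component."""
    sources_and_frames = {}
    for path in files:
        match = re.search(r"\.(\d+)\.\w+$", path)
        sources_and_frames[path] = match.group(1) if match else None
    return sources_and_frames
```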

View file

@ -98,6 +98,10 @@ class PypeStreamHandler(logging.StreamHandler):
self.flush()
except (KeyboardInterrupt, SystemExit):
raise
except OSError:
self.handleError(record)
except Exception:
print(repr(record))
self.handleError(record)
@ -227,7 +231,7 @@ class PypeLogger:
logger = logging.getLogger(name or "__main__")
if cls.pype_debug > 1:
if cls.pype_debug > 0:
logger.setLevel(logging.DEBUG)
else:
logger.setLevel(logging.INFO)

View file

@ -1,413 +0,0 @@
*
.*
*
.*
*
.
*
.*
*
.
.
*
.*
.*
.*
*
.
.
*
.*
.*
.*
*
.
_.
/**
\ *
\*
*
*
.
__.
---*
\ \*
\ *
\*
*
.
\___.
/* *
\ \ *
\ \*
\ *
\*
.
|____.
/* *
\|\ *
\ \ *
\ \ *
\ \*
\/.
_/_____.
/* *
/ \ *
\ \ *
\ \ *
\ \__*
\/__.
__________.
--*-- ___*
\ \ \/_*
\ \ __*
\ \ \_*
\ \____\*
\/____/.
\____________ .
/* ___ \*
\ \ \/_\ *
\ \ _____*
\ \ \___/*
\ \____\ *
\/____/ .
|___________ .
/* ___ \ *
\|\ \/_\ \ *
\ \ _____/ *
\ \ \___/ *
\ \____\ / *
\/____/ \.
_/__________ .
/* ___ \ *
/ \ \/_\ \ *
\ \ _____/ *
\ \ \___/ ---*
\ \____\ / \__*
\/____/ \/__.
____________ .
--*-- ___ \ *
\ \ \/_\ \ *
\ \ _____/ *
\ \ \___/ ---- *
\ \____\ / \____\*
\/____/ \/____/.
____________
/\ ___ \ .
\ \ \/_\ \ *
\ \ _____/ *
\ \ \___/ ---- *
\ \____\ / \____\ .
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \ .
\ \ _____/ *
\ \ \___/ ---- *
\ \____\ / \____\ .
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ .
\ \ \___/ ---- *
\ \____\ / \____\ .
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/
\ \ \___/ ---- *
\ \____\ / \____\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/
\ \ \___/ ---- .
\ \____\ / \____\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ _
\ \ \___/ ----
\ \____\ / \____\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___
\ \ \___/ ----
\ \____\ / \____\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___
\ \ \___/ ---- \
\ \____\ / \____\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___
\ \ \___/ ---- \
\ \____\ / \____\ \
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___
\ \ \___/ ---- \
\ \____\ / \____\ __\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___
\ \ \___/ ---- \
\ \____\ / \____\ \__\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___
\ \ \___/ ---- \ \
\ \____\ / \____\ \__\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___
\ \ \___/ ---- \ \
\ \____\ / \____\ \__\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___.
\ \ \___/ ---- \ \\
\ \____\ / \____\ \__\,
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ .
\ \ \___/ ---- \ \\
\ \____\ / \____\ \__\\,
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ _.
\ \ \___/ ---- \ \\\
\ \____\ / \____\ \__\\\
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ __.
\ \ \___/ ---- \ \\ \
\ \____\ / \____\ \__\\_/.
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___.
\ \ \___/ ---- \ \\ \\
\ \____\ / \____\ \__\\__\.
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ .
\ \ \___/ ---- \ \\ \\
\ \____\ / \____\ \__\\__\\.
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ _.
\ \ \___/ ---- \ \\ \\\
\ \____\ / \____\ \__\\__\\.
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ __.
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\_.
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ __.
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__.
\/____/ \/____/
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ .
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ *
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ O*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ .oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ ..oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . .oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . p.oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . Py.oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYp.oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPe.oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE .oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE c.oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE C1.oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE ClU.oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE CluB.oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE Club .oO*
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE Club . ..
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE Club . ..
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE Club . .
____________
/\ ___ \
\ \ \/_\ \
\ \ _____/ ___ ___ ___
\ \ \___/ ---- \ \\ \\ \
\ \____\ / \____\ \__\\__\\__\
\/____/ \/____/ . PYPE Club .

View file

@ -1,43 +0,0 @@
# -*- coding: utf-8 -*-
"""Pype terminal animation."""
import blessed
from pathlib import Path
from time import sleep
NO_TERMINAL = False
try:
term = blessed.Terminal()
except AttributeError:
# this happens when blessed cannot find proper terminal.
# If so, skip printing ascii art animation.
NO_TERMINAL = True
def play_animation():
"""Play ASCII art Pype animation."""
if NO_TERMINAL:
return
print(term.home + term.clear)
frame_size = 7
splash_file = Path(__file__).parent / "splash.txt"
with splash_file.open("r") as sf:
animation = sf.readlines()
animation_length = int(len(animation) / frame_size)
current_frame = 0
for _ in range(animation_length):
frame = "".join(
scanline
for y, scanline in enumerate(
animation[current_frame: current_frame + frame_size]
)
)
with term.location(0, 0):
# term.aquamarine3_bold(frame)
print(f"{term.bold}{term.aquamarine3}{frame}{term.normal}")
sleep(0.02)
current_frame += frame_size
print(term.move_y(7))

View file

@ -315,7 +315,7 @@ def get_usd_master_path(asset, subset, representation):
)
template = project["config"]["template"]["publish"]
if isinstance(asset, dict) and "silo" in asset and "name" in asset:
if isinstance(asset, dict) and "name" in asset:
# Allow explicitly passing asset document
asset_doc = asset
else:
@ -325,7 +325,6 @@ def get_usd_master_path(asset, subset, representation):
**{
"root": api.registered_root(),
"project": api.Session["AVALON_PROJECT"],
"silo": asset_doc["silo"],
"asset": asset_doc["name"],
"subset": subset,
"representation": representation,

View file

@ -6,6 +6,7 @@ import pyblish.api
from avalon import api
from openpype.lib import env_value_to_bool
from openpype.lib.delivery import collect_frames
from openpype_modules.deadline import abstract_submit_deadline
from openpype_modules.deadline.abstract_submit_deadline import DeadlineJobInfo
@ -102,24 +103,18 @@ class AfterEffectsSubmitDeadline(
def get_plugin_info(self):
deadline_plugin_info = DeadlinePluginInfo()
context = self._instance.context
script_path = context.data["currentFile"]
render_path = self._instance.data["expectedFiles"][0]
if len(self._instance.data["expectedFiles"]) > 1:
file_name, frame = list(collect_frames([render_path]).items())[0]
if frame:
# replace frame ('000001') with Deadline's required '[#######]'
# expects filename in format project_asset_subset_version.FRAME.ext
render_dir = os.path.dirname(render_path)
file_name = os.path.basename(render_path)
arr = file_name.split('.')
assert len(arr) == 3, \
"Unable to parse frames from {}".format(file_name)
hashed = '[{}]'.format(len(arr[1]) * "#")
render_path = os.path.join(render_dir,
'{}.{}.{}'.format(arr[0], hashed,
arr[2]))
hashed = '[{}]'.format(len(frame) * "#")
file_name = file_name.replace(frame, hashed)
render_path = os.path.join(render_dir, file_name)
deadline_plugin_info.Comp = self._instance.data["comp_name"]
deadline_plugin_info.Version = self._instance.data["app_version"]
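The replacement above turns a concrete frame number (e.g. `'000001'`) into Deadline's padded `[#...#]` token in the render path. The same substitution as a standalone sketch (the helper name is illustrative):

```python
import os


def to_deadline_path(render_path, frame):
    """Replace the frame string in a render path with a '[###...]'
    token whose width matches the frame padding."""
    render_dir = os.path.dirname(render_path)
    file_name = os.path.basename(render_path)
    hashed = "[{}]".format(len(frame) * "#")
    return os.path.join(render_dir, file_name.replace(frame, hashed))
```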

View file

@ -236,6 +236,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
environment["OPENPYPE_MONGO"] = mongo_url
args = [
"--headless",
'publish',
roothless_metadata_path,
"--targets", "deadline",
@ -606,7 +607,18 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
"fps": instance.get("fps"),
"tags": ["review"]
})
self._solve_families(instance, True)
self._solve_families(instance, True)
already_there = False
for repre in instance.get("representations", []):
# might be added explicitly before by publish_on_farm
already_there = repre.get("files") == rep["files"]
if already_there:
self.log.debug("repre {} already_there".format(repre))
break
if not already_there:
representations.append(rep)
return representations
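The loop above skips a review representation whose `files` were already added to the instance, e.g. explicitly by `publish_on_farm`. The same duplicate check as a standalone sketch (hypothetical helper name):

```python
def append_unique(representations, rep):
    """Append rep only if no existing representation already lists
    the same files."""
    already_there = any(
        existing.get("files") == rep["files"]
        for existing in representations
    )
    if not already_there:
        representations.append(rep)
    return representations
```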

View file

@ -46,6 +46,7 @@ def inject_openpype_environment(deadlinePlugin):
args = [
openpype_app,
"--headless",
'extractenvironments',
export_url
]

View file

@ -199,8 +199,10 @@ class SyncToAvalonEvent(BaseEvent):
if proj:
ftrack_id = proj["data"].get("ftrackId")
if ftrack_id is None:
ftrack_id = self._update_project_ftrack_id()
proj["data"]["ftrackId"] = ftrack_id
self.handle_missing_ftrack_id(proj)
ftrack_id = proj["data"]["ftrackId"]
self._avalon_ents_by_ftrack_id[ftrack_id] = proj
self._avalon_ents_by_ftrack_id[ftrack_id] = proj
for ent in ents:
ftrack_id = ent["data"].get("ftrackId")
@ -209,15 +211,78 @@ class SyncToAvalonEvent(BaseEvent):
self._avalon_ents_by_ftrack_id[ftrack_id] = ent
return self._avalon_ents_by_ftrack_id
def _update_project_ftrack_id(self):
ftrack_id = self.cur_project["id"]
def handle_missing_ftrack_id(self, doc):
# TODO: handling of a missing ftrack id is primarily an issue of
# editorial publishing; it would be better to find out what causes the
# ftrack id to be removed during publishing
ftrack_id = doc["data"].get("ftrackId")
if ftrack_id is not None:
return
if doc["type"] == "project":
ftrack_id = self.cur_project["id"]
self.dbcon.update_one(
{"type": "project"},
{"$set": {
"data.ftrackId": ftrack_id,
"data.entityType": self.cur_project.entity_type
}}
)
doc["data"]["ftrackId"] = ftrack_id
doc["data"]["entityType"] = self.cur_project.entity_type
self.log.info("Updated ftrack id of project \"{}\"".format(
self.cur_project["full_name"]
))
return
if doc["type"] != "asset":
return
doc_parents = doc.get("data", {}).get("parents")
if doc_parents is None:
return
entities = self.process_session.query((
"select id, link from TypedContext"
" where project_id is \"{}\" and name is \"{}\""
).format(self.cur_project["id"], doc["name"])).all()
self.log.info("Entities: {}".format(str(entities)))
matching_entity = None
for entity in entities:
parents = []
for item in entity["link"]:
if item["id"] == entity["id"]:
break
low_type = item["type"].lower()
if low_type == "typedcontext":
parents.append(item["name"])
if doc_parents == parents:
matching_entity = entity
break
if matching_entity is None:
return
ftrack_id = matching_entity["id"]
self.dbcon.update_one(
{"type": "project"},
{"$set": {"data.ftrackId": ftrack_id}}
{"_id": doc["_id"]},
{"$set": {
"data.ftrackId": ftrack_id,
"data.entityType": matching_entity.entity_type
}}
)
doc["data"]["ftrackId"] = ftrack_id
doc["data"]["entityType"] = matching_entity.entity_type
return ftrack_id
entity_path_items = []
for item in entity["link"]:
entity_path_items.append(item["name"])
self.log.info("Updated ftrack id of entity \"{}\"".format(
"/".join(entity_path_items)
))
self._avalon_ents_by_ftrack_id[ftrack_id] = doc
@property
def avalon_subsets_by_parents(self):
@ -857,7 +922,14 @@ class SyncToAvalonEvent(BaseEvent):
if vis_par is None:
vis_par = proj["_id"]
parent_ent = self.avalon_ents_by_id[vis_par]
parent_ftrack_id = parent_ent["data"]["ftrackId"]
parent_ftrack_id = parent_ent["data"].get("ftrackId")
if parent_ftrack_id is None:
self.handle_missing_ftrack_id(parent_ent)
parent_ftrack_id = parent_ent["data"].get("ftrackId")
if parent_ftrack_id is None:
continue
parent_ftrack_ent = self.ftrack_ents_by_id.get(
parent_ftrack_id
)
@ -2128,7 +2200,13 @@ class SyncToAvalonEvent(BaseEvent):
vis_par = avalon_ent["parent"]
parent_ent = self.avalon_ents_by_id[vis_par]
parent_ftrack_id = parent_ent["data"]["ftrackId"]
parent_ftrack_id = parent_ent["data"].get("ftrackId")
if parent_ftrack_id is None:
self.handle_missing_ftrack_id(parent_ent)
parent_ftrack_id = parent_ent["data"].get("ftrackId")
if parent_ftrack_id is None:
continue
if parent_ftrack_id not in entities_dict:
entities_dict[parent_ftrack_id] = {
"children": [],

View file

@ -87,8 +87,8 @@ class UserAssigmentEvent(BaseEvent):
if not user_id:
return None, None
task = session.query('Task where id is "{}"'.format(task_id)).one()
user = session.query('User where id is "{}"'.format(user_id)).one()
task = session.query('Task where id is "{}"'.format(task_id)).first()
user = session.query('User where id is "{}"'.format(user_id)).first()
return task, user

View file

@ -5,11 +5,11 @@ import uuid
import clique
from pymongo import UpdateOne
from openpype_modules.ftrack.lib import BaseAction, statics_icon
from avalon.api import AvalonMongoDB
from openpype.api import Anatomy
import avalon.pipeline
from openpype.api import Anatomy
from openpype.lib import StringTemplate, TemplateUnsolved
from openpype_modules.ftrack.lib import BaseAction, statics_icon
class DeleteOldVersions(BaseAction):
@ -563,18 +563,16 @@ class DeleteOldVersions(BaseAction):
try:
context = representation["context"]
context["root"] = anatomy.roots
path = avalon.pipeline.format_template_with_optional_keys(
context, template
)
path = StringTemplate.format_strict_template(template, context)
if "frame" in context:
context["frame"] = self.sequence_splitter
sequence_path = os.path.normpath(
avalon.pipeline.format_template_with_optional_keys(
StringTemplate.format_strict_template(
                    template, context
)
)
except KeyError:
except (KeyError, TemplateUnsolved):
# Template references unavailable data
return (None, None)

View file

@ -0,0 +1,494 @@
import os
import sys
import json
import collections
import tempfile
import datetime
import ftrack_api
from avalon.api import AvalonMongoDB
from openpype.api import get_project_settings
from openpype.lib import (
get_workfile_template_key,
get_workdir_data,
Anatomy,
StringTemplate,
)
from openpype_modules.ftrack.lib import BaseAction, statics_icon
from openpype_modules.ftrack.lib.avalon_sync import create_chunks
NOT_SYNCHRONIZED_TITLE = "Not synchronized"
class FillWorkfileAttributeAction(BaseAction):
"""Action fill work filename into custom attribute on tasks.
Prerequirements are that the project is synchronized so it is possible to
access project anatomy and project/asset documents. Tasks that are not
synchronized are skipped too.
"""
identifier = "fill.workfile.attr"
label = "OpenPype Admin"
variant = "- Fill workfile attribute"
description = "Precalculate and fill workfile name into a custom attribute"
icon = statics_icon("ftrack", "action_icons", "OpenPypeAdmin.svg")
settings_key = "fill_workfile_attribute"
def discover(self, session, entities, event):
""" Validate selection. """
is_valid = False
for ent in event["data"]["selection"]:
# Ignore entities that are not tasks or projects
if ent["entityType"].lower() in ["show", "task"]:
is_valid = True
break
if is_valid:
is_valid = self.valid_roles(session, entities, event)
return is_valid
def launch(self, session, entities, event):
        # Get project entity from the first selected entity
project_entity = None
for entity in entities:
if project_entity is None:
project_entity = self.get_project_from_entity(entity)
break
if not project_entity:
return {
"message": (
"Couldn't find project entity."
" Could be an issue with permissions."
),
"success": False
}
# Get project settings and check if custom attribute where workfile
# should be set is defined.
project_name = project_entity["full_name"]
project_settings = get_project_settings(project_name)
custom_attribute_key = (
project_settings
.get("ftrack", {})
.get("user_handlers", {})
.get(self.settings_key, {})
.get("custom_attribute_key")
)
if not custom_attribute_key:
return {
"success": False,
"message": "Custom attribute key is not set in settings"
}
# Try to find the custom attribute
# - get Task type object id
task_obj_type = session.query(
"select id from ObjectType where name is \"Task\""
).one()
# - get text custom attribute type
text_type = session.query(
"select id from CustomAttributeType where name is \"text\""
).one()
# - find the attribute
attr_conf = session.query(
(
"select id, key from CustomAttributeConfiguration"
" where object_type_id is \"{}\""
" and type_id is \"{}\""
" and key is \"{}\""
).format(
task_obj_type["id"], text_type["id"], custom_attribute_key
)
).first()
if not attr_conf:
return {
"success": False,
"message": (
"Could not find Task (text) Custom attribute \"{}\""
).format(custom_attribute_key)
}
# Store report information
report = collections.defaultdict(list)
user_entity = session.query(
"User where id is {}".format(event["source"]["user"]["id"])
).one()
job_entity = session.create("Job", {
"user": user_entity,
"status": "running",
"data": json.dumps({
"description": "(0/3) Fill of workfiles started"
})
})
session.commit()
try:
self.in_job_process(
session,
entities,
job_entity,
project_entity,
project_settings,
attr_conf,
report
)
except Exception:
self.log.error(
"Fill of workfiles to custom attribute failed", exc_info=True
)
session.rollback()
description = "Fill of workfiles Failed (Download traceback)"
self.add_traceback_to_job(
job_entity, session, sys.exc_info(), description
)
return {
"message": (
"Fill of workfiles failed."
" Check job for more information"
),
"success": False
}
job_entity["status"] = "done"
job_entity["data"] = json.dumps({
"description": "Fill of workfiles completed."
})
session.commit()
if report:
temp_obj = tempfile.NamedTemporaryFile(
mode="w",
prefix="openpype_ftrack_",
suffix=".json",
delete=False
)
temp_obj.close()
temp_filepath = temp_obj.name
with open(temp_filepath, "w") as temp_file:
json.dump(report, temp_file)
component_name = "{}_{}".format(
"FillWorkfilesReport",
datetime.datetime.now().strftime("%y-%m-%d-%H%M")
)
self.add_file_component_to_job(
job_entity, session, temp_filepath, component_name
)
# Delete temp file
os.remove(temp_filepath)
self._show_report(event, report, project_name)
return {
"message": (
"Fill of workfiles finished with few issues."
" Check job for more information"
),
"success": True
}
return {
"success": True,
"message": "Finished with filling of work filenames"
}
def _show_report(self, event, report, project_name):
items = []
title = "Fill workfiles report ({}):".format(project_name)
for subtitle, lines in report.items():
if items:
items.append({
"type": "label",
"value": "---"
})
items.append({
"type": "label",
"value": "# {}".format(subtitle)
})
items.append({
"type": "label",
"value": '<p>{}</p>'.format("<br>".join(lines))
})
self.show_interface(
items=items,
title=title,
event=event
)
def in_job_process(
self,
session,
entities,
job_entity,
project_entity,
project_settings,
attr_conf,
report
):
task_entities = []
other_entities = []
project_selected = False
for entity in entities:
ent_type_low = entity.entity_type.lower()
if ent_type_low == "project":
project_selected = True
break
elif ent_type_low == "task":
task_entities.append(entity)
else:
other_entities.append(entity)
project_name = project_entity["full_name"]
        # Find matching asset documents and map them by ftrack task entities
        # - result stored to 'asset_docs_with_task_entities' is a list of
        #   tuples `(asset document, [task entities, ...])`
dbcon = AvalonMongoDB()
dbcon.Session["AVALON_PROJECT"] = project_name
        # Query all asset documents
asset_docs = list(dbcon.find({"type": "asset"}))
job_entity["data"] = json.dumps({
"description": "(1/3) Asset documents queried."
})
session.commit()
# When project is selected then we can query whole project
if project_selected:
asset_docs_with_task_entities = self._get_asset_docs_for_project(
session, project_entity, asset_docs, report
)
else:
asset_docs_with_task_entities = self._get_tasks_for_selection(
session, other_entities, task_entities, asset_docs, report
)
job_entity["data"] = json.dumps({
"description": "(2/3) Queried related task entities."
})
session.commit()
# Keep placeholders in the template unfilled
host_name = "{app}"
extension = "{ext}"
project_doc = dbcon.find_one({"type": "project"})
project_settings = get_project_settings(project_name)
anatomy = Anatomy(project_name)
templates_by_key = {}
operations = []
for asset_doc, task_entities in asset_docs_with_task_entities:
for task_entity in task_entities:
workfile_data = get_workdir_data(
project_doc, asset_doc, task_entity["name"], host_name
)
# Use version 1 for each workfile
workfile_data["version"] = 1
workfile_data["ext"] = extension
task_type = workfile_data["task"]["type"]
template_key = get_workfile_template_key(
task_type, host_name, project_settings=project_settings
)
if template_key in templates_by_key:
template = templates_by_key[template_key]
else:
template = StringTemplate(
anatomy.templates[template_key]["file"]
)
templates_by_key[template_key] = template
result = template.format(workfile_data)
if not result.solved:
# TODO report
pass
else:
table_values = collections.OrderedDict((
("configuration_id", attr_conf["id"]),
("entity_id", task_entity["id"])
))
operations.append(
ftrack_api.operation.UpdateEntityOperation(
"ContextCustomAttributeValue",
table_values,
"value",
ftrack_api.symbol.NOT_SET,
str(result)
)
)
if operations:
for sub_operations in create_chunks(operations, 50):
for op in sub_operations:
session.recorded_operations.push(op)
session.commit()
job_entity["data"] = json.dumps({
"description": "(3/3) Set custom attribute values."
})
session.commit()
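The commit loop above pushes operations in chunks of 50 via `create_chunks` from `openpype_modules.ftrack.lib.avalon_sync`; a plausible standalone sketch of such a chunking helper (an assumption about its behavior, not the actual implementation):

```python
def create_chunks(items, chunk_size=None):
    """Split an iterable into lists of at most 'chunk_size' items."""
    if chunk_size is None or chunk_size < 1:
        chunk_size = 200  # assumed default, the real helper may differ
    items = list(items)
    return [
        items[idx:idx + chunk_size]
        for idx in range(0, len(items), chunk_size)
    ]
```

Committing in bounded chunks keeps each ftrack request payload small, which avoids server-side timeouts when thousands of custom attribute values are updated at once.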
def _get_entity_path(self, entity):
path_items = []
for item in entity["link"]:
if item["type"].lower() != "project":
path_items.append(item["name"])
return "/".join(path_items)
def _get_asset_docs_for_project(
self, session, project_entity, asset_docs, report
):
asset_docs_task_names = {}
for asset_doc in asset_docs:
asset_data = asset_doc["data"]
ftrack_id = asset_data.get("ftrackId")
if not ftrack_id:
hierarchy = list(asset_data.get("parents") or [])
hierarchy.append(asset_doc["name"])
path = "/".join(hierarchy)
report[NOT_SYNCHRONIZED_TITLE].append(path)
continue
asset_tasks = asset_data.get("tasks") or {}
asset_docs_task_names[ftrack_id] = (
asset_doc, list(asset_tasks.keys())
)
task_entities = session.query((
"select id, name, parent_id, link from Task where project_id is {}"
).format(project_entity["id"])).all()
task_entities_by_parent_id = collections.defaultdict(list)
for task_entity in task_entities:
parent_id = task_entity["parent_id"]
task_entities_by_parent_id[parent_id].append(task_entity)
output = []
for ftrack_id, item in asset_docs_task_names.items():
asset_doc, task_names = item
valid_task_entities = []
for task_entity in task_entities_by_parent_id[ftrack_id]:
if task_entity["name"] in task_names:
valid_task_entities.append(task_entity)
else:
path = self._get_entity_path(task_entity)
report[NOT_SYNCHRONIZED_TITLE].append(path)
if valid_task_entities:
output.append((asset_doc, valid_task_entities))
return output
def _get_tasks_for_selection(
self, session, other_entities, task_entities, asset_docs, report
):
all_tasks = object()
asset_docs_by_ftrack_id = {}
asset_docs_by_parent_id = collections.defaultdict(list)
for asset_doc in asset_docs:
asset_data = asset_doc["data"]
ftrack_id = asset_data.get("ftrackId")
parent_id = asset_data.get("visualParent")
asset_docs_by_parent_id[parent_id].append(asset_doc)
if ftrack_id:
asset_docs_by_ftrack_id[ftrack_id] = asset_doc
missing_doc_ftrack_ids = {}
all_tasks_ids = set()
task_names_by_ftrack_id = collections.defaultdict(list)
for other_entity in other_entities:
ftrack_id = other_entity["id"]
if ftrack_id not in asset_docs_by_ftrack_id:
missing_doc_ftrack_ids[ftrack_id] = None
continue
all_tasks_ids.add(ftrack_id)
task_names_by_ftrack_id[ftrack_id] = all_tasks
for task_entity in task_entities:
parent_id = task_entity["parent_id"]
if parent_id not in asset_docs_by_ftrack_id:
missing_doc_ftrack_ids[parent_id] = None
continue
            if parent_id not in all_tasks_ids:
                task_names_by_ftrack_id[parent_id].append(
                    task_entity["name"]
                )
ftrack_ids = set()
asset_doc_with_task_names_by_id = {}
for ftrack_id, task_names in task_names_by_ftrack_id.items():
asset_doc = asset_docs_by_ftrack_id[ftrack_id]
asset_data = asset_doc["data"]
asset_tasks = asset_data.get("tasks") or {}
if task_names is all_tasks:
task_names = list(asset_tasks.keys())
else:
new_task_names = []
for task_name in task_names:
if task_name in asset_tasks:
new_task_names.append(task_name)
continue
if ftrack_id not in missing_doc_ftrack_ids:
missing_doc_ftrack_ids[ftrack_id] = []
if missing_doc_ftrack_ids[ftrack_id] is not None:
missing_doc_ftrack_ids[ftrack_id].append(task_name)
task_names = new_task_names
if task_names:
ftrack_ids.add(ftrack_id)
asset_doc_with_task_names_by_id[ftrack_id] = (
asset_doc, task_names
)
task_entities = session.query((
"select id, name, parent_id from Task where parent_id in ({})"
).format(self.join_query_keys(ftrack_ids))).all()
        task_entities_by_parent_id = collections.defaultdict(list)
        for task_entity in task_entities:
            parent_id = task_entity["parent_id"]
            task_entities_by_parent_id[parent_id].append(task_entity)
        output = []
        for ftrack_id, item in asset_doc_with_task_names_by_id.items():
            asset_doc, task_names = item
            valid_task_entities = []
            for task_entity in task_entities_by_parent_id[ftrack_id]:
                if task_entity["name"] in task_names:
                    valid_task_entities.append(task_entity)
                else:
                    if ftrack_id not in missing_doc_ftrack_ids:
                        missing_doc_ftrack_ids[ftrack_id] = []
                    if missing_doc_ftrack_ids[ftrack_id] is not None:
                        missing_doc_ftrack_ids[ftrack_id].append(
                            task_entity["name"]
                        )
if valid_task_entities:
output.append((asset_doc, valid_task_entities))
# Store report information about not synchronized entities
if missing_doc_ftrack_ids:
missing_entities = session.query(
"select id, link from TypedContext where id in ({})".format(
self.join_query_keys(missing_doc_ftrack_ids.keys())
)
).all()
for missing_entity in missing_entities:
path = self._get_entity_path(missing_entity)
task_names = missing_doc_ftrack_ids[missing_entity["id"]]
if task_names is None:
report[NOT_SYNCHRONIZED_TITLE].append(path)
else:
for task_name in task_names:
task_path = "/".join([path, task_name])
report[NOT_SYNCHRONIZED_TITLE].append(task_path)
return output
def register(session):
FillWorkfileAttributeAction(session).register()

View file

@ -44,7 +44,7 @@ class BaseEvent(BaseHandler):
return self._get_entities(
event,
session,
ignore=['socialfeed', 'socialnotification']
ignore=['socialfeed', 'socialnotification', 'team']
)
def get_project_name_from_event(self, session, event, project_id):

View file

@ -1,3 +1,8 @@
from .constants import (
AVALON_CONTAINER_ID,
HOST_WORKFILE_EXTENSIONS,
)
from .lib import attribute_definitions
from .create import (
@ -41,8 +46,27 @@ from .publish import (
OpenPypePyblishPluginMixin
)
from .actions import (
LauncherAction,
InventoryAction,
discover_launcher_actions,
register_launcher_action,
register_launcher_action_path,
discover_inventory_actions,
register_inventory_action,
register_inventory_action_path,
deregister_inventory_action,
deregister_inventory_action_path,
)
__all__ = (
"AVALON_CONTAINER_ID",
"HOST_WORKFILE_EXTENSIONS",
"attribute_definitions",
# --- Create ---
@ -82,5 +106,19 @@ __all__ = (
"PublishValidationError",
"PublishXmlValidationError",
"KnownPublishError",
"OpenPypePyblishPluginMixin"
"OpenPypePyblishPluginMixin",
# --- Actions ---
"LauncherAction",
"InventoryAction",
"discover_launcher_actions",
"register_launcher_action",
"register_launcher_action_path",
"discover_inventory_actions",
"register_inventory_action",
"register_inventory_action_path",
"deregister_inventory_action",
"deregister_inventory_action_path",
)

View file

@ -0,0 +1,146 @@
import logging
class LauncherAction(object):
"""A custom action available"""
name = None
label = None
icon = None
color = None
order = 0
log = logging.getLogger("LauncherAction")
log.propagate = True
def is_compatible(self, session):
"""Return whether the class is compatible with the Session."""
return True
def process(self, session, **kwargs):
pass
class InventoryAction(object):
"""A custom action for the scene inventory tool
If registered the action will be visible in the Right Mouse Button menu
under the submenu "Actions".
"""
label = None
icon = None
color = None
order = 0
log = logging.getLogger("InventoryAction")
log.propagate = True
@staticmethod
def is_compatible(container):
"""Override function in a custom class
This method is specifically used to ensure the action can operate on
the container.
Args:
container(dict): the data of a loaded asset, see host.ls()
Returns:
bool
"""
return bool(container.get("objectName"))
def process(self, containers):
"""Override function in a custom class
This method will receive all containers even those which are
incompatible. It is advised to create a small filter along the lines
of this example:
        valid_containers = [c for c in containers if self.is_compatible(c)]
The return value will need to be a True-ish value to trigger
the data_changed signal in order to refresh the view.
You can return a list of container names to trigger GUI to select
treeview items.
You can return a dict to carry extra GUI options. For example:
{
"objectNames": [container names...],
"options": {"mode": "toggle",
"clear": False}
}
Currently workable GUI options are:
- clear (bool): Clear current selection before selecting by action.
Default `True`.
- mode (str): selection mode, use one of these:
"select", "deselect", "toggle". Default is "select".
Args:
containers (list): list of dictionaries
Return:
bool, list or dict
"""
return True
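For illustration, a hypothetical action honoring the contract described above (shown standalone here; a real one would subclass `InventoryAction`):

```python
class SelectByObjectName:
    """Hypothetical inventory action; in practice subclass InventoryAction."""

    label = "Select by object name"
    order = 10

    @staticmethod
    def is_compatible(container):
        # Only operate on containers that carry an objectName
        return bool(container.get("objectName"))

    def process(self, containers):
        # Filter out incompatible containers, as the docstring advises
        valid = [c for c in containers if self.is_compatible(c)]
        # Return a dict to carry extra GUI options back to the tool
        return {
            "objectNames": [c["objectName"] for c in valid],
            "options": {"mode": "select", "clear": True},
        }
```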
# Launcher action
def discover_launcher_actions():
import avalon.api
return avalon.api.discover(LauncherAction)
def register_launcher_action(plugin):
import avalon.api
return avalon.api.register_plugin(LauncherAction, plugin)
def register_launcher_action_path(path):
import avalon.api
return avalon.api.register_plugin_path(LauncherAction, path)
# Inventory action
def discover_inventory_actions():
    import avalon.api

    actions = avalon.api.discover(InventoryAction)
    # Skip the base class itself in case it was discovered
    return [
        action
        for action in actions
        if action is not InventoryAction
    ]
def register_inventory_action(plugin):
import avalon.api
return avalon.api.register_plugin(InventoryAction, plugin)
def deregister_inventory_action(plugin):
import avalon.api
avalon.api.deregister_plugin(InventoryAction, plugin)
def register_inventory_action_path(path):
import avalon.api
return avalon.api.register_plugin_path(InventoryAction, path)
def deregister_inventory_action_path(path):
import avalon.api
return avalon.api.deregister_plugin_path(InventoryAction, path)

View file

@ -0,0 +1,19 @@
# Metadata ID of loaded container into scene
AVALON_CONTAINER_ID = "pyblish.avalon.container"
# TODO get extensions from host implementations
HOST_WORKFILE_EXTENSIONS = {
"blender": [".blend"],
"celaction": [".scn"],
"tvpaint": [".tvpp"],
"fusion": [".comp"],
"harmony": [".zip"],
"houdini": [".hip", ".hiplc", ".hipnc"],
"maya": [".ma", ".mb"],
"nuke": [".nk"],
"hiero": [".hrox"],
"photoshop": [".psd", ".psb"],
"premiere": [".prproj"],
"resolve": [".drp"],
"aftereffects": [".aep"]
}
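Such a mapping can be inverted to guess the host from a workfile path; a minimal sketch using a trimmed copy of the table above (the `host_from_workfile` helper is hypothetical, not OpenPype API):

```python
import os

# Trimmed copy of the HOST_WORKFILE_EXTENSIONS mapping above
HOST_WORKFILE_EXTENSIONS = {
    "maya": [".ma", ".mb"],
    "nuke": [".nk"],
    "photoshop": [".psd", ".psb"],
}


def host_from_workfile(filepath):
    """Return the first host whose extensions match the file, or None."""
    ext = os.path.splitext(filepath)[1].lower()
    for host_name, extensions in HOST_WORKFILE_EXTENSIONS.items():
        if ext in extensions:
            return host_name
    return None
```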

View file

@ -127,4 +127,5 @@ def register_loader_plugin_path(path):
def deregister_loader_plugin(plugin):
import avalon.api
avalon.api.deregister_plugin(LoaderPlugin, plugin)

View file

@ -7,6 +7,7 @@ import inspect
import numbers
import six
from bson.objectid import ObjectId
from avalon import io, schema
from avalon.api import Session, registered_root
@ -67,7 +68,7 @@ def get_repres_contexts(representation_ids, dbcon=None):
_representation_ids = []
for repre_id in representation_ids:
if isinstance(repre_id, six.string_types):
repre_id = io.ObjectId(repre_id)
repre_id = ObjectId(repre_id)
_representation_ids.append(repre_id)
repre_docs = dbcon.find({
@ -174,7 +175,7 @@ def get_subset_contexts(subset_ids, dbcon=None):
_subset_ids = set()
for subset_id in subset_ids:
if isinstance(subset_id, six.string_types):
subset_id = io.ObjectId(subset_id)
subset_id = ObjectId(subset_id)
_subset_ids.add(subset_id)
subset_docs = dbcon.find({
@ -217,7 +218,7 @@ def get_representation_context(representation):
"""Return parenthood context for representation.
Args:
representation (str or io.ObjectId or dict): The representation id
representation (str or ObjectId or dict): The representation id
or full representation as returned by the database.
Returns:
@ -227,9 +228,9 @@ def get_representation_context(representation):
assert representation is not None, "This is a bug"
if isinstance(representation, (six.string_types, io.ObjectId)):
if isinstance(representation, (six.string_types, ObjectId)):
representation = io.find_one(
{"_id": io.ObjectId(str(representation))})
{"_id": ObjectId(str(representation))})
version, subset, asset, project = io.parenthood(representation)
@ -340,7 +341,7 @@ def load_container(
Args:
Loader (Loader): The loader class to trigger.
representation (str or io.ObjectId or dict): The representation id
representation (str or ObjectId or dict): The representation id
or full representation as returned by the database.
namespace (str, Optional): The namespace to assign. Defaults to None.
name (str, Optional): The name to assign. Defaults to subset name.
@ -404,7 +405,7 @@ def update_container(container, version=-1):
# Compute the different version from 'representation'
current_representation = io.find_one({
"_id": io.ObjectId(container["representation"])
"_id": ObjectId(container["representation"])
})
assert current_representation is not None, "This is a bug"
@ -502,7 +503,7 @@ def get_representation_path_from_context(context):
session_project = Session.get("AVALON_PROJECT")
if project_doc and project_doc["name"] != session_project:
anatomy = Anatomy(project_doc["name"])
root = anatomy.roots_obj
root = anatomy.roots
return get_representation_path(representation, root)
@ -525,7 +526,7 @@ def get_representation_path(representation, root=None, dbcon=None):
"""
from openpype.lib import StringTemplate
from openpype.lib import StringTemplate, TemplateUnsolved
if dbcon is None:
dbcon = io
@ -542,13 +543,14 @@ def get_representation_path(representation, root=None, dbcon=None):
try:
context = representation["context"]
context["root"] = root
template_obj = StringTemplate(template)
path = str(template_obj.format(context))
path = StringTemplate.format_strict_template(
template, context
)
# Force replacing backslashes with forward slashed if not on
# windows
if platform.system().lower() != "windows":
path = path.replace("\\", "/")
except KeyError:
except (TemplateUnsolved, KeyError):
# Template references unavailable data
return None
@ -592,7 +594,6 @@ def get_representation_path(representation, root=None, dbcon=None):
"code": project.get("data", {}).get("code")
},
"asset": asset["name"],
"silo": asset.get("silo"),
"hierarchy": hierarchy,
"subset": subset["name"],
"version": version_["name"],

View file

@ -0,0 +1,147 @@
import os
import copy
import logging
log = logging.getLogger(__name__)
def get_thumbnail_binary(thumbnail_entity, thumbnail_type, dbcon=None):
if not thumbnail_entity:
return
resolvers = discover_thumbnail_resolvers()
resolvers = sorted(resolvers, key=lambda cls: cls.priority)
if dbcon is None:
from avalon import io
dbcon = io
for Resolver in resolvers:
available_types = Resolver.thumbnail_types
if (
thumbnail_type not in available_types
and "*" not in available_types
and (
isinstance(available_types, (list, tuple))
and len(available_types) == 0
)
):
continue
try:
instance = Resolver(dbcon)
result = instance.process(thumbnail_entity, thumbnail_type)
if result:
return result
except Exception:
log.warning("Resolver {0} failed durring process.".format(
Resolver.__class__.__name__, exc_info=True
))
class ThumbnailResolver(object):
"""Determine how to get data from thumbnail entity.
"priority" - determines the order of processing in `get_thumbnail_binary`,
lower number is processed earlier.
"thumbnail_types" - it is expected that thumbnails will be used in more
more than one level, there is only ["thumbnail"] type at the moment
of creating this docstring but it is expected to add "ico" and "full"
in future.
"""
priority = 100
thumbnail_types = ["*"]
def __init__(self, dbcon):
self._log = None
self.dbcon = dbcon
@property
def log(self):
if self._log is None:
self._log = logging.getLogger(self.__class__.__name__)
return self._log
def process(self, thumbnail_entity, thumbnail_type):
pass
class TemplateResolver(ThumbnailResolver):
priority = 90
def process(self, thumbnail_entity, thumbnail_type):
if not os.environ.get("AVALON_THUMBNAIL_ROOT"):
return
template = thumbnail_entity["data"].get("template")
if not template:
self.log.debug("Thumbnail entity does not have set template")
return
project = self.dbcon.find_one(
{"type": "project"},
{
"name": True,
"data.code": True
}
)
template_data = copy.deepcopy(
thumbnail_entity["data"].get("template_data") or {}
)
template_data.update({
"_id": str(thumbnail_entity["_id"]),
"thumbnail_type": thumbnail_type,
"thumbnail_root": os.environ.get("AVALON_THUMBNAIL_ROOT"),
"project": {
"name": project["name"],
"code": project["data"].get("code")
}
})
try:
filepath = os.path.normpath(template.format(**template_data))
except KeyError:
self.log.warning((
"Missing template data keys for template <{0}> || Data: {1}"
).format(template, str(template_data)))
return
if not os.path.exists(filepath):
self.log.warning("File does not exist \"{0}\"".format(filepath))
return
with open(filepath, "rb") as _file:
content = _file.read()
return content
class BinaryThumbnail(ThumbnailResolver):
def process(self, thumbnail_entity, thumbnail_type):
return thumbnail_entity["data"].get("binary_data")
# Thumbnail resolvers
def discover_thumbnail_resolvers():
import avalon.api
return avalon.api.discover(ThumbnailResolver)
def register_thumbnail_resolver(plugin):
import avalon.api
return avalon.api.register_plugin(ThumbnailResolver, plugin)
def register_thumbnail_resolver_path(path):
import avalon.api
return avalon.api.register_plugin_path(ThumbnailResolver, path)
register_thumbnail_resolver(TemplateResolver)
register_thumbnail_resolver(BinaryThumbnail)
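The priority-based selection in `get_thumbnail_binary` can be illustrated standalone; a sketch with hypothetical resolver classes (not the actual OpenPype resolvers):

```python
class DummyBinary:
    """Hypothetical resolver returning binary data stored on the entity."""
    priority = 100

    def process(self, entity, thumbnail_type):
        return entity["data"].get("binary_data")


class DummyTemplate:
    """Hypothetical resolver; lower priority value runs first."""
    priority = 90

    def process(self, entity, thumbnail_type):
        return entity["data"].get("template_result")


def resolve(entity, thumbnail_type, resolver_classes):
    # Sort by priority ascending and return the first non-empty result,
    # mirroring the loop in get_thumbnail_binary above
    for cls in sorted(resolver_classes, key=lambda c: c.priority):
        result = cls().process(entity, thumbnail_type)
        if result:
            return result
    return None
```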

View file

@ -1,3 +1,4 @@
from bson.objectid import ObjectId
import pyblish.api
from avalon import api, io
@ -35,7 +36,7 @@ class CollectSceneLoadedVersions(pyblish.api.ContextPlugin):
loaded_versions = []
_containers = list(host.ls())
_repr_ids = [io.ObjectId(c["representation"]) for c in _containers]
_repr_ids = [ObjectId(c["representation"]) for c in _containers]
version_by_repr = {
str(doc["_id"]): doc["parent"] for doc in
io.find({"_id": {"$in": _repr_ids}}, projection={"parent": 1})
@ -46,7 +47,7 @@ class CollectSceneLoadedVersions(pyblish.api.ContextPlugin):
            # may have more than one representation of the same version
version = {
"subsetName": con["name"],
"representation": io.ObjectId(con["representation"]),
"representation": ObjectId(con["representation"]),
"version": version_by_repr[con["representation"]], # _id
}
loaded_versions.append(version)

View file

@ -64,7 +64,7 @@ class ExtractHierarchyToAvalon(pyblish.api.ContextPlugin):
data["tasks"] = tasks
parents = []
visualParent = None
# do not store project"s id as visualParent (silo asset)
# do not store project"s id as visualParent
if self.project is not None:
if self.project["_id"] != parent["_id"]:
visualParent = parent["_id"]

View file

@ -4,6 +4,7 @@ import clique
import errno
import shutil
from bson.objectid import ObjectId
from pymongo import InsertOne, ReplaceOne
import pyblish.api
from avalon import api, io, schema
@ -161,7 +162,7 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin):
if old_version:
new_version_id = old_version["_id"]
else:
new_version_id = io.ObjectId()
new_version_id = ObjectId()
new_hero_version = {
"_id": new_version_id,
@ -384,7 +385,7 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin):
# Create representation
else:
repre["_id"] = io.ObjectId()
repre["_id"] = ObjectId()
bulk_writes.append(
InsertOne(repre)
)
@ -420,7 +421,7 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin):
else:
repre["old_id"] = repre["_id"]
repre["_id"] = io.ObjectId()
repre["_id"] = ObjectId()
repre["type"] = "archived_representation"
bulk_writes.append(
InsertOne(repre)

View file

@ -1,8 +1,10 @@
from collections import OrderedDict
from avalon import io
from bson.objectid import ObjectId
import pyblish.api
from avalon import io
class IntegrateInputLinks(pyblish.api.ContextPlugin):
"""Connecting version level dependency links"""
@ -104,7 +106,7 @@ class IntegrateInputLinks(pyblish.api.ContextPlugin):
# future.
link = OrderedDict()
link["type"] = link_type
link["id"] = io.ObjectId(input_id)
link["id"] = ObjectId(input_id)
link["linkedBy"] = "publish"
if "inputLinks" not in version_doc["data"]:

View file

@ -9,17 +9,19 @@ import six
import re
import shutil
from bson.objectid import ObjectId
from pymongo import DeleteOne, InsertOne
import pyblish.api
from avalon import io
from avalon.api import format_template_with_optional_keys
import openpype.api
from datetime import datetime
# from pype.modules import ModulesManager
from openpype.lib.profiles_filtering import filter_profiles
from openpype.lib import (
prepare_template_data,
create_hard_link
create_hard_link,
StringTemplate,
TemplateUnsolved
)
# this is needed until speedcopy for linux is fixed
@ -293,7 +295,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
bulk_writes.append(DeleteOne({"_id": repre_id}))
repre["orig_id"] = repre_id
repre["_id"] = io.ObjectId()
repre["_id"] = ObjectId()
repre["type"] = "archived_representation"
bulk_writes.append(InsertOne(repre))
@ -572,7 +574,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
# Create new id if existing representations does not match
if repre_id is None:
repre_id = io.ObjectId()
repre_id = ObjectId()
data = repre.get("data") or {}
data.update({'path': dst, 'template': template})
@@ -781,7 +783,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
         families = [instance.data["family"]]
         families.extend(instance.data.get("families", []))
         io.update_many(
-            {"type": "subset", "_id": io.ObjectId(subset["_id"])},
+            {"type": "subset", "_id": ObjectId(subset["_id"])},
             {"$set": {"data.families": families}}
         )
@@ -806,7 +808,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
         if subset_group:
             io.update_many({
                 'type': 'subset',
-                '_id': io.ObjectId(subset_id)
+                '_id': ObjectId(subset_id)
             }, {'$set': {'data.subsetGroup': subset_group}})

     def _get_subset_group(self, instance):
@@ -854,9 +856,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
         fill_pairs = prepare_template_data(fill_pairs)

         try:
-            filled_template = \
-                format_template_with_optional_keys(fill_pairs, template)
-        except KeyError:
+            filled_template = StringTemplate.format_strict_template(
+                template, fill_pairs
+            )
+        except (KeyError, TemplateUnsolved):
             keys = []
             if fill_pairs:
                 keys = fill_pairs.keys()
@@ -1052,7 +1055,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
         sync_project_presets = None

         rec = {
-            "_id": io.ObjectId(),
+            "_id": ObjectId(),
             "path": path
         }
         if size:

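One subtlety in the template change recorded above: the removed `format_template_with_optional_keys` silently dropped keys it could not fill, while `StringTemplate.format_strict_template` raises when a key stays unresolved, which is why the `except` clause now also catches `TemplateUnsolved`. A stdlib approximation of that strict behavior, with a made-up template and keys for illustration:

```python
# Hypothetical publish template; "version" is deliberately left out of
# the fill data to trigger the strict-formatting failure path.
template = "{asset}_{subset}_v{version:0>3}"
fill_pairs = {"asset": "sh010", "subset": "animationMain"}

try:
    # str.format is "strict": a missing key raises instead of being skipped
    filled_template = template.format(**fill_pairs)
except KeyError:
    # mirrors the plugin's fallback path when a key cannot be resolved
    filled_template = None

assert filled_template is None
```

With all keys supplied the same template formats normally, e.g. `template.format(asset="sh010", subset="animationMain", version=7)` yields `"sh010_animationMain_v007"`.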
View file

@@ -193,6 +193,11 @@
             "Administrator"
         ]
     },
+    "fill_workfile_attribute": {
+        "enabled": false,
+        "custom_attribute_key": "",
+        "role_list": []
+    },
     "seed_project": {
         "enabled": true,
         "role_list": [

View file

@@ -28,6 +28,10 @@ class BaseEntity:
     def __init__(self, schema_data, *args, **kwargs):
         self.schema_data = schema_data

+        tooltip = None
+        if schema_data:
+            tooltip = schema_data.get("tooltip")
+        self.tooltip = tooltip
         # Entity id
         self._id = uuid4()

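The tooltip lookup added to `BaseEntity` above is deliberately defensive: `schema_data` may be `None` and the `"tooltip"` key is optional, so entities without one keep `tooltip = None`. A trimmed-down sketch of just that behavior (the class body is simplified; only the tooltip logic mirrors the diff):

```python
from uuid import uuid4


class BaseEntity:
    def __init__(self, schema_data):
        self.schema_data = schema_data

        # Optional "tooltip" key; shown in the Settings UI when present
        tooltip = None
        if schema_data:
            tooltip = schema_data.get("tooltip")
        self.tooltip = tooltip

        # Entity id
        self._id = uuid4()


assert BaseEntity({"tooltip": "Shown on hover"}).tooltip == "Shown on hover"
assert BaseEntity({}).tooltip is None
assert BaseEntity(None).tooltip is None
```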
View file

@@ -14,6 +14,7 @@
 - this keys is not allowed for all inputs as they may have not reason for that
 - key is validated, can be only once in hierarchy but is not required
 - currently there are `system settings` and `project settings`
+- all entities can have set `"tooltip"` key with description which will be shown in UI

 ## Inner schema
 - GUI schemas are huge json files, to be able to split whole configuration into multiple schema there's type `schema`

View file

@@ -589,6 +589,34 @@
                 }
             ]
         },
+        {
+            "type": "dict",
+            "key": "fill_workfile_attribute",
+            "label": "Fill workfile Custom attribute",
+            "checkbox_key": "enabled",
+            "children": [
+                {
+                    "type": "boolean",
+                    "key": "enabled",
+                    "label": "Enabled"
+                },
+                {
+                    "type": "label",
+                    "label": "Custom attribute must be <b>Text</b> type added to <b>Task</b> entity type"
+                },
+                {
+                    "type": "text",
+                    "key": "custom_attribute_key",
+                    "label": "Custom attribute key"
+                },
+                {
+                    "type": "list",
+                    "key": "role_list",
+                    "label": "Roles",
+                    "object_type": "text"
+                }
+            ]
+        },
         {
             "type": "dict",
             "key": "seed_project",

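The new `fill_workfile_attribute` schema entry above pairs with the defaults block added to the Ftrack settings file earlier in this diff: every value-carrying child `key` (including the `checkbox_key` `"enabled"`; `label` entries carry no key) needs a matching default. A small sanity-check sketch with the data copied from the diff:

```python
# Keys declared by the schema's children (the "label" entry has no key)
schema_keys = {"enabled", "custom_attribute_key", "role_list"}

# Matching block from the settings defaults in this same commit
defaults = {
    "enabled": False,
    "custom_attribute_key": "",
    "role_list": [],
}

# Every schema key has a default and no stray defaults exist
assert schema_keys == set(defaults)
```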
View file

@@ -195,6 +195,9 @@
                     {
                         "aces_1.1": "aces_1.1"
                     },
+                    {
+                        "aces_1.2": "aces_1.2"
+                    },
                     {
                         "custom": "custom"
                     }

Some files were not shown because too many files have changed in this diff.