Merge branch 'develop' into feature/OP-3593_Move-load-functions-into-pipeline
35 changed files
CHANGELOG.md
@@ -1,11 +1,17 @@
 # Changelog
 
-## [3.12.2-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.12.2-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)
 
 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.12.1...HEAD)
 
+### 📖 Documentation
+
+- Update website with more studios [\#3554](https://github.com/pypeclub/OpenPype/pull/3554)
+- Documentation: Update publishing dev docs [\#3549](https://github.com/pypeclub/OpenPype/pull/3549)
+
 **🚀 Enhancements**
 
+- Maya: add additional validators to Settings [\#3540](https://github.com/pypeclub/OpenPype/pull/3540)
 - General: Interactive console in cli [\#3526](https://github.com/pypeclub/OpenPype/pull/3526)
 - Ftrack: Automatic daily review session creation can define trigger hour [\#3516](https://github.com/pypeclub/OpenPype/pull/3516)
 - Ftrack: add source into Note [\#3509](https://github.com/pypeclub/OpenPype/pull/3509)
@@ -20,8 +26,15 @@
 **🐛 Bug fixes**
 
+- Remove invalid submodules from `/vendor` [\#3557](https://github.com/pypeclub/OpenPype/pull/3557)
+- General: Remove hosts filter on integrator plugins [\#3556](https://github.com/pypeclub/OpenPype/pull/3556)
+- Settings: Clean default values of environments [\#3550](https://github.com/pypeclub/OpenPype/pull/3550)
+- Module interfaces: Fix import error [\#3547](https://github.com/pypeclub/OpenPype/pull/3547)
+- Workfiles tool: Show of tool and it's flags [\#3539](https://github.com/pypeclub/OpenPype/pull/3539)
+- General: Create workfile documents works again [\#3538](https://github.com/pypeclub/OpenPype/pull/3538)
 - Additional fixes for powershell scripts [\#3525](https://github.com/pypeclub/OpenPype/pull/3525)
 - Maya: Added wrapper around cmds.setAttr [\#3523](https://github.com/pypeclub/OpenPype/pull/3523)
+- Nuke: double slate [\#3521](https://github.com/pypeclub/OpenPype/pull/3521)
 - General: Fix hash of centos oiio archive [\#3519](https://github.com/pypeclub/OpenPype/pull/3519)
 - Maya: Renderman display output fix [\#3514](https://github.com/pypeclub/OpenPype/pull/3514)
 - TrayPublisher: Simple creation enhancements and fixes [\#3513](https://github.com/pypeclub/OpenPype/pull/3513)
@@ -31,8 +44,12 @@
 **🔀 Refactored code**
 
+- Refactor Integrate Asset [\#3530](https://github.com/pypeclub/OpenPype/pull/3530)
 - General: Client docstrings cleanup [\#3529](https://github.com/pypeclub/OpenPype/pull/3529)
+- General: Get current context document functions [\#3522](https://github.com/pypeclub/OpenPype/pull/3522)
+- Kitsu: Use query function from client [\#3496](https://github.com/pypeclub/OpenPype/pull/3496)
 - TimersManager: Use query functions [\#3495](https://github.com/pypeclub/OpenPype/pull/3495)
+- Deadline: Use query functions [\#3466](https://github.com/pypeclub/OpenPype/pull/3466)
 
 ## [3.12.1](https://github.com/pypeclub/OpenPype/tree/3.12.1) (2022-07-13)
 
@@ -57,7 +74,6 @@
 - Windows installer: Clean old files and add version subfolder [\#3445](https://github.com/pypeclub/OpenPype/pull/3445)
 - Blender: Bugfix - Set fps properly on open [\#3426](https://github.com/pypeclub/OpenPype/pull/3426)
 - Hiero: Add custom scripts menu [\#3425](https://github.com/pypeclub/OpenPype/pull/3425)
-- Blender: pre pyside install for all platforms [\#3400](https://github.com/pypeclub/OpenPype/pull/3400)
 
 **🐛 Bug fixes**
 
@@ -95,34 +111,19 @@
 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.0-nightly.3...3.12.0)
 
-### 📖 Documentation
-
-- Fix typo in documentation: pyenv on mac [\#3417](https://github.com/pypeclub/OpenPype/pull/3417)
-- Linux: update OIIO package [\#3401](https://github.com/pypeclub/OpenPype/pull/3401)
-
 **🚀 Enhancements**
 
 - Webserver: Added CORS middleware [\#3422](https://github.com/pypeclub/OpenPype/pull/3422)
-- Attribute Defs UI: Files widget show what is allowed to drop in [\#3411](https://github.com/pypeclub/OpenPype/pull/3411)
 
 **🐛 Bug fixes**
 
 - NewPublisher: Fix subset name change on change of creator plugin [\#3420](https://github.com/pypeclub/OpenPype/pull/3420)
 - Bug: fix invalid avalon import [\#3418](https://github.com/pypeclub/OpenPype/pull/3418)
-- Nuke: Fix keyword argument in query function [\#3414](https://github.com/pypeclub/OpenPype/pull/3414)
-- Houdini: fix loading and updating vbd/bgeo sequences [\#3408](https://github.com/pypeclub/OpenPype/pull/3408)
-- Nuke: Collect representation files based on Write [\#3407](https://github.com/pypeclub/OpenPype/pull/3407)
-- General: Filter representations before integration start [\#3398](https://github.com/pypeclub/OpenPype/pull/3398)
-- Maya: look collector typo [\#3392](https://github.com/pypeclub/OpenPype/pull/3392)
 
 **🔀 Refactored code**
 
 - Unreal: Use client query functions [\#3421](https://github.com/pypeclub/OpenPype/pull/3421)
 - General: Move editorial lib to pipeline [\#3419](https://github.com/pypeclub/OpenPype/pull/3419)
-- Kitsu: renaming to plural func sync\_all\_projects [\#3397](https://github.com/pypeclub/OpenPype/pull/3397)
-- Houdini: Use client query functions [\#3395](https://github.com/pypeclub/OpenPype/pull/3395)
-- Hiero: Use client query functions [\#3393](https://github.com/pypeclub/OpenPype/pull/3393)
-- Nuke: Use client query functions [\#3391](https://github.com/pypeclub/OpenPype/pull/3391)
 
 ## [3.11.1](https://github.com/pypeclub/OpenPype/tree/3.11.1) (2022-06-20)
@@ -4,7 +4,6 @@ import logging
 import pyblish.api
 
-from openpype import lib
 from openpype.lib import register_event_callback
 from openpype.pipeline import (
     register_loader_plugin_path,
@@ -14,6 +13,7 @@ from openpype.pipeline import (
     AVALON_CONTAINER_ID,
 )
 from openpype.pipeline.load import get_outdated_containers
+from openpype.pipeline.context_tools import get_current_project_asset
 import openpype.hosts.harmony
 import openpype.hosts.harmony.api as harmony
@@ -49,7 +49,9 @@ def get_asset_settings():
         dict: Scene data.
 
     """
-    asset_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset()
+    asset_data = asset_doc["data"]
    fps = asset_data.get("fps")
    frame_start = asset_data.get("frameStart")
    frame_end = asset_data.get("frameEnd")
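The `get_asset_settings` hunk above replaces `lib.get_asset()["data"]` with a `get_current_project_asset()` call, keeping the same document shape. A minimal, self-contained sketch of reading scene settings out of such an asset document (the `{"data": {...}}` structure is assumed from the diff, not from any API reference):

```python
# Sketch: extract scene settings from an asset document, as in the diff above.
# The document shape is an assumption based on the shown code.
def get_scene_settings(asset_doc):
    asset_data = asset_doc["data"]
    # .get() returns None for keys the asset does not define.
    return {
        "fps": asset_data.get("fps"),
        "frameStart": asset_data.get("frameStart"),
        "frameEnd": asset_data.get("frameEnd"),
    }

doc = {"data": {"fps": 25, "frameStart": 1001, "frameEnd": 1100}}
settings = get_scene_settings(doc)
```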
@@ -55,6 +55,10 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
 
     def process(self, instance):
         """Plugin entry point."""
 
+        # TODO 'get_asset_settings' could expect asset document as argument
+        #   which is available on 'context.data["assetEntity"]'
+        #   - the same approach can be used in 'ValidateSceneSettingsRepair'
         expected_settings = harmony.get_asset_settings()
         self.log.info("scene settings from DB:".format(expected_settings))
@@ -10,6 +10,7 @@ import qargparse
 
 import openpype.api as openpype
 from openpype.pipeline import LoaderPlugin, LegacyCreator
+from openpype.pipeline.context_tools import get_current_project_asset
 from . import lib
 
 log = openpype.Logger().get_logger(__name__)
@@ -484,7 +485,7 @@ class ClipLoader:
 
         """
         asset_name = self.context["representation"]["context"]["asset"]
-        asset_doc = openpype.get_asset(asset_name)
+        asset_doc = get_current_project_asset(asset_name)
         log.debug("__ asset_doc: {}".format(pformat(asset_doc)))
         self.data["assetData"] = asset_doc["data"]
@@ -5,8 +5,8 @@ from contextlib import contextmanager
 import six
 
 from openpype.client import get_asset_by_name
-from openpype.api import get_asset
 from openpype.pipeline import legacy_io
+from openpype.pipeline.context_tools import get_current_project_asset
 
 import hou
@@ -16,7 +16,7 @@ log = logging.getLogger(__name__)
 
 def get_asset_fps():
     """Return current asset fps."""
-    return get_asset()["data"].get("fps")
+    return get_current_project_asset()["data"].get("fps")
 
 
 def set_id(node, unique_id, overwrite=False):
@@ -23,7 +23,6 @@ from openpype.client import (
     get_last_versions,
     get_representation_by_name
 )
-from openpype import lib
 from openpype.api import get_anatomy_settings
 from openpype.pipeline import (
     legacy_io,
@@ -33,6 +32,7 @@ from openpype.pipeline import (
     load_container,
     registered_host,
 )
+from openpype.pipeline.context_tools import get_current_project_asset
 from .commands import reset_frame_range
 
 
@@ -2174,7 +2174,7 @@ def reset_scene_resolution():
     project_name = legacy_io.active_project()
     project_doc = get_project(project_name)
     project_data = project_doc["data"]
-    asset_data = lib.get_asset()["data"]
+    asset_data = get_current_project_asset()["data"]
 
     # Set project resolution
     width_key = "resolutionWidth"
@@ -2208,7 +2208,8 @@ def set_context_settings():
     project_name = legacy_io.active_project()
     project_doc = get_project(project_name)
     project_data = project_doc["data"]
-    asset_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset(fields=["data.fps"])
+    asset_data = asset_doc.get("data", {})
 
     # Set project fps
     fps = asset_data.get("fps", project_data.get("fps", 25))
@@ -2233,7 +2234,7 @@ def validate_fps():
 
     """
 
-    fps = lib.get_asset()["data"]["fps"]
+    fps = get_current_project_asset(fields=["data.fps"])["data"]["fps"]
     # TODO(antirotor): This is hack as for framerates having multiple
     # decimal places. FTrack is ceiling decimal values on
     # fps to two decimal places but Maya 2019+ is reporting those fps
@@ -3051,8 +3052,9 @@ def update_content_on_context_change():
     This will update scene content to match new asset on context change
     """
     scene_sets = cmds.listSets(allSets=True)
-    new_asset = legacy_io.Session["AVALON_ASSET"]
-    new_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset()
+    new_asset = asset_doc["name"]
+    new_data = asset_doc["data"]
     for s in scene_sets:
         try:
             if cmds.getAttr("{}.id".format(s)) == "pyblish.avalon.instance":
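In `set_context_settings` above, the fps lookup falls back from the asset data to the project data and finally to a hard default of 25. That chain can be sketched with plain dicts (the values here are illustrative):

```python
# Fallback chain for fps, mirroring the line in the diff:
#   asset data -> project data -> hard default of 25.
asset_data = {}                 # asset document defines no fps
project_data = {"fps": 24}      # project-level value wins as fallback

fps = asset_data.get("fps", project_data.get("fps", 25))
# When the asset defines fps, it takes precedence over both fallbacks.
```

Note that the inner `project_data.get("fps", 25)` is always evaluated, which is harmless for a dict lookup but would matter if the fallback were an expensive query.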
@@ -15,13 +15,13 @@ from openpype.hosts.maya.api import (
 from openpype.lib import requests_get
 from openpype.api import (
     get_system_settings,
-    get_project_settings,
-    get_asset)
+    get_project_settings)
 from openpype.modules import ModulesManager
 from openpype.pipeline import (
     CreatorError,
     legacy_io,
 )
+from openpype.pipeline.context_tools import get_current_project_asset
 
 
 class CreateRender(plugin.Creator):
@@ -413,7 +413,7 @@ class CreateRender(plugin.Creator):
             prefix,
             type="string")
 
-        asset = get_asset()
+        asset = get_current_project_asset()
 
         if renderer == "arnold":
             # set format to exr
@@ -2,8 +2,8 @@ import maya.cmds as cmds
 
 import pyblish.api
 import openpype.api
-from openpype import lib
 import openpype.hosts.maya.api.lib as mayalib
+from openpype.pipeline.context_tools import get_current_project_asset
 from math import ceil
 
 
@@ -41,7 +41,9 @@ class ValidateMayaUnits(pyblish.api.ContextPlugin):
         # now flooring the value?
         fps = float_round(context.data.get('fps'), 2, ceil)
 
-        asset_fps = lib.get_asset()["data"]["fps"]
+        # TODO repace query with using 'context.data["assetEntity"]'
+        asset_doc = get_current_project_asset()
+        asset_fps = asset_doc["data"]["fps"]
 
         self.log.info('Units (linear): {0}'.format(linearunits))
         self.log.info('Units (angular): {0}'.format(angularunits))
@@ -91,5 +93,7 @@ class ValidateMayaUnits(pyblish.api.ContextPlugin):
         cls.log.debug(current_linear)
 
         cls.log.info("Setting time unit to match project")
-        asset_fps = lib.get_asset()["data"]["fps"]
+        # TODO repace query with using 'context.data["assetEntity"]'
+        asset_doc = get_current_project_asset()
+        asset_fps = asset_doc["data"]["fps"]
         mayalib.set_scene_fps(asset_fps)
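The validator above rounds the scene fps up to two decimal places (`float_round(..., 2, ceil)`) before comparing it with the asset fps, because FTrack ceils fps values while Maya reports more precision. A hypothetical stand-in for that rounding step (OpenPype's actual `float_round` helper is not shown in this diff):

```python
from math import ceil

def round_up(value, decimals):
    # Round up to a fixed number of decimal places,
    # e.g. Maya's 23.976023976 becomes FTrack's 23.98.
    factor = 10 ** decimals
    return ceil(value * factor) / factor

scene_fps = round_up(23.976023976, 2)  # 23.98
```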
@@ -6,7 +6,7 @@ from openpype.pipeline import PublishXmlValidationError
 
 
 class ValidateReviewSubsetUniqueness(pyblish.api.ContextPlugin):
-    """Validates that nodes has common root."""
+    """Validates that review subset has unique name."""
 
     order = openpype.api.ValidateContentsOrder
     hosts = ["maya"]
@@ -17,7 +17,7 @@ class ValidateReviewSubsetUniqueness(pyblish.api.ContextPlugin):
         subset_names = []
 
         for instance in context:
-            self.log.info("instance:: {}".format(instance.data))
+            self.log.debug("Instance: {}".format(instance.data))
             if instance.data.get('publish'):
                 subset_names.append(instance.data.get('subset'))
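The plugin above gathers the subset names of all publishable instances and must then verify they are unique. One common way to surface duplicates in such a list (an illustrative pattern, not the plugin's actual check):

```python
from collections import Counter

# Hypothetical subset names collected from instances.
subset_names = ["reviewMain", "reviewMain", "reviewWide"]

# Names appearing more than once are the uniqueness violations.
duplicates = [
    name for name, count in Counter(subset_names).items() if count > 1
]
```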
@@ -4,8 +4,7 @@ import openpype.api
 
 
 class ValidateSetdressRoot(pyblish.api.InstancePlugin):
-    """
-    """
+    """Validate if set dress top root node is published."""
 
     order = openpype.api.ValidateContentsOrder
     label = "SetDress Root"
@@ -24,7 +24,6 @@ from openpype.api import (
     BuildWorkfile,
     get_version_from_path,
     get_workdir_data,
-    get_asset,
     get_current_project_settings,
 )
 from openpype.tools.utils import host_tools
@@ -40,6 +39,7 @@ from openpype.pipeline import (
     legacy_io,
     Anatomy,
 )
+from openpype.pipeline.context_tools import get_current_project_asset
 
 from . import gizmo_menu
 
@@ -1766,7 +1766,7 @@ class WorkfileSettings(object):
             kwargs.get("asset_name")
             or legacy_io.Session["AVALON_ASSET"]
         )
-        self._asset_entity = get_asset(self._asset)
+        self._asset_entity = get_current_project_asset(self._asset)
         self._root_node = root_node or nuke.root()
         self._nodes = self.get_nodes(nodes=nodes)
@@ -1,7 +1,6 @@
 import pyblish.api
 
-from openpype.client import get_project, get_asset_by_id
-from openpype import lib
+from openpype.client import get_project, get_asset_by_id, get_asset_by_name
 from openpype.pipeline import legacy_io
 
 
@@ -17,10 +16,11 @@ class ValidateScript(pyblish.api.InstancePlugin):
 
     def process(self, instance):
         ctx_data = instance.context.data
-        asset_name = ctx_data["asset"]
-        asset = lib.get_asset(asset_name)
-        asset_data = asset["data"]
         project_name = legacy_io.active_project()
+        asset_name = ctx_data["asset"]
+        # TODO repace query with using 'instance.data["assetEntity"]'
+        asset = get_asset_by_name(project_name, asset_name)
+        asset_data = asset["data"]
 
         # These attributes will be checked
         attributes = [
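After the query above, the validator compares context values against the asset document for a list of attributes. A generic sketch of that comparison pattern (the function name and data are illustrative, not the plugin's actual code):

```python
def mismatched_attributes(ctx_data, asset_data, attributes):
    # Return the attributes whose context value differs from the
    # value stored on the asset document.
    return [
        attr for attr in attributes
        if attr in asset_data and ctx_data.get(attr) != asset_data[attr]
    ]

ctx = {"fps": 25, "frameStart": 1001}
asset = {"fps": 24, "frameStart": 1001}
bad = mismatched_attributes(ctx, asset, ["fps", "frameStart"])
# bad -> ["fps"]
```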
@@ -4,11 +4,11 @@ import uuid
 import qargparse
 from Qt import QtWidgets, QtCore
 
-import openpype.api as pype
 from openpype.pipeline import (
     LegacyCreator,
     LoaderPlugin,
 )
+from openpype.pipeline.context_tools import get_current_project_asset
 from openpype.hosts import resolve
 from . import lib
 
@@ -375,7 +375,7 @@ class ClipLoader:
 
         """
         asset_name = self.context["representation"]["context"]["asset"]
-        self.data["assetData"] = pype.get_asset(asset_name)["data"]
+        self.data["assetData"] = get_current_project_asset(asset_name)["data"]
 
     def load(self):
         # create project bin for the media to be imported into
@@ -19,6 +19,7 @@ import os
 import opentimelineio as otio
 import pyblish.api
 from openpype import lib as plib
+from openpype.pipeline.context_tools import get_current_project_asset
 
 
 class OTIO_View(pyblish.api.Action):
@@ -116,7 +117,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
         if extension == ".edl":
             # EDL has no frame rate embedded so needs explicit
             # frame rate else 24 is asssumed.
-            kwargs["rate"] = plib.get_asset()["data"]["fps"]
+            kwargs["rate"] = get_current_project_asset()["data"]["fps"]
 
         instance.data["otio_timeline"] = otio.adapters.read_from_file(
             file_path, **kwargs)
@@ -1,8 +1,12 @@
 import os
+from copy import deepcopy
+
 import opentimelineio as otio
 import pyblish.api
 
 from openpype import lib as plib
-from copy import deepcopy
+from openpype.pipeline.context_tools import get_current_project_asset
 
 
 class CollectInstances(pyblish.api.InstancePlugin):
     """Collect instances from editorial's OTIO sequence"""
@@ -48,7 +52,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
 
         # get timeline otio data
         timeline = instance.data["otio_timeline"]
-        fps = plib.get_asset()["data"]["fps"]
+        fps = get_current_project_asset()["data"]["fps"]
 
         tracks = timeline.each_child(
             descended_from_type=otio.schema.Track
@@ -3,8 +3,8 @@ import re
 import pyblish.api
 
 import openpype.api
-from openpype import lib
 from openpype.pipeline import PublishXmlValidationError
+from openpype.pipeline.context_tools import get_current_project_asset
 
 
 class ValidateFrameRange(pyblish.api.InstancePlugin):
@@ -27,7 +27,8 @@ class ValidateFrameRange(pyblish.api.InstancePlugin):
                 for pattern in self.skip_timelines_check):
             self.log.info("Skipping for {} task".format(instance.data["task"]))
 
-        asset_data = lib.get_asset(instance.data["asset"])["data"]
+        # TODO repace query with using 'instance.data["assetEntity"]'
+        asset_data = get_current_project_asset(instance.data["asset"])["data"]
         frame_start = asset_data["frameStart"]
         frame_end = asset_data["frameEnd"]
         handle_start = asset_data["handleStart"]
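The frame-range values read above (start, end, and handles) are typically combined into the full expected range. A small illustrative computation with made-up values, following the key names shown in the diff:

```python
# Hypothetical asset data using the keys read in the validator above.
asset_data = {
    "frameStart": 1001,
    "frameEnd": 1100,
    "handleStart": 10,
    "handleEnd": 10,
}

# Full range including handles; frames are counted inclusively.
full_start = asset_data["frameStart"] - asset_data["handleStart"]
full_end = asset_data["frameEnd"] + asset_data["handleEnd"]
frame_count = full_end - full_start + 1
```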
@@ -8,13 +8,13 @@ from unreal import EditorAssetLibrary
 from unreal import MovieSceneSkeletalAnimationTrack
 from unreal import MovieSceneSkeletalAnimationSection
 
+from openpype.pipeline.context_tools import get_current_project_asset
 from openpype.pipeline import (
     get_representation_path,
     AVALON_CONTAINER_ID
 )
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline
-from openpype.api import get_asset
 
 
 class AnimationFBXLoader(plugin.Loader):
@@ -53,6 +53,8 @@ class AnimationFBXLoader(plugin.Loader):
         if not actor:
             return None
 
+        asset_doc = get_current_project_asset(fields=["data.fps"])
+
         task.set_editor_property('filename', self.fname)
         task.set_editor_property('destination_path', asset_dir)
         task.set_editor_property('destination_name', asset_name)
@@ -80,7 +82,7 @@ class AnimationFBXLoader(plugin.Loader):
         task.options.anim_sequence_import_data.set_editor_property(
             'use_default_sample_rate', False)
         task.options.anim_sequence_import_data.set_editor_property(
-            'custom_sample_rate', get_asset()["data"].get("fps"))
+            'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
         task.options.anim_sequence_import_data.set_editor_property(
             'import_custom_attribute', True)
         task.options.anim_sequence_import_data.set_editor_property(
@@ -246,6 +248,7 @@ class AnimationFBXLoader(plugin.Loader):
     def update(self, container, representation):
         name = container["asset_name"]
         source_path = get_representation_path(representation)
+        asset_doc = get_current_project_asset(fields=["data.fps"])
         destination_path = container["namespace"]
 
         task = unreal.AssetImportTask()
@@ -279,7 +282,7 @@ class AnimationFBXLoader(plugin.Loader):
         task.options.anim_sequence_import_data.set_editor_property(
             'use_default_sample_rate', False)
         task.options.anim_sequence_import_data.set_editor_property(
-            'custom_sample_rate', get_asset()["data"].get("fps"))
+            'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
         task.options.anim_sequence_import_data.set_editor_property(
             'import_custom_attribute', True)
         task.options.anim_sequence_import_data.set_editor_property(
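Note the loader switches from indexing `get_asset()["data"]` to the chained `asset_doc.get("data", {}).get("fps")`, which tolerates a document that lacks the `data` key entirely. The difference in plain Python:

```python
# Hypothetical asset document that is missing its "data" section.
asset_doc = {"name": "shot010"}

# Chained .get() degrades to None instead of raising.
fps = asset_doc.get("data", {}).get("fps")

# Direct indexing on the same document raises KeyError.
raised = False
try:
    asset_doc["data"]["fps"]
except KeyError:
    raised = True
```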
@@ -20,7 +20,7 @@ from openpype.pipeline import (
     AVALON_CONTAINER_ID,
     legacy_io,
 )
-from openpype.api import get_asset
+from openpype.pipeline.context_tools import get_current_project_asset
 from openpype.hosts.unreal.api import plugin
 from openpype.hosts.unreal.api import pipeline as unreal_pipeline

@@ -225,6 +225,7 @@ class LayoutLoader(plugin.Loader):

             anim_path = f"{asset_dir}/animations/{anim_file_name}"

+            asset_doc = get_current_project_asset()
             # Import animation
             task = unreal.AssetImportTask()
             task.options = unreal.FbxImportUI()
@@ -259,7 +260,7 @@ class LayoutLoader(plugin.Loader):
             task.options.anim_sequence_import_data.set_editor_property(
                 'use_default_sample_rate', False)
             task.options.anim_sequence_import_data.set_editor_property(
-                'custom_sample_rate', get_asset()["data"].get("fps"))
+                'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
             task.options.anim_sequence_import_data.set_editor_property(
                 'import_custom_attribute', True)
             task.options.anim_sequence_import_data.set_editor_property(
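The change in both Unreal loaders replaces `get_asset()["data"].get("fps")` with `asset_doc.get("data", {}).get("fps")`. Beyond sourcing the document from the new helper, the chained `.get()` with a default also tolerates a document without a `"data"` key. A minimal sketch with hypothetical asset documents:

```python
# Hypothetical asset documents illustrating the two access patterns.
asset_with_data = {"data": {"fps": 25}}
asset_without_data = {}

# Old pattern: indexing "data" raises KeyError when the key is missing.
try:
    fps = asset_without_data["data"].get("fps")
except KeyError:
    fps = None

# New pattern: chained .get() with a default dict degrades to None.
fps_ok = asset_with_data.get("data", {}).get("fps")
fps_missing = asset_without_data.get("data", {}).get("fps")
```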
@@ -204,7 +204,7 @@ def any_outdated():
     return any_outdated_containers()


-@with_pipeline_io
+@deprecated("openpype.pipeline.context_tools.get_current_project_asset")
 def get_asset(asset_name=None):
     """ Returning asset document from database by its name.

@@ -217,15 +217,9 @@ def get_asset(asset_name=None):
         (MongoDB document)
     """

-    project_name = legacy_io.active_project()
-    if not asset_name:
-        asset_name = legacy_io.Session["AVALON_ASSET"]
-
-    asset_document = get_asset_by_name(project_name, asset_name)
-    if not asset_document:
-        raise TypeError("Entity \"{}\" was not found in DB".format(asset_name))
-
-    return asset_document
+    from openpype.pipeline.context_tools import get_current_project_asset
+
+    return get_current_project_asset(asset_name=asset_name)


 def get_system_general_anatomy_data(system_settings=None):
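The hunk above swaps `@with_pipeline_io` for a `@deprecated(...)` decorator carrying the import path of the replacement function. OpenPype's own decorator is not shown in this diff; a minimal sketch of the pattern (a hypothetical stand-in, assuming it emits a `DeprecationWarning` naming the replacement) looks like:

```python
import functools
import warnings


def deprecated(new_destination):
    """Hypothetical sketch of a deprecation decorator: warn on each call,
    pointing callers at the replacement function's import path."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                "Call to deprecated function '{}'. Use '{}' instead.".format(
                    func.__name__, new_destination),
                category=DeprecationWarning,
                stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator


@deprecated("openpype.pipeline.context_tools.get_current_project_asset")
def get_asset_legacy():
    # Placeholder body standing in for the real database lookup.
    return {"name": "shot010"}
```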
openpype/lib/file_transaction.py (new file, +171)
@@ -0,0 +1,171 @@
+import os
+import logging
+import sys
+import errno
+import six
+
+from openpype.lib import create_hard_link
+
+# this is needed until speedcopy for linux is fixed
+if sys.platform == "win32":
+    from speedcopy import copyfile
+else:
+    from shutil import copyfile
+
+
+class FileTransaction(object):
+    """
+
+    The file transaction is a three step process.
+
+    1) Rename any existing files to a "temporary backup" during `process()`
+    2) Copy the files to final destination during `process()`
+    3) Remove any backed up files (*no rollback possible!) during `finalize()`
+
+    Step 3 is done during `finalize()`. If not called the .bak files will
+    remain on disk.
+
+    These steps try to ensure that we don't overwrite half of any existing
+    files e.g. if they are currently in use.
+
+    Note:
+        A regular filesystem is *not* a transactional file system and even
+        though this implementation tries to produce a 'safe copy' with a
+        potential rollback do keep in mind that it's inherently unsafe due
+        to how filesystem works and a myriad of things could happen during
+        the transaction that break the logic. A file storage could go down,
+        permissions could be changed, other machines could be moving or
+        writing files. A lot can happen.
+
+    Warning:
+        Any folders created during the transfer will not be removed.
+
+    """
+
+    MODE_COPY = 0
+    MODE_HARDLINK = 1
+
+    def __init__(self, log=None):
+        if log is None:
+            log = logging.getLogger("FileTransaction")
+
+        self.log = log
+
+        # The transfer queue
+        # todo: make this an actual FIFO queue?
+        self._transfers = {}
+
+        # Destination file paths that a file was transferred to
+        self._transferred = []
+
+        # Backup file location mapping to original locations
+        self._backup_to_original = {}
+
+    def add(self, src, dst, mode=MODE_COPY):
+        """Add a new file to transfer queue"""
+        opts = {"mode": mode}
+
+        src = os.path.abspath(src)
+        dst = os.path.abspath(dst)
+
+        if dst in self._transfers:
+            queued_src = self._transfers[dst][0]
+            if src == queued_src:
+                self.log.debug("File transfer was already "
+                               "in queue: {} -> {}".format(src, dst))
+                return
+            else:
+                self.log.warning("File transfer in queue replaced..")
+                self.log.debug("Removed from queue: "
+                               "{} -> {}".format(queued_src, dst))
+        self.log.debug("Added to queue: {} -> {}".format(src, dst))
+
+        self._transfers[dst] = (src, opts)
+
+    def process(self):
+        # Backup any existing files
+        for dst in self._transfers.keys():
+            if os.path.exists(dst):
+                # Backup original file
+                # todo: add timestamp or uuid to ensure unique
+                backup = dst + ".bak"
+                self._backup_to_original[backup] = dst
+                self.log.debug("Backup existing file: "
+                               "{} -> {}".format(dst, backup))
+                os.rename(dst, backup)
+
+        # Copy the files to transfer
+        for dst, (src, opts) in self._transfers.items():
+            self._create_folder_for_file(dst)
+
+            if opts["mode"] == self.MODE_COPY:
+                self.log.debug("Copying file ... {} -> {}".format(src, dst))
+                copyfile(src, dst)
+            elif opts["mode"] == self.MODE_HARDLINK:
+                self.log.debug("Hardlinking file ... {} -> {}".format(src,
+                                                                      dst))
+                create_hard_link(src, dst)
+
+            self._transferred.append(dst)
+
+    def finalize(self):
+        # Delete any backed up files
+        for backup in self._backup_to_original.keys():
+            try:
+                os.remove(backup)
+            except OSError:
+                self.log.error("Failed to remove backup file: "
+                               "{}".format(backup),
+                               exc_info=True)
+
+    def rollback(self):
+        errors = 0
+
+        # Rollback any transferred files
+        for path in self._transferred:
+            try:
+                os.remove(path)
+            except OSError:
+                errors += 1
+                self.log.error("Failed to rollback created file: "
+                               "{}".format(path),
+                               exc_info=True)
+
+        # Rollback the backups
+        for backup, original in self._backup_to_original.items():
+            try:
+                os.rename(backup, original)
+            except OSError:
+                errors += 1
+                self.log.error("Failed to restore original file: "
+                               "{} -> {}".format(backup, original),
+                               exc_info=True)
+
+        if errors:
+            self.log.error("{} errors occurred during "
+                           "rollback.".format(errors), exc_info=True)
+            six.reraise(*sys.exc_info())
+
+    @property
+    def transferred(self):
+        """Return the processed transfers destination paths"""
+        return list(self._transferred)
+
+    @property
+    def backups(self):
+        """Return the backup file paths"""
+        return list(self._backup_to_original.keys())
+
+    def _create_folder_for_file(self, path):
+        dirname = os.path.dirname(path)
+        try:
+            os.makedirs(dirname)
+        except OSError as e:
+            if e.errno == errno.EEXIST:
+                pass
+            else:
+                self.log.critical("An unexpected error occurred.")
+                six.reraise(*sys.exc_info())
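The three-step pattern the new `FileTransaction` class implements (backup-rename, copy, remove backups) can be re-enacted standalone, without OpenPype's `speedcopy`/`create_hard_link` helpers, using only `os` and `shutil`:

```python
import os
import shutil
import tempfile

# Standalone re-enactment of the FileTransaction steps. Paths are
# throwaway temp files; plain shutil.copyfile stands in for speedcopy.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "source.txt")
dst = os.path.join(workdir, "published.txt")

with open(src, "w") as f:
    f.write("v002")
with open(dst, "w") as f:
    f.write("v001")  # pre-existing file at the destination

# Step 1: rename the existing destination to a temporary backup
backup = dst + ".bak"
os.rename(dst, backup)

# Step 2: copy the new file to the final destination
shutil.copyfile(src, dst)

# Step 3 (finalize): remove the backup; rollback would rename it back
os.remove(backup)

with open(dst) as f:
    result = f.read()
```

If step 2 fails, the backup still holds the original file, which is what makes `rollback()` possible; once step 3 runs, that safety net is gone.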
@@ -10,7 +10,12 @@ import pyblish.api
 from pyblish.lib import MessageHandler

 import openpype
-from openpype.client import version_is_latest
+from openpype.client import (
+    get_project,
+    get_asset_by_id,
+    get_asset_by_name,
+    version_is_latest,
+)
 from openpype.modules import load_modules, ModulesManager
 from openpype.settings import get_project_settings
 from openpype.lib import filter_pyblish_plugins

@@ -241,29 +246,7 @@ def registered_host():


 def deregister_host():
-    _registered_host["_"] = default_host()
+    _registered_host["_"] = None


-def default_host():
-    """A default host, in place of anything better
-
-    This may be considered as reference for the
-    interface a host must implement. It also ensures
-    that the system runs, even when nothing is there
-    to support it.
-
-    """
-
-    host = types.ModuleType("defaultHost")
-
-    def ls():
-        return list()
-
-    host.__dict__.update({
-        "ls": ls
-    })
-
-    return host
-
-
 def debug_host():
@@ -307,6 +290,52 @@ def debug_host():
     return host


+def get_current_project(fields=None):
+    """Helper function to get project document based on global Session.
+
+    This function should be called only in process where host is installed.
+
+    Returns:
+        dict: Project document.
+        None: Project is not set.
+    """
+
+    project_name = legacy_io.active_project()
+    return get_project(project_name, fields=fields)
+
+
+def get_current_project_asset(asset_name=None, asset_id=None, fields=None):
+    """Helper function to get asset document based on global Session.
+
+    This function should be called only in process where host is installed.
+
+    Asset is found out based on passed asset name or id (not both). Asset name
+    is not used for filtering if asset id is passed. When both asset name and
+    id are missing then asset name from current process is used.
+
+    Args:
+        asset_name (str): Name of asset used for filter.
+        asset_id (Union[str, ObjectId]): Asset document id. If entered then
+            is used as only filter.
+        fields (Union[List[str], None]): Limit returned data of asset documents
+            to specific keys.
+
+    Returns:
+        dict: Asset document.
+        None: Asset is not set or not exist.
+    """
+
+    project_name = legacy_io.active_project()
+    if asset_id:
+        return get_asset_by_id(project_name, asset_id, fields=fields)
+
+    if not asset_name:
+        asset_name = legacy_io.Session.get("AVALON_ASSET")
+        # Skip if is not set even on context
+        if not asset_name:
+            return None
+    return get_asset_by_name(project_name, asset_name, fields=fields)
+
+
 def is_representation_from_latest(representation):
     """Return whether the representation is from latest version
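The resolution order of the new `get_current_project_asset` helper (explicit id wins, then explicit name, then the session's `AVALON_ASSET`, else `None`) can be checked in isolation. The dicts and `session` below are stand-ins for the real database queries and `legacy_io.Session`:

```python
# Fake lookups standing in for get_asset_by_id / get_asset_by_name.
DOCS_BY_ID = {"a1": {"_id": "a1", "name": "hero"}}
DOCS_BY_NAME = {"hero": {"_id": "a1", "name": "hero"}}


def resolve_asset(session, asset_name=None, asset_id=None):
    """Sketch of the precedence logic: id, then name, then session name."""
    if asset_id:
        return DOCS_BY_ID.get(asset_id)
    if not asset_name:
        asset_name = session.get("AVALON_ASSET")
        # Skip if not set even on context
        if not asset_name:
            return None
    return DOCS_BY_NAME.get(asset_name)
```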
@@ -285,36 +285,34 @@ class ExtractReviewSlate(openpype.api.Extractor):
                audio_channels,
                audio_sample_rate,
                audio_channel_layout,
+               input_frame_rate
            )
            # replace slate with silent slate for concat
            slate_v_path = slate_silent_path

-        # create ffmpeg concat text file path
-        conc_text_file = input_file.replace(ext, "") + "_concat" + ".txt"
-        conc_text_path = os.path.join(
-            os.path.normpath(stagingdir), conc_text_file)
-        _remove_at_end.append(conc_text_path)
-        self.log.debug("__ conc_text_path: {}".format(conc_text_path))
-
-        new_line = "\n"
-        with open(conc_text_path, "w") as conc_text_f:
-            conc_text_f.writelines([
-                "file {}".format(
-                    slate_v_path.replace("\\", "/")),
-                new_line,
-                "file {}".format(input_path.replace("\\", "/"))
-            ])
-
-        # concat slate and videos together
+        # concat slate and videos together with concat filter
+        # this will reencode the output
+        if input_audio:
+            fmap = [
+                "-filter_complex",
+                "[0:v] [0:a] [1:v] [1:a] concat=n=2:v=1:a=1 [v] [a]",
+                "-map", '[v]',
+                "-map", '[a]'
+            ]
+        else:
+            fmap = [
+                "-filter_complex",
+                "[0:v] [1:v] concat=n=2:v=1:a=0 [v]",
+                "-map", '[v]'
+            ]
+
         concat_args = [
             ffmpeg_path,
             "-y",
-            "-f", "concat",
-            "-safe", "0",
-            "-i", conc_text_path,
-            "-c", "copy",
+            "-i", slate_v_path,
+            "-i", input_path,
         ]
+        concat_args.extend(fmap)
         if offset_timecode:
             concat_args.extend(["-timecode", offset_timecode])
         # NOTE: Added because of OP Atom demuxers
@@ -322,12 +320,18 @@ class ExtractReviewSlate(openpype.api.Extractor):
         # - keep format of output
         if format_args:
             concat_args.extend(format_args)

+        if codec_args:
+            concat_args.extend(codec_args)
+
         # Use arguments from ffmpeg preset
         source_ffmpeg_cmd = repre.get("ffmpeg_cmd")
         if source_ffmpeg_cmd:
             copy_args = (
                 "-metadata",
                 "-metadata:s:v:0",
+                "-b:v",
+                "-b:a",
             )
             args = source_ffmpeg_cmd.split(" ")
             for indx, arg in enumerate(args):
@@ -335,12 +339,14 @@ class ExtractReviewSlate(openpype.api.Extractor):
                 concat_args.append(arg)
                 # assumes arg has one parameter
                 concat_args.append(args[indx + 1])

         # add final output path
         concat_args.append(output_path)

         # ffmpeg concat subprocess
         self.log.debug(
-            "Executing concat: {}".format(" ".join(concat_args))
+            "Executing concat filter: {}".format
+            (" ".join(concat_args))
         )
         openpype.api.run_subprocess(
             concat_args, logger=self.log
@@ -488,9 +494,10 @@ class ExtractReviewSlate(openpype.api.Extractor):
        audio_channels,
        audio_sample_rate,
        audio_channel_layout,
+       input_frame_rate
    ):
        # Get duration of one frame in micro seconds
-       items = audio_sample_rate.split("/")
+       items = input_frame_rate.split("/")
        if len(items) == 1:
            one_frame_duration = 1.0 / float(items[0])
        elif len(items) == 2:
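The hunks above replace the concat *demuxer* (a text file of inputs joined with `-c copy`) with the concat *filter*, which re-encodes but handles mismatched inputs. The shape of the argument list the new code assembles can be sketched standalone; the paths below are placeholders:

```python
# Placeholder inputs standing in for the extractor's real paths.
ffmpeg_path = "ffmpeg"
slate_v_path = "slate.mov"
input_path = "review.mov"
input_audio = True

# Mirror of the new branch: with audio, both video and audio streams of
# slate (input 0) and source (input 1) are concatenated and mapped out.
if input_audio:
    fmap = [
        "-filter_complex",
        "[0:v] [0:a] [1:v] [1:a] concat=n=2:v=1:a=1 [v] [a]",
        "-map", "[v]",
        "-map", "[a]",
    ]
else:
    fmap = [
        "-filter_complex",
        "[0:v] [1:v] concat=n=2:v=1:a=0 [v]",
        "-map", "[v]",
    ]

concat_args = [ffmpeg_path, "-y", "-i", slate_v_path, "-i", input_path]
concat_args.extend(fmap)
command = " ".join(concat_args)
```

Unlike `-f concat ... -c copy`, this command re-encodes, which is why the format and codec arguments are now appended after `fmap`.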
openpype/plugins/publish/integrate.py (new file, +908)
@@ -0,0 +1,908 @@
+import os
+import logging
+import sys
+import copy
+import clique
+import six
+
+from bson.objectid import ObjectId
+from pymongo import DeleteMany, ReplaceOne, InsertOne, UpdateOne
+import pyblish.api
+
+import openpype.api
+from openpype.lib.profiles_filtering import filter_profiles
+from openpype.lib.file_transaction import FileTransaction
+from openpype.pipeline import legacy_io
+from openpype.pipeline.publish import KnownPublishError
+
+log = logging.getLogger(__name__)
+
+
+def assemble(files):
+    """Convenience `clique.assemble` wrapper for files of a single collection.
+
+    Unlike `clique.assemble` this wrapper does not allow more than a single
+    Collection nor any remainder files. Errors will be raised when not only
+    a single collection is assembled.
+
+    Returns:
+        clique.Collection: A single sequence Collection
+
+    Raises:
+        ValueError: Error is raised when files do not result in a single
+            collected Collection.
+
+    """
+    # todo: move this to lib?
+    # Get the sequence as a collection. The files must be of a single
+    # sequence and have no remainder outside of the collections.
+    patterns = [clique.PATTERNS["frames"]]
+    collections, remainder = clique.assemble(files,
+                                             minimum_items=1,
+                                             patterns=patterns)
+    if not collections:
+        raise ValueError("No collections found in files: "
+                         "{}".format(files))
+    if remainder:
+        raise ValueError("Files found not detected as part"
+                         " of a sequence: {}".format(remainder))
+    if len(collections) > 1:
+        raise ValueError("Files in sequence are not part of a"
+                         " single sequence collection: "
+                         "{}".format(collections))
+    return collections[0]
+
+
+def get_instance_families(instance):
+    """Get all families of the instance"""
+    # todo: move this to lib?
+    family = instance.data.get("family")
+    families = []
+    if family:
+        families.append(family)
+
+    for _family in (instance.data.get("families") or []):
+        if _family not in families:
+            families.append(_family)
+
+    return families
+
+
+def get_frame_padded(frame, padding):
+    """Return frame number as string with `padding` amount of padded zeros"""
+    return "{frame:0{padding}d}".format(padding=padding, frame=frame)
+
+
+def get_first_frame_padded(collection):
+    """Return first frame as padded number from `clique.Collection`"""
+    start_frame = next(iter(collection.indexes))
+    return get_frame_padded(start_frame, padding=collection.padding)
+
+
+class IntegrateAsset(pyblish.api.InstancePlugin):
+    """Register publish in the database and transfer files to destinations.
+
+    Steps:
+        1) Register the subset and version
+        2) Transfer the representation files to the destination
+        3) Register the representation
+
+    Requires:
+        instance.data['representations'] - must be a list and each member
+        must be a dictionary with following data:
+            'files': list of filenames for sequence, string for single file.
+                Only the filename is allowed, without the folder path.
+            'stagingDir': "path/to/folder/with/files"
+            'name': representation name (usually the same as extension)
+            'ext': file extension
+        optional data
+            "frameStart"
+            "frameEnd"
+            'fps'
+            "data": additional metadata for each representation.
+    """
+
+    label = "Integrate Asset"
+    order = pyblish.api.IntegratorOrder
+    families = ["workfile",
+                "pointcache",
+                "camera",
+                "animation",
+                "model",
+                "mayaAscii",
+                "mayaScene",
+                "setdress",
+                "layout",
+                "ass",
+                "vdbcache",
+                "scene",
+                "vrayproxy",
+                "vrayscene_layer",
+                "render",
+                "prerender",
+                "imagesequence",
+                "review",
+                "rendersetup",
+                "rig",
+                "plate",
+                "look",
+                "audio",
+                "yetiRig",
+                "yeticache",
+                "nukenodes",
+                "gizmo",
+                "source",
+                "matchmove",
+                "image",
+                "assembly",
+                "fbx",
+                "textures",
+                "action",
+                "harmony.template",
+                "harmony.palette",
+                "editorial",
+                "background",
+                "camerarig",
+                "redshiftproxy",
+                "effect",
+                "xgen",
+                "hda",
+                "usd",
+                "staticMesh",
+                "skeletalMesh",
+                "mvLook",
+                "mvUsd",
+                "mvUsdComposition",
+                "mvUsdOverride",
+                "simpleUnrealTexture"
+                ]
+    exclude_families = ["clip", "render.farm"]
+    default_template_name = "publish"
+
+    # Representation context keys that should always be written to
+    # the database even if not used by the destination template
+    db_representation_context_keys = [
+        "project", "asset", "task", "subset", "version", "representation",
+        "family", "hierarchy", "username"
+    ]
+    skip_host_families = []
+
+    def process(self, instance):
+        if self._temp_skip_instance_by_settings(instance):
+            return
+
+        # Mark instance as processed for legacy integrator
+        instance.data["processedWithNewIntegrator"] = True
+
+        # Instance should be integrated on a farm
+        if instance.data.get("farm"):
+            self.log.info(
+                "Instance is marked to be processed on farm. Skipping")
+            return
+
+        filtered_repres = self.filter_representations(instance)
+        # Skip instance if there are no representations to integrate
+        #   all representations should not be integrated
+        if not filtered_repres:
+            self.log.warning((
+                "Skipping, there are no representations"
+                " to integrate for instance {}"
+            ).format(instance.data["family"]))
+            return
+
+        # Exclude instances that also contain families from exclude families
+        families = set(get_instance_families(instance))
+        exclude = families & set(self.exclude_families)
+        if exclude:
+            self.log.debug("Instance not integrated due to exclude "
+                           "families found: {}".format(", ".join(exclude)))
+            return
+
+        file_transactions = FileTransaction(log=self.log)
+        try:
+            self.register(instance, file_transactions, filtered_repres)
+        except Exception:
+            # clean destination
+            # todo: preferably we'd also rollback *any* changes to the database
+            file_transactions.rollback()
+            self.log.critical("Error when registering", exc_info=True)
+            six.reraise(*sys.exc_info())
+
+        # Finalizing can't rollback safely so no use for moving it to
+        # the try, except.
+        file_transactions.finalize()
+
+    def _temp_skip_instance_by_settings(self, instance):
+        """Decide if instance will be processed with new or legacy integrator.
+
+        This is a temporary solution until we test all usecases with new
+        (this) integrator plugin.
+        """
+        host_name = instance.context.data["hostName"]
+        instance_family = instance.data["family"]
+        instance_families = set(instance.data.get("families") or [])
+
+        skip = False
+        for item in self.skip_host_families:
+            if host_name not in item["host"]:
+                continue
+
+            families = set(item["families"])
+            if instance_family in families:
+                skip = True
+                break
+
+            for family in instance_families:
+                if family in families:
+                    skip = True
+                    break
+
+            if skip:
+                break
+
+        if skip:
+            self.log.debug("Instance is marked to be skipped by settings.")
+        return skip
+
+    def filter_representations(self, instance):
+        # Prepare representations that should be integrated
+        repres = instance.data.get("representations")
+        # Raise error if instance doesn't have any representations
+        if not repres:
+            raise KnownPublishError(
+                "Instance {} has no representations to integrate".format(
+                    instance.data["family"]
+                )
+            )
+
+        # Validate type of stored representations
+        if not isinstance(repres, (list, tuple)):
+            raise TypeError(
+                "Instance 'files' must be a list, got: {0} {1}".format(
+                    str(type(repres)), str(repres)
+                )
+            )
+
+        # Filter representations
+        filtered_repres = []
+        for repre in repres:
+            if "delete" in repre.get("tags", []):
+                continue
+            filtered_repres.append(repre)
+
+        return filtered_repres
+
+    def register(self, instance, file_transactions, filtered_repres):
+        instance_stagingdir = instance.data.get("stagingDir")
+        if not instance_stagingdir:
+            self.log.info((
+                "{0} is missing reference to staging directory."
+                " Will try to get it from representation."
+            ).format(instance))
+        else:
+            self.log.debug(
+                "Establishing staging directory "
+                "@ {0}".format(instance_stagingdir)
+            )
+
+        template_name = self.get_template_name(instance)
+
+        subset, subset_writes = self.prepare_subset(instance)
+        version, version_writes = self.prepare_version(instance, subset)
+        instance.data["versionEntity"] = version
+
+        # Get existing representations (if any)
+        existing_repres_by_name = {
+            repres["name"].lower(): repres for repres in legacy_io.find(
+                {
+                    "parent": version["_id"],
+                    "type": "representation"
+                },
+                # Only care about id and name of existing representations
+                projection={"_id": True, "name": True}
+            )
+        }
+
+        # Prepare all representations
+        prepared_representations = []
+        for repre in filtered_repres:
+            # todo: reduce/simplify what is returned from this function
+            prepared = self.prepare_representation(
+                repre,
+                template_name,
+                existing_repres_by_name,
+                version,
+                instance_stagingdir,
+                instance)
+
+            for src, dst in prepared["transfers"]:
+                # todo: add support for hardlink transfers
+                file_transactions.add(src, dst)
+
+            prepared_representations.append(prepared)
+
+        # Each instance can also have pre-defined transfers not explicitly
+        # part of a representation - like texture resources used by a
+        # .ma representation. Those destination paths are pre-defined, etc.
+        # todo: should we move or simplify this logic?
+        resource_destinations = set()
+        for src, dst in instance.data.get("transfers", []):
+            file_transactions.add(src, dst, mode=FileTransaction.MODE_COPY)
+            resource_destinations.add(os.path.abspath(dst))
+
+        for src, dst in instance.data.get("hardlinks", []):
+            file_transactions.add(src, dst, mode=FileTransaction.MODE_HARDLINK)
+            resource_destinations.add(os.path.abspath(dst))
+
+        # Bulk write to the database
+        # We write the subset and version to the database before the File
+        # Transaction to reduce the chances of another publish trying to
+        # publish to the same version number since that chance can greatly
+        # increase if the file transaction takes a long time.
+        legacy_io.bulk_write(subset_writes + version_writes)
+        self.log.info("Subset {subset[name]} and Version {version[name]} "
+                      "written to database..".format(subset=subset,
+                                                     version=version))
+
+        # Process all file transfers of all integrations now
+        self.log.debug("Integrating source files to destination ...")
+        file_transactions.process()
+        self.log.debug(
+            "Backed up existing files: {}".format(file_transactions.backups))
+        self.log.debug(
+            "Transferred files: {}".format(file_transactions.transferred))
+        self.log.debug("Retrieving Representation Site Sync information ...")
+
+        # Get the accessible sites for Site Sync
+        modules_by_name = instance.context.data["openPypeModules"]
+        sync_server_module = modules_by_name["sync_server"]
+        sites = sync_server_module.compute_resource_sync_sites(
+            project_name=instance.data["projectEntity"]["name"]
+        )
+        self.log.debug("Sync Server Sites: {}".format(sites))
+
+        # Compute the resource file infos once (files belonging to the
+        # version instance instead of an individual representation) so
+        # we can re-use those file infos per representation
+        anatomy = instance.context.data["anatomy"]
+        resource_file_infos = self.get_files_info(resource_destinations,
+                                                  sites=sites,
+                                                  anatomy=anatomy)
+
+        # Finalize the representations now the published files are integrated
+        # Get 'files' info for representations and its attached resources
+        representation_writes = []
+        new_repre_names_low = set()
+        for prepared in prepared_representations:
+            representation = prepared["representation"]
+            transfers = prepared["transfers"]
+            destinations = [dst for src, dst in transfers]
+            representation["files"] = self.get_files_info(
+                destinations, sites=sites, anatomy=anatomy
+            )
+
+            # Add the version resource file infos to each representation
+            representation["files"] += resource_file_infos
+
+            # Set up representation for writing to the database. Since
+            # we *might* be overwriting an existing entry if the version
+            # already existed we'll use ReplaceOne with `upsert=True`
+            representation_writes.append(ReplaceOne(
+                filter={"_id": representation["_id"]},
+                replacement=representation,
+                upsert=True
+            ))

+            new_repre_names_low.add(representation["name"].lower())
+
+        # Delete any existing representations that didn't get any new data
+        # if the instance is not set to append mode
+        if not instance.data.get("append", False):
+            delete_names = set()
+            for name, existing_repres in existing_repres_by_name.items():
+                if name not in new_repre_names_low:
+                    # We add the exact representation name because `name` is
+                    # lowercase for name matching only and not in the database
+                    delete_names.add(existing_repres["name"])
+            if delete_names:
+                representation_writes.append(DeleteMany(
+                    filter={
+                        "parent": version["_id"],
+                        "name": {"$in": list(delete_names)}
+                    }
+                ))
+
+        # Write representations to the database
|
||||||
|
legacy_io.bulk_write(representation_writes)
|
||||||
|
|
||||||
|
# Backwards compatibility
|
||||||
|
# todo: can we avoid the need to store this?
|
||||||
|
instance.data["published_representations"] = {
|
||||||
|
p["representation"]["_id"]: p for p in prepared_representations
|
||||||
|
}
|
||||||
|
|
||||||
|
self.log.info("Registered {} representations"
|
||||||
|
"".format(len(prepared_representations)))
|
||||||
|
|
||||||
|
    def prepare_subset(self, instance):
        asset = instance.data.get("assetEntity")
        subset_name = instance.data["subset"]
        self.log.debug("Subset: {}".format(subset_name))

        # Get existing subset if it exists
        subset = legacy_io.find_one({
            "type": "subset",
            "parent": asset["_id"],
            "name": subset_name
        })

        # Define subset data
        data = {
            "families": get_instance_families(instance)
        }

        subset_group = instance.data.get("subsetGroup")
        if subset_group:
            data["subsetGroup"] = subset_group

        bulk_writes = []
        if subset is None:
            # Create a new subset
            self.log.info("Subset '%s' not found, creating ..." % subset_name)
            subset = {
                "_id": ObjectId(),
                "schema": "openpype:subset-3.0",
                "type": "subset",
                "name": subset_name,
                "data": data,
                "parent": asset["_id"]
            }
            bulk_writes.append(InsertOne(subset))

        else:
            # Update existing subset data with new data and set in database.
            # We also change the found subset in-place so we don't need to
            # re-query the subset afterwards
            subset["data"].update(data)
            bulk_writes.append(UpdateOne(
                {"type": "subset", "_id": subset["_id"]},
                {"$set": {
                    "data": subset["data"]
                }}
            ))

        self.log.info("Prepared subset: {}".format(subset_name))
        return subset, bulk_writes
    def prepare_version(self, instance, subset):

        version_number = instance.data["version"]

        version = {
            "schema": "openpype:version-3.0",
            "type": "version",
            "parent": subset["_id"],
            "name": version_number,
            "data": self.create_version_data(instance)
        }

        existing_version = legacy_io.find_one({
            'type': 'version',
            'parent': subset["_id"],
            'name': version_number
        }, projection={"_id": True})

        if existing_version:
            self.log.debug("Updating existing version ...")
            version["_id"] = existing_version["_id"]
        else:
            self.log.debug("Creating new version ...")
            version["_id"] = ObjectId()

        bulk_writes = [ReplaceOne(
            filter={"_id": version["_id"]},
            replacement=version,
            upsert=True
        )]

        self.log.info("Prepared version: v{0:03d}".format(version["name"]))

        return version, bulk_writes
    def prepare_representation(self, repre,
                               template_name,
                               existing_repres_by_name,
                               version,
                               instance_stagingdir,
                               instance):

        # pre-flight validations
        if repre["ext"].startswith("."):
            raise ValueError("Extension must not start with a dot '.': "
                             "{}".format(repre["ext"]))

        if repre.get("transfers"):
            raise ValueError("Representation is not allowed to have transfers "
                             "data before integration. They are computed in "
                             "the integrator. "
                             "Got: {}".format(repre["transfers"]))

        # create template data for Anatomy
        template_data = copy.deepcopy(instance.data["anatomyData"])

        # required representation keys
        files = repre['files']
        template_data["representation"] = repre["name"]
        template_data["ext"] = repre["ext"]

        # optionals
        # retrieve additional anatomy data from representation if it exists
        for key, anatomy_key in {
            # Representation Key: Anatomy data key
            "resolutionWidth": "resolution_width",
            "resolutionHeight": "resolution_height",
            "fps": "fps",
            "outputName": "output",
            "originalBasename": "originalBasename"
        }.items():
            # Allow to take value from representation;
            # if not found also consider instance.data
            if key in repre:
                value = repre[key]
            elif key in instance.data:
                value = instance.data[key]
            else:
                continue
            template_data[anatomy_key] = value

        if repre.get('stagingDir'):
            stagingdir = repre['stagingDir']
        else:
            # Fall back to instance staging dir if not explicitly
            # set for representation in the instance
            self.log.debug("Representation uses instance staging dir: "
                           "{}".format(instance_stagingdir))
            stagingdir = instance_stagingdir
        if not stagingdir:
            raise ValueError("No staging directory set for representation: "
                             "{}".format(repre))

        self.log.debug("Anatomy template name: {}".format(template_name))
        anatomy = instance.context.data['anatomy']
        template = os.path.normpath(anatomy.templates[template_name]["path"])

        is_udim = bool(repre.get("udim"))
        is_sequence_representation = isinstance(files, (list, tuple))
        if is_sequence_representation:
            # Collection of files (sequence)
            assert not any(os.path.isabs(fname) for fname in files), (
                "Given file names contain full paths"
            )

            src_collection = assemble(files)

            # If the representation has `frameStart` set it renumbers the
            # frame indices of the published collection. It will start from
            # that `frameStart` index instead. Thus if that frame start
            # differs from the collection we want to shift the destination
            # frame indices from the source collection.
            destination_indexes = list(src_collection.indexes)
            destination_padding = len(get_first_frame_padded(src_collection))
            if repre.get("frameStart") is not None and not is_udim:
                index_frame_start = int(repre.get("frameStart"))

                render_template = anatomy.templates[template_name]
                # todo: should we ALWAYS manage the frame padding even when
                #       not having `frameStart` set?
                frame_start_padding = int(
                    render_template.get(
                        "frame_padding",
                        render_template.get("padding")
                    )
                )

                # Shift destination sequence to the start frame
                src_start_frame = next(iter(src_collection.indexes))
                shift = index_frame_start - src_start_frame
                if shift:
                    destination_indexes = [
                        frame + shift for frame in destination_indexes
                    ]
                destination_padding = frame_start_padding

            # To construct the destination template with anatomy we require
            # a Frame or UDIM tile set for the template data. We use the first
            # index of the destination for that because that could've shifted
            # from the source indexes, etc.
            first_index_padded = get_frame_padded(frame=destination_indexes[0],
                                                  padding=destination_padding)
            if is_udim:
                # UDIM representations handle ranges in a different manner
                template_data["udim"] = first_index_padded
            else:
                template_data["frame"] = first_index_padded

            # Construct destination collection from template
            anatomy_filled = anatomy.format(template_data)
            template_filled = anatomy_filled[template_name]["path"]
            repre_context = template_filled.used_values
            self.log.debug("Template filled: {}".format(str(template_filled)))
            dst_collection = assemble([os.path.normpath(template_filled)])

            # Update the destination indexes and padding
            dst_collection.indexes.clear()
            dst_collection.indexes.update(set(destination_indexes))
            dst_collection.padding = destination_padding
            assert (
                len(src_collection.indexes) == len(dst_collection.indexes)
            ), "This is a bug"

            # Multiple file transfers
            transfers = []
            for src_file_name, dst in zip(src_collection, dst_collection):
                src = os.path.join(stagingdir, src_file_name)
                transfers.append((src, dst))

        else:
            # Single file
            fname = files
            assert not os.path.isabs(fname), (
                "Given file name is a full path"
            )

            # Manage anatomy template data
            template_data.pop("frame", None)
            if is_udim:
                template_data["udim"] = repre["udim"][0]

            # Construct destination filepath from template
            anatomy_filled = anatomy.format(template_data)
            template_filled = anatomy_filled[template_name]["path"]
            repre_context = template_filled.used_values
            dst = os.path.normpath(template_filled)

            # Single file transfer
            src = os.path.join(stagingdir, fname)
            transfers = [(src, dst)]

        # todo: Are we sure the assumption each representation
        #       ends up in the same folder is valid?
        if not instance.data.get("publishDir"):
            instance.data["publishDir"] = (
                anatomy_filled
                [template_name]
                ["folder"]
            )

        for key in self.db_representation_context_keys:
            # Also add these values to the context even if not used by the
            # destination template
            value = template_data.get(key)
            if not value:
                continue
            repre_context[key] = template_data[key]

        # Explicitly store the full list even though template data might
        # have a different value because it uses just a single udim tile
        if repre.get("udim"):
            repre_context["udim"] = repre.get("udim")  # store list

        # Use previous representation's id if there is a name match
        existing = existing_repres_by_name.get(repre["name"].lower())
        if existing:
            repre_id = existing["_id"]
        else:
            repre_id = ObjectId()

        # Backwards compatibility:
        # Store first transferred destination as published path data
        # todo: can we remove this?
        # todo: We shouldn't change data that makes its way back into
        #       instance.data[] until we know the publish actually succeeded
        #       otherwise `published_path` might not actually be valid?
        published_path = transfers[0][1]
        repre["published_path"] = published_path  # Backwards compatibility

        # todo: `repre` is not the actual `representation` entity;
        #       we should simplify/clarify the difference between data above
        #       and the actual representation entity for the database
        data = repre.get("data", {})
        data.update({'path': published_path, 'template': template})
        representation = {
            "_id": repre_id,
            "schema": "openpype:representation-2.0",
            "type": "representation",
            "parent": version["_id"],
            "name": repre['name'],
            "data": data,

            # Imprint shortcut to context for performance reasons.
            "context": repre_context
        }

        # todo: simplify/streamline which additional data makes its way into
        #       the representation context
        if repre.get("outputName"):
            representation["context"]["output"] = repre['outputName']

        if is_sequence_representation and repre.get("frameStart") is not None:
            representation['context']['frame'] = template_data["frame"]

        return {
            "representation": representation,
            "anatomy_data": template_data,
            "transfers": transfers,
            # todo: avoid the need for 'published_files' used by Integrate
            #       Hero Version (backwards compatibility)
            "published_files": [transfer[1] for transfer in transfers]
        }
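The sequence renumbering above can be sketched in isolation. Assuming a plain list of source frame indices, shifting to a requested `frameStart` keeps the frame spacing and only offsets every index (the function name and signature here are illustrative, not the plugin's API):

```python
def shift_frame_indexes(src_indexes, frame_start=None):
    """Return destination frame indices, renumbered to begin at
    `frame_start` when given (hypothetical helper mirroring the
    integrator's sequence handling)."""
    indexes = sorted(src_indexes)
    if frame_start is None:
        return indexes
    shift = frame_start - indexes[0]
    # Apply the same offset to every frame so gaps are preserved
    return [frame + shift for frame in indexes]


print(shift_frame_indexes([1001, 1002, 1005], frame_start=1))  # -> [1, 2, 5]
```

Note that, as in the plugin, a gap in the source collection (1005 here) survives the shift unchanged.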
    def create_version_data(self, instance):
        """Create the data dictionary for the version

        Args:
            instance: the current instance being published

        Returns:
            dict: the required information for version["data"]
        """

        context = instance.context

        # create relative source path for DB
        if "source" in instance.data:
            source = instance.data["source"]
        else:
            source = context.data["currentFile"]
            anatomy = instance.context.data["anatomy"]
            source = self.get_rootless_path(anatomy, source)
        self.log.debug("Source: {}".format(source))

        version_data = {
            "families": get_instance_families(instance),
            "time": context.data["time"],
            "author": context.data["user"],
            "source": source,
            "comment": context.data.get("comment"),
            "machine": context.data.get("machine"),
            "fps": instance.data.get("fps", context.data.get("fps"))
        }

        # todo: preferably we wouldn't need this "if dict" etc. logic and
        #       instead be able to rely on what the input value is if it's set
        intent_value = context.data.get("intent")
        if intent_value and isinstance(intent_value, dict):
            intent_value = intent_value.get("value")

        if intent_value:
            version_data["intent"] = intent_value

        # Include optional data if present in instance.data
        optionals = [
            "frameStart", "frameEnd", "step", "handles",
            "handleEnd", "handleStart", "sourceHashes"
        ]
        for key in optionals:
            if key in instance.data:
                version_data[key] = instance.data[key]

        # Include instance.data["versionData"] directly
        version_data_instance = instance.data.get('versionData')
        if version_data_instance:
            version_data.update(version_data_instance)

        return version_data
    def get_template_name(self, instance):
        """Return anatomy template name to use for integration"""
        # Define publish template name from profiles
        filter_criteria = self.get_profile_filter_criteria(instance)
        template_name_profiles = self._get_template_name_profiles(instance)
        profile = filter_profiles(
            template_name_profiles,
            filter_criteria,
            logger=self.log
        )

        if profile:
            return profile["template_name"]
        return self.default_template_name

    def _get_template_name_profiles(self, instance):
        """Receive profiles for publish template keys.

        Reuse template name profiles from the legacy integrator. The goal is
        to move the profile settings out of plugin settings, but until that
        happens we want to be able to set them in one place without breaking
        backwards compatibility (more than once).
        """

        return (
            instance.context.data["project_settings"]
            ["global"]
            ["publish"]
            ["IntegrateAssetNew"]
            ["template_name_profiles"]
        )
    def get_profile_filter_criteria(self, instance):
        """Return filter criteria for `filter_profiles`"""
        # Anatomy data is pre-filled by Collectors
        anatomy_data = instance.data["anatomyData"]

        # Task can be optional in anatomy data
        task = anatomy_data.get("task", {})

        # Return filter criteria
        return {
            "families": anatomy_data["family"],
            "tasks": task.get("name"),
            "task_types": task.get("type"),
            "hosts": instance.context.data["hostName"],
        }
    def get_rootless_path(self, anatomy, path):
        """Return the path without its absolute root portion, if possible
        (eg. 'c:\\' or '/opt/..').

        This information is platform dependent and shouldn't be captured.
        Example:
            'c:/projects/MyProject1/Assets/publish...' >
            '{root}/MyProject1/Assets...'

        Args:
            anatomy: anatomy part from instance
            path: path (absolute)
        Returns:
            path: modified path if possible, or unmodified path
                  + warning logged
        """
        success, rootless_path = anatomy.find_root_template_from_path(path)
        if success:
            path = rootless_path
        else:
            self.log.warning((
                "Could not find root path for remapping \"{}\"."
                " This may cause issues on farm."
            ).format(path))
        return path
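The root remapping that `find_root_template_from_path` performs can be illustrated with a small stand-alone sketch. This is a simplified stand-in, not OpenPype's Anatomy implementation: it only does a case-insensitive prefix match against a plain dict of named roots.

```python
def to_rootless(path, roots):
    """Replace a known absolute root prefix with a '{root[...]}' token.
    Simplified, hypothetical sketch of Anatomy.find_root_template_from_path;
    returns (success, path) like the real helper."""
    norm = path.replace("\\", "/")
    for name, root in roots.items():
        root_norm = root.replace("\\", "/").rstrip("/")
        # Case-insensitive prefix match so 'C:/' and 'c:/' both remap
        if norm.lower().startswith(root_norm.lower() + "/"):
            return True, "{root[%s]}%s" % (name, norm[len(root_norm):])
    return False, path


print(to_rootless("c:/projects/MyProject1/Assets/x.ma", {"work": "c:/projects"}))
```

When no configured root matches, the path comes back unchanged with `success` set to False, which is when the plugin logs its farm warning.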
    def get_files_info(self, destinations, sites, anatomy):
        """Prepare 'files' info portion for representations.

        Arguments:
            destinations (list): List of transferred file destinations
            sites (list): array of published locations
            anatomy: anatomy part from instance
        Returns:
            output_resources: array of dictionaries to be added to the
                'files' key in representation
        """
        file_infos = []
        for file_path in destinations:
            file_info = self.prepare_file_info(file_path, anatomy, sites=sites)
            file_infos.append(file_info)
        return file_infos
    def prepare_file_info(self, path, anatomy, sites):
        """Prepare information for one file (asset or resource)

        Arguments:
            path: destination url of published file
            anatomy: anatomy part from instance
            sites: array of published locations,
                e.g. [{'name': 'studio', 'created_dt': date}] by default;
                expected keys: ['studio', 'site1', 'gdrive1']

        Returns:
            dict: file info dictionary
        """
        return {
            "_id": ObjectId(),
            "path": self.get_rootless_path(anatomy, path),
            "size": os.path.getsize(path),
            "hash": openpype.api.source_hash(path),
            "sites": sites
        }
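A dependency-free sketch of building one such 'files' entry follows. Note the assumptions: the real plugin hashes with `openpype.api.source_hash` (whose algorithm may differ) and stores a rootless path; a plain SHA-256 content hash and the raw path stand in here purely for illustration.

```python
import hashlib
import os


def prepare_file_info(path, sites):
    """Minimal sketch of one representation 'files' entry.
    A content hash is an assumption standing in for source_hash."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "path": path,                    # real plugin stores a rootless path
        "size": os.path.getsize(path),   # size in bytes
        "hash": digest,
        "sites": sites,                  # e.g. [{"name": "studio", ...}]
    }
```

The per-version resource entries are computed once and appended to every representation's `files` list, which is why the size/hash work is hoisted out of the representation loop.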
@@ -69,8 +69,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
         "data": additional metadata for each representation.
     """

-    label = "Integrate Asset New"
+    label = "Integrate Asset (legacy)"
-    order = pyblish.api.IntegratorOrder
+    # Make sure it happens after new integrator
+    order = pyblish.api.IntegratorOrder + 0.00001
     families = ["workfile",
                 "pointcache",
                 "camera",
@@ -101,7 +102,6 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                 "source",
                 "matchmove",
                 "image",
-                "source",
                 "assembly",
                 "fbx",
                 "textures",
@@ -142,6 +142,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
     subset_grouping_profiles = None

     def process(self, instance):
+        if instance.data.get("processedWithNewIntegrator"):
+            self.log.info("Instance was already processed with new integrator")
+            return
+
         for ef in self.exclude_families:
             if (
                 instance.data["family"] == ef or
openpype/plugins/publish/integrate_subset_group.py (new file, +98)
@@ -0,0 +1,98 @@
"""Produces instance.data["subsetGroup"] data used during integration.
|
||||||
|
|
||||||
|
Requires:
|
||||||
|
dict -> context["anatomyData"] *(pyblish.api.CollectorOrder + 0.49)
|
||||||
|
|
||||||
|
Provides:
|
||||||
|
instance -> subsetGroup (str)
|
||||||
|
|
||||||
|
"""
|
||||||
|
import pyblish.api
|
||||||
|
|
||||||
|
from openpype.lib.profiles_filtering import filter_profiles
|
||||||
|
from openpype.lib import (
|
||||||
|
prepare_template_data,
|
||||||
|
StringTemplate,
|
||||||
|
TemplateUnsolved
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class IntegrateSubsetGroup(pyblish.api.InstancePlugin):
|
||||||
|
"""Integrate Subset Group for publish."""
|
||||||
|
|
||||||
|
# Run after CollectAnatomyInstanceData
|
||||||
|
order = pyblish.api.IntegratorOrder - 0.1
|
||||||
|
label = "Subset Group"
|
||||||
|
|
||||||
|
# Attributes set by settings
|
||||||
|
subset_grouping_profiles = None
|
||||||
|
|
||||||
|
def process(self, instance):
|
||||||
|
"""Look into subset group profiles set by settings.
|
||||||
|
|
||||||
|
Attribute 'subset_grouping_profiles' is defined by OpenPype settings.
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Skip if 'subset_grouping_profiles' is empty
|
||||||
|
if not self.subset_grouping_profiles:
|
||||||
|
return
|
||||||
|
|
||||||
|
if instance.data.get("subsetGroup"):
|
||||||
|
# If subsetGroup is already set then allow that value to remain
|
||||||
|
self.log.debug((
|
||||||
|
"Skipping collect subset group due to existing value: {}"
|
||||||
|
).format(instance.data["subsetGroup"]))
|
||||||
|
return
|
||||||
|
|
||||||
|
# Skip if there is no matching profile
|
||||||
|
filter_criteria = self.get_profile_filter_criteria(instance)
|
||||||
|
profile = filter_profiles(
|
||||||
|
self.subset_grouping_profiles,
|
||||||
|
filter_criteria,
|
||||||
|
logger=self.log
|
||||||
|
)
|
||||||
|
|
||||||
|
if not profile:
|
||||||
|
return
|
||||||
|
|
||||||
|
template = profile["template"]
|
||||||
|
|
||||||
|
fill_pairs = prepare_template_data({
|
||||||
|
"family": filter_criteria["families"],
|
||||||
|
"task": filter_criteria["tasks"],
|
||||||
|
"host": filter_criteria["hosts"],
|
||||||
|
"subset": instance.data["subset"],
|
||||||
|
"renderlayer": instance.data.get("renderlayer")
|
||||||
|
})
|
||||||
|
|
||||||
|
filled_template = None
|
||||||
|
try:
|
||||||
|
filled_template = StringTemplate.format_strict_template(
|
||||||
|
template, fill_pairs
|
||||||
|
)
|
||||||
|
except (KeyError, TemplateUnsolved):
|
||||||
|
keys = fill_pairs.keys()
|
||||||
|
self.log.warning((
|
||||||
|
"Subset grouping failed. Only {} are expected in Settings"
|
||||||
|
).format(','.join(keys)))
|
||||||
|
|
||||||
|
if filled_template:
|
||||||
|
instance.data["subsetGroup"] = filled_template
|
||||||
|
|
||||||
|
def get_profile_filter_criteria(self, instance):
|
||||||
|
"""Return filter criteria for `filter_profiles`"""
|
||||||
|
# TODO: This logic is used in much more plug-ins in one way or another
|
||||||
|
# Maybe better suited for lib?
|
||||||
|
# Anatomy data is pre-filled by Collectors
|
||||||
|
anatomy_data = instance.data["anatomyData"]
|
||||||
|
|
||||||
|
# Task can be optional in anatomy data
|
||||||
|
task = anatomy_data.get("task", {})
|
||||||
|
|
||||||
|
# Return filter criteria
|
||||||
|
return {
|
||||||
|
"families": anatomy_data["family"],
|
||||||
|
"tasks": task.get("name"),
|
||||||
|
"hosts": anatomy_data["app"],
|
||||||
|
"task_types": task.get("type")
|
||||||
|
}
|
||||||
|
|
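The `filter_profiles` call above picks the profile whose filter fields match the criteria. A simplified stand-in for that matching (ignoring the fuzzy/regex rules the real `openpype.lib.profiles_filtering.filter_profiles` implements) could look like:

```python
def filter_profiles(profiles, criteria):
    """Return the first profile whose non-empty filter fields all
    contain the corresponding criteria value. Simplified, hypothetical
    sketch; the real helper also scores wildcard matches."""
    for profile in profiles:
        for key, value in criteria.items():
            allowed = profile.get(key) or []
            # An empty filter list matches any value
            if allowed and value not in allowed:
                break
        else:
            # No filter field rejected this profile
            return profile
    return None
```

With this shape, a profile of all-empty lists acts as a catch-all fallback, which matches how the settings defaults below are structured.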
@@ -159,7 +159,27 @@
                 }
             ]
         },
+        "IntegrateSubsetGroup": {
+            "subset_grouping_profiles": [
+                {
+                    "families": [],
+                    "hosts": [],
+                    "task_types": [],
+                    "tasks": [],
+                    "template": ""
+                }
+            ]
+        },
         "IntegrateAssetNew": {
+            "subset_grouping_profiles": [
+                {
+                    "families": [],
+                    "hosts": [],
+                    "task_types": [],
+                    "tasks": [],
+                    "template": ""
+                }
+            ],
             "template_name_profiles": [
                 {
                     "families": [],
@@ -202,17 +222,11 @@
                     "tasks": [],
                     "template_name": "maya2unreal"
                 }
-            ],
-            "subset_grouping_profiles": [
-                {
-                    "families": [],
-                    "hosts": [],
-                    "task_types": [],
-                    "tasks": [],
-                    "template": ""
-                }
             ]
         },
+        "IntegrateAsset": {
+            "skip_host_families": []
+        },
         "IntegrateHeroVersion": {
             "enabled": true,
             "optional": true,
@@ -205,10 +205,15 @@
             "enabled": true,
             "optional": true,
             "active": true,
-            "exclude_families": ["model", "rig", "staticMesh"]
+            "exclude_families": [
+                "model",
+                "rig",
+                "staticMesh"
+            ]
         },
         "ValidateShaderName": {
             "enabled": false,
+            "optional": true,
             "regex": "(?P<asset>.*)_(.*)_SHD"
         },
         "ValidateShadingEngine": {
@@ -222,6 +227,7 @@
         },
         "ValidateLoadedPlugin": {
             "enabled": false,
+            "optional": true,
             "whitelist_native_plugins": false,
             "authorized_plugins": []
         },
@@ -236,6 +242,7 @@
         },
         "ValidateUnrealStaticMeshName": {
             "enabled": true,
+            "optional": true,
             "validate_mesh": false,
             "validate_collision": true
         },
@@ -252,6 +259,81 @@
             "redshift_render_attributes": [],
             "renderman_render_attributes": []
         },
+        "ValidateCurrentRenderLayerIsRenderable": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateRenderImageRule": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateRenderNoDefaultCameras": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateRenderSingleCamera": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateRenderLayerAOVs": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateStepSize": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateVRayDistributedRendering": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateVrayReferencedAOVs": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateVRayTranslatorEnabled": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateVrayProxy": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateVrayProxyMembers": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateYetiRenderScriptCallbacks": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateYetiRigCacheState": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateYetiRigInputShapesInInstance": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateYetiRigSettings": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
         "ValidateModelName": {
             "enabled": false,
             "database": true,
@@ -270,6 +352,7 @@
         },
         "ValidateTransformNamingSuffix": {
             "enabled": true,
+            "optional": true,
             "SUFFIX_NAMING_TABLE": {
                 "mesh": [
                     "_GEO",
@@ -293,7 +376,7 @@
             "ALLOW_IF_NOT_IN_SUFFIX_TABLE": true
         },
         "ValidateColorSets": {
-            "enabled": false,
+            "enabled": true,
             "optional": true,
             "active": true
         },
@@ -337,6 +420,16 @@
             "optional": true,
             "active": true
         },
+        "ValidateMeshNoNegativeScale": {
+            "enabled": true,
+            "optional": false,
+            "active": true
+        },
+        "ValidateMeshNonZeroEdgeLength": {
+            "enabled": true,
+            "optional": true,
+            "active": true
+        },
         "ValidateMeshNormalsUnlocked": {
             "enabled": false,
             "optional": true,
@@ -359,22 +452,22 @@
         },
         "ValidateNoNamespace": {
             "enabled": true,
-            "optional": true,
+            "optional": false,
             "active": true
         },
         "ValidateNoNullTransforms": {
             "enabled": true,
-            "optional": true,
+            "optional": false,
             "active": true
         },
         "ValidateNoUnknownNodes": {
             "enabled": true,
-            "optional": true,
+            "optional": false,
             "active": true
         },
         "ValidateNodeNoGhosting": {
             "enabled": false,
-            "optional": true,
+            "optional": false,
             "active": true
         },
         "ValidateShapeDefaultNames": {
@@ -402,6 +495,21 @@
|
||||||
"optional": true,
|
"optional": true,
|
||||||
"active": true
|
"active": true
|
||||||
},
|
},
|
||||||
|
"ValidateNoVRayMesh": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateUnrealMeshTriangulated": {
|
||||||
|
"enabled": false,
|
||||||
|
"optional": true,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateAlembicVisibleOnly": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
"ExtractAlembic": {
|
"ExtractAlembic": {
|
||||||
"enabled": true,
|
"enabled": true,
|
||||||
"families": [
|
"families": [
|
||||||
|
|
@ -425,8 +533,34 @@
|
||||||
"optional": true,
|
"optional": true,
|
||||||
"active": true
|
"active": true
|
||||||
},
|
},
|
||||||
|
"ValidateAnimationContent": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateOutRelatedNodeIds": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateRigControllersArnoldAttributes": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateSkeletalMeshHierarchy": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateSkinclusterDeformerSet": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
"ValidateRigOutSetNodeIds": {
|
"ValidateRigOutSetNodeIds": {
|
||||||
"enabled": true,
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
"allow_history_only": false
|
"allow_history_only": false
|
||||||
},
|
},
|
||||||
"ValidateCameraAttributes": {
|
"ValidateCameraAttributes": {
|
||||||
|
|
@ -439,14 +573,44 @@
|
||||||
"optional": true,
|
"optional": true,
|
||||||
"active": true
|
"active": true
|
||||||
},
|
},
|
||||||
|
"ValidateAssemblyNamespaces": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateAssemblyModelTransforms": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
"ValidateAssRelativePaths": {
|
"ValidateAssRelativePaths": {
|
||||||
"enabled": true,
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateInstancerContent": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateInstancerFrameRanges": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateNoDefaultCameras": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false,
|
||||||
|
"active": true
|
||||||
|
},
|
||||||
|
"ValidateUnrealUpAxis": {
|
||||||
|
"enabled": false,
|
||||||
"optional": true,
|
"optional": true,
|
||||||
"active": true
|
"active": true
|
||||||
},
|
},
|
||||||
"ValidateCameraContents": {
|
"ValidateCameraContents": {
|
||||||
"enabled": true,
|
"enabled": true,
|
||||||
"optional": true,
|
"optional": false,
|
||||||
"validate_shapes": true
|
"validate_shapes": true
|
||||||
},
|
},
|
||||||
"ExtractPlayblast": {
|
"ExtractPlayblast": {
|
||||||
|
|
|
||||||
|
|
@@ -2,11 +2,7 @@
     "studio_name": "Studio name",
     "studio_code": "stu",
     "admin_password": "",
-    "environment": {
-        "__environment_keys__": {
-            "global": []
-        }
-    },
+    "environment": {},
     "log_to_server": true,
     "disk_mapping": {
         "windows": [],
@@ -528,10 +528,111 @@
 {
     "type": "dict",
     "collapsible": true,
-    "key": "IntegrateAssetNew",
-    "label": "IntegrateAssetNew",
+    "key": "IntegrateSubsetGroup",
+    "label": "Integrate Subset Group",
     "is_group": true,
     "children": [
+        {
+            "type": "list",
+            "key": "subset_grouping_profiles",
+            "label": "Subset grouping profiles",
+            "use_label_wrap": true,
+            "object_type": {
+                "type": "dict",
+                "children": [
+                    {
+                        "type": "label",
+                        "label": "Set all published instances as a part of specific group named according to 'Template'. <br>Implemented all variants of placeholders [{task},{family},{host},{subset},{renderlayer}]"
+                    },
+                    {
+                        "key": "families",
+                        "label": "Families",
+                        "type": "list",
+                        "object_type": "text"
+                    },
+                    {
+                        "type": "hosts-enum",
+                        "key": "hosts",
+                        "label": "Hosts",
+                        "multiselection": true
+                    },
+                    {
+                        "key": "task_types",
+                        "label": "Task types",
+                        "type": "task-types-enum"
+                    },
+                    {
+                        "key": "tasks",
+                        "label": "Task names",
+                        "type": "list",
+                        "object_type": "text"
+                    },
+                    {
+                        "type": "separator"
+                    },
+                    {
+                        "type": "text",
+                        "key": "template",
+                        "label": "Template"
+                    }
+                ]
+            }
+        }
+    ]
+},
+{
+    "type": "dict",
+    "collapsible": true,
+    "key": "IntegrateAssetNew",
+    "label": "IntegrateAsset (Legacy)",
+    "is_group": true,
+    "children": [
+        {
+            "type": "label",
+            "label": "<b>NOTE:</b> Subset grouping profiles settings were moved to <a href=\"settings://project_settings/global/publish/IntegrateSubsetGroup/subset_grouping_profiles\"><b>Integrate Subset Group</b></a>. Please move values there."
+        },
+        {
+            "type": "list",
+            "key": "subset_grouping_profiles",
+            "label": "Subset grouping profiles (DEPRECATED)",
+            "use_label_wrap": true,
+            "object_type": {
+                "type": "dict",
+                "children": [
+                    {
+                        "key": "families",
+                        "label": "Families",
+                        "type": "list",
+                        "object_type": "text"
+                    },
+                    {
+                        "type": "hosts-enum",
+                        "key": "hosts",
+                        "label": "Hosts",
+                        "multiselection": true
+                    },
+                    {
+                        "key": "task_types",
+                        "label": "Task types",
+                        "type": "task-types-enum"
+                    },
+                    {
+                        "key": "tasks",
+                        "label": "Task names",
+                        "type": "list",
+                        "object_type": "text"
+                    },
+                    {
+                        "type": "separator"
+                    },
+                    {
+                        "type": "text",
+                        "key": "template",
+                        "label": "Template"
+                    }
+                ]
+            }
+        },
         {
             "type": "list",
             "key": "template_name_profiles",
@@ -577,49 +678,34 @@
                 }
             ]
         }
-    },
+    }
+]
+},
+{
+    "type": "dict",
+    "collapsible": true,
+    "key": "IntegrateAsset",
+    "label": "Integrate Asset",
+    "is_group": true,
+    "children": [
     {
         "type": "list",
-        "key": "subset_grouping_profiles",
-        "label": "Subset grouping profiles",
+        "key": "skip_host_families",
+        "label": "Skip hosts and families",
         "use_label_wrap": true,
         "object_type": {
             "type": "dict",
             "children": [
                 {
-                    "type": "label",
-                    "label": "Set all published instances as a part of specific group named according to 'Template'. <br>Implemented all variants of placeholders [{task},{family},{host},{subset},{renderlayer}]"
+                    "type": "hosts-enum",
+                    "key": "host",
+                    "label": "Host"
                 },
                 {
+                    "type": "list",
                     "key": "families",
                     "label": "Families",
-                    "type": "list",
                     "object_type": "text"
-                },
-                {
-                    "type": "hosts-enum",
-                    "key": "hosts",
-                    "label": "Hosts",
-                    "multiselection": true
-                },
-                {
-                    "key": "task_types",
-                    "label": "Task types",
-                    "type": "task-types-enum"
-                },
-                {
-                    "key": "tasks",
-                    "label": "Task names",
-                    "type": "list",
-                    "object_type": "text"
-                },
-                {
-                    "type": "separator"
-                },
-                {
-                    "type": "text",
-                    "key": "template",
-                    "label": "Template"
                 }
             ]
         }
@@ -107,6 +107,11 @@
     "key": "enabled",
     "label": "Enabled"
 },
+{
+    "type": "boolean",
+    "key": "optional",
+    "label": "Optional"
+},
 {
     "type": "label",
     "label": "Shader name regex can use named capture group <b>asset</b> to validate against current asset name.<p><b>Example:</b><br/><code>^.*(?P=<asset>.+)_SHD</code></p>"
@@ -159,6 +164,11 @@
     "key": "enabled",
     "label": "Enabled"
 },
+{
+    "type": "boolean",
+    "key": "optional",
+    "label": "Optional"
+},
 {
     "type": "boolean",
     "key": "whitelist_native_plugins",
@@ -246,6 +256,11 @@
     "key": "enabled",
     "label": "Enabled"
 },
+{
+    "type": "boolean",
+    "key": "optional",
+    "label": "Optional"
+},
 {
     "type": "boolean",
     "key": "validate_mesh",
@@ -332,6 +347,72 @@
     }
 ]
 },
+{
+    "type": "schema_template",
+    "name": "template_publish_plugin",
+    "template_data": [
+        {
+            "key": "ValidateCurrentRenderLayerIsRenderable",
+            "label": "Validate Current Render Layer Has Renderable Camera"
+        },
+        {
+            "key": "ValidateRenderImageRule",
+            "label": "Validate Images File Rule (Workspace)"
+        },
+        {
+            "key": "ValidateRenderNoDefaultCameras",
+            "label": "Validate No Default Cameras Renderable"
+        },
+        {
+            "key": "ValidateRenderSingleCamera",
+            "label": "Validate Render Single Camera"
+        },
+        {
+            "key": "ValidateRenderLayerAOVs",
+            "label": "Validate Render Passes / AOVs Are Registered"
+        },
+        {
+            "key": "ValidateStepSize",
+            "label": "Validate Step Size"
+        },
+        {
+            "key": "ValidateVRayDistributedRendering",
+            "label": "VRay Distributed Rendering"
+        },
+        {
+            "key": "ValidateVrayReferencedAOVs",
+            "label": "VRay Referenced AOVs"
+        },
+        {
+            "key": "ValidateVRayTranslatorEnabled",
+            "label": "VRay Translator Settings"
+        },
+        {
+            "key": "ValidateVrayProxy",
+            "label": "VRay Proxy Settings"
+        },
+        {
+            "key": "ValidateVrayProxyMembers",
+            "label": "VRay Proxy Members"
+        },
+        {
+            "key": "ValidateYetiRenderScriptCallbacks",
+            "label": "Yeti Render Script Callbacks"
+        },
+        {
+            "key": "ValidateYetiRigCacheState",
+            "label": "Yeti Rig Cache State"
+        },
+        {
+            "key": "ValidateYetiRigInputShapesInInstance",
+            "label": "Yeti Rig Input Shapes In Instance"
+        },
+        {
+            "key": "ValidateYetiRigSettings",
+            "label": "Yeti Rig Settings"
+        }
+    ]
+},
 {
     "type": "collapsible-wrap",
     "label": "Model",
@@ -416,6 +497,11 @@
     "key": "enabled",
     "label": "Enabled"
 },
+{
+    "type": "boolean",
+    "key": "optional",
+    "label": "Optional"
+},
 {
     "type": "label",
     "label": "Validates transform suffix based on the type of its children shapes."
@@ -472,6 +558,14 @@
     "key": "ValidateMeshNonManifold",
     "label": "ValidateMeshNonManifold"
 },
+{
+    "key": "ValidateMeshNoNegativeScale",
+    "label": "Validate Mesh No Negative Scale"
+},
+{
+    "key": "ValidateMeshNonZeroEdgeLength",
+    "label": "Validate Mesh Edge Length Non Zero"
+},
 {
     "key": "ValidateMeshNormalsUnlocked",
     "label": "ValidateMeshNormalsUnlocked"
@@ -525,6 +619,18 @@
 {
     "key": "ValidateUniqueNames",
     "label": "ValidateUniqueNames"
+},
+{
+    "key": "ValidateNoVRayMesh",
+    "label": "Validate No V-Ray Proxies (VRayMesh)"
+},
+{
+    "key": "ValidateUnrealMeshTriangulated",
+    "label": "Validate if Mesh is Triangulated"
+},
+{
+    "key": "ValidateAlembicVisibleOnly",
+    "label": "Validate Alembic visible node"
 }
 ]
 },
@@ -573,6 +679,26 @@
 {
     "key": "ValidateRigControllers",
     "label": "Validate Rig Controllers"
+},
+{
+    "key": "ValidateAnimationContent",
+    "label": "Validate Animation Content"
+},
+{
+    "key": "ValidateOutRelatedNodeIds",
+    "label": "Validate Animation Out Set Related Node Ids"
+},
+{
+    "key": "ValidateRigControllersArnoldAttributes",
+    "label": "Validate Rig Controllers (Arnold Attributes)"
+},
+{
+    "key": "ValidateSkeletalMeshHierarchy",
+    "label": "Validate Skeletal Mesh Top Node"
+},
+{
+    "key": "ValidateSkinclusterDeformerSet",
+    "label": "Validate Skincluster Deformer Relationships"
 }
 ]
 },
@@ -589,6 +715,11 @@
     "key": "enabled",
     "label": "Enabled"
 },
+{
+    "type": "boolean",
+    "key": "optional",
+    "label": "Optional"
+},
 {
     "type": "boolean",
     "key": "allow_history_only",
@@ -611,9 +742,33 @@
     "key": "ValidateAssemblyName",
     "label": "Validate Assembly Name"
 },
+{
+    "key": "ValidateAssemblyNamespaces",
+    "label": "Validate Assembly Namespaces"
+},
+{
+    "key": "ValidateAssemblyModelTransforms",
+    "label": "Validate Assembly Model Transforms"
+},
 {
     "key": "ValidateAssRelativePaths",
     "label": "ValidateAssRelativePaths"
+},
+{
+    "key": "ValidateInstancerContent",
+    "label": "Validate Instancer Content"
+},
+{
+    "key": "ValidateInstancerFrameRanges",
+    "label": "Validate Instancer Cache Frame Ranges"
+},
+{
+    "key": "ValidateNoDefaultCameras",
+    "label": "Validate No Default Cameras"
+},
+{
+    "key": "ValidateUnrealUpAxis",
+    "label": "Validate Unreal Up-Axis check"
 }
 ]
 },
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-
 """Package declaring Pype version."""
-__version__ = "3.12.2-nightly.2"
+__version__ = "3.12.2-nightly.3"
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "OpenPype"
-version = "3.12.2-nightly.2" # OpenPype
+version = "3.12.2-nightly.3" # OpenPype
 description = "Open VFX and Animation pipeline with support."
 authors = ["OpenPype Team <info@openpype.io>"]
 license = "MIT License"
@@ -66,7 +66,7 @@ Another optional function is **get_current_context**. This function is handy in
 Main responsibility of create plugin is to create, update, collect and remove instance metadata and propagate changes to create context. Has access to **CreateContext** (`self.create_context`) that discovered the plugin so has also access to other creators and instances. Create plugins have a lot of responsibility so it is recommended to implement common code per host.
 
 #### *BaseCreator*
-Base implementation of creator plugin. It is not recommended to use this class as base for production plugins but rather use one of **AutoCreator** and **Creator** variants.
+Base implementation of creator plugin. It is not recommended to use this class as base for production plugins but rather use one of **HiddenCreator**, **AutoCreator** and **Creator** variants.
 
 **Abstractions**
 - **`family`** (class attr) - Tells what kind of instance will be created.
@@ -92,7 +92,7 @@ def collect_instances(self):
         self._add_instance_to_context(instance)
 ```
 
-- **`create`** (method) - Create a new object of **CreatedInstance** store its metadata to the workfile and add the instance into the created context. Failed Creating should raise **CreatorError** if an error happens that artists can fix or give them some useful information. Triggers and implementation differs for **Creator** and **AutoCreator**.
+- **`create`** (method) - Create a new object of **CreatedInstance** store its metadata to the workfile and add the instance into the created context. Failed Creating should raise **CreatorError** if an error happens that artists can fix or give them some useful information. Triggers and implementation differs for **Creator**, **HiddenCreator** and **AutoCreator**.
 
 - **`update_instances`** (method) - Update data of instances. Receives tuple with **instance** and **changes**.
 ```python
@@ -172,11 +172,11 @@ class RenderLayerCreator(Creator):
     icon = "fa5.building"
 ```
 
-- **`get_instance_attr_defs`** (method) - Attribute definitions of instance. Creator can define attribute values with default values for each instance. These attributes may affect how instances will be instance processed during publishing. Attribute defiitions can be used from `openpype.pipeline.lib.attribute_definitions` (NOTE: Will be moved to `openpype.lib.attribute_definitions` soon). Attribute definitions define basic types of values for different cases e.g. boolean, number, string, enumerator, etc. Default implementation returns **instance_attr_defs**.
+- **`get_instance_attr_defs`** (method) - Attribute definitions of instance. Creator can define attribute values with default values for each instance. These attributes may affect how instances will be instance processed during publishing. Attribute defiitions can be used from `openpype.lib.attribute_definitions`. Attribute definitions define basic types of values for different cases e.g. boolean, number, string, enumerator, etc. Default implementation returns **instance_attr_defs**.
 - **`instance_attr_defs`** (attr) - Attribute for default implementation of **get_instance_attr_defs**.
 
 ```python
-from openpype.pipeline import attribute_definitions
+from openpype.lib import attribute_definitions
 
 
 class RenderLayerCreator(Creator):
@@ -199,6 +199,20 @@ class RenderLayerCreator(Creator):
 - **`get_dynamic_data`** (method) - Can be used to extend data for subset templates which may be required in some cases.
 
 
+#### *HiddenCreator*
+Creator which is not showed in UI so artist can't trigger it directly but is available for other creators. This creator is primarily meant for cases when creation should create different types of instances. For example during editorial publishing where input is single edl file but should create 2 or more kind of instances each with different family, attributes and abilities. Arguments for creation were limited to `instance_data` and `source_data`. Data of `instance_data` should follow what is sent to other creators and `source_data` can be used to send custom data defined by main creator. It is expected that `HiddenCreator` has specific main or "parent" creator.
+
+```python
+def create(self, instance_data, source_data):
+    variant = instance_data["variant"]
+    task_name = instance_data["task"]
+    asset_name = instance_data["asset"]
+    asset_doc = get_asset_by_name(self.project_name, asset_name)
+    self.get_subset_name(
+        variant, task_name, asset_doc, self.project_name, self.host_name)
+```
+
+
 #### *AutoCreator*
 Creator that is triggered on reset of create context. Can be used for families that are expected to be created automatically without artist interaction (e.g. **workfile**). Method `create` is triggered after collecting all creators.
@@ -234,14 +248,14 @@ def create(self):
     # - variant can be filled from settings
     variant = self._variant_name
     # Only place where we can look for current context
-    project_name = io.Session["AVALON_PROJECT"]
-    asset_name = io.Session["AVALON_ASSET"]
-    task_name = io.Session["AVALON_TASK"]
-    host_name = io.Session["AVALON_APP"]
+    project_name = self.project_name
+    asset_name = legacy_io.Session["AVALON_ASSET"]
+    task_name = legacy_io.Session["AVALON_TASK"]
+    host_name = legacy_io.Session["AVALON_APP"]
 
     # Create new instance if does not exist yet
     if existing_instance is None:
-        asset_doc = io.find_one({"type": "asset", "name": asset_name})
+        asset_doc = get_asset_by_name(project_name, asset_name)
         subset_name = self.get_subset_name(
             variant, task_name, asset_doc, project_name, host_name
         )
@@ -264,7 +278,7 @@ def create(self):
         existing_instance["asset"] != asset_name
         or existing_instance["task"] != task_name
     ):
-        asset_doc = io.find_one({"type": "asset", "name": asset_name})
+        asset_doc = get_asset_by_name(project_name, asset_name)
         subset_name = self.get_subset_name(
             variant, task_name, asset_doc, project_name, host_name
         )
@@ -297,7 +311,8 @@ class BulkRenderCreator(Creator):
 - **`pre_create_attr_defs`** (attr) - Attribute for default implementation of **get_pre_create_attr_defs**.
 
 ```python
-from openpype.pipeline import Creator, attribute_definitions
+from openpype.lib import attribute_definitions
+from openpype.pipeline.create import Creator
 
 
 class CreateRender(Creator):
@@ -470,10 +485,8 @@ Possible attribute definitions can be found in `openpype/pipeline/lib/attribute_
 
 ```python
 import pyblish.api
-from openpype.pipeline import (
-    OpenPypePyblishPluginMixin,
-    attribute_definitions,
-)
+from openpype.lib import attribute_definitions
+from openpype.pipeline import OpenPypePyblishPluginMixin
 
 
 # Example context plugin
@@ -196,12 +196,12 @@ html[data-theme='dark'] .header-github-link::before {
     padding: 20px
 }
 
-.showcase .client {
+.showcase .studio {
     display: flex;
     justify-content: space-between;
 }
 
-.showcase .client img {
+.showcase .studio img {
     max-height: 110px;
     padding: 20px;
     max-width: 160px;
@ -65,13 +65,17 @@ const collab = [
|
||||||
image: '/img/clothcat.png',
|
image: '/img/clothcat.png',
|
||||||
infoLink: 'https://www.clothcatanimation.com/'
|
infoLink: 'https://www.clothcatanimation.com/'
|
||||||
}, {
|
}, {
|
||||||
title: 'Ellipse Studio',
|
title: 'Ellipse Animation',
|
||||||
image: '/img/ellipse-studio.png',
|
image: '/img/ellipse_animation.svg',
|
||||||
infoLink: 'http://www.dargaudmedia.com'
|
infoLink: 'http://www.ellipseanimation.com'
|
||||||
}, {
|
}, {
|
||||||
title: 'J Cube Inc',
|
title: 'J Cube Inc',
|
||||||
image: '/img/jcube_logo_bw.png',
|
image: '/img/jcube_logo_bw.png',
|
||||||
infoLink: 'https://j-cube.jp'
|
infoLink: 'https://j-cube.jp'
|
||||||
|
}, {
|
||||||
|
title: 'Normaal Animation',
|
||||||
|
image: '/img/logo_normaal.png',
|
||||||
|
infoLink: 'https://j-cube.jp'
|
||||||
}
|
}
|
||||||
];
|
];
|
||||||
|
|
||||||
|
|
@@ -153,7 +157,32 @@ const studios = [
     title: "IGG Canada",
     image: "/img/igg-logo.png",
     infoLink: "https://www.igg.com/",
-  }
+  },
+  {
+    title: "Agora Studio",
+    image: "/img/agora_studio.png",
+    infoLink: "https://agora.studio/",
+  },
+  {
+    title: "Lucan Visuals",
+    image: "/img/lucan_Logo_On_White-HR.png",
+    infoLink: "https://www.lucan.tv/",
+  },
+  {
+    title: "No Ghost",
+    image: "/img/noghost.png",
+    infoLink: "https://www.noghost.co.uk/",
+  },
+  {
+    title: "Static VFX",
+    image: "/img/staticvfx.png",
+    infoLink: "http://www.staticvfx.com/",
+  },
+  {
+    title: "Method n Madness",
+    image: "/img/methodmadness.png",
+    infoLink: "https://www.methodnmadness.com/",
+  }
 ];
 
 function Service({imageUrl, title, description}) {
@@ -166,10 +195,10 @@ function Service({imageUrl, title, description}) {
   );
 }
 
-function Client({title, image, infoLink}) {
+function Studio({title, image, infoLink}) {
   const imgUrl = useBaseUrl(image);
   return (
-    <a className="client" href={infoLink}>
+    <a className="studio" href={infoLink}>
       <img src={image} alt="" title={title}></img>
     </a>
   );
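The rename above only changes the component's name and CSS class; the markup it emits is unchanged. An illustrative plain-JS string builder (a sketch, not the real React/JSX component) makes that output concrete. Note also, visible in the hunk itself, that `imgUrl = useBaseUrl(image)` is computed but the `<img>` still reads `src={image}` directly, so `imgUrl` is unused:

```javascript
// Sketch only: the real Studio component is React/JSX; this mirrors the
// markup it renders for one {title, image, infoLink} entry.
function renderStudio({title, image, infoLink}) {
  return `<a class="studio" href="${infoLink}">` +
         `<img src="${image}" alt="" title="${title}"></a>`;
}

console.log(renderStudio({
  title: 'No Ghost',
  image: '/img/noghost.png',
  infoLink: 'https://www.noghost.co.uk/',
}));
```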
@@ -465,7 +494,7 @@ function Home() {
         <h2>Studios using openPype</h2>
         <div className="showcase">
           {studios.map((props, idx) => (
-            <Client key={idx} {...props} />
+            <Studio key={idx} {...props} />
           ))}
         </div>
       </div>
 
31
website/static/img/NoGhost_Logo_black.svg
Normal file
@@ -0,0 +1,31 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!-- Generator: Adobe Illustrator 25.3.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
+<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
+	 viewBox="0 0 1341 216" style="enable-background:new 0 0 1341 216;" xml:space="preserve">
+<style type="text/css">
+	.st0{fill:#000000;}
+</style>
+<g>
+	<path class="st0" d="M132,0.3l0,18l47,81.1c3.9,7.3,2.3,16.6-4.2,22.2c-7.6,6.5-19,5.7-25.6-1.9c-0.9-1-2.2-3.2-2.2-3.2L79.7,0.3
+		H39.3C17.8,0.3,0.4,17.7,0.4,39.2L0.3,215.8h83.8l0-17.9l-46.3-80.1c-4.6-7.4-3.3-17.3,3.4-23.2c5.5-4.9,13.6-5.7,20.2-2.5
+		c4,1.9,6.4,4.7,8.1,7.9l66.8,115.8h40.3c21.5,0,38.9-17.4,38.9-38.9V0.3H132z"/>
+	<path class="st0" d="M367.8,0.3H227.6v176c0,21.8,17.7,39.5,39.5,39.5h140.2v-176C407.3,18,389.6,0.3,367.8,0.3z M350,191.9
+		c-10.1,0-18.9-5.5-23.5-13.7l0,0L261.8,66.4c-2.9-4.3-4.6-9.5-4.6-15.1c0-14.9,12.1-27,27-27c10.7,0,20,6.2,24.3,15.3l0,0
+		l64.4,111.3c0.3,0.4,0.5,0.8,0.7,1.3l0.1,0.1l0,0c2,3.8,3.2,8.1,3.2,12.7C377,179.8,364.9,191.9,350,191.9z"/>
+	<path class="st0" d="M984.5,0.3H844.3v176c0,21.8,17.7,39.5,39.5,39.5H1024v-176C1024,18,1006.3,0.3,984.5,0.3z M966.7,191.9
+		c-10.1,0-18.9-5.5-23.5-13.7l0,0L878.6,66.4c-2.9-4.3-4.6-9.5-4.6-15.1c0-14.9,12.1-27,27-27c10.7,0,20,6.2,24.3,15.3l0,0
+		l64.2,111.3c0.3,0.4,0.5,0.8,0.7,1.3l0.1,0.1l0,0c2,3.8,3.2,8.1,3.2,12.7C993.7,179.8,981.6,191.9,966.7,191.9z"/>
+	<path class="st0" d="M554.5,96.5v17.9l28.7,49.7c0.3,0.4,0.5,0.8,0.7,1.3l0.1,0.2l0,0c1.2,2.4,1.9,5.2,1.9,8.1
+		c0,10-8.1,18.1-18.1,18.1c-6.3,0-11.8-3.2-15.1-8l0,0l-0.7-1.3c-0.1-0.1-0.1-0.2-0.2-0.3L497.4,88l0,0c-1.8-2.8-2.8-6.1-2.8-9.7
+		c0-10,8.1-18.1,18.1-18.1c0,0,0,0,0,0h95c14.9,0,26.9-12.1,26.9-26.9V0.3H529.2c0,0,0,0,0,0c-50,0-90.8,40-91.9,89.8l0,0v35.9l0,0
+		c1.2,49.8,41.9,89.7,91.9,89.7h0h105.5V96.5H554.5z"/>
+	<path class="st0" d="M748.6,0.3l0,18.2l26,45.1c1.3,2.5,2.1,5.4,2.1,8.4c0,10-8.1,18.1-18.1,18.1h-29.5c-10,0-18.1-8.1-18.1-18.1
+		V0.3h-64.2v215.5h83.8l0-17.9l-26.4-45.7c-1.3-2.5-2-5.3-2-8.2c0-10,8.1-18.1,18.1-18.1c0,0,0,0,0,0h29.5c0,0,0,0,0,0
+		c10,0,18.1,8.1,18.1,18.1l0,71.9h64.5V0.3H748.6z"/>
+	<path class="st0" d="M1269.1,60.2c0.1,0,71.6,0,71.6,0v-33c0-14.9-12.1-26.9-26.9-26.9h-93.7c-1.2,0-2.4,0-3.6,0
+		c-123.6,0-149.9,0-154.1,0c-14.9,0-26.9,12.1-26.9,26.9c0,5.3,1.6,10.3,4.2,14.5l71.2,123.3c1.4,2.6,2.3,5.6,2.3,8.8
+		c0,10-8.1,18.1-18.1,18.1c-6.6,0-12.4-3.6-15.6-8.9l-15.7-27.1h-28.3v59.9h134.5c14.9,0,26.9-12.1,26.9-26.9c0-5-1.4-9.8-3.8-13.8
+		l-71.5-123.7l0,0c-1.5-2.6-2.3-5.6-2.3-8.8c0-10,8.1-18.1,18.1-18.1c7.2,0,13.4,4.2,16.4,10.3l14.8,25.4h40.4v155.6h83.8l-0.1-60.6
+		l-39.5-68c-1.5-2.6-2.3-5.6-2.3-8.8C1251,68.3,1259.1,60.2,1269.1,60.2z"/>
+</g>
+</svg>
After Width: | Height: | Size: 2.7 KiB
BIN
website/static/img/agora_studio.png
Normal file
After Width: | Height: | Size: 131 KiB
9
website/static/img/ellipse_animation.svg
Normal file
After Width: | Height: | Size: 63 KiB
Before Width: | Height: | Size: 78 KiB  After Width: | Height: | Size: 94 KiB
BIN
website/static/img/logo_normaal.png
Normal file
After Width: | Height: | Size: 13 KiB
BIN
website/static/img/lucan_Logo_On_White-HR.png
Normal file
After Width: | Height: | Size: 76 KiB
BIN
website/static/img/methodmadness.png
Normal file
After Width: | Height: | Size: 8.4 KiB
BIN
website/static/img/noghost.png
Normal file
After Width: | Height: | Size: 22 KiB
BIN
website/static/img/staticvfx.png
Normal file
After Width: | Height: | Size: 13 KiB