Merge branch 'develop' into release/3.13.x

CHANGELOG.md

@@ -1,11 +1,17 @@
# Changelog

-## [3.12.2-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.12.2-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.12.1...HEAD)

### 📖 Documentation

- Update website with more studios [\#3554](https://github.com/pypeclub/OpenPype/pull/3554)
- Documentation: Update publishing dev docs [\#3549](https://github.com/pypeclub/OpenPype/pull/3549)

**🚀 Enhancements**

- Maya: add additional validators to Settings [\#3540](https://github.com/pypeclub/OpenPype/pull/3540)
- General: Interactive console in cli [\#3526](https://github.com/pypeclub/OpenPype/pull/3526)
- Ftrack: Automatic daily review session creation can define trigger hour [\#3516](https://github.com/pypeclub/OpenPype/pull/3516)
- Ftrack: add source into Note [\#3509](https://github.com/pypeclub/OpenPype/pull/3509)

@@ -20,8 +26,15 @@
**🐛 Bug fixes**

- Remove invalid submodules from `/vendor` [\#3557](https://github.com/pypeclub/OpenPype/pull/3557)
- General: Remove hosts filter on integrator plugins [\#3556](https://github.com/pypeclub/OpenPype/pull/3556)
- Settings: Clean default values of environments [\#3550](https://github.com/pypeclub/OpenPype/pull/3550)
- Module interfaces: Fix import error [\#3547](https://github.com/pypeclub/OpenPype/pull/3547)
- Workfiles tool: Show of tool and its flags [\#3539](https://github.com/pypeclub/OpenPype/pull/3539)
- General: Create workfile documents works again [\#3538](https://github.com/pypeclub/OpenPype/pull/3538)
- Additional fixes for powershell scripts [\#3525](https://github.com/pypeclub/OpenPype/pull/3525)
- Maya: Added wrapper around cmds.setAttr [\#3523](https://github.com/pypeclub/OpenPype/pull/3523)
- Nuke: double slate [\#3521](https://github.com/pypeclub/OpenPype/pull/3521)
- General: Fix hash of centos oiio archive [\#3519](https://github.com/pypeclub/OpenPype/pull/3519)
- Maya: Renderman display output fix [\#3514](https://github.com/pypeclub/OpenPype/pull/3514)
- TrayPublisher: Simple creation enhancements and fixes [\#3513](https://github.com/pypeclub/OpenPype/pull/3513)

@@ -31,8 +44,12 @@
**🔀 Refactored code**

- Refactor Integrate Asset [\#3530](https://github.com/pypeclub/OpenPype/pull/3530)
- General: Client docstrings cleanup [\#3529](https://github.com/pypeclub/OpenPype/pull/3529)
- General: Get current context document functions [\#3522](https://github.com/pypeclub/OpenPype/pull/3522)
- Kitsu: Use query function from client [\#3496](https://github.com/pypeclub/OpenPype/pull/3496)
- TimersManager: Use query functions [\#3495](https://github.com/pypeclub/OpenPype/pull/3495)
- Deadline: Use query functions [\#3466](https://github.com/pypeclub/OpenPype/pull/3466)

## [3.12.1](https://github.com/pypeclub/OpenPype/tree/3.12.1) (2022-07-13)

@@ -57,7 +74,6 @@
- Windows installer: Clean old files and add version subfolder [\#3445](https://github.com/pypeclub/OpenPype/pull/3445)
- Blender: Bugfix - Set fps properly on open [\#3426](https://github.com/pypeclub/OpenPype/pull/3426)
- Hiero: Add custom scripts menu [\#3425](https://github.com/pypeclub/OpenPype/pull/3425)
- Blender: pre pyside install for all platforms [\#3400](https://github.com/pypeclub/OpenPype/pull/3400)

**🐛 Bug fixes**

@@ -95,34 +111,19 @@
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.0-nightly.3...3.12.0)

### 📖 Documentation

- Fix typo in documentation: pyenv on mac [\#3417](https://github.com/pypeclub/OpenPype/pull/3417)
- Linux: update OIIO package [\#3401](https://github.com/pypeclub/OpenPype/pull/3401)

**🚀 Enhancements**

- Webserver: Added CORS middleware [\#3422](https://github.com/pypeclub/OpenPype/pull/3422)
- Attribute Defs UI: Files widget show what is allowed to drop in [\#3411](https://github.com/pypeclub/OpenPype/pull/3411)

**🐛 Bug fixes**

- NewPublisher: Fix subset name change on change of creator plugin [\#3420](https://github.com/pypeclub/OpenPype/pull/3420)
- Bug: fix invalid avalon import [\#3418](https://github.com/pypeclub/OpenPype/pull/3418)
- Nuke: Fix keyword argument in query function [\#3414](https://github.com/pypeclub/OpenPype/pull/3414)
- Houdini: fix loading and updating vbd/bgeo sequences [\#3408](https://github.com/pypeclub/OpenPype/pull/3408)
- Nuke: Collect representation files based on Write [\#3407](https://github.com/pypeclub/OpenPype/pull/3407)
- General: Filter representations before integration start [\#3398](https://github.com/pypeclub/OpenPype/pull/3398)
- Maya: look collector typo [\#3392](https://github.com/pypeclub/OpenPype/pull/3392)

**🔀 Refactored code**

- Unreal: Use client query functions [\#3421](https://github.com/pypeclub/OpenPype/pull/3421)
- General: Move editorial lib to pipeline [\#3419](https://github.com/pypeclub/OpenPype/pull/3419)
- Kitsu: renaming to plural func sync\_all\_projects [\#3397](https://github.com/pypeclub/OpenPype/pull/3397)
- Houdini: Use client query functions [\#3395](https://github.com/pypeclub/OpenPype/pull/3395)
- Hiero: Use client query functions [\#3393](https://github.com/pypeclub/OpenPype/pull/3393)
- Nuke: Use client query functions [\#3391](https://github.com/pypeclub/OpenPype/pull/3391)

## [3.11.1](https://github.com/pypeclub/OpenPype/tree/3.11.1) (2022-06-20)
@@ -25,6 +25,8 @@ from .entities import (
    get_last_version_by_subset_name,
    get_output_link_versions,

+    version_is_latest,
+
    get_representation_by_id,
    get_representation_by_name,
    get_representations,

@@ -66,6 +68,8 @@ __all__ = (
    "get_last_version_by_subset_name",
    "get_output_link_versions",

+    "version_is_latest",
+
    "get_representation_by_id",
    "get_representation_by_name",
    "get_representations",

@@ -561,6 +561,42 @@ def get_version_by_name(project_name, version, subset_id, fields=None):
    return conn.find_one(query_filter, _prepare_fields(fields))


def version_is_latest(project_name, version_id):
    """Is version the latest from its subset.

    Note:
        Hero versions are considered latest.

    Todo:
        Maybe raise exception when version was not found?

    Args:
        project_name (str): Name of project where to look for queried
            entities.
        version_id (Union[str, ObjectId]): Version id which is checked.

    Returns:
        bool: True if the version is the latest of its subset, else False.
    """

    version_id = _convert_id(version_id)
    if not version_id:
        return False
    version_doc = get_version_by_id(
        project_name, version_id, fields=["_id", "type", "parent"]
    )
    # What to do when version is not found?
    if not version_doc:
        return False

    if version_doc["type"] == "hero_version":
        return True

    last_version = get_last_version_by_subset_id(
        project_name, version_doc["parent"], fields=["_id"]
    )
    return last_version["_id"] == version_id


def _get_versions(
    project_name,
    subset_ids=None,
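A short usage sketch of the new helper (illustrative only; "my_project" and `version_doc` are placeholder inputs, not part of the diff):

    from openpype.client import version_is_latest

    if not version_is_latest("my_project", version_doc["_id"]):
        print("A newer version of this subset was published.")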
@@ -1,5 +1,4 @@
import os
import sys

from Qt import QtWidgets

@@ -15,6 +14,7 @@ from openpype.pipeline import (
    AVALON_CONTAINER_ID,
    legacy_io,
)
+from openpype.pipeline.load import any_outdated_containers
import openpype.hosts.aftereffects
from openpype.lib import register_event_callback

@@ -136,7 +136,7 @@ def ls():
def check_inventory():
    """Check if loaded containers are of the highest version"""
-    if not lib.any_outdated():
+    if not any_outdated_containers():
        return

    # Warn about outdated containers.
@@ -4,17 +4,16 @@ import logging
import pyblish.api

from openpype import lib
from openpype.client import get_representation_by_id
from openpype.lib import register_event_callback
from openpype.pipeline import (
    legacy_io,
    register_loader_plugin_path,
    register_creator_plugin_path,
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
from openpype.pipeline.load import get_outdated_containers
from openpype.pipeline.context_tools import get_current_project_asset
import openpype.hosts.harmony
import openpype.hosts.harmony.api as harmony

@@ -50,7 +49,9 @@ def get_asset_settings():
        dict: Scene data.

    """
-    asset_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset()
+    asset_data = asset_doc["data"]
    fps = asset_data.get("fps")
    frame_start = asset_data.get("frameStart")
    frame_end = asset_data.get("frameEnd")

@@ -105,16 +106,7 @@ def check_inventory():
    in Harmony.
    """
-    project_name = legacy_io.active_project()
-    outdated_containers = []
-    for container in ls():
-        representation_id = container['representation']
-        representation_doc = get_representation_by_id(
-            project_name, representation_id, fields=["parent"]
-        )
-        if representation_doc and not lib.is_latest(representation_doc):
-            outdated_containers.append(container)
+    outdated_containers = get_outdated_containers()
    if not outdated_containers:
        return
@@ -5,8 +5,8 @@ from openpype.pipeline import (
    load,
    get_representation_path,
)
from openpype.pipeline.context_tools import is_representation_from_latest
import openpype.hosts.harmony.api as harmony
import openpype.lib


copy_files = """function copyFile(srcFilename, dstFilename)

@@ -280,9 +280,7 @@ class BackgroundLoader(load.LoaderPlugin):
        )

    def update(self, container, representation):
        path = get_representation_path(representation)

        with open(path) as json_file:
            data = json.load(json_file)

@@ -300,10 +298,9 @@ class BackgroundLoader(load.LoaderPlugin):
        bg_folder = os.path.dirname(path)

-        path = get_representation_path(representation)
-        print(container)
+        is_latest = is_representation_from_latest(representation)
        for layer in sorted(layers):
            file_to_import = [
                os.path.join(bg_folder, layer).replace("\\", "/")

@@ -347,7 +344,7 @@ class BackgroundLoader(load.LoaderPlugin):
        }
        %s
        """ % (sig, sig)
-        if openpype.lib.is_latest(representation):
+        if is_latest:
            harmony.send({"function": func, "args": [node, "green"]})
        else:
            harmony.send({"function": func, "args": [node, "red"]})
@@ -10,8 +10,8 @@ from openpype.pipeline import (
    load,
    get_representation_path,
)
from openpype.pipeline.context_tools import is_representation_from_latest
import openpype.hosts.harmony.api as harmony
import openpype.lib


class ImageSequenceLoader(load.LoaderPlugin):

@@ -109,7 +109,7 @@ class ImageSequenceLoader(load.LoaderPlugin):
        )

        # Colour node.
-        if openpype.lib.is_latest(representation):
+        if is_representation_from_latest(representation):
            harmony.send(
                {
                    "function": "PypeHarmony.setColor",
@@ -10,8 +10,8 @@ from openpype.pipeline import (
    load,
    get_representation_path,
)
from openpype.pipeline.context_tools import is_representation_from_latest
import openpype.hosts.harmony.api as harmony
import openpype.lib


class TemplateLoader(load.LoaderPlugin):

@@ -83,7 +83,7 @@ class TemplateLoader(load.LoaderPlugin):
        self_name = self.__class__.__name__

        update_and_replace = False
-        if openpype.lib.is_latest(representation):
+        if is_representation_from_latest(representation):
            self._set_green(node)
        else:
            self._set_red(node)
@@ -55,6 +55,10 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
    def process(self, instance):
        """Plugin entry point."""
+        # TODO 'get_asset_settings' could expect asset document as argument
+        #   which is available on 'context.data["assetEntity"]'
+        #   - the same approach can be used in 'ValidateSceneSettingsRepair'
        expected_settings = harmony.get_asset_settings()
        self.log.info("scene settings from DB: {}".format(expected_settings))
@@ -10,6 +10,7 @@ import qargparse
import openpype.api as openpype
from openpype.pipeline import LoaderPlugin, LegacyCreator
+from openpype.pipeline.context_tools import get_current_project_asset
from . import lib

log = openpype.Logger().get_logger(__name__)

@@ -484,7 +485,7 @@ class ClipLoader:
        """
        asset_name = self.context["representation"]["context"]["asset"]
-        asset_doc = openpype.get_asset(asset_name)
+        asset_doc = get_current_project_asset(asset_name)
        log.debug("__ asset_doc: {}".format(pformat(asset_doc)))
        self.data["assetData"] = asset_doc["data"]
@@ -5,8 +5,8 @@ from contextlib import contextmanager
import six

from openpype.client import get_asset_by_name
-from openpype.api import get_asset
from openpype.pipeline import legacy_io
+from openpype.pipeline.context_tools import get_current_project_asset


import hou

@@ -16,7 +16,7 @@ log = logging.getLogger(__name__)
def get_asset_fps():
    """Return current asset fps."""
-    return get_asset()["data"].get("fps")
+    return get_current_project_asset()["data"].get("fps")


def set_id(node, unique_id, overwrite=False):
@@ -12,13 +12,13 @@ from openpype.pipeline import (
    register_loader_plugin_path,
    AVALON_CONTAINER_ID,
)
+from openpype.pipeline.load import any_outdated_containers
import openpype.hosts.houdini
from openpype.hosts.houdini.api import lib

from openpype.lib import (
    register_event_callback,
    emit_event,
-    any_outdated,
)

from .lib import get_asset_fps

@@ -245,7 +245,7 @@ def on_open():
    # ensure it is using correct FPS for the asset
    lib.validate_fps()

-    if any_outdated():
+    if any_outdated_containers():
        from openpype.widgets import popup

        log.warning("Scene has outdated content.")
@@ -23,7 +23,6 @@ from openpype.client import (
    get_last_versions,
    get_representation_by_name
)
-from openpype import lib
from openpype.api import get_anatomy_settings
from openpype.pipeline import (
    legacy_io,

@@ -33,6 +32,7 @@ from openpype.pipeline import (
    load_container,
    registered_host,
)
+from openpype.pipeline.context_tools import get_current_project_asset
from .commands import reset_frame_range


@@ -2174,7 +2174,7 @@ def reset_scene_resolution():
    project_name = legacy_io.active_project()
    project_doc = get_project(project_name)
    project_data = project_doc["data"]
-    asset_data = lib.get_asset()["data"]
+    asset_data = get_current_project_asset()["data"]

    # Set project resolution
    width_key = "resolutionWidth"

@@ -2208,7 +2208,8 @@ def set_context_settings():
    project_name = legacy_io.active_project()
    project_doc = get_project(project_name)
    project_data = project_doc["data"]
-    asset_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset(fields=["data.fps"])
+    asset_data = asset_doc.get("data", {})

    # Set project fps
    fps = asset_data.get("fps", project_data.get("fps", 25))

@@ -2233,7 +2234,7 @@ def validate_fps():

    """
-    fps = lib.get_asset()["data"]["fps"]
+    fps = get_current_project_asset(fields=["data.fps"])["data"]["fps"]
    # TODO(antirotor): This is hack as for framerates having multiple
    # decimal places. FTrack is ceiling decimal values on
    # fps to two decimal places but Maya 2019+ is reporting those fps

@@ -3051,8 +3052,9 @@ def update_content_on_context_change():
    This will update scene content to match new asset on context change
    """
    scene_sets = cmds.listSets(allSets=True)
-    new_asset = legacy_io.Session["AVALON_ASSET"]
-    new_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset()
+    new_asset = asset_doc["name"]
+    new_data = asset_doc["data"]
    for s in scene_sets:
        try:
            if cmds.getAttr("{}.id".format(s)) == "pyblish.avalon.instance":
@@ -13,7 +13,6 @@ from openpype.host import HostBase, IWorkfileHost, ILoadHost
import openpype.hosts.maya
from openpype.tools.utils import host_tools
from openpype.lib import (
-    any_outdated,
    register_event_callback,
    emit_event
)

@@ -28,6 +27,7 @@ from openpype.pipeline import (
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
+from openpype.pipeline.load import any_outdated_containers
from openpype.hosts.maya.lib import copy_workspace_mel
from . import menu, lib
from .workio import (

@@ -470,7 +470,7 @@ def on_open():
    lib.validate_fps()
    lib.fix_incompatible_containers()

-    if any_outdated():
+    if any_outdated_containers():
        log.warning("Scene has outdated content.")

    # Find maya main window
@@ -15,13 +15,13 @@ from openpype.hosts.maya.api import (
from openpype.lib import requests_get
from openpype.api import (
    get_system_settings,
-    get_project_settings,
-    get_asset)
+    get_project_settings)
from openpype.modules import ModulesManager
from openpype.pipeline import (
    CreatorError,
    legacy_io,
)
+from openpype.pipeline.context_tools import get_current_project_asset


class CreateRender(plugin.Creator):

@@ -413,7 +413,7 @@ class CreateRender(plugin.Creator):
            prefix,
            type="string")

-        asset = get_asset()
+        asset = get_current_project_asset()

        if renderer == "arnold":
            # set format to exr
@@ -2,8 +2,8 @@ import maya.cmds as cmds
import pyblish.api

import openpype.api
-from openpype import lib
import openpype.hosts.maya.api.lib as mayalib
+from openpype.pipeline.context_tools import get_current_project_asset
from math import ceil


@@ -41,7 +41,9 @@ class ValidateMayaUnits(pyblish.api.ContextPlugin):
        # now flooring the value?
        fps = float_round(context.data.get('fps'), 2, ceil)

-        asset_fps = lib.get_asset()["data"]["fps"]
+        # TODO replace query with using 'context.data["assetEntity"]'
+        asset_doc = get_current_project_asset()
+        asset_fps = asset_doc["data"]["fps"]

        self.log.info('Units (linear): {0}'.format(linearunits))
        self.log.info('Units (angular): {0}'.format(angularunits))

@@ -91,5 +93,7 @@ class ValidateMayaUnits(pyblish.api.ContextPlugin):
        cls.log.debug(current_linear)

        cls.log.info("Setting time unit to match project")
-        asset_fps = lib.get_asset()["data"]["fps"]
+        # TODO replace query with using 'context.data["assetEntity"]'
+        asset_doc = get_current_project_asset()
+        asset_fps = asset_doc["data"]["fps"]
        mayalib.set_scene_fps(asset_fps)
@@ -24,7 +24,6 @@ from openpype.api import (
    BuildWorkfile,
    get_version_from_path,
    get_workdir_data,
-    get_asset,
    get_current_project_settings,
)
from openpype.tools.utils import host_tools

@@ -40,6 +39,7 @@ from openpype.pipeline import (
    legacy_io,
    Anatomy,
)
+from openpype.pipeline.context_tools import get_current_project_asset

from . import gizmo_menu

@@ -1766,7 +1766,7 @@ class WorkfileSettings(object):
            kwargs.get("asset_name")
            or legacy_io.Session["AVALON_ASSET"]
        )
-        self._asset_entity = get_asset(self._asset)
+        self._asset_entity = get_current_project_asset(self._asset)
        self._root_node = root_node or nuke.root()
        self._nodes = self.get_nodes(nodes=nodes)
@@ -1,7 +1,6 @@
import pyblish.api

-from openpype.client import get_project, get_asset_by_id
-from openpype import lib
+from openpype.client import get_project, get_asset_by_id, get_asset_by_name
from openpype.pipeline import legacy_io


@@ -17,10 +16,11 @@ class ValidateScript(pyblish.api.InstancePlugin):
    def process(self, instance):
        ctx_data = instance.context.data
-        asset_name = ctx_data["asset"]
-        asset = lib.get_asset(asset_name)
-        asset_data = asset["data"]
+        project_name = legacy_io.active_project()
+        asset_name = ctx_data["asset"]
+        # TODO replace query with using 'instance.data["assetEntity"]'
+        asset = get_asset_by_name(project_name, asset_name)
+        asset_data = asset["data"]

        # These attributes will be checked
        attributes = [
@@ -1,6 +1,5 @@
import os
from Qt import QtWidgets
from bson.objectid import ObjectId

import pyblish.api

@@ -13,8 +12,8 @@ from openpype.pipeline import (
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
    registered_host,
)
+from openpype.pipeline.load import any_outdated_containers
import openpype.hosts.photoshop

from . import lib

@@ -30,7 +29,7 @@ INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")


def check_inventory():
-    if not lib.any_outdated():
+    if not any_outdated_containers():
        return

    # Warn about outdated containers.
@@ -4,11 +4,11 @@ import uuid
import qargparse
from Qt import QtWidgets, QtCore

import openpype.api as pype
from openpype.pipeline import (
    LegacyCreator,
    LoaderPlugin,
)
+from openpype.pipeline.context_tools import get_current_project_asset
from openpype.hosts import resolve
from . import lib


@@ -375,7 +375,7 @@ class ClipLoader:
        """
        asset_name = self.context["representation"]["context"]["asset"]
-        self.data["assetData"] = pype.get_asset(asset_name)["data"]
+        self.data["assetData"] = get_current_project_asset(asset_name)["data"]

    def load(self):
        # create project bin for the media to be imported into
@@ -19,6 +19,7 @@ import os
import opentimelineio as otio
import pyblish.api
from openpype import lib as plib
+from openpype.pipeline.context_tools import get_current_project_asset


class OTIO_View(pyblish.api.Action):

@@ -116,7 +117,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
        if extension == ".edl":
            # EDL has no frame rate embedded so needs explicit
            # frame rate else 24 is assumed.
-            kwargs["rate"] = plib.get_asset()["data"]["fps"]
+            kwargs["rate"] = get_current_project_asset()["data"]["fps"]

        instance.data["otio_timeline"] = otio.adapters.read_from_file(
            file_path, **kwargs)
@@ -1,8 +1,12 @@
import os
+from copy import deepcopy
+
import opentimelineio as otio
import pyblish.api
+
from openpype import lib as plib
-from copy import deepcopy
+from openpype.pipeline.context_tools import get_current_project_asset


class CollectInstances(pyblish.api.InstancePlugin):
    """Collect instances from editorial's OTIO sequence"""

@@ -48,7 +52,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
        # get timeline otio data
        timeline = instance.data["otio_timeline"]
-        fps = plib.get_asset()["data"]["fps"]
+        fps = get_current_project_asset()["data"]["fps"]

        tracks = timeline.each_child(
            descended_from_type=otio.schema.Track
@@ -3,8 +3,8 @@ import re
import pyblish.api

import openpype.api
-from openpype import lib
from openpype.pipeline import PublishXmlValidationError
+from openpype.pipeline.context_tools import get_current_project_asset


class ValidateFrameRange(pyblish.api.InstancePlugin):

@@ -27,7 +27,8 @@ class ValidateFrameRange(pyblish.api.InstancePlugin):
                for pattern in self.skip_timelines_check):
            self.log.info("Skipping for {} task".format(instance.data["task"]))

-        asset_data = lib.get_asset(instance.data["asset"])["data"]
+        # TODO replace query with using 'instance.data["assetEntity"]'
+        asset_data = get_current_project_asset(instance.data["asset"])["data"]
        frame_start = asset_data["frameStart"]
        frame_end = asset_data["frameEnd"]
        handle_start = asset_data["handleStart"]
@@ -8,13 +8,13 @@ from unreal import EditorAssetLibrary
from unreal import MovieSceneSkeletalAnimationTrack
from unreal import MovieSceneSkeletalAnimationSection

+from openpype.pipeline.context_tools import get_current_project_asset
from openpype.pipeline import (
    get_representation_path,
    AVALON_CONTAINER_ID
)
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
-from openpype.api import get_asset


class AnimationFBXLoader(plugin.Loader):

@@ -53,6 +53,8 @@ class AnimationFBXLoader(plugin.Loader):
        if not actor:
            return None

+        asset_doc = get_current_project_asset(fields=["data.fps"])
+
        task.set_editor_property('filename', self.fname)
        task.set_editor_property('destination_path', asset_dir)
        task.set_editor_property('destination_name', asset_name)

@@ -80,7 +82,7 @@ class AnimationFBXLoader(plugin.Loader):
        task.options.anim_sequence_import_data.set_editor_property(
            'use_default_sample_rate', False)
        task.options.anim_sequence_import_data.set_editor_property(
-            'custom_sample_rate', get_asset()["data"].get("fps"))
+            'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
        task.options.anim_sequence_import_data.set_editor_property(
            'import_custom_attribute', True)
        task.options.anim_sequence_import_data.set_editor_property(

@@ -246,6 +248,7 @@ class AnimationFBXLoader(plugin.Loader):
    def update(self, container, representation):
        name = container["asset_name"]
        source_path = get_representation_path(representation)
+        asset_doc = get_current_project_asset(fields=["data.fps"])
        destination_path = container["namespace"]

        task = unreal.AssetImportTask()

@@ -279,7 +282,7 @@ class AnimationFBXLoader(plugin.Loader):
        task.options.anim_sequence_import_data.set_editor_property(
            'use_default_sample_rate', False)
        task.options.anim_sequence_import_data.set_editor_property(
-            'custom_sample_rate', get_asset()["data"].get("fps"))
+            'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
        task.options.anim_sequence_import_data.set_editor_property(
            'import_custom_attribute', True)
        task.options.anim_sequence_import_data.set_editor_property(
@@ -20,7 +20,7 @@ from openpype.pipeline import (
    AVALON_CONTAINER_ID,
    legacy_io,
)
-from openpype.api import get_asset
+from openpype.pipeline.context_tools import get_current_project_asset
from openpype.hosts.unreal.api import plugin
from openpype.hosts.unreal.api import pipeline as unreal_pipeline


@@ -225,6 +225,7 @@ class LayoutLoader(plugin.Loader):
        anim_path = f"{asset_dir}/animations/{anim_file_name}"

+        asset_doc = get_current_project_asset()
        # Import animation
        task = unreal.AssetImportTask()
        task.options = unreal.FbxImportUI()

@@ -259,7 +260,7 @@ class LayoutLoader(plugin.Loader):
        task.options.anim_sequence_import_data.set_editor_property(
            'use_default_sample_rate', False)
        task.options.anim_sequence_import_data.set_editor_property(
-            'custom_sample_rate', get_asset()["data"].get("fps"))
+            'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
        task.options.anim_sequence_import_data.set_editor_property(
            'import_custom_attribute', True)
        task.options.anim_sequence_import_data.set_editor_property(
@@ -15,11 +15,10 @@ from openpype.client import (
    get_asset_by_name,
    get_subset_by_name,
    get_subsets,
    get_version_by_id,
    get_last_versions,
    get_last_version_by_subset_id,
    get_last_version_by_subset_name,
    get_representations,
    get_representation_by_id,
    get_workfile_info,
)
from openpype.settings import (

@@ -180,7 +179,7 @@ def with_pipeline_io(func):
    return wrapped


@with_pipeline_io
+@deprecated("openpype.pipeline.context_tools.is_representation_from_latest")
def is_latest(representation):
    """Return whether the representation is from latest version

@@ -191,52 +190,21 @@ def is_latest(representation):
        bool: Whether the representation is of latest version.
    """
-    project_name = legacy_io.active_project()
-    version = get_version_by_id(
-        project_name,
-        representation["parent"],
-        fields=["_id", "type", "parent"]
-    )
-    if version["type"] == "hero_version":
-        return True
+    from openpype.pipeline.context_tools import is_representation_from_latest

-    # Get highest version under the parent
-    last_version = get_last_version_by_subset_id(
-        project_name, version["parent"], fields=["_id"]
-    )
-
-    return version["_id"] == last_version["_id"]
+    return is_representation_from_latest(representation)
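Callers migrating off the deprecated wrapper can import the replacement directly (a sketch; `repre_doc` stands for any representation document):

    # before (deprecated): openpype.lib.is_latest(repre_doc)
    from openpype.pipeline.context_tools import is_representation_from_latest
    is_representation_from_latest(repre_doc)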

@with_pipeline_io
+@deprecated("openpype.pipeline.load.any_outdated_containers")
def any_outdated():
    """Return whether the current scene has any outdated content"""
-    from openpype.pipeline import registered_host
-
-    project_name = legacy_io.active_project()
-    checked = set()
-    host = registered_host()
-    for container in host.ls():
-        representation = container['representation']
-        if representation in checked:
-            continue
+    from openpype.pipeline.load import any_outdated_containers

-        representation_doc = get_representation_by_id(
-            project_name, representation, fields=["parent"]
-        )
-        if representation_doc and not is_latest(representation_doc):
-            return True
-        elif not representation_doc:
-            log.debug("Container '{objectName}' has an invalid "
-                      "representation, it is missing in the "
-                      "database".format(**container))
-
-        checked.add(representation)
-
-    return False
+    return any_outdated_containers()


@with_pipeline_io
+@deprecated("openpype.pipeline.context_tools.get_current_project_asset")
def get_asset(asset_name=None):
    """Return asset document from database by its name.

@@ -249,15 +217,9 @@ def get_asset(asset_name=None):
        (MongoDB document)
    """
-    project_name = legacy_io.active_project()
-    if not asset_name:
-        asset_name = legacy_io.Session["AVALON_ASSET"]
+    from openpype.pipeline.context_tools import get_current_project_asset

-    asset_document = get_asset_by_name(project_name, asset_name)
-    if not asset_document:
-        raise TypeError("Entity \"{}\" was not found in DB".format(asset_name))
-
-    return asset_document
+    return get_current_project_asset(asset_name=asset_name)


def get_system_general_anatomy_data(system_settings=None):

@@ -319,7 +281,7 @@ def get_linked_assets(asset_doc):
    return list(get_assets(project_name, link_ids))


@with_pipeline_io
+@deprecated("openpype.client.get_last_version_by_subset_name")
def get_latest_version(asset_name, subset_name, dbcon=None, project_name=None):
    """Retrieve latest version from `asset_name` and `subset_name`.

@@ -340,6 +302,8 @@ def get_latest_version(asset_name, subset_name, dbcon=None, project_name=None):
    if not project_name:
        if not dbcon:
+            from openpype.pipeline import legacy_io
+
            log.debug("Using `legacy_io` for query.")
            dbcon = legacy_io
        # Make sure it is installed

@@ -347,37 +311,9 @@ def get_latest_version(asset_name, subset_name, dbcon=None, project_name=None):
        project_name = dbcon.active_project()

-    log.debug((
-        "Getting latest version for Project: \"{}\" Asset: \"{}\""
-        " and Subset: \"{}\""
-    ).format(project_name, asset_name, subset_name))
-
-    # Query asset document id by asset name
-    asset_doc = get_asset_by_name(project_name, asset_name, fields=["_id"])
-    if not asset_doc:
-        log.info(
-            "Asset \"{}\" was not found in Database.".format(asset_name)
-        )
-        return None
-
-    subset_doc = get_subset_by_name(
-        project_name, subset_name, asset_doc["_id"]
+    return get_last_version_by_subset_name(
+        project_name, subset_name, asset_name=asset_name
    )
-    if not subset_doc:
-        log.info(
-            "Subset \"{}\" was not found in Database.".format(subset_name)
-        )
-        return None
-
-    version_doc = get_last_version_by_subset_id(
-        project_name, subset_doc["_id"]
-    )
-    if not version_doc:
-        log.info(
-            "Subset \"{}\" does not have any version yet.".format(subset_name)
-        )
-        return None
-    return version_doc


def get_workfile_template_key_from_context(
openpype/lib/file_transaction.py (new file)

@@ -0,0 +1,171 @@
import os
import logging
import sys
import errno
import six

from openpype.lib import create_hard_link

# this is needed until speedcopy for linux is fixed
if sys.platform == "win32":
    from speedcopy import copyfile
else:
    from shutil import copyfile


class FileTransaction(object):
    """
    The file transaction is a three-step process.

    1) Rename any existing files to a "temporary backup" during `process()`
    2) Copy the files to final destination during `process()`
    3) Remove any backed up files (*no rollback possible!) during `finalize()`

    Step 3 is done during `finalize()`. If not called the .bak files will
    remain on disk.

    These steps try to ensure that we don't overwrite half of any existing
    files e.g. if they are currently in use.

    Note:
        A regular filesystem is *not* a transactional file system and even
        though this implementation tries to produce a 'safe copy' with a
        potential rollback, do keep in mind that it's inherently unsafe due
        to how filesystems work and a myriad of things could happen during
        the transaction that break the logic. A file storage could go down,
        permissions could be changed, other machines could be moving or
        writing files. A lot can happen.

    Warning:
        Any folders created during the transfer will not be removed.

    """

    MODE_COPY = 0
    MODE_HARDLINK = 1

    def __init__(self, log=None):
        if log is None:
            log = logging.getLogger("FileTransaction")

        self.log = log

        # The transfer queue
        # todo: make this an actual FIFO queue?
        self._transfers = {}

        # Destination file paths that a file was transferred to
        self._transferred = []

        # Backup file location mapping to original locations
        self._backup_to_original = {}

    def add(self, src, dst, mode=MODE_COPY):
        """Add a new file to transfer queue"""
        opts = {"mode": mode}

        src = os.path.abspath(src)
        dst = os.path.abspath(dst)

        if dst in self._transfers:
            queued_src = self._transfers[dst][0]
            if src == queued_src:
                self.log.debug("File transfer was already "
                               "in queue: {} -> {}".format(src, dst))
                return
            else:
                self.log.warning("File transfer in queue replaced..")
                self.log.debug("Removed from queue: "
                               "{} -> {}".format(queued_src, dst))
        self.log.debug("Added to queue: {} -> {}".format(src, dst))

        self._transfers[dst] = (src, opts)

    def process(self):
        # Backup any existing files
        for dst in self._transfers.keys():
            if os.path.exists(dst):
                # Backup original file
                # todo: add timestamp or uuid to ensure unique
                backup = dst + ".bak"
                self._backup_to_original[backup] = dst
                self.log.debug("Backup existing file: "
                               "{} -> {}".format(dst, backup))
                os.rename(dst, backup)

        # Copy the files to transfer
        for dst, (src, opts) in self._transfers.items():
            self._create_folder_for_file(dst)

            if opts["mode"] == self.MODE_COPY:
                self.log.debug("Copying file ... {} -> {}".format(src, dst))
                copyfile(src, dst)
            elif opts["mode"] == self.MODE_HARDLINK:
                self.log.debug("Hardlinking file ... {} -> {}".format(src,
                                                                      dst))
                create_hard_link(src, dst)

            self._transferred.append(dst)

    def finalize(self):
        # Delete any backed up files
        for backup in self._backup_to_original.keys():
            try:
                os.remove(backup)
            except OSError:
                self.log.error("Failed to remove backup file: "
                               "{}".format(backup),
                               exc_info=True)

    def rollback(self):
        errors = 0

        # Rollback any transferred files
        for path in self._transferred:
            try:
                os.remove(path)
            except OSError:
                errors += 1
                self.log.error("Failed to rollback created file: "
                               "{}".format(path),
                               exc_info=True)

        # Rollback the backups
        for backup, original in self._backup_to_original.items():
            try:
                os.rename(backup, original)
            except OSError:
                errors += 1
                self.log.error("Failed to restore original file: "
                               "{} -> {}".format(backup, original),
                               exc_info=True)

        if errors:
            self.log.error("{} errors occurred during "
                           "rollback.".format(errors), exc_info=True)
            six.reraise(*sys.exc_info())

    @property
    def transferred(self):
        """Return the processed transfers destination paths"""
        return list(self._transferred)

    @property
    def backups(self):
        """Return the backup file paths"""
        return list(self._backup_to_original.keys())

    def _create_folder_for_file(self, path):
        dirname = os.path.dirname(path)
        try:
            os.makedirs(dirname)
        except OSError as e:
            if e.errno == errno.EEXIST:
                pass
            else:
                self.log.critical("An unexpected error occurred.")
                six.reraise(*sys.exc_info())
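A minimal usage sketch for the new class (paths are placeholders):

    from openpype.lib.file_transaction import FileTransaction

    transaction = FileTransaction()
    transaction.add("/tmp/src/render.0001.exr", "/publish/render.0001.exr")
    try:
        transaction.process()   # back up existing files, then copy
    except Exception:
        transaction.rollback()  # remove new copies, restore backups
        raise
    else:
        transaction.finalize()  # drop the temporary .bak files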
@@ -10,8 +10,10 @@ import clique
import pyblish.api

import openpype.api
-from openpype.client import get_representations
+from openpype.client import (
+    get_last_version_by_subset_name,
+    get_representations,
+)
from openpype.pipeline import (
    get_representation_path,
    legacy_io,

@@ -343,8 +345,13 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        # get latest version of subset
        # this will stop if subset wasn't published yet
-        version = openpype.api.get_latest_version(instance.data.get("asset"),
-                                                  instance.data.get("subset"))
+        project_name = legacy_io.active_project()
+        version = get_last_version_by_subset_name(
+            project_name,
+            instance.data.get("subset"),
+            asset_name=instance.data.get("asset")
+        )

        # get its files based on extension
        subset_resources = get_resources(
            project_name, version, representation.get("ext")

@@ -1025,9 +1032,12 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        prev_start = None
        prev_end = None

-        version = openpype.api.get_latest_version(asset_name=asset,
-                                                  subset_name=subset
-                                                  )
+        project_name = legacy_io.active_project()
+        version = get_last_version_by_subset_name(
+            project_name,
+            subset,
+            asset_name=asset
+        )

        # Set prev start / end frames for comparison
        if not prev_start and not prev_end:

@@ -1072,7 +1082,12 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        based on 'publish' template
        """
        if not version:
-            version = openpype.api.get_latest_version(asset, subset)
+            project_name = legacy_io.active_project()
+            version = get_last_version_by_subset_name(
+                project_name,
+                subset,
+                asset_name=asset
+            )
        if version:
            version = int(version["name"]) + 1
        else:
@@ -10,6 +10,12 @@ import pyblish.api
from pyblish.lib import MessageHandler

import openpype
+from openpype.client import (
+    get_project,
+    get_asset_by_id,
+    get_asset_by_name,
+    version_is_latest,
+)
from openpype.modules import load_modules, ModulesManager
from openpype.settings import get_project_settings
from openpype.lib import filter_pyblish_plugins

@@ -240,29 +246,7 @@ def registered_host():


def deregister_host():
-    _registered_host["_"] = default_host()
-
-
-def default_host():
-    """A default host, in place of anything better
-
-    This may be considered as reference for the
-    interface a host must implement. It also ensures
-    that the system runs, even when nothing is there
-    to support it.
-
-    """
-
-    host = types.ModuleType("defaultHost")
-
-    def ls():
-        return list()
-
-    host.__dict__.update({
-        "ls": ls
-    })
-
-    return host
+    _registered_host["_"] = None


def debug_host():

@@ -304,3 +288,63 @@ def debug_host():
    })

    return host


def get_current_project(fields=None):
    """Helper function to get project document based on global Session.

    This function should be called only in a process where a host is
    installed.

    Returns:
        dict: Project document.
        None: Project is not set.
    """

    project_name = legacy_io.active_project()
    return get_project(project_name, fields=fields)


def get_current_project_asset(asset_name=None, asset_id=None, fields=None):
    """Helper function to get asset document based on global Session.

    This function should be called only in a process where a host is
    installed.

    Asset is found based on the passed asset name or id (not both). Asset
    name is not used for filtering if asset id is passed. When both are
    missing, the asset name from the current session is used.

    Args:
        asset_name (str): Name of asset used for filter.
        asset_id (Union[str, ObjectId]): Asset document id. If entered then
            is used as only filter.
        fields (Union[List[str], None]): Limit returned data of asset
            documents to specific keys.

    Returns:
        dict: Asset document.
        None: Asset is not set or does not exist.
    """

    project_name = legacy_io.active_project()
    if asset_id:
        return get_asset_by_id(project_name, asset_id, fields=fields)

    if not asset_name:
        asset_name = legacy_io.Session.get("AVALON_ASSET")
        # Skip if is not set even on context
        if not asset_name:
            return None
    return get_asset_by_name(project_name, asset_name, fields=fields)
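Usage sketch of the new context helper (asset name below is a placeholder):

    from openpype.pipeline.context_tools import get_current_project_asset

    # Asset of the current session (name taken from AVALON_ASSET):
    asset_doc = get_current_project_asset(fields=["data.fps"])
    # An explicit name or id also works; the id wins when both are passed:
    asset_doc = get_current_project_asset(asset_name="sh010")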

def is_representation_from_latest(representation):
    """Return whether the representation is from latest version

    Args:
        representation (dict): The representation document from the database.

    Returns:
        bool: Whether the representation is of latest version.
    """

    project_name = legacy_io.active_project()
    return version_is_latest(project_name, representation["parent"])
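Sketch of the check as the Harmony loaders in this changeset use it:

    from openpype.pipeline.context_tools import is_representation_from_latest

    color = "green" if is_representation_from_latest(representation) else "red"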
@@ -24,6 +24,10 @@ from .utils import (
    loaders_from_repre_context,
    loaders_from_representation,

+    any_outdated_containers,
+    get_outdated_containers,
+    filter_containers,
)

from .plugins import (

@@ -66,6 +70,10 @@ __all__ = (
    "loaders_from_repre_context",
    "loaders_from_representation",

+    "any_outdated_containers",
+    "get_outdated_containers",
+    "filter_containers",

    # plugins.py
    "LoaderPlugin",
    "SubsetLoaderPlugin",
@@ -4,8 +4,10 @@ import copy
import getpass
import logging
import inspect
import collections
import numbers

from openpype.host import ILoadHost
from openpype.client import (
    get_project,
    get_assets,

@@ -15,6 +17,7 @@ from openpype.client import (
    get_last_version_by_subset_id,
    get_hero_version_by_subset_id,
    get_version_by_name,
+    get_last_versions,
    get_representations,
    get_representation_by_id,
    get_representation_by_name,

@@ -28,6 +31,11 @@ from openpype.pipeline import (

log = logging.getLogger(__name__)

+ContainersFilterResult = collections.namedtuple(
+    "ContainersFilterResult",
+    ["latest", "outdated", "not_found", "invalid"]
+)
+

class HeroVersionType(object):
    def __init__(self, version):

@@ -685,3 +693,164 @@ def loaders_from_representation(loaders, representation):
    context = get_representation_context(representation)
    return loaders_from_repre_context(loaders, context)


def any_outdated_containers(host=None, project_name=None):
    """Check if there are any outdated containers in scene."""

    if get_outdated_containers(host, project_name):
        return True
    return False


def get_outdated_containers(host=None, project_name=None):
    """Collect outdated containers from host scene.

    Currently registered host and project in global session are used if
    arguments are not passed.

    Args:
        host (ModuleType): Host implementation with 'ls' function available.
        project_name (str): Name of project in which context we are.
    """

    if host is None:
        from openpype.pipeline import registered_host
        host = registered_host()

    if project_name is None:
        project_name = legacy_io.active_project()

    if isinstance(host, ILoadHost):
        containers = host.get_containers()
    else:
        containers = host.ls()
    return filter_containers(containers, project_name).outdated
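Usage sketch; with no arguments the registered host and the active project are used:

    from openpype.pipeline.load import get_outdated_containers

    for container in get_outdated_containers():
        print("Outdated:", container["objectName"])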

def filter_containers(containers, project_name):
    """Filter containers and split them into 4 categories.

    Categories are 'latest', 'outdated', 'invalid' and 'not_found'.
    The 'latest' containers are from the last version, 'outdated' are not,
    'invalid' containers have invalid content and 'not_found' containers
    are missing some entity in the database.

    Args:
        containers (Iterable[dict]): List of containers referenced into scene.
        project_name (str): Name of project in which context should look for
            versions.

    Returns:
        ContainersFilterResult: Named tuple with 'latest', 'outdated',
            'invalid' and 'not_found' containers.
    """

    # Make sure containers is list that won't change
    containers = list(containers)

    outdated_containers = []
    uptodate_containers = []
    not_found_containers = []
    invalid_containers = []
    output = ContainersFilterResult(
        uptodate_containers,
        outdated_containers,
        not_found_containers,
        invalid_containers
    )
    # Query representation docs to get their version ids
    repre_ids = {
        container["representation"]
        for container in containers
        if container["representation"]
    }
    if not repre_ids:
        if containers:
            invalid_containers.extend(containers)
        return output

    repre_docs = get_representations(
        project_name,
        representation_ids=repre_ids,
        fields=["_id", "parent"]
    )
    # Store representations by stringified representation id
    repre_docs_by_str_id = {}
    repre_docs_by_version_id = collections.defaultdict(list)
    for repre_doc in repre_docs:
        repre_id = str(repre_doc["_id"])
        version_id = repre_doc["parent"]
        repre_docs_by_str_id[repre_id] = repre_doc
        repre_docs_by_version_id[version_id].append(repre_doc)

    # Query version docs to get their subset ids
    # - also query hero versions to be able to identify if a representation
    #   belongs to an existing version
    version_docs = get_versions(
        project_name,
        version_ids=repre_docs_by_version_id.keys(),
        hero=True,
        fields=["_id", "parent", "type"]
    )
    versions_by_id = {}
    versions_by_subset_id = collections.defaultdict(list)
    hero_version_ids = set()
    for version_doc in version_docs:
        version_id = version_doc["_id"]
        # Store versions by their ids
        versions_by_id[version_id] = version_doc
        # There's no need to query subsets for hero versions
        # - they are considered as latest?
        if version_doc["type"] == "hero_version":
            hero_version_ids.add(version_id)
            continue
        subset_id = version_doc["parent"]
        versions_by_subset_id[subset_id].append(version_doc)

    last_versions = get_last_versions(
        project_name,
        subset_ids=versions_by_subset_id.keys(),
        fields=["_id"]
    )
    # Figure out which versions are outdated
    outdated_version_ids = set()
    for subset_id, last_version_doc in last_versions.items():
        for version_doc in versions_by_subset_id[subset_id]:
            version_id = version_doc["_id"]
            if version_id != last_version_doc["_id"]:
                outdated_version_ids.add(version_id)

    # Based on all collected data figure out which containers are outdated
    # - log out if there are missing representation or version documents
    for container in containers:
        container_name = container["objectName"]
        repre_id = container["representation"]
        if not repre_id:
            invalid_containers.append(container)
            continue

        repre_doc = repre_docs_by_str_id.get(repre_id)
        if not repre_doc:
            log.debug((
                "Container '{}' has an invalid representation."
                " It is missing in the database."
            ).format(container_name))
            not_found_containers.append(container)
            continue

        version_id = repre_doc["parent"]
        if version_id in outdated_version_ids:
            outdated_containers.append(container)

        elif version_id not in versions_by_id:
            log.debug((
                "Representation on container '{}' has an invalid version."
                " It is missing in the database."
            ).format(container_name))
            not_found_containers.append(container)

        else:
            uptodate_containers.append(container)

    return output
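Sketch of consuming the named tuple result (assumes a registered host whose `ls()` yields container dicts):

    from openpype.pipeline import legacy_io, registered_host
    from openpype.pipeline.load import filter_containers

    result = filter_containers(
        registered_host().ls(), legacy_io.active_project()
    )
    print(len(result.latest), "up to date,", len(result.outdated), "outdated")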

@@ -285,36 +285,34 @@ class ExtractReviewSlate(openpype.api.Extractor):
            audio_channels,
            audio_sample_rate,
            audio_channel_layout,
+            input_frame_rate
        )

        # replace slate with silent slate for concat
        slate_v_path = slate_silent_path

-        # create ffmpeg concat text file path
-        conc_text_file = input_file.replace(ext, "") + "_concat" + ".txt"
-        conc_text_path = os.path.join(
-            os.path.normpath(stagingdir), conc_text_file)
-        _remove_at_end.append(conc_text_path)
-        self.log.debug("__ conc_text_path: {}".format(conc_text_path))
-
-        new_line = "\n"
-        with open(conc_text_path, "w") as conc_text_f:
-            conc_text_f.writelines([
-                "file {}".format(
-                    slate_v_path.replace("\\", "/")),
-                new_line,
-                "file {}".format(input_path.replace("\\", "/"))
-            ])
-
-        # concat slate and videos together
+        # concat slate and videos together with concat filter
+        # this will reencode the output
+        if input_audio:
+            fmap = [
+                "-filter_complex",
+                "[0:v] [0:a] [1:v] [1:a] concat=n=2:v=1:a=1 [v] [a]",
+                "-map", '[v]',
+                "-map", '[a]'
+            ]
+        else:
+            fmap = [
+                "-filter_complex",
+                "[0:v] [1:v] concat=n=2:v=1:a=0 [v]",
+                "-map", '[v]'
+            ]
        concat_args = [
            ffmpeg_path,
            "-y",
-            "-f", "concat",
-            "-safe", "0",
-            "-i", conc_text_path,
-            "-c", "copy",
+            "-i", slate_v_path,
+            "-i", input_path,
        ]
+        concat_args.extend(fmap)
        if offset_timecode:
            concat_args.extend(["-timecode", offset_timecode])
        # NOTE: Added because of OP Atom demuxers
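For reference, with audio present the assembled argument list now takes this shape (paths and timecode are illustrative):

    concat_args = [
        ffmpeg_path, "-y",
        "-i", slate_v_path, "-i", input_path,
        "-filter_complex",
        "[0:v] [0:a] [1:v] [1:a] concat=n=2:v=1:a=1 [v] [a]",
        "-map", "[v]", "-map", "[a]",
        "-timecode", "01:00:00:00",
        # format_args, codec_args and the output path are appended below
    ]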
|
||||
|
|
@ -322,12 +320,18 @@ class ExtractReviewSlate(openpype.api.Extractor):
|
|||
# - keep format of output
|
||||
if format_args:
|
||||
concat_args.extend(format_args)
|
||||
|
||||
if codec_args:
|
||||
concat_args.extend(codec_args)
|
||||
|
||||
# Use arguments from ffmpeg preset
|
||||
source_ffmpeg_cmd = repre.get("ffmpeg_cmd")
|
||||
if source_ffmpeg_cmd:
|
||||
copy_args = (
|
||||
"-metadata",
|
||||
"-metadata:s:v:0",
|
||||
"-b:v",
|
||||
"-b:a",
|
||||
)
|
||||
args = source_ffmpeg_cmd.split(" ")
|
||||
for indx, arg in enumerate(args):
|
||||
|
|
@ -335,12 +339,14 @@ class ExtractReviewSlate(openpype.api.Extractor):
|
|||
concat_args.append(arg)
|
||||
# assumes arg has one parameter
|
||||
concat_args.append(args[indx + 1])
|
||||
|
||||
# add final output path
|
||||
concat_args.append(output_path)
|
||||
|
||||
# ffmpeg concat subprocess
|
||||
self.log.debug(
|
||||
"Executing concat: {}".format(" ".join(concat_args))
|
||||
"Executing concat filter: {}".format
|
||||
(" ".join(concat_args))
|
||||
)
|
||||
openpype.api.run_subprocess(
|
||||
concat_args, logger=self.log
|
||||
|
|
@ -488,9 +494,10 @@ class ExtractReviewSlate(openpype.api.Extractor):
|
|||
        audio_channels,
        audio_sample_rate,
        audio_channel_layout,
        input_frame_rate
    ):
        # Get duration of one frame in micro seconds
        items = audio_sample_rate.split("/")
        items = input_frame_rate.split("/")
        if len(items) == 1:
            one_frame_duration = 1.0 / float(items[0])
        elif len(items) == 2:
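
The hunks above replace the ffmpeg concat demuxer ("-f concat" with a text
file and stream copy) with the concat filter, which decodes and re-encodes
both inputs and therefore tolerates slates whose codec parameters do not
exactly match the source. A minimal standalone sketch of the new style of
invocation (hypothetical paths, video-only case):

    import subprocess

    slate_path = "slate.mov"    # hypothetical slate clip
    input_path = "shot.mov"     # hypothetical source clip

    args = [
        "ffmpeg", "-y",
        "-i", slate_path,
        "-i", input_path,
        # Join the two video streams: n=2 inputs, 1 video / 0 audio outputs
        "-filter_complex", "[0:v] [1:v] concat=n=2:v=1:a=0 [v]",
        "-map", "[v]",
        "output.mov",
    ]
    subprocess.run(args, check=True)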

openpype/plugins/publish/integrate.py (new file, 908 lines)
@@ -0,0 +1,908 @@
import os
import logging
import sys
import copy
import clique
import six

from bson.objectid import ObjectId
from pymongo import DeleteMany, ReplaceOne, InsertOne, UpdateOne
import pyblish.api

import openpype.api
from openpype.lib.profiles_filtering import filter_profiles
from openpype.lib.file_transaction import FileTransaction
from openpype.pipeline import legacy_io
from openpype.pipeline.publish import KnownPublishError

log = logging.getLogger(__name__)


def assemble(files):
    """Convenience `clique.assemble` wrapper for files of a single collection.

    Unlike `clique.assemble` this wrapper does not allow more than a single
    Collection nor any remainder files. An error is raised when anything
    other than a single collection is assembled.

    Returns:
        clique.Collection: A single sequence Collection

    Raises:
        ValueError: Error is raised when files do not result in a single
            collected Collection.

    """
    # todo: move this to lib?
    # Get the sequence as a collection. The files must be of a single
    # sequence and have no remainder outside of the collections.
    patterns = [clique.PATTERNS["frames"]]
    collections, remainder = clique.assemble(files,
                                             minimum_items=1,
                                             patterns=patterns)
    if not collections:
        raise ValueError("No collections found in files: "
                         "{}".format(files))
    if remainder:
        raise ValueError("Files found not detected as part"
                         " of a sequence: {}".format(remainder))
    if len(collections) > 1:
        raise ValueError("Files in sequence are not part of a"
                         " single sequence collection: "
                         "{}".format(collections))
    return collections[0]
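
# For example (hypothetical file names):
#   assemble(["render.1001.exr", "render.1002.exr", "render.1003.exr"])
#       -> Collection, roughly "render.%04d.exr [1001-1003]"
#   assemble(["render.1001.exr", "notes.txt"])
#       -> ValueError, "notes.txt" is a remainder outside the sequence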


def get_instance_families(instance):
    """Get all families of the instance"""
    # todo: move this to lib?
    family = instance.data.get("family")
    families = []
    if family:
        families.append(family)

    for _family in (instance.data.get("families") or []):
        if _family not in families:
            families.append(_family)

    return families


def get_frame_padded(frame, padding):
    """Return frame number as string with `padding` amount of padded zeros"""
    return "{frame:0{padding}d}".format(padding=padding, frame=frame)


def get_first_frame_padded(collection):
    """Return first frame as padded number from `clique.Collection`"""
    start_frame = next(iter(collection.indexes))
    return get_frame_padded(start_frame, padding=collection.padding)

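# For example: get_frame_padded(frame=25, padding=4) -> "0025", and
# get_first_frame_padded(collection) returns the collection's lowest
# index padded the same way.
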
class IntegrateAsset(pyblish.api.InstancePlugin):
    """Register publish in the database and transfer files to destinations.

    Steps:
        1) Register the subset and version
        2) Transfer the representation files to the destination
        3) Register the representation

    Requires:
        instance.data['representations'] - must be a list and each member
        must be a dictionary with following data:
            'files': list of filenames for sequence, string for single file.
                Only the filename is allowed, without the folder path.
            'stagingDir': "path/to/folder/with/files"
            'name': representation name (usually the same as extension)
            'ext': file extension
        optional data
            "frameStart"
            "frameEnd"
            'fps'
            "data": additional metadata for each representation.
    """

    label = "Integrate Asset"
    order = pyblish.api.IntegratorOrder
    families = ["workfile", "pointcache", "camera", "animation", "model",
                "mayaAscii", "mayaScene", "setdress", "layout", "ass",
                "vdbcache", "scene", "vrayproxy", "vrayscene_layer",
                "render", "prerender", "imagesequence", "review",
                "rendersetup", "rig", "plate", "look", "audio", "yetiRig",
                "yeticache", "nukenodes", "gizmo", "source", "matchmove",
                "image", "assembly", "fbx", "textures", "action",
                "harmony.template", "harmony.palette", "editorial",
                "background", "camerarig", "redshiftproxy", "effect",
                "xgen", "hda", "usd", "staticMesh", "skeletalMesh",
                "mvLook", "mvUsd", "mvUsdComposition", "mvUsdOverride",
                "simpleUnrealTexture"]
    exclude_families = ["clip", "render.farm"]
    default_template_name = "publish"

    # Representation context keys that should always be written to
    # the database even if not used by the destination template
    db_representation_context_keys = [
        "project", "asset", "task", "subset", "version", "representation",
        "family", "hierarchy", "username"
    ]
    skip_host_families = []

    def process(self, instance):
        if self._temp_skip_instance_by_settings(instance):
            return

        # Mark instance as processed for legacy integrator
        instance.data["processedWithNewIntegrator"] = True

        # Instance should be integrated on a farm
        if instance.data.get("farm"):
            self.log.info(
                "Instance is marked to be processed on farm. Skipping")
            return

        filtered_repres = self.filter_representations(instance)
        # Skip instance if there are no representations to integrate
        # (all of them may have been filtered out)
        if not filtered_repres:
            self.log.warning((
                "Skipping, there are no representations"
                " to integrate for instance {}"
            ).format(instance.data["family"]))
            return

        # Exclude instances that also contain families from exclude families
        families = set(get_instance_families(instance))
        exclude = families & set(self.exclude_families)
        if exclude:
            self.log.debug("Instance not integrated due to exclude "
                           "families found: {}".format(", ".join(exclude)))
            return

        file_transactions = FileTransaction(log=self.log)
        try:
            self.register(instance, file_transactions, filtered_repres)
        except Exception:
            # clean destination
            # todo: preferably we'd also rollback *any* changes to the database
            file_transactions.rollback()
            self.log.critical("Error when registering", exc_info=True)
            six.reraise(*sys.exc_info())

        # Finalizing can't roll back safely, so there is no use in moving it
        # into the try/except.
        file_transactions.finalize()

    def _temp_skip_instance_by_settings(self, instance):
        """Decide if instance will be processed with new or legacy integrator.

        This is a temporary solution until all use cases are tested with the
        new (this) integrator plugin.
        """

        host_name = instance.context.data["hostName"]
        instance_family = instance.data["family"]
        instance_families = set(instance.data.get("families") or [])

        skip = False
        for item in self.skip_host_families:
            if host_name not in item["host"]:
                continue

            families = set(item["families"])
            if instance_family in families:
                skip = True
                break

            for family in instance_families:
                if family in families:
                    skip = True
                    break

            if skip:
                break

        if skip:
            self.log.debug("Instance is marked to be skipped by settings.")
        return skip
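
    # For example, a filled `skip_host_families` value might look like this
    # (illustrative only, not a shipped default):
    #   [{"host": ["maya"], "families": ["render", "pointcache"]}]
    # which would route Maya render/pointcache instances to the legacy
    # integrator instead of this one.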

    def filter_representations(self, instance):
        # Prepare representations that should be integrated
        repres = instance.data.get("representations")
        # Raise an error if the instance doesn't have any representations
        if not repres:
            raise KnownPublishError(
                "Instance {} has no representations to integrate".format(
                    instance.data["family"]
                )
            )

        # Validate type of stored representations
        if not isinstance(repres, (list, tuple)):
            raise TypeError(
                "Instance 'representations' must be a list,"
                " got: {0} {1}".format(
                    str(type(repres)), str(repres)
                )
            )

        # Filter representations
        filtered_repres = []
        for repre in repres:
            if "delete" in repre.get("tags", []):
                continue
            filtered_repres.append(repre)

        return filtered_repres
    def register(self, instance, file_transactions, filtered_repres):
        instance_stagingdir = instance.data.get("stagingDir")
        if not instance_stagingdir:
            self.log.info((
                "{0} is missing reference to staging directory."
                " Will try to get it from representation."
            ).format(instance))
        else:
            self.log.debug(
                "Establishing staging directory "
                "@ {0}".format(instance_stagingdir)
            )

        template_name = self.get_template_name(instance)

        subset, subset_writes = self.prepare_subset(instance)
        version, version_writes = self.prepare_version(instance, subset)
        instance.data["versionEntity"] = version

        # Get existing representations (if any)
        existing_repres_by_name = {
            repres["name"].lower(): repres for repres in legacy_io.find(
                {
                    "parent": version["_id"],
                    "type": "representation"
                },
                # Only care about id and name of existing representations
                projection={"_id": True, "name": True}
            )
        }

        # Prepare all representations
        prepared_representations = []
        for repre in filtered_repres:
            # todo: reduce/simplify what is returned from this function
            prepared = self.prepare_representation(
                repre,
                template_name,
                existing_repres_by_name,
                version,
                instance_stagingdir,
                instance)

            for src, dst in prepared["transfers"]:
                # todo: add support for hardlink transfers
                file_transactions.add(src, dst)

            prepared_representations.append(prepared)

        # Each instance can also have pre-defined transfers not explicitly
        # part of a representation - like texture resources used by a
        # .ma representation. Those destination paths are pre-defined, etc.
        # todo: should we move or simplify this logic?
        resource_destinations = set()
        for src, dst in instance.data.get("transfers", []):
            file_transactions.add(src, dst, mode=FileTransaction.MODE_COPY)
            resource_destinations.add(os.path.abspath(dst))

        for src, dst in instance.data.get("hardlinks", []):
            file_transactions.add(src, dst, mode=FileTransaction.MODE_HARDLINK)
            resource_destinations.add(os.path.abspath(dst))

        # Bulk write to the database
        # We write the subset and version to the database before the File
        # Transaction to reduce the chances of another publish trying to
        # publish to the same version number since that chance can greatly
        # increase if the file transaction takes a long time.
        legacy_io.bulk_write(subset_writes + version_writes)
        self.log.info("Subset {subset[name]} and Version {version[name]} "
                      "written to database...".format(subset=subset,
                                                      version=version))

        # Process all file transfers of all integrations now
        self.log.debug("Integrating source files to destination ...")
        file_transactions.process()
        self.log.debug(
            "Backed up existing files: {}".format(file_transactions.backups))
        self.log.debug(
            "Transferred files: {}".format(file_transactions.transferred))
        self.log.debug("Retrieving Representation Site Sync information ...")

        # Get the accessible sites for Site Sync
        modules_by_name = instance.context.data["openPypeModules"]
        sync_server_module = modules_by_name["sync_server"]
        sites = sync_server_module.compute_resource_sync_sites(
            project_name=instance.data["projectEntity"]["name"]
        )
        self.log.debug("Sync Server Sites: {}".format(sites))

        # Compute the resource file infos once (files belonging to the
        # version instance instead of an individual representation) so
        # we can re-use those file infos per representation
        anatomy = instance.context.data["anatomy"]
        resource_file_infos = self.get_files_info(resource_destinations,
                                                  sites=sites,
                                                  anatomy=anatomy)

        # Finalize the representations now the published files are integrated
        # Get 'files' info for representations and its attached resources
        representation_writes = []
        new_repre_names_low = set()
        for prepared in prepared_representations:
            representation = prepared["representation"]
            transfers = prepared["transfers"]
            destinations = [dst for src, dst in transfers]
            representation["files"] = self.get_files_info(
                destinations, sites=sites, anatomy=anatomy
            )

            # Add the version resource file infos to each representation
            representation["files"] += resource_file_infos

            # Set up representation for writing to the database. Since
            # we *might* be overwriting an existing entry if the version
            # already existed we'll use ReplaceOne with `upsert=True`
            representation_writes.append(ReplaceOne(
                filter={"_id": representation["_id"]},
                replacement=representation,
                upsert=True
            ))

            new_repre_names_low.add(representation["name"].lower())

        # Delete any existing representations that didn't get any new data
        # if the instance is not set to append mode
        if not instance.data.get("append", False):
            delete_names = set()
            for name, existing_repres in existing_repres_by_name.items():
                if name not in new_repre_names_low:
                    # We add the exact representation name because `name` is
                    # lowercase for name matching only and not in the database
                    delete_names.add(existing_repres["name"])
            if delete_names:
                representation_writes.append(DeleteMany(
                    filter={
                        "parent": version["_id"],
                        "name": {"$in": list(delete_names)}
                    }
                ))

        # Write representations to the database
        legacy_io.bulk_write(representation_writes)

        # Backwards compatibility
        # todo: can we avoid the need to store this?
        instance.data["published_representations"] = {
            p["representation"]["_id"]: p for p in prepared_representations
        }

        self.log.info("Registered {} representations"
                      "".format(len(prepared_representations)))
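
    # The file transaction flow used by `register` above, schematically
    # (rollback/finalize semantics assumed from their usage in this plugin):
    #   file_transactions.add(src, dst)  # queue a transfer
    #   file_transactions.process()      # execute copies, back up overwrites
    #   file_transactions.rollback()     # on error: restore previous state
    #   file_transactions.finalize()     # on success: drop temporary backups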

    def prepare_subset(self, instance):
        asset = instance.data.get("assetEntity")
        subset_name = instance.data["subset"]
        self.log.debug("Subset: {}".format(subset_name))

        # Get existing subset if it exists
        subset = legacy_io.find_one({
            "type": "subset",
            "parent": asset["_id"],
            "name": subset_name
        })

        # Define subset data
        data = {
            "families": get_instance_families(instance)
        }

        subset_group = instance.data.get("subsetGroup")
        if subset_group:
            data["subsetGroup"] = subset_group

        bulk_writes = []
        if subset is None:
            # Create a new subset
            self.log.info("Subset '%s' not found, creating ..." % subset_name)
            subset = {
                "_id": ObjectId(),
                "schema": "openpype:subset-3.0",
                "type": "subset",
                "name": subset_name,
                "data": data,
                "parent": asset["_id"]
            }
            bulk_writes.append(InsertOne(subset))
        else:
            # Update existing subset data with new data and set in database.
            # We also change the found subset in-place so we don't need to
            # re-query the subset afterwards
            subset["data"].update(data)
            bulk_writes.append(UpdateOne(
                {"type": "subset", "_id": subset["_id"]},
                {"$set": {
                    "data": subset["data"]
                }}
            ))

        self.log.info("Prepared subset: {}".format(subset_name))
        return subset, bulk_writes

    def prepare_version(self, instance, subset):

        version_number = instance.data["version"]

        version = {
            "schema": "openpype:version-3.0",
            "type": "version",
            "parent": subset["_id"],
            "name": version_number,
            "data": self.create_version_data(instance)
        }

        existing_version = legacy_io.find_one({
            'type': 'version',
            'parent': subset["_id"],
            'name': version_number
        }, projection={"_id": True})

        if existing_version:
            self.log.debug("Updating existing version ...")
            version["_id"] = existing_version["_id"]
        else:
            self.log.debug("Creating new version ...")
            version["_id"] = ObjectId()

        bulk_writes = [ReplaceOne(
            filter={"_id": version["_id"]},
            replacement=version,
            upsert=True
        )]

        self.log.info("Prepared version: v{0:03d}".format(version["name"]))

        return version, bulk_writes

    def prepare_representation(self, repre,
                               template_name,
                               existing_repres_by_name,
                               version,
                               instance_stagingdir,
                               instance):

        # pre-flight validations
        if repre["ext"].startswith("."):
            raise ValueError("Extension must not start with a dot '.': "
                             "{}".format(repre["ext"]))

        if repre.get("transfers"):
            raise ValueError("Representation is not allowed to have transfers"
                             " data before integration. They are computed in"
                             " the integrator."
                             " Got: {}".format(repre["transfers"]))

        # create template data for Anatomy
        template_data = copy.deepcopy(instance.data["anatomyData"])

        # required representation keys
        files = repre['files']
        template_data["representation"] = repre["name"]
        template_data["ext"] = repre["ext"]

        # optionals
        # retrieve additional anatomy data from representation if exists
        for key, anatomy_key in {
            # Representation Key: Anatomy data key
            "resolutionWidth": "resolution_width",
            "resolutionHeight": "resolution_height",
            "fps": "fps",
            "outputName": "output",
            "originalBasename": "originalBasename"
        }.items():
            # Allow to take value from representation;
            # if not found, also consider instance.data
            if key in repre:
                value = repre[key]
            elif key in instance.data:
                value = instance.data[key]
            else:
                continue
            template_data[anatomy_key] = value

        if repre.get('stagingDir'):
            stagingdir = repre['stagingDir']
        else:
            # Fall back to instance staging dir if not explicitly
            # set for representation in the instance
            self.log.debug("Representation uses instance staging dir: "
                           "{}".format(instance_stagingdir))
            stagingdir = instance_stagingdir
        if not stagingdir:
            raise ValueError("No staging directory set for representation: "
                             "{}".format(repre))

        self.log.debug("Anatomy template name: {}".format(template_name))
        anatomy = instance.context.data['anatomy']
        template = os.path.normpath(anatomy.templates[template_name]["path"])

        is_udim = bool(repre.get("udim"))
        is_sequence_representation = isinstance(files, (list, tuple))
        if is_sequence_representation:
            # Collection of files (sequence)
            assert not any(os.path.isabs(fname) for fname in files), (
                "Given file names contain full paths"
            )

            src_collection = assemble(files)

            # If the representation has `frameStart` set it renumbers the
            # frame indices of the published collection. It will start from
            # that `frameStart` index instead. Thus if that frame start
            # differs from the collection we want to shift the destination
            # frame indices from the source collection.
            destination_indexes = list(src_collection.indexes)
            destination_padding = len(get_first_frame_padded(src_collection))
            if repre.get("frameStart") is not None and not is_udim:
                index_frame_start = int(repre.get("frameStart"))

                render_template = anatomy.templates[template_name]
                # todo: should we ALWAYS manage the frame padding even when
                #       not having `frameStart` set?
                frame_start_padding = int(
                    render_template.get(
                        "frame_padding",
                        render_template.get("padding")
                    )
                )

                # Shift destination sequence to the start frame
                src_start_frame = next(iter(src_collection.indexes))
                shift = index_frame_start - src_start_frame
                if shift:
                    destination_indexes = [
                        frame + shift for frame in destination_indexes
                    ]
                destination_padding = frame_start_padding

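                # For example (hypothetical numbers): source frames 1001-1003
                # with repre["frameStart"] = 1 give shift = 1 - 1001 = -1000,
                # so the destination collection is renumbered to frames 1-3,
                # padded to the template's frame padding.
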
            # To construct the destination template with anatomy we require
            # a Frame or UDIM tile set for the template data. We use the first
            # index of the destination for that because that could've shifted
            # from the source indexes, etc.
            first_index_padded = get_frame_padded(frame=destination_indexes[0],
                                                  padding=destination_padding)
            if is_udim:
                # UDIM representations handle ranges in a different manner
                template_data["udim"] = first_index_padded
            else:
                template_data["frame"] = first_index_padded

            # Construct destination collection from template
            anatomy_filled = anatomy.format(template_data)
            template_filled = anatomy_filled[template_name]["path"]
            repre_context = template_filled.used_values
            self.log.debug("Template filled: {}".format(str(template_filled)))
            dst_collection = assemble([os.path.normpath(template_filled)])

            # Update the destination indexes and padding
            dst_collection.indexes.clear()
            dst_collection.indexes.update(set(destination_indexes))
            dst_collection.padding = destination_padding
            assert (
                len(src_collection.indexes) == len(dst_collection.indexes)
            ), "This is a bug"

            # Multiple file transfers
            transfers = []
            for src_file_name, dst in zip(src_collection, dst_collection):
                src = os.path.join(stagingdir, src_file_name)
                transfers.append((src, dst))

        else:
            # Single file
            fname = files
            assert not os.path.isabs(fname), (
                "Given file name is a full path"
            )

            # Manage anatomy template data
            template_data.pop("frame", None)
            if is_udim:
                template_data["udim"] = repre["udim"][0]

            # Construct destination filepath from template
            anatomy_filled = anatomy.format(template_data)
            template_filled = anatomy_filled[template_name]["path"]
            repre_context = template_filled.used_values
            dst = os.path.normpath(template_filled)

            # Single file transfer
            src = os.path.join(stagingdir, fname)
            transfers = [(src, dst)]

        # todo: Are we sure the assumption each representation
        #       ends up in the same folder is valid?
        if not instance.data.get("publishDir"):
            instance.data["publishDir"] = (
                anatomy_filled
                [template_name]
                ["folder"]
            )

        for key in self.db_representation_context_keys:
            # Also add these values to the context even if not used by the
            # destination template
            value = template_data.get(key)
            if not value:
                continue
            repre_context[key] = template_data[key]

        # Explicitly store the full list even though template data might
        # have a different value because it uses just a single udim tile
        if repre.get("udim"):
            repre_context["udim"] = repre.get("udim")  # store list

        # Use previous representation's id if there is a name match
        existing = existing_repres_by_name.get(repre["name"].lower())
        if existing:
            repre_id = existing["_id"]
        else:
            repre_id = ObjectId()

        # Backwards compatibility:
        # Store first transferred destination as published path data
        # todo: can we remove this?
        # todo: We shouldn't change data that makes its way back into
        #       instance.data[] until we know the publish actually succeeded,
        #       otherwise `published_path` might not actually be valid?
        published_path = transfers[0][1]
        repre["published_path"] = published_path  # Backwards compatibility

        # todo: `repre` is not the actual `representation` entity;
        #       we should simplify/clarify the difference between data above
        #       and the actual representation entity for the database
        data = repre.get("data", {})
        data.update({'path': published_path, 'template': template})
        representation = {
            "_id": repre_id,
            "schema": "openpype:representation-2.0",
            "type": "representation",
            "parent": version["_id"],
            "name": repre['name'],
            "data": data,

            # Imprint shortcut to context for performance reasons.
            "context": repre_context
        }

        # todo: simplify/streamline which additional data makes its way into
        #       the representation context
        if repre.get("outputName"):
            representation["context"]["output"] = repre['outputName']

        if is_sequence_representation and repre.get("frameStart") is not None:
            representation['context']['frame'] = template_data["frame"]

        return {
            "representation": representation,
            "anatomy_data": template_data,
            "transfers": transfers,
            # todo: avoid the need for 'published_files' used by Integrate Hero
            # backwards compatibility
            "published_files": [transfer[1] for transfer in transfers]
        }

    def create_version_data(self, instance):
        """Create the data dictionary for the version

        Args:
            instance: the current instance being published

        Returns:
            dict: the required information for version["data"]
        """

        context = instance.context

        # create relative source path for DB
        if "source" in instance.data:
            source = instance.data["source"]
        else:
            source = context.data["currentFile"]
            anatomy = instance.context.data["anatomy"]
            source = self.get_rootless_path(anatomy, source)
        self.log.debug("Source: {}".format(source))

        version_data = {
            "families": get_instance_families(instance),
            "time": context.data["time"],
            "author": context.data["user"],
            "source": source,
            "comment": context.data.get("comment"),
            "machine": context.data.get("machine"),
            "fps": instance.data.get("fps", context.data.get("fps"))
        }

        # todo: preferably we wouldn't need this "if dict" etc. logic and
        #       instead be able to rely on what the input value is if it's set
        intent_value = context.data.get("intent")
        if intent_value and isinstance(intent_value, dict):
            intent_value = intent_value.get("value")

        if intent_value:
            version_data["intent"] = intent_value

        # Include optional data if present in instance data
        optionals = [
            "frameStart", "frameEnd", "step", "handles",
            "handleEnd", "handleStart", "sourceHashes"
        ]
        for key in optionals:
            if key in instance.data:
                version_data[key] = instance.data[key]

        # Include instance.data[versionData] directly
        version_data_instance = instance.data.get('versionData')
        if version_data_instance:
            version_data.update(version_data_instance)

        return version_data

    def get_template_name(self, instance):
        """Return anatomy template name to use for integration"""
        # Define publish template name from profiles
        filter_criteria = self.get_profile_filter_criteria(instance)
        template_name_profiles = self._get_template_name_profiles(instance)
        profile = filter_profiles(
            template_name_profiles,
            filter_criteria,
            logger=self.log
        )

        if profile:
            return profile["template_name"]
        return self.default_template_name
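
    # For example, with a single profile (illustrative values)
    #   [{"families": ["render"], "hosts": ["maya"], "task_types": [],
    #     "tasks": [], "template_name": "maya2unreal"}]
    # a "render" instance published from Maya resolves to the "maya2unreal"
    # anatomy template, while anything unmatched falls back to "publish".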

    def _get_template_name_profiles(self, instance):
        """Receive profiles for publish template keys.

        Reuse template name profiles from the legacy integrator. The goal is
        to move the profile settings out of plugin settings, but until that
        happens we want to be able to set them in one place without breaking
        backwards compatibility (more than once).
        """

        return (
            instance.context.data["project_settings"]
            ["global"]
            ["publish"]
            ["IntegrateAssetNew"]
            ["template_name_profiles"]
        )

    def get_profile_filter_criteria(self, instance):
        """Return filter criteria for `filter_profiles`"""
        # Anatomy data is pre-filled by Collectors
        anatomy_data = instance.data["anatomyData"]

        # Task can be optional in anatomy data
        task = anatomy_data.get("task", {})

        # Return filter criteria
        return {
            "families": anatomy_data["family"],
            "tasks": task.get("name"),
            "task_types": task.get("type"),
            "hosts": instance.context.data["hostName"],
        }

    def get_rootless_path(self, anatomy, path):
        """Return, if possible, a path without the absolute portion from
        the root (eg. 'c:\\' or '/opt/..').

        This information is platform dependent and shouldn't be captured.
        Example:
            'c:/projects/MyProject1/Assets/publish...' >
            '{root}/MyProject1/Assets...'

        Args:
            anatomy: anatomy part from instance
            path: path (absolute)
        Returns:
            path: modified path if possible, or unmodified path
                  + warning logged
        """
        success, rootless_path = anatomy.find_root_template_from_path(path)
        if success:
            path = rootless_path
        else:
            self.log.warning((
                "Could not find root path for remapping \"{}\"."
                " This may cause issues on farm."
            ).format(path))
        return path

    def get_files_info(self, destinations, sites, anatomy):
        """Prepare 'files' info portion for representations.

        Arguments:
            destinations (list): List of transferred file destinations
            sites (list): array of published locations
            anatomy: anatomy part from instance
        Returns:
            output_resources: array of dictionaries to be added to 'files' key
                in representation
        """
        file_infos = []
        for file_path in destinations:
            file_info = self.prepare_file_info(file_path, anatomy, sites=sites)
            file_infos.append(file_info)
        return file_infos

    def prepare_file_info(self, path, anatomy, sites):
        """Prepare information for one file (asset or resource).

        Arguments:
            path: destination url of published file
            anatomy: anatomy part from instance
            sites: array of published locations,
                [{'name': 'studio', 'created_dt': date}] by default;
                keys expected: ['studio', 'site1', 'gdrive1']

        Returns:
            dict: file info dictionary
        """
        return {
            "_id": ObjectId(),
            "path": self.get_rootless_path(anatomy, path),
            "size": os.path.getsize(path),
            "hash": openpype.api.source_hash(path),
            "sites": sites
        }

@@ -69,8 +69,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
"data": additional metadata for each representation.
|
||||
"""
|
||||
|
||||
label = "Integrate Asset New"
|
||||
order = pyblish.api.IntegratorOrder
|
||||
label = "Integrate Asset (legacy)"
|
||||
# Make sure it happens after new integrator
|
||||
order = pyblish.api.IntegratorOrder + 0.00001
|
||||
families = ["workfile",
|
||||
"pointcache",
|
||||
"camera",
|
||||
|

@@ -101,7 +102,6 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
"source",
|
||||
"matchmove",
|
||||
"image",
|
||||
"source",
|
||||
"assembly",
|
||||
"fbx",
|
||||
"textures",
|
||||
|

@@ -142,6 +142,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
    subset_grouping_profiles = None

    def process(self, instance):
        if instance.data.get("processedWithNewIntegrator"):
            self.log.info("Instance was already processed with new integrator")
            return

        for ef in self.exclude_families:
            if (
                instance.data["family"] == ef or

openpype/plugins/publish/integrate_subset_group.py (new file, 98 lines)
@@ -0,0 +1,98 @@
"""Produces instance.data["subsetGroup"] data used during integration.
|
||||
|
||||
Requires:
|
||||
dict -> context["anatomyData"] *(pyblish.api.CollectorOrder + 0.49)
|
||||
|
||||
Provides:
|
||||
instance -> subsetGroup (str)
|
||||
|
||||
"""
|
||||
import pyblish.api
|
||||
|
||||
from openpype.lib.profiles_filtering import filter_profiles
|
||||
from openpype.lib import (
|
||||
prepare_template_data,
|
||||
StringTemplate,
|
||||
TemplateUnsolved
|
||||
)
|
||||
|
||||
|
||||
class IntegrateSubsetGroup(pyblish.api.InstancePlugin):
|
||||
"""Integrate Subset Group for publish."""
|
||||
|
||||
# Run after CollectAnatomyInstanceData
|
||||
order = pyblish.api.IntegratorOrder - 0.1
|
||||
label = "Subset Group"
|
||||
|
||||
# Attributes set by settings
|
||||
subset_grouping_profiles = None
|
||||
|
||||
def process(self, instance):
|
||||
"""Look into subset group profiles set by settings.
|
||||
|
||||
Attribute 'subset_grouping_profiles' is defined by OpenPype settings.
|
||||
"""
|
||||
|
||||
# Skip if 'subset_grouping_profiles' is empty
|
||||
if not self.subset_grouping_profiles:
|
||||
return
|
||||
|
||||
if instance.data.get("subsetGroup"):
|
||||
# If subsetGroup is already set then allow that value to remain
|
||||
self.log.debug((
|
||||
"Skipping collect subset group due to existing value: {}"
|
||||
).format(instance.data["subsetGroup"]))
|
||||
return
|
||||
|
||||
# Skip if there is no matching profile
|
||||
filter_criteria = self.get_profile_filter_criteria(instance)
|
||||
profile = filter_profiles(
|
||||
self.subset_grouping_profiles,
|
||||
filter_criteria,
|
||||
logger=self.log
|
||||
)
|
||||
|
||||
if not profile:
|
||||
return
|
||||
|
||||
template = profile["template"]
|
||||
|
||||
fill_pairs = prepare_template_data({
|
||||
"family": filter_criteria["families"],
|
||||
"task": filter_criteria["tasks"],
|
||||
"host": filter_criteria["hosts"],
|
||||
"subset": instance.data["subset"],
|
||||
"renderlayer": instance.data.get("renderlayer")
|
||||
})
|
||||
|
||||
filled_template = None
|
||||
try:
|
||||
filled_template = StringTemplate.format_strict_template(
|
||||
template, fill_pairs
|
||||
)
|
||||
except (KeyError, TemplateUnsolved):
|
||||
keys = fill_pairs.keys()
|
||||
self.log.warning((
|
||||
"Subset grouping failed. Only {} are expected in Settings"
|
||||
).format(','.join(keys)))
|
||||
|
||||
if filled_template:
|
||||
instance.data["subsetGroup"] = filled_template
|
||||
|
||||
def get_profile_filter_criteria(self, instance):
|
||||
"""Return filter criteria for `filter_profiles`"""
|
||||
# TODO: This logic is used in much more plug-ins in one way or another
|
||||
# Maybe better suited for lib?
|
||||
# Anatomy data is pre-filled by Collectors
|
||||
anatomy_data = instance.data["anatomyData"]
|
||||
|
||||
# Task can be optional in anatomy data
|
||||
task = anatomy_data.get("task", {})
|
||||
|
||||
# Return filter criteria
|
||||
return {
|
||||
"families": anatomy_data["family"],
|
||||
"tasks": task.get("name"),
|
||||
"hosts": anatomy_data["app"],
|
||||
"task_types": task.get("type")
|
||||
}
|
||||
|
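
For illustration, the fill step above resolves a profile template against the
instance's context. A minimal sketch, assuming `prepare_template_data` expands
each key into its usual case variants ({family}/{Family}/{FAMILY}):

    from openpype.lib import prepare_template_data, StringTemplate

    fill_pairs = prepare_template_data(
        {"family": "render", "task": "lighting"})
    group = StringTemplate.format_strict_template(
        "{Family}{Task}", fill_pairs)
    # group == "RenderLighting", stored as instance.data["subsetGroup"]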

@@ -1,5 +1,5 @@
import pyblish.api
import openpype.lib
from openpype.pipeline.load import any_outdated_containers


class ShowInventory(pyblish.api.Action):

@@ -19,10 +19,10 @@ class ValidateContainers(pyblish.api.ContextPlugin):

    label = "Validate Containers"
    order = pyblish.api.ValidatorOrder
    hosts = ["maya", "houdini", "nuke", "harmony", "photoshop"]
    hosts = ["maya", "houdini", "nuke", "harmony", "photoshop", "aftereffects"]
    optional = True
    actions = [ShowInventory]

    def process(self, context):
        if openpype.lib.any_outdated():
        if any_outdated_containers():
            raise ValueError("There are outdated containers in the scene.")

@@ -159,7 +159,27 @@
            }
        ]
    },
    "IntegrateSubsetGroup": {
        "subset_grouping_profiles": [
            {
                "families": [],
                "hosts": [],
                "task_types": [],
                "tasks": [],
                "template": ""
            }
        ]
    },
    "IntegrateAssetNew": {
        "subset_grouping_profiles": [
            {
                "families": [],
                "hosts": [],
                "task_types": [],
                "tasks": [],
                "template": ""
            }
        ],
        "template_name_profiles": [
            {
                "families": [],

@@ -202,17 +222,11 @@
"tasks": [],
|
||||
"template_name": "maya2unreal"
|
||||
}
|
||||
],
|
||||
"subset_grouping_profiles": [
|
||||
{
|
||||
"families": [],
|
||||
"hosts": [],
|
||||
"task_types": [],
|
||||
"tasks": [],
|
||||
"template": ""
|
||||
}
|
||||
]
|
||||
},
|
||||
"IntegrateAsset": {
|
||||
"skip_host_families": []
|
||||
},
|
||||
"IntegrateHeroVersion": {
|
||||
"enabled": true,
|
||||
"optional": true,
|
||||
|
|
|
|||

@@ -528,10 +528,111 @@
    {
        "type": "dict",
        "collapsible": true,
        "key": "IntegrateAssetNew",
        "label": "IntegrateAssetNew",
        "key": "IntegrateSubsetGroup",
        "label": "Integrate Subset Group",
        "is_group": true,
        "children": [
            {
                "type": "list",
                "key": "subset_grouping_profiles",
                "label": "Subset grouping profiles",
                "use_label_wrap": true,
                "object_type": {
                    "type": "dict",
                    "children": [
                        {
                            "type": "label",
                            "label": "Set all published instances as a part of specific group named according to 'Template'. <br>Implemented all variants of placeholders [{task},{family},{host},{subset},{renderlayer}]"
                        },
                        {
                            "key": "families",
                            "label": "Families",
                            "type": "list",
                            "object_type": "text"
                        },
                        {
                            "type": "hosts-enum",
                            "key": "hosts",
                            "label": "Hosts",
                            "multiselection": true
                        },
                        {
                            "key": "task_types",
                            "label": "Task types",
                            "type": "task-types-enum"
                        },
                        {
                            "key": "tasks",
                            "label": "Task names",
                            "type": "list",
                            "object_type": "text"
                        },
                        {
                            "type": "separator"
                        },
                        {
                            "type": "text",
                            "key": "template",
                            "label": "Template"
                        }
                    ]
                }
            }
        ]
    },
    {
        "type": "dict",
        "collapsible": true,
        "key": "IntegrateAssetNew",
        "label": "IntegrateAsset (Legacy)",
        "is_group": true,
        "children": [
            {
                "type": "label",
                "label": "<b>NOTE:</b> Subset grouping profiles settings were moved to <a href=\"settings://project_settings/global/publish/IntegrateSubsetGroup/subset_grouping_profiles\"><b>Integrate Subset Group</b></a>. Please move values there."
            },
            {
                "type": "list",
                "key": "subset_grouping_profiles",
                "label": "Subset grouping profiles (DEPRECATED)",
                "use_label_wrap": true,
                "object_type": {
                    "type": "dict",
                    "children": [
                        {
                            "key": "families",
                            "label": "Families",
                            "type": "list",
                            "object_type": "text"
                        },
                        {
                            "type": "hosts-enum",
                            "key": "hosts",
                            "label": "Hosts",
                            "multiselection": true
                        },
                        {
                            "key": "task_types",
                            "label": "Task types",
                            "type": "task-types-enum"
                        },
                        {
                            "key": "tasks",
                            "label": "Task names",
                            "type": "list",
                            "object_type": "text"
                        },
                        {
                            "type": "separator"
                        },
                        {
                            "type": "text",
                            "key": "template",
                            "label": "Template"
                        }
                    ]
                }
            },
            {
                "type": "list",
                "key": "template_name_profiles",

@@ -577,49 +678,34 @@
                    }
                ]
            }
        },
        }
    ]
    },
    {
        "type": "dict",
        "collapsible": true,
        "key": "IntegrateAsset",
        "label": "Integrate Asset",
        "is_group": true,
        "children": [
            {
                "type": "list",
                "key": "subset_grouping_profiles",
                "label": "Subset grouping profiles",
                "key": "skip_host_families",
                "label": "Skip hosts and families",
                "use_label_wrap": true,
                "object_type": {
                    "type": "dict",
                    "children": [
                        {
                            "type": "label",
                            "label": "Set all published instances as a part of specific group named according to 'Template'. <br>Implemented all variants of placeholders [{task},{family},{host},{subset},{renderlayer}]"
                            "type": "hosts-enum",
                            "key": "host",
                            "label": "Host"
                        },
                        {
                            "type": "list",
                            "key": "families",
                            "label": "Families",
                            "type": "list",
                            "object_type": "text"
                        },
                        {
                            "type": "hosts-enum",
                            "key": "hosts",
                            "label": "Hosts",
                            "multiselection": true
                        },
                        {
                            "key": "task_types",
                            "label": "Task types",
                            "type": "task-types-enum"
                        },
                        {
                            "key": "tasks",
                            "label": "Task names",
                            "type": "list",
                            "object_type": "text"
                        },
                        {
                            "type": "separator"
                        },
                        {
                            "type": "text",
                            "key": "template",
                            "label": "Template"
                        }
                    ]
                }

@@ -22,7 +22,6 @@ def test_backward_compatibility(printer):
    from openpype.lib import any_outdated
    from openpype.lib import get_asset
    from openpype.lib import get_linked_assets
    from openpype.lib import get_latest_version
    from openpype.lib import get_ffprobe_streams

    from openpype.hosts.fusion.lib import switch_item

@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.12.2-nightly.2"
__version__ = "3.12.2-nightly.3"

@@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
version = "3.12.2-nightly.2" # OpenPype
version = "3.12.2-nightly.3" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"

@@ -196,12 +196,12 @@ html[data-theme='dark'] .header-github-link::before {
    padding: 20px
}

.showcase .client {
.showcase .studio {
    display: flex;
    justify-content: space-between;
}

.showcase .client img {
.showcase .studio img {
    max-height: 110px;
    padding: 20px;
    max-width: 160px;

@@ -65,13 +65,17 @@ const collab = [
    image: '/img/clothcat.png',
    infoLink: 'https://www.clothcatanimation.com/'
  }, {
    title: 'Ellipse Studio',
    image: '/img/ellipse-studio.png',
    infoLink: 'http://www.dargaudmedia.com'
    title: 'Ellipse Animation',
    image: '/img/ellipse_animation.svg',
    infoLink: 'http://www.ellipseanimation.com'
  }, {
    title: 'J Cube Inc',
    image: '/img/jcube_logo_bw.png',
    infoLink: 'https://j-cube.jp'
  }, {
    title: 'Normaal Animation',
    image: '/img/logo_normaal.png',
    infoLink: 'https://j-cube.jp'
  }
];

@@ -153,7 +157,32 @@ const studios = [
title: "IGG Canada",
|
||||
image: "/img/igg-logo.png",
|
||||
infoLink: "https://www.igg.com/",
|
||||
}
|
||||
},
|
||||
{
|
||||
title: "Agora Studio",
|
||||
image: "/img/agora_studio.png",
|
||||
infoLink: "https://agora.studio/",
|
||||
},
|
||||
{
|
||||
title: "Lucan Visuals",
|
||||
image: "/img/lucan_Logo_On_White-HR.png",
|
||||
infoLink: "https://www.lucan.tv/",
|
||||
},
|
||||
{
|
||||
title: "No Ghost",
|
||||
image: "/img/noghost.png",
|
||||
infoLink: "https://www.noghost.co.uk/",
|
||||
},
|
||||
{
|
||||
title: "Static VFX",
|
||||
image: "/img/staticvfx.png",
|
||||
infoLink: "http://www.staticvfx.com/",
|
||||
},
|
||||
{
|
||||
title: "Method n Madness",
|
||||
image: "/img/methodmadness.png",
|
||||
infoLink: "https://www.methodnmadness.com/",
|
||||
}
|
||||
];
|
||||
|
||||
function Service({imageUrl, title, description}) {
|
||||
|
|
@ -166,10 +195,10 @@ function Service({imageUrl, title, description}) {
|
|||
);
|
||||
}
|
||||
|
||||
function Client({title, image, infoLink}) {
|
||||
function Studio({title, image, infoLink}) {
|
||||
const imgUrl = useBaseUrl(image);
|
||||
return (
|
||||
<a className="client" href={infoLink}>
|
||||
<a className="studio" href={infoLink}>
|
||||
<img src={image} alt="" title={title}></img>
|
||||
</a>
|
||||
);
|
||||
|

@@ -465,7 +494,7 @@ function Home() {
          <h2>Studios using openPype</h2>
          <div className="showcase">
            {studios.map((props, idx) => (
              <Client key={idx} {...props} />
              <Studio key={idx} {...props} />
            ))}
          </div>
        </div>

website/static/img/NoGhost_Logo_black.svg (new file, 31 lines)
@@ -0,0 +1,31 @@
<?xml version="1.0" encoding="utf-8"?>
|
||||
<!-- Generator: Adobe Illustrator 25.3.1, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
|
||||
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
|
||||
viewBox="0 0 1341 216" style="enable-background:new 0 0 1341 216;" xml:space="preserve">
|
||||
<style type="text/css">
|
||||
.st0{fill:#000000;}
|
||||
</style>
|
||||
<g>
|
||||
<path class="st0" d="M132,0.3l0,18l47,81.1c3.9,7.3,2.3,16.6-4.2,22.2c-7.6,6.5-19,5.7-25.6-1.9c-0.9-1-2.2-3.2-2.2-3.2L79.7,0.3
|
||||
H39.3C17.8,0.3,0.4,17.7,0.4,39.2L0.3,215.8h83.8l0-17.9l-46.3-80.1c-4.6-7.4-3.3-17.3,3.4-23.2c5.5-4.9,13.6-5.7,20.2-2.5
|
||||
c4,1.9,6.4,4.7,8.1,7.9l66.8,115.8h40.3c21.5,0,38.9-17.4,38.9-38.9V0.3H132z"/>
|
||||
<path class="st0" d="M367.8,0.3H227.6v176c0,21.8,17.7,39.5,39.5,39.5h140.2v-176C407.3,18,389.6,0.3,367.8,0.3z M350,191.9
|
||||
c-10.1,0-18.9-5.5-23.5-13.7l0,0L261.8,66.4c-2.9-4.3-4.6-9.5-4.6-15.1c0-14.9,12.1-27,27-27c10.7,0,20,6.2,24.3,15.3l0,0
|
||||
l64.4,111.3c0.3,0.4,0.5,0.8,0.7,1.3l0.1,0.1l0,0c2,3.8,3.2,8.1,3.2,12.7C377,179.8,364.9,191.9,350,191.9z"/>
|
||||
<path class="st0" d="M984.5,0.3H844.3v176c0,21.8,17.7,39.5,39.5,39.5H1024v-176C1024,18,1006.3,0.3,984.5,0.3z M966.7,191.9
|
||||
c-10.1,0-18.9-5.5-23.5-13.7l0,0L878.6,66.4c-2.9-4.3-4.6-9.5-4.6-15.1c0-14.9,12.1-27,27-27c10.7,0,20,6.2,24.3,15.3l0,0
|
||||
l64.2,111.3c0.3,0.4,0.5,0.8,0.7,1.3l0.1,0.1l0,0c2,3.8,3.2,8.1,3.2,12.7C993.7,179.8,981.6,191.9,966.7,191.9z"/>
|
||||
<path class="st0" d="M554.5,96.5v17.9l28.7,49.7c0.3,0.4,0.5,0.8,0.7,1.3l0.1,0.2l0,0c1.2,2.4,1.9,5.2,1.9,8.1
|
||||
c0,10-8.1,18.1-18.1,18.1c-6.3,0-11.8-3.2-15.1-8l0,0l-0.7-1.3c-0.1-0.1-0.1-0.2-0.2-0.3L497.4,88l0,0c-1.8-2.8-2.8-6.1-2.8-9.7
|
||||
c0-10,8.1-18.1,18.1-18.1c0,0,0,0,0,0h95c14.9,0,26.9-12.1,26.9-26.9V0.3H529.2c0,0,0,0,0,0c-50,0-90.8,40-91.9,89.8l0,0v35.9l0,0
|
||||
c1.2,49.8,41.9,89.7,91.9,89.7h0h105.5V96.5H554.5z"/>
|
||||
<path class="st0" d="M748.6,0.3l0,18.2l26,45.1c1.3,2.5,2.1,5.4,2.1,8.4c0,10-8.1,18.1-18.1,18.1h-29.5c-10,0-18.1-8.1-18.1-18.1
|
||||
V0.3h-64.2v215.5h83.8l0-17.9l-26.4-45.7c-1.3-2.5-2-5.3-2-8.2c0-10,8.1-18.1,18.1-18.1c0,0,0,0,0,0h29.5c0,0,0,0,0,0
|
||||
c10,0,18.1,8.1,18.1,18.1l0,71.9h64.5V0.3H748.6z"/>
|
||||
<path class="st0" d="M1269.1,60.2c0.1,0,71.6,0,71.6,0v-33c0-14.9-12.1-26.9-26.9-26.9h-93.7c-1.2,0-2.4,0-3.6,0
|
||||
c-123.6,0-149.9,0-154.1,0c-14.9,0-26.9,12.1-26.9,26.9c0,5.3,1.6,10.3,4.2,14.5l71.2,123.3c1.4,2.6,2.3,5.6,2.3,8.8
|
||||
c0,10-8.1,18.1-18.1,18.1c-6.6,0-12.4-3.6-15.6-8.9l-15.7-27.1h-28.3v59.9h134.5c14.9,0,26.9-12.1,26.9-26.9c0-5-1.4-9.8-3.8-13.8
|
||||
l-71.5-123.7l0,0c-1.5-2.6-2.3-5.6-2.3-8.8c0-10,8.1-18.1,18.1-18.1c7.2,0,13.4,4.2,16.4,10.3l14.8,25.4h40.4v155.6h83.8l-0.1-60.6
|
||||
l-39.5-68c-1.5-2.6-2.3-5.6-2.3-8.8C1251,68.3,1259.1,60.2,1269.1,60.2z"/>
|
||||
</g>
|
||||
</svg>
|
||||
|
(after: 2.7 KiB)

website/static/img/agora_studio.png (new binary file, 131 KiB)
website/static/img/ellipse_animation.svg (new file, 9 lines, 63 KiB)
(image replaced: 78 KiB before, 94 KiB after)
website/static/img/logo_normaal.png (new binary file, 13 KiB)
website/static/img/lucan_Logo_On_White-HR.png (new binary file, 76 KiB)
website/static/img/methodmadness.png (new binary file, 8.4 KiB)
website/static/img/noghost.png (new binary file, 22 KiB)
website/static/img/staticvfx.png (new binary file, 13 KiB)