mirror of
https://github.com/ynput/ayon-core.git
synced 2025-12-24 21:04:40 +01:00
Merge branch 'develop' into feature/OP-3419_Move-editorial-logic-to-pipeline
Commit 83ff0a8fcc
69 changed files with 1083 additions and 743 deletions
CHANGELOG.md
@@ -1,8 +1,42 @@
# Changelog

## [3.11.0-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.11.1](https://github.com/pypeclub/OpenPype/tree/3.11.1) (2022-06-20)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.10.0...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.11.0...3.11.1)

**🆕 New features**

- Flame: custom export temp folder [\#3346](https://github.com/pypeclub/OpenPype/pull/3346)
- Nuke: removing third-party plugins [\#3344](https://github.com/pypeclub/OpenPype/pull/3344)

**🚀 Enhancements**

- Pyblish Pype: Hiding/Close issues [\#3367](https://github.com/pypeclub/OpenPype/pull/3367)
- Ftrack: Removed requirement of pypeclub role from default settings [\#3354](https://github.com/pypeclub/OpenPype/pull/3354)
- Kitsu: Prevent crash on missing frames information [\#3352](https://github.com/pypeclub/OpenPype/pull/3352)
- Ftrack: Open browser from tray [\#3320](https://github.com/pypeclub/OpenPype/pull/3320)
- Enhancement: More control over thumbnail processing. [\#3259](https://github.com/pypeclub/OpenPype/pull/3259)

**🐛 Bug fixes**

- Nuke: bake streams with slate on farm [\#3368](https://github.com/pypeclub/OpenPype/pull/3368)
- Harmony: audio validator has wrong logic [\#3364](https://github.com/pypeclub/OpenPype/pull/3364)
- Nuke: Fix missing variable in extract thumbnail [\#3363](https://github.com/pypeclub/OpenPype/pull/3363)
- Nuke: Fix precollect writes [\#3361](https://github.com/pypeclub/OpenPype/pull/3361)
- AE- fix validate\_scene\_settings and renderLocal [\#3358](https://github.com/pypeclub/OpenPype/pull/3358)
- deadline: fixing misidentification of revieables [\#3356](https://github.com/pypeclub/OpenPype/pull/3356)
- General: Create only one thumbnail per instance [\#3351](https://github.com/pypeclub/OpenPype/pull/3351)
- General: Fix last version function [\#3345](https://github.com/pypeclub/OpenPype/pull/3345)
- Deadline: added OPENPYPE\_MONGO to filter [\#3336](https://github.com/pypeclub/OpenPype/pull/3336)
- Nuke: fixing farm publishing if review is disabled [\#3306](https://github.com/pypeclub/OpenPype/pull/3306)

**🔀 Refactored code**

- Webpublisher: Use client query functions [\#3333](https://github.com/pypeclub/OpenPype/pull/3333)

## [3.11.0](https://github.com/pypeclub/OpenPype/tree/3.11.0) (2022-06-17)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.11.0-nightly.4...3.11.0)

### 📖 Documentation

@@ -22,11 +56,8 @@
- General: Updated windows oiio tool [\#3268](https://github.com/pypeclub/OpenPype/pull/3268)
- Unreal: add support for skeletalMesh and staticMesh to loaders [\#3267](https://github.com/pypeclub/OpenPype/pull/3267)
- Maya: reference loaders could store placeholder in referenced url [\#3264](https://github.com/pypeclub/OpenPype/pull/3264)
- Maya: FBX camera export [\#3253](https://github.com/pypeclub/OpenPype/pull/3253)
- TVPaint: Init file for TVPaint worker also handle guideline images [\#3250](https://github.com/pypeclub/OpenPype/pull/3250)
- Nuke: Change default icon path in settings [\#3247](https://github.com/pypeclub/OpenPype/pull/3247)
- Maya: publishing of animation and pointcache on a farm [\#3225](https://github.com/pypeclub/OpenPype/pull/3225)
- Maya: Look assigner UI improvements [\#3208](https://github.com/pypeclub/OpenPype/pull/3208)

**🐛 Bug fixes**

@@ -43,13 +74,12 @@
- Webpublisher: return only active projects in ProjectsEndpoint [\#3281](https://github.com/pypeclub/OpenPype/pull/3281)
- Hiero: add support for task tags 3.10.x [\#3279](https://github.com/pypeclub/OpenPype/pull/3279)
- General: Fix Oiio tool path resolving [\#3278](https://github.com/pypeclub/OpenPype/pull/3278)
- Maya: Fix udim support for e.g. uppercase \<UDIM\> tag [\#3266](https://github.com/pypeclub/OpenPype/pull/3266)
- Nuke: bake reformat was failing on string type [\#3261](https://github.com/pypeclub/OpenPype/pull/3261)
- Maya: hotfix Pxr multitexture in looks [\#3260](https://github.com/pypeclub/OpenPype/pull/3260)
- Unreal: Fix Camera Loading if Layout is missing [\#3255](https://github.com/pypeclub/OpenPype/pull/3255)
- Unreal: Fixed Animation loading in UE5 [\#3240](https://github.com/pypeclub/OpenPype/pull/3240)
- Unreal: Fixed Render creation in UE5 [\#3239](https://github.com/pypeclub/OpenPype/pull/3239)
- Unreal: Fixed Camera loading in UE5 [\#3238](https://github.com/pypeclub/OpenPype/pull/3238)
- TVPaint: Look for more groups than 12 [\#3228](https://github.com/pypeclub/OpenPype/pull/3228)
- Flame: debugging [\#3224](https://github.com/pypeclub/OpenPype/pull/3224)

**🔀 Refactored code**

@@ -60,7 +90,7 @@

- Maya: add pointcache family to gpu cache loader [\#3318](https://github.com/pypeclub/OpenPype/pull/3318)
- Maya look: skip empty file attributes [\#3274](https://github.com/pypeclub/OpenPype/pull/3274)
- Harmony: message length in 21.1 [\#3258](https://github.com/pypeclub/OpenPype/pull/3258)
- Harmony: 21.1 fix [\#3248](https://github.com/pypeclub/OpenPype/pull/3248)

## [3.10.0](https://github.com/pypeclub/OpenPype/tree/3.10.0) (2022-05-26)

@@ -68,46 +98,26 @@

**🚀 Enhancements**

- TVPaint: Init file for TVPaint worker also handle guideline images [\#3251](https://github.com/pypeclub/OpenPype/pull/3251)
- Maya: FBX camera export [\#3253](https://github.com/pypeclub/OpenPype/pull/3253)
- General: updating common vendor `scriptmenu` to 1.5.2 [\#3246](https://github.com/pypeclub/OpenPype/pull/3246)
- Project Manager: Allow to paste Tasks into multiple assets at the same time [\#3226](https://github.com/pypeclub/OpenPype/pull/3226)
- Project manager: Sped up project load [\#3216](https://github.com/pypeclub/OpenPype/pull/3216)

**🐛 Bug fixes**

- Maya: Fix udim support for e.g. uppercase \<UDIM\> tag [\#3266](https://github.com/pypeclub/OpenPype/pull/3266)
- Unreal: Fix Camera Loading if Layout is missing [\#3255](https://github.com/pypeclub/OpenPype/pull/3255)
- nuke: use framerange issue [\#3254](https://github.com/pypeclub/OpenPype/pull/3254)
- Ftrack: Chunk sizes for queries has minimal condition [\#3244](https://github.com/pypeclub/OpenPype/pull/3244)
- Maya: renderman displays needs to be filtered [\#3242](https://github.com/pypeclub/OpenPype/pull/3242)
- Ftrack: Validate that the user exists on ftrack [\#3237](https://github.com/pypeclub/OpenPype/pull/3237)
- Maya: Fix support for multiple resolutions [\#3236](https://github.com/pypeclub/OpenPype/pull/3236)
- Hiero: debugging frame range and other 3.10 [\#3222](https://github.com/pypeclub/OpenPype/pull/3222)
- Project Manager: Fix persistent editors on project change [\#3218](https://github.com/pypeclub/OpenPype/pull/3218)
- Deadline: instance data overwrite fix [\#3214](https://github.com/pypeclub/OpenPype/pull/3214)
- Ftrack: Push hierarchical attributes action works [\#3210](https://github.com/pypeclub/OpenPype/pull/3210)
- Standalone Publisher: Always create new representation for thumbnail [\#3203](https://github.com/pypeclub/OpenPype/pull/3203)
- Photoshop: skip collector when automatic testing [\#3202](https://github.com/pypeclub/OpenPype/pull/3202)

**Merged pull requests:**

- Harmony: message length in 21.1 [\#3257](https://github.com/pypeclub/OpenPype/pull/3257)
- Harmony: 21.1 fix [\#3249](https://github.com/pypeclub/OpenPype/pull/3249)
- Maya: added jpg to filter for Image Plane Loader [\#3223](https://github.com/pypeclub/OpenPype/pull/3223)

## [3.9.8](https://github.com/pypeclub/OpenPype/tree/3.9.8) (2022-05-19)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.7...3.9.8)

**🚀 Enhancements**

- nuke: generate publishing nodes inside render group node [\#3206](https://github.com/pypeclub/OpenPype/pull/3206)
- Loader UI: Speed issues of loader with sync server [\#3200](https://github.com/pypeclub/OpenPype/pull/3200)

**🐛 Bug fixes**

- Standalone Publisher: Always create new representation for thumbnail [\#3204](https://github.com/pypeclub/OpenPype/pull/3204)

## [3.9.7](https://github.com/pypeclub/OpenPype/tree/3.9.7) (2022-05-11)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.6...3.9.7)
@@ -44,7 +44,6 @@ from . import resources
from .plugin import (
    Extractor,
    Integrator,

    ValidatePipelineOrder,
    ValidateContentsOrder,
@@ -87,7 +86,6 @@ __all__ = [
    # plugin classes
    "Extractor",
    "Integrator",
    # ordering
    "ValidatePipelineOrder",
    "ValidateContentsOrder",
@@ -5,6 +5,7 @@ from .entities import (
    get_asset_by_id,
    get_asset_by_name,
    get_assets,
    get_archived_assets,
    get_asset_ids_with_subsets,

    get_subset_by_id,
@@ -41,6 +42,7 @@ __all__ = (
    "get_asset_by_id",
    "get_asset_by_name",
    "get_assets",
    "get_archived_assets",
    "get_asset_ids_with_subsets",

    "get_subset_by_id",
@@ -139,8 +139,16 @@ def get_asset_by_name(project_name, asset_name, fields=None):
    return conn.find_one(query_filter, _prepare_fields(fields))


def get_assets(
    project_name, asset_ids=None, asset_names=None, archived=False, fields=None
# NOTE this could be just public function?
# - any better variable name instead of 'standard'?
# - same approach can be used for rest of types
def _get_assets(
    project_name,
    asset_ids=None,
    asset_names=None,
    standard=True,
    archived=False,
    fields=None
):
    """Assets for specified project by passed filters.

@@ -153,6 +161,8 @@ def get_assets(
        project_name (str): Name of project where to look for queried entities.
        asset_ids (list[str|ObjectId]): Asset ids that should be found.
        asset_names (list[str]): Name assets that should be found.
        standard (bool): Query standard assets (type 'asset').
        archived (bool): Query archived assets (type 'archived_asset').
        fields (list[str]): Fields that should be returned. All fields are
            returned if 'None' is passed.

@@ -161,10 +171,15 @@ def get_assets(
        passed filters.
    """

    asset_types = ["asset"]
    asset_types = []
    if standard:
        asset_types.append("asset")
    if archived:
        asset_types.append("archived_asset")

    if not asset_types:
        return []

    if len(asset_types) == 1:
        query_filter = {"type": asset_types[0]}
    else:
@@ -186,6 +201,68 @@ def get_assets(
    return conn.find(query_filter, _prepare_fields(fields))

def get_assets(
    project_name,
    asset_ids=None,
    asset_names=None,
    archived=False,
    fields=None
):
    """Assets for specified project by passed filters.

    Passed filters (ids and names) are always combined so all conditions must
    match.

    To receive all assets from project just keep filters empty.

    Args:
        project_name (str): Name of project where to look for queried entities.
        asset_ids (list[str|ObjectId]): Asset ids that should be found.
        asset_names (list[str]): Name assets that should be found.
        archived (bool): Add also archived assets.
        fields (list[str]): Fields that should be returned. All fields are
            returned if 'None' is passed.

    Returns:
        Cursor: Query cursor as iterable which returns asset documents matching
            passed filters.
    """

    return _get_assets(
        project_name, asset_ids, asset_names, True, archived, fields
    )


def get_archived_assets(
    project_name,
    asset_ids=None,
    asset_names=None,
    fields=None
):
    """Archived assets for specified project by passed filters.

    Passed filters (ids and names) are always combined so all conditions must
    match.

    To receive all archived assets from project just keep filters empty.

    Args:
        project_name (str): Name of project where to look for queried entities.
        asset_ids (list[str|ObjectId]): Asset ids that should be found.
        asset_names (list[str]): Name assets that should be found.
        fields (list[str]): Fields that should be returned. All fields are
            returned if 'None' is passed.

    Returns:
        Cursor: Query cursor as iterable which returns asset documents matching
            passed filters.
    """

    return _get_assets(
        project_name, asset_ids, asset_names, False, True, fields
    )

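The refactor above routes both public helpers through one private `_get_assets` that takes `standard`/`archived` flags. A minimal standalone sketch of that pattern, assuming an in-memory document list in place of the real MongoDB connection (`_DOCS` and `find` are stand-ins, not OpenPype API; the filter-building logic mirrors the diff):

```python
# In-memory stand-in for the project's MongoDB collection (illustrative data).
_DOCS = [
    {"_id": 1, "type": "asset", "name": "hero"},
    {"_id": 2, "type": "asset", "name": "villain"},
    {"_id": 3, "type": "archived_asset", "name": "old_prop"},
]


def find(query_filter):
    """Minimal stand-in for pymongo's Collection.find."""
    def matches(doc):
        for key, condition in query_filter.items():
            value = doc.get(key)
            if isinstance(condition, dict) and "$in" in condition:
                if value not in condition["$in"]:
                    return False
            elif value != condition:
                return False
        return True
    return [doc for doc in _DOCS if matches(doc)]


def _get_assets(project_name, asset_names=None, standard=True, archived=False):
    # Same type-filter construction as the diff: zero, one or two asset types.
    asset_types = []
    if standard:
        asset_types.append("asset")
    if archived:
        asset_types.append("archived_asset")
    if not asset_types:
        return []

    if len(asset_types) == 1:
        query_filter = {"type": asset_types[0]}
    else:
        query_filter = {"type": {"$in": asset_types}}

    if asset_names is not None:
        query_filter["name"] = {"$in": list(asset_names)}
    return find(query_filter)


def get_assets(project_name, asset_names=None, archived=False):
    """Public wrapper: standard assets, optionally archived ones too."""
    return _get_assets(project_name, asset_names, True, archived)


def get_archived_assets(project_name, asset_names=None):
    """Public wrapper: archived assets only."""
    return _get_assets(project_name, asset_names, False, True)
```

The design keeps the public signatures stable while the shared query logic lives in one place; the NOTE comment in the diff suggests the same approach could later cover other document types.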
def get_asset_ids_with_subsets(project_name, asset_ids=None):
    """Find out which assets have existing subsets.

@@ -432,6 +509,7 @@ def _get_versions(
    project_name,
    subset_ids=None,
    version_ids=None,
    versions=None,
    standard=True,
    hero=False,
    fields=None
@@ -462,6 +540,16 @@ def _get_versions(
            return []
        query_filter["_id"] = {"$in": version_ids}

    if versions is not None:
        versions = list(versions)
        if not versions:
            return []

        if len(versions) == 1:
            query_filter["name"] = versions[0]
        else:
            query_filter["name"] = {"$in": versions}

    conn = _get_project_connection(project_name)

    return conn.find(query_filter, _prepare_fields(fields))
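The new `versions` filter above turns a list of version names (integers) into either a direct equality or a Mongo-style `$in` condition, short-circuiting to an empty result for an empty list. A runnable distillation (`build_version_filter` is a hypothetical helper name; the real code mutates `query_filter` inside `_get_versions`):

```python
def build_version_filter(versions):
    """Return the query condition for a list of version names, or None
    when the caller can short-circuit to an empty result without querying."""
    versions = list(versions)
    if not versions:
        return None  # caller returns [] immediately
    if len(versions) == 1:
        return versions[0]  # single value: plain equality match
    return {"$in": versions}  # multiple values: $in condition
```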
@@ -471,6 +559,7 @@ def get_versions(
    project_name,
    version_ids=None,
    subset_ids=None,
    versions=None,
    hero=False,
    fields=None
):
@@ -484,6 +573,8 @@ def get_versions(
            Filter ignored if 'None' is passed.
        subset_ids (list[str]): Subset ids that will be queried.
            Filter ignored if 'None' is passed.
        versions (list[int]): Version names (as integers).
            Filter ignored if 'None' is passed.
        hero (bool): Look also for hero versions.
        fields (list[str]): Fields that should be returned. All fields are
            returned if 'None' is passed.
@@ -496,6 +587,7 @@ def get_versions(
        project_name,
        subset_ids,
        version_ids,
        versions,
        standard=True,
        hero=hero,
        fields=fields
@@ -21,7 +21,7 @@ class AERenderInstance(RenderInstance):
    projectEntity = attr.ib(default=None)
    stagingDir = attr.ib(default=None)
    app_version = attr.ib(default=None)
    publish_attributes = attr.ib(default=None)
    publish_attributes = attr.ib(default={})
    file_name = attr.ib(default=None)
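One thing worth noting about the change above: with attrs, a literal `default={}` is evaluated once, so every instance shares the same dict (attrs' own idiom for a per-instance dict is `attr.Factory(dict)`). The stdlib `dataclasses` module makes this pitfall concrete by rejecting mutable defaults outright and offering `default_factory` instead. The class below is purely illustrative, not from the repo:

```python
from dataclasses import dataclass, field


@dataclass
class RenderSettings:
    # One fresh dict per instance; `publish_attributes: dict = {}` would
    # raise ValueError at class-definition time under dataclasses.
    publish_attributes: dict = field(default_factory=dict)


a = RenderSettings()
b = RenderSettings()
a.publish_attributes["review"] = True  # does not leak into b
```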
@@ -90,7 +90,7 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):

        subset_name = inst.data["subset"]
        instance = AERenderInstance(
            family=family,
            family="render",
            families=inst.data.get("families", []),
            version=version,
            time="",
@@ -116,7 +116,7 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
            toBeRenderedOn='deadline',
            fps=fps,
            app_version=app_version,
            publish_attributes=inst.data.get("publish_attributes"),
            publish_attributes=inst.data.get("publish_attributes", {}),
            file_name=render_q.file_name
        )
@@ -54,7 +54,7 @@ class ValidateSceneSettings(OptionalPyblishPluginMixin,

    order = pyblish.api.ValidatorOrder
    label = "Validate Scene Settings"
    families = ["render.farm", "render"]
    families = ["render.farm", "render.local", "render"]
    hosts = ["aftereffects"]
    optional = True
@@ -610,7 +610,8 @@ class ImageSequenceLoader(load.LoaderPlugin):
    def update(self, container, representation):
        node = container.pop("node")

        version = legacy_io.find_one({"_id": representation["parent"]})
        project_name = legacy_io.active_project()
        version = get_version_by_id(project_name, representation["parent"])
        files = []
        for f in version["data"]["files"]:
            files.append(
@@ -2,10 +2,10 @@ import os
from pathlib import Path
import logging

from bson.objectid import ObjectId
import pyblish.api

from openpype import lib
from openpype.client import get_representation_by_id
from openpype.lib import register_event_callback
from openpype.pipeline import (
    legacy_io,
@@ -104,22 +104,20 @@ def check_inventory():
    If it does it will colorize outdated nodes and display warning message
    in Harmony.
    """
    if not lib.any_outdated():
        return

    project_name = legacy_io.active_project()
    outdated_containers = []
    for container in ls():
        representation = container['representation']
        representation_doc = legacy_io.find_one(
            {
                "_id": ObjectId(representation),
                "type": "representation"
            },
            projection={"parent": True}
        representation_id = container['representation']
        representation_doc = get_representation_by_id(
            project_name, representation_id, fields=["parent"]
        )
        if representation_doc and not lib.is_latest(representation_doc):
            outdated_containers.append(container)

    if not outdated_containers:
        return

    # Colour nodes.
    outdated_nodes = []
    for container in outdated_containers:
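The refactored `check_inventory` above keeps the same control flow while swapping the raw `legacy_io.find_one` call for a client query function. A runnable sketch of that flow, assuming stand-ins for the host and repository functions (`ls`, `get_representation_by_id`, `is_latest` are faked with in-memory data here; all names and data are illustrative):

```python
# Representation documents keyed by id; "is_latest" stands in for the real
# version comparison done by lib.is_latest().
_REPRESENTATIONS = {
    "r1": {"_id": "r1", "parent": "v1", "is_latest": True},
    "r2": {"_id": "r2", "parent": "v2", "is_latest": False},
}


def ls():
    """Stand-in for the host's loaded-container listing."""
    return [{"name": "c1", "representation": "r1"},
            {"name": "c2", "representation": "r2"},
            {"name": "c3", "representation": "missing"}]


def get_representation_by_id(project_name, representation_id, fields=None):
    return _REPRESENTATIONS.get(representation_id)


def is_latest(representation_doc):
    return representation_doc["is_latest"]


def find_outdated_containers(project_name):
    outdated_containers = []
    for container in ls():
        representation_id = container["representation"]
        representation_doc = get_representation_by_id(
            project_name, representation_id, fields=["parent"]
        )
        # Missing documents are skipped rather than treated as outdated.
        if representation_doc and not is_latest(representation_doc):
            outdated_containers.append(container)
    return outdated_containers
```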
@@ -47,6 +47,6 @@ class ValidateAudio(pyblish.api.InstancePlugin):
        formatting_data = {
            "audio_url": audio_path
        }
        if os.path.isfile(audio_path):
        if not os.path.isfile(audio_path):
            raise PublishXmlValidationError(self, msg,
                                            formatting_data=formatting_data)
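The Harmony audio-validator fix above inverts a guard: the old code raised precisely when the audio file *existed*. A minimal runnable version of the corrected logic (`ValidationError` and `validate_audio` stand in for the real `PublishXmlValidationError` and plugin method):

```python
import os
import tempfile


class ValidationError(Exception):
    pass


def validate_audio(audio_path):
    # Correct direction: fail only when the file is missing.
    if not os.path.isfile(audio_path):
        raise ValidationError("Audio file not found: %s" % audio_path)


# A real temporary file passes validation; a bogus path raises.
with tempfile.NamedTemporaryFile() as tmp:
    validate_audio(tmp.name)

missing_raised = False
try:
    validate_audio(os.path.join(tempfile.gettempdir(), "no_such_audio.wav"))
except ValidationError:
    missing_raised = True
```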
@@ -130,6 +130,8 @@ def get_output_parameter(node):
    elif node_type == "arnold":
        if node.evalParm("ar_ass_export_enable"):
            return node.parm("ar_ass_file")
    elif node_type == "Redshift_Proxy_Output":
        return node.parm("RS_archive_file")

    raise TypeError("Node type '%s' not supported" % node_type)
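The Houdini helper above dispatches on a ROP node's type name to find its output parameter and raises `TypeError` for unsupported types. Outside Houdini the same shape can be expressed with a plain mapping; `Redshift_Proxy_Output`/`RS_archive_file` come from the diff, while `FakeNode` and the `geometry`/`sopoutput` entry are illustrative assumptions:

```python
_OUTPUT_PARM_BY_TYPE = {
    "Redshift_Proxy_Output": "RS_archive_file",
    "geometry": "sopoutput",  # assumption for illustration, not in the diff
}


class FakeNode:
    """Illustrative stand-in for a hou.Node."""

    def __init__(self, node_type):
        self._type = node_type

    def type_name(self):
        return self._type


def get_output_parameter_name(node):
    node_type = node.type_name()
    try:
        return _OUTPUT_PARM_BY_TYPE[node_type]
    except KeyError:
        # Mirrors the diff's failure mode for unsupported ROP types.
        raise TypeError("Node type '%s' not supported" % node_type)
```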
@@ -0,0 +1,48 @@
from openpype.hosts.houdini.api import plugin


class CreateRedshiftProxy(plugin.Creator):
    """Redshift Proxy"""

    label = "Redshift Proxy"
    family = "redshiftproxy"
    icon = "magic"

    def __init__(self, *args, **kwargs):
        super(CreateRedshiftProxy, self).__init__(*args, **kwargs)

        # Remove the active, we are checking the bypass flag of the nodes
        self.data.pop("active", None)

        # Redshift provides a `Redshift_Proxy_Output` node type which shows
        # a limited set of parameters by default and is set to extract a
        # Redshift Proxy. However when "imprinting" extra parameters needed
        # for OpenPype it starts showing all its parameters again. It's unclear
        # why this happens.
        # TODO: Somehow enforce so that it only shows the original limited
        # attributes of the Redshift_Proxy_Output node type
        self.data.update({"node_type": "Redshift_Proxy_Output"})

    def _process(self, instance):
        """Creator main entry point.

        Args:
            instance (hou.Node): Created Houdini instance.

        """
        parms = {
            "RS_archive_file": '$HIP/pyblish/`chs("subset")`.$F4.rs',
        }

        if self.nodes:
            node = self.nodes[0]
            path = node.path()
            parms["RS_archive_sopPath"] = path

        instance.setParms(parms)

        # Lock some Avalon attributes
        to_lock = ["family", "id"]
        for name in to_lock:
            parm = instance.parm(name)
            parm.lock(True)
@@ -20,7 +20,7 @@ class CollectFrames(pyblish.api.InstancePlugin):

    order = pyblish.api.CollectorOrder
    label = "Collect Frames"
    families = ["vdbcache", "imagesequence", "ass"]
    families = ["vdbcache", "imagesequence", "ass", "redshiftproxy"]

    def process(self, instance):

@@ -12,6 +12,7 @@ class CollectOutputSOPPath(pyblish.api.InstancePlugin):
        "imagesequence",
        "usd",
        "usdrender",
        "redshiftproxy"
    ]

    hosts = ["houdini"]
@@ -54,6 +55,8 @@ class CollectOutputSOPPath(pyblish.api.InstancePlugin):
        else:
            out_node = node.parm("loppath").evalAsNode()

        elif node_type == "Redshift_Proxy_Output":
            out_node = node.parm("RS_archive_sopPath").evalAsNode()
        else:
            raise ValueError(
                "ROP node type '%s' is" " not supported." % node_type
@@ -0,0 +1,48 @@
import os

import pyblish.api
import openpype.api
from openpype.hosts.houdini.api.lib import render_rop


class ExtractRedshiftProxy(openpype.api.Extractor):

    order = pyblish.api.ExtractorOrder + 0.1
    label = "Extract Redshift Proxy"
    families = ["redshiftproxy"]
    hosts = ["houdini"]

    def process(self, instance):

        ropnode = instance[0]

        # Get the filename from the filename parameter
        # `.evalParm(parameter)` will make sure all tokens are resolved
        output = ropnode.evalParm("RS_archive_file")
        staging_dir = os.path.normpath(os.path.dirname(output))
        instance.data["stagingDir"] = staging_dir
        file_name = os.path.basename(output)

        self.log.info("Writing Redshift Proxy '%s' to '%s'" % (file_name,
                                                               staging_dir))

        render_rop(ropnode)

        output = instance.data["frames"]

        if "representations" not in instance.data:
            instance.data["representations"] = []

        representation = {
            "name": "rs",
            "ext": "rs",
            "files": output,
            "stagingDir": staging_dir,
        }

        # A single frame may also be rendered without start/end frame.
        if "frameStart" in instance.data and "frameEnd" in instance.data:
            representation["frameStart"] = instance.data["frameStart"]
            representation["frameEnd"] = instance.data["frameEnd"]

        instance.data["representations"].append(representation)
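The extractor above appends a representation dict to `instance.data`, adding the frame-range keys only when both are present so single-frame renders still publish. A small standalone helper capturing that shape (`build_representation` is a hypothetical name distilled from the diff, not an OpenPype API):

```python
import os


def build_representation(output_path, files, frame_start=None, frame_end=None):
    staging_dir = os.path.normpath(os.path.dirname(output_path))
    representation = {
        "name": "rs",
        "ext": "rs",
        "files": files,
        "stagingDir": staging_dir,
    }
    # A single frame may be rendered without a start/end frame.
    if frame_start is not None and frame_end is not None:
        representation["frameStart"] = frame_start
        representation["frameEnd"] = frame_end
    return representation
```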
@@ -16,7 +16,7 @@ class CreateMultiverseUsd(plugin.Creator):
        self.data.update(lib.collect_animation_data(True))

        self.data["fileFormat"] = ["usd", "usda", "usdz"]
        self.data["stripNamespaces"] = False
        self.data["stripNamespaces"] = True
        self.data["mergeTransformAndShape"] = False
        self.data["writeAncestors"] = True
        self.data["flattenParentXforms"] = False

@@ -37,15 +37,15 @@ class CreateMultiverseUsd(plugin.Creator):
        self.data["writeUVs"] = True
        self.data["writeColorSets"] = False
        self.data["writeTangents"] = False
        self.data["writeRefPositions"] = False
        self.data["writeRefPositions"] = True
        self.data["writeBlendShapes"] = False
        self.data["writeDisplayColor"] = False
        self.data["writeDisplayColor"] = True
        self.data["writeSkinWeights"] = False
        self.data["writeMaterialAssignment"] = False
        self.data["writeHardwareShader"] = False
        self.data["writeShadingNetworks"] = False
        self.data["writeTransformMatrix"] = True
        self.data["writeUsdAttributes"] = False
        self.data["writeUsdAttributes"] = True
        self.data["writeInstancesAsReferences"] = False
        self.data["timeVaryingTopology"] = False
        self.data["customMaterialNamespace"] = ''
@@ -28,18 +28,19 @@ def get_all_children(nodes):
        dag = sel.getDagPath(0)

        iterator.reset(dag)
        next(iterator)  # ignore self
        # ignore self
        iterator.next()  # noqa: B305
        while not iterator.isDone():

            path = iterator.fullPathName()

            if path in traversed:
                iterator.prune()
                next(iterator)
                iterator.next()  # noqa: B305
                continue

            traversed.add(path)
            next(iterator)
            iterator.next()  # noqa: B305

    return list(traversed)
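The Maya change above replaces the builtin `next(iterator)` with the iterator's own `.next()` method: Maya's `MItDag` exposes a `next()` method but does not implement Python's iterator protocol (`__next__`), so the builtin fails on it (hence the `# noqa: B305` markers). A minimal class reproducing that behaviour — illustrative only, not Maya code:

```python
class DagIterator:
    """Has a next() method but is NOT a Python iterator (no __next__)."""

    def __init__(self, paths):
        self._paths = list(paths)
        self._index = 0

    def isDone(self):
        return self._index >= len(self._paths)

    def fullPathName(self):
        return self._paths[self._index]

    def next(self):  # noqa: B305 - mirrors the Maya API spelling
        self._index += 1


it = DagIterator(["|root", "|root|child"])

builtin_next_failed = False
try:
    next(it)  # builtin next() requires __next__, which this class lacks
except TypeError:
    builtin_next_failed = True

# The method-call form walks the hierarchy as intended.
names = []
while not it.isDone():
    names.append(it.fullPathName())
    it.next()
```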
@@ -111,7 +111,8 @@ class ExtractPlayblast(openpype.api.Extractor):
        self.log.debug("playblast path {}".format(path))

        collected_files = os.listdir(stagingdir)
        collections, remainder = clique.assemble(collected_files)
        collections, remainder = clique.assemble(collected_files,
                                                 minimum_items=1)

        self.log.debug("filename {}".format(filename))
        frame_collection = None
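`clique.assemble` groups file names into frame sequences and, by default, requires at least two members per collection, so a single-frame playblast would land in `remainder` and be missed; passing `minimum_items=1`, as the diff does, keeps single frames as collections. The grouping idea can be sketched in plain Python (heavily simplified; real clique handles padding detection, multiple patterns, etc. — this `assemble` is only an illustrative analogue):

```python
import re
from collections import defaultdict


def assemble(file_names, minimum_items=2):
    """Group names like 'shot.0001.png' by (head, tail); return
    (collections, remainder) similar in spirit to clique.assemble."""
    groups = defaultdict(list)
    remainder = []
    pattern = re.compile(r"^(.*?)(\d+)(\.\w+)$")
    for name in file_names:
        match = pattern.match(name)
        if match:
            head, frame, tail = match.groups()
            groups[(head, tail)].append(int(frame))
        else:
            remainder.append(name)
    collections = []
    for (head, tail), frames in groups.items():
        if len(frames) >= minimum_items:
            collections.append((head, tail, sorted(frames)))
        else:
            # Too few members: fall back to treating them as loose files.
            remainder.extend(
                "%s%04d%s" % (head, frame, tail) for frame in frames)
    return collections, remainder
```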
@@ -134,10 +135,15 @@ class ExtractPlayblast(openpype.api.Extractor):
        # Add camera node name to representation data
        camera_node_name = pm.ls(camera)[0].getTransform().name()

        collected_files = list(frame_collection)
        # single frame file shouldn't be in list, only as a string
        if len(collected_files) == 1:
            collected_files = collected_files[0]

        representation = {
            'name': 'png',
            'ext': 'png',
            'files': list(frame_collection),
            'files': collected_files,
            "stagingDir": stagingdir,
            "frameStart": start,
            "frameEnd": end,
@@ -65,48 +65,46 @@ class ExtractThumbnail(openpype.api.Extractor):
        temporary_nodes = []

        # try to connect already rendered images
        if self.use_rendered:
            collection = instance.data.get("collection", None)
            self.log.debug("__ collection: `{}`".format(collection))
        previous_node = node
        collection = instance.data.get("collection", None)
        self.log.debug("__ collection: `{}`".format(collection))

            if collection:
                # get path
                fname = os.path.basename(collection.format(
                    "{head}{padding}{tail}"))
                fhead = collection.format("{head}")
        if collection:
            # get path
            fname = os.path.basename(collection.format(
                "{head}{padding}{tail}"))
            fhead = collection.format("{head}")

                thumb_fname = list(collection)[mid_frame]
            else:
                fname = thumb_fname = os.path.basename(
                    instance.data.get("path", None))
                fhead = os.path.splitext(fname)[0] + "."
            thumb_fname = list(collection)[mid_frame]
        else:
            fname = thumb_fname = os.path.basename(
                instance.data.get("path", None))
            fhead = os.path.splitext(fname)[0] + "."

            self.log.debug("__ fhead: `{}`".format(fhead))
        self.log.debug("__ fhead: `{}`".format(fhead))

            if "#" in fhead:
                fhead = fhead.replace("#", "")[:-1]
        if "#" in fhead:
            fhead = fhead.replace("#", "")[:-1]

            path_render = os.path.join(
                staging_dir, thumb_fname).replace("\\", "/")
            self.log.debug("__ path_render: `{}`".format(path_render))
        path_render = os.path.join(
            staging_dir, thumb_fname).replace("\\", "/")
        self.log.debug("__ path_render: `{}`".format(path_render))

        if self.use_rendered and os.path.isfile(path_render):
            # check if file exist otherwise connect to write node
            if os.path.isfile(path_render):
                rnode = nuke.createNode("Read")
            rnode = nuke.createNode("Read")

                rnode["file"].setValue(path_render)
            rnode["file"].setValue(path_render)

                # turn it raw if none of baking is ON
                if all([
                    not self.bake_viewer_input_process,
                    not self.bake_viewer_process
                ]):
                    rnode["raw"].setValue(True)
            # turn it raw if none of baking is ON
            if all([
                not self.bake_viewer_input_process,
                not self.bake_viewer_process
            ]):
                rnode["raw"].setValue(True)

                temporary_nodes.append(rnode)
                previous_node = rnode
            else:
                previous_node = node
            temporary_nodes.append(rnode)
            previous_node = rnode

        # bake viewer input look node into thumbnail image
        if self.bake_viewer_input_process:
@@ -72,12 +72,12 @@ class CollectNukeWrites(pyblish.api.InstancePlugin):
            if "representations" not in instance.data:
                instance.data["representations"] = list()

                representation = {
                    'name': ext,
                    'ext': ext,
                    "stagingDir": output_dir,
                    "tags": list()
                }
            representation = {
                'name': ext,
                'ext': ext,
                "stagingDir": output_dir,
                "tags": list()
            }

            try:
                collected_frames = [f for f in os.listdir(output_dir)
@@ -1,4 +1,5 @@
import openpype.hosts.photoshop.api as api
from openpype.client import get_asset_by_name
from openpype.pipeline import (
    AutoCreator,
    CreatedInstance,

@@ -40,10 +41,7 @@ class PSWorkfileCreator(AutoCreator):
        task_name = legacy_io.Session["AVALON_TASK"]
        host_name = legacy_io.Session["AVALON_APP"]
        if existing_instance is None:
            asset_doc = legacy_io.find_one({
                "type": "asset",
                "name": asset_name
            })
            asset_doc = get_asset_by_name(project_name, asset_name)
            subset_name = self.get_subset_name(
                variant, task_name, asset_doc, project_name, host_name
            )

@@ -67,10 +65,7 @@ class PSWorkfileCreator(AutoCreator):
            existing_instance["asset"] != asset_name
            or existing_instance["task"] != task_name
        ):
            asset_doc = legacy_io.find_one({
                "type": "asset",
                "name": asset_name
            })
            asset_doc = get_asset_by_name(project_name, asset_name)
            subset_name = self.get_subset_name(
                variant, task_name, asset_doc, project_name, host_name
            )
@ -3,7 +3,7 @@ import json
|
|||
import pyblish.api
|
||||
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype.client import get_asset_by_name
|
||||
|
||||
|
||||
class CollectBulkMovInstances(pyblish.api.InstancePlugin):
|
||||
|
|
@ -24,12 +24,9 @@ class CollectBulkMovInstances(pyblish.api.InstancePlugin):
|
|||
|
||||
def process(self, instance):
|
||||
context = instance.context
|
||||
project_name = context.data["projectEntity"]["name"]
|
||||
asset_name = instance.data["asset"]
|
||||
|
||||
asset_doc = legacy_io.find_one({
|
||||
"type": "asset",
|
||||
"name": asset_name
|
||||
})
|
||||
asset_doc = get_asset_by_name(project_name, asset_name)
|
||||
if not asset_doc:
|
||||
raise AssertionError((
|
||||
"Couldn't find Asset document with name \"{}\""
|
||||
|
|
@ -52,7 +49,7 @@ class CollectBulkMovInstances(pyblish.api.InstancePlugin):
|
|||
self.subset_name_variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
legacy_io.Session["AVALON_PROJECT"]
|
||||
project_name
|
||||
)
|
||||
instance_name = f"{asset_name}_{subset_name}"
|
||||
|
||||
|
|
|
@@ -3,7 +3,7 @@ import re
 from copy import deepcopy
 import pyblish.api

-from openpype.pipeline import legacy_io
+from openpype.client import get_asset_by_id


 class CollectHierarchyInstance(pyblish.api.ContextPlugin):

@@ -61,27 +61,32 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
             **instance.data["anatomyData"])

     def create_hierarchy(self, instance):
-        parents = list()
-        hierarchy = list()
-        visual_hierarchy = [instance.context.data["assetEntity"]]
+        asset_doc = instance.context.data["assetEntity"]
+        project_doc = instance.context.data["projectEntity"]
+        project_name = project_doc["name"]
+        visual_hierarchy = [asset_doc]
+        current_doc = asset_doc
         while True:
-            visual_parent = legacy_io.find_one(
-                {"_id": visual_hierarchy[-1]["data"]["visualParent"]}
-            )
-            if visual_parent:
-                visual_hierarchy.append(visual_parent)
-            else:
-                visual_hierarchy.append(
-                    instance.context.data["projectEntity"])
+            visual_parent_id = current_doc["data"]["visualParent"]
+            visual_parent = None
+            if visual_parent_id:
+                visual_parent = get_asset_by_id(project_name, visual_parent_id)
+
+            if not visual_parent:
+                visual_hierarchy.append(project_doc)
                 break
+            visual_hierarchy.append(visual_parent)
+            current_doc = visual_parent

         # add current selection context hierarchy from standalonepublisher
+        parents = list()
         for entity in reversed(visual_hierarchy):
             parents.append({
                 "entity_type": entity["data"]["entityType"],
                 "entity_name": entity["name"]
             })

+        hierarchy = list()
         if self.shot_add_hierarchy:
             parent_template_patern = re.compile(r"\{([a-z]*?)\}")
             # fill the parents parts from presets

@@ -129,9 +134,8 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
         self.log.debug(f"Hierarchy: {hierarchy}")
         self.log.debug(f"parents: {parents}")

+        tasks_to_add = dict()
         if self.shot_add_tasks:
-            tasks_to_add = dict()
-            project_doc = legacy_io.find_one({"type": "project"})
             project_tasks = project_doc["config"]["tasks"]
             for task_name, task_data in self.shot_add_tasks.items():
                 _task_data = deepcopy(task_data)

@@ -150,9 +154,7 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
                         task_name,
                         list(project_tasks.keys())))

-            instance.data["tasks"] = tasks_to_add
-        else:
-            instance.data["tasks"] = dict()
+        instance.data["tasks"] = tasks_to_add

         # updating hierarchy data
         instance.data["anatomyData"].update({

@@ -4,7 +4,7 @@ import collections
 import pyblish.api
 from pprint import pformat

-from openpype.pipeline import legacy_io
+from openpype.client import get_assets


 class CollectMatchingAssetToInstance(pyblish.api.InstancePlugin):

@@ -119,8 +119,9 @@ class CollectMatchingAssetToInstance(pyblish.api.InstancePlugin):
     def _asset_docs_by_parent_id(self, instance):
         # Query all assets for project and store them by parent's id to list
+        project_name = instance.context.data["projectEntity"]["name"]
         asset_docs_by_parent_id = collections.defaultdict(list)
-        for asset_doc in legacy_io.find({"type": "asset"}):
+        for asset_doc in get_assets(project_name):
             parent_id = asset_doc["data"]["visualParent"]
             asset_docs_by_parent_id[parent_id].append(asset_doc)
         return asset_docs_by_parent_id

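The `_asset_docs_by_parent_id` change above keeps the same grouping logic and only swaps the query source. The grouping itself can be sketched independently of any database (sample documents below are illustrative, not real project data):

```python
import collections


def group_assets_by_parent(asset_docs):
    """Group asset documents by their 'visualParent' id (None for roots)."""
    asset_docs_by_parent_id = collections.defaultdict(list)
    for asset_doc in asset_docs:
        parent_id = asset_doc["data"]["visualParent"]
        asset_docs_by_parent_id[parent_id].append(asset_doc)
    return asset_docs_by_parent_id


docs = [
    {"name": "sq01", "data": {"visualParent": None}},
    {"name": "sh010", "data": {"visualParent": "sq01_id"}},
    {"name": "sh020", "data": {"visualParent": "sq01_id"}},
]
grouped = group_assets_by_parent(docs)
print(sorted(doc["name"] for doc in grouped["sq01_id"]))  # ['sh010', 'sh020']
```

Using `defaultdict(list)` avoids the "check key, create list, append" dance for every parent id.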
@@ -1,9 +1,7 @@
 import pyblish.api

-from openpype.pipeline import (
-    PublishXmlValidationError,
-    legacy_io,
-)
+from openpype.client import get_assets
+from openpype.pipeline import PublishXmlValidationError


 class ValidateTaskExistence(pyblish.api.ContextPlugin):

@@ -20,15 +18,11 @@ class ValidateTaskExistence(pyblish.api.ContextPlugin):
         for instance in context:
             asset_names.add(instance.data["asset"])

-        asset_docs = legacy_io.find(
-            {
-                "type": "asset",
-                "name": {"$in": list(asset_names)}
-            },
-            {
-                "name": 1,
-                "data.tasks": 1
-            }
+        project_name = context.data["projectEntity"]["name"]
+        asset_docs = get_assets(
+            project_name,
+            asset_names=asset_names,
+            fields=["name", "data.tasks"]
         )
         tasks_by_asset_names = {}
         for asset_doc in asset_docs:

@@ -13,9 +13,13 @@ import tempfile
 import math

 import pyblish.api

+from openpype.client import (
+    get_asset_by_name,
+    get_last_version_by_subset_name
+)
 from openpype.lib import (
     prepare_template_data,
-    get_asset,
     get_ffprobe_streams,
     convert_ffprobe_fps_value,
 )

@@ -23,7 +27,6 @@ from openpype.lib.plugin_tools import (
     parse_json,
     get_subset_name_with_asset_doc
 )
-from openpype.pipeline import legacy_io


 class CollectPublishedFiles(pyblish.api.ContextPlugin):

@@ -56,8 +59,9 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
         self.log.info("task_sub:: {}".format(task_subfolders))

+        project_name = context.data["project_name"]
         asset_name = context.data["asset"]
-        asset_doc = get_asset()
+        asset_doc = get_asset_by_name(project_name, asset_name)
         task_name = context.data["task"]
         task_type = context.data["taskType"]
-        project_name = context.data["project_name"]

@@ -80,7 +84,9 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
             family, variant, task_name, asset_doc,
             project_name=project_name, host_name="webpublisher"
         )
-        version = self._get_last_version(asset_name, subset_name) + 1
+        version = self._get_next_version(
+            project_name, asset_doc, subset_name
+        )

         instance = context.create_instance(subset_name)
         instance.data["asset"] = asset_name

@@ -219,55 +225,19 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
             config["families"],
             config["tags"])

-    def _get_last_version(self, asset_name, subset_name):
-        """Returns version number or 0 for 'asset' and 'subset'"""
-        query = [
-            {
-                "$match": {"type": "asset", "name": asset_name}
-            },
-            {
-                "$lookup":
-                    {
-                        "from": os.environ["AVALON_PROJECT"],
-                        "localField": "_id",
-                        "foreignField": "parent",
-                        "as": "subsets"
-                    }
-            },
-            {
-                "$unwind": "$subsets"
-            },
-            {
-                "$match": {"subsets.type": "subset",
-                           "subsets.name": subset_name}},
-            {
-                "$lookup":
-                    {
-                        "from": os.environ["AVALON_PROJECT"],
-                        "localField": "subsets._id",
-                        "foreignField": "parent",
-                        "as": "versions"
-                    }
-            },
-            {
-                "$unwind": "$versions"
-            },
-            {
-                "$group": {
-                    "_id": {
-                        "asset_name": "$name",
-                        "subset_name": "$subsets.name"
-                    },
-                    'version': {'$max': "$versions.name"}
-                }
-            }
-        ]
-        version = list(legacy_io.aggregate(query))
+    def _get_next_version(self, project_name, asset_doc, subset_name):
+        """Returns version number or 1 for 'asset' and 'subset'"""

-        if version:
-            return version[0].get("version") or 0
-        else:
-            return 0
+        version_doc = get_last_version_by_subset_name(
+            project_name,
+            subset_name,
+            asset_doc["_id"],
+            fields=["name"]
+        )
+        version = 1
+        if version_doc:
+            version += int(version_doc["name"])
+        return version

     def _get_number_of_frames(self, file_url):
         """Return duration in frames"""

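`_get_next_version` above collapses the old multi-stage Mongo aggregation into a single query for the last version document plus a small increment. The increment logic alone, as a runnable sketch (the argument mimics the `{"name": ...}` document returned by `get_last_version_by_subset_name`):

```python
def get_next_version(last_version_doc):
    """Next version: 1 when nothing was published yet, else last + 1.

    'last_version_doc' mimics the document returned by
    'get_last_version_by_subset_name' queried with fields=["name"].
    """
    version = 1
    if last_version_doc:
        version += int(last_version_doc["name"])
    return version


print(get_next_version(None))           # 1
print(get_next_version({"name": "4"}))  # 5
```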
@@ -2,11 +2,15 @@
 import os
 import json
 import datetime
-from bson.objectid import ObjectId
 import collections
-from aiohttp.web_response import Response
 import subprocess
+from bson.objectid import ObjectId
+from aiohttp.web_response import Response

+from openpype.client import (
+    get_projects,
+    get_assets,
+)
 from openpype.lib import (
     OpenPypeMongoConnection,
     PypeLogger,

@@ -16,30 +20,29 @@ from openpype.lib.remote_publish import (
     ERROR_STATUS,
     REPROCESS_STATUS
 )
-from openpype.pipeline import AvalonMongoDB
-from openpype_modules.avalon_apps.rest_api import _RestApiEndpoint
+from openpype.settings import get_project_settings
 from openpype_modules.webserver.base_routes import RestApiEndpoint

-log = PypeLogger.get_logger("WebpublishRoutes")
+log = PypeLogger.get_logger("WebServer")


+class ResourceRestApiEndpoint(RestApiEndpoint):
+    def __init__(self, resource):
+        self.resource = resource
+        super(ResourceRestApiEndpoint, self).__init__()


-class RestApiResource:
-    """Resource carrying needed info and Avalon DB connection for publish."""
-    def __init__(self, server_manager, executable, upload_dir,
-                 studio_task_queue=None):
-        self.server_manager = server_manager
-        self.upload_dir = upload_dir
-        self.executable = executable
+class WebpublishApiEndpoint(ResourceRestApiEndpoint):
+    @property
+    def dbcon(self):
+        return self.resource.dbcon

-        if studio_task_queue is None:
-            studio_task_queue = collections.deque().dequeu
-        self.studio_task_queue = studio_task_queue

-        self.dbcon = AvalonMongoDB()
-        self.dbcon.install()
+class JsonApiResource:
+    """Resource for json manipulation.
+
+    All resources handling sending output to REST should inherit from
+    """
     @staticmethod
     def json_dump_handler(value):
         if isinstance(value, datetime.datetime):

@@ -59,19 +62,33 @@ class RestApiResource:
         ).encode("utf-8")


-class OpenPypeRestApiResource(RestApiResource):
+class RestApiResource(JsonApiResource):
+    """Resource carrying needed info and Avalon DB connection for publish."""
+    def __init__(self, server_manager, executable, upload_dir,
+                 studio_task_queue=None):
+        self.server_manager = server_manager
+        self.upload_dir = upload_dir
+        self.executable = executable
+
+        if studio_task_queue is None:
+            studio_task_queue = collections.deque().dequeu
+        self.studio_task_queue = studio_task_queue
+
+
+class WebpublishRestApiResource(JsonApiResource):
     """Resource carrying OP DB connection for storing batch info into DB."""
-    def __init__(self, ):
+
+    def __init__(self):
         mongo_client = OpenPypeMongoConnection.get_mongo_client()
         database_name = os.environ["OPENPYPE_DATABASE_NAME"]
         self.dbcon = mongo_client[database_name]["webpublishes"]


-class ProjectsEndpoint(_RestApiEndpoint):
+class ProjectsEndpoint(ResourceRestApiEndpoint):
     """Returns list of dict with project info (id, name)."""
     async def get(self) -> Response:
         output = []
-        for project_doc in self.dbcon.projects():
+        for project_doc in get_projects():
             ret_val = {
                 "id": project_doc["_id"],
                 "name": project_doc["name"]

@@ -84,7 +101,7 @@ class ProjectsEndpoint(_RestApiEndpoint):
         )


-class HiearchyEndpoint(_RestApiEndpoint):
+class HiearchyEndpoint(ResourceRestApiEndpoint):
     """Returns dictionary with context tree from assets."""
     async def get(self, project_name) -> Response:
         query_projection = {

@@ -96,10 +113,7 @@ class HiearchyEndpoint(_RestApiEndpoint):
             "type": 1,
         }

-        asset_docs = self.dbcon.database[project_name].find(
-            {"type": "asset"},
-            query_projection
-        )
+        asset_docs = get_assets(project_name, fields=query_projection.keys())
         asset_docs_by_id = {
             asset_doc["_id"]: asset_doc
             for asset_doc in asset_docs

@@ -183,7 +197,7 @@ class TaskNode(Node):
         self["attributes"] = {}


-class BatchPublishEndpoint(_RestApiEndpoint):
+class BatchPublishEndpoint(WebpublishApiEndpoint):
     """Triggers headless publishing of batch."""
     async def post(self, request) -> Response:
         # Validate existence of openpype executable

@@ -288,7 +302,7 @@ class BatchPublishEndpoint(_RestApiEndpoint):
         )


-class TaskPublishEndpoint(_RestApiEndpoint):
+class TaskPublishEndpoint(WebpublishApiEndpoint):
     """Prepared endpoint triggered after each task - for future development."""
     async def post(self, request) -> Response:
         return Response(

@@ -298,8 +312,12 @@ class TaskPublishEndpoint(_RestApiEndpoint):
         )


-class BatchStatusEndpoint(_RestApiEndpoint):
-    """Returns dict with info for batch_id."""
+class BatchStatusEndpoint(WebpublishApiEndpoint):
+    """Returns dict with info for batch_id.
+
+    Uses 'WebpublishRestApiResource'.
+    """
+
     async def get(self, batch_id) -> Response:
         output = self.dbcon.find_one({"batch_id": batch_id})

@@ -318,8 +336,12 @@ class BatchStatusEndpoint(_RestApiEndpoint):
         )


-class UserReportEndpoint(_RestApiEndpoint):
-    """Returns list of dict with batch info for user (email address)."""
+class UserReportEndpoint(WebpublishApiEndpoint):
+    """Returns list of dict with batch info for user (email address).
+
+    Uses 'WebpublishRestApiResource'.
+    """
+
     async def get(self, user) -> Response:
         output = list(self.dbcon.find({"user": user},
                       projection={"log": False}))

@@ -338,7 +360,7 @@ class UserReportEndpoint(_RestApiEndpoint):
         )


-class ConfiguredExtensionsEndpoint(_RestApiEndpoint):
+class ConfiguredExtensionsEndpoint(WebpublishApiEndpoint):
     """Returns dict of extensions which have mapping to family.

     Returns:

@@ -378,8 +400,12 @@ class ConfiguredExtensionsEndpoint(_RestApiEndpoint):
         )


-class BatchReprocessEndpoint(_RestApiEndpoint):
-    """Marks latest 'batch_id' for reprocessing, returns 404 if not found."""
+class BatchReprocessEndpoint(WebpublishApiEndpoint):
+    """Marks latest 'batch_id' for reprocessing, returns 404 if not found.
+
+    Uses 'WebpublishRestApiResource'.
+    """
+
     async def post(self, batch_id) -> Response:
         batches = self.dbcon.find({"batch_id": batch_id,
                                    "status": ERROR_STATUS}).sort("_id", -1)

@@ -10,7 +10,7 @@ from openpype.lib import PypeLogger
 from .webpublish_routes import (
     RestApiResource,
-    OpenPypeRestApiResource,
+    WebpublishRestApiResource,
     HiearchyEndpoint,
     ProjectsEndpoint,
     ConfiguredExtensionsEndpoint,

@@ -27,7 +27,7 @@ from openpype.lib.remote_publish import (
 )


-log = PypeLogger().get_logger("webserver_gui")
+log = PypeLogger.get_logger("webserver_gui")


 def run_webserver(*args, **kwargs):

@@ -69,16 +69,14 @@ def run_webserver(*args, **kwargs):
     )

     # triggers publish
-    webpublisher_task_publish_endpoint = \
-        BatchPublishEndpoint(resource)
+    webpublisher_task_publish_endpoint = BatchPublishEndpoint(resource)
     server_manager.add_route(
         "POST",
         "/api/webpublish/batch",
         webpublisher_task_publish_endpoint.dispatch
     )

-    webpublisher_batch_publish_endpoint = \
-        TaskPublishEndpoint(resource)
+    webpublisher_batch_publish_endpoint = TaskPublishEndpoint(resource)
     server_manager.add_route(
         "POST",
         "/api/webpublish/task",

@@ -86,27 +84,26 @@ def run_webserver(*args, **kwargs):
     )

     # reporting
-    openpype_resource = OpenPypeRestApiResource()
-    batch_status_endpoint = BatchStatusEndpoint(openpype_resource)
+    webpublish_resource = WebpublishRestApiResource()
+    batch_status_endpoint = BatchStatusEndpoint(webpublish_resource)
     server_manager.add_route(
         "GET",
         "/api/batch_status/{batch_id}",
         batch_status_endpoint.dispatch
     )

-    user_status_endpoint = UserReportEndpoint(openpype_resource)
+    user_status_endpoint = UserReportEndpoint(webpublish_resource)
     server_manager.add_route(
         "GET",
         "/api/publishes/{user}",
         user_status_endpoint.dispatch
     )

-    webpublisher_batch_reprocess_endpoint = \
-        BatchReprocessEndpoint(openpype_resource)
+    batch_reprocess_endpoint = BatchReprocessEndpoint(webpublish_resource)
     server_manager.add_route(
         "POST",
         "/api/webpublish/reprocess/{batch_id}",
-        webpublisher_batch_reprocess_endpoint.dispatch
+        batch_reprocess_endpoint.dispatch
     )

     server_manager.start_server()

@@ -7,7 +7,6 @@ import platform
 import logging
 import collections
 import functools
-import getpass

 from bson.objectid import ObjectId

@@ -19,6 +18,7 @@ from .anatomy import Anatomy
 from .profiles_filtering import filter_profiles
 from .events import emit_event
 from .path_templates import StringTemplate
+from .local_settings import get_openpype_username

 legacy_io = None

@@ -550,7 +550,7 @@ def get_workdir_data(project_doc, asset_doc, task_name, host_name):
         "asset": asset_doc["name"],
         "parent": parent_name,
         "app": host_name,
-        "user": getpass.getuser(),
+        "user": get_openpype_username(),
         "hierarchy": hierarchy,
     }

@@ -797,8 +797,14 @@ def update_current_task(task=None, asset=None, app=None, template_key=None):
         else:
             os.environ[key] = value

+    data = changes.copy()
+    # Convert env keys to human readable keys
+    data["project_name"] = legacy_io.Session["AVALON_PROJECT"]
+    data["asset_name"] = legacy_io.Session["AVALON_ASSET"]
+    data["task_name"] = legacy_io.Session["AVALON_TASK"]
+
     # Emit session change
-    emit_event("taskChanged", changes.copy())
+    emit_event("taskChanged", data)

     return changes

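`update_current_task` above now enriches the event payload with human readable keys before emitting `taskChanged`. A sketch of that payload construction, with a plain dict standing in for `legacy_io.Session` and the raw env-style changes:

```python
def build_task_changed_payload(changes, session):
    """Copy the raw env-style changes and add human readable keys,
    mirroring what 'update_current_task' does before emit_event."""
    data = changes.copy()
    # Convert env keys to human readable keys
    data["project_name"] = session["AVALON_PROJECT"]
    data["asset_name"] = session["AVALON_ASSET"]
    data["task_name"] = session["AVALON_TASK"]
    return data


session = {
    "AVALON_PROJECT": "demo_project",
    "AVALON_ASSET": "sh010",
    "AVALON_TASK": "compositing",
}
payload = build_task_changed_payload({"AVALON_TASK": "compositing"}, session)
print(payload["task_name"])  # compositing
```

Event subscribers can then read `project_name`/`asset_name`/`task_name` without knowing the `AVALON_*` env key names.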
@@ -463,6 +463,25 @@ class OpenPypeModule:
         pass

+    def on_host_install(self, host, host_name, project_name):
+        """Host was installed which gives option to handle in-host logic.
+
+        It is a good option to register in-host event callbacks which are
+        specific for the module. The module is kept in memory for rest of
+        the process.
+
+        Arguments may change in future. E.g. 'host_name' should be possible
+        to receive from 'host' object.
+
+        Args:
+            host (ModuleType): Access to installed/registered host object.
+            host_name (str): Name of host.
+            project_name (str): Project name which is main part of host
+                context.
+        """
+
+        pass
+
     def cli(self, module_click_group):
         """Add commands to click group.

@@ -33,6 +33,7 @@ class AfterEffectsSubmitDeadline(
     hosts = ["aftereffects"]
     families = ["render.farm"]  # cannot be "render' as that is integrated
     use_published = True
+    targets = ["local"]

     priority = 50
     chunk_size = 1000000

@@ -238,6 +238,7 @@ class HarmonySubmitDeadline(
     order = pyblish.api.IntegratorOrder + 0.1
     hosts = ["harmony"]
     families = ["render.farm"]
+    targets = ["local"]

     optional = True
     use_published = False

@@ -321,7 +322,9 @@ class HarmonySubmitDeadline(
         )
         unzip_dir = (published_scene.parent / published_scene.stem)
         with _ZipFile(published_scene, "r") as zip_ref:
-            zip_ref.extractall(unzip_dir.as_posix())
+            # UNC path (//?/) added to minimalize risk with extracting
+            # to large file paths
+            zip_ref.extractall("//?/" + str(unzip_dir.as_posix()))

         # find any xstage files in directory, prefer the one with the same name
         # as directory (plus extension)

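The Harmony change above prefixes the extraction target with `//?/`, a forward-slash variant of the Windows extended-length path prefix (canonically written `\\?\`), to reduce the risk of hitting path-length limits when unzipping deep scene folders. A sketch of the path transformation alone (the helper name is illustrative; the real effect only matters on Windows):

```python
def to_extended_length_path(path):
    """Prepend the extended-length prefix the way the committed code does.

    Note: the commit uses forward slashes ("//?/"); the canonical Windows
    form uses backslashes. This mirrors the committed behavior.
    """
    return "//?/" + str(path)


print(to_extended_length_path("C:/work/scene_folder"))
```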
@@ -287,6 +287,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
     order = pyblish.api.IntegratorOrder + 0.1
     hosts = ["maya"]
     families = ["renderlayer"]
+    targets = ["local"]

     use_published = True
     tile_assembler_plugin = "OpenPypeTileAssembler"

@@ -10,7 +10,7 @@ import openpype.api
 import pyblish.api


-class MayaSubmitRemotePublishDeadline(openpype.api.Integrator):
+class MayaSubmitRemotePublishDeadline(pyblish.api.InstancePlugin):
     """Submit Maya scene to perform a local publish in Deadline.

     Publishing in Deadline can be helpful for scenes that publish very slow.

@@ -31,6 +31,7 @@ class MayaSubmitRemotePublishDeadline(openpype.api.Integrator):
     order = pyblish.api.IntegratorOrder
     hosts = ["maya"]
     families = ["publish.farm"]
+    targets = ["local"]

     def process(self, instance):
         settings = get_project_settings(os.getenv("AVALON_PROJECT"))

@@ -23,6 +23,7 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
     hosts = ["nuke", "nukestudio"]
     families = ["render.farm", "prerender.farm"]
     optional = True
+    targets = ["local"]

     # presets
     priority = 50

@@ -54,8 +55,8 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
         self._ver = re.search(r"\d+\.\d+", context.data.get("hostVersion"))
         self._deadline_user = context.data.get(
             "deadlineUser", getpass.getuser())
-        self._frame_start = int(instance.data["frameStartHandle"])
-        self._frame_end = int(instance.data["frameEndHandle"])
+        submit_frame_start = int(instance.data["frameStartHandle"])
+        submit_frame_end = int(instance.data["frameEndHandle"])

         # get output path
         render_path = instance.data['path']

@@ -81,13 +82,16 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
         # exception for slate workflow
         if "slate" in instance.data["families"]:
-            self._frame_start -= 1
+            submit_frame_start -= 1

-        response = self.payload_submit(instance,
-                                       script_path,
-                                       render_path,
-                                       node.name()
-                                       )
+        response = self.payload_submit(
+            instance,
+            script_path,
+            render_path,
+            node.name(),
+            submit_frame_start,
+            submit_frame_end
+        )
         # Store output dir for unified publisher (filesequence)
         instance.data["deadlineSubmissionJob"] = response.json()
         instance.data["outputDir"] = os.path.dirname(

@@ -95,20 +99,22 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
         instance.data["publishJobState"] = "Suspended"

         if instance.data.get("bakingNukeScripts"):
+            # exception for slate workflow
+            if "slate" in instance.data["families"]:
+                submit_frame_start += 1
+
             for baking_script in instance.data["bakingNukeScripts"]:
                 render_path = baking_script["bakeRenderPath"]
                 script_path = baking_script["bakeScriptPath"]
                 exe_node_name = baking_script["bakeWriteNodeName"]

-                # exception for slate workflow
-                if "slate" in instance.data["families"]:
-                    self._frame_start += 1
-
                 resp = self.payload_submit(
                     instance,
                     script_path,
                     render_path,
                     exe_node_name,
+                    submit_frame_start,
+                    submit_frame_end,
                     response.json()
                 )

@@ -125,13 +131,16 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
             families.insert(0, "prerender")
         instance.data["families"] = families

-    def payload_submit(self,
-                       instance,
-                       script_path,
-                       render_path,
-                       exe_node_name,
-                       responce_data=None
-                       ):
+    def payload_submit(
+        self,
+        instance,
+        script_path,
+        render_path,
+        exe_node_name,
+        start_frame,
+        end_frame,
+        responce_data=None
+    ):
         render_dir = os.path.normpath(os.path.dirname(render_path))
         script_name = os.path.basename(script_path)
         jobname = "%s - %s" % (script_name, instance.name)

@@ -191,8 +200,8 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
             "Plugin": "Nuke",
             "Frames": "{start}-{end}".format(
-                start=self._frame_start,
-                end=self._frame_end
+                start=start_frame,
+                end=end_frame
             ),
             "Comment": self._comment,

@@ -292,7 +301,13 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
         self.log.info(json.dumps(payload, indent=4, sort_keys=True))

         # adding expectied files to instance.data
-        self.expected_files(instance, render_path)
+        self.expected_files(
+            instance,
+            render_path,
+            start_frame,
+            end_frame
+        )

         self.log.debug("__ expectedFiles: `{}`".format(
             instance.data["expectedFiles"]))
         response = requests.post(self.deadline_url, json=payload, timeout=10)

@@ -338,9 +353,13 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
         self.log.debug("_ path: `{}`".format(path))
         return path

-    def expected_files(self,
-                       instance,
-                       path):
+    def expected_files(
+        self,
+        instance,
+        path,
+        start_frame,
+        end_frame
+    ):
         """ Create expected files in instance data
         """
         if not instance.data.get("expectedFiles"):

@@ -358,7 +377,7 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
             instance.data["expectedFiles"].append(path)
             return

-        for i in range(self._frame_start, (self._frame_end + 1)):
+        for i in range(start_frame, (end_frame + 1)):
             instance.data["expectedFiles"].append(
                 os.path.join(dir, (file % i)).replace("\\", "/"))

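The reworked `expected_files` above expands a printf-style sequence path over the submitted frame range instead of reading frame bounds from instance state. A self-contained sketch of the expansion (the helper name and signature here are illustrative, not the plugin's exact API):

```python
import os


def expected_files(path, start_frame, end_frame):
    """Expand a printf-style sequence path over an inclusive frame range."""
    dir_name, file_name = os.path.split(path)
    if "%" not in file_name:
        # single file, no sequence expansion
        return [path]
    return [
        os.path.join(dir_name, file_name % frame).replace("\\", "/")
        for frame in range(start_frame, end_frame + 1)
    ]


print(expected_files("renders/beauty.%04d.exr", 1001, 1003))
```

Passing the frame range explicitly (as the diff does with `start_frame`/`end_frame`) keeps slate offsets applied by the caller from silently leaking into later submissions, which was the risk with the old `self._frame_start` attributes.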
@@ -103,6 +103,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
     order = pyblish.api.IntegratorOrder + 0.2
     icon = "tractor"
     deadline_plugin = "OpenPype"
+    targets = ["local"]

     hosts = ["fusion", "maya", "nuke", "celaction", "aftereffects", "harmony"]

@@ -642,9 +643,6 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
     def _solve_families(self, instance, preview=False):
         families = instance.get("families")

-        # test also instance data review attribute
-        preview = preview or instance.get("review")
-
         # if we have one representation with preview tag
         # flag whole instance for review and for ftrack
         if preview:

@@ -724,10 +722,17 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
                 " This may cause issues."
             ).format(source))

-        families = ["render"]
+        family = "render"
+        if "prerender" in instance.data["families"]:
+            family = "prerender"
+        families = [family]
+
+        # pass review to families if marked as review
+        if data.get("review"):
+            families.append("review")

         instance_skeleton_data = {
-            "family": "render",
+            "family": family,
             "subset": subset,
             "families": families,
             "asset": asset,

@@ -749,15 +754,6 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
             "useSequenceForReview": data.get("useSequenceForReview", True)
         }

-        if "prerender" in instance.data["families"]:
-            instance_skeleton_data.update({
-                "family": "prerender",
-                "families": []})
-
-        # also include review attribute if available
-        if "review" in data:
-            instance_skeleton_data["review"] = data["review"]
-
         # skip locking version if we are creating v01
         instance_version = instance.data.get("version")  # take this if exists
         if instance_version != 1:

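The skeleton-data change above resolves the family up front: `prerender` wins over `render`, and `review` is appended when the instance is marked for review, replacing the later patch-up of `instance_skeleton_data`. The selection logic as a runnable sketch (function name is illustrative):

```python
def solve_family(instance_families, review):
    """Pick the skeleton family and families list, as the reworked plugin
    does: 'prerender' wins over 'render'; 'review' appended when marked."""
    family = "render"
    if "prerender" in instance_families:
        family = "prerender"
    families = [family]
    # pass review to families if marked as review
    if review:
        families.append("review")
    return family, families


print(solve_family(["prerender"], review=True))
print(solve_family(["render"], review=False))
```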
@@ -1,8 +1,8 @@
 import json

+from openpype.client import get_project
 from openpype.api import ProjectSettings
 from openpype.lib import create_project
-from openpype.pipeline import AvalonMongoDB
 from openpype.settings import SaveWarningExc

 from openpype_modules.ftrack.lib import (

@@ -363,12 +363,8 @@ class PrepareProjectServer(ServerAction):
         project_name = project_entity["full_name"]

         # Try to find project document
-        dbcon = AvalonMongoDB()
-        dbcon.install()
-        dbcon.Session["AVALON_PROJECT"] = project_name
-        project_doc = dbcon.find_one({
-            "type": "project"
-        })
+        project_doc = get_project(project_name)

         # Create project if is not available
         # - creation is required to be able set project anatomy and attributes
         if not project_doc:

@@ -376,9 +372,7 @@ class PrepareProjectServer(ServerAction):
             self.log.info("Creating project \"{} [{}]\"".format(
                 project_name, project_code
             ))
-            create_project(project_name, project_code, dbcon=dbcon)
-
-        dbcon.uninstall()
+            create_project(project_name, project_code)

         project_settings = ProjectSettings(project_name)
         project_anatomy_settings = project_settings["project_anatomy"]

|||
|
|
@@ -12,6 +12,12 @@ from pymongo import UpdateOne
 import arrow
 import ftrack_api
 
+from openpype.client import (
+    get_project,
+    get_assets,
+    get_archived_assets,
+    get_asset_ids_with_subsets
+)
 from openpype.pipeline import AvalonMongoDB, schema
 
 from openpype_modules.ftrack.lib import (
@@ -149,12 +155,11 @@ class SyncToAvalonEvent(BaseEvent):
     @property
     def avalon_entities(self):
         if self._avalon_ents is None:
+            project_name = self.cur_project["full_name"]
             self.dbcon.install()
-            self.dbcon.Session["AVALON_PROJECT"] = (
-                self.cur_project["full_name"]
-            )
-            avalon_project = self.dbcon.find_one({"type": "project"})
-            avalon_entities = list(self.dbcon.find({"type": "asset"}))
+            self.dbcon.Session["AVALON_PROJECT"] = project_name
+            avalon_project = get_project(project_name)
+            avalon_entities = list(get_assets(project_name))
             self._avalon_ents = (avalon_project, avalon_entities)
         return self._avalon_ents
 
@@ -284,28 +289,21 @@ class SyncToAvalonEvent(BaseEvent):
             self._avalon_ents_by_ftrack_id[ftrack_id] = doc
 
     @property
-    def avalon_subsets_by_parents(self):
-        if self._avalon_subsets_by_parents is None:
-            self._avalon_subsets_by_parents = collections.defaultdict(list)
-            self.dbcon.install()
-            self.dbcon.Session["AVALON_PROJECT"] = (
-                self.cur_project["full_name"]
+    def avalon_asset_ids_with_subsets(self):
+        if self._avalon_asset_ids_with_subsets is None:
+            project_name = self.cur_project["full_name"]
+            self._avalon_asset_ids_with_subsets = get_asset_ids_with_subsets(
+                project_name
             )
-            for subset in self.dbcon.find({"type": "subset"}):
-                self._avalon_subsets_by_parents[subset["parent"]].append(
-                    subset
-                )
-        return self._avalon_subsets_by_parents
+
+        return self._avalon_asset_ids_with_subsets
 
     @property
     def avalon_archived_by_id(self):
         if self._avalon_archived_by_id is None:
             self._avalon_archived_by_id = {}
-            self.dbcon.install()
-            self.dbcon.Session["AVALON_PROJECT"] = (
-                self.cur_project["full_name"]
-            )
-            for asset in self.dbcon.find({"type": "archived_asset"}):
+            project_name = self.cur_project["full_name"]
+            for asset in get_archived_assets(project_name):
                 self._avalon_archived_by_id[asset["_id"]] = asset
         return self._avalon_archived_by_id
 
@@ -327,7 +325,7 @@ class SyncToAvalonEvent(BaseEvent):
         avalon_project, avalon_entities = self.avalon_entities
         self._changeability_by_mongo_id[avalon_project["_id"]] = False
         self._bubble_changeability(
-            list(self.avalon_subsets_by_parents.keys())
+            list(self.avalon_asset_ids_with_subsets)
         )
 
         return self._changeability_by_mongo_id
@@ -449,14 +447,9 @@ class SyncToAvalonEvent(BaseEvent):
             if not entity:
                 # if entity is not found then it is subset without parent
                 if entity_id in unchangeable_ids:
-                    _subset_ids = [
-                        str(sub["_id"]) for sub in
-                        self.avalon_subsets_by_parents[entity_id]
-                    ]
-                    joined_subset_ids = "| ".join(_subset_ids)
                     self.log.warning((
-                        "Parent <{}> for subsets <{}> does not exist"
-                    ).format(str(entity_id), joined_subset_ids))
+                        "Parent <{}> with subsets does not exist"
+                    ).format(str(entity_id)))
                 else:
                     self.log.warning((
                         "In avalon are entities without valid parents that"
@@ -483,7 +476,7 @@ class SyncToAvalonEvent(BaseEvent):
         self._avalon_ents_by_parent_id = None
         self._avalon_ents_by_ftrack_id = None
         self._avalon_ents_by_name = None
-        self._avalon_subsets_by_parents = None
+        self._avalon_asset_ids_with_subsets = None
         self._changeability_by_mongo_id = None
         self._avalon_archived_by_id = None
         self._avalon_archived_by_name = None
 
@@ -1,11 +1,9 @@
 import re
 import subprocess
 
+from openpype.client import get_asset_by_id, get_asset_by_name
 from openpype_modules.ftrack.lib import BaseEvent
 from openpype_modules.ftrack.lib.avalon_sync import CUST_ATTR_ID_KEY
-from openpype.pipeline import AvalonMongoDB
 
 from bson.objectid import ObjectId
 
 from openpype.api import Anatomy, get_project_settings
@@ -36,8 +34,6 @@ class UserAssigmentEvent(BaseEvent):
     3) path to publish files of task user was (de)assigned to
     """
 
-    db_con = AvalonMongoDB()
-
     def error(self, *err):
         for e in err:
             self.log.error(e)
@@ -101,26 +97,16 @@ class UserAssigmentEvent(BaseEvent):
         :rtype: dict
         """
         parent = task['parent']
-        self.db_con.install()
-        self.db_con.Session['AVALON_PROJECT'] = task['project']['full_name']
-
+        project_name = task["project"]["full_name"]
         avalon_entity = None
         parent_id = parent['custom_attributes'].get(CUST_ATTR_ID_KEY)
         if parent_id:
             parent_id = ObjectId(parent_id)
-            avalon_entity = self.db_con.find_one({
-                '_id': parent_id,
-                'type': 'asset'
-            })
+            avalon_entity = get_asset_by_id(project_name, parent_id)
 
         if not avalon_entity:
-            avalon_entity = self.db_con.find_one({
-                'type': 'asset',
-                'name': parent['name']
-            })
+            avalon_entity = get_asset_by_name(project_name, parent["name"])
 
         if not avalon_entity:
-            self.db_con.uninstall()
             msg = 'Entity "{}" not found in avalon database'.format(
                 parent['name']
             )
@@ -129,7 +115,6 @@
                 'success': False,
                 'message': msg
             }
-        self.db_con.uninstall()
         return avalon_entity
 
     def _get_hierarchy(self, asset):
 
@@ -1,5 +1,6 @@
 import os
 
+from openpype.client import get_project
 from openpype_modules.ftrack.lib import BaseAction
 from openpype.lib.applications import (
     ApplicationManager,
@@ -7,7 +8,6 @@ from openpype.lib.applications import (
     ApplictionExecutableNotFound,
     CUSTOM_LAUNCH_APP_GROUPS
 )
-from openpype.pipeline import AvalonMongoDB
 
 
 class AppplicationsAction(BaseAction):
@@ -25,7 +25,6 @@ class AppplicationsAction(BaseAction):
         super(AppplicationsAction, self).__init__(*args, **kwargs)
 
         self.application_manager = ApplicationManager()
-        self.dbcon = AvalonMongoDB()
 
     @property
     def discover_identifier(self):
@@ -110,12 +109,7 @@ class AppplicationsAction(BaseAction):
         if avalon_project_doc is None:
             ft_project = self.get_project_from_entity(entity)
             project_name = ft_project["full_name"]
-            if not self.dbcon.is_installed():
-                self.dbcon.install()
-            self.dbcon.Session["AVALON_PROJECT"] = project_name
-            avalon_project_doc = self.dbcon.find_one({
-                "type": "project"
-            }) or False
+            avalon_project_doc = get_project(project_name) or False
             event["data"]["avalon_project_doc"] = avalon_project_doc
 
         if not avalon_project_doc:
 
@@ -140,9 +140,9 @@ class CustomAttributes(BaseAction):
     identifier = 'create.update.attributes'
     #: Action label.
     label = "OpenPype Admin"
-    variant = '- Create/Update Avalon Attributes'
+    variant = '- Create/Update Custom Attributes'
     #: Action description.
-    description = 'Creates Avalon/Mongo ID for double check'
+    description = 'Creates required custom attributes in ftrack'
     icon = statics_icon("ftrack", "action_icons", "OpenPypeAdmin.svg")
     settings_key = "create_update_attributes"
 
@@ -4,6 +4,7 @@ from datetime import datetime
 
 from bson.objectid import ObjectId
 
+from openpype.client import get_assets, get_subsets
 from openpype.pipeline import AvalonMongoDB
 from openpype_modules.ftrack.lib import BaseAction, statics_icon
 from openpype_modules.ftrack.lib.avalon_sync import create_chunks
@@ -91,10 +92,8 @@ class DeleteAssetSubset(BaseAction):
                 continue
 
             ftrack_id = entity.get("entityId")
-            if not ftrack_id:
-                continue
-
-            ftrack_ids.append(ftrack_id)
+            if ftrack_id:
+                ftrack_ids.append(ftrack_id)
 
         if project_in_selection:
             msg = "It is not possible to use this action on project entity."
@@ -120,48 +119,51 @@ class DeleteAssetSubset(BaseAction):
                 "message": "Invalid selection for this action (Bug)"
             }
 
-        if entities[0].entity_type.lower() == "project":
-            project = entities[0]
-        else:
-            project = entities[0]["project"]
-
+        project = self.get_project_from_entity(entities[0], session)
         project_name = project["full_name"]
-        self.dbcon.Session["AVALON_PROJECT"] = project_name
-
-        selected_av_entities = list(self.dbcon.find({
-            "type": "asset",
-            "data.ftrackId": {"$in": ftrack_ids}
-        }))
+        asset_docs = list(get_assets(
+            project_name,
+            fields=["_id", "name", "data.ftrackId", "data.parents"]
+        ))
+        selected_av_entities = []
+        found_ftrack_ids = set()
+        asset_docs_by_name = collections.defaultdict(list)
+        for asset_doc in asset_docs:
+            ftrack_id = asset_doc["data"].get("ftrackId")
+            if ftrack_id:
+                found_ftrack_ids.add(ftrack_id)
+                if ftrack_id in entity_mapping:
+                    selected_av_entities.append(asset_doc)
+
+            asset_name = asset_doc["name"]
+            asset_docs_by_name[asset_name].append(asset_doc)
 
         found_without_ftrack_id = {}
-        if len(selected_av_entities) != len(ftrack_ids):
-            found_ftrack_ids = [
-                ent["data"]["ftrackId"] for ent in selected_av_entities
-            ]
-            for ftrack_id, entity in entity_mapping.items():
-                if ftrack_id in found_ftrack_ids:
+        for ftrack_id, entity in entity_mapping.items():
+            if ftrack_id in found_ftrack_ids:
+                continue
+
+            av_ents_by_name = asset_docs_by_name[entity["name"]]
+            if not av_ents_by_name:
+                continue
+
+            ent_path_items = [ent["name"] for ent in entity["link"]]
+            end_index = len(ent_path_items) - 1
+            parents = ent_path_items[1:end_index:]
+            # TODO we should say to user that
+            # few of them are missing in avalon
+            for av_ent in av_ents_by_name:
+                if av_ent["data"]["parents"] != parents:
                     continue
 
-                av_ents_by_name = list(self.dbcon.find({
-                    "type": "asset",
-                    "name": entity["name"]
-                }))
-                if not av_ents_by_name:
-                    continue
-
-                ent_path_items = [ent["name"] for ent in entity["link"]]
-                parents = ent_path_items[1:len(ent_path_items)-1:]
-                # TODO we should say to user that
-                # few of them are missing in avalon
-                for av_ent in av_ents_by_name:
-                    if av_ent["data"]["parents"] != parents:
-                        continue
-
-                    # TODO we should say to user that found entity
-                    # with same name does not match same ftrack id?
-                    if "ftrackId" not in av_ent["data"]:
-                        selected_av_entities.append(av_ent)
-                        found_without_ftrack_id[str(av_ent["_id"])] = ftrack_id
-                    break
+                # TODO we should say to user that found entity
+                # with same name does not match same ftrack id?
+                if "ftrackId" not in av_ent["data"]:
+                    selected_av_entities.append(av_ent)
+                    found_without_ftrack_id[str(av_ent["_id"])] = ftrack_id
+                break
 
         if not selected_av_entities:
             return {
@@ -206,10 +208,7 @@ class DeleteAssetSubset(BaseAction):
 
             items.append(id_item)
         asset_ids = [ent["_id"] for ent in selected_av_entities]
-        subsets_for_selection = self.dbcon.find({
-            "type": "subset",
-            "parent": {"$in": asset_ids}
-        })
+        subsets_for_selection = get_subsets(project_name, asset_ids=asset_ids)
 
         asset_ending = ""
         if len(selected_av_entities) > 1:
@@ -459,13 +458,9 @@ class DeleteAssetSubset(BaseAction):
         if len(assets_to_delete) > 0:
             map_av_ftrack_id = spec_data["without_ftrack_id"]
             # Prepare data when deleting whole avalon asset
-            avalon_assets = self.dbcon.find(
-                {"type": "asset"},
-                {
-                    "_id": 1,
-                    "data.visualParent": 1,
-                    "data.ftrackId": 1
-                }
+            avalon_assets = get_assets(
+                project_name,
+                fields=["_id", "data.visualParent", "data.ftrackId"]
             )
             avalon_assets_by_parent = collections.defaultdict(list)
             for asset in avalon_assets:
 
@@ -5,7 +5,12 @@ import uuid
 import clique
 from pymongo import UpdateOne
 
+from openpype.client import (
+    get_assets,
+    get_subsets,
+    get_versions,
+    get_representations
+)
 from openpype.api import Anatomy
 from openpype.lib import StringTemplate, TemplateUnsolved
 from openpype.pipeline import AvalonMongoDB
@@ -198,10 +203,9 @@ class DeleteOldVersions(BaseAction):
         self.log.debug("Project is set to {}".format(project_name))
 
         # Get Assets from avalon database
-        assets = list(self.dbcon.find({
-            "type": "asset",
-            "name": {"$in": avalon_asset_names}
-        }))
+        assets = list(
+            get_assets(project_name, asset_names=avalon_asset_names)
+        )
         asset_id_to_name_map = {
             asset["_id"]: asset["name"] for asset in assets
         }
@@ -210,10 +214,9 @@ class DeleteOldVersions(BaseAction):
         self.log.debug("Collected assets ({})".format(len(asset_ids)))
 
         # Get Subsets
-        subsets = list(self.dbcon.find({
-            "type": "subset",
-            "parent": {"$in": asset_ids}
-        }))
+        subsets = list(
+            get_subsets(project_name, asset_ids=asset_ids)
+        )
         subsets_by_id = {}
         subset_ids = []
         for subset in subsets:
@@ -230,10 +233,9 @@ class DeleteOldVersions(BaseAction):
         self.log.debug("Collected subsets ({})".format(len(subset_ids)))
 
         # Get Versions
-        versions = list(self.dbcon.find({
-            "type": "version",
-            "parent": {"$in": subset_ids}
-        }))
+        versions = list(
+            get_versions(project_name, subset_ids=subset_ids)
+        )
 
         versions_by_parent = collections.defaultdict(list)
         for ent in versions:
@@ -295,10 +297,9 @@ class DeleteOldVersions(BaseAction):
                 "message": msg
             }
 
-        repres = list(self.dbcon.find({
-            "type": "representation",
-            "parent": {"$in": version_ids}
-        }))
+        repres = list(
+            get_representations(project_name, version_ids=version_ids)
+        )
 
         self.log.debug(
             "Collected representations to remove ({})".format(len(repres))
 
@@ -3,8 +3,13 @@ import copy
 import json
 import collections
 
 from bson.objectid import ObjectId
 
+from openpype.client import (
+    get_project,
+    get_assets,
+    get_subsets,
+    get_versions,
+    get_representations
+)
 from openpype.api import Anatomy, config
 from openpype_modules.ftrack.lib import BaseAction, statics_icon
 from openpype_modules.ftrack.lib.avalon_sync import CUST_ATTR_ID_KEY
@@ -18,11 +23,9 @@ from openpype.lib.delivery import (
     process_single_file,
     process_sequence
 )
-from openpype.pipeline import AvalonMongoDB
 
 
 class Delivery(BaseAction):
-
     identifier = "delivery.action"
     label = "Delivery"
     description = "Deliver data to client"
@@ -30,11 +33,6 @@
     icon = statics_icon("ftrack", "action_icons", "Delivery.svg")
     settings_key = "delivery_action"
 
-    def __init__(self, *args, **kwargs):
-        self.dbcon = AvalonMongoDB()
-
-        super(Delivery, self).__init__(*args, **kwargs)
-
     def discover(self, session, entities, event):
         is_valid = False
         for entity in entities:
@@ -57,9 +55,7 @@
 
         project_entity = self.get_project_from_entity(entities[0])
         project_name = project_entity["full_name"]
-        self.dbcon.install()
-        self.dbcon.Session["AVALON_PROJECT"] = project_name
-        project_doc = self.dbcon.find_one({"type": "project"}, {"name": True})
+        project_doc = get_project(project_name, fields=["name"])
         if not project_doc:
             return {
                 "success": False,
@@ -68,8 +64,7 @@
                 ).format(project_name)
             }
 
-        repre_names = self._get_repre_names(session, entities)
-        self.dbcon.uninstall()
+        repre_names = self._get_repre_names(project_name, session, entities)
 
         items.append({
             "type": "hidden",
@@ -198,17 +193,21 @@
             "title": title
         }
 
-    def _get_repre_names(self, session, entities):
-        version_ids = self._get_interest_version_ids(session, entities)
+    def _get_repre_names(self, project_name, session, entities):
+        version_ids = self._get_interest_version_ids(
+            project_name, session, entities
+        )
         if not version_ids:
             return []
-        repre_docs = self.dbcon.find({
-            "type": "representation",
-            "parent": {"$in": version_ids}
-        })
-        return list(sorted(repre_docs.distinct("name")))
+        repre_docs = get_representations(
+            project_name,
+            version_ids=version_ids,
+            fields=["name"]
+        )
+        repre_names = {repre_doc["name"] for repre_doc in repre_docs}
+        return list(sorted(repre_names))
 
-    def _get_interest_version_ids(self, session, entities):
+    def _get_interest_version_ids(self, project_name, session, entities):
         # Extract AssetVersion entities
         asset_versions = self._extract_asset_versions(session, entities)
         # Prepare Asset ids
@@ -235,14 +234,18 @@
             subset_names.add(asset["name"])
             version_nums.add(asset_version["version"])
 
-        asset_docs_by_ftrack_id = self._get_asset_docs(session, parent_ids)
+        asset_docs_by_ftrack_id = self._get_asset_docs(
+            project_name, session, parent_ids
+        )
         subset_docs = self._get_subset_docs(
+            project_name,
             asset_docs_by_ftrack_id,
             subset_names,
             asset_versions,
             assets_by_id
         )
         version_docs = self._get_version_docs(
+            project_name,
             asset_docs_by_ftrack_id,
             subset_docs,
             version_nums,
@@ -290,6 +293,7 @@
 
     def _get_version_docs(
         self,
+        project_name,
        asset_docs_by_ftrack_id,
         subset_docs,
         version_nums,
@@ -300,11 +304,11 @@
             subset_doc["_id"]: subset_doc
             for subset_doc in subset_docs
         }
-        version_docs = list(self.dbcon.find({
-            "type": "version",
-            "parent": {"$in": list(subset_docs_by_id.keys())},
-            "name": {"$in": list(version_nums)}
-        }))
+        version_docs = list(get_versions(
+            project_name,
+            subset_ids=subset_docs_by_id.keys(),
+            versions=version_nums
+        ))
         version_docs_by_parent_id = collections.defaultdict(dict)
         for version_doc in version_docs:
             subset_doc = subset_docs_by_id[version_doc["parent"]]
@@ -345,6 +349,7 @@
 
     def _get_subset_docs(
         self,
+        project_name,
         asset_docs_by_ftrack_id,
         subset_names,
         asset_versions,
@@ -354,11 +359,11 @@
             asset_doc["_id"]
             for asset_doc in asset_docs_by_ftrack_id.values()
         ]
-        subset_docs = list(self.dbcon.find({
-            "type": "subset",
-            "parent": {"$in": asset_doc_ids},
-            "name": {"$in": list(subset_names)}
-        }))
+        subset_docs = list(get_subsets(
+            project_name,
+            asset_ids=asset_doc_ids,
+            subset_names=subset_names
+        ))
         subset_docs_by_parent_id = collections.defaultdict(dict)
         for subset_doc in subset_docs:
             asset_id = subset_doc["parent"]
@@ -385,15 +390,21 @@
             filtered_subsets.append(subset_doc)
         return filtered_subsets
 
-    def _get_asset_docs(self, session, parent_ids):
-        asset_docs = list(self.dbcon.find({
-            "type": "asset",
-            "data.ftrackId": {"$in": list(parent_ids)}
-        }))
+    def _get_asset_docs(self, project_name, session, parent_ids):
+        asset_docs = list(get_assets(
+            project_name, fields=["_id", "name", "data.ftrackId"]
+        ))
+
+        asset_docs_by_id = {}
+        asset_docs_by_name = {}
         asset_docs_by_ftrack_id = {}
         for asset_doc in asset_docs:
+            asset_id = str(asset_doc["_id"])
+            asset_name = asset_doc["name"]
             ftrack_id = asset_doc["data"].get("ftrackId")
+
+            asset_docs_by_id[asset_id] = asset_doc
+            asset_docs_by_name[asset_name] = asset_doc
             if ftrack_id:
                 asset_docs_by_ftrack_id[ftrack_id] = asset_doc
 
@@ -406,15 +417,15 @@
         avalon_mongo_id_values = query_custom_attributes(
             session, [attr_def["id"]], parent_ids, True
         )
-        entity_ids_by_mongo_id = {
-            ObjectId(item["value"]): item["entity_id"]
-            for item in avalon_mongo_id_values
-            if item["value"]
-        }
 
         missing_ids = set(parent_ids)
-        for entity_id in set(entity_ids_by_mongo_id.values()):
-            if entity_id in missing_ids:
+        for item in avalon_mongo_id_values:
+            if not item["value"]:
+                continue
+            asset_id = item["value"]
+            entity_id = item["entity_id"]
+            asset_doc = asset_docs_by_id.get(asset_id)
+            if asset_doc:
+                asset_docs_by_ftrack_id[entity_id] = asset_doc
                 missing_ids.remove(entity_id)
 
         entity_ids_by_name = {}
@@ -427,36 +438,10 @@
             for entity in not_found_entities
         }
 
-        expressions = []
-        if entity_ids_by_mongo_id:
-            expression = {
-                "type": "asset",
-                "_id": {"$in": list(entity_ids_by_mongo_id.keys())}
-            }
-            expressions.append(expression)
-
-        if entity_ids_by_name:
-            expression = {
-                "type": "asset",
-                "name": {"$in": list(entity_ids_by_name.keys())}
-            }
-            expressions.append(expression)
-
-        if expressions:
-            if len(expressions) == 1:
-                filter = expressions[0]
-            else:
-                filter = {"$or": expressions}
-
-            asset_docs = self.dbcon.find(filter)
-            for asset_doc in asset_docs:
-                if asset_doc["_id"] in entity_ids_by_mongo_id:
-                    entity_id = entity_ids_by_mongo_id[asset_doc["_id"]]
-                    asset_docs_by_ftrack_id[entity_id] = asset_doc
-
-                elif asset_doc["name"] in entity_ids_by_name:
-                    entity_id = entity_ids_by_name[asset_doc["name"]]
-                    asset_docs_by_ftrack_id[entity_id] = asset_doc
+        for asset_name, entity_id in entity_ids_by_name.items():
+            asset_doc = asset_docs_by_name.get(asset_name)
+            if asset_doc:
+                asset_docs_by_ftrack_id[entity_id] = asset_doc
 
         return asset_docs_by_ftrack_id
 
@@ -490,7 +475,6 @@
         session.commit()
 
         try:
-            self.dbcon.install()
             report = self.real_launch(session, entities, event)
 
         except Exception as exc:
@@ -516,7 +500,6 @@
         else:
             job["status"] = "failed"
         session.commit()
-        self.dbcon.uninstall()
 
         if not report["success"]:
             self.show_interface(
@@ -558,16 +541,15 @@
         if not os.path.exists(location_path):
             os.makedirs(location_path)
 
-        self.dbcon.Session["AVALON_PROJECT"] = project_name
-
         self.log.debug("Collecting representations to process.")
-        version_ids = self._get_interest_version_ids(session, entities)
-        repres_to_deliver = list(self.dbcon.find({
-            "type": "representation",
-            "parent": {"$in": version_ids},
-            "name": {"$in": repre_names}
-        }))
+        version_ids = self._get_interest_version_ids(
+            project_name, session, entities
+        )
+        repres_to_deliver = list(get_representations(
+            project_name,
+            representation_names=repre_names,
+            version_ids=version_ids
+        ))
         anatomy = Anatomy(project_name)
 
         format_dict = get_format_dict(anatomy, location_path)
 
@@ -7,6 +7,10 @@ import datetime
 
 import ftrack_api
 
+from openpype.client import (
+    get_project,
+    get_assets,
+)
 from openpype.api import get_project_settings
 from openpype.lib import (
     get_workfile_template_key,
@@ -14,7 +18,6 @@ from openpype.lib import (
     Anatomy,
     StringTemplate,
 )
-from openpype.pipeline import AvalonMongoDB
 from openpype_modules.ftrack.lib import BaseAction, statics_icon
 from openpype_modules.ftrack.lib.avalon_sync import create_chunks
 
@@ -248,10 +251,8 @@ class FillWorkfileAttributeAction(BaseAction):
         # Find matchin asset documents and map them by ftrack task entities
         # - result stored to 'asset_docs_with_task_entities' is list with
         #   tuple `(asset document, [task entitis, ...])`
-        dbcon = AvalonMongoDB()
-        dbcon.Session["AVALON_PROJECT"] = project_name
-        # Quety all asset documents
-        asset_docs = list(dbcon.find({"type": "asset"}))
+        asset_docs = list(get_assets(project_name))
         job_entity["data"] = json.dumps({
             "description": "(1/3) Asset documents queried."
         })
@@ -276,7 +277,7 @@ class FillWorkfileAttributeAction(BaseAction):
         # Keep placeholders in the template unfilled
         host_name = "{app}"
         extension = "{ext}"
-        project_doc = dbcon.find_one({"type": "project"})
+        project_doc = get_project(project_name)
         project_settings = get_project_settings(project_name)
         anatomy = Anatomy(project_name)
         templates_by_key = {}
 
@@ -1,8 +1,8 @@
 import json
 
+from openpype.client import get_project
 from openpype.api import ProjectSettings
 from openpype.lib import create_project
-from openpype.pipeline import AvalonMongoDB
 from openpype.settings import SaveWarningExc
 
 from openpype_modules.ftrack.lib import (
@@ -389,12 +389,8 @@ class PrepareProjectLocal(BaseAction):
         project_name = project_entity["full_name"]
 
         # Try to find project document
-        dbcon = AvalonMongoDB()
-        dbcon.install()
-        dbcon.Session["AVALON_PROJECT"] = project_name
-        project_doc = dbcon.find_one({
-            "type": "project"
-        })
+        project_doc = get_project(project_name)
 
         # Create project if is not available
         # - creation is required to be able set project anatomy and attributes
         if not project_doc:
@@ -402,9 +398,7 @@
             self.log.info("Creating project \"{} [{}]\"".format(
                 project_name, project_code
             ))
-            create_project(project_name, project_code, dbcon=dbcon)
-
-        dbcon.uninstall()
+            create_project(project_name, project_code)
 
         project_settings = ProjectSettings(project_name)
         project_anatomy_settings = project_settings["project_anatomy"]
 
@@ -5,9 +5,16 @@ import json
 
 import ftrack_api
 
+from openpype.client import (
+    get_asset_by_name,
+    get_subset_by_name,
+    get_version_by_name,
+    get_representation_by_name
+)
 from openpype.api import Anatomy
 from openpype.pipeline import (
     get_representation_path,
-    legacy_io,
+    AvalonMongoDB,
 )
 from openpype_modules.ftrack.lib import BaseAction, statics_icon
 
@@ -255,9 +262,10 @@ class RVAction(BaseAction):
             "Component", list(event["data"]["values"].values())[0]
         )["version"]["asset"]["parent"]["link"][0]
         project = session.get(link["type"], link["id"])
-        os.environ["AVALON_PROJECT"] = project["name"]
-        legacy_io.Session["AVALON_PROJECT"] = project["name"]
-        legacy_io.install()
+        project_name = project["full_name"]
+        dbcon = AvalonMongoDB()
+        dbcon.Session["AVALON_PROJECT"] = project_name
+        anatomy = Anatomy(project_name)
 
         location = ftrack_api.Session().pick_location()
 
@@ -281,37 +289,38 @@
             if online_source:
                 continue
 
-            asset = legacy_io.find_one({"type": "asset", "name": parent_name})
-            subset = legacy_io.find_one(
-                {
-                    "type": "subset",
-                    "name": component["version"]["asset"]["name"],
-                    "parent": asset["_id"]
-                }
-            )
-            version = legacy_io.find_one(
-                {
-                    "type": "version",
-                    "name": component["version"]["version"],
-                    "parent": subset["_id"]
-                }
-            )
-            representation = legacy_io.find_one(
-                {
-                    "type": "representation",
-                    "parent": version["_id"],
-                    "name": component["file_type"][1:]
-                }
-            )
-            if representation is None:
-                representation = legacy_io.find_one(
-                    {
-                        "type": "representation",
-                        "parent": version["_id"],
-                        "name": "preview"
-                    }
-                )
-            paths.append(get_representation_path(representation))
+            subset_name = component["version"]["asset"]["name"]
+            version_name = component["version"]["version"]
+            representation_name = component["file_type"][1:]
+
+            asset_doc = get_asset_by_name(
+                project_name, parent_name, fields=["_id"]
+            )
+            subset_doc = get_subset_by_name(
+                project_name,
+                subset_name=subset_name,
+                asset_id=asset_doc["_id"]
+            )
+            version_doc = get_version_by_name(
+                project_name,
+                version=version_name,
+                subset_id=subset_doc["_id"]
+            )
+            repre_doc = get_representation_by_name(
+                project_name,
+                version_id=version_doc["_id"],
+                representation_name=representation_name
+            )
+            if not repre_doc:
+                repre_doc = get_representation_by_name(
+                    project_name,
+                    version_id=version_doc["_id"],
+                    representation_name="preview"
+                )
+
+            paths.append(get_representation_path(
+                repre_doc, root=anatomy.roots, dbcon=dbcon
+            ))
 
         return paths
 
@ -5,6 +5,14 @@ import requests
|
|||
|
||||
from bson.objectid import ObjectId
|
||||
|
||||
from openpype.client import (
|
||||
get_project,
|
||||
get_asset_by_id,
|
||||
get_assets,
|
||||
get_subset_by_name,
|
||||
get_version_by_name,
|
||||
get_representations
|
||||
)
|
||||
from openpype_modules.ftrack.lib import BaseAction, statics_icon
|
||||
from openpype.api import Anatomy
|
||||
from openpype.pipeline import AvalonMongoDB
|
||||
|
|
@ -385,7 +393,7 @@ class StoreThumbnailsToAvalon(BaseAction):
|
|||
|
||||
db_con.Session["AVALON_PROJECT"] = project_name
|
||||
|
||||
avalon_project = db_con.find_one({"type": "project"})
|
||||
avalon_project = get_project(project_name)
|
||||
        output["project"] = avalon_project

        if not avalon_project:

@@ -399,19 +407,17 @@ class StoreThumbnailsToAvalon(BaseAction):
        asset_mongo_id = parent["custom_attributes"].get(CUST_ATTR_ID_KEY)
        if asset_mongo_id:
            try:
                asset_mongo_id = ObjectId(asset_mongo_id)
                asset_ent = db_con.find_one({
                    "type": "asset",
                    "_id": asset_mongo_id
                })
                asset_ent = get_asset_by_id(project_name, asset_mongo_id)
            except Exception:
                pass

        if not asset_ent:
            asset_ent = db_con.find_one({
                "type": "asset",
                "data.ftrackId": parent["id"]
            })
            asset_docs = get_assets(project_name, asset_names=[parent["name"]])
            for asset_doc in asset_docs:
                ftrack_id = asset_doc.get("data", {}).get("ftrackId")
                if ftrack_id == parent["id"]:
                    asset_ent = asset_doc
                    break

        output["asset"] = asset_ent

@@ -422,13 +428,11 @@ class StoreThumbnailsToAvalon(BaseAction):
            )
            return output

        asset_mongo_id = asset_ent["_id"]

        subset_ent = db_con.find_one({
            "type": "subset",
            "parent": asset_mongo_id,
            "name": subset_name
        })
        subset_ent = get_subset_by_name(
            project_name,
            subset_name=subset_name,
            asset_id=asset_ent["_id"]
        )

        output["subset"] = subset_ent

@@ -439,11 +443,11 @@ class StoreThumbnailsToAvalon(BaseAction):
            ).format(subset_name, ent_path)
            return output

        version_ent = db_con.find_one({
            "type": "version",
            "name": version,
            "parent": subset_ent["_id"]
        })
        version_ent = get_version_by_name(
            project_name,
            version,
            subset_ent["_id"]
        )

        output["version"] = version_ent

@@ -454,10 +458,10 @@ class StoreThumbnailsToAvalon(BaseAction):
            ).format(version, subset_name, ent_path)
            return output

        repre_ents = list(db_con.find({
            "type": "representation",
            "parent": version_ent["_id"]
        }))
        repre_ents = list(get_representations(
            project_name,
            version_ids=[version_ent["_id"]]
        ))

        output["representations"] = repre_ents
        return output

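The hunks above all follow the same refactor: an inline MongoDB query built at the call site is replaced by a named accessor from `openpype.client`. A minimal sketch of that pattern, using a hypothetical in-memory collection in place of the real database:

```python
# Hypothetical stand-in for a Mongo collection; the real code queries MongoDB.
ASSETS = [
    {"_id": "a1", "type": "asset", "name": "hero", "data": {"ftrackId": "f-123"}},
]

def find_one(query):
    # Old style: every call site builds its own raw query dict.
    for doc in ASSETS:
        if all(doc.get(key) == value for key, value in query.items()):
            return doc
    return None

def get_asset_by_id(project_name, asset_id):
    # New style: a named accessor hides the query shape from call sites.
    return find_one({"type": "asset", "_id": asset_id})

assert get_asset_by_id("demo_project", "a1")["name"] == "hero"
assert get_asset_by_id("demo_project", "missing") is None
```

Centralizing the query shape in one function is what makes a later storage-backend change (the motivation for `openpype.client`) a one-file edit instead of a repo-wide one.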
@@ -6,6 +6,14 @@ import numbers

import six

from openpype.client import (
    get_project,
    get_assets,
    get_archived_assets,
    get_subsets,
    get_versions,
    get_representations
)
from openpype.api import (
    Logger,
    get_anatomy_settings

@@ -576,6 +584,10 @@ class SyncEntitiesFactory:
        self.ft_project_id = ft_project_id
        self.entities_dict = entities_dict

    @property
    def project_name(self):
        return self.entities_dict[self.ft_project_id]["name"]

    @property
    def avalon_ents_by_id(self):
        """

@@ -660,9 +672,9 @@ class SyncEntitiesFactory:
            (list) of assets
        """
        if self._avalon_archived_ents is None:
            self._avalon_archived_ents = [
                ent for ent in self.dbcon.find({"type": "archived_asset"})
            ]
            self._avalon_archived_ents = list(
                get_archived_assets(self.project_name)
            )
        return self._avalon_archived_ents

    @property

@@ -730,7 +742,7 @@ class SyncEntitiesFactory:
        """
        if self._subsets_by_parent_id is None:
            self._subsets_by_parent_id = collections.defaultdict(list)
            for subset in self.dbcon.find({"type": "subset"}):
            for subset in get_subsets(self.project_name):
                self._subsets_by_parent_id[str(subset["parent"])].append(
                    subset
                )

@@ -1421,8 +1433,8 @@ class SyncEntitiesFactory:
        # Avalon entities
        self.dbcon.install()
        self.dbcon.Session["AVALON_PROJECT"] = ft_project_name
        avalon_project = self.dbcon.find_one({"type": "project"})
        avalon_entities = self.dbcon.find({"type": "asset"})
        avalon_project = get_project(ft_project_name)
        avalon_entities = get_assets(ft_project_name)
        self.avalon_project = avalon_project
        self.avalon_entities = avalon_entities

@@ -2258,46 +2270,37 @@ class SyncEntitiesFactory:
            self._delete_subsets_without_asset(subsets_to_remove)

    def _delete_subsets_without_asset(self, not_existing_parents):
        subset_ids = []
        version_ids = []
        repre_ids = []
        to_delete = []

        subset_ids = []
        for parent_id in not_existing_parents:
            subsets = self.subsets_by_parent_id.get(parent_id)
            if not subsets:
                continue
            for subset in subsets:
                if subset.get("type") != "subset":
                    continue
                subset_ids.append(subset["_id"])
                if subset.get("type") == "subset":
                    subset_ids.append(subset["_id"])

        db_subsets = self.dbcon.find({
            "_id": {"$in": subset_ids},
            "type": "subset"
        })
        if not db_subsets:
            return

        db_versions = self.dbcon.find({
            "parent": {"$in": subset_ids},
            "type": "version"
        })
        if db_versions:
            version_ids = [ver["_id"] for ver in db_versions]

        db_repres = self.dbcon.find({
            "parent": {"$in": version_ids},
            "type": "representation"
        })
        if db_repres:
            repre_ids = [repre["_id"] for repre in db_repres]
        db_versions = get_versions(
            self.project_name,
            subset_ids=subset_ids,
            fields=["_id"]
        )
        version_ids = [ver["_id"] for ver in db_versions]
        db_repres = get_representations(
            self.project_name,
            version_ids=version_ids,
            fields=["_id"]
        )
        repre_ids = [repre["_id"] for repre in db_repres]

        to_delete.extend(subset_ids)
        to_delete.extend(version_ids)
        to_delete.extend(repre_ids)

        self.dbcon.delete_many({"_id": {"$in": to_delete}})
        if to_delete:
            self.dbcon.delete_many({"_id": {"$in": to_delete}})

    # Probably deprecated
    def _check_changeability(self, parent_id=None):

@@ -2779,8 +2782,7 @@ class SyncEntitiesFactory:

    def report(self):
        items = []
        project_name = self.entities_dict[self.ft_project_id]["name"]
        title = "Synchronization report ({}):".format(project_name)
        title = "Synchronization report ({}):".format(self.project_name)

        keys = ["error", "warning", "info"]
        for key in keys:

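The `_subsets_by_parent_id` cache in the hunks above groups documents by their parent id with `collections.defaultdict`. A small self-contained illustration of that grouping pattern (the subset documents here are made up):

```python
import collections

# Made-up subset documents; the real ones come from the project database.
subsets = [
    {"_id": 1, "parent": "asset_a"},
    {"_id": 2, "parent": "asset_a"},
    {"_id": 3, "parent": "asset_b"},
]

subsets_by_parent_id = collections.defaultdict(list)
for subset in subsets:
    # defaultdict creates the list on first access, so no key checks are needed
    subsets_by_parent_id[str(subset["parent"])].append(subset)

assert len(subsets_by_parent_id["asset_a"]) == 2
assert len(subsets_by_parent_id["asset_b"]) == 1
```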
@@ -3,7 +3,8 @@ import collections
import six
import pyblish.api
from copy import deepcopy
from openpype.pipeline import legacy_io
from openpype.client import get_asset_by_id


# Copy of constant `openpype_modules.ftrack.lib.avalon_sync.CUST_ATTR_AUTO_SYNC`
CUST_ATTR_AUTO_SYNC = "avalon_auto_sync"

@@ -82,9 +83,6 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
        auto_sync_state = project[
            "custom_attributes"][CUST_ATTR_AUTO_SYNC]

        if not legacy_io.Session:
            legacy_io.install()

        self.ft_project = None

        # disable termporarily ftrack project's autosyncing

@@ -93,14 +91,14 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):

        try:
            # import ftrack hierarchy
            self.import_to_ftrack(hierarchy_context)
            self.import_to_ftrack(project_name, hierarchy_context)
        except Exception:
            raise
        finally:
            if auto_sync_state:
                self.auto_sync_on(project)

    def import_to_ftrack(self, input_data, parent=None):
    def import_to_ftrack(self, project_name, input_data, parent=None):
        # Prequery hiearchical custom attributes
        hier_custom_attributes = get_pype_attr(self.session)[1]
        hier_attr_by_key = {

@@ -222,7 +220,7 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
                six.reraise(tp, value, tb)

        # Incoming links.
        self.create_links(entity_data, entity)
        self.create_links(project_name, entity_data, entity)
        try:
            self.session.commit()
        except Exception:

@@ -255,9 +253,9 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
        # Import children.
        if 'childs' in entity_data:
            self.import_to_ftrack(
                entity_data['childs'], entity)
                project_name, entity_data['childs'], entity)

    def create_links(self, entity_data, entity):
    def create_links(self, project_name, entity_data, entity):
        # Clear existing links.
        for link in entity.get("incoming_links", []):
            self.session.delete(link)

@@ -270,9 +268,15 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
            six.reraise(tp, value, tb)

        # Create new links.
        for input in entity_data.get("inputs", []):
            input_id = legacy_io.find_one({"_id": input})["data"]["ftrackId"]
            assetbuild = self.session.get("AssetBuild", input_id)
        for asset_id in entity_data.get("inputs", []):
            asset_doc = get_asset_by_id(project_name, asset_id)
            ftrack_id = None
            if asset_doc:
                ftrack_id = asset_doc["data"].get("ftrackId")
            if not ftrack_id:
                continue

            assetbuild = self.session.get("AssetBuild", ftrack_id)
            self.log.debug(
                "Creating link from {0} to {1}".format(
                    assetbuild["name"], entity["name"]

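The rewritten `create_links` loop replaces a chained lookup that crashed on missing documents with a guarded one that skips them. A compact sketch of the guard (the documents below are invented):

```python
def find_ftrack_id(asset_doc):
    # The old code did asset_doc["data"]["ftrackId"] directly and raised when
    # the document was missing; the guarded version returns None instead.
    if not asset_doc:
        return None
    return asset_doc["data"].get("ftrackId")

assert find_ftrack_id(None) is None
assert find_ftrack_id({"data": {}}) is None
assert find_ftrack_id({"data": {"ftrackId": "ab-12"}}) == "ab-12"
```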
@@ -85,7 +85,7 @@ def update_op_assets(
        # Frame in, fallback on 0
        frame_in = int(item_data.get("frame_in") or 0)
        item_data["frameStart"] = frame_in
        item_data.pop("frame_in")
        item_data.pop("frame_in", None)
        # Frame out, fallback on frame_in + duration
        frames_duration = int(item.get("nb_frames") or 1)
        frame_out = (

@@ -94,7 +94,7 @@ def update_op_assets(
            else frame_in + frames_duration
        )
        item_data["frameEnd"] = int(frame_out)
        item_data.pop("frame_out")
        item_data.pop("frame_out", None)
        # Fps, fallback to project's value when entity fps is deleted
        if not item_data.get("fps") and item_doc["data"].get("fps"):
            item_data["fps"] = project_doc["data"]["fps"]

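Both Kitsu hunks make the same one-argument fix: `dict.pop(key)` raises `KeyError` when the key is absent, while `dict.pop(key, default)` does not, which is what prevents the crash on entities without frame information:

```python
item_data = {"frameStart": 1001}  # no "frame_in" key present

raised = False
try:
    item_data.pop("frame_in")
except KeyError:
    raised = True
assert raised  # the old call crashed on entities without frame data

# With a default, the missing key is simply ignored
assert item_data.pop("frame_in", None) is None
```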
@@ -7,6 +7,7 @@ from openpype_interfaces import (
    ITrayService,
    ILaunchHookPaths
)
from openpype.lib.events import register_event_callback
from openpype.pipeline import AvalonMongoDB

from .exceptions import InvalidContextError

@@ -422,3 +423,20 @@ class TimersManager(OpenPypeModule, ITrayService, ILaunchHookPaths):
        }

        return requests.post(rest_api_url, json=data)

    def on_host_install(self, host, host_name, project_name):
        self.log.debug("Installing task changed callback")
        register_event_callback("taskChanged", self._on_host_task_change)

    def _on_host_task_change(self, event):
        project_name = event["project_name"]
        asset_name = event["asset_name"]
        task_name = event["task_name"]
        self.log.debug((
            "Sending message that timer should change to"
            " Project: {} Asset: {} Task: {}"
        ).format(project_name, asset_name, task_name))

        self.start_timer_with_webserver(
            project_name, asset_name, task_name, self.log
        )

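The new `on_host_install` hook wires the timers module into a global event bus: it registers a callback for the `taskChanged` topic and reacts whenever a host emits it. A minimal sketch of that publish/subscribe pattern (the registry below is a stand-in for `openpype.lib.events`, not its real implementation):

```python
# Stand-in event bus; the real one lives in openpype.lib.events.
_callbacks = {}

def register_event_callback(topic, callback):
    _callbacks.setdefault(topic, []).append(callback)

def emit_event(topic, event):
    for callback in _callbacks.get(topic, []):
        callback(event)

started_timers = []

def on_task_change(event):
    # Mirrors _on_host_task_change: unpack the event and start a timer
    started_timers.append(
        (event["project_name"], event["asset_name"], event["task_name"])
    )

register_event_callback("taskChanged", on_task_change)
emit_event("taskChanged", {
    "project_name": "demo", "asset_name": "sh010", "task_name": "comp",
})

assert started_timers == [("demo", "sh010", "comp")]
```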
@@ -16,9 +16,7 @@ from openpype.modules import load_modules, ModulesManager
from openpype.settings import get_project_settings
from openpype.lib import (
    Anatomy,
    register_event_callback,
    filter_pyblish_plugins,
    change_timer_to_current_context,
)

from . import (

@@ -33,6 +31,9 @@ from . import (
_is_installed = False
_registered_root = {"_": ""}
_registered_host = {"_": None}
# Keep modules manager (and it's modules) in memory
# - that gives option to register modules' callbacks
_modules_manager = None

log = logging.getLogger(__name__)

@@ -44,6 +45,23 @@ PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")


def _get_modules_manager():
    """Get or create modules manager for host installation.

    This is not meant for public usage. Reason is to keep modules
    in memory of process to be able trigger their event callbacks if they
    need any.

    Returns:
        ModulesManager: Manager wrapping discovered modules.
    """

    global _modules_manager
    if _modules_manager is None:
        _modules_manager = ModulesManager()
    return _modules_manager


def register_root(path):
    """Register currently active root"""
    log.info("Registering root: %s" % path)

@@ -74,6 +92,7 @@ def install_host(host):
    _is_installed = True

    legacy_io.install()
    modules_manager = _get_modules_manager()

    missing = list()
    for key in ("AVALON_PROJECT", "AVALON_ASSET"):

@@ -95,8 +114,6 @@ def install_host(host):

    register_host(host)

    register_event_callback("taskChanged", _on_task_change)

    def modified_emit(obj, record):
        """Method replacing `emit` in Pyblish's MessageHandler."""
        record.msg = record.getMessage()

@@ -112,7 +129,14 @@ def install_host(host):
    else:
        pyblish.api.register_target("local")

    install_openpype_plugins()
    project_name = os.environ.get("AVALON_PROJECT")
    host_name = os.environ.get("AVALON_APP")

    # Give option to handle host installation
    for module in modules_manager.get_enabled_modules():
        module.on_host_install(host, host_name, project_name)

    install_openpype_plugins(project_name, host_name)


def install_openpype_plugins(project_name=None, host_name=None):

@@ -124,7 +148,7 @@ def install_openpype_plugins(project_name=None, host_name=None):
    pyblish.api.register_discovery_filter(filter_pyblish_plugins)
    register_loader_plugin_path(LOAD_PATH)

    modules_manager = ModulesManager()
    modules_manager = _get_modules_manager()
    publish_plugin_dirs = modules_manager.collect_plugin_paths()["publish"]
    for path in publish_plugin_dirs:
        pyblish.api.register_plugin_path(path)

@@ -168,10 +192,6 @@ def install_openpype_plugins(project_name=None, host_name=None):
        register_inventory_action(path)


def _on_task_change():
    change_timer_to_current_context()


def uninstall_host():
    """Undo all of what `install()` did"""
    host = registered_host()

@@ -829,9 +829,10 @@ class CreateContext:
        discover_result = publish_plugins_discover()
        publish_plugins = discover_result.plugins

        targets = pyblish.logic.registered_targets() or ["default"]
        targets = set(pyblish.logic.registered_targets())
        targets.add("default")
        plugins_by_targets = pyblish.logic.plugins_by_targets(
            publish_plugins, targets
            publish_plugins, list(targets)
        )
        # Collect plugins that can have attribute definitions
        for plugin in publish_plugins:

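The `CreateContext` change fixes a subtle filtering bug: `registered_targets() or ["default"]` falls back to `"default"` only when no target is registered at all, so registering any extra target silently dropped every default-target plugin. Building a set and always adding `"default"` keeps both:

```python
def old_targets(registered):
    # "default" disappears as soon as anything else is registered
    return registered or ["default"]

def new_targets(registered):
    targets = set(registered)
    targets.add("default")
    return sorted(targets)

assert old_targets([]) == ["default"]
assert old_targets(["farm"]) == ["farm"]              # default plugins skipped
assert new_targets(["farm"]) == ["default", "farm"]   # both kept
assert new_targets([]) == ["default"]
```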
@@ -18,16 +18,6 @@ class InstancePlugin(pyblish.api.InstancePlugin):
        super(InstancePlugin, cls).process(cls, *args, **kwargs)


class Integrator(InstancePlugin):
    """Integrator base class.

    Wraps pyblish instance plugin. Targets set to "local" which means all
    integrators should run on "local" publishes, by default.
    "remote" targets could be used for integrators that should run externally.
    """
    targets = ["local"]


class Extractor(InstancePlugin):
    """Extractor base class.

@@ -38,8 +28,6 @@ class Extractor(InstancePlugin):

    """

    targets = ["local"]

    order = 2.0

    def staging_dir(self, instance):

@@ -1,5 +1,3 @@
import os
import getpass
import pyblish.api
from openpype.lib import get_openpype_username

@@ -763,7 +763,8 @@ class ExtractReview(pyblish.api.InstancePlugin):
        start_frame = int(start_frame)
        end_frame = int(end_frame)
        collections = clique.assemble(files)[0]
        assert len(collections) == 1, "Multiple collections found."
        msg = "Multiple collections {} found.".format(collections)
        assert len(collections) == 1, msg
        col = collections[0]

        # do nothing if no gap is found in input range

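The `ExtractReview` tweak only changes the assertion message, but it turns an opaque failure into a diagnosable one by interpolating the offending collections into the error text. Illustrated with plain strings standing in for `clique` collections (the names are invented):

```python
collections = ["render.1001-1010.exr", "render_b.1001-1010.exr"]

msg = "Multiple collections {} found.".format(collections)
try:
    assert len(collections) == 1, msg
except AssertionError as exc:
    # The failure now names the colliding collections instead of just counting them
    assert "render_b" in str(exc)
```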
@@ -95,4 +95,4 @@
            }
        }
    }
}
}

@@ -66,6 +66,28 @@
        "defaults": [],
        "joint_hints": "jnt_org"
    },
    "CreateMultiverseLook": {
        "enabled": true,
        "publish_mip_map": true
    },
    "CreateMultiverseUsd": {
        "enabled": true,
        "defaults": [
            "Main"
        ]
    },
    "CreateMultiverseUsdComp": {
        "enabled": true,
        "defaults": [
            "Main"
        ]
    },
    "CreateMultiverseUsdOver": {
        "enabled": true,
        "defaults": [
            "Main"
        ]
    },
    "CreateAnimation": {
        "enabled": true,
        "defaults": [

@@ -379,6 +401,14 @@
        "optional": true,
        "active": true
    },
    "ExtractAlembic": {
        "enabled": true,
        "families": [
            "pointcache",
            "model",
            "vrayproxy"
        ]
    },
    "ValidateRigContents": {
        "enabled": false,
        "optional": true,

@@ -59,13 +59,11 @@
    "applications": {
        "write_security_roles": [
            "API",
            "Administrator",
            "Pypeclub"
            "Administrator"
        ],
        "read_security_roles": [
            "API",
            "Administrator",
            "Pypeclub"
            "Administrator"
        ]
    }
},

@@ -73,25 +71,21 @@
    "tools_env": {
        "write_security_roles": [
            "API",
            "Administrator",
            "Pypeclub"
            "Administrator"
        ],
        "read_security_roles": [
            "API",
            "Administrator",
            "Pypeclub"
            "Administrator"
        ]
    },
    "avalon_mongo_id": {
        "write_security_roles": [
            "API",
            "Administrator",
            "Pypeclub"
            "Administrator"
        ],
        "read_security_roles": [
            "API",
            "Administrator",
            "Pypeclub"
            "Administrator"
        ]
    },
    "fps": {

@@ -124,10 +124,41 @@
            ]

        },
        {
            "type": "dict",
            "collapsible": true,
            "key": "CreateMultiverseLook",
            "label": "Create Multiverse Look",
            "checkbox_key": "enabled",
            "children": [
                {
                    "type": "boolean",
                    "key": "enabled",
                    "label": "Enabled"
                },
                {
                    "type": "boolean",
                    "key": "publish_mip_map",
                    "label": "Publish Mip Maps"
                }
            ]
        },
        {
            "type": "schema_template",
            "name": "template_create_plugin",
            "template_data": [
                {
                    "key": "CreateMultiverseUsd",
                    "label": "Create Multiverse USD"
                },
                {
                    "key": "CreateMultiverseUsdComp",
                    "label": "Create Multiverse USD Composition"
                },
                {
                    "key": "CreateMultiverseUsdOver",
                    "label": "Create Multiverse USD Override"
                },
                {
                    "key": "CreateAnimation",
                    "label": "Create Animation"

@@ -504,6 +504,30 @@
                    "label": "ValidateUniqueNames"
                }
            ]
        },
        {
            "type": "label",
            "label": "Extractors"
        },
        {
            "type": "dict",
            "collapsible": true,
            "key": "ExtractAlembic",
            "label": "Extract Alembic",
            "checkbox_key": "enabled",
            "children": [
                {
                    "type": "boolean",
                    "key": "enabled",
                    "label": "Enabled"
                },
                {
                    "key": "families",
                    "label": "Families",
                    "type": "list",
                    "object_type": "text"
                }
            ]
        }
    ]
},

@@ -468,10 +468,8 @@ class Window(QtWidgets.QDialog):
            current_page == "terminal"
        )

        self.state = {
            "is_closing": False,
            "current_page": current_page
        }
        self._current_page = current_page
        self._hidden_for_plugin_process = False

        self.tabs[current_page].setChecked(True)

@@ -590,14 +588,14 @@ class Window(QtWidgets.QDialog):
                target_page = page
                if direction is None:
                    direction = -1
            elif name == self.state["current_page"]:
            elif name == self._current_page:
                previous_page = page
                if direction is None:
                    direction = 1
            else:
                page.setVisible(False)

        self.state["current_page"] = target
        self._current_page = target
        self.slide_page(previous_page, target_page, direction)

    def slide_page(self, previous_page, target_page, direction):

@@ -684,7 +682,7 @@ class Window(QtWidgets.QDialog):
        comment_visible=None,
        terminal_filters_visibile=None
    ):
        target = self.state["current_page"]
        target = self._current_page
        comment_visibility = (
            not self.perspective_widget.isVisible()
            and not target == "terminal"

@@ -845,7 +843,7 @@ class Window(QtWidgets.QDialog):

    def apply_log_suspend_value(self, value):
        self._suspend_logs = value
        if self.state["current_page"] == "terminal":
        if self._current_page == "terminal":
            self.tabs["overview"].setChecked(True)

        self.tabs["terminal"].setVisible(not self._suspend_logs)

@@ -882,9 +880,21 @@ class Window(QtWidgets.QDialog):
        visibility = True
        if hasattr(plugin, "hide_ui_on_process") and plugin.hide_ui_on_process:
            visibility = False
        self._hidden_for_plugin_process = not visibility

        if self.isVisible() != visibility:
            self.setVisible(visibility)
        self._ensure_visible(visibility)

    def _ensure_visible(self, visible):
        if self.isVisible() == visible:
            return

        if not visible:
            self.setVisible(visible)
        else:
            self.show()
            self.raise_()
            self.activateWindow()
            self.showNormal()

    def on_plugin_action_menu_requested(self, pos):
        """The user right-clicked on a plug-in

@@ -955,7 +965,7 @@ class Window(QtWidgets.QDialog):
        self.intent_box.setEnabled(True)

        # Refresh tab
        self.on_tab_changed(self.state["current_page"])
        self.on_tab_changed(self._current_page)
        self.update_compatibility()

        self.button_suspend_logs.setEnabled(False)

@@ -1027,8 +1037,9 @@ class Window(QtWidgets.QDialog):

        self._update_state()

        if not self.isVisible():
            self.setVisible(True)
        if self._hidden_for_plugin_process:
            self._hidden_for_plugin_process = False
            self._ensure_visible(True)

    def on_was_skipped(self, plugin):
        plugin_item = self.plugin_model.plugin_items[plugin.id]

@@ -1103,8 +1114,9 @@ class Window(QtWidgets.QDialog):
            plugin_item, instance_item
        )

        if not self.isVisible():
            self.setVisible(True)
        if self._hidden_for_plugin_process:
            self._hidden_for_plugin_process = False
            self._ensure_visible(True)

    # -------------------------------------------------------------------------
    #

@@ -1223,53 +1235,20 @@ class Window(QtWidgets.QDialog):

        """

        # Make it snappy, but take care to clean it all up.
        # TODO(marcus): Enable GUI to return on problem, such
        # as asking whether or not the user really wants to quit
        # given there are things currently running.
        self.hide()
        self.info(self.tr("Closing.."))

        if self.state["is_closing"]:
        if self.controller.is_running:
            self.info(self.tr("..as soon as processing is finished.."))
            self.controller.stop()

        # Explicitly clear potentially referenced data
        self.info(self.tr("Cleaning up models.."))
        self.intent_model.deleteLater()
        self.plugin_model.deleteLater()
        self.terminal_model.deleteLater()
        self.terminal_proxy.deleteLater()
        self.plugin_proxy.deleteLater()
        self.info(self.tr("Cleaning up controller.."))
        self.controller.cleanup()

        self.overview_instance_view.setModel(None)
        self.overview_plugin_view.setModel(None)
        self.terminal_view.setModel(None)

        self.info(self.tr("Cleaning up controller.."))
        self.controller.cleanup()

        self.info(self.tr("All clean!"))
        self.info(self.tr("Good bye"))
        return super(Window, self).closeEvent(event)

        self.info(self.tr("Closing.."))

        def on_problem():
            self.heads_up(
                "Warning", "Had trouble closing down. "
                "Please tell someone and try again."
            )
            self.show()

        if self.controller.is_running:
            self.info(self.tr("..as soon as processing is finished.."))
            self.controller.stop()
            self.finished.connect(self.close)
            util.defer(200, on_problem)
            return event.ignore()

        self.state["is_closing"] = True

        util.defer(200, self.close)
        return event.ignore()
        event.accept()

    def reject(self):
        """Handle ESC key"""

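The window now remembers why it was hidden: `_hidden_for_plugin_process` is set when a plugin asks the UI to disappear during processing, and only in that case is the window brought back afterwards. A Qt-free sketch of that small state machine (the class below is invented for illustration; the real code drives `QtWidgets.QDialog` visibility):

```python
class WindowSketch:
    def __init__(self):
        self.visible = True
        self._hidden_for_plugin_process = False

    def isVisible(self):
        return self.visible

    def _ensure_visible(self, visible):
        if self.isVisible() == visible:
            return
        # Stands in for setVisible()/show()/raise_()/activateWindow()
        self.visible = visible

    def on_plugin_process(self, hide_ui):
        # Remember that *we* hid the window, not the user
        self._hidden_for_plugin_process = hide_ui
        self._ensure_visible(not hide_ui)

    def on_finished(self):
        # Restore only windows hidden for plugin processing
        if self._hidden_for_plugin_process:
            self._hidden_for_plugin_process = False
            self._ensure_visible(True)

w = WindowSketch()
w.on_plugin_process(hide_ui=True)
assert not w.isVisible()
w.on_finished()
assert w.isVisible()
```

Tracking the reason for hiding avoids re-showing a window the user closed on purpose, which is the hiding/close issue the PR addresses.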
@@ -1,6 +1,7 @@
import os
import logging
import shutil
import copy

import Qt
from Qt import QtWidgets, QtCore

@@ -90,7 +91,9 @@ class FilesWidget(QtWidgets.QWidget):
        self._task_type = None

        # Pype's anatomy object for current project
        self.anatomy = Anatomy(legacy_io.Session["AVALON_PROJECT"])
        project_name = legacy_io.Session["AVALON_PROJECT"]
        self.anatomy = Anatomy(project_name)
        self.project_name = project_name
        # Template key used to get work template from anatomy templates
        self.template_key = "work"

@@ -98,6 +101,7 @@ class FilesWidget(QtWidgets.QWidget):
        self._workfiles_root = None
        self._workdir_path = None
        self.host = registered_host()
        self.host_name = os.environ["AVALON_APP"]

        # Whether to automatically select the latest modified
        # file on a refresh of the files model.

@@ -385,8 +389,9 @@ class FilesWidget(QtWidgets.QWidget):
            return None

        if self._asset_doc is None:
            project_name = legacy_io.active_project()
            self._asset_doc = get_asset_by_id(project_name, self._asset_id)
            self._asset_doc = get_asset_by_id(
                self.project_name, self._asset_id
            )

        return self._asset_doc

@@ -396,8 +401,8 @@ class FilesWidget(QtWidgets.QWidget):
        session = legacy_io.Session.copy()
        self.template_key = get_workfile_template_key(
            self._task_type,
            session["AVALON_APP"],
            project_name=session["AVALON_PROJECT"]
            self.host_name,
            project_name=self.project_name
        )
        changes = compute_session_changes(
            session,

@@ -430,6 +435,21 @@ class FilesWidget(QtWidgets.QWidget):
            template_key=self.template_key
        )

    def _get_event_context_data(self):
        asset_id = None
        asset_name = None
        asset_doc = self._get_asset_doc()
        if asset_doc:
            asset_id = asset_doc["_id"]
            asset_name = asset_doc["name"]
        return {
            "project_name": self.project_name,
            "asset_id": asset_id,
            "asset_name": asset_name,
            "task_name": self._task_name,
            "host_name": self.host_name
        }

    def open_file(self, filepath):
        host = self.host
        if host.has_unsaved_changes():

@@ -453,8 +473,21 @@ class FilesWidget(QtWidgets.QWidget):
            # Save current scene, continue to open file
            host.save_file(current_file)

        event_data_before = self._get_event_context_data()
        event_data_before["filepath"] = filepath
        event_data_after = copy.deepcopy(event_data_before)
        emit_event(
            "workfile.open.before",
            event_data_before,
            source="workfiles.tool"
        )
        self._enter_session()
        host.open_file(filepath)
        emit_event(
            "workfile.open.after",
            event_data_after,
            source="workfiles.tool"
        )
        self.file_opened.emit()

    def save_changes_prompt(self):

@@ -567,9 +600,14 @@ class FilesWidget(QtWidgets.QWidget):
        src_path = self._get_selected_filepath()

        # Trigger before save event
        event_data_before = self._get_event_context_data()
        event_data_before.update({
            "filename": work_filename,
            "workdir_path": self._workdir_path
        })
        emit_event(
            "workfile.save.before",
            {"filename": work_filename, "workdir_path": self._workdir_path},
            event_data_before,
            source="workfiles.tool"
        )

@@ -602,15 +640,20 @@ class FilesWidget(QtWidgets.QWidget):
        # Create extra folders
        create_workdir_extra_folders(
            self._workdir_path,
            legacy_io.Session["AVALON_APP"],
            self.host_name,
            self._task_type,
            self._task_name,
            legacy_io.Session["AVALON_PROJECT"]
            self.project_name
        )
        event_data_after = self._get_event_context_data()
        event_data_after.update({
            "filename": work_filename,
            "workdir_path": self._workdir_path
        })
        # Trigger after save events
        emit_event(
            "workfile.save.after",
            {"filename": work_filename, "workdir_path": self._workdir_path},
            event_data_after,
            source="workfiles.tool"
        )

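The workfiles changes bracket each open/save with paired `workfile.*.before` and `workfile.*.after` events carrying the same context payload. A minimal sketch of that bracketing pattern (the bus below is a stand-in; the real `emit_event` lives in OpenPype's event system):

```python
# Stand-in event sink; the real emit_event dispatches to registered callbacks.
events = []

def emit_event(topic, data, source=None):
    events.append((topic, dict(data)))

def open_file(filepath):
    data = {"filepath": filepath}
    emit_event("workfile.open.before", data, source="workfiles.tool")
    # ... host.open_file(filepath) would happen between the two events ...
    emit_event("workfile.open.after", data, source="workfiles.tool")

open_file("/work/shot010_v001.ma")  # invented path
assert [topic for topic, _ in events] == [
    "workfile.open.before", "workfile.open.after"
]
```

Emitting a deep-copied payload for the "after" event, as the real code does, keeps "before" listeners from mutating what "after" listeners see.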
@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.11.0-nightly.3"
__version__ = "3.11.1"

@@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
version = "3.11.0-nightly.3" # OpenPype
version = "3.11.1" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"

@@ -68,6 +68,7 @@ We have a few required anatomy templates for OpenPype to work properly, however
| `representation` | Representation name |
| `frame` | Frame number for sequence files. |
| `app` | Application Name |
| `user` | User's login name (can be overridden in local settings) |
| `output` | |
| `comment` | |

@@ -65,6 +65,25 @@ the one depicted here:

![Maya Multiverse settings](assets/maya-multiverse_settings.png)

```
{
    "MULTIVERSE_PATH": "/Path/to/Multiverse-{MULTIVERSE_VERSION}",
    "MAYA_MODULE_PATH": "{MULTIVERSE}/Maya;{MAYA_MODULE_PATH}"
}

{
    "MULTIVERSE_VERSION": "7.1.0-py27"
}
```

The Multiverse Maya module file (.mod) pointed above contains all the necessary
environment variables to run Multiverse.

The OpenPype settings will contain blocks to enable/disable the Multiverse
Creators and Loader, along with sensible studio setting.

For more information about setup of Multiverse please refer to the relative page
on the [Multiverse official documentation](https://multi-verse.io/docs).

@@ -94,7 +113,7 @@ You can choose the USD file format in the Creators' set nodes:
- Assets: `.usd` (default) or `.usda` or `.usdz`
- Compositions: `.usda` (default) or `.usd`
- Overrides: `.usda` (default) or `.usd`
- Looks: `.ma`
- Looks: `.ma` (default)

![Maya Multiverse USD file formats](assets/maya-multiverse_usd_file_formats.png)

Binary file not shown. (Before: 161 KiB, After: 448 KiB)

@@ -26,7 +26,7 @@ You can only use our Ftrack Actions and publish to Ftrack if each artist is logg
### Custom Attributes
After successfully connecting OpenPype with you Ftrack, you can right click on any project in Ftrack and you should see a bunch of actions available. The most important one is called `OpenPype Admin` and contains multiple options inside.

To prepare Ftrack for working with OpenPype you'll need to run [OpenPype Admin - Create/Update Avalon Attributes](manager_ftrack_actions.md#create-update-avalon-attributes), which creates and sets the Custom Attributes necessary for OpenPype to function.
To prepare Ftrack for working with OpenPype you'll need to run [OpenPype Admin - Create/Update Custom Attributes](manager_ftrack_actions.md#create-update-avalon-attributes), which creates and sets the Custom Attributes necessary for OpenPype to function.