mirror of
https://github.com/ynput/ayon-core.git
synced 2025-12-24 21:04:40 +01:00
Merge branch 'develop' of github.com:pypeclub/OpenPype into chore/site_sync_expose_sites_skeleton
This commit is contained in:
commit
fec9144a5e
40 changed files with 940 additions and 638 deletions
52
CHANGELOG.md
52
CHANGELOG.md
|
|
@ -1,8 +1,8 @@
|
|||
# Changelog
|
||||
|
||||
## [3.9.3-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
|
||||
## [3.9.3](https://github.com/pypeclub/OpenPype/tree/3.9.3) (2022-04-07)
|
||||
|
||||
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.2...HEAD)
|
||||
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.2...3.9.3)
|
||||
|
||||
### 📖 Documentation
|
||||
|
||||
|
|
@ -10,16 +10,29 @@
|
|||
|
||||
**🆕 New features**
|
||||
|
||||
- Ftrack: Add description integrator [\#3027](https://github.com/pypeclub/OpenPype/pull/3027)
|
||||
- Publishing textures for Unreal [\#2988](https://github.com/pypeclub/OpenPype/pull/2988)
|
||||
- Maya to Unreal \> Static and Skeletal Meshes [\#2978](https://github.com/pypeclub/OpenPype/pull/2978)
|
||||
- Maya to Unreal: Static and Skeletal Meshes [\#2978](https://github.com/pypeclub/OpenPype/pull/2978)
|
||||
|
||||
**🚀 Enhancements**
|
||||
|
||||
- Ftrack: Add more options for note text of integrate ftrack note [\#3025](https://github.com/pypeclub/OpenPype/pull/3025)
|
||||
- Console Interpreter: Changed how console splitter size are reused on show [\#3016](https://github.com/pypeclub/OpenPype/pull/3016)
|
||||
- Deadline: Use more suitable name for sequence review logic [\#3015](https://github.com/pypeclub/OpenPype/pull/3015)
|
||||
- Nuke: add concurrency attr to deadline job [\#3005](https://github.com/pypeclub/OpenPype/pull/3005)
|
||||
- Photoshop: create image without instance [\#3001](https://github.com/pypeclub/OpenPype/pull/3001)
|
||||
- Deadline: priority configurable in Maya jobs [\#2995](https://github.com/pypeclub/OpenPype/pull/2995)
|
||||
- Workfiles tool: Save as published workfiles [\#2937](https://github.com/pypeclub/OpenPype/pull/2937)
|
||||
|
||||
**🐛 Bug fixes**
|
||||
|
||||
- Deadline: Fixed default value of use sequence for review [\#3033](https://github.com/pypeclub/OpenPype/pull/3033)
|
||||
- Settings UI: Version column can be extended so version are visible [\#3032](https://github.com/pypeclub/OpenPype/pull/3032)
|
||||
- General: Fix import after movements [\#3028](https://github.com/pypeclub/OpenPype/pull/3028)
|
||||
- Harmony: Added creating subset name for workfile from template [\#3024](https://github.com/pypeclub/OpenPype/pull/3024)
|
||||
- AfterEffects: Added creating subset name for workfile from template [\#3023](https://github.com/pypeclub/OpenPype/pull/3023)
|
||||
- General: Add example addons to ignored [\#3022](https://github.com/pypeclub/OpenPype/pull/3022)
|
||||
- Maya: Remove missing import [\#3017](https://github.com/pypeclub/OpenPype/pull/3017)
|
||||
- Ftrack: multiple reviewable componets [\#3012](https://github.com/pypeclub/OpenPype/pull/3012)
|
||||
- Tray publisher: Fixes after code movement [\#3010](https://github.com/pypeclub/OpenPype/pull/3010)
|
||||
- Nuke: fixing unicode type detection in effect loaders [\#3002](https://github.com/pypeclub/OpenPype/pull/3002)
|
||||
|
|
@ -27,6 +40,7 @@
|
|||
|
||||
**Merged pull requests:**
|
||||
|
||||
- Maya: Allow to select invalid camera contents if no cameras found [\#3030](https://github.com/pypeclub/OpenPype/pull/3030)
|
||||
- General: adding limitations for pyright [\#2994](https://github.com/pypeclub/OpenPype/pull/2994)
|
||||
|
||||
## [3.9.2](https://github.com/pypeclub/OpenPype/tree/3.9.2) (2022-04-04)
|
||||
|
|
@ -46,7 +60,6 @@
|
|||
|
||||
**🚀 Enhancements**
|
||||
|
||||
- Photoshop: create image without instance [\#3001](https://github.com/pypeclub/OpenPype/pull/3001)
|
||||
- TVPaint: Render scene family [\#3000](https://github.com/pypeclub/OpenPype/pull/3000)
|
||||
- Nuke: ReviewDataMov Read RAW attribute [\#2985](https://github.com/pypeclub/OpenPype/pull/2985)
|
||||
- General: `METADATA\_KEYS` constant as `frozenset` for optimal immutable lookup [\#2980](https://github.com/pypeclub/OpenPype/pull/2980)
|
||||
|
|
@ -58,7 +71,7 @@
|
|||
- Workfiles: Open published workfiles [\#2925](https://github.com/pypeclub/OpenPype/pull/2925)
|
||||
- General: Default modules loaded dynamically [\#2923](https://github.com/pypeclub/OpenPype/pull/2923)
|
||||
- Nuke: Add no-audio Tag [\#2911](https://github.com/pypeclub/OpenPype/pull/2911)
|
||||
- Flame: support for comment with xml attribute overrides [\#2892](https://github.com/pypeclub/OpenPype/pull/2892)
|
||||
- Nuke: improving readability [\#2903](https://github.com/pypeclub/OpenPype/pull/2903)
|
||||
|
||||
**🐛 Bug fixes**
|
||||
|
||||
|
|
@ -92,7 +105,6 @@
|
|||
- General: Move Attribute Definitions from pipeline [\#2931](https://github.com/pypeclub/OpenPype/pull/2931)
|
||||
- General: Removed silo references and terminal splash [\#2927](https://github.com/pypeclub/OpenPype/pull/2927)
|
||||
- General: Move pipeline constants to OpenPype [\#2918](https://github.com/pypeclub/OpenPype/pull/2918)
|
||||
- General: Move formatting and workfile functions [\#2914](https://github.com/pypeclub/OpenPype/pull/2914)
|
||||
- General: Move remaining plugins from avalon [\#2912](https://github.com/pypeclub/OpenPype/pull/2912)
|
||||
|
||||
**Merged pull requests:**
|
||||
|
|
@ -109,10 +121,8 @@
|
|||
**🚀 Enhancements**
|
||||
|
||||
- General: Change how OPENPYPE\_DEBUG value is handled [\#2907](https://github.com/pypeclub/OpenPype/pull/2907)
|
||||
- Nuke: improving readability [\#2903](https://github.com/pypeclub/OpenPype/pull/2903)
|
||||
- nuke: imageio adding ocio config version 1.2 [\#2897](https://github.com/pypeclub/OpenPype/pull/2897)
|
||||
- Nuke: ExtractReviewSlate can handle more codes and profiles [\#2879](https://github.com/pypeclub/OpenPype/pull/2879)
|
||||
- Flame: sequence used for reference video [\#2869](https://github.com/pypeclub/OpenPype/pull/2869)
|
||||
- Flame: support for comment with xml attribute overrides [\#2892](https://github.com/pypeclub/OpenPype/pull/2892)
|
||||
|
||||
**🐛 Bug fixes**
|
||||
|
||||
|
|
@ -121,39 +131,15 @@
|
|||
- Pyblish Pype - ensure current state is correct when entering new group order [\#2899](https://github.com/pypeclub/OpenPype/pull/2899)
|
||||
- SceneInventory: Fix import of load function [\#2894](https://github.com/pypeclub/OpenPype/pull/2894)
|
||||
- Harmony - fixed creator issue [\#2891](https://github.com/pypeclub/OpenPype/pull/2891)
|
||||
- General: Remove forgotten use of avalon Creator [\#2885](https://github.com/pypeclub/OpenPype/pull/2885)
|
||||
- General: Avoid circular import [\#2884](https://github.com/pypeclub/OpenPype/pull/2884)
|
||||
- Fixes for attaching loaded containers \(\#2837\) [\#2874](https://github.com/pypeclub/OpenPype/pull/2874)
|
||||
|
||||
**🔀 Refactored code**
|
||||
|
||||
- General: Reduce style usage to OpenPype repository [\#2889](https://github.com/pypeclub/OpenPype/pull/2889)
|
||||
- General: Move loader logic from avalon to openpype [\#2886](https://github.com/pypeclub/OpenPype/pull/2886)
|
||||
|
||||
## [3.9.0](https://github.com/pypeclub/OpenPype/tree/3.9.0) (2022-03-14)
|
||||
|
||||
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.0-nightly.9...3.9.0)
|
||||
|
||||
### 📖 Documentation
|
||||
|
||||
- Documentation: Change Photoshop & AfterEffects plugin path [\#2878](https://github.com/pypeclub/OpenPype/pull/2878)
|
||||
|
||||
**🚀 Enhancements**
|
||||
|
||||
- General: Subset name filtering in ExtractReview outpus [\#2872](https://github.com/pypeclub/OpenPype/pull/2872)
|
||||
- NewPublisher: Descriptions and Icons in creator dialog [\#2867](https://github.com/pypeclub/OpenPype/pull/2867)
|
||||
|
||||
**🐛 Bug fixes**
|
||||
|
||||
- General: Missing time function [\#2877](https://github.com/pypeclub/OpenPype/pull/2877)
|
||||
- Deadline: Fix plugin name for tile assemble [\#2868](https://github.com/pypeclub/OpenPype/pull/2868)
|
||||
- Nuke: gizmo precollect fix [\#2866](https://github.com/pypeclub/OpenPype/pull/2866)
|
||||
- General: Fix hardlink for windows [\#2864](https://github.com/pypeclub/OpenPype/pull/2864)
|
||||
|
||||
**🔀 Refactored code**
|
||||
|
||||
- Refactor: move webserver tool to openpype [\#2876](https://github.com/pypeclub/OpenPype/pull/2876)
|
||||
|
||||
## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)
|
||||
|
||||
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.8.2-nightly.3...3.8.2)
|
||||
|
|
|
|||
|
|
@ -1,6 +1,7 @@
|
|||
import os
|
||||
from avalon import api
|
||||
import pyblish.api
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
|
||||
|
||||
class CollectWorkfile(pyblish.api.ContextPlugin):
|
||||
|
|
@ -38,7 +39,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
|
|||
|
||||
# workfile instance
|
||||
family = "workfile"
|
||||
subset = family + task.capitalize()
|
||||
subset = get_subset_name_with_asset_doc(
|
||||
family,
|
||||
"",
|
||||
context.data["anatomyData"]["task"]["name"],
|
||||
context.data["assetEntity"],
|
||||
context.data["anatomyData"]["project"]["name"],
|
||||
host_name=context.data["hostName"]
|
||||
)
|
||||
# Create instance
|
||||
instance = context.create_instance(subset)
|
||||
|
||||
|
|
|
|||
|
|
@ -3,6 +3,8 @@
|
|||
import pyblish.api
|
||||
import os
|
||||
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
|
||||
|
||||
class CollectWorkfile(pyblish.api.ContextPlugin):
|
||||
"""Collect current script for publish."""
|
||||
|
|
@ -14,10 +16,15 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
|
|||
def process(self, context):
|
||||
"""Plugin entry point."""
|
||||
family = "workfile"
|
||||
task = os.getenv("AVALON_TASK", None)
|
||||
sanitized_task_name = task[0].upper() + task[1:]
|
||||
basename = os.path.basename(context.data["currentFile"])
|
||||
subset = "{}{}".format(family, sanitized_task_name)
|
||||
subset = get_subset_name_with_asset_doc(
|
||||
family,
|
||||
"",
|
||||
context.data["anatomyData"]["task"]["name"],
|
||||
context.data["assetEntity"],
|
||||
context.data["anatomyData"]["project"]["name"],
|
||||
host_name=context.data["hostName"]
|
||||
)
|
||||
|
||||
# Create instance
|
||||
instance = context.create_instance(subset)
|
||||
|
|
|
|||
|
|
@ -252,6 +252,7 @@ class CreateRender(plugin.Creator):
|
|||
"""Create instance settings."""
|
||||
# get pools
|
||||
pool_names = []
|
||||
default_priority = 50
|
||||
|
||||
self.server_aliases = list(self.deadline_servers.keys())
|
||||
self.data["deadlineServers"] = self.server_aliases
|
||||
|
|
@ -260,7 +261,8 @@ class CreateRender(plugin.Creator):
|
|||
self.data["extendFrames"] = False
|
||||
self.data["overrideExistingFrame"] = True
|
||||
# self.data["useLegacyRenderLayers"] = True
|
||||
self.data["priority"] = 50
|
||||
self.data["priority"] = default_priority
|
||||
self.data["tile_priority"] = default_priority
|
||||
self.data["framesPerTask"] = 1
|
||||
self.data["whitelist"] = False
|
||||
self.data["machineList"] = ""
|
||||
|
|
@ -294,6 +296,16 @@ class CreateRender(plugin.Creator):
|
|||
deadline_url = next(iter(self.deadline_servers.values()))
|
||||
|
||||
pool_names = self._get_deadline_pools(deadline_url)
|
||||
maya_submit_dl = self._project_settings.get(
|
||||
"deadline", {}).get(
|
||||
"publish", {}).get(
|
||||
"MayaSubmitDeadline", {})
|
||||
priority = maya_submit_dl.get("priority", default_priority)
|
||||
self.data["priority"] = priority
|
||||
|
||||
tile_priority = maya_submit_dl.get("tile_priority",
|
||||
default_priority)
|
||||
self.data["tile_priority"] = tile_priority
|
||||
|
||||
if muster_enabled:
|
||||
self.log.info(">>> Loading Muster credentials ...")
|
||||
|
|
|
|||
|
|
@ -4,7 +4,6 @@ from bson.objectid import ObjectId
|
|||
from openpype.pipeline import (
|
||||
InventoryAction,
|
||||
get_representation_context,
|
||||
get_representation_path_from_context,
|
||||
)
|
||||
from openpype.hosts.maya.api.lib import (
|
||||
maintained_selection,
|
||||
|
|
@ -80,10 +79,10 @@ class ImportModelRender(InventoryAction):
|
|||
})
|
||||
|
||||
context = get_representation_context(look_repr["_id"])
|
||||
maya_file = get_representation_path_from_context(context)
|
||||
maya_file = self.filepath_from_context(context)
|
||||
|
||||
context = get_representation_context(json_repr["_id"])
|
||||
json_file = get_representation_path_from_context(context)
|
||||
json_file = self.filepath_from_context(context)
|
||||
|
||||
# Import the look file
|
||||
with maintained_selection():
|
||||
|
|
|
|||
|
|
@ -40,7 +40,14 @@ class ValidateCameraContents(pyblish.api.InstancePlugin):
|
|||
# list when there are no actual cameras results in
|
||||
# still an empty 'invalid' list
|
||||
if len(cameras) < 1:
|
||||
raise RuntimeError("No cameras in instance.")
|
||||
if members:
|
||||
# If there are members in the instance return all of
|
||||
# them as 'invalid' so the user can still select invalid
|
||||
cls.log.error("No cameras found in instance "
|
||||
"members: {}".format(members))
|
||||
return members
|
||||
|
||||
raise RuntimeError("No cameras found in empty instance.")
|
||||
|
||||
# non-camera shapes
|
||||
valid_shapes = cmds.ls(shapes, type=('camera', 'locator'), long=True)
|
||||
|
|
|
|||
|
|
@ -123,7 +123,7 @@ class ExtractReviewDataMov(openpype.api.Extractor):
|
|||
if generated_repres:
|
||||
# assign to representations
|
||||
instance.data["representations"] += generated_repres
|
||||
instance.data["hasReviewableRepresentations"] = True
|
||||
instance.data["useSequenceForReview"] = False
|
||||
else:
|
||||
instance.data["families"].remove("review")
|
||||
self.log.info((
|
||||
|
|
|
|||
|
|
@ -2,7 +2,6 @@ import os
|
|||
|
||||
import qargparse
|
||||
|
||||
from openpype.pipeline import get_representation_path_from_context
|
||||
from openpype.hosts.photoshop import api as photoshop
|
||||
from openpype.hosts.photoshop.api import get_unique_layer_name
|
||||
|
||||
|
|
@ -63,7 +62,7 @@ class ImageFromSequenceLoader(photoshop.PhotoshopLoader):
|
|||
"""
|
||||
files = []
|
||||
for context in repre_contexts:
|
||||
fname = get_representation_path_from_context(context)
|
||||
fname = cls.filepath_from_context(context)
|
||||
_, file_extension = os.path.splitext(fname)
|
||||
|
||||
for file_name in os.listdir(os.path.dirname(fname)):
|
||||
|
|
|
|||
|
|
@ -0,0 +1,13 @@
|
|||
import pyblish.api
|
||||
|
||||
|
||||
class CollectSAAppName(pyblish.api.ContextPlugin):
|
||||
"""Collect app name and label."""
|
||||
|
||||
label = "Collect App Name/Label"
|
||||
order = pyblish.api.CollectorOrder - 0.5
|
||||
hosts = ["standalonepublisher"]
|
||||
|
||||
def process(self, context):
|
||||
context.data["appName"] = "standalone publisher"
|
||||
context.data["appLabel"] = "Standalone publisher"
|
||||
|
|
@ -0,0 +1,13 @@
|
|||
import pyblish.api
|
||||
|
||||
|
||||
class CollectTrayPublisherAppName(pyblish.api.ContextPlugin):
|
||||
"""Collect app name and label."""
|
||||
|
||||
label = "Collect App Name/Label"
|
||||
order = pyblish.api.CollectorOrder - 0.5
|
||||
hosts = ["traypublisher"]
|
||||
|
||||
def process(self, context):
|
||||
context.data["appName"] = "tray publisher"
|
||||
context.data["appLabel"] = "Tray publisher"
|
||||
|
|
@ -1,6 +1,6 @@
|
|||
import collections
|
||||
import qargparse
|
||||
from avalon.pipeline import get_representation_context
|
||||
from openpype.pipeline import get_representation_context
|
||||
from openpype.hosts.tvpaint.api import lib, pipeline, plugin
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -4,7 +4,6 @@ import logging
|
|||
from typing import List
|
||||
|
||||
import pyblish.api
|
||||
from avalon import api
|
||||
|
||||
from openpype.pipeline import (
|
||||
register_loader_plugin_path,
|
||||
|
|
@ -76,30 +75,6 @@ def _register_events():
|
|||
pass
|
||||
|
||||
|
||||
class Creator(LegacyCreator):
|
||||
hosts = ["unreal"]
|
||||
asset_types = []
|
||||
|
||||
def process(self):
|
||||
nodes = list()
|
||||
|
||||
with unreal.ScopedEditorTransaction("OpenPype Creating Instance"):
|
||||
if (self.options or {}).get("useSelection"):
|
||||
self.log.info("setting ...")
|
||||
print("settings ...")
|
||||
nodes = unreal.EditorUtilityLibrary.get_selected_assets()
|
||||
|
||||
asset_paths = [a.get_path_name() for a in nodes]
|
||||
self.name = move_assets_to_path(
|
||||
"/Game", self.name, asset_paths
|
||||
)
|
||||
|
||||
instance = create_publish_instance("/Game", self.name)
|
||||
imprint(instance, self.data)
|
||||
|
||||
return instance
|
||||
|
||||
|
||||
def ls():
|
||||
"""List all containers.
|
||||
|
||||
|
|
|
|||
|
|
@ -2,13 +2,11 @@ import unreal
|
|||
from unreal import EditorAssetLibrary as eal
|
||||
from unreal import EditorLevelLibrary as ell
|
||||
|
||||
from openpype.hosts.unreal.api.plugin import Creator
|
||||
from avalon.unreal import (
|
||||
instantiate,
|
||||
)
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api.pipeline import instantiate
|
||||
|
||||
|
||||
class CreateCamera(Creator):
|
||||
class CreateCamera(plugin.Creator):
|
||||
"""Layout output for character rigs"""
|
||||
|
||||
name = "layoutMain"
|
||||
|
|
|
|||
|
|
@ -1,12 +1,10 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
from unreal import EditorLevelLibrary as ell
|
||||
from openpype.hosts.unreal.api.plugin import Creator
|
||||
from avalon.unreal import (
|
||||
instantiate,
|
||||
)
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api.pipeline import instantiate
|
||||
|
||||
|
||||
class CreateLayout(Creator):
|
||||
class CreateLayout(plugin.Creator):
|
||||
"""Layout output for character rigs."""
|
||||
|
||||
name = "layoutMain"
|
||||
|
|
|
|||
|
|
@ -1,11 +1,10 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Create look in Unreal."""
|
||||
import unreal # noqa
|
||||
from openpype.hosts.unreal.api.plugin import Creator
|
||||
from openpype.hosts.unreal.api import pipeline
|
||||
from openpype.hosts.unreal.api import pipeline, plugin
|
||||
|
||||
|
||||
class CreateLook(Creator):
|
||||
class CreateLook(plugin.Creator):
|
||||
"""Shader connections defining shape look."""
|
||||
|
||||
name = "unrealLook"
|
||||
|
|
|
|||
|
|
@ -1,13 +1,13 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Create Static Meshes as FBX geometry."""
|
||||
import unreal # noqa
|
||||
from openpype.hosts.unreal.api.plugin import Creator
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api.pipeline import (
|
||||
instantiate,
|
||||
)
|
||||
|
||||
|
||||
class CreateStaticMeshFBX(Creator):
|
||||
class CreateStaticMeshFBX(plugin.Creator):
|
||||
"""Static FBX geometry."""
|
||||
|
||||
name = "unrealStaticMeshMain"
|
||||
|
|
|
|||
|
|
@ -37,6 +37,8 @@ IGNORED_DEFAULT_FILENAMES = (
|
|||
"__init__.py",
|
||||
"base.py",
|
||||
"interfaces.py",
|
||||
"example_addons",
|
||||
"default_modules",
|
||||
)
|
||||
|
||||
|
||||
|
|
@ -303,7 +305,16 @@ def _load_modules():
|
|||
fullpath = os.path.join(current_dir, filename)
|
||||
basename, ext = os.path.splitext(filename)
|
||||
|
||||
if not os.path.isdir(fullpath) and ext not in (".py", ):
|
||||
if os.path.isdir(fullpath):
|
||||
# Check existence of init fil
|
||||
init_path = os.path.join(fullpath, "__init__.py")
|
||||
if not os.path.exists(init_path):
|
||||
log.debug((
|
||||
"Module directory does not contan __init__.py file {}"
|
||||
).format(fullpath))
|
||||
continue
|
||||
|
||||
elif ext not in (".py", ):
|
||||
continue
|
||||
|
||||
try:
|
||||
|
|
@ -341,7 +352,16 @@ def _load_modules():
|
|||
fullpath = os.path.join(dirpath, filename)
|
||||
basename, ext = os.path.splitext(filename)
|
||||
|
||||
if not os.path.isdir(fullpath) and ext not in (".py", ):
|
||||
if os.path.isdir(fullpath):
|
||||
# Check existence of init fil
|
||||
init_path = os.path.join(fullpath, "__init__.py")
|
||||
if not os.path.exists(init_path):
|
||||
log.debug((
|
||||
"Module directory does not contan __init__.py file {}"
|
||||
).format(fullpath))
|
||||
continue
|
||||
|
||||
elif ext not in (".py", ):
|
||||
continue
|
||||
|
||||
# TODO add more logic how to define if folder is module or not
|
||||
|
|
|
|||
|
|
@ -254,7 +254,11 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
|
|||
use_published = True
|
||||
tile_assembler_plugin = "OpenPypeTileAssembler"
|
||||
asset_dependencies = False
|
||||
priority = 50
|
||||
tile_priority = 50
|
||||
limit_groups = []
|
||||
jobInfo = {}
|
||||
pluginInfo = {}
|
||||
group = "none"
|
||||
|
||||
def process(self, instance):
|
||||
|
|
@ -272,37 +276,12 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
|
|||
self.deadline_url = instance.data.get("deadlineUrl")
|
||||
assert self.deadline_url, "Requires Deadline Webservice URL"
|
||||
|
||||
self._job_info = (
|
||||
context.data["project_settings"].get(
|
||||
"deadline", {}).get(
|
||||
"publish", {}).get(
|
||||
"MayaSubmitDeadline", {}).get(
|
||||
"jobInfo", {})
|
||||
)
|
||||
# just using existing names from Setting
|
||||
self._job_info = self.jobInfo
|
||||
|
||||
self._plugin_info = (
|
||||
context.data["project_settings"].get(
|
||||
"deadline", {}).get(
|
||||
"publish", {}).get(
|
||||
"MayaSubmitDeadline", {}).get(
|
||||
"pluginInfo", {})
|
||||
)
|
||||
self._plugin_info = self.pluginInfo
|
||||
|
||||
self.limit_groups = (
|
||||
context.data["project_settings"].get(
|
||||
"deadline", {}).get(
|
||||
"publish", {}).get(
|
||||
"MayaSubmitDeadline", {}).get(
|
||||
"limit", [])
|
||||
)
|
||||
|
||||
self.group = (
|
||||
context.data["project_settings"].get(
|
||||
"deadline", {}).get(
|
||||
"publish", {}).get(
|
||||
"MayaSubmitDeadline", {}).get(
|
||||
"group", "none")
|
||||
)
|
||||
self.limit_groups = self.limit
|
||||
|
||||
context = instance.context
|
||||
workspace = context.data["workspaceDir"]
|
||||
|
|
@ -465,7 +444,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
|
|||
self.payload_skeleton["JobInfo"]["UserName"] = deadline_user
|
||||
# Set job priority
|
||||
self.payload_skeleton["JobInfo"]["Priority"] = \
|
||||
self._instance.data.get("priority", 50)
|
||||
self._instance.data.get("priority", self.priority)
|
||||
|
||||
if self.group != "none" and self.group:
|
||||
self.payload_skeleton["JobInfo"]["Group"] = self.group
|
||||
|
|
@ -635,7 +614,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
|
|||
}
|
||||
assembly_payload["JobInfo"].update(output_filenames)
|
||||
assembly_payload["JobInfo"]["Priority"] = self._instance.data.get(
|
||||
"priority", 50)
|
||||
"tile_priority", self.tile_priority)
|
||||
assembly_payload["JobInfo"]["UserName"] = deadline_user
|
||||
|
||||
frame_payloads = []
|
||||
|
|
|
|||
|
|
@ -235,6 +235,8 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
|
|||
if mongo_url:
|
||||
environment["OPENPYPE_MONGO"] = mongo_url
|
||||
|
||||
priority = self.deadline_priority or instance.data.get("priority", 50)
|
||||
|
||||
args = [
|
||||
"--headless",
|
||||
'publish',
|
||||
|
|
@ -254,7 +256,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
|
|||
|
||||
"Department": self.deadline_department,
|
||||
"ChunkSize": self.deadline_chunk_size,
|
||||
"Priority": job["Props"]["Pri"],
|
||||
"Priority": priority,
|
||||
|
||||
"Group": self.deadline_group,
|
||||
"Pool": self.deadline_pool,
|
||||
|
|
@ -524,26 +526,28 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
|
|||
for collection in collections:
|
||||
ext = collection.tail.lstrip(".")
|
||||
preview = False
|
||||
# if filtered aov name is found in filename, toggle it for
|
||||
# preview video rendering
|
||||
for app in self.aov_filter.keys():
|
||||
if os.environ.get("AVALON_APP", "") == app:
|
||||
# no need to add review if `hasReviewableRepresentations`
|
||||
if instance.get("hasReviewableRepresentations"):
|
||||
break
|
||||
# TODO 'useSequenceForReview' is temporary solution which does
|
||||
# not work for 100% of cases. We must be able to tell what
|
||||
# expected files contains more explicitly and from what
|
||||
# should be review made.
|
||||
# - "review" tag is never added when is set to 'False'
|
||||
if instance["useSequenceForReview"]:
|
||||
# if filtered aov name is found in filename, toggle it for
|
||||
# preview video rendering
|
||||
for app in self.aov_filter.keys():
|
||||
if os.environ.get("AVALON_APP", "") == app:
|
||||
# iteratre all aov filters
|
||||
for aov in self.aov_filter[app]:
|
||||
if re.match(
|
||||
aov,
|
||||
list(collection)[0]
|
||||
):
|
||||
preview = True
|
||||
break
|
||||
|
||||
# iteratre all aov filters
|
||||
for aov in self.aov_filter[app]:
|
||||
if re.match(
|
||||
aov,
|
||||
list(collection)[0]
|
||||
):
|
||||
preview = True
|
||||
break
|
||||
|
||||
# toggle preview on if multipart is on
|
||||
if instance.get("multipartExr", False):
|
||||
preview = True
|
||||
# toggle preview on if multipart is on
|
||||
if instance.get("multipartExr", False):
|
||||
preview = True
|
||||
|
||||
staging = os.path.dirname(list(collection)[0])
|
||||
success, rootless_staging_dir = (
|
||||
|
|
@ -730,8 +734,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
|
|||
"resolutionHeight": data.get("resolutionHeight", 1080),
|
||||
"multipartExr": data.get("multipartExr", False),
|
||||
"jobBatchName": data.get("jobBatchName", ""),
|
||||
"hasReviewableRepresentations": data.get(
|
||||
"hasReviewableRepresentations")
|
||||
"useSequenceForReview": data.get("useSequenceForReview", True)
|
||||
}
|
||||
|
||||
if "prerender" in instance.data["families"]:
|
||||
|
|
@ -923,12 +926,6 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
|
|||
# User is deadline user
|
||||
render_job["Props"]["User"] = context.data.get(
|
||||
"deadlineUser", getpass.getuser())
|
||||
# Priority is now not handled at all
|
||||
|
||||
if self.deadline_priority:
|
||||
render_job["Props"]["Pri"] = self.deadline_priority
|
||||
else:
|
||||
render_job["Props"]["Pri"] = instance.data.get("priority")
|
||||
|
||||
render_job["Props"]["Env"] = {
|
||||
"FTRACK_API_USER": os.environ.get("FTRACK_API_USER"),
|
||||
|
|
|
|||
|
|
@ -1,11 +1,6 @@
|
|||
import os
|
||||
from openpype_modules.ftrack.lib import BaseAction, statics_icon
|
||||
from avalon import lib as avalonlib
|
||||
from openpype.api import (
|
||||
Anatomy,
|
||||
get_project_settings
|
||||
)
|
||||
from openpype.lib import ApplicationManager
|
||||
from openpype.api import Anatomy
|
||||
|
||||
|
||||
class CreateFolders(BaseAction):
|
||||
|
|
|
|||
|
|
@ -1,3 +1,15 @@
|
|||
"""Integrate components into ftrack
|
||||
|
||||
Requires:
|
||||
context -> ftrackSession - connected ftrack.Session
|
||||
instance -> ftrackComponentsList - list of components to integrate
|
||||
|
||||
Provides:
|
||||
instance -> ftrackIntegratedAssetVersionsData
|
||||
# legacy
|
||||
instance -> ftrackIntegratedAssetVersions
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import six
|
||||
|
|
@ -54,6 +66,114 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
|
|||
self.log.debug(query)
|
||||
return query
|
||||
|
||||
def process(self, instance):
|
||||
session = instance.context.data["ftrackSession"]
|
||||
context = instance.context
|
||||
component_list = instance.data.get("ftrackComponentsList")
|
||||
if not component_list:
|
||||
self.log.info(
|
||||
"Instance don't have components to integrate to Ftrack."
|
||||
" Skipping."
|
||||
)
|
||||
return
|
||||
|
||||
session = instance.context.data["ftrackSession"]
|
||||
context = instance.context
|
||||
|
||||
parent_entity = None
|
||||
default_asset_name = None
|
||||
# If instance has set "ftrackEntity" or "ftrackTask" then use them from
|
||||
# instance. Even if they are set to None. If they are set to None it
|
||||
# has a reason. (like has different context)
|
||||
if "ftrackEntity" in instance.data or "ftrackTask" in instance.data:
|
||||
task_entity = instance.data.get("ftrackTask")
|
||||
parent_entity = instance.data.get("ftrackEntity")
|
||||
|
||||
elif "ftrackEntity" in context.data or "ftrackTask" in context.data:
|
||||
task_entity = context.data.get("ftrackTask")
|
||||
parent_entity = context.data.get("ftrackEntity")
|
||||
|
||||
if task_entity:
|
||||
default_asset_name = task_entity["name"]
|
||||
parent_entity = task_entity["parent"]
|
||||
|
||||
if parent_entity is None:
|
||||
self.log.info((
|
||||
"Skipping ftrack integration. Instance \"{}\" does not"
|
||||
" have specified ftrack entities."
|
||||
).format(str(instance)))
|
||||
return
|
||||
|
||||
if not default_asset_name:
|
||||
default_asset_name = parent_entity["name"]
|
||||
|
||||
# Change status on task
|
||||
self._set_task_status(instance, task_entity, session)
|
||||
|
||||
# Prepare AssetTypes
|
||||
asset_types_by_short = self._ensure_asset_types_exists(
|
||||
session, component_list
|
||||
)
|
||||
|
||||
asset_versions_data_by_id = {}
|
||||
used_asset_versions = []
|
||||
# Iterate over components and publish
|
||||
for data in component_list:
|
||||
self.log.debug("data: {}".format(data))
|
||||
|
||||
# AssetType
|
||||
asset_type_short = data["assettype_data"]["short"]
|
||||
asset_type_entity = asset_types_by_short[asset_type_short]
|
||||
|
||||
# Asset
|
||||
asset_data = data.get("asset_data") or {}
|
||||
if "name" not in asset_data:
|
||||
asset_data["name"] = default_asset_name
|
||||
asset_entity = self._ensure_asset_exists(
|
||||
session,
|
||||
asset_data,
|
||||
asset_type_entity["id"],
|
||||
parent_entity["id"]
|
||||
)
|
||||
|
||||
# Asset Version
|
||||
asset_version_data = data.get("assetversion_data") or {}
|
||||
asset_version_entity = self._ensure_asset_version_exists(
|
||||
session, asset_version_data, asset_entity["id"], task_entity
|
||||
)
|
||||
|
||||
# Component
|
||||
self.create_component(session, asset_version_entity, data)
|
||||
|
||||
# Store asset version and components items that were
|
||||
version_id = asset_version_entity["id"]
|
||||
if version_id not in asset_versions_data_by_id:
|
||||
asset_versions_data_by_id[version_id] = {
|
||||
"asset_version": asset_version_entity,
|
||||
"component_items": []
|
||||
}
|
||||
|
||||
asset_versions_data_by_id[version_id]["component_items"].append(
|
||||
data
|
||||
)
|
||||
|
||||
# Backwards compatibility
|
||||
if asset_version_entity not in used_asset_versions:
|
||||
used_asset_versions.append(asset_version_entity)
|
||||
|
||||
instance.data["ftrackIntegratedAssetVersionsData"] = (
|
||||
asset_versions_data_by_id
|
||||
)
|
||||
|
||||
# Backwards compatibility
|
||||
asset_versions_key = "ftrackIntegratedAssetVersions"
|
||||
if asset_versions_key not in instance.data:
|
||||
instance.data[asset_versions_key] = []
|
||||
|
||||
for asset_version in used_asset_versions:
|
||||
if asset_version not in instance.data[asset_versions_key]:
|
||||
instance.data[asset_versions_key].append(asset_version)
|
||||
|
||||
def _set_task_status(self, instance, task_entity, session):
|
||||
project_entity = instance.context.data.get("ftrackProject")
|
||||
if not project_entity:
|
||||
|
|
@ -100,190 +220,222 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
|
|||
session._configure_locations()
|
||||
six.reraise(tp, value, tb)
|
||||
|
||||
def process(self, instance):
|
||||
session = instance.context.data["ftrackSession"]
|
||||
context = instance.context
|
||||
def _ensure_asset_types_exists(self, session, component_list):
|
||||
"""Make sure that all AssetType entities exists for integration.
|
||||
|
||||
name = None
|
||||
# If instance has set "ftrackEntity" or "ftrackTask" then use them from
|
||||
# instance. Even if they are set to None. If they are set to None it
|
||||
# has a reason. (like has different context)
|
||||
if "ftrackEntity" in instance.data or "ftrackTask" in instance.data:
|
||||
task = instance.data.get("ftrackTask")
|
||||
parent = instance.data.get("ftrackEntity")
|
||||
Returns:
|
||||
dict: All asset types by short name.
|
||||
"""
|
||||
# Query existing asset types
|
||||
asset_types = session.query("select id, short from AssetType").all()
|
||||
# Stpore all existing short names
|
||||
asset_type_shorts = {asset_type["short"] for asset_type in asset_types}
|
||||
# Check which asset types are missing and store them
|
||||
asset_type_names_by_missing_shorts = {}
|
||||
default_short_name = "upload"
|
||||
for data in component_list:
|
||||
asset_type_data = data.get("assettype_data") or {}
|
||||
asset_type_short = asset_type_data.get("short")
|
||||
if not asset_type_short:
|
||||
# Use default asset type name if not set and change the
|
||||
# input data
|
||||
asset_type_short = default_short_name
|
||||
asset_type_data["short"] = asset_type_short
|
||||
data["assettype_data"] = asset_type_data
|
||||
|
||||
elif "ftrackEntity" in context.data or "ftrackTask" in context.data:
|
||||
task = context.data.get("ftrackTask")
|
||||
parent = context.data.get("ftrackEntity")
|
||||
if (
|
||||
# Skip if short name exists
|
||||
asset_type_short in asset_type_shorts
|
||||
# Skip if short name was already added to missing types
|
||||
# and asset type name is filled
|
||||
# - if asset type name is missing then try use name from other
|
||||
# data
|
||||
or asset_type_names_by_missing_shorts.get(asset_type_short)
|
||||
):
|
||||
continue
|
||||
|
||||
if task:
|
||||
parent = task["parent"]
|
||||
name = task
|
||||
elif parent:
|
||||
name = parent["name"]
|
||||
asset_type_names_by_missing_shorts[asset_type_short] = (
|
||||
asset_type_data.get("name")
|
||||
)
|
||||
|
||||
if not name:
|
||||
self.log.info((
|
||||
"Skipping ftrack integration. Instance \"{}\" does not"
|
||||
" have specified ftrack entities."
|
||||
).format(str(instance)))
|
||||
return
|
||||
# Create missing asset types if there are any
|
||||
if asset_type_names_by_missing_shorts:
|
||||
self.log.info("Creating asset types with short names: {}".format(
|
||||
", ".join(asset_type_names_by_missing_shorts.keys())
|
||||
))
|
||||
for missing_short, type_name in asset_type_names_by_missing_shorts:
|
||||
# Use short for name if name is not defined
|
||||
if not type_name:
|
||||
type_name = missing_short
|
||||
# Use short name also for name
|
||||
# - there is not other source for 'name'
|
||||
session.create(
|
||||
"AssetType",
|
||||
{
|
||||
"short": missing_short,
|
||||
"name": type_name
|
||||
}
|
||||
)
|
||||
|
||||
info_msg = (
|
||||
"Created new {entity_type} with data: {data}"
|
||||
", metadata: {metadata}."
|
||||
# Commit creation
|
||||
session.commit()
|
||||
# Requery asset types
|
||||
asset_types = session.query(
|
||||
"select id, short from AssetType"
|
||||
).all()
|
||||
|
||||
return {asset_type["short"]: asset_type for asset_type in asset_types}
|
||||
|
||||
def _ensure_asset_exists(
|
||||
self, session, asset_data, asset_type_id, parent_id
|
||||
):
|
||||
asset_name = asset_data["name"]
|
||||
asset_entity = self._query_asset(
|
||||
session, asset_name, asset_type_id, parent_id
|
||||
)
|
||||
if asset_entity is not None:
|
||||
return asset_entity
|
||||
|
||||
asset_data = {
|
||||
"name": asset_name,
|
||||
"type_id": asset_type_id,
|
||||
"context_id": parent_id
|
||||
}
|
||||
self.log.info("Created new Asset with data: {}.".format(asset_data))
|
||||
session.create("Asset", asset_data)
|
||||
session.commit()
|
||||
return self._query_asset(session, asset_name, asset_type_id, parent_id)
|
||||
|
||||
def _query_asset(self, session, asset_name, asset_type_id, parent_id):
|
||||
return session.query(
|
||||
(
|
||||
"select id from Asset"
|
||||
" where name is \"{}\""
|
||||
" and type_id is \"{}\""
|
||||
" and context_id is \"{}\""
|
||||
).format(asset_name, asset_type_id, parent_id)
|
||||
).first()
|
||||
|
||||
def _ensure_asset_version_exists(
|
||||
self, session, asset_version_data, asset_id, task_entity
|
||||
):
|
||||
task_id = None
|
||||
if task_entity:
|
||||
task_id = task_entity["id"]
|
||||
|
||||
# Try query asset version by criteria (asset id and version)
|
||||
version = asset_version_data.get("version") or 0
|
||||
asset_version_entity = self._query_asset_version(
|
||||
session, version, asset_id
|
||||
)
|
||||
|
||||
used_asset_versions = []
|
||||
# Prepare comment value
|
||||
comment = asset_version_data.get("comment") or ""
|
||||
if asset_version_entity is not None:
|
||||
changed = False
|
||||
if comment != asset_version_entity["comment"]:
|
||||
asset_version_entity["comment"] = comment
|
||||
changed = True
|
||||
|
||||
self._set_task_status(instance, task, session)
|
||||
if task_id != asset_version_entity["task_id"]:
|
||||
asset_version_entity["task_id"] = task_id
|
||||
changed = True
|
||||
|
||||
# Iterate over components and publish
|
||||
for data in instance.data.get("ftrackComponentsList", []):
|
||||
# AssetType
|
||||
# Get existing entity.
|
||||
assettype_data = {"short": "upload"}
|
||||
assettype_data.update(data.get("assettype_data", {}))
|
||||
self.log.debug("data: {}".format(data))
|
||||
if changed:
|
||||
session.commit()
|
||||
|
||||
assettype_entity = session.query(
|
||||
self.query("AssetType", assettype_data)
|
||||
).first()
|
||||
|
||||
# Create a new entity if none exits.
|
||||
if not assettype_entity:
|
||||
assettype_entity = session.create("AssetType", assettype_data)
|
||||
self.log.debug("Created new AssetType with data: {}".format(
|
||||
assettype_data
|
||||
))
|
||||
|
||||
# Asset
|
||||
# Get existing entity.
|
||||
asset_data = {
|
||||
"name": name,
|
||||
"type": assettype_entity,
|
||||
"parent": parent,
|
||||
else:
|
||||
new_asset_version_data = {
|
||||
"version": version,
|
||||
"asset_id": asset_id
|
||||
}
|
||||
asset_data.update(data.get("asset_data", {}))
|
||||
if task_id:
|
||||
new_asset_version_data["task_id"] = task_id
|
||||
|
||||
asset_entity = session.query(
|
||||
self.query("Asset", asset_data)
|
||||
).first()
|
||||
if comment:
|
||||
new_asset_version_data["comment"] = comment
|
||||
|
||||
self.log.info("asset entity: {}".format(asset_entity))
|
||||
|
||||
# Extracting metadata, and adding after entity creation. This is
|
||||
# due to a ftrack_api bug where you can't add metadata on creation.
|
||||
asset_metadata = asset_data.pop("metadata", {})
|
||||
|
||||
# Create a new entity if none exits.
|
||||
if not asset_entity:
|
||||
asset_entity = session.create("Asset", asset_data)
|
||||
self.log.debug(
|
||||
info_msg.format(
|
||||
entity_type="Asset",
|
||||
data=asset_data,
|
||||
metadata=asset_metadata
|
||||
)
|
||||
)
|
||||
try:
|
||||
session.commit()
|
||||
except Exception:
|
||||
tp, value, tb = sys.exc_info()
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
six.reraise(tp, value, tb)
|
||||
|
||||
# Adding metadata
|
||||
existing_asset_metadata = asset_entity["metadata"]
|
||||
existing_asset_metadata.update(asset_metadata)
|
||||
asset_entity["metadata"] = existing_asset_metadata
|
||||
|
||||
# AssetVersion
|
||||
# Get existing entity.
|
||||
assetversion_data = {
|
||||
"version": 0,
|
||||
"asset": asset_entity,
|
||||
}
|
||||
_assetversion_data = data.get("assetversion_data", {})
|
||||
assetversion_cust_attrs = _assetversion_data.pop(
|
||||
"custom_attributes", {}
|
||||
self.log.info("Created new AssetVersion with data {}".format(
|
||||
new_asset_version_data
|
||||
))
|
||||
session.create("AssetVersion", new_asset_version_data)
|
||||
session.commit()
|
||||
asset_version_entity = self._query_asset_version(
|
||||
session, version, asset_id
|
||||
)
|
||||
asset_version_comment = _assetversion_data.pop(
|
||||
"comment", None
|
||||
)
|
||||
assetversion_data.update(_assetversion_data)
|
||||
|
||||
assetversion_entity = session.query(
|
||||
self.query("AssetVersion", assetversion_data)
|
||||
).first()
|
||||
|
||||
# Extracting metadata, and adding after entity creation. This is
|
||||
# due to a ftrack_api bug where you can't add metadata on creation.
|
||||
assetversion_metadata = assetversion_data.pop("metadata", {})
|
||||
|
||||
if task:
|
||||
assetversion_data['task'] = task
|
||||
|
||||
# Create a new entity if none exits.
|
||||
if not assetversion_entity:
|
||||
assetversion_entity = session.create(
|
||||
"AssetVersion", assetversion_data
|
||||
)
|
||||
self.log.debug(
|
||||
info_msg.format(
|
||||
entity_type="AssetVersion",
|
||||
data=assetversion_data,
|
||||
metadata=assetversion_metadata
|
||||
# Set custom attributes if there were any set
|
||||
custom_attrs = asset_version_data.get("custom_attributes") or {}
|
||||
for attr_key, attr_value in custom_attrs.items():
|
||||
if attr_key in asset_version_entity["custom_attributes"]:
|
||||
try:
|
||||
asset_version_entity["custom_attributes"][attr_key] = (
|
||||
attr_value
|
||||
)
|
||||
session.commit()
|
||||
continue
|
||||
except Exception:
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
|
||||
self.log.warning(
|
||||
(
|
||||
"Custom Attrubute \"{0}\" is not available for"
|
||||
" AssetVersion <{1}>. Can't set it's value to: \"{2}\""
|
||||
).format(
|
||||
attr_key, asset_version_entity["id"], str(attr_value)
|
||||
)
|
||||
try:
|
||||
session.commit()
|
||||
except Exception:
|
||||
tp, value, tb = sys.exc_info()
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
six.reraise(tp, value, tb)
|
||||
)
|
||||
|
||||
# Adding metadata
|
||||
existing_assetversion_metadata = assetversion_entity["metadata"]
|
||||
existing_assetversion_metadata.update(assetversion_metadata)
|
||||
assetversion_entity["metadata"] = existing_assetversion_metadata
|
||||
return asset_version_entity
|
||||
|
||||
# Add comment
|
||||
if asset_version_comment:
|
||||
assetversion_entity["comment"] = asset_version_comment
|
||||
try:
|
||||
session.commit()
|
||||
except Exception:
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
self.log.warning((
|
||||
"Comment was not possible to set for AssetVersion"
|
||||
"\"{0}\". Can't set it's value to: \"{1}\""
|
||||
).format(
|
||||
assetversion_entity["id"], str(asset_version_comment)
|
||||
))
|
||||
def _query_asset_version(self, session, version, asset_id):
|
||||
return session.query(
|
||||
(
|
||||
"select id, task_id, comment from AssetVersion"
|
||||
" where version is \"{}\" and asset_id is \"{}\""
|
||||
).format(version, asset_id)
|
||||
).first()
|
||||
|
||||
# Adding Custom Attributes
|
||||
for attr, val in assetversion_cust_attrs.items():
|
||||
if attr in assetversion_entity["custom_attributes"]:
|
||||
try:
|
||||
assetversion_entity["custom_attributes"][attr] = val
|
||||
session.commit()
|
||||
continue
|
||||
except Exception:
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
def create_component(self, session, asset_version_entity, data):
|
||||
component_data = data.get("component_data") or {}
|
||||
|
||||
self.log.warning((
|
||||
"Custom Attrubute \"{0}\""
|
||||
" is not available for AssetVersion <{1}>."
|
||||
" Can't set it's value to: \"{2}\""
|
||||
).format(attr, assetversion_entity["id"], str(val)))
|
||||
if not component_data.get("name"):
|
||||
component_data["name"] = "main"
|
||||
|
||||
version_id = asset_version_entity["id"]
|
||||
component_data["version_id"] = version_id
|
||||
component_entity = session.query(
|
||||
(
|
||||
"select id, name from Component where name is \"{}\""
|
||||
" and version_id is \"{}\""
|
||||
).format(component_data["name"], version_id)
|
||||
).first()
|
||||
|
||||
component_overwrite = data.get("component_overwrite", False)
|
||||
location = data.get("component_location", session.pick_location())
|
||||
|
||||
# Overwrite existing component data if requested.
|
||||
if component_entity and component_overwrite:
|
||||
origin_location = session.query(
|
||||
"Location where name is \"ftrack.origin\""
|
||||
).one()
|
||||
|
||||
# Removing existing members from location
|
||||
components = list(component_entity.get("members", []))
|
||||
components += [component_entity]
|
||||
for component in components:
|
||||
for loc in component["component_locations"]:
|
||||
if location["id"] == loc["location_id"]:
|
||||
location.remove_component(
|
||||
component, recursive=False
|
||||
)
|
||||
|
||||
# Deleting existing members on component entity
|
||||
for member in component_entity.get("members", []):
|
||||
session.delete(member)
|
||||
del(member)
|
||||
|
||||
# Have to commit the version and asset, because location can't
|
||||
# determine the final location without.
|
||||
try:
|
||||
session.commit()
|
||||
except Exception:
|
||||
|
|
@ -292,175 +444,124 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
|
|||
session._configure_locations()
|
||||
six.reraise(tp, value, tb)
|
||||
|
||||
# Component
|
||||
# Get existing entity.
|
||||
component_data = {
|
||||
"name": "main",
|
||||
"version": assetversion_entity
|
||||
}
|
||||
component_data.update(data.get("component_data", {}))
|
||||
# Reset members in memory
|
||||
if "members" in component_entity.keys():
|
||||
component_entity["members"] = []
|
||||
|
||||
component_entity = session.query(
|
||||
self.query("Component", component_data)
|
||||
).first()
|
||||
# Add components to origin location
|
||||
try:
|
||||
collection = clique.parse(data["component_path"])
|
||||
except ValueError:
|
||||
# Assume its a single file
|
||||
# Changing file type
|
||||
name, ext = os.path.splitext(data["component_path"])
|
||||
component_entity["file_type"] = ext
|
||||
|
||||
component_overwrite = data.get("component_overwrite", False)
|
||||
location = data.get("component_location", session.pick_location())
|
||||
|
||||
# Overwrite existing component data if requested.
|
||||
if component_entity and component_overwrite:
|
||||
|
||||
origin_location = session.query(
|
||||
"Location where name is \"ftrack.origin\""
|
||||
).one()
|
||||
|
||||
# Removing existing members from location
|
||||
components = list(component_entity.get("members", []))
|
||||
components += [component_entity]
|
||||
for component in components:
|
||||
for loc in component["component_locations"]:
|
||||
if location["id"] == loc["location_id"]:
|
||||
location.remove_component(
|
||||
component, recursive=False
|
||||
)
|
||||
|
||||
# Deleting existing members on component entity
|
||||
for member in component_entity.get("members", []):
|
||||
session.delete(member)
|
||||
del(member)
|
||||
|
||||
try:
|
||||
session.commit()
|
||||
except Exception:
|
||||
tp, value, tb = sys.exc_info()
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
six.reraise(tp, value, tb)
|
||||
|
||||
# Reset members in memory
|
||||
if "members" in component_entity.keys():
|
||||
component_entity["members"] = []
|
||||
|
||||
# Add components to origin location
|
||||
try:
|
||||
collection = clique.parse(data["component_path"])
|
||||
except ValueError:
|
||||
# Assume its a single file
|
||||
# Changing file type
|
||||
name, ext = os.path.splitext(data["component_path"])
|
||||
component_entity["file_type"] = ext
|
||||
|
||||
origin_location.add_component(
|
||||
component_entity, data["component_path"]
|
||||
)
|
||||
else:
|
||||
# Changing file type
|
||||
component_entity["file_type"] = collection.format("{tail}")
|
||||
|
||||
# Create member components for sequence.
|
||||
for member_path in collection:
|
||||
|
||||
size = 0
|
||||
try:
|
||||
size = os.path.getsize(member_path)
|
||||
except OSError:
|
||||
pass
|
||||
|
||||
name = collection.match(member_path).group("index")
|
||||
|
||||
member_data = {
|
||||
"name": name,
|
||||
"container": component_entity,
|
||||
"size": size,
|
||||
"file_type": os.path.splitext(member_path)[-1]
|
||||
}
|
||||
|
||||
component = session.create(
|
||||
"FileComponent", member_data
|
||||
)
|
||||
origin_location.add_component(
|
||||
component, member_path, recursive=False
|
||||
)
|
||||
component_entity["members"].append(component)
|
||||
|
||||
# Add components to location.
|
||||
location.add_component(
|
||||
component_entity, origin_location, recursive=True
|
||||
)
|
||||
|
||||
data["component"] = component_entity
|
||||
msg = "Overwriting Component with path: {0}, data: {1}, "
|
||||
msg += "location: {2}"
|
||||
self.log.info(
|
||||
msg.format(
|
||||
data["component_path"],
|
||||
component_data,
|
||||
location
|
||||
)
|
||||
)
|
||||
|
||||
# Extracting metadata, and adding after entity creation. This is
|
||||
# due to a ftrack_api bug where you can't add metadata on creation.
|
||||
component_metadata = component_data.pop("metadata", {})
|
||||
|
||||
# Create new component if none exists.
|
||||
new_component = False
|
||||
if not component_entity:
|
||||
component_entity = assetversion_entity.create_component(
|
||||
data["component_path"],
|
||||
data=component_data,
|
||||
location=location
|
||||
)
|
||||
data["component"] = component_entity
|
||||
msg = "Created new Component with path: {0}, data: {1}"
|
||||
msg += ", metadata: {2}, location: {3}"
|
||||
self.log.info(
|
||||
msg.format(
|
||||
data["component_path"],
|
||||
component_data,
|
||||
component_metadata,
|
||||
location
|
||||
)
|
||||
)
|
||||
new_component = True
|
||||
|
||||
# Adding metadata
|
||||
existing_component_metadata = component_entity["metadata"]
|
||||
existing_component_metadata.update(component_metadata)
|
||||
component_entity["metadata"] = existing_component_metadata
|
||||
|
||||
# if component_data['name'] = 'ftrackreview-mp4-mp4':
|
||||
# assetversion_entity["thumbnail_id"]
|
||||
|
||||
# Setting assetversion thumbnail
|
||||
if data.get("thumbnail", False):
|
||||
assetversion_entity["thumbnail_id"] = component_entity["id"]
|
||||
|
||||
# Inform user about no changes to the database.
|
||||
if (component_entity and not component_overwrite and
|
||||
not new_component):
|
||||
data["component"] = component_entity
|
||||
self.log.info(
|
||||
"Found existing component, and no request to overwrite. "
|
||||
"Nothing has been changed."
|
||||
origin_location.add_component(
|
||||
component_entity, data["component_path"]
|
||||
)
|
||||
else:
|
||||
# Commit changes.
|
||||
try:
|
||||
session.commit()
|
||||
except Exception:
|
||||
tp, value, tb = sys.exc_info()
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
six.reraise(tp, value, tb)
|
||||
# Changing file type
|
||||
component_entity["file_type"] = collection.format("{tail}")
|
||||
|
||||
if assetversion_entity not in used_asset_versions:
|
||||
used_asset_versions.append(assetversion_entity)
|
||||
# Create member components for sequence.
|
||||
for member_path in collection:
|
||||
|
||||
asset_versions_key = "ftrackIntegratedAssetVersions"
|
||||
if asset_versions_key not in instance.data:
|
||||
instance.data[asset_versions_key] = []
|
||||
size = 0
|
||||
try:
|
||||
size = os.path.getsize(member_path)
|
||||
except OSError:
|
||||
pass
|
||||
|
||||
for asset_version in used_asset_versions:
|
||||
if asset_version not in instance.data[asset_versions_key]:
|
||||
instance.data[asset_versions_key].append(asset_version)
|
||||
name = collection.match(member_path).group("index")
|
||||
|
||||
member_data = {
|
||||
"name": name,
|
||||
"container": component_entity,
|
||||
"size": size,
|
||||
"file_type": os.path.splitext(member_path)[-1]
|
||||
}
|
||||
|
||||
component = session.create(
|
||||
"FileComponent", member_data
|
||||
)
|
||||
origin_location.add_component(
|
||||
component, member_path, recursive=False
|
||||
)
|
||||
component_entity["members"].append(component)
|
||||
|
||||
# Add components to location.
|
||||
location.add_component(
|
||||
component_entity, origin_location, recursive=True
|
||||
)
|
||||
|
||||
data["component"] = component_entity
|
||||
self.log.info(
|
||||
(
|
||||
"Overwriting Component with path: {0}, data: {1},"
|
||||
" location: {2}"
|
||||
).format(
|
||||
data["component_path"],
|
||||
component_data,
|
||||
location
|
||||
)
|
||||
)
|
||||
|
||||
# Extracting metadata, and adding after entity creation. This is
|
||||
# due to a ftrack_api bug where you can't add metadata on creation.
|
||||
component_metadata = component_data.pop("metadata", {})
|
||||
|
||||
# Create new component if none exists.
|
||||
new_component = False
|
||||
if not component_entity:
|
||||
component_entity = asset_version_entity.create_component(
|
||||
data["component_path"],
|
||||
data=component_data,
|
||||
location=location
|
||||
)
|
||||
data["component"] = component_entity
|
||||
self.log.info(
|
||||
(
|
||||
"Created new Component with path: {0}, data: {1},"
|
||||
" metadata: {2}, location: {3}"
|
||||
).format(
|
||||
data["component_path"],
|
||||
component_data,
|
||||
component_metadata,
|
||||
location
|
||||
)
|
||||
)
|
||||
new_component = True
|
||||
|
||||
# Adding metadata
|
||||
existing_component_metadata = component_entity["metadata"]
|
||||
existing_component_metadata.update(component_metadata)
|
||||
component_entity["metadata"] = existing_component_metadata
|
||||
|
||||
# if component_data['name'] = 'ftrackreview-mp4-mp4':
|
||||
# assetversion_entity["thumbnail_id"]
|
||||
|
||||
# Setting assetversion thumbnail
|
||||
if data.get("thumbnail"):
|
||||
asset_version_entity["thumbnail_id"] = component_entity["id"]
|
||||
|
||||
# Inform user about no changes to the database.
|
||||
if (
|
||||
component_entity
|
||||
and not component_overwrite
|
||||
and not new_component
|
||||
):
|
||||
data["component"] = component_entity
|
||||
self.log.info(
|
||||
"Found existing component, and no request to overwrite. "
|
||||
"Nothing has been changed."
|
||||
)
|
||||
else:
|
||||
# Commit changes.
|
||||
try:
|
||||
session.commit()
|
||||
except Exception:
|
||||
tp, value, tb = sys.exc_info()
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
six.reraise(tp, value, tb)
|
||||
|
|
|
|||
|
|
@ -0,0 +1,84 @@
|
|||
"""
|
||||
Requires:
|
||||
context > comment
|
||||
context > ftrackSession
|
||||
instance > ftrackIntegratedAssetVersionsData
|
||||
"""
|
||||
|
||||
import sys
|
||||
|
||||
import six
|
||||
import pyblish.api
|
||||
|
||||
|
||||
class IntegrateFtrackDescription(pyblish.api.InstancePlugin):
|
||||
"""Add description to AssetVersions in Ftrack."""
|
||||
|
||||
# Must be after integrate asset new
|
||||
order = pyblish.api.IntegratorOrder + 0.4999
|
||||
label = "Integrate Ftrack description"
|
||||
families = ["ftrack"]
|
||||
optional = True
|
||||
|
||||
# Can be set in settings:
|
||||
# - Allows `intent` and `comment` keys
|
||||
description_template = "{comment}"
|
||||
|
||||
def process(self, instance):
|
||||
# Check if there are any integrated AssetVersion entities
|
||||
asset_versions_key = "ftrackIntegratedAssetVersionsData"
|
||||
asset_versions_data_by_id = instance.data.get(asset_versions_key)
|
||||
if not asset_versions_data_by_id:
|
||||
self.log.info("There are any integrated AssetVersions")
|
||||
return
|
||||
|
||||
comment = (instance.context.data.get("comment") or "").strip()
|
||||
if not comment:
|
||||
self.log.info("Comment is not set.")
|
||||
else:
|
||||
self.log.debug("Comment is set to `{}`".format(comment))
|
||||
|
||||
session = instance.context.data["ftrackSession"]
|
||||
|
||||
intent = instance.context.data.get("intent")
|
||||
intent_label = None
|
||||
if intent and isinstance(intent, dict):
|
||||
intent_val = intent.get("value")
|
||||
intent_label = intent.get("label")
|
||||
else:
|
||||
intent_val = intent
|
||||
|
||||
if not intent_label:
|
||||
intent_label = intent_val or ""
|
||||
|
||||
# if intent label is set then format comment
|
||||
# - it is possible that intent_label is equal to "" (empty string)
|
||||
if intent_label:
|
||||
self.log.debug(
|
||||
"Intent label is set to `{}`.".format(intent_label)
|
||||
)
|
||||
|
||||
else:
|
||||
self.log.debug("Intent is not set.")
|
||||
|
||||
for asset_version_data in asset_versions_data_by_id.values():
|
||||
asset_version = asset_version_data["asset_version"]
|
||||
|
||||
# Backwards compatibility for older settings using
|
||||
# attribute 'note_with_intent_template'
|
||||
comment = self.description_template.format(**{
|
||||
"intent": intent_label,
|
||||
"comment": comment
|
||||
})
|
||||
asset_version["comment"] = comment
|
||||
|
||||
try:
|
||||
session.commit()
|
||||
self.log.debug("Comment added to AssetVersion \"{}\"".format(
|
||||
str(asset_version)
|
||||
))
|
||||
except Exception:
|
||||
tp, value, tb = sys.exc_info()
|
||||
session.rollback()
|
||||
session._configure_locations()
|
||||
six.reraise(tp, value, tb)
|
||||
|
|
@@ -40,6 +40,13 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
    def process(self, instance):
        self.log.debug("instance {}".format(instance))

        instance_repres = instance.data.get("representations")
        if not instance_repres:
            self.log.info((
                "Skipping instance. Does not have any representations {}"
            ).format(str(instance)))
            return

        instance_version = instance.data.get("version")
        if instance_version is None:
            raise ValueError("Instance version not set")

@@ -53,8 +60,12 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
        if not asset_type and family_low in self.family_mapping:
            asset_type = self.family_mapping[family_low]

        self.log.debug(self.family_mapping)
        self.log.debug(family_low)
        if not asset_type:
            asset_type = "upload"

        self.log.debug(
            "Family: {}\nMapping: {}".format(family_low, self.family_mapping)
        )

        # Ignore this instance if neither "ftrackFamily" or a family mapping is
        # found.

@@ -64,13 +75,6 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
            ).format(family))
            return

        instance_repres = instance.data.get("representations")
        if not instance_repres:
            self.log.info((
                "Skipping instance. Does not have any representations {}"
            ).format(str(instance)))
            return

        # Prepare FPS
        instance_fps = instance.data.get("fps")
        if instance_fps is None:

@@ -1,7 +1,17 @@
"""
Requires:
    context > hostName
    context > appName
    context > appLabel
    context > comment
    context > ftrackSession
    instance > ftrackIntegratedAssetVersionsData
"""

import sys
import json
import pyblish.api

import six
import pyblish.api


class IntegrateFtrackNote(pyblish.api.InstancePlugin):

@@ -15,100 +25,52 @@ class IntegrateFtrackNote(pyblish.api.InstancePlugin):

    # Can be set in presets:
    # - Allows only `intent` and `comment` keys
    note_template = None
    # Backwards compatibility
    note_with_intent_template = "{intent}: {comment}"
    # - note label must exist in Ftrack
    note_labels = []

    def get_intent_label(self, session, intent_value):
        if not intent_value:
            return

        intent_configurations = session.query(
            "CustomAttributeConfiguration where key is intent"
        ).all()
        if not intent_configurations:
            return

        intent_configuration = intent_configurations[0]
        if len(intent_configuration) > 1:
            self.log.warning((
                "Found more than one `intent` custom attribute."
                " Using first found."
            ))

        config = intent_configuration.get("config")
        if not config:
            return

        configuration = json.loads(config)
        items = configuration.get("data")
        if not items:
            return

        if sys.version_info[0] < 3:
            string_type = basestring
        else:
            string_type = str

        if isinstance(items, string_type):
            items = json.loads(items)

        intent_label = None
        for item in items:
            if item["value"] == intent_value:
                intent_label = item["menu"]
                break

        return intent_label

    def process(self, instance):
        comment = (instance.context.data.get("comment") or "").strip()
        # Check if there are any integrated AssetVersion entities
        asset_versions_key = "ftrackIntegratedAssetVersionsData"
        asset_versions_data_by_id = instance.data.get(asset_versions_key)
        if not asset_versions_data_by_id:
            self.log.info("There are any integrated AssetVersions")
            return

        context = instance.context
        host_name = context.data["hostName"]
        app_name = context.data["appName"]
        app_label = context.data["appLabel"]
        comment = (context.data.get("comment") or "").strip()
        if not comment:
            self.log.info("Comment is not set.")
            return
        else:
            self.log.debug("Comment is set to `{}`".format(comment))

        self.log.debug("Comment is set to `{}`".format(comment))

        session = instance.context.data["ftrackSession"]
        session = context.data["ftrackSession"]

        intent = instance.context.data.get("intent")
        intent_label = None
        if intent and isinstance(intent, dict):
            intent_val = intent.get("value")
            intent_label = intent.get("label")
        else:
            intent_val = intent_label = intent
            intent_val = intent

        final_label = None
        if intent_val:
            final_label = self.get_intent_label(session, intent_val)
            if final_label is None:
                final_label = intent_label
        if not intent_label:
            intent_label = intent_val or ""

        # if intent label is set then format comment
        # - it is possible that intent_label is equal to "" (empty string)
        if final_label:
            msg = "Intent label is set to `{}`.".format(final_label)
            comment = self.note_with_intent_template.format(**{
                "intent": final_label,
                "comment": comment
            })

        elif intent_val:
            msg = (
                "Intent is set to `{}` and was not added"
                " to comment because label is set to `{}`."
            ).format(intent_val, final_label)
        if intent_label:
            self.log.debug(
                "Intent label is set to `{}`.".format(intent_label)
            )

        else:
            msg = "Intent is not set."

        self.log.debug(msg)

        asset_versions_key = "ftrackIntegratedAssetVersions"
        asset_versions = instance.data.get(asset_versions_key)
        if not asset_versions:
            self.log.info("There are any integrated AssetVersions")
            return
            self.log.debug("Intent is not set.")

        user = session.query(
            "User where username is \"{}\"".format(session.api_user)

@@ -122,7 +84,7 @@ class IntegrateFtrackNote(pyblish.api.InstancePlugin):

        labels = []
        if self.note_labels:
            all_labels = session.query("NoteLabel").all()
            all_labels = session.query("select id, name from NoteLabel").all()
            labels_by_low_name = {lab["name"].lower(): lab for lab in all_labels}
            for _label in self.note_labels:
                label = labels_by_low_name.get(_label.lower())

@@ -134,7 +96,34 @@ class IntegrateFtrackNote(pyblish.api.InstancePlugin):

                labels.append(label)

        for asset_version in asset_versions:
        for asset_version_data in asset_versions_data_by_id.values():
            asset_version = asset_version_data["asset_version"]
            component_items = asset_version_data["component_items"]

            published_paths = set()
            for component_item in component_items:
                published_paths.add(component_item["component_path"])

            # Backwards compatibility for older settings using
            # attribute 'note_with_intent_template'
            template = self.note_template
            if template is None:
                template = self.note_with_intent_template
            format_data = {
                "intent": intent_label,
                "comment": comment,
                "host_name": host_name,
                "app_name": app_name,
                "app_label": app_label,
                "published_paths": "<br/>".join(sorted(published_paths)),
            }
            comment = template.format(**format_data)
            if not comment:
                self.log.info((
                    "Note for AssetVersion {} would be empty. Skipping."
                    "\nTemplate: {}\nData: {}"
                ).format(asset_version["id"], template, format_data))
                continue
            asset_version.create_note(comment, author=user, labels=labels)

            try:

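As an illustration (not part of the diff above): a standalone sketch of how the extended `note_template` formatting behaves once host/app data and published paths have been collected. All values below are fabricated stand-ins for data the real plugin pulls from the pyblish context and instance.

```python
# Illustrative only: fabricated values standing in for publish context data.
note_template = "{intent}: {comment}<br/>{app_label}<br/>{published_paths}"

format_data = {
    "intent": "WIP",
    "comment": "Lighting pass v003",
    "host_name": "maya",
    "app_name": "maya/2022",
    "app_label": "Autodesk Maya 2022",
    # Published component paths are joined with HTML line breaks for the note.
    "published_paths": "<br/>".join(sorted({
        "/projects/demo/sh010/publish/render/v003/sh010_beauty.exr",
        "/projects/demo/sh010/publish/render/v003/sh010_beauty.mp4",
    })),
}

note_text = note_template.format(**format_data)
print(note_text)
```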
@@ -389,7 +389,8 @@ class PythonInterpreterWidget(QtWidgets.QWidget):

        self._append_lines([openpype_art])

        self.setStyleSheet(load_stylesheet())
        self._first_show = True
        self._splitter_size_ratio = None

        self._init_from_registry()

@@ -416,9 +417,9 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
        self.resize(width, height)

        try:
            sizes = setting_registry.get_item("splitter_sizes")
            if len(sizes) == len(self._widgets_splitter.sizes()):
                self._widgets_splitter.setSizes(sizes)
            self._splitter_size_ratio = (
                setting_registry.get_item("splitter_sizes")
            )

        except ValueError:
            pass

@@ -627,8 +628,29 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
    def showEvent(self, event):
        self._line_check_timer.start()
        super(PythonInterpreterWidget, self).showEvent(event)
        # First show setup
        if self._first_show:
            self._first_show = False
            self._on_first_show()

        self._output_widget.scroll_to_bottom()

    def _on_first_show(self):
        # Change stylesheet
        self.setStyleSheet(load_stylesheet())
        # Check if splitter size ratio is set
        # - first store value to local variable and then unset it
        splitter_size_ratio = self._splitter_size_ratio
        self._splitter_size_ratio = None
        # Skip if is not set
        if not splitter_size_ratio:
            return

        # Skip if number of size items does not match to splitter
        splitters_count = len(self._widgets_splitter.sizes())
        if len(splitter_size_ratio) == splitters_count:
            self._widgets_splitter.setSizes(splitter_size_ratio)

    def closeEvent(self, event):
        self.save_registry()
        super(PythonInterpreterWidget, self).closeEvent(event)

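The change above defers splitter-size restoration until the widget is first shown. A minimal, framework-free sketch of that "store now, apply on first show" pattern follows; the class and attribute names are illustrative, not taken from the diff.

```python
class DeferredSplitterRestore:
    """Illustrative stand-in for the widget logic above (no Qt required)."""

    def __init__(self, stored_sizes, current_sizes):
        # Value loaded from the registry is only stored at init time.
        self._splitter_size_ratio = stored_sizes
        self._current_sizes = list(current_sizes)
        self._first_show = True

    def show(self):
        # Applied lazily, exactly once, when the widget is actually shown.
        if self._first_show:
            self._first_show = False
            self._on_first_show()

    def _on_first_show(self):
        sizes = self._splitter_size_ratio
        self._splitter_size_ratio = None
        # Only apply when the stored item count matches the splitter layout.
        if sizes and len(sizes) == len(self._current_sizes):
            self._current_sizes = list(sizes)


widget = DeferredSplitterRestore([300, 700], [500, 500])
widget.show()
print(widget._current_sizes)  # -> [300, 700]
```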
@@ -73,8 +73,23 @@ class GDriveHandler(AbstractProvider):
                       format(site_name))
            return

        cred_path = self.presets.get("credentials_url", {}).\
            get(platform.system().lower()) or ''
        current_platform = platform.system().lower()
        cred_path = self.presets.get("credentials_url", {}). \
            get(current_platform) or ''

        if not cred_path:
            msg = "Sync Server: Please, fill the credentials for gdrive "\
                  "provider for platform '{}' !".format(current_platform)
            log.info(msg)
            return

        try:
            cred_path = cred_path.format(**os.environ)
        except KeyError as e:
            log.info("Sync Server: The key(s) {} does not exist in the "
                     "environment variables".format(" ".join(e.args)))
            return

        if not os.path.exists(cred_path):
            msg = "Sync Server: No credentials for gdrive provider " + \
                  "for '{}' on path '{}'!".format(site_name, cred_path)

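As an illustration (not part of the diff above): the credentials-path handling expands environment-variable placeholders via `str.format`. A standalone sketch of that expansion and its failure mode; the preset path and placeholder name are made up for the example.

```python
import os

# Hypothetical preset value; {LOCALAPPDATA} is only an example placeholder.
cred_path_template = "{LOCALAPPDATA}/openpype/gdrive_credentials.json"

try:
    # Every {NAME} placeholder must exist as an environment variable,
    # otherwise str.format raises KeyError with the missing key in e.args.
    cred_path = cred_path_template.format(**os.environ)
except KeyError as e:
    print("Missing environment variable(s): {}".format(" ".join(e.args)))
else:
    print("Resolved credentials path:", cred_path)
```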
@@ -41,6 +41,7 @@ from .load import (
    loaders_from_representation,
    get_representation_path,
    get_representation_context,
    get_repres_contexts,
)

@@ -113,6 +114,7 @@ __all__ = (
    "loaders_from_representation",
    "get_representation_path",
    "get_representation_context",
    "get_repres_contexts",

    # --- Publish ---

@@ -41,7 +41,8 @@ class LoaderPlugin(list):
    def get_representations(cls):
        return cls.representations

    def filepath_from_context(self, context):
    @classmethod
    def filepath_from_context(cls, context):
        return get_representation_path_from_context(context)

    def load(self, context, name=None, namespace=None, options=None):

@@ -18,20 +18,30 @@ class CollectHostName(pyblish.api.ContextPlugin):

    def process(self, context):
        host_name = context.data.get("hostName")
        app_name = context.data.get("appName")
        app_label = context.data.get("appLabel")
        # Don't override value if is already set
        if host_name:
        if host_name and app_name and app_label:
            return

        # Use AVALON_APP as first if available it is the same as host name
        # - only if is not defined use AVALON_APP_NAME (e.g. on Farm) and
        #   set it back to AVALON_APP env variable
        host_name = os.environ.get("AVALON_APP")
        # Use AVALON_APP to get host name if available
        if not host_name:
            host_name = os.environ.get("AVALON_APP")

        # Use AVALON_APP_NAME to get full app name
        if not app_name:
            app_name = os.environ.get("AVALON_APP_NAME")
        if app_name:
            app_manager = ApplicationManager()
            app = app_manager.applications.get(app_name)
            if app:

        # Fill missing values based on app full name
        if (not host_name or not app_label) and app_name:
            app_manager = ApplicationManager()
            app = app_manager.applications.get(app_name)
            if app:
                if not host_name:
                    host_name = app.host_name
                if not app_label:
                    app_label = app.full_label

        context.data["hostName"] = host_name
        context.data["appName"] = app_name
        context.data["appLabel"] = app_label

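As an illustration (not part of the diff above): a condensed sketch of the fallback order the collector ends up with, under the assumption that existing context data wins, then the AVALON_APP / AVALON_APP_NAME environment variables, then the application definition. The `ApplicationManager` lookup is replaced here by a plain dict so the snippet stays standalone.

```python
import os

# Stand-in for ApplicationManager().applications; keys and values are made up.
KNOWN_APPS = {
    "maya/2022": {"host_name": "maya", "full_label": "Autodesk Maya 2022"},
}


def collect_host_info(context_data):
    host_name = context_data.get("hostName")
    app_name = context_data.get("appName") or os.environ.get("AVALON_APP_NAME")
    app_label = context_data.get("appLabel")

    if not host_name:
        host_name = os.environ.get("AVALON_APP")

    # Fill missing values based on the full application name.
    if (not host_name or not app_label) and app_name:
        app = KNOWN_APPS.get(app_name)
        if app:
            host_name = host_name or app["host_name"]
            app_label = app_label or app["full_label"]

    context_data.update({
        "hostName": host_name,
        "appName": app_name,
        "appLabel": app_label,
    })
    return context_data


print(collect_host_info({"appName": "maya/2022"}))
```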
@@ -2,8 +2,8 @@ import pyblish.api
from openpype.pipeline import PublishValidationError


class ValidateContainers(pyblish.api.InstancePlugin):
    """Validate existence of asset asset documents on instances.
class ValidateAssetDocs(pyblish.api.InstancePlugin):
    """Validate existence of asset documents on instances.

    Without asset document it is not possible to publish the instance.

@@ -22,10 +22,10 @@ class ValidateContainers(pyblish.api.InstancePlugin):
            return

        if instance.data.get("assetEntity"):
            self.log.info("Instance have set asset document in it's data.")
            self.log.info("Instance has set asset document in its data.")

        else:
            raise PublishValidationError((
                "Instance \"{}\" don't have set asset"
                " document which is needed for publishing."
                "Instance \"{}\" doesn't have asset document "
                "set which is needed for publishing."
            ).format(instance.data["name"]))

@@ -15,33 +15,6 @@
                "deadline"
            ]
        },
        "ProcessSubmittedJobOnFarm": {
            "enabled": true,
            "deadline_department": "",
            "deadline_pool": "",
            "deadline_group": "",
            "deadline_chunk_size": 1,
            "deadline_priority": 50,
            "publishing_script": "",
            "skip_integration_repre_list": [],
            "aov_filter": {
                "maya": [
                    ".+(?:\\.|_)([Bb]eauty)(?:\\.|_).*"
                ],
                "nuke": [
                    ".*"
                ],
                "aftereffects": [
                    ".*"
                ],
                "celaction": [
                    ".*"
                ],
                "harmony": [
                    ".*"
                ]
            }
        },
        "MayaSubmitDeadline": {
            "enabled": true,
            "optional": false,

@@ -49,6 +22,8 @@
            "tile_assembler_plugin": "OpenPypeTileAssembler",
            "use_published": true,
            "asset_dependencies": true,
            "priority": 50,
            "tile_priority": 50,
            "group": "none",
            "limit": [],
            "jobInfo": {},

@@ -96,6 +71,33 @@
            "group": "",
            "department": "",
            "multiprocess": true
        },
        "ProcessSubmittedJobOnFarm": {
            "enabled": true,
            "deadline_department": "",
            "deadline_pool": "",
            "deadline_group": "",
            "deadline_chunk_size": 1,
            "deadline_priority": 50,
            "publishing_script": "",
            "skip_integration_repre_list": [],
            "aov_filter": {
                "maya": [
                    ".+(?:\\.|_)([Bb]eauty)(?:\\.|_).*"
                ],
                "nuke": [
                    ".*"
                ],
                "aftereffects": [
                    ".*"
                ],
                "celaction": [
                    ".*"
                ],
                "harmony": [
                    ".*"
                ]
            }
        }
    }
}

@@ -354,9 +354,15 @@
        },
        "IntegrateFtrackNote": {
            "enabled": true,
            "note_with_intent_template": "{intent}: {comment}",
            "note_template": "{intent}: {comment}",
            "note_labels": []
        },
        "IntegrateFtrackDescription": {
            "enabled": false,
            "optional": true,
            "active": true,
            "description_template": "{comment}"
        },
        "ValidateFtrackAttributes": {
            "enabled": false,
            "ftrack_custom_attributes": {}

@@ -190,7 +190,7 @@
            "tasks": [],
            "template_name": "simpleUnrealTexture"
        },
        {
        {
            "families": [
                "staticMesh",
                "skeletalMesh"

@@ -279,6 +279,15 @@
            "tasks": [],
            "template": "{family}{variant}"
        },
        {
            "families": [
                "workfile"
            ],
            "hosts": [],
            "task_types": [],
            "tasks": [],
            "template": "{family}{Task}"
        },
        {
            "families": [
                "render"

@@ -117,6 +117,16 @@
            "key": "asset_dependencies",
            "label": "Use Asset dependencies"
        },
        {
            "type": "number",
            "key": "priority",
            "label": "Priority"
        },
        {
            "type": "number",
            "key": "tile_priority",
            "label": "Tile Assembler Priority"
        },
        {
            "type": "text",
            "key": "group",

@@ -738,10 +738,15 @@
            "key": "enabled",
            "label": "Enabled"
        },
        {
            "type": "label",
            "label": "Template may contain formatting keys <b>intent</b>, <b>comment</b>, <b>host_name</b>, <b>app_name</b>, <b>app_label</b> and <b>published_paths</b>."
        },
        {
            "type": "text",
            "key": "note_with_intent_template",
            "label": "Note with intent template"
            "key": "note_template",
            "label": "Note template",
            "multiline": true
        },
        {
            "type": "list",

@@ -751,6 +756,44 @@
            }
        ]
    },
    {
        "type": "dict",
        "collapsible": true,
        "checkbox_key": "enabled",
        "key": "IntegrateFtrackDescription",
        "label": "Integrate Ftrack Description",
        "is_group": true,
        "children": [
            {
                "type": "boolean",
                "key": "enabled",
                "label": "Enabled"
            },
            {
                "type": "label",
                "label": "Add description to integrated AssetVersion."
            },
            {
                "type": "boolean",
                "key": "optional",
                "label": "Optional"
            },
            {
                "type": "boolean",
                "key": "active",
                "label": "Active"
            },
            {
                "type": "label",
                "label": "Template may contain formatting keys <b>intent</b> and <b>comment</b>."
            },
            {
                "type": "text",
                "key": "description_template",
                "label": "Description template"
            }
        ]
    },
    {
        "type": "dict",
        "collapsible": true,

@@ -216,7 +216,7 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
    def create_ui(self):
        self.modify_defaults_checkbox = None

        conf_wrapper_widget = QtWidgets.QWidget(self)
        conf_wrapper_widget = QtWidgets.QSplitter(self)
        configurations_widget = QtWidgets.QWidget(conf_wrapper_widget)

        # Breadcrumbs/Path widget

@@ -294,10 +294,7 @@ class SettingsCategoryWidget(QtWidgets.QWidget):

        configurations_layout.addWidget(scroll_widget, 1)

        conf_wrapper_layout = QtWidgets.QHBoxLayout(conf_wrapper_widget)
        conf_wrapper_layout.setContentsMargins(0, 0, 0, 0)
        conf_wrapper_layout.setSpacing(0)
        conf_wrapper_layout.addWidget(configurations_widget, 1)
        conf_wrapper_widget.addWidget(configurations_widget)

        main_layout = QtWidgets.QVBoxLayout(self)
        main_layout.setContentsMargins(0, 0, 0, 0)

@@ -327,7 +324,7 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
        self.breadcrumbs_model = None
        self.refresh_btn = refresh_btn

        self.conf_wrapper_layout = conf_wrapper_layout
        self.conf_wrapper_widget = conf_wrapper_widget
        self.main_layout = main_layout

        self.ui_tweaks()

@@ -818,7 +815,9 @@ class ProjectWidget(SettingsCategoryWidget):

        project_list_widget = ProjectListWidget(self)

        self.conf_wrapper_layout.insertWidget(0, project_list_widget, 0)
        self.conf_wrapper_widget.insertWidget(0, project_list_widget)
        self.conf_wrapper_widget.setStretchFactor(0, 0)
        self.conf_wrapper_widget.setStretchFactor(1, 1)

        project_list_widget.project_changed.connect(self._on_project_change)
        project_list_widget.version_change_requested.connect(

@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.9.3-nightly.1"
__version__ = "3.9.3"

@@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
version = "3.9.3-nightly.1" # OpenPype
version = "3.9.3" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"

@@ -24,16 +24,18 @@ class DBAssert:
            else:
                args[key] = val

        msg = None
        no_of_docs = dbcon.count_documents(args)
        if expected != no_of_docs:
            msg = "Not expected no of versions. "\
                  "Expected {}, found {}".format(expected, no_of_docs)

        args.pop("type")
        detail_str = " "
        if args:
            detail_str = " with {}".format(args)
            detail_str = " with '{}'".format(args)

        msg = None
        no_of_docs = dbcon.count_documents(args)
        if expected != no_of_docs:
            msg = "Not expected no of '{}'{}."\
                  "Expected {}, found {}".format(queried_type,
                                                 detail_str,
                                                 expected, no_of_docs)

        status = "successful"
        if msg:

@@ -42,7 +44,5 @@ class DBAssert:
        print("Comparing count of {}{} {}".format(queried_type,
                                                  detail_str,
                                                  status))
        if msg:
            print(msg)

        return msg

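As an illustration (not part of the diff above): a standalone sketch of the message building the updated test helper performs, without a MongoDB connection. The count, type, and query arguments are invented for the example.

```python
def count_of_types_message(queried_type, expected, actual, **query_args):
    # Mirrors the shape of the helper above; no database involved.
    detail_str = " "
    if query_args:
        detail_str = " with '{}'".format(query_args)

    msg = None
    if expected != actual:
        msg = "Not expected no of '{}'{}. Expected {}, found {}".format(
            queried_type, detail_str, expected, actual
        )

    status = "successful" if not msg else "failed"
    print("Comparing count of {}{} {}".format(queried_type, detail_str, status))
    return msg


print(count_of_types_message("version", expected=2, actual=3, name="renderMain"))
```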
@@ -14,7 +14,7 @@ The main things you will need to run and build pype are:
- **Terminal** in your OS
  - PowerShell 5.0+ (Windows)
  - Bash (Linux)
- [**Python 3.7.8**](#python) or higher
- [**Python 3.7.9**](#python) or higher
- [**MongoDB**](#database)