Merge branch 'develop' of github.com:pypeclub/OpenPype into feature/OP-2951_Download-all-workfile-inputs
Commit ffee62799d
141 changed files with 3816 additions and 1628 deletions

CHANGELOG.md (157 changed lines)

@@ -1,34 +1,92 @@
# Changelog

-## [3.9.2-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.9.3](https://github.com/pypeclub/OpenPype/tree/3.9.3) (2022-04-07)

-[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.1...HEAD)
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.2...3.9.3)

### 📖 Documentation

- Docs: Added MongoDB requirements [\#2951](https://github.com/pypeclub/OpenPype/pull/2951)
- Website Docs: Manager Ftrack fix broken links [\#2979](https://github.com/pypeclub/OpenPype/pull/2979)

**🆕 New features**

- Multiverse: First PR [\#2908](https://github.com/pypeclub/OpenPype/pull/2908)
- Ftrack: Add description integrator [\#3027](https://github.com/pypeclub/OpenPype/pull/3027)
- Publishing textures for Unreal [\#2988](https://github.com/pypeclub/OpenPype/pull/2988)
- Maya to Unreal: Static and Skeletal Meshes [\#2978](https://github.com/pypeclub/OpenPype/pull/2978)

**🚀 Enhancements**

- Slack: Added configurable maximum file size of review upload to Slack [\#2945](https://github.com/pypeclub/OpenPype/pull/2945)
- NewPublisher: Prepared implementation of optional pyblish plugin [\#2943](https://github.com/pypeclub/OpenPype/pull/2943)
- Workfiles: Open published workfiles [\#2925](https://github.com/pypeclub/OpenPype/pull/2925)
- General: Default modules loaded dynamically [\#2923](https://github.com/pypeclub/OpenPype/pull/2923)
- CI: change the version bump logic [\#2919](https://github.com/pypeclub/OpenPype/pull/2919)
- Deadline: Add headless argument [\#2916](https://github.com/pypeclub/OpenPype/pull/2916)
- Nuke: Add no-audio Tag [\#2911](https://github.com/pypeclub/OpenPype/pull/2911)
- Ftrack: Fill workfile in custom attribute [\#2906](https://github.com/pypeclub/OpenPype/pull/2906)
- Nuke: improving readability [\#2903](https://github.com/pypeclub/OpenPype/pull/2903)
- Settings UI: Add simple tooltips for settings entities [\#2901](https://github.com/pypeclub/OpenPype/pull/2901)
- Ftrack: Add more options for note text of integrate ftrack note [\#3025](https://github.com/pypeclub/OpenPype/pull/3025)
- Console Interpreter: Changed how console splitter sizes are reused on show [\#3016](https://github.com/pypeclub/OpenPype/pull/3016)
- Deadline: Use more suitable name for sequence review logic [\#3015](https://github.com/pypeclub/OpenPype/pull/3015)
- Nuke: add concurrency attr to deadline job [\#3005](https://github.com/pypeclub/OpenPype/pull/3005)
- Photoshop: create image without instance [\#3001](https://github.com/pypeclub/OpenPype/pull/3001)
- Deadline: priority configurable in Maya jobs [\#2995](https://github.com/pypeclub/OpenPype/pull/2995)
- Workfiles tool: Save as published workfiles [\#2937](https://github.com/pypeclub/OpenPype/pull/2937)

**🐛 Bug fixes**

- Deadline: Fixed default value of use sequence for review [\#3033](https://github.com/pypeclub/OpenPype/pull/3033)
- Settings UI: Version column can be extended so versions are visible [\#3032](https://github.com/pypeclub/OpenPype/pull/3032)
- General: Fix import after movements [\#3028](https://github.com/pypeclub/OpenPype/pull/3028)
- Harmony: Added creating subset name for workfile from template [\#3024](https://github.com/pypeclub/OpenPype/pull/3024)
- AfterEffects: Added creating subset name for workfile from template [\#3023](https://github.com/pypeclub/OpenPype/pull/3023)
- General: Add example addons to ignored [\#3022](https://github.com/pypeclub/OpenPype/pull/3022)
- Maya: Remove missing import [\#3017](https://github.com/pypeclub/OpenPype/pull/3017)
- Ftrack: multiple reviewable components [\#3012](https://github.com/pypeclub/OpenPype/pull/3012)
- Tray publisher: Fixes after code movement [\#3010](https://github.com/pypeclub/OpenPype/pull/3010)
- Nuke: fixing unicode type detection in effect loaders [\#3002](https://github.com/pypeclub/OpenPype/pull/3002)
- Nuke: removing redundant Ftrack asset when farm publishing [\#2996](https://github.com/pypeclub/OpenPype/pull/2996)

**Merged pull requests:**

- Maya: Allow to select invalid camera contents if no cameras found [\#3030](https://github.com/pypeclub/OpenPype/pull/3030)
- General: adding limitations for pyright [\#2994](https://github.com/pypeclub/OpenPype/pull/2994)

## [3.9.2](https://github.com/pypeclub/OpenPype/tree/3.9.2) (2022-04-04)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.2-nightly.4...3.9.2)

### 📖 Documentation

- Documentation: Added mention of adding My Drive as a root [\#2999](https://github.com/pypeclub/OpenPype/pull/2999)
- Docs: Added MongoDB requirements [\#2951](https://github.com/pypeclub/OpenPype/pull/2951)
- Documentation: New publisher develop docs [\#2896](https://github.com/pypeclub/OpenPype/pull/2896)

**🆕 New features**

- nuke: bypass baking [\#2992](https://github.com/pypeclub/OpenPype/pull/2992)
- Multiverse: Initial Support [\#2908](https://github.com/pypeclub/OpenPype/pull/2908)

**🚀 Enhancements**

- TVPaint: Render scene family [\#3000](https://github.com/pypeclub/OpenPype/pull/3000)
- Nuke: ReviewDataMov Read RAW attribute [\#2985](https://github.com/pypeclub/OpenPype/pull/2985)
- General: `METADATA\_KEYS` constant as `frozenset` for optimal immutable lookup [\#2980](https://github.com/pypeclub/OpenPype/pull/2980)
- General: Tools with host filters [\#2975](https://github.com/pypeclub/OpenPype/pull/2975)
- Hero versions: Use custom templates [\#2967](https://github.com/pypeclub/OpenPype/pull/2967)
- Slack: Added configurable maximum file size of review upload to Slack [\#2945](https://github.com/pypeclub/OpenPype/pull/2945)
- NewPublisher: Prepared implementation of optional pyblish plugin [\#2943](https://github.com/pypeclub/OpenPype/pull/2943)
- TVPaint: Extractor to convert PNG into EXR [\#2942](https://github.com/pypeclub/OpenPype/pull/2942)
- Workfiles: Open published workfiles [\#2925](https://github.com/pypeclub/OpenPype/pull/2925)
- General: Default modules loaded dynamically [\#2923](https://github.com/pypeclub/OpenPype/pull/2923)
- Nuke: Add no-audio Tag [\#2911](https://github.com/pypeclub/OpenPype/pull/2911)
- Nuke: improving readability [\#2903](https://github.com/pypeclub/OpenPype/pull/2903)

**🐛 Bug fixes**

- Hosts: Remove path existence checks in 'add\_implementation\_envs' [\#3004](https://github.com/pypeclub/OpenPype/pull/3004)
- Fix - remove doubled dot in workfile created from template [\#2998](https://github.com/pypeclub/OpenPype/pull/2998)
- PS: fix renaming subset incorrectly in PS [\#2991](https://github.com/pypeclub/OpenPype/pull/2991)
- Fix: Disable setuptools auto discovery [\#2990](https://github.com/pypeclub/OpenPype/pull/2990)
- AEL: fix opening existing workfile if no scene opened [\#2989](https://github.com/pypeclub/OpenPype/pull/2989)
- Maya: Don't do hardlinks on windows for look publishing [\#2986](https://github.com/pypeclub/OpenPype/pull/2986)
- Settings UI: Fix version completer on linux [\#2981](https://github.com/pypeclub/OpenPype/pull/2981)
- Photoshop: Fix creation of subset names in PS review and workfile [\#2969](https://github.com/pypeclub/OpenPype/pull/2969)
- Slack: Added default for review\_upload\_limit for Slack [\#2965](https://github.com/pypeclub/OpenPype/pull/2965)
- General: OIIO conversion for ffmpeg can handle sequences [\#2958](https://github.com/pypeclub/OpenPype/pull/2958)
- Settings: Conditional dictionary avoid invalid logs [\#2956](https://github.com/pypeclub/OpenPype/pull/2956)
- General: Smaller fixes and typos [\#2950](https://github.com/pypeclub/OpenPype/pull/2950)
- LogViewer: Don't refresh on initialization [\#2949](https://github.com/pypeclub/OpenPype/pull/2949)
- nuke: python3 compatibility issue with `iteritems` [\#2948](https://github.com/pypeclub/OpenPype/pull/2948)
- General: anatomy data with correct task short key [\#2947](https://github.com/pypeclub/OpenPype/pull/2947)

@@ -39,20 +97,21 @@
- Settings UI: Collapsed of collapsible wrapper works as expected [\#2934](https://github.com/pypeclub/OpenPype/pull/2934)
- Maya: Do not pass `set` to maya commands \(fixes support for older maya versions\) [\#2932](https://github.com/pypeclub/OpenPype/pull/2932)
- General: Don't print log record on OSError [\#2926](https://github.com/pypeclub/OpenPype/pull/2926)
- Hiero: Fix import of 'register\_event\_callback' [\#2924](https://github.com/pypeclub/OpenPype/pull/2924)
- Ftrack: Missing Ftrack id after editorial publish [\#2905](https://github.com/pypeclub/OpenPype/pull/2905)
- AfterEffects: Fix rendering for single frame in DL [\#2875](https://github.com/pypeclub/OpenPype/pull/2875)
- Flame: centos related debugging [\#2922](https://github.com/pypeclub/OpenPype/pull/2922)

**🔀 Refactored code**

- General: Move plugins register and discover [\#2935](https://github.com/pypeclub/OpenPype/pull/2935)
- General: Move Attribute Definitions from pipeline [\#2931](https://github.com/pypeclub/OpenPype/pull/2931)
- General: Removed silo references and terminal splash [\#2927](https://github.com/pypeclub/OpenPype/pull/2927)
- General: Move pipeline constants to OpenPype [\#2918](https://github.com/pypeclub/OpenPype/pull/2918)
- General: Move formatting and workfile functions [\#2914](https://github.com/pypeclub/OpenPype/pull/2914)
- General: Move remaining plugins from avalon [\#2912](https://github.com/pypeclub/OpenPype/pull/2912)

**Merged pull requests:**

- Bump paramiko from 2.9.2 to 2.10.1 [\#2973](https://github.com/pypeclub/OpenPype/pull/2973)
- Bump minimist from 1.2.5 to 1.2.6 in /website [\#2954](https://github.com/pypeclub/OpenPype/pull/2954)
- Bump node-forge from 1.2.1 to 1.3.0 in /website [\#2953](https://github.com/pypeclub/OpenPype/pull/2953)
- Maya - added transparency into review creator [\#2952](https://github.com/pypeclub/OpenPype/pull/2952)

## [3.9.1](https://github.com/pypeclub/OpenPype/tree/3.9.1) (2022-03-18)

@@ -64,8 +123,6 @@
- General: Change how OPENPYPE\_DEBUG value is handled [\#2907](https://github.com/pypeclub/OpenPype/pull/2907)
- nuke: imageio adding ocio config version 1.2 [\#2897](https://github.com/pypeclub/OpenPype/pull/2897)
- Flame: support for comment with xml attribute overrides [\#2892](https://github.com/pypeclub/OpenPype/pull/2892)
- Nuke: ExtractReviewSlate can handle more codes and profiles [\#2879](https://github.com/pypeclub/OpenPype/pull/2879)
- Flame: sequence used for reference video [\#2869](https://github.com/pypeclub/OpenPype/pull/2869)

**🐛 Bug fixes**

@@ -74,75 +131,15 @@
- Pyblish Pype - ensure current state is correct when entering new group order [\#2899](https://github.com/pypeclub/OpenPype/pull/2899)
- SceneInventory: Fix import of load function [\#2894](https://github.com/pypeclub/OpenPype/pull/2894)
- Harmony - fixed creator issue [\#2891](https://github.com/pypeclub/OpenPype/pull/2891)
- General: Remove forgotten use of avalon Creator [\#2885](https://github.com/pypeclub/OpenPype/pull/2885)
- General: Avoid circular import [\#2884](https://github.com/pypeclub/OpenPype/pull/2884)
- Fixes for attaching loaded containers \(\#2837\) [\#2874](https://github.com/pypeclub/OpenPype/pull/2874)
- Maya: Deformer node ids validation plugin [\#2826](https://github.com/pypeclub/OpenPype/pull/2826)

**🔀 Refactored code**

- General: Reduce style usage to OpenPype repository [\#2889](https://github.com/pypeclub/OpenPype/pull/2889)
- General: Move loader logic from avalon to openpype [\#2886](https://github.com/pypeclub/OpenPype/pull/2886)

## [3.9.0](https://github.com/pypeclub/OpenPype/tree/3.9.0) (2022-03-14)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.0-nightly.9...3.9.0)

**Deprecated:**

- AssetCreator: Remove the tool [\#2845](https://github.com/pypeclub/OpenPype/pull/2845)

### 📖 Documentation

- Documentation: Change Photoshop & AfterEffects plugin path [\#2878](https://github.com/pypeclub/OpenPype/pull/2878)

**🚀 Enhancements**

- General: Subset name filtering in ExtractReview outputs [\#2872](https://github.com/pypeclub/OpenPype/pull/2872)
- NewPublisher: Descriptions and Icons in creator dialog [\#2867](https://github.com/pypeclub/OpenPype/pull/2867)
- NewPublisher: Changing task on publishing instance [\#2863](https://github.com/pypeclub/OpenPype/pull/2863)
- TrayPublisher: Choose project widget is more clear [\#2859](https://github.com/pypeclub/OpenPype/pull/2859)
- New: Validation exceptions [\#2841](https://github.com/pypeclub/OpenPype/pull/2841)
- Maya: add loaded containers to published instance [\#2837](https://github.com/pypeclub/OpenPype/pull/2837)
- Ftrack: Can sync fps as string [\#2836](https://github.com/pypeclub/OpenPype/pull/2836)
- General: Custom function for find executable [\#2822](https://github.com/pypeclub/OpenPype/pull/2822)

**🐛 Bug fixes**

- General: Missing time function [\#2877](https://github.com/pypeclub/OpenPype/pull/2877)
- Deadline: Fix plugin name for tile assemble [\#2868](https://github.com/pypeclub/OpenPype/pull/2868)
- Nuke: gizmo precollect fix [\#2866](https://github.com/pypeclub/OpenPype/pull/2866)
- General: Fix hardlink for windows [\#2864](https://github.com/pypeclub/OpenPype/pull/2864)
- General: ffmpeg was crashing on slate merge [\#2860](https://github.com/pypeclub/OpenPype/pull/2860)
- WebPublisher: Video file was published with one frame too many [\#2858](https://github.com/pypeclub/OpenPype/pull/2858)
- New Publisher: Error dialog got right styles [\#2857](https://github.com/pypeclub/OpenPype/pull/2857)
- General: Fix getattr callback on dynamic modules [\#2855](https://github.com/pypeclub/OpenPype/pull/2855)
- Nuke: slate resolution to input video resolution [\#2853](https://github.com/pypeclub/OpenPype/pull/2853)
- WebPublisher: Fix username stored in DB [\#2852](https://github.com/pypeclub/OpenPype/pull/2852)
- WebPublisher: Fix wrong number of frames for video file [\#2851](https://github.com/pypeclub/OpenPype/pull/2851)
- Nuke: Fix family test in validate\_write\_legacy to work with stillImage [\#2847](https://github.com/pypeclub/OpenPype/pull/2847)
- Nuke: fix multiple baking profile farm publishing [\#2842](https://github.com/pypeclub/OpenPype/pull/2842)
- Blender: Fixed parameters for FBX export of the camera [\#2840](https://github.com/pypeclub/OpenPype/pull/2840)
- Maya: Stop creation of reviews for Cryptomattes [\#2832](https://github.com/pypeclub/OpenPype/pull/2832)
- Deadline: Remove recreated event [\#2828](https://github.com/pypeclub/OpenPype/pull/2828)
- Deadline: Added missing events folder [\#2827](https://github.com/pypeclub/OpenPype/pull/2827)
- Settings: Missing document with OP versions may break start of OpenPype [\#2825](https://github.com/pypeclub/OpenPype/pull/2825)
- Deadline: more detailed temp file name for environment json [\#2824](https://github.com/pypeclub/OpenPype/pull/2824)
- General: Host name was formed from obsolete code [\#2821](https://github.com/pypeclub/OpenPype/pull/2821)
- Settings UI: Fix "Apply from" action [\#2820](https://github.com/pypeclub/OpenPype/pull/2820)
- Ftrack: Job killer with missing user [\#2819](https://github.com/pypeclub/OpenPype/pull/2819)
- Nuke: Use AVALON\_APP to get value for "app" key [\#2818](https://github.com/pypeclub/OpenPype/pull/2818)

**🔀 Refactored code**

- Refactor: move webserver tool to openpype [\#2876](https://github.com/pypeclub/OpenPype/pull/2876)
- General: Move create logic from avalon to OpenPype [\#2854](https://github.com/pypeclub/OpenPype/pull/2854)
- General: Add vendors from avalon [\#2848](https://github.com/pypeclub/OpenPype/pull/2848)
- General: Basic event system [\#2846](https://github.com/pypeclub/OpenPype/pull/2846)
- General: Move change context functions [\#2839](https://github.com/pypeclub/OpenPype/pull/2839)
- Tools: Don't use avalon tools code [\#2829](https://github.com/pypeclub/OpenPype/pull/2829)
- Move Unreal Implementation to OpenPype [\#2823](https://github.com/pypeclub/OpenPype/pull/2823)

## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.8.2-nightly.3...3.8.2)

@@ -2,20 +2,16 @@
"""Pype module."""
import os
import platform
import functools
import logging

from .settings import get_project_settings
from .lib import (
    Anatomy,
    filter_pyblish_plugins,
    set_plugin_attributes_from_settings,
    change_timer_to_current_context,
    register_event_callback,
)

pyblish = avalon = _original_discover = None

log = logging.getLogger(__name__)

@@ -27,60 +23,17 @@ PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")


def import_wrapper(func):
    """Wrap module imports to specific functions."""
    @functools.wraps(func)
    def decorated(*args, **kwargs):
        global pyblish
        global avalon
        global _original_discover
        if pyblish is None:
            from pyblish import api as pyblish
            from avalon import api as avalon

            # we are monkey patching `avalon.api.discover()` to allow us to
            # load plugin presets on plugins being discovered by avalon.
            # Little bit of hacking, but it allows us to add our own features
            # without need to modify upstream code.

            _original_discover = avalon.discover

        return func(*args, **kwargs)

    return decorated


@import_wrapper
def patched_discover(superclass):
    """Patch `avalon.api.discover()`.

    Monkey patched version of :func:`avalon.api.discover()`. It allows
    us to load presets on plugins being discovered.
    """
    # run original discover and get plugins
    plugins = _original_discover(superclass)
    filtered_plugins = [
        plugin
        for plugin in plugins
        if issubclass(plugin, superclass)
    ]

    set_plugin_attributes_from_settings(filtered_plugins, superclass)

    return filtered_plugins


@import_wrapper
def install():
-    """Install Pype to Avalon."""
+    """Install OpenPype to Avalon."""
    import avalon.api
    import pyblish.api
    from pyblish.lib import MessageHandler
    from openpype.modules import load_modules
    from openpype.pipeline import (
        LegacyCreator,
        register_loader_plugin_path,
        register_inventory_action,
        register_creator_plugin_path,
    )
    from avalon import pipeline

    # Make sure modules are loaded
    load_modules()

@@ -93,8 +46,8 @@ def install():
    MessageHandler.emit = modified_emit

    log.info("Registering global plug-ins..")
-    pyblish.register_plugin_path(PUBLISH_PATH)
-    pyblish.register_discovery_filter(filter_pyblish_plugins)
+    pyblish.api.register_plugin_path(PUBLISH_PATH)
+    pyblish.api.register_discovery_filter(filter_pyblish_plugins)
    register_loader_plugin_path(LOAD_PATH)

    project_name = os.environ.get("AVALON_PROJECT")

@@ -103,7 +56,7 @@ def install():
    if project_name:
        anatomy = Anatomy(project_name)
        anatomy.set_root_environments()
-        avalon.register_root(anatomy.roots)
+        avalon.api.register_root(anatomy.roots)

        project_settings = get_project_settings(project_name)
        platform_name = platform.system().lower()

@@ -122,17 +75,14 @@ def install():
        if not path or not os.path.exists(path):
            continue

-        pyblish.register_plugin_path(path)
+        pyblish.api.register_plugin_path(path)
        register_loader_plugin_path(path)
-        avalon.register_plugin_path(LegacyCreator, path)
+        register_creator_plugin_path(path)
        register_inventory_action(path)

    # apply monkey patched discover to original one
    log.info("Patching discovery")

-    avalon.discover = patched_discover
+    pipeline.discover = patched_discover

    register_event_callback("taskChanged", _on_task_change)

@@ -140,16 +90,13 @@ def _on_task_change():
    change_timer_to_current_context()


@import_wrapper
def uninstall():
    """Uninstall Pype from Avalon."""
    import pyblish.api
    from openpype.pipeline import deregister_loader_plugin_path

    log.info("Deregistering global plug-ins..")
-    pyblish.deregister_plugin_path(PUBLISH_PATH)
-    pyblish.deregister_discovery_filter(filter_pyblish_plugins)
+    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
+    pyblish.api.deregister_discovery_filter(filter_pyblish_plugins)
    deregister_loader_plugin_path(LOAD_PATH)
    log.info("Global plug-ins unregistered")

    # restore original discover
    avalon.discover = _original_discover
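
For orientation, the wrap-and-swap idea behind `import_wrapper`/`patched_discover` reduces to a few lines. A self-contained sketch with stand-in functions (not the real avalon/pyblish APIs):

```python
# Stand-in for the upstream discovery function that would normally live
# in avalon.api; everything here is illustrative only.
def discover(superclass):
    return [list, dict, tuple]

_original_discover = discover

def patched_discover(superclass):
    # Run the original discovery, then post-process the result; this is
    # where OpenPype applies plugin presets loaded from settings.
    plugins = _original_discover(superclass)
    return [plugin for plugin in plugins if issubclass(plugin, superclass)]

# Swap the module-level reference: callers that look the name up at call
# time now get the patched version, while the saved original allows the
# clean restore seen in uninstall().
discover = patched_discover
print(discover(object))
```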
@@ -5,15 +5,15 @@ from Qt import QtWidgets
from bson.objectid import ObjectId

import pyblish.api
import avalon.api
from avalon import io

from openpype import lib
from openpype.api import Logger
from openpype.pipeline import (
    LegacyCreator,
    register_loader_plugin_path,
    register_creator_plugin_path,
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
import openpype.hosts.aftereffects

@@ -73,7 +73,7 @@ def install():
    pyblish.api.register_plugin_path(PUBLISH_PATH)

    register_loader_plugin_path(LOAD_PATH)
-    avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)
    log.info(PUBLISH_PATH)

    pyblish.api.register_callback(

@@ -86,7 +86,7 @@ def install():
def uninstall():
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
    deregister_loader_plugin_path(LOAD_PATH)
-    avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)


def on_pyblish_instance_toggled(instance, old_value, new_value):
@@ -5,14 +5,6 @@ from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .launch_logic import get_stub


-def _active_document():
-    document_name = get_stub().get_active_document_name()
-    if not document_name:
-        return None
-
-    return document_name


def file_extensions():
    return HOST_WORKFILE_EXTENSIONS["aftereffects"]

@@ -39,7 +31,8 @@ def current_file():
        full_name = get_stub().get_active_document_full_name()
        if full_name and full_name != "null":
            return os.path.normpath(full_name).replace("\\", "/")
-    except Exception:
+    except ValueError:
+        print("Nothing opened")
        pass

    return None

@@ -47,3 +40,15 @@ def current_file():

def work_root(session):
    return os.path.normpath(session["AVALON_WORKDIR"]).replace("\\", "/")


+def _active_document():
+    # TODO merge with current_file - even in extension
+    document_name = None
+    try:
+        document_name = get_stub().get_active_document_name()
+    except ValueError:
+        print("Nothing opened")
+        pass
+
+    return document_name
@@ -1,12 +1,14 @@
-from openpype.pipeline import create
-from openpype.pipeline import CreatorError
+from openpype.pipeline import (
+    CreatorError,
+    LegacyCreator
+)
from openpype.hosts.aftereffects.api import (
    get_stub,
    list_instances
)


-class CreateRender(create.LegacyCreator):
+class CreateRender(LegacyCreator):
    """Render folder for publish.

    Creates subsets in format 'familyTaskSubsetname',
@@ -1,6 +1,7 @@
import os
from avalon import api
import pyblish.api
+from openpype.lib import get_subset_name_with_asset_doc


class CollectWorkfile(pyblish.api.ContextPlugin):

@@ -38,7 +39,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
        # workfile instance
        family = "workfile"
-        subset = family + task.capitalize()
+        subset = get_subset_name_with_asset_doc(
+            family,
+            "",
+            context.data["anatomyData"]["task"]["name"],
+            context.data["assetEntity"],
+            context.data["anatomyData"]["project"]["name"],
+            host_name=context.data["hostName"]
+        )
        # Create instance
        instance = context.create_instance(subset)
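
This hunk (and the matching Harmony hunk further down) swaps hard-coded concatenation for settings-driven naming via `get_subset_name_with_asset_doc`. A rough sketch of the difference, using a hypothetical template (the real one comes from project settings):

```python
family = "workfile"
task_name = "compositing"

# Old behaviour: bare concatenation, ignoring any studio convention.
old_subset = family + task_name.capitalize()  # "workfileCompositing"

# New behaviour (sketched): fill a configurable template, so studios can
# change the naming convention without touching the collector code.
template = "{family}{Task}"  # hypothetical template string
new_subset = template.format(family=family, Task=task_name.capitalize())
assert new_subset == old_subset
```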
@@ -29,12 +29,12 @@ def add_implementation_envs(env, _app):
        env.get("OPENPYPE_BLENDER_USER_SCRIPTS") or ""
    )
    for path in openpype_blender_user_scripts.split(os.pathsep):
-        if path and os.path.exists(path):
+        if path:
            previous_user_scripts.add(os.path.normpath(path))

    blender_user_scripts = env.get("BLENDER_USER_SCRIPTS") or ""
    for path in blender_user_scripts.split(os.pathsep):
-        if path and os.path.exists(path):
+        if path:
            previous_user_scripts.add(os.path.normpath(path))

    # Remove implementation path from user script paths as is set to
@@ -14,9 +14,10 @@ import avalon.api
from avalon import io, schema

from openpype.pipeline import (
    LegacyCreator,
    register_loader_plugin_path,
    register_creator_plugin_path,
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
from openpype.api import Logger

@@ -54,7 +55,7 @@ def install():
    pyblish.api.register_plugin_path(str(PUBLISH_PATH))

    register_loader_plugin_path(str(LOAD_PATH))
-    avalon.api.register_plugin_path(LegacyCreator, str(CREATE_PATH))
+    register_creator_plugin_path(str(CREATE_PATH))

    lib.append_user_scripts()

@@ -76,7 +77,7 @@ def uninstall():
    pyblish.api.deregister_plugin_path(str(PUBLISH_PATH))

    deregister_loader_plugin_path(str(LOAD_PATH))
-    avalon.api.deregister_plugin_path(LegacyCreator, str(CREATE_PATH))
+    deregister_creator_plugin_path(str(CREATE_PATH))

    if not IS_HEADLESS:
        ops.unregister()
@@ -3,14 +3,14 @@ Basic avalon integration
"""
import os
import contextlib
from avalon import api as avalon
from pyblish import api as pyblish

from openpype.api import Logger
from openpype.pipeline import (
    LegacyCreator,
    register_loader_plugin_path,
    register_creator_plugin_path,
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
from .lib import (

@@ -37,7 +37,7 @@ def install():
    pyblish.register_host("flame")
    pyblish.register_plugin_path(PUBLISH_PATH)
    register_loader_plugin_path(LOAD_PATH)
-    avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)
    log.info("OpenPype Flame plug-ins registered ...")

    # register callback for switching publishable

@@ -52,7 +52,7 @@ def uninstall():
    log.info("Deregistering Flame plug-ins..")
    pyblish.deregister_plugin_path(PUBLISH_PATH)
    deregister_loader_plugin_path(LOAD_PATH)
-    avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)

    # deregister callback for switching publishable
    pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)
@@ -7,14 +7,14 @@ import logging
import contextlib

import pyblish.api
import avalon.api

from openpype.api import Logger
from openpype.pipeline import (
    LegacyCreator,
    register_loader_plugin_path,
-    deregister_loader_plugin_path,
    register_creator_plugin_path,
    register_inventory_action_path,
+    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    deregister_inventory_action_path,
    AVALON_CONTAINER_ID,
)

@@ -70,7 +70,7 @@ def install():
    log.info("Registering Fusion plug-ins..")

    register_loader_plugin_path(LOAD_PATH)
-    avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)
    register_inventory_action_path(INVENTORY_PATH)

    pyblish.api.register_callback(

@@ -94,7 +94,7 @@ def uninstall():
    log.info("Deregistering Fusion plug-ins..")

    deregister_loader_plugin_path(LOAD_PATH)
-    avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)
    deregister_inventory_action_path(INVENTORY_PATH)

    pyblish.api.deregister_callback(
@@ -1,13 +1,13 @@
import os

-from openpype.pipeline import create
+from openpype.pipeline import LegacyCreator
from openpype.hosts.fusion.api import (
    get_current_comp,
    comp_lock_and_undo_chunk
)


-class CreateOpenEXRSaver(create.LegacyCreator):
+class CreateOpenEXRSaver(LegacyCreator):

    name = "openexrDefault"
    label = "Create OpenEXR Saver"
@@ -6,14 +6,14 @@ from bson.objectid import ObjectId
import pyblish.api

from avalon import io
import avalon.api

from openpype import lib
from openpype.lib import register_event_callback
from openpype.pipeline import (
    LegacyCreator,
    register_loader_plugin_path,
    register_creator_plugin_path,
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
import openpype.hosts.harmony

@@ -108,9 +108,8 @@ def check_inventory():
    if not lib.any_outdated():
        return

-    host = avalon.api.registered_host()
    outdated_containers = []
-    for container in host.ls():
+    for container in ls():
        representation = container['representation']
        representation_doc = io.find_one(
            {

@@ -186,7 +185,7 @@ def install():
    pyblish.api.register_host("harmony")
    pyblish.api.register_plugin_path(PUBLISH_PATH)
    register_loader_plugin_path(LOAD_PATH)
-    avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)
    log.info(PUBLISH_PATH)

    # Register callbacks.

@@ -200,7 +199,7 @@ def install():
def uninstall():
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
    deregister_loader_plugin_path(LOAD_PATH)
-    avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)


def on_pyblish_instance_toggled(instance, old_value, new_value):
@@ -3,6 +3,8 @@
import pyblish.api
import os

+from openpype.lib import get_subset_name_with_asset_doc


class CollectWorkfile(pyblish.api.ContextPlugin):
    """Collect current script for publish."""

@@ -14,10 +16,15 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
    def process(self, context):
        """Plugin entry point."""
        family = "workfile"
-        task = os.getenv("AVALON_TASK", None)
-        sanitized_task_name = task[0].upper() + task[1:]
        basename = os.path.basename(context.data["currentFile"])
-        subset = "{}{}".format(family, sanitized_task_name)
+        subset = get_subset_name_with_asset_doc(
+            family,
+            "",
+            context.data["anatomyData"]["task"]["name"],
+            context.data["assetEntity"],
+            context.data["anatomyData"]["project"]["name"],
+            host_name=context.data["hostName"]
+        )

        # Create instance
        instance = context.create_instance(subset)
@@ -10,7 +10,7 @@ def add_implementation_envs(env, _app):
    ]
    old_hiero_path = env.get("HIERO_PLUGIN_PATH") or ""
    for path in old_hiero_path.split(os.pathsep):
-        if not path or not os.path.exists(path):
+        if not path:
            continue

        norm_path = os.path.normpath(path)
@@ -5,13 +5,13 @@ import os
import contextlib
from collections import OrderedDict

from avalon import api as avalon
from avalon import schema
from pyblish import api as pyblish
from openpype.api import Logger
from openpype.pipeline import (
    LegacyCreator,
    register_creator_plugin_path,
    register_loader_plugin_path,
    deregister_creator_plugin_path,
    deregister_loader_plugin_path,
    AVALON_CONTAINER_ID,
)

@@ -50,7 +50,7 @@ def install():
    pyblish.register_host("hiero")
    pyblish.register_plugin_path(PUBLISH_PATH)
    register_loader_plugin_path(LOAD_PATH)
-    avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)

    # register callback for switching publishable
    pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)

@@ -71,7 +71,7 @@ def uninstall():
    pyblish.deregister_host("hiero")
    pyblish.deregister_plugin_path(PUBLISH_PATH)
    deregister_loader_plugin_path(LOAD_PATH)
-    avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)

    # deregister callback for switching publishable
    pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)
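
Across the Flame, Fusion, Harmony, Hiero, Houdini and Maya hunks the same refactor recurs: host-specific `avalon.*_plugin_path(LegacyCreator, ...)` calls give way to OpenPype's own registration helpers. A minimal sketch of the install/uninstall symmetry the hosts converge on (the paths are hypothetical placeholders):

```python
import pyblish.api
from openpype.pipeline import (
    register_loader_plugin_path,
    register_creator_plugin_path,
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
)

PUBLISH_PATH = "/openpype/hosts/example/plugins/publish"  # hypothetical
LOAD_PATH = "/openpype/hosts/example/plugins/load"        # hypothetical
CREATE_PATH = "/openpype/hosts/example/plugins/create"    # hypothetical

def install():
    # Every registration made here...
    pyblish.api.register_plugin_path(PUBLISH_PATH)
    register_loader_plugin_path(LOAD_PATH)
    register_creator_plugin_path(CREATE_PATH)

def uninstall():
    # ...has its mirrored deregistration here.
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
    deregister_loader_plugin_path(LOAD_PATH)
    deregister_creator_plugin_path(CREATE_PATH)
```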
@@ -15,7 +15,7 @@ def add_implementation_envs(env, _app):
    old_houdini_menu_path = env.get("HOUDINI_MENU_PATH") or ""

    for path in old_houdini_path.split(os.pathsep):
-        if not path or not os.path.exists(path):
+        if not path:
            continue

        norm_path = os.path.normpath(path)

@@ -23,7 +23,7 @@ def add_implementation_envs(env, _app):
        new_houdini_path.append(norm_path)

    for path in old_houdini_menu_path.split(os.pathsep):
-        if not path or not os.path.exists(path):
+        if not path:
            continue

        norm_path = os.path.normpath(path)
@@ -11,7 +11,7 @@ import avalon.api
from avalon.lib import find_submodule

from openpype.pipeline import (
    LegacyCreator,
    register_creator_plugin_path,
    register_loader_plugin_path,
    AVALON_CONTAINER_ID,
)

@@ -54,7 +54,7 @@ def install():
    pyblish.api.register_plugin_path(PUBLISH_PATH)
    register_loader_plugin_path(LOAD_PATH)
-    avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)

    log.info("Installing callbacks ... ")
    # register_event_callback("init", on_init)
@@ -9,7 +9,7 @@ def add_implementation_envs(env, _app):
    ]
    old_python_path = env.get("PYTHONPATH") or ""
    for path in old_python_path.split(os.pathsep):
-        if not path or not os.path.exists(path):
+        if not path:
            continue

        norm_path = os.path.normpath(path)
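
The `add_implementation_envs` hunks for Blender, Hiero, Houdini and this host all make the same change (PR [\#3004](https://github.com/pypeclub/OpenPype/pull/3004)): keep every non-empty entry, normalized, and stop checking existence up front. The shared pattern as a standalone sketch:

```python
import os

def clean_path_list(value):
    """Split an env-style path list, keeping non-empty normalized entries.

    Existence is deliberately not checked: a path may be created later or
    resolve only on the machine that finally uses the variable.
    """
    paths = []
    for path in (value or "").split(os.pathsep):
        if not path:
            continue
        paths.append(os.path.normpath(path))
    return paths

# Empty entries are dropped; redundant separators and "." are normalized.
print(clean_path_list("/a//b" + os.pathsep + os.pathsep + "/c/./d"))
```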
openpype/hosts/maya/api/fbx.py (new file, 202 lines)

@@ -0,0 +1,202 @@
# -*- coding: utf-8 -*-
"""Tools to work with FBX."""
import logging

from pyblish.api import Instance

from maya import cmds  # noqa
import maya.mel as mel  # noqa


class FBXExtractor:
    """Extract FBX from Maya.

    This extracts reproducible FBX exports ignoring any of the settings set
    on the local machine in the FBX export options window.

    All export settings are applied with the `FBXExport*` commands prior
    to the `FBXExport` call itself. The options can be overridden with their
    nice names as seen in the "options" property on this class.

    For more information on FBX exports see:
    - https://knowledge.autodesk.com/support/maya/learn-explore/caas
    /CloudHelp/cloudhelp/2016/ENU/Maya/files/GUID-6CCE943A-2ED4-4CEE-96D4
    -9CB19C28F4E0-htm.html
    - http://forums.cgsociety.org/archive/index.php?t-1032853.html
    - https://groups.google.com/forum/#!msg/python_inside_maya/cLkaSo361oE
    /LKs9hakE28kJ

    """
    @property
    def options(self):
        """Overridable options for FBX Export

        Given in the following format
            - {NAME: EXPECTED TYPE}

        If the overridden option's type does not match,
        the option is not included and a warning is logged.

        """
        return {
            "cameras": bool,
            "smoothingGroups": bool,
            "hardEdges": bool,
            "tangents": bool,
            "smoothMesh": bool,
            "instances": bool,
            # "referencedContainersContent": bool,  # deprecated in Maya 2016+
            "bakeComplexAnimation": int,
            "bakeComplexStart": int,
            "bakeComplexEnd": int,
            "bakeComplexStep": int,
            "bakeResampleAnimation": bool,
            "animationOnly": bool,
            "useSceneName": bool,
            "quaternion": str,  # "euler"
            "shapes": bool,
            "skins": bool,
            "constraints": bool,
            "lights": bool,
            "embeddedTextures": bool,
            "inputConnections": bool,
            "upAxis": str,  # x, y or z,
            "triangulate": bool
        }

    @property
    def default_options(self):
        """The default options for FBX extraction.

        This includes shapes, skins, constraints, lights and incoming
        connections and exports with the Y-axis as up-axis.

        By default this uses the time slider's start and end time.

        """
        start_frame = int(cmds.playbackOptions(query=True,
                                               animationStartTime=True))
        end_frame = int(cmds.playbackOptions(query=True,
                                             animationEndTime=True))

        return {
            "cameras": False,
            "smoothingGroups": True,
            "hardEdges": False,
            "tangents": False,
            "smoothMesh": True,
            "instances": False,
            "bakeComplexAnimation": True,
            "bakeComplexStart": start_frame,
            "bakeComplexEnd": end_frame,
            "bakeComplexStep": 1,
            "bakeResampleAnimation": True,
            "animationOnly": False,
            "useSceneName": False,
            "quaternion": "euler",
            "shapes": True,
            "skins": True,
            "constraints": False,
            "lights": True,
            "embeddedTextures": False,
            "inputConnections": True,
            "upAxis": "y",
            "triangulate": False
        }

    def __init__(self, log=None):
        self.log = log or logging.getLogger(self.__class__.__name__)
        # Ensure FBX plug-in is loaded
        cmds.loadPlugin("fbxmaya", quiet=True)

    def parse_overrides(self, instance, options):
        """Inspect data of instance to determine overridden options

        An instance may supply any of the overridable options
        as data, the option is then added to the extraction.

        """
        for key in instance.data:
            if key not in self.options:
                continue

            # Ensure the data is of correct type
            value = instance.data[key]
            if not isinstance(value, self.options[key]):
                self.log.warning(
                    "Overridden attribute {key} was of "
                    "the wrong type: {invalid_type} "
                    "- should have been {valid_type}".format(
                        key=key,
                        invalid_type=type(value).__name__,
                        valid_type=self.options[key].__name__))
                continue

            options[key] = value

        return options

    def set_options_from_instance(self, instance):
        # type: (Instance) -> None
        """Sets FBX export options from data in the instance.

        Args:
            instance (Instance): Instance data.

        """
        # Parse export options
        options = self.default_options
        options = self.parse_overrides(instance, options)
        self.log.info("Export options: {0}".format(options))

        # Collect the start and end including handles
        start = instance.data.get("frameStartHandle") or \
            instance.context.data.get("frameStartHandle")
        end = instance.data.get("frameEndHandle") or \
            instance.context.data.get("frameEndHandle")

        options['bakeComplexStart'] = start
        options['bakeComplexEnd'] = end

        # First apply the default export settings to be fully consistent
        # each time for successive publishes
        mel.eval("FBXResetExport")

        # Apply the FBX overrides through MEL since the commands
        # only work correctly in MEL according to online
        # available discussions on the topic
        _iteritems = getattr(options, "iteritems", options.items)
        for option, value in _iteritems():
            key = option[0].upper() + option[1:]  # uppercase first letter

            # Booleans must be passed as lower-case strings
            # as per MEL standards
            if isinstance(value, bool):
                value = str(value).lower()

            template = "FBXExport{0} {1}" if key == "UpAxis" else \
                "FBXExport{0} -v {1}"  # noqa
            cmd = template.format(key, value)
            self.log.info(cmd)
            mel.eval(cmd)

        # Never show the UI or generate a log
        mel.eval("FBXExportShowUI -v false")
        mel.eval("FBXExportGenerateLog -v false")

    @staticmethod
    def export(members, path):
        # type: (list, str) -> None
        """Export members as FBX with given path.

        Args:
            members (list): List of members to export.
            path (str): Path to use for export.

        """
        cmds.select(members, r=True, noExpand=True)
        mel.eval('FBXExport -f "{}" -s'.format(path))
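
A rough usage sketch of `FBXExtractor` from an extractor plug-in; the staging-dir handling and the `setMembers` key are simplifying assumptions, not the real extractor code:

```python
import os

from openpype.hosts.maya.api import fbx

def extract_fbx(instance, staging_dir):
    extractor = fbx.FBXExtractor()  # falls back to a module-level logger

    # Instance data such as "triangulate" or "upAxis" overrides the
    # defaults via parse_overrides() inside set_options_from_instance().
    extractor.set_options_from_instance(instance)

    path = os.path.join(staging_dir, "{}.fbx".format(instance.name))
    members = instance.data.get("setMembers") or instance[:]  # assumption
    extractor.export(members, path.replace("\\", "/"))
    return path
```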
@@ -3138,11 +3138,20 @@ def set_colorspace():
@contextlib.contextmanager
-def root_parent(nodes):
-    # type: (list) -> list
+def parent_nodes(nodes, parent=None):
+    # type: (list, str) -> list
    """Context manager to un-parent provided nodes and return them back."""
    import pymel.core as pm  # noqa

+    parent_node = None
+    delete_parent = False
+
+    if parent:
+        if not cmds.objExists(parent):
+            parent_node = pm.createNode("transform", n=parent, ss=False)
+            delete_parent = True
+        else:
+            parent_node = pm.PyNode(parent)
    node_parents = []
    for node in nodes:
        n = pm.PyNode(node)

@@ -3153,9 +3162,14 @@ def root_parent(nodes):
        node_parents.append((n, root))
    try:
        for node in node_parents:
-            node[0].setParent(world=True)
+            if not parent:
+                node[0].setParent(world=True)
+            else:
+                node[0].setParent(parent_node)
        yield
    finally:
        for node in node_parents:
            if node[1]:
                node[0].setParent(node[1])
+        if delete_parent:
+            pm.delete(parent_node)
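
A hypothetical use of the renamed `parent_nodes` context manager (the DAG paths and parent name are made up):

```python
from openpype.hosts.maya.api.lib import parent_nodes

members = ["|char_GRP|body_GEO", "|char_GRP|head_GEO"]  # example nodes

# Temporarily parent the nodes under one (possibly newly created) root,
# e.g. to get a single top-level transform into an export; on exit the
# nodes return to their original parents and a created root is deleted.
with parent_nodes(members, parent="EXPORT_ROOT"):
    pass  # run the export here
```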
@@ -9,8 +9,6 @@ import maya.api.OpenMaya as om
import pyblish.api
import avalon.api

-from avalon.lib import find_submodule
-
import openpype.hosts.maya
from openpype.tools.utils import host_tools
from openpype.lib import (

@@ -20,11 +18,12 @@ from openpype.lib import (
)
from openpype.lib.path_tools import HostDirmap
from openpype.pipeline import (
    LegacyCreator,
    register_loader_plugin_path,
    register_inventory_action_path,
    register_creator_plugin_path,
    deregister_loader_plugin_path,
    deregister_inventory_action_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
from openpype.hosts.maya.lib import copy_workspace_mel

@@ -60,7 +59,7 @@ def install():
    pyblish.api.register_host("maya")

    register_loader_plugin_path(LOAD_PATH)
-    avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)
    register_inventory_action_path(INVENTORY_PATH)
    log.info(PUBLISH_PATH)

@@ -189,7 +188,7 @@ def uninstall():
    pyblish.api.deregister_host("maya")

    deregister_loader_plugin_path(LOAD_PATH)
-    avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)
    deregister_inventory_action_path(INVENTORY_PATH)

    menu.uninstall()

@@ -268,21 +267,8 @@ def ls():
    """
    container_names = _ls()

-    has_metadata_collector = False
-    config_host = find_submodule(avalon.api.registered_config(), "maya")
-    if hasattr(config_host, "collect_container_metadata"):
-        has_metadata_collector = True
-
    for container in sorted(container_names):
-        data = parse_container(container)
-
-        # Collect custom data if attribute is present
-        if has_metadata_collector:
-            metadata = config_host.collect_container_metadata(container)
-            data.update(metadata)
-
-        yield data
+        yield parse_container(container)


def containerise(name,
@@ -252,6 +252,7 @@ class CreateRender(plugin.Creator):
        """Create instance settings."""
        # get pools
        pool_names = []
+        default_priority = 50

        self.server_aliases = list(self.deadline_servers.keys())
        self.data["deadlineServers"] = self.server_aliases

@@ -260,7 +261,8 @@ class CreateRender(plugin.Creator):
        self.data["extendFrames"] = False
        self.data["overrideExistingFrame"] = True
        # self.data["useLegacyRenderLayers"] = True
-        self.data["priority"] = 50
+        self.data["priority"] = default_priority
+        self.data["tile_priority"] = default_priority
        self.data["framesPerTask"] = 1
        self.data["whitelist"] = False
        self.data["machineList"] = ""

@@ -294,6 +296,16 @@ class CreateRender(plugin.Creator):
            deadline_url = next(iter(self.deadline_servers.values()))

        pool_names = self._get_deadline_pools(deadline_url)
+        maya_submit_dl = self._project_settings.get(
+            "deadline", {}).get(
+                "publish", {}).get(
+                    "MayaSubmitDeadline", {})
+        priority = maya_submit_dl.get("priority", default_priority)
+        self.data["priority"] = priority
+
+        tile_priority = maya_submit_dl.get("tile_priority",
+                                           default_priority)
+        self.data["tile_priority"] = tile_priority

        if muster_enabled:
            self.log.info(">>> Loading Muster credentials ...")
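
The chained `.get(..., {})` calls above walk the settings tree without raising when a level is missing; a self-contained sketch with made-up settings data:

```python
# Hypothetical settings document; only the shape matters here.
project_settings = {
    "deadline": {"publish": {"MayaSubmitDeadline": {"priority": 80}}}
}
default_priority = 50

maya_submit_dl = project_settings.get(
    "deadline", {}).get("publish", {}).get("MayaSubmitDeadline", {})

# Every missing level degrades to {} so the final .get() default applies.
print(maya_submit_dl.get("priority", default_priority))       # 80
print(maya_submit_dl.get("tile_priority", default_priority))  # 50
```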
@@ -0,0 +1,50 @@
# -*- coding: utf-8 -*-
"""Creator for Unreal Skeletal Meshes."""
from openpype.hosts.maya.api import plugin, lib
from avalon.api import Session
from maya import cmds  # noqa


class CreateUnrealSkeletalMesh(plugin.Creator):
    """Unreal Skeletal Meshes with joints."""
    name = "staticMeshMain"
    label = "Unreal - Skeletal Mesh"
    family = "skeletalMesh"
    icon = "thumbs-up"
    dynamic_subset_keys = ["asset"]

    joint_hints = []

    def __init__(self, *args, **kwargs):
        """Constructor."""
        super(CreateUnrealSkeletalMesh, self).__init__(*args, **kwargs)

    @classmethod
    def get_dynamic_data(
        cls, variant, task_name, asset_id, project_name, host_name
    ):
        dynamic_data = super(CreateUnrealSkeletalMesh, cls).get_dynamic_data(
            variant, task_name, asset_id, project_name, host_name
        )
        dynamic_data["asset"] = Session.get("AVALON_ASSET")
        return dynamic_data

    def process(self):
        self.name = "{}_{}".format(self.family, self.name)
        with lib.undo_chunk():
            instance = super(CreateUnrealSkeletalMesh, self).process()
            content = cmds.sets(instance, query=True)

            # empty the set and process its former content
            cmds.sets(content, rm=instance)
            geometry_set = cmds.sets(name="geometry_SET", empty=True)
            joints_set = cmds.sets(name="joints_SET", empty=True)

            cmds.sets([geometry_set, joints_set], forceElement=instance)
            members = cmds.ls(content) or []

            for node in members:
                if node in self.joint_hints:
                    cmds.sets(node, forceElement=joints_set)
                else:
                    cmds.sets(node, forceElement=geometry_set)
@@ -10,7 +10,7 @@ class CreateUnrealStaticMesh(plugin.Creator):
    """Unreal Static Meshes with collisions."""
    name = "staticMeshMain"
    label = "Unreal - Static Mesh"
-    family = "unrealStaticMesh"
+    family = "staticMesh"
    icon = "cube"
    dynamic_subset_keys = ["asset"]

@@ -28,10 +28,10 @@ class CreateUnrealStaticMesh(plugin.Creator):
            variant, task_name, asset_id, project_name, host_name
        )
        dynamic_data["asset"] = Session.get("AVALON_ASSET")

        return dynamic_data

    def process(self):
        self.name = "{}_{}".format(self.family, self.name)
        with lib.undo_chunk():
            instance = super(CreateUnrealStaticMesh, self).process()
            content = cmds.sets(instance, query=True)
@@ -4,7 +4,6 @@ from bson.objectid import ObjectId
from openpype.pipeline import (
    InventoryAction,
    get_representation_context,
-    get_representation_path_from_context,
)
from openpype.hosts.maya.api.lib import (
    maintained_selection,

@@ -80,10 +79,10 @@ class ImportModelRender(InventoryAction):
        })

        context = get_representation_context(look_repr["_id"])
-        maya_file = get_representation_path_from_context(context)
+        maya_file = self.filepath_from_context(context)

        context = get_representation_context(json_repr["_id"])
-        json_file = get_representation_path_from_context(context)
+        json_file = self.filepath_from_context(context)

        # Import the look file
        with maintained_selection():
@@ -22,7 +22,8 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
                "camera",
                "rig",
                "camerarig",
-                "xgen"]
+                "xgen",
+                "staticMesh"]
    representations = ["ma", "abc", "fbx", "mb"]

    label = "Reference"
@@ -1,31 +0,0 @@
-# -*- coding: utf-8 -*-
-"""Cleanup leftover nodes."""
-from maya import cmds  # noqa
-import pyblish.api
-
-
-class CleanNodesUp(pyblish.api.InstancePlugin):
-    """Cleans up the staging directory after a successful publish.
-
-    This will also clean published renders and delete their parent directories.
-
-    """
-
-    order = pyblish.api.IntegratorOrder + 10
-    label = "Clean Nodes"
-    optional = True
-    active = True
-
-    def process(self, instance):
-        if not instance.data.get("cleanNodes"):
-            self.log.info("Nothing to clean.")
-            return
-
-        nodes_to_clean = instance.data.pop("cleanNodes", [])
-        self.log.info("Removing {} nodes".format(len(nodes_to_clean)))
-        for node in nodes_to_clean:
-            try:
-                cmds.delete(node)
-            except ValueError:
-                # object might be already deleted, don't complain about it
-                pass
@@ -0,0 +1,39 @@
# -*- coding: utf-8 -*-
from maya import cmds  # noqa
import pyblish.api


class CollectUnrealSkeletalMesh(pyblish.api.InstancePlugin):
    """Collect Unreal Skeletal Mesh."""

    order = pyblish.api.CollectorOrder + 0.2
    label = "Collect Unreal Skeletal Meshes"
    families = ["skeletalMesh"]

    def process(self, instance):
        frame = cmds.currentTime(query=True)
        instance.data["frameStart"] = frame
        instance.data["frameEnd"] = frame

        geo_sets = [
            i for i in instance[:]
            if i.lower().startswith("geometry_set")
        ]

        joint_sets = [
            i for i in instance[:]
            if i.lower().startswith("joints_set")
        ]

        instance.data["geometry"] = []
        instance.data["joints"] = []

        for geo_set in geo_sets:
            geo_content = cmds.ls(cmds.sets(geo_set, query=True), long=True)
            if geo_content:
                instance.data["geometry"] += geo_content

        for join_set in joint_sets:
            join_content = cmds.ls(cmds.sets(join_set, query=True), long=True)
            if join_content:
                instance.data["joints"] += join_content
@@ -1,38 +1,36 @@
# -*- coding: utf-8 -*-
-from maya import cmds
+from maya import cmds  # noqa
import pyblish.api
+from pprint import pformat


class CollectUnrealStaticMesh(pyblish.api.InstancePlugin):
-    """Collect Unreal Static Mesh
-
-    Ensures always only a single frame is extracted (current frame). This
-    also sets correct FBX options for later extraction.
-
-    """
+    """Collect Unreal Static Mesh."""

    order = pyblish.api.CollectorOrder + 0.2
    label = "Collect Unreal Static Meshes"
-    families = ["unrealStaticMesh"]
+    families = ["staticMesh"]

    def process(self, instance):
-        # add fbx family to trigger fbx extractor
-        instance.data["families"].append("fbx")
        # take the name from instance (without the `S_` prefix)
        instance.data["staticMeshCombinedName"] = instance.name[2:]

-        geometry_set = [i for i in instance if i == "geometry_SET"]
-        instance.data["membersToCombine"] = cmds.sets(
+        geometry_set = [
+            i for i in instance
+            if i.startswith("geometry_SET")
+        ]
+        instance.data["geometryMembers"] = cmds.sets(
            geometry_set, query=True)

-        collision_set = [i for i in instance if i == "collisions_SET"]
+        self.log.info("geometry: {}".format(
+            pformat(instance.data.get("geometryMembers"))))
+
+        collision_set = [
+            i for i in instance
+            if i.startswith("collisions_SET")
+        ]
        instance.data["collisionMembers"] = cmds.sets(
            collision_set, query=True)

-        # set fbx overrides on instance
-        instance.data["smoothingGroups"] = True
-        instance.data["smoothMesh"] = True
-        instance.data["triangulate"] = True
+        self.log.info("collisions: {}".format(
+            pformat(instance.data.get("collisionMembers"))))

        frame = cmds.currentTime(query=True)
        instance.data["frameStart"] = frame
@@ -5,152 +5,29 @@ from maya import cmds  # noqa
 import maya.mel as mel  # noqa
 import pyblish.api
 import openpype.api
-from openpype.hosts.maya.api.lib import (
-    root_parent,
-    maintained_selection
-)
+from openpype.hosts.maya.api.lib import maintained_selection
+
+from openpype.hosts.maya.api import fbx


 class ExtractFBX(openpype.api.Extractor):
     """Extract FBX from Maya.

-    This extracts reproducible FBX exports ignoring any of the settings set
-    on the local machine in the FBX export options window.
-
-    All export settings are applied with the `FBXExport*` commands prior
-    to the `FBXExport` call itself. The options can be overridden with their
-    nice names as seen in the "options" property on this class.
-
-    For more information on FBX exports see:
-    - https://knowledge.autodesk.com/support/maya/learn-explore/caas
-    /CloudHelp/cloudhelp/2016/ENU/Maya/files/GUID-6CCE943A-2ED4-4CEE-96D4
-    -9CB19C28F4E0-htm.html
-    - http://forums.cgsociety.org/archive/index.php?t-1032853.html
-    - https://groups.google.com/forum/#!msg/python_inside_maya/cLkaSo361oE
-    /LKs9hakE28kJ
+    This extracts reproducible FBX exports ignoring any of the
+    settings set on the local machine in the FBX export options window.

     """

     order = pyblish.api.ExtractorOrder
     label = "Extract FBX"
     families = ["fbx"]

-    @property
-    def options(self):
-        """Overridable options for FBX Export
-
-        Given in the following format
-            - {NAME: EXPECTED TYPE}
-
-        If the overridden option's type does not match,
-        the option is not included and a warning is logged.
-
-        """
-
-        return {
-            "cameras": bool,
-            "smoothingGroups": bool,
-            "hardEdges": bool,
-            "tangents": bool,
-            "smoothMesh": bool,
-            "instances": bool,
-            # "referencedContainersContent": bool,  # deprecated in Maya 2016+
-            "bakeComplexAnimation": int,
-            "bakeComplexStart": int,
-            "bakeComplexEnd": int,
-            "bakeComplexStep": int,
-            "bakeResampleAnimation": bool,
-            "animationOnly": bool,
-            "useSceneName": bool,
-            "quaternion": str,  # "euler"
-            "shapes": bool,
-            "skins": bool,
-            "constraints": bool,
-            "lights": bool,
-            "embeddedTextures": bool,
-            "inputConnections": bool,
-            "upAxis": str,  # x, y or z,
-            "triangulate": bool
-        }
-
-    @property
-    def default_options(self):
-        """The default options for FBX extraction.
-
-        This includes shapes, skins, constraints, lights and incoming
-        connections and exports with the Y-axis as up-axis.
-
-        By default this uses the time sliders start and end time.
-
-        """
-
-        start_frame = int(cmds.playbackOptions(query=True,
-                                               animationStartTime=True))
-        end_frame = int(cmds.playbackOptions(query=True,
-                                             animationEndTime=True))
-
-        return {
-            "cameras": False,
-            "smoothingGroups": False,
-            "hardEdges": False,
-            "tangents": False,
-            "smoothMesh": False,
-            "instances": False,
-            "bakeComplexAnimation": True,
-            "bakeComplexStart": start_frame,
-            "bakeComplexEnd": end_frame,
-            "bakeComplexStep": 1,
-            "bakeResampleAnimation": True,
-            "animationOnly": False,
-            "useSceneName": False,
-            "quaternion": "euler",
-            "shapes": True,
-            "skins": True,
-            "constraints": False,
-            "lights": True,
-            "embeddedTextures": True,
-            "inputConnections": True,
-            "upAxis": "y",
-            "triangulate": False
-        }
-
-    def parse_overrides(self, instance, options):
-        """Inspect data of instance to determine overridden options
-
-        An instance may supply any of the overridable options
-        as data, the option is then added to the extraction.
-
-        """
-
-        for key in instance.data:
-            if key not in self.options:
-                continue
-
-            # Ensure the data is of correct type
-            value = instance.data[key]
-            if not isinstance(value, self.options[key]):
-                self.log.warning(
-                    "Overridden attribute {key} was of "
-                    "the wrong type: {invalid_type} "
-                    "- should have been {valid_type}".format(
-                        key=key,
-                        invalid_type=type(value).__name__,
-                        valid_type=self.options[key].__name__))
-                continue
-
-            options[key] = value
-
-        return options
-
     def process(self, instance):

-        # Ensure FBX plug-in is loaded
-        cmds.loadPlugin("fbxmaya", quiet=True)
+        fbx_exporter = fbx.FBXExtractor(log=self.log)

         # Define output path
-        stagingDir = self.staging_dir(instance)
+        staging_dir = self.staging_dir(instance)
         filename = "{0}.fbx".format(instance.name)
-        path = os.path.join(stagingDir, filename)
+        path = os.path.join(staging_dir, filename)

         # The export requires forward slashes because we need
         # to format it into a string in a mel expression
@@ -162,54 +39,13 @@ class ExtractFBX(openpype.api.Extractor):
         self.log.info("Members: {0}".format(members))
         self.log.info("Instance: {0}".format(instance[:]))

-        # Parse export options
-        options = self.default_options
-        options = self.parse_overrides(instance, options)
-        self.log.info("Export options: {0}".format(options))
-
-        # Collect the start and end including handles
-        start = instance.data["frameStartHandle"]
-        end = instance.data["frameEndHandle"]
-
-        options['bakeComplexStart'] = start
-        options['bakeComplexEnd'] = end
-
-        # First apply the default export settings to be fully consistent
-        # each time for successive publishes
-        mel.eval("FBXResetExport")
-
-        # Apply the FBX overrides through MEL since the commands
-        # only work correctly in MEL according to online
-        # available discussions on the topic
-        _iteritems = getattr(options, "iteritems", options.items)
-        for option, value in _iteritems():
-            key = option[0].upper() + option[1:]  # uppercase first letter
-
-            # Boolean must be passed as lower-case strings
-            # as to MEL standards
-            if isinstance(value, bool):
-                value = str(value).lower()
-
-            template = "FBXExport{0} {1}" if key == "UpAxis" else "FBXExport{0} -v {1}"  # noqa
-            cmd = template.format(key, value)
-            self.log.info(cmd)
-            mel.eval(cmd)
-
-        # Never show the UI or generate a log
-        mel.eval("FBXExportShowUI -v false")
-        mel.eval("FBXExportGenerateLog -v false")
+        fbx_exporter.set_options_from_instance(instance)

         # Export
-        if "unrealStaticMesh" in instance.data["families"]:
-            with maintained_selection():
-                with root_parent(members):
-                    self.log.info("Un-parenting: {}".format(members))
-                    cmds.select(members, r=1, noExpand=True)
-                    mel.eval('FBXExport -f "{}" -s'.format(path))
-        else:
-            with maintained_selection():
-                cmds.select(members, r=1, noExpand=True)
-                mel.eval('FBXExport -f "{}" -s'.format(path))
+        with maintained_selection():
+            fbx_exporter.export(members, path)
+            cmds.select(members, r=1, noExpand=True)
+            mel.eval('FBXExport -f "{}" -s'.format(path))

         if "representations" not in instance.data:
             instance.data["representations"] = []
@@ -218,7 +54,7 @@ class ExtractFBX(openpype.api.Extractor):
             'name': 'fbx',
             'ext': 'fbx',
             'files': filename,
-            "stagingDir": stagingDir,
+            "stagingDir": staging_dir,
         }
         instance.data["representations"].append(representation)
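For context on what the removed MEL block did (the logic now lives in openpype.hosts.maya.api.fbx), here is a minimal sketch of applying FBXExport* options through MEL, reconstructed from the deleted lines above; the helper name and the "options" dict are illustrative, not the actual FBXExtractor API:

    import maya.mel as mel

    def apply_fbx_options(options):  # hypothetical helper, for illustration
        # Start from defaults so successive publishes are reproducible.
        mel.eval("FBXResetExport")
        for option, value in options.items():
            # FBXExport commands use a capitalized first letter.
            key = option[0].upper() + option[1:]
            if isinstance(value, bool):
                # MEL expects lower-case true/false strings.
                value = str(value).lower()
            # UpAxis is the one command that takes no "-v" flag.
            template = (
                "FBXExport{0} {1}" if key == "UpAxis"
                else "FBXExport{0} -v {1}"
            )
            mel.eval(template.format(key, value))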
@@ -0,0 +1,85 @@
+# -*- coding: utf-8 -*-
+"""Create Unreal Skeletal Mesh data to be extracted as FBX."""
+import os
+from contextlib import contextmanager
+
+from maya import cmds  # noqa
+
+import pyblish.api
+import openpype.api
+from openpype.hosts.maya.api import fbx
+
+
+@contextmanager
+def renamed(original_name, renamed_name):
+    # type: (str, str) -> None
+    try:
+        cmds.rename(original_name, renamed_name)
+        yield
+    finally:
+        cmds.rename(renamed_name, original_name)
+
+
+class ExtractUnrealSkeletalMesh(openpype.api.Extractor):
+    """Extract Unreal Skeletal Mesh as FBX from Maya."""
+
+    order = pyblish.api.ExtractorOrder - 0.1
+    label = "Extract Unreal Skeletal Mesh"
+    families = ["skeletalMesh"]
+
+    def process(self, instance):
+        fbx_exporter = fbx.FBXExtractor(log=self.log)
+
+        # Define output path
+        staging_dir = self.staging_dir(instance)
+        filename = "{0}.fbx".format(instance.name)
+        path = os.path.join(staging_dir, filename)
+
+        geo = instance.data.get("geometry")
+        joints = instance.data.get("joints")
+
+        to_extract = geo + joints
+
+        # The export requires forward slashes because we need
+        # to format it into a string in a mel expression
+        path = path.replace('\\', '/')
+
+        self.log.info("Extracting FBX to: {0}".format(path))
+        self.log.info("Members: {0}".format(to_extract))
+        self.log.info("Instance: {0}".format(instance[:]))
+
+        fbx_exporter.set_options_from_instance(instance)
+
+        # This magic is done for variants. To let Unreal merge correctly
+        # existing data, the top node must have the same name. So for every
+        # variant we extract we need to rename the top node of the rig
+        # accordingly. It is done in a context manager so it won't affect
+        # the current scene.
+
+        # we rely on hierarchy under one root.
+        original_parent = to_extract[0].split("|")[1]
+
+        parent_node = instance.data.get("asset")
+
+        renamed_to_extract = []
+        for node in to_extract:
+            node_path = node.split("|")
+            node_path[1] = parent_node
+            renamed_to_extract.append("|".join(node_path))
+
+        with renamed(original_parent, parent_node):
+            self.log.info("Extracting: {}".format(renamed_to_extract, path))
+            fbx_exporter.export(renamed_to_extract, path)
+
+        if "representations" not in instance.data:
+            instance.data["representations"] = []
+
+        representation = {
+            'name': 'fbx',
+            'ext': 'fbx',
+            'files': filename,
+            "stagingDir": staging_dir,
+        }
+        instance.data["representations"].append(representation)
+
+        self.log.info("Extract FBX successful to: {0}".format(path))
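The renamed() context manager above is what makes variant publishing work: Unreal merges skeletal mesh variants only when the top node shares one name, and the finally-clause guarantees the scene is restored even if the export fails. A small usage sketch — the node names here are made up:

    # Hypothetical scene: the rig root is "|rig_main", but the published
    # asset is named "characterA", so the export must see "|characterA|...".
    with renamed("rig_main", "characterA"):
        # Inside the block the root is temporarily "characterA";
        # cmds.rename in the finally-clause restores "rig_main" afterwards.
        fbx_exporter.export(
            ["|characterA|geo", "|characterA|joints"], "/tmp/out.fbx")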
@@ -1,33 +1,61 @@
 # -*- coding: utf-8 -*-
 """Create Unreal Static Mesh data to be extracted as FBX."""
-import openpype.api
-import pyblish.api
+import os
+
+from maya import cmds  # noqa
+
+import pyblish.api
+import openpype.api
+from openpype.hosts.maya.api.lib import (
+    parent_nodes,
+    maintained_selection
+)
+from openpype.hosts.maya.api import fbx


 class ExtractUnrealStaticMesh(openpype.api.Extractor):
-    """Extract FBX from Maya. """
+    """Extract Unreal Static Mesh as FBX from Maya. """

     order = pyblish.api.ExtractorOrder - 0.1
     label = "Extract Unreal Static Mesh"
-    families = ["unrealStaticMesh"]
+    families = ["staticMesh"]

     def process(self, instance):
-        to_combine = instance.data.get("membersToCombine")
-        static_mesh_name = instance.data.get("staticMeshCombinedName")
-        self.log.info(
-            "merging {} into {}".format(
-                " + ".join(to_combine), static_mesh_name))
-        duplicates = cmds.duplicate(to_combine, ic=True)
-        cmds.polyUnite(
-            *duplicates,
-            n=static_mesh_name, ch=False)
+        members = instance.data.get("geometryMembers", [])
+        if instance.data.get("collisionMembers"):
+            members = members + instance.data.get("collisionMembers")

-        if not instance.data.get("cleanNodes"):
-            instance.data["cleanNodes"] = []
+        fbx_exporter = fbx.FBXExtractor(log=self.log)

-        instance.data["cleanNodes"].append(static_mesh_name)
-        instance.data["cleanNodes"] += duplicates
+        # Define output path
+        staging_dir = self.staging_dir(instance)
+        filename = "{0}.fbx".format(instance.name)
+        path = os.path.join(staging_dir, filename)

-        instance.data["setMembers"] = [static_mesh_name]
-        instance.data["setMembers"] += instance.data["collisionMembers"]
+        # The export requires forward slashes because we need
+        # to format it into a string in a mel expression
+        path = path.replace('\\', '/')
+
+        self.log.info("Extracting FBX to: {0}".format(path))
+        self.log.info("Members: {0}".format(members))
+        self.log.info("Instance: {0}".format(instance[:]))
+
+        fbx_exporter.set_options_from_instance(instance)
+
+        with maintained_selection():
+            with parent_nodes(members):
+                self.log.info("Un-parenting: {}".format(members))
+                fbx_exporter.export(members, path)
+
+        if "representations" not in instance.data:
+            instance.data["representations"] = []
+
+        representation = {
+            'name': 'fbx',
+            'ext': 'fbx',
+            'files': filename,
+            "stagingDir": staging_dir,
+        }
+        instance.data["representations"].append(representation)
+
+        self.log.info("Extract FBX successful to: {0}".format(path))
@@ -0,0 +1,14 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<root>
+    <error id="main">
+        <title>Skeletal Mesh Top Node</title>
+        <description>## Skeletal meshes need a common root
+
+Skeletal meshes and their joints must be under one common root.
+
+### How to repair?
+
+Make sure all geometry and joints reside under the same root.
+</description>
+    </error>
+</root>
@@ -40,7 +40,14 @@ class ValidateCameraContents(pyblish.api.InstancePlugin):
         # list when there are no actual cameras results in
         # still an empty 'invalid' list
         if len(cameras) < 1:
-            raise RuntimeError("No cameras in instance.")
+            if members:
+                # If there are members in the instance return all of
+                # them as 'invalid' so the user can still select invalid
+                cls.log.error("No cameras found in instance "
+                              "members: {}".format(members))
+                return members
+
+            raise RuntimeError("No cameras found in empty instance.")

         # non-camera shapes
         valid_shapes = cmds.ls(shapes, type=('camera', 'locator'), long=True)
@@ -0,0 +1,32 @@
+# -*- coding: utf-8 -*-
+import pyblish.api
+import openpype.api
+from openpype.pipeline import PublishXmlValidationError
+
+from maya import cmds
+
+
+class ValidateSkeletalMeshHierarchy(pyblish.api.InstancePlugin):
+    """Validates that nodes have a common root."""
+
+    order = openpype.api.ValidateContentsOrder
+    hosts = ["maya"]
+    families = ["skeletalMesh"]
+    label = "Skeletal Mesh Top Node"
+
+    def process(self, instance):
+        geo = instance.data.get("geometry")
+        joints = instance.data.get("joints")
+
+        joints_parents = cmds.ls(joints, long=True)
+        geo_parents = cmds.ls(geo, long=True)
+
+        parents_set = {
+            parent.split("|")[1] for parent in (joints_parents + geo_parents)
+        }
+
+        if len(set(parents_set)) != 1:
+            raise PublishXmlValidationError(
+                self,
+                "Multiple roots on geometry or joints."
+            )
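The validator relies on long Maya DAG paths, which always start with a pipe, so index 1 of split("|") is the top-level node. A quick illustration of the parents_set logic with made-up paths:

    # "|root|geo|body".split("|") gives ["", "root", "geo", "body"],
    # so element [1] is the top-level node.
    paths = ["|root|geo|body", "|root|joints|spine"]
    roots = {p.split("|")[1] for p in paths}
    assert roots == {"root"}        # single root -> validation passes

    paths.append("|other|geo|prop")
    roots = {p.split("|")[1] for p in paths}
    assert len(roots) != 1          # multiple roots -> validation fails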
@@ -10,10 +10,11 @@ class ValidateUnrealMeshTriangulated(pyblish.api.InstancePlugin):

     order = openpype.api.ValidateMeshOrder
     hosts = ["maya"]
-    families = ["unrealStaticMesh"]
+    families = ["staticMesh"]
     category = "geometry"
     label = "Mesh is Triangulated"
     actions = [openpype.hosts.maya.api.action.SelectInvalidAction]
+    active = False

     @classmethod
     def get_invalid(cls, instance):
@@ -1,5 +1,5 @@
 # -*- coding: utf-8 -*-
-
+"""Validator for correct naming of Static Meshes."""
 from maya import cmds  # noqa
 import pyblish.api
 import openpype.api
@@ -52,8 +52,8 @@ class ValidateUnrealStaticMeshName(pyblish.api.InstancePlugin):
     optional = True
     order = openpype.api.ValidateContentsOrder
     hosts = ["maya"]
-    families = ["unrealStaticMesh"]
-    label = "Unreal StaticMesh Name"
+    families = ["staticMesh"]
+    label = "Unreal Static Mesh Name"
     actions = [openpype.hosts.maya.api.action.SelectInvalidAction]
     regex_mesh = r"(?P<renderName>.*))"
     regex_collision = r"(?P<renderName>.*)"
@@ -72,15 +72,13 @@ class ValidateUnrealStaticMeshName(pyblish.api.InstancePlugin):
             ["collision_prefixes"]
         )

-        combined_geometry_name = instance.data.get(
-            "staticMeshCombinedName", None)
         if cls.validate_mesh:
             # compile regex for testing names
             regex_mesh = "{}{}".format(
                 ("_" + cls.static_mesh_prefix) or "", cls.regex_mesh
             )
             sm_r = re.compile(regex_mesh)
-            if not sm_r.match(combined_geometry_name):
+            if not sm_r.match(instance.data.get("subset")):
                 cls.log.error("Mesh doesn't comply with name validation.")
                 return True
@@ -91,7 +89,7 @@ class ValidateUnrealStaticMeshName(pyblish.api.InstancePlugin):
             cls.log.warning("No collision objects to validate.")
             return False

-        regex_collision = "{}{}".format(
+        regex_collision = "{}{}_(\\d+)".format(
             "(?P<prefix>({}))_".format(
                 "|".join("{0}".format(p) for p in collision_prefixes)
             ) or "", cls.regex_collision
@@ -99,6 +97,9 @@ class ValidateUnrealStaticMeshName(pyblish.api.InstancePlugin):

         cl_r = re.compile(regex_collision)

+        mesh_name = "{}{}".format(instance.data["asset"],
+                                  instance.data.get("variant", []))
+
         for obj in collision_set:
             cl_m = cl_r.match(obj)
             if not cl_m:
@@ -107,7 +108,7 @@ class ValidateUnrealStaticMeshName(pyblish.api.InstancePlugin):
             else:
                 expected_collision = "{}_{}".format(
                     cl_m.group("prefix"),
-                    combined_geometry_name
+                    mesh_name
                 )

                 if not obj.startswith(expected_collision):
@@ -116,11 +117,11 @@ class ValidateUnrealStaticMeshName(pyblish.api.InstancePlugin):
                         "Collision object name doesn't match "
                         "static mesh name"
                     )
-                    cls.log.error("{}_{} != {}_{}".format(
+                    cls.log.error("{}_{} != {}_{}*".format(
                         cl_m.group("prefix"),
                         cl_m.group("renderName"),
                         cl_m.group("prefix"),
-                        combined_geometry_name,
+                        mesh_name,
                     ))
                     invalid.append(obj)
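Putting the rewritten pattern together — with an assumed collision prefix list of ["UBX", "UCP"], which is only an example, the compiled regex behaves roughly like this:

    import re

    # Shape after the change:
    # (?P<prefix>(UBX|UCP))_(?P<renderName>.*)_(\d+)
    regex_collision = "{}{}_(\\d+)".format(
        "(?P<prefix>({}))_".format("|".join(["UBX", "UCP"])),
        r"(?P<renderName>.*)",
    )
    m = re.compile(regex_collision).match("UBX_Teapot_01")
    assert m and m.group("prefix") == "UBX"
    assert m.group("renderName") == "Teapot"  # trailing "_01" is the index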
@@ -9,9 +9,10 @@ class ValidateUnrealUpAxis(pyblish.api.ContextPlugin):
     """Validate if Z is set as up axis in Maya"""

     optional = True
+    active = False
     order = openpype.api.ValidateContentsOrder
     hosts = ["maya"]
-    families = ["unrealStaticMesh"]
+    families = ["staticMesh"]
     label = "Unreal Up-Axis check"
     actions = [openpype.api.RepairAction]
@@ -10,7 +10,7 @@ def add_implementation_envs(env, _app):
     ]
     old_nuke_path = env.get("NUKE_PATH") or ""
     for path in old_nuke_path.split(os.pathsep):
-        if not path or not os.path.exists(path):
+        if not path:
             continue

         norm_path = os.path.normpath(path)
@@ -26,6 +26,7 @@ from openpype.tools.utils import host_tools
 from openpype.lib.path_tools import HostDirmap
 from openpype.settings import get_project_settings
 from openpype.modules import ModulesManager
+from openpype.pipeline import discover_legacy_creator_plugins

 from .workio import (
     save_file,
@@ -1047,17 +1048,28 @@ def add_review_knob(node):
 def add_deadline_tab(node):
     node.addKnob(nuke.Tab_Knob("Deadline"))

-    knob = nuke.Int_Knob("deadlineChunkSize", "Chunk Size")
-    knob.setValue(0)
-    node.addKnob(knob)
-
     knob = nuke.Int_Knob("deadlinePriority", "Priority")
     knob.setValue(50)
     node.addKnob(knob)

+    knob = nuke.Int_Knob("deadlineChunkSize", "Chunk Size")
+    knob.setValue(0)
+    node.addKnob(knob)
+
+    knob = nuke.Int_Knob("deadlineConcurrentTasks", "Concurrent tasks")
+    # zero as default will get value from Settings during collection
+    # instead of being an explicit user override, see precollect_write.py
+    knob.setValue(0)
+    node.addKnob(knob)
+

 def get_deadline_knob_names():
-    return ["Deadline", "deadlineChunkSize", "deadlinePriority"]
+    return [
+        "Deadline",
+        "deadlineChunkSize",
+        "deadlinePriority",
+        "deadlineConcurrentTasks"
+    ]


 def create_backdrop(label="", color=None, layer=0,
@@ -1902,7 +1914,7 @@ def recreate_instance(origin_node, avalon_data=None):
     # create new node
     # get appropriate plugin class
     creator_plugin = None
-    for Creator in api.discover(api.Creator):
+    for Creator in discover_legacy_creator_plugins():
         if Creator.__name__ == data["creator"]:
             creator_plugin = Creator
             break
@@ -5,7 +5,6 @@ from collections import OrderedDict
 import nuke

 import pyblish.api
-import avalon.api

 import openpype
 from openpype.api import (

@@ -15,10 +14,11 @@ from openpype.api import (
 )
 from openpype.lib import register_event_callback
 from openpype.pipeline import (
-    LegacyCreator,
     register_loader_plugin_path,
+    register_creator_plugin_path,
     register_inventory_action_path,
     deregister_loader_plugin_path,
+    deregister_creator_plugin_path,
     deregister_inventory_action_path,
     AVALON_CONTAINER_ID,
 )
@@ -106,7 +106,7 @@ def install():
     log.info("Registering Nuke plug-ins..")
     pyblish.api.register_plugin_path(PUBLISH_PATH)
     register_loader_plugin_path(LOAD_PATH)
-    avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)
     register_inventory_action_path(INVENTORY_PATH)

     # Register Avalon event for workfiles loading.
@@ -132,7 +132,7 @@ def uninstall():
     pyblish.deregister_host("nuke")
     pyblish.api.deregister_plugin_path(PUBLISH_PATH)
     deregister_loader_plugin_path(LOAD_PATH)
-    avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)
     deregister_inventory_action_path(INVENTORY_PATH)

     pyblish.api.deregister_callback(
@@ -450,6 +450,7 @@ class ExporterReviewMov(ExporterReview):

     def generate_mov(self, farm=False, **kwargs):
         self.publish_on_farm = farm
+        read_raw = kwargs["read_raw"]
         reformat_node_add = kwargs["reformat_node_add"]
         reformat_node_config = kwargs["reformat_node_config"]
         bake_viewer_process = kwargs["bake_viewer_process"]

@@ -484,6 +485,9 @@ class ExporterReviewMov(ExporterReview):
             r_node["origlast"].setValue(self.last_frame)
             r_node["colorspace"].setValue(self.write_colorspace)

+            if read_raw:
+                r_node["raw"].setValue(1)
+
             # connect
             self._temp_nodes[subset].append(r_node)
             self.previous_node = r_node
@@ -1,6 +1,7 @@
 import json
 from collections import OrderedDict
 import nuke
+import six

 from avalon import io

@@ -333,7 +334,7 @@ class LoadEffects(load.LoaderPlugin):
                     for key, value in input.items()}
         elif isinstance(input, list):
             return [self.byteify(element) for element in input]
-        elif isinstance(input, str):
+        elif isinstance(input, six.text_type):
             return str(input)
         else:
             return input
@@ -1,6 +1,6 @@
 import json
 from collections import OrderedDict
-
+import six
 import nuke

 from avalon import io

@@ -353,7 +353,7 @@ class LoadEffectsInputProcess(load.LoaderPlugin):
                     for key, value in input.items()}
         elif isinstance(input, list):
             return [self.byteify(element) for element in input]
-        elif isinstance(input, str):
+        elif isinstance(input, six.text_type):
             return str(input)
         else:
             return input
@@ -1,5 +1,5 @@
 import nuke
-
+import six
 from avalon import io

 from openpype.pipeline import (

@@ -243,8 +243,8 @@ class LoadGizmoInputProcess(load.LoaderPlugin):
                     for key, value in input.items()}
         elif isinstance(input, list):
             return [self.byteify(element) for element in input]
-        elif isinstance(input, unicode):
-            return input.encode('utf-8')
+        elif isinstance(input, six.text_type):
+            return str(input)
         else:
             return input
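All three byteify variants now funnel text through six.text_type, which maps to unicode on Python 2 and str on Python 3, so one code path serves Nuke builds on either interpreter. A sketch of the shared pattern (dict/list recursion elided):

    import six

    def byteify(value):
        # Convert text to the native str type of the running interpreter.
        if isinstance(value, six.text_type):  # unicode on py2, str on py3
            return str(value)
        return value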
openpype/hosts/nuke/plugins/publish/extract_review_data.py (new file, 47 lines)
@@ -0,0 +1,47 @@
+import os
+import pyblish.api
+import openpype
+from pprint import pformat
+
+
+class ExtractReviewData(openpype.api.Extractor):
+    """Extracts review tag into available representation
+    """
+
+    order = pyblish.api.ExtractorOrder + 0.01
+    # order = pyblish.api.CollectorOrder + 0.499
+    label = "Extract Review Data"
+
+    families = ["review"]
+    hosts = ["nuke"]
+
+    def process(self, instance):
+        fpath = instance.data["path"]
+        ext = os.path.splitext(fpath)[-1][1:]
+
+        representations = instance.data.get("representations", [])
+
+        # review can be removed since `ProcessSubmittedJobOnFarm` will create
+        # reviewable representation if needed
+        if (
+            "render.farm" in instance.data["families"]
+            and "review" in instance.data["families"]
+        ):
+            instance.data["families"].remove("review")
+
+        # iterate representations and add `review` tag
+        for repre in representations:
+            if ext != repre["ext"]:
+                continue
+
+            if not repre.get("tags"):
+                repre["tags"] = []
+
+            if "review" not in repre["tags"]:
+                repre["tags"].append("review")
+
+            self.log.debug("Matching representation: {}".format(
+                pformat(repre)
+            ))
+
+        instance.data["representations"] = representations
@@ -123,6 +123,7 @@ class ExtractReviewDataMov(openpype.api.Extractor):
         if generated_repres:
             # assign to representations
             instance.data["representations"] += generated_repres
+            instance.data["useSequenceForReview"] = False
         else:
             instance.data["families"].remove("review")
             self.log.info((
@@ -128,13 +128,17 @@ class CollectNukeWrites(pyblish.api.InstancePlugin):
         }

         group_node = [x for x in instance if x.Class() == "Group"][0]
-        deadlineChunkSize = 1
+        dl_chunk_size = 1
         if "deadlineChunkSize" in group_node.knobs():
-            deadlineChunkSize = group_node["deadlineChunkSize"].value()
+            dl_chunk_size = group_node["deadlineChunkSize"].value()

-        deadlinePriority = 50
+        dl_priority = 50
         if "deadlinePriority" in group_node.knobs():
-            deadlinePriority = group_node["deadlinePriority"].value()
+            dl_priority = group_node["deadlinePriority"].value()
+
+        dl_concurrent_tasks = 0
+        if "deadlineConcurrentTasks" in group_node.knobs():
+            dl_concurrent_tasks = group_node["deadlineConcurrentTasks"].value()

         instance.data.update({
             "versionData": version_data,

@@ -144,8 +148,9 @@ class CollectNukeWrites(pyblish.api.InstancePlugin):
             "label": label,
             "outputType": output_type,
             "colorspace": colorspace,
-            "deadlineChunkSize": deadlineChunkSize,
-            "deadlinePriority": deadlinePriority
+            "deadlineChunkSize": dl_chunk_size,
+            "deadlinePriority": dl_priority,
+            "deadlineConcurrentTasks": dl_concurrent_tasks
         })

         if self.is_prerender(_families_test):
@@ -1,11 +1,10 @@
 import os
 import toml

 import nuke

-from avalon import api
 import pyblish.api
 import openpype.api
+from openpype.pipeline import discover_creator_plugins
 from openpype.hosts.nuke.api.lib import get_avalon_knob_data

@@ -79,7 +78,7 @@ class ValidateWriteLegacy(pyblish.api.InstancePlugin):

         # get appropriate plugin class
         creator_plugin = None
-        for Creator in api.discover(api.Creator):
+        for Creator in discover_creator_plugins():
             if Creator.__name__ != Create_name:
                 continue
@@ -9,9 +9,10 @@ from avalon import io
 from openpype.api import Logger
 from openpype.lib import register_event_callback
 from openpype.pipeline import (
-    LegacyCreator,
     register_loader_plugin_path,
+    register_creator_plugin_path,
     deregister_loader_plugin_path,
+    deregister_creator_plugin_path,
     AVALON_CONTAINER_ID,
 )
 import openpype.hosts.photoshop

@@ -75,7 +76,7 @@ def install():

     pyblish.api.register_plugin_path(PUBLISH_PATH)
     register_loader_plugin_path(LOAD_PATH)
-    avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)
     log.info(PUBLISH_PATH)

     pyblish.api.register_callback(

@@ -88,7 +89,7 @@ def install():
 def uninstall():
     pyblish.api.deregister_plugin_path(PUBLISH_PATH)
     deregister_loader_plugin_path(LOAD_PATH)
-    avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)


 def ls():
@@ -1,9 +1,9 @@
 from Qt import QtWidgets
-from openpype.pipeline import create
+from openpype.pipeline import LegacyCreator
 from openpype.hosts.photoshop import api as photoshop


-class CreateImage(create.LegacyCreator):
+class CreateImage(LegacyCreator):
     """Image folder for publish."""

     name = "imageDefault"
@@ -2,7 +2,6 @@ import os

 import qargparse

-from openpype.pipeline import get_representation_path_from_context
 from openpype.hosts.photoshop import api as photoshop
 from openpype.hosts.photoshop.api import get_unique_layer_name

@@ -63,7 +62,7 @@ class ImageFromSequenceLoader(photoshop.PhotoshopLoader):
         """
         files = []
         for context in repre_contexts:
-            fname = get_representation_path_from_context(context)
+            fname = cls.filepath_from_context(context)
             _, file_extension = os.path.splitext(fname)

             for file_name in os.listdir(os.path.dirname(fname)):
@@ -1,6 +1,9 @@
 from avalon import api
 import pyblish.api

+from openpype.settings import get_project_settings
 from openpype.hosts.photoshop import api as photoshop
+from openpype.lib import prepare_template_data
+

 class CollectInstances(pyblish.api.ContextPlugin):

@@ -9,6 +12,10 @@ class CollectInstances(pyblish.api.ContextPlugin):
     This collector takes into account assets that are associated with
     a LayerSet and marked with a unique identifier;

+    If no image instances are explicitly created, it checks whether
+    `flatten_subset_template` has a value (configurable in Settings); in
+    that case it produces a flattened image with all visible layers.
+
     Identifier:
         id (str): "pyblish.avalon.instance"
     """
@@ -19,13 +26,17 @@ class CollectInstances(pyblish.api.ContextPlugin):
     families_mapping = {
         "image": []
     }
+    # configurable in Settings
+    flatten_subset_template = ""

     def process(self, context):
         stub = photoshop.stub()
         layers = stub.get_layers()
         layers_meta = stub.get_layers_metadata()
         instance_names = []
+        all_layer_ids = []
         for layer in layers:
+            all_layer_ids.append(layer.id)
             layer_data = stub.read(layer, layers_meta)

             # Skip layers without metadata.
|
@ -59,3 +70,33 @@ class CollectInstances(pyblish.api.ContextPlugin):
|
|||
if len(instance_names) != len(set(instance_names)):
|
||||
self.log.warning("Duplicate instances found. " +
|
||||
"Remove unwanted via SubsetManager")
|
||||
|
||||
if len(instance_names) == 0 and self.flatten_subset_template:
|
||||
project_name = context.data["projectEntity"]["name"]
|
||||
variants = get_project_settings(project_name).get(
|
||||
"photoshop", {}).get(
|
||||
"create", {}).get(
|
||||
"CreateImage", {}).get(
|
||||
"defaults", [''])
|
||||
family = "image"
|
||||
task_name = api.Session["AVALON_TASK"]
|
||||
asset_name = context.data["assetEntity"]["name"]
|
||||
|
||||
fill_pairs = {
|
||||
"variant": variants[0],
|
||||
"family": family,
|
||||
"task": task_name
|
||||
}
|
||||
|
||||
subset = self.flatten_subset_template.format(
|
||||
**prepare_template_data(fill_pairs))
|
||||
|
||||
instance = context.create_instance(subset)
|
||||
instance.data["family"] = family
|
||||
instance.data["asset"] = asset_name
|
||||
instance.data["subset"] = subset
|
||||
instance.data["ids"] = all_layer_ids
|
||||
instance.data["families"] = self.families_mapping[family]
|
||||
instance.data["publish"] = True
|
||||
|
||||
self.log.info("flatten instance: {} ".format(instance.data))
|
||||
|
|
|
|||
|
|
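The flatten subset name is produced by formatting the settings template with the output of prepare_template_data. Assuming that helper expands capitalized counterparts of each key (so templates like "image{Task}{Variant}" can be formatted) — an assumption about its exact key set, not a documented contract — the call behaves roughly like this:

    # Hypothetical expansion, for illustration only:
    fill_pairs = {"variant": "Main", "family": "image", "task": "art"}
    data = {"variant": "Main", "Variant": "Main",
            "family": "image", "Family": "Image",
            "task": "art", "Task": "Art"}
    subset = "image{Task}{Variant}".format(**data)  # -> "imageArtMain"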
@@ -2,18 +2,26 @@ import os

 import pyblish.api

+from openpype.lib import get_subset_name_with_asset_doc
+

 class CollectReview(pyblish.api.ContextPlugin):
     """Gather the active document as review instance."""

     label = "Review"
-    order = pyblish.api.CollectorOrder
+    order = pyblish.api.CollectorOrder + 0.1
     hosts = ["photoshop"]

     def process(self, context):
         family = "review"
-        task = os.getenv("AVALON_TASK", None)
-        subset = family + task.capitalize()
+        subset = get_subset_name_with_asset_doc(
+            family,
+            "",
+            context.data["anatomyData"]["task"]["name"],
+            context.data["assetEntity"],
+            context.data["anatomyData"]["project"]["name"],
+            host_name=context.data["hostName"]
+        )

         file_path = context.data["currentFile"]
         base_name = os.path.basename(file_path)
@@ -1,6 +1,8 @@
 import os
 import pyblish.api

+from openpype.lib import get_subset_name_with_asset_doc
+

 class CollectWorkfile(pyblish.api.ContextPlugin):
     """Collect current script for publish."""

@@ -11,8 +13,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):

     def process(self, context):
         family = "workfile"
-        task = os.getenv("AVALON_TASK", None)
-        subset = family + task.capitalize()
+        subset = get_subset_name_with_asset_doc(
+            family,
+            "",
+            context.data["anatomyData"]["task"]["name"],
+            context.data["assetEntity"],
+            context.data["anatomyData"]["project"]["name"],
+            host_name=context.data["hostName"]
+        )

         file_path = context.data["currentFile"]
         staging_dir = os.path.dirname(file_path)
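Both Photoshop collectors now delegate naming to get_subset_name_with_asset_doc instead of hard-coding family + task.capitalize(), so studio-configured subset name templates apply. A sketch of the difference, with illustrative values:

    # Before: naming was hard-coded from the task environment variable.
    subset = "workfile" + "compositing".capitalize()  # "workfileCompositing"

    # After: the template configured in project settings decides the result;
    # e.g. a "{family}{Task}" template yields the same "workfileCompositing",
    # but studios can now override the pattern without code changes.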
@@ -26,8 +26,10 @@ class ExtractImage(openpype.api.Extractor):
         with photoshop.maintained_selection():
             self.log.info("Extracting %s" % str(list(instance)))
             with photoshop.maintained_visibility():
+                ids = set()
                 layer = instance.data.get("layer")
-                ids = set([layer.id])
+                if layer:
+                    ids.add(layer.id)
                 add_ids = instance.data.pop("ids", None)
                 if add_ids:
                     ids.update(set(add_ids))
@@ -155,6 +155,9 @@ class ExtractReview(openpype.api.Extractor):
         for image_instance in instance.context:
             if image_instance.data["family"] != "image":
                 continue
+            if not image_instance.data.get("layer"):
+                # dummy instance for flatten image
+                continue
             layers.append(image_instance.data.get("layer"))

         return sorted(layers)
@@ -29,7 +29,8 @@ class ValidateNamingRepair(pyblish.api.Action):
         stub = photoshop.stub()
         for instance in instances:
             self.log.info("validate_naming instance {}".format(instance))
-            metadata = stub.read(instance[0])
+            layer_item = instance.data["layer"]
+            metadata = stub.read(layer_item)
             self.log.info("metadata instance {}".format(metadata))
             layer_name = None
             if metadata.get("uuid"):

@@ -43,11 +44,11 @@ class ValidateNamingRepair(pyblish.api.Action):
                 stub.rename_layer(instance.data["uuid"], layer_name)

             subset_name = re.sub(invalid_chars, replace_char,
-                                 instance.data["name"])
+                                 instance.data["subset"])

-            instance[0].Name = layer_name or subset_name
+            layer_item.name = layer_name or subset_name
             metadata["subset"] = subset_name
-            stub.imprint(instance[0], metadata)
+            stub.imprint(layer_item, metadata)

         return True
@@ -4,14 +4,17 @@ Basic avalon integration
 import os
 import contextlib
 from collections import OrderedDict
-from avalon import api as avalon
-from avalon import schema
+
 from pyblish import api as pyblish
+
+from avalon import schema
+
 from openpype.api import Logger
 from openpype.pipeline import (
-    LegacyCreator,
     register_loader_plugin_path,
+    register_creator_plugin_path,
     deregister_loader_plugin_path,
+    deregister_creator_plugin_path,
     AVALON_CONTAINER_ID,
 )
 from . import lib

@@ -46,7 +49,7 @@ def install():
     log.info("Registering DaVinci Resovle plug-ins..")

     register_loader_plugin_path(LOAD_PATH)
-    avalon.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)

     # register callback for switching publishable
     pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)

@@ -70,7 +73,7 @@ def uninstall():
     log.info("Deregistering DaVinci Resovle plug-ins..")

     deregister_loader_plugin_path(LOAD_PATH)
-    avalon.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)

     # register callback for switching publishable
     pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)
@@ -0,0 +1,13 @@
+import pyblish.api
+
+
+class CollectSAAppName(pyblish.api.ContextPlugin):
+    """Collect app name and label."""
+
+    label = "Collect App Name/Label"
+    order = pyblish.api.CollectorOrder - 0.5
+    hosts = ["standalonepublisher"]
+
+    def process(self, context):
+        context.data["appName"] = "standalone publisher"
+        context.data["appLabel"] = "Standalone publisher"
@@ -0,0 +1,18 @@
+# -*- coding: utf-8 -*-
+"""Collect original base name for use in templates."""
+from pathlib import Path
+
+import pyblish.api
+
+
+class CollectOriginalBasename(pyblish.api.InstancePlugin):
+    """Collect original file base name."""
+
+    order = pyblish.api.CollectorOrder + 0.498
+    label = "Collect Base Name"
+    hosts = ["standalonepublisher"]
+    families = ["simpleUnrealTexture"]
+
+    def process(self, instance):
+        file_name = Path(instance.data["representations"][0]["files"])
+        instance.data["originalBasename"] = file_name.stem
@@ -0,0 +1,17 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<root>
+    <error id="main">
+        <title>Invalid texture name</title>
+        <description>
+## Invalid file name
+
+Submitted file has an invalid name:
+'{invalid_file}'
+
+### How to repair?
+
+The texture file must adhere to the Unreal naming convention:
+T_{asset}_*.ext
+        </description>
+    </error>
+</root>
@@ -0,0 +1,26 @@
+# -*- coding: utf-8 -*-
+"""Validator for correct file naming."""
+import re
+
+import pyblish.api
+import openpype.api
+from openpype.pipeline import PublishXmlValidationError
+
+
+class ValidateSimpleUnrealTextureNaming(pyblish.api.InstancePlugin):
+    label = "Validate Unreal Texture Names"
+    hosts = ["standalonepublisher"]
+    families = ["simpleUnrealTexture"]
+    order = openpype.api.ValidateContentsOrder
+    regex = "^T_{asset}.*"
+
+    def process(self, instance):
+        file_name = instance.data.get("originalBasename")
+        self.log.info(file_name)
+        pattern = self.regex.format(asset=instance.data.get("asset"))
+        if not re.match(pattern, file_name):
+            msg = f"Invalid file name {file_name}"
+            raise PublishXmlValidationError(
+                self, msg, formatting_data={
+                    "invalid_file": file_name,
+                    "asset": instance.data.get("asset")
+                })
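With the default regex "^T_{asset}.*", a texture file published for an asset named, say, "Hero" must start with "T_Hero". For example:

    import re

    pattern = "^T_{asset}.*".format(asset="Hero")    # -> "^T_Hero.*"
    assert re.match(pattern, "T_Hero_Diffuse")       # passes validation
    assert not re.match(pattern, "Hero_T_Diffuse")   # raises the XML error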
@@ -30,7 +30,7 @@ class MyAutoCreator(AutoCreator):
     def update_instances(self, update_list):
         pipeline.update_instances(update_list)

-    def create(self, options=None):
+    def create(self):
         existing_instance = None
         for instance in self.create_context.instances:
             if instance.family == self.family:
@@ -7,7 +7,7 @@ from avalon import io
 import avalon.api
 import pyblish.api

-from openpype.pipeline import BaseCreator
+from openpype.pipeline import register_creator_plugin_path

 ROOT_DIR = os.path.dirname(os.path.dirname(
     os.path.abspath(__file__)

@@ -169,7 +169,7 @@ def install():

     pyblish.api.register_host("traypublisher")
     pyblish.api.register_plugin_path(PUBLISH_PATH)
-    avalon.api.register_plugin_path(BaseCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)


 def set_project_name(project_name):
@@ -0,0 +1,13 @@
+import pyblish.api
+
+
+class CollectTrayPublisherAppName(pyblish.api.ContextPlugin):
+    """Collect app name and label."""
+
+    label = "Collect App Name/Label"
+    order = pyblish.api.CollectorOrder - 0.5
+    hosts = ["traypublisher"]
+
+    def process(self, context):
+        context.data["appName"] = "tray publisher"
+        context.data["appLabel"] = "Tray publisher"
@@ -14,11 +14,22 @@ class ValidateWorkfilePath(pyblish.api.InstancePlugin):
     def process(self, instance):
         filepath = instance.data["sourceFilepath"]
         if not filepath:
-            raise PublishValidationError((
-                "Filepath of 'workfile' instance \"{}\" is not set"
-            ).format(instance.data["name"]))
+            raise PublishValidationError(
+                (
+                    "Filepath of 'workfile' instance \"{}\" is not set"
+                ).format(instance.data["name"]),
+                "File not filled",
+                "## Missing file\nYou are supposed to fill the path."
+            )

         if not os.path.exists(filepath):
-            raise PublishValidationError((
-                "Filepath of 'workfile' instance \"{}\" does not exist: {}"
-            ).format(instance.data["name"], filepath))
+            raise PublishValidationError(
+                (
+                    "Filepath of 'workfile' instance \"{}\" does not exist: {}"
+                ).format(instance.data["name"], filepath),
+                "File not found",
+                (
+                    "## File was not found\nFile \"{}\" was not found."
+                    " Check if the path is still available."
+                ).format(filepath)
+            )
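The extra arguments give the publisher UI a short title and a markdown body alongside the raw log message. The sketch below assumes the positional signature (message, title, description), which is how the call sites above read; it is not taken from the class definition itself:

    # Assumed signature, inferred from the call sites above:
    raise PublishValidationError(
        "Filepath of 'workfile' instance \"main\" is not set",  # log message
        "File not filled",                                      # dialog title
        "## Missing file\nYou are supposed to fill the path."   # markdown body
    )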
@@ -15,9 +15,10 @@ from openpype.hosts import tvpaint
 from openpype.api import get_current_project_settings
 from openpype.lib import register_event_callback
 from openpype.pipeline import (
-    LegacyCreator,
     register_loader_plugin_path,
+    register_creator_plugin_path,
     deregister_loader_plugin_path,
+    deregister_creator_plugin_path,
     AVALON_CONTAINER_ID,
 )

@@ -82,7 +83,7 @@ def install():
     pyblish.api.register_host("tvpaint")
     pyblish.api.register_plugin_path(PUBLISH_PATH)
     register_loader_plugin_path(LOAD_PATH)
-    avalon.api.register_plugin_path(LegacyCreator, CREATE_PATH)
+    register_creator_plugin_path(CREATE_PATH)

     registered_callbacks = (
         pyblish.api.registered_callbacks().get("instanceToggled") or []

@@ -104,7 +105,7 @@ def uninstall():
     pyblish.api.deregister_host("tvpaint")
     pyblish.api.deregister_plugin_path(PUBLISH_PATH)
     deregister_loader_plugin_path(LOAD_PATH)
-    avalon.api.deregister_plugin_path(LegacyCreator, CREATE_PATH)
+    deregister_creator_plugin_path(CREATE_PATH)


 def containerise(
@@ -1,6 +1,6 @@
 import collections
 import qargparse
-from avalon.pipeline import get_representation_context
+from openpype.pipeline import get_representation_context
 from openpype.hosts.tvpaint.api import lib, pipeline, plugin
@@ -20,21 +20,30 @@ class CollectInstances(pyblish.api.ContextPlugin):
             json.dumps(workfile_instances, indent=4)
         ))

+        filtered_instance_data = []
         # Backwards compatibility for workfiles that already have review
         # instance in metadata.
         review_instance_exist = False
         for instance_data in workfile_instances:
-            if instance_data["family"] == "review":
+            family = instance_data["family"]
+            if family == "review":
                 review_instance_exist = True
-                break
+
+            elif family not in ("renderPass", "renderLayer"):
+                self.log.info("Unknown family \"{}\". Skipping {}".format(
+                    family, json.dumps(instance_data, indent=4)
+                ))
+                continue
+
+            filtered_instance_data.append(instance_data)

         # Fake review instance if review was not found in metadata families
         if not review_instance_exist:
-            workfile_instances.append(
+            filtered_instance_data.append(
                 self._create_review_instance_data(context)
             )

-        for instance_data in workfile_instances:
+        for instance_data in filtered_instance_data:
            instance_data["fps"] = context.data["sceneFps"]

            # Store workfile instance data to instance data

@@ -42,8 +51,11 @@ class CollectInstances(pyblish.api.ContextPlugin):
             # Global instance data modifications
             # Fill families
             family = instance_data["family"]
+            families = [family]
+            if family != "review":
+                families.append("review")
             # Add `review` family for thumbnail integration
-            instance_data["families"] = [family, "review"]
+            instance_data["families"] = families

             # Instance name
             subset_name = instance_data["subset"]

@@ -78,7 +90,7 @@ class CollectInstances(pyblish.api.ContextPlugin):
         # Project name from workfile context
         project_name = context.data["workfile_context"]["project"]
         # Host name from environment variable
-        host_name = os.environ["AVALON_APP"]
+        host_name = context.data["hostName"]
         # Use empty variant value
         variant = ""
         task_name = io.Session["AVALON_TASK"]

@@ -106,12 +118,6 @@ class CollectInstances(pyblish.api.ContextPlugin):
                 instance = self.create_render_pass_instance(
                     context, instance_data
                 )
-            else:
-                raise AssertionError(
-                    "Instance with unknown family \"{}\": {}".format(
-                        family, instance_data
-                    )
-                )

             if instance is None:
                 continue
openpype/hosts/tvpaint/plugins/publish/collect_scene_render.py (new file, 110 lines)
@@ -0,0 +1,110 @@
+import json
+import copy
+import pyblish.api
+from avalon import io
+
+from openpype.lib import get_subset_name_with_asset_doc
+
+
+class CollectRenderScene(pyblish.api.ContextPlugin):
+    """Collect instance which renders the whole scene to PNG.
+
+    Creates an instance with family 'renderScene' that holds all layers
+    to render, composited into one result. The instance is not collected
+    from the scene.
+
+    The scene is rendered with all visible layers, the same way a review is.
+
+    The instance is disabled if there are any created instances of
+    'renderLayer' or 'renderPass', because this instance is meant as a
+    lazy publish of the TVPaint file.
+
+    The subset name is created the same way as for the 'renderLayer'
+    family. It can use the `renderPass` and `renderLayer` keys, which can
+    be set using settings; `variant` is filled with the `renderLayer` value.
+    """
+    label = "Collect Render Scene"
+    order = pyblish.api.CollectorOrder - 0.39
+    hosts = ["tvpaint"]
+
+    # Value of 'render_pass' in subset name template
+    render_pass = "beauty"
+
+    # Settings attributes
+    enabled = False
+    # Value of 'render_layer' and 'variant' in subset name template
+    render_layer = "Main"
+
+    def process(self, context):
+        # Check if there are created instances of renderPass and renderLayer
+        # - that will define if renderScene instance is enabled after
+        #   collection
+        any_created_instance = False
+        for instance in context:
+            family = instance.data["family"]
+            if family in ("renderPass", "renderLayer"):
+                any_created_instance = True
+                break
+
+        # Global instance data modifications
+        # Fill families
+        family = "renderScene"
+        # Add `review` family for thumbnail integration
+        families = [family, "review"]
+
+        # Collect asset doc to get asset id
+        # - not sure if it's good idea to require asset id in
+        #   get_subset_name?
+        workfile_context = context.data["workfile_context"]
+        asset_name = workfile_context["asset"]
+        asset_doc = io.find_one({
+            "type": "asset",
+            "name": asset_name
+        })
+
+        # Project name from workfile context
+        project_name = context.data["workfile_context"]["project"]
+        # Host name from context
+        host_name = context.data["hostName"]
+        # Variant is using render layer name
+        variant = self.render_layer
+        dynamic_data = {
+            "render_layer": self.render_layer,
+            "render_pass": self.render_pass
+        }
+
+        task_name = workfile_context["task"]
+        subset_name = get_subset_name_with_asset_doc(
+            "render",
+            variant,
+            task_name,
+            asset_doc,
+            project_name,
+            host_name,
+            dynamic_data=dynamic_data
+        )
+
+        instance_data = {
+            "family": family,
+            "families": families,
+            "fps": context.data["sceneFps"],
+            "subset": subset_name,
+            "name": subset_name,
+            "label": "{} [{}-{}]".format(
+                subset_name,
+                context.data["sceneMarkIn"] + 1,
+                context.data["sceneMarkOut"] + 1
+            ),
+            "active": not any_created_instance,
+            "publish": not any_created_instance,
+            "representations": [],
+            "layers": copy.deepcopy(context.data["layersData"]),
+            "asset": asset_name,
+            "task": task_name
+        }
+
+        instance = context.create_instance(**instance_data)
+
+        self.log.debug("Created instance: {}\n{}".format(
+            instance, json.dumps(instance.data, indent=4)
+        ))
@@ -0,0 +1,99 @@
+"""Plugin converting png files from ExtractSequence into exrs.
+
+Requires:
+    ExtractSequence - source of PNG
+    ExtractReview - review was already created so we can convert to any exr
+"""
+import os
+import json
+
+import pyblish.api
+from openpype.lib import (
+    get_oiio_tools_path,
+    run_subprocess,
+)
+from openpype.pipeline import KnownPublishError
+
+
+class ExtractConvertToEXR(pyblish.api.InstancePlugin):
+    # Offset to get after ExtractSequence plugin.
+    order = pyblish.api.ExtractorOrder + 0.1
+    label = "Extract Sequence EXR"
+    hosts = ["tvpaint"]
+    families = ["render"]
+
+    enabled = False
+
+    # Replace source PNG files or just add
+    replace_pngs = True
+    # EXR compression
+    exr_compression = "ZIP"
+
+    def process(self, instance):
+        repres = instance.data.get("representations")
+        if not repres:
+            return
+
+        oiio_path = get_oiio_tools_path()
+        # Raise an exception when oiiotool is not available
+        # - this can currently happen on MacOS machines
+        if not os.path.exists(oiio_path):
+            raise KnownPublishError(
+                "OpenImageIO tool is not available on this machine."
+            )
+
+        new_repres = []
+        for repre in repres:
+            if repre["name"] != "png":
+                continue
+
+            self.log.info(
+                "Processing representation: {}".format(
+                    json.dumps(repre, sort_keys=True, indent=4)
+                )
+            )
+
+            src_filepaths = set()
+            new_filenames = []
+            for src_filename in repre["files"]:
+                dst_filename = os.path.splitext(src_filename)[0] + ".exr"
+                new_filenames.append(dst_filename)
+
+                src_filepath = os.path.join(repre["stagingDir"], src_filename)
+                dst_filepath = os.path.join(repre["stagingDir"], dst_filename)
+
+                src_filepaths.add(src_filepath)
+
+                args = [
+                    oiio_path, src_filepath,
+                    "--compression", self.exr_compression,
+                    # TODO how to define color conversion?
+                    "--colorconvert", "sRGB", "linear",
+                    "-o", dst_filepath
+                ]
+                run_subprocess(args)
+
+            new_repres.append(
+                {
+                    "name": "exr",
+                    "ext": "exr",
+                    "files": new_filenames,
+                    "stagingDir": repre["stagingDir"],
+                    "tags": list(repre["tags"])
+                }
+            )
+
+            if self.replace_pngs:
+                instance.data["representations"].remove(repre)
+
+                for filepath in src_filepaths:
+                    instance.context.data["cleanupFullPaths"].append(filepath)
+
+        instance.data["representations"].extend(new_repres)
+        self.log.info(
+            "Representations: {}".format(
+                json.dumps(
+                    instance.data["representations"], sort_keys=True, indent=4
+                )
+            )
+        )
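For a single frame, the args list above expands to an oiiotool invocation like the following (file names illustrative); the sRGB-to-linear conversion is hard-coded for now, as the TODO in the code notes:

    oiiotool render.0001.png --compression ZIP --colorconvert sRGB linear -o render.0001.exr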
@@ -12,14 +12,13 @@ from openpype.hosts.tvpaint.lib import (
     fill_reference_frames,
     composite_rendered_layers,
     rename_filepaths_by_frame_start,
-    composite_images
 )


 class ExtractSequence(pyblish.api.Extractor):
     label = "Extract Sequence"
     hosts = ["tvpaint"]
-    families = ["review", "renderPass", "renderLayer"]
+    families = ["review", "renderPass", "renderLayer", "renderScene"]

     # Modifiable with settings
     review_bg = [255, 255, 255, 255]

@@ -160,7 +159,7 @@ class ExtractSequence(pyblish.api.Extractor):

         # Fill tags and new families
         tags = []
-        if family_lowered in ("review", "renderlayer"):
+        if family_lowered in ("review", "renderlayer", "renderscene"):
             tags.append("review")

         # Sequence of one frame

@@ -186,7 +185,7 @@ class ExtractSequence(pyblish.api.Extractor):

         instance.data["representations"].append(new_repre)

-        if family_lowered in ("renderpass", "renderlayer"):
+        if family_lowered in ("renderpass", "renderlayer", "renderscene"):
             # Change family to render
             instance.data["family"] = "render"
@@ -8,7 +8,7 @@ class ValidateLayersVisiblity(pyblish.api.InstancePlugin):

     label = "Validate Layers Visibility"
     order = pyblish.api.ValidatorOrder
-    families = ["review", "renderPass", "renderLayer"]
+    families = ["review", "renderPass", "renderLayer", "renderScene"]

     def process(self, instance):
        layer_names = set()
@@ -7,9 +7,10 @@ import pyblish.api
from avalon import api

from openpype.pipeline import (
    LegacyCreator,
    register_loader_plugin_path,
    register_creator_plugin_path,
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
from openpype.tools.utils import host_tools

@@ -49,7 +50,7 @@ def install():
    logger.info("installing OpenPype for Unreal")
    pyblish.api.register_plugin_path(str(PUBLISH_PATH))
    register_loader_plugin_path(str(LOAD_PATH))
    api.register_plugin_path(LegacyCreator, str(CREATE_PATH))
    register_creator_plugin_path(str(CREATE_PATH))
    _register_callbacks()
    _register_events()

@@ -58,7 +59,7 @@ def uninstall():
    """Uninstall Unreal configuration for Avalon."""
    pyblish.api.deregister_plugin_path(str(PUBLISH_PATH))
    deregister_loader_plugin_path(str(LOAD_PATH))
    api.deregister_plugin_path(LegacyCreator, str(CREATE_PATH))
    deregister_creator_plugin_path(str(CREATE_PATH))


def _register_callbacks():
@@ -1604,13 +1604,13 @@ def get_creator_by_name(creator_name, case_sensitive=False):
    Returns:
        Creator: Return first matching plugin or `None`.
    """
    from openpype.pipeline import LegacyCreator
    from openpype.pipeline import discover_legacy_creator_plugins

    # Lower input creator name if it is not case sensitive
    if not case_sensitive:
        creator_name = creator_name.lower()

    for creator_plugin in avalon.api.discover(LegacyCreator):
    for creator_plugin in discover_legacy_creator_plugins():
        _creator_name = creator_plugin.__name__

        # Lower creator plugin name if it is not case sensitive
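The hunk above swaps the avalon-based plugin lookup for the new OpenPype discovery call. A minimal sketch of the same name matching, assuming discover_legacy_creator_plugins() returns plugin classes (the helper find_creator below is hypothetical):

    from openpype.pipeline import discover_legacy_creator_plugins

    def find_creator(creator_name, case_sensitive=False):
        # Normalize the requested name once up front
        if not case_sensitive:
            creator_name = creator_name.lower()
        for plugin in discover_legacy_creator_plugins():
            name = plugin.__name__
            if not case_sensitive:
                name = name.lower()
            if name == creator_name:
                return plugin
        return None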
@@ -1965,6 +1965,7 @@ def get_last_workfile(
    data.pop("comment", None)
    if not data.get("ext"):
        data["ext"] = extensions[0]
    data["ext"] = data["ext"].replace('.', '')
    filename = StringTemplate.format_strict_template(file_template, data)

    if full_path:
@@ -5,8 +5,9 @@ import importlib
import inspect
import logging

import six

log = logging.getLogger(__name__)
PY3 = sys.version_info[0] == 3


def import_filepath(filepath, module_name=None):

@@ -28,7 +29,7 @@ def import_filepath(filepath, module_name=None):
    # Prepare module object where content of file will be parsed
    module = types.ModuleType(module_name)

    if PY3:
    if six.PY3:
        # Use loader so module has full specs
        module_loader = importlib.machinery.SourceFileLoader(
            module_name, filepath

@@ -38,7 +39,7 @@ def import_filepath(filepath, module_name=None):
    # Execute module code and store content to module
    with open(filepath) as _stream:
        # Execute content and store it to module object
        exec(_stream.read(), module.__dict__)
        six.exec_(_stream.read(), module.__dict__)

    module.__file__ = filepath
    return module
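These hunks replace the hand-rolled PY3 flag with the six compatibility helpers. A minimal standalone sketch of the same file-import approach, assuming a regular source file on disk:

    import importlib.machinery
    import os
    import types

    import six

    def import_filepath(filepath, module_name=None):
        # Derive a module name from the file name when none is given
        if not module_name:
            module_name = os.path.splitext(os.path.basename(filepath))[0]
        module = types.ModuleType(module_name)
        if six.PY3:
            # Loader gives the module full specs (e.g. __loader__)
            loader = importlib.machinery.SourceFileLoader(module_name, filepath)
            loader.exec_module(module)
        else:
            with open(filepath) as _stream:
                # six.exec_ works on both Python 2 and 3
                six.exec_(_stream.read(), module.__dict__)
        module.__file__ = filepath
        return module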
@@ -129,20 +130,12 @@ def classes_from_module(superclass, module):
    for name in dir(module):
        # It could be anything at this point
        obj = getattr(module, name)
        if not inspect.isclass(obj):
        if not inspect.isclass(obj) or obj is superclass:
            continue

        # These are subclassed from nothing, not even `object`
        if not len(obj.__bases__) > 0:
            continue
        if issubclass(obj, superclass):
            classes.append(obj)

        # Use string comparison rather than `issubclass`
        # in order to support reloading of this module.
        bases = recursive_bases_from_class(obj)
        if not any(base.__name__ == superclass.__name__ for base in bases):
            continue

        classes.append(obj)
    return classes
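The rewrite drops the reload-tolerant name comparison in favour of plain issubclass and skips the base class itself. A minimal sketch of the new filtering logic, assuming any module object and a base class:

    import inspect

    def classes_from_module(superclass, module):
        classes = []
        for name in dir(module):
            obj = getattr(module, name)
            # Skip non-classes and the base class itself
            if not inspect.isclass(obj) or obj is superclass:
                continue
            if issubclass(obj, superclass):
                classes.append(obj)
        return classes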
@@ -228,7 +221,7 @@ def import_module_from_dirpath(dirpath, folder_name, dst_module_name=None):
    dst_module_name(str): Parent module name under which can be loaded
        module added.
    """
    if PY3:
    if six.PY3:
        module = _import_module_from_dirpath_py3(
            dirpath, folder_name, dst_module_name
        )
@@ -37,6 +37,8 @@ IGNORED_DEFAULT_FILENAMES = (
    "__init__.py",
    "base.py",
    "interfaces.py",
    "example_addons",
    "default_modules",
)
@@ -303,7 +305,16 @@ def _load_modules():
        fullpath = os.path.join(current_dir, filename)
        basename, ext = os.path.splitext(filename)

        if not os.path.isdir(fullpath) and ext not in (".py", ):
        if os.path.isdir(fullpath):
            # Check existence of init file
            init_path = os.path.join(fullpath, "__init__.py")
            if not os.path.exists(init_path):
                log.debug((
                    "Module directory does not contain __init__.py file {}"
                ).format(fullpath))
                continue

        elif ext not in (".py", ):
            continue

        try:
@@ -341,7 +352,16 @@ def _load_modules():
        fullpath = os.path.join(dirpath, filename)
        basename, ext = os.path.splitext(filename)

        if not os.path.isdir(fullpath) and ext not in (".py", ):
        if os.path.isdir(fullpath):
            # Check existence of init file
            init_path = os.path.join(fullpath, "__init__.py")
            if not os.path.exists(init_path):
                log.debug((
                    "Module directory does not contain __init__.py file {}"
                ).format(fullpath))
                continue

        elif ext not in (".py", ):
            continue

        # TODO add more logic how to define if folder is module or not
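Both hunks apply the same guard: a directory is only treated as a loadable module when it ships an __init__.py, otherwise only plain .py files qualify. A minimal sketch of that discovery loop, assuming a flat directory of addon candidates:

    import logging
    import os

    log = logging.getLogger(__name__)

    def iter_module_candidates(current_dir):
        for filename in os.listdir(current_dir):
            fullpath = os.path.join(current_dir, filename)
            ext = os.path.splitext(filename)[1]
            if os.path.isdir(fullpath):
                # Package directories must contain an __init__.py
                if not os.path.exists(os.path.join(fullpath, "__init__.py")):
                    log.debug("Skipping %s (no __init__.py)", fullpath)
                    continue
            elif ext != ".py":
                continue
            yield fullpath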
@@ -254,7 +254,11 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
    use_published = True
    tile_assembler_plugin = "OpenPypeTileAssembler"
    asset_dependencies = False
    priority = 50
    tile_priority = 50
    limit_groups = []
    jobInfo = {}
    pluginInfo = {}
    group = "none"

    def process(self, instance):

@@ -272,37 +276,12 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        self.deadline_url = instance.data.get("deadlineUrl")
        assert self.deadline_url, "Requires Deadline Webservice URL"

        self._job_info = (
            context.data["project_settings"].get(
                "deadline", {}).get(
                "publish", {}).get(
                "MayaSubmitDeadline", {}).get(
                "jobInfo", {})
        )
        # just using existing names from Settings
        self._job_info = self.jobInfo

        self._plugin_info = (
            context.data["project_settings"].get(
                "deadline", {}).get(
                "publish", {}).get(
                "MayaSubmitDeadline", {}).get(
                "pluginInfo", {})
        )
        self._plugin_info = self.pluginInfo

        self.limit_groups = (
            context.data["project_settings"].get(
                "deadline", {}).get(
                "publish", {}).get(
                "MayaSubmitDeadline", {}).get(
                "limit", [])
        )

        self.group = (
            context.data["project_settings"].get(
                "deadline", {}).get(
                "publish", {}).get(
                "MayaSubmitDeadline", {}).get(
                "group", "none")
        )
        self.limit_groups = self.limit

        context = instance.context
        workspace = context.data["workspaceDir"]

@@ -465,7 +444,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        self.payload_skeleton["JobInfo"]["UserName"] = deadline_user
        # Set job priority
        self.payload_skeleton["JobInfo"]["Priority"] = \
            self._instance.data.get("priority", 50)
            self._instance.data.get("priority", self.priority)

        if self.group != "none" and self.group:
            self.payload_skeleton["JobInfo"]["Group"] = self.group

@@ -635,7 +614,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        }
        assembly_payload["JobInfo"].update(output_filenames)
        assembly_payload["JobInfo"]["Priority"] = self._instance.data.get(
            "priority", 50)
            "tile_priority", self.tile_priority)
        assembly_payload["JobInfo"]["UserName"] = deadline_user

        frame_payloads = []
@@ -27,6 +27,7 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
    # presets
    priority = 50
    chunk_size = 1
    concurrent_tasks = 1
    primary_pool = ""
    secondary_pool = ""
    group = ""

@@ -149,11 +150,16 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
            pass

        # define chunk and priority
        chunk_size = instance.data.get("deadlineChunkSize")
        chunk_size = instance.data["deadlineChunkSize"]
        if chunk_size == 0 and self.chunk_size:
            chunk_size = self.chunk_size

        priority = instance.data.get("deadlinePriority")
        # define concurrent tasks
        concurrent_tasks = instance.data["deadlineConcurrentTasks"]
        if concurrent_tasks == 0 and self.concurrent_tasks:
            concurrent_tasks = self.concurrent_tasks

        priority = instance.data["deadlinePriority"]
        if not priority:
            priority = self.priority

@@ -177,6 +183,8 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):

            "Priority": priority,
            "ChunkSize": chunk_size,
            "ConcurrentTasks": concurrent_tasks,

            "Department": self.department,

            "Pool": self.primary_pool,
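The pattern in these hunks is "per-instance value first, plugin default second": a zero or empty override on the instance falls back to the class attribute that studio settings fill in. A minimal sketch of that resolution, assuming plain dict data (the class name SubmitExample is hypothetical):

    class SubmitExample(object):
        # Defaults, typically overridden from studio settings
        priority = 50
        chunk_size = 1
        concurrent_tasks = 1

        def resolve_job_values(self, instance_data):
            chunk_size = instance_data.get("deadlineChunkSize", 0)
            if chunk_size == 0 and self.chunk_size:
                chunk_size = self.chunk_size

            concurrent = instance_data.get("deadlineConcurrentTasks", 0)
            if concurrent == 0 and self.concurrent_tasks:
                concurrent = self.concurrent_tasks

            priority = instance_data.get("deadlinePriority") or self.priority
            return chunk_size, concurrent, priority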
@@ -235,6 +235,8 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        if mongo_url:
            environment["OPENPYPE_MONGO"] = mongo_url

        priority = self.deadline_priority or instance.data.get("priority", 50)

        args = [
            "--headless",
            'publish',

@@ -254,7 +256,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):

            "Department": self.deadline_department,
            "ChunkSize": self.deadline_chunk_size,
            "Priority": job["Props"]["Pri"],
            "Priority": priority,

            "Group": self.deadline_group,
            "Pool": self.deadline_pool,

@@ -509,8 +511,8 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        most cases, but if not - we create representation from each of them.

        Arguments:
            instance (pyblish.plugin.Instance): instance for which we are
                setting representations
            instance (dict): instance data for which we are
                setting representations
            exp_files (list): list of expected files

        Returns:

@@ -524,21 +526,28 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        for collection in collections:
            ext = collection.tail.lstrip(".")
            preview = False
            # if filtered aov name is found in filename, toggle it for
            # preview video rendering
            for app in self.aov_filter.keys():
                if os.environ.get("AVALON_APP", "") == app:
                    for aov in self.aov_filter[app]:
                        if re.match(
                            aov,
                            list(collection)[0]
                        ):
                            preview = True
                            break
            # TODO 'useSequenceForReview' is a temporary solution which does
            # not work for 100% of cases. We must be able to tell more
            # explicitly what the expected files contain and from what
            # the review should be made.
            # - "review" tag is never added when set to 'False'
            if instance["useSequenceForReview"]:
                # if filtered aov name is found in filename, toggle it for
                # preview video rendering
                for app in self.aov_filter.keys():
                    if os.environ.get("AVALON_APP", "") == app:
                        # iterate all aov filters
                        for aov in self.aov_filter[app]:
                            if re.match(
                                aov,
                                list(collection)[0]
                            ):
                                preview = True
                                break

                # toggle preview on if multipart is on
                if instance.get("multipartExr", False):
                    preview = True
            # toggle preview on if multipart is on
            if instance.get("multipartExr", False):
                preview = True

            staging = os.path.dirname(list(collection)[0])
            success, rootless_staging_dir = (

@@ -724,7 +733,8 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
            "resolutionWidth": data.get("resolutionWidth", 1920),
            "resolutionHeight": data.get("resolutionHeight", 1080),
            "multipartExr": data.get("multipartExr", False),
            "jobBatchName": data.get("jobBatchName", "")
            "jobBatchName": data.get("jobBatchName", ""),
            "useSequenceForReview": data.get("useSequenceForReview", True)
        }

        if "prerender" in instance.data["families"]:

@@ -916,12 +926,6 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        # User is deadline user
        render_job["Props"]["User"] = context.data.get(
            "deadlineUser", getpass.getuser())
        # Priority is now not handled at all

        if self.deadline_priority:
            render_job["Props"]["Pri"] = self.deadline_priority
        else:
            render_job["Props"]["Pri"] = instance.data.get("priority")

        render_job["Props"]["Env"] = {
            "FTRACK_API_USER": os.environ.get("FTRACK_API_USER"),
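The review toggle above hinges on matching the first file of each collection against per-host AOV name patterns. A minimal sketch of that filter, assuming aov_filter maps host application names to lists of regular expressions:

    import os
    import re

    def is_preview_collection(first_filename, aov_filter):
        # Only patterns registered for the current host application apply
        host = os.environ.get("AVALON_APP", "")
        for pattern in aov_filter.get(host, []):
            if re.match(pattern, first_filename):
                return True
        return False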
@@ -1,11 +1,6 @@
import os
from openpype_modules.ftrack.lib import BaseAction, statics_icon
from avalon import lib as avalonlib
from openpype.api import (
    Anatomy,
    get_project_settings
)
from openpype.lib import ApplicationManager
from openpype.api import Anatomy


class CreateFolders(BaseAction):
@@ -1,3 +1,15 @@
"""Integrate components into ftrack

Requires:
    context -> ftrackSession - connected ftrack.Session
    instance -> ftrackComponentsList - list of components to integrate

Provides:
    instance -> ftrackIntegratedAssetVersionsData
    # legacy
    instance -> ftrackIntegratedAssetVersions
"""

import os
import sys
import six
@@ -54,6 +66,114 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
        self.log.debug(query)
        return query

    def process(self, instance):
        session = instance.context.data["ftrackSession"]
        context = instance.context
        component_list = instance.data.get("ftrackComponentsList")
        if not component_list:
            self.log.info(
                "Instance doesn't have components to integrate to Ftrack."
                " Skipping."
            )
            return

        session = instance.context.data["ftrackSession"]
        context = instance.context

        parent_entity = None
        default_asset_name = None
        # If instance has set "ftrackEntity" or "ftrackTask" then use them from
        # instance. Even if they are set to None. If they are set to None it
        # has a reason. (like has different context)
        if "ftrackEntity" in instance.data or "ftrackTask" in instance.data:
            task_entity = instance.data.get("ftrackTask")
            parent_entity = instance.data.get("ftrackEntity")

        elif "ftrackEntity" in context.data or "ftrackTask" in context.data:
            task_entity = context.data.get("ftrackTask")
            parent_entity = context.data.get("ftrackEntity")

        if task_entity:
            default_asset_name = task_entity["name"]
            parent_entity = task_entity["parent"]

        if parent_entity is None:
            self.log.info((
                "Skipping ftrack integration. Instance \"{}\" does not"
                " have specified ftrack entities."
            ).format(str(instance)))
            return

        if not default_asset_name:
            default_asset_name = parent_entity["name"]

        # Change status on task
        self._set_task_status(instance, task_entity, session)

        # Prepare AssetTypes
        asset_types_by_short = self._ensure_asset_types_exists(
            session, component_list
        )

        asset_versions_data_by_id = {}
        used_asset_versions = []
        # Iterate over components and publish
        for data in component_list:
            self.log.debug("data: {}".format(data))

            # AssetType
            asset_type_short = data["assettype_data"]["short"]
            asset_type_entity = asset_types_by_short[asset_type_short]

            # Asset
            asset_data = data.get("asset_data") or {}
            if "name" not in asset_data:
                asset_data["name"] = default_asset_name
            asset_entity = self._ensure_asset_exists(
                session,
                asset_data,
                asset_type_entity["id"],
                parent_entity["id"]
            )

            # Asset Version
            asset_version_data = data.get("assetversion_data") or {}
            asset_version_entity = self._ensure_asset_version_exists(
                session, asset_version_data, asset_entity["id"], task_entity
            )

            # Component
            self.create_component(session, asset_version_entity, data)

            # Store asset version and components items that were integrated
            version_id = asset_version_entity["id"]
            if version_id not in asset_versions_data_by_id:
                asset_versions_data_by_id[version_id] = {
                    "asset_version": asset_version_entity,
                    "component_items": []
                }

            asset_versions_data_by_id[version_id]["component_items"].append(
                data
            )

            # Backwards compatibility
            if asset_version_entity not in used_asset_versions:
                used_asset_versions.append(asset_version_entity)

        instance.data["ftrackIntegratedAssetVersionsData"] = (
            asset_versions_data_by_id
        )

        # Backwards compatibility
        asset_versions_key = "ftrackIntegratedAssetVersions"
        if asset_versions_key not in instance.data:
            instance.data[asset_versions_key] = []

        for asset_version in used_asset_versions:
            if asset_version not in instance.data[asset_versions_key]:
                instance.data[asset_versions_key].append(asset_version)

    def _set_task_status(self, instance, task_entity, session):
        project_entity = instance.context.data.get("ftrackProject")
        if not project_entity:
@@ -100,190 +220,222 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
            session._configure_locations()
            six.reraise(tp, value, tb)

    def process(self, instance):
        session = instance.context.data["ftrackSession"]
        context = instance.context
    def _ensure_asset_types_exists(self, session, component_list):
        """Make sure that all AssetType entities exist for integration.

        name = None
        # If instance has set "ftrackEntity" or "ftrackTask" then use them from
        # instance. Even if they are set to None. If they are set to None it
        # has a reason. (like has different context)
        if "ftrackEntity" in instance.data or "ftrackTask" in instance.data:
            task = instance.data.get("ftrackTask")
            parent = instance.data.get("ftrackEntity")
        Returns:
            dict: All asset types by short name.
        """
        # Query existing asset types
        asset_types = session.query("select id, short from AssetType").all()
        # Store all existing short names
        asset_type_shorts = {asset_type["short"] for asset_type in asset_types}
        # Check which asset types are missing and store them
        asset_type_names_by_missing_shorts = {}
        default_short_name = "upload"
        for data in component_list:
            asset_type_data = data.get("assettype_data") or {}
            asset_type_short = asset_type_data.get("short")
            if not asset_type_short:
                # Use default asset type name if not set and change the
                # input data
                asset_type_short = default_short_name
                asset_type_data["short"] = asset_type_short
                data["assettype_data"] = asset_type_data

        elif "ftrackEntity" in context.data or "ftrackTask" in context.data:
            task = context.data.get("ftrackTask")
            parent = context.data.get("ftrackEntity")
            if (
                # Skip if short name exists
                asset_type_short in asset_type_shorts
                # Skip if short name was already added to missing types
                # and asset type name is filled
                # - if asset type name is missing then try use name from other
                #   data
                or asset_type_names_by_missing_shorts.get(asset_type_short)
            ):
                continue

        if task:
            parent = task["parent"]
            name = task
        elif parent:
            name = parent["name"]
            asset_type_names_by_missing_shorts[asset_type_short] = (
                asset_type_data.get("name")
            )

        if not name:
            self.log.info((
                "Skipping ftrack integration. Instance \"{}\" does not"
                " have specified ftrack entities."
            ).format(str(instance)))
            return
        # Create missing asset types if there are any
        if asset_type_names_by_missing_shorts:
            self.log.info("Creating asset types with short names: {}".format(
                ", ".join(asset_type_names_by_missing_shorts.keys())
            ))
            for missing_short, type_name in (
                asset_type_names_by_missing_shorts.items()
            ):
                # Use short for name if name is not defined
                if not type_name:
                    type_name = missing_short
                # Use short name also for name
                # - there is no other source for 'name'
                session.create(
                    "AssetType",
                    {
                        "short": missing_short,
                        "name": type_name
                    }
                )

        info_msg = (
            "Created new {entity_type} with data: {data}"
            ", metadata: {metadata}."
            # Commit creation
            session.commit()
            # Requery asset types
            asset_types = session.query(
                "select id, short from AssetType"
            ).all()

        return {asset_type["short"]: asset_type for asset_type in asset_types}

    def _ensure_asset_exists(
        self, session, asset_data, asset_type_id, parent_id
    ):
        asset_name = asset_data["name"]
        asset_entity = self._query_asset(
            session, asset_name, asset_type_id, parent_id
        )
        if asset_entity is not None:
            return asset_entity

        asset_data = {
            "name": asset_name,
            "type_id": asset_type_id,
            "context_id": parent_id
        }
        self.log.info("Created new Asset with data: {}.".format(asset_data))
        session.create("Asset", asset_data)
        session.commit()
        return self._query_asset(session, asset_name, asset_type_id, parent_id)

    def _query_asset(self, session, asset_name, asset_type_id, parent_id):
        return session.query(
            (
                "select id from Asset"
                " where name is \"{}\""
                " and type_id is \"{}\""
                " and context_id is \"{}\""
            ).format(asset_name, asset_type_id, parent_id)
        ).first()

    def _ensure_asset_version_exists(
        self, session, asset_version_data, asset_id, task_entity
    ):
        task_id = None
        if task_entity:
            task_id = task_entity["id"]

        # Try to query asset version by criteria (asset id and version)
        version = asset_version_data.get("version") or 0
        asset_version_entity = self._query_asset_version(
            session, version, asset_id
        )

        used_asset_versions = []
        # Prepare comment value
        comment = asset_version_data.get("comment") or ""
        if asset_version_entity is not None:
            changed = False
            if comment != asset_version_entity["comment"]:
                asset_version_entity["comment"] = comment
                changed = True

            self._set_task_status(instance, task, session)
            if task_id != asset_version_entity["task_id"]:
                asset_version_entity["task_id"] = task_id
                changed = True

        # Iterate over components and publish
        for data in instance.data.get("ftrackComponentsList", []):
            # AssetType
            # Get existing entity.
            assettype_data = {"short": "upload"}
            assettype_data.update(data.get("assettype_data", {}))
            self.log.debug("data: {}".format(data))
            if changed:
                session.commit()

            assettype_entity = session.query(
                self.query("AssetType", assettype_data)
            ).first()

            # Create a new entity if none exists.
            if not assettype_entity:
                assettype_entity = session.create("AssetType", assettype_data)
                self.log.debug("Created new AssetType with data: {}".format(
                    assettype_data
                ))

            # Asset
            # Get existing entity.
            asset_data = {
                "name": name,
                "type": assettype_entity,
                "parent": parent,
        else:
            new_asset_version_data = {
                "version": version,
                "asset_id": asset_id
            }
            asset_data.update(data.get("asset_data", {}))
            if task_id:
                new_asset_version_data["task_id"] = task_id

            asset_entity = session.query(
                self.query("Asset", asset_data)
            ).first()
            if comment:
                new_asset_version_data["comment"] = comment

            self.log.info("asset entity: {}".format(asset_entity))

            # Extracting metadata, and adding after entity creation. This is
            # due to a ftrack_api bug where you can't add metadata on creation.
            asset_metadata = asset_data.pop("metadata", {})

            # Create a new entity if none exists.
            if not asset_entity:
                asset_entity = session.create("Asset", asset_data)
                self.log.debug(
                    info_msg.format(
                        entity_type="Asset",
                        data=asset_data,
                        metadata=asset_metadata
                    )
                )
                try:
                    session.commit()
                except Exception:
                    tp, value, tb = sys.exc_info()
                    session.rollback()
                    session._configure_locations()
                    six.reraise(tp, value, tb)

            # Adding metadata
            existing_asset_metadata = asset_entity["metadata"]
            existing_asset_metadata.update(asset_metadata)
            asset_entity["metadata"] = existing_asset_metadata

            # AssetVersion
            # Get existing entity.
            assetversion_data = {
                "version": 0,
                "asset": asset_entity,
            }
            _assetversion_data = data.get("assetversion_data", {})
            assetversion_cust_attrs = _assetversion_data.pop(
                "custom_attributes", {}
            self.log.info("Created new AssetVersion with data {}".format(
                new_asset_version_data
            ))
            session.create("AssetVersion", new_asset_version_data)
            session.commit()
            asset_version_entity = self._query_asset_version(
                session, version, asset_id
            )
            asset_version_comment = _assetversion_data.pop(
                "comment", None
            )
            assetversion_data.update(_assetversion_data)

            assetversion_entity = session.query(
                self.query("AssetVersion", assetversion_data)
            ).first()

            # Extracting metadata, and adding after entity creation. This is
            # due to a ftrack_api bug where you can't add metadata on creation.
            assetversion_metadata = assetversion_data.pop("metadata", {})

            if task:
                assetversion_data['task'] = task

            # Create a new entity if none exists.
            if not assetversion_entity:
                assetversion_entity = session.create(
                    "AssetVersion", assetversion_data
                )
                self.log.debug(
                    info_msg.format(
                        entity_type="AssetVersion",
                        data=assetversion_data,
                        metadata=assetversion_metadata
        # Set custom attributes if there were any set
        custom_attrs = asset_version_data.get("custom_attributes") or {}
        for attr_key, attr_value in custom_attrs.items():
            if attr_key in asset_version_entity["custom_attributes"]:
                try:
                    asset_version_entity["custom_attributes"][attr_key] = (
                        attr_value
                    )
                    session.commit()
                    continue
                except Exception:
                    session.rollback()
                    session._configure_locations()

            self.log.warning(
                (
                    "Custom Attribute \"{0}\" is not available for"
                    " AssetVersion <{1}>. Can't set its value to: \"{2}\""
                ).format(
                    attr_key, asset_version_entity["id"], str(attr_value)
                )
                try:
                    session.commit()
                except Exception:
                    tp, value, tb = sys.exc_info()
                    session.rollback()
                    session._configure_locations()
                    six.reraise(tp, value, tb)
            )

            # Adding metadata
            existing_assetversion_metadata = assetversion_entity["metadata"]
            existing_assetversion_metadata.update(assetversion_metadata)
            assetversion_entity["metadata"] = existing_assetversion_metadata
        return asset_version_entity

            # Add comment
            if asset_version_comment:
                assetversion_entity["comment"] = asset_version_comment
                try:
                    session.commit()
                except Exception:
                    session.rollback()
                    session._configure_locations()
                    self.log.warning((
                        "Comment was not possible to set for AssetVersion"
                        "\"{0}\". Can't set its value to: \"{1}\""
                    ).format(
                        assetversion_entity["id"], str(asset_version_comment)
                    ))
    def _query_asset_version(self, session, version, asset_id):
        return session.query(
            (
                "select id, task_id, comment from AssetVersion"
                " where version is \"{}\" and asset_id is \"{}\""
            ).format(version, asset_id)
        ).first()

            # Adding Custom Attributes
            for attr, val in assetversion_cust_attrs.items():
                if attr in assetversion_entity["custom_attributes"]:
                    try:
                        assetversion_entity["custom_attributes"][attr] = val
                        session.commit()
                        continue
                    except Exception:
                        session.rollback()
                        session._configure_locations()
    def create_component(self, session, asset_version_entity, data):
        component_data = data.get("component_data") or {}

            self.log.warning((
                "Custom Attribute \"{0}\""
                " is not available for AssetVersion <{1}>."
                " Can't set its value to: \"{2}\""
            ).format(attr, assetversion_entity["id"], str(val)))
        if not component_data.get("name"):
            component_data["name"] = "main"

        version_id = asset_version_entity["id"]
        component_data["version_id"] = version_id
        component_entity = session.query(
            (
                "select id, name from Component where name is \"{}\""
                " and version_id is \"{}\""
            ).format(component_data["name"], version_id)
        ).first()

        component_overwrite = data.get("component_overwrite", False)
        location = data.get("component_location", session.pick_location())

        # Overwrite existing component data if requested.
        if component_entity and component_overwrite:
            origin_location = session.query(
                "Location where name is \"ftrack.origin\""
            ).one()

            # Removing existing members from location
            components = list(component_entity.get("members", []))
            components += [component_entity]
            for component in components:
                for loc in component["component_locations"]:
                    if location["id"] == loc["location_id"]:
                        location.remove_component(
                            component, recursive=False
                        )

            # Deleting existing members on component entity
            for member in component_entity.get("members", []):
                session.delete(member)
                del(member)

            # Have to commit the version and asset, because location can't
            # determine the final location without it.
            try:
                session.commit()
            except Exception:
@@ -292,175 +444,124 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
                session._configure_locations()
                six.reraise(tp, value, tb)

            # Component
            # Get existing entity.
            component_data = {
                "name": "main",
                "version": assetversion_entity
            }
            component_data.update(data.get("component_data", {}))
            # Reset members in memory
            if "members" in component_entity.keys():
                component_entity["members"] = []

            component_entity = session.query(
                self.query("Component", component_data)
            ).first()
            # Add components to origin location
            try:
                collection = clique.parse(data["component_path"])
            except ValueError:
                # Assume it's a single file
                # Changing file type
                name, ext = os.path.splitext(data["component_path"])
                component_entity["file_type"] = ext

            component_overwrite = data.get("component_overwrite", False)
            location = data.get("component_location", session.pick_location())

            # Overwrite existing component data if requested.
            if component_entity and component_overwrite:

                origin_location = session.query(
                    "Location where name is \"ftrack.origin\""
                ).one()

                # Removing existing members from location
                components = list(component_entity.get("members", []))
                components += [component_entity]
                for component in components:
                    for loc in component["component_locations"]:
                        if location["id"] == loc["location_id"]:
                            location.remove_component(
                                component, recursive=False
                            )

                # Deleting existing members on component entity
                for member in component_entity.get("members", []):
                    session.delete(member)
                    del(member)

                try:
                    session.commit()
                except Exception:
                    tp, value, tb = sys.exc_info()
                    session.rollback()
                    session._configure_locations()
                    six.reraise(tp, value, tb)

                # Reset members in memory
                if "members" in component_entity.keys():
                    component_entity["members"] = []

                # Add components to origin location
                try:
                    collection = clique.parse(data["component_path"])
                except ValueError:
                    # Assume it's a single file
                    # Changing file type
                    name, ext = os.path.splitext(data["component_path"])
                    component_entity["file_type"] = ext

                    origin_location.add_component(
                        component_entity, data["component_path"]
                    )
                else:
                    # Changing file type
                    component_entity["file_type"] = collection.format("{tail}")

                    # Create member components for sequence.
                    for member_path in collection:

                        size = 0
                        try:
                            size = os.path.getsize(member_path)
                        except OSError:
                            pass

                        name = collection.match(member_path).group("index")

                        member_data = {
                            "name": name,
                            "container": component_entity,
                            "size": size,
                            "file_type": os.path.splitext(member_path)[-1]
                        }

                        component = session.create(
                            "FileComponent", member_data
                        )
                        origin_location.add_component(
                            component, member_path, recursive=False
                        )
                        component_entity["members"].append(component)

                    # Add components to location.
                    location.add_component(
                        component_entity, origin_location, recursive=True
                    )

                    data["component"] = component_entity
                    msg = "Overwriting Component with path: {0}, data: {1}, "
                    msg += "location: {2}"
                    self.log.info(
                        msg.format(
                            data["component_path"],
                            component_data,
                            location
                        )
                    )

            # Extracting metadata, and adding after entity creation. This is
            # due to a ftrack_api bug where you can't add metadata on creation.
            component_metadata = component_data.pop("metadata", {})

            # Create new component if none exists.
            new_component = False
            if not component_entity:
                component_entity = assetversion_entity.create_component(
                    data["component_path"],
                    data=component_data,
                    location=location
                )
                data["component"] = component_entity
                msg = "Created new Component with path: {0}, data: {1}"
                msg += ", metadata: {2}, location: {3}"
                self.log.info(
                    msg.format(
                        data["component_path"],
                        component_data,
                        component_metadata,
                        location
                    )
                )
                new_component = True

            # Adding metadata
            existing_component_metadata = component_entity["metadata"]
            existing_component_metadata.update(component_metadata)
            component_entity["metadata"] = existing_component_metadata

            # if component_data['name'] = 'ftrackreview-mp4-mp4':
            #     assetversion_entity["thumbnail_id"]

            # Setting assetversion thumbnail
            if data.get("thumbnail", False):
                assetversion_entity["thumbnail_id"] = component_entity["id"]

            # Inform user about no changes to the database.
            if (component_entity and not component_overwrite and
                    not new_component):
                data["component"] = component_entity
                self.log.info(
                    "Found existing component, and no request to overwrite. "
                    "Nothing has been changed."
            origin_location.add_component(
                component_entity, data["component_path"]
            )
            else:
                # Commit changes.
                try:
                    session.commit()
                except Exception:
                    tp, value, tb = sys.exc_info()
                    session.rollback()
                    session._configure_locations()
                    six.reraise(tp, value, tb)
        # Changing file type
        component_entity["file_type"] = collection.format("{tail}")

            if assetversion_entity not in used_asset_versions:
                used_asset_versions.append(assetversion_entity)
        # Create member components for sequence.
        for member_path in collection:

        asset_versions_key = "ftrackIntegratedAssetVersions"
        if asset_versions_key not in instance.data:
            instance.data[asset_versions_key] = []
            size = 0
            try:
                size = os.path.getsize(member_path)
            except OSError:
                pass

        for asset_version in used_asset_versions:
            if asset_version not in instance.data[asset_versions_key]:
                instance.data[asset_versions_key].append(asset_version)
            name = collection.match(member_path).group("index")

            member_data = {
                "name": name,
                "container": component_entity,
                "size": size,
                "file_type": os.path.splitext(member_path)[-1]
            }

            component = session.create(
                "FileComponent", member_data
            )
            origin_location.add_component(
                component, member_path, recursive=False
            )
            component_entity["members"].append(component)

        # Add components to location.
        location.add_component(
            component_entity, origin_location, recursive=True
        )

        data["component"] = component_entity
        self.log.info(
            (
                "Overwriting Component with path: {0}, data: {1},"
                " location: {2}"
            ).format(
                data["component_path"],
                component_data,
                location
            )
        )

        # Extracting metadata, and adding after entity creation. This is
        # due to a ftrack_api bug where you can't add metadata on creation.
        component_metadata = component_data.pop("metadata", {})

        # Create new component if none exists.
        new_component = False
        if not component_entity:
            component_entity = asset_version_entity.create_component(
                data["component_path"],
                data=component_data,
                location=location
            )
            data["component"] = component_entity
            self.log.info(
                (
                    "Created new Component with path: {0}, data: {1},"
                    " metadata: {2}, location: {3}"
                ).format(
                    data["component_path"],
                    component_data,
                    component_metadata,
                    location
                )
            )
            new_component = True

        # Adding metadata
        existing_component_metadata = component_entity["metadata"]
        existing_component_metadata.update(component_metadata)
        component_entity["metadata"] = existing_component_metadata

        # if component_data['name'] = 'ftrackreview-mp4-mp4':
        #     assetversion_entity["thumbnail_id"]

        # Setting assetversion thumbnail
        if data.get("thumbnail"):
            asset_version_entity["thumbnail_id"] = component_entity["id"]

        # Inform user about no changes to the database.
        if (
            component_entity
            and not component_overwrite
            and not new_component
        ):
            data["component"] = component_entity
            self.log.info(
                "Found existing component, and no request to overwrite. "
                "Nothing has been changed."
            )
        else:
            # Commit changes.
            try:
                session.commit()
            except Exception:
                tp, value, tb = sys.exc_info()
                session.rollback()
                session._configure_locations()
                six.reraise(tp, value, tb)
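The refactor above repeatedly applies one ftrack_api idiom: query an entity first and create-plus-commit it only when missing. A minimal sketch of that "ensure exists" pattern, assuming a connected ftrack_api session (the helper ensure_asset_type is illustrative, not part of the plugin):

    def ensure_asset_type(session, short, name=None):
        # Try to find an existing AssetType by its short name
        asset_type = session.query(
            "select id, short from AssetType"
            " where short is \"{}\"".format(short)
        ).first()
        if asset_type is not None:
            return asset_type

        # Create and commit, then return the fresh entity
        asset_type = session.create(
            "AssetType", {"short": short, "name": name or short}
        )
        session.commit()
        return asset_type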
@@ -0,0 +1,84 @@
"""
Requires:
    context > comment
    context > ftrackSession
    instance > ftrackIntegratedAssetVersionsData
"""

import sys

import six
import pyblish.api


class IntegrateFtrackDescription(pyblish.api.InstancePlugin):
    """Add description to AssetVersions in Ftrack."""

    # Must be after integrate asset new
    order = pyblish.api.IntegratorOrder + 0.4999
    label = "Integrate Ftrack description"
    families = ["ftrack"]
    optional = True

    # Can be set in settings:
    # - Allows `intent` and `comment` keys
    description_template = "{comment}"

    def process(self, instance):
        # Check if there are any integrated AssetVersion entities
        asset_versions_key = "ftrackIntegratedAssetVersionsData"
        asset_versions_data_by_id = instance.data.get(asset_versions_key)
        if not asset_versions_data_by_id:
            self.log.info("There are no integrated AssetVersions")
            return

        comment = (instance.context.data.get("comment") or "").strip()
        if not comment:
            self.log.info("Comment is not set.")
        else:
            self.log.debug("Comment is set to `{}`".format(comment))

        session = instance.context.data["ftrackSession"]

        intent = instance.context.data.get("intent")
        intent_label = None
        if intent and isinstance(intent, dict):
            intent_val = intent.get("value")
            intent_label = intent.get("label")
        else:
            intent_val = intent

        if not intent_label:
            intent_label = intent_val or ""

        # if intent label is set then format comment
        # - it is possible that intent_label is equal to "" (empty string)
        if intent_label:
            self.log.debug(
                "Intent label is set to `{}`.".format(intent_label)
            )

        else:
            self.log.debug("Intent is not set.")

        for asset_version_data in asset_versions_data_by_id.values():
            asset_version = asset_version_data["asset_version"]

            # Format the description template with intent and comment
            description = self.description_template.format(**{
                "intent": intent_label,
                "comment": comment
            })
            asset_version["comment"] = description

            try:
                session.commit()
                self.log.debug("Comment added to AssetVersion \"{}\"".format(
                    str(asset_version)
                ))
            except Exception:
                tp, value, tb = sys.exc_info()
                session.rollback()
                session._configure_locations()
                six.reraise(tp, value, tb)
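The description is produced by a plain str.format over a settings-driven template. A minimal sketch of that formatting, assuming the template uses only the two supported keys:

    description_template = "{intent}: {comment}"

    def build_description(template, intent_label, comment):
        # Both keys must always be passed; either may be an empty string
        return template.format(intent=intent_label, comment=comment)

    print(build_description(description_template, "WIP", "first pass"))
    # -> "WIP: first pass"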
@@ -35,10 +35,18 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
        "image": "img",
        "reference": "reference"
    }
    keep_first_subset_name_for_review = True

    def process(self, instance):
        self.log.debug("instance {}".format(instance))

        instance_repres = instance.data.get("representations")
        if not instance_repres:
            self.log.info((
                "Skipping instance. Does not have any representations {}"
            ).format(str(instance)))
            return

        instance_version = instance.data.get("version")
        if instance_version is None:
            raise ValueError("Instance version not set")

@@ -52,8 +60,12 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
        if not asset_type and family_low in self.family_mapping:
            asset_type = self.family_mapping[family_low]

        self.log.debug(self.family_mapping)
        self.log.debug(family_low)
        if not asset_type:
            asset_type = "upload"

        self.log.debug(
            "Family: {}\nMapping: {}".format(family_low, self.family_mapping)
        )

        # Ignore this instance if neither "ftrackFamily" nor a family mapping
        # is found.

@@ -63,13 +75,6 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
            ).format(family))
            return

        instance_repres = instance.data.get("representations")
        if not instance_repres:
            self.log.info((
                "Skipping instance. Does not have any representations {}"
            ).format(str(instance)))
            return

        # Prepare FPS
        instance_fps = instance.data.get("fps")
        if instance_fps is None:

@@ -168,7 +173,47 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
        # Change asset name of each new component for review
        is_first_review_repre = True
        not_first_components = []
        extended_asset_name = ""
        multiple_reviewable = len(review_representations) > 1
        for repre in review_representations:
            # Create copy of base comp item and append it
            review_item = copy.deepcopy(base_component_item)

            # get asset name and define extended name variant
            asset_name = review_item["asset_data"]["name"]
            extended_asset_name = "_".join(
                (asset_name, repre["name"])
            )

            # reset extended if no need for extended asset name
            if (
                self.keep_first_subset_name_for_review
                and is_first_review_repre
            ):
                extended_asset_name = ""
            else:
                # only rename if multiple reviewable
                if multiple_reviewable:
                    review_item["asset_data"]["name"] = extended_asset_name
                else:
                    extended_asset_name = ""

            # rename all already created components
            # only if first repre and extended name available
            if is_first_review_repre and extended_asset_name:
                # and rename all already created components
                for _ci in component_list:
                    _ci["asset_data"]["name"] = extended_asset_name

                # and rename all already created src components
                for _sci in src_components_to_add:
                    _sci["asset_data"]["name"] = extended_asset_name

                # rename also first thumbnail component if any
                if first_thumbnail_component is not None:
                    first_thumbnail_component[
                        "asset_data"]["name"] = extended_asset_name

            frame_start = repre.get("frameStartFtrack")
            frame_end = repre.get("frameEndFtrack")
            if frame_start is None or frame_end is None:

@@ -184,8 +229,6 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
            if fps is None:
                fps = instance_fps

            # Create copy of base comp item and append it
            review_item = copy.deepcopy(base_component_item)
            # Change location
            review_item["component_path"] = repre["published_path"]
            # Change component data

@@ -200,18 +243,16 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
                })
            }
        }
        # Create copy of item before setting location or changing asset
        src_components_to_add.append(copy.deepcopy(review_item))

        if is_first_review_repre:
            is_first_review_repre = False
        else:
            # Add representation name to asset name of "not first" review
            asset_name = review_item["asset_data"]["name"]
            review_item["asset_data"]["name"] = "_".join(
                (asset_name, repre["name"])
            )
            # later detection for thumbnail duplication
            not_first_components.append(review_item)

        # Create copy of item before setting location
        src_components_to_add.append(copy.deepcopy(review_item))

        # Set location
        review_item["component_location"] = ftrack_server_location
        # Add item to component list

@@ -249,6 +290,14 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
            continue
        # Create copy of base comp item and append it
        other_item = copy.deepcopy(base_component_item)

        # add extended name if any
        if (
            not self.keep_first_subset_name_for_review
            and extended_asset_name
        ):
            other_item["asset_data"]["name"] = extended_asset_name

        other_item["component_data"] = {
            "name": repre["name"]
        }
@@ -1,7 +1,17 @@
"""
Requires:
    context > hostName
    context > appName
    context > appLabel
    context > comment
    context > ftrackSession
    instance > ftrackIntegratedAssetVersionsData
"""

import sys
import json
import pyblish.api

import six
import pyblish.api


class IntegrateFtrackNote(pyblish.api.InstancePlugin):

@@ -15,100 +25,52 @@ class IntegrateFtrackNote(pyblish.api.InstancePlugin):

    # Can be set in presets:
    # - Allows only `intent` and `comment` keys
    note_template = None
    # Backwards compatibility
    note_with_intent_template = "{intent}: {comment}"
    # - note label must exist in Ftrack
    note_labels = []

    def get_intent_label(self, session, intent_value):
        if not intent_value:
            return

        intent_configurations = session.query(
            "CustomAttributeConfiguration where key is intent"
        ).all()
        if not intent_configurations:
            return

        intent_configuration = intent_configurations[0]
        if len(intent_configuration) > 1:
            self.log.warning((
                "Found more than one `intent` custom attribute."
                " Using first found."
            ))

        config = intent_configuration.get("config")
        if not config:
            return

        configuration = json.loads(config)
        items = configuration.get("data")
        if not items:
            return

        if sys.version_info[0] < 3:
            string_type = basestring
        else:
            string_type = str

        if isinstance(items, string_type):
            items = json.loads(items)

        intent_label = None
        for item in items:
            if item["value"] == intent_value:
                intent_label = item["menu"]
                break

        return intent_label

    def process(self, instance):
        comment = (instance.context.data.get("comment") or "").strip()
        # Check if there are any integrated AssetVersion entities
        asset_versions_key = "ftrackIntegratedAssetVersionsData"
        asset_versions_data_by_id = instance.data.get(asset_versions_key)
        if not asset_versions_data_by_id:
            self.log.info("There are no integrated AssetVersions")
            return

        context = instance.context
        host_name = context.data["hostName"]
        app_name = context.data["appName"]
        app_label = context.data["appLabel"]
        comment = (context.data.get("comment") or "").strip()
        if not comment:
            self.log.info("Comment is not set.")
            return
        else:
            self.log.debug("Comment is set to `{}`".format(comment))

        self.log.debug("Comment is set to `{}`".format(comment))

        session = instance.context.data["ftrackSession"]
        session = context.data["ftrackSession"]

        intent = instance.context.data.get("intent")
        intent_label = None
        if intent and isinstance(intent, dict):
            intent_val = intent.get("value")
            intent_label = intent.get("label")
        else:
            intent_val = intent_label = intent
            intent_val = intent

        final_label = None
        if intent_val:
            final_label = self.get_intent_label(session, intent_val)
            if final_label is None:
                final_label = intent_label
        if not intent_label:
            intent_label = intent_val or ""

        # if intent label is set then format comment
        # - it is possible that intent_label is equal to "" (empty string)
        if final_label:
            msg = "Intent label is set to `{}`.".format(final_label)
            comment = self.note_with_intent_template.format(**{
                "intent": final_label,
                "comment": comment
            })

        elif intent_val:
            msg = (
                "Intent is set to `{}` and was not added"
                " to comment because label is set to `{}`."
            ).format(intent_val, final_label)
        if intent_label:
            self.log.debug(
                "Intent label is set to `{}`.".format(intent_label)
            )

        else:
            msg = "Intent is not set."

        self.log.debug(msg)

        asset_versions_key = "ftrackIntegratedAssetVersions"
        asset_versions = instance.data.get(asset_versions_key)
        if not asset_versions:
            self.log.info("There are no integrated AssetVersions")
            return
            self.log.debug("Intent is not set.")

        user = session.query(
            "User where username is \"{}\"".format(session.api_user)

@@ -122,7 +84,7 @@ class IntegrateFtrackNote(pyblish.api.InstancePlugin):

        labels = []
        if self.note_labels:
            all_labels = session.query("NoteLabel").all()
            all_labels = session.query("select id, name from NoteLabel").all()
            labels_by_low_name = {lab["name"].lower(): lab for lab in all_labels}
            for _label in self.note_labels:
                label = labels_by_low_name.get(_label.lower())

@@ -134,7 +96,34 @@ class IntegrateFtrackNote(pyblish.api.InstancePlugin):

            labels.append(label)

        for asset_version in asset_versions:
        for asset_version_data in asset_versions_data_by_id.values():
            asset_version = asset_version_data["asset_version"]
            component_items = asset_version_data["component_items"]

            published_paths = set()
            for component_item in component_items:
                published_paths.add(component_item["component_path"])

            # Backwards compatibility for older settings using
            # attribute 'note_with_intent_template'
            template = self.note_template
            if template is None:
                template = self.note_with_intent_template
            format_data = {
                "intent": intent_label,
                "comment": comment,
                "host_name": host_name,
                "app_name": app_name,
                "app_label": app_label,
                "published_paths": "<br/>".join(sorted(published_paths)),
            }
            note_text = template.format(**format_data)
            if not note_text:
                self.log.info((
                    "Note for AssetVersion {} would be empty. Skipping."
                    "\nTemplate: {}\nData: {}"
                ).format(asset_version["id"], template, format_data))
                continue
            asset_version.create_note(note_text, author=user, labels=labels)

            try:
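The note body now comes from one template fed with context data and the list of published file paths. A minimal sketch of assembling that note text, assuming a subset of the same template keys (the helper build_note is illustrative):

    note_template = (
        "{intent}: {comment}<br/>Published from {app_label}:"
        "<br/>{published_paths}"
    )

    def build_note(template, intent, comment, app_label, paths):
        # Paths are sorted and joined with HTML line breaks for ftrack notes
        return template.format(
            intent=intent,
            comment=comment,
            app_label=app_label,
            published_paths="<br/>".join(sorted(paths)),
        )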
@@ -389,7 +389,8 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
        self._append_lines([openpype_art])

        self.setStyleSheet(load_stylesheet())
        self._first_show = True
        self._splitter_size_ratio = None

        self._init_from_registry()

@@ -416,9 +417,9 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
        self.resize(width, height)

        try:
            sizes = setting_registry.get_item("splitter_sizes")
            if len(sizes) == len(self._widgets_splitter.sizes()):
                self._widgets_splitter.setSizes(sizes)
            self._splitter_size_ratio = (
                setting_registry.get_item("splitter_sizes")
            )
        except ValueError:
            pass

@@ -627,8 +628,29 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
    def showEvent(self, event):
        self._line_check_timer.start()
        super(PythonInterpreterWidget, self).showEvent(event)
        # First show setup
        if self._first_show:
            self._first_show = False
            self._on_first_show()

        self._output_widget.scroll_to_bottom()

    def _on_first_show(self):
        # Change stylesheet
        self.setStyleSheet(load_stylesheet())
        # Check if splitter size ratio is set
        # - first store the value to a local variable and then unset it
        splitter_size_ratio = self._splitter_size_ratio
        self._splitter_size_ratio = None
        # Skip if it is not set
        if not splitter_size_ratio:
            return

        # Apply only if the number of size items matches the splitter
        splitters_count = len(self._widgets_splitter.sizes())
        if len(splitter_size_ratio) == splitters_count:
            self._widgets_splitter.setSizes(splitter_size_ratio)

    def closeEvent(self, event):
        self.save_registry()
        super(PythonInterpreterWidget, self).closeEvent(event)
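The interpreter widget now defers its stylesheet and splitter restore to the first `showEvent`, when real geometry exists. A framework-agnostic sketch of that deferred-first-show pattern (class and method names here are illustrative, not the widget's API):

```python
class FirstShowMixin(object):
    """Sketch of the deferred-first-show pattern used above.

    One-time, geometry-dependent setup (stylesheets, splitter sizes)
    runs only when the widget is first shown.
    """

    def __init__(self):
        self._first_show = True

    def showEvent(self, event):  # called by the UI toolkit on show
        if self._first_show:
            self._first_show = False
            self._on_first_show()

    def _on_first_show(self):
        # Subclasses apply stored sizes / styles here.
        raise NotImplementedError
```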
@@ -73,8 +73,23 @@ class GDriveHandler(AbstractProvider):
                   format(site_name))
            return

        cred_path = self.presets.get("credentials_url", {}).\
            get(platform.system().lower()) or ''
        current_platform = platform.system().lower()
        cred_path = self.presets.get("credentials_url", {}). \
            get(current_platform) or ''

        if not cred_path:
            msg = "Sync Server: Please, fill the credentials for gdrive "\
                  "provider for platform '{}' !".format(current_platform)
            log.info(msg)
            return

        try:
            cred_path = cred_path.format(**os.environ)
        except KeyError as e:
            log.info("Sync Server: The key(s) {} does not exist in the "
                     "environment variables".format(" ".join(e.args)))
            return

        if not os.path.exists(cred_path):
            msg = "Sync Server: No credentials for gdrive provider " + \
                  "for '{}' on path '{}'!".format(site_name, cred_path)
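The credentials path from settings may embed environment-variable placeholders; `str.format(**os.environ)` expands them, and a `KeyError` reveals which variables are missing. A small self-contained sketch (the preset value is hypothetical):

```python
import os

# Hypothetical preset value with an environment placeholder.
cred_path = "{HOME}/credentials/gdrive.json"

try:
    cred_path = cred_path.format(**os.environ)
except KeyError as exc:
    # Same failure mode the handler logs above.
    print("Missing environment variable(s): {}".format(" ".join(exc.args)))
else:
    if not os.path.exists(cred_path):
        print("No credentials file on path '{}'".format(cred_path))
```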
@@ -13,6 +13,13 @@ from .create import (
    LegacyCreator,
    legacy_create,

    discover_creator_plugins,
    discover_legacy_creator_plugins,
    register_creator_plugin,
    deregister_creator_plugin,
    register_creator_plugin_path,
    deregister_creator_plugin_path,
)

from .load import (

@@ -34,6 +41,7 @@ from .load import (
    loaders_from_representation,
    get_representation_path,
    get_representation_context,
    get_repres_contexts,
)

@@ -80,6 +88,13 @@ __all__ = (
    "LegacyCreator",
    "legacy_create",

    "discover_creator_plugins",
    "discover_legacy_creator_plugins",
    "register_creator_plugin",
    "deregister_creator_plugin",
    "register_creator_plugin_path",
    "deregister_creator_plugin_path",

    # --- Load ---
    "HeroVersionType",
    "IncompatibleLoaderError",

@@ -99,6 +114,7 @@ __all__ = (
    "loaders_from_representation",
    "get_representation_path",
    "get_representation_context",
    "get_repres_contexts",

    # --- Publish ---
@@ -1,4 +1,11 @@
import logging
from openpype.pipeline.plugin_discover import (
    discover,
    register_plugin,
    register_plugin_path,
    deregister_plugin,
    deregister_plugin_path
)


class LauncherAction(object):

@@ -90,28 +97,20 @@ class InventoryAction(object):

# Launcher action
def discover_launcher_actions():
    import avalon.api

    return avalon.api.discover(LauncherAction)
    return discover(LauncherAction)


def register_launcher_action(plugin):
    import avalon.api

    return avalon.api.register_plugin(LauncherAction, plugin)
    return register_plugin(LauncherAction, plugin)


def register_launcher_action_path(path):
    import avalon.api

    return avalon.api.register_plugin_path(LauncherAction, path)
    return register_plugin_path(LauncherAction, path)


# Inventory action
def discover_inventory_actions():
    import avalon.api

    actions = avalon.api.discover(InventoryAction)
    actions = discover(InventoryAction)
    filtered_actions = []
    for action in actions:
        if action is not InventoryAction:

@@ -121,24 +120,16 @@ def discover_inventory_actions():


def register_inventory_action(plugin):
    import avalon.api

    return avalon.api.register_plugin(InventoryAction, plugin)
    return register_plugin(InventoryAction, plugin)


def deregister_inventory_action(plugin):
    import avalon.api

    avalon.api.deregister_plugin(InventoryAction, plugin)
    deregister_plugin(InventoryAction, plugin)


def register_inventory_action_path(path):
    import avalon.api

    return avalon.api.register_plugin_path(InventoryAction, path)
    return register_plugin_path(InventoryAction, path)


def deregister_inventory_action_path(path):
    import avalon.api

    return avalon.api.deregister_plugin_path(InventoryAction, path)
    return deregister_plugin_path(InventoryAction, path)
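With the avalon.api indirection gone, actions register and discover through the new pipeline helpers directly. A hedged usage sketch, assuming these helpers live next to `LauncherAction` in `openpype.pipeline.actions` (the action class itself is made up for illustration):

```python
from openpype.pipeline.actions import (
    LauncherAction,
    register_launcher_action,
    discover_launcher_actions,
)


class OpenProjectFolder(LauncherAction):
    """Hypothetical action, defined only to demonstrate registration."""

    name = "open_project_folder"
    label = "Open Project Folder"

    def is_compatible(self, session):
        return bool(session.get("AVALON_PROJECT"))

    def process(self, session, **kwargs):
        print("Opening folder of {}".format(session["AVALON_PROJECT"]))


register_launcher_action(OpenProjectFolder)
# Directly registered classes are returned by discovery as-is.
assert OpenProjectFolder in discover_launcher_actions()
```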
@@ -6,7 +6,14 @@ from .creator_plugins import (
    BaseCreator,
    Creator,
    AutoCreator
    AutoCreator,

    discover_creator_plugins,
    discover_legacy_creator_plugins,
    register_creator_plugin,
    deregister_creator_plugin,
    register_creator_plugin_path,
    deregister_creator_plugin_path,
)

from .context import (

@@ -29,6 +36,13 @@ __all__ = (
    "Creator",
    "AutoCreator",

    "discover_creator_plugins",
    "discover_legacy_creator_plugins",
    "register_creator_plugin",
    "deregister_creator_plugin",
    "register_creator_plugin_path",
    "deregister_creator_plugin_path",

    "CreatedInstance",
    "CreateContext",
@@ -9,7 +9,8 @@ from contextlib import contextmanager
from .creator_plugins import (
    BaseCreator,
    Creator,
    AutoCreator
    AutoCreator,
    discover_creator_plugins,
)

from openpype.api import (

@@ -17,6 +18,8 @@ from openpype.api import (
    get_project_settings
)

UpdateData = collections.namedtuple("UpdateData", ["instance", "changes"])


class ImmutableKeyError(TypeError):
    """Accessed key is immutable so does not allow changes or removals."""

@@ -843,7 +846,7 @@ class CreateContext:
        creators = {}
        autocreators = {}
        manual_creators = {}
        for creator_class in avalon.api.discover(BaseCreator):
        for creator_class in discover_creator_plugins():
            if inspect.isabstract(creator_class):
                self.log.info(
                    "Skipping abstract Creator {}".format(str(creator_class))

@@ -1081,7 +1084,7 @@ class CreateContext:
        for instance in cretor_instances:
            instance_changes = instance.changes()
            if instance_changes:
                update_list.append((instance, instance_changes))
                update_list.append(UpdateData(instance, instance_changes))

        creator = self.creators[identifier]
        if update_list:
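`UpdateData` is a plain named tuple, so `update_instances` implementations can unpack each item by name instead of by index. A tiny sketch with placeholder values:

```python
import collections

UpdateData = collections.namedtuple("UpdateData", ["instance", "changes"])

# Placeholder values; in the pipeline these are a CreatedInstance
# object and its changes dictionary.
item = UpdateData(instance="<CreatedInstance>", changes={"active": True})
print(item.instance, item.changes)

# Named tuples still unpack positionally, so older tuple-style code works:
instance, changes = item
```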
@@ -8,7 +8,19 @@ from abc import (
)
import six

from openpype.lib import get_subset_name_with_asset_doc
from openpype.lib import (
    get_subset_name_with_asset_doc,
    set_plugin_attributes_from_settings,
)
from openpype.pipeline.plugin_discover import (
    discover,
    register_plugin,
    register_plugin_path,
    deregister_plugin,
    deregister_plugin_path
)

from .legacy_create import LegacyCreator


class CreatorError(Exception):

@@ -46,6 +58,11 @@ class BaseCreator:
    # - may not be used if `get_icon` is reimplemented
    icon = None

    # Instance attribute definitions that can be changed per instance
    # - returns list of attribute definitions from
    #   `openpype.pipeline.attribute_definitions`
    instance_attr_defs = []

    def __init__(
        self, create_context, system_settings, project_settings, headless=False
    ):

@@ -56,10 +73,13 @@ class BaseCreator:
        # - we may use UI inside processing; this attribute should be checked
        self.headless = headless

    @abstractproperty
    @property
    def identifier(self):
        """Identifier of creator (must be unique)."""
        pass
        """Identifier of creator (must be unique).

        Default implementation returns plugin's family.
        """
        return self.family

    @abstractproperty
    def family(self):

@@ -90,11 +110,39 @@ class BaseCreator:
        pass

    @abstractmethod
    def collect_instances(self, attr_plugins=None):
    def collect_instances(self):
        """Collect existing instances related to this creator plugin.

        The implementation differs based on host abilities. The creator has
        to collect metadata about the instance and create a 'CreatedInstance'
        object which should be added to 'CreateContext'.

        Example:
        ```python
        def collect_instances(self):
            # Getting existing instances is different per host implementation
            for instance_data in pipeline.list_instances():
                # Process only instances that were created by this creator
                creator_id = instance_data.get("creator_identifier")
                if creator_id == self.identifier:
                    # Create instance object from existing data
                    instance = CreatedInstance.from_existing(
                        instance_data, self
                    )
                    # Add instance to create context
                    self._add_instance_to_context(instance)
        ```
        """
        pass

    @abstractmethod
    def update_instances(self, update_list):
        """Store changes of existing instances so they can be recollected.

        Args:
            update_list(list<UpdateData>): Gets list of tuples. Each item
                contains the changed instance and its changes.
        """
        pass

    @abstractmethod

@@ -178,7 +226,7 @@ class BaseCreator:
        list<AbstractAttrDef>: Attribute definitions that can be tweaked for
            the created instance.
        """
        return []
        return self.instance_attr_defs


class Creator(BaseCreator):

@@ -191,6 +239,9 @@ class Creator(BaseCreator):
    # - default_variants may not be used if `get_default_variants` is overridden
    default_variants = []

    # Default variant used in 'get_default_variant'
    default_variant = None

    # Short description of family
    # - may not be used if `get_description` is overridden
    description = None

@@ -204,6 +255,10 @@ class Creator(BaseCreator):
    # e.g. for build creators
    create_allow_context_change = True

    # Precreate attribute definitions shown before creation
    # - similar to instance attribute definitions
    pre_create_attr_defs = []

    @abstractmethod
    def create(self, subset_name, instance_data, pre_create_data):
        """Create new instance and store it.

@@ -263,7 +318,7 @@ class Creator(BaseCreator):
        `get_default_variants` should be used.
        """

        return None
        return self.default_variant

    def get_pre_create_attr_defs(self):
        """Plugin attribute definitions needed for creation.

@@ -276,7 +331,7 @@ class Creator(BaseCreator):
        list<AbstractAttrDef>: Attribute definitions that can be tweaked for
            the created instance.
        """
        return []
        return self.pre_create_attr_defs


class AutoCreator(BaseCreator):

@@ -284,6 +339,43 @@ class AutoCreator(BaseCreator):

    Can be used e.g. for `workfile`.
    """

    def remove_instances(self, instances):
        """Skip removal."""
        pass


def discover_creator_plugins():
    return discover(BaseCreator)


def discover_legacy_creator_plugins():
    plugins = discover(LegacyCreator)
    set_plugin_attributes_from_settings(plugins, LegacyCreator)
    return plugins


def register_creator_plugin(plugin):
    if issubclass(plugin, BaseCreator):
        register_plugin(BaseCreator, plugin)

    elif issubclass(plugin, LegacyCreator):
        register_plugin(LegacyCreator, plugin)


def deregister_creator_plugin(plugin):
    if issubclass(plugin, BaseCreator):
        deregister_plugin(BaseCreator, plugin)

    elif issubclass(plugin, LegacyCreator):
        deregister_plugin(LegacyCreator, plugin)


def register_creator_plugin_path(path):
    register_plugin_path(BaseCreator, path)
    register_plugin_path(LegacyCreator, path)


def deregister_creator_plugin_path(path):
    deregister_plugin_path(BaseCreator, path)
    deregister_plugin_path(LegacyCreator, path)
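A new-style creator now only has to set class attributes; `identifier` falls back to `family` and the attribute-definition getters return the class-level lists. A minimal hypothetical subclass (host-specific bodies stubbed out; the import path follows the `create` package shown above):

```python
from openpype.pipeline.create.creator_plugins import Creator


class CreateReview(Creator):
    """Hypothetical creator showing the new class-attribute defaults."""

    family = "review"          # doubles as the identifier by default
    label = "Review"
    default_variants = ["Main"]
    default_variant = "Main"

    def create(self, subset_name, instance_data, pre_create_data):
        pass  # host-specific scene handling

    def collect_instances(self):
        pass  # host-specific metadata collection

    def update_instances(self, update_list):
        pass  # host-specific persistence of changes

    def remove_instances(self, instances):
        pass  # host-specific removal
```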
@@ -1,5 +1,13 @@
import logging

from openpype.lib import set_plugin_attributes_from_settings
from openpype.pipeline.plugin_discover import (
    discover,
    register_plugin,
    register_plugin_path,
    deregister_plugin,
    deregister_plugin_path
)
from .utils import get_representation_path_from_context


@@ -33,7 +41,8 @@ class LoaderPlugin(list):
    def get_representations(cls):
        return cls.representations

    def filepath_from_context(self, context):
    @classmethod
    def filepath_from_context(cls, context):
        return get_representation_path_from_context(context)

    def load(self, context, name=None, namespace=None, options=None):

@@ -102,30 +111,22 @@ class SubsetLoaderPlugin(LoaderPlugin):


def discover_loader_plugins():
    import avalon.api

    return avalon.api.discover(LoaderPlugin)
    plugins = discover(LoaderPlugin)
    set_plugin_attributes_from_settings(plugins, LoaderPlugin)
    return plugins


def register_loader_plugin(plugin):
    import avalon.api

    return avalon.api.register_plugin(LoaderPlugin, plugin)
    return register_plugin(LoaderPlugin, plugin)


def deregister_loader_plugin_path(path):
    import avalon.api

    avalon.api.deregister_plugin_path(LoaderPlugin, path)
    deregister_plugin_path(LoaderPlugin, path)


def register_loader_plugin_path(path):
    import avalon.api

    return avalon.api.register_plugin_path(LoaderPlugin, path)
    return register_plugin_path(LoaderPlugin, path)


def deregister_loader_plugin(plugin):
    import avalon.api

    avalon.api.deregister_plugin(LoaderPlugin, plugin)
    deregister_plugin(LoaderPlugin, plugin)
298  openpype/pipeline/plugin_discover.py  (new file)

@@ -0,0 +1,298 @@
import os
import inspect
import traceback

from openpype.api import Logger
from openpype.lib.python_module_tools import (
    modules_from_path,
    classes_from_module,
)

log = Logger.get_logger(__name__)


class DiscoverResult:
    """Result of Plug-ins discovery of a single superclass type.

    Stores discovered, duplicated, ignored and abstract plugins and file paths
    which crashed on execution of file.
    """

    def __init__(self, superclass):
        self.superclass = superclass
        self.plugins = []
        self.crashed_file_paths = {}
        self.duplicated_plugins = []
        self.abstract_plugins = []
        self.ignored_plugins = set()
        # Store loaded modules to keep them in memory
        self._modules = set()

    def __iter__(self):
        for plugin in self.plugins:
            yield plugin

    def __getitem__(self, item):
        return self.plugins[item]

    def __setitem__(self, item, value):
        self.plugins[item] = value

    def add_module(self, module):
        """Add dynamically loaded python module to keep it in memory."""
        self._modules.add(module)

    def get_report(self, only_errors=True, exc_info=True, full_report=False):
        lines = []
        if not only_errors:
            # Successfully discovered plugins
            if self.plugins or full_report:
                lines.append(
                    "*** Discovered {} plugins".format(len(self.plugins))
                )
                for cls in self.plugins:
                    lines.append("- {}".format(cls.__name__))

            # Plugins that were defined to be ignored
            if self.ignored_plugins or full_report:
                lines.append("*** Ignored plugins {}".format(len(
                    self.ignored_plugins
                )))
                for cls in self.ignored_plugins:
                    lines.append("- {}".format(cls.__name__))

        # Abstract classes
        if self.abstract_plugins or full_report:
            lines.append("*** Discovered {} abstract plugins".format(len(
                self.abstract_plugins
            )))
            for cls in self.abstract_plugins:
                lines.append("- {}".format(cls.__name__))

        # Duplicated classes
        if self.duplicated_plugins or full_report:
            lines.append("*** There were {} duplicated plugins".format(len(
                self.duplicated_plugins
            )))
            for cls in self.duplicated_plugins:
                lines.append("- {}".format(cls.__name__))

        if self.crashed_file_paths or full_report:
            lines.append("*** Failed to load {} files".format(len(
                self.crashed_file_paths
            )))
            for path, exc_info_args in self.crashed_file_paths.items():
                lines.append("- {}".format(path))
                if exc_info:
                    lines.append(10 * "*")
                    lines.extend(traceback.format_exception(*exc_info_args))
                    lines.append(10 * "*")

        return "\n".join(lines)

    def log_report(self, only_errors=True, exc_info=True):
        report = self.get_report(only_errors, exc_info)
        if report:
            log.info(report)


class PluginDiscoverContext(object):
    """Store and discover registered types and registered paths to types.

    Keeps in memory all registered types and their paths. Paths are
    dynamically loaded on discover so different discover calls won't return
    the same class objects even if they were loaded from the same file.
    """

    def __init__(self):
        self._registered_plugins = {}
        self._registered_plugin_paths = {}
        self._last_discovered_plugins = {}
        # Store the last result to memory
        self._last_discovered_results = {}

    def get_last_discovered_plugins(self, superclass):
        """Access last discovered plugins by a superclass.

        Returns:
            None: When superclass was not discovered yet.
            list: Lastly discovered plugins of the superclass.
        """

        return self._last_discovered_plugins.get(superclass)

    def discover(
        self,
        superclass,
        allow_duplicates=True,
        ignore_classes=None,
        return_report=False
    ):
        """Find and return subclasses of `superclass`

        Args:
            superclass (type): Class which determines discovered subclasses.
            allow_duplicates (bool): Validate class name duplications.
            ignore_classes (list): List of classes that will be ignored
                and not added to result.

        Returns:
            DiscoverResult: Object holding successfully discovered plugins,
                ignored plugins, plugins with missing abstract implementation
                and duplicated plugins.
        """

        if not ignore_classes:
            ignore_classes = []

        result = DiscoverResult(superclass)
        plugin_names = set()
        registered_classes = self._registered_plugins.get(superclass) or []
        registered_paths = self._registered_plugin_paths.get(superclass) or []
        for cls in registered_classes:
            if cls is superclass or cls in ignore_classes:
                result.ignored_plugins.add(cls)
                continue

            if inspect.isabstract(cls):
                result.abstract_plugins.append(cls)
                continue

            class_name = cls.__name__
            if class_name in plugin_names:
                result.duplicated_plugins.append(cls)
                continue
            plugin_names.add(class_name)
            result.plugins.append(cls)

        # Include plug-ins from registered paths
        for path in registered_paths:
            modules, crashed = modules_from_path(path)
            for item in crashed:
                filepath, exc_info = item
                result.crashed_file_paths[filepath] = exc_info

            for item in modules:
                filepath, module = item
                result.add_module(module)
                for cls in classes_from_module(superclass, module):
                    if cls is superclass or cls in ignore_classes:
                        result.ignored_plugins.add(cls)
                        continue

                    if inspect.isabstract(cls):
                        result.abstract_plugins.append(cls)
                        continue

                    if not allow_duplicates:
                        class_name = cls.__name__
                        if class_name in plugin_names:
                            result.duplicated_plugins.append(cls)
                            continue
                        plugin_names.add(class_name)

                    result.plugins.append(cls)

        # Store the last result in memory to keep loaded modules alive
        self._last_discovered_results[superclass] = result
        self._last_discovered_plugins[superclass] = list(
            result.plugins
        )
        result.log_report()
        if return_report:
            return result
        return result.plugins

    def register_plugin(self, superclass, cls):
        """Register a plug-in class of type `superclass`

        Arguments:
            superclass (type): Superclass of plug-in
            cls (object): Subclass of `superclass`
        """

        if superclass not in self._registered_plugins:
            self._registered_plugins[superclass] = list()

        if cls not in self._registered_plugins[superclass]:
            self._registered_plugins[superclass].append(cls)

    def register_plugin_path(self, superclass, path):
        """Register a directory of one or more plug-ins

        Arguments:
            superclass (type): Superclass of plug-ins to look for during
                discovery
            path (str): Absolute path to directory in which to discover
                plug-ins
        """

        if superclass not in self._registered_plugin_paths:
            self._registered_plugin_paths[superclass] = list()

        path = os.path.normpath(path)
        if path not in self._registered_plugin_paths[superclass]:
            self._registered_plugin_paths[superclass].append(path)

    def registered_plugin_paths(self):
        """Return all currently registered plug-in paths"""
        # Return shallow copy so the original data can't be changed
        return {
            superclass: paths[:]
            for superclass, paths in self._registered_plugin_paths.items()
        }

    def deregister_plugin(self, superclass, plugin):
        """Opposite of `register_plugin()`"""
        if superclass in self._registered_plugins:
            self._registered_plugins[superclass].remove(plugin)

    def deregister_plugin_path(self, superclass, path):
        """Opposite of `register_plugin_path()`"""
        self._registered_plugin_paths[superclass].remove(path)


class _GlobalDiscover:
    """Access to global object of PluginDiscoverContext.

    Using singleton object to register/deregister plugins and plugin paths
    and then discover them by superclass.
    """

    _context = None

    @classmethod
    def get_context(cls):
        if cls._context is None:
            cls._context = PluginDiscoverContext()
        return cls._context


def discover(superclass, allow_duplicates=True):
    context = _GlobalDiscover.get_context()
    return context.discover(superclass, allow_duplicates)


def get_last_discovered_plugins(superclass):
    context = _GlobalDiscover.get_context()
    return context.get_last_discovered_plugins(superclass)


def register_plugin(superclass, cls):
    context = _GlobalDiscover.get_context()
    context.register_plugin(superclass, cls)


def register_plugin_path(superclass, path):
    context = _GlobalDiscover.get_context()
    context.register_plugin_path(superclass, path)


def deregister_plugin(superclass, cls):
    context = _GlobalDiscover.get_context()
    context.deregister_plugin(superclass, cls)


def deregister_plugin_path(superclass, path):
    context = _GlobalDiscover.get_context()
    context.deregister_plugin_path(superclass, path)
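End to end, the module keeps one global `PluginDiscoverContext` behind the functions above. A toy round-trip through that API (the classes are throwaway examples):

```python
from openpype.pipeline.plugin_discover import (
    discover,
    register_plugin,
    deregister_plugin,
)


class ToolPlugin(object):
    """Toy superclass used only for this example."""


class ScreenshotTool(ToolPlugin):
    pass


register_plugin(ToolPlugin, ScreenshotTool)
# Directly registered classes come back from discovery unchanged.
print(discover(ToolPlugin))  # -> [<class 'ScreenshotTool'>]
deregister_plugin(ToolPlugin, ScreenshotTool)
```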
@@ -2,6 +2,11 @@ import os
import copy
import logging

from .plugin_discover import (
    discover,
    register_plugin,
    register_plugin_path,
)
log = logging.getLogger(__name__)


@@ -126,21 +131,15 @@ class BinaryThumbnail(ThumbnailResolver):

# Thumbnail resolvers
def discover_thumbnail_resolvers():
    import avalon.api

    return avalon.api.discover(ThumbnailResolver)
    return discover(ThumbnailResolver)


def register_thumbnail_resolver(plugin):
    import avalon.api

    return avalon.api.register_plugin(ThumbnailResolver, plugin)
    register_plugin(ThumbnailResolver, plugin)


def register_thumbnail_resolver_path(path):
    import avalon.api

    return avalon.api.register_plugin_path(ThumbnailResolver, path)
    register_plugin_path(ThumbnailResolver, path)


register_thumbnail_resolver(TemplateResolver)
21  openpype/plugins/publish/collect_cleanup_keys.py  (new file)

@@ -0,0 +1,21 @@
"""
Requires:
    None
Provides:
    context
        - cleanupFullPaths (list)
        - cleanupEmptyDirs (list)
"""

import pyblish.api


class CollectCleanupKeys(pyblish.api.ContextPlugin):
    """Prepare keys for 'ExplicitCleanUp' plugin."""

    label = "Collect Cleanup Keys"
    order = pyblish.api.CollectorOrder

    def process(self, context):
        context.data["cleanupFullPaths"] = []
        context.data["cleanupEmptyDirs"] = []
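Later plugins append to these context keys to have the 'ExplicitCleanUp' plugin delete their temporaries. A hypothetical collector showing the intended use (the class and the `stagingDir` source are illustrative):

```python
import pyblish.api


class CollectTempRenderForCleanup(pyblish.api.InstancePlugin):
    """Hypothetical plugin marking a staging directory for cleanup."""

    label = "Mark Staging Dir For Cleanup"
    order = pyblish.api.CollectorOrder + 0.1

    def process(self, instance):
        staging_dir = instance.data.get("stagingDir")
        if staging_dir:
            context = instance.context
            context.data["cleanupFullPaths"].append(staging_dir)
            context.data["cleanupEmptyDirs"].append(staging_dir)
```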
@@ -18,20 +18,30 @@ class CollectHostName(pyblish.api.ContextPlugin):

    def process(self, context):
        host_name = context.data.get("hostName")
        app_name = context.data.get("appName")
        app_label = context.data.get("appLabel")
        # Don't override values that are already set
        if host_name:
        if host_name and app_name and app_label:
            return

        # Use AVALON_APP first if available; it is the same as host name
        # - only if it is not defined use AVALON_APP_NAME (e.g. on Farm) and
        #   set it back to the AVALON_APP env variable
        host_name = os.environ.get("AVALON_APP")
        # Use AVALON_APP to get host name if available
        if not host_name:
            host_name = os.environ.get("AVALON_APP")

        # Use AVALON_APP_NAME to get full app name
        if not app_name:
            app_name = os.environ.get("AVALON_APP_NAME")
        if app_name:
            app_manager = ApplicationManager()
            app = app_manager.applications.get(app_name)
            if app:

        # Fill missing values based on app full name
        if (not host_name or not app_label) and app_name:
            app_manager = ApplicationManager()
            app = app_manager.applications.get(app_name)
            if app:
                if not host_name:
                    host_name = app.host_name
                if not app_label:
                    app_label = app.full_label

        context.data["hostName"] = host_name
        context.data["appName"] = app_name
        context.data["appLabel"] = app_label
@@ -53,7 +53,10 @@ class CollectResourcesPath(pyblish.api.InstancePlugin):
        "textures",
        "action",
        "background",
        "effect"
        "effect",
        "staticMesh",
        "skeletalMesh"
    ]

    def process(self, instance):
@@ -485,6 +485,11 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin):
        anatomy = instance.context.data["anatomy"]
        template_data = copy.deepcopy(instance.data["anatomyData"])

        if "originalBasename" in instance.data:
            template_data.update({
                "originalBasename": instance.data.get("originalBasename")
            })

        if "folder" in anatomy.templates[template_key]:
            anatomy_filled = anatomy.format(template_data)
            publish_folder = anatomy_filled[template_key]["folder"]
Some files were not shown because too many files have changed in this diff.