Merge branch 'develop' of github.com:pypeclub/OpenPype into chore/site_sync_expose_sites_skeleton
Commit: 395ff6b100
152 changed files with 3087 additions and 1459 deletions
.github/workflows/prerelease.yml (vendored, 2 lines changed)

@@ -80,7 +80,7 @@ jobs:
          git tag -a $tag_name -m "nightly build"
      - name: Push to protected main branch
-       uses: CasperWA/push-protected@v2
+       uses: CasperWA/push-protected@v2.10.0
        with:
          token: ${{ secrets.ADMIN_TOKEN }}
          branch: main
.github/workflows/release.yml (vendored, 2 lines changed)

@@ -68,7 +68,7 @@ jobs:
      - name: 🔏 Push to protected main branch
        if: steps.version.outputs.release_tag != 'skip'
-       uses: CasperWA/push-protected@v2
+       uses: CasperWA/push-protected@v2.10.0
        with:
          token: ${{ secrets.ADMIN_TOKEN }}
          branch: main
CHANGELOG.md (83 lines changed)

@@ -1,8 +1,57 @@
# Changelog

## [3.9.5-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.4...HEAD)

**🐛 Bug fixes**

- Nuke: Add aov matching even for remainder and prerender [\#3060](https://github.com/pypeclub/OpenPype/pull/3060)

## [3.9.4](https://github.com/pypeclub/OpenPype/tree/3.9.4) (2022-04-15)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.4-nightly.2...3.9.4)

### 📖 Documentation

- Documentation: more info about Tasks [\#3062](https://github.com/pypeclub/OpenPype/pull/3062)
- Documentation: Python requirements to 3.7.9 [\#3035](https://github.com/pypeclub/OpenPype/pull/3035)
- Website Docs: Remove unused pages [\#2974](https://github.com/pypeclub/OpenPype/pull/2974)

**🆕 New features**

- General: Local overrides for environment variables [\#3045](https://github.com/pypeclub/OpenPype/pull/3045)
- Flame: Flare integration preparation [\#2928](https://github.com/pypeclub/OpenPype/pull/2928)

**🚀 Enhancements**

- TVPaint: Added init file for worker to triggers missing sound file dialog [\#3053](https://github.com/pypeclub/OpenPype/pull/3053)
- Ftrack: Custom attributes can be filled in slate values [\#3036](https://github.com/pypeclub/OpenPype/pull/3036)
- Resolve environment variable in google drive credential path [\#3008](https://github.com/pypeclub/OpenPype/pull/3008)

**🐛 Bug fixes**

- GitHub: Updated push-protected action in github workflow [\#3064](https://github.com/pypeclub/OpenPype/pull/3064)
- Nuke: Typos in imports from Nuke implementation [\#3061](https://github.com/pypeclub/OpenPype/pull/3061)
- Hotfix: fixing deadline job publishing [\#3059](https://github.com/pypeclub/OpenPype/pull/3059)
- General: Extract Review handle invalid characters for ffmpeg [\#3050](https://github.com/pypeclub/OpenPype/pull/3050)
- Slate Review: Support to keep format on slate concatenation [\#3049](https://github.com/pypeclub/OpenPype/pull/3049)
- Webpublisher: fix processing of workfile [\#3048](https://github.com/pypeclub/OpenPype/pull/3048)
- Ftrack: Integrate ftrack api fix [\#3044](https://github.com/pypeclub/OpenPype/pull/3044)
- Webpublisher - removed wrong hardcoded family [\#3043](https://github.com/pypeclub/OpenPype/pull/3043)
- LibraryLoader: Use current project for asset query in families filter [\#3042](https://github.com/pypeclub/OpenPype/pull/3042)
- SiteSync: Providers ignore that site is disabled [\#3041](https://github.com/pypeclub/OpenPype/pull/3041)
- Unreal: Creator import fixes [\#3040](https://github.com/pypeclub/OpenPype/pull/3040)
- SiteSync: fix transitive alternate sites, fix dropdown in Local Settings [\#3018](https://github.com/pypeclub/OpenPype/pull/3018)

**Merged pull requests:**

- Deadline: reworked pools assignment [\#3051](https://github.com/pypeclub/OpenPype/pull/3051)
- Houdini: Avoid ImportError on `hdefereval` when Houdini runs without UI [\#2987](https://github.com/pypeclub/OpenPype/pull/2987)

## [3.9.3](https://github.com/pypeclub/OpenPype/tree/3.9.3) (2022-04-07)

-[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.2...3.9.3)
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.3-nightly.2...3.9.3)

### 📖 Documentation

@@ -19,8 +68,8 @@
- Ftrack: Add more options for note text of integrate ftrack note [\#3025](https://github.com/pypeclub/OpenPype/pull/3025)
- Console Interpreter: Changed how console splitter size are reused on show [\#3016](https://github.com/pypeclub/OpenPype/pull/3016)
- Deadline: Use more suitable name for sequence review logic [\#3015](https://github.com/pypeclub/OpenPype/pull/3015)
- General: default workfile subset name for workfile [\#3011](https://github.com/pypeclub/OpenPype/pull/3011)
- Nuke: add concurrency attr to deadline job [\#3005](https://github.com/pypeclub/OpenPype/pull/3005)
- Photoshop: create image without instance [\#3001](https://github.com/pypeclub/OpenPype/pull/3001)
- Deadline: priority configurable in Maya jobs [\#2995](https://github.com/pypeclub/OpenPype/pull/2995)
- Workfiles tool: Save as published workfiles [\#2937](https://github.com/pypeclub/OpenPype/pull/2937)

@@ -28,6 +77,7 @@
- Deadline: Fixed default value of use sequence for review [\#3033](https://github.com/pypeclub/OpenPype/pull/3033)
- Settings UI: Version column can be extended so version are visible [\#3032](https://github.com/pypeclub/OpenPype/pull/3032)
- General: Fix validate asset docs plug-in filename and class name [\#3029](https://github.com/pypeclub/OpenPype/pull/3029)
- General: Fix import after movements [\#3028](https://github.com/pypeclub/OpenPype/pull/3028)
- Harmony: Added creating subset name for workfile from template [\#3024](https://github.com/pypeclub/OpenPype/pull/3024)
- AfterEffects: Added creating subset name for workfile from template [\#3023](https://github.com/pypeclub/OpenPype/pull/3023)

@@ -35,7 +85,6 @@
- Maya: Remove missing import [\#3017](https://github.com/pypeclub/OpenPype/pull/3017)
- Ftrack: multiple reviewable componets [\#3012](https://github.com/pypeclub/OpenPype/pull/3012)
- Tray publisher: Fixes after code movement [\#3010](https://github.com/pypeclub/OpenPype/pull/3010)
- Nuke: fixing unicode type detection in effect loaders [\#3002](https://github.com/pypeclub/OpenPype/pull/3002)
- Nuke: removing redundant Ftrack asset when farm publishing [\#2996](https://github.com/pypeclub/OpenPype/pull/2996)

**Merged pull requests:**

@@ -51,15 +100,14 @@
- Documentation: Added mention of adding My Drive as a root [\#2999](https://github.com/pypeclub/OpenPype/pull/2999)
- Docs: Added MongoDB requirements [\#2951](https://github.com/pypeclub/OpenPype/pull/2951)
- Documentation: New publisher develop docs [\#2896](https://github.com/pypeclub/OpenPype/pull/2896)

**🆕 New features**

- nuke: bypass baking [\#2992](https://github.com/pypeclub/OpenPype/pull/2992)
- Multiverse: Initial Support [\#2908](https://github.com/pypeclub/OpenPype/pull/2908)

**🚀 Enhancements**

- Photoshop: create image without instance [\#3001](https://github.com/pypeclub/OpenPype/pull/3001)
- TVPaint: Render scene family [\#3000](https://github.com/pypeclub/OpenPype/pull/3000)
- Nuke: ReviewDataMov Read RAW attribute [\#2985](https://github.com/pypeclub/OpenPype/pull/2985)
- General: `METADATA\_KEYS` constant as `frozenset` for optimal immutable lookup [\#2980](https://github.com/pypeclub/OpenPype/pull/2980)

@@ -69,13 +117,11 @@
- NewPublisher: Prepared implementation of optional pyblish plugin [\#2943](https://github.com/pypeclub/OpenPype/pull/2943)
- TVPaint: Extractor to convert PNG into EXR [\#2942](https://github.com/pypeclub/OpenPype/pull/2942)
- Workfiles: Open published workfiles [\#2925](https://github.com/pypeclub/OpenPype/pull/2925)
- General: Default modules loaded dynamically [\#2923](https://github.com/pypeclub/OpenPype/pull/2923)
- Nuke: Add no-audio Tag [\#2911](https://github.com/pypeclub/OpenPype/pull/2911)
- Nuke: improving readability [\#2903](https://github.com/pypeclub/OpenPype/pull/2903)

**🐛 Bug fixes**

- Hosts: Remove path existence checks in 'add\_implementation\_envs' [\#3004](https://github.com/pypeclub/OpenPype/pull/3004)
- Nuke: fixing unicode type detection in effect loaders [\#3002](https://github.com/pypeclub/OpenPype/pull/3002)
- Fix - remove doubled dot in workfile created from template [\#2998](https://github.com/pypeclub/OpenPype/pull/2998)
- PS: fix renaming subset incorrectly in PS [\#2991](https://github.com/pypeclub/OpenPype/pull/2991)
- Fix: Disable setuptools auto discovery [\#2990](https://github.com/pypeclub/OpenPype/pull/2990)

@@ -97,15 +143,12 @@
- Settings UI: Collapsed of collapsible wrapper works as expected [\#2934](https://github.com/pypeclub/OpenPype/pull/2934)
- Maya: Do not pass `set` to maya commands \(fixes support for older maya versions\) [\#2932](https://github.com/pypeclub/OpenPype/pull/2932)
- General: Don't print log record on OSError [\#2926](https://github.com/pypeclub/OpenPype/pull/2926)
- Flame: centos related debugging [\#2922](https://github.com/pypeclub/OpenPype/pull/2922)

**🔀 Refactored code**

- General: Move plugins register and discover [\#2935](https://github.com/pypeclub/OpenPype/pull/2935)
- General: Move Attribute Definitions from pipeline [\#2931](https://github.com/pypeclub/OpenPype/pull/2931)
- General: Removed silo references and terminal splash [\#2927](https://github.com/pypeclub/OpenPype/pull/2927)
- General: Move pipeline constants to OpenPype [\#2918](https://github.com/pypeclub/OpenPype/pull/2918)
- General: Move remaining plugins from avalon [\#2912](https://github.com/pypeclub/OpenPype/pull/2912)

**Merged pull requests:**

@@ -118,24 +161,6 @@
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.1-nightly.3...3.9.1)

**🚀 Enhancements**

- General: Change how OPENPYPE\_DEBUG value is handled [\#2907](https://github.com/pypeclub/OpenPype/pull/2907)
- nuke: imageio adding ocio config version 1.2 [\#2897](https://github.com/pypeclub/OpenPype/pull/2897)
- Flame: support for comment with xml attribute overrides [\#2892](https://github.com/pypeclub/OpenPype/pull/2892)

**🐛 Bug fixes**

- General: Fix use of Anatomy roots [\#2904](https://github.com/pypeclub/OpenPype/pull/2904)
- Fixing gap detection in extract review [\#2902](https://github.com/pypeclub/OpenPype/pull/2902)
- Pyblish Pype - ensure current state is correct when entering new group order [\#2899](https://github.com/pypeclub/OpenPype/pull/2899)
- SceneInventory: Fix import of load function [\#2894](https://github.com/pypeclub/OpenPype/pull/2894)
- Harmony - fixed creator issue [\#2891](https://github.com/pypeclub/OpenPype/pull/2891)

**🔀 Refactored code**

- General: Reduce style usage to OpenPype repository [\#2889](https://github.com/pypeclub/OpenPype/pull/2889)

## [3.9.0](https://github.com/pypeclub/OpenPype/tree/3.9.0) (2022-03-14)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.0-nightly.9...3.9.0)
@@ -1,102 +1,5 @@
# -*- coding: utf-8 -*-
"""Pype module."""
import os
import platform
import logging

from .settings import get_project_settings
from .lib import (
    Anatomy,
    filter_pyblish_plugins,
    change_timer_to_current_context,
    register_event_callback,
)

log = logging.getLogger(__name__)


PACKAGE_DIR = os.path.dirname(os.path.abspath(__file__))
PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")

# Global plugin paths
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")


def install():
    """Install OpenPype to Avalon."""
    import avalon.api
    import pyblish.api
    from pyblish.lib import MessageHandler
    from openpype.modules import load_modules
    from openpype.pipeline import (
        register_loader_plugin_path,
        register_inventory_action,
        register_creator_plugin_path,
    )

    # Make sure modules are loaded
    load_modules()

    def modified_emit(obj, record):
        """Method replacing `emit` in Pyblish's MessageHandler."""
        record.msg = record.getMessage()
        obj.records.append(record)

    MessageHandler.emit = modified_emit

    log.info("Registering global plug-ins..")
    pyblish.api.register_plugin_path(PUBLISH_PATH)
    pyblish.api.register_discovery_filter(filter_pyblish_plugins)
    register_loader_plugin_path(LOAD_PATH)

    project_name = os.environ.get("AVALON_PROJECT")

    # Register studio specific plugins
    if project_name:
        anatomy = Anatomy(project_name)
        anatomy.set_root_environments()
        avalon.api.register_root(anatomy.roots)

        project_settings = get_project_settings(project_name)
        platform_name = platform.system().lower()
        project_plugins = (
            project_settings
            .get("global", {})
            .get("project_plugins", {})
            .get(platform_name)
        ) or []
        for path in project_plugins:
            try:
                path = str(path.format(**os.environ))
            except KeyError:
                pass

            if not path or not os.path.exists(path):
                continue

            pyblish.api.register_plugin_path(path)
            register_loader_plugin_path(path)
            register_creator_plugin_path(path)
            register_inventory_action(path)

    # apply monkey patched discover to original one
    log.info("Patching discovery")

    register_event_callback("taskChanged", _on_task_change)


def _on_task_change():
    change_timer_to_current_context()


def uninstall():
    """Uninstall Pype from Avalon."""
    import pyblish.api
    from openpype.pipeline import deregister_loader_plugin_path

    log.info("Deregistering global plug-ins..")
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
    pyblish.api.deregister_discovery_filter(filter_pyblish_plugins)
    deregister_loader_plugin_path(LOAD_PATH)
    log.info("Global plug-ins unregistered")
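The hunk above removes the module-level install machinery. A hedged sketch of the migration it implies, using only calls that appear elsewhere in this diff (the aftereffects, blender and celaction hunks below); the `api` module name is illustrative:

    # Before this commit a host bootstrap called the module-level install:
    import openpype
    openpype.install()

    # After it, bootstraps call the pipeline helpers instead:
    from openpype.pipeline import install_host, install_openpype_plugins
    from openpype.hosts.aftereffects import api

    install_host(api)            # replaces avalon.api.install(api)
    install_openpype_plugins()   # replaces openpype.install() for plugin registration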
@@ -6,6 +6,7 @@ import logging

from Qt import QtWidgets

+from openpype.pipeline import install_host
from openpype.lib.remote_publish import headless_publish

from openpype.tools.utils import host_tools

@@ -22,10 +23,9 @@ def safe_excepthook(*args):
def main(*subprocess_args):
    sys.excepthook = safe_excepthook

-    import avalon.api
    from openpype.hosts.aftereffects import api

-    avalon.api.install(api)
+    install_host(api)

    os.environ["OPENPYPE_LOG_NO_COLORS"] = "False"
    app = QtWidgets.QApplication([])
@@ -15,6 +15,7 @@ from openpype.pipeline import (
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
+    registered_host,
)
import openpype.hosts.aftereffects
from openpype.lib import register_event_callback

@@ -37,24 +38,9 @@ def check_inventory():
    if not lib.any_outdated():
        return

    host = pyblish.api.registered_host()
    outdated_containers = []
    for container in host.ls():
        representation = container['representation']
        representation_doc = io.find_one(
            {
                "_id": ObjectId(representation),
                "type": "representation"
            },
            projection={"parent": True}
        )
        if representation_doc and not lib.is_latest(representation_doc):
            outdated_containers.append(container)

    # Warn about outdated containers.
    print("Starting new QApplication..")
    app = QtWidgets.QApplication(sys.argv)

    message_box = QtWidgets.QMessageBox()
    message_box.setIcon(QtWidgets.QMessageBox.Warning)
    msg = "There are outdated containers in the scene."
@@ -25,7 +25,7 @@ class AERenderInstance(RenderInstance):

class CollectAERender(abstract_collect_render.AbstractCollectRender):

-    order = pyblish.api.CollectorOrder + 0.498
+    order = pyblish.api.CollectorOrder + 0.400
    label = "Collect After Effects Render Layers"
    hosts = ["aftereffects"]
@@ -19,6 +19,7 @@ from openpype.pipeline import (
    deregister_loader_plugin_path,
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
+    uninstall_host,
)
from openpype.api import Logger
from openpype.lib import (

@@ -209,11 +210,10 @@ def reload_pipeline(*args):

    """

-    avalon.api.uninstall()
+    uninstall_host()

    for module in (
        "avalon.io",
        "avalon.lib",
        "avalon.pipeline",
        "avalon.api",
    ):
@@ -1,4 +1,4 @@
-from avalon import pipeline
+from openpype.pipeline import install_host
from openpype.hosts.blender import api

-pipeline.install(api)
+install_host(api)
@@ -3,8 +3,6 @@ import sys
import copy
import argparse

-from avalon import io

import pyblish.api
import pyblish.util

@@ -13,6 +11,8 @@ import openpype
import openpype.hosts.celaction
from openpype.hosts.celaction import api as celaction
from openpype.tools.utils import host_tools
+from openpype.pipeline import install_openpype_plugins


log = Logger().get_logger("Celaction_cli_publisher")

@@ -21,9 +21,6 @@ publish_host = "celaction"
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.celaction.__file__))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
-LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
-CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
-INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")


def cli():

@@ -74,7 +71,7 @@ def main():
    _prepare_publish_environments()

    # Registers pype's Global pyblish plugins
-    openpype.install()
+    install_openpype_plugins()

    if os.path.exists(PUBLISH_PATH):
        log.info(f"Registering path: {PUBLISH_PATH}")
@@ -11,10 +11,8 @@ from .constants import (
from .lib import (
    CTX,
    FlameAppFramework,
-    get_project_manager,
    get_current_project,
    get_current_sequence,
-    create_bin,
    create_segment_data_marker,
    get_segment_data_marker,
    set_segment_data_marker,

@@ -29,7 +27,10 @@ from .lib import (
    get_frame_from_filename,
    get_padding_from_filename,
    maintained_object_duplication,
-    get_clip_segment
+    maintained_temp_file_path,
+    get_clip_segment,
+    get_batch_group_from_desktop,
+    MediaInfoFile
)
from .utils import (
    setup,

@@ -56,7 +57,6 @@ from .plugin import (
    PublishableClip,
    ClipLoader,
    OpenClipSolver
)
from .workio import (
    open_file,

@@ -71,6 +71,10 @@ from .render_utils import (
    get_preset_path_by_xml_name,
    modify_preset_file
)
+from .batch_utils import (
+    create_batch_group,
+    create_batch_group_conent
+)

__all__ = [
    # constants

@@ -83,10 +87,8 @@ __all__ = [
    # lib
    "CTX",
    "FlameAppFramework",
-    "get_project_manager",
    "get_current_project",
    "get_current_sequence",
-    "create_bin",
    "create_segment_data_marker",
    "get_segment_data_marker",
    "set_segment_data_marker",

@@ -101,7 +103,10 @@ __all__ = [
    "get_frame_from_filename",
    "get_padding_from_filename",
    "maintained_object_duplication",
+    "maintained_temp_file_path",
    "get_clip_segment",
+    "get_batch_group_from_desktop",
+    "MediaInfoFile",

    # pipeline
    "install",

@@ -142,5 +147,9 @@ __all__ = [
    # render utils
    "export_clip",
    "get_preset_path_by_xml_name",
-    "modify_preset_file"
+    "modify_preset_file",
+
+    # batch utils
+    "create_batch_group",
+    "create_batch_group_conent"
]
openpype/hosts/flame/api/batch_utils.py (new file, 151 lines)

@@ -0,0 +1,151 @@
import flame


def create_batch_group(
    name,
    frame_start,
    frame_duration,
    update_batch_group=None,
    **kwargs
):
    """Create Batch Group in active project's Desktop

    Args:
        name (str): name of batch group to be created
        frame_start (int): start frame of batch
        frame_duration (int): duration of batch in frames
        update_batch_group (PyBatch)[optional]: batch group to update

    Return:
        PyBatch: active flame batch group
    """
    # make sure some batch obj is present
    batch_group = update_batch_group or flame.batch

    schematic_reels = kwargs.get("shematic_reels") or ['LoadedReel1']
    shelf_reels = kwargs.get("shelf_reels") or ['ShelfReel1']

    handle_start = kwargs.get("handleStart") or 0
    handle_end = kwargs.get("handleEnd") or 0

    frame_start -= handle_start
    frame_duration += handle_start + handle_end

    if not update_batch_group:
        # Create batch group with name, start_frame value, duration value,
        # set of schematic reel names, set of shelf reel names
        batch_group = batch_group.create_batch_group(
            name,
            start_frame=frame_start,
            duration=frame_duration,
            reels=schematic_reels,
            shelf_reels=shelf_reels
        )
    else:
        batch_group.name = name
        batch_group.start_frame = frame_start
        batch_group.duration = frame_duration

        # add reels to batch group
        _add_reels_to_batch_group(
            batch_group, schematic_reels, shelf_reels)

        # TODO: also update write node if there is any
        # TODO: also update loaders to start from correct frameStart

    if kwargs.get("switch_batch_tab"):
        # use this command to switch to the batch tab
        batch_group.go_to()

    return batch_group


def _add_reels_to_batch_group(batch_group, reels, shelf_reels):
    # update or create defined reels
    # helper variables
    reel_names = [
        r.name.get_value()
        for r in batch_group.reels
    ]
    shelf_reel_names = [
        r.name.get_value()
        for r in batch_group.shelf_reels
    ]
    # add schematic reels
    for _r in reels:
        if _r in reel_names:
            continue
        batch_group.create_reel(_r)

    # add shelf reels
    for _sr in shelf_reels:
        if _sr in shelf_reel_names:
            continue
        batch_group.create_shelf_reel(_sr)


def create_batch_group_conent(batch_nodes, batch_links, batch_group=None):
    """Create batch group content from node and link definitions

    Args:
        batch_nodes (list of dict): each dict is node definition
        batch_links (list of dict): each dict is link definition
        batch_group (PyBatch, optional): batch group. Defaults to None.

    Return:
        dict: all batch nodes {name or id: PyNode}
    """
    # make sure some batch obj is present
    batch_group = batch_group or flame.batch
    all_batch_nodes = {
        b.name.get_value(): b
        for b in batch_group.nodes
    }
    for node in batch_nodes:
        # NOTE: node_props ideally needs to be an OrderedDict type
        node_id, node_type, node_props = (
            node["id"], node["type"], node["properties"])

        # get node name for checking if exists
        node_name = node_props.pop("name", None) or node_id

        if all_batch_nodes.get(node_name):
            # update existing batch node
            batch_node = all_batch_nodes[node_name]
        else:
            # create new batch node
            batch_node = batch_group.create_node(node_type)

            # set name
            batch_node.name.set_value(node_name)

        # set attributes found in node props
        for key, value in node_props.items():
            if not hasattr(batch_node, key):
                continue
            setattr(batch_node, key, value)

        # add created node for possible linking
        all_batch_nodes[node_id] = batch_node

    # link nodes to each other
    for link in batch_links:
        _from_n, _to_n = link["from_node"], link["to_node"]

        # check if all linking nodes are available
        if not all([
            all_batch_nodes.get(_from_n["id"]),
            all_batch_nodes.get(_to_n["id"])
        ]):
            continue

        # link nodes in defined link
        batch_group.connect_nodes(
            all_batch_nodes[_from_n["id"]], _from_n["connector"],
            all_batch_nodes[_to_n["id"]], _to_n["connector"]
        )

    # sort batch nodes
    batch_group.organize()

    return all_batch_nodes
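A hedged usage sketch of the two new helpers. It is runnable only inside Flame's Python environment (where the `flame` module exists); every name and value below is illustrative, not taken from this commit:

    import openpype.hosts.flame.api as opfapi

    # create (or update) a batch group with handles baked into its range
    bgroup = opfapi.create_batch_group(
        "sh010_comp",
        frame_start=1001,
        frame_duration=120,
        shematic_reels=["LoadedReel1"],   # note: key spelled as in the source
        shelf_reels=["ShelfReel1"],
        handleStart=10,
        handleEnd=10,
        switch_batch_tab=True,
    )

    # add one node and no links; returns {name or id: PyNode}
    nodes = opfapi.create_batch_group_conent(
        batch_nodes=[{
            "id": "write1",
            "type": "Write File",             # illustrative node type
            "properties": {"name": "sh010_write"},
        }],
        batch_links=[],
        batch_group=bgroup,
    )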
@@ -3,7 +3,12 @@ import os
import re
import json
import pickle
+import tempfile
+import itertools
import contextlib
import xml.etree.cElementTree as cET
+from copy import deepcopy
+from xml.etree import ElementTree as ET
from pprint import pformat
from .constants import (
    MARKER_COLOR,

@@ -12,9 +17,10 @@ from .constants import (
    COLOR_MAP,
    MARKER_PUBLISH_DEFAULT
)
-from openpype.api import Logger
-
-log = Logger.get_logger(__name__)
+import openpype.api as openpype
+
+log = openpype.Logger.get_logger(__name__)

FRAME_PATTERN = re.compile(r"[\._](\d+)[\.]")

@@ -227,16 +233,6 @@ class FlameAppFramework(object):
        return True


-def get_project_manager():
-    # TODO: get_project_manager
-    return
-
-
-def get_media_storage():
-    # TODO: get_media_storage
-    return


def get_current_project():
    import flame
    return flame.project.current_project

@@ -266,11 +262,6 @@ def get_current_sequence(selection):
    return process_timeline


-def create_bin(name, root=None):
-    # TODO: create_bin
-    return


def rescan_hooks():
    import flame
    try:

@@ -280,6 +271,7 @@ def rescan_hooks():


def get_metadata(project_name, _log=None):
+    # TODO: can be replaced by MediaInfoFile class method
    from adsk.libwiretapPythonClientAPI import (
        WireTapClient,
        WireTapServerHandle,
@@ -704,6 +696,25 @@ def maintained_object_duplication(item):
        flame.delete(duplicate)


+@contextlib.contextmanager
+def maintained_temp_file_path(suffix=None):
+    _suffix = suffix or ""
+
+    try:
+        # Store dumped json to temporary file
+        temporary_file = tempfile.mktemp(
+            suffix=_suffix, prefix="flame_maintained_")
+        yield temporary_file.replace("\\", "/")
+
+    except IOError as _error:
+        raise IOError(
+            "Not able to create temp json file: {}".format(_error))
+
+    finally:
+        # Remove the temporary json
+        os.remove(temporary_file)
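A brief usage sketch of the new context manager (hedged; `process_clip_file` is a hypothetical consumer). Note that `tempfile.mktemp` only reserves a name, so the body must actually create the file before the `os.remove` in the `finally` block runs:

    with maintained_temp_file_path(suffix=".clip") as tmp_path:
        with open(tmp_path, "w") as f:   # create the file the manager will remove
            f.write("<clip/>")
        process_clip_file(tmp_path)      # hypothetical consumer
    # tmp_path has been deleted here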

def get_clip_segment(flame_clip):
    name = flame_clip.name.get_value()
    version = flame_clip.versions[0]
@@ -717,3 +728,213 @@ def get_clip_segment(flame_clip):
        raise ValueError("Clip `{}` has too many segments!".format(name))

    return segments[0]


def get_batch_group_from_desktop(name):
    project = get_current_project()
    project_desktop = project.current_workspace.desktop

    for bgroup in project_desktop.batch_groups:
        if bgroup.name.get_value() in name:
            return bgroup


class MediaInfoFile(object):
    """Class to get media info file clip data

    Raises:
        IOError: MEDIA_SCRIPT_PATH path doesn't exist
        TypeError: Not able to generate clip xml data file
        ET.ParseError: Missing clip in xml clip data
        IOError: Not able to save xml clip data to file

    Attributes:
        str: `MEDIA_SCRIPT_PATH` path to flame binary
        logging.Logger: `log` logger

    TODO: add method for getting metadata to dict
    """
    MEDIA_SCRIPT_PATH = "/opt/Autodesk/mio/current/dl_get_media_info"

    log = log

    _clip_data = None
    _start_frame = None
    _fps = None
    _drop_mode = None

    def __init__(self, path, **kwargs):

        # replace log if any
        if kwargs.get("logger"):
            self.log = kwargs["logger"]

        # test if `dl_get_media_info` path exists
        self._validate_media_script_path()

        # derive other feed variables
        self.feed_basename = os.path.basename(path)
        self.feed_dir = os.path.dirname(path)
        self.feed_ext = os.path.splitext(self.feed_basename)[1][1:].lower()

        with maintained_temp_file_path(".clip") as tmp_path:
            self.log.info("Temp File: {}".format(tmp_path))
            self._generate_media_info_file(tmp_path)

            # get clip data and make them single if there are multiple
            # clips data
            xml_data = self._make_single_clip_media_info(tmp_path)
            self.log.debug("xml_data: {}".format(xml_data))
            self.log.debug("type: {}".format(type(xml_data)))

            # get all time related data and assign them
            self._get_time_info_from_origin(xml_data)
            self.log.debug("start_frame: {}".format(self.start_frame))
            self.log.debug("fps: {}".format(self.fps))
            self.log.debug("drop frame: {}".format(self.drop_mode))
            self.clip_data = xml_data

    @property
    def clip_data(self):
        """Clip's xml clip data

        Returns:
            xml.etree.ElementTree: xml data
        """
        return self._clip_data

    @clip_data.setter
    def clip_data(self, data):
        self._clip_data = data

    @property
    def start_frame(self):
        """Clip's starting frame found in timecode

        Returns:
            int: number of frames
        """
        return self._start_frame

    @start_frame.setter
    def start_frame(self, number):
        self._start_frame = int(number)

    @property
    def fps(self):
        """Clip's frame rate

        Returns:
            float: frame rate
        """
        return self._fps

    @fps.setter
    def fps(self, fl_number):
        self._fps = float(fl_number)

    @property
    def drop_mode(self):
        """Clip's drop frame mode

        Returns:
            str: drop frame flag
        """
        return self._drop_mode

    @drop_mode.setter
    def drop_mode(self, text):
        self._drop_mode = str(text)

    def _validate_media_script_path(self):
        if not os.path.isfile(self.MEDIA_SCRIPT_PATH):
            raise IOError("Media Script does not exist: `{}`".format(
                self.MEDIA_SCRIPT_PATH))

    def _generate_media_info_file(self, fpath):
        # create cmd arguments for getting the xml info file
        cmd_args = [
            self.MEDIA_SCRIPT_PATH,
            "-e", self.feed_ext,
            "-o", fpath,
            self.feed_dir
        ]

        try:
            # execute creation of clip xml template data
            openpype.run_subprocess(cmd_args)
        except TypeError as error:
            raise TypeError(
                "Error creating `{}` due: {}".format(fpath, error))

    def _make_single_clip_media_info(self, fpath):
        with open(fpath) as f:
            lines = f.readlines()
            _added_root = itertools.chain(
                "<root>", deepcopy(lines)[1:], "</root>")
            new_root = ET.fromstringlist(_added_root)

        # find the clip which is matching to my input name
        xml_clips = new_root.findall("clip")
        matching_clip = None
        for xml_clip in xml_clips:
            if xml_clip.find("name").text in self.feed_basename:
                matching_clip = xml_clip

        if matching_clip is None:
            # raise error when the clip is missing
            raise ET.ParseError(
                "Missing clip in `{}`. Available clips {}".format(
                    self.feed_basename, [
                        xml_clip.find("name").text
                        for xml_clip in xml_clips
                    ]
                ))

        return matching_clip

    def _get_time_info_from_origin(self, xml_data):
        try:
            for out_track in xml_data.iter('track'):
                for out_feed in out_track.iter('feed'):
                    # start frame
                    out_feed_nb_ticks_obj = out_feed.find(
                        'startTimecode/nbTicks')
                    self.start_frame = out_feed_nb_ticks_obj.text

                    # fps
                    out_feed_fps_obj = out_feed.find(
                        'startTimecode/rate')
                    self.fps = out_feed_fps_obj.text

                    # drop frame mode
                    out_feed_drop_mode_obj = out_feed.find(
                        'startTimecode/dropMode')
                    self.drop_mode = out_feed_drop_mode_obj.text
                    break
                else:
                    continue
        except Exception as msg:
            self.log.warning(msg)

    @staticmethod
    def write_clip_data_to_file(fpath, xml_element_data):
        """Write xml element of clip data to file

        Args:
            fpath (string): file path
            xml_element_data (xml.etree.ElementTree.Element): xml data

        Raises:
            IOError: If data could not be written to file
        """
        try:
            # save it as new file
            tree = cET.ElementTree(xml_element_data)
            tree.write(
                fpath, xml_declaration=True,
                method='xml', encoding='UTF-8'
            )
        except IOError as error:
            raise IOError(
                "Not able to write data to file: {}".format(error))
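A hedged usage sketch of `MediaInfoFile`. It shells out to Flame's `dl_get_media_info`, so it only works on a machine where `/opt/Autodesk/mio/current/dl_get_media_info` exists; the media path below is illustrative:

    media_info = MediaInfoFile("/mnt/projects/sh010/plate/sh010_plate.1001.exr")

    print(media_info.start_frame)   # int, from startTimecode/nbTicks
    print(media_info.fps)           # float, from startTimecode/rate
    print(media_info.drop_mode)     # str, from startTimecode/dropMode

    # persist the single-clip xml element for later reuse
    MediaInfoFile.write_clip_data_to_file(
        "/tmp/sh010_plate.clip", media_info.clip_data)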
@@ -1,24 +1,19 @@
import os
import re
import shutil
-import sys
-from xml.etree import ElementTree as ET
-import six
-import qargparse
-from Qt import QtWidgets, QtCore
-import openpype.api as openpype
-from openpype.pipeline import (
-    LegacyCreator,
-    LoaderPlugin,
-)
-from openpype import style
-from . import (
-    lib as flib,
-    pipeline as fpipeline,
-    constants
-)
+from copy import deepcopy
+from xml.etree import ElementTree as ET
+
+from Qt import QtCore, QtWidgets
+
+import openpype.api as openpype
+import qargparse
+from openpype import style
+from openpype.pipeline import LegacyCreator, LoaderPlugin
+
+from . import constants
+from . import lib as flib
+from . import pipeline as fpipeline

log = openpype.Logger.get_logger(__name__)

@@ -660,8 +655,8 @@ class PublishableClip:


# Publishing plugin functions

# Loader plugin functions
class ClipLoader(LoaderPlugin):
    """A basic clip loader for Flame

@@ -681,50 +676,52 @@ class ClipLoader(LoaderPlugin):
    ]


-class OpenClipSolver:
-    media_script_path = "/opt/Autodesk/mio/current/dl_get_media_info"
-    tmp_name = "_tmp.clip"
-    tmp_file = None
+class OpenClipSolver(flib.MediaInfoFile):
    create_new_clip = False

-    out_feed_nb_ticks = None
-    out_feed_fps = None
-    out_feed_drop_mode = None
-
-    log = log
-
    def __init__(self, openclip_file_path, feed_data):
-        # test if media script paht exists
-        self._validate_media_script_path()
        self.out_file = openclip_file_path

        # new feed variables:
-        feed_path = feed_data["path"]
+        feed_path = feed_data.pop("path")
+
+        # initialize parent class
+        super(OpenClipSolver, self).__init__(
+            feed_path,
+            **feed_data
+        )

        # get other metadata
        self.feed_version_name = feed_data["version"]
        self.feed_colorspace = feed_data.get("colorspace")

        if feed_data.get("logger"):
            self.log = feed_data["logger"]
+        self.log.debug("feed_version_name: {}".format(self.feed_version_name))

-        # derivate other feed variables
-        self.feed_basename = os.path.basename(feed_path)
-        self.feed_dir = os.path.dirname(feed_path)
-        self.feed_ext = os.path.splitext(self.feed_basename)[1][1:].lower()
+        self.log.debug("feed_ext: {}".format(self.feed_ext))
+        self.log.debug("out_file: {}".format(self.out_file))

-        if not os.path.isfile(openclip_file_path):
-            # openclip does not exist yet and will be created
-            self.tmp_file = self.out_file = openclip_file_path
+        if not self._is_valid_tmp_file(self.out_file):
            self.create_new_clip = True

-        else:
-            # output a temp file
-            self.out_file = openclip_file_path
-            self.tmp_file = os.path.join(self.feed_dir, self.tmp_name)
-            self._clear_tmp_file()
-
-        self.log.info("Temp File: {}".format(self.tmp_file))
+    def _is_valid_tmp_file(self, file):
+        # check if file exists
+        if os.path.isfile(file):
+            # test also if file is not empty
+            with open(file) as f:
+                lines = f.readlines()
+
+            if len(lines) > 2:
+                return True
+
+            # file is probably corrupted
+            os.remove(file)
+            return False

    def make(self):
-        self._generate_media_info_file()

        if self.create_new_clip:
            # New openClip

@@ -732,42 +729,17 @@ class OpenClipSolver:
        else:
            self._update_open_clip()

-    def _validate_media_script_path(self):
-        if not os.path.isfile(self.media_script_path):
-            raise IOError("Media Scirpt does not exist: `{}`".format(
-                self.media_script_path))
-
-    def _generate_media_info_file(self):
-        # Create cmd arguments for gettig xml file info file
-        cmd_args = [
-            self.media_script_path,
-            "-e", self.feed_ext,
-            "-o", self.tmp_file,
-            self.feed_dir
-        ]
-
-        # execute creation of clip xml template data
-        try:
-            openpype.run_subprocess(cmd_args)
-        except TypeError:
-            self.log.error("Error creating self.tmp_file")
-            six.reraise(*sys.exc_info())
-
-    def _clear_tmp_file(self):
-        if os.path.isfile(self.tmp_file):
-            os.remove(self.tmp_file)
-
    def _clear_handler(self, xml_object):
        for handler in xml_object.findall("./handler"):
-            self.log.debug("Handler found")
+            self.log.info("Handler found")
            xml_object.remove(handler)

    def _create_new_open_clip(self):
        self.log.info("Building new openClip")
+        self.log.debug(">> self.clip_data: {}".format(self.clip_data))

-        tmp_xml = ET.parse(self.tmp_file)
-
-        tmp_xml_feeds = tmp_xml.find('tracks/track/feeds')
+        # clip data coming from MediaInfoFile
+        tmp_xml_feeds = self.clip_data.find('tracks/track/feeds')
        tmp_xml_feeds.set('currentVersion', self.feed_version_name)
        for tmp_feed in tmp_xml_feeds:
            tmp_feed.set('vuid', self.feed_version_name)

@@ -778,46 +750,48 @@ class OpenClipSolver:

        self._clear_handler(tmp_feed)

-        tmp_xml_versions_obj = tmp_xml.find('versions')
+        tmp_xml_versions_obj = self.clip_data.find('versions')
        tmp_xml_versions_obj.set('currentVersion', self.feed_version_name)
        for xml_new_version in tmp_xml_versions_obj:
            xml_new_version.set('uid', self.feed_version_name)
            xml_new_version.set('type', 'version')

-        xml_data = self._fix_xml_data(tmp_xml)
+        self._clear_handler(self.clip_data)
        self.log.info("Adding feed version: {}".format(self.feed_basename))

-        self._write_result_xml_to_file(xml_data)
-
-        self.log.info("openClip Updated: {}".format(self.tmp_file))
+        self.write_clip_data_to_file(self.out_file, self.clip_data)

    def _update_open_clip(self):
        self.log.info("Updating openClip ..")

        out_xml = ET.parse(self.out_file)
-        tmp_xml = ET.parse(self.tmp_file)
        out_xml = out_xml.getroot()

        self.log.debug(">> out_xml: {}".format(out_xml))
-        self.log.debug(">> tmp_xml: {}".format(tmp_xml))
+        self.log.debug(">> self.clip_data: {}".format(self.clip_data))

        # Get new feed from tmp file
-        tmp_xml_feed = tmp_xml.find('tracks/track/feeds/feed')
+        tmp_xml_feed = self.clip_data.find('tracks/track/feeds/feed')

        self._clear_handler(tmp_xml_feed)
        self._get_time_info_from_origin(out_xml)

-        if self.out_feed_fps:
+        # update fps from MediaInfoFile class
+        if self.fps:
            tmp_feed_fps_obj = tmp_xml_feed.find(
                "startTimecode/rate")
-            tmp_feed_fps_obj.text = self.out_feed_fps
-        if self.out_feed_nb_ticks:
+            tmp_feed_fps_obj.text = str(self.fps)
+
+        # update start_frame from MediaInfoFile class
+        if self.start_frame:
            tmp_feed_nb_ticks_obj = tmp_xml_feed.find(
                "startTimecode/nbTicks")
-            tmp_feed_nb_ticks_obj.text = self.out_feed_nb_ticks
-        if self.out_feed_drop_mode:
+            tmp_feed_nb_ticks_obj.text = str(self.start_frame)
+
+        # update drop_mode from MediaInfoFile class
+        if self.drop_mode:
            tmp_feed_drop_mode_obj = tmp_xml_feed.find(
                "startTimecode/dropMode")
-            tmp_feed_drop_mode_obj.text = self.out_feed_drop_mode
+            tmp_feed_drop_mode_obj.text = str(self.drop_mode)

        new_path_obj = tmp_xml_feed.find(
            "spans/span/path")

@@ -850,7 +824,7 @@ class OpenClipSolver:
            "version", {"type": "version", "uid": self.feed_version_name})
        out_xml_versions_obj.insert(0, new_version_obj)

-        xml_data = self._fix_xml_data(out_xml)
+        self._clear_handler(out_xml)

        # first create backup
        self._create_openclip_backup_file(self.out_file)

@@ -858,30 +832,9 @@ class OpenClipSolver:
        self.log.info("Adding feed version: {}".format(
            self.feed_version_name))

-        self._write_result_xml_to_file(xml_data)
-
-        self.log.info("openClip Updated: {}".format(self.out_file))
-
-        self._clear_tmp_file()
-
-    def _get_time_info_from_origin(self, xml_data):
-        try:
-            for out_track in xml_data.iter('track'):
-                for out_feed in out_track.iter('feed'):
-                    out_feed_nb_ticks_obj = out_feed.find(
-                        'startTimecode/nbTicks')
-                    self.out_feed_nb_ticks = out_feed_nb_ticks_obj.text
-                    out_feed_fps_obj = out_feed.find(
-                        'startTimecode/rate')
-                    self.out_feed_fps = out_feed_fps_obj.text
-                    out_feed_drop_mode_obj = out_feed.find(
-                        'startTimecode/dropMode')
-                    self.out_feed_drop_mode = out_feed_drop_mode_obj.text
-                    break
-                else:
-                    continue
-        except Exception as msg:
-            self.log.warning(msg)
+        self.write_clip_data_to_file(self.out_file, out_xml)
+
+        self.log.debug("OpenClip Updated: {}".format(self.out_file))

    def _feed_exists(self, xml_data, path):
        # loop all available feed paths and check if

@@ -892,15 +845,6 @@ class OpenClipSolver:
                "Not appending file as it already is in .clip file")
            return True

-    def _fix_xml_data(self, xml_data):
-        xml_root = xml_data.getroot()
-        self._clear_handler(xml_root)
-        return ET.tostring(xml_root).decode('utf-8')
-
-    def _write_result_xml_to_file(self, xml_data):
-        with open(self.out_file, "w") as f:
-            f.write(xml_data)

    def _create_openclip_backup_file(self, file):
        bck_file = "{}.bak".format(file)
        # if backup does not exist
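How the reworked `OpenClipSolver` is driven can be seen in the new `load_clip_batch.py` further down in this diff; in isolation, a hedged sketch looks like this (`path` and `version` are required by `__init__`, `logger` and `colorspace` are optional; paths and values are illustrative):

    feed_data = {
        "path": "/mnt/projects/sh010/render/sh010_comp.1001.exr",
        "version": "v003",
        "colorspace": "ACES - ACEScg",
    }
    # creates the .clip if missing/invalid, otherwise appends a new feed version
    OpenClipSolver("/mnt/projects/sh010/sh010_comp.clip", feed_data).make()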
@@ -185,7 +185,9 @@ class WireTapCom(object):

        exit_code = subprocess.call(
            project_create_cmd,
-            cwd=os.path.expanduser('~'))
+            cwd=os.path.expanduser('~'),
+            preexec_fn=_subprocess_preexec_fn
+        )

        if exit_code != 0:
            raise RuntimeError("Cannot create project in flame db")

@@ -254,7 +256,7 @@ class WireTapCom(object):
        filtered_users = [user for user in used_names if user_name in user]

        if filtered_users:
-            # todo: need to find lastly created following regex pattern for
+            # TODO: need to find lastly created following regex pattern for
            # date used in name
            return filtered_users.pop()

@@ -448,7 +450,9 @@ class WireTapCom(object):

        exit_code = subprocess.call(
            project_colorspace_cmd,
-            cwd=os.path.expanduser('~'))
+            cwd=os.path.expanduser('~'),
+            preexec_fn=_subprocess_preexec_fn
+        )

        if exit_code != 0:
            raise RuntimeError("Cannot set colorspace {} on project {}".format(

@@ -456,6 +460,15 @@ class WireTapCom(object):
            ))


def _subprocess_preexec_fn():
    """Helper function

    Setting permission mask to 0777
    """
    os.setpgrp()
    os.umask(0o000)


if __name__ == "__main__":
    # get json exchange data
    json_path = sys.argv[-1]
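The helper is handed to `subprocess.call` via `preexec_fn`, which runs in the child process after `fork()` and before `exec()` (POSIX only). A hedged, self-contained illustration; the command below is a stand-in, the real calls above run wiretap project tools:

    import os
    import subprocess


    def _subprocess_preexec_fn():
        os.setpgrp()      # detach the child into its own process group
        os.umask(0o000)   # files the child creates keep full mode bits

    exit_code = subprocess.call(
        ["touch", "/tmp/flame_project_marker"],   # illustrative command
        cwd=os.path.expanduser("~"),
        preexec_fn=_subprocess_preexec_fn,
    )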
@@ -11,8 +11,6 @@ from . import utils
import flame
from pprint import pformat

reload(utils)  # noqa

log = logging.getLogger(__name__)

@@ -260,24 +258,15 @@ def create_otio_markers(otio_item, item):
        otio_item.markers.append(otio_marker)


-def create_otio_reference(clip_data):
+def create_otio_reference(clip_data, fps=None):
    metadata = _get_metadata(clip_data)

    # get file info for path and start frame
    frame_start = 0
-    fps = CTX.get_fps()
+    fps = fps or CTX.get_fps()

    path = clip_data["fpath"]

-    reel_clip = None
-    match_reel_clip = [
-        clip for clip in CTX.clips
-        if clip["fpath"] == path
-    ]
-    if match_reel_clip:
-        reel_clip = match_reel_clip.pop()
-        fps = reel_clip["fps"]

    file_name = os.path.basename(path)
    file_head, extension = os.path.splitext(file_name)

@@ -339,13 +328,22 @@ def create_otio_reference(clip_data):


def create_otio_clip(clip_data):
+    from openpype.hosts.flame.api import MediaInfoFile
+
    segment = clip_data["PySegment"]

-    # create media reference
-    media_reference = create_otio_reference(clip_data)
-
-    # calculate source in
-    first_frame = utils.get_frame_from_filename(clip_data["fpath"]) or 0
+    media_info = MediaInfoFile(clip_data["fpath"])
+    media_timecode_start = media_info.start_frame
+    media_fps = media_info.fps
+
+    # create media reference
+    media_reference = create_otio_reference(clip_data, media_fps)
+
+    # define first frame
+    first_frame = media_timecode_start or utils.get_frame_from_filename(
+        clip_data["fpath"]) or 0

    source_in = int(clip_data["source_in"]) - int(first_frame)

    # create source range

@@ -378,38 +376,6 @@ def create_otio_gap(gap_start, clip_start, tl_start_frame, fps):
    )


-def get_clips_in_reels(project):
-    output_clips = []
-    project_desktop = project.current_workspace.desktop
-
-    for reel_group in project_desktop.reel_groups:
-        for reel in reel_group.reels:
-            for clip in reel.clips:
-                clip_data = {
-                    "PyClip": clip,
-                    "fps": float(str(clip.frame_rate)[:-4])
-                }
-
-                attrs = [
-                    "name", "width", "height",
-                    "ratio", "sample_rate", "bit_depth"
-                ]
-
-                for attr in attrs:
-                    val = getattr(clip, attr)
-                    clip_data[attr] = val
-
-                version = clip.versions[-1]
-                track = version.tracks[-1]
-                for segment in track.segments:
-                    segment_data = _get_segment_attributes(segment)
-                    clip_data.update(segment_data)
-
-                output_clips.append(clip_data)
-
-    return output_clips


def _get_colourspace_policy():

    output = {}

@@ -493,9 +459,6 @@ def _get_shot_tokens_values(clip, tokens):
    old_value = None
    output = {}

-    if not clip.shot_name:
-        return output

    old_value = clip.shot_name.get_value()

    for token in tokens:

@@ -513,15 +476,21 @@ def _get_shot_tokens_values(clip, tokens):


def _get_segment_attributes(segment):
-    # log.debug(dir(segment))
-
-    if str(segment.name)[1:-1] == "":
+    log.debug("Segment name|hidden: {}|{}".format(
+        segment.name.get_value(), segment.hidden
+    ))
+    if (
+        segment.name.get_value() == ""
+        or segment.hidden.get_value()
+    ):
        return None

    # Add timeline segment to tree
    clip_data = {
        "segment_name": segment.name.get_value(),
        "segment_comment": segment.comment.get_value(),
        "shot_name": segment.shot_name.get_value(),
        "tape_name": segment.tape_name,
        "source_name": segment.source_name,
        "fpath": segment.file_path,

@@ -529,9 +498,10 @@ def _get_segment_attributes(segment):
    }

    # add all available shot tokens
-    shot_tokens = _get_shot_tokens_values(segment, [
-        "<colour space>", "<width>", "<height>", "<depth>",
-    ])
+    shot_tokens = _get_shot_tokens_values(
+        segment,
+        ["<colour space>", "<width>", "<height>", "<depth>"]
+    )
    clip_data.update(shot_tokens)

    # populate shot source metadata

@@ -561,11 +531,6 @@ def create_otio_timeline(sequence):
    log.info(sequence.attributes)

-    CTX.project = get_current_flame_project()
-    CTX.clips = get_clips_in_reels(CTX.project)
-
-    log.debug(pformat(
-        CTX.clips
-    ))

    # get current timeline
    CTX.set_fps(

@@ -583,8 +548,13 @@ def create_otio_timeline(sequence):
    # create otio tracks and clips
    for ver in sequence.versions:
        for track in ver.tracks:
-            if len(track.segments) == 0 and track.hidden:
-                return None
+            # avoid all empty tracks
+            # or hidden tracks
+            if (
+                len(track.segments) == 0
+                or track.hidden.get_value()
+            ):
+                continue

            # convert track to otio
            otio_track = create_otio_track(

@@ -597,11 +567,7 @@ def create_otio_timeline(sequence):
            continue
        all_segments.append(clip_data)

-    segments_ordered = {
-        itemindex: clip_data
-        for itemindex, clip_data in enumerate(
-            all_segments)
-    }
+    segments_ordered = dict(enumerate(all_segments))
    log.debug("_ segments_ordered: {}".format(
        pformat(segments_ordered)
    ))

@@ -612,15 +578,11 @@ def create_otio_timeline(sequence):
        log.debug("_ itemindex: {}".format(itemindex))

        # Add Gap if needed
-        if itemindex == 0:
-            # if it is first track item at track then add
-            # it to previous item
-            prev_item = segment_data
-
-        else:
-            # get previous item
-            prev_item = segments_ordered[itemindex - 1]
-
+        prev_item = (
+            segment_data
+            if itemindex == 0
+            else segments_ordered[itemindex - 1]
+        )
        log.debug("_ segment_data: {}".format(segment_data))

        # calculate clip frame range difference from each other
@@ -22,7 +22,7 @@ class LoadClip(opfapi.ClipLoader):
    # settings
    reel_group_name = "OpenPype_Reels"
    reel_name = "Loaded"
-    clip_name_template = "{asset}_{subset}_{representation}"
+    clip_name_template = "{asset}_{subset}_{output}"

    def load(self, context, name, namespace, options):

@@ -39,7 +39,7 @@ class LoadClip(opfapi.ClipLoader):
        clip_name = self.clip_name_template.format(
            **context["representation"]["context"])

-        # todo: settings in imageio
+        # TODO: settings in imageio
        # convert colorspace with ocio to flame mapping
        # in imageio flame section
        colorspace = colorspace
139
openpype/hosts/flame/plugins/load/load_clip_batch.py
Normal file
139
openpype/hosts/flame/plugins/load/load_clip_batch.py
Normal file
|
|
@@ -0,0 +1,139 @@
+import os
+import flame
+from pprint import pformat
+import openpype.hosts.flame.api as opfapi
+
+
+class LoadClipBatch(opfapi.ClipLoader):
+    """Load a subset to timeline as clip
+
+    Place clip to timeline on its asset origin timings collected
+    during conforming to project
+    """
+
+    families = ["render2d", "source", "plate", "render", "review"]
+    representations = ["exr", "dpx", "jpg", "jpeg", "png", "h264"]
+
+    label = "Load as clip to current batch"
+    order = -10
+    icon = "code-fork"
+    color = "orange"
+
+    # settings
+    reel_name = "OP_LoadedReel"
+    clip_name_template = "{asset}_{subset}_{output}"
+
+    def load(self, context, name, namespace, options):
+
+        # get flame objects
+        self.batch = options.get("batch") or flame.batch
+
+        # load clip to timeline and get main variables
+        namespace = namespace
+        version = context['version']
+        version_data = version.get("data", {})
+        version_name = version.get("name", None)
+        colorspace = version_data.get("colorspace", None)
+
+        # in case output is not in context replace key to representation
+        if not context["representation"]["context"].get("output"):
+            self.clip_name_template = self.clip_name_template.replace(
+                "output", "representation")
+
+        clip_name = self.clip_name_template.format(
+            **context["representation"]["context"])
+
+        # TODO: settings in imageio
+        # convert colorspace with ocio to flame mapping
+        # in imageio flame section
+        colorspace = colorspace
+
+        # create workfile path
+        workfile_dir = options.get("workdir") or os.environ["AVALON_WORKDIR"]
+        openclip_dir = os.path.join(
+            workfile_dir, clip_name
+        )
+        openclip_path = os.path.join(
+            openclip_dir, clip_name + ".clip"
+        )
+        if not os.path.exists(openclip_dir):
+            os.makedirs(openclip_dir)
+
+        # prepare clip data from context and send it to openClipLoader
+        loading_context = {
+            "path": self.fname.replace("\\", "/"),
+            "colorspace": colorspace,
+            "version": "v{:0>3}".format(version_name),
+            "logger": self.log
+        }
+        self.log.debug(pformat(
+            loading_context
+        ))
+        self.log.debug(openclip_path)
+
+        # make openpype clip file
+        opfapi.OpenClipSolver(openclip_path, loading_context).make()
+
+        # prepare Reel group in actual desktop
+        opc = self._get_clip(
+            clip_name,
+            openclip_path
+        )
+
+        # add additional metadata from the version to imprint Avalon knob
+        add_keys = [
+            "frameStart", "frameEnd", "source", "author",
+            "fps", "handleStart", "handleEnd"
+        ]
+
+        # move all version data keys to tag data
+        data_imprint = {
+            key: version_data.get(key, str(None))
+            for key in add_keys
+        }
+        # add variables related to version context
+        data_imprint.update({
+            "version": version_name,
+            "colorspace": colorspace,
+            "objectName": clip_name
+        })
+
+        # TODO: finish the containerisation
+        # opc_segment = opfapi.get_clip_segment(opc)
+
+        # return opfapi.containerise(
+        #     opc_segment,
+        #     name, namespace, context,
+        #     self.__class__.__name__,
+        #     data_imprint)
+
+        return opc
+
+    def _get_clip(self, name, clip_path):
+        reel = self._get_reel()
+
+        # with maintained openclip as opc
+        matching_clip = None
+        for cl in reel.clips:
+            if cl.name.get_value() != name:
+                continue
+            matching_clip = cl
+
+        if not matching_clip:
+            created_clips = flame.import_clips(str(clip_path), reel)
+            return created_clips.pop()
+
+        return matching_clip
+
+    def _get_reel(self):
+
+        matching_reel = [
+            rg for rg in self.batch.reels
+            if rg.name.get_value() == self.reel_name
+        ]
+
+        return (
+            matching_reel.pop()
+            if matching_reel
+            else self.batch.create_reel(str(self.reel_name))
+        )
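The loader's clip name is resolved from the class template above; when the representation context carries no `output` key, the template falls back to the representation name. A minimal sketch of that fallback with made-up context values (the dict below is illustrative, not actual pipeline data):

```python
# Illustrative stand-in for context["representation"]["context"].
repre_context = {"asset": "sh010", "subset": "plateMain", "representation": "exr"}

clip_name_template = "{asset}_{subset}_{output}"

# str.replace returns a new string, so the result must be assigned back.
if not repre_context.get("output"):
    clip_name_template = clip_name_template.replace("output", "representation")

print(clip_name_template.format(**repre_context))  # sh010_plateMain_exr
```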
@@ -21,19 +21,12 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):

     audio_track_items = []

-    # TODO: add to settings
-    xml_preset_attrs_from_comments = {
-        "width": "number",
-        "height": "number",
-        "pixelRatio": "float",
-        "resizeType": "string",
-        "resizeFilter": "string"
-    }
+    # settings
+    xml_preset_attrs_from_comments = []
+    add_tasks = []

     def process(self, context):
         project = context.data["flameProject"]
         sequence = context.data["flameSequence"]
         selected_segments = context.data["flameSelectedSegments"]
         self.log.debug("__ selected_segments: {}".format(selected_segments))

@@ -79,9 +72,9 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):

         # solve handles length
         marker_data["handleStart"] = min(
-            marker_data["handleStart"], head)
+            marker_data["handleStart"], abs(head))
         marker_data["handleEnd"] = min(
-            marker_data["handleEnd"], tail)
+            marker_data["handleEnd"], abs(tail))

         with_audio = bool(marker_data.pop("audio"))

@@ -112,7 +105,11 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
             "fps": self.fps,
             "flameSourceClip": source_clip,
             "sourceFirstFrame": int(first_frame),
-            "path": file_path
+            "path": file_path,
+            "flameAddTasks": self.add_tasks,
+            "tasks": {
+                task["name"]: {"type": task["type"]}
+                for task in self.add_tasks}
         })

         # get otio clip data

@@ -187,7 +184,10 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
             # split to key and value
             key, value = split.split(":")

-            for a_name, a_type in self.xml_preset_attrs_from_comments.items():
+            for attr_data in self.xml_preset_attrs_from_comments:
+                a_name = attr_data["name"]
+                a_type = attr_data["type"]

                 # exclude all not related attributes
                 if a_name.lower() not in key.lower():
                     continue

@@ -247,6 +247,7 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
         head = clip_data.get("segment_head")
         tail = clip_data.get("segment_tail")

+        # HACK: it is here to serve for versions below 2021.1
         if not head:
             head = int(clip_data["source_in"]) - int(first_frame)
         if not tail:
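The comment-attribute loop above now expects a list of dicts instead of the removed mapping. A hedged sketch of what a settings payload would look like under that assumption, reusing attribute names from the removed default:

```python
# Assumed settings shape; names/types taken from the removed default mapping.
xml_preset_attrs_from_comments = [
    {"name": "width", "type": "number"},
    {"name": "height", "type": "number"},
    {"name": "pixelRatio", "type": "float"},
]

for attr_data in xml_preset_attrs_from_comments:
    a_name = attr_data["name"]
    a_type = attr_data["type"]
    print(a_name, a_type)
```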
@@ -61,9 +61,13 @@ class ExtractSubsetResources(openpype.api.Extractor):

         # flame objects
         segment = instance.data["item"]
         segment_name = segment.name.get_value()
         sequence_clip = instance.context.data["flameSequence"]
         clip_data = instance.data["flameSourceClip"]
-        clip = clip_data["PyClip"]
+
+        reel_clip = None
+        if clip_data:
+            reel_clip = clip_data["PyClip"]

         # segment's parent track name
         s_track_name = segment.parent.name.get_value()

@@ -108,6 +112,16 @@ class ExtractSubsetResources(openpype.api.Extractor):
             ignore_comment_attrs = preset_config["ignore_comment_attrs"]
             color_out = preset_config["colorspace_out"]

+            # get attributes related to loading in integrate_batch_group
+            load_to_batch_group = preset_config.get(
+                "load_to_batch_group")
+            batch_group_loader_name = preset_config.get(
+                "batch_group_loader_name")
+
+            # convert to None if empty string
+            if batch_group_loader_name == "":
+                batch_group_loader_name = None
+
             # get frame range with handles for representation range
             frame_start_handle = frame_start - handle_start
             source_duration_handles = (

@@ -117,8 +131,20 @@ class ExtractSubsetResources(openpype.api.Extractor):
             in_mark = (source_start_handles - source_first_frame) + 1
             out_mark = in_mark + source_duration_handles

+            # make test for type of preset and available reel_clip
+            if (
+                not reel_clip
+                and export_type != "Sequence Publish"
+            ):
+                self.log.warning((
+                    "Skipping preset {}. Not available "
+                    "reel clip for {}").format(
+                        preset_file, segment_name
+                ))
+                continue
+
             # by default export source clips
-            exporting_clip = clip
+            exporting_clip = reel_clip

             if export_type == "Sequence Publish":
                 # change export clip to sequence

@@ -150,7 +176,7 @@ class ExtractSubsetResources(openpype.api.Extractor):

             if export_type == "Sequence Publish":
                 # only keep visible layer where instance segment is child
-                self.hide_other_tracks(duplclip, s_track_name)
+                self.hide_others(duplclip, segment_name, s_track_name)

             # validate xml preset file is filled
             if preset_file == "":

@@ -211,7 +237,9 @@ class ExtractSubsetResources(openpype.api.Extractor):
                 "tags": repre_tags,
                 "data": {
                     "colorspace": color_out
-                }
+                },
+                "load_to_batch_group": load_to_batch_group,
+                "batch_group_loader_name": batch_group_loader_name
             }

             # collect all available content of export dir

@@ -322,18 +350,26 @@ class ExtractSubsetResources(openpype.api.Extractor):

         return new_stage_dir, new_files_list

-    def hide_other_tracks(self, sequence_clip, track_name):
+    def hide_others(self, sequence_clip, segment_name, track_name):
         """Helper method used only if sequence clip is used

         Args:
             sequence_clip (flame.Clip): sequence clip
+            segment_name (str): segment name
             track_name (str): track name
         """
         # create otio tracks and clips
         for ver in sequence_clip.versions:
             for track in ver.tracks:
-                if len(track.segments) == 0 and track.hidden:
+                if len(track.segments) == 0 and track.hidden.get_value():
                     continue

+                # hide tracks which are not parent track
+                if track.name.get_value() != track_name:
+                    track.hidden = True
+                    continue
+
+                # hide all other segments
+                for segment in track.segments:
+                    if segment.name.get_value() != segment_name:
+                        segment.hidden = True
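For orientation, a worked example of the in/out mark arithmetic used above, with invented frame numbers:

```python
# Invented values for illustration.
source_first_frame = 1001      # first frame of the source media
source_start_handles = 1009    # cut-in frame minus handleStart
source_duration_handles = 58   # cut length plus both handles

in_mark = (source_start_handles - source_first_frame) + 1
out_mark = in_mark + source_duration_handles
print(in_mark, out_mark)  # 9 67
```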
openpype/hosts/flame/plugins/publish/integrate_batch_group.py (new file, 328 lines)
@@ -0,0 +1,328 @@
+import os
+import copy
+from collections import OrderedDict
+from pprint import pformat
+import pyblish
+from openpype.lib import get_workdir
+import openpype.hosts.flame.api as opfapi
+import openpype.pipeline as op_pipeline
+
+
+class IntegrateBatchGroup(pyblish.api.InstancePlugin):
+    """Integrate published shot to batch group"""
+
+    order = pyblish.api.IntegratorOrder + 0.45
+    label = "Integrate Batch Groups"
+    hosts = ["flame"]
+    families = ["clip"]
+
+    # settings
+    default_loader = "LoadClip"
+
+    def process(self, instance):
+        add_tasks = instance.data["flameAddTasks"]
+
+        # iterate all tasks from settings
+        for task_data in add_tasks:
+            # exclude batch group
+            if not task_data["create_batch_group"]:
+                continue
+
+            # create or get already created batch group
+            bgroup = self._get_batch_group(instance, task_data)
+
+            # add batch group content
+            all_batch_nodes = self._add_nodes_to_batch_with_links(
+                instance, task_data, bgroup)
+
+            for name, node in all_batch_nodes.items():
+                self.log.debug("name: {}, dir: {}".format(
+                    name, dir(node)
+                ))
+                self.log.debug("__ node.attributes: {}".format(
+                    node.attributes
+                ))
+
+            # load plate to batch group
+            self.log.info("Loading subset `{}` into batch `{}`".format(
+                instance.data["subset"], bgroup.name.get_value()
+            ))
+            self._load_clip_to_context(instance, bgroup)
+
+    def _add_nodes_to_batch_with_links(self, instance, task_data, batch_group):
+        # get write file node properties > OrderedDict because order matters
+        write_pref_data = self._get_write_prefs(instance, task_data)
+
+        batch_nodes = [
+            {
+                "type": "comp",
+                "properties": {},
+                "id": "comp_node01"
+            },
+            {
+                "type": "Write File",
+                "properties": write_pref_data,
+                "id": "write_file_node01"
+            }
+        ]
+        batch_links = [
+            {
+                "from_node": {
+                    "id": "comp_node01",
+                    "connector": "Result"
+                },
+                "to_node": {
+                    "id": "write_file_node01",
+                    "connector": "Front"
+                }
+            }
+        ]
+
+        # add nodes into batch group
+        return opfapi.create_batch_group_conent(
+            batch_nodes, batch_links, batch_group)
+
+    def _load_clip_to_context(self, instance, bgroup):
+        # get all loaders for host
+        loaders_by_name = {
+            loader.__name__: loader
+            for loader in op_pipeline.discover_loader_plugins()
+        }
+
+        # get all published representations
+        published_representations = instance.data["published_representations"]
+        repres_db_id_by_name = {
+            repre_info["representation"]["name"]: repre_id
+            for repre_id, repre_info in published_representations.items()
+        }
+
+        # get all loadable representations
+        repres_by_name = {
+            repre["name"]: repre for repre in instance.data["representations"]
+        }
+
+        # get repre_id for the loadable representations
+        loader_name_by_repre_id = {
+            repres_db_id_by_name[repr_name]: {
+                "loader": repr_data["batch_group_loader_name"],
+                # add repre data for exception logging
+                "_repre_data": repr_data
+            }
+            for repr_name, repr_data in repres_by_name.items()
+            if repr_data.get("load_to_batch_group")
+        }
+
+        self.log.debug("__ loader_name_by_repre_id: {}".format(pformat(
+            loader_name_by_repre_id)))
+
+        # get representation context from the repre_id
+        repre_contexts = op_pipeline.load.get_repres_contexts(
+            loader_name_by_repre_id.keys())
+
+        self.log.debug("__ repre_contexts: {}".format(pformat(
+            repre_contexts)))
+
+        # loop all returned repres from repre_context dict
+        for repre_id, repre_context in repre_contexts.items():
+            self.log.debug("__ repre_id: {}".format(repre_id))
+            # get loader name by representation id
+            loader_name = (
+                loader_name_by_repre_id[repre_id]["loader"]
+                # if nothing was added to settings fallback to default
+                or self.default_loader
+            )
+
+            # get loader plugin
+            loader_plugin = loaders_by_name.get(loader_name)
+            if loader_plugin:
+                # load to flame by representation context
+                try:
+                    op_pipeline.load.load_with_repre_context(
+                        loader_plugin, repre_context, **{
+                            "data": {
+                                "workdir": self.task_workdir,
+                                "batch": bgroup
+                            }
+                        })
+                except op_pipeline.load.IncompatibleLoaderError as msg:
+                    self.log.error(
+                        "Check allowed representations for Loader `{}` "
+                        "in settings > error: {}".format(
+                            loader_plugin.__name__, msg))
+                    self.log.error(
+                        "Representation context >>{}<< is not compatible "
+                        "with loader `{}`".format(
+                            pformat(repre_context), loader_plugin.__name__
+                        )
+                    )
+            else:
+                self.log.warning(
+                    "Something went wrong and there is no Loader found for "
+                    "following data: {}".format(
+                        pformat(loader_name_by_repre_id))
+                )
+
+    def _get_batch_group(self, instance, task_data):
+        frame_start = instance.data["frameStart"]
+        frame_end = instance.data["frameEnd"]
+        handle_start = instance.data["handleStart"]
+        handle_end = instance.data["handleEnd"]
+        frame_duration = (frame_end - frame_start) + 1
+        asset_name = instance.data["asset"]
+
+        task_name = task_data["name"]
+        batchgroup_name = "{}_{}".format(asset_name, task_name)
+
+        batch_data = {
+            "shematic_reels": [
+                "OP_LoadedReel"
+            ],
+            "handleStart": handle_start,
+            "handleEnd": handle_end
+        }
+        self.log.debug(
+            "__ batch_data: {}".format(pformat(batch_data)))
+
+        # check if the batch group already exists
+        bgroup = opfapi.get_batch_group_from_desktop(batchgroup_name)
+
+        if not bgroup:
+            self.log.info(
+                "Creating new batch group: {}".format(batchgroup_name))
+            # create batch with utils
+            bgroup = opfapi.create_batch_group(
+                batchgroup_name,
+                frame_start,
+                frame_duration,
+                **batch_data
+            )
+
+        else:
+            self.log.info(
+                "Updating batch group: {}".format(batchgroup_name))
+            # update already created batch group
+            bgroup = opfapi.create_batch_group(
+                batchgroup_name,
+                frame_start,
+                frame_duration,
+                update_batch_group=bgroup,
+                **batch_data
+            )
+
+        return bgroup
+
+    def _get_anamoty_data_with_current_task(self, instance, task_data):
+        anatomy_data = copy.deepcopy(instance.data["anatomyData"])
+        task_name = task_data["name"]
+        task_type = task_data["type"]
+        anatomy_obj = instance.context.data["anatomy"]
+
+        # update task data in anatomy data
+        project_task_types = anatomy_obj["tasks"]
+        task_code = project_task_types.get(task_type, {}).get("short_name")
+        anatomy_data.update({
+            "task": {
+                "name": task_name,
+                "type": task_type,
+                "short": task_code
+            }
+        })
+        return anatomy_data
+
+    def _get_write_prefs(self, instance, task_data):
+        # update task in anatomy data
+        anatomy_data = self._get_anamoty_data_with_current_task(
+            instance, task_data)
+
+        self.task_workdir = self._get_shot_task_dir_path(
+            instance, task_data)
+        self.log.debug("__ task_workdir: {}".format(
+            self.task_workdir))
+
+        # TODO: this might be done with template in settings
+        render_dir_path = os.path.join(
+            self.task_workdir, "render", "flame")
+
+        if not os.path.exists(render_dir_path):
+            os.makedirs(render_dir_path, mode=0o777)
+
+        # TODO: add most of these to `imageio/flame/batch/write_node`
+        name = "{project[code]}_{asset}_{task[name]}".format(
+            **anatomy_data
+        )
+
+        # The path attribute where the rendered clip is exported
+        # /path/to/file.[0001-0010].exr
+        media_path = render_dir_path
+        # name of file represented by tokens
+        media_path_pattern = (
+            "<name>_v<iteration###>/<name>_v<iteration###>.<frame><ext>")
+        # The Create Open Clip attribute of the Write File node.
+        # Determines if an Open Clip is created by the Write File node.
+        create_clip = True
+        # The Include Setup attribute of the Write File node.
+        # Determines if a Batch Setup file is created by the Write File node.
+        include_setup = True
+        # The path attribute where the Open Clip file is exported by
+        # the Write File node.
+        create_clip_path = "<name>"
+        # The path attribute where the Batch setup file
+        # is exported by the Write File node.
+        include_setup_path = "./<name>_v<iteration###>"
+        # The file type for the files written by the Write File node.
+        # Setting this attribute also overwrites format_extension,
+        # bit_depth and compress_mode to match the defaults for
+        # this file type.
+        file_type = "OpenEXR"
+        # The file extension for the files written by the Write File node.
+        # This attribute resets to match file_type whenever file_type
+        # is set. If you require a specific extension, you must
+        # set format_extension after setting file_type.
+        format_extension = "exr"
+        # The bit depth for the files written by the Write File node.
+        # This attribute resets to match file_type whenever file_type is set.
+        bit_depth = "16"
+        # The compressing attribute for the files exported by the Write
+        # File node. Only relevant when file_type in 'OpenEXR', 'Sgi', 'Tiff'
+        compress = True
+        # The compression format attribute for the specific File Types
+        # export by the Write File node. You must set compress_mode
+        # after setting file_type.
+        compress_mode = "DWAB"
+        # The frame index mode attribute of the Write File node.
+        # Value range: `Use Timecode` or `Use Start Frame`
+        frame_index_mode = "Use Start Frame"
+        frame_padding = 6
+        # The versioning mode of the Open Clip exported by the Write File node.
+        # Only available if create_clip = True.
+        version_mode = "Follow Iteration"
+        version_name = "v<version>"
+        version_padding = 3
+
+        # need to make sure the order of keys is correct
+        return OrderedDict((
+            ("name", name),
+            ("media_path", media_path),
+            ("media_path_pattern", media_path_pattern),
+            ("create_clip", create_clip),
+            ("include_setup", include_setup),
+            ("create_clip_path", create_clip_path),
+            ("include_setup_path", include_setup_path),
+            ("file_type", file_type),
+            ("format_extension", format_extension),
+            ("bit_depth", bit_depth),
+            ("compress", compress),
+            ("compress_mode", compress_mode),
+            ("frame_index_mode", frame_index_mode),
+            ("frame_padding", frame_padding),
+            ("version_mode", version_mode),
+            ("version_name", version_name),
+            ("version_padding", version_padding)
+        ))
+
+    def _get_shot_task_dir_path(self, instance, task_data):
+        project_doc = instance.data["projectEntity"]
+        asset_entity = instance.data["assetEntity"]
+
+        return get_workdir(
+            project_doc, asset_entity, task_data["name"], "flame")
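The `data` dict handed to `load_with_repre_context` above carries the same keys `LoadClipBatch.load` reads from its `options` earlier in this commit, so it presumably reaches the loader as that options mapping. A minimal sketch of the hand-off under that assumption, with placeholder values:

```python
# Placeholder stand-ins; the real values are a task work directory string
# and a flame batch group object.
options = {
    "workdir": "/proj/shots/sh010/work/compositing",
    "batch": object(),
}

# Mirrors the reads in LoadClipBatch.load; in the loader each .get() has a
# fallback (flame.batch and AVALON_WORKDIR respectively) when a key is absent.
batch = options.get("batch")
workdir = options.get("workdir")
print(batch, workdir)
```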
@@ -9,6 +9,8 @@ class ValidateSourceClip(pyblish.api.InstancePlugin):
     label = "Validate Source Clip"
     hosts = ["flame"]
     families = ["clip"]
+    optional = True
+    active = False

     def process(self, instance):
         flame_source_clip = instance.data["flameSourceClip"]
@@ -3,18 +3,19 @@ import sys
 from Qt import QtWidgets
 from pprint import pformat
 import atexit
 import openpype
-import avalon

 import openpype.hosts.flame.api as opfapi
+from openpype.pipeline import (
+    install_host,
+    registered_host,
+)


 def openpype_install():
     """Registering OpenPype in context
     """
     openpype.install()
-    avalon.api.install(opfapi)
-    print("Avalon registered hosts: {}".format(
-        avalon.api.registered_host()))
+    install_host(opfapi)
+    print("Registered host: {}".format(registered_host()))


 # Exception handler
@@ -7,6 +7,10 @@ import logging
 import avalon.api
 from avalon import io

+from openpype.pipeline import (
+    install_host,
+    registered_host,
+)
 from openpype.lib import version_up
 from openpype.hosts.fusion import api
 from openpype.hosts.fusion.api import lib

@@ -218,7 +222,7 @@ def switch(asset_name, filepath=None, new=True):
     assert current_comp is not None, (
         "Fusion could not load '{}'").format(filepath)

-    host = avalon.api.registered_host()
+    host = registered_host()
     containers = list(host.ls())
     assert containers, "Nothing to update"

@@ -279,7 +283,7 @@ if __name__ == '__main__':

     args, unknown = parser.parse_args()

-    avalon.api.install(api)
+    install_host(api)
     switch(args.asset_name, args.file_path)

     sys.exit(0)
@@ -1,24 +1,23 @@
 import os
 import sys
 import openpype

 from openpype.api import Logger
+from openpype.pipeline import (
+    install_host,
+    registered_host,
+)

 log = Logger().get_logger(__name__)


 def main(env):
-    import avalon.api
     from openpype.hosts.fusion import api
     from openpype.hosts.fusion.api import menu

     # Registers pype's Global pyblish plugins
     openpype.install()

     # activate resolve from pype
-    avalon.api.install(api)
+    install_host(api)

-    log.info(f"Avalon registered hosts: {avalon.api.registered_host()}")
+    log.info(f"Registered host: {registered_host()}")

     menu.launch_openpype_menu()
@@ -1,14 +1,15 @@
 import os
 import sys
 import glob
 import logging

 from Qt import QtWidgets, QtCore

-import avalon.api
 from avalon import io
 import qtawesome as qta

 from openpype import style
+from openpype.pipeline import install_host
 from openpype.hosts.fusion import api
 from openpype.lib.avalon_context import get_workdir_from_session

@@ -181,8 +182,7 @@ class App(QtWidgets.QWidget):


 if __name__ == '__main__':
-    import sys
-    avalon.api.install(api)
+    install_host(api)

     app = QtWidgets.QApplication(sys.argv)
     window = App()
@@ -183,10 +183,10 @@ def launch(application_path, *args):
         application_path (str): Path to Harmony.

     """
-    from avalon import api
+    from openpype.pipeline import install_host
     from openpype.hosts.harmony import api as harmony

-    api.install(harmony)
+    install_host(harmony)

     ProcessContext.port = random.randrange(49152, 65535)
     os.environ["AVALON_HARMONY_PORT"] = str(ProcessContext.port)
@@ -34,14 +34,7 @@ AVALON_CONTAINERS = ":AVALON_CONTAINERS"


 def install():
-    """
-    Installing Hiero integration for avalon
-
-    Args:
-        config (obj): avalon config module `pype` in our case, it is not
-            used but required by avalon.api.install()
-
-    """
+    """Installing Hiero integration."""

     # adding all events
     events.register_events()
@@ -1,9 +1,9 @@
 import traceback

 # activate hiero from pype
-import avalon.api
+from openpype.pipeline import install_host
 import openpype.hosts.hiero.api as phiero
-avalon.api.install(phiero)
+install_host(phiero)

 try:
     __import__("openpype.hosts.hiero.api")
@@ -4,11 +4,8 @@ import logging
 import contextlib

 import hou
-import hdefereval

 import pyblish.api
-import avalon.api
-from avalon.lib import find_submodule

 from openpype.pipeline import (
     register_creator_plugin_path,

@@ -215,24 +212,12 @@ def ls():
             "pyblish.mindbender.container"):
         containers += lib.lsattr("id", identifier)

-    has_metadata_collector = False
-    config_host = find_submodule(avalon.api.registered_config(), "houdini")
-    if hasattr(config_host, "collect_container_metadata"):
-        has_metadata_collector = True
-
     for container in sorted(containers,
                             # Hou 19+ Python 3 hou.ObjNode are not
                             # sortable due to not supporting greater
                             # than comparisons
                             key=lambda node: node.path()):
-        data = parse_container(container)
-
-        # Collect custom data if attribute is present
-        if has_metadata_collector:
-            metadata = config_host.collect_container_metadata(container)
-            data.update(metadata)
-
-        yield data
+        yield parse_container(container)


 def before_save():

@@ -305,7 +290,13 @@ def on_new():
     start = hou.playbar.playbackRange()[0]
     hou.setFrame(start)

-    hdefereval.executeDeferred(_enforce_start_frame)
+    if hou.isUIAvailable():
+        import hdefereval
+        hdefereval.executeDeferred(_enforce_start_frame)
+    else:
+        # Run without execute deferred when no UI is available because
+        # without UI `hdefereval` is not available to import
+        _enforce_start_frame()


 def _set_context_settings():
@@ -1,6 +1,7 @@
-import avalon.api as api
 import pyblish.api

+from openpype.pipeline import registered_host
+

 def collect_input_containers(nodes):
     """Collect containers that contain any of the node in `nodes`.

@@ -18,7 +19,7 @@ def collect_input_containers(nodes):
     lookup = frozenset(nodes)

     containers = []
-    host = api.registered_host()
+    host = registered_host()
     for container in host.ls():

         node = container["node"]
@@ -1,8 +1,8 @@
 import pyblish.api
-import avalon.api

 from openpype.api import version_up
 from openpype.action import get_errored_plugins_from_data
+from openpype.pipeline import registered_host


 class IncrementCurrentFile(pyblish.api.InstancePlugin):

@@ -41,7 +41,7 @@ class IncrementCurrentFile(pyblish.api.InstancePlugin):
         )

         # Filename must not have changed since collecting
-        host = avalon.api.registered_host()
+        host = registered_host()
         current_file = host.current_file()
         assert (
             context.data["currentFile"] == current_file
@@ -1,5 +1,6 @@
 import pyblish.api
-import avalon.api

+from openpype.pipeline import registered_host
+

 class SaveCurrentScene(pyblish.api.ContextPlugin):

@@ -12,7 +13,7 @@ class SaveCurrentScene(pyblish.api.ContextPlugin):
     def process(self, context):

         # Filename must not have changed since collecting
-        host = avalon.api.registered_host()
+        host = registered_host()
         current_file = host.current_file()
         assert context.data['currentFile'] == current_file, (
             "Collected filename from current scene name."
@@ -1,10 +1,10 @@
-import avalon.api
+from openpype.pipeline import install_host
 from openpype.hosts.houdini import api


 def main():
     print("Installing OpenPype ...")
-    avalon.api.install(api)
+    install_host(api)


 main()
@@ -1,10 +1,10 @@
-import avalon.api
+from openpype.pipeline import install_host
 from openpype.hosts.houdini import api


 def main():
     print("Installing OpenPype ...")
-    avalon.api.install(api)
+    install_host(api)


 main()
@@ -134,6 +134,7 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase):
         """

         from avalon import api, io
+        from openpype.pipeline import registered_root

         PROJECT = api.Session["AVALON_PROJECT"]
         asset_doc = io.find_one({"name": asset,

@@ -141,7 +142,7 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase):
         if not asset_doc:
             raise RuntimeError("Invalid asset name: '%s'" % asset)

-        root = api.registered_root()
+        root = registered_root()
         path = self._template.format(**{
             "root": root,
             "project": PROJECT,
@@ -26,6 +26,7 @@ from openpype.pipeline import (
     loaders_from_representation,
     get_representation_path,
     load_container,
+    registered_host,
 )
 from .commands import reset_frame_range

@@ -1574,7 +1575,7 @@ def assign_look_by_version(nodes, version_id):
         "name": "json"})

     # See if representation is already loaded, if so reuse it.
-    host = api.registered_host()
+    host = registered_host()
     representation_id = str(look_representation['_id'])
     for container in host.ls():
         if (container['loader'] == "LookLoader" and

@@ -2612,7 +2613,7 @@ def get_attr_in_layer(attr, layer):
 def fix_incompatible_containers():
     """Backwards compatibility: old containers to use new ReferenceLoader"""

-    host = api.registered_host()
+    host = registered_host()
     for container in host.ls():
         loader = container['loader']
@@ -4,8 +4,6 @@ import os
 import json
 import appdirs
-import requests
-import six
 import sys

 from maya import cmds
 import maya.app.renderSetup.model.renderSetup as renderSetup

@@ -14,6 +12,7 @@ from openpype.hosts.maya.api import (
     lib,
     plugin
 )
+from openpype.lib import requests_get
 from openpype.api import (
     get_system_settings,
     get_project_settings,

@@ -117,6 +116,8 @@ class CreateRender(plugin.Creator):
         except KeyError:
             self.aov_separator = "_"

+        manager = ModulesManager()
+        self.deadline_module = manager.modules_by_name["deadline"]
         try:
             default_servers = deadline_settings["deadline_urls"]
             project_servers = (

@@ -133,10 +134,8 @@ class CreateRender(plugin.Creator):

         except AttributeError:
             # Handle situation where we had only one url for deadline.
-            manager = ModulesManager()
-            deadline_module = manager.modules_by_name["deadline"]
             # get default deadline webservice url from deadline module
-            self.deadline_servers = deadline_module.deadline_urls
+            self.deadline_servers = self.deadline_module.deadline_urls

     def process(self):
         """Entry point."""

@@ -205,48 +204,31 @@ class CreateRender(plugin.Creator):
     def _deadline_webservice_changed(self):
         """Refresh Deadline server dependent options."""
         # get selected server
         from maya import cmds
         webservice = self.deadline_servers[
             self.server_aliases[
                 cmds.getAttr("{}.deadlineServers".format(self.instance))
             ]
         ]
-        pools = self._get_deadline_pools(webservice)
+        pools = self.deadline_module.get_deadline_pools(webservice, self.log)
         cmds.deleteAttr("{}.primaryPool".format(self.instance))
         cmds.deleteAttr("{}.secondaryPool".format(self.instance))

+        pool_setting = (self._project_settings["deadline"]
+                        ["publish"]
+                        ["CollectDeadlinePools"])
+
+        primary_pool = pool_setting["primary_pool"]
+        sorted_pools = self._set_default_pool(list(pools), primary_pool)
         cmds.addAttr(self.instance, longName="primaryPool",
                      attributeType="enum",
-                     enumName=":".join(pools))
-        cmds.addAttr(self.instance, longName="secondaryPool",
-                     attributeType="enum",
-                     enumName=":".join(["-"] + pools))
+                     enumName=":".join(sorted_pools))

-    def _get_deadline_pools(self, webservice):
-        # type: (str) -> list
-        """Get pools from Deadline.
-        Args:
-            webservice (str): Server url.
-        Returns:
-            list: Pools.
-        Throws:
-            RuntimeError: If deadline webservice is unreachable.
-
-        """
-        argument = "{}/api/pools?NamesOnly=true".format(webservice)
-        try:
-            response = self._requests_get(argument)
-        except requests.exceptions.ConnectionError as exc:
-            msg = 'Cannot connect to deadline web service'
-            self.log.error(msg)
-            six.reraise(
-                RuntimeError,
-                RuntimeError('{} - {}'.format(msg, exc)),
-                sys.exc_info()[2])
-        if not response.ok:
-            self.log.warning("No pools retrieved")
-            return []
-
-        return response.json()
+        pools = ["-"] + pools
+        secondary_pool = pool_setting["secondary_pool"]
+        sorted_pools = self._set_default_pool(list(pools), secondary_pool)
+        cmds.addAttr("{}.secondaryPool".format(self.instance),
+                     attributeType="enum",
+                     enumName=":".join(sorted_pools))

     def _create_render_settings(self):
         """Create instance settings."""

@@ -295,7 +277,8 @@ class CreateRender(plugin.Creator):
         # use first one for initial list of pools.
         deadline_url = next(iter(self.deadline_servers.values()))

-        pool_names = self._get_deadline_pools(deadline_url)
+        pool_names = self.deadline_module.get_deadline_pools(deadline_url,
+                                                             self.log)
         maya_submit_dl = self._project_settings.get(
             "deadline", {}).get(
                 "publish", {}).get(

@@ -326,12 +309,27 @@ class CreateRender(plugin.Creator):
             self.log.info(" - pool: {}".format(pool["name"]))
             pool_names.append(pool["name"])

-        self.data["primaryPool"] = pool_names
+        pool_setting = (self._project_settings["deadline"]
+                        ["publish"]
+                        ["CollectDeadlinePools"])
+        primary_pool = pool_setting["primary_pool"]
+        self.data["primaryPool"] = self._set_default_pool(pool_names,
+                                                          primary_pool)
         # We add a string "-" to allow the user to not
         # set any secondary pools
-        self.data["secondaryPool"] = ["-"] + pool_names
+        pool_names = ["-"] + pool_names
+        secondary_pool = pool_setting["secondary_pool"]
+        self.data["secondaryPool"] = self._set_default_pool(pool_names,
+                                                            secondary_pool)
         self.options = {"useSelection": False}  # Force no content

+    def _set_default_pool(self, pool_names, pool_value):
+        """Reorder pool names, default should come first"""
+        if pool_value and pool_value in pool_names:
+            pool_names.remove(pool_value)
+            pool_names = [pool_value] + pool_names
+        return pool_names
+
     def _load_credentials(self):
         """Load Muster credentials.

@@ -366,7 +364,7 @@ class CreateRender(plugin.Creator):
         """
         params = {"authToken": self._token}
         api_entry = "/api/pools/list"
-        response = self._requests_get(self.MUSTER_REST_URL + api_entry,
-                                      params=params)
+        response = requests_get(self.MUSTER_REST_URL + api_entry,
+                                params=params)
         if response.status_code != 200:
             if response.status_code == 401:

@@ -392,45 +390,11 @@ class CreateRender(plugin.Creator):
         api_url = "{}/muster/show_login".format(
             os.environ["OPENPYPE_WEBSERVER_URL"])
         self.log.debug(api_url)
-        login_response = self._requests_get(api_url, timeout=1)
+        login_response = requests_get(api_url, timeout=1)
         if login_response.status_code != 200:
             self.log.error("Cannot show login form to Muster")
             raise Exception("Cannot show login form to Muster")

-    def _requests_post(self, *args, **kwargs):
-        """Wrap request post method.
-
-        Disabling SSL certificate validation if ``DONT_VERIFY_SSL`` environment
-        variable is found. This is useful when Deadline or Muster server are
-        running with self-signed certificates and their certificate is not
-        added to trusted certificates on client machines.
-
-        Warning:
-            Disabling SSL certificate validation is defeating one line
-            of defense SSL is providing and it is not recommended.
-
-        """
-        if "verify" not in kwargs:
-            kwargs["verify"] = not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)
-        return requests.post(*args, **kwargs)
-
-    def _requests_get(self, *args, **kwargs):
-        """Wrap request get method.
-
-        Disabling SSL certificate validation if ``DONT_VERIFY_SSL`` environment
-        variable is found. This is useful when Deadline or Muster server are
-        running with self-signed certificates and their certificate is not
-        added to trusted certificates on client machines.
-
-        Warning:
-            Disabling SSL certificate validation is defeating one line
-            of defense SSL is providing and it is not recommended.
-
-        """
-        if "verify" not in kwargs:
-            kwargs["verify"] = not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)
-        return requests.get(*args, **kwargs)
-
     def _set_default_renderer_settings(self, renderer):
         """Set basic settings based on renderer.
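`_set_default_pool` only reorders the list; a short usage sketch (pool names invented) showing how the configured default bubbles to the front of the enum list:

```python
def _set_default_pool(pool_names, pool_value):
    """Reorder pool names, default should come first (copied from above)."""
    if pool_value and pool_value in pool_names:
        pool_names.remove(pool_value)
        pool_names = [pool_value] + pool_names
    return pool_names

print(_set_default_pool(["-", "gpu", "cpu"], "cpu"))   # ['cpu', '-', 'gpu']
print(_set_default_pool(["-", "gpu"], "unknown"))      # ['-', 'gpu'] unchanged
```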
@@ -4,8 +4,6 @@ import os
 import json
 import appdirs
-import requests
-import six
 import sys

 from maya import cmds
 import maya.app.renderSetup.model.renderSetup as renderSetup

@@ -19,6 +17,7 @@ from openpype.api import (
     get_project_settings
 )

+from openpype.lib import requests_get
 from openpype.pipeline import CreatorError
 from openpype.modules import ModulesManager

@@ -40,6 +39,10 @@ class CreateVRayScene(plugin.Creator):
         self._rs = renderSetup.instance()
         self.data["exportOnFarm"] = False
         deadline_settings = get_system_settings()["modules"]["deadline"]

+        manager = ModulesManager()
+        self.deadline_module = manager.modules_by_name["deadline"]
+
         if not deadline_settings["enabled"]:
             self.deadline_servers = {}
             return

@@ -62,10 +65,8 @@ class CreateVRayScene(plugin.Creator):

         except AttributeError:
             # Handle situation where we had only one url for deadline.
-            manager = ModulesManager()
-            deadline_module = manager.modules_by_name["deadline"]
             # get default deadline webservice url from deadline module
-            self.deadline_servers = deadline_module.deadline_urls
+            self.deadline_servers = self.deadline_module.deadline_urls

     def process(self):
         """Entry point."""

@@ -128,7 +129,7 @@ class CreateVRayScene(plugin.Creator):
                 cmds.getAttr("{}.deadlineServers".format(self.instance))
             ]
         ]
-        pools = self._get_deadline_pools(webservice)
+        pools = self.deadline_module.get_deadline_pools(webservice)
         cmds.deleteAttr("{}.primaryPool".format(self.instance))
         cmds.deleteAttr("{}.secondaryPool".format(self.instance))
         cmds.addAttr(self.instance, longName="primaryPool",

@@ -138,33 +139,6 @@ class CreateVRayScene(plugin.Creator):
                      attributeType="enum",
                      enumName=":".join(["-"] + pools))

-    def _get_deadline_pools(self, webservice):
-        # type: (str) -> list
-        """Get pools from Deadline.
-        Args:
-            webservice (str): Server url.
-        Returns:
-            list: Pools.
-        Throws:
-            RuntimeError: If deadline webservice is unreachable.
-
-        """
-        argument = "{}/api/pools?NamesOnly=true".format(webservice)
-        try:
-            response = self._requests_get(argument)
-        except requests.exceptions.ConnectionError as exc:
-            msg = 'Cannot connect to deadline web service'
-            self.log.error(msg)
-            six.reraise(
-                CreatorError,
-                CreatorError('{} - {}'.format(msg, exc)),
-                sys.exc_info()[2])
-        if not response.ok:
-            self.log.warning("No pools retrieved")
-            return []
-
-        return response.json()
-
     def _create_vray_instance_settings(self):
         # get pools
         pools = []

@@ -195,7 +169,7 @@ class CreateVRayScene(plugin.Creator):
             for k in self.deadline_servers.keys()
         ][0]

-        pool_names = self._get_deadline_pools(deadline_url)
+        pool_names = self.deadline_module.get_deadline_pools(deadline_url)

         if muster_enabled:
             self.log.info(">>> Loading Muster credentials ...")

@@ -259,8 +233,8 @@ class CreateVRayScene(plugin.Creator):
         """
         params = {"authToken": self._token}
         api_entry = "/api/pools/list"
-        response = self._requests_get(self.MUSTER_REST_URL + api_entry,
-                                      params=params)
+        response = requests_get(self.MUSTER_REST_URL + api_entry,
+                                params=params)
         if response.status_code != 200:
             if response.status_code == 401:
                 self.log.warning("Authentication token expired.")

@@ -285,45 +259,7 @@ class CreateVRayScene(plugin.Creator):
         api_url = "{}/muster/show_login".format(
             os.environ["OPENPYPE_WEBSERVER_URL"])
         self.log.debug(api_url)
-        login_response = self._requests_get(api_url, timeout=1)
+        login_response = requests_get(api_url, timeout=1)
         if login_response.status_code != 200:
             self.log.error("Cannot show login form to Muster")
             raise CreatorError("Cannot show login form to Muster")

-    def _requests_post(self, *args, **kwargs):
-        """Wrap request post method.
-
-        Disabling SSL certificate validation if ``DONT_VERIFY_SSL`` environment
-        variable is found. This is useful when Deadline or Muster server are
-        running with self-signed certificates and their certificate is not
-        added to trusted certificates on client machines.
-
-        Warning:
-            Disabling SSL certificate validation is defeating one line
-            of defense SSL is providing and it is not recommended.
-
-        """
-        if "verify" not in kwargs:
-            kwargs["verify"] = (
-                False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True
-            )  # noqa
-        return requests.post(*args, **kwargs)
-
-    def _requests_get(self, *args, **kwargs):
-        """Wrap request get method.
-
-        Disabling SSL certificate validation if ``DONT_VERIFY_SSL`` environment
-        variable is found. This is useful when Deadline or Muster server are
-        running with self-signed certificates and their certificate is not
-        added to trusted certificates on client machines.
-
-        Warning:
-            Disabling SSL certificate validation is defeating one line
-            of defense SSL is providing and it is not recommended.
-
-        """
-        if "verify" not in kwargs:
-            kwargs["verify"] = (
-                False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True
-            )  # noqa
-        return requests.get(*args, **kwargs)
@@ -386,6 +386,12 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
             overrides = self.parse_options(str(render_globals))
             data.update(**overrides)

+            # get string values for pools
+            primary_pool = overrides["renderGlobals"]["Pool"]
+            secondary_pool = overrides["renderGlobals"].get("SecondaryPool")
+            data["primaryPool"] = primary_pool
+            data["secondaryPool"] = secondary_pool
+
             # Define nice label
             label = "{0} ({1})".format(expected_layer_name, data["asset"])
             label += " [{0}-{1}]".format(
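The pool collection above assumes the render-globals overrides parse into a nested dict; `SecondaryPool` may be absent, hence the `.get()`. A hedged example of that shape (keys taken from the code, values invented):

```python
# Invented values; only the key names come from the collector above.
overrides = {"renderGlobals": {"Pool": "cpu_pool", "SecondaryPool": "gpu_pool"}}

primary_pool = overrides["renderGlobals"]["Pool"]
secondary_pool = overrides["renderGlobals"].get("SecondaryPool")  # may be None
print(primary_pool, secondary_pool)
```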
@@ -4,13 +4,13 @@ import getpass
 import platform

 import appdirs
-import requests

 from maya import cmds

 from avalon import api

 import pyblish.api
+from openpype.lib import requests_post
 from openpype.hosts.maya.api import lib
 from openpype.api import get_system_settings

@@ -184,7 +184,7 @@ class MayaSubmitMuster(pyblish.api.InstancePlugin):
             "select": "name"
         }
         api_entry = '/api/templates/list'
-        response = self._requests_post(
+        response = requests_post(
             self.MUSTER_REST_URL + api_entry, params=params)
         if response.status_code != 200:
             self.log.error(

@@ -235,7 +235,7 @@ class MayaSubmitMuster(pyblish.api.InstancePlugin):
             "name": "submit"
         }
         api_entry = '/api/queue/actions'
-        response = self._requests_post(
+        response = requests_post(
             self.MUSTER_REST_URL + api_entry, params=params, json=payload)

         if response.status_code != 200:

@@ -549,16 +549,3 @@ class MayaSubmitMuster(pyblish.api.InstancePlugin):
             % (value, int(value))
         )

-    def _requests_post(self, *args, **kwargs):
-        """ Wrapper for requests, disabling SSL certificate validation if
-        DONT_VERIFY_SSL environment variable is found. This is useful when
-        Deadline or Muster server are running with self-signed certificates
-        and their certificate is not added to trusted certificates on
-        client machines.
-
-        WARNING: disabling SSL certificate validation is defeating one line
-        of defense SSL is providing and it is not recommended.
-        """
-        if 'verify' not in kwargs:
-            kwargs['verify'] = False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True  # noqa
-        return requests.post(*args, **kwargs)
@@ -2,9 +2,9 @@ import os
 import json

 import appdirs
-import requests

 import pyblish.api
+from openpype.lib import requests_get
 from openpype.plugin import contextplugin_should_run
 import openpype.hosts.maya.api.action

@@ -51,7 +51,7 @@ class ValidateMusterConnection(pyblish.api.ContextPlugin):
             'authToken': self._token
         }
         api_entry = '/api/pools/list'
-        response = self._requests_get(
+        response = requests_get(
             MUSTER_REST_URL + api_entry, params=params)
         assert response.status_code == 200, "invalid response from server"
         assert response.json()['ResponseData'], "invalid data in response"

@@ -88,35 +88,7 @@ class ValidateMusterConnection(pyblish.api.ContextPlugin):
         api_url = "{}/muster/show_login".format(
             os.environ["OPENPYPE_WEBSERVER_URL"])
         cls.log.debug(api_url)
-        response = cls._requests_get(api_url, timeout=1)
+        response = requests_get(api_url, timeout=1)
         if response.status_code != 200:
             cls.log.error('Cannot show login form to Muster')
             raise Exception('Cannot show login form to Muster')

-    def _requests_post(self, *args, **kwargs):
-        """ Wrapper for requests, disabling SSL certificate validation if
-        DONT_VERIFY_SSL environment variable is found. This is useful when
-        Deadline or Muster server are running with self-signed certificates
-        and their certificate is not added to trusted certificates on
-        client machines.
-
-        WARNING: disabling SSL certificate validation is defeating one line
-        of defense SSL is providing and it is not recommended.
-        """
-        if 'verify' not in kwargs:
-            kwargs['verify'] = False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True  # noqa
-        return requests.post(*args, **kwargs)
-
-    def _requests_get(self, *args, **kwargs):
-        """ Wrapper for requests, disabling SSL certificate validation if
-        DONT_VERIFY_SSL environment variable is found. This is useful when
-        Deadline or Muster server are running with self-signed certificates
-        and their certificate is not added to trusted certificates on
-        client machines.
-
-        WARNING: disabling SSL certificate validation is defeating one line
-        of defense SSL is providing and it is not recommended.
-        """
-        if 'verify' not in kwargs:
-            kwargs['verify'] = False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True  # noqa
-        return requests.get(*args, **kwargs)
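All of the removed `_requests_get`/`_requests_post` wrappers shared one body, so a standalone helper equivalent to them reduces to the sketch below. The shared `openpype.lib` helpers imported in these hunks presumably centralize the same logic, but their exact implementation is not shown in this diff:

```python
import os
import requests

def requests_get(*args, **kwargs):
    if "verify" not in kwargs:
        # Copied from the removed wrappers, quirks included: with the getenv
        # default of True, verification ends up disabled unless the
        # environment variable is set to an empty string.
        kwargs["verify"] = False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True
    return requests.get(*args, **kwargs)
```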
@@ -1,11 +1,10 @@
 import os
-import avalon.api
 from openpype.api import get_project_settings
+from openpype.pipeline import install_host
 from openpype.hosts.maya import api
 import openpype.hosts.maya.api.lib as mlib
 from maya import cmds

-avalon.api.install(api)
+install_host(api)


 print("starting OpenPype usersetup")
@@ -1,5 +1,5 @@
 from openpype.pipeline import InventoryAction
-from openpype.hosts.nuke.api.commands import viewer_update_and_undo_stop
+from openpype.hosts.nuke.api.command import viewer_update_and_undo_stop


 class SelectContainers(InventoryAction):
@@ -14,7 +14,7 @@ from openpype.hosts.nuke.api.lib import (
     get_avalon_knob_data,
     set_avalon_knob_data
 )
-from openpype.hosts.nuke.api.commands import viewer_update_and_undo_stop
+from openpype.hosts.nuke.api.command import viewer_update_and_undo_stop
 from openpype.hosts.nuke.api import containerise, update_container
@@ -1,6 +1,9 @@
 import os
 import nuke
+import copy
+
 import pyblish.api
+
 import openpype
 from openpype.hosts.nuke.api.lib import maintained_selection

@@ -18,6 +21,13 @@ class ExtractSlateFrame(openpype.api.Extractor):
     families = ["slate"]
     hosts = ["nuke"]

+    # Settings values
+    # - can be extended by other attributes from node in the future
+    key_value_mapping = {
+        "f_submission_note": [True, "{comment}"],
+        "f_submitting_for": [True, "{intent[value]}"],
+        "f_vfx_scope_of_work": [False, ""]
+    }
+
     def process(self, instance):
         if hasattr(self, "viewer_lut_raw"):

@@ -129,9 +139,7 @@ class ExtractSlateFrame(openpype.api.Extractor):
         for node in temporary_nodes:
             nuke.delete(node)

-
     def get_view_process_node(self):
-
         # Select only the target node
         if nuke.selectedNodes():
             [n.setSelected(False) for n in nuke.selectedNodes()]

@@ -162,13 +170,56 @@ class ExtractSlateFrame(openpype.api.Extractor):
             return

         comment = instance.context.data.get("comment")
-        intent_value = instance.context.data.get("intent")
-        if intent_value and isinstance(intent_value, dict):
-            intent_value = intent_value.get("value")
+        intent = instance.context.data.get("intent")
+        if not isinstance(intent, dict):
+            intent = {
+                "label": intent,
+                "value": intent
+            }

-        try:
-            node["f_submission_note"].setValue(comment)
-            node["f_submitting_for"].setValue(intent_value or "")
-        except NameError:
-            return
-        instance.data.pop("slateNode")
+        fill_data = copy.deepcopy(instance.data["anatomyData"])
+        fill_data.update({
+            "custom": copy.deepcopy(
+                instance.data.get("customData") or {}
+            ),
+            "comment": comment,
+            "intent": intent
+        })
+
+        for key, value in self.key_value_mapping.items():
+            enabled, template = value
+            if not enabled:
+                self.log.debug("Key \"{}\" is disabled".format(key))
+                continue
+
+            try:
+                value = template.format(**fill_data)
+
+            except ValueError:
+                self.log.warning(
+                    "Couldn't fill template \"{}\" with data: {}".format(
+                        template, fill_data
+                    ),
+                    exc_info=True
+                )
+                continue
+
+            except KeyError:
+                self.log.warning(
+                    (
+                        "Template contains unknown key."
+                        " Template \"{}\" Data: {}"
+                    ).format(template, fill_data),
+                    exc_info=True
+                )
+                continue
+
+            try:
+                node[key].setValue(value)
+                self.log.info("Change key \"{}\" to value \"{}\"".format(
+                    key, value
+                ))
+            except NameError:
+                self.log.warning((
+                    "Failed to set value \"{}\" on node attribute \"{}\""
+                ).format(value, key))
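The slate templates above resolve against `fill_data` with plain `str.format`; `{intent[value]}` indexes into the normalized intent dict, which is why non-dict intents get wrapped first. A small sketch with invented data:

```python
# Invented fill data mirroring the keys prepared above.
fill_data = {
    "comment": "fixed flicker on sh010",
    "intent": {"label": "Work in Progress", "value": "wip"},
}

print("{comment}".format(**fill_data))        # fixed flicker on sh010
print("{intent[value]}".format(**fill_data))  # wip
```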
@@ -1,7 +1,7 @@
 import nuke
-import avalon.api

 from openpype.api import Logger
+from openpype.pipeline import install_host
 from openpype.hosts.nuke import api
 from openpype.hosts.nuke.api.lib import (
     on_script_load,

@@ -13,7 +13,7 @@ from openpype.hosts.nuke.api.lib import (
 log = Logger.get_logger(__name__)


-avalon.api.install(api)
+install_host(api)

 # fix ffmpeg settings on script
 nuke.addOnScriptLoad(on_script_load)
@@ -5,9 +5,8 @@ import traceback

 from Qt import QtWidgets

-import avalon.api
-
 from openpype.api import Logger
+from openpype.pipeline import install_host
 from openpype.tools.utils import host_tools
 from openpype.lib.remote_publish import headless_publish
 from openpype.lib import env_value_to_bool

@@ -24,7 +23,7 @@ def safe_excepthook(*args):
 def main(*subprocess_args):
     from openpype.hosts.photoshop import api

-    avalon.api.install(api)
+    install_host(api)
     sys.excepthook = safe_excepthook

     # coloring in StdOutBroker
@@ -3,7 +3,6 @@ from Qt import QtWidgets
 from bson.objectid import ObjectId

 import pyblish.api
-import avalon.api
 from avalon import io

 from openpype.api import Logger

@@ -14,6 +13,7 @@ from openpype.pipeline import (
     deregister_loader_plugin_path,
     deregister_creator_plugin_path,
     AVALON_CONTAINER_ID,
+    registered_host,
 )
 import openpype.hosts.photoshop

@@ -33,7 +33,7 @@ def check_inventory():
     if not lib.any_outdated():
         return

-    host = avalon.api.registered_host()
+    host = registered_host()
     outdated_containers = []
     for container in host.ls():
         representation = container['representation']
@@ -1,13 +1,14 @@
 #!/usr/bin/env python
 import os
 import sys
 import openpype

+from openpype.pipeline import install_host
+

 def main(env):
     import openpype.hosts.resolve as bmdvr
     # Registers openpype's Global pyblish plugins
     openpype.install()
+    install_host(bmdvr)
     bmdvr.setup(env)
@@ -1,8 +1,7 @@
 import os
 import sys
-import avalon.api as avalon
 import openpype

+from openpype.pipeline import install_host
 from openpype.api import Logger

 log = Logger().get_logger(__name__)

@@ -10,13 +9,9 @@ log = Logger().get_logger(__name__)

 def main(env):
     import openpype.hosts.resolve as bmdvr
     # Registers openpype's Global pyblish plugins
     openpype.install()

-    # activate resolve from openpype
-    avalon.install(bmdvr)
-
-    log.info(f"Avalon registered hosts: {avalon.registered_host()}")
+    install_host(bmdvr)

     bmdvr.launch_pype_menu()
|
|||
#! python3
|
||||
import os
|
||||
import sys
|
||||
import avalon.api as avalon
|
||||
import openpype
|
||||
|
||||
import opentimelineio as otio
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
|
||||
from openpype.hosts.resolve import TestGUI
|
||||
import openpype.hosts.resolve as bmdvr
|
||||
from openpype.hosts.resolve.otio import davinci_export as otio_export
|
||||
|
|
@ -14,10 +16,8 @@ class ThisTestGUI(TestGUI):
|
|||
|
||||
def __init__(self):
|
||||
super(ThisTestGUI, self).__init__()
|
||||
# Registers openpype's Global pyblish plugins
|
||||
openpype.install()
|
||||
# activate resolve from openpype
|
||||
avalon.install(bmdvr)
|
||||
install_host(bmdvr)
|
||||
|
||||
def _open_dir_button_pressed(self, event):
|
||||
# selected_path = self.fu.RequestFile(os.path.expanduser("~"))
|
||||
|
|
|
|||
|
|
@@ -1,8 +1,8 @@
 #! python3
 import os
 import sys
-import avalon.api as avalon
-import openpype

+from openpype.pipeline import install_host
 from openpype.hosts.resolve import TestGUI
 import openpype.hosts.resolve as bmdvr
 import clique

@@ -13,10 +13,8 @@ class ThisTestGUI(TestGUI):

     def __init__(self):
         super(ThisTestGUI, self).__init__()
-        # Registers openpype's Global pyblish plugins
-        openpype.install()
-        # activate resolve from openpype
-        avalon.install(bmdvr)
+        install_host(bmdvr)

     def _open_dir_button_pressed(self, event):
         # selected_path = self.fu.RequestFile(os.path.expanduser("~"))
@@ -1,6 +1,5 @@
#! python3
import avalon.api as avalon
import openpype
from openpype.pipeline import install_host
import openpype.hosts.resolve as bmdvr

@@ -15,8 +14,7 @@ def file_processing(fpath):
if __name__ == "__main__":
    path = "C:/CODE/__openpype_projects/jtest03dev/shots/sq01/mainsq01sh030/publish/plate/plateMain/v006/jt3d_mainsq01sh030_plateMain_v006.0996.exr"

    openpype.install()
    # activate resolve from openpype
    avalon.install(bmdvr)
    install_host(bmdvr)

    file_processing(path)
    file_processing(path)
@@ -48,8 +48,8 @@ from openpype.tools.publisher.window import PublisherWindow

def main():
    """Main function for testing purposes."""
    import avalon.api
    import pyblish.api
    from openpype.pipeline import install_host
    from openpype.modules import ModulesManager
    from openpype.hosts.testhost import api as testhost

@@ -57,7 +57,7 @@ def main():
    for plugin_path in manager.collect_plugin_paths()["publish"]:
        pyblish.api.register_plugin_path(plugin_path)

    avalon.api.install(testhost)
    install_host(testhost)

    QtWidgets.QApplication.setAttribute(QtCore.Qt.AA_EnableHighDpiScaling)
    app = QtWidgets.QApplication([])
@@ -8,8 +8,8 @@ import logging

from Qt import QtWidgets, QtCore, QtGui

from avalon import api
from openpype import style
from openpype.pipeline import install_host
from openpype.hosts.tvpaint.api.communication_server import (
    CommunicationWrapper
)

@@ -31,7 +31,7 @@ def main(launch_args):
    qt_app = QtWidgets.QApplication([])

    # Execute pipeline installation
    api.install(tvpaint_host)
    install_host(tvpaint_host)

    # Create Communicator object and trigger launch
    # - this must be done before anything is processed
@@ -67,11 +67,8 @@ instances=2


def install():
    """Install Maya-specific functionality of avalon-core.
    """Install TVPaint-specific functionality."""

    This function is called automatically on calling `api.install(maya)`.

    """
    log.info("OpenPype - Installing TVPaint integration")
    io.install()

@@ -96,11 +93,11 @@ def install():


def uninstall():
    """Uninstall TVPaint-specific functionality of avalon-core.

    This function is called automatically on calling `api.uninstall()`.
    """Uninstall TVPaint-specific functionality.

    This function is called automatically on calling `uninstall_host()`.
    """
    log.info("OpenPype - Uninstalling TVPaint integration")
    pyblish.api.deregister_host("tvpaint")
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
@@ -1,12 +1,13 @@
import os

from avalon import api, io
from avalon import io
from openpype.lib import (
    StringTemplate,
    get_workfile_template_key_from_context,
    get_workdir_data,
    get_last_workfile_with_version,
)
from openpype.pipeline import registered_host
from openpype.api import Anatomy
from openpype.hosts.tvpaint.api import lib, pipeline, plugin

@@ -22,7 +23,7 @@ class LoadWorkfile(plugin.Loader):
    def load(self, context, name, namespace, options):
        # Load context of current workfile as first thing
        # - which context and extension has
        host = api.registered_host()
        host = registered_host()
        current_file = host.current_file()

        context = pipeline.get_current_workfile_context()
BIN
openpype/hosts/tvpaint/worker/init_file.tvpp
Normal file
Binary file not shown.
@@ -1,5 +1,8 @@
import os
import signal
import time
import tempfile
import shutil
import asyncio

from openpype.hosts.tvpaint.api.communication_server import (

@@ -36,8 +39,28 @@ class TVPaintWorkerCommunicator(BaseCommunicator):

        super()._start_webserver()

    def _open_init_file(self):
        """Open init TVPaint file.

        File triggers dialog missing path to audio file which must be closed
        once and is ignored for rest of running process.
        """
        current_dir = os.path.dirname(os.path.abspath(__file__))
        init_filepath = os.path.join(current_dir, "init_file.tvpp")
        with tempfile.NamedTemporaryFile(
            mode="w", prefix="a_tvp_", suffix=".tvpp"
        ) as tmp_file:
            tmp_filepath = tmp_file.name.replace("\\", "/")

        shutil.copy(init_filepath, tmp_filepath)
        george_script = "tv_LoadProject '\"'\"{}\"'\"'".format(tmp_filepath)
        self.execute_george_through_file(george_script)
        self.execute_george("tv_projectclose")
        os.remove(tmp_filepath)

    def _on_client_connect(self, *args, **kwargs):
        super()._on_client_connect(*args, **kwargs)
        self._open_init_file()
        # Register as "ready to work" worker
        self._worker_connection.register_as_worker()
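Note: the George quoting in `_open_init_file` is dense. The format string wraps the project path in a `'"'"..."'"'` sequence, presumably so the quoted path survives George's own argument parsing; a quick sketch with a hypothetical path shows the command that is actually sent:

    script = "tv_LoadProject '\"'\"{}\"'\"'".format("C:/temp/a_tvp_x.tvpp")
    print(script)
    # tv_LoadProject '"'"C:/temp/a_tvp_x.tvpp"'"'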
@@ -10,6 +10,7 @@ from openpype.pipeline import (
class Creator(LegacyCreator):
    """This serves as skeleton for future OpenPype specific functionality"""
    defaults = ['Main']
    maintain_selection = False


class Loader(LoaderPlugin, ABC):
@@ -2,13 +2,7 @@ import unreal
openpype_detected = True
try:
    from avalon import api
except ImportError as exc:
    api = None
    openpype_detected = False
    unreal.log_error("Avalon: cannot load Avalon [ {} ]".format(exc))

try:
    from openpype.pipeline import install_host
    from openpype.hosts.unreal import api as openpype_host
except ImportError as exc:
    openpype_host = None

@@ -16,7 +10,7 @@ except ImportError as exc:
    unreal.log_error("OpenPype: cannot load OpenPype [ {} ]".format(exc))

if openpype_detected:
    api.install(openpype_host)
    install_host(openpype_host)


@unreal.uclass()
@@ -1,7 +1,6 @@
import os
import logging

from avalon import api as avalon
from avalon import io
from pyblish import api as pyblish
import openpype.hosts.webpublisher
@@ -108,15 +108,18 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
            instance.data["representations"] = self._get_single_repre(
                task_dir, task_data["files"], tags
            )
            file_url = os.path.join(task_dir, task_data["files"][0])
            no_of_frames = self._get_number_of_frames(file_url)
            if no_of_frames:
            if family != 'workfile':
                file_url = os.path.join(task_dir, task_data["files"][0])
                try:
                    frame_end = int(frame_start) + math.ceil(no_of_frames)
                    instance.data["frameEnd"] = math.ceil(frame_end) - 1
                    self.log.debug("frameEnd:: {}".format(
                        instance.data["frameEnd"]))
                except ValueError:
                    no_of_frames = self._get_number_of_frames(file_url)
                    if no_of_frames:
                        frame_end = int(frame_start) + \
                            math.ceil(no_of_frames)
                        frame_end = math.ceil(frame_end) - 1
                        instance.data["frameEnd"] = frame_end
                        self.log.debug("frameEnd:: {}".format(
                            instance.data["frameEnd"]))
                except Exception:
                    self.log.warning("Unable to count frames "
                                     "duration {}".format(no_of_frames))

@@ -209,7 +212,6 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
        msg = "No family found for combination of " +\
              "task_type: {}, is_sequence:{}, extension: {}".format(
                  task_type, is_sequence, extension)
        found_family = "render"
        assert found_family, msg

        return (found_family,
@@ -221,6 +221,12 @@ from .openpype_version import (
    is_current_version_higher_than_expected
)


from .connections import (
    requests_get,
    requests_post
)

terminal = Terminal

__all__ = [

@@ -390,4 +396,7 @@ __all__ = [
    "is_running_from_build",
    "is_running_staging",
    "is_current_version_studio_latest",

    "requests_get",
    "requests_post"
]
@@ -13,7 +13,8 @@ import six

from openpype.settings import (
    get_system_settings,
    get_project_settings
    get_project_settings,
    get_local_settings
)
from openpype.settings.constants import (
    METADATA_KEYS,

@@ -1272,6 +1273,9 @@ class EnvironmentPrepData(dict):
        if data.get("env") is None:
            data["env"] = os.environ.copy()

        if "system_settings" not in data:
            data["system_settings"] = get_system_settings()

        super(EnvironmentPrepData, self).__init__(data)
@@ -1395,8 +1399,27 @@ def prepare_app_environments(data, env_group=None, implementation_envs=True):

    app = data["app"]
    log = data["log"]
    source_env = data["env"].copy()

    _add_python_version_paths(app, data["env"], log)
    _add_python_version_paths(app, source_env, log)

    # Use environments from local settings
    filtered_local_envs = {}
    system_settings = data["system_settings"]
    whitelist_envs = system_settings["general"].get("local_env_white_list")
    if whitelist_envs:
        local_settings = get_local_settings()
        local_envs = local_settings.get("environments") or {}
        filtered_local_envs = {
            key: value
            for key, value in local_envs.items()
            if key in whitelist_envs
        }

    # Apply local environment variables for already existing values
    for key, value in filtered_local_envs.items():
        if key in source_env:
            source_env[key] = value

    # `added_env_keys` has debug purpose
    added_env_keys = {app.group.name, app.name}
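Note: the whitelist above means a local override only applies when the studio explicitly lists that variable. A minimal, self-contained sketch of the same filtering; the variable names and values are made-up examples, not real settings:

    whitelist_envs = ["OCIO", "STUDIO_CACHE_DIR"]
    local_envs = {"OCIO": "/local/config.ocio", "PATH": "/tmp/bin"}

    filtered_local_envs = {
        key: value
        for key, value in local_envs.items()
        if key in whitelist_envs
    }
    print(filtered_local_envs)
    # {'OCIO': '/local/config.ocio'}  - PATH is silently dropped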
@@ -1441,10 +1464,19 @@ def prepare_app_environments(data, env_group=None, implementation_envs=True):

        # Choose right platform
        tool_env = parse_environments(_env_values, env_group)

        # Apply local environment variables
        # - must happen between all values because they may be used during
        #   merge
        for key, value in filtered_local_envs.items():
            if key in tool_env:
                tool_env[key] = value

        # Merge dictionaries
        env_values = _merge_env(tool_env, env_values)

    merged_env = _merge_env(env_values, data["env"])
    merged_env = _merge_env(env_values, source_env)

    loaded_env = acre.compute(merged_env, cleanup=False)

    final_env = None

@@ -1464,7 +1496,7 @@ def prepare_app_environments(data, env_group=None, implementation_envs=True):
    if final_env is None:
        final_env = loaded_env

    keys_to_remove = set(data["env"].keys()) - set(final_env.keys())
    keys_to_remove = set(source_env.keys()) - set(final_env.keys())

    # Update env
    data["env"].update(final_env)

@@ -1611,7 +1643,6 @@ def _prepare_last_workfile(data, workdir):
            result will be stored.
        workdir (str): Path to folder where workfiles should be stored.
    """
    import avalon.api
    from openpype.pipeline import HOST_WORKFILE_EXTENSIONS

    log = data["log"]
@@ -161,9 +161,10 @@ def is_latest(representation):
@with_avalon
def any_outdated():
    """Return whether the current scene has any outdated content"""
    from openpype.pipeline import registered_host

    checked = set()
    host = avalon.api.registered_host()
    host = registered_host()
    for container in host.ls():
        representation = container['representation']
        if representation in checked:
38
openpype/lib/connections.py
Normal file
@@ -0,0 +1,38 @@
import requests
import os


def requests_post(*args, **kwargs):
    """Wrap request post method.

    Disabling SSL certificate validation if ``DONT_VERIFY_SSL`` environment
    variable is found. This is useful when Deadline or Muster server are
    running with self-signed certificates and their certificate is not
    added to trusted certificates on client machines.

    Warning:
        Disabling SSL certificate validation is defeating one line
        of defense SSL is providing and it is not recommended.

    """
    if "verify" not in kwargs:
        kwargs["verify"] = not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)
    return requests.post(*args, **kwargs)


def requests_get(*args, **kwargs):
    """Wrap request get method.

    Disabling SSL certificate validation if ``DONT_VERIFY_SSL`` environment
    variable is found. This is useful when Deadline or Muster server are
    running with self-signed certificates and their certificate is not
    added to trusted certificates on client machines.

    Warning:
        Disabling SSL certificate validation is defeating one line
        of defense SSL is providing and it is not recommended.

    """
    if "verify" not in kwargs:
        kwargs["verify"] = not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)
    return requests.get(*args, **kwargs)
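Note: `os.getenv` returns a string when the variable is set (and the non-string default `True` here when it is not), so `not os.getenv(...)` turns any non-empty value into `verify=False`. If an explicit boolean parse were wanted instead, a sketch could look like this; `_env_flag` is a hypothetical helper, not part of `openpype.lib`:

    import os

    def _env_flag(name, default=False):
        # Hypothetical helper: interpret common truthy strings explicitly
        # instead of relying on plain string truthiness.
        value = os.getenv(name)
        if value is None:
            return default
        return value.strip().lower() in ("1", "true", "yes", "on")

    verify = not _env_flag("OPENPYPE_DONT_VERIFY_SSL")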
@@ -44,12 +44,6 @@ def _profile_exclusion(matching_profiles, logger):
    Returns:
        dict: Most matching profile.
    """
    logger.info(
        "Search for first most matching profile in match order:"
        " Host name -> Task name -> Family."
    )

    if not matching_profiles:
        return None

@@ -168,6 +162,15 @@ def filter_profiles(profiles_data, key_values, keys_order=None, logger=None):
            _keys_order.append(key)
    keys_order = tuple(_keys_order)

    log_parts = " | ".join([
        "{}: \"{}\"".format(*item)
        for item in key_values.items()
    ])

    logger.info(
        "Looking for matching profile for: {}".format(log_parts)
    )

    matching_profiles = None
    highest_profile_points = -1
    # Each profile get 1 point for each matching filter. Profile with most

@@ -205,11 +208,6 @@ def filter_profiles(profiles_data, key_values, keys_order=None, logger=None):
        if profile_points == highest_profile_points:
            matching_profiles.append((profile, profile_scores))

    log_parts = " | ".join([
        "{}: \"{}\"".format(*item)
        for item in key_values.items()
    ])

    if not matching_profiles:
        logger.info(
            "None of profiles match your setup. {}".format(log_parts)

@@ -221,4 +219,9 @@ def filter_profiles(profiles_data, key_values, keys_order=None, logger=None):
            "More than one profile match your setup. {}".format(log_parts)
        )

    return _profile_exclusion(matching_profiles, logger)
    profile = _profile_exclusion(matching_profiles, logger)
    if profile:
        logger.info(
            "Profile selected: {}".format(profile)
        )
    return profile
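Note: a hypothetical call illustrating the logging added above; the profile dictionaries and key names are example data, and the import path is assumed to be the module this hunk edits:

    from openpype.lib.profiles_filtering import filter_profiles

    profiles = [
        {"hosts": ["maya"], "task_types": ["Modeling"]},
        {"hosts": ["maya"], "task_types": []},
    ]
    key_values = {"hosts": "maya", "task_types": "Modeling"}

    profile = filter_profiles(profiles, key_values)
    # Log now shows the lookup:
    #   Looking for matching profile for: hosts: "maya" | task_types: "Modeling"
    # and, when something matched, the final pick:
    #   Profile selected: {'hosts': ['maya'], 'task_types': ['Modeling']}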
@@ -1,13 +1,12 @@
import os
from datetime import datetime
import sys
from bson.objectid import ObjectId
import collections

from bson.objectid import ObjectId

import pyblish.util
import pyblish.api

from openpype import uninstall
from openpype.lib.mongo import OpenPypeMongoConnection
from openpype.lib.plugin_tools import parse_json

@@ -81,7 +80,6 @@ def publish(log, close_plugin_name=None):

    if result["error"]:
        log.error(error_format.format(**result))
        uninstall()
        if close_plugin:  # close host app explicitly after error
            context = pyblish.api.Context()
            close_plugin().process(context)

@@ -118,7 +116,6 @@ def publish_and_log(dbcon, _id, log, close_plugin_name=None, batch_id=None):

    if result["error"]:
        log.error(error_format.format(**result))
        uninstall()
        log_lines = [error_format.format(**result)] + log_lines
        dbcon.update_one(
            {"_id": _id},
@@ -17,6 +17,9 @@ from .vendor_bin_utils import (

# Max length of string that is supported by ffmpeg
MAX_FFMPEG_STRING_LEN = 8196
# Not allowed symbols in attributes for ffmpeg
NOT_ALLOWED_FFMPEG_CHARS = ("\"", )

# OIIO known xml tags
STRING_TAGS = {
    "format"

@@ -367,11 +370,15 @@ def should_convert_for_ffmpeg(src_filepath):
        return None

    for attr_value in input_info["attribs"].values():
        if (
            isinstance(attr_value, str)
            and len(attr_value) > MAX_FFMPEG_STRING_LEN
        ):
        if not isinstance(attr_value, str):
            continue

        if len(attr_value) > MAX_FFMPEG_STRING_LEN:
            return True

        for char in NOT_ALLOWED_FFMPEG_CHARS:
            if char in attr_value:
                return True
    return False
@@ -422,7 +429,12 @@ def convert_for_ffmpeg(
        compression = "none"

    # Prepare subprocess arguments
    oiio_cmd = [get_oiio_tools_path()]
    oiio_cmd = [
        get_oiio_tools_path(),

        # Don't add any additional attributes
        "--nosoftwareattrib",
    ]
    # Add input compression if available
    if compression:
        oiio_cmd.extend(["--compression", compression])

@@ -458,23 +470,33 @@ def convert_for_ffmpeg(
        "--frames", "{}-{}".format(input_frame_start, input_frame_end)
    ])

    ignore_attr_changes_added = False
    for attr_name, attr_value in input_info["attribs"].items():
        if not isinstance(attr_value, str):
            continue

        # Remove attributes that have string value longer than allowed length
        # for ffmpeg
        # for ffmpeg or when containt unallowed symbols
        erase_reason = "Missing reason"
        erase_attribute = False
        if len(attr_value) > MAX_FFMPEG_STRING_LEN:
            if not ignore_attr_changes_added:
                # Attrite changes won't be added to attributes itself
                ignore_attr_changes_added = True
                oiio_cmd.append("--sansattrib")
            erase_reason = "has too long value ({} chars).".format(
                len(attr_value)
            )

        if erase_attribute:
            for char in NOT_ALLOWED_FFMPEG_CHARS:
                if char in attr_value:
                    erase_attribute = True
                    erase_reason = (
                        "contains unsupported character \"{}\"."
                    ).format(char)
                    break

        if erase_attribute:
            # Set attribute to empty string
            logger.info((
                "Removed attribute \"{}\" from metadata"
                " because has too long value ({} chars)."
            ).format(attr_name, len(attr_value)))
                "Removed attribute \"{}\" from metadata because {}."
            ).format(attr_name, erase_reason))
            oiio_cmd.extend(["--eraseattrib", attr_name])

    # Add last argument - path to output
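Note: the attribute loop above accumulates plain oiiotool arguments. A dry-run sketch with made-up attribute values shows what ends up in the command; it is simplified in one respect: here `--sansattrib` is injected for either erase reason, while the hunk above only adds it in the too-long branch:

    MAX_FFMPEG_STRING_LEN = 8196
    NOT_ALLOWED_FFMPEG_CHARS = ("\"",)

    attribs = {
        "comment": "x" * 9000,    # too long for ffmpeg
        "title": 'say "hi"',      # contains a forbidden quote
        "colorspace": "ACEScg",   # fine, left untouched
    }

    oiio_cmd = []
    sansattrib_added = False
    for name, value in attribs.items():
        too_long = len(value) > MAX_FFMPEG_STRING_LEN
        bad_char = any(char in value for char in NOT_ALLOWED_FFMPEG_CHARS)
        if not (too_long or bad_char):
            continue
        if not sansattrib_added:
            sansattrib_added = True
            oiio_cmd.append("--sansattrib")
        oiio_cmd.extend(["--eraseattrib", name])

    print(oiio_cmd)
    # ['--sansattrib', '--eraseattrib', 'comment', '--eraseattrib', 'title']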
@@ -9,6 +9,7 @@ except ImportError:
    from mvpxr import Usd, UsdGeom, Sdf, Kind

from avalon import io, api
from openpype.pipeline import registered_root

log = logging.getLogger(__name__)

@@ -323,7 +324,7 @@ def get_usd_master_path(asset, subset, representation):

    path = template.format(
        **{
            "root": api.registered_root(),
            "root": registered_root(),
            "project": api.Session["AVALON_PROJECT"],
            "asset": asset_doc["name"],
            "subset": subset,
@@ -1,5 +1,5 @@
import os
import openpype

from openpype.modules import OpenPypeModule
from openpype_interfaces import ITrayModule

@@ -26,7 +26,8 @@ class AvalonModule(OpenPypeModule, ITrayModule):
        self.avalon_mongo_timeout = avalon_mongo_timeout

        # Tray attributes
        self.libraryloader = None
        self._library_loader_imported = None
        self._library_loader_window = None
        self.rest_api_obj = None

    def get_global_environments(self):

@@ -41,21 +42,11 @@ class AvalonModule(OpenPypeModule, ITrayModule):

    def tray_init(self):
        # Add library tool
        self._library_loader_imported = False
        try:
            from Qt import QtCore
            from openpype.tools.libraryloader import LibraryLoaderWindow

            libraryloader = LibraryLoaderWindow(
                show_projects=True,
                show_libraries=True
            )
            # Remove always on top flag for tray
            window_flags = libraryloader.windowFlags()
            if window_flags | QtCore.Qt.WindowStaysOnTopHint:
                window_flags ^= QtCore.Qt.WindowStaysOnTopHint
            libraryloader.setWindowFlags(window_flags)
            self.libraryloader = libraryloader

            self._library_loader_imported = True
        except Exception:
            self.log.warning(
                "Couldn't load Library loader tool for tray.",

@@ -64,7 +55,7 @@ class AvalonModule(OpenPypeModule, ITrayModule):

    # Definition of Tray menu
    def tray_menu(self, tray_menu):
        if self.libraryloader is None:
        if not self._library_loader_imported:
            return

        from Qt import QtWidgets

@@ -84,17 +75,31 @@ class AvalonModule(OpenPypeModule, ITrayModule):
            return

    def show_library_loader(self):
        if self.libraryloader is None:
            return
        if self._library_loader_window is None:
            from Qt import QtCore
            from openpype.tools.libraryloader import LibraryLoaderWindow
            from openpype.pipeline import install_openpype_plugins

        self.libraryloader.show()
            libraryloader = LibraryLoaderWindow(
                show_projects=True,
                show_libraries=True
            )
            # Remove always on top flag for tray
            window_flags = libraryloader.windowFlags()
            if window_flags | QtCore.Qt.WindowStaysOnTopHint:
                window_flags ^= QtCore.Qt.WindowStaysOnTopHint
            libraryloader.setWindowFlags(window_flags)
            self._library_loader_window = libraryloader

            install_openpype_plugins()

        self._library_loader_window.show()

        # Raise and activate the window
        # for MacOS
        self.libraryloader.raise_()
        self._library_loader_window.raise_()
        # for Windows
        self.libraryloader.activateWindow()
        self.libraryloader.refresh()
        self._library_loader_window.activateWindow()

    # Webserver module implementation
    def webserver_initialization(self, server_manager):
@@ -1,12 +1,14 @@
from avalon import api, io
from avalon import io

from openpype.api import Logger
from openpype.pipeline import LauncherAction
from openpype_modules.clockify.clockify_api import ClockifyAPI


log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)


class ClockifyStart(api.Action):
class ClockifyStart(LauncherAction):

    name = "clockify_start_timer"
    label = "Clockify - Start Timer"

@@ -1,10 +1,13 @@
from avalon import api, io
from avalon import io

from openpype_modules.clockify.clockify_api import ClockifyAPI
from openpype.api import Logger
log = Logger().get_logger(__name__)
from openpype.pipeline import LauncherAction

log = Logger.get_logger(__name__)


class ClockifySync(api.Action):
class ClockifySync(LauncherAction):

    name = "sync_to_clockify"
    label = "Sync to Clockify"
@@ -1,8 +1,19 @@
import os
import requests
import six
import sys

from openpype.lib import requests_get, PypeLogger
from openpype.modules import OpenPypeModule
from openpype_interfaces import IPluginPaths


class DeadlineWebserviceError(Exception):
    """
    Exception to throw when connection to Deadline server fails.
    """


class DeadlineModule(OpenPypeModule, IPluginPaths):
    name = "deadline"

@@ -32,3 +43,35 @@ class DeadlineModule(OpenPypeModule, IPluginPaths):
        return {
            "publish": [os.path.join(current_dir, "plugins", "publish")]
        }

    @staticmethod
    def get_deadline_pools(webservice, log=None):
        # type: (str) -> list
        """Get pools from Deadline.
        Args:
            webservice (str): Server url.
            log (Logger)
        Returns:
            list: Pools.
        Throws:
            RuntimeError: If deadline webservice is unreachable.

        """
        if not log:
            log = PypeLogger.get_logger(__name__)

        argument = "{}/api/pools?NamesOnly=true".format(webservice)
        try:
            response = requests_get(argument)
        except requests.exceptions.ConnectionError as exc:
            msg = 'Cannot connect to DL web service {}'.format(webservice)
            log.error(msg)
            six.reraise(
                DeadlineWebserviceError,
                DeadlineWebserviceError('{} - {}'.format(msg, exc)),
                sys.exc_info()[2])
        if not response.ok:
            log.warning("No pools retrieved")
            return []

        return response.json()
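Note: example use of the new helper; the webservice URL and pool name are placeholders. The import path matches the one used by the validator added later in this changeset:

    from openpype.modules.deadline.deadline_module import DeadlineModule

    pools = DeadlineModule.get_deadline_pools("http://localhost:8082")
    if "my_pool" not in pools:
        print("'my_pool' is not configured on this Deadline repository")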
@@ -11,7 +11,7 @@ import pyblish.api
class CollectDeadlineServerFromInstance(pyblish.api.InstancePlugin):
    """Collect Deadline Webservice URL from instance."""

    order = pyblish.api.CollectorOrder + 0.02
    order = pyblish.api.CollectorOrder + 0.415
    label = "Deadline Webservice from the Instance"
    families = ["rendering"]

@@ -6,7 +6,7 @@ import pyblish.api
class CollectDefaultDeadlineServer(pyblish.api.ContextPlugin):
    """Collect default Deadline Webservice URL."""

    order = pyblish.api.CollectorOrder + 0.01
    order = pyblish.api.CollectorOrder + 0.410
    label = "Default Deadline Webservice"

    pass_mongo_url = False
23
openpype/modules/deadline/plugins/publish/collect_pools.py
Normal file
@@ -0,0 +1,23 @@
# -*- coding: utf-8 -*-
"""Collect Deadline pools. Choose default one from Settings

"""
import pyblish.api


class CollectDeadlinePools(pyblish.api.InstancePlugin):
    """Collect pools from instance if present, from Setting otherwise."""

    order = pyblish.api.CollectorOrder + 0.420
    label = "Collect Deadline Pools"
    families = ["rendering", "render.farm", "renderFarm", "renderlayer"]

    primary_pool = None
    secondary_pool = None

    def process(self, instance):
        if not instance.data.get("primaryPool"):
            instance.data["primaryPool"] = self.primary_pool or "none"

        if not instance.data.get("secondaryPool"):
            instance.data["secondaryPool"] = self.secondary_pool or "none"
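Note: the precedence in `process` is: a pool already collected on the instance wins, otherwise the Settings-driven plugin attribute is used, with a final fallback of "none". A small illustration with a stand-in instance object (illustrative only, not a real pyblish fixture):

    class FakeInstance(object):
        def __init__(self, data):
            self.data = data

    plugin = CollectDeadlinePools()
    plugin.primary_pool = "farm_a"           # what Settings would inject

    inst = FakeInstance({"primaryPool": "farm_b"})
    plugin.process(inst)
    print(inst.data["primaryPool"])          # farm_b - instance value kept

    inst2 = FakeInstance({})
    plugin.process(inst2)
    print(inst2.data["primaryPool"])         # farm_a - Settings default used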
@@ -0,0 +1,31 @@
<?xml version="1.0" encoding="UTF-8"?>
<root>
    <error id="main">
        <title>Scene setting</title>
        <description>
## Invalid Deadline pools found

Configured pools don't match what is set in Deadline.

{invalid_value_str}

### How to repair?

If your instance had deadline pools set on creation, remove or
change them.

In other cases inform admin to change them in Settings.

Available deadline pools {pools_str}.
        </description>
        <detail>
### __Detailed Info__

This error is shown when deadline pool is not on Deadline anymore. It
could happen in case of republish old workfile which was created with
previous deadline pools,
or someone changed pools on Deadline side, but didn't modify Openpype
Settings.
        </detail>
    </error>
</root>
@@ -37,8 +37,6 @@ class AfterEffectsSubmitDeadline(

    priority = 50
    chunk_size = 1000000
    primary_pool = None
    secondary_pool = None
    group = None
    department = None
    multiprocess = True

@@ -62,8 +60,8 @@ class AfterEffectsSubmitDeadline(
        dln_job_info.Frames = frame_range

        dln_job_info.Priority = self.priority
        dln_job_info.Pool = self.primary_pool
        dln_job_info.SecondaryPool = self.secondary_pool
        dln_job_info.Pool = self._instance.data.get("primaryPool")
        dln_job_info.SecondaryPool = self._instance.data.get("secondaryPool")
        dln_job_info.Group = self.group
        dln_job_info.Department = self.department
        dln_job_info.ChunkSize = self.chunk_size

@@ -241,8 +241,6 @@ class HarmonySubmitDeadline(

    optional = True
    use_published = False
    primary_pool = ""
    secondary_pool = ""
    priority = 50
    chunk_size = 1000000
    group = "none"

@@ -259,8 +257,8 @@ class HarmonySubmitDeadline(
        # for now, get those from presets. Later on it should be
        # configurable in Harmony UI directly.
        job_info.Priority = self.priority
        job_info.Pool = self.primary_pool
        job_info.SecondaryPool = self.secondary_pool
        job_info.Pool = self._instance.data.get("primaryPool")
        job_info.SecondaryPool = self._instance.data.get("secondaryPool")
        job_info.ChunkSize = self.chunk_size
        job_info.BatchName = os.path.basename(self._instance.data["source"])
        job_info.Department = self.department
@@ -7,7 +7,7 @@ from avalon import api

import pyblish.api

import hou
# import hou ???


class HoudiniSubmitRenderDeadline(pyblish.api.InstancePlugin):

@@ -71,7 +71,8 @@ class HoudiniSubmitRenderDeadline(pyblish.api.InstancePlugin):
            "UserName": deadline_user,

            "Plugin": "Houdini",
            "Pool": "houdini_redshift",  # todo: remove hardcoded pool
            "Pool": instance.data.get("primaryPool"),
            "secondaryPool": instance.data.get("secondaryPool"),
            "Frames": frames,

            "ChunkSize": instance.data.get("chunkSize", 10),
@@ -35,6 +35,7 @@ from maya import cmds
from avalon import api
import pyblish.api

from openpype.lib import requests_post
from openpype.hosts.maya.api import lib

# Documentation for keys available at:

@@ -700,7 +701,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        tiles_count = instance.data.get("tilesX") * instance.data.get("tilesY")  # noqa: E501

        for tile_job in frame_payloads:
            response = self._requests_post(url, json=tile_job)
            response = requests_post(url, json=tile_job)
            if not response.ok:
                raise Exception(response.text)

@@ -763,7 +764,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
                job_idx, len(assembly_payloads)
            ))
            self.log.debug(json.dumps(ass_job, indent=4, sort_keys=True))
            response = self._requests_post(url, json=ass_job)
            response = requests_post(url, json=ass_job)
            if not response.ok:
                raise Exception(response.text)

@@ -781,7 +782,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):

        # E.g. http://192.168.0.1:8082/api/jobs
        url = "{}/api/jobs".format(self.deadline_url)
        response = self._requests_post(url, json=payload)
        response = requests_post(url, json=payload)
        if not response.ok:
            raise Exception(response.text)
        instance.data["deadlineSubmissionJob"] = response.json()

@@ -989,7 +990,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
        self.log.info("Submitting ass export job.")

        url = "{}/api/jobs".format(self.deadline_url)
        response = self._requests_post(url, json=payload)
        response = requests_post(url, json=payload)
        if not response.ok:
            self.log.error("Submition failed!")
            self.log.error(response.status_code)

@@ -1013,44 +1014,6 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
                % (value, int(value))
            )

    def _requests_post(self, *args, **kwargs):
        """Wrap request post method.

        Disabling SSL certificate validation if ``DONT_VERIFY_SSL`` environment
        variable is found. This is useful when Deadline or Muster server are
        running with self-signed certificates and their certificate is not
        added to trusted certificates on client machines.

        Warning:
            Disabling SSL certificate validation is defeating one line
            of defense SSL is providing and it is not recommended.

        """
        if 'verify' not in kwargs:
            kwargs['verify'] = not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)
        # add 10sec timeout before bailing out
        kwargs['timeout'] = 10
        return requests.post(*args, **kwargs)

    def _requests_get(self, *args, **kwargs):
        """Wrap request get method.

        Disabling SSL certificate validation if ``DONT_VERIFY_SSL`` environment
        variable is found. This is useful when Deadline or Muster server are
        running with self-signed certificates and their certificate is not
        added to trusted certificates on client machines.

        Warning:
            Disabling SSL certificate validation is defeating one line
            of defense SSL is providing and it is not recommended.

        """
        if 'verify' not in kwargs:
            kwargs['verify'] = not os.getenv("OPENPYPE_DONT_VERIFY_SSL", True)
        # add 10sec timeout before bailing out
        kwargs['timeout'] = 10
        return requests.get(*args, **kwargs)

    def format_vray_output_filename(self, filename, template, dir=False):
        """Format the expected output file of the Export job.
@@ -28,8 +28,6 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
    priority = 50
    chunk_size = 1
    concurrent_tasks = 1
    primary_pool = ""
    secondary_pool = ""
    group = ""
    department = ""
    limit_groups = {}

@@ -187,8 +185,8 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):

            "Department": self.department,

            "Pool": self.primary_pool,
            "SecondaryPool": self.secondary_pool,
            "Pool": instance.data.get("primaryPool"),
            "SecondaryPool": instance.data.get("secondaryPool"),
            "Group": self.group,

            "Plugin": "Nuke",
@@ -8,6 +8,7 @@ from copy import copy, deepcopy
import requests
import clique
import openpype.api
from openpype.pipeline.farm.patterning import match_aov_pattern

from avalon import api, io

@@ -107,7 +108,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
    families = ["render.farm", "prerender.farm",
                "renderlayer", "imagesequence", "vrayscene"]

    aov_filter = {"maya": [r".*(?:[\._-])*([Bb]eauty)(?:[\.|_])*.*"],
    aov_filter = {"maya": [r".*([Bb]eauty).*"],
                  "aftereffects": [r".*"],  # for everything from AE
                  "harmony": [r".*"],  # for everything from AE
                  "celaction": [r".*"]}
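Note: quick comparison of the old and new Maya AOV patterns on a sample file name (the name is made up); for typical beauty renders both match, the new one is just easier to read:

    import re

    old = r".*(?:[\._-])*([Bb]eauty)(?:[\.|_])*.*"
    new = r".*([Bb]eauty).*"

    name = "render_beauty.1001.exr"
    print(bool(re.match(old, name)), bool(re.match(new, name)))  # True True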
@@ -129,7 +130,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        "OPENPYPE_PUBLISH_JOB"
    ]

    # custom deadline atributes
    # custom deadline attributes
    deadline_department = ""
    deadline_pool = ""
    deadline_pool_secondary = ""

@@ -259,8 +260,8 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
                "Priority": priority,

                "Group": self.deadline_group,
                "Pool": self.deadline_pool,
                "SecondaryPool": self.deadline_pool_secondary,
                "Pool": instance.data.get("primaryPool"),
                "SecondaryPool": instance.data.get("secondaryPool"),

                "OutputDirectory0": output_dir
            },

@@ -449,12 +450,15 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        app = os.environ.get("AVALON_APP", "")

        preview = False
        if app in self.aov_filter.keys():
            for aov_pattern in self.aov_filter[app]:
                if re.match(aov_pattern, aov):
                    preview = True
                    break

        if isinstance(col, list):
            render_file_name = os.path.basename(col[0])
        else:
            render_file_name = os.path.basename(col)
        aov_patterns = self.aov_filter
        preview = match_aov_pattern(app, aov_patterns, render_file_name)

        # toggle preview on if multipart is on
        if instance_data.get("multipartExr"):
            preview = True

@@ -520,6 +524,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):

        """
        representations = []
        host_name = os.environ.get("AVALON_APP", "")
        collections, remainders = clique.assemble(exp_files)

        # create representation for every collected sequence

@@ -532,22 +537,16 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
            # should be review made.
            # - "review" tag is never added when is set to 'False'
            if instance["useSequenceForReview"]:
                # if filtered aov name is found in filename, toggle it for
                # preview video rendering
                for app in self.aov_filter.keys():
                    if os.environ.get("AVALON_APP", "") == app:
                        # iteratre all aov filters
                        for aov in self.aov_filter[app]:
                            if re.match(
                                aov,
                                list(collection)[0]
                            ):
                                preview = True
                                break

                # toggle preview on if multipart is on
                if instance.get("multipartExr", False):
                    preview = True
                else:
                    render_file_name = list(collection)[0]
                    # if filtered aov name is found in filename, toggle it for
                    # preview video rendering
                    preview = match_aov_pattern(
                        host_name, self.aov_filter, render_file_name
                    )

            staging = os.path.dirname(list(collection)[0])
            success, rootless_staging_dir = (

@@ -611,12 +610,16 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
                "files": os.path.basename(remainder),
                "stagingDir": os.path.dirname(remainder),
            }
            if "render" in instance.get("families"):

            preview = match_aov_pattern(
                host_name, self.aov_filter, remainder
            )
            if preview:
                rep.update({
                    "fps": instance.get("fps"),
                    "tags": ["review"]
                })
                self._solve_families(instance, True)
            self._solve_families(instance, preview)

            already_there = False
            for repre in instance.get("representations", []):
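Note: `match_aov_pattern` itself is not part of this diff; from the call sites above its assumed behavior is "does any pattern registered for this host match the file name". A reconstructed sketch, which may differ from the real `openpype.pipeline.farm.patterning` implementation in details:

    import re

    def match_aov_pattern(host_name, aov_patterns, file_name):
        # Assumed reconstruction from the call sites above.
        for pattern in aov_patterns.get(host_name, []):
            if re.match(pattern, file_name):
                return True
        return False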
@@ -0,0 +1,48 @@
import pyblish.api

from openpype.pipeline import (
    PublishXmlValidationError,
    OptionalPyblishPluginMixin
)
from openpype.modules.deadline.deadline_module import DeadlineModule


class ValidateDeadlinePools(OptionalPyblishPluginMixin,
                            pyblish.api.InstancePlugin):
    """Validate primaryPool and secondaryPool on instance.

    Values are on instance based on value insertion when Creating instance or
    by Settings in CollectDeadlinePools.
    """

    label = "Validate Deadline Pools"
    order = pyblish.api.ValidatorOrder
    families = ["rendering", "render.farm", "renderFarm", "renderlayer"]
    optional = True

    def process(self, instance):
        # get default deadline webservice url from deadline module
        deadline_url = instance.context.data["defaultDeadline"]
        self.log.info("deadline_url::{}".format(deadline_url))
        pools = DeadlineModule.get_deadline_pools(deadline_url, log=self.log)
        self.log.info("pools::{}".format(pools))

        formatting_data = {
            "pools_str": ",".join(pools)
        }

        primary_pool = instance.data.get("primaryPool")
        if primary_pool and primary_pool not in pools:
            msg = "Configured primary '{}' not present on Deadline".format(
                instance.data["primaryPool"])
            formatting_data["invalid_value_str"] = msg
            raise PublishXmlValidationError(self, msg,
                                            formatting_data=formatting_data)

        secondary_pool = instance.data.get("secondaryPool")
        if secondary_pool and secondary_pool not in pools:
            msg = "Configured secondary '{}' not present on Deadline".format(
                instance.data["secondaryPool"])
            formatting_data["invalid_value_str"] = msg
            raise PublishXmlValidationError(self, msg,
                                            formatting_data=formatting_data)
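Note: the `formatting_data` keys map onto the placeholders in the `validate_deadline_pools` help XML added earlier in this changeset; example values are illustrative:

    formatting_data = {
        "pools_str": "local,cloud",    # fills {pools_str}
        "invalid_value_str": (
            "Configured primary 'gpu' not present on Deadline"
        ),                             # fills {invalid_value_str}
    }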
@@ -135,7 +135,7 @@ def query_custom_attributes(
        output.extend(
            session.query(
                (
                    "select value, entity_id from {}"
                    "select value, entity_id, configuration_id from {}"
                    " where entity_id in ({}) and configuration_id in ({})"
                ).format(
                    table_name,
@@ -0,0 +1,148 @@
"""
Requires:
    context > ftrackSession
    context > ftrackEntity
    instance > ftrackEntity

Provides:
    instance > customData > ftrack
"""
import copy

import pyblish.api


class CollectFtrackCustomAttributeData(pyblish.api.ContextPlugin):
    """Collect custom attribute values and store them to customData.

    Data are stored into each instance in context under
    instance.data["customData"]["ftrack"].

    Hierarchical attributes are not looked up properly for that functionality
    custom attribute values lookup must be extended.
    """

    order = pyblish.api.CollectorOrder + 0.4992
    label = "Collect Ftrack Custom Attribute Data"

    # Name of custom attributes for which will be look for
    custom_attribute_keys = []

    def process(self, context):
        if not self.custom_attribute_keys:
            self.log.info("Custom attribute keys are not set. Skipping")
            return

        ftrack_entities_by_id = {}
        default_entity_id = None

        context_entity = context.data.get("ftrackEntity")
        if context_entity:
            entity_id = context_entity["id"]
            default_entity_id = entity_id
            ftrack_entities_by_id[entity_id] = context_entity

        instances_by_entity_id = {
            default_entity_id: []
        }
        for instance in context:
            entity = instance.data.get("ftrackEntity")
            if not entity:
                instances_by_entity_id[default_entity_id].append(instance)
                continue

            entity_id = entity["id"]
            ftrack_entities_by_id[entity_id] = entity
            if entity_id not in instances_by_entity_id:
                instances_by_entity_id[entity_id] = []
            instances_by_entity_id[entity_id].append(instance)

        if not ftrack_entities_by_id:
            self.log.info("Ftrack entities are not set. Skipping")
            return

        session = context.data["ftrackSession"]
        custom_attr_key_by_id = self.query_attr_confs(session)
        if not custom_attr_key_by_id:
            self.log.info((
                "Didn't find any of defined custom attributes {}"
            ).format(", ".join(self.custom_attribute_keys)))
            return

        entity_ids = list(instances_by_entity_id.keys())
        values_by_entity_id = self.query_attr_values(
            session, entity_ids, custom_attr_key_by_id
        )

        for entity_id, instances in instances_by_entity_id.items():
            if entity_id not in values_by_entity_id:
                # Use defaut empty values
                entity_id = None

            for instance in instances:
                value = copy.deepcopy(values_by_entity_id[entity_id])
                if "customData" not in instance.data:
                    instance.data["customData"] = {}
                instance.data["customData"]["ftrack"] = value
                instance_label = (
                    instance.data.get("label") or instance.data["name"]
                )
                self.log.debug((
                    "Added ftrack custom data to instance \"{}\": {}"
                ).format(instance_label, value))

    def query_attr_values(self, session, entity_ids, custom_attr_key_by_id):
        # Prepare values for query
        entity_ids_joined = ",".join([
            '"{}"'.format(entity_id)
            for entity_id in entity_ids
        ])
        conf_ids_joined = ",".join([
            '"{}"'.format(conf_id)
            for conf_id in custom_attr_key_by_id.keys()
        ])
        # Query custom attribute values
        value_items = session.query(
            (
                "select value, entity_id, configuration_id"
                " from CustomAttributeValue"
                " where entity_id in ({}) and configuration_id in ({})"
            ).format(
                entity_ids_joined,
                conf_ids_joined
            )
        ).all()

        # Prepare default value output per entity id
        values_by_key = {
            key: None for key in self.custom_attribute_keys
        }
        # Prepare all entity ids that were queried
        values_by_entity_id = {
            entity_id: copy.deepcopy(values_by_key)
            for entity_id in entity_ids
        }
        # Add none entity id which is used as default value
        values_by_entity_id[None] = copy.deepcopy(values_by_key)
        # Go through queried data and store them
        for item in value_items:
            conf_id = item["configuration_id"]
            conf_key = custom_attr_key_by_id[conf_id]
            entity_id = item["entity_id"]
            values_by_entity_id[entity_id][conf_key] = item["value"]
        return values_by_entity_id

    def query_attr_confs(self, session):
        custom_attributes = set(self.custom_attribute_keys)
        cust_attrs_query = (
            "select id, key from CustomAttributeConfiguration"
            " where key in ({})"
        ).format(", ".join(
            ["\"{}\"".format(attr_name) for attr_name in custom_attributes]
        ))

        custom_attr_confs = session.query(cust_attrs_query).all()
        return {
            conf["id"]: conf["key"]
            for conf in custom_attr_confs
        }
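Note: with, say, `custom_attribute_keys = ["fps", "resolutionWidth"]` (example keys), every instance ends up carrying a dict of this shape, with `None` for attributes that had no value on the queried entity:

    example_instance_data = {
        "customData": {
            "ftrack": {
                "fps": 25,                # value found on the entity
                "resolutionWidth": None,  # configured key without a value
            }
        }
    }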
@@ -6,7 +6,7 @@ import avalon.api
class CollectFtrackApi(pyblish.api.ContextPlugin):
    """ Collects an ftrack session and the current task id. """

    order = pyblish.api.CollectorOrder + 0.4999
    order = pyblish.api.CollectorOrder + 0.4991
    label = "Collect Ftrack Api"

    def process(self, context):
@@ -25,7 +25,7 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):
        based on 'families' (editorial drives it by presence of 'review')
    """
    label = "Collect Ftrack Family"
    order = pyblish.api.CollectorOrder + 0.4998
    order = pyblish.api.CollectorOrder + 0.4990

    profiles = None

@@ -34,6 +34,7 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):
            self.log.warning("No profiles present for adding Ftrack family")
            return

        add_ftrack_family = False
        task_name = instance.data.get("task",
                                      avalon.api.Session["AVALON_TASK"])
        host_name = avalon.api.Session["AVALON_APP"]

@@ -53,6 +54,8 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):

            additional_filters = profile.get("advanced_filtering")
            if additional_filters:
                self.log.info("'{}' families used for additional filtering".
                              format(families))
                add_ftrack_family = self._get_add_ftrack_f_from_addit_filters(
                    additional_filters,
                    families,

@@ -69,6 +72,13 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):
            else:
                instance.data["families"] = ["ftrack"]

        result_str = "Adding"
        if not add_ftrack_family:
            result_str = "Not adding"
        self.log.info("{} 'ftrack' family for instance with '{}'".format(
            result_str, family
        ))

    def _get_add_ftrack_f_from_addit_filters(self,
                                             additional_filters,
                                             families,
@@ -263,7 +263,9 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
        self.log.info("Creating asset types with short names: {}".format(
            ", ".join(asset_type_names_by_missing_shorts.keys())
        ))
        for missing_short, type_name in asset_type_names_by_missing_shorts:
        for missing_short, type_name in (
            asset_type_names_by_missing_shorts.items()
        ):
            # Use short for name if name is not defined
            if not type_name:
                type_name = missing_short
@@ -17,6 +17,7 @@ class DropboxHandler(AbstractProvider):
        self.active = False
        self.site_name = site_name
        self.presets = presets
        self.dbx = None

        if not self.presets:
            log.info(

@@ -24,6 +25,11 @@ class DropboxHandler(AbstractProvider):
            )
            return

        if not self.presets["enabled"]:
            log.debug("Sync Server: Site {} not enabled for {}.".
                      format(site_name, project_name))
            return

        token = self.presets.get("token", "")
        if not token:
            msg = "Sync Server: No access token for dropbox provider"

@@ -44,16 +50,13 @@ class DropboxHandler(AbstractProvider):
            log.info(msg)
            return

        self.dbx = None

        if self.presets["enabled"]:
            try:
                self.dbx = self._get_service(
                    token, acting_as_member, team_folder_name
                )
            except Exception as e:
                log.info("Could not establish dropbox object: {}".format(e))
                return
        try:
            self.dbx = self._get_service(
                token, acting_as_member, team_folder_name
            )
        except Exception as e:
            log.info("Could not establish dropbox object: {}".format(e))
            return

        super(AbstractProvider, self).__init__()
@@ -73,6 +73,11 @@ class GDriveHandler(AbstractProvider):
                     format(site_name))
            return

        if not self.presets["enabled"]:
            log.debug("Sync Server: Site {} not enabled for {}.".
                      format(site_name, project_name))
            return

        current_platform = platform.system().lower()
        cred_path = self.presets.get("credentials_url", {}). \
            get(current_platform) or ''

@@ -97,11 +102,10 @@ class GDriveHandler(AbstractProvider):
            return

        self.service = None
        if self.presets["enabled"]:
            self.service = self._get_gd_service(cred_path)
        self.service = self._get_gd_service(cred_path)

            self._tree = tree
            self.active = True
        self._tree = tree
        self.active = True

    def is_active(self):
        """
@@ -69,6 +69,22 @@ from .actions import (
    deregister_inventory_action_path,
)

from .context_tools import (
    install_openpype_plugins,
    install_host,
    uninstall_host,
    is_installed,

    register_root,
    registered_root,

    register_host,
    registered_host,
    deregister_host,
)
install = install_host
uninstall = uninstall_host


__all__ = (
    "AVALON_CONTAINER_ID",

@@ -137,4 +153,21 @@ __all__ = (
    "register_inventory_action_path",
    "deregister_inventory_action",
    "deregister_inventory_action_path",

    # --- Process context ---
    "install_openpype_plugins",
    "install_host",
    "uninstall_host",
    "is_installed",

    "register_root",
    "registered_root",

    "register_host",
    "registered_host",
    "deregister_host",

    # Backwards compatible function names
    "install",
    "uninstall",
)
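Note: with these exports in place, the typical host bootstrap is the pattern every launch script in this changeset migrates to:

    from openpype.pipeline import install_host
    import openpype.hosts.resolve as bmdvr

    install_host(bmdvr)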
335
openpype/pipeline/context_tools.py
Normal file
@ -0,0 +1,335 @@
|
|||
"""Core pipeline functionality"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import json
|
||||
import types
|
||||
import logging
|
||||
import inspect
|
||||
import platform
|
||||
|
||||
import pyblish.api
|
||||
from pyblish.lib import MessageHandler
|
||||
|
||||
from avalon import io, Session
|
||||
|
||||
import openpype
|
||||
from openpype.modules import load_modules
|
||||
from openpype.settings import get_project_settings
|
||||
from openpype.lib import (
|
||||
Anatomy,
|
||||
register_event_callback,
|
||||
filter_pyblish_plugins,
|
||||
change_timer_to_current_context,
|
||||
)
|
||||
|
||||
from . import (
|
||||
register_loader_plugin_path,
|
||||
register_inventory_action,
|
||||
register_creator_plugin_path,
|
||||
deregister_loader_plugin_path,
|
||||
)
|
||||
|
||||
|
||||
_is_installed = False
|
||||
_registered_root = {"_": ""}
|
||||
_registered_host = {"_": None}
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
PACKAGE_DIR = os.path.dirname(os.path.abspath(openpype.__file__))
|
||||
PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")
|
||||
|
||||
# Global plugin paths
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
|
||||
|
||||
def register_root(path):
|
||||
"""Register currently active root"""
|
||||
log.info("Registering root: %s" % path)
|
||||
_registered_root["_"] = path
|
||||
|
||||
|
||||
def registered_root():
|
||||
"""Return currently registered root"""
|
||||
root = _registered_root["_"]
|
||||
if root:
|
||||
return root
|
||||
|
||||
root = Session.get("AVALON_PROJECTS")
|
||||
if root:
|
||||
return os.path.normpath(root)
|
||||
return ""
|
||||
|
||||
|
||||
def install_host(host):
|
||||
"""Install `host` into the running Python session.
|
||||
|
||||
Args:
|
||||
host (module): A Python module containing the Avalon
|
||||
avalon host-interface.
|
||||
"""
|
||||
global _is_installed
|
||||
|
||||
_is_installed = True
|
||||
|
||||
io.install()
|
||||
|
||||
missing = list()
|
||||
for key in ("AVALON_PROJECT", "AVALON_ASSET"):
|
||||
if key not in Session:
|
||||
missing.append(key)
|
||||
|
||||
assert not missing, (
|
||||
"%s missing from environment, %s" % (
|
||||
", ".join(missing),
|
||||
json.dumps(Session, indent=4, sort_keys=True)
|
||||
))
|
||||
|
||||
project_name = Session["AVALON_PROJECT"]
|
||||
log.info("Activating %s.." % project_name)
|
||||
|
||||
# Optional host install function
|
||||
if hasattr(host, "install"):
|
||||
host.install()
|
||||
|
||||
register_host(host)
|
||||
|
||||
register_event_callback("taskChanged", _on_task_change)
|
||||
|
||||
def modified_emit(obj, record):
|
||||
"""Method replacing `emit` in Pyblish's MessageHandler."""
|
||||
record.msg = record.getMessage()
|
||||
obj.records.append(record)
|
||||
|
||||
MessageHandler.emit = modified_emit
|
||||
|
||||
install_openpype_plugins()
|
||||
|
||||
|
||||
def install_openpype_plugins(project_name=None):
    # Make sure modules are loaded
    load_modules()

    log.info("Registering global plug-ins..")
    pyblish.api.register_plugin_path(PUBLISH_PATH)
    pyblish.api.register_discovery_filter(filter_pyblish_plugins)
    register_loader_plugin_path(LOAD_PATH)

    if project_name is None:
        project_name = os.environ.get("AVALON_PROJECT")

    # Register studio specific plugins
    if project_name:
        anatomy = Anatomy(project_name)
        anatomy.set_root_environments()
        register_root(anatomy.roots)

        project_settings = get_project_settings(project_name)
        platform_name = platform.system().lower()
        project_plugins = (
            project_settings
            .get("global", {})
            .get("project_plugins", {})
            .get(platform_name)
        ) or []
        for path in project_plugins:
            try:
                path = str(path.format(**os.environ))
            except KeyError:
                pass

            if not path or not os.path.exists(path):
                continue

            pyblish.api.register_plugin_path(path)
            register_loader_plugin_path(path)
            register_creator_plugin_path(path)
            register_inventory_action(path)


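# --- Editor's note: an illustrative shape for the settings consumed by
# `install_openpype_plugins` above; the keys mirror its lookups, but the
# paths are invented. Placeholders expand via `str.format(**os.environ)`,
# so "{STUDIO_ROOT}" only resolves if that environment variable is set;
# unresolved or non-existent paths are silently skipped. ---
_EXAMPLE_PROJECT_SETTINGS = {
    "global": {
        "project_plugins": {
            "windows": ["P:/pipeline/{AVALON_PROJECT}/plugins"],
            "linux": ["{STUDIO_ROOT}/pipeline/plugins"],
            "darwin": [],
        }
    }
}

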
def _on_task_change():
    change_timer_to_current_context()


def uninstall_host():
    """Undo all of what `install_host()` did"""
    host = registered_host()

    try:
        host.uninstall()
    except AttributeError:
        pass

    log.info("Deregistering global plug-ins..")
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
    pyblish.api.deregister_discovery_filter(filter_pyblish_plugins)
    deregister_loader_plugin_path(LOAD_PATH)
    log.info("Global plug-ins unregistered")

    deregister_host()

    io.uninstall()

    log.info("Successfully uninstalled Avalon!")


def is_installed():
    """Return state of installation

    Returns:
        True if installed, False otherwise

    """

    return _is_installed


def register_host(host):
    """Register a new host for the current process

    Arguments:
        host (ModuleType): A module implementing the
            Host API interface. See the Host API
            documentation for details on what is
            required, or browse the source code.

    """
    signatures = {
        "ls": []
    }

    _validate_signature(host, signatures)
    _registered_host["_"] = host


def _validate_signature(module, signatures):
    # Required signatures for each member

    missing = list()
    invalid = list()
    success = True

    for member in signatures:
        if not hasattr(module, member):
            missing.append(member)
            success = False

        else:
            attr = getattr(module, member)
            if sys.version_info.major >= 3:
                signature = inspect.getfullargspec(attr)[0]
            else:
                signature = inspect.getargspec(attr)[0]
            required_signature = signatures[member]

            assert isinstance(signature, list)
            assert isinstance(required_signature, list)

            if not all(member in signature
                       for member in required_signature):
                invalid.append({
                    "member": member,
                    "signature": ", ".join(signature),
                    "required": ", ".join(required_signature)
                })
                success = False

    if not success:
        report = list()

        if missing:
            report.append(
                "Incomplete interface for module: '%s'\n"
                "Missing: %s" % (module, ", ".join(
                    "'%s'" % member for member in missing))
            )

        if invalid:
            report.append(
                "'%s': One or more members were found, but didn't "
                "have the right argument signature." % module.__name__
            )

            for member in invalid:
                report.append(
                    "  Found: {member}({signature})".format(**member)
                )
                report.append(
                    "  Expected: {member}({required})".format(**member)
                )

        raise ValueError("\n".join(report))


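# --- Editor's note: a hypothetical sketch of what the validation above
# accepts and rejects; both throwaway modules exist only for illustration. ---
def _example_signature_validation():
    good = types.ModuleType("goodHost")
    good.ls = lambda: []  # "ls" exists and requires no arguments
    register_host(good)   # passes validation

    bad = types.ModuleType("badHost")  # defines no "ls" member at all
    try:
        register_host(bad)
    except ValueError as exc:
        print(exc)  # reports the incomplete interface / missing "ls"

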
def registered_host():
    """Return currently registered host"""
    return _registered_host["_"]


def deregister_host():
    _registered_host["_"] = default_host()


def default_host():
    """A default host, in place of anything better

    This may be considered as reference for the
    interface a host must implement. It also ensures
    that the system runs, even when nothing is there
    to support it.

    """

    host = types.ModuleType("defaultHost")

    def ls():
        return list()

    host.__dict__.update({
        "ls": ls
    })

    return host


def debug_host():
    """A debug host, useful for debugging features that depend on a host"""

    host = types.ModuleType("debugHost")

    def ls():
        containers = [
            {
                "representation": "ee-ft-a-uuid1",
                "schema": "openpype:container-1.0",
                "name": "Bruce01",
                "objectName": "Bruce01_node",
                "namespace": "_bruce01_",
                "version": 3,
            },
            {
                "representation": "aa-bc-s-uuid2",
                "schema": "openpype:container-1.0",
                "name": "Bruce02",
                "objectName": "Bruce01_node",
                "namespace": "_bruce02_",
                "version": 2,
            }
        ]

        for container in containers:
            yield container

    host.__dict__.update({
        "ls": ls,
        "open_file": lambda fname: None,
        "save_file": lambda fname: None,
        "current_file": lambda: os.path.expanduser("~/temp.txt"),
        "has_unsaved_changes": lambda: False,
        "work_root": lambda: os.path.expanduser("~/temp"),
        "file_extensions": lambda: ["txt"],
    })

    return host
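
As a hedged sketch of the host lifecycle implied by the module above (the
variable names are illustrative, not part of this diff): deregistering never
leaves the registry empty, because the no-op default host is swapped in.

    host = debug_host()
    register_host(host)
    assert registered_host() is host

    deregister_host()
    # The default host's ls() yields nothing, but stays callable.
    assert list(registered_host().ls()) == []
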
@@ -356,7 +356,7 @@ class CreatedInstance:
             already existing instance.
         creator(BaseCreator): Creator responsible for instance.
         host(ModuleType): Host implementation loaded with
-            `avalon.api.registered_host`.
+            `openpype.pipeline.registered_host`.
         new(bool): Is instance new.
     """
     # Keys that can't be changed or removed from data after loading using
@@ -142,7 +142,8 @@ def legacy_create(Creator, name, asset, options=None, data=None):
             Name of instance

     """
-    from avalon.api import registered_host
+    from openpype.pipeline import registered_host

     host = registered_host()
     plugin = Creator(name, asset, options, data)
0
openpype/pipeline/farm/__init__.py
Normal file
24
openpype/pipeline/farm/patterning.py
Normal file
@@ -0,0 +1,24 @@
# -*- coding: utf-8 -*-
import re


def match_aov_pattern(host_name, aov_patterns, render_file_name):
    """Match a render file name against a host's `AOV` patterns.

    The render file name, taken from the file collection
    gathered from `exp_files`, is compared against every
    pattern configured for the host in the AOV filters to
    decide whether the rendered file should get a review.

    Args:
        host_name (str): Host name.
        aov_patterns (dict): AOV patterns from AOV filters.
        render_file_name (str): Incoming file name to match against.

    Returns:
        bool: Review state for the rendered file (render_file_name).
    """
    aov_pattern = aov_patterns.get(host_name, [])
    if not aov_pattern:
        return False
    return any(re.match(p, render_file_name) for p in aov_pattern)
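
A hedged usage example for the new helper (the host key and the regular
expression below are invented, not taken from any shipped AOV filter):

    aov_patterns = {"maya": [r".*([Bb]eauty).*"]}

    match_aov_pattern("maya", aov_patterns, "render_beauty_v001.1001.exr")   # True
    match_aov_pattern("maya", aov_patterns, "render_diffuse_v001.1001.exr")  # False
    match_aov_pattern("nuke", aov_patterns, "render_beauty_v001.1001.exr")   # False, no patterns
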
Some files were not shown because too many files have changed in this diff.