Mirror of https://github.com/ynput/ayon-core.git (synced 2026-01-02 08:54:53 +01:00)

Merge remote-tracking branch 'upstream/develop' into refactor_integrator

Commit cf302dcb30
102 changed files with 1201 additions and 489 deletions
.github/workflows/prerelease.yml (vendored): 2 changes

@@ -80,7 +80,7 @@ jobs:
           git tag -a $tag_name -m "nightly build"

       - name: Push to protected main branch
-        uses: CasperWA/push-protected@v2
+        uses: CasperWA/push-protected@v2.10.0
        with:
          token: ${{ secrets.ADMIN_TOKEN }}
          branch: main
.github/workflows/release.yml (vendored): 2 changes

@@ -68,7 +68,7 @@ jobs:

      - name: 🔏 Push to protected main branch
        if: steps.version.outputs.release_tag != 'skip'
-        uses: CasperWA/push-protected@v2
+        uses: CasperWA/push-protected@v2.10.0
        with:
          token: ${{ secrets.ADMIN_TOKEN }}
          branch: main
CHANGELOG.md: 84 changes

@@ -1,23 +1,64 @@
# Changelog

## [3.9.4-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.10.0-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.3...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.4...HEAD)

### 📖 Documentation

- Documentation: Python requirements to 3.7.9 [\#3035](https://github.com/pypeclub/OpenPype/pull/3035)
- Website Docs: Remove unused pages [\#2974](https://github.com/pypeclub/OpenPype/pull/2974)
- Nuke docs with videos [\#3052](https://github.com/pypeclub/OpenPype/pull/3052)

**🚀 Enhancements**

- Update collect\_render.py [\#3055](https://github.com/pypeclub/OpenPype/pull/3055)

**🐛 Bug fixes**

- Nuke: Add aov matching even for remainder and prerender [\#3060](https://github.com/pypeclub/OpenPype/pull/3060)

**🔀 Refactored code**

- General: Move host install [\#3009](https://github.com/pypeclub/OpenPype/pull/3009)

## [3.9.4](https://github.com/pypeclub/OpenPype/tree/3.9.4) (2022-04-15)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.4-nightly.2...3.9.4)

### 📖 Documentation

- Documentation: more info about Tasks [\#3062](https://github.com/pypeclub/OpenPype/pull/3062)
- Documentation: Python requirements to 3.7.9 [\#3035](https://github.com/pypeclub/OpenPype/pull/3035)
- Website Docs: Remove unused pages [\#2974](https://github.com/pypeclub/OpenPype/pull/2974)

**🆕 New features**

- General: Local overrides for environment variables [\#3045](https://github.com/pypeclub/OpenPype/pull/3045)

**🚀 Enhancements**

- TVPaint: Added init file for worker to triggers missing sound file dialog [\#3053](https://github.com/pypeclub/OpenPype/pull/3053)
- Ftrack: Custom attributes can be filled in slate values [\#3036](https://github.com/pypeclub/OpenPype/pull/3036)
- Resolve environment variable in google drive credential path [\#3008](https://github.com/pypeclub/OpenPype/pull/3008)

**🐛 Bug fixes**

- GitHub: Updated push-protected action in github workflow [\#3064](https://github.com/pypeclub/OpenPype/pull/3064)
- Nuke: Typos in imports from Nuke implementation [\#3061](https://github.com/pypeclub/OpenPype/pull/3061)
- Hotfix: fixing deadline job publishing [\#3059](https://github.com/pypeclub/OpenPype/pull/3059)
- General: Extract Review handle invalid characters for ffmpeg [\#3050](https://github.com/pypeclub/OpenPype/pull/3050)
- Slate Review: Support to keep format on slate concatenation [\#3049](https://github.com/pypeclub/OpenPype/pull/3049)
- Webpublisher: fix processing of workfile [\#3048](https://github.com/pypeclub/OpenPype/pull/3048)
- Ftrack: Integrate ftrack api fix [\#3044](https://github.com/pypeclub/OpenPype/pull/3044)
- Webpublisher - removed wrong hardcoded family [\#3043](https://github.com/pypeclub/OpenPype/pull/3043)
- LibraryLoader: Use current project for asset query in families filter [\#3042](https://github.com/pypeclub/OpenPype/pull/3042)
- SiteSync: Providers ignore that site is disabled [\#3041](https://github.com/pypeclub/OpenPype/pull/3041)
- Unreal: Creator import fixes [\#3040](https://github.com/pypeclub/OpenPype/pull/3040)
- SiteSync: fix transitive alternate sites, fix dropdown in Local Settings [\#3018](https://github.com/pypeclub/OpenPype/pull/3018)

**Merged pull requests:**

- Deadline: reworked pools assignment [\#3051](https://github.com/pypeclub/OpenPype/pull/3051)
- Houdini: Avoid ImportError on `hdefereval` when Houdini runs without UI [\#2987](https://github.com/pypeclub/OpenPype/pull/2987)

## [3.9.3](https://github.com/pypeclub/OpenPype/tree/3.9.3) (2022-04-07)
@@ -38,6 +79,7 @@
- Ftrack: Add more options for note text of integrate ftrack note [\#3025](https://github.com/pypeclub/OpenPype/pull/3025)
- Console Interpreter: Changed how console splitter size are reused on show [\#3016](https://github.com/pypeclub/OpenPype/pull/3016)
- Deadline: Use more suitable name for sequence review logic [\#3015](https://github.com/pypeclub/OpenPype/pull/3015)
- General: default workfile subset name for workfile [\#3011](https://github.com/pypeclub/OpenPype/pull/3011)
- Nuke: add concurrency attr to deadline job [\#3005](https://github.com/pypeclub/OpenPype/pull/3005)
- Deadline: priority configurable in Maya jobs [\#2995](https://github.com/pypeclub/OpenPype/pull/2995)
- Workfiles tool: Save as published workfiles [\#2937](https://github.com/pypeclub/OpenPype/pull/2937)

@@ -46,18 +88,21 @@

- Deadline: Fixed default value of use sequence for review [\#3033](https://github.com/pypeclub/OpenPype/pull/3033)
- Settings UI: Version column can be extended so version are visible [\#3032](https://github.com/pypeclub/OpenPype/pull/3032)
- General: Fix validate asset docs plug-in filename and class name [\#3029](https://github.com/pypeclub/OpenPype/pull/3029)
- General: Fix import after movements [\#3028](https://github.com/pypeclub/OpenPype/pull/3028)
- Harmony: Added creating subset name for workfile from template [\#3024](https://github.com/pypeclub/OpenPype/pull/3024)
- AfterEffects: Added creating subset name for workfile from template [\#3023](https://github.com/pypeclub/OpenPype/pull/3023)
- General: Add example addons to ignored [\#3022](https://github.com/pypeclub/OpenPype/pull/3022)
- SiteSync: fix transitive alternate sites, fix dropdown in Local Settings [\#3018](https://github.com/pypeclub/OpenPype/pull/3018)
- Maya: Remove missing import [\#3017](https://github.com/pypeclub/OpenPype/pull/3017)
- Ftrack: multiple reviewable componets [\#3012](https://github.com/pypeclub/OpenPype/pull/3012)
- Tray publisher: Fixes after code movement [\#3010](https://github.com/pypeclub/OpenPype/pull/3010)
- Nuke: fixing unicode type detection in effect loaders [\#3002](https://github.com/pypeclub/OpenPype/pull/3002)
- Fix - remove doubled dot in workfile created from template [\#2998](https://github.com/pypeclub/OpenPype/pull/2998)
- Nuke: removing redundant Ftrack asset when farm publishing [\#2996](https://github.com/pypeclub/OpenPype/pull/2996)

**🔀 Refactored code**

- General: Move plugins register and discover [\#2935](https://github.com/pypeclub/OpenPype/pull/2935)

**Merged pull requests:**

- Maya: Allow to select invalid camera contents if no cameras found [\#3030](https://github.com/pypeclub/OpenPype/pull/3030)
@@ -75,26 +120,24 @@
**🆕 New features**

- nuke: bypass baking [\#2992](https://github.com/pypeclub/OpenPype/pull/2992)
- Multiverse: Initial Support [\#2908](https://github.com/pypeclub/OpenPype/pull/2908)

**🚀 Enhancements**

- Photoshop: create image without instance [\#3001](https://github.com/pypeclub/OpenPype/pull/3001)
- TVPaint: Render scene family [\#3000](https://github.com/pypeclub/OpenPype/pull/3000)
- Nuke: ReviewDataMov Read RAW attribute [\#2985](https://github.com/pypeclub/OpenPype/pull/2985)
- SiteSync: Added compute\_resource\_sync\_sites to sync\_server\_module [\#2983](https://github.com/pypeclub/OpenPype/pull/2983)
- General: `METADATA\_KEYS` constant as `frozenset` for optimal immutable lookup [\#2980](https://github.com/pypeclub/OpenPype/pull/2980)
- General: Tools with host filters [\#2975](https://github.com/pypeclub/OpenPype/pull/2975)
- Hero versions: Use custom templates [\#2967](https://github.com/pypeclub/OpenPype/pull/2967)
- Slack: Added configurable maximum file size of review upload to Slack [\#2945](https://github.com/pypeclub/OpenPype/pull/2945)
- NewPublisher: Prepared implementation of optional pyblish plugin [\#2943](https://github.com/pypeclub/OpenPype/pull/2943)
- TVPaint: Extractor to convert PNG into EXR [\#2942](https://github.com/pypeclub/OpenPype/pull/2942)
- Workfiles: Open published workfiles [\#2925](https://github.com/pypeclub/OpenPype/pull/2925)
- General: Default modules loaded dynamically [\#2923](https://github.com/pypeclub/OpenPype/pull/2923)
- Nuke: improving readability [\#2903](https://github.com/pypeclub/OpenPype/pull/2903)

**🐛 Bug fixes**

- Hosts: Remove path existence checks in 'add\_implementation\_envs' [\#3004](https://github.com/pypeclub/OpenPype/pull/3004)
- Fix - remove doubled dot in workfile created from template [\#2998](https://github.com/pypeclub/OpenPype/pull/2998)
- PS: fix renaming subset incorrectly in PS [\#2991](https://github.com/pypeclub/OpenPype/pull/2991)
- Fix: Disable setuptools auto discovery [\#2990](https://github.com/pypeclub/OpenPype/pull/2990)
- AEL: fix opening existing workfile if no scene opened [\#2989](https://github.com/pypeclub/OpenPype/pull/2989)

@@ -113,17 +156,6 @@
- General: Python specific vendor paths on env injection [\#2939](https://github.com/pypeclub/OpenPype/pull/2939)
- General: More fail safe delete old versions [\#2936](https://github.com/pypeclub/OpenPype/pull/2936)
- Settings UI: Collapsed of collapsible wrapper works as expected [\#2934](https://github.com/pypeclub/OpenPype/pull/2934)
- Maya: Do not pass `set` to maya commands \(fixes support for older maya versions\) [\#2932](https://github.com/pypeclub/OpenPype/pull/2932)
- General: Don't print log record on OSError [\#2926](https://github.com/pypeclub/OpenPype/pull/2926)
- Flame: centos related debugging [\#2922](https://github.com/pypeclub/OpenPype/pull/2922)

**🔀 Refactored code**

- General: Move plugins register and discover [\#2935](https://github.com/pypeclub/OpenPype/pull/2935)
- General: Move Attribute Definitions from pipeline [\#2931](https://github.com/pypeclub/OpenPype/pull/2931)
- General: Removed silo references and terminal splash [\#2927](https://github.com/pypeclub/OpenPype/pull/2927)
- General: Move pipeline constants to OpenPype [\#2918](https://github.com/pypeclub/OpenPype/pull/2918)
- General: Move remaining plugins from avalon [\#2912](https://github.com/pypeclub/OpenPype/pull/2912)

**Merged pull requests:**

@@ -136,16 +168,6 @@

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.1-nightly.3...3.9.1)

**🚀 Enhancements**

- Nuke: Add no-audio Tag [\#2911](https://github.com/pypeclub/OpenPype/pull/2911)
- General: Change how OPENPYPE\_DEBUG value is handled [\#2907](https://github.com/pypeclub/OpenPype/pull/2907)

**🐛 Bug fixes**

- General: Fix use of Anatomy roots [\#2904](https://github.com/pypeclub/OpenPype/pull/2904)
- Fixing gap detection in extract review [\#2902](https://github.com/pypeclub/OpenPype/pull/2902)

## [3.9.0](https://github.com/pypeclub/OpenPype/tree/3.9.0) (2022-03-14)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.9.0-nightly.9...3.9.0)
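The "General: Move host install" entry above is the thread running through most of the source changes in this commit: per-host `avalon.api.install(...)` and `openpype.install()` calls are replaced by `openpype.pipeline.install_host(...)`. A minimal sketch of the resulting pattern, not taken verbatim from the diff; the Maya host is used only as an example and any host API module is passed the same way:

from openpype.pipeline import install_host, registered_host
from openpype.hosts.maya import api

# Register the host implementation (pyblish host, plugin paths, callbacks).
install_host(api)

# The registered host can be queried back wherever it is needed.
print("Registered host: {}".format(registered_host()))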
@@ -1,102 +1,5 @@
# -*- coding: utf-8 -*-
"""Pype module."""
import os
import platform
import logging

from .settings import get_project_settings
from .lib import (
    Anatomy,
    filter_pyblish_plugins,
    change_timer_to_current_context,
    register_event_callback,
)

log = logging.getLogger(__name__)


PACKAGE_DIR = os.path.dirname(os.path.abspath(__file__))
PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")

# Global plugin paths
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")


def install():
    """Install OpenPype to Avalon."""
    import avalon.api
    import pyblish.api
    from pyblish.lib import MessageHandler
    from openpype.modules import load_modules
    from openpype.pipeline import (
        register_loader_plugin_path,
        register_inventory_action,
        register_creator_plugin_path,
    )

    # Make sure modules are loaded
    load_modules()

    def modified_emit(obj, record):
        """Method replacing `emit` in Pyblish's MessageHandler."""
        record.msg = record.getMessage()
        obj.records.append(record)

    MessageHandler.emit = modified_emit

    log.info("Registering global plug-ins..")
    pyblish.api.register_plugin_path(PUBLISH_PATH)
    pyblish.api.register_discovery_filter(filter_pyblish_plugins)
    register_loader_plugin_path(LOAD_PATH)

    project_name = os.environ.get("AVALON_PROJECT")

    # Register studio specific plugins
    if project_name:
        anatomy = Anatomy(project_name)
        anatomy.set_root_environments()
        avalon.api.register_root(anatomy.roots)

        project_settings = get_project_settings(project_name)
        platform_name = platform.system().lower()
        project_plugins = (
            project_settings
            .get("global", {})
            .get("project_plugins", {})
            .get(platform_name)
        ) or []
        for path in project_plugins:
            try:
                path = str(path.format(**os.environ))
            except KeyError:
                pass

            if not path or not os.path.exists(path):
                continue

            pyblish.api.register_plugin_path(path)
            register_loader_plugin_path(path)
            register_creator_plugin_path(path)
            register_inventory_action(path)

    # apply monkey patched discover to original one
    log.info("Patching discovery")

    register_event_callback("taskChanged", _on_task_change)


def _on_task_change():
    change_timer_to_current_context()


def uninstall():
    """Uninstall Pype from Avalon."""
    import pyblish.api
    from openpype.pipeline import deregister_loader_plugin_path

    log.info("Deregistering global plug-ins..")
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
    pyblish.api.deregister_discovery_filter(filter_pyblish_plugins)
    deregister_loader_plugin_path(LOAD_PATH)
    log.info("Global plug-ins unregistered")
|
|
@ -6,6 +6,7 @@ import logging
|
|||
|
||||
from Qt import QtWidgets
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.lib.remote_publish import headless_publish
|
||||
|
||||
from openpype.tools.utils import host_tools
|
||||
|
|
@ -22,10 +23,9 @@ def safe_excepthook(*args):
|
|||
def main(*subprocess_args):
|
||||
sys.excepthook = safe_excepthook
|
||||
|
||||
import avalon.api
|
||||
from openpype.hosts.aftereffects import api
|
||||
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
|
||||
os.environ["OPENPYPE_LOG_NO_COLORS"] = "False"
|
||||
app = QtWidgets.QApplication([])
|
||||
|
|
|
|||
|
|
@ -15,6 +15,7 @@ from openpype.pipeline import (
|
|||
deregister_loader_plugin_path,
|
||||
deregister_creator_plugin_path,
|
||||
AVALON_CONTAINER_ID,
|
||||
registered_host,
|
||||
)
|
||||
import openpype.hosts.aftereffects
|
||||
from openpype.lib import register_event_callback
|
||||
|
|
@ -37,24 +38,9 @@ def check_inventory():
|
|||
if not lib.any_outdated():
|
||||
return
|
||||
|
||||
host = pyblish.api.registered_host()
|
||||
outdated_containers = []
|
||||
for container in host.ls():
|
||||
representation = container['representation']
|
||||
representation_doc = io.find_one(
|
||||
{
|
||||
"_id": ObjectId(representation),
|
||||
"type": "representation"
|
||||
},
|
||||
projection={"parent": True}
|
||||
)
|
||||
if representation_doc and not lib.is_latest(representation_doc):
|
||||
outdated_containers.append(container)
|
||||
|
||||
# Warn about outdated containers.
|
||||
print("Starting new QApplication..")
|
||||
app = QtWidgets.QApplication(sys.argv)
|
||||
|
||||
message_box = QtWidgets.QMessageBox()
|
||||
message_box.setIcon(QtWidgets.QMessageBox.Warning)
|
||||
msg = "There are outdated containers in the scene."
|
||||
|
|
|
|||
|
|
@ -19,6 +19,7 @@ from openpype.pipeline import (
|
|||
deregister_loader_plugin_path,
|
||||
deregister_creator_plugin_path,
|
||||
AVALON_CONTAINER_ID,
|
||||
uninstall_host,
|
||||
)
|
||||
from openpype.api import Logger
|
||||
from openpype.lib import (
|
||||
|
|
@ -209,11 +210,10 @@ def reload_pipeline(*args):
|
|||
|
||||
"""
|
||||
|
||||
avalon.api.uninstall()
|
||||
uninstall_host()
|
||||
|
||||
for module in (
|
||||
"avalon.io",
|
||||
"avalon.lib",
|
||||
"avalon.pipeline",
|
||||
"avalon.api",
|
||||
):
|
||||
|
|
|
|||
|
|
@ -1,4 +1,4 @@
|
|||
from avalon import pipeline
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.blender import api
|
||||
|
||||
pipeline.install(api)
|
||||
install_host(api)
|
||||
|
|
|
|||
|
|
@ -3,8 +3,6 @@ import sys
|
|||
import copy
|
||||
import argparse
|
||||
|
||||
from avalon import io
|
||||
|
||||
import pyblish.api
|
||||
import pyblish.util
|
||||
|
||||
|
|
@ -13,6 +11,8 @@ import openpype
|
|||
import openpype.hosts.celaction
|
||||
from openpype.hosts.celaction import api as celaction
|
||||
from openpype.tools.utils import host_tools
|
||||
from openpype.pipeline import install_openpype_plugins
|
||||
|
||||
|
||||
log = Logger().get_logger("Celaction_cli_publisher")
|
||||
|
||||
|
|
@ -21,9 +21,6 @@ publish_host = "celaction"
|
|||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.celaction.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
|
||||
|
||||
|
||||
def cli():
|
||||
|
|
@ -74,7 +71,7 @@ def main():
|
|||
_prepare_publish_environments()
|
||||
|
||||
# Registers pype's Global pyblish plugins
|
||||
openpype.install()
|
||||
install_openpype_plugins()
|
||||
|
||||
if os.path.exists(PUBLISH_PATH):
|
||||
log.info(f"Registering path: {PUBLISH_PATH}")
|
||||
|
|
|
|||
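The celaction publisher above swaps the removed package-level `openpype.install()` call for `install_openpype_plugins()`. A minimal sketch of that call on its own, assuming the same import path the diff shows:

from openpype.pipeline import install_openpype_plugins

# Broadly covers the plugin-registration part of the removed package-level
# install(): OpenPype's global publish/load/create plugin paths are registered
# without installing any host.
install_openpype_plugins()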
|
|
@ -3,18 +3,19 @@ import sys
|
|||
from Qt import QtWidgets
|
||||
from pprint import pformat
|
||||
import atexit
|
||||
import openpype
|
||||
import avalon
|
||||
|
||||
import openpype.hosts.flame.api as opfapi
|
||||
from openpype.pipeline import (
|
||||
install_host,
|
||||
registered_host,
|
||||
)
|
||||
|
||||
|
||||
def openpype_install():
|
||||
"""Registering OpenPype in context
|
||||
"""
|
||||
openpype.install()
|
||||
avalon.api.install(opfapi)
|
||||
print("Avalon registered hosts: {}".format(
|
||||
avalon.api.registered_host()))
|
||||
install_host(opfapi)
|
||||
print("Registered host: {}".format(registered_host()))
|
||||
|
||||
|
||||
# Exception handler
|
||||
|
|
|
|||
|
|
@ -7,6 +7,10 @@ import logging
|
|||
import avalon.api
|
||||
from avalon import io
|
||||
|
||||
from openpype.pipeline import (
|
||||
install_host,
|
||||
registered_host,
|
||||
)
|
||||
from openpype.lib import version_up
|
||||
from openpype.hosts.fusion import api
|
||||
from openpype.hosts.fusion.api import lib
|
||||
|
|
@ -218,7 +222,7 @@ def switch(asset_name, filepath=None, new=True):
|
|||
assert current_comp is not None, (
|
||||
"Fusion could not load '{}'").format(filepath)
|
||||
|
||||
host = avalon.api.registered_host()
|
||||
host = registered_host()
|
||||
containers = list(host.ls())
|
||||
assert containers, "Nothing to update"
|
||||
|
||||
|
|
@ -279,7 +283,7 @@ if __name__ == '__main__':
|
|||
|
||||
args, unknown = parser.parse_args()
|
||||
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
switch(args.asset_name, args.file_path)
|
||||
|
||||
sys.exit(0)
|
||||
|
|
|
|||
|
|
@ -1,24 +1,23 @@
|
|||
import os
|
||||
import sys
|
||||
import openpype
|
||||
|
||||
from openpype.api import Logger
|
||||
from openpype.pipeline import (
|
||||
install_host,
|
||||
registered_host,
|
||||
)
|
||||
|
||||
log = Logger().get_logger(__name__)
|
||||
|
||||
|
||||
def main(env):
|
||||
import avalon.api
|
||||
from openpype.hosts.fusion import api
|
||||
from openpype.hosts.fusion.api import menu
|
||||
|
||||
# Registers pype's Global pyblish plugins
|
||||
openpype.install()
|
||||
|
||||
# activate resolve from pype
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
|
||||
log.info(f"Avalon registered hosts: {avalon.api.registered_host()}")
|
||||
log.info(f"Registered host: {registered_host()}")
|
||||
|
||||
menu.launch_openpype_menu()
|
||||
|
||||
|
|
|
|||
|
|
@ -1,14 +1,15 @@
|
|||
import os
|
||||
import sys
|
||||
import glob
|
||||
import logging
|
||||
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
import avalon.api
|
||||
from avalon import io
|
||||
import qtawesome as qta
|
||||
|
||||
from openpype import style
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.fusion import api
|
||||
from openpype.lib.avalon_context import get_workdir_from_session
|
||||
|
||||
|
|
@ -181,8 +182,7 @@ class App(QtWidgets.QWidget):
|
|||
|
||||
|
||||
if __name__ == '__main__':
|
||||
import sys
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
|
||||
app = QtWidgets.QApplication(sys.argv)
|
||||
window = App()
|
||||
|
|
|
|||
|
|
@ -183,10 +183,10 @@ def launch(application_path, *args):
|
|||
application_path (str): Path to Harmony.
|
||||
|
||||
"""
|
||||
from avalon import api
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.harmony import api as harmony
|
||||
|
||||
api.install(harmony)
|
||||
install_host(harmony)
|
||||
|
||||
ProcessContext.port = random.randrange(49152, 65535)
|
||||
os.environ["AVALON_HARMONY_PORT"] = str(ProcessContext.port)
|
||||
|
|
|
|||
|
|
@ -34,14 +34,7 @@ AVALON_CONTAINERS = ":AVALON_CONTAINERS"
|
|||
|
||||
|
||||
def install():
|
||||
"""
|
||||
Installing Hiero integration for avalon
|
||||
|
||||
Args:
|
||||
config (obj): avalon config module `pype` in our case, it is not
|
||||
used but required by avalon.api.install()
|
||||
|
||||
"""
|
||||
"""Installing Hiero integration."""
|
||||
|
||||
# adding all events
|
||||
events.register_events()
|
||||
|
|
|
|||
|
|
@ -1,9 +1,9 @@
|
|||
import traceback
|
||||
|
||||
# activate hiero from pype
|
||||
import avalon.api
|
||||
from openpype.pipeline import install_host
|
||||
import openpype.hosts.hiero.api as phiero
|
||||
avalon.api.install(phiero)
|
||||
install_host(phiero)
|
||||
|
||||
try:
|
||||
__import__("openpype.hosts.hiero.api")
|
||||
|
|
|
|||
|
|
@ -6,8 +6,6 @@ import contextlib
|
|||
import hou
|
||||
|
||||
import pyblish.api
|
||||
import avalon.api
|
||||
from avalon.lib import find_submodule
|
||||
|
||||
from openpype.pipeline import (
|
||||
register_creator_plugin_path,
|
||||
|
|
@ -214,24 +212,12 @@ def ls():
|
|||
"pyblish.mindbender.container"):
|
||||
containers += lib.lsattr("id", identifier)
|
||||
|
||||
has_metadata_collector = False
|
||||
config_host = find_submodule(avalon.api.registered_config(), "houdini")
|
||||
if hasattr(config_host, "collect_container_metadata"):
|
||||
has_metadata_collector = True
|
||||
|
||||
for container in sorted(containers,
|
||||
# Hou 19+ Python 3 hou.ObjNode are not
|
||||
# sortable due to not supporting greater
|
||||
# than comparisons
|
||||
key=lambda node: node.path()):
|
||||
data = parse_container(container)
|
||||
|
||||
# Collect custom data if attribute is present
|
||||
if has_metadata_collector:
|
||||
metadata = config_host.collect_container_metadata(container)
|
||||
data.update(metadata)
|
||||
|
||||
yield data
|
||||
yield parse_container(container)
|
||||
|
||||
|
||||
def before_save():
|
||||
|
|
|
|||
|
|
@ -1,6 +1,7 @@
|
|||
import avalon.api as api
|
||||
import pyblish.api
|
||||
|
||||
from openpype.pipeline import registered_host
|
||||
|
||||
|
||||
def collect_input_containers(nodes):
|
||||
"""Collect containers that contain any of the node in `nodes`.
|
||||
|
|
@ -18,7 +19,7 @@ def collect_input_containers(nodes):
|
|||
lookup = frozenset(nodes)
|
||||
|
||||
containers = []
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
for container in host.ls():
|
||||
|
||||
node = container["node"]
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
import pyblish.api
|
||||
import avalon.api
|
||||
|
||||
from openpype.api import version_up
|
||||
from openpype.action import get_errored_plugins_from_data
|
||||
from openpype.pipeline import registered_host
|
||||
|
||||
|
||||
class IncrementCurrentFile(pyblish.api.InstancePlugin):
|
||||
|
|
@ -41,7 +41,7 @@ class IncrementCurrentFile(pyblish.api.InstancePlugin):
|
|||
)
|
||||
|
||||
# Filename must not have changed since collecting
|
||||
host = avalon.api.registered_host()
|
||||
host = registered_host()
|
||||
current_file = host.current_file()
|
||||
assert (
|
||||
context.data["currentFile"] == current_file
|
||||
|
|
|
|||
|
|
@ -1,5 +1,6 @@
|
|||
import pyblish.api
|
||||
import avalon.api
|
||||
|
||||
from openpype.pipeline import registered_host
|
||||
|
||||
|
||||
class SaveCurrentScene(pyblish.api.ContextPlugin):
|
||||
|
|
@ -12,7 +13,7 @@ class SaveCurrentScene(pyblish.api.ContextPlugin):
|
|||
def process(self, context):
|
||||
|
||||
# Filename must not have changed since collecting
|
||||
host = avalon.api.registered_host()
|
||||
host = registered_host()
|
||||
current_file = host.current_file()
|
||||
assert context.data['currentFile'] == current_file, (
|
||||
"Collected filename from current scene name."
|
||||
|
|
|
|||
|
|
@ -1,10 +1,10 @@
|
|||
import avalon.api
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.houdini import api
|
||||
|
||||
|
||||
def main():
|
||||
print("Installing OpenPype ...")
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
|
||||
|
||||
main()
|
||||
|
|
|
|||
|
|
@ -1,10 +1,10 @@
|
|||
import avalon.api
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.houdini import api
|
||||
|
||||
|
||||
def main():
|
||||
print("Installing OpenPype ...")
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
|
||||
|
||||
main()
|
||||
|
|
|
|||
|
|
@ -134,6 +134,7 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase):
|
|||
"""
|
||||
|
||||
from avalon import api, io
|
||||
from openpype.pipeline import registered_root
|
||||
|
||||
PROJECT = api.Session["AVALON_PROJECT"]
|
||||
asset_doc = io.find_one({"name": asset,
|
||||
|
|
@ -141,7 +142,7 @@ class AvalonURIOutputProcessor(base.OutputProcessorBase):
|
|||
if not asset_doc:
|
||||
raise RuntimeError("Invalid asset name: '%s'" % asset)
|
||||
|
||||
root = api.registered_root()
|
||||
root = registered_root()
|
||||
path = self._template.format(**{
|
||||
"root": root,
|
||||
"project": PROJECT,
|
||||
|
|
|
|||
|
|
@ -26,6 +26,7 @@ from openpype.pipeline import (
|
|||
loaders_from_representation,
|
||||
get_representation_path,
|
||||
load_container,
|
||||
registered_host,
|
||||
)
|
||||
from .commands import reset_frame_range
|
||||
|
||||
|
|
@ -1574,7 +1575,7 @@ def assign_look_by_version(nodes, version_id):
|
|||
"name": "json"})
|
||||
|
||||
# See if representation is already loaded, if so reuse it.
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
representation_id = str(look_representation['_id'])
|
||||
for container in host.ls():
|
||||
if (container['loader'] == "LookLoader" and
|
||||
|
|
@ -2612,7 +2613,7 @@ def get_attr_in_layer(attr, layer):
|
|||
def fix_incompatible_containers():
|
||||
"""Backwards compatibility: old containers to use new ReferenceLoader"""
|
||||
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
for container in host.ls():
|
||||
loader = container['loader']
|
||||
|
||||
|
|
|
|||
|
|
@ -194,11 +194,13 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
|
|||
assert render_products, "no render products generated"
|
||||
exp_files = []
|
||||
multipart = False
|
||||
render_cameras = []
|
||||
for product in render_products:
|
||||
if product.multipart:
|
||||
multipart = True
|
||||
product_name = product.productName
|
||||
if product.camera and layer_render_products.has_camera_token():
|
||||
render_cameras.append(product.camera)
|
||||
product_name = "{}{}".format(
|
||||
product.camera,
|
||||
"_" + product_name if product_name else "")
|
||||
|
|
@ -208,6 +210,8 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
|
|||
product)
|
||||
})
|
||||
|
||||
assert render_cameras, "No render cameras found."
|
||||
|
||||
self.log.info("multipart: {}".format(
|
||||
multipart))
|
||||
assert exp_files, "no file names were generated, this is bug"
|
||||
|
|
|
|||
|
|
@ -1,11 +1,10 @@
|
|||
import os
|
||||
import avalon.api
|
||||
from openpype.api import get_project_settings
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.maya import api
|
||||
import openpype.hosts.maya.api.lib as mlib
|
||||
from maya import cmds
|
||||
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
|
||||
|
||||
print("starting OpenPype usersetup")
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
import nuke
|
||||
import avalon.api
|
||||
|
||||
from openpype.api import Logger
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.nuke import api
|
||||
from openpype.hosts.nuke.api.lib import (
|
||||
on_script_load,
|
||||
|
|
@ -13,7 +13,7 @@ from openpype.hosts.nuke.api.lib import (
|
|||
log = Logger.get_logger(__name__)
|
||||
|
||||
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
|
||||
# fix ffmpeg settings on script
|
||||
nuke.addOnScriptLoad(on_script_load)
|
||||
|
|
|
|||
|
|
@ -5,9 +5,8 @@ import traceback
|
|||
|
||||
from Qt import QtWidgets
|
||||
|
||||
import avalon.api
|
||||
|
||||
from openpype.api import Logger
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.tools.utils import host_tools
|
||||
from openpype.lib.remote_publish import headless_publish
|
||||
from openpype.lib import env_value_to_bool
|
||||
|
|
@ -24,7 +23,7 @@ def safe_excepthook(*args):
|
|||
def main(*subprocess_args):
|
||||
from openpype.hosts.photoshop import api
|
||||
|
||||
avalon.api.install(api)
|
||||
install_host(api)
|
||||
sys.excepthook = safe_excepthook
|
||||
|
||||
# coloring in StdOutBroker
|
||||
|
|
|
|||
|
|
@ -3,7 +3,6 @@ from Qt import QtWidgets
|
|||
from bson.objectid import ObjectId
|
||||
|
||||
import pyblish.api
|
||||
import avalon.api
|
||||
from avalon import io
|
||||
|
||||
from openpype.api import Logger
|
||||
|
|
@ -14,6 +13,7 @@ from openpype.pipeline import (
|
|||
deregister_loader_plugin_path,
|
||||
deregister_creator_plugin_path,
|
||||
AVALON_CONTAINER_ID,
|
||||
registered_host,
|
||||
)
|
||||
import openpype.hosts.photoshop
|
||||
|
||||
|
|
@ -33,7 +33,7 @@ def check_inventory():
|
|||
if not lib.any_outdated():
|
||||
return
|
||||
|
||||
host = avalon.api.registered_host()
|
||||
host = registered_host()
|
||||
outdated_containers = []
|
||||
for container in host.ls():
|
||||
representation = container['representation']
|
||||
|
|
|
|||
|
|
@ -0,0 +1,73 @@
|
|||
"""Parses batch context from json and continues in publish process.
|
||||
|
||||
Provides:
|
||||
context -> Loaded batch file.
|
||||
- asset
|
||||
- task (task name)
|
||||
- taskType
|
||||
- project_name
|
||||
- variant
|
||||
|
||||
This code is practically a copy of `openype/hosts/webpublish/collect_batch_data`, as
the webpublisher should eventually be ejected as an addon, i.e. the mentioned plugin
shouldn't be pushed into the general publish plugins.
|
||||
"""
|
||||
|
||||
import os
|
||||
|
||||
import pyblish.api
|
||||
from avalon import io
|
||||
from openpype.lib.plugin_tools import (
|
||||
parse_json,
|
||||
get_batch_asset_task_info
|
||||
)
|
||||
|
||||
|
||||
class CollectBatchData(pyblish.api.ContextPlugin):
|
||||
"""Collect batch data from json stored in 'OPENPYPE_PUBLISH_DATA' env dir.
|
||||
|
||||
The directory must contain 'manifest.json' file where batch data should be
|
||||
stored.
|
||||
"""
|
||||
# must be really early, context values are only in json file
|
||||
order = pyblish.api.CollectorOrder - 0.495
|
||||
label = "Collect batch data"
|
||||
hosts = ["photoshop"]
|
||||
targets = ["remotepublish"]
|
||||
|
||||
def process(self, context):
|
||||
self.log.info("CollectBatchData")
|
||||
batch_dir = os.environ.get("OPENPYPE_PUBLISH_DATA")
|
||||
|
||||
assert batch_dir, (
|
||||
"Missing `OPENPYPE_PUBLISH_DATA`")
|
||||
|
||||
assert os.path.exists(batch_dir), \
|
||||
"Folder {} doesn't exist".format(batch_dir)
|
||||
|
||||
project_name = os.environ.get("AVALON_PROJECT")
|
||||
if project_name is None:
|
||||
raise AssertionError(
|
||||
"Environment `AVALON_PROJECT` was not found."
|
||||
"Could not set project `root` which may cause issues."
|
||||
)
|
||||
|
||||
batch_data = parse_json(os.path.join(batch_dir, "manifest.json"))
|
||||
|
||||
context.data["batchDir"] = batch_dir
|
||||
context.data["batchData"] = batch_data
|
||||
|
||||
asset_name, task_name, task_type = get_batch_asset_task_info(
|
||||
batch_data["context"]
|
||||
)
|
||||
|
||||
os.environ["AVALON_ASSET"] = asset_name
|
||||
io.Session["AVALON_ASSET"] = asset_name
|
||||
os.environ["AVALON_TASK"] = task_name
|
||||
io.Session["AVALON_TASK"] = task_name
|
||||
|
||||
context.data["asset"] = asset_name
|
||||
context.data["task"] = task_name
|
||||
context.data["taskType"] = task_type
|
||||
context.data["project_name"] = project_name
|
||||
context.data["variant"] = batch_data["variant"]
|
||||
|
|
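The new CollectBatchData plugin above stores the parsed batch context on `context.data` (and in the environment), so collectors that run later can read it instead of re-parsing `manifest.json`, which is exactly what the CollectColorCodedInstances change below does. A hypothetical follow-up collector illustrating the pattern, example only and not part of this commit:

import pyblish.api


class CollectUsesBatchContext(pyblish.api.ContextPlugin):
    """Example consumer of the values CollectBatchData put on the context."""

    # Must run after CollectBatchData (order -0.495) so the keys already exist.
    order = pyblish.api.CollectorOrder - 0.49
    label = "Example batch context consumer"
    hosts = ["photoshop"]
    targets = ["remotepublish"]

    def process(self, context):
        asset_name = context.data["asset"]
        task_name = context.data["task"]
        variant = context.data["variant"]
        self.log.info(
            "Batch context: asset=%s task=%s variant=%s",
            asset_name, task_name, variant
        )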
@ -4,7 +4,6 @@ import re
|
|||
import pyblish.api
|
||||
|
||||
from openpype.lib import prepare_template_data
|
||||
from openpype.lib.plugin_tools import parse_json, get_batch_asset_task_info
|
||||
from openpype.hosts.photoshop import api as photoshop
|
||||
|
||||
|
||||
|
|
@ -46,7 +45,10 @@ class CollectColorCodedInstances(pyblish.api.ContextPlugin):
|
|||
|
||||
existing_subset_names = self._get_existing_subset_names(context)
|
||||
|
||||
asset_name, task_name, variant = self._parse_batch(batch_dir)
|
||||
# from CollectBatchData
|
||||
asset_name = context.data["asset"]
|
||||
task_name = context.data["task"]
|
||||
variant = context.data["variant"]
|
||||
|
||||
stub = photoshop.stub()
|
||||
layers = stub.get_layers()
|
||||
|
|
@ -130,25 +132,6 @@ class CollectColorCodedInstances(pyblish.api.ContextPlugin):
|
|||
|
||||
return existing_subset_names
|
||||
|
||||
def _parse_batch(self, batch_dir):
|
||||
"""Parses asset_name, task_name, variant from batch manifest."""
|
||||
task_data = None
|
||||
if batch_dir and os.path.exists(batch_dir):
|
||||
task_data = parse_json(os.path.join(batch_dir,
|
||||
"manifest.json"))
|
||||
if not task_data:
|
||||
raise ValueError(
|
||||
"Cannot parse batch meta in {} folder".format(batch_dir))
|
||||
variant = task_data["variant"]
|
||||
|
||||
asset, task_name, task_type = get_batch_asset_task_info(
|
||||
task_data["context"])
|
||||
|
||||
if not task_name:
|
||||
task_name = task_type
|
||||
|
||||
return asset, task_name, variant
|
||||
|
||||
def _create_instance(self, context, layer, family,
|
||||
asset, subset, task_name):
|
||||
instance = context.create_instance(layer.name)
|
||||
|
|
|
|||
|
|
@ -82,8 +82,9 @@ class CollectInstances(pyblish.api.ContextPlugin):
|
|||
task_name = api.Session["AVALON_TASK"]
|
||||
asset_name = context.data["assetEntity"]["name"]
|
||||
|
||||
variant = context.data.get("variant") or variants[0]
|
||||
fill_pairs = {
|
||||
"variant": variants[0],
|
||||
"variant": variant,
|
||||
"family": family,
|
||||
"task": task_name
|
||||
}
|
||||
|
|
|
|||
|
|
@ -16,7 +16,7 @@ class CollectReview(pyblish.api.ContextPlugin):
|
|||
family = "review"
|
||||
subset = get_subset_name_with_asset_doc(
|
||||
family,
|
||||
"",
|
||||
context.data.get("variant", ''),
|
||||
context.data["anatomyData"]["task"]["name"],
|
||||
context.data["assetEntity"],
|
||||
context.data["anatomyData"]["project"]["name"],
|
||||
|
|
|
|||
|
|
@ -1,13 +1,14 @@
|
|||
#!/usr/bin/env python
|
||||
import os
|
||||
import sys
|
||||
import openpype
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
|
||||
|
||||
def main(env):
|
||||
import openpype.hosts.resolve as bmdvr
|
||||
# Registers openpype's Global pyblish plugins
|
||||
openpype.install()
|
||||
install_host(bmdvr)
|
||||
bmdvr.setup(env)
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -1,8 +1,7 @@
|
|||
import os
|
||||
import sys
|
||||
import avalon.api as avalon
|
||||
import openpype
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.api import Logger
|
||||
|
||||
log = Logger().get_logger(__name__)
|
||||
|
|
@ -10,13 +9,9 @@ log = Logger().get_logger(__name__)
|
|||
|
||||
def main(env):
|
||||
import openpype.hosts.resolve as bmdvr
|
||||
# Registers openpype's Global pyblish plugins
|
||||
openpype.install()
|
||||
|
||||
# activate resolve from openpype
|
||||
avalon.install(bmdvr)
|
||||
|
||||
log.info(f"Avalon registered hosts: {avalon.registered_host()}")
|
||||
install_host(bmdvr)
|
||||
|
||||
bmdvr.launch_pype_menu()
|
||||
|
||||
|
|
|
|||
|
|
@ -1,9 +1,11 @@
|
|||
#! python3
|
||||
import os
|
||||
import sys
|
||||
import avalon.api as avalon
|
||||
import openpype
|
||||
|
||||
import opentimelineio as otio
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
|
||||
from openpype.hosts.resolve import TestGUI
|
||||
import openpype.hosts.resolve as bmdvr
|
||||
from openpype.hosts.resolve.otio import davinci_export as otio_export
|
||||
|
|
@ -14,10 +16,8 @@ class ThisTestGUI(TestGUI):
|
|||
|
||||
def __init__(self):
|
||||
super(ThisTestGUI, self).__init__()
|
||||
# Registers openpype's Global pyblish plugins
|
||||
openpype.install()
|
||||
# activate resolve from openpype
|
||||
avalon.install(bmdvr)
|
||||
install_host(bmdvr)
|
||||
|
||||
def _open_dir_button_pressed(self, event):
|
||||
# selected_path = self.fu.RequestFile(os.path.expanduser("~"))
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
#! python3
|
||||
import os
|
||||
import sys
|
||||
import avalon.api as avalon
|
||||
import openpype
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.resolve import TestGUI
|
||||
import openpype.hosts.resolve as bmdvr
|
||||
import clique
|
||||
|
|
@ -13,10 +13,8 @@ class ThisTestGUI(TestGUI):
|
|||
|
||||
def __init__(self):
|
||||
super(ThisTestGUI, self).__init__()
|
||||
# Registers openpype's Global pyblish plugins
|
||||
openpype.install()
|
||||
# activate resolve from openpype
|
||||
avalon.install(bmdvr)
|
||||
install_host(bmdvr)
|
||||
|
||||
def _open_dir_button_pressed(self, event):
|
||||
# selected_path = self.fu.RequestFile(os.path.expanduser("~"))
|
||||
|
|
|
|||
|
|
@ -1,6 +1,5 @@
|
|||
#! python3
|
||||
import avalon.api as avalon
|
||||
import openpype
|
||||
from openpype.pipeline import install_host
|
||||
import openpype.hosts.resolve as bmdvr
|
||||
|
||||
|
||||
|
|
@ -15,8 +14,7 @@ def file_processing(fpath):
|
|||
if __name__ == "__main__":
|
||||
path = "C:/CODE/__openpype_projects/jtest03dev/shots/sq01/mainsq01sh030/publish/plate/plateMain/v006/jt3d_mainsq01sh030_plateMain_v006.0996.exr"
|
||||
|
||||
openpype.install()
|
||||
# activate resolve from openpype
|
||||
avalon.install(bmdvr)
|
||||
install_host(bmdvr)
|
||||
|
||||
file_processing(path)
|
||||
file_processing(path)
|
||||
|
|
|
|||
|
|
@ -48,8 +48,8 @@ from openpype.tools.publisher.window import PublisherWindow
|
|||
|
||||
def main():
|
||||
"""Main function for testing purposes."""
|
||||
import avalon.api
|
||||
import pyblish.api
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.modules import ModulesManager
|
||||
from openpype.hosts.testhost import api as testhost
|
||||
|
||||
|
|
@ -57,7 +57,7 @@ def main():
|
|||
for plugin_path in manager.collect_plugin_paths()["publish"]:
|
||||
pyblish.api.register_plugin_path(plugin_path)
|
||||
|
||||
avalon.api.install(testhost)
|
||||
install_host(testhost)
|
||||
|
||||
QtWidgets.QApplication.setAttribute(QtCore.Qt.AA_EnableHighDpiScaling)
|
||||
app = QtWidgets.QApplication([])
|
||||
|
|
|
|||
|
|
@ -8,8 +8,8 @@ import logging
|
|||
|
||||
from Qt import QtWidgets, QtCore, QtGui
|
||||
|
||||
from avalon import api
|
||||
from openpype import style
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.tvpaint.api.communication_server import (
|
||||
CommunicationWrapper
|
||||
)
|
||||
|
|
@ -31,7 +31,7 @@ def main(launch_args):
|
|||
qt_app = QtWidgets.QApplication([])
|
||||
|
||||
# Execute pipeline installation
|
||||
api.install(tvpaint_host)
|
||||
install_host(tvpaint_host)
|
||||
|
||||
# Create Communicator object and trigger launch
|
||||
# - this must be done before anything is processed
|
||||
|
|
|
|||
|
|
@ -67,11 +67,8 @@ instances=2
|
|||
|
||||
|
||||
def install():
|
||||
"""Install Maya-specific functionality of avalon-core.
|
||||
"""Install TVPaint-specific functionality."""
|
||||
|
||||
This function is called automatically on calling `api.install(maya)`.
|
||||
|
||||
"""
|
||||
log.info("OpenPype - Installing TVPaint integration")
|
||||
io.install()
|
||||
|
||||
|
|
@ -96,11 +93,11 @@ def install():
|
|||
|
||||
|
||||
def uninstall():
|
||||
"""Uninstall TVPaint-specific functionality of avalon-core.
|
||||
|
||||
This function is called automatically on calling `api.uninstall()`.
|
||||
"""Uninstall TVPaint-specific functionality.
|
||||
|
||||
This function is called automatically on calling `uninstall_host()`.
|
||||
"""
|
||||
|
||||
log.info("OpenPype - Uninstalling TVPaint integration")
|
||||
pyblish.api.deregister_host("tvpaint")
|
||||
pyblish.api.deregister_plugin_path(PUBLISH_PATH)
|
||||
|
|
|
|||
|
|
@ -1,12 +1,13 @@
|
|||
import os
|
||||
|
||||
from avalon import api, io
|
||||
from avalon import io
|
||||
from openpype.lib import (
|
||||
StringTemplate,
|
||||
get_workfile_template_key_from_context,
|
||||
get_workdir_data,
|
||||
get_last_workfile_with_version,
|
||||
)
|
||||
from openpype.pipeline import registered_host
|
||||
from openpype.api import Anatomy
|
||||
from openpype.hosts.tvpaint.api import lib, pipeline, plugin
|
||||
|
||||
|
|
@ -22,7 +23,7 @@ class LoadWorkfile(plugin.Loader):
|
|||
def load(self, context, name, namespace, options):
|
||||
# Load context of current workfile as first thing
|
||||
# - which context and extension has
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
current_file = host.current_file()
|
||||
|
||||
context = pipeline.get_current_workfile_context()
|
||||
|
|
|
|||
|
|
@ -10,6 +10,7 @@ from openpype.pipeline import (
|
|||
class Creator(LegacyCreator):
|
||||
"""This serves as skeleton for future OpenPype specific functionality"""
|
||||
defaults = ['Main']
|
||||
maintain_selection = False
|
||||
|
||||
|
||||
class Loader(LoaderPlugin, ABC):
|
||||
|
|
|
|||
|
|
@ -2,13 +2,7 @@ import unreal
|
|||
|
||||
openpype_detected = True
|
||||
try:
|
||||
from avalon import api
|
||||
except ImportError as exc:
|
||||
api = None
|
||||
openpype_detected = False
|
||||
unreal.log_error("Avalon: cannot load Avalon [ {} ]".format(exc))
|
||||
|
||||
try:
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.unreal import api as openpype_host
|
||||
except ImportError as exc:
|
||||
openpype_host = None
|
||||
|
|
@ -16,7 +10,7 @@ except ImportError as exc:
|
|||
unreal.log_error("OpenPype: cannot load OpenPype [ {} ]".format(exc))
|
||||
|
||||
if openpype_detected:
|
||||
api.install(openpype_host)
|
||||
install_host(openpype_host)
|
||||
|
||||
|
||||
@unreal.uclass()
|
||||
|
|
|
|||
|
|
@ -1,7 +1,6 @@
|
|||
import os
|
||||
import logging
|
||||
|
||||
from avalon import api as avalon
|
||||
from avalon import io
|
||||
from pyblish import api as pyblish
|
||||
import openpype.hosts.webpublisher
|
||||
|
|
|
|||
|
|
@ -1,7 +1,12 @@
|
|||
"""Loads batch context from json and continues in publish process.
|
||||
"""Parses batch context from json and continues in publish process.
|
||||
|
||||
Provides:
|
||||
context -> Loaded batch file.
|
||||
- asset
|
||||
- task (task name)
|
||||
- taskType
|
||||
- project_name
|
||||
- variant
|
||||
"""
|
||||
|
||||
import os
|
||||
|
|
@ -24,7 +29,7 @@ class CollectBatchData(pyblish.api.ContextPlugin):
|
|||
# must be really early, context values are only in json file
|
||||
order = pyblish.api.CollectorOrder - 0.495
|
||||
label = "Collect batch data"
|
||||
host = ["webpublisher"]
|
||||
hosts = ["webpublisher"]
|
||||
|
||||
def process(self, context):
|
||||
batch_dir = os.environ.get("OPENPYPE_PUBLISH_DATA")
|
||||
|
|
@ -60,6 +65,7 @@ class CollectBatchData(pyblish.api.ContextPlugin):
|
|||
context.data["task"] = task_name
|
||||
context.data["taskType"] = task_type
|
||||
context.data["project_name"] = project_name
|
||||
context.data["variant"] = batch_data["variant"]
|
||||
|
||||
self._set_ctx_path(batch_data)
|
||||
|
||||
|
|
|
|||
|
|
@ -40,7 +40,7 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
|
|||
# must be really early, context values are only in json file
|
||||
order = pyblish.api.CollectorOrder - 0.490
|
||||
label = "Collect rendered frames"
|
||||
host = ["webpublisher"]
|
||||
hosts = ["webpublisher"]
|
||||
targets = ["filespublish"]
|
||||
|
||||
# from Settings
|
||||
|
|
@ -61,6 +61,7 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
|
|||
task_name = context.data["task"]
|
||||
task_type = context.data["taskType"]
|
||||
project_name = context.data["project_name"]
|
||||
variant = context.data["variant"]
|
||||
for task_dir in task_subfolders:
|
||||
task_data = parse_json(os.path.join(task_dir,
|
||||
"manifest.json"))
|
||||
|
|
@ -76,7 +77,7 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
|
|||
extension.replace(".", ''))
|
||||
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
family, task_data["variant"], task_name, asset_doc,
|
||||
family, variant, task_name, asset_doc,
|
||||
project_name=project_name, host_name="webpublisher"
|
||||
)
|
||||
version = self._get_last_version(asset_name, subset_name) + 1
|
||||
|
|
|
|||
|
|
@ -8,7 +8,7 @@ from openpype.lib import (
|
|||
run_subprocess,
|
||||
|
||||
get_transcode_temp_directory,
|
||||
convert_for_ffmpeg,
|
||||
convert_input_paths_for_ffmpeg,
|
||||
should_convert_for_ffmpeg
|
||||
)
|
||||
|
||||
|
|
@ -59,11 +59,9 @@ class ExtractThumbnail(pyblish.api.InstancePlugin):
|
|||
if do_convert:
|
||||
convert_dir = get_transcode_temp_directory()
|
||||
filename = os.path.basename(full_input_path)
|
||||
convert_for_ffmpeg(
|
||||
full_input_path,
|
||||
convert_input_paths_for_ffmpeg(
|
||||
[full_input_path],
|
||||
convert_dir,
|
||||
None,
|
||||
None,
|
||||
self.log
|
||||
)
|
||||
full_input_path = os.path.join(convert_dir, filename)
|
||||
|
|
|
|||
|
|
@ -105,6 +105,7 @@ from .transcoding import (
|
|||
get_transcode_temp_directory,
|
||||
should_convert_for_ffmpeg,
|
||||
convert_for_ffmpeg,
|
||||
convert_input_paths_for_ffmpeg,
|
||||
get_ffprobe_data,
|
||||
get_ffprobe_streams,
|
||||
get_ffmpeg_codec_args,
|
||||
|
|
@ -276,6 +277,7 @@ __all__ = [
|
|||
"get_transcode_temp_directory",
|
||||
"should_convert_for_ffmpeg",
|
||||
"convert_for_ffmpeg",
|
||||
"convert_input_paths_for_ffmpeg",
|
||||
"get_ffprobe_data",
|
||||
"get_ffprobe_streams",
|
||||
"get_ffmpeg_codec_args",
|
||||
|
|
|
|||
|
|
@ -161,9 +161,10 @@ def is_latest(representation):
|
|||
@with_avalon
|
||||
def any_outdated():
|
||||
"""Return whether the current scene has any outdated content"""
|
||||
from openpype.pipeline import registered_host
|
||||
|
||||
checked = set()
|
||||
host = avalon.api.registered_host()
|
||||
host = registered_host()
|
||||
for container in host.ls():
|
||||
representation = container['representation']
|
||||
if representation in checked:
|
||||
|
|
|
|||
|
|
@ -44,12 +44,6 @@ def _profile_exclusion(matching_profiles, logger):
|
|||
Returns:
|
||||
dict: Most matching profile.
|
||||
"""
|
||||
|
||||
logger.info(
|
||||
"Search for first most matching profile in match order:"
|
||||
" Host name -> Task name -> Family."
|
||||
)
|
||||
|
||||
if not matching_profiles:
|
||||
return None
|
||||
|
||||
|
|
@ -168,6 +162,15 @@ def filter_profiles(profiles_data, key_values, keys_order=None, logger=None):
|
|||
_keys_order.append(key)
|
||||
keys_order = tuple(_keys_order)
|
||||
|
||||
log_parts = " | ".join([
|
||||
"{}: \"{}\"".format(*item)
|
||||
for item in key_values.items()
|
||||
])
|
||||
|
||||
logger.info(
|
||||
"Looking for matching profile for: {}".format(log_parts)
|
||||
)
|
||||
|
||||
matching_profiles = None
|
||||
highest_profile_points = -1
|
||||
# Each profile get 1 point for each matching filter. Profile with most
|
||||
|
|
@ -205,11 +208,6 @@ def filter_profiles(profiles_data, key_values, keys_order=None, logger=None):
|
|||
if profile_points == highest_profile_points:
|
||||
matching_profiles.append((profile, profile_scores))
|
||||
|
||||
log_parts = " | ".join([
|
||||
"{}: \"{}\"".format(*item)
|
||||
for item in key_values.items()
|
||||
])
|
||||
|
||||
if not matching_profiles:
|
||||
logger.info(
|
||||
"None of profiles match your setup. {}".format(log_parts)
|
||||
|
|
@ -221,4 +219,9 @@ def filter_profiles(profiles_data, key_values, keys_order=None, logger=None):
|
|||
"More than one profile match your setup. {}".format(log_parts)
|
||||
)
|
||||
|
||||
return _profile_exclusion(matching_profiles, logger)
|
||||
profile = _profile_exclusion(matching_profiles, logger)
|
||||
if profile:
|
||||
logger.info(
|
||||
"Profile selected: {}".format(profile)
|
||||
)
|
||||
return profile
|
||||
|
|
|
|||
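The change above moves the "Looking for matching profile for: ..." log in front of the matching loop and adds a "Profile selected: ..." log once `_profile_exclusion` has picked a winner. A hedged usage sketch of `filter_profiles` with made-up profile data; the signature comes from the hunk header above, while the module path and the profile keys are assumptions not confirmed by this diff:

import logging

# Assumed import location of filter_profiles inside openpype.lib.
from openpype.lib.profiles_filtering import filter_profiles

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("profiles")

# Hypothetical profiles: each key holds the values the profile applies to,
# an empty list meaning "match anything".
profiles = [
    {"hosts": ["maya"], "task_types": ["Animation"], "template": "maya_anim"},
    {"hosts": [], "task_types": [], "template": "fallback"},
]

profile = filter_profiles(
    profiles,
    {"hosts": "maya", "task_types": "Animation"},
    logger=log,
)
# With this change the log now shows both the lookup values and the selected profile.
print(profile)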
|
|
@ -1,13 +1,12 @@
|
|||
import os
|
||||
from datetime import datetime
|
||||
import sys
|
||||
from bson.objectid import ObjectId
|
||||
import collections
|
||||
|
||||
from bson.objectid import ObjectId
|
||||
|
||||
import pyblish.util
|
||||
import pyblish.api
|
||||
|
||||
from openpype import uninstall
|
||||
from openpype.lib.mongo import OpenPypeMongoConnection
|
||||
from openpype.lib.plugin_tools import parse_json
|
||||
|
||||
|
|
@ -81,7 +80,6 @@ def publish(log, close_plugin_name=None):
|
|||
|
||||
if result["error"]:
|
||||
log.error(error_format.format(**result))
|
||||
uninstall()
|
||||
if close_plugin: # close host app explicitly after error
|
||||
context = pyblish.api.Context()
|
||||
close_plugin().process(context)
|
||||
|
|
@ -118,7 +116,6 @@ def publish_and_log(dbcon, _id, log, close_plugin_name=None, batch_id=None):
|
|||
|
||||
if result["error"]:
|
||||
log.error(error_format.format(**result))
|
||||
uninstall()
|
||||
log_lines = [error_format.format(**result)] + log_lines
|
||||
dbcon.update_one(
|
||||
{"_id": _id},
|
||||
|
|
|
|||
|
|
@ -382,6 +382,11 @@ def should_convert_for_ffmpeg(src_filepath):
|
|||
return False
|
||||
|
||||
|
||||
# Deprecated since 2022-04-20
# - Reason: it doesn't convert sequences the right way - it can't handle gaps,
#   it reuses the first frame for all frames and it changes filenames when the
#   input is a sequence.
# - use 'convert_input_paths_for_ffmpeg' instead
|
||||
def convert_for_ffmpeg(
|
||||
first_input_path,
|
||||
output_dir,
|
||||
|
|
@ -409,6 +414,12 @@ def convert_for_ffmpeg(
|
|||
if logger is None:
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
logger.warning((
|
||||
"DEPRECATED: 'openpype.lib.transcoding.convert_for_ffmpeg' is"
|
||||
" deprecated function of conversion for FFMpeg. Please replace usage"
|
||||
" with 'openpype.lib.transcoding.convert_input_paths_for_ffmpeg'"
|
||||
))
|
||||
|
||||
ext = os.path.splitext(first_input_path)[1].lower()
|
||||
if ext != ".exr":
|
||||
raise ValueError((
|
||||
|
|
@ -516,6 +527,130 @@ def convert_for_ffmpeg(
|
|||
run_subprocess(oiio_cmd, logger=logger)
|
||||
|
||||
|
||||
def convert_input_paths_for_ffmpeg(
|
||||
input_paths,
|
||||
output_dir,
|
||||
logger=None
|
||||
):
"""Convert source files to a format supported by ffmpeg.
|
||||
|
||||
Currently can convert only exrs. The input filepaths should be files
|
||||
with same type. Information about input is loaded only from first found
|
||||
file.
|
||||
|
||||
Filenames of input files are kept so make sure that output directory
|
||||
is not the same directory as input files have.
|
||||
- This way it can handle gaps and can keep input filenames without handling
|
||||
frame template
|
||||
|
||||
Args:
|
||||
input_paths (str): Paths that should be converted. It is expected that
|
||||
they contain a single file or an image sequence of the same type.
|
||||
output_dir (str): Path to directory where output will be rendered.
|
||||
Must not be same as input's directory.
|
||||
logger (logging.Logger): Logger used for logging.
|
||||
|
||||
Raises:
|
||||
ValueError: If input filepath has extension not supported by function.
|
||||
Currently is supported only ".exr" extension.
|
||||
"""
|
||||
if logger is None:
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
first_input_path = input_paths[0]
|
||||
ext = os.path.splitext(first_input_path)[1].lower()
|
||||
if ext != ".exr":
|
||||
raise ValueError((
|
||||
"Function 'convert_for_ffmpeg' currently support only"
|
||||
" \".exr\" extension. Got \"{}\"."
|
||||
).format(ext))
|
||||
|
||||
input_info = get_oiio_info_for_input(first_input_path)
|
||||
|
||||
# Change compression only if source compression is "dwaa" or "dwab"
|
||||
# - they're not supported in ffmpeg
|
||||
compression = input_info["attribs"].get("compression")
|
||||
if compression in ("dwaa", "dwab"):
|
||||
compression = "none"
|
||||
|
||||
# Collect channels to export
|
||||
channel_names = input_info["channelnames"]
|
||||
review_channels = get_convert_rgb_channels(channel_names)
|
||||
if review_channels is None:
|
||||
raise ValueError(
|
||||
"Couldn't find channels that can be used for conversion."
|
||||
)
|
||||
|
||||
red, green, blue, alpha = review_channels
|
||||
input_channels = [red, green, blue]
|
||||
channels_arg = "R={},G={},B={}".format(red, green, blue)
|
||||
if alpha is not None:
|
||||
channels_arg += ",A={}".format(alpha)
|
||||
input_channels.append(alpha)
|
||||
input_channels_str = ",".join(input_channels)
|
||||
|
||||
for input_path in input_paths:
|
||||
# Prepare subprocess arguments
|
||||
oiio_cmd = [
|
||||
get_oiio_tools_path(),
|
||||
|
||||
# Don't add any additional attributes
|
||||
"--nosoftwareattrib",
|
||||
]
|
||||
# Add input compression if available
|
||||
if compression:
|
||||
oiio_cmd.extend(["--compression", compression])
|
||||
|
||||
oiio_cmd.extend([
|
||||
# Tell oiiotool which channels should be loaded
|
||||
# - other channels are not loaded to memory so helps to
|
||||
# avoid memory leak issues
|
||||
"-i:ch={}".format(input_channels_str), input_path,
|
||||
# Tell oiiotool which channels should be put to top stack
|
||||
# (and output)
|
||||
"--ch", channels_arg
|
||||
])
|
||||
|
||||
for attr_name, attr_value in input_info["attribs"].items():
|
||||
if not isinstance(attr_value, str):
|
||||
continue
|
||||
|
||||
# Remove attributes that have string value longer than allowed
|
||||
# length for ffmpeg or when they contain disallowed symbols
|
||||
erase_reason = "Missing reason"
|
||||
erase_attribute = False
|
||||
if len(attr_value) > MAX_FFMPEG_STRING_LEN:
erase_reason = "has too long value ({} chars).".format(
len(attr_value)
)
erase_attribute = True

if not erase_attribute:
|
||||
for char in NOT_ALLOWED_FFMPEG_CHARS:
|
||||
if char in attr_value:
|
||||
erase_attribute = True
|
||||
erase_reason = (
|
||||
"contains unsupported character \"{}\"."
|
||||
).format(char)
|
||||
break
|
||||
|
||||
if erase_attribute:
|
||||
# Set attribute to empty string
|
||||
logger.info((
|
||||
"Removed attribute \"{}\" from metadata because {}."
|
||||
).format(attr_name, erase_reason))
|
||||
oiio_cmd.extend(["--eraseattrib", attr_name])
|
||||
|
||||
# Add last argument - path to output
|
||||
base_filename = os.path.basename(input_path)
|
||||
output_path = os.path.join(output_dir, base_filename)
|
||||
oiio_cmd.extend([
|
||||
"-o", output_path
|
||||
])
|
||||
|
||||
logger.debug("Conversion command: {}".format(" ".join(oiio_cmd)))
|
||||
run_subprocess(oiio_cmd, logger=logger)
|
||||
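# Example usage sketch (illustrative paths, not code from this module):
#   from openpype.lib import (
#       get_transcode_temp_directory,
#       convert_input_paths_for_ffmpeg,
#   )
#
#   exr_paths = [
#       "/renders/shot010_beauty.1001.exr",
#       "/renders/shot010_beauty.1002.exr",
#   ]
#   temp_dir = get_transcode_temp_directory()
#   # Output keeps the source filenames, so it must be a different directory
#   convert_input_paths_for_ffmpeg(exr_paths, temp_dir, logger=None)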
|
||||
|
||||
# FFMPEG functions
|
||||
def get_ffprobe_data(path_to_file, logger=None):
|
||||
"""Load data about entered filepath via ffprobe.
|
||||
|
|
|
|||
|
|
@ -9,6 +9,7 @@ except ImportError:
|
|||
from mvpxr import Usd, UsdGeom, Sdf, Kind
|
||||
|
||||
from avalon import io, api
|
||||
from openpype.pipeline import registered_root
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
|
@ -323,7 +324,7 @@ def get_usd_master_path(asset, subset, representation):
|
|||
|
||||
path = template.format(
|
||||
**{
|
||||
"root": api.registered_root(),
|
||||
"root": registered_root(),
|
||||
"project": api.Session["AVALON_PROJECT"],
|
||||
"asset": asset_doc["name"],
|
||||
"subset": subset,
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
import os
|
||||
import openpype
|
||||
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype_interfaces import ITrayModule
|
||||
|
||||
|
|
@ -26,7 +26,8 @@ class AvalonModule(OpenPypeModule, ITrayModule):
|
|||
self.avalon_mongo_timeout = avalon_mongo_timeout
|
||||
|
||||
# Tray attributes
|
||||
self.libraryloader = None
|
||||
self._library_loader_imported = None
|
||||
self._library_loader_window = None
|
||||
self.rest_api_obj = None
|
||||
|
||||
def get_global_environments(self):
|
||||
|
|
@ -41,21 +42,11 @@ class AvalonModule(OpenPypeModule, ITrayModule):
|
|||
|
||||
def tray_init(self):
|
||||
# Add library tool
|
||||
self._library_loader_imported = False
|
||||
try:
|
||||
from Qt import QtCore
|
||||
from openpype.tools.libraryloader import LibraryLoaderWindow
|
||||
|
||||
libraryloader = LibraryLoaderWindow(
|
||||
show_projects=True,
|
||||
show_libraries=True
|
||||
)
|
||||
# Remove always on top flag for tray
|
||||
window_flags = libraryloader.windowFlags()
|
||||
if window_flags | QtCore.Qt.WindowStaysOnTopHint:
|
||||
window_flags ^= QtCore.Qt.WindowStaysOnTopHint
|
||||
libraryloader.setWindowFlags(window_flags)
|
||||
self.libraryloader = libraryloader
|
||||
|
||||
self._library_loader_imported = True
|
||||
except Exception:
|
||||
self.log.warning(
|
||||
"Couldn't load Library loader tool for tray.",
|
||||
|
|
@ -64,7 +55,7 @@ class AvalonModule(OpenPypeModule, ITrayModule):
|
|||
|
||||
# Definition of Tray menu
|
||||
def tray_menu(self, tray_menu):
|
||||
if self.libraryloader is None:
|
||||
if not self._library_loader_imported:
|
||||
return
|
||||
|
||||
from Qt import QtWidgets
|
||||
|
|
@ -84,17 +75,31 @@ class AvalonModule(OpenPypeModule, ITrayModule):
|
|||
return
|
||||
|
||||
def show_library_loader(self):
|
||||
if self.libraryloader is None:
|
||||
return
|
||||
if self._library_loader_window is None:
|
||||
from Qt import QtCore
|
||||
from openpype.tools.libraryloader import LibraryLoaderWindow
|
||||
from openpype.pipeline import install_openpype_plugins
|
||||
|
||||
self.libraryloader.show()
|
||||
libraryloader = LibraryLoaderWindow(
|
||||
show_projects=True,
|
||||
show_libraries=True
|
||||
)
|
||||
# Remove always on top flag for tray
|
||||
window_flags = libraryloader.windowFlags()
|
||||
if window_flags | QtCore.Qt.WindowStaysOnTopHint:
|
||||
window_flags ^= QtCore.Qt.WindowStaysOnTopHint
|
||||
libraryloader.setWindowFlags(window_flags)
|
||||
self._library_loader_window = libraryloader
|
||||
|
||||
install_openpype_plugins()
|
||||
|
||||
self._library_loader_window.show()
|
||||
|
||||
# Raise and activate the window
|
||||
# for MacOS
|
||||
self.libraryloader.raise_()
|
||||
self._library_loader_window.raise_()
|
||||
# for Windows
|
||||
self.libraryloader.activateWindow()
|
||||
self.libraryloader.refresh()
|
||||
self._library_loader_window.activateWindow()
|
||||
|
||||
# Webserver module implementation
|
||||
def webserver_initialization(self, server_manager):
|
||||
|
|
|
|||
|
|
@ -1,12 +1,14 @@
|
|||
from avalon import api, io
|
||||
from avalon import io
|
||||
|
||||
from openpype.api import Logger
|
||||
from openpype.pipeline import LauncherAction
|
||||
from openpype_modules.clockify.clockify_api import ClockifyAPI
|
||||
|
||||
|
||||
log = Logger().get_logger(__name__)
|
||||
log = Logger.get_logger(__name__)
|
||||
|
||||
|
||||
class ClockifyStart(api.Action):
|
||||
class ClockifyStart(LauncherAction):
|
||||
|
||||
name = "clockify_start_timer"
|
||||
label = "Clockify - Start Timer"
|
||||
|
|
|
|||
|
|
@ -1,10 +1,13 @@
|
|||
from avalon import api, io
|
||||
from avalon import io
|
||||
|
||||
from openpype_modules.clockify.clockify_api import ClockifyAPI
|
||||
from openpype.api import Logger
|
||||
log = Logger().get_logger(__name__)
|
||||
from openpype.pipeline import LauncherAction
|
||||
|
||||
log = Logger.get_logger(__name__)
|
||||
|
||||
|
||||
class ClockifySync(api.Action):
|
||||
class ClockifySync(LauncherAction):
|
||||
|
||||
name = "sync_to_clockify"
|
||||
label = "Sync to Clockify"
|
||||
|
|
|
|||
|
|
@ -524,6 +524,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
|
|||
|
||||
"""
|
||||
representations = []
|
||||
host_name = os.environ.get("AVALON_APP", "")
|
||||
collections, remainders = clique.assemble(exp_files)
|
||||
|
||||
# create representation for every collected sequence
|
||||
|
|
@ -541,7 +542,6 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
|
|||
preview = True
|
||||
else:
|
||||
render_file_name = list(collection)[0]
|
||||
host_name = os.environ.get("AVALON_APP", "")
|
||||
# if filtered aov name is found in filename, toggle it for
|
||||
# preview video rendering
|
||||
preview = match_aov_pattern(
|
||||
|
|
@ -610,12 +610,16 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
|
|||
"files": os.path.basename(remainder),
|
||||
"stagingDir": os.path.dirname(remainder),
|
||||
}
|
||||
if "render" in instance.get("families"):
|
||||
|
||||
preview = match_aov_pattern(
|
||||
host_name, self.aov_filter, remainder
|
||||
)
|
||||
if preview:
|
||||
rep.update({
|
||||
"fps": instance.get("fps"),
|
||||
"tags": ["review"]
|
||||
})
|
||||
self._solve_families(instance, True)
|
||||
self._solve_families(instance, preview)
|
||||
|
||||
already_there = False
|
||||
for repre in instance.get("representations", []):
|
||||
|
|
|
|||
|
|
@ -34,6 +34,7 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):
|
|||
self.log.warning("No profiles present for adding Ftrack family")
|
||||
return
|
||||
|
||||
add_ftrack_family = False
|
||||
task_name = instance.data.get("task",
|
||||
avalon.api.Session["AVALON_TASK"])
|
||||
host_name = avalon.api.Session["AVALON_APP"]
|
||||
|
|
@ -53,6 +54,8 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):
|
|||
|
||||
additional_filters = profile.get("advanced_filtering")
|
||||
if additional_filters:
|
||||
self.log.info("'{}' families used for additional filtering".
|
||||
format(families))
|
||||
add_ftrack_family = self._get_add_ftrack_f_from_addit_filters(
|
||||
additional_filters,
|
||||
families,
|
||||
|
|
@ -69,6 +72,13 @@ class CollectFtrackFamily(pyblish.api.InstancePlugin):
|
|||
else:
|
||||
instance.data["families"] = ["ftrack"]
|
||||
|
||||
result_str = "Adding"
|
||||
if not add_ftrack_family:
|
||||
result_str = "Not adding"
|
||||
self.log.info("{} 'ftrack' family for instance with '{}'".format(
|
||||
result_str, family
|
||||
))
|
||||
|
||||
def _get_add_ftrack_f_from_addit_filters(self,
|
||||
additional_filters,
|
||||
families,
|
||||
|
|
|
|||
|
|
@ -4,7 +4,7 @@ from datetime import datetime
|
|||
import threading
|
||||
import platform
|
||||
import copy
|
||||
from collections import deque
|
||||
from collections import deque, defaultdict
|
||||
|
||||
from avalon.api import AvalonMongoDB
|
||||
|
||||
|
|
@ -157,7 +157,6 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
|
|||
representation_id,
|
||||
site_name=site_name, force=force)
|
||||
|
||||
# public facing API
|
||||
def remove_site(self, collection, representation_id, site_name,
|
||||
remove_local_files=False):
|
||||
"""
|
||||
|
|
@ -184,6 +183,151 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
|
|||
if remove_local_files:
|
||||
self._remove_local_file(collection, representation_id, site_name)
|
||||
|
||||
def compute_resource_sync_sites(self, project_name):
|
||||
"""Get available resource sync sites state for publish process.
|
||||
|
||||
Returns dict with prepared state of sync sites for 'project_name'.
|
||||
It checks if Site Sync is enabled, handles alternative sites.
|
||||
Publish process stores this dictionary as a part of representation
|
||||
document in DB.
|
||||
|
||||
Example:
|
||||
[
|
||||
{
|
||||
'name': '42abbc09-d62a-44a4-815c-a12cd679d2d7',
|
||||
'created_dt': datetime.datetime(2022, 3, 30, 12, 16, 9, 778637)
|
||||
},
|
||||
{'name': 'studio'},
|
||||
{'name': 'SFTP'}
|
||||
] -- representation is published locally, artist or Settings have set
|
||||
remote site as 'studio'. 'SFTP' is an alternative site to 'studio'. E.g.
|
||||
whenever a file is on 'studio', it is also on 'SFTP'.
|
||||
"""
|
||||
|
||||
def create_metadata(name, created=True):
|
||||
"""Create sync site metadata for site with `name`"""
|
||||
metadata = {"name": name}
|
||||
if created:
|
||||
metadata["created_dt"] = datetime.now()
|
||||
return metadata
|
||||
|
||||
if (
|
||||
not self.sync_system_settings["enabled"] or
|
||||
not self.sync_project_settings[project_name]["enabled"]):
|
||||
return [create_metadata(self.DEFAULT_SITE)]
|
||||
|
||||
local_site = self.get_active_site(project_name)
|
||||
remote_site = self.get_remote_site(project_name)
|
||||
|
||||
# Attached sites metadata by site name
|
||||
# That is the local site, remote site, the always accessible sites
|
||||
# and their alternate sites (alias of sites with different protocol)
|
||||
attached_sites = dict()
|
||||
attached_sites[local_site] = create_metadata(local_site)
|
||||
|
||||
if remote_site and remote_site not in attached_sites:
|
||||
attached_sites[remote_site] = create_metadata(remote_site,
|
||||
created=False)
|
||||
|
||||
attached_sites = self._add_alternative_sites(attached_sites)
|
||||
# add skeleton for sites where it should be always synced to
|
||||
# usually it would be a backup site which is handled by separate
|
||||
# background process
|
||||
for site in self._get_always_accessible_sites(project_name):
|
||||
if site not in attached_sites:
|
||||
attached_sites[site] = create_metadata(site, created=False)
|
||||
|
||||
return list(attached_sites.values())
|
||||
|
||||
def _get_always_accessible_sites(self, project_name):
|
||||
"""Sites that synced to as a part of background process.
|
||||
|
||||
The artist's machine doesn't handle those; an explicit Tray with that site name
|
||||
as a local id must be running.
|
||||
An example is a Dropbox site serving as a backup solution.
|
||||
"""
|
||||
always_accessible_sites = (
|
||||
self.get_sync_project_setting(project_name)["config"].
|
||||
get("always_accessible_on", [])
|
||||
)
|
||||
return [site.strip() for site in always_accessible_sites]
|
||||
|
||||
def _add_alternative_sites(self, attached_sites):
|
||||
"""Add skeleton document for alternative sites
|
||||
|
||||
Each new configured site in System Settings could serve as an alternative
|
||||
site, it's a kind of alias. It means that files on 'a site' are
|
||||
also physically accessible on 'an alternative' site.
|
||||
An example is an SFTP site serving studio files via the SFTP protocol; physically the
|
||||
file is only in the studio, and the SFTP server has this location mounted.
|
||||
"""
|
||||
additional_sites = self.sync_system_settings.get("sites", {})
|
||||
|
||||
alt_site_pairs = self._get_alt_site_pairs(additional_sites)
|
||||
|
||||
for site_name in additional_sites.keys():
|
||||
# Get alternate sites (stripped names) for this site name
|
||||
alt_sites = alt_site_pairs.get(site_name)
|
||||
alt_sites = [site.strip() for site in alt_sites]
|
||||
alt_sites = set(alt_sites)
|
||||
|
||||
# If no alternative sites we don't need to add
|
||||
if not alt_sites:
|
||||
continue
|
||||
|
||||
# Take a copy of data of the first alternate site that is already
|
||||
# defined as an attached site to match the same state.
|
||||
match_meta = next((attached_sites[site] for site in alt_sites
|
||||
if site in attached_sites), None)
|
||||
if not match_meta:
|
||||
continue
|
||||
|
||||
alt_site_meta = copy.deepcopy(match_meta)
|
||||
alt_site_meta["name"] = site_name
|
||||
|
||||
# Note: We change mutable `attached_site` dict in-place
|
||||
attached_sites[site_name] = alt_site_meta
|
||||
|
||||
return attached_sites
|
||||
|
||||
def _get_alt_site_pairs(self, conf_sites):
|
||||
"""Returns dict of site and its alternative sites.
|
||||
|
||||
If `site` has an alternative site, it means that alt_site has `site` as an
|
||||
alternative site too.
|
||||
Args:
|
||||
conf_sites (dict)
|
||||
Returns:
|
||||
(dict): {'site': [alternative sites]...}
|
||||
"""
|
||||
alt_site_pairs = defaultdict(set)
|
||||
for site_name, site_info in conf_sites.items():
|
||||
alt_sites = set(site_info.get("alternative_sites", []))
|
||||
alt_site_pairs[site_name].update(alt_sites)
|
||||
|
||||
for alt_site in alt_sites:
|
||||
alt_site_pairs[alt_site].add(site_name)
|
||||
|
||||
for site_name, alt_sites in alt_site_pairs.items():
|
||||
sites_queue = deque(alt_sites)
|
||||
while sites_queue:
|
||||
alt_site = sites_queue.popleft()
|
||||
|
||||
# safety against wrong config
|
||||
# {"SFTP": {"alternative_site": "SFTP"}
|
||||
if alt_site == site_name or alt_site not in alt_site_pairs:
|
||||
continue
|
||||
|
||||
for alt_alt_site in alt_site_pairs[alt_site]:
|
||||
if (
|
||||
alt_alt_site != site_name
|
||||
and alt_alt_site not in alt_sites
|
||||
):
|
||||
alt_sites.add(alt_alt_site)
|
||||
sites_queue.append(alt_alt_site)
|
||||
|
||||
return alt_site_pairs
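# Example (taken from the unit test added in this commit):
#   conf_sites = {"SFTP": {"alternative_sites": ["studio"]},
#                 "studio2": {"alternative_sites": ["studio"]}}
#   _get_alt_site_pairs(conf_sites) == {"SFTP": {"studio", "studio2"},
#                                       "studio": {"SFTP", "studio2"},
#                                       "studio2": {"studio", "SFTP"}}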
|
||||
|
||||
def clear_project(self, collection, site_name):
|
||||
"""
|
||||
Clear 'collection' of 'site_name' and its local files
|
||||
|
|
|
|||
|
|
@ -69,6 +69,22 @@ from .actions import (
|
|||
deregister_inventory_action_path,
|
||||
)
|
||||
|
||||
from .context_tools import (
|
||||
install_openpype_plugins,
|
||||
install_host,
|
||||
uninstall_host,
|
||||
is_installed,
|
||||
|
||||
register_root,
|
||||
registered_root,
|
||||
|
||||
register_host,
|
||||
registered_host,
|
||||
deregister_host,
|
||||
)
|
||||
install = install_host
|
||||
uninstall = uninstall_host
|
||||
|
||||
|
||||
__all__ = (
|
||||
"AVALON_CONTAINER_ID",
|
||||
|
|
@ -137,4 +153,21 @@ __all__ = (
|
|||
"register_inventory_action_path",
|
||||
"deregister_inventory_action",
|
||||
"deregister_inventory_action_path",
|
||||
|
||||
# --- Process context ---
|
||||
"install_openpype_plugins",
|
||||
"install_host",
|
||||
"uninstall_host",
|
||||
"is_installed",
|
||||
|
||||
"register_root",
|
||||
"registered_root",
|
||||
|
||||
"register_host",
|
||||
"registered_host",
|
||||
"deregister_host",
|
||||
|
||||
# Backwards compatible function names
|
||||
"install",
|
||||
"uninstall",
|
||||
)
|
||||
|
|
|
|||
335
openpype/pipeline/context_tools.py
Normal file
335
openpype/pipeline/context_tools.py
Normal file
|
|
@ -0,0 +1,335 @@
|
|||
"""Core pipeline functionality"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import json
|
||||
import types
|
||||
import logging
|
||||
import inspect
|
||||
import platform
|
||||
|
||||
import pyblish.api
|
||||
from pyblish.lib import MessageHandler
|
||||
|
||||
from avalon import io, Session
|
||||
|
||||
import openpype
|
||||
from openpype.modules import load_modules
|
||||
from openpype.settings import get_project_settings
|
||||
from openpype.lib import (
|
||||
Anatomy,
|
||||
register_event_callback,
|
||||
filter_pyblish_plugins,
|
||||
change_timer_to_current_context,
|
||||
)
|
||||
|
||||
from . import (
|
||||
register_loader_plugin_path,
|
||||
register_inventory_action,
|
||||
register_creator_plugin_path,
|
||||
deregister_loader_plugin_path,
|
||||
)
|
||||
|
||||
|
||||
_is_installed = False
|
||||
_registered_root = {"_": ""}
|
||||
_registered_host = {"_": None}
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
PACKAGE_DIR = os.path.dirname(os.path.abspath(openpype.__file__))
|
||||
PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")
|
||||
|
||||
# Global plugin paths
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
|
||||
|
||||
def register_root(path):
|
||||
"""Register currently active root"""
|
||||
log.info("Registering root: %s" % path)
|
||||
_registered_root["_"] = path
|
||||
|
||||
|
||||
def registered_root():
|
||||
"""Return currently registered root"""
|
||||
root = _registered_root["_"]
|
||||
if root:
|
||||
return root
|
||||
|
||||
root = Session.get("AVALON_PROJECTS")
|
||||
if root:
|
||||
return os.path.normpath(root)
|
||||
return ""
|
||||
|
||||
|
||||
def install_host(host):
|
||||
"""Install `host` into the running Python session.
|
||||
|
||||
Args:
|
||||
host (module): A Python module containing the Avalon
|
||||
host-interface.
|
||||
"""
|
||||
global _is_installed
|
||||
|
||||
_is_installed = True
|
||||
|
||||
io.install()
|
||||
|
||||
missing = list()
|
||||
for key in ("AVALON_PROJECT", "AVALON_ASSET"):
|
||||
if key not in Session:
|
||||
missing.append(key)
|
||||
|
||||
assert not missing, (
|
||||
"%s missing from environment, %s" % (
|
||||
", ".join(missing),
|
||||
json.dumps(Session, indent=4, sort_keys=True)
|
||||
))
|
||||
|
||||
project_name = Session["AVALON_PROJECT"]
|
||||
log.info("Activating %s.." % project_name)
|
||||
|
||||
# Optional host install function
|
||||
if hasattr(host, "install"):
|
||||
host.install()
|
||||
|
||||
register_host(host)
|
||||
|
||||
register_event_callback("taskChanged", _on_task_change)
|
||||
|
||||
def modified_emit(obj, record):
|
||||
"""Method replacing `emit` in Pyblish's MessageHandler."""
|
||||
record.msg = record.getMessage()
|
||||
obj.records.append(record)
|
||||
|
||||
MessageHandler.emit = modified_emit
|
||||
|
||||
install_openpype_plugins()
|
||||
|
||||
|
||||
def install_openpype_plugins(project_name=None):
|
||||
# Make sure modules are loaded
|
||||
load_modules()
|
||||
|
||||
log.info("Registering global plug-ins..")
|
||||
pyblish.api.register_plugin_path(PUBLISH_PATH)
|
||||
pyblish.api.register_discovery_filter(filter_pyblish_plugins)
|
||||
register_loader_plugin_path(LOAD_PATH)
|
||||
|
||||
if project_name is None:
|
||||
project_name = os.environ.get("AVALON_PROJECT")
|
||||
|
||||
# Register studio specific plugins
|
||||
if project_name:
|
||||
anatomy = Anatomy(project_name)
|
||||
anatomy.set_root_environments()
|
||||
register_root(anatomy.roots)
|
||||
|
||||
project_settings = get_project_settings(project_name)
|
||||
platform_name = platform.system().lower()
|
||||
project_plugins = (
|
||||
project_settings
|
||||
.get("global", {})
|
||||
.get("project_plugins", {})
|
||||
.get(platform_name)
|
||||
) or []
|
||||
for path in project_plugins:
|
||||
try:
|
||||
path = str(path.format(**os.environ))
|
||||
except KeyError:
|
||||
pass
|
||||
|
||||
if not path or not os.path.exists(path):
|
||||
continue
|
||||
|
||||
pyblish.api.register_plugin_path(path)
|
||||
register_loader_plugin_path(path)
|
||||
register_creator_plugin_path(path)
|
||||
register_inventory_action(path)
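# Call sites elsewhere in this commit replace the former `openpype.install()`
# with this function, e.g. (sketch; the project name is illustrative):
#   from openpype.pipeline import install_openpype_plugins
#   install_openpype_plugins()              # register only global plug-ins
#   install_openpype_plugins("my_project")  # also project-specific plug-in paths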
|
||||
|
||||
|
||||
def _on_task_change():
|
||||
change_timer_to_current_context()
|
||||
|
||||
|
||||
def uninstall_host():
|
||||
"""Undo all of what `install()` did"""
|
||||
host = registered_host()
|
||||
|
||||
try:
|
||||
host.uninstall()
|
||||
except AttributeError:
|
||||
pass
|
||||
|
||||
log.info("Deregistering global plug-ins..")
|
||||
pyblish.api.deregister_plugin_path(PUBLISH_PATH)
|
||||
pyblish.api.deregister_discovery_filter(filter_pyblish_plugins)
|
||||
deregister_loader_plugin_path(LOAD_PATH)
|
||||
log.info("Global plug-ins unregistred")
|
||||
|
||||
deregister_host()
|
||||
|
||||
io.uninstall()
|
||||
|
||||
log.info("Successfully uninstalled Avalon!")
|
||||
|
||||
|
||||
def is_installed():
|
||||
"""Return state of installation
|
||||
|
||||
Returns:
|
||||
True if installed, False otherwise
|
||||
|
||||
"""
|
||||
|
||||
return _is_installed
|
||||
|
||||
|
||||
def register_host(host):
|
||||
"""Register a new host for the current process
|
||||
|
||||
Arguments:
|
||||
host (ModuleType): A module implementing the
|
||||
Host API interface. See the Host API
|
||||
documentation for details on what is
|
||||
required, or browse the source code.
|
||||
|
||||
"""
|
||||
signatures = {
|
||||
"ls": []
|
||||
}
|
||||
|
||||
_validate_signature(host, signatures)
|
||||
_registered_host["_"] = host
|
||||
|
||||
|
||||
def _validate_signature(module, signatures):
|
||||
# Required signatures for each member
|
||||
|
||||
missing = list()
|
||||
invalid = list()
|
||||
success = True
|
||||
|
||||
for member in signatures:
|
||||
if not hasattr(module, member):
|
||||
missing.append(member)
|
||||
success = False
|
||||
|
||||
else:
|
||||
attr = getattr(module, member)
|
||||
if sys.version_info.major >= 3:
|
||||
signature = inspect.getfullargspec(attr)[0]
|
||||
else:
|
||||
signature = inspect.getargspec(attr)[0]
|
||||
required_signature = signatures[member]
|
||||
|
||||
assert isinstance(signature, list)
|
||||
assert isinstance(required_signature, list)
|
||||
|
||||
if not all(member in signature
|
||||
for member in required_signature):
|
||||
invalid.append({
|
||||
"member": member,
|
||||
"signature": ", ".join(signature),
|
||||
"required": ", ".join(required_signature)
|
||||
})
|
||||
success = False
|
||||
|
||||
if not success:
|
||||
report = list()
|
||||
|
||||
if missing:
|
||||
report.append(
|
||||
"Incomplete interface for module: '%s'\n"
|
||||
"Missing: %s" % (module, ", ".join(
|
||||
"'%s'" % member for member in missing))
|
||||
)
|
||||
|
||||
if invalid:
|
||||
report.append(
|
||||
"'%s': One or more members were found, but didn't "
|
||||
"have the right argument signature." % module.__name__
|
||||
)
|
||||
|
||||
for member in invalid:
|
||||
report.append(
|
||||
" Found: {member}({signature})".format(**member)
|
||||
)
|
||||
report.append(
|
||||
" Expected: {member}({required})".format(**member)
|
||||
)
|
||||
|
||||
raise ValueError("\n".join(report))
|
||||
|
||||
|
||||
def registered_host():
|
||||
"""Return currently registered host"""
|
||||
return _registered_host["_"]
|
||||
|
||||
|
||||
def deregister_host():
|
||||
_registered_host["_"] = default_host()
|
||||
|
||||
|
||||
def default_host():
|
||||
"""A default host, in place of anything better
|
||||
|
||||
This may be considered as reference for the
|
||||
interface a host must implement. It also ensures
|
||||
that the system runs, even when nothing is there
|
||||
to support it.
|
||||
|
||||
"""
|
||||
|
||||
host = types.ModuleType("defaultHost")
|
||||
|
||||
def ls():
|
||||
return list()
|
||||
|
||||
host.__dict__.update({
|
||||
"ls": ls
|
||||
})
|
||||
|
||||
return host
|
||||
|
||||
|
||||
def debug_host():
|
||||
"""A debug host, useful to debugging features that depend on a host"""
|
||||
|
||||
host = types.ModuleType("debugHost")
|
||||
|
||||
def ls():
|
||||
containers = [
|
||||
{
|
||||
"representation": "ee-ft-a-uuid1",
|
||||
"schema": "openpype:container-1.0",
|
||||
"name": "Bruce01",
|
||||
"objectName": "Bruce01_node",
|
||||
"namespace": "_bruce01_",
|
||||
"version": 3,
|
||||
},
|
||||
{
|
||||
"representation": "aa-bc-s-uuid2",
|
||||
"schema": "openpype:container-1.0",
|
||||
"name": "Bruce02",
|
||||
"objectName": "Bruce01_node",
|
||||
"namespace": "_bruce02_",
|
||||
"version": 2,
|
||||
}
|
||||
]
|
||||
|
||||
for container in containers:
|
||||
yield container
|
||||
|
||||
host.__dict__.update({
|
||||
"ls": ls,
|
||||
"open_file": lambda fname: None,
|
||||
"save_file": lambda fname: None,
|
||||
"current_file": lambda: os.path.expanduser("~/temp.txt"),
|
||||
"has_unsaved_changes": lambda: False,
|
||||
"work_root": lambda: os.path.expanduser("~/temp"),
|
||||
"file_extensions": lambda: ["txt"],
|
||||
})
|
||||
|
||||
return host
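# Minimal sketch (assumption: running outside a real DCC) showing how these
# helpers fit together; `debug_host` satisfies the "ls" signature required by
# `register_host`:
#   host = debug_host()
#   register_host(host)
#   assert registered_host() is host
#   for container in registered_host().ls():
#       print(container["name"])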
|
||||
|
|
@ -356,7 +356,7 @@ class CreatedInstance:
|
|||
already existing instance.
|
||||
creator(BaseCreator): Creator responsible for instance.
|
||||
host(ModuleType): Host implementation loaded with
|
||||
`avalon.api.registered_host`.
|
||||
`openpype.pipeline.registered_host`.
|
||||
new(bool): Is instance new.
|
||||
"""
|
||||
# Keys that can't be changed or removed from data after loading using
|
||||
|
|
|
|||
|
|
@ -142,7 +142,8 @@ def legacy_create(Creator, name, asset, options=None, data=None):
|
|||
Name of instance
|
||||
|
||||
"""
|
||||
from avalon.api import registered_host
|
||||
from openpype.pipeline import registered_host
|
||||
|
||||
host = registered_host()
|
||||
plugin = Creator(name, asset, options, data)
|
||||
|
||||
|
|
|
|||
|
|
@ -10,7 +10,7 @@ import six
|
|||
from bson.objectid import ObjectId
|
||||
|
||||
from avalon import io, schema
|
||||
from avalon.api import Session, registered_root
|
||||
from avalon.api import Session
|
||||
|
||||
from openpype.lib import Anatomy
|
||||
|
||||
|
|
@ -532,6 +532,8 @@ def get_representation_path(representation, root=None, dbcon=None):
|
|||
dbcon = io
|
||||
|
||||
if root is None:
|
||||
from openpype.pipeline import registered_root
|
||||
|
||||
root = registered_root()
|
||||
|
||||
def path_from_represenation():
|
||||
|
|
|
|||
|
|
@ -1,7 +1,8 @@
|
|||
from bson.objectid import ObjectId
|
||||
|
||||
import pyblish.api
|
||||
from avalon import api, io
|
||||
from avalon import io
|
||||
from openpype.pipeline import registered_host
|
||||
|
||||
|
||||
class CollectSceneLoadedVersions(pyblish.api.ContextPlugin):
|
||||
|
|
@ -24,7 +25,7 @@ class CollectSceneLoadedVersions(pyblish.api.ContextPlugin):
|
|||
]
|
||||
|
||||
def process(self, context):
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
if host is None:
|
||||
self.log.warn("No registered host.")
|
||||
return
|
||||
|
|
|
|||
|
|
@ -16,7 +16,7 @@ from openpype.lib import (
|
|||
run_openpype_process,
|
||||
|
||||
get_transcode_temp_directory,
|
||||
convert_for_ffmpeg,
|
||||
convert_input_paths_for_ffmpeg,
|
||||
should_convert_for_ffmpeg,
|
||||
|
||||
CREATE_NO_WINDOW
|
||||
|
|
@ -187,8 +187,13 @@ class ExtractBurnin(openpype.api.Extractor):
|
|||
repre_files = repre["files"]
|
||||
if isinstance(repre_files, (tuple, list)):
|
||||
filename = repre_files[0]
|
||||
src_filepaths = [
|
||||
os.path.join(src_repre_staging_dir, filename)
|
||||
for filename in repre_files
|
||||
]
|
||||
else:
|
||||
filename = repre_files
|
||||
src_filepaths = [os.path.join(src_repre_staging_dir, filename)]
|
||||
|
||||
first_input_path = os.path.join(src_repre_staging_dir, filename)
|
||||
# Determine if representation requires pre conversion for ffmpeg
|
||||
|
|
@ -209,11 +214,9 @@ class ExtractBurnin(openpype.api.Extractor):
|
|||
new_staging_dir = get_transcode_temp_directory()
|
||||
repre["stagingDir"] = new_staging_dir
|
||||
|
||||
convert_for_ffmpeg(
|
||||
first_input_path,
|
||||
convert_input_paths_for_ffmpeg(
|
||||
src_filepaths,
|
||||
new_staging_dir,
|
||||
_temp_data["frameStart"],
|
||||
_temp_data["frameEnd"],
|
||||
self.log
|
||||
)
|
||||
|
||||
|
|
|
|||
|
|
@ -8,7 +8,7 @@ from openpype.lib import (
|
|||
path_to_subprocess_arg,
|
||||
|
||||
get_transcode_temp_directory,
|
||||
convert_for_ffmpeg,
|
||||
convert_input_paths_for_ffmpeg,
|
||||
should_convert_for_ffmpeg
|
||||
)
|
||||
|
||||
|
|
@ -79,11 +79,9 @@ class ExtractJpegEXR(pyblish.api.InstancePlugin):
|
|||
if do_convert:
|
||||
convert_dir = get_transcode_temp_directory()
|
||||
filename = os.path.basename(full_input_path)
|
||||
convert_for_ffmpeg(
|
||||
full_input_path,
|
||||
convert_input_paths_for_ffmpeg(
|
||||
[full_input_path],
|
||||
convert_dir,
|
||||
None,
|
||||
None,
|
||||
self.log
|
||||
)
|
||||
full_input_path = os.path.join(convert_dir, filename)
|
||||
|
|
|
|||
|
|
@ -18,7 +18,7 @@ from openpype.lib import (
|
|||
path_to_subprocess_arg,
|
||||
|
||||
should_convert_for_ffmpeg,
|
||||
convert_for_ffmpeg,
|
||||
convert_input_paths_for_ffmpeg,
|
||||
get_transcode_temp_directory
|
||||
)
|
||||
import speedcopy
|
||||
|
|
@ -194,16 +194,20 @@ class ExtractReview(pyblish.api.InstancePlugin):
|
|||
src_repre_staging_dir = repre["stagingDir"]
|
||||
# Receive filepath to first file in representation
|
||||
first_input_path = None
|
||||
input_filepaths = []
|
||||
if not self.input_is_sequence(repre):
|
||||
first_input_path = os.path.join(
|
||||
src_repre_staging_dir, repre["files"]
|
||||
)
|
||||
input_filepaths.append(first_input_path)
|
||||
else:
|
||||
for filename in repre["files"]:
|
||||
first_input_path = os.path.join(
|
||||
filepath = os.path.join(
|
||||
src_repre_staging_dir, filename
|
||||
)
|
||||
break
|
||||
input_filepaths.append(filepath)
|
||||
if first_input_path is None:
|
||||
first_input_path = filepath
|
||||
|
||||
# Skip if file is not set
|
||||
if first_input_path is None:
|
||||
|
|
@ -230,13 +234,9 @@ class ExtractReview(pyblish.api.InstancePlugin):
|
|||
new_staging_dir = get_transcode_temp_directory()
|
||||
repre["stagingDir"] = new_staging_dir
|
||||
|
||||
frame_start = instance.data["frameStart"]
|
||||
frame_end = instance.data["frameEnd"]
|
||||
convert_for_ffmpeg(
|
||||
first_input_path,
|
||||
convert_input_paths_for_ffmpeg(
|
||||
input_filepaths,
|
||||
new_staging_dir,
|
||||
frame_start,
|
||||
frame_end,
|
||||
self.log
|
||||
)
|
||||
|
||||
|
|
|
|||
|
|
@ -101,7 +101,8 @@ class PypeCommands:
|
|||
RuntimeError: When there is no path to process.
|
||||
"""
|
||||
from openpype.modules import ModulesManager
|
||||
from openpype import install, uninstall
|
||||
from openpype.pipeline import install_openpype_plugins
|
||||
|
||||
from openpype.api import Logger
|
||||
from openpype.tools.utils.host_tools import show_publish
|
||||
from openpype.tools.utils.lib import qt_app_context
|
||||
|
|
@ -112,7 +113,7 @@ class PypeCommands:
|
|||
|
||||
log = Logger.get_logger()
|
||||
|
||||
install()
|
||||
install_openpype_plugins()
|
||||
|
||||
manager = ModulesManager()
|
||||
|
||||
|
|
@ -294,7 +295,8 @@ class PypeCommands:
|
|||
# Register target and host
|
||||
import pyblish.api
|
||||
import pyblish.util
|
||||
import avalon.api
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.webpublisher import api as webpublisher
|
||||
|
||||
log = PypeLogger.get_logger()
|
||||
|
|
@ -315,7 +317,7 @@ class PypeCommands:
|
|||
for target in targets:
|
||||
pyblish.api.register_target(target)
|
||||
|
||||
avalon.api.install(webpublisher)
|
||||
install_host(webpublisher)
|
||||
|
||||
log.info("Running publish ...")
|
||||
|
||||
|
|
|
|||
|
|
@ -4,12 +4,16 @@ import sys
|
|||
import logging
|
||||
|
||||
# Pipeline imports
|
||||
from avalon import api, io
|
||||
import avalon.fusion
|
||||
from avalon import io
|
||||
from openpype.hosts.fusion import api
|
||||
import openpype.hosts.fusion.api.lib as fusion_lib
|
||||
|
||||
# Config imports
|
||||
import openpype.lib as pype
|
||||
import openpype.hosts.fusion.lib as fusion_lib
|
||||
from openpype.lib import version_up
|
||||
from openpype.pipeline import (
|
||||
install_host,
|
||||
registered_host,
|
||||
)
|
||||
|
||||
from openpype.lib.avalon_context import get_workdir_from_session
|
||||
|
||||
|
|
@ -79,7 +83,7 @@ def _format_filepath(session):
|
|||
|
||||
# Create new unique filepath
|
||||
if os.path.exists(new_filepath):
|
||||
new_filepath = pype.version_up(new_filepath)
|
||||
new_filepath = version_up(new_filepath)
|
||||
|
||||
return new_filepath
|
||||
|
||||
|
|
@ -102,7 +106,7 @@ def _update_savers(comp, session):
|
|||
|
||||
comp.Print("New renders to: %s\n" % renders)
|
||||
|
||||
with avalon.fusion.comp_lock_and_undo_chunk(comp):
|
||||
with api.comp_lock_and_undo_chunk(comp):
|
||||
savers = comp.GetToolList(False, "Saver").values()
|
||||
for saver in savers:
|
||||
filepath = saver.GetAttrs("TOOLST_Clip_Name")[1.0]
|
||||
|
|
@ -164,19 +168,19 @@ def switch(asset_name, filepath=None, new=True):
|
|||
# Get current project
|
||||
self._project = io.find_one({
|
||||
"type": "project",
|
||||
"name": api.Session["AVALON_PROJECT"]
|
||||
"name": io.Session["AVALON_PROJECT"]
|
||||
})
|
||||
|
||||
# Go to comp
|
||||
if not filepath:
|
||||
current_comp = avalon.fusion.get_current_comp()
|
||||
current_comp = api.get_current_comp()
|
||||
assert current_comp is not None, "Could not find current comp"
|
||||
else:
|
||||
fusion = _get_fusion_instance()
|
||||
current_comp = fusion.LoadComp(filepath, quiet=True)
|
||||
assert current_comp is not None, "Fusion could not load '%s'" % filepath
|
||||
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
containers = list(host.ls())
|
||||
assert containers, "Nothing to update"
|
||||
|
||||
|
|
@ -194,7 +198,7 @@ def switch(asset_name, filepath=None, new=True):
|
|||
current_comp.Print(message)
|
||||
|
||||
# Build the session to switch to
|
||||
switch_to_session = api.Session.copy()
|
||||
switch_to_session = io.Session.copy()
|
||||
switch_to_session["AVALON_ASSET"] = asset['name']
|
||||
|
||||
if new:
|
||||
|
|
@ -203,7 +207,7 @@ def switch(asset_name, filepath=None, new=True):
|
|||
# Update savers output based on new session
|
||||
_update_savers(current_comp, switch_to_session)
|
||||
else:
|
||||
comp_path = pype.version_up(filepath)
|
||||
comp_path = version_up(filepath)
|
||||
|
||||
current_comp.Print(comp_path)
|
||||
|
||||
|
|
@ -234,7 +238,7 @@ if __name__ == '__main__':
|
|||
|
||||
args, unknown = parser.parse_args()
|
||||
|
||||
api.install(avalon.fusion)
|
||||
install_host(api)
|
||||
switch(args.asset_name, args.file_path)
|
||||
|
||||
sys.exit(0)
|
||||
|
|
|
|||
|
|
@ -15,7 +15,7 @@ CURRENT_FILE = os.path.abspath(__file__)
|
|||
def show_error_messagebox(title, message, detail_message=None):
|
||||
"""Function will show message and process ends after closing it."""
|
||||
from Qt import QtWidgets, QtCore
|
||||
from avalon import style
|
||||
from openpype import style
|
||||
|
||||
app = QtWidgets.QApplication([])
|
||||
app.setStyleSheet(style.load_stylesheet())
|
||||
|
|
|
|||
|
|
@ -1,6 +1,5 @@
|
|||
import avalon.api as api
|
||||
import openpype
|
||||
from openpype.pipeline import (
|
||||
install_host,
|
||||
LegacyCreator,
|
||||
register_creator_plugin,
|
||||
discover_creator_plugins,
|
||||
|
|
@ -23,15 +22,14 @@ class Test:
|
|||
__name__ = "test"
|
||||
ls = len
|
||||
|
||||
def __call__(self):
|
||||
pass
|
||||
@staticmethod
|
||||
def install():
|
||||
register_creator_plugin(MyTestCreator)
|
||||
|
||||
|
||||
def test_avalon_plugin_presets(monkeypatch, printer):
|
||||
install_host(Test)
|
||||
|
||||
openpype.install()
|
||||
api.register_host(Test())
|
||||
register_creator_plugin(MyTestCreator)
|
||||
plugins = discover_creator_plugins()
|
||||
printer("Test if we got our test plugin")
|
||||
assert MyTestCreator in plugins
|
||||
|
|
|
|||
|
|
@ -16,8 +16,6 @@ from openpype.tools.utils.assets_widget import MultiSelectAssetsWidget
|
|||
|
||||
from openpype.modules import ModulesManager
|
||||
|
||||
from . import lib
|
||||
|
||||
module = sys.modules[__name__]
|
||||
module.window = None
|
||||
|
||||
|
|
@ -260,14 +258,6 @@ class LibraryLoaderWindow(QtWidgets.QDialog):
|
|||
|
||||
self.dbcon.Session["AVALON_PROJECT"] = project_name
|
||||
|
||||
_config = lib.find_config()
|
||||
if hasattr(_config, "install"):
|
||||
_config.install()
|
||||
else:
|
||||
print(
|
||||
"Config `%s` has no function `install`" % _config.__name__
|
||||
)
|
||||
|
||||
self._subsets_widget.on_project_change(project_name)
|
||||
if self._repres_widget:
|
||||
self._repres_widget.on_project_change(project_name)
|
||||
|
|
|
|||
|
|
@ -1,21 +0,0 @@
|
|||
import os
|
||||
import importlib
|
||||
import logging
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
||||
# `find_config` from `pipeline`
|
||||
def find_config():
|
||||
log.info("Finding configuration for project..")
|
||||
|
||||
config = os.environ["AVALON_CONFIG"]
|
||||
|
||||
if not config:
|
||||
raise EnvironmentError(
|
||||
"No configuration found in "
|
||||
"the project nor environment"
|
||||
)
|
||||
|
||||
log.info("Found %s, loading.." % config)
|
||||
return importlib.import_module(config)
|
||||
|
|
@ -5,6 +5,7 @@ from avalon import api, io
|
|||
|
||||
from openpype import style
|
||||
from openpype.lib import register_event_callback
|
||||
from openpype.pipeline import install_openpype_plugins
|
||||
from openpype.tools.utils import (
|
||||
lib,
|
||||
PlaceholderLineEdit
|
||||
|
|
@ -608,14 +609,6 @@ def cli(args):
|
|||
# Store settings
|
||||
api.Session["AVALON_PROJECT"] = project
|
||||
|
||||
from avalon import pipeline
|
||||
|
||||
# Find the set config
|
||||
_config = pipeline.find_config()
|
||||
if hasattr(_config, "install"):
|
||||
_config.install()
|
||||
else:
|
||||
print("Config `%s` has no function `install`" %
|
||||
_config.__name__)
|
||||
install_openpype_plugins(project)
|
||||
|
||||
show()
|
||||
|
|
|
|||
|
|
@ -5,9 +5,12 @@ import os
|
|||
from bson.objectid import ObjectId
|
||||
import maya.cmds as cmds
|
||||
|
||||
from avalon import io, api
|
||||
from avalon import io
|
||||
|
||||
from openpype.pipeline import remove_container
|
||||
from openpype.pipeline import (
|
||||
remove_container,
|
||||
registered_host,
|
||||
)
|
||||
from openpype.hosts.maya.api import lib
|
||||
|
||||
from .vray_proxies import get_alembic_ids_cache
|
||||
|
|
@ -79,7 +82,7 @@ def get_all_asset_nodes():
|
|||
list: list of dictionaries
|
||||
"""
|
||||
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
|
||||
nodes = []
|
||||
for container in host.ls():
|
||||
|
|
@ -192,7 +195,7 @@ def remove_unused_looks():
|
|||
|
||||
"""
|
||||
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
|
||||
unused = []
|
||||
for container in host.ls():
|
||||
|
|
|
|||
|
|
@ -11,13 +11,14 @@ from bson.objectid import ObjectId
|
|||
import alembic.Abc
|
||||
from maya import cmds
|
||||
|
||||
from avalon import io, api
|
||||
from avalon import io
|
||||
|
||||
from openpype.pipeline import (
|
||||
load_container,
|
||||
loaders_from_representation,
|
||||
discover_loader_plugins,
|
||||
get_representation_path,
|
||||
registered_host,
|
||||
)
|
||||
from openpype.hosts.maya.api import lib
|
||||
|
||||
|
|
@ -188,7 +189,7 @@ def load_look(version_id):
|
|||
"name": "ma"})
|
||||
|
||||
# See if representation is already loaded, if so reuse it.
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
representation_id = str(look_representation['_id'])
|
||||
for container in host.ls():
|
||||
if (container['loader'] == "LookLoader" and
|
||||
|
|
|
|||
|
|
@ -11,10 +11,12 @@ try:
|
|||
except Exception:
|
||||
from openpype.lib.python_2_comp import WeakMethod
|
||||
|
||||
import avalon.api
|
||||
import pyblish.api
|
||||
|
||||
from openpype.pipeline import PublishValidationError
|
||||
from openpype.pipeline import (
|
||||
PublishValidationError,
|
||||
registered_host,
|
||||
)
|
||||
from openpype.pipeline.create import CreateContext
|
||||
|
||||
from Qt import QtCore
|
||||
|
|
@ -353,7 +355,7 @@ class PublisherController:
|
|||
"""
|
||||
def __init__(self, dbcon=None, headless=False):
|
||||
self.log = logging.getLogger("PublisherController")
|
||||
self.host = avalon.api.registered_host()
|
||||
self.host = registered_host()
|
||||
self.headless = headless
|
||||
|
||||
self.create_context = CreateContext(
|
||||
|
|
|
|||
|
|
@ -7,8 +7,11 @@ from Qt import QtCore, QtGui
|
|||
import qtawesome
|
||||
from bson.objectid import ObjectId
|
||||
|
||||
from avalon import api, io, schema
|
||||
from openpype.pipeline import HeroVersionType
|
||||
from avalon import io, schema
|
||||
from openpype.pipeline import (
|
||||
HeroVersionType,
|
||||
registered_host,
|
||||
)
|
||||
from openpype.style import get_default_entity_icon_color
|
||||
from openpype.tools.utils.models import TreeModel, Item
|
||||
from openpype.modules import ModulesManager
|
||||
|
|
@ -181,7 +184,7 @@ class InventoryModel(TreeModel):
|
|||
def refresh(self, selected=None, items=None):
|
||||
"""Refresh the model"""
|
||||
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
if not items: # for debugging or testing, injecting items from outside
|
||||
items = host.ls()
|
||||
|
||||
|
|
|
|||
|
|
@ -1,14 +1,14 @@
|
|||
import os
|
||||
import sys
|
||||
|
||||
import openpype
|
||||
import pyblish.api
|
||||
from openpype.pipeline import install_openpype_plugins
|
||||
from openpype.tools.utils.host_tools import show_publish
|
||||
|
||||
|
||||
def main(env):
|
||||
# Registers pype's Global pyblish plugins
|
||||
openpype.install()
|
||||
install_openpype_plugins()
|
||||
|
||||
# Register additional paths
|
||||
addition_paths_str = env.get("PUBLISH_PATHS") or ""
|
||||
|
|
|
|||
|
|
@ -1,7 +1,9 @@
|
|||
from avalon import style
|
||||
from Qt import QtWidgets, QtCore
|
||||
import collections
|
||||
import re
|
||||
import collections
|
||||
|
||||
from Qt import QtWidgets
|
||||
|
||||
from openpype import style
|
||||
|
||||
|
||||
class ConsoleDialog(QtWidgets.QDialog):
|
||||
|
|
|
|||
|
|
@ -2,7 +2,7 @@ import uuid
|
|||
|
||||
from Qt import QtCore, QtGui
|
||||
|
||||
from avalon import api
|
||||
from openpype.pipeline import registered_host
|
||||
|
||||
ITEM_ID_ROLE = QtCore.Qt.UserRole + 1
|
||||
|
||||
|
|
@ -21,7 +21,7 @@ class InstanceModel(QtGui.QStandardItemModel):
|
|||
self._instances_by_item_id = {}
|
||||
|
||||
instances = None
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
list_instances = getattr(host, "list_instances", None)
|
||||
if list_instances:
|
||||
instances = list_instances()
|
||||
|
|
|
|||
|
|
@ -4,9 +4,8 @@ import sys
|
|||
from Qt import QtWidgets, QtCore
|
||||
import qtawesome
|
||||
|
||||
from avalon import api
|
||||
|
||||
from openpype import style
|
||||
from openpype.pipeline import registered_host
|
||||
from openpype.tools.utils import PlaceholderLineEdit
|
||||
from openpype.tools.utils.lib import (
|
||||
iter_model_rows,
|
||||
|
|
@ -106,7 +105,7 @@ class SubsetManagerWindow(QtWidgets.QDialog):
|
|||
self._details_widget.set_details(container, item_id)
|
||||
|
||||
def _on_save(self):
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
if not hasattr(host, "save_instances"):
|
||||
print("BUG: Host does not have \"save_instances\" method")
|
||||
return
|
||||
|
|
@ -141,7 +140,7 @@ class SubsetManagerWindow(QtWidgets.QDialog):
|
|||
# Prepare menu
|
||||
menu = QtWidgets.QMenu(self)
|
||||
actions = []
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
if hasattr(host, "remove_instance"):
|
||||
action = QtWidgets.QAction("Remove instance", menu)
|
||||
action.setData(host.remove_instance)
|
||||
|
|
@ -176,7 +175,7 @@ class SubsetManagerWindow(QtWidgets.QDialog):
|
|||
self._details_widget.set_details(None, None)
|
||||
self._model.refresh()
|
||||
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
dev_mode = os.environ.get("AVALON_DEVELOP_MODE") or ""
|
||||
editable = False
|
||||
if dev_mode.lower() in ("1", "yes", "true", "on"):
|
||||
|
|
|
|||
|
|
@ -8,8 +8,8 @@ publishing plugins.
|
|||
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
import avalon.api
|
||||
from avalon.api import AvalonMongoDB
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.traypublisher import (
|
||||
api as traypublisher
|
||||
)
|
||||
|
|
@ -163,7 +163,7 @@ class TrayPublishWindow(PublisherWindow):
|
|||
|
||||
|
||||
def main():
|
||||
avalon.api.install(traypublisher)
|
||||
install_host(traypublisher)
|
||||
app = QtWidgets.QApplication([])
|
||||
window = TrayPublishWindow()
|
||||
window.show()
|
||||
|
|
|
|||
|
|
@ -6,6 +6,7 @@ use singleton approach with global functions (using helper anyway).
|
|||
import os
|
||||
import avalon.api
|
||||
import pyblish.api
|
||||
from openpype.pipeline import registered_host
|
||||
from .lib import qt_app_context
|
||||
|
||||
|
||||
|
|
@ -47,7 +48,7 @@ class HostToolsHelper:
|
|||
Window, validate_host_requirements
|
||||
)
|
||||
# Host validation
|
||||
host = avalon.api.registered_host()
|
||||
host = registered_host()
|
||||
validate_host_requirements(host)
|
||||
|
||||
workfiles_window = Window(parent=parent)
|
||||
|
|
|
|||
|
|
@ -6,16 +6,17 @@ import collections
|
|||
from Qt import QtWidgets, QtCore, QtGui
|
||||
import qtawesome
|
||||
|
||||
import avalon.api
|
||||
|
||||
from openpype.style import get_default_entity_icon_color
|
||||
from openpype.style import (
|
||||
get_default_entity_icon_color,
|
||||
get_objected_colors,
|
||||
)
|
||||
from openpype.resources import get_image_path
|
||||
from openpype.lib import filter_profiles
|
||||
from openpype.api import (
|
||||
get_project_settings,
|
||||
Logger
|
||||
)
|
||||
from openpype.lib import filter_profiles
|
||||
from openpype.style import get_objected_colors
|
||||
from openpype.resources import get_image_path
|
||||
from openpype.pipeline import registered_host
|
||||
|
||||
log = Logger.get_logger(__name__)
|
||||
|
||||
|
|
@ -402,7 +403,7 @@ class FamilyConfigCache:
|
|||
|
||||
self.family_configs.clear()
|
||||
# Skip if we're not in host context
|
||||
if not avalon.api.registered_host():
|
||||
if not registered_host():
|
||||
return
|
||||
|
||||
# Update the icons from the project configuration
|
||||
|
|
|
|||
|
|
@ -3,6 +3,7 @@ import logging
|
|||
|
||||
from avalon import api
|
||||
|
||||
from openpype.pipeline import registered_host
|
||||
from openpype.tools.utils import qt_app_context
|
||||
from .window import Window
|
||||
|
||||
|
|
@ -47,7 +48,7 @@ def show(root=None, debug=False, parent=None, use_context=True, save=True):
|
|||
except (AttributeError, RuntimeError):
|
||||
pass
|
||||
|
||||
host = api.registered_host()
|
||||
host = registered_host()
|
||||
validate_host_requirements(host)
|
||||
|
||||
if debug:
|
||||
|
|
|
|||
|
|
@ -18,6 +18,7 @@ from openpype.lib.avalon_context import (
|
|||
update_current_task,
|
||||
compute_session_changes
|
||||
)
|
||||
from openpype.pipeline import registered_host
|
||||
from .model import (
|
||||
WorkAreaFilesModel,
|
||||
PublishFilesModel,
|
||||
|
|
@ -93,7 +94,7 @@ class FilesWidget(QtWidgets.QWidget):
|
|||
# This is not root but workfile directory
|
||||
self._workfiles_root = None
|
||||
self._workdir_path = None
|
||||
self.host = api.registered_host()
|
||||
self.host = registered_host()
|
||||
|
||||
# Whether to automatically select the latest modified
|
||||
# file on a refresh of the files model.
|
||||
|
|
|
|||
|
|
@ -11,6 +11,7 @@ from openpype.lib import (
|
|||
get_last_workfile_with_version,
|
||||
get_workdir_data,
|
||||
)
|
||||
from openpype.pipeline import registered_host
|
||||
from openpype.tools.utils import PlaceholderLineEdit
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
|
@ -65,7 +66,7 @@ class CommentMatcher(object):
|
|||
return
|
||||
|
||||
# Create a regex group for extensions
|
||||
extensions = api.registered_host().file_extensions()
|
||||
extensions = registered_host().file_extensions()
|
||||
any_extension = "(?:{})".format(
|
||||
"|".join(re.escape(ext[1:]) for ext in extensions)
|
||||
)
|
||||
|
|
@ -200,7 +201,7 @@ class SaveAsDialog(QtWidgets.QDialog):
|
|||
self.setWindowFlags(self.windowFlags() | QtCore.Qt.FramelessWindowHint)
|
||||
|
||||
self.result = None
|
||||
self.host = api.registered_host()
|
||||
self.host = registered_host()
|
||||
self.root = root
|
||||
self.work_file = None
|
||||
self._extensions = extensions
|
||||
|
|
|
|||
|
|
@ -1,3 +1,3 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Package declaring Pype version."""
|
||||
__version__ = "3.9.4-nightly.1"
|
||||
__version__ = "3.10.0-nightly.1"
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
[tool.poetry]
|
||||
name = "OpenPype"
|
||||
version = "3.9.4-nightly.1" # OpenPype
|
||||
version = "3.10.0-nightly.1" # OpenPype
|
||||
description = "Open VFX and Animation pipeline with support."
|
||||
authors = ["OpenPype Team <info@openpype.io>"]
|
||||
license = "MIT License"
|
||||
|
|
|
|||
|
|
@ -24,13 +24,14 @@ class DBAssert:
|
|||
else:
|
||||
args[key] = val
|
||||
|
||||
no_of_docs = dbcon.count_documents(args)
|
||||
|
||||
msg = None
|
||||
args.pop("type")
|
||||
detail_str = " "
|
||||
if args:
|
||||
detail_str = " with '{}'".format(args)
|
||||
|
||||
msg = None
|
||||
no_of_docs = dbcon.count_documents(args)
|
||||
if expected != no_of_docs:
|
||||
msg = "Not expected no of '{}'{}."\
|
||||
"Expected {}, found {}".format(queried_type,
|
||||
|
|
|
|||
|
|
@ -273,8 +273,6 @@ class PublishTest(ModuleUnitTest):
|
|||
)
|
||||
os.environ["AVALON_SCHEMA"] = schema_path
|
||||
|
||||
import openpype
|
||||
openpype.install()
|
||||
os.environ["OPENPYPE_EXECUTABLE"] = sys.executable
|
||||
from openpype.lib import ApplicationManager
|
||||
|
||||
|
|
|
|||
64
tests/unit/openpype/modules/sync_server/test_module_api.py
Normal file
64
tests/unit/openpype/modules/sync_server/test_module_api.py
Normal file
|
|
@ -0,0 +1,64 @@
|
|||
"""Test file for Sync Server, tests API methods, currently for integrate_new
|
||||
|
||||
File:
|
||||
creates temporary directory and downloads .zip file from GDrive
|
||||
unzips .zip file
|
||||
uses content of .zip file (MongoDB's dumps) to import to new databases
|
||||
with use of 'monkeypatch_session' modifies required env vars
|
||||
temporarily
|
||||
runs a battery of tests checking that site operations for the Sync Server
|
||||
module are working
|
||||
removes temporary folder
|
||||
removes temporary databases (?)
|
||||
"""
|
||||
import pytest
|
||||
|
||||
from tests.lib.testing_classes import ModuleUnitTest
|
||||
|
||||
|
||||
class TestModuleApi(ModuleUnitTest):
|
||||
|
||||
REPRESENTATION_ID = "60e578d0c987036c6a7b741d"
|
||||
|
||||
TEST_FILES = [("1eCwPljuJeOI8A3aisfOIBKKjcmIycTEt",
|
||||
"test_site_operations.zip", '')]
|
||||
|
||||
@pytest.fixture(scope="module")
|
||||
def setup_sync_server_module(self, dbcon):
|
||||
"""Get sync_server_module from ModulesManager"""
|
||||
from openpype.modules import ModulesManager
|
||||
|
||||
manager = ModulesManager()
|
||||
sync_server = manager.modules_by_name["sync_server"]
|
||||
yield sync_server
|
||||
|
||||
def test_get_alt_site_pairs(self, setup_sync_server_module):
|
||||
conf_sites = {"SFTP": {"alternative_sites": ["studio"]},
|
||||
"studio2": {"alternative_sites": ["studio"]}}
|
||||
|
||||
ret = setup_sync_server_module._get_alt_site_pairs(conf_sites)
|
||||
expected = {"SFTP": {"studio", "studio2"},
|
||||
"studio": {"SFTP", "studio2"},
|
||||
"studio2": {"studio", "SFTP"}}
|
||||
assert ret == expected, "Not matching result"
|
||||
|
||||
def test_get_alt_site_pairs_deep(self, setup_sync_server_module):
|
||||
conf_sites = {"A": {"alternative_sites": ["C"]},
|
||||
"B": {"alternative_sites": ["C"]},
|
||||
"C": {"alternative_sites": ["D"]},
|
||||
"D": {"alternative_sites": ["A"]},
|
||||
"F": {"alternative_sites": ["G"]},
|
||||
"G": {"alternative_sites": ["F"]},
|
||||
}
|
||||
|
||||
ret = setup_sync_server_module._get_alt_site_pairs(conf_sites)
|
||||
expected = {"A": {"B", "C", "D"},
|
||||
"B": {"A", "C", "D"},
|
||||
"C": {"A", "B", "D"},
|
||||
"D": {"A", "B", "C"},
|
||||
"F": {"G"},
|
||||
"G": {"F"}}
|
||||
assert ret == expected, "Not matching result"
|
||||
|
||||
|
||||
test_case = TestModuleApi()
|
||||
|
|
@ -59,7 +59,7 @@ We have a few required anatomy templates for OpenPype to work properly, however
|
|||
| `asset` | Name of asset or shot |
|
||||
| `task[name]` | Name of task |
|
||||
| `task[type]` | Type of task |
|
||||
| `task[short]` | Shortname of task |
|
||||
| `task[short]` | Short name of task type (eg. 'Modeling' > 'mdl') |
|
||||
| `parent` | Name of hierarchical parent |
|
||||
| `version` | Version number |
|
||||
| `subset` | Subset name |
|
||||
|
|
@ -105,5 +105,8 @@ We have a few required anatomy templates for OpenPype to work properly, however
|
|||
|
||||
## Task Types
|
||||
|
||||
Current state of default Task descriptors.
|
||||
|
||||

|
||||
|
||||
## Colour Management and Formats
|
||||
|
|
@ -94,6 +94,8 @@ This tool will set any defined colorspace definition from OpenPype `Settings / P
|
|||
With OpenPype, you can use Hiero/NKS as a starting point for creating a project's **shots** as *assets* from timeline clips, with their *hierarchical parents* like **episodes**, **sequences**, **folders**, and their child **tasks**. Most importantly, it will create **versions** of plate *subsets*, with or without **reference video**. Publishing naturally creates the clip's **thumbnails** and assigns them to the shot *asset*. Hiero also publishes an **audio** *subset* and various **soft-effects**, either as a retiming component that is part of the published plates or as **color-transformations** that will be available later for compositing artists to use either as a *viewport input-process* or as *loaded nodes* in the graph editor.
|
||||
<br></br><br></br>
|
||||
|
||||
<iframe width="512px" height="288px" src="https://www.youtube.com/embed/mdIfbTY5fCc" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>
|
||||
|
||||
### Preparing timeline for conversion to instances
|
||||
Because we don't support on-the-fly data conversion, raw camera sources or other formats which need to be converted for 2D/3D work should be converted beforehand and the timeline reconformed. Before any clips in the timeline can be converted to publishable instances, we recommend the following.
|
||||
1. Merge all tracks which are supposed to be one and exist multiple times only because of the editor's style
|
||||
|
|
@ -191,3 +193,12 @@ If you wish to change any individual properties of the shot then you are able to
|
|||
|
||||
</div>
|
||||
</div>
|
||||
|
||||
### Publishing Effects from Hiero to Nuke
|
||||
This video shows a way to publish a shot look as an effect from Hiero to Nuke.
|
||||
|
||||
<iframe width="512px" height="288px" src="https://www.youtube.com/embed/HzZDdtII5io" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>
|
||||
|
||||
### Assembling edit from published shot versions
|
||||
|
||||
<iframe width="512px" height="288px" src="https://www.youtube.com/embed/5Wd6X-71vbg" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>
|
||||
|
|
|
|||
|
|
@ -89,6 +89,8 @@ This menu item will set correct Colorspace definitions for you. All has to be co
|
|||
- set preview LUT to your viewers
|
||||
- set correct colorspace to all discovered Read nodes (following expression set in settings)
|
||||
|
||||
See [Nuke Color Management](artist_hosts_nuke_tut.md#nuke-color-management)
|
||||
|
||||
</div>
|
||||
<div class="col col--6 markdown">
|
||||
|
||||
|
|
@ -144,6 +146,8 @@ This tool will append all available subsets into an actual node graph. It will l
|
|||
|
||||
This QuickStart is a short introduction to what OpenPype can do for you. It provides an overview for compositing artists and simplifies processes that are described in more detail in specific parts of the documentation.
|
||||
|
||||
<iframe width="512px" height="288px" src="https://www.youtube.com/embed/jgwmLOPJg0g" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

### Launch Nuke - Shot and Task Context

OpenPype has to know which shot and task you are working on. You need to run Nuke in the context of that task: use an Ftrack Action or the OpenPype Launcher to select the task and launch Nuke from there.
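
Under the hood the task context is handed to Nuke through environment variables. As a rough sketch (the variable names below are an assumption based on the Avalon-based OpenPype 3.x pipeline and may differ in your deployment), you can verify from Nuke's Script Editor that a context is actually set:

```python
import os

# Assumed context variables from the Avalon-based OpenPype 3.x pipeline;
# adjust the names if your deployment uses different ones.
for var in ("AVALON_PROJECT", "AVALON_ASSET", "AVALON_TASK"):
    print("{}: {}".format(var, os.environ.get(var, "<not set>")))
```

If any of these print `<not set>`, Nuke was most likely launched outside of a task context.
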

@@ -226,6 +230,11 @@ This will create a Group with a Write node inside.

You can configure write node parameters in **Studio Settings → Project → Anatomy → Color Management and Output Formats → Nuke → Nodes**
:::
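
As an illustration of what such a node preset amounts to (a sketch only, not the actual Settings schema), the values defined there end up as knob overrides applied to the Write node inside the published group. The preset dictionary below is purely hypothetical, and the snippet assumes the currently selected node is the OpenPype Write group:

```python
import nuke

# Hypothetical preset; in OpenPype these values come from
# Settings -> Project -> Anatomy -> ... -> Nuke -> Nodes, not from a hard-coded dict.
WRITE_PRESET = {
    "file_type": "exr",
    "datatype": "16 bit half",
    "compression": "Zip (1 scanline)",
}


def apply_write_preset(group_node, preset):
    """Apply knob overrides to every Write node found inside a group."""
    with group_node:  # step inside the group to reach its internal nodes
        for write in nuke.allNodes("Write"):
            for knob_name, value in preset.items():
                if knob_name in write.knobs():
                    write[knob_name].setValue(value)


# Assumes the selected node is the OpenPype-created Write group.
apply_write_preset(nuke.selectedNode(), WRITE_PRESET)
```
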

### Create Prerender Node

Creating a Prerender node is very similar to creating an OpenPype-managed Write node.
<iframe width="512px" height="288px" src="https://www.youtube.com/embed/er4SztHFN-w" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

#### What Nuke Publish Does

From the artist's perspective, the Nuke publish gathers all instances found in the Nuke script whose Publish checkbox is turned on, exports their outputs, and raises the version of the Nuke script (the workfile).
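
Conceptually, the collection step is just a scan of the script for nodes carrying a Publish toggle that is switched on. A simplified sketch follows; the knob name `publish` is an assumption about how OpenPype-created instances expose the toggle, not a documented guarantee:

```python
import nuke


def collect_publish_instances():
    """Return all nodes whose 'publish' checkbox is turned on.

    The knob name 'publish' is an assumption about how OpenPype-created
    instances store the toggle; adjust it for your setup.
    """
    instances = []
    for node in nuke.allNodes(recurseGroups=True):
        knob = node.knob("publish")
        if knob is not None and knob.value():
            instances.append(node)
    return instances


for instance in collect_publish_instances():
    print("Will publish:", instance.fullName())
```
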

@@ -315,6 +324,8 @@ Main disadvantage of this approach is that you can render only one version of yo

When making quick farm publishes, such as two versions with different color corrections, take care to let the first job (the first version) finish completely before the second version starts rendering.
<iframe width="512px" height="288px" src="https://www.youtube.com/embed/j95OITIWJk8" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

### Managing Versions



@@ -323,15 +334,30 @@ OpenPype checks all the assets loaded to Nuke on script open. All out of date as

Use Manage to switch versions for loaded assets.
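
If you prefer to sanity-check things from the Script Editor, loaded containers can usually be spotted by the extra knobs OpenPype stores on them. This is only a rough sketch; the assumption that container metadata lives on knobs prefixed `avalon:` comes from the Avalon-based Nuke integration and may differ in your version:

```python
import nuke


def list_loaded_containers():
    """Print every node that looks like a loaded OpenPype/Avalon container.

    Assumes container metadata is stored on knobs prefixed 'avalon:',
    which is how the Avalon-based Nuke integration tags loaded nodes.
    """
    for node in nuke.allNodes(recurseGroups=True):
        avalon_knobs = [name for name in node.knobs() if name.startswith("avalon:")]
        if avalon_knobs:
            print(node.fullName())
            for knob_name in avalon_knobs:
                print("    {} = {}".format(knob_name, node[knob_name].value()))


list_loaded_containers()
```
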

### Loading Effects

This video shows how to publish an effect from Hiero / Nuke Studio and use that effect in Nuke.
<iframe width="512px" height="288px" src="https://www.youtube.com/embed/zFoH7bq-w0E" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>
<iframe width="512px" height="288px" src="https://www.youtube.com/embed/HzZDdtII5io" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

### Nuke Color Management

<iframe width="512px" height="288px" src="https://www.youtube.com/embed/NKjQHkuwkSM" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

## Troubleshooting

### Fixing Validate Containers
If your Pyblish dialog fails on Validate Containers, you might have an old asset loaded. Use OpenPype - Manage... to switch the asset(s) to the latest version.

<iframe width="512px" height="288px" src="https://www.youtube.com/embed/hridMybn5nA" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>

### Fixing Validate Version

If your Pyblish dialog fails on Validate Version, you might be trying to publish an already published version. Raise your version in the OpenPype Workfiles Save As.
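
The workfile version is simply the version token in the script name, so raising it in Workfiles Save As amounts to bumping that number. A small sketch of the idea follows; the `_v###` naming pattern is an assumption, and your project's workfile template may use a different token:

```python
import re
import nuke


def next_workfile_version(script_path):
    """Return the script path with its version token bumped by one.

    Assumes the workfile name contains a version token like '_v012';
    adjust the pattern if your file template uses something else.
    """
    match = re.search(r"_v(\d+)", script_path)
    if not match:
        raise ValueError("No version token found in: {}".format(script_path))
    current = int(match.group(1))
    bumped = "_v{:0{width}d}".format(current + 1, width=len(match.group(1)))
    return script_path[:match.start()] + bumped + script_path[match.end():]


print(next_workfile_version(nuke.root().name()))
```

In practice you should still raise the version through Workfiles Save As so the workfile stays managed by OpenPype.
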
Or maybe you accidentally copied a write node from a different shot into your current one. Check the write publishes on the left side of the Pyblish dialog; typically you publish only one write. Locate and delete the stray write node from the other shot.

<iframe width="512px" height="288px" src="https://www.youtube.com/embed/Ic9z4gKnHAA" frameborder="0" modestbranding="1" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="1"></iframe>
|
||||
|
|
|
|||
BIN
website/docs/assets/settings/anatomy_tasks.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 26 KiB