Mirror of https://github.com/ynput/ayon-core.git (synced 2025-12-24 21:04:40 +01:00)

Merge branch 'develop' into feature/OP-3776_Houdini-as-addon

Commit c2b090c73d
96 changed files with 1259 additions and 848 deletions
CHANGELOG.md (55 changed lines)
@@ -1,6 +1,6 @@
# Changelog

## [3.14.1-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.14.1-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.0...HEAD)

@@ -9,22 +9,49 @@
- Documentation: Few updates [\#3698](https://github.com/pypeclub/OpenPype/pull/3698)
- Documentation: Settings development [\#3660](https://github.com/pypeclub/OpenPype/pull/3660)

**🆕 New features**

- Webpublisher: change create flatten image into tri state [\#3678](https://github.com/pypeclub/OpenPype/pull/3678)

**🚀 Enhancements**

- Settings: Remove settings lock on tray exit [\#3720](https://github.com/pypeclub/OpenPype/pull/3720)
- General: Added helper getters to modules manager [\#3712](https://github.com/pypeclub/OpenPype/pull/3712)
- Unreal: Define unreal as module and use host class [\#3701](https://github.com/pypeclub/OpenPype/pull/3701)
- Settings: Lock settings UI session [\#3700](https://github.com/pypeclub/OpenPype/pull/3700)
- General: Benevolent context label collector [\#3686](https://github.com/pypeclub/OpenPype/pull/3686)
- Ftrack: Store ftrack entities on hierarchy integration to instances [\#3677](https://github.com/pypeclub/OpenPype/pull/3677)
- Ftrack: More logs related to auto sync value change [\#3671](https://github.com/pypeclub/OpenPype/pull/3671)
- Blender: ops refresh manager after process events [\#3663](https://github.com/pypeclub/OpenPype/pull/3663)

**🐛 Bug fixes**

- General: Logger tweaks [\#3741](https://github.com/pypeclub/OpenPype/pull/3741)
- Nuke: color-space settings from anatomy is working [\#3721](https://github.com/pypeclub/OpenPype/pull/3721)
- Settings: Fix studio default anatomy save [\#3716](https://github.com/pypeclub/OpenPype/pull/3716)
- Maya: Use project name instead of project code [\#3709](https://github.com/pypeclub/OpenPype/pull/3709)
- Settings: Fix project overrides save [\#3708](https://github.com/pypeclub/OpenPype/pull/3708)
- Workfiles tool: Fix published workfile filtering [\#3704](https://github.com/pypeclub/OpenPype/pull/3704)
- PS, AE: Provide default variant value for workfile subset [\#3703](https://github.com/pypeclub/OpenPype/pull/3703)
- RoyalRender: handle host name that is not set [\#3695](https://github.com/pypeclub/OpenPype/pull/3695)
- Flame: retime is working on clip publishing [\#3684](https://github.com/pypeclub/OpenPype/pull/3684)
- Webpublisher: added check for empty context [\#3682](https://github.com/pypeclub/OpenPype/pull/3682)

**🔀 Refactored code**

- General: Host addons cleanup [\#3744](https://github.com/pypeclub/OpenPype/pull/3744)
- Webpublisher: Webpublisher is used as addon [\#3740](https://github.com/pypeclub/OpenPype/pull/3740)
- Photoshop: Defined photoshop as addon [\#3736](https://github.com/pypeclub/OpenPype/pull/3736)
- Harmony: Defined harmony as addon [\#3734](https://github.com/pypeclub/OpenPype/pull/3734)
- General: Module interfaces cleanup [\#3731](https://github.com/pypeclub/OpenPype/pull/3731)
- AfterEffects: Move AE functions from general lib [\#3730](https://github.com/pypeclub/OpenPype/pull/3730)
- Blender: Define blender as module [\#3729](https://github.com/pypeclub/OpenPype/pull/3729)
- AfterEffects: Define AfterEffects as module [\#3728](https://github.com/pypeclub/OpenPype/pull/3728)
- General: Replace PypeLogger with Logger [\#3725](https://github.com/pypeclub/OpenPype/pull/3725)
- Nuke: Define nuke as module [\#3724](https://github.com/pypeclub/OpenPype/pull/3724)
- General: Move subset name functionality [\#3723](https://github.com/pypeclub/OpenPype/pull/3723)
- General: Move creators plugin getter [\#3714](https://github.com/pypeclub/OpenPype/pull/3714)
- General: Move constants from lib to client [\#3713](https://github.com/pypeclub/OpenPype/pull/3713)
- Loader: Subset groups using client operations [\#3710](https://github.com/pypeclub/OpenPype/pull/3710)
- TVPaint: Defined as module [\#3707](https://github.com/pypeclub/OpenPype/pull/3707)
- StandalonePublisher: Define StandalonePublisher as module [\#3706](https://github.com/pypeclub/OpenPype/pull/3706)

@@ -33,6 +60,7 @@

**Merged pull requests:**

- Hiero: Define hiero as module [\#3717](https://github.com/pypeclub/OpenPype/pull/3717)
- Deadline: better logging for DL webservice failures [\#3694](https://github.com/pypeclub/OpenPype/pull/3694)
- Photoshop: resize saved images in ExtractReview for ffmpeg [\#3676](https://github.com/pypeclub/OpenPype/pull/3676)

@@ -40,10 +68,6 @@

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.14.0-nightly.1...3.14.0)

**🆕 New features**

- Maya: Build workfile by template [\#3578](https://github.com/pypeclub/OpenPype/pull/3578)

**🚀 Enhancements**

- Ftrack: Additional component metadata [\#3685](https://github.com/pypeclub/OpenPype/pull/3685)

@@ -69,7 +93,6 @@
- Maya: Hosts as modules [\#3647](https://github.com/pypeclub/OpenPype/pull/3647)
- TimersManager: Plugins are in timers manager module [\#3639](https://github.com/pypeclub/OpenPype/pull/3639)
- General: Move workfiles functions into pipeline [\#3637](https://github.com/pypeclub/OpenPype/pull/3637)
- General: Workfiles builder using query functions [\#3598](https://github.com/pypeclub/OpenPype/pull/3598)

**Merged pull requests:**

@@ -91,12 +114,6 @@
- Editorial: Mix audio use side file for ffmpeg filters [\#3630](https://github.com/pypeclub/OpenPype/pull/3630)
- Ftrack: Comment template can contain optional keys [\#3615](https://github.com/pypeclub/OpenPype/pull/3615)
- Ftrack: Add more metadata to ftrack components [\#3612](https://github.com/pypeclub/OpenPype/pull/3612)
- General: Add context to pyblish context [\#3594](https://github.com/pypeclub/OpenPype/pull/3594)
- Kitsu: Shot&Sequence name with prefix over appends [\#3593](https://github.com/pypeclub/OpenPype/pull/3593)
- Photoshop: implemented {layer} placeholder in subset template [\#3591](https://github.com/pypeclub/OpenPype/pull/3591)
- General: Python module appdirs from git [\#3589](https://github.com/pypeclub/OpenPype/pull/3589)
- Ftrack: Update ftrack api to 2.3.3 [\#3588](https://github.com/pypeclub/OpenPype/pull/3588)
- General: New Integrator small fixes [\#3583](https://github.com/pypeclub/OpenPype/pull/3583)

**🐛 Bug fixes**

@@ -109,35 +126,21 @@
- AfterEffects: refactored integrate doesn't work for multi frame publishes [\#3610](https://github.com/pypeclub/OpenPype/pull/3610)
- Maya look data contents fails with custom attribute on group [\#3607](https://github.com/pypeclub/OpenPype/pull/3607)
- TrayPublisher: Fix wrong conflict merge [\#3600](https://github.com/pypeclub/OpenPype/pull/3600)
- Bugfix: Add OCIO as submodule to prepare for handling `maketx` color space conversion. [\#3590](https://github.com/pypeclub/OpenPype/pull/3590)
- Fix general settings environment variables resolution [\#3587](https://github.com/pypeclub/OpenPype/pull/3587)
- Editorial publishing workflow improvements [\#3580](https://github.com/pypeclub/OpenPype/pull/3580)
- General: Update imports in start script [\#3579](https://github.com/pypeclub/OpenPype/pull/3579)
- Nuke: render family integration consistency [\#3576](https://github.com/pypeclub/OpenPype/pull/3576)
- Ftrack: Handle missing published path in integrator [\#3570](https://github.com/pypeclub/OpenPype/pull/3570)

**🔀 Refactored code**

- General: Plugin settings handled by plugins [\#3623](https://github.com/pypeclub/OpenPype/pull/3623)
- General: Naive implementation of document create, update, delete [\#3601](https://github.com/pypeclub/OpenPype/pull/3601)
- General: Use query functions in general code [\#3596](https://github.com/pypeclub/OpenPype/pull/3596)
- General: Separate extraction of template data into more functions [\#3574](https://github.com/pypeclub/OpenPype/pull/3574)
- General: Lib cleanup [\#3571](https://github.com/pypeclub/OpenPype/pull/3571)

**Merged pull requests:**

- Webpublisher: timeout for PS studio processing [\#3619](https://github.com/pypeclub/OpenPype/pull/3619)
- Core: translated validate\_containers.py into New publisher style [\#3614](https://github.com/pypeclub/OpenPype/pull/3614)
- Enable write color sets on animation publish automatically [\#3582](https://github.com/pypeclub/OpenPype/pull/3582)

## [3.12.2](https://github.com/pypeclub/OpenPype/tree/3.12.2) (2022-07-27)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.2-nightly.4...3.12.2)

**🐛 Bug fixes**

- Maya: fix Review image plane attribute [\#3569](https://github.com/pypeclub/OpenPype/pull/3569)

## [3.12.1](https://github.com/pypeclub/OpenPype/tree/3.12.1) (2022-07-13)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.1-nightly.6...3.12.1)
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
from .module import AfterEffectsModule
|
||||
from .addon import AfterEffectsAddon
|
||||
|
||||
|
||||
__all__ = (
|
||||
"AfterEffectsModule",
|
||||
"AfterEffectsAddon",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
|
||||
class AfterEffectsModule(OpenPypeModule, IHostModule):
|
||||
class AfterEffectsAddon(OpenPypeModule, IHostAddon):
|
||||
name = "aftereffects"
|
||||
host_name = "aftereffects"
|
||||
|
||||
|
|
@ -1,13 +1,16 @@
|
|||
import os
|
||||
import sys
|
||||
import re
|
||||
import json
|
||||
import contextlib
|
||||
import traceback
|
||||
import logging
|
||||
from functools import partial
|
||||
|
||||
from Qt import QtWidgets
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.lib.remote_publish import headless_publish
|
||||
from openpype.modules import ModulesManager
|
||||
|
||||
from openpype.tools.utils import host_tools
|
||||
from .launch_logic import ProcessLauncher, get_stub
|
||||
|
|
@ -35,10 +38,18 @@ def main(*subprocess_args):
|
|||
launcher.start()
|
||||
|
||||
if os.environ.get("HEADLESS_PUBLISH"):
|
||||
launcher.execute_in_main_thread(lambda: headless_publish(
|
||||
log,
|
||||
"CloseAE",
|
||||
os.environ.get("IS_TEST")))
|
||||
manager = ModulesManager()
|
||||
webpublisher_addon = manager["webpublisher"]
|
||||
|
||||
launcher.execute_in_main_thread(
|
||||
partial(
|
||||
webpublisher_addon.headless_publish,
|
||||
log,
|
||||
"CloseAE",
|
||||
os.environ.get("IS_TEST")
|
||||
)
|
||||
)
|
||||
|
||||
elif os.environ.get("AVALON_PHOTOSHOP_WORKFILES_ON_LAUNCH", True):
|
||||
save = False
|
||||
if os.getenv("WORKFILES_SAVE_AS"):
|
||||
|
|
@@ -68,3 +79,57 @@ def get_extension_manifest_path():
        "CSXS",
        "manifest.xml"
    )


def get_unique_layer_name(layers, name):
    """
    Gets all layer names and, if 'name' is already present among them,
    increases the suffix by 1 (eg. creates a unique layer name - for Loader)
    Args:
        layers (list): of strings, names only
        name (string): checked value

    Returns:
        (string): name_00X (without version)
    """
    names = {}
    for layer in layers:
        layer_name = re.sub(r'_\d{3}$', '', layer)
        if layer_name in names.keys():
            names[layer_name] = names[layer_name] + 1
        else:
            names[layer_name] = 1
    occurrences = names.get(name, 0)

    return "{}_{:0>3d}".format(name, occurrences + 1)


def get_background_layers(file_url):
    """
    Pulls file names from the background json file, enriched with the folder
    url so AE is able to import the files.

    Order is important, follows order in json.

    Args:
        file_url (str): abs url of background json

    Returns:
        (list): of abs paths to images
    """
    with open(file_url) as json_file:
        data = json.load(json_file)

    layers = list()
    bg_folder = os.path.dirname(file_url)
    for child in data['children']:
        if child.get("filename"):
            layers.append(os.path.join(bg_folder, child.get("filename")).
                          replace("\\", "/"))
        else:
            for layer in child['children']:
                if layer.get("filename"):
                    layers.append(os.path.join(bg_folder,
                                               layer.get("filename")).
                                  replace("\\", "/"))
    return layers
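For reference (not part of this commit), a minimal sketch of the JSON shape the traversal above expects; the file names and the temporary folder are hypothetical:

```python
import json
import os
import tempfile

from openpype.hosts.aftereffects.api.lib import get_background_layers

# Hypothetical background description: one flat layer plus one nested group.
data = {
    "children": [
        {"filename": "bg_far.png"},
        {"children": [
            {"filename": "bg_mid.png"},
            {"filename": "bg_near.png"},
        ]},
    ],
}

bg_dir = tempfile.mkdtemp()
json_path = os.path.join(bg_dir, "background.json")
with open(json_path, "w") as f:
    json.dump(data, f)

# Returns absolute paths to bg_far.png, bg_mid.png and bg_near.png,
# in the same order as in the json file.
print(get_background_layers(json_path))
```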
|
||||
|
|
|
|||
|
|
@ -1,14 +1,14 @@
|
|||
import re
|
||||
|
||||
from openpype.lib import (
|
||||
get_background_layers,
|
||||
get_unique_layer_name
|
||||
)
|
||||
from openpype.pipeline import get_representation_path
|
||||
from openpype.hosts.aftereffects.api import (
|
||||
AfterEffectsLoader,
|
||||
containerise
|
||||
)
|
||||
from openpype.hosts.aftereffects.api.lib import (
|
||||
get_background_layers,
|
||||
get_unique_layer_name,
|
||||
)
|
||||
|
||||
|
||||
class BackgroundLoader(AfterEffectsLoader):
|
||||
|
|
|
|||
|
|
@ -1,12 +1,11 @@
|
|||
import re
|
||||
|
||||
from openpype import lib
|
||||
|
||||
from openpype.pipeline import get_representation_path
|
||||
from openpype.hosts.aftereffects.api import (
|
||||
AfterEffectsLoader,
|
||||
containerise
|
||||
)
|
||||
from openpype.hosts.aftereffects.api.lib import get_unique_layer_name
|
||||
|
||||
|
||||
class FileLoader(AfterEffectsLoader):
|
||||
|
|
@ -28,7 +27,7 @@ class FileLoader(AfterEffectsLoader):
|
|||
stub = self.get_stub()
|
||||
layers = stub.get_items(comps=True, folders=True, footages=True)
|
||||
existing_layers = [layer.name for layer in layers]
|
||||
comp_name = lib.get_unique_layer_name(
|
||||
comp_name = get_unique_layer_name(
|
||||
existing_layers, "{}_{}".format(context["asset"]["name"], name))
|
||||
|
||||
import_options = {}
|
||||
|
|
@ -87,7 +86,7 @@ class FileLoader(AfterEffectsLoader):
|
|||
if namespace_from_container != layer_name:
|
||||
layers = stub.get_items(comps=True)
|
||||
existing_layers = [layer.name for layer in layers]
|
||||
layer_name = lib.get_unique_layer_name(
|
||||
layer_name = get_unique_layer_name(
|
||||
existing_layers,
|
||||
"{}_{}".format(context["asset"], context["subset"]))
|
||||
else: # switching version - keep same name
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
import os
|
||||
|
||||
import pyblish.api
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectWorkfile(pyblish.api.ContextPlugin):
|
||||
|
|
@ -71,13 +71,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
|
|||
|
||||
# workfile instance
|
||||
family = "workfile"
|
||||
subset = get_subset_name_with_asset_doc(
|
||||
subset = get_subset_name(
|
||||
family,
|
||||
self.default_variant,
|
||||
context.data["anatomyData"]["task"]["name"],
|
||||
context.data["assetEntity"],
|
||||
context.data["anatomyData"]["project"]["name"],
|
||||
host_name=context.data["hostName"]
|
||||
host_name=context.data["hostName"],
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
# Create instance
|
||||
instance = context.create_instance(subset)
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
from .module import BlenderModule
|
||||
from .addon import BlenderAddon
|
||||
|
||||
|
||||
__all__ = (
|
||||
"BlenderModule",
|
||||
"BlenderAddon",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,11 +1,11 @@
|
|||
import os
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
BLENDER_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class BlenderModule(OpenPypeModule, IHostModule):
|
||||
class BlenderAddon(OpenPypeModule, IHostAddon):
|
||||
name = "blender"
|
||||
host_name = "blender"
|
||||
|
||||
|
|
@ -234,7 +234,7 @@ def lsattrs(attrs: Dict) -> List:
|
|||
def read(node: bpy.types.bpy_struct_meta_idprop):
|
||||
"""Return user-defined attributes from `node`"""
|
||||
|
||||
data = dict(node.get(pipeline.AVALON_PROPERTY))
|
||||
data = dict(node.get(pipeline.AVALON_PROPERTY, {}))
|
||||
|
||||
# Ignore hidden/internal data
|
||||
data = {
|
||||
|
|
|
|||
|
|
@ -26,7 +26,7 @@ PREVIEW_COLLECTIONS: Dict = dict()
|
|||
# This seems like a good value to keep the Qt app responsive and doesn't slow
|
||||
# down Blender. At least on macOS the interface of Blender gets very laggy if
|
||||
# you make it smaller.
|
||||
TIMER_INTERVAL: float = 0.01
|
||||
TIMER_INTERVAL: float = 0.01 if platform.system() == "Windows" else 0.1
|
||||
|
||||
|
||||
class BlenderApplication(QtWidgets.QApplication):
|
||||
|
|
@ -164,6 +164,12 @@ def _process_app_events() -> Optional[float]:
|
|||
dialog.setDetailedText(detail)
|
||||
dialog.exec_()
|
||||
|
||||
# Refresh Manager
|
||||
if GlobalClass.app:
|
||||
manager = GlobalClass.app.get_window("WM_OT_avalon_manager")
|
||||
if manager:
|
||||
manager.refresh()
|
||||
|
||||
if not GlobalClass.is_windows:
|
||||
if OpenFileCacher.opening_file:
|
||||
return TIMER_INTERVAL
|
||||
|
|
@ -192,10 +198,11 @@ class LaunchQtApp(bpy.types.Operator):
|
|||
self._app = BlenderApplication.get_app()
|
||||
GlobalClass.app = self._app
|
||||
|
||||
bpy.app.timers.register(
|
||||
_process_app_events,
|
||||
persistent=True
|
||||
)
|
||||
if not bpy.app.timers.is_registered(_process_app_events):
|
||||
bpy.app.timers.register(
|
||||
_process_app_events,
|
||||
persistent=True
|
||||
)
|
||||
|
||||
def execute(self, context):
|
||||
"""Execute the operator.
|
||||
|
|
|
|||
|
|
@ -1,4 +1,10 @@
|
|||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.blender import api
|
||||
|
||||
install_host(api)
|
||||
|
||||
def register():
|
||||
install_host(api)
|
||||
|
||||
|
||||
def unregister():
|
||||
pass
|
||||
|
|
|
|||
|
|
@ -1,9 +1,9 @@
|
|||
import pyblish.api
|
||||
|
||||
import openpype.lib as oplib
|
||||
from openpype.pipeline import legacy_io
|
||||
import openpype.hosts.flame.api as opfapi
|
||||
from openpype.hosts.flame.otio import flame_export
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollecTimelineOTIO(pyblish.api.ContextPlugin):
|
||||
|
|
@ -24,11 +24,14 @@ class CollecTimelineOTIO(pyblish.api.ContextPlugin):
|
|||
sequence = opfapi.get_current_sequence(opfapi.CTX.selection)
|
||||
|
||||
# create subset name
|
||||
subset_name = oplib.get_subset_name_with_asset_doc(
|
||||
subset_name = get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
context.data["projectName"],
|
||||
context.data["hostName"],
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
|
||||
# adding otio timeline to context
|
||||
|
|
|
|||
|
|
@ -1,11 +1,10 @@
|
|||
import os
|
||||
from .addon import (
|
||||
HARMONY_HOST_DIR,
|
||||
HarmonyAddon,
|
||||
)
|
||||
|
||||
|
||||
def add_implementation_envs(env, _app):
|
||||
"""Modify environments to contain all required for implementation."""
|
||||
openharmony_path = os.path.join(
|
||||
os.environ["OPENPYPE_REPOS_ROOT"], "openpype", "hosts",
|
||||
"harmony", "vendor", "OpenHarmony"
|
||||
)
|
||||
# TODO check if is already set? What to do if is already set?
|
||||
env["LIB_OPENHARMONY_PATH"] = openharmony_path
|
||||
__all__ = (
|
||||
"HARMONY_HOST_DIR",
|
||||
"HarmonyAddon",
|
||||
)
|
||||
|
|
|
|||
openpype/hosts/harmony/addon.py (new file, 24 lines)
|
|
@ -0,0 +1,24 @@
|
|||
import os
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
HARMONY_HOST_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class HarmonyAddon(OpenPypeModule, IHostAddon):
|
||||
name = "harmony"
|
||||
host_name = "harmony"
|
||||
|
||||
def initialize(self, module_settings):
|
||||
self.enabled = True
|
||||
|
||||
def add_implementation_envs(self, env, _app):
|
||||
"""Modify environments to contain all required for implementation."""
|
||||
openharmony_path = os.path.join(
|
||||
HARMONY_HOST_DIR, "vendor", "OpenHarmony"
|
||||
)
|
||||
# TODO check if is already set? What to do if is already set?
|
||||
env["LIB_OPENHARMONY_PATH"] = openharmony_path
|
||||
|
||||
def get_workfile_extensions(self):
|
||||
return [".zip"]
|
||||
|
|
@ -14,14 +14,14 @@ from openpype.pipeline import (
|
|||
)
|
||||
from openpype.pipeline.load import get_outdated_containers
|
||||
from openpype.pipeline.context_tools import get_current_project_asset
|
||||
import openpype.hosts.harmony
|
||||
|
||||
from openpype.hosts.harmony import HARMONY_HOST_DIR
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
log = logging.getLogger("openpype.hosts.harmony")
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.harmony.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PLUGINS_DIR = os.path.join(HARMONY_HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
|
|
|
|||
|
|
@ -2,8 +2,6 @@
|
|||
import os
|
||||
import shutil
|
||||
|
||||
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
|
||||
|
||||
from .lib import (
|
||||
ProcessContext,
|
||||
get_local_harmony_path,
|
||||
|
|
@ -16,7 +14,7 @@ save_disabled = False
|
|||
|
||||
|
||||
def file_extensions():
|
||||
return HOST_WORKFILE_EXTENSIONS["harmony"]
|
||||
return [".zip"]
|
||||
|
||||
|
||||
def has_unsaved_changes():
|
||||
|
|
|
|||
|
|
@ -1,9 +1,9 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Collect current workfile from Harmony."""
|
||||
import pyblish.api
|
||||
import os
|
||||
import pyblish.api
|
||||
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectWorkfile(pyblish.api.ContextPlugin):
|
||||
|
|
@ -17,13 +17,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
|
|||
"""Plugin entry point."""
|
||||
family = "workfile"
|
||||
basename = os.path.basename(context.data["currentFile"])
|
||||
subset = get_subset_name_with_asset_doc(
|
||||
subset = get_subset_name(
|
||||
family,
|
||||
"",
|
||||
context.data["anatomyData"]["task"]["name"],
|
||||
context.data["assetEntity"],
|
||||
context.data["anatomyData"]["project"]["name"],
|
||||
host_name=context.data["hostName"]
|
||||
host_name=context.data["hostName"],
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
|
||||
# Create instance
|
||||
|
|
|
|||
|
|
@ -1,10 +1,10 @@
|
|||
from .module import (
|
||||
from .addon import (
|
||||
HIERO_ROOT_DIR,
|
||||
HieroModule,
|
||||
HieroAddon,
|
||||
)
|
||||
|
||||
|
||||
__all__ = (
|
||||
"HIERO_ROOT_DIR",
|
||||
"HieroModule",
|
||||
"HieroAddon",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,12 +1,12 @@
|
|||
import os
|
||||
import platform
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
HIERO_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class HieroModule(OpenPypeModule, IHostModule):
|
||||
class HieroAddon(OpenPypeModule, IHostAddon):
|
||||
name = "hiero"
|
||||
host_name = "hiero"
|
||||
|
||||
|
|
@ -1,6 +1,6 @@
|
|||
from .module import OpenPypeMaya
|
||||
from .addon import MayaAddon
|
||||
|
||||
|
||||
__all__ = (
|
||||
"OpenPypeMaya",
|
||||
"MayaAddon",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,12 +1,12 @@
|
|||
import os
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
MAYA_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class OpenPypeMaya(OpenPypeModule, IHostModule):
|
||||
name = "openpype_maya"
|
||||
class MayaAddon(OpenPypeModule, IHostAddon):
|
||||
name = "maya"
|
||||
host_name = "maya"
|
||||
|
||||
def initialize(self, module_settings):
|
||||
|
|
@ -1,10 +1,10 @@
|
|||
from .module import (
|
||||
from .addon import (
|
||||
NUKE_ROOT_DIR,
|
||||
NukeModule,
|
||||
NukeAddon,
|
||||
)
|
||||
|
||||
|
||||
__all__ = (
|
||||
"NUKE_ROOT_DIR",
|
||||
"NukeModule",
|
||||
"NukeAddon",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,12 +1,12 @@
|
|||
import os
|
||||
import platform
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
NUKE_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class NukeModule(OpenPypeModule, IHostModule):
|
||||
class NukeAddon(OpenPypeModule, IHostAddon):
|
||||
name = "nuke"
|
||||
host_name = "nuke"
|
||||
|
||||
|
|
@ -1952,15 +1952,25 @@ class WorkfileSettings(object):
|
|||
if not write_node:
|
||||
return
|
||||
|
||||
# write all knobs to node
|
||||
for knob in nuke_imageio_writes["knobs"]:
|
||||
value = knob["value"]
|
||||
if isinstance(value, six.text_type):
|
||||
value = str(value)
|
||||
if str(value).startswith("0x"):
|
||||
value = int(value, 16)
|
||||
try:
|
||||
# write all knobs to node
|
||||
for knob in nuke_imageio_writes["knobs"]:
|
||||
value = knob["value"]
|
||||
if isinstance(value, six.text_type):
|
||||
value = str(value)
|
||||
if str(value).startswith("0x"):
|
||||
value = int(value, 16)
|
||||
|
||||
write_node[knob["name"]].setValue(value)
|
||||
log.debug("knob: {}| value: {}".format(
|
||||
knob["name"], value
|
||||
))
|
||||
write_node[knob["name"]].setValue(value)
|
||||
except TypeError:
|
||||
log.warning(
|
||||
"Legacy workflow didnt work, switching to current")
|
||||
|
||||
set_node_knobs_from_settings(
|
||||
write_node, nuke_imageio_writes["knobs"])
|
||||
|
||||
def set_reads_colorspace(self, read_clrs_inputs):
|
||||
""" Setting colorspace to Read nodes
|
||||
|
|
@ -2017,12 +2027,14 @@ class WorkfileSettings(object):
|
|||
# get imageio
|
||||
nuke_colorspace = get_nuke_imageio_settings()
|
||||
|
||||
log.info("Setting colorspace to workfile...")
|
||||
try:
|
||||
self.set_root_colorspace(nuke_colorspace["workfile"])
|
||||
except AttributeError:
|
||||
msg = "set_colorspace(): missing `workfile` settings in template"
|
||||
nuke.message(msg)
|
||||
|
||||
log.info("Setting colorspace to viewers...")
|
||||
try:
|
||||
self.set_viewers_colorspace(nuke_colorspace["viewer"])
|
||||
except AttributeError:
|
||||
|
|
@ -2030,24 +2042,18 @@ class WorkfileSettings(object):
|
|||
nuke.message(msg)
|
||||
log.error(msg)
|
||||
|
||||
log.info("Setting colorspace to write nodes...")
|
||||
try:
|
||||
self.set_writes_colorspace()
|
||||
except AttributeError as _error:
|
||||
nuke.message(_error)
|
||||
log.error(_error)
|
||||
|
||||
log.info("Setting colorspace to read nodes...")
|
||||
read_clrs_inputs = nuke_colorspace["regexInputs"].get("inputs", [])
|
||||
if read_clrs_inputs:
|
||||
self.set_reads_colorspace(read_clrs_inputs)
|
||||
|
||||
try:
|
||||
for key in nuke_colorspace:
|
||||
log.debug("Preset's colorspace key: {}".format(key))
|
||||
except TypeError:
|
||||
msg = "Nuke is not in templates! Contact your supervisor!"
|
||||
nuke.message(msg)
|
||||
log.error(msg)
|
||||
|
||||
def reset_frame_range_handles(self):
|
||||
"""Set frame range to current asset"""
|
||||
|
||||
|
|
|
|||
|
|
@ -1,9 +1,10 @@
|
|||
def add_implementation_envs(env, _app):
|
||||
"""Modify environments to contain all required for implementation."""
|
||||
defaults = {
|
||||
"OPENPYPE_LOG_NO_COLORS": "True",
|
||||
"WEBSOCKET_URL": "ws://localhost:8099/ws/"
|
||||
}
|
||||
for key, value in defaults.items():
|
||||
if not env.get(key):
|
||||
env[key] = value
|
||||
from .addon import (
|
||||
PhotoshopAddon,
|
||||
PHOTOSHOP_HOST_DIR,
|
||||
)
|
||||
|
||||
|
||||
__all__ = (
|
||||
"PhotoshopAddon",
|
||||
"PHOTOSHOP_HOST_DIR",
|
||||
)
|
||||
|
|
|
|||
openpype/hosts/photoshop/addon.py (new file, 26 lines)
|
|
@ -0,0 +1,26 @@
|
|||
import os
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
PHOTOSHOP_HOST_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class PhotoshopAddon(OpenPypeModule, IHostAddon):
|
||||
name = "photoshop"
|
||||
host_name = "photoshop"
|
||||
|
||||
def initialize(self, module_settings):
|
||||
self.enabled = True
|
||||
|
||||
def add_implementation_envs(self, env, _app):
|
||||
"""Modify environments to contain all required for implementation."""
|
||||
defaults = {
|
||||
"OPENPYPE_LOG_NO_COLORS": "True",
|
||||
"WEBSOCKET_URL": "ws://localhost:8099/ws/"
|
||||
}
|
||||
for key, value in defaults.items():
|
||||
if not env.get(key):
|
||||
env[key] = value
|
||||
|
||||
def get_workfile_extensions(self):
|
||||
return [".psd", ".psb"]
|
||||
|
|
@ -5,11 +5,10 @@ import traceback
|
|||
|
||||
from Qt import QtWidgets
|
||||
|
||||
from openpype.api import Logger
|
||||
from openpype.lib import env_value_to_bool, Logger
|
||||
from openpype.modules import ModulesManager
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.tools.utils import host_tools
|
||||
from openpype.lib.remote_publish import headless_publish
|
||||
from openpype.lib import env_value_to_bool
|
||||
|
||||
from .launch_logic import ProcessLauncher, stub
|
||||
|
||||
|
|
@ -35,8 +34,10 @@ def main(*subprocess_args):
|
|||
launcher.start()
|
||||
|
||||
if env_value_to_bool("HEADLESS_PUBLISH"):
|
||||
manager = ModulesManager()
|
||||
webpublisher_addon = manager["webpublisher"]
|
||||
launcher.execute_in_main_thread(
|
||||
headless_publish,
|
||||
webpublisher_addon.headless_publish,
|
||||
log,
|
||||
"ClosePS",
|
||||
os.environ.get("IS_TEST")
|
||||
|
|
|
|||
|
|
@ -14,14 +14,13 @@ from openpype.pipeline import (
|
|||
AVALON_CONTAINER_ID,
|
||||
)
|
||||
from openpype.pipeline.load import any_outdated_containers
|
||||
import openpype.hosts.photoshop
|
||||
from openpype.hosts.photoshop import PHOTOSHOP_HOST_DIR
|
||||
|
||||
from . import lib
|
||||
|
||||
log = Logger.get_logger(__name__)
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.photoshop.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PLUGINS_DIR = os.path.join(PHOTOSHOP_HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
|
|
|
|||
|
|
@ -1,7 +1,6 @@
|
|||
"""Host API required Work Files tool"""
|
||||
import os
|
||||
|
||||
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
|
||||
from . import lib
|
||||
|
||||
|
||||
|
|
@ -14,7 +13,7 @@ def _active_document():
|
|||
|
||||
|
||||
def file_extensions():
|
||||
return HOST_WORKFILE_EXTENSIONS["photoshop"]
|
||||
return [".psd", ".psb"]
|
||||
|
||||
|
||||
def has_unsaved_changes():
|
||||
|
|
|
|||
|
|
@ -17,11 +17,11 @@ import os
|
|||
|
||||
import pyblish.api
|
||||
|
||||
from openpype.lib.plugin_tools import (
|
||||
parse_json,
|
||||
get_batch_asset_task_info
|
||||
)
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype_modules.webpublisher.lib import (
|
||||
get_batch_asset_task_info,
|
||||
parse_json
|
||||
)
|
||||
|
||||
|
||||
class CollectBatchData(pyblish.api.ContextPlugin):
|
||||
|
|
|
|||
|
|
@ -9,14 +9,22 @@ from openpype.settings import get_project_settings
|
|||
|
||||
|
||||
class CollectColorCodedInstances(pyblish.api.ContextPlugin):
|
||||
"""Creates instances for configured color code of a layer.
|
||||
"""Creates instances for layers marked by configurable color.
|
||||
|
||||
Used in remote publishing when artists mark publishable layers by color-
|
||||
coding.
|
||||
coding. Top level layers (group) must be marked by specific color to be
|
||||
published as an instance of 'image' family.
|
||||
|
||||
Can add group for all publishable layers to allow creation of flattened
|
||||
image. (Cannot contain special background layer as it cannot be grouped!)
|
||||
|
||||
Based on value `create_flatten_image` from Settings:
|
||||
- "yes": create flattened 'image' subset of all publishable layers + create
|
||||
'image' subset per publishable layer
|
||||
- "only": create ONLY flattened 'image' subset of all publishable layers
|
||||
- "no": do not create flattened 'image' subset at all,
|
||||
only separate subsets per marked layer.
|
||||
|
||||
Identifier:
|
||||
id (str): "pyblish.avalon.instance"
|
||||
"""
|
||||
|
|
@ -32,8 +40,7 @@ class CollectColorCodedInstances(pyblish.api.ContextPlugin):
|
|||
# TODO check if could be set globally, probably doesn't make sense when
|
||||
# flattened template cannot
|
||||
subset_template_name = ""
|
||||
create_flatten_image = False
|
||||
# probably not possible to configure this globally
|
||||
create_flatten_image = "no"
|
||||
flatten_subset_template = ""
|
||||
|
||||
def process(self, context):
|
||||
|
|
@ -62,6 +69,7 @@ class CollectColorCodedInstances(pyblish.api.ContextPlugin):
|
|||
|
||||
publishable_layers = []
|
||||
created_instances = []
|
||||
family_from_settings = None
|
||||
for layer in layers:
|
||||
self.log.debug("Layer:: {}".format(layer))
|
||||
if layer.parents:
|
||||
|
|
@ -80,6 +88,9 @@ class CollectColorCodedInstances(pyblish.api.ContextPlugin):
|
|||
self.log.debug("!!! Not found family or template, skip")
|
||||
continue
|
||||
|
||||
if not family_from_settings:
|
||||
family_from_settings = resolved_family
|
||||
|
||||
fill_pairs = {
|
||||
"variant": variant,
|
||||
"family": resolved_family,
|
||||
|
|
@ -98,13 +109,16 @@ class CollectColorCodedInstances(pyblish.api.ContextPlugin):
|
|||
"Subset {} already created, skipping.".format(subset))
|
||||
continue
|
||||
|
||||
instance = self._create_instance(context, layer, resolved_family,
|
||||
asset_name, subset, task_name)
|
||||
if self.create_flatten_image != "flatten_only":
|
||||
instance = self._create_instance(context, layer,
|
||||
resolved_family,
|
||||
asset_name, subset, task_name)
|
||||
created_instances.append(instance)
|
||||
|
||||
existing_subset_names.append(subset)
|
||||
publishable_layers.append(layer)
|
||||
created_instances.append(instance)
|
||||
|
||||
if self.create_flatten_image and publishable_layers:
|
||||
if self.create_flatten_image != "no" and publishable_layers:
|
||||
self.log.debug("create_flatten_image")
|
||||
if not self.flatten_subset_template:
|
||||
self.log.warning("No template for flatten image")
|
||||
|
|
@ -116,7 +130,7 @@ class CollectColorCodedInstances(pyblish.api.ContextPlugin):
|
|||
|
||||
first_layer = publishable_layers[0] # dummy layer
|
||||
first_layer.name = subset
|
||||
family = created_instances[0].data["family"] # inherit family
|
||||
family = family_from_settings # inherit family
|
||||
instance = self._create_instance(context, first_layer,
|
||||
family,
|
||||
asset_name, subset, task_name)
|
||||
|
|
|
|||
|
|
@ -10,7 +10,7 @@ import os
|
|||
|
||||
import pyblish.api
|
||||
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectReview(pyblish.api.ContextPlugin):
|
||||
|
|
@ -27,13 +27,14 @@ class CollectReview(pyblish.api.ContextPlugin):
|
|||
|
||||
def process(self, context):
|
||||
family = "review"
|
||||
subset = get_subset_name_with_asset_doc(
|
||||
subset = get_subset_name(
|
||||
family,
|
||||
context.data.get("variant", ''),
|
||||
context.data["anatomyData"]["task"]["name"],
|
||||
context.data["assetEntity"],
|
||||
context.data["anatomyData"]["project"]["name"],
|
||||
host_name=context.data["hostName"]
|
||||
host_name=context.data["hostName"],
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
|
||||
instance = context.create_instance(subset)
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectWorkfile(pyblish.api.ContextPlugin):
|
||||
|
|
@ -24,13 +24,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
|
|||
family = "workfile"
|
||||
# context.data["variant"] might come only from collect_batch_data
|
||||
variant = context.data.get("variant") or self.default_variant
|
||||
subset = get_subset_name_with_asset_doc(
|
||||
subset = get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
context.data["anatomyData"]["task"]["name"],
|
||||
context.data["assetEntity"],
|
||||
context.data["anatomyData"]["project"]["name"],
|
||||
host_name=context.data["hostName"]
|
||||
host_name=context.data["hostName"],
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
|
||||
file_path = context.data["currentFile"]
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
from .standalonepublish_module import StandAlonePublishModule
|
||||
from .addon import StandAlonePublishAddon
|
||||
|
||||
|
||||
__all__ = (
|
||||
"StandAlonePublishModule",
|
||||
"StandAlonePublishAddon",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -5,18 +5,18 @@ import click
|
|||
from openpype.lib import get_openpype_execute_args
|
||||
from openpype.lib.execute import run_detached_process
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import ITrayAction, IHostModule
|
||||
from openpype.modules.interfaces import ITrayAction, IHostAddon
|
||||
|
||||
STANDALONEPUBLISH_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class StandAlonePublishModule(OpenPypeModule, ITrayAction, IHostModule):
|
||||
class StandAlonePublishAddon(OpenPypeModule, ITrayAction, IHostAddon):
|
||||
label = "Publish"
|
||||
name = "standalonepublish_tool"
|
||||
name = "standalonepublisher"
|
||||
host_name = "standalonepublisher"
|
||||
|
||||
def initialize(self, modules_settings):
|
||||
self.enabled = modules_settings[self.name]["enabled"]
|
||||
self.enabled = modules_settings["standalonepublish_tool"]["enabled"]
|
||||
self.publish_paths = [
|
||||
os.path.join(STANDALONEPUBLISH_ROOT_DIR, "plugins", "publish")
|
||||
]
|
||||
|
|
@ -42,7 +42,7 @@ class StandAlonePublishModule(OpenPypeModule, ITrayAction, IHostModule):
|
|||
|
||||
|
||||
@click.group(
|
||||
StandAlonePublishModule.name,
|
||||
StandAlonePublishAddon.name,
|
||||
help="StandalonePublisher related commands.")
|
||||
def cli_main():
|
||||
pass
|
||||
|
|
@ -2,8 +2,8 @@ import copy
|
|||
import json
|
||||
import pyblish.api
|
||||
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.client import get_asset_by_name
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectBulkMovInstances(pyblish.api.InstancePlugin):
|
||||
|
|
@ -44,12 +44,14 @@ class CollectBulkMovInstances(pyblish.api.InstancePlugin):
|
|||
task_name = available_task_names[_task_name_low]
|
||||
break
|
||||
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
subset_name = get_subset_name(
|
||||
self.new_instance_family,
|
||||
self.subset_name_variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name
|
||||
project_name,
|
||||
host_name=context.data["hostName"],
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
instance_name = f"{asset_name}_{subset_name}"
|
||||
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
from .module import TrayPublishModule
|
||||
from .addon import TrayPublishAddon
|
||||
|
||||
|
||||
__all__ = (
|
||||
"TrayPublishModule",
|
||||
"TrayPublishAddon",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -5,15 +5,15 @@ import click
|
|||
from openpype.lib import get_openpype_execute_args
|
||||
from openpype.lib.execute import run_detached_process
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import ITrayAction, IHostModule
|
||||
from openpype.modules.interfaces import ITrayAction, IHostAddon
|
||||
|
||||
TRAYPUBLISH_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class TrayPublishModule(OpenPypeModule, IHostModule, ITrayAction):
|
||||
class TrayPublishAddon(OpenPypeModule, IHostAddon, ITrayAction):
|
||||
label = "New Publish (beta)"
|
||||
name = "traypublish_tool"
|
||||
host_name = "traypublish"
|
||||
name = "traypublisher"
|
||||
host_name = "traypublisher"
|
||||
|
||||
def initialize(self, modules_settings):
|
||||
self.enabled = True
|
||||
|
|
@ -28,7 +28,7 @@ class TrayPublishModule(OpenPypeModule, IHostModule, ITrayAction):
|
|||
self._experimental_tools = ExperimentalTools()
|
||||
|
||||
def tray_menu(self, *args, **kwargs):
|
||||
super(TrayPublishModule, self).tray_menu(*args, **kwargs)
|
||||
super(TrayPublishAddon, self).tray_menu(*args, **kwargs)
|
||||
traypublisher = self._experimental_tools.get("traypublisher")
|
||||
visible = False
|
||||
if traypublisher and traypublisher.enabled:
|
||||
|
|
@ -53,7 +53,7 @@ class TrayPublishModule(OpenPypeModule, IHostModule, ITrayAction):
|
|||
click_group.add_command(cli_main)
|
||||
|
||||
|
||||
@click.group(TrayPublishModule.name, help="TrayPublisher related commands.")
|
||||
@click.group(TrayPublishAddon.name, help="TrayPublisher related commands.")
|
||||
def cli_main():
|
||||
pass
|
||||
|
||||
|
|
@ -6,13 +6,15 @@ from openpype.client import get_assets, get_asset_by_name
|
|||
from openpype.lib import (
|
||||
FileDef,
|
||||
BoolDef,
|
||||
get_subset_name_with_asset_doc,
|
||||
TaskNotSetError,
|
||||
)
|
||||
from openpype.pipeline import (
|
||||
CreatedInstance,
|
||||
CreatorError
|
||||
)
|
||||
from openpype.pipeline.create import (
|
||||
get_subset_name,
|
||||
TaskNotSetError,
|
||||
)
|
||||
|
||||
from openpype.hosts.traypublisher.api.plugin import TrayPublishCreator
|
||||
|
||||
|
|
@ -130,7 +132,7 @@ class BatchMovieCreator(TrayPublishCreator):
|
|||
task_name = self._get_task_name(asset_doc)
|
||||
|
||||
try:
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
subset_name = get_subset_name(
|
||||
self.family,
|
||||
variant,
|
||||
task_name,
|
||||
|
|
@ -143,7 +145,7 @@ class BatchMovieCreator(TrayPublishCreator):
|
|||
# but user have ability to change it
|
||||
# NOTE: This expect that there is not task 'Undefined' on asset
|
||||
task_name = "Undefined"
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
subset_name = get_subset_name(
|
||||
self.family,
|
||||
variant,
|
||||
task_name,
|
||||
|
|
|
|||
|
|
@ -1,12 +1,12 @@
|
|||
from .tvpaint_module import (
|
||||
from .addon import (
|
||||
get_launch_script_path,
|
||||
TVPaintModule,
|
||||
TVPaintAddon,
|
||||
TVPAINT_ROOT_DIR,
|
||||
)
|
||||
|
||||
|
||||
__all__ = (
|
||||
"get_launch_script_path",
|
||||
"TVPaintModule",
|
||||
"TVPaintAddon",
|
||||
"TVPAINT_ROOT_DIR",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
import os
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
TVPAINT_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
|
@ -13,7 +13,7 @@ def get_launch_script_path():
|
|||
)
|
||||
|
||||
|
||||
class TVPaintModule(OpenPypeModule, IHostModule):
|
||||
class TVPaintAddon(OpenPypeModule, IHostAddon):
|
||||
name = "tvpaint"
|
||||
host_name = "tvpaint"
|
||||
|
||||
|
|
@ -3,8 +3,8 @@ import copy
|
|||
import pyblish.api
|
||||
|
||||
from openpype.client import get_asset_by_name
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectInstances(pyblish.api.ContextPlugin):
|
||||
|
|
@ -107,13 +107,14 @@ class CollectInstances(pyblish.api.ContextPlugin):
|
|||
# Use empty variant value
|
||||
variant = ""
|
||||
task_name = legacy_io.Session["AVALON_TASK"]
|
||||
new_subset_name = get_subset_name_with_asset_doc(
|
||||
new_subset_name = get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name
|
||||
host_name,
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
instance_data["subset"] = new_subset_name
|
||||
|
||||
|
|
|
|||
|
|
@ -3,7 +3,7 @@ import copy
|
|||
import pyblish.api
|
||||
|
||||
from openpype.client import get_asset_by_name
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectRenderScene(pyblish.api.ContextPlugin):
|
||||
|
|
@ -75,14 +75,15 @@ class CollectRenderScene(pyblish.api.ContextPlugin):
|
|||
dynamic_data["render_pass"] = dynamic_data["renderpass"]
|
||||
|
||||
task_name = workfile_context["task"]
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
subset_name = get_subset_name(
|
||||
"render",
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name,
|
||||
dynamic_data=dynamic_data
|
||||
dynamic_data=dynamic_data,
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
|
||||
instance_data = {
|
||||
|
|
|
|||
|
|
@ -3,8 +3,8 @@ import json
|
|||
import pyblish.api
|
||||
|
||||
from openpype.client import get_asset_by_name
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectWorkfile(pyblish.api.ContextPlugin):
|
||||
|
|
@ -39,13 +39,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
|
|||
# Use empty variant value
|
||||
variant = ""
|
||||
task_name = legacy_io.Session["AVALON_TASK"]
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
subset_name = get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name
|
||||
host_name,
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
|
||||
# Create Workfile instance
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
from .module import UnrealModule
|
||||
from .addon import UnrealAddon
|
||||
|
||||
|
||||
__all__ = (
|
||||
"UnrealModule",
|
||||
"UnrealAddon",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,18 +1,18 @@
|
|||
import os
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
UNREAL_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class UnrealModule(OpenPypeModule, IHostModule):
|
||||
class UnrealAddon(OpenPypeModule, IHostAddon):
|
||||
name = "unreal"
|
||||
host_name = "unreal"
|
||||
|
||||
def initialize(self, module_settings):
|
||||
self.enabled = True
|
||||
|
||||
def add_implementation_envs(self, env, app) -> None:
|
||||
def add_implementation_envs(self, env, app):
|
||||
"""Modify environments to contain all required for implementation."""
|
||||
# Set OPENPYPE_UNREAL_PLUGIN required for Unreal implementation
|
||||
|
||||
|
|
@ -1,6 +1,6 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Unreal launching and project tools."""
|
||||
import sys
|
||||
|
||||
import os
|
||||
import platform
|
||||
import json
|
||||
|
|
@ -9,7 +9,7 @@ import subprocess
|
|||
import re
|
||||
from pathlib import Path
|
||||
from collections import OrderedDict
|
||||
from openpype.api import get_project_settings
|
||||
from openpype.settings import get_project_settings
|
||||
|
||||
|
||||
def get_engine_versions(env=None):
|
||||
|
|
|
|||
|
|
@ -0,0 +1,10 @@
|
|||
from .addon import (
|
||||
WebpublisherAddon,
|
||||
WEBPUBLISHER_ROOT_DIR,
|
||||
)
|
||||
|
||||
|
||||
__all__ = (
|
||||
"WebpublisherAddon",
|
||||
"WEBPUBLISHER_ROOT_DIR",
|
||||
)
|
||||
openpype/hosts/webpublisher/addon.py (new file, 106 lines)
|
|
@ -0,0 +1,106 @@
|
|||
import os
|
||||
|
||||
import click
|
||||
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype.modules.interfaces import IHostAddon
|
||||
|
||||
WEBPUBLISHER_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class WebpublisherAddon(OpenPypeModule, IHostAddon):
|
||||
name = "webpublisher"
|
||||
host_name = "webpublisher"
|
||||
|
||||
def initialize(self, module_settings):
|
||||
self.enabled = True
|
||||
|
||||
def headless_publish(self, log, close_plugin_name=None, is_test=False):
|
||||
"""Runs publish in a opened host with a context.
|
||||
|
||||
Close Python process at the end.
|
||||
"""
|
||||
|
||||
from openpype.pipeline.publish.lib import remote_publish
|
||||
from .lib import get_webpublish_conn, publish_and_log
|
||||
|
||||
if is_test:
|
||||
remote_publish(log, close_plugin_name)
|
||||
return
|
||||
|
||||
dbcon = get_webpublish_conn()
|
||||
_id = os.environ.get("BATCH_LOG_ID")
|
||||
if not _id:
|
||||
log.warning("Unable to store log records, "
|
||||
"batch will be unfinished!")
|
||||
return
|
||||
|
||||
publish_and_log(
|
||||
dbcon, _id, log, close_plugin_name=close_plugin_name
|
||||
)
|
||||
|
||||
def cli(self, click_group):
|
||||
click_group.add_command(cli_main)
|
||||
|
||||
|
||||
@click.group(
|
||||
WebpublisherAddon.name,
|
||||
help="Webpublisher related commands.")
|
||||
def cli_main():
|
||||
pass
|
||||
|
||||
|
||||
@cli_main.command()
|
||||
@click.argument("path")
|
||||
@click.option("-u", "--user", help="User email address")
|
||||
@click.option("-p", "--project", help="Project")
|
||||
@click.option("-t", "--targets", help="Targets", default=None,
|
||||
multiple=True)
|
||||
def publish(project, path, user=None, targets=None):
|
||||
"""Start publishing (Inner command).
|
||||
|
||||
Publish collects json from paths provided as an argument.
|
||||
More than one path is allowed.
|
||||
"""
|
||||
|
||||
from .publish_functions import cli_publish
|
||||
|
||||
cli_publish(project, path, user, targets)
|
||||
|
||||
|
||||
@cli_main.command()
|
||||
@click.argument("path")
|
||||
@click.option("-p", "--project", help="Project")
|
||||
@click.option("-h", "--host", help="Host")
|
||||
@click.option("-u", "--user", help="User email address")
|
||||
@click.option("-t", "--targets", help="Targets", default=None,
|
||||
multiple=True)
|
||||
def publishfromapp(project, path, host, user=None, targets=None):
|
||||
"""Start publishing through application (Inner command).
|
||||
|
||||
Publish collects json from paths provided as an argument.
|
||||
More than one path is allowed.
|
||||
"""
|
||||
|
||||
from .publish_functions import cli_publish_from_app
|
||||
|
||||
cli_publish_from_app(project, path, host, user, targets)
|
||||
|
||||
|
||||
@cli_main.command()
|
||||
@click.option("-e", "--executable", help="Executable")
|
||||
@click.option("-u", "--upload_dir", help="Upload dir")
|
||||
@click.option("-h", "--host", help="Host", default=None)
|
||||
@click.option("-p", "--port", help="Port", default=None)
|
||||
def webserver(executable, upload_dir, host=None, port=None):
|
||||
"""Start service for communication with Webpublish Front end.
|
||||
|
||||
OP must be configured on a machine, eg. OPENPYPE_MONGO filled AND
|
||||
FTRACK_BOT_API_KEY provided with api key from Ftrack.
|
||||
|
||||
Expect "pype.club" user created on Ftrack.
|
||||
"""
|
||||
|
||||
from .webserver_service import run_webserver
|
||||
|
||||
run_webserver(executable, upload_dir, host, port)
|
||||
|
|
@ -1,31 +1,23 @@
|
|||
import os
|
||||
import logging
|
||||
|
||||
from pyblish import api as pyblish
|
||||
import openpype.hosts.webpublisher
|
||||
from openpype.pipeline import legacy_io
|
||||
import pyblish.api
|
||||
|
||||
from openpype.host import HostBase
|
||||
from openpype.hosts.webpublisher import WEBPUBLISHER_ROOT_DIR
|
||||
|
||||
log = logging.getLogger("openpype.hosts.webpublisher")
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(
|
||||
openpype.hosts.webpublisher.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
|
||||
class WebpublisherHost(HostBase):
|
||||
name = "webpublisher"
|
||||
|
||||
def install():
|
||||
print("Installing Pype config...")
|
||||
def install(self):
|
||||
print("Installing Pype config...")
|
||||
pyblish.api.register_host(self.name)
|
||||
|
||||
pyblish.register_plugin_path(PUBLISH_PATH)
|
||||
log.info(PUBLISH_PATH)
|
||||
|
||||
legacy_io.install()
|
||||
|
||||
|
||||
def uninstall():
|
||||
pyblish.deregister_plugin_path(PUBLISH_PATH)
|
||||
|
||||
|
||||
# to have required methods for interface
|
||||
def ls():
|
||||
pass
|
||||
publish_plugin_dir = os.path.join(
|
||||
WEBPUBLISHER_ROOT_DIR, "plugins", "publish"
|
||||
)
|
||||
pyblish.api.register_plugin_path(publish_plugin_dir)
|
||||
self.log.info(publish_plugin_dir)
|
||||
|
|
|
|||
|
|
@ -1,6 +1,7 @@
|
|||
import os
|
||||
from datetime import datetime
|
||||
import collections
|
||||
import json
|
||||
|
||||
from bson.objectid import ObjectId
|
||||
|
||||
|
|
@ -8,9 +9,10 @@ import pyblish.util
|
|||
import pyblish.api
|
||||
|
||||
from openpype.client.mongo import OpenPypeMongoConnection
|
||||
from openpype.lib.plugin_tools import parse_json
|
||||
from openpype.settings import get_project_settings
|
||||
from openpype.lib import Logger
|
||||
from openpype.lib.profiles_filtering import filter_profiles
|
||||
from openpype.api import get_project_settings
|
||||
from openpype.pipeline.publish.lib import find_close_plugin
|
||||
|
||||
ERROR_STATUS = "error"
|
||||
IN_PROGRESS_STATUS = "in_progress"
|
||||
|
|
@ -19,21 +21,51 @@ SENT_REPROCESSING_STATUS = "sent_for_reprocessing"
|
|||
FINISHED_REPROCESS_STATUS = "republishing_finished"
|
||||
FINISHED_OK_STATUS = "finished_ok"
|
||||
|
||||
log = Logger.get_logger(__name__)
|
||||
|
||||
def headless_publish(log, close_plugin_name=None, is_test=False):
|
||||
"""Runs publish in a opened host with a context and closes Python process.
|
||||
|
||||
def parse_json(path):
|
||||
"""Parses json file at 'path' location
|
||||
|
||||
Returns:
|
||||
(dict) or None if unparsable
|
||||
Raises:
|
||||
AssertionError if 'path' doesn't exist
|
||||
"""
|
||||
if not is_test:
|
||||
dbcon = get_webpublish_conn()
|
||||
_id = os.environ.get("BATCH_LOG_ID")
|
||||
if not _id:
|
||||
log.warning("Unable to store log records, "
|
||||
"batch will be unfinished!")
|
||||
return
|
||||
path = path.strip('\"')
|
||||
assert os.path.isfile(path), (
|
||||
"Path to json file doesn't exist. \"{}\"".format(path)
|
||||
)
|
||||
data = None
|
||||
with open(path, "r") as json_file:
|
||||
try:
|
||||
data = json.load(json_file)
|
||||
except Exception as exc:
|
||||
log.error(
|
||||
"Error loading json: {} - Exception: {}".format(path, exc)
|
||||
)
|
||||
return data
|
||||
|
||||
publish_and_log(dbcon, _id, log, close_plugin_name=close_plugin_name)
|
||||
|
||||
def get_batch_asset_task_info(ctx):
|
||||
"""Parses context data from webpublisher's batch metadata
|
||||
|
||||
Returns:
|
||||
(tuple): asset, task_name (Optional), task_type
|
||||
"""
|
||||
task_type = "default_task_type"
|
||||
task_name = None
|
||||
asset = None
|
||||
|
||||
if ctx["type"] == "task":
|
||||
items = ctx["path"].split('/')
|
||||
asset = items[-2]
|
||||
task_name = ctx["name"]
|
||||
task_type = ctx["attributes"]["type"]
|
||||
else:
|
||||
publish(log, close_plugin_name)
|
||||
asset = ctx["name"]
|
||||
|
||||
return asset, task_name, task_type
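For reference (not part of this commit), a small sketch of the batch context data this helper parses; the import path follows the collector plugins below and all values are hypothetical:

```python
from openpype_modules.webpublisher.lib import get_batch_asset_task_info

# Hypothetical batch metadata entries as the webpublisher front end might send them.
task_ctx = {
    "type": "task",
    "path": "/demo_project/assets/characterA/modeling",
    "name": "modeling",
    "attributes": {"type": "Modeling"},
}
asset_ctx = {"type": "asset", "name": "characterA"}

print(get_batch_asset_task_info(task_ctx))   # ('characterA', 'modeling', 'Modeling')
print(get_batch_asset_task_info(asset_ctx))  # ('characterA', None, 'default_task_type')
```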
|
||||
|
||||
|
||||
def get_webpublish_conn():
|
||||
|
|
@ -62,36 +94,6 @@ def start_webpublish_log(dbcon, batch_id, user):
|
|||
}).inserted_id
|
||||
|
||||
|
||||
def publish(log, close_plugin_name=None, raise_error=False):
|
||||
"""Loops through all plugins, logs to console. Used for tests.
|
||||
|
||||
Args:
|
||||
log (openpype.lib.Logger)
|
||||
close_plugin_name (str): name of plugin with responsibility to
|
||||
close host app
|
||||
"""
|
||||
# Error exit as soon as any error occurs.
|
||||
error_format = "Failed {plugin.__name__}: {error} -- {error.traceback}"
|
||||
|
||||
close_plugin = _get_close_plugin(close_plugin_name, log)
|
||||
|
||||
for result in pyblish.util.publish_iter():
|
||||
for record in result["records"]:
|
||||
log.info("{}: {}".format(
|
||||
result["plugin"].label, record.msg))
|
||||
|
||||
if result["error"]:
|
||||
error_message = error_format.format(**result)
|
||||
log.error(error_message)
|
||||
if close_plugin: # close host app explicitly after error
|
||||
context = pyblish.api.Context()
|
||||
close_plugin().process(context)
|
||||
if raise_error:
|
||||
# Fatal Error is because of Deadline
|
||||
error_message = "Fatal Error: " + error_format.format(**result)
|
||||
raise RuntimeError(error_message)
|
||||
|
||||
|
||||
def publish_and_log(dbcon, _id, log, close_plugin_name=None, batch_id=None):
|
||||
"""Loops through all plugins, logs ok and fails into OP DB.
|
||||
|
||||
|
|
@ -107,7 +109,7 @@ def publish_and_log(dbcon, _id, log, close_plugin_name=None, batch_id=None):
|
|||
error_format = "Failed {plugin.__name__}: {error} -- {error.traceback}\n"
|
||||
error_format += "-" * 80 + "\n"
|
||||
|
||||
close_plugin = _get_close_plugin(close_plugin_name, log)
|
||||
close_plugin = find_close_plugin(close_plugin_name, log)
|
||||
|
||||
if isinstance(_id, str):
|
||||
_id = ObjectId(_id)
|
||||
|
|
@ -226,16 +228,6 @@ def find_variant_key(application_manager, host):
|
|||
return found_variant_key
|
||||
|
||||
|
||||
def _get_close_plugin(close_plugin_name, log):
|
||||
if close_plugin_name:
|
||||
plugins = pyblish.api.discover()
|
||||
for plugin in plugins:
|
||||
if plugin.__name__ == close_plugin_name:
|
||||
return plugin
|
||||
|
||||
log.debug("Close plugin not found, app might not close.")
|
||||
|
||||
|
||||
def get_task_data(batch_dir):
|
||||
"""Return parsed data from first task manifest.json
|
||||
|
||||
|
|
@ -13,12 +13,13 @@ import os
|
|||
|
||||
import pyblish.api
|
||||
|
||||
from openpype.lib.plugin_tools import (
|
||||
parse_json,
|
||||
get_batch_asset_task_info
|
||||
)
|
||||
from openpype.lib.remote_publish import get_webpublish_conn, IN_PROGRESS_STATUS
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype_modules.webpublisher.lib import (
|
||||
parse_json,
|
||||
get_batch_asset_task_info,
|
||||
get_webpublish_conn,
|
||||
IN_PROGRESS_STATUS
|
||||
)
|
||||
|
||||
|
||||
class CollectBatchData(pyblish.api.ContextPlugin):
|
||||
|
|
|
|||
|
|
@ -23,10 +23,8 @@ from openpype.lib import (
|
|||
get_ffprobe_streams,
|
||||
convert_ffprobe_fps_value,
|
||||
)
|
||||
from openpype.lib.plugin_tools import (
|
||||
parse_json,
|
||||
get_subset_name_with_asset_doc
|
||||
)
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
from openpype_modules.webpublisher.lib import parse_json
|
||||
|
||||
|
||||
class CollectPublishedFiles(pyblish.api.ContextPlugin):
|
||||
|
|
@ -80,9 +78,14 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
|
|||
is_sequence,
|
||||
extension.replace(".", ''))
|
||||
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
family, variant, task_name, asset_doc,
|
||||
project_name=project_name, host_name="webpublisher"
|
||||
subset_name = get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name=project_name,
|
||||
host_name="webpublisher",
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
version = self._get_next_version(
|
||||
project_name, asset_doc, subset_name
|
||||
|
|
|
|||
|
|
@ -10,7 +10,7 @@ import re
|
|||
import copy
|
||||
import pyblish.api
|
||||
|
||||
from openpype.lib import get_subset_name_with_asset_doc
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
|
||||
class CollectTVPaintInstances(pyblish.api.ContextPlugin):
|
||||
|
|
@ -47,13 +47,14 @@ class CollectTVPaintInstances(pyblish.api.ContextPlugin):
|
|||
new_instances = []
|
||||
|
||||
# Workfile instance
|
||||
workfile_subset_name = get_subset_name_with_asset_doc(
|
||||
workfile_subset_name = get_subset_name(
|
||||
self.workfile_family,
|
||||
self.workfile_variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name
|
||||
host_name,
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
workfile_instance = self._create_workfile_instance(
|
||||
context, workfile_subset_name
|
||||
|
|
@ -61,13 +62,14 @@ class CollectTVPaintInstances(pyblish.api.ContextPlugin):
|
|||
new_instances.append(workfile_instance)
|
||||
|
||||
# Review instance
|
||||
review_subset_name = get_subset_name_with_asset_doc(
|
||||
review_subset_name = get_subset_name(
|
||||
self.review_family,
|
||||
self.review_variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name
|
||||
host_name,
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
review_instance = self._create_review_instance(
|
||||
context, review_subset_name
|
||||
|
|
@ -114,14 +116,15 @@ class CollectTVPaintInstances(pyblish.api.ContextPlugin):
|
|||
"family": "render"
|
||||
}
|
||||
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
subset_name = get_subset_name(
|
||||
self.render_pass_family,
|
||||
render_pass,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name,
|
||||
dynamic_data=dynamic_data
|
||||
dynamic_data=dynamic_data,
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
|
||||
instance = self._create_render_pass_instance(
|
||||
|
|
@ -137,14 +140,15 @@ class CollectTVPaintInstances(pyblish.api.ContextPlugin):
|
|||
# Override family for subset name
|
||||
"family": "render"
|
||||
}
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
subset_name = get_subset_name(
|
||||
self.render_layer_family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name,
|
||||
dynamic_data=dynamic_data
|
||||
dynamic_data=dynamic_data,
|
||||
project_settings=context.data["project_settings"]
|
||||
)
|
||||
instance = self._create_render_layer_instance(
|
||||
context, layers, subset_name
|
||||
|
|
|
|||
|
|
@ -16,11 +16,11 @@ import uuid
|
|||
import json
|
||||
import shutil
|
||||
import pyblish.api
|
||||
from openpype.lib.plugin_tools import parse_json
|
||||
from openpype.hosts.tvpaint.worker import (
|
||||
SenderTVPaintCommands,
|
||||
CollectSceneData
|
||||
)
|
||||
from openpype_modules.webpublisher.lib import parse_json
|
||||
|
||||
|
||||
class CollectTVPaintWorkfileData(pyblish.api.ContextPlugin):
|
||||
|
|
|
|||
205
openpype/hosts/webpublisher/publish_functions.py
Normal file
|
|
@ -0,0 +1,205 @@
|
|||
import os
|
||||
import time
|
||||
import pyblish.api
|
||||
import pyblish.util
|
||||
|
||||
from openpype.lib import Logger
|
||||
from openpype.lib.applications import (
|
||||
ApplicationManager,
|
||||
get_app_environments_for_context,
|
||||
)
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.webpublisher.api import WebpublisherHost
|
||||
|
||||
from .lib import (
|
||||
get_batch_asset_task_info,
|
||||
get_webpublish_conn,
|
||||
start_webpublish_log,
|
||||
publish_and_log,
|
||||
fail_batch,
|
||||
find_variant_key,
|
||||
get_task_data,
|
||||
get_timeout,
|
||||
IN_PROGRESS_STATUS
|
||||
)
|
||||
|
||||
|
||||
def cli_publish(project_name, batch_path, user_email, targets):
|
||||
"""Start headless publishing.
|
||||
|
||||
Used to publish rendered assets, workfiles etc via Webpublisher.
|
||||
Eventually should be yanked out to Webpublisher cli.
|
||||
|
||||
Publish uses json from the passed paths argument.
|
||||
|
||||
Args:
|
||||
project_name (str): project to publish (only single context is
|
||||
expected per call of remotepublish)
|
||||
batch_path (str): Path batch folder. Contains subfolders with
|
||||
resources (workfile, another subfolder 'renders' etc.)
|
||||
user_email (string): email address for webpublisher - used to
|
||||
find Ftrack user with same email
|
||||
targets (list): Pyblish targets
|
||||
(to choose validator for example)
|
||||
|
||||
Raises:
|
||||
RuntimeError: When there is no path to process.
|
||||
"""
|
||||
|
||||
if not batch_path:
|
||||
raise RuntimeError("No publish paths specified")
|
||||
|
||||
log = Logger.get_logger("remotepublish")
|
||||
log.info("remotepublish command")
|
||||
|
||||
# Register target and host
|
||||
webpublisher_host = WebpublisherHost()
|
||||
|
||||
os.environ["OPENPYPE_PUBLISH_DATA"] = batch_path
|
||||
os.environ["AVALON_PROJECT"] = project_name
|
||||
os.environ["AVALON_APP"] = webpublisher_host.name
|
||||
os.environ["USER_EMAIL"] = user_email
|
||||
os.environ["HEADLESS_PUBLISH"] = 'true' # to use in app lib
|
||||
|
||||
if targets:
|
||||
if isinstance(targets, str):
|
||||
targets = [targets]
|
||||
for target in targets:
|
||||
pyblish.api.register_target(target)
|
||||
|
||||
install_host(webpublisher_host)
|
||||
|
||||
log.info("Running publish ...")
|
||||
|
||||
_, batch_id = os.path.split(batch_path)
|
||||
dbcon = get_webpublish_conn()
|
||||
_id = start_webpublish_log(dbcon, batch_id, user_email)
|
||||
|
||||
task_data = get_task_data(batch_path)
|
||||
if not task_data["context"]:
|
||||
msg = "Batch manifest must contain context data"
|
||||
msg += "Create new batch and set context properly."
|
||||
fail_batch(_id, dbcon, msg)
|
||||
|
||||
publish_and_log(dbcon, _id, log, batch_id=batch_id)
|
||||
|
||||
log.info("Publish finished.")
|
||||
|
||||
|
||||
def cli_publish_from_app(
|
||||
project_name, batch_path, host_name, user_email, targets
|
||||
):
|
||||
"""Opens installed variant of 'host' and run remote publish there.
|
||||
|
||||
Eventually should be yanked out to Webpublisher cli.
|
||||
|
||||
Currently implemented and tested for Photoshop where customer
|
||||
wants to process uploaded .psd file and publish collected layers
|
||||
from there. Triggered by Webpublisher.
|
||||
|
||||
Checks whether any other batches are running (status == 'in_progress'). If
so, it sleeps for SLEEP seconds (this is a separate process) and
waits for WAIT_FOR seconds altogether.
|
||||
|
||||
Requires installed host application on the machine.
|
||||
|
||||
Runs publish process as user would, in automatic fashion.
|
||||
|
||||
Args:
|
||||
project_name (str): project to publish (only single context is
|
||||
expected per call of remotepublish)
|
||||
batch_path (str): Path batch folder. Contains subfolders with
|
||||
resources (workfile, another subfolder 'renders' etc.)
|
||||
host_name (str): 'photoshop'
|
||||
user_email (string): email address for webpublisher - used to
|
||||
find Ftrack user with same email
|
||||
targets (list): Pyblish targets
|
||||
(to choose validator for example)
|
||||
"""
|
||||
|
||||
log = Logger.get_logger("RemotePublishFromApp")
|
||||
|
||||
log.info("remotepublishphotoshop command")
|
||||
|
||||
task_data = get_task_data(batch_path)
|
||||
|
||||
workfile_path = os.path.join(batch_path,
|
||||
task_data["task"],
|
||||
task_data["files"][0])
|
||||
|
||||
print("workfile_path {}".format(workfile_path))
|
||||
|
||||
batch_id = task_data["batch"]
|
||||
dbcon = get_webpublish_conn()
|
||||
# safer to start logging here, launch might be broken altogether
|
||||
_id = start_webpublish_log(dbcon, batch_id, user_email)
|
||||
|
||||
batches_in_progress = list(dbcon.find({"status": IN_PROGRESS_STATUS}))
|
||||
if len(batches_in_progress) > 1:
|
||||
running_batches = [str(batch["_id"])
|
||||
for batch in batches_in_progress
|
||||
if batch["_id"] != _id]
|
||||
msg = "There are still running batches {}\n". \
|
||||
format("\n".join(running_batches))
|
||||
msg += "Ask admin to check them and reprocess current batch"
|
||||
fail_batch(_id, dbcon, msg)
|
||||
|
||||
if not task_data["context"]:
|
||||
msg = "Batch manifest must contain context data"
|
||||
msg += "Create new batch and set context properly."
|
||||
fail_batch(_id, dbcon, msg)
|
||||
|
||||
asset_name, task_name, task_type = get_batch_asset_task_info(
|
||||
task_data["context"])
|
||||
|
||||
application_manager = ApplicationManager()
|
||||
found_variant_key = find_variant_key(application_manager, host_name)
|
||||
app_name = "{}/{}".format(host_name, found_variant_key)
|
||||
|
||||
# must have for proper launch of app
|
||||
env = get_app_environments_for_context(
|
||||
project_name,
|
||||
asset_name,
|
||||
task_name,
|
||||
app_name
|
||||
)
|
||||
print("env:: {}".format(env))
|
||||
os.environ.update(env)
|
||||
|
||||
os.environ["OPENPYPE_PUBLISH_DATA"] = batch_path
|
||||
# must pass identifier to update log lines for a batch
|
||||
os.environ["BATCH_LOG_ID"] = str(_id)
|
||||
os.environ["HEADLESS_PUBLISH"] = 'true' # to use in app lib
|
||||
os.environ["USER_EMAIL"] = user_email
|
||||
|
||||
pyblish.api.register_host(host_name)
|
||||
if targets:
|
||||
if isinstance(targets, str):
|
||||
targets = [targets]
|
||||
current_targets = os.environ.get("PYBLISH_TARGETS", "").split(
|
||||
os.pathsep)
|
||||
for target in targets:
|
||||
current_targets.append(target)
|
||||
|
||||
os.environ["PYBLISH_TARGETS"] = os.pathsep.join(
|
||||
set(current_targets))
|
||||
|
||||
data = {
|
||||
"last_workfile_path": workfile_path,
|
||||
"start_last_workfile": True,
|
||||
"project_name": project_name,
|
||||
"asset_name": asset_name,
|
||||
"task_name": task_name
|
||||
}
|
||||
|
||||
launched_app = application_manager.launch(app_name, **data)
|
||||
|
||||
timeout = get_timeout(project_name, host_name, task_type)
|
||||
|
||||
time_start = time.time()
|
||||
while launched_app.poll() is None:
|
||||
time.sleep(0.5)
|
||||
if time.time() - time_start > timeout:
|
||||
launched_app.terminate()
|
||||
msg = "Timeout reached"
|
||||
fail_batch(_id, dbcon, msg)
|
||||
|
|
@ -0,0 +1,6 @@
|
|||
from .webserver import run_webserver
|
||||
|
||||
|
||||
__all__ = (
|
||||
"run_webserver",
|
||||
)
|
||||
|
|
@ -10,16 +10,17 @@ from aiohttp.web_response import Response
|
|||
from openpype.client import (
|
||||
get_projects,
|
||||
get_assets,
|
||||
OpenPypeMongoConnection,
|
||||
)
|
||||
from openpype.lib import Logger
|
||||
from openpype.lib.remote_publish import (
|
||||
from openpype.settings import get_project_settings
|
||||
from openpype_modules.webserver.base_routes import RestApiEndpoint
|
||||
from openpype_modules.webpublisher import WebpublisherAddon
|
||||
from openpype_modules.webpublisher.lib import (
|
||||
get_webpublish_conn,
|
||||
get_task_data,
|
||||
ERROR_STATUS,
|
||||
REPROCESS_STATUS
|
||||
)
|
||||
from openpype.settings import get_project_settings
|
||||
from openpype_modules.webserver.base_routes import RestApiEndpoint
|
||||
|
||||
log = Logger.get_logger("WebpublishRoutes")
|
||||
|
||||
|
|
@ -77,9 +78,7 @@ class WebpublishRestApiResource(JsonApiResource):
|
|||
"""Resource carrying OP DB connection for storing batch info into DB."""
|
||||
|
||||
def __init__(self):
|
||||
mongo_client = OpenPypeMongoConnection.get_mongo_client()
|
||||
database_name = os.environ["OPENPYPE_DATABASE_NAME"]
|
||||
self.dbcon = mongo_client[database_name]["webpublishes"]
|
||||
self.dbcon = get_webpublish_conn()
|
||||
|
||||
|
||||
class ProjectsEndpoint(ResourceRestApiEndpoint):
|
||||
|
|
@ -215,7 +214,7 @@ class BatchPublishEndpoint(WebpublishApiEndpoint):
|
|||
# TVPaint filter
|
||||
{
|
||||
"extensions": [".tvpp"],
|
||||
"command": "remotepublish",
|
||||
"command": "publish",
|
||||
"arguments": {
|
||||
"targets": ["tvpaint_worker"]
|
||||
},
|
||||
|
|
@ -224,13 +223,13 @@ class BatchPublishEndpoint(WebpublishApiEndpoint):
|
|||
# Photoshop filter
|
||||
{
|
||||
"extensions": [".psd", ".psb"],
|
||||
"command": "remotepublishfromapp",
|
||||
"command": "publishfromapp",
|
||||
"arguments": {
|
||||
# Command 'remotepublishfromapp' requires --host argument
|
||||
# Command 'publishfromapp' requires --host argument
|
||||
"host": "photoshop",
|
||||
# Make sure targets are set to None for cases that default
|
||||
# would change
|
||||
# - targets argument is not used in 'remotepublishfromapp'
|
||||
# - targets argument is not used in 'publishfromapp'
|
||||
"targets": ["remotepublish"]
|
||||
},
|
||||
# does publish need to be handled by a queue, eg. only
|
||||
|
|
@ -242,7 +241,7 @@ class BatchPublishEndpoint(WebpublishApiEndpoint):
|
|||
batch_dir = os.path.join(self.resource.upload_dir, content["batch"])
|
||||
|
||||
# Default command and arguments
|
||||
command = "remotepublish"
|
||||
command = "publish"
|
||||
add_args = {
|
||||
# All commands need 'project' and 'user'
|
||||
"project": content["project_name"],
|
||||
|
|
@ -273,6 +272,8 @@ class BatchPublishEndpoint(WebpublishApiEndpoint):
|
|||
|
||||
args = [
|
||||
openpype_app,
|
||||
"module",
|
||||
WebpublisherAddon.name,
|
||||
command,
|
||||
batch_dir
|
||||
]
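For orientation, the argument list built above might resolve to a command line roughly like the one sketched below; the executable name, batch id and the option spelling appended from 'add_args' are placeholders rather than values taken from this diff.

# Illustrative only -- names and options are assumptions:
#   openpype_console module webpublisher publish /upload_dir/<batch_id>
#       --project <project_name> --user <user_email>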
|
||||
|
|
|
|||
|
|
@ -7,8 +7,15 @@ import json
|
|||
import subprocess
|
||||
|
||||
from openpype.client import OpenPypeMongoConnection
|
||||
from openpype.modules import ModulesManager
|
||||
from openpype.lib import Logger
|
||||
|
||||
from openpype_modules.webpublisher.lib import (
|
||||
ERROR_STATUS,
|
||||
REPROCESS_STATUS,
|
||||
SENT_REPROCESSING_STATUS
|
||||
)
|
||||
|
||||
from .webpublish_routes import (
|
||||
RestApiResource,
|
||||
WebpublishRestApiResource,
|
||||
|
|
@ -21,32 +28,29 @@ from .webpublish_routes import (
|
|||
TaskPublishEndpoint,
|
||||
UserReportEndpoint
|
||||
)
|
||||
from openpype.lib.remote_publish import (
|
||||
ERROR_STATUS,
|
||||
REPROCESS_STATUS,
|
||||
SENT_REPROCESSING_STATUS
|
||||
)
|
||||
|
||||
|
||||
log = Logger.get_logger("webserver_gui")
|
||||
|
||||
|
||||
def run_webserver(*args, **kwargs):
|
||||
def run_webserver(executable, upload_dir, host=None, port=None):
|
||||
"""Runs webserver in command line, adds routes."""
|
||||
from openpype.modules import ModulesManager
|
||||
|
||||
if not host:
|
||||
host = "localhost"
|
||||
if not port:
|
||||
port = 8079
|
||||
|
||||
manager = ModulesManager()
|
||||
webserver_module = manager.modules_by_name["webserver"]
|
||||
host = kwargs.get("host") or "localhost"
|
||||
port = kwargs.get("port") or 8079
|
||||
|
||||
server_manager = webserver_module.create_new_server_manager(port, host)
|
||||
webserver_url = server_manager.url
|
||||
# queue for remotepublishfromapp tasks
|
||||
studio_task_queue = collections.deque()
|
||||
|
||||
resource = RestApiResource(server_manager,
|
||||
upload_dir=kwargs["upload_dir"],
|
||||
executable=kwargs["executable"],
|
||||
upload_dir=upload_dir,
|
||||
executable=executable,
|
||||
studio_task_queue=studio_task_queue)
|
||||
projects_endpoint = ProjectsEndpoint(resource)
|
||||
server_manager.add_route(
|
||||
|
|
@ -111,7 +115,7 @@ def run_webserver(*args, **kwargs):
|
|||
last_reprocessed = time.time()
|
||||
while True:
|
||||
if time.time() - last_reprocessed > 20:
|
||||
reprocess_failed(kwargs["upload_dir"], webserver_url)
|
||||
reprocess_failed(upload_dir, webserver_url)
|
||||
last_reprocessed = time.time()
|
||||
if studio_task_queue:
|
||||
args = studio_task_queue.popleft()
|
||||
|
|
@ -189,8 +189,6 @@ from .plugin_tools import (
|
|||
filter_pyblish_plugins,
|
||||
set_plugin_attributes_from_settings,
|
||||
source_hash,
|
||||
get_unique_layer_name,
|
||||
get_background_layers,
|
||||
)
|
||||
|
||||
from .path_tools import (
|
||||
|
|
@ -354,8 +352,6 @@ __all__ = [
|
|||
"filter_pyblish_plugins",
|
||||
"set_plugin_attributes_from_settings",
|
||||
"source_hash",
|
||||
"get_unique_layer_name",
|
||||
"get_background_layers",
|
||||
|
||||
"create_hard_link",
|
||||
"version_up",
|
||||
|
|
|
|||
|
|
@ -469,6 +469,19 @@ class ApplicationManager:
|
|||
for tool in group:
|
||||
self.tools[tool.full_name] = tool
|
||||
|
||||
def find_latest_available_variant_for_group(self, group_name):
|
||||
group = self.app_groups.get(group_name)
|
||||
if group is None or not group.enabled:
|
||||
return None
|
||||
|
||||
output = None
|
||||
for _, variant in reversed(sorted(group.variants.items())):
|
||||
executable = variant.find_executable()
|
||||
if executable:
|
||||
output = variant
|
||||
break
|
||||
return output
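A short usage sketch for the new helper; the group name is an assumption for illustration.

from openpype.lib.applications import ApplicationManager

manager = ApplicationManager()
# Returns the highest variant of the group that has an existing executable,
# or None when the group is missing, disabled or has no usable variant.
variant = manager.find_latest_available_variant_for_group("maya")
if variant is None:
    print("No enabled variant with an existing executable was found.")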
|
||||
|
||||
def launch(self, app_name, **data):
|
||||
"""Launch procedure.
|
||||
|
||||
|
|
@ -950,6 +963,63 @@ class ApplicationLaunchContext:
|
|||
)
|
||||
self.kwargs["env"] = value
|
||||
|
||||
def _collect_addons_launch_hook_paths(self):
|
||||
"""Helper to collect application launch hooks from addons.
|
||||
|
||||
Modules have to have the 'get_launch_hook_paths' method implemented, which
can expect an application as argument or nothing.
|
||||
|
||||
Returns:
|
||||
List[str]: Paths to launch hook directories.
|
||||
"""
|
||||
|
||||
expected_types = (list, tuple, set)
|
||||
|
||||
output = []
|
||||
for module in self.modules_manager.get_enabled_modules():
|
||||
# Skip module if does not have implemented 'get_launch_hook_paths'
|
||||
func = getattr(module, "get_launch_hook_paths", None)
|
||||
if func is None:
|
||||
continue
|
||||
|
||||
func = module.get_launch_hook_paths
|
||||
if hasattr(inspect, "signature"):
|
||||
sig = inspect.signature(func)
|
||||
expect_args = len(sig.parameters) > 0
|
||||
else:
|
||||
expect_args = len(inspect.getargspec(func)[0]) > 0
|
||||
|
||||
# Pass application argument if method expect it.
|
||||
try:
|
||||
if expect_args:
|
||||
hook_paths = func(self.application)
|
||||
else:
|
||||
hook_paths = func()
|
||||
except Exception:
|
||||
self.log.warning(
|
||||
"Failed to call 'get_launch_hook_paths'",
|
||||
exc_info=True
|
||||
)
|
||||
continue
|
||||
|
||||
if not hook_paths:
|
||||
continue
|
||||
|
||||
# Convert string to list
|
||||
if isinstance(hook_paths, six.string_types):
|
||||
hook_paths = [hook_paths]
|
||||
|
||||
# Skip invalid types
|
||||
if not isinstance(hook_paths, expected_types):
|
||||
self.log.warning((
|
||||
"Result of `get_launch_hook_paths`"
|
||||
" has invalid type {}. Expected {}"
|
||||
).format(type(hook_paths), expected_types))
|
||||
continue
|
||||
|
||||
output.extend(hook_paths)
|
||||
return output
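A minimal sketch of the addon-side counterpart this collector expects; the addon name and hook directory are hypothetical, only the 'get_launch_hook_paths' contract is taken from the code above.

import os

from openpype.modules import OpenPypeAddOn

MY_ADDON_DIR = os.path.dirname(os.path.abspath(__file__))


class MyAddon(OpenPypeAddOn):
    name = "my_addon"

    def get_launch_hook_paths(self, application):
        # One positional parameter is declared, so the collector passes the
        # Application object; a zero-argument variant is equally valid.
        return [os.path.join(MY_ADDON_DIR, "launch_hooks")]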
|
||||
|
||||
def paths_to_launch_hooks(self):
|
||||
"""Directory paths where to look for launch hooks."""
|
||||
# This method has potential to be part of application manager (maybe).
|
||||
|
|
@ -983,9 +1053,7 @@ class ApplicationLaunchContext:
|
|||
paths.append(path)
|
||||
|
||||
# Load modules paths
|
||||
paths.extend(
|
||||
self.modules_manager.collect_launch_hook_paths(self.application)
|
||||
)
|
||||
paths.extend(self._collect_addons_launch_hook_paths())
|
||||
|
||||
return paths
|
||||
|
||||
|
|
|
|||
|
|
@ -11,13 +11,8 @@ import functools
|
|||
from openpype.client import get_asset_by_id
|
||||
from openpype.settings import get_project_settings
|
||||
|
||||
from .profiles_filtering import filter_profiles
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
# Subset name template used when plugin does not have defined any
|
||||
DEFAULT_SUBSET_TEMPLATE = "{family}{Variant}"
|
||||
|
||||
|
||||
class PluginToolsDeprecatedWarning(DeprecationWarning):
|
||||
pass
|
||||
|
|
@ -64,13 +59,14 @@ def deprecated(new_destination):
|
|||
return _decorator(func)
|
||||
|
||||
|
||||
class TaskNotSetError(KeyError):
|
||||
def __init__(self, msg=None):
|
||||
if not msg:
|
||||
msg = "Creator's subset name template requires task name."
|
||||
super(TaskNotSetError, self).__init__(msg)
|
||||
@deprecated("openpype.pipeline.create.TaskNotSetError")
|
||||
def TaskNotSetError(*args, **kwargs):
|
||||
from openpype.pipeline.create import TaskNotSetError
|
||||
|
||||
return TaskNotSetError(*args, **kwargs)
|
||||
|
||||
|
||||
@deprecated("openpype.pipeline.create.get_subset_name")
|
||||
def get_subset_name_with_asset_doc(
|
||||
family,
|
||||
variant,
|
||||
|
|
@ -109,61 +105,22 @@ def get_subset_name_with_asset_doc(
|
|||
dbcon (AvalonMongoDB): Mongo connection to be able query asset document
|
||||
if 'asset_doc' is not passed.
|
||||
"""
|
||||
if not family:
|
||||
return ""
|
||||
|
||||
if not host_name:
|
||||
host_name = os.environ["AVALON_APP"]
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
# Use only last part of class family value split by dot (`.`)
|
||||
family = family.rsplit(".", 1)[-1]
|
||||
|
||||
if project_name is None:
|
||||
from openpype.pipeline import legacy_io
|
||||
|
||||
project_name = legacy_io.Session["AVALON_PROJECT"]
|
||||
|
||||
asset_tasks = asset_doc.get("data", {}).get("tasks") or {}
|
||||
task_info = asset_tasks.get(task_name) or {}
|
||||
task_type = task_info.get("type")
|
||||
|
||||
# Get settings
|
||||
tools_settings = get_project_settings(project_name)["global"]["tools"]
|
||||
profiles = tools_settings["creator"]["subset_name_profiles"]
|
||||
filtering_criteria = {
|
||||
"families": family,
|
||||
"hosts": host_name,
|
||||
"tasks": task_name,
|
||||
"task_types": task_type
|
||||
}
|
||||
|
||||
matching_profile = filter_profiles(profiles, filtering_criteria)
|
||||
template = None
|
||||
if matching_profile:
|
||||
template = matching_profile["template"]
|
||||
|
||||
# Make sure template is set (matching may have empty string)
|
||||
if not template:
|
||||
template = default_template or DEFAULT_SUBSET_TEMPLATE
|
||||
|
||||
# Simple check of task name existence for template with {task} in
|
||||
# - missing task should be possible only in Standalone publisher
|
||||
if not task_name and "{task" in template.lower():
|
||||
raise TaskNotSetError()
|
||||
|
||||
fill_pairs = {
|
||||
"variant": variant,
|
||||
"family": family,
|
||||
"task": task_name
|
||||
}
|
||||
if dynamic_data:
|
||||
# Dynamic data may override default values
|
||||
for key, value in dynamic_data.items():
|
||||
fill_pairs[key] = value
|
||||
|
||||
return template.format(**prepare_template_data(fill_pairs))
|
||||
return get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name,
|
||||
default_template,
|
||||
dynamic_data
|
||||
)
|
||||
|
||||
|
||||
@deprecated
|
||||
def get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
|
|
@ -183,16 +140,18 @@ def get_subset_name(
|
|||
`get_subset_name_with_asset_doc` where asset document is expected.
|
||||
"""
|
||||
|
||||
from openpype.pipeline.create import get_subset_name
|
||||
|
||||
if project_name is None:
|
||||
project_name = dbcon.project_name
|
||||
|
||||
asset_doc = get_asset_by_id(project_name, asset_id, fields=["data.tasks"])
|
||||
|
||||
return get_subset_name_with_asset_doc(
|
||||
return get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc or {},
|
||||
asset_doc,
|
||||
project_name,
|
||||
host_name,
|
||||
default_template,
|
||||
|
|
@ -254,6 +213,9 @@ def filter_pyblish_plugins(plugins):
|
|||
Args:
|
||||
plugins (dict): Dictionary of plugins produced by :mod:`pyblish-base`
|
||||
`discover()` method.
|
||||
|
||||
Deprecated:
|
||||
Function will be removed after release version 3.15.*
|
||||
"""
|
||||
|
||||
from openpype.pipeline.publish.lib import filter_pyblish_plugins
|
||||
|
|
@ -277,6 +239,9 @@ def set_plugin_attributes_from_settings(
|
|||
Value from environment `AVALON_APP` is used if not entered.
|
||||
project_name (str): Name of project for which settings will be loaded.
|
||||
Value from environment `AVALON_PROJECT` is used if not entered.
|
||||
|
||||
Deprecated:
|
||||
Function will be removed after release version 3.15.*
|
||||
"""
|
||||
|
||||
# Function is not used anymore
|
||||
|
|
@ -373,102 +338,3 @@ def source_hash(filepath, *args):
|
|||
time = str(os.path.getmtime(filepath))
|
||||
size = str(os.path.getsize(filepath))
|
||||
return "|".join([file_name, time, size] + list(args)).replace(".", ",")
|
||||
|
||||
|
||||
def get_unique_layer_name(layers, name):
|
||||
"""
|
||||
Gets all layer names and if 'name' is present in them, increases
|
||||
suffix by 1 (eg. creates unique layer name - for Loader)
|
||||
Args:
|
||||
layers (list): of strings, names only
|
||||
name (string): checked value
|
||||
|
||||
Returns:
|
||||
(string): name_00X (without version)
|
||||
"""
|
||||
names = {}
|
||||
for layer in layers:
|
||||
layer_name = re.sub(r'_\d{3}$', '', layer)
|
||||
if layer_name in names.keys():
|
||||
names[layer_name] = names[layer_name] + 1
|
||||
else:
|
||||
names[layer_name] = 1
|
||||
occurrences = names.get(name, 0)
|
||||
|
||||
return "{}_{:0>3d}".format(name, occurrences + 1)
|
||||
|
||||
|
||||
def get_background_layers(file_url):
|
||||
"""
|
||||
Pulls file names from the background json file and enriches them with the folder
url so AE is able to import the files.
|
||||
|
||||
Order is important, follows order in json.
|
||||
|
||||
Args:
|
||||
file_url (str): abs url of background json
|
||||
|
||||
Returns:
|
||||
(list): of abs paths to images
|
||||
"""
|
||||
with open(file_url) as json_file:
|
||||
data = json.load(json_file)
|
||||
|
||||
layers = list()
|
||||
bg_folder = os.path.dirname(file_url)
|
||||
for child in data['children']:
|
||||
if child.get("filename"):
|
||||
layers.append(os.path.join(bg_folder, child.get("filename")).
|
||||
replace("\\", "/"))
|
||||
else:
|
||||
for layer in child['children']:
|
||||
if layer.get("filename"):
|
||||
layers.append(os.path.join(bg_folder,
|
||||
layer.get("filename")).
|
||||
replace("\\", "/"))
|
||||
return layers
|
||||
|
||||
|
||||
def parse_json(path):
|
||||
"""Parses json file at 'path' location
|
||||
|
||||
Returns:
|
||||
(dict) or None if unparsable
|
||||
Raises:
|
||||
AsssertionError if 'path' doesn't exist
|
||||
"""
|
||||
path = path.strip('\"')
|
||||
assert os.path.isfile(path), (
|
||||
"Path to json file doesn't exist. \"{}\"".format(path)
|
||||
)
|
||||
data = None
|
||||
with open(path, "r") as json_file:
|
||||
try:
|
||||
data = json.load(json_file)
|
||||
except Exception as exc:
|
||||
log.error(
|
||||
"Error loading json: "
|
||||
"{} - Exception: {}".format(path, exc)
|
||||
)
|
||||
return data
|
||||
|
||||
|
||||
def get_batch_asset_task_info(ctx):
|
||||
"""Parses context data from webpublisher's batch metadata
|
||||
|
||||
Returns:
|
||||
(tuple): asset, task_name (Optional), task_type
|
||||
"""
|
||||
task_type = "default_task_type"
|
||||
task_name = None
|
||||
asset = None
|
||||
|
||||
if ctx["type"] == "task":
|
||||
items = ctx["path"].split('/')
|
||||
asset = items[-2]
|
||||
task_name = ctx["name"]
|
||||
task_type = ctx["attributes"]["type"]
|
||||
else:
|
||||
asset = ctx["name"]
|
||||
|
||||
return asset, task_name, task_type
|
||||
|
|
|
|||
|
|
@ -2,7 +2,6 @@
|
|||
from .base import (
|
||||
OpenPypeModule,
|
||||
OpenPypeAddOn,
|
||||
OpenPypeInterface,
|
||||
|
||||
load_modules,
|
||||
|
||||
|
|
@ -20,7 +19,6 @@ from .base import (
|
|||
__all__ = (
|
||||
"OpenPypeModule",
|
||||
"OpenPypeAddOn",
|
||||
"OpenPypeInterface",
|
||||
|
||||
"load_modules",
|
||||
|
||||
|
|
|
|||
|
|
@ -32,6 +32,14 @@ from openpype.lib import (
|
|||
import_module_from_dirpath
|
||||
)
|
||||
|
||||
from .interfaces import (
|
||||
OpenPypeInterface,
|
||||
IPluginPaths,
|
||||
IHostAddon,
|
||||
ITrayModule,
|
||||
ITrayService
|
||||
)
|
||||
|
||||
# Files that will be always ignored on modules import
|
||||
IGNORED_FILENAMES = (
|
||||
"__pycache__",
|
||||
|
|
@ -389,31 +397,6 @@ def _load_modules():
|
|||
log.error(msg, exc_info=True)
|
||||
|
||||
|
||||
class _OpenPypeInterfaceMeta(ABCMeta):
|
||||
"""OpenPypeInterface meta class to print proper string."""
|
||||
|
||||
def __str__(self):
|
||||
return "<'OpenPypeInterface.{}'>".format(self.__name__)
|
||||
|
||||
def __repr__(self):
|
||||
return str(self)
|
||||
|
||||
|
||||
@six.add_metaclass(_OpenPypeInterfaceMeta)
|
||||
class OpenPypeInterface:
|
||||
"""Base class of Interface that can be used as Mixin with abstract parts.
|
||||
|
||||
This is way how OpenPype module or addon can tell that has implementation
|
||||
for specific part or for other module/addon.
|
||||
|
||||
Child classes of OpenPypeInterface may be used as mixin in different
|
||||
OpenPype modules which means they have to have implemented methods defined
|
||||
in the interface. By default interface does not have any abstract parts.
|
||||
"""
|
||||
|
||||
pass
|
||||
|
||||
|
||||
@six.add_metaclass(ABCMeta)
|
||||
class OpenPypeModule:
|
||||
"""Base class of pype module.
|
||||
|
|
@ -747,8 +730,6 @@ class ModulesManager:
|
|||
and "actions" each containing list of paths.
|
||||
"""
|
||||
# Output structure
|
||||
from openpype_interfaces import IPluginPaths
|
||||
|
||||
output = {
|
||||
"publish": [],
|
||||
"create": [],
|
||||
|
|
@ -805,8 +786,6 @@ class ModulesManager:
|
|||
list: List of creator plugin paths.
|
||||
"""
|
||||
# Output structure
|
||||
from openpype_interfaces import IPluginPaths
|
||||
|
||||
output = []
|
||||
for module in self.get_enabled_modules():
|
||||
# Skip module that do not inherit from `IPluginPaths`
|
||||
|
|
@ -821,68 +800,6 @@ class ModulesManager:
|
|||
output.extend(paths)
|
||||
return output
|
||||
|
||||
def collect_launch_hook_paths(self, app):
|
||||
"""Helper to collect application launch hooks.
|
||||
|
||||
It used to be based on 'ILaunchHookPaths' which is not true anymore.
|
||||
Module just have to have implemented 'get_launch_hook_paths' method.
|
||||
|
||||
Args:
|
||||
app (Application): Application object which can be used for
|
||||
filtering of which launch hook paths are returned.
|
||||
|
||||
Returns:
|
||||
list: Paths to launch hook directories.
|
||||
"""
|
||||
|
||||
str_type = type("")
|
||||
expected_types = (list, tuple, set)
|
||||
|
||||
output = []
|
||||
for module in self.get_enabled_modules():
|
||||
# Skip module if does not have implemented 'get_launch_hook_paths'
|
||||
func = getattr(module, "get_launch_hook_paths", None)
|
||||
if func is None:
|
||||
continue
|
||||
|
||||
func = module.get_launch_hook_paths
|
||||
if hasattr(inspect, "signature"):
|
||||
sig = inspect.signature(func)
|
||||
expect_args = len(sig.parameters) > 0
|
||||
else:
|
||||
expect_args = len(inspect.getargspec(func)[0]) > 0
|
||||
|
||||
# Pass application argument if method expect it.
|
||||
try:
|
||||
if expect_args:
|
||||
hook_paths = func(app)
|
||||
else:
|
||||
hook_paths = func()
|
||||
except Exception:
|
||||
self.log.warning(
|
||||
"Failed to call 'get_launch_hook_paths'",
|
||||
exc_info=True
|
||||
)
|
||||
continue
|
||||
|
||||
if not hook_paths:
|
||||
continue
|
||||
|
||||
# Convert string to list
|
||||
if isinstance(hook_paths, str_type):
|
||||
hook_paths = [hook_paths]
|
||||
|
||||
# Skip invalid types
|
||||
if not isinstance(hook_paths, expected_types):
|
||||
self.log.warning((
|
||||
"Result of `get_launch_hook_paths`"
|
||||
" has invalid type {}. Expected {}"
|
||||
).format(type(hook_paths), expected_types))
|
||||
continue
|
||||
|
||||
output.extend(hook_paths)
|
||||
return output
|
||||
|
||||
def get_host_module(self, host_name):
|
||||
"""Find host module by host name.
|
||||
|
||||
|
|
@ -891,15 +808,13 @@ class ModulesManager:
|
|||
|
||||
Returns:
|
||||
OpenPypeModule: Found host module by name.
|
||||
None: There was not found module inheriting IHostModule which has
|
||||
None: There was not found module inheriting IHostAddon which has
|
||||
host name set to passed 'host_name'.
|
||||
"""
|
||||
|
||||
from openpype_interfaces import IHostModule
|
||||
|
||||
for module in self.get_enabled_modules():
|
||||
if (
|
||||
isinstance(module, IHostModule)
|
||||
isinstance(module, IHostAddon)
|
||||
and module.host_name == host_name
|
||||
):
|
||||
return module
|
||||
|
|
@ -910,15 +825,13 @@ class ModulesManager:
|
|||
|
||||
Returns:
|
||||
Iterable[str]: All available host names based on enabled modules
|
||||
inheriting 'IHostModule'.
|
||||
inheriting 'IHostAddon'.
|
||||
"""
|
||||
|
||||
from openpype_interfaces import IHostModule
|
||||
|
||||
host_names = {
|
||||
module.host_name
|
||||
for module in self.get_enabled_modules()
|
||||
if isinstance(module, IHostModule)
|
||||
if isinstance(module, IHostAddon)
|
||||
}
|
||||
return host_names
|
||||
|
||||
|
|
@ -1096,8 +1009,6 @@ class TrayModulesManager(ModulesManager):
|
|||
self.tray_menu(tray_menu)
|
||||
|
||||
def get_enabled_tray_modules(self):
|
||||
from openpype_interfaces import ITrayModule
|
||||
|
||||
output = []
|
||||
for module in self.modules:
|
||||
if module.enabled and isinstance(module, ITrayModule):
|
||||
|
|
@ -1173,8 +1084,6 @@ class TrayModulesManager(ModulesManager):
|
|||
self._report["Tray menu"] = report
|
||||
|
||||
def start_modules(self):
|
||||
from openpype_interfaces import ITrayService
|
||||
|
||||
report = {}
|
||||
time_start = time.time()
|
||||
prev_start_time = time_start
|
||||
|
|
|
|||
|
|
@ -9,7 +9,6 @@ from openpype.modules import OpenPypeModule
|
|||
from openpype_interfaces import (
|
||||
ITrayModule,
|
||||
IPluginPaths,
|
||||
ILaunchHookPaths,
|
||||
ISettingsChangeListener
|
||||
)
|
||||
from openpype.settings import SaveWarningExc
|
||||
|
|
@ -21,7 +20,6 @@ class FtrackModule(
|
|||
OpenPypeModule,
|
||||
ITrayModule,
|
||||
IPluginPaths,
|
||||
ILaunchHookPaths,
|
||||
ISettingsChangeListener
|
||||
):
|
||||
name = "ftrack"
|
||||
|
|
@ -85,7 +83,8 @@ class FtrackModule(
|
|||
}
|
||||
|
||||
def get_launch_hook_paths(self):
|
||||
"""Implementation of `ILaunchHookPaths`."""
|
||||
"""Implementation for applications launch hooks."""
|
||||
|
||||
return os.path.join(FTRACK_MODULE_DIR, "launch_hooks")
|
||||
|
||||
def modify_application_launch_arguments(self, application, env):
|
||||
|
|
|
|||
|
|
@ -1,8 +1,33 @@
|
|||
from abc import abstractmethod, abstractproperty
|
||||
from abc import ABCMeta, abstractmethod, abstractproperty
|
||||
|
||||
import six
|
||||
|
||||
from openpype import resources
|
||||
|
||||
from openpype.modules import OpenPypeInterface
|
||||
|
||||
class _OpenPypeInterfaceMeta(ABCMeta):
|
||||
"""OpenPypeInterface meta class to print proper string."""
|
||||
|
||||
def __str__(self):
|
||||
return "<'OpenPypeInterface.{}'>".format(self.__name__)
|
||||
|
||||
def __repr__(self):
|
||||
return str(self)
|
||||
|
||||
|
||||
@six.add_metaclass(_OpenPypeInterfaceMeta)
|
||||
class OpenPypeInterface:
|
||||
"""Base class of Interface that can be used as Mixin with abstract parts.
|
||||
|
||||
This is the way an OpenPype module or addon can tell OpenPype that it contains
an implementation for specific functionality.
|
||||
|
||||
Child classes of OpenPypeInterface may be used as mixin in different
|
||||
OpenPype modules which means they have to have implemented methods defined
|
||||
in the interface. By default interface does not have any abstract parts.
|
||||
"""
|
||||
|
||||
pass
|
||||
|
||||
|
||||
class IPluginPaths(OpenPypeInterface):
|
||||
|
|
@ -56,6 +81,13 @@ class ILaunchHookPaths(OpenPypeInterface):
|
|||
|
||||
Expected result is list of paths.
|
||||
["path/to/launch_hooks_dir"]
|
||||
|
||||
Deprecated:
|
||||
This interface is not needed since OpenPype 3.14.*. Addons just have to
implement 'get_launch_hook_paths', which can expect an Application object
or nothing as argument.
|
||||
|
||||
Interface class will be removed after 3.16.*.
|
||||
"""
|
||||
|
||||
@abstractmethod
|
||||
|
|
@ -353,8 +385,8 @@ class ISettingsChangeListener(OpenPypeInterface):
|
|||
pass
|
||||
|
||||
|
||||
class IHostModule(OpenPypeInterface):
|
||||
"""Module which also contain a host implementation."""
|
||||
class IHostAddon(OpenPypeInterface):
|
||||
"""Addon which also contain a host implementation."""
|
||||
|
||||
@abstractproperty
|
||||
def host_name(self):
|
||||
|
|
|
|||
|
|
@ -3,7 +3,6 @@ import os
|
|||
from openpype_interfaces import (
|
||||
ITrayModule,
|
||||
IPluginPaths,
|
||||
ILaunchHookPaths,
|
||||
)
|
||||
|
||||
from openpype.modules import OpenPypeModule
|
||||
|
|
@ -11,9 +10,7 @@ from openpype.modules import OpenPypeModule
|
|||
SHOTGRID_MODULE_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class ShotgridModule(
|
||||
OpenPypeModule, ITrayModule, IPluginPaths, ILaunchHookPaths
|
||||
):
|
||||
class ShotgridModule(OpenPypeModule, ITrayModule, IPluginPaths):
|
||||
leecher_manager_url = None
|
||||
name = "shotgrid"
|
||||
enabled = False
|
||||
|
|
|
|||
|
|
@ -1,14 +1,11 @@
|
|||
import os
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype_interfaces import (
|
||||
IPluginPaths,
|
||||
ILaunchHookPaths
|
||||
)
|
||||
from openpype.modules.interfaces import IPluginPaths
|
||||
|
||||
SLACK_MODULE_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class SlackIntegrationModule(OpenPypeModule, IPluginPaths, ILaunchHookPaths):
|
||||
class SlackIntegrationModule(OpenPypeModule, IPluginPaths):
|
||||
"""Allows sending notification to Slack channels during publishing."""
|
||||
|
||||
name = "slack"
|
||||
|
|
@ -18,7 +15,8 @@ class SlackIntegrationModule(OpenPypeModule, IPluginPaths, ILaunchHookPaths):
|
|||
self.enabled = slack_settings["enabled"]
|
||||
|
||||
def get_launch_hook_paths(self):
|
||||
"""Implementation of `ILaunchHookPaths`."""
|
||||
"""Implementation for applications launch hooks."""
|
||||
|
||||
return os.path.join(SLACK_MODULE_DIR, "launch_hooks")
|
||||
|
||||
def get_plugin_paths(self):
|
||||
|
|
|
|||
|
|
@ -10,6 +10,8 @@ class AbstractProvider:
|
|||
CODE = ''
|
||||
LABEL = ''
|
||||
|
||||
_log = None
|
||||
|
||||
def __init__(self, project_name, site_name, tree=None, presets=None):
|
||||
self.presets = None
|
||||
self.active = False
|
||||
|
|
@ -19,6 +21,12 @@ class AbstractProvider:
|
|||
|
||||
super(AbstractProvider, self).__init__()
|
||||
|
||||
@property
|
||||
def log(self):
|
||||
if self._log is None:
|
||||
self._log = Logger.get_logger(self.__class__.__name__)
|
||||
return self._log
|
||||
|
||||
@abc.abstractmethod
|
||||
def is_active(self):
|
||||
"""
|
||||
|
|
@ -199,11 +207,11 @@ class AbstractProvider:
|
|||
path = anatomy.fill_root(path)
|
||||
except KeyError:
|
||||
msg = "Error in resolving local root from anatomy"
|
||||
log.error(msg)
|
||||
self.log.error(msg)
|
||||
raise ValueError(msg)
|
||||
except IndexError:
|
||||
msg = "Path {} contains unfillable placeholder"
|
||||
log.error(msg)
|
||||
self.log.error(msg)
|
||||
raise ValueError(msg)
|
||||
|
||||
return path
|
||||
|
|
|
|||
|
|
@ -2,12 +2,9 @@ import os
|
|||
|
||||
import dropbox
|
||||
|
||||
from openpype.api import Logger
|
||||
from .abstract_provider import AbstractProvider
|
||||
from ..utils import EditableScopes
|
||||
|
||||
log = Logger().get_logger("SyncServer")
|
||||
|
||||
|
||||
class DropboxHandler(AbstractProvider):
|
||||
CODE = 'dropbox'
|
||||
|
|
@ -20,26 +17,26 @@ class DropboxHandler(AbstractProvider):
|
|||
self.dbx = None
|
||||
|
||||
if not self.presets:
|
||||
log.info(
|
||||
self.log.info(
|
||||
"Sync Server: There are no presets for {}.".format(site_name)
|
||||
)
|
||||
return
|
||||
|
||||
if not self.presets["enabled"]:
|
||||
log.debug("Sync Server: Site {} not enabled for {}.".
|
||||
self.log.debug("Sync Server: Site {} not enabled for {}.".
|
||||
format(site_name, project_name))
|
||||
return
|
||||
|
||||
token = self.presets.get("token", "")
|
||||
if not token:
|
||||
msg = "Sync Server: No access token for dropbox provider"
|
||||
log.info(msg)
|
||||
self.log.info(msg)
|
||||
return
|
||||
|
||||
team_folder_name = self.presets.get("team_folder_name", "")
|
||||
if not team_folder_name:
|
||||
msg = "Sync Server: No team folder name for dropbox provider"
|
||||
log.info(msg)
|
||||
self.log.info(msg)
|
||||
return
|
||||
|
||||
acting_as_member = self.presets.get("acting_as_member", "")
|
||||
|
|
@ -47,7 +44,7 @@ class DropboxHandler(AbstractProvider):
|
|||
msg = (
|
||||
"Sync Server: No acting member for dropbox provider"
|
||||
)
|
||||
log.info(msg)
|
||||
self.log.info(msg)
|
||||
return
|
||||
|
||||
try:
|
||||
|
|
@ -55,7 +52,7 @@ class DropboxHandler(AbstractProvider):
|
|||
token, acting_as_member, team_folder_name
|
||||
)
|
||||
except Exception as e:
|
||||
log.info("Could not establish dropbox object: {}".format(e))
|
||||
self.log.info("Could not establish dropbox object: {}".format(e))
|
||||
return
|
||||
|
||||
super(AbstractProvider, self).__init__()
|
||||
|
|
@ -448,7 +445,7 @@ class DropboxHandler(AbstractProvider):
|
|||
path = anatomy.fill_root(path)
|
||||
except KeyError:
|
||||
msg = "Error in resolving local root from anatomy"
|
||||
log.error(msg)
|
||||
self.log.error(msg)
|
||||
raise ValueError(msg)
|
||||
|
||||
return path
|
||||
|
|
|
|||
|
|
@ -5,12 +5,12 @@ import sys
|
|||
import six
|
||||
import platform
|
||||
|
||||
from openpype.api import Logger
|
||||
from openpype.api import get_system_settings
|
||||
from openpype.lib import Logger
|
||||
from openpype.settings import get_system_settings
|
||||
from .abstract_provider import AbstractProvider
|
||||
from ..utils import time_function, ResumableError
|
||||
|
||||
log = Logger().get_logger("SyncServer")
|
||||
log = Logger.get_logger("GDriveHandler")
|
||||
|
||||
try:
|
||||
from googleapiclient.discovery import build
|
||||
|
|
@ -69,13 +69,17 @@ class GDriveHandler(AbstractProvider):
|
|||
|
||||
self.presets = presets
|
||||
if not self.presets:
|
||||
log.info("Sync Server: There are no presets for {}.".
|
||||
format(site_name))
|
||||
self.log.info(
|
||||
"Sync Server: There are no presets for {}.".format(site_name)
|
||||
)
|
||||
return
|
||||
|
||||
if not self.presets["enabled"]:
|
||||
log.debug("Sync Server: Site {} not enabled for {}.".
|
||||
format(site_name, project_name))
|
||||
self.log.debug(
|
||||
"Sync Server: Site {} not enabled for {}.".format(
|
||||
site_name, project_name
|
||||
)
|
||||
)
|
||||
return
|
||||
|
||||
current_platform = platform.system().lower()
|
||||
|
|
@ -85,20 +89,22 @@ class GDriveHandler(AbstractProvider):
|
|||
if not cred_path:
|
||||
msg = "Sync Server: Please, fill the credentials for gdrive "\
|
||||
"provider for platform '{}' !".format(current_platform)
|
||||
log.info(msg)
|
||||
self.log.info(msg)
|
||||
return
|
||||
|
||||
try:
|
||||
cred_path = cred_path.format(**os.environ)
|
||||
except KeyError as e:
|
||||
log.info("Sync Server: The key(s) {} does not exist in the "
|
||||
"environment variables".format(" ".join(e.args)))
|
||||
self.log.info((
|
||||
"Sync Server: The key(s) {} does not exist in the "
|
||||
"environment variables"
|
||||
).format(" ".join(e.args)))
|
||||
return
|
||||
|
||||
if not os.path.exists(cred_path):
|
||||
msg = "Sync Server: No credentials for gdrive provider " + \
|
||||
"for '{}' on path '{}'!".format(site_name, cred_path)
|
||||
log.info(msg)
|
||||
self.log.info(msg)
|
||||
return
|
||||
|
||||
self.service = None
|
||||
|
|
@ -318,7 +324,7 @@ class GDriveHandler(AbstractProvider):
|
|||
fields='id')
|
||||
|
||||
media.stream()
|
||||
log.debug("Start Upload! {}".format(source_path))
|
||||
self.log.debug("Start Upload! {}".format(source_path))
|
||||
last_tick = status = response = None
|
||||
status_val = 0
|
||||
while response is None:
|
||||
|
|
@ -331,7 +337,7 @@ class GDriveHandler(AbstractProvider):
|
|||
if not last_tick or \
|
||||
time.time() - last_tick >= server.LOG_PROGRESS_SEC:
|
||||
last_tick = time.time()
|
||||
log.debug("Uploaded %d%%." %
|
||||
self.log.debug("Uploaded %d%%." %
|
||||
int(status_val * 100))
|
||||
server.update_db(project_name=project_name,
|
||||
new_file_id=None,
|
||||
|
|
@ -350,8 +356,9 @@ class GDriveHandler(AbstractProvider):
|
|||
if 'has not granted' in ex._get_reason().strip():
|
||||
raise PermissionError(ex._get_reason().strip())
|
||||
|
||||
log.warning("Forbidden received, hit quota. "
|
||||
"Injecting 60s delay.")
|
||||
self.log.warning(
|
||||
"Forbidden received, hit quota. Injecting 60s delay."
|
||||
)
|
||||
time.sleep(60)
|
||||
return False
|
||||
raise
|
||||
|
|
@ -417,7 +424,7 @@ class GDriveHandler(AbstractProvider):
|
|||
if not last_tick or \
|
||||
time.time() - last_tick >= server.LOG_PROGRESS_SEC:
|
||||
last_tick = time.time()
|
||||
log.debug("Downloaded %d%%." %
|
||||
self.log.debug("Downloaded %d%%." %
|
||||
int(status_val * 100))
|
||||
server.update_db(project_name=project_name,
|
||||
new_file_id=None,
|
||||
|
|
@ -629,9 +636,9 @@ class GDriveHandler(AbstractProvider):
|
|||
["gdrive"]
|
||||
)
|
||||
except KeyError:
|
||||
log.info(("Sync Server: There are no presets for Gdrive " +
|
||||
"provider.").
|
||||
format(str(provider_presets)))
|
||||
log.info((
|
||||
"Sync Server: There are no presets for Gdrive provider."
|
||||
).format(str(provider_presets)))
|
||||
return
|
||||
return provider_presets
|
||||
|
||||
|
|
@ -704,7 +711,7 @@ class GDriveHandler(AbstractProvider):
|
|||
roots[self.MY_DRIVE_STR] = self.service.files() \
|
||||
.get(fileId='root').execute()
|
||||
except errors.HttpError:
|
||||
log.warning("HttpError in sync loop, "
|
||||
self.log.warning("HttpError in sync loop, "
|
||||
"trying next loop",
|
||||
exc_info=True)
|
||||
raise ResumableError
|
||||
|
|
@ -727,7 +734,7 @@ class GDriveHandler(AbstractProvider):
|
|||
Returns:
|
||||
(dictionary) path as a key, folder id as a value
|
||||
"""
|
||||
log.debug("build_tree len {}".format(len(folders)))
|
||||
self.log.debug("build_tree len {}".format(len(folders)))
|
||||
if not self.root: # build only when necessary, could be expensive
|
||||
self.root = self._prepare_root_info()
|
||||
|
||||
|
|
@ -779,9 +786,9 @@ class GDriveHandler(AbstractProvider):
|
|||
loop_cnt += 1
|
||||
|
||||
if len(no_parents_yet) > 0:
|
||||
log.debug("Some folders path are not resolved {}".
|
||||
self.log.debug("Some folders path are not resolved {}".
|
||||
format(no_parents_yet))
|
||||
log.debug("Remove deleted folders from trash.")
|
||||
self.log.debug("Remove deleted folders from trash.")
|
||||
|
||||
return tree
|
||||
|
||||
|
|
|
|||
|
|
@ -4,10 +4,10 @@ import time
|
|||
import threading
|
||||
import platform
|
||||
|
||||
from openpype.api import Logger
|
||||
from openpype.api import get_system_settings
|
||||
from openpype.lib import Logger
|
||||
from openpype.settings import get_system_settings
|
||||
from .abstract_provider import AbstractProvider
|
||||
log = Logger().get_logger("SyncServer")
|
||||
log = Logger.get_logger("SyncServer-SFTPHandler")
|
||||
|
||||
pysftp = None
|
||||
try:
|
||||
|
|
@ -43,8 +43,9 @@ class SFTPHandler(AbstractProvider):
|
|||
|
||||
self.presets = presets
|
||||
if not self.presets:
|
||||
log.warning("Sync Server: There are no presets for {}.".
|
||||
format(site_name))
|
||||
self.log.warning(
|
||||
"Sync Server: There are no presets for {}.".format(site_name)
|
||||
)
|
||||
return
|
||||
|
||||
# store to instance for reconnect
|
||||
|
|
@ -423,7 +424,7 @@ class SFTPHandler(AbstractProvider):
|
|||
return pysftp.Connection(**conn_params)
|
||||
except (paramiko.ssh_exception.SSHException,
|
||||
pysftp.exceptions.ConnectionException):
|
||||
log.warning("Couldn't connect", exc_info=True)
|
||||
self.log.warning("Couldn't connect", exc_info=True)
|
||||
|
||||
def _mark_progress(self, project_name, file, representation, server, site,
|
||||
source_path, target_path, direction):
|
||||
|
|
@ -445,7 +446,7 @@ class SFTPHandler(AbstractProvider):
|
|||
time.time() - last_tick >= server.LOG_PROGRESS_SEC:
|
||||
status_val = target_file_size / source_file_size
|
||||
last_tick = time.time()
|
||||
log.debug(direction + "ed %d%%." % int(status_val * 100))
|
||||
self.log.debug(direction + "ed %d%%." % int(status_val * 100))
|
||||
server.update_db(project_name=project_name,
|
||||
new_file_id=None,
|
||||
file=file,
|
||||
|
|
|
|||
|
|
@ -10,7 +10,7 @@ class TimersManagerModuleRestApi:
|
|||
happens in Workfile app.
|
||||
"""
|
||||
def __init__(self, user_module, server_manager):
|
||||
self.log = None
|
||||
self._log = None
|
||||
self.module = user_module
|
||||
self.server_manager = server_manager
|
||||
|
||||
|
|
|
|||
|
|
@ -6,7 +6,6 @@ from openpype.client import get_asset_by_name
|
|||
from openpype.modules import OpenPypeModule
|
||||
from openpype_interfaces import (
|
||||
ITrayService,
|
||||
ILaunchHookPaths,
|
||||
IPluginPaths
|
||||
)
|
||||
from openpype.lib.events import register_event_callback
|
||||
|
|
@ -79,7 +78,6 @@ class ExampleTimersManagerConnector:
|
|||
class TimersManager(
|
||||
OpenPypeModule,
|
||||
ITrayService,
|
||||
ILaunchHookPaths,
|
||||
IPluginPaths
|
||||
):
|
||||
""" Handles about Timers.
|
||||
|
|
@ -185,12 +183,11 @@ class TimersManager(
|
|||
)
|
||||
|
||||
def get_launch_hook_paths(self):
|
||||
"""Implementation of `ILaunchHookPaths`."""
|
||||
"""Implementation for applications launch hooks."""
|
||||
|
||||
return os.path.join(
|
||||
TIMER_MODULE_DIR,
|
||||
"launch_hooks"
|
||||
)
|
||||
return [
|
||||
os.path.join(TIMER_MODULE_DIR, "launch_hooks")
|
||||
]
|
||||
|
||||
def get_plugin_paths(self):
|
||||
"""Implementation of `IPluginPaths`."""
|
||||
|
|
|
|||
|
|
@ -53,9 +53,12 @@ class WebServerModule(OpenPypeModule, ITrayService):
|
|||
try:
|
||||
module.webserver_initialization(self.server_manager)
|
||||
except Exception:
|
||||
self.log.warning((
|
||||
"Failed to connect module \"{}\" to webserver."
|
||||
).format(module.name))
|
||||
self.log.warning(
|
||||
(
|
||||
"Failed to connect module \"{}\" to webserver."
|
||||
).format(module.name),
|
||||
exc_info=True
|
||||
)
|
||||
|
||||
def tray_init(self):
|
||||
self.create_server_manager()
|
||||
|
|
|
|||
|
|
@ -1,6 +1,13 @@
|
|||
from .constants import (
|
||||
SUBSET_NAME_ALLOWED_SYMBOLS
|
||||
SUBSET_NAME_ALLOWED_SYMBOLS,
|
||||
DEFAULT_SUBSET_TEMPLATE,
|
||||
)
|
||||
|
||||
from .subset_name import (
|
||||
TaskNotSetError,
|
||||
get_subset_name,
|
||||
)
|
||||
|
||||
from .creator_plugins import (
|
||||
CreatorError,
|
||||
|
||||
|
|
@ -32,6 +39,10 @@ from .legacy_create import (
|
|||
|
||||
__all__ = (
|
||||
"SUBSET_NAME_ALLOWED_SYMBOLS",
|
||||
"DEFAULT_SUBSET_TEMPLATE",
|
||||
|
||||
"TaskNotSetError",
|
||||
"get_subset_name",
|
||||
|
||||
"CreatorError",
|
||||
|
||||
|
|
|
|||
|
|
@ -1,6 +1,8 @@
|
|||
SUBSET_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_."
|
||||
DEFAULT_SUBSET_TEMPLATE = "{family}{Variant}"
|
||||
|
||||
|
||||
__all__ = (
|
||||
"SUBSET_NAME_ALLOWED_SYMBOLS",
|
||||
"DEFAULT_SUBSET_TEMPLATE",
|
||||
)
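A hedged example of validating a subset name against SUBSET_NAME_ALLOWED_SYMBOLS; the regular expression is an illustration, not code from this commit.

import re

from openpype.pipeline.create import SUBSET_NAME_ALLOWED_SYMBOLS

# Any character outside the allowed set is reported as invalid.
disallowed = re.sub("[{}]".format(SUBSET_NAME_ALLOWED_SYMBOLS), "", "render Main_01")
if disallowed:
    print("Invalid characters in subset name: {!r}".format(disallowed))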
|
||||
|
|
|
|||
109
openpype/pipeline/create/subset_name.py
Normal file
|
|
@ -0,0 +1,109 @@
|
|||
import os
|
||||
|
||||
from openpype.settings import get_project_settings
|
||||
from openpype.lib import filter_profiles, prepare_template_data
|
||||
from openpype.pipeline import legacy_io
|
||||
|
||||
from .constants import DEFAULT_SUBSET_TEMPLATE
|
||||
|
||||
|
||||
class TaskNotSetError(KeyError):
|
||||
def __init__(self, msg=None):
|
||||
if not msg:
|
||||
msg = "Creator's subset name template requires task name."
|
||||
super(TaskNotSetError, self).__init__(msg)
|
||||
|
||||
|
||||
def get_subset_name(
|
||||
family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name=None,
|
||||
host_name=None,
|
||||
default_template=None,
|
||||
dynamic_data=None,
|
||||
project_settings=None
|
||||
):
|
||||
"""Calculate subset name based on passed context and OpenPype settings.
|
||||
|
||||
Subset name templates are defined in `project_settings/global/tools/creator
/subset_name_profiles`, where profiles filter by host name, family, task name
and task type. If the context does not match any profile then
`DEFAULT_SUBSET_TEMPLATE` is used as the default template.

That's the main reason why so many arguments are required to calculate the subset
name.
|
||||
|
||||
Args:
|
||||
family (str): Instance family.
|
||||
variant (str): In most cases it is user input during creation.
|
||||
task_name (str): Task name on which context is instance created.
|
||||
asset_doc (dict): Queried asset document with it's tasks in data.
|
||||
Used to get task type.
|
||||
project_name (str): Name of project on which is instance created.
|
||||
Important for project settings that are loaded.
|
||||
host_name (str): One of filtering criteria for template profile
|
||||
filters.
|
||||
default_template (str): Default template if any profile does not match
|
||||
passed context. Constant 'DEFAULT_SUBSET_TEMPLATE' is used if
|
||||
is not passed.
|
||||
dynamic_data (dict): Dynamic data specific for a creator which creates
|
||||
instance.
|
||||
dbcon (AvalonMongoDB): Mongo connection to be able query asset document
|
||||
if 'asset_doc' is not passed.
|
||||
"""
|
||||
|
||||
if not family:
|
||||
return ""
|
||||
|
||||
if not host_name:
|
||||
host_name = os.environ["AVALON_APP"]
|
||||
|
||||
# Use only last part of class family value split by dot (`.`)
|
||||
family = family.rsplit(".", 1)[-1]
|
||||
|
||||
if project_name is None:
|
||||
project_name = legacy_io.Session["AVALON_PROJECT"]
|
||||
|
||||
asset_tasks = asset_doc.get("data", {}).get("tasks") or {}
|
||||
task_info = asset_tasks.get(task_name) or {}
|
||||
task_type = task_info.get("type")
|
||||
|
||||
# Get settings
|
||||
if not project_settings:
|
||||
project_settings = get_project_settings(project_name)
|
||||
tools_settings = project_settings["global"]["tools"]
|
||||
profiles = tools_settings["creator"]["subset_name_profiles"]
|
||||
filtering_criteria = {
|
||||
"families": family,
|
||||
"hosts": host_name,
|
||||
"tasks": task_name,
|
||||
"task_types": task_type
|
||||
}
|
||||
|
||||
matching_profile = filter_profiles(profiles, filtering_criteria)
|
||||
template = None
|
||||
if matching_profile:
|
||||
template = matching_profile["template"]
|
||||
|
||||
# Make sure template is set (matching may have empty string)
|
||||
if not template:
|
||||
template = default_template or DEFAULT_SUBSET_TEMPLATE
|
||||
|
||||
# Simple check of task name existence for template with {task} in
|
||||
# - missing task should be possible only in Standalone publisher
|
||||
if not task_name and "{task" in template.lower():
|
||||
raise TaskNotSetError()
|
||||
|
||||
fill_pairs = {
|
||||
"variant": variant,
|
||||
"family": family,
|
||||
"task": task_name
|
||||
}
|
||||
if dynamic_data:
|
||||
# Dynamic data may override default values
|
||||
for key, value in dynamic_data.items():
|
||||
fill_pairs[key] = value
|
||||
|
||||
return template.format(**prepare_template_data(fill_pairs))
|
||||
|
|
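For orientation, a minimal call of the new helper could look like the sketch below. The asset document, project and host names are made up for illustration, and a configured OpenPype environment is assumed so `get_project_settings` can actually be queried:

```python
from openpype.pipeline.create import get_subset_name

# Hypothetical asset document and context values, for illustration only.
asset_doc = {
    "name": "characterA",
    "data": {"tasks": {"modeling": {"type": "Modeling"}}},
}

subset_name = get_subset_name(
    "model",       # family
    "Main",        # variant
    "modeling",    # task_name
    asset_doc,
    project_name="demo_project",
    host_name="maya",
)
# With the default "{family}{Variant}" template this resolves to "modelMain".
```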
@@ -273,3 +273,43 @@ def filter_pyblish_plugins(plugins):
                 option, value, plugin.__name__))
 
             setattr(plugin, option, value)
+
+
+def find_close_plugin(close_plugin_name, log):
+    if close_plugin_name:
+        plugins = pyblish.api.discover()
+        for plugin in plugins:
+            if plugin.__name__ == close_plugin_name:
+                return plugin
+
+    log.debug("Close plugin not found, app might not close.")
+
+
+def remote_publish(log, close_plugin_name=None, raise_error=False):
+    """Loops through all plugins, logs to console. Used for tests.
+
+    Args:
+        log (openpype.lib.Logger)
+        close_plugin_name (str): name of plugin with responsibility to
+            close host app
+    """
+    # Error exit as soon as any error occurs.
+    error_format = "Failed {plugin.__name__}: {error} -- {error.traceback}"
+
+    close_plugin = find_close_plugin(close_plugin_name, log)
+
+    for result in pyblish.util.publish_iter():
+        for record in result["records"]:
+            log.info("{}: {}".format(
+                result["plugin"].label, record.msg))
+
+        if result["error"]:
+            error_message = error_format.format(**result)
+            log.error(error_message)
+            if close_plugin:  # close host app explicitly after error
+                context = pyblish.api.Context()
+                close_plugin().process(context)
+            if raise_error:
+                # Fatal Error is because of Deadline
+                error_message = "Fatal Error: " + error_format.format(**result)
+                raise RuntimeError(error_message)
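A headless caller pairs the helper with a logger; the Deadline submission script further down in this diff does exactly that. A minimal sketch, assuming a pyblish host and its plugins are already registered (the close-plugin name here is hypothetical):

```python
from openpype.lib import Logger
from openpype.pipeline.publish.lib import remote_publish

log = Logger.get_logger("headless_publish")

# "ClosePS" stands in for whatever plugin closes the host application in a
# given deployment; omit the argument if nothing needs to be closed.
remote_publish(log, close_plugin_name="ClosePS", raise_error=True)
```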
@@ -1,5 +1,6 @@
 """
-Requires:
+Optional:
+    context -> hostName (str)
     context -> currentFile (str)
 Provides:
     context -> label (str)
@@ -16,16 +17,27 @@ class CollectContextLabel(pyblish.api.ContextPlugin):
     label = "Context Label"
 
     def process(self, context):
+        # Add ability to use custom context label
+        label = context.data.get("label")
+        if label:
+            self.log.debug("Context label is already set to \"{}\"".format(
+                label
+            ))
+            return
+
-        # Get last registered host
-        host = pyblish.api.registered_hosts()[-1]
+        host_name = context.data.get("hostName")
+        if not host_name:
+            host_name = pyblish.api.registered_hosts()[-1]
+        # Use host name as base for label
+        label = host_name.title()
 
-        # Get scene name from "currentFile"
-        path = context.data.get("currentFile") or "<Unsaved>"
-        base = os.path.basename(path)
+        # Get scene name from "currentFile" and use basename as ending of label
+        path = context.data.get("currentFile")
+        if path:
+            label += " - {}".format(os.path.basename(path))
 
-        # Set label
-        label = "{host} - {scene}".format(host=host.title(), scene=base)
-        if host == "standalonepublisher":
-            label = host.title()
         context.data["label"] = label
+        self.log.debug("Context label is changed to \"{}\"".format(
+            label
+        ))
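The collector now prefers an explicit `context.data["label"]` and otherwise builds one from `hostName` (falling back to the last registered pyblish host) plus the basename of `currentFile`. A standalone sketch of the same fallback logic, with example values:

```python
import os

# Standalone re-implementation of the label fallback shown above;
# "photoshop" and the file path are example values only.
def build_context_label(context_data, registered_hosts):
    label = context_data.get("label")
    if label:
        return label

    host_name = context_data.get("hostName") or registered_hosts[-1]
    label = host_name.title()

    path = context_data.get("currentFile")
    if path:
        label += " - {}".format(os.path.basename(path))
    return label


print(build_context_label(
    {"hostName": "photoshop", "currentFile": "C:/work/sceneA.psd"},
    ["photoshop"],
))
# prints "Photoshop - sceneA.psd"
```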
@@ -5,19 +5,6 @@ import sys
 import json
 import time
 
-from openpype.api import get_app_environments_for_context
-from openpype.lib.plugin_tools import get_batch_asset_task_info
-from openpype.lib.remote_publish import (
-    get_webpublish_conn,
-    start_webpublish_log,
-    publish_and_log,
-    fail_batch,
-    find_variant_key,
-    get_task_data,
-    get_timeout,
-    IN_PROGRESS_STATUS
-)
-
 
 class PypeCommands:
     """Class implementing commands used by Pype.
@@ -74,8 +61,8 @@ class PypeCommands:
 
     @staticmethod
     def launch_webpublisher_webservercli(*args, **kwargs):
-        from openpype.hosts.webpublisher.webserver_service.webserver_cli \
-            import (run_webserver)
+        from openpype.hosts.webpublisher.webserver_service import run_webserver
 
         return run_webserver(*args, **kwargs)
 
     @staticmethod
@@ -100,6 +87,7 @@ class PypeCommands:
         """
 
         from openpype.lib import Logger
+        from openpype.lib.applications import get_app_environments_for_context
         from openpype.modules import ModulesManager
         from openpype.pipeline import install_openpype_plugins
         from openpype.tools.utils.host_tools import show_publish
@@ -198,96 +186,13 @@ class PypeCommands:
         (to choose validator for example)
         """
 
-        import pyblish.api
-        from openpype.lib import ApplicationManager
-
-        from openpype.lib import Logger
-        log = Logger.get_logger("CLI-remotepublishfromapp")
-
-        log.info("remotepublishphotoshop command")
-
-        task_data = get_task_data(batch_path)
-
-        workfile_path = os.path.join(batch_path,
-                                     task_data["task"],
-                                     task_data["files"][0])
-
-        print("workfile_path {}".format(workfile_path))
-
-        batch_id = task_data["batch"]
-        dbcon = get_webpublish_conn()
-        # safer to start logging here, launch might be broken altogether
-        _id = start_webpublish_log(dbcon, batch_id, user_email)
-
-        batches_in_progress = list(dbcon.find({"status": IN_PROGRESS_STATUS}))
-        if len(batches_in_progress) > 1:
-            running_batches = [str(batch["_id"])
-                               for batch in batches_in_progress
-                               if batch["_id"] != _id]
-            msg = "There are still running batches {}\n". \
-                format("\n".join(running_batches))
-            msg += "Ask admin to check them and reprocess current batch"
-            fail_batch(_id, dbcon, msg)
-
-        if not task_data["context"]:
-            msg = "Batch manifest must contain context data"
-            msg += "Create new batch and set context properly."
-            fail_batch(_id, dbcon, msg)
-
-        asset_name, task_name, task_type = get_batch_asset_task_info(
-            task_data["context"])
-
-        application_manager = ApplicationManager()
-        found_variant_key = find_variant_key(application_manager, host_name)
-        app_name = "{}/{}".format(host_name, found_variant_key)
-
-        # must have for proper launch of app
-        env = get_app_environments_for_context(
-            project_name,
-            asset_name,
-            task_name,
-            app_name
+        from openpype.hosts.webpublisher.cli_functions import (
+            cli_publish_from_app
         )
-        print("env:: {}".format(env))
-        os.environ.update(env)
-
-        os.environ["OPENPYPE_PUBLISH_DATA"] = batch_path
-        # must pass identifier to update log lines for a batch
-        os.environ["BATCH_LOG_ID"] = str(_id)
-        os.environ["HEADLESS_PUBLISH"] = 'true' # to use in app lib
-        os.environ["USER_EMAIL"] = user_email
-
-        pyblish.api.register_host(host_name)
-        if targets:
-            if isinstance(targets, str):
-                targets = [targets]
-            current_targets = os.environ.get("PYBLISH_TARGETS", "").split(
-                os.pathsep)
-            for target in targets:
-                current_targets.append(target)
-
-            os.environ["PYBLISH_TARGETS"] = os.pathsep.join(
-                set(current_targets))
-
-        data = {
-            "last_workfile_path": workfile_path,
-            "start_last_workfile": True,
-            "project_name": project_name,
-            "asset_name": asset_name,
-            "task_name": task_name
-        }
-
-        launched_app = application_manager.launch(app_name, **data)
-
-        timeout = get_timeout(project_name, host_name, task_type)
-
-        time_start = time.time()
-        while launched_app.poll() is None:
-            time.sleep(0.5)
-            if time.time() - time_start > timeout:
-                launched_app.terminate()
-                msg = "Timeout reached"
-                fail_batch(_id, dbcon, msg)
+
+        cli_publish_from_app(
+            project_name, batch_path, host_name, user_email, targets
+        )
 
     @staticmethod
     def remotepublish(project, batch_path, user_email, targets=None):
@@ -311,53 +216,12 @@ class PypeCommands:
         Raises:
             RuntimeError: When there is no path to process.
         """
-        if not batch_path:
-            raise RuntimeError("No publish paths specified")
-
-        # Register target and host
-        import pyblish.api
-        import pyblish.util
+        from openpype.hosts.webpublisher.cli_functions import (
+            cli_publish
+        )
 
-        from openpype.lib import Logger
-        from openpype.pipeline import install_host
-        from openpype.hosts.webpublisher import api as webpublisher
-
-        log = Logger.get_logger("remotepublish")
-
-        log.info("remotepublish command")
-
-        host_name = "webpublisher"
-        os.environ["OPENPYPE_PUBLISH_DATA"] = batch_path
-        os.environ["AVALON_PROJECT"] = project
-        os.environ["AVALON_APP"] = host_name
-        os.environ["USER_EMAIL"] = user_email
-        os.environ["HEADLESS_PUBLISH"] = 'true' # to use in app lib
-
-        pyblish.api.register_host(host_name)
-
-        if targets:
-            if isinstance(targets, str):
-                targets = [targets]
-            for target in targets:
-                pyblish.api.register_target(target)
-
-        install_host(webpublisher)
-
-        log.info("Running publish ...")
-
-        _, batch_id = os.path.split(batch_path)
-        dbcon = get_webpublish_conn()
-        _id = start_webpublish_log(dbcon, batch_id, user_email)
-
-        task_data = get_task_data(batch_path)
-        if not task_data["context"]:
-            msg = "Batch manifest must contain context data"
-            msg += "Create new batch and set context properly."
-            fail_batch(_id, dbcon, msg)
-
-        publish_and_log(dbcon, _id, log, batch_id=batch_id)
-
-        log.info("Publish finished.")
+        cli_publish(project, batch_path, user_email, targets)
 
     @staticmethod
     def extractenvironments(output_json_path, project, asset, task, app,
@@ -366,8 +230,10 @@ class PypeCommands:
 
         Called by Deadline plugin to propagate environment into render jobs.
         """
+        from openpype.lib.applications import get_app_environments_for_context
+
         if all((project, asset, task, app)):
-            from openpype.api import get_app_environments_for_context
             env = get_app_environments_for_context(
                 project, asset, task, app, env_group
             )
@@ -469,7 +335,6 @@ class PypeCommands:
         sync_server_module.server_init()
         sync_server_module.server_start()
 
-        import time
         while True:
             time.sleep(1.0)
@@ -1,11 +1,12 @@
 try:
-    from openpype.api import Logger
-    import openpype.lib.remote_publish
+    from openpype.lib import Logger
+    from openpype.pipeline.publish.lib import remote_publish
 except ImportError as exc:
     # Ensure Deadline fails by output an error that contains "Fatal Error:"
     raise ImportError("Fatal Error: %s" % exc)
 
 
 if __name__ == "__main__":
     # Perform remote publish with thorough error checking
     log = Logger.get_logger(__name__)
-    openpype.lib.remote_publish.publish(log, raise_error=True)
+    remote_publish(log, raise_error=True)
@@ -8,7 +8,7 @@
     },
     "publish": {
         "CollectColorCodedInstances": {
-            "create_flatten_image": false,
+            "create_flatten_image": "no",
             "flatten_subset_template": "",
             "color_code_mapping": []
         },
@@ -45,9 +45,15 @@
             "label": "Set color for publishable layers, set its resulting family and template for subset name. \nCan create flatten image from published instances.(Applicable only for remote publishing!)"
         },
         {
-            "type": "boolean",
             "key": "create_flatten_image",
-            "label": "Create flatten image"
+            "label": "Create flatten image",
+            "type": "enum",
+            "multiselection": false,
+            "enum_items": [
+                { "flatten_with_images": "Flatten with images" },
+                { "flatten_only": "Flatten only" },
+                { "no": "No" }
+            ]
         },
         {
             "type": "text",
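With the setting changed from a boolean to an enum, downstream code has to branch on three string values instead of a truthy flag. A hypothetical sketch of how a publish plugin might interpret them — only the enum values come from the schema above, the surrounding names are made up:

```python
# Value read from settings; one of "flatten_with_images", "flatten_only", "no".
create_flatten_image = "flatten_with_images"

create_flatten_instance = create_flatten_image != "no"
keep_separate_image_instances = create_flatten_image != "flatten_only"
```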
@@ -79,7 +79,7 @@ class PublishReport:
 
         context_data = data["context"]
         context_data["name"] = "context"
-        context_data["label"] = context_data["label"] or "Context"
+        context_data["label"] = context_data.get("label") or "Context"
 
         logs = []
         plugins_items_by_id = {}
@@ -11,10 +11,10 @@ except Exception:
     from Qt import QtWidgets, QtCore, QtGui
 
 from openpype.client import get_asset_by_name, get_subsets
-from openpype.lib import TaskNotSetError
 from openpype.pipeline.create import (
     CreatorError,
-    SUBSET_NAME_ALLOWED_SYMBOLS
+    SUBSET_NAME_ALLOWED_SYMBOLS,
+    TaskNotSetError,
 )
 from openpype.tools.utils import (
     ErrorMessageBox,
@@ -335,14 +335,12 @@ class PublishFrame(QtWidgets.QFrame):
         if instance is None:
             new_name = (
                 context.data.get("label")
-                or getattr(context, "label", None)
                 or context.data.get("name")
                 or "Context"
             )
         else:
             new_name = (
                 instance.data.get("label")
-                or getattr(instance, "label", None)
                 or instance.data["name"]
             )
 
@@ -6,7 +6,6 @@ import collections
 from Qt import QtWidgets, QtCore, QtGui
 import qtawesome
 
-from openpype.lib import TaskNotSetError
 from openpype.widgets.attribute_defs import create_widget_for_attr_def
 from openpype.tools import resources
 from openpype.tools.flickcharm import FlickCharm
@@ -17,7 +16,10 @@ from openpype.tools.utils import (
     BaseClickableFrame,
     set_style_property,
 )
-from openpype.pipeline.create import SUBSET_NAME_ALLOWED_SYMBOLS
+from openpype.pipeline.create import (
+    SUBSET_NAME_ALLOWED_SYMBOLS,
+    TaskNotSetError,
+)
 from .assets_widget import AssetsDialog
 from .tasks_widget import TasksModel
 from .icons import (
@@ -244,7 +244,6 @@ class Controller(QtCore.QObject):
         self.context.optional = False
 
         self.context.data["publish"] = True
-        self.context.data["label"] = "Context"
         self.context.data["name"] = "context"
 
         self.context.data["host"] = reversed(pyblish.api.registered_hosts())
@@ -596,11 +596,6 @@ class InstanceItem(QtGui.QStandardItem):
         instance._logs = []
         instance.optional = getattr(instance, "optional", True)
         instance.data["publish"] = instance.data.get("publish", True)
-        instance.data["label"] = (
-            instance.data.get("label")
-            or getattr(instance, "label", None)
-            or instance.data["name"]
-        )
 
         family = self.data(Roles.FamiliesRole)[0]
         self.setData(
@@ -616,9 +611,16 @@ class InstanceItem(QtGui.QStandardItem):
 
     def data(self, role=QtCore.Qt.DisplayRole):
         if role == QtCore.Qt.DisplayRole:
+            label = None
             if settings.UseLabel:
-                return self.instance.data["label"]
-            return self.instance.data["name"]
+                label = self.instance.data.get("label")
+
+            if not label:
+                if self.is_context:
+                    label = "Context"
+                else:
+                    label = self.instance.data["name"]
+            return label
 
         if role == QtCore.Qt.DecorationRole:
             icon_name = self.instance.data.get("icon") or "file"
@@ -236,7 +236,7 @@ def main():
     signal.signal(signal.SIGTERM, signal_handler)
 
     modules_manager = ModulesManager()
-    module = modules_manager.modules_by_name["standalonepublish_tool"]
+    module = modules_manager.modules_by_name["standalonepublisher"]
 
     window = Window(module.publish_paths)
     window.show()
@@ -8,10 +8,12 @@ from openpype.client import (
     get_subsets,
     get_last_version_by_subset_id,
 )
-from openpype.api import get_project_settings
+from openpype.settings import get_project_settings
 from openpype.pipeline import LegacyCreator
-from openpype.lib import TaskNotSetError
-from openpype.pipeline.create import SUBSET_NAME_ALLOWED_SYMBOLS
+from openpype.pipeline.create import (
+    SUBSET_NAME_ALLOWED_SYMBOLS,
+    TaskNotSetError,
+)
 
 from . import HelpRole, FamilyRole, ExistsRole, PluginRole, PluginKeyRole
 from . import FamilyDescriptionWidget
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-
 """Package declaring Pype version."""
-__version__ = "3.14.1-nightly.2"
+__version__ = "3.14.1-nightly.3"
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "OpenPype"
-version = "3.14.1-nightly.2" # OpenPype
+version = "3.14.1-nightly.3" # OpenPype
 description = "Open VFX and Animation pipeline with support."
 authors = ["OpenPype Team <info@openpype.io>"]
 license = "MIT License"
@@ -12,8 +12,6 @@ import platform
 from tests.lib.db_handler import DBHandler
 from tests.lib.file_handler import RemoteFileHandler
 
-from openpype.lib.remote_publish import find_variant_key
-
 
 class BaseTest:
     """Empty base test class"""
@@ -210,7 +208,10 @@ class PublishTest(ModuleUnitTest):
 
         application_manager = ApplicationManager()
         if not app_variant:
-            app_variant = find_variant_key(application_manager, self.APP)
+            variant = (
+                application_manager.find_latest_available_variant_for_group(
+                    self.APP))
+            app_variant = variant.name
 
         yield "{}/{}".format(self.APP, app_variant)
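The fixture now resolves the newest installed variant directly through `ApplicationManager.find_latest_available_variant_for_group` instead of the removed `find_variant_key` helper. A minimal sketch of that lookup — "maya" is an example application group, and what is available depends on the studio's application settings:

```python
from openpype.lib import ApplicationManager

application_manager = ApplicationManager()
# Assumed behaviour: returns the newest enabled/available variant for the
# group, so the full application name can be composed from it.
variant = application_manager.find_latest_available_variant_for_group("maya")
if variant is not None:
    app_name = "maya/{}".format(variant.name)
```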
@@ -342,4 +343,4 @@ class HostFixtures(PublishTest):
     @pytest.fixture(scope="module")
     def startup_scripts(self, monkeypatch_session, download_test_data):
         """"Adds init scripts (like userSetup) to expected location"""
-        raise NotImplementedError
+        raise NotImplementedError