Mirror of https://github.com/ynput/ayon-core.git, synced 2025-12-24 21:04:40 +01:00

Merge branch 'develop' into enhancement/OP-5244_3dsmax-setting-resolution

Commit 4dfb57bb21: 58 changed files with 1885 additions and 1674 deletions
.github/workflows/project_actions.yml (vendored, new file, 22 lines)

@@ -0,0 +1,22 @@
name: project-actions

on:
  pull_request:
    types: [review_requested]
  pull_request_review:
    types: [submitted]

jobs:
  pr_review_requested:
    name: pr_review_requested
    runs-on: ubuntu-latest
    if: github.event_name == 'pull_request' && github.event.action == 'review_requested'
    steps:
      - name: Move PR to 'Change Requested'
        uses: leonsteinhaeuser/project-beta-automations@v2.1.0
        with:
          gh_token: ${{ secrets.YNPUT_BOT_TOKEN }}
          organization: ynput
          project_id: 11
          resource_node_id: ${{ github.event.pull_request.node_id }}
          status_value: Change Requested
ARCHITECTURE.md (new file, 77 lines)

@@ -0,0 +1,77 @@
# Architecture

OpenPype is a monolithic Python project that bundles several parts. This document aims to give a bird's-eye overview of the project and, to a certain degree, of each of the sub-projects.

The current file structure looks like this:

```
.
├── common - Code in this folder is the backend portion of the Addon distribution logic for the v4 server.
├── docs - Documentation of the source code.
├── igniter - The OpenPype bootstrapper; deals with running version resolution and setting up the connection to the MongoDB.
├── openpype - The actual OpenPype core package.
├── schema - Collection of JSON files describing schematics of objects. This follows Avalon's convention.
├── tests - Integration and unit tests.
├── tools - Convenience scripts to perform common actions (in both bash and ps1).
├── vendor - When using the igniter, it deploys third-party tools in here, such as ffmpeg.
└── website - Source files for https://openpype.io/, which is Docusaurus (https://docusaurus.io/).
```
The core functionality of the pipeline can be found in `igniter` and `openpype`, which in turn rely on the `schema` files. Whenever you build (or download a pre-built) version of OpenPype, these two are bundled in there, and `Igniter` is the entry point.

## Igniter

It's the setup and update tool for OpenPype. Unless you want to package `openpype` separately and deal with all the config manually, this will most likely be your entry point.

```
igniter/
├── bootstrap_repos.py - Module that will find or install OpenPype versions in the system.
├── __init__.py - Igniter entry point.
├── install_dialog.py - Show dialog for choosing central pype repository.
├── install_thread.py - Threading helpers for the install process.
├── __main__.py - Like `__init__.py`?
├── message_dialog.py - Qt Dialog with a message and "Ok" button.
├── nice_progress_bar.py - Fancy Qt progress bar.
├── splash.txt - ASCII art for the terminal installer.
├── stylesheet.css - Installer Qt styles.
├── terminal_splash.py - Terminal installer animation, relies on `splash.txt`.
├── tools.py - Collection of methods that don't fit in other modules.
├── update_thread.py - Threading helper to update existing OpenPype installs.
├── update_window.py - Qt UI to update OpenPype installs.
├── user_settings.py - Interface for the OpenPype user settings.
└── version.py - Igniter's version number.
```
## OpenPype

This is the main package of the OpenPype logic. It could be loosely described as a combination of [Avalon](https://getavalon.github.io), [Pyblish](https://pyblish.com/), and glue around those with custom OpenPype-only elements. Things are in the progress of being moved around to better prepare for V4, which will be released under a new name, AYON.

```
openpype/
├── client - Interface for the MongoDB.
├── hooks - Hooks to be executed on certain OpenPype Applications defined in `openpype.lib.applications`.
├── host - Base class for the different hosts.
├── hosts - Integration with the different DCCs (hosts) using the `host` base class.
├── lib - Libraries that stitch together the package, some have been moved into other parts.
├── modules - OpenPype modules should contain separated logic of specific kind of implementation, such as Ftrack connection and its python API.
├── pipeline - Core of the OpenPype pipeline, handles creation of data, publishing, etc.
├── plugins - Global/core plugins for loader and publisher tool.
├── resources - Icons, fonts, etc.
├── scripts - Loose scripts that get run by tools/publishers.
├── settings - OpenPype settings interface.
├── style - Qt styling.
├── tests - Unit tests.
├── tools - Core tools, check out https://openpype.io/docs/artist_tools.
├── vendor - Vendored required Python packages.
├── widgets - Common re-usable Qt Widgets.
├── action.py - LEGACY: Pyblish actions, now live in `openpype.pipeline.publish.action`.
├── cli.py - Command line interface, leverages `click`.
├── __init__.py - Sets two constants.
├── __main__.py - Entry point, calls `cli.py`.
├── plugin.py - Pyblish plugins.
├── pype_commands.py - Implementation of OpenPype commands.
└── version.py - Current version number.
```
@@ -3,10 +3,13 @@ from openpype.lib import PreLaunchHook
from openpype.pipeline.workfile import create_workdir_extra_folders


-class AddLastWorkfileToLaunchArgs(PreLaunchHook):
-    """Add last workfile path to launch arguments.
+class CreateWorkdirExtraFolders(PreLaunchHook):
+    """Create extra folders for the work directory.

    Based on setting `project_settings/global/tools/Workfiles/extra_folders`
    profile filtering will decide whether extra folders need to be created in
    the work directory.

    This is not possible to do for all applications the same way.
    """

    # Execute after workfile template copy
@@ -7,7 +7,7 @@ class LaunchFoundryAppsWindows(PreLaunchHook):

    Nuke is executed "like" python process so it is required to pass
    `CREATE_NEW_CONSOLE` flag on windows to trigger creation of new console.
-    At the same time the newly created console won't create it's own stdout
+    At the same time the newly created console won't create its own stdout
    and stderr handlers so they should not be redirected to DEVNULL.
    """
@@ -18,7 +18,7 @@ class LaunchFoundryAppsWindows(PreLaunchHook):

    def execute(self):
        # Change `creationflags` to CREATE_NEW_CONSOLE
-        # - on Windows will nuke create new window using it's console
+        # - on Windows nuke will create new window using its console
        # Set `stdout` and `stderr` to None so new created console does not
        # have redirected output to DEVNULL in build
        self.launch_context.kwargs.update({
@@ -84,11 +84,11 @@ class MainThreadItem:
        self.kwargs = kwargs

    def execute(self):
-        """Execute callback and store it's result.
+        """Execute callback and store its result.

        Method must be called from main thread. Item is marked as `done`
        when callback execution finished. Store output of callback of exception
-        information when callback raise one.
+        information when callback raises one.
        """
        print("Executing process in main thread")
        if self.done:
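The `MainThreadItem` docstring above describes a run-on-main-thread pattern: execute a callback, mark the item done, and store either the result or the raised exception. A minimal standalone sketch of that pattern (attribute names are assumptions based on this excerpt, not the full OpenPype class):

```python
import sys


class MainThreadItem:
    """Callback wrapper that records its result or exception once executed."""

    def __init__(self, callback, *args, **kwargs):
        self.callback = callback
        self.args = args
        self.kwargs = kwargs
        self.done = False
        self.result = None
        self.exc_info = None

    def execute(self):
        # Run only once; subsequent calls are no-ops
        if self.done:
            return
        try:
            self.result = self.callback(*self.args, **self.kwargs)
        except Exception:
            # Store exception information when the callback raises one
            self.exc_info = sys.exc_info()
        finally:
            self.done = True
```

A consumer on the main thread can poll `done` and then inspect `result` or re-raise from `exc_info`.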
@@ -38,8 +38,9 @@ class CelactionPrelaunchHook(PreLaunchHook):
        )

        path_to_cli = os.path.join(CELACTION_SCRIPTS_DIR, "publish_cli.py")
-        subproces_args = get_openpype_execute_args("run", path_to_cli)
-        openpype_executable = subproces_args.pop(0)
+        subprocess_args = get_openpype_execute_args("run", path_to_cli)
+        openpype_executable = subprocess_args.pop(0)
        workfile_settings = self.get_workfile_settings()

        winreg.SetValueEx(
            hKey,
@@ -49,20 +50,34 @@ class CelactionPrelaunchHook(PreLaunchHook):
            openpype_executable
        )

-        parameters = subproces_args + [
-            "--currentFile", "*SCENE*",
-            "--chunk", "*CHUNK*",
-            "--frameStart", "*START*",
-            "--frameEnd", "*END*",
-            "--resolutionWidth", "*X*",
-            "--resolutionHeight", "*Y*"
+        # add required arguments for workfile path
+        parameters = subprocess_args + [
+            "--currentFile", "*SCENE*"
        ]

+        # Add custom parameters from workfile settings
+        if "render_chunk" in workfile_settings["submission_overrides"]:
+            parameters += [
+                "--chunk", "*CHUNK*"
+            ]
+        if "resolution" in workfile_settings["submission_overrides"]:
+            parameters += [
+                "--resolutionWidth", "*X*",
+                "--resolutionHeight", "*Y*"
+            ]
+        if "frame_range" in workfile_settings["submission_overrides"]:
+            parameters += [
+                "--frameStart", "*START*",
+                "--frameEnd", "*END*"
+            ]
+
        winreg.SetValueEx(
            hKey, "SubmitParametersTitle", 0, winreg.REG_SZ,
            subprocess.list2cmdline(parameters)
        )

        self.log.debug(f"__ parameters: \"{parameters}\"")

        # setting resolution parameters
        path_submit = "\\".join([
            path_user_settings, "Dialogs", "SubmitOutput"
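The hunk above changes the submit parameter list from a fixed set to one assembled from the enabled `submission_overrides`. The assembly logic can be sketched in isolation like this (the function name and inputs are illustrative, not OpenPype's actual API):

```python
def build_submit_parameters(base_args, overrides):
    """Build CelAction submit parameters from the enabled overrides.

    `base_args` stands in for the executable args; `overrides` is the
    `submission_overrides` collection from the workfile settings.
    """
    # Workfile path is always required
    parameters = list(base_args) + ["--currentFile", "*SCENE*"]

    # Optional parameters, each gated on its override being enabled
    if "render_chunk" in overrides:
        parameters += ["--chunk", "*CHUNK*"]
    if "resolution" in overrides:
        parameters += ["--resolutionWidth", "*X*", "--resolutionHeight", "*Y*"]
    if "frame_range" in overrides:
        parameters += ["--frameStart", "*START*", "--frameEnd", "*END*"]
    return parameters
```

The design point of the change: disabled overrides no longer leak their placeholder tokens into the registry value that CelAction reads.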
@@ -135,3 +150,6 @@ class CelactionPrelaunchHook(PreLaunchHook):
        self.log.info(f"Workfile to open: \"{workfile_path}\"")

        return workfile_path
+
+    def get_workfile_settings(self):
+        return self.data["project_settings"]["celaction"]["workfile"]
@@ -39,7 +39,7 @@ class CollectCelactionCliKwargs(pyblish.api.Collector):
            passing_kwargs[key] = value

        if missing_kwargs:
-            raise RuntimeError("Missing arguments {}".format(
+            self.log.debug("Missing arguments {}".format(
                ", ".join(
                    [f'"{key}"' for key in missing_kwargs]
                )
@@ -6,12 +6,13 @@ from openpype.pipeline.publish import get_errored_instances_from_context


class SelectInvalidAction(pyblish.api.Action):
-    """Select invalid nodes in Maya when plug-in failed.
+    """Select invalid nodes in Fusion when plug-in failed.

    To retrieve the invalid nodes this assumes a static `get_invalid()`
    method is available on the plugin.

    """

    label = "Select invalid"
    on = "failed"  # This action is only available on a failed plug-in
    icon = "search"  # Icon from Awesome Icon
@@ -31,8 +32,10 @@ class SelectInvalidAction(pyblish.api.Action):
            if isinstance(invalid_nodes, (list, tuple)):
                invalid.extend(invalid_nodes)
            else:
-                self.log.warning("Plug-in returned to be invalid, "
-                                 "but has no selectable nodes.")
+                self.log.warning(
+                    "Plug-in returned to be invalid, "
+                    "but has no selectable nodes."
+                )

        if not invalid:
            # Assume relevant comp is current comp and clear selection
@@ -51,4 +54,6 @@ class SelectInvalidAction(pyblish.api.Action):
        for tool in invalid:
            flow.Select(tool, True)
            names.add(tool.Name)
-        self.log.info("Selecting invalid tools: %s" % ", ".join(sorted(names)))
+        self.log.info(
+            "Selecting invalid tools: %s" % ", ".join(sorted(names))
+        )
@@ -6,7 +6,6 @@ from openpype.tools.utils import host_tools
from openpype.style import load_stylesheet
-from openpype.lib import register_event_callback
from openpype.hosts.fusion.scripts import (
    set_rendermode,
    duplicate_with_inputs,
)
from openpype.hosts.fusion.api.lib import (
@@ -60,7 +59,6 @@ class OpenPypeMenu(QtWidgets.QWidget):
        publish_btn = QtWidgets.QPushButton("Publish...", self)
        manager_btn = QtWidgets.QPushButton("Manage...", self)
        libload_btn = QtWidgets.QPushButton("Library...", self)
-        rendermode_btn = QtWidgets.QPushButton("Set render mode...", self)
        set_framerange_btn = QtWidgets.QPushButton("Set Frame Range", self)
        set_resolution_btn = QtWidgets.QPushButton("Set Resolution", self)
        duplicate_with_inputs_btn = QtWidgets.QPushButton(
@@ -91,7 +89,6 @@ class OpenPypeMenu(QtWidgets.QWidget):

        layout.addWidget(set_framerange_btn)
        layout.addWidget(set_resolution_btn)
-        layout.addWidget(rendermode_btn)

        layout.addSpacing(20)
@@ -108,7 +105,6 @@ class OpenPypeMenu(QtWidgets.QWidget):
        load_btn.clicked.connect(self.on_load_clicked)
        manager_btn.clicked.connect(self.on_manager_clicked)
        libload_btn.clicked.connect(self.on_libload_clicked)
-        rendermode_btn.clicked.connect(self.on_rendermode_clicked)
        duplicate_with_inputs_btn.clicked.connect(
            self.on_duplicate_with_inputs_clicked
        )
@@ -162,15 +158,6 @@ class OpenPypeMenu(QtWidgets.QWidget):
    def on_libload_clicked(self):
        host_tools.show_library_loader()

-    def on_rendermode_clicked(self):
-        if self.render_mode_widget is None:
-            window = set_rendermode.SetRenderMode()
-            window.setStyleSheet(load_stylesheet())
-            window.show()
-            self.render_mode_widget = window
-        else:
-            self.render_mode_widget.show()
-
    def on_duplicate_with_inputs_clicked(self):
        duplicate_with_inputs.duplicate_with_input_connections()
@@ -4,29 +4,34 @@ import qtawesome

from openpype.hosts.fusion.api import (
    get_current_comp,
-    comp_lock_and_undo_chunk
+    comp_lock_and_undo_chunk,
)

-from openpype.lib import BoolDef
+from openpype.lib import (
+    BoolDef,
+    EnumDef,
+)
from openpype.pipeline import (
    legacy_io,
    Creator,
-    CreatedInstance
+    CreatedInstance,
)
-from openpype.client import (
-    get_asset_by_name,
-)
+from openpype.client import get_asset_by_name


class CreateSaver(Creator):
    identifier = "io.openpype.creators.fusion.saver"
-    name = "saver"
-    label = "Saver"
+    label = "Render (saver)"
+    name = "render"
    family = "render"
-    default_variants = ["Main"]
+    default_variants = ["Main", "Mask"]
+    description = "Fusion Saver to generate image sequence"

-    def create(self, subset_name, instance_data, pre_create_data):
+    instance_attributes = ["reviewable"]
+
+    def create(self, subset_name, instance_data, pre_create_data):
        # TODO: Add pre_create attributes to choose file format?
        file_format = "OpenEXRFormat"
@@ -58,7 +63,8 @@ class CreateSaver(Creator):
            family=self.family,
            subset_name=subset_name,
            data=instance_data,
-            creator=self)
+            creator=self,
+        )

        # Insert the transient data
        instance.transient_data["tool"] = saver
@@ -68,11 +74,9 @@ class CreateSaver(Creator):
        return instance

    def collect_instances(self):
-
        comp = get_current_comp()
        tools = comp.GetToolList(False, "Saver").values()
        for tool in tools:
-
            data = self.get_managed_tool_data(tool)
            if not data:
                data = self._collect_unmanaged_saver(tool)
@@ -90,7 +94,6 @@ class CreateSaver(Creator):

    def update_instances(self, update_list):
        for created_inst, _changes in update_list:
-
            new_data = created_inst.data_to_store()
            tool = created_inst.transient_data["tool"]
            self._update_tool_with_data(tool, new_data)
@@ -139,7 +142,6 @@ class CreateSaver(Creator):
        tool.SetAttrs({"TOOLS_Name": subset})

    def _collect_unmanaged_saver(self, tool):
-
        # TODO: this should not be done this way - this should actually
        # get the data as stored on the tool explicitly (however)
        # that would disallow any 'regular saver' to be collected
@@ -153,8 +155,7 @@ class CreateSaver(Creator):
        asset = legacy_io.Session["AVALON_ASSET"]
        task = legacy_io.Session["AVALON_TASK"]

-        asset_doc = get_asset_by_name(project_name=project,
-                                      asset_name=asset)
+        asset_doc = get_asset_by_name(project_name=project, asset_name=asset)

        path = tool["Clip"][comp.TIME_UNDEFINED]
        fname = os.path.basename(path)
@@ -178,21 +179,20 @@ class CreateSaver(Creator):
            "variant": variant,
            "active": not passthrough,
            "family": self.family,

            # Unique identifier for instance and this creator
            "id": "pyblish.avalon.instance",
-            "creator_identifier": self.identifier
+            "creator_identifier": self.identifier,
        }

    def get_managed_tool_data(self, tool):
        """Return data of the tool if it matches creator identifier"""
-        data = tool.GetData('openpype')
+        data = tool.GetData("openpype")
        if not isinstance(data, dict):
            return

        required = {
            "id": "pyblish.avalon.instance",
-            "creator_identifier": self.identifier
+            "creator_identifier": self.identifier,
        }
        for key, value in required.items():
            if key not in data or data[key] != value:
@@ -205,11 +205,40 @@ class CreateSaver(Creator):

        return data

-    def get_instance_attr_defs(self):
-        return [
-            BoolDef(
-                "review",
-                default=True,
-                label="Review"
-            )
+    def get_pre_create_attr_defs(self):
+        """Settings for create page"""
+        attr_defs = [
+            self._get_render_target_enum(),
+            self._get_reviewable_bool(),
        ]
+        return attr_defs
+
+    def get_instance_attr_defs(self):
+        """Settings for publish page"""
+        attr_defs = [
+            self._get_render_target_enum(),
+            self._get_reviewable_bool(),
+        ]
+        return attr_defs
+
+    # These functions below should be moved to another file
+    # so it can be used by other plugins. plugin.py ?
+
+    def _get_render_target_enum(self):
+        rendering_targets = {
+            "local": "Local machine rendering",
+            "frames": "Use existing frames",
+        }
+        if "farm_rendering" in self.instance_attributes:
+            rendering_targets["farm"] = "Farm rendering"
+
+        return EnumDef(
+            "render_target", items=rendering_targets, label="Render target"
+        )
+
+    def _get_reviewable_bool(self):
+        return BoolDef(
+            "review",
+            default=("reviewable" in self.instance_attributes),
+            label="Review",
+        )
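The hunk above makes the creator's attribute definitions data-driven: which render targets and defaults appear depends on the `instance_attributes` list. A sketch of that pattern using plain dicts as stand-ins for `openpype.lib`'s `EnumDef`/`BoolDef` (the dict shape here is illustrative only):

```python
def get_render_target_enum(instance_attributes):
    """Build the render-target options, adding "farm" only when enabled."""
    targets = {
        "local": "Local machine rendering",
        "frames": "Use existing frames",
    }
    if "farm_rendering" in instance_attributes:
        targets["farm"] = "Farm rendering"
    # Stand-in for EnumDef("render_target", items=..., label=...)
    return {"key": "render_target", "items": targets, "label": "Render target"}


def get_reviewable_bool(instance_attributes):
    """Default the review toggle from the configured instance attributes."""
    # Stand-in for BoolDef("review", default=..., label="Review")
    return {
        "key": "review",
        "default": "reviewable" in instance_attributes,
        "label": "Review",
    }
```

This keeps studio configuration (which attributes a saver exposes) separate from the creator code itself.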
@@ -0,0 +1,50 @@
import pyblish.api
from openpype.pipeline import publish
import os


class CollectFusionExpectedFrames(
    pyblish.api.InstancePlugin, publish.ColormanagedPyblishPluginMixin
):
    """Collect all frames needed to publish expected frames"""

    order = pyblish.api.CollectorOrder + 0.5
    label = "Collect Expected Frames"
    hosts = ["fusion"]
    families = ["render"]

    def process(self, instance):
        context = instance.context

        frame_start = context.data["frameStartHandle"]
        frame_end = context.data["frameEndHandle"]
        path = instance.data["path"]
        output_dir = instance.data["outputDir"]

        basename = os.path.basename(path)
        head, ext = os.path.splitext(basename)
        files = [
            f"{head}{str(frame).zfill(4)}{ext}"
            for frame in range(frame_start, frame_end + 1)
        ]
        repre = {
            "name": ext[1:],
            "ext": ext[1:],
            "frameStart": f"%0{len(str(frame_end))}d" % frame_start,
            "files": files,
            "stagingDir": output_dir,
        }

        self.set_representation_colorspace(
            representation=repre,
            context=context,
        )

        # review representation
        if instance.data.get("review", False):
            repre["tags"] = ["review"]

        # add the repre to the instance
        if "representations" not in instance.data:
            instance.data["representations"] = []
        instance.data["representations"].append(repre)
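The expected-frames collector above derives one filename per frame from the saver's output path. That derivation can be isolated as a small helper (the function name is illustrative; the padding of 4 digits matches the collector's `zfill(4)`):

```python
import os


def expected_frame_files(path, frame_start, frame_end):
    """Return the per-frame filenames a Fusion saver is expected to write.

    Mirrors the collector above: "<head><frame zero-padded to 4><ext>"
    for every frame in the inclusive range.
    """
    head, ext = os.path.splitext(os.path.basename(path))
    return [
        f"{head}{str(frame).zfill(4)}{ext}"
        for frame in range(frame_start, frame_end + 1)
    ]
```

Note the padding is fixed at four digits regardless of the frame numbers, so frame 1001 renders as "1001" while frame 7 renders as "0007".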
@@ -1,44 +0,0 @@
import pyblish.api


class CollectFusionRenderMode(pyblish.api.InstancePlugin):
    """Collect current comp's render Mode

    Options:
        local
        farm

    Note that this value is set for each comp separately. When you save the
    comp this information will be stored in that file. If for some reason the
    available tool does not visualize which render mode is set for the
    current comp, please run the following line in the console (Py2)

        comp.GetData("openpype.rendermode")

    This will return the name of the current render mode as seen above under
    Options.

    """

    order = pyblish.api.CollectorOrder + 0.4
    label = "Collect Render Mode"
    hosts = ["fusion"]
    families = ["render"]

    def process(self, instance):
        """Collect all image sequence tools"""
        options = ["local", "farm"]

        comp = instance.context.data.get("currentComp")
        if not comp:
            raise RuntimeError("No comp previously collected, unable to "
                               "retrieve Fusion version.")

        rendermode = comp.GetData("openpype.rendermode") or "local"
        assert rendermode in options, "Must be supported render mode"

        self.log.info("Render mode: {0}".format(rendermode))

        # Append family
        family = "render.{0}".format(rendermode)
        instance.data["families"].append(family)
openpype/hosts/fusion/plugins/publish/collect_renders.py (new file, 25 lines)

@@ -0,0 +1,25 @@
import pyblish.api


class CollectFusionRenders(pyblish.api.InstancePlugin):
    """Collect current saver node's render Mode

    Options:
        local (Render locally)
        frames (Use existing frames)

    """

    order = pyblish.api.CollectorOrder + 0.4
    label = "Collect Renders"
    hosts = ["fusion"]
    families = ["render"]

    def process(self, instance):
        render_target = instance.data["render_target"]
        family = instance.data["family"]

        # add targeted family to families
        instance.data["families"].append(
            "{}.{}".format(family, render_target)
        )
openpype/hosts/fusion/plugins/publish/extract_render_local.py (new file, 109 lines)

@@ -0,0 +1,109 @@
import logging
import contextlib
import pyblish.api
from openpype.hosts.fusion.api import comp_lock_and_undo_chunk


log = logging.getLogger(__name__)


@contextlib.contextmanager
def enabled_savers(comp, savers):
    """Enable only the `savers` in Comp during the context.

    Any Saver tool in the passed composition that is not in the savers list
    will be set to passthrough during the context.

    Args:
        comp (object): Fusion composition object.
        savers (list): List of Saver tool objects.

    """
    passthrough_key = "TOOLB_PassThrough"
    original_states = {}
    enabled_save_names = {saver.Name for saver in savers}
    try:
        all_savers = comp.GetToolList(False, "Saver").values()
        for saver in all_savers:
            original_state = saver.GetAttrs()[passthrough_key]
            original_states[saver] = original_state

            # The passthrough state we want to set (passthrough != enabled)
            state = saver.Name not in enabled_save_names
            if state != original_state:
                saver.SetAttrs({passthrough_key: state})
        yield
    finally:
        for saver, original_state in original_states.items():
            saver.SetAttrs({"TOOLB_PassThrough": original_state})


class FusionRenderLocal(pyblish.api.InstancePlugin):
    """Render the current Fusion composition locally."""

    order = pyblish.api.ExtractorOrder - 0.2
    label = "Render Local"
    hosts = ["fusion"]
    families = ["render.local"]

    def process(self, instance):
        context = instance.context

        # Start render
        self.render_once(context)

        # Log render status
        self.log.info(
            "Rendered '{nm}' for asset '{ast}' under the task '{tsk}'".format(
                nm=instance.data["name"],
                ast=instance.data["asset"],
                tsk=instance.data["task"],
            )
        )

    def render_once(self, context):
        """Render context comp only once, even with more render instances"""

        # This plug-in assumes all render nodes get rendered at the same time
        # to speed up the rendering. The check below makes sure that we only
        # execute the rendering once and not for each instance.
        key = f"__hasRun{self.__class__.__name__}"

        savers_to_render = [
            # Get the saver tool from the instance
            instance[0] for instance in context if
            # Only active instances
            instance.data.get("publish", True) and
            # Only render.local instances
            "render.local" in instance.data["families"]
        ]

        if key not in context.data:
            # We initialize as false to indicate it wasn't successful yet
            # so we can keep track of whether Fusion succeeded
            context.data[key] = False

            current_comp = context.data["currentComp"]
            frame_start = context.data["frameStartHandle"]
            frame_end = context.data["frameEndHandle"]

            self.log.info("Starting Fusion render")
            self.log.info(f"Start frame: {frame_start}")
            self.log.info(f"End frame: {frame_end}")
            saver_names = ", ".join(saver.Name for saver in savers_to_render)
            self.log.info(f"Rendering tools: {saver_names}")

            with comp_lock_and_undo_chunk(current_comp):
                with enabled_savers(current_comp, savers_to_render):
                    result = current_comp.Render(
                        {
                            "Start": frame_start,
                            "End": frame_end,
                            "Wait": True,
                        }
                    )

            context.data[key] = bool(result)

        if context.data[key] is False:
            raise RuntimeError("Comp render failed")
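The `render_once` method above relies on a run-once guard: many render instances share one publish context, but the comp render must fire a single time. The guard in isolation looks like this (the `Context` stub and key name are assumptions for illustration, with the render call injected as a plain callable):

```python
class Context:
    """Minimal stand-in for a Pyblish context: just a shared data dict."""

    def __init__(self):
        self.data = {}


def render_once(context, render):
    """Run `render` only on the first call for this context.

    Subsequent calls skip the render but still re-check the stored result,
    so every instance fails if the single shared render failed.
    """
    key = "__hasRunFusionRenderLocal"  # key name assumed from the plug-in
    if key not in context.data:
        # Initialize as False so a crash mid-render leaves a failure marker
        context.data[key] = False
        context.data[key] = bool(render())

    if context.data[key] is False:
        raise RuntimeError("Comp render failed")
```

Storing the flag in the shared context, rather than on the plug-in, is what makes the guard survive across the per-instance invocations Pyblish performs.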
@@ -1,100 +0,0 @@
import os
import pyblish.api
from openpype.pipeline import publish
from openpype.hosts.fusion.api import comp_lock_and_undo_chunk


class Fusionlocal(pyblish.api.InstancePlugin,
                  publish.ColormanagedPyblishPluginMixin):
    """Render the current Fusion composition locally.

    Extract the result of savers by starting a comp render
    This will run the local render of Fusion.

    """

    order = pyblish.api.ExtractorOrder - 0.1
    label = "Render Local"
    hosts = ["fusion"]
    families = ["render.local"]

    def process(self, instance):
        context = instance.context

        # Start render
        self.render_once(context)

        # Log render status
        self.log.info(
            "Rendered '{nm}' for asset '{ast}' under the task '{tsk}'".format(
                nm=instance.data["name"],
                ast=instance.data["asset"],
                tsk=instance.data["task"],
            )
        )

        frame_start = context.data["frameStartHandle"]
        frame_end = context.data["frameEndHandle"]
        path = instance.data["path"]
        output_dir = instance.data["outputDir"]

        basename = os.path.basename(path)
        head, ext = os.path.splitext(basename)
        files = [
            f"{head}{str(frame).zfill(4)}{ext}"
            for frame in range(frame_start, frame_end + 1)
        ]
        repre = {
            "name": ext[1:],
            "ext": ext[1:],
            "frameStart": f"%0{len(str(frame_end))}d" % frame_start,
            "files": files,
            "stagingDir": output_dir,
        }

        self.set_representation_colorspace(
            representation=repre,
            context=context,
        )

        if "representations" not in instance.data:
            instance.data["representations"] = []
        instance.data["representations"].append(repre)

        # review representation
        if instance.data.get("review", False):
            repre["tags"] = ["review", "ftrackreview"]

    def render_once(self, context):
        """Render context comp only once, even with more render instances"""

        # This plug-in assumes all render nodes get rendered at the same time
        # to speed up the rendering. The check below makes sure that we only
        # execute the rendering once and not for each instance.
        key = f"__hasRun{self.__class__.__name__}"
        if key not in context.data:
            # We initialize as false to indicate it wasn't successful yet
            # so we can keep track of whether Fusion succeeded
            context.data[key] = False

            current_comp = context.data["currentComp"]
            frame_start = context.data["frameStartHandle"]
            frame_end = context.data["frameEndHandle"]

            self.log.info("Starting Fusion render")
            self.log.info(f"Start frame: {frame_start}")
            self.log.info(f"End frame: {frame_end}")

            with comp_lock_and_undo_chunk(current_comp):
                result = current_comp.Render(
                    {
                        "Start": frame_start,
                        "End": frame_end,
                        "Wait": True,
                    }
                )

            context.data[key] = bool(result)

        if context.data[key] is False:
            raise RuntimeError("Comp render failed")
@@ -14,22 +14,19 @@ class ValidateCreateFolderChecked(pyblish.api.InstancePlugin):
    """

    order = pyblish.api.ValidatorOrder
-    actions = [RepairAction]
    label = "Validate Create Folder Checked"
    families = ["render"]
    hosts = ["fusion"]
-    actions = [SelectInvalidAction]
+    actions = [RepairAction, SelectInvalidAction]

    @classmethod
    def get_invalid(cls, instance):
-        active = instance.data.get("active", instance.data.get("publish"))
-        if not active:
-            return []
-
        tool = instance[0]
        create_dir = tool.GetInput("CreateDir")
        if create_dir == 0.0:
-            cls.log.error("%s has Create Folder turned off" % instance[0].Name)
+            cls.log.error(
+                "%s has Create Folder turned off" % instance[0].Name
+            )
            return [tool]

    def process(self, instance):
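The validator above follows the usual Pyblish shape: a classmethod `get_invalid()` finds the offending tools, `process()` raises when any exist, and a `repair()` action flips the setting back. A self-contained sketch of that shape (`FakeTool` is a hypothetical stand-in for a Fusion Saver; the real tool API is only assumed here):

```python
class FakeTool:
    """Hypothetical stand-in for a Fusion Saver tool."""

    def __init__(self, name, create_dir):
        self.Name = name
        self._inputs = {"CreateDir": create_dir}

    def GetInput(self, key):
        return self._inputs[key]

    def SetInput(self, key, value):
        self._inputs[key] = value


def get_invalid(instance):
    # The first element of the instance is the saver tool
    tool = instance[0]
    if tool.GetInput("CreateDir") == 0.0:
        return [tool]
    return []


def process(instance):
    if get_invalid(instance):
        raise RuntimeError("Found Saver with Create Folder turned off")


def repair(instance):
    # Turn "Create Folder During Render" back on for each invalid tool
    for tool in get_invalid(instance):
        tool.SetInput("CreateDir", 1.0)
```

Splitting detection into `get_invalid()` is what lets both the validation error and the select/repair actions reuse the same logic.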
@@ -37,7 +34,8 @@ class ValidateCreateFolderChecked(pyblish.api.InstancePlugin):
        if invalid:
            raise PublishValidationError(
                "Found Saver with Create Folder During Render checked off",
-                title=self.label)
+                title=self.label,
+            )

    @classmethod
    def repair(cls, instance):
@@ -0,0 +1,78 @@
import os
import pyblish.api

from openpype.pipeline.publish import RepairAction
from openpype.pipeline import PublishValidationError

from openpype.hosts.fusion.api.action import SelectInvalidAction


class ValidateLocalFramesExistence(pyblish.api.InstancePlugin):
    """Checks if files for savers that's set
    to publish expected frames exists
    """

    order = pyblish.api.ValidatorOrder
    label = "Validate Expected Frames Exists"
    families = ["render"]
    hosts = ["fusion"]
    actions = [RepairAction, SelectInvalidAction]

    @classmethod
    def get_invalid(cls, instance, non_existing_frames=None):
        if non_existing_frames is None:
            non_existing_frames = []

        if instance.data.get("render_target") == "frames":
            tool = instance[0]

            frame_start = instance.data["frameStart"]
            frame_end = instance.data["frameEnd"]
            path = instance.data["path"]
            output_dir = instance.data["outputDir"]

            basename = os.path.basename(path)
            head, ext = os.path.splitext(basename)
            files = [
                f"{head}{str(frame).zfill(4)}{ext}"
                for frame in range(frame_start, frame_end + 1)
            ]

            for file in files:
                if not os.path.exists(os.path.join(output_dir, file)):
                    cls.log.error(
                        f"Missing file: {os.path.join(output_dir, file)}"
                    )
                    non_existing_frames.append(file)

            if len(non_existing_frames) > 0:
                cls.log.error(f"Some of {tool.Name}'s files does not exist")
                return [tool]

    def process(self, instance):
        non_existing_frames = []
        invalid = self.get_invalid(instance, non_existing_frames)
        if invalid:
            raise PublishValidationError(
                "{} is set to publish existing frames but "
                "some frames are missing. "
                "The missing file(s) are:\n\n{}".format(
                    invalid[0].Name,
                    "\n\n".join(non_existing_frames),
                ),
                title=self.label,
            )

    @classmethod
    def repair(cls, instance):
        invalid = cls.get_invalid(instance)
        if invalid:
            tool = invalid[0]

            # Change render target to local to render locally
            tool.SetData("openpype.creator_attributes.render_target", "local")

            cls.log.info(
                f"Reload the publisher and {tool.Name} "
                "will be set to render locally"
            )
@@ -1,112 +0,0 @@
from qtpy import QtWidgets
import qtawesome
from openpype.hosts.fusion.api import get_current_comp


_help = {"local": "Render the comp on your own machine and publish "
                  "it from that the destination folder",
         "farm": "Submit a Fusion render job to a Render farm to use all other"
                 " computers and add a publish job"}


class SetRenderMode(QtWidgets.QWidget):

    def __init__(self, parent=None):
        QtWidgets.QWidget.__init__(self, parent)

        self._comp = get_current_comp()
        self._comp_name = self._get_comp_name()

        self.setWindowTitle("Set Render Mode")
        self.setFixedSize(300, 175)

        layout = QtWidgets.QVBoxLayout()

        # region comp info
        comp_info_layout = QtWidgets.QHBoxLayout()

        update_btn = QtWidgets.QPushButton(qtawesome.icon("fa.refresh",
                                                          color="white"), "")
        update_btn.setFixedWidth(25)
        update_btn.setFixedHeight(25)

        comp_information = QtWidgets.QLineEdit()
        comp_information.setEnabled(False)

        comp_info_layout.addWidget(comp_information)
        comp_info_layout.addWidget(update_btn)
        # endregion comp info

        # region modes
        mode_options = QtWidgets.QComboBox()
        mode_options.addItems(_help.keys())

        mode_information = QtWidgets.QTextEdit()
        mode_information.setReadOnly(True)
        # endregion modes

        accept_btn = QtWidgets.QPushButton("Accept")

        layout.addLayout(comp_info_layout)
        layout.addWidget(mode_options)
        layout.addWidget(mode_information)
        layout.addWidget(accept_btn)

        self.setLayout(layout)

        self.comp_information = comp_information
        self.update_btn = update_btn

        self.mode_options = mode_options
        self.mode_information = mode_information

        self.accept_btn = accept_btn

        self.connections()
        self.update()

        # Force updated render mode help text
        self._update_rendermode_info()

    def connections(self):
        """Build connections between code and buttons"""

        self.update_btn.clicked.connect(self.update)
        self.accept_btn.clicked.connect(self._set_comp_rendermode)
        self.mode_options.currentIndexChanged.connect(
            self._update_rendermode_info)

    def update(self):
        """Update all information in the UI"""

        self._comp = get_current_comp()
        self._comp_name = self._get_comp_name()
        self.comp_information.setText(self._comp_name)

        # Update current comp settings
        mode = self._get_comp_rendermode()
        index = self.mode_options.findText(mode)
        self.mode_options.setCurrentIndex(index)

    def _update_rendermode_info(self):
        rendermode = self.mode_options.currentText()
        self.mode_information.setText(_help[rendermode])

    def _get_comp_name(self):
        return self._comp.GetAttrs("COMPS_Name")

    def _get_comp_rendermode(self):
        return self._comp.GetData("openpype.rendermode") or "local"

    def _set_comp_rendermode(self):
        rendermode = self.mode_options.currentText()
        self._comp.SetData("openpype.rendermode", rendermode)

        self._comp.Print("Updated render mode to '%s'\n" % rendermode)
        self.hide()

    def _validation(self):
        ui_mode = self.mode_options.currentText()
        comp_mode = self._get_comp_rendermode()

        return comp_mode == ui_mode
@@ -0,0 +1,19 @@
import pyblish.api
from openpype.lib import version_up
from pymxs import runtime as rt


class IncrementWorkfileVersion(pyblish.api.ContextPlugin):
    """Increment current workfile version."""

    order = pyblish.api.IntegratorOrder + 0.9
    label = "Increment Workfile Version"
    hosts = ["max"]
    families = ["workfile"]

    def process(self, context):
        path = context.data["currentFile"]
        filepath = version_up(path)

        rt.saveMaxFile(filepath)
        self.log.info("Incrementing file version")
@@ -80,21 +80,7 @@ def get_all_asset_nodes():
    Returns:
        list: list of dictionaries
    """

    host = registered_host()

    nodes = []
    for container in host.ls():
        # We are not interested in looks but assets!
        if container["loader"] == "LookLoader":
            continue

        # Gather all information
        container_name = container["objectName"]
        nodes += lib.get_container_members(container_name)

    nodes = list(set(nodes))
    return nodes
    return cmds.ls(dag=True, noIntermediate=True, long=True)


def create_asset_id_hash(nodes):
@@ -66,11 +66,11 @@ class MainThreadItem:
        return self._result

    def execute(self):
        """Execute callback and store it's result.
        """Execute callback and store its result.

        Method must be called from main thread. Item is marked as `done`
        when callback execution finished. Store output of callback of exception
        information when callback raise one.
        information when callback raises one.
        """
        log.debug("Executing process in main thread")
        if self.done:

@@ -389,11 +389,11 @@ class MainThreadItem:
        self.kwargs = kwargs

    def execute(self):
        """Execute callback and store it's result.
        """Execute callback and store its result.

        Method must be called from main thread. Item is marked as `done`
        when callback execution finished. Store output of callback of exception
        information when callback raise one.
        information when callback raises one.
        """
        log.debug("Executing process in main thread")
        if self.done:
@@ -889,7 +889,8 @@ class ApplicationLaunchContext:
        self.modules_manager = ModulesManager()

        # Logger
        logger_name = "{}-{}".format(self.__class__.__name__, self.app_name)
        logger_name = "{}-{}".format(self.__class__.__name__,
                                     self.application.full_name)
        self.log = Logger.get_logger(logger_name)

        self.executable = executable

@@ -1246,7 +1247,7 @@ class ApplicationLaunchContext:
        args_len_str = " ({})".format(len(args))
        self.log.info(
            "Launching \"{}\" with args{}: {}".format(
                self.app_name, args_len_str, args
                self.application.full_name, args_len_str, args
            )
        )
        self.launch_args = args

@@ -1271,7 +1272,9 @@ class ApplicationLaunchContext:
                exc_info=True
            )

        self.log.debug("Launch of {} finished.".format(self.app_name))
        self.log.debug("Launch of {} finished.".format(
            self.application.full_name
        ))

        return self.process
@@ -1508,8 +1511,8 @@ def prepare_app_environments(
        if key in source_env:
            source_env[key] = value

    # `added_env_keys` has debug purpose
    added_env_keys = {app.group.name, app.name}
    # `app_and_tool_labels` has debug purpose
    app_and_tool_labels = [app.full_name]
    # Environments for application
    environments = [
        app.group.environment,

@@ -1532,15 +1535,14 @@ def prepare_app_environments(
    for group_name in sorted(groups_by_name.keys()):
        group = groups_by_name[group_name]
        environments.append(group.environment)
        added_env_keys.add(group_name)
        for tool_name in sorted(tool_by_group_name[group_name].keys()):
            tool = tool_by_group_name[group_name][tool_name]
            environments.append(tool.environment)
            added_env_keys.add(tool.name)
            app_and_tool_labels.append(tool.full_name)

    log.debug(
        "Will add environments for apps and tools: {}".format(
            ", ".join(added_env_keys)
            ", ".join(app_and_tool_labels)
        )
    )
@@ -106,7 +106,7 @@ class CelactionSubmitDeadline(pyblish.api.InstancePlugin):

        # define chunk and priority
        chunk_size = instance.context.data.get("chunk")
        if chunk_size == 0:
        if not chunk_size:
            chunk_size = self.deadline_chunk_size

        # search for %02d pattern in name, and padding number
@@ -44,7 +44,7 @@ class AddonSettingsDef(JsonFilesSettingsDef):


class ExampleAddon(OpenPypeAddOn, IPluginPaths, ITrayAction):
    """This Addon has defined it's settings and interface.
    """This Addon has defined its settings and interface.

    This example has system settings with an enabled option. And use
    few other interfaces:
@@ -9,7 +9,7 @@ from openpype_modules.ftrack.lib import (


class PushHierValuesToNonHier(ServerAction):
    """Action push hierarchical custom attribute values to non hierarchical.
    """Action push hierarchical custom attribute values to non-hierarchical.

    Hierarchical value is also pushed to their task entities.
@@ -119,17 +119,109 @@ class PushHierValuesToNonHier(ServerAction):
            self.join_query_keys(object_ids)
        )).all()

        output = {}
        attrs_by_obj_id = collections.defaultdict(list)
        hiearchical = []
        for attr in attrs:
            if attr["is_hierarchical"]:
                hiearchical.append(attr)
                continue
            obj_id = attr["object_type_id"]
            if obj_id not in output:
                output[obj_id] = []
            output[obj_id].append(attr)
        return output, hiearchical
            attrs_by_obj_id[obj_id].append(attr)
        return attrs_by_obj_id, hiearchical

    def query_attr_value(
        self,
        session,
        hier_attrs,
        attrs_by_obj_id,
        dst_object_type_ids,
        task_entity_ids,
        non_task_entity_ids,
        parent_id_by_entity_id
    ):
        all_non_task_ids_with_parents = set()
        for entity_id in non_task_entity_ids:
            all_non_task_ids_with_parents.add(entity_id)
            _entity_id = entity_id
            while True:
                parent_id = parent_id_by_entity_id.get(_entity_id)
                if (
                    parent_id is None
                    or parent_id in all_non_task_ids_with_parents
                ):
                    break
                all_non_task_ids_with_parents.add(parent_id)
                _entity_id = parent_id

        all_entity_ids = (
            set(all_non_task_ids_with_parents)
            | set(task_entity_ids)
        )
        attr_ids = {attr["id"] for attr in hier_attrs}
        for obj_id in dst_object_type_ids:
            attrs = attrs_by_obj_id.get(obj_id)
            if attrs is not None:
                for attr in attrs:
                    attr_ids.add(attr["id"])

        real_values_by_entity_id = {
            entity_id: {}
            for entity_id in all_entity_ids
        }

        attr_values = query_custom_attributes(
            session, attr_ids, all_entity_ids, True
        )
        for item in attr_values:
            entity_id = item["entity_id"]
            attr_id = item["configuration_id"]
            real_values_by_entity_id[entity_id][attr_id] = item["value"]

        # Fill hierarchical values
        hier_attrs_key_by_id = {
            hier_attr["id"]: hier_attr
            for hier_attr in hier_attrs
        }
        hier_values_per_entity_id = {}
        for entity_id in all_non_task_ids_with_parents:
            real_values = real_values_by_entity_id[entity_id]
            hier_values_per_entity_id[entity_id] = {}
            for attr_id, attr in hier_attrs_key_by_id.items():
                key = attr["key"]
                hier_values_per_entity_id[entity_id][key] = (
                    real_values.get(attr_id)
                )

        output = {}
        for entity_id in non_task_entity_ids:
            output[entity_id] = {}
            for attr in hier_attrs_key_by_id.values():
                key = attr["key"]
                value = hier_values_per_entity_id[entity_id][key]
                tried_ids = set()
                if value is None:
                    tried_ids.add(entity_id)
                    _entity_id = entity_id
                    while value is None:
                        parent_id = parent_id_by_entity_id.get(_entity_id)
                        if not parent_id:
                            break
                        value = hier_values_per_entity_id[parent_id][key]
                        if value is not None:
                            break
                        _entity_id = parent_id
                        tried_ids.add(parent_id)

                if value is None:
                    value = attr["default"]

                if value is not None:
                    for ent_id in tried_ids:
                        hier_values_per_entity_id[ent_id][key] = value

                output[entity_id][key] = value

        return real_values_by_entity_id, output

    def propagate_values(self, session, event, selected_entities):
        ftrack_settings = self.get_ftrack_settings(
@@ -156,29 +248,24 @@ class PushHierValuesToNonHier(ServerAction):
        }

        task_object_type = object_types_by_low_name["task"]
        destination_object_types = [task_object_type]
        dst_object_type_ids = {task_object_type["id"]}
        for ent_type in interest_entity_types:
            obj_type = object_types_by_low_name.get(ent_type)
            if obj_type and obj_type not in destination_object_types:
                destination_object_types.append(obj_type)

        destination_object_type_ids = set(
            obj_type["id"]
            for obj_type in destination_object_types
        )
            if obj_type:
                dst_object_type_ids.add(obj_type["id"])

        interest_attributes = action_settings["interest_attributes"]
        # Find custom attributes definitions
        attrs_by_obj_id, hier_attrs = self.attrs_configurations(
            session, destination_object_type_ids, interest_attributes
            session, dst_object_type_ids, interest_attributes
        )
        # Filter destination object types if they have any object specific
        # custom attribute
        for obj_id in tuple(destination_object_type_ids):
        for obj_id in tuple(dst_object_type_ids):
            if obj_id not in attrs_by_obj_id:
                destination_object_type_ids.remove(obj_id)
                dst_object_type_ids.remove(obj_id)

        if not destination_object_type_ids:
        if not dst_object_type_ids:
            # TODO report that there are not matching custom attributes
            return {
                "success": True,
@@ -192,14 +279,14 @@ class PushHierValuesToNonHier(ServerAction):
            session,
            selected_ids,
            project_entity,
            destination_object_type_ids
            dst_object_type_ids
        )

        self.log.debug("Preparing whole project hierarchy by ids.")

        entities_by_obj_id = {
            obj_id: []
            for obj_id in destination_object_type_ids
            for obj_id in dst_object_type_ids
        }

        self.log.debug("Filtering Task entities.")
@@ -223,10 +310,16 @@ class PushHierValuesToNonHier(ServerAction):
                "message": "Nothing to do in your selection."
            }

        self.log.debug("Getting Hierarchical custom attribute values parents.")
        hier_values_by_entity_id = self.get_hier_values(
        self.log.debug("Getting Custom attribute values.")
        (
            real_values_by_entity_id,
            hier_values_by_entity_id
        ) = self.query_attr_value(
            session,
            hier_attrs,
            attrs_by_obj_id,
            dst_object_type_ids,
            task_entity_ids,
            non_task_entity_ids,
            parent_id_by_entity_id
        )
@@ -237,7 +330,8 @@ class PushHierValuesToNonHier(ServerAction):
            hier_attrs,
            task_entity_ids,
            hier_values_by_entity_id,
            parent_id_by_entity_id
            parent_id_by_entity_id,
            real_values_by_entity_id
        )

        self.log.debug("Setting values to entities themselves.")
@@ -245,7 +339,8 @@ class PushHierValuesToNonHier(ServerAction):
            session,
            entities_by_obj_id,
            attrs_by_obj_id,
            hier_values_by_entity_id
            hier_values_by_entity_id,
            real_values_by_entity_id
        )

        return True
@@ -322,112 +417,64 @@ class PushHierValuesToNonHier(ServerAction):

        return parent_id_by_entity_id, filtered_entities

    def get_hier_values(
        self,
        session,
        hier_attrs,
        focus_entity_ids,
        parent_id_by_entity_id
    ):
        all_ids_with_parents = set()
        for entity_id in focus_entity_ids:
            all_ids_with_parents.add(entity_id)
            _entity_id = entity_id
            while True:
                parent_id = parent_id_by_entity_id.get(_entity_id)
                if (
                    not parent_id
                    or parent_id in all_ids_with_parents
                ):
                    break
                all_ids_with_parents.add(parent_id)
                _entity_id = parent_id

        hier_attr_ids = tuple(hier_attr["id"] for hier_attr in hier_attrs)
        hier_attrs_key_by_id = {
            hier_attr["id"]: hier_attr["key"]
            for hier_attr in hier_attrs
        }

        values_per_entity_id = {}
        for entity_id in all_ids_with_parents:
            values_per_entity_id[entity_id] = {}
            for key in hier_attrs_key_by_id.values():
                values_per_entity_id[entity_id][key] = None

        values = query_custom_attributes(
            session, hier_attr_ids, all_ids_with_parents, True
        )
        for item in values:
            entity_id = item["entity_id"]
            key = hier_attrs_key_by_id[item["configuration_id"]]

            values_per_entity_id[entity_id][key] = item["value"]

        output = {}
        for entity_id in focus_entity_ids:
            output[entity_id] = {}
            for key in hier_attrs_key_by_id.values():
                value = values_per_entity_id[entity_id][key]
                tried_ids = set()
                if value is None:
                    tried_ids.add(entity_id)
                    _entity_id = entity_id
                    while value is None:
                        parent_id = parent_id_by_entity_id.get(_entity_id)
                        if not parent_id:
                            break
                        value = values_per_entity_id[parent_id][key]
                        if value is not None:
                            break
                        _entity_id = parent_id
                        tried_ids.add(parent_id)

                if value is not None:
                    for ent_id in tried_ids:
                        values_per_entity_id[ent_id][key] = value

                output[entity_id][key] = value
        return output

    def set_task_attr_values(
        self,
        session,
        hier_attrs,
        task_entity_ids,
        hier_values_by_entity_id,
        parent_id_by_entity_id
        parent_id_by_entity_id,
        real_values_by_entity_id
    ):
        hier_attr_id_by_key = {
            attr["key"]: attr["id"]
            for attr in hier_attrs
        }
        filtered_task_ids = set()
        for task_id in task_entity_ids:
            parent_id = parent_id_by_entity_id.get(task_id) or {}
            parent_id = parent_id_by_entity_id.get(task_id)
            parent_values = hier_values_by_entity_id.get(parent_id)
            if not parent_values:
                continue
            if parent_values:
                filtered_task_ids.add(task_id)

        if not filtered_task_ids:
            return

        for task_id in filtered_task_ids:
            parent_id = parent_id_by_entity_id[task_id]
            parent_values = hier_values_by_entity_id[parent_id]
            hier_values_by_entity_id[task_id] = {}
            real_task_attr_values = real_values_by_entity_id[task_id]
            for key, value in parent_values.items():
                hier_values_by_entity_id[task_id][key] = value
                if value is None:
                    continue

                configuration_id = hier_attr_id_by_key[key]
                _entity_key = collections.OrderedDict([
                    ("configuration_id", configuration_id),
                    ("entity_id", task_id)
                ])

                session.recorded_operations.push(
                    ftrack_api.operation.UpdateEntityOperation(
                        "ContextCustomAttributeValue",
                op = None
                if configuration_id not in real_task_attr_values:
                    op = ftrack_api.operation.CreateEntityOperation(
                        "CustomAttributeValue",
                        _entity_key,
                        {"value": value}
                    )
                elif real_task_attr_values[configuration_id] != value:
                    op = ftrack_api.operation.UpdateEntityOperation(
                        "CustomAttributeValue",
                        _entity_key,
                        "value",
                        ftrack_api.symbol.NOT_SET,
                        real_task_attr_values[configuration_id],
                        value
                    )
                )
                if len(session.recorded_operations) > 100:
                    session.commit()

                if op is not None:
                    session.recorded_operations.push(op)
                    if len(session.recorded_operations) > 100:
                        session.commit()

        session.commit()
@@ -436,39 +483,68 @@ class PushHierValuesToNonHier(ServerAction):
        session,
        entities_by_obj_id,
        attrs_by_obj_id,
        hier_values_by_entity_id
        hier_values_by_entity_id,
        real_values_by_entity_id
    ):
        """Push values from hierarchical custom attributes to non-hierarchical.

        Args:
            session (ftrack_api.Sessison): Session which queried entities,
                values and which is used for change propagation.
            entities_by_obj_id (dict[str, list[str]]): TypedContext
                ftrack entity ids where the attributes are propagated by their
                object ids.
            attrs_by_obj_id (dict[str, ftrack_api.Entity]): Objects of
                'CustomAttributeConfiguration' by their ids.
            hier_values_by_entity_id (doc[str, dict[str, Any]]): Attribute
                values by entity id and by their keys.
            real_values_by_entity_id (doc[str, dict[str, Any]]): Real attribute
                values of entities.
        """

        for object_id, entity_ids in entities_by_obj_id.items():
            attrs = attrs_by_obj_id.get(object_id)
            if not attrs or not entity_ids:
                continue

            for attr in attrs:
                for entity_id in entity_ids:
                    value = (
                        hier_values_by_entity_id
                        .get(entity_id, {})
                        .get(attr["key"])
                    )
            for entity_id in entity_ids:
                real_values = real_values_by_entity_id.get(entity_id)
                hier_values = hier_values_by_entity_id.get(entity_id)
                if hier_values is None:
                    continue

                for attr in attrs:
                    attr_id = attr["id"]
                    attr_key = attr["key"]
                    value = hier_values.get(attr_key)
                    if value is None:
                        continue

                    _entity_key = collections.OrderedDict([
                        ("configuration_id", attr["id"]),
                        ("configuration_id", attr_id),
                        ("entity_id", entity_id)
                    ])

                    session.recorded_operations.push(
                        ftrack_api.operation.UpdateEntityOperation(
                            "ContextCustomAttributeValue",
                    op = None
                    if attr_id not in real_values:
                        op = ftrack_api.operation.CreateEntityOperation(
                            "CustomAttributeValue",
                            _entity_key,
                            {"value": value}
                        )
                    elif real_values[attr_id] != value:
                        op = ftrack_api.operation.UpdateEntityOperation(
                            "CustomAttributeValue",
                            _entity_key,
                            "value",
                            ftrack_api.symbol.NOT_SET,
                            real_values[attr_id],
                            value
                        )
                    )
                    if len(session.recorded_operations) > 100:
                        session.commit()

                    if op is not None:
                        session.recorded_operations.push(op)
                        if len(session.recorded_operations) > 100:
                            session.commit()

        session.commit()
File diff suppressed because it is too large
@@ -29,7 +29,7 @@ class CollectKitsuEntities(pyblish.api.ContextPlugin):
            if not zou_asset_data:
                raise ValueError("Zou asset data not found in OpenPype!")

            task_name = instance.data.get("task")
            task_name = instance.data.get("task", context.data.get("task"))
            if not task_name:
                continue
@@ -48,7 +48,10 @@ class IntegrateKitsuNote(pyblish.api.ContextPlugin):
    def process(self, context):
        for instance in context:
            # Check if instance is a review by checking its family
            if "review" not in instance.data["families"]:
            # Allow a match to primary family or any of families
            families = set([instance.data["family"]] +
                           instance.data.get("families", []))
            if "review" not in families:
                continue

            kitsu_task = instance.data.get("kitsu_task")
@@ -12,17 +12,17 @@ class IntegrateKitsuReview(pyblish.api.InstancePlugin):
    optional = True

    def process(self, instance):
        task = instance.data["kitsu_task"]["id"]
        comment = instance.data["kitsu_comment"]["id"]

        # Check comment has been created
        if not comment:
        comment_id = instance.data.get("kitsu_comment", {}).get("id")
        if not comment_id:
            self.log.debug(
                "Comment not created, review not pushed to preview."
            )
            return

        # Add review representations as preview of comment
        task_id = instance.data["kitsu_task"]["id"]
        for representation in instance.data.get("representations", []):
            # Skip if not tagged as review
            if "kitsureview" not in representation.get("tags", []):

@@ -31,6 +31,6 @@ class IntegrateKitsuReview(pyblish.api.InstancePlugin):
        self.log.debug("Found review at: {}".format(review_path))

        gazu.task.add_preview(
            task, comment, review_path, normalize_movie=True
            task_id, comment_id, review_path, normalize_movie=True
        )
        self.log.info("Review upload on comment")
@@ -22,7 +22,7 @@ from openpype.lib.attribute_definitions import (
    deserialize_attr_defs,
    get_default_values,
)
from openpype.host import IPublishHost
from openpype.host import IPublishHost, IWorkfileHost
from openpype.pipeline import legacy_io
from openpype.pipeline.plugin_discover import DiscoverResult

@@ -1374,6 +1374,7 @@ class CreateContext:
        self._current_project_name = None
        self._current_asset_name = None
        self._current_task_name = None
        self._current_workfile_path = None

        self._host_is_valid = host_is_valid
        # Currently unused variable
@@ -1503,14 +1504,62 @@ class CreateContext:
        return os.environ["AVALON_APP"]

    def get_current_project_name(self):
        """Project name which was used as current context on context reset.

        Returns:
            Union[str, None]: Project name.
        """

        return self._current_project_name

    def get_current_asset_name(self):
        """Asset name which was used as current context on context reset.

        Returns:
            Union[str, None]: Asset name.
        """

        return self._current_asset_name

    def get_current_task_name(self):
        """Task name which was used as current context on context reset.

        Returns:
            Union[str, None]: Task name.
        """

        return self._current_task_name

    def get_current_workfile_path(self):
        """Workfile path which was opened on context reset.

        Returns:
            Union[str, None]: Workfile path.
        """

        return self._current_workfile_path

    @property
    def context_has_changed(self):
        """Host context has changed.

        As context is used project, asset, task name and workfile path if
        host does support workfiles.

        Returns:
            bool: Context changed.
        """

        project_name, asset_name, task_name, workfile_path = (
            self._get_current_host_context()
        )
        return (
            self._current_project_name != project_name
            or self._current_asset_name != asset_name
            or self._current_task_name != task_name
            or self._current_workfile_path != workfile_path
        )

    project_name = property(get_current_project_name)

    @property
@@ -1575,6 +1624,28 @@ class CreateContext:
        self._collection_shared_data = None
        self.refresh_thumbnails()

    def _get_current_host_context(self):
        project_name = asset_name = task_name = workfile_path = None
        if hasattr(self.host, "get_current_context"):
            host_context = self.host.get_current_context()
            if host_context:
                project_name = host_context.get("project_name")
                asset_name = host_context.get("asset_name")
                task_name = host_context.get("task_name")

        if isinstance(self.host, IWorkfileHost):
            workfile_path = self.host.get_current_workfile()

        # --- TODO remove these conditions ---
        if not project_name:
            project_name = legacy_io.Session.get("AVALON_PROJECT")
        if not asset_name:
            asset_name = legacy_io.Session.get("AVALON_ASSET")
        if not task_name:
            task_name = legacy_io.Session.get("AVALON_TASK")
        # ---
        return project_name, asset_name, task_name, workfile_path

    def reset_current_context(self):
        """Refresh current context.
@@ -1593,24 +1664,14 @@ class CreateContext:
        are stored. We should store the workfile (if is available) too.
        """

        project_name = asset_name = task_name = None
        if hasattr(self.host, "get_current_context"):
            host_context = self.host.get_current_context()
            if host_context:
                project_name = host_context.get("project_name")
                asset_name = host_context.get("asset_name")
                task_name = host_context.get("task_name")

        if not project_name:
            project_name = legacy_io.Session.get("AVALON_PROJECT")
        if not asset_name:
            asset_name = legacy_io.Session.get("AVALON_ASSET")
        if not task_name:
            task_name = legacy_io.Session.get("AVALON_TASK")
        project_name, asset_name, task_name, workfile_path = (
            self._get_current_host_context()
        )

        self._current_project_name = project_name
        self._current_asset_name = asset_name
        self._current_task_name = task_name
        self._current_workfile_path = workfile_path

    def reset_plugins(self, discover_publish_plugins=True):
        """Reload plugins.
@@ -1,2 +1,3 @@
 DEFAULT_PUBLISH_TEMPLATE = "publish"
 DEFAULT_HERO_PUBLISH_TEMPLATE = "hero"
+TRANSIENT_DIR_TEMPLATE = "transient"
@@ -20,13 +20,15 @@ from openpype.settings import (
     get_system_settings,
 )
 from openpype.pipeline import (
-    tempdir
+    tempdir,
+    Anatomy
 )
 from openpype.pipeline.plugin_discover import DiscoverResult

 from .contants import (
     DEFAULT_PUBLISH_TEMPLATE,
     DEFAULT_HERO_PUBLISH_TEMPLATE,
+    TRANSIENT_DIR_TEMPLATE
 )

@@ -690,3 +692,79 @@ def get_publish_repre_path(instance, repre, only_published=False):
     if os.path.exists(src_path):
         return src_path
     return None
+
+
+def get_custom_staging_dir_info(project_name, host_name, family, task_name,
+                                task_type, subset_name,
+                                project_settings=None,
+                                anatomy=None, log=None):
+    """Checks profiles if context should use special custom dir as staging.
+
+    Args:
+        project_name (str)
+        host_name (str)
+        family (str)
+        task_name (str)
+        task_type (str)
+        subset_name (str)
+        project_settings(Dict[str, Any]): Prepared project settings.
+        anatomy (Dict[str, Any])
+        log (Logger) (optional)
+
+    Returns:
+        (tuple)
+    Raises:
+        ValueError - if misconfigured template should be used
+    """
+    settings = project_settings or get_project_settings(project_name)
+    custom_staging_dir_profiles = (settings["global"]
+                                           ["tools"]
+                                           ["publish"]
+                                           ["custom_staging_dir_profiles"])
+    if not custom_staging_dir_profiles:
+        return None, None
+
+    if not log:
+        log = Logger.get_logger("get_custom_staging_dir_info")
+
+    filtering_criteria = {
+        "hosts": host_name,
+        "families": family,
+        "task_names": task_name,
+        "task_types": task_type,
+        "subsets": subset_name
+    }
+    profile = filter_profiles(custom_staging_dir_profiles,
+                              filtering_criteria,
+                              logger=log)
+
+    if not profile or not profile["active"]:
+        return None, None
+
+    if not anatomy:
+        anatomy = Anatomy(project_name)
+
+    template_name = profile["template_name"] or TRANSIENT_DIR_TEMPLATE
+    _validate_transient_template(project_name, template_name, anatomy)
+
+    custom_staging_dir = anatomy.templates[template_name]["folder"]
+    is_persistent = profile["custom_staging_dir_persistent"]
+
+    return custom_staging_dir, is_persistent
+
+
+def _validate_transient_template(project_name, template_name, anatomy):
+    """Check that transient template is correctly configured.
+
+    Raises:
+        ValueError - if misconfigured template
+    """
+    if template_name not in anatomy.templates:
+        raise ValueError(("Anatomy of project \"{}\" does not have set"
+                          " \"{}\" template key!"
+                          ).format(project_name, template_name))
+
+    if "folder" not in anatomy.templates[template_name]:
+        raise ValueError(("There is not set \"folder\" template in \"{}\" anatomy"  # noqa
+                          " for project \"{}\"."
+                          ).format(template_name, project_name))
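The `get_custom_staging_dir_info` helper above resolves its profile through `filter_profiles`. As a rough illustration of that style of matching (a hypothetical standalone sketch, not OpenPype's actual `filter_profiles`), a profile is accepted when every filter list it defines is empty or contains the criterion's value:

```python
def match_profile(profiles, criteria):
    """Return the first profile whose filters all accept the criteria.

    An empty (or missing) filter list means "matches anything".
    Hypothetical simplification of a filter_profiles-style helper.
    """
    for profile in profiles:
        if all(
            not profile.get(key) or value in profile[key]
            for key, value in criteria.items()
        ):
            return profile
    return None


profiles = [
    {"hosts": ["maya"], "families": ["render"], "template_name": "transient"},
    {"hosts": [], "families": [], "template_name": "fallback"},
]
# maya/render criteria hit the first profile; anything else falls
# through to the catch-all second profile.
print(match_profile(profiles, {"hosts": "maya", "families": "render"}))
```

The real helper also weighs specificity when several profiles match; this sketch simply takes the first acceptable one.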
@@ -93,6 +93,10 @@ class CleanUp(pyblish.api.InstancePlugin):
             self.log.info("No staging directory found: %s" % staging_dir)
             return

+        if instance.data.get("stagingDir_persistent"):
+            self.log.info("Staging dir: %s should be persistent" % staging_dir)
+            return
+
         self.log.info("Removing staging directory {}".format(staging_dir))
         shutil.rmtree(staging_dir)
@@ -37,7 +37,7 @@ class CleanUpFarm(pyblish.api.ContextPlugin):
         dirpaths_to_remove = set()
         for instance in context:
             staging_dir = instance.data.get("stagingDir")
-            if staging_dir:
+            if staging_dir and not instance.data.get("stagingDir_persistent"):
                 dirpaths_to_remove.add(os.path.normpath(staging_dir))

             if "representations" in instance.data:
67  openpype/plugins/publish/collect_custom_staging_dir.py  Normal file

@@ -0,0 +1,67 @@
+"""
+Requires:
+    anatomy
+
+
+Provides:
+    instance.data -> stagingDir (folder path)
+                  -> stagingDir_persistent (bool)
+"""
+import copy
+import os.path
+
+import pyblish.api
+
+from openpype.pipeline.publish.lib import get_custom_staging_dir_info
+
+
+class CollectCustomStagingDir(pyblish.api.InstancePlugin):
+    """Looks through profiles if stagingDir should be persistent and in special
+    location.
+
+    Transient staging dir could be useful in specific use cases where is
+    desirable to have temporary renders in specific, persistent folders, could
+    be on disks optimized for speed for example.
+
+    It is studio responsibility to clean up obsolete folders with data.
+
+    Location of the folder is configured in `project_anatomy/templates/others`.
+    ('transient' key is expected, with 'folder' key)
+
+    Which family/task type/subset is applicable is configured in:
+    `project_settings/global/tools/publish/custom_staging_dir_profiles`
+
+    """
+    label = "Collect Custom Staging Directory"
+    order = pyblish.api.CollectorOrder + 0.4990
+
+    template_key = "transient"
+
+    def process(self, instance):
+        family = instance.data["family"]
+        subset_name = instance.data["subset"]
+        host_name = instance.context.data["hostName"]
+        project_name = instance.context.data["projectName"]
+
+        anatomy = instance.context.data["anatomy"]
+        anatomy_data = copy.deepcopy(instance.data["anatomyData"])
+        task = anatomy_data.get("task", {})
+
+        transient_tml, is_persistent = get_custom_staging_dir_info(
+            project_name, host_name, family, task.get("name"),
+            task.get("type"), subset_name, anatomy=anatomy, log=self.log)
+        result_str = "Not adding"
+        if transient_tml:
+            anatomy_data["root"] = anatomy.roots
+            scene_name = instance.context.data.get("currentFile")
+            if scene_name:
+                anatomy_data["scene_name"] = os.path.basename(scene_name)
+            transient_dir = transient_tml.format(**anatomy_data)
+            instance.data["stagingDir"] = transient_dir
+
+            instance.data["stagingDir_persistent"] = is_persistent
+            result_str = "Adding '{}' as".format(transient_dir)

+        self.log.info("{} custom staging dir for instance with '{}'".format(
+            result_str, family
+        ))
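The collector fills the resolved template with `anatomy_data` via `str.format`, which supports dict-key lookups such as `{root[work]}`. A minimal sketch with made-up values (real data comes from `instance.data["anatomyData"]`):

```python
template = "{root[work]}/{project[name]}/{asset}/work/{family}/{subset}"

# Hypothetical anatomy-like data for illustration only
anatomy_data = {
    "root": {"work": "/mnt/work"},
    "project": {"name": "demo"},
    "asset": "sh010",
    "family": "render",
    "subset": "renderMain",
}

# {root[work]} indexes into the nested dict during formatting
staging_dir = template.format(**anatomy_data)
print(staging_dir)  # /mnt/work/demo/sh010/work/render/renderMain
```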
@@ -725,7 +725,6 @@ class ExtractBurnin(publish.Extractor):
             return filtered_burnin_defs

         families = self.families_from_instance(instance)
-        low_families = [family.lower() for family in families]

         for filename_suffix, orig_burnin_def in burnin_defs.items():
             burnin_def = copy.deepcopy(orig_burnin_def)

@@ -736,7 +735,7 @@ class ExtractBurnin(publish.Extractor):

             families_filters = def_filter["families"]
             if not self.families_filter_validation(
-                low_families, families_filters
+                families, families_filters
             ):
                 self.log.debug((
                     "Skipped burnin definition \"{}\". Family"

@@ -773,31 +772,19 @@ class ExtractBurnin(publish.Extractor):
         return filtered_burnin_defs

     def families_filter_validation(self, families, output_families_filter):
-        """Determine if entered families intersect with families filters.
+        """Determines if entered families intersect with families filters.

         All family values are lowered to avoid unexpected results.
         """
-        if not output_families_filter:
+        families_filter_lower = set(family.lower() for family in
+                                    output_families_filter
+                                    # Exclude empty filter values
+                                    if family)
+        if not families_filter_lower:
             return True
-
-        for family_filter in output_families_filter:
-            if not family_filter:
-                continue
-
-            if not isinstance(family_filter, (list, tuple)):
-                if family_filter.lower() not in families:
-                    continue
-                return True
-
-            valid = True
-            for family in family_filter:
-                if family.lower() not in families:
-                    valid = False
-                    break
-
-            if valid:
-                return True
-        return False
+        return any(family.lower() in families_filter_lower
+                   for family in families)

     def families_from_instance(self, instance):
         """Return all families of entered instance."""
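The refactor above collapses the explicit single/combination loops into a lowered set plus an `any()` intersection (note it intentionally drops the old nested-list "combination" handling). A standalone sketch of the new logic:

```python
def families_filter_validation(families, output_families_filter):
    """True when any entered family appears in the (lowered) filter set."""
    # Lower all filter values, dropping empty entries
    families_filter_lower = set(
        f.lower() for f in output_families_filter if f
    )
    # An empty filter matches everything
    if not families_filter_lower:
        return True
    return any(f.lower() in families_filter_lower for f in families)


assert families_filter_validation(["Render", "review"], ["render"])
assert not families_filter_validation(["model"], ["render", "review"])
assert families_filter_validation(["model"], [])  # empty filter matches all
```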
@@ -12,7 +12,7 @@ import pyblish.api

 from openpype.lib import (
     get_ffmpeg_tool_path,
-
+    filter_profiles,
     path_to_subprocess_arg,
     run_subprocess,
 )

@@ -23,6 +23,7 @@ from openpype.lib.transcoding import (
     convert_input_paths_for_ffmpeg,
     get_transcode_temp_directory,
 )
+from openpype.pipeline.publish import KnownPublishError


 class ExtractReview(pyblish.api.InstancePlugin):
@@ -88,21 +89,23 @@ class ExtractReview(pyblish.api.InstancePlugin):

     def _get_outputs_for_instance(self, instance):
         host_name = instance.context.data["hostName"]
-        task_name = os.environ["AVALON_TASK"]
         family = self.main_family_from_instance(instance)

         self.log.info("Host: \"{}\"".format(host_name))
-        self.log.info("Task: \"{}\"".format(task_name))
         self.log.info("Family: \"{}\"".format(family))

-        profile = self.find_matching_profile(
-            host_name, task_name, family
-        )
+        profile = filter_profiles(
+            self.profiles,
+            {
+                "hosts": host_name,
+                "families": family,
+            },
+            logger=self.log)
         if not profile:
             self.log.info((
                 "Skipped instance. None of profiles in presets are for"
-                " Host: \"{}\" | Family: \"{}\" | Task \"{}\""
-            ).format(host_name, family, task_name))
+                " Host: \"{}\" | Family: \"{}\""
+            ).format(host_name, family))
             return

         self.log.debug("Matching profile: \"{}\"".format(json.dumps(profile)))
@@ -112,17 +115,19 @@ class ExtractReview(pyblish.api.InstancePlugin):
         filtered_outputs = self.filter_output_defs(
             profile, subset_name, instance_families
         )
+        if not filtered_outputs:
+            self.log.info((
+                "Skipped instance. All output definitions from selected"
+                " profile do not match instance families \"{}\" or"
+                " subset name \"{}\"."
+            ).format(str(instance_families), subset_name))
+
         # Store `filename_suffix` to save arguments
         profile_outputs = []
         for filename_suffix, definition in filtered_outputs.items():
             definition["filename_suffix"] = filename_suffix
             profile_outputs.append(definition)

-        if not filtered_outputs:
-            self.log.info((
-                "Skipped instance. All output definitions from selected"
-                " profile does not match to instance families. \"{}\""
-            ).format(str(instance_families)))
         return profile_outputs

     def _get_outputs_per_representations(self, instance, profile_outputs):
@@ -216,6 +221,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
         outputs_per_repres = self._get_outputs_per_representations(
             instance, profile_outputs
         )
+
         for repre, output_defs in outputs_per_repres:
             # Check if input should be preconverted before processing
             # Store original staging dir (it's value may change)
@@ -297,10 +303,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
                 shutil.rmtree(new_staging_dir)

     def _render_output_definitions(
-        self, instance, repre, src_repre_staging_dir, output_defs
+        self, instance, repre, src_repre_staging_dir, output_definitions
     ):
         fill_data = copy.deepcopy(instance.data["anatomyData"])
-        for _output_def in output_defs:
+        for _output_def in output_definitions:
             output_def = copy.deepcopy(_output_def)
             # Make sure output definition has "tags" key
             if "tags" not in output_def:
@@ -346,10 +352,11 @@ class ExtractReview(pyblish.api.InstancePlugin):
             if temp_data["input_is_sequence"]:
                 self.log.info("Filling gaps in sequence.")
                 files_to_clean = self.fill_sequence_gaps(
-                    temp_data["origin_repre"]["files"],
-                    new_repre["stagingDir"],
-                    temp_data["frame_start"],
-                    temp_data["frame_end"])
+                    files=temp_data["origin_repre"]["files"],
+                    staging_dir=new_repre["stagingDir"],
+                    start_frame=temp_data["frame_start"],
+                    end_frame=temp_data["frame_end"]
+                )

             # create or update outputName
             output_name = new_repre.get("outputName", "")
@@ -421,10 +428,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
     def input_is_sequence(self, repre):
         """Deduce from representation data if input is sequence."""
         # TODO GLOBAL ISSUE - Find better way how to find out if input
-        #   is sequence. Issues( in theory):
-        #   - there may be multiple files ant not be sequence
-        #   - remainders are not checked at all
-        #   - there can be more than one collection
+        #   is sequence. Issues (in theory):
+        #   - there may be multiple files ant not be sequence
+        #   - remainders are not checked at all
+        #   - there can be more than one collection
         return isinstance(repre["files"], (list, tuple))

     def prepare_temp_data(self, instance, repre, output_def):
@@ -816,76 +823,41 @@ class ExtractReview(pyblish.api.InstancePlugin):
         is done.

         Raises:
-            AssertionError: if more then one collection is obtained.
-
+            KnownPublishError: if more than one collection is obtained.
         """
         start_frame = int(start_frame)
         end_frame = int(end_frame)

         collections = clique.assemble(files)[0]
-        msg = "Multiple collections {} found.".format(collections)
-        assert len(collections) == 1, msg
+        if len(collections) != 1:
+            raise KnownPublishError(
+                "Multiple collections {} found.".format(collections))

         col = collections[0]

-        # do nothing if no gap is found in input range
-        not_gap = True
-        for fr in range(start_frame, end_frame + 1):
-            if fr not in col.indexes:
-                not_gap = False
-
-        if not_gap:
-            return []
-
-        holes = col.holes()
-
-        # generate ideal sequence
-        complete_col = clique.assemble(
-            [("{}{:0" + str(col.padding) + "d}{}").format(
-                col.head, f, col.tail
-            ) for f in range(start_frame, end_frame)]
-        )[0][0]  # type: clique.Collection
-
-        new_files = {}
-        last_existing_file = None
-
-        for idx in holes.indexes:
-            # get previous existing file
-            test_file = os.path.normpath(os.path.join(
-                staging_dir,
-                ("{}{:0" + str(complete_col.padding) + "d}{}").format(
-                    complete_col.head, idx - 1, complete_col.tail)))
-            if os.path.isfile(test_file):
-                new_files[idx] = test_file
-                last_existing_file = test_file
-            else:
-                if not last_existing_file:
-                    # previous file is not found (sequence has a hole
-                    # at the beginning. Use first available frame
-                    # there is.
-                    try:
-                        last_existing_file = list(col)[0]
-                    except IndexError:
-                        # empty collection?
-                        raise AssertionError(
-                            "Invalid sequence collected")
-                new_files[idx] = os.path.normpath(
-                    os.path.join(staging_dir, last_existing_file))
+        # Prepare which hole is filled with what frame
+        #   - the frame is filled only with already existing frames
+        prev_frame = next(iter(col.indexes))
+        hole_frame_to_nearest = {}
+        for frame in range(int(start_frame), int(end_frame) + 1):
+            if frame in col.indexes:
+                prev_frame = frame
+            else:
+                # Use previous frame as source for hole
+                hole_frame_to_nearest[frame] = prev_frame

-        files_to_clean = []
-        if new_files:
-            # so now new files are dict with missing frame as a key and
-            # existing file as a value.
-            for frame, file in new_files.items():
-                self.log.info(
-                    "Filling gap {} with {}".format(frame, file))
-
-                hole = os.path.join(
-                    staging_dir,
-                    ("{}{:0" + str(col.padding) + "d}{}").format(
-                        col.head, frame, col.tail))
-                speedcopy.copyfile(file, hole)
-                files_to_clean.append(hole)
-
-        return files_to_clean
+        # Calculate paths
+        added_files = []
+        col_format = col.format("{head}{padding}{tail}")
+        for hole_frame, src_frame in hole_frame_to_nearest.items():
+            hole_fpath = os.path.join(staging_dir, col_format % hole_frame)
+            src_fpath = os.path.join(staging_dir, col_format % src_frame)
+            if not os.path.isfile(src_fpath):
+                raise KnownPublishError(
+                    "Missing previously detected file: {}".format(src_fpath))
+
+            speedcopy.copyfile(src_fpath, hole_fpath)
+            added_files.append(hole_fpath)
+
+        return added_files

     def input_output_paths(self, new_repre, output_def, temp_data):
         """Deduce input nad output file paths based on entered data.
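The rewritten `fill_sequence_gaps` first maps each missing frame to the nearest preceding existing frame, then copies files. The mapping step can be sketched on bare frame numbers (no clique or file I/O; the fallback to the first available frame is simplified here with `min`):

```python
def map_holes_to_nearest(existing_frames, start_frame, end_frame):
    """Map each missing frame to the nearest preceding existing frame.

    Sketch of the hole_frame_to_nearest logic: holes before the first
    existing frame fall back to the first available frame.
    """
    existing = set(existing_frames)
    prev_frame = min(existing)  # simplified fallback for leading holes
    holes = {}
    for frame in range(start_frame, end_frame + 1):
        if frame in existing:
            prev_frame = frame
        else:
            holes[frame] = prev_frame
    return holes


print(map_holes_to_nearest([1001, 1002, 1005], 1001, 1006))
# {1003: 1002, 1004: 1002, 1006: 1005}
```

Each hole is then filled by copying the mapped source frame's file, so duplicated frames always come from already-existing renders.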
@@ -1281,7 +1253,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
         # 'use_input_res' is set to 'True'.
         use_input_res = False

-        # Overscal color
+        # Overscan color
         overscan_color_value = "black"
         overscan_color = output_def.get("overscan_color")
         if overscan_color:
@@ -1468,240 +1440,20 @@ class ExtractReview(pyblish.api.InstancePlugin):
             families.append(family)
         return families

-    def compile_list_of_regexes(self, in_list):
-        """Convert strings in entered list to compiled regex objects."""
-        regexes = []
-        if not in_list:
-            return regexes
-
-        for item in in_list:
-            if not item:
-                continue
-
-            try:
-                regexes.append(re.compile(item))
-            except TypeError:
-                self.log.warning((
-                    "Invalid type \"{}\" value \"{}\"."
-                    " Expected string based object. Skipping."
-                ).format(str(type(item)), str(item)))
-
-        return regexes
-
-    def validate_value_by_regexes(self, value, in_list):
-        """Validates in any regex from list match entered value.
-
-        Args:
-            in_list (list): List with regexes.
-            value (str): String where regexes is checked.
-
-        Returns:
-            int: Returns `0` when list is not set or is empty. Returns `1` when
-                any regex match value and returns `-1` when none of regexes
-                match value entered.
-        """
-        if not in_list:
-            return 0
-
-        output = -1
-        regexes = self.compile_list_of_regexes(in_list)
-        for regex in regexes:
-            if not value:
-                continue
-            if re.match(regex, value):
-                output = 1
-                break
-        return output
-
-    def profile_exclusion(self, matching_profiles):
-        """Find out most matching profile byt host, task and family match.
-
-        Profiles are selectively filtered. Each profile should have
-        "__value__" key with list of booleans. Each boolean represents
-        existence of filter for specific key (host, tasks, family).
-        Profiles are looped in sequence. In each sequence are split into
-        true_list and false_list. For next sequence loop are used profiles in
-        true_list if there are any profiles else false_list is used.
-
-        Filtering ends when only one profile left in true_list. Or when all
-        existence booleans loops passed, in that case first profile from left
-        profiles is returned.
-
-        Args:
-            matching_profiles (list): Profiles with same values.
-
-        Returns:
-            dict: Most matching profile.
-        """
-        self.log.info(
-            "Search for first most matching profile in match order:"
-            " Host name -> Task name -> Family."
-        )
-        # Filter all profiles with highest points value. First filter profiles
-        # with matching host if there are any then filter profiles by task
-        # name if there are any and lastly filter by family. Else use first in
-        # list.
-        idx = 0
-        final_profile = None
-        while True:
-            profiles_true = []
-            profiles_false = []
-            for profile in matching_profiles:
-                value = profile["__value__"]
-                # Just use first profile when idx is greater than values.
-                if not idx < len(value):
-                    final_profile = profile
-                    break
-
-                if value[idx]:
-                    profiles_true.append(profile)
-                else:
-                    profiles_false.append(profile)
-
-            if final_profile is not None:
-                break
-
-            if profiles_true:
-                matching_profiles = profiles_true
-            else:
-                matching_profiles = profiles_false
-
-            if len(matching_profiles) == 1:
-                final_profile = matching_profiles[0]
-                break
-            idx += 1
-
-        final_profile.pop("__value__")
-        return final_profile
-
-    def find_matching_profile(self, host_name, task_name, family):
-        """ Filter profiles by Host name, Task name and main Family.
-
-        Filtering keys are "hosts" (list), "tasks" (list), "families" (list).
-        If key is not find or is empty than it's expected to match.
-
-        Args:
-            profiles (list): Profiles definition from presets.
-            host_name (str): Current running host name.
-            task_name (str): Current context task name.
-            family (str): Main family of current Instance.
-
-        Returns:
-            dict/None: Return most matching profile or None if none of profiles
-                match at least one criteria.
-        """
-
-        matching_profiles = None
-        if not self.profiles:
-            return matching_profiles
-
-        highest_profile_points = -1
-        # Each profile get 1 point for each matching filter. Profile with most
-        # points is returned. For cases when more than one profile will match
-        # are also stored ordered lists of matching values.
-        for profile in self.profiles:
-            profile_points = 0
-            profile_value = []
-
-            # Host filtering
-            host_names = profile.get("hosts")
-            match = self.validate_value_by_regexes(host_name, host_names)
-            if match == -1:
-                self.log.debug(
-                    "\"{}\" not found in {}".format(host_name, host_names)
-                )
-                continue
-            profile_points += match
-            profile_value.append(bool(match))
-
-            # Task filtering
-            task_names = profile.get("tasks")
-            match = self.validate_value_by_regexes(task_name, task_names)
-            if match == -1:
-                self.log.debug(
-                    "\"{}\" not found in {}".format(task_name, task_names)
-                )
-                continue
-            profile_points += match
-            profile_value.append(bool(match))
-
-            # Family filtering
-            families = profile.get("families")
-            match = self.validate_value_by_regexes(family, families)
-            if match == -1:
-                self.log.debug(
-                    "\"{}\" not found in {}".format(family, families)
-                )
-                continue
-            profile_points += match
-            profile_value.append(bool(match))
-
-            if profile_points < highest_profile_points:
-                continue
-
-            if profile_points > highest_profile_points:
-                matching_profiles = []
-                highest_profile_points = profile_points
-
-            if profile_points == highest_profile_points:
-                profile["__value__"] = profile_value
-                matching_profiles.append(profile)
-
-        if not matching_profiles:
-            self.log.warning((
-                "None of profiles match your setup."
-                " Host \"{}\" | Task: \"{}\" | Family: \"{}\""
-            ).format(host_name, task_name, family))
-            return
-
-        if len(matching_profiles) == 1:
-            # Pop temporary key `__value__`
-            matching_profiles[0].pop("__value__")
-            return matching_profiles[0]
-
-        self.log.warning((
-            "More than one profile match your setup."
-            " Host \"{}\" | Task: \"{}\" | Family: \"{}\""
-        ).format(host_name, task_name, family))
-
-        return self.profile_exclusion(matching_profiles)
-
     def families_filter_validation(self, families, output_families_filter):
         """Determines if entered families intersect with families filters.

         All family values are lowered to avoid unexpected results.
         """
-        if not output_families_filter:
+        families_filter_lower = set(family.lower() for family in
+                                    output_families_filter
+                                    # Exclude empty filter values
+                                    if family)
+        if not families_filter_lower:
             return True
-
-        single_families = []
-        combination_families = []
-        for family_filter in output_families_filter:
-            if not family_filter:
-                continue
-            if isinstance(family_filter, (list, tuple)):
-                _family_filter = []
-                for family in family_filter:
-                    if family:
-                        _family_filter.append(family.lower())
-                combination_families.append(_family_filter)
-            else:
-                single_families.append(family_filter.lower())
-
-        for family in single_families:
-            if family in families:
-                return True
-
-        for family_combination in combination_families:
-            valid = True
-            for family in family_combination:
-                if family not in families:
-                    valid = False
-                    break
-
-            if valid:
-                return True
-        return False
+        return any(family.lower() in families_filter_lower
+                   for family in families)

     def filter_output_defs(self, profile, subset_name, families):
         """Return outputs matching input instance families.
@@ -1716,14 +1468,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
         Returns:
             list: Containg all output definitions matching entered families.
         """
-        outputs = profile.get("outputs") or []
+        outputs = profile.get("outputs") or {}
         if not outputs:
             return outputs

-        # lower values
-        # QUESTION is this valid operation?
-        families = [family.lower() for family in families]
-
         filtered_outputs = {}
         for filename_suffix, output_def in outputs.items():
             output_filters = output_def.get("filter")
@@ -1995,14 +1743,14 @@ class OverscanCrop:
     relative_source_regex = re.compile(r"%([\+\-])")

     def __init__(
-        self, input_width, input_height, string_value, overscal_color=None
+        self, input_width, input_height, string_value, overscan_color=None
    ):
         # Make sure that is not None
         string_value = string_value or ""

         self.input_width = input_width
         self.input_height = input_height
-        self.overscal_color = overscal_color
+        self.overscan_color = overscan_color

         width, height = self._convert_string_to_values(string_value)
         self._width_value = width
@@ -2058,20 +1806,20 @@ class OverscanCrop:
         elif width >= self.input_width and height >= self.input_height:
             output.append(
                 "pad={}:{}:(iw-ow)/2:(ih-oh)/2:{}".format(
-                    width, height, self.overscal_color
+                    width, height, self.overscan_color
                 )
             )

         elif width > self.input_width and height < self.input_height:
             output.append("crop=iw:{}".format(height))
             output.append("pad={}:ih:(iw-ow)/2:(ih-oh)/2:{}".format(
-                width, self.overscal_color
+                width, self.overscan_color
             ))

         elif width < self.input_width and height > self.input_height:
             output.append("crop={}:ih".format(width))
             output.append("pad=iw:{}:(iw-ow)/2:(ih-oh)/2:{}".format(
-                height, self.overscal_color
+                height, self.overscan_color
             ))

         return output
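The renamed `overscan_color` attribute feeds ffmpeg `pad`/`crop` filter strings. A reduced sketch of the branch logic visible in this hunk (the real `OverscanCrop` class also parses relative string expressions and covers more size combinations):

```python
def overscan_filters(input_width, input_height, width, height, color="black"):
    """Build ffmpeg pad/crop filter strings for a target resolution."""
    output = []
    if width >= input_width and height >= input_height:
        # Target is at least as large in both axes: pad and center
        output.append(
            "pad={}:{}:(iw-ow)/2:(ih-oh)/2:{}".format(width, height, color))
    elif width > input_width and height < input_height:
        # Wider but shorter: crop height first, then pad width
        output.append("crop=iw:{}".format(height))
        output.append(
            "pad={}:ih:(iw-ow)/2:(ih-oh)/2:{}".format(width, color))
    elif width < input_width and height > input_height:
        # Narrower but taller: crop width first, then pad height
        output.append("crop={}:ih".format(width))
        output.append(
            "pad=iw:{}:(iw-ow)/2:(ih-oh)/2:{}".format(height, color))
    return output


print(overscan_filters(1920, 1080, 2000, 900))
```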
@@ -58,12 +58,16 @@
         "file": "{originalBasename}.{ext}",
         "path": "{@folder}/{@file}"
     },
+    "transient": {
+        "folder": "{root[work]}/{project[name]}/{hierarchy}/{asset}/work/{family}/{subset}"
+    },
     "__dynamic_keys_labels__": {
         "maya2unreal": "Maya to Unreal",
         "simpleUnrealTextureHero": "Simple Unreal Texture - Hero",
         "simpleUnrealTexture": "Simple Unreal Texture",
         "online": "online",
-        "source": "source"
+        "source": "source",
+        "transient": "transient"
     }
 }
}
@@ -9,6 +9,13 @@
         "rules": {}
     }
 },
+"workfile": {
+    "submission_overrides": [
+        "render_chunk",
+        "frame_range",
+        "resolution"
+    ]
+},
 "publish": {
     "CollectRenderPath": {
         "output_extension": "png",
@@ -591,7 +591,8 @@
             "task_names": [],
             "template_name": "simpleUnrealTextureHero"
         }
-    ]
+    ],
+    "custom_staging_dir_profiles": []
 },
 "project_folder_structure": "{\"__project_root__\": {\"prod\": {}, \"resources\": {\"footage\": {\"plates\": {}, \"offline\": {}}, \"audio\": {}, \"art_dept\": {}}, \"editorial\": {}, \"assets\": {\"characters\": {}, \"locations\": {}}, \"shots\": {}}}",
@@ -22,6 +22,31 @@

         ]
     },
+    {
+        "type": "dict",
+        "collapsible": true,
+        "key": "workfile",
+        "label": "Workfile",
+        "children": [
+            {
+                "key": "submission_overrides",
+                "label": "Submission workfile overrides",
+                "type": "enum",
+                "multiselection": true,
+                "enum_items": [
+                    {
+                        "render_chunk": "Pass chunk size"
+                    },
+                    {
+                        "frame_range": "Pass frame range"
+                    },
+                    {
+                        "resolution": "Pass resolution"
+                    }
+                ]
+            }
+        ]
+    },
     {
         "type": "dict",
         "collapsible": true,
@@ -408,6 +408,71 @@
                 }
             ]
         }
    },
+    {
+        "type": "list",
+        "key": "custom_staging_dir_profiles",
+        "label": "Custom Staging Dir Profiles",
+        "use_label_wrap": true,
+        "docstring": "Profiles to specify special location and persistence for staging dir. Could be used in Creators and Publish phase!",
+        "object_type": {
+            "type": "dict",
+            "children": [
+                {
+                    "type": "boolean",
+                    "key": "active",
+                    "label": "Is active",
+                    "default": true
+                },
+                {
+                    "type": "separator"
+                },
+                {
+                    "key": "hosts",
+                    "label": "Host names",
+                    "type": "hosts-enum",
+                    "multiselection": true
+                },
+                {
+                    "key": "task_types",
+                    "label": "Task types",
+                    "type": "task-types-enum"
+                },
+                {
+                    "key": "task_names",
+                    "label": "Task names",
+                    "type": "list",
+                    "object_type": "text"
+                },
+                {
+                    "key": "families",
+                    "label": "Families",
+                    "type": "list",
+                    "object_type": "text"
+                },
+                {
+                    "key": "subsets",
+                    "label": "Subset names",
+                    "type": "list",
+                    "object_type": "text"
+                },
+                {
+                    "type": "separator"
+                },
+                {
+                    "key": "custom_staging_dir_persistent",
+                    "label": "Custom Staging Folder Persistent",
+                    "type": "boolean",
+                    "default": false
+                },
+                {
+                    "key": "template_name",
+                    "label": "Template Name",
+                    "type": "text",
+                    "placeholder": "transient"
+                }
+            ]
+        }
+    }
 ]
}
@@ -1,4 +1,4 @@
-from qtpy import QtCore
+from qtpy import QtCore, QtGui

 # ID of context item in instance view
 CONTEXT_ID = "context"
@@ -26,6 +26,9 @@ GROUP_ROLE = QtCore.Qt.UserRole + 7
 CONVERTER_IDENTIFIER_ROLE = QtCore.Qt.UserRole + 8
 CREATOR_SORT_ROLE = QtCore.Qt.UserRole + 9

+ResetKeySequence = QtGui.QKeySequence(
+    QtCore.Qt.ControlModifier | QtCore.Qt.Key_R
+)

 __all__ = (
     "CONTEXT_ID",
@@ -6,7 +6,7 @@ import collections
 import uuid
 import tempfile
 import shutil
-from abc import ABCMeta, abstractmethod, abstractproperty
+from abc import ABCMeta, abstractmethod

 import six
 import pyblish.api
@@ -964,7 +964,8 @@ class AbstractPublisherController(object):
     access objects directly but by using wrappers that can be serialized.
     """

-    @abstractproperty
+    @property
+    @abstractmethod
     def log(self):
         """Controller's logger object.
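Not part of the diff, but useful context for the decorator swap repeated throughout this file: `abc.abstractproperty` has been deprecated since Python 3.3, and the documented replacement is stacking `@property` over `@abstractmethod`, as this commit does. A minimal standalone sketch (illustrative names only):

```python
from abc import ABC, abstractmethod


class Controller(ABC):
    # Old, deprecated spelling was: @abstractproperty
    # Modern equivalent: @property stacked over @abstractmethod
    @property
    @abstractmethod
    def log(self):
        """Controller's logger object."""


class ConcreteController(Controller):
    @property
    def log(self):
        # A concrete override satisfies the abstract property
        return "logger"
```

Instantiating `Controller` directly still raises `TypeError`, exactly as with the old `abstractproperty` spelling, so behavior is unchanged.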
@@ -974,13 +975,15 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def event_system(self):
         """Inner event system for publisher controller."""

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def project_name(self):
         """Current context project name.
@@ -990,7 +993,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def current_asset_name(self):
         """Current context asset name.
@@ -1000,7 +1004,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def current_task_name(self):
         """Current context task name.
@@ -1010,7 +1015,21 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
+    def host_context_has_changed(self):
+        """Host context changed after last reset.
+
+        'CreateContext' has this option available using 'context_has_changed'.
+
+        Returns:
+            bool: Context has changed.
+        """
+
+        pass
+
+    @property
+    @abstractmethod
     def host_is_valid(self):
         """Host is valid for creation part.
@@ -1023,7 +1042,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def instances(self):
         """Collected/created instances.
@@ -1134,7 +1154,13 @@ class AbstractPublisherController(object):

     @abstractmethod
     def save_changes(self):
-        """Save changes in create context."""
+        """Save changes in create context.
+
+        Save can crash because of unexpected errors.
+
+        Returns:
+            bool: Save was successful.
+        """
+
         pass
@@ -1145,7 +1171,19 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
+    def publish_has_started(self):
+        """Has publishing finished.
+
+        Returns:
+            bool: If publishing finished and all plugins were iterated.
+        """
+
+        pass
+
+    @property
+    @abstractmethod
     def publish_has_finished(self):
         """Has publishing finished.
@@ -1155,7 +1193,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def publish_is_running(self):
         """Publishing is running right now.
@@ -1165,7 +1204,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def publish_has_validated(self):
         """Publish validation passed.
@@ -1175,7 +1215,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def publish_has_crashed(self):
         """Publishing crashed for any reason.
@@ -1185,7 +1226,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def publish_has_validation_errors(self):
         """During validation happened at least one validation error.
@@ -1195,7 +1237,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def publish_max_progress(self):
         """Get maximum possible progress number.
@@ -1205,7 +1248,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def publish_progress(self):
         """Current progress number.
@@ -1215,7 +1259,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def publish_error_msg(self):
         """Current error message which cause fail of publishing.
@@ -1267,7 +1312,8 @@ class AbstractPublisherController(object):

         pass

-    @abstractproperty
+    @property
+    @abstractmethod
     def convertor_items(self):
         pass
@@ -1356,6 +1402,7 @@ class BasePublisherController(AbstractPublisherController):
         self._publish_has_validation_errors = False
         self._publish_has_crashed = False
         # All publish plugins are processed
+        self._publish_has_started = False
         self._publish_has_finished = False
         self._publish_max_progress = 0
         self._publish_progress = 0
@@ -1386,7 +1433,8 @@ class BasePublisherController(AbstractPublisherController):
         "show.card.message" - Show card message request (UI related).
         "instances.refresh.finished" - Instances are refreshed.
         "plugins.refresh.finished" - Plugins refreshed.
-        "publish.reset.finished" - Publish context reset finished.
+        "publish.reset.finished" - Reset finished.
+        "controller.reset.started" - Controller reset started.
+        "controller.reset.finished" - Controller reset finished.
         "publish.process.started" - Publishing started. Can be started from
             paused state.
@@ -1425,7 +1473,16 @@ class BasePublisherController(AbstractPublisherController):
     def _set_host_is_valid(self, value):
         if self._host_is_valid != value:
             self._host_is_valid = value
-            self._emit_event("publish.host_is_valid.changed", {"value": value})
+            self._emit_event(
+                "publish.host_is_valid.changed", {"value": value}
+            )
+
+    def _get_publish_has_started(self):
+        return self._publish_has_started
+
+    def _set_publish_has_started(self, value):
+        if value != self._publish_has_started:
+            self._publish_has_started = value

     def _get_publish_has_finished(self):
         return self._publish_has_finished
@@ -1449,7 +1506,9 @@ class BasePublisherController(AbstractPublisherController):
     def _set_publish_has_validated(self, value):
         if self._publish_has_validated != value:
             self._publish_has_validated = value
-            self._emit_event("publish.has_validated.changed", {"value": value})
+            self._emit_event(
+                "publish.has_validated.changed", {"value": value}
+            )

     def _get_publish_has_crashed(self):
         return self._publish_has_crashed
@@ -1497,6 +1556,9 @@ class BasePublisherController(AbstractPublisherController):
     host_is_valid = property(
         _get_host_is_valid, _set_host_is_valid
     )
+    publish_has_started = property(
+        _get_publish_has_started, _set_publish_has_started
+    )
     publish_has_finished = property(
         _get_publish_has_finished, _set_publish_has_finished
    )
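As context for the hunk above (not part of the diff): the controller wires its private getters and setters into public attributes with the classic `property(fget, fset)` call, which is why the new `publish_has_started` pair also gets a `property(...)` line. A toy sketch of that wiring, with illustrative names:

```python
class PublishState:
    def __init__(self):
        self._has_started = False

    def _get_has_started(self):
        return self._has_started

    def _set_has_started(self, value):
        # Only store on an actual change, mirroring the controller's guard
        if value != self._has_started:
            self._has_started = bool(value)

    # property() exposes the private accessors as a public attribute
    has_started = property(_get_has_started, _set_has_started)
```

Reads and writes to `state.has_started` then go through the two private methods, so change notifications (like the `_emit_event` calls in the real controller) can live in one place.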
@@ -1526,6 +1588,7 @@ class BasePublisherController(AbstractPublisherController):
         """Reset most of attributes that can be reset."""

         self.publish_is_running = False
+        self.publish_has_started = False
         self.publish_has_validated = False
         self.publish_has_crashed = False
         self.publish_has_validation_errors = False
@@ -1645,10 +1708,7 @@ class PublisherController(BasePublisherController):
             str: Project name.
         """

-        if not hasattr(self._host, "get_current_context"):
-            return legacy_io.active_project()
-
-        return self._host.get_current_context()["project_name"]
+        return self._create_context.get_current_project_name()

     @property
     def current_asset_name(self):
@@ -1658,10 +1718,7 @@ class PublisherController(BasePublisherController):
             Union[str, None]: Asset name or None if asset is not set.
         """

-        if not hasattr(self._host, "get_current_context"):
-            return legacy_io.Session["AVALON_ASSET"]
-
-        return self._host.get_current_context()["asset_name"]
+        return self._create_context.get_current_asset_name()

     @property
     def current_task_name(self):
@@ -1671,10 +1728,11 @@ class PublisherController(BasePublisherController):
             Union[str, None]: Task name or None if task is not set.
         """

-        if not hasattr(self._host, "get_current_context"):
-            return legacy_io.Session["AVALON_TASK"]
-
-        return self._host.get_current_context()["task_name"]
+        return self._create_context.get_current_task_name()
+
+    @property
+    def host_context_has_changed(self):
+        return self._create_context.context_has_changed

     @property
     def instances(self):
@@ -1751,6 +1809,8 @@ class PublisherController(BasePublisherController):
         """Reset everything related to creation and publishing."""
         self.stop_publish()

+        self._emit_event("controller.reset.started")
+
         self.host_is_valid = self._create_context.host_is_valid

         self._create_context.reset_preparation()
@@ -1992,7 +2052,15 @@ class PublisherController(BasePublisherController):
         )

     def trigger_convertor_items(self, convertor_identifiers):
-        self.save_changes()
+        """Trigger legacy item convertors.
+
+        This functionality requires to save and reset CreateContext. The reset
+        is needed so Creators can collect converted items.
+
+        Args:
+            convertor_identifiers (list[str]): Identifiers of convertor
+                plugins.
+        """

         success = True
         try:
@@ -2039,13 +2107,33 @@ class PublisherController(BasePublisherController):
         self._on_create_instance_change()
         return success

-    def save_changes(self):
-        """Save changes happened during creation."""
+    def save_changes(self, show_message=True):
+        """Save changes happened during creation.
+
+        Trigger save of changes using host api. This functionality does not
+        validate anything. It is required to do checks before this method is
+        called to be able to give user actionable response e.g. check of
+        context using 'host_context_has_changed'.
+
+        Args:
+            show_message (bool): Show message that changes were
+                saved successfully.
+
+        Returns:
+            bool: Save of changes was successful.
+        """
+
         if not self._create_context.host_is_valid:
-            return
+            # TODO remove
+            # Fake success save when host is not valid for CreateContext
+            #   this is for testing as experimental feature
+            return True

         try:
             self._create_context.save_changes()
+            if show_message:
+                self.emit_card_message("Saved changes..")
+            return True

         except CreatorsOperationFailed as exc:
             self._emit_event(
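A minimal sketch of the new `save_changes` contract shown above, using hypothetical stand-in classes rather than the OpenPype API: failures are converted into a `False` return value instead of propagating to the caller, so UI actions can branch on the result.

```python
class FakeCreateContext:
    """Hypothetical stand-in for CreateContext, for illustration only."""

    def __init__(self, fail=False):
        self.host_is_valid = True
        self._fail = fail

    def save_changes(self):
        if self._fail:
            raise RuntimeError("save failed")


def save_changes(create_context):
    # Mirrors the reworked method: report success as a boolean
    if not create_context.host_is_valid:
        # Fake success when the host cannot save at all
        return True
    try:
        create_context.save_changes()
        return True
    except RuntimeError:
        # Real code emits an error event here before returning
        return False
```

Callers such as the publish and validate button handlers can then guard with `if save_changes(...):` before continuing.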
@@ -2056,16 +2144,17 @@ class PublisherController(BasePublisherController):
                 }
             )

+        return False
+
     def remove_instances(self, instance_ids):
         """Remove instances based on instance ids.

         Args:
             instance_ids (List[str]): List of instance ids to remove.
         """
-        # QUESTION Expect that instances are really removed? In that case save
-        #   reset is not required and save changes too.
-        self.save_changes()
-
+        # QUESTION Expect that instances are really removed? In that case reset
+        #   is not required.
         self._remove_instances_from_context(instance_ids)

         self._on_create_instance_change()
@@ -2136,12 +2225,22 @@ class PublisherController(BasePublisherController):
         self._publish_comment_is_set = True

     def publish(self):
-        """Run publishing."""
+        """Run publishing.
+
+        Make sure all changes are saved before method is called (Call
+        'save_changes' and check output).
+        """
+
         self._publish_up_validation = False
         self._start_publish()

     def validate(self):
-        """Run publishing and stop after Validation."""
+        """Run publishing and stop after Validation.
+
+        Make sure all changes are saved before method is called (Call
+        'save_changes' and check output).
+        """
+
         if self.publish_has_validated:
             return
         self._publish_up_validation = True
@@ -2152,10 +2251,8 @@ class PublisherController(BasePublisherController):
         if self.publish_is_running:
             return

-        # Make sure changes are saved
-        self.save_changes()
-
         self.publish_is_running = True
+        self.publish_has_started = True

         self._emit_event("publish.process.started")
@@ -4,8 +4,9 @@ from .icons import (
     get_icon
 )
 from .widgets import (
-    StopBtn,
+    SaveBtn,
     ResetBtn,
+    StopBtn,
     ValidateBtn,
     PublishBtn,
     CreateNextPageOverlay,
@@ -25,8 +26,9 @@ __all__ = (
     "get_pixmap",
     "get_icon",

-    "StopBtn",
+    "SaveBtn",
     "ResetBtn",
+    "StopBtn",
     "ValidateBtn",
     "PublishBtn",
     "CreateNextPageOverlay",
@@ -164,6 +164,11 @@ class BaseGroupWidget(QtWidgets.QWidget):
     def _on_widget_selection(self, instance_id, group_id, selection_type):
         self.selected.emit(instance_id, group_id, selection_type)

+    def set_active_toggle_enabled(self, enabled):
+        for widget in self._widgets_by_id.values():
+            if isinstance(widget, InstanceCardWidget):
+                widget.set_active_toggle_enabled(enabled)
+

 class ConvertorItemsGroupWidget(BaseGroupWidget):
     def update_items(self, items_by_id):
@@ -437,6 +442,9 @@ class InstanceCardWidget(CardWidget):

         self.update_instance_values()

+    def set_active_toggle_enabled(self, enabled):
+        self._active_checkbox.setEnabled(enabled)
+
     def set_active(self, new_value):
         """Set instance as active."""
         checkbox_value = self._active_checkbox.isChecked()
@@ -551,6 +559,7 @@ class InstanceCardView(AbstractInstanceView):

         self._context_widget = None
         self._convertor_items_group = None
+        self._active_toggle_enabled = True
         self._widgets_by_group = {}
         self._ordered_groups = []
@@ -667,6 +676,9 @@ class InstanceCardView(AbstractInstanceView):
             group_widget.update_instances(
                 instances_by_group[group_name]
             )
+            group_widget.set_active_toggle_enabled(
+                self._active_toggle_enabled
+            )

         self._update_ordered_group_names()
@@ -1091,3 +1103,10 @@ class InstanceCardView(AbstractInstanceView):

         self._explicitly_selected_groups = selected_groups
         self._explicitly_selected_instance_ids = selected_instances
+
+    def set_active_toggle_enabled(self, enabled):
+        if self._active_toggle_enabled is enabled:
+            return
+        self._active_toggle_enabled = enabled
+        for group_widget in self._widgets_by_group.values():
+            group_widget.set_active_toggle_enabled(enabled)
BIN  openpype/tools/publisher/widgets/images/save.png  (new binary file, 3.9 KiB; not shown)
@@ -198,6 +198,9 @@ class InstanceListItemWidget(QtWidgets.QWidget):
         self.instance["active"] = new_value
         self.active_changed.emit(self.instance.id, new_value)

+    def set_active_toggle_enabled(self, enabled):
+        self._active_checkbox.setEnabled(enabled)
+

 class ListContextWidget(QtWidgets.QFrame):
     """Context (or global attributes) widget."""
@@ -302,6 +305,9 @@ class InstanceListGroupWidget(QtWidgets.QFrame):
         else:
             self.expand_btn.setArrowType(QtCore.Qt.RightArrow)

+    def set_active_toggle_enabled(self, enabled):
+        self.toggle_checkbox.setEnabled(enabled)
+

 class InstanceTreeView(QtWidgets.QTreeView):
     """View showing instances and their groups."""
@@ -461,6 +467,8 @@ class InstanceListView(AbstractInstanceView):
         self._instance_model = instance_model
         self._proxy_model = proxy_model

+        self._active_toggle_enabled = True
+
     def _on_expand(self, index):
         self._update_widget_expand_state(index, True)
@@ -667,6 +675,9 @@ class InstanceListView(AbstractInstanceView):
                 widget = InstanceListItemWidget(
                     instance, self._instance_view
                 )
+                widget.set_active_toggle_enabled(
+                    self._active_toggle_enabled
+                )
                 widget.active_changed.connect(self._on_active_changed)
                 self._instance_view.setIndexWidget(proxy_index, widget)
                 self._widgets_by_id[instance.id] = widget
@@ -802,6 +813,9 @@ class InstanceListView(AbstractInstanceView):
             proxy_index = self._proxy_model.mapFromSource(index)
             group_name = group_item.data(GROUP_ROLE)
             widget = InstanceListGroupWidget(group_name, self._instance_view)
+            widget.set_active_toggle_enabled(
+                self._active_toggle_enabled
+            )
             widget.expand_changed.connect(self._on_group_expand_request)
             widget.toggle_requested.connect(self._on_group_toggle_request)
             self._group_widgets[group_name] = widget
@@ -1051,3 +1065,16 @@ class InstanceListView(AbstractInstanceView):
             QtCore.QItemSelectionModel.Select
             | QtCore.QItemSelectionModel.Rows
         )
+
+    def set_active_toggle_enabled(self, enabled):
+        if self._active_toggle_enabled is enabled:
+            return
+
+        self._active_toggle_enabled = enabled
+        for widget in self._widgets_by_id.values():
+            if isinstance(widget, InstanceListItemWidget):
+                widget.set_active_toggle_enabled(enabled)
+
+        for widget in self._group_widgets.values():
+            if isinstance(widget, InstanceListGroupWidget):
+                widget.set_active_toggle_enabled(enabled)
@@ -17,6 +17,7 @@ class OverviewWidget(QtWidgets.QFrame):
     active_changed = QtCore.Signal()
     instance_context_changed = QtCore.Signal()
     create_requested = QtCore.Signal()
+    convert_requested = QtCore.Signal()

     anim_end_value = 200
     anim_duration = 200
@@ -132,6 +133,9 @@ class OverviewWidget(QtWidgets.QFrame):
         controller.event_system.add_callback(
             "publish.process.started", self._on_publish_start
         )
+        controller.event_system.add_callback(
+            "controller.reset.started", self._on_controller_reset_start
+        )
         controller.event_system.add_callback(
             "publish.reset.finished", self._on_publish_reset
         )
@@ -336,13 +340,31 @@ class OverviewWidget(QtWidgets.QFrame):
         self.instance_context_changed.emit()

     def _on_convert_requested(self):
-        _, _, convertor_identifiers = self.get_selected_items()
-        self._controller.trigger_convertor_items(convertor_identifiers)
+        self.convert_requested.emit()

     def get_selected_items(self):
         """Selected items in current view widget.

         Returns:
             tuple[list[str], bool, list[str]]: Selected items. List of
                 instance ids, context is selected, list of selected legacy
                 convertor plugins.
         """

         view = self._subset_views_layout.currentWidget()
         return view.get_selected_items()

+    def get_selected_legacy_convertors(self):
+        """Selected legacy convertor identifiers.
+
+        Returns:
+            list[str]: Selected legacy convertor identifiers.
+                Example: ['io.openpype.creators.houdini.legacy']
+        """
+
+        _, _, convertor_identifiers = self.get_selected_items()
+        return convertor_identifiers
+
     def _change_view_type(self):
         idx = self._subset_views_layout.currentIndex()
         new_idx = (idx + 1) % self._subset_views_layout.count()
@@ -391,9 +413,19 @@ class OverviewWidget(QtWidgets.QFrame):

         self._create_btn.setEnabled(False)
         self._subset_attributes_wrap.setEnabled(False)
+        for idx in range(self._subset_views_layout.count()):
+            widget = self._subset_views_layout.widget(idx)
+            widget.set_active_toggle_enabled(False)
+
+    def _on_controller_reset_start(self):
+        """Controller reset started."""
+
+        for idx in range(self._subset_views_layout.count()):
+            widget = self._subset_views_layout.widget(idx)
+            widget.set_active_toggle_enabled(True)

     def _on_publish_reset(self):
-        """Context in controller has been refreshed."""
+        """Context in controller has been reseted."""

         self._create_btn.setEnabled(True)
         self._subset_attributes_wrap.setEnabled(True)
@@ -34,7 +34,8 @@ from .icons import (
 )

 from ..constants import (
-    VARIANT_TOOLTIP
+    VARIANT_TOOLTIP,
+    ResetKeySequence,
 )
@@ -198,12 +199,26 @@ class CreateBtn(PublishIconBtn):
         self.setLayoutDirection(QtCore.Qt.RightToLeft)


+class SaveBtn(PublishIconBtn):
+    """Save context and instances information."""
+    def __init__(self, parent=None):
+        icon_path = get_icon_path("save")
+        super(SaveBtn, self).__init__(icon_path, parent)
+        self.setToolTip(
+            "Save changes ({})".format(
+                QtGui.QKeySequence(QtGui.QKeySequence.Save).toString()
+            )
+        )
+
+
 class ResetBtn(PublishIconBtn):
     """Publish reset button."""
     def __init__(self, parent=None):
         icon_path = get_icon_path("refresh")
         super(ResetBtn, self).__init__(icon_path, parent)
-        self.setToolTip("Refresh publishing")
+        self.setToolTip(
+            "Reset & discard changes ({})".format(ResetKeySequence.toString())
+        )


 class StopBtn(PublishIconBtn):
@@ -348,6 +363,19 @@ class AbstractInstanceView(QtWidgets.QWidget):
             "{} Method 'set_selected_items' is not implemented."
         ).format(self.__class__.__name__))

+    def set_active_toggle_enabled(self, enabled):
+        """Instances are disabled for changing enabled state.
+
+        Active state should stay the same until is "unset".
+
+        Args:
+            enabled (bool): Instance state can be changed.
+        """
+
+        raise NotImplementedError((
+            "{} Method 'set_active_toggle_enabled' is not implemented."
+        ).format(self.__class__.__name__))
+

 class ClickableLineEdit(QtWidgets.QLineEdit):
     """QLineEdit capturing left mouse click.
@@ -1533,7 +1561,7 @@ class SubsetAttributesWidget(QtWidgets.QWidget):
     │  attributes │    Thumbnail    │  TOP
     │             │                 │
     ├─────────────┬───┴─────────────┤
-    │   Family    │     Publish     │
+    │   Creator   │     Publish     │
     │  attributes │     plugin      │  BOTTOM
     │             │    attributes   │
     └───────────────────────────────┘
@@ -13,6 +13,7 @@ from openpype.tools.utils import (
     PixmapLabel,
 )

+from .constants import ResetKeySequence
 from .publish_report_viewer import PublishReportViewerWidget
 from .control_qt import QtPublisherController
 from .widgets import (
@@ -22,8 +23,9 @@ from .widgets import (

     PublisherTabsWidget,

-    StopBtn,
+    SaveBtn,
     ResetBtn,
+    StopBtn,
     ValidateBtn,
     PublishBtn,
@@ -121,6 +123,7 @@ class PublisherWindow(QtWidgets.QDialog):
             "Attach a comment to your publish"
         )

+        save_btn = SaveBtn(footer_widget)
         reset_btn = ResetBtn(footer_widget)
         stop_btn = StopBtn(footer_widget)
         validate_btn = ValidateBtn(footer_widget)
@@ -129,6 +132,7 @@ class PublisherWindow(QtWidgets.QDialog):
         footer_bottom_layout = QtWidgets.QHBoxLayout(footer_bottom_widget)
         footer_bottom_layout.setContentsMargins(0, 0, 0, 0)
         footer_bottom_layout.addStretch(1)
+        footer_bottom_layout.addWidget(save_btn, 0)
         footer_bottom_layout.addWidget(reset_btn, 0)
         footer_bottom_layout.addWidget(stop_btn, 0)
         footer_bottom_layout.addWidget(validate_btn, 0)
@@ -250,7 +254,11 @@ class PublisherWindow(QtWidgets.QDialog):
         overview_widget.create_requested.connect(
             self._on_create_request
         )
+        overview_widget.convert_requested.connect(
+            self._on_convert_requested
+        )

+        save_btn.clicked.connect(self._on_save_clicked)
         reset_btn.clicked.connect(self._on_reset_clicked)
         stop_btn.clicked.connect(self._on_stop_clicked)
         validate_btn.clicked.connect(self._on_validate_clicked)
@@ -330,8 +338,9 @@ class PublisherWindow(QtWidgets.QDialog):
         self._comment_input = comment_input
         self._footer_spacer = footer_spacer

-        self._stop_btn = stop_btn
+        self._save_btn = save_btn
         self._reset_btn = reset_btn
+        self._stop_btn = stop_btn
         self._validate_btn = validate_btn
         self._publish_btn = publish_btn
@@ -388,7 +397,9 @@ class PublisherWindow(QtWidgets.QDialog):
     def closeEvent(self, event):
         self._window_is_visible = False
         self._uninstall_app_event_listener()
-        self.save_changes()
+        # TODO capture changes and ask user if wants to save changes on close
+        if not self._controller.host_context_has_changed:
+            self._save_changes(False)
         self._reset_on_show = True
         self._controller.clear_thumbnail_temp_dir_path()
         super(PublisherWindow, self).closeEvent(event)
@@ -421,6 +432,21 @@ class PublisherWindow(QtWidgets.QDialog):
         if event.key() == QtCore.Qt.Key_Escape:
             event.accept()
             return

+        if event.matches(QtGui.QKeySequence.Save):
+            if not self._controller.publish_has_started:
+                self._save_changes(True)
+            event.accept()
+            return
+
+        if ResetKeySequence.matches(
+            QtGui.QKeySequence(event.key() | event.modifiers())
+        ):
+            if not self.controller.publish_is_running:
+                self.reset()
+            event.accept()
+            return
+
         super(PublisherWindow, self).keyPressEvent(event)

     def _on_overlay_message(self, event):
@@ -455,8 +481,65 @@ class PublisherWindow(QtWidgets.QDialog):
             self._reset_on_show = False
             self.reset()

-    def save_changes(self):
-        self._controller.save_changes()
+    def _checks_before_save(self, explicit_save):
+        """Save of changes may trigger some issues.
+
+        Check if context did change and ask user if he is really sure the
+        save should happen. A dialog can be shown during this method.
+
+        Args:
+            explicit_save (bool): Method was called when user explicitly asked
+                for save. Value affects shown message.
+
+        Returns:
+            bool: Save can happen.
+        """
+
+        if not self._controller.host_context_has_changed:
+            return True
+
+        title = "Host context changed"
+        if explicit_save:
+            message = (
+                "Context has changed since Publisher window was refreshed last"
+                " time.\n\nAre you sure you want to save changes?"
+            )
+        else:
+            message = (
+                "Your action requires save of changes but context has changed"
+                " since Publisher window was refreshed last time.\n\nAre you"
+                " sure you want to continue and save changes?"
+            )
+
+        result = QtWidgets.QMessageBox.question(
+            self,
+            title,
+            message,
+            QtWidgets.QMessageBox.Save | QtWidgets.QMessageBox.Cancel
+        )
+        return result == QtWidgets.QMessageBox.Save
+
+    def _save_changes(self, explicit_save):
+        """Save changes of Creation part.
+
+        All possible triggers of save changes were moved to main window (here),
+        so it can handle possible issues with save at one place. Do checks,
+        so user don't accidentally save changes to different file or using
+        different context.
+        Moving responsibility to this place gives option to show the dialog and
+        wait for user's response without breaking action he wanted to do.
+
+        Args:
+            explicit_save (bool): Method was called when user explicitly asked
+                for save. Value affects shown message.
+
+        Returns:
+            bool: Save happened successfully.
+        """
+
+        if not self._checks_before_save(explicit_save):
+            return False
+        return self._controller.save_changes()
+
     def reset(self):
         self._controller.reset()
@@ -491,15 +574,18 @@ class PublisherWindow(QtWidgets.QDialog):
         self._help_dialog.show()

         window = self.window()
-        desktop = QtWidgets.QApplication.desktop()
-        screen_idx = desktop.screenNumber(window)
-        screen = desktop.screen(screen_idx)
-        screen_rect = screen.geometry()
+        if hasattr(QtWidgets.QApplication, "desktop"):
+            desktop = QtWidgets.QApplication.desktop()
+            screen_idx = desktop.screenNumber(window)
+            screen_geo = desktop.screenGeometry(screen_idx)
+        else:
+            screen = window.screen()
+            screen_geo = screen.geometry()

         window_geo = window.geometry()
         dialog_x = window_geo.x() + window_geo.width()
         dialog_right = (dialog_x + self._help_dialog.width()) - 1
-        diff = dialog_right - screen_rect.right()
+        diff = dialog_right - screen_geo.right()
         if diff > 0:
             dialog_x -= diff
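The hunk above replaces the Qt5-only `QApplication.desktop()` call with feature detection, since that method no longer exists in Qt6 where `QWidget.screen()` is used instead. The same `hasattr` fallback pattern, reduced to plain Python with purely illustrative stand-in classes:

```python
class LegacyApp:
    """Stands in for a Qt5-style object exposing desktop()."""

    def desktop(self):
        return "geometry-from-desktop"


class ModernApp:
    """Stands in for a Qt6-style object exposing screen()."""

    def screen(self):
        return "geometry-from-screen"


def screen_geometry(app):
    # Prefer the legacy API when it exists, fall back to the modern one,
    # so one code path supports both bindings
    if hasattr(app, "desktop"):
        return app.desktop()
    return app.screen()
```

This keeps a single call site working against either Qt binding that `qtpy` may resolve at runtime.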
@@ -549,6 +635,14 @@ class PublisherWindow(QtWidgets.QDialog):
     def _on_create_request(self):
         self._go_to_create_tab()

+    def _on_convert_requested(self):
+        if not self._save_changes(False):
+            return
+        convertor_identifiers = (
+            self._overview_widget.get_selected_legacy_convertors()
+        )
+        self._controller.trigger_convertor_items(convertor_identifiers)
+
     def _set_current_tab(self, identifier):
         self._tabs_widget.set_current_tab(identifier)
@@ -599,8 +693,10 @@ class PublisherWindow(QtWidgets.QDialog):
         self._publish_frame.setVisible(visible)
         self._update_publish_frame_rect()

+    def _on_save_clicked(self):
+        self._save_changes(True)
+
     def _on_reset_clicked(self):
-        self.save_changes()
         self.reset()

     def _on_stop_clicked(self):
@@ -610,14 +706,17 @@ class PublisherWindow(QtWidgets.QDialog):
         self._controller.set_comment(self._comment_input.text())
 
     def _on_validate_clicked(self):
-        self._set_publish_comment()
-        self._controller.validate()
+        if self._save_changes(False):
+            self._set_publish_comment()
+            self._controller.validate()
 
     def _on_publish_clicked(self):
-        self._set_publish_comment()
-        self._controller.publish()
+        if self._save_changes(False):
+            self._set_publish_comment()
+            self._controller.publish()
 
     def _set_footer_enabled(self, enabled):
+        self._save_btn.setEnabled(True)
         self._reset_btn.setEnabled(True)
         if enabled:
             self._stop_btn.setEnabled(False)
@@ -247,7 +247,7 @@ class TrayPublishWindow(PublisherWindow):
 
     def _on_project_select(self, project_name):
         # TODO register project specific plugin paths
-        self._controller.save_changes()
+        self._controller.save_changes(False)
         self._controller.reset_project_data_cache()
 
         self.reset()
@@ -862,11 +862,11 @@ class WrappedCallbackItem:
         return self._result
 
     def execute(self):
-        """Execute callback and store it's result.
+        """Execute callback and store its result.
 
         Method must be called from main thread. Item is marked as `done`
         when callback execution finished. Store output of callback of exception
-        information when callback raise one.
+        information when callback raises one.
         """
         if self.done:
             self.log.warning("- item is already processed")
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-
 """Package declaring Pype version."""
-__version__ = "3.15.3-nightly.3"
+__version__ = "3.15.3-nightly.4"
Binary file not shown.
After Width: | Height: | Size: 9.7 KiB
@@ -82,8 +82,8 @@ All context filters are lists which may contain strings or Regular expressions (
 - **`tasks`** - Currently processed task. `["modeling", "animation"]`
 
 :::important Filtering
-Filters are optional. In case when multiple profiles match current context, profile with higher number of matched filters has higher priority that profile without filters.
-(Eg. order of when filter is added doesn't matter, only the precision of matching does.)
+Filters are optional. In case when multiple profiles match current context, profile with higher number of matched filters has higher priority than profile without filters.
+(The order the profiles in settings doesn't matter, only the precision of matching does.)
 :::
 
 ## Publish plugins
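The filtering rule in the hunk above (the profile with more matched filters wins; a set filter that does not match disqualifies the profile) can be sketched as follows. This is a hypothetical illustration of the documented rule, not OpenPype's actual `filter_profiles` implementation:

```python
import re

def profile_score(profile, context):
    """Return number of matched filters, or -1 when any filter excludes context."""
    score = 0
    for key, patterns in profile.get("filters", {}).items():
        if not patterns:
            continue  # an empty filter list constrains nothing
        value = context.get(key, "")
        if not any(re.fullmatch(pattern, value) for pattern in patterns):
            return -1  # a set filter that does not match disqualifies the profile
        score += 1
    return score

def select_profile(profiles, context):
    """Pick the matching profile with the highest number of matched filters."""
    candidates = [(profile_score(p, context), p) for p in profiles]
    candidates = [(score, p) for score, p in candidates if score >= 0]
    if not candidates:
        return None
    return max(candidates, key=lambda item: item[0])[1]

profiles = [
    {"name": "fallback", "filters": {}},
    {"name": "maya_model", "filters": {"hosts": ["maya"], "families": ["model"]}},
]
context = {"hosts": "maya", "families": "model", "tasks": "modeling"}
print(select_profile(profiles, context)["name"])  # maya_model
```

With this scoring, the order in which profiles are defined is irrelevant; only match precision decides, matching the note in the hunk.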
@@ -94,7 +94,7 @@ Publish plugins used across all integrations.
 ### Extract Review
 Plugin responsible for automatic FFmpeg conversion to variety of formats.
 
-Extract review is using [profile filtering](#profile-filters) to be able render different outputs for different situations.
+Extract review uses [profile filtering](#profile-filters) to render different outputs for different situations.
 
 Applicable context filters:
 **`hosts`** - Host from which publishing was triggered. `["maya", "nuke"]`
@@ -104,7 +104,7 @@ Applicable context filters:
 
 **Output Definitions**
 
-Profile may generate multiple outputs from a single input. Each output must define unique name and output extension (use the extension without a dot e.g. **mp4**). All other settings of output definition are optional.
+A profile may generate multiple outputs from a single input. Each output must define unique name and output extension (use the extension without a dot e.g. **mp4**). All other settings of output definition are optional.
 
 
 - **`Tags`**
@@ -118,7 +118,7 @@ Profile may generate multiple outputs from a single input. Each output must defi
 - **Output arguments** other FFmpeg output arguments like codec definition.
 
 - **`Output width`** and **`Output height`**
-    - it is possible to rescale output to specified resolution and keep aspect ratio.
+    - It is possible to rescale output to specified resolution and keep aspect ratio.
     - If value is set to 0, source resolution will be used.
 
 - **`Overscan crop`**
@@ -230,10 +230,10 @@ Applicable context filters:
 ## Tools
 Settings for OpenPype tools.
 
-## Creator
+### Creator
 Settings related to [Creator tool](artist_tools_creator).
 
-### Subset name profiles
+#### Subset name profiles
 
 
 Subset name helps to identify published content. More specific name helps with organization and avoid mixing of published content. Subset name is defined using one of templates defined in **Subset name profiles settings**. The template is filled with context information at the time of creation.
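The subset-name templating described in the hunk above fills a template such as `"{family}{Task}{Variant}"` with creation context. A hypothetical sketch of that fill step (key names follow the docs; deriving capitalized keys like `{Task}` from the lowercase values is an assumption for illustration):

```python
# Fill a subset name template with creation context. Capitalized keys
# ({Task}, {Variant}) are assumed to be capitalized copies of the values.
template = "{family}{Task}{Variant}"

context = {"family": "model", "task": "modeling", "variant": "main"}
fill_data = dict(context)
fill_data.update(
    {key.capitalize(): value.capitalize() for key, value in context.items()}
)
print(template.format(**fill_data))  # modelModelingMain
```

Offering both lowercase and capitalized keys lets a template choose casing per word, which is how a mixed name like `modelModelingMain` stays readable.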
@@ -263,10 +263,31 @@ Template may look like `"{family}{Task}{Variant}"`.
 Some creators may have other keys as their context may require more information or more specific values. Make sure you've read documentation of host you're using.
 
 
-## Workfiles
+### Publish
+
+#### Custom Staging Directory Profiles
+With this feature, users can specify a custom data folder path based on presets, which can be used during the creation and publishing stages.
+
+
+
+Staging directories are used as a destination for intermediate files (as renders) before they are renamed and copied to proper location during the integration phase. They could be created completely dynamically in the temp folder or for some DCCs in the `work` area.
+Example could be Nuke where artist might want to temporarily render pictures into `work` area to check them before they get published with the choice of "Use existing frames" on the write node.
+
+One of the key advantages of this feature is that it allows users to choose the folder for writing such intermediate files to take advantage of faster storage for rendering, which can help improve workflow efficiency. Additionally, this feature allows users to keep their intermediate extracted data persistent, and use their own infrastructure for regular cleaning.
+
+In some cases, these DCCs (Nuke, Houdini, Maya) automatically add a rendering path during the creation stage, which is then used in publishing. Creators and extractors of such DCCs need to use these profiles to fill paths in DCC's nodes to use this functionality.
+
+The custom staging folder uses a path template configured in `project_anatomy/templates/others` with `transient` being a default example path that could be used. The template requires a 'folder' key for it to be usable as custom staging folder.
+
+##### Known issues
+- Any DCC that uses prefilled paths and store them inside of workfile nodes needs to implement resolving these paths with a configured profiles.
+- If studio uses Site Sync remote artists need to have access to configured custom staging folder!
+- Each node on the rendering farm must have access to configured custom staging folder!
+
+### Workfiles
 All settings related to Workfile tool.
 
-### Open last workfile at launch
+#### Open last workfile at launch
 This feature allows you to define a rule for each task/host or toggle the feature globally to all tasks as they are visible in the picture.
 
 
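The custom staging directory added in the hunk above resolves a path template from the project anatomy. A minimal sketch of that resolution step (the `others`/`transient` names and the required `folder` key follow the docs; the template string and data values are invented for illustration):

```python
# Hypothetical anatomy snippet: "others/transient" must provide a "folder"
# key to be usable as a custom staging directory.
anatomy_templates = {
    "others": {
        "transient": {
            "folder": "{root}/{project}/transient/{asset}/{task}",
        }
    }
}

data = {
    "root": "/mnt/fast_storage",
    "project": "demo_project",
    "asset": "hero",
    "task": "compositing",
}

# Resolve the staging directory by filling the "folder" template with context.
template = anatomy_templates["others"]["transient"]["folder"]
staging_dir = template.format(**data)
print(staging_dir)  # /mnt/fast_storage/demo_project/transient/hero/compositing
```

Pointing `{root}` at fast storage is what lets renders land on a quicker disk before integration copies them to their final location.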
@@ -36,7 +36,7 @@ All context filters are lists which may contain strings or Regular expressions (
 - **families** - Main family of processed instance. `["plate", "model"]`
 
 :::important Filtering
-Filters are optional and may not be set. In case when multiple profiles match current context, profile with filters has higher priority that profile without filters.
+Filters are optional and may not be set. In case when multiple profiles match current context, profile with filters has higher priority than profile without filters.
 :::
 
 #### Profile outputs