Merge remote-tracking branch 'upstream/develop' into substance_integration

This commit is contained in:
Roy Nieterau 2023-03-23 13:41:07 +01:00
commit 13aa1c82fa
53 changed files with 1092 additions and 599 deletions

77
ARCHITECTURE.md Normal file
View file

@ -0,0 +1,77 @@
# Architecture
OpenPype is a monolithic Python project that bundles several parts. This document tries to give a bird's-eye overview of the project and, to a certain degree, of each of the sub-projects.
The current file structure looks like this:
```
.
├── common - Backend portion of the addon distribution logic for the v4 server.
├── docs - Documentation of the source code.
├── igniter - The OpenPype bootstrapper; deals with version resolution and setting up the connection to MongoDB.
├── openpype - The actual OpenPype core package.
├── schema - Collection of JSON files describing schematics of objects. This follows Avalon's convention.
├── tests - Integration and unit tests.
├── tools - Convenience scripts to perform common actions (in both bash and ps1).
├── vendor - When using the igniter, it deploys third party tools in here, such as ffmpeg.
└── website - Source files for https://openpype.io/, built with Docusaurus (https://docusaurus.io/).
```
The core functionality of the pipeline lives in `igniter` and `openpype`, which in turn rely on the `schema` files. Whenever you build (or download a pre-built) version of OpenPype, these are bundled together, and `Igniter` is the entry point.
## Igniter
It's the setup and update tool for OpenPype. Unless you want to package `openpype` separately and deal with all the configuration manually, this will most likely be your entry point.
```
igniter/
├── bootstrap_repos.py - Module that will find or install OpenPype versions in the system.
├── __init__.py - Igniter entry point.
├── install_dialog.py - Show dialog for choosing central pype repository.
├── install_thread.py - Threading helpers for the install process.
├── __main__.py - Like `__init__.py` ?
├── message_dialog.py - Qt Dialog with a message and "Ok" button.
├── nice_progress_bar.py - Fancy Qt progress bar.
├── splash.txt - ASCII art for the terminal installer.
├── stylesheet.css - Installer Qt styles.
├── terminal_splash.py - Terminal installer animation, relies on `splash.txt`.
├── tools.py - Collection of methods that don't fit in other modules.
├── update_thread.py - Threading helper to update existing OpenPype installs.
├── update_window.py - Qt UI to update OpenPype installs.
├── user_settings.py - Interface for the OpenPype user settings.
└── version.py - Igniter's version number.
```
## OpenPype
This is the main package of the OpenPype logic. It could be loosely described as a combination of [Avalon](https://getavalon.github.io), [Pyblish](https://pyblish.com/) and glue around those, with custom OpenPype-only elements. Things are being moved around to better prepare for V4, which will be released under a new name, AYON.
```
openpype/
├── client - Interface for the MongoDB.
├── hooks - Hooks to be executed on certain OpenPype Applications defined in `openpype.lib.applications`.
├── host - Base class for the different hosts.
├── hosts - Integration with the different DCCs (hosts) using the `host` base class.
├── lib - Libraries that stitch together the package; some have been moved into other parts.
├── modules - OpenPype modules contain separated logic for specific kinds of implementation, such as the Ftrack connection and its Python API.
├── pipeline - Core of the OpenPype pipeline, handles creation of data, publishing, etc.
├── plugins - Global/core plugins for loader and publisher tool.
├── resources - Icons, fonts, etc.
├── scripts - Loose scripts that get run by tools/publishers.
├── settings - OpenPype settings interface.
├── style - Qt styling.
├── tests - Unit tests.
├── tools - Core tools, check out https://openpype.io/docs/artist_tools.
├── vendor - Vendoring of required Python packages.
├── widgets - Common re-usable Qt Widgets.
├── action.py - LEGACY: Pyblish actions, now living in `openpype.pipeline.publish.action`.
├── cli.py - Command line interface, leverages `click`.
├── __init__.py - Sets two constants.
├── __main__.py - Entry point; calls `cli.py`.
├── plugin.py - Pyblish plugins.
├── pype_commands.py - Implementation of OpenPype commands.
└── version.py - Current version number.
```
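Since `cli.py` leverages `click` and `__main__.py` only forwards to it, the wiring typically looks like the minimal sketch below. This is illustrative only; the `tray` command and its option are hypothetical, not the actual OpenPype CLI.

```python
# Minimal sketch of a click-based CLI entry point (illustrative only).
import click


@click.group()
def main():
    """Top-level command group."""


@main.command()
@click.option("--debug", is_flag=True, help="Run with debug logging.")
def tray(debug):
    """Launch the tray application (hypothetical command)."""
    click.echo("Starting tray (debug={})".format(debug))


if __name__ == "__main__":
    main()
```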

View file

@ -3,10 +3,13 @@ from openpype.lib import PreLaunchHook
from openpype.pipeline.workfile import create_workdir_extra_folders
class AddLastWorkfileToLaunchArgs(PreLaunchHook):
"""Add last workfile path to launch arguments.
class CreateWorkdirExtraFolders(PreLaunchHook):
"""Create extra folders for the work directory.
Based on the setting `project_settings/global/tools/Workfiles/extra_folders`,
profile filtering will decide whether extra folders need to be created in
the work directory.
This is not possible to do for all applications the same way.
"""
# Execute after workfile template copy
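For context, a minimal pre-launch hook generally only needs to subclass `PreLaunchHook` and implement `execute`. The sketch below is an assumption-heavy illustration: the `order` value, the `self.data` lookup and the folder names are not taken from this diff.

```python
# Sketch of a pre-launch hook that creates extra work folders.
# Attribute names (`order`, `self.data`) and folder names are assumptions.
import os

from openpype.lib import PreLaunchHook


class CreateExtraFoldersSketch(PreLaunchHook):
    """Illustrative only; not the hook added in this commit."""

    order = 100  # intended to run after the workfile template copy

    def execute(self):
        # Assumed: launch context data exposes the resolved work directory.
        workdir = self.data.get("workdir")
        if not workdir:
            return
        for folder in ("renders", "exports"):  # hypothetical folder names
            os.makedirs(os.path.join(workdir, folder), exist_ok=True)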

View file

@ -146,6 +146,8 @@ class CreatorWidget(QtWidgets.QDialog):
return " ".join([str(m.group(0)).capitalize() for m in matches])
def create_row(self, layout, type, text, **kwargs):
value_keys = ["setText", "setCheckState", "setValue", "setChecked"]
# get type attribute from qwidgets
attr = getattr(QtWidgets, type)
@ -167,14 +169,27 @@ class CreatorWidget(QtWidgets.QDialog):
# assign the created attribute to variable
item = getattr(self, attr_name)
# set attributes to item which are not values
for func, val in kwargs.items():
if func in value_keys:
continue
if getattr(item, func):
log.debug("Setting {} to {}".format(func, val))
func_attr = getattr(item, func)
if isinstance(val, tuple):
func_attr(*val)
else:
func_attr(val)
# set values to item
for value_item in value_keys:
if value_item not in kwargs:
continue
if getattr(item, value_item):
getattr(item, value_item)(kwargs[value_item])
# add to layout
layout.addRow(label, item)
@ -276,8 +291,11 @@ class CreatorWidget(QtWidgets.QDialog):
elif v["type"] == "QSpinBox":
data[k]["value"] = self.create_row(
content_layout, "QSpinBox", v["label"],
setValue=v["value"], setMinimum=0,
setValue=v["value"],
setDisplayIntegerBase=10000,
setRange=(0, 99999), setMinimum=0,
setMaximum=100000, setToolTip=tool_tip)
return data
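For readers following the hunk above: `create_row` resolves a widget class by name from `QtWidgets`, applies non-value kwargs as setter calls (unpacking tuples), and applies the "value" setters last. A self-contained sketch of that kwargs-to-setter pattern, assuming a Qt binding is available via `qtpy`:

```python
# Self-contained sketch of the kwargs-to-setter pattern used by create_row.
from qtpy import QtWidgets

app = QtWidgets.QApplication([])

VALUE_KEYS = ["setText", "setCheckState", "setValue", "setChecked"]


def apply_kwargs(widget, **kwargs):
    # Non-value kwargs become setter calls; tuples unpack into arguments.
    for name, val in kwargs.items():
        if name in VALUE_KEYS:
            continue
        func = getattr(widget, name, None)
        if func:
            func(*val) if isinstance(val, tuple) else func(val)
    # Value setters are applied last.
    for name in VALUE_KEYS:
        if name in kwargs and getattr(widget, name, None):
            getattr(widget, name)(kwargs[name])


spin = QtWidgets.QSpinBox()
apply_kwargs(spin, setRange=(0, 99999), setToolTip="Start frame", setValue=1001)
print(spin.value())  # 1001
```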

View file

@ -0,0 +1,19 @@
import pyblish.api
from openpype.lib import version_up
from pymxs import runtime as rt
class IncrementWorkfileVersion(pyblish.api.ContextPlugin):
"""Increment current workfile version."""
order = pyblish.api.IntegratorOrder + 0.9
label = "Increment Workfile Version"
hosts = ["max"]
families = ["workfile"]
def process(self, context):
path = context.data["currentFile"]
filepath = version_up(path)
rt.saveMaxFile(filepath)
self.log.info("Incrementing file version")

View file

@ -30,36 +30,6 @@ def _has_arnold():
return False
def escape_space(path):
"""Ensure path is enclosed by quotes to allow paths with spaces"""
return '"{}"'.format(path) if " " in path else path
def get_ocio_config_path(profile_folder):
"""Path to OpenPype vendorized OCIO.
Vendorized OCIO config file path is grabbed from the specific path
hierarchy specified below.
"{OPENPYPE_ROOT}/vendor/OpenColorIO-Configs/{profile_folder}/config.ocio"
Args:
profile_folder (str): Name of folder to grab config file from.
Returns:
str: Path to vendorized config file.
"""
return os.path.join(
os.environ["OPENPYPE_ROOT"],
"vendor",
"bin",
"ocioconfig",
"OpenColorIOConfigs",
profile_folder,
"config.ocio"
)
def find_paths_by_hash(texture_hash):
"""Find the texture hash key in the dictionary.

View file

@ -25,7 +25,7 @@ class ExtractReviewData(publish.Extractor):
# review can be removed since `ProcessSubmittedJobOnFarm` will create
# reviewable representation if needed
if (
"render.farm" in instance.data["families"]
instance.data.get("farm")
and "review" in instance.data["families"]
):
instance.data["families"].remove("review")

View file

@ -49,7 +49,12 @@ class ExtractReviewDataLut(publish.Extractor):
exporter.stagingDir, exporter.file).replace("\\", "/")
instance.data["representations"] += data["representations"]
if "render.farm" in families:
# review can be removed since `ProcessSubmittedJobOnFarm` will create
# reviewable representation if needed
if (
instance.data.get("farm")
and "review" in instance.data["families"]
):
instance.data["families"].remove("review")
self.log.debug(

View file

@ -105,10 +105,7 @@ class ExtractReviewDataMov(publish.Extractor):
self, instance, o_name, o_data["extension"],
multiple_presets)
if (
"render.farm" in families or
"prerender.farm" in families
):
if instance.data.get("farm"):
if "review" in instance.data["families"]:
instance.data["families"].remove("review")

View file

@ -31,7 +31,7 @@ class ExtractThumbnail(publish.Extractor):
def process(self, instance):
if "render.farm" in instance.data["families"]:
if instance.data.get("farm"):
return
with napi.maintained_selection():

View file

@ -5,29 +5,38 @@ import re
import subprocess
from distutils import dir_util
from pathlib import Path
from typing import List
from typing import List, Union
import openpype.hosts.unreal.lib as ue_lib
from qtpy import QtCore
def parse_comp_progress(line: str, progress_signal: QtCore.Signal(int)) -> int:
match = re.search('\[[1-9]+/[0-9]+\]', line)
def parse_comp_progress(line: str, progress_signal: QtCore.Signal(int)):
match = re.search(r"\[[1-9]+/[0-9]+]", line)
if match is not None:
split: list[str] = match.group().split('/')
split: list[str] = match.group().split("/")
curr: float = float(split[0][1:])
total: float = float(split[1][:-1])
progress_signal.emit(int((curr / total) * 100.0))
def parse_prj_progress(line: str, progress_signal: QtCore.Signal(int)) -> int:
match = re.search('@progress', line)
def parse_prj_progress(line: str, progress_signal: QtCore.Signal(int)):
match = re.search("@progress", line)
if match is not None:
percent_match = re.search('\d{1,3}', line)
percent_match = re.search(r"\d{1,3}", line)
progress_signal.emit(int(percent_match.group()))
def retrieve_exit_code(line: str):
match = re.search(r"ExitCode=\d+", line)
if match is not None:
split: list[str] = match.group().split("=")
return int(split[1])
return None
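The helpers above scan Unreal build output line by line for progress markers and an exit code. Fed sample lines, the parsing behaves roughly as below; this is a standalone re-implementation for illustration, and the sample log lines are invented rather than real Unreal output.

```python
# Standalone illustration of the progress / exit-code parsing.
import re


def parse_comp_progress(line):
    match = re.search(r"\[[1-9]+/[0-9]+]", line)
    if match is not None:
        curr, total = match.group()[1:-1].split("/")
        return int(float(curr) / float(total) * 100.0)
    return None


def retrieve_exit_code(line):
    match = re.search(r"ExitCode=\d+", line)
    if match is not None:
        return int(match.group().split("=")[1])
    return None


print(parse_comp_progress("[12/48] Compile Module.cpp"))                 # 25
print(retrieve_exit_code("AutomationTool exiting with ExitCode=0"))      # 0
```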
class UEProjectGenerationWorker(QtCore.QObject):
finished = QtCore.Signal(str)
failed = QtCore.Signal(str)
@ -77,16 +86,19 @@ class UEProjectGenerationWorker(QtCore.QObject):
if self.dev_mode:
stage_count = 4
self.stage_begin.emit(f'Generating a new UE project ... 1 out of '
f'{stage_count}')
self.stage_begin.emit(
("Generating a new UE project ... 1 out of "
f"{stage_count}"))
commandlet_cmd = [f'{ue_editor_exe.as_posix()}',
f'{cmdlet_project.as_posix()}',
f'-run=OPGenerateProject',
f'{project_file.resolve().as_posix()}']
commandlet_cmd = [
f"{ue_editor_exe.as_posix()}",
f"{cmdlet_project.as_posix()}",
"-run=OPGenerateProject",
f"{project_file.resolve().as_posix()}",
]
if self.dev_mode:
commandlet_cmd.append('-GenerateCode')
commandlet_cmd.append("-GenerateCode")
gen_process = subprocess.Popen(commandlet_cmd,
stdout=subprocess.PIPE,
@ -94,24 +106,27 @@ class UEProjectGenerationWorker(QtCore.QObject):
for line in gen_process.stdout:
decoded_line = line.decode(errors="replace")
print(decoded_line, end='')
print(decoded_line, end="")
self.log.emit(decoded_line)
gen_process.stdout.close()
return_code = gen_process.wait()
if return_code and return_code != 0:
msg = 'Failed to generate ' + self.project_name \
+ f' project! Exited with return code {return_code}'
msg = (
f"Failed to generate {self.project_name} "
f"project! Exited with return code {return_code}"
)
self.failed.emit(msg, return_code)
raise RuntimeError(msg)
print("--- Project has been generated successfully.")
self.stage_begin.emit(f'Writing the Engine ID of the build UE ... 1'
f' out of {stage_count}')
self.stage_begin.emit(
(f"Writing the Engine ID of the build UE ... 1"
f" out of {stage_count}"))
if not project_file.is_file():
msg = "Failed to write the Engine ID into .uproject file! Can " \
"not read!"
msg = ("Failed to write the Engine ID into .uproject file! Can "
"not read!")
self.failed.emit(msg)
raise RuntimeError(msg)
@ -125,13 +140,14 @@ class UEProjectGenerationWorker(QtCore.QObject):
pf.seek(0)
json.dump(pf_json, pf, indent=4)
pf.truncate()
print(f'--- Engine ID has been written into the project file')
print("--- Engine ID has been written into the project file")
self.progress.emit(90)
if self.dev_mode:
# 2nd stage
self.stage_begin.emit(f'Generating project files ... 2 out of '
f'{stage_count}')
self.stage_begin.emit(
(f"Generating project files ... 2 out of "
f"{stage_count}"))
self.progress.emit(0)
ubt_path = ue_lib.get_path_to_ubt(self.engine_path,
@ -154,8 +170,8 @@ class UEProjectGenerationWorker(QtCore.QObject):
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
for line in gen_proc.stdout:
decoded_line: str = line.decode(errors='replace')
print(decoded_line, end='')
decoded_line: str = line.decode(errors="replace")
print(decoded_line, end="")
self.log.emit(decoded_line)
parse_prj_progress(decoded_line, self.progress)
@ -163,13 +179,13 @@ class UEProjectGenerationWorker(QtCore.QObject):
return_code = gen_proc.wait()
if return_code and return_code != 0:
msg = 'Failed to generate project files! ' \
f'Exited with return code {return_code}'
msg = ("Failed to generate project files! "
f"Exited with return code {return_code}")
self.failed.emit(msg, return_code)
raise RuntimeError(msg)
self.stage_begin.emit(f'Building the project ... 3 out of '
f'{stage_count}')
self.stage_begin.emit(
f"Building the project ... 3 out of {stage_count}")
self.progress.emit(0)
# 3rd stage
build_prj_cmd = [ubt_path.as_posix(),
@ -177,16 +193,16 @@ class UEProjectGenerationWorker(QtCore.QObject):
arch,
"Development",
"-TargetType=Editor",
f'-Project={project_file}',
f'{project_file}',
f"-Project={project_file}",
f"{project_file}",
"-IgnoreJunk"]
build_prj_proc = subprocess.Popen(build_prj_cmd,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
for line in build_prj_proc.stdout:
decoded_line: str = line.decode(errors='replace')
print(decoded_line, end='')
decoded_line: str = line.decode(errors="replace")
print(decoded_line, end="")
self.log.emit(decoded_line)
parse_comp_progress(decoded_line, self.progress)
@ -194,16 +210,17 @@ class UEProjectGenerationWorker(QtCore.QObject):
return_code = build_prj_proc.wait()
if return_code and return_code != 0:
msg = 'Failed to build project! ' \
f'Exited with return code {return_code}'
msg = ("Failed to build project! "
f"Exited with return code {return_code}")
self.failed.emit(msg, return_code)
raise RuntimeError(msg)
# ensure we have PySide2 installed in engine
self.progress.emit(0)
self.stage_begin.emit(f'Checking PySide2 installation... {stage_count}'
f' out of {stage_count}')
self.stage_begin.emit(
(f"Checking PySide2 installation... {stage_count} "
f" out of {stage_count}"))
python_path = None
if platform.system().lower() == "windows":
python_path = self.engine_path / ("Engine/Binaries/ThirdParty/"
@ -225,9 +242,30 @@ class UEProjectGenerationWorker(QtCore.QObject):
msg = f"Unreal Python not found at {python_path}"
self.failed.emit(msg, 1)
raise RuntimeError(msg)
subprocess.check_call(
[python_path.as_posix(), "-m", "pip", "install", "pyside2"]
)
pyside_cmd = [python_path.as_posix(),
"-m",
"pip",
"install",
"pyside2"]
pyside_install = subprocess.Popen(pyside_cmd,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
for line in pyside_install.stdout:
decoded_line: str = line.decode(errors="replace")
print(decoded_line, end="")
self.log.emit(decoded_line)
pyside_install.stdout.close()
return_code = pyside_install.wait()
if return_code and return_code != 0:
msg = ("Failed to create the project! "
"The installation of PySide2 has failed!")
self.failed.emit(msg, return_code)
raise RuntimeError(msg)
self.progress.emit(100)
self.finished.emit("Project successfully built!")
@ -266,26 +304,30 @@ class UEPluginInstallWorker(QtCore.QObject):
# in order to successfully build the plugin,
# It must be built outside the Engine directory and then moved
build_plugin_cmd: List[str] = [f'{uat_path.as_posix()}',
'BuildPlugin',
f'-Plugin={uplugin_path.as_posix()}',
f'-Package={temp_dir.as_posix()}']
build_plugin_cmd: List[str] = [f"{uat_path.as_posix()}",
"BuildPlugin",
f"-Plugin={uplugin_path.as_posix()}",
f"-Package={temp_dir.as_posix()}"]
build_proc = subprocess.Popen(build_plugin_cmd,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
return_code: Union[None, int] = None
for line in build_proc.stdout:
decoded_line: str = line.decode(errors='replace')
print(decoded_line, end='')
decoded_line: str = line.decode(errors="replace")
print(decoded_line, end="")
self.log.emit(decoded_line)
if return_code is None:
return_code = retrieve_exit_code(decoded_line)
parse_comp_progress(decoded_line, self.progress)
build_proc.stdout.close()
return_code = build_proc.wait()
build_proc.wait()
if return_code and return_code != 0:
msg = 'Failed to build plugin' \
f' project! Exited with return code {return_code}'
msg = ("Failed to build plugin"
f" project! Exited with return code {return_code}")
dir_util.remove_tree(temp_dir.as_posix())
self.failed.emit(msg, return_code)
raise RuntimeError(msg)

View file

@ -889,7 +889,8 @@ class ApplicationLaunchContext:
self.modules_manager = ModulesManager()
# Logger
logger_name = "{}-{}".format(self.__class__.__name__, self.app_name)
logger_name = "{}-{}".format(self.__class__.__name__,
self.application.full_name)
self.log = Logger.get_logger(logger_name)
self.executable = executable
@ -1246,7 +1247,7 @@ class ApplicationLaunchContext:
args_len_str = " ({})".format(len(args))
self.log.info(
"Launching \"{}\" with args{}: {}".format(
self.app_name, args_len_str, args
self.application.full_name, args_len_str, args
)
)
self.launch_args = args
@ -1271,7 +1272,9 @@ class ApplicationLaunchContext:
exc_info=True
)
self.log.debug("Launch of {} finished.".format(self.app_name))
self.log.debug("Launch of {} finished.".format(
self.application.full_name
))
return self.process
@ -1508,8 +1511,8 @@ def prepare_app_environments(
if key in source_env:
source_env[key] = value
# `added_env_keys` has debug purpose
added_env_keys = {app.group.name, app.name}
# `app_and_tool_labels` has debug purpose
app_and_tool_labels = [app.full_name]
# Environments for application
environments = [
app.group.environment,
@ -1532,15 +1535,14 @@ def prepare_app_environments(
for group_name in sorted(groups_by_name.keys()):
group = groups_by_name[group_name]
environments.append(group.environment)
added_env_keys.add(group_name)
for tool_name in sorted(tool_by_group_name[group_name].keys()):
tool = tool_by_group_name[group_name][tool_name]
environments.append(tool.environment)
added_env_keys.add(tool.name)
app_and_tool_labels.append(tool.full_name)
log.debug(
"Will add environments for apps and tools: {}".format(
", ".join(added_env_keys)
", ".join(app_and_tool_labels)
)
)

View file

@ -102,6 +102,10 @@ def run_subprocess(*args, **kwargs):
if (
platform.system().lower() == "windows"
and "creationflags" not in kwargs
# shell=True already tries to hide the console window
# and passing these creationflags then shows the window again
# so we avoid it for shell=True cases
and kwargs.get("shell") is not True
):
kwargs["creationflags"] = (
subprocess.CREATE_NEW_PROCESS_GROUP
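The comment added in this hunk explains that the console-hiding `creationflags` conflict with `shell=True`. Stripped of the surrounding `run_subprocess` logic, the guard can be sketched as a standalone helper (this is not the actual `run_subprocess` signature):

```python
# Standalone sketch of the Windows creationflags guard described above.
import platform
import subprocess


def hidden_popen_kwargs(**kwargs):
    """Add console-hiding creationflags on Windows, except for shell=True."""
    if (
        platform.system().lower() == "windows"
        and "creationflags" not in kwargs
        # shell=True already tries to hide the console window; adding these
        # creationflags would show the window again, so skip them there.
        and kwargs.get("shell") is not True
    ):
        kwargs["creationflags"] = (
            subprocess.CREATE_NEW_PROCESS_GROUP
            | getattr(subprocess, "DETACHED_PROCESS", 0)
            | getattr(subprocess, "CREATE_NO_WINDOW", 0)
        )
    return kwargs
```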

View file

@ -13,6 +13,16 @@ else:
from shutil import copyfile
class DuplicateDestinationError(ValueError):
"""Error raised when transfer destination already exists in queue.
The error is only raised if `allow_queue_replacements` is False on the
FileTransaction instance and the added file to transfer is of a different
src file than the one already detected in the queue.
"""
class FileTransaction(object):
"""File transaction with rollback options.
@ -44,7 +54,7 @@ class FileTransaction(object):
MODE_COPY = 0
MODE_HARDLINK = 1
def __init__(self, log=None):
def __init__(self, log=None, allow_queue_replacements=False):
if log is None:
log = logging.getLogger("FileTransaction")
@ -60,6 +70,8 @@ class FileTransaction(object):
# Backup file location mapping to original locations
self._backup_to_original = {}
self._allow_queue_replacements = allow_queue_replacements
def add(self, src, dst, mode=MODE_COPY):
"""Add a new file to transfer queue.
@ -82,6 +94,14 @@ class FileTransaction(object):
src, dst))
return
else:
if not self._allow_queue_replacements:
raise DuplicateDestinationError(
"Transfer to destination is already in queue: "
"{} -> {}. It's not allowed to be replaced by "
"a new transfer from {}".format(
queued_src, dst, src
))
self.log.warning("File transfer in queue replaced..")
self.log.debug(
"Removed from queue: {} -> {} replaced by {} -> {}".format(

View file

@ -224,18 +224,26 @@ def find_tool_in_custom_paths(paths, tool, validation_func=None):
def _check_args_returncode(args):
try:
# Python 2 compatibility where DEVNULL is not available
kwargs = {}
if platform.system().lower() == "windows":
kwargs["creationflags"] = (
subprocess.CREATE_NEW_PROCESS_GROUP
| getattr(subprocess, "DETACHED_PROCESS", 0)
| getattr(subprocess, "CREATE_NO_WINDOW", 0)
)
if hasattr(subprocess, "DEVNULL"):
proc = subprocess.Popen(
args,
stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
**kwargs
)
proc.wait()
else:
with open(os.devnull, "w") as devnull:
proc = subprocess.Popen(
args, stdout=devnull, stderr=devnull,
args, stdout=devnull, stderr=devnull, **kwargs
)
proc.wait()

View file

@ -32,7 +32,7 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin,
label = "Submit Nuke to Deadline"
order = pyblish.api.IntegratorOrder + 0.1
hosts = ["nuke"]
families = ["render", "prerender.farm"]
families = ["render", "prerender"]
optional = True
targets = ["local"]
@ -80,6 +80,10 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin,
]
def process(self, instance):
if not instance.data.get("farm"):
self.log.info("Skipping local instance.")
return
instance.data["attributeValues"] = self.get_attr_values_from_data(
instance.data)
@ -168,10 +172,10 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin,
resp.json()["_id"])
# redefinition of families
if "render.farm" in families:
if "render" in instance.data["family"]:
instance.data['family'] = 'write'
families.insert(0, "render2d")
elif "prerender.farm" in families:
elif "prerender" in instance.data["family"]:
instance.data['family'] = 'write'
families.insert(0, "prerender")
instance.data["families"] = families

View file

@ -756,6 +756,10 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
instance (pyblish.api.Instance): Instance data.
"""
if not instance.data.get("farm"):
self.log.info("Skipping local instance.")
return
data = instance.data.copy()
context = instance.context
self.context = context

View file

@ -21,6 +21,10 @@ class ValidateDeadlinePools(OptionalPyblishPluginMixin,
optional = True
def process(self, instance):
if not instance.data.get("farm"):
self.log.info("Skipping local instance.")
return
# get default deadline webservice url from deadline module
deadline_url = instance.context.data["defaultDeadline"]
self.log.info("deadline_url::{}".format(deadline_url))

View file

@ -124,6 +124,11 @@ class AppplicationsAction(BaseAction):
if not avalon_project_apps:
return False
settings = self.get_project_settings_from_event(
event, avalon_project_doc["name"])
only_available = settings["applications"]["only_available"]
items = []
for app_name in avalon_project_apps:
app = self.application_manager.applications.get(app_name)
@ -133,6 +138,10 @@ class AppplicationsAction(BaseAction):
if app.group.name in CUSTOM_LAUNCH_APP_GROUPS:
continue
# Skip applications without valid executables
if only_available and not app.find_executable():
continue
app_icon = app.icon
if app_icon and self.icon_url:
try:
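Both this Ftrack action and the launcher model later in this commit apply the same `only_available` project setting. The filtering boils down to something like the sketch below; the settings access and the helper function are simplified assumptions, only `app.find_executable()` and `app.enabled` come from the hunks themselves.

```python
# Simplified sketch of the "only_available" application filtering.
def filter_apps(app_names, application_manager, settings):
    only_available = settings["applications"]["only_available"]
    apps = []
    for app_name in app_names:
        app = application_manager.applications.get(app_name)
        if not app or not app.enabled:
            continue
        # Skip applications without a valid executable on this machine.
        if only_available and not app.find_executable():
            continue
        apps.append(app)
    return apps
```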

View file

@ -29,7 +29,7 @@ class CollectKitsuEntities(pyblish.api.ContextPlugin):
if not zou_asset_data:
raise ValueError("Zou asset data not found in OpenPype!")
task_name = instance.data.get("task")
task_name = instance.data.get("task", context.data.get("task"))
if not task_name:
continue

View file

@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
import gazu
import pyblish.api
import re
class IntegrateKitsuNote(pyblish.api.ContextPlugin):
@ -9,27 +10,69 @@ class IntegrateKitsuNote(pyblish.api.ContextPlugin):
order = pyblish.api.IntegratorOrder
label = "Kitsu Note and Status"
families = ["render", "kitsu"]
# status settings
set_status_note = False
note_status_shortname = "wfa"
status_conditions = list()
# comment settings
custom_comment_template = {
"enabled": False,
"comment_template": "{comment}",
}
def format_publish_comment(self, instance):
"""Format the instance's publish comment
Formats `instance.data` against the custom template.
"""
def replace_missing_key(match):
"""If key is not found in kwargs, set None instead"""
key = match.group(1)
if key not in instance.data:
self.log.warning(
"Key '{}' was not found in instance.data "
"and will be rendered as an empty string "
"in the comment".format(key)
)
return ""
else:
return str(instance.data[key])
template = self.custom_comment_template["comment_template"]
pattern = r"\{([^}]*)\}"
return re.sub(pattern, replace_missing_key, template)
def process(self, context):
# Get comment text body
publish_comment = context.data.get("comment")
if not publish_comment:
self.log.info("Comment is not set.")
self.log.debug("Comment is `{}`".format(publish_comment))
for instance in context:
# Check if instance is a review by checking its family
# Allow a match to primary family or any of families
families = set([instance.data["family"]] +
instance.data.get("families", []))
if "review" not in families:
continue
kitsu_task = instance.data.get("kitsu_task")
if kitsu_task is None:
continue
# Get note status, by default uses the task status for the note
# if it is not specified in the configuration
note_status = kitsu_task["task_status"]["id"]
shortname = kitsu_task["task_status"]["short_name"].upper()
note_status = kitsu_task["task_status_id"]
if self.set_status_note:
# Check if any status condition is not met
allow_status_change = True
for status_cond in self.status_conditions:
condition = status_cond["condition"] == "equal"
match = status_cond["short_name"].upper() == shortname
if match and not condition or condition and not match:
allow_status_change = False
break
if self.set_status_note and allow_status_change:
kitsu_status = gazu.task.get_task_status_by_short_name(
self.note_status_shortname
)
@ -42,11 +85,22 @@ class IntegrateKitsuNote(pyblish.api.ContextPlugin):
"changed!".format(self.note_status_shortname)
)
# Get comment text body
publish_comment = instance.data.get("comment")
if self.custom_comment_template["enabled"]:
publish_comment = self.format_publish_comment(instance)
if not publish_comment:
self.log.info("Comment is not set.")
else:
self.log.debug("Comment is `{}`".format(publish_comment))
# Add comment to kitsu task
task_id = kitsu_task["id"]
self.log.debug("Add new note in taks id {}".format(task_id))
self.log.debug(
"Add new note in tasks id {}".format(kitsu_task["id"])
)
kitsu_comment = gazu.task.add_comment(
task_id, note_status, comment=publish_comment
kitsu_task, note_status, comment=publish_comment
)
instance.data["kitsu_comment"] = kitsu_comment

View file

@ -12,17 +12,17 @@ class IntegrateKitsuReview(pyblish.api.InstancePlugin):
optional = True
def process(self, instance):
task = instance.data["kitsu_task"]["id"]
comment = instance.data["kitsu_comment"]["id"]
# Check comment has been created
if not comment:
comment_id = instance.data.get("kitsu_comment", {}).get("id")
if not comment_id:
self.log.debug(
"Comment not created, review not pushed to preview."
)
return
# Add review representations as preview of comment
task_id = instance.data["kitsu_task"]["id"]
for representation in instance.data.get("representations", []):
# Skip if not tagged as review
if "kitsureview" not in representation.get("tags", []):
@ -31,6 +31,6 @@ class IntegrateKitsuReview(pyblish.api.InstancePlugin):
self.log.debug("Found review at: {}".format(review_path))
gazu.task.add_preview(
task, comment, review_path, normalize_movie=True
task_id, comment_id, review_path, normalize_movie=True
)
self.log.info("Review upload on comment")

View file

@ -129,7 +129,7 @@ def update_op_assets(
frame_out = frame_in + frames_duration - 1
else:
frame_out = project_doc["data"].get("frameEnd", frame_in)
item_data["frameEnd"] = frame_out
item_data["frameEnd"] = int(frame_out)
# Fps, fallback to project's value or default value (25.0)
try:
fps = float(item_data.get("fps"))
@ -147,33 +147,37 @@ def update_op_assets(
item_data["resolutionWidth"] = int(match_res.group(1))
item_data["resolutionHeight"] = int(match_res.group(2))
else:
item_data["resolutionWidth"] = project_doc["data"].get(
"resolutionWidth"
item_data["resolutionWidth"] = int(
project_doc["data"].get("resolutionWidth")
)
item_data["resolutionHeight"] = project_doc["data"].get(
"resolutionHeight"
item_data["resolutionHeight"] = int(
project_doc["data"].get("resolutionHeight")
)
# Properties that doesn't fully exist in Kitsu.
# Guessing those property names below:
# Pixel Aspect Ratio
item_data["pixelAspect"] = item_data.get(
"pixel_aspect", project_doc["data"].get("pixelAspect")
item_data["pixelAspect"] = float(
item_data.get(
"pixel_aspect", project_doc["data"].get("pixelAspect")
)
)
# Handle Start
item_data["handleStart"] = item_data.get(
"handle_start", project_doc["data"].get("handleStart")
item_data["handleStart"] = int(
item_data.get(
"handle_start", project_doc["data"].get("handleStart")
)
)
# Handle End
item_data["handleEnd"] = item_data.get(
"handle_end", project_doc["data"].get("handleEnd")
item_data["handleEnd"] = int(
item_data.get("handle_end", project_doc["data"].get("handleEnd"))
)
# Clip In
item_data["clipIn"] = item_data.get(
"clip_in", project_doc["data"].get("clipIn")
item_data["clipIn"] = int(
item_data.get("clip_in", project_doc["data"].get("clipIn"))
)
# Clip Out
item_data["clipOut"] = item_data.get(
"clip_out", project_doc["data"].get("clipOut")
item_data["clipOut"] = int(
item_data.get("clip_out", project_doc["data"].get("clipOut"))
)
# Tasks

View file

@ -22,7 +22,7 @@ from openpype.lib.attribute_definitions import (
deserialize_attr_defs,
get_default_values,
)
from openpype.host import IPublishHost
from openpype.host import IPublishHost, IWorkfileHost
from openpype.pipeline import legacy_io
from openpype.pipeline.plugin_discover import DiscoverResult
@ -1374,6 +1374,7 @@ class CreateContext:
self._current_project_name = None
self._current_asset_name = None
self._current_task_name = None
self._current_workfile_path = None
self._host_is_valid = host_is_valid
# Currently unused variable
@ -1503,14 +1504,62 @@ class CreateContext:
return os.environ["AVALON_APP"]
def get_current_project_name(self):
"""Project name which was used as current context on context reset.
Returns:
Union[str, None]: Project name.
"""
return self._current_project_name
def get_current_asset_name(self):
"""Asset name which was used as current context on context reset.
Returns:
Union[str, None]: Asset name.
"""
return self._current_asset_name
def get_current_task_name(self):
"""Task name which was used as current context on context reset.
Returns:
Union[str, None]: Task name.
"""
return self._current_task_name
def get_current_workfile_path(self):
"""Workfile path which was opened on context reset.
Returns:
Union[str, None]: Workfile path.
"""
return self._current_workfile_path
@property
def context_has_changed(self):
"""Host context has changed.
As context is used project, asset, task name and workfile path if
host does support workfiles.
Returns:
bool: Context changed.
"""
project_name, asset_name, task_name, workfile_path = (
self._get_current_host_context()
)
return (
self._current_project_name != project_name
or self._current_asset_name != asset_name
or self._current_task_name != task_name
or self._current_workfile_path != workfile_path
)
project_name = property(get_current_project_name)
@property
@ -1575,6 +1624,28 @@ class CreateContext:
self._collection_shared_data = None
self.refresh_thumbnails()
def _get_current_host_context(self):
project_name = asset_name = task_name = workfile_path = None
if hasattr(self.host, "get_current_context"):
host_context = self.host.get_current_context()
if host_context:
project_name = host_context.get("project_name")
asset_name = host_context.get("asset_name")
task_name = host_context.get("task_name")
if isinstance(self.host, IWorkfileHost):
workfile_path = self.host.get_current_workfile()
# --- TODO remove these conditions ---
if not project_name:
project_name = legacy_io.Session.get("AVALON_PROJECT")
if not asset_name:
asset_name = legacy_io.Session.get("AVALON_ASSET")
if not task_name:
task_name = legacy_io.Session.get("AVALON_TASK")
# ---
return project_name, asset_name, task_name, workfile_path
def reset_current_context(self):
"""Refresh current context.
@ -1593,24 +1664,14 @@ class CreateContext:
are stored. We should store the workfile (if is available) too.
"""
project_name = asset_name = task_name = None
if hasattr(self.host, "get_current_context"):
host_context = self.host.get_current_context()
if host_context:
project_name = host_context.get("project_name")
asset_name = host_context.get("asset_name")
task_name = host_context.get("task_name")
if not project_name:
project_name = legacy_io.Session.get("AVALON_PROJECT")
if not asset_name:
asset_name = legacy_io.Session.get("AVALON_ASSET")
if not task_name:
task_name = legacy_io.Session.get("AVALON_TASK")
project_name, asset_name, task_name, workfile_path = (
self._get_current_host_context()
)
self._current_project_name = project_name
self._current_asset_name = asset_name
self._current_task_name = task_name
self._current_workfile_path = workfile_path
def reset_plugins(self, discover_publish_plugins=True):
"""Reload plugins.

View file

@ -16,9 +16,7 @@ from openpype.lib import (
get_transcode_temp_directory,
convert_input_paths_for_ffmpeg,
should_convert_for_ffmpeg,
CREATE_NO_WINDOW
should_convert_for_ffmpeg
)
from openpype.lib.profiles_filtering import filter_profiles
@ -338,8 +336,6 @@ class ExtractBurnin(publish.Extractor):
"logger": self.log,
"env": {}
}
if platform.system().lower() == "windows":
process_kwargs["creationflags"] = CREATE_NO_WINDOW
run_openpype_process(*args, **process_kwargs)
# Remove the temporary json
@ -729,7 +725,6 @@ class ExtractBurnin(publish.Extractor):
return filtered_burnin_defs
families = self.families_from_instance(instance)
low_families = [family.lower() for family in families]
for filename_suffix, orig_burnin_def in burnin_defs.items():
burnin_def = copy.deepcopy(orig_burnin_def)
@ -740,7 +735,7 @@ class ExtractBurnin(publish.Extractor):
families_filters = def_filter["families"]
if not self.families_filter_validation(
low_families, families_filters
families, families_filters
):
self.log.debug((
"Skipped burnin definition \"{}\". Family"
@ -777,31 +772,19 @@ class ExtractBurnin(publish.Extractor):
return filtered_burnin_defs
def families_filter_validation(self, families, output_families_filter):
"""Determine if entered families intersect with families filters.
"""Determines if entered families intersect with families filters.
All family values are lowered to avoid unexpected results.
"""
if not output_families_filter:
families_filter_lower = set(family.lower() for family in
output_families_filter
# Exclude empty filter values
if family)
if not families_filter_lower:
return True
for family_filter in output_families_filter:
if not family_filter:
continue
if not isinstance(family_filter, (list, tuple)):
if family_filter.lower() not in families:
continue
return True
valid = True
for family in family_filter:
if family.lower() not in families:
valid = False
break
if valid:
return True
return False
return any(family.lower() in families_filter_lower
for family in families)
def families_from_instance(self, instance):
"""Return all families of entered instance."""

View file

@ -12,7 +12,7 @@ import pyblish.api
from openpype.lib import (
get_ffmpeg_tool_path,
filter_profiles,
path_to_subprocess_arg,
run_subprocess,
)
@ -23,6 +23,7 @@ from openpype.lib.transcoding import (
convert_input_paths_for_ffmpeg,
get_transcode_temp_directory,
)
from openpype.pipeline.publish import KnownPublishError
class ExtractReview(pyblish.api.InstancePlugin):
@ -88,21 +89,23 @@ class ExtractReview(pyblish.api.InstancePlugin):
def _get_outputs_for_instance(self, instance):
host_name = instance.context.data["hostName"]
task_name = os.environ["AVALON_TASK"]
family = self.main_family_from_instance(instance)
self.log.info("Host: \"{}\"".format(host_name))
self.log.info("Task: \"{}\"".format(task_name))
self.log.info("Family: \"{}\"".format(family))
profile = self.find_matching_profile(
host_name, task_name, family
)
profile = filter_profiles(
self.profiles,
{
"hosts": host_name,
"families": family,
},
logger=self.log)
if not profile:
self.log.info((
"Skipped instance. None of profiles in presets are for"
" Host: \"{}\" | Family: \"{}\" | Task \"{}\""
).format(host_name, family, task_name))
" Host: \"{}\" | Family: \"{}\""
).format(host_name, family))
return
self.log.debug("Matching profile: \"{}\"".format(json.dumps(profile)))
@ -112,17 +115,19 @@ class ExtractReview(pyblish.api.InstancePlugin):
filtered_outputs = self.filter_output_defs(
profile, subset_name, instance_families
)
if not filtered_outputs:
self.log.info((
"Skipped instance. All output definitions from selected"
" profile do not match instance families \"{}\" or"
" subset name \"{}\"."
).format(str(instance_families), subset_name))
# Store `filename_suffix` to save arguments
profile_outputs = []
for filename_suffix, definition in filtered_outputs.items():
definition["filename_suffix"] = filename_suffix
profile_outputs.append(definition)
if not filtered_outputs:
self.log.info((
"Skipped instance. All output definitions from selected"
" profile does not match to instance families. \"{}\""
).format(str(instance_families)))
return profile_outputs
def _get_outputs_per_representations(self, instance, profile_outputs):
@ -216,6 +221,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
outputs_per_repres = self._get_outputs_per_representations(
instance, profile_outputs
)
for repre, output_defs in outputs_per_repres:
# Check if input should be preconverted before processing
# Store original staging dir (it's value may change)
@ -297,10 +303,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
shutil.rmtree(new_staging_dir)
def _render_output_definitions(
self, instance, repre, src_repre_staging_dir, output_defs
self, instance, repre, src_repre_staging_dir, output_definitions
):
fill_data = copy.deepcopy(instance.data["anatomyData"])
for _output_def in output_defs:
for _output_def in output_definitions:
output_def = copy.deepcopy(_output_def)
# Make sure output definition has "tags" key
if "tags" not in output_def:
@ -346,10 +352,11 @@ class ExtractReview(pyblish.api.InstancePlugin):
if temp_data["input_is_sequence"]:
self.log.info("Filling gaps in sequence.")
files_to_clean = self.fill_sequence_gaps(
temp_data["origin_repre"]["files"],
new_repre["stagingDir"],
temp_data["frame_start"],
temp_data["frame_end"])
files=temp_data["origin_repre"]["files"],
staging_dir=new_repre["stagingDir"],
start_frame=temp_data["frame_start"],
end_frame=temp_data["frame_end"]
)
# create or update outputName
output_name = new_repre.get("outputName", "")
@ -421,10 +428,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
def input_is_sequence(self, repre):
"""Deduce from representation data if input is sequence."""
# TODO GLOBAL ISSUE - Find better way how to find out if input
# is sequence. Issues( in theory):
# - there may be multiple files ant not be sequence
# - remainders are not checked at all
# - there can be more than one collection
# is sequence. Issues (in theory):
#   - there may be multiple files and not be a sequence
#   - remainders are not checked at all
#   - there can be more than one collection
return isinstance(repre["files"], (list, tuple))
def prepare_temp_data(self, instance, repre, output_def):
@ -816,76 +823,41 @@ class ExtractReview(pyblish.api.InstancePlugin):
is done.
Raises:
AssertionError: if more then one collection is obtained.
KnownPublishError: if more than one collection is obtained.
"""
start_frame = int(start_frame)
end_frame = int(end_frame)
collections = clique.assemble(files)[0]
msg = "Multiple collections {} found.".format(collections)
assert len(collections) == 1, msg
if len(collections) != 1:
raise KnownPublishError(
"Multiple collections {} found.".format(collections))
col = collections[0]
# do nothing if no gap is found in input range
not_gap = True
for fr in range(start_frame, end_frame + 1):
if fr not in col.indexes:
not_gap = False
if not_gap:
return []
holes = col.holes()
# generate ideal sequence
complete_col = clique.assemble(
[("{}{:0" + str(col.padding) + "d}{}").format(
col.head, f, col.tail
) for f in range(start_frame, end_frame)]
)[0][0] # type: clique.Collection
new_files = {}
last_existing_file = None
for idx in holes.indexes:
# get previous existing file
test_file = os.path.normpath(os.path.join(
staging_dir,
("{}{:0" + str(complete_col.padding) + "d}{}").format(
complete_col.head, idx - 1, complete_col.tail)))
if os.path.isfile(test_file):
new_files[idx] = test_file
last_existing_file = test_file
# Prepare which hole is filled with what frame
# - the frame is filled only with already existing frames
prev_frame = next(iter(col.indexes))
hole_frame_to_nearest = {}
for frame in range(int(start_frame), int(end_frame) + 1):
if frame in col.indexes:
prev_frame = frame
else:
if not last_existing_file:
# previous file is not found (sequence has a hole
# at the beginning. Use first available frame
# there is.
try:
last_existing_file = list(col)[0]
except IndexError:
# empty collection?
raise AssertionError(
"Invalid sequence collected")
new_files[idx] = os.path.normpath(
os.path.join(staging_dir, last_existing_file))
# Use previous frame as source for hole
hole_frame_to_nearest[frame] = prev_frame
files_to_clean = []
if new_files:
# so now new files are dict with missing frame as a key and
# existing file as a value.
for frame, file in new_files.items():
self.log.info(
"Filling gap {} with {}".format(frame, file))
# Calculate paths
added_files = []
col_format = col.format("{head}{padding}{tail}")
for hole_frame, src_frame in hole_frame_to_nearest.items():
hole_fpath = os.path.join(staging_dir, col_format % hole_frame)
src_fpath = os.path.join(staging_dir, col_format % src_frame)
if not os.path.isfile(src_fpath):
raise KnownPublishError(
"Missing previously detected file: {}".format(src_fpath))
hole = os.path.join(
staging_dir,
("{}{:0" + str(col.padding) + "d}{}").format(
col.head, frame, col.tail))
speedcopy.copyfile(file, hole)
files_to_clean.append(hole)
speedcopy.copyfile(src_fpath, hole_fpath)
added_files.append(hole_fpath)
return files_to_clean
return added_files
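The rewritten `fill_sequence_gaps` maps every missing frame to the nearest preceding existing frame and then copies that file into the hole. The core mapping can be sketched in isolation; the frame numbers below are invented and the `clique` collection handling and file copying are left out.

```python
# Isolated sketch of the hole-to-nearest-previous-frame mapping.
existing_frames = {1001, 1002, 1005, 1006}
start_frame, end_frame = 1001, 1007

hole_frame_to_nearest = {}
prev_frame = min(existing_frames)
for frame in range(start_frame, end_frame + 1):
    if frame in existing_frames:
        prev_frame = frame
    else:
        # Each missing frame is later filled by copying this source frame.
        hole_frame_to_nearest[frame] = prev_frame

print(hole_frame_to_nearest)  # {1003: 1002, 1004: 1002, 1007: 1006}
```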
def input_output_paths(self, new_repre, output_def, temp_data):
"""Deduce input nad output file paths based on entered data.
@ -1281,7 +1253,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
# 'use_input_res' is set to 'True'.
use_input_res = False
# Overscal color
# Overscan color
overscan_color_value = "black"
overscan_color = output_def.get("overscan_color")
if overscan_color:
@ -1468,240 +1440,20 @@ class ExtractReview(pyblish.api.InstancePlugin):
families.append(family)
return families
def compile_list_of_regexes(self, in_list):
"""Convert strings in entered list to compiled regex objects."""
regexes = []
if not in_list:
return regexes
for item in in_list:
if not item:
continue
try:
regexes.append(re.compile(item))
except TypeError:
self.log.warning((
"Invalid type \"{}\" value \"{}\"."
" Expected string based object. Skipping."
).format(str(type(item)), str(item)))
return regexes
def validate_value_by_regexes(self, value, in_list):
"""Validates in any regex from list match entered value.
Args:
in_list (list): List with regexes.
value (str): String where regexes is checked.
Returns:
int: Returns `0` when list is not set or is empty. Returns `1` when
any regex match value and returns `-1` when none of regexes
match value entered.
"""
if not in_list:
return 0
output = -1
regexes = self.compile_list_of_regexes(in_list)
for regex in regexes:
if not value:
continue
if re.match(regex, value):
output = 1
break
return output
def profile_exclusion(self, matching_profiles):
"""Find out most matching profile byt host, task and family match.
Profiles are selectively filtered. Each profile should have
"__value__" key with list of booleans. Each boolean represents
existence of filter for specific key (host, tasks, family).
Profiles are looped in sequence. In each sequence are split into
true_list and false_list. For next sequence loop are used profiles in
true_list if there are any profiles else false_list is used.
Filtering ends when only one profile left in true_list. Or when all
existence booleans loops passed, in that case first profile from left
profiles is returned.
Args:
matching_profiles (list): Profiles with same values.
Returns:
dict: Most matching profile.
"""
self.log.info(
"Search for first most matching profile in match order:"
" Host name -> Task name -> Family."
)
# Filter all profiles with highest points value. First filter profiles
# with matching host if there are any then filter profiles by task
# name if there are any and lastly filter by family. Else use first in
# list.
idx = 0
final_profile = None
while True:
profiles_true = []
profiles_false = []
for profile in matching_profiles:
value = profile["__value__"]
# Just use first profile when idx is greater than values.
if not idx < len(value):
final_profile = profile
break
if value[idx]:
profiles_true.append(profile)
else:
profiles_false.append(profile)
if final_profile is not None:
break
if profiles_true:
matching_profiles = profiles_true
else:
matching_profiles = profiles_false
if len(matching_profiles) == 1:
final_profile = matching_profiles[0]
break
idx += 1
final_profile.pop("__value__")
return final_profile
def find_matching_profile(self, host_name, task_name, family):
""" Filter profiles by Host name, Task name and main Family.
Filtering keys are "hosts" (list), "tasks" (list), "families" (list).
If key is not find or is empty than it's expected to match.
Args:
profiles (list): Profiles definition from presets.
host_name (str): Current running host name.
task_name (str): Current context task name.
family (str): Main family of current Instance.
Returns:
dict/None: Return most matching profile or None if none of profiles
match at least one criteria.
"""
matching_profiles = None
if not self.profiles:
return matching_profiles
highest_profile_points = -1
# Each profile get 1 point for each matching filter. Profile with most
# points is returned. For cases when more than one profile will match
# are also stored ordered lists of matching values.
for profile in self.profiles:
profile_points = 0
profile_value = []
# Host filtering
host_names = profile.get("hosts")
match = self.validate_value_by_regexes(host_name, host_names)
if match == -1:
self.log.debug(
"\"{}\" not found in {}".format(host_name, host_names)
)
continue
profile_points += match
profile_value.append(bool(match))
# Task filtering
task_names = profile.get("tasks")
match = self.validate_value_by_regexes(task_name, task_names)
if match == -1:
self.log.debug(
"\"{}\" not found in {}".format(task_name, task_names)
)
continue
profile_points += match
profile_value.append(bool(match))
# Family filtering
families = profile.get("families")
match = self.validate_value_by_regexes(family, families)
if match == -1:
self.log.debug(
"\"{}\" not found in {}".format(family, families)
)
continue
profile_points += match
profile_value.append(bool(match))
if profile_points < highest_profile_points:
continue
if profile_points > highest_profile_points:
matching_profiles = []
highest_profile_points = profile_points
if profile_points == highest_profile_points:
profile["__value__"] = profile_value
matching_profiles.append(profile)
if not matching_profiles:
self.log.warning((
"None of profiles match your setup."
" Host \"{}\" | Task: \"{}\" | Family: \"{}\""
).format(host_name, task_name, family))
return
if len(matching_profiles) == 1:
# Pop temporary key `__value__`
matching_profiles[0].pop("__value__")
return matching_profiles[0]
self.log.warning((
"More than one profile match your setup."
" Host \"{}\" | Task: \"{}\" | Family: \"{}\""
).format(host_name, task_name, family))
return self.profile_exclusion(matching_profiles)
def families_filter_validation(self, families, output_families_filter):
"""Determines if entered families intersect with families filters.
All family values are lowered to avoid unexpected results.
"""
if not output_families_filter:
families_filter_lower = set(family.lower() for family in
output_families_filter
# Exclude empty filter values
if family)
if not families_filter_lower:
return True
single_families = []
combination_families = []
for family_filter in output_families_filter:
if not family_filter:
continue
if isinstance(family_filter, (list, tuple)):
_family_filter = []
for family in family_filter:
if family:
_family_filter.append(family.lower())
combination_families.append(_family_filter)
else:
single_families.append(family_filter.lower())
for family in single_families:
if family in families:
return True
for family_combination in combination_families:
valid = True
for family in family_combination:
if family not in families:
valid = False
break
if valid:
return True
return False
return any(family.lower() in families_filter_lower
for family in families)
def filter_output_defs(self, profile, subset_name, families):
"""Return outputs matching input instance families.
@ -1716,14 +1468,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
Returns:
list: Containing all output definitions matching entered families.
"""
outputs = profile.get("outputs") or []
outputs = profile.get("outputs") or {}
if not outputs:
return outputs
# lower values
# QUESTION is this valid operation?
families = [family.lower() for family in families]
filtered_outputs = {}
for filename_suffix, output_def in outputs.items():
output_filters = output_def.get("filter")
@ -1995,14 +1743,14 @@ class OverscanCrop:
relative_source_regex = re.compile(r"%([\+\-])")
def __init__(
self, input_width, input_height, string_value, overscal_color=None
self, input_width, input_height, string_value, overscan_color=None
):
# Make sure that is not None
string_value = string_value or ""
self.input_width = input_width
self.input_height = input_height
self.overscal_color = overscal_color
self.overscan_color = overscan_color
width, height = self._convert_string_to_values(string_value)
self._width_value = width
@ -2058,20 +1806,20 @@ class OverscanCrop:
elif width >= self.input_width and height >= self.input_height:
output.append(
"pad={}:{}:(iw-ow)/2:(ih-oh)/2:{}".format(
width, height, self.overscal_color
width, height, self.overscan_color
)
)
elif width > self.input_width and height < self.input_height:
output.append("crop=iw:{}".format(height))
output.append("pad={}:ih:(iw-ow)/2:(ih-oh)/2:{}".format(
width, self.overscal_color
width, self.overscan_color
))
elif width < self.input_width and height > self.input_height:
output.append("crop={}:ih".format(width))
output.append("pad=iw:{}:(iw-ow)/2:(ih-oh)/2:{}".format(
height, self.overscal_color
height, self.overscan_color
))
return output

View file

@ -24,7 +24,10 @@ from openpype.client import (
get_version_by_name,
)
from openpype.lib import source_hash
from openpype.lib.file_transaction import FileTransaction
from openpype.lib.file_transaction import (
FileTransaction,
DuplicateDestinationError
)
from openpype.pipeline.publish import (
KnownPublishError,
get_publish_template_name,
@ -175,9 +178,18 @@ class IntegrateAsset(pyblish.api.InstancePlugin):
).format(instance.data["family"]))
return
file_transactions = FileTransaction(log=self.log)
file_transactions = FileTransaction(log=self.log,
# Enforce unique transfers
allow_queue_replacements=False)
try:
self.register(instance, file_transactions, filtered_repres)
except DuplicateDestinationError as exc:
# Raise DuplicateDestinationError as KnownPublishError
# and rollback the transactions
file_transactions.rollback()
six.reraise(KnownPublishError,
KnownPublishError(exc),
sys.exc_info()[2])
except Exception:
# clean destination
# todo: preferably we'd also rollback *any* changes to the database

View file

@ -60,6 +60,8 @@ class PreIntegrateThumbnails(pyblish.api.InstancePlugin):
if not found_profile:
return
thumbnail_repre.setdefault("tags", [])
if not found_profile["integrate_thumbnail"]:
if "delete" not in thumbnail_repre["tags"]:
thumbnail_repre["tags"].append("delete")

View file

@ -345,12 +345,6 @@ class ModifiedBurnins(ffmpeg_burnins.Burnins):
"stderr": subprocess.PIPE,
"shell": True,
}
if platform.system().lower() == "windows":
kwargs["creationflags"] = (
subprocess.CREATE_NEW_PROCESS_GROUP
| getattr(subprocess, "DETACHED_PROCESS", 0)
| getattr(subprocess, "CREATE_NO_WINDOW", 0)
)
proc = subprocess.Popen(command, **kwargs)
_stdout, _stderr = proc.communicate()

View file

@ -0,0 +1,3 @@
{
"only_available": false
}

View file

@ -7,7 +7,12 @@
"publish": {
"IntegrateKitsuNote": {
"set_status_note": false,
"note_status_shortname": "wfa"
"note_status_shortname": "wfa",
"status_conditions": [],
"custom_comment_template": {
"enabled": false,
"comment_template": "{comment}\n\n| | |\n|--|--|\n| version| `{version}` |\n| family | `{family}` |\n| name | `{name}` |"
}
}
}
}

View file

@ -82,6 +82,10 @@
"type": "schema",
"name": "schema_project_slack"
},
{
"type": "schema",
"name": "schema_project_applications"
},
{
"type": "schema",
"name": "schema_project_max"

View file

@ -0,0 +1,14 @@
{
"type": "dict",
"key": "applications",
"label": "Applications",
"collapsible": true,
"is_file": true,
"children": [
{
"type": "boolean",
"key": "only_available",
"label": "Show only available applications"
}
]
}

View file

@ -46,12 +46,62 @@
{
"type": "boolean",
"key": "set_status_note",
"label": "Set status on note"
"label": "Set status with note"
},
{
"type": "text",
"key": "note_status_shortname",
"label": "Note shortname"
},
{
"type": "list",
"key": "status_conditions",
"label": "Status conditions",
"object_type": {
"type": "dict",
"key": "conditions_dict",
"children": [
{
"type": "enum",
"key": "condition",
"label": "Condition",
"enum_items": [
{"equal": "Equal"},
{"not_equal": "Not equal"}
]
},
{
"type": "text",
"key": "short_name",
"label": "Short name"
}
]
},
"label": "Status shortname"
},
{
"type": "dict",
"collapsible": true,
"checkbox_key": "enabled",
"key": "custom_comment_template",
"label": "Custom Comment Template",
"children": [
{
"type": "boolean",
"key": "enabled",
"label": "Enabled"
},
{
"type": "label",
"label": "Kitsu supports markdown and here you can create a custom comment template.<br/>You can use data from your publishing instance's data."
},
{
"key": "comment_template",
"type": "text",
"multiline": true,
"label": "Custom comment"
}
]
}
]
}

View file

@ -19,6 +19,7 @@ from openpype.lib.applications import (
CUSTOM_LAUNCH_APP_GROUPS,
ApplicationManager
)
from openpype.settings import get_project_settings
from openpype.pipeline import discover_launcher_actions
from openpype.tools.utils.lib import (
DynamicQThread,
@ -94,6 +95,8 @@ class ActionModel(QtGui.QStandardItemModel):
if not project_doc:
return actions
project_settings = get_project_settings(project_name)
only_available = project_settings["applications"]["only_available"]
self.application_manager.refresh()
for app_def in project_doc["config"]["apps"]:
app_name = app_def["name"]
@ -104,6 +107,9 @@ class ActionModel(QtGui.QStandardItemModel):
if app.group.name in CUSTOM_LAUNCH_APP_GROUPS:
continue
if only_available and not app.find_executable():
continue
# Get from app definition, if not there from app in project
action = type(
"app_{}".format(app_name),

View file

@ -1,4 +1,4 @@
from qtpy import QtCore
from qtpy import QtCore, QtGui
# ID of context item in instance view
CONTEXT_ID = "context"
@ -26,6 +26,9 @@ GROUP_ROLE = QtCore.Qt.UserRole + 7
CONVERTER_IDENTIFIER_ROLE = QtCore.Qt.UserRole + 8
CREATOR_SORT_ROLE = QtCore.Qt.UserRole + 9
ResetKeySequence = QtGui.QKeySequence(
QtCore.Qt.ControlModifier | QtCore.Qt.Key_R
)
__all__ = (
"CONTEXT_ID",

View file

@ -6,7 +6,7 @@ import collections
import uuid
import tempfile
import shutil
from abc import ABCMeta, abstractmethod, abstractproperty
from abc import ABCMeta, abstractmethod
import six
import pyblish.api
@ -964,7 +964,8 @@ class AbstractPublisherController(object):
access objects directly but by using wrappers that can be serialized.
"""
@abstractproperty
@property
@abstractmethod
def log(self):
"""Controller's logger object.
@ -974,13 +975,15 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def event_system(self):
"""Inner event system for publisher controller."""
pass
@abstractproperty
@property
@abstractmethod
def project_name(self):
"""Current context project name.
@ -990,7 +993,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def current_asset_name(self):
"""Current context asset name.
@ -1000,7 +1004,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def current_task_name(self):
"""Current context task name.
@ -1010,7 +1015,21 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def host_context_has_changed(self):
"""Host context changed after last reset.
'CreateContext' has this option available using 'context_has_changed'.
Returns:
bool: Context has changed.
"""
pass
@property
@abstractmethod
def host_is_valid(self):
"""Host is valid for creation part.
@ -1023,7 +1042,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def instances(self):
"""Collected/created instances.
@ -1134,7 +1154,13 @@ class AbstractPublisherController(object):
@abstractmethod
def save_changes(self):
"""Save changes in create context."""
"""Save changes in create context.
Save can crash because of unexpected errors.
Returns:
bool: Save was successful.
"""
pass
@ -1145,7 +1171,19 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def publish_has_started(self):
"""Has publishing finished.
Returns:
bool: If publishing finished and all plugins were iterated.
"""
pass
@property
@abstractmethod
def publish_has_finished(self):
"""Has publishing finished.
@ -1155,7 +1193,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def publish_is_running(self):
"""Publishing is running right now.
@ -1165,7 +1204,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def publish_has_validated(self):
"""Publish validation passed.
@ -1175,7 +1215,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def publish_has_crashed(self):
"""Publishing crashed for any reason.
@ -1185,7 +1226,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def publish_has_validation_errors(self):
"""During validation happened at least one validation error.
@ -1195,7 +1237,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def publish_max_progress(self):
"""Get maximum possible progress number.
@ -1205,7 +1248,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def publish_progress(self):
"""Current progress number.
@ -1215,7 +1259,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def publish_error_msg(self):
"""Current error message which cause fail of publishing.
@ -1267,7 +1312,8 @@ class AbstractPublisherController(object):
pass
@abstractproperty
@property
@abstractmethod
def convertor_items(self):
pass
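The repeated replacement of `abstractproperty` in this file follows the standard recommendation: `abc.abstractproperty` has been deprecated since Python 3.3, and stacking `@property` on top of `@abstractmethod` is the equivalent, supported spelling. A minimal illustration:

```python
from abc import ABC, abstractmethod


class ExampleController(ABC):
    # Equivalent to the removed 'abstractproperty': subclasses must
    # override 'project_name' before they can be instantiated.
    @property
    @abstractmethod
    def project_name(self):
        """Current context project name."""
```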
@ -1356,6 +1402,7 @@ class BasePublisherController(AbstractPublisherController):
self._publish_has_validation_errors = False
self._publish_has_crashed = False
# All publish plugins are processed
self._publish_has_started = False
self._publish_has_finished = False
self._publish_max_progress = 0
self._publish_progress = 0
@ -1386,7 +1433,8 @@ class BasePublisherController(AbstractPublisherController):
"show.card.message" - Show card message request (UI related).
"instances.refresh.finished" - Instances are refreshed.
"plugins.refresh.finished" - Plugins refreshed.
"publish.reset.finished" - Publish context reset finished.
"publish.reset.finished" - Reset finished.
"controller.reset.started" - Controller reset started.
"controller.reset.finished" - Controller reset finished.
"publish.process.started" - Publishing started. Can be started from
paused state.
@ -1425,7 +1473,16 @@ class BasePublisherController(AbstractPublisherController):
def _set_host_is_valid(self, value):
if self._host_is_valid != value:
self._host_is_valid = value
self._emit_event("publish.host_is_valid.changed", {"value": value})
self._emit_event(
"publish.host_is_valid.changed", {"value": value}
)
def _get_publish_has_started(self):
return self._publish_has_started
def _set_publish_has_started(self, value):
if value != self._publish_has_started:
self._publish_has_started = value
def _get_publish_has_finished(self):
return self._publish_has_finished
@ -1449,7 +1506,9 @@ class BasePublisherController(AbstractPublisherController):
def _set_publish_has_validated(self, value):
if self._publish_has_validated != value:
self._publish_has_validated = value
self._emit_event("publish.has_validated.changed", {"value": value})
self._emit_event(
"publish.has_validated.changed", {"value": value}
)
def _get_publish_has_crashed(self):
return self._publish_has_crashed
@ -1497,6 +1556,9 @@ class BasePublisherController(AbstractPublisherController):
host_is_valid = property(
_get_host_is_valid, _set_host_is_valid
)
publish_has_started = property(
_get_publish_has_started, _set_publish_has_started
)
publish_has_finished = property(
_get_publish_has_finished, _set_publish_has_finished
)
@ -1526,6 +1588,7 @@ class BasePublisherController(AbstractPublisherController):
"""Reset most of attributes that can be reset."""
self.publish_is_running = False
self.publish_has_started = False
self.publish_has_validated = False
self.publish_has_crashed = False
self.publish_has_validation_errors = False
@ -1645,10 +1708,7 @@ class PublisherController(BasePublisherController):
str: Project name.
"""
if not hasattr(self._host, "get_current_context"):
return legacy_io.active_project()
return self._host.get_current_context()["project_name"]
return self._create_context.get_current_project_name()
@property
def current_asset_name(self):
@ -1658,10 +1718,7 @@ class PublisherController(BasePublisherController):
Union[str, None]: Asset name or None if asset is not set.
"""
if not hasattr(self._host, "get_current_context"):
return legacy_io.Session["AVALON_ASSET"]
return self._host.get_current_context()["asset_name"]
return self._create_context.get_current_asset_name()
@property
def current_task_name(self):
@ -1671,10 +1728,11 @@ class PublisherController(BasePublisherController):
Union[str, None]: Task name or None if task is not set.
"""
if not hasattr(self._host, "get_current_context"):
return legacy_io.Session["AVALON_TASK"]
return self._create_context.get_current_task_name()
return self._host.get_current_context()["task_name"]
@property
def host_context_has_changed(self):
return self._create_context.context_has_changed
@property
def instances(self):
@ -1751,6 +1809,8 @@ class PublisherController(BasePublisherController):
"""Reset everything related to creation and publishing."""
self.stop_publish()
self._emit_event("controller.reset.started")
self.host_is_valid = self._create_context.host_is_valid
self._create_context.reset_preparation()
@ -1992,7 +2052,15 @@ class PublisherController(BasePublisherController):
)
def trigger_convertor_items(self, convertor_identifiers):
self.save_changes()
"""Trigger legacy item convertors.
This functionality requires saving and resetting the CreateContext. The
reset is needed so Creators can collect the converted items.
Args:
convertor_identifiers (list[str]): Identifiers of convertor
plugins.
"""
success = True
try:
@ -2039,13 +2107,33 @@ class PublisherController(BasePublisherController):
self._on_create_instance_change()
return success
def save_changes(self):
"""Save changes happened during creation."""
def save_changes(self, show_message=True):
"""Save changes happened during creation.
Trigger a save of changes using the host API. This method does not
validate anything; any required checks (e.g. a context check using
'host_context_has_changed') must be done before calling it, so the
user can be given an actionable response.
Args:
show_message (bool): Show message that changes were
saved successfully.
Returns:
bool: Save of changes was successful.
"""
if not self._create_context.host_is_valid:
return
# TODO remove
# Fake success save when host is not valid for CreateContext
# this is for testing as experimental feature
return True
try:
self._create_context.save_changes()
if show_message:
self.emit_card_message("Saved changes..")
return True
except CreatorsOperationFailed as exc:
self._emit_event(
@ -2056,16 +2144,17 @@ class PublisherController(BasePublisherController):
}
)
return False
def remove_instances(self, instance_ids):
"""Remove instances based on instance ids.
Args:
instance_ids (List[str]): List of instance ids to remove.
"""
# QUESTION Expect that instances are really removed? In that case save
# reset is not required and save changes too.
self.save_changes()
# QUESTION Expect that instances are really removed? In that case reset
# is not required.
self._remove_instances_from_context(instance_ids)
self._on_create_instance_change()
@ -2136,12 +2225,22 @@ class PublisherController(BasePublisherController):
self._publish_comment_is_set = True
def publish(self):
"""Run publishing."""
"""Run publishing.
Make sure all changes are saved before this method is called (call
'save_changes' and check its result).
"""
self._publish_up_validation = False
self._start_publish()
def validate(self):
"""Run publishing and stop after Validation."""
"""Run publishing and stop after Validation.
Make sure all changes are saved before this method is called (call
'save_changes' and check its result).
"""
if self.publish_has_validated:
return
self._publish_up_validation = True
@ -2152,10 +2251,8 @@ class PublisherController(BasePublisherController):
if self.publish_is_running:
return
# Make sure changes are saved
self.save_changes()
self.publish_is_running = True
self.publish_has_started = True
self._emit_event("publish.process.started")

View file

@ -4,8 +4,9 @@ from .icons import (
get_icon
)
from .widgets import (
StopBtn,
SaveBtn,
ResetBtn,
StopBtn,
ValidateBtn,
PublishBtn,
CreateNextPageOverlay,
@ -25,8 +26,9 @@ __all__ = (
"get_pixmap",
"get_icon",
"StopBtn",
"SaveBtn",
"ResetBtn",
"StopBtn",
"ValidateBtn",
"PublishBtn",
"CreateNextPageOverlay",

View file

@ -164,6 +164,11 @@ class BaseGroupWidget(QtWidgets.QWidget):
def _on_widget_selection(self, instance_id, group_id, selection_type):
self.selected.emit(instance_id, group_id, selection_type)
def set_active_toggle_enabled(self, enabled):
for widget in self._widgets_by_id.values():
if isinstance(widget, InstanceCardWidget):
widget.set_active_toggle_enabled(enabled)
class ConvertorItemsGroupWidget(BaseGroupWidget):
def update_items(self, items_by_id):
@ -437,6 +442,9 @@ class InstanceCardWidget(CardWidget):
self.update_instance_values()
def set_active_toggle_enabled(self, enabled):
self._active_checkbox.setEnabled(enabled)
def set_active(self, new_value):
"""Set instance as active."""
checkbox_value = self._active_checkbox.isChecked()
@ -551,6 +559,7 @@ class InstanceCardView(AbstractInstanceView):
self._context_widget = None
self._convertor_items_group = None
self._active_toggle_enabled = True
self._widgets_by_group = {}
self._ordered_groups = []
@ -667,6 +676,9 @@ class InstanceCardView(AbstractInstanceView):
group_widget.update_instances(
instances_by_group[group_name]
)
group_widget.set_active_toggle_enabled(
self._active_toggle_enabled
)
self._update_ordered_group_names()
@ -1091,3 +1103,10 @@ class InstanceCardView(AbstractInstanceView):
self._explicitly_selected_groups = selected_groups
self._explicitly_selected_instance_ids = selected_instances
def set_active_toggle_enabled(self, enabled):
if self._active_toggle_enabled is enabled:
return
self._active_toggle_enabled = enabled
for group_widget in self._widgets_by_group.values():
group_widget.set_active_toggle_enabled(enabled)

Binary file not shown.


View file

@ -198,6 +198,9 @@ class InstanceListItemWidget(QtWidgets.QWidget):
self.instance["active"] = new_value
self.active_changed.emit(self.instance.id, new_value)
def set_active_toggle_enabled(self, enabled):
self._active_checkbox.setEnabled(enabled)
class ListContextWidget(QtWidgets.QFrame):
"""Context (or global attributes) widget."""
@ -302,6 +305,9 @@ class InstanceListGroupWidget(QtWidgets.QFrame):
else:
self.expand_btn.setArrowType(QtCore.Qt.RightArrow)
def set_active_toggle_enabled(self, enabled):
self.toggle_checkbox.setEnabled(enabled)
class InstanceTreeView(QtWidgets.QTreeView):
"""View showing instances and their groups."""
@ -461,6 +467,8 @@ class InstanceListView(AbstractInstanceView):
self._instance_model = instance_model
self._proxy_model = proxy_model
self._active_toggle_enabled = True
def _on_expand(self, index):
self._update_widget_expand_state(index, True)
@ -667,6 +675,9 @@ class InstanceListView(AbstractInstanceView):
widget = InstanceListItemWidget(
instance, self._instance_view
)
widget.set_active_toggle_enabled(
self._active_toggle_enabled
)
widget.active_changed.connect(self._on_active_changed)
self._instance_view.setIndexWidget(proxy_index, widget)
self._widgets_by_id[instance.id] = widget
@ -802,6 +813,9 @@ class InstanceListView(AbstractInstanceView):
proxy_index = self._proxy_model.mapFromSource(index)
group_name = group_item.data(GROUP_ROLE)
widget = InstanceListGroupWidget(group_name, self._instance_view)
widget.set_active_toggle_enabled(
self._active_toggle_enabled
)
widget.expand_changed.connect(self._on_group_expand_request)
widget.toggle_requested.connect(self._on_group_toggle_request)
self._group_widgets[group_name] = widget
@ -1051,3 +1065,16 @@ class InstanceListView(AbstractInstanceView):
QtCore.QItemSelectionModel.Select
| QtCore.QItemSelectionModel.Rows
)
def set_active_toggle_enabled(self, enabled):
if self._active_toggle_enabled is enabled:
return
self._active_toggle_enabled = enabled
for widget in self._widgets_by_id.values():
if isinstance(widget, InstanceListItemWidget):
widget.set_active_toggle_enabled(enabled)
for widget in self._group_widgets.values():
if isinstance(widget, InstanceListGroupWidget):
widget.set_active_toggle_enabled(enabled)

View file

@ -17,6 +17,7 @@ class OverviewWidget(QtWidgets.QFrame):
active_changed = QtCore.Signal()
instance_context_changed = QtCore.Signal()
create_requested = QtCore.Signal()
convert_requested = QtCore.Signal()
anim_end_value = 200
anim_duration = 200
@ -132,6 +133,9 @@ class OverviewWidget(QtWidgets.QFrame):
controller.event_system.add_callback(
"publish.process.started", self._on_publish_start
)
controller.event_system.add_callback(
"controller.reset.started", self._on_controller_reset_start
)
controller.event_system.add_callback(
"publish.reset.finished", self._on_publish_reset
)
@ -336,13 +340,31 @@ class OverviewWidget(QtWidgets.QFrame):
self.instance_context_changed.emit()
def _on_convert_requested(self):
_, _, convertor_identifiers = self.get_selected_items()
self._controller.trigger_convertor_items(convertor_identifiers)
self.convert_requested.emit()
def get_selected_items(self):
"""Selected items in current view widget.
Returns:
tuple[list[str], bool, list[str]]: Selected items. List of
instance ids, context is selected, list of selected legacy
convertor plugins.
"""
view = self._subset_views_layout.currentWidget()
return view.get_selected_items()
def get_selected_legacy_convertors(self):
"""Selected legacy convertor identifiers.
Returns:
list[str]: Selected legacy convertor identifiers.
Example: ['io.openpype.creators.houdini.legacy']
"""
_, _, convertor_identifiers = self.get_selected_items()
return convertor_identifiers
def _change_view_type(self):
idx = self._subset_views_layout.currentIndex()
new_idx = (idx + 1) % self._subset_views_layout.count()
@ -391,9 +413,19 @@ class OverviewWidget(QtWidgets.QFrame):
self._create_btn.setEnabled(False)
self._subset_attributes_wrap.setEnabled(False)
for idx in range(self._subset_views_layout.count()):
widget = self._subset_views_layout.widget(idx)
widget.set_active_toggle_enabled(False)
def _on_controller_reset_start(self):
"""Controller reset started."""
for idx in range(self._subset_views_layout.count()):
widget = self._subset_views_layout.widget(idx)
widget.set_active_toggle_enabled(True)
def _on_publish_reset(self):
"""Context in controller has been refreshed."""
"""Context in controller has been reseted."""
self._create_btn.setEnabled(True)
self._subset_attributes_wrap.setEnabled(True)

View file

@ -34,7 +34,8 @@ from .icons import (
)
from ..constants import (
VARIANT_TOOLTIP
VARIANT_TOOLTIP,
ResetKeySequence,
)
@ -198,12 +199,26 @@ class CreateBtn(PublishIconBtn):
self.setLayoutDirection(QtCore.Qt.RightToLeft)
class SaveBtn(PublishIconBtn):
"""Save context and instances information."""
def __init__(self, parent=None):
icon_path = get_icon_path("save")
super(SaveBtn, self).__init__(icon_path, parent)
self.setToolTip(
"Save changes ({})".format(
QtGui.QKeySequence(QtGui.QKeySequence.Save).toString()
)
)
class ResetBtn(PublishIconBtn):
"""Publish reset button."""
def __init__(self, parent=None):
icon_path = get_icon_path("refresh")
super(ResetBtn, self).__init__(icon_path, parent)
self.setToolTip("Refresh publishing")
self.setToolTip(
"Reset & discard changes ({})".format(ResetKeySequence.toString())
)
class StopBtn(PublishIconBtn):
@ -348,6 +363,19 @@ class AbstractInstanceView(QtWidgets.QWidget):
"{} Method 'set_selected_items' is not implemented."
).format(self.__class__.__name__))
def set_active_toggle_enabled(self, enabled):
"""Instances are disabled for changing enabled state.
Active state should stay the same until is "unset".
Args:
enabled (bool): Instance state can be changed.
"""
raise NotImplementedError((
"{} Method 'set_active_toggle_enabled' is not implemented."
).format(self.__class__.__name__))
class ClickableLineEdit(QtWidgets.QLineEdit):
"""QLineEdit capturing left mouse click.
@ -1533,7 +1561,7 @@ class SubsetAttributesWidget(QtWidgets.QWidget):
attributes Thumbnail TOP
Family Publish
Creator Publish
attributes plugin BOTTOM
attributes

View file

@ -13,6 +13,7 @@ from openpype.tools.utils import (
PixmapLabel,
)
from .constants import ResetKeySequence
from .publish_report_viewer import PublishReportViewerWidget
from .control_qt import QtPublisherController
from .widgets import (
@ -22,8 +23,9 @@ from .widgets import (
PublisherTabsWidget,
StopBtn,
SaveBtn,
ResetBtn,
StopBtn,
ValidateBtn,
PublishBtn,
@ -121,6 +123,7 @@ class PublisherWindow(QtWidgets.QDialog):
"Attach a comment to your publish"
)
save_btn = SaveBtn(footer_widget)
reset_btn = ResetBtn(footer_widget)
stop_btn = StopBtn(footer_widget)
validate_btn = ValidateBtn(footer_widget)
@ -129,6 +132,7 @@ class PublisherWindow(QtWidgets.QDialog):
footer_bottom_layout = QtWidgets.QHBoxLayout(footer_bottom_widget)
footer_bottom_layout.setContentsMargins(0, 0, 0, 0)
footer_bottom_layout.addStretch(1)
footer_bottom_layout.addWidget(save_btn, 0)
footer_bottom_layout.addWidget(reset_btn, 0)
footer_bottom_layout.addWidget(stop_btn, 0)
footer_bottom_layout.addWidget(validate_btn, 0)
@ -250,7 +254,11 @@ class PublisherWindow(QtWidgets.QDialog):
overview_widget.create_requested.connect(
self._on_create_request
)
overview_widget.convert_requested.connect(
self._on_convert_requested
)
save_btn.clicked.connect(self._on_save_clicked)
reset_btn.clicked.connect(self._on_reset_clicked)
stop_btn.clicked.connect(self._on_stop_clicked)
validate_btn.clicked.connect(self._on_validate_clicked)
@ -330,8 +338,9 @@ class PublisherWindow(QtWidgets.QDialog):
self._comment_input = comment_input
self._footer_spacer = footer_spacer
self._stop_btn = stop_btn
self._save_btn = save_btn
self._reset_btn = reset_btn
self._stop_btn = stop_btn
self._validate_btn = validate_btn
self._publish_btn = publish_btn
@ -388,7 +397,9 @@ class PublisherWindow(QtWidgets.QDialog):
def closeEvent(self, event):
self._window_is_visible = False
self._uninstall_app_event_listener()
self.save_changes()
# TODO capture changes and ask user if wants to save changes on close
if not self._controller.host_context_has_changed:
self._save_changes(False)
self._reset_on_show = True
self._controller.clear_thumbnail_temp_dir_path()
super(PublisherWindow, self).closeEvent(event)
@ -421,6 +432,19 @@ class PublisherWindow(QtWidgets.QDialog):
if event.key() == QtCore.Qt.Key_Escape:
event.accept()
return
if event.matches(QtGui.QKeySequence.Save):
if not self._controller.publish_has_started:
self._save_changes(True)
event.accept()
return
if event.matches(ResetKeySequence):
if not self.controller.publish_is_running:
self.reset()
event.accept()
return
super(PublisherWindow, self).keyPressEvent(event)
def _on_overlay_message(self, event):
@ -455,8 +479,65 @@ class PublisherWindow(QtWidgets.QDialog):
self._reset_on_show = False
self.reset()
def save_changes(self):
self._controller.save_changes()
def _checks_before_save(self, explicit_save):
"""Save of changes may trigger some issues.
Check if context did change and ask user if he is really sure the
save should happen. A dialog can be shown during this method.
Args:
explicit_save (bool): Method was called when user explicitly asked
for save. Value affects shown message.
Returns:
bool: Save can happen.
"""
if not self._controller.host_context_has_changed:
return True
title = "Host context changed"
if explicit_save:
message = (
"Context has changed since Publisher window was refreshed last"
" time.\n\nAre you sure you want to save changes?"
)
else:
message = (
"Your action requires save of changes but context has changed"
" since Publisher window was refreshed last time.\n\nAre you"
" sure you want to continue and save changes?"
)
result = QtWidgets.QMessageBox.question(
self,
title,
message,
QtWidgets.QMessageBox.Save | QtWidgets.QMessageBox.Cancel
)
return result == QtWidgets.QMessageBox.Save
def _save_changes(self, explicit_save):
"""Save changes of Creation part.
All triggers of saving changes were moved to the main window (here)
so possible save issues can be handled in one place. Checks are done so
the user does not accidentally save changes to a different file or under
a different context.
Handling it here also makes it possible to show a dialog and wait for the
user's response without interrupting the action they wanted to perform.
Args:
explicit_save (bool): Method was called when user explicitly asked
for save. Value affects shown message.
Returns:
bool: Save happened successfully.
"""
if not self._checks_before_save(explicit_save):
return False
return self._controller.save_changes()
def reset(self):
self._controller.reset()
@ -491,15 +572,18 @@ class PublisherWindow(QtWidgets.QDialog):
self._help_dialog.show()
window = self.window()
desktop = QtWidgets.QApplication.desktop()
screen_idx = desktop.screenNumber(window)
screen = desktop.screen(screen_idx)
screen_rect = screen.geometry()
if hasattr(QtWidgets.QApplication, "desktop"):
desktop = QtWidgets.QApplication.desktop()
screen_idx = desktop.screenNumber(window)
screen_geo = desktop.screenGeometry(screen_idx)
else:
screen = window.screen()
screen_geo = screen.geometry()
window_geo = window.geometry()
dialog_x = window_geo.x() + window_geo.width()
dialog_right = (dialog_x + self._help_dialog.width()) - 1
diff = dialog_right - screen_rect.right()
diff = dialog_right - screen_geo.right()
if diff > 0:
dialog_x -= diff
@ -549,6 +633,14 @@ class PublisherWindow(QtWidgets.QDialog):
def _on_create_request(self):
self._go_to_create_tab()
def _on_convert_requested(self):
if not self._save_changes(False):
return
convertor_identifiers = (
self._overview_widget.get_selected_legacy_convertors()
)
self._controller.trigger_convertor_items(convertor_identifiers)
def _set_current_tab(self, identifier):
self._tabs_widget.set_current_tab(identifier)
@ -599,8 +691,10 @@ class PublisherWindow(QtWidgets.QDialog):
self._publish_frame.setVisible(visible)
self._update_publish_frame_rect()
def _on_save_clicked(self):
self._save_changes(True)
def _on_reset_clicked(self):
self.save_changes()
self.reset()
def _on_stop_clicked(self):
@ -610,14 +704,17 @@ class PublisherWindow(QtWidgets.QDialog):
self._controller.set_comment(self._comment_input.text())
def _on_validate_clicked(self):
self._set_publish_comment()
self._controller.validate()
if self._save_changes(False):
self._set_publish_comment()
self._controller.validate()
def _on_publish_clicked(self):
self._set_publish_comment()
self._controller.publish()
if self._save_changes(False):
self._set_publish_comment()
self._controller.publish()
def _set_footer_enabled(self, enabled):
self._save_btn.setEnabled(True)
self._reset_btn.setEnabled(True)
if enabled:
self._stop_btn.setEnabled(False)

View file

@ -327,7 +327,7 @@ class InventoryModel(TreeModel):
project_name, repre_id
)
if not representation:
not_found["representation"].append(group_items)
not_found["representation"].extend(group_items)
not_found_ids.append(repre_id)
continue
@ -335,7 +335,7 @@ class InventoryModel(TreeModel):
project_name, representation["parent"]
)
if not version:
not_found["version"].append(group_items)
not_found["version"].extend(group_items)
not_found_ids.append(repre_id)
continue
@ -348,13 +348,13 @@ class InventoryModel(TreeModel):
subset = get_subset_by_id(project_name, version["parent"])
if not subset:
not_found["subset"].append(group_items)
not_found["subset"].extend(group_items)
not_found_ids.append(repre_id)
continue
asset = get_asset_by_id(project_name, subset["parent"])
if not asset:
not_found["asset"].append(group_items)
not_found["asset"].extend(group_items)
not_found_ids.append(repre_id)
continue
@ -380,11 +380,11 @@ class InventoryModel(TreeModel):
self.add_child(group_node, parent=parent)
for _group_items in group_items:
for item in group_items:
item_node = Item()
item_node["Name"] = ", ".join(
[item["objectName"] for item in _group_items]
)
item_node.update(item)
item_node["Name"] = item.get("objectName", "NO NAME")
item_node["isNotFound"] = True
self.add_child(item_node, parent=group_node)
for repre_id, group_dict in sorted(grouped.items()):

View file

@ -80,9 +80,16 @@ class SceneInventoryView(QtWidgets.QTreeView):
self.setStyleSheet("QTreeView {}")
def _build_item_menu_for_selection(self, items, menu):
# Exclude items that are "NOT FOUND" since setting versions, updating
# and removal won't work for those items.
items = [item for item in items if not item.get("isNotFound")]
if not items:
return
# An item might not have a representation, for example when an item
# is listed as "NOT FOUND"
repre_ids = {
item["representation"]
for item in items

View file

@ -247,7 +247,7 @@ class TrayPublishWindow(PublisherWindow):
def _on_project_select(self, project_name):
# TODO register project specific plugin paths
self._controller.save_changes()
self._controller.save_changes(False)
self._controller.reset_project_data_cache()
self.reset()

View file

@ -30,9 +30,11 @@ class VersionDelegate(QtWidgets.QStyledItemDelegate):
def displayText(self, value, locale):
if isinstance(value, HeroVersionType):
return lib.format_version(value, True)
assert isinstance(value, numbers.Integral), (
"Version is not integer. \"{}\" {}".format(value, str(type(value)))
)
if not isinstance(value, numbers.Integral):
# For cases where no version is resolved, e.g. "NOT FOUND" items
# whose representation does not exist in the current database
return
return lib.format_version(value)
def paint(self, painter, option, index):

View file

@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.15.3-nightly.2"
__version__ = "3.15.3-nightly.3"

View file

@ -49,9 +49,7 @@ class SplashScreen(QtWidgets.QDialog):
self.init_ui()
def was_proc_successful(self) -> bool:
if self.thread_return_code == 0:
return True
return False
return self.thread_return_code == 0
def start_thread(self, q_thread: QtCore.QThread):
"""Saves the reference to this thread and starts it.
@ -80,8 +78,14 @@ class SplashScreen(QtWidgets.QDialog):
"""
self.thread_return_code = 0
self.q_thread.quit()
if not self.q_thread.wait(5000):
raise RuntimeError("Failed to quit the QThread! "
"The deadline has been reached! The thread "
"has not finished its execution!")
self.close()
@QtCore.Slot()
def toggle_log(self):
if self.is_log_visible:
@ -256,3 +260,4 @@ class SplashScreen(QtWidgets.QDialog):
self.close_btn.show()
self.thread_return_code = return_code
self.q_thread.exit(return_code)
self.q_thread.wait()

View file

@ -71,12 +71,30 @@ class TestDeadlinePublishInNuke(NukeDeadlinePublishTestClass):
failures.append(
DBAssert.count_of_types(dbcon, "representation", 4))
additional_args = {"context.subset": "workfileTest_task",
"context.ext": "nk"}
failures.append(
DBAssert.count_of_types(dbcon, "representation", 1,
additional_args=additional_args))
additional_args = {"context.subset": "renderTest_taskMain",
"context.ext": "exr"}
failures.append(
DBAssert.count_of_types(dbcon, "representation", 1,
additional_args=additional_args))
additional_args = {"context.subset": "renderTest_taskMain",
"name": "thumbnail"}
failures.append(
DBAssert.count_of_types(dbcon, "representation", 1,
additional_args=additional_args))
additional_args = {"context.subset": "renderTest_taskMain",
"name": "h264_mov"}
failures.append(
DBAssert.count_of_types(dbcon, "representation", 1,
additional_args=additional_args))
assert not any(failures)

View file

@ -15,7 +15,7 @@ class TestPublishInNuke(NukeLocalPublishTestClass):
!!!
It expects modified path in WriteNode,
use '[python {nuke.script_directory()}]' instead of regular root
dir (eg. instead of `c:/projects/test_project/test_asset/test_task`).
dir (eg. instead of `c:/projects`).
Access file path by selecting WriteNode group, CTRL+Enter, update file
input
!!!
@ -70,12 +70,30 @@ class TestPublishInNuke(NukeLocalPublishTestClass):
failures.append(
DBAssert.count_of_types(dbcon, "representation", 4))
additional_args = {"context.subset": "workfileTest_task",
"context.ext": "nk"}
failures.append(
DBAssert.count_of_types(dbcon, "representation", 1,
additional_args=additional_args))
additional_args = {"context.subset": "renderTest_taskMain",
"context.ext": "exr"}
failures.append(
DBAssert.count_of_types(dbcon, "representation", 1,
additional_args=additional_args))
additional_args = {"context.subset": "renderTest_taskMain",
"name": "thumbnail"}
failures.append(
DBAssert.count_of_types(dbcon, "representation", 1,
additional_args=additional_args))
additional_args = {"context.subset": "renderTest_taskMain",
"name": "h264_mov"}
failures.append(
DBAssert.count_of_types(dbcon, "representation", 1,
additional_args=additional_args))
assert not any(failures)

Binary file not shown.


View file

@ -38,6 +38,18 @@ This functionality cannot deal with all cases and is not error proof, some inter
openpype_console module kitsu push-to-zou -l me@domain.ext -p my_password
```
## Integrate Kitsu Note
Task status can be automatically set during publish thanks to `Integrate Kitsu Note`. This feature can be configured in:
`Admin -> Studio Settings -> Project Settings -> Kitsu -> Integrate Kitsu Note`.
The following settings are available:
- `Set status with note` -> Turns this integrator on and off.
- `Note shortname` -> The short name of the status that should be set automatically (case sensitive).
- `Status conditions` -> Conditions that need to be met for the Kitsu status to be changed. You can add as many conditions as you like. Each condition has two fields: `Condition` (whether the current status should be equal or not equal to the condition status) and `Short name` (the Kitsu short name of the condition status). See the example below the screenshot.
![Integrate Kitsu Note project settings](assets/integrate_kitsu_note_settings.png)
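For example, to change the status only for tasks that are not already marked as done, add a single condition with `Condition` set to `Not equal` and `Short name` set to the short name of your "Done" status (the exact short names depend on your Kitsu instance).

The same settings group also offers a `Custom Comment Template`. Kitsu supports markdown, so the comment attached to the publish can be built from the publishing instance's data, for example:

```
{comment}

| | |
|--|--|
| version | `{version}` |
| family | `{family}` |
| name | `{name}` |
```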
## Q&A
### Is it safe to rename an entity from Kitsu?
Absolutely! Entities are linked by their unique IDs between the two databases.