Merge remote-tracking branch 'origin/enhancement/allow_nested_settings_templates' into feature/OP-5207_Global-host-color-management-activation

This commit is contained in:
Jakub Jezek 2023-05-24 14:23:23 +02:00
commit e0f7e106b8
No known key found for this signature in database
GPG key ID: 730D7C02726179A7
49 changed files with 679 additions and 188 deletions

View file

@ -35,6 +35,7 @@ body:
label: Version
description: What version are you running? Look to OpenPype Tray
options:
- 3.15.8
- 3.15.8-nightly.3
- 3.15.8-nightly.2
- 3.15.8-nightly.1
@ -134,7 +135,6 @@ body:
- 3.14.2-nightly.4
- 3.14.2-nightly.3
- 3.14.2-nightly.2
- 3.14.2-nightly.1
validations:
required: true
- type: dropdown

View file

@ -1,6 +1,304 @@
# Changelog
## [3.15.8](https://github.com/ynput/OpenPype/tree/3.15.8)
[Full Changelog](https://github.com/ynput/OpenPype/compare/3.15.7...3.15.8)
### **🆕 New features**
<details>
<summary>Publisher: Show instances in report page <a href="https://github.com/ynput/OpenPype/pull/4915">#4915</a></summary>
Show publish instances in the report page. Also added a basic log view with logs grouped by instance. The validation error detail now has two columns, one with error details and a second with logs. The crashed state shows fast access to report action buttons. Success shows only logs. The publish frame is shrunk automatically when publishing stops.
___
</details>
<details>
<summary>Fusion - Loader plugins updates <a href="https://github.com/ynput/OpenPype/pull/4920">#4920</a></summary>
Updates to some Fusion loader plugins: the sequence loader can now load footage from the image and online families; the FBX loader can now import all formats Fusion's FBX node can read; and the workfile loader can now import the content of another workfile into your current comp.
___
</details>
<details>
<summary>Fusion: deadline farm rendering <a href="https://github.com/ynput/OpenPype/pull/4955">#4955</a></summary>
Enables Fusion for Deadline farm rendering.
___
</details>
<details>
<summary>AfterEffects: set frame range and resolution <a href="https://github.com/ynput/OpenPype/pull/4983">#4983</a></summary>
Frame information (frame start, duration, fps) and resolution (width and height) are applied to the selected composition from the asset management system (Ftrack or DB) automatically when a publish instance is created. It is also possible to explicitly propagate both values from the DB to the selected composition via newly added menu buttons.
___
</details>
<details>
<summary>Publish: Enhance automated publish plugin settings <a href="https://github.com/ynput/OpenPype/pull/4986">#4986</a></summary>
Added a plugin option to define the settings category where to look for a plugin's settings, and added public helper functions `get_plugin_settings` and `apply_plugin_settings_automatically` to apply them.
___
</details>
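The "apply settings automatically" idea above can be sketched in a few lines. This is a hypothetical, minimal sketch of the pattern only; the names mirror the helpers mentioned in the entry, but the real OpenPype signatures and settings layout may differ.

```python
# Minimal sketch: look up a plugin's settings under its declared
# settings category, then copy each key onto the plugin class.
def get_plugin_settings(plugin_cls, project_settings):
    category = getattr(plugin_cls, "settings_category", None)
    if not category:
        return {}
    return (
        project_settings
        .get(category, {})
        .get("publish", {})
        .get(plugin_cls.__name__, {})
    )


def apply_plugin_settings_automatically(plugin_cls, settings):
    for key, value in settings.items():
        setattr(plugin_cls, key, value)


class MyValidator:
    # Hypothetical plugin declaring where its settings live
    settings_category = "my_host"
    enabled = True
    optional = False


project_settings = {
    "my_host": {"publish": {"MyValidator": {"enabled": False}}}
}
apply_plugin_settings_automatically(
    MyValidator, get_plugin_settings(MyValidator, project_settings)
)
print(MyValidator.enabled)  # False
```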
### **🚀 Enhancements**
<details>
<summary>Load Rig References - Change Rig to Animation in Animation instance <a href="https://github.com/ynput/OpenPype/pull/4877">#4877</a></summary>
We are using the template builder to build an animation scene. All the rig placeholders are imported correctly, but the automatically created animation instances retain the rig family in their names and subsets. In our example, we need animationMain instead of rigMain, because this name will be used in following steps such as lighting. This is not a template builder problem; even if we load a rig as a reference, the result is the same. Since we are in the animation instance, it makes more sense to have animation instead of rig in the name. The naming is fine if we use create from the OpenPype menu.
___
</details>
<details>
<summary>Enhancement: Resolve prelaunch code refactoring and update defaults <a href="https://github.com/ynput/OpenPype/pull/4916">#4916</a></summary>
The main reason for this PR is wrong default settings in `openpype/settings/defaults/system_settings/applications.json` for the Resolve host: the `bin` folder should not be part of the macOS and Linux `RESOLVE_PYTHON3_PATH` variable. The rest of this PR is code cleanup for the Resolve prelaunch hook to simplify further development. Also added a .gitignore entry for VS Code workspace files.
___
</details>
<details>
<summary>Unreal: 🚚 move Unreal plugin to separate repository <a href="https://github.com/ynput/OpenPype/pull/4980">#4980</a></summary>
To support the Epic Marketplace, the AYON Unreal integration plugins have to move to a separate repository. This replaces the current files with a git submodule, so the change should have no functional impact. The new repository lives here: https://github.com/ynput/ayon-unreal-plugin
___
</details>
<details>
<summary>General: Lib code cleanup <a href="https://github.com/ynput/OpenPype/pull/5003">#5003</a></summary>
Small cleanup in lib files in openpype.
___
</details>
<details>
<summary>Allow to open with djv by extension instead of representation name <a href="https://github.com/ynput/OpenPype/pull/5004">#5004</a></summary>
Filter open in djv action by extension instead of representation.
___
</details>
<details>
<summary>DJV open action `extensions` as `set` <a href="https://github.com/ynput/OpenPype/pull/5005">#5005</a></summary>
Change `extensions` attribute to `set`.
___
</details>
<details>
<summary>Nuke: extract thumbnail with multiple reposition nodes <a href="https://github.com/ynput/OpenPype/pull/5011">#5011</a></summary>
Added support for multiple reposition nodes.
___
</details>
<details>
<summary>Enhancement: Improve logging levels and messages for artist facing publish reports <a href="https://github.com/ynput/OpenPype/pull/5018">#5018</a></summary>
Tweak the logging levels and messages to try and only show those logs that an artist should see and could understand. Move anything that's slightly more involved into a "debug" message instead.
___
</details>
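The logging-level enhancement above boils down to one rule: artist-facing messages stay at INFO, developer detail moves to DEBUG. A minimal, self-contained sketch (the handler and messages are illustrative, not OpenPype code):

```python
import logging

# Collect records in memory so we can inspect what an artist-facing
# report (INFO and above) would actually contain.
records = []


class ListHandler(logging.Handler):
    def emit(self, record):
        records.append(record)


log = logging.getLogger("publish.example")
log.setLevel(logging.INFO)
log.addHandler(ListHandler())

# Artist-facing message: keep at INFO so it shows in the report.
log.info("Saving current file: %s", "shot010_v003.ma")
# Developer detail: demote to DEBUG so artists are not flooded.
log.debug("Collected inputs: %s", ["repre_id_1"])

print([r.levelname for r in records])  # ['INFO']
```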
### **🐛 Bug fixes**
<details>
<summary>Bugfix/frame variable fix <a href="https://github.com/ynput/OpenPype/pull/4978">#4978</a></summary>
Renamed variables to match OpenPype terminology to reduce confusion and add consistency.
___
</details>
<details>
<summary>Global: plugins cleanup plugin will leave beauty rendered files <a href="https://github.com/ynput/OpenPype/pull/4790">#4790</a></summary>
Attempt to explicitly mark more files for cleanup in the intermediate `renders` folder in the work area for farm jobs.
___
</details>
<details>
<summary>Fix: Download last workfile doesn't work if not already downloaded <a href="https://github.com/ynput/OpenPype/pull/4942">#4942</a></summary>
An optimization condition was breaking the feature: if the published workfile was not already downloaded, it would not be downloaded.
___
</details>
<details>
<summary>Unreal: Fix transform when loading layout to match existing assets <a href="https://github.com/ynput/OpenPype/pull/4972">#4972</a></summary>
Fixed transform when loading layout to match existing assets.
___
</details>
<details>
<summary>fix the bug of fbx loaders in Max <a href="https://github.com/ynput/OpenPype/pull/4977">#4977</a></summary>
Fixes the FBX loaders in Max not being able to parent to the CON instances while importing cameras (and models) published from other DCCs such as Maya.
___
</details>
<details>
<summary>AfterEffects: allow returning stub with not saved workfile <a href="https://github.com/ynput/OpenPype/pull/4984">#4984</a></summary>
Allows using the Workfile app to save the first, still empty, workfile.
___
</details>
<details>
<summary>Blender: Fix Alembic loading <a href="https://github.com/ynput/OpenPype/pull/4985">#4985</a></summary>
Fixed problem occurring when trying to load an Alembic model in Blender.
___
</details>
<details>
<summary>Unreal: Addon Py2 compatibility <a href="https://github.com/ynput/OpenPype/pull/4994">#4994</a></summary>
Fixed Python 2 compatibility of the Unreal addon.
___
</details>
<details>
<summary>Nuke: fixed missing files key in representation <a href="https://github.com/ynput/OpenPype/pull/4999">#4999</a></summary>
Fixed an issue with missing keys when the rendering target is set to existing frames. The instance has to be evaluated in validation for missing files.
___
</details>
<details>
<summary>Unreal: Fix the frame range when loading camera <a href="https://github.com/ynput/OpenPype/pull/5002">#5002</a></summary>
The keyframes of the camera, when loaded, were not using the correct frame range.
___
</details>
<details>
<summary>Fusion: fixing frame range targeting <a href="https://github.com/ynput/OpenPype/pull/5013">#5013</a></summary>
Frame range targeting on rendering instances now follows the configured options.
___
</details>
<details>
<summary>Deadline: fix selection from multiple webservices <a href="https://github.com/ynput/OpenPype/pull/5015">#5015</a></summary>
Multiple different Deadline webservices can be configured. They must first be configured in System Settings, then they can be selected per project in `project_settings/deadline/deadline_servers`. Only a single webservice can be the target of a publish, though.
___
</details>
### **Merged pull requests**
<details>
<summary>3dsmax: Refactored publish plugins to use proper implementation of pymxs <a href="https://github.com/ynput/OpenPype/pull/4988">#4988</a></summary>
___
</details>
## [3.15.7](https://github.com/ynput/OpenPype/tree/3.15.7)

View file

@ -113,4 +113,4 @@ class CollectUpstreamInputs(pyblish.api.InstancePlugin):
inputs = [c["representation"] for c in containers]
instance.data["inputRepresentations"] = inputs
self.log.info("Collected inputs: %s" % inputs)
self.log.debug("Collected inputs: %s" % inputs)

View file

@ -17,5 +17,5 @@ class FusionSaveComp(pyblish.api.ContextPlugin):
current = comp.GetAttrs().get("COMPS_FileName", "")
assert context.data['currentFile'] == current
self.log.info("Saving current file..")
self.log.info("Saving current file: {}".format(current))
comp.Save()

View file

@ -8,7 +8,6 @@ import pyblish.api
from openpype.hosts.houdini.api import lib
class CollectFrames(pyblish.api.InstancePlugin):
"""Collect all frames which would be saved from the ROP nodes"""
@ -34,8 +33,10 @@ class CollectFrames(pyblish.api.InstancePlugin):
self.log.warning("Using current frame: {}".format(hou.frame()))
output = output_parm.eval()
_, ext = lib.splitext(output,
allowed_multidot_extensions=[".ass.gz"])
_, ext = lib.splitext(
output,
allowed_multidot_extensions=[".ass.gz"]
)
file_name = os.path.basename(output)
result = file_name

View file

@ -117,4 +117,4 @@ class CollectUpstreamInputs(pyblish.api.InstancePlugin):
inputs = [c["representation"] for c in containers]
instance.data["inputRepresentations"] = inputs
self.log.info("Collected inputs: %s" % inputs)
self.log.debug("Collected inputs: %s" % inputs)

View file

@ -55,7 +55,9 @@ class CollectInstances(pyblish.api.ContextPlugin):
has_family = node.evalParm("family")
assert has_family, "'%s' is missing 'family'" % node.name()
self.log.info("processing {}".format(node))
self.log.info(
"Processing legacy instance node {}".format(node.path())
)
data = lib.read(node)
# Check bypass state and reverse

View file

@ -32,5 +32,4 @@ class CollectWorkfile(pyblish.api.InstancePlugin):
"stagingDir": folder,
}]
self.log.info('Collected instance: {}'.format(file))
self.log.info('staging Dir: {}'.format(folder))
self.log.debug('Collected workfile instance: {}'.format(file))

View file

@ -20,7 +20,7 @@ class SaveCurrentScene(pyblish.api.ContextPlugin):
)
if host.has_unsaved_changes():
self.log.info("Saving current file {}...".format(current_file))
self.log.info("Saving current file: {}".format(current_file))
host.save_workfile(current_file)
else:
self.log.debug("No unsaved changes, skipping file save..")

View file

@ -28,18 +28,37 @@ class ValidateWorkfilePaths(
if not self.is_active(instance.data):
return
invalid = self.get_invalid()
self.log.info(
"node types to check: {}".format(", ".join(self.node_types)))
self.log.info(
"prohibited vars: {}".format(", ".join(self.prohibited_vars))
self.log.debug(
"Checking node types: {}".format(", ".join(self.node_types)))
self.log.debug(
"Searching prohibited vars: {}".format(
", ".join(self.prohibited_vars)
)
)
if invalid:
for param in invalid:
self.log.error(
"{}: {}".format(param.path(), param.unexpandedString()))
raise PublishValidationError(
"Invalid paths found", title=self.label)
if invalid:
all_container_vars = set()
for param in invalid:
value = param.unexpandedString()
contained_vars = [
var for var in self.prohibited_vars
if var in value
]
all_container_vars.update(contained_vars)
self.log.error(
"Parm {} contains prohibited vars {}: {}".format(
param.path(),
", ".join(contained_vars),
value)
)
message = (
"Prohibited vars {} found in parameter values".format(
", ".join(all_container_vars)
)
)
raise PublishValidationError(message, title=self.label)
@classmethod
def get_invalid(cls):
@ -63,7 +82,7 @@ class ValidateWorkfilePaths(
def repair(cls, instance):
invalid = cls.get_invalid()
for param in invalid:
cls.log.info("processing: {}".format(param.path()))
cls.log.info("Processing: {}".format(param.path()))
cls.log.info("Replacing {} for {}".format(
param.unexpandedString(),
hou.text.expandString(param.unexpandedString())))

View file

@ -166,7 +166,7 @@ class CollectUpstreamInputs(pyblish.api.InstancePlugin):
inputs = [c["representation"] for c in containers]
instance.data["inputRepresentations"] = inputs
self.log.info("Collected inputs: %s" % inputs)
self.log.debug("Collected inputs: %s" % inputs)
def _collect_renderlayer_inputs(self, scene_containers, instance):
"""Collects inputs from nodes in renderlayer, incl. shaders + camera"""

View file

@ -31,5 +31,5 @@ class SaveCurrentScene(pyblish.api.ContextPlugin):
# remove lockfile before saving
if is_workfile_lock_enabled("maya", project_name, project_settings):
remove_workfile_lock(current)
self.log.info("Saving current file..")
self.log.info("Saving current file: {}".format(current))
cmds.file(save=True, force=True)

View file

@ -5,6 +5,8 @@ import pyblish.api
from openpype.pipeline import publish
from openpype.hosts.nuke import api as napi
from openpype.hosts.nuke.api.lib import set_node_knobs_from_settings
if sys.version_info[0] >= 3:
unicode = str
@ -28,7 +30,7 @@ class ExtractThumbnail(publish.Extractor):
bake_viewer_process = True
bake_viewer_input_process = True
nodes = {}
reposition_nodes = None
def process(self, instance):
if instance.data.get("farm"):
@ -123,18 +125,32 @@ class ExtractThumbnail(publish.Extractor):
temporary_nodes.append(rnode)
previous_node = rnode
reformat_node = nuke.createNode("Reformat")
ref_node = self.nodes.get("Reformat", None)
if ref_node:
for k, v in ref_node:
self.log.debug("k, v: {0}:{1}".format(k, v))
if isinstance(v, unicode):
v = str(v)
reformat_node[k].setValue(v)
if self.reposition_nodes is None:
# [deprecated] create reformat node old way
reformat_node = nuke.createNode("Reformat")
ref_node = self.nodes.get("Reformat", None)
if ref_node:
for k, v in ref_node:
self.log.debug("k, v: {0}:{1}".format(k, v))
if isinstance(v, unicode):
v = str(v)
reformat_node[k].setValue(v)
reformat_node.setInput(0, previous_node)
previous_node = reformat_node
temporary_nodes.append(reformat_node)
reformat_node.setInput(0, previous_node)
previous_node = reformat_node
temporary_nodes.append(reformat_node)
else:
# create reformat node new way
for repo_node in self.reposition_nodes:
node_class = repo_node["node_class"]
knobs = repo_node["knobs"]
node = nuke.createNode(node_class)
set_node_knobs_from_settings(node, knobs)
# connect in order
node.setInput(0, previous_node)
previous_node = node
temporary_nodes.append(node)
# only create colorspace baking if toggled on
if bake_viewer_process:

View file

@ -16,11 +16,12 @@ class SaveCurrentWorkfile(pyblish.api.ContextPlugin):
def process(self, context):
host = registered_host()
if context.data["currentFile"] != host.get_current_workfile():
current = host.get_current_workfile()
if context.data["currentFile"] != current:
raise KnownPublishError("Workfile has changed during publishing!")
if host.has_unsaved_changes():
self.log.info("Saving current file..")
self.log.info("Saving current file: {}".format(current))
host.save_workfile()
else:
self.log.debug("Skipping workfile save because there are no "

View file

@ -156,7 +156,7 @@ class AnimationFBXLoader(plugin.Loader):
package_paths=[f"{root}/{hierarchy[0]}"],
recursive_paths=False)
levels = ar.get_assets(_filter)
master_level = levels[0].get_full_name()
master_level = levels[0].get_asset().get_path_name()
hierarchy_dir = root
for h in hierarchy:
@ -168,7 +168,7 @@ class AnimationFBXLoader(plugin.Loader):
package_paths=[f"{hierarchy_dir}/"],
recursive_paths=True)
levels = ar.get_assets(_filter)
level = levels[0].get_full_name()
level = levels[0].get_asset().get_path_name()
unreal.EditorLevelLibrary.save_all_dirty_levels()
unreal.EditorLevelLibrary.load_level(level)

View file

@ -365,7 +365,7 @@ class CameraLoader(plugin.Loader):
maps = ar.get_assets(filter)
# There should be only one map in the list
EditorLevelLibrary.load_level(maps[0].get_full_name())
EditorLevelLibrary.load_level(maps[0].get_asset().get_path_name())
level_sequence = sequences[0].get_asset()
@ -513,7 +513,7 @@ class CameraLoader(plugin.Loader):
map = maps[0]
EditorLevelLibrary.save_all_dirty_levels()
EditorLevelLibrary.load_level(map.get_full_name())
EditorLevelLibrary.load_level(map.get_asset().get_path_name())
# Remove the camera from the level.
actors = EditorLevelLibrary.get_all_level_actors()
@ -523,7 +523,7 @@ class CameraLoader(plugin.Loader):
EditorLevelLibrary.destroy_actor(a)
EditorLevelLibrary.save_all_dirty_levels()
EditorLevelLibrary.load_level(world.get_full_name())
EditorLevelLibrary.load_level(world.get_asset().get_path_name())
# There should be only one sequence in the path.
sequence_name = sequences[0].asset_name

View file

@ -740,7 +740,7 @@ class LayoutLoader(plugin.Loader):
loaded_assets = self._process(self.fname, asset_dir, shot)
for s in sequences:
EditorAssetLibrary.save_asset(s.get_full_name())
EditorAssetLibrary.save_asset(s.get_path_name())
EditorLevelLibrary.save_current_level()
@ -819,7 +819,7 @@ class LayoutLoader(plugin.Loader):
recursive_paths=False)
levels = ar.get_assets(filter)
layout_level = levels[0].get_full_name()
layout_level = levels[0].get_asset().get_path_name()
EditorLevelLibrary.save_all_dirty_levels()
EditorLevelLibrary.load_level(layout_level)
@ -919,7 +919,7 @@ class LayoutLoader(plugin.Loader):
package_paths=[f"{root}/{ms_asset}"],
recursive_paths=False)
levels = ar.get_assets(_filter)
master_level = levels[0].get_full_name()
master_level = levels[0].get_asset().get_path_name()
sequences = [master_sequence]

View file

@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
# flake8: noqa E402
"""Pype module API."""
"""OpenPype lib functions."""
# add vendor to sys path based on Python version
import sys
import os
@ -94,7 +94,8 @@ from .python_module_tools import (
modules_from_path,
recursive_bases_from_class,
classes_from_module,
import_module_from_dirpath
import_module_from_dirpath,
is_func_signature_supported,
)
from .profiles_filtering import (
@ -243,6 +244,7 @@ __all__ = [
"recursive_bases_from_class",
"classes_from_module",
"import_module_from_dirpath",
"is_func_signature_supported",
"get_transcode_temp_directory",
"should_convert_for_ffmpeg",

View file

@ -6,10 +6,9 @@ import inspect
import logging
import weakref
from uuid import uuid4
try:
from weakref import WeakMethod
except Exception:
from openpype.lib.python_2_comp import WeakMethod
from .python_2_comp import WeakMethod
from .python_module_tools import is_func_signature_supported
class MissingEventSystem(Exception):
@ -80,40 +79,8 @@ class EventCallback(object):
# Get expected arguments from function spec
# - positional arguments are always preferred
expect_args = False
expect_kwargs = False
fake_event = "fake"
if hasattr(inspect, "signature"):
# Python 3 using 'Signature' object where we try to bind arg
# or kwarg. Using signature is recommended approach based on
# documentation.
sig = inspect.signature(func)
try:
sig.bind(fake_event)
expect_args = True
except TypeError:
pass
try:
sig.bind(event=fake_event)
expect_kwargs = True
except TypeError:
pass
else:
# In Python 2 'signature' is not available so 'getcallargs' is used
# - 'getcallargs' is marked as deprecated since Python 3.0
try:
inspect.getcallargs(func, fake_event)
expect_args = True
except TypeError:
pass
try:
inspect.getcallargs(func, event=fake_event)
expect_kwargs = True
except TypeError:
pass
expect_args = is_func_signature_supported(func, "fake")
expect_kwargs = is_func_signature_supported(func, event="fake")
self._func_ref = func_ref
self._func_name = func_name

View file

@ -190,7 +190,7 @@ def run_openpype_process(*args, **kwargs):
Example:
```
run_openpype_process("run", "<path to .py script>")
run_detached_process("run", "<path to .py script>")
```
Args:

View file

@ -1,41 +1,44 @@
import weakref
class _weak_callable:
def __init__(self, obj, func):
self.im_self = obj
self.im_func = func
WeakMethod = getattr(weakref, "WeakMethod", None)
def __call__(self, *args, **kws):
if self.im_self is None:
return self.im_func(*args, **kws)
else:
return self.im_func(self.im_self, *args, **kws)
if WeakMethod is None:
class _WeakCallable:
def __init__(self, obj, func):
self.im_self = obj
self.im_func = func
def __call__(self, *args, **kws):
if self.im_self is None:
return self.im_func(*args, **kws)
else:
return self.im_func(self.im_self, *args, **kws)
class WeakMethod:
""" Wraps a function or, more importantly, a bound method in
a way that allows a bound method's object to be GCed, while
providing the same interface as a normal weak reference. """
class WeakMethod:
""" Wraps a function or, more importantly, a bound method in
a way that allows a bound method's object to be GCed, while
providing the same interface as a normal weak reference. """
def __init__(self, fn):
try:
self._obj = weakref.ref(fn.im_self)
self._meth = fn.im_func
except AttributeError:
# It's not a bound method
self._obj = None
self._meth = fn
def __init__(self, fn):
try:
self._obj = weakref.ref(fn.im_self)
self._meth = fn.im_func
except AttributeError:
# It's not a bound method
self._obj = None
self._meth = fn
def __call__(self):
if self._dead():
return None
return _weak_callable(self._getobj(), self._meth)
def __call__(self):
if self._dead():
return None
return _WeakCallable(self._getobj(), self._meth)
def _dead(self):
return self._obj is not None and self._obj() is None
def _dead(self):
return self._obj is not None and self._obj() is None
def _getobj(self):
if self._obj is None:
return None
return self._obj()
def _getobj(self):
if self._obj is None:
return None
return self._obj()
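The refactor above keeps the local shim only as a fallback: `weakref.WeakMethod` is used whenever the stdlib provides it (Python 3.4+), and the hand-rolled class fills in otherwise. A short runnable illustration of the stdlib behavior the shim emulates:

```python
import gc
import weakref

# Prefer the stdlib WeakMethod; a shim like the one above would only
# be needed when this getattr returns None (very old Pythons).
WeakMethod = getattr(weakref, "WeakMethod", None)
assert WeakMethod is not None  # always true on Python 3.4+


class Emitter:
    def on_event(self):
        return "handled"


obj = Emitter()
ref = WeakMethod(obj.on_event)
# While the owning object is alive, dereferencing yields a callable
# bound method.
assert ref()() == "handled"

del obj
gc.collect()
print(ref())  # None once the owning object has been collected
```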

View file

@ -230,3 +230,70 @@ def import_module_from_dirpath(dirpath, folder_name, dst_module_name=None):
dirpath, folder_name, dst_module_name
)
return module
def is_func_signature_supported(func, *args, **kwargs):
"""Check if a function signature supports passed args and kwargs.
This check does not actually call the function, just look if function can
be called with the arguments.
Notes:
This does NOT check if the function would work with the passed
arguments, only whether they can be passed in. If the function has
*args and **kwargs in its parameters, this will always return 'True'.
Example:
>>> def my_function(my_number):
... return my_number + 1
...
>>> is_func_signature_supported(my_function, 1)
True
>>> is_func_signature_supported(my_function, 1, 2)
False
>>> is_func_signature_supported(my_function, my_number=1)
True
>>> is_func_signature_supported(my_function, number=1)
False
>>> is_func_signature_supported(my_function, "string")
True
>>> def my_other_function(*args, **kwargs):
... my_function(*args, **kwargs)
...
>>> is_func_signature_supported(
... my_other_function,
... "string",
... 1,
... other=None
... )
True
Args:
func (function): A function where the signature should be tested.
*args (tuple[Any]): Positional arguments for function signature.
**kwargs (dict[str, Any]): Keyword arguments for function signature.
Returns:
bool: Function can pass in arguments.
"""
if hasattr(inspect, "signature"):
# Python 3 using 'Signature' object where we try to bind arg
# or kwarg. Using signature is recommended approach based on
# documentation.
sig = inspect.signature(func)
try:
sig.bind(*args, **kwargs)
return True
except TypeError:
pass
else:
# In Python 2 'signature' is not available so 'getcallargs' is used
# - 'getcallargs' is marked as deprecated since Python 3.0
try:
inspect.getcallargs(func, *args, **kwargs)
return True
except TypeError:
pass
return False
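On Python 3 the helper reduces to a single `Signature.bind` attempt. Restated as a standalone snippet (condensed from the code above, dropping the Python 2 branch):

```python
import inspect


def is_func_signature_supported(func, *args, **kwargs):
    """Return True if 'func' could be called with the given arguments.

    Only the signature is inspected; the function is never called.
    """
    try:
        inspect.signature(func).bind(*args, **kwargs)
        return True
    except TypeError:
        return False


def my_function(my_number):
    return my_number + 1


print(is_func_signature_supported(my_function, 1))         # True
print(is_func_signature_supported(my_function, 1, 2))      # False
print(is_func_signature_supported(my_function, number=1))  # False
```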

View file

@ -73,7 +73,7 @@ class FusionSubmitDeadline(
def process(self, instance):
if not instance.data.get("farm"):
self.log.info("Skipping local instance.")
self.log.debug("Skipping local instance.")
return
attribute_values = self.get_attr_values_from_data(

View file

@ -86,7 +86,7 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin,
def process(self, instance):
if not instance.data.get("farm"):
self.log.info("Skipping local instance.")
self.log.debug("Skipping local instance.")
return
instance.data["attributeValues"] = self.get_attr_values_from_data(

View file

@ -762,7 +762,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
"""
if not instance.data.get("farm"):
self.log.info("Skipping local instance.")
self.log.debug("Skipping local instance.")
return
data = instance.data.copy()

View file

@ -26,7 +26,7 @@ class ValidateDeadlinePools(OptionalPyblishPluginMixin,
def process(self, instance):
if not instance.data.get("farm"):
self.log.info("Skipping local instance.")
self.log.debug("Skipping local instance.")
return
# get default deadline webservice url from deadline module

View file

@ -1,12 +1,10 @@
import os
import sys
import types
import inspect
import copy
import tempfile
import xml.etree.ElementTree
import six
import pyblish.util
import pyblish.plugin
import pyblish.api
@ -42,7 +40,9 @@ def get_template_name_profiles(
Args:
project_name (str): Name of project where to look for templates.
project_settings(Dic[str, Any]): Prepared project settings.
project_settings (Dict[str, Any]): Prepared project settings.
logger (Optional[logging.Logger]): Logger object to be used instead
of default logger.
Returns:
List[Dict[str, Any]]: Publish template profiles.
@ -103,7 +103,9 @@ def get_hero_template_name_profiles(
Args:
project_name (str): Name of project where to look for templates.
project_settings(Dic[str, Any]): Prepared project settings.
project_settings (Dict[str, Any]): Prepared project settings.
logger (Optional[logging.Logger]): Logger object to be used instead
of default logger.
Returns:
List[Dict[str, Any]]: Publish template profiles.
@ -172,9 +174,10 @@ def get_publish_template_name(
project_name (str): Name of project where to look for settings.
host_name (str): Name of host integration.
family (str): Family for which should be found template.
task_name (str): Task name on which is intance working.
task_type (str): Task type on which is intance working.
project_setting (Dict[str, Any]): Prepared project settings.
task_name (str): Task name on which is instance working.
task_type (str): Task type on which is instance working.
project_settings (Dict[str, Any]): Prepared project settings.
hero (bool): Template is for hero version publishing.
logger (logging.Logger): Custom logger used for 'filter_profiles'
function.
@ -264,19 +267,18 @@ def load_help_content_from_plugin(plugin):
def publish_plugins_discover(paths=None):
"""Find and return available pyblish plug-ins
Overridden function from `pyblish` module to be able collect crashed files
and reason of their crash.
Overridden function from `pyblish` module to be able to collect
crashed files and reason of their crash.
Arguments:
paths (list, optional): Paths to discover plug-ins from.
If no paths are provided, all paths are searched.
"""
# The only difference with `pyblish.api.discover`
result = DiscoverResult(pyblish.api.Plugin)
plugins = dict()
plugins = {}
plugin_names = []
allow_duplicates = pyblish.plugin.ALLOW_DUPLICATES
@ -302,7 +304,7 @@ def publish_plugins_discover(paths=None):
mod_name, mod_ext = os.path.splitext(fname)
if not mod_ext == ".py":
if mod_ext != ".py":
continue
try:
@ -533,10 +535,10 @@ def find_close_plugin(close_plugin_name, log):
def remote_publish(log, close_plugin_name=None, raise_error=False):
"""Loops through all plugins, logs to console. Used for tests.
Args:
log (openpype.lib.Logger)
close_plugin_name (str): name of plugin with responsibility to
close host app
Args:
log (Logger)
close_plugin_name (str): name of plugin with responsibility to
close host app
"""
# Error exit as soon as any error occurs.
error_format = "Failed {plugin.__name__}: {error} -- {error.traceback}"
@ -845,3 +847,22 @@ def _validate_transient_template(project_name, template_name, anatomy):
raise ValueError(("There is not set \"folder\" template in \"{}\" anatomy" # noqa
" for project \"{}\"."
).format(template_name, project_name))
def add_repre_files_for_cleanup(instance, repre):
""" Explicitly mark repre files to be deleted.
Should be used on intermediate files (e.g. reviews, thumbnails) that
should be explicitly deleted.
"""
files = repre["files"]
staging_dir = repre.get("stagingDir")
if not staging_dir:
return
if isinstance(files, str):
files = [files]
for file_name in files:
expected_file = os.path.join(staging_dir, file_name)
instance.context.data["cleanupFullPaths"].append(expected_file)
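The new helper simply expands representation file names into full staging-dir paths and appends them to the context's cleanup list. A runnable illustration with hypothetical stand-ins for the pyblish context and instance (the function body matches the diff above):

```python
import os


# Hypothetical stand-ins for a pyblish context/instance, only to show
# how the helper expands representation files into cleanup paths.
class _Context:
    def __init__(self):
        self.data = {"cleanupFullPaths": []}


class _Instance:
    def __init__(self, context):
        self.context = context


def add_repre_files_for_cleanup(instance, repre):
    files = repre["files"]
    staging_dir = repre.get("stagingDir")
    if not staging_dir:
        return
    if isinstance(files, str):
        files = [files]
    for file_name in files:
        instance.context.data["cleanupFullPaths"].append(
            os.path.join(staging_dir, file_name)
        )


context = _Context()
instance = _Instance(context)
add_repre_files_for_cleanup(
    instance,
    {"files": ["thumb.jpg", "review.mp4"], "stagingDir": "/tmp/stage"},
)
# Two full paths under the staging dir are now queued for cleanup.
print(context.data["cleanupFullPaths"])
```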

View file

@ -384,7 +384,9 @@ class ColormanagedPyblishPluginMixin(object):
# check if ext in lower case is in self.allowed_ext
if ext.lstrip(".").lower() not in self.allowed_ext:
self.log.debug("Extension is not in allowed extensions.")
self.log.debug(
"Extension '{}' is not in allowed extensions.".format(ext)
)
return
if colorspace_settings is None:
@ -398,8 +400,12 @@ class ColormanagedPyblishPluginMixin(object):
# unpack colorspace settings
config_data, file_rules = colorspace_settings
self.log.info("Config data is : `{}`".format(
config_data))
if not config_data:
# warn in case no colorspace path was defined
self.log.warning("No colorspace management was defined")
return
self.log.debug("Config data is: `{}`".format(config_data))
project_name = context.data["projectName"]
host_name = context.data["hostName"]
@ -410,8 +416,7 @@ class ColormanagedPyblishPluginMixin(object):
if isinstance(filename, list):
filename = filename[0]
self.log.debug("__ filename: `{}`".format(
filename))
self.log.debug("__ filename: `{}`".format(filename))
# get matching colorspace from rules
colorspace = colorspace or get_imageio_colorspace_from_filepath(
@ -420,8 +425,7 @@ class ColormanagedPyblishPluginMixin(object):
file_rules=file_rules,
project_settings=project_settings
)
self.log.debug("__ colorspace: `{}`".format(
colorspace))
self.log.debug("__ colorspace: `{}`".format(colorspace))
# infuse data to representation
if colorspace:

View file

@ -81,7 +81,8 @@ class CleanUp(pyblish.api.InstancePlugin):
staging_dir = instance.data.get("stagingDir", None)
if not staging_dir:
self.log.info("Staging dir not set.")
self.log.debug("Skipping cleanup. Staging dir not set "
"on instance: {}.".format(instance))
return
if not os.path.normpath(staging_dir).startswith(temp_root):
@ -90,7 +91,7 @@ class CleanUp(pyblish.api.InstancePlugin):
return
if not os.path.exists(staging_dir):
self.log.info("No staging directory found: %s" % staging_dir)
self.log.debug("No staging directory found at: %s" % staging_dir)
return
if instance.data.get("stagingDir_persistent"):
@ -131,7 +132,9 @@ class CleanUp(pyblish.api.InstancePlugin):
try:
os.remove(src)
except PermissionError:
self.log.warning("Insufficient permission to delete {}".format(src))
self.log.warning(
"Insufficient permission to delete {}".format(src)
)
continue
# add dir for cleanup

View file

@ -67,5 +67,6 @@ class CollectAnatomyContextData(pyblish.api.ContextPlugin):
# Store
context.data["anatomyData"] = anatomy_data
self.log.info("Global anatomy Data collected")
self.log.debug(json.dumps(anatomy_data, indent=4))
self.log.debug("Global Anatomy Context Data collected:\n{}".format(
json.dumps(anatomy_data, indent=4)
))

View file

@@ -46,17 +46,17 @@ class CollectAnatomyInstanceData(pyblish.api.ContextPlugin):
follow_workfile_version = False
def process(self, context):
self.log.info("Collecting anatomy data for all instances.")
self.log.debug("Collecting anatomy data for all instances.")
project_name = context.data["projectName"]
self.fill_missing_asset_docs(context, project_name)
self.fill_latest_versions(context, project_name)
self.fill_anatomy_data(context)
self.log.info("Anatomy Data collection finished.")
self.log.debug("Anatomy Data collection finished.")
def fill_missing_asset_docs(self, context, project_name):
self.log.debug("Qeurying asset documents for instances.")
self.log.debug("Querying asset documents for instances.")
context_asset_doc = context.data.get("assetEntity")
@@ -271,7 +271,7 @@ class CollectAnatomyInstanceData(pyblish.api.ContextPlugin):
instance_name = instance.data["name"]
instance_label = instance.data.get("label")
if instance_label:
instance_name += "({})".format(instance_label)
instance_name += " ({})".format(instance_label)
self.log.debug("Anatomy data for instance {}: {}".format(
instance_name,
json.dumps(anatomy_data, indent=4)

View file

@@ -30,6 +30,6 @@ class CollectAnatomyObject(pyblish.api.ContextPlugin):
context.data["anatomy"] = Anatomy(project_name)
self.log.info(
self.log.debug(
"Anatomy object collected for project \"{}\".".format(project_name)
)

View file

@@ -65,6 +65,6 @@ class CollectCustomStagingDir(pyblish.api.InstancePlugin):
else:
result_str = "Not adding"
self.log.info("{} custom staging dir for instance with '{}'".format(
self.log.debug("{} custom staging dir for instance with '{}'".format(
result_str, family
))

View file

@@ -92,5 +92,5 @@ class CollectFromCreateContext(pyblish.api.ContextPlugin):
instance.data["transientData"] = transient_data
self.log.info("collected instance: {}".format(instance.data))
self.log.info("parsing data: {}".format(in_data))
self.log.debug("collected instance: {}".format(instance.data))
self.log.debug("parsing data: {}".format(in_data))

View file

@@ -13,6 +13,7 @@ import json
import pyblish.api
from openpype.pipeline import legacy_io, KnownPublishError
from openpype.pipeline.publish.lib import add_repre_files_for_cleanup
class CollectRenderedFiles(pyblish.api.ContextPlugin):
@@ -89,6 +90,7 @@ class CollectRenderedFiles(pyblish.api.ContextPlugin):
# now we can just add instances from json file and we are done
for instance_data in data.get("instances"):
self.log.info(" - processing instance for {}".format(
instance_data.get("subset")))
instance = self._context.create_instance(
@@ -107,6 +109,8 @@ class CollectRenderedFiles(pyblish.api.ContextPlugin):
self._fill_staging_dir(repre_data, anatomy)
representations.append(repre_data)
add_repre_files_for_cleanup(instance, repre_data)
instance.data["representations"] = representations
# add audio if in metadata data
@@ -157,6 +161,8 @@ class CollectRenderedFiles(pyblish.api.ContextPlugin):
os.environ.update(session_data)
session_is_set = True
self._process_path(data, anatomy)
context.data["cleanupFullPaths"].append(path)
context.data["cleanupEmptyDirs"].append(os.path.dirname(path))
except Exception as e:
self.log.error(e, exc_info=True)
raise Exception("Error") from e
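The `cleanupFullPaths` / `cleanupEmptyDirs` registration above pairs with the `CleanUp` plugin changes earlier in this diff: collectors queue transient paths on the shared context, and cleanup removes them after publishing succeeds. A minimal sketch of that pattern (plain dicts stand in for the pyblish context; the helper names are illustrative, only the context keys mirror the ones used here):

```python
import os
import tempfile

def add_file_for_cleanup(context, path):
    # Collectors register transient paths on the shared context;
    # a cleanup plugin deletes them once publishing has finished.
    context.setdefault("cleanupFullPaths", []).append(path)
    context.setdefault("cleanupEmptyDirs", []).append(os.path.dirname(path))

def run_cleanup(context):
    # Remove registered files first, then any now-empty directories.
    for path in context.get("cleanupFullPaths", []):
        if os.path.isfile(path):
            os.remove(path)
    for dir_path in context.get("cleanupEmptyDirs", []):
        if os.path.isdir(dir_path) and not os.listdir(dir_path):
            os.rmdir(dir_path)

context = {}
work_dir = tempfile.mkdtemp()
metadata_path = os.path.join(work_dir, "metadata.json")
open(metadata_path, "w").close()

add_file_for_cleanup(context, metadata_path)
run_cleanup(context)
print(os.path.exists(metadata_path), os.path.exists(work_dir))  # → False False
```

Deferring deletion to a dedicated cleanup step means a failed publish leaves the metadata files behind for debugging, which is why the paths are only collected here rather than removed inline.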

View file

@@ -48,10 +48,13 @@ class CollectSceneVersion(pyblish.api.ContextPlugin):
if '<shell>' in filename:
return
self.log.debug(
"Collecting scene version from filename: {}".format(filename)
)
version = get_version_from_path(filename)
assert version, "Cannot determine version"
rootVersion = int(version)
context.data['version'] = rootVersion
self.log.info("{}".format(type(rootVersion)))
self.log.info('Scene Version: %s' % context.data.get('version'))

View file

@@ -19,6 +19,7 @@ from openpype.lib import (
should_convert_for_ffmpeg
)
from openpype.lib.profiles_filtering import filter_profiles
from openpype.pipeline.publish.lib import add_repre_files_for_cleanup
class ExtractBurnin(publish.Extractor):
@@ -353,6 +354,8 @@ class ExtractBurnin(publish.Extractor):
# Add new representation to instance
instance.data["representations"].append(new_repre)
add_repre_files_for_cleanup(instance, new_repre)
# Cleanup temp staging dir after processing of output definitions
if do_convert:
temp_dir = repre["stagingDir"]
@@ -517,8 +520,8 @@
"""
if "burnin" not in (repre.get("tags") or []):
self.log.info((
"Representation \"{}\" don't have \"burnin\" tag. Skipped."
self.log.debug((
"Representation \"{}\" does not have \"burnin\" tag. Skipped."
).format(repre["name"]))
return False

View file

@@ -336,13 +336,13 @@ class ExtractOIIOTranscode(publish.Extractor):
if repre.get("ext") not in self.supported_exts:
self.log.debug((
"Representation '{}' of unsupported extension. Skipped."
).format(repre["name"]))
"Representation '{}' has unsupported extension: '{}'. Skipped."
).format(repre["name"], repre.get("ext")))
return False
if not repre.get("files"):
self.log.debug((
"Representation '{}' have empty files. Skipped."
"Representation '{}' has empty files. Skipped."
).format(repre["name"]))
return False

View file

@@ -24,6 +24,7 @@ from openpype.lib.transcoding import (
get_transcode_temp_directory,
)
from openpype.pipeline.publish import KnownPublishError
from openpype.pipeline.publish.lib import add_repre_files_for_cleanup
class ExtractReview(pyblish.api.InstancePlugin):
@@ -92,8 +93,8 @@ class ExtractReview(pyblish.api.InstancePlugin):
host_name = instance.context.data["hostName"]
family = self.main_family_from_instance(instance)
self.log.info("Host: \"{}\"".format(host_name))
self.log.info("Family: \"{}\"".format(family))
self.log.debug("Host: \"{}\"".format(host_name))
self.log.debug("Family: \"{}\"".format(family))
profile = filter_profiles(
self.profiles,
@@ -351,7 +352,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
temp_data = self.prepare_temp_data(instance, repre, output_def)
files_to_clean = []
if temp_data["input_is_sequence"]:
self.log.info("Filling gaps in sequence.")
self.log.debug("Checking sequence for gaps to fill...")
files_to_clean = self.fill_sequence_gaps(
files=temp_data["origin_repre"]["files"],
staging_dir=new_repre["stagingDir"],
@@ -425,6 +426,8 @@
)
instance.data["representations"].append(new_repre)
add_repre_files_for_cleanup(instance, new_repre)
def input_is_sequence(self, repre):
"""Deduce from representation data if input is sequence."""
# TODO GLOBAL ISSUE - Find better way how to find out if input

View file

@@ -36,7 +36,7 @@ class ExtractThumbnail(pyblish.api.InstancePlugin):
).format(subset_name))
return
self.log.info(
self.log.debug(
"Processing instance with subset name {}".format(subset_name)
)
@@ -89,13 +89,13 @@ class ExtractThumbnail(pyblish.api.InstancePlugin):
src_staging = os.path.normpath(repre["stagingDir"])
full_input_path = os.path.join(src_staging, input_file)
self.log.info("input {}".format(full_input_path))
self.log.debug("input {}".format(full_input_path))
filename = os.path.splitext(input_file)[0]
jpeg_file = filename + "_thumb.jpg"
full_output_path = os.path.join(dst_staging, jpeg_file)
if oiio_supported:
self.log.info("Trying to convert with OIIO")
self.log.debug("Trying to convert with OIIO")
# If the input can be read by OIIO then use the OIIO method for
# conversion, otherwise use ffmpeg
thumbnail_created = self.create_thumbnail_oiio(
@@ -148,7 +148,7 @@ class ExtractThumbnail(pyblish.api.InstancePlugin):
def _already_has_thumbnail(self, repres):
for repre in repres:
self.log.info("repre {}".format(repre))
self.log.debug("repre {}".format(repre))
if repre["name"] == "thumbnail":
return True
return False
@@ -173,20 +173,20 @@ class ExtractThumbnail(pyblish.api.InstancePlugin):
return filtered_repres
def create_thumbnail_oiio(self, src_path, dst_path):
self.log.info("outputting {}".format(dst_path))
self.log.info("Extracting thumbnail {}".format(dst_path))
oiio_tool_path = get_oiio_tools_path()
oiio_cmd = [
oiio_tool_path,
"-a", src_path,
"-o", dst_path
]
self.log.info("running: {}".format(" ".join(oiio_cmd)))
self.log.debug("running: {}".format(" ".join(oiio_cmd)))
try:
run_subprocess(oiio_cmd, logger=self.log)
return True
except Exception:
self.log.warning(
"Failed to create thubmnail using oiiotool",
"Failed to create thumbnail using oiiotool",
exc_info=True
)
return False

View file

@@ -39,7 +39,7 @@ class ExtractThumbnailFromSource(pyblish.api.InstancePlugin):
self._create_context_thumbnail(instance.context)
subset_name = instance.data["subset"]
self.log.info(
self.log.debug(
"Processing instance with subset name {}".format(subset_name)
)
thumbnail_source = instance.data.get("thumbnailSource")
@@ -104,7 +104,7 @@ class ExtractThumbnailFromSource(pyblish.api.InstancePlugin):
full_output_path = os.path.join(dst_staging, dst_filename)
if oiio_supported:
self.log.info("Trying to convert with OIIO")
self.log.debug("Trying to convert with OIIO")
# If the input can be read by OIIO then use the OIIO method for
# conversion otherwise use ffmpeg
thumbnail_created = self.create_thumbnail_oiio(

View file

@@ -267,7 +267,7 @@ class IntegrateAsset(pyblish.api.InstancePlugin):
instance_stagingdir = instance.data.get("stagingDir")
if not instance_stagingdir:
self.log.info((
self.log.debug((
"{0} is missing reference to staging directory."
" Will try to get it from representation."
).format(instance))
@@ -480,7 +480,7 @@ class IntegrateAsset(pyblish.api.InstancePlugin):
update_data
)
self.log.info("Prepared subset: {}".format(subset_name))
self.log.debug("Prepared subset: {}".format(subset_name))
return subset_doc
def prepare_version(self, instance, op_session, subset_doc, project_name):
@@ -521,7 +521,9 @@
project_name, version_doc["type"], version_doc
)
self.log.info("Prepared version: v{0:03d}".format(version_doc["name"]))
self.log.debug(
"Prepared version: v{0:03d}".format(version_doc["name"])
)
return version_doc

View file

@@ -147,7 +147,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
def process(self, instance):
if instance.data.get("processedWithNewIntegrator"):
self.log.info("Instance was already processed with new integrator")
self.log.debug(
"Instance was already processed with new integrator"
)
return
for ef in self.exclude_families:
@@ -274,7 +276,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
stagingdir = instance.data.get("stagingDir")
if not stagingdir:
self.log.info((
self.log.debug((
"{0} is missing reference to staging directory."
" Will try to get it from representation."
).format(instance))

View file

@@ -41,7 +41,7 @@ class IntegrateThumbnails(pyblish.api.ContextPlugin):
# Filter instances which can be used for integration
filtered_instance_items = self._prepare_instances(context)
if not filtered_instance_items:
self.log.info(
self.log.debug(
"All instances were filtered. Thumbnail integration skipped."
)
return
@@ -162,7 +162,7 @@ class IntegrateThumbnails(pyblish.api.ContextPlugin):
# Skip instance if thumbnail path is not available for it
if not thumbnail_path:
self.log.info((
self.log.debug((
"Skipping thumbnail integration for instance \"{}\"."
" Instance and context"
" thumbnail paths are not available."

View file

@@ -397,7 +397,39 @@
false
]
]
}
},
"reposition_nodes": [
{
"node_class": "Reformat",
"knobs": [
{
"type": "text",
"name": "type",
"value": "to format"
},
{
"type": "text",
"name": "format",
"value": "HD_1080"
},
{
"type": "text",
"name": "filter",
"value": "Lanczos6"
},
{
"type": "bool",
"name": "black_outside",
"value": true
},
{
"type": "bool",
"name": "pbb",
"value": false
}
]
}
]
},
"ExtractReviewData": {
"enabled": false

View file

@@ -323,7 +323,10 @@ class SchemasHub:
filled_template = self._fill_template(
schema_data, template_def
)
return filled_template
new_template_def = []
for item in filled_template:
new_template_def.extend(self.resolve_schema_data(item))
return new_template_def
def create_schema_object(self, schema_data, *args, **kwargs):
"""Create entity for passed schema data.
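The hunk above makes template resolution recursive, so a filled `schema_template` may itself contain further `schema_template` references — the point of the merged `enhancement/allow_nested_settings_templates` branch. An illustrative standalone reduction of that idea (this is not the actual `SchemasHub` API, just the recursion pattern):

```python
def resolve(items, templates):
    # Recursively expand template references until only concrete items remain.
    resolved = []
    for item in items:
        if isinstance(item, dict) and item.get("type") == "schema_template":
            filled = templates[item["name"]]
            # Re-resolve the filled template so nested templates expand too;
            # without this recursion, "inner" below would leak through unresolved.
            resolved.extend(resolve(filled, templates))
        else:
            resolved.append(item)
    return resolved

templates = {
    "inner": [{"type": "text", "key": "name"}],
    "outer": [
        {"type": "schema_template", "name": "inner"},
        {"type": "bool", "key": "enabled"},
    ],
}
print(resolve([{"type": "schema_template", "name": "outer"}], templates))
# → [{'type': 'text', 'key': 'name'}, {'type': 'bool', 'key': 'enabled'}]
```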

View file

@@ -158,10 +158,43 @@
"label": "Nodes",
"collapsible": true,
"children": [
{
"type": "label",
"label": "The Nodes attribute will be deprecated in future releases. Use reposition_nodes instead."
},
{
"type": "raw-json",
"key": "nodes",
"label": "Nodes"
"label": "Nodes [deprecated]"
},
{
"type": "label",
"label": "Only reposition knobs are supported. You can add multiple reformat nodes <br/>and set their knobs. The order of reformat nodes matters: the first reformat node <br/>is applied first and the last one is applied last."
},
{
"key": "reposition_nodes",
"type": "list",
"label": "Reposition nodes",
"object_type": {
"type": "dict",
"children": [
{
"key": "node_class",
"label": "Node class",
"type": "text"
},
{
"type": "schema_template",
"name": "template_nuke_knob_inputs",
"template_data": [
{
"label": "Node knobs",
"key": "knobs"
}
]
}
]
}
}
]
}
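The `reposition_nodes` setting defined by this schema could be consumed roughly as below: walk the node definitions in order, create each node class, and set its knobs one by one. This is a hedged sketch with injected callbacks standing in for Nuke's node API; the real plugin drives actual `Reformat` nodes:

```python
def apply_reposition_nodes(settings, create_node, set_knob):
    # Apply each configured node in order; order matters because every
    # reformat operates on the output of the previous one.
    for node_def in settings.get("reposition_nodes", []):
        node = create_node(node_def["node_class"])
        for knob in node_def["knobs"]:
            set_knob(node, knob["name"], knob["value"])

# Exercise with plain-dict stand-ins for Nuke nodes.
created = []
settings = {"reposition_nodes": [{
    "node_class": "Reformat",
    "knobs": [
        {"type": "text", "name": "format", "value": "HD_1080"},
        {"type": "bool", "name": "black_outside", "value": True},
    ],
}]}
apply_reposition_nodes(
    settings,
    create_node=lambda cls: created.append({"class": cls}) or created[-1],
    set_knob=lambda node, name, value: node.__setitem__(name, value),
)
print(created)
```

The knob `type` field is left to the consumer here; in Nuke it decides how the value is coerced before assignment (text vs. bool vs. number knobs).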

View file

@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.15.8-nightly.3"
__version__ = "3.15.8"

View file

@@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
version = "3.15.7" # OpenPype
version = "3.15.8" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"