Merge branch 'develop' into feature/deadline-nuke-settings-attributes

Jakub Jezek 2021-07-01 16:26:09 +02:00
commit 1f9bbd5820
No known key found for this signature in database
GPG key ID: D8548FBF690B100A
29 changed files with 1249 additions and 547 deletions

View file

@ -1,11 +1,15 @@
# Changelog
## [3.2.0-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.2.0-nightly.5](https://github.com/pypeclub/OpenPype/tree/HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/2.18.4...HEAD)
**🚀 Enhancements**
- Settings UI copy/paste [\#1769](https://github.com/pypeclub/OpenPype/pull/1769)
- Workfile tool widths [\#1766](https://github.com/pypeclub/OpenPype/pull/1766)
- Push hierarchical attributes care about task parent changes [\#1763](https://github.com/pypeclub/OpenPype/pull/1763)
- Application executables with environment variables [\#1757](https://github.com/pypeclub/OpenPype/pull/1757)
- Settings Hosts enum [\#1739](https://github.com/pypeclub/OpenPype/pull/1739)
- Validate containers settings [\#1736](https://github.com/pypeclub/OpenPype/pull/1736)
- PS - added loader from sequence [\#1726](https://github.com/pypeclub/OpenPype/pull/1726)
@ -18,6 +22,11 @@
**🐛 Bug fixes**
- FFprobe streams order [\#1775](https://github.com/pypeclub/OpenPype/pull/1775)
- Project specific environments [\#1767](https://github.com/pypeclub/OpenPype/pull/1767)
- Settings UI with refresh button [\#1764](https://github.com/pypeclub/OpenPype/pull/1764)
- Standalone publisher thumbnail extractor fix [\#1761](https://github.com/pypeclub/OpenPype/pull/1761)
- Anatomy others templates don't cause crash [\#1758](https://github.com/pypeclub/OpenPype/pull/1758)
- Backend acre module commit update [\#1745](https://github.com/pypeclub/OpenPype/pull/1745)
- hiero: precollect instances failing when audio selected [\#1743](https://github.com/pypeclub/OpenPype/pull/1743)
- Hiero: creator instance error [\#1742](https://github.com/pypeclub/OpenPype/pull/1742)
@ -31,9 +40,14 @@
- Default subset template for TVPaint review and workfile families [\#1716](https://github.com/pypeclub/OpenPype/pull/1716)
- Maya: Extract review hotfix [\#1714](https://github.com/pypeclub/OpenPype/pull/1714)
- Settings: Imageio improving granularity [\#1711](https://github.com/pypeclub/OpenPype/pull/1711)
- Hiero: published whole edit mov [\#1687](https://github.com/pypeclub/OpenPype/pull/1687)
- Application without executables [\#1679](https://github.com/pypeclub/OpenPype/pull/1679)
**Merged pull requests:**
- Bump prismjs from 1.23.0 to 1.24.0 in /website [\#1773](https://github.com/pypeclub/OpenPype/pull/1773)
- TVPaint ftrack family [\#1755](https://github.com/pypeclub/OpenPype/pull/1755)
- global: removing obsolete ftrack validator plugin [\#1710](https://github.com/pypeclub/OpenPype/pull/1710)
## [2.18.4](https://github.com/pypeclub/OpenPype/tree/2.18.4) (2021-06-24)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/2.18.3...2.18.4)
@ -51,10 +65,6 @@
- Tools names forwards compatibility [\#1727](https://github.com/pypeclub/OpenPype/pull/1727)
**Merged pull requests:**
- global: removing obsolete ftrack validator plugin [\#1710](https://github.com/pypeclub/OpenPype/pull/1710)
## [2.18.2](https://github.com/pypeclub/OpenPype/tree/2.18.2) (2021-06-16)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.1.0...2.18.2)
@ -98,6 +108,7 @@
- Nuke: broken publishing rendered frames [\#1707](https://github.com/pypeclub/OpenPype/pull/1707)
- Standalone publisher Thumbnail export args [\#1705](https://github.com/pypeclub/OpenPype/pull/1705)
- Bad zip can break OpenPype start [\#1691](https://github.com/pypeclub/OpenPype/pull/1691)
- Hiero: published whole edit mov [\#1687](https://github.com/pypeclub/OpenPype/pull/1687)
- Ftrack subprocess handle of stdout/stderr [\#1675](https://github.com/pypeclub/OpenPype/pull/1675)
- Settings list race condition and mutable dict list conversion [\#1671](https://github.com/pypeclub/OpenPype/pull/1671)
- Mac launch arguments fix [\#1660](https://github.com/pypeclub/OpenPype/pull/1660)

View file

@ -66,7 +66,6 @@ class ExtractThumbnailSP(pyblish.api.InstancePlugin):
else:
# Convert to jpeg if not yet converted
full_input_path = os.path.join(thumbnail_repre["stagingDir"], file)
full_input_path = '"{}"'.format(full_input_path)
self.log.info("input {}".format(full_input_path))
full_thumbnail_path = tempfile.mkstemp(suffix=".jpg")[1]

View file

@ -43,7 +43,10 @@ class ValidateFrameRange(pyblish.api.InstancePlugin):
self.log.warning("Cannot check for extension {}".format(ext))
return
frames = len(instance.data.get("representations", [None])[0]["files"])
files = instance.data.get("representations", [None])[0]["files"]
if isinstance(files, str):
files = [files]
frames = len(files)
err_msg = "Frame duration from DB:'{}' ". format(int(duration)) +\
" doesn't match number of files:'{}'".format(frames) +\

View file

@ -103,8 +103,6 @@ class CollectInstances(pyblish.api.ContextPlugin):
instance.data["layers"] = copy.deepcopy(
context.data["layersData"]
)
# Add ftrack family
instance.data["families"].append("ftrack")
elif family == "renderLayer":
instance = self.create_render_layer_instance(
@ -186,9 +184,6 @@ class CollectInstances(pyblish.api.ContextPlugin):
instance_data["layers"] = group_layers
# Add ftrack family
instance_data["families"].append("ftrack")
return context.create_instance(**instance_data)
def create_render_pass_instance(self, context, instance_data):

View file

@ -733,6 +733,9 @@ class Templates:
continue
default_key_values[key] = templates.pop(key)
# Pop "others" key before before expected keys are processed
other_templates = templates.pop("others") or {}
keys_by_subkey = {}
for sub_key, sub_value in templates.items():
key_values = {}
@ -740,7 +743,6 @@ class Templates:
key_values.update(sub_value)
keys_by_subkey[sub_key] = cls.prepare_inner_keys(key_values)
other_templates = templates.get("others") or {}
for sub_key, sub_value in other_templates.items():
if sub_key in keys_by_subkey:
log.warning((
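Popping "others" before the loop matters because the loop iterates over every remaining template key. A minimal sketch with hypothetical anatomy data shows the intended flow:

# Hypothetical anatomy templates; "others" must not be iterated as a
# regular subkey, so it is popped before the loop.
templates = {
    "work": {"file": "{project}_{asset}"},
    "others": {"thumbnail": {"file": "{asset}_thumb"}},
}

other_templates = templates.pop("others") or {}

keys_by_subkey = {}
for sub_key, sub_value in templates.items():
    keys_by_subkey[sub_key] = sub_value

for sub_key, sub_value in other_templates.items():
    if sub_key not in keys_by_subkey:
        keys_by_subkey[sub_key] = sub_value

print(sorted(keys_by_subkey))  # ['thumbnail', 'work']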

View file

@ -179,7 +179,7 @@ class Application:
if group.enabled:
enabled = data.get("enabled", True)
self.enabled = enabled
self.use_python_2 = data["use_python_2"]
self.use_python_2 = data.get("use_python_2", False)
self.label = data.get("variant_label") or name
self.full_name = "/".join((group.name, name))
@ -460,6 +460,12 @@ class ApplicationExecutable:
if os.path.exists(_executable):
executable = _executable
# Try to format executable with environments
try:
executable = executable.format(**os.environ)
except Exception:
pass
self.executable_path = executable
def __str__(self):
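A minimal sketch of the new formatting step, with a hypothetical path and environment variable: unknown placeholders raise KeyError inside str.format, which the except block swallows so the original value survives.

import os

# Hypothetical environment variable and executable path from settings
os.environ.setdefault("PROGRAM_ROOT", "C:\\Program Files")
executable = "{PROGRAM_ROOT}\\Blender Foundation\\Blender 2.91\\blender.exe"

try:
    executable = executable.format(**os.environ)
except Exception:
    # e.g. a "{not_defined}" placeholder raises KeyError; keep value as-is
    pass
print(executable)  # C:\Program Files\Blender Foundation\Blender 2.91\blender.exe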
@ -1153,6 +1159,9 @@ def prepare_host_environments(data, implementation_envs=True):
def apply_project_environments_value(project_name, env, project_settings=None):
"""Apply project specific environments on passed environments.
The environments are applied to the passed `env` argument in place, so it
is not required to apply the changes back.
Args:
project_name (str): Name of project for which environments should be
received.
@ -1161,6 +1170,9 @@ def apply_project_environments_value(project_name, env, project_settings=None):
project_settings (dict): Project settings for passed project name.
Optional if project settings are already prepared.
Returns:
dict: Passed env values with applied project environments.
Raises:
KeyError: If project settings do not contain keys for project specific
environments.
@ -1171,10 +1183,9 @@ def apply_project_environments_value(project_name, env, project_settings=None):
project_settings = get_project_settings(project_name)
env_value = project_settings["global"]["project_environments"]
if not env_value:
return env
parsed = acre.parse(env_value)
return _merge_env(parsed, env)
if env_value:
env.update(_merge_env(acre.parse(env_value), env))
return env
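A minimal sketch of the in-place contract described by the docstring, with a stand-in for the acre parsing and merge (hypothetical values): the caller's dict is mutated and the same object is returned for convenience.

def apply_project_environments_sketch(env, env_value):
    # Stand-in for: env.update(_merge_env(acre.parse(env_value), env))
    if env_value:
        env.update(env_value)
    return env

env = {"PATH": "/usr/bin"}
result = apply_project_environments_sketch(
    env, {"PROJECT_ROOT": "/mnt/projects"}
)
assert result is env                           # same object returned
assert env["PROJECT_ROOT"] == "/mnt/projects"  # applied in place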
def prepare_context_environments(data):
@ -1203,9 +1214,8 @@ def prepare_context_environments(data):
# Load project specific environments
project_name = project_doc["name"]
data["env"] = apply_project_environments_value(
project_name, data["env"]
)
# Apply project specific environments on current env value
apply_project_environments_value(project_name, data["env"])
app = data["app"]
workdir_data = get_workdir_data(

View file

@ -89,8 +89,13 @@ def ffprobe_streams(path_to_file, logger=None):
popen_stdout, popen_stderr = popen.communicate()
if popen_stdout:
logger.debug("ffprobe stdout: {}".format(popen_stdout))
logger.debug("FFprobe stdout:\n{}".format(
popen_stdout.decode("utf-8")
))
if popen_stderr:
logger.debug("ffprobe stderr: {}".format(popen_stderr))
logger.warning("FFprobe stderr:\n{}".format(
popen_stderr.decode("utf-8")
))
return json.loads(popen_stdout)["streams"]
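The decode matters because Popen.communicate() returns bytes. A minimal sketch of the difference in log output:

stdout = b'{"streams": []}\n'
print("raw: {}".format(stdout))                       # raw: b'{"streams": []}\n'
print("decoded:\n{}".format(stdout.decode("utf-8")))  # readable text, real newlines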

View file

@ -2,7 +2,10 @@ import collections
import datetime
import ftrack_api
from openpype.modules.ftrack.lib import BaseEvent
from openpype.modules.ftrack.lib import (
BaseEvent,
query_custom_attributes
)
class PushFrameValuesToTaskEvent(BaseEvent):
@ -55,10 +58,6 @@ class PushFrameValuesToTaskEvent(BaseEvent):
if entity_info.get("entityType") != "task":
continue
# Skip `Task` entity type
if entity_info["entity_type"].lower() == "task":
continue
# Care only about changes of status
changes = entity_info.get("changes")
if not changes:
@ -74,6 +73,14 @@ class PushFrameValuesToTaskEvent(BaseEvent):
if project_id is None:
continue
# Skip `Task` entity type if parent didn't change
if entity_info["entity_type"].lower() == "task":
if (
"parent_id" not in changes
or changes["parent_id"]["new"] is None
):
continue
if project_id not in entities_info_by_project_id:
entities_info_by_project_id[project_id] = []
entities_info_by_project_id[project_id].append(entity_info)
@ -117,11 +124,24 @@ class PushFrameValuesToTaskEvent(BaseEvent):
))
return
interest_attributes = set(interest_attributes)
interest_entity_types = set(interest_entity_types)
# Separate value changes and task parent changes
_entities_info = []
task_parent_changes = []
for entity_info in entities_info:
if entity_info["entity_type"].lower() == "task":
task_parent_changes.append(entity_info)
else:
_entities_info.append(entity_info)
entities_info = _entities_info
# Filter entities info with changes
interesting_data, changed_keys_by_object_id = self.filter_changes(
session, event, entities_info, interest_attributes
)
if not interesting_data:
if not interesting_data and not task_parent_changes:
return
# Prepare object types
@ -131,6 +151,289 @@ class PushFrameValuesToTaskEvent(BaseEvent):
name_low = object_type["name"].lower()
object_types_by_name[name_low] = object_type
# NOTE it would be nice to check that `interesting_data` does not contain
# value changes of tasks that were created or moved
# - but finding that out is complex
if interesting_data:
self.process_attribute_changes(
session, object_types_by_name,
interesting_data, changed_keys_by_object_id,
interest_entity_types, interest_attributes
)
if task_parent_changes:
self.process_task_parent_change(
session, object_types_by_name, task_parent_changes,
interest_entity_types, interest_attributes
)
def process_task_parent_change(
self, session, object_types_by_name, task_parent_changes,
interest_entity_types, interest_attributes
):
"""Push custom attribute values if task parent has changed.
A parent changes when a task is created or moved under a different
entity. We don't care about all task changes, only about those whose
parent is among the interest types (from settings).
The task's hierarchical value should be set or unset based on the
parent's real hierarchical value, and the non-hierarchical custom
attribute value should be set to the hierarchical value.
"""
# Store task ids which were created or moved under parent with entity
# type defined in settings (interest_entity_types).
task_ids = set()
# Store parent ids of matching task ids
matching_parent_ids = set()
# Store all entity ids of all entities to be able query hierarchical
# values.
whole_hierarchy_ids = set()
# Store parent id of each entity id
parent_id_by_entity_id = {}
for entity_info in task_parent_changes:
# Ignore entities with fewer than 2 parents
# NOTE entity itself is also part of "parents" value
parents = entity_info.get("parents") or []
if len(parents) < 2:
continue
parent_info = parents[1]
# Check if parent has entity type we care about.
if parent_info["entity_type"] not in interest_entity_types:
continue
task_ids.add(entity_info["entityId"])
matching_parent_ids.add(parent_info["entityId"])
# Store the whole hierarchy of the task entity
prev_id = None
for item in parents:
item_id = item["entityId"]
whole_hierarchy_ids.add(item_id)
if prev_id is None:
prev_id = item_id
continue
parent_id_by_entity_id[prev_id] = item_id
if item["entityType"] == "show":
break
prev_id = item_id
# Just skip if nothing is interesting for our settings
if not matching_parent_ids:
return
# Query object type ids of parent ids for custom attribute
# definitions query
entities = session.query(
"select object_type_id from TypedContext where id in ({})".format(
self.join_query_keys(matching_parent_ids)
)
)
# Prepare task object id
task_object_id = object_types_by_name["task"]["id"]
# All object ids for which we're querying custom attribute definitions
object_type_ids = set()
object_type_ids.add(task_object_id)
for entity in entities:
object_type_ids.add(entity["object_type_id"])
attrs_by_obj_id, hier_attrs = self.attrs_configurations(
session, object_type_ids, interest_attributes
)
# Skip if no task attributes are available
task_attrs = attrs_by_obj_id.get(task_object_id)
if not task_attrs:
return
# Skip attributes that are not in both hierarchical and nonhierarchical
# TODO be able to push values if hierarchical is available
for key in interest_attributes:
if key not in hier_attrs:
task_attrs.pop(key, None)
elif key not in task_attrs:
hier_attrs.pop(key)
# Skip if nothing remained
if not task_attrs:
return
# Do some preparations for custom attribute values query
attr_key_by_id = {}
nonhier_id_by_key = {}
hier_attr_ids = []
for key, attr_id in hier_attrs.items():
attr_key_by_id[attr_id] = key
hier_attr_ids.append(attr_id)
conf_ids = list(hier_attr_ids)
for key, attr_id in task_attrs.items():
attr_key_by_id[attr_id] = key
nonhier_id_by_key[key] = attr_id
conf_ids.append(attr_id)
# Query custom attribute values
# - the result does not contain values for all entities, only what the
# query call to the ftrack server returned
result = query_custom_attributes(
session, conf_ids, whole_hierarchy_ids
)
# Prepare variables where result will be stored
# - hierarchical values should not contain attribute with value by
# default
hier_values_by_entity_id = {
entity_id: {}
for entity_id in whole_hierarchy_ids
}
# - real values of custom attributes
values_by_entity_id = {
entity_id: {
attr_id: None
for attr_id in conf_ids
}
for entity_id in whole_hierarchy_ids
}
for item in result:
attr_id = item["configuration_id"]
entity_id = item["entity_id"]
value = item["value"]
values_by_entity_id[entity_id][attr_id] = value
if attr_id in hier_attr_ids and value is not None:
hier_values_by_entity_id[entity_id][attr_id] = value
# Prepare values for all task entities
# - going through all parents and storing the first value found
# - store None for those that are already known to not have any value
# set at all
for task_id in tuple(task_ids):
for attr_id in hier_attr_ids:
entity_ids = []
value = None
entity_id = task_id
while value is None:
entity_value = hier_values_by_entity_id[entity_id]
if attr_id in entity_value:
value = entity_value[attr_id]
if value is None:
break
if value is None:
entity_ids.append(entity_id)
entity_id = parent_id_by_entity_id.get(entity_id)
if entity_id is None:
break
for entity_id in entity_ids:
hier_values_by_entity_id[entity_id][attr_id] = value
# Prepare changes to commit
changes = []
for task_id in tuple(task_ids):
parent_id = parent_id_by_entity_id[task_id]
for attr_id in hier_attr_ids:
attr_key = attr_key_by_id[attr_id]
nonhier_id = nonhier_id_by_key[attr_key]
# Real value of hierarchical attribute on parent
# - If it is None then it should be unset
real_parent_value = values_by_entity_id[parent_id][attr_id]
# Current hierarchical value of a task
# - Will be compared to real parent value
hier_value = hier_values_by_entity_id[task_id][attr_id]
# Parent value that can be inherited from its parent entity
parent_value = hier_values_by_entity_id[parent_id][attr_id]
# Task value of nonhierarchical custom attribute
nonhier_value = values_by_entity_id[task_id][nonhier_id]
if real_parent_value != hier_value:
changes.append({
"new_value": real_parent_value,
"attr_id": attr_id,
"entity_id": task_id,
"attr_key": attr_key
})
if parent_value != nonhier_value:
changes.append({
"new_value": parent_value,
"attr_id": nonhier_id,
"entity_id": task_id,
"attr_key": attr_key
})
self._commit_changes(session, changes)
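A minimal sketch of the parent walk above, with hypothetical ids and a single "fps" attribute: a task with no hierarchical value inherits from the closest parent that has one, and the value is back-filled along the visited chain.

# Hypothetical hierarchy: task1 -> shot1 -> seq1
parent_id_by_entity_id = {"task1": "shot1", "shot1": "seq1"}
hier_values = {"task1": {}, "shot1": {}, "seq1": {"fps": 25}}

entity_ids = []
value = None
entity_id = "task1"
while value is None:
    entity_value = hier_values[entity_id]
    if "fps" in entity_value:
        value = entity_value["fps"]
        if value is None:
            break
    if value is None:
        entity_ids.append(entity_id)
        entity_id = parent_id_by_entity_id.get(entity_id)
        if entity_id is None:
            break

for entity_id in entity_ids:
    hier_values[entity_id]["fps"] = value

print(hier_values)  # task1 and shot1 are back-filled with fps 25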
def _commit_changes(self, session, changes):
uncommitted_changes = False
for idx, item in enumerate(changes):
new_value = item["new_value"]
attr_id = item["attr_id"]
entity_id = item["entity_id"]
attr_key = item["attr_key"]
entity_key = collections.OrderedDict()
entity_key["configuration_id"] = attr_id
entity_key["entity_id"] = entity_id
self._cached_changes.append({
"attr_key": attr_key,
"entity_id": entity_id,
"value": new_value,
"time": datetime.datetime.now()
})
if new_value is None:
op = ftrack_api.operation.DeleteEntityOperation(
"CustomAttributeValue",
entity_key
)
else:
op = ftrack_api.operation.UpdateEntityOperation(
"ContextCustomAttributeValue",
entity_key,
"value",
ftrack_api.symbol.NOT_SET,
new_value
)
session.recorded_operations.push(op)
self.log.info((
"Changing Custom Attribute \"{}\" to value"
" \"{}\" on entity: {}"
).format(attr_key, new_value, entity_id))
if (idx + 1) % 20 == 0:
uncommitted_changes = False
try:
session.commit()
except Exception:
session.rollback()
self.log.warning(
"Changing of values failed.", exc_info=True
)
else:
uncommitted_changes = True
if uncommitted_changes:
try:
session.commit()
except Exception:
session.rollback()
self.log.warning("Changing of values failed.", exc_info=True)
def process_attribute_changes(
self, session, object_types_by_name,
interesting_data, changed_keys_by_object_id,
interest_entity_types, interest_attributes
):
# Prepare task object id
task_object_id = object_types_by_name["task"]["id"]
@ -216,13 +519,13 @@ class PushFrameValuesToTaskEvent(BaseEvent):
task_entity_ids.add(task_id)
parent_id_by_task_id[task_id] = task_entity["parent_id"]
self.finalize(
self.finalize_attribute_changes(
session, interesting_data,
changed_keys, attrs_by_obj_id, hier_attrs,
task_entity_ids, parent_id_by_task_id
)
def finalize(
def finalize_attribute_changes(
self, session, interesting_data,
changed_keys, attrs_by_obj_id, hier_attrs,
task_entity_ids, parent_id_by_task_id
@ -248,6 +551,7 @@ class PushFrameValuesToTaskEvent(BaseEvent):
session, attr_ids, entity_ids, task_entity_ids, hier_attrs
)
changes = []
for entity_id, current_values in current_values_by_id.items():
parent_id = parent_id_by_task_id.get(entity_id)
if not parent_id:
@ -272,39 +576,13 @@ class PushFrameValuesToTaskEvent(BaseEvent):
if new_value == old_value:
continue
entity_key = collections.OrderedDict()
entity_key["configuration_id"] = attr_id
entity_key["entity_id"] = entity_id
self._cached_changes.append({
"attr_key": attr_key,
changes.append({
"new_value": new_value,
"attr_id": attr_id,
"entity_id": entity_id,
"value": new_value,
"time": datetime.datetime.now()
"attr_key": attr_key
})
if new_value is None:
op = ftrack_api.operation.DeleteEntityOperation(
"CustomAttributeValue",
entity_key
)
else:
op = ftrack_api.operation.UpdateEntityOperation(
"ContextCustomAttributeValue",
entity_key,
"value",
ftrack_api.symbol.NOT_SET,
new_value
)
session.recorded_operations.push(op)
self.log.info((
"Changing Custom Attribute \"{}\" to value"
" \"{}\" on entity: {}"
).format(attr_key, new_value, entity_id))
try:
session.commit()
except Exception:
session.rollback()
self.log.warning("Changing of values failed.", exc_info=True)
self._commit_changes(session, changes)
def filter_changes(
self, session, event, entities_info, interest_attributes

View file

@ -13,7 +13,8 @@ from .custom_attributes import (
default_custom_attributes_definition,
app_definitions_from_app_manager,
tool_definitions_from_app_manager,
get_openpype_attr
get_openpype_attr,
query_custom_attributes
)
from . import avalon_sync
@ -37,6 +38,7 @@ __all__ = (
"app_definitions_from_app_manager",
"tool_definitions_from_app_manager",
"get_openpype_attr",
"query_custom_attributes",
"avalon_sync",

View file

@ -81,3 +81,60 @@ def get_openpype_attr(session, split_hierarchical=True, query_keys=None):
return custom_attributes, hier_custom_attributes
return custom_attributes
def join_query_keys(keys):
"""Helper to join keys to query."""
return ",".join(["\"{}\"".format(key) for key in keys])
def query_custom_attributes(session, conf_ids, entity_ids, table_name=None):
"""Query custom attribute values from ftrack database.
The result of ftrack's call method may differ based on the table name
used and the version of the ftrack server.
Args:
session(ftrack_api.Session): Connected ftrack session.
conf_ids(list, set, tuple): Configuration (attribute) ids which are
queried.
entity_ids(list, set, tuple): Entity ids for which values are queried.
table_name(str): Table name from which values are queried. Not
recommended to change unless you know what it means.
"""
output = []
# Just skip if there is nothing to query
if not conf_ids or not entity_ids:
return output
if table_name is None:
table_name = "ContextCustomAttributeValue"
# Prepare values to query
attributes_joined = join_query_keys(conf_ids)
attributes_len = len(conf_ids)
# Query values in chunks
chunk_size = int(5000 / attributes_len)
# Make sure entity_ids is `list` for chunk selection
entity_ids = list(entity_ids)
for idx in range(0, len(entity_ids), chunk_size):
entity_ids_joined = join_query_keys(
entity_ids[idx:idx + chunk_size]
)
call_expr = [{
"action": "query",
"expression": (
"select value, entity_id from {}"
" where entity_id in ({}) and configuration_id in ({})"
).format(table_name, entity_ids_joined, attributes_joined)
}]
if hasattr(session, "call"):
[result] = session.call(call_expr)
else:
[result] = session._call(call_expr)
for item in result["data"]:
output.append(item)
return output
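A minimal sketch of the chunk math above with hypothetical ids: four attribute ids give int(5000 / 4) == 1250 entity ids per server call, so 3000 entities resolve into three calls.

conf_ids = ["c1", "c2", "c3", "c4"]                  # hypothetical ids
entity_ids = ["e{}".format(i) for i in range(3000)]

chunk_size = int(5000 / len(conf_ids))
chunks = [
    entity_ids[idx:idx + chunk_size]
    for idx in range(0, len(entity_ids), chunk_size)
]
print(len(chunks), [len(chunk) for chunk in chunks])  # 3 [1250, 1250, 500]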

View file

@ -975,11 +975,31 @@ class ExtractReview(pyblish.api.InstancePlugin):
# NOTE Skipped using instance's resolution
full_input_path_single_file = temp_data["full_input_path_single_file"]
input_data = ffprobe_streams(
full_input_path_single_file, self.log
)[0]
input_width = int(input_data["width"])
input_height = int(input_data["height"])
try:
streams = ffprobe_streams(
full_input_path_single_file, self.log
)
except Exception:
raise AssertionError((
"FFprobe couldn't read information about input file: \"{}\""
).format(full_input_path_single_file))
# Try to find the first stream with 'width' and 'height' defined
# - this avoids stream orderings where audio comes first
# - there may be a better way (checking `codec_type`?)
input_width = None
input_height = None
for stream in streams:
if "width" in stream and "height" in stream:
input_width = int(stream["width"])
input_height = int(stream["height"])
break
# Raise an exception if no stream defined the input resolution
if input_width is None:
raise AssertionError((
"FFprobe couldn't read resolution from input file: \"{}\""
).format(full_input_path_single_file))
# NOTE Setting only one of `width` or `height` is not allowed
# - settings value can't have None but has value of 0
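A minimal sketch of the stream scan above, with hypothetical ffprobe output: audio streams carry no width/height, so the first stream defining both is taken as the input resolution.

streams = [
    {"codec_type": "audio", "codec_name": "aac"},
    {"codec_type": "video", "codec_name": "h264",
     "width": 1920, "height": 1080},
]

input_width = None
input_height = None
for stream in streams:
    if "width" in stream and "height" in stream:
        input_width = int(stream["width"])
        input_height = int(stream["height"])
        break
print(input_width, input_height)  # 1920 1080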

View file

@ -26,9 +26,23 @@ class ExtractReviewSlate(openpype.api.Extractor):
slate_path = inst_data.get("slateFrame")
ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
slate_stream = openpype.lib.ffprobe_streams(slate_path, self.log)[0]
slate_width = slate_stream["width"]
slate_height = slate_stream["height"]
slate_streams = openpype.lib.ffprobe_streams(slate_path, self.log)
# Try to find the first stream with 'width' and 'height' defined
# - this avoids stream orderings where audio comes first
# - there may be a better way (checking `codec_type`?)
slate_width = None
slate_height = None
for slate_stream in slate_streams:
if "width" in slate_stream and "height" in slate_stream:
slate_width = int(slate_stream["width"])
slate_height = int(slate_stream["height"])
break
# Raise an exception if no stream defined the input resolution
if slate_width is None:
raise AssertionError((
"FFprobe couldn't read resolution from input file: \"{}\""
).format(slate_path))
if "reviewToWidth" in inst_data:
use_legacy_code = True
@ -309,16 +323,29 @@ class ExtractReviewSlate(openpype.api.Extractor):
)
return codec_args
codec_name = streams[0].get("codec_name")
# Try to find the first stream that is not audio
no_audio_stream = None
for stream in streams:
if stream.get("codec_type") != "audio":
no_audio_stream = stream
break
if no_audio_stream is None:
self.log.warning((
"Couldn't find stream that is not an audio from file \"{}\""
).format(full_input_path))
return codec_args
codec_name = no_audio_stream.get("codec_name")
if codec_name:
codec_args.append("-codec:v {}".format(codec_name))
profile_name = streams[0].get("profile")
profile_name = no_audio_stream.get("profile")
if profile_name:
profile_name = profile_name.replace(" ", "_").lower()
codec_args.append("-profile:v {}".format(profile_name))
pix_fmt = streams[0].get("pix_fmt")
pix_fmt = no_audio_stream.get("pix_fmt")
if pix_fmt:
codec_args.append("-pix_fmt {}".format(pix_fmt))
return codec_args

View file

@ -259,6 +259,26 @@
"tasks": [],
"add_ftrack_family": true,
"advanced_filtering": []
},
{
"hosts": [
"tvpaint"
],
"families": [
"renderPass"
],
"tasks": [],
"add_ftrack_family": false,
"advanced_filtering": []
},
{
"hosts": [
"tvpaint"
],
"families": [],
"tasks": [],
"add_ftrack_family": true,
"advanced_filtering": []
}
]
},

View file

@ -807,7 +807,6 @@
"environment": {},
"variants": {
"2-83": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Blender Foundation\\Blender 2.83\\blender.exe"
@ -829,7 +828,6 @@
"environment": {}
},
"2-90": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Blender Foundation\\Blender 2.90\\blender.exe"
@ -851,7 +849,6 @@
"environment": {}
},
"2-91": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Blender Foundation\\Blender 2.91\\blender.exe"
@ -891,7 +888,6 @@
"20": {
"enabled": true,
"variant_label": "20",
"use_python_2": false,
"executables": {
"windows": [],
"darwin": [],
@ -907,7 +903,6 @@
"17": {
"enabled": true,
"variant_label": "17",
"use_python_2": false,
"executables": {
"windows": [],
"darwin": [
@ -932,7 +927,6 @@
"environment": {},
"variants": {
"animation_11-64bits": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\TVPaint Developpement\\TVPaint Animation 11 (64bits)\\TVPaint Animation 11 (64bits).exe"
@ -948,7 +942,6 @@
"environment": {}
},
"animation_11-32bits": {
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files (x86)\\TVPaint Developpement\\TVPaint Animation 11 (32bits)\\TVPaint Animation 11 (32bits).exe"
@ -982,7 +975,6 @@
"2020": {
"enabled": true,
"variant_label": "2020",
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Adobe\\Adobe Photoshop 2020\\Photoshop.exe"
@ -1000,7 +992,6 @@
"2021": {
"enabled": true,
"variant_label": "2021",
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Adobe\\Adobe Photoshop 2021\\Photoshop.exe"
@ -1030,7 +1021,6 @@
"2020": {
"enabled": true,
"variant_label": "2020",
"use_python_2": false,
"executables": {
"windows": [
""
@ -1048,7 +1038,6 @@
"2021": {
"enabled": true,
"variant_label": "2021",
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Adobe\\Adobe After Effects 2021\\Support Files\\AfterFX.exe"

View file

@ -279,6 +279,11 @@ class BaseItemEntity(BaseEntity):
self, "Dynamic entity can't require restart."
)
@abstractproperty
def root_key(self):
"""Root is represented as this dictionary key."""
pass
@abstractmethod
def set_override_state(self, state):
"""Set override state and trigger it on children.
@ -866,6 +871,10 @@ class ItemEntity(BaseItemEntity):
"""Call save on root item."""
self.root_item.save()
@property
def root_key(self):
return self.root_item.root_key
def schema_validations(self):
if not self.label and self.use_label_wrap:
reason = (
@ -885,7 +894,11 @@ class ItemEntity(BaseItemEntity):
def create_schema_object(self, *args, **kwargs):
"""Reference method for creation of entities defined in RootEntity."""
return self.root_item.create_schema_object(*args, **kwargs)
return self.schema_hub.create_schema_object(*args, **kwargs)
@property
def schema_hub(self):
return self.root_item.schema_hub
def get_entity_from_path(self, path):
return self.root_item.get_entity_from_path(path)

View file

@ -1,4 +1,5 @@
import copy
import collections
from .lib import (
WRAPPER_TYPES,
@ -138,7 +139,16 @@ class DictImmutableKeysEntity(ItemEntity):
method when handling gui wrappers.
"""
added_children = []
for children_schema in schema_data["children"]:
children_deque = collections.deque()
for _children_schema in schema_data["children"]:
children_schemas = self.schema_hub.resolve_schema_data(
_children_schema
)
for children_schema in children_schemas:
children_deque.append(children_schema)
while children_deque:
children_schema = children_deque.popleft()
if children_schema["type"] in WRAPPER_TYPES:
_children_schema = copy.deepcopy(children_schema)
wrapper_children = self._add_children(

View file

@ -2,6 +2,7 @@ import os
import re
import json
import copy
import inspect
from .exceptions import (
SchemaTemplateMissingKeys,
@ -25,335 +26,6 @@ TEMPLATE_METADATA_KEYS = (
template_key_pattern = re.compile(r"(\{.*?[^{0]*\})")
def _pop_metadata_item(template):
found_idx = None
for idx, item in enumerate(template):
if not isinstance(item, dict):
continue
for key in TEMPLATE_METADATA_KEYS:
if key in item:
found_idx = idx
break
if found_idx is not None:
break
metadata_item = {}
if found_idx is not None:
metadata_item = template.pop(found_idx)
return metadata_item
def _fill_schema_template_data(
template, template_data, skip_paths, required_keys=None, missing_keys=None
):
first = False
if required_keys is None:
first = True
if "skip_paths" in template_data:
skip_paths = template_data["skip_paths"]
if not isinstance(skip_paths, list):
skip_paths = [skip_paths]
# Cleanup skip paths (skip empty values)
skip_paths = [path for path in skip_paths if path]
required_keys = set()
missing_keys = set()
# Copy template data as content may change
template = copy.deepcopy(template)
# Get metadata item from template
metadata_item = _pop_metadata_item(template)
# Check for default values for template data
default_values = metadata_item.get(DEFAULT_VALUES_KEY) or {}
for key, value in default_values.items():
if key not in template_data:
template_data[key] = value
if not template:
output = template
elif isinstance(template, list):
# Store paths by first part if path
# - None value says that whole key should be skipped
skip_paths_by_first_key = {}
for path in skip_paths:
parts = path.split("/")
key = parts.pop(0)
if key not in skip_paths_by_first_key:
skip_paths_by_first_key[key] = []
value = "/".join(parts)
skip_paths_by_first_key[key].append(value or None)
output = []
for item in template:
# Get skip paths for children item
_skip_paths = []
if not isinstance(item, dict):
pass
elif item.get("type") in WRAPPER_TYPES:
_skip_paths = copy.deepcopy(skip_paths)
elif skip_paths_by_first_key:
# Check if this item should be skipped
key = item.get("key")
if key and key in skip_paths_by_first_key:
_skip_paths = skip_paths_by_first_key[key]
# Skip whole item if None is in skip paths value
if None in _skip_paths:
continue
output_item = _fill_schema_template_data(
item, template_data, _skip_paths, required_keys, missing_keys
)
if output_item:
output.append(output_item)
elif isinstance(template, dict):
output = {}
for key, value in template.items():
output[key] = _fill_schema_template_data(
value, template_data, skip_paths, required_keys, missing_keys
)
if output.get("type") in WRAPPER_TYPES and not output.get("children"):
return {}
elif isinstance(template, STRING_TYPE):
# TODO find much better way how to handle filling template data
template = template.replace("{{", "__dbcb__").replace("}}", "__decb__")
for replacement_string in template_key_pattern.findall(template):
key = str(replacement_string[1:-1])
required_keys.add(key)
if key not in template_data:
missing_keys.add(key)
continue
value = template_data[key]
if replacement_string == template:
# Replace the value with value from templates data
# - with this is possible to set value with different type
template = value
else:
# Only replace the key in string
template = template.replace(replacement_string, value)
output = template.replace("__dbcb__", "{").replace("__decb__", "}")
else:
output = template
if first and missing_keys:
raise SchemaTemplateMissingKeys(missing_keys, required_keys)
return output
def _fill_schema_template(child_data, schema_collection, schema_templates):
template_name = child_data["name"]
template = schema_templates.get(template_name)
if template is None:
if template_name in schema_collection:
raise KeyError((
"Schema \"{}\" is used as `schema_template`"
).format(template_name))
raise KeyError("Schema template \"{}\" was not found".format(
template_name
))
# Default value must be dictionary (NOT list)
# - empty list would not add any item if `template_data` are not filled
template_data = child_data.get("template_data") or {}
if isinstance(template_data, dict):
template_data = [template_data]
skip_paths = child_data.get("skip_paths") or []
if isinstance(skip_paths, STRING_TYPE):
skip_paths = [skip_paths]
output = []
for single_template_data in template_data:
try:
filled_child = _fill_schema_template_data(
template, single_template_data, skip_paths
)
except SchemaTemplateMissingKeys as exc:
raise SchemaTemplateMissingKeys(
exc.missing_keys, exc.required_keys, template_name
)
for item in filled_child:
filled_item = _fill_inner_schemas(
item, schema_collection, schema_templates
)
if filled_item["type"] == "schema_template":
output.extend(_fill_schema_template(
filled_item, schema_collection, schema_templates
))
else:
output.append(filled_item)
return output
def _fill_inner_schemas(schema_data, schema_collection, schema_templates):
if schema_data["type"] == "schema":
raise ValueError("First item in schema data can't be schema.")
children_key = "children"
object_type_key = "object_type"
for item_key in (children_key, object_type_key):
children = schema_data.get(item_key)
if not children:
continue
if object_type_key == item_key:
if not isinstance(children, dict):
continue
children = [children]
new_children = []
for child in children:
child_type = child["type"]
if child_type == "schema":
schema_name = child["name"]
if schema_name not in schema_collection:
if schema_name in schema_templates:
raise KeyError((
"Schema template \"{}\" is used as `schema`"
).format(schema_name))
raise KeyError(
"Schema \"{}\" was not found".format(schema_name)
)
filled_child = _fill_inner_schemas(
schema_collection[schema_name],
schema_collection,
schema_templates
)
elif child_type in ("template", "schema_template"):
for filled_child in _fill_schema_template(
child, schema_collection, schema_templates
):
new_children.append(filled_child)
continue
else:
filled_child = _fill_inner_schemas(
child, schema_collection, schema_templates
)
new_children.append(filled_child)
if item_key == object_type_key:
if len(new_children) != 1:
raise KeyError((
"Failed to fill object type with type: {} | name {}"
).format(
child_type, str(child.get("name"))
))
new_children = new_children[0]
schema_data[item_key] = new_children
return schema_data
# TODO reimplement logic inside entities
def validate_environment_groups_uniquenes(
schema_data, env_groups=None, keys=None
):
is_first = False
if env_groups is None:
is_first = True
env_groups = {}
keys = []
my_keys = copy.deepcopy(keys)
key = schema_data.get("key")
if key:
my_keys.append(key)
env_group_key = schema_data.get("env_group_key")
if env_group_key:
if env_group_key not in env_groups:
env_groups[env_group_key] = []
env_groups[env_group_key].append("/".join(my_keys))
children = schema_data.get("children")
if not children:
return
for child in children:
validate_environment_groups_uniquenes(
child, env_groups, copy.deepcopy(my_keys)
)
if is_first:
invalid = {}
for env_group_key, key_paths in env_groups.items():
if len(key_paths) > 1:
invalid[env_group_key] = key_paths
if invalid:
raise SchemaDuplicatedEnvGroupKeys(invalid)
def validate_schema(schema_data):
validate_environment_groups_uniquenes(schema_data)
def get_gui_schema(subfolder, main_schema_name):
dirpath = os.path.join(
os.path.dirname(__file__),
"schemas",
subfolder
)
loaded_schemas = {}
loaded_schema_templates = {}
for root, _, filenames in os.walk(dirpath):
for filename in filenames:
basename, ext = os.path.splitext(filename)
if ext != ".json":
continue
filepath = os.path.join(root, filename)
with open(filepath, "r") as json_stream:
try:
schema_data = json.load(json_stream)
except Exception as exc:
raise ValueError((
"Unable to parse JSON file {}\n{}"
).format(filepath, str(exc)))
if isinstance(schema_data, list):
loaded_schema_templates[basename] = schema_data
else:
loaded_schemas[basename] = schema_data
main_schema = _fill_inner_schemas(
loaded_schemas[main_schema_name],
loaded_schemas,
loaded_schema_templates
)
validate_schema(main_schema)
return main_schema
def get_studio_settings_schema():
return get_gui_schema("system_schema", "schema_main")
def get_project_settings_schema():
return get_gui_schema("projects_schema", "schema_main")
class OverrideStateItem:
"""Object used as item for `OverrideState` enum.
@ -426,3 +98,506 @@ class OverrideState:
DEFAULTS = OverrideStateItem(0, "Defaults")
STUDIO = OverrideStateItem(1, "Studio overrides")
PROJECT = OverrideStateItem(2, "Project Overrides")
class SchemasHub:
def __init__(self, schema_subfolder, reset=True):
self._schema_subfolder = schema_subfolder
self._loaded_types = {}
self._gui_types = tuple()
self._crashed_on_load = {}
self._loaded_templates = {}
self._loaded_schemas = {}
# It doesn't make sense to reload types on each reset as they can't be
# changed
self._load_types()
# Trigger reset
if reset:
self.reset()
def reset(self):
self._load_schemas()
@property
def gui_types(self):
return self._gui_types
def get_schema(self, schema_name):
"""Get schema definition data by it's name.
Returns:
dict: Copy of schema loaded from json files.
Raises:
KeyError: When the schema name is stored in loaded templates, when
the json file could not be parsed, or when the schema name was not
found.
"""
if schema_name not in self._loaded_schemas:
if schema_name in self._loaded_templates:
raise KeyError((
"Template \"{}\" is used as `schema`"
).format(schema_name))
elif schema_name in self._crashed_on_load:
crashed_item = self._crashed_on_load[schema_name]
raise KeyError(
"Unable to parse schema file \"{}\". {}".format(
crashed_item["filpath"], crashed_item["message"]
)
)
raise KeyError(
"Schema \"{}\" was not found".format(schema_name)
)
return copy.deepcopy(self._loaded_schemas[schema_name])
def get_template(self, template_name):
"""Get template definition data by it's name.
Returns:
list: Copy of template items loaded from json files.
Raises:
KeyError: When the template name is stored in loaded schemas, when
the json file could not be parsed, or when the template name was not
found.
"""
if template_name not in self._loaded_templates:
if template_name in self._loaded_schemas:
raise KeyError((
"Schema \"{}\" is used as `template`"
).format(template_name))
elif template_name in self._crashed_on_load:
crashed_item = self._crashed_on_load[template_name]
raise KeyError(
"Unable to parse templace file \"{}\". {}".format(
crashed_item["filpath"], crashed_item["message"]
)
)
raise KeyError(
"Template \"{}\" was not found".format(template_name)
)
return copy.deepcopy(self._loaded_templates[template_name])
def resolve_schema_data(self, schema_data):
"""Resolve single item schema data as few types can be expanded.
This is mainly for 'schema' and 'template' types. Type 'schema' does
not have entity representation and 'template' may contain more than one
output schemas.
In other cases is retuned passed schema item in list.
Goal is to have schema and template resolving at one place.
Returns:
list: Resolved schema data.
"""
schema_type = schema_data["type"]
if schema_type not in ("schema", "template", "schema_template"):
return [schema_data]
if schema_type == "schema":
return self.resolve_schema_data(
self.get_schema(schema_data["name"])
)
template_name = schema_data["name"]
template_def = self.get_template(template_name)
filled_template = self._fill_template(
schema_data, template_def
)
return filled_template
def create_schema_object(self, schema_data, *args, **kwargs):
"""Create entity for passed schema data.
Args:
schema_data(dict): Schema definition of settings entity.
Returns:
ItemEntity: Created entity for passed schema data item.
Raises:
ValueError: When 'schema', 'template' or any of wrapper types are
passed.
KeyError: When type of passed schema is not known.
"""
schema_type = schema_data["type"]
if schema_type in ("schema", "template", "schema_template"):
raise ValueError(
"Got unresolved schema data of type \"{}\"".format(schema_type)
)
if schema_type in WRAPPER_TYPES:
raise ValueError((
"Function `create_schema_object` can't create entities"
" of any wrapper type. Got type: \"{}\""
).format(schema_type))
klass = self._loaded_types.get(schema_type)
if not klass:
raise KeyError("Unknown type \"{}\"".format(schema_type))
return klass(schema_data, *args, **kwargs)
def _load_types(self):
"""Prepare entity types for cretion of their objects.
Currently all classes in `openpype.settings.entities` that inherited
from `BaseEntity` are stored as loaded types. GUI types are stored to
separated attribute to not mess up api access of entities.
TODOs:
Add a more dynamic way to add custom types from anywhere and
better handling of abstract classes. Skipping them is dangerous.
"""
from openpype.settings import entities
# Define known abstract classes
known_abstract_classes = (
entities.BaseEntity,
entities.BaseItemEntity,
entities.ItemEntity,
entities.EndpointEntity,
entities.InputEntity,
entities.BaseEnumEntity
)
self._loaded_types = {}
_gui_types = []
for attr in dir(entities):
item = getattr(entities, attr)
# Filter classes
if not inspect.isclass(item):
continue
# Skip classes that do not inherit from BaseEntity
if not issubclass(item, entities.BaseEntity):
continue
# Skip class that is abstract by design
if item in known_abstract_classes:
continue
if inspect.isabstract(item):
# Instantiate to trigger a crash and get a traceback
item()
# Backwards compatibility
# Single entity may have multiple schema types
for schema_type in item.schema_types:
self._loaded_types[schema_type] = item
if item.gui_type:
_gui_types.append(item)
self._gui_types = tuple(_gui_types)
def _load_schemas(self):
"""Load schema definitions from json files."""
# Refresh all affecting variables
self._crashed_on_load = {}
self._loaded_templates = {}
self._loaded_schemas = {}
dirpath = os.path.join(
os.path.dirname(os.path.abspath(__file__)),
"schemas",
self._schema_subfolder
)
loaded_schemas = {}
loaded_templates = {}
for root, _, filenames in os.walk(dirpath):
for filename in filenames:
basename, ext = os.path.splitext(filename)
if ext != ".json":
continue
filepath = os.path.join(root, filename)
with open(filepath, "r") as json_stream:
try:
schema_data = json.load(json_stream)
except Exception as exc:
msg = str(exc)
print("Unable to parse JSON file {}\n{}".format(
filepath, msg
))
self._crashed_on_load[basename] = {
"filepath": filepath,
"message": msg
}
continue
if basename in self._crashed_on_load:
crashed_item = self._crashed_on_load[basename]
raise KeyError((
"Duplicated filename \"{}\"."
" One of them crashed on load \"{}\" {}"
).format(
filename,
crashed_item["filpath"],
crashed_item["message"]
))
if isinstance(schema_data, list):
if basename in loaded_templates:
raise KeyError(
"Duplicated template filename \"{}\"".format(
filename
)
)
loaded_templates[basename] = schema_data
else:
if basename in loaded_schemas:
raise KeyError(
"Duplicated schema filename \"{}\"".format(
filename
)
)
loaded_schemas[basename] = schema_data
self._loaded_templates = loaded_templates
self._loaded_schemas = loaded_schemas
def _fill_template(self, child_data, template_def):
"""Fill template based on schema definition and template definition.
Based on `child_data`, `template_def` is modified and the result is
returned.
A template definition may define data to fill, which is supplied
from the child data.
Child data may contain more than one output definition of a template.
Child data can define paths to skip. A path is the full path of an item
which won't be returned.
TODO:
Be able to handle wrapper items here.
Args:
child_data(dict): Schema data of template item.
template_def(dict): Template definition that will be filled with
child_data.
Returns:
list: Resolving a template always returns a list of schemas.
"""
template_name = child_data["name"]
# Default value must be dictionary (NOT list)
# - empty list would not add any item if `template_data` are not filled
template_data = child_data.get("template_data") or {}
if isinstance(template_data, dict):
template_data = [template_data]
skip_paths = child_data.get("skip_paths") or []
if isinstance(skip_paths, STRING_TYPE):
skip_paths = [skip_paths]
output = []
for single_template_data in template_data:
try:
output.extend(self._fill_template_data(
template_def, single_template_data, skip_paths
))
except SchemaTemplateMissingKeys as exc:
raise SchemaTemplateMissingKeys(
exc.missing_keys, exc.required_keys, template_name
)
return output
def _fill_template_data(
self,
template,
template_data,
skip_paths,
required_keys=None,
missing_keys=None
):
"""Fill template values with data from schema data.
A template has more abilities than a schema. It is expected that a
template will be used in multiple places (but it may not be). A schema
represents exactly one entity and its children, but a template may
represent more entities.
A template can have "keys to fill" from its definition. Some keys may
be required and some optional, because the template defines default
values for them.
A template also has the ability to "skip paths", which means skipping
entities from its content. A template can be used across multiple
places with different requirements.
Raises:
SchemaTemplateMissingKeys: When fill data does not contain all
required keys for the template.
"""
first = False
if required_keys is None:
first = True
if "skip_paths" in template_data:
skip_paths = template_data["skip_paths"]
if not isinstance(skip_paths, list):
skip_paths = [skip_paths]
# Cleanup skip paths (skip empty values)
skip_paths = [path for path in skip_paths if path]
required_keys = set()
missing_keys = set()
# Copy template data as content may change
template = copy.deepcopy(template)
# Get metadata item from template
metadata_item = self._pop_metadata_item(template)
# Check for default values for template data
default_values = metadata_item.get(DEFAULT_VALUES_KEY) or {}
for key, value in default_values.items():
if key not in template_data:
template_data[key] = value
if not template:
output = template
elif isinstance(template, list):
# Store paths by the first part of the path
# - a None value says that the whole key should be skipped
skip_paths_by_first_key = {}
for path in skip_paths:
parts = path.split("/")
key = parts.pop(0)
if key not in skip_paths_by_first_key:
skip_paths_by_first_key[key] = []
value = "/".join(parts)
skip_paths_by_first_key[key].append(value or None)
output = []
for item in template:
# Get skip paths for children item
_skip_paths = []
if not isinstance(item, dict):
pass
elif item.get("type") in WRAPPER_TYPES:
_skip_paths = copy.deepcopy(skip_paths)
elif skip_paths_by_first_key:
# Check if this item should be skipped
key = item.get("key")
if key and key in skip_paths_by_first_key:
_skip_paths = skip_paths_by_first_key[key]
# Skip whole item if None is in skip paths value
if None in _skip_paths:
continue
output_item = self._fill_template_data(
item,
template_data,
_skip_paths,
required_keys,
missing_keys
)
if output_item:
output.append(output_item)
elif isinstance(template, dict):
output = {}
for key, value in template.items():
output[key] = self._fill_template_data(
value,
template_data,
skip_paths,
required_keys,
missing_keys
)
if (
output.get("type") in WRAPPER_TYPES
and not output.get("children")
):
return {}
elif isinstance(template, STRING_TYPE):
# TODO find a much better way to handle filling template data
template = (
template
.replace("{{", "__dbcb__")
.replace("}}", "__decb__")
)
full_replacement = False
for replacement_string in template_key_pattern.findall(template):
key = str(replacement_string[1:-1])
required_keys.add(key)
if key not in template_data:
missing_keys.add(key)
continue
value = template_data[key]
if replacement_string == template:
# Replace the value with value from templates data
# - with this it is possible to set a value of a different type
template = value
full_replacement = True
else:
# Only replace the key in string
template = template.replace(replacement_string, value)
if not full_replacement:
output = (
template
.replace("__dbcb__", "{")
.replace("__decb__", "}")
)
else:
output = template
else:
output = template
if first and missing_keys:
raise SchemaTemplateMissingKeys(missing_keys, required_keys)
return output
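A minimal sketch of the brace handling above (hypothetical template string): doubled braces survive as literal braces in the output while single-brace placeholders are substituted from the template data.

import re

# Same pattern as `template_key_pattern` defined above
pattern = re.compile(r"(\{.*?[^{0]*\})")

template = "{{project}}/{app_variant}"
template_data = {"app_variant": "2021"}

template = template.replace("{{", "__dbcb__").replace("}}", "__decb__")
for replacement_string in pattern.findall(template):
    key = replacement_string[1:-1]
    template = template.replace(replacement_string, template_data[key])
output = template.replace("__dbcb__", "{").replace("__decb__", "}")
print(output)  # {project}/2021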
def _pop_metadata_item(self, template_def):
"""Pop template metadata from template definition.
Template metadata may define default values for keys that are not
passed from schema data.
"""
found_idx = None
for idx, item in enumerate(template_def):
if not isinstance(item, dict):
continue
for key in TEMPLATE_METADATA_KEYS:
if key in item:
found_idx = idx
break
if found_idx is not None:
break
metadata_item = {}
if found_idx is not None:
metadata_item = template_def.pop(found_idx)
return metadata_item

View file

@ -1,7 +1,7 @@
import os
import json
import copy
import inspect
import collections
from abc import abstractmethod
@ -10,8 +10,7 @@ from .lib import (
NOT_SET,
WRAPPER_TYPES,
OverrideState,
get_studio_settings_schema,
get_project_settings_schema
SchemasHub
)
from .exceptions import (
SchemaError,
@ -53,7 +52,12 @@ class RootEntity(BaseItemEntity):
"""
schema_types = ["root"]
def __init__(self, schema_data, reset):
def __init__(self, schema_hub, reset, main_schema_name=None):
self.schema_hub = schema_hub
if not main_schema_name:
main_schema_name = "schema_main"
schema_data = schema_hub.get_schema(main_schema_name)
super(RootEntity, self).__init__(schema_data)
self._require_restart_callbacks = []
self._item_ids_require_restart = set()
@ -130,7 +134,17 @@ class RootEntity(BaseItemEntity):
def _add_children(self, schema_data, first=True):
added_children = []
for children_schema in schema_data["children"]:
children_deque = collections.deque()
for _children_schema in schema_data["children"]:
children_schemas = self.schema_hub.resolve_schema_data(
_children_schema
)
for children_schema in children_schemas:
children_deque.append(children_schema)
while children_deque:
children_schema = children_deque.popleft()
if children_schema["type"] in WRAPPER_TYPES:
_children_schema = copy.deepcopy(children_schema)
wrapper_children = self._add_children(
@ -143,11 +157,13 @@ class RootEntity(BaseItemEntity):
child_obj = self.create_schema_object(children_schema, self)
self.children.append(child_obj)
added_children.append(child_obj)
if isinstance(child_obj, self._gui_types):
if isinstance(child_obj, self.schema_hub.gui_types):
continue
if child_obj.key in self.non_gui_children:
raise KeyError("Duplicated key \"{}\"".format(child_obj.key))
raise KeyError(
"Duplicated key \"{}\"".format(child_obj.key)
)
self.non_gui_children[child_obj.key] = child_obj
if not first:
@ -160,9 +176,6 @@ class RootEntity(BaseItemEntity):
# Store `self` to `root_item` for children entities
self.root_item = self
self._loaded_types = None
self._gui_types = None
# Children are stored by key as keys are immutable and are defined by
# schema
self.valid_value_types = (dict, )
@ -189,11 +202,10 @@ class RootEntity(BaseItemEntity):
if not KEY_REGEX.match(key):
raise InvalidKeySymbols(self.path, key)
@abstractmethod
def get_entity_from_path(self, path):
"""Return system settings entity."""
raise NotImplementedError((
"Method `get_entity_from_path` not available for \"{}\""
).format(self.__class__.__name__))
"""Return entity matching passed path."""
pass
def create_schema_object(self, schema_data, *args, **kwargs):
"""Create entity by entered schema data.
@ -201,54 +213,9 @@ class RootEntity(BaseItemEntity):
Available entities are loaded on first run. Children entities can call
this method.
"""
if self._loaded_types is None:
# Load available entities
from openpype.settings import entities
# Define known abstract classes
known_abstract_classes = (
entities.BaseEntity,
entities.BaseItemEntity,
entities.ItemEntity,
entities.EndpointEntity,
entities.InputEntity,
entities.BaseEnumEntity
)
self._loaded_types = {}
_gui_types = []
for attr in dir(entities):
item = getattr(entities, attr)
# Filter classes
if not inspect.isclass(item):
continue
# Skip classes that do not inherit from BaseEntity
if not issubclass(item, entities.BaseEntity):
continue
# Skip class that is abstract by design
if item in known_abstract_classes:
continue
if inspect.isabstract(item):
# Create an object to get crash and get traceback
item()
# Backwards compatibility
# Single entity may have multiple schema types
for schema_type in item.schema_types:
self._loaded_types[schema_type] = item
if item.gui_type:
_gui_types.append(item)
self._gui_types = tuple(_gui_types)
klass = self._loaded_types.get(schema_data["type"])
if not klass:
raise KeyError("Unknown type \"{}\"".format(schema_data["type"]))
return klass(schema_data, *args, **kwargs)
return self.schema_hub.create_schema_object(
schema_data, *args, **kwargs
)
def set_override_state(self, state):
"""Set override state and trigger it on children.
@ -491,18 +458,32 @@ class SystemSettings(RootEntity):
schema_data (dict): Pass schema data to entity. This is for development
and debugging purposes.
"""
def __init__(
self, set_studio_state=True, reset=True, schema_data=None
):
if schema_data is None:
# Load system schemas
schema_data = get_studio_settings_schema()
root_key = SYSTEM_SETTINGS_KEY
super(SystemSettings, self).__init__(schema_data, reset)
def __init__(
self, set_studio_state=True, reset=True, schema_hub=None
):
if schema_hub is None:
# Load system schemas
schema_hub = SchemasHub("system_schema")
super(SystemSettings, self).__init__(schema_hub, reset)
if set_studio_state:
self.set_studio_state()
def get_entity_from_path(self, path):
"""Return system settings entity."""
path_parts = path.split("/")
first_part = path_parts[0]
output = self
if first_part == self.root_key:
path_parts.pop(0)
for path_part in path_parts:
output = output[path_part]
return output
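A minimal sketch of the path resolution above, with a hypothetical dict tree standing in for nested settings entities: an optional leading root key is stripped, then each remaining part indexes one level deeper.

root_key = "system_settings"                      # hypothetical root key
tree = {"general": {"studio_name": "My Studio"}}  # stand-in entity tree

def get_entity_from_path_sketch(path):
    path_parts = path.split("/")
    if path_parts[0] == root_key:
        path_parts.pop(0)
    output = tree
    for path_part in path_parts:
        output = output[path_part]
    return output

assert get_entity_from_path_sketch(
    "system_settings/general/studio_name") == "My Studio"
assert get_entity_from_path_sketch("general/studio_name") == "My Studio"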
def _reset_values(self):
default_value = get_default_settings()[SYSTEM_SETTINGS_KEY]
for key, child_obj in self.non_gui_children.items():
@ -600,22 +581,24 @@ class ProjectSettings(RootEntity):
schema_data (dict): Pass schema data to entity. This is for development
and debugging purposes.
"""
root_key = PROJECT_SETTINGS_KEY
def __init__(
self,
project_name=None,
change_state=True,
reset=True,
schema_data=None
schema_hub=None
):
self._project_name = project_name
self._system_settings_entity = None
if schema_data is None:
if schema_hub is None:
# Load system schemas
schema_data = get_project_settings_schema()
schema_hub = SchemasHub("projects_schema")
super(ProjectSettings, self).__init__(schema_data, reset)
super(ProjectSettings, self).__init__(schema_hub, reset)
if change_state:
if self.project_name is None:

View file

@ -29,11 +29,13 @@
"template_data": [
{
"app_variant_label": "2020",
"app_variant": "2020"
"app_variant": "2020",
"variant_skip_paths": ["use_python_2"]
},
{
"app_variant_label": "2021",
"app_variant": "2021"
"app_variant": "2021",
"variant_skip_paths": ["use_python_2"]
}
]
}

View file

@ -30,7 +30,8 @@
"children": [
{
"type": "schema_template",
"name": "template_host_variant_items"
"name": "template_host_variant_items",
"skip_paths": ["use_python_2"]
}
]
}

View file

@ -29,11 +29,13 @@
"template_data": [
{
"app_variant_label": "20",
"app_variant": "20"
"app_variant": "20",
"variant_skip_paths": ["use_python_2"]
},
{
"app_variant_label": "17",
"app_variant": "17"
"app_variant": "17",
"variant_skip_paths": ["use_python_2"]
}
]
}

View file

@ -29,11 +29,13 @@
"template_data": [
{
"app_variant_label": "2020",
"app_variant": "2020"
"app_variant": "2020",
"variant_skip_paths": ["use_python_2"]
},
{
"app_variant_label": "2021",
"app_variant": "2021"
"app_variant": "2021",
"variant_skip_paths": ["use_python_2"]
}
]
}

View file

@@ -30,7 +30,8 @@
"children": [
{
"type": "schema_template",
"name": "template_host_variant_items"
"name": "template_host_variant_items",
"skip_paths": ["use_python_2"]
}
]
}

View file

@@ -1,4 +1,9 @@
[
{
"__default_values__": {
"variant_skip_paths": null
}
},
{
"type": "dict",
"key": "{app_variant}",
@@ -19,7 +24,8 @@
},
{
"type": "schema_template",
"name": "template_host_variant_items"
"name": "template_host_variant_items",
"skip_paths": "{variant_skip_paths}"
}
]
}
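The __default_values__ entry gives "{variant_skip_paths}" a fallback of null, so hosts whose template_data omits variant_skip_paths pass no skip_paths to template_host_variant_items, while the hosts above pass ["use_python_2"] to drop the Python 2 toggle from variants that no longer offer it. A rough illustration of the fill step (the actual template resolver is not part of this diff):

# Illustrative only: resolving a template key against its default.
defaults = {"variant_skip_paths": None}    # from __default_values__
template_data = {"app_variant": "2021", "variant_skip_paths": ["use_python_2"]}
fill_data = dict(defaults, **template_data)
skip_paths = fill_data["variant_skip_paths"]   # ["use_python_2"] here, else None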

View file

@@ -1,3 +1,5 @@
import json
from Qt import QtWidgets, QtGui, QtCore
from openpype.tools.settings import CHILD_OFFSET
from .widgets import ExpandingWidget
@@ -125,6 +127,117 @@ class BaseWidget(QtWidgets.QWidget):
actions_mapping[action] = remove_from_project_override
menu.addAction(action)
def _copy_value_actions(self, menu):
def copy_value():
mime_data = QtCore.QMimeData()
if self.entity.is_dynamic_item or self.entity.is_in_dynamic_item:
entity_path = None
else:
entity_path = "/".join(
[self.entity.root_key, self.entity.path]
)
value = self.entity.value
# Copy for settings tool
settings_data = {
"root_key": self.entity.root_key,
"value": value,
"path": entity_path
}
settings_encoded_data = QtCore.QByteArray()
settings_stream = QtCore.QDataStream(
settings_encoded_data, QtCore.QIODevice.WriteOnly
)
settings_stream.writeQString(json.dumps(settings_data))
mime_data.setData(
"application/copy_settings_value", settings_encoded_data
)
# Copy as json
json_encoded_data = None
if isinstance(value, (dict, list)):
json_encoded_data = QtCore.QByteArray()
json_stream = QtCore.QDataStream(
json_encoded_data, QtCore.QIODevice.WriteOnly
)
json_stream.writeQString(json.dumps(value))
mime_data.setData("application/json", json_encoded_data)
# Copy as text
if json_encoded_data is None:
# Store value as string
mime_data.setText(str(value))
else:
# Store data as json string
mime_data.setText(json.dumps(value, indent=4))
QtWidgets.QApplication.clipboard().setMimeData(mime_data)
action = QtWidgets.QAction("Copy", menu)
return [(action, copy_value)]
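Copied values are exposed in three representations: the private application/copy_settings_value format that _paste_value_actions below consumes, application/json for dict and list values, and plain text as a fallback for external editors. A consumer inside a running Qt application could probe the clipboard like this (sketch):

# Sketch: inspect what a copied settings value offers.
mime_data = QtWidgets.QApplication.clipboard().mimeData()
if mime_data.hasFormat("application/copy_settings_value"):
    print("value was copied from the settings UI")
print(mime_data.text())  # indented JSON for dict/list values, str() otherwise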
def _paste_value_actions(self, menu):
output = []
# Allow pasting a value only if it was copied from this UI
mime_data = QtWidgets.QApplication.clipboard().mimeData()
mime_value = mime_data.data("application/copy_settings_value")
# Skip if there is nothing to do
if not mime_value:
return output
settings_stream = QtCore.QDataStream(
mime_value, QtCore.QIODevice.ReadOnly
)
mime_data_value_str = settings_stream.readQString()
mime_data_value = json.loads(mime_data_value_str)
value = mime_data_value["value"]
path = mime_data_value["path"]
root_key = mime_data_value["root_key"]
# Try to find a matching entity so the value can be pasted to the same spot
# - the entity can't be dynamic or inside a dynamic item
# - it must be in the same root entity as the copy source
# Values can't cross system settings <-> project settings
matching_entity = None
if path and root_key == self.entity.root_key:
try:
matching_entity = self.entity.get_entity_from_path(path)
except Exception:
pass
def _set_entity_value(_entity, _value):
try:
_entity.set(_value)
except Exception:
dialog = QtWidgets.QMessageBox(self)
dialog.setWindowTitle("Value does not match settings schema")
dialog.setIcon(QtWidgets.QMessageBox.Warning)
dialog.setText((
"Pasted value does not seem to match schema of destination"
" settings entity."
))
dialog.exec_()
# Simple paste value method
def paste_value():
_set_entity_value(self.entity, value)
action = QtWidgets.QAction("Paste", menu)
output.append((action, paste_value))
# Paste value to matching entity
def paste_value_to_path():
_set_entity_value(matching_entity, value)
if matching_entity is not None:
action = QtWidgets.QAction("Paste to same place", menu)
output.append((action, paste_value_to_path))
return output
def show_actions_menu(self, event=None):
if event and event.button() != QtCore.Qt.RightButton:
return
@@ -144,6 +257,15 @@ class BaseWidget(QtWidgets.QWidget):
self._add_to_project_override_action(menu, actions_mapping)
self._remove_from_project_override_action(menu, actions_mapping)
ui_actions = []
ui_actions.extend(self._copy_value_actions(menu))
ui_actions.extend(self._paste_value_actions(menu))
if ui_actions:
menu.addSeparator()
for action, callback in ui_actions:
menu.addAction(action)
actions_mapping[action] = callback
if not actions_mapping:
action = QtWidgets.QAction("< No action >")
actions_mapping[action] = None

View file

@@ -183,6 +183,12 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
footer_widget = QtWidgets.QWidget(configurations_widget)
footer_layout = QtWidgets.QHBoxLayout(footer_widget)
refresh_icon = qtawesome.icon("fa.refresh", color="white")
refresh_btn = QtWidgets.QPushButton(footer_widget)
refresh_btn.setIcon(refresh_icon)
footer_layout.addWidget(refresh_btn, 0)
if self.user_role == "developer":
self._add_developer_ui(footer_layout)
@@ -205,8 +211,10 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
main_layout.addWidget(configurations_widget, 1)
save_btn.clicked.connect(self._save)
refresh_btn.clicked.connect(self._on_refresh)
self.save_btn = save_btn
self.refresh_btn = refresh_btn
self.require_restart_label = require_restart_label
self.scroll_widget = scroll_widget
self.content_layout = content_layout
@@ -220,10 +228,6 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
return
def _add_developer_ui(self, footer_layout):
refresh_icon = qtawesome.icon("fa.refresh", color="white")
refresh_button = QtWidgets.QPushButton()
refresh_button.setIcon(refresh_icon)
modify_defaults_widget = QtWidgets.QWidget()
modify_defaults_checkbox = QtWidgets.QCheckBox(modify_defaults_widget)
modify_defaults_checkbox.setChecked(self._hide_studio_overrides)
@@ -235,10 +239,8 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
modify_defaults_layout.addWidget(label_widget)
modify_defaults_layout.addWidget(modify_defaults_checkbox)
footer_layout.addWidget(refresh_button, 0)
footer_layout.addWidget(modify_defaults_widget, 0)
refresh_button.clicked.connect(self._on_refresh)
modify_defaults_checkbox.stateChanged.connect(
self._on_modify_defaults
)

View file

@@ -944,10 +944,8 @@ class Window(QtWidgets.QMainWindow):
split_widget.addWidget(tasks_widget)
split_widget.addWidget(files_widget)
split_widget.addWidget(side_panel)
split_widget.setStretchFactor(0, 1)
split_widget.setStretchFactor(1, 1)
split_widget.setStretchFactor(2, 3)
split_widget.setStretchFactor(3, 1)
split_widget.setSizes([255, 160, 455, 175])
body_layout.addWidget(split_widget)
# Add top margin for tasks to align it visually with files as
@@ -976,7 +974,7 @@ class Window(QtWidgets.QMainWindow):
# Force focus on the open button by default, required for Houdini.
files_widget.btn_open.setFocus()
self.resize(1000, 600)
self.resize(1200, 600)
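setSizes replaces the four stretch factors with explicit initial pane widths (totalling 1045px, which also explains widening the default window to 1200px): stretch factors only control how extra space is distributed, while setSizes pins the starting layout. A standalone sketch of the difference, assuming a four-pane splitter:

# Sketch: explicit start widths vs. proportional stretch.
from Qt import QtWidgets
app = QtWidgets.QApplication([])
splitter = QtWidgets.QSplitter()
for _ in range(4):
    splitter.addWidget(QtWidgets.QListView())
splitter.setSizes([255, 160, 455, 175])   # pixels per pane at startup
# vs. splitter.setStretchFactor(2, 3)     # pane 2 would absorb 3x spare space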
def keyPressEvent(self, event):
"""Custom keyPressEvent.

View file

@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.2.0-nightly.3"
__version__ = "3.2.0-nightly.5"

View file

@@ -2667,15 +2667,6 @@ cli-boxes@^2.2.1:
resolved "https://registry.yarnpkg.com/cli-boxes/-/cli-boxes-2.2.1.tgz#ddd5035d25094fce220e9cab40a45840a440318f"
integrity sha512-y4coMcylgSCdVinjiDBuR8PCC2bLjyGTwEmPb9NHR/QaNU6EUOXcTY/s6VjGMD6ENSEaeQYHCY0GNGS5jfMwPw==
clipboard@^2.0.0:
version "2.0.8"
resolved "https://registry.yarnpkg.com/clipboard/-/clipboard-2.0.8.tgz#ffc6c103dd2967a83005f3f61976aa4655a4cdba"
integrity sha512-Y6WO0unAIQp5bLmk1zdThRhgJt/x3ks6f30s3oE3H1mgIEU33XyQjEf8gsf6DxC7NPX8Y1SsNWjUjL/ywLnnbQ==
dependencies:
good-listener "^1.2.2"
select "^1.1.2"
tiny-emitter "^2.0.0"
cliui@^5.0.0:
version "5.0.0"
resolved "https://registry.yarnpkg.com/cliui/-/cliui-5.0.0.tgz#deefcfdb2e800784aa34f46fa08e06851c7bbbc5"
@@ -3310,11 +3301,6 @@ del@^6.0.0:
rimraf "^3.0.2"
slash "^3.0.0"
delegate@^3.1.2:
version "3.2.0"
resolved "https://registry.yarnpkg.com/delegate/-/delegate-3.2.0.tgz#b66b71c3158522e8ab5744f720d8ca0c2af59166"
integrity sha512-IofjkYBZaZivn0V8nnsMJGBr4jVLxHDheKSW88PyxS5QC4Vo9ZbZVvhzlSxY87fVq3STR6r+4cGepyHkcWOQSw==
depd@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/depd/-/depd-1.1.2.tgz#9bcd52e14c097763e749b274c4346ed2e560b5a9"
@@ -4224,13 +4210,6 @@ globby@^6.1.0:
pify "^2.0.0"
pinkie-promise "^2.0.0"
good-listener@^1.2.2:
version "1.2.2"
resolved "https://registry.yarnpkg.com/good-listener/-/good-listener-1.2.2.tgz#d53b30cdf9313dffb7dc9a0d477096aa6d145c50"
integrity sha1-1TswzfkxPf+33JoNR3CWqm0UXFA=
dependencies:
delegate "^3.1.2"
got@^9.6.0:
version "9.6.0"
resolved "https://registry.yarnpkg.com/got/-/got-9.6.0.tgz#edf45e7d67f99545705de1f7bbeeeb121765ed85"
@@ -6615,11 +6594,9 @@ prism-react-renderer@^1.1.1:
integrity sha512-GHqzxLYImx1iKN1jJURcuRoA/0ygCcNhfGw1IT8nPIMzarmKQ3Nc+JcG0gi8JXQzuh0C5ShE4npMIoqNin40hg==
prismjs@^1.23.0:
version "1.23.0"
resolved "https://registry.yarnpkg.com/prismjs/-/prismjs-1.23.0.tgz#d3b3967f7d72440690497652a9d40ff046067f33"
integrity sha512-c29LVsqOaLbBHuIbsTxaKENh1N2EQBOHaWv7gkHN4dgRbxSREqDnDbtFJYdpPauS4YCplMSNCABQ6Eeor69bAA==
optionalDependencies:
clipboard "^2.0.0"
version "1.24.0"
resolved "https://registry.yarnpkg.com/prismjs/-/prismjs-1.24.0.tgz#0409c30068a6c52c89ef7f1089b3ca4de56be2ac"
integrity sha512-SqV5GRsNqnzCL8k5dfAjCNhUrF3pR0A9lTDSCUZeh/LIshheXJEaP0hwLz2t4XHivd2J/v2HR+gRnigzeKe3cQ==
process-nextick-args@~2.0.0:
version "2.0.1"
@@ -7390,11 +7367,6 @@ select-hose@^2.0.0:
resolved "https://registry.yarnpkg.com/select-hose/-/select-hose-2.0.0.tgz#625d8658f865af43ec962bfc376a37359a4994ca"
integrity sha1-Yl2GWPhlr0Psliv8N2o3NZpJlMo=
select@^1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/select/-/select-1.1.2.tgz#0e7350acdec80b1108528786ec1d4418d11b396d"
integrity sha1-DnNQrN7ICxEIUoeG7B1EGNEbOW0=
selfsigned@^1.10.8:
version "1.10.8"
resolved "https://registry.yarnpkg.com/selfsigned/-/selfsigned-1.10.8.tgz#0d17208b7d12c33f8eac85c41835f27fc3d81a30"
@@ -8016,11 +7988,6 @@ timsort@^0.3.0:
resolved "https://registry.yarnpkg.com/timsort/-/timsort-0.3.0.tgz#405411a8e7e6339fe64db9a234de11dc31e02bd4"
integrity sha1-QFQRqOfmM5/mTbmiNN4R3DHgK9Q=
tiny-emitter@^2.0.0:
version "2.1.0"
resolved "https://registry.yarnpkg.com/tiny-emitter/-/tiny-emitter-2.1.0.tgz#1d1a56edfc51c43e863cbb5382a72330e3555423"
integrity sha512-NB6Dk1A9xgQPMoGqC5CVXn123gWyte215ONT5Pp5a0yt4nlEoO1ZWeCwpncaekPHXO60i47ihFnZPiRPjRMq4Q==
tiny-invariant@^1.0.2:
version "1.1.0"
resolved "https://registry.yarnpkg.com/tiny-invariant/-/tiny-invariant-1.1.0.tgz#634c5f8efdc27714b7f386c35e6760991d230875"