Mirror of https://github.com/ynput/ayon-core.git, synced 2026-01-02 08:54:53 +01:00

Commit f6400ff638: Merge branch 'develop' of https://github.com/pypeclub/OpenPype into feature/multiverse

20 changed files with 592 additions and 127 deletions

CHANGELOG.md (44 changed lines)
@@ -1,6 +1,6 @@
# Changelog

## [3.9.0-nightly.5](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.9.0-nightly.6](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.8.2...HEAD)
@@ -12,56 +12,56 @@

- Documentation: fixed broken links [\#2799](https://github.com/pypeclub/OpenPype/pull/2799)
- Documentation: broken link fix [\#2785](https://github.com/pypeclub/OpenPype/pull/2785)
- Documentation: link fixes [\#2772](https://github.com/pypeclub/OpenPype/pull/2772)
- Update docusaurus to latest version [\#2760](https://github.com/pypeclub/OpenPype/pull/2760)
- Various testing updates [\#2726](https://github.com/pypeclub/OpenPype/pull/2726)

**🚀 Enhancements**

- Ftrack: Can sync fps as string [\#2836](https://github.com/pypeclub/OpenPype/pull/2836)
- General: Color dialog UI fixes [\#2817](https://github.com/pypeclub/OpenPype/pull/2817)
- Nuke: adding Reformat to baking mov plugin [\#2811](https://github.com/pypeclub/OpenPype/pull/2811)
- Manager: Update all to latest button [\#2805](https://github.com/pypeclub/OpenPype/pull/2805)
- General: Set context environments for non host applications [\#2803](https://github.com/pypeclub/OpenPype/pull/2803)
- Houdini: Remove duplicate ValidateOutputNode plug-in [\#2780](https://github.com/pypeclub/OpenPype/pull/2780)
- Tray publisher: New Tray Publisher host \(beta\) [\#2778](https://github.com/pypeclub/OpenPype/pull/2778)
- Slack: Added regex for filtering on subset names [\#2775](https://github.com/pypeclub/OpenPype/pull/2775)
- Houdini: Implement Reset Frame Range [\#2770](https://github.com/pypeclub/OpenPype/pull/2770)
- Pyblish Pype: Remove redundant new line in installed fonts printing [\#2758](https://github.com/pypeclub/OpenPype/pull/2758)
- Flame: use Shot Name on segment for asset name [\#2751](https://github.com/pypeclub/OpenPype/pull/2751)
- Flame: adding validator source clip [\#2746](https://github.com/pypeclub/OpenPype/pull/2746)
- Ftrack: Disable ftrack module by default [\#2732](https://github.com/pypeclub/OpenPype/pull/2732)
- Houdini: Move Houdini Save Current File to beginning of ExtractorOrder [\#2747](https://github.com/pypeclub/OpenPype/pull/2747)
- RoyalRender: Minor enhancements [\#2700](https://github.com/pypeclub/OpenPype/pull/2700)

**🐛 Bug fixes**

- Maya: Stop creation of reviews for Cryptomattes [\#2832](https://github.com/pypeclub/OpenPype/pull/2832)
- Deadline: Remove recreated event [\#2828](https://github.com/pypeclub/OpenPype/pull/2828)
- Deadline: Added missing events folder [\#2827](https://github.com/pypeclub/OpenPype/pull/2827)
- Settings: Missing document with OP versions may break start of OpenPype [\#2825](https://github.com/pypeclub/OpenPype/pull/2825)
- Deadline: more detailed temp file name for environment json [\#2824](https://github.com/pypeclub/OpenPype/pull/2824)
- General: Host name was formed from obsolete code [\#2821](https://github.com/pypeclub/OpenPype/pull/2821)
- Settings UI: Fix "Apply from" action [\#2820](https://github.com/pypeclub/OpenPype/pull/2820)
- Ftrack: Job killer with missing user [\#2819](https://github.com/pypeclub/OpenPype/pull/2819)
- StandalonePublisher: use dynamic groups in subset names [\#2816](https://github.com/pypeclub/OpenPype/pull/2816)
- Settings UI: Search case sensitivity [\#2810](https://github.com/pypeclub/OpenPype/pull/2810)
- Flame Babypublisher optimalization [\#2806](https://github.com/pypeclub/OpenPype/pull/2806)
- resolve: fixing fusion module loading [\#2802](https://github.com/pypeclub/OpenPype/pull/2802)
- Ftrack: Unset task ids from asset versions before tasks are removed [\#2800](https://github.com/pypeclub/OpenPype/pull/2800)
- Slack: fail gracefully if slack exception [\#2798](https://github.com/pypeclub/OpenPype/pull/2798)
- Flame: Fix version string in default settings [\#2783](https://github.com/pypeclub/OpenPype/pull/2783)
- After Effects: Fix typo in name `afftereffects` -\> `aftereffects` [\#2768](https://github.com/pypeclub/OpenPype/pull/2768)
- Avoid renaming udim indexes [\#2765](https://github.com/pypeclub/OpenPype/pull/2765)
- Houdini: Fix open last workfile [\#2767](https://github.com/pypeclub/OpenPype/pull/2767)
- Maya: Fix `unique\_namespace` when in an namespace that is empty [\#2759](https://github.com/pypeclub/OpenPype/pull/2759)
- Loader UI: Fix right click in representation widget [\#2757](https://github.com/pypeclub/OpenPype/pull/2757)
- Aftereffects 2022 and Deadline [\#2748](https://github.com/pypeclub/OpenPype/pull/2748)
- Flame: bunch of bugs [\#2745](https://github.com/pypeclub/OpenPype/pull/2745)
- Maya: Save current scene on workfile publish [\#2744](https://github.com/pypeclub/OpenPype/pull/2744)
- Version Up: Preserve parts of filename after version number \(like subversion\) on version\_up [\#2741](https://github.com/pypeclub/OpenPype/pull/2741)
- Maya: Remove some unused code [\#2709](https://github.com/pypeclub/OpenPype/pull/2709)
- Multiple hosts: unify menu style across hosts [\#2693](https://github.com/pypeclub/OpenPype/pull/2693)

**Merged pull requests:**

- General: Move change context functions [\#2839](https://github.com/pypeclub/OpenPype/pull/2839)
- Tools: Don't use avalon tools code [\#2829](https://github.com/pypeclub/OpenPype/pull/2829)
- Move Unreal Implementation to OpenPype [\#2823](https://github.com/pypeclub/OpenPype/pull/2823)
- Ftrack: Job killer with missing user [\#2819](https://github.com/pypeclub/OpenPype/pull/2819)
- Ftrack: Unset task ids from asset versions before tasks are removed [\#2800](https://github.com/pypeclub/OpenPype/pull/2800)
- Slack: fail gracefully if slack exception [\#2798](https://github.com/pypeclub/OpenPype/pull/2798)
- Nuke: Use AVALON\_APP to get value for "app" key [\#2818](https://github.com/pypeclub/OpenPype/pull/2818)
- Ftrack: Moved module one hierarchy level higher [\#2792](https://github.com/pypeclub/OpenPype/pull/2792)
- SyncServer: Moved module one hierarchy level higher [\#2791](https://github.com/pypeclub/OpenPype/pull/2791)
- Royal render: Move module one hierarchy level higher [\#2790](https://github.com/pypeclub/OpenPype/pull/2790)
- Deadline: Move module one hierarchy level higher [\#2789](https://github.com/pypeclub/OpenPype/pull/2789)
- Houdini: Remove duplicate ValidateOutputNode plug-in [\#2780](https://github.com/pypeclub/OpenPype/pull/2780)
- Slack: Added regex for filtering on subset names [\#2775](https://github.com/pypeclub/OpenPype/pull/2775)
- Houdini: Fix open last workfile [\#2767](https://github.com/pypeclub/OpenPype/pull/2767)
- General: Extract template formatting from anatomy [\#2766](https://github.com/pypeclub/OpenPype/pull/2766)
- Harmony: Rendering in Deadline didn't work in other machines than submitter [\#2754](https://github.com/pypeclub/OpenPype/pull/2754)
- Houdini: Move Houdini Save Current File to beginning of ExtractorOrder [\#2747](https://github.com/pypeclub/OpenPype/pull/2747)
- Maya: set Deadline job/batch name to original source workfile name instead of published workfile [\#2733](https://github.com/pypeclub/OpenPype/pull/2733)

## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)
@@ -5,11 +5,12 @@ import logging

# Pipeline imports
import avalon.api
from avalon import io, pipeline
from avalon import io

from openpype.lib import version_up
from openpype.hosts.fusion import api
from openpype.hosts.fusion.api import lib
from openpype.lib.avalon_context import get_workdir_from_session

log = logging.getLogger("Update Slap Comp")
@@ -44,16 +45,6 @@ def _format_version_folder(folder):
    return version_folder


def _get_work_folder(session):
    """Convenience function to get the work folder path of the current asset"""

    # Get new filename, create path based on asset and work template
    template_work = self._project["config"]["template"]["work"]
    work_path = pipeline._format_work_template(template_work, session)

    return os.path.normpath(work_path)


def _get_fusion_instance():
    fusion = getattr(sys.modules["__main__"], "fusion", None)
    if fusion is None:
@@ -72,7 +63,7 @@ def _format_filepath(session):
    asset = session["AVALON_ASSET"]

    # Save updated slap comp
    work_path = _get_work_folder(session)
    work_path = get_workdir_from_session(session)
    walk_to_dir = os.path.join(work_path, "scenes", "slapcomp")
    slapcomp_dir = os.path.abspath(walk_to_dir)
@@ -112,7 +103,7 @@ def _update_savers(comp, session):
        None
    """

    new_work = _get_work_folder(session)
    new_work = get_workdir_from_session(session)
    renders = os.path.join(new_work, "renders")
    version_folder = _format_version_folder(renders)
    renders_version = os.path.join(renders, version_folder)
@@ -5,11 +5,12 @@ import logging
from Qt import QtWidgets, QtCore

import avalon.api
from avalon import io, pipeline
from avalon import io
from avalon.vendor import qtawesome as qta

from openpype import style
from openpype.hosts.fusion import api
from openpype.lib.avalon_context import get_workdir_from_session

log = logging.getLogger("Fusion Switch Shot")
@@ -123,7 +124,7 @@ class App(QtWidgets.QWidget):

    def _on_open_from_dir(self):

        start_dir = self._get_context_directory()
        start_dir = get_workdir_from_session()
        comp_file, _ = QtWidgets.QFileDialog.getOpenFileName(
            self, "Choose comp", start_dir)
@@ -157,17 +158,6 @@ class App(QtWidgets.QWidget):
        import colorbleed.scripts.fusion_switch_shot as switch_shot
        switch_shot.switch(asset_name=asset, filepath=file_name, new=True)

    def _get_context_directory(self):

        project = io.find_one({"type": "project",
                               "name": avalon.api.Session["AVALON_PROJECT"]},
                              projection={"config": True})

        template = project["config"]["template"]["work"]
        dir = pipeline._format_work_template(template, avalon.api.Session)

        return dir

    def collect_slap_comps(self, directory):
        items = glob.glob("{}/*.comp".format(directory))
        return items
@@ -446,6 +446,8 @@ class ExporterReviewMov(ExporterReview):
        return path

    def generate_mov(self, farm=False, **kwargs):
        reformat_node_add = kwargs["reformat_node_add"]
        reformat_node_config = kwargs["reformat_node_config"]
        bake_viewer_process = kwargs["bake_viewer_process"]
        bake_viewer_input_process_node = kwargs[
            "bake_viewer_input_process"]
@@ -483,6 +485,30 @@ class ExporterReviewMov(ExporterReview):
            self.previous_node = r_node
            self.log.debug("Read... `{}`".format(self._temp_nodes[subset]))

            # add reformat node
            if reformat_node_add:
                # append reformated tag
                add_tags.append("reformated")

                rf_node = nuke.createNode("Reformat")
                for kn_conf in reformat_node_config:
                    _type = kn_conf["type"]
                    k_name = str(kn_conf["name"])
                    k_value = kn_conf["value"]

                    # to remove unicode as nuke doesn't like it
                    if _type == "string":
                        k_value = str(kn_conf["value"])

                    rf_node[k_name].setValue(k_value)

                # connect
                rf_node.setInput(0, self.previous_node)
                self._temp_nodes[subset].append(rf_node)
                self.previous_node = rf_node
                self.log.debug(
                    "Reformat... `{}`".format(self._temp_nodes[subset]))

            # only create colorspace baking if toggled on
            if bake_viewer_process:
                if bake_viewer_input_process_node:
@@ -1,4 +1,5 @@
import os
import re
import pyblish.api
import openpype
from openpype.hosts.nuke.api import plugin
@@ -25,6 +26,7 @@ class ExtractReviewDataMov(openpype.api.Extractor):
    def process(self, instance):
        families = instance.data["families"]
        task_type = instance.context.data["taskType"]
        subset = instance.data["subset"]
        self.log.info("Creating staging dir...")

        if "representations" not in instance.data:
@@ -46,6 +48,7 @@ class ExtractReviewDataMov(openpype.api.Extractor):
        for o_name, o_data in self.outputs.items():
            f_families = o_data["filter"]["families"]
            f_task_types = o_data["filter"]["task_types"]
            f_subsets = o_data["filter"]["sebsets"]

            # test if family found in context
            test_families = any([
@@ -69,11 +72,25 @@ class ExtractReviewDataMov(openpype.api.Extractor):
                bool(not f_task_types)
            ])

            # test subsets from filter
            test_subsets = any([
                # check if any of subset filter inputs
                # converted to regex patern is not found in subset
                # we keep strict case sensitivity
                bool(next((
                    s for s in f_subsets
                    if re.search(re.compile(s), subset)
                ), None)),
                # but if no subsets were set then make this acuntable too
                bool(not f_subsets)
            ])

            # we need all filters to be positive for this
            # preset to be activated
            test_all = all([
                test_families,
                test_task_types
                test_task_types,
                test_subsets
            ])

            # if it is not positive then skip this preset
@@ -120,6 +137,13 @@ class ExtractReviewDataMov(openpype.api.Extractor):
        if generated_repres:
            # assign to representations
            instance.data["representations"] += generated_repres
        else:
            instance.data["families"].remove("review")
            self.log.info((
                "Removing `review` from families. "
                "Not available baking profile."
            ))
            self.log.debug(instance.data["families"])

        self.log.debug(
            "_ representations: {}".format(
@@ -81,14 +81,10 @@ class CollectTextures(pyblish.api.ContextPlugin):
            parsed_subset = instance.data["subset"].replace(
                instance.data["family"], '')

            fill_pairs = {
            explicit_data = {
                "subset": parsed_subset
            }

            fill_pairs = prepare_template_data(fill_pairs)
            workfile_subset = format_template_with_optional_keys(
                fill_pairs, self.workfile_subset_template)

            processed_instance = False
            for repre in instance.data["representations"]:
                ext = repre["ext"].replace('.', '')
@@ -102,6 +98,21 @@ class CollectTextures(pyblish.api.ContextPlugin):
                if ext in self.main_workfile_extensions or \
                        ext in self.other_workfile_extensions:

                    formatting_data = self._get_parsed_groups(
                        repre_file,
                        self.input_naming_patterns["workfile"],
                        self.input_naming_groups["workfile"],
                        self.color_space
                    )
                    self.log.info("Parsed groups from workfile "
                                  "name '{}': {}".format(repre_file,
                                                         formatting_data))

                    formatting_data.update(explicit_data)
                    fill_pairs = prepare_template_data(formatting_data)
                    workfile_subset = format_template_with_optional_keys(
                        fill_pairs, self.workfile_subset_template)

                    asset_build = self._get_asset_build(
                        repre_file,
                        self.input_naming_patterns["workfile"],
@@ -148,11 +159,23 @@ class CollectTextures(pyblish.api.ContextPlugin):
                    resource_files[workfile_subset].append(item)

                if ext in self.texture_extensions:
                    formatting_data = self._get_parsed_groups(
                        repre_file,
                        self.input_naming_patterns["textures"],
                        self.input_naming_groups["textures"],
                        self.color_space
                    )

                    self.log.info("Parsed groups from texture "
                                  "name '{}': {}".format(repre_file,
                                                         formatting_data))

                    c_space = self._get_color_space(
                        repre_file,
                        self.color_space
                    )

                    # optional value
                    channel = self._get_channel_name(
                        repre_file,
                        self.input_naming_patterns["textures"],
@@ -160,6 +183,7 @@ class CollectTextures(pyblish.api.ContextPlugin):
                        self.color_space
                    )

                    # optional value
                    shader = self._get_shader_name(
                        repre_file,
                        self.input_naming_patterns["textures"],
@@ -167,13 +191,15 @@ class CollectTextures(pyblish.api.ContextPlugin):
                        self.color_space
                    )

                    formatting_data = {
                    explicit_data = {
                        "color_space": c_space or '',  # None throws exception
                        "channel": channel or '',
                        "shader": shader or '',
                        "subset": parsed_subset or ''
                    }

                    formatting_data.update(explicit_data)

                    fill_pairs = prepare_template_data(formatting_data)
                    subset = format_template_with_optional_keys(
                        fill_pairs, self.texture_subset_template)
@@ -243,6 +269,13 @@ class CollectTextures(pyblish.api.ContextPlugin):
        for asset_build, version, subset, family in asset_builds:
            if not main_version:
                main_version = version

            try:
                version_int = int(version or main_version or 1)
            except ValueError:
                self.log.error("Parsed version {} is not "
                               "an number".format(version))

            new_instance = context.create_instance(subset)
            new_instance.data.update(
                {
@@ -251,7 +284,7 @@ class CollectTextures(pyblish.api.ContextPlugin):
                    "label": subset,
                    "name": subset,
                    "family": family,
                    "version": int(version or main_version or 1),
                    "version": version_int,
                    "asset_build": asset_build  # remove in validator
                }
            )
@@ -320,13 +353,14 @@ class CollectTextures(pyblish.api.ContextPlugin):
        """
        asset_name = "NOT_AVAIL"

        return self._parse(name, input_naming_patterns, input_naming_groups,
                           color_spaces, 'asset') or asset_name
        return (self._parse_key(name, input_naming_patterns,
                                input_naming_groups, color_spaces, 'asset') or
                asset_name)

    def _get_version(self, name, input_naming_patterns, input_naming_groups,
                     color_spaces):
        found = self._parse(name, input_naming_patterns, input_naming_groups,
                            color_spaces, 'version')
        found = self._parse_key(name, input_naming_patterns,
                                input_naming_groups, color_spaces, 'version')

        if found:
            return found.replace('v', '')
@@ -336,8 +370,8 @@ class CollectTextures(pyblish.api.ContextPlugin):
    def _get_udim(self, name, input_naming_patterns, input_naming_groups,
                  color_spaces):
        """Parses from 'name' udim value."""
        found = self._parse(name, input_naming_patterns, input_naming_groups,
                            color_spaces, 'udim')
        found = self._parse_key(name, input_naming_patterns,
                                input_naming_groups, color_spaces, 'udim')
        if found:
            return found
@@ -375,12 +409,15 @@ class CollectTextures(pyblish.api.ContextPlugin):
            Unknown format of channel name and color spaces >> cs are known
                list - 'color_space' used as a placeholder
        """
        found = self._parse(name, input_naming_patterns, input_naming_groups,
                            color_spaces, 'shader')
        if found:
            return found
        found = None
        try:
            found = self._parse_key(name, input_naming_patterns,
                                    input_naming_groups, color_spaces,
                                    'shader')
        except ValueError:
            self.log.warning("Didn't find shader in {}".format(name))

        self.log.warning("Didn't find shader in {}".format(name))
        return found

    def _get_channel_name(self, name, input_naming_patterns,
                          input_naming_groups, color_spaces):
@@ -389,15 +426,18 @@ class CollectTextures(pyblish.api.ContextPlugin):
            Unknown format of channel name and color spaces >> cs are known
                list - 'color_space' used as a placeholder
        """
        found = self._parse(name, input_naming_patterns, input_naming_groups,
                            color_spaces, 'channel')
        if found:
            return found
        found = None
        try:
            found = self._parse_key(name, input_naming_patterns,
                                    input_naming_groups, color_spaces,
                                    'channel')
        except ValueError:
            self.log.warning("Didn't find channel in {}".format(name))

        self.log.warning("Didn't find channel in {}".format(name))
        return found

    def _parse(self, name, input_naming_patterns, input_naming_groups,
               color_spaces, key):
    def _parse_key(self, name, input_naming_patterns, input_naming_groups,
                   color_spaces, key):
        """Universal way to parse 'name' with configurable regex groups.

        Args:
@@ -411,23 +451,47 @@ class CollectTextures(pyblish.api.ContextPlugin):
        Raises:
            ValueError - if broken 'input_naming_groups'
        """
        parsed_groups = self._get_parsed_groups(name,
                                                input_naming_patterns,
                                                input_naming_groups,
                                                color_spaces)

        try:
            parsed_value = parsed_groups[key]
            return parsed_value
        except (IndexError, KeyError):
            msg = ("'Textures group positions' must " +
                   "have '{}' key".format(key))
            raise ValueError(msg)

    def _get_parsed_groups(self, name, input_naming_patterns,
                           input_naming_groups, color_spaces):
        """Universal way to parse 'name' with configurable regex groups.

        Args:
            name (str): workfile name or texture name
            input_naming_patterns (list):
                [workfile_pattern] or [texture_pattern]
            input_naming_groups (list)
                ordinal position of regex groups matching to input_naming..
            color_spaces (list) - predefined color spaces

        Returns:
            (dict) {group_name:parsed_value}
        """
        for input_pattern in input_naming_patterns:
            for cs in color_spaces:
                pattern = input_pattern.replace('{color_space}', cs)
                regex_result = re.findall(pattern, name)
                if regex_result:
                    idx = list(input_naming_groups).index(key)
                    if idx < 0:
                        msg = "input_naming_groups must " +\
                              "have '{}' key".format(key)
                        raise ValueError(msg)
                    if len(regex_result[0]) == len(input_naming_groups):
                        return dict(zip(input_naming_groups, regex_result[0]))
                    else:
                        self.log.warning("No of parsed groups doesn't match "
                                         "no of group labels")

                    try:
                        parsed_value = regex_result[0][idx]
                        return parsed_value
                    except IndexError:
                        self.log.warning("Wrong index, probably "
                                         "wrong name {}".format(name))
        raise ValueError("Name '{}' cannot be parsed by any "
                         "'{}' patterns".format(name, input_naming_patterns))

    def _update_representations(self, upd_representations):
        """Frames dont have sense for textures, add collected udims instead."""
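Note: the following is a minimal standalone sketch of the pattern-matching approach used by `_get_parsed_groups` above. The pattern, group names and file name are hypothetical; only the `{color_space}` substitution, the `re.findall` call and the `zip` with the configured group names mirror the plugin code.

```python
import re

# Hypothetical configuration; in OpenPype these come from plugin settings.
input_naming_patterns = [r"^(\w+)_v(\d{3})_(\w+)_({color_space})\.(\d{4})\.tif"]
input_naming_groups = ["asset", "version", "shader", "color_space", "udim"]
color_spaces = ["sRGB", "ACEScg"]

name = "chair_v001_wood_ACEScg.1001.tif"
for input_pattern in input_naming_patterns:
    for cs in color_spaces:
        # Substitute the color space placeholder before matching.
        pattern = input_pattern.replace("{color_space}", cs)
        regex_result = re.findall(pattern, name)
        if regex_result:
            # Zip the matched groups with the configured group names.
            print(dict(zip(input_naming_groups, regex_result[0])))
            # {'asset': 'chair', 'version': '001', 'shader': 'wood',
            #  'color_space': 'ACEScg', 'udim': '1001'}
```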
@@ -644,6 +644,166 @@ def get_workdir(
    )


def template_data_from_session(session=None):
    """ Return dictionary with template from session keys.

    Args:
        session (dict, Optional): The Session to use. If not provided use the
            currently active global Session.
    Returns:
        dict: All available data from session.
    """
    from avalon import io
    import avalon.api

    if session is None:
        session = avalon.api.Session

    project_name = session["AVALON_PROJECT"]
    project_doc = io._database[project_name].find_one({"type": "project"})
    asset_doc = io._database[project_name].find_one({
        "type": "asset",
        "name": session["AVALON_ASSET"]
    })
    task_name = session["AVALON_TASK"]
    host_name = session["AVALON_APP"]
    return get_workdir_data(project_doc, asset_doc, task_name, host_name)

def compute_session_changes(
    session, task=None, asset=None, app=None, template_key=None
):
    """Compute the changes for a Session object on asset, task or app switch

    This does *NOT* update the Session object, but returns the changes
    required for a valid update of the Session.

    Args:
        session (dict): The initial session to compute changes to.
            This is required for computing the full Work Directory, as that
            also depends on the values that haven't changed.
        task (str, Optional): Name of task to switch to.
        asset (str or dict, Optional): Name of asset to switch to.
            You can also directly provide the Asset dictionary as returned
            from the database to avoid an additional query. (optimization)
        app (str, Optional): Name of app to switch to.

    Returns:
        dict: The required changes in the Session dictionary.

    """
    changes = dict()

    # If no changes, return directly
    if not any([task, asset, app]):
        return changes

    # Get asset document and asset
    asset_document = None
    asset_tasks = None
    if isinstance(asset, dict):
        # Assume asset database document
        asset_document = asset
        asset_tasks = asset_document.get("data", {}).get("tasks")
        asset = asset["name"]

    if not asset_document or not asset_tasks:
        from avalon import io

        # Assume asset name
        asset_document = io.find_one(
            {
                "name": asset,
                "type": "asset"
            },
            {"data.tasks": True}
        )
        assert asset_document, "Asset must exist"

    # Detect any changes compared session
    mapping = {
        "AVALON_ASSET": asset,
        "AVALON_TASK": task,
        "AVALON_APP": app,
    }
    changes = {
        key: value
        for key, value in mapping.items()
        if value and value != session.get(key)
    }
    if not changes:
        return changes

    # Compute work directory (with the temporary changed session so far)
    _session = session.copy()
    _session.update(changes)

    changes["AVALON_WORKDIR"] = get_workdir_from_session(_session)

    return changes

def get_workdir_from_session(session=None, template_key=None):
    import avalon.api

    if session is None:
        session = avalon.api.Session
    project_name = session["AVALON_PROJECT"]
    host_name = session["AVALON_APP"]
    anatomy = Anatomy(project_name)
    template_data = template_data_from_session(session)
    anatomy_filled = anatomy.format(template_data)

    if not template_key:
        task_type = template_data["task"]["type"]
        template_key = get_workfile_template_key(
            task_type,
            host_name,
            project_name=project_name
        )
    return anatomy_filled[template_key]["folder"]

def update_current_task(task=None, asset=None, app=None, template_key=None):
    """Update active Session to a new task work area.

    This updates the live Session to a different `asset`, `task` or `app`.

    Args:
        task (str): The task to set.
        asset (str): The asset to set.
        app (str): The app to set.

    Returns:
        dict: The changed key, values in the current Session.

    """
    import avalon.api
    from avalon.pipeline import emit

    changes = compute_session_changes(
        avalon.api.Session,
        task=task,
        asset=asset,
        app=app,
        template_key=template_key
    )

    # Update the Session and environments. Pop from environments all keys with
    # value set to None.
    for key, value in changes.items():
        avalon.api.Session[key] = value
        if value is None:
            os.environ.pop(key, None)
        else:
            os.environ[key] = value

    # Emit session change
    emit("taskChanged", changes.copy())

    return changes


@with_avalon
def get_workfile_doc(asset_id, task_name, filename, dbcon=None):
    """Return workfile document for entered context.
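For reference, a short usage sketch of the context helpers added above (the asset and task names are made up); `update_current_task` computes the changes through `compute_session_changes` and then applies them to `avalon.api.Session` and `os.environ`, emitting a "taskChanged" event:

```python
import avalon.api

from openpype.lib.avalon_context import (
    compute_session_changes,
    update_current_task,
)

# Preview the switch without touching the live Session
# ("sh010" / "compositing" are hypothetical names).
changes = compute_session_changes(
    avalon.api.Session, asset="sh010", task="compositing"
)
print(changes)  # e.g. AVALON_ASSET, AVALON_TASK and AVALON_WORKDIR keys

# Apply the switch: updates avalon.api.Session and os.environ and
# emits the "taskChanged" event.
update_current_task(asset="sh010", task="compositing")
```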
@@ -399,15 +399,6 @@ class CreatedInstance:
        self._data["active"] = data.get("active", True)
        self._data["creator_identifier"] = creator.identifier

        # QUESTION handle version of instance here or in creator?
        version = None
        if not new:
            version = data.get("version")

        if version is None:
            version = 1
        self._data["version"] = version

        # Pop from source data all keys that are defined in `_data` before
        # this moment and through their values away
        # - they should be the same and if are not then should not change
@@ -34,7 +34,12 @@ class ExtractJpegEXR(pyblish.api.InstancePlugin):
        self.log.info("subset {}".format(instance.data['subset']))

        # skip crypto passes.
        if 'crypto' in instance.data['subset']:
        # TODO: This is just a quick fix and has its own side-effects - it is
        # affecting every subset name with `crypto` in its name.
        # This must be solved properly, maybe using tags on
        # representation that can be determined much earlier and
        # with better precision.
        if 'crypto' in instance.data['subset'].lower():
            self.log.info("Skipping crypto passes.")
            return
@@ -1171,6 +1171,9 @@ class ExtractReview(pyblish.api.InstancePlugin):
        self.log.debug("input_width: `{}`".format(input_width))
        self.log.debug("input_height: `{}`".format(input_height))

        reformat_in_baking = bool("reformated" in new_repre["tags"])
        self.log.debug("reformat_in_baking: `{}`".format(reformat_in_baking))

        # Use instance resolution if output definition has not set it.
        if output_width is None or output_height is None:
            output_width = temp_data["resolution_width"]
@@ -1182,6 +1185,17 @@ class ExtractReview(pyblish.api.InstancePlugin):
            output_width = input_width
            output_height = input_height

        if reformat_in_baking:
            self.log.debug((
                "Using resolution from input. It is already "
                "reformated from baking process"
            ))
            output_width = input_width
            output_height = input_height
            pixel_aspect = 1
            new_repre["resolutionWidth"] = input_width
            new_repre["resolutionHeight"] = input_height

        output_width = int(output_width)
        output_height = int(output_height)
@@ -4,13 +4,15 @@ import sys
import logging

# Pipeline imports
from avalon import api, io, pipeline
from avalon import api, io
import avalon.fusion

# Config imports
import openpype.lib as pype
import openpype.hosts.fusion.lib as fusion_lib

from openpype.lib.avalon_context import get_workdir_from_session

log = logging.getLogger("Update Slap Comp")

self = sys.modules[__name__]
@@ -44,16 +46,6 @@ def _format_version_folder(folder):
    return version_folder


def _get_work_folder(session):
    """Convenience function to get the work folder path of the current asset"""

    # Get new filename, create path based on asset and work template
    template_work = self._project["config"]["template"]["work"]
    work_path = pipeline._format_work_template(template_work, session)

    return os.path.normpath(work_path)


def _get_fusion_instance():
    fusion = getattr(sys.modules["__main__"], "fusion", None)
    if fusion is None:
@@ -72,7 +64,7 @@ def _format_filepath(session):
    asset = session["AVALON_ASSET"]

    # Save updated slap comp
    work_path = _get_work_folder(session)
    work_path = get_workdir_from_session(session)
    walk_to_dir = os.path.join(work_path, "scenes", "slapcomp")
    slapcomp_dir = os.path.abspath(walk_to_dir)
@@ -103,7 +95,7 @@ def _update_savers(comp, session):
        None
    """

    new_work = _get_work_folder(session)
    new_work = get_workdir_from_session(session)
    renders = os.path.join(new_work, "renders")
    version_folder = _format_version_folder(renders)
    renders_version = os.path.join(renders, version_folder)
@@ -116,13 +116,42 @@
        "baking": {
            "filter": {
                "task_types": [],
                "families": []
                "families": [],
                "sebsets": []
            },
            "extension": "mov",
            "viewer_process_override": "",
            "bake_viewer_process": true,
            "bake_viewer_input_process": true,
            "add_tags": []
            "add_tags": [],
            "reformat_node_add": false,
            "reformat_node_config": [
                {
                    "type": "string",
                    "name": "type",
                    "value": "to format"
                },
                {
                    "type": "string",
                    "name": "format",
                    "value": "HD_1080"
                },
                {
                    "type": "string",
                    "name": "filter",
                    "value": "Lanczos6"
                },
                {
                    "type": "bool",
                    "name": "black_outside",
                    "value": true
                },
                {
                    "type": "bool",
                    "name": "pbb",
                    "value": false
                }
            ]
        }
    }
},
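The `reformat_node_config` entries above are the ones consumed by the `generate_mov` knob loop shown earlier in this commit. A minimal sketch of that mapping outside Nuke (a plain dict stands in for the node created with `nuke.createNode("Reformat")`, so this is illustrative only):

```python
# Sketch only: a dict stands in for the Nuke Reformat node; the real plugin
# assigns knobs with rf_node[name].setValue(value).
reformat_node_config = [
    {"type": "string", "name": "type", "value": "to format"},
    {"type": "string", "name": "format", "value": "HD_1080"},
    {"type": "bool", "name": "black_outside", "value": True},
]

rf_node = {}
for kn_conf in reformat_node_config:
    value = kn_conf["value"]
    if kn_conf["type"] == "string":
        # cast to str to avoid unicode values, as in the plugin code
        value = str(value)
    rf_node[kn_conf["name"]] = value

print(rf_node)  # {'type': 'to format', 'format': 'HD_1080', 'black_outside': True}
```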
@@ -584,8 +584,9 @@ class DictConditionalEntity(ItemEntity):

        self.enum_entity.update_default_value(enum_value)
        for children_by_key in self.non_gui_children.values():
            value_copy = copy.deepcopy(value)
            for key, child_obj in children_by_key.items():
                child_value = value.get(key, NOT_SET)
                child_value = value_copy.get(key, NOT_SET)
                child_obj.update_default_value(child_value)

    def update_studio_value(self, value):
@@ -620,8 +621,9 @@ class DictConditionalEntity(ItemEntity):

        self.enum_entity.update_studio_value(enum_value)
        for children_by_key in self.non_gui_children.values():
            value_copy = copy.deepcopy(value)
            for key, child_obj in children_by_key.items():
                child_value = value.get(key, NOT_SET)
                child_value = value_copy.get(key, NOT_SET)
                child_obj.update_studio_value(child_value)

    def update_project_value(self, value):
@@ -656,8 +658,9 @@ class DictConditionalEntity(ItemEntity):

        self.enum_entity.update_project_value(enum_value)
        for children_by_key in self.non_gui_children.values():
            value_copy = copy.deepcopy(value)
            for key, child_obj in children_by_key.items():
                child_value = value.get(key, NOT_SET)
                child_value = value_copy.get(key, NOT_SET)
                child_obj.update_project_value(child_value)

    def _discard_changes(self, on_change_trigger):
@@ -195,6 +195,12 @@
                "label": "Families",
                "type": "list",
                "object_type": "text"
            },
            {
                "key": "sebsets",
                "label": "Subsets",
                "type": "list",
                "object_type": "text"
            }
        ]
    },
@@ -226,6 +232,121 @@
            "label": "Add additional tags to representations",
            "type": "list",
            "object_type": "text"
        },
        {
            "type": "separator"
        },
        {
            "type": "boolean",
            "key": "reformat_node_add",
            "label": "Add Reformat Node",
            "default": false
        },
        {
            "type": "collapsible-wrap",
            "label": "Reformat Node Knobs",
            "collapsible": true,
            "collapsed": false,
            "children": [
                {
                    "type": "list",
                    "key": "reformat_node_config",
                    "object_type": {
                        "type": "dict-conditional",
                        "enum_key": "type",
                        "enum_label": "Type",
                        "enum_children": [
                            {
                                "key": "string",
                                "label": "String",
                                "children": [
                                    {
                                        "type": "text",
                                        "key": "name",
                                        "label": "Name"
                                    },
                                    {
                                        "type": "text",
                                        "key": "value",
                                        "label": "Value"
                                    }
                                ]
                            },
                            {
                                "key": "bool",
                                "label": "Boolean",
                                "children": [
                                    {
                                        "type": "text",
                                        "key": "name",
                                        "label": "Name"
                                    },
                                    {
                                        "type": "boolean",
                                        "key": "value",
                                        "label": "Value"
                                    }
                                ]
                            },
                            {
                                "key": "number",
                                "label": "Number",
                                "children": [
                                    {
                                        "type": "text",
                                        "key": "name",
                                        "label": "Name"
                                    },
                                    {
                                        "type": "list-strict",
                                        "key": "value",
                                        "label": "Value",
                                        "object_types": [
                                            {
                                                "type": "number",
                                                "key": "number",
                                                "default": 1,
                                                "decimal": 4
                                            }
                                        ]
                                    }
                                ]
                            },
                            {
                                "key": "list_numbers",
                                "label": "2 Numbers",
                                "children": [
                                    {
                                        "type": "text",
                                        "key": "name",
                                        "label": "Name"
                                    },
                                    {
                                        "type": "list-strict",
                                        "key": "value",
                                        "label": "Value",
                                        "object_types": [
                                            {
                                                "type": "number",
                                                "key": "x",
                                                "default": 1,
                                                "decimal": 4
                                            },
                                            {
                                                "type": "number",
                                                "key": "y",
                                                "default": 1,
                                                "decimal": 4
                                            }
                                        ]
                                    }
                                ]
                            }
                        ]
                    }
                }
            ]
        }
    ]
}
@@ -4,9 +4,11 @@ from subprocess import Popen

import ftrack_api
from Qt import QtWidgets, QtCore
from openpype import style
from openpype.api import get_current_project_settings
from openpype.lib.avalon_context import update_current_task
from openpype.tools.utils.lib import qt_app_context
from avalon import io, api, style, schema
from avalon import io, api, schema
from . import widget, model

module = sys.modules[__name__]
@@ -463,12 +465,12 @@ class Window(QtWidgets.QDialog):
            return
        task_name = task_model.itemData(index)[0]
        try:
            api.update_current_task(task=task_name, asset=asset_name)
            update_current_task(task=task_name, asset=asset_name)
            self.open_app()

        finally:
            if origin_task is not None and origin_asset is not None:
                api.update_current_task(
                update_current_task(
                    task=origin_task, asset=origin_asset
                )
@@ -24,12 +24,12 @@ DEFAULT_COLOR = "#fb9c15"
log = logging.getLogger("SceneInventory")


class SceneInvetoryView(QtWidgets.QTreeView):
class SceneInventoryView(QtWidgets.QTreeView):
    data_changed = QtCore.Signal()
    hierarchy_view_changed = QtCore.Signal(bool)

    def __init__(self, parent=None):
        super(SceneInvetoryView, self).__init__(parent=parent)
        super(SceneInventoryView, self).__init__(parent=parent)

        # view settings
        self.setIndentation(12)
@@ -796,3 +796,40 @@ class SceneInvetoryView(QtWidgets.QTreeView):
        ).format(version_str)
        dialog.setText(msg)
        dialog.exec_()

    def update_all(self):
        """Update all items that are currently 'outdated' in the view"""
        # Get the source model through the proxy model
        model = self.model().sourceModel()

        # Get all items from outdated groups
        outdated_items = []
        for index in iter_model_rows(model,
                                     column=0,
                                     include_root=False):
            item = index.data(model.ItemRole)

            if not item.get("isGroupNode"):
                continue

            # Only the group nodes contain the "highest_version" data and as
            # such we find only the groups and take its children.
            if not model.outdated(item):
                continue

            # Collect all children which we want to update
            children = item.children()
            outdated_items.extend(children)

        if not outdated_items:
            log.info("Nothing to update.")
            return

        # Trigger update to latest
        for item in outdated_items:
            try:
                api.update(item, -1)
            except AssertionError:
                self._show_version_error_dialog(None, [item])
                log.warning("Update failed", exc_info=True)
        self.data_changed.emit()
@@ -18,7 +18,7 @@ from .model import (
    InventoryModel,
    FilterProxyModel
)
from .view import SceneInvetoryView
from .view import SceneInventoryView


module = sys.modules[__name__]
@@ -54,14 +54,21 @@ class SceneInventoryWindow(QtWidgets.QDialog):
        outdated_only_checkbox.setToolTip("Show outdated files only")
        outdated_only_checkbox.setChecked(False)

        icon = qtawesome.icon("fa.arrow-up", color="white")
        update_all_button = QtWidgets.QPushButton(self)
        update_all_button.setToolTip("Update all outdated to latest version")
        update_all_button.setIcon(icon)

        icon = qtawesome.icon("fa.refresh", color="white")
        refresh_button = QtWidgets.QPushButton(self)
        update_all_button.setToolTip("Refresh")
        refresh_button.setIcon(icon)

        control_layout = QtWidgets.QHBoxLayout()
        control_layout.addWidget(filter_label)
        control_layout.addWidget(text_filter)
        control_layout.addWidget(outdated_only_checkbox)
        control_layout.addWidget(update_all_button)
        control_layout.addWidget(refresh_button)

        # endregion control
@@ -73,7 +80,7 @@ class SceneInventoryWindow(QtWidgets.QDialog):
        proxy.setDynamicSortFilter(True)
        proxy.setFilterCaseSensitivity(QtCore.Qt.CaseInsensitive)

        view = SceneInvetoryView(self)
        view = SceneInventoryView(self)
        view.setModel(proxy)

        # set some nice default widths for the view
@@ -98,11 +105,13 @@ class SceneInventoryWindow(QtWidgets.QDialog):
            self._on_outdated_state_change
        )
        view.hierarchy_view_changed.connect(
            self._on_hiearchy_view_change
            self._on_hierarchy_view_change
        )
        view.data_changed.connect(self.refresh)
        refresh_button.clicked.connect(self.refresh)
        update_all_button.clicked.connect(self._on_update_all)

        self._update_all_button = update_all_button
        self._outdated_only_checkbox = outdated_only_checkbox
        self._view = view
        self._model = model
@@ -146,7 +155,7 @@ class SceneInventoryWindow(QtWidgets.QDialog):
            kwargs["selected"] = self._view._selected
        self._model.refresh(**kwargs)

    def _on_hiearchy_view_change(self, enabled):
    def _on_hierarchy_view_change(self, enabled):
        self._proxy.set_hierarchy_view(enabled)
        self._model.set_hierarchy_view(enabled)
@@ -158,6 +167,9 @@ class SceneInventoryWindow(QtWidgets.QDialog):
            self._outdated_only_checkbox.isChecked()
        )

    def _on_update_all(self):
        self._view.update_all()


def show(root=None, debug=False, parent=None, items=None):
    """Display Scene Inventory GUI
@@ -29,6 +29,10 @@ from openpype.lib import (
    create_workdir_extra_folders,
    get_system_general_anatomy_data
)
from openpype.lib.avalon_context import (
    update_current_task,
    compute_session_changes
)
from .model import FilesModel
from .view import FilesView
@@ -667,7 +671,7 @@ class FilesWidget(QtWidgets.QWidget):
            session["AVALON_APP"],
            project_name=session["AVALON_PROJECT"]
        )
        changes = pipeline.compute_session_changes(
        changes = compute_session_changes(
            session,
            asset=self._get_asset_doc(),
            task=self._task_name,
@@ -681,7 +685,7 @@ class FilesWidget(QtWidgets.QWidget):
        """Enter the asset and task session currently selected"""

        session = api.Session.copy()
        changes = pipeline.compute_session_changes(
        changes = compute_session_changes(
            session,
            asset=self._get_asset_doc(),
            task=self._task_name,
@@ -692,7 +696,7 @@ class FilesWidget(QtWidgets.QWidget):
            # to avoid any unwanted Task Changed callbacks to be triggered.
            return

        api.update_current_task(
        update_current_task(
            asset=self._get_asset_doc(),
            task=self._task_name,
            template_key=self.template_key
@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.9.0-nightly.5"
__version__ = "3.9.0-nightly.6"
@@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
version = "3.9.0-nightly.5" # OpenPype
version = "3.9.0-nightly.6" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"