Mirror of https://github.com/ynput/ayon-core.git (synced 2026-01-01 16:34:53 +01:00)

Commit 70df6cfebb: Merge remote-tracking branch 'origin/develop' into feature/maya-dirmapping-support

83 changed files with 5128 additions and 287 deletions
CHANGELOG.md (48 changed lines)

@@ -1,13 +1,24 @@
 # Changelog

-## [3.3.0-nightly.4](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.3.0-nightly.6](https://github.com/pypeclub/OpenPype/tree/HEAD)

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.2.0...HEAD)

 **🚀 Enhancements**

+- Expose stop timer through rest api. [\#1886](https://github.com/pypeclub/OpenPype/pull/1886)
+- Allow Multiple Notes to run on tasks. [\#1882](https://github.com/pypeclub/OpenPype/pull/1882)
+- Prepare for pyside2 [\#1869](https://github.com/pypeclub/OpenPype/pull/1869)
+- Filter hosts in settings host-enum [\#1868](https://github.com/pypeclub/OpenPype/pull/1868)
+- Local actions with process identifier [\#1867](https://github.com/pypeclub/OpenPype/pull/1867)
+- Workfile tool start at host launch support [\#1865](https://github.com/pypeclub/OpenPype/pull/1865)
+- Anatomy schema validation [\#1864](https://github.com/pypeclub/OpenPype/pull/1864)
+- Ftrack prepare project structure [\#1861](https://github.com/pypeclub/OpenPype/pull/1861)
+- Independent general environments [\#1853](https://github.com/pypeclub/OpenPype/pull/1853)
+- TVPaint Start Frame [\#1844](https://github.com/pypeclub/OpenPype/pull/1844)
 - Ftrack push attributes action adds traceback to job [\#1843](https://github.com/pypeclub/OpenPype/pull/1843)
 - Prepare project action enhance [\#1838](https://github.com/pypeclub/OpenPype/pull/1838)
+- Standalone Publish of textures family [\#1834](https://github.com/pypeclub/OpenPype/pull/1834)
 - nuke: settings create missing default subsets [\#1829](https://github.com/pypeclub/OpenPype/pull/1829)
 - Update poetry lock [\#1823](https://github.com/pypeclub/OpenPype/pull/1823)
 - Settings: settings for plugins [\#1819](https://github.com/pypeclub/OpenPype/pull/1819)
@@ -15,12 +26,18 @@

 **🐛 Bug fixes**

+- Normalize path returned from Workfiles. [\#1880](https://github.com/pypeclub/OpenPype/pull/1880)
+- Workfiles tool event arguments fix [\#1862](https://github.com/pypeclub/OpenPype/pull/1862)
+- imageio: fix grouping [\#1856](https://github.com/pypeclub/OpenPype/pull/1856)
+- publisher: missing version in subset prop [\#1849](https://github.com/pypeclub/OpenPype/pull/1849)
+- Ftrack type error fix in sync to avalon event handler [\#1845](https://github.com/pypeclub/OpenPype/pull/1845)
 - Nuke: updating effects subset fail [\#1841](https://github.com/pypeclub/OpenPype/pull/1841)
+- Fix - Standalone Publish better handling of loading multiple versions… [\#1837](https://github.com/pypeclub/OpenPype/pull/1837)
 - nuke: write render node skipped with crop [\#1836](https://github.com/pypeclub/OpenPype/pull/1836)
 - Project folder structure overrides [\#1813](https://github.com/pypeclub/OpenPype/pull/1813)
 - Maya: fix yeti settings path in extractor [\#1809](https://github.com/pypeclub/OpenPype/pull/1809)
 - Failsafe for cross project containers. [\#1806](https://github.com/pypeclub/OpenPype/pull/1806)
-- Houdini colector formatting keys fix [\#1802](https://github.com/pypeclub/OpenPype/pull/1802)
+- Settings error dialog on show [\#1798](https://github.com/pypeclub/OpenPype/pull/1798)

 **Merged pull requests:**

@@ -47,7 +64,6 @@
 - Settings Hosts enum [\#1739](https://github.com/pypeclub/OpenPype/pull/1739)
 - Validate containers settings [\#1736](https://github.com/pypeclub/OpenPype/pull/1736)
 - PS - added loader from sequence [\#1726](https://github.com/pypeclub/OpenPype/pull/1726)
-- Toggle Ftrack upload in StandalonePublisher [\#1708](https://github.com/pypeclub/OpenPype/pull/1708)

 **🐛 Bug fixes**

@@ -91,40 +107,14 @@

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.2.0-nightly.2...2.18.3)

-**🐛 Bug fixes**
-
-- Tools names forwards compatibility [\#1727](https://github.com/pypeclub/OpenPype/pull/1727)
-
-**⚠️ Deprecations**
-
-- global: removing obsolete ftrack validator plugin [\#1710](https://github.com/pypeclub/OpenPype/pull/1710)
-
 ## [2.18.2](https://github.com/pypeclub/OpenPype/tree/2.18.2) (2021-06-16)

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.1.0...2.18.2)

-**🐛 Bug fixes**
-
-- Maya: Extract review hotfix - 2.x backport [\#1713](https://github.com/pypeclub/OpenPype/pull/1713)
-
-**Merged pull requests:**
-
-- 1698 Nuke: Prerender Frame Range by default [\#1709](https://github.com/pypeclub/OpenPype/pull/1709)
-
 ## [3.1.0](https://github.com/pypeclub/OpenPype/tree/3.1.0) (2021-06-15)

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.1.0-nightly.4...3.1.0)

-**🚀 Enhancements**
-
-- Log Viewer with OpenPype style [\#1703](https://github.com/pypeclub/OpenPype/pull/1703)
-- Scrolling in OpenPype info widget [\#1702](https://github.com/pypeclub/OpenPype/pull/1702)
-
-**🐛 Bug fixes**
-
-- Nuke: broken publishing rendered frames [\#1707](https://github.com/pypeclub/OpenPype/pull/1707)
-- Standalone publisher Thumbnail export args [\#1705](https://github.com/pypeclub/OpenPype/pull/1705)
-
 # Changelog
@@ -49,7 +49,7 @@ class CopyTemplateWorkfile(PreLaunchHook):
             ))
             return

-        self.log.info("Last workfile does not exits.")
+        self.log.info("Last workfile does not exist.")

         project_name = self.data["project_name"]
         asset_name = self.data["asset_name"]
openpype/hosts/maya/api/commands.py (new file)
@@ -0,0 +1,53 @@
# -*- coding: utf-8 -*-
"""OpenPype script commands to be used directly in Maya."""


class ToolWindows:

    _windows = {}

    @classmethod
    def get_window(cls, tool):
        """Get widget for specific tool.

        Args:
            tool (str): Name of the tool.

        Returns:
            Stored widget.

        """
        try:
            return cls._windows[tool]
        except KeyError:
            return None

    @classmethod
    def set_window(cls, tool, window):
        """Set widget for the tool.

        Args:
            tool (str): Name of the tool.
            window (QtWidgets.QWidget): Widget

        """
        cls._windows[tool] = window


def edit_shader_definitions():
    from avalon.tools import lib
    from Qt import QtWidgets
    from openpype.hosts.maya.api.shader_definition_editor import (
        ShaderDefinitionsEditor
    )

    top_level_widgets = QtWidgets.QApplication.topLevelWidgets()
    main_window = next(widget for widget in top_level_widgets
                       if widget.objectName() == "MayaWindow")

    with lib.application():
        window = ToolWindows.get_window("shader_definition_editor")
        if not window:
            window = ShaderDefinitionsEditor(parent=main_window)
            ToolWindows.set_window("shader_definition_editor", window)
        window.show()
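The ToolWindows registry above keeps one widget instance per tool name, so repeated menu calls reuse the same window instead of spawning a new one each time. A minimal sketch of that pattern, using a hypothetical tool name and a plain QWidget as a stand-in for a real tool window (a running Qt application inside Maya is assumed):

# Sketch only, not part of the commit; "my_tool" and the bare QWidget are
# hypothetical placeholders for illustration.
from Qt import QtWidgets
from openpype.hosts.maya.api.commands import ToolWindows


def show_my_tool(parent=None):
    window = ToolWindows.get_window("my_tool")
    if not window:
        # First call: create the widget and cache it under the tool name.
        window = QtWidgets.QWidget(parent)
        ToolWindows.set_window("my_tool", window)
    # Subsequent calls reuse the cached instance.
    window.show()
    return window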
@@ -6,9 +6,9 @@ from avalon.vendor.Qt import QtWidgets, QtGui
 from avalon.maya import pipeline
 from openpype.api import BuildWorkfile
 import maya.cmds as cmds
+from openpype.settings import get_project_settings

 self = sys.modules[__name__]
-self._menu = os.environ.get("AVALON_LABEL")

 log = logging.getLogger(__name__)
@@ -17,8 +17,11 @@ log = logging.getLogger(__name__)
 def _get_menu(menu_name=None):
     """Return the menu instance if it currently exists in Maya"""

+    project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+    _menu = project_settings["maya"]["scriptsmenu"]["name"]
+
     if menu_name is None:
-        menu_name = self._menu
+        menu_name = _menu
     widgets = dict((
         w.objectName(), w) for w in QtWidgets.QApplication.allWidgets())
     menu = widgets.get(menu_name)
@@ -55,35 +58,7 @@ def deferred():
         parent=pipeline._parent
     )

-    # Find the pipeline menu
-    top_menu = _get_menu(pipeline._menu)
-
-    # Try to find workfile tool action in the menu
-    workfile_action = None
-    for action in top_menu.actions():
-        if action.text() == "Work Files":
-            workfile_action = action
-            break
-
-    # Add at the top of menu if "Work Files" action was not found
-    after_action = ""
-    if workfile_action:
-        # Use action's object name for `insertAfter` argument
-        after_action = workfile_action.objectName()
-
-    # Insert action to menu
-    cmds.menuItem(
-        "Work Files",
-        parent=pipeline._menu,
-        command=launch_workfiles_app,
-        insertAfter=after_action
-    )
-
-    # Remove replaced action
-    if workfile_action:
-        top_menu.removeAction(workfile_action)
-
-    log.info("Attempting to install scripts menu..")
+    log.info("Attempting to install scripts menu ...")

     add_build_workfiles_item()
     add_look_assigner_item()
@@ -100,13 +75,18 @@ def deferred():
         return

     # load configuration of custom menu
-    config_path = os.path.join(os.path.dirname(__file__), "menu.json")
-    config = scriptsmenu.load_configuration(config_path)
+    project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+    config = project_settings["maya"]["scriptsmenu"]["definition"]
+    _menu = project_settings["maya"]["scriptsmenu"]["name"]
+
+    if not config:
+        log.warning("Skipping studio menu, no definition found.")
+        return

     # run the launcher for Maya menu
     studio_menu = launchformaya.main(
-        title=self._menu.title(),
-        objectName=self._menu
+        title=_menu.title(),
+        objectName=_menu.title().lower().replace(" ", "_")
     )

     # apply configuration
@@ -116,7 +96,7 @@ def deferred():

 def uninstall():
     menu = _get_menu()
     if menu:
-        log.info("Attempting to uninstall..")
+        log.info("Attempting to uninstall ...")

         try:
             menu.deleteLater()
@@ -136,9 +116,8 @@ def install():


 def popup():
-    """Pop-up the existing menu near the mouse cursor"""
+    """Pop-up the existing menu near the mouse cursor."""
     menu = _get_menu()
-
     cursor = QtGui.QCursor()
     point = cursor.pos()
     menu.exec_(point)
openpype/hosts/maya/api/shader_definition_editor.py (new file)
@@ -0,0 +1,176 @@
# -*- coding: utf-8 -*-
"""Editor for shader definitions.

Shader names are stored as simple text file over GridFS in mongodb.

"""
import os
from Qt import QtWidgets, QtCore, QtGui
from openpype.lib.mongo import OpenPypeMongoConnection
from openpype import resources
import gridfs


DEFINITION_FILENAME = "{}/maya/shader_definition.txt".format(
    os.getenv("AVALON_PROJECT"))


class ShaderDefinitionsEditor(QtWidgets.QWidget):
    """Widget serving as simple editor for shader name definitions."""

    # name of the file used to store definitions

    def __init__(self, parent=None):
        super(ShaderDefinitionsEditor, self).__init__(parent)
        self._mongo = OpenPypeMongoConnection.get_mongo_client()
        self._gridfs = gridfs.GridFS(
            self._mongo[os.getenv("OPENPYPE_DATABASE_NAME")])
        self._editor = None

        self._original_content = self._read_definition_file()

        self.setObjectName("shaderDefinitionEditor")
        self.setWindowTitle("OpenPype shader name definition editor")
        icon = QtGui.QIcon(resources.pype_icon_filepath())
        self.setWindowIcon(icon)
        self.setWindowFlags(QtCore.Qt.Window)
        self.setParent(parent)
        self.setAttribute(QtCore.Qt.WA_DeleteOnClose)
        self.resize(750, 500)

        self._setup_ui()
        self._reload()

    def _setup_ui(self):
        """Setup UI of Widget."""
        layout = QtWidgets.QVBoxLayout(self)
        label = QtWidgets.QLabel()
        label.setText("Put shader names here - one name per line:")
        layout.addWidget(label)
        self._editor = QtWidgets.QPlainTextEdit()
        self._editor.setStyleSheet("border: none;")
        layout.addWidget(self._editor)

        btn_layout = QtWidgets.QHBoxLayout()
        save_btn = QtWidgets.QPushButton("Save")
        save_btn.clicked.connect(self._save)

        reload_btn = QtWidgets.QPushButton("Reload")
        reload_btn.clicked.connect(self._reload)

        exit_btn = QtWidgets.QPushButton("Exit")
        exit_btn.clicked.connect(self._close)

        btn_layout.addWidget(reload_btn)
        btn_layout.addWidget(save_btn)
        btn_layout.addWidget(exit_btn)

        layout.addLayout(btn_layout)

    def _read_definition_file(self, file=None):
        """Read definition file from database.

        Args:
            file (gridfs.grid_file.GridOut, Optional): File to read. If not
                set, new query will be issued to find it.

        Returns:
            str: Content of the file or empty string if file doesn't exist.

        """
        content = ""
        if not file:
            file = self._gridfs.find_one(
                {"filename": DEFINITION_FILENAME})
            if not file:
                print(">>> [SNDE]: nothing in database yet")
                return content
        content = file.read()
        file.close()
        return content

    def _write_definition_file(self, content, force=False):
        """Write content as definition to file in database.

        Before file is writen, check is made if its content has not
        changed. If is changed, warning is issued to user if he wants
        it to overwrite. Note: GridFs doesn't allow changing file content.
        You need to delete existing file and create new one.

        Args:
            content (str): Content to write.

        Raises:
            ContentException: If file is changed in database while
                editor is running.
        """
        file = self._gridfs.find_one(
            {"filename": DEFINITION_FILENAME})
        if file:
            content_check = self._read_definition_file(file)
            if content == content_check:
                print(">>> [SNDE]: content not changed")
                return
            if self._original_content != content_check:
                if not force:
                    raise ContentException("Content changed")
            print(">>> [SNDE]: overwriting data")
            file.close()
            self._gridfs.delete(file._id)

        file = self._gridfs.new_file(
            filename=DEFINITION_FILENAME,
            content_type='text/plain',
            encoding='utf-8')
        file.write(content)
        file.close()
        QtCore.QTimer.singleShot(200, self._reset_style)
        self._editor.setStyleSheet("border: 1px solid #33AF65;")
        self._original_content = content

    def _reset_style(self):
        """Reset editor style back.

        Used to visually indicate save.

        """
        self._editor.setStyleSheet("border: none;")

    def _close(self):
        self.hide()

    def closeEvent(self, event):
        event.ignore()
        self.hide()

    def _reload(self):
        print(">>> [SNDE]: reloading")
        self._set_content(self._read_definition_file())

    def _save(self):
        try:
            self._write_definition_file(content=self._editor.toPlainText())
        except ContentException:
            # content has changed meanwhile
            print(">>> [SNDE]: content has changed")
            self._show_overwrite_warning()

    def _set_content(self, content):
        self._editor.setPlainText(content)

    def _show_overwrite_warning(self):
        reply = QtWidgets.QMessageBox.question(
            self,
            "Warning",
            ("Content you are editing was changed meanwhile in database.\n"
             "Please, reload and solve the conflict."),
            QtWidgets.QMessageBox.OK)

        if reply == QtWidgets.QMessageBox.OK:
            # do nothing
            pass


class ContentException(Exception):
    """This is risen during save if file is changed in database."""
    pass
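The shader name list that this editor saves is read back elsewhere in this commit; the model name validator below queries the same GridFS document. A minimal sketch of that read path, assuming a configured OpenPype mongo connection and the DEFINITION_FILENAME convention above:

# Sketch only, mirroring the read path used by the editor and validator.
import os
import gridfs
from openpype.lib.mongo import OpenPypeMongoConnection
from openpype.hosts.maya.api.shader_definition_editor import (
    DEFINITION_FILENAME)


def read_shader_names():
    client = OpenPypeMongoConnection.get_mongo_client()
    fs = gridfs.GridFS(client[os.getenv("OPENPYPE_DATABASE_NAME")])
    stored = fs.find_one({"filename": DEFINITION_FILENAME})
    if not stored:
        # nothing saved through the editor yet
        return []
    content = stored.read()  # GridOut.read() may return bytes
    stored.close()
    # one shader name per line, as the editor stores it
    return content.splitlines()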
@@ -1,8 +1,16 @@
+# -*- coding: utf-8 -*-
+"""Validate model nodes names."""
 from maya import cmds
 import pyblish.api
 import openpype.api
+import avalon.api
 import openpype.hosts.maya.api.action
+from openpype.hosts.maya.api.shader_definition_editor import (
+    DEFINITION_FILENAME)
+from openpype.lib.mongo import OpenPypeMongoConnection
+import gridfs
 import re
+import os


 class ValidateModelName(pyblish.api.InstancePlugin):
@@ -19,18 +27,18 @@ class ValidateModelName(pyblish.api.InstancePlugin):
     families = ["model"]
     label = "Model Name"
     actions = [openpype.hosts.maya.api.action.SelectInvalidAction]
-    # path to shader names definitions
-    # TODO: move it to preset file
     material_file = None
-    regex = '(.*)_(\\d)*_(.*)_(GEO)'
+    database_file = DEFINITION_FILENAME

     @classmethod
     def get_invalid(cls, instance):
+        """Get invalid nodes."""
+        use_db = cls.database

-        # find out if supplied transform is group or not
-        def is_group(groupName):
+        def is_group(group_name):
+            """Find out if supplied transform is group or not."""
             try:
-                children = cmds.listRelatives(groupName, children=True)
+                children = cmds.listRelatives(group_name, children=True)
                 for child in children:
                     if not cmds.ls(child, transforms=True):
                         return False
@@ -44,29 +52,74 @@ class ValidateModelName(pyblish.api.InstancePlugin):
             cls.log.error("Instance has no nodes!")
             return True
         pass

+        # validate top level group name
+        assemblies = cmds.ls(content_instance, assemblies=True, long=True)
+        if len(assemblies) != 1:
+            cls.log.error("Must have exactly one top group")
+            return assemblies or True
+        top_group = assemblies[0]
+        regex = cls.top_level_regex
+        r = re.compile(regex)
+        m = r.match(top_group)
+        if m is None:
+            cls.log.error("invalid name on: {}".format(top_group))
+            cls.log.error("name doesn't match regex {}".format(regex))
+            invalid.append(top_group)
+        else:
+            if "asset" in r.groupindex:
+                if m.group("asset") != avalon.api.Session["AVALON_ASSET"]:
+                    cls.log.error("Invalid asset name in top level group.")
+                    return top_group
+            if "subset" in r.groupindex:
+                if m.group("subset") != instance.data.get("subset"):
+                    cls.log.error("Invalid subset name in top level group.")
+                    return top_group
+            if "project" in r.groupindex:
+                if m.group("project") != avalon.api.Session["AVALON_PROJECT"]:
+                    cls.log.error("Invalid project name in top level group.")
+                    return top_group
+
         descendants = cmds.listRelatives(content_instance,
                                          allDescendents=True,
                                          fullPath=True) or []

         descendants = cmds.ls(descendants, noIntermediate=True, long=True)
-        trns = cmds.ls(descendants, long=False, type=('transform'))
+        trns = cmds.ls(descendants, long=False, type='transform')

         # filter out groups
-        filter = [node for node in trns if not is_group(node)]
+        filtered = [node for node in trns if not is_group(node)]

         # load shader list file as utf-8
-        if cls.material_file:
-            shader_file = open(cls.material_file, "r")
-            shaders = shader_file.readlines()
-            shader_file.close()
+        shaders = []
+        if not use_db:
+            if cls.material_file:
+                if os.path.isfile(cls.material_file):
+                    shader_file = open(cls.material_file, "r")
+                    shaders = shader_file.readlines()
+                    shader_file.close()
+                else:
+                    cls.log.error("Missing shader name definition file.")
+                    return True
+        else:
+            client = OpenPypeMongoConnection.get_mongo_client()
+            fs = gridfs.GridFS(client[os.getenv("OPENPYPE_DATABASE_NAME")])
+            shader_file = fs.find_one({"filename": cls.database_file})
+            if not shader_file:
+                cls.log.error("Missing shader name definition in database.")
+                return True
+            shaders = shader_file.read().splitlines()
+            shader_file.close()

         # strip line endings from list
         shaders = map(lambda s: s.rstrip(), shaders)

         # compile regex for testing names
-        r = re.compile(cls.regex)
+        regex = cls.regex
+        r = re.compile(regex)

-        for obj in filter:
+        for obj in filtered:
+            cls.log.info("testing: {}".format(obj))
             m = r.match(obj)
             if m is None:
                 cls.log.error("invalid name on: {}".format(obj))
@@ -74,7 +127,7 @@ class ValidateModelName(pyblish.api.InstancePlugin):
             else:
                 # if we have shader files and shader named group is in
                 # regex, test this group against names in shader file
-                if 'shader' in r.groupindex and shaders:
+                if "shader" in r.groupindex and shaders:
                     try:
                         if not m.group('shader') in shaders:
                             cls.log.error(
@@ -90,8 +143,8 @@ class ValidateModelName(pyblish.api.InstancePlugin):
         return invalid

     def process(self, instance):
+        """Plugin entry point."""
         invalid = self.get_invalid(instance)

         if invalid:
-            raise RuntimeError("Model naming is invalid. See log.")
+            raise RuntimeError("Model naming is invalid. See the log.")
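The new top level group check relies on named regex groups: only the groups that actually appear in the configured pattern (checked through r.groupindex) are compared against the session and instance data. A minimal sketch with a hypothetical top_level_regex value; the real value comes from the plugin settings:

# Sketch only; the regex and node name below are hypothetical examples.
import re

top_level_regex = r"^\|(?P<asset>[a-zA-Z0-9]+)_(?P<subset>[a-zA-Z0-9]+)_GRP$"

r = re.compile(top_level_regex)
m = r.match("|chair01_modelMain_GRP")
if m is None:
    print("name doesn't match regex")
else:
    # only groups present in the pattern are validated, as in get_invalid()
    if "asset" in r.groupindex:
        print("asset:", m.group("asset"))    # chair01
    if "subset" in r.groupindex:
        print("subset:", m.group("subset"))  # modelMain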
@@ -1660,9 +1660,13 @@ def find_free_space_to_paste_nodes(
 def launch_workfiles_app():
     '''Function letting start workfiles after start of host
     '''
-    # get state from settings
-    open_at_start = get_current_project_settings()["nuke"].get(
-        "general", {}).get("open_workfile_at_start")
+    from openpype.lib import (
+        env_value_to_bool
+    )
+    # get all imortant settings
+    open_at_start = env_value_to_bool(
+        env_key="OPENPYPE_WORKFILE_TOOL_ON_START",
+        default=None)

     # return if none is defined
     if not open_at_start:
@@ -1739,3 +1743,68 @@ def process_workfile_builder():
         log.info("Opening last workfile...")
         # open workfile
         open_file(last_workfile_path)
+
+
+def recreate_instance(origin_node, avalon_data=None):
+    """Recreate input instance to different data
+
+    Args:
+        origin_node (nuke.Node): Nuke node to be recreating from
+        avalon_data (dict, optional): data to be used in new node avalon_data
+
+    Returns:
+        nuke.Node: newly created node
+    """
+    knobs_wl = ["render", "publish", "review", "ypos",
+                "use_limit", "first", "last"]
+    # get data from avalon knobs
+    data = anlib.get_avalon_knob_data(
+        origin_node)
+
+    # add input data to avalon data
+    if avalon_data:
+        data.update(avalon_data)
+
+    # capture all node knobs allowed in op_knobs
+    knobs_data = {k: origin_node[k].value()
+                  for k in origin_node.knobs()
+                  for key in knobs_wl
+                  if key in k}
+
+    # get node dependencies
+    inputs = origin_node.dependencies()
+    outputs = origin_node.dependent()
+
+    # remove the node
+    nuke.delete(origin_node)
+
+    # create new node
+    # get appropriate plugin class
+    creator_plugin = None
+    for Creator in api.discover(api.Creator):
+        if Creator.__name__ == data["creator"]:
+            creator_plugin = Creator
+            break
+
+    # create write node with creator
+    new_node_name = data["subset"]
+    new_node = creator_plugin(new_node_name, data["asset"]).process()
+
+    # white listed knobs to the new node
+    for _k, _v in knobs_data.items():
+        try:
+            print(_k, _v)
+            new_node[_k].setValue(_v)
+        except Exception as e:
+            print(e)
+
+    # connect to original inputs
+    for i, n in enumerate(inputs):
+        new_node.setInput(i, n)
+
+    # connect to outputs
+    if len(outputs) > 0:
+        for dn in outputs:
+            dn.setInput(0, new_node)
+
+    return new_node
@@ -182,7 +182,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
                 })
             for subset, properities in self.subsets.items():
                 version = properities.get("version")
-                if version and version == 0:
+                if version == 0:
                     properities.pop("version")

             # adding Review-able instance
@@ -37,7 +37,7 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):

         # return if any
         if entity_type:
-            return {"entityType": entity_type, "entityName": value}
+            return {"entity_type": entity_type, "entity_name": value}

     def rename_with_hierarchy(self, instance):
         search_text = ""
@@ -76,8 +76,8 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
         # add current selection context hierarchy from standalonepublisher
         for entity in reversed(visual_hierarchy):
             parents.append({
-                "entityType": entity["data"]["entityType"],
-                "entityName": entity["name"]
+                "entity_type": entity["data"]["entityType"],
+                "entity_name": entity["name"]
             })

         if self.shot_add_hierarchy:
@@ -98,7 +98,7 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):

                 # in case SP context is set to the same folder
                 if (_index == 0) and ("folder" in parent_key) \
-                        and (parents[-1]["entityName"] == parent_filled):
+                        and (parents[-1]["entity_name"] == parent_filled):
                     self.log.debug(f" skiping : {parent_filled}")
                     continue
@@ -280,9 +280,9 @@ class CollectHierarchyContext(pyblish.api.ContextPlugin):

         for parent in reversed(parents):
             next_dict = {}
-            parent_name = parent["entityName"]
+            parent_name = parent["entity_name"]
             next_dict[parent_name] = {}
-            next_dict[parent_name]["entity_type"] = parent["entityType"]
+            next_dict[parent_name]["entity_type"] = parent["entity_type"]
             next_dict[parent_name]["childs"] = actual
             actual = next_dict
@@ -0,0 +1,456 @@
import os
import re
import pyblish.api
import json

from avalon.api import format_template_with_optional_keys

from openpype.lib import prepare_template_data


class CollectTextures(pyblish.api.ContextPlugin):
    """Collect workfile (and its resource_files) and textures.

    Currently implements use case with Mari and Substance Painter, where
    one workfile is main (.mra - Mari) with possible additional workfiles
    (.spp - Substance)


    Provides:
        1 instance per workfile (with 'resources' filled if needed)
            (workfile family)
        1 instance per group of textures
            (textures family)
    """

    order = pyblish.api.CollectorOrder
    label = "Collect Textures"
    hosts = ["standalonepublisher"]
    families = ["texture_batch"]
    actions = []

    # from presets
    main_workfile_extensions = ['mra']
    other_workfile_extensions = ['spp', 'psd']
    texture_extensions = ["exr", "dpx", "jpg", "jpeg", "png", "tiff", "tga",
                          "gif", "svg"]

    # additional families (ftrack etc.)
    workfile_families = []
    textures_families = []

    color_space = ["linsRGB", "raw", "acesg"]

    # currently implemented placeholders ["color_space"]
    # describing patterns in file names splitted by regex groups
    input_naming_patterns = {
        # workfile: corridorMain_v001.mra >
        # texture: corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr
        "workfile": r'^([^.]+)(_[^_.]*)?_v([0-9]{3,}).+',
        "textures": r'^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+',  # noqa
    }
    # matching regex group position to 'input_naming_patterns'
    input_naming_groups = {
        "workfile": ('asset', 'filler', 'version'),
        "textures": ('asset', 'shader', 'version', 'channel', 'color_space',
                     'udim')
    }

    workfile_subset_template = "textures{Subset}Workfile"
    # implemented keys: ["color_space", "channel", "subset", "shader"]
    texture_subset_template = "textures{Subset}_{Shader}_{Channel}"

    def process(self, context):
        self.context = context

        resource_files = {}
        workfile_files = {}
        representations = {}
        version_data = {}
        asset_builds = set()
        asset = None
        for instance in context:
            if not self.input_naming_patterns:
                raise ValueError("Naming patterns are not configured. \n"
                                 "Ask admin to provide naming conventions "
                                 "for workfiles and textures.")

            if not asset:
                asset = instance.data["asset"]  # selected from SP

            parsed_subset = instance.data["subset"].replace(
                instance.data["family"], '')

            fill_pairs = {
                "subset": parsed_subset
            }

            fill_pairs = prepare_template_data(fill_pairs)
            workfile_subset = format_template_with_optional_keys(
                fill_pairs, self.workfile_subset_template)

            processed_instance = False
            for repre in instance.data["representations"]:
                ext = repre["ext"].replace('.', '')
                asset_build = version = None

                if isinstance(repre["files"], list):
                    repre_file = repre["files"][0]
                else:
                    repre_file = repre["files"]

                if ext in self.main_workfile_extensions or \
                        ext in self.other_workfile_extensions:

                    asset_build = self._get_asset_build(
                        repre_file,
                        self.input_naming_patterns["workfile"],
                        self.input_naming_groups["workfile"],
                        self.color_space
                    )
                    version = self._get_version(
                        repre_file,
                        self.input_naming_patterns["workfile"],
                        self.input_naming_groups["workfile"],
                        self.color_space
                    )
                    asset_builds.add((asset_build, version,
                                      workfile_subset, 'workfile'))
                    processed_instance = True

                    if not representations.get(workfile_subset):
                        representations[workfile_subset] = []

                if ext in self.main_workfile_extensions:
                    # workfiles can have only single representation
                    # currently OP is not supporting different extensions in
                    # representation files
                    representations[workfile_subset] = [repre]

                    workfile_files[asset_build] = repre_file

                if ext in self.other_workfile_extensions:
                    # add only if not added already from main
                    if not representations.get(workfile_subset):
                        representations[workfile_subset] = [repre]

                    # only overwrite if not present
                    if not workfile_files.get(asset_build):
                        workfile_files[asset_build] = repre_file

                    if not resource_files.get(workfile_subset):
                        resource_files[workfile_subset] = []
                    item = {
                        "files": [os.path.join(repre["stagingDir"],
                                               repre["files"])],
                        "source": "standalone publisher"
                    }
                    resource_files[workfile_subset].append(item)

                if ext in self.texture_extensions:
                    c_space = self._get_color_space(
                        repre_file,
                        self.color_space
                    )

                    channel = self._get_channel_name(
                        repre_file,
                        self.input_naming_patterns["textures"],
                        self.input_naming_groups["textures"],
                        self.color_space
                    )

                    shader = self._get_shader_name(
                        repre_file,
                        self.input_naming_patterns["textures"],
                        self.input_naming_groups["textures"],
                        self.color_space
                    )

                    formatting_data = {
                        "color_space": c_space or '',  # None throws exception
                        "channel": channel or '',
                        "shader": shader or '',
                        "subset": parsed_subset or ''
                    }

                    fill_pairs = prepare_template_data(formatting_data)
                    subset = format_template_with_optional_keys(
                        fill_pairs, self.texture_subset_template)

                    asset_build = self._get_asset_build(
                        repre_file,
                        self.input_naming_patterns["textures"],
                        self.input_naming_groups["textures"],
                        self.color_space
                    )
                    version = self._get_version(
                        repre_file,
                        self.input_naming_patterns["textures"],
                        self.input_naming_groups["textures"],
                        self.color_space
                    )
                    if not representations.get(subset):
                        representations[subset] = []
                    representations[subset].append(repre)

                    ver_data = {
                        "color_space": c_space or '',
                        "channel_name": channel or '',
                        "shader_name": shader or ''
                    }
                    version_data[subset] = ver_data

                    asset_builds.add(
                        (asset_build, version, subset, "textures"))
                    processed_instance = True

            if processed_instance:
                self.context.remove(instance)

        self._create_new_instances(context,
                                   asset,
                                   asset_builds,
                                   resource_files,
                                   representations,
                                   version_data,
                                   workfile_files)

    def _create_new_instances(self, context, asset, asset_builds,
                              resource_files, representations,
                              version_data, workfile_files):
        """Prepare new instances from collected data.

        Args:
            context (ContextPlugin)
            asset (string): selected asset from SP
            asset_builds (set) of tuples
                (asset_build, version, subset, family)
            resource_files (list) of resource dicts - to store additional
                files to main workfile
            representations (list) of dicts - to store workfile info OR
                all collected texture files, key is asset_build
            version_data (dict) - prepared to store into version doc in DB
            workfile_files (dict) - to store workfile to add to textures
                key is asset_build
        """
        # sort workfile first
        asset_builds = sorted(asset_builds,
                              key=lambda tup: tup[3], reverse=True)

        # workfile must have version, textures might
        main_version = None
        for asset_build, version, subset, family in asset_builds:
            if not main_version:
                main_version = version
            new_instance = context.create_instance(subset)
            new_instance.data.update(
                {
                    "subset": subset,
                    "asset": asset,
                    "label": subset,
                    "name": subset,
                    "family": family,
                    "version": int(version or main_version or 1),
                    "asset_build": asset_build  # remove in validator
                }
            )

            workfile = workfile_files.get(asset_build)

            if resource_files.get(subset):
                # add resources only when workfile is main style
                for ext in self.main_workfile_extensions:
                    if ext in workfile:
                        new_instance.data.update({
                            "resources": resource_files.get(subset)
                        })
                        break

            # store origin
            if family == 'workfile':
                families = self.workfile_families

                new_instance.data["source"] = "standalone publisher"
            else:
                families = self.textures_families

                repre = representations.get(subset)[0]
                new_instance.context.data["currentFile"] = os.path.join(
                    repre["stagingDir"], workfile or 'dummy.txt')

            new_instance.data["families"] = families

            # add data for version document
            ver_data = version_data.get(subset)
            if ver_data:
                if workfile:
                    ver_data['workfile'] = workfile

                new_instance.data.update(
                    {"versionData": ver_data}
                )

            upd_representations = representations.get(subset)
            if upd_representations and family != 'workfile':
                upd_representations = self._update_representations(
                    upd_representations)

            new_instance.data["representations"] = upd_representations

            self.log.debug("new instance - {}:: {}".format(
                family,
                json.dumps(new_instance.data, indent=4)))

    def _get_asset_build(self, name,
                         input_naming_patterns, input_naming_groups,
                         color_spaces):
        """Loops through configured workfile patterns to find asset name.

        Asset name used to bind workfile and its textures.

        Args:
            name (str): workfile name
            input_naming_patterns (list):
                [workfile_pattern] or [texture_pattern]
            input_naming_groups (list)
                ordinal position of regex groups matching to input_naming..
            color_spaces (list) - predefined color spaces
        """
        asset_name = "NOT_AVAIL"

        return self._parse(name, input_naming_patterns, input_naming_groups,
                           color_spaces, 'asset') or asset_name

    def _get_version(self, name, input_naming_patterns, input_naming_groups,
                     color_spaces):
        found = self._parse(name, input_naming_patterns, input_naming_groups,
                            color_spaces, 'version')

        if found:
            return found.replace('v', '')

        self.log.info("No version found in the name {}".format(name))

    def _get_udim(self, name, input_naming_patterns, input_naming_groups,
                  color_spaces):
        """Parses from 'name' udim value."""
        found = self._parse(name, input_naming_patterns, input_naming_groups,
                            color_spaces, 'udim')
        if found:
            return found

        self.log.warning("Didn't find UDIM in {}".format(name))

    def _get_color_space(self, name, color_spaces):
        """Looks for color_space from a list in a file name.

        Color space seems not to be recognizable by regex pattern, set of
        known space spaces must be provided.
        """
        color_space = None
        found = [cs for cs in color_spaces if
                 re.search("_{}_".format(cs), name)]

        if not found:
            self.log.warning("No color space found in {}".format(name))
        else:
            if len(found) > 1:
                msg = "Multiple color spaces found in {}->{}".format(name,
                                                                     found)
                self.log.warning(msg)

            color_space = found[0]

        return color_space

    def _get_shader_name(self, name, input_naming_patterns,
                         input_naming_groups, color_spaces):
        """Return parsed shader name.

        Shader name is needed for overlapping udims (eg. udims might be
        used for different materials, shader needed to not overwrite).

        Unknown format of channel name and color spaces >> cs are known
        list - 'color_space' used as a placeholder
        """
        found = self._parse(name, input_naming_patterns, input_naming_groups,
                            color_spaces, 'shader')
        if found:
            return found

        self.log.warning("Didn't find shader in {}".format(name))

    def _get_channel_name(self, name, input_naming_patterns,
                          input_naming_groups, color_spaces):
        """Return parsed channel name.

        Unknown format of channel name and color spaces >> cs are known
        list - 'color_space' used as a placeholder
        """
        found = self._parse(name, input_naming_patterns, input_naming_groups,
                            color_spaces, 'channel')
        if found:
            return found

        self.log.warning("Didn't find channel in {}".format(name))

    def _parse(self, name, input_naming_patterns, input_naming_groups,
               color_spaces, key):
        """Universal way to parse 'name' with configurable regex groups.

        Args:
            name (str): workfile name
            input_naming_patterns (list):
                [workfile_pattern] or [texture_pattern]
            input_naming_groups (list)
                ordinal position of regex groups matching to input_naming..
            color_spaces (list) - predefined color spaces

        Raises:
            ValueError - if broken 'input_naming_groups'
        """
        for input_pattern in input_naming_patterns:
            for cs in color_spaces:
                pattern = input_pattern.replace('{color_space}', cs)
                regex_result = re.findall(pattern, name)
                if regex_result:
                    idx = list(input_naming_groups).index(key)
                    if idx < 0:
                        msg = "input_naming_groups must " +\
                              "have '{}' key".format(key)
                        raise ValueError(msg)

                    try:
                        parsed_value = regex_result[0][idx]
                        return parsed_value
                    except IndexError:
                        self.log.warning("Wrong index, probably "
                                         "wrong name {}".format(name))

    def _update_representations(self, upd_representations):
        """Frames dont have sense for textures, add collected udims instead."""
        udims = []
        for repre in upd_representations:
            repre.pop("frameStart", None)
            repre.pop("frameEnd", None)
            repre.pop("fps", None)

            # ignore unique name from SP, use extension instead
            # SP enforces unique name, here different subsets >> unique repres
            repre["name"] = repre["ext"].replace('.', '')

            files = repre.get("files", [])
            if not isinstance(files, list):
                files = [files]

            for file_name in files:
                udim = self._get_udim(file_name,
                                      self.input_naming_patterns["textures"],
                                      self.input_naming_groups["textures"],
                                      self.color_space)
                udims.append(udim)

            repre["udim"] = udims  # must be this way, used for filling path

        return upd_representations
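The parsing helpers above are driven entirely by input_naming_patterns and input_naming_groups. A minimal sketch (separate from the plugin) of how the default texture pattern is expected to break the example file name from the class comment into its named parts:

# Sketch only, mirroring what _parse() does for a single pattern/color space.
import re

pattern = r'^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+'
groups = ('asset', 'shader', 'version', 'channel', 'color_space', 'udim')
name = "corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr"

# the {color_space} placeholder is replaced by each known color space in turn
resolved = pattern.replace('{color_space}', 'linsRGB')
match = re.findall(resolved, name)[0]
parsed = dict(zip(groups, match))
# {'asset': 'corridorMain', 'shader': 'aluminiumID', 'version': '001',
#  'channel': 'baseColor', 'color_space': 'linsRGB', 'udim': '1001'}
print(parsed)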
@@ -0,0 +1,42 @@
import os
import pyblish.api


class ExtractResources(pyblish.api.InstancePlugin):
    """
    Extracts files from instance.data["resources"].

    These files are additional (textures etc.), currently not stored in
    representations!

    Expects collected 'resourcesDir'. (list of dicts with 'files' key and
    list of source urls)

    Provides filled 'transfers' (list of tuples (source_url, target_url))
    """

    label = "Extract Resources SP"
    hosts = ["standalonepublisher"]
    order = pyblish.api.ExtractorOrder

    families = ["workfile"]

    def process(self, instance):
        if not instance.data.get("resources"):
            self.log.info("No resources")
            return

        if not instance.data.get("transfers"):
            instance.data["transfers"] = []

        publish_dir = instance.data["resourcesDir"]

        transfers = []
        for resource in instance.data["resources"]:
            for file_url in resource.get("files", []):
                file_name = os.path.basename(file_url)
                dest_url = os.path.join(publish_dir, file_name)
                transfers.append((file_url, dest_url))

        self.log.info("transfers:: {}".format(transfers))
        instance.data["transfers"].extend(transfers)
@@ -60,7 +60,7 @@ class ExtractTrimVideoAudio(openpype.api.Extractor):
         ]

         args = [
-            ffmpeg_path,
+            f"\"{ffmpeg_path}\"",
             "-ss", str(start / fps),
             "-i", f"\"{video_file_path}\"",
             "-t", str(dur / fps)
@@ -0,0 +1,43 @@
import os
import pyblish.api


class ExtractWorkfileUrl(pyblish.api.ContextPlugin):
    """
    Modifies 'workfile' field to contain link to published workfile.

    Expects that batch contains only single workfile and matching
    (multiple) textures.
    """

    label = "Extract Workfile Url SP"
    hosts = ["standalonepublisher"]
    order = pyblish.api.ExtractorOrder

    families = ["textures"]

    def process(self, context):
        filepath = None

        # first loop for workfile
        for instance in context:
            if instance.data["family"] == 'workfile':
                anatomy = context.data['anatomy']
                template_data = instance.data.get("anatomyData")
                rep_name = instance.data.get("representations")[0].get("name")
                template_data["representation"] = rep_name
                template_data["ext"] = rep_name
                anatomy_filled = anatomy.format(template_data)
                template_filled = anatomy_filled["publish"]["path"]
                filepath = os.path.normpath(template_filled)
                self.log.info("Using published scene for render {}".format(
                    filepath))

        if not filepath:
            self.log.info("Texture batch doesn't contain workfile.")
            return

        # then apply to all textures
        for instance in context:
            if instance.data["family"] == 'textures':
                instance.data["versionData"]["workfile"] = filepath
@@ -0,0 +1,22 @@
import pyblish.api
import openpype.api


class ValidateTextureBatch(pyblish.api.InstancePlugin):
    """Validates that some texture files are present."""

    label = "Validate Texture Presence"
    hosts = ["standalonepublisher"]
    order = openpype.api.ValidateContentsOrder
    families = ["workfile"]
    optional = False

    def process(self, instance):
        present = False
        for instance in instance.context:
            if instance.data["family"] == "textures":
                self.log.info("Some textures present.")

                return

        assert present, "No textures found in published batch!"
@@ -0,0 +1,20 @@
import pyblish.api
import openpype.api


class ValidateTextureHasWorkfile(pyblish.api.InstancePlugin):
    """Validates that textures have appropriate workfile attached.

    Workfile is optional, disable this Validator after Refresh if you are
    sure it is not needed.
    """
    label = "Validate Texture Has Workfile"
    hosts = ["standalonepublisher"]
    order = openpype.api.ValidateContentsOrder
    families = ["textures"]
    optional = True

    def process(self, instance):
        wfile = instance.data["versionData"].get("workfile")

        assert wfile, "Textures are missing attached workfile"
@@ -0,0 +1,50 @@
import pyblish.api
import openpype.api


class ValidateTextureBatchNaming(pyblish.api.InstancePlugin):
    """Validates that all instances had properly formatted name."""

    label = "Validate Texture Batch Naming"
    hosts = ["standalonepublisher"]
    order = openpype.api.ValidateContentsOrder
    families = ["workfile", "textures"]
    optional = False

    def process(self, instance):
        file_name = instance.data["representations"][0]["files"]
        if isinstance(file_name, list):
            file_name = file_name[0]

        msg = "Couldnt find asset name in '{}'\n".format(file_name) + \
              "File name doesn't follow configured pattern.\n" + \
              "Please rename the file."
        assert "NOT_AVAIL" not in instance.data["asset_build"], msg

        instance.data.pop("asset_build")

        if instance.data["family"] == "textures":
            file_name = instance.data["representations"][0]["files"][0]
            self._check_proper_collected(instance.data["versionData"],
                                         file_name)

    def _check_proper_collected(self, versionData, file_name):
        """
            Loop through collected versionData to check if name parsing was OK.
        Args:
            versionData: (dict)

        Returns:
            raises AssertionException
        """
        missing_key_values = []
        for key, value in versionData.items():
            if not value:
                missing_key_values.append(key)

        msg = "Collected data {} doesn't contain values for {}".format(
            versionData, missing_key_values) + "\n" + \
            "Name of the texture file doesn't match expected pattern.\n" + \
            "Please rename file(s) {}".format(file_name)

        assert not missing_key_values, msg
@@ -0,0 +1,38 @@
+import pyblish.api
+import openpype.api
+
+
+class ValidateTextureBatchVersions(pyblish.api.InstancePlugin):
+    """Validates that versions match in workfile and textures.
+
+    Workfile is optional, so if you are sure, you can disable this
+    validator after Refresh.
+
+    Validates that only single version is published at a time.
+    """
+    label = "Validate Texture Batch Versions"
+    hosts = ["standalonepublisher"]
+    order = openpype.api.ValidateContentsOrder
+    families = ["textures"]
+    optional = False
+
+    def process(self, instance):
+        wfile = instance.data["versionData"].get("workfile")
+
+        version_str = "v{:03d}".format(instance.data["version"])
+
+        if not wfile:  # no matching workfile, do not check versions
+            self.log.info("No workfile present for textures")
+            return
+
+        msg = "Not matching version: texture v{:03d} - workfile {}"
+        assert version_str in wfile, \
+            msg.format(
+                instance.data["version"], wfile
+            )
+
+        present_versions = set()
+        for instance in instance.context:
+            present_versions.add(instance.data["version"])
+
+        assert len(present_versions) == 1, "Too many versions in a batch!"
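A minimal sketch (not part of the commit) of how the version check in `ValidateTextureBatchVersions` behaves; the texture version and workfile path below are assumed values:

```python
# Assumed values for illustration only.
texture_version = 5
workfile = "/publish/textures/main/v005/demo_main_texturesMainWorkfile_v005.mra"

version_str = "v{:03d}".format(texture_version)  # -> "v005"
# The validator simply requires the zero-padded token to appear in the
# attached workfile path.
assert version_str in workfile
```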
@@ -0,0 +1,29 @@
+import pyblish.api
+import openpype.api
+
+
+class ValidateTextureBatchWorkfiles(pyblish.api.InstancePlugin):
+    """Validates that textures workfile has collected resources (optional).
+
+    Collected recourses means secondary workfiles (in most cases).
+    """
+
+    label = "Validate Texture Workfile Has Resources"
+    hosts = ["standalonepublisher"]
+    order = openpype.api.ValidateContentsOrder
+    families = ["workfile"]
+    optional = True
+
+    # from presets
+    main_workfile_extensions = ['mra']
+
+    def process(self, instance):
+        if instance.data["family"] == "workfile":
+            ext = instance.data["representations"][0]["ext"]
+            if ext not in self.main_workfile_extensions:
+                self.log.warning("Only secondary workfile present!")
+                return
+
+            msg = "No secondary workfiles present for workfile {}".\
+                format(instance.data["name"])
+            assert instance.data.get("resources"), msg
@@ -155,6 +155,7 @@ class CollectWorkfileData(pyblish.api.ContextPlugin):
             "sceneMarkInState": mark_in_state == "set",
             "sceneMarkOut": int(mark_out_frame),
             "sceneMarkOutState": mark_out_state == "set",
+            "sceneStartFrame": int(lib.execute_george("tv_startframe")),
             "sceneBgColor": self._get_bg_color()
         }
         self.log.debug(
@@ -49,6 +49,14 @@ class ExtractSequence(pyblish.api.Extractor):
         family_lowered = instance.data["family"].lower()
         mark_in = instance.context.data["sceneMarkIn"]
         mark_out = instance.context.data["sceneMarkOut"]
+
+        # Scene start frame offsets the output files, so we need to offset the
+        # marks.
+        scene_start_frame = instance.context.data["sceneStartFrame"]
+        difference = scene_start_frame - mark_in
+        mark_in += difference
+        mark_out += difference
+
         # Frame start/end may be stored as float
         frame_start = int(instance.data["frameStart"])
         frame_end = int(instance.data["frameEnd"])
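A small numeric sketch of the mark offsetting added above; the values are assumptions, the arithmetic mirrors the hunk:

```python
# Assumed example values.
scene_start_frame = 1
mark_in = 5
mark_out = 20

# Offset marks by the difference between scene start frame and mark in.
difference = scene_start_frame - mark_in  # -4
mark_in += difference   # 1
mark_out += difference  # 16
```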
@@ -98,7 +106,7 @@ class ExtractSequence(pyblish.api.Extractor):
             self.log.warning((
                 "Lowering representation range to {} frames."
                 " Changed frame end {} -> {}"
-            ).format(output_range + 1, mark_out, new_mark_out))
+            ).format(output_range + 1, mark_out, new_output_frame_end))
             output_frame_end = new_output_frame_end
 
             # -------------------------------------------------------------------
@@ -0,0 +1,27 @@
+import pyblish.api
+from avalon.tvpaint import lib
+
+
+class RepairStartFrame(pyblish.api.Action):
+    """Repair start frame."""
+
+    label = "Repair"
+    icon = "wrench"
+    on = "failed"
+
+    def process(self, context, plugin):
+        lib.execute_george("tv_startframe 0")
+
+
+class ValidateStartFrame(pyblish.api.ContextPlugin):
+    """Validate start frame being at frame 0."""
+
+    label = "Validate Start Frame"
+    order = pyblish.api.ValidatorOrder
+    hosts = ["tvpaint"]
+    actions = [RepairStartFrame]
+    optional = True
+
+    def process(self, context):
+        start_frame = lib.execute_george("tv_startframe")
+        assert int(start_frame) == 0, "Start frame has to be frame 0."
@@ -1302,10 +1302,18 @@ def _prepare_last_workfile(data, workdir):
     )
     data["start_last_workfile"] = start_last_workfile
 
+    workfile_startup = should_workfile_tool_start(
+        project_name, app.host_name, task_name
+    )
+    data["workfile_startup"] = workfile_startup
+
     # Store boolean as "0"(False) or "1"(True)
     data["env"]["AVALON_OPEN_LAST_WORKFILE"] = (
         str(int(bool(start_last_workfile)))
     )
+    data["env"]["OPENPYPE_WORKFILE_TOOL_ON_START"] = (
+        str(int(bool(workfile_startup)))
+    )
 
     _sub_msg = "" if start_last_workfile else " not"
     log.debug(
@@ -1344,40 +1352,9 @@ def _prepare_last_workfile(data, workdir):
     data["last_workfile_path"] = last_workfile_path
 
 
-def should_start_last_workfile(
-    project_name, host_name, task_name, default_output=False
+def get_option_from_settings(
+    startup_presets, host_name, task_name, default_output
 ):
-    """Define if host should start last version workfile if possible.
-
-    Default output is `False`. Can be overriden with environment variable
-    `AVALON_OPEN_LAST_WORKFILE`, valid values without case sensitivity are
-    `"0", "1", "true", "false", "yes", "no"`.
-
-    Args:
-        project_name (str): Name of project.
-        host_name (str): Name of host which is launched. In avalon's
-            application context it's value stored in app definition under
-            key `"application_dir"`. Is not case sensitive.
-        task_name (str): Name of task which is used for launching the host.
-            Task name is not case sensitive.
-
-    Returns:
-        bool: True if host should start workfile.
-
-    """
-
-    project_settings = get_project_settings(project_name)
-    startup_presets = (
-        project_settings
-        ["global"]
-        ["tools"]
-        ["Workfiles"]
-        ["last_workfile_on_startup"]
-    )
-
-    if not startup_presets:
-        return default_output
-
     host_name_lowered = host_name.lower()
     task_name_lowered = task_name.lower()
@@ -1421,6 +1398,82 @@ def should_start_last_workfile(
     return default_output
 
 
+def should_start_last_workfile(
+    project_name, host_name, task_name, default_output=False
+):
+    """Define if host should start last version workfile if possible.
+
+    Default output is `False`. Can be overriden with environment variable
+    `AVALON_OPEN_LAST_WORKFILE`, valid values without case sensitivity are
+    `"0", "1", "true", "false", "yes", "no"`.
+
+    Args:
+        project_name (str): Name of project.
+        host_name (str): Name of host which is launched. In avalon's
+            application context it's value stored in app definition under
+            key `"application_dir"`. Is not case sensitive.
+        task_name (str): Name of task which is used for launching the host.
+            Task name is not case sensitive.
+
+    Returns:
+        bool: True if host should start workfile.
+
+    """
+
+    project_settings = get_project_settings(project_name)
+    startup_presets = (
+        project_settings
+        ["global"]
+        ["tools"]
+        ["Workfiles"]
+        ["last_workfile_on_startup"]
+    )
+
+    if not startup_presets:
+        return default_output
+
+    return get_option_from_settings(
+        startup_presets, host_name, task_name, default_output)
+
+
+def should_workfile_tool_start(
+    project_name, host_name, task_name, default_output=False
+):
+    """Define if host should start workfile tool at host launch.
+
+    Default output is `False`. Can be overriden with environment variable
+    `OPENPYPE_WORKFILE_TOOL_ON_START`, valid values without case sensitivity are
+    `"0", "1", "true", "false", "yes", "no"`.
+
+    Args:
+        project_name (str): Name of project.
+        host_name (str): Name of host which is launched. In avalon's
+            application context it's value stored in app definition under
+            key `"application_dir"`. Is not case sensitive.
+        task_name (str): Name of task which is used for launching the host.
+            Task name is not case sensitive.
+
+    Returns:
+        bool: True if host should start workfile.
+
+    """
+
+    project_settings = get_project_settings(project_name)
+    startup_presets = (
+        project_settings
+        ["global"]
+        ["tools"]
+        ["Workfiles"]
+        ["open_workfile_tool_on_startup"]
+    )
+
+    if not startup_presets:
+        return default_output
+
+    return get_option_from_settings(
+        startup_presets, host_name, task_name, default_output)
+
+
 def compile_list_of_regexes(in_list):
     """Convert strings in entered list to compiled regex objects."""
     regexes = list()
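On the host side the new flag travels through the launch environment, mirroring how `AVALON_OPEN_LAST_WORKFILE` is handled above. A hedged sketch of reading it back, assuming the "0"/"1" string convention used in `_prepare_last_workfile`:

```python
import os

# "OPENPYPE_WORKFILE_TOOL_ON_START" is written as "0" or "1" into the
# launch environment; a host integration can read it back as a boolean.
workfile_tool_on_start = os.environ.get(
    "OPENPYPE_WORKFILE_TOOL_ON_START", "0"
) == "1"
```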
@@ -1,6 +1,8 @@
 import json
 
+from avalon.api import AvalonMongoDB
 from openpype.api import ProjectSettings
+from openpype.lib import create_project
 
 from openpype.modules.ftrack.lib import (
     ServerAction,
@@ -21,8 +23,24 @@ class PrepareProjectServer(ServerAction):
 
     role_list = ["Pypeclub", "Administrator", "Project Manager"]
 
-    # Key to store info about trigerring create folder structure
     settings_key = "prepare_project"
 
     item_splitter = {"type": "label", "value": "---"}
+    _keys_order = (
+        "fps",
+        "frameStart",
+        "frameEnd",
+        "handleStart",
+        "handleEnd",
+        "clipIn",
+        "clipOut",
+        "resolutionHeight",
+        "resolutionWidth",
+        "pixelAspect",
+        "applications",
+        "tools_env",
+        "library_project",
+    )
 
     def discover(self, session, entities, event):
         """Show only on project."""
@@ -47,13 +65,7 @@ class PrepareProjectServer(ServerAction):
         project_entity = entities[0]
         project_name = project_entity["full_name"]
 
-        try:
-            project_settings = ProjectSettings(project_name)
-        except ValueError:
-            return {
-                "message": "Project is not synchronized yet",
-                "success": False
-            }
+        project_settings = ProjectSettings(project_name)
 
         project_anatom_settings = project_settings["project_anatomy"]
         root_items = self.prepare_root_items(project_anatom_settings)
@@ -78,14 +90,13 @@ class PrepareProjectServer(ServerAction):
 
         items.extend(ca_items)
 
-        # This item will be last (before enumerators)
-        # - sets value of auto synchronization
-        auto_sync_name = "avalon_auto_sync"
+        # This item will be last before enumerators
+        # Set value of auto synchronization
         auto_sync_value = project_entity["custom_attributes"].get(
             CUST_ATTR_AUTO_SYNC, False
         )
         auto_sync_item = {
-            "name": auto_sync_name,
+            "name": CUST_ATTR_AUTO_SYNC,
             "type": "boolean",
             "value": auto_sync_value,
             "label": "AutoSync to Avalon"
@@ -199,7 +210,18 @@ class PrepareProjectServer(ServerAction):
             str([key for key in attributes_to_set])
         ))
 
-        for key, in_data in attributes_to_set.items():
+        attribute_keys = set(attributes_to_set.keys())
+        keys_order = []
+        for key in self._keys_order:
+            if key in attribute_keys:
+                keys_order.append(key)
+
+        attribute_keys = attribute_keys - set(keys_order)
+        for key in sorted(attribute_keys):
+            keys_order.append(key)
+
+        for key in keys_order:
+            in_data = attributes_to_set[key]
             attr = in_data["object"]
 
             # initial item definition
@@ -225,7 +247,7 @@ class PrepareProjectServer(ServerAction):
                 multiselect_enumerators.append(self.item_splitter)
                 multiselect_enumerators.append({
                     "type": "label",
-                    "value": in_data["label"]
+                    "value": "<h3>{}</h3>".format(in_data["label"])
                 })
 
                 default = in_data["default"]
@@ -286,10 +308,10 @@ class PrepareProjectServer(ServerAction):
         return items, multiselect_enumerators
 
     def launch(self, session, entities, event):
-        if not event['data'].get('values', {}):
+        in_data = event["data"].get("values")
+        if not in_data:
             return
 
-        in_data = event['data']['values']
-
         root_values = {}
         root_key = "__root__"
@@ -337,7 +359,27 @@ class PrepareProjectServer(ServerAction):
 
         self.log.debug("Setting Custom Attribute values")
 
-        project_name = entities[0]["full_name"]
+        project_entity = entities[0]
+        project_name = project_entity["full_name"]
+
+        # Try to find project document
+        dbcon = AvalonMongoDB()
+        dbcon.install()
+        dbcon.Session["AVALON_PROJECT"] = project_name
+        project_doc = dbcon.find_one({
+            "type": "project"
+        })
+        # Create project if is not available
+        # - creation is required to be able set project anatomy and attributes
+        if not project_doc:
+            project_code = project_entity["name"]
+            self.log.info("Creating project \"{} [{}]\"".format(
+                project_name, project_code
+            ))
+            create_project(project_name, project_code, dbcon=dbcon)
+
+        dbcon.uninstall()
+
         project_settings = ProjectSettings(project_name)
         project_anatomy_settings = project_settings["project_anatomy"]
         project_anatomy_settings["roots"] = root_data
@@ -352,10 +394,12 @@ class PrepareProjectServer(ServerAction):
 
         project_settings.save()
 
-        entity = entities[0]
-        for key, value in custom_attribute_values.items():
-            entity["custom_attributes"][key] = value
-            self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
+        # Change custom attributes on project
+        if custom_attribute_values:
+            for key, value in custom_attribute_values.items():
+                project_entity["custom_attributes"][key] = value
+                self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
+            session.commit()
 
         return True
@@ -11,29 +11,44 @@ from avalon.api import AvalonMongoDB
 
 
 class AppplicationsAction(BaseAction):
-    """Application Action class.
-
-    Args:
-        session (ftrack_api.Session): Session where action will be registered.
-        label (str): A descriptive string identifing your action.
-        varaint (str, optional): To group actions together, give them the same
-            label and specify a unique variant per action.
-        identifier (str): An unique identifier for app.
-        description (str): A verbose descriptive text for you action.
-        icon (str): Url path to icon which will be shown in Ftrack web.
-    """
+    """Applications Action class."""
 
     type = "Application"
     label = "Application action"
-    identifier = "pype_app.{}.".format(str(uuid4()))
+    identifier = "openpype_app"
+    _launch_identifier_with_id = None
+
     icon_url = os.environ.get("OPENPYPE_STATICS_SERVER")
 
     def __init__(self, *args, **kwargs):
-        super().__init__(*args, **kwargs)
+        super(AppplicationsAction, self).__init__(*args, **kwargs)
 
         self.application_manager = ApplicationManager()
         self.dbcon = AvalonMongoDB()
 
+    @property
+    def discover_identifier(self):
+        if self._discover_identifier is None:
+            self._discover_identifier = "{}.{}".format(
+                self.identifier, self.process_identifier()
+            )
+        return self._discover_identifier
+
+    @property
+    def launch_identifier(self):
+        if self._launch_identifier is None:
+            self._launch_identifier = "{}.*".format(self.identifier)
+        return self._launch_identifier
+
+    @property
+    def launch_identifier_with_id(self):
+        if self._launch_identifier_with_id is None:
+            self._launch_identifier_with_id = "{}.{}".format(
+                self.identifier, self.process_identifier()
+            )
+        return self._launch_identifier_with_id
+
     def construct_requirements_validations(self):
         # Override validation as this action does not need them
         return
@@ -56,7 +71,7 @@ class AppplicationsAction(BaseAction):
             " and data.actionIdentifier={0}"
             " and source.user.username={1}"
         ).format(
-            self.identifier + "*",
+            self.launch_identifier,
            self.session.api_user
         )
         self.session.event_hub.subscribe(
@@ -136,12 +151,29 @@ class AppplicationsAction(BaseAction):
                 "label": app.group.label,
                 "variant": app.label,
                 "description": None,
-                "actionIdentifier": self.identifier + app_name,
+                "actionIdentifier": "{}.{}".format(
+                    self.launch_identifier_with_id, app_name
+                ),
                 "icon": app_icon
             })
 
         return items
 
+    def _launch(self, event):
+        event_identifier = event["data"]["actionIdentifier"]
+        # Check if identifier is same
+        # - show message that acion may not be triggered on this machine
+        if event_identifier.startswith(self.launch_identifier_with_id):
+            return BaseAction._launch(self, event)
+
+        return {
+            "success": False,
+            "message": (
+                "There are running more OpenPype processes"
+                " where Application can be launched."
+            )
+        }
+
     def launch(self, session, entities, event):
         """Callback method for the custom action.
@@ -162,7 +194,8 @@ class AppplicationsAction(BaseAction):
         *event* the unmodified original event
         """
         identifier = event["data"]["actionIdentifier"]
-        app_name = identifier[len(self.identifier):]
+        id_identifier_len = len(self.launch_identifier_with_id) + 1
+        app_name = identifier[id_identifier_len:]
 
         entity = entities[0]
@@ -9,16 +9,24 @@ class MultipleNotes(BaseAction):
     #: Action label.
     label = 'Multiple Notes'
     #: Action description.
-    description = 'Add same note to multiple Asset Versions'
+    description = 'Add same note to multiple entities'
     icon = statics_icon("ftrack", "action_icons", "MultipleNotes.svg")
 
     def discover(self, session, entities, event):
         ''' Validation '''
         valid = True
+
+        # Check for multiple selection.
+        if len(entities) < 2:
+            valid = False
+
+        # Check for valid entities.
+        valid_entity_types = ['assetversion', 'task']
         for entity in entities:
-            if entity.entity_type.lower() != 'assetversion':
+            if entity.entity_type.lower() not in valid_entity_types:
                 valid = False
                 break
 
         return valid
 
     def interface(self, session, entities, event):
@@ -58,7 +66,7 @@ class MultipleNotes(BaseAction):
 
         splitter = {
             'type': 'label',
-            'value': '{}'.format(200*"-")
+            'value': '{}'.format(200 * "-")
         }
 
         items = []
@@ -25,6 +25,8 @@ class PrepareProjectLocal(BaseAction):
     settings_key = "prepare_project"
 
     # Key to store info about trigerring create folder structure
+    create_project_structure_key = "create_folder_structure"
+    create_project_structure_identifier = "create.project.structure"
     item_splitter = {"type": "label", "value": "---"}
     _keys_order = (
         "fps",
@@ -90,14 +92,12 @@ class PrepareProjectLocal(BaseAction):
 
         items.extend(ca_items)
 
-        # This item will be last (before enumerators)
-        # - sets value of auto synchronization
-        auto_sync_name = "avalon_auto_sync"
+        # Set value of auto synchronization
         auto_sync_value = project_entity["custom_attributes"].get(
             CUST_ATTR_AUTO_SYNC, False
         )
         auto_sync_item = {
-            "name": auto_sync_name,
+            "name": CUST_ATTR_AUTO_SYNC,
             "type": "boolean",
             "value": auto_sync_value,
             "label": "AutoSync to Avalon"
@@ -105,6 +105,27 @@ class PrepareProjectLocal(BaseAction):
         # Add autosync attribute
         items.append(auto_sync_item)
 
+        # This item will be last before enumerators
+        # Ask if want to trigger Action Create Folder Structure
+        create_project_structure_checked = (
+            project_settings
+            ["project_settings"]
+            ["ftrack"]
+            ["user_handlers"]
+            ["prepare_project"]
+            ["create_project_structure_checked"]
+        ).value
+        items.append({
+            "type": "label",
+            "value": "<h3>Want to create basic Folder Structure?</h3>"
+        })
+        items.append({
+            "name": self.create_project_structure_key,
+            "type": "boolean",
+            "value": create_project_structure_checked,
+            "label": "Check if Yes"
+        })
+
         # Add enumerator items at the end
         for item in multiselect_enumerators:
             items.append(item)
@@ -248,7 +269,7 @@ class PrepareProjectLocal(BaseAction):
                 multiselect_enumerators.append(self.item_splitter)
                 multiselect_enumerators.append({
                     "type": "label",
-                    "value": in_data["label"]
+                    "value": "<h3>{}</h3>".format(in_data["label"])
                 })
 
                 default = in_data["default"]
@@ -309,10 +330,13 @@ class PrepareProjectLocal(BaseAction):
         return items, multiselect_enumerators
 
     def launch(self, session, entities, event):
-        if not event['data'].get('values', {}):
+        in_data = event["data"].get("values")
+        if not in_data:
             return
 
-        in_data = event['data']['values']
+        create_project_structure_checked = in_data.pop(
+            self.create_project_structure_key
+        )
 
         root_values = {}
         root_key = "__root__"
@@ -395,11 +419,20 @@ class PrepareProjectLocal(BaseAction):
 
         project_settings.save()
 
-        entity = entities[0]
-        for key, value in custom_attribute_values.items():
-            entity["custom_attributes"][key] = value
-            self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
+        # Change custom attributes on project
+        if custom_attribute_values:
+            for key, value in custom_attribute_values.items():
+                project_entity["custom_attributes"][key] = value
+                self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
+            session.commit()
+
+        # Trigger create project structure action
+        if create_project_structure_checked:
+            trigger_identifier = "{}.{}".format(
+                self.create_project_structure_identifier,
+                self.process_identifier()
+            )
+            self.trigger_action(trigger_identifier, event)
         return True
@@ -24,6 +24,10 @@ class ActionShowWhereIRun(BaseAction):
 
         return False
 
+    @property
+    def launch_identifier(self):
+        return self.identifier
+
     def launch(self, session, entities, event):
         # Don't show info when was launch from this session
         if session.event_hub.id == event.get("data", {}).get("event_hub_id"):
@@ -29,6 +29,9 @@ class BaseAction(BaseHandler):
     icon = None
     type = 'Action'
 
+    _discover_identifier = None
+    _launch_identifier = None
+
     settings_frack_subkey = "user_handlers"
     settings_enabled_key = "enabled"
@@ -42,6 +45,22 @@ class BaseAction(BaseHandler):
 
         super().__init__(session)
 
+    @property
+    def discover_identifier(self):
+        if self._discover_identifier is None:
+            self._discover_identifier = "{}.{}".format(
+                self.identifier, self.process_identifier()
+            )
+        return self._discover_identifier
+
+    @property
+    def launch_identifier(self):
+        if self._launch_identifier is None:
+            self._launch_identifier = "{}.{}".format(
+                self.identifier, self.process_identifier()
+            )
+        return self._launch_identifier
+
     def register(self):
         '''
         Registers the action, subscribing the the discover and launch topics.
@@ -60,7 +79,7 @@ class BaseAction(BaseHandler):
             ' and data.actionIdentifier={0}'
             ' and source.user.username={1}'
         ).format(
-            self.identifier,
+            self.launch_identifier,
             self.session.api_user
         )
         self.session.event_hub.subscribe(
@@ -86,7 +105,7 @@ class BaseAction(BaseHandler):
                 'label': self.label,
                 'variant': self.variant,
                 'description': self.description,
-                'actionIdentifier': self.identifier,
+                'actionIdentifier': self.discover_identifier,
                 'icon': self.icon,
             }]
         }
@@ -309,6 +328,78 @@ class BaseAction(BaseHandler):
         return True
 
 
+class LocalAction(BaseAction):
+    """Action that warn user when more Processes with same action are running.
+
+    Action is launched all the time but if id does not match id of current
+    instanace then message is shown to user.
+
+    Handy for actions where matters if is executed on specific machine.
+    """
+    _full_launch_identifier = None
+
+    @property
+    def discover_identifier(self):
+        if self._discover_identifier is None:
+            self._discover_identifier = "{}.{}".format(
+                self.identifier, self.process_identifier()
+            )
+        return self._discover_identifier
+
+    @property
+    def launch_identifier(self):
+        """Catch all topics with same identifier."""
+        if self._launch_identifier is None:
+            self._launch_identifier = "{}.*".format(self.identifier)
+        return self._launch_identifier
+
+    @property
+    def full_launch_identifier(self):
+        """Catch all topics with same identifier."""
+        if self._full_launch_identifier is None:
+            self._full_launch_identifier = "{}.{}".format(
+                self.identifier, self.process_identifier()
+            )
+        return self._full_launch_identifier
+
+    def _discover(self, event):
+        entities = self._translate_event(event)
+        if not entities:
+            return
+
+        accepts = self.discover(self.session, entities, event)
+        if not accepts:
+            return
+
+        self.log.debug("Discovering action with selection: {0}".format(
+            event["data"].get("selection", [])
+        ))
+
+        return {
+            "items": [{
+                "label": self.label,
+                "variant": self.variant,
+                "description": self.description,
+                "actionIdentifier": self.discover_identifier,
+                "icon": self.icon,
+            }]
+        }
+
+    def _launch(self, event):
+        event_identifier = event["data"]["actionIdentifier"]
+        # Check if identifier is same
+        # - show message that acion may not be triggered on this machine
+        if event_identifier != self.full_launch_identifier:
+            return {
+                "success": False,
+                "message": (
+                    "There are running more OpenPype processes"
+                    " where this action could be launched."
+                )
+            }
+        return super(LocalAction, self)._launch(event)
+
+
 class ServerAction(BaseAction):
     """Action class meant to be used on event server.
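A short sketch of how the identifiers introduced above are composed; the UUID is generated once per process by `BaseHandler.process_identifier()`, and `LocalAction.launch_identifier` uses a wildcard so every running process receives the launch event and decides locally whether it may run:

```python
import uuid

identifier = "openpype_app"       # class-level identifier (example from this diff)
process_id = str(uuid.uuid4())    # what process_identifier() returns

discover_identifier = "{}.{}".format(identifier, process_id)
launch_identifier = "{}.*".format(identifier)  # wildcard event subscription
full_launch_identifier = "{}.{}".format(identifier, process_id)
```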
@@ -318,6 +409,14 @@ class ServerAction(BaseAction):
 
     settings_frack_subkey = "events"
 
+    @property
+    def discover_identifier(self):
+        return self.identifier
+
+    @property
+    def launch_identifier(self):
+        return self.identifier
+
     def register(self):
         """Register subcription to Ftrack event hub."""
         self.session.event_hub.subscribe(
@@ -328,5 +427,5 @@ class ServerAction(BaseAction):
 
         launch_subscription = (
             "topic=ftrack.action.launch and data.actionIdentifier={0}"
-        ).format(self.identifier)
+        ).format(self.launch_identifier)
         self.session.event_hub.subscribe(launch_subscription, self._launch)
@@ -2,6 +2,7 @@ import os
 import tempfile
 import json
 import functools
+import uuid
 import datetime
 import traceback
 import time
@@ -36,6 +37,7 @@ class BaseHandler(object):
         <description> - a verbose descriptive text for you action
         <icon> - icon in ftrack
     '''
+    _process_id = None
     # Default priority is 100
     priority = 100
     # Type is just for logging purpose (e.g.: Action, Event, Application,...)
@@ -70,6 +72,13 @@ class BaseHandler(object):
         self.register = self.register_decorator(self.register)
         self.launch = self.launch_log(self.launch)
 
+    @staticmethod
+    def process_identifier():
+        """Helper property to have """
+        if not BaseHandler._process_id:
+            BaseHandler._process_id = str(uuid.uuid4())
+        return BaseHandler._process_id
+
     # Decorator
     def register_decorator(self, func):
         @functools.wraps(func)
@@ -7,12 +7,13 @@ class LogsWindow(QtWidgets.QWidget):
     def __init__(self, parent=None):
         super(LogsWindow, self).__init__(parent)
 
-        self.setStyleSheet(style.load_stylesheet())
+        self.setWindowTitle("Logs viewer")
+
         self.resize(1400, 800)
         log_detail = OutputWidget(parent=self)
         logs_widget = LogsWidget(log_detail, parent=self)
 
-        main_layout = QtWidgets.QHBoxLayout()
+        main_layout = QtWidgets.QHBoxLayout(self)
 
         log_splitter = QtWidgets.QSplitter(self)
         log_splitter.setOrientation(QtCore.Qt.Horizontal)
@@ -24,5 +25,4 @@ class LogsWindow(QtWidgets.QWidget):
         self.logs_widget = logs_widget
         self.log_detail = log_detail
 
-        self.setLayout(main_layout)
-        self.setWindowTitle("Logs")
+        self.setStyleSheet(style.load_stylesheet())
@@ -77,12 +77,10 @@ class CustomCombo(QtWidgets.QWidget):
         toolbutton.setMenu(toolmenu)
         toolbutton.setPopupMode(QtWidgets.QToolButton.MenuButtonPopup)
 
-        layout = QtWidgets.QHBoxLayout()
+        layout = QtWidgets.QHBoxLayout(self)
         layout.setContentsMargins(0, 0, 0, 0)
         layout.addWidget(toolbutton)
 
-        self.setLayout(layout)
-
         toolmenu.selection_changed.connect(self.selection_changed)
 
         self.toolbutton = toolbutton
@@ -141,7 +139,6 @@ class LogsWidget(QtWidgets.QWidget):
         filter_layout.addWidget(refresh_btn)
 
         view = QtWidgets.QTreeView(self)
-        view.setAllColumnsShowFocus(True)
         view.setEditTriggers(QtWidgets.QAbstractItemView.NoEditTriggers)
 
         layout = QtWidgets.QVBoxLayout(self)
@@ -229,9 +226,9 @@ class OutputWidget(QtWidgets.QWidget):
         super(OutputWidget, self).__init__(parent=parent)
         layout = QtWidgets.QVBoxLayout(self)
 
-        show_timecode_checkbox = QtWidgets.QCheckBox("Show timestamp")
+        show_timecode_checkbox = QtWidgets.QCheckBox("Show timestamp", self)
 
-        output_text = QtWidgets.QTextEdit()
+        output_text = QtWidgets.QTextEdit(self)
         output_text.setReadOnly(True)
         # output_text.setLineWrapMode(QtWidgets.QTextEdit.FixedPixelWidth)
@@ -3,6 +3,7 @@ from openpype.api import Logger
 log = Logger().get_logger("Event processor")
 
 
 class TimersManagerModuleRestApi:
     """
     REST API endpoint used for calling from hosts when context change
@@ -22,6 +23,11 @@ class TimersManagerModuleRestApi:
             self.prefix + "/start_timer",
             self.start_timer
         )
+        self.server_manager.add_route(
+            "POST",
+            self.prefix + "/stop_timer",
+            self.stop_timer
+        )
 
     async def start_timer(self, request):
         data = await request.json()
@@ -38,3 +44,7 @@ class TimersManagerModuleRestApi:
         self.module.stop_timers()
         self.module.start_timer(project_name, asset_name, task_name, hierarchy)
         return Response(status=200)
+
+    async def stop_timer(self, request):
+        self.module.stop_timers()
+        return Response(status=200)
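A hedged usage sketch of the new endpoint; the URL host, port and prefix are configuration details not shown in this diff and are assumptions here — only the "/stop_timer" suffix and the POST method come from the route registered above:

```python
import requests

# Assumed base URL; the webserver prefix and port are configured elsewhere.
response = requests.post("http://localhost:8079/timers_manager/stop_timer")
assert response.status_code == 200  # timers were stopped
```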
@@ -303,6 +303,8 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
         key_values = {"families": family, "tasks": task_name}
         profile = filter_profiles(self.template_name_profiles, key_values,
                                   logger=self.log)
+
+        template_name = "publish"
         if profile:
             template_name = profile["template_name"]
@@ -380,7 +382,12 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
 
         test_dest_files = list()
         for i in [1, 2]:
-            template_data["frame"] = src_padding_exp % i
+            template_data["representation"] = repre['ext']
+            if not repre.get("udim"):
+                template_data["frame"] = src_padding_exp % i
+            else:
+                template_data["udim"] = src_padding_exp % i
+
             anatomy_filled = anatomy.format(template_data)
             template_filled = anatomy_filled[template_name]["path"]
             if repre_context is None:
@@ -388,7 +395,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
             test_dest_files.append(
                 os.path.normpath(template_filled)
             )
-        template_data["frame"] = repre_context["frame"]
+        if not repre.get("udim"):
+            template_data["frame"] = repre_context["frame"]
+        else:
+            template_data["udim"] = repre_context["udim"]
 
         self.log.debug(
             "test_dest_files: {}".format(str(test_dest_files)))
@@ -453,7 +463,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
             dst_start_frame = dst_padding
 
             # Store used frame value to template data
-            template_data["frame"] = dst_start_frame
+            if repre.get("frame"):
+                template_data["frame"] = dst_start_frame
+
             dst = "{0}{1}{2}".format(
                 dst_head,
                 dst_start_frame,
@@ -476,6 +488,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                 "Given file name is a full path"
             )
 
+            template_data["representation"] = repre['ext']
+            # Store used frame value to template data
+            if repre.get("udim"):
+                template_data["udim"] = repre["udim"][0]
             src = os.path.join(stagingdir, fname)
             anatomy_filled = anatomy.format(template_data)
             template_filled = anatomy_filled[template_name]["path"]
@@ -488,6 +504,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
         repre['published_path'] = dst
         self.log.debug("__ dst: {}".format(dst))
 
+        if repre.get("udim"):
+            repre_context["udim"] = repre.get("udim")  # store list
+
         repre["publishedFiles"] = published_files
 
         for key in self.db_representation_context_keys:
@@ -1045,6 +1064,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                 )
             )
             shutil.copy(file_url, new_name)
+            os.remove(file_url)
         else:
             self.log.debug(
                 "Renaming file {} to {}".format(
||||||
|
|
@ -11,7 +11,7 @@ class ValidateEditorialAssetName(pyblish.api.ContextPlugin):
|
||||||
"""
|
"""
|
||||||
|
|
||||||
order = pyblish.api.ValidatorOrder
|
order = pyblish.api.ValidatorOrder
|
||||||
label = "Validate Asset Name"
|
label = "Validate Editorial Asset Name"
|
||||||
|
|
||||||
def process(self, context):
|
def process(self, context):
|
||||||
|
|
||||||
|
|
|
||||||
|
|
@ -92,15 +92,16 @@ class RepairSelectInvalidInstances(pyblish.api.Action):
|
||||||
|
|
||||||
context_asset = context.data["assetEntity"]["name"]
|
context_asset = context.data["assetEntity"]["name"]
|
||||||
for instance in instances:
|
for instance in instances:
|
||||||
self.set_attribute(instance, context_asset)
|
if "nuke" in pyblish.api.registered_hosts():
|
||||||
|
import openpype.hosts.nuke.api as nuke_api
|
||||||
|
origin_node = instance[0]
|
||||||
|
nuke_api.lib.recreate_instance(
|
||||||
|
origin_node, avalon_data={"asset": context_asset}
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
self.set_attribute(instance, context_asset)
|
||||||
|
|
||||||
def set_attribute(self, instance, context_asset):
|
def set_attribute(self, instance, context_asset):
|
||||||
if "nuke" in pyblish.api.registered_hosts():
|
|
||||||
import nuke
|
|
||||||
nuke.toNode(
|
|
||||||
instance.data.get("name")
|
|
||||||
)["avalon:asset"].setValue(context_asset)
|
|
||||||
|
|
||||||
if "maya" in pyblish.api.registered_hosts():
|
if "maya" in pyblish.api.registered_hosts():
|
||||||
from maya import cmds
|
from maya import cmds
|
||||||
cmds.setAttr(
|
cmds.setAttr(
|
||||||
|
|
|
||||||
|
|
@ -17,7 +17,7 @@
|
||||||
},
|
},
|
||||||
"publish": {
|
"publish": {
|
||||||
"folder": "{root[work]}/{project[name]}/{hierarchy}/{asset}/publish/{family}/{subset}/{@version}",
|
"folder": "{root[work]}/{project[name]}/{hierarchy}/{asset}/publish/{family}/{subset}/{@version}",
|
||||||
"file": "{project[code]}_{asset}_{subset}_{@version}<_{output}><.{@frame}>.{ext}",
|
"file": "{project[code]}_{asset}_{subset}_{@version}<_{output}><.{@frame}><_{udim}>.{ext}",
|
||||||
"path": "{@folder}/{@file}",
|
"path": "{@folder}/{@file}",
|
||||||
"thumbnail": "{thumbnail_root}/{project[name]}/{_id}_{thumbnail_type}.{ext}"
|
"thumbnail": "{thumbnail_root}/{project[name]}/{_id}_{thumbnail_type}.{ext}"
|
||||||
},
|
},
|
||||||
|
|
|
||||||
|
|
@ -136,7 +136,8 @@
|
||||||
"Pypeclub",
|
"Pypeclub",
|
||||||
"Administrator",
|
"Administrator",
|
||||||
"Project manager"
|
"Project manager"
|
||||||
]
|
],
|
||||||
|
"create_project_structure_checked": false
|
||||||
},
|
},
|
||||||
"clean_hierarchical_attr": {
|
"clean_hierarchical_attr": {
|
||||||
"enabled": true,
|
"enabled": true,
|
||||||
|
|
|
||||||
|
|
@ -1,5 +1,13 @@
|
||||||
{
|
{
|
||||||
"publish": {
|
"publish": {
|
||||||
|
"ValidateEditorialAssetName": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false
|
||||||
|
},
|
||||||
|
"ValidateVersion": {
|
||||||
|
"enabled": true,
|
||||||
|
"optional": false
|
||||||
|
},
|
||||||
"IntegrateHeroVersion": {
|
"IntegrateHeroVersion": {
|
||||||
"enabled": true,
|
"enabled": true,
|
||||||
"optional": true,
|
"optional": true,
|
||||||
|
|
@@ -260,6 +268,13 @@
                 "enabled": true
             }
         ],
+        "open_workfile_tool_on_startup": [
+            {
+                "hosts": [],
+                "tasks": [],
+                "enabled": false
+            }
+        ],
         "sw_folders": {
             "compositing": [
                 "nuke",
||||||
|
|
@ -20,6 +20,22 @@
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"scriptsmenu": {
|
||||||
|
"name": "OpenPype Tools",
|
||||||
|
"definition": [
|
||||||
|
{
|
||||||
|
"type": "action",
|
||||||
|
"command": "import openpype.hosts.maya.api.commands as op_cmds; op_cmds.edit_shader_definitions()",
|
||||||
|
"sourcetype": "python",
|
||||||
|
"title": "Edit shader name definitions",
|
||||||
|
"tooltip": "Edit shader name definitions used in validation and renaming.",
|
||||||
|
"tags": [
|
||||||
|
"pipeline",
|
||||||
|
"shader"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
"create": {
|
"create": {
|
||||||
"CreateLook": {
|
"CreateLook": {
|
||||||
"enabled": true,
|
"enabled": true,
|
||||||
|
|
|
||||||
|
|
@ -123,6 +123,16 @@
|
||||||
],
|
],
|
||||||
"help": "Process multiple Mov files and publish them for layout and comp."
|
"help": "Process multiple Mov files and publish them for layout and comp."
|
||||||
},
|
},
|
||||||
|
"create_texture_batch": {
|
||||||
|
"name": "texture_batch",
|
||||||
|
"label": "Texture Batch",
|
||||||
|
"family": "texture_batch",
|
||||||
|
"icon": "image",
|
||||||
|
"defaults": [
|
||||||
|
"Main"
|
||||||
|
],
|
||||||
|
"help": "Texture files with UDIM together with worfile"
|
||||||
|
},
|
||||||
"__dynamic_keys_labels__": {
|
"__dynamic_keys_labels__": {
|
||||||
"create_workfile": "Workfile",
|
"create_workfile": "Workfile",
|
||||||
"create_model": "Model",
|
"create_model": "Model",
|
||||||
|
|
@@ -134,10 +144,65 @@
             "create_image": "Image",
             "create_matchmove": "Matchmove",
             "create_render": "Render",
-            "create_mov_batch": "Batch Mov"
+            "create_mov_batch": "Batch Mov",
+            "create_texture_batch": "Batch Texture"
         }
     },
     "publish": {
+        "CollectTextures": {
+            "enabled": true,
+            "active": true,
+            "main_workfile_extensions": [
+                "mra"
+            ],
+            "other_workfile_extensions": [
+                "spp",
+                "psd"
+            ],
+            "texture_extensions": [
+                "exr",
+                "dpx",
+                "jpg",
+                "jpeg",
+                "png",
+                "tiff",
+                "tga",
+                "gif",
+                "svg"
+            ],
+            "workfile_families": [],
+            "texture_families": [],
+            "color_space": [
+                "linsRGB",
+                "raw",
+                "acesg"
+            ],
+            "input_naming_patterns": {
+                "workfile": [
+                    "^([^.]+)(_[^_.]*)?_v([0-9]{3,}).+"
+                ],
+                "textures": [
+                    "^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+"
+                ]
+            },
+            "input_naming_groups": {
+                "workfile": [
+                    "asset",
+                    "filler",
+                    "version"
+                ],
+                "textures": [
+                    "asset",
+                    "shader",
+                    "version",
+                    "channel",
+                    "color_space",
+                    "udim"
+                ]
+            },
+            "workfile_subset_template": "textures{Subset}Workfile",
+            "texture_subset_template": "textures{Subset}_{Shader}_{Channel}"
+        },
         "ValidateSceneSettings": {
             "enabled": true,
             "optional": true,
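A sketch of how the `input_naming_patterns` and `input_naming_groups` above can be applied to a texture file name; the example file name is made up, and `{color_space}` is expanded to the configured values:

```python
import re

# Texture pattern from the settings above with {color_space} expanded.
pattern = r"^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_(linsRGB|raw|acesg)_(1[0-9]{3}).+"
groups = ["asset", "shader", "version", "channel", "color_space", "udim"]

file_name = "chair_wood_v001_baseColor_raw_1001.exr"  # assumed example
match = re.match(pattern, file_name)
if match:
    parsed = dict(zip(groups, match.groups()))
    # {'asset': 'chair', 'shader': 'wood', 'version': '001',
    #  'channel': 'baseColor', 'color_space': 'raw', 'udim': '1001'}
```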
@@ -189,7 +254,7 @@
         },
         "shot_add_tasks": {}
     },
-    "shot_add_tasks": {
+    "CollectInstances": {
         "custom_start_frame": 0,
         "timeline_frame_start": 900000,
         "timeline_frame_offset": 0,
||||||
|
|
@@ -18,6 +18,11 @@
     "optional": true,
     "active": true
   },
+  "ValidateStartFrame": {
+    "enabled": false,
+    "optional": true,
+    "active": true
+  },
   "ValidateAssetName": {
     "enabled": true,
     "optional": true,
@@ -1,5 +1,6 @@
 from .dict_immutable_keys_entity import DictImmutableKeysEntity
 from .lib import OverrideState
+from .exceptions import EntitySchemaError


 class AnatomyEntity(DictImmutableKeysEntity):
@@ -23,3 +24,25 @@ class AnatomyEntity(DictImmutableKeysEntity):
             if not child_obj.has_project_override:
                 child_obj.add_to_project_override()
         return super(AnatomyEntity, self).on_child_change(child_obj)
+
+    def schema_validations(self):
+        non_group_children = []
+        for key, child_obj in self.non_gui_children.items():
+            if not child_obj.is_group:
+                non_group_children.append(key)
+
+        if non_group_children:
+            _non_group_children = [
+                "project_anatomy/{}".format(key)
+                for key in non_group_children
+            ]
+            reason = (
+                "Anatomy must have all children as groups."
+                " Set 'is_group' to `true` on > {}"
+            ).format(", ".join([
+                '"{}"'.format(item)
+                for item in _non_group_children
+            ]))
+            raise EntitySchemaError(self, reason)
+
+        return super(AnatomyEntity, self).schema_validations()
@@ -144,6 +144,13 @@ class DictConditionalEntity(ItemEntity):

         self.enum_entity = None
+
+        # GUI attributes
+        self.enum_is_horizontal = self.schema_data.get(
+            "enum_is_horizontal", False
+        )
+        # `enum_on_right` can be used only if
+        self.enum_on_right = self.schema_data.get("enum_on_right", False)
+
         self.highlight_content = self.schema_data.get(
             "highlight_content", False
         )
@@ -185,13 +192,13 @@ class DictConditionalEntity(ItemEntity):
         children_def_keys = []
         for children_def in self.enum_children:
             if not isinstance(children_def, dict):
-                raise EntitySchemaError((
+                raise EntitySchemaError(self, (
                     "Children definition under key 'enum_children' must"
                     " be a dictionary."
                 ))

             if "key" not in children_def:
-                raise EntitySchemaError((
+                raise EntitySchemaError(self, (
                     "Children definition under key 'enum_children' miss"
                     " 'key' definition."
                 ))
@@ -286,7 +293,7 @@ class DictConditionalEntity(ItemEntity):
             "multiselection": False,
             "enum_items": enum_items,
             "key": enum_key,
-            "label": self.enum_label or enum_key
+            "label": self.enum_label
         }

         enum_entity = self.create_schema_object(enum_schema, self)
@@ -1,3 +1,4 @@
+import copy
 from .input_entities import InputEntity
 from .exceptions import EntitySchemaError
 from .lib import (
@@ -118,30 +119,43 @@ class HostsEnumEntity(BaseEnumEntity):
     implementation instead of application name.
     """
     schema_types = ["hosts-enum"]
+    all_host_names = [
+        "aftereffects",
+        "blender",
+        "celaction",
+        "fusion",
+        "harmony",
+        "hiero",
+        "houdini",
+        "maya",
+        "nuke",
+        "photoshop",
+        "resolve",
+        "tvpaint",
+        "unreal",
+        "standalonepublisher"
+    ]

     def _item_initalization(self):
         self.multiselection = self.schema_data.get("multiselection", True)
-        self.use_empty_value = self.schema_data.get(
-            "use_empty_value", not self.multiselection
-        )
+        use_empty_value = False
+        if not self.multiselection:
+            use_empty_value = self.schema_data.get(
+                "use_empty_value", use_empty_value
+            )
+        self.use_empty_value = use_empty_value
+
+        hosts_filter = self.schema_data.get("hosts_filter") or []
+        self.hosts_filter = hosts_filter

         custom_labels = self.schema_data.get("custom_labels") or {}

-        host_names = [
-            "aftereffects",
-            "blender",
-            "celaction",
-            "fusion",
-            "harmony",
-            "hiero",
-            "houdini",
-            "maya",
-            "nuke",
-            "photoshop",
-            "resolve",
-            "tvpaint",
-            "unreal",
-            "standalonepublisher"
-        ]
+        host_names = copy.deepcopy(self.all_host_names)
+        if hosts_filter:
+            for host_name in tuple(host_names):
+                if host_name not in hosts_filter:
+                    host_names.remove(host_name)
+
         if self.use_empty_value:
             host_names.insert(0, "")
             # Add default label for empty value if not available
@@ -173,6 +187,44 @@ class HostsEnumEntity(BaseEnumEntity):
         # GUI attribute
         self.placeholder = self.schema_data.get("placeholder")

+    def schema_validations(self):
+        if self.hosts_filter:
+            enum_len = len(self.enum_items)
+            if (
+                enum_len == 0
+                or (enum_len == 1 and self.use_empty_value)
+            ):
+                joined_filters = ", ".join([
+                    '"{}"'.format(item)
+                    for item in self.hosts_filter
+                ])
+                reason = (
+                    "All host names were removed after applying"
+                    " host filters. {}"
+                ).format(joined_filters)
+                raise EntitySchemaError(self, reason)
+
+            invalid_filters = set()
+            for item in self.hosts_filter:
+                if item not in self.all_host_names:
+                    invalid_filters.add(item)
+
+            if invalid_filters:
+                joined_filters = ", ".join([
+                    '"{}"'.format(item)
+                    for item in self.hosts_filter
+                ])
+                expected_hosts = ", ".join([
+                    '"{}"'.format(item)
+                    for item in self.all_host_names
+                ])
+                self.log.warning((
+                    "Host filters containt invalid host names:"
+                    " \"{}\" Expected values are {}"
+                ).format(joined_filters, expected_hosts))
+
+        super(HostsEnumEntity, self).schema_validations()
+

 class AppsEnumEntity(BaseEnumEntity):
     schema_types = ["apps-enum"]
@@ -204,6 +204,8 @@
 - it is possible to add darker background with `"highlight_content"` (Default: `False`)
   - darker background has limits of maximum applies after 3-4 nested highlighted items there is not difference in the color
 - output is dictionary `{the "key": children values}`
+- for UI porposes was added `enum_is_horizontal` which will make combobox appear next to children inputs instead of on top of them (Default: `False`)
+  - this has extended ability of `enum_on_right` which will move combobox to right side next to children widgets (Default: `False`)
 ```
 # Example
 {
@@ -379,6 +381,9 @@ How output of the schema could look like on save:
 - multiselection can be allowed with setting key `"multiselection"` to `True` (Default: `False`)
 - it is possible to add empty value (represented with empty string) with setting `"use_empty_value"` to `True` (Default: `False`)
 - it is possible to set `"custom_labels"` for host names where key `""` is empty value (Default: `{}`)
+- to filter host names it is required to define `"hosts_filter"` which is list of host names that will be available
+  - do not pass empty string if `use_empty_value` is enabled
+  - ignoring host names would be more dangerous in some cases
 ```
 {
     "key": "host",
@@ -389,7 +394,10 @@ How output of the schema could look like on save:
     "custom_labels": {
         "": "N/A",
         "nuke": "Nuke"
-    }
+    },
+    "hosts_filter": [
+        "nuke"
+    ]
 }
 ```
@@ -577,6 +585,15 @@ How output of the schema could look like on save:
 }
 ```

+## Anatomy
+Anatomy represents data stored on project document.
+
+### anatomy
+- entity works similarly to `dict`
+- anatomy has always all keys overriden with overrides
+  - overrides are not applied as all anatomy data must be available from project document
+- all children must be groups
+
 ## Proxy wrappers
 - should wraps multiple inputs only visually
 - these does not have `"key"` key and do not allow to have `"is_file"` or `"is_group"` modifiers enabled
@@ -441,6 +441,18 @@
                 "key": "role_list",
                 "label": "Roles",
                 "object_type": "text"
+            },
+            {
+                "type": "separator"
+            },
+            {
+                "type": "label",
+                "label": "Check \"Create project structure\" by default"
+            },
+            {
+                "type": "boolean",
+                "key": "create_project_structure_checked",
+                "label": "Checked"
             }
         ]
     },
@@ -47,6 +47,10 @@
         }
     ]
 },
+{
+    "type": "schema",
+    "name": "schema_maya_scriptsmenu"
+},
 {
     "type": "schema",
     "name": "schema_maya_create"
@@ -56,6 +56,119 @@
     "key": "publish",
     "label": "Publish plugins",
     "children": [
+        {
+            "type": "dict",
+            "collapsible": true,
+            "key": "CollectTextures",
+            "label": "Collect Textures",
+            "checkbox_key": "enabled",
+            "children": [
+                {
+                    "type": "boolean",
+                    "key": "enabled",
+                    "label": "Enabled"
+                },
+                {
+                    "type": "boolean",
+                    "key": "active",
+                    "label": "Active"
+                },
+                {
+                    "type": "list",
+                    "key": "main_workfile_extensions",
+                    "object_type": "text",
+                    "label": "Main workfile extensions"
+                },
+                {
+                    "key": "other_workfile_extensions",
+                    "label": "Support workfile extensions",
+                    "type": "list",
+                    "object_type": "text"
+                },
+                {
+                    "type": "list",
+                    "key": "texture_extensions",
+                    "object_type": "text",
+                    "label": "Texture extensions"
+                },
+                {
+                    "type": "list",
+                    "key": "workfile_families",
+                    "object_type": "text",
+                    "label": "Additional families for workfile"
+                },
+                {
+                    "type": "list",
+                    "key": "texture_families",
+                    "object_type": "text",
+                    "label": "Additional families for textures"
+                },
+                {
+                    "type": "list",
+                    "key": "color_space",
+                    "object_type": "text",
+                    "label": "Color spaces"
+                },
+                {
+                    "type": "dict",
+                    "collapsible": false,
+                    "key": "input_naming_patterns",
+                    "label": "Regex patterns for naming conventions",
+                    "children": [
+                        {
+                            "type": "label",
+                            "label": "Add regex groups matching expected name"
+                        },
+                        {
+                            "type": "list",
+                            "object_type": "text",
+                            "key": "workfile",
+                            "label": "Workfile naming pattern"
+                        },
+                        {
+                            "type": "list",
+                            "object_type": "text",
+                            "key": "textures",
+                            "label": "Textures naming pattern"
+                        }
+                    ]
+                },
+                {
+                    "type": "dict",
+                    "collapsible": false,
+                    "key": "input_naming_groups",
+                    "label": "Group order for regex patterns",
+                    "children": [
+                        {
+                            "type": "label",
+                            "label": "Add names of matched groups in correct order. Available values: ('filler', 'asset', 'shader', 'version', 'channel', 'color_space', 'udim')"
+                        },
+                        {
+                            "type": "list",
+                            "object_type": "text",
+                            "key": "workfile",
+                            "label": "Workfile group positions"
+                        },
+                        {
+                            "type": "list",
+                            "object_type": "text",
+                            "key": "textures",
+                            "label": "Textures group positions"
+                        }
+                    ]
+                },
+                {
+                    "type": "text",
+                    "key": "workfile_subset_template",
+                    "label": "Subset name template for workfile"
+                },
+                {
+                    "type": "text",
+                    "key": "texture_subset_template",
+                    "label": "Subset name template for textures"
+                }
+            ]
+        },
         {
             "type": "dict",
             "collapsible": true,
|
|
@@ -214,7 +327,7 @@
 {
     "type": "dict",
     "collapsible": true,
-    "key": "shot_add_tasks",
+    "key": "CollectInstances",
     "label": "Collect Clip Instances",
    "is_group": true,
    "children": [
|
|
@@ -52,6 +52,17 @@
         }
     ]
 },
+{
+    "type": "schema_template",
+    "name": "template_publish_plugin",
+    "template_data": [
+        {
+            "key": "ValidateStartFrame",
+            "label": "Validate Scene Start Frame",
+            "docstring": "Validate first frame of scene is set to '0'."
+        }
+    ]
+},
 {
     "type": "schema_template",
     "name": "template_publish_plugin",
|
|
@@ -3,6 +3,7 @@
     "key": "imageio",
     "label": "Color Management and Output Formats",
     "is_file": true,
+    "is_group": true,
     "children": [
         {
             "key": "hiero",
@@ -14,7 +15,6 @@
             "type": "dict",
             "label": "Workfile",
             "collapsible": false,
-            "is_group": true,
             "children": [
                 {
                     "type": "form",
@@ -89,7 +89,6 @@
             "type": "dict",
             "label": "Colorspace on Inputs by regex detection",
             "collapsible": true,
-            "is_group": true,
             "children": [
                 {
                     "type": "list",
@@ -124,7 +123,6 @@
             "type": "dict",
             "label": "Viewer",
             "collapsible": false,
-            "is_group": true,
             "children": [
                 {
                     "type": "text",
@@ -138,7 +136,6 @@
             "type": "dict",
             "label": "Workfile",
             "collapsible": false,
-            "is_group": true,
             "children": [
                 {
                     "type": "form",
@@ -236,7 +233,6 @@
             "type": "dict",
             "label": "Nodes",
             "collapsible": true,
-            "is_group": true,
             "children": [
                 {
                     "key": "requiredNodes",
@@ -339,7 +335,6 @@
             "type": "dict",
             "label": "Colorspace on Inputs by regex detection",
             "collapsible": true,
-            "is_group": true,
             "children": [
                 {
                     "type": "list",
|
|
@@ -4,6 +4,46 @@
     "key": "publish",
     "label": "Publish plugins",
     "children": [
+        {
+            "type": "dict",
+            "collapsible": true,
+            "checkbox_key": "enabled",
+            "key": "ValidateEditorialAssetName",
+            "label": "Validate Editorial Asset Name",
+            "is_group": true,
+            "children": [
+                {
+                    "type": "boolean",
+                    "key": "enabled",
+                    "label": "Enabled"
+                },
+                {
+                    "type": "boolean",
+                    "key": "optional",
+                    "label": "Optional"
+                }
+            ]
+        },
+        {
+            "type": "dict",
+            "collapsible": true,
+            "checkbox_key": "enabled",
+            "key": "ValidateVersion",
+            "label": "Validate Version",
+            "is_group": true,
+            "children": [
+                {
+                    "type": "boolean",
+                    "key": "enabled",
+                    "label": "Enabled"
+                },
+                {
+                    "type": "boolean",
+                    "key": "optional",
+                    "label": "Optional"
+                }
+            ]
+        },
         {
             "type": "dict",
             "collapsible": true,
|
|
@@ -78,7 +78,57 @@
     "type": "hosts-enum",
     "key": "hosts",
     "label": "Hosts",
-    "multiselection": true
+    "multiselection": true,
+    "hosts_filter": [
+        "aftereffects",
+        "blender",
+        "celaction",
+        "fusion",
+        "harmony",
+        "hiero",
+        "houdini",
+        "maya",
+        "nuke",
+        "photoshop",
+        "resolve",
+        "tvpaint",
+        "unreal"
+    ]
+},
+{
+    "key": "tasks",
+    "label": "Tasks",
+    "type": "list",
+    "object_type": "text"
+},
+{
+    "type": "splitter"
+},
+{
+    "type": "boolean",
+    "key": "enabled",
+    "label": "Enabled"
+}
+]
+}
+},
+{
+"type": "list",
+"key": "open_workfile_tool_on_startup",
+"label": "Open workfile tool on launch",
+"is_group": true,
+"use_label_wrap": true,
+"object_type": {
+    "type": "dict",
+    "children": [
+        {
+            "type": "hosts-enum",
+            "key": "hosts",
+            "label": "Hosts",
+            "multiselection": true,
+            "hosts_filter": [
+                "nuke"
+            ]
 },
 {
     "key": "tasks",
|
|
@@ -147,9 +147,14 @@
     "key": "enabled",
     "label": "Enabled"
 },
+{
+    "type": "boolean",
+    "key": "database",
+    "label": "Use database shader name definitions"
+},
 {
     "type": "label",
-    "label": "Path to material file defining list of material names to check. This is material name per line simple text file.<br/>It will be checked against named group <b>shader</b> in your <em>Validation regex</em>.<p>For example: <br/> <code>^.*(?P=<shader>.+)_GEO</code></p>"
+    "label": "Path to material file defining list of material names to check. This is material name per line simple text file.<br/>It will be checked against named group <b>shader</b> in your <em>Validation regex</em>.<p>For example: <br/> <code>^.*(?P=<shader>.+)_GEO</code></p>This is used instead of database definitions if they are disabled."
 },
 {
     "type": "path",
@@ -162,6 +167,15 @@
     "type": "text",
     "key": "regex",
     "label": "Validation regex"
+},
+{
+    "type": "label",
+    "label": "Regex for validating name of top level group name.<br/>You can use named capturing groups:<br/><code>(?P<asset>.*)</code> for Asset name<br/><code>(?P<subset>.*)</code> for Subset<br/><code>(?P<project>.*)</code> for project<br/><p>For example to check for asset in name so <code>*_some_asset_name_GRP</code> is valid, use:<br/><code>.*?_(?P<asset>.*)_GEO</code>"
+},
+{
+    "type": "text",
+    "key": "top_level_regex",
+    "label": "Top level group name regex"
 }
 ]
 },
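The labels above describe named capturing groups such as `(?P<asset>...)`; how the Maya validator actually applies the configured regex is not part of this diff, so the following is only an illustrative check with placeholder values.

```python
import re

# Placeholder node name and regex; they only mirror the named-group style
# described in the settings labels above.
top_level_regex = r".*?_(?P<asset>.*)_GEO"
match = re.match(top_level_regex, "chr_heroCharacter_GEO")
if match:
    print(match.group("asset"))  # -> "heroCharacter"
```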
|
|
@@ -0,0 +1,22 @@
+{
+    "type": "dict",
+    "collapsible": true,
+    "key": "scriptsmenu",
+    "label": "Scripts Menu Definition",
+    "children": [
+        {
+            "type": "text",
+            "key": "name",
+            "label": "Menu Name"
+        },
+        {
+            "type": "splitter"
+        },
+        {
+            "type": "raw-json",
+            "key": "definition",
+            "label": "Menu definition",
+            "is_list": true
+        }
+    ]
+}
|
|
@@ -9,6 +9,31 @@
     "label": "Color input",
     "type": "color"
 },
+{
+    "type": "dict-conditional",
+    "key": "overriden_value",
+    "label": "Overriden value",
+    "enum_key": "overriden",
+    "enum_is_horizontal": true,
+    "enum_children": [
+        {
+            "key": "overriden",
+            "label": "Override value",
+            "children": [
+                {
+                    "type": "number",
+                    "key": "value",
+                    "label": "value"
+                }
+            ]
+        },
+        {
+            "key": "inherit",
+            "label": "Inherit value",
+            "children": []
+        }
+    ]
+},
 {
     "type": "dict-conditional",
     "use_label_wrap": true,
|
|
@@ -315,14 +315,28 @@ class DuplicatedEnvGroups(Exception):
         super(DuplicatedEnvGroups, self).__init__(msg)


+def load_openpype_default_settings():
+    """Load openpype default settings."""
+    return load_jsons_from_dir(DEFAULTS_DIR)
+
+
 def reset_default_settings():
+    """Reset cache of default settings. Can't be used now."""
     global _DEFAULT_SETTINGS
     _DEFAULT_SETTINGS = None


 def get_default_settings():
+    """Get default settings.
+
+    Todo:
+        Cache loaded defaults.
+
+    Returns:
+        dict: Loaded default settings.
+    """
     # TODO add cacher
-    return load_jsons_from_dir(DEFAULTS_DIR)
+    return load_openpype_default_settings()
     # global _DEFAULT_SETTINGS
     # if _DEFAULT_SETTINGS is None:
     #     _DEFAULT_SETTINGS = load_jsons_from_dir(DEFAULTS_DIR)
@@ -868,6 +882,25 @@ def get_environments():
     return find_environments(get_system_settings(False))


+def get_general_environments():
+    """Get general environments.
+
+    Function is implemented to be able load general environments without using
+    `get_default_settings`.
+    """
+    # Use only openpype defaults.
+    # - prevent to use `get_system_settings` where `get_default_settings`
+    #   is used
+    default_values = load_openpype_default_settings()
+    studio_overrides = get_studio_system_settings_overrides()
+    result = apply_overrides(default_values, studio_overrides)
+    environments = result["general"]["environment"]
+
+    clear_metadata_from_settings(environments)
+
+    return environments
+
+
 def clear_metadata_from_settings(values):
     """Remove all metadata keys from loaded settings."""
     if isinstance(values, dict):
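A possible way to call the new helper is shown below; the import path is an assumption (the diff does not name the module), and printing the result is only for illustration.

```python
import json

# Assumed import path for the settings lib shown above.
from openpype.settings.lib import get_general_environments

envs = get_general_environments()
print(json.dumps(envs, indent=4))
```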
|
|
@ -35,6 +35,10 @@ QWidget:disabled {
|
||||||
color: {color:font-disabled};
|
color: {color:font-disabled};
|
||||||
}
|
}
|
||||||
|
|
||||||
|
QLabel {
|
||||||
|
background: transparent;
|
||||||
|
}
|
||||||
|
|
||||||
/* Inputs */
|
/* Inputs */
|
||||||
QAbstractSpinBox, QLineEdit, QPlainTextEdit, QTextEdit {
|
QAbstractSpinBox, QLineEdit, QPlainTextEdit, QTextEdit {
|
||||||
border: 1px solid {color:border};
|
border: 1px solid {color:border};
|
||||||
|
|
@ -97,7 +101,7 @@ QToolButton:disabled {
|
||||||
background: {color:bg-buttons-disabled};
|
background: {color:bg-buttons-disabled};
|
||||||
}
|
}
|
||||||
|
|
||||||
QToolButton[popupMode="1"] {
|
QToolButton[popupMode="1"], QToolButton[popupMode="MenuButtonPopup"] {
|
||||||
/* make way for the popup button */
|
/* make way for the popup button */
|
||||||
padding-right: 20px;
|
padding-right: 20px;
|
||||||
border: 1px solid {color:bg-buttons};
|
border: 1px solid {color:bg-buttons};
|
||||||
|
|
@ -340,6 +344,11 @@ QAbstractItemView {
|
||||||
selection-background-color: transparent;
|
selection-background-color: transparent;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
QAbstractItemView::item {
|
||||||
|
/* `border: none` hide outline of selected item. */
|
||||||
|
border: none;
|
||||||
|
}
|
||||||
|
|
||||||
QAbstractItemView:disabled{
|
QAbstractItemView:disabled{
|
||||||
background: {color:bg-view-disabled};
|
background: {color:bg-view-disabled};
|
||||||
alternate-background-color: {color:bg-view-alternate-disabled};
|
alternate-background-color: {color:bg-view-alternate-disabled};
|
||||||
|
|
|
||||||
|
|
@@ -294,6 +294,7 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
             msg = "<br><br>".join(warnings)

             dialog = QtWidgets.QMessageBox(self)
+            dialog.setWindowTitle("Save warnings")
             dialog.setText(msg)
             dialog.setIcon(QtWidgets.QMessageBox.Warning)
             dialog.exec_()
@@ -303,6 +304,7 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
         except Exception as exc:
             formatted_traceback = traceback.format_exception(*sys.exc_info())
             dialog = QtWidgets.QMessageBox(self)
+            dialog.setWindowTitle("Unexpected error")
             msg = "Unexpected error happened!\n\nError: {}".format(str(exc))
             dialog.setText(msg)
             dialog.setDetailedText("\n".join(formatted_traceback))
@@ -392,6 +394,7 @@ class SettingsCategoryWidget(QtWidgets.QWidget):
         except Exception as exc:
             formatted_traceback = traceback.format_exception(*sys.exc_info())
             dialog = QtWidgets.QMessageBox(self)
+            dialog.setWindowTitle("Unexpected error")
             msg = "Unexpected error happened!\n\nError: {}".format(str(exc))
             dialog.setText(msg)
             dialog.setDetailedText("\n".join(formatted_traceback))
||||||
|
|
@@ -24,6 +24,7 @@ class DictConditionalWidget(BaseWidget):
         self.body_widget = None
         self.content_widget = None
         self.content_layout = None
+        self.enum_layout = None

         label = None
         if self.entity.is_dynamic_item:
@@ -40,8 +41,36 @@ class DictConditionalWidget(BaseWidget):
         self._enum_key_by_wrapper_id = {}
         self._added_wrapper_ids = set()

-        self.content_layout.setColumnStretch(0, 0)
-        self.content_layout.setColumnStretch(1, 1)
+        enum_layout = QtWidgets.QGridLayout()
+        enum_layout.setContentsMargins(0, 0, 0, 0)
+        enum_layout.setColumnStretch(0, 0)
+        enum_layout.setColumnStretch(1, 1)
+
+        all_children_layout = QtWidgets.QVBoxLayout()
+        all_children_layout.setContentsMargins(0, 0, 0, 0)
+
+        if self.entity.enum_is_horizontal:
+            if self.entity.enum_on_right:
+                self.content_layout.addLayout(all_children_layout, 0, 0)
+                self.content_layout.addLayout(enum_layout, 0, 1)
+                # Stretch combobox to minimum and expand value
+                self.content_layout.setColumnStretch(0, 1)
+                self.content_layout.setColumnStretch(1, 0)
+            else:
+                self.content_layout.addLayout(enum_layout, 0, 0)
+                self.content_layout.addLayout(all_children_layout, 0, 1)
+                # Stretch combobox to minimum and expand value
+                self.content_layout.setColumnStretch(0, 0)
+                self.content_layout.setColumnStretch(1, 1)
+
+        else:
+            # Expand content
+            self.content_layout.setColumnStretch(0, 1)
+            self.content_layout.addLayout(enum_layout, 0, 0)
+            self.content_layout.addLayout(all_children_layout, 1, 0)
+
+        self.enum_layout = enum_layout
+        self.all_children_layout = all_children_layout

         # Add enum entity to layout mapping
         enum_entity = self.entity.enum_entity
@@ -58,6 +87,8 @@ class DictConditionalWidget(BaseWidget):
             content_layout.setContentsMargins(0, 0, 0, 0)
             content_layout.setSpacing(5)

+            all_children_layout.addWidget(content_widget)
+
             self._content_by_enum_value[enum_key] = {
                 "widget": content_widget,
                 "layout": content_layout
@@ -80,9 +111,6 @@ class DictConditionalWidget(BaseWidget):

         for item_key, children in self.entity.children.items():
             content_widget = self._content_by_enum_value[item_key]["widget"]
-            row = self.content_layout.rowCount()
-            self.content_layout.addWidget(content_widget, row, 0, 1, 2)

             for child_obj in children:
                 self.input_fields.append(
                     self.create_ui_for_entity(
@ -191,12 +219,25 @@ class DictConditionalWidget(BaseWidget):
|
||||||
else:
|
else:
|
||||||
map_id = widget.entity.id
|
map_id = widget.entity.id
|
||||||
|
|
||||||
content_widget = self.content_widget
|
is_enum_item = map_id == self.entity.enum_entity.id
|
||||||
content_layout = self.content_layout
|
if is_enum_item:
|
||||||
if map_id != self.entity.enum_entity.id:
|
content_widget = self.content_widget
|
||||||
enum_value = self._enum_key_by_wrapper_id[map_id]
|
content_layout = self.enum_layout
|
||||||
content_widget = self._content_by_enum_value[enum_value]["widget"]
|
|
||||||
content_layout = self._content_by_enum_value[enum_value]["layout"]
|
if not label:
|
||||||
|
content_layout.addWidget(widget, 0, 0, 1, 2)
|
||||||
|
return
|
||||||
|
|
||||||
|
label_widget = GridLabelWidget(label, widget)
|
||||||
|
label_widget.input_field = widget
|
||||||
|
widget.label_widget = label_widget
|
||||||
|
content_layout.addWidget(label_widget, 0, 0, 1, 1)
|
||||||
|
content_layout.addWidget(widget, 0, 1, 1, 1)
|
||||||
|
return
|
||||||
|
|
||||||
|
enum_value = self._enum_key_by_wrapper_id[map_id]
|
||||||
|
content_widget = self._content_by_enum_value[enum_value]["widget"]
|
||||||
|
content_layout = self._content_by_enum_value[enum_value]["layout"]
|
||||||
|
|
||||||
wrapper = self._parent_widget_by_entity_id[map_id]
|
wrapper = self._parent_widget_by_entity_id[map_id]
|
||||||
if wrapper is not content_widget:
|
if wrapper is not content_widget:
|
||||||
|
|
|
||||||
|
|
@@ -94,7 +94,8 @@ class MainWidget(QtWidgets.QWidget):
         super(MainWidget, self).showEvent(event)
         if self._reset_on_show:
             self._reset_on_show = False
-            self.reset()
+            # Trigger reset with 100ms delay
+            QtCore.QTimer.singleShot(100, self.reset)

     def _show_password_dialog(self):
         if self._password_dialog:
@@ -107,6 +108,8 @@ class MainWidget(QtWidgets.QWidget):
         self._password_dialog = None
         if password_passed:
             self.reset()
+            if not self.isVisible():
+                self.show()
         else:
             self.close()

@@ -141,7 +144,10 @@ class MainWidget(QtWidgets.QWidget):
         # Don't show dialog if there are not registered slots for
         # `trigger_restart` signal.
         # - For example when settings are runnin as standalone tool
-        if self.receivers(self.trigger_restart) < 1:
+        # - PySide2 and PyQt5 compatible way how to find out
+        method_index = self.metaObject().indexOfMethod("trigger_restart()")
+        method = self.metaObject().method(method_index)
+        if not self.isSignalConnected(method):
             return

         dialog = RestartDialog(self)
|
|
@ -1,7 +1,6 @@
|
||||||
import os
|
import os
|
||||||
from Qt import QtCore, QtGui, QtWidgets
|
from Qt import QtCore, QtGui, QtWidgets
|
||||||
from .resources import get_resource
|
from .resources import get_resource
|
||||||
from avalon import style
|
|
||||||
|
|
||||||
|
|
||||||
class ComponentItem(QtWidgets.QFrame):
|
class ComponentItem(QtWidgets.QFrame):
|
||||||
|
|
@ -61,7 +60,7 @@ class ComponentItem(QtWidgets.QFrame):
|
||||||
name="menu", size=QtCore.QSize(22, 22)
|
name="menu", size=QtCore.QSize(22, 22)
|
||||||
)
|
)
|
||||||
|
|
||||||
self.action_menu = QtWidgets.QMenu()
|
self.action_menu = QtWidgets.QMenu(self.btn_action_menu)
|
||||||
|
|
||||||
expanding_sizePolicy = QtWidgets.QSizePolicy(
|
expanding_sizePolicy = QtWidgets.QSizePolicy(
|
||||||
QtWidgets.QSizePolicy.Expanding, QtWidgets.QSizePolicy.Expanding
|
QtWidgets.QSizePolicy.Expanding, QtWidgets.QSizePolicy.Expanding
|
||||||
|
|
@ -229,7 +228,6 @@ class ComponentItem(QtWidgets.QFrame):
|
||||||
if not self.btn_action_menu.isVisible():
|
if not self.btn_action_menu.isVisible():
|
||||||
self.btn_action_menu.setVisible(True)
|
self.btn_action_menu.setVisible(True)
|
||||||
self.btn_action_menu.clicked.connect(self.show_actions)
|
self.btn_action_menu.clicked.connect(self.show_actions)
|
||||||
self.action_menu.setStyleSheet(style.load_stylesheet())
|
|
||||||
|
|
||||||
def set_repre_name_valid(self, valid):
|
def set_repre_name_valid(self, valid):
|
||||||
self.has_valid_repre = valid
|
self.has_valid_repre = valid
|
||||||
|
|
|
||||||
|
|
@@ -211,7 +211,8 @@ class DropDataFrame(QtWidgets.QFrame):
         folder_path = os.path.dirname(collection.head)
         if file_base[-1] in ['.', '_']:
             file_base = file_base[:-1]
-        file_ext = collection.tail
+        file_ext = os.path.splitext(
+            collection.format('{head}{padding}{tail}'))[1]
         repr_name = file_ext.replace('.', '')
         range = collection.format('{ranges}')
@ -693,16 +693,16 @@ class FilesWidget(QtWidgets.QWidget):
|
||||||
)
|
)
|
||||||
return
|
return
|
||||||
|
|
||||||
file_path = os.path.join(self.root, work_file)
|
file_path = os.path.join(os.path.normpath(self.root), work_file)
|
||||||
|
|
||||||
pipeline.emit("before.workfile.save", file_path)
|
pipeline.emit("before.workfile.save", [file_path])
|
||||||
|
|
||||||
self._enter_session() # Make sure we are in the right session
|
self._enter_session() # Make sure we are in the right session
|
||||||
self.host.save_file(file_path)
|
self.host.save_file(file_path)
|
||||||
|
|
||||||
self.set_asset_task(self._asset, self._task)
|
self.set_asset_task(self._asset, self._task)
|
||||||
|
|
||||||
pipeline.emit("after.workfile.save", file_path)
|
pipeline.emit("after.workfile.save", [file_path])
|
||||||
|
|
||||||
self.workfile_created.emit(file_path)
|
self.workfile_created.emit(file_path)
|
||||||
|
|
||||||
|
|
|
||||||
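Because the emit calls now pass a list of arguments, a registered listener would receive the saved path as a single positional argument. The sketch below assumes avalon's `api.on` registry is how such listeners are attached; that registration API is not shown in this diff.

```python
from avalon import api  # assumed registration API


def _on_before_workfile_save(file_path):
    # Receives the path taken from the list passed to `pipeline.emit` above.
    print("About to save workfile: {}".format(file_path))


api.on("before.workfile.save", _on_before_workfile_save)
```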
openpype/vendor/python/common/scriptsmenu/__init__.py (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
+from .scriptsmenu import ScriptsMenu
+from . import version
+
+__all__ = ["ScriptsMenu"]
+__version__ = version.version
openpype/vendor/python/common/scriptsmenu/action.py (vendored, new file, 207 lines)
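The vendored `scriptsmenu` package added in these files is what the Maya scripts menu settings feed into. Below is a hedged usage sketch inside Maya that relies only on the API visible in this diff (`launchformaya.main` and `ScriptsMenu.register_callback`); the import assumes the vendored directory is on `sys.path`, and the callback itself is a made-up example.

```python
from scriptsmenu import launchformaya


def _log_click(action):
    # Extra callback for plain clicks (modifier mask 0); returning 0 lets the
    # action's own command run afterwards, per Action.run_command below.
    print("Scripts menu action clicked: {}".format(action.text()))
    return 0


menu = launchformaya.main(title="OpenPype Tools")
if menu is not None:
    menu.register_callback(0, _log_click)
```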
|
|
@ -0,0 +1,207 @@
|
||||||
|
import os
|
||||||
|
|
||||||
|
from .vendor.Qt import QtWidgets
|
||||||
|
|
||||||
|
|
||||||
|
class Action(QtWidgets.QAction):
|
||||||
|
"""Custom Action widget"""
|
||||||
|
|
||||||
|
def __init__(self, parent=None):
|
||||||
|
|
||||||
|
QtWidgets.QAction.__init__(self, parent)
|
||||||
|
|
||||||
|
self._root = None
|
||||||
|
self._tags = list()
|
||||||
|
self._command = None
|
||||||
|
self._sourcetype = None
|
||||||
|
self._iconfile = None
|
||||||
|
self._label = None
|
||||||
|
|
||||||
|
self._COMMAND = """import imp
|
||||||
|
f, filepath, descr = imp.find_module('{module_name}', ['{dirname}'])
|
||||||
|
module = imp.load_module('{module_name}', f, filepath, descr)
|
||||||
|
module.{module_name}()"""
|
||||||
|
|
||||||
|
@property
|
||||||
|
def root(self):
|
||||||
|
return self._root
|
||||||
|
|
||||||
|
@root.setter
|
||||||
|
def root(self, value):
|
||||||
|
self._root = value
|
||||||
|
|
||||||
|
@property
|
||||||
|
def tags(self):
|
||||||
|
return self._tags
|
||||||
|
|
||||||
|
@tags.setter
|
||||||
|
def tags(self, value):
|
||||||
|
self._tags = value
|
||||||
|
|
||||||
|
@property
|
||||||
|
def command(self):
|
||||||
|
return self._command
|
||||||
|
|
||||||
|
@command.setter
|
||||||
|
def command(self, value):
|
||||||
|
"""
|
||||||
|
Store the command in the QAction
|
||||||
|
|
||||||
|
Args:
|
||||||
|
value (str): the full command which will be executed when clicked
|
||||||
|
|
||||||
|
Return:
|
||||||
|
None
|
||||||
|
"""
|
||||||
|
self._command = value
|
||||||
|
|
||||||
|
@property
|
||||||
|
def sourcetype(self):
|
||||||
|
return self._sourcetype
|
||||||
|
|
||||||
|
@sourcetype.setter
|
||||||
|
def sourcetype(self, value):
|
||||||
|
"""
|
||||||
|
Set the command type to get the correct execution of the command given
|
||||||
|
|
||||||
|
Args:
|
||||||
|
value (str): the name of the command type
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
None
|
||||||
|
|
||||||
|
"""
|
||||||
|
self._sourcetype = value
|
||||||
|
|
||||||
|
@property
|
||||||
|
def iconfile(self):
|
||||||
|
return self._iconfile
|
||||||
|
|
||||||
|
@iconfile.setter
|
||||||
|
def iconfile(self, value):
|
||||||
|
"""Store the path to the image file which needs to be displayed
|
||||||
|
|
||||||
|
Args:
|
||||||
|
value (str): the path to the image
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
None
|
||||||
|
"""
|
||||||
|
self._iconfile = value
|
||||||
|
|
||||||
|
@property
|
||||||
|
def label(self):
|
||||||
|
return self._label
|
||||||
|
|
||||||
|
@label.setter
|
||||||
|
def label(self, value):
|
||||||
|
"""
|
||||||
|
Set the abbreviation which will be used as overlay text in the shelf
|
||||||
|
|
||||||
|
Args:
|
||||||
|
value (str): an abbreviation of the name
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
None
|
||||||
|
|
||||||
|
"""
|
||||||
|
self._label = value
|
||||||
|
|
||||||
|
def run_command(self):
|
||||||
|
"""
|
||||||
|
Run the command of the instance or copy the command to the active shelf
|
||||||
|
based on the current modifiers.
|
||||||
|
|
||||||
|
If callbacks have been registered with fouind modifier integer the
|
||||||
|
function will trigger all callbacks. When a callback function returns a
|
||||||
|
non zero integer it will not execute the action's command
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
# get the current application and its linked keyboard modifiers
|
||||||
|
modifiers = QtWidgets.QApplication.keyboardModifiers()
|
||||||
|
|
||||||
|
# If the menu has a callback registered for the current modifier
|
||||||
|
# we run the callback instead of the action itself.
|
||||||
|
registered = self._root.registered_callbacks
|
||||||
|
callbacks = registered.get(int(modifiers), [])
|
||||||
|
for callback in callbacks:
|
||||||
|
signal = callback(self)
|
||||||
|
if signal != 0:
|
||||||
|
# Exit function on non-zero return code
|
||||||
|
return
|
||||||
|
|
||||||
|
exec(self.process_command())
|
||||||
|
|
||||||
|
def process_command(self):
|
||||||
|
"""
|
||||||
|
Check if the command is a file which needs to be launched and if it
|
||||||
|
has a relative path, if so return the full path by expanding
|
||||||
|
environment variables. Wrap any mel command in a executable string
|
||||||
|
for Python and return the string if the source type is
|
||||||
|
|
||||||
|
Add your own source type and required process to ensure callback
|
||||||
|
is stored correctly.
|
||||||
|
|
||||||
|
An example of a process is the sourcetype is MEL
|
||||||
|
(Maya Embedded Language) as Python cannot run it on its own so it
|
||||||
|
needs to be wrapped in a string in which we explicitly import mel and
|
||||||
|
run it as a mel.eval. The string is then parsed to python as
|
||||||
|
exec("command").
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
str: a clean command which can be used
|
||||||
|
|
||||||
|
"""
|
||||||
|
if self._sourcetype == "python":
|
||||||
|
return self._command
|
||||||
|
|
||||||
|
if self._sourcetype == "mel":
|
||||||
|
# Escape single quotes
|
||||||
|
conversion = self._command.replace("'", "\\'")
|
||||||
|
return "import maya; maya.mel.eval('{}')".format(conversion)
|
||||||
|
|
||||||
|
if self._sourcetype == "file":
|
||||||
|
if os.path.isabs(self._command):
|
||||||
|
filepath = self._command
|
||||||
|
else:
|
||||||
|
filepath = os.path.normpath(os.path.expandvars(self._command))
|
||||||
|
|
||||||
|
return self._wrap_filepath(filepath)
|
||||||
|
|
||||||
|
def has_tag(self, tag):
|
||||||
|
"""Check whether the tag matches with the action's tags.
|
||||||
|
|
||||||
|
A partial match will also return True, for example tag `a` will match
|
||||||
|
correctly with the `ape` tag on the Action.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
tag (str): The tag
|
||||||
|
|
||||||
|
Returns
|
||||||
|
bool: Whether the action is tagged with given tag
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
for tagitem in self.tags:
|
||||||
|
if tag not in tagitem:
|
||||||
|
continue
|
||||||
|
return True
|
||||||
|
|
||||||
|
return False
|
||||||
|
|
||||||
|
def _wrap_filepath(self, file_path):
|
||||||
|
"""Create a wrapped string for the python command
|
||||||
|
|
||||||
|
Args:
|
||||||
|
file_path (str): the filepath of a script
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
str: the wrapped command
|
||||||
|
"""
|
||||||
|
|
||||||
|
dirname = os.path.dirname(r"{}".format(file_path))
|
||||||
|
dirpath = dirname.replace("\\", "/")
|
||||||
|
module_name = os.path.splitext(os.path.basename(file_path))[0]
|
||||||
|
|
||||||
|
return self._COMMAND.format(module_name=module_name, dirname=dirpath)
|
||||||
openpype/vendor/python/common/scriptsmenu/launchformari.py (vendored, new file, 54 lines)
|
|
@ -0,0 +1,54 @@
|
||||||
|
|
||||||
|
# Import third-party modules
|
||||||
|
from vendor.Qt import QtWidgets
|
||||||
|
|
||||||
|
# Import local modules
|
||||||
|
import scriptsmenu
|
||||||
|
|
||||||
|
|
||||||
|
def _mari_main_window():
|
||||||
|
"""Get Mari main window.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
MriMainWindow: Mari's main window.
|
||||||
|
|
||||||
|
"""
|
||||||
|
for obj in QtWidgets.QApplication.topLevelWidgets():
|
||||||
|
if obj.metaObject().className() == 'MriMainWindow':
|
||||||
|
return obj
|
||||||
|
raise RuntimeError('Could not find Mari MainWindow instance')
|
||||||
|
|
||||||
|
|
||||||
|
def _mari_main_menubar():
|
||||||
|
"""Get Mari main menu bar.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Retrieve the main menubar of the Mari window.
|
||||||
|
|
||||||
|
"""
|
||||||
|
mari_window = _mari_main_window()
|
||||||
|
menubar = [
|
||||||
|
i for i in mari_window.children() if isinstance(i, QtWidgets.QMenuBar)
|
||||||
|
]
|
||||||
|
assert len(menubar) == 1, "Error, could not find menu bar!"
|
||||||
|
return menubar[0]
|
||||||
|
|
||||||
|
|
||||||
|
def main(title="Scripts"):
|
||||||
|
"""Build the main scripts menu in Mari.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
title (str): Name of the menu in the application.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
scriptsmenu.ScriptsMenu: Instance object.
|
||||||
|
|
||||||
|
"""
|
||||||
|
mari_main_bar = _mari_main_menubar()
|
||||||
|
for mari_bar in mari_main_bar.children():
|
||||||
|
if isinstance(mari_bar, scriptsmenu.ScriptsMenu):
|
||||||
|
if mari_bar.title() == title:
|
||||||
|
menu = mari_bar
|
||||||
|
return menu
|
||||||
|
menu = scriptsmenu.ScriptsMenu(title=title, parent=mari_main_bar)
|
||||||
|
return menu
|
||||||
openpype/vendor/python/common/scriptsmenu/launchformaya.py (vendored, new file, 137 lines)
|
|
@ -0,0 +1,137 @@
|
||||||
|
import logging
|
||||||
|
|
||||||
|
import maya.cmds as cmds
|
||||||
|
import maya.mel as mel
|
||||||
|
|
||||||
|
import scriptsmenu
|
||||||
|
from .vendor.Qt import QtCore, QtWidgets
|
||||||
|
|
||||||
|
log = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
def register_repeat_last(action):
|
||||||
|
"""Register the action in repeatLast to ensure the RepeatLast hotkey works
|
||||||
|
|
||||||
|
Args:
|
||||||
|
action (action.Action): Action wigdet instance
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
int: 0
|
||||||
|
|
||||||
|
"""
|
||||||
|
command = action.process_command()
|
||||||
|
command = command.replace("\n", "; ")
|
||||||
|
# Register command to Maya (mel)
|
||||||
|
cmds.repeatLast(addCommand='python("{}")'.format(command),
|
||||||
|
addCommandLabel=action.label)
|
||||||
|
|
||||||
|
return 0
|
||||||
|
|
||||||
|
|
||||||
|
def to_shelf(action):
|
||||||
|
"""Copy clicked menu item to the currently active Maya shelf
|
||||||
|
Args:
|
||||||
|
action (action.Action): the action instance which is clicked
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
int: 1
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
shelftoplevel = mel.eval("$gShelfTopLevel = $gShelfTopLevel;")
|
||||||
|
current_active_shelf = cmds.tabLayout(shelftoplevel,
|
||||||
|
query=True,
|
||||||
|
selectTab=True)
|
||||||
|
|
||||||
|
cmds.shelfButton(command=action.process_command(),
|
||||||
|
sourceType="python",
|
||||||
|
parent=current_active_shelf,
|
||||||
|
image=action.iconfile or "pythonFamily.png",
|
||||||
|
annotation=action.statusTip(),
|
||||||
|
imageOverlayLabel=action.label or "")
|
||||||
|
|
||||||
|
return 1
|
||||||
|
|
||||||
|
|
||||||
|
def _maya_main_window():
|
||||||
|
"""Return Maya's main window"""
|
||||||
|
for obj in QtWidgets.QApplication.topLevelWidgets():
|
||||||
|
if obj.objectName() == 'MayaWindow':
|
||||||
|
return obj
|
||||||
|
raise RuntimeError('Could not find MayaWindow instance')
|
||||||
|
|
||||||
|
|
||||||
|
def _maya_main_menubar():
|
||||||
|
"""Retrieve the main menubar of the Maya window"""
|
||||||
|
mayawindow = _maya_main_window()
|
||||||
|
menubar = [i for i in mayawindow.children()
|
||||||
|
if isinstance(i, QtWidgets.QMenuBar)]
|
||||||
|
|
||||||
|
assert len(menubar) == 1, "Error, could not find menu bar!"
|
||||||
|
|
||||||
|
return menubar[0]
|
||||||
|
|
||||||
|
|
||||||
|
def find_scripts_menu(title, parent):
|
||||||
|
"""
|
||||||
|
Check if the menu exists with the given title in the parent
|
||||||
|
|
||||||
|
Args:
|
||||||
|
title (str): the title name of the scripts menu
|
||||||
|
|
||||||
|
parent (QtWidgets.QMenuBar): the menubar to check
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QtWidgets.QMenu or None
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
menu = None
|
||||||
|
search = [i for i in parent.children() if
|
||||||
|
isinstance(i, scriptsmenu.ScriptsMenu)
|
||||||
|
and i.title() == title]
|
||||||
|
|
||||||
|
if search:
|
||||||
|
assert len(search) < 2, ("Multiple instances of menu '{}' "
|
||||||
|
"in menu bar".format(title))
|
||||||
|
menu = search[0]
|
||||||
|
|
||||||
|
return menu
|
||||||
|
|
||||||
|
|
||||||
|
def main(title="Scripts", parent=None, objectName=None):
|
||||||
|
"""Build the main scripts menu in Maya
|
||||||
|
|
||||||
|
Args:
|
||||||
|
title (str): name of the menu in the application
|
||||||
|
|
||||||
|
parent (QtWidgets.QtMenuBar): the parent object for the menu
|
||||||
|
|
||||||
|
objectName (str): custom objectName for scripts menu
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
scriptsmenu.ScriptsMenu instance
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
mayamainbar = parent or _maya_main_menubar()
|
||||||
|
try:
|
||||||
|
# check menu already exists
|
||||||
|
menu = find_scripts_menu(title, mayamainbar)
|
||||||
|
if not menu:
|
||||||
|
log.info("Attempting to build menu ...")
|
||||||
|
object_name = objectName or title.lower()
|
||||||
|
menu = scriptsmenu.ScriptsMenu(title=title,
|
||||||
|
parent=mayamainbar,
|
||||||
|
objectName=object_name)
|
||||||
|
except Exception as e:
|
||||||
|
log.error(e)
|
||||||
|
return
|
||||||
|
|
||||||
|
# Register control + shift callback to add to shelf (maya behavior)
|
||||||
|
modifiers = QtCore.Qt.ControlModifier | QtCore.Qt.ShiftModifier
|
||||||
|
menu.register_callback(int(modifiers), to_shelf)
|
||||||
|
|
||||||
|
menu.register_callback(0, register_repeat_last)
|
||||||
|
|
||||||
|
return menu
|
||||||
openpype/vendor/python/common/scriptsmenu/launchfornuke.py (vendored, new file, 36 lines)
|
|
@ -0,0 +1,36 @@
import scriptsmenu
from .vendor.Qt import QtWidgets


def _nuke_main_window():
    """Return Nuke's main window"""
    for obj in QtWidgets.QApplication.topLevelWidgets():
        if (obj.inherits('QMainWindow') and
                obj.metaObject().className() == 'Foundry::UI::DockMainWindow'):
            return obj
    raise RuntimeError('Could not find Nuke MainWindow instance')


def _nuke_main_menubar():
    """Retrieve the main menubar of the Nuke window"""
    nuke_window = _nuke_main_window()
    menubar = [i for i in nuke_window.children()
               if isinstance(i, QtWidgets.QMenuBar)]

    assert len(menubar) == 1, "Error, could not find menu bar!"
    return menubar[0]


def main(title="Scripts"):
    # Register control + shift callback to add to shelf (Nuke behavior)
    # modifiers = QtCore.Qt.ControlModifier | QtCore.Qt.ShiftModifier
    # menu.register_callback(modifiers, to_shelf)
    nuke_main_bar = _nuke_main_menubar()
    for nuke_bar in nuke_main_bar.children():
        if isinstance(nuke_bar, scriptsmenu.ScriptsMenu):
            if nuke_bar.title() == title:
                menu = nuke_bar
                return menu

    menu = scriptsmenu.ScriptsMenu(title=title, parent=nuke_main_bar)
    return menu
316 openpype/vendor/python/common/scriptsmenu/scriptsmenu.py vendored Normal file

@@ -0,0 +1,316 @@
import os
import json
import logging
from collections import defaultdict

# QIcon is provided by QtGui, not QtWidgets
from .vendor.Qt import QtWidgets, QtCore, QtGui
from . import action

log = logging.getLogger(__name__)


class ScriptsMenu(QtWidgets.QMenu):
    """A Qt menu that displays a list of searchable actions"""

    updated = QtCore.Signal(QtWidgets.QMenu)

    def __init__(self, *args, **kwargs):
        """Initialize Scripts menu

        Args:
            title (str): the name of the root menu which will be created

            parent (QtWidgets.QObject): the QObject to parent the menu to

        Returns:
            None

        """
        QtWidgets.QMenu.__init__(self, *args, **kwargs)

        self.searchbar = None
        self.update_action = None

        self._script_actions = []
        self._callbacks = defaultdict(list)

        # Automatically add it to the parent menu
        parent = kwargs.get("parent", None)
        if parent:
            parent.addMenu(self)

        objectname = kwargs.get("objectName", "scripts")
        title = kwargs.get("title", "Scripts")
        self.setObjectName(objectname)
        self.setTitle(title)

        # add default items in the menu
        self.create_default_items()

    def on_update(self):
        self.updated.emit(self)

    @property
    def registered_callbacks(self):
        return self._callbacks.copy()

    def create_default_items(self):
        """Add a search bar to the top of the menu"""

        # create widget and link function
        searchbar = QtWidgets.QLineEdit()
        searchbar.setFixedWidth(120)
        searchbar.setPlaceholderText("Search ...")
        searchbar.textChanged.connect(self._update_search)
        self.searchbar = searchbar

        # create widget holder
        searchbar_action = QtWidgets.QWidgetAction(self)

        # add widget to widget holder
        searchbar_action.setDefaultWidget(self.searchbar)
        searchbar_action.setObjectName("Searchbar")

        # add update button and link function
        update_action = QtWidgets.QAction(self)
        update_action.setObjectName("Update Scripts")
        update_action.setText("Update Scripts")
        update_action.setVisible(False)
        update_action.triggered.connect(self.on_update)
        self.update_action = update_action

        # add actions to menu
        self.addAction(searchbar_action)
        self.addAction(update_action)

        # add separator object
        separator = self.addSeparator()
        separator.setObjectName("separator")

    def add_menu(self, title, parent=None):
        """Create a sub menu for a parent widget

        Args:
            title (str): the title of the menu

            parent (QtWidgets.QWidget): the object to parent the menu to

        Returns:
            QtWidgets.QMenu instance

        """
        if not parent:
            parent = self

        menu = QtWidgets.QMenu(parent, title)
        menu.setTitle(title)
        menu.setObjectName(title)
        menu.setTearOffEnabled(True)
        parent.addMenu(menu)

        return menu

    def add_script(self, parent, title, command, sourcetype, icon=None,
                   tags=None, label=None, tooltip=None):
        """Create an action item which runs a script when clicked

        Args:
            parent (QtWidgets.QWidget): The widget to parent the item to

            title (str): The text which will be displayed in the menu

            command (str): The command which needs to be run when the item is
                clicked.

            sourcetype (str): The type of command; the way the command is
                processed is based on the source type.

            icon (str): The file path of an icon to display with the menu item

            tags (list, tuple): Keywords which describe the action

            label (str): A short description of the script which will be
                displayed when hovering over the menu item

            tooltip (str): A tip for the user about the usage of the tool

        Returns:
            QtWidgets.QAction instance

        """
        assert tags is None or isinstance(tags, (list, tuple))
        # Ensure tags is a list
        tags = list() if tags is None else list(tags)
        tags.append(title.lower())

        assert icon is None or isinstance(icon, str), (
            "Invalid data type for icon, supported : None, string")

        # create new action
        script_action = action.Action(parent)
        script_action.setText(title)
        script_action.setObjectName(title)
        script_action.tags = tags

        # link action to root for callback library
        script_action.root = self

        # Set up the command
        script_action.sourcetype = sourcetype
        script_action.command = command

        try:
            script_action.process_command()
        except RuntimeError as e:
            raise RuntimeError("Script action can't be "
                               "processed: {}".format(e))

        if icon:
            iconfile = os.path.expandvars(icon)
            script_action.iconfile = iconfile
            script_action_icon = QtGui.QIcon(iconfile)
            script_action.setIcon(script_action_icon)

        if label:
            script_action.label = label

        if tooltip:
            script_action.setStatusTip(tooltip)

        script_action.triggered.connect(script_action.run_command)
        parent.addAction(script_action)

        # Add to our searchable actions
        self._script_actions.append(script_action)

        return script_action

    def build_from_configuration(self, parent, configuration):
        """Process the configurations and store the configuration

        This creates all submenus from a configuration.json file.

        When the configuration holds the key `main`, all scripts under
        `main` will be added to the main menu first before adding the rest.

        Args:
            parent (ScriptsMenu): script menu instance
            configuration (list): A ScriptsMenu configuration list

        Returns:
            None

        """
        for item in configuration:
            assert isinstance(item, dict), "Configuration is wrong!"

            # skip items which have no `type` key
            item_type = item.get('type', None)
            if not item_type:
                log.warning("Missing 'type' from configuration item")
                continue

            # add separator
            # Special behavior for separators
            if item_type == "separator":
                parent.addSeparator()

            # add submenu
            # items should hold a collection of submenu items (dict)
            elif item_type == "menu":
                assert "items" in item, "Menu is missing 'items' key"
                menu = self.add_menu(parent=parent, title=item["title"])
                self.build_from_configuration(menu, item["items"])

            # add script
            elif item_type == "action":
                # filter out `type` from the item dict
                config = {key: value for key, value in
                          item.items() if key != "type"}

                self.add_script(parent=parent, **config)

    def set_update_visible(self, state):
        self.update_action.setVisible(state)

    def clear_menu(self):
        """Clear all menu items which are not default

        Returns:
            None

        """
        # TODO: Set up a more robust implementation for this
        # Delete all except the first three actions
        for _action in self.actions()[3:]:
            self.removeAction(_action)

    def register_callback(self, modifiers, callback):
        self._callbacks[modifiers].append(callback)

    def _update_search(self, search):
        """Hide all the actions which do not match the user's input

        Returns:
            None

        """
        if not search:
            for action in self._script_actions:
                action.setVisible(True)
        else:
            for action in self._script_actions:
                if not action.has_tag(search.lower()):
                    action.setVisible(False)

        # Set visibility for all submenus
        for action in self.actions():
            if not action.menu():
                continue

            menu = action.menu()
            visible = any(action.isVisible() for action in menu.actions())
            action.setVisible(visible)


def load_configuration(path):
    """Load the configuration from a file

    Read out the JSON file which will dictate the structure of the
    scripts menu.

    Args:
        path (str): file path of the .json file

    Returns:
        dict

    """
    if not os.path.isfile(path):
        raise AttributeError("Given configuration is not "
                             "a file!\n'{}'".format(path))

    extension = os.path.splitext(path)[-1]
    if extension != ".json":
        raise AttributeError("Given configuration file has unsupported "
                             "file type, provide a .json file")

    # retrieve and store config
    with open(path, "r") as f:
        configuration = json.load(f)

    return configuration


def application(configuration, parent):
    import sys
    app = QtWidgets.QApplication(sys.argv)

    scriptsmenu = ScriptsMenu(configuration, parent)
    scriptsmenu.show()

    sys.exit(app.exec_())
1989 openpype/vendor/python/common/scriptsmenu/vendor/Qt.py vendored Normal file
File diff suppressed because it is too large. Load diff

0 openpype/vendor/python/common/scriptsmenu/vendor/__init__.py vendored Normal file

9 openpype/vendor/python/common/scriptsmenu/version.py vendored Normal file

@@ -0,0 +1,9 @@
VERSION_MAJOR = 1
VERSION_MINOR = 5
VERSION_PATCH = 1


version = '{}.{}.{}'.format(VERSION_MAJOR, VERSION_MINOR, VERSION_PATCH)
__version__ = version

__all__ = ['version', '__version__']
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-
 """Package declaring Pype version."""
-__version__ = "3.3.0-nightly.4"
+__version__ = "3.3.0-nightly.6"
@@ -1 +1 @@
-Subproject commit d8be0bdb37961e32243f1de0eb9696e86acf7443
+Subproject commit cfd4191e364b47de7364096f45d9d9d9a901692a
17 start.py

@@ -208,14 +208,21 @@ def set_openpype_global_environments() -> None:
"""Set global OpenPype's environments."""
|
"""Set global OpenPype's environments."""
|
||||||
import acre
|
import acre
|
||||||
|
|
||||||
from openpype.settings import get_environments
|
try:
|
||||||
|
from openpype.settings import get_general_environments
|
||||||
|
|
||||||
all_env = get_environments()
|
general_env = get_general_environments()
|
||||||
|
|
||||||
|
except Exception:
|
||||||
|
# Backwards compatibility for OpenPype versions where
|
||||||
|
# `get_general_environments` does not exists yet
|
||||||
|
from openpype.settings import get_environments
|
||||||
|
|
||||||
|
all_env = get_environments()
|
||||||
|
general_env = all_env["global"]
|
||||||
|
|
||||||
# TODO Global environments will be stored in "general" settings so loading
|
|
||||||
# will be modified and can be done in igniter.
|
|
||||||
env = acre.merge(
|
env = acre.merge(
|
||||||
acre.parse(all_env["global"]),
|
acre.parse(general_env),
|
||||||
dict(os.environ)
|
dict(os.environ)
|
||||||
)
|
)
|
||||||
os.environ.clear()
|
os.environ.clear()
|
||||||
|
|
|
||||||
|
|
@@ -80,17 +80,6 @@ function Show-PSWarning() {
     }
 }
 
-function Install-Poetry() {
-    Write-Host ">>> " -NoNewline -ForegroundColor Green
-    Write-Host "Installing Poetry ... "
-    $python = "python"
-    if (Get-Command "pyenv" -ErrorAction SilentlyContinue) {
-        $python = & pyenv which python
-    }
-    $env:POETRY_HOME="$openpype_root\.poetry"
-    (Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py -UseBasicParsing).Content | & $($python) -
-}
-
 $art = @"
 
 . . .. . ..
@@ -62,9 +62,12 @@ function Test-Python() {
     Write-Host "Detecting host Python ... " -NoNewline
     $python = "python"
     if (Get-Command "pyenv" -ErrorAction SilentlyContinue) {
-        $python = & pyenv which python
+        $pyenv_python = & pyenv which python
+        if (Test-Path -PathType Leaf -Path "$($pyenv_python)") {
+            $python = $pyenv_python
+        }
     }
-    if (-not (Get-Command "python3" -ErrorAction SilentlyContinue)) {
+    if (-not (Get-Command $python -ErrorAction SilentlyContinue)) {
         Write-Host "!!! Python not detected" -ForegroundColor red
         Set-Location -Path $current_dir
         Exit-WithCode 1
@@ -4,11 +4,11 @@ title: Maya
 sidebar_label: Maya
 ---
 
-## Maya
+## Publish Plugins
 
-### Publish Plugins
+### Render Settings Validator
 
-#### Render Settings Validator (`ValidateRenderSettings`)
+`ValidateRenderSettings`
 
 Render Settings Validator is here to make sure artists will submit renders
 with correct settings. Some of these settings are needed by OpenPype but some
@@ -49,4 +49,49 @@ Arnolds Camera (AA) samples to 6.
Note that `aiOptions` is not the name of node but rather its type. For renderers there is usually
just one instance of this node type but if that is not so, validator will go through all its
instances and check the value there. Node type for **VRay** settings is `VRaySettingsNode`, for **Renderman**
it is `rmanGlobals`, for **Redshift** it is `RedshiftOptions`.

### Model Name Validator

`ValidateModelName`

This validator can enforce specific names for model members. It will check them against **Validation Regex**.
There is a special group in that regex - **shader**. If present, that part of the name is taken as the shader name
and compared with the list of shaders defined either in the file specified in **Material File**, or in a per-project
database file that can be edited directly from Maya's *OpenPype Tools > Edit Shader name definitions* when
**Use database shader name definitions** is on. This list is defined simply as one shader name per line.

![model name validator settings](assets/maya-admin_model_name_validator.png)

For example - you are using the default regex `(.*)_(\d)*_(?P<shader>.*)_(GEO)` and you have two shaders, `foo` and `bar`,
defined in either the file or the database.

An object named `SomeCube_0001_foo_GEO` will pass, but neither `SomeCube_GEO` nor `SomeCube_001_xxx_GEO` will.
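Not part of the docs themselves: a rough Python sketch of the check described above, with the shader list hard-coded instead of read from the Material File or the project database.

```python
import re

# Simplified illustration of the model name check; the real validator reads
# shaders from the Material File or the project database.
VALIDATION_REGEX = r"(.*)_(\d)*_(?P<shader>.*)_(GEO)"
KNOWN_SHADERS = {"foo", "bar"}

def is_valid_model_name(name):
    match = re.match(VALIDATION_REGEX, name)
    if not match:
        return False                      # e.g. "SomeCube_GEO"
    # The special "shader" group must name a known shader
    return match.group("shader") in KNOWN_SHADERS

print(is_valid_model_name("SomeCube_0001_foo_GEO"))   # True
print(is_valid_model_name("SomeCube_GEO"))            # False
print(is_valid_model_name("SomeCube_001_xxx_GEO"))    # False
```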
#### Top level group name
There is a validation for the top level group name too. You can specify whatever regex you'd like to use. The default will
pass everything with a `_GRP` suffix. You can use *named capturing groups* to validate against specific data. If you
put `(?P<asset>.*)` in, it will try to match everything captured in that group against the current asset name. Likewise you can
use it for **subset** and **project** - `(?P<subset>.*)` and `(?P<project>.*)`.

**Example**

You are working on asset (shot) `0030_OGC_0190`. You have this regex in **Top level group name**:
```regexp
.*?_(?P<asset>.*)_GRP
```

When you publish your model with a top group named `foo_GRP` it will fail, but with `foo_0030_OGC_0190_GRP` it will pass.
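Again a rough sketch, assuming the asset name is handed in directly (the real validator takes it from the publish context):

```python
import re

# Illustration only: the real validator takes the asset name from the
# publishing context instead of a function argument.
TOP_GROUP_REGEX = r".*?_(?P<asset>.*)_GRP"

def is_valid_top_group(group_name, asset_name):
    match = re.match(TOP_GROUP_REGEX, group_name)
    if not match:
        return False
    # The named "asset" group must match the context value
    return match.group("asset") == asset_name

print(is_valid_top_group("foo_GRP", "0030_OGC_0190"))                # False
print(is_valid_top_group("foo_0030_OGC_0190_GRP", "0030_OGC_0190"))  # True
```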
:::info About regex
All regexes used here use Python syntax.
:::

## Custom Menu
You can add your own custom tools menu into Maya by extending the definitions in **Maya -> Scripts Menu Definition** (a sketch of such a definition follows after the note below).

![scripts menu settings](assets/maya-admin_scriptsmenu.png)

:::note Work in progress
This is still work in progress. Menu definition will be handled in a more friendly way with widgets and not
raw json.
:::
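For illustration only, a definition of the kind this menu builder consumes might look like the sketch below, written as a Python literal of the JSON structure. Only the `type`, `title` and `items` keys come from `ScriptsMenu.build_from_configuration` in this commit; the tool names, commands and the `"python"` sourcetype are assumptions.

```python
# Hypothetical menu definition; mirrors the JSON consumed by
# ScriptsMenu.build_from_configuration (types: "action", "menu", "separator").
scripts_menu_definition = [
    {
        "type": "action",
        "title": "Say Hi",
        "command": "print('hi from the scripts menu')",
        "sourcetype": "python",          # assumed valid sourcetype
        "tooltip": "Minimal example action",
    },
    {"type": "separator"},
    {
        "type": "menu",
        "title": "Modeling",
        "items": [
            {
                "type": "action",
                "title": "Print Selection",
                "command": "import maya.cmds as cmds; print(cmds.ls(selection=True))",
                "sourcetype": "python",
                "tags": ["selection", "modeling"],
            },
        ],
    },
]
```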
BIN website/docs/assets/maya-admin_model_name_validator.png Normal file (new binary file, 34 KiB, not shown)

BIN website/docs/assets/maya-admin_scriptsmenu.png Normal file (new binary file, 16 KiB, not shown)

BIN website/docs/project_settings/assets/standalone_creators.png Normal file (new binary file, 14 KiB, not shown)
98 website/docs/project_settings/settings_project_standalone.md Normal file

@@ -0,0 +1,98 @@
---
id: settings_project_standalone
title: Project Standalone Publisher Settings
sidebar_label: Standalone Publisher
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

Project settings can have project specific values. Each new project uses the studio values defined in the **default** project, but these values can be modified or overridden per project.

:::warning Default studio values
Projects always use the default project values unless they have a [project override](../admin_settings#project-overrides) (orange colour). Any changes in the default project may affect all existing projects.
:::

## Creator Plugins

Contains the list of implemented families to show in the middle menu of Standalone Publisher. Each plugin must contain:
- name
- label
- family
- icon
- default subset(s)
- help (additional short information about the family)

![standalone creators](assets/standalone_creators.png)

## Publish plugins

### Collect Textures

Serves to collect all needed information about workfiles and the textures created from them. Allows publishing of the
main workfile (for example from Mari), additional workfiles (from Substance Painter) and exported textures.

Available configuration:
- Main workfile extension - only a single workfile can be the "main" one
- Support workfile extensions - additional workfiles will be published to the same folder as "main", just under a `resources` subfolder
- Texture extensions - what kind of formats are expected for textures
- Additional families for workfile - should any family ('ftrack', 'review') be added to the published workfile
- Additional families for textures - should any family ('ftrack', 'review') be added to the published textures

#### Naming conventions

The implementation tries to be flexible and cover multiple naming conventions for workfiles and textures.

##### Workfile naming pattern

Provide a regex matching pattern containing regex groups used to parse the workfile name for the needed information (for example,
the build name).

Example:

- pattern: ```^([^.]+)(_[^_.]*)?_v([0-9]{3,}).+```
- with groups: ```["asset", "filler", "version"]```

parses `corridorMain_v001` into three groups:
- asset build (`corridorMain`)
- filler (in this case empty)
- version (`001`)

Advanced example (for texture files):

- pattern: ```^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+```
- with groups: ```["asset", "shader", "version", "channel", "color_space", "udim"]```

parses `corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr` into:
- asset build (`corridorMain`)
- shader (`aluminiumID`)
- version (`001`)
- channel (`baseColor`)
- color_space (`linsRGB`)
- udim (`1001`)
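Not part of the settings themselves: a small sketch of how such a pattern and group list pair up. The `{color_space}` token is a settings placeholder; a permissive `[^_.]+` stands in for it here as an assumption.

```python
import re

# Illustration of pairing the regex pattern with "Group order for regex
# patterns". "{color_space}" is a settings placeholder; "[^_.]+" stands in.
pattern = r"^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+"
pattern = pattern.replace("{color_space}", "[^_.]+")
group_order = ["asset", "shader", "version", "channel", "color_space", "udim"]

match = re.match(pattern, "corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr")
parsed = dict(zip(group_order, match.groups()))
print(parsed)
# {'asset': 'corridorMain', 'shader': 'aluminiumID', 'version': '001',
#  'channel': 'baseColor', 'color_space': 'linsRGB', 'udim': '1001'}
```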
In case of a different naming pattern, additional groups can be added or removed. The number of matching groups (`(...)`) must be the same as the number of items in `Group order for regex patterns`.

##### Workfile group positions

For each matching regex group set in the previous paragraph, its ordinal position is required (in case new groups need to be added, etc.).

The number of groups added here must match the number of parsing groups from `Workfile naming pattern`.

##### Output names

Output names of published workfiles and textures can be configured separately:
- Subset name template for workfile
- Subset name template for textures (implemented keys: ["color_space", "channel", "subset", "shader"])


### Validate Scene Settings

#### Check Frame Range for Extensions

Configure families, file extensions and tasks to validate that the DB setting (frame range) matches the currently published values.

### ExtractThumbnailSP

Plugin responsible for generating thumbnails; configure appropriate values for your version of ffmpeg.
@@ -65,7 +65,8 @@ module.exports = {
       label: "Project Settings",
       items: [
         "project_settings/settings_project_global",
-        "project_settings/settings_project_nuke"
+        "project_settings/settings_project_nuke",
+        "project_settings/settings_project_standalone"
       ],
     },
   ],