Merge pull request #3315 from quadproduction/162-refactor-template-builder-for-maya

commit ac3a8f1e22
Milan Kolar, 2022-07-27 10:01:24 +02:00, committed by GitHub
112 changed files with 4562 additions and 302 deletions


@ -309,7 +309,18 @@
"contributions": [
"code"
]
},
{
"login": "Tilix4",
"name": "Félix David",
"avatar_url": "https://avatars.githubusercontent.com/u/22875539?v=4",
"profile": "http://felixdavid.com/",
"contributions": [
"code",
"doc"
]
}
],
"contributorsPerLine": 7
}
"contributorsPerLine": 7,
"skipCi": true
}


@ -1,17 +1,28 @@
# Changelog
## [3.10.1-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.11.0-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.10.0...HEAD)
### 📖 Documentation
- doc: adding royal render and multiverse to the web site [\#3285](https://github.com/pypeclub/OpenPype/pull/3285)
**🚀 Enhancements**
- General: Updated windows oiio tool [\#3268](https://github.com/pypeclub/OpenPype/pull/3268)
- Unreal: add support for skeletalMesh and staticMesh to loaders [\#3267](https://github.com/pypeclub/OpenPype/pull/3267)
- Maya: reference loaders could store placeholder in referenced url [\#3264](https://github.com/pypeclub/OpenPype/pull/3264)
- TVPaint: Init file for TVPaint worker also handle guideline images [\#3250](https://github.com/pypeclub/OpenPype/pull/3250)
- Nuke: Change default icon path in settings [\#3247](https://github.com/pypeclub/OpenPype/pull/3247)
**🐛 Bug fixes**
- Global: extract review slate issues [\#3286](https://github.com/pypeclub/OpenPype/pull/3286)
- Webpublisher: return only active projects in ProjectsEndpoint [\#3281](https://github.com/pypeclub/OpenPype/pull/3281)
- Hiero: add support for task tags 3.10.x [\#3279](https://github.com/pypeclub/OpenPype/pull/3279)
- General: Fix Oiio tool path resolving [\#3278](https://github.com/pypeclub/OpenPype/pull/3278)
- Maya: Fix udim support for e.g. uppercase \<UDIM\> tag [\#3266](https://github.com/pypeclub/OpenPype/pull/3266)
- Nuke: bake reformat was failing on string type [\#3261](https://github.com/pypeclub/OpenPype/pull/3261)
- Maya: hotfix Pxr multitexture in looks [\#3260](https://github.com/pypeclub/OpenPype/pull/3260)
- Unreal: Fix Camera Loading if Layout is missing [\#3255](https://github.com/pypeclub/OpenPype/pull/3255)
@ -19,10 +30,14 @@
- Unreal: Fixed Render creation in UE5 [\#3239](https://github.com/pypeclub/OpenPype/pull/3239)
- Unreal: Fixed Camera loading in UE5 [\#3238](https://github.com/pypeclub/OpenPype/pull/3238)
- Flame: debugging [\#3224](https://github.com/pypeclub/OpenPype/pull/3224)
- add silent audio to slate [\#3162](https://github.com/pypeclub/OpenPype/pull/3162)
**Merged pull requests:**
- Maya: better handling of legacy review subsets names [\#3269](https://github.com/pypeclub/OpenPype/pull/3269)
- Deadline: publishing of animation and pointcache on a farm [\#3225](https://github.com/pypeclub/OpenPype/pull/3225)
- Nuke: add pointcache and animation to loader [\#3186](https://github.com/pypeclub/OpenPype/pull/3186)
- Add a gizmo menu to nuke [\#3172](https://github.com/pypeclub/OpenPype/pull/3172)
## [3.10.0](https://github.com/pypeclub/OpenPype/tree/3.10.0) (2022-05-26)
@ -32,7 +47,6 @@
- General: OpenPype modules publish plugins are registered in host [\#3180](https://github.com/pypeclub/OpenPype/pull/3180)
- General: Creator plugins from addons can be registered [\#3179](https://github.com/pypeclub/OpenPype/pull/3179)
- Ftrack: Single image reviewable [\#3157](https://github.com/pypeclub/OpenPype/pull/3157)
**🚀 Enhancements**
@ -45,13 +59,6 @@
- Maya: added clean\_import option to Import loader [\#3181](https://github.com/pypeclub/OpenPype/pull/3181)
- Add the scripts menu definition to nuke [\#3168](https://github.com/pypeclub/OpenPype/pull/3168)
- Maya: add maya 2023 to default applications [\#3167](https://github.com/pypeclub/OpenPype/pull/3167)
- Compressed bgeo publishing in SAP and Houdini loader [\#3153](https://github.com/pypeclub/OpenPype/pull/3153)
- General: Add 'dataclasses' to required python modules [\#3149](https://github.com/pypeclub/OpenPype/pull/3149)
- Hooks: Tweak logging grammar [\#3147](https://github.com/pypeclub/OpenPype/pull/3147)
- Nuke: settings for reformat node in CreateWriteRender node [\#3143](https://github.com/pypeclub/OpenPype/pull/3143)
- Houdini: Add loader for alembic through Alembic Archive node [\#3140](https://github.com/pypeclub/OpenPype/pull/3140)
- Publisher: UI Modifications and fixes [\#3139](https://github.com/pypeclub/OpenPype/pull/3139)
- General: Simplified OP modules/addons import [\#3137](https://github.com/pypeclub/OpenPype/pull/3137)
**🐛 Bug fixes**
@ -75,14 +82,6 @@
- General: Oiio conversion for ffmpeg checks for invalid characters [\#3166](https://github.com/pypeclub/OpenPype/pull/3166)
- Fix for attaching render to subset [\#3164](https://github.com/pypeclub/OpenPype/pull/3164)
- Harmony: fixed missing task name in render instance [\#3163](https://github.com/pypeclub/OpenPype/pull/3163)
- add silent audio to slate [\#3162](https://github.com/pypeclub/OpenPype/pull/3162)
- Ftrack: Action delete old versions formatting works [\#3152](https://github.com/pypeclub/OpenPype/pull/3152)
- nuke: adding extract thumbnail settings [\#3148](https://github.com/pypeclub/OpenPype/pull/3148)
- Deadline: fix the output directory [\#3144](https://github.com/pypeclub/OpenPype/pull/3144)
- General: New Session schema [\#3141](https://github.com/pypeclub/OpenPype/pull/3141)
- General: Missing version on headless mode crash properly [\#3136](https://github.com/pypeclub/OpenPype/pull/3136)
- TVPaint: Composite layers in reversed order [\#3135](https://github.com/pypeclub/OpenPype/pull/3135)
- TVPaint: Composite layers in reversed order [\#3134](https://github.com/pypeclub/OpenPype/pull/3134)
**🔀 Refactored code**
@ -122,23 +121,6 @@
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.6...3.9.7)
**🆕 New features**
- Ftrack: Single image reviewable [\#3158](https://github.com/pypeclub/OpenPype/pull/3158)
**🚀 Enhancements**
- Deadline output dir issue to 3.9x [\#3155](https://github.com/pypeclub/OpenPype/pull/3155)
- nuke: removing redundant code from startup [\#3142](https://github.com/pypeclub/OpenPype/pull/3142)
**🐛 Bug fixes**
- Ftrack: Action delete old versions formatting works [\#3154](https://github.com/pypeclub/OpenPype/pull/3154)
**Merged pull requests:**
- Webpublisher: replace space by underscore in subset names [\#3159](https://github.com/pypeclub/OpenPype/pull/3159)
## [3.9.6](https://github.com/pypeclub/OpenPype/tree/3.9.6) (2022-05-03)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.9.5...3.9.6)


@ -1,6 +1,6 @@
<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->
[![All Contributors](https://img.shields.io/badge/all_contributors-26-orange.svg?style=flat-square)](#contributors-)
[![All Contributors](https://img.shields.io/badge/all_contributors-27-orange.svg?style=flat-square)](#contributors-)
<!-- ALL-CONTRIBUTORS-BADGE:END -->
OpenPype
====
@ -328,6 +328,7 @@ Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/d
<td align="center"><a href="https://github.com/Malthaldar"><img src="https://avatars.githubusercontent.com/u/33671694?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Malthaldar</b></sub></a><br /><a href="https://github.com/pypeclub/OpenPype/commits?author=Malthaldar" title="Code">💻</a></td>
<td align="center"><a href="http://www.svenneve.com/"><img src="https://avatars.githubusercontent.com/u/2472863?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Sven Neve</b></sub></a><br /><a href="https://github.com/pypeclub/OpenPype/commits?author=svenneve" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/zafrs"><img src="https://avatars.githubusercontent.com/u/26890002?v=4?s=100" width="100px;" alt=""/><br /><sub><b>zafrs</b></sub></a><br /><a href="https://github.com/pypeclub/OpenPype/commits?author=zafrs" title="Code">💻</a></td>
<td align="center"><a href="http://felixdavid.com/"><img src="https://avatars.githubusercontent.com/u/22875539?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Félix David</b></sub></a><br /><a href="https://github.com/pypeclub/OpenPype/commits?author=Tilix4" title="Code">💻</a> <a href="https://github.com/pypeclub/OpenPype/commits?author=Tilix4" title="Documentation">📖</a></td>
</tr>
</table>


@ -44,6 +44,7 @@ from . import resources
from .plugin import (
Extractor,
Integrator,
ValidatePipelineOrder,
ValidateContentsOrder,
@ -86,6 +87,7 @@ __all__ = [
# plugin classes
"Extractor",
"Integrator",
# ordering
"ValidatePipelineOrder",
"ValidateContentsOrder",


@ -29,6 +29,7 @@ from .lib import (
get_current_sequence,
get_timeline_selection,
get_current_track,
get_track_item_tags,
get_track_item_pype_tag,
set_track_item_pype_tag,
get_track_item_pype_data,
@ -83,6 +84,7 @@ __all__ = [
"get_current_sequence",
"get_timeline_selection",
"get_current_track",
"get_track_item_tags",
"get_track_item_pype_tag",
"set_track_item_pype_tag",
"get_track_item_pype_data",


@ -274,6 +274,31 @@ def _validate_all_atrributes(
])
def get_track_item_tags(track_item):
"""
Get track item tags excluded openpype tag
Attributes:
trackItem (hiero.core.TrackItem): hiero object
Returns:
hiero.core.Tag: hierarchy, orig clip attributes
"""
returning_tag_data = []
# get all tags from track item
_tags = track_item.tags()
if not _tags:
return []
# collect all tags which are not openpype tag
returning_tag_data.extend(
tag for tag in _tags
if tag.name() != self.pype_tag_name
)
return returning_tag_data
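
For context, a minimal usage sketch of the new helper (assumes a running hiero session; get_timeline_selection is exported by the same module per the import hunk above, and the module path is assumed from the surrounding hunks):

    from openpype.hosts.hiero.api.lib import (
        get_timeline_selection,
        get_track_item_tags,
    )

    for track_item in get_timeline_selection():
        # tags returned here exclude the OpenPype metadata tag
        for tag in get_track_item_tags(track_item):
            print(track_item.name(), tag.name())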
def get_track_item_pype_tag(track_item):
"""
Get pype track item tag created by creator or loader plugin.


@ -4,16 +4,16 @@ from pyblish import api
class CollectClipTagTasks(api.InstancePlugin):
"""Collect Tags from selected track items."""
order = api.CollectorOrder
order = api.CollectorOrder - 0.077
label = "Collect Tag Tasks"
hosts = ["hiero"]
families = ['clip']
families = ["shot"]
def process(self, instance):
# gets tags
tags = instance.data["tags"]
tasks = dict()
tasks = {}
for tag in tags:
t_metadata = dict(tag.metadata())
t_family = t_metadata.get("tag.family", "")


@ -106,7 +106,10 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
# clip's effect
"clipEffectItems": subtracks,
"clipAnnotations": annotations
"clipAnnotations": annotations,
# add all additional tags
"tags": phiero.get_track_item_tags(track_item)
})
# otio clip data


@ -1737,8 +1737,11 @@ def apply_shaders(relationships, shadernodes, nodes):
log.warning("No nodes found for shading engine "
"'{0}'".format(id_shading_engines[0]))
continue
try:
cmds.sets(filtered_nodes, forceElement=id_shading_engines[0])
except RuntimeError as rte:
log.error("Error during shader assignment: {}".format(rte))
cmds.sets(filtered_nodes, forceElement=id_shading_engines[0])
# endregion
apply_attributes(attributes, nodes_by_id)


@ -0,0 +1,250 @@
import json
from collections import OrderedDict
import maya.cmds as cmds
import qargparse
from openpype.tools.utils.widgets import OptionDialog
from .lib import get_main_window, imprint
# To change as enum
build_types = ["context_asset", "linked_asset", "all_assets"]
def get_placeholder_attributes(node):
return {
attr: cmds.getAttr("{}.{}".format(node, attr))
for attr in cmds.listAttr(node, userDefined=True)}
def delete_placeholder_attributes(node):
    '''
    Delete all user-defined placeholder attributes from a node
    '''
extra_attributes = get_placeholder_attributes(node)
for attribute in extra_attributes:
cmds.deleteAttr(node + '.' + attribute)
def create_placeholder():
args = placeholder_window()
if not args:
return # operation canceled, no locator created
# custom arg parse to force empty data query
# and still imprint them on placeholder
# and getting items when arg is of type Enumerator
options = create_options(args)
# create placeholder name dynamically from args and options
placeholder_name = create_placeholder_name(args, options)
    selection = cmds.ls(selection=True)
    placeholder = cmds.spaceLocator(name=placeholder_name)[0]
    placeholder_full_name = placeholder
    if selection:
        cmds.parent(placeholder, selection[0])
        # get the long name of the placeholder (including parent groups);
        # only resolved when something is selected, otherwise selection[0]
        # would raise an IndexError
        placeholder_full_name = (
            cmds.ls(selection[0], long=True)[0]
            + '|' + placeholder.replace('|', '')
        )
imprint(placeholder_full_name, options)
    # Some tweaks because imprint forces enums to their default value,
    # so we read the args back and force them onto the attributes
imprint_enum(placeholder_full_name, args)
# Add helper attributes to keep placeholder info
cmds.addAttr(
placeholder_full_name,
longName="parent",
hidden=True,
dataType="string"
)
cmds.addAttr(
placeholder_full_name,
longName="index",
hidden=True,
attributeType="short",
defaultValue=-1
)
cmds.setAttr(placeholder_full_name + '.parent', "", type="string")
def create_placeholder_name(args, options):
placeholder_builder_type = [
arg.read() for arg in args if 'builder_type' in str(arg)
][0]
placeholder_family = options['family']
placeholder_name = placeholder_builder_type.split('_')
    # add family if any
if placeholder_family:
placeholder_name.insert(1, placeholder_family)
# add loader arguments if any
if options['loader_args']:
pos = 2
loader_args = options['loader_args'].replace('\'', '\"')
loader_args = json.loads(loader_args)
        values = list(loader_args.values())
        for i, value in enumerate(values):
            placeholder_name.insert(i + pos, value)
placeholder_name = '_'.join(placeholder_name)
return placeholder_name.capitalize()
def update_placeholder():
placeholder = cmds.ls(selection=True)
if len(placeholder) == 0:
raise ValueError("No node selected")
if len(placeholder) > 1:
raise ValueError("Too many selected nodes")
placeholder = placeholder[0]
args = placeholder_window(get_placeholder_attributes(placeholder))
if not args:
return # operation canceled
# delete placeholder attributes
delete_placeholder_attributes(placeholder)
options = create_options(args)
imprint(placeholder, options)
imprint_enum(placeholder, args)
cmds.addAttr(
placeholder,
longName="parent",
hidden=True,
dataType="string"
)
cmds.addAttr(
placeholder,
longName="index",
hidden=True,
attributeType="short",
defaultValue=-1
)
cmds.setAttr(placeholder + '.parent', '', type="string")
def create_options(args):
options = OrderedDict()
for arg in args:
if not type(arg) == qargparse.Separator:
options[str(arg)] = arg._data.get("items") or arg.read()
return options
def imprint_enum(placeholder, args):
"""
Imprint method doesn't act properly with enums.
    Replacing the functionality with this for now
"""
enum_values = {str(arg): arg.read()
for arg in args if arg._data.get("items")}
string_to_value_enum_table = {
build: i for i, build
in enumerate(build_types)}
for key, value in enum_values.items():
cmds.setAttr(
placeholder + "." + key,
string_to_value_enum_table[value])
def placeholder_window(options=None):
options = options or dict()
dialog = OptionDialog(parent=get_main_window())
dialog.setWindowTitle("Create Placeholder")
args = [
qargparse.Separator("Main attributes"),
qargparse.Enum(
"builder_type",
label="Asset Builder Type",
default=options.get("builder_type", 0),
items=build_types,
help="""Asset Builder Type
Builder type describe what template loader will look for.
context_asset : Template loader will look for subsets of
current context asset (Asset bob will find asset)
linked_asset : Template loader will look for assets linked
to current context asset.
Linked asset are looked in avalon database under field "inputLinks"
"""
),
qargparse.String(
"family",
default=options.get("family", ""),
label="OpenPype Family",
placeholder="ex: model, look ..."),
qargparse.String(
"representation",
default=options.get("representation", ""),
label="OpenPype Representation",
placeholder="ex: ma, abc ..."),
qargparse.String(
"loader",
default=options.get("loader", ""),
label="Loader",
placeholder="ex: ReferenceLoader, LightLoader ...",
help="""Loader
Defines which OpenPype loader will be used to load assets.
Usable loaders depend on the current host's loader list.
Field is case sensitive.
"""),
qargparse.String(
"loader_args",
default=options.get("loader_args", ""),
label="Loader Arguments",
placeholder='ex: {"camera":"persp", "lights":True}',
help="""Loader
Defines a dictionnary of arguments used to load assets.
Useable arguments depend on current placeholder Loader.
Field should be a valid python dict. Anything else will be ignored.
"""),
qargparse.Integer(
"order",
default=options.get("order", 0),
min=0,
max=999,
label="Order",
placeholder="ex: 0, 100 ... (smallest order loaded first)",
help="""Order
Order defines asset loading priority (0 to 999)
Priority rule is : "lowest is first to load"."""),
qargparse.Separator(
"Optional attributes"),
qargparse.String(
"asset",
default=options.get("asset", ""),
label="Asset filter",
placeholder="regex filtering by asset name",
help="Filtering assets by matching field regex to asset's name"),
qargparse.String(
"subset",
default=options.get("subset", ""),
label="Subset filter",
placeholder="regex filtering by subset name",
help="Filtering assets by matching field regex to subset's name"),
qargparse.String(
"hierarchy",
default=options.get("hierarchy", ""),
label="Hierarchy filter",
placeholder="regex filtering by asset's hierarchy",
help="Filtering assets by matching field asset's hierarchy")
]
dialog.create(args)
if not dialog.exec_():
return None
return args
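
A worked example of the naming logic in create_placeholder_name (all input values hypothetical):

    # builder_type "context_asset", family "model",
    # loader_args '{"camera": "persp"}'
    parts = "context_asset".split("_")    # ["context", "asset"]
    parts.insert(1, "model")              # family goes to index 1
    parts.insert(2, "persp")              # loader arg values from index 2 on
    print("_".join(parts).capitalize())   # "Context_model_persp_asset"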


@ -7,12 +7,19 @@ import maya.utils
import maya.cmds as cmds
from openpype.api import BuildWorkfile
from openpype.lib.build_template import (
build_workfile_template,
update_workfile_template
)
from openpype.settings import get_project_settings
from openpype.pipeline import legacy_io
from openpype.tools.utils import host_tools
from openpype.hosts.maya.api import lib
from .lib import get_main_window, IS_HEADLESS
from .commands import reset_frame_range
from .lib_template_builder import create_placeholder, update_placeholder
log = logging.getLogger(__name__)
@ -139,6 +146,34 @@ def install():
parent_widget
)
)
builder_menu = cmds.menuItem(
"Template Builder",
subMenu=True,
tearOff=True,
parent=MENU_NAME
)
cmds.menuItem(
"Create Placeholder",
parent=builder_menu,
command=lambda *args: create_placeholder()
)
cmds.menuItem(
"Update Placeholder",
parent=builder_menu,
command=lambda *args: update_placeholder()
)
cmds.menuItem(
"Build Workfile from template",
parent=builder_menu,
command=build_workfile_template
)
cmds.menuItem(
"Update Workfile from template",
parent=builder_menu,
command=update_workfile_template
)
cmds.setParent(MENU_NAME, menu=True)
def add_scripts_menu():


@ -66,13 +66,23 @@ def install():
log.info("Installing callbacks ... ")
register_event_callback("init", on_init)
    # Callbacks below are not required for headless mode; the `init` callback,
    # however, is important for loading referenced Alembics correctly at render time.
if os.environ.get("HEADLESS_PUBLISH"):
# Maya launched on farm, lib.IS_HEADLESS might be triggered locally too
# target "farm" == rendering on farm, expects OPENPYPE_PUBLISH_DATA
# target "remote" == remote execution
print("Registering pyblish target: remote")
pyblish.api.register_target("remote")
return
if lib.IS_HEADLESS:
log.info(("Running in headless mode, skipping Maya "
"save/open/new callback installation.."))
return
print("Registering pyblish target: local")
pyblish.api.register_target("local")
_set_project()
_register_callbacks()


@ -10,7 +10,8 @@ from openpype.pipeline import (
get_representation_path,
AVALON_CONTAINER_ID,
)
from openpype.api import Anatomy
from openpype.settings import get_project_settings
from .pipeline import containerise
from . import lib
@ -230,6 +231,10 @@ class ReferenceLoader(Loader):
self.log.debug("No alembic nodes found in {}".format(members))
try:
path = self.prepare_root_value(path,
representation["context"]
["project"]
["code"])
content = cmds.file(path,
loadReference=reference_node,
type=file_type,
@ -319,6 +324,29 @@ class ReferenceLoader(Loader):
except RuntimeError:
pass
def prepare_root_value(self, file_url, project_name):
"""Replace root value with env var placeholder.
Use ${OPENPYPE_ROOT_WORK} (or any other root) instead of proper root
value when storing referenced url into a workfile.
Useful for remote workflows with SiteSync.
        Args:
            file_url (str): url of the referenced file
            project_name (str): project name used to query settings
        Returns:
            (str): url with the root replaced by an env var placeholder
"""
settings = get_project_settings(project_name)
use_env_var_as_root = (settings["maya"]
["maya-dirmap"]
["use_env_var_as_root"])
if use_env_var_as_root:
anatomy = Anatomy(project_name)
file_url = anatomy.replace_root_with_env_key(file_url, '${{{}}}')
return file_url
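
An illustrative call, assuming a project with use_env_var_as_root enabled in the maya-dirmap settings (paths and project code are made up):

    # loader.prepare_root_value(
    #     "P:/projects/MYPROJ/assets/bob/work/bob_model.ma", "MYPROJ")
    # -> "${OPENPYPE_ROOT_WORK}/MYPROJ/assets/bob/work/bob_model.ma"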
@staticmethod
def _organize_containers(nodes, container):
# type: (list, str) -> None


@ -0,0 +1,246 @@
from maya import cmds
from openpype.pipeline import legacy_io
from openpype.lib.abstract_template_loader import (
AbstractPlaceholder,
AbstractTemplateLoader
)
from openpype.lib.build_template_exceptions import TemplateAlreadyImported
PLACEHOLDER_SET = 'PLACEHOLDERS_SET'
class MayaTemplateLoader(AbstractTemplateLoader):
"""Concrete implementation of AbstractTemplateLoader for maya
"""
def import_template(self, path):
"""Import template into current scene.
Block if a template is already loaded.
Args:
path (str): A path to current template (usually given by
get_template_path implementation)
Returns:
            bool: Whether the template was successfully imported or not
"""
if cmds.objExists(PLACEHOLDER_SET):
raise TemplateAlreadyImported(
"Build template already loaded\n"
"Clean scene if needed (File > New Scene)")
cmds.sets(name=PLACEHOLDER_SET, empty=True)
self.new_nodes = cmds.file(path, i=True, returnNewNodes=True)
cmds.setAttr(PLACEHOLDER_SET + '.hiddenInOutliner', True)
        # `object_set` avoids shadowing the built-in `set`
        for object_set in cmds.listSets(allSets=True):
            if (cmds.objExists(object_set) and
                    cmds.attributeQuery('id', node=object_set, exists=True) and
                    cmds.getAttr(
                        object_set + '.id') == 'pyblish.avalon.instance'):
                if cmds.attributeQuery('asset', node=object_set, exists=True):
                    cmds.setAttr(
                        object_set + '.asset',
                        legacy_io.Session['AVALON_ASSET'], type='string'
                    )
return True
def template_already_imported(self, err_msg):
clearButton = "Clear scene and build"
updateButton = "Update template"
abortButton = "Abort"
title = "Scene already builded"
message = (
"It's seems a template was already build for this scene.\n"
"Error message reveived :\n\n\"{}\"".format(err_msg))
buttons = [clearButton, updateButton, abortButton]
defaultButton = clearButton
cancelButton = abortButton
dismissString = abortButton
answer = cmds.confirmDialog(
t=title,
m=message,
b=buttons,
db=defaultButton,
cb=cancelButton,
ds=dismissString)
if answer == clearButton:
cmds.file(newFile=True, force=True)
self.import_template(self.template_path)
self.populate_template()
elif answer == updateButton:
self.update_missing_containers()
elif answer == abortButton:
return
@staticmethod
def get_template_nodes():
attributes = cmds.ls('*.builder_type', long=True)
return [attribute.rpartition('.')[0] for attribute in attributes]
def get_loaded_containers_by_id(self):
try:
containers = cmds.sets("AVALON_CONTAINERS", q=True)
except ValueError:
return None
return [
cmds.getAttr(container + '.representation')
for container in containers]
class MayaPlaceholder(AbstractPlaceholder):
"""Concrete implementation of AbstractPlaceholder for maya
"""
optional_attributes = {'asset', 'subset', 'hierarchy'}
def get_data(self, node):
user_data = dict()
for attr in self.attributes.union(self.optional_attributes):
attribute_name = '{}.{}'.format(node, attr)
if not cmds.attributeQuery(attr, node=node, exists=True):
print("{} not found".format(attribute_name))
continue
user_data[attr] = cmds.getAttr(
attribute_name,
asString=True)
user_data['parent'] = (
cmds.getAttr(node + '.parent', asString=True)
or node.rpartition('|')[0] or "")
user_data['node'] = node
if user_data['parent']:
siblings = cmds.listRelatives(user_data['parent'], children=True)
else:
siblings = cmds.ls(assemblies=True)
node_shortname = user_data['node'].rpartition('|')[2]
current_index = cmds.getAttr(node + '.index', asString=True)
user_data['index'] = (
current_index if current_index >= 0
else siblings.index(node_shortname))
self.data = user_data
def parent_in_hierarchy(self, containers):
"""Parent loaded container to placeholder's parent
ie : Set loaded content as placeholder's sibling
Args:
containers (String): Placeholder loaded containers
"""
if not containers:
return
roots = cmds.sets(containers, q=True)
nodes_to_parent = []
for root in roots:
if root.endswith("_RN"):
refRoot = cmds.referenceQuery(root, n=True)[0]
refRoot = cmds.listRelatives(refRoot, parent=True) or [refRoot]
nodes_to_parent.extend(refRoot)
elif root in cmds.listSets(allSets=True):
if not cmds.sets(root, q=True):
return
else:
continue
else:
nodes_to_parent.append(root)
if self.data['parent']:
cmds.parent(nodes_to_parent, self.data['parent'])
# Move loaded nodes to correct index in outliner hierarchy
placeholder_node = self.data['node']
placeholder_form = cmds.xform(
placeholder_node,
q=True,
matrix=True,
worldSpace=True
)
for node in set(nodes_to_parent):
cmds.reorder(node, front=True)
cmds.reorder(node, relative=self.data['index'])
cmds.xform(node, matrix=placeholder_form, ws=True)
holding_sets = cmds.listSets(object=placeholder_node)
if not holding_sets:
return
for holding_set in holding_sets:
cmds.sets(roots, forceElement=holding_set)
def clean(self):
"""Hide placeholder, parent them to root
add them to placeholder set and register placeholder's parent
to keep placeholder info available for future use
"""
node = self.data['node']
if self.data['parent']:
cmds.setAttr(node + '.parent', self.data['parent'], type='string')
if cmds.getAttr(node + '.index') < 0:
cmds.setAttr(node + '.index', self.data['index'])
holding_sets = cmds.listSets(object=node)
if holding_sets:
            # `holding_set` avoids shadowing the built-in `set`
            for holding_set in holding_sets:
                cmds.sets(node, remove=holding_set)
if cmds.listRelatives(node, p=True):
node = cmds.parent(node, world=True)[0]
cmds.sets(node, addElement=PLACEHOLDER_SET)
cmds.hide(node)
cmds.setAttr(node + '.hiddenInOutliner', True)
def convert_to_db_filters(self, current_asset, linked_asset):
if self.data['builder_type'] == "context_asset":
return [
{
"type": "representation",
"context.asset": {
"$eq": current_asset,
"$regex": self.data['asset']
},
"context.subset": {"$regex": self.data['subset']},
"context.hierarchy": {"$regex": self.data['hierarchy']},
"context.representation": self.data['representation'],
"context.family": self.data['family'],
}
]
elif self.data['builder_type'] == "linked_asset":
return [
{
"type": "representation",
"context.asset": {
"$eq": asset_name,
"$regex": self.data['asset']
},
"context.subset": {"$regex": self.data['subset']},
"context.hierarchy": {"$regex": self.data['hierarchy']},
"context.representation": self.data['representation'],
"context.family": self.data['family'],
} for asset_name in linked_asset
]
else:
return [
{
"type": "representation",
"context.asset": {"$regex": self.data['asset']},
"context.subset": {"$regex": self.data['subset']},
"context.hierarchy": {"$regex": self.data['hierarchy']},
"context.representation": self.data['representation'],
"context.family": self.data['family'],
}
]
def err_message(self):
return (
"Error while trying to load a representation.\n"
"Either the subset wasn't published or the template is malformed."
"\n\n"
"Builder was looking for :\n{attributes}".format(
attributes="\n".join([
"{}: {}".format(key.title(), value)
for key, value in self.data.items()]
)
)
)
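
For reference, the filter convert_to_db_filters would produce for a hypothetical context placeholder (asset "bob", representation "ma", family "model"; empty regex filters match everything):

    [{
        "type": "representation",
        "context.asset": {"$eq": "bob", "$regex": ""},
        "context.subset": {"$regex": ""},
        "context.hierarchy": {"$regex": ""},
        "context.representation": "ma",
        "context.family": "model",
    }]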


@ -38,3 +38,7 @@ class CreateAnimation(plugin.Creator):
# Default to exporting world-space
self.data["worldSpace"] = True
# Default to not send to farm.
self.data["farm"] = False
self.data["priority"] = 50


@ -28,3 +28,7 @@ class CreatePointCache(plugin.Creator):
# Add options for custom attributes
self.data["attr"] = ""
self.data["attrPrefix"] = ""
# Default to not send to farm.
self.data["farm"] = False
self.data["priority"] = 50


@ -35,8 +35,9 @@ class AbcLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
# hero_001 (abc)
# asset_counter{optional}
nodes = cmds.file(self.fname,
file_url = self.prepare_root_value(self.fname,
context["project"]["code"])
nodes = cmds.file(file_url,
namespace=namespace,
sharedReferenceFile=False,
groupReference=True,


@ -64,9 +64,11 @@ class AssProxyLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
path = os.path.join(publish_folder, filename)
proxyPath = proxyPath_base + ".ma"
self.log.info
nodes = cmds.file(proxyPath,
file_url = self.prepare_root_value(proxyPath,
context["project"]["code"])
nodes = cmds.file(file_url,
namespace=namespace,
reference=True,
returnNewNodes=True,
@ -123,7 +125,11 @@ class AssProxyLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
assert os.path.exists(proxyPath), "%s does not exist." % proxyPath
try:
content = cmds.file(proxyPath,
file_url = self.prepare_root_value(proxyPath,
representation["context"]
["project"]
["code"])
content = cmds.file(file_url,
loadReference=reference_node,
type="mayaAscii",
returnNewNodes=True)


@ -31,7 +31,9 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
import maya.cmds as cmds
with lib.maintained_selection():
nodes = cmds.file(self.fname,
file_url = self.prepare_root_value(self.fname,
context["project"]["code"])
nodes = cmds.file(file_url,
namespace=namespace,
reference=True,
returnNewNodes=True)


@ -51,7 +51,9 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
with maintained_selection():
cmds.loadPlugin("AbcImport.mll", quiet=True)
nodes = cmds.file(self.fname,
file_url = self.prepare_root_value(self.fname,
context["project"]["code"])
nodes = cmds.file(file_url,
namespace=namespace,
sharedReferenceFile=False,
reference=True,


@ -53,7 +53,9 @@ class YetiRigLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
# load rig
with lib.maintained_selection():
nodes = cmds.file(self.fname,
file_url = self.prepare_root_value(self.fname,
context["project"]["code"])
nodes = cmds.file(file_url,
namespace=namespace,
reference=True,
returnNewNodes=True,


@ -55,3 +55,6 @@ class CollectAnimationOutputGeometry(pyblish.api.InstancePlugin):
# Store data in the instance for the validator
instance.data["out_hierarchy"] = hierarchy
if instance.data.get("farm"):
instance.data["families"].append("publish.farm")


@ -109,16 +109,18 @@ def node_uses_image_sequence(node, node_path):
"""
# useFrameExtension indicates an explicit image sequence
# The following tokens imply a sequence
patterns = ["<udim>", "<tile>", "<uvtile>",
"u<u>_v<v>", "<frame0", "<f4>"]
try:
use_frame_extension = cmds.getAttr('%s.useFrameExtension' % node)
except ValueError:
use_frame_extension = False
if use_frame_extension:
return True
return (use_frame_extension or
any(pattern in node_path for pattern in patterns))
# The following tokens imply a sequence
patterns = ["<udim>", "<tile>", "<uvtile>",
"u<u>_v<v>", "<frame0", "<f4>"]
node_path_lowered = node_path.lower()
return any(pattern in node_path_lowered for pattern in patterns)
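
A quick illustration of the token check above (paths are hypothetical; lowering the path is what makes the uppercase <UDIM> tag match):

    patterns = ["<udim>", "<tile>", "<uvtile>",
                "u<u>_v<v>", "<frame0", "<f4>"]
    print(any(p in "/tex/diffuse.<UDIM>.tif".lower() for p in patterns))  # True
    print(any(p in "/tex/diffuse.tif".lower() for p in patterns))         # False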
def seq_to_glob(path):


@ -0,0 +1,14 @@
import pyblish.api
class CollectPointcache(pyblish.api.InstancePlugin):
"""Collect pointcache data for instance."""
order = pyblish.api.CollectorOrder + 0.4
families = ["pointcache"]
label = "Collect Pointcache"
hosts = ["maya"]
def process(self, instance):
if instance.data.get("farm"):
instance.data["families"].append("publish.farm")


@ -340,10 +340,10 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
"expectedFiles": full_exp_files,
"publishRenderMetadataFolder": common_publish_meta_path,
"resolutionWidth": lib.get_attr_in_layer(
"defaultResolution.height", layer=layer_name
"defaultResolution.width", layer=layer_name
),
"resolutionHeight": lib.get_attr_in_layer(
"defaultResolution.width", layer=layer_name
"defaultResolution.height", layer=layer_name
),
"pixelAspect": lib.get_attr_in_layer(
"defaultResolution.pixelAspect", layer=layer_name


@ -77,15 +77,14 @@ class CollectReview(pyblish.api.InstancePlugin):
instance.data['remove'] = True
            self.log.debug('instance data {}'.format(instance.data))
else:
if self.legacy:
instance.data['subset'] = task + 'Review'
else:
subset = "{}{}{}".format(
task,
instance.data["subset"][0].upper(),
instance.data["subset"][1:]
)
instance.data['subset'] = subset
legacy_subset_name = task + 'Review'
asset_doc_id = instance.context.data['assetEntity']["_id"]
subsets = legacy_io.find({"type": "subset",
"name": legacy_subset_name,
"parent": asset_doc_id}).distinct("_id")
if len(list(subsets)) > 0:
self.log.debug("Existing subsets found, keep legacy name.")
instance.data['subset'] = legacy_subset_name
instance.data['review_camera'] = camera
instance.data['frameStartFtrack'] = \


@ -16,13 +16,19 @@ class ExtractAnimation(openpype.api.Extractor):
Positions and normals, uvs, creases are preserved, but nothing more,
for plain and predictable point caches.
Plugin can run locally or remotely (on a farm - if instance is marked with
"farm" it will be skipped in local processing, but processed on farm)
"""
label = "Extract Animation"
hosts = ["maya"]
families = ["animation"]
targets = ["local", "remote"]
def process(self, instance):
if instance.data.get("farm"):
self.log.debug("Should be processed on farm, skipping.")
return
# Collect the out set nodes
out_sets = [node for node in instance if node.endswith("out_SET")]
@ -89,4 +95,6 @@ class ExtractAnimation(openpype.api.Extractor):
}
instance.data["representations"].append(representation)
instance.context.data["cleanupFullPaths"].append(path)
self.log.info("Extracted {} to {}".format(instance, dirname))


@ -16,6 +16,8 @@ class ExtractAlembic(openpype.api.Extractor):
Positions and normals, uvs, creases are preserved, but nothing more,
for plain and predictable point caches.
Plugin can run locally or remotely (on a farm - if instance is marked with
"farm" it will be skipped in local processing, but processed on farm)
"""
label = "Extract Pointcache (Alembic)"
@ -23,8 +25,12 @@ class ExtractAlembic(openpype.api.Extractor):
families = ["pointcache",
"model",
"vrayproxy"]
targets = ["local", "remote"]
def process(self, instance):
if instance.data.get("farm"):
self.log.debug("Should be processed on farm, skipping.")
return
nodes = instance[:]
@ -92,4 +98,6 @@ class ExtractAlembic(openpype.api.Extractor):
}
instance.data["representations"].append(representation)
instance.context.data["cleanupFullPaths"].append(path)
self.log.info("Extracted {} to {}".format(instance, dirname))


@ -0,0 +1,16 @@
<?xml version="1.0" encoding="UTF-8"?>
<root>
<error id="main">
<title>Errors found</title>
<description>
## Publish process has errors
At least one plugin failed before this plugin, job won't be sent to Deadline for processing before all issues are fixed.
### How to repair?
Check all failing plugins (should be highlighted in red) and fix issues if possible.
</description>
</error>
</root>


@ -0,0 +1,28 @@
<?xml version="1.0" encoding="UTF-8"?>
<root>
<error id="main">
<title>Review subsets not unique</title>
<description>
## Non unique subset name found
Non unique subset names: '{non_unique}'
<detail>
### __Detailed Info__ (optional)
This might happen if you have already published a review subset
for this asset with the legacy name {task}Review.
The legacy name prevents publishing multiple reviews from a
single workfile. A proper review subset name should now also
contain the variant (such as 'Main', 'Default' etc.). That would
result in a completely new subset though, so this situation must
be handled manually.
</detail>
### How to repair?
Legacy subsets must be removed from the OpenPype DB; please ask an
admin to do that, providing the asset and subset names.
</description>
</error>
</root>


@ -30,6 +30,10 @@ class ValidateAnimationContent(pyblish.api.InstancePlugin):
assert 'out_hierarchy' in instance.data, "Missing `out_hierarchy` data"
out_sets = [node for node in instance if node.endswith("out_SET")]
msg = "Couldn't find exactly one out_SET: {0}".format(out_sets)
assert len(out_sets) == 1, msg
# All nodes in the `out_hierarchy` must be among the nodes that are
# in the instance. The nodes in the instance are found from the top
# group, as such this tests whether all nodes are under that top group.


@ -0,0 +1,36 @@
# -*- coding: utf-8 -*-
import collections
import pyblish.api
import openpype.api
from openpype.pipeline import PublishXmlValidationError
class ValidateReviewSubsetUniqueness(pyblish.api.ContextPlugin):
"""Validates that nodes has common root."""
order = openpype.api.ValidateContentsOrder
hosts = ["maya"]
families = ["review"]
label = "Validate Review Subset Unique"
def process(self, context):
subset_names = []
for instance in context:
self.log.info("instance:: {}".format(instance.data))
if instance.data.get('publish'):
subset_names.append(instance.data.get('subset'))
non_unique = \
[item
for item, count in collections.Counter(subset_names).items()
if count > 1]
msg = ("Instance subset names {} are not unique. ".format(non_unique) +
"Ask admin to remove subset from DB for multiple reviews.")
formatting_data = {
"non_unique": ",".join(non_unique)
}
if non_unique:
raise PublishXmlValidationError(self, msg,
formatting_data=formatting_data)


@ -0,0 +1,86 @@
import os
import re
import nuke
from openpype.api import Logger
log = Logger.get_logger(__name__)
class GizmoMenu():
def __init__(self, title, icon=None):
self.toolbar = self._create_toolbar_menu(
title,
icon=icon
)
self._script_actions = []
def _create_toolbar_menu(self, name, icon=None):
nuke_node_menu = nuke.menu("Nodes")
return nuke_node_menu.addMenu(
name,
icon=icon
)
def _make_menu_path(self, path, icon=None):
parent = self.toolbar
for folder in re.split(r"/|\\", path):
if not folder:
continue
existing_menu = parent.findItem(folder)
if existing_menu:
parent = existing_menu
else:
parent = parent.addMenu(folder, icon=icon)
return parent
def build_from_configuration(self, configuration):
for menu in configuration:
# Construct parent path else parent is toolbar
parent = self.toolbar
gizmo_toolbar_path = menu.get("gizmo_toolbar_path")
if gizmo_toolbar_path:
parent = self._make_menu_path(gizmo_toolbar_path)
for item in menu["sub_gizmo_list"]:
assert isinstance(item, dict), "Configuration is wrong!"
if not item.get("title"):
continue
item_type = item.get("sourcetype")
if item_type == ("python" or "file"):
parent.addCommand(
item["title"],
command=str(item["command"]),
icon=item.get("icon"),
shortcut=item.get("hotkey")
)
# add separator
# Special behavior for separators
elif item_type == "separator":
parent.addSeparator()
# add submenu
# items should hold a collection of submenu items (dict)
elif item_type == "menu":
# assert "items" in item, "Menu is missing 'items' key"
parent.addMenu(
item['title'],
icon=item.get('icon')
)
def add_gizmo_path(self, gizmo_paths):
for gizmo_path in gizmo_paths:
if os.path.isdir(gizmo_path):
for folder in os.listdir(gizmo_path):
if os.path.isdir(os.path.join(gizmo_path, folder)):
nuke.pluginAddPath(os.path.join(gizmo_path, folder))
nuke.pluginAddPath(gizmo_path)
else:
log.warning("This path doesn't exist: {}".format(gizmo_path))
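
A hypothetical configuration illustrating the structure build_from_configuration expects (key names follow the code above, values are made up):

    configuration = [{
        "gizmo_toolbar_path": "Keying",
        "sub_gizmo_list": [
            {"sourcetype": "python", "title": "My Gizmo",
             "command": "nuke.createNode('MyGizmo')",
             "icon": "", "hotkey": ""},
            {"sourcetype": "separator", "title": "sep"},
            {"sourcetype": "menu", "title": "More Gizmos", "icon": ""},
        ],
    }]
    # menu = GizmoMenu(title="Studio Gizmos")
    # menu.build_from_configuration(configuration)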


@ -30,6 +30,8 @@ from openpype.pipeline import (
legacy_io,
)
from . import gizmo_menu
from .workio import (
save_file,
open_file
@ -2498,6 +2500,70 @@ def recreate_instance(origin_node, avalon_data=None):
return new_node
def add_scripts_gizmo():
# load configuration of custom menu
project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
platform_name = platform.system().lower()
for gizmo_settings in project_settings["nuke"]["gizmo"]:
gizmo_list_definition = gizmo_settings["gizmo_definition"]
toolbar_name = gizmo_settings["toolbar_menu_name"]
# gizmo_toolbar_path = gizmo_settings["gizmo_toolbar_path"]
gizmo_source_dir = gizmo_settings.get(
"gizmo_source_dir", {}).get(platform_name)
toolbar_icon_path = gizmo_settings.get(
"toolbar_icon_path", {}).get(platform_name)
        if not gizmo_source_dir:
            log.debug("Skipping studio gizmo `{}`, "
                      "no gizmo path found.".format(toolbar_name)
                      )
            # skip only this gizmo definition instead of aborting the loop
            continue
        if not gizmo_list_definition:
            log.debug("Skipping studio gizmo `{}`, "
                      "no definition found.".format(toolbar_name)
                      )
            continue
if toolbar_icon_path:
try:
toolbar_icon_path = toolbar_icon_path.format(**os.environ)
except KeyError as e:
log.error(
"This environment variable doesn't exist: {}".format(e)
)
existing_gizmo_path = []
for source_dir in gizmo_source_dir:
try:
resolve_source_dir = source_dir.format(**os.environ)
except KeyError as e:
log.error(
"This environment variable doesn't exist: {}".format(e)
)
continue
if not os.path.exists(resolve_source_dir):
log.warning(
"The source of gizmo `{}` does not exists".format(
resolve_source_dir
)
)
continue
existing_gizmo_path.append(resolve_source_dir)
# run the launcher for Nuke toolbar
toolbar_menu = gizmo_menu.GizmoMenu(
title=toolbar_name,
icon=toolbar_icon_path
)
# apply configuration
toolbar_menu.add_gizmo_path(existing_gizmo_path)
toolbar_menu.build_from_configuration(gizmo_list_definition)
class NukeDirmap(HostDirmap):
def __init__(self, host_name, project_settings, sync_module, file_name):
"""


@ -8,7 +8,8 @@ from openpype.hosts.nuke.api.lib import (
on_script_load,
check_inventory_versions,
WorkfileSettings,
dirmap_file_name_filter
dirmap_file_name_filter,
add_scripts_gizmo
)
from openpype.settings import get_project_settings
@ -59,3 +60,5 @@ def add_scripts_menu():
add_scripts_menu()
add_scripts_gizmo()


@ -14,7 +14,7 @@ import unreal # noqa
class SkeletalMeshFBXLoader(plugin.Loader):
"""Load Unreal SkeletalMesh from FBX."""
families = ["rig"]
families = ["rig", "skeletalMesh"]
label = "Import FBX Skeletal Mesh"
representations = ["fbx"]
icon = "cube"


@ -14,7 +14,7 @@ import unreal # noqa
class StaticMeshFBXLoader(plugin.Loader):
"""Load Unreal StaticMesh from FBX."""
families = ["model", "unrealStaticMesh"]
families = ["model", "staticMesh"]
label = "Import FBX Static Mesh"
representations = ["fbx"]
icon = "cube"


@ -71,16 +71,12 @@ class ProjectsEndpoint(_RestApiEndpoint):
"""Returns list of dict with project info (id, name)."""
async def get(self) -> Response:
output = []
for project_name in self.dbcon.database.collection_names():
project_doc = self.dbcon.database[project_name].find_one({
"type": "project"
})
if project_doc:
ret_val = {
"id": project_doc["_id"],
"name": project_doc["name"]
}
output.append(ret_val)
for project_doc in self.dbcon.projects():
ret_val = {
"id": project_doc["_id"],
"name": project_doc["name"]
}
output.append(ret_val)
return Response(
status=200,
body=self.resource.encode(output),


@ -136,6 +136,7 @@ from .avalon_context import (
create_workfile_doc,
save_workfile_data_to_doc,
get_workfile_doc,
get_loaders_by_name,
BuildWorkfile,
@ -308,6 +309,7 @@ __all__ = [
"create_workfile_doc",
"save_workfile_data_to_doc",
"get_workfile_doc",
"get_loaders_by_name",
"BuildWorkfile",


@ -0,0 +1,464 @@
import os
import ast
import logging
import traceback
from abc import ABCMeta, abstractmethod
from functools import reduce

import six

from openpype.settings import get_project_settings
from openpype.lib import Anatomy, get_linked_assets, get_loaders_by_name
from openpype.api import PypeLogger as Logger
from openpype.pipeline import legacy_io, load
from openpype.lib.build_template_exceptions import (
    TemplateAlreadyImported,
    TemplateLoadingFailed,
    TemplateProfileNotFound,
    TemplateNotFound
)

log = logging.getLogger(__name__)
def update_representations(entities, entity):
if entity['context']['subset'] not in entities:
entities[entity['context']['subset']] = entity
else:
        current = entities[entity['context']['subset']]
        incoming = entity
        entities[entity['context']['subset']] = max(
            current, incoming,
            key=lambda entity: entity["context"].get("version", -1))
return entities
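
The helper is meant to be folded over a list of representation documents with functools.reduce (as populate_template does below); a sketch with fabricated documents:

    from functools import reduce

    repres = [
        {"context": {"subset": "modelMain", "version": 1}},
        {"context": {"subset": "modelMain", "version": 3}},
    ]
    latest = reduce(update_representations, repres, {})
    print(latest["modelMain"]["context"]["version"])  # 3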
def parse_loader_args(loader_args):
if not loader_args:
return dict()
try:
        # literal_eval accepts Python literals only, which is safer than
        # eval for user provided strings
        parsed_args = ast.literal_eval(loader_args)
        if not isinstance(parsed_args, dict):
            return dict()
        else:
            return parsed_args
except Exception as err:
print(
"Error while parsing loader arguments '{}'.\n{}: {}\n\n"
"Continuing with default arguments. . .".format(
loader_args,
err.__class__.__name__,
err))
return dict()
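
Expected behavior, for illustration (values hypothetical):

    parse_loader_args('{"camera": "persp", "lights": True}')
    # -> {"camera": "persp", "lights": True}
    parse_loader_args("not a dict")
    # -> {} (a warning is printed and default arguments are used)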
@six.add_metaclass(ABCMeta)
class AbstractTemplateLoader:
"""
Abstraction of Template Loader.
Properties:
template_path : property to get current template path
Methods:
import_template : Abstract Method. Used to load template,
depending on current host
get_template_nodes : Abstract Method. Used to query nodes acting
as placeholders. Depending on current host
"""
def __init__(self, placeholder_class):
self.loaders_by_name = get_loaders_by_name()
self.current_asset = legacy_io.Session["AVALON_ASSET"]
self.project_name = legacy_io.Session["AVALON_PROJECT"]
self.host_name = legacy_io.Session["AVALON_APP"]
self.task_name = legacy_io.Session["AVALON_TASK"]
self.placeholder_class = placeholder_class
self.current_asset_docs = legacy_io.find_one({
"type": "asset",
"name": self.current_asset
})
self.task_type = (
self.current_asset_docs
.get("data", {})
.get("tasks", {})
.get(self.task_name, {})
.get("type")
)
self.log = Logger().get_logger("BUILD TEMPLATE")
self.log.info(
"BUILDING ASSET FROM TEMPLATE :\n"
"Starting templated build for {asset} in {project}\n\n"
"Asset : {asset}\n"
"Task : {task_name} ({task_type})\n"
"Host : {host}\n"
"Project : {project}\n".format(
asset=self.current_asset,
host=self.host_name,
project=self.project_name,
task_name=self.task_name,
task_type=self.task_type
))
# Skip if there is no loader
if not self.loaders_by_name:
self.log.warning(
"There is no registered loaders. No assets will be loaded")
return
def template_already_imported(self, err_msg):
"""In case template was already loaded.
Raise the error as a default action.
Override this method in your template loader implementation
to manage this case."""
self.log.error("{}: {}".format(
err_msg.__class__.__name__,
err_msg))
raise TemplateAlreadyImported(err_msg)
def template_loading_failed(self, err_msg):
"""In case template loading failed
Raise the error as a default action.
Override this method in your template loader implementation
to manage this case.
"""
self.log.error("{}: {}".format(
err_msg.__class__.__name__,
err_msg))
raise TemplateLoadingFailed(err_msg)
@property
def template_path(self):
"""
Property returning template path. Avoiding setter.
Getting template path from open pype settings based on current avalon
session and solving the path variables if needed.
Returns:
str: Solved template path
Raises:
TemplateProfileNotFound: No profile found from settings for
current avalon session
            KeyError: Could not solve the path because a key does not exist
                in the avalon context
            TemplateNotFound: Solved path does not exist on the current
                filesystem
"""
project_name = self.project_name
host_name = self.host_name
task_name = self.task_name
task_type = self.task_type
anatomy = Anatomy(project_name)
project_settings = get_project_settings(project_name)
build_info = project_settings[host_name]['templated_workfile_build']
profiles = build_info['profiles']
for prf in profiles:
if prf['task_types'] and task_type not in prf['task_types']:
continue
if prf['tasks'] and task_name not in prf['tasks']:
continue
path = prf['path']
break
else: # IF no template were found (no break happened)
raise TemplateProfileNotFound(
"No matching profile found for task '{}' of type '{}' "
"with host '{}'".format(task_name, task_type, host_name)
)
if path is None:
raise TemplateLoadingFailed(
"Template path is not set.\n"
"Path need to be set in {}\\Template Workfile Build "
"Settings\\Profiles".format(host_name.title()))
        try:
            solved_path = None
            while True:
                solved_path = anatomy.path_remapper(path)
                if solved_path is None:
                    solved_path = path
                if solved_path == path:
                    break
                path = solved_path
        except KeyError as missing_key:
            raise KeyError(
                "Could not solve key '{}' in template path '{}'".format(
                    missing_key, path))
        # normpath runs after the try block so a raised KeyError is not
        # masked by normpath failing on an unset solved_path
        solved_path = os.path.normpath(solved_path)
if not os.path.exists(solved_path):
            raise TemplateNotFound(
                "Template found in OpenPype settings for task '{}' with host "
                "'{}' does not exist (not found: {})".format(
                    task_name, host_name, solved_path))
self.log.info("Found template at : '{}'".format(solved_path))
return solved_path
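
An illustrative profiles structure matched by the loop above (settings values are made up; the first profile whose task_types/tasks filters match wins):

    profiles = [
        {"task_types": ["Modeling"], "tasks": [],
         "path": "{root[work]}/{project[name]}/templates/model_template.ma"},
        {"task_types": [], "tasks": [],
         "path": "{root[work]}/{project[name]}/templates/default_template.ma"},
    ]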
def populate_template(self, ignored_ids=None):
"""
Use template placeholders to load assets and parent them in hierarchy
        Args:
            ignored_ids (list, optional): representation ids that were
                already loaded and should be skipped
Returns:
None
"""
loaders_by_name = self.loaders_by_name
current_asset = self.current_asset
linked_assets = [asset['name'] for asset
in get_linked_assets(self.current_asset_docs)]
ignored_ids = ignored_ids or []
placeholders = self.get_placeholders()
self.log.debug("Placeholders found in template: {}".format(
[placeholder.data['node'] for placeholder in placeholders]
))
for placeholder in placeholders:
self.log.debug("Start to processing placeholder {}".format(
placeholder.data['node']
))
placeholder_representations = self.get_placeholder_representations(
placeholder,
current_asset,
linked_assets
)
if not placeholder_representations:
self.log.info(
"There's no representation for this placeholder: "
"{}".format(placeholder.data['node'])
)
continue
for representation in placeholder_representations:
self.preload(placeholder, loaders_by_name, representation)
if self.load_data_is_incorrect(
placeholder,
representation,
ignored_ids):
continue
self.log.info(
"Loading {}_{} with loader {}\n"
"Loader arguments used : {}".format(
representation['context']['asset'],
representation['context']['subset'],
placeholder.loader,
placeholder.data['loader_args']))
try:
container = self.load(
placeholder, loaders_by_name, representation)
except Exception:
self.load_failed(placeholder, representation)
else:
self.load_succeed(placeholder, container)
finally:
self.postload(placeholder)
def get_placeholder_representations(
self, placeholder, current_asset, linked_assets):
placeholder_db_filters = placeholder.convert_to_db_filters(
current_asset,
linked_assets)
# get representation by assets
for db_filter in placeholder_db_filters:
placeholder_representations = list(legacy_io.find(db_filter))
for representation in reduce(update_representations,
placeholder_representations,
dict()).values():
yield representation
def load_data_is_incorrect(
self, placeholder, last_representation, ignored_ids):
if not last_representation:
self.log.warning(placeholder.err_message())
return True
if (str(last_representation['_id']) in ignored_ids):
print("Ignoring : ", last_representation['_id'])
return True
return False
def preload(self, placeholder, loaders_by_name, last_representation):
pass
def load(self, placeholder, loaders_by_name, last_representation):
repre = load.get_representation_context(last_representation)
return load.load_with_repre_context(
loaders_by_name[placeholder.loader],
repre,
options=parse_loader_args(placeholder.data['loader_args']))
def load_succeed(self, placeholder, container):
placeholder.parent_in_hierarchy(container)
def load_failed(self, placeholder, last_representation):
self.log.warning("Got error trying to load {}:{} with {}\n\n"
"{}".format(last_representation['context']['asset'],
last_representation['context']['subset'],
placeholder.loader,
traceback.format_exc()))
def postload(self, placeholder):
placeholder.clean()
def update_missing_containers(self):
loaded_containers_ids = self.get_loaded_containers_by_id()
self.populate_template(ignored_ids=loaded_containers_ids)
def get_placeholders(self):
placeholder_class = self.placeholder_class
placeholders = map(placeholder_class, self.get_template_nodes())
valid_placeholders = filter(placeholder_class.is_valid, placeholders)
sorted_placeholders = sorted(valid_placeholders,
key=placeholder_class.order)
return sorted_placeholders
@abstractmethod
def get_loaded_containers_by_id(self):
"""
Collect already loaded containers for updating scene
Return:
            dict (string, node): A dictionary with ids as keys
                and containers as values
"""
pass
@abstractmethod
def import_template(self, template_path):
"""
Import template in current host
Args:
template_path (str): fullpath to current task and
host's template file
Return:
None
"""
pass
@abstractmethod
def get_template_nodes(self):
"""
        Return a list of nodes acting as host placeholders for
        templating. The data representation is up to the host
        implementation; AbstractTemplateLoader won't directly
        manipulate the nodes.
        Args:
            None
        Returns:
            list(AnyNode): nodes to be used as placeholders
"""
pass
@six.add_metaclass(ABCMeta)
class AbstractPlaceholder:
"""Abstraction of placeholders logic
Properties:
        attributes: A set of mandatory attributes that describe the
            placeholder and the assets to load.
        optional_attributes: A set of optional attributes that describe
            the placeholder and the assets to load
loader: Name of linked loader to use while loading assets
is_context: Is placeholder linked
to context asset (or to linked assets)
Methods:
is_repres_valid:
loader:
order:
is_valid:
get_data:
        parent_in_hierarchy:
"""
attributes = {'builder_type', 'family', 'representation',
'order', 'loader', 'loader_args'}
optional_attributes = {}
def __init__(self, node):
self.get_data(node)
def order(self):
"""Get placeholder order.
Order is used to sort them by priority
        Priority is lowest first, highest last
(ex:
1: First to load
100: Last to load)
Returns:
Int: Order priority
"""
return self.data.get('order')
@property
def loader(self):
"""Return placeholder loader type
Returns:
string: Loader name
"""
return self.data.get('loader')
@property
def is_context(self):
"""Return placeholder type
context_asset: For loading current asset
linked_asset: For loading linked assets
Returns:
bool: true if placeholder is a context placeholder
"""
return self.data.get('builder_type') == 'context_asset'
def is_valid(self):
"""Test validity of placeholder
        i.e. every mandatory attribute exists in placeholder data
        Returns:
            Bool: True if every mandatory attribute is a key of data
"""
if set(self.attributes).issubset(self.data.keys()):
print("Valid placeholder : {}".format(self.data["node"]))
return True
print("Placeholder is not valid : {}".format(self.data["node"]))
return False
@abstractmethod
def parent_in_hierarchy(self, containers):
"""Place container in correct hierarchy
given by placeholder
Args:
containers (String): Container name returned back by
placeholder's loader.
"""
pass
@abstractmethod
def clean(self):
"""Clean placeholder from hierarchy after loading assets.
"""
pass
@abstractmethod
def convert_to_db_filters(self, current_asset, linked_asset):
"""map current placeholder data as a db filter
args:
current_asset (String): Name of current asset in context
linked asset (list[String]) : Names of assets linked to
current asset in context
Returns:
dict: a dictionnary describing a filter to look for asset in
a database
"""
pass
@abstractmethod
def get_data(self, node):
"""
Collect placeholders information.
Args:
node (AnyNode): A unique node decided by Placeholder implementation
"""
pass


@ -1282,7 +1282,13 @@ class EnvironmentPrepData(dict):
def get_app_environments_for_context(
project_name, asset_name, task_name, app_name, env_group=None, env=None
project_name,
asset_name,
task_name,
app_name,
env_group=None,
env=None,
modules_manager=None
):
"""Prepare environment variables by context.
Args:
@ -1293,10 +1299,12 @@ def get_app_environments_for_context(
by ApplicationManager.
env (dict): Initial environment variables. `os.environ` is used when
not passed.
modules_manager (ModulesManager): Initialized modules manager.
Returns:
dict: Environments for passed context and application.
"""
from openpype.pipeline import AvalonMongoDB
# Avalon database connection
@ -1311,6 +1319,11 @@ def get_app_environments_for_context(
"name": asset_name
})
if modules_manager is None:
from openpype.modules import ModulesManager
modules_manager = ModulesManager()
    # Prepare app object which can be obtained only from ApplicationManager
app_manager = ApplicationManager()
app = app_manager.applications[app_name]
@ -1334,7 +1347,7 @@ def get_app_environments_for_context(
"env": env
})
prepare_app_environments(data, env_group)
prepare_app_environments(data, env_group, modules_manager)
prepare_context_environments(data, env_group)
# Discard avalon connection
@ -1355,9 +1368,12 @@ def _merge_env(env, current_env):
return result
def _add_python_version_paths(app, env, logger):
def _add_python_version_paths(app, env, logger, modules_manager):
"""Add vendor packages specific for a Python version."""
for module in modules_manager.get_enabled_modules():
module.modify_application_launch_arguments(app, env)
# Skip adding if host name is not set
if not app.host_name:
return
@ -1390,7 +1406,9 @@ def _add_python_version_paths(app, env, logger):
env["PYTHONPATH"] = os.pathsep.join(python_paths)
def prepare_app_environments(data, env_group=None, implementation_envs=True):
def prepare_app_environments(
data, env_group=None, implementation_envs=True, modules_manager=None
):
"""Modify launch environments based on launched app and context.
Args:
@ -1403,7 +1421,12 @@ def prepare_app_environments(data, env_group=None, implementation_envs=True):
log = data["log"]
source_env = data["env"].copy()
_add_python_version_paths(app, source_env, log)
if modules_manager is None:
from openpype.modules import ModulesManager
modules_manager = ModulesManager()
_add_python_version_paths(app, source_env, log, modules_manager)
# Use environments from local settings
filtered_local_envs = {}
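A brief usage sketch of the extended signature; the project, asset, task and app names below are placeholders and the import path is assumed.

```python
# Usage sketch (placeholder names; import paths assumed).
from openpype.lib import get_app_environments_for_context
from openpype.modules import ModulesManager

manager = ModulesManager()  # reuse one manager instead of re-discovering modules
env = get_app_environments_for_context(
    "my_project", "my_asset", "modeling", "maya/2022",
    modules_manager=manager,
)
```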

View file

@ -15,6 +15,7 @@ from openpype.settings import (
get_project_settings,
get_system_settings
)
from .anatomy import Anatomy
from .profiles_filtering import filter_profiles
from .events import emit_event
@ -922,6 +923,118 @@ def save_workfile_data_to_doc(workfile_doc, data, dbcon=None):
)
@with_pipeline_io
def collect_last_version_repres(asset_entities):
"""Collect subsets, versions and representations for asset_entities.
Args:
asset_entities (list): Asset entities for which we want to find data
Returns:
(dict): collected entities
Example output:
```
{
{Asset ID}: {
"asset_entity": <AssetEntity>,
"subsets": {
{Subset ID}: {
"subset_entity": <SubsetEntity>,
"version": {
"version_entity": <VersionEntity>,
"repres": [
<RepreEntity1>, <RepreEntity2>, ...
]
}
},
...
}
},
...
}
output[asset_id]["subsets"][subset_id]["version"]["repres"]
```
"""
if not asset_entities:
return {}
asset_entity_by_ids = {asset["_id"]: asset for asset in asset_entities}
subsets = list(legacy_io.find({
"type": "subset",
"parent": {"$in": list(asset_entity_by_ids.keys())}
}))
subset_entity_by_ids = {subset["_id"]: subset for subset in subsets}
sorted_versions = list(legacy_io.find({
"type": "version",
"parent": {"$in": list(subset_entity_by_ids.keys())}
}).sort("name", -1))
subset_id_with_latest_version = []
last_versions_by_id = {}
for version in sorted_versions:
subset_id = version["parent"]
if subset_id in subset_id_with_latest_version:
continue
subset_id_with_latest_version.append(subset_id)
last_versions_by_id[version["_id"]] = version
repres = legacy_io.find({
"type": "representation",
"parent": {"$in": list(last_versions_by_id.keys())}
})
output = {}
for repre in repres:
version_id = repre["parent"]
version = last_versions_by_id[version_id]
subset_id = version["parent"]
subset = subset_entity_by_ids[subset_id]
asset_id = subset["parent"]
asset = asset_entity_by_ids[asset_id]
if asset_id not in output:
output[asset_id] = {
"asset_entity": asset,
"subsets": {}
}
if subset_id not in output[asset_id]["subsets"]:
output[asset_id]["subsets"][subset_id] = {
"subset_entity": subset,
"version": {
"version_entity": version,
"repres": []
}
}
output[asset_id]["subsets"][subset_id]["version"]["repres"].append(
repre
)
return output
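An illustrative traversal of the returned structure, assuming `asset_entities` was queried beforehand (e.g. via a `legacy_io.find({"type": "asset", ...})` call):

```python
# Illustrative traversal of the structure documented above.
entities = collect_last_version_repres(asset_entities)
for asset_item in entities.values():
    asset_name = asset_item["asset_entity"]["name"]
    for subset_item in asset_item["subsets"].values():
        version = subset_item["version"]["version_entity"]
        for repre in subset_item["version"]["repres"]:
            print(asset_name, subset_item["subset_entity"]["name"],
                  version["name"], repre["name"])
```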
@with_pipeline_io
def get_loaders_by_name():
from openpype.pipeline import discover_loader_plugins
loaders_by_name = {}
for loader in discover_loader_plugins():
loader_name = loader.__name__
if loader_name in loaders_by_name:
raise KeyError(
"Duplicated loader name {} !".format(loader_name)
)
loaders_by_name[loader_name] = loader
return loaders_by_name
class BuildWorkfile:
"""Wrapper for build workfile process.
@ -979,8 +1092,6 @@ class BuildWorkfile:
...
}]
"""
from openpype.pipeline import discover_loader_plugins
# Get current asset name and entity
current_asset_name = legacy_io.Session["AVALON_ASSET"]
current_asset_entity = legacy_io.find_one({
@ -996,14 +1107,7 @@ class BuildWorkfile:
return
# Prepare available loaders
loaders_by_name = {}
for loader in discover_loader_plugins():
loader_name = loader.__name__
if loader_name in loaders_by_name:
raise KeyError(
"Duplicated loader name {0}!".format(loader_name)
)
loaders_by_name[loader_name] = loader
loaders_by_name = get_loaders_by_name()
# Skip if there are no loaders
if not loaders_by_name:
@ -1075,7 +1179,7 @@ class BuildWorkfile:
return
# Prepare entities from database for assets
prepared_entities = self._collect_last_version_repres(assets)
prepared_entities = collect_last_version_repres(assets)
# Load containers by prepared entities and presets
loaded_containers = []
@ -1491,102 +1595,6 @@ class BuildWorkfile:
return loaded_containers
@with_pipeline_io
def _collect_last_version_repres(self, asset_entities):
"""Collect subsets, versions and representations for asset_entities.
Args:
asset_entities (list): Asset entities for which want to find data
Returns:
(dict): collected entities
Example output:
```
{
{Asset ID}: {
"asset_entity": <AssetEntity>,
"subsets": {
{Subset ID}: {
"subset_entity": <SubsetEntity>,
"version": {
"version_entity": <VersionEntity>,
"repres": [
<RepreEntity1>, <RepreEntity2>, ...
]
}
},
...
}
},
...
}
output[asset_id]["subsets"][subset_id]["version"]["repres"]
```
"""
if not asset_entities:
return {}
asset_entity_by_ids = {asset["_id"]: asset for asset in asset_entities}
subsets = list(legacy_io.find({
"type": "subset",
"parent": {"$in": list(asset_entity_by_ids.keys())}
}))
subset_entity_by_ids = {subset["_id"]: subset for subset in subsets}
sorted_versions = list(legacy_io.find({
"type": "version",
"parent": {"$in": list(subset_entity_by_ids.keys())}
}).sort("name", -1))
subset_id_with_latest_version = []
last_versions_by_id = {}
for version in sorted_versions:
subset_id = version["parent"]
if subset_id in subset_id_with_latest_version:
continue
subset_id_with_latest_version.append(subset_id)
last_versions_by_id[version["_id"]] = version
repres = legacy_io.find({
"type": "representation",
"parent": {"$in": list(last_versions_by_id.keys())}
})
output = {}
for repre in repres:
version_id = repre["parent"]
version = last_versions_by_id[version_id]
subset_id = version["parent"]
subset = subset_entity_by_ids[subset_id]
asset_id = subset["parent"]
asset = asset_entity_by_ids[asset_id]
if asset_id not in output:
output[asset_id] = {
"asset_entity": asset,
"subsets": {}
}
if subset_id not in output[asset_id]["subsets"]:
output[asset_id]["subsets"][subset_id] = {
"subset_entity": subset,
"version": {
"version_entity": version,
"repres": []
}
}
output[asset_id]["subsets"][subset_id]["version"]["repres"].append(
repre
)
return output
@with_pipeline_io
def get_creator_by_name(creator_name, case_sensitive=False):

View file

@ -0,0 +1,61 @@
from openpype.pipeline import registered_host
from openpype.lib import classes_from_module
from importlib import import_module
from .abstract_template_loader import (
AbstractPlaceholder,
AbstractTemplateLoader)
from .build_template_exceptions import (
TemplateLoadingFailed,
TemplateAlreadyImported,
MissingHostTemplateModule,
MissingTemplatePlaceholderClass,
MissingTemplateLoaderClass
)
_module_path_format = 'openpype.{host}.template_loader'
def build_workfile_template(*args):
template_loader = build_template_loader()
try:
template_loader.import_template(template_loader.template_path)
except TemplateAlreadyImported as err:
template_loader.template_already_imported(err)
except TemplateLoadingFailed as err:
template_loader.template_loading_failed(err)
else:
template_loader.populate_template()
def update_workfile_template(*args):
template_loader = build_template_loader()
template_loader.update_missing_containers()
def build_template_loader():
host_name = registered_host().__name__.partition('.')[2]
module_path = _module_path_format.format(host=host_name)
module = import_module(module_path)
if not module:
raise MissingHostTemplateModule(
"No template loader found for host {}".format(host_name))
template_loader_class = classes_from_module(
AbstractTemplateLoader,
module
)
template_placeholder_class = classes_from_module(
AbstractPlaceholder,
module
)
if not template_loader_class:
raise MissingTemplateLoaderClass()
template_loader_class = template_loader_class[0]
if not template_placeholder_class:
raise MissingTemplatePlaceholderClass()
template_placeholder_class = template_placeholder_class[0]
return template_loader_class(template_placeholder_class)
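For `build_template_loader` to succeed, the host package must expose a `template_loader` module containing one subclass of each abstract class. A hypothetical skeleton is sketched below; the import path and class names are assumptions, and real implementations must also fill in the abstract methods.

```python
# Hypothetical openpype/<host>/template_loader.py skeleton; only the
# subclassing requirement is real, everything else is assumed.
from openpype.lib.abstract_template_loader import (
    AbstractPlaceholder,
    AbstractTemplateLoader,
)

class HostTemplateLoader(AbstractTemplateLoader):
    """Found by classes_from_module(AbstractTemplateLoader, module)."""

class HostPlaceholder(AbstractPlaceholder):
    """Found by classes_from_module(AbstractPlaceholder, module)."""
```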

View file

@ -0,0 +1,35 @@
class MissingHostTemplateModule(Exception):
"""Error raised when expected module does not exists"""
pass
class MissingTemplatePlaceholderClass(Exception):
"""Error raised when module doesn't implement a placeholder class"""
pass
class MissingTemplateLoaderClass(Exception):
"""Error raised when module doesn't implement a template loader class"""
pass
class TemplateNotFound(Exception):
"""Exception raised when template does not exist."""
pass
class TemplateProfileNotFound(Exception):
"""Exception raised when current profile
doesn't match any template profile"""
pass
class TemplateAlreadyImported(Exception):
"""Error raised when Template was already imported by host for
this session"""
pass
class TemplateLoadingFailed(Exception):
"""Error raised whend Template loader was unable to load the template"""
pass

View file

@ -60,7 +60,7 @@ def start_webpublish_log(dbcon, batch_id, user):
}).inserted_id
def publish(log, close_plugin_name=None):
def publish(log, close_plugin_name=None, raise_error=False):
"""Loops through all plugins, logs to console. Used for tests.
Args:
@ -79,10 +79,15 @@ def publish(log, close_plugin_name=None):
result["plugin"].label, record.msg))
if result["error"]:
log.error(error_format.format(**result))
error_message = error_format.format(**result)
log.error(error_message)
if close_plugin: # close host app explicitly after error
context = pyblish.api.Context()
close_plugin().process(context)
if raise_error:
# "Fatal Error" prefix lets Deadline mark the job as failed
error_message = "Fatal Error: " + error_format.format(**result)
raise RuntimeError(error_message)
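A sketch of farm-side usage; the close plugin name is hypothetical, and the `Fatal Error:` prefix is what Deadline scans for to fail the task.

```python
# Hypothetical remote-publish entry point using the new raise_error flag.
import logging

log = logging.getLogger("remote_publish")
# raise_error=True makes the first failed plugin abort the farm job with a
# "Fatal Error:" RuntimeError instead of only logging it.
publish(log, close_plugin_name="CloseMaya", raise_error=True)  # plugin name is hypothetical
```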
def publish_and_log(dbcon, _id, log, close_plugin_name=None, batch_id=None):
@ -228,7 +233,7 @@ def _get_close_plugin(close_plugin_name, log):
if plugin.__name__ == close_plugin_name:
return plugin
log.warning("Close plugin not found, app might not close.")
log.debug("Close plugin not found, app might not close.")
def get_task_data(batch_dir):

View file

@ -116,7 +116,10 @@ def get_oiio_tools_path(tool="oiiotool"):
tool (string): Tool name (oiiotool, maketx, ...).
Default is "oiiotool".
"""
oiio_dir = get_vendor_bin_path("oiio")
if platform.system().lower() == "linux":
oiio_dir = os.path.join(oiio_dir, "bin")
return find_executable(os.path.join(oiio_dir, tool))
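Usage stays unchanged; the new platform branch only affects where the binary is resolved.

```python
# Usage sketch: resolve bundled OIIO binaries; on Linux the vendored build
# keeps its executables in an extra "bin" subdirectory (handled above).
oiiotool_path = get_oiio_tools_path()
maketx_path = get_oiio_tools_path("maketx")
```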

View file

@ -370,6 +370,7 @@ def _load_modules():
class _OpenPypeInterfaceMeta(ABCMeta):
"""OpenPypeInterface meta class to print proper string."""
def __str__(self):
return "<'OpenPypeInterface.{}'>".format(self.__name__)
@ -388,6 +389,7 @@ class OpenPypeInterface:
OpenPype modules which means they have to have implemented methods defined
in the interface. By default interface does not have any abstract parts.
"""
pass
@ -432,10 +434,12 @@ class OpenPypeModule:
It is not recommended to override __init__ that's why specific method
was implemented.
"""
pass
def connect_with_modules(self, enabled_modules):
"""Connect with other enabled modules."""
pass
def get_global_environments(self):
@ -443,8 +447,22 @@ class OpenPypeModule:
Environment variables that can be get only from system settings.
"""
return {}
def modify_application_launch_arguments(self, application, env):
"""Give option to modify launch environments before application launch.
Implementation is optional. To change environments modify passed
dictionary of environments.
Args:
application (Application): Application that is launched.
env (dict): Current environment variables.
"""
pass
def cli(self, module_click_group):
"""Add commands to click group.
@ -465,6 +483,7 @@ class OpenPypeModule:
def mycommand():
print("my_command")
"""
pass
@ -886,6 +905,7 @@ class TrayModulesManager(ModulesManager):
modules_menu_order = (
"user",
"ftrack",
"kitsu",
"muster",
"launcher_tool",
"avalon",

View file

@ -0,0 +1,39 @@
# -*- coding: utf-8 -*-
"""Collect instances that should be processed and published on DL.
"""
import os
import pyblish.api
from openpype.pipeline import PublishValidationError
class CollectDeadlinePublishableInstances(pyblish.api.InstancePlugin):
"""Collect instances that should be processed and published on DL.
Some long running publishes (not just renders) could be offloaded to DL,
this plugin compares their names against an env variable and marks only
the matching one as publishable by the farm.
Triggered only when running in headless mode, e.g. on a farm.
"""
order = pyblish.api.CollectorOrder + 0.499
label = "Collect Deadline Publishable Instance"
targets = ["remote"]
def process(self, instance):
self.log.debug("CollectDeadlinePublishableInstances")
publish_inst = os.environ.get("OPENPYPE_PUBLISH_SUBSET", '')
if not publish_inst:
raise PublishValidationError("OPENPYPE_PUBLISH_SUBSET env var "
"required for remote publishing")
subset_name = instance.data["subset"]
if subset_name == publish_inst:
self.log.debug("Publish {}".format(subset_name))
instance.data["publish"] = True
instance.data["farm"] = False
else:
self.log.debug("Skipping {}".format(subset_name))
instance.data["publish"] = False

View file

@ -0,0 +1,136 @@
import os
import requests
from maya import cmds
from openpype.pipeline import legacy_io, PublishXmlValidationError
from openpype.settings import get_project_settings
import openpype.api
import pyblish.api
class MayaSubmitRemotePublishDeadline(openpype.api.Integrator):
"""Submit Maya scene to perform a local publish in Deadline.
Publishing in Deadline can be helpful for scenes that publish very slow.
This way it can process in the background on another machine without the
Artist having to wait for the publish to finish on their local machine.
Submission is done through the Deadline Web Service. DL then triggers
`openpype/scripts/remote_publish.py`.
Each publishable instance creates its own full publish job.
Different from `ProcessSubmittedJobOnFarm` which creates publish job
depending on metadata json containing context and instance data of
rendered files.
"""
label = "Submit Scene to Deadline"
order = pyblish.api.IntegratorOrder
hosts = ["maya"]
families = ["publish.farm"]
def process(self, instance):
settings = get_project_settings(os.getenv("AVALON_PROJECT"))
# use setting for publish job on farm, no reason to have it separately
deadline_publish_job_sett = (settings["deadline"]
["publish"]
["ProcessSubmittedJobOnFarm"])
# Ensure no errors so far
if not (all(result["success"]
for result in instance.context.data["results"])):
raise PublishXmlValidationError("Publish process has errors")
if not instance.data["publish"]:
self.log.warning("No active instances found. "
"Skipping submission..")
return
scene = instance.context.data["currentFile"]
scenename = os.path.basename(scene)
# Get project code
project_name = legacy_io.Session["AVALON_PROJECT"]
job_name = "{scene} [PUBLISH]".format(scene=scenename)
batch_name = "{code} - {scene}".format(code=project_name,
scene=scenename)
# Generate the payload for Deadline submission
payload = {
"JobInfo": {
"Plugin": "MayaBatch",
"BatchName": batch_name,
"Name": job_name,
"UserName": instance.context.data["user"],
"Comment": instance.context.data.get("comment", ""),
# "InitialStatus": state
"Department": deadline_publish_job_sett["deadline_department"],
"ChunkSize": deadline_publish_job_sett["deadline_chunk_size"],
"Priority": deadline_publish_job_sett["deadline_priority"],
"Group": deadline_publish_job_sett["deadline_group"],
"Pool": deadline_publish_job_sett["deadline_pool"],
},
"PluginInfo": {
"Build": None, # Don't force build
"StrictErrorChecking": True,
"ScriptJob": True,
# Inputs
"SceneFile": scene,
"ScriptFilename": "{OPENPYPE_REPOS_ROOT}/openpype/scripts/remote_publish.py", # noqa
# Mandatory for Deadline
"Version": cmds.about(version=True),
# Resolve relative references
"ProjectPath": cmds.workspace(query=True,
rootDirectory=True),
},
# Mandatory for Deadline, may be empty
"AuxFiles": []
}
# Include critical environment variables with submission + api.Session
keys = [
"FTRACK_API_USER",
"FTRACK_API_KEY",
"FTRACK_SERVER"
]
environment = dict({key: os.environ[key] for key in keys
if key in os.environ}, **legacy_io.Session)
# TODO replace legacy_io with context.data ?
environment["AVALON_PROJECT"] = legacy_io.Session["AVALON_PROJECT"]
environment["AVALON_ASSET"] = legacy_io.Session["AVALON_ASSET"]
environment["AVALON_TASK"] = legacy_io.Session["AVALON_TASK"]
environment["AVALON_APP_NAME"] = os.environ.get("AVALON_APP_NAME")
environment["OPENPYPE_LOG_NO_COLORS"] = "1"
environment["OPENPYPE_REMOTE_JOB"] = "1"
environment["OPENPYPE_USERNAME"] = instance.context.data["user"]
environment["OPENPYPE_PUBLISH_SUBSET"] = instance.data["subset"]
environment["HEADLESS_PUBLISH"] = "1"
payload["JobInfo"].update({
"EnvironmentKeyValue%d" % index: "{key}={value}".format(
key=key,
value=environment[key]
) for index, key in enumerate(environment)
})
self.log.info("Submitting Deadline job ...")
deadline_url = instance.context.data["defaultDeadline"]
# if custom one is set in instance, use that
if instance.data.get("deadlineUrl"):
deadline_url = instance.data.get("deadlineUrl")
assert deadline_url, "Requires Deadline Webservice URL"
url = "{}/api/jobs".format(deadline_url)
response = requests.post(url, json=payload, timeout=10)
if not response.ok:
raise Exception(response.text)

View file

@ -87,6 +87,13 @@ def inject_openpype_environment(deadlinePlugin):
for key, value in contents.items():
deadlinePlugin.SetProcessEnvironmentVariable(key, value)
script_url = job.GetJobPluginInfoKeyValue("ScriptFilename")
if script_url:
script_url = script_url.format(**contents).replace("\\", "/")
print(">>> Setting script path {}".format(script_url))
job.SetJobPluginInfoKeyValue("ScriptFilename", script_url)
print(">>> Removing temporary file")
os.remove(export_url)
@ -196,16 +203,19 @@ def __main__(deadlinePlugin):
job.GetJobEnvironmentKeyValue('OPENPYPE_RENDER_JOB') or '0'
openpype_publish_job = \
job.GetJobEnvironmentKeyValue('OPENPYPE_PUBLISH_JOB') or '0'
openpype_remote_job = \
job.GetJobEnvironmentKeyValue('OPENPYPE_REMOTE_JOB') or '0'
print("--- Job type - render {}".format(openpype_render_job))
print("--- Job type - publish {}".format(openpype_publish_job))
print("--- Job type - remote {}".format(openpype_remote_job))
if openpype_publish_job == '1' and openpype_render_job == '1':
raise RuntimeError("Misconfiguration. Job couldn't be both " +
"render and publish.")
if openpype_publish_job == '1':
inject_render_job_id(deadlinePlugin)
elif openpype_render_job == '1':
elif openpype_render_job == '1' or openpype_remote_job == '1':
inject_openpype_environment(deadlinePlugin)
else:
pype(deadlinePlugin) # backward compatibility with Pype2

View file

@ -0,0 +1,346 @@
import copy
import json
import collections
import ftrack_api
from openpype_modules.ftrack.lib import (
ServerAction,
statics_icon,
)
from openpype_modules.ftrack.lib.avalon_sync import create_chunks
class TransferHierarchicalValues(ServerAction):
"""Transfer values across hierarhcical attributes.
Aalso gives ability to convert types meanwhile. That is limited to
conversions between numbers and strings
- int <-> float
- in, float -> string
"""
identifier = "transfer.hierarchical.values"
label = "OpenPype Admin"
variant = "- Transfer values between 2 custom attributes"
description = (
"Move values from a hierarchical attribute to"
" second hierarchical attribute."
)
icon = statics_icon("ftrack", "action_icons", "OpenPypeAdmin.svg")
all_project_entities_query = (
"select id, name, parent_id, link"
" from TypedContext where project_id is \"{}\""
)
cust_attr_query = (
"select value, entity_id from CustomAttributeValue"
" where entity_id in ({}) and configuration_id is \"{}\""
)
settings_key = "transfer_values_of_hierarchical_attributes"
def discover(self, session, entities, event):
"""Show anywhere."""
return self.valid_roles(session, entities, event)
def _selection_interface(self, session, event_values=None):
title = "Transfer hierarchical values"
attr_confs = session.query(
(
"select id, key from CustomAttributeConfiguration"
" where is_hierarchical is true"
)
).all()
attr_items = []
for attr_conf in attr_confs:
attr_items.append({
"value": attr_conf["id"],
"label": attr_conf["key"]
})
if len(attr_items) < 2:
return {
"title": title,
"items": [{
"type": "label",
"value": (
"Didn't found custom attributes"
" that can be transfered."
)
}]
}
attr_items = sorted(attr_items, key=lambda item: item["label"])
items = []
item_splitter = {"type": "label", "value": "---"}
items.append({
"type": "label",
"value": (
"<h2>Please select source and destination"
" Custom attribute</h2>"
)
})
items.append({
"type": "label",
"value": (
"<b>WARNING:</b> This will take affect for all projects!"
)
})
if event_values:
items.append({
"type": "label",
"value": (
"<b>Note:</b> Please select 2 different custom attributes."
)
})
items.append(item_splitter)
src_item = {
"type": "enumerator",
"label": "Source",
"name": "src_attr_id",
"data": copy.deepcopy(attr_items)
}
dst_item = {
"type": "enumerator",
"label": "Destination",
"name": "dst_attr_id",
"data": copy.deepcopy(attr_items)
}
delete_item = {
"type": "boolean",
"name": "delete_dst_attr_first",
"label": "Delete first",
"value": False
}
if event_values:
src_item["value"] = event_values["src_attr_id"]
dst_item["value"] = event_values["dst_attr_id"]
delete_item["value"] = event_values["delete_dst_attr_first"]
items.append(src_item)
items.append(dst_item)
items.append(item_splitter)
items.append({
"type": "label",
"value": (
"<b>WARNING:</b> All values from destination"
" Custom Attribute will be removed if this is enabled."
)
})
items.append(delete_item)
return {
"title": title,
"items": items
}
def interface(self, session, entities, event):
if event["data"].get("values", {}):
return None
return self._selection_interface(session)
def launch(self, session, entities, event):
values = event["data"].get("values", {})
if not values:
return None
src_attr_id = values["src_attr_id"]
dst_attr_id = values["dst_attr_id"]
delete_dst_values = values["delete_dst_attr_first"]
if not src_attr_id or not dst_attr_id:
self.log.info("Attributes were not filled. Nothing to do.")
return {
"success": True,
"message": "Nothing to do"
}
if src_attr_id == dst_attr_id:
self.log.info((
"Same attributes were selected {}, {}."
" Showing interface again."
).format(src_attr_id, dst_attr_id))
return self._selection_interface(session, values)
# Query custom attributes
src_conf = session.query((
"select id from CustomAttributeConfiguration where id is {}"
).format(src_attr_id)).one()
dst_conf = session.query((
"select id from CustomAttributeConfiguration where id is {}"
).format(dst_attr_id)).one()
src_type_name = src_conf["type"]["name"]
dst_type_name = dst_conf["type"]["name"]
# Limit conversion to
# - same type -> same type (there is no need to do conversion)
# - number <Any> -> number <Any> (int to float and back)
# - number <Any> -> str (any number can be converted to str)
src_type = None
dst_type = None
if src_type_name == "number" or src_type_name != dst_type_name:
    src_type = self._get_attr_type(src_conf)
    dst_type = self._get_attr_type(dst_conf)
valid = False
# Can convert numbers
if src_type in (int, float) and dst_type in (int, float):
valid = True
# Can convert numbers to string
elif dst_type is str:
valid = True
if not valid:
self.log.info((
"Don't know how to properly convert"
" custom attribute types {} > {}"
).format(src_type_name, dst_type_name))
return {
"message": (
"Don't know how to properly convert"
" custom attribute types {} > {}"
).format(src_type_name, dst_type_name),
"success": False
}
# Query source values
src_attr_values = session.query(
(
"select value, entity_id"
" from CustomAttributeValue"
" where configuration_id is {}"
).format(src_attr_id)
).all()
self.log.debug("Queried source values.")
failed_entity_ids = []
value_by_id = {}
if dst_type is not None:
    self.log.debug("Converting source values to destination type")
# Collect source values (converting them when a target type is set) so
# the transfer also works when no conversion is needed
for attr_value in src_attr_values:
    entity_id = attr_value["entity_id"]
    value = attr_value["value"]
    if value is not None:
        try:
            if dst_type is not None:
                value = dst_type(value)
            value_by_id[entity_id] = value
        except Exception:
            failed_entity_ids.append(entity_id)
if failed_entity_ids:
self.log.info(
"Couldn't convert some values to destination attribute"
)
return {
"success": False,
"message": (
"Couldn't convert some values to destination attribute"
)
}
# Delete destination custom attributes first
if delete_dst_values:
self.log.info("Deleting destination custom attribute values first")
self._delete_custom_attribute_values(session, dst_attr_id)
self.log.info("Applying source values on destination custom attribute")
self._apply_values(session, value_by_id, dst_attr_id)
return True
def _delete_custom_attribute_values(self, session, dst_attr_id):
dst_attr_values = session.query(
(
"select configuration_id, entity_id"
" from CustomAttributeValue"
" where configuration_id is {}"
).format(dst_attr_id)
).all()
delete_operations = []
for attr_value in dst_attr_values:
entity_id = attr_value["entity_id"]
configuration_id = attr_value["configuration_id"]
entity_key = collections.OrderedDict((
("configuration_id", configuration_id),
("entity_id", entity_id)
))
delete_operations.append(
ftrack_api.operation.DeleteEntityOperation(
"CustomAttributeValue",
entity_key
)
)
if not delete_operations:
return
for chunk in create_chunks(delete_operations, 500):
for operation in chunk:
session.recorded_operations.push(operation)
session.commit()
def _apply_values(self, session, value_by_id, dst_attr_id):
dst_attr_values = session.query(
(
"select configuration_id, entity_id"
" from CustomAttributeValue"
" where configuration_id is {}"
).format(dst_attr_id)
).all()
dst_entity_ids_with_value = {
item["entity_id"]
for item in dst_attr_values
}
operations = []
for entity_id, value in value_by_id.items():
entity_key = collections.OrderedDict((
("configuration_id", dst_attr_id),
("entity_id", entity_id)
))
if entity_id in dst_entity_ids_with_value:
operations.append(
ftrack_api.operation.UpdateEntityOperation(
"CustomAttributeValue",
entity_key,
"value",
ftrack_api.symbol.NOT_SET,
value
)
)
else:
operations.append(
ftrack_api.operation.CreateEntityOperation(
"CustomAttributeValue",
entity_key,
{"value": value}
)
)
if not operations:
return
for chunk in create_chunks(operations, 500):
for operation in chunk:
session.recorded_operations.push(operation)
session.commit()
def _get_attr_type(self, conf_def):
type_name = conf_def["type"]["name"]
if type_name == "text":
return str
if type_name == "number":
config = json.loads(conf_def["config"])
if config["isdecimal"]:
return float
return int
return None
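A self-contained sketch of the type-mapping rule above, using hypothetical configuration dicts shaped like ftrack's `CustomAttributeConfiguration` records:

```python
import json

def get_attr_type(conf_def):
    # Mirrors _get_attr_type above: text -> str, number -> int/float.
    type_name = conf_def["type"]["name"]
    if type_name == "text":
        return str
    if type_name == "number":
        config = json.loads(conf_def["config"])
        return float if config["isdecimal"] else int
    return None

int_conf = {"type": {"name": "number"}, "config": json.dumps({"isdecimal": False})}
text_conf = {"type": {"name": "text"}}
assert get_attr_type(int_conf) is int    # number -> int is convertible
assert get_attr_type(text_conf) is str   # any number -> str is convertible
```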
def register(session):
'''Register plugin. Called when used as a plugin.'''
TransferHierarchicalValues(session).register()

View file

@ -88,6 +88,40 @@ class FtrackModule(
"""Implementation of `ILaunchHookPaths`."""
return os.path.join(FTRACK_MODULE_DIR, "launch_hooks")
def modify_application_launch_arguments(self, application, env):
if not application.use_python_2:
return
self.log.info("Adding Ftrack Python 2 packages to PYTHONPATH.")
# Prepare vendor dir path
python_2_vendor = os.path.join(FTRACK_MODULE_DIR, "python2_vendor")
# Add Python 2 modules
python_paths = [
# `python-ftrack-api`
os.path.join(python_2_vendor, "ftrack-python-api", "source"),
# `arrow`
os.path.join(python_2_vendor, "arrow"),
# `builtins` from `python-future`
# - `python-future` is a strict Python 2 module that causes crashes
# of Python 3 scripts executed through OpenPype
# (burnin script etc.)
os.path.join(python_2_vendor, "builtins"),
# `backports.functools_lru_cache`
os.path.join(
python_2_vendor, "backports.functools_lru_cache"
)
]
# Load PYTHONPATH from current launch context
python_path = env.get("PYTHONPATH")
if python_path:
python_paths.append(python_path)
# Set new PYTHONPATH to launch context environments
env["PYTHONPATH"] = os.pathsep.join(python_paths)
def connect_with_modules(self, enabled_modules):
for module in enabled_modules:
if not hasattr(module, "get_ftrack_event_handler_paths"):

View file

@ -1,43 +0,0 @@
import os
from openpype.lib import PreLaunchHook
from openpype_modules.ftrack import FTRACK_MODULE_DIR
class PrePython2Support(PreLaunchHook):
"""Add python ftrack api module for Python 2 to PYTHONPATH.
Path to vendor modules is added to the beggining of PYTHONPATH.
"""
def execute(self):
if not self.application.use_python_2:
return
self.log.info("Adding Ftrack Python 2 packages to PYTHONPATH.")
# Prepare vendor dir path
python_2_vendor = os.path.join(FTRACK_MODULE_DIR, "python2_vendor")
# Add Python 2 modules
python_paths = [
# `python-ftrack-api`
os.path.join(python_2_vendor, "ftrack-python-api", "source"),
# `arrow`
os.path.join(python_2_vendor, "arrow"),
# `builtins` from `python-future`
# - `python-future` is strict Python 2 module that cause crashes
# of Python 3 scripts executed through OpenPype (burnin script etc.)
os.path.join(python_2_vendor, "builtins"),
# `backports.functools_lru_cache`
os.path.join(
python_2_vendor, "backports.functools_lru_cache"
)
]
# Load PYTHONPATH from current launch context
python_path = self.launch_context.env.get("PYTHONPATH")
if python_path:
python_paths.append(python_path)
# Set new PYTHONPATH to launch context environments
self.launch_context.env["PYTHONPATH"] = os.pathsep.join(python_paths)

View file

@ -2,7 +2,7 @@ import sys
import collections
import six
import pyblish.api
from copy import deepcopy
from openpype.pipeline import legacy_io
# Copy of constant `openpype_modules.ftrack.lib.avalon_sync.CUST_ATTR_AUTO_SYNC`
@ -72,7 +72,8 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
if "hierarchyContext" not in self.context.data:
return
hierarchy_context = self.context.data["hierarchyContext"]
hierarchy_context = self._get_active_assets(context)
self.log.debug("__ hierarchy_context: {}".format(hierarchy_context))
self.session = self.context.data["ftrackSession"]
project_name = self.context.data["projectEntity"]["name"]
@ -86,15 +87,13 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
self.ft_project = None
input_data = hierarchy_context
# temporarily disable ftrack project's auto-syncing
if auto_sync_state:
self.auto_sync_off(project)
try:
# import ftrack hierarchy
self.import_to_ftrack(input_data)
self.import_to_ftrack(hierarchy_context)
except Exception:
raise
finally:
@ -355,3 +354,41 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
self.session.rollback()
self.session._configure_locations()
six.reraise(tp, value, tb)
def _get_active_assets(self, context):
""" Returns only asset dictionary.
Usually the last part of deep dictionary which
is not having any children
"""
def get_pure_hierarchy_data(input_dict):
input_dict_copy = deepcopy(input_dict)
for key in input_dict.keys():
self.log.debug("__ key: {}".format(key))
# check if child key is available
if input_dict[key].get("childs"):
# loop deeper
input_dict_copy[
key]["childs"] = get_pure_hierarchy_data(
input_dict[key]["childs"])
elif key not in active_assets:
input_dict_copy.pop(key, None)
return input_dict_copy
hierarchy_context = context.data["hierarchyContext"]
active_assets = []
# filter only the active publishing instances
for instance in context:
if instance.data.get("publish") is False:
continue
if not instance.data.get("asset"):
continue
active_assets.append(instance.data["asset"])
# remove duplicates from the list
active_assets = list(set(active_assets))
self.log.debug("__ active_assets: {}".format(active_assets))
return get_pure_hierarchy_data(hierarchy_context)

View file

@ -0,0 +1,9 @@
""" Addon class definition and Settings definition must be imported here.
If the addon class or settings definition is not imported here, it won't
be found by OpenPype discovery.
"""
from .kitsu_module import KitsuModule
__all__ = ("KitsuModule",)

View file

@ -0,0 +1,136 @@
"""Kitsu module."""
import click
import os
from openpype.modules import OpenPypeModule
from openpype_interfaces import IPluginPaths, ITrayAction
class KitsuModule(OpenPypeModule, IPluginPaths, ITrayAction):
"""Kitsu module class."""
label = "Kitsu Connect"
name = "kitsu"
def initialize(self, settings):
"""Initialization of module."""
module_settings = settings[self.name]
# Enabled by settings
self.enabled = module_settings.get("enabled", False)
# Add API URL schema
kitsu_url = module_settings["server"].strip()
if kitsu_url:
# Ensure web url
if not kitsu_url.startswith("http"):
kitsu_url = "https://" + kitsu_url
# Check for "/api" url validity
if not kitsu_url.endswith("api"):
kitsu_url = "{}{}api".format(
kitsu_url, "" if kitsu_url.endswith("/") else "/"
)
self.server_url = kitsu_url
# UI which must not be created at this time
self._dialog = None
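The URL handling in `initialize` normalizes various inputs to an API endpoint; an equivalent standalone sketch, with placeholder hostnames:

```python
# Standalone sketch of the normalization performed in initialize().
def normalize_kitsu_url(kitsu_url: str) -> str:
    kitsu_url = kitsu_url.strip()
    if not kitsu_url.startswith("http"):
        kitsu_url = "https://" + kitsu_url
    if not kitsu_url.endswith("api"):
        kitsu_url = "{}{}api".format(
            kitsu_url, "" if kitsu_url.endswith("/") else "/"
        )
    return kitsu_url

assert normalize_kitsu_url("kitsu.studio.tld") == "https://kitsu.studio.tld/api"
assert normalize_kitsu_url("https://kitsu.studio.tld/") == "https://kitsu.studio.tld/api"
```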
def tray_init(self):
"""Tray init."""
self._create_dialog()
def tray_start(self):
"""Tray start."""
from .utils.credentials import (
load_credentials,
validate_credentials,
set_credentials_envs,
)
login, password = load_credentials()
# Check credentials, ask them if needed
if validate_credentials(login, password):
set_credentials_envs(login, password)
else:
self.show_dialog()
def get_global_environments(self):
"""Kitsu's global environments."""
return {"KITSU_SERVER": self.server_url}
def _create_dialog(self):
# Don't recreate dialog if already exists
if self._dialog is not None:
return
from .kitsu_widgets import KitsuPasswordDialog
self._dialog = KitsuPasswordDialog()
def show_dialog(self):
"""Show dialog to log-in."""
# Make sure dialog is created
self._create_dialog()
# Show dialog
self._dialog.open()
def on_action_trigger(self):
"""Implementation of abstract method for `ITrayAction`."""
self.show_dialog()
def get_plugin_paths(self):
"""Implementation of abstract method for `IPluginPaths`."""
current_dir = os.path.dirname(os.path.abspath(__file__))
return {"publish": [os.path.join(current_dir, "plugins", "publish")]}
def cli(self, click_group):
click_group.add_command(cli_main)
@click.group(KitsuModule.name, help="Kitsu dynamic cli commands.")
def cli_main():
pass
@cli_main.command()
@click.option("--login", envvar="KITSU_LOGIN", help="Kitsu login")
@click.option(
"--password", envvar="KITSU_PWD", help="Password for kitsu username"
)
def push_to_zou(login, password):
"""Synchronize Zou database (Kitsu backend) with openpype database.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
"""
from .utils.update_zou_with_op import sync_zou
sync_zou(login, password)
@cli_main.command()
@click.option("-l", "--login", envvar="KITSU_LOGIN", help="Kitsu login")
@click.option(
"-p", "--password", envvar="KITSU_PWD", help="Password for kitsu username"
)
def sync_service(login, password):
"""Synchronize openpype database from Zou sever database.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
"""
from .utils.update_op_with_zou import sync_all_project
from .utils.sync_service import start_listeners
sync_all_project(login, password)
start_listeners(login, password)

View file

@ -0,0 +1,188 @@
from Qt import QtWidgets, QtCore, QtGui
from openpype import style
from openpype.modules.kitsu.utils.credentials import (
clear_credentials,
load_credentials,
save_credentials,
set_credentials_envs,
validate_credentials,
)
from openpype.resources import get_resource
from openpype.settings.lib import (
get_system_settings,
)
from openpype.widgets.password_dialog import PressHoverButton
class KitsuPasswordDialog(QtWidgets.QDialog):
"""Kitsu login dialog."""
finished = QtCore.Signal(bool)
def __init__(self, parent=None):
super(KitsuPasswordDialog, self).__init__(parent)
self.setWindowTitle("Kitsu Credentials")
self.resize(300, 120)
system_settings = get_system_settings()
user_login, user_pwd = load_credentials()
remembered = bool(user_login or user_pwd)
self._final_result = None
self._connectable = bool(
system_settings["modules"].get("kitsu", {}).get("server")
)
# Server label
server_message = (
system_settings["modules"]["kitsu"]["server"]
if self._connectable
else "no server url set in Studio Settings..."
)
server_label = QtWidgets.QLabel(
f"Server: {server_message}",
self,
)
# Login input
login_widget = QtWidgets.QWidget(self)
login_label = QtWidgets.QLabel("Login:", login_widget)
login_input = QtWidgets.QLineEdit(
login_widget,
text=user_login if remembered else None,
)
login_input.setPlaceholderText("Your Kitsu account login...")
login_layout = QtWidgets.QHBoxLayout(login_widget)
login_layout.setContentsMargins(0, 0, 0, 0)
login_layout.addWidget(login_label)
login_layout.addWidget(login_input)
# Password input
password_widget = QtWidgets.QWidget(self)
password_label = QtWidgets.QLabel("Password:", password_widget)
password_input = QtWidgets.QLineEdit(
password_widget,
text=user_pwd if remembered else None,
)
password_input.setPlaceholderText("Your password...")
password_input.setEchoMode(QtWidgets.QLineEdit.Password)
show_password_icon_path = get_resource("icons", "eye.png")
show_password_icon = QtGui.QIcon(show_password_icon_path)
show_password_btn = PressHoverButton(password_widget)
show_password_btn.setObjectName("PasswordBtn")
show_password_btn.setIcon(show_password_icon)
show_password_btn.setFocusPolicy(QtCore.Qt.ClickFocus)
password_layout = QtWidgets.QHBoxLayout(password_widget)
password_layout.setContentsMargins(0, 0, 0, 0)
password_layout.addWidget(password_label)
password_layout.addWidget(password_input)
password_layout.addWidget(show_password_btn)
# Message label
message_label = QtWidgets.QLabel("", self)
# Buttons
buttons_widget = QtWidgets.QWidget(self)
remember_checkbox = QtWidgets.QCheckBox("Remember", buttons_widget)
remember_checkbox.setObjectName("RememberCheckbox")
remember_checkbox.setChecked(remembered)
ok_btn = QtWidgets.QPushButton("Ok", buttons_widget)
cancel_btn = QtWidgets.QPushButton("Cancel", buttons_widget)
buttons_layout = QtWidgets.QHBoxLayout(buttons_widget)
buttons_layout.setContentsMargins(0, 0, 0, 0)
buttons_layout.addWidget(remember_checkbox)
buttons_layout.addStretch(1)
buttons_layout.addWidget(ok_btn)
buttons_layout.addWidget(cancel_btn)
# Main layout
layout = QtWidgets.QVBoxLayout(self)
layout.addSpacing(5)
layout.addWidget(server_label, 0)
layout.addSpacing(5)
layout.addWidget(login_widget, 0)
layout.addWidget(password_widget, 0)
layout.addWidget(message_label, 0)
layout.addStretch(1)
layout.addWidget(buttons_widget, 0)
ok_btn.clicked.connect(self._on_ok_click)
cancel_btn.clicked.connect(self._on_cancel_click)
show_password_btn.change_state.connect(self._on_show_password)
self.login_input = login_input
self.password_input = password_input
self.remember_checkbox = remember_checkbox
self.message_label = message_label
self.setStyleSheet(style.load_stylesheet())
def result(self):
return self._final_result
def keyPressEvent(self, event):
if event.key() in (QtCore.Qt.Key_Return, QtCore.Qt.Key_Enter):
self._on_ok_click()
return event.accept()
super(KitsuPasswordDialog, self).keyPressEvent(event)
def closeEvent(self, event):
super(KitsuPasswordDialog, self).closeEvent(event)
self.finished.emit(self.result())
def _on_ok_click(self):
# Check if is connectable
if not self._connectable:
self.message_label.setText(
"Please set server url in Studio Settings!"
)
return
# Collect values
login_value = self.login_input.text()
pwd_value = self.password_input.text()
remember = self.remember_checkbox.isChecked()
# Authenticate
if validate_credentials(login_value, pwd_value):
set_credentials_envs(login_value, pwd_value)
else:
self.message_label.setText("Authentication failed...")
return
# Remember password cases
if remember:
save_credentials(login_value, pwd_value)
else:
# Clear local settings
clear_credentials()
# Clear input fields
self.login_input.clear()
self.password_input.clear()
self._final_result = True
self.close()
def _on_show_password(self, show_password):
if show_password:
echo_mode = QtWidgets.QLineEdit.Normal
else:
echo_mode = QtWidgets.QLineEdit.Password
self.password_input.setEchoMode(echo_mode)
def _on_cancel_click(self):
self.close()

View file

@ -0,0 +1,18 @@
# -*- coding: utf-8 -*-
import os
import gazu
import pyblish.api
class CollectKitsuSession(pyblish.api.ContextPlugin):
"""Collect Kitsu session using user credentials"""
order = pyblish.api.CollectorOrder
label = "Kitsu user session"
# families = ["kitsu"]
def process(self, context):
gazu.client.set_host(os.environ["KITSU_SERVER"])
gazu.log_in(os.environ["KITSU_LOGIN"], os.environ["KITSU_PWD"])

View file

@ -0,0 +1,65 @@
# -*- coding: utf-8 -*-
import os
import gazu
import pyblish.api
class CollectKitsuEntities(pyblish.api.ContextPlugin):
"""Collect Kitsu entities according to the current context"""
order = pyblish.api.CollectorOrder + 0.499
label = "Kitsu entities"
def process(self, context):
asset_data = context.data["assetEntity"]["data"]
zou_asset_data = asset_data.get("zou")
if not zou_asset_data:
raise AssertionError("Zou asset data not found in OpenPype!")
self.log.debug("Collected zou asset data: {}".format(zou_asset_data))
zou_task_data = asset_data["tasks"][os.environ["AVALON_TASK"]].get(
"zou"
)
if not zou_task_data:
self.log.warning("Zou task data not found in OpenPype!")
self.log.debug("Collected zou task data: {}".format(zou_task_data))
kitsu_project = gazu.project.get_project(zou_asset_data["project_id"])
if not kitsu_project:
raise AssertionError("Project not found in kitsu!")
context.data["kitsu_project"] = kitsu_project
self.log.debug("Collect kitsu project: {}".format(kitsu_project))
kitsu_asset = gazu.asset.get_asset(zou_asset_data["id"])
if not kitsu_asset:
raise AssertionError("Asset not found in kitsu!")
context.data["kitsu_asset"] = kitsu_asset
self.log.debug("Collect kitsu asset: {}".format(kitsu_asset))
if zou_task_data:
kitsu_task = gazu.task.get_task(zou_task_data["id"])
if not kitsu_task:
raise AssertionError("Task not found in kitsu!")
context.data["kitsu_task"] = kitsu_task
self.log.debug("Collect kitsu task: {}".format(kitsu_task))
else:
kitsu_task_type = gazu.task.get_task_type_by_name(
os.environ["AVALON_TASK"]
)
if not kitsu_task_type:
raise AssertionError(
"Task type {} not found in Kitsu!".format(
os.environ["AVALON_TASK"]
)
)
kitsu_task = gazu.task.get_task_by_name(
kitsu_asset, kitsu_task_type
)
if not kitsu_task:
raise AssertionError("Task not found in kitsu!")
context.data["kitsu_task"] = kitsu_task
self.log.debug("Collect kitsu task: {}".format(kitsu_task))

View file

@ -0,0 +1,50 @@
# -*- coding: utf-8 -*-
import gazu
import pyblish.api
class IntegrateKitsuNote(pyblish.api.ContextPlugin):
"""Integrate Kitsu Note"""
order = pyblish.api.IntegratorOrder
label = "Kitsu Note and Status"
# families = ["kitsu"]
set_status_note = False
note_status_shortname = "wfa"
def process(self, context):
# Get comment text body
publish_comment = context.data.get("comment")
if not publish_comment:
self.log.info("Comment is not set.")
self.log.debug("Comment is `{}`".format(publish_comment))
# Get note status, by default uses the task status for the note
# if it is not specified in the configuration
note_status = context.data["kitsu_task"]["task_status_id"]
if self.set_status_note:
kitsu_status = gazu.task.get_task_status_by_short_name(
self.note_status_shortname
)
if kitsu_status:
note_status = kitsu_status
self.log.info("Note Kitsu status: {}".format(note_status))
else:
self.log.info(
"Cannot find {} status. The status will not be "
"changed!".format(self.note_status_shortname)
)
# Add comment to kitsu task
self.log.debug(
"Add new note in taks id {}".format(
context.data["kitsu_task"]["id"]
)
)
kitsu_comment = gazu.task.add_comment(
context.data["kitsu_task"], note_status, comment=publish_comment
)
context.data["kitsu_comment"] = kitsu_comment

View file

@ -0,0 +1,40 @@
# -*- coding: utf-8 -*-
import gazu
import pyblish.api
class IntegrateKitsuReview(pyblish.api.InstancePlugin):
"""Integrate Kitsu Review"""
order = pyblish.api.IntegratorOrder + 0.01
label = "Kitsu Review"
# families = ["kitsu"]
optional = True
def process(self, instance):
context = instance.context
task = context.data["kitsu_task"]
comment = context.data.get("kitsu_comment")
# Check comment has been created
if not comment:
self.log.debug(
"Comment not created, review not pushed to preview."
)
return
# Add review representations as preview of comment
for representation in instance.data.get("representations", []):
# Skip if not tagged as review
if "review" not in representation.get("tags", []):
continue
review_path = representation.get("published_path")
self.log.debug("Found review at: {}".format(review_path))
gazu.task.add_preview(
task, comment, review_path, normalize_movie=True
)
self.log.info("Review upload on comment")

View file

@ -0,0 +1,15 @@
# -*- coding: utf-8 -*-
import gazu
import pyblish.api
class KitsuLogOut(pyblish.api.ContextPlugin):
"""
Log out from Kitsu API
"""
order = pyblish.api.IntegratorOrder + 10
label = "Kitsu Log Out"
def process(self, context):
gazu.log_out()

View file

View file

@ -0,0 +1,104 @@
"""Kitsu credentials functions."""
import os
from typing import Tuple
import gazu
from openpype.lib.local_settings import OpenPypeSecureRegistry
def validate_credentials(
login: str, password: str, kitsu_url: str = None
) -> bool:
"""Validate credentials by trying to connect to Kitsu host URL.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
kitsu_url (str, optional): Kitsu host URL. Defaults to None.
Returns:
bool: Are credentials valid?
"""
if kitsu_url is None:
kitsu_url = os.environ.get("KITSU_SERVER")
# Connect to server
validate_host(kitsu_url)
# Authenticate
try:
gazu.log_in(login, password)
except gazu.exception.AuthFailedException:
return False
return True
def validate_host(kitsu_url: str) -> bool:
"""Validate credentials by trying to connect to Kitsu host URL.
Args:
kitsu_url (str, optional): Kitsu host URL.
Returns:
bool: Is host valid?
"""
# Connect to server
gazu.set_host(kitsu_url)
# Test host
if gazu.client.host_is_valid():
return True
else:
raise gazu.exception.HostException(f"Host '{kitsu_url}' is invalid.")
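Typical flow combining these helpers; the login, password and URL values are placeholders (`set_credentials_envs` and `save_credentials` are defined further below in this file):

```python
# Usage sketch with placeholder values.
login, password = "user@studio.tld", "secret"
if validate_credentials(login, password, "https://kitsu.studio.tld/api"):
    set_credentials_envs(login, password)  # export KITSU_LOGIN / KITSU_PWD
    save_credentials(login, password)      # persist in the secure registry
```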
def clear_credentials():
"""Clear credentials in Secure Registry."""
# Get user registry
user_registry = OpenPypeSecureRegistry("kitsu_user")
# Set local settings
user_registry.delete_item("login")
user_registry.delete_item("password")
def save_credentials(login: str, password: str):
"""Save credentials in Secure Registry.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
"""
# Get user registry
user_registry = OpenPypeSecureRegistry("kitsu_user")
# Set local settings
user_registry.set_item("login", login)
user_registry.set_item("password", password)
def load_credentials() -> Tuple[str, str]:
"""Load registered credentials.
Returns:
Tuple[str, str]: (Login, Password)
"""
# Get user registry
user_registry = OpenPypeSecureRegistry("kitsu_user")
return user_registry.get_item("login", None), user_registry.get_item(
"password", None
)
def set_credentials_envs(login: str, password: str):
"""Set environment variables with Kitsu login and password.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
"""
os.environ["KITSU_LOGIN"] = login
os.environ["KITSU_PWD"] = password

View file

@ -0,0 +1,384 @@
import os
import gazu
from openpype.pipeline import AvalonMongoDB
from .credentials import validate_credentials
from .update_op_with_zou import (
create_op_asset,
set_op_project,
write_project_to_op,
update_op_assets,
)
class Listener:
"""Host Kitsu listener."""
def __init__(self, login, password):
"""Create client and add listeners to events without starting it.
Run `listener.start()` to actually start the service.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
Raises:
AuthFailedException: Wrong user login and/or password
"""
self.dbcon = AvalonMongoDB()
self.dbcon.install()
gazu.client.set_host(os.environ["KITSU_SERVER"])
# Authenticate
if not validate_credentials(login, password):
raise gazu.exception.AuthFailedException(
f"Kitsu authentication failed for login: '{login}'..."
)
gazu.set_event_host(
os.environ["KITSU_SERVER"].replace("api", "socket.io")
)
self.event_client = gazu.events.init()
gazu.events.add_listener(
self.event_client, "project:new", self._new_project
)
gazu.events.add_listener(
self.event_client, "project:update", self._update_project
)
gazu.events.add_listener(
self.event_client, "project:delete", self._delete_project
)
gazu.events.add_listener(
self.event_client, "asset:new", self._new_asset
)
gazu.events.add_listener(
self.event_client, "asset:update", self._update_asset
)
gazu.events.add_listener(
self.event_client, "asset:delete", self._delete_asset
)
gazu.events.add_listener(
self.event_client, "episode:new", self._new_episode
)
gazu.events.add_listener(
self.event_client, "episode:update", self._update_episode
)
gazu.events.add_listener(
self.event_client, "episode:delete", self._delete_episode
)
gazu.events.add_listener(
self.event_client, "sequence:new", self._new_sequence
)
gazu.events.add_listener(
self.event_client, "sequence:update", self._update_sequence
)
gazu.events.add_listener(
self.event_client, "sequence:delete", self._delete_sequence
)
gazu.events.add_listener(self.event_client, "shot:new", self._new_shot)
gazu.events.add_listener(
self.event_client, "shot:update", self._update_shot
)
gazu.events.add_listener(
self.event_client, "shot:delete", self._delete_shot
)
gazu.events.add_listener(self.event_client, "task:new", self._new_task)
gazu.events.add_listener(
self.event_client, "task:update", self._update_task
)
gazu.events.add_listener(
self.event_client, "task:delete", self._delete_task
)
def start(self):
gazu.events.run_client(self.event_client)
# == Project ==
def _new_project(self, data):
"""Create new project into OP DB."""
# Use update process to avoid duplicating code
self._update_project(data)
def _update_project(self, data):
"""Update project into OP DB."""
# Get project entity
project = gazu.project.get_project(data["project_id"])
project_name = project["name"]
update_project = write_project_to_op(project, self.dbcon)
# Write into DB
if update_project:
# Write through the project collection directly so the connection
# object itself is not replaced
project_col = self.dbcon.database[project_name]
project_col.bulk_write([update_project])
def _delete_project(self, data):
"""Delete project."""
project_doc = self.dbcon.find_one(
{"type": "project", "data.zou_id": data["project_id"]}
)
# Delete project collection
self.dbcon.database[project_doc["name"]].drop()
# == Asset ==
def _new_asset(self, data):
"""Create new asset into OP DB."""
# Get project entity
set_op_project(self.dbcon, data["project_id"])
# Get gazu entity
asset = gazu.asset.get_asset(data["asset_id"])
# Insert doc in DB
self.dbcon.insert_one(create_op_asset(asset))
# Update
self._update_asset(data)
def _update_asset(self, data):
"""Update asset into OP DB."""
set_op_project(self.dbcon, data["project_id"])
project_doc = self.dbcon.find_one({"type": "project"})
# Get gazu entity
asset = gazu.asset.get_asset(data["asset_id"])
# Find asset doc
# Query all assets of the local project
zou_ids_and_asset_docs = {
asset_doc["data"]["zou"]["id"]: asset_doc
for asset_doc in self.dbcon.find({"type": "asset"})
if asset_doc["data"].get("zou", {}).get("id")
}
zou_ids_and_asset_docs[asset["project_id"]] = project_doc
# Update
asset_doc_id, asset_update = update_op_assets(
self.dbcon, project_doc, [asset], zou_ids_and_asset_docs
)[0]
self.dbcon.update_one({"_id": asset_doc_id}, asset_update)
def _delete_asset(self, data):
"""Delete asset of OP DB."""
set_op_project(self.dbcon, data["project_id"])
# Delete
self.dbcon.delete_one(
{"type": "asset", "data.zou.id": data["asset_id"]}
)
# == Episode ==
def _new_episode(self, data):
"""Create new episode into OP DB."""
# Get project entity
set_op_project(self.dbcon, data["project_id"])
# Get gazu entity
episode = gazu.shot.get_episode(data["episode_id"])
# Insert doc in DB
self.dbcon.insert_one(create_op_asset(episode))
# Update
self._update_episode(data)
def _update_episode(self, data):
"""Update episode into OP DB."""
set_op_project(self.dbcon, data["project_id"])
project_doc = self.dbcon.find_one({"type": "project"})
# Get gazu entity
episode = gazu.shot.get_episode(data["episode_id"])
# Find asset doc
# Query all assets of the local project
zou_ids_and_asset_docs = {
asset_doc["data"]["zou"]["id"]: asset_doc
for asset_doc in self.dbcon.find({"type": "asset"})
if asset_doc["data"].get("zou", {}).get("id")
}
zou_ids_and_asset_docs[episode["project_id"]] = project_doc
# Update
asset_doc_id, asset_update = update_op_assets(
self.dbcon, project_doc, [episode], zou_ids_and_asset_docs
)[0]
self.dbcon.update_one({"_id": asset_doc_id}, asset_update)
def _delete_episode(self, data):
"""Delete shot of OP DB."""
set_op_project(self.dbcon, data["project_id"])
print("delete episode") # TODO check bugfix
# Delete
self.dbcon.delete_one(
{"type": "asset", "data.zou.id": data["episode_id"]}
)
# == Sequence ==
def _new_sequence(self, data):
"""Create new sequnce into OP DB."""
# Get project entity
set_op_project(self.dbcon, data["project_id"])
# Get gazu entity
sequence = gazu.shot.get_sequence(data["sequence_id"])
# Insert doc in DB
self.dbcon.insert_one(create_op_asset(sequence))
# Update
self._update_sequence(data)
def _update_sequence(self, data):
"""Update sequence into OP DB."""
set_op_project(self.dbcon, data["project_id"])
project_doc = self.dbcon.find_one({"type": "project"})
# Get gazu entity
sequence = gazu.shot.get_sequence(data["sequence_id"])
# Find asset doc
# Query all assets of the local project
zou_ids_and_asset_docs = {
asset_doc["data"]["zou"]["id"]: asset_doc
for asset_doc in self.dbcon.find({"type": "asset"})
if asset_doc["data"].get("zou", {}).get("id")
}
zou_ids_and_asset_docs[sequence["project_id"]] = project_doc
# Update
asset_doc_id, asset_update = update_op_assets(
self.dbcon, project_doc, [sequence], zou_ids_and_asset_docs
)[0]
self.dbcon.update_one({"_id": asset_doc_id}, asset_update)
def _delete_sequence(self, data):
"""Delete sequence of OP DB."""
set_op_project(self.dbcon, data["project_id"])
print("delete sequence") # TODO check bugfix
# Delete
self.dbcon.delete_one(
{"type": "asset", "data.zou.id": data["sequence_id"]}
)
# == Shot ==
def _new_shot(self, data):
"""Create new shot into OP DB."""
# Get project entity
set_op_project(self.dbcon, data["project_id"])
# Get gazu entity
shot = gazu.shot.get_shot(data["shot_id"])
# Insert doc in DB
self.dbcon.insert_one(create_op_asset(shot))
# Update
self._update_shot(data)
def _update_shot(self, data):
"""Update shot into OP DB."""
set_op_project(self.dbcon, data["project_id"])
project_doc = self.dbcon.find_one({"type": "project"})
# Get gazu entity
shot = gazu.shot.get_shot(data["shot_id"])
# Find asset doc
# Query all assets of the local project
zou_ids_and_asset_docs = {
asset_doc["data"]["zou"]["id"]: asset_doc
for asset_doc in self.dbcon.find({"type": "asset"})
if asset_doc["data"].get("zou", {}).get("id")
}
zou_ids_and_asset_docs[shot["project_id"]] = project_doc
# Update
asset_doc_id, asset_update = update_op_assets(
self.dbcon, project_doc, [shot], zou_ids_and_asset_docs
)[0]
self.dbcon.update_one({"_id": asset_doc_id}, asset_update)
def _delete_shot(self, data):
"""Delete shot of OP DB."""
set_op_project(self.dbcon, data["project_id"])
# Delete
self.dbcon.delete_one(
{"type": "asset", "data.zou.id": data["shot_id"]}
)
# == Task ==
def _new_task(self, data):
"""Create new task into OP DB."""
# Get project entity
set_op_project(self.dbcon, data["project_id"])
# Get gazu entity
task = gazu.task.get_task(data["task_id"])
# Find asset doc
asset_doc = self.dbcon.find_one(
{"type": "asset", "data.zou.id": task["entity"]["id"]}
)
# Update asset tasks with new one
asset_tasks = asset_doc["data"].get("tasks")
task_type_name = task["task_type"]["name"]
asset_tasks[task_type_name] = {"type": task_type_name, "zou": task}
self.dbcon.update_one(
{"_id": asset_doc["_id"]}, {"$set": {"data.tasks": asset_tasks}}
)
def _update_task(self, data):
"""Update task into OP DB."""
# TODO is it necessary?
pass
def _delete_task(self, data):
"""Delete task of OP DB."""
set_op_project(self.dbcon, data["project_id"])
# Find asset doc
asset_docs = [doc for doc in self.dbcon.find({"type": "asset"})]
for doc in asset_docs:
# Match task
for name, task in doc["data"]["tasks"].items():
if task.get("zou") and data["task_id"] == task["zou"]["id"]:
# Pop task
asset_tasks = doc["data"].get("tasks", {})
asset_tasks.pop(name)
# Delete task in DB
self.dbcon.update_one(
{"_id": doc["_id"]},
{"$set": {"data.tasks": asset_tasks}},
)
return
def start_listeners(login: str, password: str):
"""Start listeners to keep OpenPype up-to-date with Kitsu.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
"""
# Connect to server
listener = Listener(login, password)
listener.start()

View file

@ -0,0 +1,389 @@
"""Functions to update OpenPype data using Kitsu DB (a.k.a Zou)."""
from copy import deepcopy
import re
from typing import Dict, List
from pymongo import DeleteOne, UpdateOne
import gazu
from gazu.task import (
all_tasks_for_asset,
all_tasks_for_shot,
)
from openpype.pipeline import AvalonMongoDB
from openpype.api import get_project_settings
from openpype.lib import create_project
from openpype.modules.kitsu.utils.credentials import validate_credentials
# Accepted naming pattern for OP
naming_pattern = re.compile("^[a-zA-Z0-9_.]*$")
def create_op_asset(gazu_entity: dict) -> dict:
"""Create OP asset dict from gazu entity.
:param gazu_entity:
"""
return {
"name": gazu_entity["name"],
"type": "asset",
"schema": "openpype:asset-3.0",
"data": {"zou": gazu_entity, "tasks": {}},
}
def set_op_project(dbcon: AvalonMongoDB, project_id: str):
"""Set project context.
Args:
dbcon (AvalonMongoDB): Connection to DB
project_id (str): Project zou ID
"""
project = gazu.project.get_project(project_id)
project_name = project["name"]
dbcon.Session["AVALON_PROJECT"] = project_name
def update_op_assets(
dbcon: AvalonMongoDB,
project_doc: dict,
entities_list: List[dict],
asset_doc_ids: Dict[str, dict],
) -> List[tuple]:
"""Update OpenPype assets.
Set 'data' and 'parent' fields.
Args:
dbcon (AvalonMongoDB): Connection to DB
entities_list (List[dict]): List of zou entities to update
asset_doc_ids (Dict[str, dict]): Dicts of [{zou_id: asset_doc}, ...]
Returns:
List[Dict[str, dict]]: List of (doc_id, update_dict) tuples
"""
project_name = project_doc["name"]
project_module_settings = get_project_settings(project_name)["kitsu"]
assets_with_update = []
for item in entities_list:
# Check asset exists
item_doc = asset_doc_ids.get(item["id"])
if not item_doc: # Create asset
op_asset = create_op_asset(item)
insert_result = dbcon.insert_one(op_asset)
item_doc = dbcon.find_one(
{"type": "asset", "_id": insert_result.inserted_id}
)
# Update asset
item_data = deepcopy(item_doc["data"])
item_data.update(item.get("data") or {})
item_data["zou"] = item
# == Asset settings ==
# Frame in, fallback on 0
frame_in = int(item_data.get("frame_in") or 0)
item_data["frameStart"] = frame_in
item_data.pop("frame_in")
# Frame out, fallback on frame_in + duration
frames_duration = int(item.get("nb_frames") or 1)
frame_out = (
item_data["frame_out"]
if item_data.get("frame_out")
else frame_in + frames_duration
)
item_data["frameEnd"] = int(frame_out)
item_data.pop("frame_out")
# Fps, fallback to project's value when entity fps is deleted
if not item_data.get("fps") and item_doc["data"].get("fps"):
item_data["fps"] = project_doc["data"]["fps"]
# Tasks
tasks_list = []
item_type = item["type"]
if item_type == "Asset":
tasks_list = all_tasks_for_asset(item)
elif item_type == "Shot":
tasks_list = all_tasks_for_shot(item)
# TODO frame in and out
item_data["tasks"] = {
t["task_type_name"]: {"type": t["task_type_name"]}
for t in tasks_list
}
# Get zou parent id for correct hierarchy
# Use parent substitutes if existing
substitute_parent_item = (
item_data["parent_substitutes"][0]
if item_data.get("parent_substitutes")
else None
)
if substitute_parent_item:
parent_zou_id = substitute_parent_item["parent_id"]
else:
parent_zou_id = (
item.get("parent_id")
or item.get("episode_id")
or item.get("source_id")
) # TODO check consistency
# Substitute Episode and Sequence by Shot
substitute_item_type = (
"shots"
if item_type in ["Episode", "Sequence"]
else f"{item_type.lower()}s"
)
entity_parent_folders = [
f
for f in project_module_settings["entities_root"]
.get(substitute_item_type)
.split("/")
if f
]
# Root parent folder if exist
visual_parent_doc_id = (
asset_doc_ids[parent_zou_id]["_id"] if parent_zou_id else None
)
if visual_parent_doc_id is None:
# Find root folder doc
root_folder_doc = dbcon.find_one(
{
"type": "asset",
"name": entity_parent_folders[-1],
"data.root_of": substitute_item_type,
},
["_id"],
)
if root_folder_doc:
visual_parent_doc_id = root_folder_doc["_id"]
# Visual parent for hierarchy
item_data["visualParent"] = visual_parent_doc_id
# Add parents for hierarchy
item_data["parents"] = []
while parent_zou_id is not None:
parent_doc = asset_doc_ids[parent_zou_id]
item_data["parents"].insert(0, parent_doc["name"])
# Get parent entity
parent_entity = parent_doc["data"]["zou"]
parent_zou_id = parent_entity["parent_id"]
# Set root folders parents
item_data["parents"] = entity_parent_folders + item_data["parents"]
# Update 'data' fields that differ from the zou DB
updated_data = {
k: v for k, v in item_data.items() if item_doc["data"].get(k) != v
}
if updated_data or not item_doc.get("parent"):
assets_with_update.append(
(
item_doc["_id"],
{
"$set": {
"name": item["name"],
"data": item_data,
"parent": asset_doc_ids[item["project_id"]]["_id"],
}
},
)
)
return assets_with_update
def write_project_to_op(project: dict, dbcon: AvalonMongoDB) -> UpdateOne:
"""Write gazu project to OP database.
Create the project if it doesn't exist.
Args:
project (dict): Gazu project
dbcon (AvalonMongoDB): DB to create project in
Returns:
UpdateOne: Update instance for the project
"""
project_name = project["name"]
project_doc = dbcon.database[project_name].find_one({"type": "project"})
if not project_doc:
print(f"Creating project '{project_name}'")
project_doc = create_project(project_name, project_name, dbcon=dbcon)
# Project data and tasks
project_data = project["data"] or {}
# Build project code and update Kitsu
project_code = project.get("code")
if not project_code:
project_code = project["name"].replace(" ", "_").lower()
project["code"] = project_code
# Update Zou
gazu.project.update_project(project)
# Update data
project_data.update(
{
"code": project_code,
"fps": project["fps"],
"resolutionWidth": project["resolution"].split("x")[0],
"resolutionHeight": project["resolution"].split("x")[1],
"zou_id": project["id"],
}
)
return UpdateOne(
{"_id": project_doc["_id"]},
{
"$set": {
"config.tasks": {
t["name"]: {"short_name": t.get("short_name", t["name"])}
for t in gazu.task.all_task_types_for_project(project)
},
"data": project_data,
}
},
)
def sync_all_project(login: str, password: str):
"""Update all OP projects in DB with Zou data.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
Raises:
gazu.exception.AuthFailedException: Wrong user login and/or password
"""
# Authenticate
if not validate_credentials(login, password):
raise gazu.exception.AuthFailedException(
f"Kitsu authentication failed for login: '{login}'..."
)
# Iterate projects
dbcon = AvalonMongoDB()
dbcon.install()
all_projects = gazu.project.all_open_projects()
for project in all_projects:
sync_project_from_kitsu(dbcon, project)
def sync_project_from_kitsu(dbcon: AvalonMongoDB, project: dict):
"""Update OP project in DB with Zou data.
Args:
dbcon (AvalonMongoDB): MongoDB connection
project (dict): Project dict fetched with gazu.
"""
bulk_writes = []
# Get project from zou
if not project:
project = gazu.project.get_project_by_name(project["name"])
print(f"Synchronizing {project['name']}...")
# Get all assets from zou
all_assets = gazu.asset.all_assets_for_project(project)
all_episodes = gazu.shot.all_episodes_for_project(project)
all_seqs = gazu.shot.all_sequences_for_project(project)
all_shots = gazu.shot.all_shots_for_project(project)
all_entities = [
item
for item in all_assets + all_episodes + all_seqs + all_shots
if naming_pattern.match(item["name"])
]
# Sync project. Create if doesn't exist
bulk_writes.append(write_project_to_op(project, dbcon))
# Try to find project document
dbcon.Session["AVALON_PROJECT"] = project["name"]
project_doc = dbcon.find_one({"type": "project"})
# Query all assets of the local project
zou_ids_and_asset_docs = {
asset_doc["data"]["zou"]["id"]: asset_doc
for asset_doc in dbcon.find({"type": "asset"})
if asset_doc["data"].get("zou", {}).get("id")
}
zou_ids_and_asset_docs[project["id"]] = project_doc
# Create entities root folders
project_module_settings = get_project_settings(project["name"])["kitsu"]
for entity_type, root in project_module_settings["entities_root"].items():
parent_folders = root.split("/")
direct_parent_doc = None
for i, folder in enumerate(parent_folders, 1):
parent_doc = dbcon.find_one(
{"type": "asset", "name": folder, "data.root_of": entity_type}
)
if not parent_doc:
insert_result = dbcon.insert_one(
{
"name": folder,
"type": "asset",
"schema": "openpype:asset-3.0",
"data": {
"root_of": entity_type,
# Only the folders above this one are parents
"parents": parent_folders[:i - 1],
"visualParent": direct_parent_doc,
"tasks": {},
},
}
)
# Keep the inserted id, not the InsertOneResult object
direct_parent_doc = insert_result.inserted_id
else:
direct_parent_doc = parent_doc["_id"]
# Create
to_insert = []
to_insert.extend(
[
create_op_asset(item)
for item in all_entities
if item["id"] not in zou_ids_and_asset_docs.keys()
]
)
if to_insert:
# Insert doc in DB
dbcon.insert_many(to_insert)
# Update existing docs
zou_ids_and_asset_docs.update(
{
asset_doc["data"]["zou"]["id"]: asset_doc
for asset_doc in dbcon.find({"type": "asset"})
if asset_doc["data"].get("zou")
}
)
# Update
bulk_writes.extend(
[
UpdateOne({"_id": id}, update)
for id, update in update_op_assets(
dbcon, project_doc, all_entities, zou_ids_and_asset_docs
)
]
)
# Delete
diff_assets = set(zou_ids_and_asset_docs.keys()) - {
e["id"] for e in all_entities + [project]
}
if diff_assets:
bulk_writes.extend(
[
DeleteOne(zou_ids_and_asset_docs[asset_id])
for asset_id in diff_assets
]
)
# Write into DB
if bulk_writes:
dbcon.bulk_write(bulk_writes)
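
A hedged sketch of driving the synchronization above from a maintenance script; the credentials and project name are placeholders:

```python
# Full sync of all open Kitsu projects into OpenPype.
sync_all_project(login="sync-bot@example.com", password="secret")

# Or sync a single project (assumes gazu is already authenticated,
# which sync_all_project otherwise does internally).
dbcon = AvalonMongoDB()
dbcon.install()
project = gazu.project.get_project_by_name("my_project")
sync_project_from_kitsu(dbcon, project)
```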


@ -0,0 +1,262 @@
"""Functions to update Kitsu DB (a.k.a Zou) using OpenPype Data."""
import re
import gazu
from pymongo import UpdateOne
from openpype.pipeline import AvalonMongoDB
from openpype.api import get_project_settings
from openpype.modules.kitsu.utils.credentials import validate_credentials
def sync_zou(login: str, password: str):
"""Synchronize Zou database (Kitsu backend) with openpype database.
This is an utility function to help updating zou data with OP's, it may not
handle correctly all cases, a human intervention might
be required after all.
Will work better if OP DB has been previously synchronized from zou/kitsu.
Args:
login (str): Kitsu user login
password (str): Kitsu user password
Raises:
gazu.exception.AuthFailedException: Wrong user login and/or password
"""
# Authenticate
if not validate_credentials(login, password):
raise gazu.exception.AuthFailedException(
f"Kitsu authentication failed for login: '{login}'..."
)
# Iterate projects
dbcon = AvalonMongoDB()
dbcon.install()
op_projects = [p for p in dbcon.projects()]
for project_doc in op_projects:
sync_zou_from_op_project(project_doc["name"], dbcon, project_doc)
def sync_zou_from_op_project(
project_name: str, dbcon: AvalonMongoDB, project_doc: dict = None
):
"""Update Zou project with OP data.
Args:
project_name (str): Name of project to sync
dbcon (AvalonMongoDB): MongoDB connection
project_doc (dict, optional): Project doc to sync
"""
# Get project doc if not provided
if not project_doc:
project_doc = dbcon.database[project_name].find_one(
{"type": "project"}
)
# Get all entities from zou
print(f"Synchronizing {project_name}...")
zou_project = gazu.project.get_project_by_name(project_name)
# Create project
if zou_project is None:
raise RuntimeError(
f"Project '{project_name}' doesn't exist in Zou database, "
"please create it in Kitsu and add OpenPype user to it before "
"running synchronization."
)
# Update project settings and data
if project_doc["data"]:
zou_project.update(
{
"code": project_doc["data"]["code"],
"fps": project_doc["data"]["fps"],
"resolution": f"{project_doc['data']['resolutionWidth']}"
f"x{project_doc['data']['resolutionHeight']}",
}
)
gazu.project.update_project_data(zou_project, data=project_doc["data"])
gazu.project.update_project(zou_project)
asset_types = gazu.asset.all_asset_types()
all_assets = gazu.asset.all_assets_for_project(zou_project)
all_episodes = gazu.shot.all_episodes_for_project(zou_project)
all_seqs = gazu.shot.all_sequences_for_project(zou_project)
all_shots = gazu.shot.all_shots_for_project(zou_project)
all_entities_ids = {
e["id"] for e in all_episodes + all_seqs + all_shots + all_assets
}
# Query all assets of the local project
project_module_settings = get_project_settings(project_name)["kitsu"]
dbcon.Session["AVALON_PROJECT"] = project_name
asset_docs = {
asset_doc["_id"]: asset_doc
for asset_doc in dbcon.find({"type": "asset"})
}
# Create new assets
new_assets_docs = [
doc
for doc in asset_docs.values()
if doc["data"].get("zou", {}).get("id") not in all_entities_ids
]
naming_pattern = project_module_settings["entities_naming_pattern"]
regex_ep = re.compile(
r"(.*{}.*)|(.*{}.*)|(.*{}.*)".format(
naming_pattern["shot"].replace("#", ""),
naming_pattern["sequence"].replace("#", ""),
naming_pattern["episode"].replace("#", ""),
),
re.IGNORECASE,
)
bulk_writes = []
for doc in new_assets_docs:
visual_parent_id = doc["data"]["visualParent"]
parent_substitutes = []
# Match asset type by its name
match = regex_ep.match(doc["name"])
if not match: # Asset
new_entity = gazu.asset.new_asset(
zou_project, asset_types[0], doc["name"]
)
# Match case in shot<sequence<episode order to support
# composed names like 'ep01_sq01_sh01'
elif match.group(1): # Shot
# Match and check parent doc
parent_doc = asset_docs[visual_parent_id]
zou_parent_id = parent_doc["data"]["zou"]["id"]
if parent_doc["data"].get("zou", {}).get("type") != "Sequence":
# Substitute name
# Use each pattern's own padding ('#' count)
ep_padding = naming_pattern["episode"].count("#")
episode_name = naming_pattern["episode"].replace(
"#" * ep_padding, "1".zfill(ep_padding)
)
seq_padding = naming_pattern["sequence"].count("#")
sequence_name = naming_pattern["sequence"].replace(
"#" * seq_padding, "1".zfill(seq_padding)
)
substitute_sequence_name = f"{episode_name}_{sequence_name}"
# Warn
print(
f"Shot {doc['name']} must be parented to a Sequence "
"in Kitsu. "
f"Creating automatically one substitute sequence "
f"called {substitute_sequence_name} in Kitsu..."
)
# Create new sequence and set it as substitute
created_sequence = gazu.shot.new_sequence(
zou_project,
substitute_sequence_name,
episode=zou_parent_id,
)
gazu.shot.update_sequence_data(
created_sequence, {"is_substitute": True}
)
parent_substitutes.append(created_sequence)
# Update parent ID
zou_parent_id = created_sequence["id"]
# Create shot
new_entity = gazu.shot.new_shot(
zou_project,
zou_parent_id,
doc["name"],
frame_in=doc["data"]["frameStart"],
frame_out=doc["data"]["frameEnd"],
nb_frames=doc["data"]["frameEnd"] - doc["data"]["frameStart"],
)
elif match.group(2): # Sequence
parent_doc = asset_docs[visual_parent_id]
new_entity = gazu.shot.new_sequence(
zou_project,
doc["name"],
episode=parent_doc["data"]["zou"]["id"],
)
elif match.group(3): # Episode
new_entity = gazu.shot.new_episode(zou_project, doc["name"])
# Update doc with zou id
doc["data"].update(
{
"visualParent": visual_parent_id,
"zou": new_entity,
}
)
bulk_writes.append(
UpdateOne(
{"_id": doc["_id"]},
{
"$set": {
"data.visualParent": visual_parent_id,
"data.zou": new_entity,
"data.parent_substitutes": parent_substitutes,
}
},
)
)
# Update assets
all_tasks_types = {t["name"]: t for t in gazu.task.all_task_types()}
assets_docs_to_update = [
doc
for doc in asset_docs.values()
if doc["data"].get("zou", {}).get("id") in all_entities_ids
]
for doc in assets_docs_to_update:
zou_id = doc["data"]["zou"]["id"]
if zou_id:
# Data
entity_data = {}
frame_in = doc["data"].get("frameStart")
frame_out = doc["data"].get("frameEnd")
if frame_in is not None and frame_out is not None:
entity_data.update(
{
"data": {
"frame_in": frame_in,
"frame_out": frame_out,
},
"nb_frames": frame_out - frame_in,
}
)
entity = gazu.raw.update("entities", zou_id, entity_data)
# Tasks
all_tasks_func = getattr(
gazu.task, f"all_tasks_for_{entity['type'].lower()}"
)
entity_tasks = {t["name"] for t in all_tasks_func(entity)}
for task_name in doc["data"]["tasks"].keys():
# Create only if new
if task_name not in entity_tasks:
task_type = all_tasks_types.get(task_name)
# Create non existing task
if not task_type:
task_type = gazu.task.new_task_type(task_name)
all_tasks_types[task_name] = task_type
# New task for entity
gazu.task.new_task(entity, task_type)
# Delete
deleted_entities = all_entities_ids - {
asset_doc["data"].get("zou", {}).get("id")
for asset_doc in asset_docs.values()
}
for entity_id in deleted_entities:
gazu.raw.delete(f"data/entities/{entity_id}")
# Write into DB
if bulk_writes:
dbcon.bulk_write(bulk_writes)
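
The reverse direction, pushing OpenPype data back to Kitsu, follows the same shape; credentials and the project name are again placeholders:

```python
# Push every OpenPype project to Kitsu/Zou.
sync_zou(login="sync-bot@example.com", password="secret")

# Or a single project (assumes gazu is already authenticated).
dbcon = AvalonMongoDB()
dbcon.install()
sync_zou_from_op_project("my_project", dbcon)
```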


@ -18,6 +18,16 @@ class InstancePlugin(pyblish.api.InstancePlugin):
super(InstancePlugin, cls).process(cls, *args, **kwargs)
class Integrator(InstancePlugin):
"""Integrator base class.
Wraps pyblish instance plugin. Targets set to "local" which means all
integrators should run on "local" publishes, by default.
"farm" targets could be used for integrators that should run on a farm.
"""
targets = ["local"]
class Extractor(InstancePlugin):
"""Extractor base class.
@ -28,6 +38,8 @@ class Extractor(InstancePlugin):
"""
targets = ["local"]
order = 2.0
def staging_dir(self, instance):
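
For illustration of the targets mechanism described above, a plugin meant to run only on a render farm would simply override `targets`; the class name is hypothetical:

```python
class FarmIntegrator(Integrator):
    """Runs only for publishes processed on a render farm."""
    targets = ["farm"]
```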


@ -33,10 +33,6 @@ class CollectHierarchy(pyblish.api.ContextPlugin):
family = instance.data["family"]
families = instance.data["families"]
# filter out all inappropriate instances
if not instance.data["publish"]:
continue
# exclude other families then self.families with intersection
if not set(self.families).intersection(set(families + [family])):
continue


@ -1,7 +1,5 @@
from copy import deepcopy
import pyblish.api
from openpype.pipeline import legacy_io
@ -17,33 +15,16 @@ class ExtractHierarchyToAvalon(pyblish.api.ContextPlugin):
if "hierarchyContext" not in context.data:
self.log.info("skipping IntegrateHierarchyToAvalon")
return
hierarchy_context = deepcopy(context.data["hierarchyContext"])
if not legacy_io.Session:
legacy_io.install()
active_assets = []
# filter only the active publishing instances
for instance in context:
if instance.data.get("publish") is False:
continue
if not instance.data.get("asset"):
continue
active_assets.append(instance.data["asset"])
# remove duplicates from the list
self.active_assets = list(set(active_assets))
self.log.debug("__ self.active_assets: {}".format(self.active_assets))
hierarchy_context = self._get_assets(hierarchy_context)
hierarchy_context = self._get_active_assets(context)
self.log.debug("__ hierarchy_context: {}".format(hierarchy_context))
input_data = context.data["hierarchyContext"] = hierarchy_context
self.project = None
self.import_to_avalon(input_data)
self.import_to_avalon(hierarchy_context)
def import_to_avalon(self, input_data, parent=None):
for name in input_data:
@ -183,23 +164,40 @@ class ExtractHierarchyToAvalon(pyblish.api.ContextPlugin):
return legacy_io.find_one({"_id": entity_id})
def _get_assets(self, input_dict):
def _get_active_assets(self, context):
""" Returns only asset dictionary.
Usually the last part of deep dictionary which
is not having any children
"""
input_dict_copy = deepcopy(input_dict)
for key in input_dict.keys():
self.log.debug("__ key: {}".format(key))
# check if child key is available
if input_dict[key].get("childs"):
# loop deeper
input_dict_copy[key]["childs"] = self._get_assets(
input_dict[key]["childs"])
else:
# filter out unwanted assets
if key not in self.active_assets:
def get_pure_hierarchy_data(input_dict):
input_dict_copy = deepcopy(input_dict)
for key in input_dict.keys():
self.log.debug("__ key: {}".format(key))
# check if child key is available
if input_dict[key].get("childs"):
# loop deeper
input_dict_copy[
key]["childs"] = get_pure_hierarchy_data(
input_dict[key]["childs"])
elif key not in active_assets:
input_dict_copy.pop(key, None)
return input_dict_copy
return input_dict_copy
hierarchy_context = context.data["hierarchyContext"]
active_assets = []
# filter only the active publishing instances
for instance in context:
if instance.data.get("publish") is False:
continue
if not instance.data.get("asset"):
continue
active_assets.append(instance.data["asset"])
# remove duplicates from the list
active_assets = list(set(active_assets))
self.log.debug("__ active_assets: {}".format(active_assets))
return get_pure_hierarchy_data(hierarchy_context)
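
A hedged illustration of what the filtering above does, using a made-up `hierarchyContext`:

```python
# Suppose only "sh010" is an active publishing instance:
active_assets = ["sh010"]
hierarchy_context = {
    "ep01": {
        "childs": {
            "sh010": {},  # active leaf -> kept
            "sh020": {},  # inactive leaf -> popped
        }
    }
}
# get_pure_hierarchy_data prunes "sh020" but keeps "ep01",
# because any key with "childs" is recursed into, not filtered.
```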


@ -173,7 +173,6 @@ class ExtractReviewSlate(openpype.api.Extractor):
self.log.debug("Slate Timecode: `{}`".format(
offset_timecode
))
input_args.extend(["-timecode", str(offset_timecode)])
if use_legacy_code:
format_args = []
@ -189,7 +188,6 @@ class ExtractReviewSlate(openpype.api.Extractor):
# make sure colors are correct
output_args.extend([
"-vf", "scale=out_color_matrix=bt709",
"-color_primaries", "bt709",
"-color_trc", "bt709",
"-colorspace", "bt709",
@ -230,6 +228,7 @@ class ExtractReviewSlate(openpype.api.Extractor):
scaling_arg = (
"scale={0}x{1}:flags=lanczos"
":out_color_matrix=bt709"
",pad={2}:{3}:{4}:{5}:black"
",setsar=1"
",fps={6}"


@ -139,6 +139,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
ef, instance.data["family"], instance.data["families"]))
return
# instance should be published on a farm
if instance.data.get("farm"):
return
self.integrated_file_sizes = {}
try:
self.register(instance)


@ -144,6 +144,7 @@ class PypeCommands:
pyblish.api.register_target("farm")
os.environ["OPENPYPE_PUBLISH_DATA"] = os.pathsep.join(paths)
os.environ["HEADLESS_PUBLISH"] = 'true' # to use in app lib
log.info("Running publish ...")
@ -173,9 +174,11 @@ class PypeCommands:
user_email, targets=None):
"""Opens installed variant of 'host' and run remote publish there.
Eventually should be yanked out to Webpublisher cli.
Currently implemented and tested for Photoshop where customer
wants to process uploaded .psd file and publish collected layers
from there.
from there. Triggered by Webpublisher.
Checks if no other batches are running (status == 'in_progress'). If
so, it sleeps for SLEEP (this is separate process),
@ -273,7 +276,8 @@ class PypeCommands:
def remotepublish(project, batch_path, user_email, targets=None):
"""Start headless publishing.
Used to publish rendered assets, workfiles etc.
Used to publish rendered assets, workfiles etc via Webpublisher.
Eventually should be yanked out to Webpublisher cli.
Publish use json from passed paths argument.
@ -309,6 +313,7 @@ class PypeCommands:
os.environ["AVALON_PROJECT"] = project
os.environ["AVALON_APP"] = host_name
os.environ["USER_EMAIL"] = user_email
os.environ["HEADLESS_PUBLISH"] = 'true' # to use in app lib
pyblish.api.register_host(host_name)
@ -331,9 +336,12 @@ class PypeCommands:
log.info("Publish finished.")
@staticmethod
def extractenvironments(
output_json_path, project, asset, task, app, env_group
):
def extractenvironments(output_json_path, project, asset, task, app,
env_group):
"""Produces json file with environment based on project and app.
Called by Deadline plugin to propagate environment into render jobs.
"""
if all((project, asset, task, app)):
from openpype.api import get_app_environments_for_context
env = get_app_environments_for_context(


@ -0,0 +1,11 @@
try:
from openpype.api import Logger
import openpype.lib.remote_publish
except ImportError as exc:
# Ensure Deadline fails by outputting an error that contains "Fatal Error:"
raise ImportError("Fatal Error: %s" % exc)
if __name__ == "__main__":
# Perform remote publish with thorough error checking
log = Logger.get_logger(__name__)
openpype.lib.remote_publish.publish(log, raise_error=True)


@ -109,6 +109,13 @@
"Omitted"
],
"name_sorting": false
},
"transfer_values_of_hierarchical_attributes": {
"enabled": true,
"role_list": [
"Administrator",
"Project manager"
]
}
},
"user_handlers": {


@ -0,0 +1,17 @@
{
"entities_root": {
"assets": "Assets",
"shots": "Shots"
},
"entities_naming_pattern": {
"episode": "E##",
"sequence": "SQ##",
"shot": "SH##"
},
"publish": {
"IntegrateKitsuNote": {
"set_status_note": false,
"note_status_shortname": "wfa"
}
}
}


@ -8,6 +8,7 @@
"yetiRig": "ma"
},
"maya-dirmap": {
"use_env_var_as_root": false,
"enabled": false,
"paths": {
"source-path": [],
@ -717,6 +718,15 @@
}
]
},
"templated_workfile_build": {
"profiles": [
{
"task_types": [],
"tasks": [],
"path": "/path/to/your/template"
}
]
},
"filters": {
"preset 1": {
"ValidateNoAnimation": false,


@ -27,6 +27,34 @@
}
]
},
"gizmo": [
{
"toolbar_menu_name": "OpenPype Gizmo",
"gizmo_source_dir": {
"windows": [],
"darwin": [],
"linux": []
},
"toolbar_icon_path": {
"windows": "",
"darwin": "",
"linux": ""
},
"gizmo_definition": [
{
"gizmo_toolbar_path": "/path/to/menu",
"sub_gizmo_list": [
{
"sourcetype": "python",
"title": "Gizmo Note",
"command": "nuke.nodes.StickyNote(label='You can create your own toolbar menu in the Nuke GizmoMenu of OpenPype')",
"shortcut": ""
}
]
}
]
}
],
"create": {
"CreateWriteRender": {
"fpath_template": "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}",


@ -137,6 +137,10 @@
}
}
},
"kitsu": {
"enabled": false,
"server": ""
},
"timers_manager": {
"enabled": true,
"auto_stop": true,


@ -62,6 +62,10 @@
"type": "schema",
"name": "schema_project_ftrack"
},
{
"type": "schema",
"name": "schema_project_kitsu"
},
{
"type": "schema",
"name": "schema_project_deadline"


@ -369,6 +369,25 @@
"key": "name_sorting"
}
]
},
{
"type": "dict",
"key": "transfer_values_of_hierarchical_attributes",
"label": "Action to transfer hierarchical attribute values",
"checkbox_key": "enabled",
"children": [
{
"type": "boolean",
"key": "enabled",
"label": "Enabled"
},
{
"type": "list",
"key": "role_list",
"label": "Roles",
"object_type": "text"
}
]
}
]
},


@ -0,0 +1,78 @@
{
"type": "dict",
"key": "kitsu",
"label": "Kitsu",
"collapsible": true,
"is_file": true,
"children": [
{
"type": "dict",
"key": "entities_root",
"label": "Entities root folder",
"children": [
{
"type": "text",
"key": "assets",
"label": "Assets:"
},
{
"type": "text",
"key": "shots",
"label": "Shots (includes Episodes & Sequences if any):"
}
]
},
{
"type": "dict",
"key": "entities_naming_pattern",
"label": "Entities naming pattern",
"children": [
{
"type": "text",
"key": "episode",
"label": "Episode:"
},
{
"type": "text",
"key": "sequence",
"label": "Sequence:"
},
{
"type": "text",
"key": "shot",
"label": "Shot:"
}
]
},
{
"type": "dict",
"collapsible": true,
"key": "publish",
"label": "Publish plugins",
"children": [
{
"type": "label",
"label": "Integrator"
},
{
"type": "dict",
"collapsible": true,
"key": "IntegrateKitsuNote",
"label": "Integrate Kitsu Note",
"children": [
{
"type": "boolean",
"key": "set_status_note",
"label": "Set status on note"
},
{
"type": "text",
"key": "note_status_shortname",
"label": "Note shortname"
}
]
}
]
}
]
}


@ -22,6 +22,12 @@
"label": "Maya Directory Mapping",
"is_group": true,
"children": [
{
"type": "boolean",
"key": "use_env_var_as_root",
"label": "Use env var placeholder in referenced paths",
"docstring": "Use ${} placeholder instead of absolute value of a root in referenced filepaths."
},
{
"type": "boolean",
"key": "enabled",
@ -67,6 +73,10 @@
"type": "schema",
"name": "schema_workfile_build"
},
{
"type": "schema",
"name": "schema_templated_workfile_build"
},
{
"type": "schema",
"name": "schema_publish_gui_filter"


@ -83,6 +83,10 @@
"type": "schema",
"name": "schema_scriptsmenu"
},
{
"type": "schema",
"name": "schema_nuke_scriptsgizmo"
},
{
"type": "dict",
"collapsible": true,


@ -0,0 +1,124 @@
{
"type": "list",
"key": "gizmo",
"label": "Gizmo Menu",
"is_group": true,
"use_label_wrap": true,
"object_type": {
"type": "dict",
"children": [
{
"type": "text",
"key": "toolbar_menu_name",
"label": "Toolbar Menu Name"
},
{
"type": "path",
"key": "gizmo_source_dir",
"label": "Gizmo directory path",
"multipath": true,
"multiplatform": true
},
{
"type": "collapsible-wrap",
"label": "Options",
"collapsible": true,
"collapsed": true,
"children": [
{
"type": "path",
"key": "toolbar_icon_path",
"label": "Toolbar Icon Path",
"multipath": false,
"multiplatform": true
},
{
"type": "splitter"
},
{
"type": "list",
"key": "gizmo_definition",
"label": "Gizmo definitions",
"use_label_wrap": true,
"object_type": {
"type": "dict",
"children": [
{
"type": "text",
"key": "gizmo_toolbar_path",
"label": "Gizmo Menu Path"
},
{
"type": "list",
"key": "sub_gizmo_list",
"label": "Sub Gizmo List",
"use_label_wrap": true,
"object_type": {
"type": "dict-conditional",
"enum_key": "sourcetype",
"enum_label": "Type of usage",
"enum_children": [
{
"key": "python",
"label": "Python",
"children": [
{
"type": "text",
"key": "title",
"label": "Title"
},
{
"type": "text",
"key": "command",
"label": "Python command"
},
{
"type": "text",
"key": "shortcut",
"label": "Hotkey"
}
]
},
{
"key": "file",
"label": "File",
"children": [
{
"type": "text",
"key": "title",
"label": "Title"
},
{
"type": "text",
"key": "file_name",
"label": "Gizmo file name"
},
{
"type": "text",
"key": "shortcut",
"label": "Hotkey"
}
]
},
{
"key": "separator",
"label": "Separator",
"children": [
{
"type": "text",
"key": "gizmo_toolbar_path",
"label": "Toolbar path"
}
]
}
]
}
}
]
}
}
]
}
]
}
}


@ -0,0 +1,35 @@
{
"type": "dict",
"collapsible": true,
"key": "templated_workfile_build",
"label": "Templated Workfile Build Settings",
"children": [
{
"type": "list",
"key": "profiles",
"label": "Profiles",
"object_type": {
"type": "dict",
"children": [
{
"key": "task_types",
"label": "Task types",
"type": "task-types-enum"
},
{
"key": "tasks",
"label": "Task names",
"type": "list",
"object_type": "text"
},
{
"key": "path",
"label": "Path to template",
"type": "text",
"object_type": "text"
}
]
}
}
]
}


@ -0,0 +1,23 @@
{
"type": "dict",
"key": "kitsu",
"label": "Kitsu",
"collapsible": true,
"require_restart": true,
"checkbox_key": "enabled",
"children": [
{
"type": "boolean",
"key": "enabled",
"label": "Enabled"
},
{
"type": "text",
"key": "server",
"label": "Server"
},
{
"type": "splitter"
}
]
}


@ -44,6 +44,10 @@
"type": "schema",
"name": "schema_ftrack"
},
{
"type": "schema",
"name": "schema_kitsu"
},
{
"type": "dict",
"key": "timers_manager",


@ -105,6 +105,7 @@ class HostToolsHelper:
loader_tool.show()
loader_tool.raise_()
loader_tool.activateWindow()
loader_tool.showNormal()
if use_context is None:
use_context = False
@ -180,6 +181,7 @@ class HostToolsHelper:
# Pull window to the front.
scene_inventory_tool.raise_()
scene_inventory_tool.activateWindow()
scene_inventory_tool.showNormal()
def get_library_loader_tool(self, parent):
"""Create, cache and return library loader tool window."""
@ -200,8 +202,10 @@ class HostToolsHelper:
library_loader_tool.show()
library_loader_tool.raise_()
library_loader_tool.activateWindow()
library_loader_tool.showNormal()
library_loader_tool.refresh()
def show_publish(self, parent=None):
"""Try showing the most desirable publish GUI
@ -243,6 +247,11 @@ class HostToolsHelper:
look_assigner_tool = self.get_look_assigner_tool(parent)
look_assigner_tool.show()
# Pull window to the front.
look_assigner_tool.raise_()
look_assigner_tool.activateWindow()
look_assigner_tool.showNormal()
def get_experimental_tools_dialog(self, parent=None):
"""Dialog of experimental tools.
@ -270,6 +279,7 @@ class HostToolsHelper:
dialog.show()
dialog.raise_()
dialog.activateWindow()
dialog.showNormal()
def get_tool_by_name(self, tool_name, parent=None, *args, **kwargs):
"""Show tool by it's name.


@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.10.1-nightly.2"
__version__ = "3.11.0-nightly.1"


@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
version = "3.10.1-nightly.2" # OpenPype
version = "3.11.0-nightly.1" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"
@ -40,6 +40,7 @@ clique = "1.6.*"
Click = "^7"
dnspython = "^2.1.0"
ftrack-python-api = "2.0.*"
gazu = "^0.8"
google-api-python-client = "^1.12.8" # sync server google support (should be separate?)
jsonschema = "^2.6.0"
keyring = "^22.0.1"
@ -64,7 +65,7 @@ jinxed = [
python3-xlib = { version="*", markers = "sys_platform == 'linux'"}
enlighten = "^1.9.0"
slack-sdk = "^3.6.0"
requests = "2.25.1"
requests = "^2.25.1"
pysftp = "^0.2.9"
dropbox = "^11.20.0"


@ -120,3 +120,54 @@ raw json.
You can configure path mapping using Maya's `dirmap` command. This will add bi-directional mapping between the
lists of paths specified in **Settings**. You can find it in **Settings -> Project Settings -> Maya -> Maya Directory Mapping**.
![Dirmap settings](assets/maya-admin_dirmap_settings.png)
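
Under the hood this relies on Maya's own `dirmap` command; a minimal sketch of the equivalent manual calls, with illustrative paths:

```python
from maya import cmds

cmds.dirmap(enable=True)
cmds.dirmap(mapDirectory=("P:/projects", "/mnt/projects"))
```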
## Templated Build Workfile
Build a workfile from a template designed by users, helping to ensure a homogeneous subset hierarchy and consistent imports. The template is stored as a file that is easy to define, change and customize for production needs.
**1. Make a template**
Make your template. Add families and everything needed for your tasks. Here is an example template for the modeling task using a placeholder to import a gauge.
![maya outliner](assets/maya-workfile-outliner.png)
If needed, you can add placeholders when the template needs to load some assets. **OpenPype > Template Builder > Create Placeholder**
![create placeholder](assets/maya-create_placeholder.png)
- **Configure placeholders**
Fill in the necessary fields (the optional fields are regex filters)
![new place holder](assets/maya-placeholder_new.png)
- Builder type: Whether the placeholder should load the current asset's representations or linked assets' representations
- Representation: Representation that will be loaded (e.g. ma, abc, png, etc.)
- Family: Family of the representation to load (main, look, image, etc.)
- Loader: Placeholder loader name that will be used to load the corresponding representations
- Order: Priority of the current placeholder loader (lowest runs first, highest last)
- **Save your template**
**2. Configure Template**
- **Go to Studio settings > Project > Your DCC > Templated Build Settings**
- Add a profile for your task and enter the path to your template
![setting build template](assets/settings/template_build_workfile.png)
**3. Build your workfile**
- Open Maya
- Build your workfile
![maya build template](assets/maya-build_workfile_from_template.png)
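
For reference, a small sketch of reading the configured profiles back through the settings API; the project name is illustrative and the keys follow the `templated_workfile_build` schema added in this release:

```python
from openpype.api import get_project_settings

settings = get_project_settings("my_project")
profiles = settings["maya"]["templated_workfile_build"]["profiles"]
for profile in profiles:
    # Each profile maps task types/names to a template path.
    print(profile["task_types"], profile["tasks"], profile["path"])
```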


@ -12,3 +12,7 @@ You can add your custom tools menu into Nuke by extending definitions in **Nuke
This is still a work in progress. Menu definitions will be handled in a more
user-friendly way with widgets instead of raw json.
:::
## Gizmo Menu
You can add your custom toolbar menu to Nuke by setting your gizmo path and extending the definitions in **Nuke -> Gizmo Menu**.
![Custom menu definition](assets/nuke-admin_gizmomenu.png)
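
For example, a single entry of that setting could look like the following (all values are illustrative), shown here as the equivalent Python data:

```python
gizmo_menu_entry = {
    "toolbar_menu_name": "Studio Gizmos",
    "gizmo_source_dir": {
        "windows": [],
        "darwin": [],
        "linux": ["/pipeline/nuke/gizmos"],
    },
    "gizmo_definition": [{
        "gizmo_toolbar_path": "/Studio/Keying",
        "sub_gizmo_list": [{
            "sourcetype": "file",
            "title": "Core Keyer",  # menu label
            "file_name": "core_keyer",  # gizmo file name
            "shortcut": "",
        }],
    }],
}
```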


@ -67,6 +67,7 @@ We have a few required anatomy templates for OpenPype to work properly, however
| `ext` | File extension |
| `representation` | Representation name |
| `frame` | Frame number for sequence files. |
| `app` | Application name |
| `output` | |
| `comment` | |
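
For example, a workfile template could use it as `{project[code]}_{asset}_{task}_{app}_v{version:0>3}<_{comment}>.{ext}` (an illustrative pattern, not a shipped default).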


@ -312,6 +312,10 @@ Example setup:
![Maya - Point Cache Example](assets/maya-pointcache_setup.png)
:::note Publish on farm
If your studio has Deadline configured, artists can choose to offload the potentially long-running export of a pointcache and publish it on the farm.
The only thing necessary is to toggle the `Farm` property of the created pointcache instance to True.
### Loading Point Caches
Loading a point cache means creating a reference to the **abc** file via **OpenPype → Load...**.


@ -0,0 +1,17 @@
---
id: artist_kitsu
title: Kitsu
sidebar_label: Kitsu
---
# How to use Kitsu in OpenPype
## Login to Kitsu module in OpenPype
1. Launch OpenPype; the `Kitsu Credentials` window will open automatically. If it does not, or if you want to log in with another account, go to the OpenPype systray icon and click on `Kitsu Connect`.
2. Enter your credentials and press *Ok*:
![kitsu-login](assets/kitsu/kitsu_credentials.png)
:::tip
In Kitsu, all the publish actions executed by `pyblish` will be attributed to the currently logged-in user.
:::

Binary files not shown: four new documentation images added (15 to 31 KiB each).
