Merge remote-tracking branch 'origin/develop' into feature/OP-3553_update-to-python-39

Ondřej Samohel 2022-09-12 11:32:17 +02:00
commit c1a9d5392a
No known key found for this signature in database
GPG key ID: 02376E18990A97C6
336 changed files with 6928 additions and 4624 deletions

.gitignore

@@ -107,3 +107,6 @@ website/.docusaurus
mypy.ini
tools/run_eventserver.*
# Developer tools
tools/dev_*


@@ -1,44 +1,94 @@
# Changelog
## [3.14.1-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.14.2-nightly.4](https://github.com/pypeclub/OpenPype/tree/HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.0...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.1...HEAD)
**🆕 New features**
- Nuke: Build workfile by template [\#3763](https://github.com/pypeclub/OpenPype/pull/3763)
- Houdini: Publishing workfiles [\#3697](https://github.com/pypeclub/OpenPype/pull/3697)
- Global: making collect audio plugin global [\#3679](https://github.com/pypeclub/OpenPype/pull/3679)
**🚀 Enhancements**
- Flame: Adding Creator's retimed shot and handles switch [\#3826](https://github.com/pypeclub/OpenPype/pull/3826)
- Flame: OpenPype submenu to batch and media manager [\#3825](https://github.com/pypeclub/OpenPype/pull/3825)
- General: Better pixmap scaling [\#3809](https://github.com/pypeclub/OpenPype/pull/3809)
- Photoshop: attempt to speed up ExtractImage [\#3793](https://github.com/pypeclub/OpenPype/pull/3793)
- SyncServer: Added cli commands for sync server [\#3765](https://github.com/pypeclub/OpenPype/pull/3765)
- Kitsu: Drop 'entities root' setting. [\#3739](https://github.com/pypeclub/OpenPype/pull/3739)
**🐛 Bug fixes**
- General: Fix Pattern access in client code [\#3828](https://github.com/pypeclub/OpenPype/pull/3828)
- Launcher: Skip opening last work file works for groups [\#3822](https://github.com/pypeclub/OpenPype/pull/3822)
- Maya: Publishing data key change [\#3811](https://github.com/pypeclub/OpenPype/pull/3811)
- Igniter: Fix status handling when version is already installed [\#3804](https://github.com/pypeclub/OpenPype/pull/3804)
- Resolve: Addon import is Python 2 compatible [\#3798](https://github.com/pypeclub/OpenPype/pull/3798)
- Hiero: retimed clip publishing is working [\#3792](https://github.com/pypeclub/OpenPype/pull/3792)
- nuke: validate write node is not failing due to wrong type [\#3780](https://github.com/pypeclub/OpenPype/pull/3780)
- Fix - changed format of version string in pyproject.toml [\#3777](https://github.com/pypeclub/OpenPype/pull/3777)
- Ftrack status fix typo prgoress -\> progress [\#3761](https://github.com/pypeclub/OpenPype/pull/3761)
- Fix version resolution [\#3757](https://github.com/pypeclub/OpenPype/pull/3757)
**🔀 Refactored code**
- Photoshop: Use new Extractor location [\#3789](https://github.com/pypeclub/OpenPype/pull/3789)
- Blender: Use new Extractor location [\#3787](https://github.com/pypeclub/OpenPype/pull/3787)
- AfterEffects: Use new Extractor location [\#3784](https://github.com/pypeclub/OpenPype/pull/3784)
- General: Remove unused teshost [\#3773](https://github.com/pypeclub/OpenPype/pull/3773)
- General: Copied 'Extractor' plugin to publish pipeline [\#3771](https://github.com/pypeclub/OpenPype/pull/3771)
- General: Move queries of asset and representation links [\#3770](https://github.com/pypeclub/OpenPype/pull/3770)
- General: Move create project folders to pipeline [\#3768](https://github.com/pypeclub/OpenPype/pull/3768)
- General: Create project function moved to client code [\#3766](https://github.com/pypeclub/OpenPype/pull/3766)
- General: Move hostdirname functionality into host [\#3749](https://github.com/pypeclub/OpenPype/pull/3749)
- General: Move publish utils to pipeline [\#3745](https://github.com/pypeclub/OpenPype/pull/3745)
- Houdini: Define houdini as addon [\#3735](https://github.com/pypeclub/OpenPype/pull/3735)
- Fusion: Defined fusion as addon [\#3733](https://github.com/pypeclub/OpenPype/pull/3733)
- Flame: Defined flame as addon [\#3732](https://github.com/pypeclub/OpenPype/pull/3732)
- Resolve: Define resolve as addon [\#3727](https://github.com/pypeclub/OpenPype/pull/3727)
**Merged pull requests:**
- Standalone Publisher: Ignore empty labels, then still use name like other asset models [\#3779](https://github.com/pypeclub/OpenPype/pull/3779)
- Kitsu - sync\_all\_project - add list ignore\_projects [\#3776](https://github.com/pypeclub/OpenPype/pull/3776)
## [3.14.1](https://github.com/pypeclub/OpenPype/tree/3.14.1) (2022-08-30)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.14.1-nightly.4...3.14.1)
### 📖 Documentation
- Documentation: Few updates [\#3698](https://github.com/pypeclub/OpenPype/pull/3698)
- Documentation: Settings development [\#3660](https://github.com/pypeclub/OpenPype/pull/3660)
**🆕 New features**
- Webpublisher: change create flatten image into tri state [\#3678](https://github.com/pypeclub/OpenPype/pull/3678)
**🚀 Enhancements**
- General: Thumbnail can use project roots [\#3750](https://github.com/pypeclub/OpenPype/pull/3750)
- git: update gitignore [\#3722](https://github.com/pypeclub/OpenPype/pull/3722)
- Settings: Remove settings lock on tray exit [\#3720](https://github.com/pypeclub/OpenPype/pull/3720)
- General: Added helper getters to modules manager [\#3712](https://github.com/pypeclub/OpenPype/pull/3712)
- Unreal: Define unreal as module and use host class [\#3701](https://github.com/pypeclub/OpenPype/pull/3701)
- Settings: Lock settings UI session [\#3700](https://github.com/pypeclub/OpenPype/pull/3700)
- General: Benevolent context label collector [\#3686](https://github.com/pypeclub/OpenPype/pull/3686)
- Ftrack: Store ftrack entities on hierarchy integration to instances [\#3677](https://github.com/pypeclub/OpenPype/pull/3677)
- Ftrack: More logs related to auto sync value change [\#3671](https://github.com/pypeclub/OpenPype/pull/3671)
- Blender: ops refresh manager after process events [\#3663](https://github.com/pypeclub/OpenPype/pull/3663)
**🐛 Bug fixes**
- Maya: Fix typo in getPanel argument `with\_focus` -\> `withFocus` [\#3753](https://github.com/pypeclub/OpenPype/pull/3753)
- General: Smaller fixes of imports [\#3748](https://github.com/pypeclub/OpenPype/pull/3748)
- General: Logger tweaks [\#3741](https://github.com/pypeclub/OpenPype/pull/3741)
- Nuke: missing job dependency if multiple bake streams [\#3737](https://github.com/pypeclub/OpenPype/pull/3737)
- Nuke: color-space settings from anatomy is working [\#3721](https://github.com/pypeclub/OpenPype/pull/3721)
- Settings: Fix studio default anatomy save [\#3716](https://github.com/pypeclub/OpenPype/pull/3716)
- Maya: Use project name instead of project code [\#3709](https://github.com/pypeclub/OpenPype/pull/3709)
- Settings: Fix project overrides save [\#3708](https://github.com/pypeclub/OpenPype/pull/3708)
- Workfiles tool: Fix published workfile filtering [\#3704](https://github.com/pypeclub/OpenPype/pull/3704)
- PS, AE: Provide default variant value for workfile subset [\#3703](https://github.com/pypeclub/OpenPype/pull/3703)
- RoyalRender: handle host name that is not set [\#3695](https://github.com/pypeclub/OpenPype/pull/3695)
- Flame: retime is working on clip publishing [\#3684](https://github.com/pypeclub/OpenPype/pull/3684)
- Webpublisher: added check for empty context [\#3682](https://github.com/pypeclub/OpenPype/pull/3682)
**🔀 Refactored code**
- General: Move delivery logic to pipeline [\#3751](https://github.com/pypeclub/OpenPype/pull/3751)
- General: Host addons cleanup [\#3744](https://github.com/pypeclub/OpenPype/pull/3744)
- Webpublisher: Webpublisher is used as addon [\#3740](https://github.com/pypeclub/OpenPype/pull/3740)
- Photoshop: Defined photoshop as addon [\#3736](https://github.com/pypeclub/OpenPype/pull/3736)
@@ -62,7 +112,6 @@
- Hiero: Define hiero as module [\#3717](https://github.com/pypeclub/OpenPype/pull/3717)
- Deadline: better logging for DL webservice failures [\#3694](https://github.com/pypeclub/OpenPype/pull/3694)
- Photoshop: resize saved images in ExtractReview for ffmpeg [\#3676](https://github.com/pypeclub/OpenPype/pull/3676)
## [3.14.0](https://github.com/pypeclub/OpenPype/tree/3.14.0) (2022-08-18)
@@ -72,71 +121,16 @@
- Ftrack: Additional component metadata [\#3685](https://github.com/pypeclub/OpenPype/pull/3685)
- Ftrack: Set task status on farm publishing [\#3680](https://github.com/pypeclub/OpenPype/pull/3680)
- Ftrack: Set task status on task creation in integrate hierarchy [\#3675](https://github.com/pypeclub/OpenPype/pull/3675)
- Maya: Disable rendering of all lights for render instances submitted through Deadline. [\#3661](https://github.com/pypeclub/OpenPype/pull/3661)
- General: Optimized OCIO configs [\#3650](https://github.com/pypeclub/OpenPype/pull/3650)
**🐛 Bug fixes**
- General: Switch from hero version to versioned works [\#3691](https://github.com/pypeclub/OpenPype/pull/3691)
- General: Fix finding of last version [\#3656](https://github.com/pypeclub/OpenPype/pull/3656)
- General: Extract Review can scale with pixel aspect ratio [\#3644](https://github.com/pypeclub/OpenPype/pull/3644)
- Maya: Refactor moved usage of CreateRender settings [\#3643](https://github.com/pypeclub/OpenPype/pull/3643)
- General: Hero version representations have full context [\#3638](https://github.com/pypeclub/OpenPype/pull/3638)
- Nuke: color settings for render write node is working now [\#3632](https://github.com/pypeclub/OpenPype/pull/3632)
- Maya: FBX support for update in reference loader [\#3631](https://github.com/pypeclub/OpenPype/pull/3631)
**🔀 Refactored code**
- General: Use client projects getter [\#3673](https://github.com/pypeclub/OpenPype/pull/3673)
- Resolve: Match folder structure to other hosts [\#3653](https://github.com/pypeclub/OpenPype/pull/3653)
- Maya: Hosts as modules [\#3647](https://github.com/pypeclub/OpenPype/pull/3647)
- TimersManager: Plugins are in timers manager module [\#3639](https://github.com/pypeclub/OpenPype/pull/3639)
- General: Move workfiles functions into pipeline [\#3637](https://github.com/pypeclub/OpenPype/pull/3637)
**Merged pull requests:**
- Deadline: Global job pre load is not Pype 2 compatible [\#3666](https://github.com/pypeclub/OpenPype/pull/3666)
- Maya: Remove unused get current renderer logic [\#3645](https://github.com/pypeclub/OpenPype/pull/3645)
- Kitsu|Fix: Movie project type fails & first loop children names [\#3636](https://github.com/pypeclub/OpenPype/pull/3636)
- fix the bug of failing to extract look when UDIMs format used in AiImage [\#3628](https://github.com/pypeclub/OpenPype/pull/3628)
- Flame: retime is working on clip publishing [\#3684](https://github.com/pypeclub/OpenPype/pull/3684)
## [3.13.0](https://github.com/pypeclub/OpenPype/tree/3.13.0) (2022-08-09)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.13.0-nightly.1...3.13.0)
**🆕 New features**
- Support for multiple installed versions - 3.13 [\#3605](https://github.com/pypeclub/OpenPype/pull/3605)
**🚀 Enhancements**
- Editorial: Mix audio use side file for ffmpeg filters [\#3630](https://github.com/pypeclub/OpenPype/pull/3630)
- Ftrack: Comment template can contain optional keys [\#3615](https://github.com/pypeclub/OpenPype/pull/3615)
- Ftrack: Add more metadata to ftrack components [\#3612](https://github.com/pypeclub/OpenPype/pull/3612)
**🐛 Bug fixes**
- Maya: fix aov separator in Redshift [\#3625](https://github.com/pypeclub/OpenPype/pull/3625)
- Fix for multi-version build on Mac [\#3622](https://github.com/pypeclub/OpenPype/pull/3622)
- Ftrack: Sync hierarchical attributes can handle new created entities [\#3621](https://github.com/pypeclub/OpenPype/pull/3621)
- General: Extract review aspect ratio scale is calculated by ffmpeg [\#3620](https://github.com/pypeclub/OpenPype/pull/3620)
- Maya: Fix types of default settings [\#3617](https://github.com/pypeclub/OpenPype/pull/3617)
- Integrator: Don't force to have dot before frame [\#3611](https://github.com/pypeclub/OpenPype/pull/3611)
- AfterEffects: refactored integrate doesn't work for multi frame publishes [\#3610](https://github.com/pypeclub/OpenPype/pull/3610)
- Maya look data contents fails with custom attribute on group [\#3607](https://github.com/pypeclub/OpenPype/pull/3607)
- TrayPublisher: Fix wrong conflict merge [\#3600](https://github.com/pypeclub/OpenPype/pull/3600)
**🔀 Refactored code**
- General: Plugin settings handled by plugins [\#3623](https://github.com/pypeclub/OpenPype/pull/3623)
- General: Naive implementation of document create, update, delete [\#3601](https://github.com/pypeclub/OpenPype/pull/3601)
**Merged pull requests:**
- Webpublisher: timeout for PS studio processing [\#3619](https://github.com/pypeclub/OpenPype/pull/3619)
- Core: translated validate\_containers.py into New publisher style [\#3614](https://github.com/pypeclub/OpenPype/pull/3614)
## [3.12.2](https://github.com/pypeclub/OpenPype/tree/3.12.2) (2022-07-27)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.2-nightly.4...3.12.2)


@@ -287,6 +287,14 @@ To run tests, execute `.\tools\run_tests(.ps1|.sh)`.
**Note that it needs existing virtual environment.**
Developer tools
-------------
In case you wish to add your own tools to the `.\tools` folder without git tracking, you can do so by naming them with the `dev_` prefix (example: `dev_clear_pyc(.ps1|.sh)`).
## Contributors ✨
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):


@@ -63,7 +63,7 @@ class OpenPypeVersion(semver.VersionInfo):
"""
staging = False
path = None
_VERSION_REGEX = re.compile(r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$") # noqa: E501
_VERSION_REGEX = re.compile(r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?") # noqa: E501
_installed_version = None
def __init__(self, *args, **kwargs):
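The only functional change to `_VERSION_REGEX` is dropping the trailing `$` anchor, so a valid version can now be matched at the start of a longer string (for example a file name carrying extra text after the version). A minimal sketch of the effect, using a simplified pattern rather than the full semver regex above:

```python
import re

# Simplified stand-ins for the anchored and relaxed patterns.
anchored = re.compile(r"(\d+)\.(\d+)\.(\d+)$")
relaxed = re.compile(r"(\d+)\.(\d+)\.(\d+)")

# Hypothetical input with a suffix that is not valid semver.
print(anchored.match("3.14.1.zip"))         # None - "$" rejects the suffix
print(relaxed.match("3.14.1.zip").group())  # "3.14.1"
```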


@@ -388,8 +388,11 @@ class InstallDialog(QtWidgets.QDialog):
install_thread.start()
def _installation_finished(self):
# TODO we should find out why status can be set to 'None'?
# - 'InstallThread.run' should handle all cases, so it is unclear where
#   that comes from
status = self._install_thread.result()
if status >= 0:
if status is not None and status >= 0:
self._update_progress(100)
QtWidgets.QApplication.processEvents()
self.done(3)


@@ -1,44 +1,82 @@
# absolute_import is needed to counter the `module has no cmds error` in Maya
from __future__ import absolute_import
import warnings
import functools
import pyblish.api
def get_errored_instances_from_context(context):
instances = list()
for result in context.data["results"]:
if result["instance"] is None:
# When instance is None we are on the "context" result
continue
if result["error"]:
instances.append(result["instance"])
return instances
class ActionDeprecatedWarning(DeprecationWarning):
pass
def get_errored_plugins_from_data(context):
"""Get all failed validation plugins
Args:
context (object):
Returns:
list of plugins which failed during validation
def deprecated(new_destination):
"""Mark functions as deprecated.
It will result in a warning being emitted when the function is used.
"""
plugins = list()
results = context.data.get("results", [])
for result in results:
if result["success"] is True:
continue
plugins.append(result["plugin"])
func = None
if callable(new_destination):
func = new_destination
new_destination = None
return plugins
def _decorator(decorated_func):
if new_destination is None:
warning_message = (
" Please check content of deprecated function to figure out"
" possible replacement."
)
else:
warning_message = " Please replace your usage with '{}'.".format(
new_destination
)
@functools.wraps(decorated_func)
def wrapper(*args, **kwargs):
warnings.simplefilter("always", ActionDeprecatedWarning)
warnings.warn(
(
"Call to deprecated function '{}'"
"\nFunction was moved or removed.{}"
).format(decorated_func.__name__, warning_message),
category=ActionDeprecatedWarning,
stacklevel=4
)
return decorated_func(*args, **kwargs)
return wrapper
if func is None:
return _decorator
return _decorator(func)
@deprecated("openpype.pipeline.publish.get_errored_instances_from_context")
def get_errored_instances_from_context(context):
"""
Deprecated:
Since 3.14.* will be removed in 3.16.* or later.
"""
from openpype.pipeline.publish import get_errored_instances_from_context
return get_errored_instances_from_context(context)
@deprecated("openpype.pipeline.publish.get_errored_plugins_from_context")
def get_errored_plugins_from_data(context):
"""
Deprecated:
Since 3.14.* will be removed in 3.16.* or later.
"""
from openpype.pipeline.publish import get_errored_plugins_from_context
return get_errored_plugins_from_context(context)
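The `deprecated` decorator above is written so it can be used both with a replacement path and bare, which is why it first checks whether its argument is callable. The bare form is not exercised in this file; a short sketch with a hypothetical function:

```python
@deprecated
def old_util():
    """Deprecated helper with no known replacement."""
    return None

# Calling it emits an ActionDeprecatedWarning with the generic hint:
#   Call to deprecated function 'old_util'
#   Function was moved or removed. Please check content of deprecated
#   function to figure out possible replacement.
old_util()
```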
# 'RepairAction' and 'RepairContextAction' were moved to
# 'openpype.pipeline.publish'; please change your imports.
# There is no "reasonable" way to mark these classes as deprecated and show
# a warning on wrong import.
# Deprecated since 3.14.* will be removed in 3.16.*
class RepairAction(pyblish.api.Action):
"""Repairs the action
@@ -65,6 +103,7 @@ class RepairAction(pyblish.api.Action):
plugin.repair(instance)
# Deprecated since 3.14.* will be removed in 3.16.*
class RepairContextAction(pyblish.api.Action):
"""Repairs the action


@@ -49,7 +49,6 @@ from .plugin import (
ValidateContentsOrder,
ValidateSceneOrder,
ValidateMeshOrder,
ValidationException
)
# temporary fix, might
@@ -94,8 +93,6 @@ __all__ = [
"RepairAction",
"RepairContextAction",
"ValidationException",
# get contextual data
"version_up",
"get_asset",


@@ -45,6 +45,17 @@ from .entities import (
get_workfile_info,
)
from .entity_links import (
get_linked_asset_ids,
get_linked_assets,
get_linked_representation_id,
)
from .operations import (
create_project,
)
__all__ = (
"OpenPypeMongoConnection",
@@ -88,4 +99,10 @@ __all__ = (
"get_thumbnail_id_from_source",
"get_workfile_info",
"get_linked_asset_ids",
"get_linked_assets",
"get_linked_representation_id",
"create_project",
)


@@ -14,6 +14,8 @@ from bson.objectid import ObjectId
from .mongo import get_project_database, get_project_connection
PatternType = type(re.compile(""))
def _prepare_fields(fields, required_fields=None):
if not fields:
@@ -32,17 +34,37 @@ def _prepare_fields(fields, required_fields=None):
return output
def _convert_id(in_id):
def convert_id(in_id):
"""Helper function for conversion of id from string to ObjectId.
Args:
in_id (Union[str, ObjectId, Any]): Entity id that should be converted
to right type for queries.
Returns:
Union[ObjectId, Any]: Id converted to ObjectId, or the input value unchanged.
"""
if isinstance(in_id, six.string_types):
return ObjectId(in_id)
return in_id
def _convert_ids(in_ids):
def convert_ids(in_ids):
"""Helper function for conversion of ids from string to ObjectId.
Args:
in_ids (Iterable[Union[str, ObjectId, Any]]): List of entity ids that
should be converted to right type for queries.
Returns:
List[ObjectId]: Ids converted to ObjectId.
"""
_output = set()
for in_id in in_ids:
if in_id is not None:
_output.add(_convert_id(in_id))
_output.add(convert_id(in_id))
return list(_output)
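Dropping the leading underscore makes these helpers part of the public client API (the new entity-links module imports them). Their behaviour, per the definitions above, in a short sketch (the hex string is a made-up but valid ObjectId value):

```python
from bson.objectid import ObjectId

oid = convert_id("633200000000000000000001")  # str -> ObjectId
assert isinstance(oid, ObjectId)
assert convert_id(oid) is oid     # ObjectId passes through untouched
assert convert_id(None) is None   # any other type passes through too

# convert_ids deduplicates and drops None entries.
assert convert_ids([oid, oid, None]) == [oid]
```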
@@ -58,7 +80,7 @@ def get_projects(active=True, inactive=False, fields=None):
yield project_doc
def get_project(project_name, active=True, inactive=False, fields=None):
def get_project(project_name, active=True, inactive=True, fields=None):
# Skip if both are disabled
if not active and not inactive:
return None
@@ -115,7 +137,7 @@ def get_asset_by_id(project_name, asset_id, fields=None):
None: Asset was not found by id.
"""
asset_id = _convert_id(asset_id)
asset_id = convert_id(asset_id)
if not asset_id:
return None
@@ -196,7 +218,7 @@ def _get_assets(
query_filter = {"type": {"$in": asset_types}}
if asset_ids is not None:
asset_ids = _convert_ids(asset_ids)
asset_ids = convert_ids(asset_ids)
if not asset_ids:
return []
query_filter["_id"] = {"$in": asset_ids}
@@ -207,7 +229,7 @@
query_filter["name"] = {"$in": list(asset_names)}
if parent_ids is not None:
parent_ids = _convert_ids(parent_ids)
parent_ids = convert_ids(parent_ids)
if not parent_ids:
return []
query_filter["data.visualParent"] = {"$in": parent_ids}
@@ -307,7 +329,7 @@ def get_asset_ids_with_subsets(project_name, asset_ids=None):
"type": "subset"
}
if asset_ids is not None:
asset_ids = _convert_ids(asset_ids)
asset_ids = convert_ids(asset_ids)
if not asset_ids:
return []
subset_query["parent"] = {"$in": asset_ids}
@@ -347,7 +369,7 @@ def get_subset_by_id(project_name, subset_id, fields=None):
Dict: Subset document which can be reduced to specified 'fields'.
"""
subset_id = _convert_id(subset_id)
subset_id = convert_id(subset_id)
if not subset_id:
return None
@@ -374,7 +396,7 @@ def get_subset_by_name(project_name, subset_name, asset_id, fields=None):
if not subset_name:
return None
asset_id = _convert_id(asset_id)
asset_id = convert_id(asset_id)
if not asset_id:
return None
@@ -428,13 +450,13 @@ def get_subsets(
query_filter = {"type": {"$in": subset_types}}
if asset_ids is not None:
asset_ids = _convert_ids(asset_ids)
asset_ids = convert_ids(asset_ids)
if not asset_ids:
return []
query_filter["parent"] = {"$in": asset_ids}
if subset_ids is not None:
subset_ids = _convert_ids(subset_ids)
subset_ids = convert_ids(subset_ids)
if not subset_ids:
return []
query_filter["_id"] = {"$in": subset_ids}
@@ -449,7 +471,7 @@ def get_subsets(
for asset_id, names in names_by_asset_ids.items():
if asset_id and names:
or_query.append({
"parent": _convert_id(asset_id),
"parent": convert_id(asset_id),
"name": {"$in": list(names)}
})
if not or_query:
@@ -510,7 +532,7 @@ def get_version_by_id(project_name, version_id, fields=None):
Dict: Version document which can be reduced to specified 'fields'.
"""
version_id = _convert_id(version_id)
version_id = convert_id(version_id)
if not version_id:
return None
@@ -537,7 +559,7 @@ def get_version_by_name(project_name, version, subset_id, fields=None):
Dict: Version document which can be reduced to specified 'fields'.
"""
subset_id = _convert_id(subset_id)
subset_id = convert_id(subset_id)
if not subset_id:
return None
@@ -567,7 +589,7 @@ def version_is_latest(project_name, version_id):
bool: True if is latest version from subset else False.
"""
version_id = _convert_id(version_id)
version_id = convert_id(version_id)
if not version_id:
return False
version_doc = get_version_by_id(
@@ -610,13 +632,13 @@ def _get_versions(
query_filter = {"type": {"$in": version_types}}
if subset_ids is not None:
subset_ids = _convert_ids(subset_ids)
subset_ids = convert_ids(subset_ids)
if not subset_ids:
return []
query_filter["parent"] = {"$in": subset_ids}
if version_ids is not None:
version_ids = _convert_ids(version_ids)
version_ids = convert_ids(version_ids)
if not version_ids:
return []
query_filter["_id"] = {"$in": version_ids}
@@ -690,7 +712,7 @@ def get_hero_version_by_subset_id(project_name, subset_id, fields=None):
Dict: Hero version entity data.
"""
subset_id = _convert_id(subset_id)
subset_id = convert_id(subset_id)
if not subset_id:
return None
@@ -720,7 +742,7 @@ def get_hero_version_by_id(project_name, version_id, fields=None):
Dict: Hero version entity data.
"""
version_id = _convert_id(version_id)
version_id = convert_id(version_id)
if not version_id:
return None
@@ -786,7 +808,7 @@ def get_output_link_versions(project_name, version_id, fields=None):
links for passed version.
"""
version_id = _convert_id(version_id)
version_id = convert_id(version_id)
if not version_id:
return []
@@ -812,7 +834,7 @@ def get_last_versions(project_name, subset_ids, fields=None):
dict[ObjectId, int]: Key is subset id and value is last version name.
"""
subset_ids = _convert_ids(subset_ids)
subset_ids = convert_ids(subset_ids)
if not subset_ids:
return {}
@@ -898,7 +920,7 @@ def get_last_version_by_subset_id(project_name, subset_id, fields=None):
Dict: Version document which can be reduced to specified 'fields'.
"""
subset_id = _convert_id(subset_id)
subset_id = convert_id(subset_id)
if not subset_id:
return None
@@ -971,7 +993,7 @@ def get_representation_by_id(project_name, representation_id, fields=None):
"type": {"$in": repre_types}
}
if representation_id is not None:
query_filter["_id"] = _convert_id(representation_id)
query_filter["_id"] = convert_id(representation_id)
conn = get_project_connection(project_name)
@@ -996,7 +1018,7 @@ def get_representation_by_name(
to specified 'fields'.
"""
version_id = _convert_id(version_id)
version_id = convert_id(version_id)
if not version_id or not representation_name:
return None
repre_types = ["representation", "archived_representations"]
@@ -1034,11 +1056,11 @@ def _regex_filters(filters):
for key, value in filters.items():
regexes = []
a_values = []
if isinstance(value, re.Pattern):
if isinstance(value, PatternType):
regexes.append(value)
elif isinstance(value, (list, tuple, set)):
for item in value:
if isinstance(item, re.Pattern):
if isinstance(item, PatternType):
regexes.append(item)
else:
a_values.append(item)
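`re.Pattern` exists only on Python 3.7+, while client code must keep running under older interpreters embedded in DCCs, so the module-level alias `PatternType = type(re.compile(""))` is used instead; this is the change tracked by the "Fix Pattern access in client code" entry in the changelog above. A minimal check of the alias:

```python
import re

PatternType = type(re.compile(""))

assert isinstance(re.compile(r"\d+"), PatternType)
# On Python 3.7+ the alias is exactly the public class:
#   PatternType is re.Pattern  -> True
```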
@@ -1089,7 +1111,7 @@ def _get_representations(
query_filter = {"type": {"$in": repre_types}}
if representation_ids is not None:
representation_ids = _convert_ids(representation_ids)
representation_ids = convert_ids(representation_ids)
if not representation_ids:
return default_output
query_filter["_id"] = {"$in": representation_ids}
@@ -1100,7 +1122,7 @@ def _get_representations(
query_filter["name"] = {"$in": list(representation_names)}
if version_ids is not None:
version_ids = _convert_ids(version_ids)
version_ids = convert_ids(version_ids)
if not version_ids:
return default_output
query_filter["parent"] = {"$in": version_ids}
@@ -1111,7 +1133,7 @@ def _get_representations(
for version_id, names in names_by_version_ids.items():
if version_id and names:
or_query.append({
"parent": _convert_id(version_id),
"parent": convert_id(version_id),
"name": {"$in": list(names)}
})
if not or_query:
@@ -1174,7 +1196,7 @@ def get_representations(
as filter. Filter ignored if 'None' is passed.
version_ids (Iterable[str]): Subset ids used as parent filter. Filter
ignored if 'None' is passed.
context_filters (Dict[str, List[str, re.Pattern]]): Filter by
context_filters (Dict[str, List[str, PatternType]]): Filter by
representation context fields.
names_by_version_ids (dict[ObjectId, list[str]]): Complex filtering
using version ids and list of names under the version.
@@ -1220,7 +1242,7 @@ def get_archived_representations(
as filter. Filter ignored if 'None' is passed.
version_ids (Iterable[str]): Subset ids used as parent filter. Filter
ignored if 'None' is passed.
context_filters (Dict[str, List[str, re.Pattern]]): Filter by
context_filters (Dict[str, List[str, PatternType]]): Filter by
representation context fields.
names_by_version_ids (dict[ObjectId, List[str]]): Complex filtering
using version ids and list of names under the version.
@@ -1361,7 +1383,7 @@ def get_thumbnail_id_from_source(project_name, src_type, src_id):
if not src_type or not src_id:
return None
query_filter = {"_id": _convert_id(src_id)}
query_filter = {"_id": convert_id(src_id)}
conn = get_project_connection(project_name)
src_doc = conn.find_one(query_filter, {"data.thumbnail_id"})
@@ -1388,7 +1410,7 @@ def get_thumbnails(project_name, thumbnail_ids, fields=None):
"""
if thumbnail_ids:
thumbnail_ids = _convert_ids(thumbnail_ids)
thumbnail_ids = convert_ids(thumbnail_ids)
if not thumbnail_ids:
return []
@@ -1416,7 +1438,7 @@ def get_thumbnail(project_name, thumbnail_id, fields=None):
if not thumbnail_id:
return None
query_filter = {"type": "thumbnail", "_id": _convert_id(thumbnail_id)}
query_filter = {"type": "thumbnail", "_id": convert_id(thumbnail_id)}
conn = get_project_connection(project_name)
return conn.find_one(query_filter, _prepare_fields(fields))
@@ -1444,7 +1466,7 @@ def get_workfile_info(
query_filter = {
"type": "workfile",
"parent": _convert_id(asset_id),
"parent": convert_id(asset_id),
"task_name": task_name,
"filename": filename
}


@@ -0,0 +1,232 @@
from .mongo import get_project_connection
from .entities import (
get_assets,
get_asset_by_id,
get_representation_by_id,
convert_id,
)
def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
"""Extract linked asset ids from asset document.
One of asset document or asset id must be passed.
Note:
Asset links currently work only from asset to assets.
Args:
project_name (str): Name of project where to look for queried entities.
asset_doc (dict): Asset document from DB.
asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
asset document.
Returns:
List[Union[ObjectId, str]]: Asset ids of input links.
"""
output = []
if not asset_doc and not asset_id:
return output
if not asset_doc:
asset_doc = get_asset_by_id(
project_name, asset_id, fields=["data.inputLinks"]
)
input_links = asset_doc["data"].get("inputLinks")
if not input_links:
return output
for item in input_links:
# Backwards compatibility for "_id" key which was replaced with
# "id"
if "_id" in item:
link_id = item["_id"]
else:
link_id = item["id"]
output.append(link_id)
return output
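For illustration, a sketch of the asset document shape this function handles; the values are hypothetical, and both the current `id` key and the legacy `_id` key are honored:

```python
from bson.objectid import ObjectId

# Hypothetical asset document with two input links.
asset_doc = {
    "data": {
        "inputLinks": [
            {"id": ObjectId("633200000000000000000001"), "type": "reference"},
            {"_id": ObjectId("633200000000000000000002")},  # legacy key
        ]
    }
}
# get_linked_asset_ids would return both ObjectIds for this document.
```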
def get_linked_assets(
project_name, asset_doc=None, asset_id=None, fields=None
):
"""Return linked assets based on passed asset document.
One of asset document or asset id must be passed.
Args:
project_name (str): Name of project where to look for queried entities.
asset_doc (Dict[str, Any]): Asset document from database.
asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
asset document.
fields (Iterable[str]): Fields that should be returned. All fields are
returned if 'None' is passed.
Returns:
List[Dict[str, Any]]: Asset documents of input links for passed
asset doc.
"""
if not asset_doc:
if not asset_id:
return []
asset_doc = get_asset_by_id(
project_name,
asset_id,
fields=["data.inputLinks"]
)
if not asset_doc:
return []
link_ids = get_linked_asset_ids(project_name, asset_doc=asset_doc)
if not link_ids:
return []
return list(get_assets(project_name, asset_ids=link_ids, fields=fields))
def get_linked_representation_id(
project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
):
"""Returns list of linked ids of particular type (if provided).
One of representation document or representation id must be passed.
Note:
Representation links currently work only from representation through version
back to representations.
Args:
project_name (str): Name of project where to look for links.
repre_doc (Dict[str, Any]): Representation document.
repre_id (Union[ObjectId, str]): Representation id.
link_type (str): Type of link (e.g. 'reference', ...).
max_depth (int): Limit recursion level. Default: 0
Returns:
List[ObjectId]: Linked representation ids.
"""
if repre_doc:
repre_id = repre_doc["_id"]
if repre_id:
repre_id = convert_id(repre_id)
if not repre_id and not repre_doc:
return []
version_id = None
if repre_doc:
version_id = repre_doc.get("parent")
if not version_id:
repre_doc = get_representation_by_id(
project_name, repre_id, fields=["parent"]
)
version_id = repre_doc["parent"]
if not version_id:
return []
if max_depth is None:
max_depth = 0
match = {
"_id": version_id,
"type": {"$in": ["version", "hero_version"]}
}
graph_lookup = {
"from": project_name,
"startWith": "$data.inputLinks.id",
"connectFromField": "data.inputLinks.id",
"connectToField": "_id",
"as": "outputs_recursive",
"depthField": "depth"
}
if max_depth != 0:
# We offset by -1 since 0 basically means no recursion
# but the recursion only happens after the initial lookup
# for outputs.
graph_lookup["maxDepth"] = max_depth - 1
query_pipeline = [
# Match
{"$match": match},
# Recursive graph lookup for inputs
{"$graphLookup": graph_lookup}
]
conn = get_project_connection(project_name)
result = conn.aggregate(query_pipeline)
referenced_version_ids = _process_referenced_pipeline_result(
result, link_type
)
if not referenced_version_ids:
return []
ref_ids = conn.distinct(
"_id",
filter={
"parent": {"$in": list(referenced_version_ids)},
"type": "representation"
}
)
return list(ref_ids)
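The aggregation starts at the representation's parent version and recursively follows `data.inputLinks.id` via `$graphLookup`; `max_depth` maps to Mongo's `maxDepth` offset by one, because the initial lookup already covers the first level. A hypothetical call (project name and id are placeholders):

```python
# Find representations referenced by this representation, following
# 'reference' links up to two levels deep.
linked_ids = get_linked_representation_id(
    "my_project",
    repre_id="633200000000000000000001",
    link_type="reference",
    max_depth=2,
)
```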
def _process_referenced_pipeline_result(result, link_type):
"""Filters result from pipeline for particular link_type.
Pipeline cannot use link_type directly in a query.
Returns:
Set[ObjectId]: Version ids that carry the requested link type.
"""
referenced_version_ids = set()
correctly_linked_ids = set()
for item in result:
input_links = item["data"].get("inputLinks")
if not input_links:
continue
_filter_input_links(
input_links,
link_type,
correctly_linked_ids
)
# outputs_recursive in random order, sort by depth
outputs_recursive = item.get("outputs_recursive")
if not outputs_recursive:
continue
for output in sorted(outputs_recursive, key=lambda o: o["depth"]):
output_links = output["data"].get("inputLinks")
if not output_links:
continue
# Leaf
if output["_id"] not in correctly_linked_ids:
continue
_filter_input_links(
output_links,
link_type,
correctly_linked_ids
)
referenced_version_ids.add(output["_id"])
return referenced_version_ids
def _filter_input_links(input_links, link_type, correctly_linked_ids):
for input_link in input_links:
if link_type and input_link["type"] != link_type:
continue
link_id = input_link.get("id") or input_link.get("_id")
if link_id is not None:
correctly_linked_ids.add(link_id)


@@ -9,6 +9,7 @@ from bson.objectid import ObjectId
from pymongo import DeleteOne, InsertOne, UpdateOne
from .mongo import get_project_connection
from .entities import get_project
REMOVED_VALUE = object()
@@ -24,6 +25,7 @@ CURRENT_SUBSET_SCHEMA = "openpype:subset-3.0"
CURRENT_VERSION_SCHEMA = "openpype:version-3.0"
CURRENT_REPRESENTATION_SCHEMA = "openpype:representation-2.0"
CURRENT_WORKFILE_INFO_SCHEMA = "openpype:workfile-1.0"
CURRENT_THUMBNAIL_SCHEMA = "openpype:thumbnail-1.0"
def _create_or_convert_to_mongo_id(mongo_id):
@@ -195,6 +197,29 @@ def new_representation_doc(
}
def new_thumbnail_doc(data=None, entity_id=None):
"""Create skeleton data of thumbnail document.
Args:
data (Dict[str, Any]): Thumbnail document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of thumbnail document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"type": "thumbnail",
"schema": CURRENT_THUMBNAIL_SCHEMA,
"data": data
}
def new_workfile_info_doc(
filename, asset_id, task_name, files, data=None, entity_id=None
):
@@ -638,3 +663,89 @@ class OperationsSession(object):
operation = DeleteOperation(project_name, entity_type, entity_id)
self.add(operation)
return operation
def create_project(project_name, project_code, library_project=False):
"""Create project using OpenPype settings.
This project creation function does not validate the project document on
creation, because the document is created blindly with only the minimum
required information about the project: its name, code, type and schema.
The entered project name must be unique and the project must not exist yet.
Note:
This function is here to be OP v4 ready, but in v3 it has more logic to
perform; that is why the imports are inside the function body.
Args:
project_name(str): New project name. Should be unique.
project_code(str): Project's code should be unique too.
library_project(bool): Project is library project.
Raises:
ValueError: When project name already exists in MongoDB.
Returns:
dict: Created project document.
"""
from openpype.settings import ProjectSettings, SaveWarningExc
from openpype.pipeline.schema import validate
if get_project(project_name, fields=["name"]):
raise ValueError("Project with name \"{}\" already exists".format(
project_name
))
if not PROJECT_NAME_REGEX.match(project_name):
raise ValueError((
"Project name \"{}\" contain invalid characters"
).format(project_name))
project_doc = {
"type": "project",
"name": project_name,
"data": {
"code": project_code,
"library_project": library_project
},
"schema": CURRENT_PROJECT_SCHEMA
}
op_session = OperationsSession()
# Insert document with basic data
create_op = op_session.create_entity(
project_name, project_doc["type"], project_doc
)
op_session.commit()
# Load ProjectSettings for the project and save it to store all attributes
# and Anatomy
try:
project_settings_entity = ProjectSettings(project_name)
project_settings_entity.save()
except SaveWarningExc as exc:
print(str(exc))
except Exception:
op_session.delete_entity(
project_name, project_doc["type"], create_op.entity_id
)
op_session.commit()
raise
project_doc = get_project(project_name)
try:
# Validate created project document
validate(project_doc)
except Exception:
# Remove project if is not valid
op_session.delete_entity(
project_name, project_doc["type"], create_op.entity_id
)
op_session.commit()
raise
return project_doc
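If saving the project settings or validating the created document fails, the partially created project is deleted again and the exception is re-raised, so the function either returns a valid document or leaves no trace. A hypothetical call (names are placeholders; a configured OpenPype MongoDB connection is assumed):

```python
project_doc = create_project("demo_project", "demo", library_project=False)
print(project_doc["name"], project_doc["data"]["code"])
```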


@@ -1,8 +1,6 @@
import os
from openpype.lib import (
PreLaunchHook,
create_workdir_extra_folders
)
from openpype.lib import PreLaunchHook
from openpype.pipeline.workfile import create_workdir_extra_folders
class AddLastWorkfileToLaunchArgs(PreLaunchHook):


@@ -1,13 +1,22 @@
from .host import (
HostBase,
)
from .interfaces import (
IWorkfileHost,
ILoadHost,
INewPublisher,
)
from .dirmap import HostDirmap
__all__ = (
"HostBase",
"IWorkfileHost",
"ILoadHost",
"INewPublisher",
"HostDirmap",
)

openpype/host/dirmap.py

@@ -0,0 +1,205 @@
"""Dirmap functionality used in host integrations inside DCCs.
The idea for the current dirmap implementation comes from Maya, where it is
possible to enter source and destination roots; Maya then tries to replace
each source path found in a referenced file with each of the destination
paths. The first path that exists is used.
"""
import os
from abc import ABCMeta, abstractmethod
import six
from openpype.lib import Logger
from openpype.modules import ModulesManager
from openpype.settings import get_project_settings
from openpype.settings.lib import get_site_local_overrides
@six.add_metaclass(ABCMeta)
class HostDirmap(object):
"""Abstract class for running dirmap on a workfile in a host.
Dirmap is used to translate paths inside a host workfile from one
OS to another. (E.g. an artist created the workfile on Windows and a
different artist opens the same file on Linux.)
Expects these methods to be implemented by the host:
on_enable_dirmap: run host code for enabling dirmap
dirmap_routine: run host code to do the actual remapping
"""
def __init__(
self, host_name, project_name, project_settings=None, sync_module=None
):
self.host_name = host_name
self.project_name = project_name
self._project_settings = project_settings
self._sync_module = sync_module # to limit reinit of Modules
self._log = None
self._mapping = None # cache mapping
@property
def sync_module(self):
if self._sync_module is None:
manager = ModulesManager()
self._sync_module = manager["sync_server"]
return self._sync_module
@property
def project_settings(self):
if self._project_settings is None:
self._project_settings = get_project_settings(self.project_name)
return self._project_settings
@property
def log(self):
if self._log is None:
self._log = Logger.get_logger(self.__class__.__name__)
return self._log
@abstractmethod
def on_enable_dirmap(self):
"""Run host dependent operation for enabling dirmap if necessary."""
pass
@abstractmethod
def dirmap_routine(self, source_path, destination_path):
"""Run host dependent remapping from source_path to destination_path"""
pass
def process_dirmap(self):
# type: () -> None
"""Go through all paths in Settings and set them using `dirmap`.
If the artist has Site Sync enabled and is syncing the workfile locally,
the dirmap mapping is taken directly from Local Settings.
Settings are read from 'self.project_settings'.
"""
if not self._mapping:
self._mapping = self.get_mappings(self.project_settings)
if not self._mapping:
return
self.log.info("Processing directory mapping ...")
self.on_enable_dirmap()
self.log.info("mapping:: {}".format(self._mapping))
for k, sp in enumerate(self._mapping["source-path"]):
dst = self._mapping["destination-path"][k]
try:
print("{} -> {}".format(sp, dst))
self.dirmap_routine(sp, dst)
except IndexError:
# missing corresponding destination path
self.log.error((
"invalid dirmap mapping, missing corresponding"
" destination directory."
))
break
except RuntimeError:
self.log.error(
"invalid path {} -> {}, mapping not registered".format(
sp, dst
)
)
continue
def get_mappings(self, project_settings):
"""Get translation from source-path to destination-path.
Checks if Site Sync is enabled and the user chose to use the local
site; in that case the configuration in Local Settings takes precedence.
"""
local_mapping = self._get_local_sync_dirmap(project_settings)
dirmap_label = "{}-dirmap".format(self.host_name)
if (
not self.project_settings[self.host_name].get(dirmap_label)
and not local_mapping
):
return {}
mapping_settings = self.project_settings[self.host_name][dirmap_label]
mapping_enabled = mapping_settings["enabled"] or bool(local_mapping)
if not mapping_enabled:
return {}
mapping = (
local_mapping
or mapping_settings["paths"]
or {}
)
if (
not mapping
or not mapping.get("destination-path")
or not mapping.get("source-path")
):
return {}
return mapping
def _get_local_sync_dirmap(self, project_settings):
"""
Returns dirmap if sync to a local project is enabled.
The only valid mapping is from the roots of the remote site to the
local site set in Local Settings.
Args:
project_settings (dict)
Returns:
dict : { "source-path": [XXX], "destination-path": [YYYY]}
"""
mapping = {}
if not project_settings["global"]["sync_server"]["enabled"]:
return mapping
project_name = os.getenv("AVALON_PROJECT")
active_site = self.sync_module.get_local_normalized_site(
self.sync_module.get_active_site(project_name))
remote_site = self.sync_module.get_local_normalized_site(
self.sync_module.get_remote_site(project_name))
self.log.debug(
"active {} - remote {}".format(active_site, remote_site)
)
if (
active_site == "local"
and project_name in self.sync_module.get_enabled_projects()
and active_site != remote_site
):
sync_settings = self.sync_module.get_sync_project_setting(
project_name,
exclude_locals=False,
cached=False)
active_overrides = get_site_local_overrides(
project_name, active_site)
remote_overrides = get_site_local_overrides(
project_name, remote_site)
self.log.debug("local overrides {}".format(active_overrides))
self.log.debug("remote overrides {}".format(remote_overrides))
for root_name, active_site_dir in active_overrides.items():
remote_site_dir = (
remote_overrides.get(root_name)
or sync_settings["sites"][remote_site]["root"][root_name]
)
if os.path.isdir(active_site_dir):
if "destination-path" not in mapping:
mapping["destination-path"] = []
mapping["destination-path"].append(active_site_dir)
if "source-path" not in mapping:
mapping["source-path"] = []
mapping["source-path"].append(remote_site_dir)
self.log.debug("local sync mapping:: {}".format(mapping))
return mapping
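A host integration consumes this class by subclassing it and implementing the two abstract methods. A minimal sketch for a hypothetical host (the actual remapping would use the host's own API):

```python
class MyHostDirmap(HostDirmap):
    """Dirmap for a hypothetical host named 'myhost'."""

    def on_enable_dirmap(self):
        # Switch on the host's path-mapping feature, if it has one.
        pass

    def dirmap_routine(self, source_path, destination_path):
        # Register one source -> destination mapping via the host's API;
        # raise RuntimeError if the mapping cannot be registered.
        print("{} -> {}".format(source_path, destination_path))


# dirmap = MyHostDirmap("myhost", "my_project")
# dirmap.process_dirmap()
```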


@@ -1,37 +1,12 @@
import logging
import contextlib
from abc import ABCMeta, abstractproperty, abstractmethod
from abc import ABCMeta, abstractproperty
import six
# NOTE can't import 'typing' because of issues in Maya 2020
# - shiboken crashes on 'typing' module import
class MissingMethodsError(ValueError):
"""Exception when host miss some required methods for specific workflow.
Args:
host (HostBase): Host implementation where are missing methods.
missing_methods (list[str]): List of missing methods.
"""
def __init__(self, host, missing_methods):
joined_missing = ", ".join(
['"{}"'.format(item) for item in missing_methods]
)
if isinstance(host, HostBase):
host_name = host.name
else:
try:
host_name = host.__file__.replace("\\", "/").split("/")[-3]
except Exception:
host_name = str(host)
message = (
"Host \"{}\" miss methods {}".format(host_name, joined_missing)
)
super(MissingMethodsError, self).__init__(message)
@six.add_metaclass(ABCMeta)
class HostBase(object):
"""Base of host implementation class.
@@ -185,347 +160,3 @@ class HostBase(object):
yield
finally:
pass
class ILoadHost:
"""Implementation requirements to be able use reference of representations.
The load plugins can do referencing even without implementation of methods
here, but switch and removement of containers would not be possible.
Questions:
- Is list container dependency of host or load plugins?
- Should this be directly in HostBase?
- how to find out if referencing is available?
- do we need to know that?
"""
@staticmethod
def get_missing_load_methods(host):
"""Look for missing methods on "old type" host implementation.
Method is used for validation of implemented functions related to
loading. Checks only existence of methods.
Args:
host (Union[ModuleType, HostBase]): Object of host where to look for
required methods.
Returns:
list[str]: Missing method implementations for loading workflow.
"""
if isinstance(host, ILoadHost):
return []
required = ["ls"]
missing = []
for name in required:
if not hasattr(host, name):
missing.append(name)
return missing
@staticmethod
def validate_load_methods(host):
"""Validate implemented methods of "old type" host for load workflow.
Args:
host (Union[ModuleType, HostBase]): Object of host to validate.
Raises:
MissingMethodsError: If there are missing methods on host
implementation.
"""
missing = ILoadHost.get_missing_load_methods(host)
if missing:
raise MissingMethodsError(host, missing)
@abstractmethod
def get_containers(self):
"""Retreive referenced containers from scene.
This can be implemented in hosts where referencing can be used.
Todo:
Rename function to something more self explanatory.
Suggestion: 'get_containers'
Returns:
list[dict]: Information about loaded containers.
"""
pass
# --- Deprecated method names ---
def ls(self):
"""Deprecated variant of 'get_containers'.
Todo:
Remove when all usages are replaced.
"""
return self.get_containers()
@six.add_metaclass(ABCMeta)
class IWorkfileHost:
"""Implementation requirements to be able use workfile utils and tool."""
@staticmethod
def get_missing_workfile_methods(host):
"""Look for missing methods on "old type" host implementation.
Method is used for validation of implemented functions related to
workfiles. Checks only existence of methods.
Args:
host (Union[ModuleType, HostBase]): Object of host where to look for
required methods.
Returns:
list[str]: Missing method implementations for workfiles workflow.
"""
if isinstance(host, IWorkfileHost):
return []
required = [
"open_file",
"save_file",
"current_file",
"has_unsaved_changes",
"file_extensions",
"work_root",
]
missing = []
for name in required:
if not hasattr(host, name):
missing.append(name)
return missing
@staticmethod
def validate_workfile_methods(host):
"""Validate methods of "old type" host for workfiles workflow.
Args:
host (Union[ModuleType, HostBase]): Object of host to validate.
Raises:
MissingMethodsError: If there are missing methods on host
implementation.
"""
missing = IWorkfileHost.get_missing_workfile_methods(host)
if missing:
raise MissingMethodsError(host, missing)
@abstractmethod
def get_workfile_extensions(self):
"""Extensions that can be used as save.
Questions:
This could potentially use 'HostDefinition'.
"""
return []
@abstractmethod
def save_workfile(self, dst_path=None):
"""Save currently opened scene.
Args:
dst_path (str): Where the current scene should be saved. Or use
current path if 'None' is passed.
"""
pass
@abstractmethod
def open_workfile(self, filepath):
"""Open passed filepath in the host.
Args:
filepath (str): Path to workfile.
"""
pass
@abstractmethod
def get_current_workfile(self):
"""Retreive path to current opened file.
Returns:
str: Path to file which is currently opened.
None: If nothing is opened.
"""
return None
def workfile_has_unsaved_changes(self):
"""Currently opened scene is saved.
Not all hosts can know if current scene is saved because the API of
DCC does not support it.
Returns:
bool: True if the scene has unsaved modifications, False if it
is saved.
None: Can't tell whether the workfile has modifications.
"""
return None
def work_root(self, session):
"""Modify workdir per host.
Default implementation keeps workdir untouched.
Warnings:
This modification must be handled in a more sophisticated way, because
it can't be called outside the DCC, so opening the last workfile
(calculated before the DCC is launched) is complicated. Also, breaking
the defined work template is not a good idea.
The only place where it is really used and makes sense is Maya, where
workspace.mel can modify the subfolders in which to look for Maya files.
Args:
session (dict): Session context data.
Returns:
str: Path to new workdir.
"""
return session["AVALON_WORKDIR"]
# --- Deprecated method names ---
def file_extensions(self):
"""Deprecated variant of 'get_workfile_extensions'.
Todo:
Remove when all usages are replaced.
"""
return self.get_workfile_extensions()
def save_file(self, dst_path=None):
"""Deprecated variant of 'save_workfile'.
Todo:
Remove when all usages are replaced.
"""
self.save_workfile(dst_path)
def open_file(self, filepath):
"""Deprecated variant of 'open_workfile'.
Todo:
Remove when all usages are replaced.
"""
return self.open_workfile(filepath)
def current_file(self):
"""Deprecated variant of 'get_current_workfile'.
Todo:
Remove when all usages are replaced.
"""
return self.get_current_workfile()
def has_unsaved_changes(self):
"""Deprecated variant of 'workfile_has_unsaved_changes'.
Todo:
Remove when all usages are replaced.
"""
return self.workfile_has_unsaved_changes()
class INewPublisher:
"""Functions related to new creation system in new publisher.
The new publisher stores not only information about each created instance
but also some global data. At this moment the data relate only to context
publish plugins, but that can be extended in the future.
"""
@staticmethod
def get_missing_publish_methods(host):
"""Look for missing methods on "old type" host implementation.
Method is used for validation of implemented functions related to
new publish creation. Checks only existence of methods.
Args:
host (Union[ModuleType, HostBase]): Host module where to look for
required methods.
Returns:
list[str]: Missing method implementations for new publisher
workflow.
"""
if isinstance(host, INewPublisher):
return []
required = [
"get_context_data",
"update_context_data",
]
missing = []
for name in required:
if not hasattr(host, name):
missing.append(name)
return missing
@staticmethod
def validate_publish_methods(host):
"""Validate implemented methods of "old type" host.
Args:
host (Union[ModuleType, HostBase]): Host module to validate.
Raises:
MissingMethodsError: If there are missing methods on host
implementation.
"""
missing = INewPublisher.get_missing_publish_methods(host)
if missing:
raise MissingMethodsError(host, missing)
@abstractmethod
def get_context_data(self):
"""Get global data related to creation-publishing from workfile.
These data are not related to any created instance but to the whole
publishing context. Not saving/returning them causes each reset of
publishing to reset all values to defaults.
Context data can contain information about enabled/disabled publish
plugins or other values that can be filled by artist.
Returns:
dict: Context data stored using 'update_context_data'.
"""
pass
@abstractmethod
def update_context_data(self, data, changes):
"""Store global context data to workfile.
Called when some values in the context data have changed.
Unless the values are stored in a way that 'get_context_data' can
return them, each reset of publishing will lose the values filled by the
artist. Best practice is to store the values into the workfile, if possible.
Args:
data (dict): New data in full.
changes (dict): Only the data that changed. Each value is a
tuple of '(<old>, <new>)'.
"""
pass

openpype/host/interfaces.py

@@ -0,0 +1,370 @@
from abc import ABCMeta, abstractmethod
import six
class MissingMethodsError(ValueError):
"""Exception when host miss some required methods for specific workflow.
Args:
host (HostBase): Host implementation where are missing methods.
missing_methods (list[str]): List of missing methods.
"""
def __init__(self, host, missing_methods):
joined_missing = ", ".join(
['"{}"'.format(item) for item in missing_methods]
)
host_name = getattr(host, "name", None)
if not host_name:
try:
host_name = host.__file__.replace("\\", "/").split("/")[-3]
except Exception:
host_name = str(host)
message = (
"Host \"{}\" miss methods {}".format(host_name, joined_missing)
)
super(MissingMethodsError, self).__init__(message)
class ILoadHost:
"""Implementation requirements to be able use reference of representations.
The load plugins can do referencing even without implementation of methods
here, but switch and removement of containers would not be possible.
Questions:
- Is list container dependency of host or load plugins?
- Should this be directly in HostBase?
- how to find out if referencing is available?
- do we need to know that?
"""
@staticmethod
def get_missing_load_methods(host):
"""Look for missing methods on "old type" host implementation.
Method is used for validation of implemented functions related to
loading. Checks only existence of methods.
Args:
host (Union[ModuleType, HostBase]): Object of host where to look for
required methods.
Returns:
list[str]: Missing method implementations for loading workflow.
"""
if isinstance(host, ILoadHost):
return []
required = ["ls"]
missing = []
for name in required:
if not hasattr(host, name):
missing.append(name)
return missing
@staticmethod
def validate_load_methods(host):
"""Validate implemented methods of "old type" host for load workflow.
Args:
host (Union[ModuleType, HostBase]): Object of host to validate.
Raises:
MissingMethodsError: If there are missing methods on host
implementation.
"""
missing = ILoadHost.get_missing_load_methods(host)
if missing:
raise MissingMethodsError(host, missing)
@abstractmethod
def get_containers(self):
"""Retreive referenced containers from scene.
This can be implemented in hosts where referencing can be used.
Todo:
Rename function to something more self explanatory.
Suggestion: 'get_containers'
Returns:
list[dict]: Information about loaded containers.
"""
pass
# --- Deprecated method names ---
def ls(self):
"""Deprecated variant of 'get_containers'.
Todo:
Remove when all usages are replaced.
"""
return self.get_containers()
@six.add_metaclass(ABCMeta)
class IWorkfileHost:
"""Implementation requirements to be able use workfile utils and tool."""
@staticmethod
def get_missing_workfile_methods(host):
"""Look for missing methods on "old type" host implementation.
Method is used for validation of implemented functions related to
workfiles. Checks only existence of methods.
Args:
host (Union[ModuleType, HostBase]): Object of host where to look for
required methods.
Returns:
list[str]: Missing method implementations for workfiles workflow.
"""
if isinstance(host, IWorkfileHost):
return []
required = [
"open_file",
"save_file",
"current_file",
"has_unsaved_changes",
"file_extensions",
"work_root",
]
missing = []
for name in required:
if not hasattr(host, name):
missing.append(name)
return missing
@staticmethod
def validate_workfile_methods(host):
"""Validate methods of "old type" host for workfiles workflow.
Args:
host (Union[ModuleType, HostBase]): Object of host to validate.
Raises:
MissingMethodsError: If there are missing methods on host
implementation.
"""
missing = IWorkfileHost.get_missing_workfile_methods(host)
if missing:
raise MissingMethodsError(host, missing)
@abstractmethod
def get_workfile_extensions(self):
"""Extensions that can be used as save.
Questions:
This could potentially use 'HostDefinition'.
"""
return []
@abstractmethod
def save_workfile(self, dst_path=None):
"""Save currently opened scene.
Args:
dst_path (str): Where the current scene should be saved. Or use
current path if 'None' is passed.
"""
pass
@abstractmethod
def open_workfile(self, filepath):
"""Open passed filepath in the host.
Args:
filepath (str): Path to workfile.
"""
pass
@abstractmethod
def get_current_workfile(self):
"""Retreive path to current opened file.
Returns:
str: Path to file which is currently opened.
None: If nothing is opened.
"""
return None
def workfile_has_unsaved_changes(self):
"""Currently opened scene is saved.
Not all hosts can know if current scene is saved because the API of
DCC does not support it.
Returns:
bool: True if the scene has unsaved modifications, False if it
is saved.
None: Can't tell whether the workfile has modifications.
"""
return None
def work_root(self, session):
"""Modify workdir per host.
Default implementation keeps workdir untouched.
Warnings:
This modification must be handled in a more sophisticated way, because
it can't be called outside the DCC, so opening the last workfile
(calculated before the DCC is launched) is complicated. Also, breaking
the defined work template is not a good idea.
The only place where it is really used and makes sense is Maya, where
workspace.mel can modify the subfolders in which to look for Maya files.
Args:
session (dict): Session context data.
Returns:
str: Path to new workdir.
"""
return session["AVALON_WORKDIR"]
# --- Deprecated method names ---
def file_extensions(self):
"""Deprecated variant of 'get_workfile_extensions'.
Todo:
Remove when all usages are replaced.
"""
return self.get_workfile_extensions()
def save_file(self, dst_path=None):
"""Deprecated variant of 'save_workfile'.
Todo:
Remove when all usages are replaced.
"""
self.save_workfile(dst_path)
def open_file(self, filepath):
"""Deprecated variant of 'open_workfile'.
Todo:
Remove when all usages are replaced.
"""
return self.open_workfile(filepath)
def current_file(self):
"""Deprecated variant of 'get_current_workfile'.
Todo:
Remove when all usages are replaced.
"""
return self.get_current_workfile()
def has_unsaved_changes(self):
"""Deprecated variant of 'workfile_has_unsaved_changes'.
Todo:
Remove when all usages are replaced.
"""
return self.workfile_has_unsaved_changes()
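# --- Editor's sketch (not part of the original module) --------------
# A minimal IWorkfileHost implementation. '_current' is a hypothetical
# stand-in for real DCC open/save calls on the native scene format.
class _ExampleWorkfileHost(IWorkfileHost):
    def __init__(self):
        self._current = None

    def get_workfile_extensions(self):
        return [".example"]

    def open_workfile(self, filepath):
        self._current = filepath

    def save_workfile(self, dst_path=None):
        self._current = dst_path or self._current

    def get_current_workfile(self):
        return self._current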
class INewPublisher:
"""Functions related to new creation system in new publisher.
New publisher is not storing information only about each created instance
but also some global data. At this moment are data related only to context
publish plugins but that can extend in future.
"""
@staticmethod
def get_missing_publish_methods(host):
"""Look for missing methods on "old type" host implementation.
Method is used for validation of implemented functions related to
new publish creation. Checks only the existence of methods.
Args:
host (Union[ModuleType, HostBase]): Host module where to look for
required methods.
Returns:
list[str]: Missing method implementations for the new publisher
workflow.
"""
if isinstance(host, INewPublisher):
return []
required = [
"get_context_data",
"update_context_data",
]
missing = []
for name in required:
if not hasattr(host, name):
missing.append(name)
return missing
@staticmethod
def validate_publish_methods(host):
"""Validate implemented methods of "old type" host.
Args:
host (Union[ModuleType, HostBase]): Host module to validate.
Raises:
MissingMethodsError: If there are missing methods on host
implementation.
"""
missing = INewPublisher.get_missing_publish_methods(host)
if missing:
raise MissingMethodsError(host, missing)
@abstractmethod
def get_context_data(self):
"""Get global data related to creation-publishing from workfile.
These data are not related to any created instance but to whole
publishing context. Not saving/returning them will cause that each
reset of publishing resets all values to default ones.
Context data can contain information about enabled/disabled publish
plugins or other values that can be filled by artist.
Returns:
dict: Context data stored using 'update_context_data'.
"""
pass
@abstractmethod
def update_context_data(self, data, changes):
"""Store global context data to workfile.
Called when some values in context data has changed.
Without storing the values in a way that 'get_context_data' would
return them will each reset of publishing cause loose of filled values
by artist. Best practice is to store values into workfile, if possible.
Args:
data (dict): New data as are.
changes (dict): Only data that has been changed. Each value has
tuple with '(<old>, <new>)' value.
"""
pass
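# --- Editor's sketch (not part of the original module) --------------
# One way to satisfy INewPublisher: keep context data as JSON in a
# sidecar file. The sidecar path is hypothetical; hosts typically
# store this inside the workfile itself.
import json

class _ExampleNewPublisher(INewPublisher):
    sidecar_path = "context_data.json"

    def get_context_data(self):
        try:
            with open(self.sidecar_path, "r") as stream:
                return json.load(stream)
        except (IOError, OSError, ValueError):
            return {}

    def update_context_data(self, data, changes):
        with open(self.sidecar_path, "w") as stream:
            json.dump(data, stream)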

View file

@ -2,14 +2,18 @@ import os
import sys
import six
import openpype.api
from openpype.lib import (
get_ffmpeg_tool_path,
run_subprocess,
)
from openpype.pipeline import publish
from openpype.hosts.aftereffects.api import get_stub
class ExtractLocalRender(openpype.api.Extractor):
class ExtractLocalRender(publish.Extractor):
"""Render RenderQueue locally."""
order = openpype.api.Extractor.order - 0.47
order = publish.Extractor.order - 0.47
label = "Extract Local Render"
hosts = ["aftereffects"]
families = ["renderLocal", "render.local"]
@ -53,7 +57,7 @@ class ExtractLocalRender(openpype.api.Extractor):
instance.data["representations"] = [repre_data]
ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
ffmpeg_path = get_ffmpeg_tool_path("ffmpeg")
# Generate thumbnail.
thumbnail_path = os.path.join(staging_dir, "thumbnail.jpg")
@ -66,7 +70,7 @@ class ExtractLocalRender(openpype.api.Extractor):
]
self.log.debug("Thumbnail args:: {}".format(args))
try:
output = openpype.lib.run_subprocess(args)
output = run_subprocess(args)
except TypeError:
self.log.warning("Error in creating thumbnail")
six.reraise(*sys.exc_info())

View file

@ -1,13 +1,13 @@
import pyblish.api
import openpype.api
from openpype.pipeline import publish
from openpype.hosts.aftereffects.api import get_stub
class ExtractSaveScene(pyblish.api.ContextPlugin):
"""Save scene before extraction."""
order = openpype.api.Extractor.order - 0.48
order = publish.Extractor.order - 0.48
label = "Extract Save Scene"
hosts = ["aftereffects"]

View file

@ -1,6 +1,6 @@
import pyblish.api
from openpype.action import get_errored_plugins_from_data
from openpype.lib import version_up
from openpype.pipeline.publish import get_errored_plugins_from_context
from openpype.hosts.aftereffects.api import get_stub
@ -18,7 +18,7 @@ class IncrementWorkfile(pyblish.api.InstancePlugin):
optional = True
def process(self, instance):
errored_plugins = get_errored_plugins_from_data(instance.context)
errored_plugins = get_errored_plugins_from_context(instance.context)
if errored_plugins:
raise RuntimeError(
"Skipping incrementing current file because publishing failed."

View file

@ -1,8 +1,8 @@
import openpype.api
from openpype.pipeline import publish
from openpype.hosts.aftereffects.api import get_stub
class RemovePublishHighlight(openpype.api.Extractor):
class RemovePublishHighlight(publish.Extractor):
"""Clean utf characters which are not working in DL
Published compositions are marked with unicode icon which causes
@ -10,7 +10,7 @@ class RemovePublishHighlight(openpype.api.Extractor):
rendering, add it later back to avoid confusion.
"""
order = openpype.api.Extractor.order - 0.49 # just before save
order = publish.Extractor.order - 0.49 # just before save
label = "Clean render comp"
hosts = ["aftereffects"]
families = ["render.farm"]

View file

@ -1,9 +1,9 @@
import pyblish.api
import openpype.api
from openpype.pipeline import (
from openpype.pipeline import legacy_io
from openpype.pipeline.publish import (
ValidateContentsOrder,
PublishXmlValidationError,
legacy_io,
)
from openpype.hosts.aftereffects.api import get_stub
@ -50,7 +50,7 @@ class ValidateInstanceAsset(pyblish.api.InstancePlugin):
label = "Validate Instance Asset"
hosts = ["aftereffects"]
actions = [ValidateInstanceAssetRepair]
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
def process(self, instance):
instance_asset = instance.data["asset"]

View file

@ -2,7 +2,7 @@ import bpy
import pyblish.api
from openpype.api import get_errored_instances_from_context
from openpype.pipeline.publish import get_errored_instances_from_context
class SelectInvalidAction(pyblish.api.Action):

View file

@ -1,6 +1,19 @@
import os
import bpy
import pyblish.api
from openpype.pipeline import legacy_io
from openpype.hosts.blender.api import workio
class SaveWorkfiledAction(pyblish.api.Action):
"""Save Workfile."""
label = "Save Workfile"
on = "failed"
icon = "save"
def process(self, context, plugin):
bpy.ops.wm.avalon_workfiles()
class CollectBlenderCurrentFile(pyblish.api.ContextPlugin):
@ -8,12 +21,52 @@ class CollectBlenderCurrentFile(pyblish.api.ContextPlugin):
order = pyblish.api.CollectorOrder - 0.5
label = "Blender Current File"
hosts = ['blender']
hosts = ["blender"]
actions = [SaveWorkfiledAction]
def process(self, context):
"""Inject the current working file"""
current_file = bpy.data.filepath
context.data['currentFile'] = current_file
current_file = workio.current_file()
assert current_file != '', "Current file is empty. " \
"Save the file before continuing."
context.data["currentFile"] = current_file
assert current_file, (
"Current file is empty. Save the file before continuing."
)
folder, file = os.path.split(current_file)
filename, ext = os.path.splitext(file)
task = legacy_io.Session["AVALON_TASK"]
data = {}
# create instance
instance = context.create_instance(name=filename)
subset = "workfile" + task.capitalize()
data.update({
"subset": subset,
"asset": os.getenv("AVALON_ASSET", None),
"label": subset,
"publish": True,
"family": "workfile",
"families": ["workfile"],
"setMembers": [current_file],
"frameStart": bpy.context.scene.frame_start,
"frameEnd": bpy.context.scene.frame_end,
})
data["representations"] = [{
"name": ext.lstrip("."),
"ext": ext.lstrip("."),
"files": file,
"stagingDir": folder,
}]
instance.data.update(data)
self.log.info("Collected instance: {}".format(file))
self.log.info("Scene path: {}".format(current_file))
self.log.info("staging Dir: {}".format(folder))
self.log.info("subset: {}".format(subset))

View file
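Editor's note: a sketch of the workfile representation payload built by the Blender collector above; the paths and the task name are made up for illustration.

import os

current_file = "/proj/sh010/work/sh010_layout_v003.blend"  # assumed
folder, file = os.path.split(current_file)
filename, ext = os.path.splitext(file)
subset = "workfile" + "layout".capitalize()  # task "layout" assumed
representation = {
    "name": ext.lstrip("."),   # "blend"
    "ext": ext.lstrip("."),
    "files": file,
    "stagingDir": folder,
}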

@ -2,12 +2,12 @@ import os
import bpy
from openpype import api
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY
class ExtractABC(api.Extractor):
class ExtractABC(publish.Extractor):
"""Extract as ABC."""
label = "Extract ABC"

View file

@ -2,10 +2,10 @@ import os
import bpy
import openpype.api
from openpype.pipeline import publish
class ExtractBlend(openpype.api.Extractor):
class ExtractBlend(publish.Extractor):
"""Extract a blend file."""
label = "Extract Blend"

View file

@ -2,10 +2,10 @@ import os
import bpy
import openpype.api
from openpype.pipeline import publish
class ExtractBlendAnimation(openpype.api.Extractor):
class ExtractBlendAnimation(publish.Extractor):
"""Extract a blend file."""
label = "Extract Blend"

View file

@ -2,11 +2,11 @@ import os
import bpy
from openpype import api
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
class ExtractCamera(api.Extractor):
class ExtractCamera(publish.Extractor):
"""Extract as the camera as FBX."""
label = "Extract Camera"

View file

@ -2,12 +2,12 @@ import os
import bpy
from openpype import api
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY
class ExtractFBX(api.Extractor):
class ExtractFBX(publish.Extractor):
"""Extract as FBX."""
label = "Extract FBX"

View file

@ -5,12 +5,12 @@ import bpy
import bpy_extras
import bpy_extras.anim_utils
from openpype import api
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY
class ExtractAnimationFBX(api.Extractor):
class ExtractAnimationFBX(publish.Extractor):
"""Extract as animation."""
label = "Extract FBX"

View file

@ -6,12 +6,12 @@ import bpy_extras
import bpy_extras.anim_utils
from openpype.client import get_representation_by_name
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY
import openpype.api
class ExtractLayout(openpype.api.Extractor):
class ExtractLayout(publish.Extractor):
"""Extract a layout."""
label = "Extract Layout"

View file

@ -1,9 +1,11 @@
from typing import List
import mathutils
import bpy
import pyblish.api
import openpype.api
import openpype.hosts.blender.api.action
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateCameraZeroKeyframe(pyblish.api.InstancePlugin):
@ -14,21 +16,18 @@ class ValidateCameraZeroKeyframe(pyblish.api.InstancePlugin):
in Unreal and Blender.
"""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
hosts = ["blender"]
families = ["camera"]
category = "geometry"
version = (0, 1, 0)
label = "Zero Keyframe"
actions = [openpype.hosts.blender.api.action.SelectInvalidAction]
_identity = mathutils.Matrix()
@classmethod
def get_invalid(cls, instance) -> List:
@staticmethod
def get_invalid(instance) -> List:
invalid = []
for obj in [obj for obj in instance]:
if obj.type == "CAMERA":
for obj in instance:
if isinstance(obj, bpy.types.Object) and obj.type == "CAMERA":
if obj.animation_data and obj.animation_data.action:
action = obj.animation_data.action
frames_set = set()
@ -45,4 +44,5 @@ class ValidateCameraZeroKeyframe(pyblish.api.InstancePlugin):
invalid = self.get_invalid(instance)
if invalid:
raise RuntimeError(
f"Object found in instance is not in Object Mode: {invalid}")
f"Camera must have a keyframe at frame 0: {invalid}"
)

View file

@ -3,13 +3,14 @@ from typing import List
import bpy
import pyblish.api
import openpype.api
import openpype.hosts.blender.api.action
class ValidateMeshHasUvs(pyblish.api.InstancePlugin):
"""Validate that the current mesh has UV's."""
order = pyblish.api.ValidatorOrder
order = openpype.api.ValidateContentsOrder
hosts = ["blender"]
families = ["model"]
category = "geometry"
@ -25,7 +26,10 @@ class ValidateMeshHasUvs(pyblish.api.InstancePlugin):
for uv_layer in obj.data.uv_layers:
for polygon in obj.data.polygons:
for loop_index in polygon.loop_indices:
if not uv_layer.data[loop_index].uv:
if (
loop_index >= len(uv_layer.data)
or not uv_layer.data[loop_index].uv
):
return False
return True
@ -33,20 +37,20 @@ class ValidateMeshHasUvs(pyblish.api.InstancePlugin):
@classmethod
def get_invalid(cls, instance) -> List:
invalid = []
# TODO (jasper): only check objects in the collection that will be published?
for obj in [
obj for obj in instance]:
try:
if obj.type == 'MESH':
# Make sure we are in object mode.
bpy.ops.object.mode_set(mode='OBJECT')
if not cls.has_uvs(obj):
invalid.append(obj)
except:
continue
for obj in instance:
if isinstance(obj, bpy.types.Object) and obj.type == 'MESH':
if obj.mode != "OBJECT":
cls.log.warning(
f"Mesh object {obj.name} should be in 'OBJECT' mode"
" to be properly checked."
)
if not cls.has_uvs(obj):
invalid.append(obj)
return invalid
def process(self, instance):
invalid = self.get_invalid(instance)
if invalid:
raise RuntimeError(f"Meshes found in instance without valid UV's: {invalid}")
raise RuntimeError(
f"Meshes found in instance without valid UV's: {invalid}"
)

View file

@ -3,28 +3,27 @@ from typing import List
import bpy
import pyblish.api
import openpype.api
import openpype.hosts.blender.api.action
class ValidateMeshNoNegativeScale(pyblish.api.Validator):
"""Ensure that meshes don't have a negative scale."""
order = pyblish.api.ValidatorOrder
order = openpype.api.ValidateContentsOrder
hosts = ["blender"]
families = ["model"]
category = "geometry"
label = "Mesh No Negative Scale"
actions = [openpype.hosts.blender.api.action.SelectInvalidAction]
@staticmethod
def get_invalid(instance) -> List:
invalid = []
# TODO (jasper): only check objects in the collection that will be published?
for obj in [
obj for obj in bpy.data.objects if obj.type == 'MESH'
]:
if any(v < 0 for v in obj.scale):
invalid.append(obj)
for obj in instance:
if isinstance(obj, bpy.types.Object) and obj.type == 'MESH':
if any(v < 0 for v in obj.scale):
invalid.append(obj)
return invalid
def process(self, instance):

View file

@ -1,7 +1,11 @@
from typing import List
import bpy
import pyblish.api
import openpype.api
import openpype.hosts.blender.api.action
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateNoColonsInName(pyblish.api.InstancePlugin):
@ -12,20 +16,20 @@ class ValidateNoColonsInName(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
hosts = ["blender"]
families = ["model", "rig"]
version = (0, 1, 0)
label = "No Colons in names"
actions = [openpype.hosts.blender.api.action.SelectInvalidAction]
@classmethod
def get_invalid(cls, instance) -> List:
@staticmethod
def get_invalid(instance) -> List:
invalid = []
for obj in [obj for obj in instance]:
for obj in instance:
if ':' in obj.name:
invalid.append(obj)
if obj.type == 'ARMATURE':
if isinstance(obj, bpy.types.Object) and obj.type == 'ARMATURE':
for bone in obj.data.bones:
if ':' in bone.name:
invalid.append(obj)
@ -36,4 +40,5 @@ class ValidateNoColonsInName(pyblish.api.InstancePlugin):
invalid = self.get_invalid(instance)
if invalid:
raise RuntimeError(
f"Objects found with colon in name: {invalid}")
f"Objects found with colon in name: {invalid}"
)

View file

@ -1,5 +1,7 @@
from typing import List
import bpy
import pyblish.api
import openpype.hosts.blender.api.action
@ -10,26 +12,21 @@ class ValidateObjectIsInObjectMode(pyblish.api.InstancePlugin):
order = pyblish.api.ValidatorOrder - 0.01
hosts = ["blender"]
families = ["model", "rig", "layout"]
category = "geometry"
label = "Validate Object Mode"
actions = [openpype.hosts.blender.api.action.SelectInvalidAction]
optional = False
@classmethod
def get_invalid(cls, instance) -> List:
@staticmethod
def get_invalid(instance) -> List:
invalid = []
for obj in [obj for obj in instance]:
try:
if obj.type == 'MESH' or obj.type == 'ARMATURE':
# Check if the object is in object mode.
if not obj.mode == 'OBJECT':
invalid.append(obj)
except Exception:
continue
for obj in instance:
if isinstance(obj, bpy.types.Object) and obj.mode != "OBJECT":
invalid.append(obj)
return invalid
def process(self, instance):
invalid = self.get_invalid(instance)
if invalid:
raise RuntimeError(
f"Object found in instance is not in Object Mode: {invalid}")
f"Object found in instance is not in Object Mode: {invalid}"
)

View file

@ -1,9 +1,12 @@
from typing import List
import mathutils
import bpy
import pyblish.api
import openpype.api
import openpype.hosts.blender.api.action
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateTransformZero(pyblish.api.InstancePlugin):
@ -15,10 +18,9 @@ class ValidateTransformZero(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
hosts = ["blender"]
families = ["model"]
category = "geometry"
version = (0, 1, 0)
label = "Transform Zero"
actions = [openpype.hosts.blender.api.action.SelectInvalidAction]
@ -28,8 +30,11 @@ class ValidateTransformZero(pyblish.api.InstancePlugin):
@classmethod
def get_invalid(cls, instance) -> List:
invalid = []
for obj in [obj for obj in instance]:
if obj.matrix_basis != cls._identity:
for obj in instance:
if (
isinstance(obj, bpy.types.Object)
and obj.matrix_basis != cls._identity
):
invalid.append(obj)
return invalid
@ -37,4 +42,6 @@ class ValidateTransformZero(pyblish.api.InstancePlugin):
invalid = self.get_invalid(instance)
if invalid:
raise RuntimeError(
f"Object found in instance is not in Object Mode: {invalid}")
"Object found in instance has not"
f" transform to zero: {invalid}"
)

View file

@ -1,113 +0,0 @@
import os
import collections
from pprint import pformat
import pyblish.api
from openpype.client import (
get_subsets,
get_last_versions,
get_representations
)
from openpype.pipeline import legacy_io
class AppendCelactionAudio(pyblish.api.ContextPlugin):
label = "Colect Audio for publishing"
order = pyblish.api.CollectorOrder + 0.1
def process(self, context):
self.log.info('Collecting Audio Data')
asset_doc = context.data["assetEntity"]
# get all available representations
subsets = self.get_subsets(
asset_doc,
representations=["audio", "wav"]
)
self.log.info(f"subsets is: {pformat(subsets)}")
if not subsets.get("audioMain"):
raise AttributeError("`audioMain` subset does not exist")
reprs = subsets.get("audioMain", {}).get("representations", [])
self.log.info(f"reprs is: {pformat(reprs)}")
repr = next((r for r in reprs), None)
if not repr:
raise "Missing `audioMain` representation"
self.log.info(f"representation is: {repr}")
audio_file = repr.get('data', {}).get('path', "")
if os.path.exists(audio_file):
context.data["audioFile"] = audio_file
self.log.info(
'audio_file: {}, has been added to context'.format(audio_file))
else:
self.log.warning("Couldn't find any audio file on Ftrack.")
def get_subsets(self, asset_doc, representations):
"""
Query subsets with filter on name.
The method will return all found subsets and its defined version
and subsets. Version could be specified with number. Representation
can be filtered.
Arguments:
asset_doct (dict): Asset (shot) mongo document
representations (list): list for all representations
Returns:
dict: subsets with version and representations in keys
"""
# Query all subsets for asset
project_name = legacy_io.active_project()
subset_docs = get_subsets(
project_name, asset_ids=[asset_doc["_id"]], fields=["_id"]
)
# Collect all subset ids
subset_ids = [
subset_doc["_id"]
for subset_doc in subset_docs
]
# Check if we found anything
assert subset_ids, (
"No subsets found. Check correct filter. "
"Try this for start `r'.*'`: asset: `{}`"
).format(asset_doc["name"])
last_versions_by_subset_id = get_last_versions(
project_name, subset_ids, fields=["_id", "parent"]
)
version_docs_by_id = {}
for version_doc in last_versions_by_subset_id.values():
version_docs_by_id[version_doc["_id"]] = version_doc
repre_docs = get_representations(
project_name,
version_ids=version_docs_by_id.keys(),
representation_names=representations
)
repre_docs_by_version_id = collections.defaultdict(list)
for repre_doc in repre_docs:
version_id = repre_doc["parent"]
repre_docs_by_version_id[version_id].append(repre_doc)
output_dict = {}
for version_id, repre_docs in repre_docs_by_version_id.items():
version_doc = version_docs_by_id[version_id]
subset_id = version_doc["parent"]
subset_doc = last_versions_by_subset_id[subset_id]
# Store queried docs by subset name
output_dict[subset_doc["name"]] = {
"representations": repre_docs,
"version": version_doc
}
return output_dict

View file

@ -1,22 +1,10 @@
import os
HOST_DIR = os.path.dirname(
os.path.abspath(__file__)
from .addon import (
HOST_DIR,
FlameAddon,
)
def add_implementation_envs(env, _app):
# Add requirements to DL_PYTHON_HOOK_PATH
pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
env["DL_PYTHON_HOOK_PATH"] = os.path.join(
pype_root, "openpype", "hosts", "flame", "startup")
env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)
# Set default values if are not already set via settings
defaults = {
"LOGLEVEL": "DEBUG"
}
for key, value in defaults.items():
if not env.get(key):
env[key] = value
__all__ = (
"HOST_DIR",
"FlameAddon",
)

View file

@ -0,0 +1,36 @@
import os
from openpype.modules import OpenPypeModule
from openpype.modules.interfaces import IHostAddon
HOST_DIR = os.path.dirname(os.path.abspath(__file__))
class FlameAddon(OpenPypeModule, IHostAddon):
name = "flame"
host_name = "flame"
def initialize(self, module_settings):
self.enabled = True
def add_implementation_envs(self, env, _app):
# Add requirements to DL_PYTHON_HOOK_PATH
env["DL_PYTHON_HOOK_PATH"] = os.path.join(HOST_DIR, "startup")
env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)
# Set default values if are not already set via settings
defaults = {
"LOGLEVEL": "DEBUG"
}
for key, value in defaults.items():
if not env.get(key):
env[key] = value
def get_launch_hook_paths(self, app):
if app.host_name != self.host_name:
return []
return [
os.path.join(HOST_DIR, "hooks")
]
def get_workfile_extensions(self):
return [".otoc"]

View file

@ -51,7 +51,8 @@ from .pipeline import (
)
from .menu import (
FlameMenuProjectConnect,
FlameMenuTimeline
FlameMenuTimeline,
FlameMenuUniversal
)
from .plugin import (
Creator,
@ -131,6 +132,7 @@ __all__ = [
# menu
"FlameMenuProjectConnect",
"FlameMenuTimeline",
"FlameMenuUniversal",
# plugin
"Creator",

View file

@ -201,3 +201,53 @@ class FlameMenuTimeline(_FlameMenuApp):
if self.flame:
self.flame.execute_shortcut('Rescan Python Hooks')
self.log.info('Rescan Python Hooks')
class FlameMenuUniversal(_FlameMenuApp):
# flameMenuProjectconnect app takes care of the preferences dialog as well
def __init__(self, framework):
_FlameMenuApp.__init__(self, framework)
def __getattr__(self, name):
def method(*args, **kwargs):
project = self.dynamic_menu_data.get(name)
if project:
self.link_project(project)
return method
def build_menu(self):
if not self.flame:
return []
menu = deepcopy(self.menu)
menu['actions'].append({
"name": "Load...",
"execute": lambda x: self.tools_helper.show_loader()
})
menu['actions'].append({
"name": "Manage...",
"execute": lambda x: self.tools_helper.show_scene_inventory()
})
menu['actions'].append({
"name": "Library...",
"execute": lambda x: self.tools_helper.show_library_loader()
})
return menu
def refresh(self, *args, **kwargs):
self.rescan()
def rescan(self, *args, **kwargs):
if not self.flame:
try:
import flame
self.flame = flame
except ImportError:
self.flame = None
if self.flame:
self.flame.execute_shortcut('Rescan Python Hooks')
self.log.info('Rescan Python Hooks')

View file

@ -361,6 +361,8 @@ class PublishableClip:
index_from_segment_default = False
use_shot_name_default = False
include_handles_default = False
retimed_handles_default = True
retimed_framerange_default = True
def __init__(self, segment, **kwargs):
self.rename_index = kwargs["rename_index"]
@ -496,6 +498,14 @@ class PublishableClip:
"audio", {}).get("value") or False
self.include_handles = self.ui_inputs.get(
"includeHandles", {}).get("value") or self.include_handles_default
self.retimed_handles = (
self.ui_inputs.get("retimedHandles", {}).get("value")
or self.retimed_handles_default
)
self.retimed_framerange = (
self.ui_inputs.get("retimedFramerange", {}).get("value")
or self.retimed_framerange_default
)
# build subset name from layer name
if self.subset_name == "[ track name ]":

View file

@ -276,6 +276,22 @@ class CreateShotClip(opfapi.Creator):
"target": "tag",
"toolTip": "By default handles are excluded", # noqa
"order": 3
},
"retimedHandles": {
"value": True,
"type": "QCheckBox",
"label": "Retimed handles",
"target": "tag",
"toolTip": "By default handles are retimed.", # noqa
"order": 4
},
"retimedFramerange": {
"value": True,
"type": "QCheckBox",
"label": "Retimed framerange",
"target": "tag",
"toolTip": "By default framerange is retimed.", # noqa
"order": 5
}
}
}

View file

@ -131,6 +131,10 @@ class CollectTimelineInstances(pyblish.api.ContextPlugin):
"fps": self.fps,
"workfileFrameStart": workfile_start,
"sourceFirstFrame": int(first_frame),
"notRetimedHandles": (
not marker_data.get("retimedHandles")),
"notRetimedFramerange": (
not marker_data.get("retimedFramerange")),
"path": file_path,
"flameAddTasks": self.add_tasks,
"tasks": {

View file

@ -90,26 +90,38 @@ class ExtractSubsetResources(openpype.api.Extractor):
handle_end = instance.data["handleEnd"]
handles = max(handle_start, handle_end)
include_handles = instance.data.get("includeHandles")
retimed_handles = instance.data.get("retimedHandles")
# get media source range with handles
source_start_handles = instance.data["sourceStartH"]
source_end_handles = instance.data["sourceEndH"]
# retime if needed
if r_speed != 1.0:
source_start_handles = (
instance.data["sourceStart"] - r_handle_start)
source_end_handles = (
source_start_handles
+ (r_source_dur - 1)
+ r_handle_start
+ r_handle_end
)
if retimed_handles:
# handles are retimed
source_start_handles = (
instance.data["sourceStart"] - r_handle_start)
source_end_handles = (
source_start_handles
+ (r_source_dur - 1)
+ r_handle_start
+ r_handle_end
)
else:
# handles are not retimed
source_end_handles = (
source_start_handles
+ (r_source_dur - 1)
+ handle_start
+ handle_end
)
# get frame range with handles for representation range
frame_start_handle = frame_start - handle_start
repre_frame_start = frame_start_handle
if include_handles:
if r_speed == 1.0:
if r_speed == 1.0 or not retimed_handles:
frame_start_handle = frame_start
else:
frame_start_handle = (

View file
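Editor's note: a worked sketch of the retime arithmetic in the hunk above, using made-up values (speed 2.0, ten-frame retimed source duration, five-frame handles); the variable names mirror the hunk, the numbers are illustrative only.

r_source_dur = 10                     # retimed source duration (assumed)
r_handle_start = r_handle_end = 10    # retimed handles: 5 frames * speed 2.0
handle_start = handle_end = 5         # plain handles (assumed)
source_start = 1001                   # instance.data["sourceStart"] (assumed)

# retimed-handles branch
source_start_handles = source_start - r_handle_start               # 991
source_end_handles = (
    source_start_handles + (r_source_dur - 1)
    + r_handle_start + r_handle_end)                                # 1020

# non-retimed branch keeps instance.data["sourceStartH"] as the start
source_start_handles = 996                                          # assumed
source_end_handles = (
    source_start_handles + (r_source_dur - 1)
    + handle_start + handle_end)                                     # 1015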

@ -73,6 +73,8 @@ def load_apps():
opfapi.FlameMenuProjectConnect(opfapi.CTX.app_framework))
opfapi.CTX.flame_apps.append(
opfapi.FlameMenuTimeline(opfapi.CTX.app_framework))
opfapi.CTX.flame_apps.append(
opfapi.FlameMenuUniversal(opfapi.CTX.app_framework))
opfapi.CTX.app_framework.log.info("Apps are loaded")
@ -191,3 +193,27 @@ def get_timeline_custom_ui_actions():
openpype_install()
return _build_app_menu("FlameMenuTimeline")
def get_batch_custom_ui_actions():
"""Hook to create submenu in batch
Returns:
list: menu object
"""
# install openpype and the host
openpype_install()
return _build_app_menu("FlameMenuUniversal")
def get_media_panel_custom_ui_actions():
"""Hook to create submenu in desktop
Returns:
list: menu object
"""
# install openpype and the host
openpype_install()
return _build_app_menu("FlameMenuUniversal")

View file

@ -0,0 +1,10 @@
from .addon import (
FusionAddon,
FUSION_HOST_DIR,
)
__all__ = (
"FusionAddon",
"FUSION_HOST_DIR",
)

View file

@ -0,0 +1,23 @@
import os
from openpype.modules import OpenPypeModule
from openpype.modules.interfaces import IHostAddon
FUSION_HOST_DIR = os.path.dirname(os.path.abspath(__file__))
class FusionAddon(OpenPypeModule, IHostAddon):
name = "fusion"
host_name = "fusion"
def initialize(self, module_settings):
self.enabled = True
def get_launch_hook_paths(self, app):
if app.host_name != self.host_name:
return []
return [
os.path.join(FUSION_HOST_DIR, "hooks")
]
def get_workfile_extensions(self):
return [".comp"]

View file

@ -18,12 +18,11 @@ from openpype.pipeline import (
deregister_inventory_action_path,
AVALON_CONTAINER_ID,
)
import openpype.hosts.fusion
from openpype.hosts.fusion import FUSION_HOST_DIR
log = Logger.get_logger(__name__)
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.fusion.__file__))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PLUGINS_DIR = os.path.join(FUSION_HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")

View file

@ -2,13 +2,11 @@
import sys
import os
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .pipeline import get_current_comp
def file_extensions():
return HOST_WORKFILE_EXTENSIONS["fusion"]
return [".comp"]
def has_unsaved_changes():

View file

@ -17,9 +17,9 @@ class FusionIncrementCurrentFile(pyblish.api.ContextPlugin):
def process(self, context):
from openpype.lib import version_up
from openpype.action import get_errored_plugins_from_data
from openpype.pipeline.publish import get_errored_plugins_from_context
errored_plugins = get_errored_plugins_from_data(context)
errored_plugins = get_errored_plugins_from_context(context)
if any(plugin.__name__ == "FusionSubmitDeadline"
for plugin in errored_plugins):
raise RuntimeError("Skipping incrementing current file because "

View file

@ -1,6 +1,6 @@
import pyblish.api
from openpype import action
from openpype.pipeline.publish import RepairAction
class ValidateBackgroundDepth(pyblish.api.InstancePlugin):
@ -8,7 +8,7 @@ class ValidateBackgroundDepth(pyblish.api.InstancePlugin):
order = pyblish.api.ValidatorOrder
label = "Validate Background Depth 32 bit"
actions = [action.RepairAction]
actions = [RepairAction]
hosts = ["fusion"]
families = ["render"]
optional = True

View file

@ -1,6 +1,6 @@
import pyblish.api
from openpype import action
from openpype.pipeline.publish import RepairAction
class ValidateCreateFolderChecked(pyblish.api.InstancePlugin):
@ -11,7 +11,7 @@ class ValidateCreateFolderChecked(pyblish.api.InstancePlugin):
"""
order = pyblish.api.ValidatorOrder
actions = [action.RepairAction]
actions = [RepairAction]
label = "Validate Create Folder Checked"
families = ["render"]
hosts = ["fusion"]

View file

@ -1,7 +1,7 @@
import os
import pyblish.api
from openpype.action import get_errored_plugins_from_data
from openpype.pipeline.publish import get_errored_plugins_from_context
from openpype.lib import version_up
import openpype.hosts.harmony.api as harmony
@ -19,7 +19,7 @@ class IncrementWorkfile(pyblish.api.InstancePlugin):
optional = True
def process(self, instance):
errored_plugins = get_errored_plugins_from_data(instance.context)
errored_plugins = get_errored_plugins_from_context(instance.context)
if errored_plugins:
raise RuntimeError(
"Skipping incrementing current file because publishing failed."

View file

@ -1,9 +1,12 @@
import os
import pyblish.api
import openpype.api
from openpype.pipeline import PublishXmlValidationError
import openpype.hosts.harmony.api as harmony
from openpype.pipeline.publish import (
ValidateContentsOrder,
PublishXmlValidationError,
)
class ValidateInstanceRepair(pyblish.api.Action):
@ -37,7 +40,7 @@ class ValidateInstance(pyblish.api.InstancePlugin):
label = "Validate Instance"
hosts = ["harmony"]
actions = [ValidateInstanceRepair]
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
def process(self, instance):
instance_asset = instance.data["asset"]

View file

@ -318,10 +318,9 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
@staticmethod
def create_otio_time_range_from_timeline_item_data(track_item):
speed = track_item.playbackSpeed()
timeline = phiero.get_current_sequence()
frame_start = int(track_item.timelineIn())
frame_duration = int((track_item.duration() - 1) / speed)
frame_duration = int(track_item.duration())
fps = timeline.framerate().toFloat()
return hiero_export.create_otio_time_range(

View file

@ -1,38 +1,10 @@
import os
from .addon import (
HoudiniAddon,
HOUDINI_HOST_DIR,
)
def add_implementation_envs(env, _app):
# Add requirements to HOUDINI_PATH and HOUDINI_MENU_PATH
pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
startup_path = os.path.join(
pype_root, "openpype", "hosts", "houdini", "startup"
)
new_houdini_path = [startup_path]
new_houdini_menu_path = [startup_path]
old_houdini_path = env.get("HOUDINI_PATH") or ""
old_houdini_menu_path = env.get("HOUDINI_MENU_PATH") or ""
for path in old_houdini_path.split(os.pathsep):
if not path:
continue
norm_path = os.path.normpath(path)
if norm_path not in new_houdini_path:
new_houdini_path.append(norm_path)
for path in old_houdini_menu_path.split(os.pathsep):
if not path:
continue
norm_path = os.path.normpath(path)
if norm_path not in new_houdini_menu_path:
new_houdini_menu_path.append(norm_path)
# Add ampersand for unknown reason (Maybe is needed in Houdini?)
new_houdini_path.append("&")
new_houdini_menu_path.append("&")
env["HOUDINI_PATH"] = os.pathsep.join(new_houdini_path)
env["HOUDINI_MENU_PATH"] = os.pathsep.join(new_houdini_menu_path)
__all__ = (
"HoudiniAddon",
"HOUDINI_HOST_DIR",
)

View file

@ -0,0 +1,55 @@
import os
from openpype.modules import OpenPypeModule
from openpype.modules.interfaces import IHostAddon
HOUDINI_HOST_DIR = os.path.dirname(os.path.abspath(__file__))
class HoudiniAddon(OpenPypeModule, IHostAddon):
name = "houdini"
host_name = "houdini"
def initialize(self, module_settings):
self.enabled = True
def add_implementation_envs(self, env, _app):
# Add requirements to HOUDINI_PATH and HOUDINI_MENU_PATH
startup_path = os.path.join(HOUDINI_HOST_DIR, "startup")
new_houdini_path = [startup_path]
new_houdini_menu_path = [startup_path]
old_houdini_path = env.get("HOUDINI_PATH") or ""
old_houdini_menu_path = env.get("HOUDINI_MENU_PATH") or ""
for path in old_houdini_path.split(os.pathsep):
if not path:
continue
norm_path = os.path.normpath(path)
if norm_path not in new_houdini_path:
new_houdini_path.append(norm_path)
for path in old_houdini_menu_path.split(os.pathsep):
if not path:
continue
norm_path = os.path.normpath(path)
if norm_path not in new_houdini_menu_path:
new_houdini_menu_path.append(norm_path)
# Add ampersand for unknown reason (Maybe is needed in Houdini?)
new_houdini_path.append("&")
new_houdini_menu_path.append("&")
env["HOUDINI_PATH"] = os.pathsep.join(new_houdini_path)
env["HOUDINI_MENU_PATH"] = os.pathsep.join(new_houdini_menu_path)
def get_launch_hook_paths(self, app):
if app.host_name != self.host_name:
return []
return [
os.path.join(HOUDINI_HOST_DIR, "hooks")
]
def get_workfile_extensions(self):
return [".hip", ".hiplc", ".hipnc"]

View file
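Editor's note: a small standalone sketch of the HOUDINI_PATH merge performed by add_implementation_envs above; the input paths are made up, only the dedup-and-ampersand behavior follows the addon code.

import os

def merge_houdini_path(startup_path, old_value):
    # Mirrors HoudiniAddon.add_implementation_envs: startup path first,
    # normalized deduplication, trailing "&" so Houdini keeps defaults.
    new_paths = [startup_path]
    for path in old_value.split(os.pathsep):
        if not path:
            continue
        norm_path = os.path.normpath(path)
        if norm_path not in new_paths:
            new_paths.append(norm_path)
    new_paths.append("&")
    return os.pathsep.join(new_paths)

print(merge_houdini_path(
    "/opp/houdini/startup",
    "/site/houdini" + os.pathsep + "/site/houdini"))
# POSIX output: /opp/houdini/startup:/site/houdini:&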

@ -13,7 +13,7 @@ from openpype.pipeline import (
AVALON_CONTAINER_ID,
)
from openpype.pipeline.load import any_outdated_containers
import openpype.hosts.houdini
from openpype.hosts.houdini import HOUDINI_HOST_DIR
from openpype.hosts.houdini.api import lib
from openpype.lib import (
@ -28,8 +28,7 @@ log = logging.getLogger("openpype.hosts.houdini")
AVALON_CONTAINERS = "/obj/AVALON_CONTAINERS"
IS_HEADLESS = not hasattr(hou, "ui")
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.houdini.__file__))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PLUGINS_DIR = os.path.join(HOUDINI_HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
@ -66,7 +65,7 @@ def install():
self._has_been_setup = True
# add houdini vendor packages
hou_pythonpath = os.path.join(os.path.dirname(HOST_DIR), "vendor")
hou_pythonpath = os.path.join(HOUDINI_HOST_DIR, "vendor")
sys.path.append(hou_pythonpath)

View file

@ -2,11 +2,10 @@
import os
import hou
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
def file_extensions():
return HOST_WORKFILE_EXTENSIONS["houdini"]
return [".hip", ".hiplc", ".hipnc"]
def has_unsaved_changes():

View file

@ -1,5 +1,4 @@
from openpype.lib import PreLaunchHook
import os
class SetPath(PreLaunchHook):
@ -15,4 +14,4 @@ class SetPath(PreLaunchHook):
self.log.warning("BUG: Workdir is not filled.")
return
os.chdir(workdir)
self.launch_context.kwargs["cwd"] = workdir

View file

@ -1,27 +1,28 @@
import os
import hou
from openpype.pipeline import legacy_io
import pyblish.api
class CollectHoudiniCurrentFile(pyblish.api.ContextPlugin):
"""Inject the current working file into context"""
order = pyblish.api.CollectorOrder - 0.5
order = pyblish.api.CollectorOrder - 0.01
label = "Houdini Current File"
hosts = ["houdini"]
def process(self, context):
"""Inject the current working file"""
filepath = hou.hipFile.path()
if not os.path.exists(filepath):
current_file = hou.hipFile.path()
if not os.path.exists(current_file):
# By default Houdini will even point a new scene to a path.
# However if the file is not saved at all and does not exist,
# we assume the user never set it.
filepath = ""
elif os.path.basename(filepath) == "untitled.hip":
elif os.path.basename(current_file) == "untitled.hip":
# Due to even a new file being called 'untitled.hip' we are unable
# to confirm the current scene was ever saved because the file
# could have existed already. We will allow it if the file exists,
@ -33,4 +34,43 @@ class CollectHoudiniCurrentFile(pyblish.api.ContextPlugin):
"saved correctly."
)
context.data["currentFile"] = filepath
context.data["currentFile"] = current_file
folder, file = os.path.split(current_file)
filename, ext = os.path.splitext(file)
task = legacy_io.Session["AVALON_TASK"]
data = {}
# create instance
instance = context.create_instance(name=filename)
subset = 'workfile' + task.capitalize()
data.update({
"subset": subset,
"asset": os.getenv("AVALON_ASSET", None),
"label": subset,
"publish": True,
"family": 'workfile',
"families": ['workfile'],
"setMembers": [current_file],
"frameStart": context.data['frameStart'],
"frameEnd": context.data['frameEnd'],
"handleStart": context.data['handleStart'],
"handleEnd": context.data['handleEnd']
})
data['representations'] = [{
'name': ext.lstrip("."),
'ext': ext.lstrip("."),
'files': file,
"stagingDir": folder,
}]
instance.data.update(data)
self.log.info('Collected instance: {}'.format(file))
self.log.info('Scene path: {}'.format(current_file))
self.log.info('staging Dir: {}'.format(folder))
self.log.info('subset: {}'.format(subset))

View file

@ -1,8 +1,8 @@
import pyblish.api
from openpype.api import version_up
from openpype.action import get_errored_plugins_from_data
from openpype.lib import version_up
from openpype.pipeline import registered_host
from openpype.pipeline.publish import get_errored_plugins_from_context
class IncrementCurrentFile(pyblish.api.InstancePlugin):
@ -30,7 +30,7 @@ class IncrementCurrentFile(pyblish.api.InstancePlugin):
context.data[key] = True
context = instance.context
errored_plugins = get_errored_plugins_from_data(context)
errored_plugins = get_errored_plugins_from_context(context)
if any(
plugin.__name__ == "HoudiniSubmitPublishDeadline"
for plugin in errored_plugins

View file

@ -1,8 +1,8 @@
import pyblish.api
import hou
from openpype.api import version_up
from openpype.action import get_errored_plugins_from_data
from openpype.lib import version_up
from openpype.pipeline.publish import get_errored_plugins_from_context
class IncrementCurrentFileDeadline(pyblish.api.ContextPlugin):
@ -19,7 +19,7 @@ class IncrementCurrentFileDeadline(pyblish.api.ContextPlugin):
def process(self, context):
errored_plugins = get_errored_plugins_from_data(context)
errored_plugins = get_errored_plugins_from_context(context)
if any(
plugin.__name__ == "HoudiniSubmitPublishDeadline"
for plugin in errored_plugins

View file

@ -1,5 +1,5 @@
import pyblish.api
import openpype.api
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateVDBInputNode(pyblish.api.InstancePlugin):
@ -16,7 +16,7 @@ class ValidateVDBInputNode(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder + 0.1
order = ValidateContentsOrder + 0.1
families = ["vdbcache"]
hosts = ["houdini"]
label = "Validate Input Node (VDB)"

View file

@ -1,8 +1,9 @@
import pyblish.api
import openpype.api
from collections import defaultdict
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateAbcPrimitiveToDetail(pyblish.api.InstancePlugin):
"""Validate Alembic ROP Primitive to Detail attribute is consistent.
@ -15,7 +16,7 @@ class ValidateAbcPrimitiveToDetail(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder + 0.1
order = ValidateContentsOrder + 0.1
families = ["pointcache"]
hosts = ["houdini"]
label = "Validate Primitive to Detail (Abc)"

View file

@ -1,5 +1,6 @@
import pyblish.api
import openpype.api
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateAlembicROPFaceSets(pyblish.api.InstancePlugin):
@ -17,7 +18,7 @@ class ValidateAlembicROPFaceSets(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder + 0.1
order = ValidateContentsOrder + 0.1
families = ["pointcache"]
hosts = ["houdini"]
label = "Validate Alembic ROP Face Sets"

View file

@ -1,5 +1,6 @@
import pyblish.api
import colorbleed.api
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateAlembicInputNode(pyblish.api.InstancePlugin):
@ -11,7 +12,7 @@ class ValidateAlembicInputNode(pyblish.api.InstancePlugin):
"""
order = colorbleed.api.ValidateContentsOrder + 0.1
order = ValidateContentsOrder + 0.1
families = ["pointcache"]
hosts = ["houdini"]
label = "Validate Input Node (Abc)"

View file

@ -1,5 +1,5 @@
import pyblish.api
import openpype.api
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateBypassed(pyblish.api.InstancePlugin):
@ -11,7 +11,7 @@ class ValidateBypassed(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder - 0.1
order = ValidateContentsOrder - 0.1
families = ["*"]
hosts = ["houdini"]
label = "Validate ROP Bypass"

View file

@ -1,11 +1,11 @@
import pyblish.api
import openpype.api
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateCameraROP(pyblish.api.InstancePlugin):
"""Validate Camera ROP settings."""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
families = ["camera"]
hosts = ["houdini"]
label = "Camera ROP"

View file

@ -1,11 +1,11 @@
import pyblish.api
import openpype.api
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateIntermediateDirectoriesChecked(pyblish.api.InstancePlugin):
"""Validate Create Intermediate Directories is enabled on ROP node."""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
families = ["pointcache", "camera", "vdbcache"]
hosts = ["houdini"]
label = "Create Intermediate Directories Checked"

View file

@ -1,6 +1,6 @@
import pyblish.api
import openpype.api
import hou
from openpype.pipeline.publish import ValidateContentsOrder
def cook_in_range(node, start, end):
@ -28,7 +28,7 @@ def get_errors(node):
class ValidateNoErrors(pyblish.api.InstancePlugin):
"""Validate the Instance has no current cooking errors."""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
hosts = ["houdini"]
label = "Validate no errors"

View file

@ -1,5 +1,5 @@
import pyblish.api
import openpype.api
from openpype.pipeline.publish import ValidateContentsOrder
class ValidatePrimitiveHierarchyPaths(pyblish.api.InstancePlugin):
@ -11,7 +11,7 @@ class ValidatePrimitiveHierarchyPaths(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder + 0.1
order = ValidateContentsOrder + 0.1
families = ["pointcache"]
hosts = ["houdini"]
label = "Validate Prims Hierarchy Path"

View file

@ -1,7 +1,7 @@
import pyblish.api
import openpype.api
from openpype.hosts.houdini.api import lib
from openpype.pipeline.publish import RepairContextAction
import hou
@ -14,7 +14,7 @@ class ValidateRemotePublishOutNode(pyblish.api.ContextPlugin):
hosts = ["houdini"]
targets = ["deadline"]
label = "Remote Publish ROP node"
actions = [openpype.api.RepairContextAction]
actions = [RepairContextAction]
def process(self, context):

View file

@ -1,7 +1,7 @@
import pyblish.api
import openpype.api
import hou
from openpype.pipeline.publish import RepairContextAction
class ValidateRemotePublishEnabled(pyblish.api.ContextPlugin):
@ -12,7 +12,7 @@ class ValidateRemotePublishEnabled(pyblish.api.ContextPlugin):
hosts = ["houdini"]
targets = ["deadline"]
label = "Remote Publish ROP enabled"
actions = [openpype.api.RepairContextAction]
actions = [RepairContextAction]
def process(self, context):

View file

@ -3,14 +3,14 @@ import re
import pyblish.api
from openpype.client import get_subset_by_name
import openpype.api
from openpype.pipeline import legacy_io
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateUSDShadeModelExists(pyblish.api.InstancePlugin):
"""Validate the Instance has no current cooking errors."""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
hosts = ["houdini"]
families = ["usdShade"]
label = "USD Shade model exists"

View file

@ -1,5 +1,5 @@
import pyblish.api
import openpype.api
from openpype.pipeline.publish import ValidateContentsOrder
import hou
@ -12,7 +12,7 @@ class ValidateUsdShadeWorkspace(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
hosts = ["houdini"]
families = ["usdShade"]
label = "USD Shade Workspace"

View file

@ -1,5 +1,5 @@
import pyblish.api
import openpype.api
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateVDBInputNode(pyblish.api.InstancePlugin):
@ -16,7 +16,7 @@ class ValidateVDBInputNode(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder + 0.1
order = ValidateContentsOrder + 0.1
families = ["vdbcache"]
hosts = ["houdini"]
label = "Validate Input Node (VDB)"

View file

@ -1,6 +1,6 @@
import pyblish.api
import openpype.api
import hou
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateVDBOutputNode(pyblish.api.InstancePlugin):
@ -17,7 +17,7 @@ class ValidateVDBOutputNode(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder + 0.1
order = ValidateContentsOrder + 0.1
families = ["vdbcache"]
hosts = ["houdini"]
label = "Validate Output Node (VDB)"

View file

@ -0,0 +1,57 @@
# -*- coding: utf-8 -*-
import openpype.api
import pyblish.api
import hou
class ValidateWorkfilePaths(pyblish.api.InstancePlugin):
"""Validate workfile paths so they are absolute."""
order = pyblish.api.ValidatorOrder
families = ["workfile"]
hosts = ["houdini"]
label = "Validate Workfile Paths"
actions = [openpype.api.RepairAction]
optional = True
node_types = ["file", "alembic"]
prohibited_vars = ["$HIP", "$JOB"]
def process(self, instance):
invalid = self.get_invalid()
self.log.info(
"node types to check: {}".format(", ".join(self.node_types)))
self.log.info(
"prohibited vars: {}".format(", ".join(self.prohibited_vars))
)
if invalid:
for param in invalid:
self.log.error(
"{}: {}".format(param.path(), param.unexpandedString()))
raise RuntimeError("Invalid paths found")
@classmethod
def get_invalid(cls):
invalid = []
for param, _ in hou.fileReferences():
# skip nodes we are not interested in
if param.node().type().name() not in cls.node_types:
continue
if any(
v for v in cls.prohibited_vars
if v in param.unexpandedString()):
invalid.append(param)
return invalid
@classmethod
def repair(cls, instance):
invalid = cls.get_invalid()
for param in invalid:
cls.log.info("processing: {}".format(param.path()))
cls.log.info("Replacing {} for {}".format(
param.unexpandedString(),
hou.text.expandString(param.unexpandedString())))
param.set(hou.text.expandString(param.unexpandedString()))

View file
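Editor's note: a sketch of what the repair step above does to a parameter value; hou.text.expandString only exists inside Houdini, so this mimics the substitution with a plain mapping and made-up values.

def expand_string(value, env=None):
    # Stand-in for hou.text.expandString: resolve the prohibited vars
    # to absolute paths, as the RepairAction does via param.set().
    env = env or {"$HIP": "/proj/shots/sh010", "$JOB": "/proj"}
    for var, path in env.items():
        value = value.replace(var, path)
    return value

print(expand_string("$HIP/geo/cache.abc"))
# -> /proj/shots/sh010/geo/cache.abc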

@ -0,0 +1,10 @@
from openpype.pipeline import install_host
from openpype.hosts.houdini import api
def main():
print("Installing OpenPype ...")
install_host(api)
main()

View file

@ -1,6 +1,10 @@
from .addon import MayaAddon
from .addon import (
MayaAddon,
MAYA_ROOT_DIR,
)
__all__ = (
"MayaAddon",
"MAYA_ROOT_DIR",
)

View file

@ -5,7 +5,7 @@ import pyblish.api
from openpype.client import get_asset_by_name
from openpype.pipeline import legacy_io
from openpype.api import get_errored_instances_from_context
from openpype.pipeline.publish import get_errored_instances_from_context
class GenerateUUIDsOnInvalidAction(pyblish.api.Action):

View file

@ -104,13 +104,6 @@ def install():
cmds.menuItem(divider=True)
cmds.menuItem(
"Set Render Settings",
command=lambda *args: lib_rendersettings.RenderSettings().set_default_renderer_settings() # noqa
)
cmds.menuItem(divider=True)
cmds.menuItem(
"Work Files...",
command=lambda *args: host_tools.show_workfiles(
@ -132,6 +125,12 @@ def install():
"Set Colorspace",
command=lambda *args: lib.set_colorspace(),
)
cmds.menuItem(
"Set Render Settings",
command=lambda *args: lib_rendersettings.RenderSettings().set_default_renderer_settings() # noqa
)
cmds.menuItem(divider=True, parent=MENU_NAME)
cmds.menuItem(
"Build First Workfile",

View file

@ -9,14 +9,17 @@ import maya.api.OpenMaya as om
import pyblish.api
from openpype.settings import get_project_settings
from openpype.host import HostBase, IWorkfileHost, ILoadHost
import openpype.hosts.maya
from openpype.host import (
HostBase,
IWorkfileHost,
ILoadHost,
HostDirmap,
)
from openpype.tools.utils import host_tools
from openpype.lib import (
register_event_callback,
emit_event
)
from openpype.lib.path_tools import HostDirmap
from openpype.pipeline import (
legacy_io,
register_loader_plugin_path,
@ -28,7 +31,9 @@ from openpype.pipeline import (
AVALON_CONTAINER_ID,
)
from openpype.pipeline.load import any_outdated_containers
from openpype.hosts.maya import MAYA_ROOT_DIR
from openpype.hosts.maya.lib import copy_workspace_mel
from . import menu, lib
from .workio import (
open_file,
@ -41,8 +46,7 @@ from .workio import (
log = logging.getLogger("openpype.hosts.maya")
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.maya.__file__))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PLUGINS_DIR = os.path.join(MAYA_ROOT_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
@ -59,9 +63,10 @@ class MayaHost(HostBase, IWorkfileHost, ILoadHost):
self._op_events = {}
def install(self):
project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
project_name = os.getenv("AVALON_PROJECT")
project_settings = get_project_settings(project_name)
# process path mapping
dirmap_processor = MayaDirmap("maya", project_settings)
dirmap_processor = MayaDirmap("maya", project_name, project_settings)
dirmap_processor.process_dirmap()
pyblish.api.register_plugin_path(PUBLISH_PATH)
@ -344,21 +349,13 @@ def containerise(name,
("id", AVALON_CONTAINER_ID),
("name", name),
("namespace", namespace),
("loader", str(loader)),
("loader", loader),
("representation", context["representation"]["_id"]),
]
for key, value in data:
if not value:
continue
if isinstance(value, (int, float)):
cmds.addAttr(container, longName=key, attributeType="short")
cmds.setAttr(container + "." + key, value)
else:
cmds.addAttr(container, longName=key, dataType="string")
cmds.setAttr(container + "." + key, value, type="string")
cmds.addAttr(container, longName=key, dataType="string")
cmds.setAttr(container + "." + key, str(value), type="string")
main_container = cmds.ls(AVALON_CONTAINERS, type="objectSet")
if not main_container:

View file
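Editor's note: the containerise hunk above stops branching on value type and stores every container key as a string attribute. A minimal Maya-only sketch of the new behavior, with hypothetical data:

from maya import cmds

container = cmds.sets(name="example_CON", empty=True)
for key, value in (("name", "modelMain"), ("representation", 12345)):
    # Every value, including non-strings, is now written as a string.
    cmds.addAttr(container, longName=key, dataType="string")
    cmds.setAttr(container + "." + key, str(value), type="string")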

@ -0,0 +1,46 @@
from maya import cmds
from openpype.pipeline import InventoryAction, registered_host
from openpype.hosts.maya.api.lib import get_container_members
class SelectInScene(InventoryAction):
"""Select nodes in the scene from selected containers in scene inventory"""
label = "Select in scene"
icon = "search"
color = "#888888"
order = 99
def process(self, containers):
all_members = []
for container in containers:
members = get_container_members(container)
all_members.extend(members)
cmds.select(all_members, replace=True, noExpand=True)
class HighlightBySceneSelection(InventoryAction):
"""Select containers in scene inventory from the current scene selection"""
label = "Highlight by scene selection"
icon = "search"
color = "#888888"
order = 100
def process(self, containers):
selection = set(cmds.ls(selection=True, long=True, objectsOnly=True))
host = registered_host()
to_select = []
for container in host.get_containers():
members = get_container_members(container)
if any(member in selection for member in members):
to_select.append(container["objectName"])
return {
"objectNames": to_select,
"options": {"clear": True}
}

View file

@ -70,7 +70,7 @@ class CollectAssembly(pyblish.api.InstancePlugin):
data[representation_id].append(instance_data)
instance.data["scenedata"] = dict(data)
instance.data["hierarchy"] = list(set(hierarchy_nodes))
instance.data["nodesHierarchy"] = list(set(hierarchy_nodes))
def get_file_rule(self, rule):
return mel.eval('workspace -query -fileRuleEntry "{}"'.format(rule))

View file

@ -293,6 +293,7 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
"source": filepath,
"expectedFiles": full_exp_files,
"publishRenderMetadataFolder": common_publish_meta_path,
"renderProducts": layer_render_products,
"resolutionWidth": lib.get_attr_in_layer(
"defaultResolution.width", layer=layer_name
),
@ -359,7 +360,6 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
instance.data["label"] = label
instance.data["farm"] = True
instance.data.update(data)
self.log.debug("data: {}".format(json.dumps(data, indent=4)))
def parse_options(self, render_globals):
"""Get all overrides with a value, skip those without.

View file

@ -33,7 +33,7 @@ class ExtractAssembly(openpype.api.Extractor):
json.dump(instance.data["scenedata"], filepath, ensure_ascii=False)
self.log.info("Extracting point cache ..")
cmds.select(instance.data["hierarchy"])
cmds.select(instance.data["nodesHierarchy"])
# Run basic alembic exporter
extract_alembic(file=hierarchy_path,

View file

@ -128,7 +128,7 @@ class ExtractPlayblast(openpype.api.Extractor):
# Update preset with current panel setting
# if override_viewport_options is turned off
if not override_viewport_options:
panel = cmds.getPanel(with_focus=True)
panel = cmds.getPanel(withFocus=True)
panel_preset = capture.parse_active_view()
preset.update(panel_preset)
cmds.setFocus(panel)

View file

@ -117,7 +117,7 @@ class ExtractThumbnail(openpype.api.Extractor):
# Update preset with current panel setting
# if override_viewport_options is turned off
if not override_viewport_options:
panel = cmds.getPanel(with_focus=True)
panel = cmds.getPanel(withFocus=True)
panel_preset = capture.parse_active_view()
preset.update(panel_preset)
cmds.setFocus(panel)

View file

@ -16,12 +16,11 @@ class IncrementCurrentFileDeadline(pyblish.api.ContextPlugin):
def process(self, context):
import os
from maya import cmds
from openpype.lib import version_up
from openpype.action import get_errored_plugins_from_data
from openpype.pipeline.publish import get_errored_plugins_from_context
errored_plugins = get_errored_plugins_from_data(context)
errored_plugins = get_errored_plugins_from_context(context)
if any(plugin.__name__ == "MayaSubmitDeadline"
for plugin in errored_plugins):
raise RuntimeError("Skipping incrementing current file because "

View file

@ -1,6 +1,7 @@
import pyblish.api
import openpype.api
import openpype.hosts.maya.api.action
from openpype.pipeline.publish import ValidateContentsOrder
class ValidateAnimationContent(pyblish.api.InstancePlugin):
@ -11,7 +12,7 @@ class ValidateAnimationContent(pyblish.api.InstancePlugin):
"""
order = openpype.api.ValidateContentsOrder
order = ValidateContentsOrder
hosts = ["maya"]
families = ["animation"]
label = "Animation Content"

View file

@@ -4,6 +4,10 @@ import pyblish.api
 import openpype.api
 import openpype.hosts.maya.api.action
 from openpype.hosts.maya.api import lib
+from openpype.pipeline.publish import (
+    RepairAction,
+    ValidateContentsOrder,
+)
 
 
 class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin):
@@ -16,13 +20,13 @@ class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin):
     """
 
-    order = openpype.api.ValidateContentsOrder
+    order = ValidateContentsOrder
     families = ['animation', "pointcache"]
     hosts = ['maya']
     label = 'Animation Out Set Related Node Ids'
     actions = [
         openpype.hosts.maya.api.action.SelectInvalidAction,
-        openpype.api.RepairAction
+        RepairAction
     ]
 
     def process(self, instance):


@@ -4,18 +4,20 @@ import types
 import maya.cmds as cmds
 
 import pyblish.api
-import openpype.api
-import openpype.hosts.maya.api.action
+from openpype.pipeline.publish import (
+    RepairAction,
+    ValidateContentsOrder,
+)
 
 
 class ValidateAssRelativePaths(pyblish.api.InstancePlugin):
     """Ensure exporting ass file has set relative texture paths"""
 
-    order = openpype.api.ValidateContentsOrder
+    order = ValidateContentsOrder
     hosts = ['maya']
     families = ['ass']
     label = "ASS has relative texture paths"
-    actions = [openpype.api.RepairAction]
+    actions = [RepairAction]
 
     def process(self, instance):
         # we cannot ask this until user open render settings as


@@ -4,6 +4,7 @@ import openpype.api
 from maya import cmds
 
 import openpype.hosts.maya.api.action
+from openpype.pipeline.publish import RepairAction
 
 
 class ValidateAssemblyModelTransforms(pyblish.api.InstancePlugin):
@@ -29,7 +30,7 @@ class ValidateAssemblyModelTransforms(pyblish.api.InstancePlugin):
     label = "Assembly Model Transforms"
     families = ["assembly"]
     actions = [openpype.hosts.maya.api.action.SelectInvalidAction,
-               openpype.api.RepairAction]
+               RepairAction]
 
     prompt_message = ("You are about to reset the matrix to the default values."
                       " This can alter the look of your scene. "
@@ -47,7 +48,7 @@ class ValidateAssemblyModelTransforms(pyblish.api.InstancePlugin):
         from openpype.hosts.maya.api import lib
 
         # Get all transforms in the loaded containers
-        container_roots = cmds.listRelatives(instance.data["hierarchy"],
+        container_roots = cmds.listRelatives(instance.data["nodesHierarchy"],
                                              children=True,
                                              type="transform",
                                              fullPath=True)


@@ -1,7 +1,10 @@
 import pymel.core as pm
 
 import pyblish.api
-import openpype.api
+from openpype.pipeline.publish import (
+    RepairContextAction,
+    ValidateContentsOrder,
+)
 
 
 class ValidateAttributes(pyblish.api.ContextPlugin):
@@ -16,10 +19,10 @@ class ValidateAttributes(pyblish.api.ContextPlugin):
     }
     """
 
-    order = openpype.api.ValidateContentsOrder
+    order = ValidateContentsOrder
     label = "Attributes"
     hosts = ["maya"]
-    actions = [openpype.api.RepairContextAction]
+    actions = [RepairContextAction]
     optional = True
 
     attributes = None


@@ -3,6 +3,7 @@ from maya import cmds
 import pyblish.api
 import openpype.api
 import openpype.hosts.maya.api.action
+from openpype.pipeline.publish import ValidateContentsOrder
 
 
 class ValidateCameraAttributes(pyblish.api.InstancePlugin):
@@ -14,7 +15,7 @@ class ValidateCameraAttributes(pyblish.api.InstancePlugin):
     """
 
-    order = openpype.api.ValidateContentsOrder
+    order = ValidateContentsOrder
     families = ['camera']
     hosts = ['maya']
     label = 'Camera Attributes'


@@ -3,6 +3,7 @@ from maya import cmds
 import pyblish.api
 import openpype.api
 import openpype.hosts.maya.api.action
+from openpype.pipeline.publish import ValidateContentsOrder
 
 
 class ValidateCameraContents(pyblish.api.InstancePlugin):
@@ -15,7 +16,7 @@ class ValidateCameraContents(pyblish.api.InstancePlugin):
     """
 
-    order = openpype.api.ValidateContentsOrder
+    order = ValidateContentsOrder
     families = ['camera']
     hosts = ['maya']
     label = 'Camera Contents'


@@ -3,6 +3,10 @@ from maya import cmds
 import pyblish.api
 import openpype.api
 import openpype.hosts.maya.api.action
+from openpype.pipeline.publish import (
+    RepairAction,
+    ValidateMeshOrder,
+)
 
 
 class ValidateColorSets(pyblish.api.Validator):
@@ -13,13 +17,13 @@ class ValidateColorSets(pyblish.api.Validator):
     """
 
-    order = openpype.api.ValidateMeshOrder
+    order = ValidateMeshOrder
     hosts = ['maya']
     families = ['model']
     category = 'geometry'
     label = 'Mesh ColorSets'
     actions = [openpype.hosts.maya.api.action.SelectInvalidAction,
-               openpype.api.RepairAction]
+               RepairAction]
     optional = True
 
     @staticmethod

Some files were not shown because too many files have changed in this diff Show more