mirror of https://github.com/ynput/ayon-core.git
synced 2025-12-24 21:04:40 +01:00

Merge branch 'develop' into feature/global-collect-audio-plugin

This commit is contained in commit 7668fb830c.
578 changed files with 42504 additions and 4815 deletions.
CHANGELOG.md (202 lines changed)

@@ -1,143 +1,149 @@
# Changelog

-## [3.14.1-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.14.2-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)

-[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.0...HEAD)
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.1...HEAD)

**🆕 New features**

- Nuke: Build workfile by template [\#3763](https://github.com/pypeclub/OpenPype/pull/3763)

**🚀 Enhancements**

- Ftrack: More logs related to auto sync value change [\#3671](https://github.com/pypeclub/OpenPype/pull/3671)
- Photoshop: attempt to speed up ExtractImage [\#3793](https://github.com/pypeclub/OpenPype/pull/3793)
- SyncServer: Added cli commands for sync server [\#3765](https://github.com/pypeclub/OpenPype/pull/3765)
- Blender: Publisher collect workfile representation [\#3670](https://github.com/pypeclub/OpenPype/pull/3670)
- Maya: move set render settings menu entry [\#3669](https://github.com/pypeclub/OpenPype/pull/3669)
- Scene Inventory: Maya add actions to select from or to scene [\#3659](https://github.com/pypeclub/OpenPype/pull/3659)

**🐛 Bug fixes**

- RoyalRender: handle host name that is not set [\#3695](https://github.com/pypeclub/OpenPype/pull/3695)
- Resolve: Addon import is Python 2 compatible [\#3798](https://github.com/pypeclub/OpenPype/pull/3798)
- nuke: validate write node is not failing due wrong type [\#3780](https://github.com/pypeclub/OpenPype/pull/3780)
- Fix - changed format of version string in pyproject.toml [\#3777](https://github.com/pypeclub/OpenPype/pull/3777)
- Ftrack status fix typo prgoress -\> progress [\#3761](https://github.com/pypeclub/OpenPype/pull/3761)
- Fix version resolution [\#3757](https://github.com/pypeclub/OpenPype/pull/3757)
- Maya: `containerise` dont skip empty values [\#3674](https://github.com/pypeclub/OpenPype/pull/3674)

**🔀 Refactored code**

- Photoshop: Use new Extractor location [\#3789](https://github.com/pypeclub/OpenPype/pull/3789)
- Blender: Use new Extractor location [\#3787](https://github.com/pypeclub/OpenPype/pull/3787)
- AfterEffects: Use new Extractor location [\#3784](https://github.com/pypeclub/OpenPype/pull/3784)
- General: Remove unused teshost [\#3773](https://github.com/pypeclub/OpenPype/pull/3773)
- General: Copied 'Extractor' plugin to publish pipeline [\#3771](https://github.com/pypeclub/OpenPype/pull/3771)
- General: Move queries of asset and representation links [\#3770](https://github.com/pypeclub/OpenPype/pull/3770)
- General: Create project function moved to client code [\#3766](https://github.com/pypeclub/OpenPype/pull/3766)
- General: Move delivery logic to pipeline [\#3751](https://github.com/pypeclub/OpenPype/pull/3751)
- General: Move hostdirname functionality into host [\#3749](https://github.com/pypeclub/OpenPype/pull/3749)
- General: Move publish utils to pipeline [\#3745](https://github.com/pypeclub/OpenPype/pull/3745)
- Houdini: Define houdini as addon [\#3735](https://github.com/pypeclub/OpenPype/pull/3735)
- Flame: Defined flame as addon [\#3732](https://github.com/pypeclub/OpenPype/pull/3732)
- Resolve: Define resolve as addon [\#3727](https://github.com/pypeclub/OpenPype/pull/3727)

**Merged pull requests:**

- Standalone Publisher: Ignore empty labels, then still use name like other asset models [\#3779](https://github.com/pypeclub/OpenPype/pull/3779)

## [3.14.1](https://github.com/pypeclub/OpenPype/tree/3.14.1) (2022-08-30)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.14.1-nightly.4...3.14.1)

### 📖 Documentation

- Documentation: Few updates [\#3698](https://github.com/pypeclub/OpenPype/pull/3698)
- Documentation: Settings development [\#3660](https://github.com/pypeclub/OpenPype/pull/3660)

**🆕 New features**

- Webpublisher: change create flatten image into tri state [\#3678](https://github.com/pypeclub/OpenPype/pull/3678)
- Blender: validators code correction with settings and defaults [\#3662](https://github.com/pypeclub/OpenPype/pull/3662)

**🚀 Enhancements**

- General: Thumbnail can use project roots [\#3750](https://github.com/pypeclub/OpenPype/pull/3750)
- Settings: Remove settings lock on tray exit [\#3720](https://github.com/pypeclub/OpenPype/pull/3720)
- General: Added helper getters to modules manager [\#3712](https://github.com/pypeclub/OpenPype/pull/3712)
- Unreal: Define unreal as module and use host class [\#3701](https://github.com/pypeclub/OpenPype/pull/3701)
- Settings: Lock settings UI session [\#3700](https://github.com/pypeclub/OpenPype/pull/3700)
- General: Benevolent context label collector [\#3686](https://github.com/pypeclub/OpenPype/pull/3686)
- Ftrack: Store ftrack entities on hierarchy integration to instances [\#3677](https://github.com/pypeclub/OpenPype/pull/3677)
- Blender: ops refresh manager after process events [\#3663](https://github.com/pypeclub/OpenPype/pull/3663)

**🐛 Bug fixes**

- Maya: Fix typo in getPanel argument `with\_focus` -\> `withFocus` [\#3753](https://github.com/pypeclub/OpenPype/pull/3753)
- General: Smaller fixes of imports [\#3748](https://github.com/pypeclub/OpenPype/pull/3748)
- General: Logger tweaks [\#3741](https://github.com/pypeclub/OpenPype/pull/3741)
- Nuke: missing job dependency if multiple bake streams [\#3737](https://github.com/pypeclub/OpenPype/pull/3737)
- Nuke: color-space settings from anatomy is working [\#3721](https://github.com/pypeclub/OpenPype/pull/3721)
- Settings: Fix studio default anatomy save [\#3716](https://github.com/pypeclub/OpenPype/pull/3716)
- Maya: Use project name instead of project code [\#3709](https://github.com/pypeclub/OpenPype/pull/3709)
- Settings: Fix project overrides save [\#3708](https://github.com/pypeclub/OpenPype/pull/3708)
- Workfiles tool: Fix published workfile filtering [\#3704](https://github.com/pypeclub/OpenPype/pull/3704)
- PS, AE: Provide default variant value for workfile subset [\#3703](https://github.com/pypeclub/OpenPype/pull/3703)
- Flame: retime is working on clip publishing [\#3684](https://github.com/pypeclub/OpenPype/pull/3684)
- Webpublisher: added check for empty context [\#3682](https://github.com/pypeclub/OpenPype/pull/3682)

**🔀 Refactored code**

- General: Host addons cleanup [\#3744](https://github.com/pypeclub/OpenPype/pull/3744)
- Webpublisher: Webpublisher is used as addon [\#3740](https://github.com/pypeclub/OpenPype/pull/3740)
- Photoshop: Defined photoshop as addon [\#3736](https://github.com/pypeclub/OpenPype/pull/3736)
- Harmony: Defined harmony as addon [\#3734](https://github.com/pypeclub/OpenPype/pull/3734)
- General: Module interfaces cleanup [\#3731](https://github.com/pypeclub/OpenPype/pull/3731)
- AfterEffects: Move AE functions from general lib [\#3730](https://github.com/pypeclub/OpenPype/pull/3730)
- Blender: Define blender as module [\#3729](https://github.com/pypeclub/OpenPype/pull/3729)
- AfterEffects: Define AfterEffects as module [\#3728](https://github.com/pypeclub/OpenPype/pull/3728)
- General: Replace PypeLogger with Logger [\#3725](https://github.com/pypeclub/OpenPype/pull/3725)
- Nuke: Define nuke as module [\#3724](https://github.com/pypeclub/OpenPype/pull/3724)
- General: Move subset name functionality [\#3723](https://github.com/pypeclub/OpenPype/pull/3723)
- General: Move creators plugin getter [\#3714](https://github.com/pypeclub/OpenPype/pull/3714)
- General: Move constants from lib to client [\#3713](https://github.com/pypeclub/OpenPype/pull/3713)
- Loader: Subset groups using client operations [\#3710](https://github.com/pypeclub/OpenPype/pull/3710)
- TVPaint: Defined as module [\#3707](https://github.com/pypeclub/OpenPype/pull/3707)
- StandalonePublisher: Define StandalonePublisher as module [\#3706](https://github.com/pypeclub/OpenPype/pull/3706)
- TrayPublisher: Define TrayPublisher as module [\#3705](https://github.com/pypeclub/OpenPype/pull/3705)
- General: Move context specific functions to context tools [\#3702](https://github.com/pypeclub/OpenPype/pull/3702)

**Merged pull requests:**

- Hiero: Define hiero as module [\#3717](https://github.com/pypeclub/OpenPype/pull/3717)
- Deadline: better logging for DL webservice failures [\#3694](https://github.com/pypeclub/OpenPype/pull/3694)
- Photoshop: resize saved images in ExtractReview for ffmpeg [\#3676](https://github.com/pypeclub/OpenPype/pull/3676)

## [3.14.0](https://github.com/pypeclub/OpenPype/tree/3.14.0) (2022-08-18)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.14.0-nightly.1...3.14.0)

**🆕 New features**

- Maya: Build workfile by template [\#3578](https://github.com/pypeclub/OpenPype/pull/3578)

**🚀 Enhancements**

- Ftrack: Addiotional component metadata [\#3685](https://github.com/pypeclub/OpenPype/pull/3685)
- Ftrack: Set task status on farm publishing [\#3680](https://github.com/pypeclub/OpenPype/pull/3680)
- Ftrack: Set task status on task creation in integrate hierarchy [\#3675](https://github.com/pypeclub/OpenPype/pull/3675)
- Maya: Disable rendering of all lights for render instances submitted through Deadline. [\#3661](https://github.com/pypeclub/OpenPype/pull/3661)
- General: Optimized OCIO configs [\#3650](https://github.com/pypeclub/OpenPype/pull/3650)

**🐛 Bug fixes**

- General: Switch from hero version to versioned works [\#3691](https://github.com/pypeclub/OpenPype/pull/3691)
- General: Fix finding of last version [\#3656](https://github.com/pypeclub/OpenPype/pull/3656)
- General: Extract Review can scale with pixel aspect ratio [\#3644](https://github.com/pypeclub/OpenPype/pull/3644)
- Maya: Refactor moved usage of CreateRender settings [\#3643](https://github.com/pypeclub/OpenPype/pull/3643)
- General: Hero version representations have full context [\#3638](https://github.com/pypeclub/OpenPype/pull/3638)
- Nuke: color settings for render write node is working now [\#3632](https://github.com/pypeclub/OpenPype/pull/3632)
- Maya: FBX support for update in reference loader [\#3631](https://github.com/pypeclub/OpenPype/pull/3631)

**🔀 Refactored code**

- General: Use client projects getter [\#3673](https://github.com/pypeclub/OpenPype/pull/3673)
- Resolve: Match folder structure to other hosts [\#3653](https://github.com/pypeclub/OpenPype/pull/3653)
- Maya: Hosts as modules [\#3647](https://github.com/pypeclub/OpenPype/pull/3647)
- TimersManager: Plugins are in timers manager module [\#3639](https://github.com/pypeclub/OpenPype/pull/3639)
- General: Move workfiles functions into pipeline [\#3637](https://github.com/pypeclub/OpenPype/pull/3637)
- General: Workfiles builder using query functions [\#3598](https://github.com/pypeclub/OpenPype/pull/3598)

**Merged pull requests:**

- Deadline: Global job pre load is not Pype 2 compatible [\#3666](https://github.com/pypeclub/OpenPype/pull/3666)
- Maya: Remove unused get current renderer logic [\#3645](https://github.com/pypeclub/OpenPype/pull/3645)
- Kitsu|Fix: Movie project type fails & first loop children names [\#3636](https://github.com/pypeclub/OpenPype/pull/3636)
- fix the bug of failing to extract look when UDIMs format used in AiImage [\#3628](https://github.com/pypeclub/OpenPype/pull/3628)

## [3.13.0](https://github.com/pypeclub/OpenPype/tree/3.13.0) (2022-08-09)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.13.0-nightly.1...3.13.0)

**🆕 New features**

- Support for mutliple installed versions - 3.13 [\#3605](https://github.com/pypeclub/OpenPype/pull/3605)

**🚀 Enhancements**

- Editorial: Mix audio use side file for ffmpeg filters [\#3630](https://github.com/pypeclub/OpenPype/pull/3630)
- Ftrack: Comment template can contain optional keys [\#3615](https://github.com/pypeclub/OpenPype/pull/3615)
- Ftrack: Add more metadata to ftrack components [\#3612](https://github.com/pypeclub/OpenPype/pull/3612)
- General: Add context to pyblish context [\#3594](https://github.com/pypeclub/OpenPype/pull/3594)
- Kitsu: Shot&Sequence name with prefix over appends [\#3593](https://github.com/pypeclub/OpenPype/pull/3593)
- Photoshop: implemented {layer} placeholder in subset template [\#3591](https://github.com/pypeclub/OpenPype/pull/3591)
- General: Python module appdirs from git [\#3589](https://github.com/pypeclub/OpenPype/pull/3589)
- Ftrack: Update ftrack api to 2.3.3 [\#3588](https://github.com/pypeclub/OpenPype/pull/3588)
- General: New Integrator small fixes [\#3583](https://github.com/pypeclub/OpenPype/pull/3583)

**🐛 Bug fixes**

- Maya: fix aov separator in Redshift [\#3625](https://github.com/pypeclub/OpenPype/pull/3625)
- Fix for multi-version build on Mac [\#3622](https://github.com/pypeclub/OpenPype/pull/3622)
- Ftrack: Sync hierarchical attributes can handle new created entities [\#3621](https://github.com/pypeclub/OpenPype/pull/3621)
- General: Extract review aspect ratio scale is calculated by ffmpeg [\#3620](https://github.com/pypeclub/OpenPype/pull/3620)
- Maya: Fix types of default settings [\#3617](https://github.com/pypeclub/OpenPype/pull/3617)
- Integrator: Don't force to have dot before frame [\#3611](https://github.com/pypeclub/OpenPype/pull/3611)
- AfterEffects: refactored integrate doesnt work formulti frame publishes [\#3610](https://github.com/pypeclub/OpenPype/pull/3610)
- Maya look data contents fails with custom attribute on group [\#3607](https://github.com/pypeclub/OpenPype/pull/3607)
- TrayPublisher: Fix wrong conflict merge [\#3600](https://github.com/pypeclub/OpenPype/pull/3600)
- Bugfix: Add OCIO as submodule to prepare for handling `maketx` color space conversion. [\#3590](https://github.com/pypeclub/OpenPype/pull/3590)
- Fix general settings environment variables resolution [\#3587](https://github.com/pypeclub/OpenPype/pull/3587)
- Editorial publishing workflow improvements [\#3580](https://github.com/pypeclub/OpenPype/pull/3580)
- General: Update imports in start script [\#3579](https://github.com/pypeclub/OpenPype/pull/3579)
- Nuke: render family integration consistency [\#3576](https://github.com/pypeclub/OpenPype/pull/3576)
- Ftrack: Handle missing published path in integrator [\#3570](https://github.com/pypeclub/OpenPype/pull/3570)
- Nuke: publish existing frames with slate with correct range [\#3555](https://github.com/pypeclub/OpenPype/pull/3555)

**🔀 Refactored code**

- General: Plugin settings handled by plugins [\#3623](https://github.com/pypeclub/OpenPype/pull/3623)
- General: Naive implementation of document create, update, delete [\#3601](https://github.com/pypeclub/OpenPype/pull/3601)
- General: Use query functions in general code [\#3596](https://github.com/pypeclub/OpenPype/pull/3596)
- General: Separate extraction of template data into more functions [\#3574](https://github.com/pypeclub/OpenPype/pull/3574)
- General: Lib cleanup [\#3571](https://github.com/pypeclub/OpenPype/pull/3571)

**Merged pull requests:**

- Webpublisher: timeout for PS studio processing [\#3619](https://github.com/pypeclub/OpenPype/pull/3619)
- Core: translated validate\_containers.py into New publisher style [\#3614](https://github.com/pypeclub/OpenPype/pull/3614)
- Enable write color sets on animation publish automatically [\#3582](https://github.com/pypeclub/OpenPype/pull/3582)

## [3.12.2](https://github.com/pypeclub/OpenPype/tree/3.12.2) (2022-07-27)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.2-nightly.4...3.12.2)

### 📖 Documentation

- Update website with more studios [\#3554](https://github.com/pypeclub/OpenPype/pull/3554)
- Documentation: Update publishing dev docs [\#3549](https://github.com/pypeclub/OpenPype/pull/3549)

**🚀 Enhancements**

- General: Global thumbnail extractor is ready for more cases [\#3561](https://github.com/pypeclub/OpenPype/pull/3561)

**🐛 Bug fixes**

- Maya: fix Review image plane attribute [\#3569](https://github.com/pypeclub/OpenPype/pull/3569)
- Maya: Fix animated attributes \(ie. overscan\) on loaded cameras breaking review publishing. [\#3562](https://github.com/pypeclub/OpenPype/pull/3562)
- NewPublisher: Python 2 compatible html escape [\#3559](https://github.com/pypeclub/OpenPype/pull/3559)
- Remove invalid submodules from `/vendor` [\#3557](https://github.com/pypeclub/OpenPype/pull/3557)
- General: Remove hosts filter on integrator plugins [\#3556](https://github.com/pypeclub/OpenPype/pull/3556)
- Settings: Clean default values of environments [\#3550](https://github.com/pypeclub/OpenPype/pull/3550)
- Module interfaces: Fix import error [\#3547](https://github.com/pypeclub/OpenPype/pull/3547)

**🔀 Refactored code**

- General: Use query functions in integrator [\#3563](https://github.com/pypeclub/OpenPype/pull/3563)

**Merged pull requests:**

- Maya: fix active pane loss [\#3566](https://github.com/pypeclub/OpenPype/pull/3566)

## [3.12.1](https://github.com/pypeclub/OpenPype/tree/3.12.1) (2022-07-13)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.1-nightly.6...3.12.1)

@@ -63,7 +63,7 @@ class OpenPypeVersion(semver.VersionInfo):
     """
     staging = False
     path = None
-    _VERSION_REGEX = re.compile(r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$")  # noqa: E501
+    _VERSION_REGEX = re.compile(r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?")  # noqa: E501
     _installed_version = None

     def __init__(self, *args, **kwargs):
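The only change in this hunk is dropping the trailing `$` anchor from the version pattern, so the regex can also parse version strings that carry extra trailing content. A minimal standalone sketch of the effect (the snippet is illustrative, not part of the diff):

```python
import re

# Same semver pattern as in the hunk above, without the trailing "$" anchor.
VERSION_REGEX = re.compile(
    r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)"
    r"(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)"
    r"(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?"
    r"(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?"
)

# With the "$" anchor this string would not match at all because of the
# " (staging)" suffix; without it the leading semver part is still parsed.
match = VERSION_REGEX.match("3.14.2-nightly.3 (staging)")
print(match.group("major"), match.group("prerelease"))  # -> 3 nightly.3
```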
@@ -1,44 +1,82 @@
 # absolute_import is needed to counter the `module has no cmds error` in Maya
 from __future__ import absolute_import

+import warnings
+import functools
 import pyblish.api


-def get_errored_instances_from_context(context):
-
-    instances = list()
-    for result in context.data["results"]:
-        if result["instance"] is None:
-            # When instance is None we are on the "context" result
-            continue
-
-        if result["error"]:
-            instances.append(result["instance"])
-
-    return instances
+class ActionDeprecatedWarning(DeprecationWarning):
+    pass


-def get_errored_plugins_from_data(context):
-    """Get all failed validation plugins
-
-    Args:
-        context (object):
-
-    Returns:
-        list of plugins which failed during validation
+def deprecated(new_destination):
+    """Mark functions as deprecated.
+
+    It will result in a warning being emitted when the function is used.
     """

-    plugins = list()
-    results = context.data.get("results", [])
-    for result in results:
-        if result["success"] is True:
-            continue
-        plugins.append(result["plugin"])
-
-    return plugins
+    func = None
+    if callable(new_destination):
+        func = new_destination
+        new_destination = None
+
+    def _decorator(decorated_func):
+        if new_destination is None:
+            warning_message = (
+                " Please check content of deprecated function to figure out"
+                " possible replacement."
+            )
+        else:
+            warning_message = " Please replace your usage with '{}'.".format(
+                new_destination
+            )
+
+        @functools.wraps(decorated_func)
+        def wrapper(*args, **kwargs):
+            warnings.simplefilter("always", ActionDeprecatedWarning)
+            warnings.warn(
+                (
+                    "Call to deprecated function '{}'"
+                    "\nFunction was moved or removed.{}"
+                ).format(decorated_func.__name__, warning_message),
+                category=ActionDeprecatedWarning,
+                stacklevel=4
+            )
+            return decorated_func(*args, **kwargs)
+        return wrapper
+
+    if func is None:
+        return _decorator
+    return _decorator(func)
+
+
+@deprecated("openpype.pipeline.publish.get_errored_instances_from_context")
+def get_errored_instances_from_context(context):
+    """
+    Deprecated:
+        Since 3.14.* will be removed in 3.16.* or later.
+    """
+
+    from openpype.pipeline.publish import get_errored_instances_from_context
+
+    return get_errored_instances_from_context(context)
+
+
+@deprecated("openpype.pipeline.publish.get_errored_plugins_from_context")
+def get_errored_plugins_from_data(context):
+    """
+    Deprecated:
+        Since 3.14.* will be removed in 3.16.* or later.
+    """
+
+    from openpype.pipeline.publish import get_errored_plugins_from_context
+
+    return get_errored_plugins_from_context(context)
+
+
+# 'RepairAction' and 'RepairContextAction' were moved to
+# 'openpype.pipeline.publish' please change your imports.
+# There is no "reasonable" way how to mark these classes as deprecated to show
+# a warning of wrong import.
+# Deprecated since 3.14.* will be removed in 3.16.*
 class RepairAction(pyblish.api.Action):
     """Repairs the action

@@ -65,6 +103,7 @@ class RepairAction(pyblish.api.Action):
             plugin.repair(instance)


+# Deprecated since 3.14.* will be removed in 3.16.*
 class RepairContextAction(pyblish.api.Action):
     """Repairs the action
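A short usage sketch of the `deprecated` decorator above. The function name and replacement path are hypothetical, purely to show the emitted warning:

```python
# Hypothetical example - 'old_helper' and the destination path do not exist
# in OpenPype; they only illustrate the decorator's behavior.
@deprecated("openpype.new_module.new_helper")
def old_helper():
    return 42

old_helper()
# ActionDeprecatedWarning: Call to deprecated function 'old_helper'
# Function was moved or removed. Please replace your usage with
# 'openpype.new_module.new_helper'.
```

Because `new_destination` may itself be the decorated callable, the decorator also works without arguments (plain `@deprecated`); in that case the warning only suggests checking the deprecated function's body for a replacement.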
@@ -49,7 +49,6 @@ from .plugin import (
     ValidateContentsOrder,
     ValidateSceneOrder,
     ValidateMeshOrder,
-    ValidationException
 )

 # temporary fix, might

@@ -94,8 +93,6 @@ __all__ = [
     "RepairAction",
     "RepairContextAction",

-    "ValidationException",
-
     # get contextual data
     "version_up",
     "get_asset",
@@ -45,6 +45,17 @@ from .entities import (
     get_workfile_info,
 )

+from .entity_links import (
+    get_linked_asset_ids,
+    get_linked_assets,
+    get_linked_representation_id,
+)
+
+from .operations import (
+    create_project,
+)
+
 __all__ = (
     "OpenPypeMongoConnection",

@@ -88,4 +99,10 @@ __all__ = (
     "get_thumbnail_id_from_source",

     "get_workfile_info",

+    "get_linked_asset_ids",
+    "get_linked_assets",
+    "get_linked_representation_id",
+
+    "create_project",
 )
@@ -32,17 +32,37 @@ def _prepare_fields(fields, required_fields=None):
     return output


-def _convert_id(in_id):
+def convert_id(in_id):
     """Helper function for conversion of id from string to ObjectId.

     Args:
         in_id (Union[str, ObjectId, Any]): Entity id that should be converted
             to the right type for queries.

     Returns:
         Union[ObjectId, Any]: Id converted to ObjectId, or the input value.
     """

     if isinstance(in_id, six.string_types):
         return ObjectId(in_id)
     return in_id


-def _convert_ids(in_ids):
+def convert_ids(in_ids):
     """Helper function for conversion of ids from string to ObjectId.

     Args:
         in_ids (Iterable[Union[str, ObjectId, Any]]): List of entity ids that
             should be converted to the right type for queries.

     Returns:
         List[ObjectId]: Ids converted to ObjectId.
     """

     _output = set()
     for in_id in in_ids:
         if in_id is not None:
-            _output.add(_convert_id(in_id))
+            _output.add(convert_id(in_id))
     return list(_output)
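A quick sketch of what the rename means for callers: the helpers are now public and accept either strings or `ObjectId` instances (the id below is made up):

```python
from bson.objectid import ObjectId

from openpype.client.entities import convert_id, convert_ids

raw_id = "633a0f7f9f0a4d2b8c1e4b2a"  # made-up 24-character hex id

assert isinstance(convert_id(raw_id), ObjectId)

# 'None' entries are dropped and duplicates collapse through the set.
assert convert_ids([raw_id, raw_id, None]) == [ObjectId(raw_id)]
```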
@@ -58,7 +78,7 @@ def get_projects(active=True, inactive=False, fields=None):
         yield project_doc


-def get_project(project_name, active=True, inactive=False, fields=None):
+def get_project(project_name, active=True, inactive=True, fields=None):
     # Skip if both are disabled
     if not active and not inactive:
         return None

@@ -115,7 +135,7 @@ def get_asset_by_id(project_name, asset_id, fields=None):
         None: Asset was not found by id.
     """

-    asset_id = _convert_id(asset_id)
+    asset_id = convert_id(asset_id)
     if not asset_id:
         return None

@@ -196,7 +216,7 @@ def _get_assets(
     query_filter = {"type": {"$in": asset_types}}

     if asset_ids is not None:
-        asset_ids = _convert_ids(asset_ids)
+        asset_ids = convert_ids(asset_ids)
         if not asset_ids:
             return []
         query_filter["_id"] = {"$in": asset_ids}

@@ -207,7 +227,7 @@ def _get_assets(
         query_filter["name"] = {"$in": list(asset_names)}

     if parent_ids is not None:
-        parent_ids = _convert_ids(parent_ids)
+        parent_ids = convert_ids(parent_ids)
         if not parent_ids:
             return []
         query_filter["data.visualParent"] = {"$in": parent_ids}

@@ -307,7 +327,7 @@ def get_asset_ids_with_subsets(project_name, asset_ids=None):
         "type": "subset"
     }
     if asset_ids is not None:
-        asset_ids = _convert_ids(asset_ids)
+        asset_ids = convert_ids(asset_ids)
         if not asset_ids:
             return []
         subset_query["parent"] = {"$in": asset_ids}

@@ -347,7 +367,7 @@ def get_subset_by_id(project_name, subset_id, fields=None):
         Dict: Subset document which can be reduced to specified 'fields'.
     """

-    subset_id = _convert_id(subset_id)
+    subset_id = convert_id(subset_id)
     if not subset_id:
         return None

@@ -374,7 +394,7 @@ def get_subset_by_name(project_name, subset_name, asset_id, fields=None):
     if not subset_name:
         return None

-    asset_id = _convert_id(asset_id)
+    asset_id = convert_id(asset_id)
     if not asset_id:
         return None

@@ -428,13 +448,13 @@ def get_subsets(
     query_filter = {"type": {"$in": subset_types}}

     if asset_ids is not None:
-        asset_ids = _convert_ids(asset_ids)
+        asset_ids = convert_ids(asset_ids)
         if not asset_ids:
             return []
         query_filter["parent"] = {"$in": asset_ids}

     if subset_ids is not None:
-        subset_ids = _convert_ids(subset_ids)
+        subset_ids = convert_ids(subset_ids)
         if not subset_ids:
             return []
         query_filter["_id"] = {"$in": subset_ids}

@@ -449,7 +469,7 @@ def get_subsets(
     for asset_id, names in names_by_asset_ids.items():
         if asset_id and names:
             or_query.append({
-                "parent": _convert_id(asset_id),
+                "parent": convert_id(asset_id),
                 "name": {"$in": list(names)}
             })
     if not or_query:

@@ -510,7 +530,7 @@ def get_version_by_id(project_name, version_id, fields=None):
         Dict: Version document which can be reduced to specified 'fields'.
     """

-    version_id = _convert_id(version_id)
+    version_id = convert_id(version_id)
     if not version_id:
         return None

@@ -537,7 +557,7 @@ def get_version_by_name(project_name, version, subset_id, fields=None):
         Dict: Version document which can be reduced to specified 'fields'.
     """

-    subset_id = _convert_id(subset_id)
+    subset_id = convert_id(subset_id)
     if not subset_id:
         return None

@@ -567,7 +587,7 @@ def version_is_latest(project_name, version_id):
         bool: True if is latest version from subset else False.
     """

-    version_id = _convert_id(version_id)
+    version_id = convert_id(version_id)
     if not version_id:
         return False
     version_doc = get_version_by_id(

@@ -610,13 +630,13 @@ def _get_versions(
     query_filter = {"type": {"$in": version_types}}

     if subset_ids is not None:
-        subset_ids = _convert_ids(subset_ids)
+        subset_ids = convert_ids(subset_ids)
         if not subset_ids:
             return []
         query_filter["parent"] = {"$in": subset_ids}

     if version_ids is not None:
-        version_ids = _convert_ids(version_ids)
+        version_ids = convert_ids(version_ids)
         if not version_ids:
             return []
         query_filter["_id"] = {"$in": version_ids}

@@ -690,7 +710,7 @@ def get_hero_version_by_subset_id(project_name, subset_id, fields=None):
         Dict: Hero version entity data.
     """

-    subset_id = _convert_id(subset_id)
+    subset_id = convert_id(subset_id)
     if not subset_id:
         return None

@@ -720,7 +740,7 @@ def get_hero_version_by_id(project_name, version_id, fields=None):
         Dict: Hero version entity data.
     """

-    version_id = _convert_id(version_id)
+    version_id = convert_id(version_id)
     if not version_id:
         return None

@@ -786,7 +806,7 @@ def get_output_link_versions(project_name, version_id, fields=None):
         links for passed version.
     """

-    version_id = _convert_id(version_id)
+    version_id = convert_id(version_id)
     if not version_id:
         return []

@@ -812,7 +832,7 @@ def get_last_versions(project_name, subset_ids, fields=None):
         dict[ObjectId, int]: Key is subset id and value is last version name.
     """

-    subset_ids = _convert_ids(subset_ids)
+    subset_ids = convert_ids(subset_ids)
     if not subset_ids:
         return {}

@@ -898,7 +918,7 @@ def get_last_version_by_subset_id(project_name, subset_id, fields=None):
         Dict: Version document which can be reduced to specified 'fields'.
     """

-    subset_id = _convert_id(subset_id)
+    subset_id = convert_id(subset_id)
     if not subset_id:
         return None

@@ -971,7 +991,7 @@ def get_representation_by_id(project_name, representation_id, fields=None):
         "type": {"$in": repre_types}
     }
     if representation_id is not None:
-        query_filter["_id"] = _convert_id(representation_id)
+        query_filter["_id"] = convert_id(representation_id)

     conn = get_project_connection(project_name)

@@ -996,7 +1016,7 @@ def get_representation_by_name(
         to specified 'fields'.
     """

-    version_id = _convert_id(version_id)
+    version_id = convert_id(version_id)
     if not version_id or not representation_name:
         return None
     repre_types = ["representation", "archived_representations"]

@@ -1089,7 +1109,7 @@ def _get_representations(
     query_filter = {"type": {"$in": repre_types}}

     if representation_ids is not None:
-        representation_ids = _convert_ids(representation_ids)
+        representation_ids = convert_ids(representation_ids)
         if not representation_ids:
             return default_output
         query_filter["_id"] = {"$in": representation_ids}

@@ -1100,7 +1120,7 @@ def _get_representations(
         query_filter["name"] = {"$in": list(representation_names)}

     if version_ids is not None:
-        version_ids = _convert_ids(version_ids)
+        version_ids = convert_ids(version_ids)
         if not version_ids:
             return default_output
         query_filter["parent"] = {"$in": version_ids}

@@ -1111,7 +1131,7 @@ def _get_representations(
     for version_id, names in names_by_version_ids.items():
         if version_id and names:
             or_query.append({
-                "parent": _convert_id(version_id),
+                "parent": convert_id(version_id),
                 "name": {"$in": list(names)}
             })
     if not or_query:

@@ -1361,7 +1381,7 @@ def get_thumbnail_id_from_source(project_name, src_type, src_id):
     if not src_type or not src_id:
         return None

-    query_filter = {"_id": _convert_id(src_id)}
+    query_filter = {"_id": convert_id(src_id)}

     conn = get_project_connection(project_name)
     src_doc = conn.find_one(query_filter, {"data.thumbnail_id"})

@@ -1388,7 +1408,7 @@ def get_thumbnails(project_name, thumbnail_ids, fields=None):
     """

     if thumbnail_ids:
-        thumbnail_ids = _convert_ids(thumbnail_ids)
+        thumbnail_ids = convert_ids(thumbnail_ids)

     if not thumbnail_ids:
         return []

@@ -1416,7 +1436,7 @@ def get_thumbnail(project_name, thumbnail_id, fields=None):

     if not thumbnail_id:
         return None
-    query_filter = {"type": "thumbnail", "_id": _convert_id(thumbnail_id)}
+    query_filter = {"type": "thumbnail", "_id": convert_id(thumbnail_id)}
     conn = get_project_connection(project_name)
     return conn.find_one(query_filter, _prepare_fields(fields))

@@ -1444,7 +1464,7 @@ def get_workfile_info(

     query_filter = {
         "type": "workfile",
-        "parent": _convert_id(asset_id),
+        "parent": convert_id(asset_id),
         "task_name": task_name,
         "filename": filename
     }

@@ -1455,7 +1475,7 @@ def get_workfile_info(
     """
     ## Custom data storage:
     - Settings - OP settings overrides and local settings
-    - Logging - logs from PypeLogger
+    - Logging - logs from Logger
     - Webpublisher - jobs
     - Ftrack - events
     - Maya - Shaders
openpype/client/entity_links.py (new file, 232 lines)

@@ -0,0 +1,232 @@
from .mongo import get_project_connection
from .entities import (
    get_assets,
    get_asset_by_id,
    get_representation_by_id,
    convert_id,
)


def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
    """Extract linked asset ids from asset document.

    One of asset document or asset id must be passed.

    Note:
        Asset links currently work only from asset to assets.

    Args:
        project_name (str): Name of project where to look for queried entities.
        asset_doc (dict): Asset document from DB.
        asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
            asset document.

    Returns:
        List[Union[ObjectId, str]]: Asset ids of input links.
    """

    output = []
    if not asset_doc and not asset_id:
        return output

    if not asset_doc:
        asset_doc = get_asset_by_id(
            project_name, asset_id, fields=["data.inputLinks"]
        )

    input_links = asset_doc["data"].get("inputLinks")
    if not input_links:
        return output

    for item in input_links:
        # Backwards compatibility for "_id" key which was replaced with "id"
        if "_id" in item:
            link_id = item["_id"]
        else:
            link_id = item["id"]
        output.append(link_id)
    return output


def get_linked_assets(
    project_name, asset_doc=None, asset_id=None, fields=None
):
    """Return linked assets based on passed asset document.

    One of asset document or asset id must be passed.

    Args:
        project_name (str): Name of project where to look for queried entities.
        asset_doc (Dict[str, Any]): Asset document from database.
        asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
            asset document.
        fields (Iterable[str]): Fields that should be returned. All fields are
            returned if 'None' is passed.

    Returns:
        List[Dict[str, Any]]: Asset documents of input links for passed
            asset doc.
    """

    if not asset_doc:
        if not asset_id:
            return []
        asset_doc = get_asset_by_id(
            project_name,
            asset_id,
            fields=["data.inputLinks"]
        )
        if not asset_doc:
            return []

    link_ids = get_linked_asset_ids(project_name, asset_doc=asset_doc)
    if not link_ids:
        return []

    return list(get_assets(project_name, asset_ids=link_ids, fields=fields))


def get_linked_representation_id(
    project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
):
    """Returns list of linked ids of particular type (if provided).

    One of representation document or representation id must be passed.

    Note:
        Representation links currently work only from representation through
        version back to representations.

    Args:
        project_name (str): Name of project where to look for links.
        repre_doc (Dict[str, Any]): Representation document.
        repre_id (Union[ObjectId, str]): Representation id.
        link_type (str): Type of link (e.g. 'reference', ...).
        max_depth (int): Limit recursion level. Default: 0

    Returns:
        List[ObjectId]: Linked representation ids.
    """

    if repre_doc:
        repre_id = repre_doc["_id"]

    if repre_id:
        repre_id = convert_id(repre_id)

    if not repre_id and not repre_doc:
        return []

    version_id = None
    if repre_doc:
        version_id = repre_doc.get("parent")

    if not version_id:
        repre_doc = get_representation_by_id(
            project_name, repre_id, fields=["parent"]
        )
        version_id = repre_doc["parent"]

    if not version_id:
        return []

    if max_depth is None:
        max_depth = 0

    match = {
        "_id": version_id,
        "type": {"$in": ["version", "hero_version"]}
    }

    graph_lookup = {
        "from": project_name,
        "startWith": "$data.inputLinks.id",
        "connectFromField": "data.inputLinks.id",
        "connectToField": "_id",
        "as": "outputs_recursive",
        "depthField": "depth"
    }
    if max_depth != 0:
        # We offset by -1 since 0 basically means no recursion
        # but the recursion only happens after the initial lookup
        # for outputs.
        graph_lookup["maxDepth"] = max_depth - 1

    query_pipeline = [
        # Match
        {"$match": match},
        # Recursive graph lookup for inputs
        {"$graphLookup": graph_lookup}
    ]

    conn = get_project_connection(project_name)
    result = conn.aggregate(query_pipeline)
    referenced_version_ids = _process_referenced_pipeline_result(
        result, link_type
    )
    if not referenced_version_ids:
        return []

    ref_ids = conn.distinct(
        "_id",
        filter={
            "parent": {"$in": list(referenced_version_ids)},
            "type": "representation"
        }
    )

    return list(ref_ids)


def _process_referenced_pipeline_result(result, link_type):
    """Filters result from pipeline for particular link_type.

    Pipeline cannot use link_type directly in a query.

    Returns:
        (list)
    """

    referenced_version_ids = set()
    correctly_linked_ids = set()
    for item in result:
        input_links = item["data"].get("inputLinks")
        if not input_links:
            continue

        _filter_input_links(
            input_links,
            link_type,
            correctly_linked_ids
        )

        # 'outputs_recursive' come in random order, sort them by depth
        outputs_recursive = item.get("outputs_recursive")
        if not outputs_recursive:
            continue

        for output in sorted(outputs_recursive, key=lambda o: o["depth"]):
            output_links = output["data"].get("inputLinks")
            if not output_links:
                continue

            # Leaf
            if output["_id"] not in correctly_linked_ids:
                continue

            _filter_input_links(
                output_links,
                link_type,
                correctly_linked_ids
            )

            referenced_version_ids.add(output["_id"])

    return referenced_version_ids


def _filter_input_links(input_links, link_type, correctly_linked_ids):
    for input_link in input_links:
        if link_type and input_link["type"] != link_type:
            continue

        link_id = input_link.get("id") or input_link.get("_id")
        if link_id is not None:
            correctly_linked_ids.add(link_id)
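An illustrative call of the new helpers; the project name and ids are placeholders:

```python
from openpype.client.entity_links import (
    get_linked_assets,
    get_linked_representation_id,
)

project_name = "demo_project"

# Asset documents linked to a known asset id, reduced to their names.
linked_assets = get_linked_assets(
    project_name, asset_id="633a0f7f9f0a4d2b8c1e4b2a", fields=["name"]
)

# Representation ids reachable through 'reference' links, one level deep.
linked_repre_ids = get_linked_representation_id(
    project_name,
    repre_id="633a0f7f9f0a4d2b8c1e4b2b",
    link_type="reference",
    max_depth=1,
)
```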
openpype/client/notes.md (new file, 39 lines)

@@ -0,0 +1,39 @@
# Client functionality

## Reason
Preparation for the OpenPype v4 server. The goal is to remove direct mongo calls in code and to think about database access less as mongo calls and more universally, so the code is ready for a different source of data. To do so, a simple wrapper around database calls was implemented so pymongo-specific code is not used directly.

The current goal is not to make a universal database model which can be easily replaced with any different source of data, but to get as close as possible. The current implementation of OpenPype is too tightly connected to pymongo and its abilities, so we're trying to get closer with long-term changes that can be used even in the current state.

## Queries
Query functions don't use the full potential of mongo queries, like very specific queries based on subdictionaries or unknown structures. We try to avoid these calls as much as possible because they probably won't be available in the future. If one is really necessary, a new function can be added, but only if it's reasonable for the overall logic. All query functions were moved to `~/client/entities.py`. Each function has arguments with the available filters and a possible reduction of returned keys for each entity, as the sketch below shows.
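For instance, a minimal sketch of the calling convention (the project name and id are placeholders):

```python
from openpype.client.entities import get_assets

# Filter by ids and reduce the returned keys to what the caller needs.
asset_docs = get_assets(
    "demo_project",
    asset_ids=["633a0f7f9f0a4d2b8c1e4b2a"],
    fields=["name", "data.visualParent"],
)
```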
## Changes
Changes are a little bit complicated. Mongo has many options for how an update can happen, which had to be reduced; it would also be complicated at this stage to validate values which are created or updated, thus there is almost no automation at this point. Changes can be made using operations available in `~/client/operations.py`. Each operation requires a project name and entity type, but may also require operation-specific data.

### Create
Create operations expect already prepared document data; for that, functions creating skeletal structures of documents are prepared (they do not fill all required data). Except for `_id`, all data should be right. Existence of the entity is not validated, so if the same creation operation is sent n times it will create the entity n times, which can cause issues.

### Update
An update operation requires the entity id and the keys that should be changed; the update dictionary must have the form {"key": value}. If a value should be set in a nested dictionary, the key must contain all subkeys joined with a dot `.` (e.g. `{"data": {"fps": 25}}` -> `{"data.fps": 25}`). To simplify creating update dictionaries, helper functions were prepared which do that for you; their names follow the template `prepare_<entity type>_update_data` - they work on a comparison of the previous document and the new document. If a function is missing for a requested entity type, it is because we didn't need it yet and it requires implementation. A flattening sketch follows below.
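A minimal sketch of that flattening rule (this is not the OpenPype helper itself, just the dot-joining described above):

```python
def flatten_update_data(update_data, _prefix=""):
    """Flatten nested dicts into mongo-style dotted keys."""
    output = {}
    for key, value in update_data.items():
        full_key = _prefix + key
        if isinstance(value, dict):
            output.update(flatten_update_data(value, full_key + "."))
        else:
            output[full_key] = value
    return output


print(flatten_update_data({"data": {"fps": 25}, "name": "sh010"}))
# {'data.fps': 25, 'name': 'sh010'}
```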
### Delete
A delete operation needs the entity id. The entity will be deleted from mongo.


## What (probably) won't be replaced
Some parts of the code are still using direct mongo calls. In most cases these are very specific calls that are module specific, or their usage will completely change in the future.
- Mongo calls that are not project specific (outside of the `avalon` collection) will be removed or will have to use a different mechanism for how the data are stored. At this moment this relates to OpenPype settings and logs, ftrack server events, and some other data.
- Sync server queries. They're complex and very specific to the sync server module. Their replacement will require specific calls to the OpenPype server in v4, thus their abstraction with a wrapper is irrelevant and would complicate production in v3.
- Project managers (ftrack, kitsu, shotgrid, embedded Project Manager, etc.). Project managers are creating, updating or removing assets in v3, but in v4 they will create folders with a different structure. Wrapping creation of assets would not help to prepare for v4 because of the new data structures. The same can be said about the editorial Extract Hierarchy Avalon plugin which creates the project structure.
- Code parts that are marked as deprecated in v3 or will be deprecated in v4:
    - integrate asset legacy publish plugin - already legacy, kept for safety
    - integrate thumbnail - thumbnails will be stored in a different way in v4
    - input links - links will be stored in a different way and will have a different mechanism of linking. In v3, links are limited to the same entity type: "asset <-> asset" or "representation <-> representation".

## Known missing replacements
- change subset group in loader tool
- integrate subset group
- query input links in openpype lib
- create project in openpype lib
- save/create workfile doc in openpype lib
- integrate hero version
@@ -1,3 +1,4 @@
 import re
 import uuid
 import copy
 import collections

@@ -8,9 +9,15 @@ from bson.objectid import ObjectId
 from pymongo import DeleteOne, InsertOne, UpdateOne

 from .mongo import get_project_connection
+from .entities import get_project

 REMOVED_VALUE = object()

+PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
+PROJECT_NAME_REGEX = re.compile(
+    "^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
+)
+
 CURRENT_PROJECT_SCHEMA = "openpype:project-3.0"
 CURRENT_PROJECT_CONFIG_SCHEMA = "openpype:config-2.0"
 CURRENT_ASSET_DOC_SCHEMA = "openpype:asset-3.0"

@@ -18,6 +25,7 @@ CURRENT_SUBSET_SCHEMA = "openpype:subset-3.0"
 CURRENT_VERSION_SCHEMA = "openpype:version-3.0"
 CURRENT_REPRESENTATION_SCHEMA = "openpype:representation-2.0"
 CURRENT_WORKFILE_INFO_SCHEMA = "openpype:workfile-1.0"
+CURRENT_THUMBNAIL_SCHEMA = "openpype:thumbnail-1.0"


 def _create_or_convert_to_mongo_id(mongo_id):

@@ -189,6 +197,29 @@ def new_representation_doc(
     }


+def new_thumbnail_doc(data=None, entity_id=None):
+    """Create skeleton data of thumbnail document.
+
+    Args:
+        data (Dict[str, Any]): Thumbnail document data.
+        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
+            created if not passed.
+
+    Returns:
+        Dict[str, Any]: Skeleton of thumbnail document.
+    """
+
+    if data is None:
+        data = {}
+
+    return {
+        "_id": _create_or_convert_to_mongo_id(entity_id),
+        "type": "thumbnail",
+        "schema": CURRENT_THUMBNAIL_SCHEMA,
+        "data": data
+    }
+
+
 def new_workfile_info_doc(
     filename, asset_id, task_name, files, data=None, entity_id=None
 ):

@@ -444,7 +475,7 @@ class UpdateOperation(AbstractOperation):
         set_data = {}
         for key, value in self._update_data.items():
             if value is REMOVED_VALUE:
-                unset_data[key] = value
+                unset_data[key] = None
             else:
                 set_data[key] = value

@@ -632,3 +663,89 @@ class OperationsSession(object):
         operation = DeleteOperation(project_name, entity_type, entity_id)
         self.add(operation)
         return operation
+
+
+def create_project(project_name, project_code, library_project=False):
+    """Create project using OpenPype settings.
+
+    This project creation function is not validating the project document on
+    creation. That is because the project document is created blindly with
+    only the minimum required information about the project, which is its
+    name, code, type and schema.
+
+    The entered project name must be unique and the project must not exist
+    yet.
+
+    Note:
+        This function is here to be OP v4 ready but in v3 has more logic
+        to do. That's why inner imports are in the body.
+
+    Args:
+        project_name (str): New project name. Should be unique.
+        project_code (str): Project's code should be unique too.
+        library_project (bool): Project is library project.
+
+    Raises:
+        ValueError: When project name already exists in MongoDB.
+
+    Returns:
+        dict: Created project document.
+    """
+
+    from openpype.settings import ProjectSettings, SaveWarningExc
+    from openpype.pipeline.schema import validate
+
+    if get_project(project_name, fields=["name"]):
+        raise ValueError("Project with name \"{}\" already exists".format(
+            project_name
+        ))
+
+    if not PROJECT_NAME_REGEX.match(project_name):
+        raise ValueError((
+            "Project name \"{}\" contains invalid characters"
+        ).format(project_name))
+
+    project_doc = {
+        "type": "project",
+        "name": project_name,
+        "data": {
+            "code": project_code,
+            "library_project": library_project
+        },
+        "schema": CURRENT_PROJECT_SCHEMA
+    }
+
+    op_session = OperationsSession()
+    # Insert document with basic data
+    create_op = op_session.create_entity(
+        project_name, project_doc["type"], project_doc
+    )
+    op_session.commit()
+
+    # Load ProjectSettings for the project and save it to store all attributes
+    # and Anatomy
+    try:
+        project_settings_entity = ProjectSettings(project_name)
+        project_settings_entity.save()
+    except SaveWarningExc as exc:
+        print(str(exc))
+    except Exception:
+        op_session.delete_entity(
+            project_name, project_doc["type"], create_op.entity_id
+        )
+        op_session.commit()
+        raise
+
+    project_doc = get_project(project_name)
+
+    try:
+        # Validate created project document
+        validate(project_doc)
+    except Exception:
+        # Remove project if it is not valid
+        op_session.delete_entity(
+            project_name, project_doc["type"], create_op.entity_id
+        )
+        op_session.commit()
+        raise
+
+    return project_doc
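Illustrative usage of the new function, assuming a configured Mongo connection and OpenPype settings for the running session (the names are placeholders):

```python
from openpype.client.operations import create_project

project_doc = create_project("demo_project", "demo", library_project=False)
print(project_doc["name"], project_doc["data"]["code"])
# -> demo_project demo
```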
@@ -1,8 +1,6 @@
 import os
-from openpype.lib import (
-    PreLaunchHook,
-    create_workdir_extra_folders
-)
+from openpype.lib import PreLaunchHook
+from openpype.pipeline.workfile import create_workdir_extra_folders


 class AddLastWorkfileToLaunchArgs(PreLaunchHook):
@@ -1,13 +1,22 @@
from .host import (
    HostBase,
)

from .interfaces import (
    IWorkfileHost,
    ILoadHost,
    INewPublisher,
)

from .dirmap import HostDirmap


__all__ = (
    "HostBase",

    "IWorkfileHost",
    "ILoadHost",
    "INewPublisher",

    "HostDirmap",
)
205
openpype/host/dirmap.py
Normal file
205
openpype/host/dirmap.py
Normal file
|
|
@ -0,0 +1,205 @@
|
|||
"""Dirmap functionality used in host integrations inside DCCs.
|
||||
|
||||
Idea for current dirmap implementation was used from Maya where is possible to
|
||||
enter source and destination roots and maya will try each found source
|
||||
in referenced file replace with each destionation paths. First path which
|
||||
exists is used.
|
||||
"""
|
||||
|
||||
import os
|
||||
from abc import ABCMeta, abstractmethod
|
||||
|
||||
import six
|
||||
|
||||
from openpype.lib import Logger
|
||||
from openpype.modules import ModulesManager
|
||||
from openpype.settings import get_project_settings
from openpype.settings.lib import get_site_local_overrides


@six.add_metaclass(ABCMeta)
class HostDirmap(object):
    """Abstract class for running dirmap on a workfile in a host.

    Dirmap is used to translate paths inside of a host workfile from one
    OS to another. (E.g. an artist created a workfile on Windows and a
    different artist opens the same file on Linux.)

    Expects methods to be implemented inside of host:
        on_enable_dirmap: run host code for enabling dirmap
        dirmap_routine: run host code to do the actual remapping
    """

    def __init__(
        self, host_name, project_name, project_settings=None, sync_module=None
    ):
        self.host_name = host_name
        self.project_name = project_name
        self._project_settings = project_settings
        self._sync_module = sync_module  # to limit reinit of Modules
        self._log = None
        self._mapping = None  # cache mapping

    @property
    def sync_module(self):
        if self._sync_module is None:
            manager = ModulesManager()
            self._sync_module = manager["sync_server"]
        return self._sync_module

    @property
    def project_settings(self):
        if self._project_settings is None:
            self._project_settings = get_project_settings(self.project_name)
        return self._project_settings

    @property
    def log(self):
        if self._log is None:
            self._log = Logger.get_logger(self.__class__.__name__)
        return self._log

    @abstractmethod
    def on_enable_dirmap(self):
        """Run host dependent operation for enabling dirmap if necessary."""
        pass

    @abstractmethod
    def dirmap_routine(self, source_path, destination_path):
        """Run host dependent remapping from source_path to destination_path"""
        pass

    def process_dirmap(self):
        # type: () -> None
        """Go through all paths in Settings and set them using 'dirmap'.

        If the artist has Site Sync enabled, take the dirmap mapping
        directly from Local Settings when the artist is syncing a
        workfile locally.
        """

        if not self._mapping:
            self._mapping = self.get_mappings(self.project_settings)
        if not self._mapping:
            return

        self.log.info("Processing directory mapping ...")
        self.on_enable_dirmap()
        self.log.info("mapping:: {}".format(self._mapping))

        for k, sp in enumerate(self._mapping["source-path"]):
            dst = self._mapping["destination-path"][k]
            try:
                print("{} -> {}".format(sp, dst))
                self.dirmap_routine(sp, dst)
            except IndexError:
                # missing corresponding destination path
                self.log.error((
                    "invalid dirmap mapping, missing corresponding"
                    " destination directory."
                ))
                break
            except RuntimeError:
                self.log.error(
                    "invalid path {} -> {}, mapping not registered".format(
                        sp, dst
                    )
                )
                continue

    def get_mappings(self, project_settings):
        """Get translation from source-path to destination-path.

        It checks if Site Sync is enabled and the user chose to use the
        local site; in that case the configuration in Local Settings takes
        precedence.
        """

        local_mapping = self._get_local_sync_dirmap(project_settings)
        dirmap_label = "{}-dirmap".format(self.host_name)
        if (
            not self.project_settings[self.host_name].get(dirmap_label)
            and not local_mapping
        ):
            return {}
        mapping_settings = self.project_settings[self.host_name][dirmap_label]
        mapping_enabled = mapping_settings["enabled"] or bool(local_mapping)
        if not mapping_enabled:
            return {}

        mapping = (
            local_mapping
            or mapping_settings["paths"]
            or {}
        )

        if (
            not mapping
            or not mapping.get("destination-path")
            or not mapping.get("source-path")
        ):
            return {}
        return mapping

    def _get_local_sync_dirmap(self, project_settings):
        """
        Returns dirmap if sync to a local project is enabled.

        The only valid mapping is from the roots of the remote site to the
        local site set in Local Settings.

        Args:
            project_settings (dict)
        Returns:
            dict : { "source-path": [XXX], "destination-path": [YYYY]}
        """

        mapping = {}

        if not project_settings["global"]["sync_server"]["enabled"]:
            return mapping

        project_name = os.getenv("AVALON_PROJECT")

        active_site = self.sync_module.get_local_normalized_site(
            self.sync_module.get_active_site(project_name))
        remote_site = self.sync_module.get_local_normalized_site(
            self.sync_module.get_remote_site(project_name))
        self.log.debug(
            "active {} - remote {}".format(active_site, remote_site)
        )

        if (
            active_site == "local"
            and project_name in self.sync_module.get_enabled_projects()
            and active_site != remote_site
        ):
            sync_settings = self.sync_module.get_sync_project_setting(
                project_name,
                exclude_locals=False,
                cached=False)

            active_overrides = get_site_local_overrides(
                project_name, active_site)
            remote_overrides = get_site_local_overrides(
                project_name, remote_site)

            self.log.debug("local overrides {}".format(active_overrides))
            self.log.debug("remote overrides {}".format(remote_overrides))
            for root_name, active_site_dir in active_overrides.items():
                remote_site_dir = (
                    remote_overrides.get(root_name)
                    or sync_settings["sites"][remote_site]["root"][root_name]
                )
                if os.path.isdir(active_site_dir):
                    if "destination-path" not in mapping:
                        mapping["destination-path"] = []
                    mapping["destination-path"].append(active_site_dir)

                    if "source-path" not in mapping:
                        mapping["source-path"] = []
                    mapping["source-path"].append(remote_site_dir)

            self.log.debug("local sync mapping:: {}".format(mapping))
        return mapping
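For orientation, a minimal sketch of what a host-side subclass could look like; the class name and the remapping behavior are invented for illustration, real implementations live in the individual host integrations:

# A minimal sketch of a host-side HostDirmap subclass. The host name and
# the behavior of dirmap_routine are assumptions for illustration only.
class ExampleHostDirmap(HostDirmap):
    def on_enable_dirmap(self):
        # Some hosts must toggle a native dirmap feature first;
        # this example host needs no preparation.
        pass

    def dirmap_routine(self, source_path, destination_path):
        # Host dependent remapping, e.g. calling the DCC's own
        # path-mapping API. Here we only log the translation.
        self.log.info(
            "Remapping {} -> {}".format(source_path, destination_path)
        )


# Usage: build it for a project/host pair and run the mapping once.
dirmap = ExampleHostDirmap("examplehost", "my_project")
dirmap.process_dirmap()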
@@ -1,37 +1,12 @@
import logging
import contextlib
from abc import ABCMeta, abstractproperty, abstractmethod
from abc import ABCMeta, abstractproperty
import six

# NOTE can't import 'typing' because of issues in Maya 2020
# - shiboken crashes on 'typing' module import


class MissingMethodsError(ValueError):
    """Exception raised when a host is missing methods required for a
    specific workflow.

    Args:
        host (HostBase): Host implementation where methods are missing.
        missing_methods (list[str]): List of missing methods.
    """

    def __init__(self, host, missing_methods):
        joined_missing = ", ".join(
            ['"{}"'.format(item) for item in missing_methods]
        )
        if isinstance(host, HostBase):
            host_name = host.name
        else:
            try:
                host_name = host.__file__.replace("\\", "/").split("/")[-3]
            except Exception:
                host_name = str(host)
        message = (
            "Host \"{}\" is missing methods {}".format(
                host_name, joined_missing
            )
        )
        super(MissingMethodsError, self).__init__(message)


@six.add_metaclass(ABCMeta)
class HostBase(object):
    """Base of host implementation class.

@@ -185,347 +160,3 @@ class HostBase(object):
            yield
        finally:
            pass


class ILoadHost:
    """Implementation requirements to be able to use referencing of
    representations.

    The load plugins can do referencing even without implementation of the
    methods here, but switching and removal of containers would not be
    possible.

    Questions:
    - Is the container list a dependency of the host or of load plugins?
    - Should this be directly in HostBase?
        - how to find out if referencing is available?
        - do we need to know that?
    """

    @staticmethod
    def get_missing_load_methods(host):
        """Look for missing methods on "old type" host implementation.

        Method is used for validation of implemented functions related to
        loading. Checks only existence of methods.

        Args:
            host (Union[ModuleType, HostBase]): Host object where to look
                for required methods.

        Returns:
            list[str]: Missing method implementations for loading workflow.
        """

        if isinstance(host, ILoadHost):
            return []

        required = ["ls"]
        missing = []
        for name in required:
            if not hasattr(host, name):
                missing.append(name)
        return missing

    @staticmethod
    def validate_load_methods(host):
        """Validate implemented methods of "old type" host for load workflow.

        Args:
            host (Union[ModuleType, HostBase]): Host object to validate.

        Raises:
            MissingMethodsError: If there are missing methods on host
                implementation.
        """
        missing = ILoadHost.get_missing_load_methods(host)
        if missing:
            raise MissingMethodsError(host, missing)

    @abstractmethod
    def get_containers(self):
        """Retrieve referenced containers from scene.

        This can be implemented in hosts where referencing can be used.

        Todo:
            Rename function to something more self explanatory.
                Suggestion: 'get_containers'

        Returns:
            list[dict]: Information about loaded containers.
        """

        pass

    # --- Deprecated method names ---
    def ls(self):
        """Deprecated variant of 'get_containers'.

        Todo:
            Remove when all usages are replaced.
        """

        return self.get_containers()


@six.add_metaclass(ABCMeta)
class IWorkfileHost:
    """Implementation requirements to be able to use workfile utils and tool."""

    @staticmethod
    def get_missing_workfile_methods(host):
        """Look for missing methods on "old type" host implementation.

        Method is used for validation of implemented functions related to
        workfiles. Checks only existence of methods.

        Args:
            host (Union[ModuleType, HostBase]): Host object where to look
                for required methods.

        Returns:
            list[str]: Missing method implementations for workfiles workflow.
        """

        if isinstance(host, IWorkfileHost):
            return []

        required = [
            "open_file",
            "save_file",
            "current_file",
            "has_unsaved_changes",
            "file_extensions",
            "work_root",
        ]
        missing = []
        for name in required:
            if not hasattr(host, name):
                missing.append(name)
        return missing

    @staticmethod
    def validate_workfile_methods(host):
        """Validate methods of "old type" host for workfiles workflow.

        Args:
            host (Union[ModuleType, HostBase]): Host object to validate.

        Raises:
            MissingMethodsError: If there are missing methods on host
                implementation.
        """

        missing = IWorkfileHost.get_missing_workfile_methods(host)
        if missing:
            raise MissingMethodsError(host, missing)

    @abstractmethod
    def get_workfile_extensions(self):
        """Extensions that can be used for saving.

        Questions:
            This could potentially use 'HostDefinition'.
        """

        return []

    @abstractmethod
    def save_workfile(self, dst_path=None):
        """Save currently opened scene.

        Args:
            dst_path (str): Where the current scene should be saved. Or use
                current path if 'None' is passed.
        """

        pass

    @abstractmethod
    def open_workfile(self, filepath):
        """Open passed filepath in the host.

        Args:
            filepath (str): Path to workfile.
        """

        pass

    @abstractmethod
    def get_current_workfile(self):
        """Retrieve path to currently opened file.

        Returns:
            str: Path to file which is currently opened.
            None: If nothing is opened.
        """

        return None

    def workfile_has_unsaved_changes(self):
        """Check if the currently opened scene has unsaved changes.

        Not all hosts can know if the current scene is saved because the
        DCC's API does not support it.

        Returns:
            bool: True if scene is saved and False if it has unsaved
                modifications.
            None: Can't tell if the workfile has modifications.
        """

        return None

    def work_root(self, session):
        """Modify workdir per host.

        Default implementation keeps workdir untouched.

        Warnings:
            We must handle this modification in a more sophisticated way
            because this can't be called outside of the DCC, so opening of
            the last workfile (calculated before the DCC is launched) is
            complicated. Also breaking the defined work template is not a
            good idea.
            The only place where it's really used and makes sense is Maya,
            where workspace.mel can modify the subfolders where to look for
            Maya files.

        Args:
            session (dict): Session context data.

        Returns:
            str: Path to new workdir.
        """

        return session["AVALON_WORKDIR"]

    # --- Deprecated method names ---
    def file_extensions(self):
        """Deprecated variant of 'get_workfile_extensions'.

        Todo:
            Remove when all usages are replaced.
        """
        return self.get_workfile_extensions()

    def save_file(self, dst_path=None):
        """Deprecated variant of 'save_workfile'.

        Todo:
            Remove when all usages are replaced.
        """

        self.save_workfile()

    def open_file(self, filepath):
        """Deprecated variant of 'open_workfile'.

        Todo:
            Remove when all usages are replaced.
        """

        return self.open_workfile(filepath)

    def current_file(self):
        """Deprecated variant of 'get_current_workfile'.

        Todo:
            Remove when all usages are replaced.
        """

        return self.get_current_workfile()

    def has_unsaved_changes(self):
        """Deprecated variant of 'workfile_has_unsaved_changes'.

        Todo:
            Remove when all usages are replaced.
        """

        return self.workfile_has_unsaved_changes()


class INewPublisher:
    """Functions related to the new creation system in the new publisher.

    The new publisher stores not only information about each created
    instance but also some global data. At this moment the data relate only
    to context publish plugins, but that can extend in the future.
    """

    @staticmethod
    def get_missing_publish_methods(host):
        """Look for missing methods on "old type" host implementation.

        Method is used for validation of implemented functions related to
        new publish creation. Checks only existence of methods.

        Args:
            host (Union[ModuleType, HostBase]): Host module where to look
                for required methods.

        Returns:
            list[str]: Missing method implementations for new publisher
                workflow.
        """

        if isinstance(host, INewPublisher):
            return []

        required = [
            "get_context_data",
            "update_context_data",
        ]
        missing = []
        for name in required:
            if not hasattr(host, name):
                missing.append(name)
        return missing

    @staticmethod
    def validate_publish_methods(host):
        """Validate implemented methods of "old type" host.

        Args:
            host (Union[ModuleType, HostBase]): Host module to validate.

        Raises:
            MissingMethodsError: If there are missing methods on host
                implementation.
        """
        missing = INewPublisher.get_missing_publish_methods(host)
        if missing:
            raise MissingMethodsError(host, missing)

    @abstractmethod
    def get_context_data(self):
        """Get global data related to creation-publishing from workfile.

        These data are not related to any created instance but to the whole
        publishing context. Not saving/returning them will cause each reset
        of publishing to reset all values to defaults.

        Context data can contain information about enabled/disabled publish
        plugins or other values that can be filled by the artist.

        Returns:
            dict: Context data stored using 'update_context_data'.
        """

        pass

    @abstractmethod
    def update_context_data(self, data, changes):
        """Store global context data to workfile.

        Called when some values in context data have changed.

        Without storing the values in a way that 'get_context_data' can
        return them, each reset of publishing will lose the values the
        artist filled in. Best practice is to store values into the
        workfile, if possible.

        Args:
            data (dict): New data as a whole.
            changes (dict): Only data that has been changed. Each value is
                a tuple with '(<old>, <new>)' values.
        """

        pass
openpype/host/interfaces.py (new file)
@@ -0,0 +1,370 @@
from abc import ABCMeta, abstractmethod
import six


class MissingMethodsError(ValueError):
    """Exception raised when a host is missing methods required for a
    specific workflow.

    Args:
        host (HostBase): Host implementation where methods are missing.
        missing_methods (list[str]): List of missing methods.
    """

    def __init__(self, host, missing_methods):
        joined_missing = ", ".join(
            ['"{}"'.format(item) for item in missing_methods]
        )
        host_name = getattr(host, "name", None)
        if not host_name:
            try:
                host_name = host.__file__.replace("\\", "/").split("/")[-3]
            except Exception:
                host_name = str(host)
        message = (
            "Host \"{}\" is missing methods {}".format(
                host_name, joined_missing
            )
        )
        super(MissingMethodsError, self).__init__(message)


class ILoadHost:
    """Implementation requirements to be able to use referencing of
    representations.

    The load plugins can do referencing even without implementation of the
    methods here, but switching and removal of containers would not be
    possible.

    Questions:
    - Is the container list a dependency of the host or of load plugins?
    - Should this be directly in HostBase?
        - how to find out if referencing is available?
        - do we need to know that?
    """

    @staticmethod
    def get_missing_load_methods(host):
        """Look for missing methods on "old type" host implementation.

        Method is used for validation of implemented functions related to
        loading. Checks only existence of methods.

        Args:
            host (Union[ModuleType, HostBase]): Host object where to look
                for required methods.

        Returns:
            list[str]: Missing method implementations for loading workflow.
        """

        if isinstance(host, ILoadHost):
            return []

        required = ["ls"]
        missing = []
        for name in required:
            if not hasattr(host, name):
                missing.append(name)
        return missing

    @staticmethod
    def validate_load_methods(host):
        """Validate implemented methods of "old type" host for load workflow.

        Args:
            host (Union[ModuleType, HostBase]): Host object to validate.

        Raises:
            MissingMethodsError: If there are missing methods on host
                implementation.
        """
        missing = ILoadHost.get_missing_load_methods(host)
        if missing:
            raise MissingMethodsError(host, missing)

    @abstractmethod
    def get_containers(self):
        """Retrieve referenced containers from scene.

        This can be implemented in hosts where referencing can be used.

        Todo:
            Rename function to something more self explanatory.
                Suggestion: 'get_containers'

        Returns:
            list[dict]: Information about loaded containers.
        """

        pass

    # --- Deprecated method names ---
    def ls(self):
        """Deprecated variant of 'get_containers'.

        Todo:
            Remove when all usages are replaced.
        """

        return self.get_containers()


@six.add_metaclass(ABCMeta)
class IWorkfileHost:
    """Implementation requirements to be able to use workfile utils and tool."""

    @staticmethod
    def get_missing_workfile_methods(host):
        """Look for missing methods on "old type" host implementation.

        Method is used for validation of implemented functions related to
        workfiles. Checks only existence of methods.

        Args:
            host (Union[ModuleType, HostBase]): Host object where to look
                for required methods.

        Returns:
            list[str]: Missing method implementations for workfiles workflow.
        """

        if isinstance(host, IWorkfileHost):
            return []

        required = [
            "open_file",
            "save_file",
            "current_file",
            "has_unsaved_changes",
            "file_extensions",
            "work_root",
        ]
        missing = []
        for name in required:
            if not hasattr(host, name):
                missing.append(name)
        return missing

    @staticmethod
    def validate_workfile_methods(host):
        """Validate methods of "old type" host for workfiles workflow.

        Args:
            host (Union[ModuleType, HostBase]): Host object to validate.

        Raises:
            MissingMethodsError: If there are missing methods on host
                implementation.
        """

        missing = IWorkfileHost.get_missing_workfile_methods(host)
        if missing:
            raise MissingMethodsError(host, missing)

    @abstractmethod
    def get_workfile_extensions(self):
        """Extensions that can be used for saving.

        Questions:
            This could potentially use 'HostDefinition'.
        """

        return []

    @abstractmethod
    def save_workfile(self, dst_path=None):
        """Save currently opened scene.

        Args:
            dst_path (str): Where the current scene should be saved. Or use
                current path if 'None' is passed.
        """

        pass

    @abstractmethod
    def open_workfile(self, filepath):
        """Open passed filepath in the host.

        Args:
            filepath (str): Path to workfile.
        """

        pass

    @abstractmethod
    def get_current_workfile(self):
        """Retrieve path to currently opened file.

        Returns:
            str: Path to file which is currently opened.
            None: If nothing is opened.
        """

        return None

    def workfile_has_unsaved_changes(self):
        """Check if the currently opened scene has unsaved changes.

        Not all hosts can know if the current scene is saved because the
        DCC's API does not support it.

        Returns:
            bool: True if scene is saved and False if it has unsaved
                modifications.
            None: Can't tell if the workfile has modifications.
        """

        return None

    def work_root(self, session):
        """Modify workdir per host.

        Default implementation keeps workdir untouched.

        Warnings:
            We must handle this modification in a more sophisticated way
            because this can't be called outside of the DCC, so opening of
            the last workfile (calculated before the DCC is launched) is
            complicated. Also breaking the defined work template is not a
            good idea.
            The only place where it's really used and makes sense is Maya,
            where workspace.mel can modify the subfolders where to look for
            Maya files.

        Args:
            session (dict): Session context data.

        Returns:
            str: Path to new workdir.
        """

        return session["AVALON_WORKDIR"]

    # --- Deprecated method names ---
    def file_extensions(self):
        """Deprecated variant of 'get_workfile_extensions'.

        Todo:
            Remove when all usages are replaced.
        """
        return self.get_workfile_extensions()

    def save_file(self, dst_path=None):
        """Deprecated variant of 'save_workfile'.

        Todo:
            Remove when all usages are replaced.
        """

        self.save_workfile()

    def open_file(self, filepath):
        """Deprecated variant of 'open_workfile'.

        Todo:
            Remove when all usages are replaced.
        """

        return self.open_workfile(filepath)

    def current_file(self):
        """Deprecated variant of 'get_current_workfile'.

        Todo:
            Remove when all usages are replaced.
        """

        return self.get_current_workfile()

    def has_unsaved_changes(self):
        """Deprecated variant of 'workfile_has_unsaved_changes'.

        Todo:
            Remove when all usages are replaced.
        """

        return self.workfile_has_unsaved_changes()


class INewPublisher:
    """Functions related to the new creation system in the new publisher.

    The new publisher stores not only information about each created
    instance but also some global data. At this moment the data relate only
    to context publish plugins, but that can extend in the future.
    """

    @staticmethod
    def get_missing_publish_methods(host):
        """Look for missing methods on "old type" host implementation.

        Method is used for validation of implemented functions related to
        new publish creation. Checks only existence of methods.

        Args:
            host (Union[ModuleType, HostBase]): Host module where to look
                for required methods.

        Returns:
            list[str]: Missing method implementations for new publisher
                workflow.
        """

        if isinstance(host, INewPublisher):
            return []

        required = [
            "get_context_data",
            "update_context_data",
        ]
        missing = []
        for name in required:
            if not hasattr(host, name):
                missing.append(name)
        return missing

    @staticmethod
    def validate_publish_methods(host):
        """Validate implemented methods of "old type" host.

        Args:
            host (Union[ModuleType, HostBase]): Host module to validate.

        Raises:
            MissingMethodsError: If there are missing methods on host
                implementation.
        """
        missing = INewPublisher.get_missing_publish_methods(host)
        if missing:
            raise MissingMethodsError(host, missing)

    @abstractmethod
    def get_context_data(self):
        """Get global data related to creation-publishing from workfile.

        These data are not related to any created instance but to the whole
        publishing context. Not saving/returning them will cause each reset
        of publishing to reset all values to defaults.

        Context data can contain information about enabled/disabled publish
        plugins or other values that can be filled by the artist.

        Returns:
            dict: Context data stored using 'update_context_data'.
        """

        pass

    @abstractmethod
    def update_context_data(self, data, changes):
        """Store global context data to workfile.

        Called when some values in context data have changed.

        Without storing the values in a way that 'get_context_data' can
        return them, each reset of publishing will lose the values the
        artist filled in. Best practice is to store values into the
        workfile, if possible.

        Args:
            data (dict): New data as a whole.
            changes (dict): Only data that has been changed. Each value is
                a tuple with '(<old>, <new>)' values.
        """

        pass
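To make the contract concrete, here is a minimal sketch of a class satisfying 'IWorkfileHost'; the in-memory attribute is a stand-in for real DCC scene-management calls:

# Minimal sketch of an IWorkfileHost implementation. The in-memory
# _current_path attribute stands in for real DCC scene-management calls.
class ExampleWorkfileHost(IWorkfileHost):
    def __init__(self):
        self._current_path = None

    def get_workfile_extensions(self):
        return [".example"]

    def save_workfile(self, dst_path=None):
        self._current_path = dst_path or self._current_path

    def open_workfile(self, filepath):
        self._current_path = filepath

    def get_current_workfile(self):
        return self._current_path


# Instances of IWorkfileHost pass validation immediately; an "old type"
# host module would instead be checked for open_file, save_file, etc.
IWorkfileHost.validate_workfile_methods(ExampleWorkfileHost())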
@@ -1,9 +1,6 @@
def add_implementation_envs(env, _app):
    """Modify environments to contain all required for implementation."""
    defaults = {
        "OPENPYPE_LOG_NO_COLORS": "True",
        "WEBSOCKET_URL": "ws://localhost:8097/ws/"
    }
    for key, value in defaults.items():
        if not env.get(key):
            env[key] = value
from .addon import AfterEffectsAddon


__all__ = (
    "AfterEffectsAddon",
)
openpype/hosts/aftereffects/addon.py (new file)
@@ -0,0 +1,23 @@
from openpype.modules import OpenPypeModule
from openpype.modules.interfaces import IHostAddon


class AfterEffectsAddon(OpenPypeModule, IHostAddon):
    name = "aftereffects"
    host_name = "aftereffects"

    def initialize(self, module_settings):
        self.enabled = True

    def add_implementation_envs(self, env, _app):
        """Modify environments to contain all required for implementation."""
        defaults = {
            "OPENPYPE_LOG_NO_COLORS": "True",
            "WEBSOCKET_URL": "ws://localhost:8097/ws/"
        }
        for key, value in defaults.items():
            if not env.get(key):
                env[key] = value

    def get_workfile_extensions(self):
        return [".aep"]
@@ -1,13 +1,16 @@
import os
import sys
import re
import json
import contextlib
import traceback
import logging
from functools import partial

from Qt import QtWidgets

from openpype.pipeline import install_host
from openpype.lib.remote_publish import headless_publish
from openpype.modules import ModulesManager

from openpype.tools.utils import host_tools
from .launch_logic import ProcessLauncher, get_stub
@@ -35,10 +38,18 @@ def main(*subprocess_args):
    launcher.start()

    if os.environ.get("HEADLESS_PUBLISH"):
        launcher.execute_in_main_thread(lambda: headless_publish(
            log,
            "CloseAE",
            os.environ.get("IS_TEST")))
        manager = ModulesManager()
        webpublisher_addon = manager["webpublisher"]

        launcher.execute_in_main_thread(
            partial(
                webpublisher_addon.headless_publish,
                log,
                "CloseAE",
                os.environ.get("IS_TEST")
            )
        )

    elif os.environ.get("AVALON_PHOTOSHOP_WORKFILES_ON_LAUNCH", True):
        save = False
        if os.getenv("WORKFILES_SAVE_AS"):
@@ -68,3 +79,57 @@ def get_extension_manifest_path():
        "CSXS",
        "manifest.xml"
    )


def get_unique_layer_name(layers, name):
    """
    Gets all layer names and, if 'name' is present in them, increases the
    suffix by 1 (eg. creates a unique layer name - for Loader).
    Args:
        layers (list): of strings, names only
        name (string): checked value

    Returns:
        (string): name_00X (without version)
    """
    names = {}
    for layer in layers:
        layer_name = re.sub(r'_\d{3}$', '', layer)
        if layer_name in names.keys():
            names[layer_name] = names[layer_name] + 1
        else:
            names[layer_name] = 1
    occurrences = names.get(name, 0)

    return "{}_{:0>3d}".format(name, occurrences + 1)


def get_background_layers(file_url):
    """
    Pulls file names from a background json file, enriched with the folder
    url so AE is able to import the files.

    Order is important, follows order in json.

    Args:
        file_url (str): abs url of background json

    Returns:
        (list): of abs paths to images
    """
    with open(file_url) as json_file:
        data = json.load(json_file)

    layers = list()
    bg_folder = os.path.dirname(file_url)
    for child in data['children']:
        if child.get("filename"):
            layers.append(os.path.join(bg_folder, child.get("filename")).
                          replace("\\", "/"))
        else:
            for layer in child['children']:
                if layer.get("filename"):
                    layers.append(os.path.join(bg_folder,
                                               layer.get("filename")).
                                  replace("\\", "/"))
    return layers
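For illustration, a quick sketch of how 'get_unique_layer_name' behaves; the layer names are invented:

# Hypothetical usage of get_unique_layer_name; the layer names below
# are made up for illustration.
existing = ["hero_001", "hero_002", "tree_001"]

# Two layers share the "hero" base name, so the next suffix is 003.
print(get_unique_layer_name(existing, "hero"))  # hero_003

# "rock" does not exist yet, so the first suffix is used.
print(get_unique_layer_name(existing, "rock"))  # rock_001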
@@ -1,12 +1,11 @@
"""Host API required Work Files tool"""
import os

from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
from .launch_logic import get_stub


def file_extensions():
    return HOST_WORKFILE_EXTENSIONS["aftereffects"]
    return [".aep"]


def has_unsaved_changes():
@@ -1,14 +1,14 @@
import re

from openpype.lib import (
    get_background_layers,
    get_unique_layer_name
)
from openpype.pipeline import get_representation_path
from openpype.hosts.aftereffects.api import (
    AfterEffectsLoader,
    containerise
)
from openpype.hosts.aftereffects.api.lib import (
    get_background_layers,
    get_unique_layer_name,
)


class BackgroundLoader(AfterEffectsLoader):
@@ -1,12 +1,11 @@
import re

from openpype import lib

from openpype.pipeline import get_representation_path
from openpype.hosts.aftereffects.api import (
    AfterEffectsLoader,
    containerise
)
from openpype.hosts.aftereffects.api.lib import get_unique_layer_name


class FileLoader(AfterEffectsLoader):

@@ -28,7 +27,7 @@ class FileLoader(AfterEffectsLoader):
        stub = self.get_stub()
        layers = stub.get_items(comps=True, folders=True, footages=True)
        existing_layers = [layer.name for layer in layers]
        comp_name = lib.get_unique_layer_name(
        comp_name = get_unique_layer_name(
            existing_layers, "{}_{}".format(context["asset"]["name"], name))

        import_options = {}

@@ -87,7 +86,7 @@ class FileLoader(AfterEffectsLoader):
        if namespace_from_container != layer_name:
            layers = stub.get_items(comps=True)
            existing_layers = [layer.name for layer in layers]
            layer_name = lib.get_unique_layer_name(
            layer_name = get_unique_layer_name(
                existing_layers,
                "{}_{}".format(context["asset"], context["subset"]))
        else:  # switching version - keep same name
@@ -1,8 +1,8 @@
import os

import pyblish.api
from openpype.lib import get_subset_name_with_asset_doc
from openpype.pipeline import legacy_io
from openpype.pipeline.create import get_subset_name


class CollectWorkfile(pyblish.api.ContextPlugin):

@@ -71,13 +71,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):

        # workfile instance
        family = "workfile"
        subset = get_subset_name_with_asset_doc(
        subset = get_subset_name(
            family,
            self.default_variant,
            context.data["anatomyData"]["task"]["name"],
            context.data["assetEntity"],
            context.data["anatomyData"]["project"]["name"],
            host_name=context.data["hostName"]
            host_name=context.data["hostName"],
            project_settings=context.data["project_settings"]
        )
        # Create instance
        instance = context.create_instance(subset)
@@ -2,14 +2,18 @@ import os
import sys
import six

import openpype.api
from openpype.lib import (
    get_ffmpeg_tool_path,
    run_subprocess,
)
from openpype.pipeline import publish
from openpype.hosts.aftereffects.api import get_stub


class ExtractLocalRender(openpype.api.Extractor):
class ExtractLocalRender(publish.Extractor):
    """Render RenderQueue locally."""

    order = openpype.api.Extractor.order - 0.47
    order = publish.Extractor.order - 0.47
    label = "Extract Local Render"
    hosts = ["aftereffects"]
    families = ["renderLocal", "render.local"]

@@ -53,7 +57,7 @@ class ExtractLocalRender(openpype.api.Extractor):

        instance.data["representations"] = [repre_data]

        ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
        ffmpeg_path = get_ffmpeg_tool_path("ffmpeg")
        # Generate thumbnail.
        thumbnail_path = os.path.join(staging_dir, "thumbnail.jpg")

@@ -66,7 +70,7 @@ class ExtractLocalRender(openpype.api.Extractor):
        ]
        self.log.debug("Thumbnail args:: {}".format(args))
        try:
            output = openpype.lib.run_subprocess(args)
            output = run_subprocess(args)
        except TypeError:
            self.log.warning("Error in creating thumbnail")
            six.reraise(*sys.exc_info())
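The thumbnail argument list itself is built a few lines above this hunk; a typical single-frame ffmpeg call looks roughly like the sketch below. The input path and flags are assumptions for illustration, not the plugin's exact arguments:

# Minimal sketch of a single-frame thumbnail extraction with ffmpeg.
# first_file_path is a hypothetical path to the first rendered frame;
# the plugin builds its own args from the collected representation.
args = [
    ffmpeg_path,             # resolved by get_ffmpeg_tool_path("ffmpeg")
    "-y",                    # overwrite output if it exists
    "-i", first_file_path,   # first rendered frame of the sequence
    "-vframes", "1",         # grab a single frame
    thumbnail_path,
]
output = run_subprocess(args)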
@@ -1,13 +1,13 @@
import pyblish.api

import openpype.api
from openpype.pipeline import publish
from openpype.hosts.aftereffects.api import get_stub


class ExtractSaveScene(pyblish.api.ContextPlugin):
    """Save scene before extraction."""

    order = openpype.api.Extractor.order - 0.48
    order = publish.Extractor.order - 0.48
    label = "Extract Save Scene"
    hosts = ["aftereffects"]
@@ -1,6 +1,6 @@
import pyblish.api
from openpype.action import get_errored_plugins_from_data
from openpype.lib import version_up
from openpype.pipeline.publish import get_errored_plugins_from_context

from openpype.hosts.aftereffects.api import get_stub

@@ -18,7 +18,7 @@ class IncrementWorkfile(pyblish.api.InstancePlugin):
    optional = True

    def process(self, instance):
        errored_plugins = get_errored_plugins_from_data(instance.context)
        errored_plugins = get_errored_plugins_from_context(instance.context)
        if errored_plugins:
            raise RuntimeError(
                "Skipping incrementing current file because publishing failed."
@@ -1,8 +1,8 @@
import openpype.api
from openpype.pipeline import publish
from openpype.hosts.aftereffects.api import get_stub


class RemovePublishHighlight(openpype.api.Extractor):
class RemovePublishHighlight(publish.Extractor):
    """Clean UTF characters which are not working in DL

    Published compositions are marked with a unicode icon which causes

@@ -10,7 +10,7 @@ class RemovePublishHighlight(openpype.api.Extractor):
    rendering, add it later back to avoid confusion.
    """

    order = openpype.api.Extractor.order - 0.49  # just before save
    order = publish.Extractor.order - 0.49  # just before save
    label = "Clean render comp"
    hosts = ["aftereffects"]
    families = ["render.farm"]
@@ -1,9 +1,9 @@
import pyblish.api

import openpype.api
from openpype.pipeline import (
from openpype.pipeline import legacy_io
from openpype.pipeline.publish import (
    ValidateContentsOrder,
    PublishXmlValidationError,
    legacy_io,
)
from openpype.hosts.aftereffects.api import get_stub

@@ -50,7 +50,7 @@ class ValidateInstanceAsset(pyblish.api.InstancePlugin):
    label = "Validate Instance Asset"
    hosts = ["aftereffects"]
    actions = [ValidateInstanceAssetRepair]
    order = openpype.api.ValidateContentsOrder
    order = ValidateContentsOrder

    def process(self, instance):
        instance_asset = instance.data["asset"]
@@ -1,52 +1,6 @@
import os
from .addon import BlenderAddon


def add_implementation_envs(env, _app):
    """Modify environments to contain all required for implementation."""
    # Prepare path to implementation script
    implementation_user_script_path = os.path.join(
        os.path.dirname(os.path.abspath(__file__)),
        "blender_addon"
    )

    # Add blender implementation script path to PYTHONPATH
    python_path = env.get("PYTHONPATH") or ""
    python_path_parts = [
        path
        for path in python_path.split(os.pathsep)
        if path
    ]
    python_path_parts.insert(0, implementation_user_script_path)
    env["PYTHONPATH"] = os.pathsep.join(python_path_parts)

    # Modify Blender user scripts path
    previous_user_scripts = set()
    # Implementation path is added to set for easier paths check inside loops
    # - will be removed at the end
    previous_user_scripts.add(implementation_user_script_path)

    openpype_blender_user_scripts = (
        env.get("OPENPYPE_BLENDER_USER_SCRIPTS") or ""
    )
    for path in openpype_blender_user_scripts.split(os.pathsep):
        if path:
            previous_user_scripts.add(os.path.normpath(path))

    blender_user_scripts = env.get("BLENDER_USER_SCRIPTS") or ""
    for path in blender_user_scripts.split(os.pathsep):
        if path:
            previous_user_scripts.add(os.path.normpath(path))

    # Remove implementation path from user script paths as is set to
    # `BLENDER_USER_SCRIPTS`
    previous_user_scripts.remove(implementation_user_script_path)
    env["BLENDER_USER_SCRIPTS"] = implementation_user_script_path

    # Set custom user scripts env
    env["OPENPYPE_BLENDER_USER_SCRIPTS"] = os.pathsep.join(
        previous_user_scripts
    )

    # Define Qt binding if not defined
    if not env.get("QT_PREFERRED_BINDING"):
        env["QT_PREFERRED_BINDING"] = "PySide2"
__all__ = (
    "BlenderAddon",
)
openpype/hosts/blender/addon.py (new file)
@@ -0,0 +1,73 @@
import os
from openpype.modules import OpenPypeModule
from openpype.modules.interfaces import IHostAddon

BLENDER_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))


class BlenderAddon(OpenPypeModule, IHostAddon):
    name = "blender"
    host_name = "blender"

    def initialize(self, module_settings):
        self.enabled = True

    def add_implementation_envs(self, env, _app):
        """Modify environments to contain all required for implementation."""
        # Prepare path to implementation script
        implementation_user_script_path = os.path.join(
            BLENDER_ROOT_DIR,
            "blender_addon"
        )

        # Add blender implementation script path to PYTHONPATH
        python_path = env.get("PYTHONPATH") or ""
        python_path_parts = [
            path
            for path in python_path.split(os.pathsep)
            if path
        ]
        python_path_parts.insert(0, implementation_user_script_path)
        env["PYTHONPATH"] = os.pathsep.join(python_path_parts)

        # Modify Blender user scripts path
        previous_user_scripts = set()
        # Implementation path is added to set for easier paths check inside
        # loops - will be removed at the end
        previous_user_scripts.add(implementation_user_script_path)

        openpype_blender_user_scripts = (
            env.get("OPENPYPE_BLENDER_USER_SCRIPTS") or ""
        )
        for path in openpype_blender_user_scripts.split(os.pathsep):
            if path:
                previous_user_scripts.add(os.path.normpath(path))

        blender_user_scripts = env.get("BLENDER_USER_SCRIPTS") or ""
        for path in blender_user_scripts.split(os.pathsep):
            if path:
                previous_user_scripts.add(os.path.normpath(path))

        # Remove implementation path from user script paths as is set to
        # `BLENDER_USER_SCRIPTS`
        previous_user_scripts.remove(implementation_user_script_path)
        env["BLENDER_USER_SCRIPTS"] = implementation_user_script_path

        # Set custom user scripts env
        env["OPENPYPE_BLENDER_USER_SCRIPTS"] = os.pathsep.join(
            previous_user_scripts
        )

        # Define Qt binding if not defined
        if not env.get("QT_PREFERRED_BINDING"):
            env["QT_PREFERRED_BINDING"] = "PySide2"

    def get_launch_hook_paths(self, app):
        if app.host_name != self.host_name:
            return []
        return [
            os.path.join(BLENDER_ROOT_DIR, "hooks")
        ]

    def get_workfile_extensions(self):
        return [".blend"]
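As a quick illustration of what the addon's env modification does, a sketch exercising it on a plain dict; the pre-existing values are invented, and the bare instantiation is simplified (OpenPypeModule normally receives a manager and settings):

# Sketch only: instantiation simplified and the values below invented.
addon = BlenderAddon()
env = {
    "PYTHONPATH": "/studio/tools",
    "BLENDER_USER_SCRIPTS": "/studio/blender_scripts",
}
addon.add_implementation_envs(env, None)

# The bundled "blender_addon" dir is now first on PYTHONPATH and owns
# BLENDER_USER_SCRIPTS, while the previous script path has moved to
# OPENPYPE_BLENDER_USER_SCRIPTS.
print(env["BLENDER_USER_SCRIPTS"])
print(env["OPENPYPE_BLENDER_USER_SCRIPTS"])
print(env["QT_PREFERRED_BINDING"])  # "PySide2" (filled default)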
@@ -2,7 +2,7 @@ import bpy

import pyblish.api

from openpype.api import get_errored_instances_from_context
from openpype.pipeline.publish import get_errored_instances_from_context


class SelectInvalidAction(pyblish.api.Action):
@@ -234,7 +234,7 @@ def lsattrs(attrs: Dict) -> List:
def read(node: bpy.types.bpy_struct_meta_idprop):
    """Return user-defined attributes from `node`"""

    data = dict(node.get(pipeline.AVALON_PROPERTY))
    data = dict(node.get(pipeline.AVALON_PROPERTY, {}))

    # Ignore hidden/internal data
    data = {

@@ -26,7 +26,7 @@ PREVIEW_COLLECTIONS: Dict = dict()
# This seems like a good value to keep the Qt app responsive and doesn't slow
# down Blender. At least on macOS the interface of Blender gets very laggy if
# you make it smaller.
TIMER_INTERVAL: float = 0.01
TIMER_INTERVAL: float = 0.01 if platform.system() == "Windows" else 0.1


class BlenderApplication(QtWidgets.QApplication):
@@ -164,6 +164,12 @@ def _process_app_events() -> Optional[float]:
        dialog.setDetailedText(detail)
        dialog.exec_()

    # Refresh Manager
    if GlobalClass.app:
        manager = GlobalClass.app.get_window("WM_OT_avalon_manager")
        if manager:
            manager.refresh()

    if not GlobalClass.is_windows:
        if OpenFileCacher.opening_file:
            return TIMER_INTERVAL

@@ -192,10 +198,11 @@ class LaunchQtApp(bpy.types.Operator):
        self._app = BlenderApplication.get_app()
        GlobalClass.app = self._app

        bpy.app.timers.register(
            _process_app_events,
            persistent=True
        )
        if not bpy.app.timers.is_registered(_process_app_events):
            bpy.app.timers.register(
                _process_app_events,
                persistent=True
            )

    def execute(self, context):
        """Execute the operator.

@@ -5,8 +5,6 @@ from typing import List, Optional

import bpy

from openpype.pipeline import HOST_WORKFILE_EXTENSIONS


class OpenFileCacher:
    """Store information about opening file.

@@ -78,7 +76,7 @@ def has_unsaved_changes() -> bool:
def file_extensions() -> List[str]:
    """Return the supported file extensions for Blender scene files."""

    return HOST_WORKFILE_EXTENSIONS["blender"]
    return [".blend"]


def work_root(session: dict) -> str:
@@ -1,4 +1,10 @@
from openpype.pipeline import install_host
from openpype.hosts.blender import api

install_host(api)

def register():
    install_host(api)


def unregister():
    pass
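This change defers 'install_host' from import time into 'register()', which matches Blender's add-on lifecycle: Blender imports the startup module, then calls 'register()' on enable and 'unregister()' on disable. A minimal sketch of that contract; the 'bl_info' block and names are assumptions, not part of the diff:

# Sketch of the Blender add-on contract the change targets; bl_info and
# the print calls are illustrative assumptions only.
bl_info = {
    "name": "OpenPype Integration (sketch)",
    "blender": (2, 80, 0),
    "category": "Pipeline",
}


def register():
    # Defer heavy work (such as install_host) until Blender asks for it,
    # instead of running it as a side effect of import.
    print("register called by Blender")


def unregister():
    print("unregister called by Blender")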
@@ -6,12 +6,12 @@ from typing import Dict, List, Optional

import bpy

from openpype import lib
from openpype.pipeline import (
    legacy_create,
    get_representation_path,
    AVALON_CONTAINER_ID,
)
from openpype.pipeline.create import get_legacy_creator_by_name
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import (
    AVALON_CONTAINERS,

@@ -157,7 +157,7 @@ class BlendLayoutLoader(plugin.AssetLoader):
                    t.id = local_obj

            elif local_obj.type == 'EMPTY':
                creator_plugin = lib.get_creator_by_name("CreateAnimation")
                creator_plugin = get_legacy_creator_by_name("CreateAnimation")
                if not creator_plugin:
                    raise ValueError("Creator plugin \"CreateAnimation\" was "
                                     "not found.")
@@ -118,7 +118,7 @@ class JsonLayoutLoader(plugin.AssetLoader):
        # Camera creation when loading a layout is not necessary for now,
        # but the code is worth keeping in case we need it in the future.
        # # Create the camera asset and the camera instance
        # creator_plugin = lib.get_creator_by_name("CreateCamera")
        # creator_plugin = get_legacy_creator_by_name("CreateCamera")
        # if not creator_plugin:
        #     raise ValueError("Creator plugin \"CreateCamera\" was "
        #                      "not found.")
@@ -6,12 +6,12 @@ from typing import Dict, List, Optional

import bpy

from openpype import lib
from openpype.pipeline import (
    legacy_create,
    get_representation_path,
    AVALON_CONTAINER_ID,
)
from openpype.pipeline.create import get_legacy_creator_by_name
from openpype.hosts.blender.api import (
    plugin,
    get_selection,

@@ -244,7 +244,7 @@ class BlendRigLoader(plugin.AssetLoader):
        objects = self._process(libpath, asset_group, group_name, action)

        if create_animation:
            creator_plugin = lib.get_creator_by_name("CreateAnimation")
            creator_plugin = get_legacy_creator_by_name("CreateAnimation")
            if not creator_plugin:
                raise ValueError("Creator plugin \"CreateAnimation\" was "
                                 "not found.")
@@ -1,6 +1,19 @@
import os
import bpy

import pyblish.api
from openpype.pipeline import legacy_io
from openpype.hosts.blender.api import workio


class SaveWorkfiledAction(pyblish.api.Action):
    """Save Workfile."""
    label = "Save Workfile"
    on = "failed"
    icon = "save"

    def process(self, context, plugin):
        bpy.ops.wm.avalon_workfiles()


class CollectBlenderCurrentFile(pyblish.api.ContextPlugin):

@@ -8,12 +21,52 @@ class CollectBlenderCurrentFile(pyblish.api.ContextPlugin):

    order = pyblish.api.CollectorOrder - 0.5
    label = "Blender Current File"
    hosts = ['blender']
    hosts = ["blender"]
    actions = [SaveWorkfiledAction]

    def process(self, context):
        """Inject the current working file"""
        current_file = bpy.data.filepath
        context.data['currentFile'] = current_file
        current_file = workio.current_file()

        assert current_file != '', "Current file is empty. " \
            "Save the file before continuing."
        context.data["currentFile"] = current_file

        assert current_file, (
            "Current file is empty. Save the file before continuing."
        )

        folder, file = os.path.split(current_file)
        filename, ext = os.path.splitext(file)

        task = legacy_io.Session["AVALON_TASK"]

        data = {}

        # create instance
        instance = context.create_instance(name=filename)
        subset = "workfile" + task.capitalize()

        data.update({
            "subset": subset,
            "asset": os.getenv("AVALON_ASSET", None),
            "label": subset,
            "publish": True,
            "family": "workfile",
            "families": ["workfile"],
            "setMembers": [current_file],
            "frameStart": bpy.context.scene.frame_start,
            "frameEnd": bpy.context.scene.frame_end,
        })

        data["representations"] = [{
            "name": ext.lstrip("."),
            "ext": ext.lstrip("."),
            "files": file,
            "stagingDir": folder,
        }]

        instance.data.update(data)

        self.log.info("Collected instance: {}".format(file))
        self.log.info("Scene path: {}".format(current_file))
        self.log.info("staging Dir: {}".format(folder))
        self.log.info("subset: {}".format(subset))
@@ -2,12 +2,12 @@ import os

import bpy

from openpype import api
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY


class ExtractABC(api.Extractor):
class ExtractABC(publish.Extractor):
    """Extract as ABC."""

    label = "Extract ABC"
@@ -2,10 +2,10 @@ import os

import bpy

import openpype.api
from openpype.pipeline import publish


class ExtractBlend(openpype.api.Extractor):
class ExtractBlend(publish.Extractor):
    """Extract a blend file."""

    label = "Extract Blend"
@@ -2,10 +2,10 @@ import os

import bpy

import openpype.api
from openpype.pipeline import publish


class ExtractBlendAnimation(openpype.api.Extractor):
class ExtractBlendAnimation(publish.Extractor):
    """Extract a blend file."""

    label = "Extract Blend"
@@ -2,11 +2,11 @@ import os

import bpy

from openpype import api
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin


class ExtractCamera(api.Extractor):
class ExtractCamera(publish.Extractor):
    """Extract the camera as FBX."""

    label = "Extract Camera"
@@ -2,12 +2,12 @@ import os

import bpy

from openpype import api
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY


class ExtractFBX(api.Extractor):
class ExtractFBX(publish.Extractor):
    """Extract as FBX."""

    label = "Extract FBX"
@@ -5,12 +5,12 @@ import bpy
import bpy_extras
import bpy_extras.anim_utils

from openpype import api
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY


class ExtractAnimationFBX(api.Extractor):
class ExtractAnimationFBX(publish.Extractor):
    """Extract as animation."""

    label = "Extract FBX"
@@ -6,12 +6,12 @@ import bpy_extras
import bpy_extras.anim_utils

from openpype.client import get_representation_by_name
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY
import openpype.api


class ExtractLayout(openpype.api.Extractor):
class ExtractLayout(publish.Extractor):
    """Extract a layout."""

    label = "Extract Layout"
@@ -1,9 +1,11 @@
from typing import List

import mathutils
import bpy

import pyblish.api
import openpype.api
import openpype.hosts.blender.api.action
from openpype.pipeline.publish import ValidateContentsOrder


class ValidateCameraZeroKeyframe(pyblish.api.InstancePlugin):

@@ -14,21 +16,18 @@ class ValidateCameraZeroKeyframe(pyblish.api.InstancePlugin):
    in Unreal and Blender.
    """

    order = openpype.api.ValidateContentsOrder
    order = ValidateContentsOrder
    hosts = ["blender"]
    families = ["camera"]
    category = "geometry"
    version = (0, 1, 0)
    label = "Zero Keyframe"
    actions = [openpype.hosts.blender.api.action.SelectInvalidAction]

    _identity = mathutils.Matrix()

    @classmethod
    def get_invalid(cls, instance) -> List:
    @staticmethod
    def get_invalid(instance) -> List:
        invalid = []
        for obj in [obj for obj in instance]:
            if obj.type == "CAMERA":
        for obj in instance:
            if isinstance(obj, bpy.types.Object) and obj.type == "CAMERA":
                if obj.animation_data and obj.animation_data.action:
                    action = obj.animation_data.action
                    frames_set = set()

@@ -45,4 +44,5 @@ class ValidateCameraZeroKeyframe(pyblish.api.InstancePlugin):
        invalid = self.get_invalid(instance)
        if invalid:
            raise RuntimeError(
                f"Object found in instance is not in Object Mode: {invalid}")
                f"Camera must have a keyframe at frame 0: {invalid}"
            )
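The elided middle of 'get_invalid' gathers keyframed frames from the camera's action; a rough sketch of that kind of check is below. The helper is hypothetical, not the plugin's exact code:

# Hypothetical helper illustrating the check the validator performs on
# a camera's action; not the plugin's exact implementation.
def has_keyframe_at_frame_zero(action) -> bool:
    frames = set()
    for fcurve in action.fcurves:
        for keyframe_point in fcurve.keyframe_points:
            # keyframe_point.co is (frame, value)
            frames.add(keyframe_point.co[0])
    return 0.0 in frames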
@@ -3,13 +3,14 @@ from typing import List
import bpy

import pyblish.api
import openpype.api
import openpype.hosts.blender.api.action


class ValidateMeshHasUvs(pyblish.api.InstancePlugin):
    """Validate that the current mesh has UVs."""

    order = pyblish.api.ValidatorOrder
    order = openpype.api.ValidateContentsOrder
    hosts = ["blender"]
    families = ["model"]
    category = "geometry"

@@ -25,7 +26,10 @@ class ValidateMeshHasUvs(pyblish.api.InstancePlugin):
        for uv_layer in obj.data.uv_layers:
            for polygon in obj.data.polygons:
                for loop_index in polygon.loop_indices:
                    if not uv_layer.data[loop_index].uv:
                    if (
                        loop_index >= len(uv_layer.data)
                        or not uv_layer.data[loop_index].uv
                    ):
                        return False

        return True

@@ -33,20 +37,20 @@ class ValidateMeshHasUvs(pyblish.api.InstancePlugin):
    @classmethod
    def get_invalid(cls, instance) -> List:
        invalid = []
        # TODO (jasper): only check objects in the collection that will be published?
        for obj in [
                obj for obj in instance]:
            try:
                if obj.type == 'MESH':
                    # Make sure we are in object mode.
                    bpy.ops.object.mode_set(mode='OBJECT')
                    if not cls.has_uvs(obj):
                        invalid.append(obj)
            except:
                continue
        for obj in instance:
            if isinstance(obj, bpy.types.Object) and obj.type == 'MESH':
                if obj.mode != "OBJECT":
                    cls.log.warning(
                        f"Mesh object {obj.name} should be in 'OBJECT' mode"
                        " to be properly checked."
                    )
                if not cls.has_uvs(obj):
                    invalid.append(obj)
        return invalid

    def process(self, instance):
        invalid = self.get_invalid(instance)
        if invalid:
            raise RuntimeError(f"Meshes found in instance without valid UVs: {invalid}")
            raise RuntimeError(
                f"Meshes found in instance without valid UVs: {invalid}"
            )
@@ -3,28 +3,27 @@ from typing import List
import bpy

import pyblish.api
import openpype.api
import openpype.hosts.blender.api.action


class ValidateMeshNoNegativeScale(pyblish.api.Validator):
    """Ensure that meshes don't have a negative scale."""

    order = pyblish.api.ValidatorOrder
    order = openpype.api.ValidateContentsOrder
    hosts = ["blender"]
    families = ["model"]
    category = "geometry"
    label = "Mesh No Negative Scale"
    actions = [openpype.hosts.blender.api.action.SelectInvalidAction]

    @staticmethod
    def get_invalid(instance) -> List:
        invalid = []
        # TODO (jasper): only check objects in the collection that will be published?
        for obj in [
            obj for obj in bpy.data.objects if obj.type == 'MESH'
        ]:
            if any(v < 0 for v in obj.scale):
                invalid.append(obj)

        for obj in instance:
            if isinstance(obj, bpy.types.Object) and obj.type == 'MESH':
                if any(v < 0 for v in obj.scale):
                    invalid.append(obj)
        return invalid

    def process(self, instance):
|
|||
|
|
@@ -1,7 +1,11 @@
 from typing import List
+
+import bpy
 
 import pyblish.api
 import openpype.api
 import openpype.hosts.blender.api.action
+from openpype.pipeline.publish import ValidateContentsOrder
 
 
 class ValidateNoColonsInName(pyblish.api.InstancePlugin):

@@ -12,20 +16,20 @@ class ValidateNoColonsInName(pyblish.api.InstancePlugin):
 
     """
 
-    order = openpype.api.ValidateContentsOrder
+    order = ValidateContentsOrder
     hosts = ["blender"]
     families = ["model", "rig"]
     version = (0, 1, 0)
     label = "No Colons in names"
     actions = [openpype.hosts.blender.api.action.SelectInvalidAction]
 
-    @classmethod
-    def get_invalid(cls, instance) -> List:
+    @staticmethod
+    def get_invalid(instance) -> List:
         invalid = []
-        for obj in [obj for obj in instance]:
+        for obj in instance:
             if ':' in obj.name:
                 invalid.append(obj)
-            if obj.type == 'ARMATURE':
+            if isinstance(obj, bpy.types.Object) and obj.type == 'ARMATURE':
                 for bone in obj.data.bones:
                     if ':' in bone.name:
                         invalid.append(obj)

@@ -36,4 +40,5 @@ class ValidateNoColonsInName(pyblish.api.InstancePlugin):
         invalid = self.get_invalid(instance)
         if invalid:
             raise RuntimeError(
-                f"Objects found with colon in name: {invalid}")
+                f"Objects found with colon in name: {invalid}"
+            )

@@ -1,5 +1,7 @@
 from typing import List
 
+import bpy
+
 import pyblish.api
 import openpype.hosts.blender.api.action
 

@@ -10,26 +12,21 @@ class ValidateObjectIsInObjectMode(pyblish.api.InstancePlugin):
     order = pyblish.api.ValidatorOrder - 0.01
     hosts = ["blender"]
     families = ["model", "rig", "layout"]
     category = "geometry"
     label = "Validate Object Mode"
     actions = [openpype.hosts.blender.api.action.SelectInvalidAction]
     optional = False
 
-    @classmethod
-    def get_invalid(cls, instance) -> List:
+    @staticmethod
+    def get_invalid(instance) -> List:
         invalid = []
-        for obj in [obj for obj in instance]:
-            try:
-                if obj.type == 'MESH' or obj.type == 'ARMATURE':
-                    # Check if the object is in object mode.
-                    if not obj.mode == 'OBJECT':
-                        invalid.append(obj)
-            except Exception:
-                continue
+        for obj in instance:
+            if isinstance(obj, bpy.types.Object) and obj.mode != "OBJECT":
+                invalid.append(obj)
         return invalid
 
     def process(self, instance):
         invalid = self.get_invalid(instance)
         if invalid:
             raise RuntimeError(
-                f"Object found in instance is not in Object Mode: {invalid}")
+                f"Object found in instance is not in Object Mode: {invalid}"
+            )

@@ -1,9 +1,12 @@
 from typing import List
 
 import mathutils
+import bpy
 
 import pyblish.api
 import openpype.api
 import openpype.hosts.blender.api.action
+from openpype.pipeline.publish import ValidateContentsOrder
+
 
 class ValidateTransformZero(pyblish.api.InstancePlugin):

@@ -15,10 +18,9 @@ class ValidateTransformZero(pyblish.api.InstancePlugin):
 
     """
 
-    order = openpype.api.ValidateContentsOrder
+    order = ValidateContentsOrder
     hosts = ["blender"]
     families = ["model"]
-    category = "geometry"
     version = (0, 1, 0)
     label = "Transform Zero"
     actions = [openpype.hosts.blender.api.action.SelectInvalidAction]

@@ -28,8 +30,11 @@ class ValidateTransformZero(pyblish.api.InstancePlugin):
     @classmethod
     def get_invalid(cls, instance) -> List:
         invalid = []
-        for obj in [obj for obj in instance]:
-            if obj.matrix_basis != cls._identity:
+        for obj in instance:
+            if (
+                isinstance(obj, bpy.types.Object)
+                and obj.matrix_basis != cls._identity
+            ):
                 invalid.append(obj)
         return invalid
 

@@ -37,4 +42,6 @@ class ValidateTransformZero(pyblish.api.InstancePlugin):
         invalid = self.get_invalid(instance)
         if invalid:
             raise RuntimeError(
-                f"Object found in instance is not in Object Mode: {invalid}")
+                "Object found in instance has not"
+                f" transform to zero: {invalid}"
+            )

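All of the Blender validators above share one shape: a `get_invalid` collector that inspects only real `bpy.types.Object` members of the instance (instead of wrapping the loop in a bare try/except), and a `process` that raises when the collector returns anything. Below is a minimal, host-agnostic sketch of that pattern; the plugin name and the "empty name" rule are hypothetical stand-ins for the real per-host checks, and only `pyblish` itself is required to run it.

```python
import pyblish.api


class ValidateNamesNotEmpty(pyblish.api.InstancePlugin):
    """Sketch of the collector/raise pattern used by the validators above."""

    order = pyblish.api.ValidatorOrder
    label = "Names Not Empty (sketch)"

    @staticmethod
    def get_invalid(instance):
        # Mirror the hunks above: skip anything that is not the expected
        # type rather than swallowing errors with try/except.
        return [item for item in instance if not getattr(item, "name", "")]

    def process(self, instance):
        invalid = self.get_invalid(instance)
        if invalid:
            raise RuntimeError(f"Invalid members found: {invalid}")
```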
@@ -14,7 +14,7 @@ from openpype.tools.utils import host_tools
 from openpype.pipeline import install_openpype_plugins
 
 
-log = Logger().get_logger("Celaction_cli_publisher")
+log = Logger.get_logger("Celaction_cli_publisher")
 
 publish_host = "celaction"
 

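This hunk, and several later ones, switch logger construction from instantiating `Logger` to calling `get_logger` on the class itself, with the import moving from `openpype.api` to `openpype.lib`. A two-line sketch of the new call style, assuming (as these hunks do) that OpenPype is importable and `openpype.lib.Logger.get_logger` is usable as a classmethod:

```python
from openpype.lib import Logger

log = Logger.get_logger("Celaction_cli_publisher")  # no Logger() instance needed
log.info("publisher starting")
```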
@@ -1,22 +1,10 @@
-import os
-
-HOST_DIR = os.path.dirname(
-    os.path.abspath(__file__)
-)
-
-
-def add_implementation_envs(env, _app):
-    # Add requirements to DL_PYTHON_HOOK_PATH
-    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
-
-    env["DL_PYTHON_HOOK_PATH"] = os.path.join(
-        pype_root, "openpype", "hosts", "flame", "startup")
-    env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)
-
-    # Set default values if are not already set via settings
-    defaults = {
-        "LOGLEVEL": "DEBUG"
-    }
-    for key, value in defaults.items():
-        if not env.get(key):
-            env[key] = value
+from .addon import (
+    HOST_DIR,
+    FlameAddon,
+)
+
+
+__all__ = (
+    "HOST_DIR",
+    "FlameAddon",
+)

openpype/hosts/flame/addon.py (new file, 36 lines)

@@ -0,0 +1,36 @@
+import os
+from openpype.modules import OpenPypeModule
+from openpype.modules.interfaces import IHostAddon
+
+HOST_DIR = os.path.dirname(os.path.abspath(__file__))
+
+
+class FlameAddon(OpenPypeModule, IHostAddon):
+    name = "flame"
+    host_name = "flame"
+
+    def initialize(self, module_settings):
+        self.enabled = True
+
+    def add_implementation_envs(self, env, _app):
+        # Add requirements to DL_PYTHON_HOOK_PATH
+        env["DL_PYTHON_HOOK_PATH"] = os.path.join(HOST_DIR, "startup")
+        env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)
+
+        # Set default values if are not already set via settings
+        defaults = {
+            "LOGLEVEL": "DEBUG"
+        }
+        for key, value in defaults.items():
+            if not env.get(key):
+                env[key] = value
+
+    def get_launch_hook_paths(self, app):
+        if app.host_name != self.host_name:
+            return []
+        return [
+            os.path.join(HOST_DIR, "hooks")
+        ]
+
+    def get_workfile_extensions(self):
+        return [".otoc"]

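The addon's `add_implementation_envs` mutates a plain mapping of environment variables in place before the host launches. A small self-contained sketch of that contract, using an ordinary dict in place of the real launch environment (the directory path is hypothetical):

```python
import os

env = {"QT_AUTO_SCREEN_SCALE_FACTOR": "1"}       # stand-in launch env
host_dir = "/opt/openpype/openpype/hosts/flame"  # hypothetical HOST_DIR

env["DL_PYTHON_HOOK_PATH"] = os.path.join(host_dir, "startup")
env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)

# Defaults apply only when the key is missing or empty, exactly as above.
for key, value in {"LOGLEVEL": "DEBUG"}.items():
    if not env.get(key):
        env[key] = value

print(env)  # DL_PYTHON_HOOK_PATH set, Qt scale factor removed, LOGLEVEL defaulted
```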
@@ -1,9 +1,9 @@
 import pyblish.api
 
-import openpype.lib as oplib
-from openpype.pipeline import legacy_io
 import openpype.hosts.flame.api as opfapi
 from openpype.hosts.flame.otio import flame_export
+from openpype.pipeline import legacy_io
+from openpype.pipeline.create import get_subset_name
 
 
 class CollecTimelineOTIO(pyblish.api.ContextPlugin):

@@ -24,11 +24,14 @@ class CollecTimelineOTIO(pyblish.api.ContextPlugin):
         sequence = opfapi.get_current_sequence(opfapi.CTX.selection)
 
         # create subset name
-        subset_name = oplib.get_subset_name_with_asset_doc(
+        subset_name = get_subset_name(
             family,
             variant,
             task_name,
             asset_doc,
+            context.data["projectName"],
+            context.data["hostName"],
+            project_settings=context.data["project_settings"]
         )
 
         # adding otio timeline to context

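Both this plugin and the Harmony collector further below move from `get_subset_name_with_asset_doc` to `openpype.pipeline.create.get_subset_name`, passing the project name, host name, and project settings explicitly. A hedged sketch of the new call, written as a helper so every value comes from a publish context rather than invented constants (the argument order is taken from the hunks above; treat it as an assumption, not a definitive signature):

```python
from openpype.pipeline.create import get_subset_name


def make_workfile_subset_name(context):
    """Sketch: build a workfile subset name from a publish context."""
    return get_subset_name(
        "workfile",                                   # family
        "",                                           # variant
        context.data["anatomyData"]["task"]["name"],  # task name
        context.data["assetEntity"],                  # asset document
        context.data["projectName"],
        context.data["hostName"],
        project_settings=context.data["project_settings"],
    )
```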
@@ -8,7 +8,7 @@ import contextlib
 
 import pyblish.api
 
-from openpype.api import Logger
+from openpype.lib import Logger
 from openpype.pipeline import (
     register_loader_plugin_path,
     register_creator_plugin_path,

@@ -20,7 +20,7 @@ from openpype.pipeline import (
 )
 import openpype.hosts.fusion
 
-log = Logger().get_logger(__name__)
+log = Logger.get_logger(__name__)
 
 HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.fusion.__file__))
 PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")

@@ -17,9 +17,9 @@ class FusionIncrementCurrentFile(pyblish.api.ContextPlugin):
     def process(self, context):
 
         from openpype.lib import version_up
-        from openpype.action import get_errored_plugins_from_data
+        from openpype.pipeline.publish import get_errored_plugins_from_context
 
-        errored_plugins = get_errored_plugins_from_data(context)
+        errored_plugins = get_errored_plugins_from_context(context)
         if any(plugin.__name__ == "FusionSubmitDeadline"
                for plugin in errored_plugins):
             raise RuntimeError("Skipping incrementing current file because "

@@ -1,6 +1,6 @@
 import pyblish.api
 
-from openpype import action
+from openpype.pipeline.publish import RepairAction
 
 
 class ValidateBackgroundDepth(pyblish.api.InstancePlugin):

@@ -8,7 +8,7 @@ class ValidateBackgroundDepth(pyblish.api.InstancePlugin):
 
     order = pyblish.api.ValidatorOrder
     label = "Validate Background Depth 32 bit"
-    actions = [action.RepairAction]
+    actions = [RepairAction]
     hosts = ["fusion"]
     families = ["render"]
     optional = True

@@ -1,6 +1,6 @@
 import pyblish.api
 
-from openpype import action
+from openpype.pipeline.publish import RepairAction
 
 
 class ValidateCreateFolderChecked(pyblish.api.InstancePlugin):

@@ -11,7 +11,7 @@ class ValidateCreateFolderChecked(pyblish.api.InstancePlugin):
     """
 
     order = pyblish.api.ValidatorOrder
-    actions = [action.RepairAction]
+    actions = [RepairAction]
     label = "Validate Create Folder Checked"
     families = ["render"]
     hosts = ["fusion"]

@@ -1,14 +1,12 @@
 import os
 import sys
 
-from openpype.api import Logger
+from openpype.lib import Logger
 from openpype.pipeline import (
     install_host,
     registered_host,
 )
 
-log = Logger().get_logger(__name__)
-
 
 def main(env):
     from openpype.hosts.fusion import api

@@ -17,6 +15,7 @@ def main(env):
     # activate resolve from pype
     install_host(api)
 
+    log = Logger.get_logger(__name__)
     log.info(f"Registered host: {registered_host()}")
 
     menu.launch_openpype_menu()

@@ -1,11 +1,10 @@
-import os
-
-
-def add_implementation_envs(env, _app):
-    """Modify environments to contain all required for implementation."""
-    openharmony_path = os.path.join(
-        os.environ["OPENPYPE_REPOS_ROOT"], "openpype", "hosts",
-        "harmony", "vendor", "OpenHarmony"
-    )
-    # TODO check if is already set? What to do if is already set?
-    env["LIB_OPENHARMONY_PATH"] = openharmony_path
+from .addon import (
+    HARMONY_HOST_DIR,
+    HarmonyAddon,
+)
+
+
+__all__ = (
+    "HARMONY_HOST_DIR",
+    "HarmonyAddon",
+)

openpype/hosts/harmony/addon.py (new file, 24 lines)

@@ -0,0 +1,24 @@
+import os
+from openpype.modules import OpenPypeModule
+from openpype.modules.interfaces import IHostAddon
+
+HARMONY_HOST_DIR = os.path.dirname(os.path.abspath(__file__))
+
+
+class HarmonyAddon(OpenPypeModule, IHostAddon):
+    name = "harmony"
+    host_name = "harmony"
+
+    def initialize(self, module_settings):
+        self.enabled = True
+
+    def add_implementation_envs(self, env, _app):
+        """Modify environments to contain all required for implementation."""
+        openharmony_path = os.path.join(
+            HARMONY_HOST_DIR, "vendor", "OpenHarmony"
+        )
+        # TODO check if is already set? What to do if is already set?
+        env["LIB_OPENHARMONY_PATH"] = openharmony_path
+
+    def get_workfile_extensions(self):
+        return [".zip"]

@@ -14,14 +14,14 @@ from openpype.pipeline import (
 )
 from openpype.pipeline.load import get_outdated_containers
 from openpype.pipeline.context_tools import get_current_project_asset
-import openpype.hosts.harmony
-
+from openpype.hosts.harmony import HARMONY_HOST_DIR
 import openpype.hosts.harmony.api as harmony
+
 
 log = logging.getLogger("openpype.hosts.harmony")
 
-HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.harmony.__file__))
-PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
+PLUGINS_DIR = os.path.join(HARMONY_HOST_DIR, "plugins")
 PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
 LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
 CREATE_PATH = os.path.join(PLUGINS_DIR, "create")

@@ -2,8 +2,6 @@
 import os
 import shutil
 
-from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
-
 from .lib import (
     ProcessContext,
     get_local_harmony_path,

@@ -16,7 +14,7 @@ save_disabled = False
 
 
 def file_extensions():
-    return HOST_WORKFILE_EXTENSIONS["harmony"]
+    return [".zip"]
 
 
 def has_unsaved_changes():

@@ -1,9 +1,9 @@
 # -*- coding: utf-8 -*-
 """Collect current workfile from Harmony."""
-import pyblish.api
 import os
+import pyblish.api
 
-from openpype.lib import get_subset_name_with_asset_doc
+from openpype.pipeline.create import get_subset_name
 
 
 class CollectWorkfile(pyblish.api.ContextPlugin):

@@ -17,13 +17,14 @@ class CollectWorkfile(pyblish.api.ContextPlugin):
         """Plugin entry point."""
         family = "workfile"
         basename = os.path.basename(context.data["currentFile"])
-        subset = get_subset_name_with_asset_doc(
+        subset = get_subset_name(
             family,
             "",
             context.data["anatomyData"]["task"]["name"],
             context.data["assetEntity"],
             context.data["anatomyData"]["project"]["name"],
-            host_name=context.data["hostName"]
+            host_name=context.data["hostName"],
+            project_settings=context.data["project_settings"]
         )
 
         # Create instance

@@ -1,7 +1,7 @@
 import os
 
 import pyblish.api
-from openpype.action import get_errored_plugins_from_data
+from openpype.pipeline.publish import get_errored_plugins_from_context
 from openpype.lib import version_up
 import openpype.hosts.harmony.api as harmony
 

@@ -19,7 +19,7 @@ class IncrementWorkfile(pyblish.api.InstancePlugin):
     optional = True
 
     def process(self, instance):
-        errored_plugins = get_errored_plugins_from_data(instance.context)
+        errored_plugins = get_errored_plugins_from_context(instance.context)
        if errored_plugins:
             raise RuntimeError(
                 "Skipping incrementing current file because publishing failed."

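The Fusion and Harmony increment plugins now import the errored-plugin lookup from `openpype.pipeline.publish` instead of `openpype.action`. The guard pattern itself is unchanged; a compact sketch, assuming the two OpenPype imports shown in the hunks above:

```python
from openpype.lib import version_up
from openpype.pipeline.publish import get_errored_plugins_from_context


def increment_workfile(context, current_file):
    """Sketch: bump the workfile version only when publishing succeeded."""
    if get_errored_plugins_from_context(context):
        raise RuntimeError(
            "Skipping incrementing current file because publishing failed."
        )
    return version_up(current_file)
```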
@@ -1,9 +1,12 @@
 import os
 
 import pyblish.api
-import openpype.api
-from openpype.pipeline import PublishXmlValidationError
 
 import openpype.hosts.harmony.api as harmony
+from openpype.pipeline.publish import (
+    ValidateContentsOrder,
+    PublishXmlValidationError,
+)
 
 
 class ValidateInstanceRepair(pyblish.api.Action):

@@ -37,7 +40,7 @@ class ValidateInstance(pyblish.api.InstancePlugin):
     label = "Validate Instance"
     hosts = ["harmony"]
     actions = [ValidateInstanceRepair]
-    order = openpype.api.ValidateContentsOrder
+    order = ValidateContentsOrder
 
     def process(self, instance):
         instance_asset = instance.data["asset"]

@@ -1,41 +1,10 @@
-import os
-import platform
-
-
-def add_implementation_envs(env, _app):
-    # Add requirements to HIERO_PLUGIN_PATH
-    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
-    new_hiero_paths = [
-        os.path.join(pype_root, "openpype", "hosts", "hiero", "api", "startup")
-    ]
-    old_hiero_path = env.get("HIERO_PLUGIN_PATH") or ""
-    for path in old_hiero_path.split(os.pathsep):
-        if not path:
-            continue
-
-        norm_path = os.path.normpath(path)
-        if norm_path not in new_hiero_paths:
-            new_hiero_paths.append(norm_path)
-
-    env["HIERO_PLUGIN_PATH"] = os.pathsep.join(new_hiero_paths)
-    env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)
-
-    # Try to add QuickTime to PATH
-    quick_time_path = "C:/Program Files (x86)/QuickTime/QTSystem"
-    if platform.system() == "windows" and os.path.exists(quick_time_path):
-        path_value = env.get("PATH") or ""
-        path_paths = [
-            path
-            for path in path_value.split(os.pathsep)
-            if path
-        ]
-        path_paths.append(quick_time_path)
-        env["PATH"] = os.pathsep.join(path_paths)
-
-    # Set default values if are not already set via settings
-    defaults = {
-        "LOGLEVEL": "DEBUG"
-    }
-    for key, value in defaults.items():
-        if not env.get(key):
-            env[key] = value
+from .addon import (
+    HIERO_ROOT_DIR,
+    HieroAddon,
+)
+
+
+__all__ = (
+    "HIERO_ROOT_DIR",
+    "HieroAddon",
+)

openpype/hosts/hiero/addon.py (new file, 63 lines)

@@ -0,0 +1,63 @@
+import os
+import platform
+from openpype.modules import OpenPypeModule
+from openpype.modules.interfaces import IHostAddon
+
+HIERO_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
+
+
+class HieroAddon(OpenPypeModule, IHostAddon):
+    name = "hiero"
+    host_name = "hiero"
+
+    def initialize(self, module_settings):
+        self.enabled = True
+
+    def add_implementation_envs(self, env, _app):
+        # Add requirements to HIERO_PLUGIN_PATH
+        new_hiero_paths = [
+            os.path.join(HIERO_ROOT_DIR, "api", "startup")
+        ]
+        old_hiero_path = env.get("HIERO_PLUGIN_PATH") or ""
+        for path in old_hiero_path.split(os.pathsep):
+            if not path:
+                continue
+
+            norm_path = os.path.normpath(path)
+            if norm_path not in new_hiero_paths:
+                new_hiero_paths.append(norm_path)
+
+        env["HIERO_PLUGIN_PATH"] = os.pathsep.join(new_hiero_paths)
+        env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)
+
+        # Add vendor to PYTHONPATH
+        python_path = env["PYTHONPATH"]
+        python_path_parts = []
+        if python_path:
+            python_path_parts = python_path.split(os.pathsep)
+        vendor_path = os.path.join(HIERO_ROOT_DIR, "vendor")
+        python_path_parts.insert(0, vendor_path)
+        env["PYTHONPATH"] = os.pathsep.join(python_path_parts)
+
+        # Set default values if are not already set via settings
+        defaults = {
+            "LOGLEVEL": "DEBUG"
+        }
+        for key, value in defaults.items():
+            if not env.get(key):
+                env[key] = value
+
+        # Try to add QuickTime to PATH
+        quick_time_path = "C:/Program Files (x86)/QuickTime/QTSystem"
+        if platform.system() == "windows" and os.path.exists(quick_time_path):
+            path_value = env.get("PATH") or ""
+            path_paths = [
+                path
+                for path in path_value.split(os.pathsep)
+                if path
+            ]
+            path_paths.append(quick_time_path)
+            env["PATH"] = os.pathsep.join(path_paths)
+
+    def get_workfile_extensions(self):
+        return [".hrox"]

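HieroAddon prepends the host's `vendor` directory to `PYTHONPATH`, which is what makes the vendored `google.protobuf` package added further below importable inside Hiero. The same prepend logic, runnable against a plain dict standing in for the launch environment (note the addon indexes `env["PYTHONPATH"]` directly and so assumes the key exists; this sketch guards the lookup, and the paths are hypothetical):

```python
import os

env = {"PYTHONPATH": "/studio/site-packages"}              # stand-in launch env
vendor_path = "/opt/openpype/openpype/hosts/hiero/vendor"  # hypothetical path

python_path_parts = []
python_path = env.get("PYTHONPATH")
if python_path:
    python_path_parts = python_path.split(os.pathsep)
python_path_parts.insert(0, vendor_path)  # vendor dir wins import resolution
env["PYTHONPATH"] = os.pathsep.join(python_path_parts)

print(env["PYTHONPATH"])
```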
@@ -1,7 +1,6 @@
 import os
 import hiero.core.events
-from openpype.api import Logger
-from openpype.lib import register_event_callback
+from openpype.lib import Logger, register_event_callback
 from .lib import (
     sync_avalon_data_to_workfile,
     launch_workfiles_app,

@@ -11,7 +10,7 @@ from .lib import (
 from .tags import add_tags_to_workfile
 from .menu import update_menu_task_label
 
-log = Logger().get_logger(__name__)
+log = Logger.get_logger(__name__)
 
 
 def startupCompleted(event):

@@ -21,7 +21,7 @@ from openpype.client import (
 )
 from openpype.settings import get_anatomy_settings
 from openpype.pipeline import legacy_io, Anatomy
-from openpype.api import Logger
+from openpype.lib import Logger
 from . import tags
 
 try:

@@ -34,7 +34,7 @@ except ImportError:
 # from opentimelineio import opentime
 # from pprint import pformat
 
-log = Logger().get_logger(__name__)
+log = Logger.get_logger(__name__)
 
 self = sys.modules[__name__]
 self._has_been_setup = False

@@ -6,7 +6,7 @@ import contextlib
 from collections import OrderedDict
 
 from pyblish import api as pyblish
-from openpype.api import Logger
+from openpype.lib import Logger
 from openpype.pipeline import (
     schema,
     register_creator_plugin_path,

@@ -18,7 +18,7 @@ from openpype.pipeline import (
 )
 from openpype.tools.utils import host_tools
 from . import lib, menu, events
 
-log = Logger().get_logger(__name__)
+log = Logger.get_logger(__name__)
 
 # plugin paths
 API_DIR = os.path.dirname(os.path.abspath(__file__))

@@ -9,11 +9,12 @@ from Qt import QtWidgets, QtCore
 import qargparse
 
-import openpype.api as openpype
+from openpype.lib import Logger
 from openpype.pipeline import LoaderPlugin, LegacyCreator
 from openpype.pipeline.context_tools import get_current_project_asset
 from . import lib
 
-log = openpype.Logger().get_logger(__name__)
+log = Logger.get_logger(__name__)
 
 
 def load_stylesheet():

@@ -2,13 +2,12 @@ import os
 import hiero
 
 from openpype.api import Logger
-from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
 
 log = Logger.get_logger(__name__)
 
 
 def file_extensions():
-    return HOST_WORKFILE_EXTENSIONS["hiero"]
+    return [".hrox"]
 
 
 def has_unsaved_changes():

@@ -318,10 +318,9 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
 
     @staticmethod
     def create_otio_time_range_from_timeline_item_data(track_item):
-        speed = track_item.playbackSpeed()
         timeline = phiero.get_current_sequence()
         frame_start = int(track_item.timelineIn())
-        frame_duration = int((track_item.duration() - 1) / speed)
+        frame_duration = int(track_item.duration())
         fps = timeline.framerate().toFloat()
 
         return hiero_export.create_otio_time_range(

openpype/hosts/hiero/vendor/google/protobuf/__init__.py (vendored, new file, 33 lines)

@@ -0,0 +1,33 @@
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# https://developers.google.com/protocol-buffers/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+# Copyright 2007 Google Inc. All Rights Reserved.
+
+__version__ = '3.20.1'

openpype/hosts/hiero/vendor/google/protobuf/any_pb2.py (vendored, new file, 26 lines)

@@ -0,0 +1,26 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler.  DO NOT EDIT!
+# source: google/protobuf/any.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x19google/protobuf/any.proto\x12\x0fgoogle.protobuf\"&\n\x03\x41ny\x12\x10\n\x08type_url\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\x0c\x42v\n\x13\x63om.google.protobufB\x08\x41nyProtoP\x01Z,google.golang.org/protobuf/types/known/anypb\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.any_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\010AnyProtoP\001Z,google.golang.org/protobuf/types/known/anypb\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+  _ANY._serialized_start=46
+  _ANY._serialized_end=84
+# @@protoc_insertion_point(module_scope)

openpype/hosts/hiero/vendor/google/protobuf/api_pb2.py (vendored, new file, 32 lines)

@@ -0,0 +1,32 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler.  DO NOT EDIT!
+# source: google/protobuf/api.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+from google.protobuf import source_context_pb2 as google_dot_protobuf_dot_source__context__pb2
+from google.protobuf import type_pb2 as google_dot_protobuf_dot_type__pb2
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x19google/protobuf/api.proto\x12\x0fgoogle.protobuf\x1a$google/protobuf/source_context.proto\x1a\x1agoogle/protobuf/type.proto\"\x81\x02\n\x03\x41pi\x12\x0c\n\x04name\x18\x01 \x01(\t\x12(\n\x07methods\x18\x02 \x03(\x0b\x32\x17.google.protobuf.Method\x12(\n\x07options\x18\x03 \x03(\x0b\x32\x17.google.protobuf.Option\x12\x0f\n\x07version\x18\x04 \x01(\t\x12\x36\n\x0esource_context\x18\x05 \x01(\x0b\x32\x1e.google.protobuf.SourceContext\x12&\n\x06mixins\x18\x06 \x03(\x0b\x32\x16.google.protobuf.Mixin\x12\'\n\x06syntax\x18\x07 \x01(\x0e\x32\x17.google.protobuf.Syntax\"\xd5\x01\n\x06Method\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x18\n\x10request_type_url\x18\x02 \x01(\t\x12\x19\n\x11request_streaming\x18\x03 \x01(\x08\x12\x19\n\x11response_type_url\x18\x04 \x01(\t\x12\x1a\n\x12response_streaming\x18\x05 \x01(\x08\x12(\n\x07options\x18\x06 \x03(\x0b\x32\x17.google.protobuf.Option\x12\'\n\x06syntax\x18\x07 \x01(\x0e\x32\x17.google.protobuf.Syntax\"#\n\x05Mixin\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x0c\n\x04root\x18\x02 \x01(\tBv\n\x13\x63om.google.protobufB\x08\x41piProtoP\x01Z,google.golang.org/protobuf/types/known/apipb\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.api_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\010ApiProtoP\001Z,google.golang.org/protobuf/types/known/apipb\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+  _API._serialized_start=113
+  _API._serialized_end=370
+  _METHOD._serialized_start=373
+  _METHOD._serialized_end=586
+  _MIXIN._serialized_start=588
+  _MIXIN._serialized_end=623
+# @@protoc_insertion_point(module_scope)

openpype/hosts/hiero/vendor/google/protobuf/compiler/plugin_pb2.py (vendored, new file, 35 lines)

@@ -0,0 +1,35 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler.  DO NOT EDIT!
+# source: google/protobuf/compiler/plugin.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+from google.protobuf import descriptor_pb2 as google_dot_protobuf_dot_descriptor__pb2
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n%google/protobuf/compiler/plugin.proto\x12\x18google.protobuf.compiler\x1a google/protobuf/descriptor.proto\"F\n\x07Version\x12\r\n\x05major\x18\x01 \x01(\x05\x12\r\n\x05minor\x18\x02 \x01(\x05\x12\r\n\x05patch\x18\x03 \x01(\x05\x12\x0e\n\x06suffix\x18\x04 \x01(\t\"\xba\x01\n\x14\x43odeGeneratorRequest\x12\x18\n\x10\x66ile_to_generate\x18\x01 \x03(\t\x12\x11\n\tparameter\x18\x02 \x01(\t\x12\x38\n\nproto_file\x18\x0f \x03(\x0b\x32$.google.protobuf.FileDescriptorProto\x12;\n\x10\x63ompiler_version\x18\x03 \x01(\x0b\x32!.google.protobuf.compiler.Version\"\xc1\x02\n\x15\x43odeGeneratorResponse\x12\r\n\x05\x65rror\x18\x01 \x01(\t\x12\x1a\n\x12supported_features\x18\x02 \x01(\x04\x12\x42\n\x04\x66ile\x18\x0f \x03(\x0b\x32\x34.google.protobuf.compiler.CodeGeneratorResponse.File\x1a\x7f\n\x04\x46ile\x12\x0c\n\x04name\x18\x01 \x01(\t\x12\x17\n\x0finsertion_point\x18\x02 \x01(\t\x12\x0f\n\x07\x63ontent\x18\x0f \x01(\t\x12?\n\x13generated_code_info\x18\x10 \x01(\x0b\x32\".google.protobuf.GeneratedCodeInfo\"8\n\x07\x46\x65\x61ture\x12\x10\n\x0c\x46\x45\x41TURE_NONE\x10\x00\x12\x1b\n\x17\x46\x45\x41TURE_PROTO3_OPTIONAL\x10\x01\x42W\n\x1c\x63om.google.protobuf.compilerB\x0cPluginProtosZ)google.golang.org/protobuf/types/pluginpb')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.compiler.plugin_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\034com.google.protobuf.compilerB\014PluginProtosZ)google.golang.org/protobuf/types/pluginpb'
+  _VERSION._serialized_start=101
+  _VERSION._serialized_end=171
+  _CODEGENERATORREQUEST._serialized_start=174
+  _CODEGENERATORREQUEST._serialized_end=360
+  _CODEGENERATORRESPONSE._serialized_start=363
+  _CODEGENERATORRESPONSE._serialized_end=684
+  _CODEGENERATORRESPONSE_FILE._serialized_start=499
+  _CODEGENERATORRESPONSE_FILE._serialized_end=626
+  _CODEGENERATORRESPONSE_FEATURE._serialized_start=628
+  _CODEGENERATORRESPONSE_FEATURE._serialized_end=684
+# @@protoc_insertion_point(module_scope)

openpype/hosts/hiero/vendor/google/protobuf/descriptor.py (vendored, new file, 1224 lines; diff suppressed because it is too large)

openpype/hosts/hiero/vendor/google/protobuf/descriptor_database.py (vendored, new file, 177 lines)

@@ -0,0 +1,177 @@
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# https://developers.google.com/protocol-buffers/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""Provides a container for DescriptorProtos."""
+
+__author__ = 'matthewtoia@google.com (Matt Toia)'
+
+import warnings
+
+
+class Error(Exception):
+  pass
+
+
+class DescriptorDatabaseConflictingDefinitionError(Error):
+  """Raised when a proto is added with the same name & different descriptor."""
+
+
+class DescriptorDatabase(object):
+  """A container accepting FileDescriptorProtos and maps DescriptorProtos."""
+
+  def __init__(self):
+    self._file_desc_protos_by_file = {}
+    self._file_desc_protos_by_symbol = {}
+
+  def Add(self, file_desc_proto):
+    """Adds the FileDescriptorProto and its types to this database.
+
+    Args:
+      file_desc_proto: The FileDescriptorProto to add.
+    Raises:
+      DescriptorDatabaseConflictingDefinitionError: if an attempt is made to
+        add a proto with the same name but different definition than an
+        existing proto in the database.
+    """
+    proto_name = file_desc_proto.name
+    if proto_name not in self._file_desc_protos_by_file:
+      self._file_desc_protos_by_file[proto_name] = file_desc_proto
+    elif self._file_desc_protos_by_file[proto_name] != file_desc_proto:
+      raise DescriptorDatabaseConflictingDefinitionError(
+          '%s already added, but with different descriptor.' % proto_name)
+    else:
+      return
+
+    # Add all the top-level descriptors to the index.
+    package = file_desc_proto.package
+    for message in file_desc_proto.message_type:
+      for name in _ExtractSymbols(message, package):
+        self._AddSymbol(name, file_desc_proto)
+    for enum in file_desc_proto.enum_type:
+      self._AddSymbol(('.'.join((package, enum.name))), file_desc_proto)
+      for enum_value in enum.value:
+        self._file_desc_protos_by_symbol[
+            '.'.join((package, enum_value.name))] = file_desc_proto
+    for extension in file_desc_proto.extension:
+      self._AddSymbol(('.'.join((package, extension.name))), file_desc_proto)
+    for service in file_desc_proto.service:
+      self._AddSymbol(('.'.join((package, service.name))), file_desc_proto)
+
+  def FindFileByName(self, name):
+    """Finds the file descriptor proto by file name.
+
+    Typically the file name is a relative path ending to a .proto file. The
+    proto with the given name will have to have been added to this database
+    using the Add method or else an error will be raised.
+
+    Args:
+      name: The file name to find.
+
+    Returns:
+      The file descriptor proto matching the name.
+
+    Raises:
+      KeyError if no file by the given name was added.
+    """
+
+    return self._file_desc_protos_by_file[name]
+
+  def FindFileContainingSymbol(self, symbol):
+    """Finds the file descriptor proto containing the specified symbol.
+
+    The symbol should be a fully qualified name including the file descriptor's
+    package and any containing messages. Some examples:
+
+    'some.package.name.Message'
+    'some.package.name.Message.NestedEnum'
+    'some.package.name.Message.some_field'
+
+    The file descriptor proto containing the specified symbol must be added to
+    this database using the Add method or else an error will be raised.
+
+    Args:
+      symbol: The fully qualified symbol name.
+
+    Returns:
+      The file descriptor proto containing the symbol.
+
+    Raises:
+      KeyError if no file contains the specified symbol.
+    """
+    try:
+      return self._file_desc_protos_by_symbol[symbol]
+    except KeyError:
+      # Fields, enum values, and nested extensions are not in
+      # _file_desc_protos_by_symbol. Try to find the top level
+      # descriptor. Non-existent nested symbol under a valid top level
+      # descriptor can also be found. The behavior is the same with
+      # protobuf C++.
+      top_level, _, _ = symbol.rpartition('.')
+      try:
+        return self._file_desc_protos_by_symbol[top_level]
+      except KeyError:
+        # Raise the original symbol as a KeyError for better diagnostics.
+        raise KeyError(symbol)
+
+  def FindFileContainingExtension(self, extendee_name, extension_number):
+    # TODO(jieluo): implement this API.
+    return None
+
+  def FindAllExtensionNumbers(self, extendee_name):
+    # TODO(jieluo): implement this API.
+    return []
+
+  def _AddSymbol(self, name, file_desc_proto):
+    if name in self._file_desc_protos_by_symbol:
+      warn_msg = ('Conflict register for file "' + file_desc_proto.name +
+                  '": ' + name +
+                  ' is already defined in file "' +
+                  self._file_desc_protos_by_symbol[name].name + '"')
+      warnings.warn(warn_msg, RuntimeWarning)
+    self._file_desc_protos_by_symbol[name] = file_desc_proto
+
+
+def _ExtractSymbols(desc_proto, package):
+  """Pulls out all the symbols from a descriptor proto.
+
+  Args:
+    desc_proto: The proto to extract symbols from.
+    package: The package containing the descriptor type.
+
+  Yields:
+    The fully qualified name found in the descriptor.
+  """
+  message_name = package + '.' + desc_proto.name if package else desc_proto.name
+  yield message_name
+  for nested_type in desc_proto.nested_type:
+    for symbol in _ExtractSymbols(nested_type, message_name):
+      yield symbol
+  for enum_type in desc_proto.enum_type:
+    yield '.'.join((message_name, enum_type.name))

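A short usage sketch for the vendored `DescriptorDatabase`. This is the standard protobuf API, so it runs against any protobuf installation once the vendor directory is on `PYTHONPATH` (the Hiero addon above arranges that inside Hiero); the proto file and package names here are made up.

```python
from google.protobuf import descriptor_pb2
from google.protobuf.descriptor_database import DescriptorDatabase

db = DescriptorDatabase()

file_proto = descriptor_pb2.FileDescriptorProto()
file_proto.name = "demo/example.proto"  # hypothetical proto file
file_proto.package = "demo"
file_proto.message_type.add().name = "Example"

db.Add(file_proto)

# Lookups work by file name or by fully qualified symbol.
assert db.FindFileByName("demo/example.proto") is file_proto
assert db.FindFileContainingSymbol("demo.Example") is file_proto
```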
openpype/hosts/hiero/vendor/google/protobuf/descriptor_pb2.py (vendored, new file, 1925 lines; diff suppressed because one or more lines are too long)

openpype/hosts/hiero/vendor/google/protobuf/descriptor_pool.py (vendored, new file, 1295 lines; diff suppressed because it is too large)

openpype/hosts/hiero/vendor/google/protobuf/duration_pb2.py (vendored, new file, 26 lines)

@@ -0,0 +1,26 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler.  DO NOT EDIT!
+# source: google/protobuf/duration.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x1egoogle/protobuf/duration.proto\x12\x0fgoogle.protobuf\"*\n\x08\x44uration\x12\x0f\n\x07seconds\x18\x01 \x01(\x03\x12\r\n\x05nanos\x18\x02 \x01(\x05\x42\x83\x01\n\x13\x63om.google.protobufB\rDurationProtoP\x01Z1google.golang.org/protobuf/types/known/durationpb\xf8\x01\x01\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.duration_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\rDurationProtoP\001Z1google.golang.org/protobuf/types/known/durationpb\370\001\001\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+  _DURATION._serialized_start=51
+  _DURATION._serialized_end=93
+# @@protoc_insertion_point(module_scope)

openpype/hosts/hiero/vendor/google/protobuf/empty_pb2.py (vendored, new file, 26 lines)

@@ -0,0 +1,26 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler.  DO NOT EDIT!
+# source: google/protobuf/empty.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x1bgoogle/protobuf/empty.proto\x12\x0fgoogle.protobuf\"\x07\n\x05\x45mptyB}\n\x13\x63om.google.protobufB\nEmptyProtoP\x01Z.google.golang.org/protobuf/types/known/emptypb\xf8\x01\x01\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.empty_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\nEmptyProtoP\001Z.google.golang.org/protobuf/types/known/emptypb\370\001\001\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+  _EMPTY._serialized_start=48
+  _EMPTY._serialized_end=55
+# @@protoc_insertion_point(module_scope)

openpype/hosts/hiero/vendor/google/protobuf/field_mask_pb2.py (vendored, new file, 26 lines)

@@ -0,0 +1,26 @@
+# -*- coding: utf-8 -*-
+# Generated by the protocol buffer compiler.  DO NOT EDIT!
+# source: google/protobuf/field_mask.proto
+"""Generated protocol buffer code."""
+from google.protobuf.internal import builder as _builder
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import descriptor_pool as _descriptor_pool
+from google.protobuf import symbol_database as _symbol_database
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+
+
+DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n google/protobuf/field_mask.proto\x12\x0fgoogle.protobuf\"\x1a\n\tFieldMask\x12\r\n\x05paths\x18\x01 \x03(\tB\x85\x01\n\x13\x63om.google.protobufB\x0e\x46ieldMaskProtoP\x01Z2google.golang.org/protobuf/types/known/fieldmaskpb\xf8\x01\x01\xa2\x02\x03GPB\xaa\x02\x1eGoogle.Protobuf.WellKnownTypesb\x06proto3')
+
+_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
+_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.field_mask_pb2', globals())
+if _descriptor._USE_C_DESCRIPTORS == False:
+
+  DESCRIPTOR._options = None
+  DESCRIPTOR._serialized_options = b'\n\023com.google.protobufB\016FieldMaskProtoP\001Z2google.golang.org/protobuf/types/known/fieldmaskpb\370\001\001\242\002\003GPB\252\002\036Google.Protobuf.WellKnownTypes'
+  _FIELDMASK._serialized_start=53
+  _FIELDMASK._serialized_end=79
+# @@protoc_insertion_point(module_scope)

openpype/hosts/hiero/vendor/google/protobuf/internal/__init__.py (vendored, new file, empty)

openpype/hosts/hiero/vendor/google/protobuf/internal/_parameterized.py (vendored, new file, 443 lines)

@@ -0,0 +1,443 @@
+#! /usr/bin/env python
+#
+# Protocol Buffers - Google's data interchange format
+# Copyright 2008 Google Inc.  All rights reserved.
+# https://developers.google.com/protocol-buffers/
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions are
+# met:
+#
+#     * Redistributions of source code must retain the above copyright
+# notice, this list of conditions and the following disclaimer.
+#     * Redistributions in binary form must reproduce the above
+# copyright notice, this list of conditions and the following disclaimer
+# in the documentation and/or other materials provided with the
+# distribution.
+#     * Neither the name of Google Inc. nor the names of its
+# contributors may be used to endorse or promote products derived from
+# this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+"""Adds support for parameterized tests to Python's unittest TestCase class.
+
+A parameterized test is a method in a test case that is invoked with different
+argument tuples.
+
+A simple example:
+
+  class AdditionExample(parameterized.TestCase):
+    @parameterized.parameters(
+       (1, 2, 3),
+       (4, 5, 9),
+       (1, 1, 3))
+    def testAddition(self, op1, op2, result):
+      self.assertEqual(result, op1 + op2)
+
+
+Each invocation is a separate test case and properly isolated just
+like a normal test method, with its own setUp/tearDown cycle. In the
+example above, there are three separate testcases, one of which will
+fail due to an assertion error (1 + 1 != 3).
+
+Parameters for individual test cases can be tuples (with positional parameters)
+or dictionaries (with named parameters):
+
+  class AdditionExample(parameterized.TestCase):
+    @parameterized.parameters(
+       {'op1': 1, 'op2': 2, 'result': 3},
+       {'op1': 4, 'op2': 5, 'result': 9},
+    )
+    def testAddition(self, op1, op2, result):
+      self.assertEqual(result, op1 + op2)
+
+If a parameterized test fails, the error message will show the
+original test name (which is modified internally) and the arguments
+for the specific invocation, which are part of the string returned by
+the shortDescription() method on test cases.
+
+The id method of the test, used internally by the unittest framework,
+is also modified to show the arguments. To make sure that test names
+stay the same across several invocations, object representations like
+
+  >>> class Foo(object):
+  ...  pass
+  >>> repr(Foo())
+  '<__main__.Foo object at 0x23d8610>'
+
+are turned into '<__main__.Foo>'. For even more descriptive names,
+especially in test logs, you can use the named_parameters decorator. In
+this case, only tuples are supported, and the first parameters has to
+be a string (or an object that returns an apt name when converted via
+str()):
+
+  class NamedExample(parameterized.TestCase):
+    @parameterized.named_parameters(
+       ('Normal', 'aa', 'aaa', True),
+       ('EmptyPrefix', '', 'abc', True),
+       ('BothEmpty', '', '', True))
+    def testStartsWith(self, prefix, string, result):
+      self.assertEqual(result, strings.startswith(prefix))
+
+Named tests also have the benefit that they can be run individually
+from the command line:
+
+  $ testmodule.py NamedExample.testStartsWithNormal
+  .
+  --------------------------------------------------------------------
+  Ran 1 test in 0.000s
+
+  OK
+
+Parameterized Classes
+=====================
+If invocation arguments are shared across test methods in a single
+TestCase class, instead of decorating all test methods
+individually, the class itself can be decorated:
+
+  @parameterized.parameters(
+    (1, 2, 3)
+    (4, 5, 9))
+  class ArithmeticTest(parameterized.TestCase):
+    def testAdd(self, arg1, arg2, result):
+      self.assertEqual(arg1 + arg2, result)
+
+    def testSubtract(self, arg2, arg2, result):
+      self.assertEqual(result - arg1, arg2)
+
+Inputs from Iterables
+=====================
+If parameters should be shared across several test cases, or are dynamically
+created from other sources, a single non-tuple iterable can be passed into
+the decorator. This iterable will be used to obtain the test cases:
+
+  class AdditionExample(parameterized.TestCase):
+    @parameterized.parameters(
+      c.op1, c.op2, c.result for c in testcases
+    )
+    def testAddition(self, op1, op2, result):
+      self.assertEqual(result, op1 + op2)
+
+
+Single-Argument Test Methods
+============================
+If a test method takes only one argument, the single argument does not need to
+be wrapped into a tuple:
+
+  class NegativeNumberExample(parameterized.TestCase):
+    @parameterized.parameters(
+       -1, -3, -4, -5
+    )
+    def testIsNegative(self, arg):
+      self.assertTrue(IsNegative(arg))
+"""
+
+__author__ = 'tmarek@google.com (Torsten Marek)'
+
+import functools
+import re
+import types
+import unittest
+import uuid
+
+try:
+  # Since python 3
+  import collections.abc as collections_abc
+except ImportError:
+  # Won't work after python 3.8
+  import collections as collections_abc
+
+ADDR_RE = re.compile(r'\<([a-zA-Z0-9_\-\.]+) object at 0x[a-fA-F0-9]+\>')
+_SEPARATOR = uuid.uuid1().hex
+_FIRST_ARG = object()
+_ARGUMENT_REPR = object()
+
+
+def _CleanRepr(obj):
+  return ADDR_RE.sub(r'<\1>', repr(obj))
+
+
+# Helper function formerly from the unittest module, removed from it in
+# Python 2.7.
+def _StrClass(cls):
+  return '%s.%s' % (cls.__module__, cls.__name__)
+
+
+def _NonStringIterable(obj):
+  return (isinstance(obj, collections_abc.Iterable) and
+          not isinstance(obj, str))
+
+
+def _FormatParameterList(testcase_params):
+  if isinstance(testcase_params, collections_abc.Mapping):
+    return ', '.join('%s=%s' % (argname, _CleanRepr(value))
+                     for argname, value in testcase_params.items())
+  elif _NonStringIterable(testcase_params):
+    return ', '.join(map(_CleanRepr, testcase_params))
+  else:
+    return _FormatParameterList((testcase_params,))
+
+
+class _ParameterizedTestIter(object):
+  """Callable and iterable class for producing new test cases."""
+
+  def __init__(self, test_method, testcases, naming_type):
+    """Returns concrete test functions for a test and a list of parameters.
+
+    The naming_type is used to determine the name of the concrete
+    functions as reported by the unittest framework. If naming_type is
+    _FIRST_ARG, the testcases must be tuples, and the first element must
+    have a string representation that is a valid Python identifier.
+
+    Args:
+      test_method: The decorated test method.
+      testcases: (list of tuple/dict) A list of parameter
+                 tuples/dicts for individual test invocations.
+      naming_type: The test naming type, either _NAMED or _ARGUMENT_REPR.
+    """
+    self._test_method = test_method
+    self.testcases = testcases
+    self._naming_type = naming_type
+
+  def __call__(self, *args, **kwargs):
+    raise RuntimeError('You appear to be running a parameterized test case '
+                       'without having inherited from parameterized.'
+                       'TestCase. This is bad because none of '
+                       'your test cases are actually being run.')
+
+  def __iter__(self):
+    test_method = self._test_method
+    naming_type = self._naming_type
+
+    def MakeBoundParamTest(testcase_params):
+      @functools.wraps(test_method)
+      def BoundParamTest(self):
+        if isinstance(testcase_params, collections_abc.Mapping):
+          test_method(self, **testcase_params)
+        elif _NonStringIterable(testcase_params):
+          test_method(self, *testcase_params)
+        else:
+          test_method(self, testcase_params)
+
+      if naming_type is _FIRST_ARG:
+        # Signal the metaclass that the name of the test function is unique
+        # and descriptive.
+        BoundParamTest.__x_use_name__ = True
+        BoundParamTest.__name__ += str(testcase_params[0])
+        testcase_params = testcase_params[1:]
+      elif naming_type is _ARGUMENT_REPR:
+        # __x_extra_id__ is used to pass naming information to the __new__
+        # method of TestGeneratorMetaclass.
+        # The metaclass will make sure to create a unique, but nondescriptive
+        # name for this test.
+        BoundParamTest.__x_extra_id__ = '(%s)' % (
+            _FormatParameterList(testcase_params),)
+      else:
+        raise RuntimeError('%s is not a valid naming type.' % (naming_type,))
+
+      BoundParamTest.__doc__ = '%s(%s)' % (
+          BoundParamTest.__name__, _FormatParameterList(testcase_params))
+      if test_method.__doc__:
+        BoundParamTest.__doc__ += '\n%s' % (test_method.__doc__,)
+      return BoundParamTest
+    return (MakeBoundParamTest(c) for c in self.testcases)
+
+
+def _IsSingletonList(testcases):
+  """True iff testcases contains only a single non-tuple element."""
+  return len(testcases) == 1 and not isinstance(testcases[0], tuple)
+
+
+def _ModifyClass(class_object, testcases, naming_type):
+  assert not getattr(class_object, '_id_suffix', None), (
+      'Cannot add parameters to %s,'
+      ' which already has parameterized methods.' % (class_object,))
+  class_object._id_suffix = id_suffix = {}
+  # We change the size of __dict__ while we iterate over it,
+  # which Python 3.x will complain about, so use copy().
+  for name, obj in class_object.__dict__.copy().items():
+    if (name.startswith(unittest.TestLoader.testMethodPrefix)
+        and isinstance(obj, types.FunctionType)):
+      delattr(class_object, name)
+      methods = {}
+      _UpdateClassDictForParamTestCase(
+          methods, id_suffix, name,
+          _ParameterizedTestIter(obj, testcases, naming_type))
+      for name, meth in methods.items():
+        setattr(class_object, name, meth)
+
+
+def _ParameterDecorator(naming_type, testcases):
+  """Implementation of the parameterization decorators.
+
+  Args:
+    naming_type: The naming type.
+    testcases: Testcase parameters.
+
+  Returns:
+    A function for modifying the decorated object.
+  """
+  def _Apply(obj):
+    if isinstance(obj, type):
+      _ModifyClass(
+          obj,
+          list(testcases) if not isinstance(testcases, collections_abc.Sequence)
+          else testcases,
+          naming_type)
+      return obj
+    else:
+      return _ParameterizedTestIter(obj, testcases, naming_type)
+
+  if _IsSingletonList(testcases):
+    assert _NonStringIterable(testcases[0]), (
+        'Single parameter argument must be a non-string iterable')
+    testcases = testcases[0]
+
+  return _Apply
+
+
+def parameters(*testcases):  # pylint: disable=invalid-name
+  """A decorator for creating parameterized tests.
+
+  See the module docstring for a usage example.
+  Args:
+    *testcases: Parameters for the decorated method, either a single
+                iterable, or a list of tuples/dicts/objects (for tests
+                with only one argument).
+
+  Returns:
+    A test generator to be handled by TestGeneratorMetaclass.
+  """
+  return _ParameterDecorator(_ARGUMENT_REPR, testcases)
|
||||
|
||||
|
||||
def named_parameters(*testcases): # pylint: disable=invalid-name
|
||||
"""A decorator for creating parameterized tests.
|
||||
|
||||
See the module docstring for a usage example. The first element of
|
||||
each parameter tuple should be a string and will be appended to the
|
||||
name of the test method.
|
||||
|
||||
Args:
|
||||
*testcases: Parameters for the decorated method, either a single
|
||||
iterable, or a list of tuples.
|
||||
|
||||
Returns:
|
||||
A test generator to be handled by TestGeneratorMetaclass.
|
||||
"""
|
||||
return _ParameterDecorator(_FIRST_ARG, testcases)
|
||||
|
||||
|
||||
class TestGeneratorMetaclass(type):
|
||||
"""Metaclass for test cases with test generators.
|
||||
|
||||
A test generator is an iterable in a testcase that produces callables. These
|
||||
callables must be single-argument methods. These methods are injected into
|
||||
the class namespace and the original iterable is removed. If the name of the
|
||||
iterable conforms to the test pattern, the injected methods will be picked
|
||||
up as tests by the unittest framework.
|
||||
|
||||
In general, it is supposed to be used in conjunction with the
|
||||
parameters decorator.
|
||||
"""
|
||||
|
||||
def __new__(mcs, class_name, bases, dct):
|
||||
dct['_id_suffix'] = id_suffix = {}
|
||||
for name, obj in dct.copy().items():
|
||||
if (name.startswith(unittest.TestLoader.testMethodPrefix) and
|
||||
_NonStringIterable(obj)):
|
||||
iterator = iter(obj)
|
||||
dct.pop(name)
|
||||
_UpdateClassDictForParamTestCase(dct, id_suffix, name, iterator)
|
||||
|
||||
return type.__new__(mcs, class_name, bases, dct)
|
||||
|
||||
|
||||
def _UpdateClassDictForParamTestCase(dct, id_suffix, name, iterator):
|
||||
"""Adds individual test cases to a dictionary.
|
||||
|
||||
Args:
|
||||
dct: The target dictionary.
|
||||
id_suffix: The dictionary for mapping names to test IDs.
|
||||
name: The original name of the test case.
|
||||
iterator: The iterator generating the individual test cases.
|
||||
"""
|
||||
for idx, func in enumerate(iterator):
|
||||
assert callable(func), 'Test generators must yield callables, got %r' % (
|
||||
func,)
|
||||
if getattr(func, '__x_use_name__', False):
|
||||
new_name = func.__name__
|
||||
else:
|
||||
new_name = '%s%s%d' % (name, _SEPARATOR, idx)
|
||||
assert new_name not in dct, (
|
||||
'Name of parameterized test case "%s" not unique' % (new_name,))
|
||||
dct[new_name] = func
|
||||
id_suffix[new_name] = getattr(func, '__x_extra_id__', '')
|
||||
|
||||
|
||||
class TestCase(unittest.TestCase, metaclass=TestGeneratorMetaclass):
|
||||
"""Base class for test cases using the parameters decorator."""
|
||||
|
||||
def _OriginalName(self):
|
||||
return self._testMethodName.split(_SEPARATOR)[0]
|
||||
|
||||
def __str__(self):
|
||||
return '%s (%s)' % (self._OriginalName(), _StrClass(self.__class__))
|
||||
|
||||
def id(self): # pylint: disable=invalid-name
|
||||
"""Returns the descriptive ID of the test.
|
||||
|
||||
This is used internally by the unittesting framework to get a name
|
||||
for the test to be used in reports.
|
||||
|
||||
Returns:
|
||||
The test id.
|
||||
"""
|
||||
return '%s.%s%s' % (_StrClass(self.__class__),
|
||||
self._OriginalName(),
|
||||
self._id_suffix.get(self._testMethodName, ''))
|
||||
|
||||
|
||||
def CoopTestCase(other_base_class):
|
||||
"""Returns a new base class with a cooperative metaclass base.
|
||||
|
||||
This enables the TestCase to be used in combination
|
||||
with other base classes that have custom metaclasses, such as
|
||||
mox.MoxTestBase.
|
||||
|
||||
Only works with metaclasses that do not override type.__new__.
|
||||
|
||||
Example:
|
||||
|
||||
import google3
|
||||
import mox
|
||||
|
||||
from google3.testing.pybase import parameterized
|
||||
|
||||
class ExampleTest(parameterized.CoopTestCase(mox.MoxTestBase)):
|
||||
...
|
||||
|
||||
Args:
|
||||
other_base_class: (class) A test case base class.
|
||||
|
||||
Returns:
|
||||
A new class object.
|
||||
"""
|
||||
metaclass = type(
|
||||
'CoopMetaclass',
|
||||
(other_base_class.__metaclass__,
|
||||
TestGeneratorMetaclass), {})
|
||||
return metaclass(
|
||||
'CoopTestCase',
|
||||
(other_base_class, TestCase), {})
|
||||
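A minimal usage sketch for the decorators above (illustrative only; it assumes
the vendored module is importable as `parameterized`):

  import unittest

  class AdditionExample(parameterized.TestCase):

    @parameterized.named_parameters(
        ('_zero', 0, 0, 0),
        ('_one', 1, 0, 1),
    )
    def testAddition(self, op1, op2, result):
      # With named_parameters the first tuple element ('_zero', '_one') is
      # appended to the method name, so each case is reported individually.
      self.assertEqual(result, op1 + op2)

  if __name__ == '__main__':
    unittest.main()
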
112 openpype/hosts/hiero/vendor/google/protobuf/internal/api_implementation.py vendored Normal file

@@ -0,0 +1,112 @@
# Protocol Buffers - Google's data interchange format
# Copyright 2008 Google Inc. All rights reserved.
# https://developers.google.com/protocol-buffers/
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
#     * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#     * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
#     * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

"""Determine which implementation of the protobuf API is used in this process.
"""

import os
import sys
import warnings

try:
  # pylint: disable=g-import-not-at-top
  from google.protobuf.internal import _api_implementation
  # The compile-time constants in the _api_implementation module can be used to
  # switch to a certain implementation of the Python API at build time.
  _api_version = _api_implementation.api_version
except ImportError:
  _api_version = -1  # Unspecified by compiler flags.

if _api_version == 1:
  raise ValueError('api_version=1 is no longer supported.')


_default_implementation_type = ('cpp' if _api_version > 0 else 'python')


# This environment variable can be used to switch to a certain implementation
# of the Python API, overriding the compile-time constants in the
# _api_implementation module. Right now only 'python' and 'cpp' are valid
# values. Any other value will be ignored.
_implementation_type = os.getenv('PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION',
                                 _default_implementation_type)

if _implementation_type != 'python':
  _implementation_type = 'cpp'

if 'PyPy' in sys.version and _implementation_type == 'cpp':
  warnings.warn('PyPy does not work yet with cpp protocol buffers. '
                'Falling back to the python implementation.')
  _implementation_type = 'python'


# Detect if serialization should be deterministic by default
try:
  # The presence of this module in a build allows the proto implementation to
  # be upgraded merely via build deps.
  #
  # NOTE: Merely importing this automatically enables deterministic proto
  # serialization for C++ code, but we still need to export it as a boolean so
  # that we can do the same for `_implementation_type == 'python'`.
  #
  # NOTE2: It is possible for C++ code to enable deterministic serialization
  # by default _without_ affecting Python code, if the C++ implementation is
  # not in use by this module.  That is intended behavior, so we don't
  # actually expose this boolean outside of this module.
  #
  # pylint: disable=g-import-not-at-top,unused-import
  from google.protobuf import enable_deterministic_proto_serialization
  _python_deterministic_proto_serialization = True
except ImportError:
  _python_deterministic_proto_serialization = False


# Usage of this function is discouraged. Clients shouldn't care which
# implementation of the API is in use. Note that there is no guarantee
# that differences between APIs will be maintained.
# Please don't use this function if possible.
def Type():
  return _implementation_type


def _SetType(implementation_type):
  """Never use! Only for protobuf benchmark."""
  global _implementation_type
  _implementation_type = implementation_type


# See comment on 'Type' above.
def Version():
  return 2


# For internal use only
def IsPythonDefaultSerializationDeterministic():
  return _python_deterministic_proto_serialization
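A short sketch of how this switch behaves at runtime (illustrative; the
environment variable must be set before `google.protobuf` is first imported,
because the choice is read once at module import time):

  import os
  os.environ['PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION'] = 'python'

  from google.protobuf.internal import api_implementation

  print(api_implementation.Type())     # 'python'
  print(api_implementation.Version())  # 2
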
130 openpype/hosts/hiero/vendor/google/protobuf/internal/builder.py vendored Normal file

@@ -0,0 +1,130 @@
# Protocol Buffers - Google's data interchange format
# Copyright 2008 Google Inc. All rights reserved.
# https://developers.google.com/protocol-buffers/
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
#     * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#     * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
#     * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

"""Builds descriptors, message classes and services for generated _pb2.py.

This module is only used by Python generated _pb2.py files. It builds
descriptors, message classes and services that users can directly use
in generated code.
"""

__author__ = 'jieluo@google.com (Jie Luo)'

from google.protobuf.internal import enum_type_wrapper
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database

_sym_db = _symbol_database.Default()


def BuildMessageAndEnumDescriptors(file_des, module):
  """Builds message and enum descriptors.

  Args:
    file_des: FileDescriptor of the .proto file
    module: Generated _pb2 module
  """

  def BuildNestedDescriptors(msg_des, prefix):
    for (name, nested_msg) in msg_des.nested_types_by_name.items():
      module_name = prefix + name.upper()
      module[module_name] = nested_msg
      BuildNestedDescriptors(nested_msg, module_name + '_')
    for enum_des in msg_des.enum_types:
      module[prefix + enum_des.name.upper()] = enum_des

  for (name, msg_des) in file_des.message_types_by_name.items():
    module_name = '_' + name.upper()
    module[module_name] = msg_des
    BuildNestedDescriptors(msg_des, module_name + '_')


def BuildTopDescriptorsAndMessages(file_des, module_name, module):
  """Builds top level descriptors and message classes.

  Args:
    file_des: FileDescriptor of the .proto file
    module_name: str, the name of generated _pb2 module
    module: Generated _pb2 module
  """

  def BuildMessage(msg_des):
    create_dict = {}
    for (name, nested_msg) in msg_des.nested_types_by_name.items():
      create_dict[name] = BuildMessage(nested_msg)
    create_dict['DESCRIPTOR'] = msg_des
    create_dict['__module__'] = module_name
    message_class = _reflection.GeneratedProtocolMessageType(
        msg_des.name, (_message.Message,), create_dict)
    _sym_db.RegisterMessage(message_class)
    return message_class

  # top level enums
  for (name, enum_des) in file_des.enum_types_by_name.items():
    module['_' + name.upper()] = enum_des
    module[name] = enum_type_wrapper.EnumTypeWrapper(enum_des)
    for enum_value in enum_des.values:
      module[enum_value.name] = enum_value.number

  # top level extensions
  for (name, extension_des) in file_des.extensions_by_name.items():
    module[name.upper() + '_FIELD_NUMBER'] = extension_des.number
    module[name] = extension_des

  # services
  for (name, service) in file_des.services_by_name.items():
    module['_' + name.upper()] = service

  # Build messages.
  for (name, msg_des) in file_des.message_types_by_name.items():
    module[name] = BuildMessage(msg_des)


def BuildServices(file_des, module_name, module):
  """Builds service classes and service stub classes.

  Args:
    file_des: FileDescriptor of the .proto file
    module_name: str, the name of generated _pb2 module
    module: Generated _pb2 module
  """
  # pylint: disable=g-import-not-at-top
  from google.protobuf import service as _service
  from google.protobuf import service_reflection
  # pylint: enable=g-import-not-at-top
  for (name, service) in file_des.services_by_name.items():
    module[name] = service_reflection.GeneratedServiceType(
        name, (_service.Service,),
        dict(DESCRIPTOR=service, __module__=module_name))
    stub_name = name + '_Stub'
    module[stub_name] = service_reflection.GeneratedServiceStubType(
        stub_name, (module[name],),
        dict(DESCRIPTOR=service, __module__=module_name))
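For context, this is roughly how protoc-generated _pb2 modules drive the
helpers above (a sketch patterned on generated code; `example_pb2` and its
DESCRIPTOR are placeholders):

  # Tail of a hypothetical generated example_pb2.py, after DESCRIPTOR has
  # been created from the serialized .proto definition:
  from google.protobuf.internal import builder as _builder

  _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
  _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'example_pb2', globals())
  # Only if the .proto declares services with generic services enabled:
  # _builder.BuildServices(DESCRIPTOR, 'example_pb2', globals())
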
710 openpype/hosts/hiero/vendor/google/protobuf/internal/containers.py vendored Normal file

@@ -0,0 +1,710 @@
# Protocol Buffers - Google's data interchange format
# Copyright 2008 Google Inc. All rights reserved.
# https://developers.google.com/protocol-buffers/
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
#     * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#     * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
#     * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

"""Contains container classes to represent different protocol buffer types.

This file defines container classes which represent categories of protocol
buffer field types which need extra maintenance. Currently these categories
are:

-   Repeated scalar fields - These are all repeated fields which aren't
    composite (e.g. they are of simple types like int32, string, etc).
-   Repeated composite fields - Repeated fields which are composite. This
    includes groups and nested messages.
"""

import collections.abc
import copy
import pickle
from typing import (
    Any,
    Iterable,
    Iterator,
    List,
    MutableMapping,
    MutableSequence,
    NoReturn,
    Optional,
    Sequence,
    TypeVar,
    Union,
    overload,
)


_T = TypeVar('_T')
_K = TypeVar('_K')
_V = TypeVar('_V')


class BaseContainer(Sequence[_T]):
  """Base container class."""

  # Minimizes memory usage and disallows assignment to other attributes.
  __slots__ = ['_message_listener', '_values']

  def __init__(self, message_listener: Any) -> None:
    """
    Args:
      message_listener: A MessageListener implementation.
        The RepeatedScalarFieldContainer will call this object's
        Modified() method when it is modified.
    """
    self._message_listener = message_listener
    self._values = []

  @overload
  def __getitem__(self, key: int) -> _T:
    ...

  @overload
  def __getitem__(self, key: slice) -> List[_T]:
    ...

  def __getitem__(self, key):
    """Retrieves item by the specified key."""
    return self._values[key]

  def __len__(self) -> int:
    """Returns the number of elements in the container."""
    return len(self._values)

  def __ne__(self, other: Any) -> bool:
    """Checks if another instance isn't equal to this one."""
    # The concrete classes should define __eq__.
    return not self == other

  __hash__ = None

  def __repr__(self) -> str:
    return repr(self._values)

  def sort(self, *args, **kwargs) -> None:
    # Continue to support the old sort_function keyword argument.
    # This is expected to be a rare occurrence, so use LBYL to avoid
    # the overhead of actually catching KeyError.
    if 'sort_function' in kwargs:
      kwargs['cmp'] = kwargs.pop('sort_function')
    self._values.sort(*args, **kwargs)

  def reverse(self) -> None:
    self._values.reverse()


# TODO(slebedev): Remove this. BaseContainer does *not* conform to
# MutableSequence, only its subclasses do.
collections.abc.MutableSequence.register(BaseContainer)


class RepeatedScalarFieldContainer(BaseContainer[_T], MutableSequence[_T]):
  """Simple, type-checked, list-like container for holding repeated scalars."""

  # Disallows assignment to other attributes.
  __slots__ = ['_type_checker']

  def __init__(
      self,
      message_listener: Any,
      type_checker: Any,
  ) -> None:
    """
    Args:
      message_listener: A MessageListener implementation. The
        RepeatedScalarFieldContainer will call this object's Modified() method
        when it is modified.
      type_checker: A type_checkers.ValueChecker instance to run on elements
        inserted into this container.
    """
    super().__init__(message_listener)
    self._type_checker = type_checker

  def append(self, value: _T) -> None:
    """Appends an item to the list. Similar to list.append()."""
    self._values.append(self._type_checker.CheckValue(value))
    if not self._message_listener.dirty:
      self._message_listener.Modified()

  def insert(self, key: int, value: _T) -> None:
    """Inserts the item at the specified position. Similar to list.insert()."""
    self._values.insert(key, self._type_checker.CheckValue(value))
    if not self._message_listener.dirty:
      self._message_listener.Modified()

  def extend(self, elem_seq: Iterable[_T]) -> None:
    """Extends by appending the given iterable. Similar to list.extend()."""
    if elem_seq is None:
      return
    try:
      elem_seq_iter = iter(elem_seq)
    except TypeError:
      if not elem_seq:
        # silently ignore falsy inputs :-/.
        # TODO(ptucker): Deprecate this behavior. b/18413862
        return
      raise

    new_values = [self._type_checker.CheckValue(elem) for elem in elem_seq_iter]
    if new_values:
      self._values.extend(new_values)
      self._message_listener.Modified()

  def MergeFrom(
      self,
      other: Union['RepeatedScalarFieldContainer[_T]', Iterable[_T]],
  ) -> None:
    """Appends the contents of another repeated field of the same type to this
    one. We do not check the types of the individual fields.
    """
    self._values.extend(other)
    self._message_listener.Modified()

  def remove(self, elem: _T):
    """Removes an item from the list. Similar to list.remove()."""
    self._values.remove(elem)
    self._message_listener.Modified()

  def pop(self, key: Optional[int] = -1) -> _T:
    """Removes and returns an item at a given index. Similar to list.pop()."""
    value = self._values[key]
    self.__delitem__(key)
    return value

  @overload
  def __setitem__(self, key: int, value: _T) -> None:
    ...

  @overload
  def __setitem__(self, key: slice, value: Iterable[_T]) -> None:
    ...

  def __setitem__(self, key, value) -> None:
    """Sets the item on the specified position."""
    if isinstance(key, slice):
      if key.step is not None:
        raise ValueError('Extended slices not supported')
      self._values[key] = map(self._type_checker.CheckValue, value)
      self._message_listener.Modified()
    else:
      self._values[key] = self._type_checker.CheckValue(value)
      self._message_listener.Modified()

  def __delitem__(self, key: Union[int, slice]) -> None:
    """Deletes the item at the specified position."""
    del self._values[key]
    self._message_listener.Modified()

  def __eq__(self, other: Any) -> bool:
    """Compares the current instance with another one."""
    if self is other:
      return True
    # Special case for the same type which should be common and fast.
    if isinstance(other, self.__class__):
      return other._values == self._values
    # We are presumably comparing against some other sequence type.
    return other == self._values

  def __deepcopy__(
      self,
      unused_memo: Any = None,
  ) -> 'RepeatedScalarFieldContainer[_T]':
    clone = RepeatedScalarFieldContainer(
        copy.deepcopy(self._message_listener), self._type_checker)
    clone.MergeFrom(self)
    return clone

  def __reduce__(self, **kwargs) -> NoReturn:
    raise pickle.PickleError(
        "Can't pickle repeated scalar fields, convert to list first")

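# Illustrative note: because __reduce__ above raises PickleError, repeated
# scalar fields must be converted before pickling.  Assuming `msg` is any
# message instance with a repeated scalar field `values`:
#
#   import pickle
#   data = pickle.dumps(list(msg.values))  # OK: a plain list pickles fine
#   pickle.dumps(msg.values)               # raises pickle.PickleError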

# TODO(slebedev): Constrain T to be a subtype of Message.
class RepeatedCompositeFieldContainer(BaseContainer[_T], MutableSequence[_T]):
  """Simple, list-like container for holding repeated composite fields."""

  # Disallows assignment to other attributes.
  __slots__ = ['_message_descriptor']

  def __init__(self, message_listener: Any, message_descriptor: Any) -> None:
    """
    Note that we pass in a descriptor instead of the generated class directly,
    since at the time we construct a _RepeatedCompositeFieldContainer we
    haven't yet necessarily initialized the type that will be contained in the
    container.

    Args:
      message_listener: A MessageListener implementation.
        The RepeatedCompositeFieldContainer will call this object's
        Modified() method when it is modified.
      message_descriptor: A Descriptor instance describing the protocol type
        that should be present in this container.  We'll use the
        _concrete_class field of this descriptor when the client calls add().
    """
    super().__init__(message_listener)
    self._message_descriptor = message_descriptor

  def add(self, **kwargs: Any) -> _T:
    """Adds a new element at the end of the list and returns it. Keyword
    arguments may be used to initialize the element.
    """
    new_element = self._message_descriptor._concrete_class(**kwargs)
    new_element._SetListener(self._message_listener)
    self._values.append(new_element)
    if not self._message_listener.dirty:
      self._message_listener.Modified()
    return new_element

  def append(self, value: _T) -> None:
    """Appends one element by copying the message."""
    new_element = self._message_descriptor._concrete_class()
    new_element._SetListener(self._message_listener)
    new_element.CopyFrom(value)
    self._values.append(new_element)
    if not self._message_listener.dirty:
      self._message_listener.Modified()

  def insert(self, key: int, value: _T) -> None:
    """Inserts the item at the specified position by copying."""
    new_element = self._message_descriptor._concrete_class()
    new_element._SetListener(self._message_listener)
    new_element.CopyFrom(value)
    self._values.insert(key, new_element)
    if not self._message_listener.dirty:
      self._message_listener.Modified()

  def extend(self, elem_seq: Iterable[_T]) -> None:
    """Extends by appending the given sequence of elements of the same type
    as this one, copying each individual message.
    """
    message_class = self._message_descriptor._concrete_class
    listener = self._message_listener
    values = self._values
    for message in elem_seq:
      new_element = message_class()
      new_element._SetListener(listener)
      new_element.MergeFrom(message)
      values.append(new_element)
    listener.Modified()

  def MergeFrom(
      self,
      other: Union['RepeatedCompositeFieldContainer[_T]', Iterable[_T]],
  ) -> None:
    """Appends the contents of another repeated field of the same type to this
    one, copying each individual message.
    """
    self.extend(other)

  def remove(self, elem: _T) -> None:
    """Removes an item from the list. Similar to list.remove()."""
    self._values.remove(elem)
    self._message_listener.Modified()

  def pop(self, key: Optional[int] = -1) -> _T:
    """Removes and returns an item at a given index. Similar to list.pop()."""
    value = self._values[key]
    self.__delitem__(key)
    return value

  @overload
  def __setitem__(self, key: int, value: _T) -> None:
    ...

  @overload
  def __setitem__(self, key: slice, value: Iterable[_T]) -> None:
    ...

  def __setitem__(self, key, value):
    # This method is implemented to make RepeatedCompositeFieldContainer
    # structurally compatible with typing.MutableSequence. It is
    # otherwise unsupported and will always raise an error.
    raise TypeError(
        f'{self.__class__.__name__} object does not support item assignment')

  def __delitem__(self, key: Union[int, slice]) -> None:
    """Deletes the item at the specified position."""
    del self._values[key]
    self._message_listener.Modified()

  def __eq__(self, other: Any) -> bool:
    """Compares the current instance with another one."""
    if self is other:
      return True
    if not isinstance(other, self.__class__):
      raise TypeError('Can only compare repeated composite fields against '
                      'other repeated composite fields.')
    return self._values == other._values


class ScalarMap(MutableMapping[_K, _V]):
  """Simple, type-checked, dict-like container for holding repeated scalars."""

  # Disallows assignment to other attributes.
  __slots__ = ['_key_checker', '_value_checker', '_values', '_message_listener',
               '_entry_descriptor']

  def __init__(
      self,
      message_listener: Any,
      key_checker: Any,
      value_checker: Any,
      entry_descriptor: Any,
  ) -> None:
    """
    Args:
      message_listener: A MessageListener implementation.
        The ScalarMap will call this object's Modified() method when it
        is modified.
      key_checker: A type_checkers.ValueChecker instance to run on keys
        inserted into this container.
      value_checker: A type_checkers.ValueChecker instance to run on values
        inserted into this container.
      entry_descriptor: The MessageDescriptor of a map entry: key and value.
    """
    self._message_listener = message_listener
    self._key_checker = key_checker
    self._value_checker = value_checker
    self._entry_descriptor = entry_descriptor
    self._values = {}

  def __getitem__(self, key: _K) -> _V:
    try:
      return self._values[key]
    except KeyError:
      key = self._key_checker.CheckValue(key)
      val = self._value_checker.DefaultValue()
      self._values[key] = val
      return val

  def __contains__(self, item: _K) -> bool:
    # We check the key's type to match the strong-typing flavor of the API.
    # Also this makes it easier to match the behavior of the C++
    # implementation.
    self._key_checker.CheckValue(item)
    return item in self._values

  @overload
  def get(self, key: _K) -> Optional[_V]:
    ...

  @overload
  def get(self, key: _K, default: _T) -> Union[_V, _T]:
    ...

  # We need to override this explicitly, because our defaultdict-like behavior
  # will make the default implementation (from our base class) always insert
  # the key.
  def get(self, key, default=None):
    if key in self:
      return self[key]
    else:
      return default

  def __setitem__(self, key: _K, value: _V) -> None:
    checked_key = self._key_checker.CheckValue(key)
    checked_value = self._value_checker.CheckValue(value)
    self._values[checked_key] = checked_value
    self._message_listener.Modified()

  def __delitem__(self, key: _K) -> None:
    del self._values[key]
    self._message_listener.Modified()

  def __len__(self) -> int:
    return len(self._values)

  def __iter__(self) -> Iterator[_K]:
    return iter(self._values)

  def __repr__(self) -> str:
    return repr(self._values)

  def MergeFrom(self, other: 'ScalarMap[_K, _V]') -> None:
    self._values.update(other._values)
    self._message_listener.Modified()

  def InvalidateIterators(self) -> None:
    # It appears that the only way to reliably invalidate iterators to
    # self._values is to ensure that its size changes.
    original = self._values
    self._values = original.copy()
    original[None] = None

  # This is defined in the abstract base, but we can do it much more cheaply.
  def clear(self) -> None:
    self._values.clear()
    self._message_listener.Modified()

  def GetEntryClass(self) -> Any:
    return self._entry_descriptor._concrete_class

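# Illustrative note on the defaultdict-like behavior above: for a map field
# such as `map<string, int32> counts = 1;`, plain indexing inserts a default
# value, while get() does not:
#
#   msg.counts['missing']    # inserts and returns 0
#   msg.counts.get('other')  # returns None; nothing is inserted
#   msg.counts['hits'] = 3   # keys and values are type-checked on the way in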

class MessageMap(MutableMapping[_K, _V]):
  """Simple, type-checked, dict-like container for holding submessage values."""

  # Disallows assignment to other attributes.
  __slots__ = ['_key_checker', '_values', '_message_listener',
               '_message_descriptor', '_entry_descriptor']

  def __init__(
      self,
      message_listener: Any,
      message_descriptor: Any,
      key_checker: Any,
      entry_descriptor: Any,
  ) -> None:
    """
    Args:
      message_listener: A MessageListener implementation.
        The MessageMap will call this object's Modified() method when it
        is modified.
      message_descriptor: A Descriptor instance describing the protocol type
        of the values stored in this container.
      key_checker: A type_checkers.ValueChecker instance to run on keys
        inserted into this container.
      entry_descriptor: The MessageDescriptor of a map entry: key and value.
    """
    self._message_listener = message_listener
    self._message_descriptor = message_descriptor
    self._key_checker = key_checker
    self._entry_descriptor = entry_descriptor
    self._values = {}

  def __getitem__(self, key: _K) -> _V:
    key = self._key_checker.CheckValue(key)
    try:
      return self._values[key]
    except KeyError:
      new_element = self._message_descriptor._concrete_class()
      new_element._SetListener(self._message_listener)
      self._values[key] = new_element
      self._message_listener.Modified()
      return new_element

  def get_or_create(self, key: _K) -> _V:
    """get_or_create() is an alias for getitem (i.e. map[key]).

    Args:
      key: The key to get or create in the map.

    This is useful in cases where you want to be explicit that the call is
    mutating the map. This can avoid lint errors for statements like this
    that otherwise would appear to be pointless statements:

      msg.my_map[key]
    """
    return self[key]

  @overload
  def get(self, key: _K) -> Optional[_V]:
    ...

  @overload
  def get(self, key: _K, default: _T) -> Union[_V, _T]:
    ...

  # We need to override this explicitly, because our defaultdict-like behavior
  # will make the default implementation (from our base class) always insert
  # the key.
  def get(self, key, default=None):
    if key in self:
      return self[key]
    else:
      return default

  def __contains__(self, item: _K) -> bool:
    item = self._key_checker.CheckValue(item)
    return item in self._values

  def __setitem__(self, key: _K, value: _V) -> NoReturn:
    raise ValueError('May not set values directly, call my_map[key].foo = 5')

  def __delitem__(self, key: _K) -> None:
    key = self._key_checker.CheckValue(key)
    del self._values[key]
    self._message_listener.Modified()

  def __len__(self) -> int:
    return len(self._values)

  def __iter__(self) -> Iterator[_K]:
    return iter(self._values)

  def __repr__(self) -> str:
    return repr(self._values)

  def MergeFrom(self, other: 'MessageMap[_K, _V]') -> None:
    # pylint: disable=protected-access
    for key in other._values:
      # According to documentation: "When parsing from the wire or when
      # merging, if there are duplicate map keys the last key seen is used".
      if key in self:
        del self[key]
      self[key].CopyFrom(other[key])
    # self._message_listener.Modified() not required here, because
    # mutations to submessages already propagate.

  def InvalidateIterators(self) -> None:
    # It appears that the only way to reliably invalidate iterators to
    # self._values is to ensure that its size changes.
    original = self._values
    self._values = original.copy()
    original[None] = None

  # This is defined in the abstract base, but we can do it much more cheaply.
  def clear(self) -> None:
    self._values.clear()
    self._message_listener.Modified()

  def GetEntryClass(self) -> Any:
    return self._entry_descriptor._concrete_class


class _UnknownField:
  """A parsed unknown field."""

  # Disallows assignment to other attributes.
  __slots__ = ['_field_number', '_wire_type', '_data']

  def __init__(self, field_number, wire_type, data):
    self._field_number = field_number
    self._wire_type = wire_type
    self._data = data
    return

  def __lt__(self, other):
    # pylint: disable=protected-access
    return self._field_number < other._field_number

  def __eq__(self, other):
    if self is other:
      return True
    # pylint: disable=protected-access
    return (self._field_number == other._field_number and
            self._wire_type == other._wire_type and
            self._data == other._data)


class UnknownFieldRef:  # pylint: disable=missing-class-docstring

  def __init__(self, parent, index):
    self._parent = parent
    self._index = index

  def _check_valid(self):
    if not self._parent:
      raise ValueError('UnknownField does not exist. '
                       'The parent message might be cleared.')
    if self._index >= len(self._parent):
      raise ValueError('UnknownField does not exist. '
                       'The parent message might be cleared.')

  @property
  def field_number(self):
    self._check_valid()
    # pylint: disable=protected-access
    return self._parent._internal_get(self._index)._field_number

  @property
  def wire_type(self):
    self._check_valid()
    # pylint: disable=protected-access
    return self._parent._internal_get(self._index)._wire_type

  @property
  def data(self):
    self._check_valid()
    # pylint: disable=protected-access
    return self._parent._internal_get(self._index)._data


class UnknownFieldSet:
  """UnknownField container"""

  # Disallows assignment to other attributes.
  __slots__ = ['_values']

  def __init__(self):
    self._values = []

  def __getitem__(self, index):
    if self._values is None:
      raise ValueError('UnknownFields does not exist. '
                       'The parent message might be cleared.')
    size = len(self._values)
    if index < 0:
      index += size
    if index < 0 or index >= size:
      raise IndexError('index %d out of range' % index)

    return UnknownFieldRef(self, index)

  def _internal_get(self, index):
    return self._values[index]

  def __len__(self):
    if self._values is None:
      raise ValueError('UnknownFields does not exist. '
                       'The parent message might be cleared.')
    return len(self._values)

  def _add(self, field_number, wire_type, data):
    unknown_field = _UnknownField(field_number, wire_type, data)
    self._values.append(unknown_field)
    return unknown_field

  def __iter__(self):
    for i in range(len(self)):
      yield UnknownFieldRef(self, i)

  def _extend(self, other):
    if other is None:
      return
    # pylint: disable=protected-access
    self._values.extend(other._values)

  def __eq__(self, other):
    if self is other:
      return True
    # Sort unknown fields because their order shouldn't
    # affect equality test.
    values = list(self._values)
    if other is None:
      return not values
    values.sort()
    # pylint: disable=protected-access
    other_values = sorted(other._values)
    return values == other_values

  def _clear(self):
    for value in self._values:
      # pylint: disable=protected-access
      if isinstance(value._data, UnknownFieldSet):
        value._data._clear()  # pylint: disable=protected-access
    self._values = None
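A brief sketch of the repeated-composite semantics defined above, as seen from
user code (illustrative; `example_pb2`, `Playlist` and its repeated `tracks`
field are hypothetical):

  from example_pb2 import Playlist  # e.g. repeated Track tracks = 1;

  playlist = Playlist()

  # add() constructs the new element in place and returns it; keyword
  # arguments initialize its fields.
  track = playlist.tracks.add(title='Intro')

  # append()/extend() copy the given messages rather than referencing them.
  other = Playlist()
  other.tracks.extend(playlist.tracks)
  assert other.tracks[0] is not playlist.tracks[0]

  # Direct item assignment is rejected by __setitem__ above:
  # playlist.tracks[0] = track  # would raise TypeError
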
1029 openpype/hosts/hiero/vendor/google/protobuf/internal/decoder.py vendored Normal file
File diff suppressed because it is too large

829 openpype/hosts/hiero/vendor/google/protobuf/internal/encoder.py vendored Normal file

@@ -0,0 +1,829 @@
# Protocol Buffers - Google's data interchange format
# Copyright 2008 Google Inc. All rights reserved.
# https://developers.google.com/protocol-buffers/
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
#     * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#     * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
#     * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

"""Code for encoding protocol message primitives.

Contains the logic for encoding every logical protocol field type
into one of the 5 physical wire types.

This code is designed to push the Python interpreter's performance to the
limits.

The basic idea is that at startup time, for every field (i.e. every
FieldDescriptor) we construct two functions: a "sizer" and an "encoder".  The
sizer takes a value of this field's type and computes its byte size.  The
encoder takes a writer function and a value.  It encodes the value into byte
strings and invokes the writer function to write those strings.  Typically the
writer function is the write() method of a BytesIO.

We try to do as much work as possible when constructing the writer and the
sizer rather than when calling them.  In particular:
* We copy any needed global functions to local variables, so that we do not
  need to do costly global table lookups at runtime.
* Similarly, we try to do any attribute lookups at startup time if possible.
* Every field's tag is encoded to bytes at startup, since it can't change at
  runtime.
* Whatever component of the field size we can compute at startup, we do.
* We *avoid* sharing code if doing so would make the code slower and not
  sharing does not burden us too much.  For example, encoders for repeated
  fields do not just call the encoders for singular fields in a loop because
  this would add an extra function call overhead for every loop iteration;
  instead, we manually inline the single-value encoder into the loop.
* If a Python function lacks a return statement, Python actually generates
  instructions to pop the result of the last statement off the stack, push
  None onto the stack, and then return that.  If we really don't care what
  value is returned, then we can save two instructions by returning the
  result of the last statement.  It looks funny but it helps.
* We assume that type and bounds checking has happened at a higher level.
"""

__author__ = 'kenton@google.com (Kenton Varda)'

import struct

from google.protobuf.internal import wire_format


# This will overflow and thus become IEEE-754 "infinity".  We would use
# "float('inf')" but it doesn't work on Windows pre-Python-2.6.
_POS_INF = 1e10000
_NEG_INF = -_POS_INF


def _VarintSize(value):
  """Compute the size of a varint value."""
  if value <= 0x7f: return 1
  if value <= 0x3fff: return 2
  if value <= 0x1fffff: return 3
  if value <= 0xfffffff: return 4
  if value <= 0x7ffffffff: return 5
  if value <= 0x3ffffffffff: return 6
  if value <= 0x1ffffffffffff: return 7
  if value <= 0xffffffffffffff: return 8
  if value <= 0x7fffffffffffffff: return 9
  return 10


def _SignedVarintSize(value):
  """Compute the size of a signed varint value."""
  if value < 0: return 10
  if value <= 0x7f: return 1
  if value <= 0x3fff: return 2
  if value <= 0x1fffff: return 3
  if value <= 0xfffffff: return 4
  if value <= 0x7ffffffff: return 5
  if value <= 0x3ffffffffff: return 6
  if value <= 0x1ffffffffffff: return 7
  if value <= 0xffffffffffffff: return 8
  if value <= 0x7fffffffffffffff: return 9
  return 10


def _TagSize(field_number):
  """Returns the number of bytes required to serialize a tag with this field
  number."""
  # Just pass in type 0, since the type won't affect the tag+type size.
  return _VarintSize(wire_format.PackTag(field_number, 0))


# --------------------------------------------------------------------
# In this section we define some generic sizers.  Each of these functions
# takes parameters specific to a particular field type, e.g. int32 or fixed64.
# It returns another function which in turn takes parameters specific to a
# particular field, e.g. the field number and whether it is repeated or packed.
# Look at the next section to see how these are used.


def _SimpleSizer(compute_value_size):
  """A sizer which uses the function compute_value_size to compute the size of
  each value.  Typically compute_value_size is _VarintSize."""

  def SpecificSizer(field_number, is_repeated, is_packed):
    tag_size = _TagSize(field_number)
    if is_packed:
      local_VarintSize = _VarintSize
      def PackedFieldSize(value):
        result = 0
        for element in value:
          result += compute_value_size(element)
        return result + local_VarintSize(result) + tag_size
      return PackedFieldSize
    elif is_repeated:
      def RepeatedFieldSize(value):
        result = tag_size * len(value)
        for element in value:
          result += compute_value_size(element)
        return result
      return RepeatedFieldSize
    else:
      def FieldSize(value):
        return tag_size + compute_value_size(value)
      return FieldSize

  return SpecificSizer


def _ModifiedSizer(compute_value_size, modify_value):
  """Like SimpleSizer, but modify_value is invoked on each value before it is
  passed to compute_value_size.  modify_value is typically ZigZagEncode."""

  def SpecificSizer(field_number, is_repeated, is_packed):
    tag_size = _TagSize(field_number)
    if is_packed:
      local_VarintSize = _VarintSize
      def PackedFieldSize(value):
        result = 0
        for element in value:
          result += compute_value_size(modify_value(element))
        return result + local_VarintSize(result) + tag_size
      return PackedFieldSize
    elif is_repeated:
      def RepeatedFieldSize(value):
        result = tag_size * len(value)
        for element in value:
          result += compute_value_size(modify_value(element))
        return result
      return RepeatedFieldSize
    else:
      def FieldSize(value):
        return tag_size + compute_value_size(modify_value(value))
      return FieldSize

  return SpecificSizer


def _FixedSizer(value_size):
  """Like _SimpleSizer except for a fixed-size field.  The input is the size
  of one value."""

  def SpecificSizer(field_number, is_repeated, is_packed):
    tag_size = _TagSize(field_number)
    if is_packed:
      local_VarintSize = _VarintSize
      def PackedFieldSize(value):
        result = len(value) * value_size
        return result + local_VarintSize(result) + tag_size
      return PackedFieldSize
    elif is_repeated:
      element_size = value_size + tag_size
      def RepeatedFieldSize(value):
        return len(value) * element_size
      return RepeatedFieldSize
    else:
      field_size = value_size + tag_size
      def FieldSize(value):
        return field_size
      return FieldSize

  return SpecificSizer


# ====================================================================
# Here we declare a sizer constructor for each field type.  Each "sizer
# constructor" is a function that takes (field_number, is_repeated, is_packed)
# as parameters and returns a sizer, which in turn takes a field value as
# a parameter and returns its encoded size.


Int32Sizer = Int64Sizer = EnumSizer = _SimpleSizer(_SignedVarintSize)

UInt32Sizer = UInt64Sizer = _SimpleSizer(_VarintSize)

SInt32Sizer = SInt64Sizer = _ModifiedSizer(
    _SignedVarintSize, wire_format.ZigZagEncode)

Fixed32Sizer = SFixed32Sizer = FloatSizer = _FixedSizer(4)
Fixed64Sizer = SFixed64Sizer = DoubleSizer = _FixedSizer(8)

BoolSizer = _FixedSizer(1)

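# Illustrative example of how the sizer constructors above compose: for a
# singular uint32 field with field number 1,
#
#   sizer = UInt32Sizer(field_number=1, is_repeated=False, is_packed=False)
#   sizer(300)  # == 3: one tag byte (field 1, wire type 0) plus a two-byte
#               # varint (300 encodes as 0xAC 0x02)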
|
||||
def StringSizer(field_number, is_repeated, is_packed):
|
||||
"""Returns a sizer for a string field."""
|
||||
|
||||
tag_size = _TagSize(field_number)
|
||||
local_VarintSize = _VarintSize
|
||||
local_len = len
|
||||
assert not is_packed
|
||||
if is_repeated:
|
||||
def RepeatedFieldSize(value):
|
||||
result = tag_size * len(value)
|
||||
for element in value:
|
||||
l = local_len(element.encode('utf-8'))
|
||||
result += local_VarintSize(l) + l
|
||||
return result
|
||||
return RepeatedFieldSize
|
||||
else:
|
||||
def FieldSize(value):
|
||||
l = local_len(value.encode('utf-8'))
|
||||
return tag_size + local_VarintSize(l) + l
|
||||
return FieldSize
|
||||
|
||||
|
||||
def BytesSizer(field_number, is_repeated, is_packed):
  """Returns a sizer for a bytes field."""

  tag_size = _TagSize(field_number)
  local_VarintSize = _VarintSize
  local_len = len
  assert not is_packed
  if is_repeated:
    def RepeatedFieldSize(value):
      result = tag_size * len(value)
      for element in value:
        l = local_len(element)
        result += local_VarintSize(l) + l
      return result
    return RepeatedFieldSize
  else:
    def FieldSize(value):
      l = local_len(value)
      return tag_size + local_VarintSize(l) + l
    return FieldSize


def GroupSizer(field_number, is_repeated, is_packed):
  """Returns a sizer for a group field."""

  tag_size = _TagSize(field_number) * 2
  assert not is_packed
  if is_repeated:
    def RepeatedFieldSize(value):
      result = tag_size * len(value)
      for element in value:
        result += element.ByteSize()
      return result
    return RepeatedFieldSize
  else:
    def FieldSize(value):
      return tag_size + value.ByteSize()
    return FieldSize


def MessageSizer(field_number, is_repeated, is_packed):
  """Returns a sizer for a message field."""

  tag_size = _TagSize(field_number)
  local_VarintSize = _VarintSize
  assert not is_packed
  if is_repeated:
    def RepeatedFieldSize(value):
      result = tag_size * len(value)
      for element in value:
        l = element.ByteSize()
        result += local_VarintSize(l) + l
      return result
    return RepeatedFieldSize
  else:
    def FieldSize(value):
      l = value.ByteSize()
      return tag_size + local_VarintSize(l) + l
    return FieldSize


# --------------------------------------------------------------------
# MessageSet is special: it needs custom logic to compute its size properly.


def MessageSetItemSizer(field_number):
  """Returns a sizer for extensions of MessageSet.

  The message set message looks like this:
    message MessageSet {
      repeated group Item = 1 {
        required int32 type_id = 2;
        required string message = 3;
      }
    }
  """
  static_size = (_TagSize(1) * 2 + _TagSize(2) + _VarintSize(field_number) +
                 _TagSize(3))
  local_VarintSize = _VarintSize

  def FieldSize(value):
    l = value.ByteSize()
    return static_size + local_VarintSize(l) + l

  return FieldSize


# --------------------------------------------------------------------
# Map is special: it needs custom logic to compute its size properly.


def MapSizer(field_descriptor, is_message_map):
  """Returns a sizer for a map field."""

  # Can't look at field_descriptor.message_type._concrete_class because it may
  # not have been initialized yet.
  message_type = field_descriptor.message_type
  message_sizer = MessageSizer(field_descriptor.number, False, False)

  def FieldSize(map_value):
    total = 0
    for key in map_value:
      value = map_value[key]
      # It's wasteful to create the messages and throw them away one second
      # later since we'll do the same for the actual encode. But there's not
      # an obvious way to avoid this within the current design without tons
      # of code duplication. For message map, value.ByteSize() should be
      # called to update the status.
      entry_msg = message_type._concrete_class(key=key, value=value)
      total += message_sizer(entry_msg)
      if is_message_map:
        value.ByteSize()
    return total

  return FieldSize


# ====================================================================
# Encoders!


def _VarintEncoder():
  """Return an encoder for a basic varint value (does not include tag)."""

  local_int2byte = struct.Struct('>B').pack

  def EncodeVarint(write, value, unused_deterministic=None):
    bits = value & 0x7f
    value >>= 7
    while value:
      write(local_int2byte(0x80|bits))
      bits = value & 0x7f
      value >>= 7
    return write(local_int2byte(bits))

  return EncodeVarint

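# (Editor-added sketch, not part of the upstream file: the varint scheme
# emits seven payload bits per byte, low-order group first, setting the high
# bit on every byte except the last. 150 = 0b1_0010110 therefore becomes
# 0x96 0x01.)
def _example_encode_varint():
  out = bytearray()
  _EncodeVarint(out.extend, 150, True)
  assert bytes(out) == b'\x96\x01'
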
def _SignedVarintEncoder():
  """Return an encoder for a basic signed varint value (does not include
  tag)."""

  local_int2byte = struct.Struct('>B').pack

  def EncodeSignedVarint(write, value, unused_deterministic=None):
    if value < 0:
      value += (1 << 64)
    bits = value & 0x7f
    value >>= 7
    while value:
      write(local_int2byte(0x80|bits))
      bits = value & 0x7f
      value >>= 7
    return write(local_int2byte(bits))

  return EncodeSignedVarint

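# (Editor-added sketch, not part of the upstream file: negative values are
# first mapped into the unsigned 64-bit range, so -1 becomes 2**64 - 1 and
# always costs the full ten varint bytes.)
def _example_encode_signed_varint():
  out = bytearray()
  _EncodeSignedVarint(out.extend, -1, True)
  assert bytes(out) == b'\xff' * 9 + b'\x01'
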
_EncodeVarint = _VarintEncoder()
_EncodeSignedVarint = _SignedVarintEncoder()


def _VarintBytes(value):
  """Encode the given integer as a varint and return the bytes. This is only
  called at startup time so it doesn't need to be fast."""

  pieces = []
  _EncodeVarint(pieces.append, value, True)
  return b"".join(pieces)


def TagBytes(field_number, wire_type):
  """Encode the given tag and return the bytes. Only called at startup."""

  return bytes(_VarintBytes(wire_format.PackTag(field_number, wire_type)))

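# (Editor-added sketch, not part of the upstream file: a tag packs the field
# number and wire type as (field_number << 3) | wire_type and varint-encodes
# the result, so field 1 with the varint wire type (0) is the single byte
# 0x08.)
def _example_tag_bytes():
  assert TagBytes(1, wire_format.WIRETYPE_VARINT) == b'\x08'
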
# --------------------------------------------------------------------
# As with sizers (see above), we have a number of common encoder
# implementations.


def _SimpleEncoder(wire_type, encode_value, compute_value_size):
  """Return a constructor for an encoder for fields of a particular type.

  Args:
      wire_type:  The field's wire type, for encoding tags.
      encode_value:  A function which encodes an individual value, e.g.
        _EncodeVarint().
      compute_value_size:  A function which computes the size of an individual
        value, e.g. _VarintSize().
  """

  def SpecificEncoder(field_number, is_repeated, is_packed):
    if is_packed:
      tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
      local_EncodeVarint = _EncodeVarint
      def EncodePackedField(write, value, deterministic):
        write(tag_bytes)
        size = 0
        for element in value:
          size += compute_value_size(element)
        local_EncodeVarint(write, size, deterministic)
        for element in value:
          encode_value(write, element, deterministic)
      return EncodePackedField
    elif is_repeated:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeRepeatedField(write, value, deterministic):
        for element in value:
          write(tag_bytes)
          encode_value(write, element, deterministic)
      return EncodeRepeatedField
    else:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeField(write, value, deterministic):
        write(tag_bytes)
        return encode_value(write, value, deterministic)
      return EncodeField

  return SpecificEncoder

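# (Editor-added sketch, not part of the upstream file: encoders stream through
# a `write` callable, so any buffer works. Encoding uint32 field 1 with value
# 150 produces the tag byte 0x08 followed by the varint 0x96 0x01.)
def _example_simple_encoder():
  encode_uint32 = UInt32Encoder(1, False, False)
  out = bytearray()
  encode_uint32(out.extend, 150, True)
  assert bytes(out) == b'\x08\x96\x01'
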
def _ModifiedEncoder(wire_type, encode_value, compute_value_size, modify_value):
  """Like SimpleEncoder but additionally invokes modify_value on every value
  before passing it to encode_value. Usually modify_value is ZigZagEncode."""

  def SpecificEncoder(field_number, is_repeated, is_packed):
    if is_packed:
      tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
      local_EncodeVarint = _EncodeVarint
      def EncodePackedField(write, value, deterministic):
        write(tag_bytes)
        size = 0
        for element in value:
          size += compute_value_size(modify_value(element))
        local_EncodeVarint(write, size, deterministic)
        for element in value:
          encode_value(write, modify_value(element), deterministic)
      return EncodePackedField
    elif is_repeated:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeRepeatedField(write, value, deterministic):
        for element in value:
          write(tag_bytes)
          encode_value(write, modify_value(element), deterministic)
      return EncodeRepeatedField
    else:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeField(write, value, deterministic):
        write(tag_bytes)
        return encode_value(write, modify_value(value), deterministic)
      return EncodeField

  return SpecificEncoder


def _StructPackEncoder(wire_type, format):
  """Return a constructor for an encoder for a fixed-width field.

  Args:
      wire_type:  The field's wire type, for encoding tags.
      format:  The format string to pass to struct.pack().
  """

  value_size = struct.calcsize(format)

  def SpecificEncoder(field_number, is_repeated, is_packed):
    local_struct_pack = struct.pack
    if is_packed:
      tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
      local_EncodeVarint = _EncodeVarint
      def EncodePackedField(write, value, deterministic):
        write(tag_bytes)
        local_EncodeVarint(write, len(value) * value_size, deterministic)
        for element in value:
          write(local_struct_pack(format, element))
      return EncodePackedField
    elif is_repeated:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeRepeatedField(write, value, unused_deterministic=None):
        for element in value:
          write(tag_bytes)
          write(local_struct_pack(format, element))
      return EncodeRepeatedField
    else:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeField(write, value, unused_deterministic=None):
        write(tag_bytes)
        return write(local_struct_pack(format, value))
      return EncodeField

  return SpecificEncoder


def _FloatingPointEncoder(wire_type, format):
  """Return a constructor for an encoder for float fields.

  This is like StructPackEncoder, but catches errors that may be due to
  passing non-finite floating-point values to struct.pack, and makes a
  second attempt to encode those values.

  Args:
      wire_type:  The field's wire type, for encoding tags.
      format:  The format string to pass to struct.pack().
  """

  value_size = struct.calcsize(format)
  if value_size == 4:
    def EncodeNonFiniteOrRaise(write, value):
      # Remember that the serialized form uses little-endian byte order.
      if value == _POS_INF:
        write(b'\x00\x00\x80\x7F')
      elif value == _NEG_INF:
        write(b'\x00\x00\x80\xFF')
      elif value != value:  # NaN
        write(b'\x00\x00\xC0\x7F')
      else:
        raise
  elif value_size == 8:
    def EncodeNonFiniteOrRaise(write, value):
      if value == _POS_INF:
        write(b'\x00\x00\x00\x00\x00\x00\xF0\x7F')
      elif value == _NEG_INF:
        write(b'\x00\x00\x00\x00\x00\x00\xF0\xFF')
      elif value != value:  # NaN
        write(b'\x00\x00\x00\x00\x00\x00\xF8\x7F')
      else:
        raise
  else:
    raise ValueError('Can\'t encode floating-point values that are '
                     '%d bytes long (only 4 or 8)' % value_size)

  def SpecificEncoder(field_number, is_repeated, is_packed):
    local_struct_pack = struct.pack
    if is_packed:
      tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
      local_EncodeVarint = _EncodeVarint
      def EncodePackedField(write, value, deterministic):
        write(tag_bytes)
        local_EncodeVarint(write, len(value) * value_size, deterministic)
        for element in value:
          # This try/except block is going to be faster than any code that
          # we could write to check whether element is finite.
          try:
            write(local_struct_pack(format, element))
          except SystemError:
            EncodeNonFiniteOrRaise(write, element)
      return EncodePackedField
    elif is_repeated:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeRepeatedField(write, value, unused_deterministic=None):
        for element in value:
          write(tag_bytes)
          try:
            write(local_struct_pack(format, element))
          except SystemError:
            EncodeNonFiniteOrRaise(write, element)
      return EncodeRepeatedField
    else:
      tag_bytes = TagBytes(field_number, wire_type)
      def EncodeField(write, value, unused_deterministic=None):
        write(tag_bytes)
        try:
          write(local_struct_pack(format, value))
        except SystemError:
          EncodeNonFiniteOrRaise(write, value)
      return EncodeField

  return SpecificEncoder


# ====================================================================
# Here we declare an encoder constructor for each field type. These work
# very similarly to sizer constructors, described earlier.


Int32Encoder = Int64Encoder = EnumEncoder = _SimpleEncoder(
    wire_format.WIRETYPE_VARINT, _EncodeSignedVarint, _SignedVarintSize)

UInt32Encoder = UInt64Encoder = _SimpleEncoder(
    wire_format.WIRETYPE_VARINT, _EncodeVarint, _VarintSize)

SInt32Encoder = SInt64Encoder = _ModifiedEncoder(
    wire_format.WIRETYPE_VARINT, _EncodeVarint, _VarintSize,
    wire_format.ZigZagEncode)

# Note that Python conveniently guarantees that when using the '<' prefix on
# formats, they will also have the same size across all platforms (as opposed
# to without the prefix, where their sizes depend on the C compiler's basic
# type sizes).
Fixed32Encoder  = _StructPackEncoder(wire_format.WIRETYPE_FIXED32, '<I')
Fixed64Encoder  = _StructPackEncoder(wire_format.WIRETYPE_FIXED64, '<Q')
SFixed32Encoder = _StructPackEncoder(wire_format.WIRETYPE_FIXED32, '<i')
SFixed64Encoder = _StructPackEncoder(wire_format.WIRETYPE_FIXED64, '<q')
FloatEncoder    = _FloatingPointEncoder(wire_format.WIRETYPE_FIXED32, '<f')
DoubleEncoder   = _FloatingPointEncoder(wire_format.WIRETYPE_FIXED64, '<d')

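# (Editor-added sketch, not part of the upstream file: sint fields zigzag-map
# signed values onto unsigned ones (0, -1, 1, -2, ... -> 0, 1, 2, 3, ...), so
# small negative numbers cost one byte instead of ten. Field 1 with value -1
# encodes as tag 0x08 followed by 0x01.)
def _example_sint32_encoder():
  encode_sint32 = SInt32Encoder(1, False, False)
  out = bytearray()
  encode_sint32(out.extend, -1, True)
  assert bytes(out) == b'\x08\x01'
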
def BoolEncoder(field_number, is_repeated, is_packed):
  """Returns an encoder for a boolean field."""

  false_byte = b'\x00'
  true_byte = b'\x01'
  if is_packed:
    tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
    local_EncodeVarint = _EncodeVarint
    def EncodePackedField(write, value, deterministic):
      write(tag_bytes)
      local_EncodeVarint(write, len(value), deterministic)
      for element in value:
        if element:
          write(true_byte)
        else:
          write(false_byte)
    return EncodePackedField
  elif is_repeated:
    tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_VARINT)
    def EncodeRepeatedField(write, value, unused_deterministic=None):
      for element in value:
        write(tag_bytes)
        if element:
          write(true_byte)
        else:
          write(false_byte)
    return EncodeRepeatedField
  else:
    tag_bytes = TagBytes(field_number, wire_format.WIRETYPE_VARINT)
    def EncodeField(write, value, unused_deterministic=None):
      write(tag_bytes)
      if value:
        return write(true_byte)
      return write(false_byte)
    return EncodeField


def StringEncoder(field_number, is_repeated, is_packed):
  """Returns an encoder for a string field."""

  tag = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
  local_EncodeVarint = _EncodeVarint
  local_len = len
  assert not is_packed
  if is_repeated:
    def EncodeRepeatedField(write, value, deterministic):
      for element in value:
        encoded = element.encode('utf-8')
        write(tag)
        local_EncodeVarint(write, local_len(encoded), deterministic)
        write(encoded)
    return EncodeRepeatedField
  else:
    def EncodeField(write, value, deterministic):
      encoded = value.encode('utf-8')
      write(tag)
      local_EncodeVarint(write, local_len(encoded), deterministic)
      return write(encoded)
    return EncodeField

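# (Editor-added sketch, not part of the upstream file: length-delimited fields
# emit tag, varint byte length, then the payload. The string "hi" on field 1
# becomes 0x0a 0x02 'h' 'i' -- tag (1 << 3 | 2), length 2, UTF-8 bytes.)
def _example_string_encoder():
  encode_string = StringEncoder(1, False, False)
  out = bytearray()
  encode_string(out.extend, u'hi', True)
  assert bytes(out) == b'\x0a\x02hi'
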
def BytesEncoder(field_number, is_repeated, is_packed):
  """Returns an encoder for a bytes field."""

  tag = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
  local_EncodeVarint = _EncodeVarint
  local_len = len
  assert not is_packed
  if is_repeated:
    def EncodeRepeatedField(write, value, deterministic):
      for element in value:
        write(tag)
        local_EncodeVarint(write, local_len(element), deterministic)
        write(element)
    return EncodeRepeatedField
  else:
    def EncodeField(write, value, deterministic):
      write(tag)
      local_EncodeVarint(write, local_len(value), deterministic)
      return write(value)
    return EncodeField


def GroupEncoder(field_number, is_repeated, is_packed):
  """Returns an encoder for a group field."""

  start_tag = TagBytes(field_number, wire_format.WIRETYPE_START_GROUP)
  end_tag = TagBytes(field_number, wire_format.WIRETYPE_END_GROUP)
  assert not is_packed
  if is_repeated:
    def EncodeRepeatedField(write, value, deterministic):
      for element in value:
        write(start_tag)
        element._InternalSerialize(write, deterministic)
        write(end_tag)
    return EncodeRepeatedField
  else:
    def EncodeField(write, value, deterministic):
      write(start_tag)
      value._InternalSerialize(write, deterministic)
      return write(end_tag)
    return EncodeField


def MessageEncoder(field_number, is_repeated, is_packed):
  """Returns an encoder for a message field."""

  tag = TagBytes(field_number, wire_format.WIRETYPE_LENGTH_DELIMITED)
  local_EncodeVarint = _EncodeVarint
  assert not is_packed
  if is_repeated:
    def EncodeRepeatedField(write, value, deterministic):
      for element in value:
        write(tag)
        local_EncodeVarint(write, element.ByteSize(), deterministic)
        element._InternalSerialize(write, deterministic)
    return EncodeRepeatedField
  else:
    def EncodeField(write, value, deterministic):
      write(tag)
      local_EncodeVarint(write, value.ByteSize(), deterministic)
      return value._InternalSerialize(write, deterministic)
    return EncodeField


# --------------------------------------------------------------------
# As before, MessageSet is special.


def MessageSetItemEncoder(field_number):
  """Encoder for extensions of MessageSet.

  The message set message looks like this:
    message MessageSet {
      repeated group Item = 1 {
        required int32 type_id = 2;
        required string message = 3;
      }
    }
  """
  start_bytes = b"".join([
      TagBytes(1, wire_format.WIRETYPE_START_GROUP),
      TagBytes(2, wire_format.WIRETYPE_VARINT),
      _VarintBytes(field_number),
      TagBytes(3, wire_format.WIRETYPE_LENGTH_DELIMITED)])
  end_bytes = TagBytes(1, wire_format.WIRETYPE_END_GROUP)
  local_EncodeVarint = _EncodeVarint

  def EncodeField(write, value, deterministic):
    write(start_bytes)
    local_EncodeVarint(write, value.ByteSize(), deterministic)
    value._InternalSerialize(write, deterministic)
    return write(end_bytes)

  return EncodeField


# --------------------------------------------------------------------
# As before, Map is special.


def MapEncoder(field_descriptor):
  """Encoder for a map field.

  Maps always have a wire format like this:
    message MapEntry {
      key_type key = 1;
      value_type value = 2;
    }
    repeated MapEntry map = N;
  """
  # Can't look at field_descriptor.message_type._concrete_class because it may
  # not have been initialized yet.
  message_type = field_descriptor.message_type
  encode_message = MessageEncoder(field_descriptor.number, False, False)

  def EncodeField(write, value, deterministic):
    value_keys = sorted(value.keys()) if deterministic else value
    for key in value_keys:
      entry_msg = message_type._concrete_class(key=key, value=value[key])
      encode_message(write, entry_msg, deterministic)

  return EncodeField

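# (Editor-added sketch, not part of the upstream file: deterministic
# serialization sorts map keys, so equal maps always produce identical bytes.
# The `my_pb2` name below is hypothetical.)
#
#   msg = my_pb2.WithMap()
#   msg.counts['b'] = 2
#   msg.counts['a'] = 1
#   msg.SerializeToString(deterministic=True)  # the 'a' entry is emitted first
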
124 openpype/hosts/hiero/vendor/google/protobuf/internal/enum_type_wrapper.py vendored Normal file
@@ -0,0 +1,124 @@
# Protocol Buffers - Google's data interchange format
# Copyright 2008 Google Inc. All rights reserved.
# https://developers.google.com/protocol-buffers/
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
#     * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#     * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
#     * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

"""A simple wrapper around enum types to expose utility functions.
|
||||
|
||||
Instances are created as properties with the same name as the enum they wrap
|
||||
on proto classes. For usage, see:
|
||||
reflection_test.py
|
||||
"""
|
||||
|
||||
__author__ = 'rabsatt@google.com (Kevin Rabsatt)'
|
||||
|
||||
|
||||
class EnumTypeWrapper(object):
|
||||
"""A utility for finding the names of enum values."""
|
||||
|
||||
DESCRIPTOR = None
|
||||
|
||||
# This is a type alias, which mypy typing stubs can type as
|
||||
# a genericized parameter constrained to an int, allowing subclasses
|
||||
# to be typed with more constraint in .pyi stubs
|
||||
# Eg.
|
||||
# def MyGeneratedEnum(Message):
|
||||
# ValueType = NewType('ValueType', int)
|
||||
# def Name(self, number: MyGeneratedEnum.ValueType) -> str
|
||||
ValueType = int
|
||||
|
||||
def __init__(self, enum_type):
|
||||
"""Inits EnumTypeWrapper with an EnumDescriptor."""
|
||||
self._enum_type = enum_type
|
||||
self.DESCRIPTOR = enum_type # pylint: disable=invalid-name
|
||||
|
||||
def Name(self, number): # pylint: disable=invalid-name
|
||||
"""Returns a string containing the name of an enum value."""
|
||||
try:
|
||||
return self._enum_type.values_by_number[number].name
|
||||
except KeyError:
|
||||
pass # fall out to break exception chaining
|
||||
|
||||
if not isinstance(number, int):
|
||||
raise TypeError(
|
||||
'Enum value for {} must be an int, but got {} {!r}.'.format(
|
||||
self._enum_type.name, type(number), number))
|
||||
else:
|
||||
# repr here to handle the odd case when you pass in a boolean.
|
||||
raise ValueError('Enum {} has no name defined for value {!r}'.format(
|
||||
self._enum_type.name, number))
|
||||
|
||||
def Value(self, name): # pylint: disable=invalid-name
|
||||
"""Returns the value corresponding to the given enum name."""
|
||||
try:
|
||||
return self._enum_type.values_by_name[name].number
|
||||
except KeyError:
|
||||
pass # fall out to break exception chaining
|
||||
raise ValueError('Enum {} has no value defined for name {!r}'.format(
|
||||
self._enum_type.name, name))
|
||||
|
||||
def keys(self):
|
||||
"""Return a list of the string names in the enum.
|
||||
|
||||
Returns:
|
||||
A list of strs, in the order they were defined in the .proto file.
|
||||
"""
|
||||
|
||||
return [value_descriptor.name
|
||||
for value_descriptor in self._enum_type.values]
|
||||
|
||||
def values(self):
|
||||
"""Return a list of the integer values in the enum.
|
||||
|
||||
Returns:
|
||||
A list of ints, in the order they were defined in the .proto file.
|
||||
"""
|
||||
|
||||
return [value_descriptor.number
|
||||
for value_descriptor in self._enum_type.values]
|
||||
|
||||
def items(self):
|
||||
"""Return a list of the (name, value) pairs of the enum.
|
||||
|
||||
Returns:
|
||||
A list of (str, int) pairs, in the order they were defined
|
||||
in the .proto file.
|
||||
"""
|
||||
return [(value_descriptor.name, value_descriptor.number)
|
||||
for value_descriptor in self._enum_type.values]
|
||||
|
||||
def __getattr__(self, name):
|
||||
"""Returns the value corresponding to the given enum name."""
|
||||
try:
|
||||
return super(
|
||||
EnumTypeWrapper,
|
||||
self).__getattribute__('_enum_type').values_by_name[name].number
|
||||
except KeyError:
|
||||
pass # fall out to break exception chaining
|
||||
raise AttributeError('Enum {} has no value defined for name {!r}'.format(
|
||||
self._enum_type.name, name))
|
||||
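# (Editor-added sketch, not part of the upstream file: generated proto modules
# expose one EnumTypeWrapper per enum, so lookups work in both directions.
# The `my_pb2` and `Color` names below are hypothetical.)
#
#   from my_pb2 import Color          # Color is an EnumTypeWrapper instance
#   Color.Name(Color.RED)             # -> 'RED'
#   Color.Value('RED')                # -> the same int as Color.RED
#   Color.items()                     # -> [('RED', 0), ('GREEN', 1), ...]
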
213 openpype/hosts/hiero/vendor/google/protobuf/internal/extension_dict.py vendored Normal file
@@ -0,0 +1,213 @@
# Protocol Buffers - Google's data interchange format
# Copyright 2008 Google Inc. All rights reserved.
# https://developers.google.com/protocol-buffers/
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
#     * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#     * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
#     * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

"""Contains _ExtensionDict class to represent extensions.
|
||||
"""
|
||||
|
||||
from google.protobuf.internal import type_checkers
|
||||
from google.protobuf.descriptor import FieldDescriptor
|
||||
|
||||
|
||||
def _VerifyExtensionHandle(message, extension_handle):
|
||||
"""Verify that the given extension handle is valid."""
|
||||
|
||||
if not isinstance(extension_handle, FieldDescriptor):
|
||||
raise KeyError('HasExtension() expects an extension handle, got: %s' %
|
||||
extension_handle)
|
||||
|
||||
if not extension_handle.is_extension:
|
||||
raise KeyError('"%s" is not an extension.' % extension_handle.full_name)
|
||||
|
||||
if not extension_handle.containing_type:
|
||||
raise KeyError('"%s" is missing a containing_type.'
|
||||
% extension_handle.full_name)
|
||||
|
||||
if extension_handle.containing_type is not message.DESCRIPTOR:
|
||||
raise KeyError('Extension "%s" extends message type "%s", but this '
|
||||
'message is of type "%s".' %
|
||||
(extension_handle.full_name,
|
||||
extension_handle.containing_type.full_name,
|
||||
message.DESCRIPTOR.full_name))
|
||||
|
||||
|
||||
# TODO(robinson): Unify error handling of "unknown extension" crap.
# TODO(robinson): Support iteritems()-style iteration over all
# extensions with the "has" bits turned on?
class _ExtensionDict(object):

  """Dict-like container for Extension fields on proto instances.

  Note that in all cases we expect extension handles to be
  FieldDescriptors.
  """

  def __init__(self, extended_message):
    """
    Args:
      extended_message: Message instance for which we are the Extensions dict.
    """
    self._extended_message = extended_message

  def __getitem__(self, extension_handle):
    """Returns the current value of the given extension handle."""

    _VerifyExtensionHandle(self._extended_message, extension_handle)

    result = self._extended_message._fields.get(extension_handle)
    if result is not None:
      return result

    if extension_handle.label == FieldDescriptor.LABEL_REPEATED:
      result = extension_handle._default_constructor(self._extended_message)
    elif extension_handle.cpp_type == FieldDescriptor.CPPTYPE_MESSAGE:
      message_type = extension_handle.message_type
      if not hasattr(message_type, '_concrete_class'):
        # pylint: disable=protected-access
        self._extended_message._FACTORY.GetPrototype(message_type)
      assert getattr(extension_handle.message_type, '_concrete_class', None), (
          'Uninitialized concrete class found for field %r (message type %r)'
          % (extension_handle.full_name,
             extension_handle.message_type.full_name))
      result = extension_handle.message_type._concrete_class()
      try:
        result._SetListener(self._extended_message._listener_for_children)
      except ReferenceError:
        pass
    else:
      # Singular scalar -- just return the default without inserting into the
      # dict.
      return extension_handle.default_value

    # Atomically check if another thread has preempted us and, if not, swap
    # in the new object we just created. If someone has preempted us, we
    # take that object and discard ours.
    # WARNING: We are relying on setdefault() being atomic. This is true
    # in CPython but we haven't investigated others. This warning appears
    # in several other locations in this file.
    result = self._extended_message._fields.setdefault(
        extension_handle, result)

    return result

  def __eq__(self, other):
    if not isinstance(other, self.__class__):
      return False

    my_fields = self._extended_message.ListFields()
    other_fields = other._extended_message.ListFields()

    # Get rid of non-extension fields.
    my_fields = [field for field in my_fields if field.is_extension]
    other_fields = [field for field in other_fields if field.is_extension]

    return my_fields == other_fields

  def __ne__(self, other):
    return not self == other

  def __len__(self):
    fields = self._extended_message.ListFields()
    # Get rid of non-extension fields.
    extension_fields = [field for field in fields if field[0].is_extension]
    return len(extension_fields)

  def __hash__(self):
    raise TypeError('unhashable object')

  # Note that this is only meaningful for non-repeated, scalar extension
  # fields. Note also that we may have to call _Modified() when we do
  # successfully set a field this way, to set any necessary "has" bits in the
  # ancestors of the extended message.
  def __setitem__(self, extension_handle, value):
    """If extension_handle specifies a non-repeated, scalar extension
    field, sets the value of that field.
    """

    _VerifyExtensionHandle(self._extended_message, extension_handle)

    if (extension_handle.label == FieldDescriptor.LABEL_REPEATED or
        extension_handle.cpp_type == FieldDescriptor.CPPTYPE_MESSAGE):
      raise TypeError(
          'Cannot assign to extension "%s" because it is a repeated or '
          'composite type.' % extension_handle.full_name)

    # It's slightly wasteful to lookup the type checker each time,
    # but we expect this to be a vanishingly uncommon case anyway.
    type_checker = type_checkers.GetTypeChecker(extension_handle)
    # pylint: disable=protected-access
    self._extended_message._fields[extension_handle] = (
        type_checker.CheckValue(value))
    self._extended_message._Modified()

  def __delitem__(self, extension_handle):
    self._extended_message.ClearExtension(extension_handle)

  def _FindExtensionByName(self, name):
    """Tries to find a known extension with the specified name.

    Args:
      name: Extension full name.

    Returns:
      Extension field descriptor.
    """
    return self._extended_message._extensions_by_name.get(name, None)

  def _FindExtensionByNumber(self, number):
    """Tries to find a known extension with the field number.

    Args:
      number: Extension field number.

    Returns:
      Extension field descriptor.
    """
    return self._extended_message._extensions_by_number.get(number, None)

  def __iter__(self):
    # Return a generator over the populated extension fields
    return (f[0] for f in self._extended_message.ListFields()
            if f[0].is_extension)

  def __contains__(self, extension_handle):
    _VerifyExtensionHandle(self._extended_message, extension_handle)

    if extension_handle not in self._extended_message._fields:
      return False

    if extension_handle.label == FieldDescriptor.LABEL_REPEATED:
      return bool(self._extended_message._fields.get(extension_handle))

    if extension_handle.cpp_type == FieldDescriptor.CPPTYPE_MESSAGE:
      value = self._extended_message._fields.get(extension_handle)
      # pylint: disable=protected-access
      return value is not None and value._is_present_in_parent

    return True

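# (Editor-added sketch, not part of the upstream file: _ExtensionDict is what
# backs `message.Extensions[...]`. The `my_pb2` names below are hypothetical.)
#
#   msg = my_pb2.Container()
#   msg.Extensions[my_pb2.my_int_extension] = 5        # scalar: assignable
#   msg.Extensions[my_pb2.my_msg_extension].field = 1  # composite: mutate in place
#   my_pb2.my_int_extension in msg.Extensions          # -> True once set
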
78 openpype/hosts/hiero/vendor/google/protobuf/internal/message_listener.py vendored Normal file
@@ -0,0 +1,78 @@
# Protocol Buffers - Google's data interchange format
# Copyright 2008 Google Inc. All rights reserved.
# https://developers.google.com/protocol-buffers/
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
#     * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#     * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
#     * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

"""Defines a listener interface for observing certain
|
||||
state transitions on Message objects.
|
||||
|
||||
Also defines a null implementation of this interface.
|
||||
"""
|
||||
|
||||
__author__ = 'robinson@google.com (Will Robinson)'
|
||||
|
||||
|
||||
class MessageListener(object):
|
||||
|
||||
"""Listens for modifications made to a message. Meant to be registered via
|
||||
Message._SetListener().
|
||||
|
||||
Attributes:
|
||||
dirty: If True, then calling Modified() would be a no-op. This can be
|
||||
used to avoid these calls entirely in the common case.
|
||||
"""
|
||||
|
||||
def Modified(self):
|
||||
"""Called every time the message is modified in such a way that the parent
|
||||
message may need to be updated. This currently means either:
|
||||
(a) The message was modified for the first time, so the parent message
|
||||
should henceforth mark the message as present.
|
||||
(b) The message's cached byte size became dirty -- i.e. the message was
|
||||
modified for the first time after a previous call to ByteSize().
|
||||
Therefore the parent should also mark its byte size as dirty.
|
||||
Note that (a) implies (b), since new objects start out with a client cached
|
||||
size (zero). However, we document (a) explicitly because it is important.
|
||||
|
||||
Modified() will *only* be called in response to one of these two events --
|
||||
not every time the sub-message is modified.
|
||||
|
||||
Note that if the listener's |dirty| attribute is true, then calling
|
||||
Modified at the moment would be a no-op, so it can be skipped. Performance-
|
||||
sensitive callers should check this attribute directly before calling since
|
||||
it will be true most of the time.
|
||||
"""
|
||||
|
||||
raise NotImplementedError
|
||||
|
||||
|
||||
class NullMessageListener(object):
|
||||
|
||||
"""No-op MessageListener implementation."""
|
||||
|
||||
def Modified(self):
|
||||
pass
|
||||
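# (Editor-added sketch, not part of the upstream file: a listener that just
# records whether Modified() ever fired, honoring the documented `dirty`
# contract so repeat notifications can be skipped.)
class _ExampleRecordingListener(MessageListener):

  def __init__(self):
    self.dirty = False

  def Modified(self):
    self.dirty = True  # parent state is now stale; no further calls needed
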
36 openpype/hosts/hiero/vendor/google/protobuf/internal/message_set_extensions_pb2.py vendored Normal file
@@ -0,0 +1,36 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: google/protobuf/internal/message_set_extensions.proto
"""Generated protocol buffer code."""
from google.protobuf.internal import builder as _builder
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n5google/protobuf/internal/message_set_extensions.proto\x12\x18google.protobuf.internal\"\x1e\n\x0eTestMessageSet*\x08\x08\x04\x10\xff\xff\xff\xff\x07:\x02\x08\x01\"\xa5\x01\n\x18TestMessageSetExtension1\x12\t\n\x01i\x18\x0f \x01(\x05\x32~\n\x15message_set_extension\x12(.google.protobuf.internal.TestMessageSet\x18\xab\xff\xf6. \x01(\x0b\x32\x32.google.protobuf.internal.TestMessageSetExtension1\"\xa7\x01\n\x18TestMessageSetExtension2\x12\x0b\n\x03str\x18\x19 \x01(\t2~\n\x15message_set_extension\x12(.google.protobuf.internal.TestMessageSet\x18\xca\xff\xf6. \x01(\x0b\x32\x32.google.protobuf.internal.TestMessageSetExtension2\"(\n\x18TestMessageSetExtension3\x12\x0c\n\x04text\x18# \x01(\t:\x7f\n\x16message_set_extension3\x12(.google.protobuf.internal.TestMessageSet\x18\xdf\xff\xf6. \x01(\x0b\x32\x32.google.protobuf.internal.TestMessageSetExtension3')

_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.internal.message_set_extensions_pb2', globals())
if _descriptor._USE_C_DESCRIPTORS == False:
  TestMessageSet.RegisterExtension(message_set_extension3)
  TestMessageSet.RegisterExtension(_TESTMESSAGESETEXTENSION1.extensions_by_name['message_set_extension'])
  TestMessageSet.RegisterExtension(_TESTMESSAGESETEXTENSION2.extensions_by_name['message_set_extension'])

  DESCRIPTOR._options = None
  _TESTMESSAGESET._options = None
  _TESTMESSAGESET._serialized_options = b'\010\001'
  _TESTMESSAGESET._serialized_start=83
  _TESTMESSAGESET._serialized_end=113
  _TESTMESSAGESETEXTENSION1._serialized_start=116
  _TESTMESSAGESETEXTENSION1._serialized_end=281
  _TESTMESSAGESETEXTENSION2._serialized_start=284
  _TESTMESSAGESETEXTENSION2._serialized_end=451
  _TESTMESSAGESETEXTENSION3._serialized_start=453
  _TESTMESSAGESETEXTENSION3._serialized_end=493
# @@protoc_insertion_point(module_scope)
37 openpype/hosts/hiero/vendor/google/protobuf/internal/missing_enum_values_pb2.py vendored Normal file
@@ -0,0 +1,37 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: google/protobuf/internal/missing_enum_values.proto
"""Generated protocol buffer code."""
from google.protobuf.internal import builder as _builder
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n2google/protobuf/internal/missing_enum_values.proto\x12\x1fgoogle.protobuf.python.internal\"\xc1\x02\n\x0eTestEnumValues\x12X\n\x14optional_nested_enum\x18\x01 \x01(\x0e\x32:.google.protobuf.python.internal.TestEnumValues.NestedEnum\x12X\n\x14repeated_nested_enum\x18\x02 \x03(\x0e\x32:.google.protobuf.python.internal.TestEnumValues.NestedEnum\x12Z\n\x12packed_nested_enum\x18\x03 \x03(\x0e\x32:.google.protobuf.python.internal.TestEnumValues.NestedEnumB\x02\x10\x01\"\x1f\n\nNestedEnum\x12\x08\n\x04ZERO\x10\x00\x12\x07\n\x03ONE\x10\x01\"\xd3\x02\n\x15TestMissingEnumValues\x12_\n\x14optional_nested_enum\x18\x01 \x01(\x0e\x32\x41.google.protobuf.python.internal.TestMissingEnumValues.NestedEnum\x12_\n\x14repeated_nested_enum\x18\x02 \x03(\x0e\x32\x41.google.protobuf.python.internal.TestMissingEnumValues.NestedEnum\x12\x61\n\x12packed_nested_enum\x18\x03 \x03(\x0e\x32\x41.google.protobuf.python.internal.TestMissingEnumValues.NestedEnumB\x02\x10\x01\"\x15\n\nNestedEnum\x12\x07\n\x03TWO\x10\x02\"\x1b\n\nJustString\x12\r\n\x05\x64ummy\x18\x01 \x02(\t')

_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.internal.missing_enum_values_pb2', globals())
if _descriptor._USE_C_DESCRIPTORS == False:

  DESCRIPTOR._options = None
  _TESTENUMVALUES.fields_by_name['packed_nested_enum']._options = None
  _TESTENUMVALUES.fields_by_name['packed_nested_enum']._serialized_options = b'\020\001'
  _TESTMISSINGENUMVALUES.fields_by_name['packed_nested_enum']._options = None
  _TESTMISSINGENUMVALUES.fields_by_name['packed_nested_enum']._serialized_options = b'\020\001'
  _TESTENUMVALUES._serialized_start=88
  _TESTENUMVALUES._serialized_end=409
  _TESTENUMVALUES_NESTEDENUM._serialized_start=378
  _TESTENUMVALUES_NESTEDENUM._serialized_end=409
  _TESTMISSINGENUMVALUES._serialized_start=412
  _TESTMISSINGENUMVALUES._serialized_end=751
  _TESTMISSINGENUMVALUES_NESTEDENUM._serialized_start=730
  _TESTMISSINGENUMVALUES_NESTEDENUM._serialized_end=751
  _JUSTSTRING._serialized_start=753
  _JUSTSTRING._serialized_end=780
# @@protoc_insertion_point(module_scope)
29 openpype/hosts/hiero/vendor/google/protobuf/internal/more_extensions_dynamic_pb2.py vendored Normal file
@@ -0,0 +1,29 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: google/protobuf/internal/more_extensions_dynamic.proto
"""Generated protocol buffer code."""
from google.protobuf.internal import builder as _builder
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


from google.protobuf.internal import more_extensions_pb2 as google_dot_protobuf_dot_internal_dot_more__extensions__pb2


DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n6google/protobuf/internal/more_extensions_dynamic.proto\x12\x18google.protobuf.internal\x1a.google/protobuf/internal/more_extensions.proto\"\x1f\n\x12\x44ynamicMessageType\x12\t\n\x01\x61\x18\x01 \x01(\x05:J\n\x17\x64ynamic_int32_extension\x12).google.protobuf.internal.ExtendedMessage\x18\x64 \x01(\x05:z\n\x19\x64ynamic_message_extension\x12).google.protobuf.internal.ExtendedMessage\x18\x65 \x01(\x0b\x32,.google.protobuf.internal.DynamicMessageType:\x83\x01\n\"repeated_dynamic_message_extension\x12).google.protobuf.internal.ExtendedMessage\x18\x66 \x03(\x0b\x32,.google.protobuf.internal.DynamicMessageType')

_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.internal.more_extensions_dynamic_pb2', globals())
if _descriptor._USE_C_DESCRIPTORS == False:
  google_dot_protobuf_dot_internal_dot_more__extensions__pb2.ExtendedMessage.RegisterExtension(dynamic_int32_extension)
  google_dot_protobuf_dot_internal_dot_more__extensions__pb2.ExtendedMessage.RegisterExtension(dynamic_message_extension)
  google_dot_protobuf_dot_internal_dot_more__extensions__pb2.ExtendedMessage.RegisterExtension(repeated_dynamic_message_extension)

  DESCRIPTOR._options = None
  _DYNAMICMESSAGETYPE._serialized_start=132
  _DYNAMICMESSAGETYPE._serialized_end=163
# @@protoc_insertion_point(module_scope)
41 openpype/hosts/hiero/vendor/google/protobuf/internal/more_extensions_pb2.py vendored Normal file
@@ -0,0 +1,41 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: google/protobuf/internal/more_extensions.proto
"""Generated protocol buffer code."""
from google.protobuf.internal import builder as _builder
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()


DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n.google/protobuf/internal/more_extensions.proto\x12\x18google.protobuf.internal\"\x99\x01\n\x0fTopLevelMessage\x12\x41\n\nsubmessage\x18\x01 \x01(\x0b\x32).google.protobuf.internal.ExtendedMessageB\x02(\x01\x12\x43\n\x0enested_message\x18\x02 \x01(\x0b\x32\'.google.protobuf.internal.NestedMessageB\x02(\x01\"R\n\rNestedMessage\x12\x41\n\nsubmessage\x18\x01 \x01(\x0b\x32).google.protobuf.internal.ExtendedMessageB\x02(\x01\"K\n\x0f\x45xtendedMessage\x12\x17\n\x0eoptional_int32\x18\xe9\x07 \x01(\x05\x12\x18\n\x0frepeated_string\x18\xea\x07 \x03(\t*\x05\x08\x01\x10\xe8\x07\"-\n\x0e\x46oreignMessage\x12\x1b\n\x13\x66oreign_message_int\x18\x01 \x01(\x05:I\n\x16optional_int_extension\x12).google.protobuf.internal.ExtendedMessage\x18\x01 \x01(\x05:w\n\x1aoptional_message_extension\x12).google.protobuf.internal.ExtendedMessage\x18\x02 \x01(\x0b\x32(.google.protobuf.internal.ForeignMessage:I\n\x16repeated_int_extension\x12).google.protobuf.internal.ExtendedMessage\x18\x03 \x03(\x05:w\n\x1arepeated_message_extension\x12).google.protobuf.internal.ExtendedMessage\x18\x04 \x03(\x0b\x32(.google.protobuf.internal.ForeignMessage')

_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'google.protobuf.internal.more_extensions_pb2', globals())
if _descriptor._USE_C_DESCRIPTORS == False:
  ExtendedMessage.RegisterExtension(optional_int_extension)
  ExtendedMessage.RegisterExtension(optional_message_extension)
  ExtendedMessage.RegisterExtension(repeated_int_extension)
  ExtendedMessage.RegisterExtension(repeated_message_extension)

  DESCRIPTOR._options = None
  _TOPLEVELMESSAGE.fields_by_name['submessage']._options = None
  _TOPLEVELMESSAGE.fields_by_name['submessage']._serialized_options = b'(\001'
  _TOPLEVELMESSAGE.fields_by_name['nested_message']._options = None
  _TOPLEVELMESSAGE.fields_by_name['nested_message']._serialized_options = b'(\001'
  _NESTEDMESSAGE.fields_by_name['submessage']._options = None
  _NESTEDMESSAGE.fields_by_name['submessage']._serialized_options = b'(\001'
  _TOPLEVELMESSAGE._serialized_start=77
  _TOPLEVELMESSAGE._serialized_end=230
  _NESTEDMESSAGE._serialized_start=232
  _NESTEDMESSAGE._serialized_end=314
  _EXTENDEDMESSAGE._serialized_start=316
  _EXTENDEDMESSAGE._serialized_end=391
  _FOREIGNMESSAGE._serialized_start=393
  _FOREIGNMESSAGE._serialized_end=438
# @@protoc_insertion_point(module_scope)
Some files were not shown because too many files have changed in this diff.